Research & development - Eindhoven
At IMEC's Holst Centre lab in Eindhoven, The Netherlands, we are developing a novel neuromorphic radar sensor backend, called event-radar, that targets always-on low-power sensing, sparse data streaming, and on-sensor processing. In line with this work, we are seeking a motivated student to undertake a project focused on exploring and developing temporally and spatially sparse (event-based) encodings of radar signals for short-range radar application tasks (gesture recognition, vital sign detection, room activity classification). The objective is for these signals to be generated and used for inference right at the sensor, within a low-power budget and with real-time application inference.
(To be considered for this position: European candidates must be enrolled in a Master's programme. Non-European Master's students enrolled at a Dutch university are also welcome to apply.)
Typically, most sensors today (camera, microphone, radar, etc.) generate a lot of data that needs to be communicated for processing and inference by a model. This lets the sensor itself do little processing work, at the expense of the bandwidth needed to communicate the data to the downstream processing pipeline.
By contrast, neuromorphic sensors (dynamic vision sensors, cochlea audio sensors, e-skin sensors) exploit bio-inspired sensory processing principles, generate sparser temporal signals, and consume significantly less power. A big advantage of this paradigm is that it leaves resources for application-related processing right at the sensor as well. Towards a similar objective, the neuromorphic group at IMEC (Holst Centre lab, The Netherlands) has been developing an analogous neuromorphic radar-sensor backend for indoor, short-distance sensing applications (human-machine interfaces using gesture commands, human activity classification, vital signs, and other applications for smart spaces).
The goal of this project is to explore various temporal encodings and sparse distributed representations of the radar signals, their suitability for embedded low-power processing, and their efficacy in machine-learning application tasks.
For example, a baseline exploration point can be a differential encoding (delta or sigma-delta modulation); from there, one may introduce reverberating dynamics with neural networks, such as recurrent spiking neural networks that can be “nudged” to resonate according to the radar front-end detections, or move to trainable sparse signature representations of the activity taking place in front of the sensor.
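As a rough illustration of the delta-modulation baseline, the sketch below (an assumption for illustration, not the actual event-radar implementation; the threshold value is arbitrary) converts a densely sampled signal into a sparse stream of signed events, analogous to the ON/OFF events of a dynamic vision sensor pixel:

```python
import numpy as np

def delta_encode(signal, threshold=0.1):
    """Illustrative delta-modulation encoder: emit a +1/-1 event whenever
    the input drifts more than `threshold` away from the last
    reconstructed level."""
    events = []          # (sample_index, polarity) pairs
    level = signal[0]    # last reconstructed value
    for i, x in enumerate(signal):
        while x - level > threshold:
            level += threshold
            events.append((i, +1))
        while level - x > threshold:
            level -= threshold
            events.append((i, -1))
    return events

# A dense 1000-sample sine wave becomes a much sparser event stream.
t = np.linspace(0.0, 1.0, 1000)
sig = np.sin(2 * np.pi * 3 * t)
ev = delta_encode(sig, threshold=0.1)
print(len(ev), "events from", len(sig), "samples")
```

The event count scales with the signal's total variation rather than the sample rate, which is what makes the representation attractive for low-power streaming.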
The results of this exploration will be compared with more commonplace traditional radar DSP pipelines (e.g., FFT-based) and evaluated in various application tasks such as those listed above.
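For context, a traditional FFT-based radar pipeline typically computes a range-Doppler map with a 2-D FFT over a frame of chirps. The sketch below is a minimal synthetic example (the frame dimensions and target bins are assumptions for illustration, not parameters of the actual front end):

```python
import numpy as np

rng = np.random.default_rng(0)
n_chirps, n_samples = 64, 128   # chirps per frame, samples per chirp (assumed)

# Synthetic beat signal: a single target at range bin 20, Doppler bin 5.
fast = np.arange(n_samples)            # fast time (within a chirp)
slow = np.arange(n_chirps)[:, None]    # slow time (across chirps)
frame = np.cos(2 * np.pi * (20 * fast / n_samples + 5 * slow / n_chirps))
frame += 0.1 * rng.standard_normal(frame.shape)   # additive noise

range_fft = np.fft.fft(frame, axis=1)            # fast time -> range
rd_map = np.abs(np.fft.fft(range_fft, axis=0))   # slow time -> Doppler

dopp, rng_bin = np.unravel_index(np.argmax(rd_map), rd_map.shape)
print(f"peak at Doppler bin {dopp}, range bin {rng_bin}")
```

Because the input is real-valued, the peak appears at the planted bins and their conjugate-symmetric mirror; the event-based encodings above would be evaluated against this kind of dense baseline.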
The project duration is set to 9 or 12 months (e.g., internship plus MSc project) and, depending on outcomes, there will be an opportunity to patent or publish the results in a high-visibility conference or journal in the field.
While the work is primarily algorithmic, depending on competence and interest, the student is also expected to work directly with the radar sensor and dynamic vision sensor hardware prototypes and novel neuromorphic accelerators (available at IMEC) for collecting relevant datasets and running experiments.
Candidates are expected to be highly motivated, with a relevant background in one or more of the following fields: sensor signal processing, neuromorphic computing/engineering, optimization and learning in neural networks, statistical pattern recognition / probabilistic learning models. The candidate must have good programming skills in Python and reasonable exposure to C/C++ (there will not be an opportunity to learn elementary programming during the project). Interested applicants are welcome to submit their CV and academic transcripts (courses taken, and scores or levels attained wherever applicable).
G. Gallego et al. (2020). Event-based Vision: A Survey. IEEE Transactions on Pattern Analysis and Machine Intelligence.
S.-C. Liu et al. (2014). Asynchronous Binaural Spatial Audition Sensor With 2×64×4 Channel Output. IEEE Transactions on Biomedical Circuits and Systems.
F. Bergner et al. (2020). Design and Realization of a Resistive Efficient Large-Area Event-Driven E-Skin. MDPI Sensors.
W. Brendel et al. (2020). Learning Representations Spike-by-Spike. PLOS Computational Biology.
Literature review on neuromorphic sensing and processing.
Click on ‘apply’ to submit your application. You will then be redirected to e-recruiting.