
NextPerception - Next generation smart perception sensors and distributed intelligence for proactive human monitoring in health, wellbeing, and automotive systems

Type
International projects
Research programme
JU ECSEL
Funding body
JU ECSEL
Budget
€ 348,300.00
Period
01/05/2020 - 01/05/2023
Principal investigator
Marco Botta

Project participants

Project description

The objective of the project is to build sensing solutions that are more versatile, robust, proactive, and human-centric than existing ones, aimed at monitoring and decision making in the healthcare and automotive domains, by increasing and distributing intelligence across sensors and sensor systems. To better exploit the potential offered by intelligent, interacting systems, the project sets out to overcome current limitations in sensing and the related sensor systems by embedding more intelligence both in the sensors themselves and in the systems that connect them. The project intends to exploit the potential of new sensor technologies and implementation methodologies to realise intelligent, distributed processing of sensing data that can enhance specific applications in domains such as healthcare and automotive.

We put our lives increasingly in the hands of smart complex systems making decisions that directly affect our health and wellbeing. This is very evident in healthcare, where systems watch over your health, as well as in traffic, where autonomous driving solutions are gradually taking over control of the car. The accuracy and timeliness of the decisions depend on the systems' ability to build a good understanding of both you and your environment. That understanding relies on observations and the ability to reason on them. While humans are well equipped to grasp their environment and respond to it, building smart sensor systems to do the same is far from trivial. And as decision making on life-threatening situations in healthcare and traffic is transferred from people (doctors, drivers) to machines, it is of utmost importance to ensure the reliable and secure operation of the underlying sensing and reasoning technologies.
This project aims to bring perception sensor technologies to the next level, enhancing their features to allow for more accurate detection of human behaviour and physiological parameters. Ranging technologies like Radar, LiDAR and Time-of-Flight cameras have been applied for detecting objects and assisting in navigation in recently developed ADAS solutions. The next generation of these perception sensors will be able to successfully detect humans and accurately monitor their behaviour and physiological parameters. Besides more accurate automotive solutions ensuring driver vigilance and pedestrian and cyclist safety, this innovation will open up new opportunities in health and wellbeing to monitor elderly people at home or unobtrusively assess health state.
In order to tackle challenges related to the reaction speed, scalability, versatility, reliability and security of these ever more complex systems, this project will embrace the new Distributed Intelligence paradigm to facilitate building complex smart systems and ensure their secure and reliable operation. Distributed Intelligence leverages the advantages of Fog, Edge and Cloud computing, building on the distributed computational resources increasingly available in sensors and edge components to distribute the intelligence as well. Designing this kind of System of Intelligent Systems needs new design methods and tools, and support from a well-thought-out architecture.
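
As a purely illustrative sketch of this edge/cloud split (not part of the project's deliverables or its actual architecture), the Python example below shows one common distributed-intelligence pattern: an edge node runs lightweight inference next to the sensor and forwards only compact events to a cloud-side aggregator, which fuses events from many nodes. All names (EdgeNode, CloudAggregator, the threshold-based stand-in "model") are hypothetical.

```python
"""Minimal sketch of an edge/cloud split behind "distributed intelligence".

Hypothetical illustration only: an edge node runs cheap local inference on raw
sensor samples and forwards only compact events, so the cloud side reasons over
events from many nodes instead of raw data streams.
"""
from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class Event:
    node_id: str       # which edge node produced the event
    label: str         # high-level outcome of local inference
    confidence: float  # score attached by the edge model


class EdgeNode:
    """Runs a lightweight, threshold-based 'model' next to the sensor."""

    def __init__(self, node_id: str, threshold: float = 0.7):
        self.node_id = node_id
        self.threshold = threshold

    def infer(self, samples: List[float]) -> Event:
        # Stand-in for an embedded classifier: average the raw samples and
        # map the score to a coarse label. Only the Event leaves the node.
        score = mean(samples)
        label = "driver_inattentive" if score > self.threshold else "driver_attentive"
        return Event(self.node_id, label, round(score, 3))


class CloudAggregator:
    """Fuses events from many edge nodes into a single decision."""

    def __init__(self):
        self.events: List[Event] = []

    def receive(self, event: Event) -> None:
        self.events.append(event)

    def decide(self) -> str:
        # Simple majority vote over all received events.
        alerts = sum(1 for e in self.events if e.label == "driver_inattentive")
        return "raise_alert" if alerts > len(self.events) / 2 else "all_clear"


if __name__ == "__main__":
    cloud = CloudAggregator()
    for node in (EdgeNode("cabin_camera"), EdgeNode("seat_radar")):
        cloud.receive(node.infer([0.9, 0.8, 0.75]))  # synthetic sensor samples
    print(cloud.decide())
```

The point of the pattern is that only a few bytes per event cross the network, which is what makes the approach scale to many sensors and keeps reaction times low.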
The goal of this project is to develop next generation smart perception sensors and enhance the distributed intelligence paradigm to build versatile, secure, reliable, and proactive human monitoring solutions for the health, wellbeing, and automotive domains.
The solutions developed in the project are key to realise the foreseen revolutions in Health and Well-Being (patient-centric care through advanced monitoring solutions) and in Transport and Smart Mobility (automated driving made safe by monitoring of the driver and of vulnerable road users).

Results and publications

Selected Contributions


Challenges for Driver Action Recognition with Face Masks 

Elvio G. Amparore, Marco Botta, Idilio Drago, Susanna Donatelli, Giuseppe Mazzone 
Submitted to: 25th IEEE International Conference on Intelligent Transportation Systems (IEEE ITSC 2022). 

Abstract: Advanced Driver Assistance Systems (ADAS) are enabling technologies in Intelligent Transportation Systems. Modern ADAS include algorithms to classify drivers' actions and distractions, aiming at identifying situations in which the driver is inattentive. Such systems typically include components for Driver Action Recognition (DAR) and Visual Distraction Classification (VDC), which prevent risky situations during semi-autonomous driving. DAR and VDC often rely on cameras that track the driver and classify actions based on image recognition algorithms. The COVID-19 pandemic has changed several common social behaviours, including the widespread use of face masks even while driving. In some settings (taxis, buses), face covering policies are compulsory in many jurisdictions. Here we show that these behavioural changes challenge state-of-the-art DAR and VDC systems, with the average F1-score in some scenarios dropping by around 30% when exposed to images of drivers wearing masks. Noting a lack of public datasets to update the ML classifiers performing such tasks, we contribute MaskDAR, an open dataset for Action Recognition of Drivers wearing face Masks. Finally, using MaskDAR we show the importance of including subjects with face masks in datasets for DAR.

The paper shows that face masks result in a significant degradation of classification performance for DAR systems that were not designed for that scenario.
An analysis carried out using eXplainable AI techniques reveals that Convolutional Neural Networks heavily rely on facial traits.
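
For readers unfamiliar with the metric, the short Python sketch below shows how such a masked-vs-unmasked comparison is typically quantified: the same classifier's predictions are scored with macro-averaged F1 under both conditions and the relative drop is reported. The labels and predictions here are made-up toy data (not MaskDAR results), and the hand-written macro_f1 helper is only used to keep the example self-contained.

```python
"""Sketch of a masked-vs-unmasked comparison via macro-averaged F1.

Hypothetical illustration: the label set and predictions below are toy data,
not outputs of the DAR models evaluated in the paper.
"""
from typing import List


def macro_f1(y_true: List[str], y_pred: List[str]) -> float:
    """Macro-averaged F1: unweighted mean of per-class F1 scores."""
    scores = []
    for label in sorted(set(y_true)):
        tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
        fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
        fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * precision * recall / (precision + recall)
                      if precision + recall else 0.0)
    return sum(scores) / len(scores)


if __name__ == "__main__":
    # Ground-truth driver actions for the same scenes, with and without masks.
    truth = ["texting", "drinking", "safe_driving", "texting", "safe_driving"]
    pred_unmasked = ["texting", "drinking", "safe_driving", "texting", "safe_driving"]
    pred_masked = ["safe_driving", "drinking", "safe_driving", "drinking", "safe_driving"]

    f1_plain = macro_f1(truth, pred_unmasked)
    f1_masked = macro_f1(truth, pred_masked)
    print(f"unmasked F1={f1_plain:.2f}  masked F1={f1_masked:.2f}  "
          f"relative drop={(f1_plain - f1_masked) / f1_plain:.0%}")
```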


Automotive embedded software architectures in the multi-core age 

Workshop. 
Elvio Amparore, Idilio Drago, Susanna Donatelli, Marco Botta 
Presentation of the NextPerception work and results to start collaborations with other industrial partners. 


The slide summarizes the work performed by UNITO in the 1st development cycle.


From lab data to real car module 

Cross-pollination activity 
Elvio Amparore, Idilio Drago, Susanna Donatelli, Marco Botta, Maria Jokela (VTT) 
Summary of the UniTO/VTT cross-pollination activities. 

The slide summarizes the cross-pollination work performed by UNITO and VTT in the 2nd development cycle.


NeuNAC: A Novel Fragile Watermarking Algorithm for Integrity Protection of Neural Networks

Marco Botta, Davide Cavagnino, Roberto Esposito. Information Sciences, 2021 

The slide summarizes the workflow of the watermarking algorithm. It takes as input a trained neural network, computes a watermark that depends on the structure and parameters of the network itself, and embeds it into the parameters of the network with no effect on classification accuracy. Once deployed, the authenticity of the neural network can be verified, even in real time.
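
The NeuNAC scheme itself is not reproduced here. As a rough, hypothetical illustration of the fragile-watermarking idea, the Python sketch below hides a checksum of the weights in their least significant fixed-point bits, so that any later modification of the parameters makes verification fail while the embedded bits barely perturb the values. The fixed-point scale, the hash, and the embedding choices are simplifications and not those of the published algorithm.

```python
"""Toy fragile-watermark sketch in the spirit of parameter-integrity checking.

Hypothetical simplification of the idea (not the NeuNAC algorithm): a checksum
of the high-order part of the weights is hidden in their least significant
bits, so any later change to the parameters breaks the verification.
"""
import hashlib
from typing import List

SCALE = 10_000  # weights are handled as fixed-point integers: w -> round(w * SCALE)


def _checksum_bits(high_parts: List[int], n_bits: int) -> List[int]:
    """Derive n_bits watermark bits from the weight values minus their LSBs."""
    digest = hashlib.sha256(",".join(map(str, high_parts)).encode()).digest()
    bits = [(byte >> i) & 1 for byte in digest for i in range(8)]
    return bits[:n_bits]


def embed(weights: List[float]) -> List[float]:
    """Return watermarked weights; only the 4th decimal digit may change."""
    fixed = [round(w * SCALE) for w in weights]
    high = [v >> 1 for v in fixed]  # each value without its least significant bit
    marked = [(h << 1) | b for h, b in zip(high, _checksum_bits(high, len(fixed)))]
    return [v / SCALE for v in marked]


def verify(weights: List[float]) -> bool:
    """True iff the LSB pattern still matches the checksum of the weights."""
    fixed = [round(w * SCALE) for w in weights]
    high = [v >> 1 for v in fixed]
    extracted = [v & 1 for v in fixed]
    return extracted == _checksum_bits(high, len(fixed))


if __name__ == "__main__":
    model = [0.1234, -0.5678, 0.9012]  # stand-in for trained parameters
    marked = embed(model)
    print("intact:", verify(marked))   # True
    marked[1] += 0.01                  # simulate tampering
    print("tampered:", verify(marked)) # False (with high probability)
```

The "fragile" property comes from the fact that the embedded bits depend on the very parameters that carry them: touching any weight invalidates either the carried bits or the recomputed checksum.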

Note

Research call: H2020-ECSEL-2019-2-RIA

MIUR: ECSEL-2019-2-RIA

Last updated: 15/06/2023 10:45