Synthetic sensor measurement generation with noise learning and multi-modal information

Fabrizio Romanelli;Francesco Martinelli
2023-01-01

Abstract

Deep learning has transformed data generation, particularly the creation of synthetic sensor data. This capability is invaluable in fields like autonomous driving, robotics, and computer science. To achieve it, models are trained on real data so that they can closely replicate sensor measurements, while introducing variations and noise that enhance diversity and realism. Prominent techniques for generating synthetic sensor data include generative adversarial networks (GANs), variational autoencoders (VAEs), and recurrent neural networks (RNNs). Our paper focuses on Autoregressive Convolutional Recurrent Neural Networks (CRNN) for multivariate time-series prediction, incorporating Denoising Autoencoders (DAE) to mimic real-world noise characteristics. Our model is trained and validated on Ultra-Wideband (UWB) and Ultra-High-Frequency Radio-Frequency Identification (UHF-RFID) sensor data, and it integrates sensor measurements with diverse information sources to produce synthetic data that complements real-world data. While demonstrated here with UHF-RFID and UWB sensors, the methodology extends to industrial automation, healthcare, and environmental monitoring. Our deep neural network model allows researchers to construct datasets for algorithm validation, eliminating the need for costly and time-consuming data collection.
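The paper's own implementation is not included in this record; purely as an illustration of the components the abstract names, below is a minimal PyTorch sketch of an autoregressive CRNN for multivariate time-series prediction paired with a denoising autoencoder. All module names, layer sizes, and the noise-reinjection heuristic are assumptions made for this sketch, not the authors' method.

```python
# Hypothetical sketch (not the paper's implementation): an autoregressive
# CRNN predicts the next multivariate measurement, and a denoising
# autoencoder (DAE) supplies a learned, sensor-like noise residual.
import torch
import torch.nn as nn

class CRNN(nn.Module):
    """Conv1d front-end extracts local temporal features; an LSTM models
    longer-range dependencies; a linear head predicts the next sample."""
    def __init__(self, n_channels: int, hidden: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=3, padding=1), nn.ReLU()
        )
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_channels)

    def forward(self, x):  # x: (batch, time, channels)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # conv over time axis
        out, _ = self.lstm(h)
        return self.head(out[:, -1])  # one-step-ahead prediction

class DAE(nn.Module):
    """Denoising autoencoder: trained to map noisy measurements to clean
    ones, so its reconstruction residual approximates the sensor noise."""
    def __init__(self, n_channels: int):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_channels, 16), nn.ReLU())
        self.dec = nn.Linear(16, n_channels)

    def forward(self, x_noisy):
        return self.dec(self.enc(x_noisy))

def generate(crnn, dae, seed, steps):
    """Autoregressive rollout: feed each prediction back as input, and
    reinject the DAE residual as a stand-in for realistic sensor noise."""
    window, samples = seed.clone(), []   # seed: (1, window_len, channels)
    for _ in range(steps):
        nxt = crnn(window)                        # (1, channels)
        noisy = nxt + (nxt - dae(nxt)).detach()   # add learned residual
        samples.append(noisy)
        window = torch.cat([window[:, 1:], noisy.unsqueeze(1)], dim=1)
    return torch.cat(samples)                     # (steps, channels)
```

In this sketch the DAE residual merely stands in for a learned noise profile; the paper's actual noise-learning and multi-modal fusion scheme may differ substantially.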
2023
Published
International relevance
Article
Anonymous referees
Sector IINF-04/A - Automatica
English
Artificial neural networks (ANNs); Convolutional neural networks (CNNs); Deep neural networks (DNNs); Denoising autoencoder (DAE); Long short-term memory (LSTM); Machine learning (ML)
Romanelli, F., Martinelli, F. (2023). Synthetic sensor measurement generation with noise learning and multi-modal information. IEEE ACCESS, 11, 111765-111788 [10.1109/ACCESS.2023.3323038].
Romanelli, F; Martinelli, F
Journal article
Files in this product:
File: Synthetic_Sensor_Measurement_Generation_With_Noise_Learning_and_Multi-Modal_Information.pdf
Access: Open access
Type: Post-print document
License: Creative Commons
Size: 3.48 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2108/439426
Citations
  • PMC: N/A
  • Scopus: 2
  • Web of Science (ISI): 2