Please use this identifier to cite or link to this item: https://doi.org/10.21256/zhaw-21968
Publication type: Conference paper
Type of review: Peer review (publication)
Title: New autonomous intelligent sensor design approach for multiple parameter inference
Authors: Michelucci, Umberto
Venturini, Francesca
et al.: No
DOI: 10.3390/engproc2020002096; 10.21256/zhaw-21968
Published in: Engineering Proceedings
Volume(Issue): 2
Issue: 1
Page(s): 96
Conference details: 7th Electronic Conference on Sensors and Applications, Online, 15-30 November 2020
Issue Date: Feb-2021
Publisher / Ed. Institution: MDPI
ISSN: 2673-4591
Language: English
Subjects: Optical sensor; Machine learning; Artificial neural network; Oxygen sensing; Dual sensor
Subject (DDC): 006: Special computer methods
621.3: Electrical, communications, control engineering
Abstract: The determination of multiple parameters via luminescence sensing is of great interest for many applications in different fields, such as biosensing and biological imaging, medicine, and diagnostics. The typical approach consists of measuring multiple quantities and applying complex, and frequently only approximate, mathematical models to characterize the sensor response. The use of machine learning to extract information from sensor measurements has been tried in several forms before. One of the problems with the approaches so far is the difficulty of obtaining a training dataset that is representative of the measurements performed by the sensor. Additionally, extracting multiple parameters from a single measurement has so far been a problem that could not be solved efficiently in luminescence sensing. In this work, a new approach is described for building an autonomous intelligent sensor that produces its training dataset self-sufficiently, uses it to train a neural network, and then uses the trained model to perform inference on measurements made on the same hardware. For the first time, the use of machine learning additionally allows two parameters to be extracted from a single measurement by means of multitask-learning neural network architectures (an illustrative sketch of such an architecture follows the record below). This is demonstrated here with a dual oxygen concentration and temperature sensor.
URI: https://digitalcollection.zhaw.ch/handle/11475/21968
Fulltext version: Published version
License (according to publishing contract): CC BY 4.0: Attribution 4.0 International
Departement: School of Engineering
Organisational Unit: Institute of Applied Mathematics and Physics (IAMP)
Appears in collections: Publikationen School of Engineering
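
The abstract above describes a multitask-learning neural network that infers both oxygen concentration and temperature from a single luminescence measurement. The following is a minimal, illustrative sketch of such an architecture, not the authors' implementation: the input dimension, layer sizes, and the choice of TensorFlow/Keras are assumptions made purely for illustration.

# Minimal multitask-learning sketch (assumed setup, not the authors' code):
# one shared trunk, two regression heads for oxygen concentration and temperature.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

N_FEATURES = 16  # assumed length of the feature vector from one luminescence measurement

inputs = layers.Input(shape=(N_FEATURES,), name="measurement")
x = layers.Dense(64, activation="relu")(inputs)   # shared trunk
x = layers.Dense(64, activation="relu")(x)

# Task-specific heads: one scalar regression output per parameter.
oxygen = layers.Dense(1, name="oxygen")(x)        # O2 concentration
temperature = layers.Dense(1, name="temperature")(x)

model = Model(inputs=inputs, outputs=[oxygen, temperature])
model.compile(
    optimizer="adam",
    loss={"oxygen": "mse", "temperature": "mse"},  # joint loss, equal weights
)

# The abstract describes a sensor that generates its own training dataset;
# random arrays are used here only as placeholders for that data.
X = np.random.rand(1024, N_FEATURES).astype("float32")
y_o2 = np.random.rand(1024, 1).astype("float32")
y_T = np.random.rand(1024, 1).astype("float32")
model.fit(X, {"oxygen": y_o2, "temperature": y_T}, epochs=2, batch_size=32, verbose=0)

# Inference: both parameters are obtained from a single measurement.
o2_pred, T_pred = model.predict(X[:1], verbose=0)

A shared trunk with two heads is the simplest form of the multitask approach; in the sensor described by the paper, the training data would come from measurements produced on the same hardware rather than from the placeholders used here.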

Files in This Item:
2021_Michelucci-Venturini_Autonomous-intelligent-sensor-design.pdf (313.27 kB, Adobe PDF)
Michelucci, U., & Venturini, F. (2021). New autonomous intelligent sensor design approach for multiple parameter inference [Conference paper]. Engineering Proceedings, 2(1), 96. https://doi.org/10.3390/engproc2020002096

