Full metadata record
DC Field | Value | Language
dc.contributor.author | Venturini, Francesca | -
dc.contributor.author | Baumgartner, Michael | -
dc.date.accessioned | 2019-08-29T10:40:05Z | -
dc.date.available | 2019-08-29T10:40:05Z | -
dc.date.issued | 2019 | -
dc.identifier.uri | https://digitalcollection.zhaw.ch/handle/11475/18009 | -
dc.description.abstract | Luminescence sensors are based on the determination of the emitted intensity or decay time when a luminophore is in contact with its environment. Changes in the environment, such as temperature or analyte concentration, cause a change in the intensity and decay rate of the emission. Since the absolute values of the measured quantities depend on the specific sensing element and scheme used, a sensor typically needs two inputs to work: 1) a description of the dependence of the quantity to be determined (for example, the analyte concentration) on the sensed quantity (for example, the decay time), given either by analytical expressions or by a look-up table; 2) a calibration of the sensor at known reference conditions. In this work we explored a new approach to luminescence sensing based on machine learning. Without using any model or a-priori information about the intensity decay characteristics, we developed a neural network (NN) to determine an analyte concentration. The new NN was then used to realize an optical oxygen sensor based on luminescence quenching. After training the NN on synthetic data, we tested it on measured data. The new approach achieved an accuracy in the oxygen determination of 4000 ppm vol O2, limited mainly by the accuracy of the data used for training. With this work we demonstrated that the new machine-learning approach allows the development of an optical luminescence oxygen sensor without any analytical model of the sensing element or sensing scheme used. | de_CH
dc.language.iso | en | de_CH
dc.publisher | SPIE - The International Society for Optical Engineering | de_CH
dc.rights | Licence according to publishing contract | de_CH
dc.subject.ddc | 006: Special computer methods | de_CH
dc.title | New approach for luminescence sensing based on machine learning | de_CH
dc.type | Conference: Other | de_CH
dcterms.type | Text | de_CH
zhaw.departement | School of Engineering | de_CH
zhaw.organisationalunit | Institut für Angewandte Mathematik und Physik (IAMP) | de_CH
zhaw.conference.details | SPIE OPTO, San Francisco, USA, 2-7 February 2019 | de_CH
zhaw.funding.eu | No | de_CH
zhaw.originated.zhaw | Yes | de_CH
zhaw.publication.status | publishedVersion | de_CH
zhaw.publication.review | Peer review (Abstract) | de_CH
zhaw.webfeed | Photonics | de_CH
zhaw.author.additional | No | de_CH
Appears in collections: Publikationen School of Engineering

Files in This Item:
There are no files associated with this item.
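The abstract describes training a neural network on synthetic luminescence decay data so that it can recover an oxygen concentration directly from a measured decay, without an analytical sensor model. A minimal sketch of that idea is below, assuming mono-exponential decays quenched according to the standard Stern-Volmer relation; all numerical values (tau0, K_SV, the concentration range, the network size) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative constants (assumed, not from the record):
TAU0 = 40e-6          # unquenched decay time [s]
K_SV = 2.5e-5         # Stern-Volmer constant [1/ppm]
O2_MAX = 2.0e5        # upper end of the simulated O2 range [ppm]
T = np.linspace(0.0, 200e-6, 32)  # time grid sampling the decay curve

def synthetic_decay(o2_ppm):
    """Mono-exponential decay; the decay time is shortened by O2
    quenching via Stern-Volmer: tau = tau0 / (1 + K_SV * [O2])."""
    tau = TAU0 / (1.0 + K_SV * o2_ppm)
    return np.exp(-T / tau)

# Synthetic training set: decay curves labelled with their O2 concentration.
o2 = rng.uniform(0.0, O2_MAX, 2000)
X = np.array([synthetic_decay(c) for c in o2])
y = o2 / O2_MAX                    # target normalised to [0, 1]

# One-hidden-layer regression network, trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (32, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, 16);       b2 = 0.0
lr = 0.05
losses = []
for _ in range(500):
    h = np.tanh(X @ W1 + b1)       # hidden activations
    pred = h @ W2 + b2             # predicted normalised concentration
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation of the mean-squared-error loss.
    gW2 = h.T @ err / len(X); gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h ** 2)
    gW1 = X.T @ dh / len(X);  gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Estimate the O2 concentration for an unseen decay curve.
probe = synthetic_decay(5.0e4)
estimate = (np.tanh(probe @ W1 + b1) @ W2 + b2) * O2_MAX
```

The network never sees the Stern-Volmer model at inference time; it only maps sampled intensity curves to concentrations, which mirrors the model-free approach claimed in the abstract. In the actual work the training data would be replaced by (or validated against) measured decays.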
Venturini, F., & Baumgartner, M. (2019). New approach for luminescence sensing based on machine learning. SPIE OPTO, San Francisco, USA, 2-7 February 2019.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.