Please use this identifier to cite or link to this item: https://doi.org/10.21256/zhaw-22213
Full metadata record
DC Field: Value
dc.contributor.author: Michelucci, Umberto
dc.contributor.author: Venturini, Francesca
dc.date.accessioned: 2021-04-01T07:08:11Z
dc.date.available: 2021-04-01T07:08:11Z
dc.date.issued: 2021
dc.identifier.issn: 2504-4990
dc.identifier.uri: https://digitalcollection.zhaw.ch/handle/11475/22213
dc.description.abstract: The results of neural networks depend strongly on the training data, the weight initialisation, and the chosen hyperparameters. Determining the distribution of a statistical estimator, such as the Mean Squared Error (MSE) or the accuracy, is fundamental to evaluating the performance of a neural network model (NNM). For many machine learning models, such as linear regression, it is possible to analytically obtain information such as the variance or confidence intervals on the results. Neural networks, due to their complexity, are not analytically tractable, and it is therefore impossible to easily estimate the distributions of statistical estimators. For example, when estimating the global performance of an NNM by the MSE in a regression problem, it is important to know the variance of the MSE. Bootstrap is one of the most important resampling techniques for estimating averages and variances, among other properties, of statistical estimators. In this tutorial, the application of resampling techniques (including bootstrap) to the evaluation of neural networks' performance is explained from both a theoretical and a practical point of view. The pseudo-code of the algorithms is provided to facilitate their implementation. Computational aspects, such as the training time, are discussed, since resampling techniques always require simulations to be run many thousands of times and are therefore computationally intensive. A specific version of the bootstrap algorithm is presented that allows the estimation of the distribution of a statistical estimator when dealing with an NNM in a computationally effective way. Finally, the algorithms are compared on both synthetically generated and real data to demonstrate their performance.
dc.language.iso: en
dc.publisher: MDPI
dc.relation.ispartof: Machine Learning and Knowledge Extraction
dc.rights: http://creativecommons.org/licenses/by/4.0/
dc.subject: Neural network
dc.subject: Machine learning
dc.subject: Bootstrap
dc.subject: Resampling
dc.subject: Algorithm
dc.subject.ddc: 006: Spezielle Computerverfahren
dc.title: Estimating neural network's performance with bootstrap: a tutorial
dc.type: Beitrag in wissenschaftlicher Zeitschrift (article in a scientific journal)
dcterms.type: Text
zhaw.departement: School of Engineering
zhaw.organisationalunit: Institut für Angewandte Mathematik und Physik (IAMP)
dc.identifier.doi: 10.3390/make3020018
dc.identifier.doi: 10.21256/zhaw-22213
zhaw.funding.eu: No
zhaw.issue: 2
zhaw.originated.zhaw: Yes
zhaw.pages.end: 373
zhaw.pages.start: 357
zhaw.publication.status: publishedVersion
zhaw.volume: 3
zhaw.publication.review: Peer review (Publikation / publication)
zhaw.author.additional: No
zhaw.display.portrait: Yes
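The core technique described in the abstract, bootstrap estimation of the average and variance of a statistical estimator such as the MSE, can be sketched in a few lines. This is a minimal illustration under assumed inputs, not the paper's specific algorithm; the function name `bootstrap_mse` and all parameters are hypothetical:

```python
import random
import statistics

def bootstrap_mse(y_true, y_pred, n_boot=2000, seed=0):
    """Estimate the mean and standard deviation of the MSE by
    resampling per-sample squared errors with replacement."""
    rng = random.Random(seed)
    sq_err = [(t - p) ** 2 for t, p in zip(y_true, y_pred)]
    n = len(sq_err)
    mses = []
    for _ in range(n_boot):
        # Draw n squared errors with replacement and average them.
        sample = [sq_err[rng.randrange(n)] for _ in range(n)]
        mses.append(sum(sample) / n)
    return statistics.mean(mses), statistics.stdev(mses)
```

The bootstrap mean approximates the MSE itself, while the standard deviation quantifies its uncertainty, which is exactly the quantity the abstract notes cannot be obtained analytically for a neural network.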
Appears in collections: Publikationen School of Engineering

Files in This Item:
File: 2021_Michelucci-Venturini_Neural-network-performance-bootstrap.pdf (755.58 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.