Please use this identifier to cite or link to this item: https://doi.org/10.21256/zhaw-18993
Full metadata record
DC Field | Value | Language
dc.contributor.author | Aghaebrahimian, Ahmad | -
dc.contributor.author | Cieliebak, Mark | -
dc.date.accessioned | 2019-12-19T14:49:29Z | -
dc.date.available | 2019-12-19T14:49:29Z | -
dc.date.issued | 2019 | -
dc.identifier.uri | https://digitalcollection.zhaw.ch/handle/11475/18993 | -
dc.description.abstract | Deep neural networks have advanced rapidly over the past several years, yet using them efficiently still seems like a black art to many practitioners. The reason is that obtaining consistent, outstanding results from a deep architecture requires optimizing many parameters known as hyperparameters. Hyperparameter tuning is an essential task in deep learning and can significantly change network performance. This paper distills over 3,000 GPU hours spent optimizing a network for a text-classification task across a wide array of hyperparameters. We provide a list of hyperparameters to tune, together with each one's impact on network performance. We hope that such a listing will give interested researchers a means to prioritize their efforts and to modify their deep architectures for the best performance with the least effort. | de_CH
dc.language.iso | en | de_CH
dc.publisher | Swisstext | de_CH
dc.rights | Not specified | de_CH
dc.subject.ddc | 006: Special computer methods | de_CH
dc.title | Hyperparameter tuning for deep learning in natural language processing | de_CH
dc.type | Conference: Paper | de_CH
dcterms.type | Text | de_CH
zhaw.departement | School of Engineering | de_CH
zhaw.organisationalunit | Institut für Informatik (InIT) | de_CH
dc.identifier.doi | 10.21256/zhaw-18993 | -
zhaw.conference.details | 4th Swiss Text Analytics Conference (SwissText 2019), Winterthur, June 18-19 2019 | de_CH
zhaw.funding.eu | No | de_CH
zhaw.originated.zhaw | Yes | de_CH
zhaw.publication.status | publishedVersion | de_CH
zhaw.publication.review | Peer review (Abstract) | de_CH
zhaw.webfeed | Software Systems | de_CH
zhaw.webfeed | Natural Language Processing | de_CH
zhaw.author.additional | No | de_CH
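The abstract above describes tuning hyperparameters for a text-classification network. As a purely illustrative sketch (not code from the paper), a minimal random search over a hypothetical hyperparameter space might look like the following; the search space, trial count, and the stand-in `evaluate()` function are all assumptions made for illustration:

```python
# Illustrative only: a minimal random search over common
# text-classification hyperparameters. In the paper's setting,
# evaluate() would train a full network (the expensive step that
# accounts for the reported 3,000+ GPU hours); here it is a
# deterministic stand-in so the sketch is self-contained.
import random

SEARCH_SPACE = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "dropout": [0.1, 0.3, 0.5],
    "embedding_dim": [100, 200, 300],
    "batch_size": [32, 64, 128],
}

def sample_config(space, rng):
    """Draw one hyperparameter configuration at random."""
    return {name: rng.choice(values) for name, values in space.items()}

def evaluate(config):
    """Stand-in for training a network and returning validation accuracy."""
    # Seed on the config so each configuration gets a stable fake score.
    rng = random.Random(str(sorted(config.items())))
    return rng.uniform(0.7, 0.9)

def random_search(space, n_trials=10, seed=0):
    """Try n_trials random configurations and keep the best one."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = sample_config(space, rng)
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best, score = random_search(SEARCH_SPACE)
```

In practice, the point the abstract makes is that knowing which hyperparameters matter most lets one shrink a search space like this before spending GPU time on it.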
Appears in collections: Publikationen School of Engineering

Files in This Item:
File | Description | Size | Format
swisstext19_Hyperparameters.pdf | Hyperparameter Tuning for Deep Learning in Natural Language Processing | 171.02 kB | Adobe PDF
Aghaebrahimian, A., & Cieliebak, M. (2019). Hyperparameter tuning for deep learning in natural language processing. 4th Swiss Text Analytics Conference (SwissText 2019), Winterthur, June 18-19 2019. https://doi.org/10.21256/zhaw-18993


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.