Please use this identifier to cite or link to this item:
https://doi.org/10.21256/zhaw-20804
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Tuggener, Lukas | - |
dc.contributor.author | Amirian, Mohammadreza | - |
dc.contributor.author | Benites de Azevedo e Souza, Fernando | - |
dc.contributor.author | von Däniken, Pius | - |
dc.contributor.author | Gupta, Prakhar | - |
dc.contributor.author | Schilling, Frank-Peter | - |
dc.contributor.author | Stadelmann, Thilo | - |
dc.date.accessioned | 2020-11-12T13:24:47Z | - |
dc.date.available | 2020-11-12T13:24:47Z | - |
dc.date.issued | 2020-11-06 | - |
dc.identifier.issn | 2673-2688 | de_CH |
dc.identifier.uri | https://digitalcollection.zhaw.ch/handle/11475/20804 | - |
dc.description.abstract | We present an extensive evaluation of a wide variety of promising design patterns for automated deep-learning (AutoDL) methods, organized according to the problem categories of the 2019 AutoDL challenges, which set the task of optimizing both model accuracy and search efficiency under tight time and computing constraints. We propose structured empirical evaluations as the most promising avenue to obtain design principles for deep-learning systems due to the absence of strong theoretical support. From these evaluations, we distill relevant patterns which give rise to neural network design recommendations. In particular, we establish (a) that very wide fully connected layers learn meaningful features faster; we illustrate (b) how the lack of pretraining in audio processing can be compensated by architecture search; we show (c) that in text processing deep-learning-based methods only pull ahead of traditional methods for short text lengths with less than a thousand characters under tight resource limitations; and lastly we present (d) evidence that in very data- and computing-constrained settings, hyperparameter tuning of more traditional machine-learning methods outperforms deep-learning systems. | de_CH |
dc.language.iso | en | de_CH |
dc.publisher | MDPI | de_CH |
dc.relation.ispartof | AI | de_CH |
dc.rights | http://creativecommons.org/licenses/by/4.0/ | de_CH |
dc.subject | Automated machine learning | de_CH |
dc.subject | Architecture design | de_CH |
dc.subject | Computer vision | de_CH |
dc.subject | Audio processing | de_CH |
dc.subject | Natural language processing | de_CH |
dc.subject | Weakly supervised learning | de_CH |
dc.subject.ddc | 006: Special computer methods | de_CH |
dc.title | Design patterns for resource-constrained automated deep-learning methods | de_CH |
dc.type | Contribution to a scientific journal | de_CH |
dcterms.type | Text | de_CH |
zhaw.departement | School of Engineering | de_CH |
zhaw.organisationalunit | Institut für Informatik (InIT) | de_CH |
dc.identifier.doi | 10.3390/ai1040031 | de_CH |
dc.identifier.doi | 10.21256/zhaw-20804 | - |
zhaw.funding.eu | No | de_CH |
zhaw.issue | 4 | de_CH |
zhaw.originated.zhaw | Yes | de_CH |
zhaw.pages.end | 538 | de_CH |
zhaw.pages.start | 510 | de_CH |
zhaw.publication.status | publishedVersion | de_CH |
zhaw.volume | 1 | de_CH |
zhaw.publication.review | Peer review (publication) | de_CH |
zhaw.webfeed | Datalab | de_CH |
zhaw.webfeed | Information Engineering | de_CH |
zhaw.webfeed | Software Systems | de_CH |
zhaw.webfeed | ZHAW digital | de_CH |
zhaw.webfeed | Natural Language Processing | de_CH |
zhaw.webfeed | Machine Perception and Cognition | de_CH |
zhaw.webfeed | Intelligent Vision Systems | de_CH |
zhaw.funding.zhaw | Ada – Advanced Algorithms for an Artificial Data Analyst | de_CH |
zhaw.author.additional | No | de_CH |
zhaw.display.portrait | Yes | de_CH |
Appears in collections: | Publikationen School of Engineering |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
2020_Tuggener-etal_Design-patterns-deep-learning-methods_AI.pdf | | 6.28 MB | Adobe PDF |
Tuggener, L., Amirian, M., Benites de Azevedo e Souza, F., von Däniken, P., Gupta, P., Schilling, F.-P., & Stadelmann, T. (2020). Design patterns for resource-constrained automated deep-learning methods. AI, 1(4), 510–538. https://doi.org/10.3390/ai1040031
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.