Title: Advanced Applied Deep Learning: Convolutional Neural Networks and Object Detection
Authors: Michelucci, Umberto
Extent: 350 pages
Publisher / Ed. Institution: Apress, Berkeley
Issue Date: 15-Oct-2019
Edition: 1st edition
License: According to publishing contract
Type of review: Editorial review
Language: English
Subjects: Machine learning; Deep learning; Python; TensorFlow; Convolutional neural networks; Neural networks; Object detection
Subject (DDC): 004: Computer science
Abstract: Develop and optimize deep learning models with advanced architectures. This book teaches you the intricate details and subtleties of the algorithms that are at the core of convolutional neural networks. In Advanced Applied Deep Learning, you will study advanced topics on CNNs and object detection using Keras and TensorFlow. Along the way, you will look at the fundamental operations in CNNs, such as convolution and pooling, and then at more advanced architectures such as Inception networks, ResNets, and many more. Alongside the theory, you will discover how to work efficiently with Keras, with many tricks and tips, including how to customize logging in Keras with custom callback classes, what eager execution is, and how to use it in your models. Finally, you will study how object detection works and build a complete implementation of the YOLO (You Only Look Once) algorithm in Keras and TensorFlow. By the end of the book, you will have implemented various models in Keras and learned many advanced tricks that will bring your skills to the next level.
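The abstract mentions customizing logging in Keras with custom callback classes. As an illustration only (not taken from the book), a minimal sketch of that pattern might look like the following; the class name `LossLogger` is hypothetical, and `tf.keras` is assumed to be available:

```python
import tensorflow as tf


class LossLogger(tf.keras.callbacks.Callback):
    """Hypothetical custom callback that records the training loss
    reported at the end of every epoch."""

    def __init__(self):
        super().__init__()
        self.losses = []  # one entry per completed epoch

    def on_epoch_end(self, epoch, logs=None):
        # Keras passes the epoch's metrics in `logs`; store the loss.
        logs = logs or {}
        self.losses.append(logs.get("loss"))
```

An instance would be passed to `model.fit(..., callbacks=[LossLogger()])`; Keras then invokes `on_epoch_end` automatically after each epoch.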
Department: Life Sciences and Facility Management
Organisational Unit: Institute of Applied Simulation (IAS)
Publication type: Book
ISBN: 978-1-4842-4975-8
URI: https://digitalcollection.zhaw.ch/handle/11475/16969
Appears in Collections: Publikationen Life Sciences und Facility Management
