17 Highly Recommended Books For Deep Learning Researchers

Prominent computer science professors and engineers such as Nando de Freitas, Michael I. Jordan, Juergen Schmidhuber, Alex Lamb, and Geoffrey Hinton recommend these books for learning deep learning.

Asymptotic Statistics

by A. W. van der Vaart

About the book:  Here is a practical and mathematically rigorous introduction to the field of asymptotic statistics. In addition to most of the standard topics of an asymptotics course--likelihood inference, M-estimation, the theory of asymptotic efficiency, U-statistics, and rank procedures--the book also presents recent research topics such as semiparametric models, the bootstrap, and empirical processes and their applications. The topics are organized from the central idea of approximation by limit experiments, one of the book's unifying themes that mainly entails the local approximation of the classical i.i.d. set up with smooth parameters by location experiments involving a single, normally distributed observation.


Notes:  This book is used to teach students at Berkeley. It shows how many ideas in inference (M-estimation, which includes maximum likelihood and empirical risk minimization; the bootstrap; semiparametrics; etc.) rest on top of empirical process theory.
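To make the M-estimation idea concrete, here is a minimal sketch (not from the book; all names and numbers are illustrative): an M-estimator minimizes an empirical risk, and with squared loss the minimizer of that risk is just the sample mean.

```python
# Hedged sketch of M-estimation: estimate a location parameter by gradient
# descent on the empirical risk (1/n) * sum(loss(theta - x_i)).

def m_estimate(data, loss_grad, theta=0.0, lr=0.01, steps=5000):
    """Gradient descent on the empirical risk; loss_grad is the derivative
    of the per-observation loss with respect to its argument."""
    n = len(data)
    for _ in range(steps):
        g = sum(loss_grad(theta - x) for x in data) / n
        theta -= lr * g
    return theta

data = [1.0, 2.0, 3.0, 4.0, 10.0]
# Squared loss: the gradient of (u^2)/2 is u, so the minimizer is the mean.
mean_hat = m_estimate(data, lambda u: u)
print(mean_hat)  # converges to the sample mean, 4.0
```

Swapping in the gradient of the absolute loss would instead recover the sample median, which is the usual first example of a robust M-estimator.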


Introductory Lectures on Convex Optimization

by Y. Nesterov

About the book:  It was in the middle of the 1980s, when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization. The importance of this paper, containing a new polynomial-time algorithm for linear optimization problems, was not only in its complexity bound. At that time, the most surprising feature of this algorithm was that the theoretical prediction of its high efficiency was supported by excellent computational results. This unusual fact dramatically changed the style and directions of the research in nonlinear optimization. Thereafter it became more and more common that the new methods were provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments. In a new rapidly developing field, which got the name "polynomial-time interior-point methods", such a justification was obligatory. After almost fifteen years of intensive research, the main results of this development started to appear in monographs [12, 14, 16, 17, 18, 19]. Approximately at that time the author was asked to prepare a new course on nonlinear optimization for graduate students. The idea was to create a course which would reflect the new developments in the field. Actually, this was a major challenge. At the time only the theory of interior-point methods for linear optimization was polished enough to be explained to students. The general theory of self-concordant functions had appeared in print only once in the form of research monograph [12].


Notes:  Nesterov's book offers a way to begin understanding lower bounds in optimization.
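As a minimal sketch of the kind of scheme whose worst-case rates (and matching lower bounds) the book analyzes, here is the plain gradient method with step size 1/L on a smooth convex quadratic (example function assumed for illustration):

```python
# Hedged sketch: gradient descent with step 1/L on a convex quadratic.

def grad_descent(grad, x, lr, steps):
    for _ in range(steps):
        x = [xi - lr * gi for xi, gi in zip(x, grad(x))]
    return x

# f(x) = 0.5*(x1^2 + 100*x2^2); gradient is (x1, 100*x2); smoothness L = 100.
f = lambda x: 0.5 * (x[0] ** 2 + 100 * x[1] ** 2)
grad = lambda x: [x[0], 100 * x[1]]

x = grad_descent(grad, [1.0, 1.0], lr=1.0 / 100, steps=2000)
print(f(x))  # approaches the optimum f* = 0
```

The well-conditioned coordinate converges in one step while the slow coordinate contracts by a factor (1 - 1/L) per iteration, which is exactly the kind of gap that motivates accelerated methods and the lower-bound constructions in the book.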


Machine Learning: A Probabilistic Perspective

by Kevin P. Murphy

About the book:  Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package -- PMTK (probabilistic modeling toolkit) -- that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.


Notes:  Nando de Freitas recommends this book for machine learning. Kevin Murphy is coming out with an updated version of the book that will have a much better deep learning section.
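The blurb mentions L1 regularization among the book's topics. A minimal sketch (not PMTK code; the scalar problem is assumed for illustration) is the soft-thresholding operator used by coordinate-descent lasso solvers, which shrinks small coefficients exactly to zero:

```python
# Hedged sketch: the proximal operator of the L1 penalty.

def soft_threshold(x, lam):
    """argmin_w 0.5*(w - x)^2 + lam*|w| -- shrink x toward zero by lam."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

print([soft_threshold(v, 1.0) for v in [3.0, 0.5, -2.0]])  # [2.0, 0.0, -1.0]
```

This exact zeroing of weak coordinates is why L1 penalties produce sparse models, in contrast to the uniform shrinkage of an L2 penalty.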


An Introduction to Statistical Learning

by Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani

About the book:  An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.


Notes:  Nando de Freitas recommends this book to those who have worked on unsupervised, supervised, and LSTM-based models and have a strong conceptual understanding of them, before they pursue a PhD in this field.
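Among the resampling methods the book covers, the bootstrap is the simplest to sketch. This is a hedged illustration (plain Python, not the book's R labs; the data values are made up): estimate the standard error of the sample mean by resampling the data with replacement.

```python
import random

# Hedged sketch of the bootstrap: the spread of a statistic across resampled
# datasets estimates its sampling variability.

def bootstrap_se(data, stat, n_boot=2000, seed=0):
    rng = random.Random(seed)
    reps = [stat([rng.choice(data) for _ in data]) for _ in range(n_boot)]
    m = sum(reps) / n_boot
    return (sum((r - m) ** 2 for r in reps) / (n_boot - 1)) ** 0.5

data = [2.1, 3.4, 1.9, 5.0, 4.2, 2.8, 3.7, 4.9]
se = bootstrap_se(data, lambda xs: sum(xs) / len(xs))
print(se)  # roughly the sample standard deviation divided by sqrt(n)
```

The same resampling loop works for statistics with no closed-form standard error, which is where the bootstrap earns its keep.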


Reinforcement Learning: An Introduction

by Richard S. Sutton, Andrew G. Barto

About the book:  Reinforcement learning, one of the most active research areas in artificial intelligence, is a computational approach to learning whereby an agent tries to maximize the total amount of reward it receives when interacting with a complex, uncertain environment. In Reinforcement Learning, Richard Sutton and Andrew Barto provide a clear and simple account of the key ideas and algorithms of reinforcement learning. Their discussion ranges from the history of the field's intellectual foundations to the most recent developments and applications. The only necessary mathematical background is familiarity with elementary concepts of probability. The book is divided into three parts. Part I defines the reinforcement learning problem in terms of Markov decision processes. Part II provides basic solution methods: dynamic programming, Monte Carlo methods, and temporal-difference learning. Part III presents a unified view of the solution methods and incorporates artificial neural networks, eligibility traces, and planning; the two final chapters present case studies and consider the future of reinforcement learning.


Notes:  This book is a survey of traditional reinforcement learning. It's perfect for those who have just started in this field and want to learn more about reinforcement learning.
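As a taste of the temporal-difference methods in Part II, here is a minimal sketch (assumed for illustration, not the book's code): tabular TD(0) value estimation on a tiny two-state chain where state 0 steps to state 1 with reward 0, and state 1 terminates with reward 1.

```python
# Hedged sketch of TD(0): bootstrap each state's value estimate from the
# observed reward plus the current estimate of the next state's value.

def td0(episodes=5000, alpha=0.1, gamma=1.0):
    V = [0.0, 0.0]
    for _ in range(episodes):
        # Episode: 0 -> 1 (reward 0), then 1 -> terminal (reward 1).
        V[0] += alpha * (0.0 + gamma * V[1] - V[0])
        V[1] += alpha * (1.0 - V[1])
    return V

V = td0()
print(V)  # both values approach 1.0: every episode eventually earns reward 1
```

Even on this toy chain you can see the defining feature of TD learning: V[0] improves by bootstrapping from V[1] rather than waiting for the episode's final return.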


Pattern Recognition and Machine Learning

by Christopher M. Bishop

About the book:  This is the first textbook on pattern recognition to present the Bayesian viewpoint. The book presents approximate inference algorithms that permit fast approximate answers in situations where exact answers are not feasible. It is also the first book to apply graphical models to machine learning, using them to describe probability distributions. No previous knowledge of pattern recognition or machine learning concepts is assumed. Familiarity with multivariate calculus and basic linear algebra is required, and some experience in the use of probabilities would be helpful though not essential, as the book includes a self-contained introduction to basic probability theory.


Notes:  Taking a probabilistic view throughout, this is the book Juergen Schmidhuber calls the bible of traditional machine learning.
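The Bayesian viewpoint the book presents can be sketched in a few lines (a standard textbook example, assumed here for illustration): a conjugate beta prior over a coin's bias, updated by observed flips, yields a beta posterior where the prior acts like pseudo-counts.

```python
# Hedged sketch of conjugate Bayesian updating for a coin's bias.

def beta_update(a, b, heads, tails):
    """Beta(a, b) prior + binomial data -> Beta(a + heads, b + tails)."""
    return a + heads, b + tails

a, b = beta_update(2, 2, heads=7, tails=3)   # prior Beta(2, 2), then 10 flips
posterior_mean = a / (a + b)
print(posterior_mean)  # (2 + 7) / (2 + 2 + 10) = 9/14, about 0.643
```

Note how the posterior mean sits between the prior mean (0.5) and the observed frequency (0.7), pulled toward the data as more flips arrive.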


Deep Learning

by Ian Goodfellow, Yoshua Bengio, Aaron Courville

About the book:  "Written by three experts in the field, Deep Learning is the only comprehensive book on the subject." -- Elon Musk, cochair of OpenAI; cofounder and CEO of Tesla and SpaceX. Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and video games. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
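The backpropagation-plus-optimization machinery at the core of the book can be sketched on the smallest possible network (all weights and numbers below are illustrative, not from the book): one hidden tanh unit, squared-error loss, and a single gradient-descent step.

```python
import math

# Hedged sketch: backpropagation and one SGD step on a one-hidden-unit net.

def forward(w1, w2, x):
    h = math.tanh(w1 * x)      # hidden activation
    return w2 * h, h           # prediction and cached activation

def loss_and_grads(w1, w2, x, y):
    yhat, h = forward(w1, w2, x)
    err = yhat - y
    loss = 0.5 * err ** 2
    g2 = err * h                          # dL/dw2
    g1 = err * w2 * (1 - h ** 2) * x      # dL/dw1, via the chain rule
    return loss, g1, g2

w1, w2, x, y = 0.5, -0.3, 1.0, 1.0
loss0, g1, g2 = loss_and_grads(w1, w2, x, y)
w1, w2 = w1 - 0.1 * g1, w2 - 0.1 * g2     # one SGD step, learning rate 0.1
loss1, _, _ = loss_and_grads(w1, w2, x, y)
print(loss0, loss1)  # a small step along the negative gradient reduces loss
```

The chain-rule factor (1 - h**2) is the derivative of tanh at the cached activation; reusing such cached forward-pass values is exactly what makes backpropagation efficient.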


Machine Learning

by Tom M. Mitchell

About the book:  This book covers the field of machine learning, which is the study of algorithms that allow computer programs to automatically improve through experience. The book is intended to support upper level undergraduate and introductory level graduate courses in machine learning.


Notes:  This book is used mostly at a university level. It covers just the fundamentals and would be perfect for a beginner in this field.
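Among those fundamentals is decision-tree learning, which ranks attributes by entropy. A minimal sketch (assumed for illustration, not Mitchell's code) of the entropy of a binary class split:

```python
import math

# Hedged sketch: entropy of a two-class sample, the impurity measure behind
# ID3-style decision-tree learning.

def entropy(pos, neg):
    total = pos + neg
    h = 0.0
    for c in (pos, neg):
        if c:                      # skip empty classes (0 * log 0 -> 0)
            p = c / total
            h -= p * math.log2(p)
    return h

print(entropy(9, 5))   # about 0.940 for a 9+/5- split
print(entropy(7, 7))   # 1.0: a 50/50 split is maximally uncertain
print(entropy(14, 0))  # 0.0: a pure node carries no uncertainty
```

Information gain, the quantity ID3 actually maximizes, is just the parent's entropy minus the weighted entropy of the children a split produces.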


Hands-On Machine Learning with Scikit-Learn and TensorFlow

by Aurélien Géron

About the book:  Through a series of recent breakthroughs, deep learning has boosted the entire field of machine learning. Now, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks—scikit-learn and TensorFlow—author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You'll learn a range of techniques, starting with simple linear regression and progressing to deep neural networks. With exercises in each chapter to help you apply what you've learned, all you need is programming experience to get started.
- Explore the machine learning landscape, particularly neural nets
- Use scikit-learn to track an example machine-learning project end-to-end
- Explore several training models, including support vector machines, decision trees, random forests, and ensemble methods
- Use the TensorFlow library to build and train neural nets
- Dive into neural net architectures, including convolutional nets, recurrent nets, and deep reinforcement learning
- Learn techniques for training and scaling deep neural nets
- Apply practical code examples without acquiring excessive machine learning theory or algorithm details


Notes:  It provides one of the best overviews of how to think about and use TensorFlow. The chapters provide not just theory but full-blown practice. It skips the very basic Python material that other books often use as filler.
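The book's progression starts with simple linear regression. Here is a hedged sketch of that starting point in plain Python (not the book's TensorFlow code; the toy data is made up): batch gradient descent on mean squared error.

```python
# Hedged sketch: fit y = w*x + b by batch gradient descent on MSE.

def fit_line(xs, ys, lr=0.05, steps=5000):
    w = b = 0.0
    n = len(xs)
    for _ in range(steps):
        grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) * 2 / n
        grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) * 2 / n
        w, b = w - lr * grad_w, b - lr * grad_b
    return w, b

xs, ys = [0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0]  # exactly y = 2x + 1
w, b = fit_line(xs, ys)
print(w, b)  # converges toward w = 2, b = 1
```

The same loss-gradient-update loop, scaled up and automated by a framework's autodiff, is what trains the deep networks the later chapters build.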


Multi-Valued and Universal Binary Neurons

by Igor Aizenberg, Naum N. Aizenberg, Joos P.L. Vandewalle

About the book:  Multi-Valued and Universal Binary Neurons deals with two new types of neurons: multi-valued neurons and universal binary neurons. These neurons are based on complex number arithmetic and are hence much more powerful than the typical neurons used in artificial neural networks. Therefore, networks with such neurons exhibit a broad functionality. They can not only realise threshold input/output maps but can also implement any arbitrary Boolean function. Two learning methods are presented whereby these networks can be trained easily. The broad applicability of these networks is proven by several case studies in different fields of application: image processing, edge detection, image enhancement, super resolution, pattern recognition, face recognition, and prediction. The book is hence partitioned into three almost equally sized parts: a mathematical study of the unique features of these new neurons, learning of networks of such neurons, and application of such neural networks. Most of this work was developed by the first two authors over a period of more than 10 years and was only available in the Russian literature. With this book we present the first comprehensive treatment of this important class of neural networks in the open Western literature. Multi-Valued and Universal Binary Neurons is intended for anyone with a scholarly interest in neural network theory, applications and learning. It will also be of interest to researchers and practitioners in the fields of image processing, pattern recognition, control and robotics.


Notes:  The author wrote about "deep learning of the features of threshold Boolean functions", one of the most important objects considered in the theory of perceptrons.
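The book's central claim, that complex-valued neurons can realize Boolean functions a single real threshold neuron cannot, can be sketched as follows (the particular weights and sector activation are an illustrative construction, not taken from the book): a single neuron with complex weights computes XOR by reading off which quarter of the complex plane its weighted sum lands in.

```python
import cmath
import math

# Hedged sketch: a universal-binary-neuron-style unit. The activation is
# (-1)**j, where j indexes the 90-degree sector containing the weighted sum.

def ubn(x1, x2):
    z = x1 + 1j * x2                       # complex weighted sum: w1=1, w2=i
    arg = cmath.phase(z) % (2 * math.pi)   # angle normalized to [0, 2*pi)
    sector = int(arg // (math.pi / 2))     # one of four 90-degree sectors
    return (-1) ** sector

# Bipolar encoding: +1 for False, -1 for True.
for x1 in (1, -1):
    for x2 in (1, -1):
        print(x1, x2, ubn(x1, x2))  # output is -1 exactly when inputs differ
```

No single real-valued threshold neuron can compute XOR, since its classes are not linearly separable; the sector-based activation sidesteps that limit by letting one neuron's decision regions alternate around the origin.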
