Deep Learning for Beginners

You're reading from Deep Learning for Beginners: A beginner's guide to getting up and running with deep learning from scratch using Python.

Product type: Paperback
Published: Sep 2020
Publisher: Packt
ISBN-13: 9781838640859
Length: 432 pages
Edition: 1st
Authors (2): Pablo Rivas, Dr. Pablo Rivas
Table of Contents (20 chapters)

Preface
1. Section 1: Getting Up to Speed
2. Introduction to Machine Learning (FREE CHAPTER)
3. Setup and Introduction to Deep Learning Frameworks
4. Preparing Data
5. Learning from Data
6. Training a Single Neuron
7. Training Multiple Layers of Neurons
8. Section 2: Unsupervised Deep Learning
9. Autoencoders
10. Deep Autoencoders
11. Variational Autoencoders
12. Restricted Boltzmann Machines
13. Section 3: Supervised Deep Learning
14. Deep and Wide Neural Networks
15. Convolutional Neural Networks
16. Recurrent Neural Networks
17. Generative Adversarial Networks
18. Final Remarks on the Future of Deep Learning
19. Other Books You May Enjoy

The perceptron learning algorithm

The perceptron learning algorithm (PLA) is the following:

Input: Binary class dataset D = {(x_i, y_i)}, with labels y_i ∈ {-1, +1}

  • Initialize w to zeros, and the iteration counter to t = 0
  • While there are any incorrectly classified examples:
  • Pick an incorrectly classified example, call it x*, whose true label is y*
  • Update w as follows: w ← w + y* x*
  • Increase the iteration counter, t ← t + 1, and repeat
Return: w, t
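The update step (the standard PLA rule, w ← w + y·x for a misclassified example x) can be illustrated on a single toy example; the specific values below are illustrative, not from the book's dataset:

```python
import numpy as np

w = np.zeros(3)                 # weights, with the bias folded in as w[0]
x = np.array([1.0, 2.0, -1.0])  # example with a leading 1 for the bias term
y = -1                          # true label

# x is misclassified: sign(w @ x) = sign(0), which is not -1,
# so we apply the perceptron update w <- w + y*x
w = w + y * x
print(w)  # [-1. -2.  1.]
```

After the update, w @ x = -6, so this example is now correctly classified with a negative score.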

Now, let's see how this takes form in Python.

PLA in Python

Here is a Python implementation that we will discuss piece by piece (some of it was covered earlier):

import random
import numpy as np
from sklearn.datasets import make_classification

N = 100            # number of samples to generate
random.seed(a=7)   # set the seed for reproducibility

X, y = make_classification(n_samples=N, n_features=2, n_classes=2,
                           n_informative=2, n_redundant=0, n_repeated=0,
                           n_clusters_per_class=1, class_sep=1.2,
                           random_state=5)

y[y == 0] = -1  # relabel the classes as {-1, +1}, as the PLA expects

X_train = np.append(np.ones((N, 1)), X, 1)  # add a column of ones

# initialize the weights to zeros
w = np.zeros(X_train.shape[1])
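The listing above stops at the weight initialization. A minimal sketch of how the remaining training loop could look, following the algorithm described earlier, is shown below; the variable names (t, max_iter) and the iteration cap are assumptions of this sketch, not the book's code:

```python
import random
import numpy as np
from sklearn.datasets import make_classification

# Recreate the setup from the listing above so the sketch is self-contained.
N = 100
random.seed(a=7)
X, y = make_classification(n_samples=N, n_features=2, n_classes=2,
                           n_informative=2, n_redundant=0, n_repeated=0,
                           n_clusters_per_class=1, class_sep=1.2,
                           random_state=5)
y[y == 0] = -1
X_train = np.append(np.ones((N, 1)), X, 1)

w = np.zeros(X_train.shape[1])  # weights, including the bias term
t = 0                           # iteration counter
max_iter = 1000                 # safety cap: PLA only terminates if the data is separable

while t < max_iter:
    preds = np.sign(X_train @ w)
    preds[preds == 0] = -1                 # treat a zero score as the negative class
    misclassified = np.where(preds != y)[0]
    if len(misclassified) == 0:            # every example classified correctly
        break
    i = misclassified[0]                   # pick a misclassified example
    w = w + y[i] * X_train[i]              # PLA update: w <- w + y*x
    t += 1
```

On linearly separable data such as this well-separated two-cluster problem, the loop stops as soon as no misclassified examples remain, returning the learned weights w and the iteration count t.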