Modern Computer Vision with PyTorch

You're reading from Modern Computer Vision with PyTorch: Explore deep learning concepts and implement over 50 real-world image applications

Product type: Paperback
Published in: Nov 2020
Publisher: Packt
ISBN-13: 9781839213472
Length: 824 pages
Edition: 1st Edition
Authors (2): Yeshwanth Reddy, V Kishore Ayyadevara
Table of Contents (25 chapters)

  • Preface
  • Section 1 - Fundamentals of Deep Learning for Computer Vision
    • Artificial Neural Network Fundamentals (free chapter)
    • PyTorch Fundamentals
    • Building a Deep Neural Network with PyTorch
  • Section 2 - Object Classification and Detection
    • Introducing Convolutional Neural Networks
    • Transfer Learning for Image Classification
    • Practical Aspects of Image Classification
    • Basics of Object Detection
    • Advanced Object Detection
    • Image Segmentation
    • Applications of Object Detection and Segmentation
  • Section 3 - Image Manipulation
    • Autoencoders and Image Manipulation
    • Image Generation Using GANs
    • Advanced GANs to Manipulate Images
  • Section 4 - Combining Computer Vision with Other Techniques
    • Training with Minimal Data Points
    • Combining Computer Vision and NLP Techniques
    • Combining Computer Vision and Reinforcement Learning
    • Moving a Model to Production
    • Using OpenCV Utilities for Image Analysis
  • Other Books You May Enjoy
  • Appendix
Artificial Neural Network Fundamentals

An Artificial Neural Network (ANN) is a supervised learning algorithm that is loosely inspired by the way the human brain functions. Similar to the way neurons in the human brain are connected and activated, a neural network takes an input and passes it through a function, causing certain subsequent neurons to be activated and, in turn, producing the output.
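As a rough illustration of that flow, here is a minimal sketch (not taken from the book's code) of an input being passed through a hidden layer whose activated neurons feed an output layer; the two-feature input, the ReLU activation, and the layer sizes are arbitrary choices made purely for illustration.

    import numpy as np

    def feedforward(x, w_hidden, w_output):
        # weighted sum followed by ReLU: only neurons with a positive sum "activate"
        hidden = np.maximum(0, x @ w_hidden)
        # the activated hidden neurons are combined to produce the output
        return hidden @ w_output

    x = np.array([[1.0, 2.0]])           # one input with two features (made up)
    w_hidden = np.random.randn(2, 3)     # weights connecting the input to 3 hidden neurons
    w_output = np.random.randn(3, 1)     # weights connecting the hidden neurons to 1 output
    print(feedforward(x, w_hidden, w_output))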

There are several standard ANN architectures. The universal approximation theorem states that, with a large enough neural network and an appropriate set of weights, we can approximate any given input-to-output mapping to an arbitrary degree of accuracy. In practice, this means that for a given dataset/task we can choose an architecture and keep adjusting its weights until the ANN predicts what we want it to predict. Adjusting the weights until this happens is called training the neural network. The ability to successfully train customized architectures on large datasets is what has brought ANNs to prominence in solving a wide variety of tasks.
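To make "keep adjusting its weights" concrete, the following toy sketch (not from the book) nudges a single weight up or down by a fixed step of 0.01, keeping whichever direction lowers the squared error on one made-up input/output pair; the gradient-based version of this idea is what the rest of the chapter develops.

    x, y = 2.0, 10.0      # a made-up input and the output we want to predict
    w = 0.5               # start from an arbitrary weight
    for _ in range(500):
        error = (x * w - y) ** 2
        # try a slightly larger and a slightly smaller weight; keep whichever reduces the error
        if (x * (w + 0.01) - y) ** 2 < error:
            w += 0.01
        elif (x * (w - 0.01) - y) ** 2 < error:
            w -= 0.01
    print(w)              # settles near 5.0, since 2.0 * 5.0 == 10.0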

One of the prominent tasks in computer vision is recognizing the class of the object present in an image. ImageNet was an annual competition in which models competed to identify the class of objects present in an image. The reduction in the classification error rate over the years is shown in the following chart:

[Chart: ImageNet classification error rate by year]
The year 2012 was when a neural network (AlexNet) was used in the winning solution of the competition. As you can see from the preceding chart, there was a considerable reduction in error from 2011 to 2012, achieved by leveraging neural networks. Since then, with deeper and more complex neural networks, the classification error has kept falling and has surpassed human-level performance. This gives us solid motivation to learn and implement neural networks for our own tasks, where applicable.

In this chapter, we will create a very simple architecture on a simple dataset and focus mainly on how the various building blocks of an ANN (feedforward propagation, backpropagation, and the learning rate) help in adjusting the weights so that the network learns to predict the expected outputs from given inputs. We will first learn, mathematically, what a neural network is, and then build one from scratch to establish a solid foundation. We will then learn about each component responsible for training the neural network and implement them in code as well (a brief preview of how these pieces fit together follows the topic list below). Overall, we will cover the following topics:

  • Comparing AI and traditional machine learning
  • Learning about the artificial neural network building blocks
  • Implementing feedforward propagation
  • Implementing backpropagation
  • Putting feedforward propagation and backpropagation together
  • Understanding the impact of the learning rate
  • Summarizing the training process of a neural network
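As a brief preview of how these pieces fit together, here is a minimal, hedged PyTorch sketch of one training loop: feedforward propagation, a loss measuring the prediction error, backpropagation of gradients, and a weight update scaled by the learning rate. The toy data, layer sizes, and the learning rate of 0.01 are placeholders chosen for illustration; the chapter itself builds these steps from scratch to show what happens under the hood.

    import torch
    import torch.nn as nn

    # made-up data: learn to map two inputs to their sum
    X = torch.tensor([[1., 2.], [3., 4.], [5., 6.]])
    y = torch.tensor([[3.], [7.], [11.]])

    model = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))
    loss_fn = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # learning rate scales each update

    for epoch in range(100):
        pred = model(X)             # feedforward propagation
        loss = loss_fn(pred, y)     # how far the predictions are from the targets
        optimizer.zero_grad()
        loss.backward()             # backpropagation: compute gradients of the loss
        optimizer.step()            # adjust the weights to reduce the loss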