Neuro-Symbolic AI

You're reading from Neuro-Symbolic AI: Design transparent and trustworthy systems that understand the world as you do

Product type: Paperback
Published: May 2023
Publisher: Packt
ISBN-13: 9781804617625
Length: 196 pages
Edition: 1st
Authors (2): Alexiei Dingli, David Farrugia
Table of Contents (12 chapters)

Preface
1. Chapter 1: The Evolution and Pitfalls of AI
2. Chapter 2: The Rise and Fall of Symbolic AI
3. Chapter 3: The Neural Networks Revolution
4. Chapter 4: The Need for Explainable AI
5. Chapter 5: Introducing Neuro-Symbolic AI – the Next Level of AI
6. Chapter 6: A Marriage of Neurons and Symbols – Opportunities and Obstacles
7. Chapter 7: Applications of Neuro-Symbolic AI
8. Chapter 8: Neuro-Symbolic Programming in Python
9. Chapter 9: The Future of AI
10. Index
11. Other Books You May Enjoy

Introducing popular neural network architectures

In this chapter, we explore some of the most popular ANN architectures beyond the basic single-layer and multilayer perceptrons. The recurrent neural network (RNN) extends the feed-forward network with connections that capture temporal relationships, and it is widely used in applications ranging from sentence autocompletion to stock market prediction. However, RNNs suffer from the vanishing gradient problem, which hinders their ability to learn long-range dependencies. Competitive networks, such as Kohonen's self-organizing maps, classify inputs without supervision, while Hopfield networks, a special ANN in which every node is connected to every other node, act as associative memory and typically converge to the stored pattern most similar to their input. Boltzmann machines (BMs) and restricted Boltzmann machines (RBMs) are stochastic variants of the Hopfield network, with RBMs imposing additional restrictions on connectivity; both are trained with unsupervised learning approaches, making them great at extracting discriminative features from...
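As a quick illustration of the associative-memory behaviour mentioned above, here is a minimal sketch of a Hopfield network in plain NumPy. It is not taken from the book's code; the function names (train_hopfield, recall) and the tiny bipolar pattern are illustrative assumptions, but the example shows how Hebbian weights let a fully connected network recover a stored pattern from a corrupted input.

import numpy as np

def train_hopfield(patterns):
    # Hebbian learning: each stored pattern strengthens the connections
    # between units that are active together.
    n_units = patterns.shape[1]
    weights = np.zeros((n_units, n_units))
    for p in patterns:
        weights += np.outer(p, p)
    np.fill_diagonal(weights, 0)      # Hopfield networks have no self-connections
    return weights / len(patterns)

def recall(weights, state, steps=10):
    # Repeatedly update all units; the state settles into the nearest attractor,
    # i.e. the stored pattern most similar to the input.
    for _ in range(steps):
        state = np.sign(weights @ state)
        state[state == 0] = 1         # break ties towards +1
    return state

# Store one bipolar (+1/-1) pattern, corrupt it, and recover it.
stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
weights = train_hopfield(stored)
noisy = stored[0].copy()
noisy[0] *= -1                        # flip one unit to simulate noise
print(recall(weights, noisy))         # prints the original stored pattern

Running this sketch prints the original pattern after a single update, which is the convergence-on-similarity behaviour the paragraph describes.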
