Hands-On Artificial Intelligence for Beginners

You're reading from Hands-On Artificial Intelligence for Beginners: An introduction to AI concepts, algorithms, and their implementation

Product type: Paperback
Published: Oct 2018
Publisher: Packt
ISBN-13: 9781788991063
Length: 362 pages
Edition: 1st Edition

Authors (2): David Dindi, Patrick D. Smith
Table of Contents (15)

Preface
1. The History of AI
2. Machine Learning Basics
3. Platforms and Other Essentials
4. Your First Artificial Neural Networks
5. Convolutional Neural Networks
6. Recurrent Neural Networks
7. Generative Models
8. Reinforcement Learning
9. Deep Learning for Intelligent Agents
10. Deep Learning for Game Playing
11. Deep Learning for Finance
12. Deep Learning for Robotics
13. Deploying and Maintaining AI Applications
14. Other Books You May Enjoy

Word2vec

The Word2vec algorithm, invented by Tomas Mikolov while he was at Google in 2013, was one of the first modern embedding methods. It is a shallow, two-layer neural network that follows a similar intuition to the autoencoder: the network is trained to perform a certain task without actually being used to perform that task. In the case of Word2vec, that task is learning representations of natural language. You can think of it as a context algorithm – everything it knows comes from learning the contexts of words within sentences. It builds on the distributional hypothesis, which states that a word's meaning is characterized by its neighboring words: words that occur in similar contexts tend to have similar meanings. For instance, think about a corpus vector with 500 dimensions. Each word in the corpus is represented by a distribution of weights across every single one of...
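To make the context idea concrete, here is a minimal sketch of the skip-gram variant of Word2vec in NumPy. The corpus, embedding dimension, window size, and learning rate are all toy values chosen for illustration, and it uses a full softmax over the vocabulary rather than the negative-sampling trick used in practice:

```python
import numpy as np

# Toy corpus; "cat" and "dog" appear in near-identical contexts,
# so the distributional hypothesis says they should end up close together.
corpus = ["the cat sat on the mat", "the dog sat on the rug"]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 8, 2  # vocab size, embedding dim, context window

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (target-word) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context-word) embeddings

def pairs():
    """Yield (target, context) index pairs within the window."""
    for sent in tokens:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    yield idx[w], idx[sent[j]]

lr = 0.05
for epoch in range(200):
    for t, c in pairs():
        v = W_in[t]                       # target word's vector
        scores = W_out @ v                # logits over the whole vocabulary
        p = np.exp(scores - scores.max())
        p /= p.sum()                      # softmax probabilities
        p[c] -= 1.0                       # gradient of cross-entropy w.r.t. logits
        W_in[t] -= lr * (W_out.T @ p)     # update target embedding
        W_out -= lr * np.outer(p, v)      # update all context embeddings

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Words sharing contexts should have similar learned vectors.
print(cos(W_in[idx["cat"]], W_in[idx["dog"]]))
```

In practice you would not train this by hand; a library such as Gensim provides an optimized implementation of the same idea, trained on millions of sentences rather than two.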

The rest of the chapter is locked.