Mastering Computer Vision with TensorFlow 2.x
Build advanced computer vision applications using machine learning and deep learning techniques

Product type: Paperback
Published: May 2020
Publisher: Packt
ISBN-13: 9781838827069
Length: 430 pages
Edition: 1st Edition
Author: Krishnendu Kar
Table of Contents (18 chapters)

Preface
1. Section 1: Introduction to Computer Vision and Neural Networks
2. Computer Vision and TensorFlow Fundamentals
3. Content Recognition Using Local Binary Patterns
4. Facial Detection Using OpenCV and CNN
5. Deep Learning on Images
6. Section 2: Advanced Concepts of Computer Vision with TensorFlow
7. Neural Network Architecture and Models
8. Visual Search Using Transfer Learning
9. Object Detection Using YOLO
10. Semantic Segmentation and Neural Style Transfer
11. Section 3: Advanced Implementation of Computer Vision with TensorFlow
12. Action Recognition Using Multitask Deep Learning
13. Object Detection Using R-CNN, SSD, and R-FCN
14. Section 4: TensorFlow Implementation at the Edge and on the Cloud
15. Deep Learning on Edge Devices with CPU/GPU Optimization
16. Cloud Computing Platform for Computer Vision
17. Other Books You May Enjoy

Overview of GNNs

A Graph Neural Network (GNN) extends CNN learning to graph-structured data. A graph can be represented as a combination of nodes and edges, where the nodes carry the features of the graph and the edges join adjacent nodes, as shown in the following image:

In this image, the nodes are illustrated by solid white points and the edges by the lines joining the points.
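Before looking at the equations, it can help to see this structure in code. The following is a minimal, hypothetical Python sketch (the node features, edges, and helper function are made up for illustration and are not taken from the book) of a graph held as node feature vectors plus a list of edges:

```python
# Hypothetical toy graph: nodes carry feature vectors, edges join adjacent nodes.
nodes = {
    0: [0.1, 0.5],   # feature vector of node 0
    1: [0.7, 0.2],
    2: [0.3, 0.9],
    3: [0.8, 0.4],
}
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]  # undirected edges

# The neighborhood of a node is every node that shares an edge with it.
def neighbours(n):
    return [b if a == n else a for a, b in edges if n in (a, b)]

print(neighbours(2))  # [0, 1, 3]
```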

The following equations describe the key parameters of the graph:
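Assuming the standard notation for these parameters, the graph G is described by its node set N and its edge set E:

G = (N, E), where N = {n1, n2, ..., nk} is the set of nodes and E = {(ni, nj)} is the set of edges joining adjacent nodes ni and nj.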

The transformation of a graph into a vector consisting of its nodes, edges, and the relationships between the nodes is called a graph embedding. The embedding vector can be represented by the following equation:
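In the original GNN formulation (assumed here), the state embedding of a node is produced by a local transition function f applied to the node's own features, the features of its incident edges, and the states and features of its neighboring nodes:

h[n] = f(x[n], xe[n], hne[n], xne[n])

where xne[n] denotes the features of the nodes in the neighborhood of n.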

The following list describes the elements of the preceding equation:

  • h[n] = State embedding of the current node n
  • hne[n] = State embedding of the neighborhood of node n
  • x[n] = Features of node n
  • xe[n] = Features of the edge of...
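To make this update concrete, here is a minimal, hypothetical TensorFlow 2.x sketch of one propagation step, simplified so that each node's state h[n] is recomputed from its own features x[n] and the summed states of its neighbors hne[n] (the adjacency matrix, layer, and dimensions below are illustrative and not taken from the book):

```python
import tensorflow as tf

num_nodes, feat_dim, state_dim = 5, 8, 16

# Node features x[n] (random here) and an adjacency matrix describing the edges.
x = tf.random.normal((num_nodes, feat_dim))
adjacency = tf.constant([[0, 1, 1, 0, 0],
                         [1, 0, 1, 0, 0],
                         [1, 1, 0, 1, 0],
                         [0, 0, 1, 0, 1],
                         [0, 0, 0, 1, 0]], dtype=tf.float32)

h = tf.zeros((num_nodes, state_dim))                      # initial states h[n]
f = tf.keras.layers.Dense(state_dim, activation="tanh")   # transition function f

for _ in range(3):                        # a few propagation iterations
    h_ne = tf.matmul(adjacency, h)        # hne[n]: sum of the neighbors' states
    h = f(tf.concat([x, h_ne], axis=-1))  # h[n] = f(x[n], hne[n])

print(h.shape)  # (5, 16): one state embedding per node
```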