In this chapter, we covered two categories of emerging NN models: graph neural networks (GNNs) and memory-augmented neural networks (MANNs). We started with a short introduction to graphs, and then looked at several types of GNNs, including graph NNs, graph convolutional networks, graph attention networks, and graph autoencoders. We concluded the graph section with neural graph learning (NGL), implementing an NGL example with the TensorFlow-based Neural Structured Learning (NSL) framework. Finally, we turned to memory-augmented networks, where we examined the NTM and MANN* architectures.
In the next chapter, we'll look at the emerging field of meta-learning, which involves teaching ML algorithms how to learn.