Using embeddings in supervised ML
Alright! We’ve made it through some really fun hands-on work: constructing networks, detecting communities, applying both unsupervised and supervised ML, visualizing egocentric networks, and inspecting the results of different embeddings. This chapter really brought everything together. I hope you enjoyed the hands-on work as much as I did, and I hope you found it useful and informative. Before concluding this chapter, I want to go over the pros and cons of using embeddings the way that we have.
Please also keep in mind that Random Forest is not the only classification model we could have tested. Because the embeddings are just numeric feature vectors, you can feed them into a neural network if you want, or you could test them with logistic regression. Use what you learned here and go have as much fun as possible while learning.
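To make the point concrete, here is a minimal sketch of swapping classifiers on top of the same embedding features. The embedding matrix `X` and labels `y` below are synthetic stand-ins (randomly generated, not the chapter's actual node embeddings); in practice, each row of `X` would be one node's embedding vector and `y` its class label.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for node embeddings: two easily separable
# clusters of 32-dimensional vectors, one per class.
rng = np.random.default_rng(42)
X = np.vstack([
    rng.normal(0.0, 1.0, size=(100, 32)),  # class 0 "embeddings"
    rng.normal(1.5, 1.0, size=(100, 32)),  # class 1 "embeddings"
])
y = np.array([0] * 100 + [1] * 100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

# The same embedding features work with either estimator;
# only the model object changes.
for model in (RandomForestClassifier(random_state=42),
              LogisticRegression(max_iter=1000)):
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{model.__class__.__name__}: {acc:.3f}")
```

Because scikit-learn estimators share the same `fit`/`predict` interface, trying a different model on your embeddings is usually a one-line change.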
Pros and cons
Let’s discuss the pros and cons of using these embeddings. First, let’s start with...