Machine Learning for Data Mining

You're reading from Machine Learning for Data Mining: Improve your data mining capabilities with advanced predictive modeling

Product type: Paperback
Published in: Apr 2019
Publisher: Packt
ISBN-13: 9781838828974
Length: 252 pages
Edition: 1st Edition
Author: Jesus Salcedo

A sample neural network model

Let's use an example to understand neural networks in more detail:

Notice that every neuron in the Input Layer is connected to every neuron in the Hidden Layer; for example, Input 1 is connected to the first, second, and third neurons in the Hidden Layer. This means Input 1 has three different weights, and each of those weights appears in a different equation.
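As a rough sketch of this wiring (the layer sizes here are assumptions for illustration, not the book's exact figure), the "every input to every hidden neuron" connectivity is just a weight matrix with one entry per input/hidden pair:

```python
import numpy as np

rng = np.random.default_rng(0)

inputs = np.array([0.5, -1.2, 0.8])   # Input Layer: 3 neurons (assumed size)
W_hidden = rng.normal(size=(3, 3))    # one weight per input/hidden pair: 9 weights total
b_hidden = np.zeros(3)

# Input 1 reaches all three hidden neurons through row W_hidden[0, :],
# so its value shows up in three different equations.
hidden = np.tanh(inputs @ W_hidden + b_hidden)  # nonlinear Hidden Layer activations
print(hidden.shape)  # (3,)
```

The nonlinearity (`tanh` here) is what lets the Hidden Layer model the nonlinear relationships mentioned in the following list.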

This is what happens in this example:

  • The Hidden Layer intervenes between the Input Layer and the Output Layer.
  • The Hidden Layer allows for more complex models with nonlinear relationships.
  • There are many equations, so the influence of a single predictor on the outcome variable occurs through a variety of paths.
  • The interpretation of weights won't be straightforward.
  • Weights reflect variable importance, but they start out random. They are then adjusted over many iterations, based on the feedback from each iteration's errors, and only after this process do they take on their real meaning as measures of variable importance.

So, let's go ahead and see how these weights are determined and how we can form a functional neural network.

Feed-forward backpropagation

Feed-forward backpropagation is the method by which a neural network's weights are estimated and, ultimately, how the network arrives at its predictions.

Under this method, the following happens on each iteration of predictions:

  • If a prediction is correct, the weight associated with it is strengthened. Imagine the neural network saying, "Hey, you know what? We used a weight of 0.75 for the first predictor in this equation and we got the correct prediction; that's probably a good starting point."
  • If the prediction is incorrect, the error is fed back, or backpropagated, into the model so that the weights (the weight coefficients) are adjusted, as shown here:

This backpropagation doesn't just take place between the Hidden Layer and the Output Layer; it also continues back toward the Input Layer:
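A minimal sketch of one such update, assuming a tiny 2-1-1 network with sigmoid activations (the sizes, seed, and learning rate are illustrative, not the book's): the output error is propagated back to the hidden-to-output weight, and then further back to the input weights.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
x = np.array([0.3, 0.9])     # two inputs
target = 1.0                 # desired outcome

w1 = rng.normal(size=2)      # Input Layer -> hidden neuron weights
w2 = rng.normal()            # hidden neuron -> Output Layer weight
lr = 0.5                     # learning rate

# Feed-forward pass
h = sigmoid(x @ w1)
y = sigmoid(h * w2)
error = target - y

# Backward pass: error flows from the output back to w2, then on to w1
delta_out = error * y * (1 - y)
delta_hidden = delta_out * w2 * h * (1 - h)

w2 += lr * delta_out * h     # adjust hidden -> output weight
w1 += lr * delta_hidden * x  # the correction reaches the input weights too
```

A correct prediction (small error) produces only small adjustments; a wrong one produces larger corrections.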

While these iterations are happening, we are actually making our neural network better and better with every error propagation. The connections now make a neural network capable of learning different patterns in the data.
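To see that improvement over iterations concretely, here is an illustrative loop (not the book's code) in which repeated error propagation teaches a single sigmoid neuron the AND function; the mean squared error shrinks as the passes accumulate:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# The AND truth table as training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights start out random
b = 0.0
lr = 1.0

errors = []
for epoch in range(2000):
    pred = sigmoid(X @ w + b)
    err = y - pred
    errors.append(np.mean(err ** 2))
    # propagate the error back into the weights
    grad = err * pred * (1 - pred)
    w += lr * X.T @ grad
    b += lr * grad.sum()

print(f"first-pass error {errors[0]:.3f}, final error {errors[-1]:.3f}")
```

Each pass nudges the weights a little closer to values that reproduce the pattern, which is the "better and better with every error propagation" behavior described above.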

So, unlike a linear regression or a decision tree model, a neural network tries to learn the patterns in the data. Given enough time to learn those patterns, the network draws on that accumulated experience to understand the data and predict better, improving accuracy considerably.

Model training ethics

When you are training a neural network model, never train it with the whole dataset. We need to hold back some data for testing purposes. This will allow us to test whether the neural network is able to apply what it has learned from the training dataset to new data.

We want the neural network to generalize well to new data, capturing the broad regularities across different types of data rather than the little nuances that would make it sample-specific. Instead, we want the results to carry over to new data as well. After the model has been trained, new data can be predicted using the model's experience.
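One simple way to hold data back is a random hold-out split; this sketch (illustrative, not the book's exact procedure, with a hypothetical 70/30 ratio and synthetic data) keeps roughly 30% of the rows aside so the model is evaluated on data it never saw during training:

```python
import numpy as np

rng = np.random.default_rng(42)

X = rng.normal(size=(100, 4))      # 100 rows, 4 hypothetical predictors
y = rng.integers(0, 2, size=100)   # a binary outcome variable

indices = rng.permutation(len(X))  # shuffle row indices
split = int(0.7 * len(X))          # 70% train / 30% test

train_idx, test_idx = indices[:split], indices[split:]
X_train, y_train = X[train_idx], y[train_idx]
X_test, y_test = X[test_idx], y[test_idx]

# Train only on (X_train, y_train); measure accuracy on (X_test, y_test).
print(len(X_train), len(X_test))  # 70 30
```

Because the test rows never influence the weights, a good test-set score is evidence the network has learned generalities rather than sample-specific nuances.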
