Introducing OpenAI’s Reptile: The latest scalable meta-learning algorithm on the block

  • 2 min read
  • 08 Mar 2018

Reptile, developed by OpenAI, is a simple meta-learning algorithm. Meta-learning is the process of learning how to learn: a meta-learning algorithm takes in a distribution of tasks, where each task is a learning problem, and produces a quick learner, that is, a learner able to generalize from only a small number of examples.

An example of a meta-learning problem is few-shot classification. Here, each task is a classification problem in which the learner, after seeing only 1–5 input-output examples from each class, must classify new inputs.
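
As a rough illustration, a single N-way, K-shot task can be drawn from a labelled dataset as in the minimal sketch below. The dataset layout, function name, and parameters are illustrative assumptions, not taken from the Reptile code.

    import numpy as np

    def sample_few_shot_task(dataset, n_way=5, k_shot=1, n_query=5, rng=np.random):
        """Sample one N-way, K-shot classification task.

        `dataset` is assumed to map each class label to an array of examples.
        The learner trains on the few support examples and is then asked to
        classify the query examples.
        """
        classes = rng.choice(list(dataset.keys()), size=n_way, replace=False)
        support, query = [], []
        for new_label, cls in enumerate(classes):
            examples = dataset[cls]
            idx = rng.choice(len(examples), size=k_shot + n_query, replace=False)
            support += [(examples[i], new_label) for i in idx[:k_shot]]
            query += [(examples[i], new_label) for i in idx[k_shot:]]
        return support, query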

What Reptile does

Reptile repeatedly samples a task, performs stochastic gradient descent (SGD) on it, and then updates the initial parameters towards the final parameters learned on that task.
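
In pseudocode form, one Reptile meta-iteration might look like the following minimal sketch. The callables `sample_task` and `inner_sgd`, the step count, and the outer learning rate are illustrative assumptions, not the OpenAI implementation.

    def reptile_step(init_params, sample_task, inner_sgd, inner_steps=5, outer_lr=0.1):
        """One Reptile meta-update.

        1. Sample a task from the task distribution.
        2. Starting from the current initialization, run a few steps of
           ordinary SGD on that task's loss.
        3. Move the initialization a fraction of the way towards the
           task-adapted parameters.
        """
        task = sample_task()
        params = dict(init_params)            # work on a copy of the initialization
        for _ in range(inner_steps):
            params = inner_sgd(params, task)  # plain SGD step on this task
        # Outer update: interpolate the initialization towards the final parameters.
        return {k: init_params[k] + outer_lr * (params[k] - init_params[k])
                for k in init_params}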

Any Comparisons?

Reptile performs as well as MAML, another broadly applicable meta-learning algorithm, while being simpler to implement and more computationally efficient.

Some features of Reptile:

  • Reptile seeks an initialization for the parameters of a neural network, such that the network can be fine-tuned using only a small amount of data from a new task (see the sketch after this list).
  • Unlike MAML, Reptile simply performs stochastic gradient descent (SGD) on each task in a standard way. This means it does not unroll a computation graph or calculate any second derivatives. Hence, Reptile takes less computation and memory than MAML.
  • The current Reptile implementation uses TensorFlow for the computations involved, and includes code for replicating the experiments on Omniglot and Mini-ImageNet.
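
As a rough sketch of the first point above, adapting the meta-learned initialization to a new task is just a handful of standard SGD steps on that task's few labelled examples. The names `reptile_init`, `support_batches`, and `grad_fn`, and the hyperparameters, are illustrative assumptions.

    def fine_tune(reptile_init, support_batches, grad_fn, lr=0.01, steps=10):
        """Adapt the meta-learned initialization to a new task with plain SGD.

        `grad_fn(params, batch)` is assumed to return gradients of the new
        task's loss; only the task's small support set is needed.
        """
        params = dict(reptile_init)
        for step in range(steps):
            batch = support_batches[step % len(support_batches)]
            grads = grad_fn(params, batch)
            params = {k: params[k] - lr * grads[k] for k in params}
        return params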

To read more on how Reptile works, visit the OpenAI blog. To view the Reptile implementation, visit its GitHub repository.