Hands-On Transfer Learning with Python
Implement advanced deep learning and neural network models using TensorFlow and Keras

Paperback | Published Aug 2018 | Packt | ISBN-13 9781788831307 | 438 pages | 1st Edition
Authors (4): Nitin Panwar, Raghav Bali, Tamoghna Ghosh, Dipanjan Sarkar
Table of Contents (14)

Preface
1. Machine Learning Fundamentals
2. Deep Learning Essentials
3. Understanding Deep Learning Architectures
4. Transfer Learning Fundamentals
5. Unleashing the Power of Transfer Learning
6. Image Recognition and Classification
7. Text Document Categorization
8. Audio Event Identification and Classification
9. DeepDream
10. Style Transfer
11. Automated Image Caption Generator
12. Image Colorization
13. Other Books You May Enjoy

Transfer learning strategies

Let's start by looking at a formal definition of transfer learning and then use it to understand different strategies. In their paper, A Survey on Transfer Learning (https://www.cse.ust.hk/~qyang/Docs/2009/tkde_transfer_learning.pdf), Pan and Yang use domains, tasks, and marginal probabilities to present a framework for understanding transfer learning. The framework is defined as follows:

A domain, D, is defined as a two-element tuple consisting of a feature space, $\mathcal{X}$, and a marginal probability distribution, $P(X)$, where $X$ is a sample data point.

Here, $X = \{x_1, x_2, \ldots, x_n\}$, where each $x_i$ is a specific vector and $X \in \mathcal{X}$. Thus:

$$D = \{\mathcal{X}, P(X)\}$$

A task, T, on the other hand, can be defined as a two-element tuple of the label space, $\mathcal{Y}$, and the objective function, $f$. The objective function can also be denoted as $P(Y \mid X)$ from a probabilistic viewpoint...
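To make the notation concrete, consider a small illustrative instantiation (this example is not from the book) for email spam detection, where each email is represented as a bag-of-words vector over a vocabulary $V$:

$$\mathcal{X} = \mathbb{R}^{|V|}, \qquad X = \{x_1, x_2, \ldots, x_n\} \ \text{(a corpus of emails)}, \qquad D = \{\mathcal{X}, P(X)\}$$

$$\mathcal{Y} = \{\text{spam}, \text{not spam}\}, \qquad T = \{\mathcal{Y}, f\}, \qquad f(x) \approx P(Y \mid X = x)$$

A shift in $P(X)$ (say, from corporate email to social media messages) changes the domain, while a shift in $\mathcal{Y}$ or $f$ (say, from spam detection to sentiment classification) changes the task; transfer learning is concerned with reusing knowledge across such shifts.

In practice, with the TensorFlow and Keras tools this book uses, one common way to act on this framework is to reuse a network trained on a source domain/task as a feature extractor for a new target task. The following is a minimal sketch, assuming TensorFlow 2.x Keras and its bundled ImageNet-pre-trained VGG16; the binary target classifier and its data are hypothetical placeholders, not code from the book:

# Minimal sketch (assumptions: TensorFlow 2.x with Keras, downloadable ImageNet
# weights; the binary target task and its data are hypothetical placeholders).
import tensorflow as tf
from tensorflow.keras.applications import VGG16

# Source knowledge: a convolutional base pre-trained on the ImageNet source domain/task.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the base; only its learned representations are transferred

# Target task: a small classification head trained on the (scarce) target-domain data.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(target_images, target_labels, epochs=5)  # hypothetical target dataset

Freezing the pre-trained base keeps the source representations fixed while only the new head is fitted, which is why this style of transfer works well when the target dataset is small.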
