Data Augmentation with Python

Enhance deep learning accuracy with data augmentation methods for image, text, audio, and tabular data

Product type: Paperback
Published: April 2023
Publisher: Packt
ISBN-13: 9781803246451
Length: 394 pages
Edition: 1st
Author: Duc Haba

Machine learning models

In this chapter, the text augmentation wrapper functions use ML models to generate new text for training. How these models are built is out of scope, but a brief description of each model and its underlying algorithm is helpful. The Python wrapper functions use the following ML models under the hood (a short sketch after the list illustrates the shared idea):

  • Tomáš Mikolov published Word2Vec, a neural-network-based NLP algorithm, in 2013. The model can propose synonyms for words in the input text.
  • The Global Vectors for Word Representation (GloVe) algorithm was created by Jeffrey Pennington, Richard Socher, and Christopher D. Manning in 2014. It is an unsupervised learning NLP algorithm that represents words as vectors. The resulting vectors capture linear substructure, so a word's closest neighbors in vector space are semantically related words.
  • Wiki-news-300d-1M is a pre-trained model of 1 million word vectors built with the fastText open source library. It was trained on Wikipedia 2017 articles, the UMBC...
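The excerpt does not show the wrapper code itself, but all three models are used for the same purpose: look up a word's nearest neighbors in a pre-trained embedding space and swap some words for those neighbors. The following is a minimal sketch of that idea, assuming the gensim library and its downloadable "glove-wiki-gigaword-100" vectors for illustration; it is not the book's actual wrapper implementation.

import random

import gensim.downloader as api

# Any of the models described above works here; "glove-wiki-gigaword-100" is a
# small GloVe model available through gensim's downloader (assumed for brevity;
# the first call downloads the vectors).
model = api.load("glove-wiki-gigaword-100")

def augment_sentence(sentence: str, swap_prob: float = 0.3, topn: int = 5) -> str:
    """Replace some words with nearest-neighbor words from the embedding space."""
    augmented = []
    for word in sentence.split():
        if word.lower() in model and random.random() < swap_prob:
            # most_similar() returns (word, cosine similarity) pairs,
            # closest neighbors first -- the "synonym" candidates.
            candidates = [w for w, _ in model.most_similar(word.lower(), topn=topn)]
            augmented.append(random.choice(candidates))
        else:
            augmented.append(word)
    return " ".join(augmented)

print(augment_sentence("The quick brown fox jumps over the lazy dog"))

Each call produces a slightly different sentence, which is exactly how embedding-based text augmentation creates additional training samples from existing ones.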