The Applied AI and Natural Language Processing Workshop
Explore practical ways to transform your simple projects into powerful intelligent applications

Product type: Paperback
Published: Jul 2020
Publisher: Packt
ISBN-13: 9781800208742
Length: 384 pages
Edition: 1st Edition
Authors (3):
Ruze Richards
Krishna Sankar
Jeffrey Jackovich
Table of Contents (8)

Preface
1. An Introduction to AWS
2. Analyzing Documents and Text with Natural Language Processing
3. Topic Modeling and Theme Extraction
4. Conversational Artificial Intelligence
5. Using Speech with the Chatbot
6. Computer Vision and Image Processing
Appendix

Recursion and Parameters

Importing files one at a time is time-consuming, especially if a folder contains many files that need to be imported. A simple solution is to use a recursive procedure: one that can call itself, saving you from entering the same import command for each file.
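To see what "a procedure that calls itself" looks like in practice, here is a minimal local-filesystem sketch (the walk function and the demo folder are hypothetical names, not part of the AWS CLI). It descends into every subfolder the same way a recursive copy walks every key under a prefix:

```shell
# A minimal local analogy (hypothetical names: walk, demo/):
# descend into each subfolder by having the function call itself.
walk() {
  for entry in "$1"/*; do
    if [ -d "$entry" ]; then
      walk "$entry"              # recursion: the function calls itself
    elif [ -f "$entry" ]; then
      echo "would import: $entry"
    fi
  done
}

mkdir -p demo/sub
touch demo/a.txt demo/sub/b.txt
walk demo                        # lists a.txt and the nested sub/b.txt
```

One call on the top-level folder handles every level of nesting, which is exactly why a single recursive command beats typing one import command per file.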

Performing a recursive CLI command requires passing a parameter to the API. This sounds complicated, but it is quite simple. A parameter is simply a name or option that is passed to a program to affect how the receiving program operates. In our case, the parameter is --recursive, and the full command is as follows:

aws s3 cp s3://myBucket . --recursive

With this command, all the S3 objects in the bucket are copied to the current directory (.):
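The same flag works in the upload direction. Assuming the placeholder bucket name myBucket from above and a configured set of AWS credentials, a recursive upload of the current directory would look like this (a sketch, not run here):

```shell
# Upload everything under the current directory to the bucket
# (myBucket is a placeholder; requires configured AWS credentials).
aws s3 cp . s3://myBucket --recursive
```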

Figure 1.33: Parameter list

Activity 1.01: Putting the Data into S3 with the CLI

Let's start with a note about the terminology used in this activity. Putting data into S3 can also be called uploading. Getting it from there is called downloading. Sometimes, it is also called importing and exporting. Please do not confuse this with AWS Import/Export, which is a specific AWS service for sending a large amount of data to AWS or getting it back from AWS.

In this activity, we will use the CLI to create a bucket in S3 and import a second text file. Suppose that you are creating a chatbot. You have identified text documents containing content that will allow your chatbot to interact with customers more effectively. Before the text documents can be parsed, they need to be uploaded to an S3 bucket; once they are in S3, further analysis will be possible. To complete this activity, you will need to install Python, set up the AWS CLI tools, and have a user authenticated with the CLI:

  1. Configure the CLI and verify that it can successfully connect to your AWS environment.
  2. Create a new S3 bucket.
  3. Import your text file into the bucket.
  4. Export the file from the bucket and verify the exported objects.

    Note

    The solution for this activity can be found via this link.
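As a hedged sketch, the four steps above might look like the following at the command line. The bucket name and file name are placeholders, and every command here requires a configured AWS account, so no outputs are shown:

```shell
# 1. Verify the CLI can reach your AWS environment
aws configure                 # enter access key, secret key, and region
aws sts get-caller-identity   # confirms which user is authenticated

# 2. Create a new S3 bucket (names must be globally unique; placeholder here)
aws s3 mb s3://my-chatbot-text-bucket

# 3. Import (upload) the text file into the bucket
aws s3 cp chatbot-content.txt s3://my-chatbot-text-bucket/

# 4. Export (download) the file back and verify the bucket's objects
aws s3 cp s3://my-chatbot-text-bucket/chatbot-content.txt ./downloaded.txt
aws s3 ls s3://my-chatbot-text-bucket/
```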
