
Leveraging Google Cloud for Custom Endpoint with OpenAI

  • 8 min read
  • 20 Feb 2024



This article is an excerpt from the book OpenAI API Cookbook by Henry Habib, which shows you how to integrate the ChatGPT API into various domains, ranging from simple wrappers to knowledge-based assistants, multi-model, and conversational applications.

Introduction

In the realm of application development, the integration of OpenAI's capabilities through custom backend endpoints on Google Cloud is a pivotal step towards unlocking intelligent solutions. This article explores the process of creating such endpoints using Google Cloud's Cloud Functions, allowing for control, customization, and the seamless integration of OpenAI's features. Through this fusion of technologies, developers can craft innovative applications that leverage the power of artificial intelligence to enhance user experiences and drive creativity in their projects.

Creating a public endpoint server that calls the OpenAI API

There are many important benefits of creating your own public endpoint server that calls the OpenAI API, instead of connecting to the OpenAI API directly – the biggest being control and customization, which we will explore in this recipe and the next recipe.

In this recipe, we will use GCP to host our public endpoint. When this endpoint is called, it will make a request to OpenAI for a slogan for an ice cream company and then return the answer to the user. This may sound simple, and creating a public endpoint for it may even seem unnecessary, but it is the final step we need to build a truly intelligent application that leverages OpenAI.

To do this, we will create a GCP resource called Cloud Functions, which we will explore later in the How it works… section of the recipe.

Getting ready

Ensure you have an OpenAI platform account with available usage credits. If you don't, please follow the Setting up your OpenAI API Playground environment recipe in Chapter 1. Furthermore, ensure you have created a GCP account. To do this, navigate to https://cloud.google.com/, then select Start Free from the top right, and follow the instructions that you see.

You may need to provide a billing profile as well to create any GCP resources. Note that GCP does have a free tier, and in this recipe, we will not go above the free tier (so, essentially, you should not be billed for anything).

You may need to create a project if this is your first time logging into Google Cloud Platform. After you log in, select Select a project from the top left and then select New Project. Provide a project name and then select Create.

The next recipe in this chapter will also have this same requirement.

How to do it…

1.  Navigate to https://console.cloud.google.com/. In the Search field at the top of the page, type in Cloud Functions and select the top choice from the drop-down menu, Cloud Functions.


Figure – Cloud Functions in the dropdown

2. Select Create Function from the top of the page. This will begin to create our custom backend endpoint and start the configuration steps.

On the Configuration page, fill in the following fields:

  • Environment: Select 2nd gen from the drop-down menu.
  • Function name: Since we’re creating a backend endpoint that will produce company slogans, the function name will be slogan_creator.
  • Region: Choose the environment location nearest you.
  • In the Trigger menu, choose HTTPS. In the Authentication sub-menu, select Allow unauthenticated invocation. We need to select this because we are going to create a public endpoint that will be accessible from our frontend services.

Figure – Sample configuration settings of a Google Cloud Function

3. Select the Next button at the bottom of the page to move on to the Code section.

4. From the Runtime dropdown, select Python 3.12. This ensures that our backend endpoint will be coded using the Python programming language.


5. For the Entry point option, type in create_slogan. This refers to the name of the function in Python that is called when the public endpoint is reached and triggered.

6. On the left-hand side menu, you will see two files: main.py and requirements.txt. Select the requirements.txt file. This will list all the Python packages that need to be installed for our Cloud Function to operate.

7. In the center of the screen, where the contents of requirements.txt are displayed, enter a new line and type in openai. This will ensure that the latest openai library package is installed. Your screen should look like what's displayed in the figure below.


Figure – Snapshot of the requirements.txt file
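
For reference, after this step the requirements.txt file should contain something like the following. Note that the functions-framework line is pre-populated by Google's template, so the exact pinned version you see may differ:

functions-framework==3.*
openai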

8. From the left-hand side menu, select main.py. Copy and paste the following code into the center of the screen (where the content for that file is displayed). These are the instructions that the public endpoint will run when it is triggered:

import functions_framework
from openai import OpenAI

@functions_framework.http
def create_slogan(request):
    # Create an OpenAI client (replace the placeholder with your own API key)
    client = OpenAI(api_key='<API Key here>')

    # Ask the Chat Completions API for a single slogan for an ice cream company
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {
                "role": "system",
                "content": "You are an AI assistant that creates one slogan based on company descriptions"
            },
            {
                "role": "user",
                "content": "A company that sells ice cream"
            }
        ],
        temperature=1,
        max_tokens=256,
        top_p=1,
        frequency_penalty=0,
        presence_penalty=0
    )

    # Return the generated slogan as the HTTP response body
    return response.choices[0].message.content

As you can see, it simply calls the OpenAI API, requests a chat completion, and then returns the output to the user. You will also need to replace the placeholder in the code with your own OpenAI API key.
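
As a variation that is not part of the steps above, you can avoid hard-coding the key by reading it from an environment variable. This is a minimal sketch assuming you add an OPENAI_API_KEY runtime environment variable to the function (under the function's runtime settings); the variable name is an arbitrary choice for illustration:

import os
from openai import OpenAI

# Assumes OPENAI_API_KEY was configured as a runtime environment variable
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

The rest of the function body stays exactly the same; only the client construction changes.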

9. Next, deploy the function by selecting the Deploy button at the bottom of your page.

10. Wait for your function to be fully deployed, which typically takes two minutes. You can verify whether the function has been deployed by observing the progress indicator in the top-left section of the page (shown in the figure below). Once it is green and checkmarked, the build is successful and your function has been deployed.


Figure – The Cloud Function deployment page

11. Now, let's verify that our function works. Select the endpoint URL, found at the top of the page next to URL. It's typically in the form https://[location]-[project-name].cloudfunctions.net/[function-name]. It is also highlighted in the figure above.

12. This will open a new web page that will trigger our custom public endpoint and return a chat completion, which, in this case, is a slogan for an ice cream business. Note that this is a public endpoint – it will work on your computer, phone, or any device connected to the internet.


Figure – Output of a Google Cloud Function
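
If you prefer to test the endpoint from code rather than a browser, here is a minimal sketch using Python's requests library. The URL below is a placeholder; replace it with the actual trigger URL shown on your function's page:

import requests

# Placeholder URL - replace with your function's trigger URL
url = "https://us-central1-my-project.cloudfunctions.net/slogan_creator"

response = requests.get(url)
print(response.status_code)   # 200 indicates a successful invocation
print(response.text)          # the generated ice cream slogan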

How it works…

In this recipe, we created a public endpoint. This endpoint can be accessed by anyone (including your application in future recipes). The logic of the endpoint is simple and something we have covered previously: return a slogan for a company that sells ice cream. What's new, however, is that this is our very own public endpoint, hosted in Google Cloud using the Cloud Functions resource.

Note that we used the free tier of Google Cloud Functions, which does have limitations such as a cap on the number of function invocations per month, limited execution time, and constrained computational resources. However, for our current purposes, these limitations are not a hindrance, allowing us to deploy and test our functions effectively without incurring costs. This setup is ideal for small-scale applications or for learning and experimentation purposes, providing a practical way to understand cloud functionalities and serverless architecture in a cost-effective manner.

Conclusion

In conclusion, the synergy between Google Cloud's infrastructure and OpenAI's capabilities offers developers a powerful platform for creating intelligent applications. By leveraging Cloud Functions to build custom backend endpoints, developers can unlock a world of possibilities for innovation and creativity. This article has provided a comprehensive guide to integrating OpenAI into Google Cloud, empowering developers to craft intelligent solutions that enhance user experiences and drive the evolution of application development. With this knowledge, developers are well-equipped to embark on their journey of building intelligent applications that push the boundaries of what is possible in the digital landscape.

Author Bio

Henry Habib is a Manager at one of the world's top management consulting firms, advising F500 companies on analytics and operations, with a particular focus on building intelligent AI-driven solutions and tools to create impact. He is a passionate online instructor and educator, with a following of more than 150K paid students, and has facilitated technical programs at large banks and governmental institutions.
A proponent of the no-code and LLM revolution, he believes that anyone can now create powerful and intelligent applications without any deep technical skills. Henry resides in Toronto, Canada with his wife, and enjoys reading AI research papers and playing tennis in his free time.