Using Files for Prompts and API Key Access
In this recipe, you will learn how to use external text files to store and retrieve prompts for interacting with the OpenAI API through Python. This approach allows for better organization and easier maintenance, as you can quickly update a prompt without modifying the main script. We will also introduce a new way of accessing the OpenAI API key – reading it from a file – which makes changing the API key much more flexible.
Getting ready
Ensure you have access to the OpenAI API and have set up your API key according to the Creating an API key and interacting with OpenAI and Setting the OpenAI API key as an Environment Variable recipes.
How to do it…
This recipe demonstrates a practical approach to managing prompts and API keys, making it easier to update and maintain your code. By using external text files, you can efficiently organize your project and collaborate with others. Let’s walk through the steps to implement this method:
- Create a new text file and save it as prompt.txt. Write your desired prompt inside this file and save it.
- Modify your Python script so that it includes a function to read the contents of a text file:
def open_file(filepath):
    with open(filepath, 'r', encoding='UTF-8') as infile:
        return infile.read()
- Using the script from the Sending API Requests and Handling Responses with Python recipe, replace the hardcoded prompt with a call to the open_file function, passing the path to the prompt.txt file as an argument:

prompt = open_file("prompt.txt")
- Create a file called prompt.txt and enter the following prompt text (the same prompt as in the Sending API Requests and Handling Responses with Python recipe):

Explain the difference between symmetric and asymmetric encryption.
- Set up your API key using a file instead of an environment variable by passing the file’s contents to the client when it is created (the .strip() call removes any trailing newline your editor may add to the key file):

client = OpenAI(api_key=open_file('openai-key.txt').strip())
Important note
It’s important to place this line of code after the open_file function definition; otherwise, Python will raise a NameError because the function is called before it has been defined.
- Create a file called openai-key.txt and paste your OpenAI API key into the file with nothing else.
- Use the prompt variable in your API call as you normally would.
Here is an example of how the modified script from the Sending API Requests and Handling Responses with Python recipe would look:
from openai import OpenAI

def open_file(filepath):
    with open(filepath, 'r', encoding='UTF-8') as infile:
        return infile.read()

# Create the client with the API key read from the file; .strip() removes
# any trailing newline the editor may have added to the key file
client = OpenAI(api_key=open_file('openai-key.txt').strip())

def get_chat_gpt_response(prompt):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=2048,
        temperature=0.7
    )
    return response.choices[0].message.content.strip()

prompt = open_file("prompt.txt")
response_text = get_chat_gpt_response(prompt)
print(response_text)
How it works...
The open_file() function takes a file path as an argument, opens the file using a with open statement, reads its contents, and returns them as a string. That string is used as the prompt for the API call. A second open_file() call reads the text file containing the OpenAI API key, and its contents (stripped of any trailing newline) are passed to the client instead of being retrieved from an environment variable.
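For example, the following quick check (using the same openai-key.txt file created in this recipe) illustrates why the key is stripped before use: open_file() returns the file’s raw contents, which may include a trailing newline added by your text editor:

def open_file(filepath):
    with open(filepath, 'r', encoding='UTF-8') as infile:
        return infile.read()

key = open_file('openai-key.txt')
print(repr(key))          # may show something like 'sk-...\n'
print(repr(key.strip()))  # 'sk-...' - the form that is safe to pass as api_key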
By using an external text file for the prompt and to access the API key, you can easily update or change both without needing to modify the main script or environment variables. This can be particularly helpful when you’re working with multiple prompts or collaborating with others.
Note of caution
Using this technique to access your API key does come with a certain level of risk. A text file is easier to discover and access than an environment variable, so be sure to take the necessary security precautions, such as those sketched below. It is also important to remove your API key from the openai-key.txt file before you share your script with others, to prevent unintended and/or unauthorized charges to your OpenAI account.
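As a minimal sketch of such precautions (assuming a POSIX-style filesystem and a Git repository; the filenames match the ones used in this recipe), you can restrict the key file’s permissions to your own user account and keep it out of version control:

# restrict openai-key.txt so only the current user can read or write it (mode 600)
import os
import stat

os.chmod('openai-key.txt', stat.S_IRUSR | stat.S_IWUSR)

# append the key file to .gitignore so it is never committed by accident
with open('.gitignore', 'a', encoding='UTF-8') as gitignore:
    gitignore.write('openai-key.txt\n')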
There’s more...
You can also use this method to store other parameters or configurations that you may want to change frequently or share with others. This could include API keys, model parameters, or any other settings relevant to your use case.
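For example, here is a minimal sketch of reading model parameters from a configuration file. The model-config.json filename and its contents are hypothetical, but the rest reuses the openai-key.txt and prompt.txt files created in this recipe:

# contents of the hypothetical model-config.json:
# {"model": "gpt-3.5-turbo", "max_tokens": 2048, "temperature": 0.7}
import json
from openai import OpenAI

def open_file(filepath):
    with open(filepath, 'r', encoding='UTF-8') as infile:
        return infile.read()

client = OpenAI(api_key=open_file('openai-key.txt').strip())
config = json.loads(open_file('model-config.json'))

response = client.chat.completions.create(
    model=config["model"],
    messages=[{"role": "user", "content": open_file("prompt.txt")}],
    max_tokens=config["max_tokens"],
    temperature=config["temperature"]
)
print(response.choices[0].message.content.strip())

Editing model-config.json then changes the model or sampling settings without touching the script itself.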