Using Prompt Variables (Application: Manual Page Generator)
In this recipe, we’ll create a Linux-style manual page generator that will accept a tool’s name as user input, and our script will generate the manual page output, similar to entering the man
command in Linux Terminal. In doing so, we will learn how to use variables in a text file to create a standard prompt template that can be easily modified by changing certain aspects of it. This approach is particularly useful when you want to use user input or other dynamic content as part of the prompt while maintaining a consistent structure.
Getting ready
Ensure you have access to the ChatGPT API by logging in to your OpenAI account, and that you have Python and the openai module installed.
How to do it…
Using a text file that contains the prompt and placeholder variables, we can create a Python script that will replace the placeholder with user input. In this example, we will use this technique to create a Linux-style manual page generator. Here are the steps:
- Create a Python script and import the necessary modules:

```python
from openai import OpenAI
```
- Define a function to open and read a file:

```python
def open_file(filepath):
    with open(filepath, 'r', encoding='UTF-8') as infile:
        return infile.read()
```
- Set up your API key by reading it from a file:

```python
api_key = open_file('openai-key.txt')
```

- Create the openai-key.txt file in the same manner as the previous recipe.
- Define the get_chat_gpt_response() function to send the prompt to ChatGPT and obtain a response:

```python
client = OpenAI(api_key=api_key)

def get_chat_gpt_response(prompt):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=600,
        temperature=0.7
    )
    text = response.choices[0].message.content.strip()
    return text
```
- Receive user input for the name of the tool:

```python
feed = input("ManPageGPT> $ Enter the name of a tool: ")
```
- Replace the <<INPUT>> variable in the prompt.txt file with the user’s input:

```python
prompt = open_file("prompt.txt").replace('<<INPUT>>', feed)
```
- Create the prompt.txt file with the following text:

```
Provide the manual-page output for the following tool. Provide the output exactly as it would appear in an actual Linux terminal and nothing else before or after the manual-page output.

<<INPUT>>
```
- Send the modified prompt to the get_chat_gpt_response() function and print the result:

```python
analysis = get_chat_gpt_response(prompt)
print(analysis)
```
Here’s an example of how the complete script should look:
```python
from openai import OpenAI

def open_file(filepath):
    with open(filepath, 'r', encoding='UTF-8') as infile:
        return infile.read()

client = OpenAI(api_key=open_file('openai-key.txt'))

def get_chat_gpt_response(prompt):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=600,
        temperature=0.7
    )
    text = response.choices[0].message.content.strip()
    return text

feed = input("ManPageGPT> $ Enter the name of a tool: ")
prompt = open_file("prompt.txt").replace('<<INPUT>>', feed)
analysis = get_chat_gpt_response(prompt)
print(analysis)
```
How it works…
In this example, we created a Python script that utilizes a text file as a prompt template. The text file contains a variable called <<INPUT>>
that can be replaced with any content, allowing for dynamic modification of the prompt without the need to change the overall structure. Specifically for this case, we are replacing it with user input:
- The OpenAI class is imported from the openai module to access the ChatGPT API.
- The open_file() function is defined to open and read a file. It takes a file path as an argument, opens the file with read access and UTF-8 encoding, reads the content, and then returns it.
- The API key for accessing ChatGPT is read from a file using the open_file() function and passed to the OpenAI client when it is created.
- The get_chat_gpt_response() function is defined to send a prompt to ChatGPT and return the response. It takes the prompt as an argument, configures the API request with the desired settings, and sends the request to the ChatGPT API. It then extracts the response text, removes leading and trailing whitespace, and returns it.
- The script receives user input for the name of the Linux tool. This content will be used to replace the placeholder in the prompt template.
- The <<INPUT>> variable in the prompt.txt file is replaced with the tool name provided by the user. This is done using Python’s string replace() method, which searches for the specified placeholder and replaces it with the desired content.
- Prompt explanation: For this particular prompt, we tell ChatGPT exactly what type of output and formatting we expect, since it has access to just about every manual page entry that can be found on the internet. By instructing it to provide nothing before or after the manual-page output, ChatGPT will not add any additional details or narrative, and the output will resemble actual Linux output when using the man command.
- The modified prompt, with the <<INPUT>> placeholder replaced, is sent to the get_chat_gpt_response() function. The function sends the prompt to ChatGPT and retrieves the response, and the script prints the result. This demonstrates how to use a prompt template with a variable that can be replaced to create customized prompts for different inputs.
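The substitution step above can be checked in isolation. This minimal sketch uses an inline template string instead of reading prompt.txt from disk, so it runs without any external files:

```python
# Minimal check of the placeholder substitution, using an inline
# template instead of reading prompt.txt from disk.
template = ("Provide the manual-page output for the following tool. "
            "Provide the output exactly as it would appear in an actual "
            "Linux terminal and nothing else before or after the "
            "manual-page output.\n\n<<INPUT>>")

feed = "grep"  # stands in for the user's input
prompt = template.replace('<<INPUT>>', feed)
print(prompt)
```

The tool name lands exactly where the <<INPUT>> placeholder was, leaving the rest of the instructions untouched.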
This approach is particularly useful in a cybersecurity context as it allows you to create standard prompt templates for different types of analysis or queries and easily modify the input data as needed.
There’s more...
- Use multiple variables in your prompt template: You can use more than one variable in your prompt template to make it even more versatile. For example, you can create a template with placeholders for different components of a cybersecurity analysis, such as IP addresses, domain names, and user agents. Just make sure you replace all the necessary variables before sending the prompt to ChatGPT.
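As a sketch of the multi-variable idea (the template text and placeholder names below are illustrative, not from this recipe), chained replace() calls fill each field in turn:

```python
# Sketch: filling several placeholders in one prompt template.
# The template text, placeholder names, and sample values are
# illustrative assumptions only.
template = ("Analyze the following indicators:\n"
            "Source IP: <<IP_ADDRESS>>\n"
            "Domain: <<DOMAIN>>\n"
            "User-Agent: <<USER_AGENT>>\n")

values = {
    "<<IP_ADDRESS>>": "203.0.113.42",
    "<<DOMAIN>>": "example.com",
    "<<USER_AGENT>>": "Mozilla/5.0",
}

prompt = template
for placeholder, value in values.items():
    # Each pass replaces one placeholder with its value.
    prompt = prompt.replace(placeholder, value)

print(prompt)
```

Iterating over a dictionary keeps the substitutions in one place and makes it easy to verify that no placeholder was left unfilled before sending the prompt.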
- Customize the variable format: Instead of the <<INPUT>> format, you can customize your variable format to better suit your needs or preferences. For example, you can use curly braces (for example, {input}) or any other format that you find more readable and manageable.
- Use environment variables for sensitive data: When working with sensitive data such as API keys, it’s recommended to use environment variables to store them securely. You can modify the open_file() function to read an environment variable instead of a file, ensuring that sensitive data is not accidentally leaked or exposed.
- Error handling and input validation: To make your script more robust, you can add error handling and input validation. This can help you catch common issues, such as missing or improperly formatted files, and provide clear error messages to guide the user in correcting the problem.
By exploring these additional techniques, you can create more powerful, flexible, and secure prompt templates for use with ChatGPT in your cybersecurity projects.