Chain of Thought Prompting

Chain of Thought Prompting, also known as CoT, is a way to enhance the reasoning and problem-solving abilities of Large Language Models (LLMs).

It guides the LLM through a step-by-step process to arrive at a final result, much like showing your work on a math problem. This is done by breaking the problem down into smaller pieces that are solved one at a time. Prompting in this fashion helps language models generate more accurate and coherent responses, especially for complex queries.

In this article, we are going to cover 5 different examples of CoT prompting for both ChatGPT and the OpenAI API: Zero-Shot, One-Shot, Few-Shot, Automatic, and Multimodal.

If you do not want to read the article, I have a full YouTube video going over each of the examples provided here.

Also, if you are looking for any work that deals with Large Language Models, hit us up over email or the contact form on the website.

If you came here for the ChatGPT examples, feel free to skip this section. But if you want to code along, please follow these steps.

For this tutorial, you are going to have to pip install langchain, openai, and langchain_openai:

!pip install langchain
!pip install openai
!pip install langchain_openai

Once these are installed, you can import OpenAI, FewShotPromptTemplate, PromptTemplate, and ChatOpenAI:

from langchain_openai import OpenAI
from langchain.prompts.few_shot import FewShotPromptTemplate
from langchain.prompts.prompt import PromptTemplate
from langchain_openai import ChatOpenAI

Lastly, set your OpenAI API key as an environment variable:

import os
os.environ["OPENAI_API_KEY"] = ""

Zero-Shot Chain-of-Thought prompting

A Zero-Shot prompt is when no examples are provided to the large language model.
In terms of a CoT prompt, you need to add verbiage to the end of your prompt to get the LLM to show its reasoning.
Common expressions like “Explain your work” or “Show your steps” are often used.

ChatGPT Example

Them Crooked Vultures is playing a concert. 13 of the 15 songs played are from
their self titled album. How many songs were played that weren’t on the album?
Explain your answer step by step

From the prompt above, what makes this a chain of thought is the last line: Explain your answer step by step.

Below is the result when the prompt is fed to ChatGPT.
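ChatGPT's exact wording will vary from run to run, but you should get a step-by-step response along these lines:

The band played 15 songs in total. Of these, 13 were from their self-titled album.
To find the songs that weren't on the album, subtract the album songs from the total: 15 - 13 = 2.
Therefore, 2 songs were played that weren't on the album.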

OpenAI API Example


llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
question = "Can you solve the following math question? Please show how you got the answer: 9x + 62 = 89"

From the prompt above, what makes this a chain of thought is the last line: Please show how you got the answer.

output = llm.invoke(question)
print(output.content)
 

Below is the result we get:

To solve the equation 9x + 62 = 89, we need to isolate the variable x.

First, subtract 62 from both sides of the equation:

9x + 62 – 62 = 89 – 62

9x = 27

Next, divide both sides by 9 to solve for x:

9x / 9 = 27 / 9

x = 3

Therefore, the solution to the equation 9x + 62 = 89 is x = 3.

One-Shot Chain-of-Thought prompting

For one-shot prompting, we add in one example. You can see below that we have a Q and an A.

In this example, we go over a slight variation of the question we want answered.

Q: A band is playing a concert with 20 songs. 14 of the songs are from their latest album.
How many songs were played that weren’t from the latest album?

A: The band played 20 songs in total. Out of these, 14 songs are from their latest album.
To find out how many songs were not from the latest album, we subtract the number of songs from the latest album
from the total number of songs: 20 – 14 = 6. Therefore, 6 songs were played that weren’t from the latest album.

Q: Them Crooked Vultures is playing a concert.
13 of the 15 songs played are from their self titled album. How many songs were played that weren’t on the album?
Explain your answer step by step

OpenAI API Example 1

We are going to take a look at two different examples with the OpenAI API. The first is what I call the lazy way.

Just use your prompt with an additional example and send it to the large language model.

one_shot_prompt = """
Q: Can you solve the following math question? Please show how you got the answer: 11x + 15 = 125
A:
 
Step 1: Subtract 15 from both sides.
11x = 110
 
Step 2: Divide by 11
x = 10
 
Step 3:
The final answer is 10. 11 * 10 + 15 = 125
Verified
 
Q: Can you solve the following math question? Please show how you got the answer: 9x + 62 = 89
A:
"""
output = llm.invoke(one_shot_prompt)
print(output.content)

OpenAI API Example 2

This is the better approach to take. It correctly lays out all the steps to prompting with examples.

example = [
    {
        "Question": "Can you solve the following math question? Please show how you got the answer: 11x + 15 = 125",
        "Solution": """
Step 1: Subtract 15 from both sides.
11x = 110
 
Step 2: Divide by 11
x = 10
 
Step 3:
The final answer is 10. 11 * 10 + 15 = 125
Verified
        """
    }
]

After an example is set up, we want to create an example_prompt.

example_prompt = PromptTemplate(
    input_variables=["Question", "Solution"],
    template="Q: {Question}\nA: {Solution}"
)
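If you want a quick sanity check of the template on its own, you can render the lone example through it. This assumes the example list and example_prompt defined above; the output is the same Q/A block that the few-shot template will reuse.

# Render the single worked example through the Q/A template
print(example_prompt.format(**example[0]))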

After that, we can create a few-shot prompt template.

one_shot_prompt = FewShotPromptTemplate(
    examples=example,
    example_prompt=example_prompt,
    suffix="Q: Can you solve the following math question? Please show how you got the answer: {input}\nA:",
    input_variables=["input"],
)
    input_variables=["Question", "Solution"],
    template="Q: {Question}\nA: {Solution}"
)

If you want to see what your prompt looks like before it is sent to the model, use the following code.

formatted_prompt = one_shot_prompt.format(input="9x + 62 = 89")
print(formatted_prompt)
 
response = llm.invoke(formatted_prompt)
print(response.content)
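Printing formatted_prompt should show the worked example first, followed by the new question with an empty answer slot for the model to fill in, roughly like this:

Q: Can you solve the following math question? Please show how you got the answer: 11x + 15 = 125
A: 
Step 1: Subtract 15 from both sides.
...
Verified

Q: Can you solve the following math question? Please show how you got the answer: 9x + 62 = 89
A: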

Few-Shot Chain-of-Thought prompting

Few-shot prompting is when you give two or more examples. Let's take a look at the ChatGPT band setlist and math problem one more time.

ChatGPT Example

In the example below, we add a second example about a musician performing 25 songs at a concert.

Q: A band is playing a concert with 20 songs. 14 of the songs are from their latest album. How many songs were played that weren’t from the latest album?
A: The band played 20 songs in total. Out of these, 14 songs are from their latest album. To find out how many songs were not from the latest album, we subtract the number of songs from the latest album from the total number of songs: 20 – 14 = 6. Therefore, 6 songs were played that weren’t from the latest album.

Q: A musician performed 25 songs during a concert. 10 of these songs are from their debut album. How many songs were not from the debut album?
A: The musician performed 25 songs in total. 10 of these songs are from their debut album. To find out how many songs were not from the debut album, we subtract the number of debut album songs from the total number of songs: 25 – 10 = 15. Therefore, 15 songs were played that weren’t from the debut album.

Q: Them Crooked Vultures is playing a concert.
13 of the 15 songs played are from their self titled album. How many songs were played that weren’t on the album?
Explain your answer step by step

OpenAI API Example (Better Example)

examples = [
    {
        "Question": "Can you solve the following math question? Please show how you got the answer: 11x + 15 = 125",
        "Solution": """
Step 1: Subtract 15 from both sides.
11x = 110
 
Step 2: Divide by 11
x = 10
 
Step 3:
The final answer is 10. 11 * 10 + 15 = 125
Verified
        """
    },
    {
        "Question": "Can you solve the following math question? Please show how you got the answer: 4x - 7 =21",
        "Solution": """
Step 1: Add 7 to both sides.
4x = 28
 
Step 2: Divide by 4
x = 7
 
Step 3:
The final answer is 7. 4 * 7 - 7 = 21
Verified
        """
    },
 ]
example_prompt = PromptTemplate(
    input_variables=["Question", "Solution"],
    template="Q: {Question}\nA: {Solution}"
)
few_shot_prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Q: Can you solve the following math question? Please show how you got the answer: {input}\nA:",
    input_variables=["input"],
)
formatted_prompt = few_shot_prompt.format(input="9x + 62 = 89")
response = llm.invoke(formatted_prompt)
print(response.content)
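The advantage of this template-based approach over the plain string version is that scaling up is trivial: adding a third or fourth worked example is just a matter of appending another dictionary to the examples list, and FewShotPromptTemplate handles the formatting. As before, you can print formatted_prompt first if you want to confirm that both worked examples appear ahead of the new question.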

Automatic Chain-of-Thought prompting (Auto-CoT)

With Auto-CoT, you are constantly asking questions to get to an answer. This is done through an example as well, so think of it as an extension of a one-shot or few-shot prompt.

ChatGPT Example

Example 1: Basic Arithmetic
Q: The Tampa Bay Rays sell 15000 tickets for a baseball game. They also sold 5000 parking passes at $20. If the average ticket price is $40, how much ticket revenue did they bring in?

What amount of tickets sold?
15000

What is the avg ticket price?
$40

What is the total ticket revenue? Find the total ticket revenue by multiplying the tickets sold and avg ticket price
15000 * 40 = 600000

Can you grab the final result and write out a statement
The Rays brought in $600000 in revenue from ticket sales

Q: The Rays are in a playoff game. They sold 30000 tickets. They also brought in $20,000 of merchandise revenue. The average ticket price for the game was $50. What was the ticket revenue that night?
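Following the pattern laid out in the example, ChatGPT should walk through the same sub-questions and land on an answer along these lines:

Tickets sold: 30000
Average ticket price: $50
Total ticket revenue: 30000 * 50 = 1500000

The Rays brought in $1,500,000 in revenue from ticket sales that night. The $20,000 of merchandise revenue is separate and does not count toward ticket revenue.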

OpenAI API Example

cot_prompt = """
Q: Can you solve the following math question? Please show how you got the answer: 11x + 15 = 125
 
Can you Identify the equation used:
11x + 15 = 125
 
How can you isolate X?
Subtract 15 from each side
 
Can you perform the operation
11x = 110
 
How can you get x to be by itself?
divide by 11
 
Can you perform the operation
x = 10
 
Is this the final answer? Check again to make sure
11(10) + 15 = 125
 
Confirmed
 
Q: Can you solve the following math question? Please show how you got the answer: 9x + 62 = 89
 
"""
output = llm.invoke(cot_prompt)
print(output.content)

OpenAI API Example (Better Example)

examples = [
    {
        "Question": "Can you solve the following math question? Please show how you got the answer: 11x + 15 = 125",
        "Solution": """
Identify the equation used:
11x + 15 = 125
 
How can you isolate X?
Subtract 15 from each side
 
Perform the operation
11x = 110
 
How can you get x to be by itself?
divide by 11
 
Perform the operation
x = 10
 
Is this the final answer? Check again to make sure
11(10) + 15 = 125
 
Confirmed
        """
    },
 ]
example_prompt = PromptTemplate(
    input_variables=["Question", "Solution"],
    template="Q: {Question}\nA: {Solution}"
)
few_shot_prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Q: Can you solve the following math question? Please show how you got the answer: {input}\nA:",
    input_variables=["input"],
)
formatted_prompt = few_shot_prompt.format(input="9x + 62 = 89")
response = llm.invoke(formatted_prompt)
print(response.content)

Multimodal Chain-of-Thought prompting

Multimodal prompting is when you use both images and text. We can essentially use an image with a zero-shot prompt to achieve a solution.

ChatGPT Example

Calculate the area and the perimeter of the hexagon in the image.
Each side is the same length. Please show the step by step process of solving this question
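The exact numbers depend on the side length shown in the image, so here is the general step-by-step reasoning you should expect, written for a side length s:

Step 1: A regular hexagon has 6 equal sides, so the perimeter is 6 * s.
Step 2: The area of a regular hexagon with side s is (3√3 / 2) * s², roughly 2.598 * s².
Step 3: Substitute the side length from the image to get the numeric perimeter and area.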


OpenAI API Example

from openai import OpenAI  # note: this client comes from the openai package, not the langchain_openai wrapper used above
from IPython.display import Image, display
import base64
 
MODEL = "gpt-4o"
client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))
 
IMAGE_PATH = "/content/slope_example.png"
 
# Preview image for context
display(Image(IMAGE_PATH))
def encode_image(image_path):
    with open(image_path, "rb") as image_file:
        return base64.b64encode(image_file.read()).decode("utf-8")
 
base64_image = encode_image(IMAGE_PATH)
 
response = client.chat.completions.create(
    model=MODEL,
    messages=[
        {"role": "system", "content": "You are a math assitant who can solve algebra problems. You show every step taken"},
        {"role": "user", "content": [
            {"type": "text", "text": "I'm going to provide you an image of a graph. Please breakdown the process of finding the slop and confirm the answer"},
            {"type": "image_url", "image_url": {
                "url": f"data:image/png;base64,{base64_image}"}
            }
        ]}
    ],
    temperature=0.0,
)
 
print(response.choices[0].message.content)
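The exact response depends on the graph in your image, but a typical answer reads two points off the line, computes rise over run, and then states and confirms the slope it found.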
