Can ChatGPT Solve this Math Riddle?

Testing ChatGPT on a High School Math Puzzle

It’s a beautiful Saturday morning, perfect for ChatGPT and math puzzles!

Problem Statement

Here’s the problem statement: find the value of sin(α) × sin(β) × sin(γ) × … × sin(ω) — the product of the sines of every letter of the Greek alphabet.

(we’re going to ask ChatGPT to solve this puzzle)

But first, let’s talk about the solution…

When you get to sin(pi), do you assume pi is a variable or a value?

You’ve probably never seen a variable named pi.

So you’d treat pi as a value: the constant π ≈ 3.14159.

This makes sin(pi) = 0, and since one factor is zero, the whole product equals zero.

The correct answer is 0.
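A quick sanity check makes this concrete. The sketch below plugs arbitrary values into the other 23 letters and reads pi as the constant; the zero factor wipes out the product regardless:

```python
import math
import random

# The 24 letters of the Greek alphabet, in order.
greek = (
    "alpha beta gamma delta epsilon zeta eta theta iota kappa lambda mu "
    "nu xi omicron pi rho sigma tau upsilon phi chi psi omega"
).split()

# Treat every letter as an unknown angle with an arbitrary value --
# except pi, which we read as the constant 3.14159...
angles = {name: random.uniform(0.1, 3.0) for name in greek}
angles["pi"] = math.pi

product = 1.0
for name in greek:
    product *= math.sin(angles[name])

# sin(pi) is 0 (up to floating-point noise), so the product vanishes
# no matter what values the other 23 letters take.
print(abs(product) < 1e-9)  # True
```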

But can ChatGPT figure it out?

Here’s a YouTube version of this post:

Naive Prompt

Fail.

ChatGPT says it can’t answer the problem without specific values for the angles.

Chain of Thought

Similarly, ChatGPT insists it can’t solve the problem without specific values for the angles.

Plan and Execute

What if I ask ChatGPT to create a plan first, then follow the plan?

Still no luck!

Agents

Here I try pseudo-agents.

Not really agents, but agents in spirit.

I ask ChatGPT to solve the problem, review the solution (i.e. give feedback), then solve it again using the “feedback” from step 2.

Honestly, I’m surprised that didn’t work better!

The professional mathematician’s review didn’t add much value. Overthinking it, ChatGPT!
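The solve → review → re-solve loop can be sketched generically. Here `ask_model` is a hypothetical stand-in for whatever LLM API call you use — swap in a real client to run this against ChatGPT:

```python
# A sketch of the solve -> review -> re-solve "pseudo-agent" loop.
# `ask_model` is a placeholder so the sketch is self-contained.

def ask_model(prompt: str) -> str:
    # Stand-in for a real LLM API call.
    return f"[model response to: {prompt.splitlines()[0]}]"

def solve_with_review(problem: str) -> str:
    # Step 1: first attempt at the problem.
    draft = ask_model(f"Solve this problem:\n{problem}")
    # Step 2: review the attempt (i.e. give feedback).
    feedback = ask_model(
        f"Review this solution as a professional mathematician:\n{draft}"
    )
    # Step 3: solve again, using the feedback from step 2.
    return ask_model(
        f"Solve this problem again, incorporating the feedback.\n"
        f"Problem: {problem}\nFeedback: {feedback}"
    )

answer = solve_with_review("sin(alpha) * sin(beta) * ... * sin(omega) = ?")
print(answer)
```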

Give a Hint

Now I nudge ChatGPT to think about each multiplier in the problem statement and whether any multiplier is zero. This is a pretty big hint:

It seemed promising:

“So, we should check if any of the angles is a multiple of pi. If at least one angle in the set … is a multiple of pi, then… [answer is] 0”

But the answer just stopped there.

ChatGPT failed to proceed to check if one of the angles is a multiple of pi.

Give Another Hint

Let’s make it even more explicit…

Hmm…

Why isn’t ChatGPT writing out each multiplier, as I asked it to?

Force Expansion

Ok, I’m going to be as explicit as possible.

ChatGPT: write out each multiplier, without ellipsis, then solve the problem.

I throw in Chain of Thought too.

This is the only prompt I’ve tried so far that gives both the correct answer and correct explanation:

ChatGPT listed out the Greek alphabet.

Then, ChatGPT explicitly wrote out the product.

It noticed pi is one of the angles in our product.

Hence, the product … is zero. 

Nice work, ChatGPT!

Really glad I won’t have to sit here all day 🫠 

BUT it feels like I had to give overly specific instructions for ChatGPT to finally get it. I don’t want to have to do that.

Easier ‘Solution’ — Give More Context

Seeking a shortcut with less handholding, I try giving ChatGPT some context:

“Solve the following riddle”

The answer is correct but reasoning is flawed:

“omega, which is a multiple of pi”

I don’t know why it assumes that.

But at least the answer is correct, with minimal hand-holding from me!

My hypothesis:

Using “riddle” makes ChatGPT more certain that there is a definitive answer.

I run it again:

Correct answer and correct explanation!

Again:

Correct answer and I think correct explanation. ChatGPT says “pi is typically included in such series to indicate completeness” which is a bit vague. But I’ll take it.

Out of 3 runs with the “riddle” context, ChatGPT got it right 3 times.

But its reasoning was a bit shaky that first run — “omega which is a multiple of pi”.

It seems like contextualizing this problem as a “riddle” forces ChatGPT to produce an answer rather than give a generic solution.

Changing “Riddle” to “Problem”

Now I change ONE word.

“Riddle” to “problem”.

Back to wrong, generic answers.

Keep the Riddle, Change SIN to COS

I’m curious what effect the word “riddle” has.

Let’s change sin to cos and see what happens.

It’s no longer 0.
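A numeric sketch (arbitrary values for the non-pi letters again) shows why: cos(pi) is −1, not 0, so no factor is forced to vanish:

```python
import math
import random

greek = (
    "alpha beta gamma delta epsilon zeta eta theta iota kappa lambda mu "
    "nu xi omicron pi rho sigma tau upsilon phi chi psi omega"
).split()

# Same setup as the sin version, but with cos: the pi factor is now
# cos(pi) = -1, so nothing forces the product to zero.
angles = {name: random.uniform(0.1, 1.0) for name in greek}
angles["pi"] = math.pi

product = 1.0
for name in greek:
    product *= math.cos(angles[name])

# The product depends on the other letters' values -- no closed form.
print(product != 0.0)  # True
```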

Not much of a riddle, but let’s see what ChatGPT says.

Seems like the term “riddle” forces it to look for a closed-form solution.

I like this output even more because ChatGPT says it needs more information AND tries to come up with a plausible answer for the riddle.