ChatGPT: From answer-seeking to question-crafting [Students, School, & Chatbots]

In a world where chatbots are becoming increasingly prevalent, the ability to ask probing questions and think creatively has become more important than ever. As educators and parents, it is our responsibility to equip students with the skills they need to confidently navigate our ever-changing world. 

Recently, I attended a thought-provoking seminar by Po-Shen Loh, the National Coach for the USA International Mathematical Olympiad team and a former classmate of mine from Caltech. During the talk, Po posed a challenging question to ChatGPT Plus: Find the largest fraction less than 1/2, with the added restriction that both the numerator and denominator are positive integers less than or equal to 10,000. ChatGPT was unable to provide the correct answer. 

Inspired by the talk, I was curious to explore how we could use this example as an exercise for students to engage with A.I. in a thoughtful and meaningful way while fostering critical thinking. 

When I prompted ChatGPT with the same question, it gave an incorrect response:

ChatGPT made a number of mistakes in its initial solution. Let's see if we can steer ChatGPT in the right direction. Let's pose the same question again, this time followed by “think in steps.” The new response reads:

While it is still incorrect, ChatGPT is getting closer. Let's ask the same question once more, this time followed by “think like a programmer.” ChatGPT responds with a Python script that can be readily run in any web-based Python interpreter.

Running the suggested code yields the value 0. This would be an excellent teaching opportunity to remind students that, while ChatGPT's response may sound correct, it can still be seriously flawed. We can encourage students to analyze the code carefully and challenge them to find the error.  
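For readers following along without the screenshots, the sketch below shows the kind of bug that produces this behavior. It is an illustrative reconstruction rather than ChatGPT's verbatim script, and the exact loop bounds are my assumption:

# Brute-force search of the kind ChatGPT suggested (reconstructed for illustration).
# The flaw: the inner loop starts at the denominator, so every candidate fraction is
# at least 1, the test "less than 1/2" never passes, and the answer stays at 0.
best = 0
for denominator in range(1, 10001):
    for numerator in range(denominator, 10001):  # numerator >= denominator -- the bug
        fraction = numerator / denominator
        if fraction < 0.5 and fraction > best:
            best = fraction
print(best)  # prints 0

Tracing why the printed value never moves off 0 is exactly the kind of close reading this exercise is meant to provoke.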

To help ChatGPT find the right answer, I pointed out the error in a follow-up prompt: “For a fraction to be less than 1, the numerator is smaller than the denominator. Can you fix the code accordingly?” The response is as follows:
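This is not ChatGPT's verbatim reply, but a corrected brute force in the same spirit looks like the sketch below and does reach the right answer, 4999/9999:

# Corrected brute force: the numerator now stays strictly below the denominator,
# and exact integer comparisons avoid floating-point pitfalls.
best_num, best_den = 0, 1
for denominator in range(2, 10001):
    for numerator in range(1, denominator):  # numerator < denominator
        # candidate < 1/2  is equivalent to  2 * numerator < denominator;
        # candidate > best is checked by cross-multiplying, keeping everything exact
        if 2 * numerator < denominator and numerator * best_den > best_num * denominator:
            best_num, best_den = numerator, denominator
print(f"{best_num}/{best_den}")  # 4999/9999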

This time, ChatGPT’s code yields the correct answer. However, the solution is highly inefficient. Let’s check if ChatGPT can improve the code by prompting, “The code is highly inefficient. Reduce the search space.” ChatGPT’s reply reads:

The outcome of running the new script is again 0. Once more, ChatGPT's confident-sounding response is wrong. In fact, it repeats the same logical mistake we identified earlier, which points to a lack of real comprehension. The code is still far from efficient, and, more importantly, ChatGPT never arrives at an elegant mathematical solution, suggesting a lack of creativity as well.
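For reference, here is what a genuinely reduced search space looks like: for each denominator d, the only numerator worth checking is the largest one with n/d < 1/2, namely (d - 1) // 2, so the double loop collapses into a single pass. This sketch is mine, not ChatGPT's:

from fractions import Fraction

# For each denominator d, the best numerator below d/2 is (d - 1) // 2,
# so one pass over the denominators suffices.
best = Fraction(0)
for d in range(2, 10001):
    n = (d - 1) // 2
    best = max(best, Fraction(n, d))
print(best)  # 4999/9999

The pencil-and-paper argument is shorter still: 1/2 - n/d = (d - 2n)/(2d), which is smallest when d - 2n = 1 (forcing d to be odd, since an even d leaves a gap of at least 2/(2d)) and d is as large as possible, giving d = 9999, n = 4999, and the answer 4999/9999. That is the kind of elegance we would ultimately like students, and chatbots, to reach.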

Chatbots will only become more powerful in the future. Engaging students in such exercises, whether in STEM or literary fields, is instrumental in helping them understand the limitations of chatbots. By actively participating in the process, students sharpen their problem-solving skills and learn to navigate the delicate balance between using technology as a tool and nurturing an independent mindset.
