Build an AI Quiz Generator with OpenAI: Step-by-Step Tutorial Part 2
Add interactive feedback and grading to the AI-powered Quiz Generator built with OpenAI and Next.js: capture student answers, generate personalized Markdown feedback with the LLM, and show a grade breakdown on submit.
In the first part of this tutorial series, we created the foundation of an AI-powered Quiz Generator using OpenAI and Next.js. Here’s what we accomplished:
This setup allows educators to upload documents and automatically generate quizzes, while students can view and select answers for the generated questions.
Now that we have the quiz questions generated and displayed, it’s time to close the learning loop by adding:
To implement this feature, we add an event listener to each question. When the user picks an answer, we send the question, the correct answer, the user's answer, and the content source (our file in OpenAI storage) to the LLM and ask it to generate feedback that we display in the UI. The feedback should be in Markdown format.
Let's pass the file ID down from the Dashboard to the Quizzes component so we can use it in the LLM prompt.
In Dashboard.tsx, where you render the Quizzes component, pass fileId as a prop:
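A minimal sketch of what that could look like, assuming the Dashboard keeps the uploaded file's ID in a fileId state variable (the exact state and prop names may differ in your code):

```tsx
// Dashboard.tsx (sketch): pass the uploaded file's ID down to the Quizzes component
<Quizzes questions={questions} fileId={fileId} />
```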
In Quizzes.tsx, make sure you receive that prop:
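For example, the component's props could be extended like this (the Question type and prop names are illustrative):

```tsx
// Quizzes.tsx (sketch): accept the fileId prop alongside the generated questions
type QuizzesProps = {
  questions: Question[]; // the quiz questions generated in Part 1
  fileId: string;        // ID of the source file stored in OpenAI
};

export default function Quizzes({ questions, fileId }: QuizzesProps) {
  // ...
}
```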
When a user clicks on a choice, we want to grab the question, the correct answer, and the user's answer.
Open the Quizzes.tsx file and add an event handler to the Quizzes component like the following.
For each radio input, add this onChange event handler.
Your file should look like the following
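Here is a sketch of the relevant pieces, assuming each question object exposes the question text, its choices, and a correctAnswer field (adapt the names to the shape you generated in Part 1):

```tsx
// Quizzes.tsx (sketch): capture the user's answer when a radio option changes
const handleOptionChange = (questionIndex: number, choice: string) => {
  const q = questions[questionIndex];
  const userAnswer = {
    question: q.question,
    correctAnswer: q.correctAnswer,
    userAnswer: choice,
  };
  console.log(userAnswer); // for now, just inspect it in the browser console
};

// ...inside the render, for each choice of each question:
<input
  type="radio"
  name={`question-${questionIndex}`}
  value={choice}
  onChange={() => handleOptionChange(questionIndex, choice)}
/>
```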
Now, when you interact with a question, you should see userAnswer printed in the browser console:
We are going to create an endpoint that receives the userAnswer via a POST call and passes it to the LLM to get feedback. In the api directory, create a new folder called feedback and create a route.tsx file in it.
We create a simple POST API that checks whether we received all the necessary data and, when all checks pass, returns feedback to the UI. For now the feedback is hardcoded; we will implement the util function next so we can return real feedback.
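A minimal sketch of such a handler, assuming the App Router (so the file lives at app/api/feedback/route.tsx) and a temporary hardcoded feedback string; the payload field names are assumptions based on the steps above:

```tsx
// app/api/feedback/route.tsx (sketch): validate the payload and return feedback
import { NextResponse } from "next/server";

export async function POST(request: Request) {
  const { question, correctAnswer, userAnswer, fileId } = await request.json();

  // Make sure we received all the necessary data before calling the LLM
  if (!question || !correctAnswer || !userAnswer || !fileId) {
    return NextResponse.json({ error: "Missing required fields" }, { status: 400 });
  }

  // Placeholder feedback; we swap in the LLM call once the util function is in place
  return NextResponse.json({ feedback: "Thanks for answering! Real feedback coming soon." });
}
```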
Let's change our Quizzes component to:
- Make a POST call to the endpoint and send the userAnswer
- Replace the userAnswer for that question with the feedback the LLM generates for us
- Make handleOptionChange an async function so it can handle the fetch call

Let's first install the Markdown parser libraries we need: npm i react-markdown remark-gfm
Now open the Quizzes.tsx file and change its content to the following:
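The exact file depends on your markup from Part 1; here is a sketch of the key pieces under the assumptions above (the question shape, endpoint payload, and state names are illustrative):

```tsx
// Quizzes.tsx (sketch): fetch LLM feedback for the selected answer and render it as Markdown
"use client";

import { useState } from "react";
import ReactMarkdown from "react-markdown";
import remarkGfm from "remark-gfm";

type Question = {
  question: string;
  choices: string[];
  correctAnswer: string;
};

export default function Quizzes({ questions, fileId }: { questions: Question[]; fileId: string }) {
  // Feedback Markdown returned by the API, keyed by question index
  const [feedback, setFeedback] = useState<Record<number, string>>({});

  const handleOptionChange = async (questionIndex: number, choice: string) => {
    const q = questions[questionIndex];

    // Send the question, correct answer, user's answer, and file ID to our endpoint
    const res = await fetch("/api/feedback", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        question: q.question,
        correctAnswer: q.correctAnswer,
        userAnswer: choice,
        fileId,
      }),
    });
    const data = await res.json();

    // Replace the userAnswer for that question with the LLM feedback
    setFeedback((prev) => ({ ...prev, [questionIndex]: data.feedback }));
  };

  return (
    <div>
      {questions.map((q, i) => (
        <div key={i}>
          <p>{q.question}</p>
          {q.choices.map((choice) => (
            <label key={choice}>
              <input
                type="radio"
                name={`question-${i}`}
                value={choice}
                onChange={() => handleOptionChange(i, choice)}
              />
              {choice}
            </label>
          ))}
          {feedback[i] && (
            <ReactMarkdown remarkPlugins={[remarkGfm]}>{feedback[i]}</ReactMarkdown>
          )}
        </div>
      ))}
    </div>
  );
}
```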
In the utils directory, create a new file called feedback.tsx and set its content to the following:
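One way to implement it is with the OpenAI Responses API, passing the stored file as context alongside the question; this is a sketch under those assumptions (model choice, prompt wording, and the generateFeedback name are illustrative, not necessarily what the repo uses):

```tsx
// utils/feedback.tsx (sketch): ask the LLM for Markdown feedback on the user's answer
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function generateFeedback(
  question: string,
  correctAnswer: string,
  userAnswer: string,
  fileId: string
): Promise<string> {
  const response = await openai.responses.create({
    model: "gpt-4o-mini",
    input: [
      {
        role: "user",
        content: [
          // The uploaded course material stored in OpenAI serves as the content source
          { type: "input_file", file_id: fileId },
          {
            type: "input_text",
            text:
              `Question: ${question}\n` +
              `Correct answer: ${correctAnswer}\n` +
              `Student's answer: ${userAnswer}\n\n` +
              "Using the attached document as the source, give the student short, " +
              "encouraging feedback explaining why the correct answer is right. " +
              "Respond in Markdown format.",
          },
        ],
      },
    ],
  });

  return response.output_text;
}
```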
Now let's change our feedback route to return the LLM feedback instead of hardcoded text
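Assuming the util above exports a generateFeedback helper from utils/feedback.tsx, the updated handler could look like this (the import alias is an assumption):

```tsx
// app/api/feedback/route.tsx (sketch): return real LLM feedback instead of the placeholder
import { NextResponse } from "next/server";
import { generateFeedback } from "@/utils/feedback";

export async function POST(request: Request) {
  const { question, correctAnswer, userAnswer, fileId } = await request.json();

  if (!question || !correctAnswer || !userAnswer || !fileId) {
    return NextResponse.json({ error: "Missing required fields" }, { status: 400 });
  }

  // Generate feedback from the LLM using the uploaded file as the content source
  const feedback = await generateFeedback(question, correctAnswer, userAnswer, fileId);
  return NextResponse.json({ feedback });
}
```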
For this section, we are not going to use AI for grading, since we have multiple-choice questions and we already know the correct answer and the user's response. Let's define 3 as the total score for each question; when the user clicks submit, we show a modal with a per-question breakdown of the grade and the total score. Open the Quizzes.tsx file and change it to the following to show a modal when the user clicks the submit button. The modal shows the total grade and a breakdown of each question's score in a table.
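The scoring itself is plain TypeScript; here is a sketch of the grade calculation, assuming 3 points for a correct answer and 0 otherwise, with userAnswers, setGrade, and setShowModal as illustrative state (the modal markup is omitted):

```tsx
// Quizzes.tsx (sketch): compute per-question scores and the total when the user submits
const POINTS_PER_QUESTION = 3;

const handleSubmit = () => {
  // userAnswers is assumed to hold the selected choice per question index
  const breakdown = questions.map((q, i) => ({
    question: q.question,
    score: userAnswers[i] === q.correctAnswer ? POINTS_PER_QUESTION : 0,
  }));
  const total = breakdown.reduce((sum, row) => sum + row.score, 0);

  // Store the results in state and open the modal that renders the breakdown table
  setGrade({ breakdown, total, maxTotal: questions.length * POINTS_PER_QUESTION });
  setShowModal(true);
};
```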
With the interactive feedback and grading system now in place, your AI Quiz Generator has become a much more engaging and effective learning tool. Students receive immediate, personalized insights on their answers, and the grading breakdown helps them understand their overall performance clearly.
However, there are still areas we can improve to enhance the user experience further. For example, the current feedback generation depends on calls to the language model, which can introduce some latency, meaning students might wait a bit before seeing their feedback. Also, managing multiple attempts or allowing users to reset their answers and try again will make the quiz more flexible and learner-friendly.
Congratulations! 🎉 In this second part of our tutorial, you successfully implemented:
These features significantly improve the learning experience by closing the feedback loop and motivating students to engage deeply with the material.
You can find the GitHub repo here.
In the upcoming Part 3, we’ll focus on: