RESEARCH ON PROMPT ENGINEERING AND PROPOSED SOLUTIONS FOR DATA GENERATION IN EDUCATIONAL QUESTION-ANSWERING SYSTEMS
Abstract
This research explores four prompt engineering techniques, namely Zero-Shot, Few-Shot, Chain-of-Thought (CoT), and Retrieval-Augmented Generation (RAG), and assesses their effectiveness in educational question-answering systems. It evaluates their ability to handle complex queries, reason through multi-step problems, and generate accurate, contextually relevant responses. Results indicate that CoT and RAG are particularly effective for tasks requiring logical reasoning and the synthesis of information from multiple sources, while Zero-Shot and Few-Shot methods are more efficient for straightforward questions with lower computational demands. The research highlights the crucial role of prompt engineering in improving model performance and in generating high-quality datasets for educational applications. Practical solutions are proposed, including tailoring prompts to different question types, leveraging retrieval-based methods to keep responses current, and balancing answer quality against computational cost. These findings contribute to advancing educational question-answering systems, enabling Large Language Models (LLMs) to deliver precise, well-contextualized, and reliable responses in academic settings.
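To make the four techniques concrete, the following minimal sketch shows how each prompting style shapes the text sent to an LLM. It is illustrative only: the question, worked examples, document store, and keyword-overlap retrieval step are hypothetical assumptions, not drawn from the study; production RAG systems would use embedding similarity over a vector index rather than word overlap.

```python
def zero_shot(question):
    # Zero-Shot: the model receives only the task, with no examples.
    return f"Answer the following question.\nQ: {question}\nA:"

def few_shot(question, examples):
    # Few-Shot: a handful of worked Q/A pairs precede the new question.
    demos = "\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
    return f"{demos}\nQ: {question}\nA:"

def chain_of_thought(question):
    # CoT: the prompt explicitly elicits intermediate reasoning steps.
    return f"Q: {question}\nA: Let's think step by step."

def rag(question, documents):
    # RAG (toy version): retrieve the document sharing the most words
    # with the question and prepend it as context. Real systems rank
    # documents by embedding similarity instead of word overlap.
    q_words = set(question.lower().split())
    best = max(documents,
               key=lambda d: len(q_words & set(d.lower().split())))
    return f"Context: {best}\nQ: {question}\nA:"

if __name__ == "__main__":
    question = "What is the derivative of x^2?"
    examples = [("What is the derivative of x?", "1")]
    docs = ["The derivative of x^n is n*x^(n-1).",
            "Paris is the capital of France."]
    print(zero_shot(question))
    print(few_shot(question, examples))
    print(chain_of_thought(question))
    print(rag(question, docs))
```

Each function returns only a prompt string; in an actual system that string would be passed to an LLM, and the study's comparison concerns how these prompt shapes affect the quality of the model's answers.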