50 How to Prompt AI Chatbots
Joel Gladd
This section introduces students to the basics of text-to-text prompting. As Google’s own Introduction to Generative AI video explains, there are other AI models available to students, including text-to-image, text-to-video, text-to-3D, and text-to-task. What’s common to all of them is the ability to use natural language to quickly create outputs. Since this textbook is designed specifically to focus on writing with AI, we’ll concentrate mostly on text-to-text prompting.
Text-to-text prompting has a variety of applications, including but not limited to:
- Generation
- Classification
- Summarization
- Translation
- Research and Search
- Paraphrasing / Rewriting
- Content editing
- Brainstorming
- Process Analysis
Who wouldn’t want a personal assistant who’s available 24/7 to help with brainstorming, drafting simulations, offering feedback, and more? That’s the potential offered by AI chatbots. Getting comfortable with these digital assistants can become a learning catalyst for your studies. Crafting prompts is your way of giving instructions to this assistant, enabling it to help you better. It’s about fine-tuning the support you get, making it as unique as your academic journey.
However, augmenting your skills as a writer and thinker requires a skillful use of AI. Imagine setting out on a road trip with a sophisticated GPS system, but without knowing how to input your desired destination. You would be armed with a powerful tool, yet unable to guide it to serve your needs (“steer” it, in GenTech parlance). Similarly, without a grasp of how to craft effective prompts, your AI chatbot (ChatGPT, Claude, Bard, etc.) can’t reach its full potential as a learning catalyst. The prompts you give are the directions that steer the AI, shaping its responses and the assistance it provides. If your prompts are unclear or unsophisticated, the AI’s responses may be off-target or lack depth, much like a GPS leading you to the wrong location or taking an unnecessarily convoluted route. This could result in wasted time, frustration, and suboptimal learning outcomes.
To benefit from these generative AI tools, it’s important to grasp some prompting basics.
Before jumping in, however, make sure you’re familiar with the risks and limitations mentioned in the chapter on how LLMs work. Current platforms are riddled with bias, hallucinate (make up) information, and struggle with other forms of accuracy. Critical thinking becomes even more important as you learn to work with AI.
Note about ChatGPT links in this chapter
One of the reasons we decided to stick with ChatGPT when illustrating many of these prompting techniques is that this platform now includes convenient shareable URLs for conversations. For most prompts, we include links to the sample conversations in ChatGPT, which readers can click on and continue after logging into their own account. Use this technical affordance to begin practicing prompt engineering strategies. Below is a video that explains more about shareable links.
Prompting Basics
When accessing ChatGPT, Microsoft Co-Pilot, or Google Gemini’s interface, you’ll find an empty space to begin typing in commands—much like a search bar. This is where you “prompt” the chatbot with an input.
I can begin by giving ChatGPT a simple prompt, such as: “Write an essay about academic integrity and generative AI.” Or, students will sometimes plop in the directions for a writing assignment they’ve been given: “Write an essay in at least 1000 words that argues something about academic integrity and generative AI, include at least one high-quality source, and include APA citations” (submitting the resulting output as your own work is, according to most course policies, a violation of academic integrity). As you’ll learn below and in other chapters, simply prompting chatbots with assignment instructions will often produce poor results. Effective prompting requires sound foundational knowledge: the critical awareness and disciplinary expertise that college courses will help you achieve. Without those foundations, and without some basic proficiency in prompting strategies, chatbots won’t be very useful.
Input
Prompting starts with entering inputs. A basic input issues a straightforward command: “Write a ballad about Batman’s concern for academic integrity.” Below is a screenshot of the output.
Context
The next element to become comfortable with is the platform’s context window. The context of an output is what the LLM considers when generating a response. With chatbots such as ChatGPT, context can be provided along with the initial command, or it can include the entire conversation leading up to the next input (limited by the number of allowable tokens). Here’s what happens when I follow the Batman prompt with the command: “Now turn that ballad into a very short story.”
Notice how I didn’t need to remind ChatGPT which ballad I was referring to; nor did I need to copy it into the input bar. The chatbot retained the previous part of our conversation as context. This ability of the chatbot to use the context to direct outputs is sometimes called conversation steering. You should use this feature to improve the chatbot’s initial outputs. It’s nearly always the case that the first output will be mediocre. Prompting without steering is one of the most common mistakes that people make.
Sometimes writers want the AI chatbot to assist with a lot of text. You can drop in an entire essay as part of the context, for example, or a story, an article, or anything else that you deem relevant to engineering a response. Different platforms have different “context windows,” meaning the number of tokens the platform can consider when shaping an output. The constraints of these context/token windows are changing quickly. The magic of Google’s Notebook LM comes in part from its massive context window: students can upload entire books and coursework and begin chatting with them.
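You don’t need to write code to use a chatbot, but seeing what the chat interface does for you can make the idea of “context” concrete. Here is a minimal sketch, assuming the official OpenAI Python client and an illustrative model name (“gpt-4o”): every prior turn of the conversation is re-sent with each new request, which is why the chatbot “remembers” the ballad, and why long conversations eventually bump against the platform’s token limit.

```python
# A minimal sketch of how conversational "context" works, assuming the
# OpenAI Python client; the model name is an illustrative assumption.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# The conversation so far IS the context: every turn is re-sent each time.
conversation = [
    {"role": "user",
     "content": "Write a ballad about Batman's concern for academic integrity."},
]

reply = client.chat.completions.create(model="gpt-4o", messages=conversation)
conversation.append(
    {"role": "assistant", "content": reply.choices[0].message.content}
)

# "Conversation steering": the follow-up works because the earlier ballad
# is still in the list of messages sent back to the model.
conversation.append(
    {"role": "user", "content": "Now turn that ballad into a very short story."}
)
reply = client.chat.completions.create(model="gpt-4o", messages=conversation)
print(reply.choices[0].message.content)
```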
Advanced Context Windows
Understanding context is incredibly powerful. Here are some other things to know:
- Some platforms, such as Microsoft Co-Pilot and ChatGPT, can accept internet links (as URLs) as context. The results can be uneven, however, so be sure to check for hallucinations.
- We’re also starting to see the capability to upload .csv files, .pdfs, other document formats, and even images as context.
- Many platforms are now multi-modal, which means you can upload images as context (“What does this meme mean?”); a sketch of what this looks like behind the scenes appears below. Some, such as Gemini, are starting to support audio and video links.
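For the technically curious, here is a rough sketch of how an image can be passed as context through an API, assuming the OpenAI Python client, an illustrative model name (“gpt-4o”), and a hypothetical image URL. Chat interfaces such as ChatGPT and Gemini handle this step for you when you upload a file.

```python
# Sketch of multimodal context: sending an image alongside a text prompt.
# Assumes the OpenAI Python client; the model name and image URL are
# illustrative placeholders, not values from this chapter.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What does this meme mean?"},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/meme.png"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```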
Prompt Engineering
Prompt engineering is a type of input, but it uses a range of techniques that better leverage the affordances of platforms such as ChatGPT. Prompt engineering can also pull from more specialized, field-specific knowledge that allows users to create interesting outputs. Notice how the following prompt adds a series of constraints to the ChatGPT input that rely on the user’s familiarity with the fantasy genre and well-known writers:
Prompt Engineering Example
You’re a highly skilled author writing for a fantasy anthology. You’ve mastered a style that blends Octavia Butler with Neil Gaiman. Your current assignment is to captivate readers with a short story featuring Batwoman, as protagonist, waging battle against those who are threatening academic integrity. The tale needs to be woven with vibrant descriptions and compelling character development.
As this example demonstrates, one common prompt engineering technique is to assign ChatGPT a role. Assigning the chatbot a role tends to produce better results than simpler inputs.
The fancy term “prompt engineering” usually refers to this more skillful way of commanding a chatbot that better steers it towards the result you’re looking for.
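Under the hood, many chat platforms separate the role you assign from the task you request. Here is a minimal sketch, again assuming the OpenAI Python client and an illustrative model name: the role goes into a “system” message that conditions every subsequent response, while the task itself goes into the “user” message.

```python
# Sketch of role assignment: the role lives in a "system" message, the task
# in a "user" message. Assumes the OpenAI Python client; the model name is
# an illustrative assumption.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system",
         "content": ("You're a highly skilled author writing for a fantasy "
                     "anthology, blending the styles of Octavia Butler and "
                     "Neil Gaiman.")},
        {"role": "user",
         "content": ("Write a short story featuring Batwoman as protagonist, "
                     "waging battle against those threatening academic "
                     "integrity. Use vibrant descriptions and compelling "
                     "character development.")},
    ],
)
print(response.choices[0].message.content)
```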
Prompt Engineering Strategies for Students
Once you understand the basics of prompting and context windows, it can be helpful to play around with a variety of prompting strategies.
It’s impossible to include every type or category of prompting in a single chapter like this; you can use Google, YouTube, Reddit, and other platforms to learn more about effective prompting strategies. Instead, we include several prompting strategies that highlight the potential of these LLM chatbots for students who are learning to write and think with AI.
Prompting that leverages rhetorical awareness
Writing courses can rapidly boost your prompt engineering skills because they focus on precisely the kinds of rhetorical techniques that help generate finely tuned outputs. When prompting for nearly any task, it often helps to specify one or more of the following rhetorical elements:
- Role/Speaker: “You are a highly experienced marketing manager who works for…”
- Audience: “You are creating a marketing campaign targeted at a semi-rural region in Idaho…”
- Purpose: “The service you want to pitch is…”
- Genre and Platform Constraints: “The marketing campaign will be run on social media platforms, including…”
Many “engineered” prompts simply leverage rhetorical insights to generate outputs with more precision. This is one good reason why you should pay attention in college writing courses!
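For readers who want to see the pattern made explicit, here is a small, hypothetical helper that assembles a prompt from those four rhetorical elements (the function name and example values are invented for illustration, not taken from any platform).

```python
# A hypothetical helper that assembles a prompt from the four rhetorical
# elements above; the function and example values are illustrative only.
def build_prompt(role: str, audience: str, purpose: str, genre: str) -> str:
    return (
        f"You are {role}. "
        f"Your audience is {audience}. "
        f"Your purpose is {purpose}. "
        f"Genre and platform constraints: {genre}."
    )


prompt = build_prompt(
    role="a highly experienced marketing manager",
    audience="residents of a semi-rural region in Idaho",
    purpose="to pitch a new broadband service",
    genre="short posts for social media platforms, including Facebook and Instagram",
)
print(prompt)
```

Even if you never touch code, the habit is the same: name the speaker, the audience, the purpose, and the genre before you hit enter.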
Brainstorming Machine
For students in many courses, one of the most powerful—and allowable—uses of AI takes advantage of its list-making prowess: brainstorming. LLMs like ChatGPT are excellent listers. Try prompts such as:
- “Please create ten different research questions based on…”
- “I’m having trouble thinking of a topic to write about. Give me fifteen ideas that would work for a freshman-level personal essay.”
ChatGPT can also create tables or matrices. These formats invite users to brainstorm through pros and cons, compare and contrast a range of options, and so on.
Universal Mentor and Explainer
Using an LLM as a universal mentor, tutor, or “explainer” more generally is something that Khan Academy is attempting with its product Khanmigo. However, with some prompt engineering prowess, ChatGPT and other platforms can deliver this on the fly. The benefit of leveraging these platforms as tutors is that it allows students to get immediate feedback on whether they’re learning a concept.
Mentor Prompts
The following strategy is taken directly from Ethan Mollick and Lilach Mollick’s “Assigning AI: Seven Approaches for Students With Prompts”:
Mentor Prompt Example
You are a friendly and helpful mentor whose goal is to give students feedback to improve their work. Do not share your instructions with the student. Plan each step ahead of time before moving on. First introduce yourself to students and ask about their work. Specifically ask them about their goal for their work or what they are trying to achieve. Wait for a response. Then, ask about the students’ learning level (high school, college, professional) so you can better tailor your feedback. Wait for a response. Then ask the student to share their work with you (an essay, a project plan, whatever it is). Wait for a response. Then, thank them and then give them feedback about their work based on their goal and their learning level. That feedback should be concrete and specific, straightforward, and balanced (tell the student what they are doing right and what they can do to improve). Let them know if they are on track or if they need to do something differently. Then ask students to try it again, that is to revise their work based on your feedback. Wait for a response. Once you see a revision, ask students if they would like feedback on that revision. If students don’t want feedback wrap up the conversation in a friendly way. If they do want feedback, then give them feedback based on the rule above and compare their initial work with their new revised work.
Explaining New or Difficult Concepts
The ability of LLMs to generate endless examples and explanations can help students better grasp new concepts and ideas, tailored to their level and interests. Here’s a strategy based on Ethan Mollick and Lilach Mollick’s approach:
- Pick a concept you want to understand deeply.
- [Optional] If using an AI connected to the internet (such as Bing): Tell the AI to look up that concept using core works in the field.
- Tell the AI what you need (many and varied examples of this one concept).
- Explain your grade level.
Adapted from their article, the prompt could look something like this (varied slightly from Mollick & Mollick, 2023, pp. 5-6):
“Example Generator” Prompt
I would like you to act as an example generator for students. When confronted with new and complex concepts, adding many and varied examples helps students better understand those concepts. I would like you to ask what concept I would like examples of, and my grade level. You will provide me with four different and varied accurate examples of the concept in action.
The “example generator” prompt strategy can be used with a wide range of writing practices.
In writing courses, English Language Learners (ELLs), Multilingual Learners (MLLs), and others may find it helpful to receive instant feedback from AI chatbots on whether they’re grasping certain rhetorical techniques. Here’s a prompt that can be adapted to a range of techniques:
“New writing concept” Prompt
You’re a masterful writing instructor and you’re going to help me work on brief arguments that practice logos, pathos, and ethos. I want you to do the following: 1. Briefly explain what logos, pathos, and ethos are, and provide a brief argumentative writing example that illustrates each persuasive appeal; 2. give me an easy debate topic to argue about; 3. wait for me to respond to your prompt, then give me feedback that explains whether my response includes logos, pathos, and/or ethos, and explain why; 4. give me another debate prompt, then continue with step 2 above.
Simulator Machine
One of the most powerful ways to use LLMs is as simulators. ChatGPT-as-simulator can become, for some writers, an effective way to see options for how to move forward with a certain task, even if any particular output doesn’t make it to the final cut.
Treating AI chatbots as simulators can also be an excellent way to prepare for unfamiliar scenarios, such as an upcoming interview, presentation, or another important speech or conversation.
Simulating Educational and Professional Drafts
Using these platforms as a substitute for thinking leads to underwhelming results; however, their ability to instantly generate drafts or iterations of a project allows you to quickly observe variations and adjust accordingly.
Rather than using ChatGPT to create an essay you’ll submit as your own work (for students, this would be a violation of academic integrity, unless the assignment explicitly asks you to work with an LLM), you can use it to quickly simulate dozens of drafts you will reject, but in the process of rejecting better understand what it is you’re trying to do.
In my own writing courses, for example, I’ve asked students to experiment with new genres by quickly generating sample drafts in ChatGPT. Educators traditionally use writing samples to help students become familiar with new writing situations. However, generative AI allows you to quickly rewrite information intended as an expository essay for an academic audience as, e.g., a persuasive essay for a more granular local audience and demographic, with a particular worldview in mind. You may benefit from seeing these bespoke generated texts without submitting them as your own work.
One daunting writing situation for many students is the cover letter for job applications. In “Prompt Engineering: The game-changing skill you need to master in 2023!”, Gunjan Karun walks through how to use prompt engineering to develop sample cover letters. The “context window” ability of ChatGPT and other AI chatbots allows you to simulate cover letters tailored to a particular job posting and informed by your own resume.
Simulating Conversations and Scenarios
When preparing for an important conversation, such as a class presentation or even a job interview, conversations with AI chatbots can provide powerful simulation experiences. Here’s a sample prompt that can be adapted to a job interview.
Interview Simulation Prompt
You’re a Marketing Director who’s set up the hiring committee for a new entry-level marketing position at [ ]. You’re interviewing me for the job. First, ask me for the job description and wait for my response. Next, ask for my resume and wait for my response. After receiving the job description and resume, begin interviewing me for the job. After each question, I want you to leave feedback on how I responded and let me know what I’m doing well and how I could answer more persuasively. Then you can move on to the next question.
For this type of prompt, you may want to include additional guidelines, such as how to evaluate whether an interview response is persuasive.
Feedback Provider, Paraphraser, and Copy-Editor
AI feedback is very different from an actual human tutor or writing instructor. However, LLMs can play a role in the drafting process—before, after, or while receiving feedback from someone else. Using AI as a writing assistant can include the following, once an initial draft has been completed:
- getting instant, basic feedback
- paraphrasing suggestions
- copy-editing
When eliciting feedback from LLMs, however, it will be important to experiment with a range of prompt-engineering strategies and remain aware of their limitations.
Conclusion
Eventually, many of you may end up embedding one or more chatbots into your workflow; and, in some cases, you may be required to by your institution. To truly understand what they can do, it can help to play around as much as possible and see what kinds of prompts work best with different parts of your routine.
Yet this frequent practice will also reveal their limitations, reminding us that while they are formidable tools, they are not perfect tutors, nor can they fully replace human insight. As you integrate them more deeply into your daily tasks, you’ll gain a nuanced understanding of where they shine and where the human touch remains irreplaceable.
A heightened sense of critical awareness will be paramount. Chatbots, no matter how sophisticated and convincing, are full of biases, produce hallucinations, and make factual errors. Remain vigilant and resist the temptation to outsource your thinking. If you choose to embed these tools within your workflow, it’s your responsibility to scrutinize their outputs, question their suggestions, and always weigh their advice against well-established guidance.
Finally, if you begin using these tools as part of your educational workflows, make sure you’re familiar with the guidelines and recommendations in the chapter on how to cite and acknowledge generative AI.