18 How Do I Give and Receive Peer Feedback?
This section on Peer Review has been adapted from Writing for Success, Chapter 8, CC-BY-NC-SA 4.0.
After working so closely with a piece of writing, writers often need to step back and ask for a more objective reader (it can be argued that generative AI is actually a biased reader, but we’ll save that concern for ethics conversations). What writers most need is feedback from readers who can respond only to the words on the page. When they are ready, writers show their drafts to someone they respect who can give an honest response about their strengths and weaknesses.
Most English courses incorporate this step as a part of the writing process we call peer review. After you evaluate the feedback and assess what is most helpful, the reader’s comments will guide you when you revise your draft. You can work with a partner in your class and identify specific ways to strengthen each other’s essays. Although you may be uncomfortable sharing your writing at first, remember that each writer is working toward the same goal: a final draft that fits the audience and the purpose. Maintaining a positive attitude when providing feedback will put you and your partner at ease. The box that follows provides a useful framework for the peer review session.
What about Using Generative AI Tools for Peer Review?
Throughout this textbook, we have been exploring ways to use generative AI to improve our writing. But one way you must never use generative AI without explicit consent is to provide peer feedback on a colleague’s paper. The reason is not that AI is incapable of the task. Rather, it’s an ethical concern, one that in many respects mirrors the foundational concerns of writers and artists about the way that large language models like ChatGPT were trained: the training sets for these models contain works that were used without the authors’ permission.
When you copy and paste (or attach) anything written by someone else into a large language model, you’re taking the risk that this material will be used as part of a training data set. This can violate both your peer’s privacy and intellectual property (copyright) rights.
However, it’s fine to use generative AI tools as “peers” to review your own work, as long as you understand the tool’s terms of service.
In our course, we have been experimenting with weekly AI formative feedback on our writing tasks. “Formative” means that you’re still working on your overall paper, so the feedback you get from the AI is designed to strengthen your writing, not assess it. The summative assignment for Unit One is the final draft of your exploratory research essay. A comprehensive AI review based on the essay rubric can help you to identify areas where you are meeting expectations while also pointing out ways to improve the essay to meet the rubric’s specific requirements.
Using Feedback Objectively
The purpose of peer feedback is to receive constructive criticism of your essay. Your peer reviewer is your first real audience, and you have the opportunity to learn what confuses and delights a reader so that you can improve your work before sharing the final draft with a wider audience (or your intended audience).
It may not be necessary to incorporate every recommendation your peer reviewer makes. However, if you start to observe a pattern in the responses you receive from peer reviewers, you might want to take that feedback into consideration in future assignments. For example, if you read consistent comments about a need for more research, then you may want to consider including more research in future assignments.
Using Feedback from Multiple Sources
You might get feedback from more than one reader as you share different stages of your revised draft. In this situation, you may receive feedback from readers who do not understand the assignment or who lack your involvement with and enthusiasm for it.
You need to evaluate the responses you receive according to two important criteria:
- Determine if the feedback supports the purpose of the assignment.
- Determine if the suggested revisions are appropriate to the audience.
Then, using these standards, accept or reject revision feedback. You should ask the same questions about any suggestions you receive from a generative AI tool. Remember that ultimately, you are the (human) writer in the loop!
How to leave good feedback for other writers
When other writers ask us to give feedback, it’s tempting to mimic what so many English instructors have done to our essays: go through the rough draft with a red pen in hand (literal or figurative), marking grammatical errors and crossing out words and sentences. Peer review, in this manner, feels like flogging. The more criticisms we can offer, the better we feel about our comments.
But that’s missing the point entirely.
As the author of the draft being reviewed, how does that kind of feedback feel? Are you motivated to excel? Do you feel empowered to move forward and hone your expertise as a burgeoning writer? Some might, but most feel the opposite. An entirely critical (in the negative sense) approach to peer review might make the reviewer feel good about themselves, but it only demotivates the author.
Leaving feedback that is constructive, on target, and empowering is a life skill. It’s not something you’ll do only in writing courses. Nearly all professional jobs require employees and administrators to review one another, usually on a fairly consistent basis. Here are some tips for practicing helpful feedback.
Start by noticing
Eli Review offers a high-quality platform for doing peer reviews, and part of its framework is inspired by Bill Hart-Davidson’s strategy of Describe – Evaluate – Suggest, summarized in this short video:
https://youtube.com/watch?v=KzdBRRQhYv4
The Hart-Davidson model of peer feedback described in the video works in part because it encourages peer reviewers to begin by noticing what the writer is doing (or attempting to do). This first step aims to be as neutral as possible. Language that notices might sound something like:
Overall, I see that you’re arguing ___________. Your essay opens by ___________. You then ___________. Finally, the last few paragraphs of the essay ___________.
Feedback that notices accurately “says back” to the writer what’s happening in the draft. This saying-back allows them to see how their draft is being received, in comparison with what they thought they were doing.
When evaluating, share criteria
The second move in Hart-Davidson’s approach is to evaluate a draft by clearly sharing the criteria the peer reviewer is using. In a course, these criteria might be the course outcomes, or the more specific outcomes pertaining to that unit. Perhaps an assignment expects a student to practice certain persuasive strategies; or, instead, an essay prompt might ask students to analyze something according to certain key concepts. The peer reviewer should read and offer feedback with those particular outcomes in mind. But there are also broader composition strategies that apply to nearly all writing situations. Unity, coherence, cohesion, and style are important goals for all forms of writing. The second half of this chapter offers tips for tackling those areas.
Language that evaluates might sound something like this:
When reading your draft for coherence and cohesion, I notice a few areas that feel unclear to me as a reader. On page 2, for example, ___________.
When making suggestions, remain constructive and ask questions
The final move in the Hart-Davidson heuristic is to “make suggestions for success.” This is where a reader can begin to offer more targeted comments on what steps the writer can take to improve their draft. Even here, however, it’s important to think about how your feedback will be received. Tone matters. Notice the difference between the following two suggestions:
You make a lot of grammatical mistakes on page 3. Fix them.
I think I understand the main idea of your third paragraph, but it took me a while to figure it out. Can you add a sentence or two earlier in the paragraph to make it more obvious?
The first example above sounds harsher. It’s also a little vague. As a writer, if I’m told I made a lot of grammatical mistakes, I’ll probably feel like I’m failing the English language more generally (that’s how it comes across). The second example avoids generalizing about the writer’s identity because it’s highly targeted. The second example also has a more constructive tone because it frames the suggestion as a question rather than a demand (can you _______? vs. do __________!).
When completing your peer reviews, you don’t have to follow each stage of the Hart-Davidson model. Sometimes it might feel more appropriate to jump right to the Evaluation stage, for example. But your peer feedback should usually:
- accurately “say back” to the writer what their draft is doing,
- evaluate according to clearly defined criteria,
- offer constructive and empowering suggestions.
Peer feedback that does those three things, in one way or another, will result in better drafts and help you become a more effective collaborator in any environment where giving and receiving feedback is important, such as your workplace.