Reading Closely: What Does It Really Mean?
Marguerite Sheffer, a high school English teacher in Oakland Unified, knew she wanted her students to read “closely,” but she wasn’t sure exactly what that entailed. Frustrated with the mixed messages she was receiving from professional development resources about how to support students with close reading, she turned her uncertainty into a rigorous teacher inquiry project that helped clarify her ideas around this skill. In a series of three blog posts, she shares her insights from her yearlong inquiry with Mills Teacher Scholars.
The new Common Core standards explicitly call out the skill of “close reading of a rigorous text.”
Depending on who you ask, though, “close reading” means reading multiple times, reading through multiple critical lenses, or reading for “layers” of meaning. All agree that close reading is deep reading, but there is not one sure path to get students there. It becomes even murkier when we turn to removing the scaffolds and empowering students to closely read on their own.
As a teacher, I had become frustrated because the “close reading resources” I was receiving varied so widely that they amounted to different cognitive demands on students–some were more teacher-directed, whereas others were open-ended and student-focused.
Through my inquiry work at Mills, I was able to turn my frustration into advocacy, and enter the debate on what close reading means and how best to teach it.
What is NOT Close Reading: “Find the Right Answer and the Best Evidence”
In my 11th grade ELA classroom, we begin the year with a heavy focus on finding and analyzing textual evidence. I had trained my students very well to look instantly for the “right” answer when reading a text. I would provide them with a question, usually one that I had toiled over for hours the night before, trying to find the perfect mix of accessibility and analytical thought. They would find evidence from the text that provided the “best” answer to the question. They would analyze the evidence, showing how it answered my question. Students would combine their response (topic sentence) with their evidence and analysis to form a neat, structured paragraph. The method I was using–deliberately crafting critical questions to guide student discussion and writing about text–was essentially identical to the “text-dependent question” method offered to teachers as a tool for Common Core alignment.
All well and good, but the paragraphs were unoriginal, and did not show that students were grappling with the complexities and ambiguities of our text, Shakespeare’s Macbeth. For example, when I asked: “Do the female characters of Macbeth follow or defy gender stereotypes?” my students’ responses showed competence, but not originality–nearly every response I got was the same.
I was excited as I read and graded the first paper in this series, then quickly grew bored. Students were miming my own argument back at me, boxed in by the prompt. I had done the hard part–developing an argument–for them, then left them with the task of proving it. While this might be useful practice, it was not close reading.
Bold analysis should not be boring–this is how I knew that despite making gains in many areas, my students could not really be close reading. Their analysis was surface-level, and repetitive from student to student, not representing the multitude of voices and perspectives in my crowded classroom. This concept of “boring-ness” became part of my Mills Teacher Scholar inquiry, as one of my indicators of success became students “putting forth an original claim about the text.”
With the help of my Mills Teacher Scholars colleagues, I pinpointed one problem: My students were drawn towards the evidence that most obviously matched my question, but this was often the least nuanced evidence. For instance, in those essays on gender stereotypes in Macbeth, students like Jennifer cited Lady Macbeth’s line: “Unsex me here!” Students found the keyword “unsex,” which clearly fit the prompt, so they did not look further for more complex evidence. While students were not in error, choosing this evidence did not lead them to counterarguments or complex readings. As a Mills Teacher Scholar staff member observed, “This best practice [providing students with thoughtful critical questions] was really undermining students’ ability to interact with text complexity.”
Next Step: Stop Asking the Questions
In trying to answer my question, students were drawn towards the areas of the text they understood best, or the areas that were the least complex. They were going for the most obvious answer. Because of the limitations of the task I was asking them to perform, students were leading themselves away from the juiciest, most intricate parts of the text.
In my classroom, the practice of “text-dependent questions” was actually getting in the way of students performing close reading, because it was blocking them from creating an original argument, and from focusing on areas that they had genuine confusion about.
My job, then, was to help them navigate back to those complex parts of the text. I needed to find a scaffold or strategy to help students ask questions, not just find easy answers.