by Melissa Braaten
It’s common knowledge that many adult students struggle with word problems, which, incidentally, make up the majority of the questions they will be asked to answer on high-stakes HSE exams. Since word problems bring together both language and mathematical reasoning, they require students to use and integrate several skill sets, and a deficit in any one of these skills can cause students to get lost. Much of the literature on word problems focuses on helping students build operation sense (knowing what the operations can look like in the real world in order to select the correct ones to use), mathematical vocabulary (as distinct from a focus on “key words,” which can be misleading), and problem-solving strategies.
While the skills mentioned above are indispensable and probably account for a great deal of the difficulty that students encounter with word problems, I have also found, in many students, an additional difficulty that seems to be distinct. I have worked with students who appear to have the mathematical and vocabulary foundations needed to approach a word problem, and who have demonstrated problem-solving acumen in other contexts, yet who are still completely lost reading a word problem. It appeared that although they could decode the words and even knew what the words meant, they still couldn’t understand what they were reading. This led me to wonder: Is reading math questions different from reading other types of text? I suspected it was, but I wanted to learn more.
To attempt to assess “math reading ability” in isolation, I took HSE-style word problems and wrote three options for paraphrasing the question in each word problem. Two of the options were incorrect paraphrases, and one was correct. I asked students not to solve the word problem, but only to identify which of the choices was asking the same thing as the original question. Students struggled quite a bit with these exercises.
Some of the “easier” examples could be identified by matching a basic unit:
St. Thomas’ School has decided to put tile in the math classroom. The classroom is 12 feet x 15 feet. The tiles come in boxes, and each box will cover 6 square feet of floor. How many boxes are needed?
A. How many tiles come in a box?
B. What is the area of the classroom that will be covered in tile?
C. How many boxes of tiles will it take to cover the area of the classroom?
In the question above, students could have chosen the correct answer merely by identifying that option C is the only question that asks for a number of boxes. When I tested this question with 16 adult education students with varying levels of math and reading ability, 13 of 16 (81%) chose the correct answer.
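For readers who want to check the tile problem itself, the arithmetic works out cleanly; here is a quick sketch in Python (the variable names are mine, not from the original problem):

```python
import math

# Worked arithmetic for the tile problem above.
length_ft = 12
width_ft = 15
coverage_per_box_sqft = 6

# Area of the classroom (the quantity option B asks about).
area_sqft = length_ft * width_ft  # 12 * 15 = 180 square feet

# Boxes needed (the quantity the original question and option C ask about).
# Rounding up, since you can't buy a fraction of a box.
boxes_needed = math.ceil(area_sqft / coverage_per_box_sqft)  # 180 / 6 = 30 boxes
```

Here the division happens to come out even, so no partial box is left over.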
Questions that involved more complex units, such as rates, were harder.
Folders come in packs of 10. St. Thomas’ needs 4 folders per student, and expects to enroll 20 students in September. Folders cost $11 for one pack. How much will St. Thomas spend for folders per student?
A. How many folders will St. Thomas need to buy for September?
B. What is the cost per folder?
C. What is the cost to buy 4 folders for one student?
When I put this question through a readability checker to test for vocabulary and sentence complexity, it was given a GLE of 3.2. Every student I gave it to had a reading comprehension level of at least GLE 4 (and some as high as 11), yet only 6 students, or 38% of the group, answered it correctly. Of the 10 who answered incorrectly, 7 chose option A, a question that asks for a number of folders, a unit that doesn’t match the original question.
What is going on here? It would take more detailed and careful research to answer that question. To identify that option C was asking the same question as the original, students had to realize that “How much will St. Thomas spend” and “What is the cost” are asking for the same type of unit, and they also had to equate “per student” with “for one student.”
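To make the unit reasoning concrete, here is the folder arithmetic worked out in Python (variable names are mine). Both readings of the question, “spend per student” and “cost of 4 folders for one student,” land on the same dollar amount, which is what makes option C a correct paraphrase:

```python
# Worked arithmetic for the folder problem above, in whole cents
# to avoid floating-point rounding issues.
pack_size = 10
pack_price_cents = 1100          # $11 per pack
folders_per_student = 4
students = 20

# Reading 1: cost of 4 folders for one student (option C's phrasing).
cost_per_folder_cents = pack_price_cents // pack_size          # 110 cents = $1.10
cost_per_student_cents = cost_per_folder_cents * folders_per_student  # 440 cents = $4.40

# Reading 2: total spending divided across students ("spend per student").
total_folders = folders_per_student * students   # 80 folders (option A's answer)
packs_needed = total_folders // pack_size        # 8 packs, comes out even here
total_cost_cents = packs_needed * pack_price_cents  # 8800 cents = $88

# Both readings agree: $4.40 per student.
assert total_cost_cents // students == cost_per_student_cents
```

Note that option A’s answer (80 folders) is a genuine intermediate step in the solution, which may be part of why it was such an attractive distractor.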
I wanted to see if explicit instruction in identifying the unit in a question and defining the word “per” would help students with this type of task. The same group of students received one hour of instruction and practice in identifying the units in a question and in recognizing that “per” describes a unit rate. Four weeks later, they were given question 2 again. Fourteen of the original 16 students took the post-test; four of the students who had originally answered incorrectly now chose the right answer, while one who got it right the first time got it wrong. Put another way, 8 of the 14 post-testers (57%) answered correctly after instruction, a modest improvement.
While my informal classroom “research” needs a lot more work to tell us anything definitive about which skills students may be missing and how to intervene, it does suggest that the ability to read math problems is distinct from overall reading ability and that instructional interventions might help. I hope to encourage more interest in this question so we can find ways to help students overcome this barrier. If you have your own observations or interventions, or have encountered useful research in this area, please share in the comments below!
Melissa Braaten is an adult education instructor at St. Mary’s Center for Women and Children in Dorchester, MA. Melissa has taught ASE and pre-ASE math and reading, as well as ABE writing, computer skills, and health classes. Melissa also is a training and curriculum development specialist for the SABES PD Center for Mathematics and Adult Numeracy at TERC.