Error detection tasks and peer feedback for engaging physics students
Rhys Adams, Phoebe Jackson, Kevin Lenton, Michael Dugdale, Chris Whittaker, Nathaniel Lasry, and Elizabeth S. Charles
Proceedings Volume 11143, Fifteenth Conference on Education and Training in Optics and Photonics: ETOP 2019; 111433J (2 July 2019). https://doi.org/10.1117/12.2523795
Event: Fifteenth Conference on Education and Training in Optics and Photonics: ETOP 2019, 2019, Quebec City, Quebec, Canada
Abstract
We show how Error-Detection Tasks (EDT) are an effective way for students to practice giving peer feedback. In an EDT, students are presented with a solved physics problem, prepared by a fictional in-class peer, that contains one or more errors: algebraic, procedural, or conceptual. Students must identify the error(s) and explain how to correct them, as though they were explaining it to the “peer” who wrote the solution. EDTs have been developed for a web-based learning platform (myDALITE.org) that provides students with asynchronous peer instruction, in which students both provide feedback and evaluate feedback from other students. Results show that students trained with EDT significantly outperform a control group in correctly identifying errors and in providing more meaningful feedback to their fictional in-class peer.

1. INTRODUCTION

1.1. Background

There is considerable research showing that receiving feedback is essential for learning [1-9]. Peer feedback is often a central part of pedagogies such as active learning, which is grounded in social constructivist theories and relies heavily on group work and collaboration [2-3]. Peer feedback can promote the development of higher-order cognitive and metacognitive skills such as critical thinking, problem solving, and decision making [6]. While much has been written about the effectiveness of receiving feedback, much remains to be learned about the effect of providing feedback. Providing feedback is arguably important as a mechanism through which students can develop self-regulatory skills, learning how to monitor and evaluate their growing knowledge [7]. In the process of providing feedback, students take on the responsibility of seeking out information and acting on it [8]. Providing peer feedback is therefore a complex task: it calls for students to learn how to assess the work of others and, in doing so, to gain the skills necessary to evaluate their own work [9]. Research suggests that this self-regulatory and agentic action of providing peer feedback can improve with practice [5] and training [10]. However, given the scarcity of research on this topic, there is still much to be learned about how and what students learn by providing peer feedback, as well as how educators can best help students learn from the process. Accordingly, in this study we looked at whether an online peer instruction tool, myDALITE, can be used with error-detection tasks (EDT) to improve the quality of peer feedback and to engage students with course content.

1.2. Peer Instruction to enable peer feedback

Peer Instruction (PI) [11] is an effective way for students to give and receive peer feedback on their conceptual understanding because of its use of a think-pair-share script that includes discussion and reflection on a conceptual question. In traditional in-class implementations of PI, these discussions and subsequent peer feedback are not recorded. Our team has developed myDALITE (mydalite.org) [12], a free web-based learning platform that provides students with asynchronous PI that can be used in- and out-of-class. Supporting active learning pedagogy, myDALITE engages students in 1) written explanation – students must provide rationales for their answers; 2) comparison of these rationales to those of peers; and 3) reflection on the quality of the rationales – students can answer the question again, choosing a different rationale or keeping their original rationale.

PI engages students in discussion at the conceptual level and focuses on students’ deep understanding by requiring them to engage in a form of reciprocal teaching [13] and to explain and reflect on their understanding. It is particularly designed for those stubborn conceptions that require the restructuring of beliefs and ideas that are inconsistent with the canonical (i.e., normative) explanations of phenomena – in other words, conceptual change (e.g., [14]). Such intentional reflection is believed to facilitate conceptual change [15]. In doing so, PI requires students to compare their understanding with the explanations and arguments put forward by other students. Additionally, it distributes to students the responsibility of arriving at criteria and standards for what counts as a good explanation or argument relative to normative ways of thinking within the discipline.

1.3. Error detection tasks as a meaningful pedagogical activity

In our EDT, students are presented with the solution to a physics problem written by Pat, a fictional in-class peer. Pat’s solution contains zero, one, or multiple errors, which can be algebraic, procedural, or conceptual, depending on the learning objectives. Students identify the error(s) and explain how to correct them, as though they were explaining it to Pat. This requires a more careful reading and understanding of the task and helps students avoid similar errors and misconceptions in the future.

An important aspect of the types of physics problem solving used in this study is the student’s recruitment of procedural knowledge. Students must go backward into their previous learning to find out what the error was, but must then also go forward to explain it and give feedback. However, initial student thinking is not necessarily normative; rather, it is often naive [16] – that is, closer to a novice than an expert understanding. It is difficult for anyone to change their conceptual knowledge from within a naive framework, so one goal of EDT is to scaffold students’ progress toward a more normative understanding. By practicing this critique, students embed procedural tasks in a normative framework, which in turn forces them to shift and think differently. This is because students’ mental models are more likely to change if they are constrained to this new framework [17-19].

The literature reports that EDT can be more effective than other problem-solving tasks under certain conditions. For example, Große and Renkl [19] found that students with medium-to-high prior knowledge learned best from a mix of correct and error-detection problems, while students with low prior knowledge benefitted more from studying correct worked examples alone than from a mix of correct worked problems and problems with errors, even when the specific errors were highlighted. Importantly, the inclusion of explicit feedback steps seems to play a role: Stark et al. [20] showed a benefit for all students from a mix of correct and error-detection problems when explicit feedback and compare-and-contrast steps were provided. The effectiveness of EDT has also been established relative to computer-based problems that provide immediate feedback on individual steps; in these studies, the EDT groups performed better on a delayed posttest than the control group [21]. While these studies provide important insights into the process and use of EDT, none of them explicitly combined EDT with the provision of peer feedback, as we have done in the current study, in which students engaged in EDT and provided feedback to their fictional in-class peer, Pat.

1.4. Scripting the EDT and peer feedback activity in myDALITE

Students log into myDALITE and are directed to a prepared assignment that includes one or more EDT as well as other questions. Each EDT requires the student to engage in a 6-step process (a schematic code sketch follows the list):

1) Students are presented with a solution to a physics problem written by their fictional in-class peer, Pat. Using a multiple-choice format, students must choose where the error (or the most important error, if there are several) is located in the solution.

2) Students must provide feedback to Pat, explaining the error and how to correct it.

3) Students are asked to reconsider their original answer in the context of other rationales for their own answer, and a similar selection of rationales for an alternative answer. This purposeful comparison is designed to provide the variety that is sometimes missing in face-to-face PI.

4) Students re-select their answer, choosing either to stay with their original selection or to change it, based on the feedback they have read. In this way, students give feedback but also evaluate the feedback that other students have written.

5) Students are shown their initial answer and feedback, followed by their second answer and feedback choice, to display the change (or lack thereof) in their thinking and feedback.

6) The correct answer to where the error (or most important error) is located is provided to the student.
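
To make this script concrete, the following is a minimal sketch of how such a six-step flow could be modelled in code. It is our illustrative reconstruction, not myDALITE’s actual implementation (which this paper does not describe); all class, function, and variable names are hypothetical.

    import random
    from dataclasses import dataclass

    @dataclass
    class EDTItem:
        # One error-detection task: Pat's worked solution plus the answer key.
        solution_lines: list   # Pat's solution, split into labelled lines
        correct_choice: int    # index of the line containing the key error

    def run_edt_script(item, peer_rationales):
        # Walk one student through the six-step EDT/peer-feedback script.
        record = {}
        # Step 1: present Pat's solution; the student picks where the error is.
        for i, line in enumerate(item.solution_lines):
            print(f"[{i}] {line}")
        record["first_choice"] = int(input("Line of the most important error? "))
        # Step 2: the student writes feedback addressed directly to Pat.
        record["first_feedback"] = input("Explain the error to Pat and how to fix it: ")
        # Step 3: show stored rationales for the student's own choice and for
        # one alternative, to provoke purposeful comparison.
        alternatives = [c for c in peer_rationales if c != record["first_choice"]]
        for choice in [record["first_choice"], random.choice(alternatives)]:
            print(f"Rationales from peers who chose line {choice}:")
            for rationale in peer_rationales.get(choice, []):
                print("  -", rationale)
        # Step 4: the student re-selects (keep or change), thereby also
        # evaluating the peer feedback just read.
        record["second_choice"] = int(input("Final answer: line of the error? "))
        # Step 5: replay both attempts so any change in thinking is visible.
        print("Your feedback to Pat:", record["first_feedback"])
        print("First:", record["first_choice"], "Final:", record["second_choice"])
        # Step 6: reveal the correct error location.
        print("Correct location: line", item.correct_choice)
        return record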

We have selected a simple harmonic motion (SHM) EDT and peer feedback activity to illustrate these steps in Figures 1-5.

Figure 1: myDALITE screen capture for step 1.

Figure 2: The solution to a physics problem written by a fictional in-class peer, Pat.
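
The screen capture itself is not reproduced here. As a purely illustrative stand-in (our own hypothetical item, not the actual content of Figure 2), a Pat-style SHM solution containing a single procedural error might read:

    % Hypothetical Pat-style worked solution (illustrative only).
    % Problem: a block on a spring oscillates as x(t) = A cos(wt); find v(t).
    \begin{align*}
      x(t) &= A\cos(\omega t) \\
      v(t) &= \frac{dx}{dt} = A\omega\cos(\omega t)
    \end{align*}
    % Embedded error: the derivative of cosine is minus sine, so the
    % correct result is v(t) = -A\omega\sin(\omega t).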

Figure 3: myDALITE screen capture for step 2.

Figure 4: myDALITE screen capture for steps 3 and 4.

Figure 5: myDALITE screen capture for steps 5 and 6.

2. METHODOLOGY

Using a quasi-experimental design [22], this study was implemented in two sections of a calculus-based mechanics course, taught by different teachers. This course is equivalent to a typical undergraduate freshman university physics course. Both sections had similar pre-instruction scores on the Force Concept Inventory (FCI) [23]. One section, the treatment group (N=32), completed four separate EDT activities in myDALITE over the course of a month. The activities corresponded to the regular content sequence in the course. This group also received a short training in what makes for good peer feedback. The control section (N=30) had no EDT, no other questions in myDALITE, and no practice or training in providing peer feedback, but followed a similar topic sequence and timing.

Following the feedback training, both the treatment and control groups were given an EDT quiz (on paper; see Figure 6 for the student instructions), in which they were asked to detect the error in a novel solution on the current topic and to provide feedback to a fictional in-class peer. Feedback quality was analyzed, using a teacher-built rubric, on whether students had identified where the error was, what the error was, and how to correct it. In addition to this correctness, the affective quality of the feedback was evaluated based on how personalized it was.

Figure 6: Instructions to students for the paper-version error detection quiz.

After the EDT quiz, the treatment group continued with EDT during the semester, completing 10 in total. Finally, the treatment group was given a short survey during the last week of class about their experience with EDT and providing feedback to a fictional in-class peer.

3. RESULTS AND DISCUSSION

We begin with an analysis of student feedback from the EDT quiz. The feedback-quality rubric examined whether students had correctly identified where the error was, what the nature of the error was, and how to correct it, assigning 1 if present and correct, and 0 otherwise. In addition, the affective quality of the feedback was scored: it was scored positive if the student made any direct reference to the fictional in-class peer Pat, for instance, “Hi Pat” or “You did this wrong, Pat.” These data are shown in Table 1. Not surprisingly, students who had more practice with EDT and giving feedback scored much better than those who had not. The data also show that students who identified where the error was largely also identified what the error was and how to fix it. The treatment group answered correctly at roughly double the rate of the control group. Both groups scored higher on the affective dimension than on the correctness dimensions, indicating sensitivity to the target audience; this sensitivity appears to have been strengthened by the myDALITE activities, as the treatment group’s average affective score was double that of the control group’s. For this quiz, then, the treatment students not only answered more correctly than the control group, but also wrote differently, addressing Pat directly.
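
As a minimal sketch, the tallying of such rubric codes into the group averages reported in Table 1 could look as follows. The four dimensions come from the rubric described above; the data layout and function name are our hypothetical choices, and the example codes are made up, not the study’s data.

    from statistics import mean

    # Each student's quiz feedback is coded per rubric dimension:
    # 1 = present and correct, 0 = absent or incorrect.
    def average_percent_scores(group):
        # Average each rubric dimension over a group, as a percentage.
        return {dim: 100 * mean(student[dim] for student in group)
                for dim in ("where", "what", "how", "affective")}

    # Usage with two made-up students:
    control = [{"where": 1, "what": 0, "how": 0, "affective": 1},
               {"where": 0, "what": 0, "how": 0, "affective": 0}]
    print(average_percent_scores(control))
    # -> {'where': 50.0, 'what': 0.0, 'how': 0.0, 'affective': 50.0}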

Table 1: Results of the treatment and control groups on the EDT and peer feedback quiz.

Avg. % score      Control   Treatment
Where                30         66
What                 27         63
How                  27         63
Affective            40         81

Table 2 reports the results of a short survey of students in the treatment group. Overall, the students were positive about the approach and strongly felt that the process of explaining to someone else [the fictional in-class peer, Pat] did help them to learn physics. It is a cliché that the best way to learn is to explain to someone else; however, the ‘someone else’ is not always there. This online PI allows students to have a feedback interaction (writing, and then seeing others’ feedback) outside of class time, giving teachers a different way to implement the benefits of feedback and peer instruction.

Table 2: Survey results about students’ engagement with EDT and peer feedback, for the treatment group only.

Question                                                              Yes    No
Did giving “feedback” and explaining to Pat help you learn physics?   82%    18%
Do you feel you made a connection with Pat?                           54%    46%
Have you learned to provide better feedback as a result of EDT questions in
myDALITE and in your classroom activities?
  (5-point Likert scale, “No” to “A lot”): 3.5% : 7% : 36% : 36% : 17.5%

Students were also encouraged to provide comments in the survey. Regarding the first question presented in Table 2, most students who said “Yes” provided comments along the lines of, “Yes, it did. It not only helped me with the structure of good solution, but it helped me be more aware of errors I could possibly make.” Interestingly, some of the students who said “No” felt that the EDT were not challenging enough, and hence not helpful for their learning: “It wasn’t a big challenge, so it didn’t make me use all I was taught. Each problem had one or two mistake that were easy to detect.”

For the second question presented in Table 2, the students were more split with regard to making a connection to their fictional in-class peer, Pat. Their reasons for saying “Yes” or “No” varied; for example:

“Yes, Pat feels like a friend who you study with, trying to help him out. By helping Pat out, it improves my understanding of the topic, just like studying with someone.”

“Yes, I have made a connection with Pat, whenever I hear his name, I get stressed out because I do not do well on his quizzes generally.”

“No, and I don’t think that providing feedback is very relevant as a type of quiz. I think that correcting his errors and understand them would be enough.”

“Nooooo! … this idiot making so many mistakes. But I think it’s the best way of giving feedback if he wants to learn from his mistakes.”

Even a negative connection is still a connection, and it remains to be seen whether having any connection, negative or positive, affects student engagement and, therefore, learning.

4. CONCLUSIONS

By using scripted EDT and peer feedback activities in myDALITE, students practiced giving feedback to a fictional in-class peer and engaged with physics content. These students significantly outperformed a control group (which did no prior EDT and did not use myDALITE) in correctly identifying errors and in providing more meaningful feedback to a fictional in-class peer. Students were positive about the approach and strongly felt that the process of providing feedback helped them to learn physics. Even given the limited scope of the intervention, this study shows that error detection that includes the provision of peer feedback holds promise as a pedagogical strategy. As we continue to use myDALITE to develop and implement EDT, we will gain deeper insight into the learning gains offered by regular engagement in EDT and in providing peer feedback. Our results suggest that future research into the processes and effects of providing peer feedback may help us, as researchers and practitioners, better understand how students can more actively engage in their own learning.

ACKNOWLEDGMENTS

The authors thank the Programme d’aide à la recherche sur l’enseignement et l’apprentissage (PAREA; grant PA2017-013) of the Ministry of Education (MELS & MERST) in the Province of Quebec, and our research assistant Chao Zhang.

REFERENCES

[1] Hattie, J. A., and Yates, G. C., “Using feedback to promote learning,” in Applying Science of Learning in Education: Infusing Psychological Science into the Curriculum, 45–58 (2014).

[2] Falchikov, N., and Goldfinch, J., “Student peer assessment in higher education: A meta-analysis comparing peer and teacher marks,” Review of Educational Research, 70, 287–322 (2000). https://doi.org/10.3102/00346543070003287

[3] Strijbos, J., and Sluijsmans, D., “Unravelling peer assessment: Methodological, functional, and conceptual developments,” Learning and Instruction, 20, 265–269 (2010). https://doi.org/10.1016/j.learninstruc.2009.08.002

[4] Poverjuc, O., Brooks, V., and Wray, D., “Using peer feedback in a Master’s programme: a multiple case study,” Teaching in Higher Education, 17, 465–477 (2012). https://doi.org/10.1080/13562517.2011.641008

[5] Brown, G. T., Peterson, E. R., and Yao, E. S., “Student conceptions of feedback: Impact on self-regulation, self-efficacy, and academic achievement,” British Journal of Educational Psychology, 86, 606–629 (2016). https://doi.org/10.1111/bjep.2016.86.issue-4

[6] King, A., “Structuring peer interaction to promote high-level cognitive processing,” Theory Into Practice, 41, 33–39 (2002). https://doi.org/10.1207/s15430421tip4101_6

[7] Ferguson, P., “Student perceptions of quality feedback in teacher education,” Assessment & Evaluation in Higher Education, 36, 51–62 (2011). https://doi.org/10.1080/02602930903197883

[8] Evans, C., “Making sense of assessment feedback in higher education,” Review of Educational Research, 83, 70–120 (2013). https://doi.org/10.3102/0034654312474350

[9] Nicol, D., Thomson, A., and Breslin, C., “Rethinking feedback practices in higher education: a peer review perspective,” Assessment and Evaluation in Higher Education, 39, 102–122 (2014). https://doi.org/10.1080/02602938.2013.795518

[10] Alqassab, M., Strijbos, J. W., and Ufer, S., “Training peer-feedback skills on geometric construction tasks: role of domain knowledge and peer-feedback levels,” European Journal of Psychology of Education, 33, 11–30 (2018). https://doi.org/10.1007/s10212-017-0342-0

[11] Mazur, E., Peer Instruction: A User’s Manual, Prentice Hall, Upper Saddle River, NJ (1997).

[12] Charles, E. S., Lasry, N., Bhatnagar, S., Adams, R., Lenton, K., Brouillette, Y., Dugdale, M., Whittaker, C., and Jackson, P., “Harnessing peer instruction in- and out-of-class with myDALITE,” in Proc. SPIE, Education and Training in Optics and Photonics (ETOP) Conference (2019). https://doi.org/10.1117/12.2523778

[13] Palincsar, A., and Brown, A., “Reciprocal teaching of comprehension-fostering and metacognitive strategies,” Cognition and Instruction, 1, 117–175 (1984). https://doi.org/10.1207/s1532690xci0102_1

[14] diSessa, A. A., “A history of conceptual change research,” in The Cambridge Handbook of the Learning Sciences, 265–281 (2006).

[15] Sinatra, G. M., and Pintrich, P. R., Eds., Intentional Conceptual Change, L. Erlbaum, Mahwah, NJ (2003). https://doi.org/10.4324/9781410606716

[16] Vosniadou, S., “Conceptual change in learning and instruction: The framework theory approach,” in International Handbook of Research on Conceptual Change, 23–42, Routledge (2013).

[17] Adams, D. A., McLaren, B. M., Durkin, K., Mayer, R. E., Rittle-Johnson, B., Isotani, S., and Velsen, M., “Using erroneous examples to improve mathematics learning with a web-based tutoring system,” Computers in Human Behavior, 36, 401–411 (2014). https://doi.org/10.1016/j.chb.2014.03.053

[18] Durkin, K., and Rittle-Johnson, B., “The effectiveness of using incorrect examples to support learning about decimal magnitude,” Learning and Instruction, 22, 206–214 (2012). https://doi.org/10.1016/j.learninstruc.2011.11.001

[19] Große, C. S., and Renkl, A., “Finding and fixing errors in worked examples: Can this foster learning outcomes?,” Learning and Instruction, 17, 612–634 (2007). https://doi.org/10.1016/j.learninstruc.2007.09.008

[20] Stark, R., Kopp, V., and Fischer, M. R., “Case-based learning with worked examples in complex domains: Two experimental studies in undergraduate medical education,” Learning and Instruction, 21, 22–33 (2011). https://doi.org/10.1016/j.learninstruc.2009.10.001

[21] McLaren, B. M., van Gog, T., Ganoe, G., Karabinos, M., and Yaron, D., “The efficiency of worked examples compared to erroneous examples, tutored problem solving, and problem solving in computer-based learning environments,” Computers in Human Behavior, 55, 87–99 (2016). https://doi.org/10.1016/j.chb.2015.08.038

[22] Campbell, D. T., and Stanley, J. C., Experimental and Quasi-Experimental Designs for Research, Ravenio Books (2015).

[23] Hestenes, D., Wells, M., and Swackhamer, G., “Force Concept Inventory,” The Physics Teacher, 30, 141–158 (1992). https://doi.org/10.1119/1.2343497