1. INTRODUCTION

1.1. Background

There is considerable research showing that receiving feedback is essential for learning [1-9]. Peer feedback is often a central part of pedagogies such as active learning, which is grounded in social constructivist theories and relies heavily on group work and collaboration [2-3]. Peer feedback can promote the development of higher-order cognitive and metacognitive skills such as critical thinking, problem solving and decision making [6]. While much has been written about the effectiveness of receiving feedback, much remains to be learned about the effect of providing feedback. Providing feedback is arguably important as a mechanism through which students can develop self-regulatory skills, learning how to monitor and evaluate their growing knowledge [7]. In the process of providing feedback, students take on the responsibility of seeking out information and acting on it [8]. Providing peer feedback is therefore a complex task: it calls for students to learn how to assess the work of others and, in doing so, to gain the skills necessary to evaluate their own work [9]. Research suggests that this self-regulatory and agentic action, providing peer feedback, can improve with practice [5] and training [10]. However, given the scarcity of research on this topic, there is still much to be learned about how and what students learn by providing peer feedback, and about how educators can best help students learn from the process. Accordingly, in this study we examined whether an online peer instruction tool called myDALITE can be used with error-detection tasks (EDT) to improve the quality of peer feedback and to engage students with course content.

1.2. Peer Instruction to enable peer feedback

Peer Instruction (PI) [11] is an effective way for students to give and receive peer feedback on their conceptual understanding because of its use of a think-pair-share script that includes discussion and reflection on a conceptual question.
In traditional in-class implementations of PI, these discussions and the subsequent peer feedback are not recorded. Our team has developed myDALITE (mydalite.org) [12], a free web-based learning platform that provides students with asynchronous PI and can be used in- and out-of-class. Supporting active learning pedagogy, myDALITE engages students in 1) written explanation – students must provide rationales for their answers; 2) comparison of these rationales to those of peers; and 3) reflection on the quality of the rationales – students can answer the question again, choosing a different rationale or keeping their original one. PI engages students in discussion at the conceptual level and focuses on students' deep understanding by requiring them to engage in a form of reciprocal teaching [13] and to explain and reflect on their understanding. It is particularly designed for those stubborn conceptual understandings that require restructuring of certain beliefs and ideas because they are not consistent with the canonical (i.e., normative) explanations of phenomena – i.e., conceptual change (e.g., [14]). Such intentional reflection is believed to facilitate conceptual change [15]. In doing so, PI requires students to compare their understanding to the explanations and arguments put forward by other students. Additionally, it distributes to students the responsibility of arriving at criteria and standards for what makes a good explanation or argument, compared to normative ways of thinking within the discipline.

1.3. Error detection tasks as a meaningful pedagogical activity

In our EDT, students are presented with the solution to a physics problem written by Pat, a fictional in-class peer. Pat's solution contains zero, one, or multiple errors, which can be algebraic, procedural, or conceptual, depending on the learning objectives. Students identify and explain how to correct the error(s), as though they were explaining it to Pat.
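To make the flavor of such an item concrete, the following sketch shows one possible algebraic error in a simple harmonic motion solution. The numbers, function names, and the specific error assigned to Pat are our illustrative assumptions, not material from the study's actual tasks.

```python
import math

def period_correct(m, k):
    """Correct period of a mass-spring oscillator: T = 2*pi*sqrt(m/k)."""
    return 2 * math.pi * math.sqrt(m / k)

def period_pat(m, k):
    """A hypothetical 'Pat' solution containing a single algebraic error:
    the ratio under the square root is inverted (k/m instead of m/k)."""
    return 2 * math.pi * math.sqrt(k / m)

m, k = 0.50, 8.0  # mass in kg, spring constant in N/m
print(round(period_correct(m, k), 2))  # ~1.57 s
print(round(period_pat(m, k), 2))      # too large by a factor of k/m
```

A student completing the task would be expected to locate the inverted ratio, name it as an algebraic error, and explain the correction to Pat.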
This requires a more careful reading and understanding of the task and helps students avoid making similar errors and misconceptions in the future. In the types of physics problem solving used in this study, an important aspect is the recruitment of procedural knowledge by the student. Students must go backward into their previous learning to find out what the error was, but then must also go forward to explain it and give feedback. However, initial student thinking is not necessarily normative; rather, it is often naive [16], that is to say a more novice than expert understanding. It is difficult for anyone to change their conceptual knowledge from within a naive framework, so one goal of EDT is to scaffold students' progress toward a more normative understanding. By practicing this critique of procedural tasks, the tasks become embedded in a normative framework which, therefore, pushes students to shift and think differently. This is because students' mental models are more likely to change if they are constrained to this new framework [17-19]. The literature reports that EDT can be more effective than other problem-solving tasks under certain conditions. For example, Große and Renkl [19] found that students with medium-high prior knowledge learned best from a mix of correct and error-detection problems, while students with low prior knowledge benefitted more from studying correct worked examples alone than from a mix of correct worked problems and problems with errors, even when the specific errors were highlighted. Importantly, the inclusion of explicit feedback steps seems to play a role. Stark et al. [20] showed a benefit for all students from a mix of correct and error-detection problems when giving explicit feedback and comparing/contrasting. The effectiveness of EDT has also been established in comparison with computer-based problems that provide immediate feedback on individual steps.
In these studies, the EDT groups performed better on a delayed posttest than the control group [21]. While these studies provide important insights into the process and use of EDT, none of them explicitly combined EDT with the provision of peer feedback, as we have done in the current study, in which students engaged in EDT and provided feedback to their fictional in-class peer, Pat.

1.4. Scripting the EDT and peer feedback activity in myDALITE

Students log into myDALITE and are directed to the prepared assignment, which includes one or more EDT as well as other questions. Each EDT requires the student to engage in a 6-step process:
We have selected a simple harmonic motion (SHM) EDT and peer feedback activity to illustrate these steps in Figures 1-5.

2. METHODOLOGY

Using a quasi-experimental design [22], this study was implemented in two calculus-based mechanics physics sections, which had different teachers. This course is equivalent to a typical undergraduate freshman university physics course. Both sections had similar pre-instruction scores on the Force Concept Inventory (FCI) [23]. One section, the treatment group (N=32), completed four separate EDT activities in myDALITE over the course of a month. The activities corresponded to the regular content sequence in the course. The treatment group also received short training in what makes for good peer feedback. The control section (N=30) did not have any EDT, any other questions in myDALITE, or any practice or training in providing peer feedback, but followed a similar topic sequence and timing. Following the feedback training, both the treatment and control groups were given an EDT quiz (on paper; see Figure 6 for student instructions), in which they were asked to detect the error in a novel solution on the current topic and to provide feedback to a fictional in-class peer. Feedback quality was analyzed, using a teacher rubric, on whether students had identified where the error was, what the error was, and how to explain the error correctly. In addition to correctness, the affective quality of the feedback was evaluated based on how personalized it was. After the EDT quiz, the treatment group continued with EDT during the semester, completing 10 in total. Finally, the treatment group was given a short survey during the last week of class about their experience with EDT and with providing feedback to a fictional in-class peer.

3. RESULTS AND DISCUSSION

We begin with an analysis of student feedback from the EDT quiz.
The feedback-quality rubric examined whether students had correctly identified where the error was, what the nature of the error was, and how to correct it, assigning 1 if the component was present and correct, and 0 otherwise. In addition, the affective quality of the feedback was scored. Affective quality was scored positive if the student made any direct reference to the fictional in-class peer Pat, for instance, "Hi Pat" or "You did this wrong, Pat." These data are shown in Table 1. Not surprisingly, students who had had more practice with EDT and giving feedback scored much better than those who had not. The data also show that students who identified where the error was largely also identified what the error was and how to fix it. The treatment group answered correctly at roughly double the rate of the control group. Both groups showed higher affective scores than correctness scores, indicating sensitivity to the target audience; this sensitivity appears to have been strengthened by the myDALITE activity, as the treatment group's average affective score was double that of the control. This shows that, on this quiz, the treatment students not only answered more correctly than the control students but also wrote differently, addressing Pat directly.

Table 1: Results of the treatment and control groups on the EDT and peer feedback quiz.
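The binary scoring just described can be sketched as follows. The function signature and dictionary keys are our own assumptions for illustration; the actual teacher rubric is not reproduced in the paper.

```python
def score_feedback(found_where, found_what, explained_fix, addresses_pat):
    """Score one student's feedback: each correctness component gets 1 if
    present and correct, else 0; the affective score is 1 if the feedback
    addresses Pat directly (e.g., "Hi Pat")."""
    return {
        "where": int(found_where),       # located the error
        "what": int(found_what),         # named the nature of the error
        "fix": int(explained_fix),       # explained how to correct it
        "affective": int(addresses_pat), # personalized to Pat
    }

# A response that locates and names the error but offers no correction,
# while addressing Pat directly:
print(score_feedback(True, True, False, True))
# {'where': 1, 'what': 1, 'fix': 0, 'affective': 1}
```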
Table 2 reports the results of a short survey of students in the treatment group. Overall, the students were positive about the approach and strongly felt that the process of explaining to someone else [the fictional in-class peer, Pat] did help them to learn physics. It is a cliché that the best way to learn is to explain to someone else. However, the 'someone else' is not always there. This online PI allows students to have a feedback interaction [writing, and then seeing others' feedback] outside of class time, allowing teachers to implement the benefits of feedback and peer instruction in a different way.

Table 2: Survey results about students' engagement with EDT and peer feedback, for the treatment group only.
Students were also encouraged to provide comments in the survey. Regarding the first question in Table 2, most students who said "Yes" provided comments along the lines of, "Yes, it did. It not only helped me with the structure of good solution, but it helped me be more aware of errors I could possibly make." Interestingly, some of the students who said "No" felt that the EDT were not challenging enough, and hence not helpful for their learning: "It wasn't a big challenge, so it didn't make me use all I was taught. Each problem had one or two mistake that were easy to detect." For the second question in Table 2, the students were more split with regard to making a connection to their fictional in-class peer, Pat. Their reasons for saying "Yes" or "No" vary, for example: "Yes, Pat feels like a friend who you study with, trying to help him out. By helping Pat out, it improves my understanding of the topic, just like studying with someone." "Yes, I have made a connection with Pat, whenever I hear his name, I get stressed out because I do not do well on his quizzes generally." "No, and I don't think that providing feedback is very relevant as a type of quiz. I think that correcting his errors and understand them would be enough." "Nooooo! … this idiot making so many mistakes. But I think it's the best way of giving feedback if he wants to learn from his mistakes." Even a negative connection is still a connection, and it remains to be seen whether having any connection, negative or positive, could affect student engagement and therefore impact learning.

4. CONCLUSIONS

By using scripted EDT and peer feedback activities in myDALITE, students practiced giving feedback to a fictional in-class peer and engaged with physics content. These students significantly outperformed a control group (which did no prior EDT and did not use myDALITE) in correctly identifying errors and in providing more meaningful feedback to a fictional in-class peer.
Students were positive about the approach and strongly felt that the process of providing feedback helped them to learn physics. Given the limited scope of the intervention, this study shows that error detection that includes the provision of peer feedback holds promise as a pedagogical strategy. As we continue to use myDALITE to develop and implement EDT, we will gain deeper insight into the learning gains offered by regular engagement in EDT and in providing peer feedback. Our results suggest that future research into the processes and effects of providing peer feedback may help us, as researchers and practitioners, better understand how students can more actively engage in their own learning.

ACKNOWLEDGMENTS

The authors thank the Programme d'aide à la recherche sur l'enseignement et l'apprentissage (PAREA; grant PA2017-013), the Ministry of Education (MELS & MERST) program in the Province of Quebec, and our research assistant Chao Zhang.

REFERENCES

[1] Hattie, J. A., and Yates, G. C.,
"Using feedback to promote learning," Applying Science of Learning in Education: Infusing Psychological Science into the Curriculum, 45–58 (2014).
[2] Falchikov, N., and Goldfinch, J., "Student peer assessment in higher education: A meta-analysis comparing peer and teacher marks," Review of Educational Research, 70, 287–322 (2000). https://doi.org/10.3102/00346543070003287
[3] Strijbos, J., and Sluijsmans, D., "Unravelling peer assessment: Methodological, functional, and conceptual developments," Learning and Instruction, 20, 265–269 (2010). https://doi.org/10.1016/j.learninstruc.2009.08.002
[4] Poverjuc, O., Brooks, V., and Wray, D., "Using peer feedback in a Master's programme: a multiple case study," Teaching in Higher Education, 17, 465–477 (2012). https://doi.org/10.1080/13562517.2011.641008
[5] Brown, G. T., Peterson, E. R., and Yao, E. S., "Student conceptions of feedback: Impact on self-regulation, self-efficacy, and academic achievement," British Journal of Educational Psychology, 86, 606–629 (2016). https://doi.org/10.1111/bjep.2016.86.issue-4
[6] King, A., "Structuring peer interaction to promote high-level cognitive processing," Theory Into Practice, 41, 33–39 (2002). https://doi.org/10.1207/s15430421tip4101_6
[7] Ferguson, P., "Student perceptions of quality feedback in teacher education," Assessment & Evaluation in Higher Education, 36, 51–62 (2011). https://doi.org/10.1080/02602930903197883
[8] Evans, C., "Making sense of assessment feedback in higher education," Review of Educational Research, 83, 70–120 (2013). https://doi.org/10.3102/0034654312474350
[9] Nicol, D., Thomson, A., and Breslin, C., "Rethinking feedback practices in higher education: a peer review perspective," Assessment and Evaluation in Higher Education, 39, 102–122 (2014). https://doi.org/10.1080/02602938.2013.795518
[10] Alqassab, M., Strijbos, J. W., and Ufer, S., "Training peer-feedback skills on geometric construction tasks: role of domain knowledge and peer-feedback levels," European Journal of Psychology of Education, 33, 11–30 (2018). https://doi.org/10.1007/s10212-017-0342-0
[11] Mazur, E., Peer Instruction, Prentice Hall, Upper Saddle River, NJ (1997).
[12] Charles, E. S., Lasry, N., Bhatnagar, S., Adams, R., Lenton, K., Brouillette, Y., Dugdale, M., Whittaker, C., and Jackson, P., "Harnessing peer instruction in- and out-of-class with myDALITE," Proc. SPIE, Education and Training in Optics and Photonics (ETOP) Conference (2019). https://doi.org/10.1117/12.2523778
[13] Palincsar, A., and Brown, A., "Reciprocal teaching of comprehension-fostering and metacognitive strategies," Cognition and Instruction, 1, 117–175 (1984). https://doi.org/10.1207/s1532690xci0102_1
[14] DiSessa, A. A., "A history of conceptual change research," The Cambridge Handbook of the Learning Sciences, 265–281 (2006).
[15] Sinatra, G. M., and Pintrich, P. R., Eds., Intentional Conceptual Change, L. Erlbaum (2003). https://doi.org/10.4324/9781410606716
[16] Vosniadou, S., "Conceptual Change in Learning and Instruction: The Framework Theory Approach," International Handbook of Research on Conceptual Change, 23–42, Routledge (2013).
[17] Adams, D. A., McLaren, B. M., Durkin, K., Mayer, R. E., Rittle-Johnson, B., Isotani, S., and Velsen, M., "Using erroneous examples to improve mathematics learning with a web-based tutoring system," Computers in Human Behavior, 66, 401–411 (2014). https://doi.org/10.1016/j.chb.2014.03.053
[18] Durkin, K., and Rittle-Johnson, B., "The effectiveness of using incorrect examples to support learning about decimal magnitude," Learning and Instruction, 22, 206–214 (2012). https://doi.org/10.1016/j.learninstruc.2011.11.001
[19] Große, C. S., and Renkl, A., "Finding and fixing errors in worked examples: Can this foster learning outcomes?," Learning and Instruction, 17, 612–634 (2007). https://doi.org/10.1016/j.learninstruc.2007.09.008
[20] Stark, R., Kopp, V., and Fischer, M. R., "Case-based learning with worked examples in complex domains: Two experimental studies in undergraduate medical education," Learning and Instruction, 21, 22–33 (2011). https://doi.org/10.1016/j.learninstruc.2009.10.001
[21] McLaren, B. M., van Gog, T., Ganoe, G., Karabinos, M., and Yaron, D., "The efficiency of worked examples compared to erroneous examples, tutored problem solving, and problem solving in computer-based learning environments," Computers in Human Behavior, 55, 87–99 (2016). https://doi.org/10.1016/j.chb.2015.08.038
[22] Campbell, D. T., and Stanley, J. C., Experimental and Quasi-Experimental Designs for Research, Ravenio Books (2015).
[23] Hestenes, D., Wells, M., and Swackhamer, G., "Force Concept Inventory," The Physics Teacher, 30, 141–158 (1992). https://doi.org/10.1119/1.2343497