Dr Brett Becker, Lecturer in UCD Computer Science, is co-author of a paper that has been selected for the Best Computer Science Education Research Paper Award at the 2019 ACM SIGCSE Technical Symposium on Computer Science Education.
Solving programming problems often involves completing multiple distinct steps. When learning to program, novice programmers can quickly become entirely lost as a result of a single mistake in one step. This is because novices typically lack metacognitive awareness (‘knowing about knowing’) and do not know where they are in the problem-solving process. When they encounter an error, they do not know how to recover and consequently cannot make progress on the programming problem. To overcome this, students need to develop metacognitive skills that let them reflect on the problem-solving process. Previous research has shown that explicitly teaching the key steps of programming problem-solving, and having students reflect on where they are in that process, can help students complete future programming assignments. This can be done through personal tutoring, but with larger groups of students there are not enough instructors to provide it.
In this paper, we found that simple modifications to an automated assessment tool can improve metacognitive awareness in novice programmers. Specifically, students who used the modified tool showed a better understanding of problem prompts and were more likely to complete programming tasks successfully.
Our paper explores an automated assessment tool that ‘scaffolds’ the process of problem solving. With this tool, students were given a problem prompt and, immediately afterwards, a test case to solve, in order to see whether this would help them better understand the problem-solving process. As far as we know, this is the first investigation of the use of an automated tool to scaffold students’ metacognitive awareness. For the first time, we are approaching a scalable alternative to personal tutoring.
This study used a number of techniques to measure the impact of the new tool. We collected all code submissions from students in both control and experimental groups. We conducted think-aloud studies with participants, and participant observation allowed us to record the participants’ actions, apparent thought and problem-solving processes, and external reactions to error messages and other feedback. We used pre- and post-tests to measure the shift in participants’ growth mindsets.
Although our results are quite preliminary, we are very excited about the potential impact of this research. We have identified two specific questions for future work.
- How do randomly generated test cases impact studies like this?
- Is there a correlation between the number of times a student re-reads a problem prompt and the number and severity of metacognitive difficulties faced?
We are also keen to replicate this work at a larger scale, with both larger numbers of students and students drawn from a more diverse range of institutions.
Dr Becker co-authored this paper with James Prather, Ray Pettit, Paul Denny, Dastyni Loksa, Alani Peters, Zachary Albrecht, and Krista Masci, and is looking forward to future work with them in this area.
The paper was presented on Friday, March 1 at SIGCSE 2019, the 50th ACM SIGCSE Technical Symposium on Computer Science Education, in Minneapolis, Minnesota. There were a total of 526 paper submissions across three tracks (Computer Science Education Research, Experience Reports and Tools, and Curriculum Initiatives), with 169 accepted, for an overall acceptance rate of 32%. This year's conference attendance was 1,865.
The paper, presentation, and photos are available on the ACM Digital Library (link at www.brettbecker.com/publications).