There have been many attempts to understand how the organization of a student's solution to a problem relates to the correctness of that work. Understanding this relationship can enable early warnings and targeted feedback for students who are struggling in a course.
High enrollment in many undergraduate courses often makes manually grading every homework assignment prohibitively time consuming for instructors. However, when instructors do not grade homework assignments, students may lack sufficient incentive to complete their homework diligently. As a compromise, instructors use tactics such as grading only a subset of problems on each assignment, grading homework solely for completion, or selecting one question from each assignment for use as a quiz problem. These strategies reduce the workload for the instructor, but they severely limit the feedback that students receive.
Additionally, in-class assessments, such as quizzes or tests taken during a lecture, cannot be evaluated efficiently. Previously, others have used video cameras to record problem-solving activities, but analyzing such data is a difficult and time-consuming task that requires human judgment and can lead to erroneous results. Capturing the work as time-stamped pen strokes, by contrast, enables a much more precise and efficient analysis of the work.
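To illustrate why time-stamped pen strokes lend themselves to automated analysis, the following is a minimal sketch of one possible stroke representation and two simple measures computed from it. The `Stroke` structure, the `(x, y, t)` sample format, and the helper functions are illustrative assumptions, not the actual format used by any particular smartpen system.

```python
# A minimal, hypothetical sketch of handwritten work captured as
# time-stamped pen strokes. Real smartpen formats will differ.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Stroke:
    # Each sample is (x, y, t): pen position in page coordinates and a
    # timestamp in seconds from the start of the session.
    samples: List[Tuple[float, float, float]] = field(default_factory=list)

    @property
    def duration(self) -> float:
        """Seconds the pen was down during this stroke."""
        if not self.samples:
            return 0.0
        return self.samples[-1][2] - self.samples[0][2]

def total_writing_time(strokes: List[Stroke]) -> float:
    """Time the pen was actually on the page, summed over all strokes."""
    return sum(s.duration for s in strokes)

def idle_gaps(strokes: List[Stroke], threshold: float = 5.0) -> int:
    """Count pauses longer than `threshold` seconds between consecutive
    strokes -- a crude proxy for hesitation while solving a problem."""
    gaps = 0
    for prev, nxt in zip(strokes, strokes[1:]):
        if nxt.samples[0][2] - prev.samples[-1][2] > threshold:
            gaps += 1
    return gaps
```

Because every sample carries a timestamp, measures such as writing time and pause counts fall out of a single pass over the data, whereas extracting the same information from video requires frame-by-frame human review.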
Data mining techniques have been proposed as a solution for assessing student learning. Educational data mining applies machine learning, data mining, and similar techniques to education research questions. A recent overview of this work is provided by Cristóbal Romero and Sebastián Ventura in their paper: Educational Data Mining: A Review of the State of the Art, IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews, 40(6):601-618, 2010. Disadvantageously, this work relies on data collected in online environments, such as web applications and intelligent tutoring systems, that differ from standard classroom settings; this changes the learning environment and can skew the results.
Other researchers have used journaling to examine student work habits. For example, Marisa Orr, Lisa Benson, Matthew Ohland, and Sherrill Biggers in their paper: Student Study Habits and Their Effectiveness in an Integrated Statics and Dynamics Class, In Proceedings of the 2008 American Society for Engineering Education Annual Conference and Exposition, 2008, examined students' journal responses about their study habits, including when and how they completed their homework and whether they took advantage of assistance programs. While the results proved interesting, journals capture students' perceptions of their work habits rather than an objective characterization of them. The present work complements this research, as the system captures a detailed, time-stamped record of a student's work over the duration of the course.
Other researchers have explored various mechanisms for providing rapid feedback. For example, Antti Rasila, Linda Havola, Helle Majander, and Pekka Alestalo in their paper: Automatic assessment in engineering mathematics: evaluation of the impact, In Myller, E. (ed.), ReflekTori 2010 Symposium of Engineering Education, 37-45, Aalto University School of Science and Technology, 2010, explored the benefits of an online assessment tool for engineering mathematics. They found that automatic assessment was highly useful and improved the feedback provided to students. John Chen, Dexter Whittinghill, and Jennifer Kadlowec used electronic conceptual quizzes during lectures within a statics course to help guide the lecture content, as documented in their paper: Classes that click: Fast, rich feedback to enhance student learning and satisfaction, Journal of Engineering Education, 99(2):159-168, 2010. They found that the rapid feedback produced a significant increase in student performance.
The work of Sharon Oviatt, Alex Arthur, and Julia Cohen in their paper Quiet interfaces that help students think, In Proceedings of the 19th annual ACM symposium on user interface software and technology (UIST '06), 191-200, New York, N.Y., USA, 2006, suggests that natural work environments are critical to student performance. In their examination of computer interfaces for completing geometry problems, they found that “as the interfaces departed more from familiar work practice, students would experience greater cognitive load such that performance would deteriorate in speed, attentional focus, meta-cognitive control, correctness of problem solutions, and memory.”

There have been several studies examining student work habits and performance in statics. For example, Paul Steif and Anna Dollár's Study of Usage Patterns and Learning Gains in a Web-based Interactive Statics Course, Journal of Engineering Education, 94(4):321-333, 2009, examined usage patterns of a web-based statics tutoring system to determine the effects on learning. They found that learning gains increased with the number of tutorial elements completed. That study relied on an online learning environment, whereas the present system considers ordinary handwritten work. In another study, published in Improving Problem Solving Performance by Inducing Talk about Salient Problem Features, Journal of Engineering Education, 99(2):135-142, 2010, Paul Steif, Jamie Lobue, Anne Fay, and Levent Kara examined whether students can be induced to talk and think about the bodies in a statics problem, and whether doing so can increase a student's performance. They used tablet PCs to record the students' spoken explanations and to capture their handwritten solutions as time-stamped pen strokes. The study focused on the spoken explanations, leaving the record of written work mostly unanalyzed.
Researchers have also used video recordings to examine student problem solving. For example, Paul Blanc's Solving a Non-routine Problem: What Helps, What Hinders?, In Proceedings of the British Society for Research into Learning Mathematics, 19(2):1-6, 1999, examined video recordings of student work in mathematics and analyzed the path that students used to solve an example problem. Although Blanc recorded more than 75 problem solutions, only two were analyzed in his paper, which speaks to the labor-intensive nature of analyzing video records. Additionally, human error can be introduced with this technique.
A long-standing need of educators is a means to rapidly and inexpensively identify students who may be struggling in a course so that extra assistance can be provided. Therefore, there is a need for a simple and accurate system that uses smartpens as a tool for automatically assessing student learning.