Project Overview
As part of the 2019 hackathon at UVA (HooHacks), our team developed a web application within 48 hours. The application records students' handwriting and provides real-time feedback as they learn to write letters by transcribing them.
Based on my experience as a preschool teaching assistant, I can attest to the lack of resources in elementary education, especially for children with special needs. Without personal attention and feedback, students fall further and further behind. While this is a broad issue, one specific area where students struggle is transcribing worksheets to learn how to write the English alphabet.
The Team:
James Yun: Lead programmer
Henry Carscadden: Machine learning & classification
Alex Wassel: HTML/CSS
Jeevna Prakash (me): Concept design & developer
Goals and Outlook:
- Develop a prototype for an application that will help increase access to learning resources and feedback
- Collaborate with teammates to learn more about web development and design
As a former teaching assistant, I helped develop the concept and design of the application. I also helped build the interface with HTML and CSS, along with some JavaScript components.
The Process

Research Overview
How do students currently learn how to write?
We started by conducting a rapid round of research into the current methods of teaching children how to learn to read and write.  
Some of the learning methods we found and chose to build on were the D'Nealian and Palmer handwriting methods, which use arrows and dotted lines as writing guides. The central problem with this pencil-and-paper approach is that there is no way to assess the accuracy of the pencil strokes without a teacher present, and the guides themselves can be unclear.
How can we leverage machine learning?
The most direct and accessible way to teach a skill is to have students observe it and then try it themselves. Using an existing handwriting dataset, we put together an interface for English learners to practice their letter-writing skills.
By comparing our students' handwriting to that of other English writers through optical character recognition (OCR), we aimed to actively score and guide students through the learning process.
Ideation & Design

We translated the D'Nealian and Palmer methods into an analogous, responsive system. The arrows and letter shapes were reproduced in the system as an animated GIF of each letter being written. The animation of the standard strokes was intended to show, in real time, how the letters are usually written.
Feedback comes in the form of a score shown at the bottom of the screen. Rather than waiting for a teacher's evaluation, a student can quickly see their accuracy and progress as a rating out of 3 stars. The evaluation runs the student's rendition of a letter through a binary classifier our team created. To accommodate multiple handwriting styles, the model was trained on handwritten texts in a variety of styles.
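The star rating can be derived from the classifier's confidence score. The sketch below shows one way to do that mapping; the threshold values are illustrative assumptions, not the team's actual cutoffs.

```python
# Hypothetical mapping from a binary-classifier confidence (0..1) to the
# 3-star feedback described above. Thresholds are illustrative only.
def stars(confidence: float) -> int:
    """Return a 0-3 star rating from a classifier confidence score."""
    if confidence >= 0.9:
        return 3
    if confidence >= 0.7:
        return 2
    if confidence >= 0.5:
        return 1
    return 0

print(stars(0.95))  # a high-confidence match earns the full 3 stars
```

A step function like this keeps the feedback legible for young students: three discrete stars are easier to read at a glance than a raw percentage.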

Design Summary

The application was designed with a friendly, colorful user interface. Given the short timeline and development constraints, the design possibilities were limited. However, the need for a simple, kid-friendly interface suited our time constraints well.

The Alphabet Game: Our programmer's rendition of the letter 'B' with a score and the example lettering on the left. This is a screen capture; in the application, the example letter is a dynamic rendition of how to trace the letter.
How does our system work?

The front end was created with HTML and CSS. Several JavaScript functions perform the linear algebra to run binary classification client-side and produce a confidence score. The support vector classifier itself was trained on the EMNIST dataset using scikit-learn in a Jupyter notebook running on a Google Cloud compute cluster.
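A minimal sketch of that training step might look like the following. Random synthetic arrays stand in for the real EMNIST download, and the linear kernel and shapes are assumptions for illustration rather than the team's actual configuration.

```python
# Sketch of training a binary support vector classifier in scikit-learn,
# assuming EMNIST-style flattened 28x28 grayscale images.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for EMNIST: 200 flattened images, label 1 = "is a B"
X = rng.random((200, 28 * 28))
y = rng.integers(0, 2, 200)
X[y == 1] += 0.5  # shift one class so the toy data is separable

clf = SVC(kernel="linear", probability=True)
clf.fit(X, y)

# Confidence for one drawing; this is the number the UI turns into stars
confidence = clf.predict_proba(X[:1])[0, 1]
print(confidence)
```

To run this in the browser without a server round-trip, the trained model's weights would be exported and the decision function re-implemented in JavaScript, which is consistent with the client-side linear algebra described above.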

The live application can be found here!
Evaluation & Next Steps

While the hackathon's limited time did not allow for user testing with students, we did manage to conduct some guerrilla testing among our prospective judges before the final demo. Overall, we found that drawing with a laptop trackpad mapped poorly to the application window, so in the future it might be better to gear development toward mobile tablets. Our team won an honorable mention, and our team leaders are pictured below!
We also discussed integrating more features to expand the range of what students can practice. I envision teachers and tutors being able to add word lists to a student's practice queue, and building the word animations out of the character GIFs we've already made.

The full documentation of this application can be found on my GitHub.
