Cover.png

Introducing TaskTouch, an interactive AR learning platform. This project demonstrates the platform's interface and functionality through a breadboarding tutorial use case.

2.png

I have always been passionate about designing in the education space and bridging the gap between learning and technology. Giving target users (students) the ability to visualize instructional cues in real time could improve how they learn, increase their interest in the subject material, and create interactive experiences with technology as simple as a smart device camera.

3.png

This project was a way to develop my design skills and explore a new instructional mode for students, particularly those interested in STEM principles (I was pursuing my engineering degree at the time).

4.png

Before designing digitally, I created a paper prototype to map out potential interactions within the application. I also sketched UI components that could serve as features within the AR overlay.

5.png

Above is the final demo. Instructional content appears at the bottom of the screen. Swiping through the timeline lets the user view the previous, current, or upcoming task. A double tap reveals any visual aids for the current task, and the user can zoom in or out for a closer look. Once the task is performed, a confirmation popup appears and the instructional content resumes.
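
As a rough illustration of this interaction model, below is a minimal TypeScript sketch of how the task timeline and its gestures could be represented. All names here (Task, TaskTimeline, the gesture labels) are hypothetical for illustration and are not taken from the actual app.

```typescript
// Sketch of the task-timeline interaction model described above.
// Names and structure are illustrative assumptions, not the app's real code.

interface Task {
  title: string;
  instructions: string;
  visualAidUrl?: string; // optional aid shown on double tap
}

type Gesture = "scrollBack" | "scrollForward" | "doubleTap" | "confirmTask";

class TaskTimeline {
  private index = 0;
  private showingVisualAid = false;

  constructor(private tasks: Task[]) {}

  get currentTask(): Task {
    return this.tasks[this.index];
  }

  handle(gesture: Gesture): void {
    switch (gesture) {
      case "scrollBack": // swipe back to the previous task, if any
        this.index = Math.max(0, this.index - 1);
        break;
      case "scrollForward": // peek at the upcoming task
        this.index = Math.min(this.tasks.length - 1, this.index + 1);
        break;
      case "doubleTap": // toggle the visual aid overlay for the current task
        this.showingVisualAid =
          !this.showingVisualAid && !!this.currentTask.visualAidUrl;
        break;
      case "confirmTask": // task performed: confirm, then resume instructions
        console.log(`Confirmed: ${this.currentTask.title}`);
        this.showingVisualAid = false;
        this.index = Math.min(this.tasks.length - 1, this.index + 1);
        break;
    }
  }
}

// Example: a tiny breadboarding tutorial timeline
const timeline = new TaskTimeline([
  {
    title: "Place the resistor",
    instructions: "Insert a 220 ohm resistor into row 10.",
    visualAidUrl: "resistor.png",
  },
  {
    title: "Connect the LED",
    instructions: "Place the LED with the long leg toward row 10.",
  },
]);
timeline.handle("doubleTap");   // show the visual aid for the current task
timeline.handle("confirmTask"); // advance after the task is performed
```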

What I Learned

  • Provide a basis for design. Because I developed this project out of curiosity about eduTech, there was no research phase to guide the product's needs or functionality.

  • Testing is imperative. Sharing my ideas with potential users would have informed both the intuitiveness and the content of the project and refined the product to fit user needs.

Thanks for viewing!