OralCam is a mobile health system that allows users to self-examine their oral health conditions. I worked on this project during my summer research internship at the UCLA HCI Lab, advised by Prof. Xiang Anthony Chen.
In collaboration with dentists, OralCam supports assessment of five common oral diseases. It helps patients in areas without abundant dental resources become aware of their oral health conditions and improve their habits through an easy-to-use web-based app.
My main contributions included:
- Reviewed the literature of related areas, such as interactive machine learning and mobile health.
- Designed and conducted the user study.
- Contributed to the academic writing.
- Independently developed the full-stack application (a minimal back-end sketch follows this list):
    1. Front-end: Vue.js, Element-UI, TypeScript, Node.js
    2. Back-end: Python, Flask, PyTorch, OpenCV
    3. Continuous Integration: Docker, Jest
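
To make the back-end stack above more concrete, here is a minimal sketch of how a Flask + PyTorch image-assessment endpoint of this kind could look. Everything in it is illustrative rather than the actual OralCam code: the `/assess` route, the `oralcam_model.pt` file name, the placeholder condition labels, and the preprocessing pipeline are all assumptions.

```python
import io

import torch
import torchvision.transforms as T
from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)

# Hypothetical model artifact: a TorchScript export of a trained multi-label classifier.
model = torch.jit.load("oralcam_model.pt")
model.eval()

# Placeholder labels only; the actual five conditions are defined in the paper.
CONDITIONS = ["condition_a", "condition_b", "condition_c", "condition_d", "condition_e"]

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
])

@app.route("/assess", methods=["POST"])
def assess():
    # The client uploads a single oral photo under the "image" form field.
    img = Image.open(io.BytesIO(request.files["image"].read())).convert("RGB")
    batch = preprocess(img).unsqueeze(0)           # shape: (1, 3, 224, 224)
    with torch.no_grad():
        scores = torch.sigmoid(model(batch))[0]    # multi-label probabilities, one per condition
    return jsonify({name: round(float(s), 3) for name, s in zip(CONDITIONS, scores)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Under these assumptions, the Vue.js front end would POST the captured photo to the endpoint (e.g. `curl -F "image=@photo.jpg" http://localhost:5000/assess`) and render the returned per-condition scores.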
The paper was accepted to CHI 2020. We received a 4.0/5.0 average score from reviewers, which, according to the official SIGCHI post on Twitter, was higher than 90% of submitted papers, and the paper received a Best Paper Honorable Mention.
You can read the paper here.
