
Virtual Human for ASD: A Dialogue-Based Virtual Human Assistant Application for Parents to Train Autistic Children

App Designer & Developer | ASD, Tablet App, Interdisciplinary Collaboration

Teammates: Tong Wu, Weibo Li, Liwen He, Ruitian Wu (all from Duke Kunshan University)

Supervisors: Prof. Xin Tong, Prof. Ming Li (both from Duke Kunshan University)

Time: 08/2022-Present

——Project Description——

This research project aims to help parents of autistic children train their children more effectively through a dialogue-based virtual human assistant application. The app captures the user's audio and facial snapshots as input and sends them to the back end for processing. The processed result is returned to the app's front end, enabling the virtual agent to speak with matching facial expressions and gestures.
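
The exact wire format between the front end and the back end is not documented here; as a rough illustration only, the exchange can be thought of as a pair of payloads like the following (the field names, JSON framing, and base64 encoding are assumptions, not the project's actual protocol):

# Illustrative sketch of the request/response payloads exchanged between the
# Unity front end and the Python back end. Field names and JSON framing are
# assumptions; the project's real wire format may differ.
import base64
import json
from dataclasses import dataclass


@dataclass
class UserInput:
    snapshot_png: bytes   # facial snapshot captured by the front-end camera
    audio_wav: bytes      # utterance recorded by the front-end microphone

    def to_json(self) -> str:
        # Binary payloads are base64-encoded so they survive JSON transport.
        return json.dumps({
            "snapshot_png": base64.b64encode(self.snapshot_png).decode(),
            "audio_wav": base64.b64encode(self.audio_wav).decode(),
        })


@dataclass
class AgentReply:
    text: str        # what the virtual human should say
    emotion: float   # emotion value driving facial expression and gestures

    @staticmethod
    def from_json(payload: str) -> "AgentReply":
        data = json.loads(payload)
        return AgentReply(text=data["text"], emotion=float(data["emotion"]))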

Demo Video

——My Role——

  1. Accessed the device camera and microphone in Unity to capture the user's snapshots (.png) and audio (.wav) as input.

  2. Collaborated with the teammate in charge of the back end (speech recognition and facial expression recognition) and implemented the data transfer between the Unity front end and the Python back end (a sketch of the back-end side follows this list).

  3. Designed and implemented the virtual human's movement mechanism while speaking, including its facial expressions, mouth movements, and body gestures.

  4. Implemented the designed UI in Unity.
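
The following is a minimal sketch of what the receiving side of that data transfer could look like, assuming an HTTP endpoint and the JSON payload illustrated above; recognize_speech() and estimate_emotion() are placeholders standing in for the teammate's actual speech-recognition and facial-expression models, not the real pipeline.

# Minimal back-end sketch: accept a JSON POST carrying the snapshot and audio,
# return the virtual human's text and emotion value. Endpoint, port, and the
# two placeholder functions are assumptions for illustration.
import base64
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def recognize_speech(audio_wav: bytes) -> str:
    return "Hello! Let's practice greeting each other."  # placeholder answer


def estimate_emotion(snapshot_png: bytes) -> float:
    return 0.7  # placeholder emotion value


class DialogueHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        data = json.loads(self.rfile.read(length))
        audio = base64.b64decode(data["audio_wav"])
        snapshot = base64.b64decode(data["snapshot_png"])

        reply = {
            "text": recognize_speech(audio),        # answer prepared from the audio
            "emotion": estimate_emotion(snapshot),  # emotion analyzed from the snapshot
        }
        body = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), DialogueHandler).serve_forever()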

Data Transfer Pipeline (diagram): the Unity app front end sends the user's snapshots and audio to the Python server back end; the back end analyzes the user's emotion from the snapshots and prepares an answer based on the audio, then returns the virtual human's text and its emotion (a float value) to the front end.
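
To exercise the pipeline end to end, a small script can stand in for the Unity front end: it posts a snapshot and an audio clip to the assumed HTTP endpoint from the sketch above and prints the returned text and emotion value. The file names and the localhost:8000 address are illustrative placeholders, not the app's actual configuration.

# Test client that plays the role of the Unity front end for the sketch above.
import base64
import json
from urllib.request import Request, urlopen

with open("snapshot.png", "rb") as f:
    snapshot = f.read()
with open("utterance.wav", "rb") as f:
    audio = f.read()

payload = json.dumps({
    "snapshot_png": base64.b64encode(snapshot).decode(),
    "audio_wav": base64.b64encode(audio).decode(),
}).encode()

req = Request(
    "http://localhost:8000/dialogue",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as resp:
    reply = json.loads(resp.read())

print("Virtual human says:", reply["text"])
print("Emotion value:", reply["emotion"])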

UI screenshots: Welcome Page, Menu Page, Conversation Page.

——My Future Work——

  1. Design and develop the data transfer mechanism for multi-round conversations (a design sketch follows this list).

  2. Finalize the app's UI appearance in Unity.

  3. Deploy the app on a tablet.

  4. Conduct a final round of debugging.
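
For future-work item 1, one possible shape for the multi-round data transfer is to tag each request with a session ID and keep a per-session dialogue history on the server. The sketch below is a design idea under that assumption, not the project's implementation; the names Turn, Session, and sessions are hypothetical.

# Design sketch for multi-round conversation state, assuming per-session
# history kept on the Python back end.
from dataclasses import dataclass, field


@dataclass
class Turn:
    user_text: str      # recognized user speech for this round
    agent_text: str     # virtual human's reply for this round
    emotion: float      # emotion value used for this round


@dataclass
class Session:
    session_id: str
    history: list[Turn] = field(default_factory=list)

    def add_turn(self, user_text: str, agent_text: str, emotion: float) -> None:
        self.history.append(Turn(user_text, agent_text, emotion))


# The back end would look up the Session by session_id on every request,
# append the new turn, and condition the next answer on the stored history.
sessions: dict[str, Session] = {}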
