Google is asking users to help teach its AI how to speak. A new "Experiments with Google" project called LipSync asks users to lip sync a short clip of "Dance Monkey" by Tones and I, Android Police reports.
LipSync, built by YouTube for Chrome on desktop, scores your performance and then feeds the video to Google's AI. It doesn't record any audio.
Google plans to use the video clips to teach its AI how human faces move when we speak. This could inform tools for people with ALS and other speech impairments: someday, AI might be able to infer what they are saying from their facial movements and speak aloud on their behalf.
Google already offers several accessibility features, from Android apps for people who are deaf or hard of hearing to wheelchair-accessible locations highlighted in Maps. This kind of visual speech recognition could lead to useful additions.