Google has announced that the code for Project Gameface, a hands-free gaming "mouse" controlled by facial expressions, is now available as open source for Android developers. The implementation lets users open their mouths to move the cursor or raise their eyebrows to click and drag, UNN reports, citing The Verge.
Details
Android users will soon have access to a new form of device control: completely hands-free navigation using Google AI and face tracking.
The Gameface project was announced at last year's Google I/O conference. It was designed for desktop computers and uses the device's camera together with a database of facial expressions, exposed through an API, to control the cursor.
"Using the device's camera, it seamlessly tracks facial expressions and head movements, translating them into intuitive and personalized controls. Developers can now create apps where their users can customize their experience by adjusting facial expressions, gesture sizes, cursor speeds, and more," Google explained in its announcement.
Addendum
While Gameface was originally designed for gamers, Google says it is also working with Integral, a social enterprise in India that focuses on accessibility, to explore how the technology can be extended to other settings such as work, school, and social situations.
The Gameface project was inspired by Lance Carr, a paralyzed video game streamer who has muscular dystrophy. Carr collaborated with Google on the project, aiming to create a cheaper and more accessible alternative to expensive head-tracking systems.
Recall
Google is introducing AI-powered search with the Gemini model to provide direct answers and planning capabilities, such as meal and vacation plans, as well as visual search with video; the features will be rolled out globally by the end of 2024.