- The Google Assistant is now more accessible through Tobii tablets and apps.
- Eye tracking, touch and scanning make Assistant an option for people with disabilities.
- Tobii also lets you create Action Blocks on Android.
The Google Assistant can be very helpful with questions and everyday tasks, but it can be hard to use if a disability makes it difficult to speak or operate a traditional touchscreen. That may not be a problem for much longer: Google has teamed up with Tobii to make the AI helper accessible to many more people.
Google Assistant is now built into Tobii’s Snap Core First software on the company’s Dynavox tablets and mobile apps, letting you issue commands through eye tracking, scanning, or touch. You can ask about the weather or turn on the lights by looking at or touching a tile instead of speaking a command or navigating a complex interface.
Setup is relatively easy. Once you have a Google account, you configure an Assistant-capable smart speaker or display in the Google Home app on your phone or tablet. Grant the necessary permissions to the Snap Core First accessibility app, and you can set up tiles that issue Google Assistant commands.
In addition, Tobii can map its Dynavox picture communication symbols to Action Blocks buttons on Android devices, bringing that familiar accessibility interface to Google’s platform. People with cognitive disabilities can send texts, play videos, and perform other common tasks on mobile devices without having to relearn the interface.
Google added that Assistant was “always” built with accessibility in mind. Even so, these additions could be especially important: many people with disabilities don’t have full use of voice or touch, and this opens up assistant technology that may not have been available to them before.