Over 30 visually impaired students, including some from the University of Lagos (UNILAG), were recently trained by the Christ’s Outreach on Disabilities (CODISA) Foundation in partnership with Google.
The training focused on the effective use of Artificial Intelligence (AI) and Android devices to improve their lives and make them more independent.
Speaking at the event, CODISA Leader, Prince Olaoluwa Awojoodu, noted that the training was informed by the need to help visually impaired individuals become independent and improve their access to general services.
He underscored the foundation’s commitment to transforming lives, stating that it has impacted over 10,000 people with special needs since it was established in 1984. “We have 40 homes that we sponsor and we have been doing this for over 40 years,” he said.
Director of Google’s West Africa Region, Olumide Balogun, expressed delight at partnering with CODISA on the technology training, which helps visually impaired persons access their phones and artificial intelligence in ways that benefit their lives, their purpose and, by extension, society.
“At Google, we’re always looking for where we can really step in, support and make a meaningful difference. Google Serve is an activity that we carry out every year, where we think about our communities and ask: where can we serve? Where can we go physically, not just from the office, and serve?”
Google Search Partnerships Lead for Sub-Saharan Africa, Ugochi Agoreyo, who took the group through sessions on utilising their phones and AI effectively, explained that the practical sessions would help them navigate their phones by speaking to them rather than using hand prompts, and improve their everyday lives with Android devices and AI.
“We’ve introduced them to some accessibility tools on their Android devices called Voice Access, Reading Mode, and TalkBack. Voice Access lets them control their devices by voice, Reading Mode helps them read their screens, and TalkBack helps them navigate their phones by talking back to the phones, as opposed to using hand prompts.
“We’ve also introduced them to how AI can support them in learning and in navigating the environment around them. For that, we introduced them to Gemini Live on Gemini 2.5 Flash, which has an audio capability that lets them find information from anywhere in the world just by talking into the Gemini app. Gemini 2.5 Flash also helps them navigate their environment: once they activate Gemini Live and turn on the camera, it becomes their eyes. They can ask it to scan the room and tell them what it sees, and it can read anything the eye sees in the environment, like signs, so that they are safe and understand where they are.”
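For readers curious about the plumbing, the tools Agoreyo names are standard Android accessibility services that can also be switched on from a computer while setting a device up. The sketch below is illustrative only and was not part of the reported training; it assumes the adb command-line tool and a connected Android device, and uses TalkBack’s published service component name.

```python
import subprocess

# TalkBack's accessibility-service component on stock Android
# (package/class names as published by Google).
TALKBACK = ("com.google.android.marvin.talkback/"
            "com.google.android.marvin.talkback.TalkBackService")

def adb(*args: str) -> str:
    """Run an adb shell command on the connected device and return stdout."""
    result = subprocess.run(["adb", "shell", *args],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

# Read the colon-separated list of enabled accessibility services.
enabled = adb("settings", "get", "secure", "enabled_accessibility_services")

if TALKBACK not in enabled:
    # Append TalkBack to the list and switch accessibility on.
    services = f"{enabled}:{TALKBACK}" if enabled and enabled != "null" else TALKBACK
    adb("settings", "put", "secure", "enabled_accessibility_services", services)
    adb("settings", "put", "secure", "accessibility_enabled", "1")
    print("TalkBack enabled")
else:
    print("TalkBack already enabled")
```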
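Likewise, the “scan the room” behaviour she describes corresponds to an ordinary multimodal request to the model. As a rough sketch, assuming the google-genai Python SDK, an API key in the GEMINI_API_KEY environment variable, and a hypothetical camera frame saved as room.jpg, a single image plus a prompt to gemini-2.5-flash returns a description that text-to-speech can then read aloud:

```python
from google import genai
from google.genai import types

client = genai.Client()  # reads GEMINI_API_KEY from the environment

# A captured camera frame; the file name is hypothetical.
with open("room.jpg", "rb") as f:
    frame = f.read()

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents=[
        types.Part.from_bytes(data=frame, mime_type="image/jpeg"),
        "Scan this room and describe what you see, including any signs or text.",
    ],
)

print(response.text)  # this text can be handed to TalkBack or text-to-speech
```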