Many of us will have seen an example of ‘smart glasses’ before, if not in real life then definitely in the news. Google’s ill-fated Glass project made the news quite frequently for privacy and ethical concerns, as well as aesthetic issues.
The project was relatively short-lived, publicly available only between 2014 and 2017, and its closure prompted many similar projects to shut down as well.
However, technology like this is making a resurgence among one group in particular.
People with visual impairments have found new ways to harness technology like Google Glass to help them perceive the world around them.
Seeing Eye glass
New technology, based on the tech used in Google Glass, can help blind people ‘see’ the world around them.
The glasses, which contain several small cameras, can identify almost anything in view and relay a description to the user through an earpiece.
Worn while walking, the glasses can alert the user to potential obstacles. Pointed at an open fridge, they can list its entire contents. Aimed at printed text, they can read it aloud to the wearer.
Social cues
But it’s not just physical obstacles that this technology is helping people overcome. People with visual impairments can easily miss social cues, particularly those conveyed through facial expressions.
However, a new upgrade for the Google Glass offers emotion recognition with a match rate of 65% – above the 56% that humans typically achieve. It goes beyond simple facial recognition to identify the emotions of the person you’re looking at, meaning far fewer missed social cues.
But technology for the visually impaired isn’t limited to smart glasses. Microsoft offers an app called Seeing AI, which narrates the world around you.
All you have to do is point the camera and the app will tell you exactly what it is you’re looking at. Show it a person, and the app can tell you if they’re smiling. Point it around the park, and it can narrate what each individual is doing and even what the weather is.
This kind of technology is quietly revolutionary, using deep learning and other AI methods to help users understand the world around them.
Because this technology is still quite new, it currently has to run on a phone. Over the coming years, however, we would expect it to make its way into glasses – not necessarily a revival of the Google Glass project, but something more understated, such as the Snap Spectacles.
Microsoft isn’t the only tech giant working on technology like this, either. Apple, Google, Facebook, and Amazon are all pursuing similar projects, which means the race is on to get these tools to the people who need them, improving lives all across the globe.