Google Presentation

My presentation at the second Google Accessibility Summit, on the panel "Design Challenges for Persons with Multiple Disabilities", in Mountain View, California, USA, on 10th February 2015.

Arun Mehta

Since we last met, we at Bapsi have focused on the communication problems of the deafblind. Deafblind persons in the US communicate electronically using Apple devices together with a screen reader, which together cost at least two thousand dollars. Our attempt is to make such communication possible using simple Android phones costing around 50 dollars. Out of this experience, I have some praise and some criticism for Google. Let me start with the praise.

Google has helped me to look smarter than I am. Last time I was here, they gave us a debit card to cover expenses, which I used in the dining car of a train. The guy behind the counter swiped my card and asked me if I worked for Google. “No,” I replied, “they invited me for a conference.” “You must be a genius,” he replied. I am very honored to be addressing a roomful of geniuses!

More seriously, I have loved using App Inventor, which Google developed together with folks at MIT. Akhtar, who is deaf and has low vision, is the first such person I communicated with. We started with the help of a sign-language interpreter, but soon I pulled out my Android tablet, on which I had a simple app running called TellMyPhone. It uses speech-to-text and displays the result in a large font size. The expression on Akhtar’s face, when he realized that with this app I could communicate with him without the interpreter, was priceless indeed. Everyone was impressed, which made my subsequent work much easier. Nothing succeeds like success. Yet creating the app was no more difficult than sticking 5 Lego bricks together!
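
For those curious what those "bricks" amount to in conventional code, here is a rough Kotlin sketch of the idea behind TellMyPhone: launch the platform's speech-to-text dialog and show the result in a very large font. The actual app was built in App Inventor; the class name, layout, and font size below are my own illustrative choices, not the app itself.

```kotlin
import android.app.Activity
import android.content.Intent
import android.os.Bundle
import android.speech.RecognizerIntent
import android.widget.TextView

// Illustrative sketch only: speech-to-text shown in a large font,
// roughly what the App Inventor blocks of TellMyPhone do.
class TellMyPhoneSketch : Activity() {

    private lateinit var bigText: TextView
    private val requestSpeech = 1

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // A single text view in a very large font, for a reader with low vision.
        bigText = TextView(this).apply { textSize = 72f }
        setContentView(bigText)

        // Ask the platform's speech recognizer to transcribe what is spoken.
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(
                RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
            )
        }
        startActivityForResult(intent, requestSpeech)
    }

    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == requestSpeech && resultCode == RESULT_OK) {
            // Display the best recognition result in the large text view.
            val results = data?.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS)
            bigText.text = results?.firstOrNull() ?: ""
        }
    }
}
```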

Now comes some criticism. For someone who can neither see nor hear at all, the only accessible phone output is vibration. The phone outputs megabits per second for the graphics and kilobits per second for the audio, but with vibrate you can output only about one bit per second, and not all day either, as I learnt by destroying my Samsung tablet while debugging the apps I was writing. The mismatch between the body's ability to receive input via touch and the phone's ability to provide it is extreme. The smartphone would be significantly improved if it had 6 tactile actuators instead of just 1, allowing it to output Braille and also provide far richer tactile output for non-verbal communication.
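
Even with the single motor we have today, text can at least be spelled out slowly in Morse code as vibration pulses, which gives a feel for how narrow this channel is. Here is a minimal Kotlin sketch of that idea, using the older Vibrator API; the helper name, the timings, and the partial letter table are my own illustrative choices.

```kotlin
import android.content.Context
import android.os.Vibrator

// Illustrative sketch only: spell a string in Morse code on the phone's
// single vibration motor, to show how narrow this output channel is.
object MorseVibrator {

    private const val UNIT = 150L  // one Morse time unit, in milliseconds

    // Partial table for brevity; the remaining letters follow the same pattern.
    private val CODE = mapOf(
        'e' to ".", 't' to "-", 'o' to "---", 's' to "...",
        'a' to ".-", 'n' to "-.", 'i' to "..", 'm' to "--"
    )

    fun vibrate(context: Context, text: String) {
        // The Vibrator pattern alternates off/on durations, starting with an
        // initial delay, so the list begins with a single "off" entry.
        val pattern = mutableListOf(0L)
        for (ch in text.lowercase()) {
            val symbols = CODE[ch] ?: continue
            for (s in symbols) {
                pattern.add(if (s == '.') UNIT else 3 * UNIT) // on: dot = 1 unit, dash = 3
                pattern.add(UNIT)                             // off: gap between elements
            }
            pattern[pattern.lastIndex] = 3 * UNIT             // longer gap between letters
        }
        val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
        // Older single-motor API, kept here for brevity of the sketch.
        vibrator.vibrate(pattern.toLongArray(), -1)           // -1 = do not repeat
    }
}
```

Calling, say, MorseVibrator.vibrate(this, "sos") from an Activity would buzz out three dots, three dashes, three dots, and even those three letters take several seconds: exactly the one-bit-per-second problem described above.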

Wearable computers touch significant areas of skin, through which information could be sent to the user. It saddens me to see them attempt to communicate through small screens. It seems to me that developers are touch-challenged. I know I am.

I come from the country that invented untouchability. We avoid routine physical contact, such as shaking hands; instead, we put our palms together in greeting. This may have started as a way to avoid catching disease. The central role that touch plays in sex is another reason why we seem to fear physical contact. When the sensation of touch can reliably be communicated electronically, without a doubt the pornography industry will be quick to use it.

In short, therefore, we face formidable challenges, both technical and social.

We are hoping to start a facility at the Helen Keller Center for the Deaf and DeafBlind which any disabled person could approach for technology needs, the way she can go to a hospital for medical needs. This facility will enable developers to learn to appreciate touch better, and to test the software. We will also teach interested trainers and students Morse code (the only way a fully deaf and blind person can receive text) and programming. We look forward to your support in this.
