Google Live Transcribe

Saturday, June 20, I received my cochlear implant. With all personnel wearing masks I wondered how I would “hear” anything said to me. I have 0% word recognition without lip reading, and with everyone masked, lip reading is impossible, so I needed something to help me. As soon as I walked into the facility I turned on the Google Live Transcribe app on my phone. It works without WiFi, but cellphone signals are notoriously poor inside some buildings, so I asked the desk for their guest WiFi code and connected. From then on I could “hear” what was said to me and, in fact, what I said. The nurses were amazed. I soon had half a dozen or more of them at my bedside talking and looking at my screen (social distancing?). Sadly, many of them had iPhones, and this is an Android app; perhaps the only Android app that’s better than anything the iPhone has. I’m sure the iPhone will get it sooner or later, or Apple will create something similar, but I’m hanging on to my Android phone, even though the Nucleus 7 processor I will receive is “Made for iPhone,” meaning music and phone calls are streamed to the processor over Bluetooth without a separate streamer.

I had worried about going into surgery without being able to hear the masked staff, but they let me keep my phone. I had refused pre-surgery sedation, as in my previous experiences it causes nausea, so I was wide awake when wheeled in. In fact, I was still holding the phone when I felt the anesthetic going into my vein. When I woke up in recovery they handed it back to me. I must say the staff was very helpful.

I hope my experience is helpful to anyone needing to be with masked people.


That’s a better experience than I had. My hearing aids were misplaced, and I had some excitement while asleep (long story). When I woke up I was intubated (couldn’t talk), restrained (couldn’t gesture), and on a ventilator that I couldn’t adjust to, perhaps because I couldn’t hear the rhythm. It took quite some time, but I was able to write out the letters D E A F with my fingers over and over until a nurse figured out that I needed my hearing aids. I was getting quite desperate, since it was getting so hard to breathe.

Grantb, I hope you’re okay now. That’s quite a story. My surgery went well, I’m told, but now 3 days later I still have significant pain. I’m hopeful.

I have it on my phone. I haven’t used it yet. My experience has been that a large number of people with normal hearing don’t have a lot of patience with the hearing impaired. I can’t imagine the reaction if I have to read what they said before I can respond. Or, God forbid, the transcription is wrong.

I’ve used it from time to time, but I feel like it doesn’t use all the microphones on my phone properly (Samsung Galaxy S10+). Another thing: when my aids are connected to my phone, the app steals the audio and uses my hearing aid microphones by default, and even if I switch back to the phone microphones, it doesn’t switch the audio back until I either kill the Bluetooth connection or switch my aids back to AutoSense by holding the button.

Hass5744, I do find that many very young people don’t have a lot of patience with the hard of hearing, but for the most part when I tell a person I’m deaf they respond by telling me their mother or father is hard of hearing and they make an effort to help me.

Try Live Transcribe just with yourself. You’ll see there is almost no lag between your speech and its appearance on the screen. That’s unlike my landline caption phone (CaptionCall) where the lag can be very long and I keep having to explain to the person I’m speaking to that their words have not yet appeared on my screen. I also find the words on Live Transcribe to be more accurate than the captions on my caption phone.

I don’t question the accuracy of Live Transcribe. But in my experience over the years, age was never a factor. Rudeness was rudeness. If the rude remarks weren’t made directly to the person, they were made behind the person’s back. Not everybody, but certainly a large number of people felt that a person’s hearing impairment was an inconvenience to them. Maybe the severity of my loss and my speech comprehension contributed to it. But I’ve heard rude comments made about deaf people. It’s unfortunate. A one-time encounter with someone isn’t really indicative of the real world. Most people will try to be helpful and understanding in the short term.

CaptionCall has a human being listening in on your call and typing captions; that’s why it takes longer. That’s also why only the D/deaf/HoH are allowed to use it.

People can be really stupid sometimes. I remember when my employer started enforcing safety policies in the machine shop. Ear plugs were required throughout the shop. One person interrupted a company-wide meeting and went on a rant that the one totally deaf person in the shop was not required to wear ear protection, so no one else should have to. He went on and on, shouting and cursing, and the management just let him rant. Days later he was still ranting that the deaf person had to wear ear protection or be fired. As FG said, stupid is as stupid does. Sometimes we just have to shake our heads and say “Bless their heart.”


My favorite thing about CaptionCall is when it says “could not understand speaker.” And with Google you still have to read it.

I have used Live Transcribe at HLAA meetings. For personal use (I have an iPhone) I use Otter, which I would rate as a close second to Live Transcribe.

It would be VERY helpful to have logic diagrams showing the connectivity options, as well as clear indicators of actual status. There are many mics and “speakers” involved here. For example, Apple Watch, CarPlay, hearing aids, HomePods, etc. (or their Android counterparts) all have one or more of each. Lots of potential combinations, with (currently) inadequate controls, override options, and status indicators, IMHO. Best, Bob.