I listened to this BBC Ouch chat in the morning, about blindness, deafness, sign language and communication barriers.
— Berkshire Vision (@berkshirevision) March 20, 2017
A lovely listen, and it also comes with a transcript so deaf people can read it too.
And a very interesting and useful topic.
What if… all of us were taught some sign language, ideally as kids? And maybe some basic braille or at least some blind good manners? Would the world become a more inclusive place? I certainly hope so.
— Enable Magazine (@EnableMagazine) March 20, 2017
There are a lot of good comments on this article, and I think it’s the one the BBC Ouch folks are referencing in the other chat.
The BBC Ouch one, the first one listed, has so many good points to ponder. Deaf people are usually very visual, and eye contact is vital to them. So if you look somewhere else while a deaf person is talking to you, they might stop talking because from their point of view, you are not listening. From my point of view, if they are speaking using their voice, it doesn’t matter which way my eyes point or whether they are open or closed. Plus, if I’m looking “right at you”, I don’t exactly see what you would imagine. I’ll see something, kind of like through a marble. It’s just easier if I don’t have to stress about which way my eyes should point.
Another point I would struggle with if I learned sign language is facial expressions. First, because I don’t see them that well (it took me several months to find the “confused face” emoji on iOS, as I had no clue what a confused face would look like in humans). Second, and as a continuation of that, it can be difficult to use the “correct” facial expression when speaking or communicating. Since I don’t see much of other people’s facial expressions, my natural tendency is to use little to no facial expression with others, unless I’m with people I know well enough — and even then, I would never use a facial expression as the only means of communicating something. I use the words; not even the tone, as that’s another layer of difficulty. I can try to use tone of voice to communicate more effectively with the hearing neurotypical world, but when I am speaking, ultimately what matters is the words. So the whole facial expression thing would be a long delay and probably a “speech impediment” in sign language for me. And how can people see both the hands and the face of someone who is signing at the same time? I would massively struggle with this. A bit like watching movies with subtitles or closed captions: if I can physically see the text, I will read it, be out of sync with the audio, and miss 90% of the screen. So I’ll have to make my pick — and with TV these days, the audio always wins. If I can understand the language, I won’t even bother looking. I will miss the blurry screen, but in return I’ll get a much better audio experience, which may need some filling in.
I would love to have some deaf and deafblind friends. I remember Rikki Poynter showed in a video how she used an app called Make It Big on iPhone; that would probably be comfortable when talking to friends. You type, I’ll listen. Or in the analog world, tactile sign. (I’d love to learn that too.) For a start, tactile signing of letters on the palm should always work. Use your finger as a pen and draw the capital letters (Latin alphabet please, or braille) on the palm… It won’t be very fast or effective, but it’ll work in the dark too, with no pen or paper, no iPhones or braille displays or hearing aids.
Curious to hear deaf and hard of hearing folks’ thoughts on this. 🙂