Read my lips

This post is also available in Dutch.

Reading lips is something the Deaf have to do on a daily basis to communicate with the hearing world. But is lipreading as easy as reading a book?

“Can you read my lips?” Picture from Max Pixel (License: CC0 1.0)

When someone reads lips, they try to piece together words and phrases without the input of sound. In the scientific literature, this is generally called speechreading rather than lipreading, because information from the whole face, not just the lips, is used when you watch someone speak. For deaf people, speechreading can be the main channel of oral communication, but hearing people use visual information in a similar way, often without even realizing it. Imagine you are talking to someone in a noisy bar: you will understand the other person much better if you’re looking directly at them and can see their mouth, facial expressions, and hand gestures.

Reading lips isn’t like reading a book
In books, letters and words are always clearly distinct, but when you try to speechread, the differences between sounds can become teeny-tiny! There’s a whole world of terminology for this: the smallest unit of sound that allows you to distinguish one word from another is called the phoneme. For example, pit and bit differ by a single phoneme and refer to different concepts. When you read the written words pit vs. bit, you can easily tell b from p apart. But the human face is not like a book. People use facial expressions, speak fast, mumble, and have accents or dialects. Try to visually distinguish pit and bit from someone’s lips: it is really tough. The reason brings us to our next term, the viseme: a group of sounds that look the same on the face. There are far fewer visually distinctive units than there are phonemes, because many consonants and vowels are produced with the same parts of the mouth. For example, [p], [b], and [m] are all produced with the upper and lower lips; [f] and [v], with your lower lip and upper teeth. Therefore, the p and b in pit and bit may look identical if all you can see are the lips. In fact, only about 30% of speech can be speechread. Don’t believe me? Watch the movie below to experience speechreading yourself.
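To make the phoneme-vs-viseme idea concrete, here is a toy Python sketch (my own illustration, not from the post; the viseme groups are simplified assumptions) showing how two different words can collapse into the same visual sequence on the lips:

```python
# Toy phoneme-to-viseme mapping (simplified, for illustration only).
# Phonemes that are produced with the same visible mouth shape
# fall into the same viseme class.
VISEME_OF = {
    "p": "bilabial", "b": "bilabial", "m": "bilabial",   # both lips pressed together
    "f": "labiodental", "v": "labiodental",              # lower lip + upper teeth
    "i": "spread-vowel", "t": "alveolar",
}

def visual_pattern(phonemes):
    """Map a sequence of phonemes to the viseme classes a speechreader sees."""
    return [VISEME_OF.get(p, "other") for p in phonemes]

# "pit" and "bit" differ by one phoneme (p vs. b)...
pit = visual_pattern(["p", "i", "t"])
bit = visual_pattern(["b", "i", "t"])

# ...but look identical on the face, because [p] and [b] share a viseme.
print(pit == bit)  # True
```

In other words, the sound track distinguishes the two words, but the visual track alone does not, which is exactly why speechreading them apart is so tough.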


Lipreading or sign language?
According to an old view called Oralism, deaf people should learn speechreading, along with mouth shapes, breathing, and vocal exercises, to help them produce speech. Supporters of Oralism believed it was important for the deaf community to assimilate into the hearing world and function “normally” in society. After quite some debate, researchers nowadays focus on which aspects of language and communication work best. Bimodal bilingualism (proficiency in both speech and sign language; read more in this blog) is now generally preferred in language education for deaf children.

As we have seen, people who speechread often work harder to communicate, stretching the boundaries of their senses to understand others. Now that you can imagine how that feels, it’s up to the rest of us to cut them some slack and take a few extra steps to make our speech easier to understand. As your mum would say: “no mumbling!”


This blog was written by Francie Manhardt and edited by Annelies van Nuland and Monica Wagner.
