I recently participated in a special project, the 32 for 32 Project, which centered on 32 consecutive days without speaking. The purpose of the project was to honor the 32 victims of the tragedy that took place on the Virginia Tech campus on April 16, 2007.
I haven’t yet had time to process this experience enough to write in any coherent way about the method of mourning, my emotions while thinking about the honored Hokies, the benefits and drawbacks of silence, and the differing responses and reactions of various people as I participated in this special event. I’m still working on that.
I do want to tell you about a technology I used during the project, one that is relatively new to me – Augmentative and Alternative Communication (AAC).
AAC is about finding and providing alternative means of self-expression for people who have trouble communicating by typical means. Sign language, for example, is a form of AAC that can help hearing impaired people communicate. During my silence, some people suggested that I should learn sign language.
That would have been impractical in my case. I am very slow to learn anything that involves specialized movements, and the project was only 32 days long. I’d actually tried to learn ASL years ago, and failed. But more than that, learning sign language would have helped me to communicate only with others who know sign language.
I took an idea from the film Wretches & Jabberers, which I’ve mentioned previously in this blog. Two men with challenges in speech travel the world to help change the perception of disability. They often use electronic devices to communicate. This is a typical application of AAC, and it matched my needs and interests. I was a technologist without speech or time. The library recently acquired new iPads, and I found some apps.
This screenshot is from iMean on the iPad. The keys are relatively large compared to the normal iPad keyboard. Some people who have difficulties in producing speech also have difficulty with manual dexterity, so the large keys can be helpful in these cases. In this screenshot, the keys are in alphabetical order, but there’s also an option to switch the screen to the usual QWERTY layout.
As text is entered, it appears in the gray area near the top of the iPad. Touching the gray area reads the text out loud. The user can select from a male or female voice. The voices do a relatively good job of pronouncing the words, but there are occasional eccentricities. In close settings, the conversation partner may find it easier to read directly from the screen.
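The type-to-speak interaction described above can be modeled in a few lines of code. This is only an illustrative sketch, not iMean's actual implementation: the class names and the `speak_fn` backend are invented here, with `speak_fn` standing in for a real text-to-speech call.

```python
# Sketch of a type-to-speak AAC buffer (hypothetical, not iMean's code).
# Keys append characters to a message area; tapping the message area
# hands the accumulated text to a speech backend.

class TypeToSpeakBuffer:
    def __init__(self, speak_fn, voice="female"):
        self.text = ""            # contents of the gray message area
        self.voice = voice        # user-selectable male or female voice
        self.speak_fn = speak_fn  # stand-in for a real TTS engine

    def press_key(self, char):
        """Append one character, as if tapping a large on-screen key."""
        self.text += char

    def tap_message_area(self):
        """Read the accumulated text aloud via the speech backend."""
        self.speak_fn(self.text, self.voice)

# Usage: record what would have been spoken instead of producing audio.
spoken = []
buf = TypeToSpeakBuffer(lambda text, voice: spoken.append((voice, text)))
for ch in "hello":
    buf.press_key(ch)
buf.tap_message_area()
print(spoken)  # [('female', 'hello')]
```

A real app would, of course, route the text to the device's speech synthesizer rather than a list, but the separation between the message buffer and the voice backend is the essential shape of the interaction.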
There are other apps that offer alternatives to typing out words on the screen. The screenshot below is from a demo of TapToTalk.
Words and phrases are divided into categories, and represented by very large icons on the display. Tapping on an icon causes the iPad to speak a phrase, and to display more specific icons related to the selected phrase. An example communication sequence might go this way:
I would like a drink -> I would like some water, please
Here, the user has tapped the drink icon, and another screen has appeared with icons for several types of drinks. The user then tapped water. The application spoke each phrase out loud as the user selected the icons.
A similar sequence:
I’m hungry -> I want some fruit -> Strawberries, please
Most of the sequences are kept to two or three levels for simplicity. This application might be very interesting for young children with special interests in technology.
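The category screens described above amount to a shallow tree: each icon pairs a spoken phrase with the more specific icons it reveals. The sketch below is purely illustrative (the menu data and function names are invented here, echoing the example sequences), not TapToTalk's actual data or code.

```python
# Hypothetical model of a TapToTalk-style category tree. Each node maps
# an icon name to (phrase spoken when tapped, child icons revealed next).

MENU = {
    "drink": ("I would like a drink", {
        "water": ("I would like some water, please", {}),
        "juice": ("I would like some juice, please", {}),
    }),
    "food": ("I'm hungry", {
        "fruit": ("I want some fruit", {
            "strawberries": ("Strawberries, please", {}),
        }),
    }),
}

def tap_sequence(menu, icons):
    """Follow a series of icon taps, collecting the phrase spoken at each level."""
    spoken = []
    level = menu
    for icon in icons:
        phrase, children = level[icon]
        spoken.append(phrase)  # the device speaks each phrase as it is tapped
        level = children       # descend to the next, more specific screen
    return spoken

print(tap_sequence(MENU, ["food", "fruit", "strawberries"]))
```

Keeping the tree two or three levels deep, as the app does, means a complete request is never more than a few taps away.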
Apps on iPads are an example of medium-technology AAC: communication is accomplished with consumer-grade electronic devices that were not designed especially for the purpose.
An example of high-technology AAC might be a dedicated hardware device designed specifically to aid communication. Instead of keypad-style input that expects functioning fingers, it might make use of an eye-tracking input device or a sip-and-puff tube.
Virginia Tech has an Assistive Technologies lab. I’m looking forward to visiting the lab soon in search of a new library outreach project, and while we’re there I’m going to look for other technologies that support Augmentative and Alternative Communication.
Sometimes, though, depending on where communication was taking place and who was involved, I found that a low-technology approach – pencil and paper – was most effective. On a few occasions, silence was even better. Everyone should be given the opportunity to communicate, but nobody should be forced. The rule “If you can’t say anything nice, don’t say anything at all” is at times a good rule.