The World of Communicative Technology: One User’s Journey

Today’s post is written by Andi Fry, our BridgingApps Coordinator for Montgomery County Outreach, in partnership with her adult daughter Megan.

Megan was born with severely limited muscle control, which ruled out both verbal speech and sign language for communication. So she’s always needed augmentative and alternative communication (AAC) devices. It’s been a long journey to her current assistive technology.

Starting Out

Early on, Megan tried different types of switch-operated devices: big buttons, little buttons, sticks. She ultimately settled on “Jelly Bean” switches in the form of large plastic buttons; two of these were built into her wheelchair headrest. She was very hard on the buttons, and we needed to replace them a few times a year.

Later, we experimented with wearables—goggles, a forehead sticker—that picked up light beams from a camera attached to the main AAC device. These proved a poor fit:

  • They had to be maneuvered by smooth movements of the head and neck, and Megan’s movements tend to be jerky.
  • Megan’s AAC system has a screen with over 100 buttons, making it even harder to keep movements small enough to hit the right spot.

On to Eye-Gaze

When we discovered modern eye-gaze technology, that was the real game changer. Still, it had a labor-intensive learning curve. For one thing, it meant unlearning old habits: it was hard for Megan not to move her head around when she’d spent so many years doing just that with switches. In the beginning, she would tire quickly, with headaches from all the eye movement.

Megan’s first eye-gaze device also brought “headaches” of a different kind. First, we had to purchase an entirely new computer to get the built-in camera that the device depended on. Second, the device couldn’t work with internet browsers.

Moving Forward

Eye-gaze technology has come so far since then. Megan’s current device is a small, separate bar that attaches to a bracket on her Microsoft Surface laptop. Using infrared light, the camera “sees” where the user’s eyes are looking and moves the cursor to that spot on the screen. To select a button, the user can blink, or can “dwell” (hold the eyes steady) on the right spot.

Another wonderful feature: you can customize how fast the cursor moves, as well as how long a user needs to dwell on a button to select it. The one big challenge is keeping the camera calibrated to Megan’s eyes. If the camera and device aren’t positioned exactly right, the mechanism will not work.

The Journey Continues

Through all the challenges and changes, Megan loves communication devices because they allow her to participate in everyone’s conversations. She can speak directly to friends, family, caregivers, doctors, professors, therapists, employers, and even strangers. She can also keep up with schoolwork, social media, games, and art. She has an active role in the world.