Since the roll-out of MacBook Airs to students across my school, I have tried to carve out some time each week to check in on our students from the state’s school for the blind, who take coursework on our campus.
The state’s school for the blind is a residential campus that serves students from across the state, beginning at a young age. As students mainstream into middle and high school, on-campus experts provide them with wide-ranging assistive technologies, from braillers to screen-reading software on Windows laptops.
In my work to integrate technology into our school, I find that I, too, need to provide students with the tools and skillsets that will make them successful as learners beyond the school day. There’s a big difference between treating kids as students and as learners, in my opinion, which is why I run a school-wide Edmodo group that posts tech tips, digital study skills, software and service tutorials, and announcements about our one-to-one program. For the students who are visually impaired, this means an entirely new approach to what I am used to providing for the other 99% of our students.
Gone must be the presentations with small arrows or the how-to videos that act as a visual aid to steps around a graphical interface. And, as I came to find out, your choice of communication medium makes all the difference when spreading information. (Some students found out the hard way that Edmodo is not accessibility-friendly: screen readers cannot process its Facebook-like stream.) When training students with visual impairments on how to use Macs to do school work, a few profiles need to be kept in mind.
Accommodating students with limited sight
It pained me to see the size and weight of some of the zooming technologies that students were using in our school. When making a school-wide digital transition, the mobile technology of a laptop or tablet opens up many new possibilities for students with limited sight to engage in activities without heavy magnification equipment; the graphics processors and pixels handle everything. Probably the most popular features for our students with limited sight are Apple’s built-in zoom options.
Imagine making those bulky magnification screens as large as you can imagine… and customizing that experience for each document, application or website. The virtualization of space and zoom is that easy with Apple’s Accessibility features. Beyond that, the color and pointer preferences mean that students who have a hard time distinguishing the cursor or other similarly colored items no longer need expensive add-on software for their Windows machines.
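For reference, the zoom features described above can be driven from the keyboard or, for a lab-wide rollout, pre-set from Terminal. The `defaults` key below is the one commonly reported for this era of OS X; treat it as an assumption and verify against System Preferences > Accessibility on your own build.

```shell
# Default zoom keyboard shortcuts (once zoom is enabled in Accessibility):
#   Option+Command+8  toggle zoom
#   Option+Command+=  zoom in
#   Option+Command+-  zoom out

# Enable scroll-gesture zoom (hold the modifier key and scroll to zoom).
# NOTE: this key name is an assumption commonly reported for this OS X era --
# verify it on your build before deploying to student machines.
defaults write com.apple.universalaccess closeViewScrollWheelToggle -bool true

# Log out and back in for the preference change to take effect.
```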
It’s just a good thing that the decision-makers went with a 13″ model of the MacBook Air rather than the 11″.
Accommodating students with no sight
I could not have been more impressed by the excitement from the students I worked with who had no sight at all. Having come from a world of publishing and digital media, I was worried that the flair of the design-minded Apple brand might be lost on our students. However, I came to find out that many of these students are already iPhone users who find great success in using Siri to stay in touch with each other (as do many others).
Once we synced Bluetooth headsets to their Macs and turned on VoiceOver, Speakable Items and Dictation (sidebar: it’s still weird to me that Dictation is separated out of the Accessibility pane in System Preferences), the only hard part was keeping straight which technology read for them and which read to them. (It’s a learning curve I still struggle with.) Bluetooth headsets, though, made for a night-and-day difference in the usability of these services. In bustling classrooms during collaborative activities (and even during training when students are trying to help each other), a closer, more sensitive, noise-reducing microphone connected to a Mac means that students dictate only what they want, not what others (mostly innocently) interject into the computer. The headsets also mean that students are no longer the ones with the “special” bulky headsets from some assistive technologies company that makes a good buck off of school districts. A consumer Bluetooth headset now runs about $30–$40, but make sure you get one that supports the A2DP profile, which is required for using Bluetooth with Speakable Items, Dictation, etc. Another technologist from the school for the blind supplemented my inexperience with braille displays; hopefully the USB-connected devices make for quicker interfacing, too.
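Before buying a classroom set of headsets, you can check what a paired device actually advertises from Terminal with `system_profiler`; the exact field names in the output vary by OS X version, so treat the `grep` pattern as a sketch.

```shell
# List paired Bluetooth devices and the services/profiles they advertise,
# so you can confirm a headset supports A2DP before buying more of them.
system_profiler SPBluetoothDataType

# Narrow the output to the service lines (field names vary by OS X version):
system_profiler SPBluetoothDataType | grep -i -A 2 "services"
```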
With some help from aides, students who were completely blind were then set up with email notifications from Edmodo (which has terrible accessibility for those with visual impairments) to their school-issued email accounts. From the Mail application, students can use text-to-speech (or braille displays) to take in text-based notifications and use Dictation to respond to teachers.
In the long term, students with no sight will need to invest time in training Speakable Items and in learning how to navigate the Mac with VoiceOver. Fortunately, Apple translated its otherwise text-based VoiceOver manual into a podcast series, which students can listen to (and self-pace!) through iTunes or on their own audio devices. Teachers will still need to work with these students to create a more comprehensive and custom-tailored selection of Speakable Commands so students can be more independent of the very graphical interfaces of OS X and other operating systems.
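As a sketch of what building a custom Speakable Command can look like: custom items live in `~/Library/Speech/Speakable Items`, and each one is a small compiled AppleScript whose filename is the phrase the student speaks. The folder path and `osacompile` usage below are assumptions to verify on your own machines before rolling out to students.

```shell
# Custom Speakable Items are scripts stored in this folder; the filename
# (minus the extension) is the phrase the student speaks aloud.
mkdir -p ~/Library/Speech/"Speakable Items"

# Compile a one-line AppleScript named for the spoken phrase
# "Open my mail", so a student can launch Mail entirely by voice.
osacompile -e 'tell application "Mail" to activate' \
  -o ~/Library/Speech/"Speakable Items"/"Open my mail.scpt"
```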
Changes I made in workshops
Teachers, aides or I made many of the accommodations and assistive settings before we started these workshops. Pairing our students with no sight with an aide did help those students feel confident about finding keys on a new keyboard layout.
In either situation, students with keyboarding skills or background took off. The adults’ breakthrough, however, came when they started hurriedly writing down keyboard shortcuts and commands. Moments of glowing revelation came upon the Windows-minded educators when I pointed out how to use the Apple menus to find shortcuts for those students who could not spot them easily (because of limited sight or none at all). In both cases, I feel one of the biggest turning points was teaching students and teachers how to use the arrow, tab and enter/return keys to navigate menus, which could be read by VoiceOver or simply explored for useful shortcuts (oh, and those shortcuts can also be translated into Speakable Commands!).
Humans have not always used graphical interfaces to command computers. And Apple was not always as well-designed, with its clean, chic, modern icons and visuals. The most often-used tool in my training sessions with students of either sight profile was Spotlight, a very robust search tool built into OS X. Activating it by holding Command and tapping the space bar lets the user type words that are searched across the accessible disks of a Mac, generating a complete list of file names, folder names, application titles and even dictionary definitions of the search terms. (Fortunately, you don’t need to know how to type out file paths anymore, à la DOS.) This changed the game for my students who couldn’t find the programs they needed when navigating Finder’s Applications folder or even the Dock. Every time I asked a student to open a new application, I also narrated how I would use Command + space bar to Spotlight exactly what I was looking for.
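Spotlight also has a Terminal counterpart, `mdfind`, which queries the same metadata index as Command + space bar; the search terms and folder below are just illustrative.

```shell
# Search the Spotlight index from Terminal, just like Command+Space:
mdfind "biology essay"

# Match on filename only:
mdfind -name "Keynote"

# Restrict the search to a folder, e.g. a student's Documents:
mdfind -onlyin ~/Documents "lab report"
```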
By using scroll gestures with modifier keys to zoom, I was able to project very large close-ups of different parts of a screen or slide (that is, for those with limited sight). This came in handy when I needed to put up the naming and password conventions for student email accounts.
Extensions for Special Education
There is so much potential for Apple’s Accessibility features to be leveraged for any student with special needs or mandated accommodations. Special education teachers whose budgets have been hacked away may find that VoiceOver is a viable option for reading assignments or tests to students at their own pace, instead of expensive Kurzweil software. Even Apple’s QuickTime has built-in screen and audio recording that can serve such accommodations, should a student prefer a human voice… that they can still listen to at their own pace!
Students with reading disabilities may also find Apple’s text-to-speech feature (highlight the text of an article > right-click > Speech) useful for independent research or other web-based activities. Students with fine motor difficulties who struggle with keyboards may find the Dictation features useful for composing essays and the like.
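For teachers who want to prepare audio of an assignment ahead of time, OS X’s built-in `say` command can read a text file aloud or render it to an audio file students replay at their own pace. The voice name and speaking rate below are illustrative choices, not requirements.

```shell
# Speak a text file aloud immediately:
say -f assignment.txt

# Or render it to an audio file students can replay at their own pace.
# (-v chooses a voice; -r sets words per minute -- both illustrative.)
say -v Alex -r 170 -f assignment.txt -o assignment_audio.aiff
```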
Ever have one of those experiences where students “level up”? Though it may be a gamer’s mindset, it’s worth celebrating this experience. Students expressed to me that they never thought they could do some of these things on computers.
While on this topic, I should be careful not to colloquially say “I love seeing this all happen.” I am going to avoid it this time because there is something more experiential here than any one of the five senses alone can grasp.
A one-to-one initiative is one of the few times that students can feel they have been equalized as learners. One-to-one is no longer about who can afford more literacy experiences or who has the time to explore the Web. I truly believe that technology in the hands of students is entirely different from issuing textbooks to everyone or providing free and reduced lunches to the underserved. There is no way to parallel this type of educational movement to anything that has been done before in schools. I have now witnessed technology enable those who have felt marginalized and even the playing field in so many ways.
For more tricks and tips (that aren’t linked above), check out Apple’s Accessibility Support page.
For more on Dictation and Speech-to-Text features, check out MacWorld’s “Using Mountain Lion’s dictation and text-to-speech features”.