When it comes to smart home technology, what many reviewers and users may see as convenience, other people see as accessibility. In other words, using one’s phone or voice to flip a light switch may be convenient for you because you don’t need to get up. For me and other disabled people, this makes it accessible.
The best illustration of this concept involves the AmazonBasics microwave. The Alexa-powered device (which, as far as can be determined, has not been available for sale for over a year) was admittedly a bit gimmicky. Why would you want to talk to your microwave to heat your food? The controls are right there. But while it is true that the microwave is pretty mediocre — my partner constantly bemoans how small and underpowered it is — she’s willing to tolerate it because she knows how accessible the thing is for me. Instead of standing at the microwave and squinting at the low-contrast keypad — even with the good lighting in the kitchen, the numbers can be hard for my low vision to distinguish — I can be across the room and use my voice to tell Alexa to heat up my leftovers.
The implications of this transcend convenience. In fact, between the microwave, the Alexa-based Echo Wall Clock (which is helpful for seeing timers), and an old Echo Dot hooked up to control both devices, our kitchen is arguably the most accessible room in the house.
In many ways, smart home tech represents accessibility and assistive technology at its very best. It’s not merely pragmatic — it’s empowering. It takes ostensibly mundane everyday objects like lamps and garage doors and microwaves and turns them into spectacular, borderline magical marvels.
For many in the disability community, this transformation means the difference between inclusion and independence on the one hand and exclusion and dependence on the other. These are qualities that many people, especially those in tech media, fail to consider in their coverage of the smart home but that are crucially important if one wishes to understand technology in a more holistic way.
Take my household. My partner and I are pretty hardcore Apple users. We both have iPhones. We both wear an Apple Watch and AirPods. There are multiple iPads, HomePods, and Apple TVs in our house. As a result, we primarily use HomeKit to control our various smart home devices, including those from Nest (more on that later).
The fact that we wade knee-deep in the Apple ecosystem is convenient, but more important for me is accessibility. With one exception (a MyQ garage door opener, which I’ll touch on later), I can control all of my smart home devices from the Home widget in Control Center.
To tell the truth, this says as much about the accessibility of Apple’s vertical integration as it does about smart home devices. The big win is that if I need to turn off the lamp in the living room, I don’t have to figure out how; I just grab whichever device is closest to me: my iPhone, my Apple Watch, an iPad, the Apple TV, or the HomePod.
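Under the hood, every one of those entry points talks to the same HomeKit setup. As an illustration only (this isn't how Apple's Home app or widget is implemented, just a minimal sketch using the public HomeKit framework on iOS, with a hypothetical accessory name), toggling that living room lamp programmatically looks roughly like this:

```swift
import HomeKit

// Minimal sketch: find a lightbulb accessory by name and toggle its power state.
// "Living Room Lamp" is a hypothetical accessory name used purely for illustration.
final class LampToggler: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self // homes load asynchronously; wait for the callback below
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.primaryHome else { return }

        let power = home.accessories
            .first { $0.name == "Living Room Lamp" }?
            .services.first { $0.serviceType == HMServiceTypeLightbulb }?
            .characteristics.first { $0.characteristicType == HMCharacteristicTypePowerState }

        guard let power = power else { return }
        let isOn = (power.value as? Bool) ?? false
        power.writeValue(!isOn) { error in
            print(error == nil ? "Lamp toggled" : "Write failed: \(error!.localizedDescription)")
        }
    }
}
```

The point of the ecosystem, of course, is that I never have to write or even think about any of that: the widget, Siri, and the Home app handle the equivalent for me.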
Sometimes, of course, you have to tweak your tech to get it to work the way you want — especially if you don’t want to invest in an entirely new set of devices. One downside to using HomeKit is that the Nest products we have — a Nest Thermostat E, Nest Protect smoke alarm, Nest x Yale smart lock, Nest Hello doorbell, and two Nest Cam IQ outdoor cameras — don’t natively support it. These are all older products that predate the Matter standard, which ostensibly promises interoperability between smart home platforms. But we don’t want or need to replace any of it — however dated our Nest gear is, it all continues to work great, especially in the original Nest app.
Still, I wanted to get all of it to show up in the Home app because we’re mainly HomeKit users. For us, the solution came in the form of the Starling Home Hub. It’s a little box you connect to your network; once you link your Nest and HomeKit accounts, it exposes your thermostat and other Nest devices as “native” HomeKit products. That lets me ask Siri to lock the front door or adjust the thermostat, as well as control them using the aforementioned Home widget in Control Center.
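Starling doesn't publish its internals, but conceptually it acts as a HomeKit bridge: one accessory on the network that exposes many bridged accessories behind it. As a rough, hypothetical sketch of what that looks like from the HomeKit side (assuming a `home` object already loaded, as in the earlier example), an app could spot the bridge and read the bridged thermostat's temperature like this:

```swift
import HomeKit

// Sketch: spot bridge accessories (like a Nest-to-HomeKit hub) and read a bridged
// thermostat's current temperature. Assumes `home` is an HMHome that has already
// loaded, as in the previous example.
func inspectBridgedAccessories(in home: HMHome) {
    for accessory in home.accessories {
        if let bridged = accessory.uniqueIdentifiersForBridgedAccessories, !bridged.isEmpty {
            print("\(accessory.name) is a bridge exposing \(bridged.count) accessories")
        }
    }

    let thermostat = home.accessories.first { accessory in
        accessory.isBridged &&
        accessory.services.contains(where: { $0.serviceType == HMServiceTypeThermostat })
    }
    let currentTemp = thermostat?
        .services.first { $0.serviceType == HMServiceTypeThermostat }?
        .characteristics.first { $0.characteristicType == HMCharacteristicTypeCurrentTemperature }

    currentTemp?.readValue { error in
        if error == nil, let temp = currentTemp?.value as? NSNumber {
            print("Thermostat reads \(temp.doubleValue)°C") // HomeKit reports temperature in Celsius
        }
    }
}
```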
Arguably the best smart home gadget we have is the MyQ garage door opener. I added it a few years ago when MyQ maker Chamberlain made the HomeKit version (sadly, it was recently discontinued). I call it the best because, for many years, our garage door was opened using a keypad outside that was completely inaccessible to me, with small, washed-out markings that were hard to see and mushy buttons that were hard to press. The addition of the smart opener means I can open and close the garage with a single tap. (Unfortunately, as of this writing, the HomeKit integration is broken — it shows a persistent “no response” status message — but it’s still fully functional within the MyQ app on my phone and watch.)
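That single tap, as far as HomeKit is concerned, is just a write to the garage door opener's target door state characteristic. Here's a hedged sketch, again using the public HomeKit framework rather than anything MyQ-specific, with a hypothetical accessory name:

```swift
import HomeKit

// Sketch: the "single tap" boils down to writing the garage door opener's target
// door state. "Garage Door" is a hypothetical accessory name; assumes `home` is a
// loaded HMHome as in the earlier examples.
func toggleGarageDoor(in home: HMHome) {
    let targetState = home.accessories
        .first { $0.name == "Garage Door" }?
        .services.first { $0.serviceType == HMServiceTypeGarageDoorOpener }?
        .characteristics.first { $0.characteristicType == HMCharacteristicTypeTargetDoorState }

    guard let targetState = targetState else { return }
    let isClosed = (targetState.value as? Int) == HMCharacteristicValueDoorState.closed.rawValue
    let newState: HMCharacteristicValueDoorState = isClosed ? .open : .closed
    targetState.writeValue(newState.rawValue) { error in
        print(error == nil ? "Garage door command sent" : "Write failed: \(error!.localizedDescription)")
    }
}
```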
It’s not all roses, however. The biggest frustration is maintenance. Especially with HomeKit, there are occasions when devices show as “no response” for no discernible reason. When my network goes down or gets updated, it sometimes knocks the Starling Home Hub offline. But while playing IT support technician for my devices is annoying, it doesn’t erase the fundamental benefits all of these smart home devices add to my everyday life.
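For anyone curious how an app could surface those “no response” states instead of leaving you to discover them by surprise, HomeKit exposes a reachability flag and a delegate callback. A minimal sketch (not how Apple's Home app actually does it):

```swift
import HomeKit

// Sketch: watch for the dreaded "No Response" state by observing accessory
// reachability via the public HomeKit API.
final class ReachabilityWatcher: NSObject, HMAccessoryDelegate {
    func watch(_ home: HMHome) {
        for accessory in home.accessories {
            accessory.delegate = self // keep this watcher alive so callbacks arrive
            if !accessory.isReachable {
                print("\(accessory.name) is currently unreachable (shows as \"No Response\")")
            }
        }
    }

    // Fires whenever an accessory drops off the network or comes back.
    func accessoryDidUpdateReachability(_ accessory: HMAccessory) {
        print("\(accessory.name) is now \(accessory.isReachable ? "reachable" : "unreachable")")
    }
}
```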
It goes back to what I wrote at the outset about convenience and accessibility. From what I’ve read, the vast majority of reviewers and analysts see the smart home as made up of things you want but don’t need to live. This assumes that everybody uses technology in (mostly) the same way, and it’s just not true.
For a disabled person like me, being able to control light switches and garage doors from my devices makes my home more accessible. It instills greater feelings of agency and autonomy because I don’t need to ask for help with, say, turning on the lights. The emotional gains are just as important as the practical ones, and it’s for this reason that accessibility trumps convenience in this context. What’s convenient to you may be life-changing to me.
All of this is not to say smart home devices are perfect — the salient point is that it’s misguided to perpetually frame smart products as mere novelties that, in the case of the Alexa microwave, amount to tech for tech’s sake. It’s much more meaningful than that, but most people don’t have the foresight to consider other viewpoints.
Smart home tech has far greater resonance than sheer convenience. It can be accessible and empower everyone — profoundly so.
You may not know it, or you may not need them, but your Mac comes with a bevy of accessibility features that help make your computer more accessible if you have disabilities. Apple is well known for building best-in-class assistive technologies into all of its platforms — and the Mac, almost four decades old, is no exception. In fact, Apple has a knowledge base article all about the accessibility features of macOS.
When you explore the Accessibility pane in System Preferences, you’ll notice Apple has organized the system’s accessibility features into several domains: Vision, Hearing, Motor, and General. There’s also an Overview tab where Apple concisely summarizes what accessibility does for you. “Accessibility features adapt your Mac to your individual needs,” the copy reads. “Your Mac can be customized to support your vision, hearing, physical motor, and learning & literacy requirements.” Accessibility features are turned off by default, but you can visit System Preferences to enable anything you need or want. Most are accessible system-wide via a keyboard shortcut.
Let’s examine each category and its features.
Note: this article was written using a macOS Monterey system; the Ventura version of the operating system was still in beta at the time.
Under the Vision category, Apple lists VoiceOver, Zoom, Display, Spoken Content, and Descriptions.
VoiceOver, the award-winning screen reader, is arguably the canonical Apple accessibility feature, and it’s the one most users (and app developers) are familiar with. As you’d expect from a screen reader, VoiceOver allows people who are blind or have low vision to navigate their computer through spoken descriptions of what’s on screen. As you move through the Dock, for example, VoiceOver will say “Button, Mail” as your pointer hovers over the Mail icon. VoiceOver is deeply customizable as well; users can train it to recognize certain words, and the voice and speaking rate can be adjusted as desired.
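How well VoiceOver works in any given app depends on the metadata the developer exposes. As a small, hypothetical SwiftUI example (nothing from a real app), an icon-only button only announces something meaningful if it carries an accessibility label:

```swift
import SwiftUI

// Hypothetical example: an icon-only button. Without the accessibility label,
// VoiceOver has little to announce beyond "button"; with it, users hear something
// like "Compose, button" plus the hint.
struct ComposeButton: View {
    var body: some View {
        Button {
            // Start a new message (placeholder action).
        } label: {
            Image(systemName: "square.and.pencil")
        }
        .accessibilityLabel("Compose")
        .accessibilityHint("Creates a new message")
    }
}
```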
Zoom is pretty straightforward: turn it on and the interface is magnified. As with VoiceOver, Zoom can be customized to a considerable degree — you can zoom with a scroll gesture while holding a modifier key (such as Control or Option), and you can choose to zoom the full screen, a split screen, or a picture-in-picture window, among other options.
One notable feature in the Zoom section is Hover Text. After turning it on, users can hold Command (⌘) while the mouse is hovering over something (hence the name) to show a large-text view of the item. This is especially useful for reading the small print in System Preferences, for example. And yes, Hover Text is easily customizable — you can change the font type and colors of the text box in order to suit your visual needs.
The other three features under Vision are closely interrelated. Display offers a slew of options for more accessible ways to view the screen, such as increasing contrast and reducing transparency. Spoken Content lets you change the voice and speaking rate of the system voice; you can also toggle whether the system speaks announcements like notifications, items under the pointer, and more. Lastly, Descriptions lets you turn on audio descriptions for what Apple describes as “visual content in media.”
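These Display settings also matter on the developer side: well-behaved Mac apps check them and adapt. A short AppKit sketch of what that check might look like (the responses noted in the output are illustrative choices, not rules):

```swift
import AppKit

// Sketch: reading the user's Display accessibility preferences so an app can
// honor them.
func applyAccessibilityDisplayPreferences() {
    let workspace = NSWorkspace.shared

    if workspace.accessibilityDisplayShouldReduceTransparency {
        print("Reduce Transparency is on: swap translucent backgrounds for solid colors")
    }
    if workspace.accessibilityDisplayShouldIncreaseContrast {
        print("Increase Contrast is on: use a higher-contrast color palette")
    }
    if workspace.accessibilityDisplayShouldReduceMotion {
        print("Reduce Motion is on: disable or simplify animations")
    }
}
```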
There are three features under the Hearing category: Audio, RTT, and Captions.
The Audio section is pretty sparse, only giving the option of a screen flash when an alert comes in. Conceptually, this serves the same purpose as the flashing telephone we had in our house when I was growing up. My parents were both fully deaf, so every time the phone would ring, a lamp in the living room would flash (in addition to the usual ring I could hear), alerting them that the phone was ringing.
RTT, or real-time text, is a mode for calling deaf and hard of hearing people who use a TDD device, with text transmitted as it’s typed. TDDs make a unique sound, so it was easy to know when another TDD user was calling my parents; I would simply place the phone’s receiver on the TDD and tell my parents the call was for them. (Note: older Macs may not include the RTT feature.)
Finally, Captions allow users to customize the look and feel of the system-wide captions to suit their tastes.
The Motor category includes Voice Control, Keyboard, Pointer Control, and Switch Control.
Voice Control, introduced with much fanfare in macOS Catalina at WWDC 2019, allows you to control your entire Mac with just your voice, which is liberating for those who cannot use traditional input methods like a mouse and keyboard. You can choose to enable or disable specific verbal commands and even add specific vocabulary that you’d prefer to use.
Keyboard lists a slew of options for configuring how the keyboard behaves. For example, Sticky Keys (found in the Hardware tab) is helpful for those who cannot hold down modifier keys to perform keyboard shortcuts.

Pointer Control is analogous to Keyboard insofar as it allows customization of how the pointer behaves; its Alternate Control Methods tab helps you enable several useful options. For example, Enable alternate pointer actions lets you control your pointer with a separate switch or a facial expression, while Enable head pointer lets you use head movement.

Switch Control, like Voice Control, allows for hands-free operation of one’s computer using external buttons called switches. Apple sells a variety of Mac-compatible switches on its website.
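On the developer side, an app can also check whether some of these assistive technologies are running and adapt its interface accordingly, for example by enlarging hit targets or avoiding hover-only controls. A brief AppKit sketch using the status properties NSWorkspace provides:

```swift
import AppKit

// Sketch: checking whether VoiceOver or Switch Control is currently running, so an
// app can adapt its UI.
func logAssistiveTechnologyStatus() {
    let workspace = NSWorkspace.shared
    print("VoiceOver running: \(workspace.isVoiceOverEnabled)")
    print("Switch Control running: \(workspace.isSwitchControlEnabled)")
}
```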
General consists of two features: Siri and the Accessibility Shortcut.
Under Siri, Apple gives users the ability to enable Type to Siri, which allows users who are Deaf or have a speech delay, for example, to interact with Siri in a Messages-style interface.
The Accessibility Shortcut is straightforward. Using a keyboard shortcut (Option-Command-F5), you get a pop-up menu that lets you invoke whichever accessibility feature you choose. It’s also possible to select more than one feature for the shortcut to control.
One important thing to note about all the macOS accessibility features is their place in the broader Apple ecosystem. Most of them are available on one (or more) of Apple’s other platforms, like iOS, iPadOS, and tvOS. This is notable from an accessibility perspective because of its consistency.
For those with certain cognitive conditions who move between devices, the consistency of the accessibility features across platforms makes for a more comforting, predictable experience. A person knows what to expect and how to use certain things, which goes a long way in shaping a positive user experience when regularly jumping from device to device.
Update July 11th, 2022, 3:15PM ET: Updated to add a note that this was written about macOS Monterey.