Apple has a habit of using WWDC, where it announces new software for its fleet of platforms, to quietly lay the groundwork for long-term products. In 2014, Apple announced the HealthKit and Size Classes APIs in iOS 8, foreshadowing the original Apple Watch and the larger iPhone 6 and 6 Plus models, respectively. A year later, the company brought multitasking to the iPad in iOS 9, which foreshadowed the first iPad Pro. Three years ago, it brought ARKit to iOS 11, which began the march toward all things augmented reality. And most recently, the visual redesign of macOS Big Sur has prompted speculation that the iOS-like update is a harbinger of future touchscreen Macs.
Apple’s work on ARKit, which includes the Measure app added in iOS 12, has accelerated since 2017. At the hardware level, the revised iPad Pro released last spring features a LiDAR scanner, which is useful for AR-centric applications. (The next iPhone Pro models will have it too, according to Bloomberg’s Mark Gurman.) At this WWDC, however, the company dropped its strongest hints yet that the tech titan’s next big thing is on the horizon. As the clock ticks ever closer to the unveiling of the rumored “Apple Glass” running “rOS” that lurks behind the Cupertino curtain, it’s worth noting that the most significant clues to date come from an unlikely but not insignificant source: accessibility.
Consider the improvements VoiceOver has received. The critically acclaimed screen reader has been bolstered with more powerful machine learning capabilities. Apple has made VoiceOver far smarter at recognition: text, objects, images on the web, and other imagery. Image descriptions, known as “alternative text” in the blind and low vision community, have immense ramifications for augmented reality.
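To make the alt-text idea concrete, here is a minimal Swift sketch of the fallback a screen reader performs between developer-supplied alternative text and a machine-generated description (as iOS 14’s VoiceOver Recognition does). All type and function names here are hypothetical illustrations, not Apple’s actual API:

```swift
// Hypothetical model of an on-screen image as a screen reader might see it.
struct ImageElement {
    var developerLabel: String?        // alt text supplied by the app's developer
    var generatedDescription: String?  // description inferred by on-device ML
}

// Prefer the developer's alt text; fall back to the ML-generated description,
// hedged as "possible" the way VoiceOver hedges its own guesses.
func spokenDescription(for image: ImageElement) -> String {
    if let label = image.developerLabel, !label.isEmpty {
        return label
    }
    if let generated = image.generatedDescription {
        return "Possible description: \(generated)"
    }
    return "Image"
}

print(spokenDescription(for: ImageElement(developerLabel: "Golden Gate Bridge at sunset",
                                          generatedDescription: nil)))
// prints "Golden Gate Bridge at sunset"
print(spokenDescription(for: ImageElement(developerLabel: nil,
                                          generatedDescription: "a dog on a beach")))
// prints "Possible description: a dog on a beach"
```

The same priority order (human-authored label first, machine guess second) is what would let an AR device narrate unlabeled real-world scenes without overriding content a developer has already described.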
It’s not hard to extrapolate the logic. If blind and low vision people are expected to participate in an AR-enhanced future, as they should be, because inclusion is precisely what accessibility is meant to create, then screen readers, zoom functions and other kinds of assistive technologies will need to be built to handle this new paradigm. Apple does nothing on a whim; everything it does is a long-term play, even if that long term is far off. In other words, VoiceOver’s embrace of machine learning is genuinely useful today, but its real endgame is tomorrow. When exactly is tomorrow? Only Apple knows, and it obviously isn’t saying. Its institutional penchant for secrecy and surprise permits only one mantra: announce when ready.
Consider, too, Group FaceTime, which now recognizes sign language. It isn’t hard to imagine a scenario in which you meet a Deaf person on the street (in a post-pandemic world, mind you) and need to talk with them. Your AR-powered smart glasses could simply alert you to their preference to sign. Perhaps over the next couple of years Apple will push the new iOS 14 Translate app even further and provide real-time translation for people who don’t know sign language. Perhaps it will do the reverse: take someone’s spoken words and caption them for a Deaf or hard-of-hearing user wearing Apple Glass. There is a reason the daily coronavirus press briefings held by governors such as California’s Gavin Newsom include ASL interpreters; language translation is a form of accessibility. It’s easy to envision that concept applying to an augmented reality device.
Finally, it’s theoretically conceivable that Apple could adapt the Back Tap feature, also new in iOS 14, to a pair of glasses. Perhaps one or two taps on the frame could trigger an accessibility shortcut, as Back Tap (and the iPhone’s side button) does today. Perhaps a press could change a view in the user interface or scroll through menus, who knows. The actual implementation is immaterial, especially since prototypes of this mythical product are literally invisible to everyone save Tim Cook, Alan Dye, Craig Federighi and other high-ranking Apple Park personnel. The point is that Back Tap’s conceptual underpinnings could eventually be reapplied elsewhere.
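The conceptual underpinning in question is simply a remappable table from gestures to actions. A minimal Swift sketch of how a Back Tap-style binding might be modeled on a hypothetical glasses frame (every name here is illustrative; none of it is real Apple API):

```swift
// Hypothetical tap gestures a glasses frame might detect.
enum TapGesture: Hashable {
    case doubleTap, tripleTap
}

// Hypothetical actions a user could bind to those gestures.
enum ShortcutAction: String {
    case toggleVoiceOver = "Toggle VoiceOver"
    case zoom = "Zoom"
    case none = "No action"
}

// A user-configurable mapping, mirroring how Back Tap lets users
// assign actions to double and triple taps on the iPhone's back.
struct TapShortcutMap {
    private var bindings: [TapGesture: ShortcutAction] = [:]

    mutating func bind(_ gesture: TapGesture, to action: ShortcutAction) {
        bindings[gesture] = action
    }

    func action(for gesture: TapGesture) -> ShortcutAction {
        bindings[gesture] ?? .none
    }
}

var shortcuts = TapShortcutMap()
shortcuts.bind(.doubleTap, to: .toggleVoiceOver)
shortcuts.bind(.tripleTap, to: .zoom)
print(shortcuts.action(for: .doubleTap).rawValue)  // prints "Toggle VoiceOver"
```

The value of the pattern is that the gesture source is interchangeable: the same table could be fed by taps on a phone’s back, a watch crown, or a temple arm.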
Even at the macro level, iOS UI conventions, such as iOS 14’s new widgets, compact UI, and “card” metaphor, seem perfectly suited to a conceivable augmented reality operating system. It makes sense to build them now, knowing that whatever comes next will likely share these motifs, and to “dogfood” the investment in shipping software. Apple controls the whole “stack,” after all. To reiterate an earlier point, Apple is not reactionary: it builds hardware and software not only for the present, but for the future.
Ultimately, Apple’s planning and foreshadowing matter greatly from an accessibility perspective. Particularly for wearable devices such as Apple Watch, AirPods and the rumored Apple Glass, the marketing message is incomplete without accessibility. Wearing something on your face is a very different experience, both functionally and tactilely, from wearing something on your wrist or in your ears. Apple will have to give us compelling reasons to buy a computer for our face, which is no easy task. Its challenge is more complex still: Apple will also have to convince disabled people to buy a computer for their face by explaining how and why it makes navigating the real world more accessible. Again, rest assured that Apple is keenly aware of this and is planning accordingly.
In the meantime, it behooves any industry observer to watch Apple skate to where the puck is going next. The clues have always been there, but this year they became more conspicuous than ever. They just masquerade as useful accessibility features in the here and now.
Steven is a freelance tech journalist based in San Francisco, California. He covers accessibility and assistive technology. What makes his reporting unique is that he is disabled himself; this first-hand perspective gives him instant credibility and allows him to write authoritatively about assistive technologies. Steven’s work has appeared in publications such as iMore, TechCrunch and Macworld. He has also appeared on podcasts, NPR and Cheddar TV.