Apple’s software is generally pretty good, even as the company has spread its focus across more platforms than ever before: macOS, iOS, iPadOS, tvOS, watchOS, whatever software Apple makes for its perhaps-one-day car, and AR/VR.
And it’s been a while since Apple Maps screwed up; arguably the company’s biggest recent misstep was putting the Safari URL bar on the wrong part of the screen, according to a report from The Verge.
And over the past couple of years, the company’s software announcements at WWDC have been iterative and almost entirely additive. Last year’s iOS announcements, for example, amounted to some FaceTime quality improvements and a few new kinds of IDs that work in Apple Wallet. Beyond that, Apple has mostly rolled out new settings menus: new notification controls, Focus mode options, privacy tools, that kind of thing.
Apple is also one of tech’s great fast followers in software, remarkably quick to adopt and refine other companies’ new ideas. Apple devices are as feature-packed, long-lasting, stable, and usable as anything you’ll find anywhere.
Many companies try to reinvent everything all the time for no reason and end up creating problems where they didn’t exist. Apple is nothing if not a ruthlessly efficient machine, and this machine is hard at work sharpening every pixel its hardware makes.
But we’re at an inflection point in technology that will demand more from Apple. It’s becoming increasingly clear that AR and VR are the next big thing for Apple, the next supposedly earth-shaking platform after the smartphone.
Apple isn’t likely to show off a headset at WWDC, but with virtual reality and augmented reality coming into our lives, everything about how we experience and interact with technology must change.
Apple has been showing off augmented reality for years, but so far all it has shown are demos: things you can see or do through the phone’s camera.
According to the report, the company has said little about how it thinks augmented reality devices will actually work and how we will use them. It will need new input devices, new hardware, and a new software model to match. That’s what we may start to see this year at WWDC.
Last year, Apple announced Live Text, which lets an iPhone automatically recognize and capture any text it sees through the camera, an AR-adjacent feature that uses the camera and machine learning to understand and categorize information in the real world.
The entire tech industry thinks this is the future. It’s what Google is doing with Maps and Lens, and what Snapchat is doing with its Lenses and filters. Apple needs a lot more where Live Text came from.
From a pure user interface perspective, what AR demands is a more efficient system for getting information and getting things done. No one will wear augmented reality glasses that send them Apple Music ads and news notifications every six minutes, and full-screen apps that require your undivided attention will increasingly become a thing of the past.
“Use your phone without getting lost in your phone” looks like it will be a theme at this year’s WWDC. According to Bloomberg’s Mark Gurman, we may see an iOS lock screen that displays useful information without requiring you to unlock your phone.
A more glanceable iPhone seems like an excellent idea and a good way to stop people from unlocking their phones to check the weather only to find themselves deep in a TikTok hole three and a half hours later. The same goes for interactive widgets, which are rumored to let you do basic tasks without having to open an app. And if Focus mode gets some rumored improvements, especially if Apple can make it easier to set up and use, it could be a genuinely useful tool on your phone and an absolutely essential one on AR glasses.
AR will demand software that offers more but also gets out of the way more
Apple is also expected to continue bringing its devices closer together in what they do and how they do it, in an effort to make its whole ecosystem more usable. Nearly the entire lineup of Macs and iPads now runs on Apple’s M-series chips, and perhaps the whole line will if the long-awaited Mac Pro arrives at WWDC; there’s no reason these devices can’t share even more DNA.
Universal Control, which was probably the most exciting announcement of iOS 15 even if it didn’t ship until February, is a good example of how Apple plans to treat its many screens as parts of a single ecosystem.
If iOS 16 brings true multitasking to the iPad, then an iPad in a keyboard case is essentially a Mac; Apple now seems to embrace a convergence it used to avoid. And if Apple sees all of these devices as companions and accessories to a pair of augmented reality glasses, it will need them all to work well together.
The last time Apple had a truly new idea about how we use gadgets was in 2007, when the iPhone launched. Since then, the industry has mostly taken the iterative route, improving and tweaking without deviating from the basics of multitouch. But augmented reality will break all that; it couldn’t work otherwise. That’s why companies are working on neural interfaces, trying to master gesture control, and trying to figure out how to display everything from translated text to maps and games on a tiny screen in front of your face.
Meta is already shipping and selling its best ideas, and Google is rolling out its own in features like Lens. Apple needs to start showing the world how it thinks about the future of augmented reality. Headset or no headset, this will be the story of WWDC 2022.