Apple’s biggest developer event, the Worldwide Developers Conference 2019, is only a few months away, and it is expected to reveal the next major iOS update. iOS 13 is reportedly introducing many new features and improvements, and one of the biggest is said to involve the company’s augmented reality platform, ARKit.

Apple’s augmented reality push looks set to get some very interesting upgrades, which will be revealed at WWDC 2019 this coming June 3-7. Predictions are strong that the tech giant’s ARKit will gain a brand new Swift-only framework for AR. Furthermore, reports claim that a companion app that lets developers create AR experiences visually will be added to the platform.

More recent reports add that ARKit will gain the ability to detect human poses. For game developers, the OS will support controllers with touch pads as well as stereo AR headsets.
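Nothing official has been shown yet, but if pose detection follows ARKit's existing configuration-and-anchor pattern, adoption could look roughly like the sketch below. The ARBodyTrackingConfiguration and ARBodyAnchor names are assumptions modeled on ARKit's naming conventions, not confirmed API.

```swift
import ARKit

final class BodyTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking would likely need recent hardware, so check support first.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    // Pose updates would plausibly arrive as anchors carrying a tracked skeleton.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            if let head = bodyAnchor.skeleton.modelTransform(for: .head) {
                print("Head position:", head.columns.3)
            }
        }
    }
}
```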

Apple has offered ARKit, its framework for augmented reality applications and games, since iOS 11. Until now, however, it has focused primarily on mixed reality experiences viewed through an iPhone or iPad.

With iOS 12, revealed last year, Apple launched ARKit 2.0. That major update included multi-user support, which allows two people to interact with the same augmented reality experience together.
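Under the hood, that shared experience works by capturing one device's ARWorldMap and having the other device relocalize against it, both of which shipped with ARKit 2. A condensed sketch (the transport between devices, such as MultipeerConnectivity, is left to the app):

```swift
import ARKit

// Host: capture the session's world map and hand it to whatever transport
// the app uses to reach the second device.
func shareWorldMap(from session: ARSession, send: @escaping (Data) -> Void) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(
                  withRootObject: map, requiringSecureCoding: true) else { return }
        send(data)
    }
}

// Guest: relocalize into the received map so both devices share one
// coordinate space and therefore see anchors in the same places.
func joinSharedExperience(_ session: ARSession, mapData: Data) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARWorldMap.self, from: mapData) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```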

However, the end-game for ARKit might be smart glasses of some sort. Rumors are rife that Apple is currently developing its first AR glasses. The upcoming wearable is also said to deliver 3D AR graphics, with each eye being shown a slightly different, offset view of the virtual world. People might have to wait a little longer, though, as the glasses are expected to come out in 2020.

The Upcoming AR Glasses

Even before Apple makes an official announcement about its rumored AR glasses, leaks about the wearable’s features have already been published.

A previous report revealed that the Apple AR glasses will offer effortless visual guidance, such as leading users along an unfamiliar street or through a recipe, as well as tools that enable safer driving. A patent has also been published showing how the glasses could use biometric and light sensors to track detailed face movements and anatomical gestures.

The gestures that the sensors could detect include chewing, blinking, winking, smiling, eyebrow raising, jaw motion, mouth opening, and head movements. As for biometrics, they are said to let users browse through holographic photo albums.

It’s been two years since Apple was first rumored to be building its own AR glasses. Will they really arrive in 2020? (Source: ZoneofTech)

Augmented reality on Apple devices has always been viewed through an iPad or iPhone. But reports have it that putting on a set of special AR glasses and viewing augmented content hands-free has long been a key part of Apple’s roadmap.

Other iOS 13 Updates

Apple’s iOS 13 won’t stop at ARKit, though. It will also include new developments in AI and developer APIs.

New Siri Intents

According to iPhoneHacks, the new version of Siri will support simpler commands. New intent domains will include media playback, event ticketing, voice calling, message attachments, extracting flight, airport gate, and seat information, and searching train trips.
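If the media-playback commands arrive through SiriKit's established intent-handler pattern, adopting them could look something like this minimal sketch. INPlayMediaIntent already exists in the Intents framework; assuming it is the vehicle for the new commands is this article's guess.

```swift
import Intents

// Handler for the media-playback intent, registered from the app's
// Intents extension.
final class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp asks the system to launch the app in the background
        // and let it start playback itself.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```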

Marzipan Improvements

Developers porting their iOS apps to the Mac will have access to new APIs that allow their UIKit apps to integrate with Mac-specific features like the Touch Bar and the menu bar. UIKit apps on the Mac will also be able to open multiple windows.
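A sketch of what that could look like, assuming the menu bar is exposed through a UIKit menu-builder hook and extra windows are requested through a scene-session API; both names below are assumptions on this article's part:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // On the Mac, UIKit would forward menu construction to the app so it
    // can extend the standard macOS menu bar.
    override func buildMenu(with builder: UIMenuBuilder) {
        super.buildMenu(with: builder)
        guard builder.system == .main else { return }
        let newWindow = UIKeyCommand(title: "New Window",
                                     action: #selector(newWindow(_:)),
                                     input: "n",
                                     modifierFlags: .command)
        builder.insertSibling(UIMenu(title: "Windows", children: [newWindow]),
                              afterMenu: .file)
    }

    // Requesting a new scene session is how a UIKit app would open a
    // second window.
    @objc func newWindow(_ sender: Any?) {
        UIApplication.shared.requestSceneSessionActivation(
            nil, userActivity: nil, options: nil, errorHandler: nil)
    }
}
```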

Split View apps will also gain the ability to be resized by dragging the divider, and to reset the divider’s position by double-clicking it, just like native Mac apps.

Other AI and API updates will also be available in iOS 13. (Source: Pixabay)

Taptic Engine, Links, NFC

Developers will finally get more control over the Taptic Engine, which currently offers only a small set of feedback styles; the update will add a new haptics framework to the platform. There will also be new functionality for developers to include link previews in their apps, similar to those that appear in iMessage conversations.
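If the new haptics framework takes a pattern-based shape, playing a single custom tap might look like the sketch below; the CHHaptic-prefixed names are assumptions in Apple's usual naming style.

```swift
import CoreHaptics

// Play one sharp transient tap with explicit intensity and sharpness,
// the kind of fine-grained control the current API lacks.
func playTap() throws {
    let engine = try CHHapticEngine()
    try engine.start()
    let event = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0)
    let pattern = try CHHapticPattern(events: [event], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```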

NFC will also see a major improvement, giving third-party developers the ability to read any ISO 7816, FeliCa, or MiFare tag.
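Core NFC today is limited to NDEF messages, so one plausible shape for the expanded support is a tag-level reader session like this sketch; NFCTagReaderSession and the tag cases are assumed names:

```swift
import CoreNFC

final class TagReader: NSObject, NFCTagReaderSessionDelegate {
    var session: NFCTagReaderSession?

    func begin() {
        // Poll for all three rumored tag families at once.
        session = NFCTagReaderSession(pollingOption: [.iso14443, .iso18092, .iso15693],
                                      delegate: self, queue: nil)
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}
    func tagReaderSession(_ session: NFCTagReaderSession,
                          didInvalidateWithError error: Error) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let tag = tags.first else { return }
        session.connect(to: tag) { _ in
            switch tag {
            case .miFare(let miFare):
                print("MiFare UID:", miFare.identifier as NSData)
            case .feliCa(let feliCa):
                print("FeliCa system code:", feliCa.currentSystemCode as NSData)
            case .iso7816(let iso7816):
                print("ISO 7816 AID:", iso7816.initialSelectedAID)
            default:
                break
            }
            session.invalidate()
        }
    }
}
```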

The new version of Core ML will allow developers to update their machine learning models on-device. Currently, models have to be pre-trained and remain mostly static after deployment. The upcoming version will let apps change their behavior as their ML models learn from user actions.
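If on-device updating is exposed as a task that retrains an updatable model against fresh examples, usage might look like this; MLUpdateTask is an assumed name, and the batch of training examples is a placeholder the app would assemble itself:

```swift
import CoreML

// Retrain an updatable model against examples gathered on-device, then
// persist the personalized result for the next launch.
func personalizeModel(at modelURL: URL, with examples: MLBatchProvider) throws {
    let task = try MLUpdateTask(forModelAt: modelURL,
                                trainingData: examples,
                                configuration: nil) { context in
        try? context.model.write(to: modelURL) // overwrite in place (illustrative)
    }
    task.resume()
}
```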

The Vision framework will get a built-in image classifier, removing the need for developers to embed their own machine learning model just to classify images into common categories.
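Such a classifier would presumably slot into Vision's existing request-and-handler pattern, along these lines (VNClassifyImageRequest is an assumed name):

```swift
import Vision

// Classify an image into Vision's built-in taxonomy; no bundled model needed.
func classify(imageAt url: URL) throws -> [VNClassificationObservation] {
    let request = VNClassifyImageRequest()
    try VNImageRequestHandler(url: url, options: [:]).perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations.filter { $0.confidence > 0.5 } // keep likely labels only
}
```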

Other updates include the document scanning functionality found in first-party iOS apps such as Notes becoming available to third-party developers through a new public framework. Another new API will let apps capture photos from external devices, such as cameras and SD cards, without having to go through the Photos app.
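If the scanner ships as a drop-in view controller, adoption could be as simple as the sketch below; VNDocumentCameraViewController and its delegate are assumed names in Apple's usual style:

```swift
import UIKit
import VisionKit

final class ScanViewController: UIViewController, VNDocumentCameraViewControllerDelegate {
    func presentScanner() {
        guard VNDocumentCameraViewController.isSupported else { return }
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        present(scanner, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        // Each page would arrive cropped and perspective-corrected.
        for page in 0..<scan.pageCount {
            let image = scan.imageOfPage(at: page)
            print("Scanned page \(page):", image.size)
        }
        controller.dismiss(animated: true)
    }
}
```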

On the Mac side, apps will be able to offer file provider extensions, which is said to improve how services such as Dropbox integrate with Finder. A new API will also let developers write device drivers.

After Apple unveils iOS 13, tvOS 13, macOS 10.15, and watchOS 6 at WWDC 2019, developers will get access to the first betas immediately. Public betas will follow later, and the final versions should be released to consumers in September.
