Apple shifts into high gear: every team is working on Mixed Reality. It has been known for years that Apple has great ambitions in augmented reality (AR) and mixed reality (MR): the dedicated headset is expected to be presented in June, and the device itself is only the “tip of the iceberg” of a growing effort that involves most of the company’s business units, divisions and teams.
The US site Computerworld points out that with augmented reality Apple does not intend to present a single product but an entire ecosystem, and achieving that goal requires the collaboration of all the teams in Cupertino working on software development and services.
In addition to ensuring the robustness of the operating system, the headset will have to work with all of Apple’s other devices, applications and services, no small task for the various development teams.
As for the hardware, rumors suggest that Apple’s AR/VR headset will rely on an internally designed chip, another element the company has been refining for years. It has continued to improve the Neural Engine, the part of its modern SoCs dedicated to executing complex calculations and artificial intelligence algorithms, which is increasingly specialized in computer vision, with analytical capabilities approaching those of a flesh-and-blood observer.
Optical components and imaging technologies (hardware and software) are essential for carrying out complex tasks, providing the seamlessness that lets users feel that sort of “magic” when using certain devices, without ever noticing the complexity hidden behind it.
The teams involved in developing Apple’s AR/VR headset also include those dedicated to Accessibility, which work on integrating technologies such as voice control, detection of doors and other elements around the user, and tracking and interpretation of hand and finger gestures, all key elements of the experience Apple intends to offer.
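To give a concrete idea of the kind of hand and finger tracking mentioned above, here is a minimal sketch using the hand-pose API that Apple already ships in the Vision framework (VNDetectHumanHandPoseRequest) to recognize a simple pinch gesture in a single camera frame. The confidence and distance thresholds are illustrative assumptions, not values Apple has disclosed for the headset.

```swift
import Foundation
import Vision
import CoreGraphics

// Minimal sketch: detect a "pinch" gesture from one frame using the Vision
// framework's existing hand-pose request. Thresholds below are illustrative.
func detectPinch(in image: CGImage) throws -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let hand = request.results?.first else { return false }

    // Joint positions are returned in normalized image coordinates (0...1).
    let points = try hand.recognizedPoints(.all)
    guard let thumbTip = points[.thumbTip],
          let indexTip = points[.indexTip],
          thumbTip.confidence > 0.5, indexTip.confidence > 0.5 else {
        return false
    }

    // A small thumb-to-index distance is interpreted as a pinch (assumed threshold).
    let distance = hypot(thumbTip.location.x - indexTip.location.x,
                         thumbTip.location.y - indexTip.location.y)
    return distance < 0.04
}
```

In a real app the same request would typically run on a live video feed (for example, frames from AVCaptureSession), with the resulting gestures mapped to system or accessibility actions.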