Today, the US Patent & Trademark Office published a patent application from Apple that covers a method for providing information on the behavior of a user with respect to at least one reference object, in particular a virtual reference object, via a network from smart glasses or an MR HMD to a second device such as a Mac or an iPhone/iPad. The invention also covers a system for providing information on the behavior of a user in a particular virtual scene. The invention is applicable to mixed reality environments, which could be used in games, flight simulation training and much more.
Mobile HMD gaze data transmission
Apple’s invention applies in particular to the field of virtual reality and eye tracking systems. Virtual reality can advantageously be used for a wide variety of different applications.
Besides games and entertainment, virtual reality, especially in combination with eye tracking, can also be used for market research, scientific research, people training, etc. For example, eye tracking data can advantageously provide information about where a user currently experiencing the virtual environment is looking in that environment. For market research, one could use a virtual environment such as a virtual supermarket in combination with eye tracking to analyze which of the virtual objects presented there attract more or less user attention.
In addition, the combination of the virtual environment and eye tracking can be used for training purposes, for example by simulating a virtual training situation, such as a flight simulator or a vehicle simulator, and using the captured eye tracking data to analyze whether the user looked at the right important objects or instruments, or was attentive, or tired, etc. Especially in such situations, it would be very desirable to be able to share such a virtual reality user experience with third parties, such as an observer, an instructor or a supervisor, who wishes to observe or analyze the user’s behavior and interaction with the virtual environment, or to give instructions, advice or recommendations to the user who is currently experiencing the virtual environment.
Apple’s patent FIG. 2 below is a schematic illustration of a system for providing information on a user’s behavior with respect to a reference object via a network from a first device to a second device.
In one example, Apple notes that when the user associated with the first device (smartglasses #14) moves around and interacts with a known virtual environment, which is displayed as the VRS virtual scene, for example when playing a game or walking through a virtual supermarket, it is enough to provide information about the current state of the user to the second device (#16 Mac / computer) to recreate the user experience on the second device. The recreation may also be intentionally altered, for example by increasing or decreasing the resolution in the region of the VRS virtual scene that includes the user’s current gaze point. In a static, interactive virtual environment, the known component is the virtual environment itself; the unknown component is how the user moves around and interacts with it.
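The idea described above — transmitting only a compact user state rather than rendered video, then recreating the known scene on the second device with gaze-dependent detail — can be sketched roughly as follows. This is a minimal illustration, not Apple's actual implementation; all class names, fields and the resolution falloff function are assumptions made for the sketch.

```python
import json
import math
from dataclasses import dataclass, asdict

# Hypothetical sketch: the first device (the HMD) sends only a compact
# "user state" (head pose + gaze point). The second device, which already
# knows the static virtual scene, uses it to recreate the user's view.

@dataclass
class UserState:
    head_position: tuple      # (x, y, z) in scene coordinates
    head_orientation: tuple   # quaternion (w, x, y, z)
    gaze_point: tuple         # (x, y), normalized screen coordinates

def encode_state(state: UserState) -> bytes:
    """Serialize the user state for transmission over the network."""
    return json.dumps(asdict(state)).encode("utf-8")

def decode_state(payload: bytes) -> UserState:
    """Rebuild the user state on the receiving (second) device."""
    d = json.loads(payload.decode("utf-8"))
    return UserState(tuple(d["head_position"]),
                     tuple(d["head_orientation"]),
                     tuple(d["gaze_point"]))

def region_resolution_scale(region_center, gaze_point, falloff=2.0):
    """Gaze-dependent detail: full resolution at the gaze point,
    progressively reduced resolution farther away (the 'intentional
    alteration' of the recreation mentioned in the patent)."""
    dx = region_center[0] - gaze_point[0]
    dy = region_center[1] - gaze_point[1]
    return 1.0 / (1.0 + falloff * math.hypot(dx, dy))
```

A state payload like this is only a few dozen bytes per update, which is why sending user state and re-rendering locally is far cheaper than streaming video of the first device's display.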
For details, see Apple patent application number US 20220404916 A1.
Apple inventors
- Tom Sengelaub: Senior Engineering Manager – Computer Vision
- Julia Benndorf: Software Engineer
- Marvin (Vogel) Klinkhardt: Computer Vision Engineer
The three inventors came to Apple when SMI SensoMotoric Instruments GmbH was acquired by Apple in 2017. SMI was a world leader in eye tracking technology. In 2015, SMI introduced its first eye-tracking smart glasses at SIGGRAPH, as shown in the video below.