Apple has Won a Patent for Immersive Video Streaming for their Future Mixed Reality Headset - Patently Apple


Today the U.S. Patent and Trademark Office officially granted Apple a patent that relates to streaming immersive video content for presentation to a user wearing a head mounted device.

Immersive Video Streaming

According to Apple, immersive video content can be presented to a user in three dimensions using a wearable display device, such as a virtual reality headset or an augmented reality headset. Further, different portions of the immersive video content can be presented to a user, depending on the position and orientation of the user's body and/or the user's inputs.

Apple's patent FIG. 1 below shows an example system #100 for presenting immersive video content to a user #102. The system 100 includes a video content source #104 communicatively coupled to a wearable display device #106 via a network #108.

The wearable display device can be any device that is configured to be worn by a user and to display visual information to the user. As an example, the wearable display device can be a wearable headset, such as a virtual reality headset, an augmented reality headset, a mixed reality headset, or a wearable holographic display.

(Apple patent FIGS. 1, 2A and 2B)

Apple's patent FIG. 2A above is a diagram of an example viewport for presenting immersive video content; FIG. 2B is a diagram of example degrees of freedom of motion of a user's body.

Apple's HMD Viewport

Further to patent FIG. 2A, Apple notes that immersive video content #200 can include visual information that can be presented according to a range of viewing directions and/or viewing locations with respect to a user. A viewport #202 can be selected to present a portion of the immersive video content to the user (e.g., based on the position and/or orientation of the user's head) to give the user the impression that they're viewing the visual information according to a particular field of view and/or viewing perspective.

Further, the viewport can be continuously updated based on the user's movements to give the user the impression that they're shifting their gaze within a visual environment.
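
The patent describes this behavior in general terms, but the basic idea of mapping a head orientation to a crop of the immersive frame can be sketched in a few lines of Swift. This is a minimal sketch only, assuming an equirectangular 360° frame and head orientation given as yaw/pitch angles in radians; the function, type, and parameter names are illustrative and do not come from the patent.

```swift
import Foundation

/// Hypothetical viewport for an equirectangular 360° frame, in pixels.
struct Viewport {
    var x: Int
    var y: Int
    var width: Int
    var height: Int
}

func selectViewport(frameWidth: Int,
                    frameHeight: Int,
                    headYaw: Double,                 // rotation about the vertical axis
                    headPitch: Double,               // rotation about the lateral axis
                    horizontalFOV: Double = .pi / 2, // assumed 90° field of view
                    verticalFOV: Double = .pi / 3) -> Viewport {
    // Map yaw in [-π, π] to a horizontal center and pitch in [-π/2, π/2]
    // to a vertical center within the equirectangular frame.
    let centerX = (headYaw + .pi) / (2 * .pi) * Double(frameWidth)
    let centerY = (Double.pi / 2 - headPitch) / .pi * Double(frameHeight)

    let w = horizontalFOV / (2 * .pi) * Double(frameWidth)
    let h = verticalFOV / .pi * Double(frameHeight)

    // Wrap horizontally; clamp vertically so the crop stays inside the frame.
    let rawX = centerX - w / 2
    let wrappedX = (rawX.truncatingRemainder(dividingBy: Double(frameWidth)) + Double(frameWidth))
        .truncatingRemainder(dividingBy: Double(frameWidth))
    let clampedY = max(0, min(Double(frameHeight) - h, centerY - h / 2))

    return Viewport(x: Int(wrappedX), y: Int(clampedY), width: Int(w), height: Int(h))
}
```

In a streaming scenario, a selection like this is also what lets the player fetch or decode only the tiles of the frame the wearer is actually looking at, rather than the full panorama.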

The headset's sensors can also be configured to detect the position and/or orientation of a user's head in multiple dimensions. For example, referring to FIG. 2B, a Cartesian coordinate system can be defined such that the x-axis, y-axis, and z-axis are orthogonal to one another and intersect at an origin point O (e.g., corresponding to the position of the user's head).

The sensors (#120, FIG. 1) can detect the user translating her head along one or more of these axes and/or rotating her head about one or more of these axes (e.g., according to six degrees of freedom, 6DoF).

For example, the sensors can detect when a user translates their head in a forward or backward direction (e.g., along the x-axis), sometimes referred to as a "surge" motion. As another example, the sensors can detect when a user translates their head in a left or right direction (e.g., along the y-axis), sometimes referred to as a "sway" motion. As another example, the sensors can detect when a user translates their head in an upward or downward direction (e.g., along the z-axis), sometimes referred to as a "heave" motion.

As another example, the sensors can detect when a user rotates their head about the x-axis, sometimes referred to as a "roll" motion. As another example, the sensors can detect when a user rotates their head about the y-axis, sometimes referred to as a "pitch" motion.

As another example, the sensors can detect when a user rotates her head about the z-axis, sometimes referred to as a "yaw" motion.
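
Taken together, those six motions make up a full 6DoF head pose. A minimal Swift sketch of such a pose, using the surge/sway/heave and roll/pitch/yaw labels from the patent text, might look like the following; the type and field names are assumptions for illustration, not anything Apple specifies.

```swift
/// Hypothetical representation of the six degrees of freedom described above,
/// relative to the origin point O in FIG. 2B.
struct HeadPose {
    // Translations along each axis, in meters.
    var surge: Double   // forward/backward along the x-axis
    var sway: Double    // left/right along the y-axis
    var heave: Double   // up/down along the z-axis

    // Rotations about each axis, in radians.
    var roll: Double    // rotation about the x-axis
    var pitch: Double   // rotation about the y-axis
    var yaw: Double     // rotation about the z-axis
}

// Example: a pose captured after the wearer leans slightly forward
// and turns their head to the left.
let pose = HeadPose(surge: 0.05, sway: 0.0, heave: 0.0,
                    roll: 0.0, pitch: 0.0, yaw: 0.4)
```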

Developers and/or engineers can dive deeper into the details of this invention in Apple's granted patent US 11570417 B2.

Apple Inventors

  • Fanyi Duanmu: Video Coding and Processing Engineer
  • Jun Xin: Engineering Manager, Video Coding and Processing
  • Xiaosong Zhu: Senior Software QA Engineer (many years of experience in the broadcast, digital video encoding, and IPTV industries)
  • Hsi-Jung Wu: No LinkedIn Profile was found.

