Renato Losio
Article originally posted on InfoQ.
Google Cloud recently announced the general availability of Immersive Stream for XR, a managed service to host, render, and stream 3D and extended reality (XR) experiences. The new service removes the dependency on smartphone hardware for rendering 3D and augmented reality content.
Using Immersive Stream for XR, the virtual experience is rendered on cloud-based GPUs in Google Cloud and then streamed to a variety of devices, where users interact through touch gestures and device movement. Sachin Gupta, vice president of infrastructure at Google Cloud, writes:
With Immersive Stream for XR, users don’t need powerful hardware or a special application to be immersed in a 3D or AR world; instead, they can click a link or scan a QR code and immediately be transported to extended reality.
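At a high level, the device acts as a thin client: it forwards touch gestures and orientation changes to the cloud renderer and displays the frames that come back. The sketch below simulates that round trip in Python; the `InputEvent` fields and the `cloud_render` stub are illustrative placeholders, not the actual Immersive Stream for XR API.

```python
import time
from dataclasses import dataclass

# Hypothetical input event sent from the device to the cloud renderer.
@dataclass
class InputEvent:
    touch_x: float   # normalized touch position
    touch_y: float
    yaw: float       # device orientation from the gyroscope, degrees
    pitch: float

def cloud_render(event: InputEvent) -> bytes:
    """Stand-in for the cloud GPU renderer: in the real service this runs
    remotely and returns an encoded video frame, not a text payload."""
    return f"frame rendered for yaw={event.yaw:.1f} pitch={event.pitch:.1f}".encode()

def client_loop(frames: int = 3) -> None:
    """Thin client: forward gestures and orientation, display whatever comes back."""
    for i in range(frames):
        event = InputEvent(touch_x=0.5, touch_y=0.5, yaw=10.0 * i, pitch=0.0)
        frame = cloud_render(event)   # a network round trip in practice
        print(frame.decode())         # real clients decode and display video
        time.sleep(1 / 30)            # pace the loop at roughly 30 fps

if __name__ == "__main__":
    client_loop()
```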
The general availability of the service adds new features, including support for content developed in Unreal Engine 5.0 and for landscape mode on tablet and desktop devices.
Immersive Stream for XR can be used for rendering photorealistic 3D digital objects and spaces. Gupta describes use cases where users can move around the virtual space and interact with objects:
Home improvement retailers can let their shoppers place appliance options or furniture in renderings of their actual living spaces; travel and hospitality companies can provide virtual tours of a hotel room or event space; and museums can offer virtual experiences where users can walk around and interact with virtual exhibits.
Google released a template to start development, as well as an immersive stream example developed with car manufacturer BMW. Fabian Quosdorf, managing director at mixed.world, comments:
This cloud service enables frictionless access to high-quality content to millions of mobile devices! It could be a huge competitor to Azure Remote Rendering service. With current layoffs in this sector at Microsoft, news like this gives developers like me hope that Mixed Reality still has a chance of surviving.
Paul McLeod, principal at Decision Operations, wonders if the new service might end up like Stadia, the cloud gaming service that Google shut down:
Interesting but does not seem compelling without HMD support. Meanwhile, Microsoft HL2 supports Remote Rendering and it’s at the level engineering firms need. Seems like they’re laying the ground for something. May work out terrific, but could be another Stadia.
Similarly, Amazon Sumerian, a managed service to run AR and VR applications, was recently discontinued by AWS. A common question in Reddit threads is how cloud rendering can work, given that latency and latency jitter are critical for interactive experiences. User Hopper199 explains:
Latency for XR over the cloud is lower than it is for 2D games, which typically run at 60 Hz instead of 90 or 120 Hz for XR. But the main reason why Cloud XR works well, or even at all, is because of the last-second reprojection of the video stream on the headset. If your latency is too high, you can’t even fix that, but in practice with 50ms or less, it’s fine (…) The trick here is that the viewpoint is virtually lag-free, due to reprojection.
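The reprojection the commenter refers to (often called late-stage reprojection or timewarp) re-warps the most recently decoded frame with the latest head pose, so the viewpoint keeps up with head motion even while the next cloud-rendered frame is still in flight. Below is a minimal NumPy sketch that approximates a small yaw/pitch correction as a pixel shift; the 90-degree field of view and the shift-based warp are simplifying assumptions, as a production compositor applies the full rotational warp on the GPU.

```python
import numpy as np

def reproject(frame: np.ndarray,
              render_yaw: float, render_pitch: float,
              latest_yaw: float, latest_pitch: float,
              fov_deg: float = 90.0) -> np.ndarray:
    """Approximate late-stage reprojection: shift the last decoded frame so it
    matches the latest device orientation instead of the (older) orientation it
    was rendered with. Angles are in degrees; small-angle approximation only."""
    height, width = frame.shape[:2]
    px_per_deg = width / fov_deg                       # assumes square pixels
    dx = int(round((latest_yaw - render_yaw) * px_per_deg))
    dy = int(round((latest_pitch - render_pitch) * px_per_deg))
    # np.roll wraps pixels around the edges; a real compositor would instead
    # reveal over-rendered margins or black borders at the edges.
    return np.roll(np.roll(frame, -dx, axis=1), dy, axis=0)

# Example: the frame was rendered ~50 ms ago at yaw 0°, the head is now at yaw 2°.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
corrected = reproject(frame, render_yaw=0.0, render_pitch=0.0,
                      latest_yaw=2.0, latest_pitch=0.0)
```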
The pricing of Immersive Stream for XR depends on the configured streaming capacity, defined as the maximum number of concurrent users that the experience can support. Currently available in a subset of Google Cloud regions, the service charges 2.50 USD per unit per hour in the cheapest regions.
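Given the published 2.50 USD per unit-hour rate in the cheapest regions, a back-of-the-envelope estimate simply multiplies the provisioned units (maximum concurrent users) by the hours the experience is kept available. The helper below does that arithmetic; the usage figures in the example are assumptions, not numbers from Google's pricing page.

```python
def estimate_monthly_cost(units: int,
                          hours_per_day: float,
                          days: int = 30,
                          usd_per_unit_hour: float = 2.50) -> float:
    """Streaming capacity is billed per provisioned unit-hour, regardless of
    whether every concurrent slot is actually in use."""
    return units * hours_per_day * days * usd_per_unit_hour

# Example: capacity for 5 concurrent users, streamed 8 hours a day for a month.
print(f"${estimate_monthly_cost(units=5, hours_per_day=8):,.2f}")  # $3,000.00
```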