NVidia GPUs and 5G Cellular networks | NINJAR Blog

Digital Reality, Cloud GPUs and 5G Cellular Networks


Recently Nvidia announced a new series of GPUs built on the Turing architecture. This is the most exciting GPU architecture in years, as it brings real-time ray tracing and advanced artificial intelligence to consumer PCs. The last update from Nvidia on this scale was two years ago — and that is a long time in GPU terms.

The biggest update in the cards is the RT cores — specialist hardware units that allow lifelike renders by accurately simulating the movement of light in a virtual scene.

Raytracing — The “holy grail” of real time graphics

To illustrate how this works — imagine you had a classic art scene containing a bowl of fruit lit by a candle.

For a computer to recreate this classic art image, it starts with the pixels on the screen, and for each one it traces lines out towards the fruit to see which rays bounce off the fruit and reach the candle. It can then set the colour of each pixel on the screen, which we see as a complete image. As the rays traverse the scene, any individual ray might reflect from one fruit to another (causing reflections), be blocked by the fruit bowl (causing shadows), or pass through transparent or semi-transparent layers like the fruit peel (causing refraction and subsurface scattering). All of these interactions combine to produce the final colour of the pixel on the screen. Simulating how these rays of light actually behave is very computationally expensive, as you need an enormous number of light rays to fill the resolution of a modern screen with an accurate representation of the scene.
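To make the idea concrete, here is a toy sketch in Python (the scene and all names are made up for illustration): one primary ray per pixel is tested against a single sphere standing in for the fruit, then a shadow ray is fired towards a candle-like point light. Real renderers trace billions of rays against millions of triangles; this only shows the shape of the algorithm.

```python
import math

def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def hit_sphere(origin, direction, centre, radius):
    """Return distance along a unit-length ray to the sphere, or None on a miss."""
    oc = sub(origin, centre)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * c          # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(px, py, width=40, height=20):
    """Trace one primary ray for this pixel, plus one shadow ray to the light."""
    fruit_centre, fruit_radius = (0.0, 0.0, -3.0), 1.0   # the "fruit"
    light = (2.0, 2.0, 0.0)                              # the "candle"
    # Map the pixel to a direction through an image plane at z = -1.
    x = 2 * (px + 0.5) / width - 1
    y = 1 - 2 * (py + 0.5) / height
    norm = math.sqrt(x*x + y*y + 1)
    direction = (x / norm, y / norm, -1 / norm)
    t = hit_sphere((0, 0, 0), direction, fruit_centre, fruit_radius)
    if t is None:
        return ' '                                       # ray escapes the scene
    hit = (direction[0]*t, direction[1]*t, direction[2]*t)
    to_light = sub(light, hit)
    dist = math.sqrt(dot(to_light, to_light))
    to_light = (to_light[0]/dist, to_light[1]/dist, to_light[2]/dist)
    # Shadow ray, nudged off the surface so it doesn't re-hit the same point.
    start = tuple(h + d * 1e-3 for h, d in zip(hit, to_light))
    if hit_sphere(start, to_light, fruit_centre, fruit_radius):
        return '.'                                       # light blocked: shadow
    # Lambertian shading from the surface normal.
    normal = sub(hit, fruit_centre)
    nlen = math.sqrt(dot(normal, normal))
    normal = (normal[0]/nlen, normal[1]/nlen, normal[2]/nlen)
    brightness = max(0.0, dot(normal, to_light))
    return ' .:*#'[min(4, int(brightness * 5))]

image = '\n'.join(''.join(shade(px, py) for px in range(40)) for py in range(20))
print(image)
```

Running it prints a tiny ASCII render of a lit sphere. RT cores accelerate exactly the `hit_sphere` step, the ray/geometry intersection test, which is the part repeated billions of times per second.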

The new Nvidia cards can trace the path of 10 Gigarays a second. That’s enough to paint complex scenes beautifully, in real time, like this demo for Porsche:

This way of rendering simulates reality and is standard in film production, but until recently it has not been practical in real-time experiences. Most games and virtual reality apps use faster approximations and optimised shortcuts to display a 3D scene — polygon- and texture-based compromises that were needed to hit frame rates, but at the expense of quality.

With Nvidia’s new RTX technology we can expect far more realistic shadows, reflections, surfaces and lighting in computer-generated scenes. And because it runs in real time and is supported by the Unreal Engine, we can expect more high-quality, realistic content to appear.

The new GPU cards also feature VirtualLink — a simpler standard based on USB-C for powering and communicating with VR headsets at very low latency.

https://sites.google.com/view/virtuallink-consortium/home

The production process is not fundamentally changed by this move to ray tracing, and it may become easier to reach higher levels of quality via ray tracing than through the current time-consuming approach of shader programming and deferred rendering.

Another advantage of the latest Nvidia GPUs is their A.I.-focused silicon, the Tensor Cores, and the accompanying NGX technology stack. This lets developers run trained neural nets much faster than on CPUs or previous GPU cores. These neural nets can be used for image processing, and at Ninjar.com we are interested in how NGX can be used for Computer Vision, Image Recognition and driving believable Mixed Reality Avatars.
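As a toy illustration of what "running a trained neural net" means at inference time, a forward pass is just layers of multiply-accumulate followed by a non-linearity: exactly the arithmetic that dedicated A.I. cores batch up and accelerate. The weights below are hand-picked for a tiny XOR classifier, purely for illustration; they are not produced by any Nvidia tool.

```python
import math

def dense(inputs, weights, biases):
    """One fully connected layer: a weighted sum per neuron, plus a bias."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def relu(xs):
    return [max(0.0, x) for x in xs]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hand-picked weights for a 2-2-1 XOR network (illustrative, not learned here).
W1 = [[1.0, 1.0], [1.0, 1.0]]
b1 = [0.0, -1.0]
W2 = [[1.0, -2.0]]
b2 = [-0.5]

def predict(a, b):
    hidden = relu(dense([a, b], W1, b1))          # multiply-accumulate + ReLU
    return sigmoid(dense(hidden, W2, b2)[0]) > 0.5

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', predict(a, b))
```

A real vision net is the same computation scaled up to millions of weights per layer, which is why moving it from CPU to tensor hardware changes what is feasible in real time.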

As the mobile camera becomes ubiquitous in daily life, we expect it to become an important component in the paradigm of Mixed Reality. The camera is what will bring Mixed Reality into the real world, and these intelligent cameras are key to the M.R. strategy we mentioned in previous articles:

AI and Mixed Reality

Virtual Avatars

With this update, NVIDIA has produced a fantastic piece of hardware for Mixed Reality. By linking the card to a camera — through a mobile phone camera today, and tomorrow via wearable headsets and cameras — we can build an A.I. system that can be trained to recognise the content and context of a scene and augment it with photo realistic objects.

For example maybe the user is looking at a ping-pong table through her HMD and streaming the video to a nearby PC. The NVIDIA card in the machine can use computer vision to identify the table and then the realtime ray-tracing system can kick in to produce a lifelike opponent for an instant game of table tennis:

A bit like this maybe, but with photo realistic graphics.

Graphics will only get better, and AI will only get smarter. But it is unclear whether the technology will get smaller and cheaper. Mobile processing faces a thermal, and hence power, limit, and it may never catch up to this level of GPU power.

This leaves us with a split in market size and graphical quality. On one side you have Magic Leap and Oculus releasing stand-alone head-mounted displays with mobile-phone-quality graphics, and on the other you have Nvidia’s latest technology tied to desktop PCs and VirtualLink-powered headsets. Tying Mixed Reality to a single location is considered poor form, as Mixed Reality is inherently spatial. This raises the question: how can we use the power of desktop PCs along with the freedom promised by Mixed Reality?

5G and Cloud GPUs

Streaming video from the user’s camera to a remote GPU for processing not only requires a lot of bandwidth; the video also has to be sent fast. Any lag between the user’s movements, the updates from the camera and any A.R. placed in the world causes a jittery sensation, and the immersion is instantly broken. Rendering 3D content like games and vehicles remotely is already happening: PlayStation Now lets users play games for Sony’s console remotely, and Nvidia has a service called GeForce Now which lets you play PC games from the cloud. A car visualisation company called ZeroLight has sidestepped the processing power of local devices by rendering car configurators in the cloud and broadcasting live video of the virtual renders back to the user.

So the technology is proven, but until now it was considered fast enough only for gaming, not for Mixed Reality. This is where the promise of 5G cellular networks comes in. The new networks are estimated to be up to 10x faster at sending signals to remote GPUs and back.

This might put it in a range which is acceptable for offloading the understanding of the user’s scene and the rendering of digital characters to the cloud. Pose estimation and location compensation can still be processed on the local device to help lock the digital world to the real one. Perhaps the future of Mixed Reality will see a blend of computation running locally and remotely.
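To see why latency matters as much as bandwidth, here is a back-of-envelope motion-to-photon budget. All the timings below are illustrative assumptions, not measurements: the point is that the network round trip dominates, and cutting it by roughly 10x is what moves cloud rendering inside a VR frame budget.

```python
# Back-of-envelope motion-to-photon budget for cloud-rendered Mixed Reality.
FRAME_BUDGET_MS = 1000 / 90          # ~11.1 ms per frame at 90 Hz

def motion_to_photon(network_rtt_ms, encode_ms=2.0, inference_ms=3.0, render_ms=4.0):
    """Total delay from head movement to updated pixels on the display."""
    return network_rtt_ms + encode_ms + inference_ms + render_ms

lte_rtt, fiveg_rtt = 50.0, 5.0       # assumed round-trip times, 4G vs 5G
print(f"frame budget: {FRAME_BUDGET_MS:.1f} ms")
print(f"over 4G:      {motion_to_photon(lte_rtt):.1f} ms")
print(f"over 5G:      {motion_to_photon(fiveg_rtt):.1f} ms")
```

Under these assumptions the 4G round trip alone blows well past the frame budget, while the 5G figure leaves room for encoding, recognition and rendering, which is exactly the split between "fast enough for gaming" and "fast enough for Mixed Reality".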

The near-instant response of 5G broadcasts (latencies as low as 1 ms) and the increased bandwidth will also open the doors to teleportation holograms — i.e. it should be possible for multiple users to interact in real time with computer-generated avatars, whether they are in the same place or not. The increased bandwidth of 5G could also accommodate 360 video at retina resolution and 3D light field movies.

Mixing Reality with Ray Tracing GPUs, Cloud GPUs and 5G Cellular Networks

All these technologies are coming together to allow users to actually experience Mixed Reality in the real world.

For example, a user can walk around a city looking through a mobile phone or headset while its 5G module streams the video to an NVIDIA card in the cloud. Mixed Reality can be generated on the fly, with Nvidia’s NGX AI recognising objects in the stream and the RTX photo-real ray tracing engine augmenting the scene and rendering it back over the 5G connection.

This allows objects in the real world, for example all the cars driving down a street, to be identified in real time and replaced by photo-real, ray-traced Porsches.
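The loop described above can be sketched as follows. Here `detect_objects` and `ray_trace` are hypothetical stubs standing in for an NGX-style recogniser and an RTX-style renderer (they are not real Nvidia APIs), so the sketch runs without any GPU and only shows the stream → recognise → augment → return structure.

```python
def detect_objects(frame):
    """Stand-in for AI scene recognition on an incoming video frame."""
    return [{"label": "car", "box": (10, 20, 50, 40)}]

def ray_trace(frame, detections):
    """Stand-in for rendering photo-real replacements over each detection."""
    return {"frame": frame, "augmented": [d["label"] for d in detections]}

def process_stream(frames):
    """One pass of the cloud side: recognise each frame, then augment it."""
    for frame in frames:
        detections = detect_objects(frame)     # NGX-style recognition step
        yield ray_trace(frame, detections)     # RTX-style rendering step

for out in process_stream(["frame-0", "frame-1"]):
    print(out["augmented"])
```

In a real deployment each stage would run on dedicated silicon (Tensor Cores for recognition, RT cores for rendering) with the 5G link carrying the frames in and the augmented video back out.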

This is how Mixed Reality will look in the very near future.


For companies looking to get into VR/AR/MR our Virtual Reality services offer guidance on how these technologies can enhance and support your brand strategy.

Connect with us on LinkedIn, follow @ninjar on Twitter, or send us an email at hello@ninjar.com
