What is Parallax?

How XR Tricks Our Visual Perception

Parallax is the secret sauce that makes XR so convincing to the human eye – and consequently, the camera. In the real world, parallax is defined as:

“The perceptual difference in an object’s position when seen from different vantage points.”

But what does that mean exactly? For humans and many other living creatures, this is how our brains perceive depth. Because our eyes sit in slightly different positions in our head, each one views the world from a slightly different perspective at the same time. Try this quick demonstration: close one eye and look out of the other. Now switch. As you alternate which eye is open and which is closed, you should notice that your view shifts slightly depending on which eye you're seeing with. It's that slight offset between the two views that allows your brain to perceive depth and calculate where things sit relative to each other in three-dimensional space.

FPO: Left eye / right eye shifting view
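
If you like seeing the math behind that, the depth calculation your brain is doing can be sketched with the classic pinhole-stereo relationship. The numbers below (eye separation, focal length) are made-up illustrative values, not measurements; the takeaway is simply that a bigger shift (disparity) between the two views means a closer object.

```python
# A minimal sketch of the geometry behind binocular parallax: given the
# separation between the two viewpoints (the "baseline"), a notional focal
# length, and the disparity (how far the object appears to shift between
# the two views), depth falls out of similar triangles.

def depth_from_disparity(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Classic pinhole-stereo relation: depth = baseline * focal / disparity."""
    if disparity_px <= 0:
        raise ValueError("No measurable parallax -- the object is effectively at infinity.")
    return baseline_m * focal_px / disparity_px

# Example: ~6.5 cm between the eyes and a notional 1000 px focal length.
# A large shift between the two views means the object is close;
# a tiny shift means it is far away.
for disparity_px in (130.0, 13.0, 1.3):
    depth = depth_from_disparity(0.065, 1000.0, disparity_px)
    print(f"disparity {disparity_px:6.1f} px -> depth {depth:5.2f} m")
```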

Taking things a step further, as you move around in the world, you'll also notice parallax at work in the interplay between your viewpoint and the objects you're focusing on: things close to you appear to sweep across your view much faster than things far away, even when nothing is actually moving. The same holds true for camera lenses. To get a better idea of what I'm talking about, take a look at this video:

FPO: Parallax Video Here
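
To put some rough numbers on that effect, here's a small sketch. The distances are arbitrary examples, not measurements; it just shows how much more a nearby object appears to swing across your view than a distant one as you step sideways.

```python
# A rough illustration of motion parallax: as the viewer (or camera) slides
# sideways, the apparent direction to a nearby object changes far more than
# the direction to a distant one.
import math

def apparent_angle_deg(object_x: float, object_z: float, viewer_x: float) -> float:
    """Horizontal viewing angle to an object, for a viewer standing on the z = 0 line."""
    return math.degrees(math.atan2(object_x - viewer_x, object_z))

near = (0.0, 2.0)   # an object 2 m away, straight ahead
far = (0.0, 50.0)   # an object 50 m away, straight ahead

for viewer_x in (0.0, 0.5, 1.0):  # the viewer steps 0.5 m to the right each time
    near_deg = apparent_angle_deg(*near, viewer_x)
    far_deg = apparent_angle_deg(*far, viewer_x)
    print(f"viewer at {viewer_x:.1f} m: near object at {near_deg:6.1f} deg, far object at {far_deg:5.1f} deg")
```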

Since all of this depends on viewing objects in actual 3D space, a problem arises when you look at objects in 2D space, like a photograph or a painting, where your view of an object never changes no matter where you're positioned. Try watching the same video again, only this time the background is a still image with no parallax.

FPO: 2D Demo Here

Now you may be thinking, “But what about video? Even though it’s technically 2D, I can still experience parallax because the view can still change in relation to the objects in it.” Well…yes and no. The view can change, but only relative to the camera that filmed it and the objects in the video itself. That doesn’t work for virtual production, because the background on the LED wall or green screen needs to move in lockstep with the studio camera filming your talent or it won’t look right. Even if viewers can’t pinpoint the problem, they’ll most likely still feel that something is off, because our brains are wired to pick up on spatial relationships like that.

To solve that problem, virtual production uses camera tracking, lens encoders, and real-time graphics driven by powerful game engines to digitally recreate that process. These tools sync the real-life camera’s movements and focal length to a digitized replica of the camera inside the 3D world of the game engine. The game engine then renders the digitized camera’s field of view onto the LED wall in real time, creating the illusion of real-world parallax for the physical camera and, ultimately, the viewer. Confused? Check out this side-by-side comparison for a better understanding:

FPO: Comparison Video Here
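
For the programmatically inclined, here's a rough sketch of the per-frame sync loop described above. The names (read_tracker, read_lens_encoder, VirtualCamera, render_to_led_wall) are hypothetical stand-ins rather than any real tracking or engine API; the point is the data flow: the physical camera's tracked pose and encoded lens values come in, the digital twin updates to match, and its field of view is rendered back out to the LED wall.

```python
# A simplified sketch of a virtual production sync loop. All names here are
# hypothetical placeholders, not a real SDK; they stand in for whatever
# tracking system, lens encoders, and game engine a stage actually uses.
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple[float, float, float]  # stage-space position in meters
    rotation: tuple[float, float, float]  # pan / tilt / roll in degrees

@dataclass
class LensState:
    focal_length_mm: float
    focus_distance_m: float

class VirtualCamera:
    """Digitized replica of the studio camera inside the game engine's 3D scene."""
    def update(self, pose: CameraPose, lens: LensState) -> None:
        self.pose, self.lens = pose, lens

def run_frame(virtual_cam, read_tracker, read_lens_encoder, render_to_led_wall) -> None:
    pose = read_tracker()            # tracked position/rotation of the physical camera
    lens = read_lens_encoder()       # encoded zoom and focus from the physical lens
    virtual_cam.update(pose, lens)   # keep the digital twin in lockstep
    render_to_led_wall(virtual_cam)  # real-time render of its field of view onto the wall
```

This loop has to run at least as fast as the camera's frame rate, which is why real-time game engines handle the rendering.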
