Extended and augmented reality for XR studio

Final VFX processor for compositing the Camera with the Set-extension and AR

Extended and augmented reality for XR studio requires a specific XR/AR license. If you would like to try these features before deciding whether to purchase an XR/AR license, please contact us.

Explanation

Set extension is the last part of an XR project. It consists in sending out the final stream, composed of the camera stream with the virtual layout added on top. This must be done with a dedicated Processor, in which the Video Input has to be placed first. To avoid any delay between the camera stream and the extension, follow this technique:

Extended Reality

Create a Processor in the pipeline and select your physical camera as its Current Camera.
In this Processor, create a Video Input that corresponds to the tracked camera stream.
If you are not on stage and don't have your camera yet, create a Compo ([Ctrl] + [Shift] + [N]) with a Stage Preview inside.
Then create another composition, drag and drop the Content Map inside it, and set its renderer to Unproject 3D.
This new composition must be masked so that the camera stream shows through. Create a Layer Mask over it and set it to inverse. Add another composition inside this mask and, inside it, create a Stage Display for each LED screen.
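Conceptually, the inverse-masked setup above boils down to a per-pixel blend: where the mask marks an LED screen, the live camera pixels are kept; everywhere else, the Unproject 3D extension fills the frame. A minimal NumPy sketch of that blend (the array names and shapes are illustrative, not Smode API):

```python
import numpy as np

def composite_extension(camera, extension, screen_mask):
    """Blend the live camera frame with the virtual set extension.

    screen_mask is 1.0 where an LED screen is visible in the camera
    image (the Stage Displays), 0.0 elsewhere; the inverse of the
    mask is where the set extension fills the frame.
    """
    mask = screen_mask[..., None]  # broadcast the mask over RGB channels
    return camera * mask + extension * (1.0 - mask)

h, w = 4, 4
camera = np.full((h, w, 3), 0.8)           # stand-in for the camera stream
extension = np.full((h, w, 3), 0.2)        # stand-in for the rendered extension
screen_mask = np.zeros((h, w))
screen_mask[1:3, 1:3] = 1.0                # toy LED-screen region
out = composite_extension(camera, extension, screen_mask)
```

Outside the screen region the extension shows; inside it, the camera image is preserved untouched.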

Smooth the Extension's mask

Change the renderer to an AutoIlluminate Surface. Apply a Placement Mask on the Uniform. Set the Feather link of the placement mask to None, then adjust the exponent and feather values to smooth the edges of the selected display.
We advise exposing the parameters below inside a Parameter Bank:
  • Feather (as clamped vertical canvas value)
  • Exponent
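The feather and exponent interact as an edge falloff: the mask ramps from 0 to 1 over the feather distance, and the exponent reshapes that ramp. A hedged one-dimensional sketch of how such a falloff is commonly computed (the function and its signature are illustrative, not Smode's actual implementation):

```python
def feathered_mask(x, edge, feather, exponent):
    """Mask opacity at position x for an edge located at `edge`:
    ramps from 0 to 1 over `feather` units, shaped by `exponent`."""
    if feather <= 0.0:
        return 1.0 if x >= edge else 0.0  # hard edge when no feather
    t = (x - edge) / feather
    t = min(max(t, 0.0), 1.0)  # clamp the ramp to [0, 1]
    return t ** exponent       # exponent > 1 softens the onset
```

A higher exponent pushes the ramp toward the far side of the feather zone, which is why tweaking both values together gives the smoothest blend at the display edges.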

Augmented Reality

Simply create a last Compo, drag and drop the Content Map into it, and set its renderer to Unproject 3D.
If your AR has no alpha, check that your content map and compo have alpha. If you use Unreal, see the How-To Unreal In Smode guide on integrating your Unreal content inside Smode.
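The Unproject 3D renderer places the content map on a 3D plane as seen from the camera's perspective. The underlying idea is the standard pinhole projection; a minimal sketch (focal length and image size are made-up numbers, not Smode parameters):

```python
def project(point3d, focal, width, height):
    """Pinhole projection of a camera-space point (x, y, z) to pixel
    coordinates, with the principal point at the image center."""
    x, y, z = point3d
    u = focal * x / z + width / 2.0
    v = focal * y / z + height / 2.0
    return u, v

# A point on the camera axis lands at the image center,
# an off-axis point is shifted in proportion to focal / depth.
center = project((0.0, 0.0, 2.0), 100.0, 200, 100)
offset = project((1.0, 0.0, 2.0), 100.0, 200, 100)
```

Unprojection runs this mapping in reverse for a known plane, which is why the AR content only lines up when the tracked camera pose is correct.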

Delaying the Extension Content

To delay the Extension and AR content so that it stays in sync with the show animation, simply add a Frame Delay onto the Content Map. The delay to enter into the Frame Delay is the closed-loop delay calculated by the XR-Latency Calibration, which automatically evaluates the number of frames between sending a picture to the stage and receiving it back through the camera.
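A frame delay is just a ring buffer: each new frame goes in, and the frame pushed n calls earlier comes out. A small sketch of that behavior (a generic illustration of the technique, not Smode's Frame Delay internals):

```python
from collections import deque

class FrameDelay:
    """Delay a frame stream by n frames: each push returns the frame
    pushed n calls ago (repeating the oldest frame while filling up)."""

    def __init__(self, n):
        self.buf = deque(maxlen=n + 1)  # holds the current frame plus n past ones

    def push(self, frame):
        self.buf.append(frame)
        return self.buf[0]  # oldest retained frame = the delayed output

delay = FrameDelay(3)
out = [delay.push(i) for i in range(6)]  # frames 0..5 through a 3-frame delay
```

Once the buffer is full, every output frame trails the input by exactly the calibrated number of frames, which is what keeps the extension locked to the camera's round-trip latency.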

Sync the Video Input and Tracking Data

If your VFX looks like this when rotating the camera, you need to adjust the Camera Tracking Data Latency to sync the camera video input with the tracking data.
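The latency adjustment amounts to offsetting which tracking sample is paired with which video frame. A hedged sketch, assuming the video arrives a fixed number of frames later than the tracking data (the function and data layout are illustrative only):

```python
def sync_tracking(video_frames, tracking_samples, latency_frames):
    """Pair each video frame with the tracking sample captured
    `latency_frames` earlier, so camera pose and image line up."""
    pairs = []
    for i, frame in enumerate(video_frames):
        j = max(i - latency_frames, 0)  # hold the first sample while filling
        pairs.append((frame, tracking_samples[j]))
    return pairs

pairs = sync_tracking(["f0", "f1", "f2"], ["t0", "t1", "t2"], 1)
```

When the offset is wrong, the virtual extension swims against the live image during camera rotation, which is the symptom shown above.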

Match the Extended and AR Color to the video input

To perform the color calibration, follow the XR-Color Calibration guide, which blends the wall colors perfectly with the virtual surroundings. It walks you through the steps, from recording the colors to applying the Smart LUT to your VFX processor.
At the end, your VFX processor will look like this:
That's it! You are now ready to go on stage and proceed with the XR Calibration procedures to calibrate your XR studio.
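At its core, applying a LUT means mapping each channel value through a stored curve, interpolating between the nearest table entries. A minimal 1D sketch of that idea (the toy 3-entry table is made up for illustration; real LUTs are 3D and far denser):

```python
def apply_lut_1d(value, lut):
    """Map a channel value in [0, 1] through a 1D LUT,
    linearly interpolating between neighboring entries."""
    pos = value * (len(lut) - 1)        # position in table coordinates
    i = min(int(pos), len(lut) - 2)     # lower bracketing index
    frac = pos - i                      # blend factor between entries
    return lut[i] * (1.0 - frac) + lut[i + 1] * frac

lut = [0.0, 0.25, 1.0]  # toy curve: darkens mids, keeps black and white
```

The Smart LUT produced by the calibration plays the same role per color channel, remapping the rendered colors so the virtual extension matches what the camera sees on the LED walls.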

See Also: