XR-Color Calibration

Blend the wall colors seamlessly with their virtual surroundings.

XR color calibration requires a specific XR/AR license. If you would like to try these features before deciding whether to purchase an XR/AR license, please contact us.

1) Theory

For the VFX processor to composite the camera image with the set extension and AR convincingly, the colors of the video input must be as close as possible to the colors of the overlay image. However, between the screens and the camera, the image passes through several colorimetric profiles.
The pictures below were taken before and after applying a color calibration to the extension.

How it works:
Just like the Geometric Calibration, Smode needs one or several viewpoints of the setup to determine the color correction to apply to the extension.
These viewpoints are not single images but sequences of grids of different colors cast onto the screens. Smode then compares each color sent in each square with the one received, and from this determines the color model of your setup.
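The comparison described above can be illustrated with a minimal sketch. The per-channel linear model, the `fit_channel` helper, and the sample values below are illustrative assumptions for one color channel, not Smode's actual color model, which is richer:

```python
# Illustrative sketch: for each grid square we know the color that was
# emitted on the screen and the color the camera received. A simple
# per-channel linear response (gain + offset) can then be fitted by
# least squares. Smode's real color model is more elaborate.

def fit_channel(emitted, received):
    """Least-squares fit of received ~= gain * emitted + offset."""
    n = len(emitted)
    mean_e = sum(emitted) / n
    mean_r = sum(received) / n
    var_e = sum((e - mean_e) ** 2 for e in emitted)
    cov = sum((e - mean_e) * (r - mean_r) for e, r in zip(emitted, received))
    gain = cov / var_e
    offset = mean_r - gain * mean_e
    return gain, offset

# Example: the camera sees the screen slightly dim, with lifted blacks.
emitted = [0.0, 0.25, 0.5, 0.75, 1.0]
received = [0.05, 0.25, 0.45, 0.65, 0.85]
gain, offset = fit_channel(emitted, received)
```

Inverting such a fitted response per channel (and, in practice, per region of the screen) is what lets the correction bring the received colors back toward the emitted ones.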

Color model: the colorimetric profile of your setup

2) Before Starting a Color Calibration

  1. Your Smode is fully optimized for graphical performance. Use the Profile feature if needed.
  2. The frame rate is stable. If it is not, your setup may not be fully genlocked.
  3. You are filming every impacted screen.
  4. The camera does not move.
  5. Ensure that you have done an XR-Latency Calibration (it automatically evaluates the frame delay between sending a picture to the stage and receiving it back through the camera).
  6. Turn off every light on the stage.
  7. Alert people not to pass in front of the camera.

3) UI Check-out

  1. Viewport: displays the stream of the XR Calibrator's Video Input.
  2. Enable Detector: enables April Tag detection and displays helpers in the viewport.
  3. Detection count: number of April Tags detected (result of the April Tag detector modifier).
  4. Tracker information: displays the current position and orientation of the Physical Camera's Tracker, as well as their deviation. A positive Position Deviation or Orientation Deviation value means that your tracker is currently moving.
  5. Send Locators: displays an April Tag Grid on each LED screen.
  6. Shoot viewpoint: starts the shoot of a viewpoint for calibration.
  7. List of viewpoints: every viewpoint appears in this list.
  8. Viewpoint information: displays the number of April Tags detected for each screen and the pixel gap between their position in the video input stream and in the stage simulation.
  9. Evaluate: makes an average evaluation of the color differences between emitted and received colors.
  10. Calibrate: starts a calibration. Calculation time depends on the number of viewpoints shot.
  11. Console output.
  12. Save as Calibration State: saves the calibration results as a calibration state.
  13. Calibration States list: every calibration result can be recalled as a state. They appear in this list.

4) Calibration process

In the XR Calibrator's Color tab, enable the locators.
Try to detect as many April Tags as possible, especially at the junctions of the walls and in the corners, as these are the places where color correction is most needed. Play with the camera focus to detect more of them.

You can also change the "Quad Decimate" parameter (In -> Detector: April Tag) to increase the number of tags detected on the screens.

Lower the Decimate value to detect more tags (although Smode may slow down if your machine is not powerful enough).
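The trade-off can be made concrete with a rough sketch: quad decimation downsamples the video input before the tag-corner search, so a tag must cover proportionally more camera pixels to remain detectable. The cell count and pixels-per-cell threshold below are assumptions for illustration, not Smode or AprilTag constants:

```python
# Illustrative only: the minimum on-camera tag size grows linearly with
# the decimation factor, which is why lowering Quad Decimate lets the
# detector find smaller (more distant) tags at a higher CPU cost.

TAG_CELLS = 10           # assumed tag grid size, borders included
MIN_PIXELS_PER_CELL = 5  # assumed minimum cell size after decimation

def min_detectable_tag_px(quad_decimate):
    """Smallest tag size (in camera pixels) still detectable."""
    return TAG_CELLS * MIN_PIXELS_PER_CELL * quad_decimate

for d in (1.0, 2.0, 4.0):
    print("decimate", d, "-> min tag size", min_detectable_tag_px(d), "px")
```

With decimation 1.0 every tag the camera resolves can be found; at 4.0 only tags four times larger survive the downsampling, which is why far-away or small tags disappear first.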
Take a viewpoint shot. Remove the locators before each viewpoint shot to optimize performance.

When a viewpoint is taken, several different colors are sent to the screens at the location of each tag. For each detected tag, Smode records both the color emitted at that place on the screen and the color received.
It can then deduce the difference between a color sent at a given place on the screen and the color received at the same place. You can visualize the data of a viewpoint by unfolding "Viewpoints" at the bottom of the Color parameters of the XR Calibrator:
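A minimal sketch of what a viewpoint's data might look like, and of what the Evaluate step boils down to. The data layout and sample values are hypothetical, mirroring only what the text describes (one emitted/received color pair per detected tag):

```python
# Hypothetical viewpoint data: (emitted RGB, received RGB) per detected tag.
viewpoint = [
    ((1.00, 0.00, 0.00), (0.92, 0.04, 0.05)),
    ((0.00, 1.00, 0.00), (0.06, 0.90, 0.08)),
    ((0.00, 0.00, 1.00), (0.03, 0.05, 0.88)),
]

def average_color_difference(samples):
    """Mean absolute difference per channel between emitted and received."""
    n = len(samples)
    diffs = [0.0, 0.0, 0.0]
    for emitted, received in samples:
        for c in range(3):
            diffs[c] += abs(emitted[c] - received[c])
    return [d / n for d in diffs]

avg = average_color_difference(viewpoint)
```

A large average difference on one channel hints at where the camera and screens disagree most, and more viewpoints give the calibration more such pairs to work with.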

Once your viewpoint has been shot, move the camera to get another point of view, wait for the camera to be stable, and take another one. A viewpoint shot does not have to cover all the screens; aim at the interesting parts of the stage to be calibrated.
Try as much as possible to take viewpoints facing the screens. Feel free to shoot twice from the same position: since the colors sent are randomized, this can improve the quality of the measurements.
If you have a fixed-position camera, take one or several shots from its position.
In some cases, it can be interesting to merge the color models of the screens that make up the walls of your setup. Select the corresponding XR Display information and then, in Color Model -> General -> Type, switch to Merge.
A panel warns you that several target parameters will be deleted. Press "YES".
In the XR Display information, merge the color model of one screen with the other one.

You are now ready to start a calibration. Press Calibrate.
By default, only the color model is calibrated; the inverse color model is much more complex to set up and get good results with (and it takes much longer to calibrate).

The calculation generates a collection of LUTs stored in the XR Display information, which can be applied to your setup using a Smart Lut.
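To make the LUT idea concrete, here is a minimal sketch of what applying a calibration LUT means: each input value is looked up in a table and linearly interpolated. Smode's Smart Lut works on full 3D color LUTs per screen; the 1D per-channel version below is an illustrative simplification, and the table values are invented:

```python
# Illustrative 1D LUT lookup with linear interpolation. A real Smart Lut
# maps RGB triplets through a 3D table, but the interpolation principle
# is the same per axis.

def apply_lut_1d(value, lut):
    """Look up `value` in [0, 1] through `lut` with linear interpolation."""
    pos = value * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1.0 - frac) + lut[hi] * frac

# An invented LUT that lifts mid-tones to compensate a dim camera response.
lut = [0.0, 0.3, 0.55, 0.8, 1.0]
out = apply_lut_1d(0.375, lut)
```

Because the correction lives in a table rather than a formula, it can capture the arbitrary, screen-specific response measured during calibration.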

5) Export the Calibration data

Just like the XR-Geometry Calibration, you can save your color calibration into a .smartlut file. These files can be re-imported later.

6) Apply the Calibration

At this step, you have to modify the colors of your AR and Extension compositions to make them match the colorimetric profile of your setup.
You can drag and drop the .smartlut file you just exported: this creates a Smart Lut that you can apply directly onto the Extended and AR compositions in your VFX processor.
Alternatively, you can use a Modifier Stack Layer so that your color controls are centralized in your show and can be adjusted by hand.
Once you have imported a Smart Lut, you need to select the Display (for the optional angle data).
If you want to connect the LUT directly to the display without exporting it, create a Smart Lut, change its mode to "Internal Direct", and then select the LUT display.

7) Correcting the LED color depending on the angle

The Display Angle Mask can also be helpful: its role is to mask any 2D Modifier or 2D Generator according to the angle of the selected Stage Element.
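The underlying idea of an angle-based mask can be sketched as follows. The linear falloff shape, the `max_angle_deg` cutoff, and the helper name are assumptions for illustration, not Smode's actual implementation:

```python
# Illustrative sketch: weight a color correction by how far off-axis the
# camera sees the LED surface, since LED panels shift color at grazing
# angles. Full effect when viewed head-on, fading to none at the cutoff.
import math

def angle_mask_weight(surface_normal, view_dir, max_angle_deg=70.0):
    """1.0 when viewed head-on, fading linearly to 0.0 at max_angle_deg."""
    dot = sum(n * v for n, v in zip(surface_normal, view_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return max(0.0, 1.0 - angle / max_angle_deg)

# Head-on view of a wall whose normal faces the camera: full effect.
head_on = angle_mask_weight((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
```

Masking a correction this way keeps it strongest on the panels the camera sees at an oblique angle, which is where the LED color shift is most visible.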

See Also: