XR geometry calibration requires a specific XR/AR license. If you'd like to try out these features to decide whether you want to purchase an XR/AR license, please contact us.
                    
Video Tutorial
A video tutorial that uses a simulator, so you can learn the calibration process without a real Stage.
You can download the project file here:
CalibrationTutoStart.zip

You can also learn the calibration process for FreeD with a zoom lens.
Download the project file here:
CalibrationTutoFreeD.project.zip
                     
                    
                    1) Theory
                     
                    
How do you let Smode know the positions of the cameras and screens?
That is the job of the geometry calibration. This calibration step enables Smode to detect the positions of the screens on the real stage and adjust the virtual one accordingly.
The calculation works thanks to
April Tag
Generate an April Tag
Read More
and the
April Tag detector modifier
Detect all the April Tags in a frame
Read More
, called
                      Locators
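To illustrate what the locator detection does conceptually (this is not Smode's internal code), here is a minimal Python sketch that detects AprilTag 36h11 markers in a captured frame with OpenCV's ArUco module (OpenCV 4.7+ API). The image file name and the tag family are assumptions for the example.

```python
# Minimal sketch (not Smode's internal code): detect AprilTag markers in a
# captured camera frame with OpenCV's ArUco module (OpenCV 4.7+ API).
# The file name and the 36h11 tag family are assumptions for this example.
import cv2

frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
if frame is None:
    raise FileNotFoundError("camera_frame.png not found")

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

# corners: pixel coordinates of each detected tag, ids: the tag numbers.
corners, ids, _rejected = detector.detectMarkers(frame)

if ids is None:
    print("No tags detected: check focus, exposure and tag size on screen.")
else:
    print(f"Detected {len(ids)} tags: {ids.ravel().tolist()}")
```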
                    
The geometry calibration consists of taking a sufficient number of shots, called
Single frame
, of different points of view of your real stage, with
April Tag
Generate an April Tag
Read More
displayed on the screens.
In short, during this step you will have to:
                    
- Take a shot (= store a single frame)
- Move the camera
- Wait for the tracker Position & Orientation deviation to be at 0 (or a very low value)
- Take a shot (= store a single frame)
- Move the camera
- Wait for the tracker Position & Orientation deviation to be at 0 (or a very low value)
- Take a shot (= store a single frame)
- Move the camera
- Wait for the tracker Position & Orientation deviation to be at 0 (or a very low value)
- etc.
                      
 
                    
You don't have to store a frame that displays all the screens, but be sure to detect enough
April Tag
Generate an April Tag
Read More
markers. Adjust the focus of your camera if needed.
 
                    
2) Before Starting a Geometry Calibration
Before launching the geometry calibration, check that the camera position and orientation in the
Stage
3D modeling of the real-world video setup
Read More
are approximately the same as on the real stage.
For this you can:
                    
- Ask the people who set up the tracking device where the origin (the 0 point) is on the real stage, and place the
  Tracking System
  To visualize tracked points using a tracking device
  Read More
  at this point.
- Move the camera along the up-down, front-back and left-right axes, and rotate the
  Tracking System
  To visualize tracked points using a tracking device
  Read More
  if needed, according to your observations.
- Pan and tilt the camera to verify that it looks in the same direction; if not, you need to offset the orientation of the
  Tracker
  a Tracker represents a real tracker and is part of a Tracking System
  Read More
  (see the sketch below).
                          
 
Once you have checked these points, you can start the geometry calibration.
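As a sketch of what "offsetting the orientation" of the Tracker means (this is not Smode code), the correction is simply a fixed rotation composed with the orientation streamed by the tracking system. The angle values below are placeholders, using SciPy's Rotation class.

```python
# Minimal sketch (not Smode code) of what "offsetting the orientation" means:
# compose a fixed correction rotation with the orientation reported by the
# tracker. The angle values below are placeholders for this illustration.
from scipy.spatial.transform import Rotation as R

# Orientation streamed by the tracking system (pan, tilt, roll in degrees).
tracker_orientation = R.from_euler("ZYX", [90.0, 5.0, 0.0], degrees=True)

# Offset found by panning/tilting the real camera and comparing with the
# virtual one, e.g. the virtual camera looks 12 degrees too far to the left.
orientation_offset = R.from_euler("ZYX", [12.0, 0.0, 0.0], degrees=True)

# Corrected orientation = offset applied on top of the tracked orientation.
corrected = orientation_offset * tracker_orientation
print(corrected.as_euler("ZYX", degrees=True))
```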
                    
2.1) For the Stype tracking system
                    
- [On the Stype side]: Also ensure that the camera position is correctly supported by Stype (seen by all cameras). The example below is not good:
- [On the Stype side]: Calibrate the min and max zoom on the Stype computer.
                      
 
                    
2.2) For the FreeD tracking system
You need to report the maximum and minimum zoom values on your Camera model. Zoom in and out until the maximum values for min and max are reached.
Verify that the values set in the
Custom Zoom Interval
are correct.

If they are not, you can set them manually.
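FreeD transmits the zoom as a raw encoder value, which is why this min/max scan is needed. As an illustration only (not Smode code), here is a minimal Python sketch that reads the zoom field from a standard 29-byte FreeD D1 packet and normalizes it with the observed min/max; the ZOOM_MIN/ZOOM_MAX values are placeholders.

```python
# Minimal sketch (not Smode code): read the raw zoom value from a 29-byte
# FreeD "type D1" packet and normalize it with the min/max values found while
# scanning the lens. The byte layout follows the common FreeD D1 description:
# byte 0 = 0xD1, byte 1 = camera ID, pan/tilt/roll (3 bytes each),
# X/Y/Z (3 bytes each), zoom (3 bytes), focus (3 bytes), spare, checksum.

def freed_raw_zoom(packet: bytes) -> int:
    if len(packet) != 29 or packet[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    # Zoom is a 24-bit big-endian integer at bytes 20..22 (raw encoder counts).
    return int.from_bytes(packet[20:23], "big")

# Placeholder values observed while zooming fully out / fully in, i.e. what
# you report in the "Custom Zoom Interval" of the Camera model.
ZOOM_MIN, ZOOM_MAX = 1200, 58000

def normalized_zoom(raw: int) -> float:
    return (raw - ZOOM_MIN) / (ZOOM_MAX - ZOOM_MIN)
```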
            
 
                    
                    
                    4) Calibration process
                     
                    
                    Enable "Send Locator" (1) 
                    Wait until the "Standard deviation" parameter for position and orientation reaches 0 (2).
					This data represents "jitter" in the signal output from your tracking system. Either the camera is not yet stable or there is a problem, verify with the people who set up the tracking device tracking system.
                    Verify that a sufficient number of tags are detected (3). If necessary, adjust the Focus and use the Enable detector function (4) without being on Air to view the detected tags in the viewport
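As an illustration of what the "Standard deviation" readout represents (this is not Smode's code), here is a minimal Python sketch that keeps the last N tracker samples and treats the camera as stable only when their spread is below a small threshold; the window size and thresholds are placeholder assumptions.

```python
# Minimal sketch (not Smode code) of the stability check behind the
# "Standard deviation" readout: keep the last N tracker samples and treat the
# camera as stable only when their spread is (almost) zero. The window size
# and thresholds below are placeholders for this illustration.
from collections import deque
import numpy as np

WINDOW = 60                    # number of recent tracker samples to look at
POSITION_THRESHOLD = 1e-4      # metres
ORIENTATION_THRESHOLD = 1e-3   # degrees

positions = deque(maxlen=WINDOW)     # each item: (x, y, z)
orientations = deque(maxlen=WINDOW)  # each item: (pan, tilt, roll)
# In a real loop you would append each incoming tracker sample:
# positions.append((x, y, z)); orientations.append((pan, tilt, roll))

def camera_is_stable() -> bool:
    if len(positions) < WINDOW:
        return False
    position_deviation = np.std(np.asarray(positions), axis=0).max()
    orientation_deviation = np.std(np.asarray(orientations), axis=0).max()
    return (position_deviation < POSITION_THRESHOLD
            and orientation_deviation < ORIENTATION_THRESHOLD)
```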
 
                    
                    
Then you can store a single frame (5).
Move the camera for the next frame, wait until it is stable again (position/orientation deviation), then store another single frame (5).
These last steps need to be repeated several times.
 
                    
                    
When you have multiple frames, you can press
Evaluate (6)
and delete or mute frames whose error is far above the average.
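As an illustration of the filtering described above (not Smode code), here is a minimal Python sketch that flags frames whose error is well above the average after an Evaluate pass; the 2x factor and the error values are placeholders.

```python
# Minimal sketch (not Smode code) of the filtering described above: after an
# Evaluate pass, flag frames whose error is far above the average so they can
# be deleted or muted. The 2x factor and the error values are placeholders.
import numpy as np

frame_errors = {            # frame index -> evaluation error (e.g. pixels)
    0: 0.8, 1: 1.1, 2: 0.9, 3: 6.4, 4: 1.0,
}

mean_error = np.mean(list(frame_errors.values()))
suspects = [i for i, err in frame_errors.items() if err > 2.0 * mean_error]
print(f"Average error: {mean_error:.2f}, frames to mute/delete: {suspects}")
```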
 
                    
                    
Once you have verified your frames, press
Calibrate (7)
Wait until the toggle is automatically unset at the end of the calibration.
 
                    
4.1) Stype Calibration process
When using Stype, you need to choose whether to use the optic data from Stype or to calibrate a prime lens optic in Smode in your
Physical Camera
A camera that simulates a real one
Read More
, using the mode parameter.
                    
 
                    
                    4.2) FreeD Calibration process
FreeD calibration takes time (30 min to 1 hour).
            
1) Roughly place the position and orientation of the tracking system in the stage.
2) Scan the zoom to get the min & max zoom values (cf. the "Custom Zoom Interval" display in the Physical Camera).
3a) Go to a wide zoom level and capture multiple frames with different orientations so that the April Tags cover the 4 corners of the camera image; do this for 3 different positions.
3b) Verify that the FoV, K1, K2, ShiftX and ShiftY polynomials are at degree 0 and press Calibrate (see the sketch at the end of this section).
4a) Staying in the current position, capture frames at several new zoom levels, covering the whole matrix of the camera with detected tags, especially in the wide shots (=> 8 frames in total).
4b) Calibrate with degree 2.
4c) Calibrate with degree 4.
5a) Look for problematic areas and capture more frames there.
5b) Calibrate with an increased degree.
5c) Go back to 5a if there are still problematic areas.
To save your Camera model, check:
Physical Camera
A camera that simulates a real one
Read More
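As an illustration of why the degree is increased step by step (this is not Smode's internal solver), here is a minimal Python sketch that fits one lens parameter (here the FoV) as a polynomial of the normalized zoom at degrees 0, 2 and 4; the sample values are placeholders.

```python
# Minimal sketch (not Smode's internal solver) of why the degree is increased
# step by step: each lens parameter (FoV, K1, K2, ShiftX, ShiftY) is modelled
# as a polynomial of the normalized zoom. With frames at a single zoom level
# only a degree-0 (constant) fit is meaningful; more zoom levels allow degree
# 2, then degree 4. The sample values below are placeholders.
import numpy as np

# Normalized zoom of the stored frames and the FoV estimated at each one.
zoom = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
fov_deg = np.array([62.0, 48.5, 37.0, 28.0, 21.5])

for degree in (0, 2, 4):
    coefficients = np.polyfit(zoom, fov_deg, degree)
    residual = np.max(np.abs(np.polyval(coefficients, zoom) - fov_deg))
    print(f"degree {degree}: max residual {residual:.2f} degrees")
```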
                     
                    
                    5) Export Calibration
                    If you are satisfied with the calibration, export it. This will create a .geocal file directly in the Smode project.
 
                    
                     
                    
                     
                    
6) Troubleshooting
                    
- Try calibrating with only one frame enabled
- Verify that the screens are connected to the right outputs
- Verify your UVs if you are using an FBX file
- Verify that the orientation of the screens is correct (with a test pattern, for example)
                      
 
Next step:
XR - Color Calibration
Blend the wall colors perfectly with the virtual surroundings.
Read More