How To Do Calibration For Kinect Camera In Matlab
Joint-Calibration-Toolbox-for-Kinect
Table of Contents
- Code
- Usage
- Demo
- Offline calibration
- Online calibration
- Offline vs. Online
- Illustration
- View depth data
- Capture
- Record
- Configuration
- Environment
- Compile libfreenect yourself
- After Calibration
- Visualization
- Reprojection
- Disparity file content
- Conversion between disparity and depth
- Reference
- Update
Code
This toolbox contains some useful code for calibrating Kinect, which also replicates my review work on Kinect calibration.
For anyone interested in this work, you can find details in my review paper "A review and quantitative comparison of methods for kinect calibration" as well as the corresponding technical report.
Usage
There are three directories:
- capture--including the libfreenect library and its Matlab wrapper.
- record--a tool that helps to record the data stream from Kinect in Matlab.
- demo--including the core of the calibration and a sample dataset.
Demo
Run main.m to start; the program is quite easy to understand as long as you follow the command prompts. There are two ways to calibrate your Kinect:
- offline--calibrate using your pre-obtained data.
- online--calibrate using data captured in real time.
Offline calibration
I am providing the dataset '../smallset/' for reference (however, you must capture data with your own Kinect), which exemplifies the file format of our offline calibration: (instructions cited from the document attached to the original code)
- Put all files in the same directory, whose path needs to be specified in the command prompt.
- For a successful calibration, 4 types of images must be present: frontal plane, plane rotated around the X axis, plane rotated around the Y axis, and a full-image planar surface for distortion correction.
- If you want to calibrate the disparity distortion model, you should also take some images of a bigger plane (e.g. a wall) that fills the entire image.
- For a proper calibration, the algorithm requires around 30 images from different angles.
- All images taken by the Kinect color camera are formatted as "0000-c1.jpg".
- All images taken by the Kinect depth camera are formatted as "0000-d.pgm".
- Images corresponding to the same plane pose should have the same prefix number.
- A given plane pose may have only the color image, only the depth image, or any combination.
- The calibration uses the information from those images that are present.
It should be noted that during offline calibration, intermediate data will be saved in the directory where all the images are, in case any accident happens. If you want to redo any intermediate step, just delete the corresponding .mat file and its previous .mat file, and the algorithm should go through correctly.
Note on the sequence of intermediate data storage: dataset.mat->rgb_corners.mat->depth_plane_mask.mat->calib0.mat->final_calib.mat
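As a quick illustration of the naming convention above, the following Python sketch (not part of the toolbox; the helper name is hypothetical) builds the color/depth filename pair expected for each plane pose:

```python
# Illustrative sketch of the offline dataset naming convention:
# color images are "0000-c1.jpg", depth images are "0000-d.pgm",
# and both files of the same plane pose share the prefix number.

def expected_pair(pose_index):
    """Return the (color, depth) filenames for one plane pose."""
    prefix = "%04d" % pose_index
    return ("%s-c1.jpg" % prefix, "%s-d.pgm" % prefix)

# Around 30 poses are recommended for a proper calibration.
pairs = [expected_pair(i) for i in range(30)]
print(pairs[0])  # ('0000-c1.jpg', '0000-d.pgm')
```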
Online calibration
There are several important default parameters that should be clarified for online calibration:
- sqr_def: square size of the checkerboard (m).
- win_def: window size of the automatic corner finder (pixels).
- cnt_x_def: inner corner count of the checkerboard in the X direction.
- cnt_y_def: inner corner count of the checkerboard in the Y direction.
- pic_num: number of images that must be taken for each of the three views (frontal, around X and around Y), at various distances.
- wall_num: number of images that must be taken in front of a wall, at various distances.
You may decrease pic_num and wall_num for your ease of use; however, for a proper calibration around 30 images are needed, while approximately 50 images will help achieve a satisfactory result. That is to say, eight images for every pose plus 5 wall images (the default) meet the basic requirement, while 10 for every pose plus 8 wall images are definitely better.
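The counts above can be sanity-checked with a one-liner (an illustrative Python sketch, assuming the totals are simply pic_num images per view times three views plus wall_num wall images, as described):

```python
# How many images a given (pic_num, wall_num) setting implies:
# pic_num images for each of the three views, plus wall_num wall images.

def total_images(pic_num, wall_num):
    return 3 * pic_num + wall_num

print(total_images(8, 5))   # default: 29 images, meeting the ~30 basic requirement
print(total_images(10, 8))  # a more comfortable setting
```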
Also, please see the following illustration on how to count the inner corners of the checkerboard:
Be aware that during online calibration, all intermediate data, including captured images, detected rgb corners, selected depth corners, etc., will be deleted out of concern for conflicting data. This may not be a good design; therefore, I will add functions to save intermediate data in the future.
Offline vs. Online
In the original code, the author only provided offline calibration, which makes corner detection a headache due to low-quality captured images and far placement of the checkerboard. Offline calibration in this demo remains the same as the original one, i.e., for each of the images in the specified directory, the corner finder will try to detect (cnt_x_def+1) × (cnt_y_def+1) squares with side size sqr_def (by default).
If the detection fails, however, you will be asked to manually label the 4 corners of the checkerboard. But remember, most likely the corner finder will still not detect all the points successfully, given that the image quality is low and/or the checkerboard is placed far away (more than 5m, empirically) from the camera. To avoid this, I suggest:
- Using online calibration when possible.
- If corner detection fails in offline calibration, delete that specific picture (and the corresponding depth image) and re-take one at the same position, adjusting the illumination condition and/or distance. Do not forget to name the retaken images (both rgb and depth) with the same index in the prescribed format.
- If corner detection fails in online calibration, simply adjust the illumination condition and/or distance and restart following the prompt.
- For your ease of use, try to keep the distances smaller than 5m.
Illustration
To get a better understanding of how you should capture your data, please do it according to the following illustration:
View depth data
If you have difficulty viewing pgm files with the Windows built-in Photo Viewer, please download XnView.
Capture
Run mex_test (make sure your Kinect is well connected) to see whether you can use libfreenect smoothly; you will see a window pop up like this
This program will exit once you click 'X' on the figure.
Record
Run record (make sure your Kinect is well connected) to start; the program will begin recording the data stream into '../output/pic/'. Once you turn on the make_avi flag in record.m, the stream will be recorded as video with the specified frame rate, in the directory '../output/video/'. Besides, you can also adjust the starting index of the data set with the variable set_ind, and that of the file name with the variable cnt. The file name format is e.g. "set0_0000.jpg" and "set0_0000.pgm", which store rgb and raw depth (also called disparity) respectively.
Note that main.m is able to capture data on its own and therefore does not rely on this tool.
This program will exit once you right-click on the figure.
Configuration
Be aware that in order to capture the Kinect data stream directly from Matlab, you must have the prescribed configuration on your system.
Environment
The code depends on the 3rd-party library libfreenect to connect to Kinect. I have compiled its Matlab wrapper (named kinect_mex.mex) for your convenient use in Matlab. Currently, it supports all three popular platforms of Matlab with a 64-bit environment:
| | Windows 7 | Windows 8.1 | Linux (Ubuntu 14.04) | Mac OS X |
|---|---|---|---|---|
| 32-bit | N/A | N/A | N/A | N/A |
| 64-bit | Yes | Yes | Yes | Yes |
Compile libfreenect yourself
You may find that you are not able to run kinect.mex when capturing data, which means you have to compile libfreenect yourself.
However, if you are using 32-bit Matlab/Windows, you may have a painful experience resolving dependencies to compile libfreenect on your own system. For your reference, I am attaching the environment and all dependencies used when I compiled libfreenect:
- Platforms: Windows 7 (64-bit) + Matlab (64-bit) + Visual Studio 2010 (32-bit only) + Windows SDK 7.1
- Libraries: libfreenect-0.1.2 + glut-3.7.6-bin + libusb-1.0.19 + libusb-win32-bin-1.2.4.0 + pthreads-w32-2-8-0-release + unistd
Unfortunately, the newest version (v0.5.0) of libfreenect that I compiled could not run successfully, and after digging for quite a while, I found libfreenect-0.1.2 is most suitable for the wrapper we use. As you may notice, there are two versions of libusb linked; the reason is that libfreenect-0.1.2 calls interfaces from both versions.
In addition, always make sure that you are using the same environment for the following:
- The mex compiler embedded in Matlab
- The C compiler (usually located in the Visual Studio IDE); use mex -setup to find a C compiler for Matlab
- All other 3rd-party dependencies
to compile mex in Matlab. E.g., if you are running 32-bit Matlab, you can use the default C compiler located in Visual Studio. However, for 64-bit Matlab, please download WinSDK 7.1 in order to let Matlab use the 64-bit C compiler and, meanwhile, re-compile all other 3rd-party dependencies using the 64-bit compiler and link them all together for kinect.mex. You may see "undefined external symbols" when you are trying to compile the mex files if any of your dependencies' bit environment is different from your compiler's, or if the two compilers differ from each other.
See compile_mex.m in the capture directory for details.
After Calibration
Visualization
In the demo/toolbox directory, find the file show_calib_result.m, which you can use to visually evaluate your calibration result, as shown below:
Note that in show_calib_result.m, change calib_filepath to the path where final_calib.mat is located. You can either test on a single image file (a file that must have been taken by the specific Kinect you calibrated) or on the data stream, by switching the variable testing_realdata. Besides, turn off add_noncalib_comparison to skip the comparison with the non-calibrated result.
Also, be aware that the program uses a multi-scale bilateral filter to fill the 'holes' in the depth image, which result from surfaces of high/low reflectivity. The code of the filter can be downloaded from the NYU Kinect toolbox V2.
This program will exit once you click 'X' on the figure.
Reprojection
For anyone who wants to use the calibration result to reproject depth points from depth space to rgb space, please make use of the function compute_rgb_depthmap. Also, refer to show_calib_result.m on how to use this function correctly.
Disparity file content
The raw data obtained from libfreenect using the Kinect depth sensor is actually an 11-bit number between 0-2047, called disparity (with unit kdu--kinect disparity unit), while other libraries like OpenNI provide converted depth data. One of the easiest conversions is (used in the demo code):

z_d = 1 / (c0 + c1 * d)

where d is the disparity and z_d is the converted depth. [c0, c1] are part of the depth camera intrinsic parameters to be calibrated. If you want to convert data from disparity to depth with a better approximation, take a look at this link: http://openkinect.org/wiki/Imaging_Information
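The conversion above can be sketched in a few lines of Python (illustrative only; the [c0, c1] values are the ones recommended later in this document, from "Hacking the Kinect"):

```python
# Minimal sketch of the disparity-to-depth conversion z_d = 1 / (c0 + c1 * d),
# with the [c0, c1] values recommended in "Hacking the Kinect".

C0 = 3.3309495161
C1 = -0.0030711016

def disparity_to_depth(d, c0=C0, c1=C1):
    """Convert a raw Kinect disparity value (kdu, 0-2047) to depth in meters."""
    return 1.0 / (c0 + c1 * d)

print(disparity_to_depth(380))   # near range, roughly 0.46 m
print(disparity_to_depth(1009))  # far range, over 4 m
```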
It is worth mentioning that if you use Matlab imread to read a pgm file, it will automatically rescale the data from [0, 2047] to [0, 65535], which leads to failure of the algorithm. Therefore, I am providing a function format_disparity that will recover the data read from imread. The second argument indicates whether you are using disparity read from Matlab imread (true) or not (false).
The following are suggestions for anyone who wants to use the demo code for their own purpose:
- If you want to read disparity given the path of a pgm file, use read_disparity directly.
- If you have disparity captured from libfreenect, do not do anything.
- If you have disparity read by Matlab imread from a pgm file, use format_disparity.
In the code I uploaded, format_disparity is called immediately after capturing data from libfreenect. It has no effect there, other than making you aware of the pgm file content.
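The idea behind format_disparity can be sketched as follows (a hypothetical Python stand-in, not the toolbox's actual implementation; it assumes the rescale is a linear map between the [0, 2047] and [0, 65535] ranges, as described above):

```python
# Hypothetical stand-in for format_disparity: Matlab's imread rescales the
# 11-bit pgm content from [0, 2047] to [0, 65535], so we map it back to
# raw kdu before using it in calibration.

def recover_disparity(v):
    """Map a value read by imread ([0, 65535]) back to raw kdu ([0, 2047])."""
    return int(round(v * 2047.0 / 65535.0))

print(recover_disparity(65535))  # 2047
print(recover_disparity(0))      # 0
```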
Conversion between disparity and depth
For all three Kinect libraries (Windows Kinect SDK, libfreenect and OpenNI/OpenNI2), you should be aware of the data format they use. libfreenect returns disparity, while the others return depth in mm. However, if you want to calibrate with this toolbox, you must input disparity rather than depth. For depth data captured from Windows Kinect SDK or OpenNI/OpenNI2, the first thing you should do is convert depth to disparity, which can be done by using the inverse of the above equation.
Note that after you apply the inverse equation to depth, the disparity values will lie in a certain range. E.g., if you convert with [c0, c1] = [3.3309495161, -0.0030711016], which is recommended by the book "Hacking the Kinect", the converted disparity will have a range of [678.0987, 1002.6947] (I am assuming the data is captured from WinSDK, which has a range limit of at least 0.8m). However, the raw disparity data returned by libfreenect lies in a range like [380, 1009] (the lower limit indicates that libfreenect can capture data as close as 0.4m), which is what this toolbox expects.
Therefore, if you have converted disparity data, make sure you know which [c0, c1] you used, or which were used by the people who sent you the data. Also, to visualize the disparity data, you need to scale it into a range of [0, 65535]. So the correct sequence of conversion from depth to disparity must be (supposing you use Windows Kinect SDK or OpenNI/OpenNI2):
- Convert the depth data from units of millimeters to meters.
- Convert the depth in meters to disparity using the inverse of the equation shown above, where you need to know which [c0, c1] you use. I recommend [c0, c1] = [3.3309495161, -0.0030711016] (which shows better performance in experiments).
- Check the range of the disparity data you have obtained, say [min, max].
- Scale the disparity with the standard range of [0, 2047] by using "imd=(imd+1-min)/(max-min+1)" (adding 1 in order to distinguish the outlier value 0 from the minimal disparity value in the image).
- Scale the disparity from [0, 2047] to [0, 65535] by using a bit-shifting operation (from bits 1:log2(2048) to 1:log2(65536)) and save it in pgm format.
- The pgm files can now be visualized with tools like XnView.
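The depth-to-disparity step above can be sketched in Python (illustrative only; it assumes the same [c0, c1] values recommended above, and depth input in millimeters as returned by WinSDK/OpenNI):

```python
# Sketch of the inverse of z = 1/(c0 + c1*d): recover disparity from depth.

C0 = 3.3309495161
C1 = -0.0030711016

def depth_m_to_disparity(z_m, c0=C0, c1=C1):
    """Convert depth in meters to Kinect disparity (kdu)."""
    return (1.0 / z_m - c0) / c1

# Depth from WinSDK/OpenNI comes in millimeters; convert to meters first.
depth_mm = [800.0, 4000.0]
disp = [depth_m_to_disparity(z / 1000.0) for z in depth_mm]
print(disp)  # roughly [678, 1003], matching the WinSDK working range above
```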
Here are some recommendations for reading pgm files:
- If you are reading data captured from libfreenect, you don't have to do anything.
- If you are reading data captured from Windows Kinect SDK or OpenNI/OpenNI2, read the data and first normalize the disparity values into the standard 11-bit range, i.e., [0, 2047]. After that, make sure you rescale your data with the obtained range [min, max] by using "imd=imd*(max-min+1)+min-1". Now you can convert from disparity to depth safely and get correct depth data.
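The write-time and read-time formulas quoted above are exact inverses of each other, which a short round-trip check illustrates (a Python sketch with a hypothetical [min, max] range, not toolbox code):

```python
# Round-trip check of the scaling formulas: writing uses
#   imd_norm = (imd + 1 - min) / (max - min + 1)
# and reading restores the original with
#   imd = imd_norm * (max - min + 1) + min - 1.

def normalize(imd, lo, hi):
    return (imd + 1.0 - lo) / (hi - lo + 1.0)

def denormalize(imd_norm, lo, hi):
    return imd_norm * (hi - lo + 1.0) + lo - 1.0

lo, hi = 678, 1003  # example disparity range after converting WinSDK depth
for d in (678, 800, 1003):
    assert abs(denormalize(normalize(d, lo, hi), lo, hi) - d) < 1e-9
```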
Reference
If you find this toolbox useful, please cite our paper:
@inproceedings{xiang2015review, title={A review and quantitative comparison of methods for kinect calibration}, author={Xiang, Wei and Conly, Christopher and McMurrough, Christopher D and Athitsos, Vassilis}, booktitle={Proceedings of the 2nd international Workshop on Sensor-based Activity Recognition and Interaction}, pages={3}, year={2015}, organization={ACM} }
This work originates from the University of Oulu, by: Herrera C., D., Kannala, J., Heikkila, J., "Joint depth and color camera calibration with distortion correction", TPAMI, 2012. Please see the link to the original code: http://www.ee.oulu.fi/~dherrera/kinect/, as well as the corresponding document.
The libfreenect Matlab wrapper is provided by Alexander Berg.
For details on how to use Kinect, please take a look at: Kramer, Jeff, Nicolas Burrus, and Florian Echtler. Hacking the Kinect. New York, NY, USA: Apress, 2012.
Update
The current up-to-date version is v02_11_2015 (named after the last date I modified the files).
Questions, comments and bugs, please email to: wei.xiang@mavs.uta.edu
Source: https://github.com/mmLukas/Joint-Calibration-Toolbox-for-Kinect
