Calibrate fisheye lens - ofxCv

I have successfully integrated the ofxCv library's calibration functionality into my project (a threaded multi-camera system for Point Grey GigE cameras) on Ubuntu 15.04 x64 with OF 0.8.4.

The problem is that the camera has a 1/1.2" sensor with a 2.7mm lens. The calibration function does not appear to work with fisheye lenses directly. I have done quite a bit of research, but admittedly I am new to working with calibration in OpenCV. It appears that I need to set the flags to something like:

	int calibFlags = CV_CALIB_FIX_K1 | CV_CALIB_FIX_K2 | CV_CALIB_FIX_K3 | CV_CALIB_FIX_K4 | CV_CALIB_FIX_K5 | CV_CALIB_FIX_K6 | CV_CALIB_FIX_FOCAL_LENGTH | CV_CALIB_USE_INTRINSIC_GUESS;

and apply this to calibrateCamera();

I am adapting the ofxCv code directly. I have backed up the original, and once I have it working my plan is to make another header and C++ file to extend its functionality.

Even with the flags I cannot get the fisheye distortion to clean up the way I can with lenses in the normal range. It is also apparently unable to automatically calculate the sensor size or the focal length; I could use help on how to set those manually, since I know the numbers.

Essentially, any help adapting the ofxCv calibration to work with fisheye lenses would be great, along with how to manually handle sensorWidth/sensorHeight and focalLength.

Also, regarding the settings.yml xCount and yCount: does the counting start from 0 or 1? My actual pattern is x=11 and y=8, but it only works if I list them as 10 and 7.

Here is the calibration.h file I am using in my project to reference the ofxCv calibration:

#pragma once

#include "ofApp.h"

using namespace ofxCv;
using namespace cv;

class GigECalibration{

    public:
        GigECalibration(){
            active = true;
        }

        virtual ~GigECalibration(){
            //calibration.reset();
        }
        void setup(ofImage cam) {
            camImage = cam;
            calibration.setFillFrame(false);
            FileStorage settings(ofToDataPath("settings.yml"), FileStorage::READ);
            if(settings.isOpened()) {
                int xCount = settings["xCount"], yCount = settings["yCount"];
                calibration.setPatternSize(xCount, yCount);
                float squareSize = settings["squareSize"];
                calibration.setSquareSize(squareSize);
                CalibrationPattern patternType;
                switch(settings["patternType"]) {
                    case 0: patternType = CHESSBOARD; break;
                    case 1: patternType = CIRCLES_GRID; break;
                    case 2: patternType = ASYMMETRIC_CIRCLES_GRID; break;
                }
                calibration.setPatternType(patternType);
            }

            imitate(undistorted, camImage);
            imitate(previous, camImage);
            imitate(diff, camImage);

            lastTime = 0;

            //active = true;
        }

        void update(ofImage cam) {
            camImage = cam;
            Mat camMat = toCv(camImage);
            Mat prevMat = toCv(previous);
            Mat diffMat = toCv(diff);

            absdiff(prevMat, camMat, diffMat);
            camMat.copyTo(prevMat);

            diffMean = mean(Mat(mean(diffMat)))[0];

            float curTime = ofGetElapsedTimef();
            if(active && curTime - lastTime > timeThreshold && diffMean < diffThreshold) {
                if(calibration.add(camMat)) {
                    cout << "re-calibrating" << endl;
                    calibration.calibrate();
                    if(calibration.size() > startCleaning) {
                        calibration.clean();
                    }
                    calibration.save("calibration.yml");
                    lastTime = curTime;
                }
            }

            if(calibration.size() > 0) {
                calibration.undistort(toCv(camImage), toCv(undistorted));
                undistorted.update();
            }
        }

        void draw(int camW, int camH) {
            ofSetColor(255);
            camImage.draw(0, 0,camW/2, camH/2);
            undistorted.draw(camW/2, 0, camW/2,camH/2);

            if(active){
                ofDrawBitmapString("Calibration Active",ofGetWidth()-200,20);
            }else{
                ofDrawBitmapString("Calibration Inactive",ofGetWidth()-200,20);
            }
            stringstream intrinsics;
            intrinsics << "fov: " << toOf(calibration.getDistortedIntrinsics().getFov()) << " distCoeffs: " << calibration.getDistCoeffs();
            drawHighlightString(intrinsics.str(), 10, 20, yellowPrint, ofColor(0));
            drawHighlightString("movement: " + ofToString(diffMean), 10, 40, cyanPrint);
            drawHighlightString("reproj error: " + ofToString(calibration.getReprojectionError()) + " from " + ofToString(calibration.size()), 10, 60, magentaPrint);
            for(int i = 0; i < calibration.size(); i++) {
                drawHighlightString(ofToString(i) + ": " + ofToString(calibration.getReprojectionError(i)), 10, 80 + 16 * i, magentaPrint);
            }
        }


        bool active;
    protected:
        const float diffThreshold = 2.5; // maximum amount of movement
        const float timeThreshold = 1; // minimum time between snapshots
        const int startCleaning = 20; // start cleaning outliers after this many samples

        ofImage camImage;
        ofImage undistorted;
        ofPixels previous;
        ofPixels diff;
        float diffMean;

        float lastTime;

        ofxCv::Calibration calibration;


};

Here is the slightly adapted ofxCv/Calibration.cpp:

#include "ofxCv/Calibration.h"
#include "ofxCv/Helpers.h"
#include "ofFileUtils.h"

namespace ofxCv {

	using namespace cv;

	void Intrinsics::setup(Mat cameraMatrix, cv::Size imageSize, cv::Size sensorSize) {
		this->cameraMatrix = cameraMatrix;
		this->imageSize = imageSize;
		this->sensorSize = sensorSize;
		calibrationMatrixValues(cameraMatrix, imageSize, sensorSize.width, sensorSize.height,
														fov.x, fov.y, focalLength, principalPoint, aspectRatio);

	}

	void Intrinsics::setImageSize(cv::Size imgSize) {
		imageSize = imgSize;
	}

	Mat Intrinsics::getCameraMatrix() const {
		return cameraMatrix;
	}

	cv::Size Intrinsics::getImageSize() const {
		return imageSize;
	}

	cv::Size Intrinsics::getSensorSize() const {
		return sensorSize;
	}

	cv::Point2d Intrinsics::getFov() const {
		return fov;
	}

	double Intrinsics::getFocalLength() const {
		return focalLength;
	}

	double Intrinsics::getAspectRatio() const {
		return aspectRatio;
	}

	Point2d Intrinsics::getPrincipalPoint() const {
		return principalPoint;
	}

	void Intrinsics::loadProjectionMatrix(float nearDist, float farDist, cv::Point2d viewportOffset) const {
		ofViewport(viewportOffset.x, viewportOffset.y, imageSize.width, imageSize.height);
		ofSetMatrixMode(OF_MATRIX_PROJECTION);
		ofLoadIdentityMatrix();
		float w = imageSize.width;
		float h = imageSize.height;
		float fx = cameraMatrix.at<double>(0, 0);
		float fy = cameraMatrix.at<double>(1, 1);
		float cx = principalPoint.x;
		float cy = principalPoint.y;

		ofMatrix4x4 frustum;
		frustum.makeFrustumMatrix(
			nearDist * (-cx) / fx, nearDist * (w - cx) / fx,
			nearDist * (cy) / fy, nearDist * (cy - h) / fy,
			nearDist, farDist);
		ofMultMatrix(frustum);

		ofSetMatrixMode(OF_MATRIX_MODELVIEW);
		ofLoadIdentityMatrix();

		ofMatrix4x4 lookAt;
		lookAt.makeLookAtViewMatrix(ofVec3f(0,0,0), ofVec3f(0,0,1), ofVec3f(0,-1,0));
		ofMultMatrix(lookAt);
	}

	Calibration::Calibration() :
		patternType(CHESSBOARD),
		patternSize(cv::Size(10, 7)), // based on Chessboard_A4.pdf, assuming world units are centimeters
		subpixelSize(cv::Size(11,11)),
		squareSize(2.5),
		reprojectionError(0),
		fillFrame(true),
		ready(false) {

	}

	void Calibration::save(string filename, bool absolute) const {
		if(!ready){
			ofLog(OF_LOG_ERROR, "Calibration::save() failed, because your calibration isn't ready yet!");
		}
		FileStorage fs(ofToDataPath(filename, absolute), FileStorage::WRITE);
		cv::Size imageSize = distortedIntrinsics.getImageSize();
		cv::Size sensorSize = distortedIntrinsics.getSensorSize();
		Mat cameraMatrix = distortedIntrinsics.getCameraMatrix();
		fs << "cameraMatrix" << cameraMatrix;
		fs << "imageSize_width" << imageSize.width;
		fs << "imageSize_height" << imageSize.height;
		fs << "sensorSize_width" << sensorSize.width;
		fs << "sensorSize_height" << sensorSize.height;
		fs << "distCoeffs" << distCoeffs;
		fs << "reprojectionError" << reprojectionError;
		fs << "features" << "[";
		for(int i = 0; i < (int)imagePoints.size(); i++) {
			fs << "[:" << imagePoints[i] << "]";
		}
		fs << "]";
	}

	void Calibration::load(string filename, bool absolute) {
		imagePoints.clear();
		FileStorage fs(ofToDataPath(filename, absolute), FileStorage::READ);
		cv::Size imageSize, sensorSize;
		Mat cameraMatrix;
		fs["cameraMatrix"] >> cameraMatrix;
		fs["imageSize_width"] >> imageSize.width;
		fs["imageSize_height"] >> imageSize.height;
		fs["sensorSize_width"] >> sensorSize.width;
		fs["sensorSize_height"] >> sensorSize.height;
		fs["distCoeffs"] >> distCoeffs;
		fs["reprojectionError"] >> reprojectionError;
		FileNode features = fs["features"];
		for(FileNodeIterator it = features.begin(); it != features.end(); it++) {
			vector<Point2f> cur;
			(*it) >> cur;
			imagePoints.push_back(cur);
		}
		addedImageSize = imageSize;
		distortedIntrinsics.setup(cameraMatrix, imageSize, sensorSize);
		updateUndistortion();
		ready = true;
	}
	void Calibration::setIntrinsics(Intrinsics& distortedIntrinsics, Mat& distortionCoefficients){
		this->distortedIntrinsics = distortedIntrinsics;
		this->distCoeffs = distortionCoefficients;
		this->addedImageSize = distortedIntrinsics.getImageSize();
		updateUndistortion();
		this->ready = true;
	}
	void Calibration::reset(){
		this->ready = false;
		this->reprojectionError = 0.0;
		this->imagePoints.clear();
		this->objectPoints.clear();
		this->perViewErrors.clear();
	}
	void Calibration::setPatternType(CalibrationPattern patternType) {
		this->patternType = patternType;
	}
	void Calibration::setPatternSize(int xCount, int yCount) {
		patternSize = cv::Size(xCount, yCount);
	}
	void Calibration::setSquareSize(float squareSize) {
		this->squareSize = squareSize;
	}
	void Calibration::setFillFrame(bool fillFrame) {
		this->fillFrame = fillFrame;
	}
	void Calibration::setSubpixelSize(int subpixelSize) {
		subpixelSize = MAX(subpixelSize,2);
		this->subpixelSize = cv::Size(subpixelSize,subpixelSize);
	}
	bool Calibration::add(Mat img) {
		addedImageSize = img.size();

		vector<Point2f> pointBuf;

		// find corners
		bool found = findBoard(img, pointBuf);

		if (found)
			imagePoints.push_back(pointBuf);
		else
			ofLog(OF_LOG_ERROR, "Calibration::add() failed, maybe your patternSize is wrong or the image has poor lighting?");
		return found;
	}
	bool Calibration::findBoard(Mat img, vector<Point2f>& pointBuf, bool refine) {
		bool found=false;
		if(patternType == CHESSBOARD) {
			// no CV_CALIB_CB_FAST_CHECK, because it breaks on dark images (e.g., dark IR images from kinect)
			int chessFlags = CV_CALIB_CB_ADAPTIVE_THRESH;// | CV_CALIB_CB_FILTER_QUADS;// | CV_CALIB_CB_NORMALIZE_IMAGE;
			found = findChessboardCorners(img, patternSize, pointBuf, chessFlags);

			// improve corner accuracy
			if(found) {
				if(img.type() != CV_8UC1) {
                    copyGray(img, grayMat);
				} else {
					grayMat = img;
				}

				if(refine) {
					// the 11x11 dictates the smallest image space square size allowed
					// in other words, if your smallest square is 11x11 pixels, then set this to 11x11
					cornerSubPix(grayMat, pointBuf, subpixelSize,  cv::Size(-1,-1), TermCriteria(CV_TERMCRIT_EPS + CV_TERMCRIT_ITER, 30, 0.1 ));
				}
			}
		}
#ifdef USING_OPENCV_2_3
		else {
			int flags = (patternType == CIRCLES_GRID ? CALIB_CB_SYMMETRIC_GRID : CALIB_CB_ASYMMETRIC_GRID); // + CALIB_CB_CLUSTERING
			found = findCirclesGrid(img, patternSize, pointBuf, flags);
		}
#endif
		return found;
	}
	bool Calibration::clean(float minReprojectionError) {
		int removed = 0;
		for(int i = size() - 1; i >= 0; i--) {
			if(getReprojectionError(i) > minReprojectionError) {
				objectPoints.erase(objectPoints.begin() + i);
				imagePoints.erase(imagePoints.begin() + i);
				removed++;
			}
		}
		if(size() > 0) {
			if(removed > 0) {
				return calibrate();
			} else {
				return true;
			}
		} else {
			ofLog(OF_LOG_ERROR, "Calibration::clean() removed the last object/image point pair");
			return false;
		}
	}
	bool Calibration::calibrate() {
		if(size() < 1) {
			ofLog(OF_LOG_ERROR, "Calibration::calibrate() doesn't have any image data to calibrate from.");
			if(ready) {
				ofLog(OF_LOG_ERROR, "Calibration::calibrate() doesn't need to be called after Calibration::load().");
			}
			return ready;
		}

		Mat cameraMatrix = Mat::eye(3, 3, CV_64F);
		distCoeffs = Mat::zeros(8, 1, CV_64F);

		updateObjectPoints();


		//int calibFlags = 0;
		int calibFlags = CV_CALIB_FIX_K1 | CV_CALIB_FIX_K2 | CV_CALIB_FIX_K3 | CV_CALIB_FIX_K4 | CV_CALIB_FIX_K5 | CV_CALIB_FIX_K6 | CV_CALIB_FIX_FOCAL_LENGTH | CV_CALIB_USE_INTRINSIC_GUESS;
		//CV_CALIB_FIX_PRINCIPAL_POINT | CV_CALIB_USE_INTRINSIC_GUESS | CV_CALIB_FIX_ASPECT_RATIO
        //int calibFlags = CV_CALIB_FIX_K1 | CV_CALIB_FIX_K2 | CV_CALIB_FIX_K3 | CV_CALIB_FIX_K4 | CV_CALIB_FIX_K5 | CV_CALIB_FIX_K6 | CV_CALIB_ZERO_TANGENT_DIST | CV_CALIB_USE_INTRINSIC_GUESS |CV_CALIB_ZERO_TANGENT_DIST;

		float rms = calibrateCamera(objectPoints, imagePoints, addedImageSize, cameraMatrix, distCoeffs, boardRotations, boardTranslations, calibFlags);
		ofLog(OF_LOG_VERBOSE, "calibrateCamera() reports RMS error of " + ofToString(rms));

		ready = checkRange(cameraMatrix) && checkRange(distCoeffs);

		if(!ready) {
			ofLog(OF_LOG_ERROR, "Calibration::calibrate() failed to calibrate the camera");
		}

		distortedIntrinsics.setup(cameraMatrix, addedImageSize);
		updateReprojectionError();
		updateUndistortion();

		return ready;
	}

	bool Calibration::isReady(){
		return ready;
	}

	bool Calibration::calibrateFromDirectory(string directory) {
		ofDirectory dirList;
		ofImage cur;
		dirList.listDir(directory);
		for(int i = 0; i < (int)dirList.size(); i++) {
			cur.loadImage(dirList.getPath(i));
			if(!add(toCv(cur))) {
				ofLog(OF_LOG_ERROR, "Calibration::add() failed on " + dirList.getPath(i));
			}
		}
		return calibrate();
	}
	void Calibration::undistort(Mat img, int interpolationMode) {
		img.copyTo(undistortBuffer);
		undistort(undistortBuffer, img, interpolationMode);
	}
	void Calibration::undistort(Mat src, Mat dst, int interpolationMode) {
		remap(src, dst, undistortMapX, undistortMapY, interpolationMode);
	}

	ofVec2f Calibration::undistort(ofVec2f& src) const {
		ofVec2f dst;
		Mat matSrc = Mat(1, 1, CV_32FC2, &src.x);
		Mat matDst = Mat(1, 1, CV_32FC2, &dst.x);
		undistortPoints(matSrc, matDst, distortedIntrinsics.getCameraMatrix(), distCoeffs);
		return dst;
	}

	void Calibration::undistort(vector<ofVec2f>& src, vector<ofVec2f>& dst) const {
		int n = src.size();
		dst.resize(n);
		Mat matSrc = Mat(n, 1, CV_32FC2, &src[0].x);
		Mat matDst = Mat(n, 1, CV_32FC2, &dst[0].x);
		undistortPoints(matSrc, matDst, distortedIntrinsics.getCameraMatrix(), distCoeffs);
	}

	bool Calibration::getTransformation(Calibration& dst, Mat& rotation, Mat& translation) {
		//if(imagePoints.size() == 0 || dst.imagePoints.size() == 0) {
		if(!ready) {
			ofLog(OF_LOG_ERROR, "getTransformation() requires both Calibration objects to have just been calibrated");
			return false;
		}
		if(imagePoints.size() != dst.imagePoints.size() || patternSize != dst.patternSize) {
			ofLog(OF_LOG_ERROR, "getTransformation() requires both Calibration objects to be trained simultaneously on the same board");
			return false;
		}
		Mat fundamentalMatrix, essentialMatrix;
		Mat cameraMatrix = distortedIntrinsics.getCameraMatrix();
		Mat dstCameraMatrix = dst.getDistortedIntrinsics().getCameraMatrix();
		// uses CALIB_FIX_INTRINSIC by default
		stereoCalibrate(objectPoints,
										imagePoints, dst.imagePoints,
										cameraMatrix, distCoeffs,
										dstCameraMatrix, dst.distCoeffs,
										distortedIntrinsics.getImageSize(), rotation, translation,
										essentialMatrix, fundamentalMatrix);
		return true;
	}
	float Calibration::getReprojectionError() const {
		return reprojectionError;
	}
	float Calibration::getReprojectionError(int i) const {
		return perViewErrors[i];
	}
	const Intrinsics& Calibration::getDistortedIntrinsics() const {
		return distortedIntrinsics;
	}
	const Intrinsics& Calibration::getUndistortedIntrinsics() const {
		return undistortedIntrinsics;
	}
	Mat Calibration::getDistCoeffs() const {
		return distCoeffs;
	}
	int Calibration::size() const {
		return imagePoints.size();
	}
	cv::Size Calibration::getPatternSize() const {
		return patternSize;
	}
	float Calibration::getSquareSize() const {
		return squareSize;
	}
	void Calibration::customDraw() {
		for(int i = 0; i < size(); i++) {
			draw(i);
		}
	}
	void Calibration::draw(int i) const {
		ofPushStyle();
		ofNoFill();
		ofSetColor(ofColor::red);
		for(int j = 0; j < (int)imagePoints[i].size(); j++) {
			ofCircle(toOf(imagePoints[i][j]), 5);
		}
		ofPopStyle();
	}
	// this won't work until undistort() is in pixel coordinates
	/*
	void Calibration::drawUndistortion() const {
		vector<ofVec2f> src, dst;
		cv::Point2i divisions(32, 24);
		for(int y = 0; y < divisions.y; y++) {
			for(int x = 0; x < divisions.x; x++) {
				src.push_back(ofVec2f(
					ofMap(x, -1, divisions.x, 0, addedImageSize.width),
					ofMap(y, -1, divisions.y, 0, addedImageSize.height)));
			}
		}
		undistort(src, dst);
		ofMesh mesh;
		mesh.setMode(OF_PRIMITIVE_LINES);
		for(int i = 0; i < src.size(); i++) {
			mesh.addVertex(src[i]);
			mesh.addVertex(dst[i]);
		}
		mesh.draw();
	}
	*/
	void Calibration::draw3d() const {
		for(int i = 0; i < size(); i++) {
			draw3d(i);
		}
	}
	void Calibration::draw3d(int i) const {
		ofPushStyle();
		ofPushMatrix();
		ofNoFill();

		applyMatrix(makeMatrix(boardRotations[i], boardTranslations[i]));

		ofSetColor(ofColor::fromHsb(255 * i / size(), 255, 255));

		ofDrawBitmapString(ofToString(i), 0, 0);

		for(int j = 0; j < (int)objectPoints[i].size(); j++) {
			ofPushMatrix();
			ofTranslate(toOf(objectPoints[i][j]));
			ofCircle(0, 0, .5);
			ofPopMatrix();
		}

		ofMesh mesh;
		mesh.setMode(OF_PRIMITIVE_LINE_STRIP);
		for(int j = 0; j < (int)objectPoints[i].size(); j++) {
			ofVec3f cur = toOf(objectPoints[i][j]);
			mesh.addVertex(cur);
		}
		mesh.draw();

		ofPopMatrix();
		ofPopStyle();
	}
	void Calibration::updateObjectPoints() {
		vector<Point3f> points = createObjectPoints(patternSize, squareSize, patternType);
		objectPoints.resize(imagePoints.size(), points);
	}
	void Calibration::updateReprojectionError() {
		vector<Point2f> imagePoints2;
		int totalPoints = 0;
		double totalErr = 0;

		perViewErrors.clear();
		perViewErrors.resize(objectPoints.size());

		for(int i = 0; i < (int)objectPoints.size(); i++) {
			projectPoints(Mat(objectPoints[i]), boardRotations[i], boardTranslations[i], distortedIntrinsics.getCameraMatrix(), distCoeffs, imagePoints2);
			double err = norm(Mat(imagePoints[i]), Mat(imagePoints2), CV_L2);
			int n = objectPoints[i].size();
			perViewErrors[i] = sqrt(err * err / n);
			totalErr += err * err;
			totalPoints += n;
			ofLog(OF_LOG_VERBOSE, "view " + ofToString(i) + " has error of " + ofToString(perViewErrors[i]));
		}

		reprojectionError = sqrt(totalErr / totalPoints);

		ofLog(OF_LOG_VERBOSE, "all views have error of " + ofToString(reprojectionError));
	}
	void Calibration::updateUndistortion() {
		Mat undistortedCameraMatrix = getOptimalNewCameraMatrix(distortedIntrinsics.getCameraMatrix(), distCoeffs, distortedIntrinsics.getImageSize(), fillFrame ? 0 : 1);
		initUndistortRectifyMap(distortedIntrinsics.getCameraMatrix(), distCoeffs, Mat(), undistortedCameraMatrix, distortedIntrinsics.getImageSize(), CV_16SC2, undistortMapX, undistortMapY);
		undistortedIntrinsics.setup(undistortedCameraMatrix, distortedIntrinsics.getImageSize());
	}

	vector<Point3f> Calibration::createObjectPoints(cv::Size patternSize, float squareSize, CalibrationPattern patternType) {
		vector<Point3f> corners;
		switch(patternType) {
			case CHESSBOARD:
			case CIRCLES_GRID:
				for(int i = 0; i < patternSize.height; i++)
					for(int j = 0; j < patternSize.width; j++)
						corners.push_back(Point3f(float(j * squareSize), float(i * squareSize), 0));
				break;
			case ASYMMETRIC_CIRCLES_GRID:
				for(int i = 0; i < patternSize.height; i++)
					for(int j = 0; j < patternSize.width; j++)
						corners.push_back(Point3f(float(((2 * j) + (i % 2)) * squareSize), float(i * squareSize), 0));
				break;
		}
		return corners;
	}
}

SOLVED THE PROBLEM!!!

Successful calibration with tolerable distortion for a Point Grey Grasshopper 3 with a 2.7mm lens.

So after a lot of research and trial and error, I have solved the issue.

The main issue was with the calibFlags in the Calibration::calibrate() function in ofxCv.

It defaults to int calibFlags = 0;
This did not work for me, and since the flags were not exposed as a variable I could control when calling the object, I decided to make a new Calibration class to extend ofxCv.

The correct flags for my lens were:

int calibFlags = CV_CALIB_FIX_PRINCIPAL_POINT | CV_CALIB_ZERO_TANGENT_DIST | CV_CALIB_FIX_FOCAL_LENGTH | CV_CALIB_FIX_ASPECT_RATIO | CV_CALIB_FIX_K3 | CV_CALIB_FIX_K4 | CV_CALIB_FIX_K5 | CV_CALIB_FIX_K6;

In my slightly altered Calibration class I added:

void setCalibFlags(int calibFlagSet);

This allows me to specify the flags I need for normal or fisheye lenses without having to rewrite the class.

The main problem I was facing over and over was my misguided use of CV_CALIB_FIX_K1 and CV_CALIB_FIX_K2, which fix the first two radial distortion coefficients instead of estimating them; those are exactly the terms that matter most for a wide-angle lens.

In addition, I removed the

#ifdef USING_OPENCV_2_3
#endif

from the Calibration.cpp file, which surrounded the flags for the asymmetric circles and circles grids. This allowed me to set the calibration to use asymmetric circles, which I found to work much better than the checkerboard for this wide-angle (2.7mm) lens.

I also did not know how xCount and yCount were to be listed in the settings_asymmetricalCircleGrid.yml file, which I discovered to be:

%YAML:1.0
xCount: 4
yCount: 11
squareSize: 2.3
patternType: 2


Though I am still unsure why the calibration is so much better using the asymmetric circles grid vs. the checkerboard.

I found this site helpful in describing the attributes: http://ninghang.blogspot.com/2012/08/fish-eye-camera-calibration.html

calibration.h (4.8 KB) CalibrationExtended.cpp (14.9 KB) CalibrationExtended.h (3.9 KB)

When I am completely finished cleaning up my code I will release it on GitHub. It runs on Linux for multi-cam Point Grey GigE camera installations with calibration and blob tracking. All I have left to do now is the blob tracking, which I have done before, so I feel the worst is behind me, fingers crossed.


Thought screen captures of the distortion correction could be helpful for others who run into this problem.


Have you posted your project on GitHub yet?

I was hoping using your code might solve an image problem I am having with the square calibration, but it still persists. Would you know why the resulting image looks so bad?

It seems like selecting a 640x480 camera resolution for my Logitech C920 caused this problem; using 1920x1080 solved that.

@abocanegra

Hi

Calibration with your code works well, thank you for that.

But loading the .yml file crashes my app.

I call calibration.load("calibration_cam920_fisheye.yml");

and get this error in the console:


OpenCV Error: Null pointer (Null pointer to reader or destination array) in cvReadRawDataSlice, file /Users/danielrosser/Documents/openFrameworks/scripts/apothecary/build/opencv/modules/core/src/persistence.cpp, line 3245
libc++abi.dylib: terminating with uncaught exception of type cv::Exception: /Users/danielrosser/Documents/openFrameworks/scripts/apothecary/build/opencv/modules/core/src/persistence.cpp:3245: error: (-27) Null pointer to reader or destination array in function cvReadRawDataSlice

Would you know why?

Thanks.

I am using OS X 10.10.5 and OF 0.9.3.
I noticed that the .yml files produce double brackets like this [ [ ] ] while the ofxCv example from OF 0.8.4 only has single brackets.

When removing those, the app does not crash, but it also seems the calibration is not applied.

------update------
A very important line not to forget is calibration.setFillFrame(false);
If set to false, you see a black background where your image is very bent.
If set to true, the image is scaled somehow.
If you do not set it, the app crashes.

But still, when using OF 0.9.3 the .yml file produces double brackets [ [ , which OF 0.8.4 does not.
After removing the [ [ manually, the .yml did load and the app used the calibration.

Actually, the higher resolution only hid the artifact problem; it did not solve it.

@stephanschulz

My first thought would be that it is a formatting issue in the .yml file.
Can you show me your camera specs (WxH, sensor size, sensor ratio) and
the .yml file? I only tested this on Linux (Ubuntu 14.04 - 16.10). If we
can get it working on Mac as well, that would be fantastic.

Thanks,

@abocanegra

I fixed it this way