using QTKit & insertSegmentOfMovie

Hello,

I don’t know if this is a beginner or an advanced question, but I am trying to do some QuickTime editing in OF (making a simple QT editing addon), mainly using insertSegmentOfMovie.

For Objective-C I found:

// in order to add segments to our newly created movie
// we must make sure the "QTMovieEditableAttribute"
// attribute is set
isEditable = [NSNumber numberWithBool:YES];
[outMovie setAttribute:isEditable forKey:QTMovieEditableAttribute];

which would make ‘outMovie’ editable.

In Movies.h and in the ofQTsaver addon code I found:

BeginMediaEdits(media); /* Inform the Movie Toolbox that we want to
                           change the media samples referenced by a
                           track's media. This opens the media container
                           and makes it ready to receive and/or remove
                           sample data. */

but this only makes the media of a single track editable, while insertSegmentOfMovie operates across multiple tracks.

Does anybody know how to set the QTMovieEditableAttribute using QTKit in OF?

Thanks a lot,

Keez

I’ve done a little bit of this and found this example really helpful:
http://developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/QTKitApplicationTutorial/CreatingaQTKitStoporStillMotionApplication/CreatingaQTKitStoporStillMotionApplication.html

I think the pattern is to initialize an empty QuickTime movie that is editable, add frames to it from other sources (movies, cameras, generative graphics), then save it out.
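Condensed, that pattern looks roughly like this in QTKit (a sketch only: `someNSImage` is a placeholder for whatever frame you want to append, and error handling is minimal):

```objc
#import <QTKit/QTKit.h>

// Create an empty movie backed by a writable file and mark it editable.
NSError* error = nil;
QTMovie* movie = [[QTMovie alloc] initToWritableFile:@"/tmp/out.mov" error:&error];
[movie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];

// Append one frame; QTKit compresses it with the codec named here.
NSDictionary* attrs = [NSDictionary dictionaryWithObjectsAndKeys:
                          @"jpeg", QTAddImageCodecType, nil];
QTTime frameDuration = QTMakeTime(1, 25); // one frame at 25 fps
[movie addImage:someNSImage forDuration:frameDuration withAttributes:attrs];

// Flush the edits out to disk.
[movie updateMovieFile];
[movie release];
```

The recorder class posted further down this thread follows the same init-to-writable-file / set-editable / addImage / updateMovieFile sequence.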

This would be a great addition to the QTKit stuff in OF. It’s also worth investigating how AVFoundation handles the same task, as it may be a cleaner fit; check out the AVAssetWriter classes.
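For reference, a hedged sketch of what the AVFoundation route looks like (10.7+). Names such as `outputURL`, `videoSettings`, `pixelBuffer` and `frameTime` are placeholders you would supply, and error handling is omitted:

```objc
#import <AVFoundation/AVFoundation.h>

// Writer targets a QuickTime .mov container at outputURL.
NSError* error = nil;
AVAssetWriter* writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
// One video input, configured via an outputSettings dictionary.
AVAssetWriterInput* input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
// Adaptor lets us feed raw CVPixelBuffers instead of sample buffers.
AVAssetWriterInputPixelBufferAdaptor* adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                   sourcePixelBufferAttributes:nil];
[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

// Per frame:
[adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];

// When done:
[input markAsFinished];
[writer finishWriting];
```

Audio would be a second AVAssetWriterInput added to the same writer, which is also how you would tackle the audio-track question raised later in this thread.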

Good luck!!

I don’t know if this is of any help to you, but enclosed is a movie recorder thing I made that uses QTKit to create a movie file and add frames to it. I use it to save frames coming in from a video camera to a movie file. Among other things, it uses the QTMovieEditableAttribute you mention.
(I may or may not have based this on someone else’s code; I don’t actually remember
:slight_smile:

Header file:

  
#pragma once

#ifndef QTKIT_MOVIE_RECORDER_H
#define QTKIT_MOVIE_RECORDER_H

#include <string>
#include "ofImage.h"
#ifdef __OBJC__
#include "GL/glew.h"
#include <QTKit/QTKit.h>
#endif // __OBJC__

class ofxQTKitRecorder {
public:
	ofxQTKitRecorder();
	~ofxQTKitRecorder();

	// Creates the movie file at filePath and marks it editable.
	void setup(std::string filePath);
	// Queues a frame; its duration is measured until the next frame arrives.
	void addFrame(ofPixels& framePixels);
	// Writes any pending edits out to the movie file.
	void flushToFile();

private:
	void outputFrame(ofPixels& framePixels, unsigned long frameDurationMS);

#ifdef __OBJC__
	QTMovie* qtMovie;
#else // __OBJC__
	// Opaque placeholder so the header stays valid in pure C++ translation units.
	void* qtMovie;
#endif // __OBJC__
	ofPixels previousFrame;
	unsigned long previewFrameTimestampMS;
};

#endif // QTKIT_MOVIE_RECORDER_H
  

Implementation file (a .mm file since it contains both Objective-C and C++ code):

  
#include "ofxQTKitRecorder.h"  
  
#include "ofImage.h"  
  
ofxQTKitRecorder::ofxQTKitRecorder() :  
	qtMovie(nil),  
	previousFrame(),  
	previewFrameTimestampMS(0)  
{  
	  
}  
  
ofxQTKitRecorder::~ofxQTKitRecorder() {  
	if (qtMovie != nil) {  
		if (previousFrame.isAllocated()) {  
			unsigned long nowTimestampMS = ofGetSystemTime();  
			assert(nowTimestampMS >= previewFrameTimestampMS);  
			unsigned long previousFrameDurationMS = nowTimestampMS - previewFrameTimestampMS;  
			if (previousFrameDurationMS > 0) {  
				outputFrame(previousFrame, previousFrameDurationMS);  
			}  
		}  
		flushToFile();  
		[qtMovie release];  
		qtMovie = nil;  
	}  
}  
  
void ofxQTKitRecorder::setup(std::string filePath) {  
	NSError* error = nil;  
	qtMovie = [[QTMovie alloc] initToWritableFile:[NSString stringWithUTF8String:filePath.c_str()] error:&error];  
	if ((qtMovie == nil) && (error != nil)) {  
		NSLog(@"%@", [error localizedDescription]);  
	}  
	[qtMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];  
}  
  
void ofxQTKitRecorder::addFrame(ofPixels& framePixels) {  
	if (qtMovie != nil) {  
		unsigned long nowTimestampMS = ofGetSystemTime();  
		if (previousFrame.isAllocated()) {  
			assert(nowTimestampMS >= previewFrameTimestampMS);  
			unsigned long previousFrameDurationMS = nowTimestampMS - previewFrameTimestampMS;  
			if (previousFrameDurationMS > 0) {  
				outputFrame(previousFrame, previousFrameDurationMS);  
			}  
		}  
		previousFrame = framePixels;  
		previewFrameTimestampMS = nowTimestampMS;  
	}  
}  
  
void ofxQTKitRecorder::flushToFile() {  
	if (qtMovie != nil) {  
		[qtMovie updateMovieFile];  
	}  
}  
  
void ofxQTKitRecorder::outputFrame(ofPixels& framePixels, unsigned long frameDurationMS) {  
	assert(frameDurationMS > 0);  
	QTTime duration = QTMakeTimeWithTimeInterval(static_cast<double>(frameDurationMS) / 1000.0);  
  
	NSDictionary* attributesDictionary =  
		[NSDictionary dictionaryWithObjectsAndKeys:  
			@"jpeg", QTAddImageCodecType,  
			[NSNumber numberWithLong:codecNormalQuality], QTAddImageCodecQuality,  
			nil];  
  
	int imageWidth = framePixels.getWidth();  
	int imageHeight = framePixels.getHeight();  
	int bitsPerChannel = framePixels.getBitsPerChannel();  
	int numChannels = framePixels.getNumChannels();  
	int bitsPerPixel = framePixels.getBitsPerPixel();  
	int bytesPerRow = framePixels.getBytesPerPixel() * imageWidth;  
	unsigned char* pixelsPointer = framePixels.getPixels();  
	NSBitmapImageRep* imageRep =  
		[[NSBitmapImageRep alloc]  
			initWithBitmapDataPlanes:&pixelsPointer  
			pixelsWide:imageWidth  
			pixelsHigh:imageHeight  
			bitsPerSample:bitsPerChannel  
			samplesPerPixel:numChannels  
			hasAlpha:NO  
			isPlanar:NO  
			colorSpaceName:(bitsPerPixel <= 8) ? NSDeviceWhiteColorSpace : NSDeviceRGBColorSpace  
			bytesPerRow:bytesPerRow  
			bitsPerPixel:bitsPerPixel];  
	[imageRep autorelease];  
	NSImage* frameNSImage = [[[NSImage alloc] initWithSize:NSMakeSize(imageWidth, imageHeight)] autorelease];  
	[frameNSImage addRepresentation:imageRep];  
	[qtMovie addImage:frameNSImage forDuration:duration withAttributes:attributesDictionary];  
}  
  

HTH
/Jesper

Thanks Jim & Jesper,

Very useful.
Seems I have to dive into Obj-C & Lion for AVFoundation.

Cheers,
Kz

This is the first example I have found that works reliably for app-generated images. We are working on integrating an audio track as well; if we can get the timing right we will post the results. I tried the ofxQTKitVideoSaver found here: https://github.com/mizt/ofxQTKitVideoSaver and, although while debugging the pixels appeared to be written to the bitmap representation correctly, it only produced black frames in the final movie. Thanks so much for posting.