I've successfully gotten Apple's basic capture application working, with a shiny-looking Cocoa GUI around it:
http://developer.apple.com/document...1.html#//apple_ref/doc/uid/TP40004574-CH4-SW2
However, I started looking into QTKit because my attempts to get ffmpeg/ffserver (command-line tools that originated on Linux) to capture from my iSight went nowhere (from what I've read, ffmpeg doesn't support capture devices on OS X).
I wanted to build a small CLI tool that captures video from the iSight and streams it over HTTP in some compressed/encoded form, just as I could with ffmpeg/ffserver on Linux. Starting from Apple's little tutorial linked above, I was hopeful it wouldn't be too difficult, and it would be a learning experience in Mac programming anyway.
I built the app and it compiled fine (I also spent some time fiddling with GUI adjustments in Interface Builder). Now I'm trying to run it without a GUI: I've pretty much just used the Foundation and QTKit frameworks and dropped the QTCaptureView from the QTCaptureSession.
All that happens now is that my main() runs in a split second and the application exits. It happens so fast the iSight light doesn't even blink. How do I, in a command-line Objective-C app, keep my QTCaptureSession open until I hit CTRL+C? Do I need to start a new thread for the session and then sleep the current one? I'm pretty much at a loss.
Here's the code I've tried so far:
main.m
Code:
#import <Foundation/Foundation.h>
#import "Classes/RecordMeController.h"

int main(int argc, const char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    RecordMeController *controller = [[RecordMeController alloc]
        initWithOutputFilePath:@"/Users/chris/movie.mov"];
    [controller startRecording];

    [controller release];
    [pool release];
    return 0;
}
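I suspect the real problem is that nothing ever spins the run loop, so the process just falls off the end of main() before QTKit can do anything. Here's a sketch of what I'm considering instead (untested; it assumes pumping the default NSRunLoop in short increments until a SIGINT handler flips a flag is enough to keep the capture machinery serviced):

Code:
#import <Foundation/Foundation.h>
#import "Classes/RecordMeController.h"
#include <signal.h>

static volatile sig_atomic_t gShouldStop = 0;

static void handleSigint(int sig) {
    gShouldStop = 1;
}

int main(int argc, const char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    RecordMeController *controller = [[RecordMeController alloc]
        initWithOutputFilePath:@"/Users/chris/movie.mov"];
    [controller startRecording];

    // Trap CTRL+C so we can shut down cleanly instead of dying mid-write
    signal(SIGINT, handleSigint);

    // Pump the run loop so QTKit's delegate callbacks get delivered
    while (!gShouldStop) {
        [[NSRunLoop currentRunLoop]
            runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.25]];
    }

    [controller stopRecording];
    [controller release];
    [pool release];
    return 0;
}

Is that roughly the right approach, or is there a cleaner way to block until CTRL+C?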
Classes/RecordMeController.h
Code:
#import <Foundation/Foundation.h>
#import <QTKit/QTKit.h>

@interface RecordMeController : NSObject {
    QTCaptureSession *mCaptureSession;
    QTCaptureDeviceInput *mCaptureDeviceInput;
    QTCaptureMovieFileOutput *mCaptureMovieFileOutput;
    NSString *mMovieSavePath;
}

- (id)initWithOutputFilePath:(NSString *)path;
- (void)startRecording;
- (void)stopRecording;

@end
Classes/RecordMeController.m
Code:
#import "RecordMeController.h"

@implementation RecordMeController

- (void)dealloc {
    [mCaptureSession release];
    [mCaptureMovieFileOutput release];
    [mCaptureDeviceInput release];
    [mMovieSavePath release];
    [super dealloc];
}

- (id)initWithOutputFilePath:(NSString *)path {
    self = [super init];
    if (!self)
        return nil;

    // Keep our own copy of the save path
    mMovieSavePath = [path copy];

    // Create the capture session
    mCaptureSession = [[QTCaptureSession alloc] init];
    BOOL success = NO;
    NSError *error = nil;

    // Find the default video device
    QTCaptureDevice *device = [QTCaptureDevice
        defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    if (device) {
        success = [device open:&error];
        if (!success) {
            // handle error here
        }

        // Create a new input for the device
        mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];

        // Add the input to the session
        success = [mCaptureSession addInput:mCaptureDeviceInput error:&error];
        if (!success) {
            // handle error here
        }

        // Create the output
        mCaptureMovieFileOutput = [[QTCaptureMovieFileOutput alloc] init];

        // Add the output to the session
        success = [mCaptureSession addOutput:mCaptureMovieFileOutput error:&error];
        if (!success) {
            // handle error here
        }

        // Tell the output that this instance is its delegate
        [mCaptureMovieFileOutput setDelegate:self];

        // Start the session now
        [mCaptureSession startRunning];
    }
    return self;
}

- (void)startRecording {
    [mCaptureMovieFileOutput recordToOutputFileURL:
        [NSURL fileURLWithPath:mMovieSavePath]];
}

- (void)stopRecording {
    // Passing nil stops recording to the current file
    [mCaptureMovieFileOutput recordToOutputFileURL:nil];
    [mCaptureSession stopRunning];
}

- (void)captureOutput:(QTCaptureFileOutput *)captureOutput
        didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
        forConnections:(NSArray *)connections
        dueToError:(NSError *)error
{
    // Roughly translates to Finder -> Open -> movie.mov
    // [[NSWorkspace sharedWorkspace] openURL:outputFileURL];
}

@end
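For reference, I'm compiling it outside Xcode with something along these lines (the output name is just what I picked; the important bits are linking the Foundation and QTKit frameworks):

Code:
gcc -framework Foundation -framework QTKit \
    main.m Classes/RecordMeController.m -o recordme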