
Video Input doesn't see Blackmagic inputs on Mountain Lion

Posted by adamfenn28

I've been using the Video Input patch with a Blackmagic Intensity Extreme (or UltraStudio 3D, or Mini Recorder) for some time. In the patch settings I have to select the proper input mode, but it works great... on Lion. I've recently tried doing this on Mountain Lion with no luck. On Mountain Lion, the Video Input patch sees only a single Blackmagic device, simply called 'Blackmagic'. It no longer lists the individual input modes that I had to select in Lion. The trouble is that when I select the 'Blackmagic' device in Mountain Lion, the Video Input patch sees no input. How can I make the Blackmagic devices work with the Video Input patch on Mountain Lion?
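To see what the OS is actually exposing, here is a minimal QTKit enumeration sketch (assuming the stock Video Input patch sits on QTKit, as it did on Lion). If it prints only a single "Blackmagic" entry on Mountain Lion, the per-mode devices have disappeared at the driver level rather than in the patch:

// List every video and muxed capture device QTKit can see.
// Build: clang list_devices.m -framework Foundation -framework QTKit -o list_devices
#import <Foundation/Foundation.h>
#import <QTKit/QTKit.h>

int main(void)
{
    @autoreleasepool {
        NSArray *devices = [[QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo]
                            arrayByAddingObjectsFromArray:
                            [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeMuxed]];
        for (QTCaptureDevice *device in devices) {
            // On Lion each Blackmagic input mode appeared as its own device entry.
            NSLog(@"%@ (%@)", [device localizedDisplayName], [device uniqueID]);
        }
    }
    return 0;
}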

Thanks! Adam

Selectable Video Input

Posted by adamfenn28

Needing to specify the capture device on the Video Input patch from a published input, I set out looking for something that provides that. I found a patch that does exactly that, but only for video media types (not muxed video types). I need to support both, so I made a few changes and thought I pretty much had it taken care of... but I didn't. For muxed media types (coming in from my Canopus ADVC-110), when I test CVImageBufferGetColorSpace() on my imageBuffer, it returns (null). I've looked at this code and the QTKit documentation and I just don't see what I'm missing. What else do I need to do?

Thanks!

#import <OpenGL/CGLMacro.h>
#import "CaptureWithDevice.h"
 
#define   kQCPlugIn_Name            @"Capture With Device"
#define   kQCPlugIn_Description     @"Serves as a replacement for the default Video Input patch; it differs in that it allows the input device to be specified by the user."
 
@implementation CaptureWithDevice
@dynamic inputDevice, outputImage;
 
+ (NSDictionary*) attributes
{
   return [NSDictionary dictionaryWithObjectsAndKeys:
         kQCPlugIn_Name, QCPlugInAttributeNameKey, 
         kQCPlugIn_Description, QCPlugInAttributeDescriptionKey,
         nil];
}
+ (NSDictionary*) attributesForPropertyPortWithKey:(NSString*)key
{      
   if([key isEqualToString:@"inputDevice"]) {
      // Build the menu from both plain video devices and muxed (e.g. DV) devices.
      NSArray *videoDevices = [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo];
      NSArray *muxedDevices = [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeMuxed];

      NSMutableArray *devices = [NSMutableArray arrayWithArray:videoDevices];
      [devices addObjectsFromArray:muxedDevices];

      NSMutableArray *deviceNames = [NSMutableArray array];
      for (QTCaptureDevice *device in devices) {
         // -localizedDisplayName is the human-readable name; -description is noisier.
         [deviceNames addObject:[device localizedDisplayName]];
         // be sure not to add CT to the list
      }

      NSUInteger count = [deviceNames count];
      return [NSDictionary dictionaryWithObjectsAndKeys:
            @"Device", QCPortAttributeNameKey,
            QCPortTypeIndex, QCPortAttributeTypeKey,
            [NSNumber numberWithUnsignedInteger:0], QCPortAttributeMinimumValueKey,
            [NSNumber numberWithUnsignedInteger:(count ? count - 1 : 0)], QCPortAttributeMaximumValueKey,
            deviceNames, QCPortAttributeMenuItemsKey,
            nil];
   }
   if([key isEqualToString:@"outputImage"])
      return [NSDictionary dictionaryWithObjectsAndKeys:
            @"Video Image", QCPortAttributeNameKey,
            nil];
   return nil;
}
+ (QCPlugInExecutionMode) executionMode
{
   return kQCPlugInExecutionModeProvider;
}
 
+ (QCPlugInTimeMode) timeMode
{
   return kQCPlugInTimeModeIdle;
}
 
- (id) init
{
   if(self = [super init]) {
      [[NSNotificationCenter defaultCenter] addObserver:self 
                                     selector:@selector(_devicesDidChange:) 
                                        name:QTCaptureDeviceWasConnectedNotification 
                                       object:nil];
      [[NSNotificationCenter defaultCenter] addObserver:self 
                                     selector:@selector(_devicesDidChange:) 
                                        name:QTCaptureDeviceWasDisconnectedNotification 
                                       object:nil];
   }
   return self;
}
 
- (void) finalize
{
   [super finalize];
}
 
- (void) dealloc
{
   if (mCaptureSession) {
      [mCaptureSession stopRunning];
      [mCaptureSession release];
      [mCaptureDeviceInput release];
      [mCaptureDecompressedVideoOutput release];
   }
   // CVBufferRelease is documented to accept NULL, unlike CFRelease.
   CVBufferRelease(mCurrentImageBuffer);
   [[NSNotificationCenter defaultCenter] removeObserver:self];
   [super dealloc];
}
 
@end
 
@implementation CaptureWithDevice (Execution)
 
- (BOOL) startExecution:(id<QCPlugInContext>)context
{
   return YES;
}
 
- (void) enableExecution:(id<QCPlugInContext>)context
{
}
// Quartz Composer calls this when it is finished with a frame we published:
// balance the CVPixelBufferLockBaseAddress/CVBufferRetain taken in -execute:.
static void _BufferReleaseCallback(const void* address, void* info)
{
    CVPixelBufferUnlockBaseAddress((CVPixelBufferRef)info, 0);
    CVBufferRelease(info);
}
- (BOOL) execute:(id<QCPlugInContext>)context atTime:(NSTimeInterval)time withArguments:(NSDictionary*)arguments
{
   if (!mCaptureSession || [mCaptureSession isRunning]==NO || _currentDevice!=self.inputDevice){
      NSError *error = nil;
      BOOL success;
 
      NSArray *videoDevices= [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo];
      NSArray *muxedDevices= [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeMuxed];
 
        NSMutableArray *devices = [NSMutableArray arrayWithArray:videoDevices];
        [devices addObjectsFromArray:muxedDevices];
        if ([devices count] == 0) {
            // No capture hardware present; publish nothing rather than crash below.
            self.outputImage = nil;
            return YES;
        }

        // Clamp the published index in case devices changed since the menu was built.
        NSUInteger d = self.inputDevice;
        if (d >= [devices count]) {
            d = 0;
        }
        QTCaptureDevice *device = [devices objectAtIndex:d];
        success = [device open:&error];
        if (!success) {
            NSLog(@"Could not open device %@", device);
         self.outputImage = nil; 
            return YES;
        } 
        NSLog(@"Opened device successfully");
 
 
 
 
        [mCaptureSession stopRunning]; // messaging nil is a no-op on the first pass
        [mCaptureSession release];
        mCaptureSession = [[QTCaptureSession alloc] init];
 
        [mCaptureDeviceInput release];
        mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
 
        // For muxed devices (e.g. DV over FireWire), disable the audio
        // connections so that only video feeds the session.
        if ([muxedDevices containsObject:device]) {
            NSLog(@"Disabling audio connections");
            NSArray *ownedConnections = [mCaptureDeviceInput connections];
            for (QTCaptureConnection *connection in ownedConnections) {
                NSLog(@"MediaType: %@", [connection mediaType]);
                if ( [[connection mediaType] isEqualToString:QTMediaTypeSound]) {
                    [connection setEnabled:NO];
                    NSLog(@"disabling audio connection");
 
                }
            }
        }
 
 
 
        success = [mCaptureSession addInput:mCaptureDeviceInput error:&error];
 
        if (!success) {
            NSLog(@"Failed to add Input");
         self.outputImage = nil; 
            if (mCaptureSession) {
                [mCaptureSession release];
                mCaptureSession= nil;
            }
            if (mCaptureDeviceInput) {
                [mCaptureDeviceInput release];
                mCaptureDeviceInput= nil;
 
            }
            return YES;
        }
 
 
 
 
        NSLog(@"Adding output");
 
        [mCaptureDecompressedVideoOutput release];
        mCaptureDecompressedVideoOutput = [[QTCaptureDecompressedVideoOutput alloc] init];
 
        [mCaptureDecompressedVideoOutput setPixelBufferAttributes:
         [NSDictionary dictionaryWithObjectsAndKeys:
          [NSNumber numberWithBool:YES], kCVPixelBufferOpenGLCompatibilityKey,
          [NSNumber numberWithLong:k32ARGBPixelFormat], kCVPixelBufferPixelFormatTypeKey, nil]];
 
        [mCaptureDecompressedVideoOutput setDelegate:self];
        success = [mCaptureSession addOutput:mCaptureDecompressedVideoOutput error:&error];
 
        if (!success) {
            NSLog(@"Failed to add output");
         self.outputImage = nil; 
            if (mCaptureSession) {
                [mCaptureSession release];
                mCaptureSession= nil;
            }
            if (mCaptureDeviceInput) {
                [mCaptureDeviceInput release];
                mCaptureDeviceInput= nil;
            }
            if (mCaptureDecompressedVideoOutput) {
                [mCaptureDecompressedVideoOutput release];
                mCaptureDecompressedVideoOutput= nil;
            }
            return YES;
        }
 
        [mCaptureSession startRunning];   
      _currentDevice= self.inputDevice;
   }
 
 
   CVImageBufferRef imageBuffer;
   @synchronized (self) {
      // Retain the latest frame under the same lock the capture delegate uses.
      imageBuffer = CVBufferRetain(mCurrentImageBuffer);
   }
 
    if (imageBuffer) {
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        NSLog(@"ColorSpace: %@", CVImageBufferGetColorSpace(imageBuffer));
        //NSLog(@"ColorSpace: %@ Height: %@ Width: %@", CVImageBufferGetColorSpace(imageBuffer), CVPixelBufferGetWidth(imageBuffer), CVPixelBufferGetHeight(imageBuffer));
        id provider= [context outputImageProviderFromBufferWithPixelFormat:QCPlugInPixelFormatARGB8           
                                                                pixelsWide:CVPixelBufferGetWidth(imageBuffer)
                                                                pixelsHigh:CVPixelBufferGetHeight(imageBuffer)
                                                               baseAddress:CVPixelBufferGetBaseAddress(imageBuffer)
                                                               bytesPerRow:CVPixelBufferGetBytesPerRow(imageBuffer)
                                                           releaseCallback:_BufferReleaseCallback
                                                            releaseContext:imageBuffer
                                                                colorSpace:CVImageBufferGetColorSpace(imageBuffer)
                                                          shouldColorMatch:YES];
      if(provider == nil) {
         // The release callback will never fire, so balance the lock and retain here.
         CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
         CVBufferRelease(imageBuffer);
         return NO;
      }
      self.outputImage = provider;
    } 
   else 
      self.outputImage = nil; 
 
   return YES; 
}
 
- (void) disableExecution:(id<QCPlugInContext>)context
{
}
- (void) stopExecution:(id<QCPlugInContext>)context
{
}
 
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame 
    withSampleBuffer:(QTSampleBuffer *)sampleBuffer 
      fromConnection:(QTCaptureConnection *)connection
{    
    NSLog(@"connection type: %@", [connection mediaType]);
    CVImageBufferRef imageBufferToRelease;
    @synchronized (self) {
        // Swap in the newest frame under the lock; release the old one outside it.
        imageBufferToRelease = mCurrentImageBuffer;
        mCurrentImageBuffer = CVBufferRetain(videoFrame);
    }
    CVBufferRelease(imageBufferToRelease);
}
- (void)_devicesDidChange:(NSNotification *)aNotification
{
}
@end
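
The (null) color space looks like the crux: muxed DV frames can arrive with no color space attached, and outputImageProviderFromBufferWithPixelFormat: does not accept NULL for its colorSpace: argument. A minimal sketch of a workaround, assuming the plug-in context's color space is an acceptable substitute for DV material, is to fall back to [context colorSpace] when the buffer carries nothing:

// Hedged sketch for -execute:atTime:withArguments:, replacing the provider
// creation above. Assumption: [context colorSpace] is close enough for DV.
CGColorSpaceRef colorSpace = CVImageBufferGetColorSpace(imageBuffer);
if (colorSpace == NULL)
    colorSpace = [context colorSpace]; // id<QCPlugInContext> publishes its working color space

id provider = [context outputImageProviderFromBufferWithPixelFormat:QCPlugInPixelFormatARGB8
                                                          pixelsWide:CVPixelBufferGetWidth(imageBuffer)
                                                          pixelsHigh:CVPixelBufferGetHeight(imageBuffer)
                                                         baseAddress:CVPixelBufferGetBaseAddress(imageBuffer)
                                                         bytesPerRow:CVPixelBufferGetBytesPerRow(imageBuffer)
                                                     releaseCallback:_BufferReleaseCallback
                                                      releaseContext:imageBuffer
                                                          colorSpace:colorSpace
                                                    shouldColorMatch:YES];

The context's color space is the cheapest choice with correct ownership (the plug-in doesn't have to create or release anything); a Rec. 601-style RGB space created once and cached would be another candidate if color fidelity matters.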

Video Capture with SDI Capture Card PE4 Active Silicon

Posted by lightmagic

Hey there,

I have a problem getting an SDI capture card (Active Silicon PE4) to work with Quartz Composer. The driver is installed (only a CDA driver is available, no QuickTime driver), and the card works in Catalyst (media server software). The Video Input patch that comes with Quartz Composer shows only my iSight.

Is there any way to make the driver visible to Quartz Composer? Or is there any other video input patch around?
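
For what it's worth, Quartz Composer's Video Input patch (and QTKit-based plug-ins like the one in the previous post) can only list devices that register with QTKit, so a CDA-only driver is expected to be invisible. A quick hedged check, assuming the card would report "PE4" as part of its display name:

// Does QTKit see the card at all? With a CDA-only driver this should
// report "not visible", which is why the patch offers only the iSight.
#import <Foundation/Foundation.h>
#import <QTKit/QTKit.h>

int main(void)
{
    @autoreleasepool {
        BOOL found = NO;
        for (QTCaptureDevice *device in [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo]) {
            if ([[device localizedDisplayName] rangeOfString:@"PE4"].location != NSNotFound)
                found = YES;
        }
        NSLog(@"PE4 is %@ to QTKit", found ? @"visible" : @"not visible");
    }
    return 0;
}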

Thanks

Marc