v002 screen capture fails to work with histogram

i cannot get v002 screen capture to work with the histogram patch in QC 4.0. the built-in video input device works fine, and any other image source seems to work with histogram. i've tried combining the v002 capture with other image and video patches, but no valid histogram data is available.

anyone else have this issue?

dust
Re: v002 screen capture fails to work with histogram

not really sure how you're trying to set this up. try this file... it's just a histogram visualizer with v002 screen capture fed into it. all i did was render the v002 capture with a Render in Image patch set to default settings. that seems to put the screen capture into a pixel space that the histogram patch can get data from.

sorry, i can't offer any reason as to why this occurs or why another renderer is needed. i think syphon now has screen recording built in, so you may want to try that as well. i'm thinking syphon is a bit more mature in its implementation and may give you a better frame rate, but i'm uncertain as to what your needs are.

Attachments:
v002Histogram.qtz (23.95 KB)
v002Screen.png (126.57 KB)

vade
Re: v002 screen capture fails to work with histogram

This is partly due to how the screen capture works internally: it uses an IOSurface, and QC does not really "know" how to lock a buffer representation of an IOSurface-backed texture (it requires additional work over a normal texture, for those who care).
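For those who care, the "additional work" is roughly of this shape. This is only a sketch, assuming a BGRA surface, and not the actual plugin code:

#include <IOSurface/IOSurface.h>
#include <IOKit/IOReturn.h>

// Sketch only: lock an IOSurface so its pixels can be read on the CPU.
// A plugin would then have to repackage base/bytesPerRow into whatever
// buffer representation the host's image pipeline expects.
static void withLockedIOSurface(IOSurfaceRef surface)
{
    if (IOSurfaceLock(surface, kIOSurfaceLockReadOnly, NULL) != kIOReturnSuccess)
        return;

    void  *base        = IOSurfaceGetBaseAddress(surface);
    size_t width       = IOSurfaceGetWidth(surface);
    size_t height      = IOSurfaceGetHeight(surface);
    size_t bytesPerRow = IOSurfaceGetBytesPerRow(surface);

    // ... read width x height BGRA pixels from base, respecting bytesPerRow ...
    (void)base; (void)width; (void)height; (void)bytesPerRow;

    IOSurfaceUnlock(surface, kIOSurfaceLockReadOnly, NULL);
}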

Use a Core Image filter right after the output of the Screen Capture; that should fix the issue until (or maybe if, see below) I figure out a good solution.

Speaking of "if": 10.7 breaks all OpenGL-based screen capture applications, so stay tuned for a 10.7-only version when I have time. Fun...

dust
Re: v002 screen capture fails to work with histogram

i hear you on 10.7, i'm looking forward to 10.7.1 fixing things. seems like some essential patches are missing... the new patches work for me in 10.6 but not in 10.7. i guess it's all par for the course.

vade
Re: v002 screen capture fails to work with histogram

This is not something that will be fixed with 10.7.x. This is a specific, documented removal of APIs with basically lackluster, slow replacements. OpenGL-based screen capture won't ever work again, nor do I expect its replacement to ever be as performant.

gtoledo3
Re: v002 screen capture fails to work with histogram

vade wrote:
This is not something that will be fixed with 10.7.x. This is a specific, documented removal of APIs with basically lackluster, slow replacements. OpenGL-based screen capture won't ever work again, nor do I expect its replacement to ever be as performant.

This is really disturbing, and a weird way of handling security issues (assuming that's the reason, maybe wrongly).

vade
Re: v002 screen capture fails to work with histogram

I believe you are correct.

usefuldesign.au
Re: v002 screen capture fails to work with histogram

There I was, thinking IOSurface might be one way around the lack of FCP X video monitoring, as in a third display dedicated to uncompressed video.

Some blogger was saying AVFoundation provides no video 'pipe' or 'hooks' for 3rd party products to do I/O video monitoring. Have no idea if that's valid, of course.

dust
Re: v002 screen capture fails to work with histogram

i don't know much about the IOSurface framework (it being an undocumented api), so vade is the man in that regard. what i do know, from looking at a bit of vade's source, is that his screen capture uses IOSurface.

he is stating here that the alternative solutions pale in comparison and will possibly be broken in lion. i'm taking a leap here and assuming he is referring to IOSurface. so with those statements, maybe IOSurface would be a good solution for video monitoring, but then again there is his whole "if" caveat.

in reference to the blog about the lack of support for video monitoring in final cut with av foundation, vade again would probably offer the best answer as to whether that blog entry has any validity, as he has been working with av foundation on the mac.

from my experience with av foundation on iOS it is totally possible to monitor raw video from a capture device, make a composition, and output it to an external monitor device, all at the same time. on the mac, av foundation makes it a bit easier than on iOS, seeing as there is a CMSampleBuffer-to-GL-texture method available there.

as the blog mentioned, i have yet to see a clear way to output to an apple tv from av foundation. this is supposed to be possible. for now you have to write your asset to file in real time with a special pixel buffer adaptor class and then forward that to your apple tv. this would be a nice preview feature for FCP, but i do not see it as a proper replacement for the monitoring solutions of the past.
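roughly, the adaptor setup i mean looks something like this. just a sketch from memory; the file type, codec settings and sizes here are my own assumptions:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// sketch only: an AVAssetWriter fed through an AVAssetWriterInputPixelBufferAdaptor,
// i.e. writing frames to a movie file in real time rather than monitoring them out.
static AVAssetWriterInputPixelBufferAdaptor *makeWriterAdaptor(NSURL *outputURL, NSError **error)
{
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:error];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              AVVideoCodecH264, AVVideoCodecKey,
                              [NSNumber numberWithInt:1280], AVVideoWidthKey,
                              [NSNumber numberWithInt:720], AVVideoHeightKey, nil];
    AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                   outputSettings:settings];
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                                         sourcePixelBufferAttributes:nil];
    [writer addInput:input];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];
    // per frame: [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
    return adaptor;
}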

i don't think it's impossible to monitor out with av foundation. apple obviously thought all this out and for whatever reasons chose to omit a monitor-out solution. i started video editing with various systems before FCP, from Media 100 BNC systems to Avid Composer systems. with setups like that it was imperative to have a broadcast monitor. FCP came out and i chose to use it over other, more expensive solutions before all the fancy color sync and correction tools and whatnot were implemented. so with FCP 1 you really needed an external broadcast monitor to check your colors with.

video content delivery seems to be pushing towards an all-RGB pixel space. so if the internet is going to be the new way we distribute videos, having a broadcast monitor is pointless. however, you can still color sync your monitor to whatever broadcast specs you want and use your computer's external dvi out as a monitor once you calibrate it. now if that is not possible, then i am totally shocked.

this sucks for people that already have these expensive monitoring solutions and older computer systems and cameras. as for me, i would have to go to my mom's house to get access to a computer with this new thunderbolt IO system.

hey, I'm all for the evolution and progression of media technology. however, it is really hard to keep up with; planned obsolescence is a bitch.

what happened to dvd studio pro? or is it blu-ray studio pro now?

vade
Re: v002 screen capture fails to work with histogram

Don't put words in my mouth. We are quickly delving into NDA territory here but whatever. Lion is literally a few days away.

1) No, I am not talking about IOSurface. IOSurface is thankfully working 100% in 10.7. Syphon works as well. I specifically mentioned that 10.7 breaks all OpenGL screen capture applications. That is all and exactly what I meant. That has nothing to do with IOSurface; it has everything to do with the removal and breakage of CGL APIs for doing full-screen screen capture. It does not work. At all. This means ScreenFlow, iShowU, etc., and Apple developer examples no longer work. See below.

2) The replacement API is AVFoundation. The replacement API is slow as fuck-balls and does not provide nearly enough performance or features. This is a shame. People are scrambling to use the new API and ensure they can get things as fast as possible. Lots of features will be broken or missing in 10.7, if the app works at all.

Read the relevant dev forum thread here, if you have access:

https://devforums.apple.com/thread/95604?tstart=0

3) As far as I know, there is no Core Media sample buffer to OpenGL example on the Mac, nor an API to do that at all. The method is to use CMSampleBufferGetImageBuffer. CMSampleBuffer to CVImageBuffer to OpenGL texture is not a fast path and is slow as balls. I cannot emphasize that enough. It is slow. As. Balls. Slower than any other path that has existed on Mac OS X I have ever used, even QTMovie -> NSImage. If you've read the documentation you will see it acknowledges this.

The proper way to handle this is to use an AVPlayerLayer. AVPlayerLayer is fast as hell. However, doing this means you have no control over the produced texture, and have to capture your render, wasting a potentially ginormous amount of VRAM in the process.
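For illustration, the player-layer path is roughly this. A sketch only, not production code; hostLayer and the movie URL are placeholders:

#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// Sketch only: AVPlayerLayer plays back fast, but the frames live in the
// layer tree, so getting a texture back out means re-capturing your own render.
static AVPlayerLayer *attachPlayerLayer(CALayer *hostLayer, NSURL *movieURL)
{
    AVPlayer *player = [AVPlayer playerWithURL:movieURL];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = hostLayer.bounds;
    [hostLayer addSublayer:playerLayer];
    [player play];
    return playerLayer;
}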

Read the relevant thread here, if you have access:

https://devforums.apple.com/thread/89032?tstart=0

4) The question is not about capture, which AVFoundation handles well. It is about CoreMediaIO plugins to allow uncompressed output via a hardware solution like AJA, Black Magic, etc. boards. Current beta drivers do this in probably the worst way possible, emulating an additional software-accelerated screen. This means UI elements, etc., are also potentially in the output image. It also means that changing even easy setups in FCPX requires matching the output frequency, resolution, and bit depth in System Preferences -> Monitors. Part of this is because there is no real output-to-tape functionality in FCPX, so it was not a priority for AVFoundation. AVFoundation, to my knowledge, has no API for output, just capture.

"apple obviously thought all this out"

What they did was piss a lot of people off, break a lot of functionality, and introduce amateur-hour bugs in FCPX that make workarounds to the broken functionality (like no FXPlug) non-functional, because rendering happens in the wrong color space, in the wrong blend mode, with incorrect gamma.

I'm sure things will get better, but it looks like things are much worse for the time being. 10.7 is really an odd beast.

dust
Re: v002 screen capture fails to work with histogram

didn't mean to put words in your mouth vade. i have not actually used FCPX yet so my opinions are really moot. i am sure i will be as outraged as you and others are once i actually open it up. i'm only speaking in reference to the omission of monitor output in FCPX when i say that was "obviously thought out". i mean, come on! cutting out that one feature, and all the others people are talking about, had to take some clear thought. it's not a gross oversight, like "oops, i forgot to program in tape output".

again i can't really comment on the lion side of things as i'm still using snow leopard, but i have read the AVFoundation programming guide more than once and can confidently say that AVFoundation deals with both IO (input and output): the AVCaptureVideoDataOutput object lets you define your video output settings. yes, technically, outputting to external hdmi or to an apple tv at the moment requires additional frameworks.

i know the woes of trying to do this on iOS but have not tried anything AVFoundation-related on the mac yet. i'm taking vade's word for it on the mac, as i understand AVPlayer and AVPlayerLayer and the underlying concepts pretty well. just changing the volume of an AVPlayer is a bit tricky, and to get at the raw video frames for gl processing i used CMSampleBufferRef -> CVImageBufferRef -> GL, or to a CGImage.

slowly like...

(again this is for iphone)

// upload the pixels of a captured frame into the currently bound GL texture.
// assumes the capture output delivers kCVPixelFormatType_32BGRA frames and that
// a texture of matching size has already been created and bound to GL_TEXTURE_2D.
- (void)textureFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
    unsigned char *linebase = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);
    CMFormatDescriptionRef formatDesc = CMSampleBufferGetFormatDescription(sampleBuffer);
    CMVideoDimensions videoDimensions = CMVideoFormatDescriptionGetDimensions(formatDesc);
    // CPU -> GPU copy of every frame: this is the slow part.
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, videoDimensions.width, videoDimensions.height,
                    GL_BGRA_EXT, GL_UNSIGNED_BYTE, linebase);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
}

vade is right, this type of method is slow and not suited for real-time video processing. apple even acknowledges this fact in the AVFoundation programming guide but offers no other clear alternative or solution, only a sluggish sampleBuffer-to-UIImage method via a CGContext. again, this is all pertaining to iOS.

everybody knows AVFoundation, or video in general, is crippled on the iphone, so i just don't see this as a huge surprise if indeed the same issues arise with AVFoundation on the mac in lion.

vade
Re: v002 screen capture fails to work with histogram

It is a gross misunderstanding of the industry. Tape is far from dead. It's presumptuous, and means that shops of all sizes that have invested in earlier versions of Final Cut Pro, decks, input boards, SANs and workflows are basically fucked if they want to update. What is equally odd is that CoreMediaIO seems to be able to handle it fine. It was not a mistake, it was an oversight and a miscalculation, and it is costing them dearly in terms of how professionals feel they are viewed. I've spoken with Apple's Final Cut Pro evangelists personally at editorial houses I used to work for, and had them listen to editors voice complaints, request features, or simply voice concerns. It's clear to everyone that the product was not made for professional editors, but for people cutting indie films from a DSLR.

Secondly, FCPX has huge bugs with multi-monitor setups as it is; on top of that, there are huge bugs in pixel format selection, in where textures reside (on multi-GPU systems), and the like. The program needs a lot of work. Seriously. Too many real issues to discuss.

Anyway, rant aside,

Quote:
AVFoundation deals with both IO (input and output): the AVCaptureVideoDataOutput object lets you define your video output settings.

Did you even read the header?

Quote:

@class AVCaptureOutput @abstract AVCaptureOutput is an abstract class that defines an interface for an output destination of an AVCaptureSession.

@discussion AVCaptureOutput provides an abstract interface for connecting capture output destinations, such as files and video previews, to an AVCaptureSession.

An AVCaptureOutput can have multiple connections represented by AVCaptureConnection objects, one for each stream of media that it receives from an AVCaptureInput. An AVCaptureOutput does not have any connections when it is first created. When an output is added to an AVCaptureSession, connections are created that map media data from that session's inputs to its outputs.

Concrete AVCaptureOutput instances can be added to an AVCaptureSession using the -[AVCaptureSession addOutput:] and -[AVCaptureSession addOutputWithNoConnections:] methods.

Quote:
/*! @class AVCaptureVideoDataOutput @abstract AVCaptureVideoDataOutput is a concrete subclass of AVCaptureOutput that can be used to process uncompressed or compressed frames from the video being captured.

@discussion Instances of AVCaptureVideoDataOutput produce video frames suitable for processing using other media APIs. Applications can access the frames with the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.

AVCaptureVideoDataOutput is not for sending "out" arbitrary frames. It is used in conjunction with an AVCaptureSession, and lets one take a captured frame and do something with it. The prerequisite is that you have a capture session running. Capture. Session.
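To make the distinction concrete, the capture-side setup that AVCaptureVideoDataOutput belongs to looks roughly like this. A sketch only; frameHandler stands in for whatever object of yours implements captureOutput:didOutputSampleBuffer:fromConnection:.

#import <AVFoundation/AVFoundation.h>

// Sketch only: frames flow *from* a device *into* your delegate.
// There is no analogous object for pushing arbitrary frames *out* to hardware.
static AVCaptureSession *startCaptureSession(id <AVCaptureVideoDataOutputSampleBufferDelegate> frameHandler)
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input)
        [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    dispatch_queue_t queue = dispatch_queue_create("capture.video", NULL);
    [output setSampleBufferDelegate:frameHandler queue:queue];
    [session addOutput:output];

    [session startRunning]; // only now do sample buffers start arriving at the delegate
    return session;
}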

The issue is FCPX's and AVFoundation's lack of "video output components", in older API parlance. Video output components are the opposite of capture components: they take arbitrary video frames and send them to a device. Thus output to tape, and preview output of your source or sequence frames, with tight, hardware-based, genlocked timing of video frequencies, formats and resolutions.

As was said before, this is not something AVFoundation currently handles. CoreMediaIO looks sufficient to do this; I know little about the API, but a quick look at the headers suggests it is possible.

dust
Re: v002 screen capture fails to work with histogram

vade, you have the right to rant and be upset, as does anybody else whose business is directly affected by the new final cut. don't listen to me, i'm talking out of my ass, i haven't used final cut in years. i do feel bad for people that have gear, jobs, plugins, applications, etc. that are affected by all this.

mron
Re: v002 screen capture fails to work with histogram

Thanks for the example. It seems to have solved several problems I was having with the viewport resizing itself as well.

Great help

Ron

jersmi
Re: v002 screen capture fails to work with histogram

FCPX is rightly scaring the shit out of people. What is Apple becoming? Is Apple moving to iOS and leaving the rest behind? Is Apple moving away from Mac Pros and even MacBook Pros? Is Apple discontinuing the Pro Apps division? As a complete rewrite of FCP, perhaps adding proper pro features could bring the new version up to snuff, and maybe it adds up to an amazing new frontier (giving it the benefit of the doubt x10), but at present it's a mess. Not to mention the notorious lack of customer support veiled by ridiculously opaque marketing hype.