This post is about, and a listing of, different effects I wrote using the IBasicVideoEffect interface and wanted to share.
Doing this work requires a familiarity with coding in general, implementation of the Media Capture object, and some understanding of the basic video effect interface. If you can set up a Media Capture using Basic photo, video, and audio capture with MediaCapture, or with this simpler one, Simple Camera Preview Access, then add a custom video effect from the hardware sample (the second half, at the bottom of the linked help page), and it works, you are close enough.
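For orientation, here is a minimal sketch of how a custom effect gets attached to a running Media Capture, assuming the effect's sealed class lives in its own Windows Runtime Component. The class id and method name below are only stand-ins for illustration, not the actual components listed further down.

```csharp
using System.Threading.Tasks;
using Windows.Foundation.Collections;
using Windows.Media.Capture;
using Windows.Media.Effects;

// 'mediaCapture' is assumed to be initialized and previewing already.
// "VideoEffectComponent.FrameWidthXorEffect" is a stand-in activatable class id.
static async Task AddCustomEffectAsync(MediaCapture mediaCapture)
{
    var definition = new VideoEffectDefinition(
        "VideoEffectComponent.FrameWidthXorEffect",
        new PropertySet());   // optional settings, handed back to the effect's SetProperties

    // Attach the effect to the preview stream; MediaStreamType.VideoRecord
    // would put it on the record stream instead.
    await mediaCapture.AddVideoEffectAsync(definition, MediaStreamType.VideoPreview);
}
```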
It is a lot of work. Especially UI, structures, and getting everything to run right.
Everything is in C-Sharp these days, so now that is what I use too. Stuff that uses pointers is subject to the typical IntPtr warnings, and is 'not meant for production.'
First off, here are the effects I wrote. I will provide the sealed classes and documentation for each of my components on separate pages here on this blog.
It needs to be mentioned right now that the true bitwise effects are slow. There is no hardware video acceleration for such things. Using them in a chain can cause hangs, and high-resolution use can slow the frame rate. These were tested at a maximum resolution of 1920 x 1080.
FramePoolEffect -- keeps a circular frame pool running alongside the video at all times, so a near-new frame can be grabbed immediately.
FrameAcquireEffect -- acquires frames one at a time, pretty fast, without async calls. Except for the one async call. Oh.
[Image: Eye Yam XORING.]
FrameWidthXorEffect -- XORs every frame, bitwise, pixel by pixel, with its predecessor (see picture; a rough sketch of the per-pixel loop follows this list).
FramePoolXorEffect -- XORs every frame with the one (framepoolsize - 1) frames back. The frame pool size is adjustable.
[Image: I am shifting left by 3 bits.]
ShlEffect -- shifts the bits in each pixel's RGB channels left by any number of places. Strips out the upper bits and brightens the lower ones.
[Image: I am not watching moving content at all.]
FramePoolBlankingEffect -- blots out moving content; three variables can be adjusted to blank out anything that moves over almost any short time frame.
[Image: I am zooming where it says OK.]
ZoomEffect -- uses hardware acceleration to cut out and upscale an image segment for output. I use this to zoom my video output.
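For the curious, here is a rough sketch of the per-pixel loop behind the XOR idea, assuming CPU memory and Bgra8 frames. The helper and its names are made up for illustration (and the project must allow unsafe code); this is not the actual FrameWidthXorEffect class, just the shape of the technique.

```csharp
using System;
using System.Runtime.InteropServices;
using Windows.Graphics.Imaging;

// COM interface for reaching the raw pixel bytes of a BitmapBuffer,
// the same trick the custom video effect sample on Microsoft Docs uses.
[ComImport]
[Guid("5B0D3235-4DBA-4D44-865E-8F1D0E4FD04D")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
unsafe interface IMemoryBufferByteAccess
{
    void GetBuffer(out byte* buffer, out uint capacity);
}

static class XorFrameHelper
{
    // XORs 'current' against 'previous', pixel by pixel, into 'output'.
    // All three SoftwareBitmaps are assumed to be Bgra8 and the same size,
    // which is what a CPU-memory IBasicVideoEffect hands you in ProcessFrame.
    public static unsafe void XorFrames(SoftwareBitmap previous,
                                        SoftwareBitmap current,
                                        SoftwareBitmap output)
    {
        using (BitmapBuffer prevBuf = previous.LockBuffer(BitmapBufferAccessMode.Read))
        using (BitmapBuffer currBuf = current.LockBuffer(BitmapBufferAccessMode.Read))
        using (BitmapBuffer outBuf = output.LockBuffer(BitmapBufferAccessMode.Write))
        using (var prevRef = prevBuf.CreateReference())
        using (var currRef = currBuf.CreateReference())
        using (var outRef = outBuf.CreateReference())
        {
            byte* prevBytes, currBytes, outBytes;
            uint prevCap, currCap, outCap;
            ((IMemoryBufferByteAccess)prevRef).GetBuffer(out prevBytes, out prevCap);
            ((IMemoryBufferByteAccess)currRef).GetBuffer(out currBytes, out currCap);
            ((IMemoryBufferByteAccess)outRef).GetBuffer(out outBytes, out outCap);

            BitmapPlaneDescription desc = currBuf.GetPlaneDescription(0);
            for (int row = 0; row < desc.Height; row++)
            {
                for (int col = 0; col < desc.Width; col++)
                {
                    int i = desc.StartIndex + desc.Stride * row + 4 * col;
                    outBytes[i]     = (byte)(currBytes[i]     ^ prevBytes[i]);     // B
                    outBytes[i + 1] = (byte)(currBytes[i + 1] ^ prevBytes[i + 1]); // G
                    outBytes[i + 2] = (byte)(currBytes[i + 2] ^ prevBytes[i + 2]); // R
                    outBytes[i + 3] = 255;                                          // A: keep opaque
                }
            }
        }
    }
}
```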
Notes:
It should be noted that while I have been able to implement effects in a Media Capture environment without problem, I have yet to get one to work with Media Player. There are still some things I want to try, but I have other things to do.
The documentation for the TimeIndependent property of the video effect says to set it depending on whether the effect is time independent or not, but explains nothing about what that means in any context. Whatsoever. I did run into something else that was time independent/dependent, but it was a bit obscure as to whether that is what is going on here. The hint was that Media Capture is time dependent, while composition, which can run at whatever speed it likes, is not.
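For what it is worth, here is how that reading translates into the property itself. This is a guess at the intent, not official guidance:

```csharp
// Live capture is tied to the clock, so report the effect as time dependent.
// A composition scenario, which can render as fast as it likes, could
// presumably return true here instead.
public bool TimeIndependent { get { return false; } }
```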
More things not mentioned: the inner void Close is called when the effect starts up, as well as twice, with different parameters, when closing. So if you are erasing some pointer or unhooking some memory there, you need to check the pointer for null, because the call will sweep through again, and waaaaaay earlier than you would expect.
Also, the void SetProperties runs twice on startup, so that is not the place where you want to be allocating memory, for instance. Maybe that has something to do with me debugging.
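The practical upshot is to make those two members defensive. Here is a sketch of the pattern, shown as just the two members inside the sealed effect class; the field is a hypothetical stand-in for whatever your effect actually allocates.

```csharp
// Inside the sealed effect class. Assumes:
// using System; using System.Runtime.InteropServices;
// using Windows.Foundation.Collections; using Windows.Media.Effects;

// Hypothetical unmanaged scratch buffer standing in for whatever the effect holds on to.
private IntPtr _scratchBuffer = IntPtr.Zero;

public void SetProperties(IPropertySet configuration)
{
    // This can run more than once on startup, so only read configuration here;
    // defer allocations to SetEncodingProperties or the first ProcessFrame.
}

public void Close(MediaEffectClosedReason reason)
{
    // Close can arrive early, and again (twice) on teardown, so guard anything
    // released here against a second pass.
    if (_scratchBuffer != IntPtr.Zero)
    {
        Marshal.FreeHGlobal(_scratchBuffer);
        _scratchBuffer = IntPtr.Zero;
    }
}
```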
Loose Discussion:
I have been playing around recently with the IBasicVideoEffect interface after mastering Media Capture. I stumbled across the custom video effects page at Microsoft Docs. Bookmarking it, I supposed one day I would "figure it out." It seemed so arcane, and everything was in C-Sharp as well. Then I learned some C-Sharp and was good to go.
After working with it some, I ported everything to VB, but then went back to C-Sharp. I don't know why. It just felt like it. This is possible because custom video effect code is independent of the project code and even resides in its own project.
Programming any of these things requires patience and troubleshooting skills.
Just working with the Media Capture was touchy. It often fails on the video handshake, which is a big problem because it throws no error and reports back as running and OK.
A programmer's dream there.
To test whether the Media Capture is synced with the source, I check to see if it can deliver on MediaCapture.GetPreviewFrameAsync. It may crash, but if the Media Capture is running and not showing video, the call typically just bails into the ether without throwing any error. So I wrap the test in a timer loop: if the VideoFrame exists after some period of time, the code has not jumped into nothingness. Otherwise I have to do a whole new reinit after closing the Media Capture object and getting rid of it. The timer loop's time should be generous; it can take a while to deliver a frame.
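Here is roughly what that check looks like in code. The ten-second timeout is only a guess at "generous," and the method name is made up for this sketch.

```csharp
using System;
using System.Threading.Tasks;
using Windows.Media;
using Windows.Media.Capture;

// Returns true if the preview actually delivered a frame within the timeout.
// If it returns false, dispose the MediaCapture and reinitialize from scratch.
static async Task<bool> PreviewIsAliveAsync(MediaCapture mediaCapture)
{
    try
    {
        Task<VideoFrame> frameTask = mediaCapture.GetPreviewFrameAsync().AsTask();
        Task finished = await Task.WhenAny(frameTask, Task.Delay(TimeSpan.FromSeconds(10)));

        if (finished == frameTask &&
            frameTask.Status == TaskStatus.RanToCompletion &&
            frameTask.Result != null)
        {
            frameTask.Result.Dispose();   // a real frame came back; the pipeline is synced
            return true;
        }
    }
    catch
    {
        // GetPreviewFrameAsync can also just crash; treat that the same as a timeout.
    }
    return false;   // no frame in time: the call wandered off into the ether
}
```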
It is a very hard error to track because it fails without raising any flag. It was a big pain in the ass for a few hours there. Or a couple of days! Most of the time was wasted going over properties of things, trying to find a flag that would indicate an error state. Then I got interested in what the black frames were made of, which is when I started trying to get them through that async call and came up with the checking paradigm. The black frames of nothingness are exactly that.
One of the problems with programming is that so few things work "right out of the box" when implemented. Stuff that does is subpar. Everything needs configuration and customization. Lots of "scratch" work.
The first effect I made was a motion blanker. The idea was to use a second computer to display the content I was trying to read, but blank out any motion so I could read that content.
After that I played around for a while with full 32-bit, per-bit masking. One problem with those is the amount of processing time: around two hundred milliseconds per frame. I can use it for composition, if I ever have that. Slowly.
I then worked my way through simple bitwise operator effects using back frames. The most dramatic of these are the two XOR effects. They ran at 50 ms per frame at 1920 x 1080 on my I7-3000. I tried other bit comparisons, OR and also AND, but they were meh.
I actually watch TV sometimes using the XOR effect, especially cartoons. You really get to see how they were drawn by the amount of motion. Only the things that change are what you see.
Note that normal Graphics.Canvas.Effects items run way faster than this when you are processing frames.
For instance, I have a video effect class that uses the Atlas and Scale Graphics.Canvas.Effects inside its processing cycle, instead of my bitwise things. It runs around 10 ms at 1920 x 1080 and does not stumble. It is fast because it employs dedicated video processing hardware, which is available on my computer and most any other Windows PC -- no video card necessary. There is a passable hardware visual processor built into the normal architecture, I assume. I have never seen a machine missing hardware video in recent times, though it can be turned off in settings.
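Since the post does not show it, here is a guess at the shape of that Atlas-plus-Scale processing cycle, written for the GPU path. The crop rectangle and 2x factor are made-up values for a 1920 x 1080 frame, and 'canvasDevice' is assumed to be a CanvasDevice field created from the Direct3D device handed to SetEncodingProperties; this is not the actual ZoomEffect code.

```csharp
using System.Numerics;
using Microsoft.Graphics.Canvas;
using Microsoft.Graphics.Canvas.Effects;
using Windows.Foundation;
using Windows.Media.Effects;

// Inside the effect class, with SupportedMemoryTypes reporting Gpu.
public void ProcessFrame(ProcessVideoFrameContext context)
{
    using (CanvasBitmap input = CanvasBitmap.CreateFromDirect3D11Surface(
               canvasDevice, context.InputFrame.Direct3DSurface))
    using (CanvasRenderTarget output = CanvasRenderTarget.CreateFromDirect3D11Surface(
               canvasDevice, context.OutputFrame.Direct3DSurface))
    using (CanvasDrawingSession ds = output.CreateDrawingSession())
    {
        // Cut the middle 960 x 540 region out of the 1920 x 1080 frame...
        var cutout = new AtlasEffect
        {
            Source = input,
            SourceRectangle = new Rect(480, 270, 960, 540)
        };
        // ...and let the GPU upscale it by 2x to fill the output frame.
        var zoom = new ScaleEffect
        {
            Source = cutout,
            Scale = new Vector2(2.0f, 2.0f),
            InterpolationMode = CanvasImageInterpolation.Linear
        };
        ds.DrawImage(zoom);
    }
}
```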
There is a long list of those kinds of fast hardware-accelerated effects to be found, ones that can be used in a custom effect like here, in normal drawing on a CanvasControl, and in game drawing on a CanvasAnimatedControl. They are listed at Graphics.Canvas.Effects.
I combed it for ones that I could use to implement bitwise things, to no avail. One had an "exclusive or" option as a parameter, but I am not sure what they were going for with that, because it looked nowhere near true bitwise XORing when I set it up. Lots of blends, shading, different math and tables, but no fricking bits. Maybe I missed one while reading tired. There sure are a lot of them. I did work through them several times.
With a little down-conversion, my less radical bitwise effects will run fast enough for real time. I can run a frame-to-frame or frame pool XOR effect at 1360 x 768 and 30 FPS with fair success. Again, on an I7-3000.
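The post does not say how the down-conversion happens; one simple way, if you do not want to scale inside the effect itself, is to ask the camera for a smaller preview format up front. A sketch, using the 1360 x 768 figure from above as the target width:

```csharp
using System.Linq;
using System.Threading.Tasks;
using Windows.Media.Capture;
using Windows.Media.MediaProperties;

// Picks the largest available preview format no wider than 1360 pixels,
// so the effect chain never sees a full 1920 x 1080 frame.
static async Task UseSmallerPreviewAsync(MediaCapture mediaCapture)
{
    var candidates = mediaCapture.VideoDeviceController
        .GetAvailableMediaStreamProperties(MediaStreamType.VideoPreview)
        .OfType<VideoEncodingProperties>()
        .Where(p => p.Width <= 1360)
        .OrderByDescending(p => p.Width)
        .ToList();

    if (candidates.Count > 0)
    {
        await mediaCapture.VideoDeviceController.SetMediaStreamPropertiesAsync(
            MediaStreamType.VideoPreview, candidates[0]);
    }
}
```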
If you have read this far, well, thanks for reading. 😺 Good luck!