Virtualdub Recording In 8bit Modes?
Vigile
Posted: Mar 9 2013, 12:33 AM


Newbie


Group: Members
Posts: 1
Member No.: 36229
Joined: 9-March 13



I am using VirtualDub to record data from a dual-link DVI capture card, the Datapath DL-DVI. I am trying to capture 1920x1080 @ 120 Hz, but bandwidth is the limit on the card. I don't need perfect reproduction, and I think an 8-bit video capture would be fine; I have heard that VirtualDub supports it.

Any idea how to do that?
phaeron
Posted: Mar 9 2013, 07:26 PM


Virtualdub Developer


Group: Administrator
Posts: 7773
Member No.: 61
Joined: 30-July 02



Ouch, I've bought entire computers that cost less than this capture card.

I'm pretty sure that the distinction between an 8-bit and a 24/32-bit video mode is gone by the time the video card is producing DVI output -- the video card would have already expanded the 8-bit display. VirtualDub will capture 8-bit (256 color) output, but only if the capture hardware itself can produce that format. From what I can tell, DVI only supports RGB, which means it could not possibly produce classical indexed color output at 8 bits/pixel.

What's more confusing is that "8-bit" can also mean 8 bits per channel, as in 8 bits/channel * 3 channels (RGB) = 24 bits/pixel. You are most likely already capturing that by default.
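To make the difference concrete, here is a back-of-the-envelope sketch (not from the original thread) of the uncompressed data rates the two meanings of "8-bit" imply for the poster's 1920x1080 @ 120 Hz signal, ignoring blanking intervals:

```python
# Uncompressed data rates for 1920x1080 @ 120 Hz under the two
# meanings of "8-bit" (active pixels only, blanking ignored).
width, height, fps = 1920, 1080, 120
pixels_per_sec = width * height * fps

# Meaning 1: 8-bit indexed color (256-entry palette) -> 1 byte/pixel
indexed_mbps = pixels_per_sec * 1 / 1e6

# Meaning 2: 8 bits/channel * 3 channels (RGB) -> 3 bytes/pixel
rgb24_mbps = pixels_per_sec * 3 / 1e6

print(f"8-bit indexed: {indexed_mbps:.1f} MB/s")
print(f"24-bit RGB:    {rgb24_mbps:.1f} MB/s")
```

So indexed color would cut the data rate to a third, which is why the ambiguity matters here, but as noted above, the DVI link itself never carries an indexed format.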

One way to reduce bandwidth would be to get the computer to output 4:2:2 YCbCr instead of 24-bit RGB, but it looks like this is only possible with HDMI and not DVI.
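As a rough sketch of what 4:2:2 would save (my own arithmetic, not from the thread): with 8 bits per sample, 4:2:2 shares each Cb/Cr pair between two horizontal pixels, averaging 16 bits/pixel versus 24 for RGB.

```python
# Bandwidth comparison: 24-bit RGB vs 8-bit 4:2:2 YCbCr for the same signal.
width, height, fps = 1920, 1080, 120
pixels_per_sec = width * height * fps

rgb24_mbps   = pixels_per_sec * 24 / 8 / 1e6  # 24 bits/pixel
ycbcr422_mbps = pixels_per_sec * 16 / 8 / 1e6  # 16 bits/pixel average

print(f"24-bit RGB:     {rgb24_mbps:.1f} MB/s")
print(f"4:2:2 YCbCr:    {ycbcr422_mbps:.1f} MB/s")
print(f"ratio:          {ycbcr422_mbps / rgb24_mbps:.3f}")
```

That is only a one-third reduction, so it would help at the margin rather than solve the dual-link bandwidth limit outright.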