|
|
| rjisinspired |
| Posted: Dec 26 2011, 10:16 AM |
 |
|

Advanced Member
  
Group: Members
Posts: 1256
Member No.: 20008
Joined: 12-October 06

|
I'm starting a video project and at the start of this video there will be a gradated background. I'm using PSP X2 for working with still images.
Looking around, I've seen advice to select 16-bit color inside PSP or to add some uniform noise to the gradient to smooth out the bands, but none of these have really worked. I switched to gaussian noise and the noise is much finer, but right when it cancels out the banding, the grain becomes very evident.
I went ahead and checked my display settings: they're at 32-bit color, and there is no 24-bit option, only 16 or 32.
Many of the effects require an 8-bit image, so I get dialogs asking to convert down to 8-bit.
Are there any alternate ways to minimize or eliminate banding? |
 |
| phaeron |
| Posted: Jan 8 2012, 06:19 AM |
 |
|

Virtualdub Developer
  
Group: Administrator
Posts: 7773
Member No.: 61
Joined: 30-July 02

|
You're mixing up the bitness values. 24-bit color means 24 bits per pixel, and since there are three channels -- red, green, and blue -- that's 8 bits per channel. That 8 bits/channel is what's giving you the banding. 32-bit color is no better than 24-bit color because it's just 24-bit color with an extra unused 8 bits thrown in to pad the pixels out to a nice size.
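As a rough sketch of the arithmetic (plain Python, with a hypothetical 1920-pixel-wide gradient as the example -- not anything PSP-specific):

```python
# Why 32-bit display color still bands: bits per *channel* are what matter.
bits_per_pixel_24 = 24
channels = 3                                        # red, green, blue
bits_per_channel = bits_per_pixel_24 // channels    # 8 bits per channel
levels_per_channel = 2 ** bits_per_channel          # 256 distinct shades

# 32-bit color is the same 8 bits/channel plus 8 unused padding bits.
bits_per_pixel_32 = bits_per_channel * channels + 8

# A screen-wide gradient has far more pixels than shades, so each shade
# repeats across a visible band of pixels.
gradient_width_px = 1920                            # hypothetical width
pixels_per_band = gradient_width_px / levels_per_channel

print(bits_per_channel, levels_per_channel, pixels_per_band)
```

So a full-range gradient across a 1920-pixel screen gets stepped into bands several pixels wide, which is exactly the staircase you're seeing.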
What you need is more than 8 bits per channel. In Photoshop, this is done by choosing a 16- or 32-bit per channel depth. Keep in mind, though, that your display is still 8 bits/channel, so unless PSP X2 supports dithering in the display, you're still going to see banding. What it will still help with is preventing the banding (quantization errors) from accumulating and getting worse while you work. You do need to be careful that any filters you use also support deep color depths, as they may still be limited to 8 bits/channel.
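A toy illustration of how per-step rounding accumulates error: apply a hypothetical gamma adjustment and then its inverse, quantizing to integers at each step (as an 8-bit pipeline must) versus carrying full precision (a stand-in for a 16-bit/channel working depth). Pure Python, not PSP's actual pipeline:

```python
def round8(v):
    """Quantize a 0..255 channel value to an integer, as 8-bit storage forces."""
    return max(0, min(255, round(v)))

value = 100  # some mid-gray channel value

# 8-bit path: round to an integer after every operation.
v8 = round8((value / 255) ** 2.2 * 255)        # gamma adjustment
v8 = round8((v8 / 255) ** (1 / 2.2) * 255)     # inverse adjustment

# Deep path: same two operations carried in floating point throughout.
vf = (value / 255) ** 2.2 * 255
vf = (vf / 255) ** (1 / 2.2) * 255

# The deep path round-trips back to 100; the 8-bit path drifts off by a step.
print(v8, round(vf))
```

Each 8-bit edit nudges values onto the nearest of 256 rungs, and those nudges compound over a long editing session; a deep working depth keeps the rungs fine enough that the final single quantization is all you pay.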
There are displays that support 10 bits per channel and will show greater precision than a standard 24-bit display. They're uncommon, though. If you have such a display AND have a video card that can drive it at that depth AND have a program that sets an appropriate video mode, then you can get better than 24-bit color on screen.
The big problem you'll run into in the end is that video compression schemes mostly don't support more than 8 bits/channel, except for a few 10-bit/channel formats used specifically for video editing. In fact, standard video formats are usually WORSE than 8 bits/channel, as they have headroom in the luma and chroma channels and only have about 7.8 bits in luma. From a pure image standpoint, the way to combat this is dithering, which amounts to adding a bit of noise to the image to break up the banding patterns. This is more difficult in a compressed video format, though, because the noise is the first thing that the codec will stomp out to try to save bits. If you're willing to forgo the smooth gradient, then deliberately adding a texture to the background would be a more forceful way to accomplish this.
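A minimal sketch of the dithering idea, using a made-up 1024-pixel gradient quantized down to only 16 levels so the effect is exaggerated (pure Python, not any codec's actual dither):

```python
import random

random.seed(1)

width = 1024
# High-precision gradient spanning just 16 output levels across 1024 pixels.
hi = [x / (width - 1) * 15 for x in range(width)]

# Plain rounding to integers: long runs of identical values = visible bands.
banded = [round(v) for v in hi]

# Add +/-0.5 levels of uniform noise before rounding: each band edge becomes
# a noisy mix of the two neighboring levels, which the eye averages out.
dithered = [max(0, min(15, round(v + random.uniform(-0.5, 0.5)))) for v in hi]

def run_count(vals):
    """Count runs of identical consecutive values (fewer runs = wider bands)."""
    return 1 + sum(1 for a, b in zip(vals, vals[1:]) if a != b)

# The banded version collapses to 16 wide steps; the dithered one breaks
# those steps into many short runs while keeping the same average level.
print(run_count(banded), run_count(dithered))
```

The catch described above is that this added noise is high-frequency detail, which is precisely what a lossy codec discards first, so the bands can reappear after compression.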
|
 |
| rjisinspired |
| Posted: Jan 18 2012, 09:42 AM |
 |
|

Advanced Member
  
Group: Members
Posts: 1256
Member No.: 20008
Joined: 12-October 06

|
Thanks Phaeron.
I started off with a 16 bits/channel image and made these 3 examples: http://rjschat.dyndns.org:8080/paranoha/ap...dub/bluegrad.7z
The image "bluegrad" is a plain flood-fill of a black-and-white gradient, then colorized to blue. "bluegrad-gaussian3" is the same image with 3% gaussian noise added, and "bluegrad-uniform3" has 3% uniform noise added.
The noise in the 3% uniform sample might be visible. I kind of see it, even though my eyes aren't that good, or maybe I'm just seeing flecks, lol.
About dithering in the display: do you mean dithering applied to the displayed image itself, or the program working with the physical display in some way?
|
 |
| phaeron |
| Posted: Jan 23 2012, 11:53 PM |
 |
|

Virtualdub Developer
  
Group: Administrator
Posts: 7773
Member No.: 61
Joined: 30-July 02

|
By dithering in the display, I mean the program dithering the image that it puts on screen without affecting the one you're working on. The hardware display can do dithering, too, but usually that's well hidden from you -- in fact, I think that's how just about all modern displays work. |
 |