10 bit color output

*sensei*
Posts: 267
Joined: 12 Oct 2012, 19:14

10 bit color output

Hello, folks! Hi, Boris!

It's been a while since the hype around 10 bit color output calmed down, yet I'm curious whether it's possible to get it working in games.
Nowadays it's not a problem to buy a monitor that accepts 10 bit input and has a 16 bit internal LUT. But I don't have enough info: do consumer grade video cards support such output?
Is it possible to force the OS and DirectX / OpenGL to use 10 bits per channel output textures?

Why am I asking? It's time to replace my monitor, and I'm afraid that professional monitors only do 10 bit output with professional video cards and only in the latest Photoshop. Yet I want to try it with games.

*blah-blah-blah maniac*
Posts: 17442
Joined: 27 Dec 2011, 08:53
Location: Rather not to say

Re: 10 bit color output

If I remember correctly, I saw 10 bit output on an AMD video card a very long time ago, but everything else uses an rgbx (8 bits per channel) format for the back/front buffers. Games also work only with this format, and replacing it has a big chance of breaking the game. Personally I don't believe in such devices, because humans are unable to see more than 24 bits of color. Only if the display had a much bigger contrast and brightness range...

I already know which display I'll get next (after testing my current one, purchased not long ago): a 4k display used with its interpolation feature, running at 1920*1080 to keep the GUI a comfortable size. For me interpolation is the biggest loss after migrating from a CRT display; all the small details of an image (grass, sand) look like point-filtered textures, because I see every pixel of the display as a square. It's awful and doesn't give the "window onto reality" immersion of a CRT.
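
For reference, this is roughly what asking for a 10 bit back buffer looks like in Direct3D 11. It's only a sketch (the function name and the fixed 1920*1080 mode are just for the example), and even when the call succeeds, whether the panel really receives 10 bits depends on the GPU, the driver and the display connection:

#include <windows.h>
#include <d3d11.h>
#include <dxgi.h>
#pragma comment(lib, "d3d11.lib")

// Request a swap chain with a 10:10:10:2 back buffer instead of the usual
// 8 bit per channel format. If the device/driver cannot provide it, the
// call simply fails, which matches the point above: most games assume an
// 8 bit rgbx buffer and never ask for anything else.
HRESULT CreateTenBitSwapChain(HWND hwnd,
                              IDXGISwapChain** swapChain,
                              ID3D11Device** device,
                              ID3D11DeviceContext** context)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount       = 2;
    desc.BufferDesc.Width  = 1920;                          // example mode only
    desc.BufferDesc.Height = 1080;
    desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per color channel
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.OutputWindow      = hwnd;
    desc.SampleDesc.Count  = 1;
    desc.Windowed          = TRUE;
    desc.SwapEffect        = DXGI_SWAP_EFFECT_DISCARD;

    return D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, swapChain, device, nullptr, context);
}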
_________________
i9-9900k, 64Gb RAM, RTX 3060 12Gb, Win7

*sensei*
Posts: 267
Joined: 12 Oct 2012, 19:14

Re: 10 bit color output

Thanks for the reply! I'll look into 4k with interpolation.
My logic behind the idea of moving to 10 bits:
8 bits per channel gives only 256 gray gradations, and 10 bits would increase that to 1024. IMHO our eyes are quite sensitive to grays, so it sounds legit to me.
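
A rough illustration of the difference in step size (assuming a simple linear ramp from black to white; real displays apply a gamma curve on top, which this ignores):

#include <cstdio>

int main()
{
    const int levels8  = 1 << 8;   // 256 gray levels at 8 bits per channel
    const int levels10 = 1 << 10;  // 1024 gray levels at 10 bits per channel

    // Size of one step between neighbouring grays, as a fraction of full white.
    std::printf("8 bit step:  %.5f of full range\n", 1.0 / (levels8  - 1)); // ~0.00392
    std::printf("10 bit step: %.5f of full range\n", 1.0 / (levels10 - 1)); // ~0.00098
    return 0;
}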

*blah-blah-blah maniac*
Posts: 17442
Joined: 27 Dec 2011, 08:53
Location: Rather not to say

Re: 10 bit color output

Just take any gradient image and try to see the steps in it. I can't. Decreasing the number of steps from 256 to 128 shows color banding artifacts, and even 192 levels do, but not 256. This means a 10 bit display would need a maximal contrast difference between black and white that is 4 times bigger before the extra levels became visible. I don't think that's possible; even on a CRT display, with its perfect black level, I was not able to notice gradient steps.

When I lurked on the forums of photographers and designers, I was very disappointed at how incompetent they are with technical things. They just believe the numbers and buy the most expensive displays, claiming those are the only proper displays for their work. But in fact they are the same as everyone else and can't see the world differently, because eyes are the same and can't be trained (only women have slightly higher color perception, but in normal conditions you won't notice that). The world is full of fools and liars.
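
For anyone who wants to repeat the test, here is a small sketch that writes a horizontal grayscale ramp, quantized to a chosen number of levels, as a PGM file (the file name and image size are arbitrary choices). View it full screen and compare 128, 192 and 256 levels:

#include <cstdio>
#include <vector>

int main()
{
    const int width = 1024, height = 256;
    const int levels = 128;                          // try 128, 192, 256
    std::vector<unsigned char> row(width);

    for (int x = 0; x < width; ++x) {
        double t = double(x) / (width - 1);          // 0..1 ramp
        int step = int(t * (levels - 1) + 0.5);      // quantize to 'levels' steps
        row[x] = (unsigned char)(step * 255 / (levels - 1));
    }

    FILE* f = std::fopen("gradient.pgm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P5\n%d %d\n255\n", width, height); // binary 8-bit grayscale PGM
    for (int y = 0; y < height; ++y)
        std::fwrite(row.data(), 1, width, f);
    std::fclose(f);
    return 0;
}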
_________________
i9-9900k, 64Gb RAM, RTX 3060 12Gb, Win7

Posts: 41
Joined: 13 Jan 2016, 22:48

Re: 10 bit color output

Would not an OLED display offer enough contrast to see the difference? Dell is releasing a 30" Ultrasharp OLED later this year (at an obscene price).

*blah-blah-blah maniac*
Posts: 17442
Joined: 27 Dec 2011, 08:53
Location: Rather not to say

Re: 10 bit color output

OLED itself has poor quality after a very short period of usage because of panel degradation.
_________________
i9-9900k, 64Gb RAM, RTX 3060 12Gb, Win7