I THINK I'M ON TO SOMETHING


supergenius74 · Member · Joined Mar 1, 2001 · Fort Wayne, IN
Ok, I've been doing some research on this and can't seem to draw a good conclusion. The old question of whether 32-bit is better than 16-bit even at a performance hit is still bothering me after almost two years. What tipped me off is that I've been getting into image-editing software lately, and it only seems to support up to 24-bit images. According to what is written and my math skills:
8-bit images = 2 to the 8th power = 256 colors = crappy,
16-bit images = 2 to the 16th power = 65,536 colors = good,
24-bit images = 2 to the 24th power = 16,777,216 colors = no difference,
32-bit images = 2 to the 32nd power = 4,294,967,296 colors =
yes, that's over 4 billion colors, which I have yet to see. Now here's the kicker: I have yet to see a 32-bit image. In my software, 24-bit doesn't look any better than 16-bit, and every place on the net that talks about 32-bit images says they're 16 million colors, but isn't that 24-bit? I've tried every game and app I have, and I can't tell the difference between the 16-bit and 32-bit settings. But I already knew that. What I am wondering is, is there a restriction on human eyes that they are not telling us? Can we only see so many colors, like we can only hear so many frequencies? Are our monitors unable to create 16 million or even 4 billion colors? Why did manufacturers make such a big deal about 32-bit, and why do people like John Carmack talk about the future of games and 64-bit graphics? Man, that would be roughly a 1 with 19 zeros behind it in colors. Why do they even bother losing framerate over a color difference that I can't see, when they could be spending that framerate on other ways of making images better? I would love to hear other people's opinions and knowledge on this, please!
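For reference, the powers of two above are easy to check; here is a minimal C sketch of that arithmetic (just an illustration, not from any poster):

```c
#include <stdio.h>

int main(void)
{
    /* n bits per pixel give 2^n representable colors */
    int depths[] = { 8, 16, 24, 32 };
    for (int i = 0; i < 4; i++) {
        unsigned long long colors = 1ULL << depths[i];
        printf("%2d-bit: %llu colors\n", depths[i], colors);
    }
    return 0;
}
```

It prints 256, 65,536, 16,777,216, and 4,294,967,296; as the replies below explain, only 24 of the 32 bits are actually color.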
 
All those extra colour slots are for the women out there :) Us guys usually see a colour and call it one of three things (i.e. for purple we have light purple, dark purple, and purple). Women, on the other hand, see things as Lavender, Fuchsia, Ochre, Peach, etc. Since there are so many women out there, and they don't always agree with each other, 4 billion choices seems rather small ;D
 
While the human eye can only pick up so many colors, there are other traits those extra bits can go toward, like this little thing called 'alpha' (kinda like gamma). I won't get into details, but let's say that instead of using tricks to fake it, you can actually set how reflective a color is: a purple on metal will look different than the exact same purple on plastic. Make sense?

Stolid
 
32-bit color is the same as 24-bit color in that you have 8 bits for each of the three primary colors of light (red, green, and blue). The remaining 8 bits are for alpha. Alpha refers to the color's transparency and is used in things like alpha blending, where the video card, in rendering a pixel, blends its color with the color of the background pixels underneath it. How strongly it blends is determined by the 8 bits of the alpha channel, creating the illusion that the pixel is transparent and the background is showing through. So, with an 8-bit alpha channel, a pixel can have 256 varying levels of transparency, from completely opaque to nearly invisible. As you can probably imagine, this opens the door for a lot of interesting effects and possibilities.
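To make the blending idea concrete, here's a minimal C sketch (my own illustration; it assumes a packed ARGB8888 pixel layout, which is one common 32-bit format):

```c
#include <stdint.h>
#include <stdio.h>

/* Blend a 32-bit ARGB source pixel over a destination pixel.
   The top 8 bits are alpha: 255 = fully opaque, 0 = fully transparent. */
static uint32_t alpha_blend(uint32_t src, uint32_t dst)
{
    uint32_t a = (src >> 24) & 0xFF;
    uint32_t out = 0;
    for (int shift = 0; shift <= 16; shift += 8) {   /* blue, green, red */
        uint32_t s = (src >> shift) & 0xFF;
        uint32_t d = (dst >> shift) & 0xFF;
        uint32_t c = (s * a + d * (255 - a)) / 255;  /* weighted average */
        out |= c << shift;
    }
    return out | (0xFFu << 24);                      /* result is opaque */
}

int main(void)
{
    uint32_t glass = 0x80FF0000; /* red at roughly 50% transparency */
    uint32_t wall  = 0xFF0000FF; /* opaque blue behind it */
    /* prints 0xFF80007F: the red "glass" over the blue "wall" reads as purple */
    printf("blended: %08X\n", (unsigned)alpha_blend(glass, wall));
    return 0;
}
```

Each of the 256 alpha values shifts the weighted average, which is where those 256 levels of transparency come from.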
 
I'm sorry, but if you can't see the major color banding with 16-bit color, maybe you need to get your eyes checked. Make sure you are not playing a game that only has 16-bit textures (since 32-bit color will not make much difference then). The difference between 16-bit and 32-bit color is like night and day in my opinion... I really can't understand how you can claim to see no difference.
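The banding comes from how few steps each channel gets at 16-bit. A quick C sketch (an illustration assuming the common RGB565 16-bit layout, where the red channel keeps only 5 bits):

```c
#include <stdio.h>

/* In RGB565, red has 5 bits, so a smooth 0..255 ramp collapses
   to 32 steps of about 8 each: that's the visible banding. */
static unsigned char quantize_565_red(unsigned char r)
{
    unsigned char r5 = r >> 3;        /* keep the top 5 bits: 0..31 */
    return (r5 << 3) | (r5 >> 2);     /* expand back to 0..255 */
}

int main(void)
{
    /* Neighboring shades that are distinct in 24-bit color... */
    for (unsigned char r = 100; r < 108; r++)
        printf("24-bit red %3u -> 16-bit red %3u\n", r, quantize_565_red(r));
    /* ...collapse into just two 16-bit values (99 and 107), which is
       why smooth gradients break into visible bands. */
    return 0;
}
```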

As for John Carmack, let's just say he's trying his hardest to make a game that is going to bomb. I mean, I'm sorry, but if Doom3 is really going to run at 30 fps at 800*600 on a GeForce3, then he's hammering the nail into his own coffin. Even on a Radeon 2/GeForce 4 that game will not run smoothly. Maybe his "godhood" has finally gone to his head? Either way, I wouldn't put much stock in what he says anymore.
 
I dunno. I host Delta Force: Land Warrior on a machine with a Voodoo3 in it, which is 16-bit color, and I play on another machine with a 32-bit Radeon, and there is a HUGE difference in the way things look as far as color goes. The 16-bit color just seems dull compared to 32-bit.
 
Hear, hear! Finally some people that agree with me on 32-bit vs 16-bit... I have to admit, though, if you're looking at a still image, there really isn't THAT much difference between 16 million colors and 65k colors, as most images don't utilize 16 million colors. And usually, when you're looking at a 16-bit image, it has been dithered to mask the loss of color depth.
But when it comes to games, I'm ALWAYS disappointed if a game doesn't support 32-bit...
 