
16 bit vs 32 bit


supergenius74
Member · Joined Mar 1, 2001 · Fort Wayne, IN
Sorry to keep bringing this up, guys, and I do appreciate all the responses I got last time. I'm still researching the subject and I still can't find any solid evidence. I recently got my eyes checked and, as usual, I have perfect eyesight: no color blindness, and I've never needed glasses. If someone can show me the same image rendered in 16-bit and then in 32-bit with a big difference between them, I will shut up, but so far nobody has.

I have taken screenshots from various new games in 32-bit mode, analyzed them in my image software, and counted the colors used. All but one game weighed in at well under 65,000 colors; the only exception was about 70,000 colors, and it still didn't look any different to me in 16-bit mode. I still haven't figured out why 32-bit color is referred to as the same thing as 24-bit color; I think it's because we can't see it anyway. In fact, I'm willing to bet we can't even see true 24-bit color. Come on, that's 16 million different colors; that has to be impossible. There's a big difference between 65 thousand and 16 million (it's 256 times as many colors), and then there is 32-bit, which mathematically should be 4 billion colors.

Impossible, I say. I think it's a bunch of bull, and if our computers are actually running in 32-bit mode we can't see it anyway, so you're smarter to run in 16-bit and jack up the FSAA so the image actually looks better. I dare anyone to prove me wrong! Please!
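For anyone who wants to repeat the screenshot test, here is a rough Python sketch of the color-counting step. It assumes the Pillow library is installed, and "screenshot.png" is just a placeholder filename.

```
# Count the distinct colors actually present in a screenshot (placeholder filename).
from PIL import Image

img = Image.open("screenshot.png").convert("RGB")
unique_colors = len(set(img.getdata()))

print(f"Unique colors in the frame: {unique_colors:,}")
print(f"16-bit ceiling: {2**16:,}   24-bit ceiling: {2**24:,}")
```

A single frame can only exceed 65,536 distinct colors if it was rendered and captured at 24 bits per pixel or more, so low counts on their own don't prove much.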
 
Some video cards "do" 24-bit, some "do" 32-bit. Frankly, you're asking a very valid question. I sometimes think I can see a difference between 16 bpp and 32 bpp. Maybe some games, on some monitors, with some video cards, show a difference. All I know is that 32 bpp demands a lot from the video cards owned by people without lots of bucks. Perhaps the answer is the Kyro II?
 
Good to see someone who agrees with me. As for anyone who disagrees, you must have super eyesight or something. Even if you can see a difference between 16 and 32, you're still probably only seeing something like 17-bit color, which would be about 131,000 colors, not 24-bit (16 million) or 32-bit (4 billion) colors. As for me, I will continue to run all my games in 16-bit as long as the games allow me to. In 16-bit at 1280 x 1024 with all settings maxed and FSAA on, my GeForce Ultra and 1 GHz P3 still pull off an average of 30+ FPS even in the most complex games and maps available, and damn, it looks good.

If you're still not convinced, go run 3DMark 2K at the same resolution in both the 16-bit and 32-bit modes of the demo and tell me where the major differences are. Then, once you can't tell the difference, explain to me why you can only run 3DMark2001 in 32-bit. Guys, I really think I'm on to something...
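The color counts being thrown around follow straight from powers of two; a quick Python check:

```
# n bits per pixel gives 2**n possible colors.
for bits in (8, 16, 17, 24, 32):
    print(f"{bits:>2}-bit: {2**bits:,} colors")
# 16-bit -> 65,536   17-bit -> 131,072   24-bit -> 16,777,216   32-bit -> 4,294,967,296
```

(As noted later in the thread, "32-bit" modes still draw from the 24-bit range; the last 8 bits aren't extra colors.)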
 
The difference between 16-bit and 24-bit color is noticeable.
The difference between 24 and 32 is not.

If you use Adobe Photoshop, you will often end up with images that use well over 65,536 colors (the 16-bit limit). So if you make an image that uses a lot of one color, the 16-bit palette is too small and the color transitions are not smooth.
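A quick way to see why those transitions band is to quantize a grey ramp the way a 16-bit (5-6-5) mode stores its channels. This is just a sketch and assumes NumPy:

```
# Quantize a smooth 256-level grey ramp to 5- and 6-bit channel precision
# and count how many distinct levels survive.
import numpy as np

ramp = np.arange(256, dtype=np.uint8)    # smooth 8-bit ramp: 256 levels

r5 = (ramp >> 3) << 3                    # red/blue keep 5 bits -> 32 levels
g6 = (ramp >> 2) << 2                    # green keeps 6 bits   -> 64 levels

print("levels in the original ramp:", len(np.unique(ramp)))   # 256
print("levels after 5-bit channels:", len(np.unique(r5)))     # 32
print("levels after 6-bit channel :", len(np.unique(g6)))     # 64
```

With only 32 or 64 steps available, a wide gradient has to repeat the same shade across many pixels, which is exactly where the visible stepping comes from.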
 
I play a lot of games in 32-bit color just for the 24-bit z-buffer, so I can actually see everything. I'm not sure how this affects non-NVIDIA cards, but it makes a difference on mine. As for the difference in image quality, I can't really tell all that much; it was just a marketing ploy by NVIDIA to sell their cards over 3dfx, which worked :)
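The z-buffer point is easy to put numbers on. Here is a rough sketch using the standard 1/z depth mapping; the near/far plane values are made up for illustration, and real games and drivers will differ:

```
# How far apart (in world units) two adjacent depth-buffer values are at a given
# distance, for 16-bit vs 24-bit z-buffers. near/far are made-up example values.
near, far = 1.0, 5000.0

def depth(z):       # window-space depth in [0, 1] for an eye-space distance z
    return (1/near - 1/z) / (1/near - 1/far)

def distance(d):    # inverse mapping: depth-buffer value back to eye-space distance
    return 1.0 / (1/near - d * (1/near - 1/far))

for bits in (16, 24):
    step = 1.0 / (2**bits - 1)          # smallest change the buffer can represent
    z = 1000.0                          # a point 1000 units away
    gap = distance(depth(z) + step) - z
    print(f"{bits}-bit z-buffer: adjacent depth values ~{gap:.3f} units apart at z = {z:.0f}")
```

With a 16-bit buffer the steps at that distance are several units wide, so distant surfaces can land on the same depth value and flicker through each other; at 24 bits the steps are 256 times smaller, which is why distant geometry actually shows up properly.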
 
One that comes to mind is Unreal Tournament. Bring up the game in 16-bit color and look at the menus, the GUI, and the game itself, then change the setting to 32-bit: there is a big difference. If you can't tell the difference between 16-bit and 32-bit color, can you tell the difference between 8-bit and 16-bit color?
 
Although I'm not qualified to settle this debate, I thought I would throw in my opinion. I believe the difference, and the ability to detect it, depends greatly on the quality of your screen and graphics card. The same argument carries over to many other topics: when playing Quake, can you tell the difference between 90 FPS and 110 FPS, or, in general use, between a graphics card with 32 MB of RAM and one with 64 MB? All that will be discovered is people's opinion of the hardware they own. I would assume, though, that the difference between 16- and 32-bit would be more noticeable in hard-copy form, such as a printed photograph, although that also depends heavily on the printer and paper quality. I just think that as PC owners we are lucky to have a large choice of hardware, unlike Mac users.
 
32-bit color is impossible to see with the naked eye: human beings can see a maximum of 24-bit color, so don't even bother arguing about anything beyond that. The real point of 32-bit rendering is image QUALITY, not more color scales. If you play Quake 3 Arena and go into a foggy area of a map, look right at the fog in 16-bit mode and then again in 32-bit mode: in 16-bit there are lines or "layers" of fog, whereas the 32-bit fog is blended much better. Hope that helps!
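The fog layering can be reproduced with a back-of-the-envelope calculation. This sketch (made-up fog and wall values) counts how many distinct shades a linear fog ramp ends up with when the framebuffer stores channels at 5 bits versus 8 bits:

```
# Fog a grey wall at 200 evenly spaced depths, snap each result to the framebuffer's
# channel precision, and count the distinct shades that survive.
def fog_shades(channel_bits, steps=200, fog=200.0, wall=40.0):
    levels = 2**channel_bits - 1
    shades = set()
    for i in range(steps):
        t = i / (steps - 1)                      # 0 = no fog, 1 = fully fogged
        value = t * fog + (1 - t) * wall         # linear fog blend
        shades.add(round(value / 255 * levels))  # framebuffer quantization
    return len(shades)

print("distinct fog shades, 8-bit channels:", fog_shades(8))   # a smooth ramp
print("distinct fog shades, 5-bit channels:", fog_shades(5))   # a handful of visible bands
```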
 
There is no really big difference between 16- and 32-bit color, though in some games and certain areas it does make a difference. Anyway, 24-bit and 32-bit are both 16.7 million colors; the only difference is that the extra 8 bits (alpha or padding) make each pixel a tidy 32-bit word, which is faster for the hardware to handle than raw 24-bit. Our monitors use three primary colors: red, green, and blue. That's only three colors, and each has 256 shades. So 256 shades of red x 256 shades of green x 256 shades of blue equals 16.7 million colors, and 24 bits, since each set of 256 shades takes 8 bits. That's the maximum that monitors can display.
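To put the "32-bit is really 24-bit" point in concrete terms, here is a small sketch of one common way a 32-bit pixel is packed (ARGB, 8 bits per field):

```
# A "32-bit" pixel still carries only 8 bits each of red, green and blue;
# the fourth byte is alpha (or simply padding) so each pixel fills a 32-bit word.
r, g, b, a = 255, 128, 64, 255                 # example channel values

pixel = (a << 24) | (r << 16) | (g << 8) | b   # one common ARGB8888 packing
print(f"packed pixel: 0x{pixel:08X}")
print(f"distinct RGB colors either way: {256**3:,}")   # 16,777,216
```

Keeping every pixel word-aligned is the main reason "32-bit" handling ends up faster than a raw 24-bit layout.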
 
It's beginning to make sense now. 24 is the same as 32, and 32 is basically a marketing ploy, OK. So what it really comes down to is that a monitor works with a maximum of 256 shades per color, which in turn equals 16 million colors, or 24-bit. That would make 16-bit (65 thousand) about 40 shades per color: 40 x 40 x 40 = 64,000. That means the difference between the two is about 216 shades per color. I just don't see that much more detail, man. At 40 shades things look pretty good; change it to 32 (24) bit and I can't see that many more shades.

Now this is what I think really happened. I think some industry giant screwed up years ago, back when graphics were still standard VGA monitors and 8-bit graphics: they figured 16-bit would be the most they would ever need, so that became the industry standard. Then they did their homework and found that some people, if not most, could actually see more than 16-bit's 40 shades, maybe 50 or 75, but that didn't fit the evolution of things (2-bit, 4-bit, 8-bit, 16-bit, 32-bit, etc.), so they came up with 24-bit and 32-bit as just another big *** ploy, or a solution for better graphics. Hehe, sorry, but I think everything is a conspiracy! Thanks for all the responses.
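For what it's worth, the "about 40 shades" figure is just the cube root of 65,536; the common 16-bit layout actually splits the bits unevenly, 5-6-5 (32 shades of red, 64 of green, 32 of blue):

```
# 16-bit color in the usual R5 G6 B5 layout vs 24-bit at 8 bits per channel.
red, green, blue = 2**5, 2**6, 2**5
print(f"R5 G6 B5: {red} x {green} x {blue} = {red * green * blue:,} colors")   # 65,536
print(f"R8 G8 B8: 256 x 256 x 256 = {256**3:,} colors")                        # 16,777,216
```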
 