
What is your opinion of 64-bit color?


supergenius74

Member
Joined
Mar 1, 2001
Location
Fort Wayne, IN.
Just read an article over at theinquirer and they were talking about 64-bit color and how important it would be in order to achieve Pixar-type computer rendering in realtime. They were kind of vague on real facts and rambled on about how John Carmack wants this, and they speculate that Nvidia will go to this soon. Now someone help me understand this, but I didn't think you would ever need anything higher than 32-bit color. I mean, at the least, our monitors can only show somewhere between 16- and 24-bit pixels. Wouldn't we need new monitors then? If 8-bit = 2^8 = 256 colors, 16-bit = 2^16 = 65,536 colors, and 24-bit = 2^24 = 16,777,216 colors, then 32-bit = 2^32 = 4,294,967,296 colors, and finally 64-bit would equal 2^64 = 18,446,744,073,709,551,616 colors!!!! I'm not even sure how you say that number, but I seriously doubt that we can see that many different colors, nor can our monitors create them. If anyone can tell me exactly how many shades of a color the human eye can actually see, please tell me so I can write this off as nonsense. But if it's true then I will start saving my moola for a new monitor or vid card. :eek:
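
For what it's worth, the exact counts work out like this (quick Python sketch; the commonly quoted figures are rounded, the printed values are exact):

```python
# Distinct values at each common "color depth": 2 raised to the
# number of bits per pixel.
for bits in (8, 16, 24, 32, 64):
    print(f"{bits:2d}-bit: {2**bits:,} values")
# 32-bit: 4,294,967,296 values
# 64-bit: 18,446,744,073,709,551,616 values
```

(2^64 is read "about 18.4 quintillion", roughly 4 billion times the 32-bit count.)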
 
It's only something Nvidia would get people hyped up over. Then once it becomes standard, people are going to say they can't tell the difference between 32-bit colour and 64-bit colour...
 
welllllll.......

i'm not sure if this is along the same lines or not, but...

when dealing with 3d renderings or just plain pictures, you have options to render at 24 bit (which is 8bit red, 8bit green, 8bit blue) or 32 bit (which just adds an 8bit alpha channel, or transparency channel for easier understanding).

all colors in the 24bit color scheme are based off a 0-255 value for each of the 3 primary colors, red, green and blue.
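
as a concrete sketch of that layout, here's one way a 32-bit pixel packs its four 8-bit channels (ARGB byte order is just one common convention, and the helper name here is made up for illustration):

```python
# One 32-bit pixel = four 8-bit channels: alpha, red, green, blue.
# Each channel holds an integer 0-255, exactly as described above.
def pack_argb(a, r, g, b):
    assert all(0 <= c <= 255 for c in (a, r, g, b))
    return (a << 24) | (r << 16) | (g << 8) | b

pixel = pack_argb(255, 200, 100, 50)  # fully opaque, orange-ish
print(hex(pixel))  # 0xffc86432
```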

now, i personally don't think going beyond 24/32-bit will make a visible difference, but i could be wrong, as i've never seen anything rendered higher.

i would, however, like to read the article to see what exactly they plan on doing with the bit depth, and how exactly they think it would help or not.
 
the article is at the inquirer

the article of which i speak is at theinquirer.net. I know we are not supposed to post links so I won't, but it's on the main page about 1/2 way down on the right, called "64-bit?" or something like that. :rolleyes: They say that's why we will need more memory for cards, like 128 meg cards. I'm sure the added memory will help with running larger resolutions like 1600 x 1200 or larger textures, but I seriously doubt we will need 64-bit color. Wouldn't the OS have to support it also? I have not heard of any OS claiming to support that, nor a monitor. So if 32-bit really isn't 32-bit, then 64-bit is even more crazy, hmmmm. :beer:
 
You're all right, 64 bit color won't make much of a difference... now (except to slow all your games down more).

The reasoning behind it is simple, though: technology is going to keep evolving, so why be content with where we're at until everything else catches up? I think color is an issue for 3D more than anything, as it allows for much smoother transitions in shading. Monitors will eventually catch up.

Another interesting side thought... what happens when someone invents a better eye? People have already begun to experience better than 20/20 vision because of laser surgery. I realize this isn't color related, but there are infinitely more things left that could be accomplished by mankind, so we might as well start planning for the future. That's the idea behind 64 bit.
 
They are talking about something different than 64-bit color on your monitor.
They are talking about how software can be written with 64-bit color instructions instead of 32-bit to allow more information per shader pass.
The article is poorly written in my opinion and misleading.
 
Spartacus51 said:

The reasoning behind it is simple, though: technology is going to keep evolving, so why be content with where we're at until everything else catches up? I think color is an issue for 3D more than anything, as it allows for much smoother transitions in shading. Monitors will eventually catch up.

Another interesting side thought... what happens when someone invents a better eye? People have already begun to experience better than 20/20 vision because of laser surgery. I realize this isn't color related, but there are infinitely more things left that could be accomplished by mankind, so we might as well start planning for the future. That's the idea behind 64 bit.

There is a limit to human perception. What you're suggesting is that we increase the colors to ones we can no longer see. Just as our ears can hear only so many sounds... well, frequencies that make up sound. Going beyond that is pointless.
 
i think that 64-bit colour is just crap because we can't see that much anyway, and it won't make much of a difference from 32-bit

it's just that nvidia want more cash, so they design all this other stuff just for cash
 
i agree with most people here that it won't make a difference... but i'm sure that's what people first said about 32-bit color too...
 
I'd like to add that any analog monitor can display an infinite number of colors. It is the job of the video card to convert the video signal from digital to analog. The video data is divided into 3 colors: red, green, and blue. Each color is given a brightness level (in 24- or 32-bit color the level is represented as an integer from 0-255). The video card takes this digital information and converts it to analog, which the monitor can use. In analog you aren't limited to integers as you are in digital; you can display any color (red, green, or blue) at any brightness level you want. You old-timers should think about it: was your monitor rendered obsolete when 24- or 32-bit color came around? No.
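
As a rough sketch of that conversion step (the 0.7 V full-scale swing is the nominal VGA video level; the function name is made up for illustration):

```python
# What a video card's RAMDAC does, conceptually: turn an 8-bit
# digital level into an analog voltage. VGA's RGB lines nominally
# swing from 0 to 0.7 V, and the analog side is continuous, so the
# monitor itself imposes no fixed palette.
def dac(level, full_scale=0.7):
    assert 0 <= level <= 255
    return level / 255 * full_scale

print(f"{dac(128):.4f} V")  # a mid-gray level
```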

Now, whether this will make any difference in image quality is still up for discussion. I, for one, think that the frame-rate hit of actually having to handle millions of times the number of colors will be too much for the first couple of years after the technology is introduced, and even then it's iffy whether or not you will be able to notice the difference.
 
Can you see 64-bit colour over 32-bit colour = YES!

Whoever says you can't see the difference between 32-bit colour and 64-bit colour is wrong and misleading.

Yes it's true, 32-bit colour has already removed most of the colour banding in realtime rendering, but 32-bit colour still doesn't have enough colours to display the proper image colour. To the human mind there is no limit on how many colours there really are, and how we see colour differs depending on what we are looking at. Human cloning and genetic-alteration research needs 64-bit colour to accurately distinguish two different properties visually at a small .01 micron (1000X) zoom.

64-bit colour will also improve the quality of trilinear filtering by improving blending between textures.

Carmack is da man. Don't doubt one with a talent and IQ twice of mine and yours. His passion for graphics and design far surpasses mine and yours.

The NV30 will be the first with 64-bit colour in both 2D/3D. The current name for this card is the Eclipse and it won't make the market until sometime in 2003. The current prototype has a 400MHz (8-pipe) GPU but that could change. :D


AXIA
 
It's really more about blending and less about being able to see the difference between 64 and 32. If you were to present a standard 2D picture, one in 32bpp and one in 64bpp, I would bet that very few people could pick out the difference. Where 64-bit color does come into play is in layering. When you paste a few textures on top of each other (assuming you're using some transparencies), the precision of the final color is degraded with each layer. 64-bit rendering gives a lot more headroom for color combining, and will yield noticeable visual improvement in increasingly complex scenes.
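
That precision loss is easy to demonstrate with a toy one-channel sketch (the values are made up; the point is only that rounding every pass drifts, while higher precision doesn't):

```python
# Toy demo of the headroom argument: composite a 25%-opacity layer
# (value 180) over a background (value 200) twenty times. One path
# quantizes to an integer after every pass, the way an 8-bit-per-
# channel buffer must; the other keeps full precision, the way a
# wider-per-channel buffer could.
def blend(dst, src, alpha):
    """Standard 'over' blend of a single channel: src on top of dst."""
    return src * alpha + dst * (1.0 - alpha)

rounded = 200.0  # quantized after every pass
precise = 200.0  # kept at full precision throughout
for _ in range(20):
    rounded = round(blend(rounded, 180.0, 0.25))
    precise = blend(precise, 180.0, 0.25)

print(rounded, round(precise))  # 182 180: the quantized path drifts
```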
 
I can't believe how many people go on about adding extra colors. It's never been about adding more colors; 32-bit was about adding an alpha channel, not more colors. The new technology these days is 3D and monitors that render 3D. The problem is that there is no standard system for sending pixel information for a 3D image. Instead we send two 32-bit images that the monitor interprets and overlays when displaying in 3D mode. This is usually done at a 120Hz refresh rate, 60Hz per eye alternating, and then your 3D glasses synchronize the right-eye/left-eye signals with what each eye can see.

For software purposes, a 64-bit image would allow a single graphics instruction to carry a full 32-bit right eye and a full 32-bit left eye. This takes more memory and more processing time, but compression algorithms like JPEG would be able to store the full 3D image in a single compressed file. Can you find a 3D jpeg image or avi/mpg/mov movie in a single file? You can't, because they don't exist yet. Computers and Blu-ray players that support 3D are still rendering two files on separate streams. Standards have to be made for these things, and a push for 64-bit color is the necessary step to allow global 3D rendering in standards-compliant file formats.

Also, 64-bit images for the purpose of 3D will need a 3D-capable display and a video card that can support it. Old analog monitors will probably have trouble with 120Hz.
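
Purely as an illustration of that idea (no real file format works this way, and the helper names are invented), packing one 32-bit pixel per eye into a single 64-bit value would look like:

```python
# Hypothetical stereo packing: left-eye pixel in the high 32 bits,
# right-eye pixel in the low 32 bits of one 64-bit value.
def pack_stereo(left, right):
    assert 0 <= left < 2**32 and 0 <= right < 2**32
    return (left << 32) | right

def unpack_stereo(packed):
    return (packed >> 32) & 0xFFFFFFFF, packed & 0xFFFFFFFF

packed = pack_stereo(0xFFC86432, 0xFF6432C8)
print(unpack_stereo(packed) == (0xFFC86432, 0xFF6432C8))  # True
```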
 
Son of a.....I actually read this entire thread before Logan clued me in to how old it is....

Serves me right for waking up so bloody early..
 
So if there is no notable difference between 24- and 32-bit, much less between 32- and 64-bit, then we should be focusing on getting consumer-grade holograms next.


To further bring this thread up to date, I suppose 64-bit combined with 4K would make a remarkable difference!
 
World of Warcraft used to be 16-bit or 24-bit color, then it got upgraded to 32-bit, and I see ZERO difference between 24-bit and 32-bit :rain:
 