webertech Posted August 13, 2009

Just for you people out there with screen lag issues: not everyone knows this, but in a video game the human eye can't really see the difference in color between 32-bit and 16-bit (on something like a computer desktop you can). Just giving you a little tip.
CaldasGSM Posted August 14, 2009

the rule is pretty simple..
1 Byte = 8 bits = you can define 256 different colors
2 Bytes = 16 bits = 65,536 colors
3 Bytes = 24 bits = 16,777,216 colors
4 Bytes = 32 bits = 4,294,967,296 combinations, but in practice that is 24 bits of color plus an 8-bit alpha (transparency) channel, so still 16,777,216 visible colors

since it is believed the human eye only distinguishes up to about 10,000,000 (10 million) colors, anything above 24 bits does not produce an "explicit" improvement in quality... that is why a few years ago the 24-bit color depth of video cards was called "true color" (doesn't get any more real for you..)... so as an optimization you can cap the colors to 24 bits and you won't notice the difference.. and I bet that if you cap it to 16 bits you will still find it pretty playable..

but one thing has to be noted: using 16-bit color instead of 32-bit (half) does not mean the game will run twice as fast or take only half the resources. It allows for some graphical optimization, but it does not do magic for the rendering engine.
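The bytes-to-colors table above can be checked with a few lines of Python; this is just the arithmetic from the post, including the caveat that "32-bit color" displays the same palette as 24-bit:

```python
# Number of distinct values representable at each color depth:
# values = 2 ** bits. Note that in practice "32-bit color" is
# 24 bits of RGB plus 8 bits of alpha, so it displays the same
# 16,777,216 colors as 24-bit.
def distinct_values(bits: int) -> int:
    return 2 ** bits

for bits in (8, 16, 24, 32):
    print(f"{bits:2d}-bit: {distinct_values(bits):,}")
```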
webertech Posted August 14, 2009 (Author)

[quotes CaldasGSM's color-depth breakdown above]

i find it changes some lag for those people that are getting, say, 25.56 fps, and gives them that extra 2 to 4 fps that they need... and!!! FORGET YOU FOR MAKING ME READ ALL THAT MATH!!!! DO YOU HAVE ANY IDEA THE BAD MEMORIES THAT BRINGS UP!?!? IT'S LIKE MATH CLASS ALL OVER AGAIN LOL... good job though, you should repost this lol and put it in your comment
Number1Dad Posted September 14, 2009

There are no 64 bit displays to my knowledge, only operating systems and software.
Crimson Posted September 20, 2009

Thought I'd add this:

What are the benefits of 64-bit color? The eye can't see a difference when color depth is higher than 24 bits, so why the need for 64-bit color?

Instead of looking at the total bits and total color (or colour ;-]) possibilities, it's better to look at the bits per channel to know how smooth shade transitions can be. With 24-bit or 32-bit color, the RGB channels have 8 bits each. That allows 256 shades. For example, going from solid black to solid white would include 256 shades (black=0, white=255, and 254 in between). 24-bit = RGB; 32-bit = RGBA (alpha included). I know that I can see the difference in shades here, but maybe not everyone can. It's not very noticeable though. There are not 16,777,216 shades of one color; that's the total number of all shades of all colors, in case anyone is not aware.

Now when lots of color combining occurs in real-time graphics at 32-bit precision (8 bits per channel), some precision can be lost with each pass of rendering. As graphics get more demanding, more passes will occur too. Eventually, games will use stuff like 64 texture layers per poly (maybe 4 passes and 16 textures per pass), plus FSAA, motion blur, fog, and more color-blending effects. With 8 bits per channel, even if 2 bits are lost, that's already down to only 64 shades per color channel (only slightly better than 16-bit color). If 4 bits are lost, we're down to 16 shades (below 16-bit quality).

You can look at 64-bit color as color anti-aliasing. Everything can be processed at 64 bits internally, and the monitor could still display it at 24-bit, just like how supersampled FSAA renders internally at a resolution larger than the screen res, and motion blur renders internally with more frames than your monitor/eyes can process. 64-bit color can reduce color banding and can improve sub-pixel accuracy, full-scene anti-aliasing, and color/gamma correction.
Plus, 64-bit color will use 16-bit floating-point channels, and that's friendlier for the graphics pipeline. Current 32-bit color uses 8-bit integer values per channel. http://hankfiles.pcvsconsole.com/answer.php?file=430
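The two claims in the post above can be sketched in a few lines of Python. This is a toy model, not a real GPU pipeline: part 1 shows how losing channel bits shrinks the number of distinct shades (matching the 256/64/16 figures above), and part 2 shows how 8-bit integer channels truncate faint multi-layer contributions that float channels keep:

```python
# Toy sketch of precision loss in 8-bit-per-channel rendering.

def quantize(value: int, bits: int) -> int:
    """Keep only the top `bits` bits of an 8-bit channel value."""
    shift = 8 - bits
    return (value >> shift) << shift

# 1) Shades surviving in a full black-to-white ramp after bit loss:
for bits in (8, 6, 4):
    shades = len({quantize(v, bits) for v in range(256)})
    print(f"{bits} bits/channel -> {shades} distinct shades")
# 8 -> 256, 6 -> 64, 4 -> 16, matching the figures in the post.

# 2) 64 faint texture layers, each contributing 0.4/255 of a channel.
#    Integer channels truncate each contribution to 0; floats keep it.
LAYERS, CONTRIB = 64, 0.4
int_total = sum(int(CONTRIB) for _ in range(LAYERS))  # every layer rounds away
float_total = sum(CONTRIB for _ in range(LAYERS))     # detail preserved
print(int_total, float_total)
```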
Administrators JoeDirt Posted September 20, 2009

This thread is full of inaccuracy; too lazy to look for a facepalm pic to spam it. There are no 64-bit displays out there. The 64-bit is supposed to be referring to the memory data bus and how much info it can transmit at once, but puny little nub brains can't comprehend that.
Tonka Posted September 20, 2009 (edited)

[quotes JoeDirt's post above]

my apologies for having a puny little nub brain

Edited September 20, 2009 by Tonka
Aphrodite01 Posted January 20, 2010

[quotes CaldasGSM's color-depth breakdown above]

You gave me a headache. Now I'm stressed again... I need more coffee..
God Posted January 20, 2010

[quotes JoeDirt's post above]

64-bit is not only referring to the memory data bus at all... The main differences are that 64-bit refers to extended instructions found on the CPU, and the ability to process larger amounts of data. I guess your, and I quote, "puny little nub brain can't comprehend that", though. And yes, right now, GPUs aren't powerful enough to process/display 64-bit color, but soon enough they will be. Although every single colour cannot be seen by the eye, the effect of added colour will still be noticed. As I said, 64-bit just allows the processing of more data; when there are 64-bit GPUs and games capable of outputting even more colours, this will be available.
PHANTASM Posted January 21, 2010

lol this won't end well.
Blackguard Posted January 21, 2010

I don't think so either.
Administrators JoeDirt Posted February 2, 2010

[quotes God's reply above]

It's about 64-bit displays. We were talking about what's available today, not in the future. I was referring to the article from where this convo started. Don't take my shit out of context. NEXT!!!
-=Medic=- Posted February 4, 2010

I just installed Win 7 64-bit last night and it's really amazing: all drivers are available, never saw that with Vista and XP, and a lot of my applications are available in 64-bit versions, which is really cool. And finally my 4GB of RAM is available to use ^^

Just check out this video for people who don't know much about Windows 7 32 or 64 bit, with some tricks and tips for users; the video is 45 minutes and you will learn some nice information about Win7. The whole video is nice, only the first few excited "win7 is here wow" minutes are a little bit too much. lolz http://www.youtube.com/watch?v=3jRD7PWj9So
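The "finally my 4GB of RAM is available" remark comes down to address-space arithmetic: a 32-bit pointer can only name 2^32 distinct byte addresses. A quick check of the numbers:

```python
# Why a 32-bit OS tops out around 4 GB of RAM: with 32-bit
# addresses there are only 2**32 distinct byte addresses,
# while 64-bit addressing allows 2**64.
addressable_32 = 2 ** 32
addressable_64 = 2 ** 64

print(addressable_32 // 2 ** 30, "GiB")  # 4 GiB
print(addressable_64 // 2 ** 40, "TiB")  # 16,777,216 TiB
```

(In practice a 32-bit Windows install sees even less than 4 GB, since devices claim part of that address range.)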
Nugget Posted September 12, 2010

128-bit is just around the corner, which is going to force a lot of people to upgrade to 64-bit hardware as they will be phasing out 32-bit.
*Evil*Stevedawg Posted September 12, 2010

32-bit is dead/dying; pretty sure Microsoft announced their next OS will not have a 32-bit version. And while 128 is the next step past 64, it will be a long way down the road before it's needed or standard. Computer programs are just starting to utilize multiple threads, and the reality is most everyday programs still only need and use one core.