
Rant about high-end gaming today.

You realize that a 6GB dual-GPU card is effectively a 3GB card, correct?

I know so many people who add the GPU VRAM together like that, lol. Especially with cards like the 690; people buy one and think they actually have 4GB of VRAM.
 
Dual-GPU cards are just nice if you're looking to drop money and don't have much room for two separate cards.
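A quick sketch of the math, since this trips people up (Python, purely for illustration): in SLI/CrossFire each GPU keeps its own full copy of the textures and buffers, so the advertised total gets divided by the GPU count.

Code:
def effective_vram(total_advertised_gb, num_gpus):
    # Each GPU mirrors the same data, so usable VRAM is the per-GPU
    # amount, not the total printed on the box.
    return total_advertised_gb / num_gpus

for name, total_gb, gpus in [("GTX 690", 4, 2), ("6GB dual-GPU card", 6, 2)]:
    print(name, "->", effective_vram(total_gb, gpus), "GB usable")

# GTX 690 -> 2.0 GB usable
# 6GB dual-GPU card -> 3.0 GB usable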
 
I really have no clue what this is all about... LOL!

"Don't all electronic devices have to go through this testing anyway?"
Yes, for multiple reasons.
"Isn't it contained inside the IHS and the case in the first place?"
No, it can emit beyond them.
"I don't know much about it, but that doesn't sound remotely right... otherwise I would be glowing when I left work every night (I work in a data center with hundreds of physical servers, most with dual/quad CPUs with 4-12 cores each...)."
I didn't say anything about glowing. That's your idea.

In your data center, it would not surprise me if you should be wearing hearing protection, with noise over 60 dB. That would be your first health risk.

The biggest source of radiation from PCs and phones (a phone being a mobile PC) is actually transmitting and receiving radio-frequency energy. And I quote:

"This device is designed to be within the emission limits for exposure: 1.6 W/kg averaged over 1 gram of body tissue."

My main point was, and still is, that we CAN produce processors and other technology that could be harmful. To add to that statement: all our equipment is actually not very powerful relative to what we could build, because more powerful equipment could be potentially harmful to us. CPU, GPU, or even Wi-Fi NICs.
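For anyone wondering what that 1.6 W/kg figure actually means: SAR (specific absorption rate) is just absorbed RF power divided by the tissue mass it's averaged over. A rough sketch below; the example numbers are made up for illustration, not measurements.

Code:
FCC_SAR_LIMIT_W_PER_KG = 1.6  # averaged over 1 gram of tissue

def sar(absorbed_power_w, tissue_mass_kg):
    # SAR = RF power absorbed / mass of the tissue sample
    return absorbed_power_w / tissue_mass_kg

# e.g. 1.2 mW absorbed in a 1-gram (0.001 kg) sample -> 1.2 W/kg
example = sar(absorbed_power_w=1.2e-3, tissue_mass_kg=1e-3)
verdict = "within" if example <= FCC_SAR_LIMIT_W_PER_KG else "over"
print(f"SAR = {example:.2f} W/kg ({verdict} the {FCC_SAR_LIMIT_W_PER_KG} W/kg limit)")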
 
I just don't feel that the radiated emissions from my PCs are much of a danger to me. Crikey, I drive in this town; these people scare me half to death.
 
Noise != radiation.

And you quote... what? Links? A source?

Sorry for the sidebar... just curious. If you want, shrimp, PM me the results. :thup:
 

No, you're correct. No need for a PM. PC equipment is regulated so it doesn't create health risks. I was simply saying we have the resources to make equipment powerful enough TO be a health risk. Sorry if that's not clear; I can't explain it better than that.

Also, I can't link the FCC rules; I read them on my phone under the Regulatory and Safety tab.
 
We need more power. I want Blu-ray movie-quality gaming; I don't care how many watts it takes. :cool: :popcorn:
Crysis 3
[screenshot: crysis 3.jpg]
Man of Steel
[screenshot: man of steal.JPG]
 
1080p, 24fps, and extreme motion blur are everything I don't want from gaming :beer:

Honestly, games that really push the envelope graphically, like Tomb Raider or Crysis 3, are pretty solid already. We just need an improved world to go with it, i.e. better physics, destructible environments, things like that. A lot of people claimed Crysis was poorly optimized when it launched, but to this day it's hard to find a game that lets you use a machine gun to chop down a tree, cut a chunk off the trunk, pick it up, and throw it through a passing car's window to kill the driver and cause an accident. We need more of that, IMO.
 
AI that turns on beast mode and crushes your soul; that's the AI I want. I seem to recall older games being far more difficult than modern titles.
 

+1. I can't believe that today, games are still being released where shooting through a glass window and using a car to knock down a streetlight are the limits of in-game physics, and the AI thinks diving next to the giant oil tank is a bright idea. These days you don't need a top-of-the-line $1,000 CPU to have beastly back-end processing power; I want in-game physics and AI to take advantage of it.
 
I think down the line, your GPU will come in a stacker-type case with its own power supply and cooling system. You would just plug it into your PC via HDMI and a DisplayPort cable.

When you're not gaming or doing anything 3D-intensive, it would go into a power-saving mode and use very little power.

It would sit next to your PC case and it would be very powerful. As far as connecting it to your PC, I was just thinking about what we use now, but down the line it could be something as simple as a single PCI-E x512 cable that would connect right into the back of your tower... Who knows, lol.

Or things will keep getting smaller and cooling systems will keep getting better and using less power. But I have a feeling we are already at an all-time peak of GPU power needed. If games and 3D apps were developed to fully use the CPU cores and GPUs we have in our systems now, then adding more power would not be the issue. Hardware has evolved much faster than software/games have, because, as I said, we have all the power we need in our current systems; the software is just not optimized to use the CPU/GPU power the system provides it. In 90% of any game I have played, anyway.
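To put that last point in code form: a toy Python sketch (not how any real game engine is written) of the same workload run on one core versus spread across all of them. The cores are there either way; the software has to be written to use them.

Code:
import multiprocessing as mp

def simulate_chunk(entities):
    # Stand-in for per-entity physics/AI work.
    return [e * e for e in entities]

if __name__ == "__main__":
    world = list(range(1_000_000))
    cores = mp.cpu_count()

    # Single-core path: one core does everything, the rest sit idle.
    single = simulate_chunk(world)

    # Multi-core path: same total work, split across every core.
    chunks = [world[i::cores] for i in range(cores)]
    with mp.Pool(cores) as pool:
        multi = pool.map(simulate_chunk, chunks)

    print(f"{cores} cores available; both paths simulated {len(world)} entities")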
 
Here's where I display my ignorance of the hardware design, and hopefully cure some of it. Why do they use the memory from both cards if half of the total is pointless? I assume there is a reason they use both sets of memory with two GPUs, but couldn't that all be changed on a non-reference card? Are the memory chips attached to each processor necessary for that GPU to function? Or are there stipulations from AMD/NVIDIA about the architecture that the card makers must follow? And am I even asking the right questions?
 
I could be off-base, but if I remember correctly, SLI/Crossfire works by splitting the workload: one video card renders the top half of the screen while the second renders the bottom half, so the work is essentially split in half. As for the VRAM, it's mirrored.

Someone with more insight can verify that for me.
 

Depending on which cards (NVIDIA, AMD, old, new, etc.), it can be either split-frame rendering or alternate-frame rendering.
Most new setups do alternate frame, IIRC.
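If a toy sketch helps (purely illustrative, not driver code): in SFR every GPU renders a slice of the same frame; in AFR the GPUs take turns rendering whole frames. Either way each GPU needs the full scene in its own VRAM, which is why the memory is mirrored rather than added together.

Code:
def sfr_assignments(num_gpus, frame):
    # Split-frame rendering: all GPUs work on the same frame,
    # each on its own slice of the screen.
    return [(gpu, frame, f"slice {gpu + 1}/{num_gpus}") for gpu in range(num_gpus)]

def afr_assignment(num_gpus, frame):
    # Alternate-frame rendering: frames round-robin across GPUs.
    return (frame % num_gpus, frame, "whole frame")

for frame in range(4):
    print("SFR:", sfr_assignments(2, frame), "| AFR:", afr_assignment(2, frame))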
 

But does any game benefit from X-Fire?

Just wondering for the day I actually stop mining and X-Fire my 7850 with my 270X. :)
 