
5870 real pics :D and specs :D


I like my 295 better than my 4890; trust me, it gets a lot better :D

The 5870 is going to be a really sweet deal; 400 bucks is not a lot to ask for what you are getting!

If you're on a 3x generation card or earlier I would upgrade to it for sure, but if you're on the 4x generation? Need to see some reviews. :)
 
This card is not going to be exciting for most people, since the average person does not have a high resolution monitor.

I am disappointed about one thing, though: this "free AA" bull**** that nVidia pulled. No information on it yet, but it seems not to be affected by enabling AA. Last time I saw that, it was a LOD hack of AA called CSAA that no one would notice, because 90% of the people with these high-end cards were running cheap LCD monitors.

So far, what the reviews have shown me is that this will drive down the cost of the 4890. The performance difference without AA enabled is relatively small. Since I do not use AA because I am not a low resolution gamer, the card for me is the 4890.

I guess lots of people are running low res monitors still; I just do not understand why people want AA enabled. It was developed because you can't have detail at 1024x768 resolution and get a "photorealistic" image. I don't want realistic, I want crisp images with clear delineations between a tree and the person hiding in it.
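For anyone wondering what AA is actually doing to those edges, here is a minimal, purely illustrative sketch of supersampling-style antialiasing on a hypothetical 1-D edge. The scene function, edge position, and sample counts are invented for the example, and the real MSAA/CSAA modes on these cards differ in the details; the point is just that extra sub-samples per pixel turn a hard step into an in-between shade.

```python
# Illustrative sketch only: supersampling-style AA on a made-up 1-D edge.
def sample_scene(x):
    """Hypothetical scene: everything at or beyond x = 4.3 is white (1.0)."""
    return 1.0 if x >= 4.3 else 0.0

def render(width, samples_per_pixel):
    pixels = []
    for px in range(width):
        # Take evenly spaced sub-samples inside the pixel and average them.
        total = sum(sample_scene(px + (s + 0.5) / samples_per_pixel)
                    for s in range(samples_per_pixel))
        pixels.append(total / samples_per_pixel)
    return pixels

print(render(8, 1))  # 1 sample/pixel (no AA): hard 0 -> 1 step at the edge
print(render(8, 4))  # 4 samples/pixel: the edge pixel gets an in-between 0.75
```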

If I wanted a smudgy, gamma-flooded image that looked "real" I would get an nVidia card... Jeez, my eyes are not bad; why would I want to see what everyone thinks they are looking at...
 
I wouldn't care too much, but I recently discovered the glory of 2048x1536 with AA. It's pretty cool when you can look at the tiniest things in the distance in game and there are no pixels in sight. It's like a painting. I want to go through the Fallout 3 expansions with high-res texture mods like that :)

Meh, sounds like you'll love Intel's Larrabee if it's all they promise. Fully raytraced graphics should make all this crap obsolete. What resolution do you game at?

Intel's answer: who cares, it all looks the same when it's raytraced.

:screwy::screwy::screwy::screwy::screwy::screwy:
 
Yeah, it's kind of pointless at low res, but when you have high res and maybe 4x AA it looks incredible. Not smudgy. None of those little "ants" crawling around on the edges of models. I used to be fine with 1024x768, but I decided to try it out just for the hell of it and it's really nice. My card just can't hack it cranked all the way. I don't play multiplayer like that though, no way lol. Cranked graphics is for immersion in single player like FO3 or Oblivion.
 
I play at 1680x1050 and I can tell when AA is disabled or enabled. I guess I am not a high resolution gamer. Jaggies irritate me. Maybe it's time I move to a 23" on my next upgrade, but then I will have an unused 22" that I won't be able to sell since it's 5ms and has scratches on it.
 
Not to mention the power consumption: you can run CrossFire 5870s for less power than one 4870, and yet you get 2-3x the performance.

Huh? I see the following:

Load Power Consumption:
5870 CF = 664 W
5870 = 401 W
4870 = 339 W

:confused:
 
Well, that's strange; we are getting conflicting power usage information. Last I saw, it beat a 4870 at both load and idle.
 
Still not enough info to determine for sure. So far we have one legit reviewer...

And that got me thinking when I read that: they say it should do 27 watts, but they say at idle it's like 121 or something like that. Was that a whole-system idle reading?
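A whole-system reading at the wall would easily explain the gap. Here is a minimal sketch of the arithmetic, assuming a hypothetical ~75 W for the rest of the system at idle and ~85% PSU efficiency (both made-up figures; only the 27 W card spec and the ~121 W reading come from the thread):

```python
# Rough check: can a 27 W card show up as ~121 W at the wall?
CARD_IDLE_W = 27.0        # claimed 5870 idle draw (from the thread)
REST_OF_SYSTEM_W = 75.0   # CPU, board, drives, fans at idle -- assumption
PSU_EFFICIENCY = 0.85     # typical-ish PSU efficiency -- assumption

dc_draw = CARD_IDLE_W + REST_OF_SYSTEM_W   # total draw on the DC side
wall_draw = dc_draw / PSU_EFFICIENCY       # what a wall meter would read

print(f"Estimated wall reading: {wall_draw:.0f} W")  # ~120 W with these inputs
```

If the number was taken that way, it would be a whole-system figure rather than the card alone.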
 
So far, what the reviews have shown me is that this will drive down the cost of the 4890. The performance difference without AA enabled is relatively small. Since I do not use AA because I am not a low resolution gamer, the card for me is the 4890.

I guess lots of people are running low res monitors still; I just do not understand why people want AA enabled. It was developed because you can't have detail at 1024x768 resolution and get a "photorealistic" image. I don't want realistic, I want crisp images with clear delineations between a tree and the person hiding in it.

If I wanted a smudgy, gamma-flooded image that looked "real" I would get an nVidia card... Jeez, my eyes are not bad; why would I want to see what everyone thinks they are looking at...

Low resolution gamer?? Many of us run 1920x1200 or higher and still see the benefits from AA. When there aren't jaggies 'crawling' along the side of an object, it's much easier to maintain suspension of disbelief. I can see how it might not bother someone with an undiagnosed vision problem or something, but you'd really have to be half blind not to see the improvement made by antialiasing.
 
I have to agree with ratbuddy. Neuro, c'mon man, get a smaller flag lol :D

I play at 1920x1080, and with my 295 I can tell the difference between 32xQAA, 16xQAA, and 8xQAA. I understand you don't like nVidia that much, but the IQ of ATi is not as godly as you make it out to be lol :)

I like the colours of ATi, but I still prefer the realism of NV ;)

Hopefully the new ATi will sway me further :)
 
Idle power. You're idle more than you're at full load, aren't you? :screwy:

Yes, but if you do the math, the huge difference in load power consumption puts the breakeven point at just under 95% idle, i.e. if more than 5% of uptime is spent gaming, the 4870 is more power efficient. Do you game less than 5% of the time on your gaming rig? :confused:
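For anyone who wants to redo that math, the breakeven falls out of a simple weighted average. A minimal sketch, using the 62 W load gap from the single-card system numbers posted above (401 W vs 339 W) and a hypothetical idle advantage for the 5870 (the thread's idle figures are still in dispute, so that input is an assumption you can swap for whatever the reviews settle on):

```python
# Breakeven idle fraction between two cards: the 5870 saves `idle_savings_w`
# at idle but costs an extra `load_excess_w` at load. Average power is equal
# when  f * idle_savings == (1 - f) * load_excess,
# i.e.  f = load_excess / (load_excess + idle_savings).

def breakeven_idle_fraction(load_excess_w: float, idle_savings_w: float) -> float:
    return load_excess_w / (load_excess_w + idle_savings_w)

LOAD_EXCESS_W = 401 - 339   # 62 W gap from the system-load numbers in the thread
IDLE_SAVINGS_W = 3.5        # hypothetical idle advantage for the 5870 -- assumption

f = breakeven_idle_fraction(LOAD_EXCESS_W, IDLE_SAVINGS_W)
print(f"5870 only wins on power if idle time exceeds {f:.1%}")  # ~94.7% here
```

With those inputs the breakeven lands just under 95% idle, which appears to be the calculation being described; a larger idle advantage would pull the breakeven down accordingly.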
 
What really makes me laugh is people now claiming ATI is more powerful than nVidia, comparing older NV GPUs with the newest ATI ones.

I just want to say, wait guys, just wait... it will be as it has ever been: nVidia performing a little better and being way overpriced compared to ATI cards.

And btw, it doesn't mean anything that ATI released DX11 cards before nVidia; ATM, high-end rigs with GT200 or 4xxx series cards can handle any game at the most common gaming resolutions.

Upgrading to the 5xxx now is a rather stupid move.

Wait for GT300 to come out. Prices will drop a little, and hopefully we can use all that power with the most recent DX11 games (although it will take a while for some to come out).
 
We're comparing price points, which is more than fair.
 
nVidia is slacking with their GT300, bottom line. If all were fair in love and war, nVidia would have released the GT300 today and both teams would be off and racing with their new high-end products. But as things stand, they can't. Therefore, the only way you can compare companies is by their current fastest products on the market, and at this time that's the HD 5870 1GB vs. the GTX 295.
 