
Tempted to go for a 4870..


CreasianDevaili

Member
Joined
Sep 18, 2005
I am so tempted to get a 4870 tomorrow. I have been wanting to wait for the 1GB versions or the 4870X2, but now that I've got the money burning in my hand, it is taunting me.

For someone who just wants to game at 2304x1440, do you think the 512MB will be enough with two in CF?
 
I have a 4870 512MB and it's more than enough. In CrossFire that's 1GB total, and the horsepower of those two will obliterate the GTX 280 and play ANY game maxed out, besides Crysis of course, but it will still play that very well at Very High. As for the waiting: since you haven't bought one yet, I suggest you padlock your wallet until the shops release the 1GB version. Why?
1. I'm sure the release isn't far away. 2. When the 1GB comes out, the 512MB version will be pushed down in price. 3. I'm sure the 1GB version will be well worth the wait and won't cost that much more.
 
I don't think the 512MB cards will drop much at all at the moment. I think the 1GB 4870 will just be priced higher, probably in the $350 range.
 
FYI... two cards in CF with 512MB of RAM does not = 1GB... it's still 512MB of RAM.

And why such a weird resolution?
 
If you have 4870X2 money, then by all means wait a few more weeks. If you want to save a little cash, then the 4870 is unbelievable! I bought a Diamond HD4870 yesterday. Initially I felt bad about spending the money, even though I paid cash and could afford it. Memories of how my 2900XT burned me surfaced and made me hesitate. Luckily, I bought it.

Words cannot describe how happy I am with this card. It is 2x faster than my old 2900XT WITH 4xAA and 16xAF. Actually, I run 8xAA in all the games I play at 1680x1050 without so much as a stutter. I tried to find a downside to this card, but there is none. Even with 512MB of memory, the GDDR5 makes up for any bandwidth or frame buffer requirements. You can even crank this card up in Crysis and it will play smooth!

The only scenario where I can think you'll NEED 1GB of memory is a future GPGPU situation that we have yet to come across. I have a 22" monitor with a 1680x1050 native resolution, and that doesn't even faze the card. At your resolution you may want a 4870X2 to crank AA and AF all the way, though. If you do get a 4870 tomorrow, you WILL NOT be disappointed. I give you my word on that. The thing makes my old Socket 939 system fly, and my only disappointment is that it's making it hard to justify upgrading my platform to an Intel or AM2+ system. My gaming requirements have been satisfied above and beyond my wildest expectations. This card is no joke....
 
I don't think the 512MB cards will drop much at all at the moment. I think the 1GB 4870 will just be priced higher, probably in the $350 range.

Price doesn't really "matter" in the sense that I can't afford it; I just want to make it last a good while. 1920x1200 shouldn't be an issue, but I wonder about 2304x1440. I won't be going 2560x1600, as I'd have to dip too low on my refresh rates to do it. At 2304x1440 I can still do 75Hz in gaming, which is acceptable, so I want to push to that resolution.

It's just... :bang head: I was hoping some 1GB versions would be out today, July 8th, so I wouldn't have this issue. I worry about the 4870X2 2GB being so scarce when it is released that I will have to watch a site like a hawk to get one.

My main worry is whether the 512MB on these cards will handle 2304x1440 for a while to come, or whether I should heed caution and just go with the 1GB versions or the 4870X2 2GB.



FYI... two cards in CF with 512MB of RAM does not = 1GB... it's still 512MB of RAM.

And why such a weird resolution?

Using a Sony FW900, the 24-inch widescreen CRT. 2304x1440 @ 75Hz is the max official resolution. I can hit 2560x1600, and have, but it's 60Hz, which kills my eyeballs :eek: I have been staying at 1920x1200 @ 90Hz because I only have a single 8800GT right now, which was just an emergency card in February to replace a blown-up X1900GT.

I just want to use all the potential I've got in my monitor, I guess.
 
Price doesn't really "matter" in the sense that I can't afford it; I just want to make it last a good while. 1920x1200 shouldn't be an issue, but I wonder about 2304x1440. I won't be going 2560x1600, as I'd have to dip too low on my refresh rates to do it. At 2304x1440 I can still do 75Hz in gaming, which is acceptable, so I want to push to that resolution.

It's just... :bang head: I was hoping some 1GB versions would be out today, July 8th, so I wouldn't have this issue. I worry about the 4870X2 2GB being so scarce when it is released that I will have to watch a site like a hawk to get one.

My main worry is whether the 512MB on these cards will handle 2304x1440 for a while to come, or whether I should heed caution and just go with the 1GB versions or the 4870X2 2GB.

ATI isn't like nVidia in regards to memory size and the big dips in games that come from it. Surely at such a high res the extra memory would probably help out nicely, but I can't say for sure yet since we don't have any reviews of one; time will tell in that regard.

At that res, though, you really might need 2x 4870s to give it a decent chance, depending on what games you're playing.

Using a Sony FW900, the 24-inch widescreen CRT. 2304x1440 @ 75Hz is the max official resolution. I can hit 2560x1600, and have, but it's 60Hz, which kills my eyeballs. I have been staying at 1920x1200 @ 90Hz because I only have a single 8800GT right now, which was just an emergency card in February to replace a blown-up X1900GT.

I just want to use all the potential I've got in my monitor, I guess.

So what games are you playing at that res? Just curious.
 
FYI... two cards in CF with 512MB of RAM does not = 1GB... it's still 512MB of RAM.

And why such a weird resolution?

The 48xx series is supposed to be able to share memory, at least in the X2 cards; not sure about CrossFire.

The X2 may come out with 2GB.

Because ATI went with GDDR5, it's fast enough that the difference between 512MB and 1GB won't really be an issue; the super-fast speed at which GDDR5 pushes data means it doesn't need to keep as much in RAM, kind of thing.
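To put a rough number on the GDDR5 point: peak memory bandwidth is just effective clock times bus width. A minimal sketch using the commonly quoted reference-board clocks (treat the exact figures as illustrative, not authoritative):

```python
def bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective clock * bus width in bytes."""
    return effective_clock_mhz * 1e6 * (bus_width_bits // 8) / 1e9

# HD 4870: 900 MHz GDDR5, quad data rate -> 3600 MHz effective, 256-bit bus
hd4870 = bandwidth_gbs(3600, 256)   # ~115.2 GB/s

# HD 4850: 993 MHz GDDR3, double data rate -> 1986 MHz effective, 256-bit bus
hd4850 = bandwidth_gbs(1986, 256)   # ~63.6 GB/s
```

Nearly double the bandwidth on the same bus width is why the 512MB 4870 holds up better than its frame-buffer size alone would suggest.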
 
ATI isn't like nVidia in regards to memory size and the big dips in games that come from it. Surely at such a high res the extra memory would probably help out nicely, but I can't say for sure yet since we don't have any reviews of one; time will tell in that regard.

At that res, though, you really might need 2x 4870s to give it a decent chance, depending on what games you're playing.

I edited my response to your resolution question into the post above. I didn't want to double post, so I deleted the second one and edited it into the one above.

I mostly play MMORPGs. However, I also enjoy FPS such as Quake 4, F.E.A.R., and CoD4, and RTS such as the Total War series.

Most FPS I have played run great. However, I noticed my GPU starting to show its colors when I played Age of Conan at 1920x1200. In MMORPGs, especially where people like to sneak up behind you, it is good to have as much viewing area as you can. I just can't do 2304x1440 with the 8800GT, not with how much goes on in MMORPGs, and still keep a steady FPS.
 
I edited my response to your resolution question into the post above. I didn't want to double post, so I deleted the second one and edited it into the one above.

I mostly play MMORPGs. However, I also enjoy FPS such as Quake 4, F.E.A.R., and CoD4, and RTS such as the Total War series.

Most FPS I have played run great. However, I noticed my GPU starting to show its colors when I played Age of Conan at 1920x1200. In MMORPGs, especially where people like to sneak up behind you, it is good to have as much viewing area as you can. I just can't do 2304x1440 with the 8800GT, not with how much goes on in MMORPGs, and still keep a steady FPS.

Yeah, those aren't too graphically demanding, except AoC, so I could see that one possibly running the card full bore at that res with the graphics cranked.
 
You might as well buy the 4870 now, and then when the 4870X2 comes out you can sell it in the Classifieds for minimal loss if you feel the single card isn't enough.
I kind of wish there were a 4850X2 planned; buying two cards and putting them in CrossFire is more expensive than a dual-GPU card.
 
If I recall, someone had posted info about the X2 cards coming in 1GB and 2GB flavors.

So with that, that would tell me that the 1GB models are really 2x 512MB cards and the 2GB models are 2x 1GB cards. So they do share memory.

I have been struggling since nVidia released their new cards over which card to buy.

So far I have gone through a 9800GX2, which I returned, and bought an 8800GTS 512 (which I have installed right now alongside my G80 8800GTS 320), and yesterday I purchased a VisionTek 4870, which I hope is the end of me buying new cards for a while.

I only play WoW and TF2, but I play on a 32-inch LCD @ 1920x1200. Both games don't require a whole lot from a card, but I wanted to future-proof at least for the next 6 months.
I also run 3 other 19-inch LCDs, but that's mainly web surfing, watching TV, and temp monitoring. So the G80 8800GTS 320 will be fine for all that, and I will return the 8800GTS 512.
 
If I recall, someone had posted info about the X2 cards coming in 1GB and 2GB flavors.

So with that, that would tell me that the 1GB models are really 2x 512MB cards and the 2GB models are 2x 1GB cards. So they do share memory.

They don't share memory. If you want to picture it in CrossFire terms, you have a Primary core, then a Secondary core (add more if needed). The Primary core's memory is the main one, the ruler of all. The Secondary core, and every one after it, holds a mirror image of the Primary core's memory. So if they say it has 1GB, really it only has 512MB usable. If they say it has 2GB, it only has 1GB usable.

Or take the one that had 3 cores on it, I think; whichever it was, it has 1.5GB, but really only 512MB. If you pair up 4 cores for, say, 4GB, it really only has 1GB of usable memory.

I think you get the picture.
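The mirroring described above can be summed up in a few lines. A hypothetical sketch (the function name is mine, not any real API):

```python
def crossfire_vram_mb(per_gpu_mb, num_gpus):
    """With mirrored multi-GPU rendering, every GPU keeps its own full
    copy of the textures and frame data, so the usable pool stays at one
    GPU's worth no matter how many GPUs are chained together."""
    advertised = per_gpu_mb * num_gpus   # what the box may claim
    usable = per_gpu_mb                  # what games can actually address
    return advertised, usable

crossfire_vram_mb(512, 2)   # "1 GB" on the box, 512 MB usable
crossfire_vram_mb(512, 4)   # "2 GB" on the box, still 512 MB usable
```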
 
deathman20
FYI... two cards in CF with 512MB of RAM does not = 1GB... it's still 512MB of RAM.

Hm... didn't know that. So why does it only make use of 512MB of RAM and not more? If it uses the second card's processing power, wouldn't it use the memory too if need be? E.g., let's say you play on a 50-inch HD LCD; if the res was high enough, wouldn't it use the memory on both cards?
 
Yeah, those aren't too graphically demanding, except AoC, so I could see that one possibly running the card full bore at that res with the graphics cranked.

Strangely enough, AoC is pretty demanding even in DX9. However, as I said, I have only played Total War Rome and Medieval, and the FPS, at 1920x1200. They run very well on the single 8800GT, but when things get really busy, which is the most fun for me, I have hit some dips in FPS that distract me. I want to relieve that and have enough to handle the slew of new PC titles coming out within the next year or so.

You might as well buy the 4870 now, and then when the 4870X2 comes out you can sell it in the Classifieds for minimal loss if you feel the single card isn't enough.
I kind of wish there were a 4850X2 planned; buying two cards and putting them in CrossFire is more expensive than a dual-GPU card.

Well, I am set on the 4870 CrossFire setup. I was all set for the 4870X2, but with the issues producing the 1GB version by now, which is affected by the shortage of GDDR5, I am wondering if I would be waiting until very late September or October. A 15% improvement is nice, but with the shortage the prices might shoot way up due to demand, and I'd end up waiting 2-3 months.

I was originally going to get a 3870X2 in February but got the 8800GT because I decided to wait on the GT200 and 48xx cards.

They don't share memory. If you want to picture it in CrossFire terms, you have a Primary core, then a Secondary core (add more if needed). The Primary core's memory is the main one, the ruler of all. The Secondary core, and every one after it, holds a mirror image of the Primary core's memory. So if they say it has 1GB, really it only has 512MB usable. If they say it has 2GB, it only has 1GB usable.

Or take the one that had 3 cores on it, I think; whichever it was, it has 1.5GB, but really only 512MB. If you pair up 4 cores for, say, 4GB, it really only has 1GB of usable memory.

I think you get the picture.

That was one of my concerns. I'm unsure how Far Cry 2, Fallout 3, and a few other titles that I am looking forward to will handle 2304x1440.
 
deathman20


Hm... didn't know that. So why does it only make use of 512MB of RAM and not more? If it uses the second card's processing power, wouldn't it use the memory too if need be? E.g., let's say you play on a 50-inch HD LCD; if the res was high enough, wouldn't it use the memory on both cards?

It uses all the memory, but because of the way the cards talk to each other, since they don't directly access a single shared memory, each card must have all the textures and other information the GPU needs stored in its own memory. So if you run more than one card, each card gets a duplicate of the other's data, since they basically show the same information as they alternate frame by frame; all the same textures etc. are used.

ATI isn't affected by the whole memory thing anywhere near as badly as nVidia is. When nVidia's memory is filled, it gets bloated and slows down; when ATI's card memory fills up, it basically flushes the unneeded stuff out and continues with only a minor performance drop. ATI excels at memory management on their cards.

BTW, the size of the display doesn't matter. You could have a 100" LCD at 1280x720 and the card would have plenty of power; now, if it had a resolution of, say, 2160p, that's another story.
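To put a number on the "size doesn't matter, resolution does" point: the color buffers a card has to keep scale with pixel count alone. A back-of-the-envelope sketch (double-buffered RGBA8 only; real usage is dominated by textures and geometry, so these figures are a floor, not a total):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    """Memory for the color buffers alone: width * height * bytes per
    pixel * number of buffers, converted to MB. Depends only on the
    resolution, never on the physical size of the panel."""
    return width * height * bytes_per_pixel * buffers / 2**20

framebuffer_mb(1280, 720)    # ~7 MB, identical on a 20" or a 100" panel
framebuffer_mb(2304, 1440)   # ~25 MB at the resolution discussed here
```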
 
I have the same monitor, and you really can't force it to do 2304x1440 in many games. Just expect to play at 1920x1200 in almost every game. Really the only way to do it is to edit .ini files, and even then it usually won't work. The real advantage of this monitor, I feel, is the 90Hz+ refresh rates at 1920x1200. The pixel pitch also gets rather small on a 22" monitor (and yes, that's what is viewable on the FW-900) past 1920x1200, so you really won't see any difference between 1920x1200 and 2304x1440. Basically, IMHO, the sweet spot of the FW-900 is 1920x1200, and not, as you have noticed, 2560x1600. I feel 2304x1440 is decent for the desktop but not for gaming, because it takes too much work, really doesn't look all that different on this screen, and you lose out on some refresh rate.
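The pixel-pitch point can be made concrete: the effective pixel pitch of a given mode is just the viewable diagonal divided by the diagonal pixel count. A rough sketch (assumes square pixels; the 22.5" figure is the FW-900's quoted viewable size, and on a CRT the tube's own stripe pitch sets a floor on what extra resolution can show):

```python
import math

def pixel_pitch_mm(viewable_diag_in, width_px, height_px):
    """Effective pixel pitch in mm: viewable diagonal (converted to mm)
    divided by the diagonal pixel count of the mode."""
    diag_px = math.hypot(width_px, height_px)
    return viewable_diag_in * 25.4 / diag_px

pixel_pitch_mm(22.5, 1920, 1200)   # ~0.25 mm
pixel_pitch_mm(22.5, 2304, 1440)   # ~0.21 mm, already near the tube's stripe pitch
```

Once the computed pitch drops below what the aperture grille can resolve, the higher mode stops looking sharper, which matches the post above.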
 
I have the same monitor, and you really can't force it to do 2304x1440 in many games. Just expect to play at 1920x1200 in almost every game. Really the only way to do it is to edit .ini files, and even then it usually won't work. The real advantage of this monitor, I feel, is the 90Hz+ refresh rates at 1920x1200. The pixel pitch also gets rather small on a 22" monitor (and yes, that's what is viewable on the FW-900) past 1920x1200, so you really won't see any difference between 1920x1200 and 2304x1440. Basically, IMHO, the sweet spot of the FW-900 is 1920x1200, and not, as you have noticed, 2560x1600. I feel 2304x1440 is decent for the desktop but not for gaming, because it takes too much work, really doesn't look all that different on this screen, and you lose out on some refresh rate.

I tried 2560x1600 and couldn't stand it because, as you said, this monitor is only 22.5" viewable, which isn't big enough for 2560x1600. I just wanted to try 2304x1440 in the MMORPGs that I play. I had messed with custom resolutions and found my ranges; while I can go higher, I prefer 1920x1200 @ 90Hz overall.

I had a few F2P MMORPGs that wouldn't allow past 60Hz refresh rates at any resolution, and while in game I was fine, the moment I alt-tabbed out I started getting a headache. I wear glasses, so desktop viewing is kind of limited to 1920x1200, period.

I appreciate the reply. It is a dose of reality, I guess. 1920x1200 @ 90Hz+ is nothing to sneeze at, of course. So the 512MB 4870s should be plenty for me in the end.
 
deathman20


Hm... didn't know that. So why does it only make use of 512MB of RAM and not more? If it uses the second card's processing power, wouldn't it use the memory too if need be? E.g., let's say you play on a 50-inch HD LCD; if the res was high enough, wouldn't it use the memory on both cards?

It uses all the memory, but because of the way the cards talk to each other, since they don't directly access a single shared memory, each card must have all the textures and other information the GPU needs stored in its own memory. So if you run more than one card, each card gets a duplicate of the other's data, since they basically show the same information as they alternate frame by frame; all the same textures etc. are used.

ATI isn't affected by the whole memory thing anywhere near as badly as nVidia is. When nVidia's memory is filled, it gets bloated and slows down; when ATI's card memory fills up, it basically flushes the unneeded stuff out and continues with only a minor performance drop. ATI excels at memory management on their cards.

BTW, the size of the display doesn't matter. You could have a 100" LCD at 1280x720 and the card would have plenty of power; now, if it had a resolution of, say, 2160p, that's another story.

I was going to respond, but it looks like deathman took care of you.

The only way at the moment to use memory from both cards, I believe, is with dual monitors.

Read the above post.

You are way off course on the dual-monitor theory.
 