
X1900 reviews

When nVidia launched the GTX and cards started coming out, ATI promised it would deliver right away with the R520, and we kept waiting and waiting and waiting...

After a long time they finally released the R520s, which didn't match expectations. Then they started saying the development team didn't do a good job on the transition to a 90nm die and had a lot of problems. However, they stated more than once that the R580 team has had much better luck and will bring out the true next-gen ATI card that the R520 was supposed to be. I was more than hyped about it until they mentioned it will still be a 16-pixel-pipe card, with the addition of 48 shaders meant to make up for that. So far, from what I have seen, it only shines in certain games like F.E.A.R.; in the rest the performance is merely decent.

As for the 512MB GTX, which was originally supposed to be the 7800 Ultra: it was the counterpart to the X1800XT, but G71 (nVidia's first attempt at a 90nm die) will be the counterpart to the X1900s, so we will see how it turns out. Hopefully better than the R580, and if the rumors hold true it will be much better.
 
RedDragonXXX said:
When nVidia launched GTX and they started coming out, ATI promised that it will deliver right away on their R520's and we kept on waiting and waiting and waiting...
I agree :thup:
 
It's not a next gen card, it's a refresh. The next gen cards will be the 600 series, and should usher in WGF (DX10).

My thoughts on it are mixed. They are certainly taking a gamble going down the road they've started on. If games become very shader oriented then they could age really well. On the other hand, if they don't, next (northern hemisphere) summer they will get walked all over. Still, the current trend seems to be to add more and more shaders, and it's not like they are totally clueless up in Markham so they must have a good feeling about it.

And 500 GFLOPS. It would make quite the math co-processor, or physics engine. http://graphics.stanford.edu/projects/gpubench/results/X1900XTX-5534/
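The 500 GFLOPS figure lines up with a quick back-of-envelope peak-throughput calculation. A sketch follows; the FLOPs-per-ALU-per-clock count is an assumption on my part (marketing and benchmark figures count vector lanes and MADD operations differently), not something the GPUBench page states:

```python
# Back-of-envelope peak shader throughput, as a sanity check on the
# "500 GFLOPS" figure. The FLOPs-per-ALU-per-clock count is an
# assumption; different sources count MADDs and vector lanes differently.
def peak_gflops(alus: int, clock_hz: float, flops_per_alu: int) -> float:
    """Theoretical peak = ALUs x clock x FLOPs issued per ALU per cycle."""
    return alus * clock_hz * flops_per_alu / 1e9

# X1900XTX: 48 pixel-shader ALUs at 650 MHz. Counting 16 FLOPs/ALU/clock
# lands near the quoted figure; counting 12 gives ~374 GFLOPS instead.
print(peak_gflops(48, 650e6, 16))  # 499.2
print(peak_gflops(48, 650e6, 12))  # 374.4
```

Either way, the point stands: that is an enormous amount of floating-point math for a co-processor.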

edit: you guys have been busy posting while I've been concentrating on my lunch.
 
Compared to my X850XT, the image quality on my 7800GTX is rubbish - and that's just standard nVidia drivers vs. standard ATI drivers. Of course it's all subjective, and I'm no fanboy - the ATI card just looked better, but the 7800GTX spanked it performance-wise. My loyalty lies with my wallet, but when the X1900 comes down in price my eyes are gonna make me change camp again...
 
thegreek said:

:thup: as well.

If the 7900 debuts with 1.1ns memory again and even sports a modest speed bump over the GTX 512 (even 650MHz on the core), it "should" soundly take the upper hand...again...EVEN with its 24-pipe config. With the speculation that it will be 32 pipes AND 700MHz+ on the core AND 1800MHz memory, I think we could see a blowout.

Now the price of that beast is a WHOLE other story :p . I'm hoping Samsung has ramped up enuff 1.1ns memory to keep the prices down. But it seems odd that if Samsung has actually pumped out a bunch, why wouldn't ATI go with it instead of the 1.2ns for the x1900s?
 
RedDragonXXX said:
As for 512MB GTX which was originally supposed to be 7800Ultra, was a counterpart for X1800XT's, but G71 (nVidia's first attempt at 90nm die) will be the counterpart for X1900's so we will see how it turns out. Hopefully better then R580's, and if rumors stay true it will be Much better.

These are all facts, yes, but we will see if NV has an easier time than ATI did going down to 90nm. If they do, and they launch plenty of cards at a decent price, then they can be fully commended. I think the point is that the X1900XT is the best price/performance card in a long time. I like to upgrade a lot, so someone got a good deal on my GTX and I therefore got an excellent deal on an X1900XT (especially being under $510) :p

It may not make as much sense economically for everyone though.
I personally love the hell out of F.E.A.R. - I'll play the single player for the 3rd time with this card, and online play is where I'll be spending my time, along with Far Cry...
 
ATi didn't have any problem going to 90nm per se - the issue turned out to be in third-party IP, not their own design.

http://www.beyond3d.com/reviews/ati/r520/
Beyond3D said:
According to public reports ATI noticed that as late as July, issues occurred that prevented the R520 core being clocked close to its target speeds, which is consistent with leakage issues. Curiously, the issue did not occur across all their 90nm products - ATI had already delivered Xenos to Microsoft using the same 90nm process R520 does, and other derivatives of the R520 line suffered the same issue (RV530) but others did not (RV515) - the fact R520 and RV530 share the same memory bus, while RV515 and Xenos have different memory busses is not likely to be coincidental in this case. ATI were open about talking about the issue they faced bringing up R520, sometimes describing the issue in such detail that only Electronic Engineers are likely to understand, however their primary issue when trying to track it down was that it wasn't a consistent failure - it was almost random in its appearance, causing boards to fail in different cases at different times, the only consistent element being that it occurs at high clockspeeds. Although, publicly, ATI representatives wouldn't lay blame on exactly were the issue existed, quietly some will point out that when the issue was eventually traced it had occurred not in any of ATI's logic cells, but instead in a piece of "off-the-shelf" third party IP whose 90nm library was not correct. Once the issue was actually traced, after nearly 6 months of attacking numerous points where they felt the problems could have occurred, it took them less than an hour to resolve in the design, requiring only a contact and metal change, and once back from the fab with the fix in place stable, yield-able clockspeeds jumped in the order of 160MHz.
I doubt nV will have any 90nm problems. Especially with everyone else already at 90nm.
 
Illyest said:
:thup: as well.

If the 7900 debutes with 1.1ns memory again and even sports a modest speedbump over the GTX 512(even 650mhz on the core), it "should" soundly take the upperhand...again...EVEN with its 24 pipe config. With the speculation that it will be 32 pipes AND 700mhz+ on the core AND 1800mhz memory, I think we could see a blowout.

Now the price of that beast is a WHOLE other story :p . I'm hoping Samsung has ramped up enuff 1.1ns memory to keep the prices down. But it seems odd that if Samsung has actually pumped out a bunch, why wouldn't ATI go with it instead of the 1.2ns for the x1900s?

A 7900 Ultra at a 450MHz core clock (well below its rumored 700MHz+ final clock) is reportedly outperforming the 512MB GTX in all benches. That should give you an idea of how good this card will be.

Guru3D and Register said:
Nvidia's new 90nm high-end flagship graphics chip, the G71, possibly planned to ship as the GeForce 7900 GTX, will not launch until March, according to moles in the industry.

According to the rumors, the chip will contain 32 pixel-processing pipelines in a core clocked at a hefty 700-750MHz. The GDDR 3 memory will run at 800-900MHz (1.6-1.8GHz, effective). The site's moles point out that the new part will deliver significantly better graphics performance than the GeForce 7800 GTX 512 - to match the older part's performance, the G71 would only need to be clocked to 430MHz, apparently. Nvidia is said to have begun sampling the G71, but it appears the part won't ship in boards until the launch, currently pegged for CeBIT, which runs from 9-15 March. That's the best part of two months after ATI is expected to announce its next-generation part, the R580.
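Those rumored memory clocks translate directly into peak bandwidth (effective data rate times bus width). A quick sketch, assuming the usual 256-bit bus for a high-end card of this generation - the quote itself only gives the clock speeds, so the bus width is my assumption:

```python
# Peak memory bandwidth from effective data rate and bus width.
# The 256-bit bus is an assumption (typical for high-end cards of this
# generation); the quoted rumor only states the memory clock speeds.
def bandwidth_gbs(effective_hz: float, bus_bits: int) -> float:
    """GB/s = transfers per second x bytes moved per transfer."""
    return effective_hz * (bus_bits / 8) / 1e9

# 1.6-1.8GHz effective GDDR3 on a 256-bit bus:
print(bandwidth_gbs(1.6e9, 256))  # 51.2
print(bandwidth_gbs(1.8e9, 256))  # 57.6
```

So the rumored top end would push roughly 57.6 GB/s, comfortably above the GTX 512's shipping memory bandwidth.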

But I agree with ATI on one point, and that is their pricing. If the G71 doesn't match my expectations I'll upgrade to an X1800XT, as there is no need to get an X1900XT when comparing the two.
 
RedDragonXXX said:
...I'll upgrade to X1800XT as there is no need to get X1900XT when comparing the two.
If you already have a GTX, why upgrade at all?

I'd take the X1900 over the X1800. The more I look around the more it looks like Shaders are going to play an increasing role in games. I'd hate to get an X1800 now and then find out that the X1900 runs UT2007 twice as fast.

But then I don't buy video cards that often. The last three cards I've had were the 7200, 8500, and the X800XL (current card).
 
JCLW said:
If you already have a GTX, why upgrade at all?

I'd take the X1900 over the X1800. The more I look around the more it looks like Shaders are going to play an increasing role in games. I'd hate to get an X1800 now and then find out that the X1900 runs UT2007 twice as fast.

But then I don't buy video cards that often. The last three cards I've had were the 7200, 8500, and the X800XL (current card).


Cuz the 7800GTX 256 suXors in some stuff - like F.E.A.R., big time - and ever since the X1800XT I've wanted to go ATI again. Just my 2 cents.

And cuz I can.
 
JCLW said:
If you already have a GTX, why upgrade at all?

I don't have the GTX anymore.
 
Did I read something about the 7900 ultra being faster than...

Sure would like to read that arty...link please!!

As for the card, it performs the same as or better than two 256MB GTXs in SLI in F.E.A.R. And that's the only game the SLI setup disappointed me in. I'd much rather run a single card if it'll get me healthy frame rates around 16x12 with 4xAA. Less heat, less $$, fewer issues...

I'm also thrilled to death to check out new tech. I'm a new tech fanboy and my loyalties lie with whoever provides the latest whiz bang fun adventure in chip world. Oh...and let's not forget ATI's always had better IQ than NV...go on...flame away, lol! But it's true!
 