Sapphire (ATi) 9600 “Atlantis”

A very good mid-level DX9 graphics card – Brian

SUMMARY: A very good mid-level DX9 graphics card.

The good guys at Newegg were nice enough to send a
Sapphire 9600XT for a test spin.

Card front

When I posted my Mid-level Graphics Card Shootout
last month, I received a number of e-mails inquiring why I hadn’t used ATi’s 9600XT instead of their “Pro” model.

In response to the deluge of inquiries, here’s the 9600XT. We’ll take a look at the card first, and then backtrack to the comparo, and see how it fares against the three cards already tested.

The Bundle

Box

Bundle

“No phone, no lights, no motor cars,
Not a single luxury,
Like Robinson Crusoe,
As primitive as can be.”
*

Included with this particular card (the “Lite” retail version) are an S-Video cable, PowerDVD software, a driver disc, and a case badge.

There’s no copy of Half-Life 2, no coupon for it, and no mention of it (or of any game, for that matter). This does, however, keep the bottom line (the price) down, so it’s not a bad thing.

This particular model uses a conventional chipset heatsink/fan arrangement, rather than the heatpipe cooling system found on some of Sapphire’s models.

Somehow, I just feel more comfortable with that. I can’t explain why, I just am.

Also, because of this cooling method, there’s no possibility of interference between a heatpipe assembly wrapping around to the back of the card and a Northbridge cooler, something I thought might be an issue with the Sapphire 9600 Pro I tested previously.

Card back

For those interested, the RAM chips are labeled “Samsung 332 / K4D263238E-GC33 / WVD250BA Korea”.

Installation

With the standard cooling used on this card, and its short length, installing it raised no clearance issues of any kind.

I did, however, encounter the same hit-and-miss video problem I had with the Sapphire 9600 Pro. Occasionally on booting, and almost always on rebooting, the video signal would cut out.

This happened on both the AMD- and Intel-based systems I tested this card with. The computer would continue booting (you could “hear” Windows start up at the right time, as the default startup sound file would play), but you could “see” nothing.

I installed the 9600XT onto the Abit AN7 motherboard I was testing out at the time, to see if I could diagnose this. It had the same intermittent video problem with that motherboard as well.

The AN7 has a two-digit LED display built into the board that shows a series of codes for diagnosing boot problems. Nothing out of the ordinary was ever displayed there when the video cut out. So the motherboard(s) didn’t detect any failure, and Windows loaded properly, but no video signal appeared.

This made setting these machines up for testing an extreme PITA. After any driver, Windows Update, or major settings change that required a reboot, I’d have to shut the machine down fully, turn the power off at the PSU, count to ten, and then restart the system. That almost always worked, and I got a video signal.

I have yet to track down any plausible reason why these two cards did this.

Sapphire logo

Detailed Specifications

  • Graphics Controller: RADEON ™ 9600XT
  • Memory Configuration: 128MB DDR
  • Connectors: VGA out(15 Pin D-Sub), TV-Out(S-Video), DVI Connector
  • Bus Interface: AGP 1X/2X/4X/8X
  • Maximum Resolution: 2048 x 1536 @ 85Hz
  • Video-Out: Supported (TV Tuner and Video-in N/A)
  • Clock Speed: 500MHz
  • Memory speed: 600MHz
  • Operating Systems Support: Windows ® XP/2000/NT/ME/98
  • 3D Acceleration Features Supports: DirectX ® 9.0, the latest OpenGL ® functionality, SMARTSHADER ™ 2.1, SMOOTHVISION ™ 2.1, VIDEOSHADER ™, and ATI’s new FULLSTREAM ™ technology

What’s different between this and the 9600 Pro I tested recently? The GPU clock speed. The core clock gets a bump up to 500MHz, versus the 400MHz the GPU on the 9600 Pro runs at.

Aside from that, all other specs are identical.
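To put that 100MHz in perspective, here’s a quick back-of-the-envelope comparison in Python. Note that the pixel pipeline count and the 128-bit memory bus width aren’t part of the spec list above; I’m assuming the usual figures for this class of GPU, so treat the absolute numbers as ballpark only.

```python
# Rough theoretical throughput comparison: 9600XT vs. 9600 Pro.
# ASSUMPTIONS (not from the spec sheet above): 4 pixel pipelines
# and a 128-bit memory bus, so take these as ballpark figures.

PIPELINES = 4            # assumed pixel pipelines, both cards
BUS_WIDTH_BITS = 128     # assumed memory bus width, both cards

def fillrate_mpixels(core_mhz: int) -> int:
    """Theoretical pixel fillrate in megapixels per second."""
    return core_mhz * PIPELINES

def bandwidth_gbs(effective_mem_mhz: int) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return effective_mem_mhz * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9

print(f"9600XT  fillrate: {fillrate_mpixels(500)} Mpixels/s")          # 2000
print(f"9600Pro fillrate: {fillrate_mpixels(400)} Mpixels/s")          # 1600
print(f"Core clock advantage: {500 / 400 - 1:.0%}")                    # 25%
print(f"Memory bandwidth (both cards): {bandwidth_gbs(600):.1f} GB/s") # 9.6
```

On paper, then, the XT’s only edge is raw fillrate; memory bandwidth is unchanged, which is consistent with the benchmark gains further on being solid but well short of 25%.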

* (From the “Gilligan’s Island” theme song, by Sherwood Schwartz)

{mospagebreak}

Installed

Mounted up in the Intel Machine

As I tested this card out, I recreated the same environment I used when testing the 9600 Pro, FX5700 Ultra, and Ti4200: a fresh format and install of Windows XP Pro with SP1, drivers and benchmark software installed, the hard drive defragged, and then the tests were run.

Again, as before, the card is not overclocked. The AGP bus was locked at 66MHz. Both systems used the same hard drive, a Western Digital WD400BB model, and the same
RAM, two sticks of 512MB (1 GB total) Corsair XMS PC3200C2 running in Dual Channel mode.

Unreal Tournament 2003 was patched to v2225. All tests were run at 1024 x 768 resolution, with a refresh rate of 75Hz. Anti-aliasing is turned off, and Anisotropic Filtering will only be enabled in the AquaMark 3 test.

I got a ton of e-mail about overclocking, and about using AA and AF with the other cards. OK, OK…after I run the tests at the default settings, to provide a baseline and a comparison with the other cards, I’ll give this card a push and see how that affects the scores.

Let’s get the baseline established first…

Here’s a rundown of the two systems this card was tested with:

The AMD System

AMD rig
  • Chieftec Dragon mid-tower case
  • Abit NF7-S v2.0 motherboard
  • AGP aperture in BIOS set to 256MB, 8X mode
  • AMD XP2100+ (Thoroughbred “B”), running at 2200MHz (200 x 11)
  • 2 x 512MB (1GB total) Corsair XMS PC3200C2 DDR, running in Dual Channel Mode, 1:1 ratio (200 FSB)
  • Western Digital WD400BB hard drive (ATA100, 7200 RPM, 40GB)
  • Custom watercooling on CPU and Northbridge, active air cooling on Southbridge and mosfets
  • 420W power supply
  • Windows XP Pro, SP1
  • ATi Catalyst v3.10 graphics driver

The Intel System

P4 rig
  • Raidmax Scorpio 868W mid-tower case
  • Abit IC7 v1.0 motherboard
  • AGP aperture in BIOS set to 256MB, 8X mode
  • Intel P4 1.6A (Northwood), running at 2400MHz (150 x 16)
  • Same DDR as above, running in Dual Channel, 4:5 ratio (187FSB)
  • Same hard drive as above
  • Swiftech MCX4000 w/ 80mm Sunon 39CFM fan on CPU, modified HSF (made from the center slug of another MCX4000) on the Northbridge, passive heatsinks on clock generators, voltage regulator, and mosfets
  • 420W power supply
  • Windows XP Pro, SP1
  • ATi Catalyst v3.10 graphics driver

Benchmarking the XT next;

{mospagebreak}

All benchmarking was done at 1024 x 768, 75Hz. The default settings were used with each individual program. For example, 3D Mark runs at 16-bit color depth, with triple buffering, 16-bit Z-buffering, and texture formatting by default, while 3D Mark 2001SE defaults to 32-bit color and 24-bit Z-buffering. Whatever settings came up when the programs were first run were the settings used.

I ran each test a minimum of three times through, and kept the best (highest) score. The ShaderMark v2.0 benchmark doesn’t give a “total” score like the rest, but rather a complete breakdown of each individual test within the benchmark. A screenshot of this output in Notepad from each system is included below.
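For the curious, here’s a minimal sketch (in Python) of that best-of-three approach. The simulated_run() function is purely hypothetical; it just stands in for one pass of a real benchmark, whose score drifts a little from run to run.

```python
# Minimal sketch of the "run each test at least three times, keep the
# highest score" approach. simulated_run() is a hypothetical stand-in
# for one pass of a real benchmark; none of the suites used here are
# actually driven from a script like this.

import random

def simulated_run(baseline: float, noise: float = 0.02) -> float:
    """Pretend benchmark pass: baseline score +/- a little run-to-run noise."""
    return baseline * (1 + random.uniform(-noise, noise))

def best_of(baseline: float, runs: int = 3) -> float:
    """Take the best (highest) score out of several passes."""
    return max(simulated_run(baseline) for _ in range(runs))

# e.g. three passes of something that scores around 12591 "Marks"
print(round(best_of(12591.0)))
```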

Benchmark                        | AMD system    | Intel system
3D Mark 2000                     | 15262 Marks   | 13136 Marks
3D Mark 2001SE                   | 12591 Marks   | 12201 Marks
3D Mark 2003                     | 3758 Marks    | 3759 Marks
AquaMark 3                       | 30819         | 30213
Unreal Tournament 2003 Flyby     | 163.474       | 159.149
Unreal Tournament 2003 Botmatch  | 77.976        | 66.087
PC Mark 2002 CPU                 | 6798          | 5902
PC Mark 2002 Memory              | 5719          | 8192
PC Mark 2002 HDD                 | 659           | 662
PC Mark 2004                     | 3887          | 3309

AMD Shader

ShaderMark v2.0, AMD system (above), and Intel (below)

Intel Shader

I noticed as I ran these tests that they seemed to go faster than they had with any of the other cards I had tested last month. When I started cross-referencing the scores I got
here, I found that was definitely the case.

It is faster…in some cases, substantially.

CONCLUSIONS

As mentioned, the video cutting out on booting made a reappearance, once again taking a lot of the fun out of testing this card. Once I got through the chore of installing Windows, drivers, Service Packs, etc., things were better. I still cannot speculate on what’s causing this, or why it only happened with these two cards (the 9600XT here, and the 9600 Pro before).

I will say, this card put in a much more impressive showing than its junior sibling did. I’ll cover how it fared in the benchmark comparisons over the next few pages, and add in a few in-game screenshots as well.

If there’s a DX9 game out there you really want, this card will run it well. I still think it ought to do better with the older benchmark suites, but it did run them better than
the cards I recently tested.

How much? We’ll look at that next.

All in all, this is a really good mid-level card. Given its current price tag ($150ish, as I write this), it’s a performance bargain.

If I knew more on the video cutting out issue, I’d be more inclined to jump up and down. But I will give it a hearty “thumbs up”.

I’d like to thank Newegg for letting us test this out.

Email Brian

{mospagebreak}

Comparing the cards

Let’s start by looking at the 9600XT against its sibling, the 9600 Pro.

As I mentioned earlier in this article, the 9600XT has a higher GPU clock speed, by 100MHz (500MHz, versus 400MHz for the 9600 Pro).

9600XT vs. 9600 Pro

Italics indicate the highest score in each test.

Benchmark                        | 9600XT AMD    | 9600XT Intel  | 9600Pro AMD   | 9600Pro Intel
3D Mark 2000                     | *15262 Marks* | 13136 Marks   | 14539 Marks   | 12987 Marks
3D Mark 2001SE                   | *12591 Marks* | 12201 Marks   | 11691 Marks   | 11270 Marks
3D Mark 2003                     | 3758 Marks    | *3759 Marks*  | 3371 Marks    | 3374 Marks
AquaMark 3                       | *30819*       | 30213         | 26334         | 26328
Unreal Tournament 2003 Flyby     | *163.474*     | 159.149       | 148.785       | 145.832
Unreal Tournament 2003 Botmatch  | *77.976*      | 66.087        | 72.781        | 61.730
PC Mark 2002 CPU                 | *6798*        | 5902          | 6793          | 5900
PC Mark 2002 Memory              | 5719          | 8192          | 5837          | *8302*
PC Mark 2002 HDD                 | 659           | 662           | 669           | *677*
PC Mark 2004                     | 3887          | 3309          | *3925*        | 3337

Looking at these scores, we see that 100MHz of GPU clock speed certainly makes a difference in the final numbers in the bulk of the tests.

The 9600XT outscored its sibling in every graphics test, and in one of the four “System”-based tests.
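Since the test systems were identical for both cards, the gains fall straight out of the table above. Here’s a quick Python check of the graphics tests on the AMD system (the numbers are copied from the table; nothing new is assumed):

```python
# Percentage gain of the 9600XT over the 9600 Pro, AMD system,
# graphics benchmarks only (scores taken from the table above).

scores = {
    "3D Mark 2000":    (15262, 14539),
    "3D Mark 2001SE":  (12591, 11691),
    "3D Mark 2003":    (3758,  3371),
    "AquaMark 3":      (30819, 26334),
    "UT2003 Flyby":    (163.474, 148.785),
    "UT2003 Botmatch": (77.976,  72.781),
}

for test, (xt, pro) in scores.items():
    gain = (xt / pro - 1) * 100
    print(f"{test:<16} {gain:5.1f}% faster")
```

The newer and more shader-heavy the test, the bigger the payoff: roughly 5% in 3D Mark 2000 versus about 17% in AquaMark 3, though still well short of the 25% core clock difference.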

Perhaps it’s because I live just south of Boston, but after the Super Bowl (XXXVIII, Feb 1, 2004), when the Patriots won (again!), this next matchup feels kind of anticlimactic… =)

9600XT vs. FX 5700 Ultra

Which card will emerge “victorious” from this clash? It’s a lot closer than you think…

Italics indicate the highest score in each test.

Benchmark                        | 9600XT AMD    | 9600XT Intel  | FX5700 Ultra/AMD | FX5700 Ultra/Intel
3D Mark 2000                     | *15262 Marks* | 13136 Marks   | 14971 Marks      | 13013 Marks
3D Mark 2001SE                   | 12591 Marks   | 12201 Marks   | *13778 Marks*    | 12924 Marks
3D Mark 2003                     | 3758 Marks    | 3759 Marks    | 3892 Marks       | *3895 Marks*
AquaMark 3                       | *30819*       | 30213         | 29182            | 29163
Unreal Tournament 2003 Flyby     | 163.474       | 159.149       | *166.534*        | 162.392
Unreal Tournament 2003 Botmatch  | *77.976*      | 66.087        | 73.835           | 62.393
PC Mark 2002 CPU                 | *6798*        | 5902          | 6774             | 5945
PC Mark 2002 Memory              | 5719          | *8192*        | 5759             | 8188
PC Mark 2002 HDD                 | 659           | 662           | *670*            | 668
PC Mark 2004                     | *3887*        | 3309          | 3878             | 3277

To save a bit of space, I’ll refrain from repeating all four of the ShaderMark scores again (refer to Page 3 here for the 9600XT scores, and to the link at the top of this section for the FX5700 Ultra’s scores).

Suffice to say the 9600XT won that benchmark. Handily.

Factoring in the ShaderMark v2 scores, the ATi 9600XT wins by a score of 7 – 4 in these benchmarks. Cutting out the “System”-based scores, the margin becomes 4 – 3, with the 9600XT on top.
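For anyone who wants to double-check that 7 – 4 tally, here’s one way to count it, as a short Python sketch: each benchmark goes to whichever card posts the single highest score in that row of the table above, with ShaderMark v2.0 counted as a 9600XT win.

```python
# Re-tallying the 9600XT vs. FX5700 Ultra "scoreboard" from the table above.
# A benchmark goes to whichever card posts the single highest score in that
# row (either platform); ShaderMark v2.0 is counted as a 9600XT win.

results = {
    # test: ((XT AMD, XT Intel), (FX AMD, FX Intel))
    "3D Mark 2000":        ((15262, 13136), (14971, 13013)),
    "3D Mark 2001SE":      ((12591, 12201), (13778, 12924)),
    "3D Mark 2003":        ((3758, 3759),   (3892, 3895)),
    "AquaMark 3":          ((30819, 30213), (29182, 29163)),
    "UT2003 Flyby":        ((163.474, 159.149), (166.534, 162.392)),
    "UT2003 Botmatch":     ((77.976, 66.087),   (73.835, 62.393)),
    "PC Mark 2002 CPU":    ((6798, 5902), (6774, 5945)),
    "PC Mark 2002 Memory": ((5719, 8192), (5759, 8188)),
    "PC Mark 2002 HDD":    ((659, 662),   (670, 668)),
    "PC Mark 2004":        ((3887, 3309), (3878, 3277)),
}

xt_wins, fx_wins = 1, 0   # start the 9600XT at 1 for ShaderMark v2.0
for test, (xt, fx) in results.items():
    if max(xt) > max(fx):
        xt_wins += 1
    else:
        fx_wins += 1

print(f"9600XT {xt_wins} - FX5700 Ultra {fx_wins}")   # 7 - 4
```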

Looking closely at those scores, however, shows they’re not all wide margins, in either direction. These two cards match up very well against each other, each having its own good and bad points. Performance-wise, the FX 5700 Ultra lands right in between the 9600XT and the 9600 Pro.

A few of the trends noticed from the Comparo last month held true here, as well;

  • The AMD system led in the majority of the graphics scores.
  • The ATi card still seems to be better, system wide.
  • The Intel system dominated the Memory score from PC Mark 2002.

Not for the squeamish;

Just to say I did, here are the numbers from the 3-Way Comparo put up by my old 4X AGP GeForce4 Ti4200.

Italics indicate the highest score in each test.

Benchmark                        | 9600XT AMD    | 9600XT Intel  | Ti4200 AMD    | Ti4200 Intel
3D Mark 2000                     | *15262 Marks* | 13136 Marks   | 14535 Marks   | 12312 Marks
3D Mark 2001SE                   | *12591 Marks* | 12201 Marks   | 12112 Marks   | 10675 Marks
3D Mark 2003                     | 3758 Marks    | *3759 Marks*  | 1426 Marks*   | 1566 Marks*
AquaMark 3                       | *30819*       | 30213         | 15454         | 15473
Unreal Tournament 2003 Flyby     | *163.474*     | 159.149       | 135.580       | 131.859
Unreal Tournament 2003 Botmatch  | *77.976*      | 66.087        | 70.345        | 61.662
PC Mark 2002 CPU                 | *6798*        | 5902          | 6781          | 5949
PC Mark 2002 Memory              | 5719          | *8192*        | 5431          | 7697
PC Mark 2002 HDD                 | 659           | 662           | *696*         | 675
PC Mark 2004                     | *3887*        | 3309          | 3704          | 3151

(* NOTE: The Ti4200 would not run “Game 4 – Mother Nature” or the “Pixel Shader 2.0” feature test in 3D Mark 2003.)

Nor would the Ti4200 run the ShaderMark v2.0 application at all. Without the proper shader support, or Microsoft DX9 support, it kicked me right out every time.

It still seems to me that the Ti4200 ought to get its ass handed to it in the older benchies, but it doesn’t. It shows strong in those, but drops off dramatically (if it runs them at all) when the DX9 tests are used.

There are still no DX9-based games on my desktop, so having the 9600XT winging its way back to Newegg really doesn’t bother me. For what I play, the old silicon still holds up pretty well.

I will, no doubt, be one of those agonizing over whether to buy a new card in the near future, or wait for PCI Express. As much as I still like my old warhorse, at a bill and a half, that 9600XT is mighty hard to say no to.

On the other hand, my wife in the next room makes it very easy to say no to it….=P

A few screenies, and I give the card a “shove”;

{mospagebreak}

Let’s take a look at Eidos’ “Tomb Raider: Angel of Darkness” first.

I started off by cranking up the AA and AF a bit;

Settings

I’m still running a resolution of 1024 x 768, and 75Hz refresh rate, however.

AOD SS 2 resized

One of the neat things about this game is that there’s a way to hack into it and turn on some debugging features, which let you see how the game is running. You can turn various things on and off, and watch the frame rates rise or fall in real time.

F11bump

One of the things you can turn on/off is bumpmapping, as seen here;

F11bumpless

AOD SS DOF CU

Another neat thing about this game is the Depth of Field effect. Notice in the last three pictures how things more distant are blurry. Even with all the eye candy turned fully on, I still managed to average 30 to 60 FPS, depending on the surroundings.

Lara twin gun

“What are YOU lookin’ at?!?!?”

Whoops…looks like Lara’s getting a trifle peeved at all the screenies I’m taking…must be time to move along…. =P

{mospagebreak}

Another game I played a good amount while using the 9600XT was Fox’s “No One Lives Forever 2”.

This game played extremely well with the 9600XT. The Lithtech game engine used in NOLF2 looked beautiful, especially with the settings maxed and the AA and AF at 6x and 8x respectively.

Let’s see if Ms. Archer would mind us tagging along for a bit;

If I'm not back

Now, now…we won’t take that much of your time.

Cate

Wish me luck

Cate full-fraps

The picture immediately above is the full shot the second image was cut from. Visible in the upper left of the full view is the readout from the program “FRAPS”. I noticed the framerates were a bit lower in the cutscenes than in gameplay, where the rates were usually much higher.

Don't leave bodies

Good sound advice, no matter the situation. =)

Phone full-fraps

More NOLF2, and UT2K3 next;

{mospagebreak}

Wardrobe

“Huh? What the… WARDROBE!!!!” *SHEESH* =P

Falling leaf

Seriously…the game played amazingly well with the 9600XT. Every detail was beautifully rendered, and the framerates were very good. I could fill page upon page with screenshots from this game.

Unreal Tournament 2003

For some reason, no matter what I tried, I just couldn’t seem to make this game click with the 9600XT. While it certainly ran better than it did on my Ti4200, it still seemed lacking somehow. It just seemed like the framerates should have been higher than they were. In all fairness, this was the case with the 9600 Pro and FX5700 Ultra tested previously, as well.

Note: I turned off AA and AF when playing this game. The framerates dropped extremely low when they were turned on, even at 2x (both AA and AF). I ran the UT2K3 benchmark with them on, and the scores dropped as seen below. I then disabled AA and AF.

UT2K3 6xAA-8xAF

With 6x AA and 8x AF enabled, the Flyby and Botmatch scores dropped from 163.474 and 77.976, respectively. Quite a hit.

UT2K3 cranked

Although the framerates were very low, playing with AA and AF enabled produced some dazzling images. Resizing and reproducing them here loses a lot of the detail, unfortunately.

AA and AF are disabled for the shots below.

Visible (barely) in the upper right-hand corner is the FPS readout. UT2K3 has a facility to display this, so I turned it on for these shots;

UT1

“CTF-Hall Of Giants 2004” (43 FPS shown)

UT2

It was right about then that [OC]Mr_B decided perhaps going fishing might have been a better idea…. =) (16 FPS)

UT3

“CTF-Face3” (64 FPS)

UT4

This was one area that always plagued my Ti4200, dropping the framerates into single digits. The 9600XT did handle it better. (58 FPS)

UT5

“DM-Tokara Forest” (48 FPS)

  • Piglet: “P-P-P-Pooh? I d-d-on’t think we’re in the 100 Acre Woods anymore…”
  • Pooh: “No Piglet, we’re not.”
  • PIGLET tried to juggle Gorge’s hand grenade
  • PIGLET was killed!
  • Pooh: “Oh bother….” =P

{mospagebreak}

Overclocking

This time I did give the card a bit of a push, to see what speeds I might be able to hit, and what effect it would have on benchmark scores and gameplay.

Def Perf profile

PowerStrip, at the default settings: 500MHz core, 300MHz (600 DDR) memory.

From here, I raised the core clock until I lost stability, then did the same with the memory clock. Only then did I go into the BIOS and raise the voltage.

After a lot of trial and error (more error, really), I found the fastest this particular card wanted to go was 540/320, at 1.8v. For some reason I thought it might go higher, but I still saw a sizeable increase in some benchmark scores. I re-ran the DX9 benchmarks, as that’s where this card really seemed to perform best.

Bear in mind also, this is with the OEM chipset cooler. Upgrading it, and/or adding RAMsinks, would likely produce better results.

3D Mark 2003

Going back to this benchmark, after giving the card a bit of a boost, netted this result, up from 3758 Marks at default;

3D03 OC

3D03 OC details

540-320

AquaMark 3            | 9600XT Default | 9600XT 540/320
Triangles Per Second  | 9,277,577      | 9,986,076
FPS                   | 30.82          | 33.17
GFX Score             | 3876           | 4257
CPU Score             | 7513           | 7519
Overall Score         | 30819          | 33172

As you can see, these scores rose substantially as well. Better cooling and a slightly bigger bump in speed would likely break 10M triangles per second, and perhaps hit 34 FPS.
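Expressed as percentages (a quick Python check using only the numbers in the table above), the overclock and the AquaMark 3 gains track each other closely:

```python
# How the 540/320 overclock translates into AquaMark 3 gains,
# using the default-vs-overclocked numbers from the table above.

def gain(after: float, before: float) -> float:
    """Percentage improvement from 'before' to 'after'."""
    return (after / before - 1) * 100

print(f"Core clock:    +{gain(540, 500):.1f}%")              # +8.0%
print(f"Memory clock:  +{gain(320, 300):.1f}%")              # +6.7%
print(f"GFX score:     +{gain(4257, 3876):.1f}%")            # +9.8%
print(f"Overall score: +{gain(33172, 30819):.1f}%")          # +7.6%
print(f"Triangles/sec: +{gain(9_986_076, 9_277_577):.1f}%")  # +7.6%
```

In other words, the overall score scaled almost one-for-one with the clock bump, which is why a little more cooling and headroom could plausibly push it past the 10M triangles-per-second mark.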

Final thoughts;

This card seemed to respond well to overclocking, although for some reason I expected a higher percentage overclock. After reading that 9600 Pros are capable of some outrageous numbers, I thought I might see some of the same here. Perhaps I was limited by the stock cooling, or perhaps I just got a card that doesn’t overclock as well as some others.

It did show that, with some effort, it’s capable of producing some pretty good results.

I was quite impressed by this card, although, again, I would have been all the more impressed if I hadn’t had the issue with the video cutting out intermittently while getting the systems set up for benchmarking.

That issue aside, this card packs a performance-to-price ratio that can’t be ignored.

Bang for your buck, is what we all really want, right? This delivers.

Email Brian
