
Comparing GPUs: ATI 6770 and Nvidia 550 Ti


techiemon

Member
Joined
Jan 29, 2007
I currently have a Radeon 6770, but I'm having some issues with it and the store suggested replacing it with an Nvidia 550 Ti. However, after looking at this site, http://www.gpureview.com/show_cards.php?card1=647&card2=658 I feel I am being misled into believing that the Nvidia card is better. I see that the GFLOPS figure is almost half that of the Radeon card, and that the shader core count is also much lower on the Nvidia card. I know numbers don't tell the whole story, so I am posting here to see what you guys think. The GPU will be used mostly for crunching PrimeGrid and Collatz.

Can you guys recommend a better card, either ATI or Nvidia, that would do work units faster but is fairly close in price to these two cards?
 
You're a bit out in the weeds in this forum. If you want benching data, re-post in the benching activity forum (the top forum).

You also might want to check the HWBot and/or wPrime websites and see which cards are performing well. Then go to some retail sites and see how the prices stack up.

In general, Radeon cards are much better than Nvidia cards in integer computations, but Nvidia cards trounce Radeons in floating-point speed.

The wPrime tests I'm familiar with are heavy on floating-point operations, but since I don't have either card right now, that's just speculation, not data.

Which is what you need. Edit: OK, Collatz conjecture is a d/c (distributed computing) project; I'd thought it was a bench test. Still, check out the bench team and bench sites. You might try the Collatz forum, if they have one.

It is tough to keep a straight face when reading the Collatz conjecture, tbh. :rofl:

Good luck.
 
Hi Adak, thanks, I took your advice and re-posted in the benchmark forum. I have never posted there; didn't even know it existed until now, haha. I usually post in the other forums.

I was told by Asus today that the NV card has about 7x more power to fold than the ATI card does. But I am wondering about the effect on other applications of the lower integer performance. Can you tell me what these two things actually do? I mean integer computations and floating point: what are they, and what are they used for?

Why do you laugh at Collatz? Do you think what they are trying to find is a joke? ;-) haha.
 
For example, in Folding@Home, the cores (which actually do the computations) use single-precision floating-point numbers, like 2.72345, rather than whole integers. As a result, ATI/Radeon GPUs suck eggs there, and Nvidia's shine.

In other projects, the cores use integers, and Nvidia sucks eggs compared to ATI/Radeon. For decent performance from a GPU in a project, match the card to the project. My bet would be that Collatz works with integers much more than floating-point numbers, simply because of the nature of the conjecture.
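
For anyone who hasn't seen it, the Collatz rule itself is trivially stated, and it is nothing but integer work (halving and 3n + 1). A minimal Python sketch of the rule, just for illustration, not the project's actual GPU code:

```python
# Minimal sketch of the Collatz rule: pure integer operations,
# which is why an integer-strong GPU should suit the project.
def collatz_steps(n):
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))  # 27 takes 111 steps to fall to 1
```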

The FPU (floating-point unit) in a CPU is different from the ALU (arithmetic logic unit). Some chips have a more extensive FPU; others don't. For instance, the CPU in one rig I have has only one FPU for every two ALUs (and that CPU has 16 cores).
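
To make the integer/floating-point split concrete, here is a toy illustration (plain Python on the CPU, just to show the distinction being discussed, not GPU code):

```python
# Toy contrast between integer (ALU) and floating-point (FPU) work.
a, b = 7, 2
print(a // b)     # integer division: 3, the fractional part is discarded
print(a % b)      # integer remainder: 1
print(a / b)      # floating-point division: 3.5
print(0.1 + 0.2)  # floats are approximations: 0.30000000000000004
```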

I can't see any possible reason to explore the Collatz conjecture any further. Can you think of just ONE possible benefit of having it proven, or disproved?

I can't.

Also, since it applies to every natural number, and the natural numbers go on into infinity, how could it ever be absolutely proven?

It's math (and thus science) until you get to numbers with about 20 digits. After that it's just silly, since the answer will not matter in any case.
 
Whoops, I'm sorry; I moved your post to General GPUs and didn't know you had been directed to the benching forum. Sorry about that. I did respond, though. :)
 
Hi Adak, you are right, but how about PrimeGrid then? Will the 550 Ti or 560 be better for those work units? I will probably stop doing Collatz, as I feel you are right that it is a waste of my GPU time.

I would like to continue running PrimeGrid and maybe some others once I get the new GPU installed.

Hi Hokiealumnus, I had posted here and then Adak suggested that I move it to benchmarks. Oh well, I think both threads have some useful info, so... Anyway, yes, I got your reply there also, thank you!
 
I'm unfamiliar with PrimeGrid and most d/c projects. I've done a good stretch with SETI, Rosetta, World Community Grid, and of course Folding@Home.

I happened to stumble across the Collatz Conjecture project when I was looking for a math-oriented project.

The way I see it, most of these projects are big -- REALLY BIG. Anything that you're going to really put work into, needs to somehow, someway, be worth it. If the "juice" isn't worth the "squeezing", then why bother picking that lemon in the first place?

The best description I've read of the distribution of prime numbers is that there is no pattern: they sprout up like weeds in the yard, both where you do and where you do not expect them.
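
A quick sketch shows what that "weeds" description means: list some small primes and the gaps between them (simple trial division here; PrimeGrid itself uses far more sophisticated tests):

```python
# Trial division is fine at this tiny scale, just to show the
# irregular spacing of primes.
def is_prime(n):
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

primes = [n for n in range(2, 60) if is_prime(n)]
gaps = [b - a for a, b in zip(primes, primes[1:])]
print(primes)  # [2, 3, 5, 7, 11, 13, ...]
print(gaps)    # the gaps jump around: [1, 2, 2, 4, 2, 4, 2, 4, 6, ...]
```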

But like I say, I don't know anything about the PrimeGrid project.
 
My current favorite ATI d/c project is Milkyway@home; it has more potential to be useful than the others, IMO.
(I actually run Bitcoin; I find money for myself to be more useful than a map of the Milky Way.)
 
You are both probably right. I have many projects installed but moved to PrimeGrid, as I do feel it could be useful in building better computers in the future, which in turn would help the other projects run faster and produce results more quickly. But I may consider dropping both of them. Do SETI and WCG also have GPU projects, then?

I don't understand your meaning, Bob. Bitcoin is a grid? I only found that it is a system similar to PayPal, so to speak. What do you mean?
 
Read through the Bitcoin wiki for in-depth information.
In short, you get paid in bitcoins for running the calculations that help secure the Bitcoin network. It's all integer stuff, so ATI knocks the socks off Nvidia.
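
For a feel of what those calculations look like: mining repeatedly double-SHA-256 hashes a candidate block header with a changing nonce until the result falls below a target. A rough sketch, where the header bytes and target are made up for illustration and the real protocol has more to it:

```python
import hashlib
import struct

# Rough sketch of the mining loop: hash a candidate header with a
# varying nonce until the double-SHA-256 digest is below a target.
header_base = b"example block header bytes"  # placeholder, not a real header
target = 2 ** 240  # a very easy made-up target so the demo finishes fast

nonce = 0
while True:
    candidate = header_base + struct.pack("<I", nonce)
    digest = hashlib.sha256(hashlib.sha256(candidate).digest()).digest()
    if int.from_bytes(digest, "big") < target:
        break
    nonce += 1

print("found nonce:", nonce)
```

Note it is all hashing and integer comparisons, which is the "integer stuff" ATI cards were good at.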
 

Just do the calculations to ensure your power bill increase is less than what you get from bitcoin mining, lest you go into the negative. :cool:
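
The break-even arithmetic is simple enough to script; every number below (card draw, electricity rate, daily income) is a placeholder to replace with your own:

```python
# Break-even check with placeholder numbers -- substitute your own.
gpu_watts = 150          # extra draw while mining (assumption)
price_per_kwh = 0.12     # electricity rate in $/kWh (assumption)
income_per_day = 1.00    # what the card earns per day (assumption)

cost_per_day = gpu_watts / 1000 * 24 * price_per_kwh  # ~ $0.43/day here
print(f"power cost: ${cost_per_day:.2f}/day, income: ${income_per_day:.2f}/day")
print("profitable" if income_per_day > cost_per_day else "losing money")
```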
 
No kidding! haha


Bob, how much do you actually make with Bitcoin a month?
 
When I remember to keep the GPU running? Something like a buck a day or so; a buck seventy or so if I run the 5830 as well.
 
Seems like the electric bill would be higher than what you are bringing in. But I suppose with other projects you use the same amount of energy and get nothing in return, apart from helping out the world or whatever. I guess to each their own; run whatever you like.
 
Oh, he's active in those too. Rosie is his choice, iirc. AMD/ATI cards are good for Bitcoin but pretty bad at Folding/Rosie/SETI.

The new generation should be much better if they code for the new architecture, though.
 