- Joined
- Nov 12, 2002
- Location
- Rootstown, OH
Ivy, he was asking about dry ice or liquid nitrogen cooling on it... He wants to know what sort of numbers it can put up if it's super-cooled.
I like it, although I am not a fan of reference-style cards, so I think I will stick with the MSI 6970 I want until I can see a better-priced non-reference card available, which I know won't be until sometime late next year.
I think you will see the opposite of this, actually. From what I heard, AMD sent the chips to Asus, MSI, etc. early and told them they didn't have to launch with the reference design, that they could pretty much do what they wanted.
Now whether or not they decide to do this is up to them, but I don't think you will see that many reference designs this release.
I am wondering if anyone has a pair of them to test scaling in CrossFire. It would be very interesting to see what two of these brutes would do in some competition benchmarks.
I think they are a little AMD-biased. They pride themselves on "benchmarking practice," yet they turned on PhysX for the Batman testing. What better way to kill framerates on the Nvidia side?
Anyway, the point they make: do you need PhysX nowadays, when Intel is delivering CPUs with that much power? In the end the result is what counts, and if Nvidia wants to hurt itself by overloading its own GPU... well, I guess they just wanted to make a point. My view is that the GPU in next-gen graphics has so much rendering to handle that it would be happy to hand the physics job over to the CPU, and CPUs truly can do it. Nvidia's claim that only a GPU can do it is just not true; I don't believe it. In games, the CPU usually handles AI and physics; that's what it is there for, in theory. Many years ago (Core 2 Duo and older), CPUs had a much bigger impact on gaming performance because they had more work to do; nowadays games are mostly GPU-limited (especially at high resolutions).
Havok should be able to do the same thing; if CPU usage goes through the roof, that headroom can be used by Havok as well, although I'm truly no expert on this stuff.
They really should make something non-proprietary. It took a long time before Nvidia even partially gave up its proprietary rights to every single piece of it, which kept devs from implementing anything like that on Radeon, as far as I know. However, AMD/ATI's statement was always pretty clear: they don't feel any urgent need for something like that, so they usually gave pretty low support to devs trying to implement it. I remember Nvidia also had other licensed software, and when Microsoft moved from an Nvidia GPU to an ATI GPU in the Xbox, they had to pay a license fee just to be able to run that kind of software on a Radeon. Nvidia really was mean about stuff like this; I'm glad they finally tried to open things up a bit, giving devs more access to implement stuff on Radeon.
I also don't know what you mean by "blow away" in particular; you might tell me, I'm interested. In a game like Shogun Wars with pretty high physics demands (an insane number of units), a CPU can still run it successfully. Now, what kind of physics do you have in mind that can't be run on any CPU?
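To put a toy example behind the claim that a plain CPU can push physics for thousands of units per frame, here is a minimal sketch: one frame of gravity integration for 10,000 "units," with the work split across CPU threads. This is only an illustration I made up, not PhysX, Havok, or any real engine's code; a real engine would use native code, SIMD, and proper threading rather than Python.

```python
# Toy CPU physics sketch: semi-implicit Euler integration of gravity
# for many "units" at once. Purely illustrative -- not real engine code.
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.81       # m/s^2
DT = 1.0 / 60.0       # one frame at 60 FPS

def step(particle):
    """Advance one particle (x, y, vx, vy) by one frame of gravity."""
    x, y, vx, vy = particle
    vy += GRAVITY * DT            # update velocity first (semi-implicit)
    return (x + vx * DT, y + vy * DT, vx, vy)

def step_all(particles, workers=4):
    """Update every particle, splitting the work across worker threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(step, particles))

# 10,000 "units", all starting at rest at the origin.
world = [(0.0, 0.0, 0.0, 0.0)] * 10_000
for _ in range(60):               # simulate one second of free fall
    world = step_all(world)
```

Even in interpreted Python this runs in a blink, which is the whole point: the per-unit math is trivial, and a native engine does the same structure orders of magnitude faster.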
Finally, I want the better hardware to win, not the one that is a wonder of software implementations. Besides, devs are always free to create open-source physics for Radeon, but apparently hardly anyone ever does, and I don't know why. Maybe devs nowadays just want everything handed to them, already warmed up and served on a silver platter. Seriously, where is the true dev skill? Are we now totally ruled and bound by the harsh rules of money and economics? So we keep moving further away from PCs because consoles simply bring in more bucks, and we lose a lot of quality because many devs don't even try to tune a game for good PC hardware anymore? Well, who knows.
It's surely good that Nvidia is trying to pull devs toward its side, but it really has to work together with devs, and somehow with AMD's Radeon as well, not just boost itself. Even as competitors, working together in many respects brings more progress in software tuning.