
AMD superiority secret?

spyware said:
A 3200+ is an s754 with a 1600 HT, and the X2 is s939 with 2000 HT.
I do see what you mean, and Windows and a lot of other programs use both cores, but it's still slow. Maybe with new software it'll speed up, but I still wish I had stayed with Intel.
Even so, it's called a 4200+ but it's surely not comparable to an
Intel at 4.2 GHz (they should be sued for using that rating system, it's very misleading).
Or maybe they mixed up the numbers, 4.2 = 2.4??
And I am running it at 2.5, so 5 GHz Intel, hah.

My 3200+ is an s939. And I believe the 3200+ is slower than a P4 at 4.2 GHz, but I am not sure. Why you'd buy a dual-core chip and think it would be faster at everything is beyond me, though. For the same price you could probably have gotten some wicked San Diego core and OCed it a lot. Beat the pants off that ol' P4.
 
Hehe, wicked San Diego?? You do know that an X2 is basically two San Diego cores? Anyway, I got an Ultra RAM 1 GB kit (CL2223).
Now it's starting to perk up. Gonna break 9000 on 3DMark 05.
[Attached screenshot: newsss.jpg]
 
RETAIL" (boxed) game, just like Kyle says, you'll never see 134fps in Doom3, you will always see a 60fps cap frame rate and the same goes for Quake 4 which uses Doom 3's engine.

Well, that is wrong. Toss in Quake 2, or Quake, which are still used in benchmarks, and FPS jumps up to like 500. So yes, you hack your Doom config file to go above 60 FPS, which I believe you can, and in two years, with video cards being incredibly fast, I can see Doom or any engine built on it hitting 100+ FPS easily.
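To make that 60 fps cap concrete, here's a minimal C sketch (not id Software's actual code; run_game_tic and render_frame are made-up names) of a fixed-tic game loop. The engine advances and renders once per 1/60 s tic and then waits out the rest of the tic, so the visible frame rate can't exceed the tic rate no matter how fast the video card is. "Uncapping" amounts to removing that wait, after which FPS scales with the hardware, like the Quake/Quake 2 numbers above.

```c
/* Minimal sketch, not engine source: a fixed-tic loop that caps rendering
 * at TICRATE frames per second. Uses POSIX nanosleep(). */
#include <stdio.h>
#include <time.h>

#define TICRATE 60                        /* game tics (and frames) per second */
#define TIC_NS  (1000000000L / TICRATE)   /* ~16.67 ms per tic */

static void run_game_tic(void)   { /* advance physics/AI by exactly 1/TICRATE s */ }
static void render_frame(long n) { printf("frame %ld\n", n); }

int main(void)
{
    struct timespec tic = { 0, TIC_NS };

    for (long frame = 0; frame < 3 * TICRATE; frame++) {  /* ~3 seconds */
        run_game_tic();
        render_frame(frame);
        nanosleep(&tic, NULL);  /* wait out the tic: the loop can never run
                                   faster than ~TICRATE iterations per second */
    }
    return 0;
}
```

Drop the nanosleep() and the loop spins as fast as the hardware allows, which is why older engines that aren't tied to a fixed tic post huge FPS numbers in benchmarks.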
 
3DFlyer said:
Hey CoreGamer, there are no flames here. I haven't even got out my fire suit. There's not even a spark, and we sure haven't seen any gas yet either. I probably will get out the fire suit when Presler and the Cedar Mills start showing up in benches here, though. You never know, though; with all the crying, the tears may put the fires out! hahaha
What I meant was that this argument was getting a little... tense. Whenever you put the words AMD and Intel together and try to compare them, it's like striking a match on the box. I have not yet seen an AMD vs Intel thread on this board that did not end up as a flame fest... :(
 
Very good point, but will LCD be the mainstream in two years, or will plasma or one of the other forms of digital display? :D And do they have the same limitations?

I read that they were working on a new form of CRT that is as flat as an LCD, cheaper to make, and has all the good qualities a high-end CRT does.
 
Umm, who cares about monitors? This is about programming, not monitors, AMD memory bandwidth, or anything else.

It's about taking advantage of special instructions. And by the way, whoever said legacy CPUs would be screwed if programs started using special instructions is right, because they are screwed right now by high-end games; there is no way my dual P3 runs FEAR or BF2 or Quake 4. Athlons with Barton cores have SSE2 instructions, so just about anything good enough to run these games has the instruction set, and even if it didn't, it's just a code path that gets used if the CPU is compatible and skipped if it isn't. So basically the biggest complaint is that programmers are lazy and forget what anal instructors teach them, or the instructors sometimes forget to teach it.
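For what it's worth, here's a minimal C sketch (not from any particular game; the function names and the array-add example are made up) of that "used if compatible, skipped if not" pattern: check once at runtime whether the CPU reports SSE2, then pick either the SSE2 path or a plain-C fallback. The check uses GCC/Clang's __builtin_cpu_supports, so it assumes one of those compilers on x86; on 32-bit targets the SSE2 function also has to be built with -msse2.

```c
/* Minimal sketch: runtime dispatch between an SSE2 code path and a scalar
 * fallback, so older CPUs still run the program, just more slowly. */
#include <stdio.h>
#include <emmintrin.h>   /* SSE2 intrinsics */

/* SSE2 path: add doubles two at a time. */
static void add_doubles_sse2(double *dst, const double *a, const double *b, int n)
{
    int i;
    for (i = 0; i + 2 <= n; i += 2) {
        __m128d va = _mm_loadu_pd(a + i);
        __m128d vb = _mm_loadu_pd(b + i);
        _mm_storeu_pd(dst + i, _mm_add_pd(va, vb));
    }
    for (; i < n; i++)            /* leftover element, if n is odd */
        dst[i] = a[i] + b[i];
}

/* Fallback path: plain C, runs on anything. */
static void add_doubles_scalar(double *dst, const double *a, const double *b, int n)
{
    for (int i = 0; i < n; i++)
        dst[i] = a[i] + b[i];
}

int main(void)
{
    double a[4] = { 1, 2, 3, 4 }, b[4] = { 4, 3, 2, 1 }, out[4];

    /* Pick the code path once, based on what the CPU actually supports. */
    if (__builtin_cpu_supports("sse2"))
        add_doubles_sse2(out, a, b, 4);
    else
        add_doubles_scalar(out, a, b, 4);

    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```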
 
Nasgul said:

Yeah! But what about the native refresh rate on monitors? Will it ever be above 100 hertz? So far everyone is going crazy over LCD monitors, which to me are not "that" good because of their 70 to 75 Hz maximum refresh rate.


You are sort of comparing apples to oranges. An LCD monitor does not suffer from refresh problems, but rather from response time. Each pixel is turned on and off as required, and at any given moment the pixel stays lit. You can argue that the color information is sent to an LCD 75 times per second (75 Hz), but in the gap between signals those pixels are still on. A CRT, on the other hand, needs the high refresh rate to keep each pixel illuminated: when the signal is removed, the pixel is no longer charged, in other words it's off. This is what causes the flickering, much like turning a light bulb on and off really fast.
 