
Who in his right mind can post this?


ravaneli


OK, I needed to share my thoughts after reading this, so excuse me if you find it unnecessary.

But you tell me if that nVidia guy is retarded or shameless! He pairs an i7 with a 250 and wants it to show superiority? A fully loaded 250 can probably put no more than 30% load on an E8400, let alone an i7, especially at 1920x1200. Of course, IF THERE IS A DIFFERENCE AT ALL, it will show up when you bottleneck the CPU, not the video card. And he has the nerve to publish specific results. Do you think the results would be the same if they paired those CPUs with 2 or even 3 295s in SLI? And in a game like Crysis that can take advantage of 4 cores...

I don't even have an i7, but the way this test was carried out and the results displayed is shocking to me, given that the poster is nVidia itself...

Or am I missing something? Anyone agree or disagree?
 
It's as much marketing bs as Intel saying Core i7 'increases gaming performance by 80%.' Just cherry-picked benchmarks and hardware setups to make a point. When one company that's competing with another talks about the other's products, it's a big neon 'IGNORE' sign to me.
 
Yeah, nothing new: with any single video card the i7 won't be faster than Core 2. The only way you'd see a difference is in tri SLI/CF or quad SLI/CF. Good thing I didn't buy an i7 for SLI or CF; it would do me no good as I don't game as much as I used to. Rosetta runs 24/7 on this box, and that's all that matters. :)
 
But you tell me if that nVidia guy is retarded or shameless! He pairs an i7 with a 250 and wants it to show superiority? A fully loaded 250 can probably put no more than 30% load on an E8400, let alone an i7, especially at 1920x1200.
Exactly what that retarded guy was trying to say: rather than spending the extra bucks on i7/X58, get an E8400/GTX 260.
 
nVidia is absolutely right up to a point. Several recent benchmarks have shown that the CPU stops being a factor after a certain point - it all hinges on the video. What percentage of people, even gamers, have more than two video cards? And once you're past 60 FPS (72 if your monitor can deliver it) it's all numbers anyway - your eyes can't tell the difference.
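
A quick back-of-the-envelope on the refresh point (the 60 Hz cap and the fps figures here are purely illustrative numbers, not from any benchmark):

```python
# Frames rendered beyond the monitor's refresh rate are never displayed,
# so past that cap the extra fps really is "all numbers".

REFRESH_HZ = 60  # a typical LCD panel (made-up, illustrative figure)

for rendered_fps in (45, 60, 90, 120):
    shown_fps = min(rendered_fps, REFRESH_HZ)  # the monitor caps what you see
    frame_time = 1000.0 / rendered_fps         # ms spent rendering each frame
    print(f"{rendered_fps:>3} fps rendered ({frame_time:.1f} ms/frame) "
          f"-> {shown_fps} fps shown")
```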


Three GTX295's? $2400 worth of video cards?!?!? You're kidding, right ...?
 
Hmm, so they say we need a more powerful GPU to play games at higher settings? I wonder who managed to work that out -.-

I bought my CPU for quicker transcoding and CPU tasks, not to get 1 fps more in Flight Simulator...
 
Hmm, so they say we need a more powerful GPU to play games at higher settings? I wonder who managed to work that out -.-

I bought my CPU for quicker transcoding and CPU tasks, not to get 1 fps more in Flight Simulator...

+1 lol. Also, there are games out there that I have noticed a huge increase in after getting my i7. GTA4 (yes, it's a horrible port; no trashing need follow this statement) and Far Cry 2 have been a lot better for me. But yeah, the HT encoding benefits are what I wanted to start utilizing, and that's the reason for my i7 purchase and not a QX-whatever. nVidia just wants to sell more overpriced 295s and 285s to the masses of technologically challenged idiots out there, is all.

Okay, I'm done bashing. Have a good day all lol.
 
Pretty normal article to me.

Fairly obvious, though, yet I still see people recommending seriously overkill setups for gaming. Gotta have enough GPU before you worry about CPU. Gotta have enough monitor before you worry about GPU.

If you are stuck on a 1024x768 LCD monitor, then the i7 will probably do more for you than a GPU upgrade, because you can OC it up to 4 GHz and try to feed the GPU enough data.

If you are running a 1024x768 LCD, though, it is time to stop buying games and save up for a monitor. Even an old 19" CRT would be a massive improvement. :)
 
It's as much marketing bs as Intel saying Core i7 'increases gaming performance by 80%.' Just cherry-picked benchmarks and hardware setups to make a point. When one company that's competing with another talks about the other's products, it's a big neon 'IGNORE' sign to me.

Actually, there is an 80% gain in performance.

The tests were done at 640x480 low settings so the card wouldn't be the bottleneck, though -.-

(equals lame testing)
 
There are some reasons to buy an i7 (not the 965) for gaming.

It is a real quad-core CPU with HT, and if you just bought the i7 920 (I bought it for 280 EUR), you can be sure it will run all the games you want without bottlenecking for at least 3 years, while I am sure the E8400 won't be able to keep pace.

Correct me if I am wrong =)

(BTW, the i7-965 scored 1 fps more in the nVidia benches, most probably because the background apps don't run on the cores used by the game at that moment.)
 
Exactly what that retarded guy was trying to say: rather than spending the extra bucks on i7/X58, get an E8400/GTX 260.

You missed my point entirely, man. This is a simple logical argument. You can call it mathematical, if you want to look at the performance as a function. Everybody knows this; you don't need to carry out any tests to show that if you put the same cap on two different functions they will yield the same result.

But they wanted to create a little piece of propaganda that's convincing to ignorant people. Bottom line, everything in the article, while true, is taken out of context and entirely misleading.
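
To make that concrete, here's a minimal sketch of the 'same cap' argument; the GPU and CPU fps numbers below are invented, only the shape of the math matters:

```python
# If the GPU is the cap, two very different CPUs deliver identical fps.
# All figures are invented placeholders, not measurements.

def delivered_fps(cpu_fps, gpu_fps):
    """The slower component sets the frame rate you actually see."""
    return min(cpu_fps, gpu_fps)

GPU_FPS = 45  # say, one mid-range card at 1920x1200 (made up)

for cpu, cpu_fps in (("E8400", 110), ("i7", 198)):
    print(f"{cpu}: {delivered_fps(cpu_fps, GPU_FPS)} fps")

# Both lines print 45 fps, even though one CPU is 80% faster than the other.
```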
 
But they wanted to create a little piece of propaganda that's convincing to ignorant people. Bottom line, everything in the article, while true, is taken out of context and entirely misleading.
I don't think there's anything misleading about the article. To me it says that for gaming, you're generally not going to see a big increase in performance when upgrading from C2D/C2Q to i7. How is that misleading or taking things out of context?
 
Actually, there is an 80% gain in performance.

The tests were done at 640x480 low settings so the card wouldn't be the bottleneck, though -.-

(equals lame testing)

I didn't say there was NOT an 80% increase; I said it was marketing bs with cherry-picked benchmarks and hardware, aka 'lame testing.' I'm not sure what the purpose of your reply was.
 
I didn't say there was NOT an 80% increase; I said it was marketing bs with cherry-picked benchmarks and hardware, aka 'lame testing.' I'm not sure what the purpose of your reply was.

I was just pointing out how they managed to get that 80%; nothing to do with you =)
 
Omg, guess what: an AMD Phenom II X4 with the same GPU scored similarly! I guess AMD also implemented this revolutionary 80%+ gaming increase on their CPUs too! :beer:

Haha, that article is a tad misleading. Once again Intel makes a claim out of nowhere. nVidia fought back, but they were a tad biased too. They should have explained: in games like Fallout 3, the CPU won't give you much of a lead; the majority of the work there is on the GPU (like in most games). However, in a game like GTA IV, the CPU plays just as large a role as the GPU.

They shouldn't have said i7s make performance go up. Because the reality is, right now a C2D/C2Q can beat an i7 in some games, or come very close. And also, an 80% increase with the i7 in games compared to what?! It's like, yeah, if I moved from a P4 to an i7 I would see a huge % increase. But if I'm moving from a C2D Extreme, I may see no difference at all in games.
 
They shouldn't have said i7s make performance go up. Because the reality is, right now a C2D/C2Q can beat an i7 in some games, or come very close. And also, an 80% increase with the i7 in games compared to what?! It's like, yeah, if I moved from a P4 to an i7 I would see a huge % increase. But if I'm moving from a C2D Extreme, I may see no difference at all in games.

I haven't seen benches where an i7 gets beaten by a C2D or C2Q :-/
 
Again, any comparison of CPUs should bottleneck the CPUs, not the GPU. Bottlenecking the GPU tells you nothing about the CPU. NOTHING. One CPU may be 80% or 800% faster; there's no way for you to know. That's why the nVidia article is worthless as information and is pure propaganda. In that regard Intel's claim is far more accurate, since the 640x480 resolution shifts the weight onto the CPU. If you think this resolution is unrealistic, go SLI two 295s and you get the same result at 1920x1200.
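
Here's a rough toy model of what I mean, assuming the GPU is fill-rate bound (its fps falls in proportion to pixels drawn) while the CPU's per-frame work stays fixed; every number below is invented:

```python
# Crude fill-rate model: GPU fps falls in proportion to pixels drawn,
# while CPU-side work per frame stays constant. All numbers invented.

def gpu_fps(base_fps, base_pixels, pixels):
    """Scale a baseline GPU frame rate by how many pixels are drawn."""
    return base_fps * base_pixels / pixels

BASE_PIXELS = 1920 * 1200
CPUS = (("C2D", 110), ("i7", 198))  # i7 assumed 80% faster, echoing Intel's claim

for res, pixels in (("1920x1200", 1920 * 1200), ("640x480", 640 * 480)):
    cap = gpu_fps(45, BASE_PIXELS, pixels)
    for cpu, cpu_fps in CPUS:
        print(f"{res} {cpu}: {min(cpu_fps, cap):.0f} fps")

# 1920x1200: both CPUs show 45 fps -- the GPU hides the gap entirely.
# 640x480: the GPU cap jumps to ~338 fps, so you see 110 vs 198, i.e. +80%.
```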

In a perfect world Intel would make it obvious that the gains COULD BE up to 80%, and that this depends on other bottlenecks in your system, but I can accept that as a reasonable assumption.

I don't think I can make my point any clearer. I know people will always disagree. I have actually owned only nVidia cards, and I don't see that changing soon, so please don't dump me in the ATI fanboy basket...
 
I think you're the one missing the point. This article is not meant to be a pure CPU performance comparison. From the opening paragraph, it's obvious that the discussion will center around gaming performance. Intel's claim of an 80% gaming performance improvement with Core i7 is preposterous given the hardware and resolutions the vast majority of gamers are running. This article only serves to illustrate that point. Further, I don't see any wild or misleading claims made by the nVidia spokesperson. If you disagree, please list them and tell us why. If anything, the title of the article is a bit misleading, but you can blame that on the author who's trying to generate discussion, not the nVidia spokesperson.
 