
Does the Strix or the Giga G1 overclock higher, or is it just the silicon lottery?

We are talking about stock, not 2000+ MHz. As Johan showed, even the GeForce hits that on LN2. The warning would be that if you aren't planning on modding, the Strix will be slower on average than the other brands.
 
Then at stock they all hit ~1500 MHz, as I already said. All other mods are not provided by the manufacturer, and +/- 50 MHz depends on chip quality.
 
But that's the argument: I have seen so many people in forums happy they hit 1530 MHz+ stock on a G1 or EVGA, and very few Strix that go above 1492 MHz. Hence what I said: statistically the Strix will OC lower because of the lower voltage (until you explained that the voltage is the same, simply different targets). Now I have no explanation as to why they statistically OC lower, since it's (supposedly) better built than the other brands. Explain to me in certain terms why (and please don't tell me "silicon lottery", that is just slang for "I have no idea") and I will happily recant.

Again, the same happened with the 970s (which I mentioned twice) and still no one bothered to tell me anything.
 
Then at stock they all hit ~1500 MHz, as I already said. All other mods are not provided by the manufacturer, and +/- 50 MHz depends on chip quality.
Thanks again!

Confirmation bias was a serious explanation for what you are seeing and saying. I believe you saw what you say you saw... but it's such a small sample, and the variance is so large, that it isn't worth staying away from ANY brand (for that reason).

Silicon lottery, man... pick the cheapest card that looks good and has the features/support you need. :)

Edit: silicon lottery is not another term for "I have no clue", lol. Each GPU/CPU is cut from a larger wafer. Some chips are golden, some are junk. This is why, at the same voltage, CPUs/GPUs are stable at different clocks. This is why ASIC values are different. :)
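To put a picture on that, here's a loose sketch in Python. None of it is measured data; the quality spread, base clock, and voltage are assumed values purely for illustration of how dies cut from the same wafer end up holding different clocks at the same voltage:

```python
# Loose sketch of the "silicon lottery": every die on a wafer has slightly
# different quality, so at the SAME voltage each die tops out at a slightly
# different stable clock. All numbers below are made up for illustration.
import random

random.seed(0)

# Pretend each die's quality is a multiplier; better quality -> higher stable clock.
dies = [random.gauss(1.0, 0.03) for _ in range(8)]  # 8 dies cut from one wafer

STOCK_VOLTAGE = 1.212  # assumed value, identical for every card

def max_stable_clock(quality, base_mhz=1500):
    """Hypothetical max stable clock a die holds at stock voltage."""
    return base_mhz * quality

for i, q in enumerate(dies):
    print(f"die {i}: ~{max_stable_clock(q):.0f} MHz stable at {STOCK_VOLTAGE} V")
# Same design, same wafer, same voltage: different ceilings. That spread is
# all "silicon lottery" means.
```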
 
Depends on what you do. Gaming-wise it's 1-5 fps depending on the game; in image and video processing it's a lot more if your encoding uses the nVidia-specific language. Benchmark-wise it's "a small step for man, a giant leap for computing" :D
 
Depends on the application. A couple/few percent for gaming, though. Plenty of reviews show that overclocking core clocks by xx% yields x% gains; it is typically not 1:1, with the lesser value being the real-world benefit...

I dislike using fps because of matters of scale. For example, 5 fps over 20 means a lot, a 25% increase, while 5 fps over 100 means 5% and relatively nothing to a gamer ;)

... another reason why it's silly, imo, for the average user to shun a brand over a few MHz of potentially better overclocks. ;)
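To put rough numbers on the fps-vs-percent point, here is a quick back-of-the-envelope sketch; the 0.7 fps-per-clock scaling factor is an assumption for illustration, not a measurement, since games rarely scale 1:1 with core clock:

```python
# Absolute fps deltas are misleading, and a small clock bump is only a small
# percentage. The 0.7 scaling is an assumed illustrative value.

def pct_gain(new, old):
    """Percentage improvement of new over old."""
    return (new - old) / old * 100

# The same +5 fps means very different things:
print(pct_gain(25, 20))    # 25.0% -> huge for a game that was struggling
print(pct_gain(105, 100))  # 5.0%  -> barely noticeable

# A 20 MHz difference on a ~1500 MHz core:
clock_pct = pct_gain(1520, 1500)   # ~1.3% more core clock
fps_pct = clock_pct * 0.7          # assumed scaling, not 1:1
print(f"{clock_pct:.1f}% clock -> roughly {fps_pct:.1f}% fps")
```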
 
Confirmation bias was a serious explanation for what you are seeing and saying. I believe you saw what you say you saw... but it's such a small sample, and the variance is so large, that it isn't worth staying away from ANY brand (for that reason).

Edit: silicon lottery is not another term for "I have no clue", lol. Each GPU/CPU is cut from a larger wafer. Some chips are golden, some are junk. This is why, at the same voltage, CPUs/GPUs are stable at different clocks. This is why ASIC values are different. :)

If you really believed what I wrote you wouldn't be arguing and asking for confirmation. And I always spoke from the AVERAGE point of view, or statistically, as I later found out was the word I meant. So statistically the majority of the G1 and EVGA chips are from a better cut than the Strix; that is the only logical explanation, correct? Still strange that the other components are so good and the chip is of worse quality.

I would assume then, if that is the case, that it is a very good reason to still pick the G1 or EVGA over the Strix UNLESS you are going to use LN2? Offer me a better explanation and, like I said, I will recant and offer apologies for this monumental loss of time.
 
Now I have no explanation as to why they statistically OC lower, since it's (supposedly) better built than the other brands. Explain to me in certain terms why (and please don't tell me "silicon lottery", that is just slang for "I have no idea") and I will happily recant.

How is this thread still churning out posts?

"Silicon lottery" == "luck of the draw"

1500 is the average. With any dataset, there will be outliers (winners and losers of the silicon lottery), but they shouldn't be taken as an indicator of anything really.

Unless you've seen overclocking results (with the exact same test methods, mind you) from at least ... 100 of each brand, I don't think anything meaningful can be extrapolated.

And I always spoke from the AVERAGE point of view, or statistically, as I later found out was the word I meant. So statistically the majority of the G1 and EVGA chips are from a better cut than the Strix

An average of ten chips does not yield useful data.
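For what it's worth, here is a minimal sketch of that sample-size point, with made-up numbers: both "brands" draw their cards from one and the same clock distribution, yet a ten-card comparison still shows an apparent gap.

```python
# Minimal sketch of why ten cards per brand is not useful data: both "brands"
# draw from the SAME distribution, yet small samples show an apparent brand gap.
# Mean/spread values are assumptions for illustration only.
import random

random.seed(42)

def average_oc(n, mean_mhz=1500, spread_mhz=40):
    """Average max overclock of n cards, all from one shared distribution."""
    return sum(random.gauss(mean_mhz, spread_mhz) for _ in range(n)) / n

for n in (10, 100):
    # Repeat the "compare two brands" experiment many times and average the gap.
    gaps = [abs(average_oc(n) - average_oc(n)) for _ in range(1000)]
    print(f"{n:3d} cards per brand -> typical chance gap: {sum(gaps)/len(gaps):.0f} MHz")
```

Under those assumed numbers, ten cards per brand produce a chance gap in the low teens of MHz, while a hundred per brand shrink it to a few MHz. The exact figures don't matter; the point is that small samples manufacture "brand differences" out of pure noise.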
 
If you really believed what I wrote you wouldn't be arguing and asking for confirmation. And I always spoke from the AVERAGE point of view, or statistically, as I later found out was the word I meant. So statistically the majority of the G1 and EVGA chips are from a better cut than the Strix; that is the only logical explanation, correct? Still strange that the other components are so good and the chip is of worse quality.

I would assume then, if that is the case, that it is a very good reason to still pick the G1 or EVGA over the Strix UNLESS you are going to use LN2? Offer me a better explanation and, like I said, I will recant and offer apologies for this monumental loss of time.
Just for reference, repeating the same thing over and over when we know otherwise doesn't make it true.

There may be reasons to pick the other cards over the Strix, but average overclocks sure as hell isn't one of them. ;)
 
Just for reference, repeating the same thing over and over when we know otherwise doesn't make it true.

There may be reasons to pick the other cards over the Strix, but average overclocks sure as hell isn't one of them. ;)

I could say the same to you; I still have no proof of what you said being true. The only one that explained things well, and not just "luck of the draw", was Woomack. Like you, I can only go from what I see, and that is that the G1/EVGA/Zotac have higher overclocks/perform better at stock than the Strix. That would be my reason to tell everyone "don't buy Strix".

You pull a review and immediately it becomes a holy bible. How do I know the results weren't altered somehow to sell more Strix over the rest, like Intel used to do against AMD? Yes, I sound paranoid, but it is a valid point nonetheless.

I NEED a valid explanation that doesn't require luck; the argument is too much like the stupidity of religion for me to ever accept it.
 
If you say so...

1. The reviews we BOTH referenced showed what Woomack and I said.
2. Woomack also implied the luck of the draw (oops, he straight said it). In essence he said very little that was different from what I said (voltage limit the same, overclocks in the ballpark, no reason to shun Asus, etc.).
3. It's you who are referencing forums and trends but haven't sourced your info.
4. You can't rely on a review out of one side of your mouth, then debunk it out of the other when it suits your purposes.

This is an exercise in insanity at this point. Either prove your point with lots of data, or just stop. I'm not saying you have to believe us, but debunking what those in the know (a few of us) are saying, while calling me out for a lack of info when you have provided nothing but support for my assertions, has run its course. If the thread keeps going this way it will be closed.

So, the ball is in your court, kenrou... we are all anxiously waiting for a large enough dataset to prove your point... but in the end, an average difference of 10-20 MHz is about a percent difference in game. Again, is it really worth it to shun a brand over that POTENTIAL? It's just not worth it, Kenrou. :)
 
I think the truth of the matter is the average user doesn't give two turds either way. BTW, if you're encoding that close to the edge, you're just asking for problems like unplayable media. Just saying.
 
I have already said what forums I used earlier on (oc.com, oc.net, ROG), not that you paid any attention. Just scroll and watch the threads. YOU haven't explained anything other than "luck of the draw" and "silicon lottery". Woomack did, in detail. I want to believe you, but at the same time... "luck of the draw"...

C'mon, scientific reasons, man. Bad binning, shoddy construction, bad programming, something that makes sense other than blind stupid luck...

If I follow your reasoning, there's no point in buying anything other than a stock GeForce if you're not using LN2?

- - - Updated - - -

@Johan : Then why do we reply "get this one" or "get that one"? One is better than the other, right?

I used OCCT for errors; I'm 100% stable since the damned board underclocks to 1480 MHz 😊
 
I asked for links and data... onus is on you, not me.

I linked to reviews... the same ones that you did, and more... Silicon lottery plays a role; we all said so. I'm not sure what more you want. It has all been explained by multiple people from multiple angles. We painted a Picasso, but all you see is a child's watercolor...

If you think a reference card will last with LN2 and support the same clocks, well, you have a lot more to learn about how a video card works.

Then why do we reply "get this one" or "get that one"? One is better than the other, right?
I don't typically... I have frequently said it doesn't matter. But it's situation dependent. ;)
 
@EarthDog : And I asked for an explanation with hard facts, and only Woomack gave me that. And you're still misquoting; I said "if you're NOT using LN2".
 
I really do not understand this need for everyone to provide in-depth, detailed information. Someone did provide said information, and that information is yours to do with as you will.

In the end, is pure overclocking ability the only thing you want out of a GPU?
I am pretty sure that the reasoning for the difference in performance of GPUs can be easily pinned on simple environmental issues. Things like humidity, how the card has been handled during its lifetime, crap like that can influence anything.

And of course, blind stupid luck.

Sometimes, the answer really and truly is, **** happens, no matter how much we try and find a scientific answer.

I just spent 15 minutes googling along the lines of "why do two identical processors perform differently" and could not come up with anything perfectly along those lines.

Sorry for being a bit terse, but blah >.<
 
You're actually making more sense than EarthDog at this point, but that might just be lack of sleep 😝 I'm anal like that; I've always loved to know the why of everything. Probably why I have such a big problem with religion: blind faith in something that man invented comes across as completely ridiculous to me.

I'm going to end this then, still half unanswered, and leave it at that; hopefully it won't start another argument.

Thank you for your time, gentlemen (and gentlewomen, if there were any) 😊
 
Lol, now you are on me because I didn't give facts? You are a piece of work! Kenrou, at this point it doesn't matter who gave them; they are on the table. If you want people to believe your assertion, bring something to the table!

Oops, bricked on the LN2 thing. If people don't care about noise, yeah, reference is fine for moderate overclocking. Where people have concerns with reference cards is pushing them to the limits: their power bits aren't made for that sustained, and the blower-style cooler they typically come with is pretty damn loud. Non-reference models have more robust VRMs to support higher overclocks, but there is a point of diminishing returns on that for ambient clocking. Akin to having a Lambo but only being allowed to drive it 65 mph... the parts are there, it's a beast, but that doesn't mean you can reach 180 where we drive... you need a track (better cooling/BIOS mods) for that. Otherwise, it's just a Lambo begging for more.

@nzk, I already mentioned why the same GPU/CPU performs differently at the same voltage. It's just the quality of the silicon from the wafer that is used on the board. There are better pieces and worse. Now, a technical reason for that, I don't know; I'm not an EE and don't know silicon. I may be able to get Dolk in here to answer that question, it's way above our heads, lol!
 