
Nvidia delays Fermi to March 2010; AMD to launch new GPU in January-February

Wow, so April/May/June for the new king of cards. This is getting funny. I don't know why I always want a new card though. Everything works great on this 260. It's a sickness.
 
I don't know why I always want a new card though. Everything works great on this 260. It's a sickness.

Yeah, that's becoming a huge issue. Personally I don't really see much of a reason to upgrade from my 4850, which I downgraded to a year ago, even though a part of me really wants to buy something new.

I think this is why we are seeing a huge push for CUDA and the like; as we see more and more console ports, I'd imagine it gets harder to sell cards based only on performance.
 
Doesn't look good for NVIDIA; they are falling too far behind.

While NVIDIA probably has half of their staff working on finishing Fermi and the other half working on the generation after Fermi, AMD must have all their staff working on the 6900 series.

I was hoping NVIDIA would launch it soon. I want to get a DX11 card, but prices are too high right now because AMD has a monopoly on it.
 
There are some high-profile HPC systems coming out in the near future, and Nvidia is rumored to be partnering with some of those massive projects with their new Fermi Tesla (multi-million-dollar, multi-petaflop machines).

If Nvidia is promising its initial supplies to these projects, it's expected that the consumer-level (GeForce) cards get pushed way back.

Nvidia has much bigger aspirations for Fermi than merely the gaming marketplace.

I believe most of Nvidia's GPU revenue is from Tesla anyhow.

I hope Nvidia comes out with a new integrated GPU soon based on the new core.
 
I think this is why we are seeing a huge push for CUDA and the like; as we see more and more console ports, I'd imagine it gets harder to sell cards based only on performance.

I have a feeling that these GPU-based calculations will help games down the line a bit, once companies and engine builders figure out how to offload more than proprietary physics technologies. Sure, it's both harder and in some cases secondary to framerate and just looking good, but I think it could have a significant impact depending on what game you are playing.

So I have a feeling Fermi could be a game changer for PC games (and maybe trickle into consoles), but I think it will take a little longer to really make an impact in something other than the framerate department.
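A sketch, not from the thread: the "offload" these posts describe is data parallelism, i.e. the same update applied independently to many elements, which is exactly what GPUs (and CUDA) are built for. Here's a toy CPU-side illustration in Python of the shape of such a workload (the particle model and all names are hypothetical):

```python
# Toy "physics step": update many independent particles.
# The per-particle work has no cross-particle dependencies,
# so on a GPU each particle could map to its own thread.

def step(particle, dt=0.1, gravity=-9.8):
    """Advance one (position, velocity) pair by dt."""
    pos, vel = particle
    vel = vel + gravity * dt          # integrate velocity
    pos = pos + vel * dt              # integrate position
    return (pos, vel)

particles = [(0.0, 0.0)] * 10_000     # many identical, independent items

# On the CPU this is a plain map; a GPU kernel is the same map
# executed by thousands of threads at once.
particles = list(map(step, particles))

print(particles[0])
```

The point the thread is circling is that only workloads with this map-like structure (physics, video, image filters) benefit; anything with per-step dependencies does not.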
 
What a shame. Too bad it's getting pushed back so much. It really would be nice to see them, but then again I'm not in the market for a new card just yet. Though I would like to get a DX11 card, I'll wait till both companies come out with their cards before deciding. By then, hopefully, a few more games will be out that support DX11.
 
I'm really excited for GPU accelerated web pages.

Me too! I also can't wait for GPU-accelerated word processors, Windows Messenger, Skype, Entourage, Adobe apps, etc.

So many people buy into this GPU acceleration rubbish it's actually quite painful. Most applications do not need to work massively parallel, because user actions are not massively parallel.

Also, what's with GPU-accelerated Flash?! I was not aware that most CPUs were struggling to run it... oh wait, they weren't. This rubbish is all because Nvidia is unable to make a CPU to compete with Intel; that's all it is.
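For what it's worth, the "user actions are not massively parallel" point is essentially Amdahl's law: if only a fraction p of a task can run in parallel, adding n cores caps the speedup at 1 / ((1 - p) + p/n). A minimal sketch (the workload fractions are made-up illustrations, not measurements):

```python
def amdahl_speedup(p, n):
    """Overall speedup when a fraction p of the work is parallel
    and runs on n processors; the (1 - p) serial part is untouched."""
    return 1.0 / ((1.0 - p) + p / n)

# A hypothetical video decoder that is 95% parallel scales well...
print(amdahl_speedup(0.95, 512))   # ~19x

# ...but a hypothetical word processor that is 10% parallel barely
# moves, no matter how many GPU cores you throw at it.
print(amdahl_speedup(0.10, 512))   # ~1.11x
```

This is why both sides of the argument can be right: GPU acceleration pays off for video and rendering, and does almost nothing for latency-bound desktop apps.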
 
Me too! I also can't wait for GPU-accelerated word processors, Windows Messenger, Skype, Entourage, Adobe apps, etc.

So many people buy into this GPU acceleration rubbish it's actually quite painful. Most applications do not need to work massively parallel, because user actions are not massively parallel.

Also, what's with GPU-accelerated Flash?! I was not aware that most CPUs were struggling to run it... oh wait, they weren't. This rubbish is all because Nvidia is unable to make a CPU to compete with Intel; that's all it is.

Anything involving video?... AKA most websites in a few years.

Smartphones and ultraportables struggle with Flash, AKA the new computer market that everyone is investing in.

No one cares about PCs anymore. Cool, you have an i7/i5; grats, you're wasting 50% of your thread resources 99% of the time doing everyday tasks on a CPU that was meant for servers. Wasting power is not cool anymore.

But I digress.
 
No one cares about PCs anymore. Cool, you have an i7/i5; grats, you're wasting 50% of your thread resources 99% of the time doing everyday tasks on a CPU that was meant for servers. Wasting power is not cool anymore.
Owning an i5/i7 on OCF has an unwritten rule, you know: YOU MUST JOIN THE ROSETTA TEAM! :comp:
 
Doesn't look good for NVIDIA; they are falling too far behind.

While NVIDIA probably has half of their staff working on finishing Fermi and the other half working on the generation after Fermi, AMD must have all their staff working on the 6900 series.

I was hoping NVIDIA would launch it soon. I want to get a DX11 card, but prices are too high right now because AMD has a monopoly on it.

Wrong. Totally!

AMD has not revised its MSRP once for the 5800 Series cards since introduction.

If you want to place blame on someone, at least pick the right target.
 
Owning an i5/i7 on OCF has an unwritten rule, you know: YOU MUST JOIN THE ROSETTA TEAM!

LOL I wasn't aware of this rule??

I don't know about anyone else's i7, but mine really bogs down while looking at OCF. I think I need quadfire 5870s to help out :D
 
Wrong. Totally!

AMD has not revised its MSRP once for the 5800 Series cards since introduction.

If you want to place blame on someone, at least pick the right target.
Did he mention AMD jacked up the MSRP??

They DO have a monopoly on the high-end GPU market, which is part of the reason you are seeing some inflated prices. And to be honest, $20 on a $399-MSRP card is not that bad at all.

:shrug:
 
I am not sure there is a need yet. I started playing Dragon Age recently with a second monitor showing CPU/GPU usage and temps, and I have yet to hit 80% CPU or GPU load, and I am on an AMD 920 and a 4890 at 1920x1200 with all high textures.

I would love to play some games that would actually tax this system, but the only game I know of for sure that will do it is Crysis, and I was not impressed with that game at all. Played it once; that was enough.
 
Anything involving video?... AKA most websites in a few years.

Smartphones and ultraportables struggle with Flash, AKA the new computer market that everyone is investing in.

No one cares about PCs anymore. Cool, you have an i7/i5; grats, you're wasting 50% of your thread resources 99% of the time doing everyday tasks on a CPU that was meant for servers. Wasting power is not cool anymore.

But I digress.

I was talking about laptops and desktops; ultra-portables are a different ball game altogether.

Wasting power? What nonsense. My i7 clocks down when I don't need the power, but when I do (which is easily an hour-plus a day, every day), it is there. Meant for server use? That's a sweeping statement right there, and for the things I do on this machine a C2Q simply doesn't cut it. Plus, I hope you don't mean I should use an even more inefficient, fat GPU to do what an i7 can do just as well.

Ultra-portables are making headway because they are new and hip among consumers. Of those I've known who have bought UMPCs, they've since bought desktops/laptops because they don't like using eye-strain-o-vision. Small gadgets are handy little things for two-minute viral videos that you want to show to your mates, but very little else. Of course UMPCs and smartphones are taking off compared to PCs; the PC market has been going for years and years, while UMPCs and smartphones are brand spanking new for home consumers.
 
Did he mention AMD jacked up the MSRP??

They DO have a monopoly on the high-end GPU market, which is part of the reason you are seeing some inflated prices. And to be honest, $20 on a $399-MSRP card is not that bad at all.

:shrug:

No, it's not bad at all. The case in the UK is that the retailers are jacking up the price, not AMD. This is not due to having more powerful cards, but stock levels.
 
Pricing is all on the retailers' end.
This is not really a delay, because Nvidia said Q1 2010 last fall, and March is in Q1.
I'm anxious though; I want prices to fall, and I want a new card, be it ATI or NV.
 
Did he mention AMD jacked up the MSRP??

They DO have a monopoly on the high-end GPU market, which is part of the reason you are seeing some inflated prices. And to be honest, $20 on a $399-MSRP card is not that bad at all.

:shrug:

Inflated prices are consumer demand: when there's a lack of cards, prices go up.
 