
Better Blood Splatters: introduction of the PPU

Anyone know what kinda interface it will use? Will it use standard PCI or PCI-E? This might be a good reason for me to go PCI-E.
 
They are planning both PCI and PCIe versions. I would imagine the PCIe version would be better (faster connection to the rest of the system), but at this point the details are so thin that we don't really know.
 
Sentential said:
Anyone know what kinda interface it will use? Will it use standard PCI or PCI-E? This might be a good reason for me to go PCI-E.

if you look at the pic of the card, you will see a PCI connection on one side and a PCI-E connection on the other (?!)
 
Sjaak said:
if you look at the pic of the card, you will see a PCI connection on one side and a PCI-E connection on the other (?!)
That's just a dev version. I think there will be a final version just for PCI and a final version just for PCIe.
 
wow, this card STARTS with an external molex connector. Took video cards years to get there. If this draws as much power as a video card, then even non-SLI users will need 500W PSUs. And SLI users...
 
Stoanhart said:
wow, this card STARTS with an external molex connector. Took video cards years to get there. If this draws as much power as a video card, then even non-SLI users will need 500W PSUs. And SLI users...

Good that you mention it. What will the power usage of such a card be? I doubt today's systems can deliver another 50W on the 12v line :rolleyes:
 
Maybe the power company should just start feeding 12v DC through the power lines, and any appliance that doesn't need it can convert back to 110v AC :)

EDIT: Wow, it has an SLI bridge connector, too! Already thinking of dual PPUs?! Maybe they want to see if one succeeds first. Also, based on the length of the connector, is this going to need a 16x-size PCIe slot? If you want two, where do the graphics go?

Maybe future motherboards (like 3rd-gen PCIe or something) will be all 16x slots, since the chipsets will probably be able to support that many lanes.
 
No, it doesn't. One of the connectors on the card is a PCIe connector (the small one) and the other is a regular PCI connector. This prototype supports both interfaces.
 
Stoanhart said:
Maybe the power company should just start feeding 12v DC through the power lines, and any appliance that doesn't need it can convert back to 110v AC :)

EDIT: Wow, it has an SLI bridge connector, too! Already thinking of dual PPUs?! Maybe they want to see if one succeeds first. Also, based on the length of the connector, is this going to need a 16x-size PCIe slot? If you want two, where do the graphics go?

Maybe future motherboards (like 3rd-gen PCIe or something) will be all 16x slots, since the chipsets will probably be able to support that many lanes.
You can't change DC to AC.
DC is not good for transmission.
 
Yes and No.

I have an inverter in my car. I can hook 110v AC appliances to my 12v cigarette lighter output with it, so yes, you can convert. They come in various sizes to support different wattages. The Radio Shack here sells them.

But true, DC sucks for transmission.

EDIT: Lol, I get it now. When they said PCIe on one side and PCI on the other, I thought they meant one side of the pins was PCI and the other side PCIe. I didn't realise you had to flip the card!
 
Stoanhart said:
EDIT: Lol, I get it now. When they said PCIe on one side and PCI on the other, I thought they meant one side of the pins was PCI and the other side PCIe. I didn't realise you had to flip the card!

It was fairly new to me, too, lol
 
I haven't read through the new posts in the thread, but this was originally written up a few days ago, and has been sitting in Notepad ever since because of the database errors when I tried to post it...

~~~~~~~~~
I think that keeping such a device OFF the GPU would be a good thing. I probably don't speak for the average OCer, but I tend to keep my systems around for a while, upgrading them as much as I can. Unfortunately, with things like CPUs changing so quickly, my box is behind the times, with a mobo that doesn't even support T-Breds (even if it did, the fastest it could handle are the 333MHz ones, which is the max supported FSB). This leaves me with noticeably slower-than-usual framerates, since the CPU is seriously bottlenecking everything when it comes to the physics.

I could just upgrade the CPU ($150), but that would entail an upgrade of the mobo (another $100), of the RAM (since I'd need to get RAM that's fast enough to work with the new chips: $100), and a new PSU that can handle the increased power draw ($75). All in all, paying $425 just for faster physics isn't my thing.

Having a PPU available on its own card would be great, even at $150, since that's a LOT cheaper than the other option, and I don't even really need a faster CPU for anything except physics. Inside the video card would work, but Macklin is right when pointing out the problems that this entails. While it probably would be a little bit faster, I don't think the graphics card is the place for the PPU...

JigPu
 
I now have a new piece of hardware to watch as it unfolds :drool:

Stoanhart said:
Yes and No.

I have an inverter in my car. I can hook 110v AC appliances to my 12v cigarette lighter output with it, so yes, you can convert. They come in various sizes to support different wattages. The Radio Shack here sells them.

But true, DC sucks for transmission.

It does. The electric grid loses a few percent of its power just in transmission, and that's at 40,000 to 500,000 volts AC, with every transformer along the way running at ~95-98% efficiency. For 12v DC you would have to do a LOT of line swapping, and be prepared to pay a lot more for your utilities too.
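The voltage argument is just Ohm's law: for a fixed load, line current scales as 1/V, so resistive loss (I²R) scales as 1/V². A quick back-of-the-envelope sketch, where the 1 kW load and 1 ohm of line resistance are made-up illustrative numbers, not real grid figures:

```python
# Why 12v DC over the grid is a non-starter: I^2 * R line loss comparison.
# The load and line resistance below are assumed, illustrative values.

def line_loss_watts(power_w: float, volts: float, line_ohms: float) -> float:
    """Resistive loss in the line for a load drawing power_w at volts."""
    current = power_w / volts          # I = P / V
    return current ** 2 * line_ohms   # P_loss = I^2 * R

load = 1000.0      # a 1 kW household load
resistance = 1.0   # assumed total line resistance, in ohms

for volts in (12.0, 120.0, 120_000.0):
    loss = line_loss_watts(load, volts, resistance)
    print(f"{volts:>9.0f} V -> {loss:12.6f} W lost in the line")
```

At 12v the line would waste several kilowatts just to deliver one, while at transmission voltages the loss is negligible, which is the whole reason the grid steps voltage up with AC transformers.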
 
I'm excited about PPUs. I certainly do not think that CPUs are going to handle the demands of realistic physics like water effects, smoke, fog, falling leaves, moving grass, exploding bodies...well, you know. I would certainly prefer the chip to be separate from the CPU/graphics card: it would aid overclocking, it would save us from upgrading CPUs/graphics cards even more often than we already do, and current components draw enough power and emit enough heat as it is.

I suppose the problem is how the PPU communicates data to the CPU/memory/graphics cards. Even though PCIe would help, I fear standard motherboards are going to get bogged down by all that data traffic before the real power of a PPU can be unleashed.

Also, PPUs would be great in single-player games, but there might be some limitations for multiplayer games, where everyone would have to experience the same physics effects for it to be worth it. Just compare Half-Life 2 physics to Counter-Strike: Source, where in the latter all you get are a few rolling barrels that you can't get near, because the server cannot handle that much information (or at least cannot transmit it to the clients quickly enough).

Still exciting though. :)
 
Well, I think the multiplayer issue could be solved. I'm sure if the PPU catches on, there will be a sort of interface language, like a new extension of DirectX (DirectPhysX). A common language allows for identical output regardless of hardware. If you look at two identical screenshots, one from nVidia and one from ATI, aside from a few pixels, they are almost dead-on identical. And that is with completely different hardware. So if you could refine the PPU to the point where you know the output is the same regardless of whose hardware it ran on, the multiplayer issue would be solved. You'd have a situation like this:

Wrong:
-Player throws grenade at pile of rubble.
-Server calculates physics for each of several hundred pieces of rubble
-Server distributes info to all players.

Right:
-Player throws grenade at pile of rubble.
-Server tells all other players where grenade landed
-All players use their physx cards to work out the chaos themselves - bandwidth saved.
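The "Right" flow hinges on determinism: if every client seeds its physics from the same event data the server broadcast, they all compute identical rubble without the server sending per-piece positions. A toy sketch of that idea; the `scatter_debris` function and its parameters are hypothetical, not any real PhysX API, and real engines additionally have to guarantee bit-identical floating point across different hardware:

```python
# Toy model of client-side deterministic physics: the server broadcasts only
# the grenade position plus a seed, and every client derives the same debris.
import math
import random

def scatter_debris(x: float, y: float, seed: int, pieces: int = 200):
    """Deterministically scatter debris from a grenade that landed at (x, y)."""
    rng = random.Random(seed)  # same seed -> same sequence on every client
    debris = []
    for _ in range(pieces):
        angle = rng.uniform(0.0, 2.0 * math.pi)   # scatter direction
        dist = rng.uniform(0.5, 3.0)              # how far this piece flies
        debris.append((x + dist * math.cos(angle), y + dist * math.sin(angle)))
    return debris

# Two "clients" given the same broadcast event agree exactly:
client_a = scatter_debris(10.0, 4.0, seed=1234)
client_b = scatter_debris(10.0, 4.0, seed=1234)
assert client_a == client_b
```

The bandwidth saving is the point: the server sends one event (two coordinates and a seed) instead of hundreds of per-piece positions every tick.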
 