
Questions about T&L


Racoon

Registered
Joined
Oct 25, 2001
So I've searched Google for hardware T&L and found pages that explain the technology, but I still have some unanswered questions. What does hardware T&L actually do? Does it increase performance and/or visual quality?

Suppose I have a card with hardware T&L and a second card, identical to the first but without hardware T&L. Will the first card outperform the second in all 3D games, or only in games that are specifically written for hardware T&L?

Say a game has been optimized for hardware T&L. What happens if you run it on the second card? Does the CPU have to do all the hard work, or will the game just ignore the T&L code?
 
Great questions.

nVidia rediscovered hardware T&L a few years ago and debuted it on the first generation of GeForce cards. (Rediscovered because the old Commodore Amiga used the same idea back in the early nineties.)

The idea was, as I'm sure you read, to save CPU power by rerouting the per-vertex transform and lighting calculations from the CPU to the video card. nVidia presented it as a breakthrough because it would let game developers devote more CPU clock cycles to AI and physics while the GPU drew the pictures.
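To make "transform and lighting" concrete: the work T&L hardware takes over is, for every vertex in a scene, a 4x4 matrix transform plus a lighting evaluation. Here's a minimal C sketch of that per-vertex math as a CPU would grind through it in software; the type and function names are just illustrative, not from any real API.

Code:
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { float m[4][4]; } Mat4;

/* Transform a point by a 4x4 matrix (w assumed to be 1). */
static Vec3 transform(const Mat4 *m, Vec3 v) {
    Vec3 r;
    r.x = m->m[0][0]*v.x + m->m[0][1]*v.y + m->m[0][2]*v.z + m->m[0][3];
    r.y = m->m[1][0]*v.x + m->m[1][1]*v.y + m->m[1][2]*v.z + m->m[1][3];
    r.z = m->m[2][0]*v.x + m->m[2][1]*v.y + m->m[2][2]*v.z + m->m[2][3];
    return r;
}

/* Simple diffuse lighting: brightness = max(0, normal . light_dir). */
static float diffuse(Vec3 n, Vec3 l) {
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d > 0.0f ? d : 0.0f;
}

int main(void) {
    Mat4 identity = {{{1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1}}};
    Vec3 v = {1.0f, 2.0f, 3.0f};         /* a vertex position */
    Vec3 n = {0.0f, 1.0f, 0.0f};         /* its surface normal */
    Vec3 light = {0.0f, 1.0f, 0.0f};     /* unit vector toward the light */
    Vec3 p = transform(&identity, v);
    printf("pos (%g, %g, %g), brightness %g\n", p.x, p.y, p.z, diffuse(n, light));
    return 0;
}

Multiply that by tens of thousands of vertices every frame and you can see why moving it onto dedicated silicon looked attractive.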

But because games had to be written to take advantage of T&L, the GeForce2 was out before T&L-optimized game titles began to appear. And in that time, Moore's Law had begun to return big gains in CPU power.

So hardware T&L became a performance cheat fewer and fewer gamers needed. If your CPU is three times faster than the game recommends, as most systems are today, doing the T&L calculations on the CPU or the GPU is going to return roughly the same framerates.
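Which also answers your last question: a T&L-optimized game doesn't break on a card without the hardware. Under Direct3D, for instance, the game queries the card's capabilities at startup and, if hardware T&L is missing, requests software vertex processing, meaning the Direct3D runtime does the transform and lighting on the CPU. A rough sketch using the DirectX 8 interfaces of the era (error handling omitted, and choose_vertex_processing is just a name I made up):

Code:
#include <d3d8.h>

/* Returns the vertex-processing flag to pass to CreateDevice. */
DWORD choose_vertex_processing(IDirect3D8 *d3d)
{
    D3DCAPS8 caps;
    IDirect3D8_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
        return D3DCREATE_HARDWARE_VERTEXPROCESSING;  /* GPU does the T&L */
    return D3DCREATE_SOFTWARE_VERTEXPROCESSING;      /* CPU does the T&L */
}

Either way the game's rendering code is identical; only where the math runs changes.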


BHD
 