News

NVIDIA GeForce RTX 3090 Ti Launch Day

I have been a little "lax" when it comes to posting news about the RTX 3090 Ti launch, and from what I have seen there hasn't been much to say. However, TechPowerUp made this comment in their ASUS Strix 3090 Ti review, and it seems to tie things up nicely.

The ASUS ROG Strix LC GeForce RTX 3090 Ti "Ampere" is being launched today, as a finale for the Ampere graphics architecture and the fastest graphics card from ASUS. The RTX 3090 Ti is designed to be a double-digit percentage faster than even the RTX 3090, and improves in several areas we didn't even think NVIDIA could tap into. It enables all shaders physically present on the GA102, the largest graphics silicon based on Ampere, and pairs it with even faster memory than the one powering the RTX 3090, along with higher clocks and power limits. The RTX 3090 Ti targets those who want to play any of today's games at 4K native resolution with maxed out details, including ray tracing, and even explore higher resolutions such as 8K with the help of the DLSS 8K feature NVIDIA debuted with Ampere.  ~TechPowerUp

This quote from the PR post also helps to fill in the blanks:

With the ASUS STRIX LC, ASUS is betting big on their liquid cooled thermal solution. The card itself is more compact than other RTX 3090 Ti models, because the cooling magic happens in the radiator. A large factory overclock is included, too, and the power limit adjustment range goes up to 525 Watt.

Web Reviews

ASUS GeForce RTX 3090 Ti STRIX Liquid Cooled @ TechPowerUp
MSI GeForce RTX 3090 Ti Suprim X @ TechPowerUp
EVGA GeForce RTX 3090 Ti FTW3 Ultra @ TechPowerUp
ASUS GeForce RTX 3090 Ti TUF Gaming OC @ Guru3D
Zotac GeForce RTX 3090 Ti Amp Extreme @ TechPowerUp

The way I see it, the gist of this release is that to take full advantage of Ampere you need a 4-slot cooler and at least a 1000W PSU. This reminds me of the SLI recommendations from back in the day. What is most interesting is that, despite everything, most sites are reporting only a 10% performance boost, which tells me we are largely CPU bound and that it takes a very special Hardware Enthusiast to take advantage of this GPU.