Today Nvidia is pulling the wraps off the GK110-based GeForce GTX Titan, a single-GPU card that is expected to easily capture the title of Baddest Ass GPU in the world when benchmarks are released this Thursday, February 21st. The Titan is Nvidia's "Big Kepler" GPU, with double the transistors and almost double the CUDA cores of the mid-range GK104 chip found in the company's current flagship, the GeForce GTX 680. Though it runs at a lower clock speed in stock trim, it should still offer a sizable performance improvement over the already capable GTX 680.
The GK110 has been in use for over a year in the supercomputing world, most notably in the Tesla K20X GPUs that power Oak Ridge National Laboratory's Titan supercomputer, currently the fastest supercomputer in the world. Though benchmarks for the Titan haven't been posted just yet, Nvidia tells us it should be neck-and-neck with the dual-GPU GeForce GTX 690, currently the fastest single graphics card in existence. In other words, the Titan should be almost as fast as dual GTX 680s in SLI, but from a single GPU. Oh, and guess what? It's actually quieter than both of them, even under load (yes, we've heard it).
So let's talk specs. The GTX Titan features a whopping 2,688 CUDA cores in 14 SMX units, 6GB of GDDR5 memory running at 6GHz, a core clock speed of 836MHz with a boost clock of 876MHz, 7.1 billion transistors, a 250W TDP, a 384-bit memory interface, and a single-fan cooling unit. Power is provided by an eight-pin and a six-pin connector, and the price is a surprising 1,000 euros, so hopefully around £900 (around the same as the GTX 690). The card is 10.5 inches long and comes wrapped in an aluminium shell, as opposed to the magnesium alloy shell used by the GTX 690, and as with that card, Nvidia is not allowing any changes by add-in board manufacturers, so all Titans will look just like the one shown here.
Perhaps even more impressive than the card's raw specs is how quiet it is, thanks to technology Nvidia is calling GPU Boost 2.0. When we first sat down to check out the card at an undisclosed location in San Francisco, we were shocked to find that a Titan was running a mere three feet from where we sat without making any noise whatsoever, and that another machine just a few feet to our right was running Battlefield 3 on three cards in SLI, yet was also as quiet as a whisper. The secret isn't just how Nvidia designed the single-fan cooling apparatus, though that obviously plays a huge role, but how it designed the software to work with the card. Put simply, you can now set the maximum temperature you want the card to reach, and it will throttle itself to stay at that temperature through a combination of fan speed, voltage, and clock speed adjustments. This lets you tweak the card to run any way you want - overclocked with a loud fan, stock speeds and relatively quiet, or underclocked and totally silent. You can adjust all these variables in real time, too, through software provided by the add-in card manufacturers - most likely Asus and EVGA if the precedent set by the GTX 690 is followed.
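The idea behind a temperature-target throttle like this can be sketched as a simple control loop. To be clear, the following is purely our illustration of the concept - every name, step size, and priority in it is invented, and it bears no relation to Nvidia's actual GPU Boost 2.0 implementation:

```python
# Illustrative sketch only: a toy temperature-target control loop in the
# spirit of GPU Boost 2.0. All names, step sizes, and the "fan first,
# clocks second" policy are hypothetical, not Nvidia's algorithm.

def step_throttle(temp_c, target_c, clock_mhz, fan_pct,
                  min_clock=700, max_clock=876, max_fan=100):
    """Nudge fan speed first, then clock speed, to hold target_c."""
    error = temp_c - target_c
    if error > 0:                                    # too hot
        if fan_pct < max_fan:
            fan_pct = min(max_fan, fan_pct + 5)      # spin the fan up first
        else:
            clock_mhz = max(min_clock, clock_mhz - 13)  # then drop clocks
    elif error < 0:                                  # thermal headroom left
        if clock_mhz < max_clock:
            clock_mhz = min(max_clock, clock_mhz + 13)  # boost clocks
        elif fan_pct > 30:
            fan_pct = max(30, fan_pct - 5)           # then quiet the fan
    return clock_mhz, fan_pct

# Toy demo: starting hot with a low fan, the loop raises fan speed before
# it ever touches the clocks.
clock, fan = 876, 40
for temp in (94, 92, 91, 88):                        # simulated readings
    clock, fan = step_throttle(temp, 80, clock, fan)
```

The appeal of this kind of scheme is exactly what Nvidia describes: by swapping which knob the loop prefers (fan speed vs. clocks), the same mechanism yields either a loud overclocked card or a silent underclocked one.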
Aside from the new temperature, voltage, and clock speed controls (yes, you can over-volt the card), a new feature exclusive to the Titan lets you overclock your display's refresh rate when Vsync is enabled, so you could theoretically achieve 80fps even on a 60Hz panel; the same goes for 120Hz panels. Nvidia told us the GPU simply "lies" to the panel and tells it to run at a higher rate than it's rated for, with the caveat that not all panels will allow this - the only way to find out is to try.
Another interesting feature: the colour and status of the illuminated GeForce GTX logo on the side of the card can now be controlled through software. We didn't get to see it in action, but Nvidia tells us you'll be able to tweak it to perform a range of stunts, including changing colour based on temperature, "breathing," or simply glowing a different colour to match your rig's mood lighting.
We never thought Nvidia would do this, simply because tradition dictates mild, incremental advancements in its GPU lineup from year to year, especially when it's not introducing a new architecture. Plus, since Nvidia already held a narrow advantage over AMD's HD 7970 (at least in terms of power consumption), it stood to reason that we'd see a modest 20 percent performance boost in a GTX 780, or something along those lines. Nobody expected a card that promises almost double the performance. In fact, we said as much in our 2013 Tech Preview article in the Holiday 2012 issue of the magazine, writing of the prospect of the GK110 showing up in gaming trim: "It would be totally uncharacteristic of Nvidia to double performance in the next generation, particularly when the competition doesn't warrant it." It looks like we were wrong, and we couldn't be happier about it.
So why would Nvidia release this card, especially when it's already widely considered the leader in the single-GPU space? We have two theories. The first is that Nvidia wanted a definitive lead, not just a lead in heat output, noise, and energy consumption. We all know the GTX 680 consumes less power and produces less heat and noise than the HD 7970, but now that AMD's GHz Edition of that card is circulating, the two are so close in benchmark performance that most tests end in either a tie or a slight AMD advantage. That obviously didn't sit well with Nvidia, especially with all this horsepower lying around in the form of the GK110. Our guess is that Nvidia decided to end the pussy-footing around with the mid-range GK104 and release the hounds once and for all.
The second reason for the Titan's deployment is the sudden interest in the new generation of small form factor rigs such as the Falcon Northwest Tiki, iBuyPower Revolt, Digital Storm Bolt, and others. The GTX 690 was a bit too long to fit inside these cases, but the Titan fits perfectly, being about an inch shorter while roughly the same width. This will let system builders construct tiny boxes with no-compromise graphics performance, and will also let home builders stuff the Titan into most ATX cases without any issues, provided they survive the sticker shock and can find one in stock.
That's our quick glimpse of the new GeForce GTX Titan. Check back in two days for full benchmarks in both single-card and SLI configs. We can hardly wait to show them to you.