Is LN2 on GPU really worth the trouble?

Author: Pieter-Jan Plaisier

Extreme overclocking is not only about pouring as much liquid nitrogen as possible to get the best results; it’s also about knowing whether it’s worth using another 50L batch of LN2 or whether the hardware has already been maxed out completely. In addition, for the not-so-extreme overclockers who don’t have the resources to acquire liquid nitrogen every weekend, there’s the question of how much switching to LN2 will actually improve the score. The HWBOT database seems to have an answer to these questions …

First of all, being an extreme overclocker (and extreme tech enthusiast) myself, I have to say the following: “YES, it is always interesting to test LN2”. Whether it’s a high-end Radeon 5870 or a low-end GeForce GT 220 … liquid nitrogen can give you a better view of the scalability of your hardware.

Now, to continue finding an answer to the questions posed in the first paragraph. The basic question we pose here is: “To what extent does changing the temperature of the hardware configuration yield higher performance?”. So, for each video card, we have two variables: the independent variable ‘temperature’ and the dependent variable ‘performance’ or, to make it a bit easier to understand, we manipulate the temperature and measure the performance. The more the performance increases with a given decrease in temperature, the more interesting it is to use liquid nitrogen to cool down the hardware. All agree?

Now, as I prefer to follow a scientific methodology as much as possible, I would normally assemble a couple of test systems and do the testing myself. However, the main issue we face here is the lack of time and finances for this kind of semi-academic purpose, so instead I’m using the HWBOT database, which counts over 400,000 results. Because time is a very restraining factor, I have also limited the measurement tool (performance) to 3DMark06 with a single card, as well as the number of temperature categories:

# +°C CPU / +°C GPU = positive temperatures for both CPU and GPU, meaning air- or water-cooled configurations
# -°C CPU / +°C GPU = subzero temperature for the CPU, but still an air/water-cooled GPU
# -°C CPU / -°C GPU = subzero configuration, so both CPU and GPU below 0°C

As for the hardware, I have limited the number of cards to the top 8 cards from both ATI and Nvidia, excluding those samples that don’t have enough data to fill all the temperature categories (e.g. the GeForce GTX 470). Another important side note is that I have no control over the test platforms used, which means that individual results may vary because different platforms have been used. Do note, however, that this doesn’t make the charts invalid, as we are not focusing on how the cards perform in comparison to each other, but on how each card performs across temperature categories.
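To give an idea of how this bucketing and filtering could work, here is a minimal sketch. It assumes a hypothetical export of the submissions as dicts with ‘card’, ‘cpu_temp’, ‘gpu_temp’ and ‘score’ fields; that is purely illustrative and not HWBOT’s actual export format.

```python
# Sketch only: assign each submission to a temperature category and keep only
# cards that have data in all three categories.
from collections import defaultdict

CATEGORIES = ("+CPU/+GPU", "-CPU/+GPU", "-CPU/-GPU")

def temp_category(cpu_temp, gpu_temp):
    """Assign a submission to one of the three temperature categories (°C)."""
    if cpu_temp >= 0 and gpu_temp >= 0:
        return "+CPU/+GPU"      # air or water on both CPU and GPU
    if cpu_temp < 0 and gpu_temp >= 0:
        return "-CPU/+GPU"      # subzero CPU, air/water-cooled GPU
    return "-CPU/-GPU"          # both CPU and GPU below 0°C

def group_by_card(submissions):
    """Bucket 3DMark06 scores per card and per temperature category."""
    grouped = defaultdict(lambda: defaultdict(list))
    for s in submissions:
        cat = temp_category(s["cpu_temp"], s["gpu_temp"])
        grouped[s["card"]][cat].append(s["score"])
    # Drop cards that lack data in any category (the GTX 470 case mentioned above).
    return {card: dict(cats) for card, cats in grouped.items()
            if set(CATEGORIES) <= cats.keys()}
```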

temp-06.png

In this first chart, which is solely used to display what we’re measuring here, we can see that in general it’s always interesting to decrease the temperature of a given platform, as the performance always increases when we change cooling methods. A trained eye, however, will instantly notice that the gain differs from one piece of hardware to another. A simple example is the difference in scaling between the Radeon 5970 and the GeForce GTX 275.

To find out which video card is the most interesting one to blow your LN2 on, we have to alter our data a bit to see the percentage increase over air/water-cooled configurations.

temp-perc.png

The blue line indicates the base score when using either an air or a water cooling mechanism on both CPU and GPU. The red line indicates the performance increase when changing the CPU to a subzero cooling mechanism (e.g. phase-change or liquid helium). The variation between cards can indicate how CPU-dependent a video card really is or, in other words, to what extent your CPU is bottlenecking the performance of your video card. In general, the more high-end VGA cards will benefit the most from a higher-clocked CPU.

The green line indicates the performance increase going from an air/water-cooled configuration to a subzero-cooled configuration or, to make it clearer, from FULL air/water to FULL subzero and NOT just changing the GPU configuration to subzero. The reason is quite simple: we assume that whoever is testing (or rather: submitting scores with) the GPU under LN2 will also have the CPU subzero to maximize the scores. A very important assumption!
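As a rough idea of how the percentages behind these lines could be derived, the small sketch below builds on the grouping from earlier: each category is expressed as a gain over the average air/water score of the same card. Whether to use the average or the best score per category is a judgment call; this sketch simply takes the mean.

```python
from statistics import mean

def relative_gains(card_categories):
    """card_categories: {category: [scores]} for one card, e.g. from group_by_card()."""
    baseline = mean(card_categories["+CPU/+GPU"])      # full air/water = 0% gain
    return {cat: (mean(scores) / baseline - 1.0) * 100.0
            for cat, scores in card_categories.items()}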

The last step is to figure out whether or not it’s interesting to cool the GPU with liquid nitrogen. To find an answer to this question, we measure the difference between two test environments: only the CPU subzero versus both CPU and GPU subzero. The difference between the two indicates the performance increase when changing the cooling method of the GPU.
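Expressed in terms of the earlier (hypothetical) sketch, this ‘worthiness’ figure is simply the gap between the two subzero categories:

```python
def gpu_subzero_gain(card_categories):
    """Percentage points gained by taking the GPU subzero on top of a subzero CPU."""
    gains = relative_gains(card_categories)
    return gains["-CPU/-GPU"] - gains["-CPU/+GPU"]
```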

scaling.png

data.png

Without too much effort, I believe it’s fairly easy to conclude that the Nvidia GeForce GTX 275 is quite an interesting choice to use LN2 with. At the bottom of the list, we see that the Radeon 5970 is not really that interesting, since the performance difference between CPU-only and CPU/GPU subzero is a mere 3%. A very big side note to make here is that this doesn’t mean the card wouldn’t be able to scale more. It’s important to understand that, since the data has been pulled from the HWBOT database, we make the assumption that if a card doesn’t scale properly with subzero cooling, people will not attempt LN2 on it. As we have no control over the behavior of the overclockers, it’s not possible to determine whether the absence of LN2 results means there’s no performance scaling or simply no interest because it’s too difficult or too dangerous.

In general, however, we can see that the difference between two high-end models (relative to their release date) is most definitely noticeable. Something else noteworthy is the fact that most of the cards in the upper region of the ‘worthiness’ list are Nvidia cards, which would indicate that (in general) Nvidia scales better with cold. Therefore, I’ve also compiled an ATI versus Nvidia chart, averaging the ATI and Nvidia performance results split up per temperature category.
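The averaging itself is straightforward; as a sketch under the same assumptions as before (the vendor lookup table here is illustrative, not the actual card list used for the chart):

```python
from statistics import mean
from collections import defaultdict

VENDOR = {"Radeon HD 5970": "ATI", "GeForce GTX 275": "Nvidia"}  # …and so on

def vendor_averages(grouped):
    """grouped: {card: {category: [scores]}} as produced by group_by_card()."""
    per_vendor = defaultdict(lambda: defaultdict(list))
    for card, cats in grouped.items():
        vendor = VENDOR.get(card)
        if vendor is None:
            continue                      # skip cards outside the two top-8 lists
        for cat, scores in cats.items():
            per_vendor[vendor][cat].append(mean(scores))
    return {v: {cat: mean(vals) for cat, vals in cats.items()}
            for v, cats in per_vendor.items()}
```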

avg1.png

Although a load of assumptions and side notes have to be attached to this chart, it seems that, in general, Nvidia’s latest graphics cards scale better with temperature than ATI’s counterparts do.

If I had the time and the financial resources, I could perform similar research using different benchmarks and different hardware, and even expand the cooling methods or vary the platforms. Maybe we could even integrate this kind of information on the hardware information pages, but sadly the current situation doesn’t allow us to do this. To end on a positive note, though: the HWBOT database clearly has a lot more to offer than just raw scores; it can give us better insight into the behavior of processors, video cards and, recently, also mainboards and memory when pushed to their absolute limits in terms of operating frequency and temperature.
