The GPU Flashback Archive: NVIDIA GeForce FX and the GeForce FX 5200 Card


This week’s GPU Flashback Archive article continues with a look at the NVIDIA GeForce FX series, or GeForce 5 if you prefer to keep things somewhat tidier. In truth, however, the FX series was perhaps one of the least tidy product launches NVIDIA has ever produced. The GeForce FX series spanned two years of graphics card releases, used a total of six different GPU designs, two manufacturing nodes, three bus interfaces and, technically speaking, three different kinds of memory. To keep things reasonably simple, we’ll look at the new features and technologies that the FX series debuted, while keeping our remit in focus with a closer look at the launch flagship GeForce FX 5800 Ultra and the budget GeForce FX 5200, the most popular FX series card with HWBOT members.


NVIDIA GeForce FX: Overview

The NVIDIA FX series replaced the previous generation of GeForce 4 series cards, at least in terms of product launch dates. In reality the two products overlapped during the period towards the end of 2002 and early 2003. Although the GeForce 4 series was a success, bringing the video game industry the hardware needed to make DirectX 8.0 a reality, the GeForce 4 MX series had left a sour taste in the mouths of many tech reviewers and hardcore gamers. Despite being branded as a fourth-generation NVIDIA product, it entirely lacked DX8 compatibility.

The GeForce FX series arrived with a mandate to end any perceived injustice, bringing next-level DX9.0 API support to the masses. For the first time in a while, however, ATI had beaten NVIDIA to the punch: the ATI Radeon 9000 series arrived with DX9.0 and AGP 8x support a few months before NVIDIA could get the FX series out of the door. This was an era when two fabless GPU design houses were vying for ascendancy at the cutting edge of 3D gaming. Unreal Tournament and Doom 3, to give two examples, were among a slew of games finally taking advantage of the new technologies and innovations available to developers. Screen resolutions were moving towards the much-vaunted High-Definition era, while features like anti-aliasing were placing more demands on GPUs than ever.

NVIDIA’s challenge was to try and serve all markets, bringing DX9.0 support, Shader Model 2.0 support (with improved arithmetic precision) and support for OpenGL 2.1 with pixel and vertex shaders to every segment, from entry-level to high-end. To achieve its goals, the company developed a series of GPUs under the NV30 nomenclature. The first cards to arrive on the market were the high-end GeForce FX 5800 Ultra and the GeForce FX 5800, which arrived in February 2003 (later than expected due to manufacturing issues at TSMC, apparently related to the shift from the 150nm to the 130nm node).

The flagship 5800 Ultra came with a $399 USD price tag, similar to that of its main rival, the ATI Radeon 9700 Pro. It was based on the NV30 GPU, with a GPU clock of 500MHz and a graphics memory clock of 500MHz (1GHz effective thanks to the double data rate of its memory). Its more affordable sibling, the GeForce FX 5800, was priced at $299 and had a more modestly clocked GPU at 400MHz, with memory also at 400MHz (800MHz effective). Both cards arrived with dual-slot designs packing fans that even NVIDIA eventually admitted were a little on the loud side.

The GeForce FX 5800 Ultra featured 4 pixel shaders, 2 vertex shaders, 8 Texture Mapping Units (TMUs) and 4 Render Output Units (ROPs). It had 128MB of GDDR2 using a 128-bit bus to offer bandwidth of up to 16GB/sec. It was a flagship in all respects and was designed to keep the company in step with ATI.
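That 16GB/sec figure falls straight out of the bus width and effective memory clock quoted above. Here is a quick back-of-the-envelope sketch of the arithmetic, using only the numbers from this article rather than an official spec sheet:

```python
# Rough peak memory bandwidth estimate for the GeForce FX 5800 Ultra,
# using the figures quoted above (assumptions, not an official spec sheet):
# a 128-bit memory bus and 500MHz GDDR2, i.e. a 1GHz effective transfer rate.

bus_width_bits = 128                 # width of the memory interface
effective_clock = 1_000_000_000      # 500MHz DDR -> 1GHz effective (transfers per second)

bytes_per_transfer = bus_width_bits // 8           # 16 bytes moved per transfer
peak_bandwidth = bytes_per_transfer * effective_clock

print(f"Peak bandwidth: {peak_bandwidth / 1e9:.0f} GB/s")   # -> 16 GB/s
```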

The cards required more power than was available via the AGP slot and were therefore among the first cards to feature additional Molex power sockets. The cooling system used on these cards (featured above) was also considered pretty outlandish at the time. A cooling fan mounted on a copper heatsink that vented out of the PC chassis was a fairly novel sight back in 2003, but what drove people truly mad was the noise output. Back in January 2003, this is what Anand Lal Shimpi had to say:

“To give you an idea of how loud the GeForce FX 5800 Ultra can get we’ve recorded two videos several inches away from a GeForce FX 5800 Ultra and a Radeon 9700 Pro for comparison purposes. The FX 5800 Ultra video starts out in 2D mode and then we start up a game, triggering an increase in clock and fan speed which you’ll definitely be able to hear.”

You can find the original .AVI files from Anandtech here.

The NVIDIA GeForce FX 5800 Ultra managed to achieve its primary goal: outperforming the Radeon 9700 Pro, ATI’s top offering. However, arriving late to market with a card that was noisy as hell and carried a price tag beyond all but the most affluent enthusiasts meant that NVIDIA had to find a way to address the mainstream market. To do this it used the NV34 and NV31 GPUs, which power cards including the GeForce FX 5200, the most popular card with HWBOT enthusiasts.


The Most Popular NVIDIA GeForce FX Card: The GeForce FX 5200

Let’s jump straight into the data and look at the numbers in terms of GeForce FX submissions on HWBOT.

  • GeForce FX 5200 – 20.72%
  • GeForce FX 5500 – 17.03%
  • GeForce FX 5200 SE – 6.52%
  • GeForce FX 5900 XT – 6.20%
  • GeForce FX 5200 Ultra – 5.10%
  • GeForce FX 5950 Ultra – 4.60%
  • GeForce FX 5600 – 4.48%
  • GeForce FX 5900 – 4.41%
  • GeForce FX 5700 – 4.07%
  • GeForce PCX 5300 – 3.16%

It’s immediately apparent that neither the FX 5800 nor the FX 5800 Ultra appears on the list. The most likely reason is that HWBOT didn’t actually exist back in 2003, which means that the data we have comes largely from retrospective overclocking sessions. It may also be that many older systems requiring an AGP card used the FX 5200 for 2D benching sessions simply because it was a widely available and popular card. It was also available passively cooled and required no additional power.

The GeForce FX 5200 retailed for just under $70 and packed the same features found on the more expensive mid-range FX 5600 and high-end FX 5800 cards – DX9 and OpenGL 2.1 support, plus Pixel and Vertex Shader 2.0 support. However, in simple performance metrics, the card was ultimately a disappointment. As a budget solution it offered great dual-screen support and support for the latest APIs, but nothing close to the frame rates that gamers craved.

In terms of high-end GeForce FX cards, the enthusiast-grade FX 5950 Ultra is worth a mention, making an appearance with 4.6% of the submission share. This beast of a card launched in October 2003 and was one of NVIDIA’s first graphics cards to boast a 256-bit memory bus. The heat issues apparent on the previous high-end FX series cards were magnified with the FX 5950 Ultra, giving 3rd-party card vendors an excuse to come up with even more elaborate, and somewhat memorable, cooling designs. Here’s an example from Chaintech which boasts dual fans.


NVIDIA GeForce FX: Record Scores

We can now take a look at some of the highest scores posted on HWBOT using NVIDIA GeForce FX series cards.


Highest GPU Frequency

Although technically speaking GPU frequency (as with CPU frequency) is not a true benchmark, it remains an important metric for many overclockers. Sifting through the database, it appears that the submission with the highest GPU core frequency in the HWBOT database comes from respected retro overclocker macsbeach98 (Australia). He pushed a GeForce FX 5900 Ultra card (based on the updated NV35 GPU) to 753MHz (+67.33%), with graphics memory pumped up to a very impressive 517MHz (+21.65%). His rig also included an Intel Pentium 4 (3.4GHz – Prescott) clocked at 4,459MHz (+31.15%).
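Those percentages are simply the achieved clocks expressed relative to stock. As a minimal sketch, assuming the usual reference clocks (450MHz core and 425MHz memory for the FX 5900 Ultra, 3,400MHz for the 3.4GHz Prescott), the figures work out as follows:

```python
# Minimal sketch of how the overclock percentages above are derived:
# gain = (achieved clock / stock clock - 1) * 100.
# The stock clocks below are assumed reference values, not taken from the submission itself.

def oc_gain(achieved_mhz: float, stock_mhz: float) -> float:
    """Overclock expressed as a percentage above the stock clock."""
    return (achieved_mhz / stock_mhz - 1) * 100

print(f"GPU core:   +{oc_gain(753, 450):.2f}%")    # ~ +67.33%
print(f"GPU memory: +{oc_gain(517, 425):.2f}%")    # ~ +21.65%
print(f"CPU:        +{oc_gain(4459, 3400):.2f}%")  # ~ +31.15%
```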

Here’s an awesome shot of the rig from macsbeach98 which also used an ASUS P4C800-E Deluxe motherboard (Intel 875P chipset) and some freakishly tall LN2 pots.

The 3DMark03 submission was made entirely retrospectively as part of the Old School is Best School contest back in May 2015. You can find the submission from macsbeach98 here on HWBOT: http://hwbot.org/submission/2878654_macsbeach98_3dmark03_geforce_fx_5900_ultra_9747_marks


3DMark2001 SE

The highest 3DMark2001 SE score submitted to HWBOT using an NVIDIA GeForce FX card was made by Slovenian overclocker tiborrr. He pushed a PNY GeForce FX 5900 XT card to 730MHz (+82.50%) on the GPU core and 510MHz (+45.71%) on the graphics memory. With this configuration he managed a hardware first-place score of 39,037 marks. The rig he used also featured an Intel Pentium E6500K ‘Wolfdale’ processor clocked at 5,305MHz (+80.87%).

Here’s a shot of the rig in action:

You can find the submission from tiborrr here on HWBOT: http://hwbot.org/submission/2354773_tiborrr_3dmark2001_se_geforce_fx_5900_xt_39037_marks


Aquamark

In the classic Aquamark benchmark we find tiborrr and his PNY GeForce FX 5900 XT card in first place. His score of 84,069 marks was achieved with the exact same configuration as noted above.

You can find the submission from tiborrr here on HWBOT: http://hwbot.org/submission/2354857_tiborrr_aquamark_geforce_fx_5900_xt_84069_marks

Thanks for joining us for this week’s episode of the GPU Flashback Archive series. Next week we will return with a look at the classic NVIDIA GeForce 6 series of graphics processors and cards.






sdougal (Taiwan) says:

Any fond memories of the GeForce FX series? Or were you way too busy with your Radeon 9000 series cards?
