Review: Gigabyte GF3000 GeForce3 video board

By Jeff Field

For hardcore gamers, there is little question as to which chipset they want their video board based on: the Nvidia GeForce3. Because most of its existing boards already deliver more than 60 frames per second at most resolutions, Nvidia says it has reached a point where it can begin to focus on image quality and other features instead of raw speed. Today, I'll take a look at how the first GPU developed under that new focus holds up under Linux, by testing Gigabyte's GV-GF3000.
The board
Known primarily as a motherboard manufacturer, Gigabyte also has a line of video cards based on Nvidia GPUs (graphics processing units). Gigabyte seems to have a variant for every chipset Nvidia has put out since the TNT, and Nvidia's latest, the GeForce3, is no exception. The GV-GF3000 is a reference board (that is, it is based on the reference design Nvidia provides). It supports a variety of new features introduced by this chipset; unfortunately, due to driver issues, not all of them work under Linux.

Features
nfiniteFX engine: This feature of the GeForce3 comes from the chip's programmable vertex and pixel processors. The combination of these two processors on the GeForce3 GPU allows game developers to write their own custom special effects, rather than being limited to a fixed set of hardware effects.
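
To give a feel for what "programmable" means here, the sketch below shows the kind of per-vertex routine a developer could now express. This is conceptual C, not Nvidia's actual shader interface; the effect and all names are made up for illustration:

    #include <math.h>

    /* Conceptual sketch only -- not Nvidia's actual shader interface.
     * A programmable vertex processor lets the developer supply a small
     * routine that runs for every vertex, instead of the fixed
     * transform-and-lighting path of earlier GPUs. */
    typedef struct { float x, y, z; } Vec3;

    /* Hypothetical custom effect: ripple a mesh by displacing each
     * vertex along its normal as a function of position and time. */
    Vec3 ripple_vertex(Vec3 pos, Vec3 normal, float time)
    {
        float wave = 0.1f * sinf(4.0f * pos.x + time);
        Vec3 out = { pos.x + normal.x * wave,
                     pos.y + normal.y * wave,
                     pos.z + normal.z * wave };
        return out;
    }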

High-resolution antialiasing (HRAA): High-resolution antialiasing (using Nvidia's Quincunx AA technology) delivers image quality close to 4x antialiasing at the performance cost of 2x antialiasing. Antialiasing is a process in which the jagged edges of objects are smoothed out. This feature does not currently appear to be enabled in the Nvidia Linux drivers, which support only up to 2x antialiasing, though it could be added in a future driver release.
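
For illustration, a quincunx-pattern filter blends each sample with its four diagonal neighbors, in the five-point pattern of a dice face. Below is a simplified software sketch, assuming the commonly cited weights of 1/2 for the center and 1/8 for each corner; the real hardware blends subpixel samples before they reach the framebuffer, not finished pixels:

    /* Simplified sketch of a quincunx-pattern filter over a plain
     * grayscale image. Assumed weights: 1/2 for the center sample,
     * 1/8 for each of the four diagonal neighbors. */
    static float sample(const float *img, int w, int h, int x, int y)
    {
        if (x < 0) x = 0; else if (x >= w) x = w - 1;  /* clamp at edges */
        if (y < 0) y = 0; else if (y >= h) y = h - 1;
        return img[y * w + x];
    }

    float quincunx(const float *img, int w, int h, int x, int y)
    {
        return 0.500f *  sample(img, w, h, x, y)
             + 0.125f * (sample(img, w, h, x - 1, y - 1)
                       + sample(img, w, h, x + 1, y - 1)
                       + sample(img, w, h, x - 1, y + 1)
                       + sample(img, w, h, x + 1, y + 1));
    }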

Lightspeed memory architecture: Lightspeed memory architecture, like the nfiniteFX engine, is really several features combined under one name. The first, and biggest, is Nvidia's "crossbar memory controller." On a GeForce2, the memory controller is 128 bits wide, so when, say, only 32 bits of data are requested, the other 96 bits of the transfer are wasted. To fix this problem on the GeForce3, Nvidia split the memory controller into four 32-bit DDR memory controllers. If only 32 bits of data are needed, the GeForce3 uses just one of the four controllers, leaving the other three free to service other requests in parallel, which makes far more efficient use of the memory bus.
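
A little arithmetic makes the difference concrete. The program below is purely illustrative: the mix of access sizes is invented for the example, and only the 128-bit versus four-by-32-bit granularity comes from Nvidia's design:

    #include <stdio.h>

    /* Illustrative arithmetic only: compare how many bits one wide
     * controller moves per access versus four narrow ones, for an
     * invented mix of small accesses. */
    int main(void)
    {
        int accesses[] = { 32, 32, 64, 32 };     /* bits actually needed */
        int n = sizeof accesses / sizeof accesses[0];
        int needed = 0, wide_moved = 0, narrow_moved = 0;

        for (int i = 0; i < n; i++) {
            needed += accesses[i];
            wide_moved += 128;                   /* one 128-bit controller */
            /* four 32-bit controllers: round up to 32-bit granularity */
            narrow_moved += ((accesses[i] + 31) / 32) * 32;
        }
        printf("needed %d bits; 128-bit controller moves %d (%.0f%% used); "
               "4x32-bit moves %d (%.0f%% used)\n",
               needed, wide_moved, 100.0 * needed / wide_moved,
               narrow_moved, 100.0 * needed / narrow_moved);
        return 0;
    }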

The other features of this memory architecture are simpler; each has less impact on its own, but combined they improve performance quite a bit.

Z-occlusion culling is a process in which pixel depths in the Z-buffer (the place where the Z-axis values of pixels are stored, hence the name) are compared to find which pixels will never actually be seen, so those pixels are never sent to the framebuffer for display. On past cards this did not happen, even though it seems so simple, and the resulting overdraw wasted time drawing things that you can't even see. Next is Z-buffer compression: this lossless compression lets four times as much data fit in the Z-buffer without losing any precision. Finally, there is a fast clearing ability for the Z-buffer, allowing it to be flushed quickly, which speeds up this common operation.
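
The depth test at the heart of this is simple; here is a minimal software sketch of the idea in C. The GeForce3's contribution is applying this test early, in hardware, so occluded pixels are rejected before any shading work is spent on them:

    #include <float.h>

    /* Minimal software sketch of a Z-buffered write: a pixel is only
     * shaded and written if it is closer than what is already there. */
    void plot(float *zbuf, unsigned *framebuf, int w,
              int x, int y, float z, unsigned color)
    {
        int i = y * w + x;
        if (z < zbuf[i]) {       /* closer than the current occupant? */
            zbuf[i] = z;
            framebuf[i] = color; /* visible: store it */
        }                        /* else: occluded, skip the write */
    }

    /* Fast Z-clear amounts to resetting every depth to "far away". */
    void clear_zbuffer(float *zbuf, int count)
    {
        for (int i = 0; i < count; i++)
            zbuf[i] = FLT_MAX;
    }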

Documentation, packaging and software
The documentation for the card is very Windows-oriented, as is true of most video cards. Thankfully, Nvidia ships good documentation with its Linux drivers, and physical installation is covered well enough in the manual that most users should have no trouble getting everything up and running. The card came with drivers, game demos and other software for Windows. I can't blame the card makers; the vast majority of people who'll use this card run Windows. Still, it would have been nice to at least see Linux drivers on the disc.
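
For what it's worth, getting Nvidia's binary drivers going on a setup like this amounts to installing the kernel module and GLX packages, then pointing XFree86 4.x at them in XF86Config-4 along these lines (a sketch; the identifier string is arbitrary):

    Section "Module"
        Load "glx"              # Nvidia's GLX module for 3D acceleration
    EndSection

    Section "Device"
        Identifier "GeForce3"   # arbitrary name, referenced by the Screen section
        Driver     "nvidia"     # Nvidia's binary driver, replacing the stock "nv"
    EndSection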

Performance
System Specifications
Athlon Thunderbird 1400MHz
Gigabyte 7DXR motherboard
256MB PC2100 DDR SDRAM from Crucial.com
Western Digital 7200 RPM 10.2GB hard drive
3Com 3C905TX-C 10/100 NIC (PCI)
400 Watt Future Power ATX power supply
Mandrake 8.0 with kernel 2.4.3

Quake 3 Arena Timedemos (Frames Per Second)

Board                          640×480   800×600   1024×768   1280×1024   1600×1200

Default Quality
Gigabyte GV-GF3000 GeForce3      183.6     182.6      172.8       126.1        90.7
Abit Siluro GeForce2 MX400       175.4     130.2       85.5        52.9        36.8

Highest Quality
Gigabyte GV-GF3000 GeForce3      183.2     179.0      146.1        98.0        71.4
Abit Siluro GeForce2 MX400       125.7      88.2       58.9        37.7        27.2
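
These figures come from Quake 3's built-in timedemo mode, which plays back a recorded demo and reports the average framerate when it finishes. From the in-game console it is invoked roughly like this; the bundled demo name varies by point release, so treat the second line as an example:

    timedemo 1      // report average frames per second after playback
    demo demo001    // older point releases; newer ones ship "four"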

As you can clearly see, these boards are meant for two different things: one for budget buyers, one for raw power. You could pick up four or five GeForce2 MX boards for the price of a GeForce3, but if you are a hardcore gamer, there is little choice: the GeForce3 runs 1600×1200 at 32-bit color in high quality and still stays above 60FPS. That is amazing speed, and it should quench even the mightiest thirst for frames.

Conclusions
Even though the company is closed source, it is hard to ignore the fact that Nvidia's graphics chipsets are the fastest on the market. And despite the lack of Open Source drivers on the disc, one can hardly say that Nvidia has ignored the Linux community: the drivers available are excellent, and while they could only get better by being Open Sourced, they are on a par with Nvidia's Windows drivers. As for the GV-GF3000 itself, the board is just like most other GeForce3 boards: an excellent performer. If you can find it at a good price and you play a lot of 3D games, I highly recommend picking up a GeForce3-based card. My only issue was the difficulty of finding a price for it online.

For discussion of this review and any other hardware-related topics, please join #Hardware on OpenProjects.net.
