
Multi-GPUs: A Story of Promise and Potential Failure

Dec. 3, 2020
The Graphics Chip Chronicles Vol.5 No.1 - The concept of combining multiple graphics cards to scale up performance emerged at 3Dfx more than two decades ago. Since then, Nvidia and AMD have struggled to sell the idea to consumers and gaming enthusiasts.


When 3D graphics controllers were emerging in the late 1990s, 3Dfx was experimenting with ways to scale up performance and accelerate 3D gameplay. One technology it developed was called scan-line interleave (SLI), which was introduced in 1998 as part of its second-generation graphics processor, the Voodoo2.

In SLI mode, two Voodoo2 add-in-boards (AIBs) could run in parallel, with each one drawing every other line of the display. The original Voodoo Graphics also had SLI capabilities, but the feature was generally used only in the arcade and pro graphics markets.
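To make the idea concrete, here is a minimal sketch of scan-line interleaving in a toy software renderer: one board handles the even scan lines, the other the odd ones, and the rows are merged for display. The function names and the trivial per-line "renderer" are hypothetical stand-ins, not 3Dfx's actual hardware interface.

```python
# Toy sketch of scan-line interleaving between two GPUs (not 3Dfx's API).
# Each "GPU" renders only the scan lines assigned to it; the results are
# interleaved to form the final frame.

HEIGHT, WIDTH = 480, 640

def render_scanline(y, width):
    """Hypothetical per-line renderer; returns one row of pixel values."""
    return [(y * 31 + x * 7) % 256 for x in range(width)]

def render_interleaved(gpu_id, num_gpus=2):
    """GPU `gpu_id` renders every num_gpus-th scan line, starting at gpu_id."""
    return {y: render_scanline(y, WIDTH)
            for y in range(gpu_id, HEIGHT, num_gpus)}

# Two boards work in parallel (sequentially here); their lines are merged on scan-out.
gpu0_lines = render_interleaved(0)   # even scan lines
gpu1_lines = render_interleaved(1)   # odd scan lines
frame = [(gpu0_lines if y % 2 == 0 else gpu1_lines)[y] for y in range(HEIGHT)]
```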

In addition to splitting the rendering workload, SLI promised to increase the available frame-buffer memory. That would allow larger models to be loaded and would also raise the maximum screen resolution. Unfortunately, the effective texture memory remained the same because each AIB had to hold its own copy of the texture data. That, combined with other overhead, dragged down the theoretical performance boost. As 3D models and screen resolutions continued to grow, so did the size and number of texture maps, further cutting into the promised benefits.

3Dfx tried to overcome this problem by adding another chip: a second texture-mapping unit (TMU). The extra TMU allowed a second texture to be applied in the same rendering pass with no performance penalty. When it was introduced, the Voodoo2 was the only 3D AIB capable of single-cycle dual-texturing. Taking advantage of the Voodoo2's second TMU depended on the application software. Two very popular games of the time, Quake II and Unreal, successfully exploited dual-texturing. In fact, by 1998, multi-texturing had become almost standard.
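As a rough illustration of what the second TMU bought, the sketch below modulates a base texture with a second texture per pixel, the classic lightmap case that games such as Quake II used. A single-TMU part needs two passes to produce the same pixels; a dual-TMU part folds them into one. The array shapes and the modulate combine are illustrative assumptions, not the Voodoo2's actual pipeline.

```python
import numpy as np

# Toy illustration of single-pass dual-texturing: combine a base texture
# with a second texture (e.g., a lightmap) per pixel. Sizes and the
# modulate combine mode are illustrative assumptions, not 3Dfx's pipeline.

H, W = 64, 64
base_texture = np.random.rand(H, W, 3)   # diffuse color, 0..1
light_map    = np.random.rand(H, W, 1)   # monochrome lighting, 0..1

# With two TMUs, both samples happen in the same pass and are combined:
dual_tmu_output = base_texture * light_map           # one pass

# With one TMU, the same result takes two passes over the frame buffer:
pass1 = base_texture                                  # draw base texture
pass2 = pass1 * light_map                             # blend lightmap on top

assert np.allclose(dual_tmu_output, pass2)
```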

It took a little longer for the price-performance analysis to catch up. An 8MB Voodoo2 AIB sold for $249 in 1998, about $480 today, so a pair of Voodoo2 AIBs cost roughly $500 at the time. The problem was that the average performance improvement was only 60 to 70%, depending on the game and the central processing unit (CPU). The payoff was never there, nor could it ever be. Even so, the concept never completely faded away.
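The arithmetic behind that judgment is straightforward. Using the figures above (two $249 boards and a 60 to 70% speed-up), the short sketch below compares frames per dollar for one board against two; the 60-fps baseline is an arbitrary assumption for illustration.

```python
# Price/performance check using the article's figures: two $249 Voodoo2
# boards versus one, with a 60-70% SLI speed-up. The 60-fps baseline is
# an arbitrary assumption for illustration.

single_price = 249
single_fps = 60.0                      # assumed baseline frame rate

for speedup in (1.6, 1.7):
    sli_price = 2 * single_price
    sli_fps = single_fps * speedup
    print(f"speed-up {speedup:.1f}x: "
          f"single = {single_fps / single_price:.3f} fps/$, "
          f"SLI = {sli_fps / sli_price:.3f} fps/$")

# SLI delivers roughly 1.6-1.7x the frames for 2x the money, so frames
# per dollar drop by about 15-20% versus a single board.
```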

When Nvidia bought 3Dfx's assets in 2000, SLI was included in the IP package. Nvidia didn't reintroduce it until 2004, because AGP motherboards lacked dual graphics ports; the arrival of PCI Express changed that. And, Nvidia being Nvidia, it rebranded the acronym as the scalable link interface. Nvidia also expanded the concept, making it capable of combining up to four AIBs, something 3Dfx had already accomplished in the professional space with its Quantum3D products. The company also added several operating modes: split-frame rendering (half the frame per AIB), alternate frame rendering, SLI anti-aliasing, and even the ability to enlist an integrated GPU, a mode it called Hybrid SLI.
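To see how the two main workload splits differ, here is a toy scheduling sketch under the simplifying assumption that work divides evenly: alternate frame rendering (AFR) hands whole frames to GPUs in round-robin order, while split-frame rendering (SFR) carves each frame into horizontal bands. The names and frame dimensions are hypothetical.

```python
# Toy sketch of the two main multi-GPU workload splits (names are hypothetical):
# - AFR: whole frames alternate between GPUs in round-robin order.
# - SFR: each frame is split into horizontal bands, one band per GPU.

NUM_GPUS = 2
FRAME_HEIGHT = 1080

def afr_assignment(frame_index, num_gpus=NUM_GPUS):
    """Which GPU renders this entire frame under alternate frame rendering."""
    return frame_index % num_gpus

def sfr_assignment(frame_height=FRAME_HEIGHT, num_gpus=NUM_GPUS):
    """Which horizontal band of one frame each GPU renders under split-frame rendering."""
    band = frame_height // num_gpus
    return {gpu: (gpu * band, min((gpu + 1) * band, frame_height))
            for gpu in range(num_gpus)}

print([afr_assignment(f) for f in range(6)])  # [0, 1, 0, 1, 0, 1]
print(sfr_assignment())                       # {0: (0, 540), 1: (540, 1080)}
```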

But expansion and rebranding could not solve SLI's fundamental problem: the technology rarely delivered more than 170% of a single board's performance for 200% of the cost. On top of that, AIBs were increasing in price year after year, and the driver support Nvidia had to provide, amounting to a profile tweak for almost every game, added up with each new generation.

In late 2005, reacting to Nvidia's SLI revival, ATI (which AMD would acquire the following year) introduced its own take on the technology, called CrossFire. Then, in 2013, AMD took the concept to the next level and eliminated the over-the-top (OTT) bridge strap. Instead, the company used an extended direct memory access (XDMA) engine to open a direct channel of communication between the GPUs in a system over the PCI Express (PCIe) interface.

PCIe is the same interface already used to move graphics data between the GPUs, main memory, and the CPU. When AMD introduced XDMA, the AIBs of the day were not using all the bandwidth PCIe could offer, which was considerably more than an OTT strap: an external OTT bridge carried only about 900 MB/s, whereas PCIe Gen 3 with 16 lanes can supply roughly 16 GB/s in each direction (about 32 GB/s aggregate).
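The gap is easy to quantify. PCIe Gen 3 signals at 8 GT/s per lane with 128b/130b encoding, so a 16-lane link moves roughly 15.8 GB/s in each direction; the short sketch below works that out and compares it with a bridge carrying roughly 900 MB/s.

```python
# Rough PCIe Gen 3 x16 bandwidth arithmetic versus a ~900 MB/s CrossFire bridge.

LANE_RATE_GT_S = 8.0          # PCIe Gen 3 raw signaling rate per lane (GT/s)
ENCODING = 128 / 130          # 128b/130b line-encoding efficiency
LANES = 16

per_direction_GBps = LANE_RATE_GT_S * ENCODING * LANES / 8   # GB/s, one direction
bridge_GBps = 0.9                                            # ~900 MB/s OTT bridge

print(f"PCIe 3.0 x16: ~{per_direction_GBps:.1f} GB/s per direction "
      f"(~{2 * per_direction_GBps:.0f} GB/s both directions)")
print(f"That is roughly {per_direction_GBps / bridge_GBps:.0f}x the "
      f"~{bridge_GBps * 1000:.0f} MB/s of an external bridge.")
```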

AMD’s added bandwidth and elimination of the OTT bridge (a part Nvidia later charged extra for) gave it a competitive edge. However, AMD’s AIBs at the time struggled to match the performance of Nvidia’s, which hurt the company in the marketplace. Ironically, when AMD introduced the RX 480 in 2016, it pushed users to buy a pair of AIBs that it claimed would outperform a single Nvidia AIB at a lower cost. It was a clever marketing pitch, but it didn’t help AMD’s sales. It also wasn’t true.

In 2017, as AMD and Nvidia rolled out DirectX 12 (DX12) AIBs, AMD dropped support for CrossFire. The company stated, “In DirectX 12, we reference multi-GPU as applications must support mGPU, whereas AMD has to create the profiles for DX11. We’ve accordingly moved away from using the CrossFire tag for multi-GPU gaming.”

Nvidia followed suit in 2019 and made it official in 2020. For its professional graphics AIB line, Quadro, Nvidia introduced a higher-bandwidth scheme called NVLink for multi-AIB configurations. NVLink specifies a point-to-point connection with per-lane data rates of 20, 25, or 50 Gbps, depending on the generation.

In late 2020, the company introduced a high-end consumer graphics card, the RTX 3090, and made NVLink an option for it. The 350-watt RTX 3090 was introduced at $1,499. It is unlikely many gamers will spend $3,000 for two of them, plus another $90 for an NVLink bridge. They may also need a larger power supply (PSU) to feed all that extra hardware. Content creators, however, may find the added performance worth the outlay.


About the Author

Jon Peddie | President

Dr. Jon Peddie heads up Tiburon, Calif.-based Jon Peddie Research. Peddie lectures at numerous conferences on topics pertaining to graphics technology and emerging trends in digital media technology. He is the former president of the SIGGRAPH Pioneers and the author of several books. Peddie was recently honored by the CAD Society with a lifetime achievement award. He is a senior and lifetime member of the IEEE (joined in 1963) and a former chair of the IEEE Super Computer Committee. Contact him at [email protected].
