Future Tech

Can a Xilinx FPGA recreate a 1990s Quake-capable 3D card? Yup! Meet the FuryGpu

Publish date: Sun, 31 Mar 2024, 06:17 AM

If you've ever wondered whether a Xilinx Zynq UltraScale+ FPGA can be configured as a homegrown gaming 3D GPU capable of accelerating Quake and other faves from the 1990s, we have an answer - and it's yes.

In theory, looking at the FPGA's specs and what it needs to do, it should totally be possible with a large amount of effort. Now we're starting to see proof, and hopefully materials to help you follow along if retro computing and FPGA programming are your thing.

Created by software engineer Dylan Barrie, FuryGpu is the culmination of four years of work that started as a simple project. Barrie was interested in the idea of making a homemade GPU as his next hobby. "I'd spend a few months making a spinning cube or something, and be done with it," he predicted, according to his account of building the kit.

However, the scope of the project expanded beyond graphics demos and into the realm of actual 3D-accelerated games running on a homemade graphics card. The FuryGpu uses a custom PCB equipped with a Kria system-on-module that features a Xilinx Zynq UltraScale+ FPGA, a couple of video outputs, and a PCIe 2.0 x4 connection.
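The write-up doesn't document how the host actually talks to the card, but a PCIe graphics board of this sort typically exposes a block of memory-mapped registers through one of its BARs that the driver pokes to submit work. The C sketch below is purely illustrative - every register name, field, and offset is invented, not FuryGpu's real interface.

/* Hypothetical sketch only: FuryGpu's real register map is not public.
 * A PCIe card of this kind typically exposes memory-mapped registers
 * through a BAR; the host driver writes command-buffer addresses and
 * rings a doorbell through volatile pointers into that mapping.
 * Every name and field below is invented for illustration. */
#include <stdint.h>

typedef struct {
    volatile uint32_t id;          /* device/revision ID */
    volatile uint32_t status;      /* busy/idle and interrupt flags */
    volatile uint32_t cmd_addr_lo; /* physical address of a command buffer */
    volatile uint32_t cmd_addr_hi;
    volatile uint32_t cmd_len;     /* command buffer length in bytes */
    volatile uint32_t doorbell;    /* write 1 to kick the GPU */
} fury_regs_t;

/* 'bar0' would come from mapping the card's BAR 0 via the OS's PCI
 * framework; here it is just a parameter. */
static void fury_submit(fury_regs_t *bar0, uint64_t cmd_phys, uint32_t len)
{
    bar0->cmd_addr_lo = (uint32_t)(cmd_phys & 0xFFFFFFFFu);
    bar0->cmd_addr_hi = (uint32_t)(cmd_phys >> 32);
    bar0->cmd_len     = len;
    bar0->doorbell    = 1;   /* tell the GPU to start fetching commands */
}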

So far, FuryGpu boasts a 400MHz GPU clock, a 480MHz texture unit clock, an FP32-capable front-end, and support for linear and bilinear filtering. The GPU-in-FPGA has no programmable shaders, though, and is fixed-function. It runs games via Barrie's own FuryGL API rather than Direct3D or Vulkan.
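Bilinear filtering, for the curious, blends the four texels nearest to each sample point. The snippet below shows the underlying math in plain C for a single-channel texture - it's an illustration of the technique only, not FuryGpu's hardware implementation, which would be done per color channel in fixed-point logic on the FPGA.

#include <math.h>
#include <stdint.h>

/* Bilinear sample of an 8-bit single-channel texture at texel-space
 * coordinates (u, v). Illustration of the math only; real texture
 * units do this per color channel in fixed-point hardware. */
static uint8_t sample_bilinear(const uint8_t *tex, int w, int h, float u, float v)
{
    int x0 = (int)floorf(u), y0 = (int)floorf(v);
    float fx = u - (float)x0, fy = v - (float)y0;   /* fractional position */

    /* Clamp the 2x2 texel footprint to the texture edges. */
    if (x0 < 0) x0 = 0; else if (x0 > w - 1) x0 = w - 1;
    if (y0 < 0) y0 = 0; else if (y0 > h - 1) y0 = h - 1;
    int x1 = (x0 + 1 < w) ? x0 + 1 : w - 1;
    int y1 = (y0 + 1 < h) ? y0 + 1 : h - 1;

    float t00 = tex[y0 * w + x0], t10 = tex[y0 * w + x1];
    float t01 = tex[y1 * w + x0], t11 = tex[y1 * w + x1];

    float top    = t00 + fx * (t10 - t00);               /* blend along x, top row */
    float bottom = t01 + fx * (t11 - t01);               /* blend along x, bottom row */
    return (uint8_t)(top + fy * (bottom - top) + 0.5f);  /* blend along y */
}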

Of course, supporting just a homemade API means Barrie has to port games to his FuryGpu. He said that "while that is not a massive undertaking, it's not a trivial thing to do either." Thankfully, the 1990s predated the advent of mega-games, and most titles were developed by a small team, or even just a single programmer. In other words, adding support for FuryGL wasn't too much hassle.
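FuryGL itself isn't public, so the following is an invented illustration of the general shape of a fixed-function, immediate-mode style API of the kind Quake-era renderers targeted - every identifier is hypothetical. Porting largely means routing the game's existing triangle batches through calls like these instead of its software rasterizer or Glide/OpenGL backends.

#include <stdint.h>

/* Entirely hypothetical: FuryGL's real interface is not public. These
 * declarations only sketch what a fixed-function, immediate-mode style
 * API might look like; every identifier is invented. */
typedef struct {
    float    x, y, z;   /* position */
    float    u, v;      /* texture coordinates */
    uint32_t color;     /* packed vertex color */
} fgl_vertex_t;

void fgl_bind_texture(int texture_handle);                     /* select a texture */
void fgl_draw_triangles(const fgl_vertex_t *verts, int count); /* submit a triangle batch */
void fgl_present(void);                                        /* flip the framebuffer */

/* A port mostly reroutes the engine's existing surface batches through
 * calls like these instead of its software or Glide/OpenGL paths. */
static void draw_world_surface(int texture, const fgl_vertex_t *tris, int n)
{
    fgl_bind_texture(texture);
    fgl_draw_triangles(tris, n);
}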

Guess what caused the most trouble

While programming an FPGA to act as a 3D-accelerating GPU and porting games to a custom API are challenging enough, apparently neither was the hardest part.

"Of all the parts of this project, writing Windows drivers for it have been the most painful," Barrie said. It speaks volumes when porting a game is easier than writing drivers on Windows.

Barrie has been sharing his odyssey of getting FuryGpu to work since 2022, with the main focus being compatibility with the OG Quake. Each video in his series about the hardware demonstrates a new feature being added, such as rendering particles, adding textures, and enabling audio. 

It was only on March 1 that Barrie finally showed FuryGpu running Quake at a full 720p resolution and 60 frames per second, made possible by adding asynchronous CPU and GPU operation to the FuryGpu drivers. At this level of performance, FuryGpu is roughly equivalent to a graphics card from the mid-1990s, a time when 3dfx and Matrox were still in the running and before Nvidia started pushing the term GPU (and saying it had invented it).
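Barrie hasn't detailed how that asynchrony is implemented, but a common driver pattern is a small ring of in-flight command buffers guarded by fences, so the CPU can record frame N+1 while the GPU is still drawing frame N. The generic C sketch below illustrates that pattern only - it is not FuryGpu's actual driver code, and every function in it is hypothetical.

#include <stdint.h>

/* Generic sketch of asynchronous CPU/GPU frame submission - not
 * FuryGpu's actual driver code. The CPU records the next frame while
 * the GPU is still drawing the previous one, blocking only when every
 * slot in a small ring of command buffers is in flight. All functions
 * below are hypothetical. */
#define IN_FLIGHT 2   /* double-buffered command submission */

typedef struct {
    uint64_t fence_value;   /* fence the GPU signals when this slot's frame is done */
    /* command buffer storage would live here */
} frame_slot_t;

extern void     cpu_record_frame(frame_slot_t *slot);  /* build this frame's commands */
extern void     gpu_submit(frame_slot_t *slot);        /* hand the slot to the GPU */
extern uint64_t gpu_completed_fence(void);             /* last fence value the GPU signaled */

void render_loop(void)
{
    static frame_slot_t slots[IN_FLIGHT];
    uint64_t next_fence = 1;

    for (;;) {
        frame_slot_t *slot = &slots[next_fence % IN_FLIGHT];

        /* Wait only if the GPU hasn't finished the frame that last used
         * this slot; otherwise CPU and GPU proceed in parallel. */
        while (gpu_completed_fence() < slot->fence_value)
            ;   /* a real driver waits on an interrupt or event instead of spinning */

        cpu_record_frame(slot);
        slot->fence_value = next_fence++;
        gpu_submit(slot);
    }
}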

Plus, FuryGpu can be used for basic computer work. It can display the UEFI interface for interacting with the motherboard's BIOS, the Windows desktop works as normal, and there's functional audio as well. Maybe watching YouTube is out of the question but a low-res DVD could be within the realm of possibility. 

For those who want to make their own FuryGpu, there's good news: The project is set to become open source. However, Barrie says he doesn't know when that will happen, only that he will do it "at some point." More details are available in Barrie's write-ups on the project. ®

PS: Yes, we're aware of things like Ben Eater's homemade VGA card, a 486DX in an FPGA, the whole MiSTer project, etc. But Barrie's work is pretty cool, and caught our eye. Comment away below if there are similar hardware efforts you've found lately.

 

https://www.theregister.com//2024/03/30/furygpu_xilinx_fpga_graphics/
