

How Intel could use AI to tackle a massive issue in PC gaming

Intel is making a big push into the future of graphics. The company is presenting seven new research papers at Siggraph 2023, an annual graphics conference, one of which tries to address VRAM limitations in modern GPUs with neural rendering.

The paper aims to make real-time path tracing practical with the help of neural rendering. No, Intel isn’t introducing a DLSS 3 rival, but it is looking to leverage AI to render complex scenes. Intel says the “limited amount of onboard memory [on GPUs] can limit practical rendering of complex scenes.” Its answer is a neural level-of-detail representation of objects, which it says achieves compression rates of 70% to 95% compared to “classic source representations, while also improving quality over previous work.”

Ellie holds a gun in The Last of Us Part I.

It doesn’t seem dissimilar to Nvidia’s Neural Texture Compression, which Nvidia also introduced through a paper submitted to Siggraph. Intel’s paper, however, looks to tackle complex 3D objects, such as vegetation and hair. It’s applied as a level of detail (LoD) technique for objects, allowing them to look more realistic from farther away. As we’ve seen recently from games like Redfall, VRAM limitations can cause even close objects to show up with muddy textures and little detail as you pass them.
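To get a feel for why a neural representation can be so much smaller than the data it replaces, here’s a back-of-the-envelope sketch in Python. Every number in it — the grid resolution, the network shape — is a made-up assumption for illustration, not anything from Intel’s paper, which reports 70% to 95% compression under quality constraints this toy ignores.

```python
# Toy storage comparison for a "neural level of detail" idea: swap a
# dense grid of appearance data for a small coordinate network that
# predicts that data from a 3D position. All sizes below are invented
# for illustration and are not taken from Intel's paper.
GRID = 256                             # hypothetical dense grid: 256^3 samples
CHANNELS = 4                           # e.g., color + opacity per sample
dense_bytes = GRID**3 * CHANNELS * 4   # float32

# Hypothetical small MLP: (x, y, z) -> 64 -> 64 -> 4 outputs.
layers = [(3, 64), (64, 64), (64, CHANNELS)]
mlp_params = sum(n_in * n_out + n_out for n_in, n_out in layers)  # weights + biases
mlp_bytes = mlp_params * 4             # float32

print(f"dense grid:     {dense_bytes / 2**20:.1f} MiB")
print(f"coordinate MLP: {mlp_bytes / 2**10:.1f} KiB")
```

The toy wildly overstates the savings because it ignores quality entirely: the real technique has to reproduce the object convincingly at every level of detail, which is where the more modest 70% to 95% figures come from.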


In addition to this technique, Intel is also introducing an efficient path-tracing algorithm that, it says, will eventually make complex path tracing possible on mid-range GPUs and even integrated graphics.

Path tracing is essentially the hard way of doing ray tracing, and we’ve already seen it used to great effect in games like Cyberpunk 2077 and Portal RTX. As impressive as path tracing is, though, it’s extremely demanding. You’d need a flagship GPU like the RTX 4080 or RTX 4090 to even run these games at higher resolutions, and that’s with Nvidia’s tricky DLSS Frame Generation enabled.
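To see where that cost comes from, here’s a minimal, self-contained sketch of the Monte Carlo estimate at the heart of path tracing — a single diffuse bounce under a toy sky. The scene and sample count are assumptions for illustration; a real path tracer repeats an estimate like this at every bounce of every path for every pixel, every frame.

```python
import math
import random

def sample_cosine_hemisphere():
    # Cosine-weighted direction around the surface normal (0, 0, 1);
    # pdf = cos(theta) / pi, the classic choice for diffuse surfaces.
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

def sky_radiance(direction):
    # Toy environment light: brighter toward the zenith.
    return max(direction[2], 0.0)

def outgoing_radiance(samples=100_000, albedo=0.8):
    # One diffuse bounce: with cosine-weighted sampling, the cosine and
    # pdf terms cancel, so the estimate is albedo * mean(incoming light).
    total = sum(sky_radiance(sample_cosine_hemisphere()) for _ in range(samples))
    return albedo * total / samples

print(outgoing_radiance())  # converges to 2/3 * albedo for this toy sky
```

Even this single shading point needs thousands of random samples before the noise settles down, which is why full path tracing leans so heavily on flagship hardware plus upscaling and frame-generation tricks.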

Intel’s paper introduces a way to make that process more efficient via a new algorithm that is “simpler than the state-of-the-art and leads to faster performance,” according to Intel. The company is building upon the GGX microfacet distribution, which Intel says is “used in every CGI movie and video game.” The algorithm reduces this mathematical distribution to a hemispherical mirror that is “extremely simple to simulate on a computer.”

Screenshot of full ray tracing in Cyberpunk 2077. Image: Nvidia

The idea behind GGX is that surfaces are made up of microfacets that reflect and transmit light in different directions. This is expensive to calculate, so Intel’s algorithm essentially reduces the GGX distribution to a simple-to-calculate slope based on the angle of the camera, making real-time rendering possible.
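For reference, here is the standard, publicly documented GGX (Trowbridge-Reitz) distribution in Python — the textbook formula, not code from Intel’s paper, whose hemispherical-mirror simplification can’t be reproduced from the article alone. The expense Intel is attacking comes from evaluating and sampling this distribution along many light paths, not from any single evaluation.

```python
import math

def ggx_ndf(cos_theta_h, alpha):
    # GGX / Trowbridge-Reitz normal distribution function D(h): the
    # relative concentration of microfacet normals along the half
    # vector h, where cos_theta_h = dot(n, h) and alpha is roughness.
    a2 = alpha * alpha
    denom = cos_theta_h * cos_theta_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# Smoother surfaces (small alpha) concentrate microfacet normals into a
# sharp peak; rough surfaces spread them out, softening reflections.
for alpha in (0.1, 0.5, 1.0):
    print(f"alpha={alpha}: D(n=h)={ggx_ndf(1.0, alpha):.3f}")
```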

Based on Intel’s internal benchmarks, the new algorithm delivers upwards of a 7.5% speedup in rendering path-traced scenes. That may seem like a minor bump, but Intel seems confident that more efficient algorithms could make all the difference. In a blog post, the company says it will demonstrate at Siggraph how real-time path tracing can be “practical even on mid-range and integrated GPUs in the future.”

As for when that future arrives, it’s tough to say. Keep in mind this is a research paper right now, so it might be some time before we see this algorithm widely deployed in games. It would certainly do Intel some favors: although the company’s Arc graphics cards have become excellent over the past several months, Intel is still focused on mid-range GPUs and integrated graphics, where path tracing isn’t currently possible.

We don’t expect you’ll see these techniques in action any time soon, though. The good news is that we’re seeing new techniques to push visual quality and performance in real-time rendering, which means these techniques should, eventually, show up in games.

Jacob Roach