It's been a wild year in the graphics space, topped by Intel's announcement that it has started a new program to build graphics cards for both the gaming and enterprise markets (purportedly code-named Arctic Sound). That's a genuine shock to the status quo: Nvidia and AMD have been the only two major discrete GPU producers for the last 20 years.


As with all market upsets, that could be good for the consumer. Nvidia has taken the uncontested performance lead with its 2000-series graphics cards, granting it license to charge steep premiums for its top-end gear. Meanwhile, AMD seems resigned to catering to the entry- and mid-level portions of the market while it sheds talent during the bring-up of its 7nm Navi architecture. These conditions leave the graphics industry ripe for a shake-up that could ultimately lead to lower prices for the consumer.

Developing a new GPU is an ambitious goal for Intel, especially considering the Larrabee project it abandoned in 2010. But the company has been busy assembling the right team for the job, including veterans like Raja Koduri, rock star chip architect Jim Keller, and graphics marketer extraordinaire Chris Hook, to name just a few. The company also recently purchased Ineda Systems, a chip design firm based in Hyderabad, India, for its SoC design expertise.

With the team in place, it now comes down to the hardware and design, but Intel is prepared there, too. Intel has an installed base of over a billion screens around the world, courtesy of the integrated graphics built into its CPUs, which make Intel the world's largest GPU manufacturer by volume. The company also has an IP war chest to do battle with: at one point it owned more graphics patents than the other vendors combined.

As with all silicon designs, these types of advances take years to come to market, with the typical design cadence for a GPU lasting roughly four years. That means Intel's project began secretly several years ago, but now the details are just beginning to trickle out. Here's what we know so far.

The Intel Xe Graphics Family - Coming in 2020

Intel whipped the covers off its Xe graphics branding at its recent Architecture Day, but Xe isn't a final product name in the vein of Radeon or GeForce. Instead, the Xe branding signifies Intel's full range of low- to high-power graphics solutions.

According to Intel, its coming Gen11 engine represents a huge step forward for its integrated graphics. Given the facts and figures the company presented, we can reasonably spitball the raw performance of the new integrated graphics in the range of the Radeon Vega 8 graphics that come with the Ryzen 3 2200G: Intel says Gen11 packs 64 execution units and breaks the 1 TFLOPS barrier, which lands in the same ballpark as Vega 8's roughly 1.1 TFLOPS (512 stream processors at up to 1.1 GHz).

Intel's Xe line will come after Gen11, effectively replacing what would have been the Gen12 branding. These graphics processors will scale from integrated graphics on processors up to discrete mid-range, enthusiast, and data center/AI cards. Intel says it will split these graphics solutions into two distinct architectures, with both integrated and discrete graphics for the consumer (client) market, and discrete cards for the data center. Intel also disclosed that the cards will be built on its 10nm process and arrive in 2020, which is in line with its previous projections.


Intel also plans for the graphics cards to work with its OneAPI software, which the company designed to simplify programming across its CPUs, GPUs, FPGAs, and AI accelerators. The new software provides unified libraries that will allow applications to move seamlessly between Intel's diverse types of compute. If successful, this could be Intel's answer to Nvidia's CUDA, except it would work on any Intel processor.
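To make the single-source idea concrete, here's a minimal sketch written in SYCL-style C++, the Khronos standard for heterogeneous programming and a plausible model for what Intel is describing. Intel hasn't published final OneAPI syntax, so treat the selector and kernel below purely as an illustration of how one code path can target a CPU, an integrated GPU, or a discrete card.

```cpp
// Illustrative only: single-source heterogeneous C++ in the SYCL style.
// This is NOT confirmed OneAPI syntax; it shows the "write once, pick a device" idea.
#include <CL/sycl.hpp>
#include <iostream>
#include <vector>

namespace sycl = cl::sycl;

int main() {
  const size_t n = 1024;
  std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

  // Swapping the selector (cpu_selector, gpu_selector, etc.) retargets the same code.
  sycl::queue q{sycl::default_selector{}};
  std::cout << "Running on: "
            << q.get_device().get_info<sycl::info::device::name>() << "\n";

  {
    sycl::buffer<float> A{a.data(), sycl::range<1>{n}};
    sycl::buffer<float> B{b.data(), sycl::range<1>{n}};
    sycl::buffer<float> C{c.data(), sycl::range<1>{n}};

    q.submit([&](sycl::handler& h) {
      auto ra = A.get_access<sycl::access::mode::read>(h);
      auto rb = B.get_access<sycl::access::mode::read>(h);
      auto wc = C.get_access<sycl::access::mode::write>(h);
      // The kernel itself is ordinary C++: a simple vector add.
      h.parallel_for<class vadd>(sycl::range<1>{n},
                                 [=](sycl::id<1> i) { wc[i] = ra[i] + rb[i]; });
    });
  }  // buffers go out of scope here and copy results back to the host vectors

  std::cout << "c[0] = " << c[0] << "\n";  // expect 3
  return 0;
}
```

The point isn't the vector add; it's that the device choice is a one-line change rather than a separate codebase, which is the pitch Intel is making against CUDA.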

Intel GPU Pricing

For Intel, pushing into the enthusiast graphics market makes sense. As always, data center and enterprise applications represent the land of high margins, but the consumer market serves as the high-volume engine. Much like Nvidia does with its GPUs and Intel already does with its desktop CPUs, sales to the broader desktop PC market drive up production volume and build the economy of scale that lowers per-unit costs. In turn, that supports competitive pricing for consumers and healthy margins on GPUs destined for the data center.

Intel still has a lot of work to do on the enablement front, such as recruiting add-in board (AIB) partners, though it's also possible the company will build and sell the cards itself. It's hard to tell which path Intel will take, including which segments of the market it will target first. It's natural to expect Intel's freshman effort to target the mid-range enthusiast market with enticing price points, but true to its heritage, the company may also bring high-end cards based on enterprise designs to the enthusiast space, just as it does with its high-end desktop processors.

Only time will tell, but Intel has the deep pockets to absorb losses for the first few generations as it builds a customer base. Either way, Intel's new cards should force both AMD and Nvidia to become more competitive on pricing.

Intel Discrete GPU Drivers

Intel's hardware is only part of the equation, though. The company has a reputation for a slow cadence of graphics driver and software releases, which often means the company doesn't have day-zero drivers ready for big game launches.

In a sign of things to come, Intel has increased the cadence of driver releases lately, with a heavy focus on day-zero releases. At its recent event, the company also reiterated its commitment to the software ecosystem in the graphics market, so we know it is headed in the right direction on that front. Market leader Nvidia employs more software engineers than hardware engineers, so success in this area is key.

Intel is already working diligently on enabling its driver ecosystem, with the first Linux patches, 42 in fact, arriving in early 2019. The 4,000 lines of code enable support for device-local memory regions, which is key for supporting a discrete graphics card with its own onboard memory. Intel typically releases driver patches roughly a year ahead of the launch of new devices, so these updates indicate the project is well underway.
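To illustrate what a "memory region" buys a driver, here is a hypothetical and heavily simplified sketch; it is not the actual i915 kernel code (which is plain C), just the placement idea in miniature: a discrete card exposes both system RAM and its own device-local VRAM, and buffer allocations prefer one region but can fall back to the other.

```cpp
// Hypothetical sketch of the memory-region concept, not real driver code.
#include <cstddef>
#include <iostream>
#include <optional>
#include <vector>

enum class MemRegion { SystemRam, DeviceLocal };  // integrated GPUs only have SystemRam

struct MemoryRegion {
  MemRegion type;
  std::size_t capacity;
  std::size_t used;
};

struct Buffer {
  std::size_t size;
  MemRegion placed_in;
};

// Try the preferred region first (VRAM on a discrete card), then fall back to system RAM.
std::optional<Buffer> alloc(std::vector<MemoryRegion>& regions,
                            std::size_t size, MemRegion preferred) {
  auto try_place = [&](MemRegion want) -> std::optional<Buffer> {
    for (auto& r : regions) {
      if (r.type == want && r.capacity - r.used >= size) {
        r.used += size;
        return Buffer{size, want};
      }
    }
    return std::nullopt;
  };
  if (auto b = try_place(preferred)) return b;
  return try_place(MemRegion::SystemRam);
}

int main() {
  // A discrete card advertises both system memory and its own onboard memory.
  std::vector<MemoryRegion> discrete = {{MemRegion::SystemRam, 16ull << 30, 0},
                                        {MemRegion::DeviceLocal, 8ull << 30, 0}};
  auto buf = alloc(discrete, 256ull << 20, MemRegion::DeviceLocal);
  std::cout << "256MB buffer placed in "
            << (buf && buf->placed_in == MemRegion::DeviceLocal ? "VRAM" : "system RAM")
            << "\n";
  return 0;
}
```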

Will the Discrete Xe Graphics Cards Use 3D Chip Stacking?

Intel recently surprised the world with its announcement of its new Foveros (Greek for awesome) 3D chip stacking technology. Intel says it built Foveros upon the lessons it learned with its innovative EMIB (Embedded Multi-Die Interconnect Bridge) technology, which is a complicated name for a technique that provides high-speed communication between several chips.

The Foveros chip stacking technique connects multiple stacked dies into one 3D package. This technology may sound like a far-off pipe dream, but Intel already has working chips based on the technology (albeit small ones) coming to market in 2019. That leaves the company some room to debut the technology with its new discrete GPUs in 2020, though that hasn't been confirmed.


Either way, speculation is running rampant. Intel announced that it is already developing a new FPGA using the Foveros technology. The company claims the technology will enable up to a two-order-of-magnitude performance improvement over its next-gen Falcon Mesa FPGAs, along with density and power efficiency improvements. Those same types of radical performance advances could also apply to GPUs built on the same technology, which means we could see 3D-stacked GPUs from Intel in the future, even if they don't debut in the company's first discrete graphics cards.

Intel could also use an MCM (Multi-Chip Module) approach that essentially ties multiple small die together into one heterogeneous package but in a less-risky 2D alignment. The graphics industry has been studying this approach for years, but a lack of sophisticated packaging and high-speed interconnects between the die has slowed progress. Intel already has EMIB interconnect technology shipping, so it wouldn't be surprising to see the company combine multiple chips.

One thing is certain: Intel has more packaging options than either AMD or Nvidia. In fact, the company says it spends more money on packaging than all other chip-producing companies in the world combined.

AMD and Nvidia, meanwhile, rely on outside fabs. Intel's fabs have been a liability lately as the company struggles with its 10nm process, but when used correctly, they are a tangible advantage that could give the company a leg up on its competitors. Chip packaging techniques are becoming the true differentiator in the waning light of Moore's Law, and it's fair to say that the foundries that produce AMD and Nvidia's GPUs (TSMC and GlobalFoundries) cannot match Intel's next-gen packaging technologies. At least for now.

Intel Xe GPU Specifications and Features

It's hard to tell where Intel's graphics card specs will land, but we expect the cards to support all of the native features we get today with its integrated graphics, hopefully with a bit more firepower to back them up. That includes support for 4K streaming, the legacy VP8 and AVC codecs, HEVC 10-bit decode/encode, VP9 8/10-bit decode, and VP9 8-bit encode (and perhaps even VP9 10-bit encode). We also expect HDR and Wide Color Gamut support.
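If you want to verify what an existing Intel GPU actually accelerates on Linux today, one approach is to query its VA-API driver. The sketch below assumes the libva and libva-drm development packages (a 2.x release for vaProfileStr) and a render node at /dev/dri/renderD128; the exact node path varies by system.

```cpp
// Sketch: list the hardware codec profiles an Intel GPU's VA-API driver exposes.
// Build (example): g++ list_profiles.cpp -o list_profiles -lva -lva-drm
#include <fcntl.h>
#include <unistd.h>
#include <va/va.h>
#include <va/va_drm.h>
#include <va/va_str.h>
#include <iostream>
#include <vector>

int main() {
  // The render node path is an assumption; adjust for your system (e.g. renderD129).
  int fd = open("/dev/dri/renderD128", O_RDWR);
  if (fd < 0) { std::cerr << "could not open render node\n"; return 1; }

  VADisplay dpy = vaGetDisplayDRM(fd);
  int major = 0, minor = 0;
  if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
    std::cerr << "vaInitialize failed\n";
    close(fd);
    return 1;
  }

  // Query every decode/encode profile the driver advertises, e.g. VAProfileHEVCMain10.
  std::vector<VAProfile> profiles(vaMaxNumProfiles(dpy));
  int count = 0;
  if (vaQueryConfigProfiles(dpy, profiles.data(), &count) == VA_STATUS_SUCCESS) {
    for (int i = 0; i < count; ++i) {
      std::cout << vaProfileStr(profiles[i]) << "\n";
    }
  }

  vaTerminate(dpy);
  close(fd);
  return 0;
}
```

The existing vainfo utility reports the same information; the point here is simply how the codec support listed above surfaces to software.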

We also know that Intel plans to support Adaptive Sync technology for all of its future graphics solutions, which includes its discrete GPUs. That's the same underlying technology used for AMD's FreeSync, which could be a setback for Nvidia's proprietary G-Sync.

The Arctic Sounds of Anticipation

There's no doubt that Intel's entrance into the discrete graphics card space is the most exciting news the industry has received in years. We just hope it can execute on its mission this time around instead of taking unreasonable risks like it did with Larrabee. Intel has assembled a veteran team that should guide the company away from obvious missteps, though. Many of those hires hail from AMD, and Intel has opened a new graphics development office in Toronto, close to AMD's campus, so it can keep recruiting.

We also see a new level of engagement from the company on the graphics front, as evidenced by its recent Ask Me Anything session on Reddit, where it asked enthusiasts what they would like to see from its coming products. Intel appears to be using the feedback wisely, which is a refreshing change of pace.

The company also announced its new "Join the Odyssey" outreach program, designed to keep enthusiasts up to date on the latest developments through an Intel Gaming Access newsletter. The information-sharing goes both ways, though, as the company also plans to use the program to gather feedback from gamers to help guide its design decisions.

[Intel slide: the growing share of supercomputer compute power provided by GPUs]

There's no doubt that Intel needs to succeed in the GPU market, at least in the data center. We often look to the supercomputing realm to sniff out new trends, and the rise of GPUs in that space has been explosive: in 2008 not one supercomputer used GPUs for computation, relying instead on the tried-and-true CPU, but today 80% of the compute power in the top 500 supercomputers comes from GPUs. The pressure Nvidia's GPUs are putting on Intel's Xeon sales appears to be having an impact: we never would have expected Intel to share a chart like the one above, which clearly shows that GPUs are part of the path forward to extending Moore's Law.

Intel is perhaps the only company with the deep pockets and desktop PC leadership position to successfully bring a new ground-up discrete graphics architecture to market. Competition is good for everyone, but Intel has a long way to go before we see finished products in 2020. Meanwhile, it's almost certain that Intel will open the spigot on more details at its upcoming keynotes and events. We'll be there in the front row and update this post as new information comes to light.