While CES 2020 technically doesn’t wrap up for another couple of days, I’ve already been asked a good dozen or so times what the neatest or most surprising product at the show has been for me. And this year, it looks like that honor is going to Intel. Much to my own surprise, the company has its first Xe discrete client GPU, DG1, at the show. And while the company is being incredibly coy about any actual details for the hardware, it is nonetheless up and running, in both laptop and desktop form factors.

Intel first showed off DG1 as part of its hour-long keynote on Monday, when the company all too briefly showed a live demo of it running a game. More surprising still, the behind-the-stage demo unit wasn’t a desktop PC (as is almost always the case), but rather a notebook, with Intel making this choice to underscore both the low power consumption of DG1 and just how far along the GPU is since the first silicon came back to the labs in October.

To be sure, the company still has a long way to go between here and when DG1 will actually be ready to ship in retail products; along with the inevitable hardware bugs, Intel has a pretty intensive driver and software stack bring-up to undertake. So DG1’s presence at CES is very much a teaser of things to come rather than a formal hardware announcement, for all the pros and the cons that entails.

At any rate, Intel’s CES DG1 activities aren’t limited to their keynote laptop demo. The company also has traditional desktop cards up and running, and it gave the press a chance to see one yesterday afternoon.

Dubbed the “DG1 Software Development Vehicle”, the desktop DG1 card is being produced by Intel in small quantities so that the company can sample its first Xe discrete GPU to software vendors well ahead of any retail launch. ISVs regularly get cards in advance so that they can begin development and testing against a new GPU architecture, so Intel doing the same here isn’t too unusual. However, it’s almost unheard of for these early sample cards to be revealed to the public, which goes to show just how different a tack Intel is taking in the launch of its first modern dGPU.

Unfortunately there’s not much I can tell you about the DG1 GPU or the card itself. The purpose of this unveil is to show that DG1 is here and that it works, and little else. So Intel isn’t disclosing anything about the architecture, the chip, power requirements, performance, launch date, etc. All I know for sure on the GPU side of matters is that DG1 is based on what Intel is calling its Xe-LP (low power) microarchitecture, the same microarchitecture being used for Tiger Lake’s Xe-based iGPU. This is distinct from Xe-HP and Xe-HPC, which according to Intel are separate (but apparently similar) microarchitectures that are more optimized for workstations, servers, and the like.

As for the dev card, I can tell you that it is entirely PCIe slot powered – so it draws under 75W – and that it uses an open-air cooler design with a modest heatsink and a slightly off-center axial fan. The card also features 4 DisplayPort outputs, though as this is a development card, that doesn’t mean anything in particular for retail cards. Otherwise, Intel has gone with a rather flashy design for a dev card; since Intel is getting in front of any dev leaks by announcing it to the public now, I suppose there’s no reason to conceal the card with an unflattering design, and instead Intel can use it as another marketing opportunity.

Intel has been shipping the DG1 dev cards to ISVs as part of a larger Coffee Lake-S based kit. The kit is unremarkable – or at least, Intel wouldn’t remark on it – but like the DG1 dev card, it’s definitely designed to show off the video card inside, with Intel using a case that mounts the card parallel to the motherboard so that it is readily visible.

It was also this system that Intel used to show that, yes, the DG1 dev card does indeed work. In this case it was running Warframe at 1080p, though Intel was not disclosing FPS numbers or what settings were used. Going by the naked eye alone, it’s clear that the card was struggling at times to maintain a smooth framerate; but as this is early hardware on early software, it’s clearly not in a state where Intel is even focusing on performance.

Overall, Intel has been embarking on an extended, trickle-feed media campaign for the Xe GPU family, and this week’s DG1 public showcase is the latest step in that Odyssey campaign. So expect to see Intel continue these efforts over the coming months, as the company gradually prepares Tiger Lake and DG1 for their respective launches.

Comments

  • Tomatotech - Friday, January 10, 2020 - link

    Nobody has any idea what that case is? I do like the look of it.
  • FANBOI2000 - Saturday, January 11, 2020 - link

    Looks a bit like the new NUC Elements case from Intel.
  • lmcd - Thursday, January 9, 2020 - link

    I just hope Intel holds onto the consumer-available GPU virtualization functionality they've included in their iGPUs.
  • bigsnyder - Friday, January 10, 2020 - link

    Will it be able to run Crysis?
  • eastcoast_pete - Friday, January 10, 2020 - link

    Will it be able to pull Intel out of its crisis?
  • UltraWide - Friday, January 10, 2020 - link

    Enough with the previews! Just release the damn product!! lol
  • eastcoast_pete - Friday, January 10, 2020 - link

    Of course this is mainly done to show that Intel's talk about dedicated graphics is not just that - talk. I for one see a market for a dGPU that is basically a faster version of their current top iGPU (in the G71 or whatever it's called), further boosted by actual video RAM. A card like that, with 4 GB of GDDR6, HDMI 2.1, an on-board ASIC for 10bit HDR decoding and streaming and all for under $100 could give some desktops another 2-3 years of usable life. Now, this being Intel, they'll probably manage to screw that up, somehow. They usually do.
  • spkay31 - Friday, January 10, 2020 - link

    Seeing is believing? Seeing early, developer-only stuff at CES with no concrete information about product availability and pricing is only believing if you have blind faith in Intel. They've had so many problems just getting the 10nm node ramped up that I don't expect this to make much of an impact on the graphics market before the next CES.
  • sftech - Friday, January 10, 2020 - link

    Intel thinks there's a client market for GPUs? They're doubting the game streaming revolution?
  • FANBOI2000 - Friday, January 10, 2020 - link

    I think competition is a good thing; at the moment AMD have had APUs all to themselves and that will make them lazy. Already most of the stack has no graphics, and based on what I have seen we aren't going to get any great discounts from AMD when it comes to dGPUs. If Intel could hit 3200G performance then it would be notable and change the narrative quite a bit. I'm all over AMD in desktop, and if I am a fanboy it would be for AMD, but chips like the 9400F are still good CPUs at a fair price. I build systems for people as a paying hobby and Intel CPUs still have a place against AMD. Imagine it as an APU with 3200G levels of performance.

    Anyway, I'm drifting; at the moment I place this higher up my cynicism scale than I do AMD's mobile claims. I think AMD is saying little about battery life for a reason, and I think Intel is showing us empty boxes for all it is worth at the moment. I imagine we are probably at least a year away from Intel being ready for market. And I also can't see them really focusing on gaming. If you look at Intel it isn't all about CPUs; they have a lot of other IP they can trade off, like Optane, so they will be looking at compute, machine learning, things like that. But any competition is still a good thing and I am happy for AMD, even if I have some doubts about the power draw of their new mobile chips. If I am wrong then I am very wrong, because I have argued that Intel would be more robust in mobile and server. While they have taken a complete beating in server, again it isn't all about the CPU, and server will take some time to filter through for AMD I think. Companies will just bin all their Intel stuff overnight? I think with mobile though, if AMD are being straight, that could be very disruptive. I can't remember a time when I ever considered AMD mobile chips. Maybe in some SFF ten years ago, where power consumption was less of an issue.
