Taking place next week is the 2016 Game Developers Conference in San Francisco. GDC has been an important show for some time, but in recent years it has taken on an even bigger role as what happens and what is announced at GDC have greater implications for not just developers, but end-users as well. GDC has been the backdrop for PC hardware launches, graphics API launches, and more. And GDC 2016 promises to be much the same, as in the PC world developers look to embrace DirectX 12, Virtual Reality, and other emerging technologies.

Ahead of next week’s show, I had a chance to sit down and talk shop with an interesting trio of techies: Brian Langley, Microsoft’s DirectX 12 lead; Max McMullen, Microsoft’s principal lead for Direct3D; and Dan Baker, co-founder and guru developer at Oxide Games. Microsoft of course is looking to further push the development of (and developers towards) DirectX 12 as the first games come out for the API. Meanwhile Oxide’s Ashes of the Singularity has been a common sight around here; while it won’t claim the title of the first DX12 game – that technically goes to the new Windows 10 port of Gears of War – Ashes is arguably the first game to take meaningful advantage of the API. As a result there’s a lot of excitement around Ashes not only at Oxide, but at Microsoft as well ahead of its impending March 31st launch.

With the chance to talk to developers on both sides of the spectrum – API development at Microsoft and application development at Oxide – I wanted to ask the gathered gurus about their experiences bringing up the API and implementing it in games, their perceptions of the wider market, what developer response has been like, and what’s in store next for DirectX 12. Though there are rarely grand revelations in brief conversations such as these, it was nonetheless an interesting view into how DirectX 12 has taken root since it officially shipped back in July with Windows 10.

DirectX 12 Adoption & Stability

It didn’t take long for our conversation to reach the point of discussing DirectX 12 adoption, both from a development standpoint and an end-user standpoint. Historically speaking it has taken many years for new versions of DirectX to be widely adopted by most games. The reasons for this are varied, but it’s often a mix of slow user adoption of new OSes, slow developer adoption when working with multi-platform titles – developers tend to stick to the API that most closely matches the consoles – and the fact that new versions of DirectX and new hardware standards have often gone hand-in-hand.

DirectX 12 is very different in that respect, both because it runs on 2012+ hardware and because the necessary OS upgrade is free. In fact free is likely playing a huge part here, as Baker mentioned that Oxide is seeing a “fairly strong uptake” of the new OS. For reference, Steam’s most recent hardware survey puts Windows 10 64-bit adoption at 34% of all machines surveyed, and with a sub-1% gap, it’s likely that it will cross Windows 7 64-bit this month.

A relatively rapid adoption of Windows 10 by end-users means that developers can in turn make their own leaps sooner, as the necessary critical mass will be in place sooner than with past generations. Both Baker and Langley agreed that DirectX 12 will likely see faster adoption from developers than past generations have, as the user base is building up much sooner. Also helping matters is the fact that the consoles (particularly the Xbox One) are so similar to DirectX 12 with their own respective low-level APIs, which means that developers can synchronize multi-platform titles around low-level APIs much more easily than in past generations, when the consoles lagged behind. The APIs won’t be perfectly identical due to some inherent platform differences such as memory management (more on this later), but Microsoft is looking to make the Windows and console APIs as close as reasonably possible to help facilitate this.

Microsoft for their part is of course pleased with this outcome, but within the DirectX development team they have made it clear that they aren’t done yet: they want to do even more to drive the adoption of DirectX 12 among both end-users and developers, and to convince the holdouts to make the jump to Win10. Now that DX12 is out, they have been working on better tools for developers to make the API more approachable and easier to debug. And while Microsoft isn’t being specific, they are making it clear that they aren’t done adding features to the API, and that along with fixing bugs there’s more to come for DX12.

But what surprised me the most in our conversation on adoption was Baker’s comments on the state of DirectX 12. “DX12 is in far better shape than DX11 was in the first generation; it's way further along,” Baker said, and a lot of this has to do with the basic design of the API. Because DX12 is low-level and fairly thin, what bugs there are tend to be fairly straightforward. DirectX 11, by comparison, took years to sort out, and even then Baker doesn’t trust GPU drivers when it comes to DX11 multi-threading. DX12, meanwhile, is handling upwards of 16 threads from Ashes of the Singularity without encountering any issues. Which is not to say that DX12 is already perfect, but DX12 is on a path to quickly being in a better overall state than DX11 is even now, more than six years after the latter’s introduction.
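Baker’s 16-thread figure reflects one of DX12’s core design points: command recording is free-threaded, with each worker thread filling its own command list and the results then submitted together on a queue. A minimal sketch of that pattern, using `std::thread` and a plain struct as stand-ins for an engine job system and `ID3D12GraphicsCommandList` (the names here are illustrative, not the actual D3D12 API):

```cpp
#include <thread>
#include <vector>

// Stand-in for ID3D12GraphicsCommandList: each worker records into its own
// list with no cross-thread synchronization, mirroring DX12's free-threaded
// command recording model.
struct CommandList {
    std::vector<int> commands;  // recorded draw-call IDs
};

// Record `drawCount` draws across `threadCount` worker threads.
std::vector<CommandList> RecordInParallel(int drawCount, int threadCount) {
    std::vector<CommandList> lists(threadCount);
    std::vector<std::thread> workers;
    for (int t = 0; t < threadCount; ++t) {
        workers.emplace_back([&lists, t, drawCount, threadCount] {
            // Each thread owns lists[t] exclusively, so no locks are needed
            // while recording.
            for (int d = t; d < drawCount; d += threadCount)
                lists[t].commands.push_back(d);
        });
    }
    for (auto& w : workers) w.join();
    return lists;
}
```

On real hardware the per-thread lists would then be handed to `ID3D12CommandQueue::ExecuteCommandLists` in a single call; the point of the sketch is simply that nothing serializes the recording itself, which is what lets CPU work scale with core count.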

From Microsoft’s point of view, Langley echoed Baker’s statements. Working with developers, Microsoft is already finding that DX12 is typically a net win for CPU performance. Just how much any in-development title benefits from DX12 varies from game to game, but a common thread in all of these cases is that the earlier developers implement it, the better. Games that add DX12 at the last moment benefit the least – and Microsoft is trying to help developers integrate it sooner – whereas games that integrate it early, like Ashes, are seeing much more significant gains.

One question I threw at both groups was whether DX12’s lack of abstraction meant that developers were being exposed to any hardware bugs. And though there have been driver bugs, neither the developers Microsoft has worked with nor Oxide has run into notable hardware bugs. Given just how much hand-holding DX11 required at times from developers to adapt to implementation differences, DX12’s stricter implementation standards have made things a lot easier in some ways, even with the intricacies of working at a lower level.

Ultimately not only is DirectX 12 likely to be faster than any version of DirectX before it, but there’s a very real possibility that DirectX 12 will become the baseline version of the API for major games (outside of internal Microsoft projects) far sooner than with DirectX 11. Though making it clear that it’s merely an option on the table at this time and not yet a decision made, Baker said that Oxide’s next game may go DX12-exclusive, as adoption is strong and doing so would give Oxide’s developers the freedom to implement some new rendering strategies that they can’t properly implement in a game that needs to support both DX11 and DX12. Similarly, multi-platform developers looking to synchronize their projects between the consoles and Windows will have further incentive to go with DX12 exclusively if it means they can reuse the vast majority of their existing low-level code; a DX11 path in this case would mean spending a lot more effort on a rendering path for a single platform.

Developing For DirectX 12

One point that has consistently been reiterated about DirectX 12 and other low-level APIs is that they’re not for the faint of heart, and that making effective use of it will require more guru-level programmers who can work with a video card without all of the hand-holding that came with DirectX 11 and earlier APIs. And though DirectX 11 isn’t going anywhere, in our chat Microsoft said that they want to help more developers make the jump.

One part of that is going to be improving the tools situation for DX12, in order to give developers better and easier to understand tools to work with. Though Microsoft isn’t being specific at this time – and from the sounds of it this is what part of their GDC presentation will be about – Langley said that the DirectX group “really wants to take [DX12] and broaden it, and make it the API that everyone uses to do all of their game development." The path to DirectX 12 for many developers will still be through inheriting it from licensed engines, but for those developers who do go their own route, Microsoft wants to make the jump less painful.

Even so, for developers it has definitely been a learning experience. Making effective use of DX12 requires a better understanding of the underlying hardware, and how to best treat it. Avoiding pathologically bad cases is one major hurdle for new developers, particularly those who don’t have a firm grasp on the hardware. The low-level nature of DX12 means that more control over optimizations will be in the hands of developers – and they will need to rise up to the challenge for best results – as opposed to video card drivers.

At the same time, it’s also a new world for driver developers, and while drivers are now responsible for less of the optimization process, they still have their own role to play. Drivers remain responsible for exposing the various hardware queues and for HLSL shader compilation, not to mention implicit mode DX12 multi-adapter. So driver developers will still be a part of the optimization process, though in a different way than before.

Meanwhile in the case of Ashes of the Singularity, Oxide is in an interesting position, for better and for worse. As the first game to make extensive use of DX12’s strongest features, the game is a pathfinder for other games to follow. At the same time, because so many eyes are on the game, Oxide has needed to walk a sometimes narrow path to avoid favoring one hardware vendor or another (or being seen as doing so). As Baker notes, since the PC is such a large and varied platform compared to the highly regulated consoles, “You can never perfectly optimize for every platform because it's too much work,” so instead the name of the game is making generic optimizations and trying to be as even-handed as possible. The company has also been atypically transparent with its code, sharing it with all of the GPU vendors so that they can see what’s going on under the hood and give feedback as necessary.

An unexpected outcome of this has been that as Baker and the rest of the Oxide crew have needed to learn more about GPUs to better write for DirectX 12, they have also learned some things that have helped them write a more efficient DX11 rendering path. Though DX11 abstracts a great deal from developers, from a broad perspective there are still some algorithms and techniques that are a better match for modern hardware than others, and with DX12 strongly pushing developers towards taking efficiency into their own hands, this has impacted DX11 development as well.

Memory: A Uniquely PC Perspective

While we were on the subject of developing for DirectX 12, the matter of memory management came up, and how the PC situation is still unique compared to all other platforms. The consoles are fixed hardware devices, with the most recent incarnations running games inside hypervisors with a fixed memory allocation since only one game can be running at a time. Developers in turn don’t get all 8GB a console offers, but what they do get they can count on getting virtually the entire time.

The PC on the other hand is a very different beast. Besides the obvious matter of having separate VRAM and system DRAM pools for the GPU and CPU respectively (for systems with a discrete GPU), PCs are also multi-tasking environments. Games aren’t running in a hypervisor, and they can’t be written counting on receiving a specific allocation of memory all to themselves. This is coupled with the fact that the amount of VRAM on a video card varies wildly – between 2GB and 8GB for most recent cards – so developers can’t even count on the GPU having all the resources they would like to use.

Consequently, memory management under DirectX 12 is still a challenge, albeit one that’s evolving. Under DirectX 11 memory management was typically a driver problem, and the drivers usually got it right – though as Baker noted in our conversation, even now they do sometimes fail when dealing with issues such as memory fragmentation. DX12 on the other hand gives all of this control over to developers, which brings both great power and great responsibility. PC developers need to be concerned with issues such as memory overcommitment, and how to gracefully handle it. Mantle users will be familiar with this matter: most Mantle games would slow to a crawl if memory was overcommitted, which, although better than crashing, is not necessarily the most graceful way to handle the situation.
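To make the overcommitment problem concrete, consider the simplest policy an engine might adopt: track a VRAM budget, and when an allocation would exceed it, demote the least-recently-used resources rather than letting the allocation fail or the whole game crawl. The class below is a hypothetical toy, not the D3D12 residency API (which instead offers `ID3D12Device::Evict`/`MakeResident` and budget queries through `IDXGIAdapter3::QueryVideoMemoryInfo`):

```cpp
#include <cstddef>
#include <list>
#include <unordered_map>
#include <vector>

// Toy VRAM budget tracker: on overcommit, evict least-recently-used
// resources (e.g. demote them to system memory) to make room.
class VramBudget {
public:
    explicit VramBudget(std::size_t budgetBytes) : budget_(budgetBytes) {}

    // Places resource `id` in VRAM; returns the IDs evicted to make room.
    std::vector<int> Allocate(int id, std::size_t bytes) {
        std::vector<int> evicted;
        while (used_ + bytes > budget_ && !lru_.empty()) {
            int victim = lru_.back();  // least recently used resource
            lru_.pop_back();
            used_ -= sizes_[victim];
            sizes_.erase(victim);
            evicted.push_back(victim);
        }
        lru_.push_front(id);
        sizes_[id] = bytes;
        used_ += bytes;
        return evicted;
    }

    std::size_t used() const { return used_; }

private:
    std::size_t budget_;
    std::size_t used_ = 0;
    std::list<int> lru_;  // front = most recently used
    std::unordered_map<int, std::size_t> sizes_;
};
```

A real engine would layer far more on top of this – per-resource priorities, streaming, fragmentation handling – but the sketch shows the basic shift: under DX12 this bookkeeping lives in the application, where under DX11 the driver made these decisions invisibly.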

As a result it’s still a learning process across the board for DX12 developers. In developing Ashes, Oxide has developed new strategies to deal with memory management, though it has taken some time to do so. However successfully tackling DX12 memory management also reaps its own rewards: since even automated DX11-style memory management is not without its faults, a well-tuned DX12 implementation has the potential to exceed DX11, offering better performance and avoiding DX11’s faults in the process.

Though even with better tools, this will always be something that sets the PC apart from the consoles in the low-level API space. As Microsoft noted in our call, their goal is to align the console and Windows DirectX APIs as closely as possible, but memory management will be one of a handful of areas where the two APIs still diverge.

Looking Towards the Future

Though much of our conversation was focused on the present, both Baker and the DirectX team are also looking towards the future of DirectX 12. I’ve previously mentioned Microsoft’s plans to improve the toolset available for DX12, but tools are only one part of the equation. At the end of the day DX12 is a very powerful API, and it’s up to developers to make the best possible use of it.

In Oxide’s case, Ashes is ahead of the curve in several ways. Along with utilizing DX12’s more fundamental multi-threading capabilities, it’s also been pushing the envelope on features such as asynchronous shading/compute and multi-GPU support. In fact both the DirectX team and Oxide were surprised with just how well the latter worked at this early stage, with Baker noting that the image quality from AMD and NVIDIA GPUs was closer than he expected. And though Ashes’s unique AFR multi-GPU support is one possible implementation, the wider development community also has their eyes on looking at ways to meaningfully combine a dGPU and an iGPU, as virtually all dGPU systems have the latter, and it’s currently going unused.

As for asynchronous shading, for Ashes it’s primarily being used to improve performance by improving GPU utilization. However Baker believes this is just scratching the surface of the technology, and once DX12 becomes the baseline API for a game, there are far more exotic uses he wants to look into. This includes having the GPU work on more pure compute tasks, such as running first-order physics simulations or parts of the game simulation on the GPU rather than the CPU. And this wouldn’t just apply to clients; in games with a dedicated server, the server could be GPU accelerated as well, using the GPU in a pure GPGPU context to do some of the aforementioned compute work.
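The appeal of the first-order work Baker describes is that each element updates independently, so the loop body maps one-to-one onto compute-shader threads on an async queue. A sketch of such an update – a semi-implicit Euler step over a particle array, run on the CPU here purely for illustration (the data layout is hypothetical, not taken from Ashes):

```cpp
#include <vector>

// Hypothetical particle state; on the GPU this would live in a structured
// buffer, with one compute-shader thread handling one particle.
struct Particle {
    float pos;
    float vel;
};

// Semi-implicit Euler step: velocity first, then position. Every iteration
// is independent of the others, which is what makes this kind of simulation
// a good candidate for offloading to an async compute queue.
void IntegrateEuler(std::vector<Particle>& particles, float accel, float dt) {
    for (auto& p : particles) {
        p.vel += accel * dt;
        p.pos += p.vel * dt;
    }
}
```

And because nothing here touches the display, the same dispatch works in a headless GPGPU context, which is what makes Baker’s GPU-accelerated dedicated server idea plausible.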

Though for the time being, it may be all that the rest of the PC ecosystem can do to keep up with DX12 as-is. While every game will be unique, in the case of Ashes Oxide has already run into situations where they are both CPU memory bandwidth and CPU core count limited. Much of this has to do with the game’s expensive AI and simulation code paths, but as Baker was all too proud to recount, Ashes’ QA team had to go track down a more powerful system for multi-GPU testing, as their quad core systems were still CPU limited. DX12’s low-level nature is going to reduce CPU usage in some ways, but its multithreading capabilities are going to scale CPU demands back up in other ways, which may very well push the limits of conventional quad core CPUs in other games as well.

Ultimately, even pathfinder games like Ashes are still treating DX12 as a more advanced graphics API, which certainly reaps several immediate benefits, but that isn’t the only thing the API is good for. As we’ve already seen in some instances with other low-level APIs such as Apple’s Metal, these kinds of APIs are a path towards using the GPU as a general compute processor, and game developers have not even begun to scratch the surface there. Once games start using DX12 as a baseline API, many more options become available to developers who are prepared to break traditional graphics rendering paradigms.

Wrapping things up, be sure to check back next week for our GDC 2016 coverage. With confirmed events from AMD, Crytek, Epic Games, Microsoft, and more, it should be another busy and interesting year for PC game development.

Comments

  • Voldenuit - Saturday, March 12, 2016 - link

    Gee, if MS had released DX12 for Win7, they would have close to 100% adoption of DX12...
    And they could have released UWP and Win Store for Win 7 (and Win 8) as well, keeping what ppl wanted while avoiding what ppl didn't (loss of control over updates, loss of MCE, privacy concerns etc.).

    I'm no Win10 conspiracist (my laptop is Win10), but my gaming desktop is staying Win 7 for as long as I can.
  • siriq - Saturday, March 12, 2016 - link

    Ryan! Any chance to face Nvidia with Fermi DX 12 question?
  • Ryan Smith - Sunday, March 13, 2016 - link

    It's on the list for GDC. But I do know that NV has it working in dev drivers.
  • siriq - Sunday, March 13, 2016 - link

    Thx for that. I was just checking some interesting topics. Nvidia also starts early on Monday with DX12.
  • BrokenCrayons - Monday, March 14, 2016 - link

    "Because DX12 is low-level and fairly thin, what bugs there are tend to be fairly straightforward."

    While I'd love to believe that the simplification of the API is the magic bullet that will solve all problems in the world, the unstated bottom line is that GPUs remain complex pieces of computing hardware. Taking advantage of that complex hardware still requires a certain amount of software complexity SOMEWHERE even if it was ejected from the API by Microsoft. It went somewhere and that somewhere appears to be onto the shoulders of game developers that ultimately answer to cost-sensitive game publishers. At least now, when we get the next Batman: Arkham Shovelware or Bethesda Softworks sandbox RPG that requires 3rd party modder patches to actually add the required sand, we can more rightfully point angry fingers at the publisher and developer studio.

    On a side note, I do hope developers focus on Vulkan (though I realize how much of a delusional pipe dream that is) for those of us who are pure penguins after getting fed up with setting up firewall rules to stop Windows 10 from Big Brothering our computers with all those encrypted 40MB packages it periodically pushes back up to the mothership.
  • Stuka87 - Monday, March 14, 2016 - link

    Of course no mention was made of some of the more recent DX12 failures, such as Gears of War, which is an absolute travesty.
  • xenol - Monday, March 14, 2016 - link

    It's nice to see this article show that DX12 and by extension, Vulkan, isn't some kind of free ride to performance. Nor is it going to be a quick transition. These API are changing the way developers have to code their games.
