Gaming Performance

In our synthetic tests we started to get the first inkling that the graphics subsystem in the Alienware M18x R2 may actually be CPU-limited. This isn't entirely surprising; the GeForce GTX 680M is a tremendous leap in performance over the 580M and roughly on par with a desktop GTX 570. That card was itself already essentially CPU-limited at 1080p in most cases.

We start with our mainstream benchmark suite, which the pair of 680Ms should have no trouble at all tearing through.

Batman: Arkham City - Mainstream

Battlefield 3 - Mainstream

Civilization V - Mainstream

DiRT 3 - Mainstream

Elder Scrolls: Skyrim - Mainstream

Portal 2 - Mainstream

Total War: Shogun 2 - Mainstream

Unfortunately, while we still can't get Total War: Shogun 2 to benchmark properly on the M17x R4, it's pretty clear that mainstream settings are an absolute waste of the two GTX 680Ms in SLI, and many of our games appear CPU-bound. So let's see what happens when we crank things up.

Batman: Arkham City - Enthusiast

Battlefield 3 - Enthusiast

Civilization V - Enthusiast

DiRT 3 - Enthusiast

Elder Scrolls: Skyrim - Enthusiast

Portal 2 - Enthusiast

Total War: Shogun 2 - Enthusiast

Well, there it is, and that's pretty much what I was getting at. While in some cases (Batman: Arkham City and Battlefield 3) the pair of GTX 680Ms appears to be GPU-limited and thus offers a fairly strong, nearly linear increase in performance, in most situations it seems clear the CPU is holding back the graphics hardware. Battlefield 3 demonstrates what should've been a foregone conclusion—two GTX 680Ms have more horsepower than a single desktop GTX 680—but the other games are more telling. We already knew a single GTX 680M was more than enough for a mobile gamer (heck, the 580M/675M was basically on the cusp), but impressively we now have a graphics subsystem that is actually outright excessive for a notebook.



Comments

  • DarthPierce - Friday, September 28, 2012

    I have a hard time believing a Samsung Series 7 with a 3615QM and GT 650M is getting scores 4x higher than a 3720QM with RAID 0 SSDs and GTX 680s in SLI (22890 vs 5542).

    If those scores are real, why aren't they explained?
  • Dustin Sklavos - Friday, September 28, 2012

    Quick Sync can dramatically bloat certain scores.
  • Freakie - Friday, September 28, 2012

    "...but if you absolutely must have the most performance you can cram in a notebook, pricetag be damned, obviously this is the way to go."

    I don't know, I'd probably go with a Clevo P370EM over this... especially since I can configure a Sager one $800 cheaper and not have to worry about voiding my warranty if I want to apply my own paste xP Though anyone dropping this kind of money on a mobile gaming rig can more than choose whatever brand they want xP
  • GTRagnarok - Friday, September 28, 2012

    Repasting an AW doesn't void the warranty. I might consider a Clevo if they put more thought into certain things, like the keyboard, for instance. They cheap out and use the same keyboard on their 15" and 17" laptops. You end up with a pretty pathetic keyboard for a 17". And bad layout aside, the AW keys are just way nicer.
  • Meaker10 - Friday, September 28, 2012

    The cooling just does not compare either.
  • SlyNine - Friday, September 28, 2012

    When I had my Studio XPS 16, one of the guys (after getting to know my ability a little better) said it would be fine to change the thermal paste. He even suggested it might help it throttle less.

    I wonder if these suffer from any throttling issues?
  • bennyg - Saturday, September 29, 2012

    "Pricetag be damned," and then every point you mention is about price or warranty.

    AW is also one of the very, very few who bother to get GPU switching working (and working well) on their big gaming rigs. While the 2.5-3 hrs I get on my Clevo P150HM is good enough for me (and didn't justify the premium commanded by AW), the AW machines have always been better on that front, ever since they got an integrated 9400M working alongside the SLI'ed 9800Ms a number of years ago. I think that was even before NVIDIA brought out Optimus?
  • PCMerlin - Friday, September 28, 2012

    Open article.
    WUXGA or better screen?
    Close article.
  • KineticHummus - Friday, September 28, 2012

    You missed a step. You forgot to add in making a worthless comment.
  • Dustin Sklavos - Friday, September 28, 2012

    1920x1200 is dead. Get over it.
