Company of Heroes 2

Our second benchmark is Relic's Company of Heroes 2, the developer's World War II Eastern Front themed RTS. For Company of Heroes 2 Relic was kind enough to put together a very strenuous built-in benchmark, captured from one of the most demanding, snow-bound maps in the game, giving us a great look at CoH2's performance at its worst. Consequently, if a card can do well here, it should have no trouble throughout the rest of the game.

Since Company of Heroes 2 is not an AFR-friendly game, getting the best performance out of it requires having the fastest single GPU. While the GTX 780 Ti has a clear lead over the 290X averaged across our games, in this particular title it's going to come up short, as AMD's performance here is simply too strong to be overcome without a significant performance advantage. Conversely, this means the GTX 780 Ti and 290X are still close enough that NVIDIA won't be able to sweep every game; in titles where AMD does exceptionally well, the 290X will be able to close the gap and surpass the GTX 780 Ti.

Meanwhile, looking at a straight-up NVIDIA comparison, the GTX 780 Ti holds a slightly smaller than usual lead over its counterparts. At 5% faster than the GTX Titan and 17% faster than the GTX 780, it's still the fastest of NVIDIA's cards, but it won't pull ahead in this game by as much as it does elsewhere.

The minimum framerate story is largely the same: the GTX 780 Ti is the fastest NVIDIA card, but it trails the 290X by over 10% in both scenarios.

Comments

  • 1Angelreloaded - Thursday, November 7, 2013 - link

    Physically the 780 and 780 Ti are literally the same unit, minus minor things. The difference is the neutered chip and OC'd VRAM, which means you're paying for the same unit at 2 completely different prices. In fact, how much does it cost to disable an SMX? So shouldn't the overhead on the original unit be higher with more work having to be done? Or, like AMD's tri-cores, are we paying for defective chips again?
  • TheJian - Thursday, November 7, 2013 - link

    Wrong. They have been saving DEFECT FREE GK110 units for months just to be able to launch this with good quantity (probably only started having better results at B1, which all of these are). I doubt that there are many 780s that have fully working units that are disabled. They are failed Tesla chips (you can say DP is disabled on purpose, but not the SMXs). Do you really think 550mm² chips have a ZERO defect rate?...LOL. I would be surprised if the first runs of Titan had any more working SMXs either, as they were directly failed Teslas. Sure, there are probably a few cards with some working units that are disabled, but yields and history say that with chips this big there just has to be a pretty high defect rate vs. 100% working chips. It is pretty much the largest chip TSMC can make. That's not easy. Both AMD and NV do this to salvage failed chips (heck, everybody does). You come with a flagship, then anything that fails you mark as a lower model (many models). It allows you to increase your yield and the number of chips that can be sold. You should be thankful they have the tech to do this or we'd all be paying FAR higher prices due to chucking chips by the millions in the trash.
  • TheJian - Thursday, November 7, 2013 - link

    http://www.tomshardware.com/reviews/geforce-gtx-78...
    "Not to be caught off-guard, Nvidia was already binning its GK110B GPUs, which have been shipping since this summer on GeForce GTX 780 and Titan cards. The company won’t get specific about what it was looking for, but we have to imagine it set aside flawless processors with the lowest power leakage to create a spiritual successor for GeForce GTX 580. Today, those fully-functional GPUs drop into Nvidia’s GeForce GTX 780 Ti."

    There, don't have to believe me...confirmed I guess ;)
  • beck2448 - Thursday, November 7, 2013 - link

    Great job Nvidia! I think the partners with custom cooling will get another 15 to 20% performance out of it with lower temps and less noise, and that is insane for a single GPU. Can't wait to see the Lightning and Windforce editions.
  • aznjoka - Thursday, November 7, 2013 - link

    The CrossFire scaling on the 290X is much better than the 780 Ti's SLI scaling. If you are running a dual-card setup, getting a 290X is pretty much a no-brainer.
  • beck2448 - Thursday, November 7, 2013 - link

    From Benchmark Reviews: "In conclusion, GeForce GTX 780 Ti is the gamer's version of GTX TITAN with a powerful lead ahead of Radeon R9 290X. Even if it were possible for the competition to overclock and reach similar frame rate performance, temperatures and noise would still heavily favor the GTX 780 Ti design. I was shocked at how loud AMD's R9 290X would roar once it began to heat up midway through a benchmark test, creating a bit of sadness for gamers trying to play with open speakers instead of an insulated headset. There is a modest price difference between them, but quite frankly, the competition doesn't belong in the same class."
    Read more at http://benchmarkreviews.com/8468/nvidia-geforce-gt...
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    TBH the stock NVIDIA cooler isn't that much better either; it tends to run a lot hotter than ACX/Twin Frozr and other such solutions, so both vendors have cooling headroom. Hawaii, though, is just plain ridiculous.
  • deedubs - Thursday, November 7, 2013 - link

    Noticed the graph for ShadowPlay performance has its labels reversed. It makes it look like ShadowPlay increases performance instead of decreasing it.
  • Ryan Smith - Thursday, November 7, 2013 - link

    Whoops. Thanks for that. The multi-series graph tool is a bit picky...
  • Filiprino - Thursday, November 7, 2013 - link

    I don't see NVIDIA as a real winner here, really. Their margin is very tight, AMD's drivers still have to mature, and when you talk about CrossFire, AMD is doing clearly better, for $200 less and 6 dB more.
