System Performance

When I started testing these two systems from Puget Systems, I honestly wasn't prepared for the tug of war that would occur. Fundamentally, the results are what you'd expect: the Intel CPU outclasses the AMD APU at every turn, while AMD's integrated graphics thoroughly outclasses Intel's. What impressed me was just how wide the gaps were. Take a look.

Futuremark PCMark 7

Futuremark PCMark Vantage

3D Rendering - CINEBENCH R10

3D Rendering - CINEBENCH R10

3D Rendering - CINEBENCH R11.5

Video Encoding - x264

Video Encoding - x264

While the A6-3500's CPU performance would certainly be fine for a notebook, it's absolutely lousy on the desktop. Granted, much of our competition is pretty unfair, with overclocked systems abounding, but look at how badly it struggles even against a last-generation Phenom II X4 955, much less the Intel Core i5-2320 in the Alienware X51. The i7-2600S is consistently two to three times faster in roughly the same power envelope.
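For context, the x264 test above is a CPU-bound two-pass encode. Here's a minimal dry-run sketch of that kind of workload; the file names, bitrate, and preset are illustrative rather than the benchmark's actual settings, and the function only prints the commands instead of running them:

```shell
# Hypothetical two-pass x264 encode of the sort the benchmark exercises.
# This is a dry run: each command is echoed, not executed.
encode_two_pass() {
  local input="$1" output="$2" bitrate_kbps="$3"
  # Pass 1: analysis only; the encoded video is discarded.
  echo x264 --pass 1 --bitrate "$bitrate_kbps" --preset slow \
    --stats x264_2pass.log -o /dev/null "$input"
  # Pass 2: reuse the stats file to allocate bits optimally.
  echo x264 --pass 2 --bitrate "$bitrate_kbps" --preset slow \
    --stats x264_2pass.log -o "$output" "$input"
}

encode_two_pass clip.y4m clip.mkv 4000
```

Both passes run the full encoder pipeline on the CPU, which is why core count and per-core throughput dominate the results above.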

To be fair, though, these results need to be framed in a more meaningful way than just "the A6-3500's CPU is dog slow." We need to consider the environments in which these systems are going to be used, and at the risk of sounding like an AMD apologist, I don't see many situations where the Intel chip's mammoth lead over the A6-3500 is going to be relevant. The A6-3500 is fine for basic Photoshop work, and neither of these systems is really ideal for serious video editing, which demands a much faster storage subsystem and CPU/GPU than either can provide, internally or externally.

Where a computer is much more likely to see frequent (if casual) use is in trying to run games, and here's where things take a turn.

Futuremark 3DMark 11

Futuremark 3DMark Vantage

Futuremark 3DMark06

Gaming Performance

The Intel Core i7-2600S, with its crippled IGP, can't even run two of our benchmarks and produces playable performance in only one of them: Portal 2, with its ancient (albeit updated) Source engine. Meanwhile, the Radeon HD 6530D inside the A6-3500 can stretch its legs, delivering playable performance in every game except Battlefield 3, where a dip in resolution or settings would make that game playable as well.

Granted, these are conservative settings at a relatively low resolution, but the point remains: if someone wants to play a game on the A6-3500, they can, and reasonably comfortably. I've seen people suddenly decide they want to try a game only to discover their system's integrated graphics can't handle it at all, and forums are rife with threads from people asking how to upgrade the graphics in their cheap desktops or notebooks, all met with the same answer: "you're screwed." With larger desktop systems it's a different matter, but with mini-ITX systems and laptops you have to be prepared to live with whatever graphics the system includes from the factory.


62 Comments


  • Spunjji - Friday, March 23, 2012 - link

    This is addressed very well in the article, particularly in the conclusions. The editorial is nicely balanced if you take the time to read it.

    Yes, it stinks that people will look at the graphs in this and nothing else, but that's people for you and it's AMD's responsibility to combat that.
  • trane - Wednesday, March 21, 2012 - link

    Since you have brought up video editing and gaming as the two usage scenarios, I would like to contest that Llano would do the former much slower than SNB. The GPU is not just for gaming, but GPGPU as well. Quite a few editing applications today are becoming heavily OpenCL GPU-accelerated, including Sony Vegas and CyberLink PowerDirector. I would have also mentioned Premiere Pro, but it is CUDA-only for now; it should be OpenCL in the near future. Perhaps you should add a Sony Vegas Pro benchmark to your suite. Sony already have a standard benchmark project available (http://www.sonycreativesoftware.com/vegaspro/gpuac... and GPUs bring massive gains over CPU-only.

    Just a suggestion, as Llano's GPGPU capabilities almost always go unnoticed, and unfairly so. Yes, not many applications are heavily GPU-accelerated today, but video editing is certainly one of them.

    It's a pity the A8-3800 isn't available; that would have been pretty great, and much faster than the A6-3500 for a small price.
  • trane - Wednesday, March 21, 2012 - link

    Here's a link to the benchmark project: www.sonycreativesoftware.com/vegaspro/gpuacceleration
  • Dustin Sklavos - Wednesday, March 21, 2012 - link

    The problem is that GPGPU and dedicated hardware encoding still, to my knowledge, have issues with end quality. If you're just transcoding for the internet or for yourself, they're probably fine, but CPU-only encoding remains the gold standard.

    That said, Premiere CS5.5 benefits tremendously from CUDA, but not entirely on the encoding side. Mercury Playback Engine still produces reference quality video, but CUDA accelerates decoding and effects layering on the timeline by a substantial degree, in some cases meaning the difference between editing in realtime and not.

    GPGPU has promise but that promise is, presently, nascent on the desktop.
  • trane - Wednesday, March 21, 2012 - link

    Do note that I am not referring to encoding! In Vegas Pro the entire video processing pipeline is heavily GPU accelerated: from decoding to colour space transforms to scaling to transitions/motion graphics to nearly all video effects, everything is GPU accelerated even before we hit the encoding stage. It's much more extensive than Premiere Pro. Do give the benchmark project a try; you might be surprised how far GPGPU has come.
  • silverblue - Thursday, March 22, 2012 - link

    The i7-2600S sports QuickSync, so if the software supports it, it may not actually be a victory for AMD on this one.
  • hypercube33 - Wednesday, March 21, 2012 - link

    This is bull. As sabot posted, they have plenty of higher-powered APUs available, up to the newer A8-3800.

    This is like cutting off your opponent's arms and then saying he didn't even throw a punch. I am not saying AMD is better, but this review is skewed so badly that it's not even close to worth publishing.
  • weiran - Wednesday, March 21, 2012 - link

    Yes, AMD have a more powerful APU in the A8-3800. But available?

    I'm in the UK, so availability is probably even worse than in the US, but I've been unable to find any stock of the A8-3800. The only place you can get one seems to be in pre-built HP desktops.
  • silverblue - Thursday, March 22, 2012 - link

    Not to mention the significantly higher power consumption of a quad-core CPU at 2.9GHz with much more powerful graphics. Sure, the 38xx series would be preferable, if only you could actually get hold of them.

    The 3870K is available on CCL for £103, but the A6-3500 is a mere £55 from the same site. The only models available there are the A4-3300, A4-3400, A6-3500, A6-3670K and A8-3870K; there is no sign of the 3800 or 3820.
  • djfourmoney - Thursday, April 12, 2012 - link

    It's not just you; none of the major e-tailers carry the A8-3800 in North America. The B&Ms don't either (Micro Center and Fry's). That's why I got the A6-3500 and called it a day.

    An A4-3400 is plenty for HTPC use; it handles 29/59 and 3D without issues. Once you start using third-party stuff like madVR, you may have a few problems with interlaced content, as found in Rene's testing on the AVS forums.

    The triple-core is the best-case scenario for price and performance. It will do what my current system does, only faster (my current rig is a 5000+ BE), and that's plenty for me.

    Motherboard, APU, SSD and memory, all for under $200.

