SPEC2006 Perf: Desktop Levels, New Mobile Power Heights

Given that we didn’t see too many major changes in the microarchitecture of the large Lightning CPU cores, we wouldn’t expect a particularly large performance increase over the A12. However, the 6% clock increase, along with a few percent improvement in IPC thanks to improvements in the memory subsystems and core front-end, could, should, and does end up delivering around a 20% performance boost, which is consistent with what Apple is advertising.
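
As a rough sanity check, performance scales multiplicatively with clock speed and IPC, so the two contributions compound. A minimal sketch of that arithmetic, using only the figures quoted in this section (the ~6% clock bump, our 17% SPECint2006 average, and Apple's ~20% claim):

```python
# Back-of-the-envelope: performance ~ IPC * frequency.
# freq_gain is the ~6% clock bump; the perf_gain values are the 17%
# SPECint2006 average measured below and Apple's advertised ~20%.

freq_gain = 1.06

for perf_gain in (1.17, 1.20):
    ipc_gain = perf_gain / freq_gain
    print(f"{perf_gain:.2f}x overall -> ~{(ipc_gain - 1) * 100:.0f}% from IPC")

# 1.17x overall -> ~10% from IPC
# 1.20x overall -> ~13% from IPC
```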

I’m still falling back to SPEC2006 for the time being as I haven’t had time to port and test SPEC2017 for mobile devices yet – it’s something that’s in the pipeline for the near future.

In SPECint2006, the improvements in performance are relatively evenly distributed. On average we’re seeing a 17% increase in performance. The biggest gains were in 471.omnetpp, which is latency bound, and 403.gcc, which puts more pressure on the caches; these tests saw respective increases of 25% and 24%, which is quite significant.

The 456.hmmer score increases are the lowest at 9%. That workload is heavily execution-backend-bound, and, given that the Lightning cores didn’t see many changes in that regard, we’re mostly seeing minor IPC increases here along with the 6% increase in clock.

While the performance figures are quite straightforward and don’t reveal anything surprising, the power and efficiency figures on the other hand are extremely unexpected. In virtually all of the SPECint2006 tests, Apple has increased the peak power draw of the A13 SoC, and in many cases we’re almost 1W above the A12. At peak performance the power increase was greater than the performance increase, which is why in almost all workloads the A13 ends up less efficient than the A12.
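
To illustrate why a roughly 1W power increase can outweigh a ~17% performance gain, here is a minimal performance-per-watt sketch; the 4W baseline is a hypothetical round number for illustration, not one of our measured figures:

```python
# Why a ~1W power increase can outweigh a ~17% performance gain.
# The 4W baseline is a hypothetical round number, not a measured figure.

def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

a12_eff = perf_per_watt(score=1.00, watts=4.0)        # normalized A12 baseline
a13_eff = perf_per_watt(score=1.17, watts=4.0 + 1.0)  # +17% perf, ~1W more power

print(f"efficiency change: {(a13_eff / a12_eff - 1) * 100:+.1f}%")
# -> efficiency change: -6.4% with these assumed numbers
```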

In the SPECfp2006 workloads, we’re seeing a similar story. The performance increases of the A13 are respectable, averaging 19% across the suite, with individual increases between 14% and 25%.

The total power use is quite alarming here, as we’re exceeding 5W for many workloads. In 470.lbm the chip went even higher, averaging 6.27W. Had I not been actively cooling the phone and purposefully keeping it from throttling, the chip would not have been able to maintain this performance for prolonged periods.

Here we saw a few workloads that were kinder in terms of efficiency: while power consumption is still notably increased, it scales more linearly with performance. In others, however, we’re still seeing an efficiency regression.

Above is a more detailed historical overview of performance across the SPEC workloads and our past tested SoCs. We’ve now included the latest high-end desktop CPUs as well to give context as to where mobile silicon stands in terms of absolute performance.

Overall, in terms of performance, the A13 and the Lightning cores are extremely fast. In the mobile space, there’s really no competition as the A13 posts almost double the performance of the next best non-Apple SoC. The difference is a little bit less in the floating-point suite, but again we’re not expecting any proper competition for at least another 2-3 years, and Apple isn’t standing still either.

Last year I noted that the A12 was just margins off the best desktop CPU cores. This year, the A13 has essentially matched the best that AMD and Intel have to offer, in SPECint2006 at least. In SPECfp2006 the A13 is still roughly 15% behind.

In terms of power and efficiency, the A13 seemingly wasn’t a very successful iteration for Apple, at least when it comes to the efficiency at the chip’s peak performance state. The higher power draw should mean that the SoC and phone will be more prone to throttling and sensitive to temperatures.


[Chart: estimated frequency/power curve. This is the A12, not the A13]

One possible explanation for the quite shocking power figures is that, for the A13, Apple is riding the far end of the frequency/voltage curve at the peak frequencies of the new Lightning cores. In the above graph we have an estimated power curve for last year’s A12; here we can see that Apple is very conservative with voltage up until the last few hundred MHz. It’s possible that for the A13 Apple was even more aggressive in the later frequency states.
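
The usual first-order intuition here is that dynamic power scales roughly with V²·f, so any extra voltage needed to stabilize the last few hundred MHz gets squared. A minimal sketch with made-up voltage/frequency points (not Apple's actual DVFS states):

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# The voltage/frequency pairs below are made up purely to illustrate why
# the last few hundred MHz are disproportionately expensive; they are not
# Apple's actual DVFS states.

def rel_power(volts: float, freq_ghz: float) -> float:
    return (volts ** 2) * freq_ghz   # the capacitance term cancels in ratios

mid  = rel_power(volts=0.85, freq_ghz=2.3)   # hypothetical upper-mid state
peak = rel_power(volts=1.05, freq_ghz=2.66)  # hypothetical peak state

print(f"frequency: +{(2.66 / 2.3 - 1) * 100:.0f}%")    # +16%
print(f"dyn power: +{(peak / mid - 1) * 100:.0f}%")     # +76%
```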

The good news about such a hypothesis is that the A13, on average and in daily workloads, should be operating at significantly more efficient operating points. Apple’s marketing materials describe the A13 as being 20% faster while also stating that it uses 30% less power than the A12, which unfortunately is phrased in a deceiving (or at least unclear) manner. While we suspect that a lot of people will interpret it to mean that the A13 is 20% faster while simultaneously using 30% less power, it’s actually either one or the other. In effect, what this means is that at a performance point equivalent to the peak performance of the A12, the A13 would use 30% less power. Given the steepness of Apple’s power curves, I can easily imagine this to be accurate.
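
To make the two readings concrete, a small sketch in normalized units; the values are placeholders that simply encode the marketing numbers, not measurements:

```python
# The two readings of Apple's "20% faster / 30% less power" claim,
# normalized to the A12's peak state (perf = 1.0, power = 1.0).
# Values are placeholders encoding the marketing numbers, not measurements.

a12_peak = {"perf": 1.00, "power": 1.00}

# Misreading: both improvements at once at the A13's peak state.
misread = {"perf": 1.20, "power": 0.70}

# Actual meaning: two separate operating points on the A13's power curve.
a13_peak = {"perf": 1.20}                 # power unspecified here (and higher, per our data)
a13_iso  = {"perf": 1.00, "power": 0.70}  # A12-equivalent performance point

saving = (1 - a13_iso["power"] / a12_peak["power"]) * 100
print("misreading     :", misread)
print("actual meaning :", a13_iso, f"-> {saving:.0f}% less power at A12-level performance")
```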

Nevertheless, I do question why Apple decided to be so aggressive in terms of power this generation. The N7P process node used in this generation didn’t bring any major improvements, so it’s possible they were in a tough spot, having to choose between increasing power or making do with more meager performance increases. Whatever the reason, in the end it doesn’t cause any practical issues for the iPhone 11 series, as the chip’s thermal management is top notch.

Comments

  • Quantumz0d - Wednesday, October 16, 2019 - link

    Lol. Very funny. Just 2.5% of the screen, why not take a pencil and poke a hole in your laptop/monitor display and say it's just 0.3%, like Samsung's HOLED or this dead pixel zone. And put an RGB strip around/inside it to make it similar to the S10's hole notification LED.

    A phone's primary component is the display, which provides the man-machine interface, and if that itself is ruined, then no matter how much value the device has, it is a complete waste. Even Google realized this after their horrible Watertub 3XL disaster notch.

    A notch or hole is worse than the asymmetric bezel on the Pixel 4, and symmetric bezels are much better to look at - V30, ROG II, Note 8, S8 - while the S9, Note 9, OP7 Pro, Zenfone 6, and Nex have asymmetric designs, but they are still preferable to the dreaded notch abomination or holes any day, any time.

    Apple did this crazy thing in the 21st century. If engineers from the CRT era were around they'd be laughing like no tomorrow. Kudos to Apple for creating something utterly miserable and complete bullshit.

    Why didn't the CMOS sensors, Panavision or any film cameras, or any display shape ever have a notch or hole? Or our eyes, or any animal eyes; even insects like spiders with many eyes have full uninterrupted vision with no hole or notch in their view, hell, even mirrors? Until this POS hit the market the perception of a display was perfectly uniform with no interruption; it all changed thanks to Apple. They should be ashamed of creating the worst abomination while claiming the best industrial design.

    "Sir" Jony Ive should be ashamed of his Knighthood lmao. I wonder if he left because he couldn't stand this Notchabomination as a black mark on his life forever (also of note, since the Chief Design Officer role doesn't exist anymore and the design teams report to the Operations COO, it's interesting whether Andrei thinks this is the last Notch BS from Apple or whether we will see a smaller notch in 2020 lol)
  • uhuznaa - Thursday, October 17, 2019 - link

    You have a literal blind spot in your eye where the optical nerve passes through the retina. Really. You just don't see it because it's always there. Same as with the notch by the way...
  • Quantumz0d - Thursday, October 17, 2019 - link

    No man, stop convincing yourself, a retina blind spot is not a notch lol. 210 degree FoV and a blind spot, how can you relate these? Insane.
  • uhuznaa - Thursday, October 17, 2019 - link

    OK, so then just put a piece of black tape on the parts of the screen left and right of the notch and you have a bezel instead if you like this better.

    Seriously, what's so bad about having a bezel on top that still displays a clock and a few status symbols in the left and right corners? Exactly this is what the notch is. It just means using parts of the bezel in a limited but still useful way. I really don't understand what you notch-haters want instead of a notch. A full-width notch (aka "bezel")? Why would this be better? It would just push the status bar down, leaving less room for actual content for apps.
  • WinterCharm - Thursday, October 17, 2019 - link

    The notch is the only weak excuse some Android fanboys have to convince themselves about why they can't move to iPhone.
  • willis936 - Friday, October 18, 2019 - link

    The space is virtually useless and makes for a more difficult user experience. Nothing is gained by forcing information into a tiny nook or to have the user need to stretch to hit a small target. Just keep the display a rectangle. It just works.
  • Quantumz0d - Sunday, October 20, 2019 - link

    Exactly, but look at the Apple apologists, they will buy whatever excuse they see fit to make the company look great. Point out anything wrong and bam, personal attacks, eh. New low for AT commenters to be honest. "Insecure Android phone" these noobs do not know how an Israeli company deals with breaking smartphones. Idiots. Only god can help them.
  • Anand2019 - Monday, October 21, 2019 - link

    "Only god can help them". What does a fantasy figure have to do with this? Maybe santa can help them too?
  • Xyler94 - Thursday, October 17, 2019 - link

    Every time I look at my mother's iPhone X, the first thing that pops into my mind is "why do people fuss so much about the notch?"

    It's barely noticeable when you use it. It's just like your nose: your brain will ignore your nose when you're looking, but the moment you realize your nose is in your view, you'll see it again. Same with the notch, don't even think about it, and it becomes a non-factor.
  • Total Meltdowner - Thursday, October 17, 2019 - link

    ahahahahahahahahaahAHAhahaha!
