First Samsung Cortex-A57, A53 chips arrive with big performance boosts


For years, we’ve covered the advent of 64-bit processors in the ARM ecosystem, from Apple’s custom A7 and A8 to ARM’s own Cortex-A53 and Cortex-A57. What’s been missing from the equation, however, is an actual evaluation of how ARM’s next-generation silicon compares to its previous efforts. Now we’ve got a first look at that question, and an evaluation of Samsung’s blink-and-you’ll-miss-it 20nm technology to boot. The Exynos 5433 is one of the few chips to use Samsung’s 20nm process; the company has focused its efforts almost entirely on its upcoming 14nm FinFET node.

Anandtech has put together a comprehensive deep dive into all aspects of the Cortex-A53, the Cortex-A57, and Samsung’s Exynos 5433 in particular. The post is vast, and I’d recommend it in full to any serious chip enthusiast, but there are some interesting high points worth discussing here.

Performance and efficiency

Power efficiency has been absolutely critical to ARM’s rise in the mobile space, but the Cortex-A15, ARM’s last high-end 32-bit core, was widely panned as drawing too much power in its initial incarnations. ARM’s entire big.LITTLE approach was created to let it meet the needs of both high- and low-power operating environments, but early implementations from Samsung were poorly built and suffered from significant flaws. Do the new Cortex-A57 and A53 deliver dramatic improvements on this front?

Evidence here is decidedly mixed. The Cortex-A53 and A57 are significantly faster than the old 32-bit Cortex-A7 and Cortex-A15 they replace (the Cortex-A53 is 49% faster in integer code and 2-3x faster in FPU code), but they aren’t necessarily more efficient. Anandtech’s investigation shows the newer, 20nm Exynos 5433 to be slightly less efficient than the 28nm Exynos 5430 when comparing the Cortex-A53 against the Cortex-A7. The Cortex-A57 is slightly more efficient than the Cortex-A15, but it can draw 2x as much power under full load.
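The distinction between "faster" and "more efficient" is simple arithmetic: efficiency is work done per watt, so a core that gains 50% performance while doubling its power draw has actually regressed. A minimal sketch, using hypothetical numbers (not Anandtech's measurements), makes the pattern concrete:

```python
# Illustrative only: hypothetical figures showing how a faster core can
# still be less efficient. "Efficiency" here means performance per watt.
def efficiency(perf_score: float, power_watts: float) -> float:
    """Performance per watt -- higher is better."""
    return perf_score / power_watts

# Hypothetical cores echoing the A57-vs-A15 pattern described above:
# the newer core is 50% faster but draws 2x the power under full load.
old_core = efficiency(perf_score=100, power_watts=2.0)   # 50.0 points/W
new_core = efficiency(perf_score=150, power_watts=4.0)   # 37.5 points/W

print(new_core < old_core)  # → True: faster, yet less work per joule
```

This is why a smaller process node alone doesn't guarantee better battery life: if the design spends the node's power savings on wider, hungrier cores, perf/watt can stand still or slip.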



This reinforces a point that we’ve touched on previously in our various ISA discussions: When it comes to low-level power efficiency, the underlying ISA doesn’t have much impact on how efficient the final CPU is (or isn’t). Each successive generation of ARM core has featured higher performance, larger caches, and more memory bandwidth — and each generation tends to consume more power than the last, even when taking process node improvements into account.

Unlike the Cortex-A15, there’s no evidence that the A53/A57 pair on Samsung’s Exynos is prone to overheating, but there’s plenty of evidence that Samsung is still struggling to properly implement big.LITTLE. Again, AT has the full details, but the company appears to have chosen power-management settings that favor benchmarking over battery life: threads are rapidly migrated up to the “big” Cortex-A57 cores, but only grudgingly moved back down to the smaller, more power-efficient Cortex-A53. Samsung also isn’t using the latest version of the Global Task Scheduler (GTS), which is designed to improve overall big.LITTLE performance.
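The migration behavior described above comes down to two tunable thresholds: how eagerly a loaded task is promoted to a big core, and how reluctantly an idle one is demoted. A minimal sketch of such a threshold policy (not Samsung's actual scheduler code; the threshold values and names here are hypothetical) shows where the bias creeps in:

```python
# A simplified sketch of threshold-based big.LITTLE task placement, in
# the spirit of GTS-style migration. Values are hypothetical: a low "up"
# threshold plus a low "down" threshold keeps tasks on the big cluster,
# which helps benchmarks but hurts battery life.
UP_THRESHOLD = 0.40    # promote to a big core above this tracked load
DOWN_THRESHOLD = 0.10  # demote to a LITTLE core only below this load

def place_task(load: float, on_big: bool) -> bool:
    """Return True if the task should run on the big (Cortex-A57) cluster.

    The gap between the two thresholds adds hysteresis so tasks don't
    ping-pong between clusters near a single cutoff.
    """
    if not on_big and load > UP_THRESHOLD:
        return True    # moderately busy task: promoted quickly
    if on_big and load < DOWN_THRESHOLD:
        return False   # nearly idle task: finally demoted
    return on_big      # otherwise, stay where it is

print(place_task(0.5, on_big=False))  # → True  (eager promotion)
print(place_task(0.2, on_big=True))   # → True  (grudging demotion)
print(place_task(0.05, on_big=True))  # → False
```

With conservative tuning the "up" threshold would sit much higher and the "down" threshold closer beneath it, so light tasks would spend most of their time on the efficient Cortex-A53 cores.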



Anandtech’s final analysis is that, despite some power efficiency headaches, the Cortex-A57 and A53 more than prove themselves, often easily outperforming the Snapdragon 805 despite the latter’s significant clock speed advantage. Once Android moves to 64-bit code, these advantages should grow even larger. The data also illustrates that implementing a maximally effective big.LITTLE strategy will take even more software integration than we’ve seen to date, while the rising maximum power draw of newer chips, even on a smaller process node, suggests that such methods will be even more important to keeping smartphone battery life acceptable in the future.

Chips based on Samsung’s 14nm FinFET process should ship by the end of the year and bring an additional set of power efficiency improvements, but extracting real gains against the tyranny of lithium-ion battery life is going to require further software optimizations or Qualcomm’s custom architecture.