Your Next Mac Won’t Have A GPU

A Discrete GPU

The Tasty Cookie
4 min read · Mar 17, 2021

It’s been some time since Apple announced that they are moving their entire Mac lineup to Apple Silicon. So far, only the 13-inch Macs have made the transition. This is much slower than I initially anticipated. Of course, these delays may be due to the current climate, which, to be fair to Apple, is not under their control.

But could it be due to something else?

The GPU

When people talk about Apple Silicon, the main thing they focus on is ARM. It is easy to forget that Apple Silicon also brings along a different memory architecture, one Apple calls unified memory. Contrary to popular belief, I think the outstanding longevity of the M1 Macs is not due to ARM but is instead a byproduct of moving to a unified memory architecture.

A unified pool of memory removes the need to send data to the GPU and keep it synchronized. As a result, you avoid the expensive operations needed for GPU memory management, which in turn leads to power savings. However, this approach has a problem: it doesn’t allow for a discrete GPU.
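To get a rough sense of what those copies cost, here is a back-of-the-envelope sketch. The frame size and PCIe bandwidth figures are illustrative assumptions, not measurements of any real Mac:

```python
# Rough cost of shuttling one uncompressed 4K frame to a discrete GPU
# over PCIe, versus zero-copy access in a unified memory pool.
# All figures below are illustrative assumptions, not measurements.

FRAME_BYTES = 3840 * 2160 * 4    # one 4K RGBA frame, ~33 MB
PCIE_BANDWIDTH = 16e9            # ~16 GB/s, roughly PCIe 3.0 x16

copy_seconds = FRAME_BYTES / PCIE_BANDWIDTH
copy_ms = copy_seconds * 1e3

print(f"Per-frame copy over PCIe: {copy_ms:.2f} ms")
print("Unified memory: ~0 ms (CPU and GPU share the same pool)")
```

A couple of milliseconds per frame sounds small, but at 60 frames per second, and with every copy burning power on both sides of the bus, it adds up — which is exactly the overhead unified memory sidesteps.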

A discrete GPU, something like an RTX 3070, requires memory that sits physically close to the GPU. This quote by Redwan Hasan explains why:

“To get higher bandwidth — If GPU has to access the main system memory over the PCI express bus to the main processor and then main memory, bandwidth will reduce drastically. An AMD 290X has a 320GB/s memory bandwidth where the main system memory might have around 20GB/s in dual channel mode and if the GPU has to access that memory this 20GB/s value might drop down even further.

To get lower latency — Having this memory close to the GPU makes it a very low latency memory thus help the card to perform nicely.”

A discrete GPU needs its own memory to perform its best, but the unified memory architecture does not allow the GPU to have its own memory. The result is that we are very unlikely to see discrete GPUs in Macs that use Apple Silicon.
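Plugging the quoted numbers in makes the gap concrete. The 320 GB/s and 20 GB/s figures come from the quote above; the 8 GB working set is an illustrative assumption:

```python
# How long a GPU would take to stream an 8 GB working set at VRAM
# bandwidth (320 GB/s, the AMD 290X figure quoted above) versus
# dual-channel system memory (~20 GB/s, also from the quote).
# The 8 GB working set is an illustrative assumption.

WORKING_SET_GB = 8
VRAM_GBPS = 320
SYSTEM_GBPS = 20

t_vram = WORKING_SET_GB / VRAM_GBPS      # seconds at VRAM speed
t_system = WORKING_SET_GB / SYSTEM_GBPS  # seconds at system-RAM speed

print(f"VRAM:   {t_vram * 1000:.0f} ms")
print(f"System: {t_system * 1000:.0f} ms ({t_system / t_vram:.0f}x slower)")
```

A 16x slowdown is why a discrete card without its own memory would starve — and why unified memory only works when the shared pool itself is fast enough.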

The Solutions

Luckily, a beefy discrete GPU is not the only way to increase performance. Apple has chosen to improve graphics performance by making the integrated GPU (iGPU) much beefier.

According to Macworld:

“The M1 Chip offered eight graphics cores (or seven in the case of the entry-level MacBook Air). For the M1X we can expect to see a 16-core GPU.

The benchmarks indicate that there will be 256 execution units for the M1X, compared to 128 execution units in the M1.”

The M1X will be much faster than the M1 because it has twice as many GPU cores. But keep in mind that doubling the GPU cores won’t double GPU performance; some of the gain is lost to heat and other inefficiencies.

Increasing the number of GPU cores does increase the GPU’s performance but there’s a limit to how many cores you can add. Eventually, Apple will hit a wall. A wall consisting of heat and power consumption.
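A crude way to picture that wall: suppose each doubling of cores delivers only a fraction of the ideal 2x speedup, while the power budget keeps growing with core count. The 85% scaling efficiency below is a made-up number for illustration, not a measured figure for any Apple chip:

```python
import math

# Toy model of GPU-core scaling. Each doubling of cores yields only
# EFFICIENCY * 2x the speedup, while heat/power grow with core count.
# The 85% scaling efficiency is an illustrative assumption.
EFFICIENCY = 0.85

def speedup(cores, base_cores=8):
    """Estimated speedup relative to a base_cores-core GPU."""
    doublings = math.log2(cores / base_cores)
    return (2 * EFFICIENCY) ** doublings

for cores in (8, 16, 32, 64):
    print(f"{cores:2d} cores: {speedup(cores):.2f}x speedup, "
          f"{cores // 8}x the power budget")
```

Under these made-up numbers, 8x the cores buys you roughly 5x the performance for 8x the power and heat — the returns shrink with every doubling, while the thermal bill does not.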

To further improve performance, Apple would then need to rely on more accelerators: more fixed-function hardware. For example, Apple could double the number of video encoders/decoders, giving better performance at video-editing tasks at a fraction of the power cost incurred by a discrete GPU. Having said that, this approach has its own weaknesses. It is inflexible: you can’t repurpose these accelerators for other tasks, meaning your Mac will become outdated faster.

Another approach Apple could take is to rely on the cloud. There’s an Apple trademark filing circulating that hints Apple may move Final Cut Pro to a subscription service. From Patently Apple:

“On Monday Apple filed an update to their trademark ‘Final Cut Pro’ in Europe adding Nice Classification #42 that hints that Apple could decide to go the way of Microsoft’s subscription model for Final Cut Pro by adding in that class verbiage covering “rental of software.”

At first glance, it may seem like Apple is just charging a subscription fee for new features, similar to Adobe’s Creative Cloud. But what if Apple included a service to render your videos using the power of the cloud? I think that would be cool.

The Real Pros

Without a discrete GPU, Apple could alienate a portion of their user base: the people who need multiple RTX 3080s for 3D modeling or data science. The real pros. But I don’t think Apple really cares.

Apple has never really provided for the real pros. The trash can Mac Pro is a good example: it looks nice, but it’s not very practical, nor is it very serviceable. You could get a Windows or even a Linux box that performs compute tasks much better at half the price.

The truth is Apple can’t compete with PCs at the very high end because Apple likes good margins. It makes more sense for Apple to target normal consumers and prosumers. People like you and me. That market is much bigger and has better margins overall.

Having said that, I would love to be proven wrong. So Apple, please prove me wrong.

Please?
