The short answer? No, there won't be a Mac that runs on the A7. It just doesn't have the performance. But an A9 or A10? That's less clear.
Rumours that Apple is looking to replace Intel's processors in its Mac range have circulated ever since the company first revealed that it was making its own chips for the iPhone and iPad. Apple's success has been built on integration - the idea of controlling everything from the development of the hardware to the optimisation of the software to create the best possible experience.
Macs, however, use a lot of third-party hardware in their construction, with Intel's chips among the most prominent. This partnership has offered great advantages, though, with Intel's and Apple's priorities largely in alignment at the moment: power management.
Apple's best-selling Macs are its laptops, and with Intel having previously focussed on computing power over energy usage, it's now putting its considerable effort into making its mobile chips as electrically frugal as possible - a process that has resulted in the excellent battery life on the latest MacBook Airs.
Yet when it comes to power efficiency, Intel is still miles behind the technology that underlies Apple's A-series chips. The architecture of Apple's chips has two advantages over Intel's: its ARM-based design processes instructions in a fundamentally different, more power-efficient way, and the chips are simpler in design. The flipside of being simpler, though, is that they're less powerful, even if performance is improving significantly every year - the dual-core A7 in the iPhone 5s appears to be slightly more than a third as powerful as the Intel Core i5-4250U processor in the latest MacBook Air (based on Geekbench scores).
Both of these chips are dual-core, with a clock speed (meaning the frequency of operation) of 1.3GHz. That there's such a difference in performance at the same clock speed tells you a lot about the more advanced design of Intel's chips, but if Apple were to look at putting its chips in a Mac instead of a mobile device, it could make some changes to the design, particularly since it would have to worry less about the amount of power used and heat produced.
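As a rough illustration of what that comparison implies, here's the relationship sketched in Swift. The Geekbench-style scores are placeholders chosen to match the ratio described above, not real results:

```swift
// Back-of-the-envelope sketch: performance ~ work per clock (IPC) x
// clock speed. At the same 1.3GHz clock, the score ratio is
// therefore roughly the IPC ratio. Scores are illustrative stand-ins.
let clockGHz = 1.3
let a7Score = 2500.0                     // stand-in Geekbench-style score
let i5Score = 7000.0                     // stand-in score, ~2.8x the A7

let a7PerGHz = a7Score / clockGHz        // work done per GHz by each chip
let i5PerGHz = i5Score / clockGHz
print("Intel does ~\(i5PerGHz / a7PerGHz)x the work per clock cycle")
```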
In the first instance, Apple could increase the clock speed, though this is no magic solution to better performance - push it too far and the power needed starts to rise steeply for even modest additional gains. Some architectures, including Apple's, are also optimised for lower clock speeds.
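A minimal sketch of why, assuming the usual rule of thumb that dynamic power scales with voltage squared times frequency, and that hitting higher clocks generally requires raising the voltage too. All the numbers here are illustrative, not measured values:

```swift
// Dynamic CPU power scales roughly with capacitance x voltage^2 x
// frequency, and reaching higher frequencies usually means raising
// the voltage as well - so power climbs faster than clock speed.
func dynamicPower(voltage: Double, frequencyGHz: Double) -> Double {
    let capacitanceFactor = 1.0          // arbitrary constant, cancels out
    return capacitanceFactor * voltage * voltage * frequencyGHz
}

let base = dynamicPower(voltage: 0.9, frequencyGHz: 1.3)
let faster = dynamicPower(voltage: 1.0, frequencyGHz: 1.8)  // needs more volts
print("~\(faster / base)x the power for ~1.4x the clock")   // roughly 1.7x
```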
"Apple has made some very specific design choices that will make it incredibly difficult, if not impossible, to take the A7 and run it at anything like 3GHz," says David Kanter, principal analyst at Real World Tech. He gives the example of the small memory cache on the CPU - Apple uses a cache twice the size of Intel's that's accessed more efficiently. It's a great design, but it doesn't scale up to desktop chips.
Doing it this way at something like 3GHz would hugely increase the power needed for that one task - Apple has optimised for low-power, low-speed chips.

Turbo Boost

However, a more modest increase to something like 1.8GHz is feasible, especially if combined with something akin to Intel's Turbo Boost mode, which lets CPU cores run at much higher clock speeds for short periods under heavy loads.
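Here's a toy model of that kind of policy, with invented numbers, just to show the shape of the idea: burst above the base clock while there's thermal headroom, then fall back as the budget is spent:

```swift
// Toy model of a Turbo Boost-style policy (invented numbers): run
// above the base clock while there's thermal headroom, then drop
// back to the base clock as the heat budget is used up.
struct TurboPolicy {
    let baseClockGHz = 1.8
    let boostClockGHz = 2.4
    var thermalBudget = 10.0             // arbitrary heat units

    mutating func clockForNextSecond(underHeavyLoad: Bool) -> Double {
        guard underHeavyLoad, thermalBudget >= 2.0 else {
            thermalBudget = min(thermalBudget + 1.0, 10.0)   // cool off
            return baseClockGHz
        }
        thermalBudget -= 2.0             // boosting drains the budget fast
        return boostClockGHz
    }
}

var policy = TurboPolicy()
let clock = policy.clockForNextSecond(underHeavyLoad: true)
print("Running at \(clock)GHz")          // 2.4GHz burst
```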
Apple could also switch to a quad-core design, doubling the number of CPU cores available, but again, this isn't quite the fantastic solution it might sound like: it only produces significant speed increases in apps that can split their work to take advantage of more cores.
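Amdahl's law captures this limit - a quick Swift sketch:

```swift
// Amdahl's law: the speedup from extra cores is capped by the share
// of the work that can actually run in parallel.
func speedup(parallelFraction p: Double, cores n: Double) -> Double {
    return 1.0 / ((1.0 - p) + p / n)
}

print(speedup(parallelFraction: 0.5, cores: 2))   // ~1.33x
print(speedup(parallelFraction: 0.5, cores: 4))   // ~1.6x - doubling cores adds little
print(speedup(parallelFraction: 0.95, cores: 4))  // ~3.5x for highly parallel code
```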
These two improvements both carry energy use and size penalties, but these could certainly be accommodated in a laptop form factor. Of course, Apple will also be working to improve the more fundamental design elements of its CPUs, making each core more powerful even at the same clock speed. However, while Apple is improving the performance of its chips to get near the current MacBook Air range, Intel will be bringing the energy usage of those same chips down to get that kind of performance into tablets - the opposite of Apple's approach.
The current MacBook Air chips use up to 15W of power, while the kind of chips in the iPad typically use more like 4W, though this can rise under heavy loads. That's a big advantage for Apple, but Intel's next generation of chips, codenamed Broadwell, looks set to cut energy use by at least 30% while still offering good performance boosts.
Intel has also dabbled in versions of the current-generation Core i5 chips that can use just 6W of power under lighter loads, and 11.5W under stress. Interestingly, though, this shows that Intel's performance currently doesn't scale down as well as its power use: that chip only outscores the A7 in Geekbench by around 30%.
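Put those rough figures together and the efficiency gap is narrower than the raw wattage suggests. A quick sketch, using stand-in scores consistent with the 30% gap quoted above:

```swift
// Rough performance-per-watt comparison using the article's figures.
// Scores are stand-ins chosen to match the ~30% gap quoted above.
let a7 = (score: 2500.0, watts: 4.0)           // iPad-class chip
let lowPowerI5 = (score: 3250.0, watts: 6.0)   // ~30% faster at a light-load 6W

let a7PerWatt = a7.score / a7.watts                   // ~625 points per watt
let i5PerWatt = lowPowerI5.score / lowPowerI5.watts   // ~542 points per watt
print("A7 efficiency edge: ~\(a7PerWatt / i5PerWatt)x")  // ~1.15x
```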
Processor performance
When it comes to processor performance, then, it currently looks like Apple could take the advantage in the balance of performance and energy use in the gap between the iPhone/iPad and the MacBook Air. But when it comes to the flexibility needed to make chips that scale from phones and tablets up to laptops, Intel has huge advantages.
The fabrication plants Intel uses to build its processors are years ahead of those Apple uses when it comes to creating smaller, less energy-hungry components that can run at lower voltages. By the time Apple can make its tablet CPUs compete with Intel's current laptop range, Intel will have moved on, and may be putting chips with today's laptop-class performance into tablets.
But does the balance of power shift if the processor becomes less important? The Heterogeneous System Architecture (HSA) standard is a guide for building the kind of system-on-a-chip that Apple makes, but it places a much larger emphasis on using the graphics processing unit (GPU) for general computing, instead of relying on the CPU for almost everything. The reason is that, though the CPU is good at many tasks, it processes things serially, working through a problem one step at a time.
The GPU is designed to work in parallel, computing large amounts of data simultaneously in intensive tasks. It can not only do these kinds of tasks faster than a CPU, but much more efficiently too, using less power to do the same job.
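The distinction is easiest to see in code. In this Swift sketch, concurrentPerform spreads the work across CPU cores rather than a real GPU, but the shape of the workload - one independent operation per data element - is exactly what a GPU's hundreds of small cores are built for:

```swift
import Foundation

// Serial CPU-style loop versus data-parallel execution of the same job.
let pixels = [Double](repeating: 0.5, count: 1_000_000)

// CPU style: one step at a time.
var serialResult = [Double](repeating: 0, count: pixels.count)
for i in pixels.indices {
    serialResult[i] = pixels[i] * 1.2
}

// GPU style: every element processed independently, in parallel.
var parallelResult = [Double](repeating: 0, count: pixels.count)
parallelResult.withUnsafeMutableBufferPointer { buffer in
    DispatchQueue.concurrentPerform(iterations: buffer.count) { i in
        buffer[i] = pixels[i] * 1.2
    }
}
```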
Not all computing tasks will benefit from being on the GPU, but the idea is to make sure that any that can be made faster by going there do so, rather than having the CPU do everything out of convenience - even small tasks such as gesture recognition or face detection. It's unlikely that Apple will use the HSA standard exactly, but it may implement many of its ideas in a similar way.
Unsurprisingly, Intel - being better at processors than at graphics - is not convinced that HSA is the future. Like its processors, Intel's GPUs are more powerful than those Apple uses, but the PowerVR Series 6 GPU in the iPhone 5s is the most powerful on the mobile market, and is designed to scale up easily. It's currently about eight times slower than Intel's GPUs, but a faster version of the PowerVR Series 6 using 20 of its 'clusters' instead of the four in the iPhone 5s could match the raw computing performance of Intel's current GPUs.
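That scaling arithmetic, made explicit (relative throughput figures, not real benchmark results):

```swift
// The scaling arithmetic from the paragraph above, made explicit.
let iphoneClusters = 4.0
let intelAdvantage = 8.0                 // Intel GPU ~8x the iPhone 5s GPU

let widthGain = 20.0 / iphoneClusters    // 5x from going to 20 clusters
let clockGainNeeded = intelAdvantage / widthGain
print("20 clusters plus a \(clockGainNeeded)x faster clock ~ parity")  // 1.6x
```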
It's interesting to note that Apple is already taking a GPU-heavy route with the new Mac Pro, which will feature two extremely high-end GPUs but only one CPU. However, while GPU-optimised computing is a great ideal for changing the way we think of computers, there are practical problems with introducing it on operating systems with long legacies, and that could hold it back as a replacement for the CPU in more general-purpose use.
"The truth is that an awful lot of code cannot be moved over to the GPU," says Kanter. "The point of the CPU is that you spend a lot of resources compensating for poor programming. A lot of the things that architects spend a lot of time creating is just there to tolerate bad code, and the kicker is that if you run bad code on a GPU, it will just run terribly."
Of course, there's the question of which operating system would run on an Apple-chip-based laptop. If it ran OS X, it would have to be a new version adapted for the completely different architecture of Apple's chips, and the current range of OS X apps would be unavailable on it - Apple would have to supply a way for developers to recompile their apps for the new type of machine, though there's no guarantee that all developers would take advantage. It would also effectively mean a third platform for Apple to support - iOS, OS X for Intel, and OS X for Apple chips.
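For most developers the recompile might be close to free, but any code with architecture-specific fast paths would need attention. A hypothetical Swift sketch, with invented function names, of what that fork can look like:

```swift
// Hypothetical sketch (invented function names) of the kind of fork
// a developer might maintain: most code recompiles untouched, but
// architecture-specific fast paths need a second implementation.
#if arch(x86_64)
func fastChecksum(_ data: [UInt8]) -> UInt64 {
    // Intel build: an SSE-accelerated routine could live here.
    return data.reduce(0) { $0 &+ UInt64($1) }
}
#else
func fastChecksum(_ data: [UInt8]) -> UInt64 {
    // ARM build: a NEON-accelerated routine could live here.
    return data.reduce(0) { $0 &+ UInt64($1) }
}
#endif
```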
An alternative might be that instead of adapting OS X to run on the Apple chips, Apple could evolve iOS to include features we've come to expect and rely on, such as mouse support, true multitasking and the ability for apps to pass information to each other. But aside from these changes, there's also the problem that iOS apps wouldn't fit the widescreen format of laptops, so it would need either some form of windowing or more flexibility in apps' layout and shape, which again would mean more work for developers.
Cash in your chips
With all of the issues of developer support and technical capabilities, and the fact that Intel will continue to create more powerful chips, you might wonder why Apple would bother doing any of this at all.
There is another factor, though: cost. Intel's laptop chips cost manufacturers nearly $300 each, while Apple's A-series chips are estimated to cost around $30 to produce. Granted, a more complex Apple chip would need to be significantly larger, and costs rise steeply with size. But suppose Apple were able to create a chip as powerful as the one in the current MacBook Air for around $150 - that would still make it roughly $150 cheaper than Intel's. That could allow Apple to create a new lower-priced line of MacBooks or an even smaller Mac mini starting at $500/£400.
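The bill-of-materials arithmetic, using the article's estimates (the $150 laptop-class figure is speculative):

```swift
// The bill-of-materials arithmetic, using the article's estimates.
let intelChipCost = 300.0        // roughly what Intel charges manufacturers
let appleChipToday = 30.0        // estimated A-series production cost
let appleLaptopChip = 150.0      // speculative laptop-class A-series cost

let scaleUpFactor = appleLaptopChip / appleChipToday  // 5x today's cost
let savingPerMac = intelChipCost - appleLaptopChip
print("~\(scaleUpFactor)x costlier chip still saves $\(savingPerMac) per Mac")
```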
That said, Apple doesn't tend to introduce lower-priced options without strong reason, so perhaps it's more likely that Apple would keep the MacBook Airs at their current price and include features such as Retina screens or 4G as standard, giving it a huge feature lead on the competition for the price.
All of the above, though, assumes that an Apple-powered Mac would work roughly the same as current computers do. Things may change by then. Across the industry, 'the cloud' is starting to be used for actual computations, rather than just for storage. Apple's iCloud version of iWork does a lot of work on the server side, YouTube offers video editing through your web browser, and Autodesk already offers cloud rendering for some 3D modelling tasks.
We might see the return of the 'thin client', where your computer only needs a processor powerful enough to act as the interface for these apps, with all the hard work done elsewhere. In that case, even the current Apple chips might be suitable - you wouldn't need a fast computer, you'd just need fast internet. But even that still assumes a fairly traditional form factor for the Mac.
What if the very concept of what makes up a computer changes over the next few years? Apple has been working on technologies that make wireless connections utterly configuration-free, and that make wireless video smooth and fast. Its iBeacons technology uses Bluetooth to let devices detect when they're close to each other and pass information back and forth appropriately, while Wi-Fi Direct is being used to establish AirPlay video connections without a router.
These technologies could form the basis of a system of flexible computing - your desktop computer could simply be a large-screen display with Wi-Fi capabilities, plus a wireless mouse and keyboard (or whatever we use to control computers in the future).
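On the iBeacon side, the detection step is already straightforward with today's CoreLocation API. A minimal sketch, with a placeholder UUID and a hypothetical hand-over step:

```swift
import CoreLocation

// Minimal sketch of the iBeacon side: an app ranges for a beacon
// attached to the desk's display and reacts when the phone arrives.
// The UUID, identifier and hand-over step are all placeholders.
class DeskDetector: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    let deskRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "desk-display")

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: deskRegion)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        if beacons.contains(where: { $0.proximity == .immediate }) {
            // The phone is at the desk: hand the session over to the
            // big screen via wireless video (hypothetical step).
        }
    }
}
```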
When you sit at your desk, the iPhone in your pocket detects the setup and gives you a custom desktop display on the screen using wireless video, letting you control it with the mouse and keyboard - the iPhone becomes your computer hardware, capable of performing light tasks itself, with the heavy lifting done by servers in the cloud. In that case, there certainly would be an Apple chip in your future Apple computer, but the Mac may be long gone.