Using macOS alongside Linux

I “grew up” on Debian and got to know the Linux spirit and the use of the command line. I only ever used Ubuntu in a virtual machine, unlike Linux Mint, which was my main distribution for a long time. Then I heard about Arch; I tried Manjaro first, but found it too regulated, so I tried Antergos. After its development ended, I turned to EndeavourOS, before Covid, and I’ve been here ever since; I feel like I’m in the right place. EOS is the best choice for someone who doesn’t want to use pure Arch but wants to stay close to it. EOS is the closest distribution in the world to Arch Linux, with its KISS touch, but with the powerful support of the developer and user community. Here you will find answers to questions that might be considered stupid on the Arch forum.

That’s all for the preface; now let’s get to the point. I’ve had a MacBook Air for a while now, and a desktop Mac since last year. I finally got tired of Windows updates, which often failed, and switching to Windows 11 comes with quite strict hardware requirements, while macOS is a stable system on a stable machine. A strong argument in favor of switching to Mac was my knowledge of Linux (UNIX) commands: on a Mac it’s often worth using the terminal if you want to make deeper changes to the system, and macOS itself is UNIX-based. While in a Linux desktop environment you have to configure a lot of things yourself, often with the help of the community, on macOS you get all of this ready-made; the dock, for example, which is increasingly common in modern Linux desktop environments, comes ready-made by default on a Mac.

Since the switch, I’ve been running Windows 11 and various Linux systems, including pure ARM-based Arch Linux, in virtual machines, first for testing and then for actual use. This is how I came across the Fedora desktop, which I would call a semi-rolling release, after having used Red Hat for a while. All in all, I would even learn to program on the Mac (Swift) if I had enough time, and I wouldn’t even mind playing games on it, depending on the time available.

There is only one drawback to the Mac feeling, which I admit makes me uneasy at times: Apple hides the solutions to many problems that are common for average users. The best example is that I recently asked Apple support how to globally revoke the microphone permission once granted to Safari. This is a relatively simple setting that lived in the Microphone section of the Privacy & Security settings in previous macOS versions, but Safari is missing from there now. In contrast, you can still revoke the microphone permission granted to Chrome, for example. Needless to say, support automatically directed me to the Privacy & Security settings, where Safari could previously be turned off in the Microphone section; so they didn’t even know that this setting was gone. I suspected there was a way to solve the problem on the command line, and there really is! I asked Google Gemini; at first it also directed me to the settings. After I told it that Safari wasn’t listed there, it gave me the command that can revoke any app’s microphone permission, not just the ones listed in the documentation on Apple’s developer site: tccutil (sketched below).

By the way, my opinion is that this whole thing, that you can’t simply revoke the microphone permission from Safari, can open a backdoor for a poorly written application, such as a video chat app that calls Safari, which can obviously be abused. If I have time, I will send this to Apple as a bug report, because that is the only way to write them a direct email. I can’t reproduce this possible security incident, but supposedly they take all such reports seriously.
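For reference, here’s a minimal sketch of the tccutil approach Gemini pointed me to. The `tccutil reset` subcommand and the `Microphone` service name are documented; whether a bundle-ID reset fully covers Safari’s per-site microphone grants on current macOS versions is something you’d want to verify yourself:

```bash
# Reset the microphone permission for ALL apps (each app must request it again)
tccutil reset Microphone

# Reset it for a single app, identified by its bundle ID, e.g. Safari
tccutil reset Microphone com.apple.Safari
```

After a reset, the affected app has to ask for microphone access again the next time it tries to use it.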
After that, I talked with Gemini a little more, and we agreed, for example, that Apple hides many minor settings even from average users, and that this trend has grown even stronger in recent years. Apple focuses too much on end users. In this respect, not only Linux but also Microsoft is ahead of it: thanks to its strong corporate support, Microsoft’s end-user support is also better than Apple’s. In other words, although an Apple machine is more stable than a Linux or Windows-based machine, it is at a disadvantage compared to both when it comes to support. I saved my conversation with Gemini and, with some difficulty, managed to send it to Apple support. I’ll be curious to see what the final result will be.

I am curious about your opinions on the whole topic.

Arch says it can be done. Side by side. Same disk. Boot might be a little tricky.

And while I skimmed your long post, I think your question was “What would you think if I did it?”

Here’s my thoughts: I wouldn’t do it. Not in a million years. Too many variables. Macs are hardwired to be Macs. So many have happily run linux all by itself on a mac. It never worked kumbayah the times I tried it with Win.

If WIN is unacceptable to you, then Mac should be too. I can’t see a difference in their telemetry, how they conduct business, or their non-existent privacy policies. I walked away from Win and Mac years ago after I discovered Linux.

People that love macs (I did once upon a time) will have a different opinion so I am just giving you mine.

EDIT: if you have a desktop and could do this on different disks, then my tune changes a little bit.

Native Linux support is essentially limited to the current state of the Asahi Linux project.

Based on this, most recent MacBook Airs would be supported, but with some limitations: Thunderbolt, Touch ID, and USB-C display output aren’t supported yet.

That, however, is only valid for Intel-based Macs, not Apple silicon.

That being said, I’m assuming the question is tailored towards Apple silicon, as ARM-based Arch is mentioned. From my side: the effort required to set up an Arch installation may not achieve the same results as the Fedora Asahi Remix.

Additionally: it’s uncertain how long Asahi will need to provide support for the M3/M4 chips, as there are major roadblocks to reverse-engineering those newer-generation SoCs. The M3 was introduced almost two years ago, and the current roadmap for supporting this SoC isn’t really encouraging.

Thus, Apple did a lot to lock down their ARM-based SoCs, which totally makes sense from their perspective. They are essentially selling hardware with a highly integrated, streamlined operating system, avoiding certain bottlenecks of an open and modular system-design approach (e.g. user-upgradeable hardware, M.2 NVMe slots, and standardized RAM modules).

Some may argue that this is solely due to their philosophy of charging a premium for hardware upgrades, but let me counter that highly integrated (and power-efficient) SoCs with tight memory timings can’t be achieved that easily if you’re relying on modular, user-configurable hardware.

As a different example: Framework’s Desktop system, which incorporates the Strix Halo Ryzen AI Max+ 395 CPU, is also only available with memory soldered directly to the PCB (LPDDR5x-8000), as the required RAM timings couldn’t be achieved with socketed RAM modules.

For example, EOS runs on a separate machine, alongside Windows 10 Pro (supported until October of this year), openSUSE Tumbleweed, and Linux Mint. On the Mac, pure Arch runs in a UTM virtual machine, as do Fedora and Debian Sid. I find the Mac stable, but there are too many restrictions on its use, which of course I expected. What struck me at some point was how limited Apple’s support is; it is aimed only at end users. This is due both to the knowledge of the support team and to the small number of support channels: Apple practically only has chat and phone support, not email, and community support has been reduced over time. Apple considers feedback the only channel for reporting problems, but the disadvantage is that even when many people report a given problem, the feedback hardly reaches the development engineers. These problems are escalated to Tier 2 phone or chat support, get no answer from Tier 3 either, and then the circle closes. Of course, it is also true that many people hardly notice operational problems, because noticing them requires a certain level of experience. That is why I think that, for example, those who are tired of having to configure so many things themselves on Linux will get a similar graphical interface here, where the configuration options are limited but usage is more convenient in return. Maybe over time I became more comfortable too; that is why I switched to Mac, although I still use Linux because I like challenges.

Indeed, I use an Apple silicon Mac. I am curious to see where the revival of ARM-based Apple chips will lead, after the PowerPC era and then the Intel era. At the time, the PowerPC G3/G4 was still very exciting, mysterious, robust, and inaccessible to average users, including me. The question will be whether the floating-point computing capabilities of ARM-based silicon chips can reach those of Intel chips, and whether all existing x86 applications can be ported to the ARM platform so that they run natively. The biggest advantage of ARM is still its lower power consumption compared to Intel.

Wow, GREAT reply, thank you. I’ve been able to remain on Linux for 9 years now, doing everything I did in Windows, though you do need a Win backup for emergencies: webex, etc. The learning curve at the beginning is the hardest part, and yes, it has its challenges sometimes, but so did Win. And Mac, which I spent the ’90s running.

Interesting about the Apple channels drying up. Learned some things.

Don’t forget that the PowerPC chips weren’t developed by Apple themselves, but within a consortium together with IBM and Motorola back in the early 1990s. The PowerPC architecture can’t be compared directly with ARM-based Apple silicon, as they are different implementations built on RISC instruction sets.

In short, the PowerPC architecture emphasized high performance and reliability. ARM’s approach, on the other hand, focuses on low energy consumption, miniaturization, and, to a huge extent, the integration of additional functions.

Apple began their own chip development with the A4, an ARM-based design they developed in cooperation with Samsung, with Samsung also acting as the foundry / chip-manufacturing partner (a different subsidiary than Samsung’s mobile division). That A4 chip was included in the 4th generation of iPhones and the 1st generation of iPads. Apple invested a lot of resources and new personnel to gradually build up their own independent chip-design team; they kept Samsung as the chip manufacturer for several design iterations up to the A7 chip and switched their foundry to TSMC in the process. TSMC is Apple’s supplier for the current SoCs (systems on a chip) and SiPs (systems in a package; the iPhone camera modules are more or less independent systems in a package, but a better example is the Touch ID sensor, which stores the biometric data independently in its own chip, so the SoC only sends the command to check and verify the current fingerprint instead of exchanging the raw data with the SoC).

That being said, Apple silicon chips are more or less highly integrated SoCs and SiPs, covering not only the current processors in their mobile devices but also the M line of processors used in their desktop machines. Furthermore, they implemented their own trusted platform modules, e.g. for fingerprint readers or for the whole USB bus on their devices. And those custom chips are vendor-locked, i.e. there are no open standards or well-documented reference designs that can be easily reverse-engineered.

In other words: they implement their hardware against a streamlined set of requirements derived from their software requirements, and cut out of the equation the open interfaces that are the common norm, at least in terms of traditional x86-64 chipsets from AMD and Intel. Some may criticize Apple’s hardware policies, especially in terms of NVMe storage and RAM modules, but they’re essentially taking advantage of their freedom to implement their own SoC and other hardware components without the need to support conventional hardware standards. In short: regular M.2 NVMe drives still carry a controller chip on the drive; Apple instead integrated that controller directly into the SoC, which is beneficial for their unified memory approach.

The major culprit in the scope of the Asahi Linux project is essentially this: whereas the M1 & M2 series of SoCs were successfully reverse-engineered to gain full access to the SoC via the USB host bus controller (if I remember it right), Apple introduced a new chip of their own design for the M3 & M4 SoCs, which is also used in the iPhone 15 and later models. There are major obstacles in the way of gaining access to that chip, which is called ACE3 and is manufactured by Texas Instruments.

In short: at last year’s Chaos Computer Club congress in Hamburg, Germany, a talk was given on this exact topic: “ACE up the sleeve: Hacking into Apple’s new USB-C Controller”. As far as I can tell, even after gaining access to the chip, they’re still far away from the next major step, which would be a firmware dump of that specific chip. That turns out to be impractical, as each of these USB-C controller chips seems to contain task-specific / personalized firmware, so the firmware can’t be dumped from one chip and flashed onto another one within the system.

Long story short: Apple will increase their efforts to prevent reverse-engineering attempts, and it will definitely not get easier in the future. Nowadays, in comparison to other manufacturers of mobile devices, Apple’s phones are already the hardest to gain unauthorized access to. That is a good thing in terms of privacy and data-protection laws, but for the scene of reverse engineers, and for those who jailbroke the first iPhone back in the day, it’s more or less a nightmare. If they succeed in their efforts and release an exploit, it’s pretty certain that Apple will close that window of attack in their subsequent hardware designs.

I saw a lab full of Macintosh computers in a local high school in the 1990s. It doesn’t seem to have spread in education since then; I think the proportion of Windows computers is higher worldwide. Or is it that in the US, secondary and higher education mostly use Apple computers?

Thank you for your valuable answer. I also sense the business-policy trend you outlined, for example regarding the Asahi project. By the way, this is exactly what turned my attention back to Fedora, because at the turn of the millennium Red Hat was the first distribution I used at work, while I was getting to know SuSE and Debian. Did I understand correctly from your words that Apple is focusing more on chip production for mobile devices? So there will not be another desktop computer range similar to the great G era, including laptops?

No, they first pushed their own mobile chip development with the introduction of the A4 chip in the iPhone 4 and gradually made it a fully in-house design; that started back in 2008. (Besides their later mobile chips, the SoC in the current iPhone is the A18, btw.) In parallel, they extended their designs to the desktop market with the M1 line, introduced back in 2020.

Also, the M1 SoC is not used exclusively in desktop computers: it is also used in the 5th-generation iPad Pro (12.9″) as well as the 3rd-generation iPad Pro in the 11″ format, both introduced in 2021. In 2022, the 5th-generation iPad Air was also launched with an M1 SoC.

That being said, they started with their mobile lineup to build up their own in-house chip design and extended and refined it gradually, and the M1 chip is more or less similar to the A14 Bionic chip that was used in the iPhone 12. As a matter of fact, a jailbroken iPhone 12 Pro was the foundation for analyzing the M1 chip, as both chips share the same microarchitecture but differ in their number of compute cores: the M1 has a total of 8 cores, the A14 only 6. The M1’s clock rates are higher, and it has a different cache layout.

But the largest difference between the two chips is the GPU: the M1’s is essentially 4 times larger and features 128 CUs, where the A14 Bionic only has 16. And that is also true for the current iPhone chip, the A18, versus the latest M4. The mobile devices generally have fewer CPU cores and a significantly smaller integrated GPU, as their major use case is limited to a single screen.

So in summary, they scaled up their mobile chips for desktop & multi-monitor support. Besides the base M models, they also produced the Pro & Max variants with even more cores and GPU compute units.

As a fun fact: this comparison of benchmark results shows that the A18 essentially outperforms an M1 Mac in single-threaded benchmarks and isn’t that far behind in multi-threaded workloads, mostly due to its higher core clocks. But it really falls behind the M1’s GPU performance.

Besides the lower core counts for the CPU and GPU in comparison to the M line of chips, the mobile line of A chips doesn’t have any support for PCIe devices and is limited to USB interfaces and connectivity.

Additionally, they offer various variants of the M chips besides the base model: there are the Pro, Max, and Ultra variants within their lineup, with the M3 Ultra as the biggest chip, featuring a 28-core CPU (20 performance and 8 efficiency cores), a 60-core GPU, and a 32-core NPU / Neural Engine.

So in conclusion, they streamlined their whole product lineup into a homogeneous hardware and software infrastructure.

Way too much to read up above for these tired eyes, but hopefully I got the gist…

I run EOS on my daily driver - a 2019 iMac desktop. Fabulous.

I run dual boot OSX (very rarely) and EOS (mostly) on my 2011 Macbook Pro laptop. Mac OS sucks but I keep it just in case it’s needed for something obscure.

I run EOS only on my Dell Optiplex - about 10 yrs old by now. Runs like a charm.

Apple hardware is very nice… until they intentionally cripple it. Then I moved on.

In short, these all run EOS beautifully. Fast and productive at everything I throw at them. I’m no wizard. It can be done easily. :vulcan_salute: