Commenters lament that Rosetta will go away before users are ready.
In my opinion, Rosetta should be more heavily gated* to push everyone from Adobe to Steam to build for aarch64. Countless tools claiming to be "Apple Silicon native" require Rosetta under the hood for dependencies, or even (bless their hearts) only for their installer.
* Like the right-click-to-open gate for packages and not-from-App-Store installs, except the Rosetta dialog should note that once it's installed, any other old software not built for this system will run without warning. Turns out avoiding Rosetta is a great way to ensure not just your apps but your CLI tools are all up to date... Alternatively, make Rosetta sandboxed per app, and/or let it be turned off again more reasonably than the safe-mode surgery required now.
> In my opinion, Rosetta should be more heavily gated* to push everyone from Adobe to Steam to build for aarch64.
Honestly, I see game developers simply abandoning macOS (more than they already have) if they need to do more work on their back catalogues. Nobody at Adobe cares if CS3 doesn't run on the latest macOS; it's no longer a revenue source for them. EA would care a lot more if all their games older than four years needed work to run on macOS. There's a lot more long-tail revenue in games (though much of it is spread over aggregated back catalogues where the revenue per title is pretty low, even if the titles sum to a significant amount) than there is in the more typical B2C cloud-SaaS-attached apps that the App Store model is designed to serve.
The scuttlebutt is that Steam is not going to get ported, and that Valve has given up on Apple in general since the 32-bit drop and the Metal/Vulkan mess.
I have seen a huge downtick in games with Mac releases too… even for stuff where it seems like it would be possible with an extra platform export.
Apple seems to care about gaming for about 15 minutes every year, and one day they will figure out it can work on their stuff if they’re willing to accept that their platform will be nobody’s priority.
> It seems that M4 chips can’t virtualise any version of macOS before 13.4 Ventura
13.4 was released on May 18, 2023. That's actually not very far into the past.
Anyway, what would be the most common use cases for this? And how common are those?
If you're building macOS apps, it's common to want to test them on all system versions you support. Especially so considering Apple's attitude towards backwards compatibility.
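For a concrete (hypothetical) illustration of why: availability-gated code compiles fine against the latest SDK, but the fallback branch only ever executes on the older OS, so nothing short of actually booting that OS will exercise it. A minimal Swift sketch, not taken from any of the posts here:

```swift
import AppKit

// Hypothetical app code: this compiles fine against the latest SDK,
// but the else branch only runs on macOS 13 or earlier, so only a VM
// (or old hardware) running that OS will ever exercise it.
func applyWindowStyling(_ window: NSWindow) {
    if #available(macOS 14.0, *) {
        // Path taken on macOS 14 and later.
        window.titlebarAppearsTransparent = true
    } else {
        // Fallback path for older systems.
        window.styleMask.remove(.fullSizeContentView)
    }
}
```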
Are there any Macs that can run 13.4 but can't run 13.5?
I don't think so, but no Mac before 2016 can run 13.x: https://everymac.com/systems/by_capability/maximum-macos-sup...
Virtualizing an older ARM version of macOS was never going to be sufficient to QA x86 applications running on older Intel Macs. For that, you'll always want real x86 hardware.
I once used an old Mac OS X version to extract a Cisco AnyConnect certificate from the Keychain that wasn't possible to extract any other way.
https://reverseengineering.stackexchange.com/questions/6043/...
I first tried recompiling the Keychain app, but it had too many dependencies that were not trivial to build, so using an older Mac was, in that case, the easiest way to get the private key from my own keychain.
CI is the big one, or similar testing against older versions for backwards compatibility. It's usually good enough to just compile for the older target on current macOS, but sometimes we get a surprise in a third-party library or something that only shows up on an older release.
Dev environments, and testing.
Macs are the best at virtualizing Macs. (Look up Hackintosh to see how many hoops must be jumped through on non-Mac hardware.)
You don't need a Hackintosh to virtualize a Mac - you can download the macOS image directly from Apple and boot it right into QEMU with the proper configuration. I've used a few scripts over the years that could have an OS X image running on Linux in less than 15 minutes.
Any tutorial?
Official support for any macOS version < Ventura has been dropped (Monterey support ended 2024-09-16), so I wonder whether Apple will consider this a bug...
Seems like a bug if it only affects certain processor cores.
They added custom instructions to Apple silicon to more easily emulate x86 behavior (e.g., https://developer.apple.com/documentation/virtualization/acc...). They may have now removed them because their analytics say that Rosetta2 use on new devices is minimal.
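Whether Rosetta is even supported or installed on a given host is queryable through the Virtualization framework, for what it's worth. A minimal sketch - the enum cases are the documented API; the print statements are just illustration (a real app also needs the virtualization entitlement):

```swift
import Virtualization

// Ask the framework about Rosetta's status on this host
// (Apple silicon only).
switch VZLinuxRosettaDirectoryShare.availability {
case .notSupported:
    print("Rosetta is not supported on this machine")
case .notInstalled:
    print("Rosetta is supported but not installed")
case .installed:
    print("Rosetta is installed and ready")
@unknown default:
    print("Unknown Rosetta availability state")
}
```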
I guess there will always be a previous processor supporting something that the most recent one doesn't.
But when the bug report is about supporting a software version they no longer support themselves, I personally don't think they will give it any priority.
I sure hope this is addressed by a future OS update. I’m not a hardware person, so I can’t imagine why this wouldn’t be possible to address. If someone with more experience in that area can offer theories, I’d love to hear them.
Apple broke the compiler suite on Mac OS X 10.3. As part of a security update they shipped a new version of libstdc++ -- which was built using a different ABI than the compilers for 10.3, and they weren't going to update the compilers for 10.3. (Some sort of GCC ABI flag day happened.)
You could still build programs for 10.3 -- but you needed the latest version of Xcode and the compilers for that, which Apple only shipped for 10.4 and up. So you had to set Xcode for 10.3 compatibility mode and then you could build your app.
If you develop for Apple's platform, you need to be running the latest version, period, end of statement. If you aren't, there's no telling what may break.
Come on, Apple. What are you doing? I was thinking just the other day that Apple should virtualize older iPhones within the latest iPhone system software, so you could seamlessly open old apps and games (32-bit, anyone?) in their own containerized environments. I can't think of any reason they haven't added this feature other than money-grubbing.
You could even customize the containers to be completely closed off from the rest of the iPhone—no contacts, no Internet access (or high security Internet access), etc.
Come on, Apple. Do something good for once. Oh and bring back the headphone jack.
-Mark
For better or worse, it's never been Apple's MO to keep software working forever; that's Microsoft's schtick. PPC OS X software is gone, x86-32 OS X software is gone even on hardware that could still run it natively, AArch32 iOS software is gone, and if history is any indication, it's only a matter of time before x86-64 OS X software is gone too.
One time I had to run a very old version of Eagle CAD on Linux, and it turned out that even though I had a native Linux version, it was easier to run the Windows version in Wine! I guess stable interfaces have their advantages.
Back when IDA Pro for Linux & macOS was finally released, they decided to make every OS a separate license purchase. The net result of this was that every single person I knew who used it just kept buying Windows licenses and using it under WINE when they wanted to use it on their other computers.
The community has been joking that Win32 is the most stable Linux API.
I have a half-joking, half-serious thought: has anyone written a desktop environment for Linux that uses the Win32 API? Since Win32 is much more stable than Qt and GTK, it would be easier to target that API. The side bonus is API compatibility with Windows.
This might not have been viable 25 years ago when KDE and GNOME were in their infancy, but WINE has come a very long way since then. Standardizing on Win32 would eliminate the churn of dealing with Qt and GTK major version revisions.
> Standardizing on Win32 would eliminate the churn of dealing with Qt and GTK major version revisions.
What makes it so hard to write a GUI toolkit that is long-term (say, for 25 years) backwards compatible? If Microsoft is capable of doing this, why can't open-source developers?
> has anyone written a desktop environment for Linux that uses the Win32 API?
No, but window managers that use Xt, Xaw or Motif are OK.
ReactOS comes to mind, although it's not Linux.
Linux (the ecosystem; not necessarily the kernel) is actively hostile to binary software.
It baffles me as to why. I think it’s hilarious how Linus is so careful to not break user space (for good reason) and all the user space libraries break every week.
Distribution maintainers pretty much do whatever they want with the OS level ABI. That on top of whatever those user space libraries want to do anyways makes native application ABI stability basically impossible.
Because every distro runs its compilers with a variety of flags and safety features and library versions and ABI quirks that make supporting stable ABIs a pain in the butt.
You make a good point. It was kind of the breaking point for me when Apple killed 32-bit executables, because it meant even old Steam stuff couldn't run.
But that's a casual consumer viewpoint. It's valid to buy them if they solve your problems in the here-and-now. (I used one for a year at work and it was a bad experience, but a lot of that was having x86 libraries I had to use, so... Bad choice for here-and-now.)
I think it's likely x86-64 support (via Rosetta) will continue for quite some time.
Rosetta is giving Apple a competitive advantage by being able to run x86-64 binaries in VMs (Linux or maybe even Windows) at near-native speeds. This enables doing cool things like running MS SQL Server in a Docker container - which enables developing on a full local .NET stack on a Mac.
Perhaps there will be an intermediate step where they drop support for x86-64 executables, but retain support for x86-64 virtualization. That would still let Apple slim down the macOS system frameworks to a single architecture like they did previously when they dropped the x86-32 frameworks.
There is no support for x86-64 virtualization on ARM Macs. Do you mean dropping support for Rosetta for macOS apps but keeping support for Rosetta for Linux VMs (which run ARM Linux kernels and use Rosetta to support x86 userspace software)?
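For reference, the host-side wiring for that Linux-VM case is small: Rosetta is exposed to the guest as a directory share. A minimal Swift sketch, assuming `config` is an otherwise complete VZVirtualMachineConfiguration for an ARM Linux guest (the tag name and guest-side mount point are conventions, not requirements):

```swift
import Virtualization

// Host side: attach Rosetta to an existing ARM Linux guest
// configuration as a virtio-fs share tagged "rosetta".
func attachRosetta(to config: VZVirtualMachineConfiguration) throws {
    let rosettaShare = try VZLinuxRosettaDirectoryShare()
    let fsDevice = VZVirtioFileSystemDeviceConfiguration(tag: "rosetta")
    fsDevice.share = rosettaShare
    config.directorySharingDevices.append(fsDevice)
    // The guest then mounts the tag (mount -t virtiofs rosetta
    // /media/rosetta) and registers the Rosetta binary with
    // binfmt_misc so the kernel hands x86-64 executables to it.
}
```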
Maybe I'm missing something but I run SQL Server in Docker under Windows WSL2 at near native speed.
What's the competitive advantage here?
I suppose it lets developers targeting x86-64 Linux still use Macs and get the power efficiency of ARM CPUs, since I don't think (though I may be wrong) that ARM Windows machines support x86 emulation inside WSL.
But that’s more feature parity with x86 Windows machines, not an advantage.
Interesting juxtaposition against yesterday's front page: Valve updates Half Life 2 for 25th Anniversary.
If the requirements are still accurate, it will run on XP with 512MB RAM.
In addition to this, Steam provides an option for developers to expose older game versions to players, which Valve themselves make active use of. So if you have a specific old mod that's not compatible with the new update, or don't want to deal with the update in the middle of a playthrough, you don't have to upgrade.
Maintaining the build artifacts and pipelines and testing backward compatibility for a long-lived project like HL2 must be pretty difficult, I would think? That’s a great example and counterpoint.
Yeah, that's just their MO. I think it's easier to run old Windows games on a Mac than to run old Mac games.
And architecture aside, at one point I had to install an old version of iWork (I think it was '09) to update a file so the latest iWork could read it. They had code on hand that could read those older files, but decided not to integrate it. They don't prioritize backwards compatibility.
32-bit ARM and AArch64 are wildly different instruction sets. 32-bit ARM may as well be x86 or MIPS as far as running it on AArch64 hardware goes; it would require just about the same level of emulation (the memory models may be similar, which would help, but that's about it).
Unlike x86-64 chips, the 32-bit silicon is entirely gone from most AArch64 parts.
I wonder why Intel and AMD still keep the 32 bit and even 16 bit parts. Are there people still running many legacy apps?
As a consumer, yes. Old Steam games are a big deal.
In business... not where I work, but I hear stories of shops that still have lots of old 32-bit COM stuff, etc.
Intel has proposed dropping 32-bit and 16-bit support in the future.
https://www.intel.com/content/www/us/en/developer/articles/t...
The proposal doesn’t remove 32-bit user land or (I think) virtualization.
X86S allows 32-bit ring3 (userland) but even VMs are stuck in long mode and only support 32-bit code for userland. Booting a VM for a 64-bit OS that has a legacy bootloader with 16-bit or 32-bit code would require software emulation of that code.
On windows, a lot of installers are 32-bit even if the application they're installing is 64-bit so that they can fail gracefully on 32-bit OSes.
Why would you care that the installer fails gracefully?
It's helpful for the users
The OS already throws a specific error message, and it is the OS that should be responsible for this.
32-bit applications are still pretty ubiquitous, including Office add-ins, and there is no particular benefit on x86 in removing support for 32-bit on the application side.
Yes
> 32-bit ARM and AArch64 are wildly different instruction sets
Maybe for the CPU implementation, but having written a lot of ARM2 assembly, the disassembly of AArch64 is far more readable to me than x86-64.
Apple does a lot of good stuff, but remember that their whole business model is selling hardware. They have no financial interest in making it easy to continue to use old phones.
They have to maintain a balance that still incentivizes current purchases. Otherwise it'll be a constant trend of "don't buy now, support might not last."
Old phones no, but old apps yes. If a developer has abandoned an app and hasn't been investing in the update treadmill, but end users still care about it, that can make people feel negatively about Apple.
> Old phones no, but old apps yes. If a developer has abandoned an app and hasn't been investing in the update treadmill, but end users still care about it, that can make people feel negatively about Apple.
On the other hand, it is well within the standard Apple approach to say "here's how we want people to use our hardware. We are well aware that this is not consistent with how some potential and past users want to use the hardware, but we are comfortable with losing those customers if they will not adapt to the new set-up."
I know it's not the Apple approach, I'm just pointing out an interpretation that it isn't particularly focused on end user needs in this area.
I feel like it's mostly an attitude about where to focus engineering resources, which is very "inside baseball", but people have post hoc justifications that it's really what end users want anyway.
This isn't as true as it used to be, now that Apple is getting increased revenue from subscriptions. If your old iPhone continues to work well, then Apple has a better chance of selling you Apple Music, Apple TV, etc. etc.
For some value of 'old' perhaps.
In terms of length of official support, and aftermarket value, Apple is at the top of the game. Those strike me as the most important metrics here.
And while you might think that once official support is over, that's the end of the story, this is far from true. Those phones end up in developing markets, where there's an entire cottage industry dedicated to keeping them going. Jailbreaking is getting harder, so that might stop being an option eventually, but so far it's viable, and that's what happens.
Or any interest in reducing the incentives to buy their $200 Bluetooth headphones.
iOS dropped 32-bit support because the CPU itself no longer supports AArch32. Virtualization won’t help.
> Apple should virtualize older iPhones within the latest iPhone system software, so you could seamlessly open old apps and games (32-bit, anyone?) in their own containerized environments
What is the practical, broad use case for this? (And can't you virtualize older iOS version on a Mac?)
> bring back the headphone jack
The article is about Macs. If you want a headphone jack, get a USB-C-to-3.5mm adapter.
Speaking of headphone adapters: it's crazy to me that something like an iPod released in 2005 will output better audio when playing a lossless file than the most state-of-the-art $2,000 iPhone with Apple's most state-of-the-art $549 headphones in 2024.
The remarkable thing is that 90% of listeners don’t seem to notice.
Their reference point is a lossy 128 kb/s file from a streaming service, double-transcoded over Bluetooth, so that must be what music sounds like. Who would have thought technology would progress backwards?
> remarkable thing is that 90% of listeners don’t seem to notice
That's not a remarkable thing, it's the reason.
(And out of the remaining 10%, a good fraction may notice but prefer the new convenience. Those who remain can find the difference between most and all, or go corded.)
What streaming service even does 128 kb/s? YouTube is the only one that comes to mind, and that's for free usage only. Paid accounts get 256 kbit AAC.
- Spotify uses the OGG Vorbis codec and streams at 160 kbps at standard quality and 320 kbps at high quality
- In addition to AAC, the entire Apple Music catalog is now also encoded in ALAC at resolutions ranging from 16-bit/44.1 kHz (CD quality) up to 24-bit/192 kHz
- Amazon Prime Music streams at 256 kbps
That's about 99% of the streaming music market people actually use.
Tidal, SoundCloud, Deezer, and Bandcamp offer lossless support.
Of course, to make this strawman argument you have to ignore the previous comment pointing out that you can still do wired connections, just over a different port type.
Let's also ignore any understanding of the DAC quality between older iPods and newer iPhones, where even the dongles Apple sells are considered high-quality DACs.
Let's also ignore any advances in codecs in that time, or advances in audio hardware itself.
Let's also ignore that most iPod users would have bought low-quality MP3s or medium-quality AACs at the time. Not to mention that most customers use streaming today, so they wouldn't even be able to take advantage of the higher-quality codecs.
Finally, let's ignore customer preferences, and what a niche set of customers would have bought high-end enough audio equipment, have the hearing to appreciate it, and also not want wires today, so as to fall into your narrow description.
Who would have thought that if you ignore every aspect that is inconvenient to an argument, you could make any argument you want?
And they're listening on AirPods or whatever stuck in their ears. I have AirPods Pro 2 and sure, they sound nice. Less sweaty on the treadmill. But even a $100 DJ headset from a $200 streamer blows them away.
That’s a bit apples to oranges because you’re comparing different form factors completely.
Form has a huge impact on acoustic properties and comfort.
You’d want to compare them against IEMs.
Someone would have to maintain all the old OS versions that would be required to run those old apps, and keep those OS versions reasonably secure. That sounds like a maintenance nightmare to me.
No, the old versions would not have access to anything else, so the only thing that needs to change is the part running the container. Present one folder as the whole disk drive to the old OS. Send touch-screen input to the old OS. That's about it, really.
If you're trying to play older 32-bit games, chances are you'd like to have sound, too.
Who is going to develop the virtual audio device drivers (for each OS) that are required to virtualize sound? Who is going to run all the tests needed to make sure that the guest-side 32-bit iOS 7 (and 8, 9, 10) drivers are going to play well with the host-side driver?
Who is going to accept the risk that someone takes over the guest VM after exploiting one of the well-known and unpatched kernel-level vulnerabilities present in iOS 10? What happens if malware finds and exploits a bug in that new audio driver virtualization pipeline so it can escape the VM and then compromise the host iOS?
Yeah, it's much better to lose access to games you've paid for. This is why I stopped buying apps on Android - if it's not open source, I'm not interested. I also don't buy DRM'd stuff unless I know I can remove the lock.
I don't think almost anyone actually wants a headphone jack anymore. That's just the consumer reality in 2024. I do, obviously, but it's clearly too rare a preference for them to bother with.
Why do you want a headphone jack? Not a rhetorical question, just genuine interest. Is it about audio quality, avoiding batteries or something else?
I would find it so cumbersome to use a cable on a handheld device nowadays. But different things for different people! :)
Not who you are replying to, but my $30 wired Apple earbuds (they came with my 6S) have outlived all of my co-workers' half dozen $160 AirPods. That's reason enough for a lot of people.
For $8.99 you can buy a high-quality USB-C (or Lightning) DAC with a 3.5mm output directly from Apple.
It's tiny and lightweight. I keep one in the back of my headphone case.
There are never enough USB-C ports.
You can buy Apple's wired earbuds with the Lightning connector for $18. Or the Lightning-to-3.5mm adapter (that's what I have, because I also still have my decade-old original earbuds).
Yeah I agree, that is a very valid reason by itself.
I don't daily drive my phone for commuting anymore, but the trade-offs aren't exactly new:
- battery frustrations
- cost of a dozen cheap but good-quality headphones vs. a wireless equivalent
- easier to lose wireless headsets when you put them down somewhere (wired too, but they're way cheaper, so less of a big deal)
- audio quality? Who knows
For people who demand noise cancelling, you need an active power source, but I personally hate noise cancellation and always turn it off. Maybe it's valuable in a plane with lots of engine noise.
My wired Shure in ear monitors have much better sound quality, and battery management on AirPods is pretty annoying. Even when they’re not running out mid-trip, it’s just unnecessary mental overhead to keep another thing charged.
Easier to store than a bulky charging case plus charger plus charging cable, etc. The wired solution is more portable, believe it or not.
For me it’s because
- I already have high quality earphones, same set for many years
- they don’t require charging
- audio quality is great
- they’ll work on any device with a 3.5mm jack, no proprietary lock-ins
- I have never lost a set of earphones and if I did replacement wouldn’t break the bank
Yes, people seem happy to buy not only new phones every couple of years but new accessory devices as well. I don't understand it, but it's quite apparent.
I find small Bluetooth earbuds very practical, but I agree on the consumption aspect of it. I really don't like that I can't change the batteries in my AirPods. I would even be semi-OK with having to hand them in to a technician for a battery exchange, for a reasonable cost. But the current battery exchange for AirPods is just another name for buying new earbuds. And the third-party solutions that actually change the batteries cost about as much as new buds.
It's probably worth finding out whether this is a bug or an intentional decision before making assumptions.
For Apple, Hanlon's razor rarely holds: they're both too competent and too anticonsumer for this sort of thing not to be intentional malice in an attempt to sell more stuff. Maybe not, but it's very unlikely, and even if it is a bug, they'll probably call it a feature and keep it.
So, is this assumed to be a bug rather than a removed feature? If it's a bug that could be fixed, I would treat it differently from "no more support for older macOSes".
Based on the virtualization framework documentation, this seems very intentional.
My guess is that they do this because their development process only assures that the macOS version shipped with the hardware works on that hardware, and the virtualization layer is really thin.
They see no reason to spend the extra QA time, and would rather have everyone upgrade to the latest macOS version their hardware supports.
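That would be consistent with how the API reports support: when you load a macOS restore image, the framework itself tells you whether this host can virtualize it at all. A minimal Swift sketch - the IPSW path is hypothetical; the API names are from the Virtualization framework:

```swift
import Foundation
import Virtualization

// Load a macOS restore image (IPSW) and ask whether this host can
// virtualize it. The file path below is hypothetical.
let ipswURL = URL(fileURLWithPath: "/tmp/macOS-13.3-restore.ipsw")
VZMacOSRestoreImage.load(from: ipswURL) { result in
    switch result {
    case .success(let image):
        if let reqs = image.mostFeaturefulSupportedConfiguration {
            // Supported: reqs describes the minimum VM shape.
            print("OK, needs >= \(reqs.minimumSupportedCPUCount) CPUs")
        } else {
            // Reportedly what you hit on an M4 with pre-13.4 images.
            print("This restore image cannot be virtualized on this host")
        }
    case .failure(let error):
        print("Failed to load restore image: \(error)")
    }
}
```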
Do you think it could be simple to fix, even by a hackish third party?
In this case, it's an unintentional bug.
> In this case, it's an unintentional bug.
How do you know? Maybe it's in the mentioned bug report 15774587; I tried to look at it, and was sure I could log into Radar at some point, but now I can't seem to dig up any valid login.
Macs in CI are an absolute nightmare. For some reason (well, I do have a reason: they want to sell you more Mac Minis), macOS is the only modern OS that has no real container solution built in. Windows has Docker and true containers, FreeBSD has jails, Linux has a bajillion solutions, and Darwin (macOS)? Nothing. They've ripped half of FreeBSD already, just pull in jails too!
It at least has the Virtualization framework now. There's a product called Anka that plugs into Jenkins and lets you deploy macOS VM images as build agents on top of physical Apple hardware. While it's slower than containers and limited to 2 VMs (?!?), it gives you reproducible and sane build environments via VM images.
It's limited to 2 VMs because of Apple's software license agreement for macOS: https://www.apple.com/legal/sla/docs/macOSSequoia.pdf
> to install, use and run up to two (2) additional copies or instances of the Apple Software, or any prior macOS or OS X operating system software or subsequent release of the Apple Software, within virtual operating system environments on each Apple-branded computer you own or control that is already running the Apple Software, for purposes of: (a) software development; (b) testing during software development; (c) using macOS Server; or (d) personal, non-commercial use.
Apple just really doesn't care about you; as a developer, you're just a sucker to extract money from. Realistically you can run more than 2 VMs with some work [0], but legally, companies that provide CI and other virtual solutions can't buy one Mac and then get a license to run 100 virtual Macs.
[0] https://darwin-containers.github.io/
Also, if you want to cross-compile in Linux instead of run a container: https://github.com/shepherdjerred/macos-cross-compiler
You WILL buy more apple stuff, whether you like it or not.
Why do IT professionals keep insisting that their consumer electronics are built with the intention of supporting their professional goals?
You are forced to use macOS if you want to make apps for macOS. And if there's some difference between an older macOS and the latest one, on an M4 Mac you can't run said older version, get to the bottom of what's happening, and patch your application to fix it.
Because corporate policy requires these IT professionals, developers, and related engineers to use these consumer electronics in the workplace?
Because Apple spent a lot of money marketing them as tools for professionals?