This just shows how little Apple cares about gaming on Mac. It's so sad that I spent thousands on multiple Mac devices (MBP, Mac Studio, etc.) only to be bottlenecked not by hardware, but by Apple's shitty approach to gaming software.
I know why they do it though. Apple can't take their undeserved 30% cut of Mac games the same way they take their iOS cuts.
We had a teacher years ago who said something that remains true to this day: "everything is about money, especially the things that appear to have non-monetary reasons."
I think their bigger problem is there's shit for documentation. You get a big list of function signatures, and if you actually want to know how you're supposed to use anything, you find the WWDC session from four years ago and hope it's still accurate.
Somebody told me that the quality of a company is directly proportional to the quality of its documentation...
Boeing's documentation is probably excellent.
In general it is assumed that documentation is the boring part, so paying attention to it is a sign of quality. But where do you put people who prefer writing and/or teaching to thinking long and deep about product issues?
I guess MathWorks is one of the best companies ever.
They sure don't care about games on Mac, but I think this specific issue is more due to trying to do "magic" for better or worse.
Introducing the notch creates this "here but not really usable" area that is specific to some displays, and Apple touts it as something that will just work thanks to their software layer.
Yet every app has to deal with it one way or another, and most apps that care about screen real estate or resolution will need hacks.
It will be the same for every feature that pushes the boundaries and offers an abstraction layer as the answer to how the issues will be solved.
> I know why they do it though. Apple can't take their undeserved 30% cut of Mac games the same way they take their iOS cuts.
Why would Apple deliberately sabotage the experience? They would gain nothing from it. That argument makes even less sense when you consider that most of the games mentioned in the article are on the Mac App Store, where Apple can take its cut.
https://apps.apple.com/us/app/control-ultimate-edition/id650...
https://apps.apple.com/us/app/shadow-of-the-tomb-raider/id14...
https://apps.apple.com/us/app/riven/id1437437535
https://apps.apple.com/us/app/cyberpunk-2077-ultimate/id6633...
https://apps.apple.com/us/app/stray/id6451498949
This is an obvious case of Hanlon’s Razor. Anyone who develops for Apple platforms and has had to file Feedbacks is aware of Apple’s incompetence and lack of care.
Except for the handful of Mac ports exclusive to the Mac App Store, who with a lick of sense would choose to buy from there? Steam, GOG and Epic are all more feature-rich, have more frequent sales and third-party resellers, and throw in the PC version too.
On iOS there is no choice.
When No Man’s Sky first announced a Mac port, they promised to release on Steam and App Store. I waited a year after the Steam release for the App Store release which never came.
I am a very casual gamer and sometimes weeks go by without playing. My first experience with Steam was with Civ VI. Long flight, no internet, great, I'll play some Civ! But instead of opening up Civ, I was forced to open Steam, which would then not allow me to play my own game because I hadn't authenticated recently enough for them. Or I would try to play and Steam would say, oh, first you need to download some huge update before you're allowed to play your single-player, entirely offline game.
I know theoretically GOG is supposed to solve this issue, but no Mac game I wanted was available there. Finally Cyberpunk 2077 launched on multiple stores and I bought on GOG. Even then, the default option became to use the GOG launcher. If I wanted a DRM-free download, there was some immediately complicated, off-putting set of instructions, downloading something like 20+ files, etc.
The App Store experience: I click download, it downloads. I open it, and it opens the app and not some launcher. Everything just works.
It's a shame because a Mac Mini is a solid gaming computer.
I don’t try to game on my laptop, and I have always thought “gaming laptops” are somewhat akin to “racing-spec BarcaLoungers”. That’s why I’ve never understood why so many people bitch about not being able to game on Macs. If you want to play video games, you should probably buy a console or a desktop PC.
White-hot take: you’re allowed to own a PC and a Mac. They aren’t like matter and antimatter, you won’t collapse the galaxy or anything.
People are complaining because the hardware is capable but other issues make gaming on a Mac an annoyance, not because MacBooks aren't powerful enough to run games. And to be fair, MacBooks can perfectly well run games, and indeed a lot of titles perform great, even if most have to use some sort of compatibility layer (Rosetta 2, Vulkan-to-Metal, etc.). Why should I get a PC just for gaming if my machine can handle it? Especially something like a big-form-factor gaming PC that takes up quite a bit of space in a room.
And this is not a matter of laptop vs desktop, because most of these issues (not the notch) will be present on a Mac Studio too.
In fact, Apple itself has started advertising gaming on Macs at recent WWDCs. So it is only fair that people complain about it.
But in general, gaming on Macs is perfectly feasible (maybe not the latest, most demanding game at max settings, but most of the stuff). You may miss certain graphics features that aren't available on Macs, but otherwise performance is not bad. Even playing Windows versions of games through Whisky or CrossOver is perfectly feasible.
> If you want to play video games, you should probably buy a console or a desktop PC.
Push this logic one or two notches further and people should only write and build code on a desktop with a portrait e-ink monitor.
Specialization has its place but asking a generic computing device to gracefully handle most applications isn't a big ask IMHO.
They're not running the latest top-notch games or anything exotic; it should be fine with no tinkering.
>That’s why I’ve never understood why so many people bitch about not being able to game on Macs.
Well that's where you're wrong then. It's perfectly possible to game on a laptop. Over the last decade, with the development of the Thunderbolt ecosystem and docks, it's become a very low-friction endeavor as well.
There are a myriad of quality "gaming" laptops out there, but obviously that's not for everybody, and that's one of many reasons why workstation laptops exist (basically the same thing - ability to run at high loads for extended periods of time - without the gamer aesthetic).
There are many casual gamers out there who don't want to shell out for a dedicated gaming device, nor should they have to when a laptop is a perfectly adequate device to game on. It's not that hard to comprehend.
Valid and correct take, but I don’t want to own a Mac and a PC. I’d like to be able to run casual games on my MacBook. I don’t game often or heavily enough to warrant purchasing a rig for it.
White-hot take: you are allowed to own an outhouse and a urine only toilet. They aren't like matter and antimatter, and so on.
What? This seems like just games not defaulting to the correct resolution for the display they're on. The first thing I do in every game I play (on any platform) is go through the settings, and make sure the resolution matches the screen's physical resolution. If the game is dumb about it, I choose a smaller multiple or run in windowed mode or on an external display.
> I spent thousands on multiple Mac devices
I mean... yeah why would they change?
To be fair, there is only so much you can do inside a power envelope of 100 W or so.
The Xbox Series S only uses about 80 W under full load, and that's built on TSMC's old 7 nm process. Apple's bleeding-edge 3 nm silicon should be able to do even more with a similar power envelope.
100W TDP ought to be enough for anyone.
Not really. If you compare them to gaming rigs, they have radically different architectures, so you can't really compare them by TDP or power requirements. They don't emit the same heat or require the same amount of power per TFLOP. And I wouldn't be surprised if TFLOPs didn't translate to actual compute room for shaders either.
Even if they did, 100 W should be plenty to play relatively recent titles, especially indie ones. Nothing really excuses the contempt Apple has for the gaming market.
> Even if they did, 100 W should be plenty to play relatively recent titles
The Steam Deck runs on 45 W, and that's plenty of power to have fun.
I briefly owned a Steam Deck (before returning it), and it seems like most users tone down their expectations a lot compared to PC gaming. 30fps at the Deck’s low resolution seems to be the norm for recent games. Enough power to have fun, sure, but I think people would rightfully pan it if it weren’t Valve.
The included charger is 45 W, but the chip consumes less.
> Nothing really excuses the contempt Apple has for the gaming market.
Considering how the typical self-identifying "gamer" conducts themselves online, I think Apple might be on to something...
While the differences between ARM and x86 CPUs are quite substantial, Apple's iGPUs follow the same general architecture as the other brands'.
Oh it’s not just Apple…
This was an issue I also discovered on Xbox 360 in 2008. TVs have overscan, and depending on that setting, your resolutions will be off.
However, at the time, we couldn’t create render targets that matched the overscan safe area. XNA added a Screen SafeArea rect to help guide people but it was still an issue that you had to consciously develop for.
Now, we can create any back-buffer size we want. It's best to create one 1:1 with the safe area, or use DLSS with a 1:1 output target, for best results. I'm glad the author went and reported it, but ultimately it's up to developers to know Screen Resolution != Render Resolution.
Anyone using wgpu/Vulkan/AppKit/SDL/GLFW/etc. needs to know this.
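For AppKit specifically, here's a minimal sketch of the distinction (assuming macOS 12+ for safeAreaInsets):

```swift
import AppKit

// Minimal sketch (macOS 12+): a screen's size in points, in pixels, and its
// notch-safe area are three different things; render resolution should be
// derived from them, never assumed.
if let screen = NSScreen.main {
    let points = screen.frame.size           // logical size in points
    let scale = screen.backingScaleFactor    // e.g. 2.0 on Retina displays
    let pixels = CGSize(width: points.width * scale,
                        height: points.height * scale)
    let insets = screen.safeAreaInsets       // nonzero top inset on notched Macs
    let safe = CGSize(width: points.width - insets.left - insets.right,
                      height: points.height - insets.top - insets.bottom)
    print("points: \(points), pixels: \(pixels), safe area (points): \(safe)")
}
```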
If I understood you correctly... you wanted to be able to render to a slightly smaller surface to avoid wasting graphics compute time, but that's still going to be upscaled to 1080 for the HDMI scanout, and then mangled again by TVs' overscan - which to me feels like introducing more problems more severe than whatever problem you were trying to solve in the first place.
(Besides, TV overscan is a solved problem: instead of specifically rendering a smaller frame games should let users set a custom FoV and custom HUD/GUI size - thus solving 3 problems at once without having to compromise anything).
No. Your TV says it's 1080 but it's not, it's 1074… This is a solved issue now, but it wasn't when HDMI was first introduced. The Xbox 360 suffered from red rings of death. Microsoft hated Linux. And C# was cool.
Basically, if you rendered an avatar image in the top left of the screen, perfectly placed on your monitor, on the TV its head would be cut off. So you change to safe area resolution and it’s perfect again (but on your monitor safe area and screen resolution are the same, except Apple apparently). Make sense?
You can see how, if your screen says it's 4K but is really short 40 pixels and you render at 4K, the screen will shrink the image by 40 pixels and introduce nasty scaling artifacts. TV overscan goes the other way. Interesting find by the author about the notch.
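The arithmetic is simple enough to sketch (the 5% inset here is a common title-safe convention I'm assuming, not something from a spec):

```swift
// Toy sketch: derive a title-safe render size from the nominal output size,
// assuming a symmetric overscan inset (5% is a common convention, not a spec).
func titleSafeSize(width: Int, height: Int, overscanPercent: Double = 5.0) -> (Int, Int) {
    let keep = 1.0 - overscanPercent / 100.0
    return (Int(Double(width) * keep), Int(Double(height) * keep))
}

print(titleSafeSize(width: 1920, height: 1080))  // (1824, 1026)
```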
Read up on g-buffers and deferred rendering. Usually one doesn't do everything at full resolution until the final output, and even then it is often better these days to use fancy upscaling.
Many games do let users set the things you mention but it is not always so simple. For example, handling rounded edges and notches is a huge pain.
Not mentioned about WoW here is that Blizzard considered the notch important enough to add an option for the UI to avoid it. The game calls a function in C_UI to get the safe region, then resizes UIParent to fit within that region while still rendering the game up to the true top of the display.
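A native Mac game could do the equivalent through AppKit. A hedged sketch (macOS 12+; gameView and hudView are hypothetical names, not WoW's actual code):

```swift
import AppKit

// Sketch of the approach described above: render the game full-bleed, but
// constrain the HUD to the notch-safe region. `gameView` and `hudView` are
// invented names for this illustration.
func layoutViews(in window: NSWindow, gameView: NSView, hudView: NSView) {
    guard let screen = window.screen, let content = window.contentView else { return }
    let insets = screen.safeAreaInsets   // top inset is nonzero on notched Macs
    gameView.frame = content.bounds      // the game draws up to the true top
    var hud = content.bounds             // the HUD stops below the notch
    hud.size.height -= insets.top        // AppKit's origin is bottom-left, so
                                         // this lowers the HUD's top edge
    hudView.frame = hud
}
```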
WoW has always been exceptionally good at treating macOS like a first-class citizen. It's a shame Blizzard has stopped supporting macOS for their newer games.
Back in the day they even implemented near-zero-performance-impact video recording that leveraged AVKit/Quicktime as a Mac-exclusive feature, which was pretty neat. It let me get silky recordings where Windows user guildmates’ videos skipped and stuttered from having to run both WoW and much heavier third party recording software.
Perhaps one or two people on that team have always had personal Macs.
Am I the only one who finds screens with rounded corners and notches really stupid? We had to struggle for decades with CRTs and their funky geometry, and when we finally got displays with perfect geometry, we botched them again to make them look... cooler?
Don't think of notches as something stupid that takes away from screen area.
Think of them as something that allows the overall screen area to increase as bezels shrink.
And then when the corners of the screen are so close to the corner of the laptop, and the corner of the laptop is rounded, it looks weird if the corner of the screen isn't rounded. Like a square peg in a round hole.
Fortunately, it's all context-dependent. So when you watch a video that is inherently rectangular on a Mac, that takes precedence. It's only shown below the notch, and the rounded corners on the bottom disappear.
So it's kind of the best of all worlds. Bigger screen with round corners for actual work (the notch is not particularly objectionable in the menu bar), slightly smaller screen with rectangular corners for video (and games too I assume?).
Are the screens OLED? The phones are...
IMO the notch is pointless, but they need space for the front camera. With OLED they can just turn the pixels off when it suits the application, and it becomes like a big bezel, which was the alternative anyway.
Expected in the next year or two on the pros, but not yet.
I'm surprised this article focused solely on blurry rendering when mouse pointer location in "fullscreen" Mac games is also commonly affected by this bug or whatever we are calling it. You have to dive into an OS menu for each game to tell it to render fullscreen below the notch to fix it. It should be a global accessibility setting but isn't for some reason.
> But World of Warcraft is an older game using the legacy CoreGraphics display services full screen API. That API actually allows World of Warcraft to draw into the notch.
Not knowing much about Macs, I would have thought games were supposed to render full screen around the notch for immersion but respect the safe area for UI and gameplay. Are they supposed to leave a menu bar on macOS?
> Control gets around the issue by just making up its own resolutions.
That's hilarious. I wonder if they had trouble enumerating resolutions and gave up, or if they simply couldn't be bothered.
> Are they supposed to leave a menu bar on macOS?
It also depends on how the specific game is implemented, but often that area is just black and inaccessible, as in you cannot even move the cursor there. It is as if the screen no longer includes the area above the notch, i.e. how the screen would be if there were no notch, like on the M1 Air.
Which says more about the size of the Mac gaming market. It's small, and that's unfortunate.
It's actually really small: according to the Steam Hardware Survey, Macs are only 1.88% of Steam users, less than Linux's share, which is probably why most developers don't care.
Between the deprecation and stagnation of OpenGL on the platform, the removal of 32 bit support completely, the refusal to natively support Vulkan in favor of Metal, and the switch to ARM based systems... I can't believe it's still that "high".
Don't worry, they plan to gut Rosetta 2 so it will only support Mac games from the Intel era, that should help shrink that number!
> Rosetta was designed to make the transition to Apple silicon easier, and we plan to make it available for the next two major macOS releases – through macOS 27 – as a general-purpose tool for Intel apps to help developers complete the migration of their apps. Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
https://developer.apple.com/documentation/apple-silicon/abou...
Their compatibility layers being intended as exclusively transitional can be frustrating, but on the other hand it's also why nearly every Mac app now has a proper aarch64 build. If it were known that Rosetta would remain part of the OS forever, many apps would never have been ported and would have carried an effectively permanent 30-60% performance penalty and a disproportionate drag on battery life.
On the other side of the fence in Windows land, adoption of features much more trivial than an architecture change is absolutely glacial, because devs know they can lean on backward compatibility in perpetuity.
Apple’s approach is perhaps too aggressive but Microsoft’s is far too lax. I don’t think it’s realistic to dump a binary and expect it to work forever with no maintenance.
I do my work on a Mac. I game on a game machine. Which right now is a Steam Deck. In the past it's been PS4, 360, Gamecube, PS2, etc.
I think the only game I've put serious time into on my Mac was Hades 1, which I pretty much finished before the console ports happened.
Mac users might prefer to get their games through the default Mac App Store rather than Steam, which could plausibly distort the numbers.
It would be smaller than that overall, because Steam's stats are incomplete: they don't count all the Game Pass users. I haven't opened Steam in the six or so years I've been taking advantage of Microsoft providing a few hundred games for $10/month.
Interesting article, but I think the demonstration image isn’t doing its job. Neither side really looks good to me. They both look roughly the same.
The bottom-left side shows the circle with the artist-created rough texture it's supposed to have. The top-right shows that that edgy-but-sharp texture has been softened by anti-aliasing.
Right? If it's vertically squashed, then at least draw the dividing line vertically so we can see the difference better! But yeah, both images look like a janky rendering of an anti-aliased circle.
How can this be fixed without breaking existing software? Re-ordering the list?
They could do what Windows has done, and build OS code that checks if the running application is a known-legacy game, and lie to the game about various capabilities so that the game runs well and looks good.
Not sure how much of that is still around, but it was rampant for many years and likely a key to Windows' success in gaming.
> They could do what Windows has done, and build OS code that checks if the running application is a known-legacy game, and lie to the game about various capabilities so that the game runs well and looks good.
Or, even simpler (and AFAIK modern Windows does that too): if the running application doesn't say in its application manifest "I'm a modern application which understands the new things from operating system versions A, B, C and features X, Y, Z", you know it's a legacy application and you can activate the relevant compatibility shims.
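A toy sketch of that gating logic (all names and numbers invented for illustration; real systems key this off the app's manifest or the SDK the binary was linked against):

```swift
// Toy sketch of manifest/SDK-gated compatibility shims. Everything here is
// invented; this is the shape of the idea, not any OS's actual mechanism.
struct DisplayCapabilities {
    var exposesNotchSafeArea: Bool   // advertise the safe area to the app?
    var listsScaledModes: Bool       // include scaled modes in enumeration?
}

func capabilities(declaredSDKVersion: Int, currentSDKVersion: Int) -> DisplayCapabilities {
    // An app that never declared awareness of newer OS behavior gets the
    // legacy view of the world; a modern app gets the real capabilities.
    let isLegacy = declaredSDKVersion < currentSDKVersion
    return DisplayCapabilities(exposesNotchSafeArea: !isLegacy,
                               listsScaledModes: !isLegacy)
}
```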
Also consider setting NSPrefersDisplaySafeAreaCompatibilityMode and just leaving letterboxing control to a toggle in the settings (with whatever default you prefer).
I'm disappointed none of the proposed fixes are for CGDisplayCopyAllDisplayModes to make the first option on the list the BEST option, taking the notch into account. The author hinted that many games pick the first option, so rather than demanding all those publishers add new code, Apple could make the easy path the happy path.
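Until Apple does, games can impose that order themselves. A minimal sketch (the "biggest pixel area, then highest refresh" preference is my assumption; the API documents no ordering):

```swift
import CoreGraphics

// Minimal sketch: enumerate the main display's modes and sort them so the
// largest mode comes first. CGDisplayCopyAllDisplayModes promises no order,
// so the sort criterion here is an assumption about what "best" means.
let display = CGMainDisplayID()
if let modes = CGDisplayCopyAllDisplayModes(display, nil) as? [CGDisplayMode] {
    let sorted = modes.sorted {
        ($0.pixelWidth * $0.pixelHeight, $0.refreshRate)
            > ($1.pixelWidth * $1.pixelHeight, $1.refreshRate)
    }
    for mode in sorted.prefix(5) {
        print("\(mode.pixelWidth)x\(mode.pixelHeight) @ \(mode.refreshRate) Hz")
    }
}
```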
Interesting -- I ran into this recently playing Baldur's Gate 3 and was curious about the technical details of why. My fix was that I had an external monitor, and I just reset the resolution to the external monitor's. (By default, though, the monitor was showing up blurry, with the wrong aspect ratio.)
Baldur's Gate 3 groups the resolutions by aspect ratio. I assume it queries the resolutions, computes the aspect ratios, and then displays them grouped by ratio. This results in some weirdness, with odd ratios that have huge coprime numerators/denominators, but as long as the right resolution is detected somewhere, all you basically need to do is select the right aspect ratio for your screen.
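A sketch of that grouping (my reconstruction, not BG3's actual code); reducing width and height by their GCD is exactly what produces those huge coprime ratios:

```swift
// Sketch of grouping resolutions by aspect ratio (a reconstruction, not
// BG3's actual code). Reducing by the GCD yields odd coprime ratios for
// odd modes, e.g. 1366x768 -> 683:384.
func aspectKey(_ w: Int, _ h: Int) -> String {
    func gcd(_ a: Int, _ b: Int) -> Int { b == 0 ? a : gcd(b, a % b) }
    let d = gcd(w, h)
    return "\(w / d):\(h / d)"
}

let modes = [(1920, 1080), (2560, 1440), (1366, 768), (3456, 2234)]
let grouped = Dictionary(grouping: modes) { aspectKey($0.0, $0.1) }
for (ratio, list) in grouped.sorted(by: { $0.key < $1.key }) {
    print("\(ratio) -> \(list)")
}
```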
How's Factorio?
I played all the way through Space Age on an M1 MacBook Air, no problem.
In typical Apple fashion, they have ignored the feedback for several years. Nice work, Apple!
Yes!
I remember first implementing this in Planimeter Game Engine 2D; we got a massive resolution list from SDL (through LÖVE, which is what we're built on).
If I remember correctly, we filtered the list ourselves by allowing users to explicitly select supported display ratios first, then showing the narrowed list from there. Not great. Technically there's a 683:384 ratio in there.[1]
But it did enough of the job that users who knew what resolution they wanted to pick in the first place didn't have to scroll a gargantuan list!
[1]: https://github.com/Planimeter/game-engine-2d/blob/v9.0.1/eng...
The notch is such a wildly stupid idea I can't even begin with it. I actually kinda liked the direction they were going with the keyboard Touch Bar... but they killed that.
What's the alternative? A camera on top of the screen? Fragile, unless you could fold it away. Which would be another point of failure.
The notch grants more usable vertical physical pixels than you'd have otherwise, with the only "downside" being a small unusable area in the center where the camera/mic/other sensors are placed.
I put "downside" in quotes because the alternative is just not having those vertical pixels at all, and having the screen end below the sensors.
No it's not