I think some Chinese hardware hackers have upgraded the RAM and possibly the SSDs on Mac minis, Studios, and MacBooks, but those are pretty extreme hacks - desoldering/resoldering etc.
The SSD modules in the Mac Studio are removable, so I wonder if you could get hold of a genuine Mac Studio module and just swap in a bigger capacity, or perhaps they lock the firmware of each machine to the drive size it ships with from the factory…?
Either way, Apple is doing this on purpose - I’m sure it would be easily possible if they unlocked the limitation in the firmware.
As far as I can tell, the best the hackers have managed to achieve is to replace the SSD module with a different SSD module of the exact same capacity from a different Studio.
I understand why, from a business perspective, Apple refuses to allow 3rd-party SSDs. What I don’t understand is why they insist that I buy an entirely new machine. I’ve already got an M1 Max and 64GB of RAM. Given the three-week wait, I would think Apple would be eager to take the machine back and give me some kind of credit. But the best they can do is offer e-waste recycling.
Do you mean some way to boot from an external drive?
Sorry, I have nothing to offer but I was wondering about all this stuff the other day and saw quite a few guides on how to do this. Perhaps a loophole that’s been closed?
Also curious why you find 512GB of internal storage a limitation? As a non-Mac user I’ve been trying to figure out how full you can load the internal storage before running into issues.
That would be nice, but it seems clear that Apple does not want you to do that. I would prefer not to depend on hacks because even what would usually be considered normal use cases are being closed off by Apple with successive OS updates.
I would love to do what I did with my 5k iMac, my various Linux boxes (before SSDs got big and cheap) and even my Sun Sparc IPX way back in the day: which is to use the small internal disk for the OS and an external drive for my $HOME directory. This is a very normal and orthodox Unix thing to do because even back in the '70s fast disks were much more expensive than slow disks, so the Unix filesystem was designed so that you could easily and transparently map (“mount”) parts of the filesystem to different physical drives. Apple is slowly but surely chipping away at this functionality, and it is both disappointing and irritating.
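For reference, a minimal sketch of that classic layout on a Linux box: the small fast disk holds /, and a single fstab line mounts the big second disk over /home at boot. The device name /dev/sdb1 is hypothetical; substitute your own.

```shell
# /etc/fstab entry (device name is a placeholder): mount the large second
# disk at /home, leaving the small fast disk for / and the OS.
/dev/sdb1   /home   ext4   defaults   0   2
```

Once mounted, every user’s $HOME transparently lives on the second disk, exactly the arrangement described above.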
There are a lot of guides, but they all boil down to: “sorry, but there is no way to change the capacity of your Mac Studio.” If you are the IT helpdesk for a small company, you might be able to keep one machine alive by cannibalizing the drive from another machine. Or if you just like to futz around with electronics and haven’t discovered modular synths yet.
I had a 2TB external SSD on my 5k iMac that I nearly filled up. When I’m not doing music stuff, I consult/advise on various software projects and often have a half dozen projects being pitched to me to collaborate on at any given time. So I like to have plenty of space for large source code repos as well as space to build, and then plenty of RAM for Docker instances. And I want some free space for audio and video editing.
That’s a possible workaround, but you have to make that setting in every piece of software, and then remember those settings when you move to a new machine. A much more elegant solution is to be able to point $HOME at a physical drive. This worked beautifully in my 5k iMac, and is explicitly disabled in the Studio.
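For reference, the approach that worked on the Intel iMac looks roughly like the steps below. The username "alice" and the volume name are placeholders, and per the posts in this thread, the same steps reportedly break login on the Studio.

```shell
# Worked on Intel Macs; reported broken on the Studio.
# "alice" and "ExternalSSD" are hypothetical names.

# 1. Copy the existing home directory to the external volume:
sudo rsync -a /Users/alice/ /Volumes/ExternalSSD/alice/

# 2. Repoint the account's home in Directory Services (the same thing the
#    Users & Groups > Advanced Options GUI does):
sudo dscl . -create /Users/alice NFSHomeDirectory /Volumes/ExternalSSD/alice

# 3. Log out and back in so the new $HOME takes effect.
```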
The issue isn’t performance - the Studio’s I/O capabilities are more than adequate to host a high performance external SSD. The problem is that Apple does not want you to use external storage for your primary workspace. There are workarounds, but they are ugly and error prone.
That’s… messed up. Have you tried adding it to your .zprofile or .zshrc? Though .zshrc is only read by interactive shells, so I guess .zprofile would be the only option at login. That’s really unfortunate.
Does macOS still honor a plain .profile, to avoid ambiguity about which shell programs might be reaching for?
.zprofile, .zshrc and other “dotfiles” live in $HOME, so the problem needs to be addressed before those files are read. Because if those files can be read then $HOME is already mounted.
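To spell out the chicken-and-egg problem: zsh reads its system-wide files from /etc on the boot volume, but every per-user counterpart lives in $HOME, so none of them can help if $HOME never mounts.

```shell
# zsh login-shell startup order (system file, then per-user file in $HOME):
#   /etc/zshenv    ->  ~/.zshenv
#   /etc/zprofile  ->  ~/.zprofile
#   /etc/zshrc     ->  ~/.zshrc
#   /etc/zlogin    ->  ~/.zlogin
# Everything in the right-hand column is unreachable until $HOME is mounted.
```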
My best guess as to what is happening is that Apple’s init system simply forbids $HOME from living anywhere other than the volume where / is mounted, and wedges the login process as soon as it sees that case. Most likely this is officially a security concern and it’s merely a happy accident that it forces you onto Apple’s absurdly expensive SSDs. If this behavior is documented, I’ve been unable to find it.
Thanks, yeah, that’s all I’ve been trying to find (admittedly could have searched more doggedly) - just what the accepted wisdom is for the Mx macs.
Everything I found was just “if you stream/browse/use cloud apps get this, if you do content creation get this…” yeah no shit, haha. Just tell me how much I should keep free to keep the OS happy.
And like, is it a rule of thumb to keep equal space to your RAM free for the swap file, plus 20%?
I would leave some wiggle room, but 20% free should be plenty for swap and SSD housekeeping until you get past about 70% full. If you start getting close, back off a little, since SSDs need free space to trim and wear-level. You can go over 80%, but I/O speeds drop. That was the case 5 years ago, though. Maybe that has changed?
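As a back-of-the-envelope sketch of the rule of thumb floated above (free space equal to RAM for swap, plus 20% of the drive for SSD headroom), with 64GB RAM and a 2TB SSD as purely illustrative numbers:

```shell
#!/bin/sh
# Rule-of-thumb free-space calculator. The figures are hypothetical
# examples; substitute your own machine's RAM and SSD size.
ram_gb=64      # installed RAM, for swap headroom
ssd_gb=2000    # total SSD capacity (2 TB)

# free space to keep = RAM (swap) + 20% of the drive (trim/wear-leveling)
free_gb=$(( ram_gb + ssd_gb * 20 / 100 ))
echo "Keep at least ${free_gb} GB free"
# prints "Keep at least 464 GB free"
```

Whether 20% is still the right margin on current SSD controllers is an open question, as the post above notes.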
FWIW, my Intel Macs have always been swap monsters, despite maxing their RAM capacity. I’m not sure I’ve ever made my 64GB Studio swap. I’m not sure why this would be - perhaps Apple didn’t prioritize memory management in the Intel era. Perhaps the switch to ARM required enough kernel work that the engineers were able to finally prioritize fixing the memory management system.
It’s also possible that 64GB is merely enough for casual use, and you still need a lot of free space if you have 8GB or 16GB, or heavily load a 64GB machine.
Yeah, my Intel Mac will gladly go to 10GB of swap if I don’t reboot it for a week, yet htop shows 8GB free, even though in theory I should never have run out of RAM. There seems to be a little swap on the new MacBook too, but that one only has 16GB, so I’ll allow it. It never goes crazy, at least so far. Maybe it’s paging out long-standing memory that doesn’t get accessed often.
I did that, and on the next reboot I couldn’t log in to my user account at all. I had to boot up in Safe Mode and create a new account.
Moving the Home directory setting worked great on my old Intel 5k iMac, just not on the Studio.
Correction - I was able to boot from the external SSD on the 5k iMac. I just bought a 2TB drive so I could move the entire / to a new machine if the 5k ever failed or needed servicing.
Airs (with mobile accounts enabled) and minis.
They’re only bound for the use of network passwords/accounts. They are excluded from all GPOs, security policies, etc, and managed separately (as they should).
I didn’t set up the forest, but I can check whether there’s anything odd; it’s been the same standard Mac setup I’ve been doing for eons, since before the M series.
I find it hard to believe it’s something Studio-specific.