Apple Silicon
June 22, 2020 1:18 PM   Subscribe

Apple’s WWDC 2020 conference began today with announcements on iOS 14’s new home screen, Apple CarKey, App Clips (similar to Android’s Instant Apps), improved handwriting recognition on iPadOS, 3D audio on AirPods Pro, handwashing and sleep tracking on WatchOS, macOS 11 “Big Sur”, and of course, Apple’s transition away from Intel and to its own in-house ARM processors for its computers.
posted by adrianhon (163 comments total) 14 users marked this as a favorite
 
This is the third time they have switched processor architecture: 68k to PowerPC to Intel to Arm.

This news will no doubt overshadow everything they announce today, which is funny because if Apple does the job well, it should be seamless.

(I just want a Mac mini in the form factor of an Apple TV.)
posted by wenestvedt at 1:21 PM on June 22


I am dancing about this. Intel has been shit for decades, and even before Apple switched away from PPC, Intel were monstrous assholes doing what has in hindsight been almost inconceivable damage to the design of microprocessors through monopolistic, anti-competitive actions. Apple has pretty much turned this around with their SoCs, and if nothing else, finally being able to anticipate computers getting markedly faster from generation to generation is reason to rejoice.
posted by seanmpuckett at 1:24 PM on June 22 [17 favorites]


Sleep tracking is what keeps me wearing a Fitbit instead of an Apple Watch, and I would switch for this so my data can all be in one place...but if you have to charge the watch every night, how can it track your sleep?

Are you supposed to charge it while you shower & dress, or something?
posted by wenestvedt at 1:26 PM on June 22 [5 favorites]


Wenestvedt, I charge my watch while I'm having coffee and breakfast and doing the morning e-mail. It takes less than 2 hours to top up after a day's use.
posted by seanmpuckett at 1:27 PM on June 22 [4 favorites]


Wrist implants feeding off the glucose levels in your bloodstream. So you can feed the techno-vampyres.

And you'll need major surgery every few years when someone at Apple changes their mind about the interface.
posted by bonehead at 1:28 PM on June 22 [6 favorites]


A bunch of iMacs got put on the Apple Refurb Store today, so I pulled the trigger and ordered a nice new 27" iMac to replace my aging mid-2012 non-retina 15" MacBook Pro.

iOS 14 looks really exciting! I'm going to completely rethink my entire homescreen, I expect.
posted by SansPoint at 1:32 PM on June 22


Typing this on a new MBP idling in my lap at a not-cool 50°C.

Since 2006 it has pained me ever so slightly to know what a crap ISA was running under the hood.

I hope we'll get some performant game-friendly Macs out of this soon, as AGP/PCIe has been a hobbling factor in this area from day 1. Integration is going to rock, he says hopefully.
posted by Heywood Mogroot III at 1:34 PM on June 22 [3 favorites]


I hope we'll get some performant game-friendly Macs out of this soon, as AGP/PCIe has been a hobbling factor in this area from day 1.

Well, the end of 32-bit sliced off a massive portion of the Steam library. AFAICT, you'll get game-friendly Macs only in the sense that you can run Clash of Clans on macOS and iOS.
posted by pwnguin at 1:41 PM on June 22 [16 favorites]


wenestvedt, I too find this weird, but in reality you don't need to charge ANYTHING for the hours and hours that used to be required. I have a friend who wears his Apple Watch overnight for sleep tracking, and he finds that putting it on the charger for his shower/shave/dress/coffee cycle every morning -- less than an hour -- is enough.

The other weird charging thing that comes up a lot is the insane rage from Apple-haters about the Magic Mouse's charging port. It's on the bottom, so you can't use it while you're charging. This was absurdly triggering to lots of people who refused to read further, but if they had they would have found that (a) it needs only a trivial amount of charging time to finish a working day and (b) it warns you as the battery gets lower, so you have plenty of opportunity to pop the cable in when you go to a meeting or to get coffee or even leave for the day. I use this mouse 50 hours a week; the charge port thing is an absolute non-issue.

(As for me, I don't sleep in a watch; I don't think I could. I used to, but quit when I started dating my wife almost 20 years ago, and now the new habit is stronger than the old one.)
posted by uberchet at 1:44 PM on June 22 [7 favorites]


Tim Cook would rather sever a limb than give ARM any credit. The ARM processor is one of the masterpieces of tech history, and Steve Furber and Sophie Wilson should be a lot more famous than they are.
posted by w0mbat at 1:47 PM on June 22 [25 favorites]


This looks like a great iOS update and I'm very excited about the ARM-based Macs.

The iOS home screen has been a mess since day 1. It's nice to see them cleaning that up.

Also really nice that they are going to support "EV Routing" in Apple Maps: it will integrate with your electric vehicle to include charging stops in your long-distance trips.
posted by Winnie the Proust at 1:51 PM on June 22 [4 favorites]


So how long before the 3rd party software vendors all get up to speed with ARM-based MacOS? For my job I need Chrome, Firefox, Docker, IntelliJ IDEA, all of MS Office including Outlook, Zoom/RingCentral, ESET Security, GlobalProtect VPN, and a ton of random other software development stuff.

I'm not against ARM in theory; it's a better architecture than Intel's. But I'm not looking forward to the transition, and I doubt that my company's IT people are going to be happy about having twice the amount of testing to do.
posted by octothorpe at 1:54 PM on June 22 [4 favorites]


Apple: "Apple Maps will finally support bicycle routes!"
Me: 🎉🥳🎊
Apple: "...in select major cities plus the ones our executives live in."
Me: ⛈😭🌪
posted by ardgedee at 2:05 PM on June 22 [19 favorites]


Some cool stuff. Wanted to see an Arm mac for ages, but I bought a new Mini pretty recently so it'll be a while before I get one. Want to hear more about this built-in virtualization, just Linux?

Lost well down the list from all the big announcements, Apple dropped a teaser for their Foundation tv series, and it looks hype.
posted by rodlymight at 2:08 PM on June 22 [6 favorites]


Jared Harris! Lee Pace!! OK, also Asimov, but nothing’s perfect.
posted by adrianhon at 2:11 PM on June 22 [5 favorites]


The Newton owner in me smiled at the iPad Scribble stuff. It all comes around eventually.
posted by JoeZydeco at 2:13 PM on June 22 [14 favorites]


When I got my current 13" MBP in 2013, it was the computer I wanted all my life. (My first computer was an Apple IIc.) I have to say that it is surprisingly functional still, all things considered, and I do design/photography/video stuff on it. Software-wise, devices are doing a lot more, but the seams are showing. I just rebooted because my iPad wasn't showing up in AirDrop, which feels weird after a few golden years of everything just working. I have high hopes right now that within a year I will get another laptop that I am literally in love with.
posted by snofoam at 2:13 PM on June 22 [1 favorite]


It'll be interesting to see if the new Macs support Thunderbird, which Apple's been hyping for like a decade now, considering it's an Intel technology and the dev kits reportedly lack it currently.
posted by General Malaise at 2:14 PM on June 22


FWIW I sleep in my Apple Watch (Series 5, using Autosleep) and I wake up and throw it on the charge while I shower without issue. I used to be Fitbit because of the sleep tracking and I'm looking forward to tighter integration with the WatchOS.
posted by msbutah at 2:17 PM on June 22 [2 favorites]


> The Newton owner in me smiled at the iPad Scribble stuff. It all comes around eventually.

The Newton's handwriting recognition has been available as the Ink application but as far as I can tell not actually usable as a utility in any Apple product since Jobs fridged the Newton in 1997. This might be the longest dormancy for a technology that I can think of.
posted by ardgedee at 2:20 PM on June 22


All I have to say is that I like the Big Sur desktop background. It's simultaneously a throwback to the swoopy abstractions of the 10.4 era, and a minimalist illustration of the actual Big Sur coast. Neat!
posted by theodolite at 2:22 PM on June 22 [1 favorite]


(It's not truly on topic, but JOEZYDECO I SEE YOU. I still have a 2100, and it boots up just fine. What a wonderful device they were, at precisely the wrong time.)
posted by uberchet at 2:27 PM on June 22


I don't trust the Apple of the last half-decade for half a dozen reasons, some of which will make me *very* cautious about buying into their platform shift. There's no way I'll be buying into any first round of revisions (hell, I might not be buying anything that will not run Mojave for a while).

I'm more ambivalent about the promise vs. peril of a world beyond the one of the last decade, where the desktop was essentially unified behind x86. I'm told x86's baggage is significant and that there's a lot of performance/energy potential beyond it. I also know that the period between when Apple ditched PPC and Catalina has been the most convenient time for running arbitrary software on a given Apple machine that I've seen in 35 years.
posted by wildblueyonder at 2:31 PM on June 22 [8 favorites]


Ardgedee, I'm going to say based on my experience with bike routing using Google Maps that an app with a slow careful rollout is a better choice than getting dumped onto a 45 MPH uphill sharrow during rush hour with the occasional bit of shoulder with a rattlesnake to bunny hop over.
posted by BrotherCaine at 2:41 PM on June 22 [3 favorites]


Can someone speculate what might happen to Java based apps such as ImageJ under the new regime?
posted by dhruva at 2:47 PM on June 22 [4 favorites]


It'll be interesting to see if the new Macs support Thunderbird, which Apple's been hyping for like a decade now, considering it's an Intel technology and the dev kits reportedly lack it currently.

This is an interesting question. With USB4 consisting of USB3 plus the now royalty-free Thunderbolt 3 it would seem likely to me, but on the other hand if Apple can make a $70 markup on a cable for "AppleMagicConnect Thunderbird" you know there will be some loud voices inside the company advocating for that.
posted by whir at 2:50 PM on June 22 [2 favorites]


Yeah, I wonder if USB4 is going to come fast enough to just swap and replace, and it's just not ready enough for the dev kits.

Also, unrelated, I wonder if on the new Macs there will be a choice of processors, like i3/i5/i7 at different price points, or if each model will just get one processor per generation, like the iDevices.
posted by General Malaise at 2:54 PM on June 22


Can someone speculate what might happen to Java based apps such as ImageJ under the new regime?

It's speculation as you say, but the JVM has been working on ARM chips for a long time (eg: Android phones, in their earliest incarnations), so I'd be shocked if there wasn't a way to run them in Apple / Arm machines. (The Java ecosystem itself has recently been moving towards more emphasis on natively-compiled binaries via the GraalVM project, too, which would presumably be an option for apps written in Java as well.)
posted by whir at 2:56 PM on June 22 [4 favorites]


(Er, that's Thunderbolt, not Thunderbird.)
posted by whir at 3:00 PM on June 22 [3 favorites]


I just bought a quad-core 2020 MacBook Air last month but I'm very stoked for a smaller ARM-based model like the 2015 MacBook (with a decent keyboard.)
posted by porn in the woods at 3:00 PM on June 22


Yeah, I made the same mistake, I blame autocorrect
posted by General Malaise at 3:01 PM on June 22


Huh. Second MacOS update in a row that has precisely 0 new features I'm interested in.
posted by signal at 3:01 PM on June 22 [7 favorites]


Apple: "Apple Maps will finally support bicycle routes!"
Me: 🎉🥳🎊
Apple: "...in select major cities plus the ones our executives live in."
Me: ⛈😭🌪


I work remotely (so I don't live in any of the cities mentioned during the keynote), and after installing the same build that is going out as the developer seed build today, I have cycling directions.
posted by sideshow at 3:02 PM on June 22 [3 favorites]


Couple of thoughts:
  1. Being able to run iOS apps in MacOS is going to be HUGE, but I do worry about where it breaks things.
  2. They did a good job at keeping benchmarks and performance hush-hush; I can't wait to see benchmarks and battery life numbers come October or so.
  3. Tim Cook said "When we look ahead we envision some amazing new products and transitioning to our own custom silicon is what will enable us to bring them to life." and I can't wait to see how this makes new products available.
posted by furtive at 3:04 PM on June 22


As for getting native versions of apps, no worries. If an app has been ported to x64, building for ARM should be no more than a simple recompile. Most of the work will be messing with the build scripts; those are usually an unholy mess.
posted by sjswitzer at 3:11 PM on June 22 [3 favorites]


I think ten years ago the macbook pro was the perfect machine for developers, nowadays I can't help but look towards the windows side of things (especially with WSL2.)

I really wish I could build a pc and install MacOS on it without having to worry about compatibility. Of course, the move to ARM is just going to make that kind of thing even harder.
posted by simmering octagon at 3:11 PM on June 22 [2 favorites]


Having used Macs through the last two architecture transitions, I'm not too concerned about what might go wrong with this one. This looks to be smoother than before, since most (all?) Mac developers are already using Xcode, which has already been compiling to both architectures, and it sounds like this will be little more than a recompile (admittedly, that might just be Apple's happy talk). Apple's been tipping its hand for a while with Catalyst apps, which are cross-compiled going the other way.

I bought a G4 iMac right after Apple announced the transition to Intel. My wife's MacBook is getting long in the tooth—I just replaced the battery on it yesterday, and we had a talk about whether to sell it and replace it with something new. But buying a new Mac now is a more interesting decision.

I'm curious if we'll ever see a dual-boot iPad—either blessed by Apple or as a tinkerer's project.
posted by adamrice at 3:14 PM on June 22 [2 favorites]


So how long before the 3rd party software vendors all get up to speed with ARM-based MacOS? For my job I need Chrome, Firefox, Docker, IntelliJ IDEA, all of MS Office including Outlook, Zoom/RingCentral, ESET Security, GlobalProtect VPN, and a ton of random other software development stuff.

Probably not long, considering that most of the software on your list has iOS options.

Some tools will need porting, but unlike when Apple switched from PPC to Intel, and Adobe and Microsoft dragged their feet for two or three years, those vendors were demoing product earlier today. Oracle already has a JDK targeting ARM.

The development situation is much better today than it was when Apple was a smaller player. Some companies won't have to pay developers separately to write for the iOS and macOS platforms, but can instead compile once against the Catalyst frameworks and spend more time on UI tweaks. Even the machine learning frameworks used will effectively be less hardware "gnostic".

I'm pretty excited to see what they do. I predicted long ago that Apple would see a future where they would need to prioritize energy efficiency, while preserving or even improving on performance per Watt, and they've had to move past Intel themselves to get there.

That's a pretty huge story, in itself.
posted by They sucked his brains out! at 3:15 PM on June 22 [4 favorites]


Huh. Second MacOS update in a row that has precisely 0 new features I'm interested in.

I was definitely not interested in the "no longer run any 32-bit binaries" feature of Catalina and haven't updated any of my Macs.

Being able to run iOS apps in MacOS is going to be HUGE, but I do worry about where it breaks things.

For years, downward pricing pressure in the iOS App Store has made apps 10% of the cost of their desktop app equivalents. This is going to be a huge problem for anyone who currently prices their software that way.
posted by 3j0hn at 3:17 PM on June 22


Huh. Second MacOS update in a row that has precisely 0 new features I'm interested in.

At some point I probably should get around to upgrading to Catalina.
posted by octothorpe at 3:18 PM on June 22 [3 favorites]


Super-excited to see native handwriting recognition on the iPad... got mine last fall (Pro, with pen) and had to download something third-party and kludgy.
As for all the iPhone/iOS stuff... well, it's nice to see them catching up to Android, I guess :/ To be fair, the car key thing and a few minor things are new, but I just got the new Galaxy A71 (top of Samsung's new midrange: 6.7" AMOLED, 6GB/128GB, Snapdragon 730, 64MP main camera, 12MP ultrawide, 5MP macro, 5MP depth, 32MP front (all excellent), $365 unlocked on Amazon) and it has like ALL of these things, even the AR stuff (which you can draw/make your own), and not just on the front camera. The translation will even do AR overlays, like point it at a sign and it will become English onscreen. (I haven't tried this yet... but I DID just remember a bunch of ephemera I got in Japan. I know what I'm doing tonight!)
Also... widgets are a new thing? Jeebus, Apple, really? Also... still no live wallpapers? It's been over a decade now. C'mon Apple, pull it together, gurl.
That being said, while it may not be all that original, it does look very nice and slick, and like everything plays really nicely together, but Android is catching up on that front a lot... or maybe just Samsung is. (And sorry to be the Android-vs-Apple guy in this thread, but I did JUST get this new phone (and I NEEDED a new one; my old one only worked plugged in. It's crank operated. It's made of wood.) and have been very impressed with what is supposed to be a budget model. Very.)
posted by sexyrobot at 3:21 PM on June 22 [3 favorites]


I love that Apple for the Mac has now gone from a CISC processor to a RISC, back to another CISC, and now finally back to a RISC. They're like my cat who can't decide if she wants to be in the window or on the floor, but on a timescale that lasts decades.
posted by General Malaise at 3:25 PM on June 22 [14 favorites]


My main MacBook Pro is still on Mojave, largely due to audio plugins which will never be updated to 64-bit, and unfinished music projects depending on them, which would become junk if they died. I now have the dilemma of whether to buy an Intel MacBook running Catalina and then Big Sur, or hold out for the ARM MacBook.
posted by acb at 3:31 PM on June 22


They did a good job at keeping benchmarks and performance hush-hush; I can't wait to see benchmarks and battery life numbers come October or so.

We're going to know sooner than October since Apple is shipping out dev kits next week. I'm cautiously optimistic — the Rosetta 2 translated demo of Maya was extremely impressive as was the UI responsiveness in Photoshop (and I do like that their demo file had hundreds of layers named "Layer XX", that's some attention to detail).
posted by nathan_teske at 3:39 PM on June 22


My main MacBook Pro is still on Mojave, largely due to audio plugins which will never be updated to 64-bit, and unfinished music projects depending on them, which would become junk if they died.

My workhorse iMac is forever frozen at Mavericks because of my need to run CS5 (especially Illustrator) reliably, and my refusal to pay a monthly Adobe tithe.
posted by Thorzdad at 3:45 PM on June 22 [2 favorites]


Moving to ARM for MacOS was inevitable and is good, but I agree with simmering octagon that the intel macs were simply awesome for cross-platform development. Someone needs to make a small USB-C device you can plug into your mac to run Windows on.
posted by sjswitzer at 3:47 PM on June 22 [4 favorites]


My main MacBook Pro is still on Mojave, largely due to audio plugins which will never be updated to 64-bit, and unfinished music projects depending on them, which would become junk if they died.

I read an interview with the editor for Parasite who's still running Yosemite because he won't switch from Final Cut Pro 7.
posted by octothorpe at 3:53 PM on June 22 [6 favorites]


My main MacBook Pro is still on Mojave, largely due to audio plugins which will never be updated to 64-bit, and unfinished music projects depending on them, which would become junk if they died. I now have the dilemma of whether to buy an Intel MacBook running Catalina and then Big Sur, or hold out for the ARM MacBook.


Catalina is good for music these days. All the plugins I use are working and I have a ridiculous number of synth plugins. The Novation, Korg and Arturia ones are fine.
NI software requires you to grant Full Disk Access (in the Privacy tab of the Security & Privacy control panel) to some of their bits. Logic is fine. Ableton is fine (if you update it, $$$). Pro Tools works but not the free version, because Avid is a money grubbing bastard.
posted by w0mbat at 3:53 PM on June 22


I hope we'll get some performant game-friendly Macs out of this soon

Ha, hope you like iOS games. Moving to ARM is going to just completely end the concept of MacOS running ports of Windows games. Also skeptical high end GPUs will even be possible in the ARM hardware architecture, although it's not impossible.
posted by Nelson at 3:55 PM on June 22 [6 favorites]


you know you guys all sound like grandpa simpson ranting about the metric system being the tool of the devil, right?
posted by entropicamericana at 3:56 PM on June 22 [27 favorites]


I stopped at High Sierra because goddamn it I have CS5 and I'm not fccking renting a meaningless update just for...[insert a reason to update a MacOS version that still gets security updates].

But I know my time is limited. This machine is not fast (although it still runs CS5 like a dream, unlike how my much newer work computer runs CC).
posted by General Malaise at 3:56 PM on June 22 [1 favorite]


you know you guys all sound like grandpa simpson ranting about the metric system being the tool of the devil, right?

My 2013 MacBook Air gets forty rods to the hogshead and that's the way I likes it!
posted by General Malaise at 3:58 PM on June 22 [13 favorites]


I'm feeling quite pleased with myself that I did my big phone clean up a few months ago and my home screen still looks like this.
posted by lucidium at 3:59 PM on June 22 [3 favorites]


The architecture change will certainly present new and exciting challenges to the Hackintosh community.
posted by thedward at 4:05 PM on June 22 [2 favorites]


Ha, hope you like iOS games. Moving to ARM is going to just completely end the concept of MacOS running ports of Windows games. Also skeptical high end GPUs will even be possible in the ARM hardware architecture, although it's not impossible.

This just isn’t true. All the major game engines have native (Metal) ports to both iOS and to macOS 64-bit. Many older games were never updated to use the current engine versions with 32-bit support, but coming out with a 64-bit, Catalina-supported game port in 2020 is no harder than coming out with a 32-bit macOS game in 2018. And those 64-bit ports will recompile to ARM just as easily as anything else.
posted by doomsey at 4:06 PM on June 22 [3 favorites]


The architecture change will certainly present new and exciting challenges to the Hackintosh community.

I, for one, am excited for some teenager to figure out how to install MacOS on either an iPod Touch or an Apple TV and document it in an overly detailed 1.5-hour YouTube video.
posted by General Malaise at 4:13 PM on June 22 [7 favorites]


There's no way I'll be buying into any first round of revisions
I am as much of an Apple fan as anybody, but this is solid advice FOR ANY TECH CHOICE EVER ON ANY PLATFORM.

In fact, my reaction to the 2005 "we're shifting to Intel" announcement was to upgrade early, after only 2 years instead of my usual 3, to get a nice long runway on known-good PowerPC hardware (an Aluminum Powerbook).

Kind of ironically, I only ended up using that machine for 2 years, too, because I took a job in late 2007 that required Windows software, so I got a Macbook Pro. But at that point Apple had had nearly 2 years to shake out any kinks. Surprising absolutely nobody, my first Intel Mac was a joy to use. (I actually don't remember what happened to it; I suspect I gave it to a friend when it came time to upgrade.)

Anyway, the transition went pretty darn well, but they are inherently risky things, so why be first through the door?

(This post sent me to my packrat-corpus of notes, which is how I know that I've had 8 Macs since 1999, starting with a PowerBook G3 and culminating, for now, in the 6-core Retina MacBook Pro I bought last summer that I'm typing this on.)
posted by uberchet at 4:14 PM on June 22 [4 favorites]


uberchet: I had a similar experience when I made my first Mac purchase right at the start of the Intel transition with a Mac mini G4, based on the mistaken assumption they'd update the mini last.

They did not.

But that mini, and a companion iBook G4 gave me a couple really solid years of use throughout college. Apple supported the PowerPC for a good five years, and I suspect with the volume of Intel Macs out there, Intel Macs will get at least that much support. I eventually got a MacBook (non-Pro) as a graduation gift, which lasted me about five years, until I got my MacBook Pro, which has lasted me seven.
posted by SansPoint at 4:20 PM on June 22 [2 favorites]


Tim Cook would rather sever a limb than give ARM any credit. The ARM processor is one of the masterpieces of tech history, and Steve Furber and Sophie Wilson should be a lot more famous than they are.

It is; though it is also a commodity. There are a few orders of magnitude more ARM CPUs in existence than anything else, because "ARM CPUs" covers everything from high-end phones and new servers to the microcontrollers in hard drives to disposable RFID tags. And Apple's A-series CPUs aren't just regular ARM cores with some iPhone-specific silicon added; Apple have invested a lot into optimising and building on them. So they have an excuse not to present their brand-new architecture, which they're angling for buy-in on, as just another ARM.
posted by acb at 4:25 PM on June 22 [7 favorites]


We're going to know sooner than October since Apple is shipping out dev kits next week.

The dev kits are using an A14 fine-tuned for iPhone/iPad workloads, rather than one optimised for the Mac. It'll run, and it'll be fast, though it probably won't be optimal for desktop use in the tradeoffs made. (Which is probably a reason that it's not in a laptop; power and thermal considerations are a lot less forgiving there.)
posted by acb at 4:28 PM on June 22


ARM(ish) is a great great move. Everything else is ho-hum. I hope the wider ARM world gets an injection of productification in response. I have to think once the benefits in Apple's version start making news, the benefits will be demanded of Lenovo, Dell, etc. There's a lot of traveling workers out there who'd like a Macbook Air that didn't require them to carry a charger too much of the time.

There's no way I'll be buying into any first round of revisions
I am as much of an Apple fan as anybody, but this is solid advice FOR ANY TECH CHOICE EVER ON ANY PLATFORM.


I think Word for Windows in 1989 was the first time I became acquainted with this particular piece of advice.
posted by rhizome at 4:31 PM on June 22 [3 favorites]


Catalina is good for music these days. All the plugins I use are working and I have a ridiculous number of synth plugins. The Novation, Korg and Arturia ones are fine.
NI software requires you to grant Full Disk Access (in the Privacy tab of the Security & Privacy control panel) to some of their bits. Logic is fine. Ableton is fine (if you update it, $$$).


I have a stack of 32-bit plugins which will never be updated, and projects depending on them (reFX, for example, notoriously released a bunch of great-sounding synths in the late 00s, including a skronky-sounding virtual guitar, and a virtual analogue with interesting-sounding filters, and are on record as saying that they will never be updated to 64-bit because they're not selling them any more, and they've moved on to some new more EDM-ish synth or something). Unless I want all my old projects to turn to either garbage or a folder of petrified WAV mixdowns, I'm pickling a machine capable of opening them, which means no Catalina.
posted by acb at 4:32 PM on June 22 [5 favorites]


Ha, hope you like iOS games. Moving to ARM is going to just completely end the concept of MacOS running ports of Windows games. Also skeptical high end GPUs will even be possible in the ARM hardware architecture, although it's not impossible.

Um, you saw the Unity and Dirt Rally demos, right?
posted by acb at 4:33 PM on June 22 [3 favorites]


The architecture change will certainly present new and exciting challenges to the Hackintosh community.

Making an ARM Hackintosh would be on a par with getting iOS running on an Android handset. The hardware is a lot less standard.
posted by acb at 4:35 PM on June 22 [1 favorite]


I hope all of us have somebody who looks at our projects the way Tim Cook looks at the opportunity to say "undisclosed location".
posted by mhoye at 4:36 PM on June 22 [3 favorites]


The dev kits are using an A14 finetuned for iPhone/iPad workloads

The Developer Transition Kit is using the Apple A12Z, introduced back in March with their latest iPad Pro. Apple's latest architecture is in the A13 (albeit with fewer cores than the A12Z), introduced with the iPhone 11. The A12Z in the dev kit is definitely not optimized for serious desktop workloads, since it is designed for entirely passive cooling. I anticipate the chip introduced with the new ARM-based Macs will significantly outperform the A12Z.
posted by RichardP at 4:39 PM on June 22 [5 favorites]


Getting a transition dev kit would be a fun toy, but since I'm not a macos developer, it would just be a toy.

My take on the processor transition is less that Intel is so terrible but rather that 3rd party fabs are now so good and ARM is a sufficiently battle-tested architecture that you can just stop paying Intel to do all that stuff for you. Apple has a supply chain like no other company on the planet so a little more vertical integration isn't a big change. They're unlikely to pass the $100-ish BOM savings on to consumers, they'll just make Macs slightly more profitable and bulk up their cash reserves a minuscule amount.

We already see this sort of approach in datacenters where Google/FB/AWS basically have bespoke systems with custom motherboards and special processor SKUs and possibly even custom silicon ML support chips. The days of PCs being defined by a single monster CPU are over. Now it's about GPUs, bespoke CPUs and other specialized hardware.

Also I didn't get to watch the announcement - is Apple going to start shipping gaming-level GPUs at any point? It seems weird that they don't seem to want to touch high-end consumer GPUs.
posted by GuyZero at 4:51 PM on June 22 [4 favorites]


I also had to chuckle at the little subterfuge that went on when Cook introduced Johny Srouji and the hardware team in their "undisclosed location".

It backs up the old rumor that the hardware team revolted and wouldn't move to the new Apple Park with its open office layout. They are in traditional offices somewhere nearby.
posted by JoeZydeco at 5:06 PM on June 22 [5 favorites]


There are still a bajillion Apple buildings in Cupertino. Their old HQ is still intact and fully functional over on Infinite Loop. No one has to say "fuck it" - Apple Park only has room for maybe half of Apple's total headcount in the area.
posted by GuyZero at 5:12 PM on June 22 [6 favorites]


The Newton owner in me smiled at the iPad Scribble stuff. It all comes around eventually.

Egg freckles!
posted by Halloween Jack at 5:18 PM on June 22


> Tim Cook would rather sever a limb than give ARM any credit. The ARM processor is one of the masterpieces of tech history, and Steve Furber and Sophie Wilson should be a lot more famous than they are.

As far as I can tell from a read of Wikipedia, Apple was a founding member of the joint venture that became Advanced RISC Machines, and held a sizable stake for years before selling it off. So what Apple has spared them in praise they've made up with money.
posted by ardgedee at 5:32 PM on June 22 [6 favorites]


Apple has an Architecture License from ARM, not a subscription, term, or any of the other licenses so they can (and have to) do more or less whatever they want. They're not licensing processor/core designs (like most ARM licensees), they're pretty much licensing the ISA. They get a test suite and core specs, and the rest is up to them. So, frankly, why would they laud ARM? They're not using ARM cores, they're using the ARM ISA to make their own cores.

There are something like 15 Architecture licensees, vs >300 more run of the mill licensees.
posted by aramaic at 6:05 PM on June 22 [11 favorites]


I’ve only looked at one thing (so far): compatibility with old hardware. I’m impressed that the iPad Air 2 and the 1st gen iPhone SE will still be supported. I ditched both of mine this winter, anticipating they would be dropped. Now I can tell my son and a friend their devices are good (secure) through at least Fall of 2021.
posted by lhauser at 6:07 PM on June 22 [1 favorite]


Thorzdad wrote: My workhorse iMac is forever frozen at Mavericks because of my need to run CS5 (especially Illustrator).

Have you looked at the Affinity tools (Photo and Designer)? They aren't at 100% feature parity, but they're pretty good.
posted by CheeseDigestsAll at 6:08 PM on June 22 [5 favorites]


Sleep tracking is what keeps me wearing a Fitbit instead of an Apple Watch, and I would switch for this so my data can all be in one place...but if you have to charge the watch every night, how can it track your sleep?

I use a sleep-tracking app on my current Gen4 Apple Watch called AutoSleep, and it works great. My watch wearing pattern is basically: Wake up (thank you silent buzzing watch alarm), go for morning walk. Come home, put watch on charger, shower, make breakfast & the day's lunch, get dressed, put watch back on, go to work, home from work, watch on charger again while making dinner and doing dishes, put watch back on until the next morning. Those two charging periods are enough to keep it topped up.
posted by xedrik at 6:25 PM on June 22 [1 favorite]


Re: the lack of native MacOS ports for a lot of PC games... I am not a software developer, but here is my understanding of it. I used to play Final Fantasy 14 a lot, on my Mac Pro which dual-boots to Win10. When they announced a Mac port for the game, I was thrilled. But then, I found out it's not really a Mac port, it's basically a WINEskin of the Windows version, wrapped up for MacOS. Performance was awful in MacOS. Same hardware, same graphical settings, I'm just running MacOS instead of Windows. I had a very nice chat with upper-level support, and their explanation was basically this: DirectX is very, very good at handling lots of things happening on-screen at once, like you'd find in a MMORPG or a MOBA. OpenGL... isn't. There is no DirectX for MacOS.

Many of the games in my library have Windows and Mac versions (Creativerse, Borderlands 2, Team Fortress 2, Dota 2 off the top of my head) and the Windows versions are just so much faster and crisper, even though I'm playing on exactly the same hardware, just booting into Win10 or MacOS. (But like, Photoshop and Premiere are notably faster on the Mac side, so... /shrug )
posted by xedrik at 7:26 PM on June 22 [3 favorites]


(Rhizome, how right you are. I evaluated Word for Windows for the biz school at my college, and the resulting document ended up being as brutal an indictment as I’ve ever had occasion to pen.

I wrote it in WordPerfect.)
posted by uberchet at 7:46 PM on June 22 [5 favorites]


I worked on the 8086 Assembler at Intel back in the late ’70s. I ended up writing the manual for the assembler. At that time there was this big argument about CISC versus RISC processors: complex instruction sets versus reduced instruction sets. All the PhDs at Intel swore that CISC was better and that RISC was a hoax. Well, many years later, after the 8086 has evolved into the current Intel processors, it looks like RISC finally won with the ARM chips. At least with Apple.
posted by njohnson23 at 7:55 PM on June 22 [12 favorites]


So are we really going to be interacting with this new utopian cornucopia of iOS apps on our laptops with a mouse? I'm with family this week and in the last 24 hours I've seen a 2-year-old, a 40-year-old, and a 75-year-old all jab at my MacBook Pro screen in an effort to click or scroll. And now they're going to see all their familiar games and apps with nice big rounded buttons that they have to drag that little arrow on top of and click in order to activate? It just seems bizarrely stubborn of Apple at this point.
posted by chortly at 8:16 PM on June 22


I know that i've had 8 macs since 1999

Lordy. I've had 3 Macs across that entire time. An iMac Graphite SE, a gigantic iMac (with side-load plastic disk drive), and another even more gigantic 2015 iMac, which required buying an external plastic disk media reader, and I'd expected it to last me until maybe close to 2030 because Macs last forever? But now it will likely be deprecated within about 4-5 years.

It'll come close to what I expect for a lifetime of a Mac, really, but it will be sad that it's forced upon me rather than being chosen.
posted by hippybear at 8:21 PM on June 22 [3 favorites]


Well, many years later, after the 8086 has evolved into the current Intel processors, it looks like RISC finally won with the ARM chips. At least with Apple.

It seems virtually certain that you will know waaaaay more about this than me, but I had thought that modern x86 CPUs were mostly RISC with a CISC translation layer?
posted by GCU Sweet and Full of Grace at 8:38 PM on June 22 [5 favorites]


GCU Sweet and Full of Grace-

Nope. Not me. I was there in the early days and all I know is that the 8086 architecture evolved over the years. Translating CISC to RISC sounds like extra work. Somebody else here would know.
posted by njohnson23 at 9:06 PM on June 22 [1 favorite]


I went 12 years between Macs, 2008-2020. I built an i5-4690K Hackintosh in 2015 and another i7-6700K in 2017 but saw the writing was on the wall with the ARM Macs so got the MBP last month (I'm a sucker for 0% interest offers, $27/mo for 18 months, I can pay that ; )

Gamedev might work best going with Unity, don't see any interesting sessions on Apple game graphics APIs...
posted by Heywood Mogroot III at 9:18 PM on June 22


Translating CISC to RISC sounds like extra work

LOL, that's where all the magic is : )
The front-end has two major pathways: the µOPs cache path and the legacy path. The legacy path is the traditional path whereby variable-length x86 instructions are fetched from the level 1 instruction cache, queued, and consequently get decoded into simpler, fixed-length µOPs.
. . .
In the back-end, the micro-operations visit the reorder buffer. It's there where register allocation, renaming, and retiring takes place. At this stage a number of other optimizations are also done.
https://en.wikichip.org/wiki/intel/microarchitectures/skylake_(client)#Pipeline

above URL also states Skylake actually has 180 integer registers in the chip, of course not exposed to the x64 ISA but used by the execution engine to plow through code loads.
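A toy sketch of that cracking step, purely for illustration (no real decoder works in Python, and the µOP names here are made up): a CISC-style read-modify-write instruction with a memory destination gets split into separate load, compute, and store micro-ops, while a register-only instruction passes through as a single µOP.

```python
# Illustrative only: "crack" a CISC-style instruction into RISC-like µOPs,
# roughly the shape of the decode path described above.

def crack(insn: str) -> list[str]:
    """Split 'ADD [mem], reg' style instructions into load/op/store µOPs."""
    op, _, operands = insn.partition(" ")
    dst, src = (s.strip() for s in operands.split(","))
    if dst.startswith("["):
        # Memory destination: needs an explicit load and store around the op.
        return [f"LOAD tmp, {dst}",
                f"{op} tmp, tmp, {src}",
                f"STORE {dst}, tmp"]
    # Register-only form: already RISC-shaped, one µOP.
    return [f"{op} {dst}, {dst}, {src}"]

print(crack("ADD [rbx+8], rax"))   # three µOPs: load, add, store
print(crack("ADD rcx, rax"))       # one µOP
```

The register renaming mentioned above is what lets those 180 physical integer registers stand in for `tmp` and friends without the x64 ISA ever seeing them.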
posted by Heywood Mogroot III at 9:36 PM on June 22 [6 favorites]


This is the third time they have switched processor architecture: 68k to PowerPC to Intel to Arm.

Well, the Mac prototype used a 6809, and surely the 65C816 gets an honorable mention [ducks, narrowly missing copy of Softalk thrown at head]
posted by RobotVoodooPower at 10:20 PM on June 22 [5 favorites]


No one even knew what a PowerPC processor was when Apple announced the Power Macintosh 7100, a working example of which I still have on my desk. This is a tested architecture in an area where Apple has already displayed a high degree of competence. Just give me a reliable keyboard that doesn't leave permanent scratches on the display, components that can actually be repaired and from which data can be recovered in case of a catastrophic failure? Tell me the truth if something can be fixed but it just won't be Apple to do the repair, OK?

It's not the processor switch that concerns me - that will be fine. It's the otherwise dismal general trajectory of how Apple treats its computer customers.
posted by 1adam12 at 11:22 PM on June 22 [8 favorites]


There are still a bajillion Apple buildings in Cupertino. Their old HQ is still intact and fully functional over on Infinite Loop. No one has to say "fuck it" - Apple Park only has room for maybe half of Apple's total headcount in the area.

Not even close to half. Maybe a third.

My team did say "fuck it". Except we did that back in 2015 when AP was still a dirt lot.

The media reporting as if everyone works in one building is weird as hell since the Apple logo is practically on every piece of commercial real estate in Cupertino, and all over the rest of the Bay Area as well. These aren't secret locations.
posted by sideshow at 11:32 PM on June 22 [6 favorites]


Perhaps ironically, I now play all my games on an absurd Linux tower, using the really-unbelievably good Steam Proton wrappers. This could - perhaps! - mean that 2021 is finally the year of linux on the desktop.

I mean, Chinese Democracy is twelve years old now, apparently. Lemme dream, man.
posted by kaibutsu at 12:18 AM on June 23 [4 favorites]


>I really wish I could build a pc and install MacOS on it without having to worry about compatibility. Of course, the move to ARM is just going to make that kind of thing even harder.
High Sierra, Mojave and Catalina work really well as VMs in Linux via these Foxlet install-to-vm scripts (sl-github, instructions).
IANAL/IANYL: Note that an Apple macOS license says that you may run a copy of macOS on Apple hardware and I'm not advocating breaking the promises you made in the licence agreement or infringing the monopoly rights granted to copyright holders.
posted by k3ninho at 12:26 AM on June 23 [5 favorites]


OMG, I made the mistake of trying to build a FORTH on x86_64 and it was utter hell. I figured I'd try to learn myself some x86 assembly and OMG I ended up finding out that 80% of the things were legacy and slow and nobody used them anymore because it's micro-code all the way down for backwards compatibility. It's a HORRORSHOW. That instruction you picked because it did the thing you wanted... it was fast and cool 30 years ago but nobody has used it since, but it's still there. A bazillion variable length instructions and 80% of them are crap. RISC all the way baby, the only sane thing. Do few things, but do them well, don't emulate old things just because you can. Deleted Rant On Closed Source. Plus... They got the MSB/LSB thing totally wrong!

I really don't like x86 architecture if you couldn't tell. Sadly I'm stuck with it.
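For anyone who hasn't hit the MSB/LSB thing in anger: x86 is little-endian, so the least significant byte of a word lands first in memory, which is exactly the surprise you trip over writing a FORTH's number compiler. A quick illustration (worth noting that ARM as Apple configures it is little-endian too):

```python
# Byte order demo: the same 32-bit value packed little-endian (x86 memory
# order) versus big-endian (network / "reads like you'd write it" order).
import struct

value = 0x12345678
little = struct.pack("<I", value)   # how x86 stores it in memory
big    = struct.pack(">I", value)   # most-significant byte first

print(little.hex())  # 78563412
print(big.hex())     # 12345678
```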
posted by zengargoyle at 1:21 AM on June 23 [8 favorites]


My "Late 2013" MBP with 2.8 GHz dual-core i7 and 8 GB of RAM is apparently on the list of supported Macs that will run Big Sur, but given how the fan on it runs constantly now that I'm running Catalina, I don't expect the performance to be great. But upgrading will mean moving to USB-C, necessitating a dongle or eleven for the music gear I use. Ugh. But Big Sur sure looks pretty.

Waiting for Clavia to release 64-bit versions of its Mac software for the Nord keyboards was hell. I doubt they'll be quick to recompile everything again.
posted by emelenjr at 4:29 AM on June 23


Also skeptical high end GPUs will even be possible in the ARM hardware architecture, although it's not impossible.
On the supercomputing side NVIDIA has been saying it’s supporting ARM architectures for a year now (although that sort of ARM chip and that sort of GPU are not fully congruent to the sort you’d see in an Apple machine). I think it’s been doing small device stuff for considerably longer.

I expect any obstacles will be Apple not putting much effort into it themselves rather than anything inherent to ARM architectures.
posted by edd at 4:49 AM on June 23 [1 favorite]




Actually with a little further digging I can find ARM-based desktops available with both NVIDIA and AMD graphics cards today.
posted by edd at 5:16 AM on June 23


This is the third time they have switched processor architecture: 68k to PowerPC to Intel to Arm.

It's the fourth time, including the switch from i386 to x86_64.

I'm not a game developer, but I'd expect the big platforms like Unity will support it quickly. If the machines really are a lot faster, it sounds like a net win for games to me (a non-game developer), unless you count games running in bootcamp or via virtualization as Mac games.

Phew! Nice to be able to permanently sort out all differing opinions in an Apple thread.
posted by ~ at 5:48 AM on June 23


hippybear, I'll own that I upgrade pretty fast. Partly this is because I use my machines for work, and I prefer to have an active AppleCare plan behind my active machine. And of course the other part is that I'm an aficionado and enjoy fancy new computers as often as I can justify them.

Generally I hand off my 3-ish-year-old Macs to my wife, and then her old Mac (at that point 5-6 years old) gets given away or re-used for some other household purpose. So right now #8 is on my desk, under my fingers, and #7 is upstairs on my wife's desk, and #6 (from 2012) is going strong as a NAS/home server.

#5 was stolen, sadly, and the disposition of #1 through #4 is lost to memory.

(Oh, there was also a tiny MacBook Air I bought for Erin in 2012, since the theft in 2012 ruined our usual process. When the quarantine happened we gave it to some friends who didn't at the time have a home laptop. It still works fine, though the hinge is a little loose.)

Anyway, yeah, they typically last a LONG time. I missed the keyboard debacle, so the set of Macs I've owned has been as solid and reliable as the golden-age ThinkPads from the late '90s.
It's the fourth time, including the switch from i386 to x86_64.
Does that really count, though?
posted by uberchet at 6:20 AM on June 23 [1 favorite]


Can someone speculate what might happen to Java based apps such as ImageJ under the new regime?

Perhaps Apple will bring back the Jazelle ARM extensions so that you can execute the BXJ instruction to jump into Java bytecode directly without a JVM.

(Although "distribution of products containing software code to exercise the BXJ instruction and enable the use of the ARM Jazelle architecture extension without [..] agreement from ARM is expressly forbidden", so maybe don't do that)
posted by autopilot at 6:26 AM on June 23


Does that really count, though?
If some software stops working for some people, I'd say it counts. Most users are way abstracted away from any details other than the simple binary choice (pun unintended) of whether something loads or not.
posted by edd at 6:34 AM on June 23


"One great example of such advantages came when Apple demonstrated that a console-class video game (Shadow of the Tomb Raider—granted, a game released in 2018) could run on an iPad Pro chipset in Mac OS. The demo wasn't perfect: the game ran at 1080p with fairly middling settings, but did so at what appeared to be at the very least a steady 30FPS, all while being run as an emulated x86 version of the game. That is: the game wasn't even natively compiled for ARM. But consider that Intel's most powerful laptop chipset GPU, found in the 10th generation Ice Lake series, is not capable of breaking single digit frame rates on this game at 1080p. Apple received some snark about this demo being lame, but it's only lame if you don't understand at all just how terrible modern integrated laptop GPUs are. Apple is using a—very likely fanless—system on a chip to run laps around the latest and greatest from Intel. There is nothing Intel makes that has a more powerful GPU. Oh, and all evidence suggests that it's doing this at under half the cost of using that Intel chip. That's not just impressive, it's absolutely nuts."

- via AndroidPolice
posted by fairmettle at 7:15 AM on June 23 [10 favorites]


Okay but when will they again make a Macbook with enough ports for people to actually use it to do work?
posted by hydropsyche at 8:31 AM on June 23 [3 favorites]


Almost certainly never, so you may as well plan to switch to PC at about time that's convenient for you.
posted by aramaic at 8:45 AM on June 23 [3 favorites]


Studio engineer here. The only thing Apple offers that I find attractive is the desktop OS and to a lesser extent, Logic Pro X, though I primarily use other DAWs. I'm very curious to see how long it takes to shift their Pro towers to Apple Silicon or if they'll abandon that market entirely.

If the software I need to do my job was available on another Unix-like system I would switch happily. People keep telling me that Win10 isn't so bad but I have have a dualboot system at home to play games on and I still fucking hate it. The more I use it, the more I hate it. It's ugly to look at. The auto-update feature means I never know when I'm going to get held up from doing my damned job and that alone means I'd never run it in the studio in front of clients. It pushes ads for Candy Crush on me?! For some reason text rendering is ugly as sin and harder to read than on the same system booted into macOS. The control panel is inscrutable. I could go on and on. . .

Guess I'll keep running Hackintosh until it's no longer feasible, or I'll buy a used 2020 cheesegrater in a few years when the after market prices have come down.
posted by Evstar at 9:02 AM on June 23 [4 favorites]


Does that really count, though?
If some software stops working for some people, I'd say it counts. Most users are way abstracted away from any details other than the simple binary choice (pun unintended) of whether something loads or not.
You're confusing a chip architecture transition with the end of support for 32-bit applications. That's also a transition, but it's not the same sort of thing at all. Sure, people outside the industry might not understand the difference, but that doesn't mean it's not huge.
Okay but when will they again make a Macbook with enough ports for people to actually use it to do work?
Weird. I've been working on a 2019 Macbook Pro for nearly a year without even knowing it wasn't possible.
People keep telling me that Win10 isn't so bad
As you note, those people are just plain wrong.

The thing is, MSFT could make Windows better. They choose not to. If MacOS ever runs me off -- and that seems unlikely -- I'll be on Linux. I can't imagine putting up with all the weird inconsistent and poorly-thought-out crap involved in Windows as a true daily driver.
posted by uberchet at 9:21 AM on June 23 [3 favorites]


Microsoft actually is making Windows 10 better, but mostly better for developers, not end-users. I'm a developer who likes to play PC games, so I doubt I'll be making the switch to arm chips and will probably go back into the Win10/WSL2 universe when I need my next hardware upgrade, much as Windows's UI is painful to me (and to all right-thinking people).
posted by whir at 9:46 AM on June 23


I would pay a few hundred dollars a year to maintain a macOS license on my own hardware but that's totally antithetical to the Apple philosophy.
posted by Evstar at 9:46 AM on June 23


My team did say "fuck it". Except we did that back in 2015 when AP was still a dirt lot.

The media reporting as if everyone works in one building is weird as hell since the Apple logo is practically on every piece of commercial real estate in Cupertino, and all over the rest of the Bay Area as well. These aren't secret locations.


My team wasn't even given the option to choose. We'll be in the hinterlands of Sunnyvale forever, apparently.
posted by hanov3r at 10:02 AM on June 23 [2 favorites]


That's how they maintain that startup feel, putting you on the third floor of some random brick building in an office that will be an architecture company after Apple ends its lease someday.
posted by rhizome at 10:24 AM on June 23 [2 favorites]


They also have development offices outside of California. I believe there are some Apple developers based in London (either in office space above the Regent Street Apple Store or in an office out in Wembley/Watford/somewhere similarly far from the Soho-Shoreditch tech belt).
posted by acb at 10:58 AM on June 23


If MacOS ever runs me off -- and that seems unlikely -- I'll be on Linux.

Biggest things holding me back is (in order):
- Adobe CS
- Touchdesigner
- iMessage.app
- Non-Apple hardware quality (In-person brick and mortar repair + AppleCare warranty)
- Resale value of Non-Apple hardware
- Trackpad
posted by wcfields at 11:48 AM on June 23


Oh, for me it's a shitload more than that.

I'd start with:

- Native MS Office
- Adobe Lightroom
- A proper mail client. I've yet to see one on Linux that works as well as Mail on the Mac
- A good Exchange option
- Alfred
- Time Machine
- Level of polish of Apple hardware
- Easy of integration with other Apple hardware
- A whole host of really well-designed and carefully-coded utilities or apps that I use all the time

So it'd take a LOT to get rid of me.
posted by uberchet at 12:26 PM on June 23 [1 favorite]


Taking in some of the discussion around the net, it sounds like the move from Mojave to Catalina is anticipated to be harder than the move from Catalina to Big Sur. The dropped 32 bit support is a known pain point for me (and one of the reasons why I will likely not move beyond Mojave anytime soon).

Can anyone tell me about other pain points in the Catalina move? I'd like to start getting an idea of what I need to reckon with and examining alternatives again.
posted by wildblueyonder at 12:26 PM on June 23


Can anyone tell me about other pain points in the Catalina move?

How much do you love iTunes?

Seriously, though, it was not as bad as I thought. It was supposedly buggy, but I waited a couple updates and it didn't seem more buggy. I didn't really notice any benefits, either. Some features, like using an iPad as a 2nd screen, require newer hardware than I am using.
posted by snofoam at 1:10 PM on June 23


snofoam: Honestly, iTunes is probably the biggest reason I _haven't_ upgraded to Catalina. I've heard bad things about the new Music app, especially for people like me with large libraries and who don't stream. Not sure I'll have a choice depending on what's on my new iMac (fingers crossed for Mojave).
posted by SansPoint at 1:18 PM on June 23 [1 favorite]


How much do you love iTunes?

On one hand, I have a hard time thinking of positive changes to iTunes in the last 12 years, and I moved into dread and loathing sometime after version 10.x. Not in love with it.

On the other hand, I have a large library and don't generally stream, which I assume means Apple actively hates me.
posted by wildblueyonder at 2:13 PM on June 23 [3 favorites]


(FYI, I know the loss of needed 32-bit tools is painful, but let me caution holdouts against deciding to sit at Mojave forever.

Stay there really only as long as it takes you to figure out a soft 64-bit landing plan. I get that it's disruptive and a pain in the ass and awkward, but you will make the inevitable migration harder if you let this ride for years before approaching the problem. Find a friendly nerd to help you, if the technical aspects are vexing for you.)
posted by uberchet at 2:17 PM on June 23


(Apparently I'm just in for parenthetical asides now, but: I eventually backed into streaming for a bunch of reasons, and holy cow it's great.

I still have a giant library spinning on a NAS here in the house, but the interface with Apple Music is so much better than iTunes or Sonos that it's really just archival now. By using AM, I never have to plug a cable in to sync music to my phone. By using AM, I can play just about anything I want via the Apple TV's Apple Music app. And most importantly, by using AM my wife can get access to anything she wants on HER phone without needing to cable up. This is all easily worth $12 a month.)
posted by uberchet at 2:19 PM on June 23 [1 favorite]


(uberchet: I am a total corner case because I'm particular about certain albums and versions thereof. Just as a simple example, I have Kraftwerk's albums in both English and German versions—and prefer to listen to them in German—but because I live in the US, no streaming service offers the German versions. So, I'd be stuck listening to the English versions. That's before you get into the nitty-gritty about which remasters of certain albums sound better than others, and whether those masters are on a streaming service, and then you get into the out of print and/or limited-release stuff...)
posted by SansPoint at 2:22 PM on June 23 [2 favorites]




RE: iTunes

Try Swinsian if you store your library locally. It's like going back in time to when iTunes was good. And it supports FLAC, Ogg and pretty much every other format that Apple chooses not to.
posted by Evstar at 3:39 PM on June 23 [3 favorites]


I feel like Apple is in a much easier position to make a transition like this than anyone else would be. For software, Apple’s own apps, Adobe and I guess Office (but really still?) are all most people need. It will be so easy to be ready whenever the machines are ready, it only requires three developers and one of them is Apple. Any other random stuff like my BitTorrent client and FileZilla are going to get updated and if not, who will notice if they are running a little slow. Also, I bet they come out of the gate with way better performance. They’re not doing this so they can have laptops 20% faster, they are doing this because they can crush what is possible with Intel’s lineup.
posted by snofoam at 3:50 PM on June 23


The thing is, it will be just like the 32 bit stuff. Most developers will recompile and resubmit. Some will drag their feet and only do it right when the compatibility period is about to end. Some companies will decide there isn't enough money in supporting their old stuff, and they will cut their customers off. Some companies will go out of business (or are out of business already), and their stuff will eventually no longer work, which sucks but that's how it is sometimes.

One thing I can guaren-fuckin-tee will happen: Some developers will spend years and years doing nothing, and then put out a bunch of "OH SHIT, NEVER UPGRADE YOUR MAC EVER!!!!" right at the last minute and then transition into blaming Apple for only giving them a decade* notice after their customers upgrade anyway.

*Or whatever the transition period is. It was like 11 years for the Catalina 32 bit stuff
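The "recompile and resubmit" path presumably means universal (fat) binaries again, as in the PowerPC-to-Intel transition: one file carries an x86_64 slice and an arm64 slice, and the loader picks the matching one. A minimal sketch of the fat-header bookkeeping in Python (the cpu_type constants are from Apple's <mach/machine.h>; the slice offsets, sizes, and alignment here are dummy values for illustration):

```python
# Build and parse a minimal Mach-O "fat" header in memory.
import struct

FAT_MAGIC = 0xCAFEBABE
CPU_TYPES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def fat_header(cputypes):
    """Pack a fat header listing one dummy slice per architecture."""
    blob = struct.pack(">II", FAT_MAGIC, len(cputypes))
    for i, ct in enumerate(cputypes):
        # Each fat_arch entry: cputype, cpusubtype, offset, size, align,
        # all big-endian regardless of the slices' own endianness.
        blob += struct.pack(">IIIII", ct, 0, 0x1000 * (i + 1), 0x100, 12)
    return blob

def architectures(blob):
    """Return the architecture names listed in a fat header."""
    magic, count = struct.unpack_from(">II", blob, 0)
    assert magic == FAT_MAGIC
    return [CPU_TYPES.get(struct.unpack_from(">I", blob, 8 + 20 * i)[0], "?")
            for i in range(count)]

print(architectures(fat_header([0x01000007, 0x0100000C])))
# ['x86_64', 'arm64']
```

In practice `lipo -archs` on macOS does this inspection for you; the point is just how little extra machinery a "fat" binary actually is.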
posted by sideshow at 4:03 PM on June 23 [7 favorites]


"Some will drag their feet"

The problem here is the assumption that feet should necessarily be moving at all, instead of able to camp at a stable location.

"It was like 11 years for the Catalina 32 bit stuff" sounds like a compelling argument... until you realize there's no particular reason it *needs* to be less than something on the order of a human lifetime. I don't have to re-learn how to play the guitar every decade, but the amount of time and money I have to sacrifice to keep the capabilities of a computing platform I buy into during that time is tiring.

If computing is meant to be a tool or an instrument rather than an experience, then longevity paired with capacity for virtuosity matters over novelty -- not to say that novelty is without value, but it needs to be compelling to sacrifice other investments.

I know how these conversations go so someone is going to say something like "what about economics and incentives -- do you expect other people to maintain 32 bit or any other environment for free for you indefinitely?" And the answer is that while there seems to be some indication that I could (Apple certainly seems interested in no-retail-cost yearly updates, Microsoft seems to manage incredible backwards compatibility by a model largely oriented around OEM licenses plus some margin of professional purchases, both using subsidies from complementary products).... no, I honestly *don't* expect free maintenance forever. I'd happily *pay* certain vendors to maintain 32 bit compatibility, or more generally the idea of a subscription fee to OS updates that maintain the contract of a certain environment. And I'd imagine that makes more economic sense than yearly lurches forward.

Virtualization? Decent solution when it's easy, legally blessed, and has a modest performance cost. What's the status of previous versions of macOS on those fronts?

Another arbitrary turn on the upgrade treadmill is certainly within my capabilities to manage, but I have many things to pay more attention to that are more interesting, and the altar that looks like it's dedicated to the God of Progress often turns out to be attended by the demon Churn.

Maybe the performance / battery life benefits will be worth the tradeoff this time. But once upon a time I heard it's the clock on the wall that matters. The values behind "just works" are supposedly that you spend more time doing what you want to do and no more time than necessary thinking about your computing platform.
posted by wildblueyonder at 4:46 PM on June 23


I get you, Sanspoint. I'm a music nerd, too. As it happens, Apple Music has a solution to the "rare tracks" problem. iTunes/Music uploads those unique/unusual tracks to Apple so you can access them via Apple Music. I've got some unusual recordings in my library that aren't available in Apple Music normally that I have access to via AM this way.

It may still be that Apple Music is unpalatable to you for lots of other reasons, but this PARTICULAR gotcha has been planned for.

WildBlueYonder:

"The assumption that feet should necessarily be moving at all"

That's the very nature of computing, and has been since ENIAC powered on.

"something on the order of a human lifetime."

We've been around and around before on MeFi about the cost of maintaining 32-bit libraries; lots of folks outside the industry just don't care to understand that these are real costs with no associated gains.

You can get salty about it if you want, but the same gripes existed at every transition in computing history. They're not more compelling now.

Moreover, the Mac's relative polish and stability vs. Windows is in large part BECAUSE Apple has eschewed the "backward compatibility at any cost" model.

Of course, you're free to use your current Mac at the current release of OS X until terminal hardware failure. No one will try to stop you. But you'll eventually be left behind this way.
posted by uberchet at 5:09 PM on June 23 [7 favorites]


"It was like 11 years for the Catalina 32 bit stuff" sounds like a compelling argument... until you realize there's no particular reason it *needs* to be less than something on the order of a human lifetime. I don't have to re-learn how to play the guitar every decade, but the amount of time and money I have to sacrifice to keep the capabilities of a computing platform I buy into during that time is tiring.

I get where you're coming from, and I sympathize, but when I was young, Z80s and 6502s were all the rage. Surely our computing devices are so much more capable now that the many transitions since then have been worth it?
posted by General Malaise at 5:12 PM on June 23 [4 favorites]


"It was like 11 years for the Catalina 32 bit stuff" sounds like a compelling argument... until you realize there's no particular reason it *needs* to be less than something on the order of a human lifetime.

Yet "spending millions upon millions of dollars per year to enable terminally lazy developers" has absolutely no compelling argument.

For simple apps, you can download the dev beta and have all your "migrate my app to Apple Silicon" work done this evening. Larger apps will need to plop down $500 to get the dev kit to actually verify their builds work on the new silicon. Or just wait until the new machines ship later this year or whatever, and go grab one from the Apple Store. Even the just kinda lazy developers will have years and years to get their act together to fix their stuff before the transition away from x86 is complete.

Microsoft's entire brand for the last quarter century has been "our stuff just fucking sucks" because they'd rather just ruin it for everybody rather than deprecate their terrible old stuff and just cut it out of their ecosystem. So, definitely not a path anyone else wants to follow.
posted by sideshow at 5:35 PM on June 23 [4 favorites]


It seems counterintuitive that there are commenters who seem to be both really into computer technology, but at the same time super resistant to progress. I don’t know what people are doing with their computers, but when I was a kid I could tell a turtle that was a triangle to make a circle on the screen and now I can do basically anything I can imagine.
posted by snofoam at 5:38 PM on June 23 [7 favorites]


Sometimes change for the sake of change isn't necessarily progress. Some of us who have been involved in this stuff for decades get really tired of the constant churn done just so some billionaires can have even more money.
posted by octothorpe at 6:24 PM on June 23 [10 favorites]


It seems counterintuitive that there are commenters who seem to be both really into computer technology, but at the same time super resistant to progress.

It's not counterintuitive. It's a sign that desktop computing has matured. Has basically been mature for quite some time now. Mature enough that even the enthusiasts think of their computers as tools. They just want to concentrate on their work (or games) and hope the OS facilitates that with minimal interference. It's not a generic objection to progress. It's an objection to progress that doesn't seem to do much of anything other than interfere with the work.

Yet "spending millions upon millions of dollars per year to enable terminally lazy developers" has absolutely no compelling argument.


It actually has a very compelling argument. It's about reducing the pain point for end users. Backwards compatibility means I can keep working even if some of my tools have been essentially abandoned. Means I personally save money if a five year old, or even 10 year old, version of commercial software still meets my needs. Means that it's not my problem if the developer was too lazy to recompile to 64 bit. Means I'm not going to lose access to a favorite game just because the game hasn't been patched in a few years.

That's what I particularly appreciate about Microsoft. While Windows 10 is far from perfect, I at least generally don't have too much worry about any older software in my library. In many respects, it's one of the most user friendly things an OS vendor can do for their customers. This does wonders for retention and is a very compelling reason for spending those millions.
posted by Teegeeack AV Club Secretary at 6:29 PM on June 23 [2 favorites]


I'm still using the surprisingly excellent Microsoft Office Suite For Mac from 2008. Word for Mac 2008 is one of the most solid builds of that particular thing I have ever used. I use it for things a bit, but then I want to do a Super Special Thing on it, and discovering the path to create that is all entirely intuitive, built on decades of Mac Standards when it comes to how menus and submenus and workflow works.

That will most definitely go away once I upgrade. So... What is Microsoft Word like to use in 2020? Is it all subscription-based now? Pay-Per-Use, maybe? Or is there a piece of Mac software I can download and use locally? And is ANY of it any good?
posted by hippybear at 6:53 PM on June 23


I learnt BASIC on a ZX81.
I program for a living, on a newish MacBook.
I have seen precisely zero things that make me want to upgrade even to Catalina.
It's not because I'm a curmudgeon or Luddite. It's because I work for a living, and I don't want to spend a few days fixing things that work fine right now.
posted by signal at 7:44 PM on June 23 [5 favorites]


Technical debt isn't free, and the cost is *not* simply paying some dev to sit watch and make sure it compiles for the next fifteen years.

Technical debt piles on top of itself like a giant mutant game of Jenga, where you've set fire to the bottom of the tower before starting play, and the various bricks are soaked in different chemicals. Some are soaked in kerosene, some are soaked in borax, some are steaming with LOX. Who knows what'll be next? Will it take off my hand, or just burn the house down?

Decisions that made sense as a workaround ten years ago will suddenly stop you from passing test. Oversights that didn't seem to matter now stop you from pursuing fabulously profitable alternatives. Europe suddenly decides to enforce Law XYZ? Well, hurrah, the stack makes it mathematically impossible to obey without rebuilding literally everything (this is from direct personal experience btw), and once we've rebuilt that stack at hideous effort your use case is now mathematically impossible without rebuilding everything my code depends upon (including software from innumerable other vendors, some of which are now extinct), which would require several hundred devs working for five years. Literally. We costed it out. Yay! Embrace the suck!

Some of these systems are among the most complex creations yet built by humanity, easily exceeding the most sophisticated physical structures by several orders of magnitude. We are not talking framework-of-the-week apps and websites here.

So.

Not gonna happen.

Gonna ditch that library your software depends on, instead.

I mean, OK, yes, if we charged the responsible users (that's you) a few hundred thousand dollars each, then perhaps we could swing it or a reasonable facsimile at an expense that zeroes out in the end after a couple of years work.

Meanwhile, back in the real world, if we lose a couple edge-case-lords in exchange for meeting European legal requirements in a timely fashion, then we're fine with that. In fact, just keeping track of the administrative and legal requirements would cost more than your business is worth, so in fact you're not even worth thinking about in the first place, never mind actually trying to accommodate.

...but there is an alternative: it's called the mainframe. Built over a span of decades to ensure things would continue to run as long as possible. They're quite expensive, but marvelous in their niche. Genuinely remarkable. Hope you enjoy JCL, and you'll need a few hundred thousand dollars to get started.
posted by aramaic at 7:57 PM on June 23 [5 favorites]


So apparently the idea that there's value in long-running commitment to platforms so that software created for them can continue to run even as the larger environment grows, and maybe people could even be paid to do this... that's controversial? Or must mean that I'm a computing outsider or something (heh).

General Malaise mentioned z80s and the 6502 -- OK, great. Let's talk about the 6502 sibling the Ricoh 2A03. Since we're all concerned with the history of computing and presumably familiar with its late 20th century period, we all know that was part of the guts of the NES, right? We can call machines made around these CPUs turtle-drawing chopped liver, but the fact is that they were inside hardware platforms that are *still* cultural touchstones. And people still want to engage with titles made for these platforms. So what do we do? Do we have people rewrite all the NES games from scratch for a new platform? If the original developers don't want to reconstitute their teams to do so -- if they considered the software done and shipped -- do we chew them out for being too lazy to update their product (and comment on how they've had decades to port to 2000s platforms now, why can't they get with it)? One certainly can, but what's more common is that people re-create the environment NES games run in in the context of some other environment.
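To make the "re-create the environment" idea concrete: an emulator is, at heart, a fetch-decode-execute loop that honors the original platform's contract. Here's a deliberately tiny sketch in Python implementing three real 6502 opcodes (LDA #imm, ADC #imm, BRK), with the semantics simplified (no carry or status flags); an actual NES emulator additionally models the full 2A03 instruction set, the PPU, the APU, and cycle-accurate timing, but the principle is the same.

```python
class ToyCPU:
    """A toy fetch-decode-execute loop for a stripped-down 6502-like CPU."""

    def __init__(self, program):
        self.mem = bytearray(256)
        self.mem[: len(program)] = program
        self.pc = 0       # program counter
        self.a = 0        # accumulator
        self.halted = False

    def step(self):
        op = self.mem[self.pc]            # fetch
        self.pc += 1
        if op == 0xA9:                    # LDA #imm: load immediate into A
            self.a = self.mem[self.pc]
            self.pc += 1
        elif op == 0x69:                  # ADC #imm (carry flag ignored here)
            self.a = (self.a + self.mem[self.pc]) & 0xFF
            self.pc += 1
        elif op == 0x00:                  # BRK: halt the toy machine
            self.halted = True
        else:
            raise ValueError(f"unimplemented opcode {op:#x}")

    def run(self):
        while not self.halted:
            self.step()

# LDA #2; ADC #3; BRK
cpu = ToyCPU(bytes([0xA9, 0x02, 0x69, 0x03, 0x00]))
cpu.run()
print(cpu.a)  # → 5
```

The point is that nothing here depends on the host's ISA: the same loop runs on x86, ARM, or whatever comes next, which is exactly why 40-year-old NES software still runs today.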

My guess is that most of you arguing with me have even played an NES game under emulation.

This isn't contrary to "the very nature of computing" -- it's the other way around. The nature of computing is that there's agreement about the logic of a system, from the state transitions and I/O of ICs, to what a piece of machine code does when it hits an instruction register, to the primitives of a higher-level language, to the definition of a standard library, to the system calls of an OS. Everything in computing is a contract about what happens.

This isn't "spending millions upon millions of dollars per year to enable terminally lazy developers." It's maintaining an environment for software that was *done* and did its job. And if a developer is willing to undertake it (and better yet, if people are willing to pay for it) it's way more economically efficient to pay an internal team to maintain an existing common environment than to pay (or goad) tens of thousands of ISVs (some of whom may have moved on but left nevertheless functional tools) to adapt to a new environment.

This is not about saying nothing new should ever come into existence. This is not about saying no change is good.

This is about saying that the best progress preserves existing capabilities and investments with as narrowly distributed overhead as possible.

If overhead is widely distributed, that's a warning sign that something isn't right. Or at least that whatever is being optimized for, it may well not be your productivity.
posted by wildblueyonder at 8:05 PM on June 23 [5 favorites]


Objecting to corporate, capitalist, churny planned obsolescence should not be construed as anti-technological intransigence.
posted by polymodus at 8:11 PM on June 23 [3 favorites]


> It seems counterintuitive that there are commenters who seem to be both really into computer technology, but at the same time super resistant to progress. I don’t know what people are doing with their computers, but when I was a kid I could tell a turtle that was a triangle to make a circle on the screen and now I can do basically anything I can imagine.

It's more than a bit misleading -- unintentionally so, I'm guessing -- to cite the two endpoints without noting all of the detours, retracing of steps, and dead ends along the way. I don't think anyone here is arguing that we should throw away modern OSes, applications, and programming languages, and I'm cautiously optimistic that the purported advantages of moving to ARM will be realized, but the cost/benefit accounting is very incomplete at this point, so a bit more focus on the costs and a skeptical eye toward the benefits shouldn't lead to accusations of being against technological progress.
posted by tonycpsu at 8:12 PM on June 23 [4 favorites]


If we temporarily forget about supporting old binary-blob software, I am optimistic about the situation. The smartphone revolution caused many important code bases to switch from x86-only to cross-platform. Apple-centric developers have now endured two ISA switches and hopefully are writing their software with a minimal amount of assembly. In the open source world, ISA dependency is frowned upon.

For actively-maintained projects, I think this transition will be smooth.
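The ISA-independence point can be illustrated with a minimal sketch: portable code queries the runtime for the architecture it's on rather than baking in an assumption, so the same source runs unchanged on x86-64 and arm64.

```python
import platform
import struct

# Ask the environment rather than hard-coding an ISA.
arch = platform.machine()         # e.g. 'x86_64' on Intel, 'arm64' on Apple Silicon
bits = struct.calcsize("P") * 8   # pointer width in bits: 64 on any modern Mac

# Code written this way just recompiles (or, for interpreted code, just runs)
# across the transition; only inline assembly and hand-tuned intrinsics
# need per-ISA attention.
print(f"running on {arch} ({bits}-bit)")
```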
posted by scose at 8:36 PM on June 23 [4 favorites]


Basically every car has enough space somewhere on the dash to include a cassette player, and the additional cost to include one is basically nothing compared to the cost of a car.

Also, isn’t this why Windows has sucked for so long? Like even if maintaining all the old stuff for compatibility doesn’t actively interfere with current stuff, the whole thing just becomes unmanageable.
posted by snofoam at 3:46 AM on June 24 [4 favorites]


Also, isn’t this why Windows has sucked for so long?

Windows hasn’t sucked in decades. Win XP was fine, Win 7 great, and Win 10 is excellent. It is fast, responsive, and never crashes. I can’t remember the last time I saw a blue screen of death. Probably the last time I had a Win 98 or Win 2000 computer.

We have both Windows and Apple computers in my house, and I find the Windows computers much less frustrating to do all sorts of stuff with. The Apple computers are great appliances for surfing the internet or using approved software, just don’t try anything crazy. On my Windows computer I run modern software, legacy software, and virtual machines, and support my collection of '80s-era hobby computers seamlessly and easily. I do all my graphics work on the Windows desktop with its many ports and vast expandability.

The macs are for email, web surfing, and word processing.
posted by fimbulvetr at 5:11 AM on June 24 [2 favorites]


Moving to ARM is great, fuck intel. But anything that anchors OS to hardware or vice-versa gets a bit of a thumbs-down from me, and this may have ramifications there.

As others have remarked, the Intel transition made cross-platform work a breeze. My daily driver MBpro is almost 8 years old and runs linux, windows, and osx right next to each other and that was easy because the architecture was standardized and the machine was user-serviceable (SSD over rust, ram upgrade, battery replacement).

Linux and Windows run on ARM, but there will be kinks with a custom Apple chip.

And the end of user-serviceable Macs has been a big part of my reluctance to upgrade this machine; nothing I saw in the announcement seemed to indicate any relenting in that front.

I'm pretty OS-agnostic - I use them for different things - but I would probably ditch OSX entirely at this point if it weren't for musical collaboration. As it is, Apple stopped making hardware that fit my particular needs, so I pulled myself off Apple hardware. There are dozens of us!

Luckily for the hackintosh community and people with Mac towers, this transition's going to take a while. We'll see how well Big Sur runs on older machines - it's *supported* on most 2013-or-newer models, according to Apple.
posted by aspersioncast at 6:31 AM on June 24 [1 favorite]


Windows hasn’t sucked in decades. Win XP was fine, Win 7 great, and Win 10 is excellent. It is fast, responsive, and never crashes. I can’t remember the last time I saw a blue screen of death. Probably the last time I had a Win 98 or Win 2000 computer.

I jump back and forth between my company Macbook and my personal homebuilt PC running Windows 10 and they're both mostly fine. I have some nits on each of them but for the most part they're both stable and do what I want them to do. For the most part it's about the applications and they're all basically the same on both. Chrome is Chrome, Outlook is Outlook, Photoshop is Photoshop. I find that the MacBook crashes a little more often than the Windows machine but that's still fairly rare.
posted by octothorpe at 6:44 AM on June 24 [1 favorite]


It's an objection to progress that doesn't seem to do much of anything other than interfere with the work.
So don't upgrade. Problem solved.
Means I personally save money if a five year old, or even 10 year old, version of commercial software still meets my needs.
So don't upgrade. Problem solved.
What is Microsoft Word like to use in 2020?
Drastically better, IMO, than 2008. Faster, more stable, and generally a smoother experience. But yeah, I think it's subscription.

Outside a work commitment, I probably wouldn't bother given that there are other options now. (Well, except that it comes with Excel, which is truly great, so maybe...)
Technical debt isn't free, and the cost is *not* simply paying some dev to sit watch and make sure it compiles for the next fifteen years.
PREACH.
Also, isn’t this why Windows has sucked for so long? Like even if maintaining all the old stuff for compatibility doesn’t actively interfere with current stuff, the whole thing just becomes unmanageable.
Nail. Head. Bam.
Windows hasn’t sucked in decades. [...] and never crashes
I LOL'd. Literally.

I mean, sure, if you don't ever ask it to do much, you probably don't notice.

But I sell Windows software for a *living* and can't abide the utter bullshit way Windows lumbers along. There's no consistency, and the whole thing is unstable by design. If you install and remove software often, your C:\Windows folder grows inexorably over time. This is on purpose, even though it eventually forces a machine wipe and rebuild because your Windows folder has grown to like 40GB-- something that I've literally never had to do under OS X in nearly 20 years.

Apple also isn't trying to shove ads into _my_ computer, or push bullshit like Candy Crush, or whatever. The Mac also doesn't suddenly reboot on me for "updates" of nebulous transparency and stability.

Windows is awful. It has always been awful. And part of the reason is MSFT's refusal to ever break with the past.

Also, please stop promulgating the idea that Macs can only run "approved software." It isn't true, even remotely.
posted by uberchet at 7:29 AM on June 24 [7 favorites]


I dunno. I can only speak from my own experience. I do a LOT with my windows machine. I have software on it going back to the 90s, and install and re-install all the time.

it eventually forces a machine wipe and rebuild because your Windows folder has grown to like 40GB-- something that I've literally never had to do under OS X in nearly 20 years.

And I haven't had to wipe a drive and re-install since Win 2000. So... 20 years of not having to do that. I didn't even bother doing a clean install of Win 10 on my desktop, just upgraded from Win 7 -- something I have never done on a PC in over 30 years -- and it went flawlessly. Computer boots in seconds, only 2 very old and obscure programs didn't survive the transition and had to be re-installed.

My experience with Windows computers goes back to Win 3.1, and with Apple computers to Apple IIs. I currently have a wide range of Apple machines (Apple IIs, 68K Macs, PowerPCs, and Intels) set up and running. The only pre-Win 10 PC computer I keep around is a 486 DOS 6.22/Win 3.1 machine, because the Win 10 computer can run pretty much everything. I am neither a Windows fanboy/hater nor an Apple fanboy/hater. All I can say is that the modern Apples are great for "just works, just don't step too far outside of the box", while the modern Windows computers do EVERYTHING. I have a parallel-port dot matrix printer hooked up to my Win 10 desktop just for the hell of it (and it was plug-and-play). I use the same Model PS/2 keyboard that I've had since 1986. And if I want to write floppy disks for my 68K Macs? USB floppy drive on the Win 10 computer.

So yes, Microsoft does keep a lot of backwards compatibility, but so what, it works, and it absolutely amazes the hell out of me that I can run and use the newest hardware and software right next to stuff from 35 years ago without a problem.

And yes, modern Apple computers generally "just work".

Apple also isn't trying to shove ads into _my_ computer

What ads? I just right-clicked and unpinned those widgety notification things the first time I saw them. Took seconds. It wasn't difficult and didn't require any advanced computer skillz. I haven't seen an ad since.

Maybe it is because I have the "pro" version, but I have never had an issue with sudden reboots in Win 10.

There is a lot of reflexive hate for both Apple and Windows, but both techs are mature, do what they are supposed to do, and almost always without trouble. Compared to computers of 20, 30, and 40 years ago (all of which I have running in my computer room) modern machines are MAGIC. They just work. All the complaints are nit-picks and edge cases.

Anyways, on topic, I don't have a problem with Apple going ARM. I'm certain they will be just as fantastic (and expensive) as the current machines. I'll buy one to replace the MacBook Pro when it kicks the bucket.
posted by fimbulvetr at 8:27 AM on June 24 [2 favorites]


It's 2020 and we're still having MacOS vs Windows arguments? lol. Reality check: they both work pretty well.

I wonder if Apple's shift will create any momentum for Windows on ARM? With very power efficient laptops, maybe. Windows has worked on ARM for a long time and you have your choice of ARM hardware but my impression is it's always been second fiddle. Part of the problem is you don't have anything like Apple's amazing hardware design project. It's not the ARM CPU that makes Apple special, it's the whole package that's going in to A14.
posted by Nelson at 9:02 AM on June 24 [6 favorites]


And I haven't had to wipe a drive and re-install since Win 2000.
Then you're not adding and removing software often.

Most corporate Windows installations require rebuilds with some frequency. This has been true since at least Win95 (we didn't do it with 3.1 probably because the machines themselves didn't last long enough vs. Moore's Law to experience the problem).

Corporations reimage *constantly* because of this. We run a lab of VMs for testing with image-level backups because of this problem. It's a known issue, and an absolutely predictable outcome of a design choice they made. Microsoft Premier Support will explain it to you if you have a contract with them. It's a nearly universal problem with machines that see heavy use. Weird shit starts happening (even if your C:\Windows isn't 40GB), and MSFT has no idea how to fix it short of a wipe and reinstall. Fun!

If you're not seeing it, you're staying on a happy path.
I have a parallel-port dot matrix printer hooked up to my Win 10 desktop just for the hell of it (and it was plug-and-play).
That's a GREAT example of the kind of crap I absolutely do not want in a modern OS. There's no good reason for it to be there.
So yes, Microsoft does keep a lot of backwards compatibility, but so what, it works,
Well, some of the time it does.
and it absolutely amazes the hell out of me that I can run and use the newest hardware and software right next to stuff from 35 years ago without a problem.
As has been pointed out exhaustively here, this fetishistic attachment to backward compatibility comes with a material cost in stability and support that just isn't worth it for most people.

One reason I choose Apple is because it's fairly free of bad design decisions left over from the last century, like the management of SxS or the inscrutable Registry.
What ads? I just right-clicked and unpinned those widgety notification things the first time I saw them. Took seconds. It wasn't difficult and didn't require any advanced computer skillz. I haven't seen an ad since.
"Oh I can just turn it off!" doesn't excuse that extremely gross behavior in the first place. Or the telemetry. Or the intrusive and opaque patch behavior. Or the inconsistent administrative design. Or the MSFT addiction to calling things by the same name when they're materially different (hey, try to use an AzureAD login for a service account sometime!)

And since you didn't address it, let me say this again:

Please stop promulgating the idea that Macs can only run "approved software." It isn't true, even remotely. It's the current version of LOL MACS DONT HAVE RIGHT CLICK; every Mac I've ever had supported multibutton mice.
posted by uberchet at 9:12 AM on June 24 [8 favorites]


It's 2020 and we're still having MacOS vs Windows arguments?

Yeah seriously can we consider that ax ground to a nubbin at this point?

I too am very curious about what this means for ARM in general, if only because Redmond tends to keep a bit of an eye on what Cupertino has coming down the pike.

And the proliferation of hobby boards and such has meant a shitload of ARM-optimized linux kernels.
posted by aspersioncast at 9:27 AM on June 24 [1 favorite]


Geeze. Calm down dude. “Macs don’t only run approved software” (he wrote while using his iPad). Happy now?

And yes, I do install and uninstall software all the time. Not just on rare occasions. I don’t know how to quantify this to make you happy, but hey, I don’t “need to be right in the internet.”

If you don’t like backwards compatibility, good for you. If you want an excellent computing appliance, get a Mac. I use Macs, Apple TV, iPods, iPads, all that wonderful Apple crap. I also have Linux computers, Chromebooks, and my network even has my first computer, a 1981 TI-99/4A, connected to it via a Raspberry Pi gateway.

I stand by my statement that modern computers are fantastic, and for the most part just work, arguing about what mega Corp you buy them from is silly.
posted by fimbulvetr at 9:31 AM on June 24 [1 favorite]


As has been pointed out exhaustively here, this fetishistic attachment to backward compatibility comes with a material cost in stability and support that just isn't worth it for most people.

I guess for me, if MS dropped backwards compatibility it would be a deal-breaker and make my daily computer experience much worse. But I know I am not a typical computer user. I am the sort of user that has no problem getting out the soldering iron to do repairs or hacks and is quite familiar with regedit and the like -- which I blissfully never *need* to access anymore with Win 10, I've only done it to hack stuff.

Most people I know only care if their computing device can access the internet and use the Google Docs suite. For them, Chromebooks are "good enough". They don't give a crap what the OS is, and they don't keep machines long enough to worry about any long-term problems.
posted by fimbulvetr at 10:04 AM on June 24


As has been pointed out exhaustively here, this fetishistic attachment to backward compatibility comes with a material cost in stability and support that just isn't worth it for most people.

Half of my use of my Windows box is for gaming and I like that I can still play Dark Forces II or System Shock II on it. Losing that back catalog would make me very sad.
posted by octothorpe at 10:09 AM on June 24 [2 favorites]


I'm fascinated to see what Apple produces for their desktop-level processor. As they're fond of pointing out, their current offerings are on par with Intel already, and those are designed under very tight power and thermal restrictions. What are they going to be able to produce with those restrictions largely lifted?
posted by Eddie Mars at 10:31 AM on June 24 [3 favorites]


What are they going to be able to produce with those restrictions largely lifted?

They aren’t switching so they can make cheaper computers. I think the capabilities are going to be ridiculous. And awesome.

I haven’t had to use a Windows computer in ten years, so they may be a lot better than what I remember. My only recent experience is watching antivirus, etc. notifications pop up when people are trying to give presentations at conferences.
posted by snofoam at 10:51 AM on June 24 [2 favorites]


In sympathy with the folks who are keeping the old ways alive, having so many things rely on the cloud is problematic for me because my town has not been rewired for data since the last hurricane. I have to use a cellular hotspot at home and even the biggest plan I can get is a fraction of what I would need to do everything normally (cloud backups, streaming, etc.).
posted by snofoam at 11:25 AM on June 24 [6 favorites]


What are they going to be able to produce with those restrictions largely lifted?

Hard to say until an actual product comes online, but a 24-hour-per-charge, work-ready laptop that weighs 2-3 pounds would be pretty groundbreaking.

Their flagship phone gets 20 hrs on a charge playing back video — usually pretty intensive work, though the decoding end of H.264 is well-optimized via hardware.

Even a skinny Apple laptop would have a considerably larger enclosure to hold a larger battery, using a logic board sized for a phone or iPad.

One of the main power draws then becomes the screen, and Apple is working on a low-power LTPO backplane for OLED displays in mobile devices. That kind of technology might work its way into a laptop product, with the according energy savings.
posted by They sucked his brains out! at 2:42 PM on June 24 [5 favorites]


I'm all for more experimentation with ISAs and processors, but my skepticism here is that this feels like it will be accompanied by moves to ratchet up the level of control over what users and developers can do, e.g., by requiring code signing and only allowing distribution through the app store rather than just aggressively putting roadblocks in the way of running code without Apple's permission.
posted by Pyry at 3:25 PM on June 24


People keep predicting Apple will lock MacOS down to app-store-only like iOS, and it keeps not happening. I would be very, very surprised if that ever happened.
moves to ratchet up the level of control over what users and developers can do, e.g., by requiring code signing and only allowing distribution through the app store rather than just aggressively putting roadblocks in the way of running code without Apple's permission.
The moves Apple has made regarding code signing are generally good ones. You SHOULD need to adjust defaults in order to run unsigned code. It's 2020; the digital world is dangerous. 95% of users have no business doing that.

It's harder to add a printer than it is to allow unsigned or non-app-store binaries, no kidding.
posted by uberchet at 3:38 PM on June 24 [6 favorites]


I don't know about a 20 hour laptop, but I'd be delighted if an ARM Mac could match the current iPads Pro, which get a true 10 hour battery life. In my experience, that's long enough for an entire work day, and with USB-C Power Delivery you can recharge most of the battery back in under an hour.
posted by adrianhon at 3:39 PM on June 24 [1 favorite]


My partner has one of the 12” MacBooks and it is ridiculously light and works just fine for her. Compared to that, my 13” MBP is an anvil (that seemed very light when I first got it). I would love something like a 14” MBP that is way more powerful than what I have now, has twice the battery life and weighs about 1/3 less. It seems possible.
posted by snofoam at 4:30 PM on June 24 [1 favorite]


Welp, my new iMac has Catalina pre-installed, so I'm stuck with the Music app. Not thrilled, but I've managed to mostly wrangle it into submission. Alas, I can no longer use Launchbar to control music playback.
posted by SansPoint at 12:07 PM on June 25


(Consider alfred or quicksilver?)
posted by uberchet at 3:26 PM on June 25


>It's harder to add a printer than it is to allow unsigned or non-app-store binaries, no kidding.

Well, yea, but isn't the first step of adding a printer installing some unsigned binaries for drivers?
posted by pwnguin at 3:48 PM on June 25


I dunno where YOU get YOUR printers, but...
posted by uberchet at 3:59 PM on June 25 [1 favorite]


The french kiss, the mwwAHH of death is when Apple's new slab o' silicon emulates Intel at close enough to line speed that emulator bois can't even notice. After all, ever since the Pentium Pro, Intel's own CISC chips have translated x86 instructions into RISC-like micro-ops internally. Sauce for the goose is sauce for the gander.
posted by Slap*Happy at 8:25 PM on June 27 [1 favorite]


> I dunno where YOU get YOUR printers, but...

Well I just use the ones at work. They're networked, and amusingly enough, run Android.
posted by pwnguin at 11:00 PM on June 27 [1 favorite]


Initial tests with primitive mule hardware stuffed into a Mac Mini indicate that pre-beta Rosetta 2 is only a generation or two behind modern low-spec MacBook performance on Intel... and who are we kidding? Devs and QA who rely on x86 compatibility for Windows or Unix/Linux or AWS development have been nursing along their MacBooks with the good keyboard for years and years.

Apple takes their damn sweet time with desktop hardware revolutions, mostly because Motorola, Motorola/IBM and Intel can't keep up or overshoot. Apple has been buying silicon innovators, way back to when Jobs was in charge (why did they buy Raycer?) There is a new silicon regime, CPU and GPU and SOC-glue orchestrated, Apple style, with clever acquisitions and in-house talent.
posted by Slap*Happy at 5:18 PM on July 1


Yeah, as one person described it, the benchmarks we're seeing are from "when they aren't even trying," given that we're seeing benchmarks running through an x86 compatibility layer, on a chip designed for a 2018 iPad (which is itself significantly slower than the chip running all the current-generation iPhones). Imagine what they could do with a modern, purpose-built CPU.
posted by DoctorFedora at 8:15 PM on July 1 [2 favorites]




This thread has been archived and is closed to new comments