I wonder if anyone else here is old enough to remember the "I'm a Mac", "And I'm a PC" ads.

There was one that was about all the annoying security pop-ups Windows (used to?) have. (FWIW, it starts here: https://youtu.be/qfv6Ah_MVJU?t=230 .)

Lately I've gotten so many of these popups on Mac that it both annoys and amuses the hell out of me. "Die a hero or live long enough to see yourself become the villain", I guess.

But, man, Apple hardware still rocks. Can't deny that.

Oh, that smell of molten keyboard plastic, those yellow spots burned into a display with its own heat exhaust, those laser-machined loudspeaker holes next to the keyboard, all filled with grime! How I miss that time on a Macbook, with all the chords you have to press whenever you need a Home or End button to edit the line! Not to mention the power button right next to backspace.

It's so rewarding when its charger dies in a month, and you feel superior to your colleague, whose vintage 6-month-old charging cable with none of that extraneous rubber next to the connector catches fire along with your office. What a time to be alive!

The best part is the motherboard built in a way that makes it fail from moisture within a couple of years, with all the uncoated copper, with 0.1mm-pitch debugging ports that short-circuit from a single hair, and a whole Louis Rossmann YouTube channel's worth of other hardware features meant to remind you to buy a new Apple laptop every couple of years. How else would you be forced to replace the whole laptop, if not for all the walls around repair manuals and parts? You just absolutely have to love the fact that even transplanting chips from other laptops won't help, due to all the overlapping hardware DRMs.

I'll go plug the cable into the bottom of my wireless Apple mouse, and remind myself of all the best times I had with Apple's hardware. It really rocks.

> a whole Louis Rossmann YouTube channel's worth of other hardware features meant to remind you to buy a new Apple laptop every couple of years

Apple have a couple of extra mechanisms in place to remind us to buy a new device:

- On iOS, updates are so large they don't fit on the device. This is because they purposely put in a small drive. It serves a second purpose: people will buy Apple's cloud storage because nothing fits locally.

- No longer providing updates to the device after just a few years when it's still perfectly fine. Then forcing the app developer ecosystem to target the newer iOS version and not support the older versions. But it's not planned obsolescence when it's Apple, because they're the good guys, right? They did that 1984 ad. Right guys?

> No longer providing updates to the device after just a few years when it's still perfectly fine.

This is a weird one to complain about, because Apple leads the industry in supporting older devices with software updates. iOS 26 supports devices back to 2019. And just last month they released a security update for the iPhone 6S, a model released a full decade ago.

The oldest Samsung flagship you can get Android 16 for is their 2023 model (Galaxy S23), and for Google the oldest is the 2021 model (Pixel 6).

We’re moving away from hardware and into software and longevity in this discussion, but wrt “apple leads the industry in supporting older devices with software updates” I would point out that Red Hat is probably more of a beacon / industry leader here, as the main promise of RHEL is 10 years of support and updates. But again, we don’t ship hardware, so I see the narrower sense that you’re making, but I'd still push back on the idea that giant companies cannot keep complicated legacy code bases secure and functional, in most cases for about 2x longer than what Apple has done.

The main problem, and not just with Apple, is that as phone tech gets standardized and longer-lasting, the software support cycles have not gotten longer.

It is abysmal that Android phone makers still need to customize the OS so much for their hardware. Apple has no incentive for longer support cycles if Android does even worse on it.

It has always been like that since CP/M and commercial UNIX days.

Vertical integration means everyone sells a product, a brand, a whole ecosystem experience.

If all OEMs sold the same CP/M, UNIX, MSX, MS-DOS, or Windows software stack, on what is basically the same hardware with a different name glued on the case, they wouldn't get any brand recognition, aka product differentiation.

Thus OEM-specific customisations get added: back in the day, bundled software packages were part of the deal; nowadays they come preinstalled on the OS image, and so on.

"You cheated on me last night!"

"This is a weird one to complain about, look at Donnie, he cheated on his girlfriend 3 times last month!"

i don't get it, how long do you think is reasonable?

I tend to look at technology prices in terms of cost per unit time of useful life.

If Apple continues to supply updates for six-year-old phones, iPhone 17 prices range from $11/month (base model iPhone 17) to $28/month (iPhone 17 Pro Max w/2TB storage), meaning it's only about 20% more expensive to store data on a RAID 10 array of iPhone 17 Pro Maxes running current iOS versions than on standard-tier S3 (not a relevant comparison, obviously, but it amuses me).
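
A rough sketch of that math, assuming the published prices of roughly $799 for the base model and roughly $1,999 for the 2TB Pro Max, spread over 72 months (six years) of updates:

    echo "scale=2; 799 / 72" | bc    # 11.09 -> about $11/month
    echo "scale=2; 1999 / 72" | bc   # 27.76 -> about $28/month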

So I don't know what's reasonable, but Apple's policies certainly appear to be.

I'm still salty that Apple no longer offer battery service on my OG Apple Watch, however, so reason has its limits.

Suppose you always want to be running the latest iOS release, but you want to replace your phone as infrequently as possible. You would "only" have to have purchased 4 iPhones since 2007:

    | Model     | Launch date        | Obsoleted by | Price 
    |-----------|--------------------|--------------|------
    | iPhone    | June 29, 2007      | iOS 4        | $399 (*price cut)
    | iPhone 4  | June 24, 2010      | iOS 8        | $599
    | iPhone 6  | September 19, 2014 | iOS 13       | $649
    | iPhone 11 | September 13, 2019 | -            | $699
Adjusted for inflation, the total for these phones is $3,287 excluding carrier contracts. Assuming the iPhone 11 will be obsoleted by iOS 27 in September 2026, this costs you about $14.29/mo.
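
For anyone checking the arithmetic, that monthly figure is just the inflation-adjusted total spread over the roughly 230 months between the original iPhone's launch (June 2007) and September 2026:

    echo "scale=2; 3287 / 230" | bc   # 14.29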

I was a long time Android user - but I realised I was getting through 2 or more phones in the time my wife had one. They'd either become obsolete or just die. I reluctantly bought an iPhone on this basis - it's actually going to work out cheaper if I get 5 or 6 years out of it.

However, I find the iPhone keyboard so bad and the settings concept so muddled that I'm going to return to Android when this experiment is over. Probably not for another 4 years though!

You know you can sell and replace your phone if you don't like it. Recent Pixels have 7 years of support and they don't die. That's what I'd recommend you get instead. You can even trade in your iPhone for up to $700 when you buy a Pixel. You really don't need to force yourself to use a phone you don't like, let alone for that long.

If we're talking anecdotes, my wife changes her iPhone every 4 years because it gets worse and worse. Daughter does the same. I change my Galaxy every 4 years because it gets worse and worse as well. Not sure how some people can say their <insert beloved brand> holds forever, unless they don't really use it of course. No brand really keeps up with the requirements, unless all you do is make phone calls - which is why my dad still has a Sony Ericsson.

Technically not new updates, but if you hook up a PowerPC Mac with 10.4 Tiger on it, you can still get it updated to the last version released, 10.4.11.

I demoed that exact feature (though on 10.5) not so long ago and people didn’t believe me…!

The part that really gets me is that the price per GB to go from a 256 to a 512 GB iPhone is $2.54 (since the next storage option up costs $200 total). Two and a half dollars!!! A 512 GB micro SD would run you $0.10/GB. They have been charging 25x the market rate for storage on a device with no expandable storage at all for years. Baffling that they aren't called on it more. It should be criminal.

I had the 2019 cheesegrater Mac Pro. 7TB (going from 1 to 8) would cost me $3,000.

So I bought a 4x M.2 PCIe card and four 2TB Samsung Pro SSDs for $1,100. And as a result got 6.5GB/s versus the onboard 1TB drive's 5GB/s.

Same with memory. 160GB (32 to 192GB) from Apple was also around $3K. OWC sold the exact same memory chips, manufacturer, spec, speed, for $1,000 for 192GB.

I recently found my iPad mini 2 (released in 2013) that had been boxed up when I moved a few years ago. After charging up the battery and booting it up, I checked for system updates. The latest system available for it was iOS 12.5.7, released in 2023. It loaded fine, and I’ve been using the mini as an ereader ever since – the screen is fine, and wifi works.

A Macbook is the only Apple device I have in my entire array of computers and computer-related stuff, so I've got plenty of points of comparison. While Apple's hardware design isn't perfect, all of what you bring up seems wildly blown out of proportion to me. I can say I've never seen anyone with molten keyboards and displays. I've used the charger cable on my main charging brick for about five years now, and it's still going strong, despite being used for charging everything everywhere. And while Apple has committed many sins in terms of doing their absolute best at preventing anyone from touching their sacred hardware (we just need DRMed cables and enclosures to complete the set), this only affects repair. In terms of planned obsolescence, Macbooks statistically don't seem much less reliable than any other laptops on the market. They make up a majority of the used laptop market where I am.

And of course, they just had to bring up the whole mouse charger thing. Apple updated their mouse once, replacing the AA compartment with a battery+port block in the same spot to reuse the old housing, and a decade later people still go on about the evil Apple designers personally spitting in your face for whatever reason sounds the most outrageous.

Apple produced at least three mice that were very different and terrible in different ways. Their laptops are good, but don't waste your time defending their other peripherals.

Apple's unwillingness to admit that one button isn't enough is legendary. They added a fucking multi-touch area to the fucking mouse because that's apparently easier to use and more efficient. It's funny as hell.

I've barely ever tried them, but I've never liked the shaping of any that I have held, and I don't think that the touchpad addition justified the discomfort that it causes in all other use cases. That being said, the whole "Apple added the charging port on the bottom to be evil and prevent you from using the mouse" thing had become such an entrenched internet fable over the last decade that it's impossible for me to come across it and not comment on it. I'll clarify that no one but the designers themselves knows the original intention, but since it's the exact same design as the AA model, just with internal changes, it seems like an open-and-shut case.

I’ll admit to owning one and I use it.

The charging port location is weird and stupid, but I have never needed to charge it while I am using it. When it hits about 15%, I plug it in at the EOD and don’t have to charge it again for maybe a month. I am a neat freak and you have to look hard to see any cable on my desk rig.

The multi touch stuff works fine for me, but perhaps I am just used to it.

The only complaint I have is the shape, it’s not “comfortable” to use. Easily addressed by a stupid 3D-printed humpback add-on that changes the aesthetic but makes it comfortable for me to use. I shouldn’t have to buy a mouse accessory…but I did.

Here is the thing though…it’s just a mouse. I point, I click, then I move my hand back to the keyboard. It’s fine. While I’m sure there is a better designed one out there, is any mouse truly revolutionary?

We do know the intention, though. Apple thinks a mouse with a cable looks messy and ugly, so they made the mouse charge fast and put the port on the bottom. That made it impossible to use whilst charging, but you could get 2 hours of use out of like 10 minutes of charging. The end result Apple hoped for was people always seeing the mouse on the desk: cableless, charged.

I'm surprised it came out during the Jobs era because he strongly believed in "form follows function".

Again, this is something that's often repeated all over the internet, but there is no source for this, it's just speculation - and fairly unconvincing speculation at that, since it has to go so far in assigning the designers these strong opinions and unwillingness to compromise just for it all to make sense. I feel like what I proposed is a far simpler and more straightforward explanation. Occam's razor and all. Just look at what the mouse looked like through its generations[1]. When redesigning it, they obviously just took out the single-use battery compartment and replaced it with a section containing the rechargeable battery and the charging port. In fact, they really couldn't have done it any other way, because the mouse is so flat that its top and bottom sides taper all the way to the desk, with no space for a charging port. So, when making the gen 2 model, just putting the port where it is was probably a far simpler drop-in solution that saved them from having to redesign the rest of the mouse.

[1] https://cdn.shopify.com/s/files/1/0572/5788/5832/files/magic...

> I'm surprised it came out during the Jobs era because he strongly believed in "form follows function".

The Jobs era of Apple had a ton of pretty but less functional designs. Jobs is quoted as saying that, but he was full of it. He didn't actually practice that philosophy at all.

>Apple added the charging port on the bottom to be evil

I don't think anyone does anything "to be evil".

But clearly they had a choice between what was good for the user (being able to use the mouse while charging) and what suited their aesthetic, and they chose the latter. Open-and-shut case, indeed.

That's Apple for you. Any time there's a conflict between aesthetics and user friendliness, aesthetics will always win out.

“I'll clarify that no one but the designers themselves knows the original intention, but since it's the exact same design as the AA model, just with internal changes, it seems like an open-and-shut case.”

“Legendary attention to detail”

Indeed, it is pretty open-and-shut.

which is really funny, since the Microsoft mice (only a few are left) and keyboards (discontinued) are by far some of my favorite peripherals.

On the Apple mouse side, I got a white corded mouse with the tiny eraser-looking mouse wheel back around 2003 or so, and it's still in use today with an M4 Mac mini. Works like a champ. The keyboard from that era is also still in use, daily, in our family.

I daily drive the Microsoft Touch Mouse, have for 10+ years. It is by far my favorite piece of hardware. I've never seen another one used in the wild, which might explain why they discontinued it.

The remote controls for Apple TV are among the all time worst peripherals I have ever used. Remotes aren’t hard. They reinvented the wheel by making it rectangular.

To be fair, since the Logitech Harmony One went EOL there hasn't been a decent remote available from anyone.

There was a third-party battery module[1] for the original AA Magic Mouse that would allow it to charge wirelessly, a feature that Apple somehow still has not managed to steal!

[1] https://techpp.com/2011/04/19/mobee-magic-charger-for-magic-...

> How I miss that time on a Macbook, with all the chords you have to press whenever you need a Home or End button to edit the line!

???? ctrl+a and ctrl+e? That works on most Linux setups, too. Only Microsoft screws that up. I love how in Mac Office apps, Microsoft also makes ctrl+a and ctrl+e do what they do in Windows lol.
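
And if you want the physical Home/End keys (Fn+Left/Right on the built-in keyboard) to move within the line instead of scrolling, the Cocoa text system can be remapped. A sketch of a ~/Library/KeyBindings/DefaultKeyBinding.dict that does it; it applies to native text fields after apps restart, but not to Terminal, which has its own keyboard profile settings:

    {
        /* Home/End move the insertion point instead of scrolling */
        "\UF729"  = moveToBeginningOfLine:;                      /* Home       */
        "\UF72B"  = moveToEndOfLine:;                            /* End        */
        "$\UF729" = moveToBeginningOfLineAndModifySelection:;    /* Shift-Home */
        "$\UF72B" = moveToEndOfLineAndModifySelection:;          /* Shift-End  */
    }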

[deleted]

Was there a Gateway that did better?

Can you be specific about your bad experiences with Apple hardware? I've gone through 5 MacBook Pros since 2008 and my only complaint was the old Intel models always got too hot. Nothing ever broke on them and I guess I kept them relatively clean?

I also have all of the adapters that came with the MBPs too, all perfectly functioning, the oldest still attached and powering my 2013 model with the dead battery (2008 model was sold, still working). The magsafe cable is pretty yellow now, and maybe a little wonky from the constant travelling, but no fraying/fire hazard yet.

>Oh, that smell of molten keyboard plastic, those yellow spots burned into a display with its own heat exhaust, those laser-machined loudspeaker holes next to the keyboard, all filled with grime! How I miss that time on a Macbook, with all the chords you have to press whenever you need a Home or End button to edit the line! Not to mention the power button right next to backspace. It's so rewarding when its charger dies in a month, and you feel superior to your colleague, whose vintage 6-month-old charging cable with none of that extraneous rubber next to the connector catches fire along with your office. What a time to be alive!

None of the above sounds like anybody's actual experience. Which is also why they have better resale value retention than PC laptops, and the highest reported user satisfaction.

Now, if you were about the lack of ports (at least for a period) or the crappy "butterfly" keyboard (for a period), you'd have an actual point.

Home/End is just Control-A/E.

Never seen "molten keyboard plastic". I'm sure you can find some person who has that somewhere on the internet. I doubt it's a problem beyond some 0.0001% rare battery failures or something like that.

"yellow spots burned into a display with its own heat exhaust". Not sure what this even means. Especially AS Macs don't even get hot. I haven't heard the fan ever, and I use a M1 MBP of 5+ years with vms and heavy audio/video apps.

"when its charger dies in a month" is just bs.

While I was in law school, every student who had an Apple laptop had to get their laptop replaced at least once (some multiple times) over the course of our program. The biggest problem was the bulging keyboard, due to the bulging battery, but there were also numerous issues with displays and with chargers not lasting very long. Most chargers lasted at least a semester, but few of the Apple chargers lasted an entire school year. They simply weren't designed with durability in mind. Quite humorously, after one student's laptop keyboard began bulging during torts, the professor began an impromptu lecture on product liability laws.

The only PC laptops that were replaced were the ones that got damaged in accidents (car accidents, dropped off a balcony, used as a shield in self defense during a robbery, etc.). Dell Latitudes of that era were sturdy, and not noticeably heavier than their fragile Apple counterparts.

Staingate?

I had a GPU issue that was the subject of a recall matching my symptoms precisely (and I could make the MBP core dump on demand at the Genius Bar), but: "recall declined, does not fail diagnostics".

Damaged charging circuit on an MBA. Laptop worked perfectly. Battery health check fine. Just could not charge it. "That will be a $900 repair. Maybe we can look at getting you into a new Mac?" (for one brief moment I thought they were going to exchange mine... no, they wanted me to buy one. And of course, my MBA couldn't be traded in because it was damaged...).

I've also had multiple Magsafe connectors fray to the point of becoming like a paper lantern with all the bare wire visible, despite the cable being attached to a desk with cable connectors so there was near zero cable stress (and often only plugged/unplugged once a week).

Also they leak charge onto the case.

Any properly grounded device will only do that with incorrect electrical wiring and/or a shoddy charger. Did this happen with a properly wired outlet and an undamaged Apple charger?

I have doubts that it did, as that would warrant a safety recall.

Can confirm it does happen. UK, both on my ThinkPad and a friend's MacBook when plugged in. It's a somewhat unavoidable side effect of the switching AC adapter designs - the output is isolated from the mains, but there is a tiny leakage current that can sometimes be felt as a "vibration". This is completely safe (far below the currents needed to cause harm) and no recall is needed.

Thank you. I always felt this vibration and wondered what it was.

If you replace the two-prong plug on the AC adapter with a three-prong cable, your MacBook case will be properly grounded and you won’t feel any vibration.

Cast aside your doubts, I've been to different parts of Europe a few times with different, healthy MBPs (I buy a new one every 4-5 years) with healthy adapters.

Plugging into the wide EU outlet with the Apple-manufactured plug from the "World Travel Adapter Kit" can lead to an uncomfortable "vibration" that you feel when you touch the top case, depending on the hotel/airbnb. Whenever I visit, I now expect to charge while I'm not using the device.

In researching why it was happening to me, I found sufficient forum posts complaining about it that it seems to be commonplace.

I have my doubts that Apple would admit enough to perform a safety recall given the issues they've had with their garbage chargers in the past. Other companies have no problems with building hardware that lasts. Apple seem to prefer their customers pay for replacements.

Found this out one time when I went to touch my MBA and it was like I stuck my finger in a light socket.

n=4 but my niece spilled a whole cup of milk and a whole cup of matcha on my M2 (twice on 1 device). I just flipped it up, dried it out with a hair dryer (apparently shouldn't do that) and it still works 2 years later.

Can't relate to what you're saying, had 4 MacBooks, and many PCs too.

Yes! I wholeheartedly agree!

I teach C++ programming classes as part of my job as a professor. I have a work-issued MacBook Pro, and I make heavy use of Terminal.app. One of the things that annoy me is always having to click on a dialog box whenever I recompile my code and use lldb for debugging. Why should I need to click on a dialog to grant permission to lldb to debug my own program?

It wasn't always like this on the Mac. I had a Core Duo MacBook that ran Tiger (later Leopard and Snow Leopard) that I completed my undergraduate computer science assignments on, including a Unix systems programming course where I wrote a small multi-threaded web server in C. Mac OS X used to respect the user and get out of the way. It was Windows that bothered me with nagging.

Sadly, over the years the Mac has become more annoying. Notarization, notifications to upgrade, the annoying dialog whenever I run a program under lldb....
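
If it helps, there is a way to cut down on the lldb authorization prompts specifically; a sketch, assuming the Xcode command line tools are installed (the exact behavior varies a bit between macOS versions):

    # Enable developer mode so Apple-signed debuggers can attach without
    # re-authorizing every session, and add yourself to the _developer group.
    sudo DevToolsSecurity -enable
    sudo dscl . -append /Groups/_developer GroupMembership "$(whoami)"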

> Why should I need to click on a dialog to grant permission to lldb to debug my own program?

Because apps and web browser tabs run as your user and otherwise they would be able to run lldb without authorization. So, this is the authorization.

Tried iTerm?

Terminal app is terrible, use iTerm if you possibly can.

*ghostty

Ghostty is very janky (though less on Mac than on Linux). It’s promising but needs a lot of polish. So in the meantime I reverted to iTerm.

That has nothing to do with developer prompts.

I can't stand the stupid dance required to run a third party app every single time.

Get the fuck out of my way and let me use what is supposedly my computer.

They become more shitware and Microsoft like with every update.

Worst part about this is that these days the error message simply lies to the user. "Whatever.App is damaged and can't be opened. You should move it to the trash."

If you remove the 'quarantine' attribute that gets added to downloaded files it runs great.

This doesn't really address your concern, but I feel obligated to share the Gatekeeper bypasses I know about:

Homebrew has an option which bypasses Gatekeeper when installing apps:

  brew install --cask app --no-quarantine
And apparently you can have this on by default:

  export HOMEBREW_CASK_OPTS="--no-quarantine"
And I have this alias for everything else:

  alias unq="xattr -dr com.apple.quarantine"
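
For example (hypothetical app path), to check whether the attribute is even there before stripping it:

  # a quarantined download will list com.apple.quarantine among its attributes
  xattr -l /Applications/Whatever.app
  # then clear it recursively with the alias above
  unq /Applications/Whatever.app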

The Apple gods know what's best for you. They're just trying to protect you from yourself.

That's true, but macOS is still further from the current equivalent of Windows than it ever was.

Yes, Windows 11 is horrendous.

Years and years ago, I fought the good fight, full desktop Linux fulltime.

I see and hear from a lot of people that it's pretty great these days though, and you can use whatever the new cool fork of WINE is, or a VM, for games / software compatibility.

Definitely not moving to 11. When 10 gets painful enough I'll probably try Devuan or Arch or something for desktop use.

I used Linux exclusively from 1999 to 2012. I gave up in 2012 and switched to Mac.

I’ve just switched back to Linux, and I can confirm. It’s different, it’s very different. And it’s very good. I realised how much I’ve just been fighting Apple every day in newer versions of macOS.

I was going to ask my boss for a new MacBook Pro soon but now I’m having second thoughts.

> I was going to ask my boss for a new MacBook Pro soon but now I’m having second thoughts.

If you're looking for laptop recs, I've been happy with my Framework 16 except for one issue: when you close the lid, it can subtly flex in your backpack, press keys, and wake itself up. I work around it, but it's annoying.

But they recently announced a second build of the Framework 16 where one of the selling points is that the lid won't flex. Can't personally verify that they got it right, but given the build quality of the rest of the machine, I suspect they did.

I gave up in 2010, but in my case went to Windows 7, plus VMware.

While I appreciate the NeXTSTEP heritage of macOS, I'd rather have something that costs 1000 euros less, with more RAM, a bigger SSD, and a GPU capable of doing CUDA and Vulkan without translation layers.

Now if my employer or customer is willing to assign a MacBook Pro with the same RAM and storage for project activities, great, good on them; I would not say no.

Similar timeframes! Any distro recs?

I'd suggest Linux Mint if you want Ubuntu with a GUI that doesn't suck, though Kubuntu is probably a decent choice (KDE isn't to my taste visually so I don't use it, but it absolutely doesn't suck and many people like it). If you want a rolling release instead because you want to live on the bleeding edge, Arch is hard to beat.

I can second Mint for a good GUI, but if you want Wayland I would probably go for either Pop!_OS with the new COSMIC desktop beta or Fedora with KDE over any version of Ubuntu. Snap packages still suck, IMO.

I should probably mention that Linux Mint disables Snap packages by default (for reasons of principle that boil down to "Snap packages suck"), though you can reenable them with a single command if you actually want Snap. That's another point in Mint's favor for me.

You can disable SIP if you don't like it.
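
For reference, it can't be toggled from a normal boot; a sketch of the usual procedure (csrutil status works anywhere to check the current state):

    csrutil status    # from a normal boot; reports "enabled" by default
    csrutil disable   # only from the Recovery environment's Terminal; reboot afterwards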

I'm the first to tell people they can disable SIP, but it's not something that everyone should do.

i.e., blanket disabling of SIP will interfere with conveniences like Apple Pay, etc. People want those conveniences. Disabling SIP is a trade-off.

I had no idea Apple Pay was a thing on MacOS.

> But, man, Apple hardware still rocks. Can't deny that.

This makes me extra sad. The HW is very good and very expensive, but the SW is mediocre. I bought an iPhone 16 a few months ago and I swear that is the first and last iPhone I'll ever purchase. I'd happily sell it at half price if someone local wants it.

Edit: Since people asked for points, here is a list of things that I believe iOS does not do well:

- In Android Chrome, I can set the YouTube website to desktop mode and loop a video. I can also turn off the screen without interrupting playback. I can't do this in Safari, no matter how I tried.

- In Safari, I need to long-press a button to bring up the list of closed tabs. How can anyone figure it out without asking around online?

- In the Stocks app, a few News pieces are automatically brought up and occupy the lower half of the screen when it starts. This is very annoying, as I don't need it and cannot turn it off.

- (This is really the level of ridiculous) In Clock, I fucking cannot set a one-time alarm for a future date (Repeat = Never means just today), so I had to stupidly set up weekly alarms and change them whenever I need a new one-time one. I hope I'm just too stupid to find the magic option.

- Again, in Clock, if I want to set up an alarm for sleep, I have to turn on... Sleep. This is OK-ish, as I can just set up a weekly alarm and click every weekday.

So far, I think Mail and Maps are user friendly. Maps actually shows more stuff than Google Maps, which is especially useful for pedestrians. Weather is also good and I have little to complain about.

The YouTube thing is Google's choice, not Apple's, as those are "premium" features. Install Vinegar (https://apps.apple.com/us/app/vinegar-tube-cleaner/id1591303...) to get a standard HTML5 player in YouTube, which will let you make it full screen, PiP it, background it, whatever.

I dislike the new Safari layout in iOS 26 too. https://support.apple.com/en-nz/guide/iphone/ipha9ffea1a3/io... -- change it from "compact" to "bottom". I assume this choice will disappear in the future, but for now, you can make it more familiar.

Unfortunately, I don't have any advice for the Clock/Alarm; I don't typically schedule one-off future alarms. That would be a useful feature.

> The YouTube thing is Google's choice, not Apple's, as those are "premium" features. Install Vinegar (https://apps.apple.com/us/app/vinegar-tube-cleaner/id1591303...) to get a standard HTML5 player in YouTube, which will let you make it full screen, PiP it, background it, whatever.

But it IS Apple's choice. The problem is they have a mixed up conflict of interest, and it's even worse when Apple themselves is trying to sell you their own services.

IMHO the company making the hardware, the company making the software, and the company selling the cloud services shouldn't be allowed to all be the same company. There's too much conflict of interest.

I don't see how it's Apple's choice?

Google sells PiP, background playing etc. as part of YouTube Premium (not Apple!). Google serves browser clients a player that can't do those things, because they want you to pay for them. Vinegar is a browser extension that replaces Google's player with Apple's plain HTML5 player. Apple's plain HTML5 player does all that stuff for free.

With Apple's default apps, I kind of feel like the apps themselves are strategically designed to not be the best place to look for lesser-trafficked use cases.

Visual customizations get upstreamed into the system accessibility settings. Extra functions are exposed exclusively in Shortcuts for you to hack together an automation feature yourself. For a fully featured app, Apple would probably say go pay for an app on the App Store (and pay them through the fee).

For example, with your future alarm... you could get another app, or you could create a 'Time of Day' Shortcuts automation which checks every day whether the date is the one you want the alarm on; if it is, it creates the alarm and deletes it the next day. A (not so) fun fact about automating alarms before iOS 17: you could only delete alarms through Siri and not Shortcuts lol...

> I fucking cannot set an one time alarm for a future date (Repeat = Never means just today), so I had to stupidly set up weekly alerts and change it whenever I need a new one-time. I hope I'm too stupid to find the magic option.

I think you're supposed to use the calendar for that.

Which is broken, because I want to be able to say “set an alarm for next Wednesday at 4:30 AM” the second I find out I have to give someone a ride to the airport.

If I have to make it a calendar entry, I may not notice it in time.

why do you have to go to the calendar to set an alarm?

The alarm and the calendar can both alert you on a specific date and time. But like a bedside alarm clock, the iOS alarm is limited to a time in the next 24 hours.

On Android, why would you not use the calendar when you want to be alerted days from now?

Because on my Android phone, the alarm accepts a date and rings only on that date

Wat? Because a century-old device can only set a repeating alarm, the 2025 $1k smartphone should be limited to that, too?! I set so many one-off alarms on my Android weekly; I would be infuriated if I had to use the calendar for that.

You can set as many non-repeating alarms as you like, as long as they're within the next 24 hours.

What kind of event are you creating an alarm for that's more than a day away? On Friday, do you create your alarm to wake up on Monday? Does Android have a calendar-like view of your upcoming alarms?

(Sorry for the barrage of questions, but this is interesting to me.)

I frequently use one-time alarms for early morning travel a few days away. What makes them great is that the alarms in general are much more robust in their requirements to turn them off; I can accidentally dismiss a calendar alert, but I have a much harder time accidentally deactivating the clock alarm in a sleep induced stupor at odd hours.

More importantly, alarms don't get silenced by my nightly do not disturb schedule.

>You can set as many non-repeating alarms as you like, as long as they're within the next 24 hours.

Any color as long as it's black, eh?

I'm with you. I got an iPhone 15 a couple of years ago after only ever using Android. I was expecting great things and it's just.. confused!

I never have any confidence it will notify me of things. Often I just miss stuff. I actually have no idea how all the different "Focus" modes work. Notifications pile up and then seem to disappear without any action, and then reappear a week later.

The keyboard is really awful. I recently pulled an ancient Galaxy phone out of a drawer to test something and was reminded of just how much better the Android keyboard is. It just always guesses the wrong things!

And the settings are equally jumbled. Sometimes they're in the Settings app, sometimes they're in the app I'm using. It's just confusing.

It was honestly eye opening because I'd spent so many years assuming iPhones were better. I moved across because my Android phones kept having hardware issues. I think I'll probably go back to Android after this experiment.

I recently bought an original iPhone SE in hopes of using a smaller device. But I can't use it for anything worthwhile because I can't sign into an Apple ID due to having encrypted backups on my account, which was introduced in iOS 16.2. The SE only supports 15.8.5. So I'd have to disable encrypted backups for all of my other devices in order to sign into my account.

Why am I unable to use Apple Music on my device while I can use it from a web browser or from an android phone that don't have encrypted iCloud backups enabled?

Unfortunately this also prevents me from jailbreaking the device because I have to sign into an Apple account in order to trust a developer certificate on the device required for the jailbreaking tool. It's my device! Let me approve of the certificate without an Apple ID!

Oh, that sounds annoying. I also bought an SE 2016 128GB as an "offline-first" / de-Appled device. I think what I did was just sign in to the App Store, not fully adding my Apple ID to the device. It now serves as a time capsule, has all the classic apps like Doodle Jump, Fruit Ninja, MiniMetro etc., and some resource-light open source apps like CoMaps and a file explorer with connection / offline sync to Nextcloud. Jailbroken via Palera1n (I don't think I had to approve or install any certificate).

Where are you located at?

What are your actual complaints? And have you tried the trash fire that is Android?

Do you think they just got a smartphone for the first time a few months ago?

I switched to Android in 2021 after almost a decade using iPhone, and I was surprised to find modern Android is actually very good. I remember the early days of Android being a trash fire, but since around ~2020 it seems to have gotten a lot better. For various reasons I'm looking to switch back to the iPhone, but I know it'll be a case of giving up some good things on Android in exchange for other things being better on the iPhone.

I updated my reply ^

> But, man, Apple hardware still rocks. Can't deny that.

They really dodged a bullet there. 2016-2020 Apple laptop hardware definitely didn't rock. It's good they did an about-face on some of those bad ideas.

The fact they were able to turn around their hardware division after all that is the only thing which gives me hope they might be capable of doing an about-face on software.

Are you referring to the butterfly keyboard and TouchBar?

FWIW, I think the Touchbar was close to being a good idea, it was just missing haptics.

They didn't rock but... modern MacBook Pro models are bigger and heavier and have a notch (and yes, I know technically the notch is more screen rather than less, and that one can simply use the space below, but I still don't like it). I also liked how you could charge the older models from either side and still have 3 free Thunderbolt ports.

Debatable since the nub is still around on all their devices. My M3 work laptop definitely feels like a playskool toy.

You can’t get more brain-dead than taking away important screen real estate and then making the argument that you get more real estate because it’s now all tucked into a corner.

God forbid there be a black strip on the sides of the screen. How did we ever live?!??

I really like my macbook with the notch. The bezels are smaller and you can fit a larger screen in a similar footprint to previous macbooks without the notch.

Except Apple increased the height of the screen by exactly as many pixels as the notch is tall, so yes, your windows actually do get more real estate compared with the pre-notch models. I’m not going to argue that it’s free of problems, but it doesn’t come at the cost of usable desktop space.

Also worth pointing out that this design allows a substantial footprint reduction, which for example puts the 13.6” Macbook Air within striking distance of traditional bezel 12” laptops in terms of physical size. Some people care about that.

It does come at the cost of usable desktop space, because if you don’t use a separate monitor with your portable device, you can’t see or use the icons in the menu bar.

And people's suggestion is to install 3rd-party software or just deal with it. It doesn't help that fanatics feel the need to tell you which parts of the screen are and aren't important real estate. Like, fuck me and my opinions, right?

“Well actually mathematically there’s more real estate in less convenient places so it’s fine.” Is so…depressing to watch people just give in to any little idea that comes out of this company’s PR department, like their logic is The Only truth.

It’s a legitimate issue in one sense, but another way of seeing it is that outside of a select few programs that actually need to be that way, Windows-tray-style apps are something of an anti-pattern. Most apps would be better as plain old dock apps or maybe headless daemons with HUD style UI summoned by a keystroke and thus not need menubar icons. It’s genuinely weird that we’ve ended up in a place where more than a fractional number of users have a bajillion menubar icons to contend with.

My suggestion would be to buy a laptop you like, then.

Or if your work is forcing it on you, I'm sorry but that's not the fault of people who happen to enjoy it. Maybe ask if you can get a different machine?

I’ve tried all those options. I only use my Apple laptop for work, and for everything else I use a well-designed piece of hardware, a Framework laptop. I’m a hardware prude and only work on professional hardware with fantastic design that doesn’t obfuscate my screen for the sake of an argument about pixel space and a desire to stand out at the cost of sensibility.

Putting aside the keyboard debacle, a lot of the blame for that era can be laid at Intel's door. They consistently over-promised and under-delivered.

Fuck man, I worked on the original Mac OS back in '83, when all the work was in assembly. Know what happened? Apple happened. That company is fucked up something supreme. The entire premise behind that original graphical UI was never user experience, it was 'the users are idiots, we have to control them'.

We know each and every person who worked on the Mac because of projects like folklore.org. Which one are you?

Besides the sibling comments, if you read books like "Steve Jobs & the Next Big Thing", this point of view is laid out there in quite some detail.

Younger readers will find out that the modern Apple attitude is quite similar to that of the early years of the Macintosh being sold to universities.

The "be nice and think different" phase only lasted while they were about to close their doors.

as his user profile indicates: Blake Senftner. I don't see him mentioned on folklore.org, but that doesn't mean he didn't participate in some way.

https://www.quora.com/profile/Blake-Senftner

> Original Macintosh Beta Tester and Mac 3rd Party Developer (‘83-’85)

I was a teen game developer with my own games in Sears & Kmart nationwide in the US, for the VIC-20 and the C-64, and was invited as a representative of the independent games industry. When my involvement was ending, Apple told me they had changed their mind and were not going to support independent games for the Mac at all. But they offered to waive that restriction if I paid them $30K and gave them full editorial control over what I published. Nope.

It's wild how fast Apple pivoted from Woz just wanting to make a PC anyone could write and play their own video games on to "Nah we want full control of every last bit, fuck your indie games".

I think Apple marketing understands human motivation and the rarely acknowledged super-strength of prestige marketing. Apple's marketing very much leans into the idea that every one of their products must be perceived as a high-prestige item to own, or they will not release it. When the Mac was brand new, they cultivated and guarded that prestige like a hawk.

I'd wonder if especially at that point in history, trying to appear as a gaming platform would be a liability.

An early Mac was not a great gaming computer even by 1980s standards, and the last thing you want is Commodore or Atari running an ad saying "Apple's $2500 black-and-white prestige piece doesn't play games as well as a $299 C64/800XL". Not to mention the stink of the US video game market crash hovering around anything game-related.

If they pivot directly towards more professional workstation/publication/art department positions, nobody's making that point. (Now I'm thinking of the time "Boot", or maybe it was already "Maximum PC" by then, reviewed a SGI O2 and said it was impressive, but had limited game selections.)

QuickDraw was revolutionary: it had all the optimizations, and all the code was in assembly. Things like classic arcade games were very much possible. I had a lunar lander game with a side-scrolling landscape, a Dig Dug clone, a variation on Donkey Kong, and a variant of Robotron. Apple thought that would give the wrong impression; they wanted the design and typography crowd. Can't say they were wrong, to be honest.

He worked on a beta as part of the Apple Professional Developer's Program at Harvard.

I recently installed Ubuntu on my gaming machine. It was a bit of a learning curve, but I am still able to game, and I can play around with software without being treated like a criminal. It's great.

I still use Mac for dev, but only because I don't really feel like messing around with Linux on a work computer.

Not far from my philosophy. If I'm being paid, I'll use whatever I'm getting paid to use. But on my own, I'll choose to learn tools that will be around for a long time, and won't get taken away by some rent-seeking company (i.e. open-source).

Ubuntu is great once you remove the motd advertisements.

Maybe I'm just too used to advertising blatantly in my windows env that I'm not noticing ads on Ubuntu. Which MOTD ads are you talking about?

The ones you remove by

   setting ENABLED=0 in /etc/default/motd-news
   apt remove ubuntu-advantage-tools
https://canonical.com/legal/motd

https://support.tools/remove-ubuntu-pro-advertisement-apt-up...

I do not understand this rhetoric of Apple hardware being so amazing. The only moderately impressive thing they've done for years is the M chips. Beyond that, it's just crippled, overpriced, and unrepairable.

They have shiny cases, yay. I'll take my ugly Thinkpad and actually get shit done over a shiny case and glossy screen.

I think it's a bit more than that. I like that I can easily swap the SSD and DRAM in my Thinkpad. But Apple has definitely done some interesting things including:

- good thermals (especially vs. Thinkpad P series), even supporting fanless operation on the MacBook Air

- excellent microphone and speaker array (makes people much more intelligible on both sides during Zoom calls)

- excellent multitouch trackpad with good palm rejection (though for a trackpoint device Thinkpad is your best bet)

- unified GPU and CPU memory with up to 135 GB/s bandwidth (downside: DRAM is not upgradable)

- host-managed flash storage (downside: SSD is not upgradable)

And of course the 10-20 hour battery life is hard to beat. Only downside is I'll forget to plug in at all.

Historically, Apple has innovated quite a bit in the laptop space, including: moving the keyboard back for the modern palm rest design (PowerBook, 1991); active-matrix color display (PowerBook 180c, 1993); integrated wi-fi and handle antenna (iBook, 1999); Unix-based OS that could still run MS Office and Photoshop (Mac OS X, 2000 onward); full-featured thin metal laptop with gigabit ethernet (PowerBook G4 Titanium, 2001); pre-ultrabook thin laptop that fits in a manila envelope (MacBook Air, 2008); high-resolution display and all-flash storage in an ever-thinner design ("Retina" MacBook Pro, 2012); going all-in on USB-C/Thunderbolt and 5K external "retina" display (MacBook Pro, 2016); unprecedented performance, and a tandem OLED display with <10ms touch-to-pixel latency, in an absurdly thin iPad, which can also be used as a "laptop" (iPad M4 + magic keyboard, 2024); etc. Some of the innovations also failed, such as the touchbar, dual-controller trackpad, and "butterfly" keyboard which plagued the 2016 models.

I mean, having 2 hours of active use vs 10 hours is quite a bit of difference, quite a bit more meaningful than "shiny case".

Mac is great hardware to be sure. I have to say though, I much prefer an S25 Ultra with Samsung's version of "nanotexture" — even with iPhone 17's improved (?) anti-reflective screen.

I've been very patient with iOS 26. I tell myself - so long as its foundation is right, they'll iron out these details. But it is properly bad often and at times extremely frustrating.

The funny thing is that the annoying popups on Windows look like advertising copy from the web of the era after Microsoft got grid and flexbox into HTML to support HTML-based applications. They at least try to be visually enticing.

Annoying popups on MacOS look like the 1999 remake of the modal dialogs from the 1984 Mac, I guess with some concessions to liquid glass.

Funny that a lot of people seem to have different Liquid Glass experiences; are we being feature-flagged? I don't see the massive disruption to icons that the author sees, but it does seem to me that certain icons have been drained of all their contrast and just look bleh now, particularly the Settings icon on my iPhone. I don't see a bold design based on transparency, I just see that the edges of things look like they've been anti-antialiased now. It's like somebody just did some random vandalization of the UI without any rhyme or reason. It's not catastrophic but it's no improvement.

It is catastrophic if you have older devices.

All this wank to waste the power of faster and faster chips.

I have flipped every imaginable switch and even set up a work profile, and I have not yet been able to turn off desktop notifications on my new work Mac. Every time I log in there's a persistent notification about... something stuck in the corner. The notification does not explain itself, clicking it just drops me into a system menu. Usually there's three or four such notifications queued up.

So it just lives there permanently on the desktop and I avoid using the thing as much as possible. I do all of my work functions through SSH and leave the Mac in a corner of my desk with the screen closed.

I swear, MacOS actively tries to be as annoying and intrusive as possible. Every time I touch the damn thing some new behavior reveals itself. Like there are two sets of hotkeys for copy/paste and which one you need to use appears to be entirely random per-window.

Thankfully work lets me use linux on my main machine and I almost never have to deal with goddamn MacOS. I would rather daily drive Windows 11 with copilot and cortana and piles of bossware than plain MacOS.

I am. Better, I am old enough to have seen Apple in both the good and the bad times, and to tell you that Apple's much-vaunted attention to detail, and its catering to developers, was a side effect of being really close to shutting down the whole business.

Now that they have plenty of money, the attitude of the early Apple years is back.

Unsurprising that they'd end up there; at the time, the Mac got away with fewer security pop-ups thanks to its (relative) obscurity. Fortunately Windows still manages to run ahead as the even worse villain, as I wouldn't even let a Windows 11 PC in my house these days.

[deleted]

I often get third-party popups from software vendors which ask me for my macOS password. I have checked several times and these are "legit" (as in, the popup comes from who it says it does and it's a reputable company). It's wild to me that Apple have painted themselves into a world where it's expected that users give their OS password to third-party apps.

MacOS and iOS both seem to have an insatiable hunger for passwords. The most aggravating scenario for me by far is when the App Store on iOS, with no consistent pattern I have been able to identify, makes me reenter my entire massive Apple ID password instead of the usual Face ID prompt to download ... a free app.

I can’t get it to use my password manager on that screen either, and navigating to another app closes the modal so you have to copy your password and then start over.

Wait, that's actually never legit. If the password popup comes from the OS on behalf of the vendor, that's OK; the third party never has access to your password, just a time-limited auth token that allows it to do something privileged.

Ok? I don't know if it's the OS on behalf of the app or not. It's a password prompt that doesn't even have an affordance for biometrics, unlike other MacOS admin prompts. It's commonplace in MacOS applications.

This is an example of what I'm talking about https://www.reddit.com/r/Slack/comments/1geva4f/how_do_i_sto...

This is good for security because you're giving temporary access for a helper binary to do privileged stuff in a limited scope.

From the UX perspective, yes, it is triggered from the app.

It's been a long time since I used the Core Foundation API but you trigger a request, and then get back a token from the OS that grants you permission to do stuff.

I don't know if this is current or not:

https://developer.apple.com/library/archive/documentation/Se...
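
A quick way to see the same pattern from the shell, for what it's worth: the dialog below is drawn by the OS, and the calling script only ever gets the command's output, never the password:

    # triggers the system admin-authentication dialog
    osascript -e 'do shell script "whoami" with administrator privileges'
    # prints: root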

[deleted]
[deleted]

You're not sure if anyone on HN is more than 14 years old? I mean, I feel you, but the odds are...

I watched the whole video and laughed so hard! Thanks for sharing this!

> But, man, Apple hardware still rocks. Can't deny that.

Ah yes, the Jony Ive era of "no ports on Macbooks except USB-C, and hope you like touchbars!" was fantastic. Not to mention how heavy the damn things are. Oh, and the sharp edges of the case where my palms rest. And the chiclet keyboards with .0001 mm of key travel. I'll take a carbon fiber Thinkpad with an OLED display any day of the week, thank you. Macbooks feel like user-hostile devices and are the epitome of form over function.

I don't mind that the Macbooks only have USB-C ports. Unlike many PCs, where the USB-C ports can't be used for charging, or can't be used for high-speed data transfer, or can't be used for external displays, or can't be used by certain software that only speaks USB 2.0, etc., the Macbooks let any USB-C port do anything. It's a forward-thinking decision, even if it was primarily made for aesthetic reasons.

What I do mind is that there's only 3 of them.

The transition era was certainly annoying, but now that it's over I think the Mac experience is objectively worse. My PC laptop has 2 USB-C ports that can be used for charging, display, 40 Gbps transfer, etc., just like my Macbook Air. The difference is that the PC also has 2 USB-A ports and an HDMI port. This means that I'm able to plug in a flash drive or connect an external display without having to remember to bring a dongle with me.

I largely agree that PCs have caught up feature-wise, but because they took longer to get there, I still have a couple crappy USB-C ports on PCs that are otherwise fine.

The problem with the 2 USB-C ports on modern PC laptops is that one of them pretty much has to be reserved for the charger, whereas the MBP has a MagSafe port that you can charge with instead. So it really only feels like you have one USB-C port and the other ports are just there as a consolation. That might work out to roughly equal, but I don't think it leaves the Mac off worse. I don't hate the dongles so much though.

It wouldn't have hurt to have some USB-A and HDMI on the MBP--the Minis can pull it off, so clearly the hardware is capable--but more (Thunderbolt) USB-C would still be the best option IMO. USB-A (definitely) and HDMI (probably) will eventually be relics someday, even if they are here for a little while longer.

The current Macbook Pro design also includes an HDMI port.

Can those USB-C ports charge your battery if it is completely empty?

Yes, they're the only charging ports on the laptop.

Didn't they have some issues when you tried charging from the left-side ports? E.g. overheating and throttling, etc.

There are some models of MacBook Pro where one side has more 'thermal headroom' than the other side. I have one of those models, and I can't remember which side it is.

USB-C connectors are much less reliable than their predecessors due to their design though. I have several connectors that failed, either they no longer grip the cable securely or they just lose contact randomly.

I'm sure it's handy for mobile devices where size and versatility trumps everything, but on laptop/desktop machines where longer-term usage is expected I would prefer something more reliable.

Are you comparing USB-C to its direct predecessors, things like (micro-)USB-A/B, (mini-)HDMI, and (mini-)DP, or to more distant ancestors like PS/2 and D-sub/VGA/DVI?

Comparing to USB-A specifically.

C is absolutely better than the micro/mini variants, but not the full-size ports.

Interesting. I can see how USB-A has better friction/grip in the port (when made properly anyway), but I can't say it has better longevity; I've broken plenty of USB-A connectors and even ports. I am also glad to be rid of the "which way is up" dance, and I don't think anyone has shed any tears for the loss of USB-B.

USB-A has much larger pins, meaning it will continue to work even if slightly damaged. USB-C has much higher pin density so the slightest misalignment (due to damage or just bad manufacturing tolerances) causes dropouts.

I'm just salty because I will have to replace either the ports on my Macbook or my USB-C wireless headphone receiver (both are a pricey endeavor, not to mention the downtime of having the laptop shipped for repair), just for the same issue to most likely recur a year down the line, since it wasn't a result of any kind of misuse (both devices are used exclusively in an office environment and are otherwise in brand-new condition).

I forgot where I read it, but there's apparently a Jobs policy of "one standard and two proprietary ports" or something, so as to allow data to be ingested easily and shared freely inside the Apple ecosystem, but not exported back out to the outside world with the same ease.

Which is like, a great way to subsidize junk USB hubs...? But for sure they love following through with policies.

That is complete BS; Macs have never had any proprietary data ports on them. Serial, SCSI, Ethernet, USB, FireWire, Thunderbolt, and USB-C have all been standards.

Well there was a proprietary Ethernet port for a while.

https://en.wikipedia.org/wiki/Apple_Attachment_Unit_Interfac...

Apple serial port used proprietary Apple connectors.

Apple SCSI ports used nonstandard Apple connectors.

Apple Ethernet port was just Ethernet, except Macs preferred AppleTalk for networking, which was a purported competitor to Ethernet.

Apple USB port was just USB, except they were among the first, so it was kind of ex-proprietary.

Apple FireWire was just IEEE 1394, except (combine Ethernet and USB).

Apple Thunderbolt was (combine all above).

Apple USB-C is (combine all above).

What I see here is an evolution away from proprietary connectors (but I definitely agree that they strongly favored such connectors in the past). By the time of mini-DP and Thunderbolt, Apple and Intel were working jointly and the technology came to both PCs and Macs. By the time of USB-C, it was basically the entire electronics industry working together with Apple, and the end result has come to just about every kind of electronic device made by nearly every vendor. It doesn't get any less proprietary than that.

The bizarre part of the USB-C story is not Apple's involvement or early adoption of it, but rather that the mobile hardware side of Apple refused to support it. That they clung to the Lightning connector until the EU forced them to drop it, while their computer division had long since and enthusiastically adopted USB-C, is much more damning.

[deleted]

I still don't see your point.

Your argument was they had a rule of "one standard and two proprietary ports" as a means to "allow data to be ingested easily and freely shared inside Apple ecosystem, but not exported back out to the outside world with same ease".

For serial they used mini-DIN to save space on the back of the machine instead of a random mix of DB-25 and DE-9 on the PC side. My family and everyone I knew used a dime-a-dozen cable to connect a typical PC modem; data was shared freely. There was no "one standard" port at this time to get data "ingested"; serial went both ways.

Even on PCs, to do anything serial you needed hardware and driver support anyway; that was the blocker, not the shape of the port. If Apple had adopted DB-25 for serial, how would that have let data be shared more freely?

For SCSI, the DB-25 Apple used was not proprietary. And even in the System 6 days they had Apple File Exchange to access FAT-formatted disks to write data out for PC users.

For Ethernet, Apple started building in Ethernet as standard before PC makers. They sold a laptop with built-in Ethernet in 1994, which was unheard of on PC laptops.

As for AppleTalk, they pushed LocalTalk at a time before PCs had any built-in networking whatsoever. A PC network card cost a hundred bucks and was only used by corporations, whereas at home, if you had a Mac, you could make a network with a printer cable between two machines. Apple got it for cheap by spending an extra 10 bucks on RS-422 for their serial ports; why wouldn't you advertise that?

If you're talking about AppleTalk the network protocol rather than LocalTalk the physical layer, Apple bundled TCP/IP with MacOS before Windows did ("Trumpet Winsock" was third-party software), back when Microsoft thought they could stop people from adopting the internet because "The Microsoft Network" was going to be so much better.

Arguing that Apple, which was making PowerPC machines, adopting the Intel-defined USB that had already been on PCs for years was a means to keep people from moving data out of the "internet Mac" (which was advertised as letting you share information with the world, with "there is no step 3") just... makes no sense.

iOS on the other hand... Completely different thing.

And technically (the best way, right?) there's a whole thing to suss out between AppleTalk, 422, and LocalTalk, hahahahahah, but it's effectively as proprietary as PS/2 ports were, until they weren't. And ADB was 100% proprietary iirc, but I'm not going to look it up for you.

TBF, Apple did publish public standards for LocalTalk and AppleTalk, and there was third-party hardware that implemented them (including stuff like switches that Apple never made themselves).

ADB was definitely proprietary, but arguably it wasn't a data port, nobody used ADB to output data.

Wuh?

On all but the top-tier MBPs, the USB-C ports on Macs have different specs for data transfer (often the ones on the right of the machine will have half the transfer speed).

Without trying to be pedantic, not all USB-C ports on Apple computers support Thunderbolt.

I haven't encountered this, but I've also only used the Apple Silicon devices. This might explain why there are so few ports, though: Thunderbolt is basically PCIe and has AFAIK direct lanes to the CPU; more full-featured ports = more PCIe lanes = much more complexity/expense.
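
Rough numbers, to put that in perspective (this is my understanding, so take it with a grain of salt): each Thunderbolt 3/4 port can tunnel up to PCIe 3.0 x4, roughly 32 Gbit/s of the 40 Gbit/s link, plus DisplayPort, so three or four full Thunderbolt ports want up to 12-16 lanes' worth of PCIe bandwidth plus display streams behind them if you don't want them sharing, whereas a plain 10 Gbps USB-C port can hang off a shared hub. That's the complexity/expense gap.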

The difference is in the desktop systems. That's where there are USB-C ports without Thunderbolt (e.g. Mac Mini).

Damn, you're right. I have an M1 Mac Mini and both ports are Thunderbolt. I recall, and Wikipedia corroborates, that the M2 Mac Mini could come with either two or four ports, but all were Thunderbolt. Now though, the M4 ones, besides getting an awful facelift, also seem to have sacrificed one Thunderbolt port (and both USB-A ports?!) to "gain" two non-Thunderbolt USB-C ports. What a terrible trade IMO.

I only have 2 USB-C ports, and most of the time I have nothing plugged in (but power). Sometimes a USB-C Ethernet adapter when moving large files.

I find the keyboards terrible, even the modern ones. I much preferred the 2006-2007 MacBook Pro keyboards.

Those are all legit criticisms but also be fair. They eventually did get rid of the touchbar. USB-C-only was merely ahead of its time. They improved the keyboards.

And even at their worst they were still much better than any Windows laptops, if only for the touchpad. I have yet to use a Windows laptop with a touchpad even close to the trackpads Apple had 15 years ago. And the build quality and styling are still unbeaten. Do Windows laptop makers still put shitty stickers all over them?

USB-C only is still a nuisance to this very day and remains the thing I hate most about my Macbook. Without fail there is never an adapter to be found when I need it.

I'm kinda bewildered to see people defend it. Even Apple knew they were wrong; they didn't put an HDMI port on the new Mac chassis by mistake.

Case in point that people will never admit that Apple messes up, even if Apple themselves will.

I owned multiple Macbooks that built up a positive static charge when they were on, instilling a Pavlovian fear of being shocked in anyone who used them. Those were fun.

If you use the 3-prong version of the power adapter to connect to a grounded outlet, this problem goes away. Of course, Apple doesn't actually sell a 3-prong plug for their charger in Europe... so we lucky folks in the EU have to get a third-party one off the internet.

Yes, they do. [1] for Italian outlets, [2] for many of the others. I'm sure I don't need to continue.

[1]: https://www.apple.com/it/shop/product/mw2n3ci/a/prolunga-per...

[2]: https://www.apple.com/fr/shop/product/mw2n3z/a/câble-d’exten...

I suspect what they meant is that there isn't an official Schuko nub that slides onto the brick and lets you hang it directly from the socket rather than carrying an extra meter of cable around. There is a BS1363 one, and those are only legit feasible in a grounded configuration (although I guess you could use a plastic ground spade to lift the child protection slider inside the socket if you were a particularly unpleasant engineer). Nice for those of us in British-adjacent countries.

I'm talking about the non-corded variants. Carrying an extra metre of cable around just to get the grounding prong is not great.

As another poster mentioned, it's particularly annoying because Apple does ship the UK adapter in a 3-pin grounded form.

YMMV. I throw away the non-corded variants because they rarely fit in the spaces that people think to put outlets, especially as the chargers have grown and (particularly) train operators love putting outlets attached to tables with about 5mm of clearance.

That’s nothing to do with static electricity; it’s capacitive coupling through the safety capacitors in the power supply. The chassis sits at 90 VAC or so as a result. It’s not a safety issue; the capacitors are there for FCC compliance on emitted noise.

Is this generally true for laptops / phones?

I've often wondered why I can tell by touch whether a device is charging or not from the slight "vibration" sensation I get when gently touching the case.

For ungrounded / 2-prong outlet devices, yeah.

It's often noticeable if you have a point contact of metal against your skin; sharp edge / screw / speaker grill, etc. Once you have decent coupling between your body and the laptop, you won't feel the tingle / zap.

They're called Y-caps if you want to delve deeper into them and their use in power supplies.
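
For a back-of-the-envelope sense of it (assuming the common topology of two roughly equal Y-caps, one from line to chassis and one from neutral to chassis): they form a capacitive divider, so the floating chassis sits near V_mains / 2, about 115 V on a 230 V supply, a bit less once your body loads it down, which roughly matches the "90 VAC or so" figure above. And since a ~1 nF Y-cap at 50 Hz has an impedance of about 1 / (2π · 50 Hz · 1 nF) ≈ 3 MΩ, the available current is only tens of microamps: a tingle rather than a shock.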

I get that too. I was wondering if it was just me.

They still do. My M1, M1 Max, and M4 Max Macbook Pros all build up a positive static charge. It isn't even something that renders them "returnable", because I've observed it on every single Macbook from the last 4-5 years, so I just assume that's how Macbook Pros are now.

This hasn't changed in at least 2 decades: I was getting zapped by Apple metal laptops circa 2004. But I have never encountered this problem when using a grounded plug.

It was also a lot worse for me when plugged into outlets in an old house in Mexico, especially when my bare feet were touching the terracotta floor tiles; it's not really an issue in a recently re-wired house in California with a wood floor, using the same laptops, power strips, etc.

If you are having this issue and you currently plug a 2-pronged plug into a grounded outlet, try using Apple's 3-pronged plug instead, and I expect it would go away. If you don't have grounded outlets, then that's a bit more complicated to solve.

That's what confuses me: I am using the cable with three prongs, so it is grounded. I'm beginning to suspect some other appliance I'm plugging into it is responsible for the build-up of charge, but then why isn't it finding its way to ground? Something doesn't add up, but that has been my experience consistently.

Perhaps the outlet isn't properly grounded?

Is there any laptop with a metal body out there that does not have this issue? I've had two RedmiBooks by Xiaomi and both had that vibrating electric feeling when plugged in.

It's an artifact of the charger.

This comment explains it well: https://news.ycombinator.com/item?id=45686427

That era sucked for sure, but since (I think) 2021, MacBook Pros have had MagSafe, 3x USB-C, an HDMI port, and an SD card slot. I had a 2014 MBP I was waiting to upgrade once they came up with a sensible redesign, and I'd say they did!

I still have my 2014, along with a 2021 MBP for work, and still love them as machines for my usage profile - writing software/firmware, and occasional PCB design. The battery life is good, M-series performance is great, screen is decent-to-good, trackpad is still best in class, and macos is _okay_ in my book. The keyboard isn't amazing as I prefer mechanical for sure, but I still type faster on a macbook keyboard than anything else. That being said, I designed a mechanical keyboard that sits on top of the macbook keyboard so I can enjoy that better typing experience.

Pretty dang happy with my setup.

Most of what you're talking about is from MacBooks of 5+ years ago on a completely different processor architecture.

I miss the Powerbook G3 series. That was some fantastically modular design.

[dead]

> I wonder if anyone else here is old enough to remember the "I'm a Mac", "And I'm a PC" ads.

Those ads ran from 2006 to 2009. That’s between 16 and 19 years ago. How young do you imagine the typical HN commenter is?

> There was one that was about all the annoying security pop-ups Windows (used to?) have.

Those have been relentlessly mocked on the Mac for years. I remember a point where several articles were written making that exact comparison. People have been calling it “macOS Vista” since before Apple Silicon was a thing.

> FWIW, it starts here: https://youtu.be/qfv6Ah_MVJU?t=230

A bit better quality: https://www.youtube.com/watch?v=VuqZ8AqmLPY

> Those ads ran from 2006 to 2009. That’s between 16 and 19 years ago. How young do you imagine the typical HN commenter is?

Part of getting old is accepting that 20 years was a long time ago and not everyone is going to remember the same commercials we saw two decades ago, including people who were children during the ad campaign.