- If you get into a car crash with your iPhone, an ML model detects it and automatically calls emergency services.

- If you are wearing an Apple Watch, an ML model constantly analyzes your heart rhythm and will alert you to (some types of) irregularities. It's so computationally efficient it can literally do this in the background all day long.

- When you take any picture on any iPhone, a whole array of ML models immediately runs to improve the image. More models are used when you manually edit images.

- After you save the photo, ML models run to analyze and index it so it's easily searchable later. That's why you can search for "golden retriever" and get actual results.

- When you speak to your device (for example, to dictate a text message), an ML model transcribes your speech into text. Likewise, when you're hands-free and want to hear an incoming text message, an ML model converts it to audio. All on-device and available offline, at that.

Or are we playing that stupid game where "AI === LLM"?

> Or are we playing that stupid game where "AI === LLM"?

Well, the original question was specifically about LLMs. ("What other companies have successfully integrated LLM tech in their mainstream products?")

They weren’t responding to the original question.

I took that to mean the tech underlying LLMs.

My bad, I meant LLMs