this post was submitted on 31 Oct 2024
96 points (98.0% liked)
you are viewing a single comment's thread
This is a necessity for running a decent LLM while still having room for the rest of your programs. That's the only reason they're doing it.
After all, Apple needs AI working properly on its devices to spy on users.
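Some back-of-the-envelope math shows why base RAM matters here. A rough sketch, assuming ~1 byte per parameter for a typical 8-bit quantization and ~20% runtime overhead (both figures are rough assumptions, not Apple's numbers):

```python
# Rough RAM estimate for running a local LLM, assuming ~1 byte/parameter
# (8-bit quantization) plus ~20% overhead for KV cache and runtime.
# Both figures are illustrative assumptions.

def llm_ram_gb(params_billions: float, bytes_per_param: float = 1.0,
               overhead: float = 0.20) -> float:
    """Estimate resident memory in GB for an LLM of the given size."""
    return params_billions * bytes_per_param * (1 + overhead)

# A ~3B-parameter on-device model needs roughly 3.6 GB by this estimate,
# leaving little headroom on an 8 GB machine once the OS and apps load.
for size in (3, 7, 13):
    print(f"{size}B params -> ~{llm_ram_gb(size):.1f} GB")
```

Even a small on-device model eats a big slice of an 8 GB machine, which is why bumping the base spec makes sense.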
I’ve heard a lot of praise from paranoid Linux users, and they all like that Apple Intelligence works locally (we talked back when the first information came out).
The claim is that Apple's own hosted cloud computing for AI (Private Cloud Compute) is also secure, and that Apple has no way of knowing what it's computing for you.
No clue if it's true, but that is the direction all cloud AI stuff should go.
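Part of that direction is cryptographic attestation: the client checks a signed measurement of the server's software before sending anything, rather than just trusting a promise. A toy stdlib-only sketch of the idea, where the key, "firmware" strings, and function names are all hypothetical stand-ins (real systems use a hardware root of trust, not a shared secret):

```python
import hashlib
import hmac

# Toy attestation sketch: the client only sends a request if the server
# proves it is running known-good software. The shared key and firmware
# blobs here are hypothetical; real designs anchor this in hardware.

ATTESTATION_KEY = b"hardware-root-of-trust"  # stand-in for a secure element
KNOWN_GOOD = hashlib.sha256(b"audited-inference-stack-v1").hexdigest()

def server_attest(firmware: bytes) -> tuple[str, str]:
    """Server reports a measurement of its software, signed by 'hardware'."""
    measurement = hashlib.sha256(firmware).hexdigest()
    signature = hmac.new(ATTESTATION_KEY, measurement.encode(), "sha256").hexdigest()
    return measurement, signature

def client_verify(measurement: str, signature: str) -> bool:
    """Client checks the signature and that the build is on its allowlist."""
    expected = hmac.new(ATTESTATION_KEY, measurement.encode(), "sha256").hexdigest()
    return hmac.compare_digest(expected, signature) and measurement == KNOWN_GOOD

m, s = server_attest(b"audited-inference-stack-v1")
print(client_verify(m, s))    # known-good software: request proceeds
m2, s2 = server_attest(b"logging-enabled-stack")
print(client_verify(m2, s2))  # unknown software: client refuses to send data
```

The point of schemes like this is that "trust us" becomes "verify the exact software build before your data leaves the device," which is roughly the property Apple claims.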
There's "semantic search" on iPhones. It looks pretty much like Recall on Windows, down to storing the data on the end device.
Is it storing it or is it accessing the original content?
"Hey Siri, summarize finances.txt" with it opening the file to summarize is different from it keeping its own copy stored somewhere.
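That distinction can be made concrete: a pointer-based index stores only an embedding plus the file path, and re-opens the original at query time instead of keeping a copy. A minimal sketch, where the hash-derived "embedding" is a placeholder for a real model:

```python
import hashlib
from pathlib import Path

# Sketch of a pointer-based semantic index: it stores a vector and a
# file path, never the file's text. (Real systems embed with an ML
# model; the hash-derived vector here is just a placeholder.)

def fake_embed(text: str) -> list[int]:
    digest = hashlib.sha256(text.lower().encode()).digest()
    return list(digest[:8])

index: dict[str, list[int]] = {}  # path -> embedding only, no content

def index_file(path: Path) -> None:
    index[str(path)] = fake_embed(path.read_text())

def summarize(path: str) -> str:
    # Content is read from the original file at query time,
    # not pulled from a stored copy.
    text = Path(path).read_text()
    return text[:40]

p = Path("finances.txt")
p.write_text("rent 1200, groceries 340, savings 500")
index_file(p)
assert "rent" not in str(index)  # the index holds no copy of the text
print(summarize("finances.txt"))
```

Whether Apple's implementation works this way is exactly the open question in the comment above; the sketch just shows the two designs are meaningfully different.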
The incentives aren’t there for Apple. It makes money selling you a product that you trust; if that trust were violated, it would be a threat to their business. It’s Google, Meta, and Microsoft that make their money collecting your data to target you with ads.