Privacy & Ownership

Your photos feed their
algorithms. Not anymore.

Every photo you store with Google, Apple, or Meta is analyzed, processed, and used — in ways most people have never read about. This page explains exactly what happens to your photos on their servers, and what stops the moment you bring them home.

What the platforms actually do

Straight from their
own terms of service.

These aren't interpretations or speculation. The following comes directly from Google's and Apple's own policy documents and public statements. You agreed to all of it — most people just didn't read it.

Google Photos
Used by 4 billion+ people
Content Analysis
Google's terms state plainly that their automated systems analyze your content — including photos — "as the content is sent, received, and when it is stored." This analysis is used to personalize ads, search results, and product features.
Source: Google Terms of Service, policies.google.com/terms
License Grant
By uploading to Google Photos, you grant Google a license to use your content to develop new services. Google's privacy policy states they "use information to help develop new" products — Picasa data, for example, was used to design and train Google Photos.
Source: Google Privacy Policy, policies.google.com/privacy
AI Training
Google uses the data collected across its services, including Photos, to develop and improve machine learning and AI systems. Your vacation photos, family portraits, and personal moments contribute to that work.
Source: Google Privacy Policy, policies.google.com/privacy
Apple iCloud
Over 1 billion active devices
Enhanced Visual Search
In late 2024, Apple quietly activated a feature that analyzes photos on your device and sends encrypted data to Apple's servers — even if you've opted out of iCloud Photos. The feature, which identifies landmarks in your photos, was enabled by default with no prominent disclosure.
Source: The Register, January 2025
The CSAM Moment
In 2021, Apple announced plans to scan every iCloud photo against a database before upload. The backlash from privacy researchers was immediate and severe — over 90 civil society organizations called it a surveillance framework. Apple ultimately abandoned the plan, but the architecture it revealed showed exactly what was technically possible.
Source: MacRumors, EFF, New America Foundation, 2021–2022
The Key Point
Apple's own engineers admitted during the CSAM debate that scanning can't be made safe without creating systems that can be repurposed — by Apple, by governments, by anyone who applies sufficient pressure. The capability exists. The policy can change.
Source: Apple statement to Wired, via Macworld, 2023
Meta (Facebook)
Facebook, Instagram, WhatsApp
Facial Recognition
Meta used facial recognition on photos uploaded to Facebook for years before regulatory pressure forced changes. The technology mapped faces across billions of photos without meaningful user consent. The data collected during that period remains on their servers.
Source: FTC settlement record, 2020
Ad Targeting
Photos you post on Instagram are analyzed for content — what's in the image, where it was taken, what products appear — and that analysis informs the ads shown to you and others. This is core to Meta's business model, not a side effect of it.
Source: Meta Privacy Policy, privacycenter.meta.com
AI Training
In 2024, Meta began using public posts — including photos — to train its AI models across Facebook and Instagram. Users in the EU were able to opt out following regulatory pressure. Users in the US were not offered the same option.
Source: Meta AI training policy, 2024
When you move to Immich

Here's what stops
immediately.

The moment your photos are on your own hardware and your phone is backing up to Immich instead of Google or Apple, the following things become structurally impossible — not because we promise it, but because there's no server to send data to.

No content analysis
Immich runs entirely on your hardware. No company receives your photos. No automated system scans them for patterns, content, or faces on a remote server. What's in your library stays in your library.
No ad targeting from your photos
Your photos can't be used to build an advertising profile if no ad platform ever receives them. The connection between your camera roll and a commercial interest is severed completely.
No AI training contribution
Your family portraits, your children's birthdays, your private moments — none of it feeds a foundation model. Immich's on-device AI features process your photos locally and the results never leave your network.
No government access pathway
A government subpoena to Google or Apple produces your photos. A subpoena to a server in your home that only you control produces nothing — because there's no company to compel. This is a structural difference, not a policy one.
No policy changes that affect you
Google changed its Photos storage policy in 2021. Apple activated on-device scanning by default in 2024 without prominent notice. When your photos are on your hardware, platform policy changes are irrelevant. You're not on their platform.
No subscription required to see your own memories
Immich is free, open-source software. Once it's running on your Umbrel, your photos are there permanently — no monthly fee, no storage tier, no risk of losing access because a payment fails or a plan is discontinued.
Local by design

Everything Google Photos does.
None of what it takes.

Immich is the leading open-source Google Photos replacement. It runs entirely on hardware in your home — the Umbrel home server we ship to you. Your phone's Immich app backs up every photo you take automatically, over your home wifi, to a device you own.

Every feature that makes Google Photos useful — the ones that feel almost magical — is replicated in Immich. The difference is where the computation happens. On Google's infrastructure, it happens on their servers with your data. On Immich, it happens on the device in your home, with no data ever leaving your network.

"Children can be protected without companies combing through personal data."
— Apple, explaining why it abandoned iCloud photo scanning, December 2022. The same logic applies to every other reason these platforms want access to your photos.
Facial recognition — runs locally
Immich identifies faces in your photos and groups them automatically. The machine learning model runs on your Umbrel's processor. No face data is ever sent anywhere.
Smart search — runs locally
Search for "beach 2019" or "Grandma's birthday" and Immich finds it. The search index is built from your photos on your device, not on a remote server trained on everyone's libraries.
Automatic phone backup — over your home wifi
Every photo you take backs up automatically when your phone connects to home wifi. No cloud intermediary. The photo goes from your phone to your Umbrel and nowhere else.
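If you're curious what that backup path looks like under the hood, here's a rough sketch using Immich's official command-line tool. The server address and API key below are placeholders for your own setup, and exact command names can vary between Immich versions — check the Immich CLI docs for your release:

```shell
# Install the Immich CLI (requires Node.js on the machine doing the upload)
npm install -g @immich/cli

# Authenticate against your own server on the local network.
# "umbrel.local" and YOUR_API_KEY are placeholders for your setup.
immich login http://umbrel.local:2283/api YOUR_API_KEY

# Recursively upload a folder of photos into your Immich library
immich upload --recursive ~/Pictures
```

The phone app does the same thing automatically: it talks to that local address over your home wifi, so the transfer never touches an outside server.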
Shared albums — private by default
Share albums with family members who also connect to your Immich. No third-party server sees those shared photos. The sharing happens within your home network.
Memories, map view, timeline — all local
The features that make a photo library feel alive — surfacing old memories, showing where photos were taken, organizing by date — all work in Immich without any data leaving your home.
Remote access — through your own tunnel
Want to access your photos away from home? You can set up secure remote access through Tailscale, a private VPN. Your photos travel between your phone and your home server — not through a third-party cloud.
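For a concrete picture of that setup, the usual pattern looks roughly like the following — assuming Tailscale is already installed on both the Umbrel and your phone, and with "umbrel" standing in for whatever hostname your server gets on your tailnet:

```shell
# On the home server: join your private tailnet
sudo tailscale up

# List devices on the tailnet and find the server's address
tailscale status

# From any device on the same tailnet, Immich is then reachable
# at the server's Tailscale hostname, e.g.:
#   http://umbrel:2283
```

Tailscale connections are end-to-end encrypted with WireGuard, so even when a connection has to be relayed, only your own devices can read the traffic.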

Your photos.
Your hardware. Your call.

Getting off these platforms doesn't have to be a technical project. We handle the entire migration — pulling your library, loading it onto your Umbrel, and shipping it ready to plug in. You cancel the subscription when you're ready.

See How It Works →