Biometric Data Privacy Hygiene
The most effective surveillance defense is reducing what's available to collect. This section covers metadata stripping, data broker opt-outs, account compartmentalization, and systematic biometric exposure reduction. Most long-term tracking risk comes from accumulated data, not real-time detection.
Prevention Over Countermeasures
Biometric Data Lifecycle
Collection Points
- Social media: photos with face data, voice in videos, writing style
- Photo metadata: EXIF GPS coordinates, device serial numbers, timestamps
- Voice assistants: voice recordings retained for "model improvement"
- Fitness apps: gait patterns, heart rate, location trails
- Payment systems: transaction patterns, location, timing
- Public records: DMV photos, passport data, court filings
Aggregation & Correlation
- Data brokers: merge records from 100+ sources per individual
- Face search engines: PimEyes, Clearview index billions of photos
- Identity graphs: connect email → phone → address → social → photo
- Device fingerprinting: browser, OS, screen, fonts create unique ID
- Cross-platform tracking: same email/phone links separate personas
- Advertising IDs: IDFA/GAID enable re-identification across apps
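Identity-graph linking is mechanically simple: any shared identifier merges two records into one profile. A minimal union-find sketch (all records and identifiers below are hypothetical):

```python
# Toy identity graph: records sharing ANY identifier value merge into one
# profile via union-find. All records below are hypothetical examples.
def build_identity_graph(records):
    parent = list(range(len(records)))

    def find(i):
        # Path-halving find
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    seen = {}  # identifier value -> first record index that used it
    for idx, rec in enumerate(records):
        for value in rec.values():
            if value in seen:
                union(seen[value], idx)  # shared identifier links the records
            else:
                seen[value] = idx

    profiles = {}
    for idx in range(len(records)):
        profiles.setdefault(find(idx), []).append(idx)
    return list(profiles.values())

records = [
    {"email": "j@x.com", "phone": "+1555"},    # record 0
    {"phone": "+1555", "addr": "12 Main St"},  # record 1: shares phone with 0
    {"email": "other@y.com"},                  # record 2: unlinked
]
print(build_identity_graph(records))  # → [[0, 1], [2]]
```

Real identity graphs add fuzzy matching (name variants, address normalization), which is why compartmentalization must avoid even near-matching identifiers, not just exact reuse.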
High-Impact Privacy Controls
| Control | Impact | Effort | Target | Notes |
|---|---|---|---|---|
| EXIF/metadata stripping | High | Low | Location, device | Automate with ExifTool batch scripts |
| Data broker opt-outs | High | Medium | Identity graphs | Requires ongoing monitoring (listings often reappear in 3-6 months) |
| Account compartmentalization | High | Medium | Cross-platform linking | Separate email, phone, payment per persona |
| Photo content reduction | Medium | Low | Face embeddings | Limit high-res frontal photos on public profiles |
| Voice data restrictions | Medium | Low | Voiceprint | Disable voice assistant history retention |
| Advertising ID reset | Medium | Low | App tracking | iOS: Limit Ad Tracking / Android: Delete advertising ID |
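One way to sequence the table above is a rough impact-over-effort score; the numeric weights below are arbitrary illustrations, not measured values.

```python
# Rough prioritization of the controls table: impact divided by effort.
# The 1/2/3 weights are arbitrary illustrations, not measured values.
LEVEL = {"Low": 1, "Medium": 2, "High": 3}

controls = [
    ("EXIF/metadata stripping", "High", "Low"),
    ("Data broker opt-outs", "High", "Medium"),
    ("Account compartmentalization", "High", "Medium"),
    ("Photo content reduction", "Medium", "Low"),
    ("Voice data restrictions", "Medium", "Low"),
    ("Advertising ID reset", "Medium", "Low"),
]

# Higher score = more privacy gained per unit of effort
ranked = sorted(controls, key=lambda c: LEVEL[c[1]] / LEVEL[c[2]], reverse=True)
for name, impact, effort in ranked:
    print(f"{LEVEL[impact] / LEVEL[effort]:.1f}  {name}")
# Metadata stripping ranks first: high impact at low effort.
```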
Metadata Stripping Playbook
Comprehensive metadata removal for photos, videos, and documents before sharing or uploading.
#!/bin/bash
# Prerequisites: apt install libimage-exiftool-perl ffmpeg (or brew install exiftool ffmpeg on macOS)
#
# Comprehensive metadata stripping for biometric privacy hygiene
# Removes EXIF, IPTC, XMP, GPS, and device-specific metadata
echo "=== Photo Metadata Hygiene ==="
# Strip ALL metadata from a single image
exiftool -all= image.jpg
# Batch strip entire folder (preserve file dates)
# ⚠ -overwrite_original permanently destroys metadata — back up originals first if needed
exiftool -all= -overwrite_original -P ./photos/*.jpg
# Preview what would be removed (dry run)
exiftool -all -s ./photos/sample.jpg
# --- Targeted removal (keep copyright but strip GPS/device) ---
exiftool -gps:all= -Make= -Model= -SerialNumber= \
-Software= -LensModel= -overwrite_original ./photos/*.jpg
# --- Video metadata stripping ---
# FFmpeg strips most metadata during re-encoding
ffmpeg -i input.mp4 -map_metadata -1 -c:v copy -c:a copy output_clean.mp4
# --- PDF metadata stripping ---
exiftool -all= -overwrite_original document.pdf
# --- Verify metadata is clean ---
echo ""
echo "=== Verification ==="
exiftool -G1 -s output_clean.mp4 | head -20
echo ""
echo "GPS tags remaining:"
exiftool -gps:all ./photos/*.jpg | grep -i "gps"
# Representative output (exact counts depend on folder contents):
# === Photo Metadata Hygiene ===
#     1 image files updated
#     3 image files updated
# ...
# === Verification ===
# [ExifTool]      ExifTool Version Number : 12.76
# [System]        File Name               : output_clean.mp4
# ...
# GPS tags remaining:
# (no output here means all GPS tags were removed)
Data Broker Exposure Audit
Systematic audit of major data brokers with priority ranking for opt-out requests.
#!/usr/bin/env python3
# No external dependencies — uses Python stdlib only
# ⚠ Data broker URLs and response times are approximate and may change — verify current opt-out URLs before submitting
"""Data broker exposure audit — identify where your biometric data may be stored.
Automates the discovery and opt-out request process for major data brokers."""
import csv
from dataclasses import dataclass
from typing import List
@dataclass
class DataBroker:
name: str
category: str # 'people-search', 'background-check', 'advertising', 'biometric'
data_types: List[str] # e.g., ['photo', 'address', 'phone', 'social']
opt_out_url: str
opt_out_method: str # 'web-form', 'email', 'mail', 'phone'
verification_needed: str # 'email', 'id', 'none'
response_time_days: int
gdpr_applies: bool
ccpa_applies: bool
notes: str = ""
# Major data brokers with biometric/photo exposure risk
BROKERS = [
DataBroker("Spokeo", "people-search", ["photo", "address", "phone", "social", "email"],
"https://www.spokeo.com/optout", "web-form", "email", 3, False, True),
DataBroker("BeenVerified", "people-search", ["photo", "address", "phone", "criminal"],
"https://www.beenverified.com/app/optout/search", "web-form", "email", 7, False, True),
DataBroker("Whitepages", "people-search", ["photo", "address", "phone"],
"https://www.whitepages.com/suppression-requests", "web-form", "phone", 3, False, True),
DataBroker("PimEyes", "biometric", ["face-embedding", "photo", "source-url"],
"https://pimeyes.com/en/opt-out", "web-form", "email", 14, True, True,
"Face recognition search engine — HIGH PRIORITY for biometric privacy"),
DataBroker("Clearview AI", "biometric", ["face-embedding", "photo", "social-source"],
"https://clearview.ai/privacy/requests", "web-form", "email", 45, True, True,
"Law enforcement face recognition — opt-out may not apply to LE use"),
DataBroker("ThatsThem", "people-search", ["address", "phone", "email", "ip"],
"https://thatsthem.com/optout", "web-form", "none", 5, False, True),
DataBroker("Radaris", "people-search", ["photo", "address", "phone", "property"],
"https://radaris.com/page/how-to-remove", "web-form", "email", 14, False, True),
DataBroker("FastPeopleSearch", "people-search", ["address", "phone", "email"],
"https://www.fastpeoplesearch.com/removal", "web-form", "none", 2, False, True),
]
def generate_audit_report(brokers: List[DataBroker], output_file: str = "broker_audit.csv"):
"""Generate opt-out tracking spreadsheet."""
with open(output_file, "w", newline="") as f:
writer = csv.writer(f)
writer.writerow([
"Broker", "Category", "Data Types", "Opt-Out URL", "Method",
"Verification", "Expected Response (days)", "GDPR", "CCPA",
"Request Sent", "Response Received", "Status", "Notes"
])
for b in brokers:
writer.writerow([
b.name, b.category, "; ".join(b.data_types), b.opt_out_url,
b.opt_out_method, b.verification_needed, b.response_time_days,
"Yes" if b.gdpr_applies else "No",
"Yes" if b.ccpa_applies else "No",
"", "", "Pending", b.notes
])
print(f"Audit report generated: {output_file}")
print(f"Total brokers: {len(brokers)}")
# Priority ranking
biometric = [b for b in brokers if "face-embedding" in b.data_types or "photo" in b.data_types]
print(f"Biometric exposure risk: {len(biometric)} brokers")
print("")
print("Priority opt-out order (biometric data first):")
for i, b in enumerate(sorted(biometric, key=lambda x: "biometric" in x.category, reverse=True), 1):
print(f" {i}. {b.name} ({b.category}) — {b.opt_out_method}")
generate_audit_report(BROKERS)
# Expected output:
# Audit report generated: broker_audit.csv
# Total brokers: 8
# Biometric exposure risk: 6 brokers
#
# Priority opt-out order (biometric data first):
#   1. PimEyes (biometric) — web-form
#   2. Clearview AI (biometric) — web-form
#   3. Spokeo (people-search) — web-form
#   4. BeenVerified (people-search) — web-form
#   5. Whitepages (people-search) — web-form
#   6. Radaris (people-search) — web-form
Account Compartmentalization Audit
Detect shared identifiers that could link separate personas together, enabling cross-account correlation.
#!/usr/bin/env python3
# No external dependencies — uses Python stdlib only
"""Identity compartmentalization audit — detect cross-account linkability.
Maps shared identifiers that could link separate personas together."""
from collections import defaultdict
def audit_compartmentalization(accounts):
"""Analyze identity isolation between accounts.
accounts: list of dicts with keys:
persona, platform, email, phone, ip_used, browser_fingerprint,
payment_method, display_name, photo_hash
"""
# Group all identifiers by persona
personas = defaultdict(lambda: defaultdict(set))
for acct in accounts:
persona = acct["persona"]
# photo_hash: SHA-256 hash of profile photo file — generate with: hashlib.sha256(open(photo,'rb').read()).hexdigest()[:12]
for key in ["email", "phone", "ip_used", "browser_fingerprint",
"payment_method", "display_name", "photo_hash"]:
if acct.get(key):
personas[persona][key].add(acct[key])
# Cross-persona linking analysis
print("=== Compartmentalization Audit ===")
print("")
all_personas = list(personas.keys())
link_risks = []
for i, p1 in enumerate(all_personas):
for p2 in all_personas[i+1:]:
shared = {}
for identifier_type in personas[p1]:
overlap = personas[p1][identifier_type] & personas[p2][identifier_type]
if overlap:
shared[identifier_type] = overlap
if shared:
risk_level = "HIGH" if len(shared) >= 2 else "MEDIUM"
link_risks.append({
"persona_a": p1, "persona_b": p2,
"shared_identifiers": shared,
"risk": risk_level
})
if link_risks:
print(f"⚠ Found {len(link_risks)} cross-persona linkage risks:")
for risk in link_risks:
print(f" [{risk['risk']}] {risk['persona_a']} ↔ {risk['persona_b']}")
for id_type, values in risk["shared_identifiers"].items():
masked = [v[:4] + "***" for v in values]
print(f" Shared {id_type}: {', '.join(masked)}")
else:
print("✓ No cross-persona linkage detected")
# Per-persona summary
print("")
print("=== Per-Persona Identifier Inventory ===")
for persona, identifiers in personas.items():
total = sum(len(v) for v in identifiers.values())
print(f" {persona}: {total} identifiers across {len(identifiers)} types")
# Example audit
accounts = [
{"persona": "work", "platform": "LinkedIn", "email": "john@company.com",
"phone": "+1555123", "ip_used": "100.1.2.3", "photo_hash": "abc123"},
{"persona": "work", "platform": "GitHub", "email": "john@company.com",
"ip_used": "100.1.2.3", "photo_hash": "abc123"},
{"persona": "personal", "platform": "Instagram", "email": "j_doe@gmail.com",
"phone": "+1555123", # SAME PHONE — linkage risk!
"ip_used": "100.1.2.3", # SAME IP — linkage risk!
"photo_hash": "def456"},
]
audit_compartmentalization(accounts)
# Expected output:
# === Compartmentalization Audit ===
#
# ⚠ Found 1 cross-persona linkage risks:
#   [HIGH] work ↔ personal
#     Shared phone: +155***
#     Shared ip_used: 100.***
#
# === Per-Persona Identifier Inventory ===
#   work: 4 identifiers across 4 types
#   personal: 4 identifiers across 4 types
Data Broker Re-listing
Biometric Exposure Reduction Workflow
1. Inventory Public Exposure
   Search for your name, photos, and phone number across Google Images, PimEyes (biometric search), social platforms, and data broker sites. Document all public-facing biometric data (photos, videos, voice recordings).
2. Prioritize Removals
   Highest priority: biometric-specific sites (PimEyes, Clearview AI). Second: people-search sites with photos. Third: social media posts with high-resolution frontal face images. Fourth: metadata-rich media on public uploads.
3. Submit Opt-Out Requests
   Use GDPR Article 17 (right to erasure) for EU-based services or CCPA "Do Not Sell" for California-based services. For others, use their opt-out forms directly. Track all submissions in a spreadsheet with dates and response status.
4. Harden Active Accounts
   Restrict photo visibility to connections-only. Disable profile indexing by search engines. Remove high-resolution frontal photos. Disable voice assistant data retention. Compartmentalize with separate emails/phones per persona.
5. Establish Ongoing Hygiene
   Set quarterly calendar reminders to re-check data broker listings. Automate EXIF stripping for all media before upload. Review account privacy settings after platform updates. Monitor for new biometric search engines.
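The opt-out requests in step 3 can be templated so every submission is consistent and trackable. A sketch (the broker name, statute citation, and contact details are placeholders, and the wording is illustrative, not legal language):

```python
# Sketch: generate a consistent deletion-request body per broker.
# Broker name, statute, and contact details below are placeholders;
# this is illustrative wording, not legal advice.
TEMPLATE = """To: {broker} Privacy Team

I request deletion of all personal data you hold about me, including
photographs and any derived biometric identifiers, under {law}.

Details for locating my records:
  Name:  {name}
  Email: {email}

Please confirm completion in writing within the statutory deadline.
"""

def deletion_request(broker, law, name, email):
    return TEMPLATE.format(broker=broker, law=law, name=name, email=email)

print(deletion_request("ExampleBroker", "CCPA Section 1798.105",
                       "J. Doe", "j.doe@example.com"))
```

Generating one body per row of the broker audit spreadsheet keeps the text and the tracking log in sync.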
Browser Fingerprinting Defense
Browser fingerprinting creates a unique device identifier without cookies by combining rendering differences, hardware properties, and API responses. Defending against it requires either blending in with a common fingerprint or blocking the APIs entirely.
Fingerprinting Techniques
- Canvas fingerprinting: Draws hidden text/graphics and reads pixel data — rendering varies by GPU, driver, and OS
- WebGL fingerprinting: Queries GPU renderer string, shader precision, and supported extensions for hardware identification
- AudioContext fingerprinting: Processes audio through the Web Audio API — subtle floating-point differences create a unique hash
- Font enumeration: Detects installed fonts via rendering width measurements
- Navigator/Screen: User-Agent, screen resolution, timezone, language, hardware concurrency, device memory
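Why combining a few attributes is enough to single out a browser: each attribute contributes identifying bits (its surprisal, -log2 of how common the value is), and the bits add when attributes are independent. The frequencies below are illustrative assumptions, not measured population data.

```python
import math

# Identifying bits per attribute: surprisal = -log2(frequency).
# Frequencies are illustrative assumptions, not measured population data.
attribute_freq = {
    "user_agent":        0.015,    # assume 1.5% of browsers share this UA
    "screen_1920x1080":  0.25,
    "timezone":          0.12,
    "font_list_hash":    0.002,
    "canvas_hash":       0.0005,
}

total_bits = 0.0
for name, freq in attribute_freq.items():
    bits = -math.log2(freq)
    total_bits += bits
    print(f"{name:18s} {bits:5.1f} bits")

# ~31 bits combined: enough to single out one browser among roughly
# two billion, assuming the attributes are independent.
print(f"combined: {total_bits:.1f} bits")
```

This is why the defenses below aim either to make every user's attributes identical (Tor) or to randomize the high-entropy ones per session (Brave).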
Defense Tools
- Tor Browser: Best overall — all users share an identical fingerprint by design. Blocks Canvas, WebGL, AudioContext by default
- Mullvad Browser: Firefox-based, with Tor Browser's fingerprinting resistance but without the Tor network. Good for non-onion use
- Firefox + resistFingerprinting: Set `privacy.resistFingerprinting = true` in about:config — spoofs timezone, screen size, user-agent
- Brave Browser: Built-in fingerprinting protection that randomizes Canvas, WebGL, and AudioContext outputs per-session
Testing Your Fingerprint
- Cover Your Tracks (EFF): `coveryourtracks.eff.org` — tests uniqueness and tracking protection effectiveness
- BrowserLeaks: `browserleaks.com` — detailed breakdown of Canvas, WebGL, fonts, audio, and WebRTC leaks
- CreepJS: `abrahamjuliot.github.io/creepjs` — advanced fingerprinting test that detects spoofing attempts
- Goal: Your fingerprint should either be identical to millions of other users (Tor) or randomized per-session (Brave)
Practical Considerations
- Aggressive fingerprinting protection breaks many websites — use separate browser profiles (work vs. private)
- Browser extensions themselves add to your fingerprint — minimize extensions in privacy browsers
- VPN + fingerprinting defense together are stronger than either alone
- JavaScript-disabled browsing eliminates most fingerprinting but severely limits functionality
Email Alias Services Comparison
Email aliases prevent cross-service identity correlation by giving each account a unique forwarding address. If one alias is compromised or sold, it doesn't expose your real email or link to other services.
| Service | Open Source | Free Tier | Self-Hostable | Key Details |
|---|---|---|---|---|
| SimpleLogin | Yes | 15 aliases | Yes | Acquired by Proton AG. PGP encryption support. Browser extension for auto-generation. |
| Firefox Relay | Partial | 5 aliases | No | Mozilla-backed. Premium ($1.99/mo) adds phone number masking and unlimited aliases. |
| addy.io (AnonAddy) | Yes | Unlimited (shared domain) | Yes | Unlimited aliases on paid plan with custom domains. GPG/OpenPGP encryption. |
| iCloud Hide My Email | No | Unlimited (iCloud+) | No | Apple ecosystem only. Built into iOS/macOS. Requires iCloud+ subscription ($0.99/mo+). |
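A naming convention makes aliases auditable: embed the service name in each alias so a leaked or sold address immediately identifies its source. A sketch (the domain is a placeholder, not a real alias provider):

```python
import secrets

# One alias per service, with the service name embedded: if the alias
# leaks or gets sold, the source is immediately attributable.
# The domain below is a placeholder, not a real alias provider.
def make_alias(service: str, domain: str = "alias.example.com") -> str:
    suffix = secrets.token_hex(3)  # random tag so aliases can't be guessed
    return f"{service.lower()}.{suffix}@{domain}"

print(make_alias("Newsletter"))  # e.g. newsletter.3fa9c1@alias.example.com
```

The random suffix matters: without it, anyone could guess `servicename@yourdomain` and bypass the compartmentalization.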
Mobile Hardening Checklist
Systematic privacy hardening for your primary mobile device. Complete both platform-specific sections for comprehensive coverage.
iOS Hardening
- ☐ Enable Lockdown Mode (Settings > Privacy & Security > Lockdown Mode)
- ☐ Enable App Tracking Transparency — deny all cross-app tracking requests
- ☐ Disable Significant Locations (Settings > Privacy > Location Services > System Services)
- ☐ Set all apps to "While Using" or "Never" for location access — disable Precise Location for non-navigation apps
- ☐ Enable Mail Privacy Protection (Settings > Mail > Privacy Protection)
- ☐ Use iCloud Private Relay and Hide My Email for Safari and account signups
- ☐ Disable Siri data sharing (Settings > Siri > Siri & Dictation History > Delete)
- ☐ Review Safety Check (Settings > Privacy & Security > Safety Check) to audit shared access
Android Hardening
- ☐ Delete advertising ID (Settings > Privacy > Ads > Delete advertising ID)
- ☐ Disable Wi-Fi and Bluetooth scanning (Settings > Location > Wi-Fi/Bluetooth scanning)
- ☐ Remove unused apps and revoke background location for all non-essential apps
- ☐ Install a DNS-based tracker blocker (NextDNS, AdGuard) as Private DNS
- ☐ Disable 2G fallback (Settings > Network > SIMs > Allow 2G — toggle off)
- ☐ Consider GrapheneOS or CalyxOS for maximum privacy — requires supported Pixel device
- ☐ Use Shelter or work profiles to sandbox social media and shopping apps
- ☐ Audit app permissions monthly — check for newly requested permissions after updates
Hardening is Iterative
Quick Wins (Top 5 Actions)
- Strip EXIF from all photos before uploading anywhere — automate with shell alias
- Opt out of PimEyes — this is the most accessible biometric face search engine
- Use unique email aliases per service (e.g., SimpleLogin, Firefox Relay, Apple iCloud+)
- Reset advertising IDs on all mobile devices and disable tracking permissions
- Restrict social media photos to connections-only and remove high-res frontal images
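For the first quick win, a stdlib-only heuristic can verify that stripping actually worked before you upload: it checks whether a JPEG still carries the EXIF APP1 signature in its header region (assumes JPEG input; other formats store metadata differently).

```python
# Heuristic pre-upload check: a JPEG's EXIF block lives in an APP1 segment
# whose payload begins with the ASCII signature "Exif" plus two null bytes.
# If that signature is absent from the header region, stripping likely worked.
# Assumes JPEG input; other formats store metadata differently.
def has_exif(path, scan_bytes=65536):
    with open(path, "rb") as f:
        header = f.read(scan_bytes)
    return b"Exif\x00\x00" in header

# Usage sketch: check a folder before uploading
# import pathlib
# for p in pathlib.Path("photos").glob("*.jpg"):
#     print(p, "EXIF PRESENT" if has_exif(p) else "clean")
```

This is a sanity check, not a replacement for `exiftool` verification; it won't catch IPTC/XMP blocks.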
Data Privacy Labs
Hands-on exercises for biometric data hygiene and exposure reduction.