The Weight of Technology – 740 pages – Now available in Kindle & Paperback

Are rideshare apps “selling your schedule”? What Uber, Lyft, DoorDash (and others) actually do with route data — and why it matters

A rumor like “Uber is selling ride schedules and route data to the highest bidder” spreads fast because it sounds like something the modern data economy would do. But the truth is usually more specific — and the specifics are where the real risk lives.

Here’s the factual, documented baseline:

  • Uber publicly states that it discloses (“shares”) personal information with third parties for targeted advertising, and notes that these disclosures may be considered a “sale” or “sharing” under certain US state privacy laws. (Uber)
  • Uber’s own ad-related privacy page explicitly says it may share things like trip/order/search history and even “current trip information” with advertising/marketing partners and ad tech intermediaries, depending on settings. (Uber)
  • Lyft similarly acknowledges categories of information it may have “sold or shared” for cross-context behavioral advertising and analytics, including geolocation information. (Lyft)
  • DoorDash provides opt-outs for the “sale or sharing” of personal information used for targeted advertising and ad personalization, and California’s Attorney General has alleged that DoorDash “traded” customer personal information (names, addresses, transaction histories) through marketing cooperatives, treating that exchange as a “sale” under the CCPA and settling with injunctive terms. (DoorDash; California DOJ)

So: is Uber selling your exact route + schedule as a product listing to random buyers?
I did not find credible evidence of that specific claim in public, primary documentation. What is clearly documented is that rideshare and delivery companies can share (and, in certain legal definitions, “sell/share”) personal data for advertising ecosystems and business partnerships — and that ecosystem creates real vulnerabilities even if nobody is literally auctioning off “Jeremy’s Tuesday 7:10AM commute.”

Let’s break it down properly.


1) What these services are (and why “location” is the whole engine)

Rideshare and delivery apps (Uber, Lyft, DoorDash, etc.) are essentially:

  • Real-time logistics platforms (matching supply and demand)
  • Navigation + routing systems (fastest path, dynamic reroutes, ETAs)
  • Payment processors
  • Trust & safety platforms (identity checks, fraud detection, incident review)
  • Increasingly: advertising and retail-media networks (ads inside apps, measurement, targeting)

To function, they must answer questions like:

  • Where are you now?
  • Where are you going?
  • When do you usually travel/order?
  • What’s the best route?
  • Did the trip happen as claimed?
  • Was there suspicious behavior?
  • What offers might convert you next time?

That means your “route data” isn’t a side effect — it’s foundational.


2) What kind of data gets collected (routes are only one layer)

Even if you never type your home address into a profile, these platforms commonly end up with a behavioral map that can include:

Identity & account data

  • Name, phone/email, device identifiers, payment tokens

Trip/order metadata

  • Pickup + dropoff points, time/date, distance, ETA, price, cancellations

Location traces

  • Precise GPS (often), background location during active sessions, inferred frequent places

“Inferences” (the spooky part)

Once enough trips exist, it becomes easy to infer:

  • Home/work
  • Regular schedule patterns
  • Airport habits
  • Nightlife routines
  • Medical visits (by destination category)
  • Relationship patterns (repeat pickups/dropoffs at the same non-home locations)

Lyft explicitly lists “ride information” such as routes and destination types, and also cites geolocation and inferences as data categories tied to advertising and analytics contexts. (Lyft)
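To make the inference point concrete, here is a minimal sketch of how a handful of trip records is enough to guess “home” and “work.” All trip data, place labels, and thresholds below are invented for illustration; real systems use GPS clustering rather than string matching, but the logic is the same:

```python
from collections import Counter

# Hypothetical trip log: (weekday, hour, pickup, dropoff). Labels are invented.
trips = [
    ("Mon", 7, "Maple St", "Office Park"),
    ("Tue", 7, "Maple St", "Office Park"),
    ("Wed", 8, "Maple St", "Office Park"),
    ("Fri", 23, "Downtown Bar", "Maple St"),
    ("Sat", 1, "Club Row", "Maple St"),
]

def infer_home_work(trips):
    """Guess 'home' as the most common late-night dropoff or morning pickup,
    and 'work' as the most common weekday-morning dropoff."""
    home_votes = Counter()
    work_votes = Counter()
    for day, hour, pickup, dropoff in trips:
        if hour >= 22 or hour <= 4:      # late-night rides usually end at home
            home_votes[dropoff] += 1
        if 6 <= hour <= 9:               # morning rides usually start at home
            home_votes[pickup] += 1
        if day not in ("Sat", "Sun") and 6 <= hour <= 10:
            work_votes[dropoff] += 1     # weekday-morning rides usually end at work
    home = home_votes.most_common(1)[0][0] if home_votes else None
    work = work_votes.most_common(1)[0][0] if work_votes else None
    return home, work

print(infer_home_work(trips))  # ('Maple St', 'Office Park')
```

Five trips, no profile data, and the pattern already falls out. Schedule inference (commute hour, nightlife nights) follows the same counting trick over the time fields.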


3) “Selling” vs “sharing” vs “using internally” — why the wording confuses everyone

The plain-English version

  • Using internally: “We use it to run the service.”
  • Sharing: “We send some of it to partners/vendors to operate, measure, or advertise.”
  • Selling (consumer privacy law definitions): In some laws (like California), “sale” can include exchanging data for benefit, not just cash. That’s why you’ll see opt-outs labeled “Do Not Sell/Share” even if a company says “we don’t sell data” in the everyday sense.

Uber’s US privacy page uses exactly this legal framing: disclosures for personalized advertising may be considered “sales” or “sharing” under certain laws, and it offers an opt-out. (Uber)

DoorDash’s California AG case is a real-world example of how broadly “sale” can be defined: the AG described DoorDash “trading” customer data (including names, addresses, transaction histories) in a marketing cooperative context. (California DOJ)


4) What Uber says it shares in ad contexts (and why “current trip info” matters)

Uber’s own advertising privacy FAQ states (depending on settings) it may share:

  • Ad/device identifiers or hashed contact info
  • App usage data (including trip, order, and search history)
  • Current trip information
  • It also lists major ad platforms and ad tech intermediaries as partner categories. (Uber)

This is not the same as “selling your exact route to a random stranger.”
But it’s still a big deal because:

  • “Current trip information” is time-sensitive location context
  • Trip history is routine modeling
  • Ad ecosystems are multi-hop (data can pass through layers of vendors and measurement partners)

Also worth noting: Uber’s ad business has been growing for years, and recent reporting describes Uber launching an “insights” product for marketers built around ride and delivery behavior, positioned as privacy-safe via clean-room approaches. (Business Insider)

Even when “privacy-safe” is the intent, any expansion of “data used for marketing intelligence” increases exposure surface area.
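It helps to see what “hashed contact info” actually means in practice. Ad platforms commonly match customer lists by having each side hash a normalized email address, so no raw address changes hands but the same person can still be matched across datasets. A minimal sketch of that pattern (the addresses are invented):

```python
import hashlib

def hash_email(email):
    """Normalize, then SHA-256: the common pattern for ad-platform list matching."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The app and the ad platform each hash their own copy of the address.
app_side = hash_email("  Rider@Example.com ")
ad_side = hash_email("rider@example.com")

print(app_side == ad_side)  # True: same person matched without sharing the raw email
```

Note the limitation: hashing makes data matchable, not anonymous. Anyone who already holds a list of email addresses can hash them the same way and link records right back, which is one reason “privacy-safe” sharing still expands the exposure surface.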


5) The real vulnerabilities if route/schedule data leaks, is shared too widely, or is resold downstream

Let’s talk realistic threat models — the stuff that can actually happen.

A) Stalking & personal safety risk (most direct)

If a bad actor gains access to:

  • Your frequent pickup point
  • Your frequent destination
  • The timing pattern (weekdays, late-night, gym days)

…they don’t need your “full GPS breadcrumb trail.” They just need predictability.

Worst-case outcomes:

  • Stalking at pickup spots
  • “Accidental” encounters at frequent destinations
  • Targeting someone leaving work at a predictable time

B) Domestic violence / coercive control scenarios

If someone with access to your phone/account (or an insider, or a compromised email) can see:

  • Past trips
  • Saved places
  • Receipts

…that becomes movement surveillance.

C) Burglary-by-routine inference

“Airport ride at 5:30AM” + “trip duration” + “home pickup” can imply nobody’s home.

Even without an outright sale, data broker ecosystems and downstream sharing (especially in marketing cooperative models) increase the risk that this kind of inference ends up outside the original app. The California AG’s statement on DoorDash explicitly warns that marketing co-ops can lead to downstream spread to data brokers. (California DOJ)
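The routine inference above is uncomfortably simple to automate. A sketch of the rule as a one-function heuristic (field names, categories, and the hour threshold are all invented for illustration):

```python
def likely_away_from_home(trip):
    """Flag trips suggesting an empty home: an early-morning pickup at the
    home address, dropped at an airport. Purely illustrative heuristic."""
    return (
        trip["pickup"] == trip["home_address"]
        and trip["dropoff_category"] == "airport"
        and trip["hour"] <= 6
    )

trip = {"pickup": "12 Maple St", "home_address": "12 Maple St",
        "dropoff_category": "airport", "hour": 5}
print(likely_away_from_home(trip))  # True
```

Three fields of trip metadata, no GPS trail required. That is why downstream spread of even “coarse” trip records matters.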

D) Sensitive-location inference

Repeated destinations can reveal:

  • Clinics
  • Addiction treatment locations
  • Religious institutions
  • Political events
  • Union activity
  • Legal services

Even “de-identified” datasets can be re-identified when movement patterns are unique.
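The re-identification claim is easy to demonstrate. Even with names stripped, a person’s (home, work) pair is often unique within a dataset, so anyone who knows someone’s two anchor points can pick their record out. A toy sketch with invented records:

```python
from collections import Counter

# "De-identified" records: no names, just each person's top two locations.
records = [
    ("Maple St", "Office Park"),
    ("Maple St", "Harbor Pier"),
    ("Oak Ave", "Office Park"),
    ("Oak Ave", "Office Park"),   # two people happen to share this pair
    ("Elm Rd", "City Hospital"),
]

pair_counts = Counter(records)
unique = sum(1 for count in pair_counts.values() if count == 1)
print(f"{unique} of {len(records)} records are pinned down by (home, work) alone")
# 3 of 5 records are pinned down by (home, work) alone
```

Published research on mobility traces has found far worse ratios in real datasets, because real movement patterns are highly distinctive; the toy numbers here just make the mechanism visible.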

E) Delivery-specific risks (DoorDash, Uber Eats, etc.)

Delivery adds a new layer:

  • You are explicitly providing a home/work address and often delivery instructions
  • Orders can reveal diet, religion, health, household composition
  • Multi-order routing can expose proximity patterns

And because delivery is address-centric, it’s inherently more “pinpointable” than many rideshare trips.

F) Breaches & vendor sprawl

The more parties that touch data (analytics, ads, attribution, measurement), the more chances for:

  • Breach
  • Misconfiguration
  • Over-collection
  • Insider abuse

You don’t need a conspiracy. You just need one weak link.


6) Practical ways to reduce risk (without uninstalling your life)

Here are realistic mitigations users can do today:

In-app privacy controls

  • Opt out of targeted advertising / “sale or sharing” where offered
    • Uber provides a US opt-out flow tied to its targeted-ads disclosures (Uber)
    • DoorDash describes an opt-out for the “sale or sharing” used in targeted advertising and ad personalization (DoorDash)
    • Lyft provides a similar opt-out (Lyft)

Location permission hygiene

  • Set location to “While Using the App” (not “Always”), if the service still works for you
  • Disable background location unless a safety feature needs it

Reduce routine predictability

  • Don’t always request rides from your exact doorstep
    • Walk to a nearby corner for pickup when feasible
  • Avoid saving “Home” and “Work” labels if you’re privacy-sensitive

Account & device hardening

  • Use a strong password + MFA where available
  • Lock down your email account (because receipts/history often live there)
  • Minimize app permissions (contacts access is a common one to avoid)

Use safety tools on your terms

Sharing trip status is a safety feature — but remember it’s also a live location broadcast link, so only share it with trusted contacts. Uber and Lyft both provide trip-sharing features. (Uber; Lyft)


7) The bottom line

  • I did not find solid evidence that Uber is literally “selling ride schedules and route data to the highest bidder” as a direct product in the simplistic sense.
  • But it’s factually supported that:
    • Uber discloses data for targeted advertising in ways that may legally be considered “sale/sharing,” and it states it may share trip/order history and even current trip info with advertising partners, depending on settings. (Uber)
    • Lyft acknowledges categories of data (including geolocation) that may be sold or shared for cross-context behavioral advertising and analytics. (Lyft)
    • DoorDash has faced California enforcement tied to “trading” customer personal info through marketing cooperatives, described as enabling broader dissemination (including potential downstream exposure to data brokers). (California DOJ)

The vulnerability isn’t just “sale.”
It’s the creation of a high-resolution map of your life — and the reality that once that map is shared widely enough, it becomes difficult to control where it ends up.