READ: The Skeleton Key To Our Lives: The Risks And Consequences Of Consumer Location Data Tracking

This page is an extension of the “Recommended Articles” page, but I feel the data brokerage industry (along with so-called “video analytics”) is itself the heart and soul of what has happened, and more importantly been allowed to happen (many of those who ought to be regulating are big-time customers!), via real-time geolocation sales to those whose sole intention has been to hunt me down. In simplest terms, consider me a target in the proverbial “war on woke,” with the strategy precisely identical to tactics weaponized against activists and minorities during COINTELPRO.

Visit the “My Objective” page for a more specific “terrible companies list” which specifically names many of the worst offenders in this industry.

A lot of the articles I repost here are indispensable to my fight for my own freedom, but I feel they often miss the point by using limiting phrasing like “repressive authoritarian governments” or “law enforcement could misuse …” without enough emphasis on what’s already rampant and widespread: ordinary random goons and thugs, or else snitches and narcs, who are neither government nor law enforcement, wielding these weapons with reckless abandon for any purpose, with unfettered access to my real-time geolocation and the ability to stalk me anywhere I go. Fifty percent of round trips do happen to wind up leading them to the very house I live in, which is of serious concern. Such has been my experience. I was just thinking about this as of this minor edit (3/13/25), just prior to finally departing the country. I’d signed up for a final half marathon and need to get in a few short jogs this week, and … I’m just doing them! I don’t want to. I’m thinking of all the routes I’ve done for the past 25 years of my life and literally cannot think of a single one that doesn’t fill me with a sense of visceral dread: this notion that if I run a certain loop, or drive to a certain trailhead and back, I’m unwillingly signing myself up for more stalking and abuse in the name of “keeping the community safe.” That’s all thanks to utterly reckless and widespread abuse (way, way beyond just the boys in blue, the officials per se) of real-time geolocation stalking facilitated by “smart cities” and data brokers and, more importantly, by the law enforcement and military agencies, and apparently esteemed community members (it’s been sickening to see revealed who some of the folks I thought I knew personally really are), that go to great lengths to suppress how rampant they know this all to be.

However, in this case the means of achieving the desired ends are exponentially more efficient. Imagine being doxxed by a massive global network of seriously sick Right Wing incels - meaning the usual stuff … where you work, where you live, where you frequent, etc. Scary, right? Super scary! Then imagine that being turned up to the highest level, where not only are those things easily determined, but at any given moment, at any given place on Earth, it is known to these predators exactly where you are. How do you expect this is going to end for me? The workaround, as I understand it, is that “we don’t know who you are,” that they are merely tracking a “device ID” or an “IP address” and its preferences, known associates, real-time location, and historical location trends. This is even worse, as it is apparently both a workaround and a way to further dehumanize targets, reducing them to a highly reductive profile and string of digits, a dot on a mapping app.

Even worse (can it get worse??) is that I feel most of the companies listed fit more into the “gray market” category, and based on my own experience of being repeatedly hunted down by thuggish (mostly) Right Wing shitheads each time I turn on a cellular device, there are even worse black market operations trafficking in this information, which I believe is heavily used among, to name a few, select individuals who work in private security, military, first responder, and transportation professions, notably cab drivers, who have an utterly horrific historical track record of widespread discrimination and bigotry. So in many ways, my experience is a simple anecdotal warning of history repeating itself, human nature doing what it does if not held to account, but aided and abetted by an utterly unprecedented new set of weapons. Real-time location data capture, sale, and purchase is, I am arguing, an even more lethal and dangerous invention than firearms.

I want to also stress the importance of outright bans that cover not just public agencies - everything goes “underground” when it’s understood the only requirement is that none of it be directly traceable to police, government, etc. - but private entities and private citizens too. If you know anything of the weaponization of SARs (Suspicious Activity Reports), the “see something say something” post-9/11 propaganda, which is currently being applied to the next power grab - the “war on human trafficking” - you understand the importance of regulating and criminalizing the use of these things by any human being against any other innocent human being, such as myself, for any reason - public, private, volunteer, etc. Ideologically motivated movements and attacks are not necessarily confined to police, for-profit companies, government entities, etc. And they are usually borne of moral panic.

If bans on those trafficking in data (which is quickly “evolving” from device-, social media-, and merely location-based to the literal hacking of our bodies: everything you can possibly think of) are not realistic … consider “guardrails” - real ones, not the bullshit suggested by our own US federal government, which is by far the biggest customer of the shady data brokers it swears “the bad guys” are using. Specifically as pertains to workarounds that allow specific individuals to be hunted without ever formally adding their name to a watchlist, so as to deliberately avoid culpability not only for watchlisting, which has a horrific history in America and the world, but for real-time hunting, which has literally no precedent. Case in point: you can’t look up some database and find the name “Joe Leineweber” as evidence of my targeting by feds, cops, and the white trash vigilantes they outsource dirty work to in countries all over Earth.

Alerts are literally tied to things like device ID, advertiser ID, and increasingly a voice print, a face print, a gait print, which are stored as data but perhaps can never be unearthed as “Joe Leineweber.” EVERYTHING about this surveillance economy is pretty well designed to hunt down anyone for any reason, and to guarantee impunity for perpetrators, by leaving virtually no trace of targeting, despite our own inability on the consumer side (and by “consumer” I no longer mean clicking AGREE on “Terms and Conditions” you have no choice about, but actually walking down public roads, being a “consumer of pavement” or a “consumer of groceries,” which you HAVE to go get eventually on private property) to do pretty much anything without leaving a trace that their Mechanical Hound can sniff out and feed right back to them. If this sounds insane, bookmark it, copy and paste it, just remember it, and come back to this in a few years. It’s been very frustrating for me, as I’m still being pathologized for merely suggesting that I’m a victim of mere device-level tracking that is malicious and designed to destroy and undermine me, so I have little faith that accountability will ever catch up to the far, far worse things already being done to me that will invariably claim my life.

The other massive issue with this industry is what I might liken to gerrymandering. Elections look objective and fair if you don’t understand the underlying trickery. Same with data. I strongly believe the advantage lies heavily in favor of Conservatives in the status quo of 2025, who manipulate the shit out of the inputs to create “future terrorists” and “kidnappers” and “traffickers” out of ordinary dudes like me, and exploit the power of illusory truth to further corner me into this inescapable predicament.

Online Behavioral Ads Fuel Surveillance Industry

NSA finally admits to spying on Americans by purchasing sensitive data

There’s a Multibillion-Dollar Market for Your Phone’s Location Data

The Location Data Market, Data Brokers, and Threats to Americans’ Freedoms, Privacy, and Safety

Using ‘Sensitive Locations Lists’ to Address Data Broker Harm

Why Location Data Brokers Put All Communities At Risk

A surprising number of government agencies buy cellphone location data. Lawmakers want to know why.

Location data poses risks to individuals, organizations

Data Brokers Know Where You Are—and Want to Sell That Intel

I Gave a Bounty Hunter $300. Then He Located Our Phone

Who Is Policing the Location Data Industry?

What goes on in the shadows: FTC action against data broker sheds light on unfair & deceptive sale of your location data

How your sensitive data can be sold after a data broker goes bankrupt

SDKs, The hidden trackers in your phone, explained

I already know the damage done by having your devices real-time doxxed. I’ve paid pretty much the heaviest price one could (by America’s standard … people are regularly being assassinated and detained in other places) in terms of being criminalized and “de-funded,” suffering serious damage to my livelihood, and in general being placed in a constant state of dread, fear, and intimidation which, as interpreted by the painfully archaic “mental health industrial complex,” has all been bundled into a claim they have not once expressed any doubt about: that literally all I’ve experienced, and self-reported, is, you guessed it, “all in my head,” and that the tracking modalities, and the motives of those “sharing” access to where I’m at with the most malicious of intent, are simply “not real,” that “consensus reality” suggests these things are not even possible.

Well, after reading, are any of you apparent “medical experts” convinced that these things are not only possible and plausible, but utterly rampant the world over, in fact a major driving force of the global tech economy? Probably not. Fair enough, continue to call me crazy, as long as you also make sure my “crazy” ass can no longer be accessed and fucked with by the World’s Worst People, the vast majority of whom, if I’ve not stressed this enough, meet all of the following characteristics:

Less educated, more Conservative, professional/personal ties narrowly pointing to military, transportation, first responders, private security, property management.

It scares me, and I already have knowledge (but not pure evidence, I suppose) of crimes done against me which seemed ideologically motivated. I have observed over time a lack of boundaries or limits to the lengths a group which perceives me as an “enemy” will go. That’s concerning knowing I’m real-time trackable every time I press a button or turn a key in the ignition or book a trip or sell a stock or go shopping … in light of the trend of radicalization paired with the skills some of these people have, to say nothing of the acts of sabotage already done to my life.

https://apnews.com/article/military-extremism-pentagon-veterans-terrorism-capitol-riot-jan-6-0c1fdd7b6b761e9c9e8556a9b9e45dc9

In the absence of a crackdown, I remain endangered by the data economy, especially when paired with real-time location purchase, sale, and “sharing.” Despite some news about FTC crackdowns, trust that, as the target of these weapons, nothing has changed at all in actual “rubber meets the road” practice in this industry.

Also, the next frontier in exposing this corrupt ecosystem is to dig deeper into the demand side: who, specifically, especially the private proxy actors (markedly harder to track and catch than highly scrutinized government agencies), as well as to stifle the present explosion of the biometrics industry, such that people like me are now real-time located / hunted not only by a device or an account login address, but by our faces, voices, tattoos, gaits, cars, and other objects that are not so easy to opt out of or leave at home.

Random thought - with so many scrapers and brokers and shady middlemen out there, why can’t we buy data about who purchases our data? That should be law, in fact: immediate notifications to the subject about who bought what and for what purpose, particularly in the realm of location brokers. I would even think there’s a valid public safety rationale for knowing at all times who has what information about you, whom they paid, when, and for what price. It is all, after all, at its core, stalking and theft by any other name, regardless of what it presently considers itself to be.

https://www.choice.com.au/consumers-and-data/data-collection-and-use/who-has-your-data/articles/data-broking-investigation

https://www.pcmag.com/news/us-yes-were-buying-data-on-us-citizens-including-their-embarrassing-secrets

The report points out that “the government would never have been permitted to compel billions of people to carry location tracking devices on their persons at all times… Yet smartphones, connected cars, web tracking technologies, the Internet of Things, and other innovations have had this effect without government participation.”

If I could sum up the core issues as pertains to data brokers and video analytics (or various other extensions of so-called “AI”), as pertains to my first-hand experience being dubbed the perpetual “threat” which they purport to “prevent” or “deter” from all manner of fake hypotheticals, these would be my Top 6 as of 2025. Note the issues have as much to do with the psychographic profile of the average purchaser as they do with wrinkles in the algorithms, which is why I flat-out refuse to acknowledge any notion of so-called “guardrails” on AI public safety tools, as the very concept it is all based on - mass surveillance crowdsourced to the masses - can yield positive results in no shape or form, period:

#1: One-Way Info Flow

This is the worst part of AI “future tense threat” apps and definitely not a design flaw, but rather intrinsic to making sure these work for the right people against a certain kind of person. Consider the common practice of online or brick-&-mortar stores receiving 1-5 star ratings via Amazon, Etsy, Yelp, etc. Anyone can review the company and it appears fair and democratic, and yet it becomes very clear in time that manipulation is rampant; there are negative reviews that read like personal or competitive attacks designed to undermine, just as there are positive reviews with a false “insiders stuffing the ballot box” feel to them.

Well, in the threat prediction / prevention / snake oil industry, it feels like the only acknowledged inputs are the negative ones: the Karens, the bigots, the hot-heads, the angry vigilantes, the haters, the chronic cop callers. And these people, I know from experience, are the same psychographic mix as, say, the average Twitter troll and/or local news Comment Section person. A good parallel for me was my Etsy shop (which I shut down under duress due to online harassment and trolling, like so many other parts of my life, having nowhere to appeal to that cares), where at least when I did get the 0.1% “bad review as retribution,” it could be viewed very clearly relative to the 99.9% of good ones. In predictive policing, however, only “accusers” get representation, because people who like me have no reason or belief that they should call 911 to officially report that I helped them out in a pinch, made a nice gesture, or just generally kick ass as a person. (I’m not suggesting we add this, by the way, to real-life person-scoring.)

I have somehow someway been rated via these data launderers as a threat to - get this - attack retail employees when they are in transit between their place of work and their car. Yes, really and truly, no female has ever walked the 10 yards from building door to car door in America alone, not without 1-3 Good Honest Men sitting in their trucks witnessing the act, so long as Joe Leineweber’s face has been detected in her vicinity. Americans, this is why you are so fucking pathetic and I’m ashamed to be associated with you. You pride yourselves on being complete bitch ass pussies and try to condition others to be scared so you can pose as their saviors. Which perhaps not coincidentally is why you have ostracized me in such a way, you little cunty bitch ass stalkerware addicts. And yet there is this obvious knowledge I have that not one of these employees arrived at this conclusion themself, nor do they appear to have a reverse feedback mechanism, to vote or rate me otherwise and report their repeated non-threatening and totally normal interactions as a way to vote me off of these watchlists, which as far as I can tell amount to life sentences upon being determined a “threat.”

As such, the way these are set up obviously weights the word of the kinds of people who go out of their way to file official complaints and reports with government safety bureaus much, much more highly than that of normal people who don’t operate this way, people who in reality represent an overwhelmingly massive majority and would have pushed me way, way off the “threat” status I’m under if they were ever given a chance to rate me in the first place.

#2: Overly Dependent on Cop Data

I believe my official death certificate should state that my cause of death was “anecdotes.” AI snake oil public safety apps are extremely problematic in that they seem to rely almost exclusively on police data. I have been living in extremely touchy, American, Karen-on-Steroids kinds of environments, Enhanced Sheriff’s Patrol District zones, HOAs, the kinds of places where going running after dark doesn’t put me at risk of being mugged by thugs, but of being reported as a suspected criminal by 6 bigots who all go to the same church and stalk me through the same apps. I’m oft reported for “aggression” for the simple act of snapping a shutter on my camera or speaking English words toward a person. So that is simply not fair, that my ‘data’ does not match up with that of others, yet it affects how I’m treated everywhere despite being borne of the unique suburban American dynamic that overpoweringly controls my life. Case in point, I get treated like a very, very dangerous person when traveling in places like Latin America. Yet I know none of this would have ever happened if I’d simply existed in Latin America in the first place starting 10 years ago. There never would have been the first police report, the first conflict, and all the ensuing conflicts which built off of that, based on this pattern of malicious / prejudiced “incidents” reflective of group bullying but made to appear as an escalation of my own behaviors. These are just facts. Cop data is about the dirtiest input possible in an algorithmically defined system, especially the kinds of reports I’m routinely subject to, which I believe are often made with that specific knowledge: that no arrest will or can be made (I’m never doing anything wrong), but that the report itself, attached to my name, helps establish a pattern that tips the algorithms ever closer to making me out to be the most dangerous maniac in the history of man whenever I’m seen by any of these AI-powered bullshit machines.

#3: Bad Guys in Good Guys’ Clothing

This especially applies to the horrible industry which traffics in real-time geolocation data. I can’t unequivocally prove it (and all evidence I supply will be forever suppressed anyway), but I do know of flagrant abuse of geolocation data to frame people for crimes: knowing their locations and subsequently planting false patterns of crime in their wake.

Imagine, say, the political pressure cooker that was Portland, Oregon, my home city, over the past decade (2015-2025) during which I personally have experienced massive unrelenting abuses of my real-time location data. I believe there was a massive effort by political opponents to paint Liberal activists, and Liberal city leadership by extension, as fomenting lawlessness and domestic terrorism. These claims are laughable if you live in Portland, about as sleepy a big city as exists in our country. But there were pretty intense ideological motives to make the “Liberal Cesspool” tag stick.

I discovered repeatedly during this era evidence suggesting that wherever, let’s just say, Person A’s cell phone or car traveled, that perhaps hours or a day later there might tend to be what appeared to be evidence of petty crime sprees reflective of a Conservative person’s stereotype of a Radical Portland Bogeyman - busted out windows, graffiti with explicitly “anti-American” messaging or threats against law enforcement, arson, and general small-time property destruction and/or threatening messaging, even “missing persons” reports … in places where these things are not typically found … places that Person A frequents … and this just felt super fucking fishy.

Or, in other instances in a foreign country, imagine corrupt cops interested not in ideological gains but financial ones. They get real-time alerts via these predictive policing apps based on fraudulent portrayals of Person A as a “threat.” Two of these alert types matter here: financial alerts, as in ATM withdrawals; and real-time location alerts, via face, car, and phone. So these cops become aware that Person A just withdrew $1,000 cash over two days, subsequently track said person to their campsite, and set up a pretextual “arrest.” The ransom, due to specific knowledge of how much cash Person A is presently carrying, and/or Person A’s disreputable status as a “person of interest,” is set considerably higher than the typical tourist “propina” (bribe) situation, and Person A is fleeced for nearly every dollar.

These are bad cases, but not the worst ones, which are also routinely carried out through the abuse of these “public safety” tracking tools. Those would be assassination, forced disappearance, imprisonment, and kidnapping, and these are very real, actively occurring themes specifically enabled through the use of these apps, not some hypothetical doomsday set far off in the future.

#4: Massively Bad Self-Selection Bias

People who are drawn to purchasing data, especially video analytics and car/device trackers which encourage users to literally rewind and watch other people’s private lives without their knowledge or consent, represent a very particular subset of the population which is not in any way representative of a fair, balanced or ethical cross-section of humanity.

On top of the aforementioned self-selection bias, which skews in favor of Pure Evil, of bigotry, of those who lust for power, of the boundlessly nosy, of very, very stupid Republicans, there’s also the issue of “public safety” or “personal safety” branding and the kinds of lame folks commonly drawn to such roles in communities as “protector” or “crime stopper.” Some are deeply insecure and afraid of their own shadow, some are gossipy little locusts who just can’t get through the week without crucifying a local pariah, and still others harbor a hero complex, this outdated belief that they must physically, proactively, aggressively protect homes, businesses, people, etc.

So that will forever be an issue with public safety stalker apps, that no matter which of these profiles fits a given end user, most fall under the broader umbrella of Total Fucking Losers often unaware of their own bigoted sentiment and how by singling out people as “future tense” threats, they continue to undermine the fundamental tenets of Liberal Democracies and amplify the power of mindless Crime & Punishment bully factions which have never been more enabled to act on their prejudice as they are today through these apps.

#5: Vicious Pre-Prejudicing Cycle Becomes “Reality”

The problem with alerting multiple people across multiple industries and in multiple geographic spaces throughout a given day to a single person being a “threat” is that it negatively conditions their behavior toward me, and leaves me perpetually vulnerable to the absolute worst among them: that person with the worst judgment, the least emotional control, the greatest propensity for abusing power and discretion. The privilege of stalking me with impunity is theirs, as these apps literally instruct people to go to the real-time location of their “threats.” Often this is on private property (think “chokepoints,” like stores and parking lots, the kinds of places that are only technically “private” but overtly open to the public and necessary sooner or later to each person’s survival, as in a grocery store property) where the stalker has the upper hand as an employee or owner, and the burden of self-defense is perpetually mine to bear alone without backup.

By applying “person of interest” or “threat” stigma to my face, device, or car in advance, the person receiving the alert is far, far more likely, I would argue inevitably so, to report anything I do or say that they don’t like, things they never would have noticed in the first place, especially an act of protest or defiance in the face of blatant discrimination, as some form of “incident,” and this produces and sustains this absurd cycle. Basically, if you are a self-respecting individual, and you are aware of being discriminated against in these environments, and aware of the very tools by which it is done, and you make an attempt to draw a line in the sand with Becky the Assistant Manager, she will punish you (at least presuming this is set in the USA and Becky is a Republican) for your resistance, and her “official report” will follow you to your next errand and to all the errands you make thereafter, forever tagged and flagged as some sort of ambiguously hostile little Trouble Maker.

There is no jury, no judge, no evidence required, no due process, just Joe at the mercy of Becky’s worst prejudice and/or personality flaw and/or mood swing, and the enablement of her like-minded peers who share in ownership of predictive policing apps for a reason which I argue has little to do with keeping communities safe. Nothing about this is remotely fair.

#6: Lame Complainers Rewarded; Nearly Zero Input Opps for Cool People

I believe this is encompassed in the other 5 bullet points, but in short, these AI public safety apps clearly overweight negative inputs (people who file formal complaints) and seem to supply no opportunity for positive inputs about people. I believe many who have filed frivolous complaints about me repeatedly, especially in suburban and rural Oregon, did so well aware of this: that we exist in an algorithmic public safety environment where those calls are part of a “long game” of steadily, consistently elevating my so-called threat score. And this is problematic because there seem to be no programs that track the complainers themselves, so there are massive vulnerabilities (again, I would argue, by design) to data fraud. I believe malicious inputs are heavily weaponized by Conservatives against Liberal targets in America.

There’s a Multibillion-Dollar Market for Your Phone’s Location Data

A huge but little-known industry has cropped up around monetizing people’s movements

By: Jon Keegan and Alfred Ng

Originally published on themarkup.org

Companies that you likely have never heard of are hawking access to the location history on your mobile phone. An estimated $12 billion market, the location data industry has many players: collectors, aggregators, marketplaces, and location intelligence firms, all of which boast about the scale and precision of the data that they’ve amassed.

Location firm Near describes itself as “The World's Largest Dataset of People's Behavior in the Real-World,” with data representing “1.6B people across 44 countries.” Mobilewalla boasts “40+ Countries, 1.9B+ Devices, 50B Mobile Signals Daily, 5+ Years of Data.” X-Mode’s website claims its data covers “25%+ of the Adult U.S. population monthly.”

In an effort to shed light on this little-monitored industry, The Markup has identified 47 companies that harvest, sell, or trade in mobile phone location data. While hardly comprehensive, the list begins to paint a picture of the interconnected players that do everything from providing code to app developers to monetize user data to offering analytics from “1.9 billion devices” and access to datasets on hundreds of millions of people. Six companies claimed more than a billion devices in their data, and at least four claimed their data was the “most accurate” in the industry.

“There isn't a lot of transparency and there is a really, really complex shadowy web of interactions between these companies that’s hard to untangle,” Justin Sherman, a cyber policy fellow at the Duke Tech Policy Lab, said. “They operate on the fact that the general public and people in Washington and other regulatory centers aren't paying attention to what they're doing.” 

Occasionally, stories illuminate just how invasive this industry can be. In 2020, Motherboard reported that X-Mode, a company that collects location data through apps, was collecting data from Muslim prayer apps and selling it to military contractors. The Wall Street Journal also reported in 2020 that Venntel, a location data provider, was selling location data to federal agencies for immigration enforcement. 

A Catholic news outlet also used location data from a data vendor to out a priest who had frequented gay bars, though it’s still unknown what company sold that information. 

Many firms promise that privacy is at the center of their businesses and that they’re careful to never sell information that can be traced back to a person. But researchers studying anonymized location data have shown just how misleading that claim can be. 
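The re-identification problem the researchers point to can be shown with a toy example: even with names stripped, a handful of known (place, time) points usually matches exactly one trace in a dataset. The function and data below are my own illustration, not drawn from any specific study or dataset.

```python
def matching_devices(traces, observed):
    """Return the device IDs whose trace contains every observed
    (place, hour) point. If exactly one device matches, the
    supposedly anonymous trace has been singled out."""
    return [dev for dev, pts in traces.items() if observed <= pts]

# Toy dataset: three "anonymized" devices and their (place, hour) points.
traces = {
    "device_a": {("gym", 7), ("office", 9), ("bar", 22)},
    "device_b": {("gym", 7), ("office", 9), ("home", 22)},
    "device_c": {("cafe", 8), ("office", 9), ("home", 22)},
}

# Knowing just two points about someone is enough here to pick
# their trace out of the whole set:
print(matching_devices(traces, {("gym", 7), ("bar", 22)}))  # ['device_a']
```

The point of the sketch: removing the name column does nothing when the movement pattern itself is the fingerprint.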

The truth is, it’s hard to know all the ways in which your movements are being tracked and traded. Companies often reveal little about what apps serve as the sources of data they collect, what exactly that data consists of, and how far it travels. To piece together a picture of the ecosystem, The Markup reviewed the websites and marketing language of each of the 47 companies we identified as operating in the location data industry, as well as any information they revealed about how the data got to them. (See our methodology here.)

How the Data Leaves Your Phone

Most times, the location data pipeline starts off in your hands, when an app sends a notification asking for permission to access your location data. 

Apps have all kinds of reasons for using your location. Map apps need to know where you are in order to give you directions to where you’re going. A weather, waves, or wind app checks your location to give you relevant meteorological information. A video streaming app checks where you are to ensure you’re in a country where it’s licensed to stream certain shows. 

But unbeknownst to most users, some of those apps sell or share location data about their users with companies that analyze the data and sell their insights, like Advan Research. Other companies, like Adsquare, buy or obtain location data from apps for the purpose of aggregating it with other data sources. Companies like real estate firms, hedge funds and retail businesses might then turn and use the data for their own advertising, analytics, investment strategy, or marketing purposes. 
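The flow described above, an app taking a location fix and handing it off to a third party, can be sketched in a few lines. Everything here is hypothetical for illustration: the field names, the advertising ID value, and the payload shape are not any real vendor's schema.

```python
import json

def build_location_payload(ad_id, lat, lon, timestamp, app_bundle):
    """Package a single location "ping" the way an embedded SDK
    might before forwarding it to an aggregator. All field names
    are made up for illustration."""
    return {
        "advertiser_id": ad_id,   # resettable device-level ID, rarely reset
        "lat": round(lat, 6),     # six decimals is roughly street-level or better
        "lon": round(lon, 6),
        "ts": timestamp,          # Unix epoch seconds
        "source_app": app_bundle, # which app produced the ping
    }

ping = build_location_payload(
    "38400000-8cf0-11bd-b23e-10b96e40000d",
    45.5231, -122.6765, 1700000000, "com.example.weather")
print(json.dumps(ping))
```

Note the key design point: the payload carries a stable identifier plus a timestamp, which is exactly what lets downstream buyers stitch individual pings into a movement history.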

Serge Egelman, a researcher at UC Berkeley’s International Computer Science Institute and CTO of AppCensus, who has researched sensitive data permissions on mobile apps, said it’s hard to tell which apps on your phone simply use the data for their own functional purposes and which ones release your data into the economic ether.

“When the app asks for location, in the moment, because maybe you click the button to find stuff near you and you get a permission dialog, you might reasonably infer that ‘Oh, that's to service that request to provide that functionality,’ but there's no guarantee of that,” Egelman said. “And there's certainly usually never a disclosure that says that the data is going to be limited to that purpose.”

Companies that trade in this data are reluctant to share which apps they get data from. 

The Markup asked spokespeople from all the companies on our list where they get the location data they obtain. 

Companies like Adsquare and Cuebiq told The Markup that they don’t publicly disclose which apps they get location data from in order to keep a competitive advantage, but maintained that their process of obtaining location data was transparent and carried out with clear consent from app users.

“It is all extremely transparent,” said Bill Daddi, a spokesperson for Cuebiq.

He added that consumers must know what the apps are doing with their data because so few consent to share it. “The opt-in rates clearly confirm that the users are fully aware of what is happening because the opt-in rates can be as low as less than 20%, depending on the app,” Daddi said in an email. 

Yiannis Tsiounis, the CEO of the location analytics firm Advan Research, said his company buys from location data aggregators, who collect the data from thousands of apps—but would not say which ones. Tsiounis said the apps he works with do explicitly say that they share location data with third parties somewhere in the privacy policies, though he acknowledged that most people don’t read privacy policies. 

“There’s only so much you can squeeze into the notification message. You get one line, right? So you can’t say all of that in the notification message,” Tsiounis said. “You only get to explain to the user, ‘I need your location data for X, Y, and Z.’ What you have to do is, there has to be a link to the privacy policy.”  

Only one company spokesperson, Foursquare’s Ashley Dawkins, actually named any specific apps—Foursquare's own products, like Swarm, CityGuide, and Rewards—as sources for its location data trove. 

But Foursquare also produces a free software development kit (SDK)—a set of prebuilt tools developers can use in their own apps—that can potentially track location through any app that uses it. Foursquare’s Pilgrim SDK is used in apps like GasBuddy, a service that compares prices at nearby gas stations, Flipp, a shopping app for coupons, and Checkout 51, another location-based discount app. 

GasBuddy, Flipp, and Checkout 51 didn’t respond to requests for comment.

A search on Mighty Signal, a site that analyzes and tracks SDKs in apps, found Foursquare’s Pilgrim SDK in 26 Android apps. 

While not every app with Foursquare’s SDK sends location data back to the company, the privacy policies for Flipp, Checkout 51, and GasBuddy all disclose that they share location data with the company.

Foursquare’s method of obtaining location data through an embedded SDK is a common practice. Of the 47 companies that The Markup identified, 12 of them advertised SDKs to app developers that could send them location data in exchange for money or services.
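
To see how an embedded SDK can ride along on the host app’s single permission grant, consider this minimal sketch. Every name here is invented for illustration (no real SDK works exactly this way), but the pattern, one OS-level grant feeding both the app’s own feature and a third party’s upload queue, is the mechanism the article describes.

```python
class LocationBus:
    """Stands in for the OS location service: one permission grant,
    any number of in-process subscribers."""
    def __init__(self):
        self._subs = []

    def subscribe(self, callback):
        self._subs.append(callback)

    def emit(self, lat, lon):
        # Every subscriber sees every fix; the OS does not distinguish
        # the host app's code from an embedded third-party SDK.
        for cb in self._subs:
            cb(lat, lon)

class AnalyticsSDK:
    """A hypothetical third-party SDK that queues every fix it sees
    for later upload to its own servers."""
    def __init__(self):
        self.queued = []

    def on_location(self, lat, lon):
        self.queued.append((lat, lon))

bus = LocationBus()
sdk = AnalyticsSDK()
bus.subscribe(lambda lat, lon: None)  # host app's own feature, e.g. "gas nearby"
bus.subscribe(sdk.on_location)        # the SDK rides along on the same grant
bus.emit(40.7128, -74.0060)
```

Because the SDK runs inside the app’s process, the user’s one-time “Allow” covers both subscribers equally.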

Placer.ai says in its marketing that it does foot traffic analysis and that its SDK is installed in more than 500 apps and has insights on more than 20 million devices. 

“We partner with mobile apps providing location services and receive anonymized aggregated data. Very critically, all data is anonymized and stripped of personal identifiers before it reaches us,” Ethan Chernofsky, Placer.ai’s vice president of marketing, said in an email. 

Into the Location Data Marketplace 

Once a person’s location data has been collected from an app and it has entered the location data marketplace, it can be sold over and over again, from the data providers to an aggregator that resells data from multiple sources. It could end up in the hands of a “location intelligence” firm that uses the raw data to analyze foot traffic for retail shopping areas and the demographics associated with its visitors. Or with a hedge fund that wants insights on how many people are going to a certain store.

“There are the data aggregators that collect the data from multiple applications and sell in bulk. And then there are analytics companies which buy data either from aggregators or from applications and perform the analytics,” said Tsiounis of Advan Research. “And everybody sells to everybody else.” 

Some data marketplaces are part of well-known companies, like Amazon’s AWS Data Exchange, or Oracle’s Data Marketplace, which sell all types of data, not just location data. Oracle boasts its listing as the “world’s largest third-party data marketplace” for targeted advertising, while Amazon claims to “make it easy to find, subscribe to, and use third-party data in the cloud.” Both marketplaces feature listings for several of the location data companies that we examined.

Amazon spokesperson Claude Shy said that data providers have to explain how they gain consent for data and how they monitor people using the data they purchase.

“Only qualified data providers will have access to the AWS Data Exchange. Potential data providers are put through a rigorous application process,” Shy said. 

Oracle declined to comment.

Other companies, like Narrative, say they are simply connecting data buyers and sellers by providing a platform. Narrative’s website, for instance, lists location data providers like SafeGraph and Complementics among its 17 providers with more than two billion mobile advertising IDs to buy from. 

But Narrative CEO Nick Jordan said the company doesn’t even look at the data itself. 

“There’s a number of companies that are using our platform to acquire and/or monetize geolocation data, but we actually don’t have any rights to the data,” he said. “We’re not buying it, we’re not selling it.” 

To give a sense of how massive the industry is, Amass Insights has 320 location data providers listed on its directory, Jordan Hauer, the company’s CEO, said. While the company doesn’t directly collect or sell any of the data, hedge funds will pay it to guide them through the myriad of location data companies, he said.

“The most inefficient part of the whole process is actually not delivering the data,” Hauer said. “It’s actually finding what you’re looking for and making sure that it’s compliant, making sure that it has value and that it is exactly what the provider says it is.”

Oh, the Places Your Data Will Go

There are a whole slew of potential buyers for location data: investors looking for intel on market trends or what their competitors are up to, political campaigns, stores keeping tabs on customers, and law enforcement agencies, among others.

Data from location intelligence firm Thasos Group has been used to measure the number of workers pulling extra shifts at Tesla plants. Political campaigns on both sides of the aisle have also used location data from people who were at rallies for targeted advertising.

Fast food restaurants and other businesses have been known to buy location data for advertising targeted down to a person’s steps. For example, in 2018, Burger King ran a promotion in which, if a customer’s phone was within 600 feet of a McDonald’s, the Burger King app would let the user buy a Whopper for one cent.
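
Geofenced promotions like that one boil down to a distance check against the phone’s reported coordinates. Here is a minimal sketch using the standard haversine great-circle formula and the 600-foot radius from the Burger King example; the function names are mine, not from any actual app.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points,
    using a mean Earth radius of 6,371 km."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

FEET_PER_METER = 3.28084

def inside_geofence(user, center, radius_ft=600.0):
    """True if the user's (lat, lon) is within radius_ft of center."""
    return haversine_m(*user, *center) * FEET_PER_METER <= radius_ft
```

A point about 110 meters (roughly 365 feet) from the fence center trips the promotion; a point a kilometer away does not.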

The Wall Street Journal and Motherboard have also written extensively about how federal agencies including the Internal Revenue Service, Customs and Border Protection, and the U.S. military bought location data from companies tracking phones. 

The offerings of the location data firms The Markup examined are diverse.

Advan Research, for instance, uses historical location data to tell its customers, largely retail businesses or their private equity firm owners, where their visitors came from, and makes guesses about their income, race, and interests based on where they’ve been. 

“For example, we know that the average income in this neighborhood by census data is $50,000. But then there are two devices—one went to Dollar General, McDonald’s, and Walmart, and the other went to a BMW dealer and Tiffany’s ... so they probably make more money,” Advan Research’s Tsiounis said.
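
Tsiounis’s example can be sketched as a toy scoring rule. The venue tiers and the 25% adjustment below are entirely made up; the point is only the shape of the inference, nudging a census-level income prior up or down based on where a device has been seen.

```python
# Illustrative only: the tiers and scoring rule are invented, but the
# structure mirrors the inference Tsiounis describes.
VENUE_TIER = {
    "Dollar General": 1, "McDonald's": 1, "Walmart": 1,  # discount venues
    "BMW dealer": 3, "Tiffany's": 3,                      # luxury venues
}

def adjust_income_estimate(census_income: float, visits: list) -> float:
    """Shift a census-level income prior based on visited venues.
    Tier 2 is neutral; each tier above or below moves the prior by 25%."""
    tiers = [VENUE_TIER[v] for v in visits if v in VENUE_TIER]
    if not tiers:
        return census_income
    avg = sum(tiers) / len(tiers)
    return census_income * (1 + 0.25 * (avg - 2))

# Device A: discount retailers -> estimate nudged below the $50,000 prior
a = adjust_income_estimate(50_000, ["Dollar General", "McDonald's", "Walmart"])
# Device B: luxury venues -> estimate nudged above it
b = adjust_income_estimate(50_000, ["BMW dealer", "Tiffany's"])
```

Nothing about the device owner needs to be known directly; the venue history alone moves the estimate, which is why movement data is valuable to analytics firms.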

Others combine the location data they obtain with other pieces of data gathered from your online activities. Complementics, which boasts data on “more than a billion mobile device IDs,” offers location data in tandem with cross-device data for mobile ad targeting.

The prices can be steep. 

Outlogic (formerly known as X-Mode) offers a license for a location dataset titled “Cyber Security Location data” on Datarade for $240,000 per year. The listing says “Outlogic’s accurate and granular location data is collected directly from a mobile device’s GPS.” 

At the moment, there are few if any rules limiting who can buy your data. 

Justin Sherman, of the Duke Tech Policy Lab, published a report in August finding that data brokers were advertising location information on people based on their political beliefs, as well as data on U.S. government employees and military personnel. 

“There is virtually nothing in U.S. law preventing an American company from selling data on two million service members, let's say, to some Russian company that's just a front for the Russian government,” Sherman said. 

Existing privacy laws in the U.S., like California’s Consumer Privacy Act, do not limit who can purchase data, though California residents can request that their data not be “sold”—which can be a tricky definition. Instead, the law focuses on allowing people to opt out of sharing their location in the first place. 

The European Union’s General Data Protection Regulation has stricter requirements for notifying users when their data is being processed or transferred. 

But Ashkan Soltani, a privacy expert and former chief technologist for the Federal Trade Commission, said it’s unrealistic to expect customers to hunt down companies and insist they delete their personal data.

“We know in practice that consumers don’t take action,” he said. “It’s incredibly taxing to opt out of hundreds of data brokers you’ve never even heard of.”

Companies like Apple and Google, who control access to the app stores, are in the best position to control the location data market, AppCensus’s Egelman said. 

“The real danger is the app gets booted from the Google Play store or the iOS app store,” he said. “As a result, your company loses money.” 

Google and Apple both recently banned app developers from using location reporting SDKs from several data companies.  

Researchers found, however, that the companies’ SDKs were still making their way into Google’s app store. 

Apple didn’t respond to a request for comment. 

"The Google Play team is always working to strengthen privacy protections through both product and policy improvements. When we find apps or SDK providers that violate our policies, we take action,” Google spokesperson Scott Westover said in an email.

Digital privacy has been a key policy issue for U.S. Senator Ron Wyden, a Democrat from Oregon, who told The Markup that the big app stores needed to do more. 

“This is the right move by Google, but they and Apple need to do more than play whack-a-mole with apps that sell Americans’ location information. These companies need a real plan to protect users’ privacy and safety from these malicious apps,” Wyden said in an email. 

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.