Ban the sale of precise geolocation
526 points
7 hours ago
| 26 comments
| lawfaremedia.org
| HN
Johnbot
7 hours ago
[-]
A lot of geolocation data on the market is anonymized, keyed to medium-lived unique IDs that can't be mapped to other identifiers. The problem with that is that if you have precise locations, or enough samples that you can apply statistics to find precise locations, in many cases you can de-anonymize the IDs. You can purchase address and resident listings from a number of different data vendors, and by checking where the device returns to at night you can figure out its home address. Then if you find information on the residents (work locations, schools, etc.), you see if said device goes where each resident is likely to go, and you now have a pretty good idea of exactly who the device belongs to.
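As a toy illustration of the night-time step (all data and field names here are hypothetical; real broker feeds differ), a minimal sketch in Python:

```python
from collections import Counter

def likely_home(pings, night_hours=range(0, 6), precision=3):
    """Guess a device's home: the ~100 m grid cell (lat/lon rounded to
    `precision` decimal places) where it most often sits at night."""
    cells = Counter(
        (round(lat, precision), round(lon, precision))
        for lat, lon, hour in pings
        if hour in night_hours
    )
    return cells.most_common(1)[0][0] if cells else None

# Fabricated pings for one "anonymized" ID: (lat, lon, local hour).
pings = [(40.7128, -74.0060, h) for h in (1, 2, 3, 4)] + [(40.7500, -74.0000, 14)]
home = likely_home(pings)  # cross-reference against purchased address listings
```

From there it's a join against the resident listings, exactly as described above.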
reply
rockskon
6 hours ago
[-]
There is no such thing as anonymized location data when you know where and when someone sleeps and works.

It's a rhetorical fiction the ad industry tells itself.

reply
Terr_
4 hours ago
[-]
Right, there's probably no other phone in the world that typically stops for hours within 1000 feet of my bed and typically stops on Monday-Friday within 1000 feet of my work-desk.
reply
mapt
3 hours ago
[-]
Now think what Lavrenti Beria and an LLM could have done with that.
reply
wafflemaker
2 hours ago
[-]
Somebody once said that if Stalin had had access to television, he would never have had to kill 20+ million people. What would he do with all that data? No idea.
reply
Forgeties79
6 hours ago
[-]
And with LLMs now it’s easier than ever to piece the parts together. Companies were doing it before we even knew what LLMs were capable of.

Edit: It's a rhetorical fiction the ad industry tells us.

reply
teraflop
5 hours ago
[-]
We should have learned this lesson 20 years ago when researchers were able to deanonymize a lot of the Netflix Prize dataset, which contained nothing except movie ratings and their associated dates.

https://arxiv.org/abs/cs/0610105

If movie ratings are vulnerable to pattern-matching from noisy external sources, then it should be obvious that location data is enormously more vulnerable.
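The attack was statistical pattern-matching: a handful of (movie, approximate date) observations about a person from a noisy public source is usually enough to single out one record. A toy sketch of the scoring idea (not the paper's actual algorithm; data fabricated):

```python
def best_match(aux, records, date_slack=3):
    """Return the id of the anonymized record matching the most auxiliary
    (movie, day) observations, allowing `date_slack` days of date error."""
    def score(history):
        return sum(
            any(m == movie and abs(d - day) <= date_slack for m, d in history)
            for movie, day in aux
        )
    return max(records, key=lambda rid: score(records[rid]))

records = {  # anonymized rating histories: (movie, day rated)
    "id_17": [("Heat", 100), ("Alien", 250)],
    "id_42": [("Heat", 101), ("Brazil", 40), ("Alien", 251)],
}
aux = [("Heat", 102), ("Alien", 252), ("Brazil", 41)]  # scraped from a public profile
```

With location data the "movies" are places and the sample rate is orders of magnitude higher, which is why it is so much more vulnerable.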

reply
vovanidze
6 hours ago
[-]
exactly. calling it 'anonymized' is pure security theater once you have enough data points to map out someone's daily routine.

waiting for legislation or eulas to fix this is a lost cause since adtech always finds a loophole. the fix has to be architectural. moving toward stateless proxies that strip device identifiers at the edge before they even hit upstream servers. if the payload never touches a persistent db there is literally nothing to de-anonymize. stateless infra is the only sane way forward
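the edge-stripping idea can be sketched in a few lines; field names here are hypothetical, and as other comments note, the coordinates themselves can still fingerprint a person even with every ID removed:

```python
# Hypothetical identifier fields; real SDK payloads vary.
IDENTIFIER_FIELDS = {"device_id", "advertising_id", "idfa", "gaid", "ip"}

def strip_identifiers(event: dict) -> dict:
    """Drop persistent device identifiers at the proxy, so only the bare
    payload reaches upstream servers."""
    return {k: v for k, v in event.items() if k not in IDENTIFIER_FIELDS}

event = {"lat": 40.7, "lon": -74.0, "advertising_id": "abc-123", "app": "weather"}
clean = strip_identifiers(event)
```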

reply
microtonal
6 hours ago
[-]
To be honest, I feel like this is where iOS and Android are failing us. Why is every app allowed to embed a bunch of trackers? Only blocking cross-app tracking on user request as iOS does is not enough (and data of different apps/websites can be correlated externally).
reply
rolph
5 hours ago
[-]
im not sure about allowed. perhaps required may be closer.

why would someone include tech that makes people think twice about using the app, unless it is required if you want to "sell" in a particular venue.

if your developing geolocation based apps, location tracking is a core function.

a calendar absolutely does not require location tracking beyond which side of the prime meridian you are on.

reply
nickburns
5 hours ago
[-]
> if your developing geolocation based apps, location tracking is a core function.

But the subsequent sale of that data is not, which is the discussion here.

reply
rolph
4 hours ago
[-]
and the reason that data is available for sale starts with forced collection of data if you want to participate in an app store as a developer.

you can't sell what you don't have, unless you lie lower than a rug.

fix the data collection problem and a second-order effect of no data for sale emerges.

reply
nickburns
3 hours ago
[-]
Are you suggesting Android/iOS app developers are forced into data collection somehow? If so, how? I'm genuinely curious.
reply
LeifCarrotson
2 hours ago
[-]
> why would someone include tech that makes people think twice about using the app, unless it is required if you want to "sell" in a particular venue.

Because the overwhelming majority of people don't think twice about this tech.

I do, and that's why I use a lot of web tools or old-fashioned phone calls, but most people think metadata=unimportant and assume that the purpose of the app is what it does for them rather than to gather their personal information for sale.

reply
CPLX
5 hours ago
[-]
Because we don’t enforce antitrust law in this country and the people that make those decisions profit from the ads.
reply
chimeracoder
4 hours ago
[-]
> To be honest, I feel like this is where iOS and Android are failing us. Why is every app allowed to embed a bunch of trackers? Only blocking cross-app tracking on user request as iOS does is not enough (and data of different apps/websites can be correlated externally).

Even if Google and Apple both want to commit to fighting this, it becomes a game of whack-a-mole, because there are all sorts of different ways to track users that the platforms can't control.

As an easy example: every time you share an Instagram post/video/reel, they generate a unique link that is tracked back to you so they can track your social graph by seeing which users end up viewing that link. (TikTok does the same thing, although they at least make it more obvious by showing that in the UI with "____ shared this video with you").
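One plausible way such per-share links could be minted (an illustration only, not Instagram's or TikTok's actual scheme; the secret and domain are made up): derive the token from the sharer's identity, so every view of the link maps back to who shared it.

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # hypothetical platform key

def share_link(post_id: str, sharer_id: str) -> str:
    """Mint a link whose token deterministically encodes the sharer, letting
    the platform reconstruct the social graph from link views."""
    msg = f"{post_id}:{sharer_id}".encode()
    token = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()[:12]
    return f"https://example.com/p/{post_id}?s={token}"
```

No app permission or on-device tracker is involved, which is why the platforms can't simply block it.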

reply
uxhacker
4 hours ago
[-]
How is this legal under the GDPR? There are clear examples in the Citizen Lab document of a user being tracked inside the EU from outside.

Is there not also a requirement for clear consent? I.e., a weather app can’t track your precise location?

reply
nzach
2 hours ago
[-]
> enough samples that you can apply statistics to find precise locations, in many cases you can de-anonymize the IDs

I think a lot of people don't realize the power of a big enough sample size. With enough samples, even something pretty innocent-looking like your daily step counter could make you identifiable.

As far as I know we don't have large enough databases to make this happen in practice, but I don't think this is impossible in the future.
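A toy sketch of how a "harmless" step counter becomes a fingerprint (data fabricated): treat each person's daily step counts as a vector and find the closest enrolled match.

```python
def identify(observed, database):
    """Return the enrolled identity whose daily step-count sequence is
    closest (sum of absolute differences) to the observed sequence."""
    def distance(seq):
        return sum(abs(a - b) for a, b in zip(observed, seq))
    return min(database, key=lambda name: distance(database[name]))

database = {  # fabricated daily step counts per known person
    "alice": [8200, 9100, 7600, 10400],
    "bob":   [3100, 2900, 3300, 3000],
}
who = identify([8150, 9050, 7700, 10300], database)
```

With weeks of data the sequences become so distinctive that even coarse matching like this separates people.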

reply
jandrewrogers
2 hours ago
[-]
How large are you estimating is "large enough"?
reply
sroussey
6 hours ago
[-]
Companies exist that de-anonymize other data brokers' data. This lets the other data brokers claim they have anonymized data while the end users get everything.
reply
ImPostingOnHN
6 hours ago
[-]
you could probably run an anonymization company at the same time you run a de-anonymization company
reply
gessha
4 hours ago
[-]
Best of both worlds - legal and profitable /s
reply
ninalanyon
6 hours ago
[-]
In what sense can the latitude and longitude of my house be called anonymous data?
reply
kube-system
5 hours ago
[-]
Ultimately, a map is anonymous data containing lat/lon of everyone's house

Alone, these points are not deanonymizing; it's when other data is associated with them that they become so.

reply
ramoz
3 hours ago
[-]
From what I've seen, none of this is that complex: one could simply 'draw a circle around your house', pull all the "anonymized" device pings inside it, and just trace those.
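The circle trick really is that simple; a sketch with fabricated pings of the form (device_id, lat, lon):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def pings_near(pings, lat, lon, radius_m=100):
    """Every 'anonymized' ping that falls inside a circle around one address."""
    return [p for p in pings if haversine_m(p[1], p[2], lat, lon) <= radius_m]

pings = [("dev1", 40.7128, -74.0060), ("dev2", 40.7200, -74.0000)]
nearby = pings_near(pings, 40.7128, -74.0060)  # circle around "your house"
```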
reply
jandrewrogers
6 hours ago
[-]
Location and identity are inextricably linked. You can't destroy identity without also destroying location and location is critical for myriad purposes.

The analytic reconstruction of identity from location is far more sophisticated than the scenarios people imagine. You don't need to know where they live to figure out who they are. Every human leaves a fingerprint in space-time.

reply
nickburns
6 hours ago
[-]
> and location is critical for myriad purposes.

It's not though.

Critical for myriad elective purposes? Sure.

reply
jandrewrogers
5 hours ago
[-]
Only if you consider the entire concept of logistics in civilization as "elective".
reply
xphos
5 hours ago
[-]
Seems hyperbolic. We had logistics that functioned extremely well before we had customer location data for sale on third-party sites.
reply
philipallstar
4 hours ago
[-]
If you re-read the comment they didn't say that selling it was intrinsic.
reply
nickburns
5 hours ago
[-]
I don't follow what you mean by 'logistics in civilization' as that's pretty vague and amorphous.

Could you be more specific with maybe a single example of where my physical geographic location is electronically critical for a purpose that isn't elective/optional/avoidable?

(And I'm not just trying to be obtuse. I think you're touching on at least part of the 'heart' of both this conversation and that of digital ID verification.)

reply
quickthrowman
5 hours ago
[-]
How does tracking the movements of individual humans aid shipping and logistics, other than providing traffic data to freight companies? How did we manage to have global supply chains prior to GPS being invented?

Edit: I assume I am missing a crucial part of logistics that you’re familiar with, genuinely curious.

reply
1121redblackgo
6 hours ago
[-]
Yep. With a side channel, or one order of thinking above the laws, it's trivial to get around said laws. Need better laws.
reply
malfist
6 hours ago
[-]
> A lot of geolocation data on the market is anonymized

A lot isn't good enough.

reply
ch4s3
7 hours ago
[-]
IMO we should ban gathering this data without a warrant or specific contractual agreement between the device owner and entity aggregating the data. As much as congress loves to claim the interstate commerce theory of everything, this seems like a slam dunk.
reply
Dwedit
7 hours ago
[-]
Contractual agreement? Nobody reads things like EULAs or terms of service. It's probably in there already.
reply
ch4s3
7 hours ago
[-]
I should have been a bit more clear. We should ban retention for any purpose where it is not explicitly required for the intended function and clearly agreed to by all parties. Think something like Strava or asset tracking: you know it stores GPS data, and why.
reply
ryandrake
6 hours ago
[-]
There is no such things as "clearly agreed to by all parties" when it comes to end users. Companies provide a one-sided, "take it or leave it" EULA, and if you don't agree to everything in it, you don't use the product. There is no meeting of the minds, there is no negotiation, and there is no actual agreement. It's a rule book dictated by one side.
reply
pocksuppet
6 hours ago
[-]
Then it's not a valid contract and therefore does not absolve them of criminal liability for stalking you.
reply
kube-system
5 hours ago
[-]
Contracts of adhesion can be valid contracts. The ability to negotiate or equal bargaining power is not a required element of a contract.

Furthermore, you cannot contract away criminal liability if any exists.

reply
lukeschlather
5 hours ago
[-]
Even attempting to use a contract of adhesion to justify selling GPS location data to a third party should be a criminal act.
reply
kube-system
5 hours ago
[-]
Yes, the US is in desperate need of better privacy laws.
reply
celeritascelery
5 hours ago
[-]
You click on “accept terms and conditions”, which means you agree to the contract.
reply
ch4s3
6 hours ago
[-]
You can't just bury literally anything in an EULA. There's a fair amount of case law establishing that EULA clauses that are surprising or illegal aren't enforceable.
reply
pwg
6 hours ago
[-]
That fact does not change the point of the individual to whom you replied. Regardless of whether the clauses in the EULA are 100% legal, some mixture, or 100% illegal, the entire EULA is a "one-sided rule-book dictated completely by one side". You, the person held to the EULA's rules, do not get to negotiate on the individual points. You simply have a "take it or go away" set of options.
reply
nine_k
3 hours ago
[-]
If the product has any serious audience / traction, it becomes profitable to scan its EULA for illegal clauses, and sue the company for damages (and maybe extra punishment for breaking the law).

The fact that 100% of its users, except the litigant, skimmed through the EULA and did not notice anything does not relieve the company from the responsibility.

reply
kube-system
5 hours ago
[-]
You're talking about contracts of adhesion and they are overwhelmingly common for B2C agreements. Most red-lining of contracts only happens in high-value B2B transactions where the sums of money involved are enough that it makes sense to bring lawyers into the loop.
reply
nickburns
6 hours ago
[-]
reply
rolph
5 hours ago
[-]
when you already pay for the device and a contract, then surprise: now that you have skin and flesh in the game, you HAVE TO agree to this EULA or your property is a brick and we keep your money.

that is defined as extortion, but labeled as onboarding.

reply
kube-system
5 hours ago
[-]
Courts do look poorly upon this -- to have a valid contract of adhesion there is some degree of advance notice required and the ability to reject it.
reply
stavros
6 hours ago
[-]
There is the GDPR.
reply
teeray
5 hours ago
[-]
Instead of “I accept”, you’re given a quiz
reply
toofy
7 hours ago
[-]
if it were up to me i’d require a hand signed contract that explicitly, up front and in plain english gives permission and is not transferable to any “partners”.
reply
rubyfan
7 hours ago
[-]
Right, privacy terms are written to be vague and permissive. Even if you read them you can’t usually understand how the data will be used or opt out.
reply
rubyfan
7 hours ago
[-]
I think we should make this type of tracking opt-out by default. We should also ban its sale to third parties and its use for purposes other than the specific functionality that required it to be enabled in the first place.
reply
gruez
7 hours ago
[-]
>I think we should make this type of tracking opt-out by default

That's opt-in, not opt-out.

https://en.wiktionary.org/wiki/opt-out

reply
nickburns
6 hours ago
[-]
GP states correctly that they believe the default 'choice' of a user should be 'opting-out' of location tracking.
reply
rcxdude
1 hour ago
[-]
This is utterly confusing the use of the terms. Opting is making a choice. The default isn't a choice. Opting by default makes no sense.
reply
wakawaka28
4 hours ago
[-]
Every EULA already covers this basically. The real problems are: people agree to it, and the government can do an end-run around the constitution by simply purchasing data or hiring contractors.
reply
troupo
7 hours ago
[-]
> IMO we should ban gathering this data without

GDPR tried. And the narrative around GDPR was deliberately completely derailed by adtech.

Lack of enforcement didn't help either

reply
ch4s3
7 hours ago
[-]
GDPR like all EU regulation is needlessly complicated and aimed at a compliance model that seems designed for SAP.
reply
microtonal
6 hours ago
[-]
The compliance model is very simple. Do not collect data. Problem solved. If you need to collect data (e.g. because you are a webshop), only collect the minimum necessary.

The problem is not the GDPR, the problem is the surveillance industry that wants to grab as much data as possible and try to do as much malicious compliance as possible.

reply
jandrewrogers
5 hours ago
[-]
Designing around GDPR compliance shows up all over the place in industrial data collection. It doesn't only affect surveillance webslop.

The costs are often worse on the industrial side because the data is so much larger and faster than web or mobile data.

reply
gwerbin
5 hours ago
[-]
What do you mean by "industrial" in this case?
reply
jandrewrogers
4 hours ago
[-]
Telemetry from machines and data from environmental sensors that is collected for operational purposes (safety, efficiency, reliability) in industrial applications. Old school engineering systems that in modern times have expansive network-connected sensors that may even have onboard classifiers to reduce the quantity of data.

The trouble started when lawyers correctly noticed that these are incidentally capable surveillance systems even though that isn't how we use them or what they were designed for.

reply
gwerbin
2 hours ago
[-]
Interesting. What are your obligations under GDPR in that case? It's not like a packing machine can request data deletion.
reply
jandrewrogers
35 minutes ago
[-]
No one has been able to provide a satisfactory answer to this question. I've seen the lawyers try to figure this out at a few companies.

GDPR frames everything in the context of a person's data. There is no "person_id" or similar field in these data models. That isn't the purpose of the data, it would be expensive to extract it, and then it would create obvious liability under GDPR. This makes the idea of finding a person's data expensive -- brute-force search on huge data volumes.

Compounding this, these data systems are often operational and some of the data may be in situ at the edge because it is too large to move all of it. The power and compute budget may not exist to find a person using brute force.

AFAICT, current best practice is to maintain a polite fiction that people aren't being tracked because that is not the intent. No one thinks that would stand up to serious legal scrutiny though. If the regulators come after you then plead best effort based on the technical infeasibility of doing anything else.

reply
GJim
1 hour ago
[-]
Eh?

The GDPR is there to protect your personal/sensitive data, or data that can personally identify you. It has nothing whatsoever to do with data capture from industrial machinery.

I remain astounded how ignorant some people are of the basic GDPR principle: protecting your _personal_ data.

reply
jandrewrogers
13 minutes ago
[-]
Industrial data capture can produce detailed traces of your travel no different than tracking your mobile phone. Some can capture personal details that adtech often can't because the sensor suites are more diverse and operate in different environments. We just don't use it for that.

How is this not your personal data?

Exploitation of these types of data sources has been demonstrated for 15+ years at this point. Abuse is often impractical for technical reasons, but GDPR doesn't give you a free pass on collecting personal data just because you aren't using it like personal data.

reply
pocksuppet
6 hours ago
[-]
Have you read it? It's not that bad, unless you're thinking like an adtech programmer trying to find the exact edge case for the maximal amount of tracking you're allowed to do, because such a bright line does not exist and that fact infuriates adtech professionals. It is vague because reality is vague and complex; each specific case of alleged violation has to be interpreted by multiple humans; there is no algorithm.
reply
ch4s3
6 hours ago
[-]
The law mandates a data protection officer with specific duties. It also establishes a board that "issue guidelines, recommendations, and best practices" which is where administrative complication and nonsense always creeps in.
reply
jandrewrogers
6 hours ago
[-]
It is regulation that imagines companies are a government bureaucracy.

I have read GDPR and don't work in adtech. It is vague and it is pretty easy to find pathological scenarios that don't make much sense or impose an unusually high burden for no benefit. Every European law firm seems to agree with this assessment despite what proponents assert. Consequently, it forces a lot of expensive defensive activity in practice.

To some extent, it was just a failure of imagination on the part of GDPR's authors. Many things are not nearly as simple as it seems to assume and it bleeds into data models that have nothing to do with people.

It is what it is but no one should pretend it is not a burden for companies that have nothing to do with adtech or even data about people.

reply
troupo
6 hours ago
[-]
You can literally read the entire "complicated" regulation in one sitting in an afternoon. There's literally nothing complex or complicated about it.

Congrats on gullibly believing the ad tech narrative.

reply
ryandrake
6 hours ago
[-]
The "GDPR is complicated" meme has been circulating among software developers since probably before it was even written. It's so wild that HN dunks on it so much: Here we have a societal problem in computing we've been complaining about for decades, someone offers an incremental but imperfect regulation to start taking steps to correct it, and everyone hates it!
reply
kentm
3 hours ago
[-]
> It's so wild that HN dunks on it so much: Here we have a societal problem in computing we've been complaining about for decades, someone offers an incremental but imperfect regulation to start taking steps to correct it, and everyone hates it!

YOUR collection of users' data is an overreach and breach of privacy. MY collection of data is absolutely necessary to grow my scrappy small business and provide value. I am a good person with good intentions, so it's OK. You are a bad person doing bad things, so it's not OK.

reply
IX-103
3 hours ago
[-]
The GDPR is vague and unworkable as written. It fundamentally restricts all data processing with a few, vague exceptions.

What is data processing essential for the services being provided? Many publishers assumed that getting paid was an essential part of providing a service, and it was not until 3 months before the implementation deadline that the committee clarified that getting paid is not included when you are being paid by a third party.

How are you to know whether or not the user is an EU citizen (and thus subject to the GDPR)? Is making that determination a service essential for providing your service? The answers apparently were "You don't" and "No", which would effectively make companies assume that the GDPR applies to everyone on the planet.

The GDPR also is fundamentally opposed to how things currently work in the internet, making almost all advertising on the web illegal overnight. It was too big of a change to happen at once, so it is effectively only loosely enforced in practice.

I like the idea of the GDPR, but the implementation sucks.

reply
GJim
1 hour ago
[-]
> The GDPR is vague and unworkable as written. It fundamentally restricts all data processing with a few, vague exceptions.

What utter utter FUD

You are free to collect as much personal data as you want, PROVIDING you have my explicit opt-in informed consent to do so.

What about this is difficult to understand?

> How are you to know whether or not the user is an EU citizen (and thus subject to the GDPR)?

The GDPR provides _basic_ data safety and consumer protection. If you aren't protecting users private data regardless of where they live in line with GDPR principles (such as collecting it fairly, and not selling it to randoms) then you are playing fast and loose with your users private, sensitive data. In which case you need to _seriously_ consider if what you are doing is ethical.

> The GDPR also is fundamentally opposed to how things currently work in the internet, making almost all advertising on the web illegal overnight.

Utter Bullshit!

You are free to advertise as much as you like! But if you want to track me with your advertising (hello scummy adtech industry) then you need my explicit informed consent to do so. And so you should!

Again, what about this is difficult to understand?

reply
ryandrake
50 minutes ago
[-]
> If you aren't protecting users private data regardless of where they live in line with GDPR principles (such as collecting it fairly, and not selling it to randoms) then you are playing fast and loose with your users private, sensitive data.

It's interesting and revealing when someone responds to a law that says "You're not allowed to abuse users in countries X, Y, and Z" with "How can I figure out who's in the other countries, so I can abuse them?" instead of "I'll just stop abusing everyone, and then I don't even need to worry about where anyone is."

Whenever you find yourself asking "how do I toe as close to the 'illegal' line as I can without technically going over it?" I think it's time to ask yourself some pretty hard questions.

reply
pocksuppet
6 hours ago
[-]
Same with the California age input box.
reply
lukeschlather
5 hours ago
[-]
The problem with the age input box is that we don't have the GDPR. We're mandating that people give accurate age information to advertisers, and it's legal for advertisers to sell detailed dossiers on people including their age and target advertising using the age. This is why Meta wrote the age input box legislation, they want to make everyone legally required to provide Meta with their age.
reply
ch4s3
6 hours ago
[-]
Being able to read something in one sitting doesn't make it simple or obvious. The law establishes a board that gets to set new requirements.
reply
buzer
2 hours ago
[-]
What new requirements can be set by the board? As far as I understand, the EDPB can only issue guidelines, recommendations, and best practices. All of these are just guidance on how to interpret the GDPR. Courts are the ones who ultimately decide if you are complying with the GDPR. The local DPA likely won't harshly punish you for following the EDPB's recommendations even if they end up getting overturned in court.

DPA won't punish you for not following EDPB's recommendations, they will punish you for breaking GDPR. You are free to ignore EDPB if you think your legal position is strong, but you carry the risk if you are wrong.

reply
stavros
6 hours ago
[-]
As someone who has to implement it, it's really not bad at all: Ask the user for consent to use their data, and don't be misleading about it. That's it.

The rest of the "It'S So LaRgE AnD UndErSpEciFieD" is just FUD. The regulators don't just slap fines, they work with you to get you to comply, and they just want to see that you're putting in the effort instead of messing them about.

I have literally never been surprised by the GDPR. Whenever I thought "surely this is allowed" it was, whenever I thought "this can't be allowed", it wasn't. For everything in the middle, nobody will punish you for an honest mistake.

reply
kentm
3 hours ago
[-]
Also, "Be able to track a user's data and delete it on a request."

This is not too hard if you do proper engineering work ahead of time and are purposeful about how you move and manage data (step 1 is just not collecting it unless its vital). But the industry encourages us to be very bad about that because we gotta "move fast and break things or you're not gonna make it."

reply
ch4s3
4 hours ago
[-]
> for everything in the middle, nobody will punish you for an honest mistake.

How do you know that? Again, the law establishes a rule-making body that can at any time change or add rules, and as far as I can tell there's no public review process.

reply
stavros
4 hours ago
[-]
Which body is this? The EDPB?
reply
redwall_hp
6 hours ago
[-]
Anti GDPR people: "it's so complicated not being able to walk into someone's house and take their things! Which things can I not take? How about this? And now I need a lawyer if I take someone's things? Ridiculous!"

Just don't spy on people.

reply
stavros
6 hours ago
[-]
Yeah that's pretty much what it feels like, or sometimes it's "what if someone's stuff is lying on the street? Can I take it then?" and the regulator is kind of like "look around and ask if it belongs to anyone, and if not, sure".
reply
KaiserPro
1 hour ago
[-]
The problem the USA has is that it has no concept of "private data" outside of some part of HIPAA.

Until that changes you're going to be stuck.

Something as simple as the Data Protection Act 1998 (https://en.wikipedia.org/wiki/Data_Protection_Act_1998) would kneecap a lot of the shady shit that goes on in the USA.

reply
treebeard901
3 hours ago
[-]
Once wealthy and powerful people realize how this can be used to track them, they will start cracking down. As one of many examples of how underrated unauthorized access to location data is: it is a primary way that the military locates and kills targets in foreign countries. It is surprising that all of this data is so freely available from data brokers, or in some cases from the app companies themselves, if you're willing to make it worth the trouble for them.
reply
crummy
2 hours ago
[-]
Seems there’s an opening for an ElonJet-like tracker that operates on this data.
reply
romaniv
6 hours ago
[-]
The problem with all these discussions about banning stuff is that privacy is always on the back foot. It's by design. People who want to surveil and manipulate us are actively investigating new ways of doing it, they get paid for it and they risk nothing in the long run. All of these discussions about specifics are just reactions. They aren't even reactions to the surveillance itself, but rather to a discovery by someone that a new surveillance machine has been constructed and launched.

So the current feedback process involves: construction → exploitation → reporting → public awareness → legislation. This is too slow. Moreover, operating in this environment is exhausting.

We need a different feedback loop altogether. I'm not sure which one would work best, but something different needs to be considered.

reply
jjk166
4 hours ago
[-]
Yeah, abuse of privacy should be the crime, the same way theft is. How exactly the crime is committed shouldn't matter. Companies can have every right to make a compelling argument that what they did was not an abuse of privacy when they are defending themselves in court.

And critically, it is not someone becoming aware of private information that is the abuse of privacy, it is exploiting that private information which is the abuse. There may be countless legitimate technical reasons you need to collect data, but there can not possibly be a technical justification for selling it.

reply
groos
4 hours ago
[-]
Most people don't realize how bad geolocated data is for a free society. I can buy data from a broker, geo-fence your house address, and then I'm able to see all the places where you went, who you associate with, and identify all you associate with by tracking them to addresses. All of this happens with anonymized device identifiers. It is the wet dream of a company such as Palantir and all governments who desire absolute control over their populations.
reply
linkjuice4all
6 hours ago
[-]
Let’s just stretch copyright to cover movement/location as a protected creative expression. It’s somewhat ridiculous but we’ve already established case law and technology for handling/mishandling protected assets.
reply
gruez
5 hours ago
[-]
Then they add a clause to the ToS with "you grant us and our affiliates a worldwide, non-exclusive, royalty-free, sublicensable and transferable license to your location..."
reply
victor22
40 minutes ago
[-]
You cannot regulate anything anymore, everything is geotracked, be real
reply
ButlerianJihad
2 hours ago
[-]
When I had the opportunity to peer into public records, I found some extremely intriguing stuff.

There was one person with a feminine name who showed up with a “home address” that would correspond to being my “neighbor” at home, at my clinic, at church, when I went to college, etc. All the years corresponded correctly, and the addresses were some residential place about a block or less away from the places where I went.

For all I know, this person was either fictional or an innocent bystander. She did appear to have a Facebook account or two. I was never able to directly contact her. But I found it very strange, and I wondered what would be gained by doxxing me in this manner.

Of course this has nothing directly to do with GPS coordinates, but imagine if the GPS began to be part of your public record as well, or on your credit report. Imagine if it was entered into the public record what coffee house you visited every morning, or if there were errors in this record.

reply
GJim
41 minutes ago
[-]
> GPS coordinates

* coordinates

There are many ways of establishing one's latitude and longitude without recourse to one particular GNSS system.

reply
uxhacker
7 hours ago
[-]
More details are available here, including screenshots of the tool.

https://citizenlab.ca/research/analysis-of-penlinks-ad-based...

reply
reenorap
4 hours ago
[-]
I'm of the opinion now that posting videos online without the explicit permission of EVERYONE in the video should be illegal. It's one thing to take a video and keep it on your phone, or share it only within your family, but beyond that it needs the express consent of everyone whose face is in it; otherwise it should be a crime.

The previous views on privacy didn't account for the fact that everyone now carries a video camera, and that people are incentivized to violate privacy to make money as influencers. I think people's privacy needs to be protected, and that means making the laws around it much, much stricter. That includes location data; it shouldn't be sold or exposed at all.

reply
wakawaka28
4 hours ago
[-]
So, no videos of festivals or politicians speaking? How about legally recorded conversations or anything else exposing corruption? Body cam footage? Harassment is already a crime. Take care not to come up with oppressive laws to deal with a nuisance.
reply
pnw
4 hours ago
[-]
The examples show Android devices. How does Webloc track iOS devices given Apple doesn't allow unique IDs and allows the user to disable the ad ID? I wish these articles would go into a bit more detail for the technical reader.
reply
lifeisstillgood
5 hours ago
[-]
There needs to be a believable legal framework behind this.

Imagine an option on your iPhone that says “Enable this to allow geolocation tracking for organisations registered under the NOADSJUSTPUBLICGOOD Act”. Then any WiFi endpoint could locate you based on signal strength etc., and that data could only be made available to organisations registered under the act.

Would we see new understanding of how people move around cities? Better traffic information? I think so, as long as people believe there are real teeth to the laws and that they are enforced loudly and publicly.

We should embrace the benefits of a society-wide epidemiology experiment; the benefits for public health are incredible. (Add to that supply chain logistics on open ledgers and many other things that just weren't possible before, and the future of open, transparent, well-regulated democracies is bright.)

Let me know if you spot one.

reply
Terr_
2 hours ago
[-]
"Get consent first" hasn't worked because the average consumer can't give informed consent to the kind of stuff going on behind the scenes.

What about: "If something bad happens because of the data your company shared or lost, it is criminally and financially liable?"

reply
atmosx
2 hours ago
[-]
Both make sense. Depends on who you want to protect.
reply
kidnoodle
4 hours ago
[-]
I had a theory that the way to solve this was a location intelligence data union which sold safely anonymised aggregates and shared the profits, while also litigating on behalf of members under available legislation to stop other people using their data.

Alas, I was stymied by not having any cash to work on it, and the unit economics were not very VC friendly (at least I assume that’s one of the reasons why I didn’t get any traction from VCs).
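The "safely anonymised aggregates" such a union might sell could work roughly like a k-anonymity threshold: only publish counts for cells that enough distinct members passed through, and suppress the rest. A toy sketch (the function names, cell shape, and choice of k are all assumptions, not a description of any real product):

```python
# Aggregate member location visits into per-cell device counts,
# suppressing any cell seen by fewer than k distinct members.
from collections import defaultdict

def aggregate(visits, k=5):
    """visits: list of (member_id, area, hour). Returns
    {(area, hour): device_count} with small cells suppressed."""
    cells = defaultdict(set)
    for member, area, hour in visits:
        cells[(area, hour)].add(member)
    return {cell: len(ids) for cell, ids in cells.items() if len(ids) >= k}

visits = [(f"u{i}", "downtown", 9) for i in range(6)] + [("u0", "clinic", 14)]
print(aggregate(visits))  # the single clinic visit is suppressed
```

Note that thresholding alone is a weak guarantee; as the top comments point out, repeated samples over time can still single people out, which is presumably why the litigation arm of the idea matters too.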

reply
eptcyka
2 hours ago
[-]
I want geolocation to not be sold. Yet, I do not believe we have been successful in banning the sale of cocaine and elephant tusks. What makes us think this will be an easier problem to solve?
reply
lionkor
2 hours ago
[-]
People get arrested for running large cocaine operations, that's the difference.
reply
Eextra953
6 hours ago
[-]
Does anyone know of any groups that are organizing and lobbying to get things like this into law? I know about the EFF but they seem to be more focused on documenting and reporting instead of lobbying and getting things passed.
reply
Cider9986
5 hours ago
[-]
Restore the fourth, Brennan Center, EPIC, Freedom of the press foundation.
reply
dminor
6 hours ago
[-]
Senator Wyden has been pretty focused on it. I think it's going to take some changes in Congress before it happens though.
reply
Cider9986
5 hours ago
[-]
Massie and McGovern in the house as well.
reply
titzer
6 hours ago
[-]
These people really have no idea at the level of data collection from Google's rootkit on Android known as "Google Play Services".
reply
glitchc
6 hours ago
[-]
How about we just ban the collection of precise geolocation? Wouldn't that be a better solution?
reply
davebren
6 hours ago
[-]
You can have legitimate use cases where it's a core functionality of the application to store it, so the user obviously knows it's being collected and agrees by using it.
reply
warkdarrior
4 hours ago
[-]
So you want to ban all mapping apps and all fitness apps?
reply
ssl-3
2 hours ago
[-]
Nothing external needs my precise location to navigate with a map. An approximate location is sufficient to deliver to my device a map of an area I'm in, and of the overall route, and all of the details that are useful for navigation.

Fitness apps can be local. We have pocket supercomputers; certainly, we don't need help from the clown to keep track of how far (or how energetically) we biked or walked today, or where that took place.
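For the map case, the client-side coarsening the parent describes could be as simple as snapping coordinates to a grid before anything leaves the device. A minimal sketch (the step size of 0.01 degrees, roughly 1 km of latitude, is an arbitrary assumption):

```python
# Snap a precise GPS fix to the centre of a coarse grid cell,
# so only an approximate location is ever sent to a server.
def coarsen(lat, lon, step=0.01):
    """Return the centre of the ~1 km grid cell containing (lat, lon)."""
    snap = lambda v: (int(v // step) * step) + step / 2
    return snap(lat), snap(lon)

precise = (51.507351, -0.127758)   # somewhere in central London
print(coarsen(*precise))           # good enough to fetch map tiles
```

A coarse cell is plenty to request the right map tiles and route overview, while the turn-by-turn matching against the precise fix stays on the device.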

reply
Mithriil
6 hours ago
[-]
I would expect such a law to be lobbied to death.
reply
kristianpaul
5 hours ago
[-]
Haven't read the article yet, but having more public NTRIP endpoints could help a lot with precise location.
reply
charcircuit
3 hours ago
[-]
I think it's fair for law enforcement to compensate the people collecting this data instead of forcing them to give it away for free.
reply
erelong
4 hours ago
[-]
Alternatively, opt out of services that sell it
reply
shevy-java
5 hours ago
[-]
Soon Geolocation will be tied to Age! Then you can meet locals and congratulate them on their birthday. The movie Minority Report was way too timid in its prediction here. Age up everything! \o/
reply
lifestyleguru
7 hours ago
[-]
Smartphones, mobile apps, mobile networks, and WiFi stopped being your friends around 2015-2016. Now it's just a matter of how much data can be harvested from device sensors in real time until reaching a pain point which doesn't exist.
reply
Cider9986
5 hours ago
[-]
WiFi isn't that bad, we have mac address randomization[1] and VPNs. Cellular is obscenely bad, though.

[1] https://grapheneos.org/usage#wifi-privacy

reply
reorder9695
5 hours ago
[-]
If anyone's interested in this the book "The Age of Surveillance Capitalism" is rather revealing of the sheer scale of this.
reply
mystraline
5 hours ago
[-]
Yep.

And the FLOSS/Linux phone hardware attempts have frankly sucked.

I was hoping that my PinePhone Pro would actually be usable. But no, it's a PineDoorstop.

Proper Linux would be a great 3rd choice. But yeah. We've got a duopoly and not much we can do about it.

reply
9991
4 hours ago
[-]
GrapheneOS is a proper Linux. The hardware isn't open, but otherwise it's quite nice and clearly designed for the end-user's benefit, in stark contrast to the more widely-adopted alternative mobile OSes.
reply
troupo
7 hours ago
[-]
Don't you want random companies to store your precise location for 12 years? https://x.com/dmitriid/status/1817122117093056541
reply
Swizec
7 hours ago
[-]
Screenshot in that tweet says 13 months FYI
reply
mzajc
7 hours ago
[-]

  > Lifespan: 13 Months
  > ...
  > Standard retention (4320 Days)
It looks like a cookie prompt, so I assume "Lifespan" refers to cookie expiration and "retention" to how long the data (including geolocation) is retained on the spyware company's servers.
reply
troupo
7 hours ago
[-]
[Cookie] Lifespan: 13 Months

Data Retention: Standard Retention (4320 days)

reply
wolvoleo
7 hours ago
[-]
Just ban the sale of any kind of ad tracking. That way we can get rid of the cookie walls too.

Missed opportunity by the EU when they wrote GDPR.

reply
GJim
4 hours ago
[-]
> Missed opportunity by the EU when they wrote GDPR.

Not really.

There are legitimate reasons why I might wish to be tracked or give my personal data to a company. As long as I'm asked to give clear, opt-in informed consent, this is perfectly fine. This is the very essence of the GDPR!

Instead, direct your ire to the scummy adtech industry who are constantly asking to invade my privacy and smell my knickers trying to work out what I ate for lunch. Another law to ban the adtech industry would be welcome from me, though would meet fierce resistance from the likes of Google.

The GDPR is well written.

reply
wolvoleo
3 hours ago
[-]
> There are legitimate reasons why I might wish to be tracked or give my personal data to a company. As long as I'm asked to give clear, opt-in informed consent, this is perfectly fine. This is the very essence of the GDPR!

In these cases they don't even need to ask for your permission.

> Instead, direct your ire to the scummy adtech industry who are constantly asking to invade my privacy and smell my knickers trying to work out what I ate for lunch. Another law to ban the adtech industry would be welcome from me, though would meet fierce resistance from the likes of Google.

No, the EU should have done more to prevent this. They didn't want to kill a billions-of-euros industry. But they should have.

reply
troupo
7 hours ago
[-]
GDPR literally prohibits the sale of user data and tracking without user consent (because yes, you want to give people the possibility to opt in for a variety of reasons).

GDPR has literally nothing to do with cookie popups. That was, and is, adtech.

reply
em-bee
6 hours ago
[-]
prohibits [...] without user consent

that's what causes the popups.

it should prohibit it outright, consent or not.

reply
SoftTalker
6 hours ago
[-]
But the only reason the popups are needed is the adtech tracking cookies. You don't need a popup for cookies that are related to essential site functionality.
reply
em-bee
6 hours ago
[-]
yes, so if ad tracking is forbidden outright then asking for permission to do it is invalid too.
reply
GJim
4 hours ago
[-]
We certainly do need another law to ban the adtech industry..... Though no doubt that would prompt a _shitstorm_ from Google, Elon and chums.
reply
wolvoleo
3 hours ago
[-]
I see only positives there.
reply
nathanlied
3 hours ago
[-]
I can live with the tears of Google and Elon, frankly.

The adtech industry has, time and again, proven they cannot self-regulate to any decent capacity. At this point, the only reasonable course of action is to shackle them down with such heavy legislative burdens they're rendered de facto extinct.

I will not mourn their loss.

reply
pocksuppet
6 hours ago
[-]
I think they are saying GDPR did not ban websites from noisily asking for consent and trying to trick you into giving consent.
reply
wolvoleo
3 hours ago
[-]
Well, they did, but it isn't policed.

For example, giving consent should be as easy as denying it: one-click consent means there must also be one-click non-consent. But this is policed very poorly.

I think they should just ban adtech altogether, or at least any form of targeted advertising and individualized pricing (which is already illegal in many EU countries), and ideally also deep market research.

reply
lotu
6 hours ago
[-]
My job was building cookie walls in response to GDPR. It might not have been the “intent” but it certainly was the consequence of that law.
reply