Hah, Minority Report is here. Digital signage around me tailoring ads to my preferences is a freaky idea, though. Will they start showing me "Hot singles want to date you"? I suppose they'll be able to scan the crowd, see who's the richest^W who's the easiest to pry money from, and then show an ad targeted at them. And a network of these cameras can conceivably even identify where I live and work ("We see this face shopping at this supermarket most weeks", "most mornings we spot him at this coffee place"), and by doing so guess my income and spending levels.
And scanning cars is an interesting idea. "It's a Cybertruck, show the ad for the penis-enlarging pill!".
It was all sent to a real-time dashboard, and it was pretty interesting.
And robot vacuum cleaners :) Although there's some hostility to cybershow here, I'll reference a couple of episodes because surveillance is a frequent talking point. In this one [0] the discussion is exactly what you're talking about: cars, and the incident with Tesla employees sharing 'funny' road traffic accident footage.
In this other discussion about CCTV in general [1] two notable things came up:
1) From a security POV, digital surveillance is largely impotent. It works well for post-facto analysis (such as arresting a bunch of rioters the next day), but it has far less value in real time unless you also have human resources on the ground - and in most cases (except in dangerous war zones), if you already have those people, they're superior to any electronics.
2) Communities are sold surveillance by an "insecurity industry". It's a big business, but communities rarely reflect on its value or its other effects. Surveillance is a sign of poverty if you take all quality-of-life factors into account. Overt security signals inner insecurity and social decay: the more cameras you can see in a place, the 'poorer' that neighbourhood is, since a truly wealthy society is one with high trust.
This would have worked in trustworthy societies - not these ones. (If a trustworthy society is still known in these times, please inform.)
You expect confidentiality from your lawyer, your medic, your wife, your telephony provider.
(Not wishing to be pedantic for its own sake, but there's an important distinction.)
"High-trust" is the average trust between any two randomly chosen individuals from the population at a given time.
Trustworthiness (in one regard) is the historically accumulated record of positive performance against promises made. But a society is not a single entity; at most, a society's representatives might be trustworthy.
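For what it's worth, that first definition can be written as a simple expectation (my notation, not anyone's standard):

    T(t) \;=\; \mathbb{E}_{(i,j)\ \text{random pair},\ i \neq j}\big[\,\tau_t(i,j)\,\big]

where \tau_t(i,j) is how much individual i trusts individual j at time t. A "high-trust" society is one where that average over random pairs is high; trustworthiness, by contrast, is a property of a single actor's track record.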
So there's a push for a "zero trust" society. What do you make of that?
Given that what happens is what is allowed by the society in which it happens, the priority is to flee, as all the bad indicators are there. The problem remains "to where": identifying a society that still recognizes and defends Dignity.
While the concept of privacy makes it into laws, it's still just a minor component of a broader "dignity" that so many technologies seem set on destroying. A dozen or more major thinkers over the last two centuries have noticed that human dignity is very fragile in the face of technology (Weber, Marx, Fromm, Freud, Jung, Nietzsche...). Surveillance is just one modern facet of undignified life with technology.
Question: can technology ever enhance dignity?
We've seemingly built a system (society) in which material and social success hinge on a willingness to forgo dignity. People who have a strong sense of dignity are disadvantaged and marginalised. So to answer your "to where?" To the margins. Unless one is prepared to embrace indignity in visible opposition. Struggle may be the last refuge of dignity.
According to what would-be objective judgement? Under a threshold you fight; over a threshold you flee. It is just sensible. Dignity may or may not be impacted, heightened or lowered - it depends on details.
> can technology ever enhance dignity
Of course tools and devices are meant to enhance dignity; we build them for a purpose: to serve and assist us.
> We've seemingly built a system (society) in which material and social success hinge on a willingness to forgo dignity. People who have a strong sense of dignity are disadvantaged and marginalised
Very correct. (Careful with those «we».)
> To the margins
Of what is not your society? Look at reality: it is a Procrustean coexistence of squirrels and Men. There will probably be societies less vile.
Please expand.
A basic transaction record looks something like:
8/11/2024 | Amazon.com | $50
But Level 3 data includes each individual line item:
8/11/2024 | Amazon.com | $50 | 1 Very Embarrassing item | some additional fields
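To make the difference concrete, here's a rough Python sketch of the two shapes of record; the field names are simplified illustrations, not the actual card-network Level 3 field spec:

    # Illustrative only: simplified fields, not the real Level 3 specification.
    # The point is how much more the processor learns from line-item data
    # than from the basic record.
    from dataclasses import dataclass, field

    @dataclass
    class BasicTransaction:            # roughly what a cardholder expects to be shared
        date: str
        merchant: str
        total: float

    @dataclass
    class LineItem:                    # Level 3 adds per-item detail
        description: str
        quantity: int
        unit_price: float

    @dataclass
    class Level3Transaction(BasicTransaction):
        items: list[LineItem] = field(default_factory=list)

    tx = Level3Transaction(
        date="2024-08-11",
        merchant="Amazon.com",
        total=50.0,
        items=[LineItem("Very Embarrassing Item", 1, 50.0)],
    )
    print(tx)  # the processor sees what you bought, not just where and how much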
This appears in all sorts of interesting ways, and is not restricted to B2B/B2G transactions as they state so prominently. Any merchant can sign up if they have a certain number of transactions per year, and they save quite a bit on credit card processing fees in exchange for providing the data.
I can't find the article, but there was a tire company that provided a branded credit card, and they had risk profiles for their customers. The riskiest customers went to some specific bar, and the least risky were buying snow removal tools. (Please forgive my memory if I have the details incorrect.)
edit: Found it https://archive.md/gyde0
"Martin’s measurements were so precise that he could tell you the “riskiest” drinking establishment in Canada — Sharx Pool Bar in Montreal, where 47 percent of the patrons who used their Canadian Tire card missed four payments over 12 months. He could also tell you the “safest” products — premium birdseed and a device called a “snow roof rake” that homeowners use to remove high-up snowdrifts so they don’t fall on pedestrians."
Additionally, if you try to buy large amounts of Visa gift cards it can be problematic; this is one way they catch manufactured spend.
At the end of the day, some merchants are providing every single detail of your transactions down to the line item and all that information is being tagged to you.
But: if the "purchased item" column is filled in the database of the credit card expenses, it means that the shop receiving the payment has transmitted the information. This is an unrequired deliberate action... The credit card company could just receive "Card ...1234 to pay 20u to Acme Inc. shop". That the shop transmit further information to the credit card company is a further action that should be made transparent to the card owner.
I'd actually find it pretty cool to get access to my own Level 3 data for smarter budgeting/analysis (e.g. automatic tracking of food stocks, separating spend on luxury foods from basics, etc.), but I've not found a way to get access as an individual yet.
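If cardholders could ever export that data (which, as far as I know, they currently can't), the budgeting side would be trivial. A toy sketch, with made-up category keywords:

    # Toy sketch: splitting "luxury" from "basics" given line-item data.
    # The keyword list is invented for illustration.
    from collections import defaultdict

    LUXURY_KEYWORDS = {"chocolate", "wine", "truffle", "pastry"}

    def categorize(items):
        """items: iterable of (description, price) pairs from line-item data."""
        totals = defaultdict(float)
        for description, price in items:
            is_luxury = any(k in description.lower() for k in LUXURY_KEYWORDS)
            totals["luxury" if is_luxury else "basics"] += price
        return dict(totals)

    print(categorize([("Organic milk 1L", 1.80), ("Dark chocolate 85%", 3.50)]))
    # {'basics': 1.8, 'luxury': 3.5}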
The solution is proper, enforced anti-spyware and anti-stalking legislation (so not the GDPR), not hardware band-aids that are trivially bypassed.
1. Protection against lawsuits. We reserve the right not to delete any information we hold about you, since if there's a lawsuit we would need it as evidence.
2. Freedom of speech. We are a publisher, so by removing your personal information our right to free speech is threatened, and since this is a foundational legal principle, it overrides any GDPR obligations.
I worked in the communications part of a lender. We couldn't delete anyone's texts or other correspondence for a number of years due to compliance requirements.
And I thought burning my fingerprints off was going to be painful.
The other, more inclusive "we", not just you and me.
So, which "taxis"? Taxis are getting extinct - with cars. (You did not expect by entering a taxi to subscribe a sinister contract with unclear entities - and some will plainly refuse it. The "unlivable society" proceeds.)
There absolutely should be some kind of notice, or at least an opting-in (where the "opt-in" is not the act of simply getting into a cab).
It's irritating that a vital service like this becomes an "all or nothing" deal, where I can't selectively opt out of some shady practice and still use the fundamental service.
I don't see a way to opt-out without plastic surgery.
Make storing personal data like storing hazardous material. Something you absolutely avoid if possible, and treat with extreme care when you absolutely must store it.
Unfortunately the users of this site would rather tell you to move to the woods, go off grid, and paint dazzle camouflage on your face before admitting that a solution has to come from society rather than the individual.
https://www.privacy.org.nz/privacy-act-2020/privacy-principl...
The concern is (as always) when the law is not adhered to by those tasked with enforcing it.
I'm tired of hearing minimizing language like "Police now admit their actions were not consistent with the law" and that being the end of the matter.
If you want to scare businesses, ban arbitration clauses and other self-absolving Terms of Service. It won't stop Pornhub from getting hacked but it will make their lawyers piss themselves imagining the consequences. Trying to enforce SOC2 on the entire internet is an exercise in futility that will end with Russian hackers selling your credit card to teenagers.
Of course it doesn't appear "spontaneously," it's the result of your actions and others' actions, hence the "cabin in the woods" solution. The commenter is implying that expecting each individual to carefully act to preserve their online privacy clearly isn't producing good outcomes, and would like to see collective action through regulation to encourage better outcomes.
> If you want to scare businesses, ban arbitration clauses and other self-absolving Terms of Service.
That is one potential way to implement the suggested "hazardous material" policy. If storing any data opens a business up to legal action with teeth, then they'll stop risking the storage of such data except when the benefit to them outweighs the potential risk. Ideally the risk would be such that it becomes standard practice to process data on-device and to design protocols and services so that only the absolute minimum required amount of information leaves end-user devices.
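As a sketch of what "only the minimum leaves the device" can look like in practice (the endpoint and payload below are made up for illustration): the raw log stays local and only a derived aggregate is ever transmitted.

    # Hypothetical example: aggregate locally, transmit only the derived value.
    # The endpoint URL and payload shape are placeholders, not a real API.
    import json
    from urllib import request

    raw_events = [  # never leaves the device
        {"timestamp": "2024-08-11T09:12:00", "screen": "checkout", "ms": 5400},
        {"timestamp": "2024-08-11T09:15:00", "screen": "checkout", "ms": 3100},
    ]

    # Derive only what the service genuinely needs, e.g. average checkout time.
    summary = {"avg_checkout_ms": sum(e["ms"] for e in raw_events) / len(raw_events)}

    req = request.Request(
        "https://example.invalid/metrics",  # placeholder endpoint
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    # request.urlopen(req)  # nothing identifying or reconstructable is sent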
If you live in the modern world you are producing data about your actions and that data is going to be collected.
Society really needs to do this as soon as possible. These businesses give themselves the right to do anything they want by putting some clause in some document nobody reads.
Except supporting NGOs fighting against surveillance: https://eff.org, https://edri.org.
Not only do we have (non-compliant) consent flows that have destroyed the user experience everywhere (without improving privacy in any way, since again they're not compliant and not actually designed to give you privacy), but the lack of enforcement means companies can now claim various things are GDPR compliant, knowing full well nobody is going to actually examine the claim (and if they do, the consequences will be negligible), giving their users/customers a false sense of security.
These two statements are contradictory. You can't have good enforcement without first implementing a reasonable law, which the GDPR is.
Surveillance per se might be useful: say you want to know how much live traffic there is along your planned trip, or get alerts for incidents, natural phenomena and so on. The issue is just the balance of forces, and what can be done when that balance favours those who hold the knife by the handle.
I like having Google Street View. I don't like having my gate on it, but that's the trade-off: I like Street View, so I have to accept having my door on it as well. The point is who owns what. If Street View were like OpenStreetMap it would be one thing; if it's a private service where the owner decides what to keep and what to publish, then there's a problem.
I like being able to see my car's cameras remotely, but that means other car owners will see me walking around as well. That's an acceptable trade-off if it's balanced (anyone can see their own car's cameras); it's far less acceptable if vendors can see and sell the camera streams while owners depend on them to see, partially or at all, their own footage.
Let's say you are a big edutech player: you have all the information your platform collects sitting on your infrastructure. Even if children and families know what you have and "why" [1], they can't know that you are, one ad at a time, using small bits of that information to steer the school path of talented children you plan to hire tomorrow, or to push aside students with political/philosophical ideas you dislike, to avoid having them as active adults against you.
Long story short:
- we, of course, need personal ownership of our data, the opposite of modern IT, where most information sits in third-party "cloud" hands and users get modern dumb terminals ("endpoints") to interact with the third-party services that own their digital lives;
- we, of course, need to know where our information goes;
but that's not enough: we need information fairness. Someone might use OpenStreetMap data for some business purpose; that's still fair, since anyone else can use and own the same data, and it's a choice whether to do so or not. Google Maps is not like that: Google is the owner, everyone else is a customer.
If we share everything, nothing, or anything in between, accessible to all or to no one alike, we are in a balanced situation. Some will take better advantage than others, because they understand how to and want to, but it's still a fair situation. Otherwise it's a recipe for a dictatorship, which we can more and more call "a corporatocracy".
--
[1] A small anecdote: years ago a leading Italian bank decided to ditch physical RSA OTP tokens for access to their services, mandating a mobile crapplication instead. I filed a formal protest asking for GDPR information, and besides noting that they allow operations from the same mobile device, de facto nullifying the third factor, which is against EU rules (the PSD2 norm mandating a separate device for authentication and operation is largely ignored), they answered me politely, after a significant amount of time, that:
- they ask for camera permission because the app allows scanning QR codes from various payment systems and for live chat (see below); for similar reasons they need gallery access;
- they want speaker access because they offer live audio/video chat assistance in-app, so their operators can talk with customers while being able to see and act on the phone screen;
- they need filesystem access because they let customers pay bills sent as PDFs by email or downloaded from various portals, and the app needs to let the user select those files so they can be processed automatically;
- they need precise location and phone sensors to be sure it's me acting on my device and not a remote attacker;
- ....
Long story short, there are gazillions of plausible reasons for this and that, but I can't know whether these are the ONLY uses of my information or not. I can't be sure even with mandatory AGPL on all systems, because I might have the sources but no way to be sure they are the very ones actually running on their servers.
AFAIK even if the bank has "legitimate" use cases for your private info (and I'm not convinced that those you mentioned are), they aren't allowed to use it for something else without your consent, according to GDPR.
> I can't be sure even with mandatory AGPL on all systems, because I might have the sources but no way to be sure they are the very ones actually running on their servers.
With the AGPL, they must share with you the actual source code running on their servers.
But I can't prove they respect the law. That's the point.
> With the AGPL, they must share with you the actual source code running on their servers.
Same as above: they can share a nearly identical system that I can see matches, but I can't verify it's the same one actually running.
For an example, take a look at the xz "backdooring".
That's still a balance problem: some took advantage of someone else by mimicking something legit. As long as an NK project gets equally backdoored, we would be in balance. You spy on me, I spy on you. You can act behind my lines, I can do the same.
As long as there is enough balance there will be peace and prosperity, because personal advantage becomes common advantage and we all evolve.
Let's say you download all the news articles you read in an accessible, searchable format: say, a feed reader storing posts, with full articles in the feed, plus downloadable historical archives like old Usenet exports. You still get only certain news; others have access to much more, so they might know things you don't and decide not to publish them, to gain an advantage over everyone else...
In finance, insider trading is defined and forbidden for a reason, but nothing equivalent exists for information in general. Alphabet or Apple have immense knowledge about iOS and Android users that none of those users have. How easy would it be for them to spot talent in schools, thanks to their penetration of education, and, one ad at a time, nudge a child onto a path the company likes (not necessarily the one the child likes), then hire them, while pushing others, say humanities students with ideas they dislike, into bad roles? What if the owner of an insurance company also owns an insurance comparison service?
We can't rule nature and we can't design a forever-perfect society, but the more fairness there is, the more positive evolution for everyone we might elicit. The more we slide back toward a feudal-like society, the less positive evolution we can hope for.
Of course owning our information, like our home, our car, ..., is MANDATORY, but far from sufficient. And if you look at the trends... the 2030 Agenda's "you'll own nothing" is already here in the IT world, it's already here in modern connected cars and so on; it will soon be here across the whole society, and that's the greatest asymmetry of information and ownership we can imagine.
So far 99% of cars have purely mechanical brakes and steering, so even without assistance systems you can still act A BIT, but a few are starting to go "by wire" (like the Tesla Cybertruck with its famous "lagging steering")... Ownership is vaster than "just my files on my desktop storage". Similarly, recall the recent "Google location scandal" that wrongly placed some people at a crime scene. Materially it's still just some data in some file, but...
The combination of those items gives people motive and opportunity, and the only thing lacking is a choice of weapon. People have been mass-murdering others over their political beliefs since forever. Poisons are covert weapons, and car accidents can be manufactured.
It is conspiratorial to think this way, but looking through history, up to even recent years, it would be naive to dismiss these concerns.
† I'm not alleging a giant conspiracy theory about direct corporate control of the media, but it is well known that businesses 'seed' articles by sending unsolicited 'fact sheets' and talking points to reporters.