this post was submitted on 18 Oct 2024
158 points (98.2% liked)
Asklemmy
SSRN is a kind of vast warehouse of academic papers, and one of the most ~~excited~~ cited and widely read ones is called "'I've Got Nothing to Hide' and Other Misunderstandings of Privacy."
The essence of the idea is that privacy is about more than just hiding bad things. It's about how imbalances in access to information can be used to manipulate you. Seemingly innocuous bits of information can be combined to reveal important things. And there are often subtle, invisible harms that are systemic in nature: surveillance-state institutions can use them to exercise greater control in anti-democratic ways, and they can create chilling effects on behavior and free speech.
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=998565
le user generated summary (no gee-pee-tee was used in this process):
comment 1/2
Section I. Introduction
skip :3
Section II. The "Nothing to Hide" argument
We expand the "nothing to hide" argument into a stronger, more defensible thesis, so that we can attack it more cleanly.
Section III. Conceptualizing Privacy
A. A Pluralistic Conception of Privacy (aka "what's the definition")
Privacy can't be defined as intimate information (your Social Security number and religion aren't "intimate"), or as the right to be let alone (shoving someone isn't a privacy violation, even though it doesn't leave them alone), or as Orwellian 1984-style surveillance and chilling social control (tracking your beverage purchase history isn't social control) (p. 755-756 or pdf page 11-12).
Privacy is kind of blobby, so we define it as a taxonomy of related problems in four groups: information collection (e.g. surveillance), information processing (e.g. aggregation, secondary use, exclusion), information dissemination (e.g. disclosure, breach of confidentiality), and invasion (e.g. intrusion) (p. 758-759, or pdf page 14-15).
So privacy is a set of protections against a set of related problems (p. 763-764 or pdf page 19-20).
comment 2/2
B. The Social Value of Privacy
Some utilitarians like Etzioni frame societal needs and individual needs as a dichotomy in which society should usually win (p. 761 or pdf page 17). Others like Dewey think "individual rights are not trumps, but are protections by society from its intrusiveness," to be measured in welfare, not utility. "Part of what makes a society a good place in which to live is the extent to which it allows people freedom from the intrusiveness of others" (p. 762 or pdf page 18). So privacy can manifest as our right not to be intruded upon.
Section IV. The problem with the "Nothing to Hide" argument
A. Understanding the Many Dimensions of Privacy
Privacy isn't just about hiding a wrong, concealment, or secrecy (p. 764 or pdf page 20).
Being watched has "chilling effects [i.e. getting scared out of doing something that] harm society because, among other things, they reduce the range of viewpoints expressed and the degree of freedom with which to engage in political activity"; but it's quite hard to prove that a chilling effect happened, so it's easy for a Nothing to Hider to claim that the NSA's "limited surveillance of lawful activity will not chill behavior sufficiently to outweigh the security benefits" (p. 765 or pdf page 21). Personal damage from privacy violations is hard to prove by nature, but it still exists.
If we use the taxonomy, we notice that the NSA surveillance and data-mining programs implicate several distinct problems: not just surveillance, but also aggregation, secondary use, and exclusion. But the Nothing to Hide argument only focuses on one or two of these conceptions and ignores the others, so it's unproductive (p. 766-767 or pdf page 22-23).
B. Understanding Structural Problems
Privacy harm isn't usually one big dramatic injury, like when Rebecca Schaeffer and Amy Boyer were killed by stalkers who used DMV data and a data-broker's records respectively (p. 768 or pdf page 24); it's closer to a bunch of minor harms that accumulate, like gradual pollution.
Airlines violated their privacy policies after 9/11 by giving the government a load of passenger info. Courts decided the alleged contractual damage amounted to nothing and rejected the contract claims. But this breach of trust falls under the secondary-use part of the taxonomy, and it's a power imbalance in the social trust between corporation and individual: if the stated promise is meaningless, companies can do whatever they want with your data. That's a structural harm even if your personal damages are hard to prove (p. 769-770 or pdf page 25-26).
There should be oversight: warrants need probable cause, and wiretaps should be minimized and judicially supervised -- Bush oopsied here (p. 771 or pdf page 27).
"Therefore, the security interest should not get weighed in its totality against the privacy interest. Rather, what should get weighed is the extent of marginal limitation on the effectiveness of a government information gathering or data mining program by imposing judicial oversight and minimization procedures. Only in cases where such procedures will completely impair the government program should the security interest be weighed in total, rather than in the marginal difference between an unencumbered program versus a limited one. Far too often, the balancing of privacy interests against security interests takes place in a manner that severely shortchanges the privacy interest while inflating the security interests. Such is the logic of the nothing to hide argument" (p. 771-772 or pdf page 27-28).
Section V. Conclusion
The Nothing to Hide argument defines privacy too narrowly and ignores the other problems of surveillance and data mining.