There are steps that the average user can take to protect their own privacy, such as replacing products that have surveillance built in with ones that don’t: replacing Google Chrome with Brave, Google Search with DuckDuckGo, getting rid of Facebook and YouTube, and so on. But I think, while well intended, this won’t address the core threats that modern technology poses to our privacy. Let’s use Facebook as an example. 500,000 people join Facebook every day, and unless an equal or greater number is leaving the platform (there’s not), one informed average user isn’t changing much. Facebook remains in power, and our society goes further down a path where bad privacy practices get rewarded.

One last point on this topic: Facebook can track you even if you don’t have a Facebook profile. If a friend uploads a picture of you to Facebook (a group photo including you, perhaps), or if you visit a website that’s partnered with Facebook Pixel (their trans-internet advertisement platform), then Facebook will build a shadow profile for you and collect information regardless of your lack of a profile.

The best thing an average user can do is to fight like hell to get regulation and legislation passed that can combat these practices: regulation that limits how long a company can retain data on its users, that gives users the power to delete their data, that requires more opting in and explicit permission to track and monitor users, and so on. Most users want this (79% of US voters want Congress to prioritize a privacy protection bill), but it won’t happen until we get specific about what we want and put pressure on Congress to do it (the current election cycle is a great time to voice these concerns).
You can’t, and a whole portion of my book is about this topic. The average time spent online is over six and a half hours daily, and there are over 2.3 billion people on Facebook and over 2 billion on YouTube. These social media sites have become the medium for our connections, our political discourse, and our pastimes. Losing access to them, or forgoing them for privacy reasons, cuts people off from crucial societal functions.
This isn’t to hand-wave the fears that people have about social media use. People fear being doxxed or cyberstalked. People fear the permanence of things posted to the Internet. People fear the privacy violations.

But all these fears can be addressed through more focused and tougher regulations. For example, we should make doxxing illegal and strengthen cyberstalking laws. We should create data laws that allow people “the right to be forgotten” (the GDPR provides this). We should create stringent privacy laws that hamper the egregious amounts of data collection and allow us to better control it. And once we introduce these protections, we might become less anxious about social media use.
The aggregation and storing of another non-consenting person’s publicly available data as a means to collect personal information about them and use it to harm them.

That might not be the best definition for a layman. I would also include the etymology that “doxx” comes from “docs,” or documents.
I wouldn’t sign on to creating such a policy without knowing more about what specifically the pros and cons would be. But I do think it should be studied.
I often compare where we’re at today with where we were in the late ‘60s with the environment. We could see that industrialization was having impacts on the environment, but we didn’t have the research to say what the best path forward would be. So we created the EPA to look into it: to study these effects and create sensible regulations and technocratic solutions.
As I said, this is comparable to where we’re at today. We can see the effects of modern technology on our youth: less sleep, higher stress, lower employment, etc. And this is coupled with stories about how the people running these tech companies don’t let their own kids use their products. But before we endorse blanket policies about banning X or Y completely, we should investigate what the best policy would be. For example, banning social media might seem like an enticing way to help teens get more sleep, but if studies show that the actual cause of the teen sleep problem is increased pressure from school, then our policy is just needless restriction. So we should create an “EPA of Tech Ethics” to research these perceived threats and derive technocratic solutions.
To be more direct to your two questions, I think one thing the modern era of technology is providing is new and more effective means of verification. So when we look at a policy like COPPA, which prevents sites from rendering services to and saving data on those under 13, and we know that letting children hit a checkbox saying they’re 13 isn’t an effective means of verification, then we feel compelled to use those new means of verification to solve that issue. My counter would be that those new means of verification almost certainly come with the relinquishing of personal information, and that this violation of privacy could have a worse effect on the individual than whatever might befall them from accessing the service.
You can find Brian Wolatz on Twitter at @BrianWolatz. You can preorder the Kindle edition of his book, The Gig Society: How Modern Technology is Degrading Our Values and Destroying Our Culture, by clicking here. Click here to order the physical version. His website is brianwolatz.com.