Every invention impacts history in a small way, but few change its entire course. Johannes Gutenberg’s printing press, invented around 1436, did just that. Even though he wasn’t the first to automate book-making, his design was the first that could support the mass production and dissemination of books. His printing press paved the way for the first network of news around Europe, for Martin Luther to become the first best-selling author, and for the Scientific Revolution to take hold, basically unlocking the modern age.
The crazy part is that we’re living through the biggest transformation in how we distribute and receive information since Gutenberg’s printing press was invented. Today, the Internet, social media, and tech companies make it possible for anyone with internet access to produce and disseminate information in an instant.
It’s important to remember that anything capable of disruption is inherently unpredictable. There’s no way that Gutenberg’s contemporaries in his hometown of Mainz knew that his invention would impact the course of humanity. The same is true of the Internet. We can only observe how the digital age is changing us now and theorize about the future, including trying to remedy problems as we discover them along the way.
One such concern is user data and privacy, and specifically, how tech companies are using our data for monetary gain. We’ve covered what you need to keep in mind if your business collects user data and wants to comply with current law, but problems arise when regulation can’t keep up with the fast pace of innovation and technology, or when a company is so good at collecting and monetizing data that it simply ignores privacy law. Cambridge Analytica brought this into the mainstream.
Cambridge Analytica, a now-defunct tech firm, claimed to have up to 5,000 data points on 240 million Americans using “psychographic” analytics in its dataset. The firm used a third-party app to improperly collect data from 87 million Facebook profiles and used the dataset to target particular people for the firm’s clients. While the methods that Cambridge Analytica used to collect user data violated privacy law, the practice of collecting and monetizing user data is an incredibly successful and common business model. In fact, it has become its own economy.
The data economy is propped up by what Shoshana Zuboff, professor emerita of Harvard Business School, calls “surveillance capitalism.” She argues that tech corporations are mining user information to predict and shape behavior, in violation of personal autonomy and counter to principles of democracy. Cambridge Analytica was a particularly blatant example of this; Google, Facebook, Twitter, and Amazon are a bit more nuanced.
The basic model is the same: tech companies create a free service or platform that users happily access and share with their friends. The companies then monitor user behavior and, as Zuboff writes, monetize the “human experience” by translating it into behavioral data. And, since nothing like this has existed before, there weren’t (and still aren’t) adequate regulations in place to ensure that user data is protected.
I mean, have you thought about how Google photographed and posted every house and street that it could for Google Maps without asking permission? Or that Facebook’s original advertising system, called Beacon, automatically sent data from external websites to Facebook for the purpose of targeted advertising—without users being able to opt out? Even today, you pay more for an Amazon Prime membership than Amazon pays in taxes. User data was, and continues to be, the product in a data economy, and regulation of Big Tech hasn’t been able to keep up.
So, what can we do about it?
Well, first, we need to recognize that asking surveillance capitalists to obey traditional notions of privacy is a non-starter. Zuboff says that asking for privacy from these tech companies “is like asking a giraffe to shorten its neck, or a cow to give up chewing.” Tech firms are not going to willingly hand over the mechanism of their corporate survival.
Part of the solution probably includes widespread recognition that tech companies are collecting and monetizing our data. Just last year, U.S. Senator Orrin Hatch, who would presumably vote on regulatory provisions, asked Mark Zuckerberg how Facebook was able to continue operating without charging users for its services. Zuckerberg’s smirk and reply, “Senator, we run ads,” went viral. Though the exchange was largely Twitter comedy at Senator Hatch’s expense, it’s emblematic of the widespread confusion about how tech companies and platforms that impact billions of people operate day-to-day. If you’ve been on the Internet, you are a data point, and those who make our laws need at least a basic understanding of what’s happening in order to regulate appropriately.
Another part of the solution is transparency. The data economy is here to stay. We can’t put the toothpaste back in the tube, genie back in the bottle, whatever-escaped-Pandora’s-box back in the box. But we can require companies to tell us what they are doing with our data and give us a choice on how it’s used. Requiring companies to disclose this information is at least a small step toward accountability.
We’re in the process of coming to terms with what’s happening. The European Union implemented the General Data Protection Regulation in 2018 and has penalized Google, Facebook, and Apple over antitrust practices and illegal tax breaks. Globally, about 77% of people say they worry that their internet privacy is at risk, and about 74% say technology giants should see their powers limited. As I’m writing this article, Jeff Bezos of Amazon, Tim Cook of Apple, Mark Zuckerberg of Facebook, and Sundar Pichai of Google are being questioned before a House committee.
As far as what we are likely to see in coming years, here are a few predictions:
- Some countries are more inclined to regulation than others, and America is not likely to be at the forefront. Americans value creativity and innovation, and many don’t want to see those stifled by potentially prohibitive government regulation. However, if other powerful democracies around the world begin to regulate Big Tech, we are more likely to follow.
- Any new regulatory schemes will need to be adaptable. As we talked about in Regulation in the Gig Economy, the tech industry evolves much faster than law. The MIT Sloan Management Review advocates for a “standard-based approach,” which can be reworked for new risks. “With a standards-based approach, regulators can introduce new guidelines to encourage sensible innovation or, conversely, swiftly hold tech companies accountable when unforeseen risks arise.” Big Tech will continue to evolve, so any new regulation would have to evolve, too.
- If new regulations do occur, they will likely focus on the conduct of companies rather than structural changes. Rob Atkinson, president of the Information Technology and Innovation Foundation, says that despite recent conversation about breaking up Big Tech companies or limiting their power, “U.S. antitrust laws are more permissive than in other parts of the world, changing it requires a large consensus among legislators”—which we are not known for. That’s why, if regulation comes, it’s likely to target company conduct that could garner bipartisan support. Though, especially after the recent hearing with the Big Tech CEOs, tech companies will likely prevail as long as legislators disagree on which part(s) of the industry should be regulated, whether that’s antitrust and monopoly concerns, the spread of misinformation, or political bias.
The antitrust and data privacy laws we have in place right now are probably more tailored to regulating works made by Gutenberg’s printing press than to regulating Big Tech firms. While it remains to be seen exactly how legislators and regulatory agencies will check the power of Big Tech, the question is more “when” than “if.” We may not see immediate action following the testimony from Big Tech CEOs, but the fact that they were required to testify on Capitol Hill is indicative of some movement. So, maybe, is the fact that you’ve read this far into an article on data privacy. It’s just a matter of time.
About Alexandra Smith
Alexandra “Alex” Smith is a student at Belmont University College of Law and a summer law clerk at Think Tennessee. Her awesome thinking started when she was a 2018–19 graduate fellow at Rockridge Venture Law. (Actually, it started before then, but we like to take as much credit as possible for anticipating all the monumental things Alex will do on the other side of law school.)