Cathy O’Neil: What kind of information economy will support healthy democracy?

The internet has been great at connecting us with products, but not at connecting us as a polity.

By Cathy O’Neil

Bloomberg View

The internet, led by tech giants Amazon, Facebook and Google, has been great at connecting us with products. It has been less good, however, at connecting us as a polity. It’s time we started imagining what an internet optimized for the citizen would look like.

I’m a plus-sized woman, and when I was pregnant with my first son in 1999, finding maternity clothes was a pain. The maternity shops didn’t have my size, and the plus-size shops didn’t sell maternity clothes. By the time I was pregnant with my third son in 2008, the internet offered all I needed — very cheaply and with quick delivery.

We’ve made a trade. In exchange for commercial surveillance and data collection, we have the world’s best at-home mall. And just as my friends and I used to socialize at the mall when we were teenagers, nowadays we socialize online in commercial spaces like Facebook and Twitter. The stores around us, informed by our digital profiles, magically transform to meet our needs and desires. We have the tailored internet experience.

This happy story has exceptions. Predatory industries use the same profiling technology to locate folks desperately in need of money or vulnerable to gambling pitches and sell them things they shouldn’t buy. But in large part, we’re in consumer nirvana.

The dark side of this revolution is that data collected for one use can be repurposed for another. Our marketing silos are information silos, too. Thanks to our tailored experiences, we barely share a common reality with people who disagree with us. They see different products, different political news, even different facts.

It crept up on us slowly, while we were playing with our new toys. But now the internet giants have enormous power. Facebook and Google together collect 77 percent of U.S. digital ad revenues. They account for a growing portion of political ads and offer us news environments that they control and edit. They capture so many hours of our attention every day that it's hard to see how we wouldn't be influenced.

This influence isn’t benign. Just last month, the New America Foundation shut down its Open Markets initiative after a big donor — Eric Schmidt, executive chairman of Google parent Alphabet Inc. — reportedly objected to its position on European antitrust regulation. The tech giants are spending increasing amounts on lobbying, lest we think of their market power and access to data as anything other than inevitable.

Consider the implications for society. The way Google presents search results can influence voter opinions and possibly help radicalize vulnerable people (remember Dylann Roof?). Facebook’s experiments with its “I voted” button suggest that it could shift election outcomes. Imagine the possibilities if Facebook CEO Mark Zuckerberg ran for president (not an unthinkable scenario). He would be completely within his rights to direct his engineers to skew the Facebook newsfeed toward positive coverage of his campaign.

We caught a glimpse of what that might look like in the last presidential campaign. Donald Trump’s team, for example, reportedly planted Facebook posts as part of an operation to suppress the African-American vote. And now we know that fake accounts, likely operated out of Russia, spent about $100,000 on Facebook ads ahead of the 2016 election — ads that might have reached tens of millions of people. Yet researchers can’t get a more precise sense of what happened, because Facebook hasn’t released the data needed to do so.

Facebook gives us a lot. It’s a place to connect with friends and family. It’s also the most efficient propaganda machine ever built by man. How can we decouple those two things and promote one while limiting the other? More generally, if we decide that we’ve made a bad trade with big tech, what can we do about it?

The authority of the inscrutable makes it hard for any individual to take on the tech giants — nobody who isn’t a data scientist specializing in machine learning can even understand the algorithms. That leaves government, which has so far focused on antitrust matters related to consumers, not to the state of our democracy. And the classic antitrust approach of splitting up companies to increase competition probably wouldn’t do the trick: Who can compete against Google without the data, the engineering teams, the infrastructure and the code?

We’ll have to try something else. It might involve setting a legal standard for transparency, so that people will know how their data is being sold, for how much and to whom. Or it might require finding ways to limit the use of targeted advertising, so there would be less incentive to collect quite so much data about us.

Maybe what we need is a sort of internet-age analog to the ideal of public broadcasting — a new and parallel internet optimized for the citizen rather than for the consumer. Picture a space that isn’t polluted with ads, that allows people to search for, say, information about health from sources that have nothing to sell them. We would go there to learn, listen and engage, and go back to the commercial internet only when we wanted to buy something. It might start locally, at the city or neighborhood level, curated and informed by local standards and customs. The people who created content could be compensated by their communities.

There’s probably no single or perfect solution, but we have to start somewhere. Extrapolate 50 years into the future, when the big data of today will look puny and the algorithms crude and inaccurate. What kind of information economy will support a healthy democracy?

This isn’t a technological challenge. It’s a political imperative. We can’t give away our democracy to a tiny group of profit-seeking corporations in return for a better shopping experience. At a minimum, we should demand that Facebook and Google make public all the data on their political ads, including who bought them and how they were targeted.

At least then we would have a better idea of what we’re up against.

Cathy O’Neil is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of “Weapons of Math Destruction.” Readers may email her at cathy.oneil@gmail.com.