In this series, the Gazette asks Harvard experts for concrete solutions to complex problems. Francine Berman is the Edward P. Hamilton Distinguished Professor of Computer Science at Rensselaer Polytechnic Institute and an associate of the Berkman Klein Center for Internet & Society. Berman’s current work focuses on the social and environmental impacts of information technologies and, in particular, the Internet of Things, a deeply interconnected ecosystem of billions of everyday devices linked through the network.
GAZETTE: Do you think the Internet has been a force for good in the world?
BERMAN: Yes and no. What the Internet and information technologies have brought us is enormous power. Technology has become a critical infrastructure for modern life. It saved our lives during the pandemic, providing the only way for many of us to go to school, go to work, or see family and friends. It has also enabled election manipulation, the rapid spread of misinformation, and the growth of radicalization.
Are digital technologies good or bad? The Internet itself hosts both Pornhub and CDC.gov, Goodreads and Parler.com. The digital world we experience is a fusion of technological innovation and social controls. For cyberspace to be a force for good, social change will be needed in the way we develop, use, and oversee technology, prioritizing the public interest over private profit.
Fundamentally, it is the responsibility of the public sector to create social controls that promote the use of technology for good rather than for exploitation, manipulation, misinformation, and worse. Doing so is enormously complex and requires a shift in the broader culture from technological opportunism to a culture of technology in the public interest.
GAZETTE: How can we change the culture of technological opportunism?
BERMAN: There is no magic bullet that will create this cultural change: no single law, federal agency, institutional policy, or set of practices will accomplish it, although all of these are necessary. It is a long, hard slog. Moving from a culture of technological opportunism to a culture of technology in the public interest will require many sustained efforts on multiple fronts, much like what we are living through now as we work hard to move from a culture of discrimination to a culture of inclusion.
That said, we must begin now to lay the groundwork for cultural change: proactive short-term solutions, fundamental long-term solutions, and serious efforts to develop strategies for challenges we do not yet know how to address.
In the short term, the government must take the initiative. There are plenty of horror stories: wrongful arrests based on faulty facial recognition, victims exposed through the abuse of personal data, intruders talking to babies through connected baby monitors. But the consensus on what digital protections U.S. citizens should have is unsurprising: reasonable expectations of privacy, security, safety, and the like.
“It’s hard to solve problems online that you haven’t solved in the real world. In addition, legislation isn’t useful if the solution isn’t clear.”
– Francine Berman
We have to fix that. The European Union’s General Data Protection Regulation (GDPR) is based on a well-articulated set of digital rights for EU citizens. In the United States we have some specific digital rights (privacy of financial and health data, online privacy of children’s data), but those rights are largely piecemeal. What are consumers’ digital privacy rights? What security and safety should we expect from the digital systems and devices used as critical infrastructure?
Specificity is important here because, to be effective, social protections must be built into technical architectures. If a federal law were passed tomorrow requiring consumers to opt in to the collection of personal data by digital consumer services, Google and Netflix would have to change their systems (and business models) to allow users that kind of discretion. There would be trade-offs for consumers who didn’t opt in: Google searches would become more generic, and Netflix recommendations wouldn’t match your interests as well. But there would also be advantages: opt-in rules put consumers in the driver’s seat and give them greater control over the privacy of their information.
Once a basic set of digital rights for citizens is specified, a federal agency with regulatory and enforcement power should be created to protect those rights. The FDA was created to promote the safety of our food and drugs. OSHA was created to promote the safety of our workplaces. Today there is more public scrutiny of the safety of the lettuce you buy at the grocery store than of the safety of the software you download from the Internet. Current bills in Congress calling for a data protection agency, similar to the data protection authorities required by the GDPR, could create the necessary oversight and enforcement of digital protections in cyberspace.
Additional legislation penalizing companies, rather than consumers, for failures to protect consumers’ digital rights could also do more to encourage the private sector to promote the public interest. If your credit card is stolen, the credit card company, not the cardholder, largely pays the price. Penalizing companies with significant fines, and imposing legal liability on corporate staff, especially those in the C-suite, would offer strong incentives for companies to strengthen consumer protections. Reorienting corporate priorities in this way would contribute to moving from a culture of technological opportunism to a culture of technology in the public interest.
GAZETTE: Is specific legislation needed to address some of today’s thorniest challenges: misinformation on social media, fake news, and the like?
BERMAN: It’s hard to solve problems online that you haven’t solved in the real world. In addition, legislation isn’t helpful if the solution isn’t clear. At the root of our problems with misinformation and fake news online is the huge challenge of automating trust, truth, and ethics.
Social networks largely strip information of its context and, with it, many of the cues that help us judge what we’re seeing. Online, we often don’t know who we’re talking to or where they got their information; there are a lot of bots. In real life, we have ways to vet information, assess the credibility of its source, and use conversational dynamics to evaluate what we’re hearing. Few of these are present on social media.