What Cities Can Learn from Congress’s Facebook Hearings

April 17, 2018

Last week, Facebook founder and CEO Mark Zuckerberg took to Capitol Hill to answer questions before Congress regarding his company’s recent data privacy revelations.

While several members of Congress sought answers regarding Facebook’s plan to prevent this from happening again, the hearings raised issues that are just as pertinent for local government leaders — especially those guiding technology innovation in their own communities.

There is a pressing need for policy makers to better understand the technologies they regulate.

One of the most striking takeaways of the hearings was the amount of question time devoted to increasing legislators’ basic understanding of the Facebook business model and its mechanisms for extracting and using personal data. Zuckerberg also deferred several technical questions, responding that he would need to check with his colleagues on topics such as the extent to which users can download their own information.

A lack of subject matter knowledge, particularly among decision makers, is a significant vulnerability that runs through every level of government and every area of innovation. In the increasingly urgent realm of cybersecurity, for example, local officials’ limited understanding prevents them from adequately protecting themselves and their constituents.

The International City/County Management Association reports that 50% of local officials surveyed have never taken cybersecurity awareness training, and that improving such awareness among government workers is one of the top three most salient factors in achieving a high level of cybersecurity. Leaders at all levels have an obligation to develop their own knowledge in this arena to better protect public interests.


Self-regulation doesn’t work.

According to Zuckerberg’s testimony, Cambridge Analytica falsely certified in 2015 that it had deleted all data collected without the subjects’ consent. As Zuckerberg stated, “In retrospect, it was clearly a mistake to believe them.” Several members referenced the permissive environment that allowed Facebook to handle data at its own discretion, with Representative Jan Schakowsky putting the problem bluntly: “Self-regulation doesn’t work.”

This statement echoes growing concerns in other corners of technological innovation. The National Highway Traffic Safety Administration currently uses a self-certification system for developers who want to place their autonomous vehicles (AVs) on the road. States, too, offer this leeway to developers, with Pennsylvania rolling out voluntary standards for self-driving vehicle testing just last week.

California’s recent decision to permit self-driving vehicles on the roads without a human present relies on developers to self-certify that their technology can do so safely. Given last month’s fatal AV accident in Tempe, Arizona, the willingness of government at all levels to allow for self-regulation of AV technology without some limitations or oversight is now in question.

Data issues are larger than one company or one industry.

Google and Amazon similarly hold tremendous amounts of consumers’ personal data, and their leadership should face the same scrutiny as Facebook’s (indeed, a number of legislators referenced those companies throughout the hearings). Looking to fix Facebook’s recent incident alone is akin to addressing one symptom of a much larger problem. Further, applying new regulations to the technology industry alone may be insufficient.

Representative Greg Walden encapsulated the extraordinary breadth of the company’s activities and the increasing blurring of sectors, asking first, “is Facebook a media company?” and later, “is Facebook a financial institution?”

In search of an appropriate forum to take up Facebook and related issues, Representative Raul Ruiz suggested creating a committee or task force dedicated to consumer data protection. Cities should similarly take stock of not just their regulatory structures but their staffing needs and responsibilities when grappling with smart city technology issues.

Many are creating new positions, such as a chief information officer, or entire innovation departments that can adequately address the complexity of regulating innovation. How a city documents its steps to protect its citizens’ data should be a consideration for these departments or for broader task force efforts.

Equity disparities should not be coded or codified.

Both Senator Cory Booker and Representative G.K. Butterfield noted the lack of ethnic diversity among Facebook’s staff, with the Senator tying this shortcoming to the company’s recent history of permitting advertisers to place discriminatory ads based on a consumer’s ethnic or racial affiliation. This finding is just one example of the perils of unchecked AI technology.

A 2017 NLC report on the Future of Equity in Cities highlighted the risk that new technologies will perpetuate inequity or bias, and the responsibility of local leaders to ensure that innovations brought to the city benefit all residents. The need for municipal leaders to provide safeguards against equity erosion is well captured by Zuckerberg’s testimony: “I think the big mistake…is viewing our responsibility as just building tools, rather than viewing our whole responsibility as making sure that those tools are used for good.”

Policy makers need to proactively prepare for technology growing pains.

Zuckerberg, as well as several members of Congress, pointed out the benefits of his platform, from crowdfunding hurricane relief efforts to encouraging the registration of 2 million voters. Despite Congress’s and the public’s growing list of grievances, it would be neither fair nor accurate to say that Facebook has not contributed positively to the world. As a case study, Facebook represents the inevitable trading of old problems for new ones as we move toward a denser and more connected experience.

There will be accidents on the way to autonomous vehicle deployment. There will be cyber attacks on the way to smart cities. There will be jobs lost on the way to an automated workforce. City leaders need to start planning their reactions to adverse events now and readying themselves for hard conversations with their constituents. Communities must have input in decisions about values and how much risk versus reward to carry.

The answers will be different everywhere, and local leadership can play a tremendous role in guiding this dialogue.

About the Author: Lucy Perkins is an Associate in NLC’s Center for City Solutions and Applied Research. Her work covers topics in technology and urban innovation.