Oversight Oversights

State Software Collab launches; Battling the Infodemic; Facebook's Supreme Court; Google Fox in NY Henhouse; and much more.


Civic tech responds: Say hello to the State Software Collaborative, a new project of the Beeck Center at Georgetown University. It will be led by Robin Carnahan, most recently of 18F and a former Missouri Secretary of State, and Waldo Jaquith, who led open data efforts in the White House. They are planning to “help states collaboratively procure, develop, and maintain the software they depend on to deliver mission-critical services, so that instead of 50 states buying near-identical, overpriced software 50 times, they can procure high-quality, fair-priced software just once, and share it among themselves.”

Nicole Edwards reports for Code for Canada on how Civic Tech Saint John in New Brunswick is helping local nonprofits fight food insecurity.

Life in Facebookistan: The first twenty members of Facebook’s new Oversight Board have been named, and David Kaye, the UN Special Rapporteur on freedom of opinion and expression, writes at Just Security that he is impressed by their quality, citing several “who are major figures in the world of human rights law and advocacy.”

Siva Vaidhyanathan, a professor at the University of Virginia who wrote a critical book on Facebook called Antisocial Media, is less impressed, noting that the board “will have no influence over anything that really matters in the world. Not on content that never comes down. Not on ads. Not on disinformation campaigns. Not on harassment. Not on Groups. Not on algorithms. Not on Instagram. Not on WhatsApp…. It influences none of the things that make Facebook Facebook: Global scale (2.5b users in >100 languages); targeted ads (surveillance); and algorithmic amplification of some content over other.”

The Oversight Board, which Facebook has staked with a $130 million irrevocable trust, will not be disclosing how much its members are being paid for working roughly 15 hours a month, Sarah Frier reports for Bloomberg. “We heard from stakeholders during the consultation that if we were to share publicly the compensation figure, we might put board members at risk of undue pressure and influence,” she was told. No oversight there!

(By the way, could we just decide whether “oversight” means keeping an eye on things or its opposite? Auto-antonyms!)

Speaking of oversights, Governor Andrew Cuomo has asked former Google CEO and current paid Alphabet advisor Eric Schmidt to head a new Blue Ribbon Commission reimagining the state’s health and education systems, Zack Fink reports. Alphabet has huge business operations in both education and health. As this 2017 story in the New York Times by Natasha Singer details, it has long been colonizing schools with Google Classroom. And Project Nightingale, its secret deal with Ascension, a large nonprofit health system, raises critical questions about what Google is doing with patient health data.

Consumer Watchdog, a longtime critic of all things Google, has written to Cuomo urging him to rescind the appointment and, given the conflict of interest, to preclude Google from providing any of the new technology to be procured by the state’s schools under a proposed $2 billion bond act. “It is entirely inappropriate for a top executive of a company likely to be considered as provider of technology to advise the state on what technology to adopt. This is not the fox guarding the chicken coop, but rather the fox building the coop,” they write.

Workers at Google are being told they can’t expense meals at home to the company, Jennifer Elias reports for CNBC. No more free food for coddled coders!

Deep thoughts on fighting the infodemic: People responsible for protecting the public, like health agencies, need to do a much better job of understanding how information moves in the internet era, Renée DiResta writes for The Atlantic. And especially now, when authoritative information about COVID-19 remains scarce because the disease is so new, the curation algorithms that constantly push “new” stories to the top of our feeds are doing more to damage public trust in science than to support it, she argues.

DiResta does offer some hope: “Some of the best frameworks for curating good information today remain those that involve a hybrid of humans and artificial intelligence: On Wikipedia, an army of volunteer human editors methodically records the facts while using bots to point out suspicious activity, and an arbitration committee—ArbCom—handles users who repeatedly make edits in bad faith. On Reddit, highly qualified moderators are curating coronavirus subreddits that offer substantive discussions about emerging research, while low-quality, misinformation-heavy subreddits have a warning label on them. Twitter has begun verifying the accounts of doctors and other science communicators, recognizing that channels beyond the official CDC and WHO accounts are providing highly useful, up-to-date information. These processes are difficult to scale because they involve human review, but they also recognize the value of factoring authoritativeness—not just pure popularity—into the way information is curated.”

End times: I feel his pain. (h/t David Sifry)

You are reading First Post, a twice-a-week digest of news and analysis of the world of civic tech, brought to you by Civic Hall, NYC’s community center for civic tech. If you are reading this because someone forwarded it to you, please become a subscriber ($10/m) and support our work or sign up for our newsletter and stay connected with the #CivicTech community.