Decolonizing civic tech; cops on Facebook; Pedo-Tube; and much more
Of all the sessions I managed to catch at the Code for America Summit last week in Oakland, which was as packed, energetic and inspiring as ever, by far the most valuable and thought-provoking was one called “Empire.gov” that focused on the idea of “decolonizing civic tech.” The panel—Alberto Rodriguez Alvarez, Sydette Harry, Abbey Kos and Cordelia Yu—raised a lot of tough and important questions about the voices missing from the commanding heights of civic tech at places like the summit, and also pointed to the need for concrete changes, like travel scholarships for potential attendees who couldn’t afford the event. To learn more, read this post shared by the four as part of their panel, and sign up for updates (hopefully video of the session, and an accompanying keynote delivered by Harry, will be available soon).
Coding it Forward’s third summer cohort of Civic Digital Fellows taking up positions across the federal government is now underway with a record 55 fellows, Tajha Chappellet-Lanier reports for FedScoop.
Here’s a 10-point “Algorithmic Bill of Rights” generated by Sigal Samuel of Vox, who talked to 10 experts including Cathy O’Neil, Amy Webb, Joy Buolamwini, and Yeshimabeit Milner.
The Plain View Project, launched by Philadelphia lawyer Emily Baker-White, is documenting police behavior on Facebook, compiling publicly viewable posts, comments and other activity that demonstrate problematic attitudes on race. After building a database of more than 3,500 Facebook accounts of cops, the project found that about one in five current officers and two in five retired officers made posts that display bias, applaud violence, scoff at due process or use dehumanizing language, Emily Hoerner and Rick Tulsky report for BuzzFeed News and Injustice Watch.
Attend: Tonight’s NY Tech Meetup is co-sponsored by Civic Hall and focusing on some of NYC’s best civic tech, including Motivate, Neverware, Pluto, Quadrant 2, VotEd, BLOC and Pathfinder.
Apply: SumOfUs is looking to hire a new executive director.
Life in YouTubistan: Researchers at Harvard’s Berkman Klein Center have found that the video platform’s automated recommendation system has been driving attention to home movies of children, many featuring prepubescent, partially clothed kids, in effect building mass audiences of apparent pedophiles, Max Fisher and Amanda Taub report for The New York Times. On Twitter, Fisher notes, “We talked to one mother, in Brazil, whose daughter had posted a video of her and a friend playing in swimsuits. YouTube’s algorithm found the video and promoted it to users who watched other partly-clothed prepubescent children. Within a few days of posting, it had 400,000 views.”
Fisher and Taub add, “When The Times alerted YouTube that its system was circulating family videos to people seemingly motivated by sexual interest in children, the company removed several but left up many others, including some apparently uploaded by fake accounts. … Jennifer O’Connor, YouTube’s product director for trust and safety, said the company was committed to eradicating the exploitation of children on its platform and had worked nonstop since February on improving enforcement. ‘Protecting kids is at the top of our list,’ she said.”
That said, YouTube has refused to shut off its recommendation system on videos of children, Fisher and Taub report, noting that “The company said that because recommendations are the biggest traffic driver, removing them would hurt ‘creators’ who rely on those clicks.” Just to refresh your memory, back in March, Neil Mohan, YouTube’s chief product officer, told Kevin Roose of the Times that it was not in the company’s “business interest” for its recommendation engine to drive users down a rabbit hole of increasingly extreme content. He also insisted that YouTube wanted to respect “people’s personal tastes” and talked with pride about “keeping users in power, in terms of their intent and the information that they’re looking for.” Indeed.
Life in Facebookistan: Independent researchers focused on information disorder are having a hard time understanding why Facebook has so far refused to label the manipulated Nancy Pelosi-as-drunk video as a forgery or disinformation, instead adding only a bland label saying that there is “additional reporting on this,” Alex Kantrowitz reports for BuzzFeed News.
Here, with the opposite point of view, is Jeff Jarvis of CUNY’s Craig Newmark Graduate School of Journalism, who says he is “amazed that so many smart people thought it was an easy matter for Facebook to take down the video because it was false, without acknowledging the precedent that would set requiring Facebook henceforth to rule on the truth of everything everyone says on its platform — something no one should want.”
My two cents: Facebook already exercises lots of editorial discretion (just ask your nipples), so demoting or removing manipulated content doesn’t set any kind of precedent. What is needed, however, is some transparent and accountable way for Facebook and other platforms to make these judgments. A team of editors could easily make judgments about the difference between political satire that a reasonable person can recognize as such, and deliberate manipulations aimed at duping viewers into believing they are factual truth.
Related: Former US President Barack Obama says he’s worried about deep fakes, telling an Ottawa audience last week that, “The marketplace of ideas that is the basis of our democratic practice has difficulty working if we don’t have some common baseline of what’s true and what’s not,” Zi-Ann Lum reports for the Huffington Post.
Tech and politics: Rep. David Cicilline (D-RI), the chair of the House’s antitrust subcommittee, has announced a sweeping review of Big Tech, including Facebook, Google, Amazon and Apple, Tony Romm and Elizabeth Dwoskin report for The Washington Post.
Relational organizing, and the tools that enable it, could be the way that Democrats regain their tech edge in 2020, Dave Leichtman of Microsoft writes in Campaigns and Elections.
Does digital activism favor conservatives? That’s the argument of a new book, The Revolution That Wasn’t, by Jen Schradie, and as this review by Mary Joyce in SSIR points out, her case study of state-level activism in North Carolina supports that claim. But Joyce, herself a longtime student of digital organizing, adds, “Extraordinary claims demand extraordinary evidence, and a four-year study of a small slice of civil society in a single state is not conclusive evidence for the claim that digital technology has not been politically revolutionary for progressives. There are plenty of counterexamples to that thesis, from #MeToo and the anti-Trump network Indivisible to Iran’s Green Movement, the Arab Spring, Occupy Wall Street, and Black Lives Matter. Digital technology has granted new capacities of inexpensive and large-scale communication, coordination, and anonymity to the marginalized on both the left and the right.”
The 2020 campaigns haven’t taken any protective measures to guard against deep fakes, Kaveh Waddell reports for Axios.
Thursday is the 75th Anniversary of D-Day. Take a few minutes to remember and ask yourself: Could we do this today?