David Kaye
@UCILaw ★ former UN Special Rapporteur ★ Speech Police https://t.co/kV07QvaFvJ… ★ @TheGNI ★ @Article19org ★ @davidkaye@mastodon.social

Sep 7, 2018, 14 tweets

a wonky thread about what it should mean for social media companies to be transparent. my last report to the @UNHumanRights Council discusses: freedex.org/a-human-rights… #transparency can be so vague as to be meaningless, so some [long] thoughts follow

let me just emphasize that i drew these principles from human rights law (& the work of some great thinkers), not out of thin air. sure, that law applies to states, but companies also have a responsibility not to interfere w/ users' rights.

one aspect of transparency is clarity in the rules themselves (community stds, guidelines, etc). they tend to be broad and general, tho companies are trying to do better, it's true.

companies do report government requests for account actions (take-downs, suspensions, etc). see their Transparency Reports. they are really helpful, but mainly still in the aggregate, not v granular. check out the work of @rmack & rankingdigitalrights.org for more

when it comes to transparency regarding platform rules, there is so much work to be done. opacity generates distrust even if those inside the companies want to do the right thing, as many of them do.

so when i say 'radical transparency', i mean at least the following: first, rule-making transparency. how are the rules made? what standards apply? who is involved? read/follow @Klonick & @TarletonG & @ubiquity75 for incredibly rich background (on anything really).

second, transparency about account actions. have you ever protested a post and heard nothing in response? or suddenly lost access to your 'suspended' account? you know what i mean then.

third, transparency about how companies curate news feeds and search results. that means algorithmic transparency, as much as possible (& my next report to the #UNGA deals w #AI and freedom of expression).

fourth, decisional transparency - we need a kind of platform case law system so that we can understand what the companies are actually doing, creating a more level playing field for discussion and debate over account/content decisions.

transparency alone of course isn't enough to solve concerns about the big companies' power over public and private discourse. but it's critical for accountability & public debate.

one other thing. with all the talk of regulation in the air, i want to urge some caution (at least when it comes to content norms). content reg is often a tool to harm independent and nonconformist voices, if not in design then in practice. we need smart reg, focused on disclosure.

before i conclude, if you care about this stuff, i really urge you to read the work of experts in the space of content moderation, free expression, etc. please start with links here: ohchr.org/EN/Issues/Free… i'd tag them all if i could

i also urge everyone - esp american policymakers - to learn a thing or two about human rights law. it's the only *global* vocabulary for free expression & privacy, & it should be the basis for policies for global companies.

ok now you don't have to go read an 11k word UN report. you're welcome. /end
