A whopping 800,000 people registered to vote on National Voter Registration Day.
But Congress and state governments still have not taken the recommended measures to increase the security of the midterm elections. <thread> eff.org/deeplinks/2018…
At this year’s @defcon, researchers evaluated a voting machine that’s used in 18 different states. They demonstrated that someone can gain admin access in under two minutes, letting them change settings or even the ballot itself. defcon.org/images/defcon-…
A Congressional working group concluded that election infrastructure is largely insecure across the country, with 42 states using machines susceptible to vote flipping, and at least ten states using machines that provide no paper record or receipt. documentcloud.org/documents/4379…
When @mcsweeneys editors approached EFF earlier this year about collaborating on a surveillance & privacy-themed essay collection, we jumped at the opportunity.
The first all-nonfiction issue of Timothy McSweeney’s Quarterly Concern debuts this November: eff.org/deeplinks/2018…
“The End of Trust” features writing by EFF’s team, including Executive Director Cindy Cohn, @maassive, Soraya Okuda, @doctorow, and board member @schneierblog, exploring issues related to surveillance, freedom of information, and encryption. eff.org/deeplinks/2018…
A group of organizations, advocates, and academics—including @EFF—came together in February to create the Santa Clara Principles on Transparency and Accountability in Content Moderation. We're happy to announce that the Principles now have a permanent home: santaclaraprinciples.org
The Principles set a minimum standard for transparency and accountability for communications platforms, and should serve as a basis for more in-depth dialogue and activism going forward. santaclaraprinciples.org
The Principles ask that companies be transparent with the public and their users about content takedowns and account suspensions, and give users a timely, meaningful opportunity to appeal. santaclaraprinciples.org
Anyone looking to make changes to how online platforms police speech should learn lessons from the failures of using copyright to do the same. Here are five major takeaways from the copyright wars: eff.org/deeplinks/2018…
1. Mistakes will be made. The law gives platforms huge incentives to take things down after getting a complaint, leading to people seeing their work disappear due to fraudulent takedown notices. Content moderation policies have made, and will continue to make, similar errors. eff.org/takedowns
2. Robots are not the answer. We've seen the mess that automated filters like YouTube's cause.
On Monday, a federal court dismissed our lawsuit against the Justice Department to block enforcement of #FOSTA. (1/5) eff.org/deeplinks/2018…
The case was filed on behalf of two human rights organizations, a digital library, an activist for sex workers, and a certified massage therapist. The court did not reach the merits of any of the constitutional issues, but instead found the plaintiffs did not have standing. (2/5)
We’re disappointed & believe the decision is wrong. For example, the court failed to apply the standing principles usually applied in 1st Amendment cases in which the plaintiffs’ speech is chilled. The plaintiffs are considering their next steps. (3/5)
In 2014, we launched Onlinecensorship.org (@censored) to collect reports from users who had experienced content takedowns on social media, in an effort to encourage companies to operate with greater transparency and accountability as they make decisions that regulate speech.
Today, we're relaunching the site with a fresh new look! We're still collecting reports from users, but going forward, @censored will be home to more resources for users, journalists reporting on content moderation, and companies.
Our journalist toolkit offers insight into a diverse set of issues and is a one-stop resource for information on content moderation policies: onlinecensorship.org/content/a-reso…