Jack Dorsey thinks Twitter should've been allowed to be a hellscape

Twitter's founder has regrets, but his proposed solutions aren't any better.
By Matt Binder
Dorsey's Twitter wasn't great, but neither is his vision for what he wished it would've been. Credit: PRAKASH SINGH/AFP via Getty Images

Twitter's founder and former CEO Jack Dorsey is reflecting on how things turned out with the social media platform he helped create, which now belongs to Elon Musk.

In both a tweet thread and newsletter post (on Twitter's now-defunct Revue newsletter platform), Dorsey addressed the Twitter Files, the internal company documents being reported on by Musk's handpicked writers Matt Taibbi and Bari Weiss. Dorsey's name and emails have come up a few times in what has already been released.

So far, the Twitter Files have mainly shown internal communications between employees at the company, in which they debate about specific pieces of content, whether that content violated Twitter's rules, and what punitive action to take on those tweets or users. 

In his post about the hands-on way Twitter carried out its content moderation policies, Dorsey sounds regretful. Basically, it seems as though he wishes he'd just let Twitter become an anything-goes hellscape.

"This burdened the company with too much power, and opened us to significant outside pressure (such as advertising budgets)," Dorsey wrote. "I generally think companies have become far too powerful, and that became completely clear to me with our suspension of Trump’s account."

Dorsey's proposed solution lies in these three principles:

  1. Social media must be resilient to corporate and government control. 

  2. Only the original author may remove content they produce. 

  3. Moderation is best implemented by algorithmic choice.

At first glance, some of these principles sound reasonable, but they're not so easy to carry out in practice because you're dealing with human beings. For example, how would Dorsey deal with death threats, the publishing of a user's private data, or child sex abuse material if only the original poster could remove it? His beliefs stem from the idea that everyone on the internet is acting in good faith, which is clearly not the case.

Dorsey somewhat addressed these concerns by saying takedowns and suspensions "[complicate] important context, learning, and enforcement of illegal activity." But this conflates a multitude of issues. If there is some broader context or lesson, then surely moderation policies should take that into consideration on a case-by-case basis. And not everything has to remain publicly visible for social media platforms to alert law enforcement to potential illegal activity.

Obviously, as a for-profit entity, Twitter made choices so that advertisers wouldn't stop spending money on the platform. But many of those decisions were also driven by users of the platform themselves, who did not want to interact with racism or harassment.

Dorsey even brings up one such instance of harassment in his piece: Elon Musk's recent targeting of Twitter's former head of trust and safety Yoel Roth.

"The current attacks on my former colleagues could be dangerous and doesn’t solve anything," Dorsey wrote. "If you want to blame, direct it at me and my actions, or lack thereof."

Roth recently had to flee his home after the Twitter Files narrative painted him as its major villain and Musk not-so-subtly insinuated that Roth was a pedophile due to a disingenuous read of his college thesis.

So how would Dorsey's principles help someone like Roth? "Algorithmic choice," the solution Dorsey holds up as ideal, would just let Roth stick his head in the sand and avoid seeing the threats and harassment in his own feed. It wouldn't stop other social media users from upending his life, because they could still choose to view content about Roth.

"The biggest mistake I made was continuing to invest in building tools for us to manage the public conversation, versus building tools for the people using Twitter to easily manage it for themselves," Dorsey said in his post.

Really, Twitter should have done both. Users should have more control over what they see on social media and how they use a particular platform. But platforms have a responsibility, too. Twitter was right to put filters on certain accounts that still let those users share posts with their followers but kept those posts out of, say, the trends feed. But Twitter should've also told users when their accounts had been hit with such filters, why, and what they could do to fix the issue.

Going strictly by Dorsey's stated principles, it appears he wishes Twitter had a system in place which simply shifted culpability from the corporation and onto its users. And that, Mr. Dorsey, is the opposite of taking responsibility.
