Sinem Günel is a powerhouse on Medium: writing, giving away advice, and running a six-week Medium writing academy. I am on the fence about whether I should jump into the program. She offers so much free advice across different platforms, including YouTube, that I feel I am not ready yet, and may never be, for the full treatment, depending on how seriously I want to monetize.
I was honoured, though, that at a recent YouTube confab she organized with Medium’s CEO, Tony Stubblebine, he answered my question, the only one he took from the audience. It seemed like a big step up from Medium obscurity for me, especially since I have only been writing seriously on the platform for a month. (I had been on Medium in the past, but never to monetize and never writing consistently.)
You can see my question on YouTube queued up here:
In fact, I had asked three or four questions, and Tony simply picked up on the last one, about content moderation. I asked a simple enough question: how does content moderation work on Medium? Tony didn’t answer me directly about the mechanics, which is what I was after, but instead took a more philosophical approach, explaining that content on Medium is meant to be constructive, and implying, I think, that whatever is going on behind the scenes in terms of machine learning or other complex machinery, it is people themselves who act as a check, molding and nourishing the overall content that characterizes the platform. Obviously both mechanisms work in tandem, but as a relative newcomer to Medium just starting to write seriously again, I can say the whole system seems to be doing a good job, as I added in my question to Tony: I am getting access to a wide variety of opinions (albeit skewing centre-left, I find) without too much nonsense or hatred.
As Tony said, content moderation is very much in the news lately, including the lack thereof on Twitter right now (except perhaps for Elon Musk himself), as Musk got rid of most of the staff whose role this was. Conversely, there is no central moderation team or panel on Mastodon because of its very nature as a decentralized platform on the fediverse: each instance is run by individual sysadmins, who are therefore responsible for what is on their own servers, with the ability to block whole instances if necessary, as many did with Gab, for example, which caters to right-wing and white-supremacist content.
On Mastodon there is no algorithm, no artificial intelligence flagging even the most blasphemous content; instead it is very much the community that does so, whether sysadmins or individual users reporting transgressive posts. Reporting is a mechanism that also exists on Medium and is pretty much standard these days.
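With my technical writing hat on, I imagine the community side of this working something like the following toy sketch: distinct users flag a post, and once enough of them agree it gets escalated to a human moderator. The threshold and names are my own invention for illustration; neither Mastodon nor Medium publishes its actual logic, as far as I know.

```python
from collections import defaultdict

# Hypothetical sketch of community reporting with a simple threshold.
# The threshold value and function names are invented for illustration,
# not any platform's real moderation logic.

REPORT_THRESHOLD = 3  # distinct reports needed before human review

reports = defaultdict(set)  # post_id -> set of reporting user ids
review_queue = []           # posts awaiting a human moderator

def report_post(post_id: str, reporter_id: str) -> bool:
    """Record a report; escalate once enough distinct users flag the post."""
    reports[post_id].add(reporter_id)
    if len(reports[post_id]) >= REPORT_THRESHOLD and post_id not in review_queue:
        review_queue.append(post_id)
        return True  # escalated to human review
    return False

# Three different users flag the same post; the third report escalates it.
report_post("post-42", "alice")
report_post("post-42", "bob")
escalated = report_post("post-42", "carol")
```

Note that using a set of reporter ids means one user filing the same report repeatedly cannot push a post over the threshold on their own.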
I have read about the cheap labour dispersed worldwide whose agonizing job it is to wade through the suspect content on Facebook (and, I assume, many other social networks) to keep the feed clean. Incidentally, I once roomed with an evangelical Christian who would only use Facebook, not the open web, as it represented a safe, sanitized version of life, at the expense, that is, of those workers in poorer countries doing all the horrifying grunt work.
These kinds of mechanics are what I was actually hoping Tony would address, but perhaps the venue Sinem provided was not geared towards backend issues so much as popularizing one’s own writing, which is her forte. With my background of 20 years of software technical writing, however, I still want to know more. I did find this good article on Medium itself, which covers how content moderation systems work at a higher level and touches on some of the things I actually expected Tony might have talked about.
What he did talk about was also interesting and important, however, as he explained that Medium allows for more opinionated content than other long-format platforms might, and I see this playing out in some of my favourite writers on Medium. I doubt you will ever see the kinds of headlines Umair Haque puts out in the Gray Lady (the New York Times), as it would have Wall Street jumping out of windows like during the Great Depression, or Jessica Wildfire’s economic critique in the Wall Street Journal, or Indi.ca’s critique of the whole Western (read: White) world in Time magazine. But on Medium these writers, like so many others of their ilk, are given free rein as far as I can see, and so are allowed to flourish.
There is a very delicate balance here, and I am somewhat forgiving of the big platforms on this, considering the sheer scope of content they have to look out for. The only realistic solution to content moderation, in my eyes, is something I touched on in my previous piece, Avatar and ChatGPT, You are Being Deprecated, where I discuss a quite old blog post of mine reviewing the even older book (from 2000, yikes!) After the Internet: Alien Intelligence by James Martin, wherein he argues that the actual next step of intelligence on this planet will be neither artificial nor human in isolation, but the continued merging of the two. In this way, as I see it, good content moderation will use both facets so that they are more than the sum of their parts.
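To make that hybrid idea concrete, here is a minimal sketch of how I imagine it could work: a machine score triages content, acting alone only when it is confident, and routing borderline cases to a person. The scoring function and thresholds are entirely invented stand-ins, not any platform’s real system.

```python
# Toy sketch of hybrid human + machine moderation: the machine handles the
# clear cases, humans handle the borderline ones. Word list and thresholds
# are invented for illustration only.

def classifier_score(text: str) -> float:
    """Stand-in for an ML model: fraction of flagged words in the text."""
    flagged = {"spam", "scam", "hate"}
    words = text.lower().split()
    return sum(w in flagged for w in words) / max(len(words), 1)

def triage(text: str) -> str:
    """Decide what happens to a piece of content based on its score."""
    score = classifier_score(text)
    if score >= 0.5:
        return "auto-remove"   # machine is confident enough to act alone
    if score >= 0.1:
        return "human-review"  # borderline: route to a person
    return "publish"           # machine sees nothing wrong
```

For example, `triage("buy this scam spam now")` scores 0.4 and comes back as "human-review", which is the point: the two facets together catch more than either would alone.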
Whether on Twitter, Facebook, Mastodon, or Medium, there is a need for good content moderation, in my view, in one form or another, to keep everything civil. Free speech absolutists have an argument in the US, where free speech is enshrined in the constitution, but most other countries on earth take a more circumscribed approach. At least, as Tony Stubblebine did say, Medium will continue to sway towards the opinionated. So in the case of Umair in particular, we may stick our heads in the sand for a moment or two after each of his pieces, but more importantly he gets to point out the serious flaws in our current civilization, something which should not be moderated out at this late date.
Thanks Sinem and Tony!