Thursday 27 December 2018

The imminent and well-deserved demise of Tumblr

Truth Coming Out Of Her Well to Shame Mankind

On 3 December Tumblr announced it was getting rid of adult content, effective two weeks from the announcement. Tumblr management has never been competent even at the best of times, but this takes the cake. This will kill the site.

Those of you who don’t have Tumblr accounts will scarcely imagine what a dumpster-fire this is. Just to start with, the main draw of Tumblr was that it was the one social media site that allowed adult content. What Tumblr has just done is basically what I’ve seen a couple of local hospitality businesses do – a pub on the Otago University campus years ago, and a café in Port Chalmers more recently. Both had quirky artwork that gave them an alternative vibe; both were bought by new owners who removed the quirky artwork and tried to rebrand them as bland mainstream venues; both promptly went out of business. The campus pub was eventually revived by a competent proprietor; the Port Chalmers café is currently sitting empty, a monument to the folly of erasing a successful enterprise’s main point of distinctiveness. The same fate awaits Tumblr.

I was on Tumblr for nearly two years. My original hope, when I joined, was to spread this blog a little further, since after six years I have exactly two followers here. This was not to be, because Tumblr’s search software passes over posts that contain links to other sites – far from the only way in which Tumblr fails its users – which meant that the links to this blog were only seen by those few of my followers who happened to be online at the moment I posted them. Additionally, I gained Tumblr followers relatively slowly; I think my peak following, after two years, was 182, including spambots. I think this may have been because I declared in my blog header that I was a nudist but didn’t intend to post any nude photos, which lost me both the people who think nudism is skeevy and the people who want to see nude photos. But I can’t be sure.

(Tumblr advertises itself as being good for artists. How is it any good for artists, you might wonder, if it makes posts with external links non-searchable, so that artists can either get their art seen or link to places where people can pay them for it, but not both? Well, it isn’t, that’s how. And yet so many of us put up with it, until now.)

The announcement came in the form of an official letter from the Tumblr staff account, which I would go through pointing out all the lies, except that it’s all lies. The new guidelines ban, and I quote, “images, videos, or GIFs that show real-life human genitals or female-presenting nipples,” in case you were wondering why the phrase “female-presenting nipples” became a commonly repeated joke in early December. So Tumblr is apparently enlightened enough to talk in terms of gender “presentation”, and yet reactionary enough to frame femaleness as inherently more sexual than maleness. Remember, this is not an old policy. This is a policy devised in 2018 by the curators of one of the internet’s biggest platforms for feminist and LGBT content.

Most of us first knew something was going wrong on 16 November, when the Tumblr app disappeared from Apple’s App Store. For those like me who used Tumblr on our computers rather than on mobile, the first sign was when Tumblr responded by suddenly making huge numbers of posts non-searchable, based on their tags – not just things like #NSFW and #nudity but hundreds more, some quite inexplicable. I never did find out why we suddenly couldn’t search for #chronic pain. Meanwhile, posts tagged with things like #white supremacy and #white genocide continued to pop up in search results.

Our first thought was that this was a ham-fisted attempt to deal with the exponentially increasing nuisance of pornographic spambots. Any post on any subject at all whose count of responses reached five digits would suddenly start getting reblogged by porn-themed accounts with generic comments like “Cool, see my site here.” You could report and block, report and block, report and block, and new bots would keep on coming. They weren’t intended to attract the eyes of the bloggers they were harassing, but to piggyback off them to hoist their coders’ porn sites up the Google search rankings. You see, the way Tumblr works, if you reblog a post and add a caption, then there is now a link to your blog from every other blog that has also reblogged that post. If thousands of people have reblogged it then that’s thousands of links, all from legitimate content-bearing webpages that real people read, and that’s what Google’s search algorithms look for. So our first response was: well, yay that the staff are doing something about this at last, but they need to get a lot smarter about it, please.
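To put numbers on the payoff, here is a toy model of the mechanism as just described; the class, names and figures are my own invention for illustration, not anything from Tumblr’s actual data model.

    # Toy model of the reblog-backlink effect described above.
    # Everything here (names, numbers, structure) is illustrative only.

    class Post:
        def __init__(self, rebloggers):
            # Blogs that have already reblogged this post.
            self.rebloggers = list(rebloggers)

        def reblog_with_caption(self, new_blog):
            """Per the mechanism described above: reblogging with a caption puts
            a link to new_blog on every blog that already carries the post."""
            backlinks_gained = len(self.rebloggers)
            self.rebloggers.append(new_blog)
            return backlinks_gained

    # A popular post with a five-digit response count:
    popular = Post(rebloggers=[f"blog_{i}" for i in range(12_000)])
    print(popular.reblog_with_caption("pornbot_0042"))  # 12,000 backlinks from a single reblog

Which is why reporting and blocking individual bots achieved so little: every fresh bot got the same payoff from a single reblog.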

Of course we were quite mistaken to think Tumblr management had suddenly started caring about their users’ experience of the site. The pornbot coders wised up within a week or two and reprogrammed their bots to use the tag #SFW and no unsearchable tags, which it turned out was all it took to get past the algorithms. Those of us who conscientiously tagged Pre-Raphaelite paintings as #nudity (so people could filter them out on their work computers, in case a boss looking over their shoulder got the wrong idea) continued to be punished for our honesty.
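From the outside, the filtering looked like nothing more sophisticated than a straight tag blocklist, something along the lines of this sketch; the blocklist contents and the function are my guesses, not Tumblr’s code.

    # A guess at how crude the tag-based hiding appeared from the outside.
    # The blocklist contents and the logic are illustrative only.

    BLOCKED_TAGS = {"nsfw", "nudity", "chronic pain"}  # reportedly hundreds more

    def is_searchable(post_tags):
        """Hide a post from search results if any of its tags is on the blocklist."""
        return not any(tag.lower() in BLOCKED_TAGS for tag in post_tags)

    print(is_searchable(["Pre-Raphaelite", "nudity"]))  # False: honest taggers punished
    print(is_searchable(["SFW", "hot singles"]))        # True: the bots sail straight through

Anything that checks only the tags a poster chooses to apply is trivially gamed by anyone acting in bad faith, and penalises exactly the people acting in good faith.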

The next hypothesis was that it was about child porn, and that does seem to be what sparked Apple’s ire. Tumblr’s strategy for dealing with paedophiles was exactly the same as their strategy for dealing with Nazis, to wit “have a Block function and let the users do all the work”. I’m glad to say I never saw any myself, but many other users had been making complaints to the staff about the problem for years, with the same result as every other complaint to the staff: none. Getting dropped from the App Store, now that was something they cared about.

But while the App Store incident undoubtedly fast-tracked the adult content ban, the truth is it had been coming for months. Like many other social media platforms, Tumblr’s business plan is to hire out their users’ eyeballs to advertisers. They evidently noticed that their site was getting popular with the social justice crowd and in particular the Black Lives Matter movement, and they apparently had a big plan in the works to capitalize on that. And many advertisers baulk at the idea of people seeing their ads right next to GIFs of sex acts. So the nudity had to go. We used to think that if Tumblr had just one virtue, it was that they understood that not all nudity is sexual and not everything sexual is degrading; now it turns out they just didn’t care about being degrading until there was money in it.

On top of all that, there’s a draconian piece of legislation coming into force in the United States next year called SESTA/FOSTA – I can’t be bothered looking up the acronyms but the ST in both halves stands for “sex trafficking”. Nobody with any human decency could oppose stopping sex trafficking, which makes it the perfect pretext for interest groups pushing less creditable agendas. Under the new law, social media hosts based in the US will be liable for allowing sexual solicitation on their sites, even unknowingly, even if it’s just one ad. This is why Facebook, from next month on, is going to start cracking down on anything that could remotely be interpreted as a sexual invitation, up to and including posts consisting of “looking for a good time tonight ;)”. This legislation will incidentally stifle discussion of sexuality or sexual orientation in social media; I’m pretty sure that’s a plus as far as its originators are concerned, but it might come as an unpleasant surprise for some of its more liberal supporters.

How deleting Facebook posts saying “looking for a good time tonight ;)” will contribute to stopping sex trafficking is: it won’t. But at least Facebook understands that you need human intervention to make this sort of thing work. Tumblr think they’re going to accomplish it with software, and indeed with ludicrously simplistic software. They’ve promised that nude paintings and breastfeeding photos and news articles about nude protests will be safe once they get their programs properly trained, but the algorithm they’re using could not possibly make such fine distinctions even if you trained it on all the data on the internet for a thousand years. You may have seen funny Facebook posts about the photo-captioning AI, trained on pictures of fields of sheep, that tagged all fields as “sheep” and failed to recognise sheep in any other context. It’s the same algorithm.
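Nobody outside Tumblr knows exactly what their classifier does, but the early flagging behaved about as discriminatingly as the old skin-tone-ratio heuristic. Here is a toy version of that heuristic, entirely my own construction rather than anything Tumblr have published, just to show why this whole class of approach can’t tell a Titian from a porn GIF.

    # Toy "adult content" detector in the spirit of the crude approaches
    # described above. Heuristic, threshold and examples are mine, not Tumblr's.

    def skin_ratio(pixels):
        """Fraction of (r, g, b) pixels falling in a very rough 'skin tone' range."""
        def looks_like_skin(r, g, b):
            return r > 95 and g > 40 and b > 20 and r > g and r > b
        return sum(looks_like_skin(*p) for p in pixels) / len(pixels)

    def flag_as_adult(pixels, threshold=0.35):
        # A nude painting, a breastfeeding photo, a beach snapshot and actual
        # porn can all clear this bar equally well; context never enters into it.
        return skin_ratio(pixels) > threshold

    # 80% warm skin-ish pixels, 20% dark background: flagged, whatever it depicts.
    print(flag_as_adult([(210, 160, 140)] * 80 + [(30, 30, 30)] * 20))  # True

A neural classifier is less crude than this, but the failure mode is similar: it learns surface statistics from its training set rather than context, which is exactly how you end up with the sheep problem.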


Nude images follow.