I Protest.
This is my protest.
I exist and live my inimitable, messy, biological, human reality, revelling in real life and surviving to write about it. I write in my wild imperfection and diversity and absurdity.
These words may tend towards the technological or poetical or lyrical, they may elucidate or illuminate, they may seem politely publishable or become uncomfortable or inappropriate or taboo. Not all will be “safe for work” or suitable for polite discussion around a conservative dinner table.
This site and these writings are motivated by a will to rebel, disillusionment with the contrivance of normality, rejection of any onus to conform and a healthy modicum of disobedience. This is my counter-punch to the world that very nearly broke me.
In this year 2024, we live in a world on fire: a world in which a trajectory of hope was glimpsed, once, but diverted and subverted to serve money.
In the context of technology, the World Wide Web was the embodiment of that hope. The web was not destined to be free. It was deliberately and consequently set free[1][2][3][4] by Sir Tim Berners-Lee et al., counter to the pressures of the corporations and entities involved.
The World Wide Web – denoted W3 by CERN – was released into the public domain almost thirty-one years ago, on 30 April 1993[1]. Only because it was set free could it fascinate and captivate the imaginations and creativity of its early pioneers.
Those pioneers built quirky web sites. Those pioneers coalesced into web rings. They embraced the <BLINK> tag[5] and tacked counters and badges onto their garish, amateur pages.
Soon came GeoCities (’94) and LiveJournal (’99) and Blogger (’99) and the vastness of the blogosphere[6] and, in conjunction with their myriad competitors and alternatives of the early web 2.0 era, they would extend a helping hand to those less technically minded. All these authors and creatives and wildly opinionated mavericks required easy content management systems but lacked none of the passion of the pioneers.
The media was not limited to the written word: Flickr (2004) and YouTube (2005) found their genesis, for photography and videography, and they are but two names that remain, today, out of the countless swathes that also ran the dot com race.
From this creative slurry, in this same petri-dish, spawned bulletin boards and discussion forums and, indeed, Wikipedia (2001) foremost! Online documentation and reference matter began to be seen and, by the end of the ’90s, Google made this knowledge searchable and vowed not to be evil[7]. We followed people with RSS (’99) and, for a fleeting instant, we felt confident using Google Reader (2005) to aggregate those authors and sources we cared to read.
Far be it from me[8] to proffer a comprehensive history of the web; let these anecdotes suffice: the trend is apparent.
The early web offered hope and went further, tracing a trajectory that offered platforms to all who would write or film or photograph or record or publish anything, while democratising easy and near-instant access by those who wished to fetch it. It offered homes for communities of people and their diverse masks and personæ.
Today, in 2024, it is hard to see the free and freeing web about which we dreamed.
It is too easy to believe that that glimpsed trajectory was hopelessly diverted, leaving only “five giant websites, each filled with screenshots of the other four”[9] – enshittified[10] platforms owned by private equity and imminently guillotinable shareholders and billionaires.
Technology news is overwhelmed with noise and hyperbole about the large language model (L.L.M.), inflating an artificial intelligence (A.I.) bubble, and it is too easy to succumb to despair: resigning ourselves to accept that real, live human authors and artists stand no chance in the face of the torrent of mediocrity that these stochastic parrots[11] promise to emit.
The immense hype surrounding A.I. is not out of proportion if one considers that these stochastic parrots[11] promise a perfect end-game to the neo-feudalist, end-stage capitalist or advertising baron.
Their platforms and algorithms seek to optimise for a single measure – engagement – to addict consumers, monopolising their attention in order to attract buyers of advertising exposures. Long before generative A.I. became practical, they exploited the algorithmic curation of human-created content into feeds to accomplish this.
To the present-day baron who perceives it as a worthy endeavour, generative A.I. offers to automate the roles of the artist and the influencer on social media and the television editor with a knack for pacing episodes to maximise binge watching. It renders content farms passé. It promises that every feed can be infinite, the consumer can be hooked forever, fed content by algorithms, selected by algorithm, optimised – still – for engagement.
We must expect that this trend will continue; it suffices to realise that monopolistic consolidation and enshittification[10] of platforms, algorithmic exploitation of attention to drive engagement, and A.I. content generation are all consistent with the modus operandi of the neo-feudal baron.
Their endeavour is self-defeating. Their generative algorithms will dilute the audience and reach of real, live humans while simultaneously flooding their platforms and the wider web with bot-shit[12]. Lived experience, related by human authors and portrayed by human artists, will be lost in the deluge of drivel: overwhelmed and outnumbered by several orders of magnitude.
These models are trained on the web. Future generations will be trained largely on content generated by their predecessors: coprophagia[13].
These models sample real life via the creations of real humans that are uploaded to the web, today as in the recent past, but their success threatens to obliterate that boon, rendering human-authored matter insignificant in scale. The result will resemble a kind of zero-knowledge learning.
Zero-knowledge learning is a very effective technique to train algorithms to master games[14] and to excel at tasks with well-defined, understood objectives, but the objective of authors, artists and creatives is not well-defined and hardly understood.
A stochastic parrot[11] might learn to mimic a human convincingly, despite coprophagia[13], but I speculate that what it produces will simply prove boring and uninteresting and, with each successive generation, become only more meaningless.
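To make that worry concrete, here is a toy sketch of my own devising – not a model of any real L.L.M. – in which each “generation” is nothing more than a bag of word frequencies trained on text sampled from the generation before it; the corpus, the word list and the numbers are all invented for illustration:

```python
# A toy illustration, not a claim about any specific model: each "generation" is a
# unigram bag-of-words trained on text sampled from its predecessor.
import random
from collections import Counter

random.seed(42)

# A stand-in "human" corpus: a few common words plus one hundred words used exactly once.
human_corpus = ["the"] * 500 + ["web"] * 300 + ["blog"] * 100 + [f"quirk_{i}" for i in range(100)]

corpus = human_corpus
print(f"generation 0: {len(set(corpus))} distinct words")
for generation in range(1, 6):
    model = Counter(corpus)                                  # "train": count word frequencies
    words, weights = zip(*model.items())
    corpus = random.choices(words, weights, k=len(corpus))   # "generate" the next training corpus
    print(f"generation {generation}: {len(set(corpus))} distinct words")
```

The count of distinct words can never rise: once a rare, quirky word goes unsampled, no later generation can ever utter it again. That, in miniature, is why I expect the output to grow blander with every pass.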
No number of parameters can provide them with the experience of living in the world.
Pandora’s box has been opened, but Pandora’s box also offers hope.
The World Wide Web remains free[1][2][3][4] and, despite the hubris among the barons, an independent web can survive because it is free.
It is easier and cheaper, today, to create and host one’s own web site or blog than it ever was, and many are doing precisely that. For self-publication, the web remains a most democratic and accessible platform.
I am not alone[15] in my belief that the extremely online era will end – if it ever existed beyond the fever-dreams of a loud and tech-obsessed niche in the developed West – but I also believe that an audience of real-life human readers and viewers will survive.
Who can claim that these curious few will not vastly outnumber the browsers of the early years?
To prevail against today’s odds, one must realise that the future does not lie within any enshittified[10] platform[16].
The curious few will not let algorithms either generate their “content” or curate the feeds they follow.
Their social network will be fair and free of algorithmic feeds or curation; it will be decentralised[17]. Simple technologies like RSS and basic mechanisms like web rings[18] will experience a renaissance. Authors and artists will collaborate, cite their references and link to each other’s pages, creating a hyperlinked network: one might call it a web!
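Lest RSS sound exotic, following an author takes only a few lines. A small sketch, assuming the third-party Python feedparser library and a placeholder feed address – substitute any blog’s RSS or Atom feed:

```python
# Fetch one feed and list its entries; the URL below is a placeholder, not a real endpoint.
import feedparser  # third-party: pip install feedparser

feed = feedparser.parse("https://example.com/blog/feed.xml")

print(feed.feed.get("title", "untitled feed"))
for entry in feed.entries:
    print(f"- {entry.get('title', 'untitled')}: {entry.get('link', '')}")
```

No account, no ranking, no algorithm: just a list of what the people you chose to follow have published.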
To prevail, it suffices to be the antithesis of an impassive, robot algorithm: to be obscenely human.
We should tell our tales that no machine can relate, first hand. We should write about real life in the real world. We humans have bodies and our bodies consist of organs and urges and needs and lusts. They are rudely born, thrive, malfunction or wear out, recover or heal or die – all in our diverse ways. Along with our studied opinions, technical essays and other, blander matters, we should tell these messy, biological and sometimes inappropriate tales.
In the context of the web, raw humanity is the opponent of algorithmically generated content.
In a world where a tragic regression towards conservatism proceeds apace, the promulgation, normalisation and impolite, flagrant celebration of our diversities becomes ever more important for another reason.
I find myself inspired by the proud and intimate declarations of transgender authors who proclaimed their existence on the Transgender Day of Visibility that passed, very recently[19]. April, too, is Autism acceptance month[20] in celebration of the inimitability of neuro-diverse people.
Diversity is not limited to sexuality, gender or neurology: it must encompass ethnicity and race, socio-economic situation, class and caste, too. “Normality” is merely a contrived concept.
Publishing our diverse, personal stories may be counter-cultural – perhaps disobedient – but culture needs to evolve. Some might argue that it is inappropriate to expose our humanity in proximity to mundane topics – that our minds and bodies have no place adjacent to our polite opinions on politics or essays on technical, acceptable, safe-for-work matters – but propriety, too, needs shifting.
All of us[21] who live and harbour creativity should indulge it.
This is my protest: I, in my wild imperfection and diversity and absurdity, exist and live and survive to write, here.
These paragraphs were not converged from random noise: they coalesced around the thoughts of one human being, informed by life experience, biased by primal, biological imperative as much as by the weather, my mood, my sensory environment and the micro-biome in my gut.
If you care to, read them – the onus is to think: just a fancy word for changing your mind[22].
- https://www.w3.org/People/Berners-Lee/Weaving/Overview.html
- https://home.cern/science/computing/birth-web/short-history-web
- https://www.scientificamerican.com/blog/observations/the-world-wide-web-became-free-20-years-ago-today/
- The <BLINK> and <MARQUEE> tags would likely have been capitalised, in those days.
- My very first GeoCities site documented my collection of cards for the Legend of the Five Rings™ trading-card game even while WebCrawler (’94) and AltaVista (’95) were still dominant. Later, I wrote many a blog, ranting mostly about technology and programming topics. Some of those articles even proved helpful and widely visited. All of those are lost by now – somewhat to the relief of my adult mind.
- https://www.eff.org/deeplinks/2023/04/platforms-decay-lets-put-users-first
- https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4678265
- https://thomasjbevan.substack.com/p/the-end-of-the-extremely-online-era
- https://blog.neocities.org/assets/the-new-neocities#neocities-site-surfing-web-rings
- 31 March 2024 was the Transgender Day of Visibility (TDoV).
- April is sometimes called Autism awareness month but the neuro-diverse community largely rejects that title, arguing that acceptance is more congruent with their needs.
- A few of my other inspirations are, non-exhaustively and in no particular order, the works of Cory Doctorow, Brendan Leonard, Molly White, Freya Holmér, Cat Hicks and pages like this lovely, unassuming list of web reminiscence.
- Peter Capaldi playing The Doctor (2015).