About the totalizing effect of finance. I want to maybe call this article Walden Three, but I need to read Walden Two to know if that’s a meaningful title. The vibes are there.
I have theorized a little already on the profit-making mechanisms of platforms like Steam or Fortnite. The conclusion is somewhat obvious and mundane; Steam makes money by taking a cut. Recall this equation:
$$S \approx f\,v \sum_{t=t_0}^{T} c(t)$$

You may think this is incomplete, and that some costs should be subtracted from this equation. I will show shortly that this is not the case. There are costs, but here we are dealing only with the costs of serving games on Steam; staff and the like are better rolled into Valve’s game development costs. The costs of serving games are so small compared to the profit Valve makes that we can call them effectively zero. If this type of thinking is new to you, you may have encountered the small-angle approximation in school: if an angle $\theta$ is very small, then $\sin(\theta) \approx \theta$.
Over the lifetime of a product on Steam, the profit rendered to Steam is the unit-value of the product, times the number of times it is sold, times the fee that Steam charges. We can sacrifice the time-dependence to get a simple rule out of this. If the total number of copies sold is expressed as:
$$C = \sum_{t=t_0}^{T} c(t)$$

Then, over a game’s entire lifespan:
$$S \approx C f v \tag{1}$$

As a quick reality check: if my game costs $8, sells 10 copies, and Valve takes 30%, then:
$$S \approx 10 \cdot 0.3 \cdot 8 = 24$$

Valve makes $24. How much of that is eaten up by the costs of serving my game?
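Equation 1 is simple enough to check in a few lines. Here is a minimal sketch (the function name and parameters are mine, not any real storefront API) reproducing the reality check above:

```python
def steam_cut(copies_sold: int, fee: float, unit_price: float) -> float:
    """Lifetime revenue to the storefront per equation 1: S ≈ C·f·v.
    Serving costs are treated as zero, per the small-angle-style
    approximation discussed above."""
    return copies_sold * fee * unit_price

# The reality check from the text: an $8 game, 10 copies, a 30% cut.
print(round(steam_cut(copies_sold=10, fee=0.30, unit_price=8.0), 2))  # → 24.0
```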
As far as I know, data on Steam’s content delivery network costs are not public (they use mostly Akamai, some self-hosting, some other companies). Microsoft Azure’s cheapest rate for >1000 TB per month is $0.023/GB, and the service is fairly popular. Steam handles about 15 terabytes per second (https://store.steampowered.com/stats/content/), or around 1.3 million TB per day. So the cost of serving my game is quite a bit less than the $2.40 Valve takes per copy sold. Valve definitely gets super-cheap volume pricing, but even at the relatively high Azure rate my game would have to be larger than about 100 GB before Steam lost money serving it.
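The break-even point is easy to work out. This sketch assumes Azure’s published rate as a deliberately pessimistic stand-in for Valve’s real (certainly far lower) volume pricing:

```python
# Valve's cut on one $8 copy at a 30% fee.
cut_per_copy = 0.30 * 8.0  # $2.40

# Pessimistic bandwidth price: Azure's >1000 TB/month tier, $/GB.
# Valve's actual volume pricing is certainly much lower than this.
cost_per_gb = 0.023

# How large my game would have to be before serving a single copy
# eats the entire $2.40 cut.
break_even_gb = cut_per_copy / cost_per_gb
print(f"{break_even_gb:.0f} GB")  # → 104 GB
```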
Steam’s overhead is low relative to its profit because copying files is basically free. Hosting at scale is expensive in absolute terms, but Steam’s centrality in distributing games means the marginal cost of a copy rounds to zero. Hosting only hurts when you are serving large files that nobody is paying for. Most files on Steam are small, and the larger ones (AAA games) sell millions of copies. CS:GO cases alone could subsidize most of Steam’s developer fees[1].
So for Valve, almost all of this profit is reinvested as capital. The profit reaped from a digital commodity that sells will always dwarf the firm’s employee wages (although the C-suite might take a nice chunk) so practically all of the profits can go into expanding and reinforcing Steam’s dominance of PC publishing.
Additionally, Valve is not paying developers to make Steam games. They are not supplying studios with wages and equipment, just reaping the benefits of owning a market. There are no minus signs in equation 1: at scale it’s all profit for Valve.
And you might say, fairly, that in the grand scheme of things this is not so bad. Steam does drive sales for people’s games, and its existence is a net positive. But markets create structures that are not sustainable. The single-minded pursuit of profit rarely produces anything robust. Steam is optimized toward profit alone, and that comes at the expense of quality, as the industry saw with Atari in the early ’80s.
In video games and in life, we should always be asking: what does this system incentivize? Well, Steam incentivizes exactly what Steam is doing: allowing anyone to put low-effort garbage on their store. The tiny losses from hosting games with no players are worth it because almost every sale generates a net profit. The PR spin on this is that Steam is “democratizing” games or something like that, but a comparison with Itch.io quickly dispels that myth. Itch is built entirely around indie games, and it does not have the same trash problem that Steam does. There are tons of bad games, but they are mostly experiments, first tries, or games made by kids. Itch’s culture is different: many games are free, and the site tries to promote games that are good or interesting. The trash on Itch is endearing, because the site does not provide an avenue for cash grabs.
Steam is full of low-effort porn games and Unity asset flips because people throw up whatever on there hoping to make a quick buck. Steam’s discovery queue will show you whatever because the point is just to make you spend money. Don’t get me wrong, Itch certainly follows its Rational Economic Interest in many ways, but the site is proof that gearing a business entirely toward profit is a choice. Itch has a very reasonable set of quality standards that keep its tags clean and forbid a lot of moneygrubbing practices. Valve was inevitably going to allow porn on Steam, because Steam is built to exploit every available opportunity. The only reason they banned crypto is that they already have a market and a proxy currency.
Steam is so full of trash that it needs more and more complicated recommendation, tagging, and review systems to stave off the deluge of sewage and keep users on their platform. Practically every marketplace that goes all-in on profit faces this problem in some form. Amazon in particular is quickly becoming AliExpress with faster shipping. Every section is overwhelmed with cheap garbage, fakes, and scams. At time of writing, the first three Amazon results for “drill” are tools by GardenJoy, AVID POWER, and DayPlus. Fake brands slapped on cheap tools. As you scroll down, real results from Black and Decker and DeWalt are peppered between drills by WORKPRO, DEKOPRO, DETLEV, HANMATEK, Lostrain, Adedad, VILLCASE…
Buying a cordless drill is about as trustworthy as ordering !!REAL CIALIS!! from a spam email. Everything becomes a gamble, and marketplaces end up with lax return policies to compensate. Obviously, returned products are not dealt with sustainably. Back on Steam, the algorithms might help a little bit, but they radically reduce the discoverability of small games: a recommendation algorithm recommends popular games, so engagement and discoverability concentrate, to a greater or lesser extent, on games that are already popular.
Popularity begins to act as a perpetual motion machine. Sales concentrate in the hands of a few, while every game that does not get picked up sees depressed sales. This is why, in the last year or two, indie developers have started telling everybody to wishlist their games on Steam: the platform is quickly becoming subject to the same dynamics as a site like YouTube. In fact, YouTube and Twitch creators fit neatly into this new system. Their popularity is a distributive property, and if you can get some e-boy to play your game it could mean a few months of rent guaranteed overnight.
Algorithmic driving also incentivizes games to be as eye-catching as possible. In the case of porn games, provocative titles and concepts rule, so porn games regularly appear in the “popular new releases” section if you have adult content turned on.
This is true to some extent with all games. Bright, loud twists on existing genres are prevalent. They always have been, of course–there is inherently less risk with retreading a successful concept–and you need only watch one of the many E3-like events to pick up on whatever pattern is current.
The best games are punished most. The unassuming, the slow burning, and the mysterious need to betray their secrets and diminish the play-experience to get any players at all. Like slot machines themselves, the trend is toward a refined technique of presentation that draws players in with promises of big, loud, but familiar experiences. The new and unproven can go fuck itself.
Algorithmic driving has shown up everywhere because there is simply too much content. The internet is an undifferentiable microwave background of posts, videos, music, etcetera. Content. Open-air social media sites[2] like Twitter and YouTube each have cults around their respective algorithms. Elon Musk himself is a Twitter algorithm occultist. At first, people succeed seemingly at random. Fortunes materialize from thin air, and for less successful creators the algorithm itself becomes an object of contemplation. This mirrors the spiritual role of RNG for slot players–the Really New God. I have shown in the past that Random Number Generators offer little more than a veneer of neutrality or randomness. The name is a rhetorical trick. Like RNG, algorithms can be manipulated to keep creators on the treadmill.
Once creators decide to chase the algorithm, an unpredictable feedback loop is set up. Hits within the system can be unexpected, especially on YouTube where an AI black box is doing a lot of the work. Creators have a lead time on developers, and if an algorithm “hack” is found–some weird set of keywords or visuals that the algorithm really likes–it can quickly proliferate. Because platforms are marketplaces for engagement, the platform is incentivized to allow these hacks. Arbitrage-based get rich quick schemes are very popular on YouTube at time-of-writing, for example. Algorithmic hits get more eyes on content and, crucially, on the ads. A canonical example is the plague of Elsa Spider-Man children’s videos on YouTube. Their popularity was a strange, nonsensical aberration in the system.
These videos were creepy mostly because they were inexplicable (in horror movies, the monster is scariest before you see it), and people were quick to attribute some dark motivation to them or at least point out that their creation was very centralized. Markets create monopolies, so the centralization is not automatically evidence of some conspiracy. What we were really seeing was a grotesque interplay of an undesigned algorithm[3] with people ensnared by the profit motive. I am not a determinist, people could have chosen not to make videos of superhero miscarriages, but if a firm is set up to chase the algorithm it is unlikely to suddenly gain a moral dimension. Institutions seek to recreate themselves. Capital tends to multiply itself.
YouTube eventually addressed the keyword-laden Elsa Spider-Man videos, but only after a significant number of people picked up on them. Before they were taken down, they had evolved in increasingly violent directions. Needles were a central theme, and miscarriage was common (Elsa was often pregnant). The market was able to home in on several elements that boosted videos in the algorithm, and it exploited them. Videos like this are indicative of a feedback loop.
I conceive of algorithmic driving as a trajectory that creators follow, pulled to and fro by trends present and past. The precise content of videos is individuated, but there are obvious trends. Let’s Play slop offers plentiful examples. You will find elements of Five Nights at Freddy’s, Minecraft, Poppy Playtime, Baldi’s Basics, Among Us, and non-gaming trends combined and reincorporated into videos which are almost unintelligible. There are insular YouTube trends as well; a popular current one is the “day 1/day 100” thumbnail: a frame split down the middle, showing a low-power character on one side and a ridiculously overpowered character on the other.
Each algorithmically-driven element of a video is like a tiny black hole that pulls the creator toward it. Or, more intuitively, the creator is like a compass and different elements are magnets of different strengths, each pulling the needle a little bit. Each element has a weight that changes over time with public opinion or YouTube’s interference. The video you get at the end is some particular path through a pre-defined algorithm-space. The creator is pulled toward a lot of screaming, a lot of Baldi’s Basics, a little FNAF, a little Minecraft… this is what people mean when they call content derivative. Simple submission to the algorithm.
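The compass-and-magnets picture can be made concrete with a toy model. To be clear, this is entirely my own sketch with invented element names and weights, not anything YouTube actually runs: each trend element carries a weight, and a video’s “pull” is the sum of the weights of the elements it contains.

```python
# Toy model of "algorithm-space": trend elements and their current
# weights. All names and numbers are invented for illustration.
trend_weights = {
    "screaming": 0.9,
    "baldis_basics": 0.8,
    "fnaf": 0.3,
    "minecraft": 0.2,
}

def pull_score(video_elements: set[str]) -> float:
    """How strongly the algorithm 'pulls' a video upward: the sum of
    the weights of the trend elements it contains. Elements the
    algorithm has never rewarded contribute nothing."""
    return sum(trend_weights.get(e, 0.0) for e in video_elements)

derivative = {"screaming", "baldis_basics", "fnaf", "minecraft"}
original = {"quiet_exploration"}
print(round(pull_score(derivative), 2))  # → 2.2
print(pull_score(original))              # → 0.0, the new and unproven
```

In this sketch the weights would themselves drift over time as videos succeed or fail, which is the feedback loop described below.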
This is another way of saying that there is a feedback loop at play. Creators try to find the most efficient path through a space that YouTube defines, and the space is also shaped by creators (since weights are defined by the success of videos). It is obvious that some successful elements will proliferate, as their weight is too high to resist. The policy of not cursing within the first few minutes of a video is ubiquitous at this point, with many creators censoring entire videos. We have discovered a sort of philosophical equivalent to a concept in control systems, the state-space model.
The system has some input (we can call it creativity) which is then operated on to produce an output (a video). What we call “the algorithm” is in the middle, a collection of systems and intermediate variables that manipulate a creator in some way to produce an output. YouTube is somewhat open about this, although they cover it from a “user satisfaction” point of view. Feedback is missing from that paper’s Figure 2, but all popular YouTube videos are more-or-less subject to the content feedback loop. Nobody thought they could get away with making reaction videos a decade ago, now they are inescapable, and YouTube thinks you want Asmongold’s opinion about everything under the sun.
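For readers who haven’t met it, the standard linear state-space model from control theory has a fixed shape; the mapping onto YouTube below is my own loose analogy, not anything from YouTube’s own publications:

```latex
% x(t): internal state -- "the algorithm's" hidden variables
% u(t): input -- the creator's raw creativity
% y(t): output -- the published video and its reception
\begin{aligned}
\dot{x}(t) &= A\,x(t) + B\,u(t) \\
y(t)       &= C\,x(t) + D\,u(t)
\end{aligned}
```

Closing the feedback loop means feeding $y(t)$ back into the system, and here the analogy gets stranger than the textbook case: the performance of videos reshapes the weights in $A$ and $B$ themselves, which is exactly the creator-platform dynamic described above.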
The algorithm’s prominence undermines a lot of myths about success, many of which YouTube itself will peddle to smaller creators so that they will continue to produce for the platform. Success is not correlated with hard work, success is not correlated with quality, and it is not really correlated with audience engagement (if you want viewers to be intellectually engaged, anyway). Success is basically conformity to the algorithm, and we don’t know what the algorithm wants. Making videos is a slot machine: a video’s success or failure is random to us. Twitter, which has somehow become a profitable venture for some users, is the same.
Control theory is a very general thing, especially in software. Once you are able to quantify the inputs and desired output, the system in the middle can do whatever you want it to. As we have seen with the Elsa Spider-Man videos, somebody will meet the algorithm’s demand no matter how gross it is, and YouTube can basically manipulate trends however they want to produce desired outputs. Their policies on the Ukraine war are a recent example, but tech giants have always been eager to run non-consensual experiments on users, so we really have no reason to trust Google’s stewardship. We don’t even know if any trend manipulation would be disclosed.
Trends existed before the internet, and success was also largely random. But the digital age is characterized by a tightening of all slack–instead of executives influencing or hopping on trends, platform owners are able to engineer and control trends while quashing anything they don’t like. Firms want to eliminate all space and bind creativity within their limits–toy companies try to get in as early as possible, so that a child’s play is framed through their products. So that this manipulation is the only thing they know.
Platforms are not TV channels. They are more like a cable provider owned by a TV channel. Your guide shows whatever the provider wants and nothing else. It is as if only these channels exist. Platforms, and really all firms, seek totality[4]. They want to create a world consisting only of their content. The design of the Platform shapes its content and limits what can be expressed: YouTube pushes creators further toward heavily censored infantile garbage every day. The Platform, as I have shown, is a highly efficient business model in a content-based economy.
The digital world allows consumption to be totally individuated. Consumption experiences can be perfectly tailored based on the mass of data that has been extorted from users in the last decades. It is an invisible strait-jacket of amusement.
You can skip this part safely; it’s just griping about YouTube.
Science and technology videos are OK, although they are often stricken with engineer-brain that stops creators from seeing any societal concerns created by their subjects. Much worse are videos nominally about the humanities. It is an impoverished field in general, but if you wish to learn about philosophy (especially from the breadtube set) you’ll be subjected to a bunch of videos that seem like they’re more about playing dress-up than learning anything. At best you can expect some stale post-modernism or critical theory. Rarely can you detect an understanding of any writer, even more rarely can you find anyone who loves any of these writers. It feels for the most part like these are just people with degrees repeating things they were forced to read.
Videos that are nominally about philosophy (or are “intellectual” takes on social issues) exist to act as thought-terminating clichés. We know, e.g. ContraPoints is coded as a “leftist”, so we can predict her take on any issue (always a milquetoast liberal one since she hangs out with Hillary Clinton). We know the video will have a title that alludes to some hot-button issue or concept (traps, J.K. Rowling, etc.). We know that the video will be long. This is all we need. These videos are basically forum weapons, they confirm the obvious left-liberal consensus on a given topic and mark the end of discourse around it, and they can be deferred to as absolute judgement, at least for certain sets of confused teenagers and sad adults.
Discussion about art is in a sadder state still. YouTubers abound who want to explain media to their viewers, and most don’t seem to notice how condescending that is. For as long as the hateful phrase “video essay” has existed, the form has carried an overwhelming sense of pretension.
Nerd culture is very hazardous to people’s health. The need to explain media comes from a childish impulse to possess the object of love: an explanation simplifies and binds the thing. To explain also presupposes an answer, whereas art is often a collection of techniques meant to clarify an idea rather than offer a binary “answer” or even make a definite statement.
Explainers more often than not exile the work of art to its own private universe, where it cannot touch reality[5]. In other words, the explanation always stops before it can get to the important part; art only speaks because it speaks to people who live here, in the real world, so the significance of any work is lost if we discuss internal consistency and nothing else.
I tie this to nerd culture because most nerdy media has, historically, been disposable and meaningless. Most sci-fi pulp can only be cared about if you buy into its fantasy, which has little to no external correspondence. The rare piece of genre fiction which manages to endure is remembered for what was real about it. Many people know that Neuromancer invented the term “cyberspace” and inspired all visual depictions of the early internet. Fewer people know the actual plot of the novel.
Taking a work of art purely on its own also removes any center from which to orient ourselves. A major part of writing about books is comparing them and understanding influences, situating a writer within history and their tradition. This allows us to judge whether the author has done something new or better, or whether the work is just derivative. This is not to say that derivative writing is necessarily bad (enjoy whatever you want), but that it is not really worth putting serious attention into. Reviewing something as though it is insulated from the world removes our ability to see what is good about it. “Earthbound made me cry” versus “Earthbound’s unconventionally familiar setting, meta elements, and innovative rolling HP meter helped me get invested and, by the end, made me cry.” If every game had the rolling HP meter and asked you to type in your real name, Earthbound would be nowhere near as compelling, and a reviewer who cannot separate emotional responses from the techniques that create them has not done their job.
The other side of the coin is just as problematic. Reviewers often present every subject as some earth-shaking masterpiece, and they will defer to some particular quirk of their subject to do this. How Some Director Terrifies You is a current video titling trend, and the “how” of it will be long continuous shots or disorienting camera movement or something. The technique is elevated to be more impactful than it really is. Every great movie I have ever seen has been driven by shots of people talking, and stylistic flourishes are just that: flourishes. They can intensify feelings but have no value separate from the underlying humanity of the film. These flourishes do separate the good from the great: Stalker’s long takes of mud are incredible, but not if you take them as abstracted from the story of the Writer, Scientist, and Stalker.
Patterns in creator behaviour are driven by algorithms. Fixating on one small point of “genius” is easy to understand and makes for snappy titles. Detaching a work from its context means you can review anything and it is automatically worth talking about, because it is in a universe by itself.
TikTok is engagement perfected. An endless sequence of short videos For You. Entertainment offered in regularly sized, monetizable chunks. Like Twitter, ads are interspersed organically among posts from your friends, and the timeline becomes some greater object: not a collection of posts, but a stream that invites you to look for longer and longer at what has been curated For You. A machine zone.
The world tech companies are building is a natural extension and outcome of what Shoshana Zuboff calls surveillance capitalism. This form renders and sells personal data, and over time the salesmen realized that they and their algorithms could offer guaranteed or nearly-guaranteed outcomes to firms. They can arbitrarily make a specific person buy a specific thing, or at least that is the goal. This power extends to the amorphous–a good consumer is angry and alone, and by happenstance social media use enrages and isolates people.
Absolutely guaranteed outcomes are not possible. But the average can slowly shift toward them. This is not just a concern for a consuming petit-bourgeois. The issue is not that your recommendations will turn into a stale mirror. Workers are a resource to be consumed, and they must be molded into subjects of the firm: emotional states should be manipulated for productivity gains, policy violations should be reported by devices in the workspace. Workers may be shaped like your caprices shape your YouTube recommendations. And this happens almost invisibly.
Even if we assume that firms look out for their workers (which is very unlikely), we are moving toward non-consensual behaviour modification. Even if a company discloses its techniques, successful management strategies will eventually saturate all jobs, so all workers are coerced into accepting them. Zuboff offers several examples of actually existing worker-control systems in her book, The Age of Surveillance Capitalism.
From a user or creator perspective, all popular websites are slot machines. For us, they are subject to the Really New God, the algorithm. But the house is always pursuing profit, and the machine has some occluded logic. It might seem contradictory to associate guaranteed outcomes with slot machines, but if I own the casino then I am certain that these machines rake in the dough. To the users, recommendations materialize from an aether of past behaviour, enticing them to take a chance on knock-off drills or a video called the Strange True Story Behind Sploopy Doo (analog horror). Meanwhile, creators try to divine order from the algorithm and better appease it.
Be it YouTube, Twitter, Steam, or any other big site, you cannot know if or to what extent you are being psychologically manipulated at any given time. We are beyond the panopticon at this point: 24/7 surveillance of our activities is assumed, but we are now pulled to and fro by some more abstract control. People are struggling with Content-induced ADHD and addiction because that state produces the highest return on investment. This was not directly part of the plan, but rather an outcome of a control system that optimizes for engagement.
The quantum of control is the nudge, a small invisible push away from a “bad” behaviour or toward a “good” behaviour. Pokemon Go’s partnerships with businesses are a nudge toward, say, McDonald’s, because they can make a rare shiny Tympole show up on one of the shit-encrusted order kiosks.
As everything is quantified and digitized we subject everything to control. There are stand out examples: access to your fridge is revoked because the company that made it went bust, or your insurance company can disable your car if you can’t afford payments. But these exceptional examples are not the problem. The guy sitting at a slot machine who shits himself and has a heart attack is not the problem. The problem is the invisible design that draws us closer and closer to being that guy, even if we can maintain an image of decency. This image of decency itself is constantly, invisibly shifting to normalize whatever is profitable. It strains against human dignity, sure, but if you’re patient enough you can move a mountain.
Mister Beast, with his hyper-facetuned thumbnails and fixation on money, is the rawest expression of the world we’ve made. Money appears as though from thin air, bestowed upon people who do nonsensical things. Put your finger on this car for 24 hours and I’ll make you a millionaire.
Plug your Controller into Controller port 2. If you do that he won’t be able to read your mind!
Notes:
This is an assumption, but I am almost certain it is true. If it is relevant to a video, it is worth working out the numbers as accurately as I can. ↩︎
In opposition to friendship-based sites like Facebook (is Facebook still like that? I never used it.) ↩︎
YouTube in particular is notorious for using machine learning to make decisions about content. AI is largely a black box by design; people think AI is effective because training and output weights are automated, that is what allows pundits to say that machine learning is some sort of intelligence. It’s not so much that the algorithm is “undesigned,” but that it automatically optimizes toward watch time (or some other metric) without any concern for safety or decency. The profit motive is pursued without human intervention, and this lets YouTube cover their ass and paint themselves as merely naive when the algorithm vomits up videos of HULK BLOODLETTING SURPRISE EGG. ↩︎
Pay to Win has turned up many examples of this exact principle. Time Warner has swallowed so many companies at this point that it is nearly unfathomable. Sega, as I said in Pay to Win 2, was owned by an oil company for a while. They want a perfect envelope where everything you interface with belongs to them. Although reactionaries are too stupid to see it, this is the real promise of the WEF’s “you’ll own nothing and be happy.” It is not some communist slogan, it is not saying that we will own everything in common. The prediction is that some group of firms (chiefly BlackRock, it seems) will own everything and we will be collectively subjected to some insane new serfdom. ↩︎
In my experience a symptom of this behaviour is caring about the internal logic of time travel. Matthewmatosis’ review of Bioshock Infinite is an exception, of course. ↩︎