A rant on content creation and walled gardens 4/16/2024

In high school we were required to get a graphing calculator for various math classes. My parents got me the TI-85, which, while based on the same Z80 chip as the popular TI-82, was the non-"stripped down" version. It had a fairly capable BASIC-based development setup. It was my first truly "personal" computer.

I wrote short stories on it, saved as strings in the persistent memory. I used the "point" mode to draw simple bitmap art. I wrote software and learned the basics (hohoho) of programming, everything from simple raster graphics apps (rotating a triangle, ohhh boy) to menu-driven "RPGs". I traded apps with my fellow nerds via the integrated transfer port. Later I ran DOOM on it, as the power of Z80 assembly was unlocked by spirited developers via ZShell.

So why, then, was all of this possible on a low-powered chip that was already outdated by the time it was chosen for inclusion in a graphing calculator? Despite its underpowered specs it provided a complete creation suite--even if some creativity was required to use it. If the TI platform proved anything, it was that a platform is incomplete without true creation tools.

Today's computers are incomplete devices. Today's smartphones are a terrible joke: the only content creation possible is snapping (admittedly now very good) photos and videos, and painfully typing out text in such a terribly inaccurate way that the device literally runs a transformer-based machine learning model to correct and suggest text in a vain attempt to overcome the folly of typing on glass. Audio, of course, is in slightly better shape, with a plethora of stripped-down trackers and other creation tools allowing one to at least pick out some beats.

On the desktop, however, Windows and macOS ship without a development environment or even obviously accessible compilers. Coding is an opt-in affair. There is no invitation to explore further. Even the fanciest Mac laptop today is inherently a consumption device until modified (to the great chagrin of Apple) into a suitable dev environment. Shockingly, some Linux distros don't ship compilers by default either, although at least there the barrier to entry and installation is typically much lower than on the commercial OSes.
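To illustrate just how opt-in it is, here is roughly what a curious user has to do today before they can compile a single line of C. These are the common incantations, not an exhaustive guide; exact package names vary by distro and OS version:

```shell
# macOS: the compiler toolchain is an optional download, not part of the OS
xcode-select --install

# Debian/Ubuntu: gcc, make, and friends live in an opt-in meta-package
sudo apt install build-essential

# Fedora: same story, a separate group install
sudo dnf groupinstall "Development Tools"

# Windows: no compiler at all out of the box; install Visual Studio
# Build Tools or MSYS2/MinGW before "hello world" is even possible
```

None of these steps are surfaced by the OS itself. You have to already know the tool exists before you can go looking for it, which is precisely the inversion of the TI-85's always-present PRGM menu.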

It's no surprise, then, that the so-called "digital native" is generally lacking in what the previous generation considered "tech savviness". Is this a result of tech becoming more streamlined, the rough edges sanded off by the time and effort of countless developers? Or is it that despite the majority of us carrying a relative (to the TI-85, at least) supercomputer in our pockets--with multiple gigs of RAM, fast solid-state storage, and a multicore processor capable of running triple-A games--we aren't provided by default with the affordances needed to truly create content: interactive experiences and doodles, solving our own real-life problems or desires with code, pixels, or bleeps and bloops?

The root of this problem may actually live in a slightly surprising area: analytics. Most modern applications, even some open-source ones, collect usage data about which features users are engaging with and in what ways they are engaging. Privacy arguments aside, this telemetry CAN be used to optimize a user's experience, allowing developers to straighten out complicated flows based on instrumented signal from struggling users. Unfortunately this data can also be used--especially in commercial applications--as a sort of culling metric: the features with the weakest signal are either outright removed or deprioritized for further investment.

How does this apply to interactivity and content creation? It takes a certain spark to open that blank, blinking document and decide to create something out of nothing. For software development, that creation is further inhibited by the need to communicate accurately with the computer in a series of hard-to-learn commands. Judging purely by the metrics, it's not worth including something that only 25% of the population even bothers to launch in the first place, and that maybe only 5% plays around with long enough to learn and become enamored with software.

But this misses the point. The act of bundling such a creation tool as a core component--a core experience, even--of the operating system allows curiosity to take root in all its various forms. Given an available tool, a curious user may reach for it to solve their own problem, struggling through compiler errors and the like until the problem is solved. Other users may seek to emulate a friend or coworker who is fluent in this type of content creation, assuaging their own curiosity in the blinking of the editor. Without today's devices affording even the opportunity to peer under the covers, the curiosity itself is of course lost: spent in other areas, noble pursuits perhaps, even sometimes creative ones, but also ones that don't lead to a fluent digital culture.

There are a few glimmers of hope here and there, but while they exist they are not bundled by default, not accessible by default, and in many cases not free. The real harm of the walled garden, monopolistic considerations aside, is that it prevents such an environment from even being offered in the first place. Restrictions on what you CAN AND CANNOT do with the device you are renting from your hardware rental company--often well-intentioned attempts to make the device more secure in the face of today's threat landscape--leave users in the lurch, unaware that they are even missing anything. Unaware of the value and promise of truly personal computing: creating and using programs, art, and music on the same device you consume them on. A truly balanced ecosystem, and one that betters all of humanity rather than enslaving it to a constant stream of endorphin-producing content.