Narrative Structure And The Principle Of Least Action

I love reading and watching non-fiction because it never answers a question without raising many more. The process of learning and discovery is a never-ending quest to slay the hydra. Every severed head—every answered question—only brings with it more heads to slay. This all brings me to an insight that occurred to me the other day after rewatching a Veritasium video on the discovery of the Principle of Least Action—for like the fifteenth time.

Perhaps narrative structures, stories, also obey this same principle.

Action, Briefly Explained

For those who don't know about the Principle of Least Action, you should watch the video—it's great. But in short, Action is a concept in physics. Specifically, it's the combination of mass, distance, and velocity, the minimization of which seems to underpin the motion of all objects in the universe. Action alone seems to determine the trajectory of objects in space, that is, the path they take as they move.1 In our modern theories of physics, Action is fundamental, and Nature seems to do her best to minimize the expense of it.

A picture of a set of possible curves between two points on a 2-d plane.
A visualization of all possible paths a particle might take as it moves. Photo credit: Veritasium

Of all the possible paths that an object could take as it moves, it seems that objects in free-fall motion always move along the specific trajectory which minimizes the Action.

Narration viewed as Action

In writing there is a principle called Chekhov's gun, which states simply that:

If in the first act you have hung a pistol on the wall, then in the following one it should be fired. Otherwise don't put it there.

The point is to rid your story of extraneous elements. Detail must serve a narrative purpose; it must be justified. In a way, this is similar to the Principle of Parsimony (a.k.a. Occam's Razor) which is commonly paraphrased as, "of two competing theories, the simpler explanation of an entity is to be preferred." As a writer, your job is to craft a world and a narrative that fits disparate information together into a seamless whole. The better story is the one which requires fewer assumptions by the reader and omits extraneous information. This might be otherwise phrased as crafting a "tight" story. A tight story with a good ending is one that minimizes the information required for a reader to find the narrative and its completion engaging, one that ties up loose ends, and one that gives the reader the feeling of satisfaction and closure.

A picture of a set of books on physics together with several good stories and a hint that they are related.
Grand Unification?

Stories exist in an abstract space with many more dimensions than our familiar 3-D space would allow me to depict. However, the story is still singular: it follows a well-defined path traced by its words through this space, and my argument is that the "ideal path", the one which the author ought to prefer when writing, is the one which minimizes the Action in that space.

To be clear, I do not have a rigorous formulation for this idea (it's an idle musing, after all), but upon reflection it seems to comport with my intuition about good writing. Perhaps a thought experiment will help.

A (Totally Rigorous) Proof

Consider two points on the Plane with the X & Y dimensions quantifying some range of values for narrative structure: say, the amount of world-building in a scene vs. the progression of the story overall. Moving "up" on this chart means adding more backstory, while moving "right" is progressing the story.

A picture of two points on the 2-D plane connected by a single trajectory and a slightly different trajectory which "uses" more action
A point in the story begins at point A (at page α) and moves to point B (at page β). Photo credit: Veritasium

Now, the story will likely have many, many more dimensions than this, but stay with me.

The story itself then is a curve which connects the two points at a given page count. This curve, or the path of the story as it moves through pages, can take many forms. However the one it should take under the principle above is the one which minimizes the Narrative Action (which we have not, and will not, formally define).2

Now consider altering this curve by adding additional detail about the world. This detail is by definition unnecessary to the story because the minimal path exists from point A to point B as defined above. Therefore that detail would only increase the total Narrative Action and should be removed.3

In this way, the writer should emulate Nature in her effort to minimize all information in their story which is not required to attain the desired effect. Obviously the hard part is actually intuiting the true path that does this, but I'm not here to tell you how to be a good writer; that's too hard. Still, these ideas do seem connected, or perhaps that's just me.

1. Yes, curvature is also a thing. Also Quantum. Ignore it.

2. Out of scope for this conversation.

3. Associated with this is the idea that a story should create stresses, that is ask questions, which are relaxed or resolved by later sections. That tension is critical, but the path followed from the author's insertion of that tension (akin to the throwing of a ball which disturbs it from free-fall motion) and the resolution of it (the ball's final destination) should still follow this principle.

Experts In The Internet Age: The Power Of Email

These days I spend a fair bit of my time emailing libraries in order to get access to research materials for a project I'm working on, and I'd like to share what has become a very typical experience for me and one that I think more people should know about.

tl;dr librarians are amazing and incredibly helpful.

It all started in the way most things do these days: with a search and a random recommended result.

At the time, I was looking into several different genres of music played during the seventeenth and eighteenth centuries (the 1600s and 1700s) across Europe and the Middle East. Specifically I was looking into English/American Colonial and Romanian Folk. I came across several incredible melodies—including one that immediately stole my heart even though it wasn't at all what I was looking for—and I eventually stumbled on this tune captured by a visitor to Colonial Williamsburg:

This was precisely what I was looking for, but the song came in with no introduction and the video itself contained no further clues as to what this song even was. I wanted to know more about it (and potentially find the source of the melody and others like it) but I was starting from a pretty bare slate.

I started my research by trying to find any information on the Colonial Williamsburg site as to the kinds of music their performers play, but that went nowhere. So, I headed over to my piano and deciphered the tune by ear so that I could potentially look up the sheet music. However, no music search engine I could find could tell me what the song was. I was running low on options, so I did what I've now done several times: I emailed the Colonial Williamsburg Research department and simply asked if they could help.

A few days later I had a response:

[Our] Performing Arts dept [says] that the clip you have is a medley of songs. They're from the John Playford English Dancing Master collection of tunes, and are called, ‘Cockleshells,’ and ‘Oranges and Lemons.’ (emphasis mine)

Perfect. I had a source. Even better: that source is on the Internet Archive.

A screenshot of the songbook mentioned earlier containing the music for Cockleshells.
Photo credit: Internet Archive

Now, some readers might be asking why I am telling this story at all. Didn't I simply send an email and get a reply? Yes. I did. But that's precisely why I feel so compelled to tell it.

These days the internet is a beast with two heads: seemingly everything is available on it, yet we have trouble finding anything. Mathematicians sometimes refer to this phenomenon as the problem of finding the hay in a haystack. We're surrounded by information, yet lack any real ability to navigate it and find what we want.

Amidst this chaos, I have continued to find guidance and help in the form of the under-appreciated experts who staff our libraries and research institutes. There is only so much that independent research can accomplish and it's hard to remember that you can ask for help.

In the past few years I've sent dozens of emails like this one and have always received helpful responses. In our ever-connected world, the data may well exist on the internet, but increasingly we may find ourselves relying on the knowledge of experts in the physical realm to find it.

The internet has made so much content and information available at our fingertips, but perhaps the most powerful fragment of that content is a simple email address.

I've said it before, and I'll say it again: even with all its many flaws, I love email. When in doubt, you can simply ask.

What if: Bookclub, but AI?

A while back I had a free evening and a silly idea, and what resulted was an interesting exploration of the large-context behavior and analytical capabilities of modern LLMs (ChatGPT in my case). The idea was very simple:

What would happen if ChatGPT hosted a bookclub with itself?

Now, this is—of course—a very serious question with very serious implications for the trajectory of modern technology, so I set out on a quest to answer mankind's greatest curiosity.

A few hours later, I had my answer, so let's go over it, shall we?

The Why

There has long been a common notion in the Discourse that—since LLMs are trained on public internet data and since that data is increasingly polluted with AI spam content—the general quality of AI-generated results would degrade over time.

It's also widely known that even the best LLMs today generally return banal, shallow, and overly friendly responses, even when prompted not to (in some cases because of explicit training to do so). I was curious to see what would happen if GPT was given the opportunity to converse, long-form, over time with a disparate group of minds, and whether it would overcome its preference for lackluster observations when summarizing text and genuinely discover something novel. To do this, I knew I couldn't simply ask it questions about a given text in its training set; I would need to instruct it to conduct an elaborate play of personalities, each of whom would read and summarize the text in their own way.

Getting it Together

The code itself isn't very exciting. I cobbled together a little Python script that would take in a set of personality files for each of my several bookclub participants and then randomly choose which of them would speak next. To keep things consistent, the script would keep a log of the dialogue as it went and submit that with each request.
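The core loop can be sketched in a few lines. To be clear, this is a minimal illustration, not the actual script: the persona format and the `complete_chat()` stub are stand-ins for what would really be a call to the ChatGPT API.

```python
import random

def complete_chat(transcript, persona):
    """Placeholder for the LLM call; the real script would hit the API here."""
    return f"{persona['name']}: (a reply shaped by {persona['name']}'s backstory)"

def run_bookclub(personas, turns=3, seed=0):
    rng = random.Random(seed)
    transcript = []  # the running dialogue log, resubmitted with each request
    for _ in range(turns):
        speaker = rng.choice(personas)  # randomly choose who speaks next
        transcript.append(complete_chat(transcript, speaker))
    return transcript

personas = [{"name": "Ada"}, {"name": "Basil"}, {"name": "Clio"}]
log = run_bookclub(personas)
```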

As I played with the script, I made two changes, each of which significantly improved the conversation. First, I made it so that each participant would be primed by feeding ChatGPT the given personality and the chapter text, and then asking GPT to summarize the chapter from the perspective of the given person. That summary would then be added to the given person's mind-state. Second, I added a bias in the random choice of speaker that preferred any names mentioned in the prior response—GPT often referred to other characters by name, and it would be natural for that person to respond directly.
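The second change, the speaker bias, amounts to a weighted random draw that favors anyone named in the previous reply. The sketch below is illustrative only; the names and the 4x weight are invented values, not the script's actual parameters.

```python
import random

def pick_next_speaker(personas, prior_response, rng, mention_weight=4.0):
    # Participants named in the prior response get a higher weight.
    weights = [
        mention_weight if p["name"] in prior_response else 1.0
        for p in personas
    ]
    return rng.choices(personas, weights=weights, k=1)[0]

personas = [{"name": "Ada"}, {"name": "Basil"}, {"name": "Clio"}]
rng = random.Random(42)
prior = "Basil: I agree with Clio about the chapter's mood."
picks = [pick_next_speaker(personas, prior, rng)["name"] for _ in range(1000)]
# Ada, who wasn't mentioned, should be chosen far less often than the others.
```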

Both of these changes inspired several additional ideas for future improvements, but we'll get to that in a minute. For now, let's see what happened when the script did its thing.

Ready, Set, Bookclub!

The Cover of the Book assigned for that week's session: The Purple Cloud

The book I had originally lined up for this was Pride and Prejudice, as it's easily available in the public domain. However, that choice proved useless, as ChatGPT knew too much about the book from its training set. I needed something it had rarely encountered before so that I could test its observational prowess.

Hence I chose a book I read a few years back: The Purple Cloud, by M.P. Shiel.

All that was left was to craft a series of personalities to enact my little play, and so I set to crafting some backstories and, like a director, I set the scene.

ChatGPT's Scene Instructions & Motivation
GPT receiving character motivations before the show

I workshopped these quite a bit, but I know there's more to do. Obviously I could have tried custom assistants, but that is discussed more below.

At first, I was pleasantly surprised. Each of GPT's personas invented motivations and expanded on their backstories, and GPT never seemed confused about who was talking and easily tracked the flow of the conversation. Characters routinely invented novel conversation topics and helped to slowly invent the character of the world around them.

However it didn't take long before things went awry.

On several attempts, one of my characters would pour wine for the group (a kind enough gesture), and the group would all toast and thank each other for coming out for the evening. However, it seems GPT's training data contains no reference to narrative pacing, or to how one actually finishes a toast, and so the conversation would continue inanely on and on as each character added more to the toast, and never once did any of them decide to actually drink. 🥂

A screenshot of the output of the script.

In future attempts, I intend to add another personality to act as a sort of stage direction bot, whose only job is to add occasional scene changes that the characters would be assumed to observe. This, hopefully, would alleviate this issue and help to avoid the next one:

Once GPT established a scene, it never changed or evolved. This was surprising as over time all sorts of conversational details were being added to the chat: discussions of the book chapter, character moments, and backstory were ripe for use, but the conversations only ever got duller as they went on. Soon characters were agreeing to visit a consistently invented, newly-opened, vegan cafe with apparently fantastic croissants, but as the small talk progressed characters would endlessly revisit topics, gush pointlessly, and never add anything novel of value.

And that leads me to my most interesting observation: the conversation always died without providing any insight into, or worthwhile analysis of, the book chapter. Sure, characters would discuss its most superficial elements, but it was all incredibly mundane. Discussion was largely focused on the sense of foreboding mood found in the chapter, or on the director's stage direction for some of the characters to be put off by the often unwieldy prose. For all its might, GPT couldn't even be bothered to bring up specifics of the main character's arctic expedition, comment on any specific character's motivations or plot drama, or even try to predict the ending.

A screenshot of the output of the script.
Overall (perceived) conversational quality.

And invariably, one of my characters would ask when the eponymous Purple Cloud would appear, to which another would reply that it "sounded ominous". GPT, it seems, is about as good at analyzing novel text as a high-schooler who only read the back-of-the-book blurb (read: myself in high school).

What Does It All Mean?

In general, I was less-than-impressed with GPT's abilities in this task. As I mentioned before, it quickly became apparent that GPT could spout off vapid insights regarding the text of Pride and Prejudice even without the chapter text (I had a bug originally that prevented the text from being included in the chat log, but GPT nevertheless knew about the plot and characters). With a less-familiar text, however, it proved much less insightful than I expected.

Honestly, I'm not sure what to make of this little experiment. I'm more certain that I'd like to try it all again with several specific, technical improvements that will hopefully address some of the limitations I encountered.

Bookclub 2.0

While I haven't gotten around to implementing any of these yet, here's the list of features I came up with to improve the process:

Use Stage Direction: As I said, one of the chief limitations of this process was the fact that the scene itself never evolved. That meant that after each character had explored their backstory and current setting, there was nowhere to go, and so they babbled on incessantly about nothing. Adding another agent whose primary job is to insert novel change would hopefully disturb the equilibrium and allow for new insights.

A dump truck drives by and outside a dog barks.

Improve each Character's Mental Model: Here I'd hope to track not only the character dialogue but also improve on the priming process I mentioned before by asking (at every step) what each character thinks about the current situation, allowing their own mental state to evolve beyond what is present in the spoken dialogue. Currently it's not possible for characters to have separate internal and external states, which likely flattens their complexity. I'd also hope to include much more detail in each character's backstory and education, including other materials they found interesting or recently encountered.

Consider Different LLMs: One critical limitation of this approach is that it only uses one LLM and is therefore biased toward one set of training data and methods. Perhaps if a given character were assigned a different LLM that could help matters. There are bound to be implicit biases in each model that could affect the conversational tone (and who knows! perhaps then it would be possible to have a jerk participate in the conversation).

In addition to these two improvements, I'd hope to investigate some technical changes as well. The sheer volume of tokens being submitted (including the entire chapter text and dialogue history) means that the project quickly racked up a tab. However, in order to have richer character personas, I'd likely need to investigate training custom assistants rather than feeding it all through the chat log API.
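To see why the tab grows so fast: each request resends the chapter plus the entire dialogue history, so cumulative input tokens grow roughly quadratically with the number of turns. The figures below are invented for illustration, not actual usage numbers.

```python
def cumulative_input_tokens(chapter_tokens, tokens_per_reply, turns):
    # Every turn resubmits the chapter text plus the whole dialogue log so far.
    total = 0
    history = 0
    for _ in range(turns):
        total += chapter_tokens + history  # full context resent each request
        history += tokens_per_reply        # the log grows by one reply
    return total

short_run = cumulative_input_tokens(chapter_tokens=8000, tokens_per_reply=200, turns=10)
long_run = cumulative_input_tokens(chapter_tokens=8000, tokens_per_reply=200, turns=40)
# 4x the turns costs well over 4x the tokens, thanks to the growing history.
```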

Anyways that's it. I don't really have a through-line for this post. I just tried a thing and thought it was worth sharing. If you have suggestions or feedback, please file an issue or shoot me an email. The project is up on GitHub if you want to play with it yourself.

A Counting Meta-Post

It's been a while since we've run this little snippet, so let's see what we get!

$ find archive/ -name "*.md"|xargs -I {} cat {} | wc -w
124582

That means there are roughly 124,582 total words written on this blog. Not bad! It also means I've managed to increase the total word count on this blog by nearly 24% in two years.
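As a quick sanity check of that growth figure, a roughly 24% increase implies the archive held about 100,000 words two years ago:

```python
# Back-of-the-envelope check of the ~24% growth figure quoted above.
current_words = 124_582
growth = 0.24
previous_words = round(current_words / (1 + growth))  # roughly 100,000 words
```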

I'm not going for word count here, but it's still an interesting metric, mostly because I post so much less frequently than I used to. That realization led me to a much more interesting question about the overall length of individual blog posts over time. I posted the preliminary results over on Mastodon, where I also discussed my reasoning behind the obvious trend of post length increasing over time.

A chart of the length of blog posts over time with a trend line that clearly goes up and to the right.

In general, my posts get longer as time goes on, and you can see that even more clearly in this chart, which groups posts by the year they were published.

A boxplot of the posts on this blog grouped by year

The general trend for a given post's word count had been slightly upward for a long time, but the trend seems to take off in 2021 (with a huge dip in 2023 due to a lot of quick project posts). 2024, however, is the clear outlier.
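For the curious, the per-year grouping behind a chart like that takes only a few lines. The date-prefixed filename convention below is an assumption about the archive's layout, not something stated above.

```python
import re
from collections import defaultdict

def word_counts_by_year(posts):
    """Group per-post word counts by year; posts is an iterable of
    (filename, text) pairs with an assumed YYYY-MM-DD filename prefix."""
    counts = defaultdict(list)
    for name, text in posts:
        match = re.match(r"(\d{4})-", name)
        if not match:
            continue  # skip files without a date prefix
        counts[int(match.group(1))].append(len(text.split()))
    return counts

sample = [
    ("2021-01-02-a.md", "one two three"),
    ("2021-06-01-b.md", "four five"),
    ("2024-03-01-c.md", "six"),
]
by_year = word_counts_by_year(sample)
```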


Of course the early days of this blog were a huge outlier as well, but that's another topic. Excluding the early days, things are trending up.*

* In terms of word count. Offer not valid in all states.

Foundational Texts

Recently, I saw this post from Kottke about so called Foundational Texts, and it got me thinking.

Writer Karen Attiah recently wrote about the pleasure of perusing other people’s personal libraries and then asked her followers what their “personal foundational texts” were…those books that people read over and over again during the course of their lives…

How about you? What are your personal foundational texts?

The question is a super interesting one. What books one reads shape their worldview. All media does this but, at least to me, books seem to hold an outsized influence. A few movies or TV shows stand out to me as being my favorites, fewer still exert a shaping force on me. Books seem to possess a more potent staying power.

The thing is, I don't really re-read books (especially non-fiction). Sure, there's a couple I've revisited over the years, but it's a very small list. That said, there is a select set of books that, once I'd read them, I've never gotten out of my head.

So let's get into it (in no particular order):

  • A Wizard of Earthsea by Ursula K. Le Guin
  • The People vs. Democracy by Yascha Mounk
  • The Scientific Revolution by Steven Shapin
  • Pale Blue Dot by Carl Sagan
  • The Silmarillion by J.R.R. Tolkien
  • Zen and the Art of Motorcycle Maintenance by Robert M. Pirsig
  • Gödel's Proof by Ernest Nagel

Most of these I read in high school and college (or right after college), which is probably why they are so foundational. What's interesting is that, looking at them now, I clearly see a set of themes that stand out to me as foundational to my thinking about the world: optimism vs. caution, systems vs. the individual, agency vs. limits. Perhaps I'll expound more on these in a future post.

Usually, when I get interested in a topic, I dive in whole hog, and so my reading list tends to be very single-minded for a while until eventually it takes a sharp detour one day toward something usually unrelated. From 2017-2022 that topic was U.S. Politics and Government. When I'm not in that sort of research-mode, I tend to revert to either some classic American novel or Fantasy.

What this means is that a "foundational" text, to me, is one that was heavily influential and formative in how I think about a topic. When I think of each of the books in the list above, I can immediately recall the change it had on my thinking.

As I've mentioned before, podcasts have had a huge impact on me, especially in my late high-school and early college years. Both the Writing Excuses podcast and the lectures from Astronomy 162 at Ohio State stand out in my memory as shaping my interests.

Now, to continue Jason's call to action, what about you? What are your foundational texts?

Science, Models, And Squeaking Lead

At the dawn of the 14th century the Franciscan alchemist Paul of Taranto crouched over the strange lump of metal he'd created. He gaped, in awe of what he had done. It should have been impossible. The scholars told him he was a fool to even try, yet he'd done it. It wasn't gold that he'd created. He was still far from achieving that goal, but he'd made an important step. According to the book he would later publish under the title Summa perfectionis magisterii, Paul—writing under a deliberately confusing pseudonym—had just transmuted Lead into Tin!1

For those of us in the 21st century, it's second nature to dismiss this sort of claim as obviously ridiculous. There is no known chemical way to do what Paul intended. But that raises a very obvious question: what had he done? If we take the man at his word, he certainly seems to have done something to his bar of lead, but what? And why did he believe his experiment had succeeded?

These questions will lead us down a very fascinating path, and one that reveals the striking truth about our knowledge of the natural world.

Theory & Practice

We take so much for granted these days about the knowledge of the natural world. We consider obvious and teach to children what took generations of the brightest minds to figure out, and it can be very easy to forget that.

Today we break apart systems to understand them; this is Reductionism. And we use material analysis to understand and manipulate the properties of physical objects: that is, we melt, dissolve, chemically alter, and then recombine materials in order to create what we want. But in order to do all that—and, better, to make predictions about exactly what our methods will accomplish—we must accept the following assumption: that an object is nothing more than a physical assortment of indivisible components.2 This assumption may seem obvious to most people today, but it wasn't always that way! Indeed most serious philosophers in the past considered the idea ludicrous.3

I won't go into the history of Classical & Medieval Matter Theory, but suffice it to say that before a pre-modern version of what we today would call Atomic Theory emerged, the dominant view of "What is Stuff?" was far more qualitative than quantitative. For the programmers out there, think of their Matter Theory as a sort of Duck Typing.4

If it walks like a duck and it quacks like a duck, then it's a duck.

Matter was its qualities. Gold is yellow and ductile, and resists tarnishing. Lead tarnishes easily and is heavy. Tin is silvery, and when you bend it, it squeaks. Therefore the process of turning one metal into another is a matter of giving it the desired qualities!

Photos: mine

As at every point in history, there were differing viewpoints about what exactly matter is made of, and scholarly opinions varied greatly. However, one theory—preferred by alchemists like our old friend Paul—held that the four prime elements were bound together into a pair of higher-level substances to make the metals. What were those substances? Why, sulfur and mercury!

Yup, literal sulfur (or brimstone) and mercury (or quicksilver).

Now before we make fun of Paul too much (or rather Geber as he called himself in his writing), let's try to understand why he thought sulfur and mercury were the foundational elements underpinning all metals. Under this theory, the difference between say lead and gold was simply in the relative proportions of these primary ingredients!

Paul, like so many alchemists, was seemingly quite the avid experimentalist, and so based his theories on what he could determine by the fire. Among many other experiments, he noticed the sulfurous smell given off by impure metals during refining and assumed that such a smell was due to a volatile sulfur within the metal itself. Additionally, according to Dr. William Newman:

The fact that calcined [i.e. burned] metals often appear in the form of yellow, red, or white powder (what we could call oxides) suggests to Geber that they also contain...sulfur that remains after the volatile sulfur has been forced out by calcination.
- Atoms & Alchemy by William R. Newman, p. 33

As for the mercury, that brings us back to our earlier tale. Again Newman writes:

Geber proves [his] point by washing lead with quicksilver and then melting it, whereupon the lead gains the creak that the tin had lost—as he puts it, the lead is converted to tin.
- Atoms & Alchemy by William R. Newman, p. 33

According to his theory this worked because tin had more intrinsic mercury than lead. Therefore, to transform it, one simply needed to add mercury into the lead. Sure enough, once lead is washed in this way, it squeaks when it bends. Paul had successfully imparted the quality of squeakability into his lead, thereby transforming it (albeit partly) into tin.

It might sound silly, but consider this: here was a person who formed a hypothesis about the cause of a natural phenomenon, then tested it. It worked, so he built on it. That sounds a lot like science, doesn't it?

What Even is Science?

At its core, modern science is two things: a process of inquiry and a collection of knowledge. Together those two form a model of the natural world that we use to make predictions and offer explanations of the workings of nature. Science does not, and cannot, tell us how nature actually works under the hood; instead, it gives us the tools to develop and test our models.

This point is worth belaboring, because for me, it took a long time to really click.

I studied aerospace engineering, which is basically Newton's mechanics, material design, and a lot of stuff about how fluids flow (air is a fluid, you see). When you study physics in this way, it can be tempting to believe you've learned something about nature, but what you've learned is how to model nature. There are always error terms sticking out, gaps in the theory, losses you approximate (spherical, frictionless cows for example). The math isn't nature, it's an approximation. Crucially, it's an approximation that seems to work, or at least it does within your measure of tolerance.

It's hard to remember that we don't really know how nature works. We know how to approximate it. But that fact becomes easier to understand when you approach the study of nature, not from our current perspective, but from the outside.

What's so fascinating to me about the example of poor old Geber/Jabir/Paul above is that his theories about matter were utterly wrong, and yet they did offer testable predictions that were sometimes correct! Like many scientists before and since, Paul may very well have chalked up his experimental failures to defects in his apparatus, hidden variables like mineral origins, or even the incompleteness of his own theories, but nevertheless he worked with and improved his theories so as to arrive at testable methods and predictions, and he wasn't the only one!

It fascinates me so much because I find myself wondering: what do we believe about nature that future generations will look back on with the same bemusement that we feel about Paul of Taranto or any other proto-scientist whose theories fell short? What about nature will become so obvious that it's taught to ten-year-olds in four centuries, but that now our brightest cannot see?

1 The opening scene of this post is a fictional account, so no one take this too seriously.
2 Or chemically indivisible at least.
3 All of this pertains only to the European West and the Middle East during the Classical and Medieval periods. Basically those cultures touched by the legacy of Aristotle.
4 For the Programmers out there, think about this:
class Tin extends Lead implements Squeakable, Silvery {}

The Mysterious Potential Of The Plane

In recent years I've embarked on a sort of journey of rediscovery with respect to mathematics. As a kid I was somewhat good at math and I spent my college years doing lots of it—to the point where I grew to really dislike it. But even in those times, mathematics—while becoming increasingly difficult—always seemed like it was trying to tell me something. These days, it fascinates me.

A common observation among those who study classical engineering fields is that many (if not most) of the problem domains are modelled with very similar techniques. The motion of a cantilevered beam, a road bridge, a plane wing, a mass on a spring, a musical note, and even a planet's orbit can be described by very similar mathematics. Indeed so many physical phenomena can be described by these equations that it can make you wonder if you've stumbled into some great secret of the universe.

To the physics/engineering undergrad, everything is a mass on a spring.
Credit: MikeRun/WikiMedia Commons

But that wonderful truth is only a gateway to something much deeper, more fundamental—a true capital-S Secret of the Universe. Mathematics tends to deceive with simplicity, and there are few things in this world that hide more secrets than this simplest of concepts.

Consider the Plane

As children we're taught about the 2-D Plane.1 At first, students are confused in a manner similar to how they felt when being told x was a number now, and oftentimes that confusion grows into consternation. At worst, plotting and manipulating curves on this plane feels pointless, arbitrary, and unnatural. For those who come to know the plane as a tool, though, the process can be quite enlightening. As an undergrad engineering student, you become familiar with the plane and its rules; you learn to shape curves and solve perspective puzzles with it. Modern video animations make this so much more intuitive because they're able to show a figure on the plane change over time: you can watch it squish and stretch as you adjust the coefficients of a quadratic.

What's more, Geometry holds much deeper meaning in our world than I think most people imagine. Vast numbers of people use mathematics for everyday tasks: measuring the square footage of a room, scaling up a recipe for a family dinner, or planning their finances. But Geometry does so much more. Simple rules about finding the angles in a triangle can help measure the curvature of the universe, and simple sines and cosines (which form the backbone of many vector operations) help encode the meaning of abstract linguistic concepts in modern Large Language Models. Our universe is made of math in truly fascinating ways.

But this isn't my point. So far, this has all been backstory.

Exploring the Plane

What's truly wondrous about the 2-D Plane isn't what you can plot, but what you can discover. There are infinitely many lines, curves, and shapes one can draw on the 2-D Plane, and while most of them are useless, quite a few are remarkable. Those are the ones we plot to find the motion of a ball in the air, or the speed of a planet's orbit over time. And some curves hold incredible secrets. We know of a few, and there are likely many, many more. These magical curves transform our understanding of mathematics itself.

Consider the logarithm:

x = 1,000 = 10^3
therefore
log(x) = 3

This is a function that can be used to do all sorts of practical, useful operations, but it can also convert multiplication into addition.

10^3 × 10^5 = 10^8
log(10^3) + log(10^5) = 3 + 5 = 8

Plotted, the logarithm function makes a curve on the 2-D Plane. This means that there exists a curve on the plane which converts multiplication to addition! What other magical curves exist with incredible properties? Perhaps there's a curve that plots itself? Oh wait! There is!
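That multiplication-to-addition trick is easy to check for yourself; here's a quick sketch in Python (nothing beyond the standard library, and the variable names are just illustrative):

```python
import math

# log10 converts multiplication into addition: log(a*b) = log(a) + log(b)
a, b = 10**3, 10**5

log_of_product = math.log10(a * b)           # take the log of the product
sum_of_logs = math.log10(a) + math.log10(b)  # or sum the individual logs

print(log_of_product)  # 8.0
print(sum_of_logs)     # 8.0
```

This is the same property that made slide rules work: adding lengths on a logarithmic scale multiplies the numbers.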

At its core, a function is a thing that takes one or more numbers and returns another. This sequence of values can be plotted to form a curve on the plane. There's a curve mapping addition to multiplication (the exponential), one for the GDP of the U.S. over time, and (given some encoding trickery) a curve describing the lifespan in seconds of each and every being that will ever exist! They're all in there somewhere; we just have to find them.

This is one of the true wonders of mathematics2, and more specifically of geometry. Who would think that simple shapes and curves could tell such great truths? Yet so much hides within the expanse. In this way, the work of finding such curves is one of exploration in a treacherous wilderness. With simple tools, explorers carve paths of discovery amid the endless vastness, returning to share the knowledge they have found.

The Secrets Within

Held within the depths of this infinite expanse there are perhaps infinitely many fundamental secrets. We only find them through careful study, or occasionally by accident, and we have no idea how many we might be missing—even amidst the sectors of that expanse we've so thoroughly explored.

It's so wondrous to me that such a simple concept could hold such great truths and that shapes and curves upon the plane could so accurately model the natural world. Perhaps it should be the Plane or the Triangle that stands beside Fire as our most powerful invention.

1 For most of my life, I'd been confused when people mentioned "the complex plane" or "the number line" rather than "a plane" or "a number line", but the beauty in properly capitalizing the name "The 2-D Plane" is that, in a Platonic-Forms kind of way, it reveals the truth that we're all talking about the same plane. There's only one. We may depict some fragment of it on a board, but the true 2-D Plane exists in singular and only in Form.
2 No doubt some people out there will read a few of my points here and correctly point out that these are really points about infinity rather than about the 2-D Plane specifically. You are correct, but my point remains valid.

Nothing Throughout The Ages

I'm fascinated by the systems by which we, as humans, make and communicate knowledge. From the very beginning of our species we've done our best to devise ways to understand the natural world and our place in it, and we've used a huge number of different systems of thought to organize, piece together, and build a more complete picture of the world for ourselves.

I think a lot of people may misunderstand my point here, so let me be very clear: when I say a system of knowledge-making I mean any system, no matter how misunderstood, how imprecise, or how flawed it might seem to modern sensibilities. Religion is a knowledge system, science is a knowledge system, oral tales and legends are knowledge systems, and so are writing, music, folklore, culture, and any other way humans pass down forms of knowledge from one generation to the next.

Every system like this has its flaws and its benefits. Oral tales and folklore for example are excellent ways to pass down traditions and values, but they're terrible ways to ensure later generations understand the precise temperature at which water boils.

But in particular, I'm very interested in the places where systems of knowledge converge on similar or identical truths, even if they do so for different reasons.

An early-modern diagram of atoms. Drawing: me.

For example, an alchemist by the name of George Starkey, living in England in the 1650s, discovered a process by which he could chemically cause water to freeze via the endothermic reaction of sal ammoniac (ammonium chloride) with water, though he explained the phenomenon very differently than we would today—believing instead that he had isolated the quality of coldness and dryness by manipulating the fundamental particles of the compound. Nevertheless, his method worked (and worked well enough for his friends to try persuading him to sell "artificial ice" to Italian nobles in the summer).1

Nothing, Null & the Void

Likewise the concept of Nothing has puzzled generations of knowledge-makers. Famously, the number Zero was absent from Western mathematics until it was introduced around 1200 C.E. Adding Nothing to a number was considered nonsense. Why even do the math problem then?

In regard to physics, Aristotle argued, rather forcefully, that Nothing (what he called the void) simply cannot exist because such an existence would be paradoxical. Aristotle viewed matter as continuous, and did not accept the idea of the "atom" (that is, a smallest particle of matter) precisely because it would raise a very obvious question: what's between the atoms? It couldn't be nothing, because Nothing couldn't exist.

Another depiction of corpuscles. Drawing: me.

This school of thought lasted for over a thousand years. Eventually, though, particle theory (what was called at the time "the corpuscular theory of matter") became increasingly popular, right around the time of the Scientific Revolution. All the while, the question that Aristotle posed went unanswered.

Today we know that the vacuum does indeed exist*. (Uh oh)

Indeed the classical vacuum, one without air or other gross matter, exists in plenty. Most of space is empty; even most of what we call matter is empty. This is the truth of space and of matter: they are made of mostly nothing. However, there are several kinds of nothing, and empty space is only one of them. At the barest level, our current understanding suggests that space, even when devoid of matter or force particles, is still filled with something: namely quantum fields, and even more abstractly, potential.

And this is where the asterisk from before comes in: as far as we know, there is no patch of space that is truly devoid of these fundamental building blocks, so in a way: Aristotle was right.2

Math ado about Nothing

As I mentioned before, Western mathematics had to wait until the thirteenth century to be introduced to the concept of Zero in a formal way. Once incorporated, though, Nothing, represented by the number zero, proved to be incredibly useful.

Today the concept of Nothing underpins our entire understanding of numbers. Indeed it seems one can construct all the natural numbers out of it. In this way Mathematics scoffs at poor Aristotle, for unlike matter which is mostly nothing, the counting numbers and therefore all numbers are really nothing at all!3

Nothing, it turns out, is Everything.

Consider constructing ℕ using the Empty Set
(i.e. a set containing nothing):
  0 = {}
  1 = {{}}
  2 = {{{}}}
  ...
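For fun, that construction can even be sketched in Python. Ordinary sets can't nest (they aren't hashable), so this little hypothetical zermelo helper uses frozenset instead:

```python
def zermelo(n):
    """Encode a natural number as nested empty sets: 0 = {}, n+1 = {n}."""
    s = frozenset()           # 0 is the empty set
    for _ in range(n):
        s = frozenset([s])    # successor wraps the previous number in a set
    return s

print(zermelo(0))  # frozenset() -- literally nothing
print(zermelo(2))  # frozenset({frozenset({frozenset()})})
```

Every number the function produces is built from nothing but braces: Nothing, wrapped in sets, all the way up.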

Religious Nothing

The Abrahamic Religions (along with many other faiths) also feverishly debate the concept of Nothing. In particular, there is the idea that the Christian God created the universe "ex nihilo", from nothing, as written in the Book of Genesis. Likewise faiths from all over the world deal with the question of Nothing, though usually in the context of the question: Why is there Anything?

Creation stories, like that in Genesis, exist all over the world and come in many truly incredible forms.

All across the world, throughout time and space, humans have needed answers to the same fundamental questions, and we've found a bunch of conflicting and corroborating ways to answer them. Each of these answers has some truth to it, and some methods of knowledge-making attempt to answer questions that others simply cannot. Each system is therefore one of partial truths. Science, for example, can answer the question of how matter moves, but it cannot answer what it means to exist.

I think that's what I love about studying knowledge-systems. I find myself trying to inhabit the minds of those who lived under very different presuppositions about the world, to see the universe through their eyes, and in doing so to find some new and partial truths about the world.

Nothing Is

In particular, the concept of Nothing has confounded our reasoning for millennia, and while we seem to have a good grip on it now, it's important to remember that we really only guess at its properties (if Nothing can even have properties—Aristotle, help!).

Indeed, the concept of Nothing has evolved over the past several thousand years, and even still we debate what it really means. That's something I love about questions like this: the debate will likely never end. Questions posed before the Fall of Rome still rattle around in the minds of people today and while sometimes Physics, or Mathematics, or one's Faith can make attempts to answer these questions, we may truly never know. Such truths may be beyond us.

1 See Gehennical Fire by William Newman.
2 Just not in the way he intended.
3 Since all other types of numbers are ultimately built on the so-called Natural Numbers.

Running & Securing Servers: A Concise Guide

It might be pretty out-of-vogue to host your own servers in 2024, but I still do it and I quite like it. There are a lot of great benefits to building software on basic infrastructure, and I always find myself appreciating the flexibility that simple VPS hosting brings. That said, you do need to be careful. Life as a developer on the Internet requires a pretty decent familiarity with security best practices.

When self-hosting, there are also a lot of simple design decisions you need to make when running a system on simple VPSs, and there's really not a lot of good info out there to help with this.

So, here's a simple guide to some of the things I do when setting up my servers.

This post isn't sponsored by Linode. It's just what I know best.

1. Follow Linode/Akamai's Great Guide for Securing Your Server.

Always set up SSH key authentication and use it. That said, I usually end up leaving Password Authentication enabled, since I've had problems remoting in when a known machine (i.e. my laptop) isn't available. In these cases I use a very strong password (Apple's Keychain app will make up to 31-character random passwords) and sync it with iCloud so it's on my phone.

2. Set Up a Cloud Firewall

These days, I prefer using the cloud firewalls provided by the platform (here's a guide for setting this up on Linode). I still do set up simple on-device firewalls on the server itself, but I depend more on the cloud firewall since it's easier to configure.

3. Configure Basic Maintenance Cronjobs

These days, I usually set up a few basic cron jobs to monitor things and keep things running. Here's a list of my typical cron jobs. This list will vary a bit depending on what kind of server you're setting up.

  • Postgres nightly/weekly vacuum.
  • Disk Usage & Avg CPU Usage monitoring using Nine9s Measures
  • Nightly DB Backup & Archive (if needed, see 4)
  • Nightly Log Trimming (keep 30 days)
  • Automatic Deploys (see 5)
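As a sketch, a crontab for this kind of setup might look something like the following (every path and script name here is hypothetical):

```cron
# Nightly DB backup & archive at 3:00
0 3 * * * /opt/scripts/backup-db.sh
# Nightly log trimming at 3:30 (keep 30 days)
30 3 * * * /opt/scripts/trim-logs.sh
# Weekly Postgres vacuum, Sundays at 4:00
0 4 * * 0 psql -d myapp -c 'VACUUM ANALYZE;'
# Disk & CPU measures every five minutes
*/5 * * * * /opt/scripts/report-usage.sh
# Automatic deploy from main at 5:00
0 5 * * * /opt/scripts/deploy.sh
```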

As an aside here, I use Pushover in a lot of my shell scripts. I send a quick API call to the service, which forwards push notifications to my phone. That way I get notified of things like build failures that would otherwise fail silently.

4. Set Up Automated Backups

If your platform supports it, just use the automated backups for your servers. It's so easy and well worth a few extra bucks per month. If you're not in a position to use those (e.g. Linode won't back up encrypted partitions), then make sure you have a cron job that backs up your database. I use simple pg_dump | gzip commands in those cases.

If you do the backups yourself, be sure to test restoring from your backups as well!
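As a sketch of the dump-and-verify pattern (with an echo standing in for pg_dump so the example is self-contained; swap in `pg_dump yourdb` on a real server):

```shell
#!/bin/sh
# Nightly dump-and-verify sketch. The echo below stands in for
# `pg_dump yourdb` so the example runs anywhere.
DEST="/tmp/yourdb-$(date +%F).sql.gz"

echo "-- stand-in for: pg_dump yourdb" | gzip > "$DEST"

# gunzip -t checks the archive's integrity without extracting it.
# A backup you can't read back is no backup at all.
gunzip -t "$DEST" && echo "backup OK: $DEST"
```

A periodic full restore into a scratch database is the only real test, but an integrity check like this catches truncated or corrupt archives cheaply every night.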

5. Apps go in Containers

I use Docker, but any container system works. I've been told this setup isn't "how Docker is intended to be used", but that's fine with me. It works. I deploy my apps to my servers using a build script that usually runs automatically. It uses the production git repo for my docker compose files and runs a simple docker compose up with the configuration for production. I even have it set up to boot a backup copy of the current app while it deploys the new one so I get zero-downtime deployments.

It's probably overkill to do this, but it works well and avoids the possibility of corrupting code files on the host machine. The automatic deploy (which I frequently do nightly) is awesome because it means I never have to think about which changes have made it to production. If it's in main, it's in prod.

I also run my databases (almost always Postgres) in containers, mainly because it makes minor upgrades easier.
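As a sketch, the relevant piece of a compose file for this arrangement might look like the following (service names, images, and ports are all hypothetical). The 127.0.0.1 in the port mapping keeps the app reachable only from the host itself:

```yaml
services:
  web:
    image: myapp:latest
    restart: unless-stopped
    ports:
      - "127.0.0.1:8000:8000"   # loopback only; the host webserver proxies here
    depends_on:
      - db
  db:
    image: postgres:16
    restart: unless-stopped
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  pgdata:
```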

6. Run Webservers on the Host

Above, I told you that I use Docker. However, these days I always run the webserver (nginx in my case, though Caddy is also very nice) directly on the host. Nginx then proxy-passes to my containers using the loopback address. That way nothing is exposed (Docker likes to punch holes in firewalls). Running nginx on the host also lets certbot work as expected.
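A minimal nginx server block for that proxy-pass arrangement might look something like this (the domain and port are hypothetical, and certbot will layer the TLS config on top):

```nginx
server {
    listen 80;
    server_name example.com;

    location / {
        # Forward to the app container, which is bound to loopback only.
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```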

7. Set Up Break-Glass Measures

The single most popular post on this blog is about my (in)famous 8GB Empty File break-glass measure. I don't always do this anymore, but it has saved me a bunch of times. To me, SSH password auth is a kind of break-glass fallback too.

What Else?

That's all I can think of at the moment. It's a lot, but it's also not too bad. You don't need to be scared of hosting things yourself, but you do need to be careful. It's a wild world out there on the internet.

As a final note, if you're looking to get a handle on web architecture stuff, I recommend this blog post for a quick rundown on that. Practical Web Architecture is difficult to learn in my experience, and it takes a lot of pondering, trial, and error to do it right. Most resources are intended for enterprise-scale deployments and not smaller projects. In general, don't over-complicate things if you don't need to. One of the best parts about hosting your own apps this way is that you have the flexibility to change things around as you grow (at the cost of some work and likely some downtime).

I've been doing this for a while now, but I'm sure there's something I've missed. This guide is certainly not extensive, but it should at least help you avoid some major headaches. I can say that from experience.

The Joys Of Playing Live Music

My band, The Fourth Section, has been playing more frequent shows lately, and alongside all of that we've been recording select songs and publishing them on YouTube.

We're well on the way to another album (though the timing is very uncertain). That said, it feels good to be able to share these live versions of some of our new music. We've even put together a playlist of these edited versions, which will continue to grow as we write and perform new material.

I've been playing in a band for nearly a decade now and while my love of writing and playing music has been a crucial release valve and creative outlet for me, one of the best parts of playing in a band is playing live. It's hard work, but it's so fulfilling.

[For nearly all of human history] in order to hear a piece of music you had to be within earshot of someone playing it.
- John Green (vlogbrothers)

Live music is something special. Unlike recorded music—which really hasn't been around all that long in human history—live music has been with us since the beginning. Regardless of form, there's something primal about being in a room where music is being played. It's a communal experience where a group of people sit and appreciate a thing together. Live music is where art meets mind.

And it's not just bands. I'm always struck by just how captivating a solo artist with a guitar can be in a live performance. I love a good acoustic song, but there's something indescribable about being in the room that just can't be replicated by the best recording. A good acoustic song recording can make me feel things, but a good live performance can enthrall me; it can make time stop.

Live music is one of many reminders that we are still very human. Even in this modern and increasingly disconnected world, we still need the same things that drove early humans in times long gone to pound a drum and sing together.

There's tremendous beauty in that, I think.