It's "Automagical"

I have been meaning to write a long-form digression (as usual) about the not-so-recent rise of tools that hide the complexity of our job in a less-than-optimal way, in the name of "democratization", but I struggle to find an entry point into the subject.

The "problem"

I am "classically" trained. I wrote compilers on paper for my exams, and my default attitude towards "NEW AND EASIER WAYS"™️©® to do things leans heavily towards suspicion. The thing is, once you know how things work under the hood, you learn to respect the complexity of, and the craft behind, making something like COUNTIF(B$24:B$110;">0") actually work properly.
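
To make that concrete, here is a minimal, hypothetical sketch (in Swift, since that's my corner of the world) of what even a stripped-down COUNTIF has to do. Every name in it is made up, and it is wildly simplified compared to a real spreadsheet engine, but it should give a taste of what hides behind one innocuous-looking formula.

```swift
import Foundation

// Toy, hypothetical COUNTIF-style function, just to hint at what a "simple"
// spreadsheet formula has to handle: parsing the criterion, coercing cell
// contents, skipping empties... and this ignores dates, text matching,
// wildcards, locale-dependent separators, and error propagation entirely.
enum Cell {
    case empty
    case number(Double)
    case text(String)
}

func countIf(_ range: [Cell], criterion: String) -> Int {
    // Only numeric comparisons like ">0" or "<=42"; real engines handle far more.
    let operators = [">=", "<=", "<>", ">", "<", "="]
    guard let op = operators.first(where: { criterion.hasPrefix($0) }),
          let threshold = Double(String(criterion.dropFirst(op.count))) else { return 0 }

    return range.filter { cell in
        // Coerce what we can to a number; real spreadsheets have pages of coercion rules.
        let value: Double?
        switch cell {
        case .empty:          value = nil
        case .number(let n):  value = n
        case .text(let s):    value = Double(s)
        }
        guard let v = value else { return false }
        switch op {
        case ">":  return v > threshold
        case "<":  return v < threshold
        case ">=": return v >= threshold
        case "<=": return v <= threshold
        case "<>": return v != threshold
        default:   return v == threshold
        }
    }.count
}

// countIf([.number(3), .empty, .text("-1"), .number(0)], criterion: ">0")  // -> 1
```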

Recently, I gave a talk about geometry applied to iOS UI development, and it seems a lot of people are learning about these tricks for the first time, even though they are millennia-old on paper, and at least half a century old in the memory of any computer. That's not to say I think people are stupid; it just makes me wonder why professionals in my field don't have that knowledge by default.
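
For a taste of what I mean by "tricks": positioning N views evenly on a circle is nothing more than sine and cosine, the same math that has been sitting on paper for millennia. A small Swift sketch, with a function name and numbers invented for the example:

```swift
import UIKit

// Hypothetical helper: compute N points evenly spread on a circle, the classic
// polar-to-cartesian conversion that does so much quiet work in UI layout.
func circularCenters(count: Int, around center: CGPoint, radius: CGFloat) -> [CGPoint] {
    guard count > 0 else { return [] }
    return (0..<count).map { index in
        // Evenly spaced angles around the full circle.
        let angle = 2 * CGFloat.pi * CGFloat(index) / CGFloat(count)
        return CGPoint(x: center.x + radius * cos(angle),
                       y: center.y + radius * sin(angle))
    }
}

// Usage sketch: spread eight buttons in a ring around the middle of a view.
// for (button, point) in zip(buttons, circularCenters(count: 8, around: view.center, radius: 120)) {
//     button.center = point
// }
```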

New shiny things have always had an appeal for the magpie part of our genetic makeup. Curiosity and "pioneering spirit" are deeply rooted in us, and we like to at least try the new fads. In other fields, that urge clashes heavily with the very conservative way we look at our daily lives ("if it ain't broke, don't fix it"), but somehow, not in ours. It's probably related to the fact that we aren't really an industry, but I don't want to beat that dead horse again; I would rather focus on the marketing of "automagical" solutions.

Who is it aimed at?

As mentioned, I probably am too much of an old fart to be the core target audience. It kind of sucks given my (aspiring) youthful nature, but what can you do?

When I come across the new-new-new-new-new way of doing something that (a) wasn't that old, and (b) isn't much easier than the previous way, I try to see past the shininess and look for a reason to revolutionize yet again something that gets "upgraded" once a week already. Bugs exist, there is almost always a way to coax more performance out of something, and adding features is a good thing, so there could be a good reason to make that thing. I usually fail to see the point, but that's because I'm not the target. So, who is?

First timers

There is a huge push to include non-programmers in our world. And it's a good thing. We want more ideas, more diversity and more brains.

Does it help to hide too much of the complexity from them, though? If they are to become programmers, having them believe falsehoods about the "simplicity" of our job hurts everyone in the long run:

  • we devalue our own expertise (in the cases where expertise actually does apply, a long debate that has no place here and now)
  • it puts a heavy burden (and reliance) on a handful of programmers to manage the edge cases, the bugs, and the performance, while also taking away their visibility
  • it further confuses the people who need our tech, who now can't reconcile everyone saying it's easy with everyone saying it's super expensive to build

Does a website cost half a day of thinking to build yourself and $1 a month to run, or does it cost 3 months of work by $1000-a-day specialists and upwards of $200 a month in AWS costs?

Professionals will respond "it depends", and then what? How does that help first timers? Especially if the outsiders fail to see the difference, or if they saw a YouTube video showing how it's "click-click-boom", which conflicts with their attempts at replicating the process?

This looks too much like a scam. Invest in this surefire way to make "TONS OF MONEY"™️©®, while needing ZERO expertise or time! When has that ever worked?

As soon as these newcomers hit their first complexity barrier, it's game over (as my students can attest).

Young professionals

So, next up the totem pole is the newly minted programmer. They are facing a very difficult challenge: putting a number on their work. Do they ask for $500 a day? Do they work on a flat-fee-per-project basis? Do they go for a 9-to-5? What salary can they ask for?

Regardless of their objective level of competency (and I don't believe there is such a thing, despite what HR people and headhunters want you to think), it's too early in their career to have a definite idea of the boundary between laziness and efficiency. Is it better to work super hard for 30 hours and then cruise along for a week, or to have bursts of productivity interspersed with slower consolidation phases? ("it depends", yeah yeah, I know). The fact is, young professionals tend to be overworked because they don't know any better.

The appeal of a Shiny New Thing that could cut their coding time in half is huge: they would effectively be making twice the money, or could at least sleep more. Again, we face the problem that most of the tasks they will be working on are, by definition, new, and bring with them new bugs, new requirements, new edge cases, and new problems in general, for which they have no solution beyond relying on the same couple of unavoidable maintainers.

This has the potential to bring progress to a grinding halt until someone else figures out a way to move forward (usually the maintainers of the thing they built their work on top of), which devalues them drastically. Are they programmers?

This, to me, feels like the number one reason why young developers move towards "manager" positions as fast as they can. Their expertise as actual coders has been devalued over and over and over again, and it's not their fault. But they fairly quickly come to think of themselves as incapable of coding anything.

Nerds and other enthusiasts

I guess this is me. "Oh cool a new toy!", which may or may not turn out to be a favorite.

Nothing much to say here: I've played with, contributed to, and used in production, cool new ideas that will become mainstream down the line (maybe?). But this is not an easy demographic to target, because we're fickle.

Old farts and other experienced programmers

Hmmmmm, this is me as well... Something that is well known about old farts is that we are reluctant to embrace "COOLER NEWER IDEAS"™️©®. Call it conservatism, call it prudence, call it whatever you want: I plead guilty. I've seen many novel ideas that were totally The Future end up in the dustbin of technology. Hell, I even spearheaded a conference about everything that was revolutionary about the Newton.

Where is my handwriting recognition now? Voice recognition was all the rage at the beginning of the noughties, and while we see decent adoption for very specific purposes, we're still very, very far from the Star Trek computers.

"Deciders"

So, all that leaves is the person nominally in charge of making decisions but often ill-equipped to call the ball. Again, this is not about any measure of intelligence. I would be hard-pressed to decide which material to build a bridge with, and I hope that doesn't make me a stupid person.

The inner monologue here seems to be very close to the one for the young professional (it would have to be, since a lot of them "moved up" to this position from a few paragraphs ago): I am being sold something that will potentially halve my costs for the same results, so what's not to like about it?

Again, the danger is over-reliance on a few key developers who actually know the inner workings of that new framework or tool, which could go belly-up by the next iteration. Short-, medium-, and long-term risk evaluation isn't natural or instinctive at all. I won't reiterate the points; you will just have to scroll back up if they slipped your mind.

What does it cost and who benefits?

(Yes, I like complicated questions that I can answer with a broad, generalizing sweep of my metaphorical arm... or die trying)

Let's say I come up with a new way of "making a complicated website", by just "providing a simple text format and a couple of buttons to click". Are you drooling yet?
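
To make the pitch concrete, here is a deliberately naive Swift sketch of what the happy path of such a tool might look like. The "format" and the function are invented for this example; a real product is another story entirely, which is exactly the point.

```swift
import Foundation

// Hypothetical toy renderer for the imaginary "simple text format":
// blank-line-separated blocks become paragraphs, lines starting with "# "
// become headings. The happy path really is this small; everything that would
// make it a real product (escaping, links, images, themes, browser quirks,
// hosting) is where the months of work go.
func renderPage(from source: String, title: String) -> String {
    let blocks = source
        .components(separatedBy: "\n\n")
        .map { $0.trimmingCharacters(in: .whitespacesAndNewlines) }
        .filter { !$0.isEmpty }

    let body = blocks.map { block -> String in
        block.hasPrefix("# ") ? "<h1>\(block.dropFirst(2))</h1>" : "<p>\(block)</p>"
    }.joined(separator: "\n")

    return "<!DOCTYPE html>\n<html><head><title>\(title)</title></head><body>\n\(body)\n</body></html>"
}

// renderPage(from: "# Hello\n\nMy shiny new site.", title: "Demo")
// produces a complete (ugly, unescaped) HTML page.
```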

Because I want to provide a first version that will attract users, I need to deal with all the complexities of HTML, CSS, JS, browser idiosyncrasies, ease of adoption, etc. It's a huge amount of work. Let's simplify by assuming I'm a genius and super cheap, and that it will probably be an OK solution after a few months of work (which is absolutely not the case; I'm just trying to prove a point, OK?). We'll round it up to 10 grand of "value" (i.e. what someone would cheaply pay a junior developer to code it).

If I now say "hey, it cost me $10,000 to make it, now pony up", the chances of adoption fall to zero (or so close to it that it's negligible), because every single other "web framework" out there is "free".

But someone had to pay (either in hard coin or in time) to make that thing, so what gives? More often than not, these days, those "automagical" solutions come about in one of two ways:

  • a labor of love by a handful of committed devs (long process, careful crafting, uncertain future)
  • an internal development paid for by a "web company" and then released (shorter turnaround, practical crafting, highly idiosyncratic)

In the second case, the release is almost incidental. Any contribution, bug report, valid criticism, etc., is a freebie. The product is in use and actively being worked on anyway, whether adoption takes off or not. It's 100% benefit from here on out, and on top of that, the company remains in control of the future evolution of the Thing.

In the first case, it's more complicated. I see two extremes, and a whole spectrum in between, but I could be wrong, having never released a widely popular web framework.

⚠️ CYNICAL AND SLIGHTLY SARCASTIC CONTENT AHEAD ⚠️

On one end we have the altruistic view, and the way it benefits the originators is in kudos and a profile upgrade. Should these people ever voice their opinions or look for new opportunities, they probably won't lack friends. It's proof that these developers know their stuff and are experts.

On the other end we have the cynical view: by creating something new that gets wide adoption, you essentially create a need for your own expertise. Who better to train people (for money), or to build your website (for money)? Who better to hire (for money) if you need an expert? You were warned: this is the cynical view.

Of course, the real answer is somewhere in between, and highly depends on the project. I am sure that you are already thinking about frameworks you used and placing them on the spectrum.

Why that bothers me

No, it's not just because I am an old fart with fossilized views about progress. While it's not my passion, I actually enjoy teaching people how to code. I kind of hope that no one I've taught thinks of me as a conservative know-it-all incapable of embracing new things.

But teaching gives me a window onto how people are trained to (eventually) replace me. And while I like the students just fine as human beings, it's hard not to cringe at the general lack of interest in fundamental knowledge (maths as applied to computing, algorithmics, etc.). To be fair, it's not their fault. They have been told repeatedly that they don't need to remember the hard parts of our racket. Someone, somewhere, has already done it. All you need to do is find it, maybe adapt it, and use it.

This is true, to some extent, in the context of a school. After all, how can we grade their work if we don't already know the answers? But once they start working... they end up in a position where their employers or customers expect them to grapple with problems for which there is evidently no off-the-shelf solution that works well enough. Since they weren't trained (much) in the low-level stuff, they are more or less condemned to an assembly job, gluing together efficient pieces written by other people in the most efficient way they know of. I don't doubt it's fun for a while, but it gets very little recognition from their peers, it's impossible to explain to untrained people why certain things are impossible or don't perform well enough, and there is a lot of competition.

Speaking of competition, it also levels the playing field for the worse: if your job is perceived to be nothing but assembly work, and the only difference between a junior and an experienced developer is the time it takes to glue things together, where is the value in keeping the more experienced (and more expensive) one? This creates a huge imbalance between the person who manages a product (and the purse) and the one making the thing.

Trend lines

This was long, and jumped all over the place, and it reflects the state of my thoughts on the matter.

  • yes, everyone should be computer literate
  • yes, we need more people and more diversity in the specialist field of writing code
  • yes, we need an expertise scale because no one can be good at everything
  • and also, yes, we need to stress that coding is hard and has value

Despite the exponential growth of the need for computer "people", it seems to me that the number of students going through more rigorous training isn't going up. That means the reliance on competent low-level programmers is going to increase faster and faster, and, as the gap widens, the possibility of transitioning from low(ish)-value/high-level programming to a better-paid and better-recognized "expert" status will dwindle to almost zero.

"When everyone is a super, no one will be"
- Syndrome, The Incredibles

(Except the handful of super powerful people who actually know how things work under the hood and can name their price)