Demodynamics

It should be clear by now: I am a geek. Aside from all the normal quirks, I’m a computer geek, which means that I dream about systems and I subconsciously try to optimize things, to make them more rational if not more efficient… I’m told it’s borderline rude, sometimes.

Anyway.

There is one thing that geeks and non-geeks who actually encounter large crowds of people all at once agree on: we suck at demodynamics.

Look at a school of fish or a flight of sparrows. Even though they have no brain to speak of compared to ours, you don’t see them bumping into each other, even though their speed and group density are a recipe for disaster. Now imagine telling a bunch of people: “run around for a half hour, but you have to stay together as a group”. When you’re done laughing, you’ll know what I mean.

Why am I rambling about demodynamics anyway?

Well, professionally, you can draw a lot of parallels between the two following situations:

  • a group of people is supposed to run together towards a common goal without knowing the route, running into difficulties along the way
  • a group of people is supposed to deliver a product that has been outlined in a somewhat vague (from an engineer’s point of view) fashion

And you see the same kind of dynamics: people shoving, people showing off, but also people helping each other when facing a wall etc…

Yesterday, I was in the subway (but you can have similar occurrences when driving), and a couple of ladies rushed past me in a corridor, only to slow to half my speed right in front of me, effectively blocking me, because they were walking side by side.

Now, the worst part is I don’t think they even realized. They were side by side because they were chatting, and going slower for the same reason. Whoever is placed in that situation will undoubtedly sigh heavily, at the very least. But the same can be said for people who honk at you when you can’t pass the truck in front of you, etc…

As I said, people suck at demodynamics. Evaluating the right time to yield a priority you do have, in order to keep traffic flowing for everyone, including you, is a hard thing to do, since you basically can’t trust anyone around you to act with the same plan, let alone the same intent.

When you think about it, it’s all about two things: telegraphing your intent (and your plan), and being on the lookout for other people telegraphing theirs. That’s level zero. Then you have to know when to enforce and when to yield, and telegraph that as well.

Most people think the problem lies in the second layer. We are a competitive species, and we naturally expect our solution to be followed. But my impression is that we completely lack an understanding of level zero. It’s not that our plan is the best one… It’s that it’s the only one.

Talking about this with my friends in the business and outside of it, we kind of agreed that people who enjoy activities where you have to relinquish some control to have a better time are the ones who look around for cues and avoid bumping into other people (in the general sense): people who dance a lot, musicians, construction workers, military or military-inspired people,…

In any project I take part in, it is painfully obvious that if someone I depend on fails, I’m screwed. If nothing else, that makes it my duty to help this person. To some degree, the same can be said about people “above” me. I have to point out potential problems early and help them make a decision.

Unfortunately, as with the people in the subway or on the road, it doesn’t seem to be that obvious. Here in France, we go back and forth on a mandatory class taught to all kids that’s called “civic instruction”, or whatever that thing is called these days. Is there any way we could make that a demodynamics course, or a dance class?

TBC

  

Rule Of Thumb

Principles, rules and laws are often treated as essentially the same thing. I won’t bother you with a paraphrase of my recent reading list, which includes Plato, Kepler and Douglas Adams, but for a freelancer, it’s important to differentiate what is what, especially for the Other Guy.

A principle is a lighthouse on the horizon, and it’s OK to veer left and right, or even ignore it altogether. That’s one end of the spectrum. At the other end of the spectrum, you have the Law, which, to paraphrase Morpheus, will break you if you try to break it (and get caught, obviously).

There are varying degrees of rules in between, from the rule of thumb to the house rule. Which apparently is akin to law. Or so I’m told.

Moving on…

Developing a program is kind of a ninja split between the two: some rules are unbreakable, because of maths, and contracts, and stuff, and some people try to impose on us rules that can (and sometimes should) be gladly ignored. Just look at the interface designs that blatantly ignore some rule that someone somewhere decreed, and still look just plain awesome. Right?

I took a roundabout way to make that point, but programmers tend to consider rules with a clear downshift on the “have to” slider.

But, just as computers are very attached to their governing rules, humans go a long way to actually enforce theirs. Case in point: you’re asked to make a mockup app that will illustrate some concept or other. Sometimes that’s about as much work as making a working prototype, so we bend the Prime Beancounter Directive: we go beyond what’s asked. But it’s not what was covered in the Contract. So we don’t get paid. Or at least it’s very hard.

So our reading of this particular rule was apparently wrong.

The problem is twofold: there’s the question of how rigidly the rule is worded, and the question of whether the Other Person tends to respect the spirit of the rule rather than the letter of it.

For the second part, it’s a lot easier to hide behind wording and you-have-tos than to imagine what the intent of the rule is. That’s how we get “warning: hot” on coffee cups (wait, what? I specifically ordered a lukewarm boiled cup of coffee, not that seemingly delicious cup of joe!), or “do not dry your pet in it” on microwaves (I won’t even bother). As weird as it sounds, stupidity is foolproof. Adhering completely to blatantly stupid explicit rules is what makes the world tick smoothly, it seems. For more on that, see Miss Susan vs Jason, in Thief of Time.

You soon learned that ‘No one is to open the door of the Stationery Cupboard’ was a prohibition that a seven year-old simply would not understand. You had to think, and rephrase it in more immediate terms, like, ‘No one, Jason, no matter what, no, not even if they thought they heard someone shouting for help, no one – are you paying attention, Jason? – is to open the door of the Stationery Cupboard, or accidentally fall on the door handle so that it opens, or threaten to steal Richenda’s teddy bear unless she opens the door of the Stationery Cupboard, or be standing nearby when a mysterious wind comes out of nowhere and blows the door open all by itself, honestly, it really did, or in any way open, cause to open, ask anyone else to open, jump up and down on the loose floorboard to open or in any other way seek to obtain entry to the Stationery Cupboard, Jason!’

Loophole. The Dreaded Word for the Rulemakers. The Open Sesame for the Rulebreakers.

But the power of a loophole relies solely on the fact that the rule is rigid to the point of absurdity. Of course, there should be an unbreakable rule stating that no one is allowed to come to my home and take my hard-won computer for themselves. Of course there should be one that lets us tell a power-hungry person that they are overstepping said power.

I guess the whole point is finding out where a rule protects a group of people from others, and also from themselves. But when breaking a rule in order to make something better for everyone is treated as a problem, that’s the epitome of everything that’s wrong with our reasoning abilities.

And yet… I hear some of you think along the lines of “yeah, but if some rules should be put aside, how is that not an argument that there should be no rules at all, at least for some people?”. Strictly respecting all the rules makes it easier to have others respect all the rules as well, right?

Wrong.

Again, I think it’s a matter of harm. If by breaking a rule you harm no one (including yourself) in any way (except maybe their ego, but that has nothing to do with anything), then the rule is stupid. And should be ignored. Like, say, going beyond expectations. Actually, breaking a stupid rule should be grounds for an award, a compensation, something stating “wow, that rule was stupid. This awesome programmer deserves a raise. And he’s so cute too… <fawns>“.

Ahem. Anyways…

So then, I hear you think from here on my spaceship, how do you know you’re doing no harm? To anyone?

Dude, the daily personal and professional interactions we have are rarely a matter of life and death for entire nations. Business laws are supposed to protect me from getting screwed over by customers with no scruples. Not to prevent me from doing my job better than I’m supposed to. Fact is, most of the time, to enforce a “common sense” rule (getting paid for a job), I have to go through stupid rules first. And since the Other Guy is usually better equipped than I am to handle these first stupid hurdles, they win the race. So it spirals down: stupidity being the most efficient way, it becomes the norm. And we have to enact new rules to kind of balance out the most stupid of our actions, or to close the loophole. Oh wait, another set of stupid rules to follow!

Stupidity is recursive. Thinking is hard.

The end doesn’t justify the means. Life shouldn’t be a permanent chess game either.

  

We Are What You Call Experts

OK, so France now has an expert board on digital something-or-other. Most companies I work with have hired, or will hire, an expert to recommend stuff or audit stuff. And of course, even I get hired to provide expertise on stuff every now and again (go figure)…

As I stated before here and there, there’s something troubling about experts in what is perceived as my field. Experts in demolition, or piloting, or botany, I get. These are highly specialized fields where it’s easy to spot an expert: they clearly know what they are talking about. You test them by asking them to do what they are experts at.

But in computer science, the field is so vast that it’s quite easy to pretend to be (or be mistaken for) an expert in one of the gazillions of subfields this domain has. Even family members sometimes don’t get the fact that a coding or design expert isn’t the person best suited to repairing the printer…

Sometimes I feel like we are the doctors of the 1600s. We use jargon, we give off the vibe of a tightly knit, hermetic community, and we wield an inordinate amount of power in regard to what we actually do, or know. Would you go to a vet to reattach your cut finger? Or say “hey come on, you have a medical degree, you can give me meds for this heart condition I think I have!” to a cousin who’s studying to be a chiropractor? (No offense to either of these fine specializations, it’s just to illustrate a point; I wouldn’t ask a heart surgeon to remove a splinter either.)

We live in a world of experts. Because of the high specialization of everything, you have to be certified, it’s harder and harder to switch fields, and amateur sports are losing spectators. But as soon as we are talking about computers, the expert status is somewhat murky. How many times do we freelancers have to “compete” with the second cousin of the daughter’s hairdresser, who’s “making websites”? Dude, I’m an app developer; I have a score of people I trust who can build an awesome website for you, so why would I know anything about web technologies? I rely on… experts… for that… Can I dabble in it and commit an atrocity that would pass, in poor light, for something acceptable? Sure! Should I get paid for that? Hell no, there are people way better suited for that job. Could I? Probably.

Enough ranting: how can anyone rate somebody as an expert? In computer science, diplomas are not a sure way. Look at Mike, who’s clearly an expert, yet came from journalism. Portfolios are a good indicator, but only an expert can gauge the difficulty of the thing. Publications are yet another indicator (thanks for reading this, by the way!), but with the Internet, the number of plagiarism cases is going through the roof.

“You are pretty bleak” I hear you think… But in all seriousness, I wouldn’t even know how to prove to you that I’m an expert. And by proving, I mean convincing you I know what I’m doing, without having to work for you for free to build something in an Internet-shielded room for a week. When all’s said and done, it’s just a matter of marketing myself. It helps that most people see computer science as some kind of magic, and are therefore highly susceptible to buzzwords and “hey I make loads of money in my work, that must mean I’m good, right?”. Wrong. Buzzwords are easy to acquire (read Plato’s Gorgias if you don’t believe me), and the money argument is a tautology and a self-fulfilling prophecy.

So to all of you awesome developers and designers and computer geeks out there who are really and immensely competent, yet don’t get the respect and credibility you deserve, kudos! And I’m sorry. I fear it will be a long time before there’s an objective way, accepted by most people, to finally show how good you all are. It took medicine a couple of millennia to go from “having a diploma” to “having a somewhat clearer way of discerning experts from fakers”. Hopefully it won’t be that long this time around, but I can make no promises. I’m not an expert in such matters…

  

Wall? What Wall?

The excellent Mike Lee (@bmf on twitter) has a hilarious way of handling theoretical problems: he ignores them to solve them.

In a case of life imitating art imitating life, programmer Mike Lee announced that he had written a solution to the halting problem, with the simple explanation that, lacking a formal education in computer science, he didn’t realize it was considered unsolvable.

To solve the halting problem is to write a function that takes any function—including itself—and determines whether it will run forever or eventually stop.

The classical approach is to wait and see whether the function halts, but the requirement that it accept itself as input means it can end up waiting for itself.

This paradox is what makes the problem unsolvable, but Lee’s function avoids the paradox by using a different approach entirely.

“It simply returns true,” Lee explained. “Of course it halts. Everything halts. No computer is a perpetual motion machine.”
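
For readers without a CS background, here’s a minimal sketch of both sides of the joke (illustrative only, not Mike’s actual code): the classic proof builds a program that does the opposite of whatever the oracle predicts about it, while the “everything halts” oracle simply declares victory.

```kotlin
// The "oracle" from the quote above: of course it halts, everything halts.
// (Illustrative sketch, not Mike Lee's actual code.)
fun halts(program: () -> Unit): Boolean = true

// The classic counterexample that makes a correct oracle impossible:
// a program that does the opposite of whatever the oracle predicts about it.
fun paradox() {
    if (halts(::paradox)) {
        while (true) { /* loop forever, contradicting the "it halts" verdict */ }
    }
    // had the oracle answered "runs forever", we would simply halt here instead
}

fun main() {
    println(halts { println("hello") })  // true
    println(halts(::paradox))            // also true, which paradox() would promptly contradict
}
```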

That being said, the scientists vs engineers problem is an old one. Computer science started out as a branch of mathematics, and was treated as such for the longest time. When I was in college, we didn’t have any exam on an actual machine. It was all pen and paper!

Part of the problem with any major “it can’t be done” roadblock is the sacrosanct “it’s never been done before”, or “such and such people have said it can’t be done”. The truth, though, is that technology and knowledge make giant leaps forward these days, mostly because of people like Mike who just want to get things done.

Just remember that a few decades ago, multi-threading was science fiction. Nowadays, any programmer worth their salt can have a built-in “hang detector” to monitor whether a part of their program is stuck in an infinite loop, or has exited abnormally. Hell, it’s hard to even buy a single-core machine!
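
Here’s a minimal sketch of what such a hang detector can look like (names and thresholds are illustrative, not from any particular framework): the worker reports in regularly, and a background thread raises the alarm when the reports stop.

```kotlin
import java.util.concurrent.atomic.AtomicLong

// Minimal watchdog sketch: the worker "pets" the watchdog regularly; a daemon
// thread flags a hang if too much time passes between pets.
class Watchdog(private val timeoutMillis: Long) {
    private val lastPet = AtomicLong(System.currentTimeMillis())

    fun pet() = lastPet.set(System.currentTimeMillis())   // call this from the worker loop

    fun start(onHang: () -> Unit) {
        val monitor = Thread {
            while (true) {
                Thread.sleep(timeoutMillis / 2)
                if (System.currentTimeMillis() - lastPet.get() > timeoutMillis) {
                    onHang()  // e.g. dump a stack trace, restart the worker…
                    break
                }
            }
        }
        monitor.isDaemon = true
        monitor.start()
    }
}
```

The worker calls pet() on every iteration; if it gets stuck in a loop or dies, the pets stop and the callback fires.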

I distinctly remember sitting in a theoretical computer science class, listening to a lesson on Gödel numbering. To oversimplify what I was hearing, the theorem was about how any program could be represented by a single number, however long. And about five minutes in, I was saying in my head “duh, it’s called compiling the program”. Had I said that out loud, though, I’d probably have gotten into a lot of trouble.
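
In the same oversimplified spirit, here’s a toy illustration (not Gödel’s actual prime-power construction): a program’s source is just bytes, and any sequence of bytes can be read as one single, admittedly enormous, integer.

```kotlin
import java.math.BigInteger

// Toy "Gödel number": read the program's source bytes as one big base-256 integer.
// (Not Gödel's actual prime-power encoding, just the spirit of the remark.)
fun toNumber(source: String): BigInteger =
    BigInteger(1, source.toByteArray(Charsets.UTF_8))

fun fromNumber(n: BigInteger): String =
    String(n.toByteArray(), Charsets.UTF_8)

fun main() {
    val program = "fun main() = println(\"hello\")"
    val n = toNumber(program)
    println(n)              // one (huge) number standing in for the program
    println(fromNumber(n))  // …and back to the source text
}
```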

Don’t get me wrong though, I think that mathematical analysis of computer programs is important and worthwhile. I’d like to be able to talk about optimization with a whole lot more people (you just don’t use an O(n³) sorting algorithm, please…). But whereas I see it as a valuable tool to prove something positive, I stop listening whenever something is deemed impossible.

Trust Mike (and to a lesser extent me) on this: if something is impossible, it’s probably because the right tools haven’t been used yet. Maybe they don’t exist. And I’m ready to acknowledge that there is a probability they won’t exist any time soon. But “never” is a long time for anything to (not) happen.

UPDATE: it seems that people link this with the skeuomorphism rant from before. True, it does ring familiar: we do things the way we’ve always done them, because we can’t do otherwise. Right?

  

I’m a Corsair, Now

For the past few months, I’ve watched with growing fascination as people try (and sometimes succeed) to hack into my various servers.

Now, while I won’t admit to any foul play, it reminds me of things we used to do with a group of friends who are today much less disreputable than I am.

So, at the risk of earning once more a reputation as a dinosaur, I’ll go over some of the stuff that gave us (and probably gives the people behind these somewhat unskilled attacks) a rush and some thrills.

I don’t remember anything before 2000. My memory from before is made up of things I was told. But given the fact that some of these people are my friends to this day, people I trust completely, I’ll go ahead and assume those stories are true. They blend so well with what I do remember that I’m inclined to believe it all anyway.

Ye Olden Days

Back in the days of dial-up modems, being connected was an investment. Relatively speaking, it cost a lot of money, especially since we were paying by the minute, here in France. We were connected because we wanted to be connected. None of that “hey, I’ve got 10 minutes to kill, I’ll watch a YouTube video” kind of thing. We didn’t even have the bandwidth to do that… 3KB/s meant that downloading a single mp3 file would take a half hour. And it would be a half hour during which we could do nothing else.
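
For what it’s worth, the math checks out if you assume a typical MP3 of around 5 MB (an assumed figure):

```kotlin
// Back-of-the-envelope check: a ~5 MB file at 3 KB/s (the file size is assumed).
fun main() {
    val fileBytes = 5.0 * 1024 * 1024
    val bytesPerSecond = 3.0 * 1024
    val minutes = fileBytes / bytesPerSecond / 60
    println("Roughly %.0f minutes".format(minutes))  // prints roughly 28 minutes
}
```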

I won’t say it forced us to do meaningful things, because it didn’t. But with so many things to cram in such a short time of activity, writing an email, a post on a newsgroup or a BBS, or even chatting on IRC took a certain degree of planning and will.

It’s hard to imagine today that checking my email (downloading AND sending) would take 5 to 10 minutes to complete. Then I would have to disconnect if I wasn’t doing anything on the ‘net, to reconnect when I felt like sending my responses. Boggles the mind doesn’t it?

It also means that every minute or hour we could steal to spend chatting with our friends on IRC had to serve some purpose, whether getting news from people we liked, planning something together, etc… What I can certainly remember is when most of us got permanent broadband access. Suddenly, the activity was a lot less focused.

Anyway, back to the thrills and the adrenaline rush: we were doing inconsequential things such as taking over channels and probing the defenses of other computers, sometimes even getting access. Some of that original group even went on to make that their actual job. I never did, although I kept a toe in that pool: trying to figure out a way to hack into other people’s machines and servers, as well as “social hacking” organizations to get privileged information or just for fun, is a useful skill to have, if only to have a basic understanding of how to protect yourself and the people around you from it.

We would spend hours discussing the best methods, and talking about the latest exploits that would allow someone to get in and do… whatever. I don’t think we did any harm, nor do I remember any, but we could have, I guess. And that’s the clincher when you’re a young computer scientist in a world where most people just don’t get it. It gives you tremendous power, power that can look, from the outside, quite magical. The temptation to use the skills, the minutes or hours you spend preparing, and finally the victory over a security system: it’s like every other sport. It makes you feel immensely good.

Fast Forwarding Ten Years

Nowadays, I’m amused when I see how effective phishing campaigns seem to be. They are a very crude attempt at corrupting the weakest link in the security chain: the human brain. I mean, come on! Who in their right mind would think that the bank would send you an email telling you your account has been hacked/overfunded/…? Banks usually understand security, and will use paper or the phone, not hilariously insecure emails… Millions of people apparently fall for it.

To circle back to my current string of attacks, what appalls me is that it’s a brute force attack. The attacker(s) are trying as many passwords as they can to gain access. Someone has read too much Dan Brown; the Bergofsky Principle doesn’t exist… While it is indeed possible, on some incredibly weak protection schemes, to guess the password for an account by repeatedly trying until you’ve exhausted all the possibilities, it’s hardly the most effective method, as the quick numbers after the list show:

  • it’s slow
  • it leaves a huge amount of traces
  • it only works if you already know the login for sure
  • it only works if you have an unlimited number of tries
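
To give a rough sense of scale, here’s a back-of-the-envelope sketch (all figures are illustrative assumptions, not numbers pulled from actual logs):

```kotlin
// Rough estimate of an online brute force attack (all figures are illustrative).
fun main() {
    val alphabet = 62.0                 // a-z, A-Z, 0-9
    val length = 8                      // a modest 8-character password
    val guessesPerSecond = 10.0         // generous for a remote login form or SSH
    val combinations = Math.pow(alphabet, length.toDouble())
    val years = combinations / guessesPerSecond / 3600 / 24 / 365
    println("%.2e combinations, about %.0f years at %.0f guesses/s"
        .format(combinations, years, guessesPerSecond))
}
```
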
It’s All About The Brain

Man, some days I pity the clever minds who have always had access to this much raw processing power and bandwidth. The most effective way to do something to an average person or system is supposed to be repeatedly banging against the door? If a piece of software doesn’t run fast enough, “tough luck”, just wait for the next generation of hardware?

I was sitting at a table the other day with some young programmers who looked at me as if I were foaming at the mouth, because I had asked whether they were using encrypted connections to check out their sources. In a fricking public space. And for some reason, I’m a rabid, ranting person because there’s no way I’m going to store personal and confidential data on a server I don’t trust (I’m looking at you, Google and Dropbox).

Am I paranoid? Is this the rambling of an internet-time old geezer? Maybe. But remember: trust is something that should be earned. What if your account got hacked? How would you know? At least, with the (admittedly ever-shrinking) amount of control I have over my servers and services, I can tell if I have been impersonated or stolen from…

It’s an evolutionary meme that scarcity of resources breeds inventiveness. When you don’t have the possibility of trying 30 million passwords per minute, the few you actually do try have to have a good chance of being the right ones. When your bandwidth limitation means it’s going to take literally forever to get something, you try to get it offline. And when you can’t trust a service beyond a certain point, you don’t give it information that is too sensitive.

What’s With The Dinosaurs, Anyway?

Dinosaurs went extinct, probably because they were built to take advantage of a much bigger resource pool than we are. They couldn’t adapt to having less. That’s why I laugh when I’m called a dinosaur. I might become extinct because I can’t adapt fast enough to certain things, that’s for sure. But resource-wise, I learned with my friends a long time ago that if you have the skill, you can do a lot more with a lot less. And if I do have more at some point in the future, who knows what I can do?

You guys who are trying to get in at this very moment… Do less, but smarter. It’s not that hard. And you’ll get in. And I’ll curse and I’ll laugh at the same time, and the game will go on. For the moment, you are behaving like a dinosaur, granted, the cooler Tyrannosaurus rex kind, but that one went extinct too. Or so I’m told.

  

Not So Conservative Anymore, Eh?

First things first, happy new year to all! Lots of luck, good health, and all the rest that should apply to you this year!


New year’s resolutions notwithstanding, I haven’t been so conservative these past couple of weeks.

First off, pissed off by the lack of stability and the dogged slowness of Xcode on a fricking 8-core/16GB-of-RAM machine, I gave AppCode a spin. And for reasons that I’m not completely sure I understand myself, I got myself a new Nexus 7.

I didn’t plan it that way, but the first post of the year will be feedback on these two products.

AppCode

First off, it’s meant to be an Xcode complement, not a replacement, as some things, like editing XIBs and Core Data models, still have to happen in Xcode, and debugging and packaging applications relies on Xcode’s toolchain anyway.

So, as the name suggests, it’s mostly for code editing and debugging. For someone who has never used an IDE from another platform on a daily basis (I’m looking at you, Eclipse and Visual Studio), some of the philosophy might be irritating or challenging. AppCode is made by the people who did IntelliJ, a popular Java IDE, and therefore comes with a different set of conventions.

The very VS-like way of docking the “tabs” (or sticky editors, or views, to use older parlance) vertically on the left and at the bottom of the screen has the advantage of making the build log, the TODO list (yaaaaaay), and such easy to find, but at the price of sometimes taking up screen space for “nothing”. And the approach to regular tabs, the opposite of Xcode’s, takes a little time to get used to: where Xcode gave you a few permanent tabs, each representing a kind of workspace with one file open at a time, AppCode gives you plenty of tabs, one per open file, that can’t be reassigned.

I guess the crux of what I’m saying is that if you used to be a VS or Eclipse or IntelliJ user (or if you still are), the gap is not that big of a deal, but if you’ve only used Xcode these past few years, it will take some time before you zip through the interface again. But that’s OK, right? We’re not that conservative anymore!

AppCode shines mostly where Xcode doesn’t: it feels really responsive, it has text editing capabilities that make me wish for a lovechild between it and BBEdit to make it even better, it has a plugin SDK for us to add missing features or patch existing ones, and it analyzes the code in a controllable fashion. It’s a tool made by developers who apparently would like me to tinker around, which I read as a good sign. After all, we the clients are in the same business as they are, and therefore share a few longings and values…

Of course, having to go back to Xcode whenever I need to edit some “proprietary stuff” is a little jarring. I can’t completely break free of Xcode to compare them on equal footing, side by side.

But all in all, it’s giving me new ways to do things that might ultimately be good for me. Just the amount of time I spend swearing has been cut by two thirds… For everything that has to do with text editing in the context of a project (where BBEdit, unfortunately, can’t help me), I am more efficient.

Nexus 7

For a wide variety of reasons and potential future projects, I went and bought an Android tablet. I should add that I’m not a tablet kind of guy, even though I currently have four in my care. For some reason, apart from reading an article every now and again, or a comic book in digital form, I don’t really have any use for them.

I should also mention that I recommend tablet-like devices to people who would benefit from having one instead of a laptop, so it’s not an ideology thing. But my way of using electronic devices revolves around two things that tablets can’t provide for me: a screen big enough to have lots of things on it, and the ability to switch back and forth rapidly between applications, because I’m copying and pasting, or testing, or using specific tools to edit such and such chunk of data.

That being said, I have played a lot with iPads, and I feel like I can at least compare the two devices on the basis that I use them in the same way: mostly because I have to.

First off, forget about the usual trolling. From where I’m standing, both tablets are a pain in the neck to get data onto. They jump through hoops to “make it simple” for the user, only to end up with another (and sometimes confusing) way of moving things around between applications and the OS. Android takes a different tack than iOS does, one which should appeal to me as a “computer nerd”: I can access the filesystem, or use one of the many “file selectors” available on the Google Play store.

And here lies the thing that aggravates me most about this tablet: it’s hard to “get it”. Removing an app has to be done from the app list (which is logical) by doing almost the same thing as if I wanted to add it to the main screen (which is not): I have to drop it on a trashcan that only appears while I’m holding the app. OK, that’s just a habit to pick up, right?

Trouble is, pretty much everything I encountered as a newbie on this thing is of the same ilk. If you know how to do it, it’s not overly complicated, but if not, it’s more of a game of Pin the Tail on the Donkey. I spent half an hour on the phone with my good friend Marc (from the podcast), who got one too, and we both had something to teach the other in terms of “normal” use.

And I won’t even mention my crushed hopes for an app store that would be so much better than Apple’s, because it’s made by people who nailed web search.

As a developer though, I’m rather happy. I can tinker, tinker and tinker, and the device doesn’t get in the way. I also think the Activity/Intent mechanism is a good thing: you declare your app as being able to do this or that, and whenever another app needs those features, it can call you, or ask the system to find you. That’s pretty neat for reusability.
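
Here’s a minimal sketch of what that looks like from the calling side (illustrative Kotlin; the receiving app would declare a matching intent filter for ACTION_SEND and “text/plain” in its manifest):

```kotlin
import android.app.Activity
import android.content.Intent

// The Intent describes the need ("send this plain text somewhere"); we never
// name the app that will handle it. The system finds one whose filter matches.
fun shareText(activity: Activity, message: String) {
    val intent = Intent(Intent.ACTION_SEND).apply {
        type = "text/plain"
        putExtra(Intent.EXTRA_TEXT, message)
    }
    // The chooser lists every installed app that declared it can handle this.
    activity.startActivity(Intent.createChooser(intent, "Share via"))
}
```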

Hardware-wise, it’s a nifty little machine. I have yet to see it struggle with anything other than poorly designed user interaction features, and I could finally play with Alex Ku’s port of the Blender game engine. After months and years of telling everybody and their kin that the Blender project was nothing short of miraculous, and pretty much a solution to every single one of their problems, I can show them. The tablet does 3D graphics so fluidly that most people I showed it to thought it was a showreel. Loved it.

Anyway, I figure that, based on my usage of tablets in general and on my work, I’ll use the iPad and the Nexus pretty much indiscriminately: they both give me solutions to problems I don’t have, and neither really fits all that well into my way of using electronic devices. But I’m glad they exist, as I can totally understand that laptops and computers in general can be daunting or challenging or just plain weird to a lot of people who just want to get their things done.