I remember when I was a young lad pursuing my second undergraduate degree (don’t presume I finished the first, because I didn’t), I had to write a paper to get accepted into the computer science program. I remember reflecting on how math doesn’t change, and that even though they were teaching me antiquated methods, I recognized the patterns were transferable. Apparently, telling them their tech was out of date didn’t piss them off enough to exclude me. Or maybe I was spot on with that analysis.
Now, after about 18 years in the industry, I think that reflection was spot on. I’ve watched client/server become request/response, and now single-page apps look a lot like client/server again. It was “do all rendering on the server,” and now it’s “do all rendering on the client.” On the backend, we had “don’t do any processing in the data store” (kill those sprocs!), but then map/reduce came around and said: wait, do the processing close to the data. Sorry man, we were wrong, that was a good idea after all, we just need to tweak it a bit. Doesn’t Docker remind you of the JVM conceptually? It’s something like a virtual machine that sits on another machine but doesn’t require a whole OS; in other words, I can run several containers on one box, just like I can run several JVM processes (albeit the JVM will run out of memory much sooner!). Tech trends come and go, and a lot of them sound the same as before; sometimes they improve on the past, sometimes they make the same mistakes as the past. In general, it always comes back to square one:
- Pick the right tool for the job.
- Don’t add layers of abstraction that aren’t necessary or aren’t bringing real value.
- Don’t trust whatever framework X is promising without some tangible demonstration. The academic promise, if it’s nothing more than that, cannot be trusted; and even when it works, it has limits.
- Whatever fail-safe, pure layer you create, some asshole will find a way to make it leak. Trust me, I’m often that asshole.
- Beware of “easy to start, hard to finish.” The industry is all about time to market, time to market, bring the value. But remember: 80% of the time and effort is spent maintaining and supporting the software. Maintenance and sustainability are crucial. Regression tests are great, but the compiler gives you stuff for free.
So next time you want to poke fun at COBOL (ok, now it’s legacy Java) because it’s 10 or 20 years old, ask yourself: will that transpiler that was invented yesterday even be around in 2 years? Software that keeps adding value over time is hard. If you work somewhere that has some 5-, 10-, or 20-year-old code, instead of cursing at that shit, maybe stop and realize it’s amazing that stinky crap still works and is still providing value (and maybe your paycheck!). Do you think your shitty-ass code will be around that long?
I think the microservice trend is fine, and I can see the value. I like that it forces the division between layers and decomposes problems; it’s largely about risk management and quick deployments. But on the other hand, it’s also a cop-out to believe that software can’t have any longevity. Maybe it’s a dying art, maybe I’m just an old fogey, maybe the crap created today really won’t be around in 10 years, and I should just accept that. But seeing how the patterns keep coming back and fading away just like fashion, I’m thinking a sturdy pair of jeans will do just fine. They might take a while to put on, but once they’re on, I can do anything. And if I can have quick time to market and maintainability, I’m picking that one… Scalalala