what the fuck is devops anyway and why should i give a shit?
W..T..F.. is this new thing they call devops?
it's even more of this absolute insanity of working endlessly
just to stay in the same place you were: keeping abreast of
the endless churn of meaningless 'updates' and replacing
all the components that keep getting deprecated (the easy
alternative to actually fixing bugs), never mind that they're
full of bugs while they're still in use.. all to
accomplish a goal that one had already accomplished
last month, three months ago, a year ago, over and over,
but also a goal that we had solved decades ago with
simpler tools.. but hey, all the blogs say this is how we
have to do things!
At the core of this 'devops' craze there are a handful of
dogmatic rules, compliance with which is, in a sense, what
the word 'devops' actually means. The main items of dogma,
each building on the others, are:
first, this idea of 'continuous integration and deployment', meaning
that there is no stable software environment, but rather an
endlessly re-deployed binary soup, constantly being updated by
whoever and whatever modifies it.
second, this idea that everything must run in a 'container'
on a 'cluster', in some manner 'managed' by a 'system'... concepts
which are poorly understood by these people, and which
in practice end up being some mishmash of the half-assed software
offerings that are trendy at the moment, providing more and more
layers of separation between running code and anything 'specific'
anywhere in the world. The definition of the value of the number 4
is too 'specific' and thus must be referenced from somewhere
should we want to change it (see the little sketch at the end of
this item for what that looks like in practice) - where we
reference it from is of course one more of these pieces of
critical 'infrastructure', which
as a rule must originate from some san francisco startup
made up entirely of 25 year old hipsters who spend all their time
drinking expensive coffees in absurdly decorated 'offices' which look
more like an ikea showroom. It, of course, as a piece of software, itself
must also conform to this dogma of statelessness and dependence on
external 'infrastructure', which adds more links to the chain, and they
all get extra points if they don't run anything on any computers of their
own, but instead at the very least run them 'in the cloud', or perhaps
don't even run them at all but rather invoke them 'as a service' from
someone else's management. Moreover, as 'infrastructure', it must
also conform to the dogma of 'continuous deployment', so there is no
stability in these elements of 'infrastructure' either - things you depend
on are supposed to be changed out from under you with no warning
(wait, you don't read all of our commit logs? you're not agile enough!)
and you're supposed to scramble to reconfigure or rewrite other things
(which, if you're following dogma, means waving some magic wands far
far from an actual program or computer and letting subtle side-effects
percolate back) which of course also means other things you depend on
will have to do the same before anything works again.
but this is what they call being a 'ninja' or a 'rockstar' or some shit
like that, where you run around like an asshole trying to figure
out what the fuck just got broken and how to get your shit somehow
working again. over and over and over again.
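(and here is the promised little sketch of the number-4 business.
every name, URL and variable in it is invented by me, just to show
the shape of the thing: the constant that used to be written right
where it's used now takes a round trip through somebody else's
'infrastructure')

# before: the number four, written in the program that uses it
FOUR=4

# after: the number four, fetched at startup from a hypothetical
# 'configuration service' that somebody else runs, somewhere else
FOUR=$(curl -sf https://config.example.internal/v1/constants/four) || exit 1

and of course that configuration service is itself containerized,
'continuously deployed', and swapped out from under you on somebody
else's schedule.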
third, the idea that everything is temporary, and therefore must be
stateless, and therefore any state (alas, there must still be some
actual state somewhere, as much as these new age nihilists would
hate to admit it!) must be hidden 'somewhere else' as much as
possible. This idea gets repeated over and over up and down the
stack, so in the end you get this horrendous chain of reference, where
half a dozen different mechanisms for _transferring_ state in a
temporary manner get used to keep pushing the origin of any information
further and further away, and at each stage there is some opaque
synthesis going on, where it is difficult or almost impossible to
reconstruct an accurate picture of just what the hell is going on
at any point in time, or to trace a change back to which _combination_
of fragmentary snippets of information in the various scattered stores
will yield which changes in behaviour. At many stages along this chain,
one might think one has found the spot from which some item of data
appearing in the running instance originates, only to find
that changing it there does nothing - it's being synthesized
from somewhere even further up the chain.
What do you actually want? a more rigorous and robust
way of developing software?
there is no magical infrastructure which will do this for
you. The answer really lies in the category of
best practices, and the simple tools we had decades ago were
fine for it - the real meat was always in the discipline and
organization of your team. You should have been using
revision control, tests, backups, code reviews before
committing changes, and some kind of release engineering
already.. if you needed some magical tool calling itself
'technology' to get your act in order then you were
already screwing up.
There has never been and never will be a substitute for
actually knowing what you're doing and doing it well. No
matter how many software packages and how many management
fads come and go, none of them will change this basic reality.
Though, by trying to enable people who don't
know what they're doing to write some code, or to get
some results they don't understand, these tools might
simply make knowing what you're doing a whole lot more
of a pain in the ass by forcing arbitrary and pointless
complexity on the people who ultimately must make this
heap of junk work. Add to that the consequence of the
rest of the industry following the same fads, and you get
this neverending churn of half-finished software always
in flux, always needing attention, and a handful of smug
and dishonest 'gurus' (charlatans, actually) who will sell
you the fantasy of everything ready and running at the
push of a button.
really, get over it. Institute some basic discipline
in your team, get a grip on understanding what the hell
you're doing and why you're doing it, and go back to
writing actual programs, with maybe a makefile or two,
producing actual binaries, with minimal dependencies
and minimal sensitivity to other people's shit changing
on you unannounced, and try to regain some stability,
reliability, predictability, and reproducibility in
your software development.
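to make 'actual programs' and 'makefiles' concrete - a minimal
sketch, with made-up file names, of the kind of build i mean: one
short file you can read in ten seconds, producing one binary on
your own machine, the same way every time (recipe lines below are
indented with tabs, as make has always demanded):

CC     = cc
CFLAGS = -O2 -Wall

prog: main.o util.o
	$(CC) $(CFLAGS) -o prog main.o util.o

%.o: %.c
	$(CC) $(CFLAGS) -c $<

test: prog
	./run_tests.sh

clean:
	rm -f prog *.o

.PHONY: test clean

no 'pipeline', no 'orchestration', no yaml: revision control, a
compiler, and a shell script that runs the tests.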
*** BONUS RANT ***
this popular piece of shit called jenkins fails one
more of the golden design principles for good software,
which is 'use human-readable data formats wherever possible'..
their brilliantly bloated binary log formats make it near
impossible to quickly just look at a log file without using
their horrifically shitty web interface (which, by the way, doesn't
even work with large logs, as these modern browsers seem
to choke the hardest on plain text files! ten tabs playing video
simultaneously, no problem. a half-meg text file? browser response
lag stretches out to close to a _minute_), and it also makes
any simple machine analysis very difficult (say i want
to run a simple awk script on the log to try to analyze something).
of course there's sure to be some 'plugin' for some other heavyweight
'tool' like elasticsearch or some shit, but for fuck's sake,
i wanted to just do some stats myself on this fucking log file!
thanks again assholes!
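(for the record, the kind of 'machine analysis' i'm talking about
is this trivial. the log file name and the 'ERROR' pattern below
are placeholders for whatever you actually care about - the point
is that against a plain text file it's one line of awk:

awk '/ERROR/ { errs++ } END { printf "%d errors in %d lines\n", errs, NR }' build.log

that's it. that's the whole 'tool'.)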
followup: it turns out this is down to some dipshit programmer
on the jenkins crew deciding to output fucking HTML color and formatting
tags for EVERY GODDAMNED LINE of the log; this HTML then gets compressed(!)
and encoded and takes up the first 300-plus bytes of every line (where your
typical line might be 50-80 bytes long on average! only a 10x blowup in logfile
sizes!)
for fuck's sake, people!
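(postscript, for anyone stuck with the same mess: on the jenkins
installs i've poked at, the per-line junk is jenkins' 'console note'
annotation - an escape character, then '[8mha:', then a base64 blob,
then another escape sequence. if, and only if, your log looks like
that too - check the actual bytes first, this is a guess at your
setup, not gospel - then something as dumb as the following got me
back a plain text file that awk could chew on:

ESC=$(printf '\033')   # the literal escape character
sed "s/${ESC}\[8mha:[^${ESC}]*${ESC}\[0m//g" build.log > build.plain.log

the file name is made up; substitute your own.)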
This file copyright 2018 by G. Economou