Link: http://xkcd.com/303/
It really doesn't matter what the programmer is actually doing; as soon as the words "code's compiling" are spoken, a free period appears to be granted.
I suppose I could counter that with "document the change while you're waiting" and watch heads explode.
During our five-year foray into C++, this was really appropriate. You could work on small pieces with fairly decent compile times, but once you were into the integration builds, or, heaven forbid, needed debug information turned on or off, it was an hour-and-a-half process.
Sure, macros let you do a number of things, but not enough to justify C++'s approach of either recompiling everything for every single output file or relying on a semi-craptastic scheme for “pre-compiling” things that, as often as not, ended up introducing bugs that weren’t present in the code.
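(For anyone who hasn’t fought that scheme: here’s roughly what a precompiled-header setup looks like, using GCC as the example. The file names are made up, and the failure mode sketched in the comments is my reading of where the “bugs that weren’t in the code” tended to come from with the vendor schemes of that era.)

```cpp
// common.h -- the big, slow header that every translation unit includes
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// widget.cpp -- one of many .cpp files that all pull in common.h
#include "common.h"

std::size_t countWidgets(const std::vector<std::string>& names) {
    return names.size();
}

// Without precompiled headers, every .cpp re-parses common.h in full:
//   g++ -c widget.cpp gadget.cpp ...      (slow on a big project)
//
// With GCC's precompiled headers, the header is compiled once up front:
//   g++ -x c++-header common.h            (produces common.h.gch)
//   g++ -c widget.cpp                     (reuses common.h.gch when it matches)
//
// The risk with older, vendor-specific schemes: a cached header image built
// with different flags, or against an older common.h, getting used anyway,
// so the object files no longer agree with the source you're looking at.
```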
Great for lunch breaks, reading, and going home, I guess. I think one day we just got overly frustrated, broke out the umbrellas and mouse balls, and played a quick round of umbrella golf.
It’s annoying that every once in a while they improve your tools so that compiling takes half the time it used to, so your free time is halved too. And half of that you spend complaining to your co-worker about how the old compiler was much better…
Turbo Pascal was the first language I did a lot of serious development in, and it always compiled really quickly (no macros and proper modules helped; full builds were under a minute until projects got really huge), so I never had the coffee-break-while-compiling experience until we jumped into C++.
I had experience with a lot of other languages at home, including C++ (thank you, SAS/C and a sale at the Computer Shop in town) and a lot of Unix ports courtesy of the Fred Fish library, but you never truly get a sense of how long a real project takes to compile until you do a real project :)
So which compiler did they make worse while making it compile faster, Ari? :)
It’s just some company-specific thing. And as for whether it’s worse now that it’s faster… one always has to have some resistance to new things by default. ;)
I often find that upgrades are a particular problem if the provider is poor at matching functionality from release to release (there was one release by our C++ vendor where the initialization order changed and collections went from being references to being instances; ouch, that broke stuff unnecessarily), or if you’ve had to make your own workarounds for the class library or other components, which is a little harder to fault the vendor for, but can still make upgrades a pain.
It’s things like that which make me recommend that people use a thin architecture of their own to wrap other people’s functionality if they want to remain sane between developer-tool releases :)
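A minimal sketch of what I mean by a thin wrapper, with vendor::VendorList standing in for whatever collection class the tool vendor actually ships (the vendor namespace, class, and method names here are made up for illustration):

```cpp
#include <cstddef>
#include "vendor/VendorList.h"   // hypothetical vendor header

// The rest of the codebase only ever sees ItemList. If the vendor's
// collection changes semantics between releases (references vs. copies,
// renamed methods, different initialization order), the fix stays here.
template <typename T>
class ItemList {
public:
    void add(const T& item)          { impl_.Append(item); }      // illustrative vendor call
    const T& at(std::size_t i) const { return impl_.ItemAt(i); }  // illustrative vendor call
    std::size_t size() const         { return impl_.Count(); }    // illustrative vendor call

private:
    vendor::VendorList<T> impl_;     // the only place the vendor type appears
};
```

When the next release changes what ItemAt returns, the wrapper is the one file you have to touch instead of every call site.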