books chapter two
Brad Fitzpatrick has never lived in a world without the Internet, founded LiveJournal, and says lots of things I agree with, so he must be really smart.
AOL used to mail you a CD with a code for some free hours. Brad wanted some free Internet, so he scripted the request form submission and requested a few thousand CDs. His mom said that would cause trouble, but he figured it was their fault for not rate limiting, right? Eventually AOL called his house and yelled at him for too many form submissions. Brad yells back that they should stop sending him all this junk, it’s filling up his mailbox! AOL says sorry for the inconvenience. A cautionary tale about what happens when you have neither rate limiting nor monitoring.
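For the record, rate limiting isn’t hard, either. A minimal sketch (mine, not AOL’s; the names and numbers are made up) of a per-client token bucket:

```python
import time
from collections import defaultdict

class TokenBucket:
    """Allow at most `rate` requests per second per client, with bursts up to `capacity`."""
    def __init__(self, rate=1.0, capacity=5):
        self.rate = rate
        self.capacity = capacity
        self.tokens = defaultdict(lambda: capacity)   # each client starts with a full bucket
        self.last = defaultdict(time.monotonic)       # time of each client's last request

    def allow(self, client):
        now = time.monotonic()
        # Refill this client's bucket based on time elapsed since their last request.
        self.tokens[client] = min(self.capacity,
                                  self.tokens[client] + (now - self.last[client]) * self.rate)
        self.last[client] = now
        if self.tokens[client] >= 1:
            self.tokens[client] -= 1
            return True
        return False
```

A few thousand CD requests from one teenager’s house would drain the bucket after the first handful, which is more monitoring than AOL apparently had.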
Brad was a solo contractor for a while, trading projects with another solo contractor named Brad, but then a big project came along that required them to work together and they had to fly to “the cheesesteak place.” I died.
In addition to LiveJournal, Brad (and company, this is somewhat later) were building a photo sharing site, FotoBilder. This didn’t really take off, beaten by Flickr, perhaps because it was overdesigned. At the same time, some components built to handle the load at LJ were designed to be a little more abstract. memcached would otherwise have simply been embedded into LiveJournal’s code, but since they had a second project in mind, they made it flexible enough to support any web app. So abstraction good and abstraction bad. Having a shipping product and a known use case seems to help identify the right level of abstraction.
Brad bought one NetApp, which cost “all the disposable income you have without going broke.” It didn’t really work out all that well, so they never bought a second. Not that they could afford two, anyway. However, scaling storage and the database was something they struggled with. Handling increased web load by tossing another web server into the pool is pretty easy, but a database has a lot more state and is not so easily replicated. Migration is particularly difficult. To avoid pain in the future, plan for the day when you can’t access all your data with a single join. This was a hard-won lesson at the time, but it’s probably closer to common knowledge these days, although that’s no reason to go overkill.
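The single-join warning deserves a concrete picture. A hypothetical sketch of hash-based sharding (the shard names and modulo scheme are my invention for illustration, not LJ’s actual design):

```python
# Route each user's rows to one of several database shards by hashing the
# user id. Once data is split like this, a single SQL join across all users
# is no longer possible; cross-shard queries must be stitched together in
# application code.
SHARDS = ["db0", "db1", "db2", "db3"]  # hypothetical connection names

def shard_for(user_id: int) -> str:
    return SHARDS[user_id % len(SHARDS)]
```

All of one user’s data lives together, so per-user queries stay simple. But note the trap: adding a fifth shard changes `user_id % len(SHARDS)` for almost every user, which is one reason migration is particularly difficult.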
I had somebody recently tell me about something: "Java takes care of that; we don't have to deal with that." I was like, "No, Java can't take care of this because I know what kernel version you're using and the kernel doesn't support it. Your virtual machine may be hiding that from you and giving you some abstraction that makes it look like that's efficient, but it's only efficient when you're running it on this kernel." In practice, nothing works. There are all these beautiful abstractions that are backed by shit. The implementation of libraries that look like they could be beautiful are shit.
Productivity tip: try working on an airplane, and see how often you flick over to a browser and check some news site, only to receive a not connected error.
On debugging, it helps to verify one’s assumptions. At every level. Before spending 90 minutes single stepping through a program because the output is wrong, make sure you’re reading the right output file. Having done this a few times, the hardest bugs to find are definitely the ones that don’t exist.
Brad is skeptical that programmer time is really worth more than computer time. Perhaps for a small number of machines, but eventually there are going to be many machines, right? I agree very much. If a program with a million users takes two seconds to start, and that can be reduced to one second, even with ten days of effort that comes out ahead. And if every user launches the program a second time, even more savings. There’s a sort of tragedy of the commons. Even if the program is open source, and I can fix it, my personal savings will only be a few seconds. So nobody wants to be the one to invest a few days to speed things up because they personally won’t come out ahead. One might rephrase “programmer time is more important than computer time” as “my time is more important than yours and everybody else’s time.”
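The arithmetic here is worth actually running. A back-of-the-envelope version, using the hypothetical numbers from above plus an assumed eight-hour workday:

```python
# All figures are the hypothetical ones from the text, not measurements.
users = 1_000_000
saved_per_launch = 1.0        # seconds saved: startup drops from 2s to 1s
effort = 10 * 8 * 3600        # ten days of programmer time, in seconds

user_seconds_saved = users * saved_per_launch
assert user_seconds_saved > effort  # comes out ahead after a single launch each
print(user_seconds_saved / effort)  # roughly 3.5x, and growing with every relaunch
```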
Brad recommends Higher Order Perl as a fun mind bending book.
I don't think we've really made any progress in quite a long time. Computers seem slower, and crashier, and buggier than ever. But I'm such an optimist, I keep thinking that they'll get better. It seems like my computing experience was happier ten years ago than it is today. It seems like my computer was faster ten years ago; like my computer worked better ten years ago.
Yeah, but that was ten years ago. Things have certainly changed, right?
Maybe there are so many layers of abstraction that people don't know what the hell is going on underneath because the computers are so damn fast that it hides your stupidity.
Dude, you read my mind! Actually, I read this book several years ago and this sounded like good advice, but I didn’t find it especially noteworthy. Coming back and reading it again, wow, it really resonates. Stupidity hiding abstractions are my personal bugbear these days, but how I wish they were everyone’s bugbear.
Steve Wozniak is a man who knows how to sell a computer. No, wait, wrong Steve. Wozniak knows how to build computers. And he’s really good at it, using half as many chips as anybody else. “When you design with very few parts, everything is so clean and orderly you can understand it more deeply in your head, and that causes you to have fewer bugs. You live and sleep with every little detail of the product.” Reasonable advice for building software, too, I think.
Steve is telling this story about how he was slowly building a computer at home, and it was very expensive. He spent all this effort reducing chips and reusing components (like a TV for a monitor) to keep the costs down. At the time he was an engineer at HP. As I read this I kept thinking that something must not be right. An engineer’s salary shouldn’t be that bad, and how many chips could he need, etc.? But then I figured maybe this was before cheap microprocessors, so you really would need a lot of chips. Except it wasn’t: it was Steve who had forgotten about them. He was pretty heads down at work and on his own project, and he’d missed the introduction of cheap microprocessors! Once he got his hands on a datasheet, he made up for lost time, but I thought it was kind of amusing that he’d fallen behind the times for a brief while. Sometimes there’s a change or new development that makes a huge difference, and it helps to know about it.
In developing the Apple II, Jobs would pressure Wozniak to reduce costs by eliminating another two chips. Wozniak would say he could do it, but then they’d have to cut high res mode, and Jobs would relent. I like this development style. Go through the project, pick some stuff and try to remove it. What features are lost and are they important? Some stuff you definitely need to keep, but there’s usually some “loose” bits that can be pulled out.
Joe Kraus founded Excite, which made a search engine in the days before big Goog. He and his friends didn’t really have a plan, they just made a company and threw ideas against the wall.
There are two unrelated stories about cost savings, but I thought they made for a nice contrast. In the early days, when they had very little money, Joe bought a used copy machine. This would save so much time and money, vs going to the bank to get dimes to pay for copies at the library. Except it didn’t really work, so that was a waste of money. Some time later, when Excite was getting some traction, they had a buyout offer from Microsoft. Excite asked for $100 million, and allegedly Bill Gates asked someone how much it would cost to build an equivalent service, and the answer was $25 million. So Microsoft turned down the offer, but then they didn’t build a search engine, either. (At least not for many years.) So kind of a strange missed opportunity, where they had the choice to buy it or build it, and they chose nothing at all.
The titular chapter, The Mythical Man-Month. What to do when a project is late? As the months tick by, scale up the men accordingly. Alas, reality is not as simple as this arithmetic makes it appear.
An interesting observation about the difficulty of project estimation. When building something in a physical medium, we learn to expect difficulties. The lumber and the paint and all the other parts contain some inherent uncertainty in them. Mistakes will be made. Rework will be done. And so we accommodate these possibilities in our estimates. Or at least the better craftsmen do. But computer programming is crafting “pure thought-stuff”, a very flexible and powerful medium. This leads to a false confidence that everything will go according to plan. All we need do is think it, and it will be. Programming is hard because programming is easy.
As we add workers to a project, we increase training time and intercommunication, and so forth, eventually making things even later. It’s also difficult to adjust a project’s timeline when it’s late. If a four month project has only accomplished one month of progress after two months elapsed time, will the project be done in five months or eight months? A great deal of frustration and trouble could be eliminated by being honest upfront, but nobody wants to hear that things aren’t going to be done on time. And so delays tend to explode at the very end.
Along those lines, Brooks would always schedule lots of time for what he calls a system test. Integrate all the parts and test the final result. Not allocating enough time for this is extra bad. You may think you’re on time, but if nothing comes together, suddenly and without warning you are going to be very late. And this is even exacerbated if previous efforts to maintain the schedule did increase manpower. Now you have more components, by more individual workers, that need to be integrated.
Personally, I really like branchless development where everyone is always working on a single integrated code base. The system testing is amortized over the course of development and everybody always knows where the project as a whole stands.
4. Good enough software doesn’t necessarily mean the bare minimum, but there’s no need to gold plate before shipping. “Great software today is often preferable to perfect software tomorrow.” Oh, to live in a world where those are my choices. They’re mostly talking about quality, but the advice also applies to features. Don’t ruin a good program by piling on too much. Quit while you’re ahead.
5. Expand and diversify your knowledge portfolio. An elaborate analogy with financial portfolios is made, the gist of which is that you should always keep learning. Nobody wants to be that COBOL programmer who can’t find a job. But maybe don’t spend too much time chasing the newest buzzword? I can’t really argue against learning, although some of the examples given are a little dated. Learn about the web! OK!
6. Communication is key. Consider your target audience. This is more about interpersonal communication, rather than documentation, but either way, there are several ways to convey the same message. Pick one that people are inclined to pay attention to.
We spend some time studying Braille, which is a conveniently binary-like code. Six dots give us the possibility to represent 64 characters or symbols. That’s a decent amount, but we can do even better.
If a letter appears by itself, it’s actually a word, like “but”, “can”, “it”, “the”, etc. Among the possibilities, I like that “knowledge” ⠅ has a symbol all its own. We can also use a single code for some common combinations, like ch, gh, sh, etc. And then there’s a special symbol, ⠼, which means the letters to follow are actually numbers. If we want to indicate a capital letter, there’s a shift character, ⠠, for that too.
Braille is pretty neat because it demonstrates a number of encoding techniques. There’s compression, reducing the space needed for common or redundant information. There’s shift codes, which alter the interpretation of following codes. And there’s escape codes, which indicate special handling for the following code.
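As a toy demonstration of the escape-code idea, here’s a sketch that encodes the letters a through j and digits in Grade 1 Braille, where digits reuse the letter cells behind the ⠼ number sign. (Real Braille has far more rules than this handles.)

```python
# Each Braille cell is a set of raised dots numbered 1-6; the Unicode Braille
# block maps dot n to bit n-1, starting at U+2800 (the blank cell).
DOTS = {
    "a": [1], "b": [1, 2], "c": [1, 4], "d": [1, 4, 5], "e": [1, 5],
    "f": [1, 2, 4], "g": [1, 2, 4, 5], "h": [1, 2, 5], "i": [2, 4], "j": [2, 4, 5],
}

def cell(dots):
    return chr(0x2800 + sum(1 << (d - 1) for d in dots))

NUMBER_SIGN = cell([3, 4, 5, 6])  # ⠼ escape: cells that follow read as digits
DIGITS = "1234567890"             # digits reuse the cells for a through j

def encode(text):
    """Encode letters a-j and digits; one ⠼ escapes an entire run of digits."""
    out, in_number = [], False
    for ch in text:
        if ch.isdigit():
            if not in_number:
                out.append(NUMBER_SIGN)
                in_number = True
            out.append(cell(DOTS["abcdefghij"[DIGITS.index(ch)]]))
        else:
            in_number = False
            out.append(cell(DOTS[ch]))
    return "".join(out)
```

So `encode("42")` yields ⠼⠙⠃, the cells for d and b sitting behind the number sign.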
Next we learn how a flashlight works. Electricity is kind of like water flowing around in a pipe, and atoms are like little solar systems with electrons orbiting around, and none of this is really true because analogies only go so far. We measure electricity using volts and amps and ohms and watts in honor of the inventor of electricity, Ohmbudsman Voltaire Ampadeus Watts. I made that up.
Now that we know a bit about electricity, we can understand that circuits are open or closed, current is flowing or not, lightbulbs are on or off, etc. This could be important.
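It could be important because on-or-off switches map directly onto boolean logic: switches in series behave like AND, switches in parallel like OR. A trivial sketch:

```python
# Model a switch as a boolean: True means closed, current flows.
def series(a, b):
    # Current flows only if both switches in the loop are closed.
    return a and b

def parallel(a, b):
    # Current flows if either branch of the circuit is closed.
    return a or b
```

The bulb in a two-switch series circuit lights only when `series(True, True)`; wire the switches in parallel and either one suffices.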
Common theme this week seemed to be about keeping up with the latest developments. If you miss out on the introduction of the microprocessor, that’s definitely going to make things challenging. Chasing the new shiny can definitely lead one in the wrong direction, too. Wozniak was able to design great computers with microchips, but he was also capable of designing great computers without microchips. He knew what was happening underneath. And the same with Brad Fitzpatrick on the software side. There’s a lot that can be accomplished using abstractions, but we need to be careful we’re not making things worse.
Studying a simpler system is a great way to learn. A couple people made this point, and I really liked that the case study of Braille makes for such a fine example. Memorizing all of Braille could be time consuming, but even a quick look shows how some concepts work and combine.