documentation is thoroughly hard
Documentation is good, so therefore more documentation must be better, right? A few examples where things may have gotten out of control.
more...
Why don’t unix commands have any vowels in the name? cp and mv are obviously devoweled stand-ins for copy and move. But they’re less intuitive for new users. The user wants to copy a file. Why shouldn’t the name of the command be exactly the operation the user wants to perform?
What exactly does the user want to do? Instead of copying files, maybe I want to link two files. What does that mean? In unix, we have hard links and symbolic links. If I replace the “original” file, do I want the link to refer to the original file or the replacement? Or maybe what I mean by link two files is to combine two object files into an executable. Do we call that loading instead? ln is the name of a command, but link is the name of a concept. And sometimes the concept evolves over time. The linker is called ld because it used to be the loader. (I think.)
grep is a remarkably useful tool, but with a most unintuitive name. Why not call it find like Windows does? I want to find some text, I run find. So obvious. But some users may want to find files in the filesystem, not strings in a file. What command do they run? Probably locate.
There may be a great deal of historical accident in the names of commands (what if the inventors of awk had different initials?), but that doesn’t mean we can’t recognize the value of unique and precise identifiers.
I provide an RSS feed for flak. I also wrote a simplistic RSS feed reader for myself. The design of the latter was influenced by observing the behavior of existing readers.
There’s a small wave of fetchers that appear every five and ten minutes, converging with larger waves every fifteen minutes. These coalesce with a tidal wave at the top of every hour. My log file shows a whole lot of quiet interspersed with feeding frenzies at regular intervals.
This isn’t a problem, per se, because the total number of feeders is low, and the feed itself is very lightweight. But it’s easy to imagine a more popular blog with more content requiring an outsize investment in capacity to handle such an uneven request distribution.
What can a reader do to avoid such rude behavior? Check feeds at irregular times. For me, this was implemented as a check deadline for each feed. Each time the feed is checked, the deadline is incremented by a random amount between two and four hours. (One to two would work great, too. I’ve fluctuated a bit.) This means that not only is my fetcher not synced with other fetchers, but it’s not possible for it to even accidentally fall into lock step.
If everyone did things this way, that’s all that would be needed. But in a world populated with lock step feeders, there’s one more wrinkle. The fetch process is initiated by cron every five minutes, but the very first thing it does is sleep a random amount between one and three minutes before checking for expired deadlines, ensuring that we never hit a server during a hot minute.
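A minimal sketch of the two tricks in C, assuming OpenBSD’s arc4random_uniform for the randomness; the struct and function names are illustrative, not the actual fetcher code.

#include <stdlib.h>
#include <time.h>
#include <unistd.h>

/* hypothetical per-feed state */
struct feed {
	time_t deadline;	/* earliest time to fetch this feed again */
};

/* after checking a feed, push its deadline out by a random two to
   four hours so the fetcher can never fall into lock step */
static void
reschedule(struct feed *f)
{
	f->deadline = time(NULL) + 2 * 3600 + arc4random_uniform(2 * 3600 + 1);
}

int
main(void)
{
	/* cron starts us every five minutes; sleep a random one to
	   three minutes first to avoid hitting servers during a hot minute */
	sleep(60 + arc4random_uniform(121));

	/* ... then walk the feed list and fetch anything whose deadline
	   has passed, calling reschedule() on each feed checked ... */
	return 0;
}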
I do this mostly because being polite to servers is the right thing to do, but clients benefit from being nice too. Requests to an idle server are more likely to succeed, and to complete faster. If multiple clients are sharing a link (or proxy), they can suffer the same kinds of congestion that busy servers do.
One can imagine that RSS feeds are not the only problem domain which benefits from decoupling a regular activity from a fixed time.
The gcc-local man page, which documents local changes to the compiler, has this to say.
The -O2 option does not include -fstrict-aliasing, as this option
causes issues on some legacy code. -fstrict-aliasing is very unsafe
with code that plays tricks with casts, bypassing the already weak
type system of C.
What does this mean and why should you care? The first part is easy to answer. Long ago, in the dark ages when legacy code was written, people used to write functions like this:
float
superbad(float f)
{
	int *x = (int *)&f;
	*x = 0x5f3759df - ( *x >> 1 );
	return f;
}
The C standard clearly says that objects are not to be accessed through incompatible pointers, but people did it anyway. Fucking idiots.
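For contrast, a sketch of how the same bit trick can be written without the aliasing violation, using memcpy to move the bytes instead of casting the pointer; the function name is made up.

#include <stdint.h>
#include <string.h>

float
lessbad(float f)
{
	uint32_t x;

	/* copy the bytes instead of reinterpreting the pointer; compilers
	   turn these memcpy calls into plain register moves */
	memcpy(&x, &f, sizeof(x));
	x = 0x5f3759df - (x >> 1);
	memcpy(&f, &x, sizeof(f));
	return f;
}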
As for why one should care about the default setting of the compiler, the best answer I can give is that if you’re in a position to care, you probably know more than enough to form your own opinion and don’t need me to explain it to you. Otherwise, nobody cares except to the extent it confirms one’s own biases.
The strict aliasing optimization is disabled in gcc 4.2 because it was disabled in gcc 3.3. It was disabled in gcc 3.3 because it was disabled in gcc 2.95. It was disabled in gcc 2.95 because it was the year 1999.
The gcc-local man page continues with even more stupid options.
The -O2 option does not include -fstrict-overflow, as this option
causes issues on some legacy code. -fstrict-overflow can cause
surprising optimizations to occur, possibly deleting security
critical overflow checks.
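To make that second clause concrete, a hedged sketch of the kind of check that can vanish; the function name is made up.

int
length_ok(int len)
{
	/* hypothetical overflow check: signed overflow is undefined, so
	   under -fstrict-overflow the compiler may assume len + 100 can
	   never be less than len and delete this branch entirely */
	if (len + 100 < len)
		return 0;
	return 1;
}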
Lame.
I’m at the HOPE XI conference. Or I was. It’s kind of overcrowded, which is both great and not so great. I haven’t been to a HOPE since The Last HOPE, but I don’t recall it being as crowded. Perhaps it was. In any case, the logistics of getting in to see each talk in person are exhausting. Some of the talks I wanted to see today are definitely the big name headliners, and I can’t imagine it will be less crowded. Better to watch online. Some thoughts on the talks I did see.
more...
Lots of kernel patches yesterday. Several of them, as reported by NCC, involved similar integer truncation issues. Actually, they involved very similar modern 64 bit code meeting classic 32 bit code. The NCC Group report describes the bugs, but not the history of the code. (Some of the other bugs like usermount aren’t interesting. The unp bug is kind of interesting, but not part of the NCC set. Also doesn’t involve integers. Another time.)
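A hypothetical sketch of the general pattern, not any particular patched bug: a modern 64 bit size_t length handed to older code that keeps it in an int.

#include <limits.h>
#include <stddef.h>
#include <stdio.h>

/* classic 32 bit code: the length lives in an int */
static void
old_api(int len)
{
	printf("len = %d\n", len);
}

/* modern 64 bit code: the length lives in a size_t */
static void
new_caller(size_t len)
{
	old_api(len);	/* anything above INT_MAX truncates or goes negative */
}

int
main(void)
{
	new_caller((size_t)INT_MAX + 2);	/* prints a surprising value */
	return 0;
}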
more...
Strolling through the book store, among the new titles on display in the politics section was Ratfucked by David Daley. What could this be about? The subtitle, The True Story Behind the Secret Plan to Steal America’s Democracy, conjured up images of telepathic lizard men, so I passed it by. A little while later, though, I saw the New Yorker’s review and summary, which sounds a lot better. It describes a plan to target particular districts in local elections, win control of the state, then aggressively gerrymander the map to ensure future victories as well. Of particular interest, the summary focused on some local Pennsylvania elections and the damned Arlen Specter library. Sounds great, this is worth a read. In fact, the cover image subtitle for the Kindle version, How the Democrats Won the Presidency But Lost America, is much more accurate and less sensational. (The book title is actually stylized Ratf**ked because the author is a pussy.)
more...
Finally got a chromebook. I was interested in the HP Chromebook 13 since it was first announced as a kind of cheaper Pixel. But then it spent several months on HP’s out of stock list. Now it’s back.
more...
Some quotes pulled from a recent New Yorker article on the history of presidential nominations. Who said it and when?
“The Republican National Convention at Cleveland next week promises to be a very dull show.”
“Have no influence in the election on the score of the Negroes.”
“Free the delegates.”
“Americans must rule America.”
“This is strictly a white man’s party.”
“I wish I could slay a Mexican.”
“We stand for the segregation of the races and the racial integrity of each race.”
“In this bright new century, let me ask you to win to your side the women of the United States.”
“Women will decide the outcome of this election.”
“Are the American people fit to govern themselves?”
“The will of the people is crap.”
The book was better.
Watched the SyFy series The Magicians. A short review of the book series.
more...