William Dampier

Powell’s Books – Review-a-Day – A Pirate of Exquisite Mind: Explorer, Naturalist, and Buccaneer: The Life of William Dampier by Diana Preston, reviewed by Times Literary Supplement:

William Dampier was a Somerset man, born in the village of East Coker in the middle of the seventeenth century. His memorial brass, in the medieval parish church of St Michael, speaks of a life driven by a profound curiosity about the natural world. Unstated, but implicit in the brief list of his remarkable achievements, is the sustained courage essential for any exploration of the ocean at a time when wind was the only power, when the determination of longitude was problematic and many coastal seas were uncharted:

TO THE MEMORY OF WILLIAM DAMPIER BUCCANEER EXPLORER HYDROGRAPHER and sometime Captain of the Ship Roebuck in the Royal Navy of King William the Third. Thrice he circumnavigated the Globe and first of all Englishmen explored and described the coast of Australia. An exact observer of all things in Earth, Sea and Air he recorded the knowledge won by years of danger and hardship in Books of Voyages and a Discourse of Winds, Tides and Currents which Nelson bade his midshipmen to study and Humboldt praised for Scientific worth.

Surely here was a man of whom the people of East Coker could be justly proud, a heroic figure to add lustre and interest to an otherwise obscure corner of England? Strangely though, Dampier’s memorial was not erected until 1907, and even then, its appearance in the ancient church was not welcomed by all of the worshippers.

Hmm, I have been by his house many times. I knew he was a major figure in England’s nautical past, but didn’t realize he was this notable a figure. And he is no longer the only reason to make a pilgrimage to St Michael’s church: T. S. Eliot is interred in the wall there.

final(?) system/network tuning

Following up on earlier notes, I decided to make one more change: I doubled the sending buffer’s size.

[/usr/home/paul]:: sysctl net.inet.tcp.sendspace
net.inet.tcp.sendspace: 131070

I rebooted the system that's been taking all the load, and since these tuning adjustments weren't permanent, they were lost. It took a very short time before I saw the same problem cropping up again. I reset the send and receive buffers to 64k and things improved. I don't know if I need a send buffer that large, but I don't think it will hurt anything. I do have some transactions in today's log of more than 200 KB, and one of more than 2 MB.
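To keep the settings from reverting on the next reboot, FreeBSD (assumed here, since `net.inet.tcp.*` are FreeBSD OIDs) reads /etc/sysctl.conf at boot. A sketch with the 64k values mentioned above:

```
# /etc/sysctl.conf -- loaded at boot, so the tuning survives reboots
net.inet.tcp.sendspace=65536
net.inet.tcp.recvspace=65536
```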

Now playing: The National Anthem by Radiohead from the album “Kid A” | Get it

lazyweb: what can be done about comment spam?

Looking through my logs just now, I see lots of spurious requests whose referer values point at various businesses that seem to think I'm going to give them some publicity or marketing assistance.

I wonder if there isn't some way to cull these and create a central "wall of shame" clearinghouse. Of course, one could put a competitor's name up on the wall as easily as one's own. But I doubt that's happening: I think these are people as misguided and self-centered as Canter & Siegel (what, you didn't know it took lawyers to invent commercial spam?).
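One low-tech way to cull these is to tally referer hosts from the access log and eyeball the ones that show up suspiciously often. A minimal sketch in Python, assuming Apache combined log format (the sample lines and hostnames are made up):

```python
import re
from collections import Counter
from urllib.parse import urlparse

# In combined log format, the referer is the second-to-last quoted field,
# followed by the quoted user-agent at the end of the line.
REFERER_RE = re.compile(r'"([^"]*)" "[^"]*"$')

def referer_counts(log_lines):
    """Count requests per referring host, skipping empty ('-') referers."""
    counts = Counter()
    for line in log_lines:
        m = REFERER_RE.search(line.strip())
        if not m or m.group(1) == "-":
            continue
        host = urlparse(m.group(1)).netloc
        if host:
            counts[host] += 1
    return counts

sample = [
    '1.2.3.4 - - [01/Feb/2004:15:17:00 -0800] "GET / HTTP/1.1" 200 1024 "http://spammy-casino.example/" "Mozilla/4.0"',
    '1.2.3.4 - - [01/Feb/2004:15:18:00 -0800] "GET / HTTP/1.1" 200 1024 "http://spammy-casino.example/" "Mozilla/4.0"',
    '5.6.7.8 - - [01/Feb/2004:15:19:00 -0800] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/4.0"',
]
print(referer_counts(sample).most_common(1))
```

Anything that outranks the hosts you actually expect to link to you is a candidate for the wall.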

iPod doesn’t write files contiguously? drains battery life as a result?

iPod – restore and maximize battery life:

If you erase and add files in a normal manner you will get fragmentation. The iPod hard disk will have to work more to read your music and your battery will suffer.

This needs to be tested, I think: since the update process is initiated by the desktop system, it should be straightforward to ensure that files are written to contiguous blocks.

iPodlounge | All Things iPod:

Apple does not recommend running disk utilities like Norton Speed Disk, Disk Scan and Disk Defragmenter. It’s not really needed as the drive is not written to and erased nearly as much as a typical hard drive. If you’re emphatic about cleaning up your drive it’s best just to do a full Restore with the Apple Software Updater. This reformats the drive (defragmenting it in the process) and has the added benefit of creating a new clean iPod database which over extended periods of use can get corrupted.

I can see how extra seeking and scanning could run down battery life, but I'm not convinced that defragmenting is needed.

what’s wrong with this picture?

So I found a conversation over here that leads off with a simplified version of how the two US parties differ (not that I disagree with it). Somehow the comments thread turned into a discussion of trial lawyers and healthcare costs.

Having had some experience in this (my kidney stone surgery in early 2003 wasn't covered by insurance, and I have been hung out to dry other times as well), what I see happening here is a colossal buck pass. Company A wants to insure its employees, so it contracts with Insurer B, who authorizes payments for Doctors C, D, and E. The good doctors have their own insurance to cover malpractice, from Insurer G. But suppose one of Doctor D's cases has an unsatisfactory outcome: the patient has some condition that now requires long-term care, and Insurer F has to cover those costs.

Now, let’s say the family of this patient (H) is unhappy and thinks the Doctor was negligent. Will Doctors C and E take action? Would they be willing to honestly examine a case that could end up with them drumming one of their own out of the profession?

So enter Trial Lawyer I, who takes the case on contingency and wins a huge settlement, which Insurer G has to pick up. It of course passes on the cost to the other doctors in higher malpractice premiums; they in turn raise their rates, leading Insurer B to raise its premiums to Company A, eating into Company A's profits.

Who is in a position to stop this and hold the line on costs? The insurers? They’ll charge what the traffic will bear. The companies that buy from them? I’m guessing the market isn’t all that competitive when you hear how many employees of small firms are uninsured. The doctors? They don’t want to be exposed to the risk of losing their livelihood (one doctor I was a patient of some years ago — when I was uninsured — told me he had to clear $300,000 before he could take a salary, just to cover overheads, like his office, staff, and insurance). What about the patient? Should they take it on the chin?

It sounds to me like the whole system needs to be trashed and rebuilt from scratch. I would like to see some kind of arrangement where the insured people deal directly with a doctor or group, and just pay directly, rather than involving a multiplicity of additional parties (insurers, brokers, etc.). Of course, the risk analysis means involving some experts, actuaries and the like, but surely this can be made to work again, assuming it ever did.

What would it take for ordinary people to self-organize into a group and effectively sell themselves to a group of doctors? What's so different between when I entered the work force, when insurance was comprehensive and covered by an employer, and today, when it is not always comprehensive and there is often an employee contribution (i.e., a pay cut)? I don't know if I buy the argument that trial lawyers and malpractice costs are the sole cause. Blaming lost legal suits and the associated costs is like blaming the cops for your speeding tickets. It just sounds to me like a bunch of people standing in a circle blaming the person next to them . . . round and round it goes, but never stops.

more on site optimization

I reimplemented MTOptimizeHTML on the main page.

Before :
131835 Feb 1 15:17 index.html

After:
114726 Feb 1 16:07 index.html

That 17k difference is real savings (about 13%), but 114k is still plenty big.

Of course with mod_gzip, it goes down to 27,622, for a 76% savings. That I can deal with.
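Compression ratios like that are easy to sanity-check offline with Python's gzip module. A quick sketch on some repetitive HTML-like text (the sample markup is made up; HTML's redundancy is exactly what makes gzip so effective on it):

```python
import gzip

def gzip_savings(data: bytes) -> float:
    """Return the fraction of bytes saved by gzip compression."""
    compressed = gzip.compress(data)
    return 1 - len(compressed) / len(data)

# Markup repeats heavily across a page, so savings in the 70%+ range
# (like the 76% above) are typical for real HTML.
page = b'<div class="entry"><p>hello world</p></div>\n' * 1000
print(f"{gzip_savings(page):.0%} saved")
```

Running the same function over a saved copy of the actual index.html would reproduce the mod_gzip numbers.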

I think I’ll just keep it on the main page for now and see how it goes.

[Posted with ecto]

why knowledge management matters

How to Save the World

Contrast these two paragraphs, each designed to convey the value propositions of knowledge management to an unaware, perhaps skeptical, audience of executives:

1. Knowledge Management caters to the critical issues of organizational adaptation, survival and competence in face of increasingly discontinuous change. Essentially, it embodies organizational processes that seek synergistic combination of information processing capacity of information technologies, and the creative and innovative capacity of human beings.
2. In June 1995, a health worker in Kamana, Zambia, logged on to the CDC website in Atlanta and got the answer, posted by an unknown associate in Indonesia, to a question on how to treat malaria.

Even if the audience has no experience in health care, they immediately relate better to the second argument, even though it is less comprehensive an explanation of the benefits of knowledge management. The story engages them in ways the factual argument cannot.

I had an email exchange with a librarian at my workplace a few weeks ago: she had recently been charged with finding ways to integrate technology into the learning process, and I sent her a couple of recent links on KM, thinking they might be useful.

Her reply was that KM wasn’t anything she was interested in. At the time, I was surprised and puzzled, but after reading these two examples, I’m really disappointed. A culture that isn’t even aware of how little it knows about itself is an amazing phenomenon, and not altogether enjoyable.

the virtues of literacy

O’Reilly Network: Marshall McLuhan vs. Marshalling Regular Expressions [Jul. 08, 2002]

What does the success of regular expressions have to do with McLuhan? Simply put, the technology and Friedl’s book seem to embody everything McLuhan said was passé: they celebrate and support a reverence for text that McLuhan expected current generations to abandon. The actual message, as I will show, is more subtle and enhances McLuhan’s work substantially.

At first, I was pleased to see that Jeffrey Friedl's book on regular expressions had come out in a second edition: I was a reviewer on the first edition, but until I read this article, I had no idea how popular the book had become. It is a great reference, and I know the author has been scrupulous about tracking and fixing errata: one of the reader reviews at Amazon.com claims the book has no errors, which doesn't surprise me.

But back to McLuhan. It's been a while since I read any of his books, but this article refreshes the core ideas well enough. One aspect of hacker culture, or at least of the smarter hackers I have known, is a high level of literacy and literary proficiency. They read and write well, and as anyone who has worked on UNIX systems knows, the text processing tools in that environment are the most powerful to be found anywhere. emacs, TeX, SGML and its better-known offshoots HTML and XML are all text processing tools. And why has it been necessary to create these tools? To write programming language source code, which is far more rigorous than human language and requires more powerful, more finely controllable tools. Where most of us can get along quite well with simple search and replace in a word processor, regular expressions make a myriad of other text management tasks not just possible but simple.
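A tiny illustration of a text-management task that goes beyond what a word processor's search-and-replace can express: passing a function as the replacement to Python's re.sub, here to turn bare URLs into HTML links (the example text is made up):

```python
import re

def linkify(text: str) -> str:
    """Wrap bare http/https URLs in HTML anchor tags."""
    return re.sub(
        r'\bhttps?://[^\s<>"]+',       # a URL runs until whitespace or markup
        lambda m: f'<a href="{m.group(0)}">{m.group(0)}</a>',
        text,
    )

print(linkify("See http://www.oreilly.com/ for the book."))
# -> See <a href="http://www.oreilly.com/">http://www.oreilly.com/</a> for the book.
```

The replacement is computed from the match itself, something no fixed search-and-replace string can do.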

Perhaps this is where the irony of using the word "programming" to describe both producing television, a passive, non-creative activity, and producing software, its polar opposite, makes itself most keenly felt.

A closed, unprogrammable device fits McLuhan’s most dire assessment of automation and its numbing effect. But once a hacker breaks open the device and reprograms it, he reclaims not only the device itself but all media with which it comes in contact.

Reclaiming media is what the FCC hearing at the UW (was that just yesterday?!) was all about. There’s something very meme-ish about all this.