smells like techno-utopians

Someone suggested that this post is based on some of the writing here. Hard to tell: I suppose the idea of land rents is the connection but there’s nothing there that isn’t a crib from Thomas Paine’s Agrarian Justice, ca 1797. Call it the American Equity Fund or a sovereign wealth fund or a prosperity dividend, as you like.

Most of this piece is the usual techno-utopian twaddle that assumes we all want AI to make more and more decisions (I was reporting about AI — expert systems and neural nets — before the author of this piece was in school) and that we will all just accept the disruption, the loss of opportunity/security as jobs are abstracted away, so that a few people who have managed to fail upward throughout their short lives can continue to hold power. I suppose one may think this is Luddite or anti-technology crankiness, but that just shows a profound misunderstanding of what the Luddites were angry about. All of this Taylorism-adjacent AI stuff is just a way to turn those with local knowledge and skills into cheap labor. James C. Scott talks about this in Seeing Like a State, describing this kind of top-down deskilling as high modernism. Just like what the capitalists claim about socialism, this really has failed everywhere it has been tried, but that won’t stop those who have never learned from failure from trying it again. After all, it’s only other people’s lives they are taking, piece by piece.

There is a way to take someone’s life without killing them: you simply deprive them of the means to enjoy it, reducing them to a meat machine that lacks the means and eventually the desire for anything outside of their labor obligation. This seems to be what most of these schemes boil down to: stripmining industries (as with Uber/Lyft and the taxis) or stripmining cities, where the inflated salaries of those paid to harvest data and put ads in more and more places distort local housing markets and force the workers who don’t command six-figure salaries to leave the cities they built.

Daniel Pink laid out the three components of internal motivation: Mastery, Autonomy, and Purpose. Mastery requires no explanation. Autonomy is simply the power to say “no” to work you don’t want, for whatever reason — it doesn’t pay enough, the working conditions are poor, whatever. And Purpose should also be self-explanatory: you feel like your work has some value beyond money. But these masters of the universe don’t know what any of that means. They are well-supplied with autonomy but what are they masters of and what purpose do they serve besides the accumulation of more wealth? Why can’t these lottery winners just take their winnings and go home?

No, I don’t think any of these insular thinkers have read any of my stuff. The few mentions of land in this piece don’t mention split-rate taxation on commercial land, which is really where the action is. Putting a 2.5% tax on all privately-owned property is a non-starter in a city with a 1% rate that people are howling at having to pay.

The roots of this are in the preference for private goods over public ones: it even extends to the techno-utopian chariot of choice, the self-driving Tesla. Imagine making a car no one wants to drive and thinking you’re an automaker.

As anyone who has worked on the internet might appreciate, the better solution here is to make the road network into a real network, where the vehicles are packets, guided by an intelligent network that can receive the desired destination from each vehicle and manage all of them, merging and exiting them as the passengers require. Rather than waste time and effort designing cars that can read signs, why not build signaling into the signs and all other road infrastructure to guide the vehicles?

But that sounds like public investment, shared/common goods. They would never consider upgrading or innovating in the public space when they could simply create a private version they can control access to.

So this coterie of failsons and upward failing dreamers will continue to invest in systems and ideas that no one but themselves needs or wants. It may take the rest of this century before the effects of this poison are gone.

The best education reform money can buy … until next time

We are in the midst of a transformation in the education reform industry, it seems, as big money is set to do to local school board races what it’s done to state and national elections. One local race, for a volunteer position, already has more than $125,000 in donations, with $100,000 donated to a PAC in large amounts (multiple $10,000 and $15,000 contributions) vs less than $30,000, with an upper limit of $900, on the other side. Both candidates are running for a vacant seat, so there is no incumbency benefit there.

Since all of my political awareness started in the Reagan years, I’m forced to ask: what are they buying with that money? A school board seat is a volunteer position but demands a lot of time, with work sessions before most people’s work day begins as well as public meetings, all of which also require preparation. Who wants to work an extra 20 hours a week for no pay? The board has oversight authority over the superintendent but doesn’t have executive power. So at best it can guide and support a superintendent or, at worst, allow them to run amok, as we have seen with recent incumbents. But still worse, a majority could dismantle the public school system, all in the name of reform.

Local university science professor Cliff Mass has some idea what’s going on here. He sees this as the latest attempt to apply business management techniques to the classroom.

So let’s look into that.

If a classroom or school were to be run like a manufacturing plant, this assumes that the raw materials coming in are of a consistent quality or that there is a refinement process that gets them to that standard. I think anyone who has ever been to school knows that students do not enter school as identical lumps of clay to be stamped or molded into shape. So what’s the solution in this manufacturing-patterned model? I haven’t heard anyone in the education reform industry discuss assessments for all incoming students or remediation for students who are not quite ready. I hear a lot about the standards teachers will be held to but nothing about the responsibility for student preparedness.

That responsibility lies at home. Students in the elementary grades are in school for six hours each of the 180 days in a school year. Of that, when you subtract lunch, enrichment like art, music, health and fitness/PE, and recess (anyone who says you don’t get recess in the working world needs to explain how coffee shops in business districts stay in business), it’s about 5 hours of instructional time. Where are the kids for the balance of the 24 hours? Same place they were for the first five years: in their family’s care.

What happens in a business when customers are lined up at the register, can’t get through on the phone, or can’t pick up their work at the promised time? Businesses that want to stay in business increase staff or improve their systems. I never hear anything about increased family support or improved social services in the schools to address a student’s lack of readiness. An elementary school with 300+ students in this large urban district might have a counselor for an hour or two a week. Many schools don’t have a full-time art or music teacher, as was common 30 or 40 years ago, or a full-time nurse. We have fewer staff per student than when many of the loudest voices for reform were in school. How does making something worse improve outcomes?

One of the goals I often hear about is making schools more efficient but since there are no details given, I have to assume that means “cheaper.” So that means high student:teacher ratios. So we have varied levels of preparedness and no remediation process, and at the same time we think that having more kids per teacher is somehow better. Where I have a hard time with this is that the people backing these ideas are considered to be very smart. They’re almost always successful business people and that’s taken as a proxy for intelligence in our culture. But no one ever asks them to defend specifics like higher student to teacher ratios as the means to improving education outcomes. If someone ran a grocery store like this, shutting down cashier stations until customers started to leave, how successful would they be?

There seems to be some idea that processes are always improved by reducing the staff headcount. But there’s a point beyond which it doesn’t work. Nine women can’t make a baby in one month. But does anyone really think a teacher can reach and inspire and educate 30 kids as effectively as 20? Effectiveness is what we should be striving for, not efficiency. But how to measure that? The usual tests and assessments never seem to be enough. And those metrics point to teachers, not the students and their families, as the party most responsible for student outcomes.

Another idea that we hear a lot about is competition, that we need to make schools competitive. Why? Competitive with what? If you are going to set up a competitive structure and pit schools against each other, this means some students are going to go to second-rate schools. This doesn’t create winners, it creates losers. When did that become a desirable educational outcome?

What they mean is allowing for-profit schools to be set up alongside the public schools in publicly-funded and maintained buildings but with the freedom to ignore all the constraints of class size, of a centrally-mandated curriculum, and with the freedom to turn away students who might lower the school’s performance metrics. Why not just let public schools have that much autonomy?

In the past couple of years, a historically low performing middle school here in Seattle secretly tossed out the district-mandated math curriculum, the poorly-regarded Discovery Math series, and used the Saxon math textbooks. As a result, their math assessments were up sharply and the racial achievement gap narrowed. Kids who had struggled learned math better with those materials, based on objective measurements prescribed by the state. But the school had to break the rules to make that happen. You’d think reformers would be all over this as a vindication of their ideas. I’ve not heard anything. If anything, I suspect they would argue that this reinforces their attitudes toward teachers, a profound mistrust.

This mistrust of teachers and lack of respect for teaching as a professional discipline is a big component of the ed reform movement, an attitude that is unique in the developed world. Other countries and cultures respect teachers and parents expect their children to respect their teachers and value education. This growing antipathy toward public school teachers, beginning almost 100 years ago, is documented in Bryce Nelson’s book Good Schools: The Seattle Public School System, 1901-1930. Anyone who hopes to serve on the board or work in education policy who hasn’t read this isn’t ready yet. What Nelson found is that in the post-WWI era, just like today, there was increased pressure to make schools more efficient and accountable. This was an outgrowth of Taylorism, the time and motion studies fad that turned proud craftsmen into dissatisfied wage slaves. If you really want to know what teachers do all day, it seems like a simple problem to solve: go sit in some classrooms or help a teacher do kindergarten assessments and you’ll learn a lot more than you will protesting about the budget at a board meeting.

This devaluation of teaching as a profession is why we get things like Teach for America. It sounds like a great idea, taking recent college grads who haven’t yet begun their careers and putting them into classrooms in underserved areas. But what does it really say? It tells me that some people think that these five-week wonders are good enough, that people who have gotten a four-year degree in education and then a post-graduate certification are not really worth it. There seems to be a pretty obvious disconnect there: we are concerned about underserved student populations but we don’t want to assign university-trained teachers to those schools.

Today’s teachers are trained to a higher standard than ever and at the same time education has become more complex, as we learn more about learning styles, cognitive and auditory issues that affect learning. Maybe if we valued professional educators as professionals, like engineers, nurses, even business school graduates, we wouldn’t have underserved student populations. If these educational reformers are serious about things like competition and autonomy, as they seem to be with charter schools, why aren’t they advocating for recruitment incentives in hard to fill positions? We know the answer already: Teach for America is cheaper and it furthers the goal of undermining professional educators. No one will admit that’s the goal but the net effect is the same.

As long as the public holds teachers accountable for student performance but not the parents or society as a whole, we’ll keep hearing these people talk endlessly about what’s wrong with the schools, never acknowledging that schools are a mirror of society. If you think your schools are broken, you might be right, but that’s what doctors call “referred pain”: the cause of the pain isn’t where it hurts. But we keep applying fixes and wondering why they don’t take.

Barack Obama as the last President of the United States?

With his own claims to originalism fading fast, Scalia suggests liberal judicial activism, practiced by some of his colleagues on the Court, is part of what brought about the Holocaust in Nazi Germany. The speech was an address to the Utah State Bar Association.

[From Peak Scalia | TPM Editors Blog]

I wonder if this isn’t the wrong comparison to make, though I’m not looking through the lens of judicial activism.

Consider Barack Obama:
• as the inheritor of a lot of policies and mechanisms he claims to oppose (like domestic surveillance or inequality): whether or not he does and what he could actually do about it is another discussion
• with his non-aggressive (not to say passive) disposition (see above, what he could do about it vs someone more confrontational)
• his political disadvantages (minority in the House, bizarre anti-productive rules in the Senate, passive to oppositional big media)
• huge military expenditures, including politicized defense expenditures for weapons that will never be used (we’re buying tanks? Why? Do we need a navy larger than the next several nations’ fleets combined? And why the F-35, other than contracts and jobs it represents?)
• a massive state surveillance operation of which we have no idea the real cost, and the yet to be weighed divisive social effects
• and the current economic morass with consolidation of power and wealth in every industry and the resulting inequality.

Who do you think of as a 20th Century political figure, given that description?

The name Mikhail Gorbachev ring a bell? He rose to power just as the wheels were coming off the wagon and there wasn’t a lot he could do about it. He inherited a state that had been hollowed out by ideology-driven economic policies and ruinous military spending. He ended up presiding over the dissolution of an empire, as Russia became independent of the USSR along with the other nations that had been “unified” into it — Ukraine, Belarus, the central Asian states, etc.

There’s a lot of muttering about secession or breaking the country up, letting intransigent states or regions go their own way, but there’s an assumption of choice to those arguments, of a new CSA breaking away. But what if as a precursor to or result of that happening, the federal government was rendered powerless, through budgetary hijinks or other political stunts (maybe congress members from the intransigent states refuse to return to DC or some other personal veto)?

Crazy? Possibly. Unbelievable? I’m not so sure. I continually find myself thinking on that memo from Buchanan to Nixon on the “bigger half” [1][2] and am reminded there are people — a lot of people — who would break the country up rather than see it accommodate ideas or people they oppose. That and referring to Frank Herbert’s Dune as a political text: “He who can destroy a thing, controls a thing.” This has been the SOP of the Republican congress since the Gingrich era, though he was more loyal to the idea of a continuing United States than many of his successors.

1. http://blogs.telegraph.co.uk/news/timstanley/100163806/after-wisconsin-and-north-carolina-america-has-its-silent-majority-now-it-needs-a-nixon/

2. http://books.google.com/books?id=u7n3MMmktssC&pg=PA606&lpg=PA606&dq=buchanan+%22bigger+half%22&source=bl&ots=jgGr5VD-_8&sig=7s4xNXO1IAL_GEkxF5B9HhcV2xo&hl=en&sa=X&ei=XCzsUZumBKepiAKi6IFI&ved=0CGkQ6AEwCQ#v=onepage&q=buchanan%20%22bigger%20half%22&f=false

90 degrees with no air conditioning? I need some lemonade.

Little heat wave here at present so we need to stay hydrated. You can only drink so much water and beer isn’t always the best choice if you have work to do. So lemonade it is. And none of that powdered mix. Do you even know what’s in that stuff?

  • 1/2 cup sugar
  • 1 cup lemon juice
  • 5 cups of water

Stir, chill, and enjoy.

You can go to a full cup on the sugar but I find it too sticky and sweet.

That was easy, wasn’t it? So many things we are expected to buy ready-made or pre-packaged that are no better and are not much more convenient. Do you really need to buy microwave popcorn in those little bags with those nasty chemicals they use as flavorings?

Building uniconvertor on OS X, post Snow Leopard

I have been doing some work with laser cutting and to that end, I need to translate files from Inkscape’s native SVG format to other formats, like plt or eps. For quite some time now, this has been failing, as the uniconvertor team — who supply the internal translation functions — have let their code base fall into decay.

Turns out the trick isn’t building uniconvertor, but the underlying library, sk1lib, where the real work gets done. For some reason, sk1lib doesn’t come with the developer-supplied distribution, even though the SK1project offers binary packages for Windows and umpteen variants of Linux. So you can successfully install uniconvertor but it doesn’t check dependencies, so you won’t know it doesn’t work until you try to run it. Annoying, that. They haven’t been particularly responsive, either. Too busy working on 2.0, which no one will care about if 1.x is broken.

I wish I had saved my error messages and other debris to better explain all this but it came together pretty quickly before I knew I was on the right track.

You’ll need the following distributions:

Install the tools, if you don’t have them. Make sure they are up to date. I found I had to rip Xcode out and replace it with this toolchain to get things working. There were some issues with llvm/clang that seemed to clear up after I did that. Next, download, build and install FreeType2. I found this symlink was needed:
ln -s /usr/local/include/freetype2/freetype/ /usr/include/freetype

If I remember where I found it, I’ll credit the poster, though it seems to be a pretty common workaround.

Next, lcms, the usual drill: ./configure, make install clean

Then, build and install sk1libs.

python ./setup.py build; python ./setup.py install --record installed-files.txt

The installed-files.txt is just a list of the files that get installed, in lieu of an actual package manager.
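That record file also doubles as a crude uninstaller: feed the list back to rm. A sketch, assuming none of the recorded paths contain spaces, demonstrated here on throwaway files rather than a real install:

```shell
# Stand-ins for an installed package's files.
mkdir -p demo
touch demo/a.py demo/b.py
printf 'demo/a.py\ndemo/b.py\n' > installed-files.txt

# The uninstall step: remove every file named in the record.
xargs rm -f < installed-files.txt
```

Against a real install you’d run the xargs line as root if setup.py installed system-wide, then clean up any empty directories by hand, since --record only lists files.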

And then this patch has to be applied to sk1libs/src/utils/fs.py, either before installation or after.

224c224
<       return ['/System/Library/Fonts', '/Library/Fonts', os.path.expanduser("~/Library/Fonts")]
---
>       return ['/',]
314c314
<
---
>

Without the patch, the program will read every file on your system into some list, for a purpose known to no man. The comments point to python 1.5 so I suspect it’s long overdue for review and refactoring. Credit for this goes to Valerio Aimale.

Credit for this discovery goes to the Inkscape developers who are trying to get the OS X releases of Inkscape in parity with Windows and Linux.

Finally, uniconvertor-1.1.5 — python ./setup.py build; python ./setup.py install --record installed-files.txt

And that should do it. Test it out.

next steps

Following up on this post I realize I have a very serious problem: the lack of self-esteem or value makes it hard to sell yourself or your ideas. If your underlying belief system is that everything you say or do is of no consequence, it makes it hard to get through the interview process, assuming you can even get one.

If you don’t believe you deserve it, you won’t get it.

The best advice I could have given myself, had I realized it, is that relying on jobs that other people create and define is never going to work for me. Temperamentally and physically/biologically I’m better off doing my own thing. But then there it is: how do you sell whatever it is you’re doing or making if you don’t believe it’s any good? And how could it be good, if you made it?

Imagine how you would manage this if you were faced with having to find a whole new life for yourself in less than a month due to a failed domestic situation. If you’re a reasonably normal person, well-adjusted and comfortable with yourself, this may not be a big deal. You have friends or other resources.

But that’s not my situation. Thirteen years I have lived here and I have no network to draw on and not much in the way of local knowledge to use to navigate a new course. It’s going to be a rough stretch.

swap usage monitor

Wrote this little thing to keep an eye on how swap usage grows. I find that when it exceeds physical RAM, things get boggy.

#!/bin/sh
# Log a warning (and pop open the log) when swap usage exceeds installed RAM.
PATH=/usr/local/bin:/bin:/sbin:/usr/bin:/usr/sbin
LAST=`who -b | cut -c18-50`   # time of last reboot
# installed RAM in MB: system_profiler reports "Memory: N GB"
RAM=`system_profiler SPHardwareDataType | grep Memory | awk '{ print $2 * 1024 }' `
# total size of the swapfiles in MB
TOTAL=`du -m /var/vm/* | grep swap | awk '{total = total + $1} END {print total}'`
if [ ${TOTAL} -ge ${RAM} ]; then
logger "swap in use = ${TOTAL}, exceeds installed RAM (${RAM}), last reboot was ${LAST}, recommend reboot"
open /var/log/system.log # this opens the log in the Console application so you can see it/can't ignore it.
fi
exit 0

Though, to be fair, it’s not as bad as when I had some useless never-looked-at Dashboard widgets. That was a performance killer. That discovery was inspired by this. I used to check this by simply using

du -sh /var/vm
6.0G    /var/vm

but that didn’t catch that there was a hibernation/sleepimage file in there.

-rw------T  1 root  wheel   4.0G May 22 01:12 sleepimage
-rw-------  1 root  wheel    64M May 22 04:05 swapfile0
-rw-------  1 root  wheel    64M May 22 04:05 swapfile1
-rw-------  1 root  wheel   128M May 22 04:05 swapfile2
-rw-------  1 root  wheel   256M May 22 04:05 swapfile3
-rw-------  1 root  wheel   512M May 22 04:05 swapfile4
-rw-------  1 root  wheel   1.0G May 22 04:05 swapfile5

That’s why I just add up the swapfiles themselves. The one thing I would add is a more informative display of the time since reboot: days since reboot would be more useful than the raw date. But that requires more jiggery-pokery with date(1) than I care to deal with. I’m sure some clever obfuscated perl could be cooked up but I want this to use only tools I know will be available.
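For what it’s worth, a sketch of the days-since-reboot calculation that needs only date +%s and some sed; it assumes kern.boottime’s “sec = N” output format, which I have not verified across OS versions:

```shell
#!/bin/sh
# Whole days between two epoch-second timestamps.
days_between() {
    echo $(( ($2 - $1) / 86400 ))
}

# kern.boottime prints something like "{ sec = 1369273860, usec = 0 } Wed May 22 ..."
BOOT=`sysctl -n kern.boottime 2>/dev/null | sed -n 's/.* sec = \([0-9][0-9]*\).*/\1/p'`
if [ -n "${BOOT}" ]; then
    NOW=`date +%s`
    echo "last reboot was `days_between ${BOOT} ${NOW}` day(s) ago"
fi
```

The sed pattern anchors on " sec = " (with the leading space) so it doesn’t accidentally grab the usec field.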

Update: this just went off (opened up the Console app) and displayed these messages:


May 22 21:59:04 ivoire com.apple.launchd[1] (com.apple.xpcd.F5010000-0000-0000-0000-000000000000[26487]): Exited: Killed: 9
May 22 21:59:04 ivoire kernel[0]: memorystatus_thread: idle exiting pid 26487 [xpcd]
May 22 22:04:32 ivoire com.apple.launchd.peruser.502[2211] (com.apple.cfprefsd.xpc.agent[26538]): Exited: Killed: 9
May 22 22:04:32 ivoire kernel[0]: memorystatus_thread: idle exiting pid 26538 [cfprefsd]
May 22 22:04:33 ivoire kernel[0]: (default pager): [KERNEL]: ps_select_segment - send HI_WAT_ALERT
May 22 22:04:34 ivoire kernel[0]: (default pager): [KERNEL]: ps_vstruct_transfer_from_segment - ABORTED
May 22 22:04:34 ivoire kernel[0]: (default pager): [KERNEL]: Failed to recover emergency paging segment
May 22 22:04:34 ivoire kernel[0]: macx_swapon SUCCESS
May 22 22:05:02 ivoire.paulbeard.org paul[26584]: swap in use = 4096, exceeds installed RAM (4096), last reboot was May 20 17:51 , recommend reboot

This — Failed to recover emergency paging segment — looks alarming. I doubt it is. It’s not new, in any case.

Something I learned today

It’s taken me about 40 years to fully (?) process this.

“J.T” is a simple, hour-long story of a young boy living in a New York ghetto, but it tackles some weighty issues.

[From J.T. Reviews & Ratings – IMDb]

I saw this movie right after it came out, so around 1970. The “weighty issues” it deals with are racism and poverty in mid 20th Century America but an 8 year old English boy living in Canada didn’t get any of that. You have to read a few more of the reviews to learn what I saw. And from what I can tell, I never saw the end of the movie, as you’ll see.

I saw it with my mother, in the front room of our house, and for some reason, I remember it as a summer afternoon, with long shadows everywhere. The storyline of the movie I remember was that a poor black kid finds a cat in an abandoned building and it becomes the center of his universe, something for him to love and care for, to look forward to being with. But some older boys who have nothing to love or care about find him sneaking into this old building and they catch him in the act of caring, of loving. They take the cat from him and play “keep away,” teasing and taunting, until one of them runs out to the street and slips, sending the cat into traffic where it is killed by a car, right in front of the young hero.

At this point, I burst into tears. All I saw was a small boy — like myself — who lost something precious due to the cruelty of others, out of the simple meanness and unempathetic jealousy of those who don’t know how to love.

My mother laughed at me for crying. She laughed at a child for expressing a natural emotion. She didn’t do it to minimize the effects or soften the blow. She offered no comfort, no compassion. She was no different from the boys on the screen, who hate to see anyone or anything receive love.

And that response to my openness, my willingness to openly feel, made me close up and hide that part of myself from the world. It made me fear rejection to the point where almost every decision I have made since then has been to avoid it. And to avoid rejection is to avoid life. It means not trying things, not risking exposure to the hurt that comes from being rejected.

My mother and I, if we were ever close, weren’t after that. Soon I was on my way to a new life in a new family in a new country south of the border, but that scabbed-over hurt stayed with me for years, many, many years. I expect the other changes only made me keep that part of myself wrapped up tight.

It was only in the past 2-3 years I would allow myself to openly express that kind of feeling, to let the tears flow. And only rarely and at home.

I saw my mother twice after that before she died in 2003, a span of 33 years, and neither experience was positive. No, we weren’t close. There’s more but it’s not relevant here.

I didn’t realize until today that there was more of the movie after that scene, so badly was I hurt at the time. I never saw it or remembered it, I guess. I knew there was something about that moment, frozen in my mind, but I never quite realized what it was, what it all meant.

This has been coming clearer the past few weeks, the realization that I have shut myself off from far too many experiences and opportunities but not understanding why.

People think saying “no” doesn’t cost anything. It does. It can cost you everything.

It’s been a cathartic day. Just recounting the story brings more than a tear to my eyes. When I put it all together this morning, I was in pretty bad shape. And I expect the next few days will be up and down as I come to grips with the understanding of what happened and what I can do now.

I’ve always wondered why Philip Larkin’s “This Be the Verse” resonated with me. What he describes is not unique to my experience but now that first line is going to stay with me, at least as far as an apt description of one of my parents. It’s never been far from my mind…maybe now I understand why.

Following up here.

I had this idea 30+ years ago

A new emerging concept known as hybrid solar lighting may offer an effective way of routing daylight deep into buildings. Using parabolic reflectors, direct sunlight can be concentrated on a smaller mirror which after removing most of the Infra red component (which can be extracted as electricity), reflects a very focused beam of visible light on to the end of a optical fibre bundle. This cooled beam of concentrated full spectrum natural light can then be routed into the interior of buildings for illumination. The hybrid design allows this additional lighting source to be mixed with back up lighting to create a dynamic system that always maximises the amount of natural light fed into the building.

[From Solar Power | Green Energy Jobs Career Guide]

Maybe not for task lighting but an easy win for hallways or ambient lighting.

I can recall when the idea came to me, around 1982, as I was walking along a corridor in an apartment/condo building in Florida. There were no windows but there were small wall sconces that radiated heat as I passed them. Perhaps it was the realization that there was all this heat and light outdoors, surrounding this air-conditioned darkness.

network tuning, OS X Leopard edition

I had occasion to fire up an old PPC iMac G5 (OS X 10.5.8) the other week and was appalled at how slowly it handled network access. So here’s what I did to fix it.

Per Scott Rolande, there are tunable values for many aspects of the TCP stack. Handily, they live in a text file and can be tinkered with interactively.

kern.ipc.maxsockbuf=4194304
kern.ipc.somaxconn=512
kern.ipc.nmbclusters=2048
net.inet.tcp.rfc1323=1
net.inet.tcp.win_scale_factor=3
net.inet.tcp.sockthreshold=16
net.inet.tcp.sendspace=262144
net.inet.tcp.recvspace=262144
net.inet.tcp.mssdflt=1440
net.inet.tcp.msl=15000
net.inet.tcp.always_keepalive=0
net.inet.tcp.delayed_ack=0
net.inet.tcp.slowstart_flightsize=4
net.inet.tcp.blackhole=2
net.inet.udp.blackhole=1
net.inet.icmp.icmplim=50

This machine didn’t have a sysctl.conf file, so I copied his and used it to pull out the current values.

for i in `cut -d= -f1 sysctl.conf`; do sysctl $i; done
kern.ipc.maxsockbuf: 8388608
kern.ipc.somaxconn: 128
kern.ipc.nmbclusters: 32768
net.inet.tcp.rfc1323: 1
net.inet.tcp.win_scale_factor: 3
net.inet.tcp.sockthreshold: 64
net.inet.tcp.sendspace: 65536
net.inet.tcp.recvspace: 65536
net.inet.tcp.mssdflt: 512
net.inet.tcp.msl: 15000
net.inet.tcp.always_keepalive: 0
net.inet.tcp.delayed_ack: 3
net.inet.tcp.slowstart_flightsize: 1
net.inet.tcp.blackhole: 0
net.inet.udp.blackhole: 0
net.inet.icmp.icmplim: 250

A little different. Not sure why kern.ipc.maxsockbuf is so much higher on an old machine that maxes out at 2Gb of RAM…

To test throughput, I needed a test file.
hdiutil create -size 1g test.dmg
.....................................................................................................................................
created: /Users/paul/test.dmg

Over wireless G on a mixed wireless N/G network to a wired 100 Mbit host on a Gigabit switch, it managed a stately 12 Mbits/second.

Twelve minutes (12m19.024s) later:
sent 1073872981 bytes received 42 bytes 1452160.95 bytes/sec
total size is 1073741824 speedup is 1.00
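For the curious, rsync’s reported rate squares with the wall clock; a quick arithmetic check, treating 12m19s as roughly 739 seconds:

```shell
# 1073872981 bytes over ~739 seconds:
echo $(( 1073872981 / 739 ))                 # bytes/sec, close to rsync's 1452160.95
echo $(( 1073872981 * 8 / 739 / 1000000 ))   # roughly 11-12 Mbit/s
```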

Oy. Now to try it to a wireless destination, a 10.8.3 machine.

Hmm, interestingly, OS X handles rsync transfers a little differently: it blocks out space equivalent to the size of the file. Checking the file during the transfer, du tells a different story than ls. As you can see, the reported size never changes:

du -h .test.dmg.GsCjdW; sleep 10 ; du -h .test.dmg.GsCjdW
1.0G .test.dmg.GsCjdW
1.0G .test.dmg.GsCjdW

Using ls -l shows the actual size of the file, not the disk space set aside for it.
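The du-versus-ls distinction is easy to demonstrate with an ordinary sparse file, where the two disagree in the opposite direction: ls (or wc -c) reports the full logical length while du counts only the blocks actually allocated. A quick illustration, nothing rsync-specific about it:

```shell
# Write one byte at offset 1MB-1, producing a 1MB file that
# occupies almost no disk blocks.
dd if=/dev/zero of=sparse.img bs=1 count=1 seek=1048575 2>/dev/null

wc -c < sparse.img   # logical length: 1048576 bytes
du -k sparse.img     # allocated space: a handful of KB on sparse-aware filesystems
```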

Still slow: sent 1073872981 bytes received 42 bytes 1428972.75 bytes/sec

Took 12m30.961s, the difference being that the machine went to sleep (out of boredom?).

After changing the various sysctl OIDs, things got much worse.

This is what I have on the 10.8.3 system:
kern.ipc.maxsockbuf: 4194304
kern.ipc.somaxconn: 1024
kern.ipc.nmbclusters: 32768
net.inet.tcp.rfc1323: 1
net.inet.tcp.win_scale_factor: 3
net.inet.tcp.sockthreshold is not implemented
net.inet.tcp.sendspace: 2097152
net.inet.tcp.recvspace: 2097152
net.inet.tcp.mssdflt: 1460
net.inet.tcp.msl: 15000
net.inet.tcp.always_keepalive: 0
net.inet.tcp.delayed_ack: 0
net.inet.tcp.slowstart_flightsize: 1
net.inet.tcp.blackhole: 0
net.inet.udp.blackhole: 0
net.inet.icmp.icmplim: 250

A 1Gb transfer takes too long (which of course is the problem) so I made a couple of small changes and tried a 100Mbit file. Down to 13 seconds. Hmm, not bad. The changes:
sysctl -w net.inet.tcp.sendspace=4194304
sysctl -w net.inet.tcp.recvspace=4194304
sysctl -w net.inet.tcp.mssdflt=1460

I set net.inet.tcp.[send|recv]space to be half of kern.ipc.maxsockbuf and made the net.inet.tcp.mssdflt match the receiving system.

Now a 1Gb test file takes 53.287s. Copying from 10.8.3 to 10.5.8 took just 31.215s. After synchronizing the net.inet.tcp.mssdflt on the system I first tested with, transfers to there are down to 1m47.471s.

So some big improvements for not much effort. I’m sure there are lots of other tweaks but given the relatively little need for more improvement and the limited potential (the old 10.5 system on wireless G is frozen in time while the newer wireless N machines will see further improvements), I don’t know that I’ll bother. A twelve-fold increase in one direction and a 24-fold boost going the other way is pretty good. If I really cared, i.e., this was something I expected to do regularly, I’d run a Cat5 cable to it and call it done.
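For the values to stay put across reboots, this vintage of OS X reads /etc/sysctl.conf at startup, so the changed OIDs can be staged in a file and copied into place. A sketch (the copy into /etc needs root, so it is left as a comment):

```shell
# Stage the three changed OIDs in the same key=value format sysctl.conf uses.
cat > sysctl.conf.new <<'EOF'
net.inet.tcp.sendspace=4194304
net.inet.tcp.recvspace=4194304
net.inet.tcp.mssdflt=1460
EOF

# Then, as root:
#   cp sysctl.conf.new /etc/sysctl.conf
```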

After a reboot to ensure the values stay put, I tested different copy methods as well, all with the same 1Gb file.

from the 100Mbit wired machine using rsync: 0m56.349s

same to/from, using scp -C for compression (since I used rsync -z): 1m40.794s

from the 10.8.3 system to the 10.5 system with scp -C: 1m35.228s

from the 10.8.3 system to the 10.5 system with rsync -z: 0m24.734s (!!)

from the 10.5 system to 10.8.3 with rsync -z: 0m38.861s

So even better after the reboot. Could be other variables in there as well. I’m calling it done.

UPDATE: the morning after shows a different story. I was puzzled that snmp monitoring wasn’t working so I took a look this morning and things are slow again, down to 5 Mbits/second from the 12 I considered poky. At this point, I’m not sure how reliable the benchmark data was or at least how I was evaluating it.

I’ll have to investigate further. I created some more test media by splitting up the 1Gb file into smaller ones, so I have a pile of 100Mbit and 10Mbit files as well. Part of the optimization I am looking for is good throughput for large files as well as being able to handle smaller files quickly. Large buffers and access to a good sized reserve of connections, in other words.