Kadin2048's Weblog


Thu, 20 Oct 2016

For all the stupidity of the current Presidential election, one interesting discussion that it has prompted is a resurrection of the old debate over nuclear strategy, and particularly the strategy of “launch under attack” (better known as “Launch On Warning”). Jeffrey Lewis has an article in Foreign Policy, “Our Nuclear Procedures Are Crazier Than Trump”, which ties this into current events, prompted by recent statements by both candidates.

Much of the discussion in the last 24 hours has centered on whether Hillary Clinton inadvertently disclosed classified information when she mentioned, during the third debate, that the President would have only “four minutes” to decide on whether to respond in the event of a large-scale attack on the continental U.S. by an adversary. This is not, at least to me, a particularly interesting discussion; nothing Clinton said goes beyond what is in the open literature on the topic and has been available for decades.

What is interesting is that, in 2016, we’re talking about Launch On Warning at all. Clinton’s “four minutes” should be a thing of the past.

I mean: the other President Clinton supposedly moved the U.S. away from LOW in a 1997 Presidential Directive, instead putting U.S. forces on a stance of second-strike retaliation only after actually being on the receiving end of a successful attack. This is a reasonable posture, given that the U.S. SSBN force alone has enough destructive power to serve, independently of the rest of the ‘nuclear triad’, as a reasonable deterrent against a first strike by another global power.

What’s interesting is that, at the time, the Clinton administration downplayed the move and said that it was merely a continuation of existing policy dating from the Reagan years and expressed in previous PDDs. A Clinton spokesperson reportedly said at the time: “in this PDD we direct our military forces to continue to posture themselves in such a way as to not rely on launch on warning—to be able to absorb a nuclear strike and still have enough force surviving to constitute credible deterrence.” (Emphasis mine.)

The actual Presidential Directives are, as one might expect, still classified, so we don’t have a lot other than hearsay and the statements of various spokespeople to go off of. But it would appear safe to say that the U.S. has not depended on LOW since at least 1997, and probably since some point in the 80s. I think it’s likely that the original change was prompted by a combination of near-miss events in the 1970s (e.g. Zbigniew Brzezinski’s infamous 3 A.M. wakeup call on November 9, 1979), plus the maturation of the modern SSBN force into a viable second-strike weapon, which together caused U.S. leaders to question the wisdom of keeping the nuclear deterrent on a hair trigger. As well they probably should have, given the risks.

In fact, being able to lower the proverbial hammer and relax the national trigger finger somewhat is probably the biggest benefit of having an SSBN force. It’s why other nuclear powers, notably the U.K., have basically abandoned ground-based nuclear launch systems in favor of relying exclusively on submarines for deterrence. The U.K., famously, issues “Letters of Last Resort” to their submarine captains, potentially giving them launch authority even in the absence of any external command and control structure — ensuring a retaliatory capability even in the event of complete annihilation of the U.K. itself. While this places a lot of responsibility on the shoulders of a handful of submarine captains, it also relieves the entire U.K. defense establishment from having to plan for and absorb a decapitation attack, and it certainly seems like a better overall plan than automated systems that might be designed to do the same thing.

In the U.S. we’ve never gone as far as the U.K. in terms of delegation of nuclear-launch authority (perhaps because the size of the U.S. nuclear deterrent would mean an unacceptable number of trusted individuals would be required), but it’s been a while since any President has needed to decide, in a handful of minutes, whether to end the world or face unilateral annihilation. The President would still potentially need to decide whether to authorize a U.S. ICBM launch in that very short window of time, but they wouldn’t lose all retaliatory capacity if they chose not to, and it is difficult to imagine — given the possibility of, and actual past experience with, false alarms — that a sane president would authorize a launch before confirmation of an actual attack on U.S. soil.

So why did the “four minute” number resurface at all? That’s a bit of a mystery. It could have just been a debate gambit by Clinton, which is admittedly the simplest explanation, or perhaps the idea of Launch On Warning isn’t completely gone from U.S. strategic policy. This is not implausible, since we still maintain a land-based ICBM force, and the ICBMs are still subject to the first-strike advantage which produced Launch On Warning in the first place.

And rather than debating the debate, which will be a moot point in a very few weeks, the real question we ought to be asking is why we bother to maintain the land-based strategic nuclear ICBM force at all.

Here’s a modest proposal: retire the ICBM force’s strategic nuclear warheads, but retain the missile airframes and other launch infrastructure. Let other interested parties observe the nuclear decommissioning, if they want to, so that there’s no mistaking a future launch of those missiles as a nuclear one. And then use the missiles for non-nuclear Prompt Global Strike or a similar mission (e.g. non-nuclear FOBS, “rod from God” kinetic weapons, or whatever our hearts desire).

It ought to make everyone happy: it’s that many fewer fielded nuclear weapons in the world; it eliminates the most vulnerable part of the nuclear triad and moves us firmly away from LOW; it doesn’t take away any service branch’s sole nuclear capability (the Air Force would retain air-launched strategic capability, as a hedge against future developments making the SSBN force obsolete); and it would trade an expensive and not-especially-useful strategic capability for a much-more-useful tactical one. In the long term, it could even allow the U.S. to draw down overseas-deployed personnel and vulnerable carrier strike groups while retaining rapid global reach.

It makes too much sense to ever actually occur, of course, at least not during an election season.

0 Comments, 0 Trackbacks

[/politics] permalink

Fri, 09 Sep 2016

This was originally posted to Hacker News as a comment in a discussion about “microhousing”. The question I was responding to was:

What is NIMBY for microhousing based on?

This is an ongoing argument in Northern Virginia (which is not quite as expensive as SF / Seattle / NYC, but probably only one cost tier below that) over micro-housing, typically in the form of backyard apartments and the subdivision of single-family homes into boarding houses, and the major arguments are basically the same issues that apply to all “just build more housing, stupid” proposals.

Basically, if you suddenly build a lot more housing, you’d start to strain the infrastructure of the community in other ways. That strain is really, really unpleasant to other people who share the infrastructure, and so current residents — who are often already feeling like things are strained and getting worse over time — would rather avoid making things worse. The easiest way to avoid making things worse is just to control the number of residents, and the easiest way to do that is to control the amount of housing: If you don’t live here, you’re probably not using the infrastructure. QED.

In many ways, building more housing is the easiest problem to solve when it comes to urban infrastructure. Providing a heated place out of the rain just isn’t that hard, compared to (say) transportation, or schools, or maintaining a sustainable economic balance.

Existing residents are probably (and reasonably) suspicious that once a bunch of tiny apartments are air-dropped in, and a bunch of people move in to fill them up, there won’t be any solution to any of the knock-on problems that will inevitably result — parking, traffic, school overcrowding, tax-base changes, stress to physical infrastructure like gas/water/sewer/electric systems — until those systems become untenably broken. I can’t speak to Seattle, but those things are already increasingly severe problems today, with the current number of residents, in my area, and people don’t have much faith in government’s ability to fix them; so the idea that the situation will somehow improve once everyone installs a couple of backyard apartments is ridiculous. (And then there are questions like: how are these backyard apartments going to be taxed? Are the people who move in really going to pay more in taxes than they consume in services and infrastructure impact, or is this going to externalize costs via taxes on everyone else? There’s no clear answer to these questions, and people are reluctant to become the test case.)

If you want more housing, you need more infrastructure. If you want more infrastructure, either you need a different funding model or you need better government and more trust in that government. Our government is largely (perceived to be) broken, and public infrastructure is (perceived to be) broken or breaking, and so the unsurprising result is that nobody wants to build more housing and add more strain to a system that’s well beyond its design capacity anyway.

That’s why there’s so much opposition to new housing construction, particularly to ideas that look just at ways to provide more housing without doing anything else. You’re always going to get a lot of opposition to “just build housing” proposals unless they’re part of a compelling plan to actually build a community around that new housing.


Fri, 26 Aug 2016

Bruce Schneier has a new article about the NSA’s basically-all-but-confirmed stash of ‘zero day’ vulnerabilities on his blog, and it’s very solid, in typical Bruce Schneier fashion.

The NSA Is Hoarding Vulnerabilities

I won’t really try to recap it here, because it’s already about as concise as one can be about the issue. However, there is one thing in his article that I find myself mulling over, which is his suggestion that we should break up the NSA:

And as long as I’m dreaming, we really need to separate our nation’s intelligence-gathering mission from our computer security mission: we should break up the NSA. The agency’s mission should be limited to nation state espionage. Individual investigation should be part of the FBI, cyberwar capabilities should be within US Cyber Command, and critical infrastructure defense should be part of DHS’s mission.

Far be it from me to second-guess Schneier on most topics, but that just doesn’t seem to make a whole lot of sense. If the key problem is that vulnerabilities are being hoarded for offensive use rather than being shared with manufacturers (defensive use), it doesn’t seem like splitting those two missions into separate agencies is going to improve things. And the predictable result is that we’re then going to have two separate agencies working against one another, doing essentially the same research, looking for the same underlying vulnerabilities, for different aims. That seems… somewhat inefficient.

And if history is any guide, the U.S. will probably spend more on offensive armaments than on defense. Contrary to the Department of Defense’s name, since the end of WWII we have based our national-defense posture largely on a policy of force projection and deterrence-through-force, and I am highly skeptical that, as a nation, we’re going to suddenly take a different tack when it comes to “cyberwarfare” / IT security. The tension between offense and defense isn’t unique to IT: it exists in lots of other places, from ICBMs to vehicle armor, and in most cases U.S. doctrine emphasizes the offensive, force-projective capability. This is practically a defining element of U.S. strategic doctrine over the past 60 years.

So the net result of Schneier’s proposal would probably be to take the gloves off the NSA: relieve it of the defensive mission completely, giving it to DHS — which hardly seems capable of taking on a robust cyberdefense role, but let’s ignore that for the sake of polite discussion — but almost certainly emerge with its funding and offensive role intact. (Or even if there was a temporary shift in funding, since our national adversaries have, and apparently make use of, offensive cyberwarfare capabilities, it would only be a matter of time until we felt a ‘cyber gap’ and turned on the funding tap again.) This doesn’t seem like a net win from a defense standpoint.

I’ll go further, admittedly speculation: I suspect that the package of vulnerabilities (dating from 2013) that are currently being “auctioned” by the group calling themselves the Shadow Brokers probably owe their nondisclosure to some form of internal firewalling within NSA as an organization. That is to say, the sort of offensive/defensive separation that Schneier is seemingly proposing at a national level probably exists within NSA already and is related to why the zero-day vulnerabilities weren’t disclosed. We’ll probably never know for sure, but it wouldn’t surprise me if someone was hoarding the vulnerabilities within or for a particular team or group, perhaps in order to prevent them from being subject to an “equities review” process that might decide they were better off being disclosed.

What we need is more communication, not less, and we need to make the communication flow in a direction that leads to public disclosure and vulnerability remediation in a timely fashion, while also realistically acknowledging the demand for offensive capacity. Splitting up the NSA wouldn’t help that.

However, in the spirit of “modest proposals”, a change in leadership structure might: currently, the Director of the NSA is also the Commander of the U.S. Cyber Command and Chief of the Central Security Service. It’s not necessarily clear to me that having all those roles, two-thirds of which are military and thus tend to lean ‘offensive’ rather than ‘defensive’, reside in the same person is ideal, and perhaps some thought should be given to having the NSA Director come from outside the military, if the goal is to push the offensive/defensive pendulum back in the opposite direction.


Tue, 23 Aug 2016

About fifty pages into John Bruce Medaris’s 1960 autobiography Countdown for Decision, there is an unsourced quote attributed to Col. C.G. Patterson, who in 1944 was in charge of Anti-Aircraft Artillery for the U.S. First Army, outlining the concept of a “technological casualty”:

“If a weapon costs more to build, in money, materials, and manpower, than it costs the enemy to repair the damage the weapon causes, the user has suffered a technological casualty. In any long-drawn-out struggle this might be the margin between victory and defeat.” 1

As far as I can tell, the term “technological casualty” never passed into general usage with that meaning, which is unfortunate. And although sources do confirm that Col. Patterson existed and by all accounts served admirably as the commander of air defense artillery for First Army in 1944, there doesn’t appear to be much record of the quote outside of Medaris’s book. Still, credit where it is most likely due; if ever a shorthand name for this idea is required, I might humbly suggest “Patterson’s Dictum”. (It also sounds good.)

I suspect, given Patterson’s role at the time, that the original context of the quote had to do with offensive or defensive air capability. Perhaps it referred to the attrition of German capability that was at that point ongoing. In Countdown, Medaris discusses it in the context of the V-2, which probably consumed more German war resources to create than they destroyed of Allied ones. But it is certainly applicable more broadly.

On its face, Patterson’s statement assumes a sort of attritional, clash-of-civilizations, total-commitment warfare, where all available resources of one side are stacked against all available resources of the other. One might contend that it doesn’t have much applicability in the age of asymmetric warfare, now that we have a variety of examples of conflicts ending in victory — in the classic Clausewitzian political sense — for parties who never possessed any sort of absolute advantage in money, materials, or manpower.

But I would counter that even in the case of a modern asymmetric war, or realpolitik-fueled ‘brushfire’ conflicts with limited aims, the fundamental calculus of war still exists, it just isn’t as straightforward. Beneath all the additional terms that get added to the equation is the essential fact that defeat is always possible if victory proves too expensive. Limited war doesn’t require you outspend your adversary’s entire society, only their ‘conflict budget’: their willingness to expend resources in that particular conflict.

Which makes Patterson’s point quite significant: if a modern weapons system can’t subtract as much from an adversary’s ‘conflict budget’ — either through actual destructive power, deterrence, or some other effect — as it subtracts from ours in order to field it (including the risk of loss), then it is essentially a casualty before it ever arrives.
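Patterson’s Dictum reduces to a simple cost-exchange inequality. Here’s a minimal sketch; the V-2 figures below are illustrative placeholders of my own, not numbers from Medaris or any historical source:

```python
def is_technological_casualty(cost_to_field, cost_inflicted_on_enemy):
    """Patterson's Dictum: a weapon that subtracts more from its user's
    'conflict budget' than it subtracts from the adversary's is a
    casualty before it ever reaches the field."""
    return cost_to_field > cost_inflicted_on_enemy

# The V-2 is the canonical example: enormously expensive to build,
# relatively modest in the damage each one caused.
# (Illustrative placeholder figures, in arbitrary resource units.)
v2_unit_cost = 100_000        # resources consumed building one missile
v2_damage_inflicted = 40_000  # resources the strike forces the enemy to spend

print(is_technological_casualty(v2_unit_cost, v2_damage_inflicted))  # True
```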

1: Countdown for Decision (1960 ed.), page 51.


Wed, 10 Aug 2016

“America’s Intentional Broadband Duopoly” by Dane Jasper, writing on the blog of Sonic.net Inc., an ambitious Gigabit ISP, is one of the best summaries of why US broadband is the way it is that I’ve read. If you live in the US and use the Internet, it’s worth reading, just to understand why your Internet access options suck so damn badly compared to the rest of the civilized world.

Spoiler Alert: It is not, as telco / cableco apologists sometimes attest, a function of geography or population density — there are ample examples of countries with more challenging geography and less-dense populations that have far better, and cheaper, Internet service. (And population density is really a red herring anyway, once you realize that most of the US population lives in areas that are pretty dense, like the Eastern Seaboard, which is comparable to Europe.) The answer is a sad combination of political lobbying, regulatory capture, and technological false promises.

In case their site goes down at some point in the future, here’s a link to the Internet Archive’s cached version.

Via MetaFilter.


Tue, 09 Aug 2016

All the way back in 2008 — you remember 2008, right? Back when oil hit $100/barrel for the first time, and a whole bunch of Americans thought Russia had invaded the Peach State, and who can forget the International Year of the Potato? — two days after the election, I wrote the following:

About the only positive aspect of [the Democrats’ victory] that I can find, is that it might represent the death knell of the far-right, authoritarian “conservatives” that have monopolized the GOP brand for too long. […] The far-right just isn’t socially mainstream enough to form the core of a majority political party.

I stand by that statement, by the way, even in the face of Trump; what Trump shows is that a dedicated, passionate minority can get a basically-unelectable candidate all the way to the general election.

But it’s disheartening that the lesson the Republican establishment learned from 2008 wasn’t “don’t let the inmates run the asylum”, but instead was, seemingly, “don’t pick Sarah Palin as a running mate.” (To their credit, nobody has repeated that particular mistake as far as I know.)

As Trump slides towards a 10-point gap behind Clinton, and has almost certainly alienated blue-collar white voters in key swing states like Pennsylvania with his anti-military rants, it will be interesting to see whether the GOP as a party finally learns a more general lesson about the disconnect between primary voters and the rest of the country, or if — like the aftermath of 2008 — they manage only to add one more mistake to the long list of things they won’t do again.


Thu, 04 Aug 2016

It looks like the honeymoon, if there ever really was one, is over for Candidate Trump, and people are seriously starting to consider whether it would be better for the Republican party if he just lost the election.

Writing in The Guardian, Katrina Jorgensen spells it out:

[F]or the party to come back strong after Donald Trump’s divisive candidacy […] the least-worst option is a major loss in the presidential race.

The key word here is “major”. Intentionally or not, Trump has signaled with his ‘rigged election’ comments that a narrow loss wouldn’t necessarily be a clear sign to sit down and shut up.

If Trump only trails [Clinton] by a few points, you can bet he will blame the Republicans who voted their conscience. Or he’ll kick up dirt over the “rigged” system, as he has already alluded to. Trump supporters in the party will go on a witch-hunt […] Only a loss by a wide margin would send a clear message to the Republican party: this is the wrong choice for America.

Basically, Republicans need to cordon off Trump from the rest of the party and in particular from down-ballot Senate elections. Barring an unexpected retreat by Trump himself, which seems unlikely, the Presidency is essentially a lost cause — but the House and Senate are not. Trump’s increasingly bizarre behavior may actually help differentiate other candidates from him, and make it more difficult for Democrats to use him as leverage, because he is simply that clearly divorced from the rest of the party’s mainstream candidates.

Then, the party needs to give some serious thought to its primary system. Ironically, it wouldn’t be surprising if the Republicans end up with the same sort of superdelegate-heavy system that the Democrats implemented, and which basically doomed the Sanders campaign in favor of the safe (but unpopular) Clinton in their own primary this year. So the strategy is certainly not without risks. But the general election, if it led to a lopsided Republican defeat by Clinton, would show that the failure mode of the superdelegate-heavy, establishment-driven primary system is preferable to the failure mode of the populist-driven system the Republicans currently use.

As Paul Ryan said earlier today, “[Republicans] are a grass-roots party; we aren’t a superdelegate party.” One can only wonder if perhaps he’s wishing that wasn’t the case.


Wed, 03 Aug 2016

Years ago I came across a piece by a journalist named Alex Steffen called Night, Hoover Dam. It summed up a lot of feelings that I had about the “survivalist fallacy”, to the point where I even wrote a blog post about it back in May 2008.

It was originally hosted on a site called ‘Worldchanging.org’, an environmental website which apparently got acquired and subsequently killed in 2011. This is a shame, because there was a lot of good content there, and I can’t imagine it would have cost them much to keep it going. But thankfully, we have the Internet Archive, and so the piece itself wasn’t lost for good.

Here’s the archived version: https://web.archive.org/web/20160111223335/http://www.worldchanging.com/archives/001413.html

It’s still worth a read.


Mon, 25 Jul 2016

As a result of an interesting link on Hacker News, specifically to a post on Alex Wellerstein’s blog “Nuclear Secrecy” called “A brief history of the nuclear triad” — which is a good read and thoroughly recommended — I discovered the text of Herb York’s 1970 autobiography ‘Race to Oblivion: A Participant’s View of the Arms Race’ online as HTML. I can only hope that the online text is legal, because the book is otherwise unavailable except as used copies, and it is certainly still relevant.

The book seems to be typically only read or studied by those in classes dealing with arms control or strategic policy, which is a bit unfortunate as there’s quite a few gems in there, completely aside from the book’s stated purpose.

In particular, the author mentions something (in chapter 5, marked as page 91) about the defense budget that anyone who has worked in the Federal sphere can probably relate to:

Defense planning is full of arbitrary figures and figurings that have been thoroughly rationalized only after the fact. The number of units of many types of equipment is almost as arbitrary; so are the total numbers of men in the various services; and hence so is the total defense budget itself. I would say that the defense budget is arbitrary by at least a factor of two. The fierce arguments that can break out over a cut of, say, five percent have their origins in the very great difficulties of making changes in large tradition-bound systems and not in the fact that the numbers as they originally stood were correct in any absolute sense. Thus, the real reason that this year’s defense budget is so and so many billion dollars is simply that last year’s defense budget was so and so many billion, give or take about five percent. The same thing, of course, applies to last year’s budget and the budget of the year before that. Thus the defense budget is not what it is for any absolute reason or because in any absolute sense the total cost of everything that is supposedly truly needed comes out to be precisely that amount, but rather it is the sum total of all the political influences that have been applied to it over a history of many years, and that have caused it to grow in the way that it has grown.

Or, to borrow the technical term, what York is suggesting is that the defense bureaucracy, viewed as a system, basically has a fixed slew rate. You can expand or contract the defense budget, but because the system itself resists change, it’s very rare to have the political will to change it by more than 5% or so per budget cycle. It also looks more than a bit suspicious that this slew rate works out so roundly to 5%, a number that we find deliciously convenient on account of our five digits. It makes me wonder if this value isn’t chosen — consciously or otherwise — as the breaking point between the forces of change and the forces of stability quite often in budget negotiations.
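The slew-rate idea is easy to make concrete as a toy model. A sketch, taking the 5% cap as a premise and using an invented budget target (the dollar figures are mine, purely for illustration):

```python
def years_to_reach(current, target, max_slew=0.05):
    """Model a budget that can move at most max_slew (as a fraction of
    itself) toward a target each cycle; return the number of cycles
    needed to get within 1% of the target."""
    years = 0
    while abs(current - target) / target > 0.01:
        # Clamp the desired fractional change to the system's slew rate.
        step = max(-max_slew, min(max_slew, (target - current) / current))
        current *= 1 + step
        years += 1
    return years

# Cutting a notional $600B budget to $400B at ~5% per cycle takes
# the better part of a decade, not one appropriations fight:
print(years_to_reach(600, 400))  # prints 8
```

Which is the practical upshot of York’s observation: if you want the budget somewhere specific at some future date, you have to start pushing years in advance.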

I’m not convinced that this relatively low maximum slew rate is necessarily a bad thing when you are dealing with an institution as large as the DoD. It would probably be bad if the budget were subject to political whims that could change it more dramatically from year to year; the result would almost certainly be more favoritism, if not outright corruption. But it does present a significant challenge: taking that limit as a premise, if you want the budget to be a certain amount by a certain time, or if you want it focused on some set of priorities at some future date, you have to start pushing it in the right direction far in advance of the target.

That, in combination with relatively short political-leadership cycles (roughly 8 years in the Executive branch and not much longer on the House side of the Legislative; the Senate is somewhat slower-moving, but not by orders of magnitude), makes it difficult to get anything intelligent done at all outside of a crisis. (Others may disagree, but I still have some faith in our institutional ability to react quickly when the chips are down; it’s just a hell of an expensive way to run a country.)

In the coming decades, I think the challenge for established nation-state actors in the face of new adversaries — particularly non-state actors like, but not necessarily limited to, terror groups — is going to be not letting those groups permanently outmaneuver them by getting inside the OODA loop of the established players to such an extent that they become unable to adequately respond.

The silver lining to this for the West, if it can be said to be much of one, is that there’s no evidence that the emerging superpower states such as China and India are any better at all of this, or have a faster organizational “slew rate”, than we do. On this issue, we’re all basically in the same boat, and it’s a very large, very massive, and very slow-to-maneuver one.


Wed, 01 Jul 2009

There’s been a bit of discussion recently over the idea of mileage-based road taxes replacing the Federal gasoline and diesel taxes that currently pay for the Interstate system, among other things. Most articles seem to have been prompted by a report from the “National Surface Transportation Infrastructure Financing Commission” (which somewhat strangely has a photo of the DC Metro in an underground station on its homepage) suggesting that the gas tax be phased out by 2020 and replaced by a mileage-based tax.

The proposal by the NSTIFC called for a GPS-based system to track road usage and upload it on a monthly basis for taxation purposes. This is stupid. It’s overly complex, it would be ridiculously expensive, it has major privacy concerns, its operation would be opaque to users, and it would almost certainly be open to abuse due to its complexity. It’s a terrible idea and the people suggesting it should be forced to read the RFPs of every overly-complex public sector IT project that has fallen flat on its face for similar reasons, until they repent for coming up with such a terrible idea.

However, the stupidity of that particular implementation plan doesn’t mean that all mileage-based taxes are a bad idea. The underlying concept is a sound one, and if it’s done right it might cause people to think harder about the services they’re using and how much it costs to maintain them. That’s a Good Thing in my book.

The kind of mileage-based tax I’d support would be a low-tech one. Calculate taxable mileage using annual odometer readings, conducted while vehicles are undergoing normal safety or emissions inspections. (States which currently don’t do emissions or safety inspections would have to start doing them, but this change is far smaller than what would be required for the alternative schemes, e.g. the GPS-based one.) Certainly it’s possible to roll back an odometer or bribe an inspector, but those things are already illegal, as are other kinds of tax fraud. Increase the penalties in proportion to the increased incentive to commit fraud, and we shouldn’t have much more trouble with odometer tampering than we currently do.

Basing the taxable mileage on the odometer reading doesn’t require invasive GPS tracking devices, which would doubtless be used for purposes well beyond tracking taxable mileage once installed. It doesn’t require any new technology, and in many places it makes use of the already-extant inspection infrastructure. It’s cheap — both from the user’s and the government’s perspective — and it would work.

Two of the most frequently-cited concerns regarding mileage-based taxes involve drivers who frequently travel outside the US, and drivers who spend most of their time on private roads (e.g. farm vehicles). The second issue — vehicles on private roads — is easy to address: if your vehicle doesn’t have a license plate and doesn’t normally operate on public roads (vehicles which today use untaxed off-road fuel), it doesn’t get taxed. If your vehicle does have a plate, it does. In the very worst case, this might force a very small number of edge-case users to get a second vehicle, if they currently have one that sometimes operates on-road using taxed fuel and sometimes off-road using untaxed gas, but this is such a small percentage of vehicles that I’m not sure it bears building policy around.

Addressing international driving is a more interesting question. The simplest, lowest-tech solution is probably to simply record mileage as part of the border-crossing process. If drivers who are crossing the border want a tax exemption on their non-US mileage, they could carry a logbook similar to a passport, specific to their vehicle, and have the mileage noted and certified by a Border Patrol agent as they crossed out of and back into the US. It would be up to drivers to determine whether, based on the amount of mileage they actually drive outside of the US, the paperwork was worth it.

What gets lost in the discussion of mileage-based taxes, and which I think bears attention, is that in any reasonably fair scheme, the taxes on passenger cars and light trucks should be vanishingly low. The bulk of mileage taxes should fall on commercial vehicles weighing more than 6,000 lbs., because they are what actually causes wear and tear to the roads; passenger cars essentially don’t. Whenever you see an Interstate being repaved, it’s generally due either to weather deterioration or to wear from trucks. The weather-repair costs should be borne by all drivers in proportion to the amount they drive, but the wear-and-tear expenses should fall squarely on heavy vehicles. In fact, the easiest way to ensure vehicles pay for the damage they do is to base the tax on the mileage driven multiplied by a factor derived from the maximum axle weight of the vehicle. (Road wear depends on the load carried by each axle, and the relationship is strongly nonlinear: the classic rule of thumb from the AASHO Road Test is that pavement damage scales roughly with the fourth power of axle load, though some research would be required to determine the actual rate tables.)
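To see why the passenger-car share would be vanishingly small, here is a sketch of a weight-adjusted rate using the fourth-power rule of thumb. The 18,000 lb reference axle is the standard one used in pavement engineering; the base rate is a made-up illustrative figure, not a proposal:

```python
REFERENCE_AXLE_LBS = 18_000  # standard reference axle load used in pavement design
BASE_RATE_PER_MILE = 0.02    # dollars per mile for a reference axle; hypothetical

def weight_adjusted_tax(miles, max_axle_load_lbs):
    """Mileage tax scaled by approximate relative pavement damage per mile.

    Uses the fourth-power rule of thumb: damage scales with
    (axle load / reference load) ** 4.
    """
    damage_factor = (max_axle_load_lbs / REFERENCE_AXLE_LBS) ** 4
    return miles * BASE_RATE_PER_MILE * damage_factor

# A car putting ~2,000 lbs on each axle does (2000/18000)^4, i.e. about
# 1/6561, of the damage of a fully loaded truck axle per mile:
car_tax = weight_adjusted_tax(12_000, 2_000)     # a few cents per year
truck_tax = weight_adjusted_tax(100_000, 18_000)  # thousands of dollars
```

Under this kind of schedule a typical car’s bill rounds to pocket change, while a heavy truck running reference-weight axles pays orders of magnitude more per mile, which is the point.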

And that brings me around to my only real objection to a mileage-based tax, which is also my objection to virtually all taxes except those placed on real property: the public needs an assurance from the government that the mileage tax would be used only for maintenance and construction of the transportation infrastructure, and not for whatever purpose Congress decides is politically expedient this season. This is because, when you start taxing a particular activity, you start to change the underlying incentive structure that drives people’s choices and lives. It is important to make the ‘retail’ cost (that is, the out-of-pocket cost paid by the consumer) of a good reflect its true cost to society, but it shouldn’t be made any higher.

Federal road taxes should be used for the maintenance of the Federal road and highway system only — not for regional light rail projects (better funded by property taxes on those areas that will benefit) or for environmental remediation of fossil fuels (better funded by taxes on the fossil fuels themselves). And certainly not for schools, hospitals, police stations, or anything else, except insofar as the need for those things can be directly attributed to the existence of the Federally-maintained road network.

Some have objected to the idea of a mileage-based tax because under most proposals, it would not immediately replace the gas tax — that is, the gas tax would not drop to zero cents per gallon on the day a mileage-based tax went into effect. If both the mileage-based road usage tax and the gas tax were set properly, this would not be a problem. The mileage based tax would go towards infrastructure maintenance, and the gas tax would go towards remediating the environmental consequences and other negative externalities of petroleum use. Since there are a lot of negatives associated with burning oil, it should have a fairly high tax regardless of what we decide to levy for road use. Furthermore, the remaining gas tax should apply across the board to all petroleum products intended for combustion, not just road fuels: this means oil used in power generation, on farms, or by railroads shouldn’t be exempt. If you burn it and vent the byproducts into the atmosphere, it should be taxed: it’s not a “road usage” tax anymore, it’s a “petroleum combustion” one. (Here’s where you build your CO2 or climate-change taxes, incidentally.)

Retaining — even increasing, if valid reasons exist — the tax on gasoline to cover its negative externalities also eliminates one other problem with a mileage-based tax: that it creates a perverse incentive to continue using petroleum vehicles and not switch to alternative fuels, which are cleaner and have fewer negative externalities associated with their use. Plus, as a bonus, if we institute a mileage-based tax with a weight component, we can stop punitively taxing diesel fuel as a backdoor way of taxing trucks for the damage they do to the roads. Diesels are more efficient and are favored in other parts of the world (where the tax regimes are less punitive) due to their inherent economy.

There are lots of reasons to hate the specific mileage-based taxation proposals that have been put forward, which would require GPS receivers and constant monitoring of every car on the road. But there’s no reason to dismiss the idea of mileage-based taxes out of hand. Taxing based on services actually consumed is always a good idea in my book, and if it were done right, a mileage-based tax could help shape our actions in ways that avoid externalizing costs on others. That said, I remain as cynical as ever about Washington’s ability to get this, or just about anything else, right.
