Sotomayor's Mind-Numbing Speeches. By Matthew J. Franck
Bench Memos/NRO, Thursday, June 11, 2009
I have just come up for air from the stifling smog of banality generated by the collected speeches of Sonia Sotomayor. They can be seen as the attachments to Question 12d on this page set up by the Senate Judiciary Committee, but let me hasten to add that I have wasted my morning on reading them so you don't have to.
I was interested in part in whether I was right to speculate yesterday that maybe her best-known speech, 2001's "A Latina Judge's Voice," was partly "unscripted" or ad-libbed, since that might explain the occasionally bad writing in its published form. It could not have been wholly off-the-cuff, of course, with its recitation of statistics and its quotations from a few sources—but those might have been on notecards. Having gone through virtually all the speeches on the Senate website, I can say that what the Berkeley La Raza Law Journal published was almost certainly based on a prepared text. I say that for a couple of reasons.
First, Judge Sotomayor's speeches all display mediocre writing, and frequently exhibit errors of various kinds, and the copies of them that have been made public are in most cases quite obviously her own originals. She said of herself as long ago as 1994 that "I consider myself merely an average writer," and as recently as 2007 that "[w]riting remains a challenge for me even today . . . I am not a natural writer." This is a salutary self-awareness, because her speeches are bad—uniformly boring, almost invariably devoid of actual ideas, and completely forgettable. I am not sure she has ever uttered an interesting thought off the bench—unless you count her celebration of the "wise Latina," which is interesting only to the pathologist of confused identity politics.
Second, Sotomayor's speeches are mind-numbingly repetitious when read in series. This may seem an unfair comment; each one was presumably a first-time listening experience for her audience (one can hope, anyway). But reading them in series, I began quietly to mutter to the speaker, "Get some new material. Aren't you boring yourself?" She didn't just once deliver the line "I became a Latina by the way I love and the way I live my life"; she delivered it easily a dozen times to (mostly Hispanic) audiences over the course of ten years or so. And the repetition allows us to see the same little mistakes over and over. To take just two: Sotomayor has a fond memory from her childhood of the Mexican comedian Cantinflas, but she never once spells his name correctly in at least a half dozen speeches (after which I stopped counting). And she likes relating how the city girl came to Princeton and had never heard a cricket before, and how she imagined it looked like "Jimmy the Cricket" from the Disney movie Pinocchio (once she called him "Jiminy the Cricket," which gets a little closer, but in subsequent speeches she went back to Jimmy). Such repeated mistakes over a period of years provide us with some assurance that all the errors we see are her own, and not transcription errors from audio recordings. So when she refers to "Anthony Scalia" for "Antonin" or "Thurmond Marshall" for "Thurgood," we can be pretty sure that's Sotomayor's own mistake.
I can also confirm what others have observed about the notorious "wise Latina" speech at Berkeley: that it was not an isolated instance of such an argument against traditional notions of judicial impartiality.
Sotomayor's speeches for the most part fall into three large categories: 1) celebrations of her Latina background, combined with exhortations to strive as she has done, delivered to young Hispanic audiences; 2) encouragement to law students to take up pro bono work and benefit their communities; and 3) brief descriptive accounts of the work of the courts on which she has served, trial and appellate. In the first category there are mostly harmless hurrahs for being Hispanic or Latino or Puerto Rican or whatever. Many of these contain a variant of this paragraph from 1998:
America has a deeply confused image of itself that is a perpetual source of tension. We are a nation that takes pride in our ethnic diversity, recognizing its importance in shaping our society and in adding richness to its existence. Yet, we simultaneously insist that we can and must function and live in a race- and color-blind way that ignores those very differences that in other contexts we laud. That tension between the melting pot and the salad bowl, to borrow recently popular metaphors in New York, is being hotly debated today in national discussions about affirmative action. This tension leads many of us to struggle with maintaining and promoting our cultural and ethnic identities in a society which is often ambivalent about how to deal with its differences.
Nothing here, without more, should cause anyone to conclude that Sotomayor is for race-consciousness in judging. The paragraph is almost studiously non-committal and merely descriptive, except for its initial characterization of the "tension" between diversity and colorblind equality as a mark of a country that has a "deeply confused image of itself." Lots of Americans could easily explain to Judge Sotomayor how this is no source of confusion at all.
But then, overlapping with all these mostly boring celebrations of the joys of being a Latina, we find the speeches that do indicate something about Sotomayor's view of judging. The Berkeley La Raza speech, given in October 2001, was given again in almost identical form on February 26, 2002, and October 22, 2003. And an earlier version, containing much of the same material but not yet polished into its best-known form, was given on March 17, 1994 in Puerto Rico, on the topic of "Women in the Judiciary." Because of that topic and the audience listening to the speech, Sotomayor referred to a "wise woman with the richness of her experiences" rather than a "wise Latina." But the essentials of the speeches we know from 2001 and later were all there: the disagreement with Judge Miriam Cedarbaum's argument for judicial neutrality, the expectation that better judging will come from women and minorities because they are women and minorities, and the plumping for more appointments of Hispanic judges in particular.
In sum, we can say that Judge Sotomayor has, with few exceptions, given just three or four speeches to public audiences in her career—the same three or four, over and over and over. One of her repeated themes is the virtue of race-conscious and sex-conscious bias in judging. Like her other themes, this one cannot be "walked back" by White House claims that she "misspoke" or could have made her point differently. She is, on the evidence of her speeches, a great self-absorbed bore, a mediocrity as a writer, and a polished practitioner of identity politics.
Sunday, June 14, 2009
The Obama Diet. WaPo Editorial
It's not going to slim down the federal budget anytime soon.
WaPo, Friday, June 12, 2009
IMAGINE GOING on a diet in which you figure out how many calories a day you eat and pledge to make up for any amount beyond that with vigorous exercise. Except that your regular consumption includes four gooey slices of chocolate cake daily -- which you have no intention of giving up. You might not put on weight as quickly, but you're not likely to slim down, either. This is about what President Obama proposed this week with new legislation to codify congressional pay-as-you-go rules. "The 'pay-as-you-go rule' is very simple," the president said. "Congress can only spend a dollar if it saves a dollar elsewhere."
Well, not exactly. First, as under existing House and Senate rules, what is known as the PAYGO law would not apply to discretionary spending programs, which account for about 40 percent of the federal budget; these programs, which Congress reviews and funds yearly, could continue to grow without corresponding cuts or new revenue. Only extra spending for mandatory programs (Medicare, Social Security and the like) and tax provisions would be covered. Second, Mr. Obama would write into the law four whopping exceptions to the pay-as-you-go rule: extending most of the Bush tax cuts, keeping the estate tax at its current level, preventing the alternative minimum tax from hitting more taxpayers and increasing Medicare payments to doctors. This adds up to a $2.8 trillion loophole over 10 years.
What to make of this move? Well, it's better than nothing, and if implemented it would improve on the record of Mr. Obama's predecessor, who -- with plenty of Democratic encouragement -- dug the entitlement hole even deeper with the creation of a Medicare prescription drug plan that was not paid for. By contrast, Mr. Obama vows that his health-care reform will have to be financed with spending cuts or tax increases. That is important under the current budgetary circumstances, and to the extent that pay-as-you-go legislation would make it more difficult for lawmakers to get around their self-imposed rules simply by voting to ignore them, all the better.
But the president's self-congratulatory back-patting about fiscal rectitude is more than a bit hard to take in light of the huge exceptions he would grant. Yes, the president inherited a budget in arrears, an economy in tatters and a tax system that is unsustainable as written. It was necessary to add to the deficit in the short term to jolt the economy back to life. The House has these four exemptions in its pay-as-you-go rule; even without the Obama exceptions, there was no reasonable hope that these costs would be paid for.
Yet Mr. Obama's professions of being willing to make hard choices are belied by his failure to adjust his spending plans to budgetary reality. Something will need to give -- either raising significantly more revenue or dramatically scaling back government. He doesn't deserve much credit for a pay-as-you-go proposal that elides this reality instead of confronting it.
Naturalism Has Been Hijacked. By George Ball
Man is not a cancer on the planet.
WSJ, Jun 13, 2009
Mankind has really been put in its place over the past 500 years. Why only the other day, back in 1400, the sun orbited the earth; man was God's consummate work of art; humans were masters of themselves and the domain God provided for them.
Our secular fall from grace began with Copernicus, who dislodged the world from its celestial catbird seat. Later, Darwin established that man, far from being the animal kingdom's pièce de résistance, was a bit like a baboon in clothes. Then Mendel documented the laws of inheritance -- so much for free will -- and Freud subordinated what was then left of our minds to unseemly drives over which we have little control.
In the 20th century, technology assumed a size and complexity too big to fit into what was left of our brains. In the 1890s, an intelligent layman could achieve a rudimentary grasp of the scope of current scientific thought. Perhaps no one -- scientist or not -- fathoms the full scope of technology today.
According to scientist and futurist Raymond Kurzweil, the coming technological-evolutionary quantum leap, known as the Singularity, will erase the line between human beings and technology. He maintains that technology's exponential progress will result in part-human, part-machine beings with infinitely greater brain power and life-spans approaching immortality.
Mr. Kurzweil envisions a time when, if a body part fails, one need only grab its replacement from the pantry and snap it in place. Already, lawyers are busy devising the constitutional framework for a post-human future, in view of the shifting nature of what constitutes a human being. The classic paradox comes to mind: Once the knife's blade and handle are each replaced several times, is it still the same knife? Once all your parts have been replaced a few times, are you still you?
Now a segment of the Green movement presents a fresh challenge to mankind's place within nature. Humans, the thinking goes, are one species among the many, a life form coexisting with others, our rights commensurate with those of snail darters, mosquitoes and coral reefs.
The new environmentalist thinking occupies that treacherous terrain between rationality and romanticism. It's highly logical, too, an all-encompassing equation where everything is equivalent to everything else -- communism at a cellular level.
The premise glows with the innocence we forsook when Adam larcenously appropriated an apple from its rightful owner, the tree.
This dangerous new unnatural naturalism sees the planet as a realm of halcyon purity. Conversely, mankind is portrayed as a cancer on the planet. Welcome to secular subhumanism.
The Earth-Firsters are not fools. There are choice elements in their deranged philosophy that merit consideration; such is the essence of temptation. However, their failure is that they undermine their cause with acts of brutality. Theodore Kaczynski, the Unabomber, a Ph.D. with kindred neo-Luddite views, was one such activist run amok, responsible for dozens of injuries and four deaths. He is a case study of how, contaminated with extreme emotion, logic becomes toxic.
Self-described "evo-lutionaries" and animal-rights activists feel justified in spiking trees, burning down housing developments, vandalizing laboratories and threatening the lives of researchers and their families. By all means save the Red-Cockaded Woodpecker, but not at the cost of human lives, no matter how few. That way lies madness.
One activist author posits that the planet can support only one billion people -- a number surely including the writer, his friends and extended family. Another activist advocates saving the world through euthanasia, abortion, suicide and sodomy. However, the truly repugnant part of this story is that these are both tenured professors in wealthy universities.
In Switzerland, proposed legislation protects the rights of plants. As you roam the Swiss mountains, do not violate the rights of the wildflowers by picking them: An undercover gnome might arrest you. Internationally, the greener-than-thou brigade scorns bioengineered seeds -- the 20th century achievement that vastly increased the world's food supply and rescued billions from starvation -- forgetting that nature has been creating hybrids since the beginning of time.
A Yale professor maintains that owning pets is a kind of species colonialism, an exploitative master-subject relationship. The word "pet" is now viewed as pejorative; if you must hold a creature hostage, call it your "animal companion."
The political views of the Eco-elitists defy easy categorization, if not also comprehension. Their anti-business stance might mark them as liberals, while their hard-edged fundamentalist views about nature and brittle nostalgia for a lost Peaceable Kingdom are surely conservative.
Perhaps they are little more than one of nature's newest 21st century hybrids: Progressive-Reactionaries.
Mr. Ball is chairman of W. Atlee Burpee & Co., and a former president of the American Horticultural Society. This op-ed was adapted from an entry on his blog (www.heronswoodvoice.com).
Retreat into Apathy. By Mark Steyn
A society of children cannot survive.
NRO, Jun 13, 2009
Willie Whitelaw, a genial old buffer who served as Margaret Thatcher’s deputy for many years, once accused the Labour party of going around Britain stirring up apathy. Viscount Whitelaw’s apparent paradox is, in fact, a shrewd political insight, and all the sharper for being accidental. Big government depends, in large part, on going around the country stirring up apathy — creating the sense that problems are so big, so complex, so intractable that even attempting to think about them for yourself gives you such a splitting headache it’s easier to shrug and accept as given the proposition that only government can deal with them.
Take health care. Have you read any of these health-care plans? Of course not. They’re huge and turgid and unreadable. Unless you’re a health-care lobbyist, a health-care think-tanker, a health-care correspondent, or some other fellow who’s paid directly or indirectly to plough through this stuff, why bother? None of the senators whose names are on the bills have read ’em; why should you?
And you can understand why they drag on a bit. If you attempt to devise a health-care “plan” for 300 million people, it’s bound to get a bit complicated. But a health-care plan for you, Joe Schmoe of 27 Elm Street, didn’t used to be that complicated, did it? Let’s say you carelessly drop Ted Kennedy’s health-care plan on your foot and it breaks your toe. In the old days, you’d go to your doctor (or, indeed, believe it or not, have him come to you), he’d patch you up, and you’d write him a check. That’s the way it was in most of the developed world within living memory. Now, under the guise of “insurance,” various third parties intercede between the doctor and your checkbook, and to this the government proposes adding a massive federal bureaucracy, in the interests of “controlling costs.” The British National Health Service is the biggest employer not just in the United Kingdom, but in the whole of Europe. Care to estimate the size and budget of a U.S. health bureaucracy?
According to the U.N. figures, life expectancy in the United States is 78 years; in the United Kingdom, it’s 79 — yay, go socialized health care! On the other hand, in Albania, where the entire population chain-smokes and the health-care system involves swimming to Italy, life expectancy is still 71 years — or about where America was a generation or so back. Once you get childhood mortality under control, and observe basic hygiene and lifestyle precautions, the health “system” is relatively marginal. One notes that, even in Somalia, which still has high childhood mortality, not to mention a state of permanent civil war, functioning government has entirely collapsed and yet life expectancy has increased from 49 to 55. Maybe if government were to collapse entirely in Washington, our life expectancy would show equally remarkable gains. Just thinking outside the box here.
When President Obama tells you he’s “reforming” health care to “control costs,” the point to remember is that the only way to “control costs” in health care is to have less of it. In a government system, the doctor, the nurse, the janitor, and the Assistant Deputy Associate Director of Cost-Control System Management all have to be paid every Friday, so the sole means of “controlling costs” is to restrict the patient’s access to treatment. In the Province of Quebec, patients with severe incontinence — i.e., they’re in the bathroom twelve times a night — wait three years for a simple 30-minute procedure. True, Quebeckers have a year or two on Americans in the life-expectancy hit parade, but, if you’re making twelve trips a night to the john 365 times a year for three years, in terms of life-spent-outside-the-bathroom expectancy, an uninsured Vermonter may actually come out ahead.
As Louis XV is said to have predicted, “Après moi, le deluge” — which seems as incisive an observation as any on a world in which freeborn citizens of the wealthiest societies in human history are content to rise from their beds every half-hour every night and traipse to the toilet for yet another flush simply because a government bureaucracy orders them to do so. “Health” is potentially a big-ticket item, but so’s a house and a car, and most folks manage to handle those without a Government Accommodation Plan or a Government Motor Vehicles System — or, at any rate, they did in pre-bailout America.
More important, there is a cost to governmentalizing every responsibility of adulthood — and it is, in Lord Whitelaw’s phrase, the stirring up of apathy. If you wander round Liverpool or Antwerp, Hamburg or Lyons, the fatalism is palpable. In Britain, once the crucible of freedom, civic life is all but dead: In Wales, Northern Ireland, and Scotland, some three-quarters of the economy is government spending; a malign alliance between state bureaucrats and state dependents has corroded democracy, perhaps irreparably. In England, the ground ceded to the worst sociopathic pathologies advances every day — and the latest report on “the seven evils” afflicting an ever more unlovely land blames “poverty” and “individualism,” failing to understand that if you remove the burdens of individual responsibility while loosening all restraint on individual hedonism the vaporization of the public space is all but inevitable. In Ontario, Christine Elliott, a candidate for the leadership of the so-called Conservative party, is praised by the media for offering a more emollient conservatism predicated on “the need to take care of vulnerable people.”
Look, by historical standards, we’re loaded: We have TVs and iPods and machines to wash our clothes and our dishes. We’re the first society in which a symptom of poverty is obesity: Every man his own William Howard Taft. Of course we’re “vulnerable”: By definition, we always are. But to demand a government organized on the principle of preemptively “taking care” of potential “vulnerabilities” is to make all of us, in the long run, far more vulnerable. A society of children cannot survive, no matter how all-embracing the government nanny.
I get a lot of mail each week arguing that, when folks see the price tag attached to Obama’s plans, they’ll get angry. Maybe. But, if Europe’s a guide, at least as many people will retreat into apathy. Once big government’s in place, it’s very hard to go back.
— Mark Steyn, a National Review columnist, is author of America Alone.
A society of children cannot survive.
NRO, Jun 13, 2009
Willie Whitelaw, a genial old buffer who served as Margaret Thatcher’s deputy for many years, once accused the Labour party of going around Britain stirring up apathy. Viscount Whitelaw’s apparent paradox is, in fact, a shrewd political insight, and all the sharper for being accidental. Big government depends, in large part, on going around the country stirring up apathy — creating the sense that problems are so big, so complex, so intractable that even attempting to think about them for yourself gives you such a splitting headache it’s easier to shrug and accept as given the proposition that only government can deal with them.
Take health care. Have you read any of these health-care plans? Of course not. They’re huge and turgid and unreadable. Unless you’re a health-care lobbyist, a health-care think-tanker, a health-care correspondent, or some other fellow who’s paid directly or indirectly to plough through this stuff, why bother? None of the senators whose names are on the bills have read ’em; why should you?
And you can understand why they drag on a bit. If you attempt to devise a health-care “plan” for 300 million people, it’s bound to get a bit complicated. But a health-care plan for you, Joe Schmoe of 27 Elm Street, didn’t used to be that complicated, did it? Let’s say you carelessly drop Ted Kennedy’s health-care plan on your foot and it breaks your toe. In the old days, you’d go to your doctor (or, indeed, believe it or not, have him come to you), he’d patch you up, and you’d write him a check. That’s the way it was in most of the developed world within living memory. Now, under the guise of “insurance,” various third parties intercede between the doctor and your checkbook, and to this the government proposes adding a massive federal bureaucracy, in the interests of “controlling costs.” The British National Health Service is the biggest employer not just in the United Kingdom, but in the whole of Europe. Care to estimate the size and budget of a U.S. health bureaucracy?
According to U.N. figures, life expectancy in the United States is 78 years; in the United Kingdom, it’s 79 — yay, go socialized health care! On the other hand, in Albania, where the entire population chain-smokes and the health-care system involves swimming to Italy, life expectancy is still 71 years — or about where America was a generation or so back. Once you get childhood mortality under control, and observe basic hygiene and lifestyle precautions, the health “system” is relatively marginal. One notes that even in Somalia, which still has high childhood mortality, not to mention a state of permanent civil war, and where functioning government has entirely collapsed, life expectancy has increased from 49 to 55. Maybe if government were to collapse entirely in Washington, our life expectancy would show equally remarkable gains. Just thinking outside the box here.
When President Obama tells you he’s “reforming” health care to “control costs,” the point to remember is that the only way to “control costs” in health care is to have less of it. In a government system, the doctor, the nurse, the janitor, and the Assistant Deputy Associate Director of Cost-Control System Management all have to be paid every Friday, so the sole means of “controlling costs” is to restrict the patient’s access to treatment. In the Province of Quebec, patients with severe incontinence — i.e., they’re in the bathroom twelve times a night — wait three years for a simple 30-minute procedure. True, Quebeckers have a year or two on Americans in the life-expectancy hit parade, but, if you’re making twelve trips a night to the john 365 times a year for three years, in terms of life-spent-outside-the-bathroom expectancy, an uninsured Vermonter may actually come out ahead.
As Louis XV is said to have predicted, “Après moi, le déluge” — which seems as incisive an observation as any on a world in which freeborn citizens of the wealthiest societies in human history are content to rise from their beds every half-hour every night and traipse to the toilet for yet another flush simply because a government bureaucracy orders them to do so. “Health” is potentially a big-ticket item, but so’s a house and a car, and most folks manage to handle those without a Government Accommodation Plan or a Government Motor Vehicles System — or, at any rate, they did in pre-bailout America.
More important, there is a cost to governmentalizing every responsibility of adulthood — and it is, in Lord Whitelaw’s phrase, the stirring up of apathy. If you wander round Liverpool or Antwerp, Hamburg or Lyons, the fatalism is palpable. In Britain, once the crucible of freedom, civic life is all but dead: In Wales, Northern Ireland, and Scotland, some three-quarters of the economy is government spending; a malign alliance between state bureaucrats and state dependents has corroded democracy, perhaps irreparably. In England, the ground ceded to the worst sociopathic pathologies advances every day — and the latest report on “the seven evils” afflicting an ever more unlovely land blames “poverty” and “individualism,” failing to understand that if you remove the burdens of individual responsibility while loosening all restraint on individual hedonism the vaporization of the public space is all but inevitable. In Ontario, Christine Elliott, a candidate for the leadership of the so-called Conservative party, is praised by the media for offering a more emollient conservatism predicated on “the need to take care of vulnerable people.”
Look, by historical standards, we’re loaded: We have TVs and iPods and machines to wash our clothes and our dishes. We’re the first society in which a symptom of poverty is obesity: Every man his own William Howard Taft. Of course we’re “vulnerable”: By definition, we always are. But to demand a government organized on the principle of preemptively “taking care” of potential “vulnerabilities” is to make all of us, in the long run, far more vulnerable. A society of children cannot survive, no matter how all-embracing the government nanny.
I get a lot of mail each week arguing that, when folks see the price tag attached to Obama’s plans, they’ll get angry. Maybe. But, if Europe’s a guide, at least as many people will retreat into apathy. Once big government’s in place, it’s very hard to go back.
— Mark Steyn, a National Review columnist, is author of America Alone.
Texas Tort Victories. WSJ Editorial
The plaintiffs-lawyer lobby blows $9 million and gets nowhere.
WSJ, Jun 13, 2009
Texas recently finished its legislative session, and the best news is what didn't pass. Namely, some 900 bills put forward by the tort bar.
The plaintiffs-lawyer lobby spent $9 million in last year's state legislative elections to help smooth the way for these bills, which were designed to roll back tort reforms passed in recent years, or to create new ways to sue. Yet that money wasn't enough to convince most Texas legislators to give up two decades of hard-won legal progress, which ranges from class-action clean-up to medical liability reform.
Among the more notable failed proposals were a bill that would have shifted the burden of medical proof away from plaintiffs and on to defendants in asbestos and mesothelioma cases; an attempt to rip up Texas's successful system of trying multidistrict litigation in a single court; and legislation to allow plaintiffs to sue for "phantom" medical expenses.
Part of this success was due to the legislature's gridlock over a controversial voter ID bill. Yet Republicans who run the Senate and House also did yeoman's work to keep many bills from ever reaching the floor. Republicans also got a helping hand from a number of brave, antilawsuit Democrats, many of them from South Texas, where litigation has exacted more of an economic toll.
Speaking of the economy, it's notable that Texas created more new jobs last year than the other 49 states combined. Texas's low tax burden is one reason. But also important is a fairer legal environment in which companies are less likely than they were a generation ago to face jackpot justice.
Conservatism and the University Curriculum. By Peter Berkowitz
If they can find time for feminist theory, they can find time for Edmund Burke.
The political science departments at elite private universities such as Harvard and Yale, at leading small liberal arts colleges like Swarthmore and Williams, and at distinguished large public universities like the University of Maryland and the University of California, Berkeley, offer undergraduates a variety of courses on a range of topics. But one topic the undergraduates at these institutions -- and at the vast majority of other universities and colleges -- are unlikely to find covered is conservatism.
There is no legitimate intellectual justification for this omission. The exclusion of conservative ideas from the curriculum contravenes the requirements of a liberal education and an objective study of political science.
Political science departments are generally divided into the subfields of American politics, comparative politics, international relations, and political theory. Conservative ideas are relevant in all four, but the obvious areas within the political science discipline to teach about the great tradition of conservative ideas and thinkers are American politics and political theory. That rarely happens today.
To be sure, a political science department may feature a course on American political thought that includes a few papers from "The Federalist" and some chapters from Alexis de Tocqueville's "Democracy in America."
But most students will hear next to nothing about the conservative tradition in American politics that stretches from John Adams to Theodore Roosevelt to William F. Buckley Jr. to Milton Friedman to Ronald Reagan. This tradition emphasizes moral and intellectual excellence, worries that democratic practices and egalitarian norms will threaten individual liberty, attends to the claims of religion and the role it can play in educating citizens for liberty, and provides both a vigorous defense of free-market capitalism and a powerful critique of capitalism's relentless overturning of established ways. It also recognized early that communism represented an implacable enemy of freedom. And for 30 years it has been animated by a fascinating quarrel between traditionalists, libertarians and neoconservatives.
While ignoring conservatism, the political theory subfield regularly offers specialized courses in liberal theory and democratic theory; African-American political thought and feminist political theory; the social theory of Karl Marx, Emile Durkheim, Max Weber and the neo-Marxist Frankfurt school; and numerous versions of postmodern political theory.
Students may encounter in various political theory courses an essay by the British historian and philosopher Michael Oakeshott, or a chapter from a book by the German-born American political philosopher Leo Strauss. But they will learn very little about the constellation of ideas and thinkers linked in many cases by a common concern with the dangers to liberty that stem from the excesses to which liberty and equality give rise.
That constellation begins to come into focus at the end of the 18th century with Edmund Burke's "Reflections on the Revolution in France." It draws on the conservative side of the liberal tradition, particularly Adam Smith and David Hume and includes Tocqueville's great writings on democracy and aristocracy and John Stuart Mill's classical liberalism. It gets new life in the years following World War II from Friedrich Hayek's seminal writings on liberty and limited government and Russell Kirk's reconstruction of traditionalist conservatism. And it is elevated by Michael Oakeshott's eloquent reflections on the pervasive tendency in modern politics to substitute abstract reason for experience and historical knowledge, and by Leo Strauss's deft explorations of the dependence of liberty on moral and intellectual virtue.
Without an introduction to the conservative tradition in America and the conservative dimensions of modern political philosophy, political science students are condemned to a substantially incomplete and seriously unbalanced knowledge of their subject. Courses on this tradition should be mandatory for students of politics; today they are not even an option at most American universities.
When progressives, who dominate the academy, confront arguments about the need for the curriculum to give greater attention to conservative ideas, they often hear them as a demand for affirmative action. Usually they mishear. Certainly affirmative action for conservatives is a terrible idea.
Political science departments should not seek out professors with conservative political opinions. Nor should they lower scholarly standards. That approach would embrace the very assumption that has corrupted liberal education: that in studying and teaching particular political ideas, one's identity is more important than the breadth and depth of one's knowledge and the rigor of one's thinking.
One need not be a Puritan to study and teach colonial American religious thought, an ancient Israelite to study and teach biblical thought, or a conservative or Republican to study and teach conservative ideas. Affirmative action in university hiring for political conservatives should be firmly rejected, certainly by conservatives and defenders of liberal education.
To be sure, if political science departments were compelled to hire competent scholars to offer courses on conservative ideas and conservative thinkers, the result would be more faculty positions filled by political conservatives, since they and not progressives tend to take an interest in studying conservative thought. But there is no reason why scholars with progressive political opinions and who belong to the Democratic Party can not, out of a desire to understand American political history and modern political philosophy, study and teach conservatism in accordance with high intellectual standards. It would be good if they did.
It would also be good if every political science department offered a complementary course on the history of progressivism in America. This would discourage professors from conflating American political thought as a whole with progressivism, which they do in a variety of ways, starting with the questions they tend to ask and those they refuse to entertain.
Incorporating courses on conservatism in the curriculum may, as students graduate, disperse, and pursue their lives, yield the political benefit of an increase in mutual understanding between left and right. In this way, reforming the curriculum could diminish the polarization that afflicts our political and intellectual classes. But that benefit is admittedly distant and speculative.
In the near term, giving conservative ideas their due will have the concrete and immediate benefit of advancing liberal education's proper and commendable goal, which is the formation of free and well-furnished minds.
Mr. Berkowitz is a senior fellow at Stanford University's Hoover Institution.
North Korea Says It Will Start Enriching Uranium. By Blaine Harden
Weapons Move Is 'Retaliation' for Sanctions
Washington Post, Sunday, June 14, 2009
TOKYO, June 13 -- North Korea adamantly denied for seven years that it had a program for making nuclear weapons from enriched uranium.
But on Saturday, a few hours after the U.N. Security Council slapped it with tough new sanctions for detonating a second nuclear device, the government of Kim Jong Il changed its tune, vowing that it would start enriching uranium to make more nuclear weapons.
Declaring that it would meet sanctions with "retaliation," North Korea also pledged to "weaponize" all the plutonium it could extract from used fuel rods at its Yongbyon nuclear plant, which was partially disabled last year as part of the North's agreement to win food, fuel and diplomatic concessions in return for a promise to end its nuclear program.
That agreement collapsed in April, when North Korea -- fuming about Security Council condemnation of its March launch of a long-range missile -- kicked U.N. weapons inspectors out of the country and began work to restart its plutonium factory. It tested a second bomb on May 25, and South Korean officials have said more missile launches and a third nuclear test are possible in the near future.
"It makes no difference to North Korea whether its nuclear status is recognized or not," the Foreign Ministry in Pyongyang said in a statement carried by the state news agency. "It has become an absolutely impossible option for North Korea to even think about giving up its nuclear weapons."
The 15-member Security Council unanimously passed a resolution Friday that imposes broad financial, trade and military sanctions on North Korea, while also calling on states, for the first time, to seize banned weapons and technology from the North that are found aboard ships on the high seas.
North Korea seemed Saturday to have interpreted the seizure resolution as a "blockade." But at the insistence of China and Russia, the North's traditional allies, the resolution does not authorize the use of military action to enforce any seizure that a North Korean vessel might resist, nor does it restrict shipments of food or other nonmilitary goods.
"An attempted blockade of any kind by the United States and its followers will be regarded as an act of war and met with a decisive military response," North Korea said.
Secretary of State Hillary Rodham Clinton said North Korea's "continuing provocative actions are deeply regrettable."
The bellicose language in North Korea's statement -- which describes the Security Council action as "another ugly product of American-led international pressure" -- is similar in tone to previous North Korean responses to U.N. sanctions.
But the North's announcement that it would process enriched uranium to make more weapons was an extraordinary public admission of active involvement in a program whose existence has been denied by Pyongyang since 2002, when it was first mentioned in a U.S. intelligence report.
That year, the Bush administration accused North Korea of secretly continuing with nuclear weapons development in violation of a 1994 agreement. It then canceled construction of two light-water reactors in the North that were to have been used to produce electricity for the impoverished country.
But in 2007, the Bush administration began to back off its assertions that North Korea had an active program to enrich uranium. The chief U.S. intelligence officer for North Korea, Joseph R. DeTrani, told Congress at the time that although there was "high confidence" that North Korea had acquired materials that could be used in a "production-scale" uranium program, there was only "mid-confidence" that such a program existed.
Uranium enrichment, which offers a route to nuclear weapons different from the plutonium-based route, uses centrifuges to spin hot uranium gas into weapons-grade fuel.
Insisting that it had no uranium-enrichment program, the North Korean government took an American diplomat to a missile factory in 2007, where there were aluminum tubes that some experts had said could be used in uranium enrichment. North Korea allowed the diplomat to take home some samples.
Traces of enriched uranium were unexpectedly discovered on those samples. Other traces were found on the pages of reactor records that North Korea turned over to the United States in 2008, as part of now-aborted negotiations on denuclearizing the North.
In recent years, U.S. officials have suggested that although North Korea has tried to enrich uranium, it has not been very successful.
North Korea on Saturday said it has indeed made progress.
"Enough success has been made in developing uranium-enrichment technology to provide nuclear fuel to allow the experimental procedure," the government said. "The process of uranium enrichment will be commenced."
This may have been bluster, at least in the short term.
It will take many years for the North to develop the uranium route to a bomb, according to Siegfried S. Hecker, a periodic visitor to the Yongbyon complex who was director of Los Alamos National Laboratory and is co-director of Stanford University's Center for International Security and Cooperation.
Writing last month in Foreign Policy magazine, Hecker said North Korea lacks uranium centrifuge materials, technology and know-how. He warned, however, that Iran has mastered this technology and could help the North move forward with uranium enrichment. North Korea and Iran have shared long-range missile technology that could enable both countries to deliver a nuclear warhead.
North Korea also said Saturday that the spent fuel rods at its Yongbyon reactor are being reprocessed, with all the resulting plutonium to be used in nuclear weapons. The government said that it has reprocessed more than a third of them.
Hecker said in a recent interview that there is enough plutonium in the spent rods for "one or two more" nuclear tests. He also said it would take the North about six months to restart its Yongbyon plant, and that it could then produce enough plutonium to make about one nuclear bomb a year for the next decade.
Early this year, North Korean officials said that technicians have used all the plutonium previously manufactured at Yongbyon to make nuclear weapons.
In South Korea on Saturday, several analysts said the North's fist-shaking response to Security Council sanctions suggests that hard-liners in the country's military are exercising increasing power in running the government.
Kim Jong Il suffered a stroke last summer and has appeared frail in public appearances. He is believed to have chosen his youngest son, Jong Un, as his successor. It is unknown, however, how far the succession process has progressed in the secretive communist state.
"Given Kim's ailing health . . . the North Korean leader is likely to have yielded to the demands and pressure of military people who have little awareness of the outside world," said Koh Yu-hwan, a professor of North Korean studies at Dongguk University in Seoul.
Special correspondent Stella Kim in Seoul contributed to this report.
Weapons Move Is 'Retaliation' for Sanctions
Washingt, Sunday, June 14, 2009
TOKYO, June 13 -- North Korea adamantly denied for seven years that it had a program for making nuclear weapons from enriched uranium.
But on Saturday, a few hours after the U.N. Security Council slapped it with tough new sanctions for detonating a second nuclear device, the government of Kim Jong Il changed its tune, vowing that it would start enriching uranium to make more nuclear weapons.
Declaring that it would meet sanctions with "retaliation," North Korea also pledged to "weaponize" all the plutonium it could extract from used fuel rods at its Yongbyon nuclear plant, which was partially disabled last year as part of the North's agreement to win food, fuel and diplomatic concessions in return for a promise to end its nuclear program.
That agreement collapsed in April, when North Korea -- fuming about Security Council condemnation of its March launch of a long-range missile -- kicked U.N. weapons inspectors out of the country and began work to restart its plutonium factory. It tested a second bomb on May 25, and South Korean officials have said more missile launches and a third nuclear test are possible in the near future.
"It makes no difference to North Korea whether its nuclear status is recognized or not," the Foreign Ministry in Pyongyang said in a statement carried by the state news agency. "It has become an absolutely impossible option for North Korea to even think about giving up its nuclear weapons."
The 15-member Security Council unanimously passed a resolution Friday that imposes broad financial, trade and military sanctions on North Korea, while also calling on states, for the first time, to seize banned weapons and technology from the North that are found aboard ships on the high seas.
North Korea seemed Saturday to have interpreted the seizure resolution as a "blockade." But at the insistence of China and Russia, the North's traditional allies, the resolution does not authorize the use of military action to enforce any seizure that a North Korean vessel might resist, nor does it restrict shipments of food or other nonmilitary goods.
"An attempted blockade of any kind by the United States and its followers will be regarded as an act of war and met with a decisive military response," North Korea said.
Secretary of State Hillary Rodham Clinton said North Korea's "continuing provocative actions are deeply regrettable."
The bellicose language in North Korea's statement -- which describes the Security Council action as "another ugly product of American-led international pressure" -- is similar in tone to previous North Korean responses to U.N. sanctions.
But the North's announcement that it would process enriched uranium to make more weapons was an extraordinary public admission of active involvement in a program whose existence has been denied by Pyongyang since 2002, when it was first mentioned in a U.S. intelligence report.
That year, the Bush administration accused North Korea of secretly continuing with nuclear weapons development in violation of a 1994 agreement. It then canceled construction of two light-water reactors in the North that were to have been used to produce electricity for the impoverished country.
But in 2007, the Bush administration began to back off its assertions that North Korea had an active program to enrich uranium. The chief U.S. intelligence officer for North Korea, Joseph R. DeTrani, told Congress at the time that although there was "high confidence" that North Korea had acquired materials that could be used in a "production-scale" uranium program, there was only "mid-confidence" that such a program existed.
Uranium enrichment, which offers a route to nuclear weapons different from the plutonium route, uses centrifuges to spin uranium gas into weapons-grade fuel.
Insisting that it had no uranium-enrichment program, the North Korean government took an American diplomat to a missile factory in 2007, where there were aluminum tubes that some experts had said could be used in uranium enrichment. North Korea allowed the diplomat to take home some samples.
Traces of enriched uranium were unexpectedly discovered on those samples. Other traces were found on the pages of reactor records that North Korea turned over to the United States in 2008, as part of now-aborted negotiations on denuclearizing the North.
In recent years, U.S. officials have suggested that although North Korea has tried to enrich uranium, it has not been very successful.
North Korea on Saturday said it had indeed made progress.
"Enough success has been made in developing uranium-enrichment technology to provide nuclear fuel to allow the experimental procedure," the government said. "The process of uranium enrichment will be commenced."
This may have been bluster, at least in the short term.
It will take many years for the North to develop the uranium route to a bomb, according to Siegfried S. Hecker, a periodic visitor to the Yongbyon complex who was director of Los Alamos National Laboratory and is co-director of Stanford University's Center for International Security and Cooperation.
Writing last month in Foreign Policy magazine, Hecker said North Korea lacks uranium centrifuge materials, technology and know-how. He warned, however, that Iran has mastered this technology and could help the North move forward with uranium enrichment. North Korea and Iran have shared long-range missile technology that could enable both countries to deliver a nuclear warhead.
North Korea also said Saturday that the spent fuel rods at its Yongbyon reactor are being reprocessed, with all the resulting plutonium to be used in nuclear weapons. The government said that it had reprocessed more than a third of them.
Hecker said in a recent interview that there is enough plutonium in the spent rods for "one or two more" nuclear tests. He also said it would take the North about six months to restart its Yongbyon plant, and that it could then produce enough plutonium to make about one nuclear bomb a year for the next decade.
Early this year, North Korean officials said that technicians had used all the plutonium previously manufactured at Yongbyon to make nuclear weapons.
In South Korea on Saturday, several analysts said the North's fist-shaking response to Security Council sanctions suggests that hard-liners in the country's military are exercising increasing power in running the government.
Kim Jong Il suffered a stroke last summer and has appeared frail in public appearances. He is believed to have chosen his youngest son, Jong Un, as his successor. It is unknown, however, how far the succession process has progressed in the secretive communist state.
"Given Kim's ailing health . . . the North Korean leader is likely to have yielded to the demands and pressure of military people who have little awareness of the outside world," said Koh Yu-hwan, a professor of North Korean studies at Dongguk University in Seoul.
Special correspondent Stella Kim in Seoul contributed to this report.