Never Right, But Never in Doubt. By Ronald Bailey
Famine-monger Lester Brown still gets it wrong after all these years
Reason, May 5, 2009
"Could food shortages bring down civilization?" asks environmental activist Lester Brown in the current issue of Scientific American. Not surprisingly, Brown's answer is an emphatic yes. He claims that for years he has "resisted the idea that food shortages could bring down not only individual governments but also our global civilization." Now, however, Brown says, "I can no longer ignore that risk." Balderdash. Brown, head of the Earth Policy Institute, has been a prominent and perennial predictor of imminent global famine for more than 45 years. Why should we believe him now?
For instance, back in 1965, when Brown was a young bureaucrat in the U.S. Department of Agriculture, he declared, "the food problem emerging in the less-developing regions may be one of the most nearly insoluble problems facing man over the next few decades." In 1974, Brown maintained that farmers "can no longer keep up with rising demand; thus the outlook is for chronic scarcities and rising prices." In 1981, Brown stated that "global food insecurity is increasing," and further claimed that "the slim excess of growth in food production over population is narrowing." In 1989, Brown contended that "population growth is exceeding the farmer's ability to keep up," concluding that, "our oldest enemy, hunger, is again at the door." In 1995, Brown starkly warned, "Humanity's greatest challenge may soon be just making it to the next harvest." In 1997, Brown again proclaimed, "Food scarcity will be the defining issue of the new era now unfolding."
But this time it's different, right? After all, Brown claims that "when the 2008 harvest began, world carryover stocks of grain (the amount in the bin when the new harvest begins) were at 62 days of consumption, a near record low." But Brown has played this game before with world grain stocks. As the folks at the pro-life Population Research Institute (PRI) report, Brown claimed in 1974 that there were only 26 days of grain reserves left, but later he upped that number to 61 days. In 1976, reserves were supposed to have fallen to just 31 days, but again Brown raised that number in 1988 to 79 days. In 1980, only a 40-day supply was allegedly on hand, but a few years later he changed that estimate to 71 days. The PRI analysts noted that Brown has repeatedly issued differing figures for 1974: 26 or 27 days (1974); 33 days (1975); 40 days (1981); 43 days (1987); and 61 days (1988). In 2004, Brown claimed that the world's grain reserves had fallen to only 59 days of consumption, the lowest level in 30 years.
In any case, Brown must know that the world's farmers produced a bumper crop last year. Stocks of wheat are at a six-year high and increases in other stocks of grains are not far off. This jump in reserves is not at all surprising considering the steep run-up in grain prices last year, which encouraged farmers around the world to plant more crops. By citing pre-2008 harvest reserves, Brown evidently hopes to frighten gullible Scientific American readers into thinking that the world's food situation is really desperate this time.
Brown argues that the world's food economy is being undermined by a troika of growing environmental calamities: falling water tables, eroding soils, and rising temperatures. He acknowledges that the application of scientific agriculture produced vast increases in crop yields in the 1960s and 1970s, but insists that "the technological fix" won't work this time. But Brown is wrong, again.
It is true that water tables are falling in many parts of the world as farmers drain aquifers in India, China, and the United States. Part of the problem is that water for irrigation is often subsidized by governments that encourage farmers to waste it. However, proper pricing of water would rectify that by encouraging farmers to transition to drip irrigation and to switch from thirsty crops like rice to dryland ones like wheat, and by spurring crop breeders to develop more drought-tolerant crop varieties. In addition, crop biotechnologists are now seeking to transfer the C4 photosynthetic pathway into rice, which currently uses the less efficient C3 pathway. This could boost rice yields by 50 percent while reducing water use.
To support his claims about the dangers of soil erosion, Brown cites studies in impoverished Haiti and Lesotho. To be sure, soil erosion is a problem for poor countries whose subsistence farmers have no secure property rights. However, one 1995 study concluded that soil erosion would reduce U.S. agricultural production by 3 percent over the next 100 years. Such a reduction would be swamped by crop productivity increases of 1 to 2 percent per year, the average rate for many decades. A 2007 study by European researchers found "it highly unlikely that erosion may pose a serious threat to food production in modern societies within the coming centuries." In addition, modern biotech herbicide-resistant crops make it possible for farmers to practice no-till agriculture, thus dramatically reducing soil erosion.
Brown's final fear centers on the effects of man-made global warming on agriculture. There is an ongoing debate among experts on this topic. For example, University of California, Santa Barbara economist Olivier Deschenes and Massachusetts Institute of Technology economist Michael Greenstone calculated that global warming would increase the profits of U.S. farmers by 4 percent, concluding that "large negative or positive effects are unlikely." Other researchers have recently disputed Deschenes and Greenstone's findings, arguing that the impact of global warming on U.S. agriculture is "likely to be strongly negative." Fortunately, biotechnology research—the very technology fix dismissed by Brown—is already finding new ways to make crops more heat and drought tolerant.
On the other hand, Brown is right about two things in his Scientific American article: the U.S. should stop subsidizing bioethanol production (turning food into fuel) and countries everywhere should stop banning food exports in a misguided effort to lower local prices. Of course these policy prescriptions have been made by far more knowledgeable and trustworthy commentators than Brown.
Brown's dismal record as a prognosticator of doom is so well known that it is just plain sad to see a respectable publication like Scientific American lending its credibility to this old charlatan.
Ronald Bailey is Reason magazine's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.
The Best Judges Obama Can't Pick. By Benjamin Wittes
Brookings, May 3, 2009
What do Merrick Garland, David Tatel and Jose Cabranes have in common?
All are sitting federal court of appeals judges who were nominated by Democratic presidents. All three are deeply admired by their colleagues and are among a small group of the very finest federal judges in the country. And all three have names you probably won't hear often in public discussions about whom President Obama should tap to replace retiring Justice David H. Souter.
Garland: white guy. Tatel: white guy and, at 67, too old. Cabranes: Hispanic, sure, but even older.
I have nothing against the people whose names have so far been floated as possible nominees (some of them are excellent), and I'm not against diversity on the high court. Far from it: It's important to have a court that looks like America, and it is particularly important that following Sandra Day O'Connor's retirement in 2005 an additional woman join the high court.
That said, there are significant costs to the nominating system that we have developed, in which gender, ethnicity and age have, from the very start of the search for Souter's replacement, placed off-limits many lawyers and judges whom their colleagues regard as among the best in the profession. The dirty little secret is that the conservative talent pool on the federal courts these days is larger and deeper than the liberal one, mainly because Republicans have held the presidency far longer than Democrats in recent decades and have therefore had more opportunity to cultivate a strong bench on the bench.
While both parties feel pressure to keep the bench diverse, Democrats have less latitude for bucking these expectations in judicial nominations than Republicans do. The core constituency that Republicans must satisfy in high court nominations is the party's social conservative base, which fundamentally cares about issues, not diversity, and has accepted white men who practice the judging it admires. By contrast, identity-oriented groups are part of the core Democratic coalition, so it's not enough for a Democrat to appoint a liberal. At least some of the time, it will have to be a liberal who also satisfies certain diversity categories.
The age issue has particularly striking consequences. It used to be commonplace for presidents to appoint justices who were well into their 60s. Lewis Powell, Earl Warren, Charles Evans Hughes (the second time around), William Howard Taft and Oliver Wendell Holmes, for example, were at least 60 when nominated, as was Justice Ruth Bader Ginsburg when President Clinton nominated her in 1993. Older judges brought experience to the table, and because life tenure is shorter for them than for younger judges, the stakes are lower in their confirmations.
Yet the ever-escalating political war over the courts has put a premium on youth -- on justices who can hang around for decades as members of rival ideological camps. Judge J. Harvie Wilkinson III, one of the most esteemed conservative jurists in the country, might well be on the Supreme Court today, for example, had he not had the temerity to be 60 when O'Connor retired and opened up a slot. Nor is Obama likely to follow Clinton's lead in declining to discriminate against the late-middle-aged. After all, if conservatives only appoint relative youngsters such as John G. Roberts and Samuel Alito (50 and 55, respectively, at the time of their nominations), it's unilateral disarmament for a liberal to do otherwise.
The result is a strange conversation about who should replace Souter -- one that self-consciously omits many of the judges whose work is most actively studied by those who engage day-to-day with the courts. This may well be a reasonable price to pay for a diverse bench, and for those who don't read judicial opinions, it is in any event an invisible price. But let's be candid about paying it.
What You Can(‘t) Do About Global Warming
World Climate Report, April 30, 2009
We are always hearing about ways that you can “save the planet” from the perils of global warming—from riding your bicycle to work, to supporting the latest national greenhouse gas restrictions, and everything in between.
In virtually each and every case, advocates of these measures provide you with the amount of greenhouse gas emissions (primarily carbon dioxide) that will be saved by the particular action.
And if you want to figure this out for yourself, the web is full of CO2 calculators (just google “CO2 calculator”) which allow you to calculate your carbon footprint and how much it can be reduced by taking various conservation steps—all with an eye towards reducing global warming.
However, in absolutely zero of these cases are you told, or can you calculate, how much impact you are going to have on the actual climate itself. After all, CO2 emissions are not climate—they are gases. Climate is temperature and precipitation and storms and winds, etc. If the goal of these actions is to prevent global warming, then you shouldn’t really give a hoot about the amount of CO2 emissions you are reducing; instead, you want to know how much of the planet you are saving. How much anthropogenic climate change is being prevented by unplugging your cell phone charger, by biking to the park, or by slashing national carbon dioxide emissions?
Why do none of the CO2 calculators give you that most valuable piece of information? Why don’t the politicians, the EPA, and/or greenhouse gas reduction advocates tell you the bottom line?
How much global warming are we avoiding?
Embarrassingly for them, this information is readily available.
After all, what do you think climate models do? Simply, they take greenhouse gas emissions scenarios and project the future climate—thus providing precisely the answer we are looking for. You tweak the scenarios to account for your emission savings, run the models, and you get your answer.
Since climate model projections of the future climate are what are being used to attempt to scare us into action, climate models should very well be used to tell us how much of the scary future we are going to avoid by taking the suggested/legislated/regulated actions.
So where are the answers?
OK, so full-fledged climate models are very expensive tools—they are extremely complex computer programs that take weeks to run on the world’s fastest supercomputers. So, consequently, they don’t lend themselves to web calculators.
But, you would think that in considering our national energy plan, or EPA’s plan to regulate CO2, that this would be of enough import to deserve a couple of climate model runs to determine the final result. Otherwise, how can the members of Congress fairly assess what it is they are considering doing? Again, if the goal is to change the future course of climate to avoid the potential negative consequences of global warming, then to what degree is the plan that they are proposing going to be successful? Can it deliver the desired results? The American public deserves to know.
In lieu of full-out climate models, there are some “pocket” climate models that run on your desktop computer in a matter of seconds and which are designed to emulate the large-scale output from the complex general circulation models. One of the best of these “pocket” models is the Model for the Assessment of Greenhouse-gas Induced Climate Change, or MAGICC. Various versions of MAGICC have been used for years to simulate climate model output for a fraction of the cost. In fact, the latest version of MAGICC was developed under a grant from the EPA. Just like a full climate model, MAGICC takes in greenhouse gas emissions scenarios and outputs such quantities as the projected global average temperature. Just the thing we are looking for. It would only take a bit of technical savvy to configure the web-based CO2 calculators so that they interfaced with MAGICC and produced a global temperature savings based upon the emissions savings. Yet none has seen fit to do so. If you are interested in attempting to do so yourself, MAGICC is available here.
As a last resort, for those of us who don’t have general circulation models, supercomputers, or even much technical savvy of our own, it is still possible, in a rough, back-of-the-envelope sort of way, to come up with a simple conversion from CO2 emissions to global temperatures. This way, what our politicians and favorite global warming alarmists won’t tell us, we can figure out for ourselves.
Here’s how.
We need to go from emissions of greenhouse gases, to atmospheric concentrations of greenhouse gases, to global temperatures.
We’ll determine how much CO2 emissions are required to change the atmospheric concentration of CO2 by 1 part per million (ppm), then we’ll figure out how many ppms of CO2 it takes to raise the global temperature 1ºC. Then, we’ll have our answer.
So first things first. Figure 1 shows the total global emissions of CO2 (in units of million metric tons, mmt) each year from 1958-2006 as well as the annual change in atmospheric CO2 content (in ppm) during the same period. Notice that CO2 emissions are rising, as is the annual change in atmospheric CO2 concentration.
[figure 1]
Figure 1. (top) Annual global total carbon dioxide emissions (mmt), 1958-2006; (bottom) Year-to-year change in atmospheric CO2 concentrations (ppm), 1959-2006. (Data source: Carbon Dioxide Information Analysis Center)
If we divide the annual emissions by the annual concentration change, we get Figure 2—the amount of emissions required to raise the atmospheric concentration by 1 ppm. Notice that there is no trend at all through the data in Figure 2. This means that the average amount of CO2 emissions required to change the atmospheric concentration by a unit amount has stayed constant over time. This average value in Figure 2 is 15,678mmtCO2/ppm.
[figure 2]
Figure 2. Annual CO2 emissions responsible for a 1 ppm change in atmospheric CO2 concentrations (Figure 1a divided by Figure 1b), 1959-2006. The blue horizontal line is the 1959-2006 average, the red horizontal line is the average excluding the volcano-influenced years of 1964, 1982, and 1992.
You may wonder about the two large spikes in Figure 2—indicating that in those years, the emissions did not result in much of a change in the atmospheric CO2 concentrations. It turns out that the spikes, in 1964 and 1992 (and a smaller one in 1982), are the result of large volcanic eruptions. The eruptions cooled the earth by blocking solar radiation and making it more diffuse, which has the dual effect of increasing CO2 uptake by the oceans and increasing CO2 uptake by photosynthesis—both effects serving to offset the effect of the added emissions and resulting in little change in the atmospheric concentrations. As the volcanic effects attenuated in the following year, the CO2 concentrations then responded to emissions as expected.
Since volcanic eruptions are more the exception than the norm, we should remove them from our analysis. In doing so, the average amount of CO2 emissions that lead to an atmospheric increase of 1 ppm drops from 15,678 (the blue line in Figure 2), to 14,138mmtCO2 (red line in Figure 2).
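The Figure 2 arithmetic—divide each year's emissions by that year's concentration change, then average, with and without the volcano years—fits in a few lines of Python. The yearly values below are made-up placeholders for illustration, not the actual CDIAC series:

```python
# Illustrative sketch of the Figure 2 calculation. The yearly values
# below are placeholders, NOT the actual CDIAC data used in the article.
emissions_mmt = {1963: 11000, 1964: 11500, 1980: 19500, 1982: 19800,
                 1991: 22500, 1992: 22300, 2000: 24000}
delta_ppm = {1963: 0.7, 1964: 0.3, 1980: 1.7, 1982: 0.9,
             1991: 1.0, 1992: 0.6, 2000: 1.7}

volcano_years = {1964, 1982, 1992}  # eruption-influenced years, per the article

# mmt of emissions per 1 ppm rise, year by year (Figure 1a / Figure 1b)
ratios = {yr: emissions_mmt[yr] / delta_ppm[yr] for yr in emissions_mmt}

all_years_avg = sum(ratios.values()) / len(ratios)        # the blue line
non_volcano = [r for yr, r in ratios.items() if yr not in volcano_years]
non_volcano_avg = sum(non_volcano) / len(non_volcano)     # the red line
```

Because the volcano years pair normal emissions with an unusually small concentration change, they produce outsized ratios, so excluding them pulls the average down—just as the article's drop from 15,678 to 14,138 shows.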
Now, we need to know how many ppms of CO2 it takes to raise the global temperature a degree Celsius. This is a bit trickier, because this value is generally not thought to be constant, but instead to decrease with increasing concentrations. But, for our purposes, we can consider it to be constant and still be in the ballpark. But what is that value?
We can try to determine it from observations.
Over the past 150 years or so, the atmospheric concentration of CO2 has increased about 100 ppm, from ~280ppm to ~380ppm, and global temperatures have risen about 0.8ºC over the same time. Dividing the concentration change by the temperature change (100ppm/0.8ºC) produces the answer that it takes 125ppm to raise the global temperature 1ºC. Now, it is possible that some of the observed temperature rise has occurred as a result of changes other than CO2 (say, solar, for instance). But it is also possible that the full effect of the temperature change resulting from the CO2 changes has not yet been manifest. So, rather than nit-pick here, we’ll call those two things a wash and go with 125ppm/ºC as a reasonable value as determined from observations.
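The observation-based estimate above is a single division over the article's round numbers:

```python
# Observation-based sensitivity from the article's round figures.
conc_rise_ppm = 380 - 280   # ~100 ppm CO2 increase over ~150 years
observed_warming_c = 0.8    # ~0.8 degrees C of warming over the same period

ppm_per_degree = conc_rise_ppm / observed_warming_c
print(ppm_per_degree)  # 125.0 ppm of CO2 per degree C
```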
We can also try to determine it from models.
Climate models run with only CO2 increases produce about 1.8ºC of warming at the time of a doubling of the atmospheric carbon dioxide concentrations. A doubling is usually taken to be a change of about 280ppm. So, we have 280ppm divided by 1.8ºC equals 156ppm/ºC. But, the warming is not fully realized by the time of doubling, and the models go on to produce a total warming of about 3ºC for the same 280ppm rise. This gives us, 280ppm divided by 3ºC which equals 93ppm/ºC. The degree to which the models have things exactly right is highly debatable, but close to the middle of all of this is that 125ppm/ºC number again—the same that we get from observations.
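The model-based bracketing works out like this, using the article's round figures of 280 ppm per doubling, 1.8ºC of warming realized at doubling, and 3ºC of eventual total warming:

```python
doubling_ppm = 280.0  # the article's round-number increment for a CO2 doubling

transient_ppm_per_c = doubling_ppm / 1.8    # ~156 ppm/C at the time of doubling
equilibrium_ppm_per_c = doubling_ppm / 3.0  # ~93 ppm/C for the eventual warming
midpoint = (transient_ppm_per_c + equilibrium_ppm_per_c) / 2
```

The midpoint lands within about 1 ppm/ºC of the 125 ppm/ºC figure obtained from observations.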
So both observations and models give us a similar number, within a range of loose assumptions.
Now we have what we need. It takes ~14,138mmt of CO2 emissions to raise the atmospheric CO2 concentration by ~1 ppm and it takes ~125 ppm to raise the global temperature ~1ºC. So multiplying ~14,138mmt/ppm by ~125ppm/ºC gives us ~1,767,250mmt/ºC.
That’s our magic number—1,767,250.
Write that number down on a piece of paper and put it in your wallet or post it on your computer.
This is a handy-dandy and powerful piece of information to have, because now, whenever you are presented with an emissions savings that some action to save the planet from global warming is supposed to produce, you can actually see how much of a difference it will really make. Just take the emissions savings (in units of mmt of CO2) and divide it by 1,767,250.
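The whole rule of thumb fits in a one-line function. The constants are the article's own; this is the same rough, back-of-the-envelope conversion, not a climate model:

```python
MMT_PER_PPM = 14_138     # mmt of CO2 emissions per 1 ppm concentration rise
PPM_PER_DEGREE = 125     # ppm of CO2 per 1 degree C of warming
MMT_PER_DEGREE = MMT_PER_PPM * PPM_PER_DEGREE  # the "magic number": 1,767,250

def warming_avoided_c(emissions_saved_mmt):
    """Rough degrees C of warming avoided by an emissions cut (in mmt CO2)."""
    return emissions_saved_mmt / MMT_PER_DEGREE
```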
Just for fun, let’s see what we get when we apply this to a few save-the-world suggestions.
According to NativeEnergy.com (in association with Al Gore’s ClimateCrisis.net), if you stopped driving your average mid-sized car for a year, you’d save about 5.5 metric tons (or 0.0000055 million metric tons, mmt) of CO2 emissions per year. Divide 0.0000055mmtCO2 by 1,767,250 mmt/ºC and you get a number too small to display on my 8-digit calculator (OK, Excel tells me the answer is 0.00000000000311ºC). And, if you send in $84, NativeEnergy will invest in wind and methane power to offset that amount in case you actually don’t want to give up your car for the year. We’ll let you decide if you think that is worth it.
How about something bigger like not only giving up your mid-sized car, but also your SUV and everything else your typical household does that results in carbon dioxide emissions from fossil fuels. Again, according to NativeEnergy.com, that would save about 24 metric tons of CO2 (or 0.000024 mmt) per year. Dividing this emissions savings by our handy-dandy converter yields 0.0000000000136ºC/yr. If you lack the fortitude to actually make these sacrifices to prevent one hundred billionth of a degree of warming, for $364 each year, NativeEnergy.com will offset your guilt.
And finally, looking at the Waxman-Markey Climate Bill that is now being considered by Congress, CO2 emissions from the U.S. in the year 2050 are proposed to be 83% less than they were in 2005. In 2005, U.S. emissions were about 6,000 mmt, so an 83% cut would leave 1,020mmt, a reduction of 4,980mmtCO2. 4,980 divided by 1,767,250 = 0.0028ºC per year. In other words, even if the entire United States reduced its carbon dioxide emissions by 83% below current levels, it would only amount to a reduction of global warming of less than three-thousandths of a ºC per year. A number that is scientifically meaningless.
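All three examples above reduce to the same division; a quick script, with the emissions savings as quoted in the text, reproduces the numbers:

```python
MMT_PER_DEGREE = 14_138 * 125   # the article's conversion: mmt CO2 per degree C

# Emissions savings in mmt of CO2 per year, as quoted in the text.
savings_mmt = {
    "park one mid-sized car": 0.0000055,              # 5.5 metric tons
    "eliminate a household's fossil CO2": 0.000024,   # 24 metric tons
    "Waxman-Markey 83% cut vs. 2005": 4980.0,         # 6,000 mmt x 0.83
}
for action, saved in savings_mmt.items():
    print(f"{action}: {saved / MMT_PER_DEGREE:.2e} degrees C avoided per year")
```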
This is the type of information that we should be provided with. And, as we have seen here, it is not that difficult to come by.
The fact that we aren’t routinely presented with this information leads to the inescapable conclusion that it is purposefully being withheld. None of the climate do-gooders want you to know that what they are suggesting/demanding will do no good at all (at least as far as global warming is concerned).
So, if you really want to, dust off your bicycle, change out an incandescent bulb with a compact fluorescent, or support legislation that will raise your energy bill. Just realize that you will be doing so for reasons other than saving the planet. It is a shame that you have to hear that from us, rather than directly from the folks urging you on (under false pretenses).
World Climate Report, April 30, 2009
We are always hearing about ways that you can “save the planet” from the perils of global warming—from riding your bicycle to work, to supporting the latest national greenhouse gas restriction limitations, and everything in between.
In virtually each and every case, advocates of these measures provide you with the amount of greenhouse gas emissions (primarily carbon dioxide) that will be saved by the particular action.
And if you want to figure this out for yourself, the web is full of CO2 calculators (just google “CO2 calculator”) which allow you to calculate your carbon footprint and how much it can be reduced by taking various conservations steps—all with an eye towards reducing global warming.
However, in absolutely zero of these cases are you told, or can you calculate, how much impact you are going to have on the actual climate itself. After all, CO2 emissions are not climate—they are gases. Climate is temperature and precipitation and storms and winds, etc. If the goal of the actions is to prevent global warming, then you shouldn’t really care a hoot about the amount of CO2 emissions that you are reducing, but instead, you want to know how much of the planet you are saving. How much anthropogenic climate change is being prevented by unplugging your cell phone charger, from biking to the park, or from slashing national carbon dioxide emissions?
Why do none of the CO2 calculators give you that most valuable piece of information? Why don’t the politicians, the EPA, and/or greenhouse gas reduction advocates tell you the bottom line?
How much global warming are we avoiding?
Embarrassingly for them, this information is readily available.
After all, what do you think climate models do? Simply, they take greenhouse gas emissions scenarios and project the future climate—thus providing precisely the answer we are looking for. You tweak the scenarios to account for your emission savings, run the models, and you get your answer.
Since climate model projections of the future climate are what are being used to attempt to scare us into action, climate models should very well be used to tell us how much of the scary future we are going to avoid by taking the suggested/legislated/regulated actions.
So where are the answers?
OK, so full-fledged climate models are very expensive tools—they are extremely complex computer programs that take weeks to run on the world’s fastest supercomputers. So, consequently, they don’t lend themselves to web calculators.
But, you would think that in considering our national energy plan, or EPA’s plan to regulate CO2, that this would be of enough import to deserve a couple of climate model runs to determine the final result. Otherwise, how can the members of Congress fairly assess what it is they are considering doing? Again, if the goal is to change the future course of climate to avoid the potential negative consequences of global warming, then to what degree is the plan that they are proposing going to be successful? Can it deliver the desired results? The American public deserves to know.
In lieu of full-out climate models, there are some “pocket” climate models that run on your desktop computer in a matter of seconds and which are designed to emulate the large-scale output from the complex general circulation models. One of best of these “pocket” models is the Model for the Assessment of Greenhouse-gas Induced Climate Change, or MAGICC. Various versions of MAGICC have been used for years to simulate climate model output for a fraction of the cost. In fact, the latest version of MAGICC was developed under a grant from the EPA. Just like a full climate model, MAGICC takes in greenhouse gas emissions scenarios and outputs such quantities as the projected global average temperature. Just the thing we are looking for. It would only take a bit of technical savvy to configure the web-based CO2 calculators so that they interfaced with MAGICC and produced a global temperature savings based upon the emissions savings. Yet not one has seemed fit to do so. If you are interested in attempting to do so yourself, MAGICC is available here.
As a last resort, for those of us who don’t have general circulation models, supercomputers, or even much technical savvy of our own, it is still possible, in a rough, back-of-the-envelope sort of way, to come up with a simple conversion from CO2 emissions to global temperatures. This way, what our politicians and favorite global warming alarmists won’t tell us, we can figure out for ourselves.
Here’s how.
We need to go from emissions of greenhouse gases, to atmospheric concentrations of greenhouse gases, to global temperatures.
We’ll determine how much CO2 emissions are required to change the atmospheric concentration of CO2 by 1 part per million (ppm), then we’ll figure out how many ppms of CO2 it takes to raise the global temperature 1ºC. Then, we’ll have our answer.
So first things first. Figure 1 shows the total global emissions of CO2 (in units of million metric tons, mmt) each year from 1958-2006 as well as the annual change in atmospheric CO2 content (in ppm) during the same period. Notice that CO2 emissions are rising, as is the annual change in atmospheric CO2 concentration.
[figure 1]
Figure 1. (top) Annual global total carbon dioxide emissions (mmt), 1958-2006; (bottom) Year-to-year change in atmospheric CO2 concentrations (ppm), 1959-2006. (Data source: Carbon Dioxide Information Analysis Center)
If we divide the annual emissions by the annual concentration change, we get Figure 2—the amount of emissions required to raise the atmospheric concentration by 1 ppm. Notice that there is no trend at all through the data in Figure 2. This means that the average amount of CO2 emissions required to change the atmospheric concentration by a unit amount has stayed constant over time. This average value in Figure 2 is 15,678mmtCO2/ppm.
[figure 2]
Figure 2. Annual CO2 emissions responsible for a 1 ppm change in atmospheric CO2 concentrations (Figure 1a divided by Figure 1b), 1959-2006. The blue horizontal line is the 1959-2006 average, the red horizontal line is the average excluding the volcano-influenced years of 1964, 1982, and 1992.
You may wonder about the two large spikes in Figure 2—indicating that in those years, the emissions did not result in much of a change in the atmospheric CO2 concentrations. It turns out that the spikes, in 1964 and 1992 (and a smaller one in 1982), are the result of large volcanic eruptions. The eruptions cooled the earth by blocking solar radiation and making it more diffuse, which has the dual effect of increasing the CO2 uptake by oceans and increasing the CO2 uptake by photosynthesis—both effects serving to offset the effect of the added emissions and resulting in little change in the atmospheric concentrations. As the volcanic effects attenuated in the following year, the CO2 concentrations then responded to emissions as expected.
Since volcanic eruptions are more the exception than the norm, we should remove them from our analysis. When we do, the average amount of CO2 emissions that leads to an atmospheric increase of 1 ppm drops from 15,678 (the blue line in Figure 2) to 14,138mmtCO2 (the red line in Figure 2).
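This per-ppm averaging is easy to reproduce. The sketch below uses illustrative placeholder numbers (not the actual CDIAC series) just to show the calculation, with the volcano-influenced years masked out:

```python
# Sketch: average emissions required per 1 ppm rise in atmospheric CO2,
# excluding volcano-influenced years. Values here are placeholders only.
years = [1990, 1991, 1992, 1993]
emissions_mmt = [22000.0, 22300.0, 22100.0, 22400.0]  # global CO2 emissions (mmt)
delta_ppm = [1.3, 1.0, 0.6, 1.5]                      # annual change in CO2 (ppm)

volcano_years = {1964, 1982, 1992}  # Agung, El Chichon, Pinatubo

# Divide each year's emissions by that year's concentration change,
# skipping the volcano years, then average the ratios.
ratios = [e / d for y, e, d in zip(years, emissions_mmt, delta_ppm)
          if y not in volcano_years]
avg_mmt_per_ppm = sum(ratios) / len(ratios)
print(round(avg_mmt_per_ppm))
```

Run against the real 1959-2006 emissions and concentration data, this same averaging yields the 14,138mmtCO2/ppm figure quoted above.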
Now, we need to know how many ppms of CO2 it takes to raise the global temperature a degree Celsius. This is a bit trickier, because this value is generally not thought to be constant, but instead to decrease with increasing concentrations. But, for our purposes, we can consider it to be constant and still be in the ballpark. But what is that value?
We can try to determine it from observations.
Over the past 150 years or so, the atmospheric concentration of CO2 has increased about 100 ppm, from ~280ppm to ~380ppm, and global temperatures have risen about 0.8ºC over the same time. Dividing the concentration change by the temperature change (100ppm/0.8ºC) produces the answer that it takes 125ppm to raise the global temperature 1ºC. Now, it is possible that some of the observed temperature rise has occurred as a result of changes other than CO2 (say, solar, for instance). But it is also possible that the full effect of the temperature change resulting from the CO2 changes has not yet been manifest. So, rather than nit-pick here, we’ll call those two things a wash and go with 125ppm/ºC as a reasonable value as determined from observations.
We can also try to determine it from models.
Climate models run with only CO2 increases produce about 1.8ºC of warming at the time of a doubling of the atmospheric carbon dioxide concentrations. A doubling is usually taken to be a change of about 280ppm. So, we have 280ppm divided by 1.8ºC, which equals 156ppm/ºC. But the warming is not fully realized by the time of doubling, and the models go on to produce a total warming of about 3ºC for the same 280ppm rise. This gives us 280ppm divided by 3ºC, which equals 93ppm/ºC. The degree to which the models have things exactly right is highly debatable, but close to the middle of all of this is that 125ppm/ºC number again—the same that we get from observations.
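The arithmetic in the last two paragraphs can be checked in a few lines:

```python
# Back-of-the-envelope estimates of ppm of CO2 per degree C of warming.
obs = 100 / 0.8                # observations: ~100 ppm rise, ~0.8 C of warming
model_at_doubling = 280 / 1.8  # models: warming realized at the time of doubling
model_eventual = 280 / 3.0     # models: total warming eventually produced

print(obs)                                             # 125.0
print(round(model_at_doubling, 1), round(model_eventual, 1))  # 155.6 93.3
```

The observational value sits roughly midway between the two model-derived values, which is why 125ppm/ºC is a reasonable round number to carry forward.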
So both observations and models give us a similar number, within a range of loose assumptions.
Now we have what we need. It takes ~14,138mmt of CO2 emissions to raise the atmospheric CO2 concentration by ~1 ppm, and it takes ~125 ppm to raise the global temperature ~1ºC. So multiplying ~14,138mmt/ppm by ~125ppm/ºC gives us ~1,767,250mmt/ºC.
That’s our magic number—1,767,250.
Write that number down on a piece of paper and put it in your wallet or post it on your computer.
This is a handy-dandy and powerful piece of information to have, because now, whenever you are presented with an emissions savings that some action to save the planet from global warming is supposed to produce, you can actually see how much of a difference it will really make. Just take the emissions savings (in units of mmt of CO2) and divide it by 1,767,250.
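If you would rather not keep the number in your wallet, the whole conversion fits in a few lines of code (a minimal sketch; the constant and function names are ours, not standard):

```python
MMT_PER_PPM = 14_138    # mmt of CO2 emissions per 1 ppm rise in concentration
PPM_PER_DEG_C = 125     # ppm of CO2 per 1 degree C of global warming
MMT_PER_DEG_C = MMT_PER_PPM * PPM_PER_DEG_C  # the magic number: 1,767,250

def warming_avoided(emissions_saved_mmt):
    """Convert an emissions saving (mmt CO2 per year) to avoided warming (C per year)."""
    return emissions_saved_mmt / MMT_PER_DEG_C
```

For example, `warming_avoided(0.0000055)` for parking the mid-sized car comes out to roughly 3.11e-12ºC, matching the figure below.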
Just for fun, let’s see what we get when we apply this to a few save-the-world suggestions.
According to NativeEnergy.com (in association with Al Gore’s ClimateCrisis.net), if you stopped driving your average mid-sized car for a year, you’d save about 5.5 metric tons (or 0.0000055 million metric tons, mmt) of CO2 emissions per year. Divide 0.0000055mmtCO2 by 1,767,250 mmt/ºC and you get a number too small to display on my 8-digit calculator (OK, Excel tells me the answer is 0.00000000000311ºC). And, if you send in $84, NativeEnergy will invest in wind and methane power to offset that amount in case you actually don’t want to give up your car for the year. We’ll let you decide if you think that is worth it.
How about something bigger, like not only giving up your mid-sized car, but also your SUV and everything else your typical household does that results in carbon dioxide emissions from fossil fuels. Again, according to NativeEnergy.com, that would save about 24 metric tons of CO2 (or 0.000024 mmt) per year. Dividing this emissions savings by our handy-dandy converter yields 0.0000000000136ºC/yr. If you lack the fortitude to actually make these sacrifices to prevent one hundred-billionth of a degree of warming, for $364 each year, NativeEnergy.com will offset your guilt.
And finally, looking at the Waxman-Markey Climate Bill that is now being considered by Congress, CO2 emissions from the U.S. in the year 2050 are proposed to be 83% less than they were in 2005. In 2005, U.S. emissions were about 6,000 mmt, so 83% below that would be 1,020mmt, a reduction of 4,980mmtCO2. 4,980 divided by 1,767,250 = 0.0028ºC per year. In other words, even if the entire United States reduced its carbon dioxide emissions by 83% below current levels, it would only amount to a reduction of global warming of less than three-thousandths of a ºC per year, a number that is scientifically meaningless.
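The Waxman-Markey arithmetic above can be verified the same way:

```python
us_2005_mmt = 6000.0                   # approximate U.S. CO2 emissions in 2005 (mmt)
remaining = us_2005_mmt * (1 - 0.83)   # an 83% cut leaves about 1,020 mmt
reduction = us_2005_mmt - remaining    # about 4,980 mmt of emissions avoided
warming_avoided_per_yr = reduction / 1_767_250  # our mmt-per-degree conversion

print(round(remaining), round(reduction), round(warming_avoided_per_yr, 4))
```

Even under the most generous reading, the largest emissions-reduction proposal on the table yields under three-thousandths of a degree per year.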
This is the type of information that we should be provided with. And, as we have seen here, it is not that difficult to come by.
The fact that we aren’t routinely presented with this data leads to the inescapable conclusion that it is purposefully being withheld. None of the climate do-gooders want you to know that what they are suggesting/demanding will do no good at all (at least as far as global warming is concerned).
So, if you really want to, dust off your bicycle, change out an incandescent bulb with a compact fluorescent, or support legislation that will raise your energy bill. Just realize that you will be doing so for reasons other than saving the planet. It is a shame that you have to hear that from us, rather than directly from the folks urging you on (under false pretenses).
Corporate Tax Reform. WaPo Editorial
The president's tax plan can be the start of an important discussion.
WaPo, Tuesday, May 5, 2009
EXPECT President Obama's international tax proposal to set off a firestorm in Congress and the business community. The proposal, which will be released in more detail when the president's full budget comes out in the next few days, would alter tax rules so that companies are less able to shift profits to avoid U.S. taxation while also cracking down on tax havens for companies and individuals. In a speech yesterday, the president billed the plan as a revenue raiser, which it would be, and a plan to create jobs, a more contentious assertion.
Corporate tax policy is certainly in need of reform. The United States has the second-highest corporate tax rate in the world, though at just over one-tenth of the budget, the overall share of revenue it raises is remarkably small, both because the corporate tax base is chopped up by so many deductions, exemptions and credits, and because larger companies have great flexibility in shifting their profits around the world to lower their tax bills. In December, the Government Accountability Office reported that 83 of the nation's 100 largest companies have subsidiaries in tax havens, with Citigroup, Morgan Stanley and News Corp. (think Fox News) leading the way, each with more than 150 subsidiaries in tax haven locations. Many companies have legitimate business in these places, and many that are there solely to minimize their tax bills are doing so legally. Still, there is ample room for tax streamlining, given, for instance, the imbalance between domestic companies that are not able to shift profits and multinationals that can to some extent pick and choose where they pay taxes. Additionally, it's good to see the administration looking for revenue to promote investment and reduce the deficit.
However, some of the changes the administration is contemplating could harm U.S. competitiveness. Higher tax burdens would put U.S. corporations at a disadvantage compared with foreign competitors that do not face the double tax regime to which some corporations would be subject. The administration cited numbers showing that in 2004, U.S. multinationals paid $16 billion in taxes on $700 billion in foreign earnings, but it did not mention the $120 billion in foreign taxes they paid that year. Trade groups will argue that the increased cost of doing business will lead to job losses in the United States, not the gains promised by Mr. Obama.
The corporate tax code is a mess -- the rates are too high and the complexity extreme. One promising approach would be to pair the types of reforms the administration is talking about with a reduction in the overall rate, which would be advantageous for both multinational and domestic companies -- though, of course, it would reduce the amount of revenue that would be raised as a result of the policy. While it was never a centerpiece of his campaign, Mr. Obama supported reducing corporate income tax rates during his run for the presidency as long as it was done in a revenue-neutral manner. This proposal could be a way to get that broader discussion of corporate tax reform started.