Archive for February, 2011

By Harry {doc} Babad, © Copyright 2011, All Rights Reserved. 


The materials I share in the articles that follow come from the various weekly science and environmental newsletters, both pro and anti any given subject’s focus or technologies, as well as blogs to which I subscribe.

Article selection (my article, my choice) is obviously and admittedly biased by my training, experience and, at rare times, my emotional and philosophical intuitive views of what works and what will not. But if you have a topic I neglect, send me feedback and I’ll give it a shot.

Since my topic segments are only a partial look at the original article, click on through the provided link if you want more details, as well as (often) other background references on the topic(s). Doc.

Titles, As Usual, in No Formal Order, The New Snippets

  • Turning Tough Trash Into Food-Friendly Fuel
  • Fool’s Gold Catches Eye Of Solar Energy Researchers
  • Economies Of Scale: The Cost Of Nuclear New Build In America — It’s not the cost of the first one that ultimately counts.
  • Potholes On The Road To Renewable Fuels — Corn-kernel-based ethanol hits the fast lane, but cellulosic ethanol is still mostly stuck in first gear
  • Strip Search: How Safe are Airports’ New X-ray Scanners?
  • “Cheap energy”: Could natural gas be stepping on the renewable sector’s toes?

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Turning Tough Trash Into Food-Friendly Fuel

Researchers are making steps toward producing biofuels from the abundant plant materials we don’t eat.

In her search for a better way to put fuel in your tank, biological engineer Ratna Sharma-Shivappa is working on a chemical juggling act: She is trying to break down the problematic woody material in grasses without harming the energy-containing carbohydrates that the plants also contain. If she can perfect the process, it could lead to inexpensive biofuels that are made from inedible crops—not from corn like most of today’s ethanol.

If scalable, this would likely eliminate, or drastically reduce, the difficult and highly politically driven choice of using corn-based ethanol for fuel rather than feeding the world’s hungry. Once again America’s factory-farm-supported farm lobbies have convinced the DOE and EPA to increase the allowable ethanol in our gasoline to 15%, engine corrosion problems notwithstanding. This time against the will of the automotive industry. There’s also the now demonstrated fact that corn-based ethanol is, based on life-cycle carbon releases, a negative pollution-control force: growing corn releases more greenhouse gases than adding ethanol to fuel saves. Indeed, switching to more corn ethanol in fuel does little except line the pockets of ‘big’ agriculture and fund farm-state politicians.

By exposing ground-up miscanthus grass (a relative of sugarcane) to ozone gas, Sharma-Shivappa and her colleagues at North Carolina State University were able to break down the tough structural molecule called lignin, allowing them to access the valuable carbohydrates without degrading them. Enzymes then split the carbs into sugars, which are fermented to make ethanol. Although ozone is pricey, the technique works at room temperature and does not require high pressure, advantages that Sharma-Shivappa believes will help keep it cost-effective. Next she will test the ozone treatment on other potential biofuel plants. “This should be applicable to most lignin crops,” such as switchgrass, she says. There’s a bit more about alternatives to ozonization in the linked article.

Doc Sez: this is broader than just miscanthus grass, since it might also be applicable to Brazilian sugar cane residues (biomass), as an alternative to the caustic treatment and/or possible enzymatic processing now under study.

Article by Valerie Ross, Discover Magazine, December 2010 issue

Added Reading

Fermenting Cane Biomass to Fuel in Brazil

Ethanol Production Via Enzymatic Hydrolysis Of Sugar-Cane Bagasse And Straw In Brazil

Cellulosic Ethanol – Wikipedia, 2011

Is Ethanol Really More Eco-Friendly Than Gas?

Ethanol, Schmethanol, The Economist, September 2007

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Fool’s Gold Catches Eye Of Solar Energy Researchers


There are several issues related to the technology on which solar energy is based, but in one word they relate to competitive, unsubsidized cost. Three examples:

  • Cost of the semiconductors used to make solar cells
  • Cost of solar energy compared to that from natural gas, nuclear and of course coal
  • Finding low-cost storage to allow solar energy to meet our industrial and urban base-load requirements

An improvement in any of these areas gets us closer to using the sun to generate electricity on a real-world competitive basis. Yes, readers, I do understand that some of the competition becomes fairer to solar should the governments of this world adopt either a carbon tax or, better yet, change the focus of vested interests as discussed in a recent article in the January 2011 Economist. Another alternative being talked about is Lowering Income Taxes While Raising Pollution Taxes Reaps Great Returns, published in the Sustainability blog in April 2010.

Iron pyrite – also known as fool’s gold – may be worthless to treasure hunters, but it could become a bonanza to the solar industry. The mineral, among the most abundant in the earth’s crust, is usually discarded by coal miners or sold as nuggets in novelty stores.

But researchers at the University of California-Irvine said they could soon turn fool’s gold into a cheaper alternative to the rare and expensive materials now used in making solar panels. “With alternative energy and climate-change issues, we’re always in a race against time,” said lead researcher Matt Law. “With some insight and a little bit of luck, we could find a good solution with something that’s now disposed of as useless garbage.”

The UC-Irvine team believes the mineral can be processed into a thin film for use in photovoltaic cells, and could eventually convert sunlight into electricity at roughly the same rate as existing technology. Though it’s too early to estimate the cost of cells made with pyrite, Law said they’re likely to be cheaper because fool’s gold is so readily available. A prototype could be ready within the year, but it could be at least three years before the cells are commercially available. Some industry analysts, however, are skeptical that the team – which includes a chemist, a mathematician and a physicist – can hit pay dirt. There’s more… some of it negative, from folks with a vested interest in the existing technology.

PhysOrg.Com Blog, January 21st, 2011 © 2011, Los Angeles Times

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Economies Of Scale: The Cost Of Nuclear New Build In America

— It’s not the cost of the first one that ultimately counts.

Article by Jack Craze, Nuclear Energy Insider, November 2010

The cost of nuclear new build is a source of major contention in the US. President Obama’s administration has proposed tripling the size of the loan guarantee program to $56 billion. Industry figures say this is not nearly enough to kick-start the nuclear renaissance, while the general public remains fiercely opposed to anything resembling another federal subsidy package.

The costs of building a nuclear reactor are, in many people’s minds, prohibitively high. In America, a lot of people remember the hundreds of billions of dollars ‘squandered’ on nuclear energy in the 1980s. Others point to the recent price escalation (to around $10 billion) for the Olkiluoto 3 reactor in Finland. And while a record-high 74% of Americans say they support the development of nuclear energy in the US, the upfront costs of construction remain a problem, particularly in the (potential) middle of a recession.

Westinghouse, one of America’s leading commercial nuclear companies, puts the installation costs of one of its 960-megawatt (MW) reactors at $7 billion. This compares to $2.5 billion for a 750 MW coal plant, and $3 billion for a 600 MW hydro plant. “What we have to remember,” observes Dr. Jim Conca, Senior Scientist at the Institute for Energy and the Environment at New Mexico State University, “is that as you build more reactors, or anything at an engineering scale, the cost comes down. For example, the South Koreans’ sixth nuclear reactor cost about 40% less than their first. And in China, they’re building nuclear reactors at about $3 billion a unit.

Doc Sez: Look at the projections for the production costs of the new Nissan Leaf. At the initial low production levels the MSRP is $32,780, offset by major federal and state subsidies to perhaps as low as $25,280 in some states. In a recent interview, Nissan CEO Carlos Ghosn noted that he expects to be price competitive without government subsidies when annual Leaf production hits 500,000 units per year (down from a previous forecast of 1 million). And that’s without a major breakthrough in battery costs.

True, the Chinese, for now, have low-cost labor which accounts for some of that lower cost, but it does show you that the $7 billion Westinghouse price-tag is a very conservative estimate.”
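Doc Sez: Conca’s Korean data point implies a textbook experience curve. Here is a back-of-the-envelope fit using Wright’s law, a standard cost-learning model; the exponent below is derived only from the two numbers in the quote (sixth unit, 40% cheaper), so treat the whole sketch as illustrative rather than an industry forecast:

```php
<?php
// Back-of-the-envelope experience curve (Wright's law): cost_n = cost_1 * n^(-b).
// Fit b from the single data point in the quote: the 6th unit cost 40% less.
$b = -log(0.60) / log(6);          // PHP's log() is the natural log; b ~ 0.285

// Per-doubling progress ratio: each doubling of units built multiplies cost by this.
$progressRatio = pow(2, -$b);      // ~0.82, i.e. roughly an 18% learning rate

// Applied to Westinghouse's $7 billion first unit, the 6th would run about:
$cost6 = 7.0 * pow(6, -$b);        // $4.2 billion (0.60 * 7.0, by construction)

echo round($progressRatio, 2) . "\n";  // 0.82
echo round($cost6, 1) . "\n";          // 4.2
```

An 18% learning rate is well within the range reported for large construction programs, which is why the single Korean data point is at least plausible, even if one anecdote cannot fit a curve.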

The article goes on to discuss the role of federal loan guarantees in kick-starting initial reactor construction, minor indirect support (e.g., a loan guarantee is not a grant) compared to that of France, Germany, Korea and Japan, who are serious about nuclear energy. It concludes with an overview of trends in construction, commodities and long-term costs. It makes a good read; check it out.

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Potholes On The Road To Renewable Fuels

— Corn-kernel-based ethanol hits the fast lane, but cellulosic ethanol is still mostly stuck in first gear

Article by Jeff Johnson, September 13, 2010, Chemical and Engineering News

Four years ago, speaking to 1,300 ethanol supporters in the heart of the Corn Belt, then-president George W. Bush gave a rousing speech singing the praises of biofuels, particularly corn-kernel-based ethanol. His speech on the eve of the 2006 congressional elections was music to the ears of the crowd attending the government-organized St. Louis conference, aptly titled “Advancing Renewable Energy: An American Rural Renaissance.”

The president outlined his plan to offer tax credits, subsidies, and federal research support to fuel a drive for ethanol that would move the nation “beyond a petroleum-based economy and make our dependence on Middle Eastern oil a thing of the past.” He added that cellulosic ethanol made from nonfood sources, waste, and energy crops was “right around the corner” and would be “practical and competitive within six years.”

Bush’s support for ethanol and his mix of energy, economic, and electoral policies have been continued by President Barack Obama, particularly the push for fuels made from cellulosic feedstocks. Obama’s Departments of Energy and Agriculture have offered billions of dollars to support cellulosic ethanol R&D and biorefinery construction. But despite the money and talk, no commercial cellulosic ethanol biorefinery is operating in the USA today, and the most optimistic cellulosic ethanol boosters acknowledge that commercial-scale production could be years away.

I wonder what the Brazilians and apparently the Chinese are doing right?

Meanwhile, in the US, they clamor for additional federal support.

Where have I heard this song before?

The article makes good reading, and the folks at the American Chemical Society’s magazine [C&EN] do a credible job of getting their facts straight.

I found the discussion of diverting a major international, though not US, food staple of particular concern.

The continued competition between corn for food and corn for fuel worries food and agricultural experts. Cellulosic ethanol was supposed to ease the demand for corn as fuel, but instead, reliance on corn as a gasoline additive has become secure, and now the price of corn is “hooked” to the volatile price of oil, according to Craig Cox, Midwest vice president of Environmental Working Group (EWG), a nonprofit research organization. Cox, a former USDA official and congressional staff member, believes that when oil prices rise, they will drive up the price of corn ethanol and consequently the price of corn—with a ripple effect on the cost of grains throughout the world.

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Strip Search: How Safe are Airports’ New X-ray Scanners?

By Alice Park, Time Magazine, October 9, 2010.

Let me start this article by saying that tomorrow (February 26th) I am flying to Phoenix and expect, at least once on my trip, to pass through a set of the new scanners. Compared to all the other radiation exposures in my life this risk is a no-brainer. What, you ask?

  • I did part of my undergraduate research near an incompletely shielded cobalt-60 source,
  • I lived in Denver for about six years,
  • I was a frequent coast-to-coast flyer in the 1980s,
  • I had 7 gray of X-ray radiation treatment for a neck cancer,
  • I live with lousy teeth and so am X-rayed more often than most folks,
  • And… have had more than my share of CAT scans.

The only place I didn’t get more than a background radiation dose was working at the US DOE Hanford Nuclear Site for ca. 25 years. Okay, now to the article details.

Don’t be surprised if on your next trip to the airport, security personnel tell you to stop and put your arms up. No, you’re not being arrested. You’re being X-rayed from head to toe–or, more accurately, from toe to head.

The latest generation of airport scanners is designed to detect nonmetal weapons such as ceramic knives and explosive devices that can slip past magnetometers. The new machines–135 of them are already in operation, and nearly 1,000 are expected to be in place by the end of 2011–rely on low-intensity radiation that is absorbed a few millimeters into your skin and then reflected back, creating a reasonably accurate contour image of your body and anything else underneath your clothes. When the Transportation Security Administration (TSA) began rolling out the so-called backscatter machines in March, the agency, along with the Food and Drug Administration (FDA) and the National Institute of Standards and Technology, assured the public that the radiation dose from a scan was negligible–far lower not only than the amount in a chest X-ray but also than the levels passengers absorb from cosmic rays on a cross-country flight.

The backscatter numbers, however, seemed too good to be true to several scientists, including John Sedat, a biophysics professor at the University of California, San Francisco. After studying the degree of detail obtained in the seconds-long scans, the scientists wondered how the radiation exposure could be so low. The answer, they concluded, lay in how the manufacturer and government officials measured the dose: by averaging the exposure from the beam over the volume of the entire body. This is how scientists measure exposure from medical X-rays, which are designed to zap straight through bone and tissue. But backscatter beams skim the body’s surface. Sedat and his colleagues maintain that if the dose were based only on skin exposure, the result would be 10 to 20 times the manufacturer’s calculations.
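Doc Sez: the disputed 10-to-20x factor is mostly an averaging choice, not a dispute about the beam itself. Absorbed dose in gray is deposited energy divided by absorbing mass, so the same beam energy averaged over a few kilograms of skin reads far higher than when averaged over the whole body. A toy calculation (all numbers purely illustrative, not actual scanner data) makes the point:

```php
<?php
// Purely illustrative numbers -- not actual backscatter scanner data.
// Absorbed dose (gray) = deposited energy (joules) / absorbing mass (kg).
$energy    = 1.0e-6;  // hypothetical energy one scan deposits, in joules
$wholeBody = 70.0;    // kg: whole-body averaging, the medical X-ray convention
$skinLayer = 4.0;     // kg: rough guess at the few-millimeter layer the beam reaches

$skinDose      = $energy / $skinLayer;
$wholeBodyDose = $energy / $wholeBody;

// Same beam, same deposited energy; only the denominator changed.
echo $skinDose / $wholeBodyDose;  // 17.5, within the 10-to-20x range cited
```

Note that the hypothetical energy value cancels out entirely; the ratio depends only on which mass you divide by, which is exactly Sedat’s point.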

That’s a huge difference, but the higher amount, TSA and FDA officials maintain, still falls within the limits of safe radiation exposure. Based on measurements conducted by the FDA as well as by technicians at Johns Hopkins University and elsewhere, says the FDA’s Daniel Kassiday, “We are confident that full-body-X-ray security products and practices do not pose a significant risk to the public health.”

Check this out: there’s a difference of opinion on the use of one type of scanning, the one that uses backscatter methods, and on other devices being implemented, but the bottom line is that the risks to an individual are low. What these concerned scientists worry about is the population dose to the 8,000,000 people worldwide, including children, who fly each year. I agree that more studies are needed, but unless, in addition to ‘absolute’ risk, they relate the added risks of malignancy to those from other sources of pollution, this will become another brainless media fest. Meanwhile my grandson, who works for the TSA, says that at least in Seattle folks have made very little fuss about the scanners… and after all, good news makes very poor headlines.

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

“Cheap energy”: Could natural gas be stepping on the renewable sector’s toes?

By Heba Hashem, Middle East Correspondent, Nuclear Energy Insider, 6 December 2010.

Liquefied gas capacity will shoot up 47% by the end of 2013, according to the International Energy Agency (IEA), which will threaten investments in the renewables sector.

Although prices of renewable energy are coming down with technology advances, the intermittent nature of the energy production from renewable sources is making natural gas more appealing and investment worthy to companies.

Last month, Qatar’s energy minister said that natural gas would become more desirable than other energy sources, including renewables, which are environmentally promising but remain too expensive.

Wind speed is ideal for operating turbines at a height of around 800 meters, but building a tower that high isn’t feasible. Still, wind energy has zero marginal cost, and thus can be profitable in the right environment.

Today’s recession dictating future decisions — According to Dr. Ray Perryman, a US- based economist and president of the Perryman Group:  “Wholesale and to some extent retail markets for electricity are becoming less regulated and more competitive over time. When prices rise, the emphasis will shift to renewables”.

“This ebb and flow is the nature of markets, but sophisticated companies are now investing billions of dollars in renewable transmission infrastructure, and new wind and solar manufacturing plants continue to expand”.

Because emerging countries have an accelerating demand for energy, there is going to be high demand for all sources (traditional and renewables). “The recession has interrupted this pattern temporarily, but not fundamentally”.

The golden age of gas may lead to cheaper gas prices for consumers, but it will also result in a rush to build gas-fired power plants at the expense of much cleaner forms of electricity generation. The IEA estimates that 35% of the increase in global gas production to 2035 will come from such unconventional projects.

Moreover, oil giants like Shell and Exxon-Mobil are shifting their business focus and repositioning themselves as gas producers, with Shell marketing gas as a cleaner, yet still CO2-producing, form of energy.

The article continues with an excellent discussion of shale gas and the US market, and ends with the usual question associated with competing energy sources in a changing regulatory environment: natural gas has a crucial role to play, but for how long?

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Copyright Notice: Product and company names and logos in this review may be registered trademarks of their respective companies.

Some of the articles listed in this column are copyright protected – their use is both acknowledged and limited to education-related purposes, which this column provides.

Sources & Credits: Many of these items were found by way of the links in the newsletter NewsBridge of ‘articles of interest’ to the national labs’ library, technical and regulatory agency users. NewsBridge is electronically published by the Pacific Northwest National Laboratory, in Richland WA. If using NewsBridge as a starting point, I follow the provided link to the source of the information and edit its content (mostly by shortening the details) for our readers. I also follow any contained links, where appropriate, in the actual article, and provide you those references as well as those gleaned from a short trip to Google-land. Obviously, if my source is a magazine or blog, that is the material I work with.

In addition, when duplicating materials that I cite, I do not fill the source words with quotation marks; the only place I keep quotes intact is where the original article ‘quotes’ another source external to itself. Remember, when Doc sticks his two bits in, it’s in italics and usually indented.

In Closing

I’ll be posting articles for your comfort and anger in the next few months. I never respond to flaming, but will take time to provide evidence in the form of both primary technical and secondary {magazine article} references for those who ask. However, most of you can reach out and Google such information for yourselves.

Readers Please Note: Read about my paradigms, views, prejudices and snarky attitudes at:

Furthermore, many of the technologies I share still have to prove that they are reliable, durable and scalable; and if you Google them in detail, you will find studies saying they are capable of being commercialized, and often as many other studies that are more skeptical. I find it always appropriate, as I read, to step back and WIIFT. No, it’s not something new to smoke; just the compulsion to ask what’s in it for them. It’s okay to have a hidden agenda, but agendas too hidden discomfort me.

What I know, perhaps even truly believe, is this: for green-energy-related items, if we put a simple price (tax) on carbon (greenhouse gases) and gave out no subsidies, these new technologies would have a better chance to blossom. With American ingenuity, Indian and Chinese too, thousands more ideas would come out of innovators’ garages. America still has the best innovation culture in the world. But we need better policies to nurture it, better infrastructure to enable it and more open doors to bring others here to try it.

Remember, conditions, both technical and geopolitical, continuously change. So if you’ve made up your mind about either the best way to go, or that it’s all a conspiracy, move on to the next article in our blog. Today’s favorite is tomorrow’s unintended consequence. However, that’s better than sticking one’s head in the sand or believing in perpetual motion. Remember, there’s no free lunch, and as a taxpayer and consumer you must always end up paying the piper!

May your world get greener and all creatures on Earth become healthier and more able to fulfill their function in this Gaia’s world.

By Harry {doc} Babad, © Copyright 2011. All Rights Reserved.

I have lately been inundated by wind power articles trying to convince me that (subsidized) wind power for the US is the next best thing to sliced white bread. One of my thoughts, ignored by most commentators, is that it does matter to me whether I must pay a direct rate increase for alternate energy, or the government sneaks it out of my pocket as a hidden tax, aka a subsidy. I object! I get stuck, and perhaps suckered, either way, while the industry and their political supporters prosper. Robert Heinlein coined the SciFi slang terms TANJ and TANSTAAFL; both speak to my views.

White Bread Analogy — Relative to the white bread, many of us have long been aware that the Wonder’s™ of this world have profitably convinced several generations of Americans of their products’ worth. We all should eat, so the message goes, these low-nutrition (they need fortification), low-fiber, either sweet or flavorless products with no mouth appeal, foisted on an innocent public addicted to advertising-as-truth. The paradigm is slowly changing: check out big-chain grocery stores and you’ll find much more in the way of whole grain, artisan and other healthier breads… still high-carb, but much better tasting and better for you, if using a bread maker does not fit your schedule.

Is the same true of wind power? If so, where in the US does it make sense? The Europeans, once so big on wind power that they heavily subsidized it as a silver bullet, have now gone off subsidization. Guess what: even in Europe, with a closely placed, urban-friendly grid, orders for wind turbines have dropped dramatically.

I recently came across two well-written and thought-provoking articles that naysay the wondrous benefits of wind power, which of course motivated this Op-Ed topic. I shall summarize their claims at the end of this article segment.

WIND POWER YES or NO — Questions That Need To be Considered

  • Is wind power competitive in your region, perhaps because it is easily connected to the local or regional grid?
  • Can a wind power system be developed, in the near future, to provide base-load uninterruptible power to urban and industrial America?
  • What impact will the cost of wind power have on base regional electrical rates if it is only confined to making up for base-load shortages? At what unsubsidized cost are any savings that result worth the life-cycle cost penalties? (E.g., how good is good?)
  • Have the much publicized estimated costs/benefits of wind power considered the ecological and greenhouse costs of making, installing and ultimately disposing of the windmills? (E.g., best-estimate ranges of full life-cycle costs. Even the global warming folks do this!)
  • With the major NIMBY response to mostly offshore and also to mountain-top turbine farms, are we placing wind farms in locations where the wind blows, sort of, but the distance to the industrial and urban consumer becomes an obstacle to true competitiveness?
  • How do the other environmental side effects, ranging from noise pollution to bird kills, compare to those of other energy sources, say natural gas, solar, nuclear (including mini-reactors) and of course oil?
  • How will wind power fare in the newly developed “Clean Energy Standards” that are being considered as alternatives to both cap-and-trade and a carbon tax?

Note, I left coal out of my list because I don’t believe there will ever be a cost-effective, politically acceptable, clean way either to use a ‘clean’ coal technology or to assure 100,000-year sequestration of the CO2 from coal burning. The number I’ve heard bandied about is 5,700 years to assure sequestration safety. Alas, I can find no credible analysis or regulatory basis for this number.

The two articles that intrigued me were:

OVERBLOWN: Windpower on the Firing Line (Part I), and
Oxymoronic Windpower (Part II: Windspeak)

Both were written by Jon Boone on September 13, 2010 and January 18, 2011 and are posted on the Master Resource the Free Energy Market Blog.

They are a fascinating combination of Boone’s adopted slang from both George Orwell’s 1984 and the Harry Potter books, and more. The articles also include a pithy description of the wind power industry’s doublespeak. The latter, in typical Madison Avenue style, is foisted on the public and brainwashed into politicians’ sense of political correctness. Of course all is funded by those who would profit, either financially or ideologically, from wind technology.

Although masked in sarcasm that covers sharp and biting analysis, I find Jon Boone’s analysis, replete with credible references, credible and accurate. It substantiates the studies I’ve done in an area where I try to keep up with the ever-evolving factual data. I quote…

Widespread misunderstanding about the difference between energy and power has given cover to the charlatan-like wind lobby, which pretends their wares provide something they do not. We are all familiar with black-white PR jargon that characterizes wind projects as mills, farms, and parks, despite the looming industrial presence of 450-foot tall turbines propelling rotors at tip speeds of nearly 200-mph for many miles along terrain or seabed. But for sheer oxymoronic audacity, nothing beats the trickeration of the term wind power, since the technology is the very antithesis of modern power performance. In fact, wind provides no modern power. Rather, it throws out spasmodic, highly skittering energy that cannot by itself be converted to modern base load power.

Although much of the first article in this series is filled with a general overview of energy and its role in modern society, it is an excellent read, worth your attention. Its underlying, and accurate, premise is that the diffuse nature of wind’s fuel requires (in most locations) continuous supplementation by reliable machines fueled by more energy-dense fuels, as well as virtually dedicated new transmission lines and voltage regulation systems. That is the kind and scope of activity that must happen to make wind deliver modern high-density continuous power.

Note: Unlike my usual practice, in this Op-Ed segment, quotes are in italics and my ‘purple prose’ is in plain or plain blue colored text.

The second article in this series provides details about the wind power industry and its campaign, which uses (albeit CO2-producing) coal as the antithesis of “clean” wind power, along with other Madison Avenue tactics, to create a favorable ‘climate’ for funding wind energy [e.g., the AWEA], despite wind’s low unit availability and capacity values. White bread, anyone?

Here are some factual insights, edited by me for brevity that Boone provides:

1. Despite more than 100,000 huge wind turbines in operation around the world, with about 35,000 in North America, no coal plants have been closed because of wind technology. In fact, many more coal plants are in the offing, both in the US and throughout the world. Moreover, a Colorado energetics company, Bentek, recently published a study about wind in Texas and Colorado showing, in its study areas, that wind volatility caused coal plants to perform more inefficiently, “often resulting in greater SO2, NOx, and CO2 emissions than would have occurred if less wind energy were generated and coal generation was not cycled.” Further examination of fuel use for electricity in both states during the time of inquiry suggested that wind caused no reduction in coal consumption.

2. Unpredictable, undispatchable, volatile wind can provide for neither baseload nor peak load situations. It can only be an occasional supplement that itself requires much supplementation. Consequently, as Australian engineer Peter Lang once wrote, since “wind cannot contribute to the capital investment in generating plants… it simply is an additional capital investment.”

3. Wind technology does NOT represent alternate energy. Since wind cannot provide controllable power and has no capacity value, it cannot be an alternative for machines that do provide controllable power and high capacity value. Wind therefore is incapable of entering into a zero-sum relationship with fossil-fired capacity—that is, more wind, less coal. All other conditions being equal (demand, supply, weather, etc.), more wind generally means more coal.

4. None of the considerable public subsidies for wind, indeed, not even state renewable portfolio standard (RPS) laws, are indexed to measured reductions in carbon dioxide emissions and fossil fuel consumption. Consequently, there is no transparency or accountability for how wind technology will achieve the goals set forth by those policy initiatives. This means that corporations with a lot of fossil-fired market share to protect have no obligation to replace it with wind. And they don’t. Because they can’t. Freedom from responsibility is a child’s fairy-tale dream come true.

5. The work of a number of independent engineers—Hawkins, Lang, Oswald, Le Pair and De Groot—suggests that even the most effective fossil fuel pairing with wind, natural gas, will very marginally reduce overall natural gas consumption beyond what would occur using only natural gas generators, without any wind whatsoever.

6. Because oil provides barely 1% of the nation’s electricity, wind represents no threat to oil’s market share.

There’s more; my favorite, entitled, as you might guess, is a discussion of “follow the money”… Check out the links and the references therein.

Feedback, of course, is always welcome, with one proviso: “Just the facts, ma’am,” as Joe Friday of Dragnet fame would say. So provide references for your counterarguments.

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Sidebar Notes

Copyright Notice: Product and company names and logos in this review may be registered trademarks of their respective companies.

Some of the articles cited or quoted in this column are copyright protected – their use is both acknowledged and is limited to educational related purposes, which this column provides.

The author considers, as do many experts, Wikipedia a reliable and accessible site for technical information, provided that the references cited in the Wikipedia article meet the following standard.

Are the references provided essentially complete, or at least representative of the literature, and relevant? Do they include both precedent and present work, including any referenced disagreement with the Wiki author’s views?

By Mike Hubbartt, © Copyright 2011, All Rights Reserved.

I like using MAMP (Mac Apache MySQL PHP) to develop server-based code when not connected to a test server, and several projects in one of my classes at school use PHP and MySQL. So this short piece covers the basics of configuring Eclipse Helios to write the PHP code and using MAMP to provide server-side functionality on your computer, so you can create a basic PHP application.

  1. Download and install Eclipse for your development platform from here.
  2. Download and install MAMP (or WAMP if you run Windows, LAMP if you run Linux, and SAMP if you run Solaris). You do not need the Pro version of the software.
  3. Start Eclipse, open any perspective and select the ‘Help/Install New Software’ menu option.
  4. In the popup window, there is a field titled ‘Work with:’ – select the drop down list beside it and choose ‘All Available Sites’. Scroll down the list and expand the options for ‘Programming Languages’.
  5. Scroll down the list and click inside the box for the option labeled ‘PHP Development Tools (PDT) SDK Feature’.
  6. Press the ‘Next’ button twice, select the radio button that indicates you accept the license agreements, then press the ‘Finish’ button.
  7. Restart Eclipse with the ‘File/Restart’ menu option.
  8. Open the Mac OSX Applications folder, and then locate and open the MAMP folder. Your next steps are to start and configure MAMP.
  9. Click one time on the MAMP icon.
  10. Click on the ‘Start Servers’ button.
  11. Click on the ‘Preferences’ button.
  12. Check the value of the ‘Start page URL’ – this is the location to store your HTML, PHP, and image files. You need this when you create a new PHP Project in Eclipse. Press the ‘Cancel’ button.
  13. Open the Mac OSX Applications folder, and then locate and open the Eclipse folder.
  14. Click one time on the Eclipse icon to start Eclipse.
  15. You see the preliminary Welcome screen. To close it, press the close button beside the Welcome tab in the far, upper left area of the screen.
  16. Select the ‘Window/Open Perspective/Other…’ menu option to select the PHP perspective.
  17. Select the ‘File/New/PHP Project’ menu option.
  18. Enter a project name, but this is where you deviate from typical Eclipse project setup. Select the radio button beside ‘Create project at existing location’ and browse to the ‘Start Page URL’ directory (see step 12) and use this as the location for your PHP project.
  19. Select ‘File/New/PHP page’.
  20. For a simple hello world application, enter this code (a minimal one-liner): <?php echo "Hello World"; ?>
  21. Save the file with a name of ‘hello.php’ using Eclipse.
  22. Open your browser and use this as the URL for your simple PHP web page: http://localhost:8888/hello.php

NOTE: WAMP users do not need to have the :8888 portion of the URL. They use http://localhost/hello.php.

You should now see Hello World in your browser. Pretty simple to create new PHP applications after you install and configure your environment. The only thing to watch is setting the location for your Eclipse PHP source code so MAMP’s Apache engine knows where to find it.

An excellent source of PHP information is the official PHP site. They have documentation that can be read online as well as downloaded.

1/21/2012 Update: Added label for radio button in step 18, per comment from Stephen.
11/8/2011 Update: Added sentence to intro paragraph, change hello/php to hello.php in the Note.
3/11/2011 Update: Added WAMP information in the Note below step 22 of this process.

By Mike Hubbartt, © Copyright 2011, All Rights Reserved.

The US space program had good news today (February 24, 2011). NASA successfully launched Discovery for its final flight (STS-133), which is a trip to the International Space Station (ISS). A great video of the launch from NASA can be seen here, which shows the takeoff and the separation of the solid fuel boosters when the shuttle is 29 miles from NASA at a height of 24 miles. Wow! I wish that our Piper Arrow had that kind of acceleration and ceiling…

Image credit: NASA TV


Photo credit: NASA

The crew of Discovery is shown to the right, with NASA astronauts Steve Lindsey (center right) and Eric Boe (center left), commander and pilot, respectively; along with astronauts (from the left) Alvin Drew, Nicole Stott, Michael Barratt and Steve Bowen, all mission specialists (thank you for the picture, NASA).

A successful launch is always good news, but this is a bittersweet moment for fans of the space program. The Space Shuttle era is coming to a close in 2011. After today, depending on funding from Congress, there will only be one or two more Shuttle flights this year, then the US Shuttle fleet will be retired.

For a real treat, watch this video.

The Hubble Space Telescope

I’ve followed nearly every launch since STS-1, and my favorites involve the Hubble Space Telescope. The initial plans called for launching the Hubble in 1986; however, the destruction of the Challenger delayed the launch until 1990, when Shuttle Discovery carried and deployed it on mission STS-31.

There were problems with the Hubble mirror, so another visit was necessary to effect repairs. Space Shuttle Endeavour’s STS-61 mission was to repair the Hubble, and it was a huge success. The astronauts successfully retrieved, repaired, and redeployed the Hubble, and the before and after images from the Hubble are remarkable. Since the repair, the Hubble has contributed a great deal to new images of the planets and stars in the sky.

There have been four other missions to repair or upgrade the Hubble to prolong its effective use exploring the wonders of the universe. The other Hubble shuttle missions were:

  • Shuttle Discovery – STS-82 in Feb, 1997
  • Shuttle Discovery – STS-103 in Dec, 1999
  • Shuttle Columbia – STS-109 in March, 2002
  • Shuttle Atlantis – STS-125 in May, 2009

The Hubble will continue to provide valuable images of the skies for years to come; however, it too will be replaced by the James Webb Space Telescope, expected to launch in 2014.

This is not the end of US spaceflight, as private firms like SpaceX and Virgin Galactic are working on vehicles capable of delivering people and supplies to the ISS in low earth orbit. Other space agencies, like the European Space Agency (ESA), the Japan Aerospace Exploration Agency (JAXA), and the Russian Federal Space Agency (Roscosmos), have also sent missions to the ISS. The Russian agency will be the primary agency providing Soyuz capsules to deliver and retrieve people in the near future.

It will be a while before the next generation of US spacecraft is ready, so it will be a time to watch the efforts of others – in the private sector as well as other nations. Hopefully we will live to see missions to establish bases on our moon and on Mars, which will be as awesome as the first missions to the moon in the 1960s and 70s.

UPDATE (Monday, 3/7/2011)

Shuttle Discovery decoupled from the ISS this morning at 6AM CST and is headed back to earth. The Shuttle will orbit earth in the vicinity of the ISS for the next 2 days, then re-enter the atmosphere and land on Wednesday. This marks the last time Discovery will visit the ISS.

UPDATE (Wednesday, 3/9/2011)

Shuttle Discovery began mission orbit number 202, her final earth orbit, at 9:01 AM CST. The 2-minute de-orbit burn began while Discovery was over India, traveling at Mach 25. The landing was at Kennedy Space Center (KSC) at 10:57 AM CST today, the last of the 39 missions flown by this Shuttle. It is great this Shuttle did so well so many times, yet this was the last time that ship will fly, and that is sad indeed.

By Mike Hubbartt, © Copyright 2011, All Rights Reserved.

Python is a good programming language for web developers, and I enjoyed my first experience with it. Our web development class was given a simple python web server and some HTML files and told to extend both to render images. Since I had not used Python before, I first spent time tracking down information: where to get it, how to install it, and how to develop with it.

There were online sources for it, but I was pleased to discover that Apple ships Python on Macs. My one-year-old 2.26 GHz dual-core MacBook has Python 2.6.1, so no download was necessary. I found the command line syntax for starting python on the internet, so I decided to use a Terminal to launch python, edit the python server source files and HTML files with TextWrangler, and view the HTML with Firefox.

Getting Started With Python

This is the process I followed:

Python running on port 9000

1. Put the python server code and HTML files in a folder on the computer. I created a python folder in my user directory (eg. /mikehubbartt/python/).

2. Start the Mac OSX Terminal.

3. Use cd <dir> to change to the directory with the python server code and HTML files.

4. At the Terminal, enter ‘python <server filename>’ and press the Return/Enter key. The python server is now running, using the port specified by ‘PORT_NUMBER = ‘ in the python server source code.

5. Use the browser to access the main HTML file (index.html) via the python web server, using this syntax: http://localhost:9000/index.html (this assumes the server’s PORT_NUMBER is 9000, as in the screen shot).


NOTE: You append the port number to localhost in the URL; the port is specified in the server source file, and it is what lets you access the content via a web browser. If you specify a different port number, make sure you use that port when accessing content.

6. Now you can see the first file in your web browser, and the Terminal content will change as you select links on the web pages (see my second screen shot at the right for an example – I’ve clicked good and bad links to show the error messages seen at the Terminal).
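The class-provided server file isn’t reproduced here, but the process above can be sketched with Python’s standard library. On the Python 2.6 of the article, the module was BaseHTTPServer/SimpleHTTPServer; the sketch below is the current Python 3 equivalent, and the function name and port value are my own assumptions, not the class code:

```python
# Minimal sketch of a file-serving python web server like the one used in
# class. On Python 2.6 this lived in BaseHTTPServer/SimpleHTTPServer; on
# Python 3 both pieces live in http.server.
from http.server import HTTPServer, SimpleHTTPRequestHandler

PORT_NUMBER = 9000  # the port shown in the screen shot; edit as needed

def make_server(port=PORT_NUMBER):
    # Serves files from the current working directory and logs each
    # request to the Terminal, as described in steps 4-6.
    return HTTPServer(('localhost', port), SimpleHTTPRequestHandler)

# To run:  make_server().serve_forever()   (stop it with control-c)
```

With that running in the directory from step 1, the URL from step 5 serves index.html, and every request is echoed in the Terminal window.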

The next steps, modifying the server source code and the HTML source files (I used TextWrangler, but any other editor is fine; it is the developer’s preference), are project specific, so I won’t spoil it for you by giving the exact steps, although I will give some tips.

Python Tips

Tip 1. Any changes to the python server code require modifying the source code, stopping the server, then restarting the server. To stop the python server, select the terminal window and press control-c. Make any desired changes to the python source code and save them, then switch back to the Terminal, press the up arrow key to retrieve the previous Terminal command, and press the Return/Enter key to restart the python server.

Tip 2. Any changes to the HTML files are seen after changing and saving those files and refreshing the browser, unless the python server code is causing the problem.

Tip 3. When given any partial source code, look at the existing code and see if there are existing elements that could be used in conjunction with a programming language reference to provide similar functionality.

Tip 4. Remember that you can send images and text as well as HTML pages to a web browser, without needing to embed the text and image in an HTML file with a python web server.

Tip 5. When links are clicked in the HTML files, that header information is reflected in the terminal window, so it is helpful to have the editor, browser, and terminal window accessible on the same screen. Use the Terminal information to help track down and debug your code.

Tip 6. Do NOT run the python server in more than 1 Terminal at a time.

Tip 7. Always terminate the python server with control-c before closing the Terminal window.

Tip 8. Make sure you have images in some of the HTML files, so you can see how that header information differs from HTML files just containing text.

Tip 9. Locate and print a handy python quick reference guide. Also locate and bookmark good URLs like the official python programming language site.

Tip 10. Play with it. It takes remarkably few lines of source code to create a python web server, and it is a fun language to use.
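Tips 4 and 5 can be illustrated with a small custom handler. This is my own illustrative sketch (the class name and reply text are invented, not from the class project): a do_GET method can send plain text, or raw image bytes with an image/* Content-type, without any HTML file, and each request it answers is logged to the Terminal.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class TextHandler(BaseHTTPRequestHandler):
    # Replies to every GET with plain text -- no HTML file involved.
    # Swap the body bytes and the Content-type (e.g. image/png) to
    # send an image instead, per Tip 4.
    def do_GET(self):
        body = b'Hello from the python server'
        self.send_response(200)
        self.send_header('Content-type', 'text/plain')
        self.send_header('Content-length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run:  HTTPServer(('localhost', 9000), TextHandler).serve_forever()
```

BaseHTTPRequestHandler logs each request line to the Terminal by default, which is exactly the header information Tip 5 suggests watching while you debug.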

By Mike Hubbartt, © Copyright 2011, All Rights Reserved.

I had one interview a couple of years ago where an interviewer Googled me and found a review I wrote about ColdFusion. Halfway through the interview (after explaining that his company used software that was way beyond support by the original vendor), he read a couple of paragraphs from my ColdFusion review – this is what I wrote (and he read to me):

Like many Consultants, I’ve worked at a wide variety of clients during my career. Some clients used cutting edge technology that met or exceeded their business needs, and the greatest challenge with those projects was either company politics or getting approval for the hardware needed for the applications. While tough, the technology was interesting and those are usually the most fun places to work.

Then there are clients that choose to limp along on some underpowered and inappropriate tool or deprecated programming language written many years earlier that miraculously works because of constant nursing and continuous patching by some tired and unappreciated developer. Those companies rarely understand the costs of moving to modern technology and it is a frustrating situation for any developer to deal with, but unfortunately this is experienced far too often in the business world.

After reading my comments, he asked why I’d want to work for his company since they use old, unsupported software. I said I stood by my comments, because developers prefer to work with technologies that are at least supported by the vendors, and using something so old that it required an old, unsupported version of the operating system was dangerous. Would that mean I didn’t want to work with his product? No. But I did feel (and suggested) he should move forward to a modern version of the application and a version of the operating system still supported by the vendor.

Why is this relevant? My review appeared in a magazine in the UK a year and a half before I was interviewed, and when I wrote it I didn’t think future employers would be interested in tracking down everything I’ve written. I have published a lot of pieces in print and online publications, so there are many places for people to look if they really want to see my style and opinions, but it still caught me unaware during the interview.

In current times, we have Facebook, Twitter, LinkedIn, blogs, and other social media sites where people can and do vent about life, work, and personal situations. Now companies have caught up with the widespread use of social media, so it is important to be aware that more than your intended audience may read what you write, and that could affect current or future job opportunities.

Last night, we met for our second Advanced Web Application Development class, and one of the students gave a talk on social media and how it is perceived by companies. His presentation highlighted the fact that employers are aware when employees make negative comments about the company or their bosses, and they can legally do something about it. Recently I have spoken with recruiters who advised me to remove any negative remarks from Facebook, Twitter, LinkedIn or blog entries, as they see more and more companies pre-screening applicants. I was told there are now powerful tools employers use to search for social media content, and they are interested in the way people act and speak around their friends.

Today I received an email from a consulting company that addresses social media, and I want to share it with you, as it is very good and quite appropriate for employed and unemployed workers alike. Click here to access and download the PDF. This presentation is excellent and well worth the time to read.

You lose control of content distribution when you publish something on the internet. While searching for myself, I’ve come across things I wrote for print magazines in the late 80s, way before the internet. If you wrote for a print magazine, someone may like it enough to scan it in and post it for others to see, so please be careful what you write.

– Mike

By Ted Bade, © Copyright 2011, All Rights Reserved.


As you have probably gathered, I really enjoy astronomy. I like looking at objects in deep space, gazing at the moon and the planets of our solar system, and sometimes even enjoying a glimpse of a comet. I enjoy using my telescope but often, at my home in New England, the skies are overcast, or the weather is rotten. (Especially this current winter with record amounts of snow fall.) So what does one do to enjoy a little astronomy when the sky doesn’t cooperate? Find an alternative, I say. This is easy for Starry Night users, who can look at the LiveSky menu and select ‘Online Telescope Imaging…’  which opens a browser window to access a site called SLOOH.

SLOOH the Site

Several years ago, I learned about SLOOH. The name SLOOH is a play on the word slew; in astronomy circles, to slew a telescope is to move its position. What SLOOH offers is access to large 20” telescopes via the Internet. The telescope is controlled remotely and moved through a series of targets as the night moves on. The scope stays with each object for a period of time, giving the camera time to collect and even color the light, producing beautiful images.

SLOOH the Software

The SLOOH interface is the user’s window to what the telescope sees. You can watch as the image develops on your screen, starting with a monochrome and then watching the colors revealed as various filters are applied. You can capture up to three images at any time during the exposure; you select when. One of my favorite tricks is to make an image before the colors start, and one just before the end of the exposure. This gives a great comparison of naked eye viewing versus a time exposure.

Granted, you are not specifically in control of where the telescope points, or how long the exposures are, but a great many of the objects visible at a given time of year are on the list. Also, don’t forget two very important aspects of this telescope: its size (a 20” reflector) and its position.

When SLOOH started, there was one telescope on a mountain in the Canary Islands, relatively close to the equator. This means that it can “see” most of the sky, north and south. Within the past year, SLOOH has added two more telescopes, one in the mountains of Chile and one in Australia. (They recently shut down the Australia site because the weather conditions there were rarely good and they weren’t getting much use of the telescope.) With telescopes in these various locations, a member has the potential of being able to see any part of the sky.

After you log into your SLOOH account, you are then taken to the “Launch Pad” which gives you access to various features of the site. In addition to the three telescopes, there is a link to the images you have downloaded, banners telling you of “radio shows” the site provides, access to reservation of time slots, as well as a brief list of what is currently being looked at as well as what the next few targets are.

From the launch pad, you can choose which telescope you would like to see, provided that telescope is currently online. Once you choose a telescope, a new window opens, which is your portal to the telescope view and information about what is on the screen. Take a look at my screen shot.

First of all, there is a big circular area which displays what the telescope camera is seeing. As the exposure continues, you watch it change in this window. A button near the bottom of this circular area shifts the camera view into full screen. To the right of the circular view area there are three buttons that control the view you see. There are three possibilities: High Mag, which gives a view using the maximum magnification; Wide Field, which shows the image in a wider field with less magnification (note that some objects do not use the high magnification, because it wouldn’t make any sense – looking at a small corner of a large object wouldn’t be of much use); and All Sky, which is essentially what you would see if you just looked out of the telescope’s dome.

The left hand side of the window is the information area. There are several choices of information and settings to choose from. The default is “Mission data”, which offers information about the object currently being viewed. The other tabs provide other features; for instance, you can tune the program to your system and display, check the weather conditions at the dome, or get some help. When there is a radio event on, there is usually a chat channel open for members to ask questions and make comments during the show. You can digitally enlarge an image, see how long the current exposure is and how much time is remaining, and more.

SLOOH has a presence on Facebook, Twitter, and interacts with Google Earth. There is also a forum to participate in if you like that. With Google earth, you can share your images of the universe with the Google earth (universe view) site. It’s a cool way to share your work!

One feature, I haven’t tried personally, is the ability to schedule a time slot to view coordinates that you are interested in. There are three options for selecting a target, choose from a list of objects, choose by using a catalog number, or enter the coordinates of an object or area of space you are interested in. The schedule window shows slots for the current week. So to schedule the telescope you choose an object and an available time slot. Just be sure that you will be able to view the scope when your time arrives! Otherwise, you will miss the view.

While looking at a live computer image of what the telescope can see isn’t as exciting as looking through one’s own telescope in the backyard, it is very nice. The images that you capture are tagged and dated, then stored for your later perusal or downloading.

The SLOOH site organizes the images you have captured for easy retrieval. The images are organized by category such as Solar System, Globular Clusters, various types of galaxies, and more. When you select a type, you are presented with a list of objects of that type; each object in the list also indicates how many images of that object you have collected. It also tells you the time and date of the most recent image. If you click on a specific object, you are shown a list of your images. Here you can enjoy looking at your images or download them for better processing. As with any astrophotography image, a little digital darkroom work can go a long way! You can also delete images you don’t like.

Besides downloading the image, you can share the image with your friends. SLOOH provides easy links to many different social networking sites.  Images have a SLOOH logo on them, so they get credit for the image, but they are your images to work with. Being a Mac guy, I collect and process my favorite images and have made a photo slide show of them. Mostly I use my favorites for backgrounds on my desktop and as a screen saver.

There are two basic plans for buying into SLOOH. First there is the “Commander Membership”. With this membership you pay an annual fee and can log in and view any of the scopes any time they are up and running. You also have a fair amount of personal scheduling time. (When I started years ago, the membership included a set number of minutes of scheduling time; currently it appears that, as long as things aren’t busy, you can use more time.) The Commander fee is $50 a year, but I noticed that some retailers sell it at a discount.

The other method is called a Credit Membership. In this plan you buy an amount of credits which can be used any time you log in. When you use them up, you can buy more credits. You can buy credits along with activity books and other things from various retailers. SLOOH links directly to one retailer, but I have seen the packages at other locations.

I have been a member of SLOOH for several years. My activity varies, but when I have a bit of free time I like to log in and see what’s on the display. As with any telescope, weather conditions can be an issue. Cloudy skies, a full moon, and other factors can make the telescope unavailable. Sometimes the images are spectacular and at other times they are terrible. But this is typical for astrophotography. The radio shows have come and gone over the years I have been a member. It’s great listening to an astronomer (amateur or professional), as they share their insights and thoughts about astronomy.

I truly enjoy this site and the services they provide, and I intend to remain a member as long as I am able. I have a great time watching the sky through their telescopes. If you want my advice, I’d encourage you to visit SLOOH’s site and see what they have to offer.

By Mike Hubbartt, © Copyright 2011, All Rights Reserved.

Title: Unleashing Web 2.0 From Concepts to Creativity
Authors: Gottfried Vossen and Stephan Hagemann
ISBN: 0123740347
Published: August, 2007 by Morgan Kaufmann Publishers
Price: $52.95 for paperback/$29.95 for Kindle

This semester (Spring 2011) I’m taking a course in grad school called ‘Advanced Web Application Development’, and the only required textbook is ‘Unleashing Web 2.0 From Concepts to Creativity’. I bought the book before the start of the semester (good price and fast delivery) and started reading it the day it arrived. I had read 3 of the 6 chapters by the time the class first met, when I learned that we would only cover those chapters, but I decided to finish the book, as the material was interesting and an easy read. Let’s check it out.

Chapters in the Book

The book is organized into 6 chapters:

  1. A Brief History of the Web
  2. A Review of the Technological Stream
  3. Enabling Techniques and Technologies
  4. Sample Frameworks for Web Application Development
  5. Impacts of the Next Generation of the Web
  6. The Semantic Web and Web 2.0

The authors’ approach is to present the topics and refer the reader to URLs for current information.

Chapter One provides history and terminology relevant to Web 2.0. It includes definitions of the 5 types of e-commerce (B2B, B2C, C2C, G2C, B2G), PayPal, and some code examples. There were examples of CSS and XML code – not enough to do more than see a short example, but still useful alongside free online language resources. I particularly liked that the authors mentioned MAMP/WAMP/LAMP – Mac/Windows/Linux Apache MySQL PHP/Perl – which developers use to develop and test server-based applications using Apache and MySQL. I also liked how the authors explained Web Services.

Chapter Two has HTML code and tags, more CSS code, some JavaScript and PHP. This chapter goes much deeper into Web Services and WSDLs, and it brings up Amazon’s ECS (E-Commerce Service). I’ve worked with Web Services before, and while this material is not enough to know how to configure them by itself, it is well-covered and the authors give internet references that are useful.

Chapter Three goes into RIAs (Rich Internet Applications), SaaS (Software as a Service – the included diagram is very good), Google APIs, and Flickr. Good data, especially on the APIs, although you still need to go to the API websites for detailed information.

Chapter Four goes into client-side and server-side frameworks. Our Advanced Web Application Development class has a project where we use Ruby on Rails (a server-side framework), so this was interesting, but nowhere near enough for the coding I’ll need to do, so I will pick up a book dedicated to Ruby on Rails development. This chapter also covers MVC (Model View Controller) and has a decent diagram as well as written material on the topic.

Chapter Five goes into the business side of Web 2.0. It breaks down the types of commerce that (as of 2007) are done using Web 2.0, and this is the main section I’d love to see updated to 2011, as there are probably a few new ones that have come out since this edition was released.

Chapter Six is on Semantics, which I just started and will add that information when I finish the book.


For now, I liked this book. I like the approach of referencing many free online sites that will have more current information than books several years old. I will keep this book on the bookshelf, even though I have many pages of notes I made while reading it (who wants to memorize hundreds of URLs?).


Pros

  • Reasonably priced: check for used copies if you’re watching your book budget.
  • There is a version available for the Kindle. I don’t own one, but understand some students are going the ebook route as often as possible to save money and weight.
  • Excellent references to many internet sites with relevant Web 2.0 information. Most of the URLs I checked for Eclipse add-ons or plug-ins were still available.
  • The content flows well – not disjointed, which does occur with books that have multiple authors.


Cons

  • The book was published in 2007, and 3-4 internet years is like 50 human years. I’d love to see this book updated with current 2011 information. I bring this up because some of the Eclipse add-ins did not work with Eclipse Helios (although they did with Eclipse Galileo).
  • Only provides code snippets, so you need to buy other books to learn more about the programming languages mentioned in the book – to be fair, there are enough internet sites on the languages that you don’t need to buy a programming book, but I prefer to learn coding from old school printed books.


Recommended buy for developers or managers that want more understanding of Web 2.0 technologies. This is a good book, and it was worth the time to read it for my class, as well as to come up to speed on Web 2.0 technologies and terminologies. I liked the approach of providing an overview of topics while providing URLs, which should have more up-to-date information than printed books.

By Mike Hubbartt, © Copyright 2011, All Rights Reserved.

iPad versus Android


Freescale aPad IMX515

What is the fuss about tablets? Do we need them? Why or why not? How much do they cost? Which one should we buy? What is the supported hardware and software? How do we get more (software and hardware)? Can the operating system be upgraded? What OS versions should we avoid? What do they lack, straight out of the box, and what add-on is worth its weight in gold? Let’s find out.

I’ve been tracking tablets since Apple released the iPad, and I have to admit I’m impressed. I’d like an iPad, but its price tag is so much higher than that of 7″ or 8″ Android-based tablets that my first tablet will probably run Android. I’ve done some online comparisons and see there are 7″, 8″ and 10.1″ Android tablets that run different versions of Android (1.6, 2.1, 2.2) and have different supported hardware/software features.

The hardware features that vary include CPU and clock speed, memory (ranging from 128MB to 1GB), hard drive storage (2 – 8 GB), TF slots for additional memory (max ranging from 16 – 32 GB), cameras (no camera, have web cam, have 1.3 M Pixel cameras – similar to cell phones from 2 – 3 yrs ago), USB Ports (some do, some don’t), and wireless connectivity (a/b/g/n).

The main software supported is probably tied to the version of Android installed on the tablet. The types of supported video formats vary considerably – only one or two tablets I looked at support H.264, and only one claims to support Flash 10.1.

The tablet that has caught my eye today is the Freescale aPad IMX515, an ARM Cortex-A8 8″ Android 2.2 tablet. It runs at 1 GHz and has 512 MB RAM and 4 GB storage (plus a TF expansion slot for another 32 GB of storage).

My question: would anyone out there who bought an Android tablet be willing to share their experience of using it? Ted, one of the other contributors to this site, has an iPad and loves it. I’m curious whether Android tablets invoke as much appreciation from their owners.

I’d like to learn what owners like and dislike about their tablets, what they cost and what owners spent to upgrade them, how long the battery lasts on a single charge, what types of applications are bought (vs. free downloads), how many use these devices at work or school, and whether or not owners feel these devices can replace laptops or netbooks.

My impression is that these devices are very handy and can take the place of a lot of stuff being done on laptops (gaming, internet browsing), but I don’t believe they currently can replace a laptop. I cannot imagine writing a long book or developing large software applications on a tablet. I can see them as terrific devices to take out in the field – I know I’d love to take one along when going out at night to observe the skies, as long as the battery life kept it going all night.

If Android tablets have short battery life and the iPad’s is much longer, it seems wiser to skip the Android tablet purchase and move right on to the one that will do what I need, because battery life is just as important as supported applications.

I’d like to hear from anyone willing to share about their good or bad experiences using a tablet.


Apple released the new iPad (2.0) on March 2, 2011. It is two-thirds as thick as the iPad 1.0, has a faster dual-core A5 processor, better/faster graphics processing (1024×768), is lighter (1.3 lb), has front- and rear-facing cameras, a 3-axis gyro and an accelerometer, runs iOS 4.3, keeps the same 10-hour battery life, and ships 3/11/2011 at the same prices as the iPad 1.0 models. Mac computers must have a USB 2.0 port and run OS X 10.5.8 or later. Click here to go to the site for product details.

By Harry {doc} Babad, © Copyright 2011, All Rights Reserved.


As a technology and greening blogger, and co-author of two reference-rich textbooks on the nuclear world, I almost always provide my readers with a Wikipedia citation for my subject matter. Why? First and foremost, after a strong application of Caveat lector (Let the Reader Beware), all the Wiki-based articles I use for reference purposes are both well written and follow Doc’s rule for reference use in peer review:

Are the references provided essentially complete or representative of the literature, and relevant? Do they include both precedent and present work, including any referenced disagreement with the Wiki author’s views?

Yes, tedious as it may seem, I go back and check, or at least skim if not study, all the RELEVANT references contained in the Wiki. If the material parallels the analysis (though not always the conclusions) found in the cited references I deem credible at the end of each Wiki, I reference the Wikipedia article; it’s an easy source for my readers to access. My requirement is credibility based on my assessment of good science, peer reviewed if possible, not consensus.

Good Science in Wikipedia Is Too Often Challenged By Purists Because It’s Published in a Mere Wiki

I recently (2008 and 2009) coauthored and published two books on nuclear science and technology, both textbooks. They are, as referenced below, Nuclear Energy and the Use of Nuclear Materials For High School and Middle School Teachers, and more recently Nuclear is Hot. The EnergySolutions Foundation published these in support of its educational mission.

I mention this because the most broadly focused negative feedback we received on our books, from some readers but not our reviewers, was related not to the books’ contents but to our use of Wikipedia for some of the many hundreds of references in the books.

We were scolded by a few academics, mostly high school and college science teachers or professional ‘educators’. This, it was noted, reduced our credibility. Bad authors… we should have used only primary references, despite their technical complexity, instead of Wikipedia and other more reader-accessible generalized references. The commenters claimed Wikipedia references were not trustworthy compared to references cited in the Encyclopedia Britannica, journals, or professional-society science and technology magazines.

Alas, such trustworthiness arguments hold true whether you are reading a textbook full of primary and secondary references or any digest of a technical subject. Unfortunately, if your goal is reader accessibility to source materials, then, as the song says, primary journals are “the last thing on my mind.” From my perspective, accessibility means both ease of access and ease of understanding by my target audiences.

Using journal-level sources or often-outdated Britannica details as a basis for sharing information sucks. Journal articles are hard enough for well-educated, degree-bearing professionals to understand, especially when they are not experts in that particular scientific or technical niche.

For example, I’ve earned a doctoral degree and have a number of years of postdoctoral experience at MIT and the University of Chicago. I trained as a synthetic and physical-organic chemist, and I worked and published, first as an academic and then as an industrial scientist, until 37 years ago. Then, mid-career, I switched jobs and specialties to nuclear science and technology, particularly the management of radioactive waste.

Trying to wade through and understand the details of an interesting-looking article in a physical or biochemistry journal is tough, even though it’s still chemistry: like trying to read Homer’s Iliad in the original Greek. Even following the details of current organic practice, 37 years after having worked in the field, is very difficult. Not only has the knowledge base grown exponentially, but the vocabulary has changed beyond my present understanding.

What does that suggest about the general ability of even bright students and education-degreed teachers to deal with such ‘primary’ reference materials? Talk about Towers of Technical Babble!

A Bit About References…Their Coats of Many Colors

Like dogs and people, references come in many types and pedigrees. Just to tickle your appetite, here’s my stream-of-consciousness partial list of reference categories.

Primary References: Journal-Published Research

Primary (Professional Society Published)

Secondary (Published by an Industry or Advocacy)

Well Referenced Magazine & Newspaper Articles including Op-Ed Pieces (e.g., MIT Technology Review, Discover Magazine, Scientific American)

An additional source of information is the Internet sites associated with those above, or others that specialize in reporting about specific technical topics.

Secondary References Including Media Published Web Sites

General Magazine & Newspaper Articles w/o traceable leads to background information

National Magazines with Strong Editorial and Fact-Check Policies (e.g., The NY Times, Washington Post, Christian Science Monitor, Wall Street Journal, LA Times, and a few like the Silicon Valley Review)

All the rest

Well Referenced Wikipedia Articles

Journal Review Articles

Furthermore, all science is fluid, both growing and changing; you knew that? New paradigms arise, outdated theories are replaced, and additional, better-verified data serves to change originally reported outcomes and conclusions.

As facts evolve, we must face the challenge that to remain informed we must keep challenging universal truths (e.g., “everyone knows…”) about science and technology. This circumstance is real, regardless of whether the source is Wikipedia, a science article in The Economist, or Scientific American. Remember, we live in a world of evolving or even changing paradigms; our knowledge must keep pace with such growth if we are not to lapse into judging the technical world on outmoded and inaccurate information.

Why Use Wikipedia?

We know that any material on the Internet can contain errors of fact, or can have an author selectively omitting contradictory information. This, as is well documented, serves either to circulate the author’s belief set or as a shortcut to escape the pressures of grant chasing and publish-or-perish.

A study in the peer-reviewed scientific journal Nature in December 2005 found that Wikipedia comes close to Britannica in the accuracy of its science entries. That investigation examined 42 {scientific} entries from the websites of Wikipedia and the Encyclopedia Britannica on subjects representing a broad range of scientific disciplines. A team of independent subject-matter experts analyzed the articles. When the error categories were expanded to include “factual errors, omissions, or misleading statements,” 162 errors were found in Wikipedia and 123 in the Britannica. That is roughly four per article for the upstart amateurs and three for the professionally authored and peer-reviewed publication that has been around since 1768. (Nature 438, 900-901; 2005.) There may be more information on the Internet, but time constraints did not allow me to check for a more current analysis.
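The per-article rates quoted above follow directly from the raw counts reported by Nature; here is a quick, illustrative sketch of that arithmetic (using only the numbers given in the study):

```python
# Error counts from the Nature (2005) comparison of 42 science entries
# drawn from Wikipedia and the Encyclopedia Britannica.
articles = 42
errors = {"Wikipedia": 162, "Britannica": 123}

# Average errors per article for each source.
for source, count in errors.items():
    per_article = count / articles
    print(f"{source}: {count} errors, about {per_article:.1f} per article")
```

The division gives roughly 3.9 errors per Wikipedia article and 2.9 per Britannica article, matching the “roughly four” and “three” figures cited in the text.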

Since 2005, all the documented Wikipedia errors have been corrected. The Wiki managers have also expanded their efforts to deal with such error-identifying feedback more rapidly, closer to real time. The heavily peer-reviewed Britannica claims it was slandered; at least as of 2008, the Nature-cited errors had not been corrected.


Therefore, as with everything scientific and technical you read, whether textbooks, an Internet article, or an entry in an encyclopedia, check out both the facts and the author’s affiliation. Google it, then draw your own conclusions based on the evidence.

On the Internet, checking facts and looking for biases is easier than you think.

Read the articles, check who sponsors the site, and read that organization’s mission statement. You may not like what you find relative to possible sponsor bias. I often don’t – but relative to the science and technology I read, even as an old man set in his ways, I live with it. The best I can do with problem documentation is to sort out half-truths and distortions from substantiated fact, and attempt to verify that the research sponsors have not bought the results. [E.g., think drug trials and genetic engineering test results.]

It’s a little like the so-called “fact” sheets politicians post on their websites about their opponents, which bear little resemblance to the details documented in the Congressional Record.

Alternatives to Wikipedia

You can search each subject, one topic at a time, in Google. Remember, the way you ask the question will filter your results. Then start reading… all thousand or hundred thousand hits. Fortunately, the most relevant hits are in the first 100 links Google retrieves. There are also semi-static encyclopedias on the web. Amazingly, they too often lift material from open information sources such as Wikipedia or Encarta without acknowledging that fact.

In seeking reference information for our nuclear textbooks, most of what we found from antinuclear groups was irrelevant, inaccurate, outdated, heavily emotionally biased, or downright scaremongering. Rather than cite and catalog the documented inaccuracies, we simply did not use these sources as references. Two examples:

The TMI reactor accident killed no one, although the cleanup was expensive, particularly in the panicky environment fostered by the local and national press. The accident did not significantly increase cancer mortalities in the nearby Harrisburg, Pennsylvania region. If zero is a number you prefer, stop flying, don’t visit Denver, and stop eating, because food is naturally radioactive, as is the world. Chernobyl (USSR) was a reactor that, unlike all now-operating licensed power-generating facilities in the world, had no containment vessel. You know, the dome around the reactor at a nuclear power station.

The Chernobyl disaster was caused by human error and compounded by faulty technical design, yes, and no containment vessel. TMI had a containment vessel that worked, which limited the release. America has never used uncontained nuclear reactors, nor have any of the nuclear-dependent nations such as France, Japan, England, Korea, China, or India. Modern nuclear reactors are now being built to even stronger containment standards to thwart terrorist threats such as those of 9/11.

In addition, the new reactor types coming online are passively safe: they need no operator- or instrument-controlled safety intervention to shut down if anything goes wrong.

Did you know that Wikipedia publishes a Teachers’ Guide?

The guide can also be used as a general user’s guide, since it stresses Caveat lector, albeit implicitly. Unlike most other sources of information I use, it provides detailed answers to questions about the limits of Wikipedia’s accuracy and reliability. Part of the information relates to the rules under which the site operates; the rest focuses on the feedback and correction practices used by the site’s developers. The sections relevant to this article are listed below.

  1. Is Wikipedia accurate and reliable?
  2. What keeps someone from contributing false or misleading information?
  3. Can students cite Wikipedia in assignments?
  4. Is it a safe environment for young people?
  5. What is open-source media?
  6. Why do people contribute to open-source projects?
  7. Why have we not heard of this {about Wikipedia} before?


I’m not going to rehash the contents of the referenced articles on Wikipedia’s operating philosophy and rules. I also accept that some of you won’t believe information simply because it’s from Wikipedia. You’ve a right to your beliefs, but from my perspective, not to force me to accept them. I challenge you to identify an alternative, broad source of high-quality information that prominently acknowledges both disagreements and errors. Some media outlets do this in fine print on page 10 of a magazine or newspaper, never on TV, and rarely on commercial radio.

To date, of the thousands of blogs I’ve visited while searching for information to write about, only a handful admit to error or acknowledge opinions contrary to what they pitch. As bloggers, are we, as an old New York Times masthead states, publishing all the news that’s Fit to Print, or hiding behind a facade of all our news is Print to Fit?

The Wikipedia teachers’ guide, as do the other Wikipedia links I’ve provided, acknowledges: “Wikipedia cannot be perfect. There is almost certainly inaccurate information in it, somewhere, which has not yet been discovered to be wrong. Therefore, if you are using Wikipedia for important research or a school project, you should always verify the information somewhere else — just like you should with all sources.”

Without belaboring the point further, I’d like to quote Bill Kerr, with whose analysis I agree. Bill is an Australian blogger who frequently and intelligently deals with Internet censorship in public schools and other related topics.

Bill Kerr’s Concerns (and also mine): “I am worried about how academics {and teachers in general} are treating Wikipedia, and I think it comes from a point of naivety. Wikipedia should never be the sole source for information. It will never have the depth of original sources. It will also always contain bias because society is inherently biased, although its {Wikipedia} efforts towards neutrality are commendable. These are just realizations we must acknowledge and support.

But what it does have is a huge repository of information that is the most accessible for most people. Most of the information is more accurate than found in a typical encyclopedia and yet, we value encyclopedias as an initial point of information gathering. It is also more updated, more inclusive and more in-depth. Plus, it’s searchable and in the hands of everyone with digital access (a much larger population than those with encyclopedias in their homes). It also exists in hundreds of languages and is available to populations who can’t even imagine what a library looks like.

Yes, it is open. This means that people can contribute what they do know and that others who know something about that area will try to improve it. Over time, articles with a lot of attention begin to be inclusive and approximating neutral. The more people who contribute, the stronger and more valuable the resource. Boycotting Wikipedia doesn’t make it go away, but it doesn’t make it any better either.”


More About Checking Technical Web Sites for Bias, Error, Omission,
and Just Plain Dumb Mistakes

You know, googling a subject, not only to check Wikipedia’s reliability but also to check information provided in other media, including blogs, is wise if you want to write credibly about technology. Certifiably accredited subject experts write many of the web’s technical pages. Unfortunately, folks on a belief-based mission write too many of the others. I’m talking about the blogs of political and think-tank pundits, aka talking heads. It’s not always easy to tell the difference between the two without taking the time to check further.

Some of the articles I reviewed are by a (new to me) author category: discipline-jumping, born-anew experts. [E.g., an industrial engineer becoming an authority on genetic engineering, or a nuclear engineer suddenly becoming an authority on cancer or nanotechnology.] Would you let Dr. Harry Babad {me} do surgery on you? If so, I have a bridge in NYC to sell you – it’s a real bargain.

One aspect, call it doc’s head check for evaluating the credibility of a source, is an author’s willingness to provide referenced full disclosure of opposing viewpoints. I found that most of the Wikipedia articles I checked noted, where appropriate, differences of opinion or a weakness in basis. After all, it’s what Wikipedia’s rules {author guidelines} require.

More About the Value in Blogs — From a devil’s advocate point of view, there are strong believers out there who know that blogs are worse than porn. You can Google further to follow this belief set. Perhaps this too will become the subject of a future article.

In Closing

I will continue to judiciously use Wikipedia as a reference source in my technical work when I’m writing for a non-technical audience.

Reality, in everyday usage, means “the state of things as they actually exist.” The term, in its widest sense, includes everything that is, whether or not it is observable or comprehensible. Isn’t philosophy awesome? However, the only way our concepts of reality have a demonstrable basis is through the preponderance of available evidence. This requires an ability to reproduce observations of any part of the world around us. And of course, the more we know and study and test what we read or hear or see, the more our vision of reality changes.

In the sense I’ve defined above, the more the information found in our sources can be physically or statistically checked, the closer it comes to being valid at any given time. That also holds true for papers by students using Wikipedia as a reference. I agree with teachers that students need to provide more dialog and references than a Wikipedia article. Simply citing a Wikipedia reference or three and quoting them is insufficient for understanding the subject matter, and that’s what learning is all about. Parrots are not thinkers; neither are tape recorders.

TEST TIME — Apply the caveat lector test to the following list of web-published realities, be they from Wikipedia or your grocery checkout counter’s favorite tabloid.

  • Information provided in a 30 second TV spot by a politician up for election.
  • Alleged facts during TV debates about the environment – folks claiming solar energy is clean energy without taking into account the full life-cycle pollution costs of making the solar cells and solar arrays.
  • The actual number of folks who’ve gotten cancer from radiation released in the Chernobyl reactor disaster or the TMI accident.
  • Information supporting your buying a stock from someone who gains by making it appear as a good deal.
  • Medical information on sites owned and operated by those trying to sell you cures.
  • Facts about people and issues by those who have a vested interest in their TRUTH such as many TV and Internet talk shows that take information out of context or just plain lie to get their message across.
  • Most advertising that claims superior performance about a product in LARGE print and provides you actual details in tiny print.

My Bottom Line — the more subjective a topic, the more room there is for bias, error, or omission. It is, and always will be, hard to prove the reality of subjective information, despite the number of people who treat such information as TRUTH.

Therefore, do your homework. Remember, according to doc_Babad, grey is more beautiful than black or white. The more important the decision, the bigger the challenge of the homework assignment, but pick a topic and start checking… it will brighten up your mind.

Conclusions About Wikipedia as a Valid Technical Source

I wholeheartedly disagree with claims that Wikipedia is untrustworthy when the reader practices Caveat lector! I consider, as do many other scientific and technical experts, Wikipedia a reliable and accessible site for technical information, provided that the references cited in the Wikipedia article meet the standard identified at the beginning of this article.

While some educators dislike the use of Wikipedia, a nose-in-the-air attitude, Doc continues to verify, to the extent that the information is available online, all cited Wikipedia references. I check for technical accuracy and for use of the scientific method in acquiring the data that support the author’s findings. If I’m not comfortable with the underlying data, I either don’t cite that Wiki or provide a footnote documenting my concerns.

Therefore, their use as a broadly accessible reference to technical fact or analysis is justified. As I tell naysayers, if you don’t like the references I use, provide me with a readable, accessible, and peer-reviewed or otherwise credible alternative. I’ll add your counterargument to my articles and reference list.

Science is grey and evolving; topics such as nuclear safety, or man-made CO2 as the major cause of climate change, keep evolving. Today’s demonstrated truths rapidly become yesterday’s fairy tales.

As facts evolve, we must face the challenge that to remain informed we must keep challenging universal truths (e.g., “everyone knows…”) about science and technology. This is real regardless of whether the information source is Wikipedia, a science article in The Economist, Business Week (on technology), Discover Magazine, Scientific American, or a blog reporting on technical material. I apply Caveat lector when reading a web-hosted article or any blog espousing a point of view, whether it be a headline in a newspaper seeking circulation or a study in a medical journal by an author whose work is funded by a drug company.

Remember, we live in a world of changing paradigms; therefore our knowledge must keep pace if we are not to lapse into judging the technical world on outmoded and inaccurate information.

I recognize that some of the other internet references may at a future date become unavailable; thus, by adding ‘living’ Wiki-based references to my articles where appropriate, I hope to maintain the ‘knowledge thread.’ In addition, the information in the article will hopefully allow readers to look elsewhere if they encounter unavailable citations at a future date. Or perhaps more to the point — Google on, but Caveat lector!


Nuclear Energy and the Use of Nuclear Materials For High School and Middle School Teachers by Raul A. Deju, Ph.D. and Harry Babad, Ph.D.; © 2008 EnergySolutions Foundation, Inc.

NUCLEAR IS HOT!{A book for High-School Students.} Everything you wanted to know about nuclear science and were afraid to ask. By Raul A. Deju, Ph.D., Harry Babad, Ph.D. and Michael A. Deju. © 2009 The EnergySolutions Foundation. First Edition Published March 2009,
ISBN Number 0615277543.

The EnergySolutions Foundation

Teachers and Wikipedia

Trusting Wikipedia

Wikipedia Errors

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Sidebar Notes

Copyright Notice: Product and company names and logos in this review may be registered trademarks of their respective companies.

Some of the articles cited or quoted in this column are copyright protected – their use is both acknowledged and is limited to educational related purposes, which this column provides.


The Scientific Method and More

Thumbnail definitions clipped from WIKIPEDIA

But let’s define some other terms first.

BIAS: a term used to describe a preference toward a particular perspective or ideology, which means all information and points of view have some form of bias.
BELIEF: the psychological state in which an individual holds a proposition or premise (argument) to be true without necessarily being able to adequately prove its main contention to other people, who may or may not agree.
FAITH: can refer to a religion, or to another deeply held belief, such as freedom or democracy. It allows one to commit oneself to actions or behavior based on self-experience that warrants belief, but without any existence, or need for existence, of demonstrable or absolute proof.
PARADIGM: Since the late 1960’s, the word “paradigm” has referred to thought pattern in any scientific discipline or other epistemological context. The Merriam-Webster Online dictionary defines it as “a philosophical and theoretical framework of a scientific school or discipline within which theories, laws, and generalizations and the experiments performed in support of them are formulated.”

Therefore, bias, error, omission, and just plain mistakes are all a part of our information sphere —yesterday, today, and tomorrow. The major differences are that we now, in 2008, have a greater ability to more broadly and deeply check what we read and hear and to attempt to make sense of the information available.

The Scientific Method

Scientific method refers to a body of techniques for investigating phenomena, acquiring new knowledge, or correcting and integrating previous knowledge. To be termed scientific, a method of inquiry must be based on gathering observable, empirical and measurable evidence subject to specific principles of reasoning. The Oxford English Dictionary says that scientific method is: “a method of procedure that has characterized natural science since the 17th century, consisting in systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses.”

Although procedures vary from one field of inquiry to another, identifiable features distinguish scientific inquiry from other methods of obtaining knowledge. Scientific researchers propose hypotheses as explanations of phenomena, and design experimental studies to test these hypotheses. These steps must be repeatable, to predict future results. Theories that encompass wider domains of inquiry may bind many independently derived hypotheses together in a coherent, supportive structure. Theories, in turn, lead to new hypotheses or place groups of hypotheses into context.

Scientific inquiry is generally intended to be as objective as possible, to reduce biased interpretations of results. Another basic expectation is to document, archive and share all data and methodology so they are available for careful scrutiny by other scientists, giving them the opportunity to verify results by attempting to reproduce them. This practice, called full disclosure, also allows statistical measures of the reliability of these data to be established.

Doc notes this is made more interesting by the fact that measured data are often modeled conceptually to determine further relationships among groups of tests, or to predict future trends, let’s say for climate change or disease spread, from past or present collections of data. [Doc’s bias: if you can’t measure it, it ain’t data.] However, it may still be knowledge, which is the result of extrapolating data, let’s say by modeling.

Did you know that the presumed father of the scientific method was Galileo Galilei (1564-1642)?

And as frail human beings, all of this is continually confounded by trying to distinguish between measurable (e.g., scientifically measured) truth and belief.

Truth and Belief — Again from WIKIPEDIA

Beliefs and Biases: “Belief can alter observations; the human confirmation bias is a heuristic that leads a person with a particular belief to see things as reinforcing their belief, even if another observer would disagree. Researchers have often admitted that the first observations were a little imprecise, whereas the second and third were ‘adjusted to the facts’. Eventually, factors such as openness to experience, self-esteem, time, and comfort can produce a readiness for new perception.”


By Harry {doc} Babad, © Copyright 2011, All Rights Reserved.

Lessons Not Learned from Nuclear Power ——— Doc’s Eclectic View


The more I read and study the approach taken by the US, and perhaps much of the rest of the world, to reduce the amount of carbon dioxide [CO2] released from coal- or gas-burning power plants, the more perplexed I get. Readers, I would welcome any feedback from you on the CO2 capture and storage alternative I describe below.

At the root of my concern is the fact that the industry, with Federal help, is leaning toward geological disposal, as opposed to the easier, lower-cost, and likely as safe approach of long-term near-surface or surface storage at each generator site. I am also admittedly biased toward interim 50-500 year engineering solutions over those that require demonstrating thousands of years of geological-media certitude.

To maximize the feedback I could receive from individuals knowledgeable about energy and climate change, I posted an earlier draft of this article on the American Nuclear Society’s Social Media eList. This is a by-invitation-only ad hoc team of experts who share technical information about energy and greening technology, and who at times work, mostly as individuals, to counter false and fact-free media claims from cause-driven (not knowledge-driven) activists or just mom-and-pop grassroots true believers in… whatever their cause.

I have appended the detailed, itemized feedback comments I received and my thoughts about the information conveyed. I share only the first names of the folks who provided feedback; who they are is their business.

Preventing Carbon Dioxide Release to the Atmosphere

This article discusses a conceptual alternative to geologic sequestration: the surface or near-surface intermediate-term (50-500 year) storage of the CO2 released from power plants.

As I follow the government’s search (with industry support) for, and efforts to demonstrate, a safe and publicly acceptable way to capture and dispose of CO2, I get very confused. The first task, based on what I’ve read, is relatively straightforward; the chemistry and engineering are well defined and demonstrated at small and intermediate scale. The second step, disposal or even very long-term storage, is a more difficult task, fraught with uncertainties. This is especially true for geological or deep-sea disposal.

The idea of injecting CO2 into depleted oil or gas fields, or brine-filled aquifers, reminds me of the efforts to site a 'geologically safe' nuclear repository. The key issue, politics aside, is whether an individual storage/disposal location will remain intact for the lifetime of the risk. For radioactive HLW, that lifetime is perhaps 10,000 years, a regulatory rather than a risk-based limit. For carbon dioxide, it is forever, or at least until we need the gas to reverse the next ice age.

I believe, iconoclast that I am, that the general and likely insurmountable problem with geological CO2 storage is predicting the long-term future in a heterogeneous environment. Specifically, the safety case for each greenhouse-gas geological storage site requires that its integrity be demonstrated on a site-specific basis, which is, alas, expensive even without considering NIMBY-related legal costs. This seems likely to remain the case until someone comes up with a cost-effective, implementable method of irreversibly converting the captured CO2 to a thermodynamically stable form.

One possibility we understand well is converting it to calcium carbonate, in situ, underground. Converting our captured carbon dioxide to limestone in a geological formation places less of a burden on proving the geologic integrity of a specific site.
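For a rough sense of scale on the limestone idea, here is a back-of-envelope stoichiometry sketch. The calculation is mine, not from any source I cite; the molar masses are standard values, and treating the calcium source as simple CaO is an idealization.

```python
# Idealized mineral carbonation: CO2 + CaO -> CaCO3 (limestone).
# Molar masses in g/mol (standard values):
M_CO2, M_CaO, M_CaCO3 = 44.01, 56.08, 100.09

tons_co2 = 1.0                                 # per ton of captured CO2
tons_caco3 = tons_co2 * M_CaCO3 / M_CO2        # limestone produced
tons_cao = tons_co2 * M_CaO / M_CO2            # calcium oxide consumed

print(f"{tons_caco3:.2f} t CaCO3 formed, {tons_cao:.2f} t CaO needed per t CO2")
# roughly 2.27 t of limestone formed, consuming ~1.27 t of CaO, per ton of CO2
```

So every ton of CO2 locked up this way more than doubles its mass as rock, which is the point: the waste form, not the formation, carries the safety case.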

In nuclear terms, think of this as the waste form, which for HLW is borosilicate glass or the insoluble ceramic spent fuel itself.

A Potential Interim Storage Solution

I wonder why the near-surface or surface storage of dry ice in a well-designed, terrorist-proof passive facility hasn't been studied, or, if it has, why the results haven't been publicized.

I’d like to acknowledge the fine diagram of the storage concept, which I described to Scott Armstrong, over the phone last night. Scott is president of MC•MUG, the local Macintosh Users group, a graphics expert and instructor, and a fine photographer.
  1. Pile up stacks of dry-ice blocks, either one atop the other or on some simple weight-bearing shelving.
  2. Insulate these with a thick layer of dirt-cheap rock wool, either as blankets on the ice or as part of the dome structure; which one is a chemical-engineering-101 cost-analysis question.
  3. House the storage unit in a rebar-reinforced dome, geodesic perhaps, both for esthetics and because it is easier for a hijacked 747 to slide off such a dome.
    From an applicable-forces perspective (think safety analysis), such a dome would be much less expensive than that for a present- or near-future-generation nuclear reactor dome.
  4. Instrument the facility with thermocouples, CO2 detectors, or whatever; all off-the-shelf items. Add, if paranoid, a small external cooling plant for emergencies. Why small? Short of dropping a nuke on the facility or a direct hit by a well-focused, full-strength solar flare (science fiction), there is unlikely to be a way to heat the dry-ice blocks rapidly enough to uncontrollably evaporate the CO2 back into the atmosphere.
  5. Site these storage domes at every coal, oil, or natural gas based power plant or generator complex; if they make the CO2, they store it. You want to generate hydrocarbon-based electricity? Then store the CO2 as part of your cost of operation.

The nuclear power industry does this of necessity, because the Department of Energy's contractual commitments to take possession of the fuel have never come close to being met. [One more form of indirect taxation we all must pay.] This continues while the industry and consumers are simultaneously being ripped off by a tax for the cost-overrun-plagued Yucca Mountain nuclear repository, which President Obama cancelled without either stopping the tariff or refunding the industry's money.

Potential Benefits of Dry-Ice Storage

  • No requirement for trading emission credits! You create the CO2, you store it.
  • Avoids the need for a carbon tax, at least on hydrocarbon-burning power plants. No need to confuse the issue with gases released by other industries like feedlots or tailpipe emissions. What you make, you keep!
  • The costs of greenhouse-gas storage become part of doing business, paid by the local and regional electricity ratepayers. The real, not artificially subsidized, cost of electricity is what you will buy, of necessity. Thus, if politically possible, the cost of electricity becomes clearly visible, not, as now, snuck out of your pocket by the industry, Congress, and the IRS. This also levels the playing field for other energy alternatives, and we will not need to pay taxes for lobbyist-selected or government favorites.

Again, think nuclear reactors, where the owners and ratepayers are forced to store spent nuclear fuel at the reactor sites and the storage costs are passed on to the ratepayers; a reasonable precedent.

Side Note: The average CO2 discharge of our coal-powered fleet is ca. 1.2 to 1.3 tons/MWh, depending on the type of coal burned and which reference you cite.
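To put that 1.2-1.3 tons/MWh figure in perspective, here is a hypothetical sizing sketch for one storage dome's feed. The 1 GW plant size, the 90% capacity factor, and the dry-ice density (~1560 kg/m3) are my assumptions, not numbers from any reference; only the per-MWh emission rate comes from the side note above.

```python
# Hypothetical dry-ice stockpile for one coal plant over one year.
co2_per_mwh_t = 1.25      # t CO2/MWh, midpoint of the 1.2-1.3 range cited
plant_mw = 1000           # assumed 1 GW plant
capacity_factor = 0.90    # assumed
rho_dry_ice = 1560        # kg/m3, approximate density of solid CO2

mwh_per_year = plant_mw * 8760 * capacity_factor   # ~7.9 million MWh
co2_tons = mwh_per_year * co2_per_mwh_t            # ~9.9 million t CO2
volume_m3 = co2_tons * 1000 / rho_dry_ice          # ~6.3 million m3

print(f"{co2_tons/1e6:.1f} Mt CO2/yr -> {volume_m3/1e6:.1f} million m3 of dry ice")
# Works out to roughly 0.8 m3 of dry ice per MWh generated.
```

That is a very large dome, or many domes, per plant per year, which is exactly the kind of number a functions-and-requirements study would have to confront.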


Am I missing something? Is our love for big-ticket technology and profitable Federal grants the driving force preventing a KISS solution? Feedback, particularly with references that negate or support my arguments would be welcomed.

A Few References in Passing

Carbon Capture and Storage, Wikipedia, 2011, and the references contained therein.

What is Carbon Sequestration?, Big Sky Carbon Sequestration Partnership, undated.

Carbon Sequestration, AAPG GEO-DC Blog, Dec 2008.

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Copyright Notice: Product and company names and logos in this review may be registered trademarks of their respective companies.

Some of the articles cited or quoted in this column are copyright protected – their use is both acknowledged and is limited to educational related purposes, which this column provides.

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Appendix – Social Media eMail Feedback

Responses to the Social Media feedback on “Long-term CO2 Storage A Nuclear Resembling Quandary”

The following is a set of cut-and-paste copies of the information provided by our bloggers, members of the Social Media email team. I have followed each individual's direct feedback or general subject-related comments with my responses, where appropriate. All emails discussed were received before 8:00 PM on the 28th of January.

Harry, aka doc_Babad

Feedback on Social Media Comments

My colleague Bob, a senior nuclear engineer/manager and cost-effective-greening advocate from Greenville, SC, noted:

Dry Ice is a nice idea, but we’re tilting at windmills here.  Man-made CO2 in the atmosphere is a non-problem.  In fact it’s a good thing since it makes things grow and helps feed the world.  CO2 only makes up 3% of the GHG in the atmosphere and man only contributes 2% of the CO2.

Although I have tried to get my hands around such numbers, I've never been comfortable that they had been subjected to peer-reviewed meta-analysis, such as is sometimes done for conflicting drug-testing results. Therefore I'll continue to tilt at windmills, should they not prove virtual.

Robert responded:

My biggest concern with CCS is the scale.  The amounts of CO2 to be captured, transported, and stored are immense.  While I am loath to call it impossible, I would think CCS a much greater technical challenge than nuclear waste (despite the media claims of the reverse).

I agree with the general concern that Robert shares; that's why both capture and storage should be located at the point of origin, the generating complex. As for which is more difficult, I believe today's NIMBYs are tomorrow's advocates.

Jonathan, in an engineering design focused feedback, pointed out:

1.   The immediate thing that strikes me is how the CO2 would be cooled to form dry ice. I don't have any top-of-my-head figures, but my gut instinct is that it would be energy intensive. On top of that, I doubt the best insulation would keep the dry ice solid for the decades necessary without more energy-intensive cooling.
Jonathan, based on reading about currently available technology for (1) capturing heat not utilized for producing electricity, and (2) perhaps less robust means of turning such waste heat into power, I thought a real functions-and-requirements study, coupled with a detailed conceptual design analysis, could flesh out the specifics of how, and how well. One alternative might be to use waste heat as the energy source for CO2 solidification. My intent with the article was two-fold. First, to ask why, in a 45-minute Google and DOE OSTI search session, I could find no reference to dry-ice storage as a potential methodology for curtailing the release of greenhouse gases. Second, I wanted broad feedback on the concept from the participants in Social Media. Thank you; you've helped me achieve the latter.
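To put a rough number on Jonathan's "energy intensive" instinct, here is a hedged thermodynamic sketch. The sublimation enthalpy (~571 kJ/kg at -78.5 C) is a standard value; the factor-of-three penalty over the Carnot ideal is purely my assumption about real refrigeration hardware.

```python
# Rough work required to freeze CO2 out at -78.5 C, rejecting heat at ~27 C.
h_sub = 571e3      # J/kg, enthalpy of deposition/sublimation of CO2
t_cold = 194.7     # K, dry-ice temperature
t_hot = 300.0      # K, ambient heat-rejection temperature

cop_carnot = t_cold / (t_hot - t_cold)   # ideal refrigerator COP, ~1.85
w_ideal = h_sub / cop_carnot             # J/kg, thermodynamic minimum work
w_real = 3 * w_ideal                     # assumed 3x real-world penalty

kwh_per_ton = w_real * 1000 / 3.6e6
print(f"~{kwh_per_ton:.0f} kWh of work per ton of CO2 solidified")
# ~257 kWh/t; at ~1.25 t CO2/MWh that is on the order of a quarter to a
# third of the plant's own output, so Jonathan's concern is well founded.
```

Which is precisely why waste-heat-driven solidification, if it can be made to work, matters so much to the concept.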

2.   You would also produce just under a cubic meter of dry ice for every MWh. That’s going to be one heck of a pile of dry ice very soon.

Of course, that reality might even be enough to frustrate the building of new power plants that use petrochemicals to generate electricity. In addition, there doesn't seem to be a shortage of land around the generating plants I've visited or lived downwind from. Whether for surface storage or a shallow storage vault, these folks certainly have enough acreage; they keep expanding large uncontrolled ash/slag piles.

3.    One KISS approach would be to pump CO2 to a deep seabed location, where the water pressure would be sufficient to solidify the CO2 as it emerged – not that I think this is environmentally sound.

I agree the pressure meets CO2 solidification requirements. Again, there's the transportation problem posed by Robert S. Margolis. In addition, are you going to build such a disposal site in international waters? Hmm. You could, of course, try to license such a site or sites in the states that have deep brine deposits associated with salt domes or bedded salt… Texas or Louisiana, anyone?

4.   One difference between a nuclear repository and CO2 sequestration is that with CO2 some level of leakage could be more acceptable. In very simplified terms if we were to sequestrate 100 years of CO2 and it had a 0.1% leak rate we’d have a tenth of our GHG emissions for 1000 years. If stabilizing GHG emissions requires an 80% CO2 reduction then we’d be essentially ‘taking a loan’ on future emissions. A hundred years hence we’d have to reduce to 10% of current emissions and have the 10% emissions from sequestration.

That would be a big ask, but if leakage was only 0.01% it might become more arguable, pragmatically against the alternative of not meeting emissions targets at all.

I agree in general, but wonder whether the alternatives, other than going CO2-emission-free, contain comparable potential booby-traps. In addition, being somewhat mathematics-averse, I don't follow how a 0.1% CO2 leak forces me to take a loan on the future; I just consider it a 0.1% additional un-captured release, a bit time-delayed. We're doing much worse than that now. I've seen no statistically defensible number on capture efficiency, either at a power plant or at a regional pipeline-fed facility. What am I missing?

5.    I personally don’t support the case above when there are already good alternatives, but I think it would be an argument made. And perhaps more significantly very low levels of leakage won’t be a showstopper for CO2 CCS in the way it is made for nuclear repositories.

Okay! However, everything I've studied suggests that current regulations controlling nuclear material or radiation release are based on fear-mongering politics and regulatory over-enthusiasm. For too long we have listened to the loudest voice: safe at any cost, because the costs do not come directly and visibly out of our pockets. I've never been comfortable with the thesis of always-safe, always multiplicatively conservative, based on SciFi scenarios rather than demonstrated actual risk. Lots of healthy folks get significantly higher background doses, such as the residents of Guarapari, Brazil; Kerala, India; Ramsar, Iran; and Yangjiang, China. I still can't find, since co-authoring two textbooks with Dr. R. A. Deju in 2008 and 2009, any peer-reviewed data that identify meaningful differences in health compared to folks in comparable socio-economic niches.

Stephen more broadly commented

1.    I’m just amazed that CCS is regarded as a viable concept. US coal-fired power plants crank out 2 billion tons of CO2 every year, and Chinese coal-fired power emissions, I believe, have overtaken those of the US. Four billion tons every year from two countries — we’re just going to magically keep finding low-cost storage sites for all this stuff?

I pass; we’ve paid for stupid things since a CO2-emission-free alternative like nuclear has not gained sufficient impetus to have a meaningful effect on US and Chinese emissions. Additionally, of the main green energy alternatives, no system has yet been cost-effectively demonstrated to guarantee base-load power. However, my favorite science-fiction-based alternative, the beaming of RF radiation from space to desert areas, might do so if the deserts selected were globally distributed. What, you say? Creating a commercially based satellite system that collects gigawatts' worth of solar power and beams it down to Earth, where it is converted to electricity. The idea was first proposed by Dr. Isaac Asimov in 1941, and more recently evaluated by folks as diverse as the US Pentagon. For SciFi buffs, Harry Harrison and Ben Bova also expanded on the theme.

2.    Most of the cost of CCS is in the second “C” — capture. The only “proven” technology is amine-based chemical absorption. All sorts of R&D is going into other, hopefully less expensive ways of separating CO2 from flue gas, but these are early-stage R&D efforts. (And the operative word is “hopefully” — none has been proven to be effective, much less economical.) So if CCS had to be implemented today, it would be based on amine absorption.

Stephen, I am uncomfortable with your thesis about the scalability and cost-effectiveness of CO2 capture. Although amine technology is most often identified as a reasonable, if not yet fully tested, concept for capture, there are others, including the use of zeolites and membranes. When I have time available, I will more thoroughly search this topic and share my findings with Social Media.

3.    Which is why it hasn’t been implemented yet. Generating companies use coal because it is cheap. When it is no longer cheap, well, there goes its advantage. CCS is simply not economical — it adds a cost to coal-fired power. Long before people find that out the hard way, coal-fired power generators will have gone bankrupt or switched their fleets to gas or nuclear — assuming coal generation is hit with emissions regulation or legislation. And in the near term, gas looks to be the front-runner — it’s okay to use the atmosphere as a CO2 dump as long as the CO2 comes from gas combustion.

True, but comes the day of either a carbon tax;
… or the flooding of coastal mega-cities caused by sea-level rise. There were projected to be 20 coastal megacities (population exceeding 8 million) by 2010. The risk comes from a likely combination of sea-level rise and storm surges. Let's pick a few likely targets: NYC, Bangkok, New Orleans, Mumbai, Shanghai, Manila, Caracas, Ho Chi Minh City…
… and we'll see who pays the piper!

4.    If some use could be found for all that CO2, then maybe CO2 capture wouldn’t be such a joke. But that would depend on large-scale hydrogen production from water splitting. And the best hope for that is to use nuclear heat to split water — it’s the only way to make H production sustainable and clean. There should be way more R&D in nuclear hydrogen production.

Of course, Stephen, I agree!

Rod responded to the Social Media dialog, stating (and I agree):

As a long time adherent of KISS approaches to engineering, I disagree with your interpretation. A real KISS type engineer who really works at keeping things simple would say – just don’t produce the CO2 in the first place. Then you do not need to spend any time, effort or money figuring out how to separate it from a waste stream, how to capture it after separation, where to store it or how to get it there.

The big difference between used nuclear fuel and CO2 is that the former starts off as a solid material encased in corrosion resistant cladding. It does not leak as long as you simply put it into a simple container. If the container ever shows signs of deterioration, fix or replace the container.

The only way you ever get any “leakage” from a used nuclear fuel storage area is if your computer models assume that people stop doing their jobs and that barriers magically disappear over time.

I agree with Rod, but without going into politics and lobbyist-supported moneyed interests, let's just always remember (to our idealistic despair) that our democratic society is imperfect. As Winston Churchill noted on November 11, 1947: "Democracy is the worst form of government, except for all those other forms that have been tried from time to time."


1/28/11       9:26 PM

By Mike Hubbartt, © Copyright 2011, All Rights Reserved.

Dwarf Body Facts:

  • Number of Dwarf Planets: 5
    • Ceres
    • Eris
    • Haumea
    • Makemake
    • Pluto
  • Sizes: smallest is Ceres and the largest is Eris (maybe…)
  • Orbits: Ceres is in the asteroid belt between Mars and Jupiter, and the others are out around Pluto’s orbit (29-49 AUs)
  • Diameters: from 950 km to 2800 km
  • Total number of moons for all dwarf planets: 7
    • Pluto’s Moons: Charon, Hydra, Nix, and P4
    • Eris’s moon: Dysnomia
    • Haumea’s moons: Hi’iaka and Namaka
  • Interesting facts: Pluto will be visited by New Horizons in 2015, which will go on to explore the Kuiper Belt until 2022.
  • Click here for Wolfram|Alpha data on all Dwarf Planets


1 Ceres (orbital period 4.6 yrs)

Ceres is located in the asteroid belt between Mars and Jupiter, and it was classified as an asteroid until 2006, when it was reclassified as a dwarf planet. Ceres is the smallest of the five dwarf planets, and its location between Mars and Jupiter makes it a good candidate for a future mission if the data retrieved by the New Horizons mission is impressive. All we have to do is wait until 2015, when New Horizons should be within 186 miles of the surface of Pluto; we should get some impressive pictures at that time.
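The orbital periods quoted in these section headings follow directly from Kepler's third law, T (years) = a^1.5 for a in AU. A quick check; the semi-major axes below are values I've added for illustration, not figures from this article:

```python
# Kepler's third law for orbits around the Sun: T[years] = a[AU] ** 1.5
semi_major_axis_au = {
    "Ceres": 2.77,   # asteroid belt
    "Pluto": 39.5,   # Kuiper belt
    "Eris": 67.7,    # scattered disk, well past Pluto
}

for body, a in semi_major_axis_au.items():
    print(f"{body}: {a ** 1.5:.1f} yr")
# Ceres: ~4.6 yr, Pluto: ~248 yr, Eris: ~557 yr, matching the
# orbital periods given in the section headings.
```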

Click here for the Wolfram|Alpha information on Ceres


136199 Eris  (orbital period 557.4 yrs)

Eris was discovered out past Pluto, and, because it appeared slightly larger than Pluto, it sparked the discussions that eventually produced the new class of planetary bodies called dwarf planets. While I wish they had added a 10th planet instead of cutting us down to 8, I understand there may be a push to reconsider the decision to downgrade Pluto, since more recent measurements indicate Eris may be slightly smaller than Pluto. The atmosphere of Eris is currently frozen, so it is quite bright, and there are photos from the Keck Observatory as well as from the Hubble that show Eris and its moon.

Click here for the Wolfram|Alpha information on Eris



Dysnomia is the only known moon of Eris; there is not a lot of data on it, except from JPL. Scientists used Dysnomia to measure the mass of Eris, and Dysnomia makes a circular orbit around Eris once every 16 days.

Click here for the Wolfram|Alpha information on Dysnomia


136108 Haumea (orbital period 284.8 yrs)

Haumea was discovered in 2003, and its orbit ranges between 35 and 50 AU, so it is sometimes closer to the sun than Pluto. It has a fast rotation rate, and its diameter averages 1400 km. It too has moons, Hi’iaka and Namaka (not much data on either moon in Starry Night or from JPL).

Click here for the Wolfram|Alpha information on Haumea

This image (Courtesy JPL/NASA) is an artist’s conception of Haumea and its two moons.

Courtesy NASA/JPL-Caltech


136472 Makemake (orbital period 308 yrs)

Makemake (pronounced mah-kee-mah-kee) is larger than Haumea (average diameter of 1500 km), and its average distance from the sun is 46 AU.

Click here for the Wolfram|Alpha information on Makemake

Pluto (orbital period 247.9 yrs)

Pluto was discovered by Clyde Tombaugh in 1930. I already wrote a short article about Pluto – click here to view it.

Click here for the Wolfram|Alpha information on Pluto



Charon is the largest of Pluto’s four moons, but it was not discovered until 1978. Click here to see my earlier article on Pluto, which has additional information on Charon.

Click here for the Wolfram|Alpha information on Charon



Charon is nearly as large as Pluto, but in 2005 it was learned that Pluto also has two tiny moons: Hydra and Nix. This is a screen shot of Hydra from Starry Night. Hydra has an estimated diameter of 20 – 70 miles.

Click here for the Wolfram|Alpha information on Hydra



Nix is the other small moon of Pluto, and this is the Starry Night screen shot I found for Nix. Nix has an estimated diameter of 20 – 70 miles.

Click here for the Wolfram|Alpha information on Nix



P4 is the newest and smallest moon orbiting Pluto, with an estimated diameter of 8 – 21 miles.

Information Sources

NASA’s website, NASA/JPL-Caltech’s website, Starry Night Pro Plus information, IAU Minor Planet Center.