Archive for July, 2010

If you have ever had any interest in space and space travel, you have got to check this out -> click here.


The actual JPL Press release is:

Source: Jet Propulsion Laboratory
NASA Spacecraft Camera Yields Most Accurate Mars Map Ever
Close View of Valles Marineris
This image shows a 90-mile-wide portion of the giant Valles Marineris canyon system. Landslide debris and gullies in the canyon walls on Mars can be seen at 100 meters (330 feet) per pixel.

Image Credit: NASA/JPL-Caltech/Arizona State University

PASADENA, Calif. – A camera aboard NASA’s Mars Odyssey spacecraft has helped develop the most accurate global Martian map ever. Researchers and the public can access the map via several websites and explore and survey the entire surface of the Red Planet.

The map was constructed using nearly 21,000 images from the Thermal Emission Imaging System, or THEMIS, a multi-band infrared camera on Odyssey. Researchers at Arizona State University’s Mars Space Flight Facility in Tempe, in collaboration with NASA’s Jet Propulsion Laboratory in Pasadena, Calif., have been compiling the map since THEMIS observations began eight years ago.

The pictures have been smoothed, matched, blended and cartographically controlled to make a giant mosaic. Users can pan around images and zoom into them. At full zoom, the smallest surface details are 100 meters (330 feet) wide. While portions of Mars have been mapped at higher resolution, this map provides the most accurate view so far of the entire planet.

The new map is available at:

Advanced users with large bandwidth, powerful computers and software capable of handling images in the gigabyte range can download the full-resolution map in sections at:

“We’ve tied the images to the cartographic control grid provided by the U.S. Geological Survey, which also modeled the THEMIS camera’s optics,” said Philip Christensen, principal investigator for THEMIS and director of the Mars Space Flight Facility. “This approach lets us remove all instrument distortion, so features on the ground are correctly located to within a few pixels and provide the best global map of Mars to date.”

Working with THEMIS images from the new map, the public can contribute to Mars exploration by aligning the images to within a pixel’s accuracy at NASA’s “Be a Martian” website, which was developed in cooperation with Microsoft Corp. Users can visit the site at:

“The Mars Odyssey THEMIS team has assembled a spectacular product that will be the base map for Mars researchers for many years to come,” said Jeffrey Plaut, Odyssey project scientist at JPL. “The map lays the framework for global studies of properties such as the mineral composition and physical nature of the surface materials.”

Other sites build upon the base map. At Mars Image Explorer, which includes images from every Mars orbital mission since the mid-1970s, users can search for images using a map of Mars at:

“The broad purpose underlying all these sites is to make Mars exploration easy and engaging for everyone,” Christensen said. “We are trying to create a user-friendly interface between the public and NASA’s Planetary Data System, which does a terrific job of collecting, validating and archiving data.”

Mars Odyssey was launched in April 2001 and reached the Red Planet in October 2001. Science operations began in February 2002. The mission is managed by JPL for NASA’s Science Mission Directorate in Washington. Lockheed Martin Space Systems in Denver is the prime contractor for the project and built the spacecraft. NASA’s Planetary Data System, sponsored by the Science Mission Directorate, archives and distributes scientific data from the agency’s planetary missions, astronomical observations, and laboratory measurements.

For more information about NASA’s Odyssey spacecraft, visit:

JPL is managed for NASA by the California Institute of Technology in Pasadena.

Jia-Rui Cook 818-354-0850
Jet Propulsion Laboratory, Pasadena, Calif.

J.D. Harrington 202-358-5241
NASA Headquarters, Washington

Robert Burnham 480-458-8207
Arizona State University, Tempe

Over 1300 space shuttle employees received notice this week that they would be laid off, as the shuttles are being retired later this year. For more news on this subject, click here.

I appreciate that civilian spacecraft like WhiteKnightTwo/SpaceShipTwo and Falcon 9/Dragon can assist the Russian Soyuz in replenishing food, water and air supplies. All three vessels can also transport crew to and from the ISS, but the Shuttle has done such a good job for so long that I will miss it.

I came across this article at MacWorld UK and must admit I’m disappointed. While Windows 7 is a huge upgrade over Vista, Ubuntu is one of the finer Linux distributions, and Dell had listed PCs with Ubuntu for two years. For more information, click here.

A late announcement of the newest version of Eclipse (Helios) to be released, but still worthwhile news for us fans of the IDE.


OTTAWA, CANADA – June 23, 2010 – Today the Eclipse community delivers its annual release train, a coordinated release of the major Eclipse projects. For the seventh year in a row, the 2010 release train, code named Helios, arrives on time and is now available for download.

The Helios release is the largest release train yet produced by the Eclipse community, comprising 39 different project teams, over 33 million lines of code and the work of 490 committers. The release train makes it easier for users and adopters of Eclipse technology to adopt new versions of the different Eclipse projects. The Eclipse community also makes available 12 different Eclipse packages that target different types of developer usage, including Java EE developers, PHP developers, C/C++ developers and many more.

“The Helios release is another fantastic effort by the Eclipse committer community,” explains Mike Milinkovich, Executive Director of the Eclipse Foundation. “Besides the feat of coordinating such a large development effort, Helios introduces important innovations in areas such as Git support, Linux development and JavaScript support. Congratulations to everyone for another great release.”

The Helios release includes many new features and project updates. Some of the highlights include:

  • A new Linux IDE package makes it easier for Linux developers to use an integrated tool chain for building C/C++ applications for the Linux operating system. This package includes the new Linux Tools project which includes Eclipse integrations of popular Linux utilities such as GNU Autotools, Valgrind, OProfile, RPM, SystemTap, GCov, GProf, and LTTng. A recent Eclipse Community Survey has shown increased use of Linux by developers. It is expected this package will help further accelerate Eclipse adoption in the Linux community.
  • Eclipse Marketplace Client provides developers an ‘app-store’ experience to easily discover and install new Eclipse plug-ins. Eclipse Marketplace is a catalog of Eclipse based solutions. Over 100 of these will be available from the new Marketplace Client, making it significantly easier to find and install Eclipse solutions.
  • Support for Git, a popular distributed version control system (DVCS), is provided by the new Eclipse EGit and JGit projects. The new EGit 0.8 release includes a new Git repositories view and support for fast-forward merging and tagging. JGit 0.8 – which EGit uses under the covers to talk to Git repositories – benefited from performance enhancements of up to 50% when working with large repositories.
  • The Web Tools Platform project has introduced support for creating, running, and debugging applications written for the latest Java EE Specifications (Java EE 6) including, Servlet 3.0, JPA 2.0, JSF 2.0, and EJB 3.1.
  • Improved support in the JavaScript Development Tools project (JSDT) for JavaScript developers, including a JavaScript debug framework that allows for integration of JavaScript debuggers, such as Rhino and Firebug. A new JavaScript IDE package has also been created to make it easier for JavaScript developers to find, install and use an Eclipse-based IDE.
  • Eclipse Xtext 1.0, a popular framework for creating domain specific languages (DSLs), introduces 80 new features, improving performance and scalability by up to 30 times over previous versions. A new in-memory indexing feature makes it possible to develop more sophisticated DSLs in Xtext.
  • A new release of Acceleo 3.0 implements the OMG Model-to-Text (MTL) specification and provides the features required for a code generator IDE. This release also provides unique tools around example-based design of code generators.

More information about the Helios release can be found at:

The Helios packages can be downloaded now at:

About the Eclipse Foundation

Eclipse is an open source community, whose projects are focused on building an open development platform comprised of extensible frameworks, tools and runtimes for building, deploying and managing software across the lifecycle. A large, vibrant ecosystem of major technology vendors, innovative start-ups, universities and research institutions and individuals extend, complement and support the Eclipse Platform.

The Eclipse Foundation is a not-for-profit, member supported corporation that hosts the Eclipse projects. Full details of Eclipse and the Eclipse Foundation are available at

CUPERTINO, California—July 27, 2010—Apple® today unveiled a new Mac® Pro line with up to 12 processing cores and up to 50 percent greater performance than the previous generation.* Featuring the latest quad-core and 6-core Intel Xeon processors, all-new ATI graphics and the option for up to four 512GB solid state drives (SSD), the new Mac Pro continues to deliver amazing performance and expandability for the most demanding consumers and professionals.

“The new Mac Pro is the most powerful and configurable Mac we’ve ever made,” said Philip Schiller, Apple’s senior vice president of Worldwide Product Marketing. “With up to 12 cores, the new Mac Pro outperforms our previous top-of-the-line system by up to 50 percent, and with over a billion possible configurations, our customers can create exactly the system they want.”

At the heart of the new Mac Pro’s performance are next generation quad-core and 6-core Intel Xeon processors running at speeds up to 3.33 GHz. These multi-core processors use a single die design so each core can share up to 12MB L3 cache to improve efficiency while increasing processing speed. These systems feature an integrated memory controller for faster memory bandwidth and reduced memory latency; Turbo Boost to dynamically boost processor speeds up to 3.6 GHz; and Hyper-Threading to create up to 24 virtual cores. The Mac Pro now comes with the ATI Radeon HD 5770 graphics processor with 1GB of memory and customers can configure-to-order the even faster ATI Radeon HD 5870 with 1GB of memory.

For the first time, Mac Pro customers have the option to order a 512GB SSD for the ultimate in reliability and lightning fast performance. With the ability to install up to four SSD drives in the system’s internal drive bays, the new Mac Pro can provide ultra high-speed disk bandwidth and random disk performance, two times faster than the average performance of a standard disk drive.** Mac Pro also now features two Mini DisplayPorts and one dual-link DVI port. The additional Mini DisplayPort output allows customers to connect two LED Cinema Displays without an additional graphics card or adapter and the dual-link DVI port supports legacy DVI-based displays up to a resolution of 2560 x 1600 pixels.

Every Mac Pro comes with Apple’s innovative Magic Mouse and customers can also order Apple’s new Magic Trackpad as an option. The Magic Trackpad brings the intuitive Multi-Touch™ gestures of Mac notebook trackpads to the desktop. With its glass surface, the wireless Magic Trackpad allows users to scroll smoothly up and down a page with inertial scrolling, pinch to zoom in and out, rotate an image with their fingertips and swipe three fingers to flip through a collection of web pages or photos. The Magic Trackpad can be configured to support single button or two button commands and supports tap-to-click as well as a physical click. Magic Trackpad is available separately for $69.

Continuing Apple’s commitment to the environment, Apple’s desktop lineup is a leader in green design. The Mac Pro meets stringent Energy Star 5.0 requirements and achieves EPEAT Gold status.*** The Mac Pro enclosure is made of highly recyclable aluminum and the interior is designed to be more material-efficient. The Mac Pro uses PVC-free internal cables and components and contains no brominated flame retardants. The new Apple Battery Charger provides a convenient and environmentally friendly way to always have a fresh set of batteries for your Magic Trackpad, Magic Mouse and Wireless Keyboard. The Apple Battery Charger is available as an option for $29 and comes with six long shelf life rechargeable batteries.

Every Mac also comes with Mac OS® X Snow Leopard®, the world’s most advanced operating system, and iLife®, Apple’s innovative suite of applications for managing photos, making movies and creating and learning to play music. Snow Leopard builds on a decade of OS X innovation and success with hundreds of refinements, core technologies and out of the box support for Microsoft Exchange. iLife features iPhoto®, with breakthrough ways to organize and manage your photos by who appears in them and where they were taken; iMovie® with powerful easy-to-use features such as Precision Editor, video stabilization and advanced drag and drop; and GarageBand® which offers a whole new way to help you learn to play piano and guitar.

Optional Apple professional applications include Aperture®, Final Cut® Express, Final Cut Studio®, Logic® Express and Logic Studio®.

Pricing & Availability
The new Mac Pro will be available in August through the Apple Store®, Apple’s retail stores and Apple Authorized Resellers.

The new quad-core Mac Pro, with a suggested retail price of $2,499 (US), includes:

  • one 2.8 GHz Quad-Core Intel Xeon W3530 processor with 8MB of fully-shared L3 cache;
  • 3GB of 1066 MHz DDR3 ECC SDRAM memory, expandable up to 16GB;
  • ATI Radeon HD 5770 with 1GB of GDDR5 memory;
  • two Mini DisplayPorts and one DVI (dual-link) port (adapters sold separately);
  • 1TB Serial ATA 3Gb/s hard drive running at 7200 rpm;
  • 18x SuperDrive® with double-layer support (DVD±R DL/DVD±RW/CD-RW);
  • four PCI Express 2.0 slots;
  • five USB 2.0 ports and four FireWire® 800 ports;
  • AirPort Extreme® 802.11n;
  • Bluetooth 2.1+EDR; and
  • Apple Keyboard with numerical keypad and Magic Mouse.

The new 8-core Mac Pro, with a suggested retail price of $3,499 (US), includes:

  • two 2.4 GHz Quad-Core Intel Xeon E5620 processors with 12MB of fully-shared L3 cache per processor;
  • 6GB of 1066 MHz DDR3 ECC SDRAM memory, expandable up to 32GB;
  • ATI Radeon HD 5770 with 1GB of GDDR5 memory;
  • two Mini DisplayPorts and one DVI (dual-link) port (adapters sold separately);
  • 1TB Serial ATA 3Gb/s hard drive running at 7200 rpm;
  • 18x SuperDrive with double-layer support (DVD±R DL/DVD±RW/CD-RW);
  • four PCI Express 2.0 slots;
  • five USB 2.0 ports and four FireWire 800 ports;
  • AirPort Extreme 802.11n;
  • Bluetooth 2.1+EDR; and
  • Apple Keyboard with numerical keypad and Magic Mouse.

Configure-to-order options include:

  • one 3.2 GHz Quad-Core Intel Xeon W3565 processor for the quad-core Mac Pro;
  • one 3.33 GHz 6-core Intel Xeon W3680 processor for the quad-core Mac Pro;
  • two 2.66 GHz 6-core Intel Xeon X5650 processors (12-cores) for the 8-core Mac Pro;
  • two 2.93 GHz 6-core Intel Xeon X5670 processors (12-cores) for the 8-core Mac Pro;
  • two ATI Radeon HD 5770 cards with 1GB of GDDR5 memory;
  • one ATI Radeon HD 5870 card with 1GB of GDDR5 memory;
  • up to 16GB of DDR3 ECC SDRAM memory for the quad-core Mac Pro;
  • up to 32GB of DDR3 ECC SDRAM memory for the 8-core Mac Pro;
  • up to four 512GB solid state drives (SSD); or
  • up to four 1TB or 2TB Serial ATA hard drives running at 7200 rpm;
  • Mac Pro RAID card;
  • dual-channel or quad-channel 4Gb Fibre Channel card; and
  • up to two 18x SuperDrives with double-layer support.

Accessories include: Magic Trackpad, Apple Battery Charger, wired Apple Mouse, wireless Apple Keyboard, Mini DisplayPort to DVI Adapter, Mini DisplayPort to Dual-Link DVI Adapter (for 30-inch DVI displays), Mini DisplayPort to VGA Adapter and the AppleCare® Protection Plan. Options also include pre-installed copies of Mac OS X Snow Leopard Server, iWork®, Logic Express 9, Final Cut Express 4 and Aperture 3. Complete options and accessories are available at

*Testing conducted by Apple in July 2010 using preproduction Mac Pro 12-core 2.93 GHz units and shipping Mac Pro 8-core 2.93 GHz units, all configured with 6GB of RAM. Based on render performance of Maxwell Render 2.0.3 using Benchwell’s sculpture.mxs. Performance tests are conducted using specific computer systems and reflect the approximate performance of Mac Pro.

**Testing conducted by Apple in July 2010 using preproduction Mac Pro 12-core 2.93 GHz units configured with 6GB of RAM, 1TB 7200-rpm hard disk drive and 512GB solid-state drive. Testing conducted using Iometer 2006.07.27 with a 30-second ramp-up, 5-minute run duration, 128KB request size, 8 outstanding IOs, and 150GB test file. Average rotational media performance calculated by creating the test file on the outer, middle and inner sectors of the drive and averaging the results from all three measurements. Performance tests are conducted using specific computer systems and reflect the approximate performance of Mac Pro.

***EPEAT is an independent organization that helps customers compare the environmental performance of notebooks and desktops. Products meeting all of the 23 required criteria and at least 75 percent of the optional criteria are recognized as EPEAT Gold products. The EPEAT program was conceived by the US EPA and is based on IEEE 1680 standard for Environmental Assessment of Personal Computer Products. For more information visit

Apple designs Macs, the best personal computers in the world, along with OS X, iLife, iWork, and professional software. Apple leads the digital music revolution with its iPods and iTunes online store. Apple is reinventing the mobile phone with its revolutionary iPhone and App Store, and has recently introduced its magical iPad which is defining the future of mobile media and computing devices.

by Mike Hubbartt, © Copyright 2010, All Rights Reserved.

I love reading about the Middle Ages. I would not want to have lived in those times, but the period interests me more than most in history. When I started at Augsburg College in 2007, my adviser suggested I take the Tuesday evening Medieval Crusades class taught by Dr. Phil Adamo, so I enrolled in it without a glimmer of what it would be like to take a junior-level history course. It was, to say the least, a class where we did a lot of reading (100-200 pages of highly detailed information each week) and writing (6 papers), but the topic and the professor were so interesting that it made the work simple and fun. My book report for the class was on Richard the Lionheart, and while I spent a lot of time on it, I enjoyed every second studying someone who is still well known to many people today.

I met a number of interesting people in the class, including several who majored in Medieval Studies. Arwen went on to get her degree in Library Science a couple of years ago, and Josh just finished this spring. Josh is a blacksmith, and he took the time to make a complete set of armor, which he demonstrated to the other students several times each year. The craftsmanship he put into that armor was impressive – making chain mail is a labor-intensive job, and he didn’t cut any corners. The final product is excellent.

There aren’t many schools that offer Medieval Studies as a major, so Augsburg gets students from all over the country who want to learn about this fascinating topic. While I was at the school, Terry Jones (from Monty Python) came to the school and gave a talk to the students. Click here to see details of the Augsburg Medieval Studies program. Augsburg also offers a Medieval Summer Camp that is worth attending to see if this fascinating subject appeals to you – for details contact Phil Adamo at

Favorite Undergraduate Computer Science Courses

Posted: July 27, 2010 by Mike Hubbartt in Academia

by Mike Hubbartt, © Copyright 2010, All Rights Reserved.

I had several computer science classes I really enjoyed while working on my undergraduate degree.

I enjoyed the Object-Oriented Programming class, because I really dove in and did every bit of the coding in the textbook, as well as the code samples handed out by Professor Sutherland.

I also enjoyed Dr. Watters’ Algorithms class, although I will never again view nested loops as acceptable solutions in any event.
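A hypothetical illustration of the kind of lesson that sticks from an algorithms class (my own sketch, not from the course): checking whether two lists share an element with nested loops costs O(n*m) comparisons, while a set lookup gets the same answer in roughly O(n+m).

```python
# Nested-loop version: O(n*m) comparisons in the worst case.
def share_element_slow(a, b):
    for x in a:
        for y in b:
            if x == y:
                return True
    return False

# Set-based version: O(m) to build the set, O(1) average per lookup.
def share_element_fast(a, b):
    seen = set(b)
    return any(x in seen for x in a)

# Both agree; only the running time differs.
print(share_element_slow([1, 2, 3], [3, 4]))  # True
print(share_element_fast([1, 2], [5, 6]))     # False
```

On small inputs the difference is invisible, but on lists of millions of items the nested-loop version becomes unusable while the set version stays fast.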

I really loved both of Dr. Sheaffer’s Compiler classes. We used Java to write a C compiler that generates assembly language code – tell me it isn’t ironic to use OO code to create a structured-language compiler that generates assembly…
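To give a flavor of what a code-generation phase in such a project involves, here is a toy sketch (in Python rather than Java, and vastly simplified compared to a real C-to-assembly compiler) that translates a parsed arithmetic expression into stack-machine pseudo-assembly:

```python
# Toy code generator: walk an expression tree and emit stack-machine
# instructions. A literal pushes itself; an operator node emits code
# for both operands, then the operation that consumes them.
def codegen(node):
    """node is an int literal or a tuple (op, left, right)."""
    if isinstance(node, int):
        return [f"PUSH {node}"]
    op, left, right = node
    ops = {"+": "ADD", "-": "SUB", "*": "MUL", "/": "DIV"}
    return codegen(left) + codegen(right) + [ops[op]]

# (2 + 3) * 4
program = codegen(("*", ("+", 2, 3), 4))
print(program)  # ['PUSH 2', 'PUSH 3', 'ADD', 'PUSH 4', 'MUL']
```

A real compiler adds lexing, parsing, type checking and register allocation on top of this, but the recursive tree-walk at the heart of it looks much the same.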

How about your favorite computer science classes? What were they, and where were you in school when you took them? Did the professor or the topic make the classes special?

Microsoft Biology Foundation 1.0 ships

Posted: July 26, 2010 by Mike Hubbartt in Press Announcements
by Peter Galli on July 15, 2010 05:59PM

Version 1.0 of the Microsoft Biology Foundation (MBF), which is licensed under the OSI-approved Microsoft Public License, has shipped.

MBF is a language-neutral bioinformatics toolkit built as an extension to the Microsoft .NET Framework, initially aimed at genomics research.

Currently, it implements a range of parsers for common bioinformatics file formats; a range of algorithms for manipulating DNA, RNA, and protein sequences; and a set of connectors to biological web services such as NCBI BLAST.

The MBF executables, source code, demo applications, and documentation are freely downloadable. The source code and development of MBF are managed on this CodePlex site, but all downloads are hosted on the Microsoft Research site here.


I posted this because Bioinformatics is one of the areas of focus I want to pursue while in grad school. Cool stuff.
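For anyone wondering what a “parser for common bioinformatics file formats” actually does, here is a minimal FASTA parser sketched in plain Python. This is my own illustrative simplification, not MBF’s API – MBF is a .NET library with its own parser classes – but the idea is the same: turn header lines (starting with “>”) and the sequence lines beneath them into (name, sequence) records.

```python
# Minimal FASTA parser: a header line starts with ">", and all
# following lines up to the next header are that record's sequence.
def parse_fasta(text):
    """Yield (header, sequence) pairs from FASTA-formatted text."""
    header, chunks = None, []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(chunks)
            header, chunks = line[1:], []
        else:
            chunks.append(line)
    if header is not None:
        yield header, "".join(chunks)

records = list(parse_fasta(">seq1 demo\nACGT\nTTGA\n>seq2\nGGCC\n"))
print(records)  # [('seq1 demo', 'ACGTTTGA'), ('seq2', 'GGCC')]
```

Once sequences are in memory like this, the algorithms MBF describes – alignment, translation, and so on – operate on them as ordinary strings or sequence objects.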

by Mike Hubbartt, © Copyright 2010, All Rights Reserved.

Did you know that King Richard the Lionheart did not speak English, that he was on English soil for only about 1.5 years of the 10 years he ruled, and that he developed one of the best defensive tactics of his era while marching his men from Acre to Jerusalem? I learned that from a book on King Richard, which I read for a report in a Medieval Crusades class I took in the Fall of 2007.

Why did I return to college? The poor economy and bad job market have forced many people to re-evaluate their career options and return to college to earn a new or different degree to make a decent living. I returned to school in 2007, not because of the economy, but because I felt the job market was changing into one that would soon require a degree just to be able to apply for a job in technology. Based on what I’ve seen the past year, I believe I made the right choice.

In June of 2007 I enrolled in Augsburg College’s Weekend College (WEC) program, with a declared major of Computer Science (CSC). Even though I had over 20 years of experience in the IT industry, I was not allowed to skip any of the required classes, so I started with the Intro to Computer Science class (CSC160) that Fall, which was taught by Dr. Kern Sutherland. Dr. Sutherland was a gem – a professor who cared about teaching, not about being the teacher – and her lectures were interesting and thought-provoking. I took the next class in the sequence (CSC170 – OO Programming) with Dr. Sutherland and learned a lot, even though most of the concepts were quite familiar to me. The third quarter Dr. Sutherland retired, so I took the Intro to Networking class taught by Dr. Shana Watters and Data Structures taught by Dr. Larry Crockett – both were still largely a review of familiar topics, but I always found something of interest in the lectures and textbooks for the classes.

My second year (2008-2009) was a lot more challenging. I had Algorithms and Database Design (both taught by Dr. Watters – a very passionate and enthusiastic teacher who truly cares that her students learn) in the Fall, Assembly Language (taught by Dr. Erik Steinmetz – a top-notch teacher who knows how to make complicated problems easy to understand) in the Winter, and Logic (taught by Dr. Charley Sheaffer – as compassionate and understanding a person as anyone I’ve met) in the Spring. My last year (2009-2010) CSC classes were Database Architecture (taught by Dr. Watters) in the Fall, Compilers I (taught by Dr. Sheaffer) in the Winter, and Compilers II (taught by Dr. Sheaffer) in the Spring of 2010.

Starting with my Data Structures class, I made it a point to use the Eclipse IDE for all classes where we wrote code, since I knew it was widely used by developers in the business arena. Eclipse was powerful – it made it so much easier to find and fix minor bugs in my code – and the UI was very intuitive. I became the unofficial advocate for Eclipse on campus, demonstrating it to fellow students and interested professors. To my great pleasure, Dr. Sheaffer used it and required the other students in his Compilers classes to use Eclipse for their class project, and I was told that Dr. Watters intends to require her Data Structures students to use it next year when she teaches that class.

The CSC classes were useful, but, as you might guess, the liberal arts requirements were more of a challenge, as they took time away from the programming classes. I took three math classes (ending with Discrete) and two fine arts classes (History of Jazz – most excellent – and Web Design – a good review of material I knew), but the most daunting classes I took outside of CSC were French 1 and 2. I understand that linguists and psychologists say the younger you are, the easier it is to learn a foreign language. I’m not just out of high school, so I knew this could be my biggest challenge in getting my degree. I decided to take French (not Farsi or Norwegian) and took the two courses in the Winter and Spring quarters of 2010 in classes taught by Dr. Issac Joslin. It was tough – as tough as I thought it would be. We met once a week for 3-4 hours and had a lot of material to get through for the course. I’d say half of the class had no exposure to French and half was well-versed in the language. Such a mix made it a difficult class to teach as well as attend, but the subject was interesting enough to make the effort worthwhile. I graduated over a month ago but still practice French when I email my wife. I hope to retain enough to make a trip to Paris next year and communicate with the people we meet on our vacation.

I graduated this Spring – our last class met June 26th and the graduation ceremony was June 27th at the Augsburg campus. We were only allowed 5 tickets per student for guests, so my wife, my brother Pat, mother-in-law Ardy, and friends Jim Kunkel and Scott Lackey came to watch me receive my diploma. Afterwards we had an open house to share the joy of the day. Our guests were my brother and sister-in-law and two of their grandsons (Derick and Kyle), Ardy, our friends Dr. Jim Kunkel (from Texas) and Scott, Dennis and Meliisa Kleibur and their lovely daughters, my adopted mother and father, good friends Pam and Larry Ellinghuysen, and good neighbors Stu and Maria Elena Nankin and Barb Salem. We ate well. My brother grilled steaks (using Barb’s grill), and I made Perlou Wraps and corn for lunch, with BBQ ribs for supper. We ate most but not all of the food and have been grazing on frozen leftover ribs the past 4 weeks, but the last ones will be gone by tomorrow. C’est la vie.

Was it easy to return to college? Not a bit. Many of my classmates said they’d been in school for 6-8 years, which was too long for me. I’m a sprinter, not a marathoner, so I pushed through in 3 years. I did well enough that I decided to go on to graduate school, so I am enrolled in the GPS program at the University of St. Thomas. I know this will be as challenging as my undergrad degree, if not more so, but I will take advantage of the opportunity to learn as much as possible in every class I take. School is work, and there are good and bad subjects, teachers, textbooks and classmates, but the point is that you are there to learn, regardless of the distractions. I look back with fondness on my time at Augsburg – I met a lot of good people and good teachers, and I learned a lot.

Would I recommend that others return to college? Yes. It was absolutely worth it. Even though I gave up a lot of time watching TV or relaxing with a book because I needed to study, I did something that can’t be taken away: I earned my college degree. I am proud I was able to accomplish this goal and hope that others take the step to make themselves more marketable by going back to school. It isn’t easy, but the things you learn will stay with you forever.

I can promise you will work very hard to get a degree, but I can also promise you will be thrilled when you take that graduation walk to the auditorium to receive a diploma. And the next time someone asks, “Are you a college grad?” you can say with a smile, “Yes I am.”

FUD – an acronym for Fear, Uncertainty and Doubt – is a practice used to generate fear in people so they will demand change or reject valid information.

Is global warming for real, or is it a fabrication? Most scientists and climatologists around the world state that global warming is a problem that we and our children must address. The biggest groups of people who disagree seem to rely on pseudoscience, religious convictions, or political positions.

Chris Monckton

Dr. John Abraham

A noted global warming skeptic, Christopher Monckton of the UK, gave a presentation at Bethel College and stated that global warming is not an issue. The problem: Monckton lacks the scientific background and credentials to act as a valid analyst of this topic. Dr. John Abraham, a professor at the University of St. Thomas, reviewed Monckton’s slides and pointed out how each point either relied on old or incorrectly interpreted data, or used non-facts to promote Monckton’s views. Click here to watch Dr. Abraham’s excellent analysis and dissection of Monckton’s attempts to dismiss global warming using pseudoscience and unverifiable quotes. Monckton’s response was an unprovoked and unwarranted attack on Dr. Abraham and the University of St. Thomas.

I’ve discussed this topic with friends and colleagues for the past 4 years and have seen people rely on political and religious positions to justify their disbelief in global warming. One colleague tried to use half-truths as a reason to doubt science and said he “only believes in the law of gravity.” I sadly told him there is no law of gravity, which he did not believe. A friend told me he didn’t believe in global warming because God wouldn’t allow us to destroy ourselves. I wondered then (and now) how much damage the 3,000+ nuclear weapons in US and Russian arsenals could do to our planet – it likely wouldn’t take that many to destroy it, and they do exist, so what does that say about this perspective? Both cases show that some people have made up their minds and will look for anything to validate that position – you can’t discuss something with someone who refuses to be open-minded, so don’t bother.

Science attempts to find answers, whether those answers agree or disagree with current or previous political and religious opinions. When people use pseudoscience, rumors, outright lies, or misstatements of the truth to bolster their opinion, you have to question their ethics. I believe that trying to hide what you are, want, or promote means you doubt people will believe you if you are honest. Scientists are people, but they are held accountable to the peers who review their work. Good and ethical scientists try to find answers and embrace the truth, yet scientists are human and can be wrong. Science is based on asking questions and trying to find the truth, not on twisting facts to fit political and religious viewpoints.

I applaud Dr. Abraham, an expert with credentials to back his positions, for taking the time to put together a presentation that shows the inconvenient facts, and I hope other scientists continue to stand up for the truth.

Just my two cents…


JULY 30, 2010 –  A Followup

Another conspiracy? Please! Since Dr. Abraham’s presentation refuting Chris Monckton’s position on climate change, Monckton has used mail and interviews to lash out at Dr. Abraham, who refuses to be pulled into a series of personal attacks. Dr. Abraham has stated the scientific reasons why Monckton’s presentation is wrong, and Monckton responded with letters to the University of St. Thomas demanding that they punish Dr. Abraham. Bravo to Dr. Abraham for refusing to let the facts be overridden by Monckton’s non-scientific, outlandish claims, and bravo to the University of St. Thomas for supporting their professor and for refusing to be bullied into placating a non-expert. For someone who bandies about the word ‘libel’ so often, Monckton should consider his own comments about Dr. Abraham, the University of St. Thomas, and the university’s president.

For more information, click here.

By Harry Babad © Copyright 2010, All Rights Reserved


Read about my paradigms, views, prejudices and snarky attitudes @:

The materials I share in the articles that follow come from the various weekly science and environmental newsletters to which I subscribe. When I have time, I also check a variety of energy-related and environmental blogs. In addition, I subscribe to the New York Times, Time Magazine, The Economist, Business Week, Wired, National Geographic, and Smithsonian – all of which I skim for articles of interest.

Some of what I chase is called out in publications by the libraries of the Pacific Northwest [NewsBridge], Sandia, Argonne and Lawrence Livermore National Laboratories, which are managed by the US Department of Energy. Other articles I found interesting come from technology feeds, Discover Magazine, and various international energy and green advocacy groups, as well as from the American Association for the Advancement of Science [AAAS]. The American Nuclear Society [ANS] magazine Nuclear News is another source, and for chemistry-related news there is C&EN from the American Chemical Society [ACS].

Article selection (my article – my choice) is obviously and intentionally biased by my training, experience and, at rare times, my emotional and philosophical intuitive views of what works and what will not… But if you have a topic I neglect, send us feedback and I’ll give it a shot.

Remember, conditions, both technical and geopolitical, continuously change – so if you’ve made up your mind about either the best way to go, or that it’s all a conspiracy, move on to the next article in our blog. Today’s favorite is tomorrow’s unintended consequence. However, that’s better than sticking one’s head in the sand or believing in perpetual motion. Remember, there’s no free lunch and you must always end up paying the piper!

Since my tidbits are only a partial look at the original article, click on through if you want more details, as well as, often, added similar or related articles on the same topic(s).


Now, As Usual, in No Formal Order, a Baker’s Dozen Snippets

——— A List of Their Titles ———

  • Will Electric Cars Destroy Your Neighborhood Power Grid? No, But…
  • How Algal Biofuels Lost a Decade in the Race to Replace Oil.
  • Ten Ideas To Save The Planet From Coal — Burying The CO2 Problem.
  • World’s Largest Working Hydroelectric Wave Energy Device Launched.
  • At Issue Sustainability — The Big Paradigm Shift — Four principles for the shift from a fossil fuel based society plus one from Doc.
  • FACTBOX: What is the Non-Proliferation Treaty?
  • U.S. Set To Fund More Stem Cell Study — New lines approved.
  • Five Key Cyber Security Areas for DHS to Tackle — Government Accountability Office [GAO] Advice.
  • The Carpal Tunnel Survival Guide — You know this but…
  • Highest rate of CO2 emissions growth since 1990 – Data 1990-2005
  • Dams Could Alter Local Weather, Cause More Rain
    {Emphasis added by me.}
  • Tessera Solar and Stirling Energy Systems Unveil World’s First Commercial-Scale Solar-Thermal Plant, The Suncatcher
  • Icy Crystals — Methane Hydrate — Heat Up

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Will Electric Cars Destroy Your Neighborhood Power Grid? No, But…

Over the next 12 months, carmakers will introduce several new plug-in electric vehicles. One question frequently asked of the author – and many others too: Does recharging electric cars pose a threat to the electricity grid?

The 2010 Fisker Karma and the 2011 Chevrolet Volt plug-in hybrids, among other electric-drive vehicles, are scheduled to roll out next year. Without a range-extending engine, the 2012 Nissan Leaf electric car will be even more reliant on the grid. So it’s a reasonable question.

The answer, in short, is: No. The current power grids in the U.S. are more than capable of handling incremental demand from the small numbers of plug-in cars that will be sold over the next few years.

However, depending on where you live and how popular electric cars prove to be, utilities in some areas will need to take a bit of longer-term planning action sooner than others.

It should be noted that a two-volume report, Environmental Assessment of Plug-In Hybrid Vehicles, concluded that the gradual rollout of electric vehicles would impose a very small load on the grid. Since recharging one electric car equals the load of four or five plasma TV sets, overall demand won’t be notably affected.
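As a back-of-envelope check on that TV comparison (the wattage figures below are my own assumptions, not numbers from the report): a car charging from a standard US 120-volt, 12-amp household circuit draws about 1.4 kW, and a plasma TV of that era drew roughly 300 watts.

```python
# Back-of-envelope check: EV charging load vs. plasma TVs.
# Assumed figures (mine, not from the cited report):
volts, amps = 120, 12                 # typical US Level 1 charging circuit
charger_kw = volts * amps / 1000      # ~1.44 kW while charging
plasma_tv_kw = 0.3                    # rough draw of a 2009-era plasma TV

tvs_equivalent = charger_kw / plasma_tv_kw
print(f"Charging load ≈ {charger_kw:.2f} kW ≈ {tvs_equivalent:.1f} plasma TVs")
```

That lands right in the article’s "four or five" range; a higher-power 240-volt charger would roughly double it.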

By John Voelcker – Senior Editor, November 23rd, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

How Algal Biofuels Lost a Decade in the Race to Replace Oil

For nearly 20 years, a government laboratory built a living, respiring library of carefully collected organisms in search of something that could grow quickly while producing something precious: oil.

But now that collection has largely been lost. National Renewable Energy Laboratory scientists found and isolated around 3,000 species of algae from construction ditches, seasonal desert ponds and briny marshes across the country in a major bioprospecting effort to find the best organisms to convert sunlight and carbon dioxide into fuel for cars.

Despite meager funding, the Aquatic Species Program, initiated under President Jimmy Carter, laid the scientific foundation for making diesel-like fuel from the fat that microscopic algae accumulate in their cells. Fifty-one varieties were carefully characterized as potential high-value strains, but fewer than half of those remain.

“Just when they started to succeed is when the plug got pulled,” said phycologist Jeff Johansen of John Carroll University, who collected algal strains for the program in the 1980s. “We were growing them in ponds and we were going to grow enough to have them made into a diesel fuel.” The program was part of the huge investment that Jimmy Carter made into alternative energy in the late 1970s. All kinds of research avenues were explored, but when the funding shriveled during later years, knowledge, experts and know-how were lost. The setback highlights the problems created by inconsistent funding for energy research.

Now, President Obama has trumpeted the American Recovery and Reinvestment Act, also known as the stimulus package, as the largest increase in scientific research funding in history. Scientists roundly applauded the billions of dollars that went into energy research, development and deployment. But what about when the stimulus money runs out in two years?

“One caution is that much of this has been funded with the stimulus package,” said Ernie Moniz at a Google-hosted panel on energy in late November. “So, we’re going to have to see what happens after these next two years, because what we need is not a drop, but a further increase in R&D commensurate with the task at hand.” And that’s exactly what didn’t happen in the last big energy R&D push. A discussion of algae’s comeback can be found on the linked site.

Alas, reinventing the wheel is an old bureaucratic fallback position, since records are never really kept when a project is closed down by a lack of funding.

By Alexis Madrigal, Wired — December 29, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Ten Ideas To Save The Planet From Coal — Burying The CO2 Problem

It may be expensive, but with coal demand expected to soar by 2030, capturing and storing carbon emissions is as vital as ever. The UK’s top scientists tell us why.

What idea, policy or technology holds the greatest promise for tackling climate change? That was the question the UK’s Channel 4 News posed to the scientific community over the past few weeks.

Among a number of scientists the verdict was unanimous – carbon capture and storage must work. 

 The Department of Energy and Climate Change predicts that by 2030 global coal demand will rise by 70 per cent.

But where and when coal burns, carbon is emitted. And these emissions must be stored long term in order to combat our changing climate.

“In the immediate future, I believe that the greatest promise is offered by technologies related to Carbon Capture and Storage (CCS),” Dr. Andrew Yool from Ocean Modeling and Forecasting told Channel 4 News.

“In principle, these will allow technological societies to retrofit existing infrastructure while continuing to use fossil fuel resources without exacerbating either climate change or ocean acidification.”

CCS is a means of capturing carbon emissions at the source (fossil fuel power plants) and storing them away from the atmosphere underground.

Long-term storage of CO2 is not only expensive but also a relatively new concept.

Last year operations began at the world’s first ever coal-fired power plant with facilities to capture and store its own emissions. It was built as an extension of the Schwarze Pumpe plant in Germany; the mini power station is a pilot for CCS.

For more on the technical specifics of the CCS concept, check out both the link and the Wikipedia article [].

However, efforts to do so in the US have been met with all sorts of bureaucratic stumbling blocks.

For other UK potential solutions to climate change, check out the link below. Some are technical, but many relate to institutional, societal and attitude changes. These are always a harder task, of course, unless misbehaving costs you personally – yes, comes right out of your pocket.

Channel 4 News, UK November 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

World’s Largest Working Hydroelectric Wave Energy Device Launched

Queen’s University Belfast has helped the global wave energy industry take a major stride forward with the launch of the world’s largest working hydroelectric wave energy device by Aquamarine Power Ltd.

Known as Oyster, the device has been officially launched by Scotland’s First Minister Alex Salmond MP, MSP at the European Marine Energy Centre (EMEC) in Orkney.

It is currently the world’s only hydroelectric wave energy device producing power, pumping high-pressure water to its onshore hydroelectric turbine. This power will be fed into the National Grid to supply homes in Orkney and beyond. A farm of 20 Oysters would provide enough energy to power 9,000 three-bedroom family homes.
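A quick sanity check on the 9,000-homes claim (the per-household figure below is my assumption; the article gives no per-home demand): if an average household draws about 0.5 kW, the farm’s average output and each device’s share work out as follows.

```python
# Rough sanity check on "20 Oysters power 9,000 three-bedroom homes".
# Assumption (mine): an average household draws ~0.5 kW (~4,400 kWh/year).
homes, oysters = 9000, 20
avg_home_kw = 0.5

farm_avg_mw = homes * avg_home_kw / 1000        # average output of the whole farm
per_oyster_kw = homes * avg_home_kw / oysters   # average share per device
print(f"Farm ≈ {farm_avg_mw:.1f} MW average, ≈ {per_oyster_kw:.0f} kW per Oyster")
```

The real figures depend on the local wave climate and on what household demand you assume, but a few hundred kilowatts averaged per device is the right order of magnitude for a claim like this.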

Professor Trevor Whittaker from Queen’s School of Planning, Architecture and Civil Engineering, the principal investigator, said: “The concept of Oyster came about through research in our wave-tank facility at Queen’s. The launch of Oyster is both a major landmark in terms of carbon-free sustainable energy production and a proud day for Queen’s University Belfast, which already has a reputation as being one of the leading wave-power research groups in the world. In fact Oyster is the third prototype demonstration wave power project, which the team at Queen’s has instigated in the past 20 years.

“Devices such as these have the power to revolutionize the world’s energy industry, at a small scale, and help combat climate change.” Doc sez: if it is cost-effective without subsidies, no matter what the scale, every kWh helps, especially if it prevents more CO2 from being released in the generation of electricity.

The Product and Design Development Web Page, November 30, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

At Issue Sustainability — The Big Paradigm Shift — Four principles for the shift from a fossil fuel based society

I’ll keep this one short to raise your temptation level. Needless to say, I agree with Jim Lane about the pitfalls of not recognizing the problems created by global warming and energy shortfalls.

In the 2010 Dodgen Lecture at the annual meeting of the Mississippi Academy of Sciences, and in the Q&A that followed, I described four principles that must be observed in order to successfully complete the transition away from a fossil-fuel based society.

  1. The Right To Clean, Affordable Energy:
  2. Energy Must Be Consumed Within The Radius That It Is Produced: NIMBY be damned
  3. An Energy Finance System Must Permit Individuals To Participate:
  4. Energy Must Be Recognized As A Special Class Of Investment:

The Missing Paradigm Change

  1. If it’s subsidized, it’s not paying for itself, and is likely to be a long-term loser. [Doc]

More at the link, some of which I agree with.

Article by Jim Lane, Editor, Biofuels Digest, February 2, 2010.

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

FACTBOX: What is the Non-Proliferation Treaty?

The objective of the treaty, which took effect in 1970, is to halt the spread of nuclear weapons-making capability, guarantee the right of all members to develop nuclear energy for peaceful ends and — for the original five nuclear weapons powers — to phase out their arsenals. Here are some key facts about the Non-Proliferation Treaty. An additional description can be found in the Wikipedia article on the same subject.

The treaty has been in the news lately, usually making headlines but without any detail about what it is all about, whether discussed as part of President Obama’s new initiative, Israel’s lack of membership in the treaty organization, or the recalcitrant behavior of the Iranian government. So link in!

Reuters News, Mon Nov 30, 2009


– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

U.S. Set To Fund More Stem Cell Study — New lines approved

The Obama administration has begun approving new lines of human embryonic stem cells that are eligible for federally funded experiments, opening the way for millions of taxpayer dollars to be used to conduct research that was put off-limits by President George W. Bush.

Launching a dramatic expansion of government support for one of the most promising but most contentious fields of biomedical research, the National Institutes of Health on Wednesday authorized the first 13 lines of cells under the administration’s policy and was poised to approve 20 more Friday.

“This is the first down payment on what is going to be a much longer list that will empower the scientific community to explore the potential of embryonic stem cell research,” said NIH Director Francis S. Collins. “Today’s announcement is the first wave.”

An additional 76 stem cell lines are awaiting vetting, and researchers have indicated that they plan to submit at least 254 more for approval.

The NIH has already authorized 31 grants worth about $21 million for research on human embryonic stem cells, money that was contingent on new lines passing government muster. The grants are for a variety of research, including work aimed at developing cells that could be used to treat diseases of the heart and nervous system.

But opponents of the research, who argued that the work is not only unethical but also unnecessary because of the availability of adult stem cells and other more recently identified alternatives, condemned the announcement.

The article goes on to discuss both <1> opponents’ views, including those of the US Conference of Catholic Bishops, and <2> the background of, and information about, the new NIH guidelines.

Doc Sez: So far, the data about the usefulness and efficacy of stem cells not derived from embryos have been disappointing. In addition, much of the rest of the world is more concerned about curing present-day ills than about ‘saving’ unwanted embryos that will be destroyed anyway.

In addition, I believe the new golden rule should apply – do unto others what they would do unto themselves. That means anyone objecting to the use of therapies based on embryonic stem cells should not use them – put your life where your mouth is.
For information on the New NIH Guidelines for Obtaining the Cells see:

Article by Rob Stein, Washington Post Staff Writer, December 3, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Five Key Cyber Security Areas for DHS to Tackle — GAO Advice Presented at Senate Hearing on Post 9/11 Security

Five key cyber security challenges the Department of Homeland Security should tackle were outlined in testimony delivered Wednesday at a hearing on post-9/11 transportation challenges.

The only witness at the Senate Commerce, Science and Transportation hearing on post-9/11 transportation challenges Wednesday was Homeland Security Secretary Janet Napolitano, but the managing director for homeland security and justice at the Government Accountability Office delivered a written statement for the record that outlined five key cyber security challenges the Department of Homeland Security should tackle.

According to the statement prepared by Cathleen Berrick, the five key cyber security areas include:

  • Bolstering cyber analysis and warning capabilities;
  • Completing actions identified during cyber exercises;
  • Improving cyber security of infrastructure control systems;
  • Strengthening DHS’s ability to help recover from Internet disruptions; and
  • Addressing cybercrime

Berrick said DHS has made progress in strengthening cyber security, such as addressing some lessons learned from a cyber attack exercise, but further actions are warranted. “DHS has since developed and implemented certain capabilities to satisfy aspects of its responsibilities,” she said, “but it has not fully implemented GAO’s recommendations and, thus, more action is needed to address the risk to critical cyber security infrastructure.” To read Ms. Berrick’s statement, register and then check:

Article by Eric Chabrow, Executive Editor, December 2, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

The Carpal Tunnel Survival Guide

My index finger went completely numb. You could poke it and I wouldn’t feel a thing. That was the flashing red neon sign telling me something was wrong. The culprit: laptops. My esteemed partner in crime, Patrick Miller, recently wrote about what bugs him about laptops. Now it’s my turn, but I want to share a personal tale with you. Along the way, I’ll tell you how to avoid the same mistakes I made.

Carpal Tunnel: A Loathe Story — Carpal tunnel syndrome and RSI–the bane of the modern computer user–hit home for me because I spent too much time using poorly placed touch pads and seriously scrunched keyboards. But I’ve gleaned a thing or two about ergonomics as a result. My misery is your chance to learn.

Go on; hold your hand near a laptop’s touchpad and mouse buttons. Every time you start tapping out a document on the bus, at the coffee shop, or on that flight to the big meeting you’re likely forcing your hands into uncomfortable positions. As my doc told me the other day: “Don’t buy a laptop based only upon what you plan to do with it. First and foremost, make sure that it conforms to your body’s needs–not the other way around.”

Dr. Thomas M. Marsella, MD, with the Occupational Health Services department of the Physician Foundation at California Pacific Medical Center has a couple other suggestions.

The advice:

  • Take Breaks – There’s even freeware to help
  • Do Some Stretches – The examples
    Stretch 1: Evil Genius.
    Stretch 2: Hands Down.
    Stretch 3: Double Chin.

I know, you know all about this, but do you practice healthy computer ergonomics or do you just hope your wrists are stronger than those of other wimps?

Also check out 9 Things I Hate About Laptops By Patrick Miller, PC World, November 17, 2009.

Darren Gladstone, PC World, Dec 2, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Highest rate of CO2 emissions growth since 1990

According to the article posted on, CO2 emission rates have been increasing worldwide since 1990 and reached all-time highs in 2005. The reported information was derived from emissions data from the Oak Ridge National Laboratory’s Carbon Dioxide Information Analysis Center (CDIAC).

Alas, I did not have time to check the data against the information available from the US Department of Energy’s Energy Information Administration site [] or the Oak Ridge data, but I can see no reason, a priori, to dispute it. The article is simply a compilation of data, in tabular form, and draws no conclusions and posits no future actions. That is left to the users of such data, including regulatory agencies and those concerned with international climate change.

I’ve reproduced the summary findings, and added information about the United States below, for your convenience.

  • Between 1990 and 2005 Vietnam had the highest rate of emissions growth among countries that emitted more than 100 million tons of CO2 in any year during the past three decades.
  • Vietnam’s emissions from fossil fuel use, cement manufacturing, and gas flaring increased 376 percent from 5.8 million metric tons of carbon to 27.8 million tons between 1990 and 2005. Malaysia ranked second with a 224 percent increase.
  • China topped the rankings in terms of total carbon emissions growth over the period. China’s carbon emissions from fossil fuel combustion, cement manufacturing, and gas flaring increased by 875.7 million tons of carbon to 1.53 billion tons of carbon (5.62 billion tons CO2) in 2005. Most of the increase came from coal burning, which accounted for 1.116 billion tons of the country’s carbon emissions in 2005. China’s emissions have since climbed by another 25 percent to 1.923 billion tons of carbon in 2008, according to preliminary figures from CDIAC.
  • The United States had the second highest growth rate in carbon emissions and released the third highest amount of carbon dioxide.
  • The biggest drop in emissions between 2000 and 2005 came in Germany, which reduced carbon emissions by 6.4 million tons of carbon or 3 percent of its total emissions.
  • Belgium had the greatest rate of reduction at 7 percent over the period.

Also check out NASA: Last Decade Was Warmest on Record, January 2, 2010.

Article posted on December 4, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Dams Could Alter Local Weather, Cause More Rain

Note: I found this article informative, especially since it may be just another example of the Law of Unintended Consequences that plagues our decision making []. Although the Wikipedia article focuses on the social sciences, as we should by now be aware, the physical sciences are no exception to its applicability. But let’s not stick our heads in the sand – we’re sure to lose them, along with the rest of our bodies.

As if America’s aging dams were not in enough trouble already, new research suggests that their reservoirs could be increasing the intensity of extreme rainstorms in their immediate vicinities.

That’s a problem because the dams were designed for the climate that existed in the area before they were built. If by virtue of their creation, they increase the chance that an extreme weather event will exceed the dams’ capacity, they could be less safe than previously thought.

“What if the dam itself, its reservoir, could have accelerated or intensified the heavy rainfall patterns?” said Faisal Hossain, a hydrologist at Tennessee Tech University, who has co-authored a paper and editorial on the topic accepted for publication in Natural Hazards Review and Water Resources Research, respectively.

There is strong evidence that a standing body of water, like a lake, can alter precipitation patterns, Hossain said. Increasing the amount of liquid water in a region increases the amount of evaporation in a region, too. That water vapor will eventually condense and fall as precipitation. So, it’s logical to think that a dam’s reservoir could have the same impact. And dams allow irrigation, which can transform the land in the area, possibly leading to local climatic impacts.

Marshall Shepherd, a research meteorologist at the University of Georgia, called the findings “interesting and plausible” in an e-mail to “The literature contains many examples of how extreme land use changes alter precipitation patterns,” wrote Shepherd, whose own work focuses on climatic changes induced by cities.

Shepherd would like to see more detailed analysis of the mechanics behind how a dam could change local precipitation.

There’s a bit more summary information in the article, but alas no detailed references – that’s a string I would have pulled.

Also check out a related article, Old American Dams Quietly Become a Multibillion-Dollar Threat, by Alexis Madrigal, Wired Magazine, August 25, 2009.

The image in this article shows the result of a busted gate in the Folsom Dam in Northern California that let water pour through in 1995. AP/Bob Galbraith. See Wikipedia –

Read More —

This article is by Alexis Madrigal, Wired Magazine, December 3, 2009.

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Tessera Solar and Stirling Energy Systems Unveil World’s First Commercial-Scale Solar-Thermal Suncatcher Plant

Only four months after breaking ground, Tessera Solar and Stirling Energy Systems (SES) showcased the highly anticipated Maricopa Solar power plant at a special event for key partners, stakeholders and media. Maricopa Solar is the first commercial project for the SunCatcher™ concentrating solar power technology designed and manufactured by SES. Joining in the celebration were Arizona Governor Jan Brewer and officials from the Salt River Project [SRP], local and state government, the U.S. Department of Energy, Sandia National Laboratories, utility customers, suppliers and the international energy group NTR plc, Tessera Solar and SES’s majority shareholder.

“Maricopa Solar represents a genuine breakthrough in solar energy and demonstrates that Dish Stirling engine based solar power is now ready for commercial deployment in the US and around the world. With this milestone now behind us we look forward to breaking ground on our initial 1,500 megawatts of projects in California and Texas later this year.”

Maricopa Solar comprises 60 SunCatcher dishes and will provide 1.5 megawatts of renewable energy to SRP customers in Greater Phoenix, Arizona.

“Through partnerships such as Maricopa Solar, we will be able to learn a great deal about this emerging solar technology while helping to create green jobs, economic development opportunities and clean energy for SRP and our customers,” said SRP Associate General Manager Richard Hayslip. “The Maricopa Solar project is just one example of SRP’s commitment to building a renewable energy portfolio that is beneficial to our environment and customers.”

The innovative and highly-efficient SES SunCatcher is a 25-kilowatt solar power system which uses a 38-foot, mirrored parabolic dish combined with an automatic tracking system to collect and focus the sun’s energy onto a Stirling engine to convert the solar thermal energy into grid-quality electricity.

“The SunCatcher represents the next generation of grid-quality solar power technology providing clean, reliable and cost-effective solar power to address global climate change and reduce our planet’s carbon emissions,” said Steve Cowman, Stirling Energy Systems CEO.

SunCatcher has a number of advantages including the highest solar-to-grid electric efficiency, zero water use for power production, a modular and scalable design, low capital cost, and minimal land disturbance. SunCatcher was designed and developed in America, through a public-private partnership with the U.S. Department of Energy. The SunCatchers unveiled at the Maricopa Solar facility were manufactured and assembled in North America, mostly in Michigan by automotive suppliers.
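Those quoted numbers allow a crude geometry check on the efficiency claim (the 1,000 W/m² insolation figure and the solid-circle aperture are my assumptions, not from the article; the actual mirror area is somewhat smaller than a solid 38-foot disk, so the estimate below is conservative):

```python
import math

# Crude lower-bound estimate of SunCatcher solar-to-grid efficiency.
# Assumptions (mine): full 38 ft dish treated as a solid circular aperture,
# nominal direct insolation of 1.0 kW/m^2, rated output of 25 kW.
dish_diameter_m = 38 * 0.3048                        # 38 ft ≈ 11.6 m
aperture_m2 = math.pi * (dish_diameter_m / 2) ** 2   # ≈ 105 m^2
insolation_kw_m2 = 1.0
rated_output_kw = 25

efficiency = rated_output_kw / (aperture_m2 * insolation_kw_m2)
print(f"Aperture ≈ {aperture_m2:.0f} m^2, efficiency ≥ {efficiency:.0%}")
```

Roughly 24 percent or better is indeed high for grid-connected solar of that era, which is consistent with the "highest solar-to-grid electric efficiency" claim.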

January 25, 2010, No By-line

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Icy Crystals — Methane Hydrate — Heat Up

A Houston engineer says he’s developed a technology that can economically extract energy on a large scale from an untapped — and vast — source.

The material – the source of the potential natural gas fuel – is methane hydrate, aka methane clathrate, the presumed culprit in several of British Petroleum’s problems in the Gulf, here considered as a potentially extractable natural resource. Doc Sez: Did you know that oil drilling rigs flare off this valuable resource – for shame.

Vast deposits of methane, trapped in ice-like crystals under Alaska’s frozen tundra and beneath ocean floors worldwide, could play an important role in the nation’s energy future.

But after more than two decades of study, major oil companies and governments are still trying to crack the code to large-scale extraction of these energy rich substances called gas hydrates.

Wickrema Singhe, a Houston-based engineer and project consultant to some of the world’s biggest oil companies, has wrestled with the same problem. And recently, he’s developed a technology he believes could provide at least part of the answer.

The technology involves using low-energy microwaves to melt the icy structures and unlock the gas, in contrast to leading industry methods that Singhe says use far more energy or are too costly for commercial production.

The concept has attracted interest from oil companies including BP and Chevron Corp., as well as Japanese oil and gas firms.

He hasn’t found a taker yet, but he fervently believes in the concept and is pitching for universities including Texas A&M to do more testing.

“I’m fairly confident I’ll find a way to get to the next step,” Singhe said.

Gas hydrates are ice-like solids that trap gas molecules inside. They are found in high-pressure, low-temperature environments, like seabeds and beneath frozen ground. The most common gas found in them is methane, the chief component of natural gas. That’s why they’re often called methane hydrates.

The U.S. Geological Survey estimates that methane hydrates may contain more energy than the entire world’s fossil fuels combined. As such, methane hydrates could represent a “global paradigm shift in energy supply,” Ray Boswell, manager for methane hydrate research and development programs at the U.S. Department of Energy’s National Energy Technology Laboratory, told a U.S. congressional committee in July. There’s more – check it out.

Article by Brett Clanton, The Houston Chronicle, Jan. 23, 2010

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Copyright Notice: Product and company names and logos in this review may be registered trademarks of their respective companies.

Some of the articles listed in this column are copyright protected; their use is acknowledged and is limited to educational purposes, which this column serves.

Sources & Credits: — Most of these items were found in NewsBridge, a newsletter of ‘articles of interest’ for the library’s technical and regulatory agency users. It is electronically published by the Pacific Northwest National Laboratory in Richland, WA. I then followed the provided link to the source of the information and edited (abstracted) the content for our readers.

In Closing

I’ll be posting articles for your comfort and anger in the next few months. I never respond to flaming, but will take time to provide evidence in the form of both primary technical and secondary (magazine article) references for those who ask. However, most of you can reach out and Google such information for yourselves.

May your world get greener and all creatures on Earth become healthier and able to fulfill their function in this Gaia’s world.

Harry, aka doc_Babad


Previous Greening Columns

(Originally posted on Yahoo news at – click here. All copyrights belong to their respective owners.)

MOJAVE, California – A company working to send tourists on suborbital flights tested its spacecraft with a crew for the first time.

Virgin Galactic says the craft remained attached to a specially designed airplane throughout a six-hour flight over California’s Mojave desert Thursday.

On its website, the company congratulated the crew and said “Objectives achieved.” It says the two crew members evaluated all of the spaceship’s systems and functions.

Virgin Galactic says the flight test program will run through 2011 before it starts commercial operations.


On a personal note, I was at Oshkosh in July of 2005 and saw SpaceShipOne at the EAA airshow. It was magnificent. I haven’t seen SpaceShipTwo in person, but I hope any of our blog readers who see it will take some photos and post them here for all to see.

For more information about SpaceShipOne, click here.

By Mike Hubbartt, © Copyright 2010, All Rights Reserved.

Eschalon: Book II
Version: 1.04 (Updated July 14, 2010)
Publisher: Basilisk Games
Price: $24.95 (Download), $35.95 (DVD)
To contact them via email:
Free Demo via Download: click here.

Role Playing Games (RPGs) have been around for a long time. The first RPGs needed books that had stats for players, treasure, and obstructions. In the book-based version of Dungeons and Dragons (D&D), a Dungeon Master (DM) drew the map of the site of the action on paper with obstructions, treasures, and Non-Player Characters (NPCs – essentially extras that can provide help when asked). I played D&D in college, and we had an excellent DM (Smaug the Unpleasant) and a great group of players that spent hours exploring and fighting (Beaudrow the Elf) and opening doors (Aragon the Strong).

RPGs were among the first games produced for home computers, and my favorite type of software was D&D. I played simple versions of it on the Commodore 64 and more advanced versions (graphically as well as depth of play) on the more advanced Amiga. I played Dungeons and Dragons, Ultima, and Might and Magic and really liked them, so you can understand that I was looking forward to playing Eschalon: Book II.

Basilisk Games says that Eschalon: Book II is “a turn-based role playing game” based on the RPGs of the 1980s and 1990s. Like the classic games, Eschalon players have attributes that affect how well they do, there are objects scattered throughout the game that are useful, and there are missions to undertake where players can improve their character attributes.

The game system requirements are:

  • CPU: 2GHz or faster
  • RAM: 512 MB
  • Video: 3D Accelerated
  • OS: Mac OS X 10.3.9 or greater, Windows 2000/XP/Vista, Linux
  • Display Options: full screen mode (1024×768) or within a window

Getting Started

I downloaded and installed the software (expanding the downloaded .dmg file and dragging the folder into the Applications folder) on a new 13.3” 2.26GHz Intel Dual Core MacBook with a 250 GB hard drive. There were no problems during the installation, and the game takes up less than 300 MB of space on the hard drive.

I started the game and liked the options/info at the initial startup screen, which has good information that can help Basilisk’s support in the event of problems getting the software to run on your machine. I selected Launch Game to get to the intro, which can be skipped after learning the necessary story background, but I suggest first-time players watch the intro.

Per the manual, my first task was to create a character, so I created Doc Beaudrow. Basilisk gives each character an extra 20 attribute points beyond the main ones generated by the game (Strength, Endurance, Wisdom, Perception), which I used to enhance my elf’s fighting abilities. After Beaudrow the Elf was ready, I went to the game rules screen and selected the easiest options to see how the game plays.

The game starts in the character’s home, where I found items in the drawers and cabinets inside the house and took them for my adventure. A tip… take everything (including the journal) from all of the drawers in the house, look in the barrel outside the house (it sometimes has gold), and don’t forget to take a drink and fill your water skin at the well before leaving. Characters need to eat and drink. Upon leaving the house I found a message from someone that wanted to contact me, so I set off to an inn in a nearby town. I stopped along the way and bought some supplies and weapons, which I would need very soon.

I spoke to NPCs that I met on the trail and in the buildings of the town (this is how you learn about additional adventures for each level), then made it to the inn to meet the person that left the message at my home; he gave me just enough to get me interested before he was killed. I set out, but had forgotten the RPG player’s prime creed: save early and save often. I had accumulated money, resources and experience, but my poor elf was about to journey to the afterlife. I encountered some nasty dragonets that easily overwhelmed me, so Beaudrow was no more. To my chagrin, I lost everything – the character and the experience – so I had to create a new character and start over.

Armed with knowledge that a lesson learned is not a bad one, I recreated my elf, picked up all the goodies in and around the house, then saved the elf before heading to town. This time I bought the supplies and weapons, found more gold, met the person in the inn, and saved my character before going any further. I now had a starting point, plus a point where I had some experience points and no damage.

I went to a home under siege by dragonets, barely evading them and entering the home of a man that needed someone to find and destroy the nest of these creatures. I used a game feature to take a screen shot of the game at that point. My elf stands by a table while the dragonets (they look like oversized dragonflies, but their bite is much worse) are buzzing about outside the home. These creatures attack fast, and the best way I found to kill them was to use arrows, so make sure you buy plenty when you get the chance on the way into town. Your character status (damage and ability to continue to fight) and available weapons are easily accessible on the screen. Note the floppy disk icon at the far right of the list of icons – that is your way to save the game at the current point in the adventure. As I’ve said before, save early and save often. Also note the map with visible and hidden regions at the top right area of the screen. It shows where you’ve been in each level of the adventure.

I think you get the idea now. This RPG is like others from the past. It is simple to learn for people already familiar with this type of game, yet also easy to pick up if this is the first time you are playing it. The screen layout is useful yet not overwhelming.

I have explored most of the first level and am going on to the others as time allows. I intend to continue playing the game and will update this article from time to time. If any readers want to contribute tips (just hints, not explicit instructions) for other readers, please feel free to post them here.


Pros

Excellent manual – I was impressed with the desktop publishing, as well as the way the game information is covered so well without being too much to wade through.

Very minor system requirements to play the game, so people with older computers can enjoy the game as well as people with new, cutting-edge equipment.

Reasonable price, and available as a download or on a disk for installation.

Viewing the game full screen or in a window is a nice touch, especially for those playing (during breaks or lunchtime, of course) while at work.

I really like first person shooters, but a good RPG should be more than just chopping down opponents. Basilisk provides good background information in the manual and the intro that makes it an adventure with goals to accomplish.

Configurable Difficulty Modes – this makes it so easy for a newcomer to learn how to play the game without getting bogged down in details that are more important to experienced gamers.

Finally someone is making an effort to get decent games to people running Linux. I would like to hear from anyone that runs this under Linux – any issues with a particular version/type of Linux, like/dislike, and general impression when compared to other Linux games.

I love how easy it is to take screen shots within the game – what a great way to help reviewers, as well as people that post their gameplay on a blog or website.


Cons

One minor problem with the manual (missing material on page 2: the explanation for Left Ctrl at the bottom of the page), which does not affect the quality of an otherwise outstanding piece of documentation. I wish the manual had page numbers (which I added), as it is otherwise necessary to sort through the whole document to find useful tips.

I’d also prefer that it was easier to find details like the location of saved games and screen shots. For example, mine was located in /Users/mikehubbartt/Library/Application Support/Basilisk Games/Book 2 Saved Games.


Fun, and I recommend it as a good value for the money and a good buy. If you played any of the older RPG games like D&D, Ultima, Might and Magic, then you will enjoy this game. If you are new to the RPG genre, download and try the trial of this game. You just might be pleasantly surprised.

Press Release – SWF Protector (July 12, 2010)

Posted: July 12, 2010 by Mike Hubbartt in Press Announcements

© Copyright 2000-2009 DCoM Soft. All Rights Reserved.

Nice, France: DComSoft has released an all-new, powerful SWF Protector that works on Mac OS X 10.4 and Mac OS X 10.5. This simple-to-use yet thorough tool helps you protect your SWF files in a very reliable way. From now on, nobody will be able to steal the ActionScript from your SWF files or use it for their own purposes.
Has it ever happened to you: you create an amazing SWF file using all your skills and knowledge, then find out that somebody has reverse engineered your brainchild and is using it for their own purposes? We don’t think that is acceptable, so DComSoft has developed an application that will help you protect your Flash animations from unauthorized use.

Welcome the totally new DComSoft SWF Protector for Mac! With this tool it is easy to protect the ActionScript in your SWF files so that nobody will be able to restore it, because SWF Protector for Mac makes the source code of an SWF file completely inaccessible to SWF decompilers.

Versatile settings allow you to customize your protection in different ways.

DComSoft SWF Protector for Mac offers its users two protection modes: Simple and Advanced. Simple mode allows processing many SWF files at a time, with SWF Protector choosing the protection mechanism based on the particulars of each file. In Advanced mode you process one file at a time, and the settings can be selected very specifically for each object.

The various protection algorithms of SWF Protector for Mac provide thorough encryption of the scripts in your file.

SWF Protector Main Features:

  • Four reliable protection algorithms
  • Full support of AS 2.0 and AS 3.0
  • Two protection modes: Simple and Advanced
  • Choice of protection method and its intensity in Advanced mode
  • Batch file protection in Simple mode
  • Option to view full internal file information

Pricing and Availability:

  • Personal License: $39.95
  • Business License for One User: $59.95
  • Company License: $299.95

SWF Protector 2.0
Download SWF Protector
Purchase SWF Protector
Application Icon

DCoM Soft, Inc. is a software development company with over ten years of experience in developing software solutions for Microsoft Windows, Linux and Mac operating systems. We try to make our solutions dependable, fail-safe and easy to use.

By Harry Babad © Copyright 2010, All Rights Reserved


With A Chip on My Shoulder — I avoid greening sites that equate a demonstration of a concept (e.g., a lab test or small-scale pilot test) with having an industrially viable commercial solution, an instant cure-all for our environmental and energy woes. My paradigm: government subsidies don’t make things commercially viable. Indeed, governments have, internationally, been shown to consistently pick losers, whether in green energy or in other technology areas (e.g., corn ethanol vs. food, no-cost unmetered water policies in a world of drought and shortages, or offshore oil as an energy cure-all). In addition, subsidizing industry to use its politically sold favorites… no way.

Read more about my paradigms, views, prejudices and snarky attitudes at:

The information I share in the articles that follow comes from the various weekly science and environmental newsletters to which I subscribe. When I have time, I also check a variety of energy-related and environmental blogs.

Article selection (my article, my choice) is obviously and intentionally biased by my training, experience and, at times, my emotional and philosophical views of what works and what will not. If you want more information, read the article by clicking on its link. If that does not satisfy, Google a bit.

Bottom Line: The resulting column contains a mini-summary with links to articles I found interesting. I also get technology feeds from the New York Times, Time Magazine, The Economist, Business Week, Discover Magazine, various international advocacy groups, and the American Nuclear Society. I also subscribe to a number of technology blogs, which are identified when I use their posted contents.

Why a Greening Column? — This all started while writing two textbooks on things nuclear for high school students and their teachers. Googling and reading subscriptions turned out to be as good a way as any to keep up with a rapidly changing world of energy and greening – for example, who would have thought a few weeks ago that offshore oil might not be the main route to US energy independence, climate change be damned?

Remember, conditions, both technical and geopolitical, continuously change. So if you’ve made up your mind about either the best way to go, or decided it’s all a conspiracy, move on to the next article in our blog. Remember, today’s favorite is tomorrow’s unintended consequence. However, solving unintended consequences is better than sticking one’s head in the sand. As Charles Dickens would have likely agreed: it was the best of times, the worst of times.

= = = = = = = = = = = = = = = = = = = = =  = = = = = = = = =

Now, as Usual, in No Formal Order, a Baker’s Dozen of Snippets

——— A List of Their Titles ———

  • Building Better Biofuels
  • EPA Proposes Stronger Air Quality Standards For Sulfur Dioxide (New Standard To Protect Millions of The Nation’s Most Vulnerable Citizens)
  • E-Transportation Jump-Start: Coalition Seeks to Pave the Way for Electric Vehicles
  • Don’t Bet On a Hydrogen Car Anytime Soon
  • Engineer Designs Micro-Endoscope to Seek Out Early Signs of Cancer
  • Experts Say — The Smart Grid Poses Privacy Risks
  • The Six Greatest Threats to U.S. Cyber Security
  • Battery Research Aims To Store Renewable Energy
  • A New Route{s} to Cellulosic Biofuels
  • As Nuclear Reactor Fleet Ages, Engineers Ask, ‘Is 80 the New 40?’
  • The Hidden Costs Of Fossil Fuels – And Biofuels, Too
  • Native Grasses An Explosive Idea For Cleaning Contaminated Soil
  • 10 Reasons Not to Revive the Nuclear Power Industry

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Building Better Biofuels

By Dr. Tim Donohue, Director of the Great Lakes Bioenergy Research Center

Making biofuels from plants brings opportunities and challenges, according to Dr. Tim Donohue, Director of the Great Lakes Bioenergy Research Center, one of three U.S. Department of Energy Bioenergy Research Centers. The opportunity lies in the availability of the raw material. Donohue gave a talk at the Pacific Northwest National Laboratory’s Frontiers in Biological Sciences Seminar Series. The series features academic, government and industrial leaders who discuss novel ideas and scientific advances in biological sciences.

“We’re trying to replace fossil fuels in the liquid transportation fuels sector, so we have to use a readily available feedstock. Cellulose is the most abundant organic material on the planet,” said Donohue. It consists largely of sugar polymers (glucose plus others) that can be converted to other fuels by catalytic or microbial chemistries. And these sugars come from the non-edible parts of the plants, rather than from food sources.

The challenges include getting at the sugars trapped in insoluble fibers of the cellulose wall, and the variety of cellulosics. “Plants being considered are hardwood, softwood, corn, and switch grass. However, there’s likely no one magic solution,” said Donohue. One of the Center’s roles is to come up with the varying solutions needed.

Read more about the benefits and present limitations of biofuels production at the Science-Physics-Technology-Nanotechnology News blog.

November 17th, 2009 – Article Provided by the Pacific Northwest National Laboratory

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

EPA Proposes Stronger Air Quality Standards for Sulfur Dioxide (New standard to protect millions of the nation’s most vulnerable citizens)

WASHINGTON – For the first time in nearly 40 years, EPA is proposing to strengthen the nation’s sulfur dioxide (SO2) air quality standard to protect public health. Power plants and other industrial facilities emit SO2 directly into the air. Exposure to SO2 can aggravate asthma, cause respiratory difficulties, and result in emergency room visits and hospitalization. People with asthma, children, and the elderly are especially vulnerable to SO2’s effects.

“Short-term exposures to peak SO2 levels can have significant health effects – especially for children and the elderly – and leave our families and taxpayers saddled with high health care costs,” said EPA Administrator Lisa P. Jackson. “We’re strengthening clean air standards, stepping up monitoring and reporting in communities most in need, and providing the American people with protections they rightly deserve.”

EPA is taking comment on a proposal to establish a new national one-hour SO2 standard, between 50 and 100 parts per billion (ppb). This standard is designed to protect against short-term exposures ranging from five minutes to 24 hours. Because the revised standards would be more protective, EPA is proposing to revoke the current 24-hour and annual SO2 health standards.

EPA also is proposing changes to monitoring and reporting requirements for SO2. Monitors would be placed in areas with high SO2 emission levels as well as in urban areas. The proposal also would change the Air Quality Index to reflect the revised SO2 standards. This change would improve states’ ability to alert the public when short-term SO2 levels may affect their health.

The proposal addresses only the SO2 primary standards, which are designed to protect public health. EPA will address the secondary standard – designed to protect the public welfare, including the environment – as part of a separate proposal in 2011.

EPA first set National Ambient Air Quality Standards for SO2 in 1971, establishing both a primary standard to protect health and a secondary standard to protect the public welfare. Annual average SO2 concentrations have decreased by more than 71 percent since 1980.

US Environmental Protection Agency  — Release date: 11-07-2009

Contact: Cathy Milbourn

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

E-Transportation Jump-Start: Coalition Seeks to Pave the Way for Electric Vehicles

By Larry Greenemeier

Although the widespread adoption of electric vehicles and their related infrastructure has always suffered from chicken-and-egg syndrome, Nissan and FedEx, along with several utilities and technology companies, have formed a coalition to break the stalemate. At a press conference Monday in Washington, D.C., the Electrification Coalition announced its formation as well as a new 130-page report on the dangers of oil dependence, the benefits of electric vehicles, and ways to overcome roadblocks that have kept these vehicles from being deployed en masse.

Sixty percent of the petroleum used by the U.S. daily comes from foreign sources, FedEx CEO Fred Smith said at the launch event, adding that 90 percent of all U.S. transportation is petroleum-powered. Smith made clear his position that reliance on foreign oil is “in no small way related” to the wars in Iraq and Afghanistan. This energy mentality has to change because the U.S.’s dependence on foreign oil has created what amounts to a security risk for the country as a whole, said Sen. Byron Dorgan (D–N.D.), who also spoke at Monday’s event.

The coalition’s position is that a move to electric vehicles would help the U.S. combat the economic, environmental and national security vulnerabilities caused by the country’s petroleum dependence. The coalition’s “Electrification Roadmap” report predicts that if by 2040, 75 percent of light-duty vehicle miles traveled in the U.S. are covered by electric vehicles, oil consumption in that fleet would be reduced by more than 75 percent, and “U.S. crude oil imports could effectively be reduced to zero.”

With the number of vehicles on the planet expected to grow from 600 million today to 2.5 billion by 2050, this group of companies sees electric vehicles as the best alternative, given concerns of foreign oil dependency, oil prices and climate change.

The Electrification Coalition is made up of carmaker Nissan Co. and various utilities and tech companies. Read more at the link below.

Scientific American

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Don’t Bet On a Hydrogen Car Anytime Soon

By Curt Suplee

Just in time for Thanksgiving, a familiar techno-turkey is back on the national policy table: the hydrogen-powered car. The Obama administration had flatlined funding for President George W. Bush’s pet initiative, briefly but heavily touted a few years back as the driving force toward a future “hydrogen economy” in which gas would displace gasoline.

Two wars and a financial sinkhole later, most Americans had managed to forget the whole thing. But then last month the Senate improbably restored $187 million for H-car research programs to an appropriations bill.

Okay, that’s barely enough to cover one year’s bonuses on the lower floors at AIG. But why is it there at all? The answer lies in the persistent, hypnotic allure of hydrogen eco-mythology, with its promise of breaking our addiction to fossil fuels and foreign oil while banishing greenhouse pollution from our skies — a vision most pointedly embodied in the hydrogen car. Or, more accurately, the notion of the hydrogen car.

Electrical current from a fuel cell, a device that combines hydrogen and oxygen to produce electricity, powers the prototypical H-car. The principle involved is a schoolroom classic: If you stick two electrodes into a beaker of water, the electrical energy breaks H2O apart into its ingredients, H and O, in a process called electrolysis. A fuel cell does the same thing in reverse, putting separate H’s and O’s back together into water molecules and thereby producing electrical energy, which can be used to run a motor. Click the Link — Read On.
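The schoolroom chemistry described above also fixes the voltage a single fuel cell can ideally deliver. As a back-of-envelope sketch (the thermodynamic constants below are standard textbook values, not figures from the article): forming one mole of liquid water from hydrogen and oxygen releases about 237 kJ, and dividing that by the charge carried per molecule gives the ideal cell voltage.

```python
# Back-of-envelope: ideal voltage of a hydrogen fuel cell.
# Assumed standard values: Gibbs free energy released when forming
# liquid water (~237.1 kJ/mol) and the Faraday constant (96485 C/mol).
DELTA_G = 237_100   # J per mol of H2O formed (energy released)
FARADAY = 96_485    # C per mol of electrons
N_ELECTRONS = 2     # electrons transferred per H2O molecule

# E = delta_G / (n * F): ideal cell voltage under standard conditions
cell_voltage = DELTA_G / (N_ELECTRONS * FARADAY)
print(f"Ideal fuel-cell voltage: {cell_voltage:.2f} V")  # about 1.23 V
```

That familiar ~1.2 V figure is why practical fuel-cell stacks wire many cells in series to drive a motor.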

The Washington Post, Tuesday, November 17, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Engineer Designs Micro-Endoscope to Seek Out Early Signs of Cancer

Traditional endoscopes provide a peek inside patients’ bodies. Now, a University of Florida engineering researcher is designing ones capable of a full inspection.

Physicians currently insert camera-equipped endoscopes into patients to hunt visible abnormalities, such as tumors, in the gastrointestinal tract and internal organs. Huikai Xie, an associate professor of electrical and computer engineering, is working on replacing the cameras with scanners that “see” beneath the surface of tissues — revealing abnormal groups of cells or growth patterns before cancerous growths are big enough to be visible.

“Right now, endoscopes just take pictures of the surface tissue. So, if you see some injury, or abnormality, on the surface, that’s good,” Xie said. “But most of the time, particularly with cancer, the early stages of disease are not so obvious. The technology we are developing is basically to see under the surface, under the epithelial layer.”

Experiments with the professor’s scanning “micro-endoscopes” on animal tissue have been promising, although his devices have yet to be tested in people. The pencil-sized or smaller-sized endoscopes could one day allow physicians to detect tumors at earlier stages and remove tumors more precisely, increasing patients’ chances of survival and improving patients’ quality of life.

Xie and his graduate students have authored at least 40 papers on various aspects of the research, which is supported with more than $1 million in grants, primarily from the National Science Foundation. In September, he delivered an invited talk, “MEMS-Based 3D Optical Micro-endoscopy,” at the 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society. He also recently launched a small company, the Gainesville-based WiOptix Inc., to speed commercialization of his scanning technology.

With current camera-equipped endoscopes, once doctors spot abnormalities, they typically perform a biopsy, and then send the suspicious tissue to a laboratory. But biopsy is risky and may cause bleeding and even trauma. Also, it usually takes a couple of days to receive the analysis of the biopsy sample from the laboratory. If it is cancerous, surgeons may attempt to remove the abnormality and surrounding tissue, using either endoscopes equipped for surgery or traditional surgical methods.

Xie’s endoscopes replace the cameras with infrared scanners smaller than pencil erasers. The heart of his scanner is a micro-electromechanical system, or MEMS, device: A tiny motorized MEMS mirror that pivots back and forth to reflect a highly focused infrared beam.

Computers process the return signal from the endoscopes, transforming it into a three-dimensional image of the surface tissue and the tissue beneath. One scanner even produces a 360-degree image of all the tissue surrounding the endoscope. Doctors or other trained observers can then search the image for abnormalities or suspicious growth patterns. There’s more; click on!

Also check out

AScribe Newswire, Nov. 19 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Experts Say — The Smart Grid Poses Privacy Risks

By Brian Krebs

Technologists already are worried about the security implications of linking nearly all elements of the U.S. power grid to the public Internet. Now, privacy experts are warning that the so-called “smart grid” efforts could usher in a new class of concerns, as utilities begin collecting more granular data about consumers’ daily power consumption.

<Doc Sez: so what’s new? First the telegraph, then the cell phone; now the Internet, and soon the grid. Establish controls, enforce them HARSHLY, and live with the reality that what can be done will be done.>

“The modernization of the grid will increase the level of personal information detail available as well as the instances of collection, use and disclosure of personal information,” warns a report (PDF) jointly released Tuesday by the Ontario Information and Privacy Commissioner and the Future of Privacy Forum (FPF), a think tank made up of chief privacy officers, advocates and academics. The report mentioned above can also be downloaded as a PDF.

Smart grid technology — including new “smart meters” being attached to businesses and homes — is designed in part to provide consumers with real-time feedback on power consumption patterns and levels. But as these systems begin to come online, it remains unclear how utilities and partner companies will mine, share and use that new wealth of information, experts warn. Read more about the issue at the link below.

Doc Further Notes: Even when you anticipate the law of unintended consequences, it is nevertheless difficult to deal with them. Indeed, in our blogger-driven, conspiracy-theory-rich Internet, even relatively solvable problems become major political and headline issues. After all, of all the truths we hold self-evident, scientific literacy and the ability to deal with risk-related issues are rarely in evidence.

The Washington Post Security Fix Blog, November 18, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

The Six Greatest Threats to U.S. Cyber Security

By Michael Cooney

It’s not a very good day when a security report concludes that disruptive cyber activities are expected to become the norm in future political and military conflicts. But such was the case today, as the Government Accountability Office took yet another critical look at US federal security systems and found most of them lacking.

From the GAO: “The growing connectivity between information systems, the Internet, and other infrastructures creates opportunities for attackers to disrupt telecommunications, electrical power, and other critical services. As government, private sector, and personal activities continue to move to networked operations, as digital systems add ever more capabilities, as wireless systems become more ubiquitous, and as the design, manufacture, and service of information technology have moved overseas, the threat will continue to grow.”

Within today’s report, the GAO broadly outlines the groups and types of individuals considered to be what it called key sources of cyber threats to our nation’s information systems and cyber infrastructures.

Click on and read about our nation’s vulnerability.

Network World Inc., November 17, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Battery Research Aims To Store Renewable Energy

By Devin Powell and Philip F. Schewe, ISNS

A battery storage facility on Long Island helps to provide power for an MTA bus depot. (Image credit: New York Power Authority)

Battery manufacturer EaglePicher, working in partnership with the Pacific Northwest National Laboratory, received $7.2 million to modify and lower the cost of sodium-sulfur batteries.

The biggest chemical battery in the United States is located near Interstate 90 in the small town of Luverne, Minn. The 80-ton device — the size of two tractor-trailers stacked on top of each other — stores as much energy as about 3 million rechargeable AA batteries and can power about 3,000 houses for more than an hour when discharging at its maximum rate.
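Those figures can be roughly cross-checked. A sketch only, using assumed typical values that do not appear in the article (about 2.4 Wh per rechargeable AA, and about 2 kW of evening demand per house):

```python
# Sanity-checking the Luverne battery's stated numbers.
# Assumptions (not from the article): a rechargeable AA holds roughly
# 2.4 Wh (2000 mAh at 1.2 V); an average house draws ~2 kW at peak.
AA_WH = 1.2 * 2.0            # Wh per rechargeable AA (assumption)
N_AA = 3_000_000             # "about 3 million rechargeable AA batteries"
HOUSE_KW = 2.0               # evening draw per house (assumption)
N_HOUSES = 3_000             # "about 3,000 houses"

stored_mwh = N_AA * AA_WH / 1e6          # total storage in MWh
load_mw = N_HOUSES * HOUSE_KW / 1e3      # aggregate load in MW
hours = stored_mwh / load_mw             # discharge duration

print(f"~{stored_mwh:.1f} MWh stored; runs {N_HOUSES} houses "
      f"for about {hours:.1f} hours")
```

Roughly 7 MWh feeding a 6 MW aggregate load lasts a bit over an hour, consistent with the article’s claim.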

The battery is also intended to soak up extra energy at night, when the wind blows strongest and when the power demand from the grid is the lowest. This energy can then be released in the afternoon to lessen the strain on the electrical grid when people return home from work.

Why Size Matters — “Most of the batteries we have in the world were made for small-scale usage,” said George Crabtree, director of the materials science division at Argonne National Laboratory. “You don’t need much energy to start your car, and your car battery is going to recharge again as soon as the car starts.”

But according to a 2008 report by the American Institute of Chemical Engineers, large-scale batteries need to be developed to deal with the increasing amounts of renewable energy on the grid. The AIChE report warned that no proven technologies have been developed to store large quantities of solar and wind energy. “Without [massive energy storage], renewable power can only be piggybacked onto the U.S. grid to supply not more than 15 percent of the power at best,” concluded the 2008 AIChE report.

The chemistry inside these sodium-sulfur batteries is similar to that of the lead acid battery inside of a car. In the car battery, a chemical reaction provides power by sending electrons from one lead plate to another through a liquid called an electrolyte. NGK batteries replace the lead plates with molten sulfur and molten salt and the liquid electrolyte with a solid piece of ceramic that allows electrons to flow between the two hot liquids.

This gives the batteries a much longer lifetime than car battery chemistry would allow. NGK guarantees them for 15 years (4,500 charge and discharge cycles), during which their efficiency at absorbing and discharging energy drops from about 92 to 75 percent.
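As a rough sketch of what that warranty implies, assuming the efficiency decline is linear over the 4,500 cycles (my simplifying assumption; NGK’s actual decay curve is not given):

```python
# Sketch: round-trip efficiency falling from 92% to 75% over the 4,500-cycle
# warranty, under the simplifying (and unverified) assumption that the
# decline is linear in cycle count.
def efficiency_after(cycles, start=0.92, end=0.75, life=4500):
    """Linearly interpolated round-trip efficiency after `cycles` cycles."""
    cycles = min(max(cycles, 0), life)  # clamp to the warranty window
    return start - (start - end) * cycles / life

print(f"New:          {efficiency_after(0):.1%}")
print(f"Mid-life:     {efficiency_after(2250):.1%}")
print(f"End of life:  {efficiency_after(4500):.1%}")
```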

There’s more, check it out: Google “Large Scale Storage Batteries”; there are some fine articles out there. Tune in. , November 19, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

A New Route to Cellulosic Biofuels — ZeaChem’s pilot plant will make ethanol using termite microbes. By Phil McKenna

Biofuel startup ZeaChem has begun building a biofuel pilot plant that will turn cellulosic feedstocks into ethanol via a novel approach that uses microbes found in the guts of termites. The company says the ethanol yields from the sugars of its cellulosic feedstocks are significantly higher than the yields from other biofuel production processes. ZeaChem says its process also has the potential to produce a plastic feedstock.

The company employs a hybrid approach that uses a combination of thermochemical and biological processes. It first uses acid to break the cellulose into sugars. Then, instead of fermenting the sugars into ethanol with yeast, as is typically done, the company feeds the sugars to an acetogenic bacterium found in the guts of termites and other insects. The bacterium converts the sugar into acetic acid, which is then combined with hydrogen to form ethanol.

“It’s a little more complicated than a conventional process. It’s not the obvious, direct route, but there is a high yield potential,” says Jim McMillan of the U.S. Department of Energy’s National Renewable Energy Laboratory in Golden, CO.

In more conventional biofuel processes, much of the carbon content locked up in the sugars is lost to the formation of carbon dioxide when the sugars are fermented into ethanol. Converting the sugars into acetic acid and then ethanol, however, yields no carbon dioxide. As a result, this method has the potential to raise biofuel yields by as much as 50 percent, according to ZeaChem.

In Israel, a different concept with the same goal: Cellulosic Ethanol.

There’s more – there’s no free lunch!

Technology Review, November 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

As Nuclear Reactor Fleet Ages, Engineers Ask, ‘Is 80 the New 40?’

By Paul Voosen of the Greenwire Column

Could nuclear power plants last as long as the Hoover Dam?

Increasingly dependable and emitting few greenhouse gases, the U.S. fleet of nuclear power plants will likely run for another 50 or even 70 years before it is retired — long past the 40-year life span planned decades ago — according to industry executives, regulators and scientists.

With nuclear providing always-on electricity that will become more cost-effective if a price is placed on heat-trapping carbon dioxide emissions, utilities have found it is now viable to replace turbines or lids that have been worn down by radiation exposure or wear. Many engineers are convinced that nearly any plant parts, most of which were not designed to be replaced, can be swapped out.

“We think we can replace almost every component in a nuclear power plant,” said Jan van der Lee, director of the Materials Ageing Institute (MAI), a nuclear research facility inaugurated this week in France and run by the state-owned nuclear giant EDF.

“We don’t want to wait until something breaks,” he said. By identifying components that are wearing down and replacing them, he said, suddenly nuclear plants will find that “technically, there is no age limit.”

Indeed, as U.S. regulators begin considering the extended operations of nuclear plants — the Nuclear Regulatory Commission (NRC) expects the first application for an 80-year license could come within five years or less — perhaps the largest lingering question is one of basic science: How do heavy doses of radiation, over generations, fundamentally alter materials like steel and concrete?

There’s more to read about

New York Times, November 20, 2009

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

The Hidden Costs Of Fossil Fuels – And Biofuels, Too

By Moises Velasquez-Manoff,

The ‘hidden’ costs of burning fossil fuels and biofuels aren’t factored into their market prices, but someone has to pay them. Fumes, as illustrated,  emerge from a coal-fired power plant in Germany. The hidden costs of coal plants include the effects of mercury on wildlife and people, the climate-warming effects of carbon emissions, as well as pollutants such as sulfur dioxide, nitrogen oxides, and particulate matter.

A new report by the National Research Council seeks to put a dollar amount on the “hidden” costs of energy produced by burning fossil fuels. These costs aren’t factored into the market prices of coal, oil, and gasoline, or the prices of electricity generated by fossil fuels, the report says. But someone eventually pays for them.

The report found that, in 2005, the hidden costs of energy production with fossil fuels in the United States amounted to $120 billion. This includes the negative impact of air pollution on health, but doesn’t include the effects of mercury emitted by coal-fired plants on wildlife and people, harm done to ecosystems by air pollution, or the climate-warming effects of carbon emissions.

Coal-fired plants produce about half the nation’s electricity. The report found that pollutants like sulfur dioxide, nitrogen oxides, and particulate matter cost the US $62 billion. That works out to about 3.2 cents’ worth of “non-climate” damages for every kilowatt-hour (kWh) generated.

Natural gas had fewer hidden costs than coal. Four hundred ninety-eight natural-gas-powered electric plants caused about $740 million in damages. That’s about 0.16 cents per kWh, or 1/20th of the damage produced by coal.
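A quick consistency check on those per-kWh figures (my arithmetic, not the report’s): dividing the $62 billion in coal damages by 3.2 cents per kWh implies roughly 1.9 trillion kWh of coal generation, which squares with coal supplying about half of US electricity, and the coal-to-gas damage ratio works out to the stated factor of 20:

```python
# Consistency check on the National Research Council figures quoted above.
# (My arithmetic, not the report's.)
coal_damages_usd = 62e9        # total non-climate coal damages, 2005
coal_cents_per_kwh = 3.2       # stated damages per kWh for coal
gas_cents_per_kwh = 0.16       # stated damages per kWh for natural gas

# Damages divided by damages-per-kWh recovers the implied generation.
implied_coal_kwh = coal_damages_usd / (coal_cents_per_kwh / 100)
damage_ratio = coal_cents_per_kwh / gas_cents_per_kwh

print(f"Implied coal generation: {implied_coal_kwh / 1e12:.2f} trillion kWh")
print(f"Coal vs. gas damages per kWh: {damage_ratio:.0f}x")
```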

Vehicles, meanwhile, which account for 30 percent of US energy use, produced $56 billion in damages. That works out to between 1.2 and 1.7 cents’ worth of hidden costs per mile traveled.

Climate considerations aside, damages wrought by ethanol made from corn were usually similar to, or even slightly worse than, damages from gasoline. That’s because of the extra energy needed to convert corn to biofuel.

The Christian Science Monitor

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

Native Grasses An Explosive Idea For Cleaning Contaminated Soil

By Judy Lowe

You hear a lot these days about the benefits of native plants, but here’s a new one: Certain native grasses can convert the toxic leftovers from atrazine – the second most common herbicide in the US and a stubborn pollutant in the nation’s waterways – into harmless carbon dioxide, reports the Kansas City Star.

But there’s more.

Three researchers – Robert Lerch of the Agricultural Research Service of the USDA, Chung-Ho Lin of the University of Missouri at Columbia, and John Yang of Lincoln University – thought that if native grasses worked for atrazine, why wouldn’t they clean up soils contaminated with TNT and another explosive, RDX, which are chemically similar?

It turns out that two common native grasses – switch grass and Eastern gamma grass – do. This is a big deal, because “The U.S. Army has identified more than 538 sites contaminated by explosives, including 20 EPA-designated Superfund sites,” says Dr. Yang.

The grasses work by nourishing microorganisms in the soil that break down the explosives into harmless components. The advantage of using grasses is that they’re natural and cost-effective, says Yang in an interview with the Columbia Daily Tribune. Research shows using native grasses to clean up a site costs only $200 to $10,000 for 2-1/2 acres — a fraction of the cost of traditional remediation methods, reports the Tribune.

Compare those small amounts to the estimated cost of $100,000 to $1 million per acre that, the Star reports, it typically costs to haul away the soil in a field contaminated by TNT or RDX and incinerate it.
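Putting those two sets of figures on a common per-acre basis (my arithmetic, based on the numbers quoted from the Tribune and the Star) shows how lopsided the comparison is:

```python
# Per-acre comparison of the two cleanup cost ranges quoted above.
# (My arithmetic, not the articles'.)
grass_low, grass_high = 200, 10_000        # dollars per 2.5 acres (grasses)
haul_low, haul_high = 100_000, 1_000_000   # dollars per acre (haul & incinerate)
acres = 2.5

grass_per_acre_low = grass_low / acres
grass_per_acre_high = grass_high / acres

print(f"Grasses: ${grass_per_acre_low:,.0f}-${grass_per_acre_high:,.0f} per acre")
print(f"Haul-and-incinerate: ${haul_low:,}-${haul_high:,} per acre")

# Even pitting the priciest grass estimate against the cheapest haul-away
# estimate, the grasses come out at least 25x cheaper.
print(f"Minimum savings factor: {haul_low / grass_per_acre_high:.0f}x")
```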

And the researchers think there may be many more potential uses for grasses in cleaning up contaminated areas.  “We really haven’t looked at that,” Dr. Lerch says. “I think it’s fair to say there is a lot more potential.”

The Christian Science Monitor

– – – – – – – – – – – – – – – – – – – – – – – – – — – – – – – – – –

10 Reasons Not to Revive the Nuclear Power Industry

By Elisabeth King

(“Nuclear power redux … why?”) I would add these 10 reasons why nuclear power is the wrong answer to our nation’s energy needs.

Doc Sez, I wonder how much time it took Ms. King to search the copyright-protected antinuclear sites to come up with this well-worn list. What, no references? Isn’t that plagiarism, or am I just undereducated?

But in fairness, here’s Ms. King’s list, followed by my reality check based on US government and international agency documentation. There’s not enough space to provide all the references that counter Ms. King’s agenda, but you can check them out on Google.

1.)        Human Error.

That’s why the nuclear industry has the toughest safety training and zero tolerance for error. After Three Mile Island, safety and training became the paradigm of the industry. Can Ms. King find an industry that is safer – oil recovery, coal and other mineral mining, chemical manufacturing…?

2.)        Carbon Footprint.

I absolutely agree with Ms. King – all industries, including renewables, need to be judged on cradle-to-grave costs, including their carbon footprints. I guess the steel (wind power), concrete (hydropower), chemical pollutants (photovoltaic solar power), and potential groundwater contamination (hydraulic fracturing for natural gas) are all footprint-free.

3.)        Pollution of the Soil {and Groundwater}.

So what’s unique about uranium? How about lead, cadmium, arsenic, chromium, silver, gold, and zinc mining – are they pristine? The issue is not contamination but creating and rigorously enforcing regulations, and assuring control measures are suitable for the risks associated with mining, or oil drilling for that matter.

4.)        Waste disposal.

Are contained nuclear wastes (HLW) in airplane-crash-proof storage less risky than mining wastes? Are exposed radioactive coal slag piles less dangerous than EPA-mandated low-level waste burial? You surely could have fooled me, and the majority of health-risk-exposure experts. If a cause is needed, start working on lobbyist-funded politicians, undereducated and technically illiterate bureaucrats, and a public that thinks, for the most part, that science is a dirty word.

5.)        Leakage.

All talk, just inflammatory smoke and mirrors. Ms. King, prove the validity and generality of your claim, documented by multiple certified scientific analyses that have been published, independently replicated, and peer reviewed. Provide references from any world regulatory agency that agrees with your hypothesis.

6.) Effects on health.

Without even casting rocks at the oft-discredited linear no-threshold dose hypothesis, there is ample evidence that low levels of radiation behave like most other toxic substances. What, you ask, does that mean? At sufficiently low doses (exposures), as amply reported in the peer-reviewed open literature, the linear no-threshold model does not hold for radiation, or for mercury, arsenic, or chromium. The human body, with its well-developed but not perfect immune system, has protected mankind and the other of Gaia’s creatures since they evolved and thrived.

7.)        Unreliability.

Since when? All you need to do to call your claim false is check both the DOE’s and the IAEA’s records, since of course you believe traceable industry records are not reliable. Shutdowns are preventive – if they were bad, how could nuclear power be demonstrated to have online factors approaching 90-91 percent?

8.)        Expense.

Capital costs up front are caused by a combination of regulatory overkill and risk-averse bankers and venture capitalists, at least in the US. Ms. King, have you ever wondered why the same nuclear power plant takes two to three times as long to build in the US as in the rest of the nuclear-power-seeking world? Are our plants any safer? After all, most of the approved designs, whether by the NRC or the IAEA, are carbon-copy clones of one another, at least for this generation of reactors.

9.)        Eventual {uranium} shortages.

Hmm — why are there no breeder reactors, no proliferation-proof recycling, no thorium fuel cycle (India), no ocean extraction of uranium, and no new deposit discoveries in your future?

10.)      Fiscal responsibility.

Humbug! You should become a politician. “Nuclear power is by far our most expensive option: the moral equivalent of buying a mink coat with a credit card while the refrigerator stands empty and the children have holes in their shoes.” Alas, even the world’s economists will challenge that premise – true, we Americans, led by your wisdom, may make it so. Apples-and-oranges comparisons outside of full lifecycle analysis are the equivalent of the medieval belief that demoniacal possession, vapors, and humors cause disease.

By now you, Ms. King, must be really unhappy with me… However, I can back my statements up with hundreds of independently peer-reviewed studies – can you? Or is this another case of the Dylanesque ‘Blowin’ in the Wind’?

The Times Record, March 5, 2010

= = = = = = = = = = = = = = = = = = = = =  = = = = = = = = =

Copyright Notice: Product and company names and logos in this review may be registered trademarks of their respective companies.

Some of the articles listed in this column are copyright protected – their use is acknowledged and is limited to education-related purposes, which this column provides.

Sources & Credits: — Most of these items were found in the newsletter NewsBridge, which lists ‘articles of interest’ to the library’s technical and regulatory agency users. It is electronically published by the Pacific Northwest National Laboratory in Richland, WA. I then follow the provided link to the source of the information and edit (abstract) the content for our readers. Should I find an associated or contradictory reference, I also share that with you.


Charles Dickens in a Tale of Two Cities

“It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to heaven, we were all going direct the other way – in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only.”

Charles Dickens, An English novelist (1812 – 1870)

In Closing

I’ll be posting articles for your comfort and anger in the next few months. I never respond to flaming, but will take time to provide evidence in the form of both primary technical and secondary {magazine article} references for those who ask. However, most of you can reach out and Google such information for yourselves. To prove me wrong, all you need to do is send me scientifically peer-reviewed evidence (references).

May your world get greener and all creatures on Earth become healthier
and be more able to fulfill their function in this Gaia’s world.

Harry, aka doc_Babad


Previous Greening Columns

By Harry {doc} Babad, © Copyright 2010, All Rights Reserved.


As noted in recent macCompanion articles and columns, I am always a late adopter of a new operating system. Why?

  • Let someone else bug fix.
  • Updating to current, conflict-free software tools always takes a bit of a wait – shareware developers have other priorities, so it can take time before a compliant version is released. Of course, there is the software that has been abandoned while the developer moves on to, I hope, more rewarding things. In the latter case, I keep my toes crossed; I’ll perhaps need the mouse and keyboard to try to repair any damage a non-compliant piece of software might (rarely) cause.
  • Searching for replacements for developer-abandoned utilities I use daily to ease my computing, which also takes time.

This time around, Snow Leopard did less damage to my favorites than I’d feared, but I did need to wait on upgrading to give developers a chance to catch up with Apple.

Indeed, HP still hasn’t gotten around to releasing a driver for my ScanJet 8250, so I licensed a powerful, flexible and full-featured driver from SilverFast. My reward: a crash-free program that has an easier-to-use interface and comes with a full set of graphics and photography tools to let one customize their scanner output. Each version of the SilverFast product comes optimized for a specific scanner, but differs from the HP approach by both being kept current by the developer and having responsive technical support, without the much-ado fuss that too often surrounds HP.

A Baker’s Dozen – what they do and why you should care

  1. AppCleaner 1.2.2 — AppCleaner allows you to uninstall your apps more easily. It searches out the files created by the applications so you can delete them quickly. [Freeware]
  2. ClipEdit 3.0.3 — ClipEdit is a small application that allows you to create and edit text and picture clippings. ClipEdit also exports any Internet address to an Internet clipping. ClipEdit is very flexible with many customizable settings. And now with the handy toolbar editing a text clipping is even easier yet! Try it; if you use text clippings frequently you’ll wonder how you lived without clipEdit for so long. Every Day Software’s [Free]
  3. DEVONthink 2.0.2 — A PIM for all your stuff. DEVONthink is the solution to the digital age conundrum. It is your second brain, the one and only database for all your digital files, be they PDFs, emails, Word docs or even multimedia files. Boasting a refined artificial intelligence, DEVONthink is exceedingly flexible and adapts to your personal needs. And if the files are not digital yet, digitize them with DEVONthink Pro Office [Demo]
  4. Easy Find is a Great Document Title and Contents Item Search Tool — Welcomed home from its “Find File” classic OS abandonment. EasyFind is an alternative to, or supplement of, Spotlight and finds files, folders or contents in any file without the need for indexing. This is especially useful if you are tired of slow or impossible indexing, outdated or corrupted indices, or if you are just looking for features missing in the Finder or Spotlight. This is not available on the MacUpdate site. [Free]
  5. Folder Brander Creates Text and Text-Symbol-Containing Folder Icons — Mac OS X lets you add folders to the Finder’s sidebar, or the right side of the Dock. This excellent feature provides quick access to frequently used folders. This is great except … all folders look the same. Yellow Mug Software’s FolderBrander lets you easily change the look of your folders – pick a different color and add a short text label. [Shareware]
  6. HoudahSpot, a Spotlight Search Enhancement “KISS” Tool — This application ended most of my frustration when trying to find items using spotlight. It is the front-end most Spotlight users don’t know they’re missing. With HoudahSpot your files are at your fingertips. Keep frequently used files within reach. Retrieve the files you didn’t know you still had. Create detailed queries to pinpoint the exact file you want to get to. Save queries for direct access to your favorite files. Set up templates for frequently performed searches. Use HoudahSpot for some housekeeping (Hi, Mom!). You could ask HoudahSpot to find all music files not yet in your Music folder. Just drag the results to your Music folder  [Shareware]
  7. IconCompo Can Help Create Graphics Containing Folder Icons — Trollin’s iconCompo can compose one icon out of two images mainly for making custom folder icons. Drag and drop one or two images onto the main window of the program and then adjust the images in various ways – position, size, directions, transparencies and colors, text. Composite images can be attached to existing files/folders, used to create new folders, or saved in several image formats such as icns, png and tiff. With a Finder contextual menu item iCCMPlugin, you can attach custom icons with your saved settings or add thumbnails. [Shareware]
  8. iSeek is a Menu-Based Multi-Browser Search Tool — Ambrosia Software’s iSeek is a handy little program for Mac OS X that allows you to instantly search for anything you seek, no matter what program you are running. iSeek puts a familiar search field in your menu bar, needing just a click or keystroke to start your search! iSeek’s slick and seamless interface hides a significant amount of power and convenience. iSeek is no mere front-end to Google; it ships with pre-configured search shortcuts for dozens of useful Internet resources. Look up a word definition in the dictionary, or a synonym in the thesaurus, or even famous quotes that reference the word. Search for information on Google, in popular news sites such as the BBC News, or even search for software on
  9. Memoblock is a Notebook and/or To-Do List — It is, for my needs, the best product I’ve yet tested. Memoblock is a useful notepad utility for OS X. Store as many styled text notes as you require, transfer notes to certain iPods, save as vNotes for mobile phones and more. Alarm reminders can be set for individual notes, and notes can be categorized as you wish. Best of all, Memoblock is donationware!
  10. shadowClipboard 3.0.5, a Cocoa Clipboard Enhancer — This application, when used in advanced mode, though a bit unstable in Snow Leopard, is the best multiple-clippings storage utility I’ve found. Believe me, I’ve tested many. stupidFish Programming’s shadowClipboard is a powerful clipboard manager for Mac OS X. It remembers a user-definable number of items copied to the clipboard. A choice of two interfaces (a simple and an advanced) lets you choose which item to paste into applications. shadowClipboard offers Clipboard Sharing for sharing your copied items with other users on the local network. [Shareware]
  11. Unsanity’s FruitMenu 3.7.3 — It is now only compatible with Tiger and Leopard. The version for Snow Leopard will be released soon, I hope. FruitMenu is a haxie that gives you the ability to customize the Apple Menu and contextual menus. Using a visual editor you can edit the contents of the menus to suit your needs and taste. FruitMenu will also display the contents of the FruitMenu Items folder inside of your Library folder, and launch applications and shell scripts from the Apple Menu and contextual menus, to allow easy file navigation and launching. To make the haxie completely flexible and customizable, you can assign hotkeys to particular menu items. Last but not least, it is priced at only US $15!
  12. Unsanity’s FontCard Haxie 1.5.1b4 — It is now only compatible with Tiger and Leopard. FontCard is a haxie that modifies the Font menu in Carbon and Cocoa applications. It can add an icon that displays the format of a font next to the font menu item, display the font name in the font face, group fonts into submenus, and add font collections to the font menu. [Shareware]
  13. XMenu, a DEVONtechnologies Haxie — XMenu adds one or more global menus to the right side of the menu bar. They give you access to your preferred applications, folders, documents, files, and snippets. Launch any application or insert text snippets or URLs into your email messages or Pages documents with a single menu choice. [Free]

That’s all for now folks.

You can download most of these items from the site.


The original of this article was posted to macCompanion in June of 2008. Since that eZine is no longer fully active, I am reprinting it, in a slightly revised form in our Blog.

Acknowledgements: Unless otherwise noted, I have provided the source of the material in these articles. I have also found, in the many notes I’ve stashed for future articles, that certain themes keep coming up that parallel what I’ve read or practiced. In most cases I have acknowledged, as well as modified, the original document(s) to personalize them for our readers.

As needed the information provided was created, and as appropriate demonstrated on my iMac 2.8 GHz Intel Core 2 Duo with 2 GB installed 667 MHz DDR2 SDRAM running Mac OS X version 10.6.4 with all current security updates installed.

By Harry {doc} Babad, Copyright 2010, All Rights Reserved


As a few of our readers know, I spend a great deal of time looking for information. The items I look for range from:

  1. Specific information related to the nuclear waste disposal work on which I consult, and general technical information on energy, the environment, and other issues I want to write about.
  2. References for a new book, and for the 2nd edition revision of my co-authored book on The Use of Nuclear Materials.
  3. Grist for my Hobbies of collecting recipes and recipe eBooks and Folk Music
  4. Anything else my wife wants me, as her secretary, to chase.

Several months ago I wrote an article, To Site Search or to Google, for the macC March 2010 issue. I concluded, perhaps with a predetermined answer, that the question was a little bit like the number of angels on the head of a pin. “To Site Search or to Google, That is the question.” The answer is both! It depends on which seems faster, an intuitive guess, or whether one method gives you so much chaff that you go mega-hits crazy and it’s time to switch.

I like Google better than the other search engines I have worked with. Perhaps it’s familiarity, but I don’t really believe that. In addition, if a site has opted to use the Google engine to power its search function, it is easy to use, tolerant of syntax errors, and even forgives my misspellings.

Otherwise, searching individual websites for information can be either easy or maddening, depending on how carefully and flexibly a site was indexed and the accuracy with which the internal search engine was configured. However, if I’m hunting for an electronic copy of a magazine article, googling it is often very simple and quick.

Alas, too many website search functions suck! Even one of my favorite sources of stuff has flaws in its search engine, such that by changing search criteria you will often find items you seemed to have missed retrieving in your first attempt to locate a product. Unfortunately, there are so many technical or other commercial sites whose engines are designed to thwart your finding what you need… they get friendly feedback email from me, mostly to relieve my search stress.

Multi-Browser Search Engines and Tools

DEVONagent, now at v2.3.1, is the solution if you’re tired of clicking hundreds of links only to find most of them outdated, broken, or leading to junk pages. DEVONagent communicates with search engines and then digs through the results for you, giving you only the documents worth reading. Better yet, DEVONagent summarizes the accumulated knowledge and presents you with a list of the most important topics and an interactive mind map. Finding information on the Web has never been easier.

Your Research Assistant: DEVONagent finds, collects and organizes information with a powerful search architecture. DEVONagent also provides a simple-to-use built-in archive and integrates perfectly with DEVONthink. With over 130 plugins for popular search engines, databases and search tools, including predefined search sets, and a clean Mac-like interface, DEVONagent is the number one tool for finding information on the web. I often use the current version for my searches. (DEVONagent 1.7, which I reviewed in August 2005 for macC, made a believer out of me.) Confession time — when I updated to Snow Leopard, I neglected to add DEVONagent to my dock; as a result, the last few months’ worth of searching has been harder than it might have been.

MultiBrowser 1.0.0 — A new application <MacUpdate *****> I found when browsing might be another tool of worth for casting a wider net than Google alone. According to its developer (I’ve not yet tested it), MultiBrowser is a freeware program that allows you to take control of the many browsers that you may have installed on your Mac. Whenever you click on a link in just about any application (e.g. Mail, Preview, etc.), MultiBrowser will appear and allow you to choose which browser to open that link in. Its browser selection window is highly configurable, allowing you to change its colors, size, and more so that you can decide exactly what it should look like. MultiBrowser uses built-in Mac OS X services so that it does not even have to be running until you actually click a link – this means that it will not waste any of your Mac’s resources. MultiBrowser also has additional options for changing how browsers are launched, handling multiple monitors, and more. It has optional usage tracking (which will not be shared with anyone except yourself) to let you know how many times you use each browser.

iSeek – An old favorite that is nearing the end of its support life, iSeek is another variant for focusing searches. Although last updated in December 2007, it is Snow Leopard compatible – a program I use all the time. iSeek is a handy little program for Mac OS X that allows you to instantly search for anything you may seek, no matter what program you are running. iSeek puts a familiar search field in your menu bar, needing just a click or keystroke to start your search!

iSeek’s seamless interface hides a significant amount of power and convenience. iSeek is no mere front-end to Google; it ships with pre-configured search shortcuts for dozens of useful Internet resources. Look up a word definition in the dictionary, or a synonym in the thesaurus, or even famous quotes that reference the word. Search for information on Google, in popular news sites such as the BBC News, or search for software on

Any Link on the Internet can be added to iSeek as a search shortcut that’s available in a snap, and your recent searches are saved, too, for even quicker access. The real power of iSeek is that it is also highly configurable: you can add whatever search sites you find useful, and easily share them with your friends.

I have even added other locations, via its preferences pane, to the list of easily accessible sites, allowing me one-click access to my most-used web links directly from my menu bar. This gives me faster access to Time Magazine, The Economist and Bloomberg Business Week, on which I spend much time.

Meta Search Engines — An option I've not yet tried is using a meta search engine. I did search for 'Small Nuclear Reactors' and got too many hits for comfort. I tried narrowing down the search by looking for 'pocket' and 'micro' reactors, but that didn't help. All searches contained too many hits associated with non-nuclear energy articles. 'Small Reactors for Nuclear Energy' gave no hits there but worked well in Google, even without using a Boolean approach.

At that point I decided to live with the tools I have and know, and to explore new tools only if I thought they might add value to my search efforts. Life and my time are too short… and for my paid consulting work, the latter is doubly true.

Advanced Google Searches

To paraphrase a phrase incorrectly attributed to Horace Greeley: go Boolean, young man (person). Much of what I share has been abstracted and paraphrased from the referenced and other sites. Check them out for more details as well as guidance.

Boolean searching is built on a method of symbolic logic developed by George Boole, a 19th-century English mathematician. Most online databases and search engines support Boolean searches. Boolean search techniques can be used to carry out effective searches, cutting out many unrelated hits among the thousands of documents a search engine provides. (See the Appendix for added Boolean operator information.)

Using Boolean Logic to broaden and/or narrow your search is not as complicated as it sounds; in fact, you might already be doing it. Boolean logic is just the term used to describe certain logical operations that are used to combine search terms in many search engine databases and directories on the Net. It’s not rocket science, but it sure sounds fancy (try throwing this phrase out in common conversation!).

Basic Boolean Search Operator AND — Using AND narrows a search by combining terms; it will retrieve documents that use both the search terms you specify, as in this example: Portland AND Oregon

Basic Boolean Search Operator OR — Using OR broadens a search to include results that contain either of the words you type in. OR is a good tool to use when there are several common spellings or synonyms of a word, as in this example: liberal OR democrat

Basic Boolean Search Operator NOT — Using NOT will narrow a search by excluding certain search terms. NOT retrieves documents that contain one, but not the other, of the search terms you enter, as in this example: Oregon NOT travel.
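The three operators above can be sketched in a few lines of code. This is a toy illustration of the logic, not how any real search engine is implemented; the document snippets are made-up examples.

```python
# Toy document "index": made-up snippets, purely for illustration.
documents = {
    "doc1": "portland oregon travel guide",
    "doc2": "portland maine lobster",
    "doc3": "oregon hiking trails",
}

def matches(text, term):
    # A document matches a term if the term appears as a word in its text.
    return term.lower() in text.lower().split()

def search_and(docs, a, b):
    # AND narrows: both terms must appear.
    return [d for d, t in docs.items() if matches(t, a) and matches(t, b)]

def search_or(docs, a, b):
    # OR broadens: either term may appear.
    return [d for d, t in docs.items() if matches(t, a) or matches(t, b)]

def search_not(docs, a, b):
    # NOT narrows: first term present, second term excluded.
    return [d for d, t in docs.items() if matches(t, a) and not matches(t, b)]

print(search_and(documents, "Portland", "Oregon"))  # ['doc1']
print(search_or(documents, "Portland", "Oregon"))   # ['doc1', 'doc2', 'doc3']
print(search_not(documents, "Oregon", "travel"))    # ['doc3']
```

Notice how OR grows the result list while AND and NOT shrink it; that is all "broadening" and "narrowing" mean here.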

Keep in mind that not all search engines and directories support Boolean terms. However, most do, and you can easily find out if the one you want to use supports this technique by consulting the FAQs (Frequently Asked Questions) on the search engine or directory's home page. Then practice a bit while running your normal searches – you'll be surprised at how easy it is to do – and a narrow search steals less of your time and lowers your frustration level.

A few websites, such as MacUpdate and of course Google, have excellent built-in search focusing tools, with which you can narrow the focus of a search by using a combination of typed limiting criteria and checklists of criteria areas. But these are few and far between, so go Boolean.


As noted above, there are several ways to improve your search capability.

  • You can, in the browser of your choice, learn to better define your search term. Don't be shy about changing your search term. Often the hits you get are sufficiently different that multiple searches yield more useful information. Other times, alternate wording eliminates the strange hits. Also, at times it helps to try comparably similar search terms [e.g., micro nuclear reactor, small nuclear reactor, pocket reactor].
  • If you are comfortable with it, a valuable tool for doing so is a Boolean search. Each browser I use has an FAQ that defines how such a search should be formatted… it's a matter of format and punctuation.
  • You can, alternatively, use a search application that allows searches either with multiple browsers or on various previously identified sites.
  • Finally, there are tools like iSeek that allow you to access your search sites faster via a menu bar item.

Whatever you do, don't let the web intimidate you — it's a strange and wondrous place, and each search engine has different ways of indexing information and different search nuances. So illegitimus non carborundum.


  1. To Specifically Site Search or to Google, That Is the Question.
  2. Google and Other Search Engines: Visual QuickStart Guide, or How to Find It When You Need It, by Alfred and Emily Glossbrenner, Peachpit Press, May 2004. [Article no longer appears in the macC archives.]
  3. Google, The Missing Manual, 2nd Edition, by Sarah Milstein, J. D. Biersdorfer, and Matthew MacDonald, O'Reilly Press, January 2006. [Article no longer appears in the macC archives.]
  4. Google Search Basics: Basic Search Help.
  5. Boolean Logic, Wikipedia, May 2010 — a detailed, very mathematical description of the use of Boolean methods, aimed at geeks.
  6. The Spider's Apprentice Blog [2007 – dated but still very useful].
  7. A Helpful Guide to Web Search Engines: How Search Engines Work.
  8. Boolean Web Search – Learn How to Use Boolean Search Operators, by Wendy Boswell.
  9. Web Search 101 – How to Search the Web: Introduction to Web Search, by Wendy Boswell.


Boolean Search Operators

  • The Boolean search operator AND is equal to the “+” symbol.
  • The Boolean search operator NOT is equal to the “-” symbol.
  • In most search engines, typing several words with no operator behaves like AND: the engine automatically returns pages containing all the words you type in. Use the Boolean search operator OR explicitly when you want results containing either term.
  • Putting a search query in quotes, i.e., "sponge bob squarepants", tells the search engine that you want all of these words, in this specific order, as an exact phrase. The Boolean search operator NEAR, supported by some engines, is looser: it matches pages where the terms appear close to each other, in any order.
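The symbol equivalents above can be combined mechanically. Here is a small, hypothetical helper (the function name and parameters are mine, not part of any search engine's API) that builds a query string in that +/-/quotes syntax:

```python
def build_query(required=(), excluded=(), phrase=None):
    """Build a query string using + (AND), - (NOT), and quotes (phrase)."""
    parts = ["+" + w for w in required]      # words that must appear
    parts += ["-" + w for w in excluded]     # words to exclude
    if phrase:
        parts.append('"' + phrase + '"')     # exact phrase, in order
    return " ".join(parts)

print(build_query(required=["oregon"], excluded=["travel"]))
# +oregon -travel
print(build_query(phrase="sponge bob squarepants"))
# "sponge bob squarepants"
```

The same narrowing you would spell out as "Oregon NOT travel" becomes a one-line string you can paste into a search box that supports this syntax.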


Acknowledgements: Unless otherwise noted, I have provided the source of the material in these articles. I also found, in the many notes I've stashed for future articles, that certain themes keep coming up that parallel what I've read or practiced. In most cases I have acknowledged, as well as modified, the original document(s) to personalize them for our readers.

This original article was posted to macCompanion in June of 2008. Since that eZine is no longer fully active, I am reprinting it, in slightly revised form, in our blog.

By Harry {doc} Babad, © Copyright 2010, All Rights Reserved


…Almost in the beginning there was MacPaint and MacDraw. I used the two programs for creating simple graphics; the paint-related documents were created from clip art, which had just begun to appear.

In the beginning…

My photography efforts, even now, mostly focused on capturing images of friends, family, and vacation scenes, first in black and white and eventually in color. I was then using a Kodak camera with 120 or 220 film (6 cm size) but have evolved first to a 35 mm SLR and now to a digital camera. The modes in which I used my cameras were almost always point and shoot – all simple and easy to use. The digital camera gives me several options for framing images, of which I take full advantage – still in point and shoot mode.

My computer and camera never crossed trails, nor did I attempt to do any work on my photos to enhance their appearance. Relative to the availability of photography software, that was okay, since these applications had not yet been developed. It did take a while to see my photography results, though, since I did not have nor want a darkroom. Photo-life evolved to color film, for me in the '50s when prices dropped, and a point and shoot 35 mm format (slides and prints) became affordable.

So this is how my equipment evolved:

  • Borrowed from my Dad as I was growing up.
  • It was all mine — a Kodak Brownie Bullet camera.
  • It was overkill since I used it only in auto-pictures mode – but I got it at a real discount.

Of course, in a variation of Moore's law, it became easier to satisfy a diversification of needs – although technology got simultaneously simpler and inherently more complicated. Those were the days, my friends, and I was sure I was going to join the desktop publishing and graphics-related revolution.

The Photography Revolution — We morphed, or attempted to do so, into more creative, but not yet professional, photographers. We became, slowly but steadily, more competent at 'photo shopping' <enhancing> our images. We were mostly unpaid but effectively talented amateurs taking advantage of the appearance and growing capability of graphics software, along with desktop publishing ware and zowie-wow digital cameras. It got the rest of us so much closer to the instant gratification of creating relatively good images even from moderately poor snapshots. Most of those who jumped on the bandwagon were folks like me, casual photographers with shoebox collections of the images of our daily lives.

The Graphic — On a succession of Macintosh computers, with user-friendly and evolving imaging software, I was able to dabble in creating simple art forms, in both vector and pixel formats. They were easy to print, too; all it took was a desktop printer, so there were no days of waiting to see if we blew that shot. Again, instant gratification, at a price I was willing to afford.

Epson Stylus C58 printer · HP Color LaserJet CLJ-3500

The diversity and quantity of graphic art software proliferated; clip art was born again, this time with higher-end images that avoided being caricatures, cartoons, and cuties. All of a sudden, almost spontaneously, we hungered to share our images with friends and even strangers. A new industry was born. Graphics collections on portable media (floppies, CDs and now DVDs), subscription services, and (more recently) service bureaus that let you license use of one image at a time became available.

It Became a Choice of Them or Me — Was being creative in the graphics area worth my time? I have always been verbally adept but graphically impaired. It's not that I can't see, but rather that the image in my head refuses to transfer to another medium such as paper, canvas, or computer documents.

I could buy software and printers, and try to release my inner artist and roll my own graphic creations. Alternatively, I could access clip art and have a broader, and at times higher quality, choice in creating a look for a graphic to enhance a document or book I was writing. I could even modify clip art to better suit my specific needs.

However, within a few years, I was joyful at being able to make greater use of the efforts of folks more graphically talented than I to illustrate my at-home writing efforts. At work, I had the luxury of professional artist-illustrators whom I shamelessly used to make my reports and presentation graphics look great. Alas, that era too soon passed, except for "C"-suite occupants. The RIF-ing of graphics support staff occurred as the desktop publishing revolution deepened and the accountants chose price over quality for illustrations. After all, why not let engineers and scientists become graphics experts and do away with overhead-chargeable service support personnel?

Back to Me — I could create a sign or do a simple sketch on my Macintosh SE/30 and have the results available in minutes. Thus a dichotomy was born, separated, and then fused, as hardware and software technology evolved for the home user and for non-professional graphics creators in business.

Time moved relentlessly on. Graphics software (e.g., the Adobe CS suite) and digital cameras became affordable, even SLR models once restricted to gifted, well-funded amateurs and professionals. Companies like Adobe created simpler non-expert versions of their software, such as Photoshop Elements, or Apple's iPhoto. One very interesting story that partially demonstrates this evolution is told in Jeff Schewe's "10 Years of Photoshop – the Birth of a Killer Application," as well as a fine article in the same vein, "From Darkroom to Desktop – How Photoshop Came to Light," by Derrick Story. As always, there's lots more on these subjects in Wikipedia, but you all knew that.

Check These Out:

Although these articles focus on Photoshop, it doesn’t take much to use them to gain an insight into the growth and popularity of digital photography, and the ‘do it yourself movement’ associated with creating and customizing graphics.

In closing, there is a whole lot posted and written about photography, graphics software, and analog and digital photography that you can read. If I attempted to provide you with a definitive bibliography and history, this article would become "book length," and I would inadvertently leave out something great that contributed to the present state of the art in working with images.

Although I use such tools only when I can't find something made to order to highlight and accent my written documents, most of you are likely more talented and more driven to "roll your own." More power to you! (See: Today's Tools for Graphics-Inhibited Harry.)

Nikon CoolPix digital camera · Photoshop Elements for the Graphically Impaired

I've included, in the next section, descriptions and functions of the main types of graphics tools, raster- and vector-oriented graphics editors. Read this or not, but enjoy looking at the world through a square 'lens' and adding a bit of 'rose' color to your visions and visualizations.

= = = = = = = = = = = = = = = = = = = = = = = = = = = =

Appendices and Post Scripts

Note: Some of the products and features described below have become common to both vector and raster image editing software, so ignore any repetitions of features – they indeed are real.

Copyright Notice: Product and company names and logos in this review may be registered trademarks of their respective companies.

Raster Graphics Editors

From Wikipedia, the free encyclopedia

A raster graphics editor is a computer program that allows users to paint and edit pictures interactively on the computer screen and save them in one of many popular "bitmap" or "raster" formats such as JPEG, PNG, GIF, and TIFF.

Usually an image viewer is preferred over a raster graphics editor for viewing images.

Some editors specialize in the editing of photographs, such as the popular Adobe Photoshop, while others are more geared to artist-created illustrations, like Adobe Fireworks.

Vector editors are often contrasted with raster graphics editors, and their capabilities complement each other. Vector editors are better for graphic design, page layout, typography, logos, sharp-edged artistic illustrations (e.g. cartoons, clip art, complex geometric patterns), technical illustrations, diagramming and flowcharting. Raster editors are more suitable for retouching, photo processing, photo-realistic illustrations, collage, and hand drawn illustrations using a pen tablet. Many contemporary illustrators use Corel Photo-Paint and Photoshop to make all kinds of illustrations. The recent versions of bitmap editors, such as GIMP and Photoshop support vector-like tools (e.g. editable paths), and vector editors such as CorelDRAW or Adobe Illustrator are gradually adopting tools and approaches that were once limited to bitmap editors (e.g. blurring).

Typical Functions

  • Select a region for editing.
  • Draw lines with brushes of different color, size, shape, and pressure.
  • Fill in a region with a single color, gradient of colors, or a texture.
  • Select a color using different color models (e.g. RGB, HSV), or by using a color dropper.
  • Add typed letters in different font styles.
  • Remove scratches, dirt, wrinkles, and imperfections from photo images.
  • Composite editing using layers.
  • Edit and convert between various color models.
  • Apply various filters for effects like sharpening and blurring.
  • Convert between various image formats.
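A couple of the typical functions listed above (filling a region with a single color, converting between color models) can be sketched without any graphics library, by treating an image as a grid of RGB tuples. This is a toy model for illustration, not how a real editor stores pixels:

```python
def new_image(width, height, color=(255, 255, 255)):
    # A "raster image" here is just rows of (R, G, B) tuples.
    return [[color for _ in range(width)] for _ in range(height)]

def fill_region(img, x0, y0, x1, y1, color):
    # Typical function: fill a rectangular selection with a single color.
    for y in range(y0, y1):
        for x in range(x0, x1):
            img[y][x] = color

def to_grayscale(img):
    # Typical function: convert between color models, here RGB -> one
    # luminance value per pixel, using the common BT.601 weights.
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in img]

img = new_image(4, 4)                       # 4x4 white canvas
fill_region(img, 0, 0, 2, 2, (255, 0, 0))  # red square in one corner
gray = to_grayscale(img)
print(gray[0][0], gray[3][3])               # 76 (red) 255 (white)
```

Real editors add layers, filters, and file-format handling on top, but the underlying data is still this kind of pixel grid.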

For a Summary Comparison of Raster Graphics Editors First Check Out —

Vector Graphics Editors

From Wikipedia, the free encyclopedia

A vector graphics editor is a computer program that allows users to compose and edit vector graphics images interactively on a computer and save them in one of many popular vector graphics formats, such as EPS, PDF, WMF, SVG, or VML.
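To see why vector formats like SVG scale without losing sharpness, note that they store shapes as text and numbers rather than pixels. This hypothetical snippet (the helper names are mine, for illustration only) writes a minimal SVG circle and "scales" it by multiplying coordinates:

```python
def svg_circle(cx, cy, r, width=100, height=100):
    # An SVG file is just text describing shapes, not a grid of pixels.
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">'
            f'<circle cx="{cx}" cy="{cy}" r="{r}" fill="red"/></svg>')

def scale(params, factor):
    # Scaling a vector shape is pure arithmetic; no detail is lost.
    cx, cy, r = params
    return (cx * factor, cy * factor, r * factor)

# A circle enlarged 4x is still a mathematically perfect circle.
print(svg_circle(*scale((10, 10, 5), 4), width=400, height=400))
```

Enlarging a bitmap 4x, by contrast, can only interpolate between existing pixels, which is why raster images blur when scaled up.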

Vector Editors Versus Bitmap Editors — Vector editors are often contrasted with bitmap editors, and their capabilities complement each other. Vector editors are often better for page layout, typography, logos, sharp-edged artistic illustrations (e.g. cartoons, clip art, complex geometric patterns), technical illustrations, diagramming and flowcharting.

Bitmap editors are more suitable for retouching, photo processing, photorealistic illustrations, collage, and illustrations drawn by hand with a pen tablet. Many contemporary illustrators use Corel Photo-Paint and Photoshop to make all kinds of illustrations. Recent versions of bitmap editors such as GIMP and Photoshop support vector tools (e.g. editable paths), and vector editors such as CorelDRAW, Adobe Illustrator, Xara Xtreme, Adobe Fireworks, Inkscape or SK1 are adopting raster effects that were once limited to bitmap editors (e.g. blurring).

Specialized Vector Graphics Features

Some vector editors support animation, while others (e.g. Adobe Flash) are specifically geared towards producing animated graphics. Generally, vector graphics are more suitable for animation, though there are raster-based animation tools as well. Vector editors are closely related to desktop publishing software such as Adobe InDesign or Scribus, which also usually include some vector drawing tools (usually less powerful than those in standalone vector editors). Modern vector editors are capable of, and often preferable for, designing unique documents (like flyers or brochures) of up to a few pages; it’s only for longer or more standardized documents that the page layout programs are more suitable.

Special vector editors are used for Computer Assisted Drafting [CAD]. They are not suitable for artistic or decorative graphics, but are rich in tools and object libraries used to ensure precision and standards compliance of drawings and blueprints.

Finally, 3D computer graphics software such as Maya, Blender or 3D Studio Max can also be thought of as an extension of the traditional 2D vector editors, and they share some common concepts and tools, but you knew that!

Note that not all the listed software is available for the Macintosh, but with the available Windows emulators for Intel Macs, this is no longer an issue.

For a Summary Comparison, and Specific Details, of Vector Graphics Editors First Check Out —

Also check out the Graphic Design and Graphics Software articles at Wikipedia. Neither is great, but they are a good getting-acquainted start.

There are also lists of graphics software on Wikipedia