Pages

August 01, 2009

Several Excellent Articles from Capacity Factor

The blog Capacity Factor has several excellent articles, and the site overall offers strong analysis and research.

Sustainable energy and the Kardashev scale analyzes which energy sources could maintain civilization's energy levels over time.

Nextbigfuture's own analysis found that uranium from seawater could supply ten times the world's current electricity needs for 100 million years.



Looking at the uranium and thorium in the earth's crust: 80 trillion tons times 950 gigawatt-days/ton times 24 billion watt-hours per GWd gives roughly 1,750 billion trillion kilowatt-hours.


World net electricity generation nearly doubles in the IEO2008 reference case, from about 17.3 trillion kilowatthours in 2005 to 24.4 trillion kilowatthours in 2015 and 33.3 trillion kilowatthours in 2030. Uranium and thorium can provide 100 times current world electricity usage for 1 billion years.
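The arithmetic behind these supply figures can be checked in a few lines of Python. The inputs (80 trillion tons of crustal uranium and thorium, 950 GWd/ton, 17.3 trillion kWh world generation in 2005) are taken from the text; the product comes out near 1.8 × 10^24 kWh, in line with the figure quoted above.

```python
# Back-of-envelope check of the crustal uranium/thorium energy figures.
TONS = 80e12          # tons of uranium + thorium in the earth's crust
GWD_PER_TON = 950     # gigawatt-days of energy per ton at full burn-up
KWH_PER_GWD = 24e6    # 1 GW for 24 hours = 24e9 Wh = 24e6 kWh

total_kwh = TONS * GWD_PER_TON * KWH_PER_GWD
print(f"total: {total_kwh:.2e} kWh")   # ~1.8e24 kWh

WORLD_2005_KWH = 17.3e12               # world electricity generation, 2005
years_at_100x = total_kwh / (100 * WORLD_2005_KWH)
print(f"{years_at_100x:.2e} years at 100x current usage")  # ~1e9 years
```

This confirms the claim that crustal uranium and thorium could supply 100 times current world electricity usage for on the order of a billion years.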

Other articles from Capacity Factor:

A meta study of relative energy cost by source.

A metastudy of levelized cost of energy by source.



A critique of the Waxman Markey bill.

A detailed examination of a uranium mining accident, which shows negligible effects.

Sometimes Fewer Dollars are Better for Healthcare

New Scientist explains how sometimes less is more for healthcare.

Research at the Dartmouth School of Medicine in New Hampshire shows how high-spending regions of the country are driving the spiralling costs. Insurers and the government pay set fees for each medical intervention performed. In some regions, doctors in institutions that are competing to become "centres of excellence" in high-paying fields may use unnecessary diagnostic tests, and surgeons often perform expensive procedures when cheap drugs may be a better option.

The main proposals now before Congress won't do much to tackle this problem. Rather, they concentrate on the important issue of expanding access to health insurance, and the Congressional Budget Office calculates that they will increase spending by hundreds of billions of dollars over the coming decade.

Cutting costs would involve more research into the comparative effectiveness of different tests and treatments, and giving doctors incentives to deliver quality care, not just paying them more for doing more.

Why the reluctance to tackle these issues? Partly it's because no politician wants to be accused of rationing healthcare. One way forward might be to inform the public that sometimes less can be more. "When people understand, they're less likely to choose expensive, invasive procedures," says Shannon Brownlee of the New America Foundation, a think tank based in Washington DC.


Dartmouth has research on "Health Care Spending, Quality, and Outcomes"


Dartmouth: The Policy Implications of Variations in Medicare Spending Growth



RELATED READING

The best way to lower healthcare costs is to find more real cures.

Past, present and future US federal budgets


Space Elevator Games Delayed at Least 4 Weeks

The helicopter flight was not successful in maintaining constant tension or position for hanging the 1 kilometer long tether for the space elevator games.

This resulted in a safety device dropping the line – which means we’ll have to do this yet again, until we get it right, and so the games cannot proceed as planned on August 5th. The likely minimum delay is probably 4 weeks.

Aviation Week has coverage of the problem:

A technical issue with a helicopter cable system is forcing the Spaceward Foundation to postpone the Space Elevator Power Beaming Challenge Games originally scheduled for this summer at NASA's Dryden Flight Research Center.

The 3/16th inch thick steel cable is suspended beneath an Aris Helicopters-operated Sikorsky S-58T, but issues reoccurred with the winch and pulley system similar to those which earlier this month forced organizers to reschedule the contest to Aug. 5-7. In the latest tests over the dry lakebed at Edwards Air Force Base last week, as the helicopter was hovering above the anchor point, one of the safety release mechanisms gave way unexpectedly. The release mechanism is designed to drop the cable should the pull strength exceed 3,500 pounds, but officials say that according to strain gauges the pull tension was "nowhere near 3,500 pounds."

Spaceward, which is trying to develop a stable racetrack for the contests to use, now plans to re-examine this system and determine what changes may need to be made. Officials believe the games may be rescheduled for September or October, given the time required for reruns of the tests with the teams and the helicopter.



Prior coverage of the space elevator games

A Worthy New Purpose for Space and More Discussion of Fuel Depots

The “Review of U.S. Human Space Flight Plans Committee,” also known as the “Augustine Committee,” declared a new overarching purpose for America’s national space enterprise:

“the underlying reason why we do human spaceflight is the extension of human civilization beyond Earth“

The official Review of U.S. Human Space Flight Plans Committee webpage is here.

This is a great reason and one that this site agrees with.

Adjoining that should be:

1. A tight space agency focus on technology and systems that greatly reduce the cost of space access
2. The industrialization and creation of infrastructure in space (the solar system)
3. Infrastructure that allows the material and energy resources of the solar system to be used and leveraged

Speculist comment on the new goal for space.

Alan Boyle's MSNBC Cosmic Log covers the fuel depot concept.

The 5-page PDF fuel depot white paper reviewed by the committee.



HobbySpace is tracking information from the Augustine Commission and reactions from around the blogosphere and internet.




July 31, 2009

Photonic Propulsion and Fusion Work at Bae Institute

Young K. Bae demonstrated pure photonic propulsion by bouncing a 10-watt laser between mirrors a few thousand times, generating 35 micronewtons of force.

Details on the photonic laser thruster demonstration.

Photonic Laser Thruster (PLT) is an innovative photon thruster that amplifies photon thrust by orders of magnitude by exploiting an active resonant optical cavity formed between two mirrors on paired spacecraft. PLT is predicted to be able to provide the thrust to power ratio (T/P) approaching that of conventional thrusters, such as laser ablation thrusters and electrical thrusters. Yet, PLT has the highest Isp of 3x10^7 sec, which is orders of magnitude larger than that of other conventional thrusters. We have demonstrated the photon thrust amplification in PLT for the first time. The T/P obtained with an OC mirror with R= 0.99967±0.00002 was 20±1 µN/W, and the maximum photon thrust obtained was 35 µN, resulting in an apparent photon thrust amplification factor of 2,990±150. Scaling-up of PLT is promising, and PLT is predicted to enable wide ranges of space endeavors. Low thrust PLTs may enable nanometer precision spacecraft formation for forming ultralarge space telescopes and radars, and provide economically viable solution to Fractionated Spacecraft Architecture, the System F-6. Medium thrust PLTs may enable precision propellantless orbit changing and docking. High thrust PLTs may enable propelling spacecraft at speeds orders of magnitude greater than that by conventional thrusters.



There are still technical issues to be overcome: thermal limitations, optical absorption, and saturation of the laser gain media.



Near-term possibility: photon tether formation flight (PTFF), with maximum baseline distances greater than 10 km. A 10-W Photon Laser Thruster (PLT) with a 0.999998-reflectance OC mirror (500,000 reflections, a better mirror than in the actual demo) would produce a thrust F_T of 33.5 mN, which would enable a wide range of formation-flying missions.
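The thrust figures quoted in this section are consistent with a simple scaling law: a photon delivers 2P/c of thrust per watt on reflection, and the resonant cavity amplifies this by roughly 1/(1 − R) for mirror reflectance R. This sketch uses that standard photon-pressure approximation, which is an assumption on my part rather than the paper's exact model:

```python
# Approximate photon-thrust scaling for a resonant optical cavity.
C = 2.998e8  # speed of light, m/s

def photon_thrust(power_w, reflectance):
    """Amplified photon thrust in newtons: (2P/c) * ~1/(1-R) bounces."""
    amplification = 1.0 / (1.0 - reflectance)
    return 2.0 * power_w / C * amplification

# Demonstration mirror (R = 0.99967): about 20 micronewtons per watt,
# matching the reported 20 +/- 1 uN/W and ~2,990x amplification.
print(photon_thrust(1.0, 0.99967) * 1e6)

# PTFF projection (10 W, R = 0.999998): about 33 millinewtons,
# matching the 33.5 mN figure above.
print(photon_thrust(10.0, 0.999998) * 1e3)
```

Both reported results fall out of the same formula, which suggests the quoted numbers are internally consistent.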









Far-term: the Photon Laser Propulsion (PLP) application is deep-space rapid-turnaround probe missions, where L is the acceleration distance and M is the mass of the launched spacecraft.



If the scattering and absorption of the optical systems are negligible, a 10-MW laser system with 0.999998-reflectance mirrors could accelerate a 1-kilogram mass over 1,000 kilometers to a maximum velocity of 180 km/s. At this velocity, the PLP spacecraft would transit the 100 million kilometers to Mars in less than a week.
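The Mars transit claim follows directly from the numbers given. A quick check, treating 180 km/s as a constant cruise speed and ignoring the brief acceleration phase:

```python
# Transit-time check for the far-term PLP example.
CRUISE_KM_S = 180.0   # maximum velocity from the 10-MW example
DISTANCE_KM = 100e6   # approximate Earth-Mars distance used above

transit_days = DISTANCE_KM / CRUISE_KM_S / 86400
print(f"{transit_days:.1f} days")   # ~6.4 days, i.e. under a week
```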


Fusion work with Winterberg

In a recent paper, the Bae Institute proposes that the metastable innershell molecular state (MIMS) was experimentally discovered by Bae et al. in hypervelocity (v > 100 km/s) impacts of nanoparticles. The decay of MIMS produced the observed intense soft x-rays in the range of 75–100 eV, in agreement with Winterberg’s recent prediction.

MIMS can be used for generating super-intense x-ray beams with unprecedented high conversion efficiency from kinetic-energy to x-ray energy, over 40%. Such super-intense x-ray beams can make inertial confinement nuclear fusion more efficient and economically viable. Metastable Innershell Molecular State (MIMS) is a new high energy density matter quantum state. MIMS exists in matters compressed “suddenly” at pressures in excess of one hundred million atmospheres.


The super intense x-ray generation efficiency in % as a function of nanoparticle shock pressure in millions of atmospheres (Mbar). The excellent Arrhenius fit to the data indicates the existence of Metastable Innershell Molecular State (MIMS), a transient quantum state in highly compressed matter.



They propose that this intense x-ray production via nanoparticle impact can also be used to generate intense hard x-rays.

Efficient x-ray generation from matter composed of heavy elements may not require the use of nanoparticles and the Dicke superradiance mechanism. Based on the experimental observations by Bae et al. and the present analyses, the kinetic energy per atom required to trigger the x-ray production mechanism is on the order of the x-ray energy. For 92U–92U pairs, the velocity of 92U nanoparticles required to reach that threshold energy is ∼100 km/s, corresponding to a threshold pressure of ∼2 Gbar.

They propose that the metastable innershell molecular state (MIMS) can be readily created by “cold” compression at pressures in excess of 100 Mbar, and that such “cold” compression can be generated in the hypervelocity (v > 100 km/s) impact of nanoparticles, in which the collision/compression time scale (10–100 fs) is shorter than the ion–electron thermalization time scale (> 1 ps). Further, they propose that the limited size of nanoparticles can increase the emission rate of MIMS x-rays owing to the Dicke superradiance mechanism. Their theory, combined with Winterberg’s recent prediction, explains the anomalous detector signals discovered by Bae et al. in hypervelocity (v > 100 km/s) impacts of nanoparticles, such as clusters and biomolecules, as resulting from the existence and optical decay of MIMS. The analysis of the experimental data yielded intense soft x-rays in the range of 75–100 eV, in agreement with Winterberg’s prediction, and a conversion efficiency of 38% from the initial kinetic energy of the nanoparticles to x-ray radiation energy.


FURTHER READING
Friedwardt Winterberg has made major contributions to nuclear fusion theory and initiated the ideas that led to global positioning satellites.

Controversial Blacklight Power Signs 6th Commercial License Deal

BlackLight Power (BLP) Inc. signed its sixth commercial license agreement and first with Akridge Energy, LLC.

UPDATE: Further Rowan University confirmation has been published

Akridge Energy may use the technology to produce electric power up to a maximum continuous capacity of 400 megawatts (MW). To date, BLP has licensed the rights to produce approximately 8,000 megawatts of electrical power to five utilities, two of which are publicly traded companies, and one independent power producer.


There has been no further independent confirmation of energy generation or the release of a Blacklight Power generator for public testing.

Blacklight Power claims that they are developing an extraordinary new form of energy generation. They also claim extraordinary and controversial science.
Previous Deal and Information
BlackLight Power (BLP) Inc. today announced its first commercial license agreement with Estacado Energy Services, Inc. in New Mexico, a wholly-owned subsidiary of Roosevelt County Electric Cooperative, (Estacado). In a non-exclusive agreement, BLP has licensed Estacado to use the BlackLight Process and certain BLP energy technology for the production of thermal or electric power. Estacado may produce gross thermal power up to a maximum continuous capacity of 250 MW or convert this thermal power to corresponding electricity.

Background
- Blacklight Power has provided information and assistance to a blogger/chemistry professor looking to validate their process

- Venture Beat investigates Blacklight Power

- Rowan University study provides external confirmation of a substantial amount of extra heat from Blacklight Power materials.

- Blacklight Power Claims

The latest expected unit costs for the Blacklight power system compared to current energy technology:



The Blacklight hydrogen production plant diagram

Potential Applications for Blacklight Power Technology
- H2(1/p) Enables laser at wavelengths from visible to soft X-ray
- VUV photolithography (Enables next generation chip)
- Blue Lasers
- Line-of-sight telecom and medical devices
- High voltage metal hydride batteries
- Synthetic thin-film and single crystal diamonds
- Metal hydrides as anticorrosive coatings





Estacado is a wholly-owned subsidiary of Roosevelt County Electric Cooperative, (RCEC) in New Mexico. With over 2,757 miles of energized lines in east central New Mexico, RCEC serves Dora, Elida, Floyd, Arch, Rogers, Milnesand, Causey and Portales.


FURTHER READING
Details of Blacklight Powers patent dispute in the UK.

In upholding both of the examiner's objections, the Hearing Officer identified the question which he had to address to be whether the underlying theory of GUTCQM was true. In doing so, he identified three criteria which he had to consider in determining whether a scientific theory was true, namely whether:

- the explanation of the theory is consistent with existing generally accepted theories; if it is not, it should provide a better explanation of physical phenomena than current theories and should be consistent with any accepted theories that it does not displace;
- the theory makes testable predictions, and the experimental evidence shows rival theories to be false and matches the predictions of the new theory; and
- the theory is accepted as a valid explanation of physical phenomena by the community of scientists who work in the relevant discipline.

Critically, the hearing officer went on to determine that he must satisfy himself that it was more probable than not that the theory was true. On this basis, the Hearing Officer found that he was not satisfied that the theory was true and therefore the claims in the applications which relied upon the theory were not patentable.

The appeal focused on whether the Hearing Officer had been right in considering the appropriate test to be whether the theory was true on the balance of probabilities. Blacklight contended that the test that should be applied is whether the theory is clearly contrary to well established physical laws. In considering this, the examiner should assess whether the applicant has a reasonable prospect of showing that his theory is a valid one should the patent be litigated in court. In making these arguments, Blacklight accepted that on the material before the Hearing Officer the theory was probably incorrect.


The Examiner has an article on Blacklight Power.

Bob Maddox: Pure Rocket Man and Possible Future Darwin Award Winner


Bob Maddox and one of four rockets
Bob Maddox wants to strap himself to a homemade four-engine pulse jet rocket, ride it to around 25,000 feet and then jump off. [via Wired.com and the Medford Mail]

OregonLive has a feature on Bob Maddox.

Bob's contact info, per OregonLive (and probably for donations): 541-779-3800, rrocketmann at charter dot net

The rockets will burn gas and kerosene and generate 4,000 pounds of thrust. Although liftoff will be a relatively low 250 mph, the rockets will be capable of nearly supersonic velocity, according to the Medford Mail. Gyroscopes and servos will monitor the pulse jets, and he’ll use small rockets in the nose of his contraption for steering.

Maddox envisions a rocket that will generate 4,000 pounds of thrust. It will start at a relatively low speed, 250 mph, so if anything bad happens, it won't happen quite so fast. Top velocity, though, will be close to supersonic.

The most dangerous moment for anything that flies is just getting off the ground. Maddox said his reserve parachute is designed to open, if he needs it, at an altitude of just 50 feet.

Steering a rocket that's climbing five miles into the atmosphere might sound a little tricky, but Maddox said he's making all parts using computer numeric code software, so they will all have identical performance. He'll monitor the engines with gyroscopes and servo mechanisms, which will nudge four small nose rockets for steering.

When he reaches an altitude of 25,000 feet, Maddox plans to go skydiving. A rocket in his ejection seat will fire for half a second, pushing him out of the rocket.

Skydiving from that height is the least of his problems. Maddox has parachuted more than 2,000 times, and he jumped from 20,000 feet in a mass jump of 100 people.

When he bails out, the engine will stop and a parachute on the rocket will deploy so that it can be recycled and used again.



Unlike the character in Dr. Strangelove, Bob will be in a rudimentary cockpit. It was mentioned that he has an ejection seat.


Bob Maddox will make Evel Knievel and his attempted Snake River Canyon jump look small-time.





Bob Maddox's effort recalls a Homer Simpson line from The Simpsons:

Homer: Nobody snuggles with Max Power. You strap yourself in and feel the "G"s!



Good luck Bob.

NASA Panel Strongly in Favor of Fuel Depots and Considering Deep Space Crewed Missions


New Scientist reports that the NASA panel appears strongly in favor of orbital fuel depots to lower the cost of space exploration.

This site has covered fuel depots before and is also strongly in favor of using orbital fuel depots.

Fuel depots would allow NASA to mount moon missions without spending billions of dollars developing the gigantic Ares V rocket. Existing, less powerful rockets such as Boeing's Delta IV or Lockheed Martin's Atlas V would suffice.

Prior to each moon mission, fuel would be ferried to the orbiting depot by these or even smaller rockets operated by private companies. Competition for this work would drive down costs and spur development of more efficient launch vehicles, Goff argues. "Until we lower the cost of transportation to space, we're never going to see serious off-world exploration," he says.

On 30 July, the panel's subcommittee on exploration beyond low-Earth orbit came out strongly in favour of creating fuel depots in space as a way to facilitate exploration beyond low-Earth orbit. At a public meeting of the panel in Cocoa Beach, Florida, the subcommittee proposed that depots be part of every space exploration scenario that the full committee puts forward in its final report.

Private companies would compete for the job of ferrying fuel to the orbiting depot
It remains to be seen whether the panel will back the idea in its final report, to be published at the end of August. "This panel is probably the best chance depots are going to have in the next 10 years to get actual NASA support and funding," Goff says.





The committee reviewing NASA's goals has outlined a scheme to send astronauts on progressively longer space trips – including dockings with asteroids and flybys of Venus – to prepare for an eventual landing on Mars.



One of the options the team proposed is called the "flexible path", which Crawley also described as a "deep space" or "in space" option.

It would see astronauts sent on a series of progressively longer missions beyond low-Earth orbit. The first would fly by the moon. Later missions would include rendezvousing with one or more of the many asteroids on orbits that take them close to Earth. Asteroid missions would take several months each.

Later, astronauts could fly by Mars and Venus, and touch down on Mars's 27-kilometre-wide moon Phobos. Each of these missions would take more than a year.

An advantage of the stepwise approach is that the difficult job of building landers and surface equipment could be deferred, since early missions would be restricted to flybys or rendezvousing with small objects that have negligible gravity – a process that would resemble docking with another spacecraft.

Although Crawley did not give a specific year by which the first human mission to an asteroid could occur, he said it could happen within six years of starting a project to accomplish this goal.

The other options on the subcommittee's shortlist were:

• Mars first: Cancel the return to the moon and focus on sending humans to Mars instead.

• Lunar global: Have astronauts land in many different places on the moon's surface, with the option of eventually building a lunar outpost, but focus on doing things there that really help prepare the way for human Mars missions. In one version of this option, hardware would be designed from the beginning to be used on Mars, with the moon missions serving to test it.

• Continue with the current plan, which aims to return astronauts to the moon by 2020 and eventually build a permanent lunar base. Meeting the 2020 deadline would presumably require an increase to NASA's budget.

• Continue with the current plan, but keep within the budget currently expected for NASA by slowing the schedule.

• Continue with the current plan, but cancel the Ares I rocket designed to put a crew capsule in low-Earth orbit. In this case, NASA would build only the more powerful Ares V rocket, which is capable of sending crew and cargo to the moon.




General Fusion: the Technical Challenge of Precisely Timed Spheromak Compressions
Power pistons: General Fusion's reactor is a metal sphere with 220 pneumatic pistons designed to ram its surface simultaneously. The ramming creates an acoustic wave that travels through a lead-lithium liquid and eventually accelerates toward the center into a shock wave. The shock wave compresses a plasma target, called a spheromak, to trigger a fusion burst. The thermal energy is extracted with a heat exchanger and used to create steam for electricity generation. To produce power, the process would be repeated every second. Credit: General Fusion

MIT Technology Review (Tyler Hamilton) covers General Fusion. General Fusion has raised between $9 million and $13.5 million and received C$13.5 million in government funding in the latest round.

The prototype reactor will be composed of a metal sphere about three meters in diameter containing a liquid mixture of lithium and lead. The liquid is spun to create a vortex inside the sphere that forms a vertical cavity in the middle. At this point, two donut-shaped plasma rings held together by self-generated magnetic fields, called spheromaks, are injected into the cavity from the top and bottom of the sphere and come together to create a target in the center. "Think about it as blowing smoke rings at each other," says Doug Richardson, chief executive of General Fusion.

On the outside of the metal sphere are 220 pneumatically controlled pistons, each programmed to simultaneously ram the surface of the sphere at 100 meters a second. The force of the pistons sends an acoustic wave through the lead-lithium mixture, and that accelerates into a shock wave as it reaches the plasma, which is made of the hydrogen isotopes deuterium and tritium.

If everything works as planned, the plasma will compress instantly and the isotopes will fuse into helium, releasing a burst of energy-packed neutrons that are captured by the lead-lithium liquid. The rapid heat buildup in the liquid will be extracted through a heat exchanger, with half used to create steam that spins a turbine for power generation, and the rest used to recharge the pistons for the next "shot."

The ultimate goal is to inject a new plasma target and fire the pistons every second, creating pulses of fusion reactions as part of a self-sustaining process. "One of the big risks to the project is nobody has compressed spheromaks to fusion-relevant conditions before," says Richardson. "There's no reason why it won't work, but nobody has ever proven it."

General Fusion says it can achieve "net gain"--that is, create a fusion reaction that gives off more energy than is needed to trigger it--using relatively low-tech, mechanical brute force and advanced digital control technologies that scientists could only dream of 30 years ago.

It may seem implausible, but some top U.S. fusion experts say General Fusion's approach, which is a variation on what the industry calls magnetized target fusion, is scientifically sound and could actually work. It's a long shot, they say, but well worth a try.

"I'm rooting for them," says Ken Fowler, professor emeritus of nuclear engineering and plasma physics at the University of California, Berkeley, and a leading authority on fusion-reactor designs. He's analyzed the approach and found no technical showstoppers. "Maybe these guys can do it. It's really luck of the draw."

The company can now start the first phase of building the test reactor, including the development of 3-D simulations and the technical verification of components. General Fusion aims to complete the reactor and demonstrate net gain within five years, assuming it can raise another $37 million.

If successful, it believes it can build a grid-capable fusion reactor rated at 100 megawatts four years later for about $500 million, beating ITER by about 20 years and at a fraction of the cost.

General Fusion's basic approach isn't entirely new. It builds on work done during the 1980s by the U.S. Naval Research Laboratory, based on a concept called Linus. The problem was that scientists couldn't figure out a fast-enough way to compress the plasma before it lost its donut-shaped magnetic confinement, a window of opportunity measured in milliseconds. Just like smoke rings, the plasma rings maintain their shape only momentarily before dispersing.

Nuclear-research giant General Atomics later came up with the idea of rapidly compressing the plasma using a mechanical ramming process that creates acoustic waves. But the company never followed through--likely because the technology to precisely control the speed and simultaneous triggering of the compressed-air pistons simply didn't exist two decades ago.

Richardson says that high-speed digital processing is readily available today, and General Fusion's mission over the next two to four years is to prove it can do the job. Before building a fully functional reactor with 220 pistons on a metal sphere, the company will first verify that smaller rings of 24 pistons can be synchronized to strike an outer metal shell.
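The synchronization challenge can be put in rough numbers. This is a minimal sketch under stated assumptions: the 3-meter sphere diameter and the "milliseconds" confinement window come from the article, but the ~1,800 m/s sound speed for the molten lead-lithium is my assumed figure, not a General Fusion number.

```python
# Rough timing sketch for the 220-piston synchronization problem.
SOUND_SPEED_M_S = 1800.0  # ASSUMED acoustic speed in molten lead-lithium
RADIUS_M = 1.5            # half of the 3-meter sphere diameter

travel_ms = RADIUS_M / SOUND_SPEED_M_S * 1000
print(f"wave transit to center: {travel_ms:.2f} ms")  # ~0.83 ms

# The spheromak holds its shape for only milliseconds, so all 220 pistons
# must fire within a small fraction of that window for the acoustic fronts
# to merge into a single converging shock.
PLASMA_WINDOW_MS = 1.0    # assumed order of magnitude for the window
print(f"transit uses ~{travel_ms / PLASMA_WINDOW_MS:.0%} of the window")
```

Under these assumptions the acoustic transit alone consumes most of a millisecond-scale window, which illustrates why microsecond-class digital triggering of the pistons is the make-or-break technology here.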







Summarizing the Nuclear Fusion Projects that Could Commercialize before 2020

Nuclear fusion projects that have been funded and have some chance of successful commercial fusion before 2020 are:

1. IEC (Inertial electrostatic confinement) fusion work by EMC2 Fusion. This is building upon the work of the late Robert Bussard. In a personal interview the project lead Dr Richard Nebel stated:

The project that we hope to have out within the next six years will probably be a demo, which won't have the attendant secondary equipment necessary for electricity generation. Hopefully the demo will demonstrate everything that is needed to put a full-scale working plant into commercial production. So if the concept works we could have a commercial plant operating as early as 2020.

In a separate comment in May, 2009 Dr Nebel stated that commercial viability of this technology should be known in 18-24 months from this Navy and US government funded project.

2. Tri-Alpha Energy which has received over $50 million in funding and is developing colliding beam fusion. This is building upon the work of fusion researcher Norman Rostoker and is using a field reversed configuration. The project is highly secretive but has mentioned 2015-2018 target dates.

3. General Fusion is working on magnetized target fusion. They have private funding and funds from the Canadian government. General Fusion is likely to get the full $50 million for a net energy gain device with a target date of 2013. If the current validation and early stage efforts are successful then the first commercial scale unit could be in 2016-2018.

4. Lawrenceville Plasma Physics is working on controversial dense plasma focus fusion. This has also received sufficient funding for a currently ongoing technology validation effort.

Helion Energy is also working on a version of colliding beam fusion, but they do not appear to be funded yet.

There is the Laser fusion-fission hybrid but it is unlikely to commercialize before 2030.

July 30, 2009

Open Source Database Breakthrough: 10-80 Times Faster


The figure below shows the architecture of the new VectorWise engine. The left part shows the system architecture (the “X100” execution engine and the ColumnBM buffer manager) and how it maps onto the computer's resources (CPU cache, RAM, and disk). The right part shows a query in action, decomposed into so-called relational operators (Aggregate, Project, Select, and Scan) and execution primitives (such as summation, aggr_sum_flt_col).


A ground-breaking database kernel is now being combined with Ingres, the leading open source relational database.

The Ingres VectorWise project team has worked with Intel to evaluate database performance on the new Intel Xeon processor 5500 series based platform. To date, the results of the project have demonstrated dramatic cost and performance capabilities, as evidenced by a nearly 80-fold speedup on a query modeled after the Q1 query of the TPC-H suite on the Intel Xeon processor.



VectorWise next-generation database technology is based on a novel query processing architecture that allows modern microprocessors and memory architectures to reach their full potential. This is a unique achievement: detailed studies comparing common computing tasks such as scientific calculation, multimedia authoring, games, and databases have consistently shown that typical database engines do not benefit from new processor performance features such as SSE, out-of-order execution, chip multithreading, and increasingly larger L2/L3 caches, due to their large, complex legacy code structure.

The computational power that database systems provide is known to be lower than the performance realized by hand-coding the same task in a program (e.g., in C++). However, the actual performance difference can be surprisingly large: a factor of 100. VectorWise has created the first database system to reverse that situation, with dramatic efficiency improvements as a result.

Vectorized Execution
The VectorWise engine is designed for in-cache execution, which means that the only “randomly” accessible memory is the CPU cache, and main memory (by now inappropriately named “RAM” – Random Access Memory) is already considered part of secondary storage, used in the buffer manager for buffering I/O operations and their compressed large intermediate results. Queries are processed by passing multiple tuples at a time, in batches called “vectors,” between relational operators. These vectors are at the heart of the execution engine:
• VectorWise has developed vectorized versions of relational operators, so there is vectorized selection, vectorized project, join, sort etc. It has even been possible to vectorize binary search.
• Vectors are the simplest possible data structure, an array of values. A tuple is represented as a set of vectors of the same length, one for each column.
• An optional selection vector contains the positions of the tuples currently taking part in processing, i.e. those that have passed a filter operation.
• A vector typically contains between 100 and 1,000 values. The vector size is tuned so that all the vectors in a query plan fit comfortably together in the CPU cache.
• Vectorized primitives have many performance advantages because methods perform 100-1000 times more work, function call overhead is dramatically reduced, and database code becomes much more local, improving instruction cache locality and CPU branch prediction.
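The ideas in the bullet points above can be sketched in a few lines. This is a hypothetical pure-Python caricature of vectorized primitives and selection vectors, not actual VectorWise code:

```python
# Hypothetical caricature of vectorized execution (illustrative only;
# not actual VectorWise code). A "vector" is just an array of values
# for one column; a selection vector holds the positions of the tuples
# that passed earlier filters.

VECTOR_SIZE = 1024  # tuned so all vectors of a query plan fit in the CPU cache

def select_greater(col, threshold):
    """Vectorized selection primitive: one call filters a whole vector,
    amortizing interpretation overhead over ~1000 tuples, and returns a
    selection vector of surviving positions."""
    return [i for i, v in enumerate(col) if v > threshold]

def project_mul(col_a, col_b, sel):
    """Vectorized projection primitive: multiply two columns, but only
    at the positions named by the selection vector."""
    return [col_a[i] * col_b[i] for i in sel]

# Two column vectors of the same length represent a batch of tuples.
price = [float(i % 100) for i in range(VECTOR_SIZE)]
qty = [1 + (i % 7) for i in range(VECTOR_SIZE)]

sel = select_greater(price, 50.0)       # positions with price > 50
revenue = project_mul(price, qty, sel)  # computed only for selected tuples
print(len(sel), sum(revenue))
```

The point of the batch-at-a-time primitives is that one function call does the work of hundreds of per-tuple calls, which is where the instruction-cache and branch-prediction benefits described above come from.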

New Zeolite Membranes Increase Energy Efficiency of Chemical Separations up to 50 times


Shown in the image are depictions of (top) a conventionally calcined c-oriented silicalite-1 zeolite membrane and (bottom) an identically oriented membrane that has undergone rapid thermal processing (RTP). Red and green regions in the 3D schematics are indicative of zeolite crystal grains and defects/grain boundaries, respectively. A scanning electron microscopy (SEM) image of the membrane cross-section is shown, as well as representative cross-sectional images collected of dye-saturated membranes via laser scanning confocal microscopy. The schematics and representative data highlight the accessibility and inaccessibility of grain boundaries, respectively, in the conventionally calcined and RTP treated membranes.

Credit: Jungkyu Choi, University of California, Berkeley; Mark A. Snyder, Lehigh University; and Michael Tsapatsis, University of Minnesota



Engineers have developed a new method for creating high-performance membranes from crystal sieves called zeolites; the method could increase the energy efficiency of chemical separations up to 50 times over conventional methods and enable higher production rates. Chemical separation is a multibillion-dollar part of the economy and affects many aspects of many industries.

The researchers demonstrated the RTP process on relatively thick (several micrometers) zeolite membranes. Tsapatsis and collaborators are now working towards making zeolite membranes 10 to 100 times thinner to allow molecules to pass through more quickly. They hope to eventually apply RTP treatment, with its beneficial effects, to these membranes as well.

The ability to separate and purify specific molecules in a chemical mixture is essential to chemical manufacturing. Many industrial separations rely on distillation, a process that is easy to design and implement but consumes a lot of energy.

Tsapatsis's team developed a rapid heating treatment to remove structural defects in zeolite membranes that limit their performance, a problem that has plagued the technology for decades.

This discovery could increase the energy efficiency of producing important chemical solvents such as xylene and renewable biofuels such as ethanol and butanol.





Creating Zeolite Membranes

Researchers create zeolite membranes by growing a film of crystals with small organic ions added to direct the crystal structure and pore size--two zeolite properties that help determine which molecules can pass through the material. Then they slowly heat the zeolite film in a process called calcination to decompose the ions and open the pores.

However, Tsapatsis explained, "This method for creating zeolite films often leaves cracks at the boundaries between grains of zeolite crystals." These defects have prevented zeolite films from being used effectively as membranes, because molecules of unwelcome chemicals that are rejected by the zeolite pores can still penetrate through the membrane defects.

"While it may be possible to correct some of these defects, the repair process is difficult and expensive," Wesson said. Currently zeolite membranes have found use only in specialized, smaller-scale applications, such as the removal of water from alcohols or other solvents.

In an effort to minimize the formation of cracks and other defects, the heating rate during calcination is very gentle, and the process can take as long as 40 hours--typically a material is heated at a rate of 1 degree Celsius per minute up to a temperature between 400 and 500 degrees Celsius, where it is held steadily for several hours before being allowed to slowly cool. Because conventional calcination is time-consuming and energy-intensive, it has been difficult and expensive to produce zeolite membranes on a large scale.
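The schedule described above can be sanity-checked with a quick back-of-envelope calculation. This is a sketch with assumed numbers (25 °C start, 450 °C target, an 8-hour hold, and a cool-down as slow as the ramp; only the 1 °C/min rate and 400-500 °C window are from the article):

```python
# Back-of-envelope check of the conventional calcination schedule.
# Assumed: 25 C room-temperature start, 450 C target, the quoted
# 1 C/min ramp, an 8-hour hold, and a cool-down as slow as the ramp.

ramp_rate_c_per_min = 1.0
start_c, target_c = 25.0, 450.0

heat_min = (target_c - start_c) / ramp_rate_c_per_min  # 425 minutes of heating
hold_min = 8 * 60                                      # hold at temperature
cool_min = heat_min                                    # symmetric slow cooling

total_h = (heat_min + hold_min + cool_min) / 60.0
print(round(total_h, 1))  # about 22 hours; slower ramps or longer holds push toward 40
```

Against this, the RTP treatment described below reaches 700 °C within one minute and holds for at most two minutes, which is why it is so much cheaper in time and energy.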

Hotter and Faster

Tsapatsis's team developed a treatment called Rapid Thermal Processing (RTP), a treatment in which zeolite film is heated to 700 degrees Celsius within one minute and kept at that temperature for no more than two minutes. Acting as an annealing method, RTP refines the granular structure of the zeolite crystal film.

When the researchers examined the RTP-treated films, they found no evidence of cracks at grain boundaries. Although they found other types of defects, these don't seem to affect the membrane properties or performance.

In a comparison to conventionally-made zeolite membranes, Tsapatsis said, "We observed a dramatic improvement in the separation performance of the RTP-treated membranes." A second round of RTP treatment improved separation performance even further, to a level on par with current industry separation methods.



Laser Propulsion Tests in Brazil using Gigawatt Pulsed Lasers


At a Brazil-based lab, a hypersonic shock tunnel is linked to two pulsed infrared lasers with peak powers reaching the gigawatt range - the highest power laser propulsion experiments performed to date.

Leik Myrabo is an aerospace engineering professor at Rensselaer Polytechnic Institute who has demonstrated the feasibility of using ground-based lasers to propel objects into orbit, potentially reducing launch costs by a factor of 1,000. Small-scale tests have flown prototypes to altitudes of 233 feet or so.

UPDATE: A February 2009 Wired article has more information on other ongoing laser propulsion research.

Myrabo reportedly has made more than 140 test flights using small prototypes. He isn’t the only one exploring this field, either. Five years ago, NASA joined Tim Blackwell, a researcher at the Center for Applied Optics at the University of Alabama in Huntsville, in using laser propulsion to power a small model airplane. Researchers at the University of Tokyo have used a laser to propel a tiny airplane and detailed their findings in the journal Applied Physics Letters in 2002.




Latest news at Lightcraft Technologies.

Engine and Inlet Experiments Underway
Scale cross-sections of various LightCraft engine and inlet geometries are currently undergoing tests in a hypersonic wind tunnel in South America at Mach 7 to 10. These laser propulsion experiments are aimed at confirming physics-based computer models for: a) Directed Energy AirSpike (DEAS) inlets; and b) full-scale pulsed detonation engine segments, examining interactions of expanding plasma blast waves with inlet flows and thruster surfaces. Tests thus far have shown excellent agreement with the models.


Flight Dynamics & Control Laws Developed
Working with graduate students at Rensselaer Polytechnic Institute (RPI) in New York state, Dr. Myrabo and his team developed and modeled a comprehensive set of Flight Dynamics and Control (FDL) laws for the LightCraft, calibrated against 16 historic lightcraft flights at White Sands Missile Range, NM. Subsequent computer simulations have confirmed that complete dynamic control of a full-size LightCraft along a launch trajectory into low Earth orbit is feasible. Aerodynamics and propulsion data bases now being collected in both low-speed and hypersonic wind tunnel experiments will be used to upgrade subroutines in the FDL code.




Laser propulsion is a form of Beam-powered propulsion where the energy source is a remote (usually ground-based) laser system and separate from the reaction mass.

Lightcraft
A ground based laser is the power source that propels the Lightcraft into orbit. Lightcraft can deliver payloads into space for a fraction of the cost of traditional rockets because most of the engine stays on the ground, thereby unburdening the craft from having to lift the energy source for its propulsion system.

The back side of the craft is a large, highly polished parabolic mirror that is designed to capture the laser beam projected at it from the ground. The mirror focuses the beam, rapidly heating the air to five times the temperature of the sun and creating a blast wave out the back that pushes the vehicle upward. As the beam is rapidly pulsed, the vehicle is continuously propelled forward, on its way to orbit.


Lightcraft technology can be a lot cheaper than conventional rockets.






Types of Laser Propulsion

Pulsed plasma propulsion
A high energy pulse focused in a gas or on a solid surface surrounded by gas produces breakdown of the gas (usually air). This causes an expanding shock wave which absorbs laser energy at the shock front (a laser sustained detonation wave or LSD wave); expansion of the hot plasma behind the shock front during and after the pulse transmits momentum to the craft. Pulsed plasma propulsion using air as the working fluid is the simplest form of air-breathing laser propulsion. The record-breaking Lightcraft, developed by Leik Myrabo of RPI (Rensselaer Polytechnic Institute) and Frank Mead, works on this principle.


Laser electric propulsion
A general class of propulsion techniques in which the laser beam power is converted to electricity, which then powers some type of electric propulsion thruster. Usually, laser electric propulsion is considered as a competitor to solar electric or nuclear electric propulsion for low-thrust propulsion in space. However, Leik Myrabo has proposed high-thrust laser electric propulsion, using magnetohydrodynamics to convert laser energy to electricity and to electrically accelerate air around a vehicle for thrust.


Ablative laser propulsion
Ablative Laser Propulsion (ALP) is a form of beam-powered propulsion in which an external pulsed laser is used to burn off a plasma plume from a solid metal propellant, thus producing thrust. The measured specific impulse of small ALP setups is very high at about 5000 s (49 kN·s/kg), and unlike the lightcraft developed by Leik Myrabo which uses air as the propellant, ALP can be used in space.

Material is directly removed from a solid or liquid surface at high velocities by laser ablation by a pulsed laser. Depending on the laser flux and pulse duration, the material can be simply heated and evaporated, or converted to plasma. Ablative propulsion will work in air or vacuum. Specific impulse values from 200 seconds to several thousand seconds are possible by choosing the propellant and laser pulse characteristics. Variations of ablative propulsion include double-pulse propulsion in which one laser pulse ablates material and a second laser pulse further heats the ablated gas, laser micropropulsion in which a small laser onboard a spacecraft ablates very small amounts of propellant for attitude control or maneuvering, and space debris removal, in which the laser ablates material from debris particles in low Earth orbit, changing their orbits and causing them to reenter.
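The specific impulse figures quoted above convert to impulse per unit propellant mass via the standard relation v_e = Isp x g0; a minimal sketch:

```python
# Converting the quoted specific impulse figures into effective exhaust
# velocity (equivalently, impulse per unit propellant mass) via
# v_e = Isp * g0.

G0 = 9.80665  # standard gravity, m/s^2

def isp_to_ve(isp_seconds):
    """Effective exhaust velocity in m/s, which equals N*s/kg."""
    return isp_seconds * G0

print(isp_to_ve(5000))  # ~49,000 N*s/kg, i.e. the ~49 kN*s/kg ALP figure
print(isp_to_ve(200))   # lower bound of the ablative propulsion range
```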

ALP is being developed by Professor Andrew Pakhomov, who leads the UAH Laser Propulsion Group at the University of Alabama in Huntsville.

RELATED READING

Keith Henson has a plan for large scale use of laser propulsion to boost space based solar power.

More on Keith Henson's plan

A discussion of laser launch with Jordin Kare

Energy beam propulsion conference.


Nanoreporters: Hydrophilic (water soluble) carbon clusters Being Used to Sense Oil in Old Oil Wells


Hydrophilic (water soluble) carbon clusters are being designed by Rice researchers to sense the presence of oil that remains in old wells. The HCCs are sheets of carbon one atom thick and 60 nanometers long, with embedded molecules that will detect oil, sulfur and water and store information about how much of each they encounter along their path.

Groups led by Rice professors James Tour, Michael Wong and Mason Tomson and Rice researcher Amy Kan are collaborating on a system by which hydrophilic carbon clusters (HCC) -- microscopic entities designed to sense the presence of oil -- can be sent into a well by the billions and come back to the surface full of valuable information.

"Generally, 30 to 50 percent of the oil in a well is left downhole, because they don't know it's there or don't know exactly where it is," said Tour, the Chao Professor of Chemistry and professor of mechanical engineering and materials science and of computer science.

The team's solution is to send tagged macromolecular clusters that can pass through the deposits, mixed with saltwater or other fluids, into a well. The researchers can collect and analyze them after they return to the surface.

"Inside our clusters are small molecules that will report to us whether they've seen oil or water and how much of each along their paths," Tour said. "We put a trillion of these downhole, and we'll analyze 100,000 at the other end to get an average of what they've seen."

Wong said, “We are chemically constructing these nano-sized clusters to be able to handle being exposed to high temperatures, pressures and salty conditions found in a reservoir.”

Though the clusters may take time to work themselves through the subsurface rock, they come back to the surface full of good information that may take no more than a day to analyze.

Custom versions of these "smart" clusters will be able to give the specifics of what's in a well, he said. "We can tag them differently, much like having an internal bar code," said Tour, who suggested regularly pumping HCCs into a well could provide constant monitoring of its status.

"We've got a long way to go before we know for sure if it works," said Tomson, whose lab is working to prove the concept this summer. "Within six to nine months after that, we'll have a pretty good idea of whether we're on to something."

The HCCs could be ready for oil field testing in a year or two, but commercialization is going to depend on the industry's willingness to invest. "It's potentially a very high-visibility project," said Tomson, who also directs the Brine Chemistry Consortium of oil production and service companies. "It's just the kind of thing they would be excited about."


Other drilling techniques would then be used to recover the oil.

BP (British Petroleum) says advancements in technology — including gas injection — will likely allow it to recover 60 percent of the oil at its massive Prudhoe Bay field in Alaska. The original estimate three decades ago was 40 percent.



The nanoreporters would provide more details on where to drill and where it is worthwhile to drill.

FURTHER READING

Enhanced oil recovery could help get an increase of 17 million barrels of oil per day in North America

Computer-generated reservoir models can be an important part of accessing 218 billion barrels in old wells in America.

Space Elevator Games In final Prep for August 5-14, 2009 : Delayed at Least 4 weeks

The Space Elevator Games for 2009 were delayed, but competitors are on site and final preparations are occurring.

UPDATE: The space elevator games are delayed at least 4 weeks.

See the Space Elevator blog to track events.

Power beaming Competition
Sept or October 2009 (originally August 5, 2009)
NASA Dryden Flight Research Center,
Edwards Air Force Base, Mojave, CA
Teams [down to 3 or 4]
[KCSP, LM, USST, NSS, McGill, U MICH]
1 km vertical raceway,
laser-powered vehicles
$2,000,000 Total prize purse (two levels)

7/09: Teams arrive for setup (Dryden)
7/21 - 7/23: Laser tests (Dryden)
7/22: Test flights (Dryden)
7/17: SE day at SFF conference (Ames)
8/14: Tether Challenge (Seattle)
8/13 - 8/17: Space Elevator Conference (Seattle)
TBA: Power Beaming Challenge (Dryden)

From July 19, 2009

Laser Clearinghouse (LCH) inspection went off without a hitch - with flying colors, actually. We now have KCSP (Kansas City Space Pirates), USST (University of Saskatchewan) and LM (Laser Motive) all ok’d to proceed.

As of Monday, NSS, McGill, and U Michigan have not met the qualification deadline - they are all in very advanced stages, and probably could qualify if the games were held a month from now, but we had to make the call, and so they will not be competing this year.

UAlberta is due here Tuesday morning, and we’ll be able to evaluate them then. KCSP and USST will head out for more testing tomorrow morning (KCSP qualified 2 weeks ago, and USST will aim to qualify over the next two days).


Three teams are definitely in: the Kansas City Space Pirates, LaserMotive, and USST (University of Saskatchewan).

University of Alberta was on site and was still being qualified as of last report.



There do not seem to be any teams entered for the tether competition this year.







July 29, 2009

Heat Transfer Can Be 1,000 Times Greater than Planck's Law Predicts at the Nanoscale


Courtesy / Sheng Shen: A diagram of the setup, including a cantilever from an atomic force microscope, used to measure the heat transfer between objects separated by nanoscale distances.

A well-established physical law, Planck's law, describes the transfer of heat between two objects, but some physicists have long predicted that the law should break down when the objects are very close together. MIT researchers have determined that heat transfer can be 1,000 times greater than the law predicts.

The new findings could lead to significant new applications, including better design of the recording heads of the hard disks used for computer data storage, and new kinds of devices for harvesting energy from heat that would otherwise be wasted.

By using the glass (silica) beads, they were able to get separations as small as 10 nanometers (10 billionths of a meter, or one-hundredth the distance achieved before), and are now working on getting even closer spacings.
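For context, the far-field limit that Planck's law (via the Stefan-Boltzmann relation for blackbodies) sets on radiative exchange can be estimated with a quick sketch. The temperatures here are illustrative assumptions, not the values from the MIT experiment:

```python
# Far-field baseline: the Stefan-Boltzmann (blackbody) limit that
# ordinary Planck radiation sets on heat exchange between two surfaces.
# Temperatures below are illustrative assumptions only.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def blackbody_exchange(t_hot_k, t_cold_k):
    """Net radiative flux in W/m^2 between two ideal blackbody surfaces."""
    return SIGMA * (t_hot_k**4 - t_cold_k**4)

far_field = blackbody_exchange(400.0, 300.0)
print(far_field)         # ~992 W/m^2 in the far field
print(1000 * far_field)  # scale of the near-field enhancement at ~10 nm gaps
```

The MIT finding is that at separations of tens of nanometers, evanescent-wave coupling lets the actual transfer exceed this far-field limit by a factor of order 1,000.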

The new findings could also help in the development of new photovoltaic energy conversion devices to harness photons emitted by a heat source, called thermophotovoltaics, Chen says. "The high photon flux can potentially enable higher efficiency and energy density thermophotovoltaic energy converters, and new energy conversion devices," he says.

Micron gap thermal photovoltaics were described at nextbigfuture.




Nanometer gap thermal photovoltaics could be super-efficient for energy conversion

Nanokites Could Enable Large Scale Production of Carbon Nanotubes of Unlimited Length


Picture credit: SpringerLink, Nano Research journal

Hauge's Rice University team describes a method for making "odako," bundles of single-walled carbon nanotubes (SWNT) named for the traditional Japanese kites they resemble. It may lead to a way to produce meter-long strands of nanotubes.

Hauge's new method creates bundles of SWNTs that are sometimes measured in centimeters, and he said the process could eventually yield tubes of unlimited length.

Large-scale production of nanotube threads and cables would be a godsend for engineers in almost every field. They could be used in lightweight, superefficient power-transmission lines for next-generation electrical grids, for example, and in ultra-strong and lightning-resistant versions of carbon-fiber materials found in airplanes. Hauge said the SWNT bundles may also prove useful in batteries, fuel cells and microelectronics.

To understand how Hauge makes nanokites, it helps to have a little background on flying carpets.

Last year, Hauge and colleagues found they could make compact bundles of nanotubes starting with the same machinery the U.S. Treasury uses to embed paper money with unique markings that make the currency difficult to counterfeit.

Hauge and his team -- which included senior research fellow Howard Schmidt and Professor Matteo Pasquali, both of Rice's Department of Chemical and Biomolecular Engineering; graduate students Pint and Sean Pheasant; and Kent Coulter of San Antonio's Southwest Research Institute -- used this printing process to create thin layers of iron and aluminum oxide on a Mylar roll. They then removed the layers and ground them into small flakes.

Here's where the process took off. In a mesh cage placed into a furnace, the metallic flakes would lift off and "fly" in a flowing chemical vapor. As they flew, arrays of nanotubes grew vertically from the iron particles in tight, forest-like formations. When done cooking and viewed under a microscope, the bundles looked remarkably like the pile of a carpet.

While other methods used to grow SWNTs had yielded a paltry 0.5 percent ratio of nanotubes to substrate materials, Hauge's technique brought the yield up to an incredible 400 percent. The process could facilitate large-scale SWNT growth, Pint said.

In the latest research, the team replaced the Mylar with pure carbon. In this setup, the growing nanotubes literally raise the roof, lifting up the iron and aluminum oxide from which they’re sprouting while the other ends stay firmly attached to the carbon. As the bundle of tubes grows higher, the catalyst becomes like a kite, flying in the hydrogen and acetylene breeze that flows through the production chamber.

Hauge and his team hope to follow up their work on flying carpets and nanokites with the holy grail of nanotube growth: a catalyst that will not die, enabling furnaces that churn out continuous threads of material.

"If we could get these growing so they never stop – so that, at some point, you pull one end out of the furnace while the other end is still inside growing – then you should be able to grow meter-long material and start weaving it," he said.




Odako growth of dense arrays of single-walled carbon nanotubes attached to carbon surfaces

The full nine page paper is available as a pdf here

3 pages of supplemental material is here

A novel process is demonstrated whereby dense arrays of single-walled carbon nanotubes (SWNT) are grown directly at the interface of a carbon material or carbon fiber. This growth process combines the concepts of SWNT tip growth and alumina-supported SWNT base growth to yield what we refer to as “odako” growth. In odako growth, an alumina flake detaches from the carbon surface and supports catalytic growth of dense SWNT arrays at the tip, leaving a direct interface between the carbon surface and the dense SWNT arrays. In addition to being a novel form of SWNT array growth, this technique provides a route toward future development of many important applications for dense aligned SWNT arrays.




FURTHER READING

Synthesis of High Aspect-Ratio Carbon Nanotube “Flying Carpets” from Nanostructured Flake Substrates

Role of Water in Super Growth of Single-Walled Carbon Nanotube Carpets





Unconventional Natural Gas Reserves

The Potential Gas Committee, a group of academics and industry specialists supported by the Colorado School of Mines, reports the largest increase in natural-gas reserves in its 44-year history. Estimated reserves rose to 2,074 trillion cubic feet (Tcf) in 2008 from 1,532 Tcf in its 2006 report.

UPDATE: Robert Rapier calculated that if the Potential Gas Committee's estimate of 2,074 Tcf is correct, then converting all cars in the USA to natural gas would provide 43 years' worth of oil imports for the USA.
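As a rough sanity check on what 2,074 Tcf means, one can divide by annual consumption. The consumption figure below is an assumption (US natural gas use of roughly 23 Tcf per year, the approximate 2008 rate), not a number from the report:

```python
# Rough resource-lifetime check. Assumed figure: US natural gas
# consumption of roughly 23 Tcf per year (approximate 2008 rate).

RESOURCE_TCF = 2074
ANNUAL_USE_TCF = 23.0

years_at_current_use = RESOURCE_TCF / ANNUAL_USE_TCF
print(round(years_at_current_use))  # ~90 years at current gas consumption
# Rapier's 43-year figure is lower because converting the car fleet to
# natural gas would add large new demand on top of current consumption.
```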

Most of this increase resulted from development of a technique known as “hydraulic fracturing” where water is injected via special “wells” to shatter underground shale formations and release trapped gas.

Not included in the committee’s “reserves” are the discoveries in “unconventional resources” that are becoming technologically practical to tap. One of these is the presence of “geopressurized zones” with gas at depths on the order of 25,000 feet found on the Gulf Coast of the United States. Experts put these reserves at 5,000 to 49,000 Tcf. Beyond that are the methane hydrates on the seafloor, which, if not closed off to the United States by a Law of the Sea Treaty, could provide an estimated 7,000 to 73,000 Tcf.


More details from the Potential Gas Committee report are at the Colorado School of Mines.

Multistage fracturing success has been previously reported here

Three years ago, Packers could insert a half-dozen or so "stages" into a single well. As horizontal wells got longer, that number has grown to 22, and Themig says new advancements will allow virtually "unlimited" stages in a single well. That, in turn, has resulted in an order-of-magnitude higher production for a basic well that costs only about twice as much to drill.

The average conventional gas well in Western Canada produces about 250,000 cubic feet of gas a day. EnCana Corp. CEO Randy Eresman said in releasing the company's second-quarter results this week that its latest Horn River wells that use the multistage technology are coming on at initial rates of up to 11 million cubic feet per day.


A new estimate for the producable yield of Marcellus shale is 500 trillion cubic feet (tcf).



Unconventional gas resources are discussed at this dedicated ugresources website.

Canadian unconventional gas potential is huge, with resource estimates of over 2,000 Tcf, excluding hydrates.



Unconventional Gas Resource Estimates for the USA:
Tight Sand: 1200+ Tcf
Coalbed Methane: 1200+ Tcf
Shale Gas: 1100+ Tcf

FURTHER READING

New Energy and Fuel reports on a different new fracturing method by Exxon for accessing unconventional natural gas.

Multiplex Automated Genome Engineering : Accelerating Evolution Millions of Times to Make Biotech Factories in Days

In the Journal Nature: Programming cells by multiplex genome engineering and accelerated evolution

Researchers created over 4.3 billion combinatorial genomic variants (of E. coli) per day. They isolated variants with a more than fivefold increase in lycopene production within 3 days, a significant improvement over existing metabolic engineering techniques.

Multiplex automated genome engineering (MAGE) is used for large-scale programming and evolution of cells. MAGE simultaneously targets many locations on the chromosome for modification in a single cell or across a population of cells, thus producing combinatorial genomic diversity. Because the process is cyclical and scalable, the researchers constructed prototype devices that automate the MAGE technology to facilitate rapid and continuous generation of a diverse set of genetic changes (mismatches, insertions, deletions). They applied MAGE to optimize the 1-deoxy-D-xylulose-5-phosphate (DXP) biosynthesis pathway in Escherichia coli to overproduce the industrially important isoprenoid lycopene.


Researchers (Harvard Medical School, MIT and the Georgia Institute of Technology) rapidly turn bacteria into biotech factories.

The E. coli bacterium contains approximately 4,500 genes. The team focused on 24 of these—honing a pathway with tremendous potential—to increase production of the antioxidant, optimizing the sequences simultaneously. They took the 24 DNA sequences, divided them up into manageable 90-letter segments, and modified each, generating a suite of genetic variants. Next, armed with specific sequences, the team enlisted a company to manufacture thousands of unique constructs. The team was then able to insert these new genetic constructs back into the cells, allowing the natural cellular machinery to absorb this revised genetic material.

Some bacteria ended up with one construct, some ended up with multiple constructs. The resulting pool contained an assortment of cells, some better at producing lycopene than others. The team extracted the best producers from the pool and repeated the process over and over to further hone the manufacturing machinery. To make things easier, the researchers automated all of these steps.

"We accelerated evolution, generating as many as 15 billion genetic variants in three days and increasing the yield of lycopene by 500 percent," Harris says. "Can you imagine how long it would take to generate 15 billion genetic variants with traditional cloning techniques? It would take years."
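The diversify-and-select cycle described above can be caricatured in a few lines of code. This is purely an illustrative toy: the 24 targeted sites are from the article, but the fitness function, mutation model, and population sizes are invented, and real MAGE operates on oligonucleotide pools in living cells:

```python
import random

# Toy caricature of the MAGE diversify-and-select cycle (illustrative
# only). Each round mutates a population of "genomes", scores them on a
# made-up yield function, and keeps the best producers.

random.seed(0)
GENOME_SITES = 24   # the 24 targeted sequences
POP, ROUNDS = 200, 15

def yield_score(genome):
    # Hypothetical fitness: "lycopene yield" rises with beneficial variants.
    return sum(genome)

population = [[0] * GENOME_SITES for _ in range(POP)]
for _ in range(ROUNDS):
    # Mutagenesis: each site flips to the variant allele with 10% chance.
    for g in population:
        for i in range(GENOME_SITES):
            if random.random() < 0.1:
                g[i] = 1 - g[i]
    # Selection: keep the best half, then copy each survivor to refill the pool.
    population.sort(key=yield_score, reverse=True)
    population = [g[:] for g in population[:POP // 2] for _ in range(2)]

best = max(yield_score(g) for g in population)
print(best)  # climbs toward GENOME_SITES as rounds of selection accumulate
```

The speed of MAGE comes from running this kind of cycle continuously in hardware on billions of cells at once, rather than cloning one variant at a time.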





Wired magazine has a good diagram and article on the work.

The technique could also be used to design models of diseases — in tissue cultures, or animals — that have large-scale genomic changes.

Church said that MAGE might end up being more useful than building entire genomes from scratch. That approach is flashy and powerful, but unnecessarily complicated.


11 pages of supplemental information.

Phil Bowermaster on Transhuman/Futurist Political Discussion

Phil Bowermaster who blogs at the Speculist has a good comment at Michael Anissimov's Accelerating Future.

The article is in response to some posts from John Hughes and Mike Treder.

John Hughes
Peter Thiel, the 99% funder of SIAI, is a raving right-wing anarcho-capitalist, who supports Republicans for public office and sits on the Hoover Institution board, and that your founder and head guru, Eli, is also a libertarian of some vague sort.


Mike Treder slams libertarians, Peter Thiel and Libertarianism in an article at the blog of the Institute for Ethics and Emerging Technologies.

The IEET is pro-human enhancement and "technoprogressive."

Mike Treder's article was a reaction to this piece by Peter Thiel.

In our time, the great task for libertarians is to find an escape from politics in all its forms — from the totalitarian and fundamentalist catastrophes to the unthinking demos that guides so-called “social democracy.”

Peter Thiel describes using cyberspace, seasteading and outer space as technological means to achieve escape (or perhaps a means to get to small groups that unanimously choose to have a particular system) from politics.


My comment in the Accelerating Future thread was to indicate that Peter Thiel is the largest donor to SIAI but not anywhere near 99%.

Libertarian defined at wikipedia shows that there are many flavors of libertarian and some of those are not compatible.

Mike Treder seems to have a beef with propertarian libertarianism.

From Phil Bowermaster:
The rise of the blogosphere and sites like Daily Kos and Free Republic have established a new “accelerated” rhetorical framework for politics which now seems to be more or less universally applied. The basic assumption behind the framework is that there is Our Group and then there is the Other. Any ideas from the Other are subjected to a three-step analysis and response:

1. Hysteria / overreaction

2. Vilification

3. Condemnation

Personally, I’d like to see a group such as IEET take a different approach. Maybe they could look for some kind of, oh I don’t know, Middle Way that transcends opposites?




Forgive my reductionism, but there will always be tension between those who believe that the good of the individual is primary and that the good of the group must be subordinated to it, and those who believe that the good of the group is primary and that the good of the individual must be subordinated to it. A working system (as opposed to a lofty set of ideological propositions) will inevitably consist of a series of trade-offs between those two. Technology has the potential to ease the impact of some of these trade-offs, and even replace them with new trade-offs, but the tension will never completely go away.

Even without Michael’s super-intelligences (which will show up sooner or later) the introduction of an open-source universal assembler enabled by nanotechnology and potent narrow AI could do significantly more to liberate the world’s poor than any trickle-down economic growth model or redistributionist scheme. When technology trumps political theory, I go with the technology. The vital question: would such technology be made available through some big government push or through private efforts?

Either. Both. Neither. Take your pick. Maybe if we find a way to talk with each other about these things like reasonable people we’ll come up with a completely new model that’s better than anything we’ve tried before.


It would be better to find ways to use technology to cut political Gordian knots.
If the left 10% and right 10% of the American political spectrum are virtually unable to talk to each other, then it seems better to use technology that lets them peacefully disagree. There are wider political gaps in the world than the gap between the American left and right. Requiring unanimous agreement is unworkable.

FURTHER READING

Noam Chomsky at wikipedia.

Peter Thiel at wikipedia.

Note: The debate in the comment section here and in the comment section of the IEET post by Mike Treder seems irreconcilable: the American hard left and libertarian socialist/liberal democrat positions versus the libertarian capitalist position. This reinforces the suggestion to use technology within social systems that enable peaceful agreement to disagree.

Glenn Reynolds (Instapundit) also wrote a response to Mike Treder's piece.

Libertarian Transhumanism at wikipedia. Note: Glenn Reynolds is listed as a primary advocate of this position.

July 28, 2009

Air Force Future UAV munitions and One Pilot Flying 12 UAVs


The Air Force Research Lab (AFRL) is also developing munitions systems for UAVs [Unmanned Air Vehicles].


One is a precision missilelike bomb for urban strikes that could be mounted on multiple platforms. Designed to cut down on collateral damage, Suburb Warrior could get a flight test as early as 2014.

An integrated submunition guidance system called Sniper will allow UAV operators to target up to four enemies simultaneously inside urban environments. Flight tests are due by 2011, and the system could be integrated onto UAVs and long-range cruise missiles, according to AFRL.

The Tube Launched Expendable UAS (TLEU) will be launched in-flight by another aircraft. The missilelike weapon will have a warhead as well as a sensor that sends back a feed to provide situational awareness to troops. The TLEU will be launched off a gunship and is scheduled to reach initial operational capability by 2014.

Finally, the AFRL outlines the steps the service is taking toward having pilots fly multiple UAVs, as well as multiple types of UAVs, at a time. Research by the Massachusetts Institute of Technology's Humans and Automation Lab shows a single pilot could fly up to 12 aircraft at the same time.

Future Fuel Efficient Airplanes

1. GE Aviation is advancing jet propulsion and its next-generation engine core program, called eCore, through several private- and government-funded R&D programs, many with key technology milestones this year.

General Electric is working on HEETE (Highly Efficient Embedded Turbine Engine), a three-year program sponsored by the USAF that focuses on embedded technologies for the endurance and range of future intelligence, surveillance and reconnaissance, tanker, mobility and unmanned combat air vehicles. The first phase will fund development of an ultra-high-pressure-ratio compressor and associated thermal management technologies - potentially the centerpiece of GE's next compressor system. Along with a new high-pressure turbine, HEETE aims to provide a 25 percent improvement in fuel burn at a 70:1 overall pressure ratio in a full engine. GE has completed the detailed design and is procuring a compressor rig to run in 2010.
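The payoff of pushing overall pressure ratio that high can be sketched with the ideal Brayton cycle, where thermal efficiency depends only on the pressure ratio and the gas's heat capacity ratio. This is a textbook idealization for illustration only: the 70:1 figure comes from the program description above, but the comparison ratios and the ideal-cycle assumption are mine, not GE's analysis.

```python
def brayton_efficiency(pressure_ratio, gamma=1.4):
    """Ideal Brayton cycle thermal efficiency: eta = 1 - r**(-(gamma-1)/gamma).
    gamma=1.4 is the heat capacity ratio of air; real engines fall well short
    of this ideal, but the trend with pressure ratio is the same."""
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

# Compare a typical current pressure ratio with HEETE's 70:1 target.
for r in (30, 40, 70):
    print(f"pressure ratio {r}:1 -> ideal thermal efficiency {brayton_efficiency(r):.1%}")
```

Going from roughly 30:1 to 70:1 raises the ideal efficiency from about 62% to about 70%, which is why the ultra-high-pressure-ratio compressor is the centerpiece of the program.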

The target is jet engines that are 35% more fuel efficient by 2021.



2. LEAP-X: The first version of eCore runs at mid-year as part of CFM International's (a 50/50 joint company of GE and Snecma) LEAP-X engine program, a new turbofan for future replacements of current narrow-body aircraft. With the first core running this year, GE and Snecma are targeting a run of a full demonstrator engine in 2012, incorporating technologies developed over three years as part of the LEAP56 technology program [with possible certification in 2016]. The LEAP-X is rooted in advanced aerodynamics and materials technologies, such as ceramic matrix composites (CMCs) and titanium aluminide. This new turbofan will reduce the engine contribution to aircraft fuel burn by up to 16 percent compared to the current CFM56 Tech Insertion engines powering Airbus A320 and Boeing Next-Generation 737 aircraft. Additional fuel-burn improvements will be achieved once the engine is paired with new aircraft technology.
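How an engine-level fuel-burn reduction propagates to mission fuel can be sketched with the Breguet range equation. All numbers below (cruise speed, lift-to-drag ratio, specific fuel consumption, mission range) are illustrative assumptions, not CFM data; only the 16 percent figure comes from the article.

```python
import math

def fuel_fraction(range_m, tsfc, speed=230.0, lift_to_drag=17.0, g=9.81):
    """Fuel burned as a fraction of takeoff weight for a cruise of the given
    range, from the Breguet range equation. tsfc is thrust-specific fuel
    consumption in kg/(N*s); speed in m/s. Illustrative values only."""
    exponent = range_m * g * tsfc / (speed * lift_to_drag)
    return 1.0 - math.exp(-exponent)

baseline_tsfc = 1.56e-5               # ~0.55 lb/(lbf*h), a typical turbofan value
improved_tsfc = 0.84 * baseline_tsfc  # 16 percent lower engine fuel burn

base = fuel_fraction(3.0e6, baseline_tsfc)  # 3000 km mission
new = fuel_fraction(3.0e6, improved_tsfc)
print(f"fuel fraction: {base:.3f} -> {new:.3f} ({1 - new / base:.1%} less fuel)")
```

Because the Breguet relation is exponential, the mission fuel saving comes out slightly below the 16 percent engine-level figure; pairing the engine with better airframe aerodynamics (higher L/D) compounds the savings, which is what the last sentence above alludes to.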

3. FATE (Future Affordable Turbine Engine): A follow-on to the AATE (Advanced Affordable Turbine Engine) program, the U.S. Army's FATE program focuses on a 7,000-shaft-horsepower-class engine to power future heavy-lift helicopters. Goals include a 35 percent improvement in fuel efficiency, a 20 percent reduction in development costs, a 45 percent improvement in maintenance costs and a 90 percent improvement in power-to-weight ratio. GE will test advanced materials and pursue aerodynamic improvements for high pressure ratios. Competitions for component programs are under way. In September, GE received a contract for turbine cooling technology. A second round of contracts will be awarded this year to develop a compressor, followed by a four-year technology program scheduled to begin in 2012.


4. INVENT (INtegrated Vehicle ENergy Technology): The USAF Research Laboratory's INVENT program is studying next-generation military electric power and thermal management systems for aircraft with integrated hybrid-electric system architectures. Goals include a 10 to 15 percent extension of range and endurance, a 10 to 30 percent increase in power and thermal capacity, and lifecycle cost reductions. GE's contracts involve preliminary designs of possible adaptive power and thermal management systems and robust electric power systems for possible integration into tactical, unmanned and long-range strike platforms. An integrated ground demonstration is scheduled for 2012, with flight demonstrations planned for 2015.


5. Future Vehicle Aircraft Research (N+3 Designs): NASA has contracted GE to study concepts for commercial aircraft 25 to 30 years from now. The concepts are called N+3, denoting technologies three generations beyond today's aircraft. They face significant performance and environmental challenges set by NASA, including an 80-decibel reduction in noise below the current Stage 3 limit; more than 80 percent lower NOx emissions relative to CAEP 2; a 70 percent improvement in fuel burn; and the ability to operate from small airports. GE, the Georgia Institute of Technology and Cessna Aircraft Company will take an integrated airframer-and-propulsion-system design approach to analyze a 10- to 30-passenger aircraft that can fly point-to-point service between small community airports. Potential designs include a traditional ducted turbofan and open-rotor or unducted-fan engine designs.

6. Open Rotor: Last fall, GE announced a joint study with NASA related to an open rotor or unducted fan engine design. In the 1980s, GE successfully ground-tested and flew an open-rotor engine that demonstrated dramatic fuel savings. Since then, GE has advanced its data acquisition systems and computational tools to better understand open-rotor systems. GE also gained extensive experience with composite fan blades in its GE90 engine and GEnx engine. This year, GE and NASA will conduct wind tunnel tests, using a component rig, to evaluate subscale counterrotating fan blade designs and systems. Snecma (SAFRAN Group), GE's longtime 50/50 partner in CFM International, will participate in fan blade design testing.

7. Fuel additives made of tiny particles known as nanocatalysts can help supersonic jets fly faster and make diesel engines cleaner and more efficient.

A Princeton-led team has proposed a solution based on the use of graphene -- molecular sheets of carbon atoms. In 2003, Professor Ilhan Aksay and his chemical engineering colleague, Professor Robert Prud'homme, developed the first commercially viable technique for making graphene, using a chemical process to split graphite into its ultrathin individual sheets. The resulting flakes are 200 to 500 nanometers wide, making the largest of them about one-hundredth the width of an average human hair. When small amounts are added to liquid fuels, they lower the temperature at which the fuel ignites. The catalyst might also be used to reduce the amount of nitric oxide produced by diesel engines or to accelerate soot oxidation rates, which could reduce pollution and fuel use.
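The size comparison in that paragraph checks out with simple unit arithmetic; the ~50 micrometer hair width is a common rule-of-thumb figure assumed here, while the flake sizes come from the article.

```python
# Quick unit check on the "one-hundredth of a hair" claim.
flake_nm = 500             # largest graphene flake width, nanometers (from article)
hair_nm = 50 * 1000        # assumed ~50 micrometer hair width, in nanometers
print(flake_nm / hair_nm)  # -> 0.01, i.e. one-hundredth of a hair width
```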


41 page pdf: Environmentally Responsible Aviation Technical Overview