October 24, 2008

Update of the Mundane Singularity: Tech Singularity Without Molecular Nanotechnology and Artificial General Intelligence Triggers

The Singularity Summit is being held this weekend in San Jose. Let us revisit the idea of the technological singularity and the mundane singularity. Later today there will be coverage of the Singularity Summit talks.

Ray Kurzweil defines the singularity as a period of extremely rapid technological progress.

Robin Hanson, an economist, proposes that multiple "singularities" have occurred throughout history, dramatically affecting the growth rate of the economy. Like the agricultural and industrial revolutions of the past, a technological singularity would increase economic growth rates by 60 to 250 times.

A technological singularity and transhumanism are often criticized because the primary technologies that would enable them, Molecular Nanotechnology and greater-than-human-intelligence general AI, are believed by some to be impossible.

This site considered how much of the projected benefits of a technological singularity could be achieved even without Molecular Nanotechnology and Artificial General Intelligence as the technology triggers.

How much of the following could be achieved:
1. Economic abundance
2. Radical life extension
3. Physical and Cognitive enhancement
4. Blood Stream Robots
5. Supermaterials
6. Open Access to space
7. Pollution elimination
8. Computer Advancement
9. Shape changing functional devices like utility fog

This site does not agree that Molecular Nanotechnology (MNT) is unachievable, or that greater-than-human-intelligence AI is unachievable.

1. Relative economic abundance, with every living person having personal resources at the affluent level of a current US citizen: $250,000 per person per year in purchasing power parity income [the income level that Obama would want to tax more heavily if he becomes President]. No shortages of any basic need: water, food, medical care [equal to the medical results currently affordable to an affluent person], and energy. [A US citizen currently uses an average of 13,000 kwh per year for electricity, three times that for transportation, and a share of industrial energy usage, so abundance is about 100,000 kwh for every person; assuming a future population of 10 billion, that is 1000 trillion kwh per year.]
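The energy arithmetic behind the abundance target can be checked with a quick sketch. The per-person figure and the population are the assumptions stated in the text above:

```python
# Sketch of the energy-abundance arithmetic from the text.
# Assumptions (from the article): ~100,000 kWh per person per year,
# and a future population of 10 billion.
kwh_per_person_per_year = 100_000
population = 10_000_000_000

total_kwh = kwh_per_person_per_year * population
print(f"Total: {total_kwh:.3e} kWh/year")
print(f"That is {total_kwh / 1e12:,.0f} trillion kWh/year")
```

The product comes out to 10^15 kWh per year, i.e. the 1000 trillion kwh figure used throughout this post.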

A manufacturing and construction revolution can be achieved with printable buildings, inflatable electric cars, printable electronics and advanced automated rapid manufacturing.

RECENT: Contour crafting for printing buildings has been funded by Caterpillar, Inc [the world's largest manufacturer of construction equipment].

Computer simulation, detailed modeling, and other enabling technologies will drive this revolution.

Paper made from plant cellulose that is stronger than cast iron is here and will make manufacturing far cheaper.

Stem cell meat factories, advanced aquaculture, vertical farming, and more advanced genetically engineered food will enable an abundance of food. Vertical farming would be further enabled by the building-printing technology.

Aquaculture (fish farming) already provides over half of the world's fish.

Genetically modified fish can grow over twice as fast as regular fish and can enable more productive fish farming.

For water, desalination is already very advanced, and it is becoming more energy efficient and cheaper. More abundant and affordable energy helps create more water from desalination.

The mass-produced uranium hydride nuclear reactor would be part of a relatively mundane energy abundance solution. These reactors would produce far less waste, since 50 times more of the fuel would be burned to generate energy. Molten salt reactors are even more efficient and could burn 99% of the uranium and plutonium in the reactor.

Increasing the current level of nuclear power in the world by 450 times would achieve the 1000 trillion kwh level. Increasing efficiency so that fuel usage is reduced by 30 to 98 times, and being able to use thorium as well as uranium, would ensure that there is sufficient nuclear fuel for the 5-15 times more fuel per year that would be needed. There is uranium in seawater, and Japanese researchers have been able to extract kilograms of it. [Japan is seriously investigating using gene-engineered seaweed for a combination of biofuel and large-scale extraction of uranium from seawater. Seawater uranium would cost more, but fuel costs are only a small percentage of a nuclear plant's operating costs.]

Using deep burn of nuclear fuel and factory mass production of nuclear reactors would allow scaling to 100 times current power usage for twenty thousand years using nuclear fission.
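The nuclear scaling numbers above fit together, which a short sketch can confirm. The only assumption is taking the "450 times" figure at face value to back out current nuclear output:

```python
# Sketch checking the nuclear scaling claims in the text.
# Assumption: "450 times current nuclear" reaches the 1000 trillion kWh
# goal, which implies current nuclear output of about 2.2 trillion kWh/yr.
target_kwh = 1_000e12          # 1000 trillion kWh/year goal
scale_up = 450                 # power output multiplier
current_nuclear_kwh = target_kwh / scale_up
print(f"Implied current nuclear output: {current_nuclear_kwh/1e12:.1f} trillion kWh/yr")

# With deep burn, fuel use per kWh drops 30-98x, so the extra fuel
# needed per year scales as (power multiplier) / (burn efficiency gain):
for burn_gain in (30, 98):
    fuel_multiplier = scale_up / burn_gain
    print(f"Burn efficiency {burn_gain}x -> {fuel_multiplier:.1f}x more fuel/year")
```

450/30 = 15 and 450/98 is roughly 4.6, which brackets the 5-15 times more fuel per year mentioned in the text.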

2. Radical life extension: Achieving actuarial escape velocity [which is not eradicating death but radical life extension], whereby life expectancy increases by more than one year for each year that passes. No deaths caused by age-related disease. An increased level of physical regeneration and restoration. Really bad accidents or destructive weapons would still be able to kill. Advanced technology could create a precise copy of a person, but whether this will be done, for ethical and societal reasons, or whether the copy is the person, is not discussed here. A copy of "the mind" could be created in another substrate (i.e., not a flesh-and-blood person but a computer that simulates "the mind").
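The "actuarial escape velocity" idea can be illustrated with a toy calculation. The starting expectancy and the gain rate below are purely illustrative assumptions, not figures from any study:

```python
# Toy illustration of actuarial escape velocity: if life expectancy rises
# by more than one year per calendar year, the expected remaining life of
# a living person grows over time instead of shrinking.
# Illustrative assumptions: a 40-year-old with 80-year life expectancy,
# and medicine adding 1.5 years of expectancy per calendar year.
def remaining_years(age_now, expectancy_now, gain_per_year, years_ahead):
    expectancy = expectancy_now + gain_per_year * years_ahead
    age = age_now + years_ahead
    return expectancy - age

for t in (0, 10, 20, 30):
    print(f"after {t} years: {remaining_years(40, 80, 1.5, t):.0f} years remaining")
```

With a gain of 1.5 years per year, every year lived adds half a year of expected remaining life; any gain above 1.0 produces this runaway effect, which is the whole point of the concept.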

Calorie restriction mimicking drugs could be available within five years, according to a leading researcher, and should provide 3-13 years of increased life span.

Treatments to boost the human immune system against cancer and effective and cheap early detection of cancer cells will enable a massive decrease in cancer deaths.

The SENS project has raised over ten million dollars and is launching projects for each of the seven parts of the initial program to substantially extend human lives. This would be a major first step on the actuarial escape velocity path.

Regenerative medicine is making substantial advances with stem cells, tissue generation, and increasing the regenerative capability in humans to be more like salamanders (able to regrow limbs). This research is well funded by the US Defense Department through the AFIRM project (Armed Forces Institute of Regenerative Medicine, funded at $250 million over five years).

3. Cognitive and physical enhancement: Enhance various performance aspects of the human body. Various medical and mechanical enhancements will be discussed which will be significant advances over existing performance enhancement.

Effective and safe myostatin inhibition will likely be developed, which would enable most people to become several times stronger and closer to the best levels achievable now (one in one million people already have myostatin naturally inhibited, and the effect is four times that of high doses of steroids).

Cognitive enhancement is already here and will become more effective.

Craig Venter, billionaire and enabler of new gene therapy and synthetic biology technology, has indicated that very strong cognitive enhancement is possible, desirable and a goal that he wants to achieve.

From the Eric Drexler website, things that Molecular Nanotechnology would enable:
- desktop computers with a billion processors
- inexpensive, efficient solar energy systems
- medical devices able to destroy pathogens and repair tissues
- materials 100 times stronger than steel

4. Blood stream robots, or achieving the goals (cellular surgery and repair) for which blood stream robots were proposed using other means [medical devices able to destroy pathogens and repair tissues].

Several major research institutes (including Carnegie Mellon) are making major progress on more capable robots ranging from pill size down to the size of bacteria.

Nanoparticles, existing blood stream robots, guidable containers, and cellular repair techniques are being proven, and people are working to improve and deploy them.

5. Materials 100 times stronger than steel [cheap and commonly used]: Production of, or access to, diamond and carbon nanotubes increased by 1000 times, with diamond used as a primary material for house-sized objects and for electronics.

Carbon nanotube production will be ramped up, becoming very cheap and widely deployed.

Very large (multi-carat) diamonds have been produced very quickly since 2005. With current methods, three-dimensional growth of colorless single-crystal diamond in the inch range (~300 carats) is achievable. Large-scale production and scaling up of diamond creation is an active and well-funded area.

6. Open access to space [within the solar system for human and robotic travel and small probes up to a significant fraction of light speed for interstellar access]

Ten near-term developments for greatly improved space access were covered here.

Mirrored laser arrays are achievable with refinement of current technology as is nuclear propulsion.

7. Pollution "elimination" : Reduction of pollution into the environment and nearly complete elimination of deaths caused by pollution.

The use of uranium hydride and molten salt reactors would greatly reduce the use of fossil fuels.

This site's proposed energy plan is a fast, affordable, and low-technology-development-risk path to eliminating fossil fuels and enabling abundant clean energy.

8. Desktop computers with one billion processors (or performance greater than one billion of today's processors)

New teraflop chips have 500 cores and cost less than $200 per processor.

Berkeley and Tensilica are already working toward energy-efficient and affordable exaflop computers in the 2015-2017 timeframe.

Design conferences have been held to work out details of zettaflop computers.

9. Shape changing functional devices like utility fog

Claytronics has been funded by Intel.

Todd Mowry, leading the work on claytronics, claims he will produce a "3-D fax" by 2012.

RECENT: Today I saw slides from Justin Rattner, senior fellow and vice president of the corporate technology group at Intel, showing millimeter-sized 2D and 3D claytronic catoms. They have built 2D and 3D millimeter-scale catoms that can move with electrostatics. "And we'll go from millimeters to microns, I guess, some time over the next five to 10 years," Rattner said.

Precise three-dimensional manufacturing is progressing.

So how much of the key goals of a transhuman singularity can be achieved without full-blown molecular nanotechnology, AGI, or fusion? Quite a bit, which is why the real deal with molecular nanotechnology, AGI, and fusion will be really impressive. The mundane technological singularity shows the kinds of societal shifts that will be needed in order to fully take advantage of the upside. A lot of systems and processes have to be redesigned. The mundane singularity still delivers improvements of 100 to 1000 times in production and various capabilities.

Blacklight Power Providing Info and Assistance in Understanding their Work

Derek Lowe, a PhD in chemistry, has gotten responses to some of his questions from Blacklight Power. Derek is skeptical but is trying to gain a deeper understanding of what is being done. Blacklight Power seems to be open to providing information and assistance to Derek.

Mills has been good enough to offer to help me out with any aspects of the data that they’ve published, and to get in contact with the company should I be in the area, which is a good sign, and much appreciated. They’re also supposed to have a video of the reaction up shortly, and we’ll see what we can learn from that as well. Against all this, I have to put the fact that I still find the physics behind the company quite odd and improbable. And one has to remember that the track record of odd, improbable physics breakthroughs that promise huge supplies of energy is. . .not good. And that’s putting it very mildly indeed.

But all it takes is one. And all Blacklight has to do to quiet the skeptics (many of whom are much more vitriolic than I am) is to throw that big switch at some point and have the kilowatts (or megawatts) come streaming out. That’ll do it, for sure, and the company assures everyone that this is their goal. I wish them luck with it, because a huge and unexpected new source of energy would be a good thing indeed. I’m actually glad to live in a country where ideas this wild can raise tens of millions of dollars, but (for the time being) I’m also glad that none of that money is mine.

Blacklight doesn't seem to be trying to extract money from the general public.

I’ve heard from some folks at Blacklight Power, including their founder, Randell Mills. He says that I have a number of details wrong about their system, and wrote with more information. I’ll quote from Mills:

”We do not add water to R-Ni. Any water present after drying is in the form of Bayerite or Gibbsite (Al(OH)3) which is quantified by XRD and TPD. Regarding the Rowan University team validation, the maximum theoretical heat from the measured content was 1% of the observed energy as stated with the analytical results given in the Rowan report which is on-line at our website.”

He also takes exception – as well he might – to my line about the correlation of the company’s activities to their fund-raising needs, stating that Blacklight currently has no need to raise any money at all. And as for the NMR figure that I could make no sense of, that appears to have been mislabeled. The one I was looking at, Mills says, is indeed a solution NMR and was actually Figure 45 in the document. Figure 58, he says, has now been fixed, although I have to say that it still looks like a duplicate of Figure 45 this morning at this link.

But as best I understand it now, the fundamental claim of the Blacklight work is that formation of their lower-energy states of hydrogen is extremely exothermic. Alkali metal hydrides, they say, are particularly good catalysts for this, giving you hydrinos and sodium metal (see equations 32 through 34 in their PDF). So the Raney nickel in these experiments is being used as a source of atomic hydrogen, and forming small amounts of sodium hydride on its surface gives you a system to see all this in action. Figure 17 would seem to be one of these, and Figure 21 is the same thing on a kilo scale.


October 23, 2008

Operating the 128 qubit Dwave Quantum Computer

The Dwave quantum computer has a grid of connected qubits and problems need to be mapped onto them to be solved.

What can we say about whether or not we can embed the problem graph into a hardware graph? Here are some things that we know:

1. You can always embed a fully connected graph on N variables into a Chimera graph with N^2 variables.
2. You can never embed a problem graph that has either more vertices or more edges than a hardware graph.
3. Any instance in the middle needs to be determined by solving what is in general a hard problem in its own right: does G_p embed in G? In practice, if you wanted to operate in this regime, you'd probably have a fast heuristic embedder to see if a problem natively embeds.
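The first two observations above are simple enough to state as code. This is a sketch under the stated bounds, not D-Wave's actual tooling; the 352-coupler hardware edge count used in the example is an assumption based on the chip layout described later in this post:

```python
# Sketch of the embedding feasibility checks described above.
import math

def max_complete_graph(num_qubits):
    """Point 1: a K_N always embeds in a Chimera graph of N^2 qubits,
    so num_qubits of hardware guarantees K_floor(sqrt(num_qubits))."""
    return math.isqrt(num_qubits)

def obviously_unembeddable(problem_vertices, problem_edges,
                           hw_vertices, hw_edges):
    """Point 2: a problem with more vertices or more edges than the
    hardware graph can never embed."""
    return problem_vertices > hw_vertices or problem_edges > hw_edges

print(max_complete_graph(128))                      # K_11 guaranteed on 128 qubits
print(obviously_unembeddable(200, 300, 128, 352))   # True: too many vertices
print(obviously_unembeddable(10, 20, 128, 352))     # False: might embed (point 3)
```

Anything that passes the point-2 check but exceeds the point-1 bound lands in the "middle" regime of point 3, where an actual embedding search is needed.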

Based on these observations, there are three obvious ways these chips can be operated.

1. Use the chip as a solver for complete graphs, where the algorithm calling the chip has a hard-coded embedding scheme for any problem edge set. This option has the advantage that it is the most flexible for inclusion in the master algorithm (no restrictions on problem edge set), and there is no runtime cost for embedding. Its key disadvantage is that the number of variables of problems you can solve in this mode is upper bounded by roughly the square root of the number of qubits in hardware, which is a significant cost with the current chips & their low qubit count.

[Noted in comments: 2000 qubits seems to be where operating mode 1 starts to take off. You would have K_{256} and K_{192}.]

2. Use the chip as a solver for problem graphs that exactly match the hardware graph by making this a hard constraint in the master algorithm. In other words, as an algorithm designer you are constrained to only pose problems to the hardware that have problem graphs that exactly match the hardware graph. This has the advantage of having trivial hard-coded embedding and maximally using the resources of the hardware. Its key disadvantage is a severe loss of flexibility in algorithms design possibilities.
3. Use the chip for arbitrary problem sizes by running an embedding heuristic each time a possibly embeddable graph is generated by the master algorithm. This has the advantage of a significant gain in flexibility for algorithm designers. The main disadvantage is the increased runtime burden of having to compute embeddings on the fly.
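Mode 2's constraint, that the master algorithm may only pose problems whose graph exactly matches the hardware graph, amounts to a simple edge-set equality check. A minimal sketch (the graph representation here is my own, not D-Wave's API):

```python
# Sketch of the mode-2 constraint: accept a problem only when its edge
# set exactly matches the hardware edge set. Edges are undirected, so
# each is normalized to a frozenset of its two endpoints.
def normalize(edges):
    return {frozenset(e) for e in edges}

def accepts(problem_edges, hardware_edges):
    """Mode 2: trivial 'embedding' -- identity or nothing."""
    return normalize(problem_edges) == normalize(hardware_edges)

hw = [(0, 4), (0, 5), (1, 4), (1, 5)]
print(accepts([(4, 0), (5, 0), (4, 1), (5, 1)], hw))  # True: same edges
print(accepts([(0, 1)], hw))                           # False: mismatched
```

The check is O(edges), which is why mode 2 has no runtime embedding cost: the burden is shifted entirely onto the algorithm designer.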

My own strong preference is for #2 above. The loss of flexibility in algorithms design is a big problem, but we’ve been able to find ways to build algorithms respecting fixed hardware graphs already so I think that the advantages of #2 carry the day, at least in the short term.

October 22, 2008

Promising obesity drug, twice as effective as current drugs

According to trials, a new obesity drug, Tesofensine, which may be launched on the world market in a few years, can produce weight loss twice that of currently approved obesity drugs.

The drug works by suppressing hunger, leading to an energy deficit which burns off excess body fat. This randomised, placebo-controlled phase II study was done in five Danish obesity management centres, and involved 203 obese patients (body mass index 30-40 kg/m2), weighing a mean of just over 100kg. They were prescribed a limited-energy diet and assigned to tesofensine 0.25mg (52 patients), 0.5 mg (50), 1.0 mg (49), or placebo (52), all once daily for 24 weeks. The primary outcome was percentage change in bodyweight. A total of 161 patients completed the study, and an analysis showed that the mean weight loss recorded for placebo and diet was 2.2kg and for tesofensine 0.25mg, 0.5mg and 1.0mg it was 6.7kg, 11.3kg, and 12.8kg respectively.
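The trial figures above can be turned into percentage terms, which was the study's primary outcome. A quick sketch, taking the "mean of just over 100 kg" as a round 100 kg (an approximation, not a trial statistic):

```python
# Sketch of the drug-attributable weight loss implied by the trial
# numbers in the text. Assumption: mean starting weight of ~100 kg.
placebo_loss_kg = 2.2
doses = {"0.25 mg": 6.7, "0.5 mg": 11.3, "1.0 mg": 12.8}
start_weight_kg = 100.0

for dose, loss in doses.items():
    net = loss - placebo_loss_kg          # loss beyond diet + placebo
    pct = 100 * loss / start_weight_kg    # percent of starting bodyweight
    print(f"{dose}: {loss} kg total (~{pct:.1f}%), {net:.1f} kg beyond placebo")
```

On these assumptions the 1.0 mg dose gives roughly 12.8% total weight loss, about 10.6 kg of it beyond the diet-plus-placebo baseline.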

For the 0.5mg and 1.0mg doses, this represented a weight loss around twice that attained using sibutramine or rimonabant, the currently approved therapies in Europe. Blood pressure was increased in the 1.0mg group. The most common side-effects caused by tesofensine were dry mouth, nausea, constipation, hard stools, diarrhoea, and insomnia.

Plasmonic lithography: potential for 5 nanometer features

Engineers at the University of California, Berkeley, are reporting a new way of creating computer chips that could enable commercial speed 5 nanometer optical lithography. It can also mean higher density hard drives and optical disks with 20 times the density of Blu-ray.

The 5 page research paper: Flying plasmonic lens in the near field for high-speed nanolithography, Published online: 12 October 2008; doi:10.1038/nnano.2008.303.

This low-cost nanofabrication scheme has the potential to achieve throughputs that are two to five orders of magnitude higher than other maskless techniques.

By combining metal lenses that focus light through the excitation of electrons - or plasmons - on the lens' surface with a "flying head" that resembles the stylus on the arm of an old-fashioned LP turntable and is similar to those used in hard disk drives, the researchers were able to create line patterns only 80 nanometers wide at speeds up to 12 meters per second, with the potential for higher resolution [maybe 5 nanometers] detail in the near future.

Currently, the minimum feature size with conventional photolithography is about 35 nanometers, but our technique is capable of a much higher resolution at a relatively low cost.

This technology could also lead to ultra-high density disks that can hold 10 to 100 times more data than disks today.

The engineers designed a silver plasmonic lens with concentric rings that concentrate the light to a hole in the center where it exits on the other side. In the experiment, the hole was less than 100 nanometers in diameter, but it can theoretically be as small as 5 to 10 nanometers. The researchers packed the lenses into a flying plasmonic head, so-called because it would "fly" above the photoresist surface during the lithography process.

The researchers said the flying head design could potentially hold as many as 100,000 lenses, enabling parallel writing for even faster production.

"I expect in three to five years we could see industrial implementation of this technology," said Zhang. "This could be used in microelectronics manufacturing or for optical data storage and provide resolution that is 10 to 20 times higher than current blu-ray technology."

The researchers designed an air bearing that uses the aerodynamic lift force created by the spinning to help keep the two surfaces a mere 20 nanometers apart.

Air bearings are used to create magnetic tapes and disk drives, but this is the first application for a plasmonic lens.

With this innovative setup, the engineers demonstrated scanning speeds of 4 to 12 meters per second.

"The speed and distances we're talking about here are equivalent to a Boeing 747 flying 2 millimeters above the ground," added Zhang. "Moreover, this distance is kept constant, even when the surface is not perfectly flat."

The researchers pointed out that a typical photolithography tool used for chip manufacturing costs $20 million, and a set of lithography masks can run $1 million. One of the reasons for the great expense is the use of shorter light wavelengths to create higher resolution circuitry. Shorter wavelengths require nontraditional and costly mirrors and lenses.

The system described by the UC Berkeley engineers uses surface plasmons that have much shorter wavelengths than light, yet are excitable by typical ultraviolet light sources with much longer wavelengths. The researchers estimate that a lithography tool based upon their design could be developed at a small fraction of the cost of current lithography tools.

Xiang Zhang's lab website. Xiang Zhang has been a leader in metamaterials and nanophotonics.

October 21, 2008

Venture Beat Investigates Blacklight Power

Venture Beat has coverage of the Rowan University study of Blacklight Power.

-Rowan University Prof. Jansson is supplied the Raney nickel by Blacklight Power, which in turn obtains it from an industrial supplier.
-Mills said it is doped with a very small amount of another common material, sodium hydroxide, in a process that others could replicate.
-Jansson has been aware of Blacklight for years, and even acted as an advisor for an energy company that ultimately made a strategic investment, but he appears to have no unethical ties, just an ongoing interest.
-Mills, for his part, says that he’d like for scientists to independently verify every step of the process, from obtaining the Raney nickel and doping it to the calorimeter tests to prove that the energy bursts really exist. The information needed to run those tests is free to the public, he says; the only thing required is a researcher willing to take the time to puzzle through the process.
-The reaction Jansson's team is observing produces only a quick burst of intense heat. In a commercialized process, there needs to be a steady output. Mills says he has purposefully kept knowledge of how to loop the reaction within the company, so that his own researchers can remain a step ahead in their work on the 50KW reactor the company earlier announced.
-According to Mills, it’s likely that a totally independent researcher will verify the whole process within a year. Meanwhile, the company will start licensing out its energy process.

The independent study was covered yesterday at this site.

The latest expected unit costs for the Blacklight power system compared to current energy technology:

The Blacklight hydrogen production plant diagram


Derek Lowe, a blogging PhD organic chemist, is taking a look at the Rowan University work.

This part would appear to be what’s being tested at Rowan:

”To achieve high power, R-Ni having a surface area of about 100 m2/g was surface-coated with NaOH and reacted with Na metal to form NaH. Using water-flow, batch calorimetry, the measured power from 15g of R-Ni was about 0.5 kW with an energy balance of delta-H = -36 kJ compared to delta-H of roughly 0 kJ from the R-Ni starting material, R-NiAl alloy, when reacted with Na metal. The observed energy balance of the NaH reaction was -1.6 x 10 to the 4th kJ/mole H2, over 66 times the -241.8 kJ/mole H2 enthalpy of combustion.”

I'll wait for more details before commenting on this, but it's clearly rather odd. Also in the rather-odd category are some of the figures in the Blacklight PDF - take a look at Figure 58, for example, which is labeled "MAS NMR spectra relative to external TMS Of NaCl, KCl, and CsCl showing the expected trend of increasing intensity of H2 (1/4) at 1.1 ppm relative to the H2 at 4.3 ppm down the column of the Group I elements."

Well, fine - but hold on a minute. MAS is "magic angle spinning", which is a solid-state NMR technique - and that NMR spectrum is clearly taken with a lot of DMF around. The dimethylformamide peaks are labeled as such, and it looks like a solution spectrum, not a solid-state one. Second, where's the trend? I see no series presented, just a single spectrum of something, with no labels to suggest various alkali metals. What's more, although I can't find a value for the NMR chemical shift of hydrogen gas in DMF, it's known to be 4.5 in deuterochloroform, so their 4.3 ppm is reasonable. But there's no peak at 4.3 to compare that big 1.1 ppm peak to - what am I looking at here?

We shall see - maybe. I'll report back if I hear from the group at Rowan. For now, I remain skeptical. I would truly enjoy the discovery a new energy source, but the history of this field does not inspire confidence.
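The "over 66 times" figure in the quoted calorimetry passage is a straightforward ratio and does check out arithmetically (which of course says nothing about whether the measurement itself is valid):

```python
# Checking the "over 66 times" figure from the quoted Blacklight passage.
observed_kj_per_mol_h2 = 1.6e4      # claimed energy balance, kJ/mol H2
combustion_kj_per_mol_h2 = 241.8    # enthalpy of H2 combustion, kJ/mol
ratio = observed_kj_per_mol_h2 / combustion_kj_per_mol_h2
print(f"{ratio:.1f}x the enthalpy of hydrogen combustion")
```

The ratio is about 66.2, matching the claim as stated.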

Here is a peer reviewed paper which indicates that the Blacklight process could work while being consistent with existing physics.

The possible existence of fractional quantum states in the hydrogen atom has been debated since the advent of quantum theory in 1924. Interest in the topic has intensified recently due to the claimed experimental findings of Randell Mills at Blacklight Power, Inc., Cranbury, New Jersey of 137 inverse principal quantum levels, which he terms the “hydrino” state of hydrogen. This paper will show that the general wave equation predicts exactly that number of reciprocal energy states.

The four-dimensional potential equation indicates that fractional quantum states exist. The solution is square integrable, satisfying a fundamental tenet of quantum physics. Mills’ claim of 137 different inverse energy levels seems confirmed, as is Naudts’ relativistic analysis showing that at least one reciprocal state can exist.

Hydrino theory may be compatible with the standard theory of relativistic quantum mechanics.

Here is the paper that is critical of Blacklight Power.

What works in reality will be explained with the right science. If fractional quantum states work, then science will integrate it with everything else that works and move on. So it still all boils down to does this thing work. We will see over the next 1-2 years.

If more funders put more millions into Blacklight Power and it does not work out for them, then so what? Given the trillions being lost now on the belief that advanced physics and math models could change subprime loans into triple-A loans, there are clearly worse consequences to being wrong about math and having misplaced beliefs.

First Technical Preview of Dwave 128 qubit Quantum Computer

Dwave Systems will have a new 128 qubit quantum computer in about two weeks. This was first discussed four days ago on this site along with a review of the background on the Dwave adiabatic quantum computer.

In the systems Dwave builds, the maximum number of connections per qubit is constrained by noise. In order to couple a qubit into a coupler, the qubit needs a certain amount of inductance. This inductance is obtained by increasing the perimeter of the qubit, which increases the noise seen by the qubit. Increasing the noise a qubit sees has several deleterious effects, all of which I will be discussing in later posts. For now let’s just say that the maximum number is 6 connections per qubit without answering the question of why or how to make it better.

Given 6 connections per qubit, what is the “ideal” layout / interconnect scheme? Answering this depends on what as a designer you are trying to optimize. Let’s say that the primary objective is to make a tile-able unit cell with a maximum of 6 connections per qubit. There are several possible ways to do this. The way we settled on is as follows:

The qubits are topologically loops of niobium. They are interrupted in a variety of places by compound Josephson junctions. Imagine drawing four loops schematically like the outlines of four parallel popsicle sticks lying north-south, and then laying down on top of this the exact same structure rotated by 90 degrees. Each “popsicle stick outline” is a single qubit. The points of intersection are where the coupling devices are placed. This unit cell looks physically like the picture on the left, which is identical to the picture on the right, which is also known as the complete bipartite graph on 4 vertices K_{44}.

From comments by Geordie, DWave Systems CTO:
For the 128-qubit chip, you can split the chip into two regions, the 6 upper right hand blocks and the 10 lower left + diagonal blocks. You can embed a K_{8} in the upper right block, and a K_{12} in the lower left+diagonal block. These two complete graphs can be connected in a limited way through the couplers the two sections share. Since the size of the fully connected graphs you can embed in this guy is small, we plan to operate based on algorithms that respect the interconnect structure of the 128-qubit chip, ie. graph embedding can be done at this level but the cost is prohibitive at this stage.

In principle you can continue to tile the plane with unit cells until you (a) run into fab yield limits and/or (b) run out of real estate on the processor die. You could build a 256-qubit chip by tiling two 128-qubit patterns side by side, or a 512-qubit chip by doing a 2×2 tile of the 128-qubit pattern, or a 2,048-qubit chip by doing a 4×4 tile of the 128-qubit pattern. Re. timing on the entanglement results, as soon as possible.

So in this design there are 8 qubits per unit cell, 16 intra-cell couplers per unit cell (the K_{4,4} edges), and 8 inter-cell couplers per unit cell (4 to the right, 4 to the bottom). To make the tiling strategy explicit, here is a 32-qubit example.
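The unit-cell and tiling description above can be sketched as a plain adjacency-list construction. This is my own illustration of the layout as described, not D-Wave's software; qubit numbering is arbitrary:

```python
# Sketch of the Chimera layout described above: K_{4,4} unit cells of
# 8 qubits, tiled on a rows x cols grid, with 4 couplers to the cell
# below and 4 to the cell to the right.
def chimera_edges(rows, cols):
    """Return the coupler list of a rows x cols Chimera graph."""
    def q(r, c, side, i):       # side 0 = "vertical" qubits, 1 = "horizontal"
        return ((r * cols + c) * 2 + side) * 4 + i
    edges = []
    for r in range(rows):
        for c in range(cols):
            # intra-cell: complete bipartite K_{4,4} (16 couplers)
            edges += [(q(r, c, 0, i), q(r, c, 1, j))
                      for i in range(4) for j in range(4)]
            # inter-cell: 4 couplers down, 4 couplers right
            if r + 1 < rows:
                edges += [(q(r, c, 0, i), q(r + 1, c, 0, i)) for i in range(4)]
            if c + 1 < cols:
                edges += [(q(r, c, 1, i), q(r, c + 1, 1, i)) for i in range(4)]
    return edges

e = chimera_edges(4, 4)          # the 128-qubit layout (4x4 unit cells)
print(len(e))                    # 256 intra-cell + 96 inter-cell = 352 couplers
```

Each interior qubit ends up with 4 intra-cell plus 2 inter-cell connections, which is exactly the 6-connection noise limit discussed in the October 22 post below.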

Space and Energy Roundup

1. Alan Boyle at MSNBC reports that the IEC fusion project is still in limbo, awaiting a review of their results and word on follow-up funding.

[EMC2 Fusion] is waiting for guidance from a peer-review panel and his funders on whether to proceed to the second phase.

"We've been pretty busy, but it's the same situation," Nebel told me today. "We're kind of in a holding pattern."

He's been able to keep the five-person team together and "doing a few things" during this holding pattern. There have been some rumblings to the effect that EMC2's results have been encouraging enough to justify pressing forward, but Nebel has declined to make a prediction about the project's future.

Nebel worries about the same kind of budget limbo that the U.S. ITER team is worrying about, even though his budget is an order of magnitude lower. Among the factors on his mind are the change in the White House and the changes in economic circumstances.

"The thing that usually gets hit the hardest is what they call discretionary funding," Nebel said, "and that's what we're looking at here. That'd be the biggest fear everywhere."

2. Elon Musk and Spacex are confident again after their first successful orbital rocket launch.

Under a $278 million contract with NASA, Space Exploration Technologies, known as SpaceX, plans to launch a far more powerful booster by mid-2009. By designing the new Falcon 9 to be reusable, Musk hopes to make space travel far cheaper, and secure a permanent gig taxiing supplies to the International Space Station.

If SpaceX can achieve its ambitions of slashing the cost to reach space by a factor of 10, "it would be recognized as one of the pivotal events in human history, in the history of life itself," Musk said. "It would make it possible to colonize Mars, to make life multi-planetary. In the absence of a reusable launch vehicle, that's not going to happen."

Elon Musk has stated that one of his goals is to improve the cost and reliability of access to space, ultimately by a factor of ten. "Ultimately, I believe $500 per pound ($1,100/kg) or less is very achievable."

3. Venture beat reports that NASA's problems with the Shuttle and its other programs are forcing a greater dependence on the private space companies.

NASA has earmarked $500 million for contracts with firms like Orbital and SpaceX to deliver and return cargo, and eventually even crew members.

NASA has the potential to be a stable buttress for many private firms — buying data, licensing designs and transporting payloads via commercial contracts, not to mention pumping reliable streams of revenue into the industry. This won't just help the private players. It might just be the key ingredient needed to give NASA a competitive edge over its international peers — not only China, but an ever-expanding space community.

SpaceX, Bigelow Aerospace and XCOR Aerospace are hard at work developing the technology that could lead to commercial manned spaceflight in the not too distant future. Small Texas company Armadillo Aerospace is building reusable rocket-powered vehicles with an eye toward eventual passenger voyages. And Blue Origin, an even smaller startup funded by Amazon founder Jeff Bezos, has already flown its New Shepard spacecraft, designed for sub-orbital transport. They hope to be marketing it to tourists within the next two years.

4. This weekend, Oct 24-25, is the Lunar Lander Challenge.

Armadillo Aerospace lander

Two teams are expected to fly during the competition: Armadillo Aerospace and TrueZer0.

The competition is divided into two levels. Level 1 requires a rocket to take off from a designated launch area, climb to an altitude of 50 meters (about 164 feet), then remain aloft for 90 seconds while landing precisely on a landing pad 50 meters away. The flight must then be repeated in reverse, and both flights, along with all of the necessary preparation for each, must take place within a two and a half hour period.

The more difficult course, Level 2, requires the rocket to hover for twice as long before landing precisely on a simulated lunar surface, packed with craters and boulders to mimic actual lunar terrain. The hover times are calculated so that the Level 2 mission closely simulates the power needed to perform a real lunar mission.

In the 2007 competition, held as part of the X PRIZE Cup, there were nine competitors total. However, despite the best efforts of all of the teams, only one of them, Armadillo Aerospace, was ready to fly. They missed winning Level 1 by 7 seconds.

Team name: Armadillo Aerospace
Vehicle name(s): MOD & QUAD (PIXEL)
Team leader: John Carmack
Team members: James Bauer, Tommy Bishop, Russ Blink, Phil Eaton, Joseph Lagrave, Neil Milburn, and Matthew Ross
Fuel: LOX, ethanol and helium
Level(s): One and Two
Vehicle weight: 1,340 lbs and 2,250 lbs, respectively
Thrust: ~1,800 lbs and 3,000 lbs, respectively

Team name: TrueZer0
Vehicle name(s): Ignignokt
Team leaders: Todd Squires and Scott Zeeb
Team members: George Johnson, Todd Squires, Scott Zeeb, Josh Johnson
Fuel: H2O2 and N2
Level(s): One
Vehicle weight: 475 lbs
Thrust: 650 lbs

5. Blacklight Power has an independent university study confirming its 50 kW reactor

Fun off-topic: Beverage and Breast Size roundup

1. Around half of all women possess a gene shown to link breast size to coffee intake, according to a Swedish study of 270 women. The study was published in the British Journal of Cancer.

Women of average weight but with big breasts and a high number of mammary glands run an above-average risk of developing breast cancer. Previous studies have shown that women can reduce this risk by drinking at least three cups of coffee a day.

2. The TV show Manswers indicated that hops in beer can work like the hops in breast enhancement pills and can increase breast size.

Phytoestrogen can be detected in beer, but the levels are low.

Hops may also have an anti-cancer effect.

Synthetic Telepathy and Better machine neuron connections

The Army has given a grant to researchers at the University of California, Irvine, Carnegie Mellon University and the University of Maryland; the project has two objectives.

The first is to compose a message using, as D'Zmura puts it, "that little voice in your head."

The second part is to send that message to a particular individual or object (like a radio), also just with the power of thought. Once the message reaches the recipient, it could be read as text or as a voice mail.

In a separate but related development, movement was restored to paralyzed limbs in monkeys through artificial brain-muscle connections. The two projects could combine, with the more robust brain and neuron connections helping to provide better signals for the synthetic telepathy work.

The group's approach is one of several lines of current neuroprosthetic research. Some investigators are using brain-computer interfaces to record signals from multiple neurons and convert those signals to control a robotic limb. Other researchers have delivered artificial stimulation directly to paralyzed arm muscles in order to drive arm movement—a technique called functional electrical stimulation (FES). The Fetz study is the first to combine a brain-computer interface with real-time control of FES.

Until now, brain-computer interfaces were designed to decode the activity of neurons known to be associated with movement of specific body parts. Here, the researchers discovered that any motor cortex cell, regardless of whether it had been previously associated with wrist movement, was capable of stimulating muscle activity. This finding greatly expands the potential number of neurons that could control signals for brain-computer interfaces and also illustrates the flexibility of the motor cortex.

The advantage of Moritz’s approach is that the signal from a single neuron can be interpreted by a much less powerful computer chip, perhaps one small and low-powered enough to implant into the animal’s — or a patient’s — body.
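To illustrate why a single-neuron signal needs so little processing, here is a hypothetical sketch of the kind of transform such a chip might run. The window length, baseline rate, gain, and current limit are all made-up illustrative numbers, not values from the Moritz/Fetz study:

```python
def spikes_to_fes(spike_count, window_s=0.1, rest_hz=10.0,
                  gain_ma_per_hz=0.4, max_ma=20.0):
    """Map one neuron's spike count in a time window to an FES current (mA).

    Firing above an assumed resting rate drives stimulation
    proportionally, clamped to a safe maximum. All parameter values
    are hypothetical.
    """
    rate_hz = spike_count / window_s
    drive_hz = max(0.0, rate_hz - rest_hz)   # above-baseline activity only
    return min(max_ma, gain_ma_per_hz * drive_hz)
```

A proportional map like this needs no multi-channel decoding, which is why it could plausibly run on a small, low-power implantable circuit.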

Moritz also suggests that his team’s approach could eventually control several muscles at once by electrically stimulating nerves in the spinal cord, rather than stimulating the muscles directly. Eventually the researchers hope to develop wireless electrodes that wouldn’t involve wires sticking out of the skull, Moritz says.

Clinical applications are still probably at least a decade away, according to Dr. Fetz. Better methods for recording cortical neurons and for controlling multiple muscles must be developed, along with implantable circuitry that could be used reliably and safely, he says.

Previous implants collect signals from large collections of neurons, and need complex software to process them into a clean output signal.

Moritz's system, though, uses only 12 moving electrodes – just 50 micrometres wide – to seek out and connect to just a single neuron. This produces a much simpler and tidier output signal.

After being inserted into the brain's motor cortex, the device can sense where the strongest signal is coming from, and move the electrodes towards it.

Piezoelectric motors can move the 12 electrodes in small 1-micrometre increments and will back off when necessary to avoid damaging nerve cells.

Commercial EEG headsets already exist that allow wearers to manipulate virtual objects by thought alone, noted Sajda, but thinking "move rock" is easier than, say, "Have everyone meet at Starbucks at 5:30."

One difficulty in composing specific messages is fundamental — EEGs are not very specific. They can only locate a signal to within about one to two centimeters. That's a large distance in the brain. In the brain's auditory cortex, for example, two centimeters is the difference between low notes and high notes, D'Zmura said.

Placing electrodes between the skull and the brain would offer more precise readings, but it is expensive and requires invasive surgery.

To work around this problem, the scientists need to gain a much better understanding of what words and phrases light up what brain sections. To create a detailed map of the brain scientists will also use functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG).

Each technology has its own strengths and weaknesses. EEGs detect brain activity only on the outer bulges of the brain's folds. MEGs read brain activity on the inner folds but are too large to put on your head. fMRIs detect brain activity more accurately than either but are heavy and expensive.

Of all three technologies EEG is the one currently cheap enough, light enough and fast enough for a mass market device.

The map generated by all three technologies will help the computer guess which word or phrase a person means when a part of the brain lights up on the EEG. The idea is similar to how dictation software like Dragon NaturallySpeaking uses context to help determine which word you said.
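A toy version of that guessing step might look like the following, with entirely made-up scores standing in for the EEG brain map and the context model:

```python
def guess_word(eeg_scores, context_prior):
    """Pick the candidate maximizing (brain-map score) x (context probability)."""
    return max(eeg_scores, key=lambda w: eeg_scores[w] * context_prior.get(w, 0.0))

# Hypothetical numbers: the EEG slightly favors "rock", but context
# after the word "car" makes "wreck" far more plausible.
eeg_scores = {"rock": 0.40, "rack": 0.35, "wreck": 0.25}
context_prior = {"rock": 0.05, "rack": 0.01, "wreck": 0.90}
best = guess_word(eeg_scores, context_prior)  # "wreck"
```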

Direct control of paralysed muscles by cortical neurons

A potential treatment for paralysis resulting from spinal cord injury is to route control signals from the brain around the injury by artificial connections. Such signals could then control electrical stimulation of muscles, thereby restoring volitional movement to paralysed limbs. In previously separate experiments, activity of motor cortex neurons related to actual or imagined movements has been used to control computer cursors and robotic arms and paralysed muscles have been activated by functional electrical stimulation. Here we show that Macaca nemestrina monkeys can directly control stimulation of muscles using the activity of neurons in the motor cortex, thereby restoring goal-directed movements to a transiently paralysed arm. Moreover, neurons could control functional stimulation equally well regardless of any previous association to movement, a finding that considerably expands the source of control signals for brain-machine interfaces. Monkeys learned to use these artificial connections from cortical cells to muscles to generate bidirectional wrist torques, and controlled multiple neuron–muscle pairs simultaneously. Such direct transforms from cortical activity to muscle stimulation could be implemented by autonomous electronic circuitry, creating a relatively natural neuroprosthesis. These results are the first demonstration that direct artificial connections between cortical cells and muscles can compensate for interrupted physiological pathways and restore volitional control of movement to paralysed limbs.

Nuclear power Roundup October 21, 2008

1. Italy will build 8-10 nuclear reactors, starting in 2013.

Correcting a 50 billion euro mistake of ending nuclear power in Italy.

The long term aim, according to Scajola, is to 'rebalance the power generation in Italy'. By 2030 the Italian government would like to see nuclear power taking a 25% share in generation, with renewables on the same level and fossil fuels making up the remaining 50%.

2. Russian and Chinese delegations added an intention to construct an 800 MWe demonstration fast breeder reactor to older plans to expand the Tianwan nuclear power plant. Two new VVER-1000 pressurized water reactor units operate at the plant in eastern China, and a framework to build two more was embellished by an instruction to draft a memorandum concerning their actual construction. Russia already operates one BN-600 fast breeder reactor for electricity production at Beloyarsk, while a BN-800 unit is under construction there. The 800 MWe unit for China is presumed to be similar to the second Beloyarsk reactor.

3. Kyrgyzstan's Kara Balta plant has produced over 600 tonnes of uranium since the start of 2008 and plans to produce more than 2,000 tonnes in 2009.

4. Saskatchewan, Canada has some of the largest uranium reserves in the world. Saskatchewan has appointed a 12-member panel to perform a study on how to develop its uranium and nuclear industry. The new partnership's mandate is to identify and evaluate opportunities for value-added development of the uranium industry and make recommendations in a report that is to be submitted by March 31, 2009. Members include Armand Laferrere, president of Areva Canada; Jerry Grandey, president and CEO of Cameco; Alex Pourbaix, president of energy at TransCanada Corp; and Duncan Hawthorne, president of Bruce Power Inc.

This should result in a recommendation to vigorously develop the uranium resources, step up nuclear research and build some nuclear power plants. This would be a good thing for Saskatchewan's economy. The position of the province for the last few decades has been like Saudi Arabia being against oil development and the building of refineries.

Note: the author, Brian Wang, lived in Saskatchewan for 20 years and felt that the lack of development of the massive uranium resource and the lack of support for nuclear power was a massive mistake.

Ohio State has solar power breakthrough

Researchers have created a new material that overcomes two of the major obstacles to solar power: it absorbs all the energy contained in visible sunlight, and generates electrons in a way that makes them 7 million times easier to capture.

Note: there is other work for capturing the infrared spectrum of sunlight.

Researchers and companies in the UK have delivered first generation single-junction cells with energy conversion efficiencies up to 12% for thermo-photovoltaic (TPV) cells. This compares to 9% from existing, commercially available devices. Increasing to 15% conversion of infrared energy to electricity is expected.

Ohio State University chemists and their colleagues combined electrically conductive plastic with metals including molybdenum and titanium to create the hybrid material.

Sunlight contains the entire spectrum of colors that can be seen with the naked eye -- all the colors of the rainbow. What our eyes interpret as color are really different energy levels, or frequencies of light. Today's solar cell materials can only capture a small range of frequencies, so they can only capture a small fraction of the energy contained in sunlight.

This new material is the first that can absorb all the energy contained in visible light at once.

The molecules didn't just fluoresce as some solar cell materials do. They phosphoresced as well. Both luminous effects are caused by a material absorbing and emitting energy, but phosphorescence lasts much longer.

To their surprise, the chemists found that the new material was emitting electrons in two different energy states -- one called a singlet state, and the other a triplet state. Both energy states are useful for solar cell applications, and the triplet state lasts much longer than the singlet state.

Electrons in the singlet state stayed free for up to 12 picoseconds, or trillionths of a second -- not unusual compared to some solar cell materials. But electrons in the triplet state stayed free 7 million times longer -- up to 83 microseconds, or millionths of a second.
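The "7 million times" figure follows directly from the two quoted lifetimes:

```python
singlet_lifetime = 12e-12  # 12 picoseconds
triplet_lifetime = 83e-6   # 83 microseconds
ratio = triplet_lifetime / singlet_lifetime  # roughly 6.9 million
```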

When they deposited the molecules in a thin film, similar to how they might be arranged in an actual solar cell, the triplet states lasted even longer: 200 microseconds.

"This long-lived excited state should allow us to better manipulate charge separation," Chisholm said.

At this point, the material is years from commercial development, but he added that this experiment provides a proof of concept -- that hybrid solar cell materials such as this one can offer unusual properties.

The actual paper: The remarkable influence of M2δ to thienyl π conjugation in oligothiophenes incorporating MM quadruple bonds


Oligothiophenes incorporating MM quadruple bonds have been prepared from the reactions between Mo2(TiPB)4 (TiPB = 2,4,6-triisopropyl benzoate) and 3′,4′-dihexyl-2,2′:5′,2″-terthiophene-5,5″-dicarboxylic acid. The oligomers of empirical formula Mo2(TiPB)2(O2C(Th)-C4(n-hexyl)2S-(Th)CO2) are soluble in THF and form thin films with spin-coating (Th = thiophene). The reactions between Mo2(TiPB)4 and 2-thienylcarboxylic acid (Th-H), 2,2′-bithiophene-5-carboxylic acid (BTh-H), and (2,2′:5′,2″-terthiophene)-5-carboxylic acid (TTh-H) yield compounds of formula trans-Mo2(TiPB)2L2, where L = Th, BTh, and TTh (the corresponding thienylcarboxylate), and these compounds are considered as models for the aforementioned oligomers. In all cases, the thienyl groups are substituted or coupled at the 2,5 positions. Based on the X-ray analysis, the molecular structure of trans-Mo2(TiPB)2(BTh)2 reveals an extended Lπ-M2δ-Lπ conjugation. Calculations of the electronic structures on model compounds, in which the TiPB are substituted by formate ligands, reveal that the HOMO is mainly attributed to the M2δ orbital, which is stabilized by back-bonding to one of the thienylcarboxylate π* combinations, and the LUMO is an in-phase combination of the thienylcarboxylate π* orbitals. The compounds and the oligomers are intensely colored due to M2δ–thienyl carboxylate π* charge transfer transitions that fall in the visible region of the spectrum. For the molybdenum complexes and their oligomers, the photophysical properties have been studied by steady-state absorption spectroscopy and emission spectroscopy, together with time-resolved emission and transient absorption for the determination of relaxation dynamics. Remarkably, THF solutions of the molybdenum complexes show room-temperature dual emission, fluorescence and phosphorescence, originating mainly from 1MLCT and 3MM(δδ*) states, respectively.
With an increasing number of thienyl rings from 1 to 3, the observed lifetimes of the 1MLCT state increase from 4 to 12 ps, while the phosphorescence lifetimes are ≈80 μs. The oligomers show photophysical properties similar to the corresponding monomers in THF but have notably longer-lived triplet states, ≈200 μs in thin films. These results, when compared with metallated oligothiophenes of the later transition elements, reveal that M2δ–thienyl π conjugation leads to a very small energy gap between the 1MLCT and 3MLCT states of <0.6 eV.

Downloadable supplemental information

October 20, 2008

Transformation optics, metamaterials, nanophotonics, plasmonics

Transformation optics is a field of optical and material engineering and science embracing nanophotonics, plasmonics, and optical metamaterials.

Transformation optics may enable invisibility, ultra-powerful microscopes and computers by harnessing nanotechnology and "metamaterials."

The list of possible breakthroughs includes a cloak of invisibility; computers and consumer electronics that use light instead of electronic signals to process information; a "planar hyperlens" that could make optical microscopes 10 times more powerful and able to see objects as small as DNA; advanced sensors; and more efficient solar collectors.

Computers using light instead of electronic signals to process information would be thousands of times faster than conventional computers. Such "photonic" computers would contain special transistor-size optical elements made from metamaterials.

Transformation optics also could enable engineers to design and build a "planar magnifying hyperlens" that would drastically improve the power and resolution of light microscopes.

"The hyperlens is probably the most exciting and promising metamaterial application to date," Shalaev said. "The first hyperlens, proposed independently by Evgenii Narimanov at Princeton and Nader Engheta at the University of Pennsylvania and their co-workers, was cylindrical in shape. Transformation optics, however, enables a hyperlens in a planar form, which is important because you could just simply add this flat hyperlens to conventional microscopes and see things 10 times smaller than now possible. You could focus down to the nanoscale, much smaller than the wavelength of light, to actually see molecules like DNA, viruses and other objects that are now simply too small to see."

He estimated that researchers may be building prototypes using transformation optics, such as the first planar hyperlenses, within five years.

The Buckypaper Race to Market

1. Dr. Xiangwu Zhang, an Assistant Professor in the College of Textiles at North Carolina State University, uses hydroentangling, which treats a stack of unentangled fibers as a whole to produce strong fabrics or membranes, making it an excellent method for assembling carbon nanotubes (CNTs), which are too small to be manipulated individually. The continuous hydroentangling process used in the textile industry can produce nonwovens at speeds up to 400 meters per minute. [H/T to reader Brock for one of these links]

Zhang demonstrated that the tensile strength of a hydroentangled CNT membrane with a thickness of 100 µm is 51 MPa, which is three times greater than that of filtration-produced CNT buckypaper.

2. Florida State's High-Performance Materials Institute has been able to produce buckypaper with half the strength of the best existing composite material, known as IM7. Ben Wang expects to close the gap quickly.

"By the end of next year we should have a buckypaper composite as strong as IM7, and it's 35 percent lighter," Wang said.

Buckypaper now is being made only in the laboratory, but Florida State is in the early stages of spinning out a company to make commercial buckypaper.

One challenge is that the tubes clump together at odd angles, limiting their strength in buckypaper. Wang and his fellow researchers found a solution: Exposing the tubes to high magnetism causes most of them to line up in the same direction, increasing their collective strength.

[H/T to commenter eternal carrot for figures on the IM7 material]

IM7/8551-7A composite beams: The corresponding longitudinal modulus and failure loading were found to be 124.96 GPa and 782 MPa, respectively.

The loading thickness was 1.29 mm, 2.80 mm and 1.83 mm for the three samples [versus 0.1 mm, or 100 microns, for the North Carolina material].

This Air Force PDF on page 19 describes IM7 carbon fiber as having a tensile strength of about 5,520 MPa and a modulus of about 276 GPa.
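Putting the quoted strength figures side by side (note that they describe different material forms, a thin membrane, a composite beam, and bare fiber, so the comparison is only indicative):

```python
strength_mpa = {
    "hydroentangled CNT buckypaper": 51,   # 100 um membrane
    "filtration CNT buckypaper": 51 / 3,   # per the "three times" claim above
    "IM7/8551-7A composite beam": 782,     # failure loading
    "IM7 carbon fiber": 5520,              # bare-fiber tensile strength
}
# Bare IM7 fiber is still roughly 100x stronger than the buckypaper membrane.
fiber_gap = strength_mpa["IM7 carbon fiber"] / strength_mpa["hydroentangled CNT buckypaper"]
```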

3. Nanocomp Technologies, a startup, has also produced large sheets of carbon nanotubes.

Nanocomp Technologies received a $1.5 million development contract from the U.S. Army Natick Soldier Center in Massachusetts in August 2008.

Florida State's High-Performance Materials Institute website

Xiangwu Zhang's site at North Carolina State University

Making aluminum cheaper and more energy efficient can also be a way to reduce the weight of cars. Expensive cars like the Jaguar, Aston Martin and Audi A8 have aluminum frames and bodies, which reduce vehicle weight by about 500 pounds.

Oak Ridge National Laboratory has empirically developed a rule of thumb that a 10 percent reduction in vehicle weight improves fuel economy by 5 to 7 percent.
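Combining that rule of thumb with the 500-pound savings mentioned above (the base vehicle weight here is my assumption for illustration, not a figure from the article):

```python
def fuel_economy_gain(weight_saved_lb, base_weight_lb=3600):
    """Estimated % fuel-economy gain range via the ORNL 10%-weight rule."""
    pct_reduction = 100.0 * weight_saved_lb / base_weight_lb
    return (pct_reduction / 10 * 5, pct_reduction / 10 * 7)

low, high = fuel_economy_gain(500)  # roughly 7% to 10% for a 3,600 lb car
```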

Rowan University Study of Blacklight Power

Dr. Jansson, professor of engineering at Rowan University, confirms BlackLight's 50 kilowatt reactor. [H/T to reader Ron B.]

Rowan scientists confirmed BLP's 1 kW and 50 kW power source tests, corresponding to 20 kilojoules and 1.0 megajoules respectively. Chemical analysis of the reactant and product R-Ni powder could account for less than 1% of the observed energy from known chemistry.
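One consistency check on those figures (assuming the quoted energies were released at the rated powers, which the report does not state explicitly): both tests imply the same run duration.

```python
# (watts, joules) for the two quoted tests
runs = {"1 kW": (1_000, 20_000), "50 kW": (50_000, 1_000_000)}
durations_s = {name: joules / watts for name, (watts, joules) in runs.items()}
# Both reactors work out to 20 seconds at the rated power.
```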

Note: the Blacklight Power process is highly controversial, and many people believe it is a total scam, so reports like this will not convince those people. Blacklight Power has said it will begin selling its reactors in 2009. When Rowan University and Blacklight Power are running their own facilities on Blacklight Power reactors, some skeptics will be converted; the remainder would be converted only if they could personally verify Blacklight Power reactors providing energy for buildings owned by people they know and trust. Blacklight Power is independently funded for over $50 million, so no tax money is going into this development.

There is extra passion among the skeptics because those behind Blacklight Power also claim that their system represents new science: hydrinos. Hydrinos, if real, would require modification of certain foundational aspects of chemistry and quantum physics. Blacklight Power's announcements have set a rough timeframe for a reckoning with its critics, sometime in 2009 or, if the company announces some plausible delays, 2010 at the latest (though critics will be all over any delays). Based on the Rowan University work, Blacklight Power will be letting more universities and scientists take the 50 kW reactor out for a spin. If this potential breakthrough is for real, then 2009 is when it will become very obvious; even if only 20% of the claims prove true, the impact would be world-changing.

In 2002, a NASA Institute for Advanced Concepts (NIAC) Phase I study was conducted at Rowan University, led by mechanical engineering professor Anthony Marchese, to investigate the so-called BlackLight Process for use in spacecraft propulsion. The team successfully replicated and confirmed results obtained by BlackLight, Inc., including the observation of line broadening and excess heat (although the final report stated "Additional studies are required to rule out all other possible explanations other than 'excess power' for these observations.") Peter Jansson was involved in the NIAC report.

BlackLight Power (BLP) Inc. today announced the successful independent replication and validation of its 1,000 watt and 50,000 watt reactors based on its proprietary new clean energy technology. This follows BLP's May announcement that it had successfully tested a new non-polluting energy source.

BLP's 50,000 watt reactor generated over one million joules of energy in a precise measurement made by Rowan University engineers, led by Dr. Peter Jansson. The independent study included full characterization of a proprietary solid fuel to generate the energy, before and after the reaction.

Dr. Jansson's Rowan University team conducted 55 tests of the prototypes, including controls and calibrations, during a nine-month study. Test results indicated that energy generation was proportional to the total amount of solid fuel, and only one percent of the one million joules of the energy released could be accounted for by previously known chemistry. These results matched earlier tests conducted at BlackLight's Research and Development Center in Cranbury, New Jersey.
Michael Jordan, former CEO of Westinghouse and current board member of BlackLight Power, says "The offsite replication and independent testing announced by Dr. Peter Jansson and his team of scientists underscore the business viability and impact of BlackLight's new energy source as the opportune replacement of coal-based fuels. It will go down as one of the most important advances in the field of energy in the last fifty years."

Rowan University has 10,000 students, and US News & World Report ranks Rowan University in the "Top Tier" of Northern Regional Universities.

Wikipedia entry on Rowan University


Here is the 16-page PDF of the Rowan University report.

Based on the results achieved by the Rowan University scientific teams from the BLP 1 kW and 50 kW reactors, we have concluded that there is a novel reaction of some type causing the large exotherm which is consistently produced in our experimental runs. The past few months have shown great progress in our ability as an external team of scientists to reproduce many of the experiments that BLP scientists have achieved in their own laboratories. The current scientific team has grown competent in the experimental protocols and become more familiar with performing the experimental and the data analysis with consistent results. We are confident that the energy released from the BLP experiments can be replicated in laboratories at other scientific and educational establishments.

Our future plans include moving the project to the new South Jersey Technology Park during October 2008. We intend to continue further testing from October 2008 – May 2009. Based on the success of our calibration tests, we will continue with flow rate calibrations prior to and after each calibration/heat run to assure we have more consistent and repeatable data. A constant record of each offset for each run will be kept as a reference for reanalyzing previous data. This step in the analysis assures we can minimize inaccuracies.

The 2002 Blacklight Power propulsion study made at Rowan University