
May 18, 2007

Scarcity will not be completely eliminated

This is an interesting, lengthy article about the net economy and scarcity in general. Brad DeLong points to the last part of the first article, which describes why scarcity will not be completely eliminated.

Scarcity could be greatly reduced using advanced technology such as molecular nanotechnology.

This expands upon the cornucopian view.

Here is a discussion of the Cornucopian versus New Malthusian perspectives.

I would be classified as a Cornucopian. I agree that there are physical limits, but they are a lot farther off than New Malthusians believe. I also think the steady-state-on-earth plan that the New Malthusians promote is a very bad idea.

With advanced technology we are not near the population carrying capacity of the earth. I believe that the carrying capacity of the earth is over 100 billion people. This would be a carrying capacity where most of the area on the earth can be left for nature. We should change the primary energy infrastructure to improved nuclear fission (thorium molten salt), nuclear fusion or space based solar power. We should also use nanotechnology and other means to reduce pollution and waste to negligible levels.

We could also drastically lower our footprint and improve sustainability by using superior food production: scale up test tube meat to industrial levels, create superior hydroponics for consumed plants, and create sustainable aquaculture.

I believe that we will be able to expand into the solar system to allow for a carrying capacity of 10 quadrillion (10^15) people.

We will be able to get into space

The technology will be there. We still have to do a lot of hard work and make the right choices.

Aluminum instead of gasoline to power cars

There is a proposal to use aluminum to power cars instead of gasoline. The plan is to use pellets of aluminum to generate hydrogen as needed to power a fuel cell. The aluminum is converted to alumina in the process, and the alumina can be recycled back into aluminum. Based on the info, the mileage is 1 mile per pound of aluminum.

This is a poor proposal. If we scaled it up to all cars we would be moving 140 million tons of aluminum per week to and from cars and charging facilities. To get the aluminum part of the system cost competitive with gasoline we need expensive fuel cells. It would be better to get plug-in hybrids and transition to all-electric vehicles.

There was also a storage system breakthrough for holding gases of almost any kind. It stores a target of 180 times the container volume at about 500 psi. The gas storage system is based on corn cobs turned into briquettes.

"A midsize car with a full tank of aluminum-gallium pellets, which amounts to about 350 pounds of aluminum, could take a 350-mile trip and it would cost $60, assuming the alumina is converted back to aluminum on-site at a nuclear power plant."

"If I put gasoline in a tank, I get six kilowatt hours per pound, or about two and a half times the energy than I get for a pound of aluminum. So I need about two and a half times the weight of aluminum to get the same energy output, but I eliminate gasoline entirely, and I am using a resource that is cheap and abundant in the United States. If only the energy of the generated hydrogen is used, then the aluminum-gallium alloy would require about the same space as a tank of gasoline, so no extra room would be needed, and the added weight would be the equivalent of an extra passenger, albeit a pretty large extra passenger."


For 800 million cars one would need 140 million tons of aluminum for full tanks for all cars. The actual amount of aluminum/alumina in a system of rotating material (some in tanks, some being charged) would need to be three to five times as much, but there can be home or "gas station" recharging. Ideally you do not want to be transporting a lot of alumina/aluminum back and forth over large distances to centralized recharging stations. If we did, then the recharging stations should be in every city, town and community.
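
For readers who want to check the logistics numbers, here is a back-of-envelope sketch; the 800 million cars and 350 lb per tank come from the article, and 2,000-lb short tons are assumed:

```python
# Back-of-envelope check of the fleet-scale aluminum logistics.
# 800 million cars and 350 lb per full tank are the article's figures;
# 2,000-lb short tons are assumed.

CARS = 800e6           # world car fleet
LB_PER_TANK = 350.0    # aluminum per full tank (Purdue proposal)
LB_PER_TON = 2000.0    # short ton

tons_in_tanks = CARS * LB_PER_TANK / LB_PER_TON
print(f"Aluminum riding in tanks: {tons_in_tanks / 1e6:.0f} million tons")

# Rotating pool (some in tanks, some being recharged) at 3-5x:
for mult in (3, 5):
    print(f"Total pool at {mult}x: {mult * tons_in_tanks / 1e6:.0f} million tons")
```

This reproduces the 140 million ton figure above and shows the rotating pool would be 420-700 million tons, far beyond today's annual production.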

About 27 million tons of aluminum are produced each year. The aluminum is not consumed in the proposed process but is converted into alumina.

There are globally 23 billion tons of bauxite reserves.

Aluminum production is an energy-intensive process, but it would only be done once. Then the aluminum would be part of a closed-loop process: it makes hydrogen in cars and then gets recharged, with the extra oxygen stripped off.

Aluminium electrolysis with the Hall-Héroult process consumes a lot of energy, but alternative processes have always been found to be less viable economically and/or ecologically. The worldwide average specific energy consumption is approximately 15±0.5 kilowatt-hours per kilogram of aluminium produced from alumina (52 to 56 MJ/kg). The most modern smelters reach approximately 12.8 kW·h/kg (46.1 MJ/kg). Reduction line currents for older technologies are typically 100 to 200 kA; state-of-the-art smelters operate at about 350 kA. Trials have been reported with 500 kA cells.
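
Those smelting figures let us estimate the electricity needed to regenerate one tank of fuel. A minimal sketch, assuming the ~15 kWh/kg world-average Hall-Héroult figure applies to the recycling step (the actual on-site process may differ):

```python
# Electricity needed to regenerate one tank of aluminum from alumina,
# assuming the ~15 kWh/kg world-average smelting figure quoted above.

LB_TO_KG = 0.4536
tank_kg = 350 * LB_TO_KG      # ~159 kg of aluminum per 350-mile tank
kwh_per_kg = 15.0             # Hall-Heroult world average, from above

kwh_per_tank = tank_kg * kwh_per_kg
print(f"~{kwh_per_tank:.0f} kWh to regenerate one tank")
print(f"~{kwh_per_tank / 350:.1f} kWh of smelting electricity per mile")
```

At several kilowatt-hours of smelting electricity per mile, the round-trip energy cost reinforces the plug-in hybrid conclusion above.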

Recovery of the metal via recycling has become an important facet of the aluminium industry. Recycling involves melting the scrap, a process that uses only five percent of the energy needed to produce aluminium from ore.[8] Recycling was a low-profile activity until the late 1960s, when the growing use of aluminium beverage cans brought it to the public consciousness.


Fuel cells are still over $3,000/kW. If I need fuel cells to allow me to go 80 miles in one hour, then the fuel cell would need to deliver about 200 kW (200 kWh over that hour). So the fuel cells still look too expensive. If it were part of a hybrid system, the hydrogen fuel cell could be shrunk. The costs of this approach would need to be reduced for all components for it to be viable.
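
Here is the sizing arithmetic behind that cost concern, as a rough sketch; it assumes 1 mile per pound of aluminum and the quoted ratio of gasoline to aluminum energy density, with a sustained 80 mph cruise:

```python
# Sizing arithmetic behind the fuel cell cost concern. Assumptions:
# 1 mile per pound of aluminum (from the article), ~2.4 kWh of usable
# energy per pound (the quoted 1/2.5 of gasoline's 6 kWh/lb), and a
# sustained 80 mph cruise.

mph = 80.0
lb_al_per_mile = 1.0
kwh_per_lb_al = 6.0 / 2.5          # ~2.4 kWh/lb

kw_needed = mph * lb_al_per_mile * kwh_per_lb_al  # energy per hour = power
cost = kw_needed * 3000            # at >$3,000 per kW
print(f"Fuel cell power: ~{kw_needed:.0f} kW, cost: ~${cost:,.0f}")
# -> ~192 kW and ~$576,000, hence 'too expensive' unless a hybrid
#    layout shrinks the fuel cell
```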

How much water is needed to provide the hydrogen for the alumina to convert?

The most accurate way to determine this would be to use the gas law. Start with the amount of hydrogen needed and then convert to the equivalent amount of water.

I will shortcut this and use the standard that 180 times the volume of the gas tank is needed. A site with possibly useful conversion rates on hydrogen from water notes that 34 pounds of hydrogen has the energy content of 15 gallons of gasoline.

1.2 tablespoons (one mole of water) makes 22.42 liters of H2 gas and
11.21 liters of O2 gas.

A 60-gallon tank holds about 250 liters of gas, so about 11 tablespoons of water would fill the tank at atmospheric pressure; multiply that by the 180x storage concentration to get about 2,000 tablespoons.

At 256 tablespoons to a gallon, that is about 8 gallons of water to equal a 15-gallon gasoline tank.
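
The same arithmetic in one place, as a minimal sketch using the article's rounded constants:

```python
# Water needed to fill a hydrogen tank, redone from the steps above.
# All constants are the article's rounded figures, so the answer is rough.

TANK_L = 250.0          # ~60-gallon hydrogen tank volume
CONCENTRATION = 180.0   # corn-cob storage multiplier
L_H2_PER_MOL = 22.42    # liters of H2 per mole at standard conditions
TBSP_PER_MOL = 1.2      # one mole of water is ~18 mL
TBSP_PER_GAL = 256.0    # US tablespoons per gallon

moles_h2 = TANK_L * CONCENTRATION / L_H2_PER_MOL  # H2O -> H2 is 1:1 in moles
gallons_water = moles_h2 * TBSP_PER_MOL / TBSP_PER_GAL
print(f"{moles_h2:,.0f} moles of H2, ~{gallons_water:.0f} gallons of water")
# -> ~2,007 moles and ~9 gallons; rounding the intermediate steps as the
#    text does gives the ~8-gallon figure
```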

Artificial and adjustable nanofluid channels

Nanowerk has a spotlight on artificial adjustable nanofluid channels


Flow of fluorescein molecules through an array of five tunable elastomeric nanochannels and their accumulation at an air-filled microscale compartment. Running horizontally at the top is an air-filled microchannel. The nanochannels are triangular and are 80 nanometers high from base to top corner and 600 nm wide at the base (Image: Dr. Takayama)

"Our method of fabricating nanochannels is very simple" Dr. Shuichi Takayama explains to Nanowerk. "We do it by stretching a piece of surface treated rubber. People may have similar experiences where they have stretched an old rubber band and seen cracks form. We just do this in a finer, more controlled manner to make nanochannels. It does not require any of the typical expensive equipment needed to create nanostructures, such as e-beams or cleanrooms. Our tuneable nanochannels are unique in being able to adjust its cross-sectional size."

"We believe that our approach can be extended to higher levels of functionality through the integration of parallel and serial operations, sophisticated optics and a wealth of polymer chemistry" Takayama concludes.

Trillion pixel image created

Aperio has created the first trillion-pixel image


Consisting of 225 pathology slides of breast tissue, the 143GB image vastly exceeds the 4GB limit imposed by the original TIFF format while retaining backwards compatibility. The new format will be open source.

The company says the technology’s stupendous image resolution makes it possible to create spectacularly detailed images of blood, tissue and bone marrow; zooming in and out has been compared to Google Earth.

Scanning obviously requires specialised technology; in this case Aperio's ScanScope slide scanning system is used to create the digital images of entire microscope slides at gigapixel resolutions. The company says this process takes only minutes and produces digital slide images with dimensions that routinely exceed 100,000 x 100,000 pixels.
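
As a rough check on the storage arithmetic (the pixel count and 143GB size are from the article; 24-bit color and decimal gigabytes are my assumptions):

```python
# Storage arithmetic for the trillion-pixel composite. The pixel count
# and 143 GB file size are from the article; 24-bit RGB color and
# decimal gigabytes (1e9 bytes) are assumptions.

pixels = 1e12
file_bytes = 143e9
raw_bytes = pixels * 3        # 3 bytes per pixel, uncompressed

print(f"Uncompressed size: ~{raw_bytes / 1e12:.0f} TB")
print(f"Implied compression: ~{raw_bytes / file_bytes:.0f}:1")
# -> ~3 TB raw and roughly 20:1 compression, which is why the format
#    had to break the original 4GB TIFF limit
```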


In related news, 2,045 images were stitched together to make a 13 gigapixel panorama. A camera perched on the roof of a building at 7th Avenue and 110th Street in New York City panned and tilted, capturing the skyline from 4:43 p.m. to 6:53 p.m.

A website, harlem 13 gigapixels, presents the result.

Related articles:
Avoiding jpeg compression to get 10 times better resolution

A survey of currently available high resolution cameras

A chip with 111 million pixels

Long range laser surveillance

Laser surveillance of reflective tags offers a high-precision, long-range tracking alternative to short-range RFIDs


Using low-cost reflective tags placed on objects, LBIMS maps the precise location of high-value items. The laser can scan many points per second and can detect small changes - less than a centimeter - in the reflected signal, meaning tampering can be immediately detected.

The precision of the system is made possible by a high-resolution two-axis laser scanner capable of looking at a 60-degree field of view in 0.0005-degree increments, dividing the field of view into more than 10 billion individual pointing locations. A camera with comparable resolution over the same field of view would require a 10,000-megapixel detector.
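
The pointing-location claim is easy to verify. A minimal sketch using the quoted field of view and step size:

```python
# Checking the pointing-location claim from the quoted figures.

fov_deg = 60.0        # field of view per axis
step_deg = 0.0005     # scanner step size

points_per_axis = fov_deg / step_deg      # 120,000
total = points_per_axis ** 2              # two-axis scanner
print(f"{total:.2e} pointing locations")
# -> 1.44e10, i.e. more than 10 billion, matching the claim that a
#    comparable camera would need a 10,000-megapixel detector
```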

Tests performed at the International Atomic Energy Agency in Vienna, Austria, and at the Joint Research Center in Ispra, Italy, have shown LBIMS to be relatively impervious to various attacks designed to foil the system. The Joint Research Center is involved in the development and testing of highly sophisticated laser scanning systems for a variety of applications. Even tests in highly reflective rooms such as one with stainless steel walls proved no challenge for LBIMS.

Comparing the added energy from non-fossil fuel sources

NNadir at dailykos has another great observation, which compares the increase in power per year from 1993 to 2005 of non-fossil fuel sources in the United States

The rate of increase in units of energy, delivered as electrical power per year, in the period between 1993 and 2005 (12 years). The units of this calculation are thousand megawatt-hours per year.

Wood (biomass): +96 thousand megawatt-hours per year.

Waste: -259 thousand megawatt-hours per year. A negative number.

Geothermal: -190 thousand megawatt-hours per year. A negative number.

Solar (usually everybody's favorite): +8 thousand megawatt-hours per year.

Wind (another favorite): +1,345 thousand megawatt-hours per year.

Overall, renewable energy in the United States has increased at a rate of 1,000 thousand megawatt-hours per year.

The nuclear energy figure is 16,203 thousand megawatt-hours per year, even without building a new plant. Where did all this energy come from if no new plants were built? Improved operations, mostly.
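
To put the gap in perspective, here is a minimal sketch that tabulates the rates quoted above and takes the ratio (the figures are NNadir's; the grouping is mine):

```python
# Tabulating the quoted 1993-2005 growth rates (units: thousand
# megawatt-hours per year of added annual generation) and taking the
# nuclear-to-renewables ratio.

rates = {
    "Wood (biomass)": 96,
    "Waste": -259,
    "Geothermal": -190,
    "Solar": 8,
    "Wind": 1345,
    "All renewables": 1000,
    "Nuclear (no new plants)": 16203,
}

ratio = rates["Nuclear (no new plants)"] / rates["All renewables"]
print(f"Nuclear added generation ~{ratio:.0f}x faster than all renewables combined")
```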


So nuclear power has been the fastest way to displace fossil fuels by a large margin. It will and should still be a major part of the solution to the problem of displacing fossil fuel.

Recent work from MIT indicates that existing nuclear plants could be modified to safely generate 50% more energy. This can be done by changing the shape of the fuel from rods to cylinders and by adding nanoparticles to the water. A power uprating application takes about 18-24 months to be processed.

A robust infrastructure

An interview on Wired about creating a robust society that is able to resist system shocks. The key is to create community infrastructure in a box: enable the quick replacement of infrastructure, and design the grid to contain failures to small regions.

New Toyota cars will be 100% hybrid in 2020

Toyota plans to go 100% hybrid by 2020. Masatami Takimoto said cost cutting on the electric motor, battery and inverter was showing positive results, and that by the time Toyota's sales goal of one million hybrids annually is reached, it "expects margins to be equal to gasoline cars". Takimoto also made the bold claim that by 2020, hybrids will be the standard drivetrain and account for "100 percent" of Toyota's cars, as they would be no more expensive to produce than a conventional vehicle.

Resource Investor discusses the impact of a large shift to hybrids using the likely materials of lithium-ion and cobalt

In 2006, Toyota made a record-setting 9.3 million vehicles including a little more than 300,000 Priuses. All other manufacturers together made enough hybrids so that the total produced globally was around 500,000. Last year, the world’s production of new lithium for all uses was in balance with demand. If we assume that in 2020 Toyota, alone, will produce 12 million vehicles and that all of them will be powered by a hybrid system using a lithium-ion technology battery pack, and, if we assume that those battery packs each contain, for argument’s sake, 20 pounds of lithium, then Toyota alone in 2020 will require around 240 million pounds of lithium annually or 120,000 tonnes per year. In addition each lithium-ion battery pack, if it were built today, would need a few pounds of cobalt. Even one pound per car or truck will require 6,000 tonnes per year just for Toyota’s production in 2020.
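
As a sanity check on the quote's arithmetic, here is a short sketch; the 12 million vehicles and 20 lb of lithium per pack are the quote's own illustrative assumptions:

```python
# Sanity check on the quote's lithium arithmetic. The 12 million vehicles
# and 20 lb of lithium per pack are the quote's own illustrative figures.

vehicles = 12e6
lb_per_pack = 20.0
LB_PER_TONNE = 2204.6      # pounds per metric tonne

total_lb = vehicles * lb_per_pack
print(f"{total_lb / 1e6:.0f} million lb = {total_lb / LB_PER_TONNE:,.0f} tonnes/yr")
# -> 240 million lb, ~109,000 metric tonnes; the quoted 120,000 figure
#    comes from using 2,000-lb short tons instead
```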

The world’s largest producer of lithium today is Chile’s SQM [NYSE:SQM]. Also there’s MetallGesellschaft's (MG) American subsidiary Foote Mineral (or Cyprus-Foote Mineral) in Nevada.

Today’s entire world production, used primarily for chemical purposes rather than batteries, would only be a fraction, perhaps as large as a third, of just Toyota’s needs under their announcement for 2020. There is no way that the world’s other car companies could allow Toyota to be the sole producer of high performance hybrid vehicles, so we should multiply the needs of the global OEM automotive industry in 2020, under this scenario, by around 8.

That means if all of the world’s OEM automotive manufacturers were to begin now and, as Toyota has announced, ramp up their changeover from pure internal combustion engine power trains to hybrids by 2020, the world would need an additional amount of lithium each year, beginning around 2012, equal to as much as is produced annually today! Under this scenario the world would run out of known reserves in 2020. We would have used up all of the world’s recoverable lithium.


So battery technology must continue to be improved and made more efficient with things like virus-built nanostructured batteries or carbon nanotube ultracapacitors.

If Toyota delivers and the other car makers follow, then most of the world's production of cars could be hybrid or even plug-in hybrid in 2020. By 2035-2040 most of the world's cars could be converted as the older cars are retired. One important issue will be the drivetrain of the new $2000-5000 cars being made in China and India. I believe that those cheap cars will also be converted to high efficiency, since the owners would not be able to afford high operating costs.

Current reserve estimates are 6.2 million tons of lithium, with about half located in Bolivia.
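
Combining that reserve figure with the Resource Investor scenario above gives a rough depletion timeline. This sketch assumes the full 8x-Toyota demand arrives at once and ignores recycling and non-automotive demand growth:

```python
# Rough depletion timeline: the 6.2-million-ton reserve estimate above
# against the Resource Investor scenario (~8x Toyota's ~120,000 tons/yr).

reserves_tons = 6.2e6
demand_tons_per_yr = 8 * 120e3     # all OEMs at Toyota-like volume

years = reserves_tons / demand_tons_per_yr
print(f"~{years:.1f} years of reserves at full-scenario demand")
# -> ~6.5 years, consistent with 'run out of known reserves in 2020'
#    if the ramp begins around 2012-2013
```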

Alternative batteries are sodium nickel chloride and zinc-air, both of which offer comparable or greater energy density than lithium without the attendant safety or resource depletion issues. After iron, aluminum and copper, zinc is the most commonly used metal by modern society. A 2005 USGS estimate placed American zinc reserves at 30,000,000 metric tons and world reserves, excluding the US, at 220 million metric tons.


Here is an 8-page PDF of the 2005 US Geological Survey report on lithium

May 17, 2007

Polariton Superfluid: Laser version of superconductors in new form of matter

Physicists at the University of Pittsburgh have demonstrated a new form of matter that melds the characteristics of lasers with those of the world's best electrical conductors - superconductors.

This is another example of more than one new state of matter being created every year. The most familiar examples of states of matter are solids, liquids, and gases; the most common state of matter in the universe is plasma. Less familiar phases include: quark-gluon plasma; Bose-Einstein condensates and fermionic condensates; strange matter; superfluids and supersolids; and possibly string-net liquids.

The new state (polariton superfluid) is a solid filled with a collection of energy particles known as polaritons that have been trapped and slowed, explained lead investigator David Snoke, an associate professor in the physics and astronomy department in Pitt's School of Arts and Sciences.

Using specially designed optical structures with nanometer-thick layers-which allow polaritons to move freely inside the solid-Snoke and his colleagues captured the polaritons in the form of a superfluid. In superfluids and in their solid counterparts, superconductors, matter consolidates to act as a single energy wave rather than as individual particles.

In superconductors, this allows for the perfect flow of electricity. In the new state of matter demonstrated at Pitt-which can be called a polariton superfluid-the wave behavior leads to a pure light beam similar to that from a laser but is much more energy efficient.

The polariton superfluid is more stable at higher temperatures, and may be capable of being demonstrated at room temperature in the near future.

Snoke's polariton trap was devised with a technique similar to that used for superfluids made of atoms in a gaseous state known as the Bose-Einstein condensate.


Further reading:
Wikipedia discusses polaritons as quasiparticles resulting from the strong coupling of electromagnetic waves with an electric or magnetic dipole-carrying excitation.

Wikipedia discusses polaritonics as an intermediate regime between photonics and sub-microwave electronics. In this regime, signals are carried by an admixture of electromagnetic and lattice vibrational waves known as phonon-polaritons, rather than by currents or photons.

Nanoglue could be used for smaller computer chips

Nanoglue (self-assembled layers for connecting two objects) could help make extremely tiny computer chips. The organic-based nanolayers are about 1,000 times thinner than the thinnest organic-based glues. The glue has a backbone of carbon molecules. On one end of the chain are silicon and oxygen, and on the other end is sulfur. These different end molecules act as hooks that bind with other surfaces.

Ramanath topped off the chain with a thin layer of copper that acts as a protective coating to help keep the molecules intact.

This is a follow up to this article on the nanoglue

China is building three petaflop supercomputers by 2010

China is moving to the front of supercomputing by going for at least two, most probably three, petaflop supercomputers by 2010.





Beijing will have one, most probably under the patronage of the Chinese Academy of Sciences, where Lenovo is an incumbent with a large 1000-CPU Itanium Quadrics system right now. This will probably come online first, since it is the central node of the China National Grid.

The financial capital of Shanghai has the go-ahead for another petaflop machine at its supercomputer centre. The incumbent there is the (far less known) Dawning. Each of these is going ahead with a 100+ TFLOP 'pilot' system this year.

Galactic Computing, set up by the well-known Steve Chen of Cray fame, uses its base in the ever-prosperous city of Shenzhen, now, together with the capital Guangzhou, the richest city of Guangdong province.



Expected major nodes of China's supercomputer grid. Shenzhen is not on the plan but is funding its own supercomputer.

There are several other national supercomputer grid projects.

The second stage of ChinaGrid project is from 2007 to 2010, covering 30 to 40 key universities in China. The focus will extend from computational grid applications to information service grid (e-information), including applications for a distance learning grid, digital Olympic grid, etc. The third stage will be from 2011 to 2015, extending the coverage of the ChinaGrid project to all the 100 key universities. The focus of the third stage grid application will be even more diverse, including instrument sharing (e-instrument).


The EU has the phase II of the EGEE Grid. It is built on the EU Research Network GÉANT and exploits Grid expertise generated by many EU, national and international Grid projects to date.

Japan has the National Research Grid Initiative NAREGI.

The USA has TeraGrid. In August 2005, NSF's newly created Office of Cyberinfrastructure extended support for the TeraGrid through 2010 with a $150 million set of awards for operation, user support and enhancement of the TeraGrid facility.

A list of completed grids is here
A Chinese company, Red Neurons, has announced the completion of the Tensor MPU2016 High Performance Computing technology demonstration and development platform. It is a key part in reducing the cost of a gigaflop/sec to $250-750 by 2009.

The Tensor MPU2016, with 16 processor cards containing Freescale 8641D SoC (system on chip) processors and Xilinx Virtex-4FX FPGAs, is an ideal platform for companies working on the development of high performance solutions for the embedded systems market. MPU, or Master Processing Unit, is a novel approach that provides high density and reliability while preserving CPU and interconnect flexibility. Initial performance tests achieved an HP Linpack benchmark score of 32 gigaflops for a single chassis configuration, which triples the performance demonstrated by the prototype Tensor MPU1016 system produced by Red Neurons in the first quarter of 2007.
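
For scale, combining the $250-750 per gigaflop/sec target with the 32-gigaflop chassis result gives a rough per-chassis price; this sketch is just that multiplication:

```python
# Combining the $250-750 per gigaflop/sec target with the 32-gigaflop
# single-chassis Linpack result gives a rough per-chassis price.

gflops_per_chassis = 32.0
for cost_per_gflop in (250, 750):
    total = gflops_per_chassis * cost_per_gflop
    print(f"At ${cost_per_gflop}/gigaflop: ~${total:,.0f} per 16-card chassis")
# -> roughly $8,000 to $24,000 per chassis if the 2009 target is met
```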


South Korea and Japan are also working on petaflop machines

LANL plans to scale the IBM "Roadrunner" to petascale Linpack performance in a couple of years. Argonne National Laboratory has a 280-teraflop Blue Gene/L and intends to advance to a peak petaflop in a few years. So in 2010, China should have three petaflop supercomputers, and Japan and the USA could each have two.

The U.S. Department of Energy's National Nuclear Security Administration (NNSA) selected IBM to design and build the world's first supercomputer to harness the immense power of the Cell Broadband Engine (Cell B.E.) processor, aiming to produce a machine capable of a sustained speed of up to 1,000 trillion calculations per second, or one petaflop.

India is also developing a supercomputer that can reach one petaflop. The project is under the leadership of Dr. Karmarkar, who invented Karmarkar's algorithm. The Tata group of companies is funding the project.[14] CDAC is also building a supercomputer that can reach one petaflop by 2010.


Folding@home is close to a petaflop of power

Japan already has MD-GRAPE 3, the world's first petaflop+ computer

Singapore has put S$150 million into a quantum computing center.

Further reading:
Steve Chen is planning and being funded to build a supercomputer grid across China

Part II on Steve Chen

Here is information on what it will take and when we should expect zettaflop computing

Other petaflop computing projects in the USA and other places

The USA is making time on existing supercomputers more available

May 16, 2007

Water World Found In Front Of Nearby Star

A Neptune-sized "hot ice" world with water has been found

The star GJ 436, a diminutive star (red dwarf) 30 light-years from the Sun, has been known since 2004 to harbour a 22-Earth-mass planet orbiting 4 million kilometers from the star (0.03 Astronomical Units).

Measurements show that the planet has a diameter of about 50,000 km, four times that of the Earth. From the size and mass of the planet, the astronomers could infer that it is mainly composed of water. If the planet contained mostly hydrogen and helium – like Jupiter or Saturn – it would be much larger, and if it was made up of rock and iron like Earth, Mars and Venus, it would be much smaller.
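
The "mainly water" inference follows from the bulk density. Here is a minimal sketch using only the quoted mass and diameter; the Earth-mass constant is standard:

```python
import math

# Bulk-density check behind the 'mainly water' inference, using the
# quoted 22 Earth masses and ~50,000 km diameter.

M_EARTH = 5.97e24              # kg
mass = 22 * M_EARTH
radius_m = 50_000e3 / 2        # 50,000 km diameter -> 25,000 km radius

volume = (4.0 / 3.0) * math.pi * radius_m ** 3
density = mass / volume
print(f"Bulk density: ~{density:.0f} kg/m^3")
# -> ~2000 kg/m^3: denser than liquid water (1000) but far below a
#    rock/iron body (Earth averages ~5500), pointing to compressed water
```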

This water world can either be surrounded by a light envelope of hydrogen and helium, like Neptune and Uranus, or be entirely surrounded by water, like most of Jupiter’s satellites. As the planet is close to its host star, its surface temperature is expected to be at least 300 C (600 F). The water in its atmosphere would therefore be in the form of steam. Inside, the water is crushed under intense pressure and adopts states unknown on Earth, except in physicists’ laboratories. Says Frédéric Pont: "water has more than a dozen solid states, only one of which is our familiar ice. Under very high pressure, water turns into other solid states denser than both ice and liquid water, just as carbon transforms into diamond under extreme pressures. Physicists call these exotic forms of water 'Ice VII' and 'Ice X'."


Centauri Dreams has coverage

Inexpensive, universal, super nanoglue



A new method allows a self-assembled molecular nanolayer to become a powerful nanoglue by "hooking" together any two surfaces that normally don’t stick well.

Unprotected, a nanolayer (green ball: silicon, blue: sulphur, red: carbon, white: hydrogen) would degrade or detach from a surface when heated to 400 degrees Celsius. But when topped with a thin copper film that binds strongly with the nanolayer, heat causes the nanolayer to form strong chemical bonds to the silica underlayer -- hooking or gluing the copper-silica "sandwich" together. This technique produces a sevenfold increase of the thin film sandwich’s adhesion strength and allows the nanolayer to withstand temperatures of at least 700 degrees Celsius. Both features are unexpected and unprecedented. This new ability to bond together nearly any two surfaces using nanolayers will benefit nanoelectronics and computer chip manufacturing. Other envisioned applications include coatings for turbines and jet engines, and adhesives for high-heat environments.

Because of their small size, these enhanced nanolayers will likely be useful as adhesives in a wide assortment of micro- and nanoelectronic devices where thicker adhesive layers just won’t fit.

Another unprecedented aspect of Ramanath’s discovery is that the sandwiched nanolayers continue to strengthen up to temperatures as high as 700 degrees Celsius. The ability of these adhesive nanolayers to withstand and grow stronger with heat could have novel industrial uses, such as holding paint on hot surfaces like the inside of a jet engine or a huge power plant turbine.

“The molecular glue is inexpensive – 100 grams cost about $35 – and already commercially available, which makes our method well-suited to today’s marketplace. Our method can definitely be scaled up to meet the low-cost demands of a large manufacturer,” he said.

Better genetic engineering and study of intracellular processes of plants

A team of Iowa State University plant scientists and materials chemists have successfully used nanotechnology to penetrate plant cell walls and simultaneously deliver a gene and a chemical that triggers its expression with controlled precision. Their breakthrough brings nanotechnology to plant biology and agricultural biotechnology, creating a powerful new tool for targeted delivery into plant cells.

Currently, scientists can successfully introduce a gene into a plant cell. In a separate process, chemicals are used to activate the gene’s function. The process is imprecise and the chemicals could be toxic to the plant. "With the mesoporous nanoparticles, we can deliver two biogenic species at the same time," Wang said. "We can bring in a gene and induce it in a controlled manner at the same time and at the same location. That’s never been done before."

And in the future, scientists could use the new technology to deliver imaging agents or chemicals inside cell walls. This would provide plant biologists with a window into intracellular events.

Carbon Nanotube Aerogel Made with Optimizable Strength, Shape and Conductivity

Researchers at the University of Pennsylvania have created low-density aerogels made from carbon nanotubes, CNTs, that are capable of supporting 8,000 times their own weight.

The new material also combines the strength and ultra-light, heat-insulating properties of aerogels with the electrical conductivity of nanotubes. Aerogels are novel, semi-transparent, low-density materials created by replacing the liquid component of a gel with gas and are normally constructed from silicon dioxide or other organic polymers. They are currently used as ultra-light structural materials, radiation detectors and thermal insulators. The team also maintained control of the density, microscopic structure and shape of the CNT aerogels. The addition of polyvinyl alcohol created a more even dispersion of CNT throughout the aerogel, adding strength.

Grazing the scalp combined with Wnt proteins could cure baldness

Grazing the scalp and Wnt proteins could cure baldness.

The team cut out a square centimetre of skin from the backs of mice two weeks after their hair follicles had formed. After 14 to 19 days the wounds had closed and formed new skin. When the researchers added Wnt proteins - signalling molecules usually involved in embryonic development - the number of follicles doubled and the skin healed with less scarring. This suggests that wound healing may trigger an embryonic state in skin, says Cotsarelis. Surprisingly, the new follicles originate from stem cells that are not usually involved in creating hair follicles.

May 15, 2007

Engines of Creation predictions are not fanciful






An updated version of Engines of Creation is online

The Center for Responsible Nanotechnology (CRN) has an article that points out that the term "molecular nanotechnology" has been associated almost invariably with fantastic notions like bloodstream nanobots, true universal assemblers (“meat machines”), and theoretically ubiquitous “utility fog.” Such concepts admittedly are fascinating to consider and someday may become reality, but they seem to be further in the future than are the middle-period developments that concern CRN.


I believe those who think of those things as fantastic notions are not aware of developments using current technology that are bringing them about.

From New Scientist, diodes could power bloodstream microbots

A new form of propulsion that could allow microrobots to explore human bodies has been discovered. Velev's diodes are millimetre-sized but any robot designed to work within the human body would have to be an order of magnitude smaller. In the past, attempts to shrink propulsive mechanisms have run up against a fundamental barrier in fluid dynamics: fluids become progressively more viscous on smaller scales. "It's like moving through honey," says Velev. But extrapolations of the team's measurements indicate the propulsive force will work just as well at smaller scales. "The propulsive force scales in exactly the same way as the drag. That's quite significant," says McKinley.


The first surgical microbot could be ready by 2009


A capsule insertable robot has been made in Japan


Nanoparticles have been used as drug delivery systems. They are more crude than the nanobot vision, but they can be remotely guided to a tumor and then triggered from outside to release material. So they are simple machines.

Mini-bacteria cells are also performing similar functions

Cellular repair is becoming possible as well. Magnetically assembled nanotube tips are being added to devices that can inject or remove organelles from cells.

These things are not as capable as the Chromallocyte recently designed by Robert Freitas, but they show that such things are clearly not fanciful.

Meat factories can be made using stem cells. There is existing work on test tube meat.

Steps towards utility fog are being made by Intel with work on claytronics


Current claytronics components which are planned to be shrunk to about one millimeter

Projecting rapid manufacturing capabilities from current rapid prototyping, rapid manufacturing and fabbing suggests we could end up not that far from the Engines of Creation view of universal assemblers.

Combining the ovonic quantum control device with PRAM and other polymer components could enable more fabbable, all-flat (reel-to-reel) printing of computers and solar power cells.

Lasers, combined with metamaterials, nanoparticles and superlenses, could enable additive rapid manufacturing with 2 nanometer precision.

Non-molecular nanotechnology (microelectronics), pre-molecular nanotechnology (nanoparticles, nanomaterials), DNA nanotechnology, synthetic biology, graphene, fullerene nanotechnology, advanced chemistry, robotics, rapid manufacturing are making possible what was believed would require molecular nanotechnology. When full-blown diamondoid arrives what will actually be possible will be confounding to those who have not been paying attention or who are in denial.

We will only need molecular nanotechnology because we are not being creative enough with what we can already do or are on the way to doing very soon. If we were not flushing money away on the Shuttle and the Iraq War we could have mastery of space. If we were not confused about nuclear power we could have clean energy.

Establishing a global-scale thorium fuel cycle

There is a paper by Kazuo Furukawa et al., "A Road Map for the Realization of Global-scale Thorium Breeding Fuel Cycle by Single Molten-Fluoride Flow".
Hat tip to the energy from thorium discussion board.

If the fission industry is to replace the present fossil fuel industry, the doubling time of nuclear energy should be less than 10 years, preferably 5-7 years. Liquid metal cooled fast breeder reactors (LMFBR) have the best breeding criteria, but their doubling time exceeds 20 years.

The development and launching of THORIMS-NES requires the following three programs during the next three decades: (A) pilot plant: miniFUJI (7-10 MWe); (B) small power reactor: FUJI-Pu (100-300 MWe); (C) fissile producer: AMSB, for globally deploying THORIMS-NES.

The growth rate should be about a 10-year doubling time, with a peak output of about 10 TWe (30 times bigger than the present) achieved by 2065, considering factors such as population and economic growth.
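
A quick check shows these numbers are self-consistent. This sketch normalizes present nuclear output to 1; the ~2015 start year (after the pilot-plant programs) is my assumption:

```python
# Self-consistency check: does a ~10-year doubling time give ~30x the
# present nuclear output by 2065? Present output is normalized to 1.

start_year, target_year = 2015, 2065
doubling_time = 10.0

growth = 2 ** ((target_year - start_year) / doubling_time)
print(f"Output multiple by {target_year}: ~{growth:.0f}x")
# -> 2^5 = 32x, matching the paper's '30 times bigger than the present'
```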


Here is a look at the past and possible future of energy


On a somewhat related topic, I have revamped my recent nuclear energy for oilsands article. Check it out, as it has technical and financial specifics from a peer reviewed article from the journal Nuclear Energy.

May 14, 2007

NASA space challenges: Lunar lander and lunar mining

Rocket company Armadillo Aerospace has just completed a 3-minute hover test of its vehicle Pixel, positioning itself to win the most challenging level of the $2 million Lunar Lander Centennial Challenge in October 2007.

Armadillo Aerospace's Pixel vehicle hovered for a record 192 seconds on Saturday, but the tethered flight was not a good test of its landing gear, which played a key role in the company's failure to win the 2006 lunar lander challenge (Image: Armadillo Aerospace)

Four teams of backyard inventors vied to dig as much simulated lunar soil as possible in half an hour at NASA's Regolith Excavation Challenge on Saturday, but no one scooped up the $125,000 first prize.

The teams were trying to excavate 150 kilograms of the mock soil, or "regolith", using no more than 30 watts of power – enough to run a refrigerator light bulb – and dump the soil in a bin. Technology Ranch of Pismo Beach, California, was the only one to run the full 30 minutes on its first attempt, scooping up 75 kilograms (although 10 kilograms missed the collection box). Intended as a two-year programme, the top prize for the second year of the competition was originally going to be $250,000. But this year's unclaimed prize will be added to it to make a total award of $375,000 in 2008.

Masafumi Iai, a student at the University of Missouri at Rolla, makes adjustments to the Lunar Miner, his team's entry in the competition (Image: Dana Mackenzie)
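
The 30-watt cap is more interesting than it sounds. Here is a back-of-envelope sketch comparing the total energy budget with the ideal work of lifting the soil; the 0.5 m bin height is my assumption for illustration:

```python
# Energy budget for the regolith challenge: 30 W for 30 minutes versus
# the ideal work of lifting 150 kg into a bin (assumed ~0.5 m high).

power_w = 30.0
duration_s = 30 * 60
mass_kg = 150.0
lift_m = 0.5       # assumed bin height
g = 9.81

budget_j = power_w * duration_s      # 54,000 J available
lift_j = mass_kg * g * lift_m        # ~736 J of ideal lifting work

print(f"Budget: {budget_j / 1000:.0f} kJ, ideal lift: {lift_j / 1000:.2f} kJ")
# -> the raw lifting energy is under 2% of the budget; the 30 W cap
#    really constrains motors, traction and digging losses, not the lift
```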

All of the NASA centennial challenges are listed here

The astronaut glove challenge was one of them

My take on Shaping the Future

Science fiction author Charlie Stross talks about "Shaping the Future". He wrote Accelerando and Singularity Sky, among other books.



He notes how progress used to be measured by top speed, but how that stopped between 1950 and 1970. I believe that speed will become a useful measure of progress again. I believe that we can and will burst through the roadblocks that have stalled progress toward faster speeds. Lasers, magnets, nuclear fusion, new production methods (molecular nanotechnology) and new materials will help break things wide open. I believe that superior room temperature superconductors will be created, which could be used to ground-launch a magnetic sail.

Charles Stross also mentions how the convergence of technology makes things less predictable. I believe that the convergence of our progress with lasers, magnets, materials and molecular nanotechnology will result in easy and cheap access to space and high acceleration up to significant fractions of light speed.

Energy will also be transformed. Mass produced and cleaner nuclear fission is possible in the form of molten salt thorium reactors, nuclear fusion (z-pinch version or Bussard fusion or another version), and massive solar space power.

Zettaflop and faster classical computers and Quantum computers combined with artificial intelligence, artificial neurons and brain-computer interfaces will help accelerate the transformation.

The relative stability of the infrastructure and makeup of civilization for the past 30-60 years is about to change in a major way over the next 30-100 years. Unless we totally screw up the future, we will definitely have a radically different scope and capability in 2050+. I think it can happen even sooner.

Quantum computers and classical annealing

A highly technical discussion about the Dwave quantum computer system, its claims, and other quantum computer papers.

This was triggered by a post by Geordie Rose of Dwave

Bill Kaminsky provides interesting comments

Not directly related but on quantum computers in general:
Scott Aaronson has some general questions and answers related to his view of the field of quantum computing