October 12, 2007

Numenta system central to image analysis project

The Defense Advanced Research Projects Agency (DARPA) and the National Geospatial-Intelligence Agency awarded Lockheed Martin a $4.9-million, 18-month program to use brain-inspired technologies to develop a system that will speed an image analyst's job by 100 times.

Called Object Recognition via Brain-Inspired Technology (ORBIT), the system will use electro-optical (EO), light detection and ranging (LIDAR), and brain-inspired technologies to automatically recognize objects in urban environments from ground and aerial surveillance. ORBIT will fuse commercial airborne EO and LIDAR sensor data into a three-dimensional, photorealistic model of the landscape. Its brain-inspired object-recognition technology will automatically generate lists of recognizable imagery, like mailboxes and dumpsters.

"ORBIT's automated, 3D, object-recognition capability will help eliminate the time analysts spend manually identifying objects," said Dr. Peter Bilazarian, ORBIT program manager, Lockheed Martin Advanced Technology Laboratories (ATL). "We think ORBIT will reduce analysis time of one square kilometer of imagery from 1,300 hours to less than 10 hours. Faster turnaround time for analysts means more timely and accurate mission planning."

Central to ORBIT are three complementary brain-inspired and machine-learning approaches to object recognition: Numenta's Hierarchical Temporal Memory technology, which performs invariant pattern learning based on a model of the brain's neocortex; a standard model of the brain's visual cortex for spatial recognition; and a computer-vision technology, which mimics the process of human vision and recognition.

Atomic orbitals change at the interface of certain types of nanostructures

Until now, materials science researchers believed that only an electron's charge and spin influenced the characteristics of conventional bulk materials. Atomic orbitals, which consist of the patterns of electron density that may be formed in an atom, were previously thought to be inactive.

Chakhalian's work has focused on what happens at the interface between two different materials - for instance, superconductors and ferromagnets, two materials with properties that were thought to be incompatible with each other in bulk. In 2006, he and his colleagues created the first high-quality material to have both superconducting and ferromagnetic properties, and they used that material in this experiment.

The researchers forced a high-temperature superconducting material containing copper oxide and a ferromagnetic material containing manganese oxide into unusual quantum states. Using a technique called resonant X-ray absorption, they were able to "look" at the atomic orbitals at the interface and determine their symmetry in a non-destructive way.

They found that the atomic orbitals changed the nature of their symmetry at the interface and created a covalent bond between the copper and manganese atoms. This bonding does not exist in the bulk of the individual materials.

"When you merge these two materials, the atomic orbitals at the interface become important. They start contributing to the electronic properties of the material," Chakhalian said. "This opens a new way of designing materials. We can design quantum materials with engineered physical properties."

The discovery may allow researchers to manipulate nanoscale superconductivity at the interface - opening the possibility of creating room-temperature superconductors.

Generators that use superconducting materials generate electricity extremely efficiently, at half the size of conventional generators. General Electric estimates the potential market for superconducting generators to be between $20 billion and $30 billion over the next decade.

Future and past productivity improvement

There is a post at the Center for Responsible Nanotechnology (CRN) arguing that even if nanofactory-level nanotechnology is created, there will be no nano Santa Claus delivering a post-capitalist society with abundance for everyone.

The CRN post goes on to discuss Dale Carrico's prescription of a guaranteed income as a way to redistribute the bounty to everyone. I have questioned the specifics of the implementation and the expected benefits of major income redistribution via a guaranteed income. I have also researched existing levels of taxation, welfare states, etc. There are some financial means to implement such programs, and a case can be made that we do not know they would be inferior to the current wasteful uses of those funds. I question going beyond the range already shown to be workable in successful modern countries: complete income redistribution and taxation over 50% seem clearly problematic (Cuba, the old Soviet Union, Maoist China).

I agree with the first part: a large productivity increase does not automatically mean more stuff and bounty for everyone, just as the productivity and wealth gains from 1820 to now did not mean everyone got rich. Those gains did mean many more people had the opportunity for better lives, but some countries were left behind, and many people did not do well even in countries that did well overall.

An interesting historical analysis of productivity growth is in this online book about Technology and productivity.

The great growth in productivity is correlated with four great inventions and their spreading availability. They also spawned and were the source of follow-on inventions and process improvements.

1. Electric light - a longer productive day
2. Electric motor and combustion engine - faster, more flexible movement, powering mass production and industry
3. Petroleum refining, chemicals, plastics, pharmaceuticals - rearranging matter into more productive forms
4. Electricity and electronics for entertainment, communication and information - the newly started markets had more impact than later improvements: the telegraph replacing the pony express and couriers was a bigger leap than the phone over the telegraph, and so on.

Information technology from the 1950s onward and the Internet from the 1990s onward have created a productivity growth surge, which so far has been smaller than that from the early big four inventions.

I believe that a revolution in materials is continuing and is delivering a larger increase in material capability than early petroleum, chemicals, plastics and pharmaceuticals did. This is not dependent on molecular manufacturing (MM), as it is already occurring. MM would help accelerate and enhance the distribution of the effect and would allow for more flexible and powerful applications. The materials revolution includes micro- and nano-grained metals which are several times stronger, carbon nanotubes, and materials whose properties are enhanced by controlled design at the molecular level.

I believe that the supply of material and energy can be massively transformed by developing access to space resources and by new energy technology such as nuclear fusion and/or mass-produced nuclear fission and/or massive amounts of solar power collection. This would mean enhanced energy availability and energy density. There could also be a doubling of available energy and the elimination of many energy losses through the use of superconducting wire, superconducting motors and thermoelectrics. Again, this is not dependent on molecular manufacturing but would be accelerated and enhanced by it.

Vastly superior automation, robotics and production would also have a transformative effect. There is the application of devices like the iRobot vacuum cleaner and now window cleaners. There is the development of effective robotic driving of cars (the DARPA challenges). We have already had robotic assembly lines, but the widespread application of robotic driving and Halo video conferencing could free up a lot of unproductive commuting time. Automating functions outside the factory would also spread the productivity boost to other parts of the economy.

However, as in the past, the boost to factory productivity primarily benefited the factory owners and shareholders. Some factory workers gained a share of the benefits after they unionized and captured them through collective bargaining. These new boosts in productivity are being captured by those companies and individuals with business plans that leverage them.

I believe that new technology will enable productivity gains larger than those of the big four past inventions, and that their absorption into the economy will happen over a shorter elapsed timeframe. Existing societal and national structures and institutions will likely adapt to the extent that they have with past increases in productivity and wealth.

The current state of affairs means that massive increases in production and massive drops in cost do not diffuse to many parts of the world economy. The 40- to 100-fold drop in PC prices since their introduction (and the 10,000-fold drop for computers in general), along with the increase in their power, did not provide benefits to many people in Africa and Asia. The recent $100-150 laptop effort has been an attempt to address this.

Economies will need to restructure, and many radical new process improvements will need to be made, to fully capture the benefits of new technologies. Individuals will need to recognize opportunities and risks and make the right choices to capture benefits and avoid negative effects.

I see the future situation not as a nano-Santa Claus but as a series of massive sales at Walmart (with more new and better stuff at lower prices, and tomorrow's prices often lower than today's everyday low prices), where you still have to find a way to make your money (salary, business, investment, whatever) and then elbow your way past the other shoppers to get the best bargains, or arrange for internet orders and delivery (which means you need a good connection, a computer and electrical power, and the supplier's website must not be swamped and their fulfilment systems and processes must be working).

Applying metamaterial invisibility for electromagnetic wormholes

Allan Greenleaf, professor of mathematics at the University of Rochester, and his coauthors lay out the possibility of building a sort of invisible tunnel between two points in space.

Invisible tunnel (electromagnetic wormhole)

Current technology can create objects invisible only to microwave radiation, but the mathematical theory allows for the wormhole effect for electromagnetic waves of all frequencies. With this in mind, Greenleaf and his coauthors propose several possible applications. Endoscopic surgeries where the surgeon is guided by MRI imaging are problematical because the intense magnetic fields generated by the MRI scanner affect the surgeon's tools, and the tools can distort the MRI images. Greenleaf says, however, that passing the tools through an EM wormhole could effectively hide them from the fields, allowing only their tips to be "visible" at work.

Greenleaf and his coauthors speculated on one use of the electromagnetic wormhole that sounds like something out of science fiction. If the metamaterials making up the tube were able to bend all wavelengths of visible light, they could be used to make a 3D television display. Imagine thousands of thin wormholes sticking up out of a box like a tuft of long grass in a vase. The wormholes themselves would be invisible, but their ends could transmit light carried up from below. It would be as if thousands of pixels were simply floating in the air.

But that idea, Greenleaf concedes, is a very long way off. Even though the mathematics now says that it's possible, it's up to engineers to apply these results to create a working prototype.

50-100 nanometer nanodiamond particles good for drug delivery

This is a precursor proof of the effectiveness of nanomedicine concepts of Robert Freitas.

Northwestern University researchers have shown that nanodiamonds -- with much the same carbon structure as a sparkling 14-karat diamond but on a much smaller scale -- are very effective at delivering chemotherapy drugs to cells without the negative effects associated with current drug delivery agents.

Their study, published online by the journal Nano Letters, is the first to demonstrate the use of nanodiamonds, a new class of nanomaterials, in biomedicine. In addition to delivering cancer drugs, the model could be used for other applications, such as fighting tuberculosis or viral infections, say the researchers.

Nanodiamonds promise to play a significant role in improving cancer treatment by limiting uncontrolled exposure of toxic drugs to the body. The research team reports that aggregated clusters of nanodiamonds were shown to be ideal for carrying a chemotherapy drug and shielding it from normal cells so as not to kill them, releasing the drug slowly only after it reached its cellular target.

To make the material effective, Ho and his colleagues manipulated single nanodiamonds, each only two nanometers in diameter, to form aggregated clusters of nanodiamonds, ranging from 50 to 100 nanometers in diameter. The drug, loaded onto the surface of the individual diamonds, is not active when the nanodiamonds are aggregated; it only becomes active when the cluster reaches its target, breaks apart and slowly releases the drug. (With a diameter of two to eight nanometers, hundreds of thousands of diamonds could fit onto the head of a pin.)

“The nanodiamond cluster provides a powerful release in a localized place -- an effective but less toxic delivery method,” said co-author Eric Pierstorff, a molecular biologist and post-doctoral fellow in Ho’s research group. Because of the large amount of available surface area, the clusters can carry a large amount of drug, nearly five times the amount of drug carried by conventional materials.

Liposomes and polymersomes, both spherical nanoparticles, currently are used for drug delivery. While effective, they are essentially hollow spheres loaded with an active drug ready to kill any cells, even healthy cells that are encountered as they travel to their target. Liposomes and polymersomes also are very large, about 100 times the size of nanodiamonds -- SUVs compared to the nimble nanodiamond clusters that can circulate throughout the body and penetrate cell membranes more easily.
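
A back-of-the-envelope geometric illustration of the surface-area advantage described above (my own numbers for solid spheres, not the researchers' calculation; liposomes are actually hollow, so this is only a rough comparison):

```python
# Surface area per unit volume of a solid sphere is 3/r, so shrinking the
# particle radius ~100x exposes ~100x more drug-binding surface for the same
# total material. The radii below are illustrative assumptions, not measured data.

def surface_per_volume(radius_nm: float) -> float:
    """Surface-to-volume ratio of a solid sphere, in nm^2 per nm^3."""
    return 3.0 / radius_nm        # (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r

nanodiamond = surface_per_volume(radius_nm=1.0)       # ~2 nm diameter particle
large_carrier = surface_per_volume(radius_nm=100.0)   # a carrier ~100x larger

print(f"nanodiamond:   {nanodiamond:.2f} nm^2 per nm^3")
print(f"large carrier: {large_carrier:.3f} nm^2 per nm^3")
print(f"ratio:         {nanodiamond / large_carrier:.0f}x more surface per unit volume")
```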

Unlike many of the emerging nanoparticles, nanodiamonds are soluble in water, making them clinically important. “Five years ago while working in Japan, I first encountered nanodiamonds and saw it was a very soluble material,” said materials scientist Houjin Huang, lead author of the paper and also a post-doctoral fellow in Ho’s group. “I thought nanodiamonds might be useful in electronics, but I didn’t find any applications. Then I moved to Northwestern to join Dean and his team because they are capable of engineering a broad range of devices and materials that interface well with biological tissue. Here I’ve focused on using nanodiamonds for biomedical applications, where we’ve found success.

“Nanodiamonds are very special,” said Huang. “They are extremely stable, and you can do a lot of chemistry on the surface, to further functionalize them for targeting purposes. In addition to functionality, they also offer safety -- the first priority to consider for clinical purposes. It’s very rare to have a nanomaterial that offers both.”

“It’s about optimizing the advantages of a material,” said Ho, a member of the Lurie Cancer Center. “Our team was the first to forge this area -- applying nanodiamonds to drug delivery. We’ve talked to a lot of clinicians and described nanodiamonds and what they can do. I ask, ‘Is that useful to you?’ They reply, ‘Yes, by all means.’”

For their study, Ho and his team used living murine macrophage cells, human colorectal carcinoma cells and doxorubicin hydrochloride, a widely used chemotherapy drug. The drug was successfully loaded onto the nanodiamond clusters, which efficiently ferried the drug inside the cells. Once inside, the clusters broke up and slowly released the drug.

In the genetic studies, the researchers exposed cells to the bare nanodiamonds (no drug was present) and analyzed three genes associated with inflammation and one gene for apoptosis, or cell death, to see how the cells reacted to the foreign material. Looking into the circuitry of the cell, they found no toxicity or inflammation long term and a lack of cell death. In fact, the cells grew well in the presence of the nanodiamond material.

New force-fluorescence device measures nanometer-scale motion and pico-newton forces

A hybrid device combining force and fluorescence developed by researchers at the University of Illinois has made possible the accurate detection of nanometer-scale motion of biomolecules caused by pico-newton forces.

“By combining single-molecule fluorescence resonance energy transfer and an optical trap, we now have a technique that can detect subtle conformational changes of a biomolecule at an extremely low applied force,” said U. of I. physics professor Taekjip Ha, the corresponding author of a paper to appear in the Oct. 12 issue of the journal Science.

The hybrid technique, demonstrated in the Science paper on the dynamics of Holliday junctions, is also applicable to other nucleic acid systems and their interaction with proteins and enzymes.

The Holliday junction is a four-stranded DNA structure that forms during homologous recombination – for example, when damaged DNA is repaired. The junction is named after geneticist Robin Holliday, who proposed the model of DNA-strand exchange in 1964.

To better understand the mechanisms and functions of proteins that interact with the Holliday junction, researchers must first understand the structural and dynamic properties of the junction itself.

But purely mechanical measurement techniques cannot detect the tiny changes that occur in biomolecules in the regime of weak forces. Ha and colleagues have solved this problem by combining the exquisite force control of an optical trap with the precise measurement capabilities of single-molecule fluorescence resonance energy transfer.

With this latest work, the researchers have deduced the pathway of the conformational flipping of the Holliday junction, and determined that the intermediate structure is similar to that of a Holliday junction bound to its own processing enzyme.

“The next challenge is to obtain a timeline of movement by force, for example, due to the action of DNA processing enzymes, and correlate it with the enzyme conformational changes simultaneously measured by fluorescence,” Ha said.

October 11, 2007

Suggestions for optimizing IQ test performance

Suggestions on how to optimize performance on an IQ test

Let the “experts” argue about whether you can boost IQ or not, in any absolute sense. If you slept well, exercised, then sat up straight and breathed deeply as you took the test, don’t you think you would score a few points higher on an intelligence quotient test? More importantly, wouldn’t you be better prepared for whatever mental tasks you faced?

Fossil fuel air pollution in Europe shown to shorten life expectancy of all Europeans

Despite some success in reducing air pollution, current levels -- mainly nitrogen oxide, fine particles and ground-level ozone -- are estimated to shorten average life expectancy in Western and Central European countries by almost a year and to threaten the healthy development of children.

This is from the report, 'Europe's environment — The fourth assessment', which was presented in Belgrade, Serbia, at the opening session of the sixth ministerial conference of the 'Environment for Europe' process held under the auspices of the United Nations Economic Commission for Europe (UNECE).

The latest in a series of assessments of the pan-European environment published by the EEA over the past 15 years, the report assesses environmental progress in 53 countries — an area with a total population of more than 870 million people. The region includes: Eastern Europe, Caucasus and Central Asia (EECCA), South East Europe (SEE), as well as Western and Central Europe (WCE).


In the Russian Federation, an assessment of the impact of outdoor air pollution on public health, based on the 1993 and 1998 monitoring data, showed that 15–17 % of total annual mortality (up to 219 000–233 000 premature deaths) might be caused by fine particles (Reshetin and Kazazyan, 2004).

In Ukraine and the Russian Federation, estimates of health losses from urban air pollution based on TSP monitoring data indicated considerable health and mortality consequences. In Ukraine, the low (conservative) estimate was 27,000 excess deaths annually, and for the Russian Federation the estimate was about 85,000 excess deaths (Strukova et al., 2006).

As estimated under the Transport Health and Environment Pan-European Programme (THE PEP), air pollution from road transport affects the health of about 10–15 million urban Russian residents. In the large city centres, road transport may account for more than 80% of total air emissions. In 2002, the average annual concentrations of harmful pollutants exceeded maximum permissible levels in 201 Russian cities, home to 61.7% of the urban population. An estimated 22,000–28,000 additional deaths among people over the age of 30 in the Russian Federation were attributable to road transport-related emissions (ECMT, 2004).

The WHO project 'Comparative Quantification of Health Risks' has estimated the health impacts of outdoor air pollution in major cities (population > 100,000) of the world grouped in 14 regions, including EUR-C, consisting mostly of EECCA countries. The annual impact of air pollution by particulate matter for this region was estimated at 46,000 premature deaths and 320,000 years of life lost (WHO, 2004b).

Excess concentrations of ozone are thought to hasten the deaths of up to 20,000 people in the EU each year (Watkiss et al., 2005). Further, ozone is responsible for people vulnerable to its effects having to take medication for respiratory conditions for a total of 30 million person-days a year. Some studies also suggest that long-term exposure to ozone reduces lung function growth in children.

Lives saved in Europe with better air quality: achieving lower particulate levels (mainly from coal power plants and automobiles) would save thousands of lives per year.

France, which gets about 80% of its power from nuclear energy, has superior air quality.

The worse the air quality from particulates and ozone, the more people die, the more money is lost (many billions per year) and the more the environment is damaged.

Targets for good public health for different kinds of air pollution; these pollutants come mainly from coal power plants, automobiles and trucks.

American Superconductor developing 10 MW wind generators

American Superconductor (AMSC) is in a joint venture to develop 10 MW wind generators. The National Institute of Standards and Technology is funding development of technologies for 10-megawatt-class, direct drive offshore wind generators powered by high temperature superconducting wire.



Direct drive wind generator systems utilizing HTS wire instead of copper wire for the generator's rotor are expected to be much smaller, lighter and more efficient than conventional generators and gearboxes. The net effect is expected to be a lower cost of wind-generated electricity, particularly for offshore wind farms. AMSC and TECO-Westinghouse Motor Company (TWMC) also announced that they have received an award from the National Institute of Standards and Technology's (NIST) Advanced Technology Program (ATP), which is providing $3.4 million in funding toward the $6.8 million research project to be conducted under the joint venture.

"The objective of the TWMC-AMSC research joint venture is to develop technologies that will enable the deployment of offshore 10 megawatt class, direct drive wind generators - double the power capacity of conventional systems," said AMSC founder and chief executive officer Greg Yurek. "The result will be more power delivered from each offshore wind turbine, which would significantly reduce the total costs of offshore wind farms

By replacing copper with HTS on the generator's rotor and utilizing a new high-efficiency stator design to be developed under this project, AMSC and TWMC estimate that they could produce 10 MW class direct drive generator systems that would weigh approximately 120 metric tons, or about one-third the weight of conventional direct drive generators with this power rating. Technically, weight reductions could be greater, albeit at a higher cost, giving wind energy system manufacturers and developers new options to design and deploy cost-effective offshore wind farms.

The 30-month cost-shared research project to be conducted by the joint venture with NIST funding calls for the development of new HTS wire and coil technologies that will help enable the design and manufacture of 10 MW class, direct drive AC synchronous generators for off-shore wind turbines. The targeted ultra-low-speed, high torque generators are expected to produce full power at 6 kilovolts at 11 revolutions per minute.
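
To get a feel for the "ultra-low-speed, high torque" regime described above, here is a quick back-of-the-envelope calculation (my own arithmetic from the quoted figures, not AMSC/TWMC or NIST data, and ignoring losses) of the shaft torque implied by 10 MW at 11 rpm:

```python
# Back-of-the-envelope torque for a 10 MW direct-drive generator at 11 rpm.
import math

power_w = 10e6            # rated power quoted above, 10 MW
rpm = 11                  # full-power rotor speed quoted above

omega = 2 * math.pi * rpm / 60.0     # shaft speed in rad/s (~1.15 rad/s)
torque = power_w / omega             # from P = torque * omega, losses ignored

print(f"Shaft speed: {omega:.2f} rad/s")
print(f"Torque:      {torque / 1e6:.1f} MN*m")   # roughly 8.7 million newton-metres
```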



5 MW wind generator

Germany's REpower has offered a 5 MW machine since 2005, with a three-bladed rotor 126 meters in diameter. The world's largest turbines are manufactured by the northern German companies Enercon and REpower. The Enercon E112 delivers up to 6 MW, has an overall height of 186 m (610 ft) and a rotor diameter of 114 m (374 ft). The REpower 5M delivers up to 5 MW, has an overall height of 183 m (600 ft) and a rotor diameter of 126 m (413 ft).

6 MW Enercon wind turbine under construction

Currently more and more 5 MW wind turbines are being ordered; the future is 10 MW wind turbines with 80-metre-long rotor blades (160 meters in rotor diameter).

FURTHER READING
There was a proposal for a single 1-gigawatt wind turbine

Ontario Liberals win in a landslide: more nuclear energy and less coal for Ontario

October 09, 2007

Dwave Systems Quantum Computer Update

Dwave Systems will be demoing their latest and greatest adiabatic quantum computer system at SC07 in Reno on November 12th. Dwave has not released the number of qubits in the new system, but indicates that they are on track with their technology roadmap.

At the same conference, Nantero will be exhibiting their NRAM memory. Hopefully they will be announcing a release date, as NRAM had previously been slated for release sometime in 2007.

Robert Bussard has died of cancer

From Centauri Dreams comes the sad word that Robert Bussard has died of cancer.

I had been covering the recent work of Robert Bussard on inertial electrostatic fusion.

This project had been funded again by the US Navy

Power and Control also noted Robert Bussard's passing and summarized his work, which will continue

The Polywell Fusion reactor is described at Wikipedia

The proposed WB-7 and WB-8 devices will be constructed and tested during 2008. Depending on the experimental results, the research could continue in pursuit of the final full-sized model.


EMC2 was the company that received the funding for the project

Fusion R&D
Phase 1 - Validate and Review WB-6 Results
1.5 - 2 years (by the end of 2008), $3-5M

Fusion R&D
Phase 2 - Design, Build and Test
Full Scale 100 MW Fusion System
5 years (2009-2013), $200M

Askmar has links to the scientific papers on the Fusion concept

The only small scale machine work remaining, which can yet give further improvements in performance, is the testing of one or two WB-6-scale devices but with "square" or polygonal coils aligned approximately (but slightly offset on the main faces) along the edges of the vertices of the polyhedron. If this is built around a truncated dodecahedron, near-optimum performance is expected; about 3-5 times better than WB-6.


This paper describes the application of Bussard fusion to achieve up to 1.2 million seconds of specific impulse (Isp) and $24 per kg to get to orbit

2007 Feynman prize winners

The Foresight Nanotech Institute awards prizes each year for people who've made noteworthy contributions to molecular manufacturing.

The student prize went to Fung Suong Ou, for "Devices and Machines on a Single Nanowire." He used a combinatorial approach to fabricate one-dimensional structures composed of carbon nanotubes and metal nanowires.

The communication prize was earned by Robert Freitas for his decade-plus of work telling people about the benefits of medical applications of molecular manufacturing. His highly detailed and informative Nanomedicine books are available in full online, as well as Kinematic Self-Replicating Machines.

The Feynman theory prize was won by David A. Leigh, for artificial molecular motor and machine design in the realm of Brownian motion.

The Feynman experimental prize went to Sir J. Fraser Stoddart, for synthesizing molecular machines including a molecular "muscle."


FURTHER READING
Paper by Fraser Stoddart, Evaluation of synthetic linear motor-molecule actuation energetics

Eric Drexler on the Productive Nanosystems Technology Roadmap

Drexler is the one who started the idea of molecular manufacturing back in the mid-1980s. The general focus of the Roadmap is on atomically precise technologies, not productive nanosystems.

It provides merit criteria and metrics for research today. When selecting between proposals, look for atomic precision. Look for size, range of materials, other criteria that we'll probably hear about later in the talk.

The Roadmap looks toward advanced manufacturing (what physics says should be possible), but focuses on accessible productive nanosystems (such as ribosome-like systems).

Near-term, there are several kinds of atomically precise things we can build. One is biopolymers: protein, DNA.

New topic: Advances in production technology. Type 1 advances build better products. In Type 2, the products include improvements to the production system, which can enable further improvements. So we really want better productive machines that can build better productive machines... This appears to be an argument for using nanosystems as the means of production of nanosystems.

Today, tools build tools build tools... traceable back to blacksmithing. The tool that extruded your breakfast bagel is a leaf on this tree. The advanced APM tree has a "Mark II Ribosome" low on the trunk, and "Macroscale APM" high on the trunk, with "Dollar-per-kilogram fab" among the leaves. People tend to assume that things high in the tree are proposals for next year, "which would be absurd."

The Roadmap talks about cross-linked organic structures. An idea that arose pretty late is mixed covalent-ionic bonding. Titanium dioxide, quartz. This may be closer than what's been looked at more closely.

The role of roadmapping: Developing the knowledge and confidence necessary for coordinated system development. So the Productive Nanosystems roadmap should show what's necessary, when, how to coordinate and schedule developments. Avoid chicken-and-egg problems that lead to slow incremental progress.

DNA currently costs dollars per milligram. There's no point in thinking about kilogram-scale structures... but there's a researcher who has an idea for making DNA at dollars per kilogram... but why should he do it when there's no market for kilograms of DNA? This is a real example: it seems that DNA might actually get vastly cheaper.

Productive Nanosystem Zyvex talk

Chris Phoenix, CRN, is live blogging the event. John Randall, Zyvex: A completely different approach. Zyvex was founded to create atomically precise manufacturing on the way to productive nanosystems. In other words, building precise structures using big machines rather than nanoscale tools.

Atomic layer deposition builds amorphous materials; atomic layer epitaxy (ALE) builds crystalline materials. Start with a protected (passivated) surface: every available bond has a hydrogen atom. If you deprotect the surface, removing the hydrogen, then you can deposit a layer of atoms. If you choose the right precursor gas, you add only one monolayer which is protected as it's added. Then you can deprotect and add exactly one more layer of atoms. There are a number of precursor gases available. There are literally hundreds of systems to grow things with atomic precision in one dimension.

If you combine this with the ability to deprotect the surface in selected locations... With a scanning tunneling microscope, you can remove single hydrogen atoms with atomic precision. Several groups have demonstrated this. This is "the limit of a thin resist" - a monolayer of hydrogen.
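
As a rough illustration of the patterned deprotect-then-deposit cycle described above (a toy sketch of the general idea, not Zyvex's actual process or software), the loop below grows an atomically registered pad on a hydrogen-passivated lattice, one self-limiting monolayer at a time:

```python
# Toy simulation of patterned atomic layer epitaxy: a hydrogen-passivated
# surface, tip-based deprotection of selected sites, then a self-limiting
# deposition step that adds exactly one layer only where hydrogen was removed.
import numpy as np

GRID = 8                                            # 8 x 8 lattice of surface sites (illustrative)
height = np.zeros((GRID, GRID), dtype=int)          # layers grown at each site
passivated = np.ones((GRID, GRID), dtype=bool)      # True = H-terminated (protected)

def deprotect(sites):
    """The 'STM tip' removes hydrogen only at the chosen (row, col) sites."""
    for r, c in sites:
        passivated[r, c] = False

def expose_to_precursor():
    """Self-limiting deposition: one new layer wherever sites are bare;
    the freshly added monolayer comes in already passivated."""
    bare = ~passivated
    height[bare] += 1
    passivated[bare] = True

# Grow a 2-layer-high square pad in the middle of the surface.
pad = [(r, c) for r in range(3, 5) for c in range(3, 5)]
for _ in range(2):
    deprotect(pad)
    expose_to_precursor()

print(height)   # the 2x2 pad shows height 2, everything else stays 0
```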

Differences from mechanosynthesis:
1) Building blocks don't have to be captured by the tool tip.
2) The tool tip can be used to inspect both deprotection and assembly.
3) You can do large areas (fast) or atomic resolution, depending on mode.
4) This is a very general technique.
5) All you need is an atomic-resolution STM tip - don't need anything else with atomic resolution.

You need an atomically precise, invariant tip. ALIS has built such a tip. A reproducible atomic structure at the end of a tungsten wire.

They're trying to develop a dual-material process, silicon and germanium, so that you can make releasable structures. (They think they can deal with lattice mismatch.)

One possible product is a nano-imprint template. They expect atomically precise tools to be the most valuable product. They expect to enable productive nanosystem factories.

Question: Hydrogen migrates at normal temperatures. Is that compatible with the deposition technologies? A: We believe (after careful study) that the hydrogen is stable on a silicon surface, up to 200-300 degrees C. We think we can get epitaxy to work in that window. Cryogenic temperatures are not necessary. You do get motion on a single dimer, but no long-range motion.

Productive Nanosystems conference first two talks

The Foresight Productive Nanosystems conference has started.

Chris Phoenix at the Center for Responsible Nanotechnology is liveblogging the event.

Here is his introductory article about the conference

The first speaker is Alex Kawczak, VP, Nanotechnology & BioProducts, Battelle, who talks about some aspects of the Technology Roadmap for Productive Nanosystems

There are several Atomically Precise things in the Roadmap: Manufacturing, Atomically Precise Productive Nanosystems (APPN), Atomically Precise Technologies. Now he's talking about the nanotech market as a whole ($1 trillion by 2015), most of which is not atomically precise. He says atomic precision can improve nanotech.

Atomically Precise Structures are a definite arrangement of atoms. Self-assembled DNA, engineered proteins, nanotube segments, etc. But atomically precise technology will increase scale and complexity.

Atomically Precise Manufacturing (APM) lets you build atomically precise structures under programmable control.

Atomically Precise Productive Nanosystems are functional nanosystems that implement APM. This is nano-building-nano - the high-impact stuff.

So this sounds like the roadmap defines a spectrum of AP technologies, working from self-assembly of engineered AP structures, up to nano building nano.

Two strategies in the roadmap: 1) Develop AP technologies for energy; 2) Develop AP technologies for medicine. Hm, no emphasis on productive nanosystems in that slide.

They're hoping that the Roadmap will help a broad range of industries to develop nano capabilities. They want to develop a broad technology base for APT, apply this to develop APM, APPNs, and spinoff APT applications. They want to "treat atomic precision as an essential criterion for research." So the roadmap encompasses self-assembly as well as APPN.

The roadmap recommends hybrid manufacturing technology approaches at several points.

So it sounds like the Roadmap does talk, at least some, about molecular manufacturing, which they call APPN. This could be a very interesting conference. And it looks like the Roadmap does explicitly endorse molecular manufacturing.

Post-talk comment from Jim Von Ehr (today's moderator): Comparison to semiconductor roadmap: That was developed after they'd been going for a while. Our roadmap is developed in advance, so it's a bit speculative; you'll be amazed at how many different things were pulled together.


Chris Schafmeister talked about Productive Nanosystems: Abiotic Biomimetic Roadmap

Productive nanosystem definition: "A closed loop of nanoscale components that make nanoscale components."

Schafmeister has built 14 building blocks - some of them, they can make tens of grams at a time. They've built one with a functional group and they're working on other functional groups - some not found in natural amino acids.

They attach a building block to a plastic bead, then add other building blocks one at a time. This is not self-assembly: it is programmed assembly. They want to build molecules containing 20-50 blocks. That's a lot of reaction steps! Once they've built a chain, they double-link it, making it rigid. They've synthesized over 100 molecules; most are very water-soluble; the most building blocks so far is 18.

He wants to "create many artificial catalysts that approach the capabilities of enzymes." No one has made an enzyme yet - he wants to make thousands of them, engineered. He wants to make 60,000 enzymes as rapidly as he can write 60,000 lines of code. This may be achievable because enzymes carry out catalysis (accelerating chemical reactions) by changing the mechanism of the reaction. It does this via functional groups arrayed around the substrate. "If we can position multiple functional groups in three-dimensional space in all the right places," then we may be able to implement enzymes. So if functional groups (found in databases) were positioned in space correctly, you'd have the enzyme


Question: How long do the chemical operations take? A: Seconds, maybe minutes. Not hours. Right now, we do one per hour (10^17 molecular copies).

The 2010 Blimp plane

Hat tip to Al Fin for spotting a very interesting blimp-plane hybrid

The luxury blimp-plane hybrid, Aeroscraft ML866

While 70% of the lift comes from helium buoyancy, the remaining 30% is derived from its innovative “wing” shape. As well as being able to hover, the aircraft will be capable of speeds up to 138 mph (222 km/h) and will operate at altitudes of up to 12,000 ft (3,657 m). The massive 210 ft (64 m) long by 118 ft (36 m) wide by 56 ft (17 m) high structure will deliver a roomy 5,000+ square feet of cabin space.
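
For context on the 70% buoyant-lift figure, here is a rough sketch (my own estimate with clearly labeled assumptions, not Aeros figures) of the sea-level helium lift for an envelope of roughly the quoted dimensions:

```python
# Back-of-the-envelope static lift estimate for the quoted hull dimensions.
# The hull fill fraction is an assumption, not a published specification.

AIR_DENSITY = 1.225      # kg/m^3 at sea level
HELIUM_DENSITY = 0.179   # kg/m^3 at sea level

box_volume = 64 * 36 * 17            # bounding box from the quoted dimensions, ~39,000 m^3
HULL_FILL_FRACTION = 0.5             # assumption: hull encloses about half the box
helium_volume = box_volume * HULL_FILL_FRACTION

static_lift_kg = helium_volume * (AIR_DENSITY - HELIUM_DENSITY)
total_lift_kg = static_lift_kg / 0.70    # if buoyancy supplies ~70% of total lift

print(f"Helium volume: {helium_volume:,.0f} m^3")
print(f"Static lift:   {static_lift_kg / 1000:.0f} tonnes")
print(f"Implied total: {total_lift_kg / 1000:.0f} tonnes (buoyant + aerodynamic)")
```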

Aeros displayed a 1/48th scale model at this year's NBAA show and hopes to begin airframe static testing of the rigid composite structure within months, with flight testing at San Bernardino International Airport to follow as early as 2010. An additional series of commercially focused Aeroscraft is also on the drawing board and will be scaled to payloads of up to 60 tons.

No exact pricing details are available as yet but reports suggest the tag will be under $40 million.


Blimp plane executive floor plan

Blimp plane commercial floor plan

60-ton-payload blimp cargo plane

Blimp plane buoyancy control

Blimp plane strong, lightweight structure

FURTHER READING
Aeros is a world leading lighter-than-air, FAA-certified aircraft manufacturing company.

UPDATE:
I looked more closely at the site and they have some interesting innovations: a composite structure for more strength and less weight, and an interesting device for dynamic control of buoyancy.

It seems later versions of this type of craft would be helped by:
1. Even lighter and stronger materials (carbon nanotubes, etc.)
2. Cheap thin-film solar for the power systems
3. Wing lift designed to take advantage of wing-in-ground-effect (WIG) lift


The Russian Ekranoplan, a WIG craft, could lift over 100 tons of cargo

WIG planes/boats need to be big to get the most efficiency. The height off the ground at which the craft still gets the extra lift is determined by the size of the wing. Since this craft is also large, it seems well suited.


The Boeing Pelican is a WIG concept. Boeing claimed that the Pelican would be capable of transporting 750 tons over 10,000 nm (18,530 km) when cruising in ground effect, but could carry the same load only 6,500 nm (12,045 km) when out of ground effect. The 500 ft (153 m) span vehicle would carry up to 2,800,000 lb (1,270,060 kg) of cargo while cruising as low as 20 ft (6 m) over water or up to 20,000 ft (6,100 m) over land. Unlike the Soviet concepts, the Pelican would not operate from water, but from conventional runways using a series of 76 wheels as landing gear.

The really big WIG vehicles designed to haul 5,000 tons would probably swamp the blimp lifting effect, but vehicles with 60-2,000 tons of lift seem like they would benefit from combining blimp lift, wing lift and wing-in-ground effect.

October 08, 2007

A billion millionaires in 2025? Not likely

James Canton, in his book "The Extreme Future", has predicted that there will be a billion millionaires in 2025.



On page 8 of this 16-page pdf, we see that there are projected to be 8 billion people in 2025. A billion millionaires would mean that the person at the 12.5th percentile (from the top) of world wealth would be a millionaire.

The United States is the richest country; in 2000 the mean wealth was $144,000 per person, BUT the median wealth in 2000 was $55,000. So the top 100 million adults in the United States each had at least $55,000 in 2000 (there were about 200 million adults in the USA in 2000, so the top 50th percentile covers 100 million people). Wealth here means net worth: the value of physical and financial assets less debts.

CORRECTION:
Woops. Did this post a little fast.
The world numbers were correct in terms of how the top 10% and 1% are doing. Thanks for the catch.

Median household net worth increased from $49,932 in 1998 to $55,000 in 2000.

Therefore, the 50th-percentile American, at about $55,000, is very close to but a little behind the world top-10% threshold of roughly $61,000.

How we feel about the wealth is separate from the statistical question of millionaires.


This graph shows the changes in net worth from 1989 to 2004 in the United States. The poorest 75% have been mostly stagnant in their growth of net worth. The top 25% have close to doubled their net worth. Following a similar pattern forward to 2025, the top 15-25% of the United States would be millionaires. This is in constant dollars. If one used future dollars, which could have devalued, then the top 50-75% could be millionaires in those devalued dollars; I think counting in future, probably devalued, dollars is pointless.

This is a pdf that examines wealth in the world

To be in the top 10% in wealth in the world (in 2000) required $61,000 in assets, and more than $500,000 was needed to belong to the richest 1%, a group with 37 million members worldwide.

Notice that someone in the middle of the USA is almost in the top 10% of the world; an American at the 40th percentile (from the top) was in the world's top 10%. The cutoff for the world's top 12.5% would be somewhat lower, but very close.


Here is a distribution of where the wealthiest 10% were in the world in 2000.

Using currency exchange rates, global household wealth amounted to $125 trillion in the year 2000, equivalent to roughly three times the value of total global production (GDP) or to $20,500 per person.

So for the top 12.5% in the world to become millionaires, the non-US population of the world would have to accumulate wealth far faster than the people in the US who are in the 40th to 50th percentile. This is conceivable, as large numbers of people in China will become affluent. However, I do not think it is possible to the degree needed to reach one billion millionaires in 2025.

If the world followed the US wealth distribution changes, and those changes matched the 1989-2004 period, then the US would still hold 25% of the world's wealthiest 10%, and only about a third of those Americans would be millionaires. That puts millionaires at roughly the top 3.3% of the world, or about 264 million people in 2025. World GDP has been growing at 5% per year, while the USA is growing at 2.7% per year. Assuming this trend continues, the average person in the non-USA portion would gain 40-50% on the person in the United States. We can be generous and say that instead of the world's 10th-percentile person having less than half of the US mean of $144,000, they catch up. But the average person in the United States is not projected to be a millionaire; it is the top 25% in the USA. So only the top 5% of the world would qualify, which would be about 400 million people.

For the person at the 12.5th percentile to be a millionaire, the next 600 million people would have to end up with 4 to 6 times more wealth. People would have to accumulate wealth 8-11% per year faster than they have been. Someone who had $50,000 in net worth would have to increase their net worth by roughly 17% per year.

The top 12.5% would probably have about 80% of the world's net worth. The top 10% in the world hold 71% of the wealth, and the top 2% hold 50%, so the group between the 2nd and 12.5th percentiles would hold about 30% of the world's wealth. One billion millionaires would mean that the world's net worth had increased to over $3,000 trillion in order for the poorest of the top 12.5% to have over one million dollars in net worth. This would be a 24-fold increase from wealth in the year 2000, an average of about 20% per year growth in world wealth from 2008 onwards.
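
A quick sanity check of the compound growth rates used above (my own arithmetic; the exact percentages depend on how many years of compounding you assume, so they come out slightly higher than the rough figures quoted):

```python
# Compound growth rates implied by the "billion millionaires by 2025" claim.

def required_annual_growth(start_value, end_value, years):
    """Constant annual growth rate needed to go from start_value to end_value."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

YEARS = 2025 - 2008   # assume 17 years of compounding, starting from 2008

# Someone near the world's 12.5th percentile (~$50,000) reaching $1 million:
r_household = required_annual_growth(50_000, 1_000_000, YEARS)

# World net worth going from ~$125 trillion (year 2000) to ~$3,000 trillion, ~24x:
r_world = required_annual_growth(125, 3_000, YEARS)

# Relative catch-up: world GDP growing 5%/yr vs the US at 2.7%/yr over ~18 years:
catch_up = (1.05 / 1.027) ** 18 - 1.0

print(f"Household $50k -> $1M:       {r_household:.1%} per year")   # ~19% per year
print(f"World wealth 24x by 2025:    {r_world:.1%} per year")       # ~21% per year
print(f"Non-US relative gain vs US:  {catch_up:.0%}")               # ~49%, i.e. the 40-50% above
```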

So I would optimistically project 400-500 million millionaires in 2025, and a more conservative estimate would be 120 to 250 million millionaires in the equivalent of today's US dollars. A billion millionaires in 2025 will not happen unless you are counting in future devalued dollar-pesos, the global economy starts growing at 20-30% or more per year starting around 2015, or there is a flattening of the wealth distribution within the top 15%.

Synthetic vascular system progress towards growth of engineered tissue for transplants

One day soon, laboratories may grow synthetically engineered tissues such as muscle or cartilage needed for transplants. In a major step forward, Cornell engineers describe in the journal Nature Materials a microvascular system they have developed that can nourish growing tissues.

The researchers have engineered tiny channels within a water-based gel that mimic a vascular system at the cellular scale and can supply oxygen, essential nutrients and growth factors to feed individual cells. The so-called gel scaffold can hold tens of millions of living cells per milliliter in a 3-D arrangement, such as in the shape of a knee meniscus, to create a template for tissue to form.

In theory, the system could accommodate many kinds of tissue.

The research provides solutions to the physical engineering aspects of growing tissues synthetically. Still, many biological challenges remain, such as finding a source of cells that can be harvested from a patient and grown without changing the cell's characteristics. Co-author Lawrence Bonassar, a Cornell associate professor of biomedical engineering who was instrumental in developing the gel for tissue growth and in determining the proper biological requirements for cell growth, is also among those trying to direct stem cells to produce desired tissue types. Currently, stem cell-derived cartilage has been made but is not functional.

As new tools develop, researchers hope to use these engineered tissues in non-clinical applications, such as replacements for animals in the testing of pharmaceuticals and chemicals. The technology, researchers believe, also offers the hope of growing implants from the patient's own cells to replace damaged or diseased tissue.

Petaflop plans progress

The Blue Gene/P machine at Argonne is supposed to reach one petaflop — 1 quadrillion sustained operations per second — in 2008. It should have a peak speed of three petaflops by the end of 2008.

Turek said IBM's goal was 10 petaflops by 2011 and 20 petaflops by 2017. The Japanese have announced their intent to reach 10 petaflops by 2012.

FURTHER READING
IBM and Google have dedicated a large cluster of several hundred computers (a combination of Google machines and IBM BladeCenter and System x servers) that is planned to grow to more than 1,600 processors.
Students will access the cluster via the Internet to test their parallel programming course projects. The servers will run open source software including the Linux operating system, XEN systems virtualization and Apache's Hadoop project, an open source implementation of Google's published computing infrastructure, specifically MapReduce and the Google File System (GFS).
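
For readers unfamiliar with the programming model the students will be using, here is a minimal single-process sketch of MapReduce-style word counting (an illustration of the model only, not Hadoop's API):

```python
# Toy MapReduce: map emits key/value pairs, the framework groups them by key
# (the "shuffle"), and reduce combines each group into a final result.
from collections import defaultdict

def map_phase(document):
    # Emit (word, 1) for every word in one input document.
    for word in document.split():
        yield word.lower(), 1

def reduce_phase(word, counts):
    # Combine all values emitted for one key.
    return word, sum(counts)

def mapreduce(documents):
    grouped = defaultdict(list)              # shuffle step: group values by key
    for doc in documents:
        for key, value in map_phase(doc):
            grouped[key].append(value)
    return dict(reduce_phase(k, v) for k, v in grouped.items())

docs = ["the quick brown fox", "the lazy dog", "the fox jumps"]
print(mapreduce(docs))   # {'the': 3, 'fox': 2, 'quick': 1, ...}
```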


Sony PS3s have helped the Folding@Home project pass a petaflop of processing power during September 2007

NTT DoCoMo's Super 3G and 4G plans

IEEE Spectrum reports on a live NTT test of Super 3G.

DoCoMo has taken on something of a leading role in promoting Super 3G inside the 3rd Generation Partnership Project (3GPP), a consortium of wireless operators and vendors, including Vodafone, Lucent, Motorola, and Nokia, working to create global specifications for 3G technologies. Essentially, Super 3G is targeting a usable download transmission speed of around 100 Mbps, with an uplink speed of 50 Mbps, and reaching much higher peak speeds in both cases.

At CEATEC, DoCoMo gave its first public demonstration of the technology, which is still under development and is not expected to be deployed until 2010.


Prior press releases in July 2007 indicated commercial rollout plans targeting 2010.

Super 3G represents a break with the current 3G infrastructure, so it will require a new round of heavy investment before it can be deployed. That’s the bad news. The good news is that the same infrastructure can be used for future 4G systems. As a DoCoMo staffer said, “We see Super 3G as being a bridge to 4G.” And one that Japanese users no doubt will be the first to step across.


Super 3G holds the promise of allowing download speeds as fast as 300Mbps and upload speeds of 80Mbps. It uses the same radio spectrum band as current 3G services. NTT DoCoMo plans to launch a network based on the technology by 2010.

The technology could be seen as a steppingstone between current 3G technologies like W-CDMA and 1xEV-DO and 4G technologies such as UMB and WiMAX.

NTT DoCoMo is even looking past Super 3G towards those 4G technologies. In fact, late last year the carrier was successful in getting about 5Gbps data speeds to a receiver moving at about 10 kilometers per hour (6 mph).


“This [4G technology] is a technology that is probably not going to [be widely implemented] until 2013.”

Transcript of my talk Economics in a new Era is up

Sir Arthur C. Clarke speaks about Sputnik and technology

Sir Arthur C. Clarke makes several interesting observations about the past and the future.

On the past:

Launching Sputnik and landing humans on the Moon were all political decisions, not scientific ones, although scientists and engineers played a lead role in implementing those decisions. (I have only recently learned, from his long-time secretary Carol Rosin, that Wernher von Braun used my 1952 book, The Exploration of Space, to convince President Kennedy that it was possible to go to the Moon.) As William Sims Bainbridge pointed out in his 1976 book, The Spaceflight Revolution: A Sociological Study, space travel is a technological mutation that should not really have arrived until the 21st century. But thanks to the ambition and genius of von Braun and Sergei Korolev, and their influence upon individuals as disparate as Kennedy and Khrushchev, the Moon—like the South Pole—was reached half a century ahead of time.

I hope that nations can at last see better reasons for exploring space, and that future decisions would be informed by intelligence and reason, not the macho-nationalism that fuelled the early Space Race.


For those who need some background on his next quote:
And in the heady days of Apollo, we seemed to be on the verge of exploring the planets through manned missions. I could be forgiven for failing to anticipate all the distractions of the 1970s that wrecked our optimistic projections—though I did caution that the Solar System could be lost in the paddy fields of Vietnam. (It almost was.)


One of the reasons the space program lost all momentum after Apollo was that the US budget was strained paying for the Vietnam war, which in hindsight can clearly be seen as a waste of money. Time magazine discusses the over $100 billion per year being spent in 1971.

NASA spending was far less:

In 1971, NASA's budget was $3.381 billion in 1971 dollars, or $12.356 billion inflation-adjusted.


Arthur C. Clarke's three wishes for the future:

1. A method to generate limitless quantities of clean energy.

2. Affordable and reliable means of space transport.

3. Eliminating the design faults in the human body


[Note: I interpret item 3 as a weakly transhumanist statement]

Bub1 Gene could be used to stop cancer growth

A protein in our cells called ‘Bub 1’ is essential for normal cell division to take place; if the gene that generates Bub 1 is ‘switched off’ then the cells are unable to divide successfully.

Now that scientists understand the precise role of Bub 1 in normal cell division, as well as what goes wrong when the gene is missing, they plan to test their theory on cancer cells.

“Unlike some other genes that become mutated in cancer cells, the Bub 1 gene appears normal indicating that it behaves in exactly the same way in cancer cells as it does in healthy cells.

“If this is the case, then we can be confident that switching it off will stop cancer cells proliferating too. And while our normal cells don’t divide that often, cancer cells divide more frequently, so hopefully by targeting Bub1 we will selectively kill cancer cells.”

Equally exciting, says Dr Taylor, is the fact that drugs are already being developed that are able to block the actions of Bub 1-type enzymes, known as ‘protein kinases’; such kinase blockers or ‘inhibitors’ are already providing a whole new approach to tackling cancer and Bub1 inhibitors may be another weapon in the oncologist's arsenal.

October 07, 2007

Synthetic chromosome created, first synthetic life within weeks

The Guardian indicates Craig Venter's team has built a synthetic chromosome out of laboratory chemicals and is poised to announce the creation of the first new artificial life form on Earth.

The announcement, which is expected within weeks and could come as early as Monday at the annual meeting of his scientific institute in San Diego, California, will herald a giant leap forward in the development of designer genomes.
