
June 15, 2007

Cheaper fabbers and 3D scanners

A desktop fabrication factory for $5000


The Desktop Factory 3D printer, which has a list price of $4,995, uses an inexpensive halogen light source and drum printing technology to build robust parts layer by layer from composite plastic powder. The cost of the build material is expected to be about $1 per cubic inch. The maximum build volume of the initial product will be 5 x 5 x 5 inches. The thickness of each layer is 0.010 inch.

NextEngine has a $2,495 3D scanner with 0.005-inch accuracy


3D scanner

3D Systems will have a $9,900 V-Flash desktop modeler.

Hat tip to Karl Schroeder

See my other articles on fabbers, rapid prototyping and rapid manufacturing

60 GHz wireless at one-tenth the cost

60 GHz millimeter-wave communication at one-tenth the cost with a new Toshiba Corp. 60 GHz receiver chip built in CMOS instead of GaAs

The 60 GHz system provides a total data rate of 312 Mbps (the down and up links are 155.52 Mbps each).

Current prices seem to be in the $10,000-15,000 range, so if the full system price can be brought down to 10% of that, it would be $1,000-1,500. But other components may not see the same price reduction.

Here is a pdf that describes applications using 60 GHz wireless communication

For many applications, 60 GHz radios have become the technology of choice, based on being license-free, highly immune to interference, and easy to install. The achievable distance is the main limitation of 60 GHz radios. Based on the geographic area of deployment and link availability requirements, 60 GHz radios can be confidently deployed at maximum distances ranging from 400 to 1000 meters.

License-Free Spectrum. The FCC allocated an unprecedented 7 GHz of unchannelized spectrum for license-free operation between 57-64 GHz. This compares to only about 500 MHz of spectrum allocated between 2-6 GHz for WiFi and other license-free applications. For the first time, sufficient spectrum was allocated to make multi-gigabit RF links possible.

Narrow-Beam Antennas. A 10-inch dish antenna can achieve 40 dBi of gain with a half-power beamwidth of 1.4 degrees. A corresponding 5.8 GHz antenna would have a beamwidth ten times larger. This narrow beamwidth allows multiple 60 GHz radios to be installed on the same rooftop or mast, even if they are all operating at the same transmit and receive frequencies.
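A quick sanity check of those antenna figures using the standard dish approximations (gain roughly equal to efficiency times (pi*D/lambda) squared, half-power beamwidth roughly 70 degrees times lambda/D). The 55% aperture efficiency below is an assumed textbook value, not a number from the pdf:

import math

c = 3e8                      # speed of light, m/s
D = 10 * 0.0254              # 10-inch dish diameter in meters
efficiency = 0.55            # assumed typical aperture efficiency

for f in (60e9, 5.8e9):
    wavelength = c / f
    gain = efficiency * (math.pi * D / wavelength) ** 2
    gain_dbi = 10 * math.log10(gain)
    beamwidth = 70 * wavelength / D      # degrees, rule-of-thumb formula
    print(f"{f/1e9:>5.1f} GHz: {gain_dbi:.0f} dBi, {beamwidth:.1f} deg beamwidth")

# 60 GHz: roughly 41 dBi and ~1.4 degrees; 5.8 GHz: roughly 21 dBi and ~14 degrees,
# i.e. the 5.8 GHz beam is about ten times wider, as the quote states.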

Oxygen Absorption. This is a unique property that does not affect lower-frequency radios. Oxygen attenuates 60 GHz signals by 12-16 dB per kilometer (i.e., half of the energy is absorbed for every 200 meters the signal travels), which is the main reason that 60 GHz links cannot cover the distances achieved by other millimeter-wave links. The impact of the small beam sizes coupled with oxygen absorption makes the links highly immune to interference from other 60 GHz radios, since another link in the immediate vicinity will not interfere if its path is even moderately different from the first link, and any radio operating beyond the immediate vicinity (even on the exact same trajectory) will have its signal severely attenuated by the oxygen absorption. These same two factors make the signal highly secure: in order to intercept the signal, one would have to locate a receiver lined up on the exact same trajectory, and in the immediate locale of the targeted transmitter.
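The half-power-every-200-meters figure follows directly from the quoted attenuation; a minimal check:

attenuation_db_per_km = 15                 # middle of the quoted 12-16 dB/km range
loss_db_200m = attenuation_db_per_km * 0.2
remaining_fraction = 10 ** (-loss_db_200m / 10)
print(loss_db_200m, remaining_fraction)    # 3.0 dB -> ~0.50, i.e. half the power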

Rainfall Limitations. Like all radio links that operate above 10 GHz, intense rainfall significantly limits the distance over which 60 GHz links can transmit data error-free.

60 GHz links are limited by their performance during periods of heavy rain. FSO links are limited by their performance during periods of heavy fog. Because heavy rain and heavy fog do not occur at the same time, a hybrid link that has both a 60 GHz link and an FSO link can operate, in clear weather, at the maximum distance of the shorter of the two links. It is possible to create a very high availability, full-time gigabit-speed link at up to one kilometer using this type of hybrid link. Customers also benefit from the hardware redundancy, providing partial protection if one of the two links experiences a hardware failure. The only downside to this type of link is cost, as gigabit FSO links are more expensive than gigabit 60 GHz links.

For the more cost-sensitive customer, another hybrid alternative is to pair a gigabit 60 GHz link with a lower-capacity 5 GHz link. Since the 5 GHz link is immune to rainfall, it can be used as a lower-speed fallback for periods when the 60 GHz link is impaired by heavy rainfall. If a customer is able to tolerate the lower-speed link performance 1% of the time, then the 60 GHz link distance can be set at the 99% availability distance, which from the distance chart in the pdf can be up to 900 meters. For a small premium over the cost of the gigabit link, using a $3,000, 24 Mbps 5 GHz link as a fallback can provide a customer full gigabit performance 99% of the time and 24 Mbps 1% of the time.
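As a rough illustration of what that 99%/1% split means for average throughput (treating the gigabit link as a nominal 1,000 Mbps is my assumption; the pdf only calls it a gigabit link):

gigabit_rate = 1000   # Mbps, nominal rate of the 60 GHz link (assumed)
fallback_rate = 24    # Mbps, the 5 GHz fallback link
availability = 0.99   # fraction of time the 60 GHz link meets its distance spec

average_rate = availability * gigabit_rate + (1 - availability) * fallback_rate
print(average_rate)   # ~990 Mbps averaged over time, never dropping below 24 Mbps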


300 GHz millimeter-wave chips are also used for security scanning through clothing

Tri Alpha Energy raises $40 million in venture capital for nuclear fusion

Tri Alpha, which licenses intellectual property from the University of California, Irvine, is developing what’s called “hot,” or plasma, fusion, in which atomic nuclei are fused, releasing huge amounts of energy, according to Ray Rothrock of Venrock, one of the company’s backers.

I have dug up the details on this well-funded private fusion project, as well as a related project with somewhat similar goals in terms of the size of the intended reactors. I remember the old Migma project being described in OMNI magazine. The goal was cheap, relatively small, stackable fusion reactors.

btw: How about some Digg and Reddit love for finding scattered details on this cool, stealth-mode and potentially high-impact project?




Some background information from Wikipedia on aneutronic fusion

Aneutronic fusion is any form of fusion power where no more than 1% of the total energy released is carried by neutrons. Since the most-studied fusion reactions release up to 80% of their energy in neutrons, successful aneutronic fusion would greatly reduce problems associated with neutron radiation such as ionizing damage, neutron activation, and requirements for biological shielding, remote handling, and safety issues. Some proponents also see a potential for dramatic cost reductions by converting the energy of the charged fusion products directly to electricity. The conditions required to harness aneutronic fusion are much more extreme than those required for the conventional deuterium-tritium (DT) fuel cycle, and even these conditions have not yet been produced experimentally. Even if aneutronic fusion is one day shown to be scientifically feasible, it is still speculative whether power production could be made economical.


The proton-boron reaction is discussed at Wikipedia

Link to one of the most recent patents, Controlled Fusion in a field reversed configuration and direct energy conversion. The patent describes a 100 MW configuration.

A list of the 14 patents related to colliding beam fusion

Unlike many government-sponsored efforts, however, Tri Alpha is working with fusion reactions that produce fewer neutrons and, thus, less radiation, Mr. Rothrock says. The company also uses a different method for containing and controlling fusion reactions, which happen at million-degree temperatures. “It’s a long way from reality, but the trend line is going in the right direction,” he says. “The science is rock-solid; the calculations continue to bear out the results.”

Mr. Prouty estimates it will take his company “not 15 to 20 [years], but not 3 to 5 either” to go from the research stage to power generation.

A few years ago, Venrock first started investing in Tri Alpha, he says. The firm later reportedly convinced Goldman Sachs, Vulcan Capital, Enel Produzione, and PIZ Signal to join as backers. But neither Mr. Prouty nor Mr. Rothrock would say when the company was founded, or how much total funding it has. Its original funders are believed to include billionaire Paul Allen.

Ballpark estimates put worldwide private investment in fusion research at about $1 billion over the last 50 years. Of that, about $100 million currently funds cold fusion research, and less than $15 million has been invested in all fusion projects in the Valley since the 1980s.


From May 2005, we have some information on the technical details of Tri Alpha Energy

TriAlpha is the brainchild of Norman Rostoker, a senior fusion researcher. He had previously collaborated with another researcher, Maglich, on the MIGMA approach to advanced fuels. This approach involved shooting two counter-circulating beams of ions at each other in a confining magnetic field. It was not very workable, as the ion densities would always be very low. Rostoker combined this idea with another device, the Field Reversed Configuration (FRC), sending the beams into the FRC.

The FRC is essentially a large-scale plasmoid, centimeters rather than microns across, with much lower densities and magnetic fields than the DPF. It does not benefit from the magnetic field effect, as its fields are far too low. Scientifically, TriAlpha's results so far are very modest compared with focus fusion's. The average ion energy, a measure of plasma temperature, is a few tens of eV. This is a factor of 10,000 short of what is required for pB11 fusion. Of course, we have already achieved the needed ion energies (100 keV) with focus fusion, so in this sense we are way ahead. In addition, it is by no means guaranteed that their confinement will remain stable if they can reach higher temperatures.

However, TriAlpha has been impressive at raising funds. So far they have raised nearly 12 million dollars [from 2000 to 2005], mostly from two billionaires.


Here is a pdf of the 2005 Florida Physics News, which on page 9 describes colliding beam fusion and indicates they are working with Tri Alpha Energy


Here is a diagram of the kind of reactor they are looking to make and its size relative to a person. One unit is about the size of a small bus.


Here is an image of the magnetic fields and plasma

Dr. Hendrik Monkhorst of the Quantum Theory Project and his collaborator, Dr. Norman Rostoker of UC Irvine, designed a novel type of fusion reactor called the Colliding Beam Fusion Reactor (CBFR).

The CBFR, in a field-reversed configuration, has a cylindrical shape and rotates at a high rate about its axis inside a solenoidal magnet, thus producing a magnetic field that closes upon itself: a kind of self-confinement of the fuel nuclei, with all confined particles flowing in the same direction. The protons rotate at a high rate, with an energy of about 1 MeV, while the boron-11 nuclei are slower, which causes the protons to literally 'rear-end' the boron-11 at an energy where the fusion cross-section is highest. The collaborators found that plasma parameters could be set such that essentially all injected protons and boron-11 fuse to helium-4 nuclei, which are guided into Direct Energy Converter (DEC) devices. These devices turn their kinetic energy directly into electricity, unlike previous techniques in which water was boiled to produce steam that drove turbines. The resulting advantages include an abundant fuel supply, nearly no radioactivity, no danger of runaway reactions or explosions, scalability of size and output power, and easier engineering and maintainability. A multi-faceted study is currently underway to establish the full feasibility of the design, with much of the calculation, theory development and nuclear polarization work (to enhance the fusion reactivity) centered in the UF Physics Department.

Experiments and engineering studies are being conducted at UC Irvine and at the site of the start-up company, TriAlpha Energy, Inc., licensed by UCI and UF, located off-campus near Foothill Ranch. There are currently nine patents that describe the CBFR and various embodiments of the direct energy converter, two of them detailed in the 2004 Physics Alumni Newsletter.


Focus Fusion is a competing fusion project.

In July 2006, Lawrenceville Plasma Physics Inc (LPP) and the Chilean Nuclear Energy Commission expected to begin an ambitious three-year experimental program using the Speed-2 plasma focus device that will determine the scientific feasibility of the focus fusion approach. The Chilean government will be funding approximately $1,000,000 of the total $1,700,000 cost of the project.


LPP provides technical background on their effort

LPP provides a (biased) comparison of their technology against other technologies

LPP hopes to make 5MW reactors for $300,000 each

Here is a page with diagrams of what they are planning with their 5 megawatt reactor

Further Reading on energy related topics:
A list of several of my Thorium nuclear fission reactor articles

My summary of the costs of other energy sources such as solar, wind etc...

Other nuclear fusion articles including Z-pinch and Bussard fusion



UBS GDP forecast for 2025 and 2050

I have put out my own detailed analysis of China's economy passing the United States on an exchange rate basis by 2020

Andreas Hoefert, chief global economist at UBS has provided a long range projection for 2025 and 2050 using purchasing power parity (PPP) and a formula that takes into account such things as capital and labour growth.

Using this technique, the United States is today's largest economy, followed by the European Union, China, Japan, India and Germany. [Wikipedia has lists of countries by PPP GDP.] They also have a list with estimates for the current year.

But things get more interesting in 2025, when China takes the No. 1 position, followed by the United States, the EU, India, Japan, Brazil and Indonesia. By 2050, India moves into the No. 2 position, behind China. As well, countries such as Pakistan, Mexico and Bangladesh crack the top 12.

As fast as China is expected to grow by 2050, he noted that India is actually expected to grow even faster. Vietnam, which is not expected to make the top 15 largest economies by 2050, also has an excellent growth story to tell, given its young, educated workforce and stable political environment.

Australian hypersonic scramjet goes over Mach 10

Australia has tested a hypersonic scramjet that exceeded Mach 10, traveling at about 6,850 mph.

Scramjet principles are described here at Wikipedia

Scramjet programs are listed and described here

The fastest US test was the third X-43 flight. It set a then-record speed of 6,600 mph (10,620 km/h) [about Mach 9.6] on 16 November 2004. It was boosted by a modified Pegasus rocket which was launched from a Boeing B-52 at 13,157 meters (43,166 ft).

The NASA Langley, Marshall, and Glenn Centers are now all heavily engaged in hypersonic propulsion studies. The Glenn Center is taking leadership on a Mach 4 turbine engine of interest to the USAF.


The USA is working on the X-51A (a hypersonic missile) for hypersonic flight in the Mach 4.5 to 6.5 range, with tests scheduled for 2009.

June 14, 2007

Bloomberg brings efficiency to New York; other cities should copy

Bloomberg brings efficiency to New York City Inc

Europe's demographic future - two Europes

The Economist takes a close look at changing demographic trends and a rebound in birth rates in France, England and some other countries

France is not unique. Official forecasts predict Britain's population will rise 15% by 2050, an extra 9m people. For Sweden, the forecasts say the population will grow by about a fifth. Some of this is the result of immigration and rising longevity but, according to David Coleman, a demographer at Oxford University, the recovery is also the result of older women having more children “almost sufficient to compensate for the sharply reduced birth rates of younger women”. This is exactly what was hoped for, but does not seem to be happening yet, in the Mediterranean and eastern Europe.

If you take account of late childbearing, you find that 16 European countries, with a total population of 234m, now have fertility rates of 1.8 or more. Half are above 2.0. Despite near-panic about “inevitably” declining population, then, some European countries are growing quite strongly. They are rare examples of bucking the trend that, as countries get richer, their birth rates fall.

None of this means that Europe has broken the chains of its demography. The EU's overall population will fall by 7m by 2050. The so-called support ratio (roughly, the proportion of workers to pensioners) is declining everywhere. And as Mr Coleman points out, Europe's share of global population will fall from 21% now to 7% by 2050. Even its successes are only relative. A fertility rate of 1.8 is still below replacement.


Eris is bigger than Pluto

Eris—formerly known as 2003 UB313 and then Xena—is the largest so-called dwarf planet in the solar system.

Eris was the catalyst for Pluto's demotion last summer, when the International Astronomical Union redefined the term "planet" and created the category of dwarf planet for objects such as Eris and Pluto.

Using the Keck Observatory and the Hubble Space Telescope, Michael E. Brown and Emily Schaller at the California Institute of Technology in Pasadena have now put the mass of Eris at a third more than Pluto's.

RNA and metagenomics revolutions

The Economist looks at the RNA revolution

The DNA of a human cell is only a small part of the overall puzzle of what is going on within the human body.

Ever since the human-genome project was completed, it has puzzled biologists that animals, be they worms, flies or people, all seem to have about the same number of genes for proteins, around 20,000. Yet flies are more complex than worms, and people are more complex than either. Traditional genes are thus not as important as proponents of nature had suspected, nor as proponents of nurture had feared. Instead, the solution to the puzzle seems to lie in the RNA operating system of the cells. This gets bigger with each advance in complexity. And it is noticeably different in a human brain from that in the brain of a chimpanzee.

If RNA is controlling the complexity of the whole organism, that suggests the operating system of each cell is not only running the cell in question, but is linking up with those of the other cells when a creature is developing. To push the analogy, organs such as the brain are the result of a biological internet. If that is right, the search for the essence of humanity has been looking in the wrong genetic direction.


Recent work in metagenomics shows that we have only a partial understanding, and in many cases only recent awareness, of even 1% of the activity of microbes in human bodies.

We contain 10 times more microbial cells than human cells and 100 times more microbial genes than human genes. We are superorganisms of human and microbial parts.

10 to 100 trillion microbes perform functions in our bodies that we have not had to evolve ourselves.

So we are now shedding light onto areas that we had very little awareness of until recently.

Richard Jones will be UK Senior Strategic Advisor for Nanotechnology

Professor Richard Jones of the University of Sheffield has been appointed as the Senior Strategic Advisor for Nanotechnology, taking up the post from 1 June 2007.

Professor Jones will spend 3 days per week advising EPSRC on the development and implementation of its Nanotechnology Strategy. He will also act as an advocate for nanotechnology and for EPSRC both within the UK and internationally.

Nanotechnology is a priority research area for EPSRC. Key elements of the Strategy include developing a series of nanotechnology Grand Challenges, equipment sharing and provision for doctoral level training.

Professor Richard Jones said: "Nanotechnology, responsibly developed, could help meet a number of society's pressing needs in areas like sustainable energy and medicine. I am looking forward to working with EPSRC and the research community to ensure the UK is at the forefront of the global competition to develop exciting science and valuable applications in nanotechnology."

Professor Richard Jones is Professor of Physics at the University of Sheffield. He leads the Polymer Physics group, and conducts research into the properties of polymers and biopolymers at surfaces and interfaces. In his research, he aims to learn from some of the principles used by nature - self-assembly and molecular responsiveness - to create synthetic nanodevices such as molecular motors.


The strategy announces some relatively modest increases in funding from the current level, which amounts to around £92 million per year, much of which will be focused on some large-scale “Grand Challenge” projects addressing areas of major societal need.

By a number of measures, the UK is underperforming in nanotechnology relative to its position in world science as a whole. Given the relatively small sums on offer, focusing on areas of existing UK strength - both academically and in existing industry - is going to be essential, and it’s clear that the pharmaceutical and health-care sectors are strong candidates. Nature Nanotechnology’s advice is clear: “Indeed, getting the biomedical community— including companies — to buy into a national strategy for nanotechnology and health care should be a top priority for the nano champion.”


A pdf has the UK nanotech strategy, most of which primarily follows NNI-type evolutionary efforts.

The strategy should also include more support and follow-up for the UK Ideas Factory, which I think has some promising projects.

Lower cost and more power efficient video and images

Showlei Associates has announced its CamCoder video compression device that will dramatically lower the cost, power consumption and size for the compression of high-definition streaming images.

Continuing improvements in lowering the cost and increasing the efficiency of imaging affect public issues like persistent and ubiquitous surveillance.

This IC can be used in a variety of applications and especially addresses the need for high-resolution surveillance image recording. The device is able to simultaneously encode two separate streaming images — full size and quarter size — with robust compression and high quality. The IC also contains internal logic for user-programmable motion detection and watermark insertion, as well as on-board memory.

The CamCoder interfaces directly with a variety of CMOS imagers — from QVGA to very-high, eight-megapixel resolution, and above.

According to John Music, president, Showlei is known for proprietary time-domain video codecs that lower chip transistor count by more than 10:1 compared with other MPEG designs.


Kodak will have an image sensor that is four times more sensitive to light in commercial cameras in 2008

Kodak’s new proprietary technology adds panchromatic, or “clear” pixels to the red, green, and blue elements that form the image sensor array. Since these pixels are sensitive to all wavelengths of visible light, they collect a significantly higher proportion of the light striking the sensor. By matching these pixel arrangements with advanced software algorithms from Kodak that are optimized for these new patterns, users can realize an increase in photographic speed, directly improving performance when taking pictures under low light. Kodak’s new technology also enables faster shutter speeds (to reduce motion blur when imaging moving subjects), as well as the design of smaller pixels (leading to higher resolutions in a given optical format) while retaining performance.

VASIMR update

The VASIMR engine was run for 2 minutes before overheating. The company is targeting using the new plasma engine technology to move satellites in low Earth orbit in 2010.

Japan's 10 petaflop supercomputer

Japan is proceeding with its 10 petaflop supercomputer project and plans to finish it in 2011. Japan's Ministry of Science and Technology will provide funding for the project worth a total of 115.4 billion yen (about 710 million euros), or almost 1 billion US dollars.

June 13, 2007

One billion PCs and other big numbers

Mike Treder, among many others, has talked about the milestone of one billion personal computers being reached this year

Tomi T. Ahonen at Communities Dominate talks about 2.7 billion mobile phones and the numbers of other devices

1.15 billion cell phones will be sold this year alone. In other parts of the world, the mobile phone is how people connect to the internet and each other.

3 billion people will have mobile phones by the end of 2007

800 million cars
1.4 billion credit card holders and more with debit cards
One source quotes 2.5 billion combined debit and credit card users at the start of 2007

So we get a clearer picture of what those around the world who have not purchased their own personal computer are doing: mostly sharing a computer or using a mobile phone.

Also, some of the youngest would not necessarily be expected to have a computer.
Ages 0-14: 27.4% of the world population (931,551,498 male / 875,646,416 female), a total of about 1.8 billion people aged 14 or less.

Those aged 5 or less would not be considered deprived without a computer or phone, although from age 6 and up, having access to a computer for education makes sense.
With about 120 million births per year, roughly 600 million of the world's 6.5 billion people are young children who are not deprived without a PC or mobile phone.
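A quick check of the arithmetic above (the 120 million births per year is the post's own round figure):

male_0_14 = 931_551_498
female_0_14 = 875_646_416
print((male_0_14 + female_0_14) / 1e9)   # ~1.81 billion people aged 0-14

births_per_year = 120e6
print(births_per_year * 5 / 1e9)         # ~0.6 billion very young children, out of 6.5 billion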

Comprehensive list of the work needed to realize vision of molecular manufacturing

Those who are actually working towards molecular manufacturing have been compiling, since 2001, a detailed list of the work that still needs to be done to achieve diamondoid molecular manufacturing.

Go through this link to see how you can help support this work.

Since 2001 we have been compiling a growing list of technical challenges to implementation of diamondoid molecular manufacturing and molecular machine systems. This list, which is lengthy but almost certainly incomplete, parallels and incorporates the written concerns expressed in thoughtful commentaries by Philip Moriarty in 2005 and Richard Jones in 2006. We welcome these critiques and would encourage further constructive commentary – and suggestions for additional technical challenges that we may have overlooked – along similar lines by others.

Our list represents a long-term research strategy that serves as a direct response to the recent (2006) call by the NMAB/NRC Review Committee, in their Congressionally-mandated review of the NNI, for proponents of “site-specific chemistry for large-scale manufacturing” to: (1) delineate desirable research directions not already being pursued by the biochemistry community; (2) define and focus on some basic experimental steps that are critical to advancing long-term goals; and (3) outline some “proof-of-principle” studies that, if successful, would provide knowledge or engineering demonstrations of key principles or components with immediate value.

Our current list of technical challenges is organized into the four categories of technical capabilities that we believe are required for the successful achievement of positional diamondoid molecular manufacturing, enabling nanofactory development. This list is currently most extensive in the area of diamond mechanosynthesis (DMS), since DMS has been the primary focus of our earliest efforts leading toward nanofactory implementation.


(I) Technical Challenges for Diamond Mechanosynthesis
(A) THEORETICAL
(1) Design and simulation of DMS tooltips
(2) Design and simulation of tooltip-workpiece interactions
(3) Design and simulation of tool-tool interactions
(4) Simulation of mechanosynthetic interactions in realistic vacuum environment
(5) Design and simulation of entire DMS reaction sequences
(6) Design and simulation of DMS procedures beyond hydrocarbons
(7) Rearrangement and reconstruction of workpiece surfaces
(8) Design and simulation of molecular feedstock presentation systems for DMS

(B) EXPERIMENTAL
(1) General design and construction of high-accuracy UHV nanopositioning systems
(2) Challenges specific to DMS nanopositioning systems
(3) Experimental fabrication of DMS tips
(4) Experimental background for DMS
(5) Experimental proof-of-principle and early DMS demonstration benchmarks
(6) DMS parallelization
(7) Availability of natural nanoparts for testing and fabrication


(II) Technical Challenges for Programmable Positional Assembly
(A) THEORETICAL
(1) Nanopart gripper design
(2) Nanopart manipulator actuator design
(3) Design and simulation of nanopart feedstock presentation systems
(4) Design and simulation of workpiece release surfaces
(5) Design and simulation of nanopart assembly sequences
(6) Atomic rearrangements in juxtaposed nanoparts

(B) EXPERIMENTAL
(1) Development of SPM technology to enable nanopart assembly work
(2) Fabrication and testing of workpiece release surfaces
(3) Experimental proof-of-principle and early positional assembly demonstration benchmarks

(III) Technical Challenges for Massively Parallel Positional Assembly
(1) Massive parallelization of DMS reactive tooltips and systems
(2) Massive parallelization of nanopart assembly grippers and related systems
(3) Simulation software for massively parallel manufacturing systems

(IV) Technical Challenges for Nanomechanical Design
(1) Establishment of nanoparts libraries
(2) Simulation of nanoparts, nanomachines, and nanomachine operations
(3) Nanofactory design

US pressures China to increase Yuan value faster

The USA is threatening 27% tariffs against China to force China to more quickly raise the value of the yuan by 15 to 40%.

This supports my thesis that China's economy will pass the USA by 2020

15% appreciation in the yuan would move it from 7.6 to 1 US dollar to 6.6 to 1 US dollar.

40% appreciation in the yuan would move it to 5.4 to 1 US dollar.

China had an economy of 20.94 trillion yuan at the end of 2006.
They have projected growth of 10.5% in 2007.
Therefore, if we adjust the currency for 40% appreciation,
China's economy at the end of 2007 would be about US$4.25 trillion.
This would not include Hong Kong or Macau, which would add another US$210 billion.
China's combined economy would be about US$4.46 trillion.
Japan at the end of 2006 was at US$4.367 trillion.
The IMF projects Japan to shrink to US$4.3 trillion in 2007.

China's economy would be in the range of 32-34% of the size of the US economy.
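Spelling out the arithmetic above in one place (the US GDP of roughly $13.5-14 trillion for 2007 used for the comparison is my assumption; it is not stated in the post):

gdp_2006_yuan = 20.94e12          # China's 2006 GDP in yuan
growth_2007 = 1.105               # projected 10.5% growth
current_rate = 7.6                # yuan per US dollar today

for appreciation in (0.15, 0.40):
    new_rate = current_rate / (1 + appreciation)
    gdp_usd = gdp_2006_yuan * growth_2007 / new_rate
    total_usd = gdp_usd + 0.21e12                 # add Hong Kong and Macau
    print(f"{appreciation:.0%}: {new_rate:.1f} yuan/USD, "
          f"China+HK+Macau ~${total_usd/1e12:.2f} trillion")

# At 40% appreciation this gives ~$4.5 trillion, versus an assumed US GDP of
# roughly $13.5-14 trillion for 2007, i.e. roughly a third of the US economy.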

The political and money pressure within the USA is an attempt by US manufacturers to get more business and profit, rather than lower costs for consumers or more profits for Walmart and other importers of goods from China.

China is experimenting with looser controls on the Yuan

June 12, 2007

Vaccine made in the form of rice

Japan has made a vaccine against cholera in the form of rice. This will allow vaccines to be stored at room temperature. It will make vaccines cheaper and safer.

The Japanese researchers created the rice-based cholera vaccine by inserting the genetic material from the cholera bacterium into the sequenced genome of the rice plant. The researchers used two types of rice plants to generate the vaccine: Kitaake, which produces normal rice, and Hosetsu, which produces dwarf-type rice. Once the rice plants produced the toxins, they were fed to mice in a powder form suspended in water. The rice-based vaccine produced antibodies throughout the mice's bodies including their mucosal sites, which are an important first line of defense since infectious diseases typically invade and infect a person at these sites. As a result, the mice became immune to the diarrhea-causing bacterium.

"Our goal is to develop a new generation of environmental- and human-friendly vaccines, which can induce protective immunity in both mucosal and systemic compartments against infectious microorganism," says Tomonori Nochi, the vaccine's lead investigator and a postdoctoral fellow in the Department of Microbiology and Immunology at the Institute of Medical Science at the University of Tokyo. The researchers' report appears in this week's edition of the Proceedings of the National Academy of Sciences.

Rice is a plant that can be stored at room temperature for a long time, which is very important for the development of the vaccine. It's estimated that worldwide, it costs $200 to $300 million each year to preserve vaccines at cold temperatures, explains Nochi. "Thus we termed our technology cold-chain-free vaccine. In addition, purification of the vaccine antigen from rice seed is not necessary, also causing a reduction in cost."

Furthermore, abolishing the painful use of needles and syringes not only cuts costs, but also prevents pathogens from accidentally appearing in the vaccines and then spreading throughout the population, especially in underdeveloped countries where supplies are limited.

The researchers plan to prepare the rice-based vaccine in the form of a capsule or tablet for applications in humans, hence they don't have plans to deliver the vaccine as a form of steamed rice. The rice-based vaccine is also suitable for prevention of other mucosal infectious diseases, such as influenza and HIV.

Aggressive energy efficiency plan eliminates half of new energy demand from now to 2020

Aggressive energy efficiency could cut energy usage growth by 50%, but it does not eliminate energy usage growth or reduce usage below what we currently have in place. Diana Farrell of the McKinsey Global Institute argues that a commitment to energy efficiency is the best way to avert a global climate crisis. Her proposals go beyond what the world is currently on track to do.

Why nothing really happens most of the time

There was a recent posting on the IEEE spectrum blog about revolutionary nanotechnology: Wet or Dry

I responded to that post in the comments section of that article. It boils down to this: if you do not research an issue, then you can end up repeating lies, slander and bait-and-switch marketing. It takes work not to accept a watered-down and bait-and-switched future.

Many futurists often talk about the accelerating technology and change.

But there are strong forces that oppose change and support the status quo.

1. Government and existing industry will often support ideas and plans that do not deliver real change and do not really try to solve problems. Things like $200 billion for highway funding and transportation infrastructure, which mostly gets diverted to other non-highway or non-transportation projects, or just goes to support a bureaucracy that is not held to a standard of results.

2. The public could get its awareness raised about an issue and really want a solution. Things like air pollution, climate change or the development of a new technology. Breaking through to raise awareness on complex issues is tough. Molecular nanotechnology still has lingering problems getting fully understood.

3. They could get denials that it is a problem or that the new technology is not possible. They can get bait and switched into something that is called a solution to the problem but will not do anything like what it is labelled. Climate change and molecular nanotechnology are mostly stuck in this phase.

4. They could get put off with studies of the problem and work to investigate but do nothing about the issue.

5. They could get delivered an expensive program which is flashy and complicated but has no follow up and does not have the scope to do anything like the original visions. Then after decades of this the original vision is declared impossible or needing many decades more for progress. This is the situation with Space colonization and space development after Apollo, Space shuttle and the international space station. People in the programs can be heroic and well intentioned but the programs themselves could only be glorified and dangerous short term camping trips.

6. The journalists covering technology may also not understand the overall technical issues and may not be able to evaluate the real ability to achieve progress with a particular program. Mainstream journalists often have little science, engineering and business background and are unable to analyze what is happening with a proposed program or business "solution". Even journalists for technology and engineering magazines may not really dig into the details. They will have a particular opinion and agenda and look for ways to advance it.

Sharon Weinberger of Wired magazine's Danger Room blog has clear biases. She wrote a book, Imaginary Weapons, and is always looking for repeated examples of its thesis (wasted government funding of technology that proved unsuccessful). She will criticize research and papers on technology like that developed at the NASA Institute for Advanced Concepts but will not bother to read them. Here is an example of that, where she criticizes some research without understanding or reading any more than the title. She asks an expert to help her, who then tells her about a proposal with similarities.

7. Until a particular class of research and change starts to become profitable at every step (like the internet and mobile phones), there can be a lot of noise and misinformation, and other work passed off with a new label, but very little real work and very little real change. Even then there can be entrenched companies and forces trying to limit change and defend an existing business: cable and phone companies blocking or slowing change in communication technologies; music and content companies that extend copyright for 75+ years and try to block innovation in content distribution with lawsuits and other tactics.

8. Sometimes people can be convinced that a particular problem is mostly solved, when only the most obvious aspects have been addressed. This is and was the case with air pollution. When smog was obscuring the skies of all major cities in the West (and not just the lesser amount that LA still has), and 12,000 people were dropping dead from events like the London Fog of 1952, people got active and solved 80% of the most visible pollution. But economic activity has increased the level of unseen pollution and particulates. Millions die each year, but mostly not by keeling over in the streets; it happens more quietly in hospitals or in less developed countries. Plus there is the numbing effect of something horrible happening year after year, like someone who was always being beaten by a spouse or parent: the victim got used to it.

These roadblocks to real change and solutions would be less of an issue if they were not preventing us from saving the millions dying from air pollution or from the diseases of old age. These are problems we can work on with real solutions, if we can push through all of the distractions, bait-and-switches and lies.

June 11, 2007

Getting the scope of the Energy problem right matters

The Economist points out that there are more economical ways to reduce energy usage. They are correct. However, there is the issue of scaling up some of the money-making ways.

Even if a lot of people follow the more economically positive steps of insulating houses, switching lighting, etc., unless this results in coal or fossil fuel plants being shut down or operating a lot less, so that less pollution and CO2 is generated, there is no actual reduction.

Better insulation and better lighting have been known for decades. Up until recently the lighting switch has only been done for 5% of lights. This is even with legislation in California that requires a percentage of fluorescent lighting and dimmer switches for new construction and remodeling.

More fuel-efficient cars: 1 million hybrid cars now. Great, pat yourselves on the back. But there are 600 million cars on the road worldwide. Every year 65 million more cars and trucks get added.

The energy efficiency efforts are nice but it is like you have 1000 concentration camps operating 24X7 and shutting them down between noon and 3 pm on Tuesdays. This comparison is not an exaggeration. The World Health Organization quotes 3 million dead from air pollution every year. Another 1.5 million die from bad water. Most of those dead are from fossil fuel usage.

Changing out the power plants is what is needed. It will take two to three decades, and probably more (even at the fastest rate), but that is what has to be done.

China is building dams at the rate of a Three Gorges every other year. By 2020 they will have 155 GW more hydro power. But they are still adding coal at one plant per week. They are not getting rid of the existing coal plants (except shutting down the smallest and dirtiest and replacing them with cleaner, bigger coal plants).

In the US, are any of the environmental plans targeting the coal and natural gas plants to shut them down? The most aggressive is to make them less polluting over the next 8 years. That is better than nothing, but to really handle the problem we have to phase out the coal plants and fossil fuel usage. (Or cure cancer, heart disease and asthma, and use gene therapy to make everyone immune to particulates, arsenic, mercury and smog poison, and later to high temperatures, extreme weather and less water.) I think it would be easier to mass produce the nuclear reactors and scale up wind, solar, etc. We should and still are working on the disease cures.

The world will be spending $40 trillion over the next 45 years on infrastructure: $11 trillion on energy and another $11 trillion on transportation. We should use that money to tackle the full scope of the problem. By fixing it sooner, we will save trillions that we would otherwise have spent on the people we made sick or killed.

Some other info related to refuting the arguments against scaling up nuclear power:
Nuclear proliferation has killed no one

Argument: Nuclear can't scale up fast enough
Answer: 12 reactors per year were built in the USA in 1974; more power has been added from operating efficiency gains even with no new nukes; an MIT study suggests a 50% power boost to existing reactors is possible.

266 reactors are in the world construction pipeline, and the number is rapidly increasing

China might build 300+ by 2050

Argument: There will not be enough uranium when we scale up

Answers:
There is uranium in fly ash (coal waste); a lot of fly ash has been and still is being generated.

There is still plenty of uranium from ordinary sources, which have not been fully explored because of the 30-year depression in prices and exploration activity.

Breeders and reprocessing are viable technologies in use and under development now.


Seawater uranium (not needed until ordinary sources are stretched, but it could be cheaper than ordinary land sources if properly developed): here is info on a cost analysis for recovering uranium from seawater (a 10,000 ton per year setup). There are 3.5 to 4 billion tons of uranium in the oceans.
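The billions-of-tons figure is consistent with the commonly cited seawater uranium concentration of roughly 3 micrograms per liter; a rough back-of-envelope check (the ocean volume used here is my assumed round number):

uranium_conc = 3e-6         # grams of uranium per liter of seawater (~3 parts per billion)
ocean_volume = 1.3e21       # liters of seawater in the oceans (assumed round figure)

total_grams = uranium_conc * ocean_volume
print(total_grams / 1e6 / 1e9)   # ~3.9 billion metric tons of uranium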

I am not sure why some would think that scaling up this experimentally proven capability is less solvable than scaling up nanotech for terawatts of solar power.

Here is a copy of the table.

Table 2. Adsorbent production cost (production capacity = 10,000 tons/year)

Item | Cost (billion yen/year) | Percent of total | Comments
Production equipment and amortization | 0.165 | 3% | 1.8 billion yen equipment cost
Precursor material cost | 4.137 | 84% | 600,000 yen per ton of nonwoven fabric; 87,700 yen per ton for polymerization reaction reagents
Operation expense (includes personnel) | 0.62 | 13% | personnel cost, repair cost
Total | 4.93 | 100% |

Unit cost of adsorbent: 4.93 million yen/ton (4,100 yen/kg-U)

The biggest cost would be the precursor material.

Link to polyethylene production.

You could divert less than 0.1% of world polyethylene production for 10 years when you decide to scale up the seawater extraction. Then you could build a little over one of the 10,000-ton-per-year processes each year; in ten years you would have 100,000 tons per year of capacity.

World polyethylene production capacity has increased to 70 million tons per year; polyethylene output in 2005 amounted to 65 million tons.

===
Multiple 10,000-ton/year uranium harvesting operations look doable.

===

We can have a better economy by getting rid of coal

We should also get rid of inefficiencies (superconductors and other technology for lower energy losses) and fix transportation (plug-in hybrids, all-electric cars, better mass transit).

3D holograms of molecules

The New Scientist reports progress to 3D holograms of molecules


Image quality improves significantly as the number of computer processing iterations increases from 0 (left) to 500 (right) (Image: The American Physical Society)

For years physicists have grappled with a kind of double vision that has made using holography difficult. The process results in an out-of-focus second image being superimposed on the main one, which can seriously degrade the result. "The twin image problem has existed since holography was conceived. People have always worried about it," says Hans-Werner Fink, a physicist at the University of Zurich in Switzerland.

Physicists have devised various optical techniques for removing the twin, but they do not always work for light of much shorter wavelengths such as X-rays.

Now Fink and his colleague Tatiana Latychevskaia have solved the problem in a way that should work, regardless of the source of illumination.

Latychevskaia says she realised how to do it after noticing that the blurry superimposed twin makes some areas of the light field brighter than would be possible were there a single image alone.

So she designed a computer program that identifies these regions, replaces them with a more realistic light level, and then calculates how this would affect the light field that created them.

The modified light field is then used to create a new image and the process begins again. Repeating this process many times removes the twin image entirely, dramatically sharpening the result.
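The iteration described above can be sketched in a few lines. This is a conceptual outline only, not the authors' code; the angular-spectrum propagator and the assumption of a normalized, purely absorbing object are my simplifications:

import numpy as np

def propagate(field, distance, wavelength, pixel_size):
    # Angular-spectrum propagation of a complex optical field (scalar approximation).
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel_size)
    FX, FY = np.meshgrid(fx, fx)
    transfer = np.exp(-1j * np.pi * wavelength * distance * (FX ** 2 + FY ** 2))
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

def remove_twin_image(hologram, distance, wavelength, pixel_size, iterations=500):
    # Iteratively suppress the out-of-focus twin image in an in-line hologram.
    measured_amplitude = np.sqrt(hologram)
    field = measured_amplitude.astype(complex)          # start in the hologram plane
    for _ in range(iterations):
        obj = propagate(field, -distance, wavelength, pixel_size)
        # Regions appearing brighter than the normalized background (transmission > 1)
        # cannot come from a real absorbing object; attribute them to the twin and clamp.
        amplitude = np.minimum(np.abs(obj), 1.0)
        obj = amplitude * np.exp(1j * np.angle(obj))
        field = propagate(obj, distance, wavelength, pixel_size)
        # Re-impose the measured hologram amplitude before the next pass.
        field = measured_amplitude * np.exp(1j * np.angle(field))
    return propagate(field, -distance, wavelength, pixel_size)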

"Although tested so far only in the optical region, there is no obvious reason why the authors' method should not also work with X-rays and electron waves," he told New Scientist. That would be important for using the technique on a microscopic scale.