October 14, 2006

MIT rethinking house construction: Open prototype initiative

The Open Prototype Initiative is a collaboration between the Massachusetts Institute of Technology House_n Research Consortium, Bensonwood Homes and other construction industry members to develop a series of four prototype homes deploying advanced designs, materials, systems, and fabrication strategies, with the goal of showing how high-quality, sophisticated, and personalized homes can be built more cost-effectively and in less time. Each home will be built in 20 working days. Construction of the first prototype, Open_1, a three-story 28-by-46-foot house, begins in June 2006, with another new home being built every 18 months through 2010.

Some key features:
- The floor, wall and roof systems will be pre-built with wiring pre-installed
- Floors, ceilings and baseboards will allow for easy access to plumbing, heating and wiring
- The structure will consist of distinct, disentangled and accessible layers that allow for both efficient assembly and change over time
- The building shell, with exterior finish, will be assembled in five working days
- Mechanical, electrical, and plumbing systems will be installed in three working days
- Interior fit-out will be completed in five working days
- Interior finishes will be completed in five working days
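A quick sanity check of the schedule above, summing the published phase durations (all figures are from the post; the code is only for illustration):

```python
# Sum the Open_1 build phases listed above and compare against the
# 20-working-day target stated in the post.
phases = {
    "shell with exterior finish": 5,
    "mechanical/electrical/plumbing": 3,
    "interior fit-out": 5,
    "interior finishes": 5,
}
total = sum(phases.values())
print(total)        # total working days across the listed phases
print(total <= 20)  # fits within the 20-working-day target
```

The listed phases account for 18 of the 20 working days, leaving a small buffer.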

It seems like they have several interesting ideas to improve upon the modern manufactured/prefab home.

A quick summary of the categories of factory-built homes. The MIT plans are a fresh take on a mix of panelized and modular systems.

Manufactured Home: Built entirely in the factory under a federal code administered by the Department of Housing and Urban Development (HUD). The code went into effect June 15, 1976, and has been upgraded numerous times.

Mobile Home: The term mobile home is used for homes built prior to June 15, 1976, when the HUD code went into effect. Voluntary standards were previously in effect.

Modular Home: Built to the state, local or regional code of the location where the home will be sited. Multi-section units are transported to the site and installed.

Panelized Home: Built in a factory as panels that include windows, doors, wiring and siding, which are transported to the site and assembled. Codes are set by the state or locality where the home is sited.

Pre-Cut Home: Materials are factory-cut to design specifications and then transported to the site and assembled. Examples are kit, log and dome homes. Standards are set by the state and locality.

October 13, 2006

Antimatter and fusion would be helped by molecular nanotechnology

Wikipedia has information about antimatter.

The reaction of 1 kg of antimatter with 1 kg of matter would produce 1.8×10^17 J (180 petajoules) of energy (by the equation E=mc²). This is about 134 times as much energy as is obtained by nuclear fusion of the same mass of hydrogen (fusion of 1H to 4He produces about 7 MeV per nucleon, or 1.3×10^15 J for 2 kg of hydrogen). This amount of energy would be released by burning 5.6 billion liters (1.5 billion US gallons) of gasoline (the combustion of one liter of gasoline in oxygen produces 3.2×10^7 J), or by detonating 43 million tonnes of TNT (at 4.2×10^6 J/kg). Some researchers claim that with current technology, it is possible to attain antimatter for US$25 million per gram by optimizing the collision and collection parameters (given current electricity generation costs). Antimatter production costs, in mass production, are almost linearly tied to electricity costs.

Assuming the economical mass production of antimatter were possible, it would cost $25 billion to make 1 kg of antimatter, which is the energy equivalent of 1.5 billion gallons of gasoline. So for mundane energy usage, other forms of energy storage are better; antimatter is a net energy loser. However, a lot of relatively inexpensive antimatter would be great for space propulsion systems and for antimatter-catalyzed fusion and fission.
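The figures quoted above are easy to verify with back-of-envelope arithmetic (constants rounded; c is approximated as 3.0×10^8 m/s):

```python
# Check the annihilation-energy figures quoted from Wikipedia.
c = 3.0e8                      # speed of light, m/s (rounded)
m = 2.0                        # kg annihilated (1 kg matter + 1 kg antimatter)
E = m * c**2                   # E = mc^2, joules
gasoline_L = E / 3.2e7         # liters of gasoline at 3.2e7 J/L
tnt_tonnes = E / 4.2e6 / 1000  # tonnes of TNT at 4.2e6 J/kg
cost = 25e6 * 1000             # $25 million/gram -> dollars per kilogram
print(E)             # ~1.8e17 J (180 petajoules)
print(gasoline_L)    # ~5.6 billion liters (~1.5 billion US gallons)
print(tnt_tonnes)    # ~43 million tonnes of TNT
print(cost)          # $25 billion per kg of antimatter
```

The numbers line up with the quoted values: 1.8×10^17 J, about 5.6 billion liters of gasoline, about 43 million tonnes of TNT, and $25 billion per kilogram.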

Molecular nanotechnology could reduce electricity costs with mass production of high efficiency solar cells on earth and in space.

Antimatter harvesting could also be made cheaper and more efficient using molecular nanotechnology. An estimated 20 kilograms of naturally occurring antimatter exists within the orbit of Saturn. Harvesting does not have to be an energy loser, but it would require a very clever and efficient collection system; perhaps superconducting antimatter traps that move along the gravitational manifolds of the solar system (the Interplanetary Superhighway).

Here is a pdf that describes a development path for the Z-pinch system to enable fusion power generation. Molecular nanotechnology could help make better targets for the Z-pinch, better lasers, better magnets and better chambers.

Shortly after real molecular nanotechnology arrives (and possibly before), we should finally get commercial fusion power generation, super-efficient solar power supplying over 10 times our current power generation, and kilograms of antimatter.

Further reading:
A pdf from 2006 that discusses antimatter production

Atomic clocks made 50 times more accurate

A University of Nevada research team was able to isolate and explain a significant portion of the error in atomic clock output. The portion of error that the team studied has now been cut to one-fiftieth of its original size. The team's research was based solely on calculations, many of which were conducted on high-performance computers. This shows that computer simulations can accelerate molecular research and development, and it further validates the quantum mechanical models that have been developed.

The new findings are also paving the way for all kinds of new scientific experimentation. Extremely accurate measurements are required to probe the behavior of the universe. The extra time-keeping precision will allow scientists to explore hypotheses about the big bang. The improved technology might even be accurate enough to provide evidence related to the controversial theory that universal constants, such as the charge of the electron, are changing.

Bistable nano switch created

Scientists from Northwestern University have demonstrated a novel carbon nanotube-based nanoelectromechanical switch exhibiting bistability based on current tunneling. The device could help advance technological developments in memory chips and electronic sensing devices. A major advantage of the device is its geometry, which is fully compatible with current manufacturing techniques for mass production. The device is made of a freely suspended multiwalled carbon nanotube interacting electrostatically with an underlying electrode. In the device circuit, a resistor in series with the nanotube plays an important role in the functioning of the device by adjusting the voltage drop between the nanotube and the underlying electrode.

My view on North Korea

North Korea is still a fourth-rate country that is halfway collapsed. It could cause a lot of damage to South Korea, but I think it would lose a war against South Korea. The risk of North Korea (NK) giving or selling nuclear weapons is low. Kim Jong Il and his inner circle are playing a game of brinksmanship, but if they actually let the nukes get out or actually used them, that would force action from the US and others, and that action would mean the death, or at least the removal, of North Korea's current rulers.

The bigger risk as pointed out by Kaplan in the Atlantic magazine is the chaos of North Korean leaders losing control.

The Atlantic magazine has some analysis of North Korean situation

The situation is about money and who will pay the trillions of dollars to clean up North Korea. North Korea was propped up in the mid-90s to delay dealing with its collapse.

China would be willing to take over the northern part to get some better ports.

Kim Jong Il has not been acting that crazy from what I have seen. He and his inner circle are acting in the criminal self-interest of a small group. They are a gangster state that uses blackmail, arms dealing and counterfeiting. Unlike Iran, they have no larger religious agenda or expansionist aspirations. It costs billions of dollars a year to manage what could be a trillion-dollar problem.

The N Korean nukes will only get loose if Kim Jong Il and his crowd lose control.

In terms of lives and numbers: millions of North Koreans have died over the last decade, and millions more are at risk from starvation.
Millions of South Koreans are at risk in a war situation.
Tens of thousands of people outside Korea would be at risk if the nukes got loose; their nukes are not that good and are unlikely to be delivered in a way that would cause maximum casualties.
Thousands of US and other troops could die by getting into the mix of a Korean war.

Patience to manage and contain the problem is a decent strategy, along with trying to cut a deal where the current rulers walk away. Push ahead on remote nuclear material and bomb detectors that could make a safe surgical first strike possible, and also have and implement a plan for establishing order after a North Korean collapse.

October 12, 2006

Origin of low capacitance in thin-film capacitors found

Researchers at UC Santa Barbara have discovered what limits our ability to shrink capacitors, often the largest components in integrated circuits, down to the nanoscale. By identifying the source of the problem, practical guidelines for minimizing its effects can be developed and smaller thin-film devices can be made, removing a roadblock to smaller and faster computers and electronics. Nicola Spaldin, a professor in the Materials Department of the College of Engineering, and her collaborator, post-doctoral researcher Massimiliano Stengel, used quantum mechanical calculations to show that a so-called "dielectric dead layer" at the metal-insulator interface is responsible for the observed capacitance reduction. Metals with good screening properties can be used to improve capacitor performance.

Hardware and software scheme avoids JPEG compression, could allow 10 times better resolution for cameras

When a multi-megapixel digital camera snaps a shot, most of the information doesn't even make it into the final photo file. About 90 percent of the information is lost during the compression process that creates a JPEG file. Researchers have built and tested the hardware and software for a camera that collects just enough information to recreate a picture while avoiding the traditional compression process. Incorporating these algorithms into commercial cameras could allow a two-megapixel camera to snap 20-megapixel photos.

In a prototype, the researchers used an array of tiny mirrors--a technology developed by Texas Instruments that's already used in high-definition projection televisions. The micromirror array takes in a small amount of information, and directs it onto a single sensor. Then algorithms are used to reconstruct the image. Since the prototype has only one sensor, in effect it's a single-pixel camera. However, the algorithm recreates an image with 100 times the resolution of what would traditionally be captured in a single pixel.

Baraniuk and his team recognized that an emerging field of information theory, called "compressive sensing," offered an alternative approach to conventional image acquisition and compression.
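For the curious, here is a minimal numerical sketch of the compressive sensing idea: a sparse signal is recovered from far fewer random measurements than it has pixels. The problem sizes, the Gaussian measurement patterns, and the use of orthogonal matching pursuit for recovery are all my illustrative assumptions, not the Rice group's actual hardware or algorithms.

```python
import numpy as np

# Toy compressive sensing: observe a K-sparse signal x through M random
# projections (y = Phi @ x, with M << N), then reconstruct it.
rng = np.random.default_rng(0)

N, M, K = 64, 32, 3                          # "pixels", measurements, nonzeros
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.normal(size=K)

Phi = rng.normal(size=(M, N)) / np.sqrt(M)   # random measurement patterns
y = Phi @ x                                  # only M measurements taken

# Orthogonal matching pursuit: greedily pick the column most correlated
# with the residual, then re-fit by least squares on the chosen support.
support, residual = [], y.copy()
for _ in range(K):
    idx = int(np.argmax(np.abs(Phi.T @ residual)))
    support.append(idx)
    coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    residual = y - Phi[:, support] @ coef

x_rec = np.zeros(N)
x_rec[support] = coef
print(np.linalg.norm(x_rec - x) / np.linalg.norm(x))  # relative reconstruction error
```

Here 32 measurements suffice to recover all 64 values because the signal is sparse; the single-pixel camera exploits the same principle with sequential micromirror patterns instead of a matrix multiply.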


The researchers' camera has a long way to go before it's in a commercialized form, though, notes Baraniuk. Right now, the setup spans an optical table in a lab, and the researchers' algorithms are slow compared with the compression in commercial cameras. The group is working to make its algorithms faster, and, Baraniuk adds, the hardware continues to improve as more micromirrors are added to smaller arrays and their flipping speed increases.

Baraniuk expects that the first application for the new camera could be in terahertz imaging systems--systems that use terahertz-frequency radiation to see through objects and detect small amounts of chemicals. Currently, it's expensive to build the large sensors needed for these systems, he says, so a single-sensor camera like the one the group developed would be ideal.


I would predict that within 6-10 years this will reach commercial cameras. Digital cameras would then get a ten-times boost in resolution, to the range of 20 to 500 megapixels.

This relates to a computing prediction that I made at the beginning of this year
Gigapixel cameras common 2009-2015

Currently we are almost at expensive gigapixel cameras using backplane technology, which has slow image capture.

Related info:
111-megapixel CCD on a single chip. This chip combined with the system that was just developed would provide gigapixel cameras.

Imaging with 50 times less power and other efficiency improvements

October 11, 2006

Next Prius in 2009 will have over 100 mpg

The next Prius is set to be an evolution, company sources say. The hybrid will retain the same basic 1.5-liter hybrid drivetrain, but Toyota is now on a mission to do two things: drive the economy ratings skyward and cut the associated costs by 20-30 percent. Toyota was rocked when news seeped out that Honda was planning a low-price Fit hybrid for 2008, with a price differential of just 200,000 yen (some $1,700) over the regular gasoline version. So work on the next Prius has redoubled to slash R&D costs and halve Toyota's current hybrid differential of 500,000 yen (some $4,240) to compete.

Sources say the next Prius will also be able to run longer and faster in pure electric mode, up to a sustained 30 mph, which will significantly extend its zero-emissions range. The gains will largely come from replacing the current Prius' nickel-hydride batteries with lithium ion cells.

Lower costs for adding hybrid systems to cars mean a faster return on investment from fuel efficiency and the ability to mix several improvements at an affordable cost: $1700 for a hybrid system to get to 100 mpg, and then $100 for diesel to get to perhaps 150 mpg, or adding plug-in options.
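A rough payback-period sketch shows why the premium matters. The $1,700 premium and the 100 mpg figure come from the post; the 50 mpg baseline, annual mileage, and gas price below are my assumptions for illustration only:

```python
# Payback period for a hybrid premium: years until fuel savings
# cover the up-front cost difference.
premium = 1700.0               # hybrid price premium, USD (from the post)
mpg_base, mpg_hybrid = 50.0, 100.0   # baseline mpg is an assumption
miles_per_year = 12000.0       # assumed annual mileage
gas_price = 2.50               # assumed USD per gallon

gallons_saved = miles_per_year / mpg_base - miles_per_year / mpg_hybrid
savings = gallons_saved * gas_price
print(savings)                 # annual fuel savings, USD
print(premium / savings)       # payback period, years
```

Under these assumptions the premium pays for itself in under six years, and a smaller premium or higher gas prices shorten that further.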

The hybrid center has a past and future timeline of delivered and planned hybrids

Prior plans from Honda for 65mpg+ hybrids

69 mpg diesel hybrid from Peugeot

A boost for solar cells with photon fusion

From Physorg.com: an innovative process that converts low-energy longwave photons (light particles) into higher-energy shortwave photons has been developed by a team of researchers at the Max Planck Institute for Polymer Research in Mainz and at the Sony Materials Science Laboratory in Stuttgart.


The efficiency of solar cells today is limited, among other reasons, by the fact that the longwave, low-energy part of sunlight cannot be used. A process that increases the low energy of the light particles (photons) in the longwave range, shortening their wavelength, would make it possible for solar cells to use those parts of the light energy that have, up to now, been lost, resulting in a drastic increase in efficiency. The equivalent has only been achieved previously with high-energy-density laser light which, under certain conditions, combines two low-energy photons into one high-energy photon - a kind of photon fusion.
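The energy bookkeeping behind photon fusion is simple: combining two low-energy photons ideally yields one photon with double the energy, and thus half the wavelength. The 1000 nm input wavelength below is an illustrative assumption, not a figure from the research:

```python
# Ideal, loss-free combination of two longwave photons into one
# shortwave photon, using E = hc/lambda.
h = 6.626e-34        # Planck constant, J*s
c = 3.0e8            # speed of light, m/s (rounded)

lam_in = 1000e-9     # two infrared photons at 1000 nm (assumed)
E_in = h * c / lam_in
E_out = 2 * E_in                 # combined photon energy
lam_out = h * c / E_out
print(lam_out * 1e9)             # resulting wavelength in nm (half the input)
```

Doubling the photon energy halves the wavelength, moving light from the unusable infrared into the visible range where a solar cell can absorb it.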


As this procedure allows previously unused parts of sunlight to be used in solar cells, the scientists are hoping that it offers the ideal starting point for more efficient solar cells. To optimize the process and to bring it closer to an application, they are testing new pairs of substances for other colors in the light spectrum and are experimenting with integrating them in a polymer matrix.

UCF makes Extreme Ultraviolet light source 30 times brighter to enable 12 nanometer lithography

The team, led by Martin Richardson, University Trustee Chair and the University of Central Florida (UCF)'s Northrop Grumman Professor of X-ray Optics, successfully demonstrated for the first time an EUV light source with 30 times the power of previously recorded attempts – enough to power the stepper machines used to reproduce detailed circuitry images onto computer chips.

"We must use a light source with a wavelength short enough to allow the minimum feature size on a chip to go down to possibly as low as 12 nanometers," Richardson said. The current industry standard for semiconductor production is approximately 65 nanometers. A nanometer is one-billionth of a meter; a sheet of paper is about 100,000 nanometers thick.

This means that the normal semiconductor business and Moore's law should have a clear technical path down to 12 nanometers. Even better and cheaper technology could still come along, but it is difficult to displace proven processes.
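The density implications of the quoted node sizes are easy to work out, since device (area) density scales roughly as the inverse square of the feature size:

```python
# Rough scaling from the 65 nm industry standard to the 12 nm target
# quoted above. Density scaling as 1/feature^2 is a first-order rule
# of thumb, not an exact process-level figure.
current, target = 65.0, 12.0           # nanometers, from the article
density_gain = (current / target) ** 2
print(density_gain)                    # ~29x more devices per unit area
print(100_000 / target)                # 12 nm features per paper thickness (~8,300)
```

So the jump from 65 nm to 12 nm corresponds to roughly a 29-fold increase in transistor density, or about five doublings of Moore's law.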

The semiconductor roadmap 2005; the 2006 update will be out in December 2006

The roadmap projects about 10 nanometer structures in 2015

CNET has an article about the projected semiconductor manufacturing process nodes, 32 nanometers, 22 nanometers and the challenges of achieving them

Wikipedia info on semiconductors

A series of CNET articles about extending Moore's law with other technology

Laser TV, $1000, 50 inch, 3 times better color than Plasma TVs, Q4 2007

From IGN: Australian Arasor International and American Novalux claim a technological breakthrough: $1,000, 50-inch, 12-bit color displays by Q4 2007.

Novalux chief executive Jean-Michel Pelaprat said "If you look at any screen today, the colour content is roughly about 30-35 percent of what the eye can see. But the very first time with a laser TV we'll be able to see 90 percent of what the eye can see." In technical terms, Mr. Pelaprat is claiming greater than 10-bit color depth for displays making use of the technology.
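To put the bit-depth claim in perspective, the number of distinguishable levels per channel, and hence total representable colors, grows as a power of two (the comparison to common 8-bit panels is my addition):

```python
# Levels per channel and total RGB colors at common bit depths.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(bits, levels, levels ** 3)   # bits/channel, levels, total colors
```

A 12-bit panel offers 4,096 levels per channel versus 256 on a typical 8-bit display, which is 4,096 times as many total colors.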

The laser projection HDTVs will use a quarter of the electricity of current HDTV technologies.

October 10, 2006

Grants to develop $1,000 genome sequencing technology

Xiaohua Huang, a professor of bioengineering in UCSD's Jacobs School of Engineering, leads the effort at UCSD to develop a promising technology that shrinks what is currently being done in large genome-sequencing laboratories down to a glass slide the size of a business card. Huang's team will combine micro- and nano-fabrication technologies with innovative chemistry technologies to simultaneously sequence more than 1 billion individual pieces of DNA attached to the surface of single slides.

Huang will be joined at the Jacobs School by Pavel Pevzner, a professor in the Computer Science and Engineering Department who leads the department's Bioinformatics Laboratory. Sequencing the 23 pairs of human chromosomes extracted from the cells of one individual involves cutting the DNA into tens of millions to hundreds of millions of pieces, and Pevzner is developing the computational techniques needed to reassemble the chromosomes by piecing together the overlapping ends of all the fragments after they have been sequenced.
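The reassembly problem Pevzner is tackling can be illustrated with a toy greedy merge: repeatedly join the pair of fragments with the longest suffix/prefix overlap. Real assemblers are vastly more sophisticated (and must handle errors, repeats, and billions of reads); the sequences below are made up for illustration.

```python
# Toy greedy fragment assembly: merge the fragment pair with the
# longest overlap between the end of one and the start of the other.

def overlap(a, b):
    """Length of the longest suffix of a matching a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(frags):
    frags = list(frags)
    while len(frags) > 1:
        n, a, b = max(((overlap(a, b), a, b)
                       for a in frags for b in frags if a is not b),
                      key=lambda t: t[0])
        frags.remove(a)
        frags.remove(b)
        frags.append(a + b[n:])        # join along the shared overlap
    return frags[0]

# Overlapping fragments of the made-up sequence "ATTAGACCTGCCGGAATAC"
print(greedy_assemble(["ATTAGACCTG", "GACCTGCCGG", "CCGGAATAC"]))
```

The three fragments share 6- and 4-base overlaps, so the greedy merge reconstructs the original 19-base sequence exactly.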

Bioengineering professor Michael Heller is a third member of the UCSD team. He is an expert on using electric fields to actively manipulate biomolecules and assemblies of nanostructures. His expertise will be utilized to accelerate and enhance the sequencing process.

October 09, 2006

Other tech: MIT material stops bleeding in seconds

MIT and Hong Kong University researchers have shown that some simple biodegradable liquids can stop bleeding in wounded rodents within seconds, a development that could significantly impact medicine.

When the liquid, composed of protein fragments called peptides, is applied to open wounds, the peptides self-assemble into a nanoscale protective barrier gel that seals the wound and halts bleeding. Once the injury heals, the nontoxic gel is broken down into molecules that cells can use as building blocks for tissue repair.

"We have found a way to stop bleeding, in less than 15 seconds, that could revolutionize bleeding control," said Rutledge Ellis-Behnke, research scientist in the MIT Department of Brain and Cognitive Sciences.


Doctors currently have few effective methods to stop bleeding without causing other damage. More than 57 million Americans undergo nonelective surgery each year, and as much as 50 percent of surgical time is spent working to control bleeding. Current tools used to stop bleeding include clamps, pressure, cauterization, vasoconstriction and sponges.

The exact mechanism of the solutions' action is still unknown, but the researchers believe the peptides interact with the extracellular matrix surrounding the cells. "It is a completely new way to stop bleeding; whether it produces a physical barrier is unclear at this time," Ellis-Behnke said.

The researchers are confident, however, that the material does not work by inducing blood clotting. Clotting generally takes at least 90 seconds to start, and the researchers found no platelet aggregation, a telltale sign of clotting.