Pages

February 08, 2008

Myostatin inhibitors can triple pectoral size but remember to also use gene therapy to increase tendon strength 70%


A boy with a natural myostatin-blocking mutation. Antisense RNA has been used to achieve myostatin inhibition in mice

Mice lacking the myostatin gene have 25–30% increased muscle mass. Individual muscles, such as the pectoralis and quadriceps, of myostatin mutant mice are two- to threefold heavier than those of wild type mice.

Antisense RNAs were injected into normal mice intravenously (i.v.) at a dose of 5 mg/kg, twice weekly for 4 weeks. Two days after the last injection, the leg muscles were weighed and the ratio of leg muscle weight to body weight in each group was calculated.
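As a rough check on the cumulative dosing implied by that schedule (a minimal sketch in Python; the ~25 g mouse body weight is an assumed typical value, not a figure from the study):

    # Cumulative antisense RNA dose over the injection schedule described above
    dose_mg_per_kg = 5.0          # 5 mg/kg per injection
    injections_per_week = 2
    weeks = 4
    mouse_weight_kg = 0.025       # assumed ~25 g laboratory mouse

    total_injections = injections_per_week * weeks              # 8 injections
    dose_per_injection_mg = dose_mg_per_kg * mouse_weight_kg    # 0.125 mg each
    print(total_injections, dose_per_injection_mg,
          dose_per_injection_mg * total_injections)             # 8, 0.125, 1.0 mg total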


Myostatin blockers could let more people look like this bodybuilder, who did not have myostatin blocked but built his muscles the old-fashioned way through hard work.

Another study indicates that myostatin-blocker-enhanced muscles will lead to more tendon injuries.

So it is a good thing that they have found that gene therapy can be used to increase tendon strength in rats by 70%.


Tendons transduced with BMP-14 exhibited less visible gapping, a greater number of neotenocytes at the site of healing, and 70% greater tensile strength than did either those transduced with GFP or the sham controls at two weeks after repair. Histological examination revealed no inflammatory response to the adenovirus in tendons transduced with BMP-14 or GFP. No ectopic bone or cartilage formed in the tendons transduced with BMP-14.
They believe this will help tendons heal.


Gene therapy could one day help cure diseases like cancer, AIDS and diabetes, and it has been estimated that by 2011 the market for gene therapy products will be US$6.5 billion.

They are using gene therapy to hinder HIV by sabotaging HIV's ability to replicate.

The new method treats the AIDS virus by using a modified HIV virus, called VRX496, to attack and destroy the protective shell, or envelope, of the virus within living cells.

Typically, viruses reproduce by taking over the replication machinery of a living cell, turning the cell into a factory that pumps out more copies of the virus.

By attacking the protective envelope of the HIV virus, the new therapy suppresses the viability of the HIV virus, meaning that the virus can no longer replicate itself as successfully.


However, the possible changes to the human body don't end with disease. Researchers have also been able to reduce fat, pump up muscle (myostatin blockers), prevent and help recovery from radiation damage, lengthen lifespans and change the color of mice through gene manipulation. Although gene therapy is not yet considered safe for humans, it is only a matter of time before better techniques arrive that make it possible on a large scale.

Gene therapy is leading to faster and cheaper drug development and production.

It is quicker to make transgenic animals using gene therapy. Current methods of producing such animals involve microinjection and cloning, which is a more expensive and longer process. These methods are inefficient and also carry a risk of producing offspring with developmental abnormalities. Instead of the 100 to 300 attempts needed to make clones that work, with gene therapy about 10% of animals bred the desired changes.

RNA interference (RNAi) for gene therapy

At least six clinical trials using RNA interference (RNAi) have been approved, “with many more coming down the pipeline,” according to the Editorial by Mark A. Kay, MD, PhD, an Associate Editor of Human Gene Therapy and the Dennis Farrey Family Professor in Pediatrics and Professor of Genetics at Stanford University School of Medicine. “One thing is clear,” adds Kay, “small RNAs as a therapeutic platform are here to stay.”

Gene therapy could fix chronic pain.

The rats were injected with a gene that tricks the body into releasing endorphins, a natural painkiller, in the nerve cells surrounding the spinal cord.

The treatment simulates the effect of painkilling drugs but is much narrower in scope, targeting nerve cells along the spinal cord but not in the brain or in other parts of the central nervous system.


FURTHER READING
Myostatin inhibitor trials on humans

UK sports study expects myostatin inhibitor use by 2012

Future Fab@home local production is not assured victory over centralized production

Jamais Cascio, founder of Open the Future and a director at CRN, says nanofactories will have a huge impact: "If it becomes cheaper and more efficient to have something printed out locally instead of made in China, it will have a big effect on things like trade balances, international labor, and ... our national economy."

I think that even with improved technology, centralized production will not go away.

Technological impact starts occurring before nanofactories arrive, with advances in the current generation of 3D printers, fabbers and rapid manufacturing systems. Also, the technology impact is not exclusively in favor of local manufacturing versus production in China. Transportation costs to ship 140 million 100-gram UAVs do not put centralized production at much of a disadvantage relative to local fabrication. So do transportation costs outweigh economies of scale? Big photocopying jobs still go to Kinko's, and bigger printing runs of marketing materials still get outsourced to China. I see the relationship between local printable electronics and fabbers being the same.
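A quick back-of-envelope on that UAV example (a minimal sketch; the sea-freight rate is my assumption, purely for illustration):

    # Total shipping mass for 140 million 100-gram UAVs, and the implied
    # freight cost per unit at an assumed bulk sea-freight rate
    units = 140_000_000
    unit_mass_kg = 0.1
    total_mass_tonnes = units * unit_mass_kg / 1000    # 14,000 tonnes
    assumed_freight_per_kg = 0.10                      # assumed $/kg, sea freight
    cost_per_unit = unit_mass_kg * assumed_freight_per_kg   # about $0.01 each
    print(total_mass_tonnes, cost_per_unit)

At roughly a cent of freight per unit, transport adds almost nothing, so the comparison comes down to unit production cost, where economies of scale favor the centralized factory.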


Adaptable but relatively large reel-to-reel printable electronics production could be part of the warehouse area of a Walmart or Costco, just like the photo printing at Walmart, Costco and Kinko's now.

Monster high-volume production would still make sense in China.

It is not just that you can have bigger machines with higher economies of scale; the economies of scale also have to do with having the customers to keep the machines busy and running all the time. There are economies of constant operation and amortization, and economies of specialization. The manufacturing specialist will be better at it because that is what they do all the time.

I do not think the business factors change that much even with nanofactories (at least not beyond the current products like printing). I think the big nanofactory will still be more efficient and lower cost than the desktop nanofactory.

Economies of scale

The common ones are purchasing (bulk buying of materials through long-term contracts), managerial (increasing the specialization of managers), financial (obtaining lower-interest charges when borrowing from banks and having access to a greater range of financial instruments), and marketing (spreading the cost of advertising over a greater range of output in media markets). Each of these factors reduces the long run average costs (LRAC) of production by shifting the short-run average total cost (SRATC) curve down and to the right.
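The mechanism underlying all four factors is spreading fixed costs over more output. A minimal sketch with made-up numbers:

    # Average cost falls with output when fixed costs are spread over more units
    def average_cost(quantity, fixed_cost=1_000_000, variable_cost_per_unit=2.0):
        return fixed_cost / quantity + variable_cost_per_unit

    for q in (1_000, 100_000, 10_000_000):
        print(q, round(average_cost(q), 2))
    # 1,000 units -> $1,002.00 each; 10,000,000 units -> $2.10 each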



There are diseconomies of scale as well.

Causes
1. Cost of communication
2. Duplication of effort
3. Top-heavy companies
4. "Office politics"
5. Isolation of decision makers from results of their decisions
6. Slow response time
7. Inertia (unwillingness to change)
8. Cannibalization
9. Large market share / portfolio
10. Public and government opposition
11. Other effects related to size


FURTHER READING
Production costs and pricing

This discusses the advantages and disadvantages of mass production, jobbing and batch production.

Sousveillance DIY: witness cameras and the future of surveillance


Inverse surveillance is a subset of sousveillance with a particular emphasis on "watchful vigilance from underneath". The idea is that citizens should be working together to watch society and government to prevent abuse of power and to help detect and defend against terrorists and others who would act against society.

An interesting new development is that there are "do it yourself" (DIY) instructions on "How to build your own witness camera." When the camera detects people moving around, it silently starts recording to digital media.

The Witness Camera is a combination of a VGA CMOS camera, a passive-infrared movement sensor, a 1 GB SD card (or bigger), and an AVR Mega32 microcontroller implementing a solid-state time-lapse recorder. It is a compact, complete, self-contained surveillance system designed with home users in mind. It can be installed in minutes wherever there is a mains plug, and it is affordable because it is built using a handful of inexpensive parts.

The component prices are less than $100 for the camera part and maybe $150 for a display (you may already own a suitable display with your own computer). Commercial systems range from $1,000 to $2,000+ and have hard drives and video disk or tape recording.

About $50 for the camera
$14 for the remote
$14 for the Passive IR


You need an out-of-sight place, spacious enough to accommodate the recorder and the video display ($150), because images can only be inspected using the original recorder.


Your mileage can vary, but you should be able to get about 50,000 frames at 320x200 (like the one below), or 25,000 at 640x480, using a 1 GB card (I haven't tried bigger cards). This corresponds to more than 40 hours of overall recording. The actual time span covered is much more than that. Likely, the best location for the camera is in the foyer, where people stand just a few minutes per day. In my case, just 20 minutes on average, giving an impressive 120 days of storage capacity.
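Those figures are internally consistent; a minimal sketch of the arithmetic:

    # Storage arithmetic implied by the quoted witness-camera figures
    card_bytes = 1_000_000_000
    frames_at_320x200 = 50_000                  # stated capacity on a 1 GB card
    print(card_bytes / frames_at_320x200)       # ~20 KB per compressed frame

    hours_of_recording = 40
    minutes_of_motion_per_day = 20              # the author's foyer average
    print(hours_of_recording * 60 / minutes_of_motion_per_day)   # 120 days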




All of the pieces should be fabricable with rapid manufacturing and printable electronics, so this will get even cheaper and more capable.


The state of sousveillance and surveillance will be radically transformed over the next 4-8 years even without full-blown nanofactories. Right now there are about 200 million vidphones (about 150+ million activated video cellphones) and camcorders (50+ million), and there are tens of thousands of closed-circuit television and other monitoring systems. There are over one billion camera cellphones.

Power efficiency and power generation will make it easier for always-on camcording. MIT and Texas Instruments have designed chips that are ten times more energy efficient and could run off of ambient energy (and thus could be always on); the low-power TI chips are 5 years away from commercialization. A few years back there was lab work that makes digital CMOS cameras 50 times more energy efficient. The recent development of systems to generate power (5 watts) from people walking, which takes the power from the braking part of the step, actually makes walking easier.

Superior Lidar, t-rays and better satellite and other remote sensing too.

Progress with computers and software automatically deciphering what is in the digital images. Quantum computers will also help with pattern recognition and faster image database searches.

So within 6 years there will be billions of always-on vidphones/camcorders, 10-20 times as many as now.
- Serious reel-to-reel fabrication of printable electronics integrated with upgraded fabbers (see what current million-dollar rapid manufacturing systems can make) could enable people to dump out smaller-than-USB-stick versions of the witness camera for less than 5 dollars apiece. Within ten years it could go to less than rice-grain size and be producible for pennies apiece. Once we are past the $5-apiece level, people can wire up every aspect of themselves, their home, their office cubicle, their car, etc.


FURTHER READING
Make magazine has other ideas that are suitable projects for rapid manufacturing and future fabber machines.

Example:
Do-it-yourself robotic inflatables that navigate autonomously and intelligently. They are light-seeking, helium-filled balloons that graze the landscape in search of light and cellphone signals.


So you can fab your own witness cameras and mount them in your inflatable flying robots.

Phase change memory, FRAM, MRAM, better flash and DRAM all getting released


Next-generation memories--such as FRAM, MRAM, PCM and others--are supposed to replace today's DRAMs and flash memory technologies. Current memory devices are expected to hit the wall, as the floating-gate reaches its physical limits.

Intel Corp. and STMicroelectronics Inc. reached a milestone, as they begin shipping prototype samples of their previously-announced phase change memory (PCM) line. The 90-nm, 128-megabit product is slightly late to the market; the companies were supposed to ship the device late last year.

As RAM and flash technologies run into scaling limitations over the next decade, PCM costs will decline at a faster rate, Intel and ST claimed, further predicting that the advent of multi-level-cell PCM will accelerate the cost per bit crossover of PCM relative to today’s technologies. The duo also projected that by combining the bit-alterability of DRAM, the non-volatility of flash, the fast reads of NOR, and the fast writes of NAND, PCM has the ability to address the entire memory market and be a key driver for future growth over the next decade.

Intel has said that it has produced prototypes of PCM devices and is shipping samples to customers now. Codenamed “Alverstone,” the devices are 256-Mbit multi-level (2-bit) cell devices manufactured at 90 nm. Taking the prototype down to the leading-edge lithography level of 45 nm would increase the 2-bit cell technology to 1 Gbit, which is still behind the latest flash at 16 Gbit.
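The 256 Mbit to 1 Gbit jump follows from idealized area scaling (a sketch; real layouts rarely scale perfectly):

    # Cell density scales roughly with the square of the feature-size shrink
    capacity_mbit_at_90nm = 256
    area_scaling = (90 / 45) ** 2                 # 4x more cells in the same area
    print(capacity_mbit_at_90nm * area_scaling)   # ~1024 Mbit = 1 Gbit at 45 nm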

Unofficially, the 90-nm, 128-Mbit part is being billed as a NOR flash compatible replacement. Cliff Smith, technical industry manager at Intel, said that the part provides fast read and write speeds at lower power than conventional flash, and allows for bit alterability normally seen in RAM.

According to Al Fazio, memory technology development director at Intel, only three memory technologies more or less meet the next-generation memory criteria: MRAM, FeRAM and PCM, with the latter being the most promising. PCM is the most promising because it appears to have the capability to scale down to 5 nm and beyond.

PCM is already proven in certain applications; it is the material that is used in rewritable CD-ROMs. There, it is used with a laser — photon energy changes its material from an amorphous to a crystalline state. “We’re trying to use an electrical current — IR heating — instead of light to change the memory,” Fazio said. “While it’s doable, it hasn’t been proven on a large manufacturing base. That’s a major hurdle — moving this feasibility, the basic capability, to the manufacturing floor.” Another PCM concern is that although there are many studies showing that it is scalable down to probably <5 nm dimensions, producing a device around it is a different matter. “How do you build the device structure, what’s the wordline/bit line configuration of that array, how do you get all of this to work in those small dimensions? I have little doubt that it will be solvable, but it will require work,” Fazio said.


Meanwhile, in recent times, Freescale, NEC and others have rolled out rival MRAM devices. And Texas Instruments and others claim to be shipping another competitive technology called FRAM.

Japan's NEC recently claimed that it has developed the world's fastest MRAM. NEC's new "SRAM-compatible" MRAM can operate at 250 MHz. The MRAM has a memory capacity of 1 megabit.

The MRAM is still in the development stages, and eventually it will be targeted at select markets, said Masao Fukuma, senior vice president of NEC Electronics Corp. "Embedded memory is our first target," he told EE Times at ISSCC.



SanDisk builds NAND flash chips with 3-bit cells while others have only offered 2-bit cells
Several firms are working on NAND flash structures with four bits per cell. SanDisk is using the intermediate step, x3 MLC with three bits per cell, developed jointly with its partner Toshiba, for a 16-gigabit chip.

In comparison with single-level cell (SLC) NAND flashes, which are more expensive because of their larger die areas, MLC memory chips do, however, have disadvantages. The number of write cycles that each individual cell survives is typically around 10,000 for MLCs, but the figure for SLCs is usually 100,000. Error correction for MLCs requires a more expensive 4-bit ECC technique, whereas 2-bit ECC is sufficient for SLCs. The higher cost of signal processing, moreover, reduces the data-transfer rate.
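The ECC and endurance penalties follow from how many distinct charge levels each cell must hold; a minimal sketch:

    # n bits per cell requires 2**n distinguishable charge levels, so the
    # margin between levels shrinks as bits per cell rise
    for bits_per_cell in (1, 2, 3):
        print(bits_per_cell, "bit(s)/cell ->", 2 ** bits_per_cell, "levels")
    # SLC: 2 levels, x2 MLC: 4 levels, x3 MLC: 8 levels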

A closely related chip with x2 MLC and half the capacity, on the other hand, is said to achieve more than 60 Mbytes per second. SanDisk quotes only 8 Mbytes per second for writing to the x3 MLC NAND flash. SLC NAND flashes are therefore still commonly built into fast solid-state disks (SSDs). Intel and Micron had also announced ONFI 2.0 NAND flashes with a capacity of eight gigabits and a write-data transfer rate of up to 100 Mbytes/s at ISSCC 2008.

SanDisk intends to begin shipping products. Shipments will start with 16-Gbit devices, followed by 32-Gbit parts in the second half of 2008.


FURTHER READING
Another alternative memory is programmable metallization cell or nanoionic memory

February 07, 2008

Carnival of Space Week 40

Carnival of Space #40 is up at orbiting frog

I contributed my article on SpaceX's progress on the Falcon 9 rocket

Hobby Space updates the activities of Bigelow Aerospace, who are making an inflatable space hotel

Centauri Dreams lives up to its site name with a talk about the Longshot space mission. It was a plan to develop technologies over 20-30 years for a 100-year mission to reach the Alpha Centauri star system.

Longshot was conceived as being built with modular components on the ground and then launched to low-Earth orbit for assembly at the space station presumed to be operational there. The enabling technologies included a “pulsed fusion micro-explosion drive” (I’m quoting from the Project Longshot report) with a specific impulse of 1 million seconds, along with a long-life fission reactor with 300 kilowatts power output.


The Longshot pdf report is here

Venture capitalist says nuclear fusion is coming

A picture from the patent filing for what General Fusion is trying to build with backing from Chrysalix Energy Venture Capital. GF will build a ~3-meter-diameter spherical tank filled with liquid metal (a lead-lithium mixture). Rams use compressed steam to accelerate pistons to ~50 m/s, making a compression wave in the liquid metal and producing a microsecond of fusion once per second. Note: this could be considered a variant of "steampunk nuclear fusion" made real, if it works.

"Within five years, large companies will start to think about building fusion reactors," Wal van Lierop, CEO of Chrysalix Energy Venture Capital, said in an interview at the Clean Tech Investor Summit taking place here this week. In three to four years, scientists will demonstrate results that show that fusion has a 60 percent chance of success, he said.

Lierop has backed General Fusion's Magnetized Target Fusion (MTF) model. An electric current is generated in a conductive cavity containing lithium and a plasma. The electric current produces a magnetic field and the cavity is collapsed, which results in a massive temperature spike. MTF has an advantage over other fusion techniques in that the plasma only has to stay at thermonuclear temperatures (150 million degrees Celsius) for a microsecond for a reaction to occur.

Canada's General Fusion has received $1.2 million in venture funding to conduct further research on its fusion reactors, according to VentureWire. The company's ultimate plan is to build small fusion reactors that can produce around 100 megawatts of power. The plants would cost around $50 million. That could allow the company to generate electricity at about 4 cents per kilowatt hour, relatively low.
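As a sanity check on the 4 cents/kWh claim (a back-of-envelope sketch; the plant lifetime and capacity factor are my assumptions, not General Fusion's figures):

    # Capital contribution to the cost of electricity for a $50M, 100 MW plant
    plant_cost_usd = 50e6
    power_kw = 100_000
    assumed_years = 20
    assumed_capacity_factor = 0.9
    lifetime_kwh = power_kw * 8760 * assumed_years * assumed_capacity_factor
    print(plant_cost_usd / lifetime_kwh)   # ~$0.003/kWh from capital alone

Capital alone would be well under a cent per kWh, leaving room for fuel, operations and financing inside the quoted ~4 cents.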


Here is a cutaway view of the insides of that pin cushion sphere of injectors.

General Fusion is using the MTF approach but with a new, patented and cost effective compression system to collapse the plasma.

GF will build a ~3 meter diameter spherical tank filled with liquid metal (lead-lithium mixture). The liquid is spun to open up a vertical cylindrical cavity in the center of the sphere (vortex). Two spheromaks (magnetized plasma “smoke ring”) are injected from each end of the cavity. They merge in the center to form a single magnetized plasma target. The outside of the sphere is covered with pneumatic rams. The rams use compressed steam to accelerate pistons to ~50 m/s. These pistons simultaneously impact the outside of the sphere and launch a spherical compression wave in the liquid metal. As the wave travels and focuses towards the center, it becomes stronger and evolves into a strong shock wave. When the shock arrives in the center, it rapidly collapses the cavity with the plasma in it. At maximum compression the conditions for fusion are briefly met and a fusion burst occurs releasing its energy in fast neutrons. The neutrons are slowed down by the liquid metal causing it to heat up. A heat exchanger transfers that heat to a standard steam cycle turbo-alternator to produce electricity for the grid. Some of the steam is used to run the rams. The lithium in the liquid metal finally absorbs the neutrons and produces tritium that is extracted and used as fuel for subsequent shots. This cycle is repeated about one time per second.
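One implication of the once-per-second cycle (a sketch; the steam-cycle efficiency is an assumed typical value, not a General Fusion number):

    # Energy budget per shot for a ~100 MW plant firing once per second
    net_electric_watts = 100e6
    shots_per_second = 1.0
    net_mj_per_shot = net_electric_watts / shots_per_second / 1e6
    print(net_mj_per_shot)                 # ~100 MJ of electricity per shot
    assumed_steam_cycle_efficiency = 0.33
    print(net_mj_per_shot / assumed_steam_cycle_efficiency)
    # each fusion burst must deposit roughly 300 MJ of heat in the liquid metal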

The use of low-tech pneumatic rams in place of sophisticated high power electrical systems reduces the cost of the energy delivered to the plasma by a factor of 10 making such a power plant commercially competitive. Because the fusion plasma is totally enclosed in the liquid metal, the neutron flux at the reactor wall is very low. Other fusion schemes struggle with a high neutron flux at the wall that rapidly damages the machine and also produces some radio-active material. Frequent robotic replacement of the then radio-active plasma facing components is a costly problem for many fusion machines.

General Fusion has patented this technology and believes that a reactor working on this principle could be built at a much lower cost than using the old magnetic and laser fusion approaches. Such a power plant would make fusion a commercially viable clean power source.


FURTHER READING
General Fusion's patents are here

Here are the basic steps that will be followed for the General Fusion MTF approach.

Over a dozen research papers at Los Alamos National Lab (1998-2007) on magnetized target fusion and other fusion methods

The graphic is of LANL's (Los Alamos National Laboratory) version of a magnetized target fusion system, using similar principles to the one the venture capitalist is backing.

An interesting paper is "Applications of predictions for FRC translation to MTF". FRC stands for field-reversed configuration.

We describe a physics scaling model used to design the high density Field Reversed Configuration (FRC) at LANL that will translate into a mirror-bounded compression region, and undergo Magnetized Target Fusion compression to a High Energy Density plasma. The theta-pinch-formed FRC will be expelled from inside a conical theta coil. At Kirtland AFRL the FRC will be compressed inside a flux-conserving cylindrical shell. Semi-empirical scaling laws, which were primarily developed and benchmarked for collisionless Field Reversed Configurations (FRC), are expected to remain valid even for the collisional regime of the FRX-L experiment. The scaling laws are used to predict the time available for the translation compared to the lifetime of the FRC. This approach is used to outline the design and fabrication of the integrated MTF plasma compression experiment.


Another 47-slide PowerPoint presentation is "Magnetized Target Fusion: Plans & Prospects"




LANL is using "can crusher technology" to magnetically crush an aluminum liner.


LANL will be running some pretty big tests in 2008


Implode a plasma by rapidly compressing the metal liner. Each test run costs $130,000 to $150,000.

Hypersonic vehicle designs and progress



From the designers of Skylon, a fairly practical spaceplane design, comes the Mach 5 A2 commercial Concorde replacement. The A2 and its Scimitar engine are more affordable and longer-lasting versions of the Skylon spaceplane and its Sabre engine.

Analysis of the Development, Production and Operating costs suggests that the average ticket price would be comparable to an existing Business class ticket. The A2 vehicle could capture all of the current business and first class traffic due to the greatly reduced journey time of 4.6 hours compared to the current 22 hours.

Unlike Concorde the A2 vehicle has exceptional range (approx 20,000 km both subsonic and supersonic) and is therefore able to service a large number of routes whilst simultaneously avoiding supersonic overflight of populated areas. Its good subsonic performance enables it to service conventional subsonic overland routes thereby increasing its sales potential to airlines.



To achieve the range requirement liquid hydrogen fuel is mandatory since the specific calorific energy of hydrocarbon fuels is too low. Reaction Engines have conceived the Scimitar precooled engine concept which exploits the unique thermodynamic properties of liquid hydrogen.


The Scimitar Engine is a derivative of the Sabre spaceplane engine intended for SSTO launcher application. Consequently most of the Scimitar engine technology is similar to Sabre but designed for much longer life. Both engines are designed around existing gas turbine, rocket and subsonic ramjet technology. However the incorporation of lightweight heat exchangers in the main thermodynamic cycles of these engines is a new feature to aerospace propulsion.


The A2 is larger than an Airbus A380 super jumbo jet.



FURTHER READING
More pictures of the A2

The Reaction Engines site; they designed the Skylon space plane and the A2 commercial transport (the A2 is also referred to as the LAPCAT project: Long-Term Advanced Propulsion Concepts and Technologies)

Skylon Images



The key to the Skylon and A2 planes are precooled engines as explained in this 9 page pdf.

The issues relevant to propulsion design for Single Stage To Orbit (SSTO) vehicles are considered. In particular two airbreathing engine concepts involving precooling are compared; SABRE (Synergetic Air-Breathing and Rocket Engine) as designed for the Skylon SSTO launch vehicle, and a LACE (Liquid Air Cycle Engine) considered in the 1960’s by the Americans for an early generation spaceplane. It is shown that through entropy minimisation the SABRE has made substantial gains in performance over the traditional LACE precooled engine concept, and has shown itself as the basis of a viable means of realising a SSTO vehicle. Further, it is demonstrated that the precooler is a major source of thermodynamic irreversibility within the engine cycle and that further reduction in entropy can be realised by increasing the heat transfer coefficient on the air side of the precooler. If this were to be achieved, it would improve the payload mass delivered to orbit by the Skylon launch vehicle by between 5 and 10%.


The US hypersonic program is also looking to use fuel to cool the engine of its hypersonic planes.

Successful recent ground tests of jet-fueled, ramjet/scramjet demonstrator engines by Pratt & Whitney Rocketdyne and Aerojet represent important progress toward flight-testing of three separate hypersonic-vehicle programs.

Using JP-7 jet fuel, PWR ran the combustor successfully at a variety of Mach numbers from Mach 2.5 to Mach 6.0, demonstrating "desired operability and performance" at each speed, the company said.

FaCET aims to develop a hypersonic test vehicle -- which could fly in 2012 -- that would take off and land by itself, use an advanced turbojet to get up to a speed of at least Mach 4 and then use a liquid hydrogen-powered scramjet to get to Mach 10 and beyond. Jet fuel can't be used as a scramjet fuel at speeds as high as Mach 10.


The US should be flight testing an unmanned Mach 10 aircraft in 2008


NASA hypersonics expert Dr Isaiah Blankson believes that MHD energy conversion in the intakes can take 30-40% of the energy, letting a turbine engine run at up to Mach 7. One engine would be able to take the plane all the way to Mach 7 and have a lot of extra power for things like military lasers or railgun launchers.

The advantage of this proposal is that it seems like a simpler design than some other proposals for scramjets. Scramjets promise to be better than rockets by not needing the 75% of launch weight that is oxidizer, but designs need to be simpler and not replace the oxidizer with a heavier and more expensive aircraft.

Reportedly, Blankson says extracting 30 to 40 per cent of the inflow energy would cut its speed by 50 to 75 per cent. That sounds counterintuitive, as kinetic energy is proportional to the square of velocity, but presumably a man with his background knows what he's on about. Potentially, a Mach 7 flow would slow to Mach 3 downstream of the MHD, and then a Blackbird type setup could handle it.
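The relation is easy to check from KE = ½mv² (a minimal sketch of the kinematics only, ignoring the pressure and thermal terms in a real intake flow):

    from math import sqrt

    # Speed remaining after extracting a fraction f of the kinetic energy
    for f in (0.3, 0.4):
        print(f, round(sqrt(1 - f), 3))    # 30% -> 0.837v, 40% -> 0.775v

    # Conversely, slowing a Mach 7 flow to Mach 3 removes
    print(round(1 - (3 / 7) ** 2, 2))      # ~0.82, i.e. 82% of the energy

On pure kinematics, extracting 30-40% of the energy cuts speed by only about 16-22%, and a 50-75% speed cut corresponds to removing 75-94% of the energy; presumably the reported figures fold in effects beyond this simple relation.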


February 06, 2008

Rewritable holograms will revolutionize medicine and advertising


University of Arizona optical scientists have broken a technological barrier by making three-dimensional holographic displays that can be erased and rewritten in a matter of minutes. The displays are currently 4 inch by 4 inch and red only, but life-size color displays that could still be rewritten in minutes are coming. If the writing speed can be increased 5,000 times, then 3D television and movies would be possible. Full-size color without special eyewear that changes every 3 minutes would be huge for advertising. In terms of giving people a day-to-day feeling that "the future has arrived," this will be one of those things. Full-color, life-size holograms that can be viewed without glasses, that change every few minutes, and that advertisers have placed along with every billboard, bus stop and storefront will be something people run into dozens of times a day.
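Why 5,000 times is the relevant speedup (a minimal sketch of the arithmetic):

    # A 5000x speedup on the ~3-minute rewrite time reaches video frame rates
    rewrite_seconds = 3 * 60
    speedup = 5000
    seconds_per_frame = rewrite_seconds / speedup
    print(seconds_per_frame, 1 / seconds_per_frame)   # 0.036 s -> ~28 frames/s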

The holographic displays – which are viewed without special eyewear – are the first updatable three-dimensional displays with memory ever to be developed, making them ideal tools for medical, industrial and military applications that require "situational awareness."


The 4-inch-by-4-inch prototype display that Peyghambarian, Tay and their colleagues created now comes only in red, but the researchers believe much larger displays in full color could be developed. They next will make 1-foot-by-1-foot displays, then 3-foot-by-3-foot displays.

"We use highly efficient, low-cost recording materials capable of very large sizes, which is very important for life-size, realistic 3-D displays," Peyghambarian said. "We can record complete scenes or objects within three minutes and can store them for three hours."

The researchers also are working to write images even faster using pulsed lasers.

"If you can write faster with a pulsed laser, then you can write larger holograms in the same amount of time it now takes to write smaller ones," Tay said. "We envision this to be a life-size hologram. We could, for example, display an image of a whole human that would be the same size as the actual person."


Railguns for space launch


The source of this post is a 10-page IEEE paper, Launch to Space With an Electromagnetic Railgun, by Ian R. McNab, Senior Member, IEEE.

The cost of electricity for a launch would be negligible, as shown below. Barrel life is central to the successful economics of this system. A system might cost $1.3 billion and launch for about $500/kg. Recent tests fired 7-pound projectiles at 5,637 mph; lunar escape velocity is 5,324 mph. So the truck-sized system is already good enough to launch from the surface of the moon. The classic science fiction scenario of "The Moon Is a Harsh Mistress" by Heinlein could become reality.

Other gun launch systems were reviewed and found lacking: only electromagnetic railguns seem worthy of further study for this application.

This choice was made on the basis that:
• they have already achieved 7 km/s at small scale, and 10.6 MJ at 2-3 km/s (with a test system able to go to 32 MJ);
• significant development is being funded for military applications;
• they offer the possibility of achieving the muzzle velocities and energies required;
• the potential cost savings seem significant based on our estimates.

Methods of accelerating large masses in large bore railguns will need to be developed, and some concepts are suggested here.


Muzzle velocities in the range needed for a moon-based launch system (about 2.5 km/s) have already been achieved in the recent test firings. Then it would just be a matter of scaling up energy linearly for heavier masses, since E = ½mv². The 10.6 MJ system shot a 7-pound projectile; the current 32 MJ system could fire 21 pounds (10 kg) at the desired speed, and a 320 MJ system could fire 100 kg payloads. Using resources available on the moon, this could serve as the forward base for sending material to Mars in support of a manned mission, or to supply orbital infrastructure around the Earth.

Even a scaled model would have substantial energy requirements: 10 kg at 7 km/s is a muzzle energy of 250 MJ, and with a launcher efficiency of 80%, an energy input 300 MJ would be required. This is comparable to the energy obtained from capacitor modules for the U.S. National Ignition Facility for laser fusion research.
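The numbers above all follow from E = ½mv² (a minimal sketch; 7 lb ≈ 3.2 kg and 5,637 mph ≈ 2,520 m/s):

    # Muzzle kinetic energy in megajoules
    def muzzle_energy_mj(mass_kg, velocity_ms):
        return 0.5 * mass_kg * velocity_ms ** 2 / 1e6

    print(muzzle_energy_mj(3.2, 2520))    # ~10 MJ: the 7 lb record shot
    print(muzzle_energy_mj(10, 2520))     # ~32 MJ: 10 kg at lunar-launch speed
    print(muzzle_energy_mj(100, 2520))    # ~318 MJ: a 100 kg payload
    print(muzzle_energy_mj(10, 7000))     # ~245 MJ; /0.8 efficiency -> ~306 MJ in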

The estimated system cost of $1.3B and a component life of 10,000 launches without replacement yields a cost of about $530/kg into orbit. It is important to note that this does not include the cost of the vehicle itself or operational costs on the Earth or in space; these items still need to be estimated.



The UTSTAR railgun tube.


Railgun launcher parts and sizes for the IEEE designed system


A chart with speed, energy and other variable tradeoffs.


Basically, below 7 km/s the total energy needed to launch a commercially viable amount of annual payload increases rapidly.

The extension of this technology to the muzzle velocities (~7,500 m/s) and energies (~10 GJ) needed for the direct launch of payloads into orbit is very challenging, but may not be impossible. For launch to orbit, even long launchers (~1,000 m) would need to operate at accelerations over 1,000 gees to reach the required velocities, so it would only be possible to launch rugged payloads, such as fuel, water, and material. Estimated launch costs could be attractively low (~$600/kg) compared with the Space Shuttle (~$20,000/kg), provided that acceptable launch rates can be achieved.
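The acceleration figure follows from constant-acceleration kinematics, a = v²/2L (a minimal sketch):

    # Acceleration needed to reach muzzle velocity v over barrel length L
    v_ms = 7500.0
    barrel_m = 1000.0
    g = 9.81
    accel = v_ms ** 2 / (2 * barrel_m)
    print(accel, accel / g)    # ~28,000 m/s^2, i.e. roughly 2,900 gees,
                               # comfortably over 1,000 gees for a 1,000 m barrel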

So the muzzle speed must triple, and the energy must rise to roughly 1,000 times the current test level, or about 330 times the current 32 MJ system.

A disadvantage of gun launch is that the launch package has to leave the gun barrel at a very high velocity (~7,500 m/s) through the Earth's atmosphere, leading to a very high aerothermal load on the projectile.

However, the current 32 MJ system is only about the size of a truck. So a nice big scramjet that could fly at Mach 10-12 could use a moderately scaled-up version of the current railgun system to fly above most of the atmosphere and then fire hardened payloads into orbit. Then less heat shielding would be needed.

To provide 500 tons/year to orbit would require 2000 launches/year—a little over five per day on average.
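Those two figures also pin down the per-launch economics (a back-of-envelope cross-check of the ~$530/kg claim above):

    # Amortized capital cost per kilogram, from the paper's stated figures
    system_cost_usd = 1.3e9
    component_life_launches = 10_000
    cost_per_launch = system_cost_usd / component_life_launches   # $130,000
    payload_kg_per_launch = 500_000 / 2000    # 500 t/yr over 2000 launches = 250 kg
    print(cost_per_launch / payload_kg_per_launch)   # ~$520/kg, near the quoted ~$530/kg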




The launch package is cargo within a shaped shell with a small rocket.


February 05, 2008

Railgun on track for 2012 deployment trials on US warships


Photograph taken from a high-speed video camera during a record-setting firing: a seven-pound bullet fired from a truck-sized electromagnetic railgun at seven times the speed of sound sent a visible shockwave through the air before crashing into a metal bunker filled with sand.

UPDATE: New article written on using railguns for space launch. This system already would be able to launch projectiles with enough speed for lunar based space launches.

Hypersonic vehicles could carry railguns as weapons, or use smaller railgun systems to launch payloads into space from above most of the atmosphere, where less shielding is needed for the projectiles. The MHD version of a hypersonic plane would be ideally suited for powering railgun and laser weapons.

Hat tip to reader Scott: the U.S. Navy has demonstrated the world's most powerful electromagnetic rail gun (EMRG) at 10.64 megajoules.


An electromagnetic catapult, or railgun, is on track for deployment on U.S. warships around 2012, according to the Office of Naval Research (ONR).

The Navy's latest test made history with the world's fastest muzzle velocity of 5,637 miles per hour--generating a record 10.6 megajoules of energy (1 joule = 1 watt-second).




If the Navy decides to deploy the railgun, it plans to have a final design in place for approval by 2012. Initial prototypes will probably shoot a single projectile, but plans for rapid-fire versions are already on the drawing board.

The final design specification calls for a muzzle velocity of 5,760 mph for a weapon that is capable of launching a projectile in a parabolic ballistic path 94 miles high. It must strike targets within six minutes at 3,840 mph.

Initial tests showed that targets can be obliterated by the kinetic force of the impact with pinpoint accuracy without shrapnel, which is the most common cause of collateral damage when using high-explosive munitions.


At full capability, the rail gun will be able to fire a 40-pound projectile more than 200 nautical miles at a muzzle velocity of Mach 7, impacting its target at Mach 5. In contrast, the current Navy gun, the MK 45 five-inch gun, has a range of nearly 13 miles. The high-velocity projectile will destroy its targets with its kinetic energy rather than with conventional explosives.

The safety aspect of the rail gun is one of its greatest potential advantages, according to Dr. Elizabeth D'Andrea, ONR's Electromagnetic Railgun Program Manager. Safety on board ship is increased because no explosives are required to fire the projectile and no explosive rounds are stored in the ship's magazine.

Science and technology challenges met by ONR in the development of the rail gun include development of the launcher, pulse power generation and the guided projectile design. The program's goal is to demonstrate a full capability, integrated railgun prototype by 2016-2018.


MIT Technology Review also covers the rail gun

FURTHER READING
In November 2007, I reported on a 32 megajoule rail gun delivered for testing.

BAE Systems has delivered a functional, 32-megajoule Electro-Magnetic Laboratory Rail Gun (32-MJ LRG) to the U.S. Naval Surface Warfare Center in Dahlgren, Va. Installation of the laboratory launcher is currently underway, and according to BAE, this is the first step toward the Navy’s goal of developing a tactical 64-megajoule ship-mounted weapon.

DNA pistons for powering nanotechnology devices


Nanoscopic DNA pyramids that change shape when sent different chemical signals have been demonstrated by researchers in the UK and Germany. Such structures could act as the motors of nanoscale robots, they say.

The researchers demonstrate the operation of reconfigurable DNA tetrahedra whose shapes change precisely and reversibly in response to specific molecular signals. Shape changes are confirmed by gel electrophoresis and by bulk and single-molecule Förster resonance energy transfer measurements. DNA tetrahedra are natural building blocks for three-dimensional construction; they may be synthesized rapidly with high yield of a single stereoisomer, and their triangulated architecture conveys structural stability. The introduction of shape-changing structural modules opens new avenues for the manipulation of matter on the nanometer scale.


The DNA piston work is described at New Scientist magazine.

Other researchers have previously built DNA devices capable of walking along proteins or functioning like nanoscopic robot arms, but precise control of these 3D structures has proven difficult.

Now Andrew Turberfield of Oxford University in the UK, and colleagues at the University of Bielefeld in Germany, have shown how carefully crafted DNA structures can be made to self assemble and change shape when sent specific DNA signals.

The researchers built tetrahedrons – structures with four triangular sides – using four short DNA "struts" that join at each end. The process exploits the way DNA is held together by complementary bases that form the rungs of a ladder-like structure.

Turberfield and colleagues first created DNA molecules with some bases on one side of the "ladder" left exposed. These bases were carefully chosen to match those of other DNA molecules so that, when mixed together, the right combination of DNA strands assembled into a tetrahedron.

The same trick can change the tetrahedron shape, by causing the struts to extend or shorten. This is possible if a strut has a loop of excess DNA in its middle (see image, top right). Adding a "fuel" strand of DNA straightens out the loop by binding onto it, and makes the strut extend.



A different sequence of "anti-fuel" DNA returns the strut to its normal length. "The fuel is designed to have a dangling free end," explains Turberfield. "The anti-fuel eventually displaces the fuel, allowing the loop to reform." The process can change the length of a strut from 3.4 nanometres to 10.2 nm and back again.
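For scale, B-form DNA rises about 0.34 nm per base pair, so the two strut states correspond to roughly 10 and 30 base pairs of duplex (a rough equivalence, treating the strut as a straight double helix):

    # Strut lengths expressed in base pairs of B-form DNA (~0.34 nm rise/bp)
    rise_nm_per_bp = 0.34
    print(3.4 / rise_nm_per_bp, 10.2 / rise_nm_per_bp)   # ~10 bp and ~30 bp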

In experiments, the researchers made cages with two extendible struts that could be independently controlled using different DNA sequences. In theory, it should be possible to create cages in which every strut can be controlled independently, Turberfield says.

The researchers are now exploring ways to build larger structures using tetrahedral DNA structures as building blocks.

Drug delivery
In a commentary article concerning the new research, Chengde Mao at Purdue University in West Lafayette, Indiana, US, who was not involved with the work, says the structures could perhaps be used to carry a payload and release it on demand.

"It is likely that these studies will lead to responsive molecular cages that can encapsulate guests and release them on demand," he writes. "For example, it is now possible to enclose proteins in a DNA tetrahedron." The cages can also be decorated with carefully placed proteins, perhaps providing a way to get the body to treat cages in a particular way.

Turberfield agrees: "Using the cages to hold and release proteins could have uses in drug delivery." For example, he says, drug compounds normally destroyed by the body could travel to a particular organ in a DNA cage, before being released.


FURTHER READING

Andrew Turberfield's Oxford University site

Bielefeld

February 04, 2008

Marathon Oil's Bakken play

During the fourth quarter of 2007, Marathon Oil completed the acquisition of more than 70,000 net leasehold acres in the Bakken Shale play in North Dakota. The acreage brings Marathon's total Bakken Shale leasehold to more than 320,000 net acres. Marathon currently has six rigs running in its Bakken program and ended 2007 with a net production rate of 2,600 boepd.

Marathon leased six new drill rigs specifically designed for piercing the Bakken shale formation. The formation is about 10,000 feet below the surface and requires sophisticated techniques to fracture the oil-bearing strata.

Marathon had more than 200,000 acres under lease when it moved into North Dakota two years ago with plans to drill as many as 300 wells over the next several years.


FURTHER READING:
A Bakken Shale blog

Brigham Exploration has about 67,000 acres in the Bakken formation, and its first three wells are producing a total of 1,000 bopd

TriStar is developing the Saskatchewan part of the Bakken oil play.

Following the closing of the acquisition of the Private Southeast Saskatchewan Company Transactions, TriStar will have greater than 10,000 boepd of long-life, light oil production in its Southeast Saskatchewan core area, including a 100 percent working interest at Fertile. In addition, TriStar will have more than 1,175 (650 net) future development drilling locations for both conventional and Bakken light oil, representing potential future capital expenditures net to TriStar of over $900 million and providing an extensive production and opportunity base which will not be affected by the recently announced royalty changes in Alberta. TriStar believes the Fertile pool has greater than 86 million barrels of original oil in place on both the Combined Lands and the Private Company 100 percent lands; less than 2 percent has been recovered to date.


Dwave Systems has $17 million in added funding to make a quantum computer with thousands of qubits by the end of 2008

Dwave Systems closed a $17M financing round at the end of January 2008. According to the company, these funds will be used primarily to push the level of integration of its chips into the low thousands of qubits by the end of the year. In parallel with this central effort, it will be running experiments on smaller systems to map out features important to their operation as quantum computers.

As of November 2007, the latest iteration of D-Wave's chip was 28 qubits (quantum bits). CTO Geordie Rose said they were on track to show a 512-qubit machine in mid-2008 and 1024 qubits by the end of 2008. The die has room for a million qubits. This new announcement seems to imply 2,000-4,000 qubits by the end of 2008: "low thousands of qubits by the end of the year [2008]".

The latest financing round was fully subscribed by existing investors Draper Fisher Jurvetson (DFJ), GrowthWorks Capital Ltd, BDC Venture Capital, Harris & Harris, bcIMC and Pender Fund.

This was the number 1 item on my list of technologies to watch in 2008. It is ahead of the Bussard fusion work because the Bussard fusion prototype could answer a lot of the questions about the potential of that potentially bigger technology, but not all of the issues will be answered until a commercial fusion system is funded, built and operating. By the end of 2008, the Dwave quantum computer should be operating (or not) at a commercial level. Thousands of qubits will have applications where it should be clearly superior to any conventional computer system.

2-5 months until the Q2 2008 512-qubit machine.

9-11 months until a Q4 2008 2,000-4,000 qubit machine.

That will be great to see.

Hopefully by 2009-2010 we can fill out that die and get up to 1 million qubits and start transforming business and science.

UPDATE: Is it quantum computing?

Scott Aaronson, Dwave critic, has finally met Geordie Rose, CTO of Dwave. They met at MIT, where Geordie presented four hard problems on which he would like MIT's help.

These problems were as follows:

1. Find a practical adiabatic factoring algorithm. Because of the equivalence of adiabatic and standard quantum computing, we know that such an algorithm exists, but the running time you get from applying the reduction is something like O(n^11). Geordie asks for an O(n^3) factoring algorithm in the adiabatic model. It was generally agreed (with one dissent, from Geordie) that reducing factoring to a 3SAT instance, and then throwing a generic adiabatic optimization algorithm at the result, would be a really, really bad approach to this problem.

2. Find a fault-tolerance threshold for adiabatic quantum computing, similar to the known threshold in the circuit model. Geordie asserted that such a threshold has to exist, because of the equivalence of adiabatic and standard quantum computing. However, others immediately pointed out that this is not so: the equivalence theorem is not known to be “fault-tolerance-preserving.” This is a major open problem that many people have worked on without success.

3. Prove upper and lower bounds on the adiabatic algorithm’s performance in finding exact solutions to hard optimization problems.

4. Prove upper and lower bounds on its performance in finding approximate solutions to such problems. (Ed Farhi described 3 and 4 as “so much harder than anything else we’ve failed to solve.”)


Scott is leaving himself an out in case Dwave's system works in 2008:

Even if D-Wave managed to build (say) a coherent 1,024-qubit machine satisfying all of its design specs, it’s not obvious it would outperform a classical computer on any problem of practical interest. This is true both because of the inherent limitations of the adiabatic algorithm, and because of specific concerns about the Ising spin graph problem. On the other hand, it’s also not obvious that such a machine wouldn’t outperform a classical computer on some practical problems. The experiment would be an interesting one! Of course, this uncertainty — combined with the more immediate uncertainties about whether D-Wave can build such a machine at all, and indeed, about whether they can even produce two-qubit entanglement.


Scott also shows that he still does not understand business:

also means that any talk of “lining up customers” is comically premature


Geordie responded:

The first is that there are already buyers and sellers of quantum computers for research (Bruker NMR machines) and our systems are already much more useful and interesting than these.

The second is that we expect that even for fairly small systems (~1,000 qubits, which we plan to do this year) this type of special purpose hardware can beat the best known classical approaches for instance classes where the class embed directly onto the hardware graph even if the “spins” are treated entirely classically, which we assume is a worst-case bound. Often forgotten in this type of conversation is the fact that there is a long history of simple special purpose analog hardware outperforming general purpose machines. If you want an example, look at Condon and Ogielski’s 1985 Rev Mod Sci article–their Ising model simulator beat the fastest Cray of the time in Monte Carlo steps/second. You can’t draw conclusions about the general utility of this type of approach without looking at details.


I would note that it is standard business practice to pre-sell tickets to things that are not complete and may or may not work.

Examples: the Aptera electric car, the Tesla electric car, Toyota Prius sign-up lists, music concerts; Microsoft sells software subscriptions with the understanding that there should be major software upgrades (yet Vista and other major upgrades were delayed for years); Virgin Galactic has presold hundreds of tickets for its yet-to-be-completed sub-orbital rocket.

Reviewing my predictions on the future and recent Gartner predictions

Here is another update to my March 2006 technology predictions.

Prediction: Real-time biomarker tracking and monitoring 2008-2012

Progress: Cheap less than $100 USB gene tester

Old mockup of the cheap gene tester. The device is now much smaller than a shoebox (USB-stick size), with the optics and supporting electronics filling the space around the microchip.

Prediction: Real-time personalized disease treatment 2008-2012

Progress: The above gene tester can be used to test within minutes for adverse drug reactions which are a major problem in health care. By running a quick genetic test on a cancer patient, for example, doctors might pinpoint the type of cancer and determine the best drug and correct dosage for the individual.




Prediction: 80-200mpg cars - mainstream, batteries, ultracapacitors 5-10 times better 2008-2012

Progress:
The Australian UltraBattery has a life cycle at least four times longer and produces 50 per cent more power than conventional battery systems. It's also about 70 per cent cheaper than the batteries currently used in HEVs.

The EEStor ultracapacitor is expected in mid-2008 and will be used in Zenn Motors electric and hybrid cars


AFS Trinity has what it calls Extreme Hybrid (XH) technology which employs a proprietary dual energy storage system that combines Lithium-Ion batteries and ultra capacitors with control electronics. They showed their 150 mpg hybrid SUV at the North American International Auto Show (NAIAS) in Detroit.

Prediction: Customized cells 2010-2014

Progress: Synthetic life: a custom cell with completely synthesized DNA is likely in 2008.
A 582,970 base pair sequence of DNA has been synthesized.

Prediction: Gecko-mimicking wall-crawling suits for military and enthusiasts 2008-2012

Researchers at the University of California, Berkeley, have developed an adhesive that is the first to master the easy attach and easy release of the reptile's padded feet. The material could prove useful for a range of products, from climbing equipment to medical devices.

Prediction: Wireless superbroadband (50-1000Mbps) 2009-2012

Progress:
List of deployed WiMAX networks

Whitespace modems continue to be tested and could provide 50-100Mbps or faster speed

More on whitespace modems

Prediction: Fiber to the home (100Mbps-1000Mbps) 2010-2015

Progress: Various groups in the USA are pushing for a national broadband policy to be passed in 2008 to encourage 100 Mbps or faster connections. Japan had 93.7 Mbps average download speeds as of October 2007.

Fiber to the premises deployment history by country at wikipedia

Prediction: Advanced plastic circuits, computing, monitors and energy gathering-walls, roofs, desktops 2009-2012

Progress: Printable electronics is catching up to the speed of CMOS electronics. Several of the technologies are suitable for printing electronic displays in a method similar in speed and cost to how newspapers are made now (reel-to-reel printing of large areas).

Prediction: One billion digital video cameras posting online realtime; personal privacy is history 2008-2012

Progress: Over 1.24 billion cellphones will be shipped in 2008; over 1 billion of those will be camera phones. As of Q2 2007, there are over 131 million UMTS users (and hence potential videophone users) on 134 networks in 59 countries. Camcorder sales are 4.5 million to 5.8 million per year in the United States, and under 100 million camcorders total had shipped as of 2007. The key to this prediction is whether most new cellphones shift over to videophones, as over one billion cellphones will ship in 2009.

Prediction: 10 petaflop computer 2012-2013

Progress:
The Blue Gene/P machine at Argonne is supposed to reach one petaflop — 1 quadrillion sustained operations per second — in 2008. It should have a peak speed of three petaflops by the end of 2008.

Turek said IBM's goal was 10 petaflops by 2011 and 20 petaflops by 2017. The Japanese have announced their intent to reach 10 petaflops by 2012.


The Sun Constellation compute speed is estimated at 1.7 petaflops, and it will store up to 10 petabytes of data.

Fujitsu expects to build a supercomputer that can perform 3 quadrillion calculations per second (3 petaflops) in 2011

Prediction: Optical interconnects connect CPUs directly at 100 Gbps+ 2012-2018

Progress: IBM reveals core-to-core optical dream in progress.

IBM researchers have created a modulator that's one hundred to a thousand times smaller than other prior modulators and is theoretically capable of using light pulses to transmit data between cores, rather than relying on traditional wires. Chip-level optical routing would allow cores to communicate much faster than even the best wired connection (IBM estimates its nanophotonic technology would be 100 times faster) and would almost certainly eliminate any bandwidth-related bottlenecks within a single core.



Prediction: DNA nanotechnology creates nanotools and parts 2010-2015


Progress: Synthetic biology is making DNA for mechanical and electronic purposes: DNA to assemble millions of three-dimensional nanoparticles, and all-molecular programmable DNA construction.

Progress: Two artificial DNA "letters" that are accurately and efficiently replicated by a natural enzyme have been created by US researchers. Adding the two artificial building blocks to the four that naturally comprise DNA could allow wildly different kinds of genetic engineering, they say. This combines with the previous articles about using DNA to assemble millions of three-dimensional nanoparticles, being able to synthesize strings of DNA over 500,000 base pairs long, and all-molecular programmable DNA construction.

Prediction: Protein engineering creates artificial ribosome 2014-2022

Progress: A 582,970 base pair sequence of DNA has been synthesized. This is twenty times longer than the previous record DNA synthesis of one strand. A ribosome has 2.3 million base pairs.

Prediction: High resolution direct visual feeds to retina, able to fool viewer for short periods of time 2012-2020

Progress: Displays on contact lenses and on glasses that project into the eye.

Prediction: Superconducting engines on ships and planes, less than 1/3 the weight 2012-2020

Progress: All-Electric Ship Could Begin to Take Shape By 2012

March 29, 2007:
American Superconductor Corporation, a leading energy technologies company, and its strategic partner, Northrop Grumman, announced today the successful completion of factory acceptance testing for the world's first 36.5 megawatt (49,000 horsepower) high temperature superconductor (HTS) ship propulsion motor at Northrop Grumman's facility at the Philadelphia Naval Business Center.



Comparison of superconducting engine to regular engine

Prediction: Magnetic rail guns, with over 20 times the speed and power of
conventional guns 2013-2018


Progress: Railgun on track for 2012 deployment trials on US warships. The railgun program's goal is to demonstrate a full-capability, integrated railgun prototype by 2016-2018. A 10.6 megajoule shot reached a speed of 5,673 mph.
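As a sanity check on those numbers (my own back-of-envelope arithmetic, not Navy data), the quoted muzzle energy and velocity imply a projectile of a few kilograms:

```python
# Kinetic energy E = 0.5 * m * v^2, so m = 2E / v^2.
energy_j = 10.6e6                 # 10.6 megajoules, per the quoted shot
speed_ms = 5673 * 0.44704         # 5,673 mph converted to m/s (~2,536 m/s)
mass_kg = 2 * energy_j / speed_ms ** 2
print(f"implied projectile mass ~ {mass_kg:.1f} kg")   # ~3.3 kg
```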

Prediction: Military lasers on fighters, ships and tanks able to destroy other
vehicles 2012-2018


Progress:
100 kW solid-state lasers were being assembled in 2007.

Prediction: Breakthrough in handling or reducing long term waste from nuclear
fission - makes nuclear fission "clean" 2010+


Progress:
The molten salt reactor can generate 1,000 times less uranium and plutonium waste, and everything else that is left over has a half-life of less than 50 years.

Fissile fuel burnup of at least 50% should be achievable with adequate design of the Hyperion Power Generation uranium hydride reactor, leaving 50 times less nuclear waste. This reactor could be ready by 2012 and appears to be funded.

Prediction: Develop useful power generation from forms of nuclear fusion 2020+

Progress: Bussard's inertial electrostatic confinement fusion WB-7 prototype was activated in 2008. EMC2 Fusion has built an upgraded model of Bussard's last experimental plasma containment device, which was known as WB-6. This work is very important because, if it is successful, we could have commercial fusion in as little as 5 years.

Prediction: Develop useful space propulsion from nuclear fusion 2015+

Progress: If Bussard inertial electrostatic confinement fusion is successful, it would make getting into space 40 to 1000 times cheaper.

Prediction: Cellular life found on Mars 2010

Progress:
New Analysis of Viking Mission Results Indicates Presence of Life on Mars. The Phoenix mission, arriving in May 2008, could help to resolve some of these issues.

The Phoenix mission will land a telerobot in the polar region of Mars in May 2008. One of the mission's two primary objectives is to look for a 'habitable zone' in the martian regolith where microbial life could exist; the other is to study the geological history of water on Mars. The lander will have a 2.5 meter robotic arm capable of digging a 0.5 meter trench in the regolith. The arm is fitted with a camera that will be able to verify that there is material in the scoop when returning samples to the lander for analysis; this overcomes an important design flaw in the Viking landers.

Gartner Predictions [not my predictions, but a couple overlap with what I believe]:
Gartner, Inc. has highlighted 10 key predictions of events and developments that will affect IT and business in 2008 and beyond. Here I list a few that I found interesting and generally agree with.

Through 2011, the number of 3-D printers in homes and businesses will grow 100-fold over 2006 levels. Printers priced less than $10,000 have been announced for 2008, opening up the personal and hobbyist markets.


I have been following the rapid prototyping and rapid manufacturing industry for several years, and I noted the falling prices, increasing capability and increasing popularity of the various 3D printers back in October 2007.

The Desktop Factory 3D printer ($4,995) will be available in 2008. Their goal is a 3D printer below $1,000 by 2011:
1. Sales of hundreds of units in 2008, with a plan for 3,500 in 2009.
2. In 2010, a price point of roughly $2,000 and somewhere between 20,000 and 30,000 units.
3. In 2011, a price below $1,000 to enter the consumer space; they believe they will sell over 100,000 units a year.

By 2011, Apple will double its U.S. and Western Europe unit market share in computers. Apple's gains in computer market share reflect as much on the failures of the rest of the industry as on Apple's success.


I covered the likely increase in Apple's market share in January 2008.

By 2012, 80 per cent of all commercial software will include elements of open-source technology.

By 2011, early technology adopters will forgo capital expenditures and instead purchase 40 per cent of their IT infrastructure as a service.

By 2010, 75 per cent of organisations will use full life cycle energy and CO2 footprint as mandatory PC hardware buying criteria.


Texas Instruments and MIT make microchip ten times more energy efficient


Researchers at MIT and Texas Instruments (TI) (NYSE: TXN) today unveiled a new chip design for portable electronics that can be up to ten times more energy-efficient than present technology. The design could lead to cell phones, implantable medical devices and sensors that last far longer when running from a battery, arriving in about five years (2013+).

The key to the improvement in energy efficiency was to find ways of making the circuits on the chip work at a voltage level much lower than usual, explains Anantha Chandrakasan, the lead MIT researcher on this project. While most current chips operate at around 1 volt, the new design works at just 0.3 volts.

Reducing the operating voltage, however, is not as simple as it might sound, because existing microchips have been optimized for many years to operate at the higher standard voltage level. "Memory and logic circuits have to be redesigned to operate at very low power supply voltages," Chandrakasan says.

One key to the new design, he says, was to build a DC-to-DC converter - which reduces the voltage to the lower level - right onto the same chip, which is more efficient than having the converter as a separate component. The redesigned memory and logic, along with the DC-to-DC converter, are all integrated to realize a complete system-on-a-chip solution.

One of the biggest problems the team had to overcome was the variability that occurs in typical chip manufacturing. At lower voltage levels, variations and imperfections in the silicon chip become more problematic. "Designing the chip to minimize its vulnerability to such variations is a big part of our strategy," Chandrakasan says.

So far the new chip is a proof of concept. Commercial applications could become available "in five years in a number of exciting areas," Chandrakasan says. For example, portable and implantable medical devices, portable communications devices and networking devices could be based on such chips, and thus have greatly increased operating times. There may also be a variety of military applications in the production of tiny, self-contained sensor networks that could be dispersed in a battlefield.

In some applications, such as implantable medical devices, the goal is to make the power requirements so low that they could be powered by "ambient energy," Chandrakasan says - using the body's own heat or movement to provide all the needed power. In addition, the technology could be suitable for body area networks or wirelessly-enabled body sensor networks.


More details on the ultra low power CMOS design at Electronics Weekly

Researchers at the Massachusetts Institute of Technology (MIT) have developed a feedback-control scheme that interactively tunes CMOS operating voltage to minimise dissipation. Energy consumption in CMOS drops quadratically as its supply voltage is brought below its threshold voltage. However, according to MIT, leakage increases exponentially at the same time. This means that for any given circuit workload and temperature, there is a particular supply voltage that trades capacitive losses against leakage in a way that minimises power consumption.

Changing the 7-tap filter (at optimal voltage) to a 1-tap version drops power by 25 per cent at constant voltage, whereas feedback control achieves a cut of over 40 per cent. In the presence of leakage - added as a 1µA constant load to the circuit - power would almost triple, but the loop pulls this down to an increase of only 30 per cent. With temperature increasing from 0 to 85°C, the loop saves around 50 per cent of power compared with constant voltage operation, claimed MIT. The technique places no burden on the controlled ‘load’ and consumes a tiny fraction of the power it saves.
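The tradeoff described above, quadratic dynamic-energy savings against exponentially growing leakage, is easy to sketch numerically. Below is a minimal toy model (my own illustration with assumed, normalized device parameters; not MIT's or TI's circuit data) that sweeps the supply voltage to find the minimum-energy point:

```python
import numpy as np

# Toy minimum-energy-point model (illustrative only; all parameters assumed).
# Dynamic energy per operation falls quadratically with supply voltage V.
# Leakage energy per operation rises exponentially as V drops below the
# threshold voltage, because each operation takes exponentially longer.
Vth = 0.45              # assumed threshold voltage, volts
n_vt = 1.5 * 0.026      # subthreshold slope factor times thermal voltage
leak_ratio = 0.1        # assumed off-current / on-current scale

def energy_per_op(V):
    dynamic = V ** 2                                          # ~ C_eff * V^2, normalized
    leakage = V ** 2 * leak_ratio * np.exp((Vth - V) / n_vt)  # I_leak * V * delay
    return dynamic + leakage

V = np.linspace(0.2, 1.0, 801)
E = energy_per_op(V)
print(f"minimum-energy supply voltage ~ {V[np.argmin(E)]:.2f} V")
print(f"energy saving vs 1 V operation ~ {energy_per_op(1.0) / E.min():.1f}x")
```

With these made-up parameters the optimum lands near 0.4 V; the real optimum shifts with workload, temperature and process variation, which is why MIT's scheme tunes the voltage interactively with a feedback loop rather than fixing it at design time.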

Singularity lite: Focus on virtual versus physical

I had a previous article examining the next step or two in technological acceleration. Looking at the long-term history of economic and technological progress, the next step should be a consistent 16-25% annual growth rate.

Each "step up in consistently higher technological growth rate" is according to Robin Hanson a series of about seven doublings at a higher rate of growth. Across the full series then an economy would increase by 128 times from the beginning to the end of one average series of doublings.

A comment from Walheed at onsingularity: So far, technologies shrink certain markets, but open even bigger ones. And it seems that this trend is going to continue for the foreseeable future. However, long term, there is a possibility that we might see technological progress continue marching ahead while economic growth (measured in dollars exchanged per year) might not follow.


I accept and recognize that old markets and industries get shrunk or displaced, as I briefly mentioned in the original article. However, I also see technological hypergrowth that is so strong that it does not need redefinitions of growth to capture or see it. There might be some decoupling of progress from currency transactions, but if something akin to technological or economic growth runs at 30-50% per year, then there will be startling transformation at the physical level as well. Converting the matter of the solar system into a Dyson shell of computronium would be a matter of when, not if. From 1903, it takes about 120 years to go through 7 rounds of doubling. If we kick up into another gear of faster growth, then the next set of 7 rounds of doubling takes 21-50 years (over 100 times more growth after each set). There were qualitative and definitional changes from 1903 to now as well: telecommunications was the telegraph, and now it is the internet, mobile phones, fax, and a lot of other businesses.
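The doubling arithmetic here is simple to check. A quick sketch (my own calculation, using the growth rates quoted above):

```python
import math

def years_for_series(annual_growth, doublings=7):
    # Years to complete a series of doublings at a compound annual growth rate.
    return doublings * math.log(2) / math.log(1 + annual_growth)

print(2 ** 7)  # one full seven-doubling series multiplies the economy 128x
for g in (0.04, 0.16, 0.25):
    print(f"{g:.0%} growth: {years_for_series(g):.0f} years per series")
# ~4% growth: ~124 years (roughly the series running from 1903);
# 16% growth: ~33 years; 25% growth: ~22 years for the next series.
```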

For what Walheed envisions to be the case, the new growth would need to be nearly completely virtual, with civilization placing increasing value on non-physical realms. This would ultimately merge into Nick Bostrom's idea of simulated universes run by advanced civilizations. I have difficulty seeing how, whether for research, fun or other reasons, that becomes the dominant use of resources or the economic focus of an advanced civilization.

William Randolph Hearst was one of the richest men in 1903. He had 28 newspapers read by 20 million people (though he did not reach that level until 1925). He also had big property: Hearst Castle, though he did not own that land until 1919 (construction ran from 1919 to 1947). A more modest modern approximation would still be a multi-millionaire: someone with a bunch of popular blogs or websites reaching 5 million readers (lower than the 1925 figure, closer to the 1903 level). Instapundit gets 250,000 page views per day, about 7 million per month. At a guessed $5-10 CPM, that is $35,000-70,000 per month. However, the staff needed to achieve that went way down. Nice big modern homes (30,000+ square feet) are still very expensive. It would be easier and cheaper (but still expensive) to make a modern approximation of royalty from an earlier period. Certain things become cheaper and easier to make with better technology, but there is still economic value generated, and the physical world retains a large share of the value.
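The ad-revenue figure above is straightforward CPM arithmetic (the CPM rates are my guesses, not measured data):

```python
# Revenue = (page views / 1000) * CPM, i.e. dollars per thousand page views.
monthly_page_views = 7_000_000        # Instapundit's rough monthly traffic
for cpm in (5, 10):                   # guessed $5 and $10 CPM
    revenue = monthly_page_views / 1000 * cpm
    print(f"${cpm} CPM -> ${revenue:,.0f} per month")
# $5 CPM -> $35,000 per month; $10 CPM -> $70,000 per month
```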

Qral's comments:
I wonder how long "super-growth" might last before it runs into Earth's thermal limits and has to majorly go off-planet? But for how long will one system be enough? In just 31 doublings after industry off-worlds we will match the Sun's energy output. That means Matrioshka Brain levels will appear within ~ 660 years. And then?

Here I take the middle view between Qral and Walheed. I think there will be some decoupling (perhaps an increasing multiple) of virtual versus physical. In 1900, global energy consumption equaled 0.7 TW (0.7 × 10**12 watts). Now GDP is up over 100 times while we use about 15 TW, roughly 21 times more energy. So there was an increase of roughly five to six times in the efficiency of energy usage relative to GDP. Plus I think there was some decoupling of GDP from resources, driven by things like information technology and other less resource-intense industries. In the future, to maintain hypergrowth, we might need more virtual industries.
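The decoupling arithmetic, spelled out (the 100x GDP multiplier is the rough estimate used above, not a precise figure):

```python
# GDP-per-energy efficiency gain since 1900, using the round numbers above.
energy_1900_tw = 0.7
energy_now_tw = 15.0
gdp_multiplier = 100.0

energy_multiplier = energy_now_tw / energy_1900_tw        # ~21x more energy
efficiency_gain = gdp_multiplier / energy_multiplier      # GDP per unit energy
print(f"energy use grew ~{energy_multiplier:.0f}x")
print(f"GDP per unit of energy grew ~{efficiency_gain:.1f}x")   # ~4.7x
```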

Getting more GDP from fewer resources can go beyond just pushing energy efficiency to its limits.

Kardashev Type 1, using all the energy available on the planet, is about 100,000 times more than current usage. That is roughly 17 doublings, or between two and three of the big seven-doubling stages; if the stages are coming faster each time, this would happen very quickly. We would definitely need to go off-world, but with fusion and nanotech that is not a problem.

Then the solar system: 10**26 watts of power, about 1 billion times more, which is roughly 30 doublings or 4-5 major doubling cycles (a major doubling cycle has seven doublings). So if things are accelerating, it happens even faster.
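The cycle counts for these energy milestones follow directly from the power ratios (the ratios are the round numbers used above, not precise astronomy):

```python
import math

# Doublings and seven-doubling cycles implied by each power-ratio jump.
steps = [
    ("today -> Kardashev 1 (all planetary energy)", 1e5),   # "100,000 times"
    ("Kardashev 1 -> full solar system (10**26 W)", 1e9),   # "1 billion times"
]
for name, ratio in steps:
    doublings = math.log2(ratio)
    print(f"{name}: {doublings:.0f} doublings = "
          f"{doublings / 7:.1f} seven-doubling cycles")
# ~17 doublings (2.4 cycles), then ~30 doublings (4.3 cycles)
```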

It seems we will either have to decouple economic growth from energy growth a lot more or use super tech to tap more power than solar.

With the advanced technology at the disposal of our projected hypergrowth civilization, I believe we can achieve interplanetary and interstellar growth, develop the oceans and deserts, and maximize utilization of all of Earth's resources.

I have reviewed many superior launch and propulsion systems which I believe can be developed with near-term technology improvements.

Fusion, fission and laser photonic propulsion will open up space.

Mirrored laser arrays would allow light sails to be efficiently accelerated to near the speed of light.