October 20, 2007

Myostatin drugs may give up to four times the muscle growth of steroids without the harmful side effects

The drug ACE-031 has been found to reproduce the enhanced muscle growth seen when the myostatin gene is silenced. Genetic manipulation in mice can produce four times the normal muscle growth. The drug mimics the effect of that gene therapy and appears to act as a super-steroid, but without the harmful effects, because it works through a different mechanism than steroids.


The dog in the photo is supermuscular because of naturally occurring mutations that silence both versions of the myostatin gene. Called bully whippets, these dogs are rarely champion racers. However, animals with one mutated and one normal version of the gene are more muscular than typical animals and are among the breed’s fastest racers. Credit: Stuart Isett, Polaris








Mice given the new drug show a 30 to 60 percent increase in muscle mass, and mice with a version of muscular dystrophy show increased grip strength, a standard measure of rodent strength. Preliminary results from primate studies show that the animals on the drug bulk up at similar rates to those seen in rodents. "Before I became involved with Acceleron, if someone had told me you could increase muscle mass by up to 60 percent in a month, I never would have believed it," says CEO John Knopf.

While it's not yet clear if similar rates will be seen in humans, high doses of anabolic steroids, which carry serious side effects, increase muscle mass by a maximum of 15 to 20 percent. And because myostatin is found only in muscle, knocking it out does not appear to have the adverse effects of broader-acting steroids.

Says Evans, "I think these drugs, perhaps used in combo with exercise, might have great potential in reversing the trend toward increasing obesity and decreasing muscle mass."


In earlier studies where the relevant genes were silenced, the aging of muscles was effectively stopped. This did not increase life expectancy, but it would spare the elderly much of the serious loss of muscle that normally comes with age.

In those experiments, adult mice received injections of the gene into their muscles. The inoculations prevented muscle deterioration in mice as old as 2 years -- 80 years in human terms. The shots even regenerated muscle, restoring some of the lost strength and size.

Old mice regained 27 percent of muscle lost to age; younger mice experienced a 15 percent increase, Sweeney reported. "You build muscle mass and strength even without exercise," he says.


Multi-walled carbon nanotubes increase light emission of polymer 100-fold

Researchers at the Advanced Technology Institute of the University of Surrey, in collaboration with researchers from China and the USA, have demonstrated a 100-fold increase in the light emission from a nylon polymer sample, by incorporating multi-walled carbon nanotubes (MWCNT).

Previously, adding carbon nanotubes (CNT) reduced the light emission from such composites, due to quenching of charge carriers at the nanotubes, which are generally metallic in the multi-walled case. This quenching reduces the emission efficiency of the devices.

This increase in light emission only occurred when the MWCNT were acid treated prior to inclusion in the polymer. The researchers propose that the increase is due to a novel energy-transfer mechanism, from the acid-damaged surface of the MWCNT to the emitting sites in the polymer. In addition to the enhanced light emission, the study also demonstrates that the MWCNT improved the stability of the polymer against light-induced degradation.

Dr. Simon Henley, one of the lead investigators, comments: "These results show that carbon nanotubes have enormous potential as a versatile material in future optoelectronic devices, and raise the prospect of utilising MWCNTs to harvest solar radiation in organic solar cells, in addition to improving device stability."

Professor Ravi Silva, Director of the Advanced Technology Institute, states: "The mere fact that now we can have a predictable organic-nanotube hybrid composite with enhanced properties should open the door for many new applications. The enhancement in the luminescence properties bodes well for a new generation of organic devices that could potentially reach commercially viable figures of merit for large-scale production."

October 19, 2007

Boston Consulting Group's global millionaire count

Global wealth grew by 7.5 percent in 2006 to reach $97.9 trillion, measured in local currencies.

The number of millionaire households grew by 14.0 percent in 2006, to 9.6 million. These were the richest 0.7 percent of households, and they owned $33.2 trillion—or about one-third—of global wealth. North America was home to nearly half of all millionaire households, Europe had about one-quarter, and Asia-Pacific accounted for about one-fifth.

Wealth remained concentrated in certain regions. North America (the United States and Canada) and Europe again had the deepest pools of wealth, at $36.2 trillion and $33.0 trillion, respectively.


Businessweek had some more information from this report.

The U.S. had, by far, the highest number of millionaire households, with nearly 4.6 million, and the highest number of $100 million-plus households, with 2,300.

Tapping Human Assets to Sustain Growth: Global Wealth 2007, BCG's seventh annual global-wealth report, is based on a comprehensive market study of wealth in 62 countries (representing more than 96 percent of global GDP) and a benchmarking survey of 111 wealth managers that oversaw almost $10 trillion in client assets and liabilities.

Boston Consulting Group's 2006 report on millionaires, with 2005 data

The number of millionaire households, measured in U.S. dollars, reached 7.2 million. They owned 28.6 percent of global wealth. Nearly 41 percent of all millionaire households—almost 3 million—were in the United States. Japan had the second-largest number of such households (825,000), followed by the United Kingdom (440,000). China, with 250,000 millionaire households, placed sixth.


The Top Five Countries With the Most Millionaire Households

No. 1: U.S.

Total millionaire households: 4,585,000
Change in 2005-06: +10%
Growth rank: 13 (of 15)

Total population: 301,139,947

Total $100 million+ households: 2,300 (rank: 1)
Change in 2005-06: +7%
Growth rank: 14

No. 2: Japan

Total millionaire households: 830,000
Change in 2005-06: +7%
Growth rank: 15

Total population: 127,433,494

Total $100 million+ households: 1,300 (rank: 2)
Change in 2005-06: +6%
Growth rank: 15

No. 3: Britain

Total millionaire households: 610,000
Change in 2005-06: +30.5%
Growth rank: 3

Total population: 60,776,238

Total $100 million+ households: 810 (rank: 3)
Change in 2005-06: +25%
Growth rank: 4

No. 4: Germany

Total millionaire households: 350,000
Change in 2005-06: +21%
Growth rank: 10

Total population: 82,400,996

Total $100 million+ households: 620 (rank: 4)
Change in 2005-06: +18%
Growth rank: 10

No. 5: China

Total millionaire households: 310,000
Change in 2005-06: +39%
Growth rank: 1

Total population: 1,321,851,888

Total $100 million+ households: 180 (rank: 13)
Change in 2005-06: +74%
Growth rank: 1

FPGAs can accelerate repetitive computer operations up to 1000 times

FPGA processing accelerators have been developed. "Warp processing" gives a computer chip the ability to improve its performance over time.

Here’s how Warp processing works: When a program first runs on a microprocessor chip (such as a Pentium), the chip monitors the program to detect its most frequently-executed parts. The microprocessor then automatically tries to move those parts to a special kind of chip called a field-programmable gate array, or FPGA. “An FPGA can execute some (but not all) programs much faster than a microprocessor – 10 times, 100 times, even 1,000 times faster,” explains Vahid.

“If the microprocessor finds that the FPGA is faster for the program part, it automatically moves that part to the FPGA, causing the program execution to ‘warp.’” By performing optimizations at runtime, Warp processors also eliminate tool flow restrictions, as well as the extra designer effort associated with traditional compile-time optimizations.

FPGAs can benefit a wide range of applications, including video and audio processing; encryption and decryption; encoding; compression and decompression; bioinformatics – anything that is compute-intensive and operates on large streams of data. Consumers who want to enhance their photos using Photoshop or edit videos on their desktop computers will find that Warp processing speeds up their systems, while gamers will immediately notice the difference in better graphics and performance. Additionally, embedded systems such as medical instruments or airport security scanners can perform real-time recognition using Warp-enhanced FPGAs.


This new method only uses the FPGA when it detects that performance gains are being made. A computer with FPGA warp processing will adapt to each individual's specific workload, so it can have a wide impact while requiring no effort on the part of the user.
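The control loop behind warp processing is simple to sketch in software. The toy Python below is my own illustration, not Vahid's implementation (the real system synthesizes hardware circuits at runtime, and the function names here are hypothetical): it profiles calls and transparently swaps in a faster implementation once a kernel becomes "hot".

```python
# Toy illustration of the warp-processing idea: profile at runtime, then
# transparently migrate hot code paths onto a faster implementation.
# The "FPGA version" here is just a stand-in fast function.
import functools

HOT_THRESHOLD = 1000  # calls before a kernel is considered "hot"

def warp(fpga_version):
    """Decorator: count calls; once hot, execute the accelerated version."""
    def decorator(cpu_version):
        calls = {"n": 0}
        @functools.wraps(cpu_version)
        def wrapper(*args, **kwargs):
            calls["n"] += 1
            if calls["n"] > HOT_THRESHOLD:
                return fpga_version(*args, **kwargs)   # "warped" path
            return cpu_version(*args, **kwargs)        # normal path
        return wrapper
    return decorator

def fir_filter_fast(signal, taps):
    # Stand-in for the FPGA circuit: same result, much faster in hardware.
    return [sum(t * s for t, s in zip(taps, signal[i:i + len(taps)]))
            for i in range(len(signal) - len(taps) + 1)]

@warp(fir_filter_fast)
def fir_filter(signal, taps):
    # Frequently executed inner loop that the runtime profiler would flag.
    return [sum(t * s for t, s in zip(taps, signal[i:i + len(taps)]))
            for i in range(len(signal) - len(taps) + 1)]
```

The key property, mirrored in the sketch, is that the program's author never changes anything: the migration decision is made by the runtime based on observed behavior.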

Mountaintop Kitegen could tap winds with 24 times the power of conventional turbine heights

The Kitegen system could radically improve the outlook for wind power. Winds are stronger and steadier at 800 meters and up, and building on mountains and hills would provide access to winds with up to 24 times the power available at the 80-meter altitudes of current turbines. The Kitegen system would have to be adapted to pull energy from gale-force mountain winds. I believe the Kitegen design is suitable for using many kites to pull a large ring-shaped turbine.

Average wind speeds in the Himalayas are 75 mph (33 m/s) and can reach more than 100 mph (45 m/s). The windiest place in the world is Mount Washington (1,918 m) in New Hampshire, USA, where a surface wind speed of 231 mph was recorded on April 12, 1934.

Mountains cover 54% of Asia, 36% of North America, 25% of Europe, 22% of South America, 17% of Australia, and 3% of Africa. As a whole, 24% of the Earth's land mass is mountainous. The Himalayas average 5 km above sea level, while the Andes average 4 km. Most other mountain ranges average 2 – 2.5 km.

Kitegen has wind data showing that at 5,600 meters of altitude there are many locations with 15-20 m/s winds, and at 10,500 meters there are 35-45 m/s winds. Accessing those winds with Kitegen systems on mountains would increase wind power by 3 to 6 times over the four-fold improvement that 800-meter winds already have over 80-meter winds. So 10,500-meter winds carry up to 24 times the power of 80-meter winds.


A large kitegen carousel system. The track could run around the top of a mountain like a necklace.


Blue Ridge mountains in Virginia

Conventional New Wind Energy Forecast
I believe that nuclear power can be scaled up faster than wind. However, wind power is doing quite well, and it is and will remain significant.

There were 73,904 MW of total installations as of 2006, with 90,000 MW expected by the end of this year (2007). This 2007 total is equal to about 30 GW of nuclear capacity once capacity factors are taken into account. Germany has over 18,000 turbines with an average size of a little over 1 MW; the latest installations are 5 MW and 6 MW units. By 2010, the World Wind Energy Association expects 160 GW of capacity to be installed worldwide.

The 28 GW annual build rate at the end of 2010, carried forward, would add 280 GW from 2011-2020. After 2010 the size of new wind turbines will be 7.5-10 MW and probably still growing.


Wind power capacity, past and as predicted by industry

http://en.wikipedia.org/wiki/Wind_power

MW of installed capacity

Rank  Country          2005    2006   Latest
 1    Germany         18,415  20,622  21,283
 2    Spain           10,028  11,615  12,801
 3    United States    9,149  11,603  12,634
 4    India            4,430   6,270   7,231
 5    Denmark          3,136   3,140     n/a
 6    China            1,260   2,604   2,956
 7    Italy            1,718   2,123     n/a
 8    United Kingdom   1,332   1,963   2,191
 9    Portugal         1,022   1,716   1,874
10    Canada             683   1,459   1,670
11    France             757   1,567     n/a
12    Netherlands      1,219   1,560     n/a
13    Japan            1,061   1,394     n/a
14    Austria            819     965     n/a
15    Australia          708     817     n/a


FURTHER READING
My previous article on Kitegen

My first article on Kitegen

October 18, 2007

Progress on uranium from coal fly ash waste

Atomic Insights reports on progress towards extracting uranium from coal fly ash waste.

Spartan Resources of Canada has issued press releases discussing a project in Hungary, and another announcing a successful test of samples from an ash pile located in central Yunnan Province, China.

The Yunnan Province ash pile being evaluated contains about 5.3 million tons of ash with a uranium concentration of 160-180 parts per million; the total quantity of uranium in the pile is put at about 2,085 tons. According to the UIC ISL article, normal recovery from an ISL deposit ranges between 60-80%, so the amount of uranium that might be recovered is about 1,250-1,700 tons.

It takes about 200 tons of natural uranium to power a 1000 MWe reactor for a year, so the ash pile mine could supply between 6-8 reactor years of fuel. The current price listed at UXC is about $78 per pound. Even at that price, the uranium from a single ash pile might be worth as much as $250 million. Not bad for something considered to be at best a nuisance and at worst an environmental contaminant.
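The valuation arithmetic is easy to check against the article's own figures (treating tons as metric tonnes, my assumption):

```python
# Sanity check on the ash-pile valuation using the article's figures.
uranium_in_pile_tons = 2085          # total uranium in the pile
recovery_range = (0.60, 0.80)        # typical ISL recovery range
price_per_lb = 78.0                  # UXC spot price, USD per pound
lb_per_metric_ton = 2204.6

for r in recovery_range:
    recovered_tons = uranium_in_pile_tons * r
    value = recovered_tons * lb_per_metric_ton * price_per_lb
    print(f"{r:.0%} recovery: {recovered_tons:,.0f} tons, ~${value / 1e6:,.0f} million")
# ~1,250-1,670 tons recovered, worth roughly $215-290 million --
# in line with the article's "as much as $250 million".
```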

Carnival of Space Week 25

Check out the well-written Carnival of Space #25 at sortingoutscience.net

My contribution was preview coverage of the space elevator games

The interim report on space based solar power is out and sortingoutscience lists many links to sites that are talking about it.
I had made several contributions at the space based solar power website.

The Japanese lunar probe Selene mission is presented in depth, again with many extra links provided by sortingoutscience.

The analysis of methods for deflecting asteroids is discussed, along with the determination that satellites with inflatable mirrors are the best choice for most situations. I had put this information out on the Lifeboat Foundation blog.

Colony Worlds discusses how exploration and emergency are not the right reasons to go to space, but that economic reasons are. I mostly agree, and that is why the near-term focus should be on reducing costs and building up infrastructure in space using robotics and low-energy orbital transfers.

The Carnival has a lot of other great stuff on Mars, Saturn, Jupiter and more. Check it out.

October 17, 2007

Pseudo common sense for computers

Using a little-known Google Labs widget, computer scientists from UC San Diego and UCLA have brought common sense to an automated image labeling system. This common sense is the ability to use context to help identify objects in photographs.

For example, if a conventional automated object identifier has labeled a person, a tennis racket, a tennis court and a lemon in a photo, the new post-processing context check will re-label the lemon as a tennis ball.



Google Sets generates lists of related items or objects from just a few examples. If you type in John, Paul and George, it will return Ringo, Beatles and John Lennon. If you type “neon” and “argon” it will give you the rest of the noble gases.

“In some ways, Google Sets is a proxy for common sense. In our paper, we showed that you can use this common sense to provide contextual information that improves the accuracy of automated image labeling systems,” said Belongie.
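To make the idea concrete, here is a toy sketch of context-based relabeling. This is my illustration, not the researchers' code: the RELATED set stands in for what a Google Sets query might return, and the candidate list stands in for a classifier's runner-up labels.

```python
# Toy sketch of context-based relabeling. RELATED stands in for Google
# Sets output; a real system would query the service and use the image
# classifier's ranked label candidates instead of a fixed swap table.

RELATED = {
    "tennis": {"person", "tennis racket", "tennis court", "tennis ball"},
}

def context_score(other_labels, context):
    """Fraction of the other labels in the image that fit the context set."""
    return sum(o in context for o in other_labels) / max(len(other_labels), 1)

def relabel(labels, candidates, context):
    """Replace each out-of-context label with a related in-context candidate."""
    result = []
    for i, label in enumerate(labels):
        others = labels[:i] + labels[i + 1:]
        if label not in context and context_score(others, context) > 0.5:
            # Most of the scene fits the context; pick a candidate that fits too.
            fitting = [c for c in candidates.get(label, []) if c in context]
            if fitting:
                label = fitting[0]
        result.append(label)
    return result

labels = ["person", "tennis racket", "tennis court", "lemon"]
print(relabel(labels, {"lemon": ["tennis ball"]}, RELATED["tennis"]))
# -> ['person', 'tennis racket', 'tennis court', 'tennis ball']
```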

The computer scientists also highlight other advances they bring to automated object identification. First, instead of doing just one image segmentation, the researchers generated a collection of image segmentations and put together a shortlist of stable image segmentations. This increases the accuracy of the segmentation process and provides an implicit shape description for each of the image regions.

Second, the researchers ran their object categorization model on each of the segmentations, rather than on individual pixels. This dramatically reduced the computational demands on the object categorization model.

Right now, the researchers are exploring ways to extend context beyond the presence of objects in the same image. For example, they want to make explicit use of absolute and relative geometric relationships between objects in an image – such as “above” or “inside” relationships. This would mean that if a person were sitting on top of an animal, the system would consider the animal to be more likely a horse than a dog.


FURTHER INVESTIGATION
Google Sets widget

Other Google Labs widgets

Sorting DNA, cells and molecules thousands of times faster than conventional methods

University of Rochester researchers have patented the device, which they hope will make tests such as identifying proteins in a tiny sample of blood as simple as placing a drop on a handheld device.

Laboratories and hospitals all over the world use similar, albeit cumbersome, hours-long processes in efforts to identify everything from DNA fragments to pathogens. King and Thomas Jones, professor of electrical and computer engineering, induce an electrical field around the droplet to be analyzed, and in one-tenth of a second the droplet elongates along an electrode into an electrified, liquid string. As the fluid is stretched, the electrical field separates the molecules laterally along the edges of the long droplet. Stretching the droplet along a specially prepared detector can lay down one set of molecules directly onto the detector, making their recognition highly efficient.

King and Jones found that a micro-liter of fluid or less is enough for the process to work with great efficiency. The most common method of separating proteins, called gel electrophoresis, requires more liquid and can take several hours.

The frequency of an electric field can be tuned to send one subset of particles in one direction, and another set of particles in the reverse direction based on the way they behave in an electric field. This is called the dielectrophoretic force.
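For reference, the standard textbook expression for the time-averaged dielectrophoretic force on a small sphere (a general result, not specific to this device) is:

$$ \langle F_{\mathrm{DEP}} \rangle = 2\pi\,\varepsilon_m\, r^{3}\,\mathrm{Re}[K(\omega)]\,\nabla \lvert E_{\mathrm{rms}} \rvert^{2}, \qquad K(\omega) = \frac{\varepsilon_p^{*} - \varepsilon_m^{*}}{\varepsilon_p^{*} + 2\varepsilon_m^{*}} $$

where r is the particle radius and ε_p* and ε_m* are the complex permittivities of the particle and medium. The sign of Re[K(ω)] flips with frequency, which is why tuning the field frequency can send different particle types in opposite directions.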

The team is now looking into building electrodes with integrated particle detectors, and using fluorescence-marked proteins to see if they can increase the speed and accuracy of the process further yet.

Progress toward nerve and paralysis repair

University of Manchester researchers have transformed fat tissue stem cells into nerve cells — and now plan to develop an artificial nerve that will bring damaged limbs and organs back to life.

Dr Paul Kingham and his team at the UK Centre for Tissue Regeneration (UKCTR) isolated the stem cells from the fat tissue of adult animals and differentiated them into nerve cells to be used for repair and regeneration of injured nerves. They are now about to start a trial extracting stem cells from fat tissue of volunteer adult patients, in order to compare in the laboratory human and animal stem cells.

Following that, they will develop an artificial nerve constructed from a biodegradable polymer to transplant the differentiated stem cells. The biomaterial will be rolled up into a tube-like structure and inserted between the two ends of the cut nerve so that the regrowing nerve fibre can go through it from one end to the other.

This 'bionic' nerve could also be used in people who have suffered trauma injuries to their limbs or organs, cancer patients whose tumour surgery has affected a nearby nerve trunk and people who have had organ transplants.

With a clinical trial on the biomaterial about to be completed, the researchers hope the treatment could be ready for use in four or five years.

"The frequency of nerve injury is one in every 1,000 of the population — or 50,000 cases in the UK — every year.

"The current repair method — a patient donating their own nerve graft to span the gap at the injury site — is far from optimal because of the poor functional outcome, the extra damage and the possibility of forming scars and tumours at the donor site. Tissue engineering using a combination of biomaterials and cell-based therapies, while at an early stage, promises a great improvement on that. Artificial nerve guides provide mechanical support, protect the re-growing nerve and contain growth factor and molecules favourable to regeneration. The patient will not be able to tell that they had ever 'lost' their limb and will be able carry on exactly as they did before."

New silicon nanowire could power nanoscale devices

Charles Lieber and colleagues at Harvard University describe a silicon nanowire they devised that can convert light into electrical energy. Virtually invisible to the naked eye, a single strand can crank out up to 200 picowatts. The nanowire is not made of metal but of silicon, with three different types of conductivity arranged as layered shells.

Incoming light generates electrons in the outer shell, which are then swept into the second layer and the inner core along micropores.


The nanowire, which resembles a minuscule coaxial cable, is made of layers of silicon (Image: Nature)

Although the proof-of-concept device only converts about 3% of light into electricity, Lieber says it "allows us to study a fundamentally different geometry for photovoltaic cells, which may be attractive for improving the efficiency."


He also believes it may be possible to boost the nanowire's efficiency by getting rid of defects in the crystal. "Our goal is to get in the 15% [efficiency] range," Lieber says.

Lieber's new nanowire functions as a complete solar cell. At its core is a rod-shaped crystal of silicon, about 100 nanometres across, doped with boron. Layers of polycrystalline silicon are added to wrap it in a 50-nm-thick layer of undoped silicon and a 50-nm-thick outer coating of silicon doped with phosphorus.
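To put the 200-picowatt figure in context, here is a rough scaling estimate (my own arithmetic, not from the paper):

```python
# Rough scaling estimate using the 200 pW-per-wire figure from the article.
per_wire_watts = 200e-12
for target_watts, desc in [(1e-9, "1 nW nanosensor"), (1e-6, "1 uW nanodevice")]:
    print(f"{desc}: ~{target_watts / per_wire_watts:,.0f} wires")
# ~5 wires for a nanowatt-class sensor, ~5,000 for a microwatt device --
# small arrays of nanowires would suffice for nanoscale electronics.
```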


UPDATE:
IEEE Spectrum discusses the new nanowires

Harry Atwater, a physicist at Caltech, called the Harvard research “an important first experimental step forward.” Atwater recently wrote a theoretical paper that suggested it may be possible to get the efficiency of such a nanowire above the 20 to 25 percent seen in highly ordered crystalline silicon. Lieber sees no reason that the efficiency can’t be improved to at least 10 or 15 percent. At that point, he says, the lower costs that his production process entails might make large arrays of nanowires competitive with macroscale solar cells.


MIT Technology review also has coverage on the solar nanowires

Since the materials are thin, the chances of an electron being trapped by a defect before escaping from one layer to the next are low, so it's possible to use cheaper materials with more defects.

Lieber has tested only small numbers of nanowire solar cells. For large-scale applications, the nanowires would need to be chemically grown in dense arrays. Atwater and Lewis recently took steps in this direction, publishing in the past month two papers in which they describe growing dense arrays of microscopic wires, but wires without the multiple layers that Lieber's have. Paired with a liquid electrolyte, the wires generated electricity from light. Since it may prove easier to manufacture solid-state solar cells such as Lieber's, however, Lewis and Atwater are working to produce arrays of wires with multiple layers.

Even with the potential advantage of cheaper materials, wire-based solar cells would probably need to be about 10 percent efficient if they were to compete with existing technology. The researchers' next steps include finding ways to make more dense arrays of wires to absorb more light and, in Lieber's case, to find ways to generate increased voltage from nanowire solar cells.

Malaria vaccine safe and 65% effective for babies, the most at-risk group

Known by its lab name of RTS,S, the prototype is raising high hopes of the first vaccine shield against a disease that claims more than a million lives a year -- 800,000 of them African children aged under five -- and sickens hundreds of millions more. Infants who received RTS,S were 65 percent less at risk of contracting malaria than their control counterparts.

The investigators add a small caveat about the trial, pointing out that the homes of all the babies who took part in the test were provided with free insecticide-treated bednets and were sprayed twice with insecticide.

"The future use and deployment of a malaria vaccine should be seen in the context of comprehensive malaria control programmes," they caution.

If further evaluations give the green light, "a Phase III trial, involving 16,000 children in seven African countries, could start in 2008," MVI's Christian Loucq told AFP.

"If all goes well, the vaccine would be submitted for approval by the European health authorities in 2011."

Sony PS3 cluster supercomputers

A 68-page PDF on using Sony PS3s for scientific computing

For a matrix of size 2K x 2K they achieved 11.05 Gflop/s, which is around 75% of the double-precision peak. They also implemented a single-precision version of the code, which achieved 155 Gflop/s (again around 75% efficiency) for a matrix of size 4K x 4K. Unfortunately, a single-precision algorithm does not legitimately implement the Linpack benchmark. The authors' initial implementation of the mixed-precision Linpack benchmark [21] placed the CELL processor on the Linpack Report [22] with performance close to 100 Gflop/s.
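The mixed-precision approach referenced here is iterative refinement: do the expensive O(n³) factorization in fast single precision, then recover double-precision accuracy with cheap O(n²) residual corrections. A minimal numpy sketch of the general idea (an illustration only; the Cell implementation uses hand-tuned SPE kernels, and production code would factor the matrix once and reuse the factors):

```python
# Mixed-precision iterative refinement: single-precision solve,
# double-precision residual corrections.
import numpy as np

def mixed_precision_solve(A, b, iters=5):
    A32 = A.astype(np.float32)                       # fast, low-precision copy
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                # residual in double precision
        dx = np.linalg.solve(A32, r.astype(np.float32))
        x += dx.astype(np.float64)                   # cheap correction step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((500, 500))
b = rng.standard_normal(500)
x = mixed_precision_solve(A, b)
print(np.linalg.norm(A @ x - b))   # residual near double-precision accuracy
```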

One way of looking at the CELL processor is to treat it as eight digital signal processors (DSP), augmented with a control processor, on a single chip.

One of the major shortcomings of the current CELL processor for numerical applications is the relatively slow speed of its double-precision arithmetic. The next generation of the CELL processor is going to include a fully pipelined double-precision unit, which will deliver 12.8 Gflop/s from a single SPE clocked at 3.2 GHz, and 102.4 Gflop/s from an 8-SPE system, which will make the chip a very hard competitor in the world of scientific and engineering computing. Given that the current CELL processor uses a rather modest 234 million transistors, it is not hard to envision a CELL processor with more than one PPE and many more SPEs, perhaps reaching a performance of a teraflop/s on a single chip.
The Cell2 is expected in 2008 and will initially be used in the Roadrunner supercomputer.

A cluster of eight PS3s has been linked together for astrophysics calculations.

This PS3 cluster has been reviewed at Wired magazine


The eight PS3s probably reach a combined 500-800 gigaflops of performance for $3,200.
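That range is consistent with a back-of-envelope peak estimate. The assumptions below are mine, not from the article: 6 SPEs usable under Linux per PS3, 25.6 single-precision Gflop/s per SPE at 3.2 GHz, and 50-65% of peak actually realized.

```python
# Back-of-envelope check on the PS3 cluster performance estimate.
spes_per_ps3 = 6           # SPEs available to Linux on a PS3 (assumption)
gflops_per_spe = 25.6      # single-precision peak per SPE at 3.2 GHz
peak_per_console = spes_per_ps3 * gflops_per_spe   # ~153.6 Gflop/s

for efficiency in (0.50, 0.65):
    total = 8 * peak_per_console * efficiency
    print(f"8 consoles at {efficiency:.0%} of peak: {total:,.0f} Gflop/s")
# ~614-799 Gflop/s, in line with the 500-800 gigaflop estimate above.
```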

Kitegen follow-up

According to Kitegen's preliminary evaluations, wind generators of this type are expected to have much lower electric energy production costs than current wind farms (by a factor of up to 10-20) and could generate up to 250 MW/km², vs. 3 MW/km² for wind farms.





With a kite area of 50 m², simulations give about 200 kW of power generated at 12 m/s wind speed. A windmill of the same power is 40 m high, weighs about 62 t and costs about 900,000 euros. The expected KiteGen weight and cost are about 8 t and 60,000 euros respectively.







It is expected that a wind generator of this type will have a territory occupation much lower than a wind farm of the same power (by a factor of up to 50-100) and much lower electric energy production costs (by a factor of up to 10-20). In the first step of the KiteGen project, a small-scale prototype was built to show the capability of controlling the flight of a single kite, by pulling the two lines which hold it, in such a way as to extract a significant amount of energy.

The first tests performed on the prototype in the yo-yo configuration show good agreement between simulations and experimental results for the generated power.

A single 500 m² kite with 12 m/s nominal wind speed and an aerodynamic efficiency (lift-to-drag ratio, CL/CD) of 12 would be able to generate 10 MW of mean power. 100 such kites towing a 1,500 m radius carousel would generate 1,000 MW of mean power with about 7-8 km² of land occupation, and with an estimated energy production cost ten times lower than that of fossil fuel thermal plants. Note that a wind farm producing the same mean power, using present windmill technology, would occupy about 250-300 km² and have an energy production cost 40-50% higher than thermal plants.
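These figures line up with Loyd's classic crosswind kite power estimate (a standard result from the kite-power literature, not taken from the KiteGen material):

$$ P \approx \frac{2}{27}\,\rho\, A\, v^{3}\, C_L \left(\frac{C_L}{C_D}\right)^{2} $$

With ρ ≈ 1.2 kg/m³, A = 500 m², v = 12 m/s, C_L ≈ 1 and C_L/C_D = 12, this gives roughly 11 MW, consistent with the 10 MW mean power quoted above.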

100 MW Kite Gen power plants, illustrated at this link, are estimated to deliver energy at a cost below 0.03 euros per kWh.


Kitegen control unit, for controlling figure eight flight of airfoil


Kitegen airfoil flight patterns


Proposed carousel of airfoils for a 100 megawatt class kitegen generator


Newer rendering of a large-scale kitegen generator. A generator ring is pulled while the support structure stays in place


Chart of average wind speeds at different altitudes. 800-1,200 meters is a sweet spot, with less challenging altitude but strong winds. At 800 meters, average wind speed is 7.2 m/s, generating 205 watts per m². This is four times the power that current 5-6 MW windmills can get at 80 meters height, with 4.6 m/s average wind and 58 watts per m². Every point on the Earth's surface has, on average, enough wind power 800 m above it to be exploited with a Kite Gen power plant.
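The relation behind those numbers is the cube law for wind power density:

$$ \frac{P}{A} = \frac{1}{2}\,\rho\, v^{3} $$

With ρ ≈ 1.2 kg/m³, v = 4.6 m/s gives about 58 W/m², and v = 7.2 m/s gives about 224 W/m² (close to the 205 W/m² quoted; the gap suggests a slightly lower assumed air density). The cubic dependence is why modest gains in wind speed with altitude translate into several-fold gains in power.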

FURTHER READING
10 MW superconducting wind generators are being developed over the next 30 months. The Italian Kitegen pilot plant should also be completed by the end of 2008.

My previous article about kitegen

Another 1 gigawatt wind generator proposal

Kitegen still needs more funding, although it has had some agreement with a utility to help cover costs.

October 16, 2007

Space Elevator 2010 contest preview

The Space Elevator Games 2010 competition is being covered at the Space Elevator Blog. The competition will run October 19-21, 2007, and has had some bad-weather issues. There is a space elevator climber competition and a tether competition.

Five of the space elevator climber competitors are from Canada





UPDATE: More live coverage of the space elevator games at spaceelevator.com

All teams qualified based on indoor qualification runs yesterday.
Today, outdoors on the 100-foot track, the Kansas City Space Pirates, Technology Tycoons, UBC-Snowstar and USST (University of Saskatchewan) qualified. Team ETC, LaserMotive, McGill, and Centaurus did not advance past final qualification. October 19-21 (Friday through Sunday) is the official competition.

Climbs are scheduled to take place every hour, on the hour, from 11-6 Friday and 10-6 Saturday and Sunday, weather permitting. The four non-qualifying teams will also get a chance to climb the 400′ tether if slots are available, but preference will be given to the teams competing for prize money. The tether competitors are arriving, and that competition also runs Friday through Sunday.



The Kansas City Space Pirates set up reflectors to power their climber

Clayton Ruszowski, the president of the University of Saskatchewan Space Team (USST), said he and his Saskatoon-based team of 20 to 30 undergraduate engineering students have been working for about 10 months on a solar-cell-skinned elevator prototype. USST won first place last year and was 2 seconds from meeting last year's minimum requirement.

A share of the $500,000 space elevator climber prize will be given to teams that can climb the 100 meters at 2 meters per second or better: an elapsed time of 50 seconds or less. Eight teams are expected to compete this year for the climber prize.

The Space Elevator is described in the Frequently Asked Questions section of the elevator2010.org site

-The Space Elevator is a thin ribbon, with a cross-section area roughly half that of a pencil, extending from a ship-borne anchor to a counterweight well beyond geo-synchronous orbit.
-The ribbon is kept taut due to the rotation of the earth (and that of the counterweight around the earth). At its bottom, it pulls up on the anchor with a force of about 20 tons.
-Electric vehicles, called climbers, ascend the ribbon using electricity generated by solar panels and a ground based booster light beam.
-In addition to lifting payloads from earth to orbit, the elevator can also release them directly into lunar-injection or earth-escape trajectories.
-The baseline system weighs about 1500 tons (including counterweight) and can carry up to 15 ton payloads, easily one per day.
-The ribbon is 62,000 miles long, about 3 feet wide, and is thinner than a sheet of paper. It is made out of a carbon nanotube composite material.
-The climbers travel at a steady 200 kilometers per hour (120 MPH), do not undergo accelerations and vibrations, can carry large and fragile payloads, and have no propellant stored onboard.
-Orbital debris is avoided by moving the anchor ship, and the ribbon itself is made resilient to local space debris damage.
-The elevator can be made larger by using itself to carry more ribbon pieces into place. There is no limit on how large a Space Elevator can be!


The rules for the 2007 climber/power-beaming competition are here


Here is the crane that holds a tether to be climbed

The competition provides the race track, in the form of a vertically suspended ribbon, and a power source in the form of an electrical outlet. Competing teams provide complete climber systems, which have to scale the ribbon while carrying some amount of payload, using only power transferred from the ground via beamed power.

The climbers' net weight is limited to between 10 and 25 kg [22-55 lbs], and they must ascend the ribbon at a minimum of 2 m/s [6.6 feet per second]. Climbers are rated according to their speed multiplied by the amount of payload they carried, divided by their net weight. For example, a 15 kg climber carrying 5 kg of payload at 2.5 m/s will have a score of 5 × 2.5 / 15 = 0.83.
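The scoring rule is simple enough to express directly; here it is with the example from the rules:

```python
# Climber scoring rule from the competition: payload times speed,
# divided by the climber's net weight.
def climber_score(payload_kg, speed_m_s, net_weight_kg):
    return payload_kg * speed_m_s / net_weight_kg

print(round(climber_score(5, 2.5, 15), 2))  # 0.83 -- the example from the rules
```

The rule rewards power density: a heavier climber must haul proportionally more payload, or climb proportionally faster, to keep the same score.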

Power is unlimited. It is up to the competitors to build the most power dense machine that they can devise.

The 2007 prize purse, provided by NASA, is now $500,000.


The 2007 $500,000 tether award goes to the team that comes up with the best space elevator ribbon sample, provided it can beat last year's winning ribbon by at least 50%. If an entry can meet the basic performance metric (~4 GPa-cc/g), it will stand a very good chance of winning the $500,000 purse.

Part of USST's space elevator climber
USST, last year's first-place team, shows part of their climber

Other Canadians competing in the elevator contest include a group from the University of British Columbia and private groups from Toronto and Edmonton. There is also a team from Montreal (McGill University)

Kansas City space elevator climber 2007
Kansas City Space Pirates and UBC-Snowstar are using reflected sunlight to power their climbers. Kansas City's climber has met the 1 m/s qualifying speed.

Glass fiber and aluminum hybrid could save maintenance costs and reduce aircraft weight

The U.S. aluminum giant Alcoa, materials-technology company GTM Advanced Structures and scientists at Delft University of Technology in the Netherlands have patented a fiber metal laminate (FML) called CentrAl reinforced aluminum, or CentrAl, for use in aircraft manufacture.

CentrAl provides some 25 percent more tensile strength than high-strength aluminum alloys, is extremely resistant to metal fatigue and is highly damage-tolerant.

"We think you can save 600 to 800 kilograms in a large aircraft (over carbon-fiber composite) -- we estimate that the saving could be around 15 to 20 percent of the weight of the wing," he said. This hasn't been proved, because nobody has yet made a wing using the new material, but it is likely that a wing made using CentrAl would be much easier to repair and maintain than a carbon-fiber composite wing.

CentrAl starts with layers of glass fiber/epoxy sandwiched between layers of aluminum. Between the fiber and the aluminum are layers of a proprietary resin-rich material that its developers call "BondPreg." These layers cause the aluminum to adhere to the glass fiber and also help to spread stress loads evenly throughout the laminate. Thick layers of advanced aluminum, attached strongly to the CentrAl laminate using BondPreg, form the outside of the sandwich.

Toshiba claims to 'validate' nano-imprint litho

Molecular Imprints Inc. (MII) claims that Toshiba Corp. has "validated" the use of its nano-imprint lithography technology in developing 22-nm CMOS devices.

Nanoimprint seems to be a viable plan B in case EUV stumbles.

It's unclear if Toshiba will put nano-imprint tools into its production fabs at 22-nm and beyond. At this node, Toshiba is also exploring other lithography technologies, such as 193-nm immersion and extreme ultraviolet (EUV).

"Toshiba leveraged MII's Imprio 250 system to pattern 18-nm isolated features and 24-nm dense features with <1-nm critical dimension uniformity and <2-nm line edge roughness (LER)," according to MII's paper.

"Defectivity levels as low as <0.3 defects per cm squared were achieved, which are approaching those of immersion lithography," according to MII. "Device overlay results were also within Toshiba's required specifications."


Nanoimprint is at a critical make-or-break stage

Most observers predict no real action this year. Estimates are that only 30 to 50 nanoimprint machines shipped in 2006. In 2007, shipments are widely expected to be below 50 units; some sources estimate that vendors in total will ship only 10 to 20 real tools this year.

The delays and soaring costs for extreme-ultraviolet (EUV) and other next-generation lithography technologies have rekindled an interest in nanoimprint in the IC world, particularly among the NAND flash community. The storage and LED camps are likewise looking at nanoimprint for the development of next-generation recording media and photonics-based LEDs.

Status of thorium reactors and molten salt reactors

Here is a survey of the status of work around the world on thorium nuclear fission reactors and molten salt reactors. Molten salt reactors may be safer than current reactors, could close the nuclear fuel cycle, and in some designs can eliminate the longest-lived nuclear waste. Molten salt reactors potentially eliminate the need for both fuel enrichment and fuel fabrication, both major expenses.

I would bet on the Czech Republic or India to build a thorium reactor first.
Japan or Norway could also get involved early.
Canada or France could help make one for someone else (probably India).

WNA News Briefing, India: Construction of the country's first advanced heavy water reactor (AHWR), using a thorium fuel cycle, will reportedly start during 2007.

India is developing the Advanced Heavy Water reactor (AHWR) as the third stage in its plan to utilise thorium to fuel its overall nuclear power program. The AHWR is a 300 MWe reactor moderated by heavy water at low pressure. The calandria has 500 vertical pressure tubes and the coolant is boiling light water circulated by convection. Each fuel assembly has 30 Th-U-233 oxide pins and 24 Pu-Th oxide pins around a central rod with burnable absorber. Burn-up of 24 GWd/t is envisaged. It is designed to be self-sustaining in relation to U-233 bred from Th-232 and have a low Pu inventory and consumption, with slightly negative void coefficient of reactivity.


The India - US nuclear deal is in trouble

UPDATE: India has shelved the US-India nuclear deal. Parts of the ruling coalition do not want to give up sovereignty and allow inspections of India's nuclear sites

India on Tuesday approved purchase of equipment for implementation of two 700 MW pressurised heavy water reactors each in Rajasthan and Gujarat.

Some of the other advanced reactors in development could also use thorium

High Temperature Reactors (HTRs) can potentially use thorium-based fuels, such as HEU with Th, U-233 with Th, and Pu with Th. Most of the experience with thorium fuels has been in HTRs.

A larger US design, the Gas Turbine - Modular Helium Reactor (GT-MHR), will be built as modules of 285 MWe each directly driving a gas turbine at 48% thermal efficiency.

South Africa's Pebble Bed Modular Reactor (PBMR) is being developed by a consortium led by the utility Eskom, and drawing on German expertise.

The Czechs are working on a molten salt reactor

Canada's Candu reactors can burn thorium and with modifications could burn it far more efficiently

A recent Canadian design for a modified-geometry, two-fluid molten salt reactor

The French have a lot of active research into Molten Salt Reactors

Japanese researchers have published a peer-reviewed plan for shifting to a thorium fuel cycle


FURTHER:
Thorium fuel mix in past Indian nuclear reactors

Another overview of thorium reactors

October 15, 2007

New treatments and blood tests to detect Alzheimer's

Mainstream awareness of Transhumanism and the technological Singularity

Congrats to the World Transhumanist Association for getting a topline feature on the New Scientist website.

Plus a discussion in the LA Times

Ray Kurzweil is making a "Singularity is Near" movie

Ray: ... I am making a movie based on the book Singularity is Near. It has an A-line documentary, and a B-line story. And in the B-line story, I have an AI that tries to pass the [Turing] test in 2029. It does not actually succeed in 2029, but she goes on to try again.

Ian: So this is a science fiction movie that you are producing right now, or someone is producing with you?

Ray. Yes. The movie is called "The Singularity is Near: A True Story About the Future." The A-line documentary has me interviewing 20 big thinkers on their ideas about the future, and their ideas on my ideas, people like Marvin Minsky, and Alvin Toffler and others. And then the B-line is an actual narrative story illustrating the ideas.


It sounds kind of like the Discovery Channel feature "2057".

Nuclear licensing activity in the USA

The Wall Street Journal discusses the activity of companies trying to get nuclear plants certified and built.

Before seeking the combined construction permit and operating license, the NRC wanted utilities to first seek approval for their proposed plant sites, to determine their suitability. That, the commission said, would speed up the process of going through a standardized application process. Early site review wasn't required, but it was strongly recommended. The NRC committed to a 42-month schedule for processing applications, once it had reviewed them for completeness.

When NRG Energy submitted the first full application for a plant, the plans immediately departed from the preferred process. NRG decided to skip the early site-permit process for its south Texas site, because it plans to put two new reactors next to an existing nuclear station. It picked a GE-designed reactor that was certified -- but in 1996, so it already is out of date in some respects. NRG is seeking permission to make modifications that reflect a decade of operating experience in Japan and technology advancements like better computer controls.

The approach contains some risk for utilities. Modifications sought by equipment vendors, as part of certifications, are decided once and for all by the NRC. But modifications sought by utilities as part of plant licensing can be challenged in each case. In the past, this provided an avenue for lengthy delays by opponents.

Four reactor designs are certified for U.S. use, but only two -- the earlier GE design picked by NRG and a Westinghouse design, called the AP 1000 -- have attracted interest from U.S. customers. The Westinghouse design has been selected by the most companies.

Westinghouse got its reactor certified in December 2005, but it is back at the NRC asking for changes in its design. Some fix errors, some come in response to NRC requests -- for example, the post-Sept. 11 requirement to design plants to withstand airline crashes -- and some are sought by customers. The NRC review is expected to take a couple of years. In the meantime, several power companies are expected to submit their combined construction and operating-license applications.

Constellation, which is expected to submit an application soon, also decided to skip the early site review for its Maryland location. It has picked a reactor by French-based Areva that isn't certified for U.S. use. Areva intends to submit a reactor-certification request to the NRC by year end and hopes to have approval in 2010. Tom Christopher, chief executive of Areva's U.S. unit, said it will be challenging to have reactor certification and plant licensing occurring simultaneously.


FURTHER:
October 15th is Blog Action Day on the environment. More nuclear energy will help reduce the number of active coal power plants, and thereby air pollution, which kills 30,000 people per year in the United States, an average of 14 years sooner than they would otherwise have died.

Improved printing for organs and tissue

A new approach to "printing" living cells could make it easier to arrange them into precise structures without harming them. This could enable future therapies where replacement limbs or organs can be printed to order.


A jet of air can draw out a thread of living cells and sticky polymer that could provide a way to carefully position cells to regenerate tissue or organs (Image: Suwan Jayasinghe)


Current printers use a 60-micrometre needle, so the droplets are at least 100 µm in diameter. Those needles can also damage larger cells: neonatal cardiomyocytes (baby heart cells) can be 100 µm across, and squeezing them through an inkjet needle can make them rupture and die.

Jayasinghe is developing an alternative approach, called Pressure Assisted Spinning. Three needles nested inside one another separately deliver cells, a viscous polymer and pressurised air. The cells and polymer mix are drawn out and mixed by the pressurised air, explains Jayasinghe.

Vladimir Mironov of the Medical University of South Carolina says Jayasinghe's simple solution doesn't tackle the problems hindering all types of cell printing. "The precise placing of different cell types [along the thread] is not possible," he says. "And [manual] cell seeding on a scaffold is laborious and expensive."

As well as inkjet printing, some researchers are experimenting with electrospinning, Mironov points out, a well-understood technology first developed about 100 years ago for making textiles.

In this process, a cell solution flows through an electrically charged hollow needle a few centimetres above an electrically grounded target. The charged solution is drawn towards the target, a little like lightning being drawn towards the Earth, pulling it into a very fine fibre with cells along its length.

But electrospinning also cannot space cells controllably, and has other drawbacks, says Jayasinghe, pointing out that up to 30,000 volts are needed. The current is low, though, making the chance of serious injury minimal. It is still a hazard, he says, and one not present with pressure assisted spinning.

US Navy planning unmanned surface vessels

The Navy has just released its plan for robotic unmanned surface vessels. These look like vehicles that can greatly enhance the flexibility of Navy operations and help ensure greater control of the surface and subsurface area around larger manned Navy vessels.







All four classes of unmanned surface vessels


Table of missions

X-class USV
The smallest, X-class USV, mainly for reconnaissance

Harbor-class USV
Harbor-class specifications: 7 meters long, weaponized Zodiac-type boats that can defend against attacks like the one on the USS Cole


Snorkeler class, a 7-meter semi-submersible, can stay in the water for 1 day

Fleet-class USV
Fleet-class USV, 11 meters long, with good speed, able to stay in the water for 2 days

If costs were held down to $1 to 5 million per USV and UAV, then it would make sense to spend 10%-50% of the Navy budget to provide 10-100 USVs and UAVs per manned Navy vessel. Every manned ship would then have the equivalent of an unmanned carrier group, and mini unmanned carrier groups would be far cheaper than the current manned versions.

FURTHER READING

This is the damage to the USS Cole. A Zodiac with a couple of terrorists and explosives was allowed to get too close to the USS Cole. Robotic Zodiacs could intercept and challenge such craft at a standoff distance from Navy vessels.

Wikipedia has info on Autonomous Underwater Vehicles

Simpler, more reliable metamaterial made from semiconductors

An easy-to-produce material made from the stuff of computer chips has the rare ability to bend light in the opposite direction from all naturally occurring materials. The semiconductors that constitute the Princeton invention are grown from crystals using common manufacturing techniques, making it less complex, more reliable and easier to produce than other metamaterials.


Bending light: A new type of material causes light waves (represented by ovals) to move in a way that’s completely different from the way they move in ordinary materials. Credit: Anthony Hoffman, Princeton University

The MIT Technology Review also has more information about this new metamaterial

The materials developed at Princeton retain the property of negative refraction, yet they're much easier to make. Rather than requiring intricate structures, such as the split rings used in the microwave cloaking device, the materials can be made simply by stacking up extremely thin layers of semiconductor material. What's more, that stacking can be done by the same tools now used to make semiconductor materials for lasers used in telecommunications, says Claire Gmachl, the Princeton researcher who led the work. The new materials consist of alternating layers of indium gallium arsenide and aluminum indium arsenide, and they're tuned to work in the infrared region of the spectrum.

The first application the Princeton researchers are developing is a flat lens for chemical-sensing devices, an application for which materials that work with infrared light are particularly well suited. Gmachl says that the current optical setups for such devices are bulky because they use conventional lenses. "The first application would be using that material to miniaturize optical setups" by replacing curved lenses with flat ones, she says.

Another early application could be in night-vision devices, which also work with infrared wavelengths. "For people who want to improve night-vision devices, this could be quite interesting," Smolyaninov says.


This material may contribute to significant advances in many areas, including high-speed communications, medical diagnostics and detection of terrorist threats.

Negative refraction holds promise for the development of superior lenses. The positive refractive indices of normal materials necessitate the use of curved lenses, which inherently distort some of the light that passes through them, in telescopes and microscopes. Flat lenses made from materials that exhibit negative refraction could compensate for this aberration and enable far more powerful microscopes that can "see" things as small as molecules of DNA.
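Refraction at the surface of such a material still follows Snell's law; it is the sign of the index that changes (a standard result, included here for context):

$$ n_1 \sin\theta_1 = n_2 \sin\theta_2 $$

With n₂ < 0, the refracted ray emerges on the same side of the surface normal as the incident ray, which is what allows a flat slab of negative-index material to focus light like a lens.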

In addition, the Princeton metamaterial is capable of negative refraction of light in the mid-infrared region, which is used in a wide range of sensing and communications applications. Its unique composition results in less lost light than previous metamaterials, which were made of extremely small arrangements of metal wires and rings. The semiconductors that constitute the new material are grown from crystals using common manufacturing techniques, making it less complex, more reliable and easier to produce.

Next, the team plans to incorporate the new metamaterial into lasers. Additionally, the researchers will continue to modify the material in attempts to make features ever smaller in an effort to expand the range of light wavelengths they are able to manipulate.

October 14, 2007

Cancer deaths dropping twice as fast as last decade

Between 2002 and 2004, death rates dropped by an average of 2.1 percent a year. That may not sound like much, but between 1993 and 2001, deaths rates dropped on average 1.1 percent a year.

The big change was a two-pronged gain against colorectal cancer. While it remains the nation's No. 2 cancer killer, deaths are dropping faster for colorectal cancer than for any other malignancy: by almost 5 percent a year among men and 4.5 percent among women.


UPDATE:
Colorectal cancer fact sheet

The following table outlines some of the advantages and disadvantages of the colorectal cancer screening tests described in this fact sheet.




Table: Advantages and Disadvantages of Colorectal Cancer Screening Tests

Fecal Occult Blood Test (FOBT)
  Advantages:
  • No preparation of the colon is necessary.
  • Samples can be collected at home.
  • Cost is low compared to other colorectal cancer screening tests.
  • Studies have proven that this test, when performed every 1 to 2 years in people ages 50 to 80, reduces the number of deaths due to colorectal cancer by as much as 30 percent.
  • FOBT does not cause bleeding or tears in the lining of the colon.
  Disadvantages:
  • This test fails to detect most polyps and some cancers.
  • False positive results are possible. ("False positive" means the test suggests an abnormality when none is present.)
  • Dietary and other limitations, such as increasing fiber intake and avoiding meat, certain vegetables, vitamin C, iron, and aspirin, are often recommended for several days before the test.
  • Additional procedures, such as colonoscopy, may be necessary if the test indicates an abnormality.

Sigmoidoscopy
  Advantages:
  • The test is usually quick, with few complications.
  • Discomfort is minimal.
  • In some cases, the doctor may be able to perform a biopsy (the removal of tissue for examination under a microscope by a pathologist) and remove polyps during the test, if necessary.
  • Less extensive preparation of the colon is necessary with this test than for a colonoscopy.
  Disadvantages:
  • This test allows the doctor to view only the rectum and the lower part of the colon. Any polyps in the upper part of the colon will be missed.
  • There is a very small risk of bleeding or tears in the lining of the colon.
  • Additional procedures, such as colonoscopy, may be necessary if the test indicates an abnormality.

Colonoscopy
  Advantages:
  • This test allows the doctor to view the rectum and the entire colon.
  • The doctor can perform a biopsy and remove polyps during the test, if necessary.
  Disadvantages:
  • The test may not detect all small polyps and cancers, though it is the most sensitive test currently available.
  • Thorough preparation of the colon is necessary before the test.
  • Sedation is usually needed.
  • Although uncommon, complications such as bleeding and/or tears in the lining of the colon can occur.

Double Contrast Barium Enema (DCBE)
  Advantages:
  • This test usually allows the doctor to view the rectum and the entire colon.
  • Complications are rare.
  • No sedation is necessary.
  Disadvantages:
  • The test may not detect some small polyps and cancers.
  • Thorough preparation of the colon is necessary before the test.
  • False positive results are possible.
  • The doctor cannot perform a biopsy or remove polyps during the test.
  • Additional procedures are necessary if the test indicates an abnormality.

Digital Rectal Exam (DRE)
  Advantages:
  • Often part of a routine physical examination.
  • No preparation of the colon is necessary.
  • The test is usually quick and painless.
  Disadvantages:
  • The test can detect abnormalities only in the lower part of the rectum.
  • Additional procedures are necessary if the test indicates an abnormality.


New tests for colorectal cancer screening are under study. For example, virtual colonoscopy (also called computed tomographic colonography) is a procedure that uses special x-ray equipment to produce pictures of the colon. A computer then assembles these pictures into detailed images that can show polyps and other abnormalities. Because it is less invasive and does not require sedation, virtual colonoscopy may cause less discomfort and take less time than conventional colonoscopy. However, as with conventional colonoscopy and DCBE, thorough preparation of the colon is necessary before the test.

Genetic testing of stool samples is also under study as a possible way to screen for colorectal cancer. The lining of the colon is constantly shedding cells into the stool. Testing stool samples for genetic alterations that occur in colorectal cancer cells may help doctors find evidence of cancer or precancerous polyps. Research conducted thus far has shown that this test can detect colorectal cancer in people already diagnosed with this disease by other means. However, more studies are needed to determine whether the test can detect colorectal cancer or precancerous polyps in people who do not have symptoms.

Hitachi making 4 terabyte hard drives by 2011, possibly 2009

Hitachi is developing current-perpendicular-to-the-plane giant magnetoresistive (CPP-GMR) heads for hard disk drives, to enable 4-terabyte drives and 1-terabyte notebook drives in 2011.

The CPP-GMR drive essentially changes the structure of drive heads. Current drives come with a tunnel magnetoresistance head. In these, an insulating layer sits between two magnetic layers. Electrons can tunnel through the layer. Precisely controlling the tunneling ultimately results in the 1s and 0s of data.

UPDATES:
PC World indicates that the new drives will be available in 2009. This makes more sense if hard drives are to keep market share against up-and-coming memory technologies like flash, NRAM and others.

Sci-Tech Today reports that the technology will start to be rolled out in 2009 and will be fully developed by 2011.
