October 03, 2009

Singularity Summit Weekend : Collection of Liveblogging

This site has previewed the schedule and several key talks that will be presented at the Singularity Summit in New York

Michael Anissimov of Accelerating Future is at the Singularity Summit and reports that there are over 800 attendees.

Popular Science magazine has live blogging of the Singularity Summit

Sentient Developments has Benjamin Peterson liveblogging.

Lisa Rein of Hplus Magazine will be liveblogging the event

UPDATE:
The New Atlantis has more detailed liveblogging posts.

Ben Goertzel, Hameroff

Ben talks about neural-symbolic architecture for cognitive robotics, and shows demonstrations of physically embodied AGI systems built on that architecture using the Nao robot platform. Goertzel shows video of an AGI that recognizes the emotions of a human, then video of a Second Life implementation of AGI. Ben goes on to list a proven record of AGI (Biomind) assisting in medical research; for example, Biomind was used to infer aging networks from gene expression data from Genescient's Methuselah flies.

Hameroff takes the stage and remarks on the historical failures of artificial intelligence research, citing Newell and Simon (work dating back to the 1950s): not because AI is impossible, he argues, but because researchers have not been modeling the correct structures. He goes on to describe microtubules, quantum effects in the brain, and gamma-synchrony states in the brain indicative of consciousness. He glosses over the quantum consciousness theory he has developed with Roger Penrose and encourages AI researchers to model cilia.


Hplus magazine has coverage of the Whole Brain Emulation talks, with several pictures.

Popular Science covers the Anna Salamon opening talk
END UPDATE

The SIAI's (Singularity Institute's) Anna Salamon just finished the opening talk about intelligence and the Institute's vision for a controlled intelligence explosion, as opposed to an uncontrolled intelligence explosion that would destroy us and everything we value.

Anna discussed the gradual technological progression of intelligence that will eventually make humans obsolete, and described a number of avenues for incremental research progress, so that we can eventually learn how to build an intelligence that we understand, and that will create a world we value.




Anders Sandberg talked about Whole Brain Emulation

In order to emulate a brain we're going to need greater scanning capability:
• need enough resolution - rough consensus 5x5x50 nm resolution scanning

• need enough information

• need enough volume

"The step from mouse to man is 20 years in terms of brain emulation. First scan or simulate, then computer power; gradual emergence of emulation."



Randal Koene "The Time is Now We Need Whole Brain Emulation"

Randal is impressing upon his audience the importance of shedding the fleshbag.

Theodore Berger is building a hippocampus replacement.

"In-vivo techniques, neural recording, neural interfacing:
• Scary/risky procedures
• chronic implantation
• power supply
• scale and bandwidth .."

Different technologies could provide less-invasive (?) modalities: qdots.

Multiscale scanning requirements, need to record neuronal activity at different levels of activity (voxel, group, spike, analog, spatial, molecular)


Itamar Arel "Technological Convergence Leading to AGI" (Artificial General Intelligence)

Deep Machine Learning
• biologically-inspired computational intelligence approach
  - massively parallel

VLSI tech
• adequate tech is here
• billions of transistors on a single chip
• power requirements are low

Ben Goertzel asks Arel what his funding requirements are ... Arel says he can build you an AGI system for 2-14 million dollars.


Cities 2.0

The world has an opportunity to develop new, more productive and efficient cities built entirely from scratch, because billions of people are moving to cities. The current move from rural to urban is increasing per capita GDP by four to eight times in places like China. The productivity gains could be even greater if a new kind of city could be produced with double or triple the productivity of current cities.

China is considering accelerating urbanization beyond the current pace of 20-35 million people per year.

Source: Population Division of the Department of Economic and Social Affairs of the United Nations Secretariat, World Population Prospects: The 2006 Revision and World Urbanization Prospects: The 2007 Revision, http://esa.un.org/unup

World Urban population 2000-2050
Year Urban population (in billions) Total population 
2000 2.85                           6.12 
2005 3.16                           6.51
2010 3.49                           6.91
2015 3.84                           7.30
2020 4.21                           7.68
2025 4.58                           8.01
2030 4.97                           8.32
2035 5.34                           8.59
2040 5.71                           8.82
2045 6.06                           9.03
2050 6.40                           9.20
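The urban share implied by the table can be computed directly. A quick Python check using the UN figures above (populations in billions):

```python
# Urban share of world population, from the UN table above (billions).
data = {
    2000: (2.85, 6.12),
    2025: (4.58, 8.01),
    2050: (6.40, 9.20),
}

for year, (urban, total) in sorted(data.items()):
    share = 100 * urban / total
    print(f"{year}: {share:.0f}% urban")
# 2000: 47% urban
# 2025: 57% urban
# 2050: 70% urban
```

So the projection rises from under half urban in 2000 to about 70% urban by 2050.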


Urbanization is ultimately heading towards about 85-90%. If population levels out at 9.5 billion then that would be about 8.5 billion in cities in about 2080.

There is an opportunity for countries like China to reinvent cities that are more productive and efficient. Design and prototyping of cities could be done from now to 2020. A radically different, from-scratch city could be developed and built out starting in 2020-2030.

Doming cities could be re-examined as well as factory production of cities with more compact footprints and arcology features. Multi-level structures with 20-30 foot high levels could use less area while still providing a feeling of openness.



A domed city would most benefit from being a planned city like Masdar city in Abu Dhabi.

City scale climate engineering could save money and more efficiently reduce energy costs.

Air-supported sheets for domes could reach far larger sizes.

For towers, the mile high building is still proceeding.

The smaller, more conservative domes in the Houston video would either cover just the city center or several areas of the metro area.

The cost benefits for any dome city would be more readily realized as part of a build from scratch plan like Masdar.

Retrofit versions would come after the technology and benefits were proved out, although certain hurricane-vulnerable cities, such as those along the Texas coast and New Orleans, could justify dome retrofits in place of other public works projects (bridges, buildings, roads, etc.) whose costs run into the tens of billions.

An existing large domed greenhouse has several connected dome structures so you can always expand out as needed.

Bolonkin (who has optimistic economic projections) has a paper on doming a city and handling water and climate.

Reimagining Movement

Masdar is a planned city that would have no fossil fuel cars.

US Bureau of Transportation Statistics (BTS) costs of freight transportation by mode

Air 82 cents per ton mile
Truck 26 cents per ton mile
Rail 2.9 cents per ton mile
Barge 0.72 cents per ton mile (2001)
Pipeline 1.49 cents per ton mile (2001)

These figures show that large energy-efficiency gains are possible with a megascale engineering revamp. Magnetic pipelines for cargo movement and deliveries could cut costs by a factor of twelve compared to trucks inside cities.
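The gaps between modes in the BTS list are easier to see as ratios. A small sketch, with costs in cents per ton-mile taken from the list above:

```python
# BTS cost per ton-mile by mode, in cents (from the figures above).
cost = {"air": 82.0, "truck": 26.0, "rail": 2.9, "barge": 0.72, "pipeline": 1.49}

cheapest = min(cost, key=cost.get)  # barge
for mode, c in sorted(cost.items(), key=lambda kv: kv[1]):
    print(f"{mode:8s} {c / cost[cheapest]:5.1f}x {cheapest}")
```

Truck comes out at about 9x rail and 36x barge, the same order of magnitude as the twelve-fold in-city saving claimed for magnetic pipelines over trucks.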

China has funded magnetic pipelines for moving cargo. The magnetic movement of cargo would be like plumbing for waste and water, but would be pipes for efficiently moving goods within and between cities.

Another company, Launchpoint Technologies, is developing a freight rail electrification system. Their rail motor is covered at the link on magnetic pipelines.

Revamping transportation, cargo movement and people movement would be part of a planned conversion of cities. It is thinking far bigger than today's walking-only outdoor mall districts.

Note the low cost per ton-mile of barges. This is why, as China dams its major rivers with the equivalent of a Three Gorges every two years from now to 2030, it is also deepening many rivers to allow 10,000-ton barges to reach the interior. The China example of building hundreds of gigawatts of hydroelectric power while also enabling a nationwide barge infrastructure shows:

1. What a major plan can achieve when engineers run a country.
2. That thinking big can deliver more energy, efficiency and higher productivity for a country.
3. That an engineering plan can enable the GDP of entire regions and cities to double or more.

Transportation Energy

The first commenter talked about the energy and cost of walking around and public transportation.

Here is an analysis of walking around energy usage.

Humans are modestly efficient. Walking, an average person burns about 100 Calories per mile at 3 mph, or 300 per hour, while sitting for the same hour burns around 80 Calories just keeping you warm. In other words, walking 3 miles uses about 220 extra Calories. Calories here are kilocalories, and one Calorie/kcal is about 4 BTUs, 4,200 joules or 1.16 watt-hours.

While walking 1 mile burns an extra 74 Calories, on a bicycle we’re much better. Biking one mile at 10mph takes about 38 extra calories over sitting.

A gallon of gas has about 31,500 Calories in it, so you might imagine that you get 815 “mpg” biking and 400 “mpg” walking. Pretty good. (Unless you compare it to an electric scooter, which turns out to get the equivalent of 1200 mpg from pure electricity.)

But there’s a problem. We eat, on average, about 2700 Calories/day in the USA, almost all of it produced by agribusiness, which runs on fossil fuels. Fossil fuels provide the fertilizer. They run the machines. They process and transport and refrigerate the food. In many cases our food — cows — eats even more food produced at very high energy cost.

I’ve been digging around estimates, and have found that U.S. agriculture uses about 400 gasoline-gallon equivalents per American. Or 1.1 gallons per day, or about 10 Calories (40 BTU) from oil/gas for every Calorie of food. For beef, it’s far worse, as close to 40 Calories of oil/gas (160 BTU) are used to produce one Calorie of beefy goodness.

You can see where this is going. I’m not the first to figure it out, but it’s worth repeating. Your 3 mile walk burned 220 extra Calories over sitting, but drove the use of 2,200 Calories of fossil fuel. That’s 1/14th of a gallon of gasoline (9oz.) So you’re getting about 42 miles per gas-gallon of fossil fuel.
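The chain of numbers above can be checked end to end. A sketch using the article's own figures; the 10:1 fossil-to-food Calorie ratio is the US-average estimate quoted above:

```python
GAS_CAL_PER_GALLON = 31_500   # food Calories (kcal) per gallon of gasoline
WALK_CAL_PER_MILE = 74        # extra Calories per mile walked, over sitting
BIKE_CAL_PER_MILE = 38        # extra Calories per mile biked, over sitting
FOSSIL_PER_FOOD_CAL = 10      # fossil-fuel Calories behind each food Calorie

def mpg_equivalent(cal_per_mile, fossil_ratio=1):
    """Miles per gasoline-gallon equivalent of the energy consumed."""
    return GAS_CAL_PER_GALLON / (cal_per_mile * fossil_ratio)

print(mpg_equivalent(BIKE_CAL_PER_MILE))   # ~830 "mpg", food energy only
print(mpg_equivalent(WALK_CAL_PER_MILE))   # ~425 "mpg", food energy only
print(mpg_equivalent(WALK_CAL_PER_MILE, FOSSIL_PER_FOOD_CAL))  # ~42 mpg in fossil terms
```

The ~815 and ~400 "mpg" figures above are the same calculation with rounder inputs; the fossil-fuel multiplier is what collapses walking to roughly 42 miles per gallon of fossil fuel.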


Brad Templeton has some analysis of the energy efficiency of public transit in the USA. Public transportation is more efficient in Asia, where there can be fifty people on a jeepney in the Philippines.


Here is more Brad Templeton analysis of transportation myths.

Since transit is only 1% of total USA transportation energy use, it doesn't matter a great deal how green it is. Making it twice as good, or twice as bad, would not significantly alter the energy total.


Reinventing Farming Inside City 2.0
This situation could be improved by changing the fuel usage for agriculture: using more nuclear power, electrified farm machinery, and a "wonder tree" to fertilize the soil.

The Faidherbia tree - pending some further research on its impact on the water table - may now provide a natural and widespread fertilizer fix. According to the Agroforestry Centre, farmers in Malawi testify the tree is like a "fertilizer factory in the field", as it takes nitrogen from the air, fixes it in the leaves and subsequently incorporates it into the soil. The Agroforestry Centre's research showed that in Malawi maize yields increased by 280 per cent in the zone under the tree canopy compared with the zone outside the tree canopy. In Zambia, unfertilized maize yields in the vicinity of Faidherbia trees averaged 4.1 tonnes per hectare, compared to 1.3 tonnes nearby but beyond the tree canopy.


A revamp in farming is needed to make City 2.0 far more energy efficient even for walking.
Vertical farming has been examined in detail. This would bring farming into the 2.0 City, which would almost eliminate transportation energy for moving food to the city and make a city more self-sufficient. The carousels, rafts and tending systems of the vertical farm can be run electrically.







Integrating farms and parks and fish farming into the City 2.0 could be done in a way that enhances the quality of life for the city inhabitants while also providing food far more efficiently. Currently, people in suburban developments can have their houses within 2-3 blocks of one of several parks or schools built by the developer as part of a 1000-6000 home development.

City 2.0 should adopt the best principles of an Arcology.

Arcology, a portmanteau of the words "architecture" and "ecology," is a set of architectural design principles aimed toward the design of enormous habitats (hyperstructures) of extremely high human population density.


China has some small arcology projects

A seed arcology appropriate for southern Chinese coastal sites in the Pearl Delta River area or Hainan Island is proposed. It is the home for approximately 300 people to start, and centers around an in-house food packaging facility and Integrated Water Center (IWC).

Because this project intercepts flows of residential wastewater which might otherwise flow to industrialized treatment facilities or even the sea, cooperation with city planners and local governments is assumed. Marketable food products come directly from adjacent terraces where urban wastewater is biologically purified, i.e., urban agriculture. Solar and wind electrolysis and wastewater gasification provide hydrogen gas which in turn provides electricity and heat on demand to satisfy resident needs throughout the 15-hectare minimum site.

The structure passively saves energy via bioclimatic adaptation, reflected solar illumination, and reused/recycled materials. The structure contains a small craft marina, hydroponics gardens, filter beds, bioremediation tanks, dry and liquids storage, classrooms, offices, dormitories, a small market plaza and shared, communal spaces. Bio-terraces and algae ponds, linked by a Contour Retaining wall Infrastructure System (CRIS) surround its outer parts. Arcology is a factory, farm, school and community in one, located in the urban periphery where migrants tend to settle. Automobile traffic is limited to delivery and emergency vehicles; pedestrian traffic is the norm in a condensed, three-dimensional environment. This seed is meant to act as a self-reliant economic unit. The arcological seed not only provides urban infrastructure where none existed previously, but one predicated on the ecosystem model or natural resource cycles. Urban wastewater has already demonstrated its economic potential in China and remains largely untapped.



October 02, 2009

VASIMR, Uranium Hydride Reactor, Direct Conversion of Heat or Radiation to Electricity

This is an update of the late 2007 article about using hyperion power generation uranium hydride reactors with VASIMR plasma rockets.

Hyperion Power Generation is moving ahead with preparations for three factories for its roughly 7-ton, 27 MWe nuclear reactors. 100 to 200 reactors could be built and deployed from 2014-2019.

Ad Astra announced the follow-up to the 2010 VASIMR plasma rocket tests at the space station: clusters of 200 kW rockets to reach a megawatt in 2013, and then development of megawatt-class single-unit rockets.

By 2020, multi-megawatt VASIMR rockets could be combined with the uranium hydride reactor(s); 200 MW versions would enable the 39-day trips to Mars.



A proposed portable nuclear reactor (simplified solid core) is the size of a hot tub and will be able to generate 27 MWe. It is in funded development. A 200 kW version of the VASIMR engine was ground tested in 2008 and a flight version is being readied for 2010. Seven of the nuclear generators would provide about 200 MW of power to enable 39-day one-way trips to Mars. These two technologies, both in funded development and with no major feasibility questions, could revolutionize space travel.

Clarification about what is novel about this design and what should be the same as other nuclear reactor and nuclear propulsion systems:

The reactor does not exist yet. Therefore, it is not space rated. They have just announced that they are working on it. They are talking 2012 for the first one to get finished for some ground application.

However, it is just another solid core nuclear reactor. I do not see why other nuclear reactor designs and solid core rockets would work and this would not. There have been other nuclear thermal spacecraft designs using similar technology. I am just choosing to pair the reactor with the Vasimr plasma drive instead of using direct nuclear propulsion systems. It is nuclear electric powering a vasimr drive.

The patent indicated that if they used thorium hydride then the reactor would run at about 1900 degrees, which could be better for a nuclear power system for a rocket. However, they are first working on uranium hydride.

I have not done any detailed design for this system, but there is not much about it that is novel compared to other nuclear reactors for space rocket designs. The main novelties: no people are needed to tend the reactor; it keeps a constant temperature by itself; it is simple and presumably easier to build and maintain; and it produces less waste than many other systems. Heat piping, radiation shielding, heat radiators, and conversion of the steady-state heat to electricity are all things that can be tweaked based on the application and that can be cribbed from past nuclear rocket designs. Another nice point is that they are talking about mass factory production and low costs.

From the patent (section 58)

At the rate of power production assumed for the reactor, 50 to 100 W/cm³.


If the density is 8 g/cm³, then it would seem to work out to roughly 7-14 kW per kg.
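That estimate follows directly from the patent figure. A quick sketch, assuming the 8 g/cm³ core density stated above:

```python
# Specific power from the patent's volumetric power density, assuming a
# core density of 8 g/cm^3 (the assumption stated above).
density_g_per_cm3 = 8

for power_w_per_cm3 in (50, 100):
    kw_per_kg = power_w_per_cm3 / density_g_per_cm3  # W/g is numerically kW/kg
    print(f"{power_w_per_cm3} W/cm^3 -> {kw_per_kg:.2f} kW/kg")
```

This gives roughly 6-13 kW/kg, close to the 7-14 kW/kg estimate above.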

There are some basic sources online for rough estimates of heat pipe and heat radiator weights.

Some other component weights for other nuclear rockets.

The Hyperion device runs at 400-800 degrees using uranium hydride and could run at 1900 degrees using thorium hydride. We can look at other spacecraft designs that have heat radiators; those are separate technologies from the main Hyperion nuclear reactor, and the physics of dealing with the heat is the same. The Hyperion reactor's main advantage is that it self-regulates to whatever temperature range it is designed for, based on the different metal hydrides used. Dealing with heat and electricity conversion is not changed from other solid core nuclear reactors.

I have a new article that discusses the state of thermoelectronics, which have improved a lot since the mid-90s and which have a lot of money going into improving them to help deal with high oil prices. The goal of the DOE EERE is to raise diesel engine efficiency by over 50% by 2014 with add-on systems that utilize waste heat.

Hyperion Power Systems (the maker of the new power generator) indicates that the Santa Fe reporter made a mistake: the output is about 27 MW electric. This is consistent with the patent, which talked about tens of MW of electricity. They also said that the containment vessel will be dense enough that no radiation will escape even if it is not buried in the ground.


The proposed nuclear "battery" reactor

For a spacecraft we would want to eliminate extra weight, especially any dirt radiation shield. There are several approaches. Research has been looking at using lightweight electric and electrostatic fields as radiation shields. We would only need to concentrate shielding on the crew quarters. The crew quarters would probably be in the front, with the 600 tons of fuel in tanks between them and the reactors. Shorter trip times mean less exposure to low gravity and cosmic rays.

Spacecraft designers may also use a ship's own cryogenic fluids as a radiation screen by arranging the cargo tanks containing them around crew compartments.

"In most [mission] scenarios, you need liquid hydrogen for fuel and you need water," explained Richard Wilkins, director of NASA's Center for Applied Radiation Research at Prairie View A & M University in Texas, conducting one study into liquid shield approaches. "And these are all considered materials that are particularly good for cosmic ray shielding."


Here is a 12 page pdf on electrostatic radiation shielding



Another issue is converting heat from the nuclear plant to electricity. There has been a great deal of progress on thermoelectronics: electronics that convert heat into electricity. The thermoelectric effect is discussed at Wikipedia.

One company, Powerchips, claims to be able to achieve 70-80% of Carnot efficiency. At 70% of Carnot, if the hot side were 500 degrees Celsius and the cold side 0, then 45% of the heat would be captured as electricity. If the cold side could get down to -80 or -90, the efficiency would rise to around 53-54%. The cold side might get that cold or colder in space if it were shaded.
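Those percentages follow from the Carnot limit. A sketch; the 70%-of-Carnot factor is the vendor's claim above, not a measured value:

```python
def conversion_efficiency(t_hot_c, t_cold_c, fraction_of_carnot=0.70):
    """Heat-to-electricity efficiency at a given fraction of the Carnot
    limit; temperatures are in degrees Celsius."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    carnot = 1 - t_cold_k / t_hot_k
    return fraction_of_carnot * carnot

print(conversion_efficiency(500, 0))    # ~0.45, the 45% figure above
print(conversion_efficiency(500, -85))  # ~0.53, near the figure quoted above
```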

The total critical mass is 600-1200 kg. The total mass for the nuclear reactor is probably under 100 tons and possibly in the 10-20 ton range. A nuclear-powered VASIMR rocket would enable one-way trips to Mars in 39 days while delivering 22 tons of payload. VASIMR engines can reach up to 50,000 seconds of specific impulse (Isp), which is over 100 times more fuel efficient than the Space Shuttle's engines. The nuclear space vehicle would weigh about 600-1500 tons fully fueled, so it would take several launches using chemical rockets to put the pieces in orbit for assembly. A slightly scaled-back system with one or two nuclear reactors would still enable 70-100 day trips to Mars.
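The specific-impulse advantage translates into propellant mass through the rocket equation. A hedged sketch: the 15 km/s mission delta-v and the 5,000 s VASIMR operating point are illustrative assumptions, not figures from the mission design above:

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def propellant_fraction(delta_v_m_s, isp_s):
    """Fraction of initial mass that must be propellant (Tsiolkovsky)."""
    v_exhaust = isp_s * G0
    return 1 - math.exp(-delta_v_m_s / v_exhaust)

# Illustrative 15 km/s mission: chemical Isp of 450 s versus a
# VASIMR-like operating Isp of 5,000 s.
print(propellant_fraction(15_000, 450))    # ~0.97: nearly all propellant
print(propellant_fraction(15_000, 5_000))  # ~0.26: mostly payload and structure
```

The gap widens further toward the 50,000 s upper end mentioned above, which is how a 600-1500 ton fully fueled vehicle can still deliver a useful payload quickly.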

There is no serious scientific question about whether these two technologies (improved nuclear fission and Vasimr plasma propulsion) will work. It is a matter of funding the work and doing the engineering development.

Combining this power source, which is targeting 2012 operation, with the VASIMR plasma drive would then enable very good space transportation out to Mars or the asteroids.


Franklin Chang Diaz and his 200 kw Vasimr engine

This 28 slide presentation by Andrew Petro of NASA shows that using a Vasimr propulsion system with 200MW of nuclear power would enable a one way trip to Mars in 39 days.


Information on the 200MW Vasimr system and one way travel times to Mars

The VASIMR system is a high power, electrothermal plasma rocket featuring a very high specific impulse (Isp) and a variable exhaust. Its unique architecture allows in-flight mission optimization of thrust and Isp to enhance performance and reduce trip time. VASIMR consists of three major magnetic stages where plasma is respectively injected, heated and expanded in a magnetic nozzle. The magnetic configuration is called an asymmetric mirror. The 1st stage handles the main injection of propellant gas and the ionization subsystem; the 2nd stage acts as an amplifier to further heat the plasma. The 3rd stage is a magnetic nozzle which converts the plasma energy into directed momentum. The magnetic field insulates nearby structures from the high plasma temperature (>1,000,000 K). It is produced by high-temperature superconductors cooled mainly by radiation to deep space. Some supplemental cooling from the cryogenic propellants (hydrogen, deuterium, helium or mixtures of these) may also be used.

The system is capable of high power density, as the plasma energy is delivered by wave action, making it electrodeless and less susceptible to component erosion. Plasma production is done in the 1st stage by a helicon discharge, while additional plasma heating is accomplished in the 2nd stage by the process of ion cyclotron resonance.


Another paper analyzing vasimr engines

FURTHER READING
Another nuclear powered vehicle that we have the technology to start building now is the Liberty Ship, a gaseous core nuclear design. It could launch 1000 tons into orbit in one trip and would not leak any nuclear material. This kind of design is needed to greatly improve launching from Earth to orbit. The nuclear VASIMR only helps with getting from orbit to anywhere else.

Be sure to read my analysis of the patent for the nuclear "battery" a solid core uranium hydride reactor.

MIT interview with Franklin Chang Diaz, president and CEO of Ad Astra Rocket Company, who is working on the Vasimr propulsion system that could shorten trips in space and improve fuel efficiency.

For Mars and beyond, we will need to develop nuclear electric power. If we don't, we might as well quit. We're not going to get anywhere without it.

I also would not want to send people to Mars on a fragile and power-limited ship. If you send people that far, you have to give them a fighting chance to survive, and the only way you can do that is if you have ample supplies of power. Power is life in space.


Here is a 47 slide presentation by Tim Glover on a 12MW Vasimr system.


Tim Glover's presentation shows components that would be needed for high power Vasimr systems like this 4MW ICRF antenna


Size of parts for a 1MW Vasimr engine


2.8 MW RF power converter


The components and weights of a 2.5MW vasimr engine design


This pdf from 2003 surveyed various near term propulsion options for trips to Mars.

If we get fusion power working then we can do even better.

Hyperion Power Generation Uranium Hydride Nuclear Reactor Factories



Hyperion Power Generation plans to build a small reactor manufacturing plant in the United Kingdom within the next two years. The firm says it plans to use the existing UK supply chain to build its 70MWt (27MWe), self-regulating reactor and that the UK will be its ‘launch pad’ for the European market.

H/T AtomicInsights

“We have customer commitments for over a hundred units already. We’re going to be very busy! In fact, we’re now scheduling deliveries out to 2018-2020 even though we expect to go to market in the 2013-2014 timeframe.”

Hyperion plans to build three manufacturing facilities in total: the UK plant, one in the USA to support the North and Latin American markets, and a third in Asia, probably Japan. Although the firm hasn’t made a decision on where its UK plant will be located, it’s likely to be near existing nuclear facilities, which are clustered around the Sellafield site in Cumbria, northwest England.

Deal said that Hyperion is in discussions with economic development organisations around the UK, as it decides where to locate the plant, which would employ around 200 people. This process is likely to take a couple of years. “It might be possible to use existing facilities; we just don’t know yet,” he said. The firm is in discussions with the existing UK supply chain, too.

Hyperion’s small reactor would not be in competition with larger reactors and would be a source of incremental revenue to existing suppliers, Deal says.

Deal said that Hyperion expects to submit its design certification application to the US Nuclear Regulatory Commission ‘within the next year’. It also plans to get its design licensed in the UK.

In terms of cost the reactor itself will cost approximately $30 million. Deal says that the firm is committed to generating electricity for less than 10 cents/kWh. “If you look in terms of reactor cost, plus plant side it comes out to be $2000/kW. But, it depends on if you’re retrofitting to replace coal or gas plants.”
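Deal's numbers can be cross-checked. A quick sketch from the figures quoted above:

```python
reactor_cost_usd = 30e6  # quoted reactor price
output_kwe = 27_000      # 27 MWe electrical output

reactor_only_per_kw = reactor_cost_usd / output_kwe
print(f"${reactor_only_per_kw:,.0f}/kW for the reactor alone")  # ~$1,111/kW
# The quoted $2,000/kW figure adds the plant-side (balance of plant)
# cost on top of the reactor itself.
```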

The Hyperion Power Module – like the company itself, which employs less than 100 people – defines small. At just 1.5m wide and 2m tall the reactor can be transported to site by ship, rail or road. It will be able to produce power for 8-10 years depending on the application, after which it will need to be returned to the factory for refuelling.




Hyperion’s reactor can be used for decentralised power, for military installations, and for dedicated applications such as mining operations, factories, water treatment or sewage facilities.

Deal said that Hyperion has had several enquiries from people who want to use its reactor as a source of baseload power for wind projects. There is also the option that Hyperion reactors could be used to retrofit fossil fuel plants. “We are essentially selling the heat. That means we’re incredibly flexible for retrofit.”



Megawatt Class VASIMR Plasma Rocket Cluster by 2013

In an Interview in Seed Magazine, Dr. Franklin Chang-Diaz discusses the future of the VASIMR plasma rocket.



Once we’ve demonstrated a 200-kilowatt prototype engine operating at full power on the ground, the next step is testing an identical version in space. We’re already testing the prototype unit in our vacuum chamber here in Houston, and we’re designing the actual flight engine, which is called the VF-200. We signed an agreement with NASA last December to actually mount the VF-200 on the International Space Station in 2012 or 2013. Unfortunately, the space station doesn’t have 200 kilowatts to give us. So what we’ll do is use the solar arrays of the station to charge a battery pack that we’ll carry on board, which will allow us to fire the rocket at 200 kilowatts for up to 15 minutes. We’ll do this again and again for months to qualify the engine in space. In 2013 or 2014, we’ll make clusters of 200-kilowatt engines to give us something close to a megawatt of electricity, and deploy them with a very high-powered solar array. This will be a robotic reusable “space tug” that can refuel or reposition satellites, or even send packages to the Moon at a much lower price. By charging for those services, we hope to bootstrap our way into developing a megawatt-class rocket. That rocket would be too powerful to test on the ISS, but it could perhaps be tested on the surface of the Moon where solar power is abundant. Like the ISS tests, we’d fire the megawatt-class VASIMR continuously for a period of one month, then two months, to validate and verify that it could be used on a human mission to Mars.

Once we have this capability, Mars isn’t really the only place that we can go. With a megawatt-class VASIMR, basically we will have access to the entire solar system. Mars is an interesting place, but so are Europa and Ganymede and Enceladus and Titan. These are places where we might find extraterrestrial life. Even with the 200-kilowatt solar-powered VASIMR we could do amazing things. We’re developing a concept to drive it close to the Sun, between Venus and Mercury, where it can get a momentum boost and catapult a probe into the outer solar system at high speed. This would let us deliver a package to Jupiter in one-and-a-half years; otherwise that trip takes about six.

One thing we’d like to do is maintain the ISS in orbit. The ISS has to be reboosted every few months; otherwise it gradually falls and burns up in the atmosphere. These reboosts require about 7 metric tons of rocket fuel per year. How much does it cost to get 7 metric tons of rocket fuel into orbit? $140 million. That’s the bill someone has to pay, each year, just for hauling up the fuel. The 200-kilowatt solar-powered VASIMR can do the same thing with about 320 kilograms of argon gas per year, which still costs about $7 million, but it decreases the price by a factor of 20. Of course, we have to make a little money ourselves, so the price decrease won’t be quite that large, but it can still save NASA a lot of money and net us a handy profit.
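The reboost economics in the quote reduce to a launch-cost calculation. A sketch from the quoted figures; the ~$20,000/kg launch price is derived from them, not stated directly:

```python
# Chemical reboost: 7 metric tons of fuel per year, $140M to launch it.
chem_fuel_kg = 7_000
chem_cost_usd = 140e6
launch_usd_per_kg = chem_cost_usd / chem_fuel_kg   # implied $20,000/kg

# VASIMR reboost: 320 kg of argon per year at the same launch price.
vasimr_cost_usd = 320 * launch_usd_per_kg
print(vasimr_cost_usd / 1e6)            # ~6.4, near the quoted ~$7M
print(chem_cost_usd / vasimr_cost_usd)  # ~22, near the quoted factor of 20
```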





We would hope that, if not the US, maybe the Europeans, the Chinese, the Russians, or somebody else will develop a nuclear-electric power capability that we can marry up to this rocket. We have to realize that the US is no longer the only player. The US may choose not to do this, but that doesn’t mean the rest of the world will follow—not anymore. We no longer live in a confrontational world like the one that fueled the Apollo program in the 1960s. We live in a world that has to cooperate, to collaborate. The US has a tremendous opportunity to still be the leader here, but if it isn’t, others will be. Information is traveling faster everywhere now; technology has gained a foothold and developed in the nooks and crannies of the planet. The world has changed, and the US no longer has a monopoly on knowledge. We need to collaborate to build a capable space infrastructure so that we can truly explore.



VASIMR Space Missions




Ginkgo BioWorks Aims to Push Synthetic Biology to Factory-Level Automation


Biological parts: Ginkgo BioWorks, a synthetic-biology startup, is automating the process of building biological machines. Shown here is a liquid-handling robot that can prepare hundreds of reactions. Credit: Ginkgo BioWorks

MIT Technology Review reports that startup Ginkgo BioWorks aims to push synthetic biology to the factory level.

Ginkgo BioWorks is a new synthetic-biology startup that aims to make biological engineering easier than baking bread. Founded by five MIT scientists, the company offers to assemble biological parts--such as strings of specific genes--for industry and academic scientists.

"Think of it as rapid prototyping in biology--we make the part, test it, and then expand on it," says Reshma Shetty, one of the company's cofounders. "You can spend more time thinking about the design, rather than doing the grunt work of making DNA." A very simple project, such as assembling two pieces of DNA, might cost $100, with prices increasing from there.



MIT's Tom Knight developed a standardized way of putting together pieces of DNA, called the BioBricks standard, in which each piece of DNA is tagged on both sides with DNA connectors that allow pieces to be easily interchanged.

"If your part obeys those rules, we can use identical reactions every time to assemble those fragments into larger constructs," says Knight. "That allows us to standardize and automate the process of assembly. If we want to put 100 different versions of a system together, we can do that straightforwardly, whereas it would be a tedious job to do with manual techniques." The most complicated part that Ginkgo has built to date is a piece of DNA with 15 genes and a total of 30,000 DNA letters. The part was made for a private partner, and its function has not been divulged.
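The closure property Knight describes — identical reactions assemble any standard-flanked parts into a larger part that again obeys the standard — can be conveyed with a toy model. This is a hypothetical sketch: the `PRE`/`SUF` markers and the `assemble` function are illustrative stand-ins, not the real BioBrick prefix, suffix, or lab protocol.

```python
# Toy model of the BioBrick idea: every part carries the same standard
# prefix/suffix flanks, so one fixed assembly reaction composes any two
# parts into a new part that again obeys the standard (closure).
# Marker strings are placeholders, not the real BioBrick flank sequences.
PREFIX, SUFFIX = "PRE", "SUF"
SCAR = "scar"  # joining two parts leaves a small scar between them

def make_part(insert):
    return f"{PREFIX}-{insert}-{SUFFIX}"

def assemble(part_a, part_b):
    """One standardized reaction: strip the inner flanks, join with a scar."""
    a = part_a[len(PREFIX) + 1 : -(len(SUFFIX) + 1)]
    b = part_b[len(PREFIX) + 1 : -(len(SUFFIX) + 1)]
    return make_part(f"{a}-{SCAR}-{b}")

p1 = make_part("promoter")
p2 = make_part("gfp")
composite = assemble(p1, p2)
print(composite)  # PRE-promoter-scar-gfp-SUF
# The composite is itself a valid part, so assembly can be iterated:
bigger = assemble(composite, make_part("terminator"))
```

Because every output is again a standard part, the same robotic reaction can be applied over and over — which is exactly what makes the process automatable.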

Assembling parts is only part of the challenge in building biological machines. Different genes can have unanticipated effects on each other, interfering with the ultimate function. "One of the things we'll be able to do is to assemble hundreds or thousands of versions of a specific pathway with slight variations," says Knight. Scientists can then determine which version works best.

So far, Knight says, the greatest interest has come from manufacturing companies making chemicals for cosmetics, perfumes, and flavorings. "Many of them are trying to replace a dirty chemical process with an environmentally friendly, biologically based process," he says.

Ginkgo is one of just a handful of synthetic-biology companies. Codon Devices, a well-funded startup that synthesized DNA, ceased operations earlier this year. "The challenge now is not to synthesize genes; there are a few companies that do that," says Shetty. "It's to build pathways that can make specific chemicals, such as fuels." And unlike Codon, Ginkgo is starting small. The company is funded by seed money and a $150,000 loan from Lifetech Boston, a program to attract biotech to Boston. Its lab space is populated with banks of PCR machines, which amplify DNA, and liquid-handling robots, mostly bought on eBay or from other biotech firms that have gone out of business. And the company already has a commercial product--a kit sold through New England Biolabs that allows scientists to put together parts on their own.


FURTHER READING
Ginkgo BioWorks BioBrick Assembly Kit: The BioBrick Assembly Kit includes all the reagents needed to assemble BioBrick standard biological parts.

BrickLayer Assembly Service: Our Ginkgo-bots assemble BioBrick parts so you don't have to.



UK National Physical Lab Researches Femtosecond Lasers for Formation Flying in Space


The National Physical Laboratory (NPL) has helped to establish that femtosecond comb lasers can provide accurate measurement of absolute distance in formation flying space missions.

Formation flying missions involve multiple spacecraft flying between tens and hundreds of metres apart, which autonomously control their position relative to each other. The benefit of such missions is they can gather data in a completely different way to a standard spacecraft – the formation can effectively act as one large sensor.

Measuring absolute distance between the formation spacecraft is critical to mission success. Femtosecond comb lasers are an accurate way of making such measurements. The lasers emit light with very short pulses – each lasting just a few femtoseconds (a femtosecond is one billionth of one millionth of a second). The short pulses allow time of flight measurements to be used to determine distance to a few microns.
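The micron-level claim follows directly from the speed of light: a timing uncertainty of tens of femtoseconds corresponds to microns of path length. A sketch, assuming a simple round-trip time-of-flight model and an illustrative 20 fs timing resolution:

```python
# Distance from round-trip time of flight: d = c * t / 2.
# A few tens of femtoseconds of timing resolution maps to a few microns.
C = 299_792_458.0  # speed of light, m/s

def distance_m(round_trip_s):
    """Distance to a reflecting target from the round-trip pulse time."""
    return C * round_trip_s / 2.0

timing_resolution = 20e-15  # 20 fs (illustrative figure)
resolution_um = distance_m(timing_resolution) * 1e6
print(f"Range resolution: {resolution_um:.1f} microns")  # ~3 microns
```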

For example, in the proposed International X-ray Observatory mission, due to launch after 2020, it is thought that the 25 metre spacecraft will require highly accurate measurement of the absolute distance between the front and back of the spacecraft because the craft's body will be flexible.

For the X-ray images to stay in focus, the position and orientation of the mirror at one end will have to be known, and controlled, to roughly 300 microns in length and 10 arc seconds in angle. Otherwise the telescope will not be able to resolve an image and the mission would fail.




The other challenge in formation flying is achieving the formation itself, which is done once the spacecraft reach the appropriate region of space. The spacecraft orient themselves in relation to each other by plotting their positions relative to known stars, and then establish their lateral positions via laser pointers. Once the formation is established, it can be maintained via highly accurate absolute length measurements between the spacecraft.

These kinds of missions could answer a lot of the 'big questions' in astronomy and cosmology – like "is general relativity correct?", "how did the universe develop following the Big Bang?", and "Where do all the magnetic fields in the universe come from?"

Another mission, called LISA (Laser Interferometer Space Antenna), is being planned to look for gravitational waves. This will involve three craft flying approximately 5 million km apart. In this case it is not necessary to know the absolute distance between the craft; instead, extremely small changes in their separation on timescales of 10 seconds to 10,000 seconds could be a sign that gravitational waves have been detected.

Such missions stand to benefit from this project and the use of femtosecond combs, and a number of groups worldwide are developing other systems that use this technology for distance measurement.


October 01, 2009

Winterberg's Advanced Deuterium Fusion Rocket Propulsion for Manned Deep Space Missions

Winterberg's design obtains high thrust with a high specific impulse through propulsion by deuterium micro-bombs, and he shows that these micro-bombs can be ignited by intense GeV proton beams, generated in space by using the entire spacecraft as a magnetically insulated billion-volt capacitor. The design could achieve an exhaust velocity of 6.3% of the speed of light, and a multi-stage fusion rocket with exhaust at that speed could reach 20% of the speed of light.
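The 20%-of-lightspeed figure can be sanity-checked with the classical Tsiolkovsky rocket equation. This is a non-relativistic sketch (at these speeds relativistic corrections are still modest), using the 6.3% c exhaust velocity quoted above:

```python
import math

# Tsiolkovsky rocket equation: delta_v = v_exhaust * ln(m0 / m_final).
# Non-relativistic sketch; at 0.2c the relativistic correction is modest.
C = 299_792_458.0        # speed of light, m/s
v_ex = 0.063 * C         # exhaust velocity quoted for the deuterium design
delta_v = 0.20 * C       # target cruise speed for the multi-stage craft

mass_ratio = math.exp(delta_v / v_ex)
print(f"Required initial/final mass ratio: {mass_ratio:.0f}")  # ~24
```

A mass ratio around 24 is large but plausibly achievable across several stages, which is why the article specifies a multi-stage rocket for the 20% c goal.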

Winterberg also describes how deuterium micro-bombs could launch payloads from the Earth: a total of 100 kilotons of small, fallout-free nuclear micro-bombs could launch a 1000 ton spacecraft that is mostly cargo. Winterberg developed the basic principle of the global positioning system, and his work was the basis for the Project Daedalus design.
Project Daedalus was a study conducted between 1973 and 1978 by the British Interplanetary Society to design a plausible interstellar unmanned spacecraft. A major stimulus for the project was Friedwardt Winterberg's fusion drive concept for which he received the Hermann Oberth gold medal award.


* Project Daedalus Study Group: A. Bond et al., Project Daedalus – The Final Report on the BIS Starship Study, JBIS Interstellar Studies, Supplement 1978
* F. Winterberg, "Rocket propulsion by thermonuclear microbombs ignited with intense relativistic electron beams", Raumfahrtforschung 15, 208-217 (1971).
* Winterberg is Hermann Oberth Gold Medalist,Physics Today, December 1979


Friedwardt Winterberg at wikipedia

Winterberg is well-respected for his work in the fields of nuclear fusion and plasma physics, and Edward Teller has been quoted as saying that he had "perhaps not received the attention he deserves" for his work on fusion.

His current research is on the "Planck Aether Hypothesis", "a novel theory that explains both quantum mechanics and the theory of relativity as asymptotic low energy approximations, and gives a spectrum of particles greatly resembling the standard model. Einstein's gravitational and Maxwell's electromagnetic equations are unified by the symmetric and antisymmetric wave mode of a vortex sponge, Dirac spinors result from gravitationally interacting bound positive-negative mass vortices, which explains why the mass of an electron is so much smaller than the Planck mass. The phenomenon of charge is for the first time explained to result from the zero point oscillations of Planck mass particles bound in vortex filaments."

In 2008, Winterberg criticized string theory and pointed out the shortcomings of Einstein's general theory of relativity because of its inability to be reconciled with quantum mechanics at the Physical Interpretations of Relativity Theory conference and published his findings in Physics Essays

Back in 1963, it was proposed by Winterberg that the ignition of thermonuclear micro-explosions, could be achieved by an intense beam of microparticles accelerated to a velocity of 1000 km/s. And in 1968, Winterberg proposed to use intense electron and ion beams, generated by Marx generators, for the same purpose. Most recently, Winterberg has proposed the ignition of a deuterium microexplosion, with a gigavolt super-Marx generator, which is a Marx Generator driven by up to 100 ordinary Marx generators

An earlier Winterberg paper on micro fusion space propulsion was covered here.

Deuterium from an Asteroid or Comet

Deuterium can be extracted from water with relative ease in three steps:
1. Water is electrolytically split into hydrogen and oxygen.
2. The hydrogen gas composed of H2 and HD is cooled down until it liquifies, whereby the heavier HD is separated by the force of gravity from the lighter H2.
3. The newly produced HD is heated up and passed through a catalyst, splitting HD into H2 and D2, according to the equation:
2HD → H2 + D2
Since the gravitational field on the surface of a comet or small planet, from where the D2 shall be extracted, is small, the apparatus separating the liquid HD from H2 must be set into rapid rotation.

The comparatively small amount of energy needed for the separation can ideally be drawn from a ferroelectric capacitor (for example a barium-titanate capacitor with a dielectric constant ε ≈ 5000), to be charged up to many kilovolts by a small fraction of the electric energy drawn from the deuterium fusion explosions through a magneto hydrodynamic loop. One can also draw this energy from a small on-board nuclear reactor requiring only a small radiator, slowly charging the capacitor. Alternatively, one may store the needed energy in the magnetic field of a superconductor.

Launching into Orbit

Winterberg looks at two (non-chemical) possibilities:
1. A laser driven by a high explosive, powerful enough to ignite a DT micro-explosion, which in turn can launch a thermonuclear detonation in deuterium.
2. The second possibility is more speculative: It is the conjectured existence of chemical keV superexplosives. These are chemical compounds formed under high pressure, resulting in keV bridges between inner electron shells, able to release intense bursts of keV X-rays, capable of igniting a DT thermonuclear reaction, which in turn could by propagating burn ignite a larger deuterium detonation.



For the realization of the first possibility, one may consider pumping a solid argon rod with a convergent cylindrical shock wave driven by a high explosive. With the argon rod placed at the center of convergence, where the temperature reaches 90,000 K, the upper ultraviolet laser level of the argon is populated and remains frozen in during the rod's subsequent rapid radial expansion. The energy thus stored in the upper laser level can then be extracted from the rod in one pass by a small Q-switched laser as a powerful laser pulse, to be optically focused onto a thermonuclear target.

The idea is to use a replaceable laser for the ignition of each nuclear explosion, with the laser material thereafter becoming part of the propellant. The Los Alamos scientists had proposed to use an infrared carbon dioxide (CO2) or chemical laser for this purpose, but this idea does not work, because the wavelength is too long and therefore unsuitable for inertial confinement fusion. I had suggested an ultraviolet argon ion laser instead. However, since argon ion lasers driven by an electric discharge have a small efficiency, I had suggested a quite different way of pumping them.



There the efficiency can be expected to be quite high. It was proposed to use a cylinder of solid argon, surrounded by a thick cylindrical shell of high explosive. If the shell is detonated simultaneously from outside, a convergent cylindrical shockwave is launched into the argon. For the high explosive one may choose hexogen, with a detonation velocity of 8 km/s. In a convergent cylindrical shockwave the temperature rises as r^-0.4, where r is the distance from the axis of the cylindrical argon rod. If the shock is launched from a distance of ~1 m onto an argon rod with a radius of 10 cm, the temperature reaches 90,000 K, just right to excite the upper laser level of argon. Following its heating to 90,000 K, the argon cylinder radially expands and cools, with the upper laser level frozen into the argon. This is similar to a gas dynamic laser, where the upper laser level is frozen in the gas during its isentropic expansion through a Laval nozzle. To reduce depopulation of the upper laser level by super-radiance during the expansion, one may dope the argon with a saturable absorber, acting as an "antiknock" additive. In this way megajoule laser pulses can be released within 10 nanoseconds. A laser pulse from a small Q-switched argon ion laser placed in the spacecraft can then launch a photon avalanche in the argon rod, igniting a DT micro-explosion.
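Taking the quoted r^-0.4 scaling at face value, the convergence from ~1 m down to the 10 cm rod amplifies the shock temperature by about 2.5x, so the explosive only needs to launch a roughly 36,000 K shock to reach 90,000 K at the rod. A sketch using the article's numbers:

```python
# Temperature amplification in a convergent cylindrical shock, T ~ r**-0.4
# (scaling as quoted in the text; numbers are the article's, illustrative).
r_launch = 1.0     # m, radius where the explosive drives the shock
r_rod = 0.10       # m, radius of the solid argon rod
T_target = 90_000  # K, needed to populate the upper argon laser level

amplification = (r_rod / r_launch) ** -0.4   # = 10**0.4, about 2.5
T_launch = T_target / amplification
print(f"Amplification {amplification:.2f}x; launch temperature ~{T_launch:,.0f} K")
```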

For the realization of the second possibility, one would have to subject suitable materials to very high pressure. These energetic states can only be reached if during their compression the materials are not appreciably heated, because such heating would prevent the electrons from forming the bridges between the inner electron shells.

Employing the Teller-Ulam configuration, by replacing the fission explosive with a DT micro-explosion, one can then ignite a much larger DD explosion.
As an alternative one may generate a high current linear pinch discharge with a high explosive driven magnetic flux compression generator. If the current I is of the order I = 10^7A, the laser can ignite a DT thermonuclear detonation wave propagating down the high current discharge channel, which in turn can ignite a much larger pure DD explosion.

If launched from the surface of the earth, one has to take into account the mass of the air entrained in the fireball. The situation resembles a hot gas driven gun, albeit one of rather poor efficiency.

Setting v = 10 km/s = 10^6 cm/s, roughly the escape velocity from the Earth, one finds that N ≥ 10. Assuming an efficiency of 10%, about 100 kilotons of explosive yield would be needed to launch a 1000 ton ship into orbit.

Neutron entrapment in an autocatalytic thermonuclear detonation wave is a means to increase the specific impulse and to solve the large radiator problem. The maximum exhaust velocity becomes 6.3% of light speed.

Winterberg Compares Super Marx Deuterium Fusion Against Laser Fusion-fission Hybrid Concept

Winterberg compares his Super Marx generator pure deuterium micro-detonation ignition concept to the Lawrence Livermore National Ignition Facility (NIF) laser DT fusion-fission hybrid concept (LiFE).

In a Super Marx generator, a large number of ordinary Marx generators charge up a much larger second-stage ultra-high-voltage Marx generator, from which an intense GeV ion beam can be extracted for the ignition of a pure deuterium micro-explosion. A typical example of the LiFE concept has a fusion gain of 30 and a fission gain of 10, making a total gain of 300, with about 10 times more energy released in fission than in fusion. This means a substantial release of fission products, as in fusion-less pure fission reactors. In the Super Marx approach for the ignition of a pure deuterium micro-detonation, a gain of the same magnitude can in theory be reached. If feasible, the Super Marx generator deuterium ignition approach would make lasers obsolete as a means for the ignition of thermonuclear micro-explosions.













Up until now nuclear fusion by inertial confinement has only been achieved by using a fission explosive as a means (driver) for ignition. This is true not only for large thermonuclear explosive devices, like the 1952 pure deuterium Mike Test (carried out in the South Pacific with the Teller-Ulam configuration), but also for small deuterium-tritium (DT pellet) micro-explosions (experimentally verified with a fission explosive at the Nevada Test Site by the Centurion-Halite experiment). From this experience we know that ignition is easy with sufficiently large driver energies, but such energies are difficult to duplicate with lasers or electric pulse power.



Winterberg believes substantially larger driver energies can be reached with a "Super Marx generator". It can be viewed as a two-stage Marx generator, in which a bank of ordinary Marx generators assumes the role of the first stage. If the goal is the much more difficult ignition of a pure deuterium micro-explosion, the Super Marx generator must, in addition to delivering a much larger amount of energy (compared to the energy of the most powerful lasers), also generate a magnetic field in the thermonuclear target strong enough to entrap the charged DD fusion products within the target. Only then is the condition for propagating thermonuclear burn fulfilled. For this to happen, a 100 MJ, 1 GeV, 10^7 Ampere proton beam is needed.

With an ignition energy of 100 MJ and a yield of 23 GJ, the fusion gain would be G = 230, about the same as for the LiFE concept. However, since even in a pure deuterium burn neutrons are released through the secondary combustion of the tritium produced by the D-D fusion reaction, a much higher overall gain is possible with an additional fission burn, as in the LiFE concept.
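The gain figures are simple ratios of yield to driver energy; a quick check using the numbers quoted above:

```python
# Fusion gain as yield / driver energy, using the article's figures.
super_marx_gain = 23e9 / 100e6   # 23 GJ yield from 100 MJ ignition energy
life_total_gain = 30 * 10        # LiFE: fusion gain 30 x fission gain 10
print(super_marx_gain, life_total_gain)  # 230.0 300
```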



Other possibilities
The energy of up to a gigajoule, delivered in ~10^-7 seconds at a power of 10^16 Watt, opens up other interesting possibilities.

1. If instead of protons, heavy ions are accelerated with such a machine at gigavolt potentials, these ions will upon impact be stripped of all their electrons, in the case of uranium all of its 92 electrons. Accordingly this would result in a 92-fold increase of the beam current to ~10^9 Ampere. With such an ultrahigh current, a very different fusion target seems possible, in which a solid deuterium rod is placed inside a hollow metallic cylinder. The inner part of the beam, I1, passes directly through the deuterium inside the cylinder, while the outer part of the beam, I0, is stopped in the cylindrical shell, depositing its energy there and imploding the shell onto the deuterium cylinder, at the same time compressing the azimuthal magnetic beam field inside the cylinder. Ignition at the position where the beam hits the cylinder then leads to a deuterium detonation wave propagating down the cylinder.

2. A beam current of ~10^9 Ampere will also produce a large, inward-directed magnetic pressure. At a beam radius of 0.1 cm, the magnetic field will be of the order of 2×10^9 Gauss, with a magnetic pressure of 10^17 dyn/cm^2 ≈ 10^11 atmospheres. At these high pressures the critical mass of fissile material (U235, Pu239, and U233) can be reduced to ~10^-2 g. This would make possible micro-fission explosion reactors without the meltdown problem of conventional fission reactors.

3. In general, the attainable very high pressures would have many interesting applications. One example is the release of fusion energy from exotic nuclear reactions, like the pB11 neutron-free fusion reaction, conceivably possible under very high pressures.
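The field and pressure quoted in point 2 follow from the standard pinch formulas — numerically B[gauss] ≈ 0.2·I[amps]/r[cm] in Gaussian units, with magnetic pressure B²/8π. A sketch using the article's values:

```python
import math

# Magnetic field at the surface of a pinch: B[gauss] ~ 0.2 * I[amps] / r[cm],
# and magnetic pressure p = B**2 / (8*pi) in dyn/cm^2 (Gaussian units).
I = 1e9   # beam current, amperes
r = 0.1   # beam radius, cm

B = 0.2 * I / r                # ~2e9 gauss
p_dyn = B**2 / (8 * math.pi)   # ~1.6e17 dyn/cm^2
p_atm = p_dyn / 1.013e6        # 1 atm ~ 1.013e6 dyn/cm^2
print(f"B = {B:.1e} G, p = {p_dyn:.1e} dyn/cm^2 = {p_atm:.1e} atm")
```

Both results land within a factor of two of the order-of-magnitude figures in the text (~10^17 dyn/cm^2, ~10^11 atm).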

Winterberg Conclusion

The fusion/fission LiFE concept proposed by the Livermore National Laboratory is an outgrowth of the DT laser ignition project pursued at the National Ignition Facility, where ignition is expected in the near future. With its large fission component, it is difficult to see how LiFE can compete with conventional fission reactors; like them, it still has the fission-product nuclear waste problem. For this reason it can hardly solve, without fission, the national energy crisis for which it has been billed by California Governor Schwarzenegger.

The proposed Super Marx concept is by comparison a much more ambitious project, because it recognizes that the fundamental problem of inertial confinement fusion is the driver energy, not the target, and that only with order-of-magnitude larger driver energies can real success be expected. This is particularly true if the goal is to burn deuterium. Unlike deuterium, which is abundantly available everywhere, the burn of deuterium-tritium depends on the availability of lithium, a comparatively rare element. In conclusion, we display a table comparing the two different concepts.











Lancet Researchers Say Aging Process is Modifiable

Physorg reports that in an article to be published Friday in the Lancet, a prestigious medical journal, researchers write that the process of aging may be "modifiable." The Lancet is saying there is no fixed ceiling on human longevity.

Lancet Abstract: Ageing populations: the challenges ahead

If the pace of increase in life expectancy in developed countries over the past two centuries continues through the 21st century, most babies born since 2000 in France, Germany, Italy, the UK, the USA, Canada, Japan, and other countries with long life expectancies will celebrate their 100th birthdays. Although trends differ between countries, populations of nearly all such countries are ageing as a result of low fertility, low immigration, and long lives. A key question is: are increases in life expectancy accompanied by a concurrent postponement of functional limitations and disability? The answer is still open, but research suggests that ageing processes are modifiable and that people are living longer without severe disability.


Their analysis of data from more than 30 developed countries revealed that death rates among people older than 80 are still falling. In 1950, the likelihood of survival from age 80 to 90 was 15 percent to 16 percent for women and 12 percent for men, compared with 37 percent and 25 percent, respectively, in 2002.

"The linear increase in record life expectancy for more than 165 years does not suggest a looming limit to human lifespan. If life expectancy were approaching a limit, some deceleration of progress would probably occur. Continued progress in the longest living populations suggests that we are not close to a limit, and further rise in life expectancy seems likely," Kaare Christensen, of the Danish Aging Research Center at the University of Southern Denmark, and colleagues wrote.


Most babies born in rich countries this century will eventually make it to their 100th birthday, new research says. Danish experts say that since the start of the 20th century, people in developed countries have gained about three decades of life expectancy. Surprisingly, the trend shows little sign of slowing down.


James Vaupel of the Max Planck Institute in Germany and colleagues in Denmark examined studies published globally in 2004-2005 on numerous issues related to aging. They found life expectancy is increasing steadily in most countries, even beyond the limits of what scientists first thought possible. In Japan, for instance, which has the world's longest life expectancy, more than half of the country's 80-year-old women are expected to live to 90.

"Improvements in health care are leading to ever slowing rates of aging, challenging the idea that there is a fixed ceiling to human longevity," said David Gems, an aging expert at University College London. Gems was not connected to the research, and is studying drugs that can lengthen the life span of mice, which may one day have applications for people.

"Laboratory studies of mice, including our own, demonstrate that if you slow aging even just a little, it has a strong protective effect," he said. "A pill that slowed aging could provide protection against the whole gamut of aging-related diseases."


Other Longevity Research

At Fightaging
back in June, I pointed out a longevity mutation that only extends healthy life span in male mice. By way of a bookend to that discovery, here is a mutation that extends healthy life span by 20% or so in female mice only.




Researchers have identified a genetic tweak that can slow aging in mice:

Caloric restriction has long been known to extend lifespan and reduce the incidence of age-related diseases in a wide variety of organisms, from yeast and roundworms to rodents and primates. Exactly how a nutritionally complete but radically restricted diet achieves these benefits has remained unclear. But recently several studies have offered evidence that a particular signaling pathway, involving a protein called target of rapamycin (TOR), may play a pivotal role. This pathway acts as a sort of food sensor, helping to regulate the body's metabolic response to nutrient availability.
Withers and colleagues noticed that young mice with a disabled version of the protein S6 kinase 1 (S6K1), which is directly activated by TOR, bore strong resemblance to calorie-restricted mice: they were leaner and had greater insulin sensitivity than normal mice.

The more recent results in S6K1 knockout mice are one small part of the flurry of research into the biochemistry of calorie restriction. Scientists are racing to explore pathways and mechanisms gene by gene and protein by protein, seeking the best place to intervene using designed drugs. The goal is to capture all the benefits of calorie restriction, or even do better, whilst minimizing or eliminating unwanted side-effects. Give it another ten years and the new scientific industry of metabolic manipulation will rival that of stem cell research, I'd wager. It certainly seems set for that sort of growth, starting from calorie restriction biochemistry and working its way outward.


In a Bad Financial Year Some Progress Against Poverty


The World Bank projects that the number of people living in extreme poverty (on less than $1.25 a day) will fall slightly in 2009, declining from 1,203 million in 2008 to 1,184 million. The share of the world’s population living in extreme poverty is also expected to decline by a small amount, from 21.3 percent in 2008 to 20.7 percent in 2009.

But the economic crisis is slowing recent progress in reducing the number of people around the world living in extreme poverty. The projected 0.6 percentage point reduction in the poverty rate for 2009 is a significant reduction from the 1.3 percentage point average annual decline experienced during the previous three years. All told, the World Bank projects that the global recession will cause anywhere from 55 million to 90 million more people to remain in poverty in 2009 than would otherwise have been the case.
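A quick consistency check of the World Bank figures quoted above — both the decline in headcount and the 0.6 percentage point drop follow from the cited numbers:

```python
# Consistency check of the World Bank poverty figures quoted above.
headcount_2008, headcount_2009 = 1203, 1184  # millions in extreme poverty
rate_2008, rate_2009 = 21.3, 20.7            # percent of world population

drop_millions = headcount_2008 - headcount_2009
drop_pp = round(rate_2008 - rate_2009, 1)
print(drop_millions, drop_pp)  # 19 million fewer; 0.6 percentage points
```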



Technological and economic growth in Africa

Maximizing Economic Growth and Aid to the Poor

Anti-poverty devices for smart phones

The case for continuing to search for technological solutions to world problems.

Nvidia Fermi and the Oak Ridge GPGPU Supercomputer



HPC Wire speculates that the Oak Ridge National Lab Nvidia supercomputer will be ready in 2011 with 20-25 petaflops. The longer-term plan calls for an exaflop supercomputer by 2019.

ORNL's Fermi machine will be built by Cray. At the "Breakthroughs in High Performance Computing" session on Wednesday evening, Cray CTO Steve Scott basically gave Fermi the seal of approval for its use in high-end supercomputers. The new features that made that possible: ECC, a lot more DP performance, a unified address space, and support for concurrent kernels. Cray intends to add the upcoming GPUs in next year's new XT line (XT6?). Scott said the Fermi chips will be integrated into Cray's SeaStar interconnect, presumably cohabiting with AMD Opteron hardware.




The Future of CAD in 2019 as Predicted by SolidWorks

Guest article by Joseph Friedlander

Your occasional correspondent made a special trip to Tel Aviv (I live out in the sticks in Israel) to the Systematics-hosted (http://www.systematics.co.il/English/about.html) Tel Aviv SolidWorks 2010 show. This is the one show I try to go to every year to keep up with the tools of the industrial design field. I went with a design teacher whose student (from 30 years ago!) gratifyingly won an award in the modeling competition for a totally automated cable-making line, a model of impressive detail, down to the fastener level.

For those unfamiliar with CAD (computer aided design) Solidworks is a leading solid modeling program. It features parametric design, which essentially gives priority to relations between features and the constraints binding them. By defining more and more qualities, the design becomes more constrained—and more detailed. (Not just a representation like a picture taken from one angle only, but a solid model of the piece part being worked on is built up in the program’s internal modelspace. Views are then generated or rendered from this for the user to see.) A comparable program is Autodesk Inventor.
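The parametric idea — dimensions defined by relations, so that a single edit propagates through the whole model — can be conveyed with a toy sketch. This is purely illustrative Python, not the SolidWorks API; the `Plate` class and its constraints are hypothetical.

```python
# Minimal flavor of parametric CAD: dimensions are relations, not numbers.
# Change the driving parameter and every dependent feature updates.
# (A toy sketch; real parametric modelers like SolidWorks solve general
# constraint systems, and this is not their API.)
class Plate:
    def __init__(self, width):
        self.width = width          # the driving dimension

    @property
    def height(self):               # constrained: height = width / 2
        return self.width / 2

    @property
    def hole_spacing(self):         # constrained: spacing = 80% of width
        return 0.8 * self.width

plate = Plate(width=100.0)
print(plate.height, plate.hole_spacing)  # 50.0 80.0
plate.width = 120.0                      # edit one parameter...
print(plate.height, plate.hole_spacing)  # 60.0 96.0 -- dependents follow
```

Defining more such relations is what the article means by the design becoming "more constrained — and more detailed."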

The reason CAD is important to the future is that the ability to store manufacturing capability through detailed plans that can drive CAM (computer aided manufacturing) devices can be a big part of what Brian Wang has labeled the ‘Mundane Singularity’—changes that trigger cascading economic growth, doing more with less, improving economic outputs from the same inputs. In a phrase: Exponential productivity.



Those who remember seeing industrial films with about an acre of white shirted, narrow tied engineers and draftsmen drawing some of the 50 tons of blueprints it took to make a major weapons system may be surprised to learn how much is possible in house with a team of two or three people nowadays.

Someone with a big software and small hardware budget today (say, the cost of a secretary for a year) can run tests on virtual prototypes with design-validation tools; do motion studies, including realistic movement of links and couplings; run virtual drop tests; perform iterative design optimization, fatigue, shape, and seismic tests; do fluid-flow simulation and thermal analysis ("Finite Element Analysis"); route wire buses and packages throughout and between major assemblies ("Piping, Tubing, plus Wiring and Harness layout"); and draw on a considerable stock of pre-modeled parts.

“SolidWorks Toolbox is a library of predefined fasteners, gears, cams, pins and other accessories, based on information found in Machinery's Handbook.”

Other libraries are made by other vendors, but the rule is, if you don’t have to draw it, it saves time (and presumably since the model is from the vendor, always right). Carried to an extreme, in the future, for some designs one need only assemble, not draw, at least in theory…

Jeff Ray, CEO of SolidWorks (http://www.solidworks.com/), more formally Dassault Systèmes SolidWorks Corp., gave an entertaining talk during the show about the future of CAD in 2019. (Note that many of these features are described in my own terms below, owing to the darkness of the hall and the pace of the talk.)
· Touchscreen interface
· lossless data management
· seamless movement of data between mesh and NURBS (http://en.wikipedia.org/wiki/NURBS -- non-uniform rational B-spline, a mathematical model commonly used in computer graphics) environments; in essence, surface vs. solid modeling
· Model mimics VR in many cases
· Cool range of data appliances, all wireless, to move and design on: PDA / thin flatbook / 30-inch virtual desktop (all including virtual keyboards)
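For readers curious what NURBS actually compute, here is a minimal pure-Python evaluator for a NURBS curve using the standard Cox-de Boor recursion (an illustrative sketch, not any CAD package's internals):

```python
# Minimal NURBS curve evaluator via the Cox-de Boor recursion.
# Illustrative sketch only, not tied to any CAD package.

def basis(i, p, u, knots):
    """B-spline basis function N_{i,p}(u) for the given knot vector."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, degree, ctrl, weights, knots):
    """Point on the curve; u must lie in the half-open domain [knots[degree], knots[-degree-1])."""
    nx = ny = den = 0.0
    for i, ((cx, cy), w) in enumerate(zip(ctrl, weights)):
        b = basis(i, degree, u, knots) * w
        nx += b * cx
        ny += b * cy
        den += b
    return nx / den, ny / den

# A quadratic curve with a clamped knot vector; the middle control point
# carries weight 2, which pulls the curve toward it.
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
weights = [1.0, 2.0, 1.0]
knots = [0, 0, 0, 1, 1, 1]
px, py = nurbs_point(0.5, 2, ctrl, weights, knots)
print(px, py)   # (1.0, 1.333...): with all weights 1 the midpoint y would be 1.0
```

The weights are what make NURBS "rational": they let the same control polygon describe shapes (like exact circular arcs) that plain B-splines cannot.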

Developments along ease of use lines from Solidworks Labs


Primitive beginnings: the Computer Aided Design Laboratory, Department of Mechanical Engineering at MIT, in 1982. Notice the monochrome screen. Yet this was future-cool at the time.



A previous article by Jeff Ray about the future of CAD

· Latest and greatest tools in real time. (Seamless upgrades)
· Workflows, special techniques and master workarounds particular to CAD software but not to engineering ('CAD overhead') will disappear
· Every operation on every part or assembly will be automatically recorded, saved and preserved.
· Access a design from any place, from any device, at any time and do what you need to do with it
· Simulation will become one and the same as design. As you create products, CAD will run FEA, (Finite element analysis—stress testing) cost analysis, manufacturability testing, motion simulations and more.
· Large assemblies will open, photorealistic images will render, and simulations will complete in real time as perceived by humans
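The prediction that simulation will merge with design can be made concrete with the smallest possible finite-element example (all material and load values below are assumed for illustration): a steel bar fixed at one end, split into two axial elements, and pulled at the free end.

```python
# Smallest possible finite-element stress analysis: an axial bar fixed at
# node 0, modeled as two equal elements, pulled with force F at node 2.
# Material and load values are assumed for illustration.
E = 200e9    # Young's modulus, Pa (steel, assumed)
A = 1e-4     # cross-sectional area, m^2 (assumed)
L = 1.0      # element length, m
F = 1000.0   # tensile load at the free end, N

k = E * A / L   # axial stiffness of one element, N/m

# Assemble the 3x3 global stiffness matrix from the two element matrices.
K = [[0.0] * 3 for _ in range(3)]
for e in range(2):
    K[e][e]         += k
    K[e][e + 1]     -= k
    K[e + 1][e]     -= k
    K[e + 1][e + 1] += k

# Node 0 is fixed (u0 = 0); solve the reduced 2x2 system with load
# vector [0, F] by Cramer's rule.
a, b = K[1][1], K[1][2]
c, d = K[2][1], K[2][2]
det = a * d - b * c
u1 = (d * 0.0 - b * F) / det   # displacement at the midpoint = F/k
u2 = (a * F - c * 0.0) / det   # displacement at the free end = 2F/k

sigma = E * (u1 - 0.0) / L     # stress in element 1 = E * strain = F/A
print(u1, u2, sigma)
```

Today this assemble-and-solve step is a separate analysis run; Ray's prediction is that it happens continuously, every time a dimension changes.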


Key predictions are similar to those in Microsoft's future vision of manufacturing:



touch pads, tables, and surfaces everywhere, all wireless, all seamless; e-paper able to take sketch scans and transfer them to any device (amazingly cleaned up, a cynic might note, into a 3D model that today would take minutes at best to create, albeit in 'sketchy' lineforms)

SolidWorks has over a million licenses in the field, the majority of them educational and a large minority industrial.

I took a moment to meet and thank Mr. Ray after his presentation, and offered this suggestion: Make a special non-time limited version of Solidworks to generate part geometry, and allow it to save in Google Sketchup format. Forget all the features that would tempt people to try to game the system, and just allow part creation and assembly. It shouldn’t even print, or save in Solidworks format—just Sketchup.
Why? The vast user base of SketchUp, most of whom are not SolidWorks users, and many of whom are of student age.
It's hard to beat free as a price (Google SketchUp).

(There is a premium version of Sketchup available, but with a comparatively tiny user base)

The vast 3d Warehouse Library

Parts catalogs are now also available within it: over 100 million parts.

"SketchUp users are now just a mouse click away from more than 100 millions of standard parts coming from leading suppliers: Asco Joucomatic Numatics, Assfalg, Atlanta, Bosch Rexroth, Boutet, Burster, Cepex, Chambrelan, Contrinex, CSR, Dirak, Drumag, Elitec, ENOMAX, Enzfelder, Euchner, Expert, Festo, Ganter, Genustech, Gerwah, GMT, Halder, Hervieu, HP Systems, Hydropa, IFM, Igus, INA/FAG, Item, ITV, Kabelschlepp, Kinetic, Legrand, Legris, L'Etoile, Mädler, Mayr, Mecalectro, Misumi, Norelem, Norgren, Normydro, Nozag, Pinet, Progressus, Quiri, Rabourdin, Rodriguez, Rohde, Römheld, Rötelmann, Rud, Sapelem, Schmalz, Schmersal, Siam-Ringspann, Sick, SNR, Socafluid, Somex, Stauffenberg, Ströter, Stüwe, Suhner, Sumer, Telemecanique, Trelleborg, Wefapress, Winkel, Zimm and more."

The idea is that students need many hours of practice to learn the construction and assembly of subcomponents into models, but this way they could be training for jobs in industry using SolidWorks, and would be far more employable to manufacturers than a novice because they could cut their on-the-job training down to the advanced commands.

Issues with the way SketchUp handles data (again, surfaces vs. solid modeling):

In user interface lingo, SketchUp is much more "modal" than DesignWorkshop. The SketchUp user has to put the software into the correct state, by selecting the specific right tool, in order to perform any task. SketchUp provides some nifty enhancements to the classic surface modeling operations, such as connecting faces so at times they move and stretch together. But with the surface-based approach, SketchUp is playing catch up to what's almost automatic in the direct manipulation of solids.

Google SketchUp 7 wants to shape you into a 3D artist

Aside from drawing, you can also access Google's 3D warehouse, which allows you to search for 3D models while in the software and place them into your creation. The sheer number of models is impressive. You can choose from people to buildings to cities to just about anything. I searched for a dog to place in my model and the 3D warehouse returned almost 2,000 results. Simply put, you'll be able to find almost any object without much trouble… SketchUp 7 does ensure that it's easy to take and attribute credit for important creations by acknowledging the designer when the models are shared. For simple dog designs, that probably won't matter much. But for professionals creating 3D models to show to clients or to show off their ability, the credit feature becomes an important part of using the product, especially as the 3D Warehouse grows.

The incentive to tempt students into learning SolidWorks would be the chance to earn a reputation as a great modeler in the Google 3D Warehouse. It would be something to cite on a kid's resume, an online portfolio that could get a kid hired.

The reason for SolidWorks to turn their student version into a free version that could not print but only save to the SketchUp file format would be to increase their share of future modelers WITHOUT cannibalizing their future sales; real corporate SolidWorks users would not want to mess with file conversions of uncertain thoroughness and accuracy, and could not print or share their work as SolidWorks files.

Now, it is an open secret that many software companies turn a blind eye to a certain amount of copying of their (expensive) products by students. Some say it increases market share and the base of future paying users. My own reaction is that this posture teaches kids to be sneaks.



The Wealth and Poverty of Nations: Why Some Are So Rich and Some So Poor, by David S. Landes. Boston: Little, Brown, $55.95.

Landes gives a quotable prescription for prosperity on a micro as well as a macro scale: "work, thrift, honesty, patience, tenacity." Doing a good design on computer, like programming, inculcates all those qualities, except possibly honesty, if the price of admission is an illicit copy. Thus my suggestion.

It would be far better not to tempt people into making illegal copies to help students out. One way around this dilemma is the solution outlined above: a free version of SolidWorks that cannot print, and cannot save except in SketchUp format.

Those who have spoken to me know my economic predictions for the current crisis, which may be briefly expressed as (for many but not all countries): Seven bad years. In times to come, kids will need as many ways to build their salable skills as possible, and cheap and universal CAD access that builds their reputation at the same time as enriching the world’s library of open source design seems a win-win solution.

Seven bad years for everyone? No, not for everyone. Those who produce a real product or service that cannot be done without—both those conditions apply—have a good chance to make it.

A real product or service that CAN be done without is not enough. See the Baltic Dry Shipping Index for details.
"Revealed: The Ghost Fleet of the Recession", from the Daily Mail

Design work, and the time used to generate it, can be like a craftsman making goods for the shelf during a dull market, to sell in a better one. It is an imperfect way to store time while building both value (your own skills) and wealth (the open-source library of designs).

During the Depression, many people waited for the world to improve by designing elements of a better one in their idle time. Many of those designs and inventions found good use in the war and post-war eras. It is not a great solution, but it beats moaning about what cannot be changed while preparing what can be implemented later. Sometimes a rollout delay leads to a better product. Let's make it so in the future we hope to build together.

A Video of Jeff Ray from Sept 2008 Where He Also Talks About Future CAD



Space Elevator Games One Kilometer Tether Finally Ready

The 1000 meter helicopter-supported vertical raceway is now ready and waiting for the space elevator teams. More information on the upcoming schedule is coming your way soon.

Full altitude integrated tests were performed this past weekend.

They performed a series of measurements in order to correlate helicopter positions and lasing angles. The trick is to keep the climber within the allowed 15-degree lasing angle throughout the climb, while at the same time maintaining its separation from the helicopter. Not too steep, not too shallow; and in fact the helicopter needs to drift during the climb, since there is no single position that satisfies all conditions. Given the practice they have had, this was almost trivial to do, and more importantly, since wind conditions will likely be different during the games, they know they can adjust in real time to different cable sags.
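One plausible reading of the 15-degree lasing-angle constraint is simple trigonometry (the numbers and geometry below are my own illustration, not the teams' actual specification): if the beam director sits a horizontal distance x from the tether base and the climber is at height h, the beam makes an angle atan(x/h) with the vertical tether, and that angle must stay inside the cone.

```python
import math

# Illustrative geometry for the lasing-angle constraint (my own reading,
# not the teams' specification): beam director at horizontal offset x_m
# from the tether base, climber at height h_m on the vertical tether.

def beam_angle_from_vertical(x_m, h_m):
    return math.degrees(math.atan2(x_m, h_m))

def within_cone(x_m, h_m, max_deg=15.0):
    return beam_angle_from_vertical(x_m, h_m) <= max_deg

# With the beam director 50 m out, a climber at 100 m is outside the
# 15-degree cone (about 26.6 degrees), but at 400 m it is inside (about 7.1).
print(within_cone(50, 100))   # False
print(within_cone(50, 400))   # True

# Minimum climber height for a given offset: h >= x / tan(15 degrees)
h_min = 50 / math.tan(math.radians(15.0))
print(round(h_min, 1))        # about 186.6 m
```

This is why low altitudes are the hard part of the climb: near the ground the angle blows up, so the helicopter (and hence the tether base) has to drift to keep every height along the climb inside the cone.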



Finally, they did an end-to-end test with battery-powered climbers. Only USST (University of Saskatchewan Space Design Team) and KCSP (Kansas City Space Pirates) had climbers ready to go. KCSP suffered control-related issues and did not have their van full of spare parts with them, so, to Brian's endless misery, they were out of the game. USST was the last climber standing, and on their second try they put the pedal to the metal and completed the 1 km climb with no problems. Meanwhile LaserMotive, who were out with their beam director, confirmed that tracking was feasible within the 15-degree cone mentioned above.





Power Beaming Competition

The Power Beaming challenge will continue to influence public perception of the space elevator project by demonstrating progressively more capable (and more impressive!) prototypes of the space elevator system. By participating, teams get the opportunity to partner in writing this unique chapter of history. The total NASA-provided prize purse is $2,000,000, highlighting its commitment to the development of power-beaming technologies.

In this challenge, Spaceward provides the race track, in the form of a vertically-suspended tether, and the competing teams provide Space Elevator prototypes, featuring climbers that have to scale the tether using only power that is transferred to them from the ground using beamed power.

The climbers' net weight is limited to 50 kg [110 lbs], and they must ascend the ribbon at a minimum speed of 2 m/s [6.6 ft/s] carrying as much payload as possible. A high-performance prize will be awarded to teams that can move at 5 m/s [16.4 ft/s].

Climbers will be rated according to their speed multiplied by the amount of payload they carried, divided by their net weight. For example, a 15 kg climber carrying 10 kg of payload at 2.5 m/s will have a score of 10 × 2.5 / 15 ≈ 1.67.
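The scoring rule is simple enough to encode directly; a quick sketch (the function name is my own):

```python
def climber_score(net_kg, payload_kg, speed_ms):
    # payload times speed, divided by the climber's net (empty) weight
    return payload_kg * speed_ms / net_kg

# The worked example from the rules: a 15 kg climber carrying 10 kg at 2.5 m/s.
print(round(climber_score(15, 10, 2.5), 2))   # 1.67
```

Note the incentive structure: the formula rewards light climbers that haul heavy payloads fast, not raw power, since power is unlimited in the rules.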

Power is unlimited. It is up to the competitors to build the most power dense machine that they can devise. For more comprehensive specifications, please download the formal specs below.


The USST climber was able to make the 1000 meter climb using batteries. To win, they need to make the climb using beamed power, at sufficient speed.