
October 23, 2009

Radical Productivity Improvement


Here is a 48-page presentation on a project called "One Pass to Production: Radical Productivity Improvement". Its goal is to reduce the development time for a new cellphone from 2 years down to one day within 8 years. The presentation was made in 2007 and the project was started in 2001.

One-Pass to Production Summary: (2001)
– Vendor Collaboration for Complete Top-Down Flow
– Tools Interoperability & Distributed Simulation
– Open Modeling Standard & Model Re-Use
– Different layers of model abstraction for different design stages
– System level IP.
– S/W oriented mentality
– Revision control
– Data Analysis



The requirements gathering, modular software and configurable hardware all had to be developed. Systems and processes had to be prepared so that requirements could simply be added and a tested product rapidly realized.

This is an example of the kind of analysis and project that could be developed for radical productivity improvement for all kinds of businesses and products. The goals would be not just massively shortened design time and product realization, but also concurrent retooling of an assembly line and a reduction in the staffing needed to support a business.

The system would need to be flexible enough to enable and encourage improvements like the iPhone, and the addition of compatible devices.


Dwave Systems Adiabatic Quantum Computer

Dwave Systems has a new page with links to scientific papers and presentations describing the adiabatic quantum computer systems they are building. Dwave is currently testing a 128 qubit quantum computer processor.

Overview and history of Dwave

Dwave Processor architecture 38 page pdf

Status
* Have run optimization problems on C1 (8 qubit, 1,500 JJ) single unit cells with on-chip DAC control.
* Have cooled and calibrated two C4 chips (128 qubits, 1,024 DACs, 24,000 JJs).
* Fabrication yield and magnetic fields are the limits for these chips. Ran small optimization problems on one of them.
* Have developed (and are still developing) calibration, synchronization, and operation methods to go from a new chip to running optimization problems.
* Working on understanding the limits to the programming accuracy of h_i and J_ij, and how well we achieve the Ising Hamiltonian.
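
For readers unfamiliar with the Ising form, the optimization problems mentioned above amount to minimizing an energy over binary spin variables using the programmed h_i and J_ij values. Below is a minimal Python sketch of that objective with made-up coefficients; it is purely illustrative and is not D-Wave's software, nor are the numbers real chip parameters.

```python
# Toy Ising objective of the kind the D-Wave hardware is programmed with:
#   E(s) = sum_i h_i * s_i + sum_{i<j} J_ij * s_i * s_j,  with s_i in {-1, +1}
# The h and J values below are arbitrary examples, not real chip parameters.
from itertools import product

h = {0: 0.5, 1: -0.3, 2: 0.2, 3: -0.7}          # local fields h_i
J = {(0, 1): -1.0, (1, 2): 0.6, (2, 3): -0.4}   # couplings J_ij on a small graph

def ising_energy(s):
    """Energy of one spin configuration s (a tuple of +/-1 values)."""
    energy = sum(h[i] * s[i] for i in h)
    energy += sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    return energy

# Brute force is fine for 4 spins; a 128-qubit problem needs heuristics or the hardware.
best = min(product((-1, 1), repeat=len(h)), key=ising_energy)
print("ground state:", best, "energy:", ising_energy(best))
```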




D-Wave rf-SQUID qubit

The qubit we are using in our current designs is a novel type of rf-SQUID device. It is designed to have a lot of control knobs that allow the user to set several different parameters of the devices in situ. This type of control is required in order to be able to operate large numbers of these devices in a scalable way.



59-page pdf on the performance of the Dwave quantum processor

10 page pdf on the fabrication capability of Dwave Systems.
* they are using 200 mm wafers
* they can make 10,000 to 30,000 wafers per month

Carbon nanotubes Costs, Strength and Building things

When people think about building things with carbon nanotubes or with carbon nanotube-enhanced materials, they need to know that there is not much production of carbon nanotubes. Less than 1,000 tons per year of carbon nanotubes are being produced, and most of that is in a form like an unsorted powder. The material that has been woven into threads or sheets will soon reach tens of tons per year, and that macroscale material is not as strong as individual nanotubes. The cost of the material increases as you require higher purity or a particular form of it.

Even materials like Kevlar and regular carbon fiber are only produced at the 10,000 to 100,000 ton per year level. Global raw steel production is about 1.2 billion tons per year and cement is at 2.6 billion tons per year.

There are many different kinds of steel and concrete. There is super-strong geopolymer concrete. Regular concrete can take 1,500-5,000 pounds per square inch, but super-concrete is believed possible that can withstand 50,000 pounds per square inch.

Fiber-reinforced concrete using regular carbon fibers is advancing: long carbon fibers make it 500+ times better at resisting cracking and up to 40% lighter.


There is a table of strength of materials and cost per kilogram.

Current thinking is that carbon nanotubes could reach 20 gigapascals of tensile strength at the macroscale, but currently 9 gigapascals has only been achieved for a few strands. A lot more material has to be made at that strength.

Some groups believe in and are working on nanokites to produce large amounts of carbon nanotubes of unlimited length. Longer carbon nanotubes allow more of their intrinsic strength to be available at the macroscale.

Nanocomp Technologies is making and selling spools of carbon nanotube thread and sheets of carbon nanotubes.



Kevlar has been strengthened by dipping it into carbon nanotubes, reaching about 5.9 gigapascals. These improvements were achieved with only 1-1.75 wt% carbon nanotube content.

In layman's terms, you can think of the strength as multiples of the strength of Kevlar or Zylon.

Currently even powder-like carbon nanotubes cost $45-150/kg, and prices go up from there for purer forms of carbon nanotubes.

Some in Italy and other places are considering building with carbon nanotube enhanced material to make longer bridges or to reinforce old bridges.



CNano Technologies' 500 ton/year factory has received ISO 9001 certification.

Carbon nanotube nanostitching can reinforce sheets of material and reinforce existing materials at their weakest point for airplane skins and other applications.

The Great Restructuring and the Great Innovation Civilization

Information technology is all around us — but how does it really change the way we do business? Erik Brynjolfsson, the Schussel Family Professor at MIT's Sloan School of Management, explores this question in his new book, "Wired for Innovation," written along with Adam Saunders, a lecturer at The Wharton School of the University of Pennsylvania.

* Organizational changes are bigger and more costly than the technology investments themselves.
* New business practices are really driving productivity
* There are many ways companies are adding value to the economy, but it's often showing up as greater consumer value rather than in GDP. (Example: we have more music as consumers but a shrinking music industry)
* The Great Restructuring: a fundamental reorganization of business activity. It's not simply a matter of going back and buying the same things we used to buy. Some of them won't exist any more. One reason I think we're seeing such a lag in hiring and employment is that people aren't simply being hired back into the same old jobs they did before. It's going to take a lot of entrepreneurial activity to figure out the best way of grappling with these problems. I think it's going to happen and in the end the economy will be even more productive than in the past. But the transition is clearly very painful.
* There are different kinds of winners. Companies like Google can be big beneficiaries directly, but there are also people who benefit from Google's free services. We use Google Scholar to check citations and that's a tremendously easier way than what we used to have.


The economic restructurings that we have experienced over the last few decades have been partial restructurings. I think this is part of a long trend similar to the shift from agriculture to industry. Two hundred years ago, 80-98% of the population in most countries was employed in agriculture. Developing whole new industries was not possible without increasing agricultural productivity and shifting and retraining people out of agriculture.

Great Innovation Civilization
There needs to be a large and nearly complete shift to super-innovation and entrepreneurism that is maximally supported with automation.

Today, even in information-based and seemingly creative business segments such as architecture, most of the people who work in the field are not innovating. 90+% of the people in architecture are adapting a few dozen standard plans to a new city or building site. They figure out how to maximize living space and minimize hallways, and they make adaptations to local building codes.



How large a percentage of the population could be shifted and developed into another Steve Jobs, Thomas Edison, Buckminster Fuller, I.M. Pei or other innovator, inventor, or developer of new systems, projects and concepts?

More people could become great innovators with virtual reality training, brain-computer interfaces and other training and enhancement technology.

Great innovators could be supported by massive, flexible automation, such that whatever they imagine can be refined with simulations, then tested and certified to be free of bad side effects.

This relates to the mundane singularity of taking the technology that we have or almost have and making adjustments to maximize what can be done.

A complete restructuring and shift to full innovation could easily take many decades, even for the countries that are fastest at change and adoption. In the shift away from rural and agricultural economies, some countries completed the transition almost 100 years ago while other countries are still subsistence farming.

Blue Brain Videos and Update on Brain Emulation

It took less than two years for the Blue Brain supercomputer to accurately simulate a neocortical column, which is a tiny slice of brain containing approximately 10,000 neurons, with about 30 million synaptic connections between them.

The Blue Brain team is now developing a model of a complete rat brain, which should be done in 2010. Markram will then download the simulation into a robotic rat, so that the brain has a body. He's already talking to a Japanese company about constructing the mechanical animal.

Installing Blue Brain in a robot will also allow it to develop like a real rat. The simulated cells will be shaped by their own sensations, constantly revising their connections based upon the rat’s experiences. “What you ultimately want,” Markram says, “is a robot that’s a little bit unpredictable, that doesn’t just do what we tell it to do.” His goal is to build a virtual animal—a rodent robot—with a mind of its own.

Markram is candid about the possibility of failure. He knows that he has no idea what will happen once the Blue Brain is scaled up. “I think it will be just as interesting, perhaps even more interesting, if we can’t create a conscious computer,” Markram says. “Then the question will be: ‘What are we missing? Why is this not enough?’”


This site had an interview with Henry Markram, which is available through this link



A component-based extension framework for large-scale parallel simulations in NEURON

As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only one part of a chain of tools ranging from setup and simulation to interaction with virtual environments, analysis and visualization. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in its native model language, but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON, but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation.
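
To make the component-based idea concrete, here is a minimal Python sketch of the pattern the abstract describes: the framework owns the main integration loop and calls a replaceable spike-exchange component and pluggable monitor components at each step. All class and method names here are invented for illustration; this is not the published framework's API.

```python
# Minimal sketch of a component-based simulation loop (illustrative only;
# the class and method names are not the actual NEURON extension's API).
class SpikeExchanger:
    """Replaceable component that would move spikes between processes/ranks."""
    def exchange(self, t, spikes):
        return spikes  # single-process stand-in: nothing to exchange

class Monitor:
    """Pluggable component run alongside the simulation (analysis, control, ...)."""
    def __init__(self, every=100):
        self.every, self.n = every, 0
    def step(self, t, state):
        self.n += 1
        if self.n % self.every == 0:
            print(f"t = {t:6.1f} ms, active cells = {sum(state.values())}")

class Framework:
    """Owns the main loop; the model itself is advanced through a callback
    (in the real framework this role is played by NEURON's integrator)."""
    def __init__(self, exchanger, monitors, dt=0.1):
        self.exchanger, self.monitors, self.dt = exchanger, monitors, dt
    def run(self, tstop, advance_model):
        t, spikes = 0.0, []
        while t < tstop:
            spikes, state = advance_model(t, self.dt, spikes)
            spikes = self.exchanger.exchange(t, spikes)
            for m in self.monitors:
                m.step(t, state)
            t += self.dt

def dummy_model(t, dt, incoming_spikes):
    # Stand-in for a network being advanced by one time step dt.
    return [], {"cell0": int(t) % 2}

Framework(SpikeExchanger(), [Monitor()]).run(tstop=30.0, advance_model=dummy_model)
```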




Notes from Henry Markram TED Talk, July 2009

Henry Markram is leading the Blue Brain Project, which hopes to create a realistic digital 3D model of the whole human brain within the next 10 years. (The simulation promises to do all the things that real human brains can do, including consciousness.) He's done a proof of concept by modeling half of a rodent brain. Now he's scaling up the project to reach a human brain.

But why? It's essential to understand the brain for us to get along in society. We can't keep doing animal experimentation forever. We have to embody our data in a working digital model. We need better medicines that are more specific, more concrete, more precise. (Also, it's just fascinating.)

Markram, for the first time, shares how he is addressing one theory of how the brain works. The theory is that the brain "builds" a version of the universe and projects this version, like a bubble, all around us. But Markram says we can directly address this philosophical question with science. Anesthetics don't work by blocking receptors. They introduce a noise into the brain to confuse the neurons to prevent you from making "decisions." You must make decisions to perceive anything. 99% of what you see in a room is not what comes in through the eyes -- it's what you infer about that room.

Instead of speculating or philosophizing, we can actually build something to test the theories.

It took the universe 11 billion years to build a brain. The big step was the neocortex. It allowed animals to cope with parenthood, social functions. So the neocortex is the ultimate solution, the pinnacle of complex design that the universe has produced. The neocortex continues to evolve rapidly. The neocortex uses the same basic unit for computation, over and over again, and built up so fast evolutionarily that the brain had to fold itself up to fit more of the stuff into the skull.

The holy grail for neuroscience is to understand the design of the neocortical column. It will help us understand not just the brain, but perhaps physical reality. Understanding the structures that make it up is extremely difficult, because beyond just cataloging the parts, you have to figure out how they actually work -- and then build realistic digital models.

Mathematics underlies the models of the brain. Each neuron has a mathematical representation. Even though this simplifies things, you still need a huge computer to do the kinds of simulations Markram is talking about. You'd need one laptop for every single neuron in order to accurately model it. So what do you do? You go to IBM!




Hypersonic Weapons and Rockets


India and Russia have agreed to develop and induct a new hypersonic version of their joint-venture, 174-mile-range BrahMos cruise missile by 2015.

The new missile will be known as 'BrahMos-2' and will have a speed of over Mach 6 (around 3,600 miles per hour) with a striking range of 174 miles.


NASA Hypersonic Project
NASA has selected a Williams International high-speed turbojet as the turbine element of its Turbine Based Combined Cycle (TBCC) engine test rig, which will be used to evaluate technologies for potential future two-stage to orbit launcher concepts.

The TBCC is designed to integrate a turbine and ramjet/scramjet into a unified propulsion system that could be used to power the first-stage of a two-stage launch vehicle from a standing start on a runway to speeds in excess of Mach 7. The concept also is being evaluated by Lockheed Martin and Pratt & Whitney Rocketdyne as the ongoing Mode Transition (MoTr) program, which aims to fill the void left by the DARPA HTV-3X/Blackswift hypersonic demonstrator canceled in 2008. Unlike the NASA effort, MoTr is aimed at a propulsion system for potential high-speed strike/reconnaissance vehicles, and will include a running scramjet.





Blackbird Replacement?

Pratt & Whitney Co.'s rocket-motor division has been hired to work on a prototype for a combo jet turbine-ramjet propulsion system capable of moving a low-orbit military vehicle at hypersonic speeds.

Aerospace and defense giant Lockheed Martin Corp. signed a 10-month contract with Pratt & Whitney Rocketdyne for preliminary design of a high speed accelerator for a turbine-based combined-cycle propulsion system, which could support flight up to Mach 6 -- six times the speed of sound.

Pratt said such a vehicle could be used for strike and reconnaissance missions. A vehicle like this sounds like a replacement for the old Blackbird recon plane.


US Navy Close to Accepting Second of 55 Fast Littoral Combat Ships


The Navy’s second littoral combat ship, the Independence, finished its builder’s trials Wednesday.

The aluminum trimaran hit a top speed of 45 knots (50 mph) and kept a sustained speed of 44 knots during its full-power run in the Gulf of Mexico, shipbuilder General Dynamics said in an announcement. It maintained high speed and stability despite eight-foot waves and 25-knot winds. The latest schedule calls for the Independence to be delivered before the end of 2009 and commissioned sometime early next year.

The Independence is the second of two ships the Navy is considering for its planned fleet of 55 littoral combat ships, along with the Lockheed Martin-built Freedom, commissioned last November. The ships were built to swap interchangeable equipment packages for three missions: mine countermeasures, surface warfare and anti-submarine warfare. Navy officials will decide next spring which version of LCS they will put into full-scale production.



Each LCS was initially pitched to Congress for a cost of about $220 million, but according to the Navy’s latest budget figures, the Freedom has cost $637 million and the Independence has cost $704 million. The Navy has awarded contracts for a second Freedom-class ship — the Fort Worth — and a second Independence — the Coronado — but has not disclosed the value of the contracts.

Navy officials claim the ongoing competition between GD and Lockheed means they can't release the ships' costs, although Landay said he hopes the Navy will reveal those costs soon.


GE Hitachi Propose Nuclear Fuel Recycling with Prism Fast Reactor and Electroseparation


GE (General Electric) Hitachi is proposing the Advanced Recycling Center (ARC), which would combine fourth-generation PRISM sodium-cooled fast reactors with an electrometallurgical separation process that makes a new form of fuel from spent fuel rods without separating plutonium. The first-of-a-kind system they are proposing would cost about $3.2 billion and would be completed by 2020 if it is funded and the project is executed as planned.

The GE PRISM reactor was discussed in the third section of this article.

Here is a three page pdf on the Advanced Recycling Center (ARC) (H/T Coal2 nuclear)

The ARC would cut radioactive waste. It can extract and burn up to 90 percent of the energy in uranium, instead of the 2-3 percent that widely used light water reactors do.

A European study in the 1990s showed fast reactors would cost about 20 percent more than conventional reactors. However, GE says the approach would also be economical, particularly if the disposal costs of nuclear waste from existing technologies are taken into account. "If you factor in long term storage, then the economics support recycling, and even reprocessing," GE Hitachi's Price said. "The long term disposal is going to be very expensive." The cost of conventional reactors plus the cost of building and operating fuel storage like Yucca Mountain is argued to be higher than the fast reactor approach. Plus, having a really deep-burning fuel system would allow the industry to show real progress toward a closed fuel cycle and to address environmental concerns.




Fuel reprocessing, like GE Hitachi's electrometallurgical process, was the area of the technology that was least well proven, Hore-Lacy said.

Abram agreed, saying: "On a relatively small scale, the electrometallurgical reprocessing technology has been shown to work."

"It's conceptually relatively easy to describe. But because the fuel is very radioactive, all of the fuel manufacturing operations would have to be done in very heavily shielded facilities, and remotely, using robotic manipulation. Nobody has demonstrated it at industrial scales yet."

GE Hitachi says it could develop the technology in 10-15 years as it has been working on it since the 1980s, partly funded by the U.S. government.


GE Hitachi Description

The ARC combines electrometallurgical processing and one or more sodium cooled fast burner reactors on a single site. This process produces power while alleviating the spent nuclear fuel burden from nuclear power generation.

The ARC starts with the separation of spent nuclear fuel into three components: 1) uranium that can be used in CANDU reactors or re-enriched for use in light water reactors; 2) fission products (with a shorter half life) that are stabilized in glass or metallic form for geologic disposal; and 3) actinides (the long lived radioactive material in SNF) which are used as fuel in the Advanced Recycling Reactor (ARR).

GEH has selected the electrometallurgical process to perform separations. The electrometallurgical process uses electric current passing through a salt bath to separate the components of Spent Nuclear Fuel (SNF).

A major advantage of this process is that it is a dry process (the processing materials are solids at room temperature). This significantly reduces the risk of inadvertent environmental releases. Additionally, unlike traditional aqueous MOX separations technology, electrometallurgical separation does not generate separated pure plutonium, making it more proliferation resistant. Electrometallurgical separations technology is currently widely used in the aluminum industry and has been studied and demonstrated in US National Laboratories as well as other research institutes around the world.

The actinide fuel (including elements such as plutonium, americium, neptunium, and curium) manufactured from the separations step is then used in GEH’s PRISM (Power Reactor Innovative Small Modular) advanced recycling reactor to produce electricity. PRISM is a reactor that uses liquid sodium as a coolant. This coolant allows the neutrons in the reactor to have a higher energy (sometimes called fast-reactors) that drive fission of the actinides, converting them into shorter lived “fission products.” This reaction produces heat energy, which is converted into electrical energy in a conventional steam turbine. Sodium cooled reactors are well developed and have safely operated at many sites around the world.

The ARC produces carbon-free base load electrical power. An ARC consists of an electrometallurgical separations plant and three power blocks of 622 MWe each for a total of 1,866 MWe. The sale of electricity will provide the revenues (private sector) to operate the ARC while supplemental income will be obtained from the sale of uranium (private sector) and the payment for SNF treatment (currently Government controlled).

Today, in the US there are approximately 100 nuclear power reactors in operation. Assuming that they each produce 20 tons of SNF a year for 60 years of operation, then the current fleet will produce 120,000 tons of SNF. 26 ARCs are capable of consuming the entire 120,000 tons of SNF. Additionally, they are capable of producing 50,000 MWe and avoiding the emission of 400,000,000 tons of CO2 every year.
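
A quick back-of-the-envelope check of those figures (using only the numbers quoted in this description):

```python
# Back-of-the-envelope check of the SNF and ARC figures quoted above.
reactors = 100           # current US fleet
snf_per_reactor = 20     # tons of spent nuclear fuel per reactor per year
years = 60               # assumed operating life

print(reactors * snf_per_reactor * years)   # 120,000 tons of SNF, matching the text

arc_power_mwe = 3 * 622                     # three 622 MWe power blocks per ARC
print(arc_power_mwe)                        # 1,866 MWe per ARC
print(26 * arc_power_mwe)                   # ~48,500 MWe for 26 ARCs, i.e. roughly the
                                            # 50,000 MWe figure quoted above
```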

In order to gain the confidence of utilities and financial markets that the regulatory and resource issues (personnel and materials) can be solved, a first-of-a-kind ARC must be built at "full scale." A full-scale facility is a single reactor and a 50 ton per year separations facility. This facility could be available as early as 2020. A well-managed US government sponsored program using US technology, US national laboratories and universities, and US companies can lead this process. The project will take approximately 10 years to complete.

We estimate the total first-of-a-kind cost for the Nuclear Fuel Recycling Center and a PRISM reactor (design, technology development, licensing, construction, safety testing, etc.) is $3.2B, requiring an average spend of $320M/yr with a peak construction period requiring $700M. The first PRISM reactor could be fueled by excess Pu from the weapons stockpile, thus further reducing proliferation risk.

This program will enable the US to lead the world nuclear community in demonstrating a sound approach to solving the problem of SNF, a solution that our national laboratories pioneered decades ago. Building the GEH Advanced Recycling Center allows the US to capitalize on existing US-funded technology and demonstrate US leadership in providing a safe, proliferation-resistant method to close the nuclear fuel cycle.



RELATED READING
11 page presentation on the PRISM reactor

China has agreed to buy and build two 880 MWe Russian BN-800 fast neutron reactors. The Russian reactors are less advanced than the proposed GE Hitachi PRISM, but the first BN-800 is being built in Russia and should be done by 2012-2013. The Russians have been operating the 600 MWe BN-600 reactor since 1980.

India is completing a 500 MWe breeder reactor and plans to complete three more by 2020.

October 22, 2009

Technologies that Would Change The Energy Picture

The Wall Street Journal has an article about five technologies that would change the energy and climate picture.

Over the next few decades, the world will need to wean itself from dependence on fossil fuels and drastically reduce greenhouse gases. Current technology will take us only so far; major breakthroughs are required.

* Space based solar power
* Utility scale energy storage to enable a high percentage of solar and wind
* Next Generation Biofuels
* Carbon capture and storage
* Advanced Car Batteries


There are several issues with the whole article.

1. Even under business as usual, and even if the climate models are correct, it could be decades before we really notice serious climate change.

Mojib Latif of Germany's Leibniz Institute is one of the leading climate modellers in the world. He is the recipient of several international climate-study prizes and a lead author for the United Nations Intergovernmental Panel on Climate Change (IPCC). He has contributed significantly to the IPCC's last two five-year reports that have stated unequivocally that man-made greenhouse emissions are causing the planet to warm dangerously. Yet last week in Geneva, at the UN's World Climate Conference -- an annual gathering of the so-called "scientific consensus" on man-made climate change -- Latif conceded the Earth has not warmed for nearly a decade and that we are likely entering "one or even two decades during which temperatures cool."


A New Scientist article also points out that a few years or even a decade of cooler temperatures is not conclusive evidence against climate change.

Bottom line: there are concerns and fears, but breakthroughs or increased usage of existing technologies will often take decades to slow the growth of greenhouse gases and then lower the amount of those gases.

This site has provided a list of faster-impacting steps, since, climate change or not, it is better to reduce air pollution.



2. Plenty can be done with existing technology.

This goes to the list of faster impacting steps and technologies.

Making container ships run on nuclear power, improving aerodynamics on cars, retrofitting existing cars, building more nuclear power plants, uprating existing nuclear power plants and other steps use either existing technology or incremental improvements.

3. Success in the listed technology breakthroughs would mainly change psychology, making more people hopeful and optimistic about a portfolio of technological solutions.

4. A faster impact on climate would come from reducing air pollution and soot (black carbon). It could be achieved sooner and would have larger near-term climate effects.

5. Breakthroughs with factory mass-produced, small, deep-burn fission nuclear reactors, or breakthroughs with nuclear fusion, would be technologies that more rapidly transform not just the energy/climate picture but the economy and civilization.

6. Even modest molecular nanotechnology would enable climate control technology.


WWF Funds Another Biased Climate Change Report Based on Crude and Incorrect Spreadsheet Calculations

The environmental group WWF has a report which says that $17 trillion must be spent on renewable energy between now and 2050 to avoid temperatures rising by 2 degrees Celsius and irreversible climate change.

Previously, the World Wide Fund for Nature funded a climate scorecard with rigged numbers.

WWF does not consider nuclear power to be a viable policy option. The indicators “emissions per capita”, “emissions per GDP” and “CO2 per kWh electricity” for all countries have therefore been adjusted as if the generation of electricity from nuclear power had produced 350 gCO2/kWh (emission factor for natural gas). Without the adjustment, the original indicators for France would have been much lower, e.g. 86 gCO2/kWh.


The report is 159 pages long and based on faulty assumptions loaded into a spreadsheet and run forward for 40 years.

On pages 157 and 158, nuclear power is dismissed even though it currently generates 80% of the very low carbon energy in the world.

On pages 129-142 they have their spreadsheets with gigawatt hours per year by energy source. Nuclear power starts out about 300,000 GWh per year lower than actual numbers: 2,321,828 GWh/yr (about 2,322 TWh/yr) for 2010, which is less than the 2,600-2,650 TWh/yr actually generated from 2006-2008.




The capacity factors that they used for their assumptions are made up. The current best actual capacity factors, in the USA and South Korea, are 90-95%, and those countries represent 33% of total nuclear capacity and an even larger share of nuclear generation. By using the wrong capacity factors, they calculated nuclear power generation to be 15% below actual values.
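
The capacity-factor point is easy to check: annual generation is installed capacity times the hours in a year times the capacity factor, so an understated capacity factor understates generation by the same fraction. A rough illustration follows; the installed-capacity figure is an approximate world total used here only for illustration.

```python
# Generation (TWh/yr) = installed capacity (GW) * 8760 h/yr * capacity factor / 1000
def annual_twh(capacity_gw, capacity_factor):
    return capacity_gw * 8760 * capacity_factor / 1000.0

world_nuclear_gw = 370     # rough world installed nuclear capacity, ~2008 (illustrative)
actual = annual_twh(world_nuclear_gw, 0.81)
print(actual)              # ~2,620 TWh/yr, close to the actual 2,600-2,650 TWh/yr
print(actual * 0.85)       # a capacity factor set 15% too low cuts the calculated
                           # generation to roughly 2,230 TWh/yr
```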

The capacity factors for wind are too high based on current technology and would require the development of high-altitude wind, where the wind is more reliable. This site has covered kitegen technology, but that technology is likely at least ten years away from deployment at any significant scale. They have demonstrated a 40 kW unit and have received 15 million euros to develop 1 megawatt and 20 megawatt systems over the next five years. If that works out, then it could scale.



The 159 pages come down to how large a multiplication factor they can justify for wind, solar and other favored technologies. They do not look at grid upgrade issues and costs, backup power generation, or power storage. They could not justify a continued growth rate beyond 30% per year for 30-40 years for wind energy, so in order for wind and solar power to be scaled in their spreadsheet to achieve the energy replacement, they need to start the multiplier in 2014.

They did not look at any detailed technologies or projects. It is 159 pages of window dressing on a crude and incorrect spreadsheet projection.

The WWF and their reports are comical in their simplicity and bias.

RELATED REAL SOLUTIONS FOR REDUCING GREENHOUSE GASES
This site has produced a long list of steps that could be taken to economically and quickly reduce greenhouse gases and mitigate possible climate change effects.

Michael Anissimov of Accelerating Future Answers Ten Questions on the Singularity

Michael Anissimov answers ten questions from Popular Science about the technological Singularity and Artificial General Intelligence (AGI). It is a lengthy piece but is well worth reading for anyone interested in the Singularity and AGI.

Al Fin believes that the time scales for AGI are too optimistic.

There is a 6-page draft of an outline for a roadmap to AGI. A more detailed roadmap should be forthcoming from an AGI roadmap workshop that has been happening this week.



EEStor Worth $1.5 billion and Zenn Motor Betting Its Company on EEStor

The Toronto Star writes about Zenn Motor betting its company on EEStor.

Zenn Motor is waiting for EEStor to deliver its first commercial unit by the end of this year. ZENN's future is riding on that delivery. "The entire aspect of our business model is dependent on EEStor commercializing that technology," he says. "The transformative moment is with the commercial proof, and then the whole tenor of the discussion changes to the excitement about the reality."


Wired indicates that Zenn Motor Company (ZNN:CA) is publicly traded with a market value of US$169 million. Zenn has a 10.7% ownership stake in EEStor, so investors in Zenn in effect get a small investment in EEStor as part of the deal. On that basis, EEStor has an implied value of about $1.5 billion.
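
As a rough sanity check of that implied-value arithmetic (ignoring Zenn's other assets and liabilities, which the Wired figure may account for differently):

```python
# Rough implied-value check: if Zenn's market value is treated as a proxy for the
# worth of its 10.7% stake, the implied value of EEStor is market value / stake.
zenn_market_value = 169e6   # US$
stake = 0.107               # Zenn's ownership share of EEStor
print(zenn_market_value / stake / 1e9)   # ~1.6 billion, in the ballpark of the $1.5B figure
```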



The best commercial ultracapacitors today can only hold one-twentieth the energy of a lithium-ion battery of the same size. Some researchers say it is virtually impossible to develop a reliable, competitively priced ultracapacitor that can match or exceed the energy storage capability of lithium-ion batteries, which are emerging as the standard for electric vehicles. At best, an ultracapacitor might be able to achieve a quarter of a battery's energy density.

EEStor says its ultracapacitor has several times the energy density of lithium-ion batteries.

Krstic and Alupro-MPI's Ultracapacitor

EEStor's not the only one trying. Vladimir Krstic, professor in the department of mechanical and materials engineering at Queen's University, is working with Stratford, Ont.-based Alupro-MPI on a new type of advanced ceramic alloy that could dramatically outperform existing materials.

The plan is to use an ultrasonic device that can forcefully blend liquid metals, such as barium and titanium, that have been impossible to mix. The new ceramic alloys that result could raise the bar on ultracapacitor energy storage.

Krstic says the initial target is to make an ultracapacitor with "at least" a third of the energy density of lithium-ion batteries, then go higher from there. But like all research, the pace of innovation depends on market need and financial backing, whether from government or the private sector. "In order to speed this up we need to have more funding," he says.

Vladimir Krstic web page at Queen's University

Alupro-MPI website

Alupro supercapacitor claims

* Will recharge in seconds
* Extremely high dielectric constant of 80,000-120,000+
* 200x higher capacitance than BaTiO3
* 220 V or slightly higher
* No dielectric saturation!

A new, previously immiscible alloy that can only be mixed by our proprietary ultrasonic generator.


Alupro's ultrasonic technology

A new revolutionary ultrasonic generator technology has been developed that allows conventional high-amplitude transducers to drive large, irregular shapes and un-tuned mechanical systems such as extruder heads, drawing dies, injection mould tools, or a metal melt mixer. This new technology has the unique capability to stimulate wide-band sonic and ultrasonic energy (ranging from infrasonic up to the MHz domain). The key to the technology is its use of a proprietary technique to analyze feedback signals and create custom driving waveforms to initiate and control ringing and relaxing, modulated, multimode mechanical oscillations in the harmonics and sub-harmonics of the attached mechanical system. Such ultrasonic driving creates a uniform and homogenous distribution of acoustical activity on the surface and inside of the vibrating system, while avoiding the creation of stationary and standing waves, so that the whole vibrating system is fully agitated.

Innovative Aspects
* Development of new ultrasonic electronics and a system feedback concept that allows real-time adaptation to continuously evolving acoustic conditions.
* Capability to ultrasonically drive any arbitrary shape or large mechanical system at high energy if needed.
* Ultrasonic driving of system harmonics and subharmonics gives a wideband, multi-frequency effect that improves stimulation of the system.
* Elimination of ultrasonic standing waves gives uniform treatment of the material.

Main Advantages
* Improved metal homogenization and mixing of new alloys.
* Improved crystallization and alloy characteristics in casting.
* Friction reduction between a tool (e.g. casting, drawing, extruding, moulding) and the material being worked, which improves material flow.
* Improved surface finish on extruded or drawn materials.
* Reduced or eliminated material voids or cavities in casting or moulding applications.
* Power supply generators are available in a standard range of 300 watts to 2,000 watts; custom systems are available up to 120,000 watts.


Nanowires Go 2D and 3D and are Biocompatible


Taking nanomaterials to a new level of structural complexity, Harvard scientists have determined how to introduce kinks into arrow-straight nanowires, transforming them into zigzagging two- and three-dimensional structures with correspondingly advanced functions.

Among possible applications, the authors say, the new technology could foster a new nanoscale approach to detecting electrical currents in cells and tissues.

“We are very excited about the prospects this research opens up for nanotechnology,” said Lieber, Mark Hyman Jr. Professor of Chemistry in Harvard’s Faculty of Arts and Sciences. “For example, our nanostructures make possible integration of active devices in nanoelectronic and photonic circuits, as well as totally new approaches for extra- and intracellular biological sensors. This latter area is one where we already have exciting new results, and one we believe can change the way much electrical recording in biology and medicine is carried out.”

Lieber and Tian’s approach involves the controlled introduction of triangular “stereocenters” – essentially, fixed 120-degree joints – into nanowires, structures that have previously been rigidly linear. These stereocenters, analogous to the chemical hubs found in many complex organic molecules, introduce kinks into 1-D nanostructures, transforming them into more complex forms.

The researchers were able to introduce stereocenters as the nanowires self-assembled. They halted growth of the 1-D nanostructures for 15 seconds by removing key gaseous reactants from the chemical brew in which the process was taking place, then replaced the reactants after joints had been introduced into the nanostructures. This approach resulted in a 40 percent yield of bent nanowires, which can then be purified to achieve higher yields.

“The stereocenters appear as kinks, and the distance between kinks is completely controlled,” said Tian, a research assistant in Harvard’s Department of Chemistry and Chemical Biology. “Moreover, we demonstrated the generality of our approach through synthesis of 2-D silicon, germanium, and cadmium sulfide nanowire structures.”




Nature Nanotechnology: Single-crystalline kinked semiconductor nanowire superstructures

The ability to control and modulate the composition, doping, crystal structure and morphology of semiconductor nanowires during the synthesis process has allowed researchers to explore various applications of nanowires. However, despite advances in nanowire synthesis, progress towards the ab initio design and growth of hierarchical nanostructures has been limited. Here, we demonstrate a 'nanotectonic' approach that provides iterative control over the nucleation and growth of nanowires, and use it to grow kinked or zigzag nanowires in which the straight sections are separated by triangular joints. Moreover, the lengths of the straight sections can be controlled and the growth direction remains coherent along the nanowire. We also grow dopant-modulated structures in which specific device functions, including p–n diodes and field-effect transistors, can be precisely localized at the kinked junctions in the nanowires.


3 page pdf with supplemental information


Nanowires Are not Damaging the Brain

In separate but related nanowire research: Nanowire Biocompatibility in the Brain - Looking for a Needle in a 3D Stack.

We investigated the brain-tissue response to nanowire implantations in the rat striatum after 1, 6, and 12 weeks using immunohistochemistry. The nanowires could be visualized in the scar by confocal microscopy (through the scattered laser light). For the nanowire-implanted animals, there is a significant astrocyte response at week 1 compared to controls. The nanowires are phagocytized by ED1 positive microglia, and some of them are degraded and/or transported away from the brain.


Nanowires are not damaging the brain

"Together with other findings and given that the number of microglial cells decreased over time, the results indicate that the brain was not damaged or chronically injured by the nanowires," Christelle Prinz concludes.




Not Satisfied Being A Wine Snob, You can Learn to be a Beef Snob

In the future there could be genetic engineering to make a transhuman level of taste or a perfect palate. For now, we can have a better eating experience by understanding what we like and thinking a bit about our food.

From the Independent UK, a master beef taster who can tell a cow's age, gender and breed from one mouthful of meat.

People think about the animals on farms and about the meat they buy, but tend to do so separately. Yet the difference these factors make to the eating experience is astonishingly marked – and, says Laurent Vernet, hugely underestimated, even by top chefs.

Whilst others brand his skill unique, he isn't so sure. "There's almost no science involved. I've met farmers and butchers who can immediately identify the same things I can, but they do it informally," he tells me as we sit in London's Landmark Hotel waiting for executive chef Gary Klaner to rustle up the first few cuts. "In any case, I want people to understand the differences in beef so they can find their favourites. You wouldn't pick up any old red wine without thinking about your preferences. Why shouldn't the same go for beef, which after all, tends to be the most expensive food in your shopping trolley?"

Beef, like any food, has fashions. The current hit, especially in London butchers, is 46-day maturation. A mere nine days is, in some circles, now considered so bland that you might as well chuck it in the kids' packed lunch. Vernet disagrees and he's right that while its taste lingers for a shorter time than its two longer-matured equivalents, it is juicier and melts in the mouth. "Beware of the butcher that says all his beef is matured for x number of days," cautions Vernet. "The reality is they should get to know the individual carcasses, checking them every day, recognising every animal is different in terms of its premium state."

"Ultimately, it comes down to individual preference," he says. "Provided it's good-quality well-reared beef, you can't (as some butchers and chefs do) say one type is categorically better than another."

The idea that animals should only be fed grass is another current trend that Vernet disagrees with. "They need other things in winter and turnips or barley – two examples – develop sweetness and a nice marbling, which ultimately give more juiciness."




The perfect steak: Vernet's tips

Use the right pan: For juicy cuts, use a griddle. For drier steaks, use a flat pan. If you place a juicy steak on a flat pan, the liquid will boil and ruin the taste and texture. If you put a dry steak on a griddle, you'll make it even drier. If you're not sure which category your steak fits into, ask your butcher. Alternatively, go by the rule that the more marbling, the more juice.

Get your temperatures right: Your pan needs to be very hot. Top chefs think nothing of heating the pan five to ten minutes before putting the steak anywhere near it. The steak itself should ideally be room temperature. If it's too chilled, the meat fibres will contract together and produce a massive release of juice, potentially drying out your steak.

Cooking your meat: Steaks can be trimmed of fat before or after cooking. The latter option adds a little more flavour. Lightly coat the steak with oil before placing on your pan/griddle and allow the meat to cook until the desired amount of browning occurs. Go by the rule of two-and-a-half minutes each side for rare; 3-4 minutes each side for medium rare; 4 for medium; 5 for medium well; and 6 for well done. If using a griddle, rotate the steak 45 degrees while cooking for a criss-cross effect.

Bone or no bone? Some butchers say steaks best retain their flavour when cooked on the bone, but there's no evidence or logic to suggest this is true. Since butchers have to pay to dispose of bones, they may have an ulterior motive. That said, bones are always good for stock or soup.




What is the Future of Shale Gas ?

A Financial Times blog article covers a debate between two positions: that shale gas is boosting, and will continue to boost, US and world natural gas supplies and production, versus the view that shale gas wells decline too quickly and will not provide that much natural gas.

Here is the case from the side arguing that there is, and will be, a lot of shale gas:

1. Reality – it takes many years for very tight (low-permeability) gas reservoirs to exhibit exponential decline behavior. Thus, hyperbolic decline can and should be used to approximate and extrapolate EURs (see the decline-curve sketch after this list).

2. Type Curves work.

3. The claim of a high terminal decline rate is wrong. We need 10-20 years of Barnett history (the oldest shale play) to be certain, but current results show lower decline than the 15% assumed by skeptics.

4. They loaded the skeptics' Barnett ~1 bcf EUR type curve (which the skeptics call optimistic) into our Barnett Shale model and applied it to the ~3,000 wells drilled in 2008. During 2008, actual Barnett production grew by 1.2 bcf/day. The skeptics' "optimistic" EUR curve estimated growth of only 0.7 bcf/day, which means it underpredicted actual incremental production by 0.5 bcf/day, or about 70%.

5. In the Barnett and Fayetteville, reported well production shows higher year-over-year rates and higher projected EURs due to improved technology and better understanding of the reservoir (it's called a learning curve).

6. Peak rate IS a good indicator of EUR

7. With less gas from shales, the marginal cost of supply will be high – some skeptics say as high as $8/mcf. We agree that if shale disappoints, gas prices will be quite high. Yet some skeptics run economic analysis of shales at $4/mcf gas prices to prove that the Barnett is uneconomic. That is circular logic. If recovery is low and prices are therefore high, you MUST use the higher price when evaluating shale economics.
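
As a concrete illustration of the hyperbolic-versus-exponential point in item 1, the sketch below uses the standard Arps decline equations with made-up parameters, not data from any actual Barnett well, to show how differently the two assumptions accumulate reserves over time.

```python
# Arps decline curves with illustrative (made-up) parameters -- not real well data.
#   exponential: q(t) = qi * exp(-D * t)
#   hyperbolic:  q(t) = qi / (1 + b * Di * t) ** (1 / b)
import math

qi = 2.0    # initial rate, MMcf/day (illustrative)
D  = 0.6    # nominal annual decline, exponential case
Di = 0.9    # initial annual decline, hyperbolic case
b  = 1.4    # hyperbolic exponent (tight-gas wells are often quoted with b > 1)

def q_exp(t):
    return qi * math.exp(-D * t)

def q_hyp(t):
    return qi / (1 + b * Di * t) ** (1 / b)

def cumulative_bcf(q, years, steps_per_year=365):
    """Numerically integrate the daily rate to get cumulative production in Bcf."""
    dt = 1.0 / steps_per_year
    mmcf_day_years = sum(q(i * dt) * dt for i in range(int(years * steps_per_year)))
    return mmcf_day_years * 365 / 1000.0

for years in (5, 15, 30):
    print(years, round(cumulative_bcf(q_exp, years), 2), round(cumulative_bcf(q_hyp, years), 2))
# The hyperbolic curve keeps adding reserves long after the exponential one has
# flattened out, which is why the choice of decline model drives the EUR debate.
```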




Low Shale Gas Case
Part of the low shale gas case

Another part of the low shale gas case: basically, there are big first-year declines, and those declines must continue for all or most wells in the following years.

New Big Shale Gas Finds Around the World and Possible Future Scenario

Poland and Pennsylvania are finding gigantic new natural gas reserves, and the new technology will enable other places to unlock shale gas as well. Total US natural gas reserves are now estimated at 75 years of supply. In less than two years, the US has gone from a gas-importing nation to a gas-surplus nation.

Outside the United States, there has been almost no exploration for shale gas resources, and correspondingly little is known about the reserve potential in other countries. But suffice it to say that a lot of shale gas will be found and developed in the next five years, and when it is the global energy equation is going to change. Places like Poland, Germany, Sweden, France, China and India will suddenly emerge as major natural gas producers. The net effect of these new sources of supply will be the decline in importance of Russia, Canada, Iran, Qatar and Algeria as energy suppliers.



China H1N1 Strategy Was Superior


China has had far fewer deaths from H1N1 and has limited and delayed the spread of H1N1.

From April 25 to July 5, a total of 8,272 people entering China with flu-like symptoms were transferred to medical institutions for quarantine. Some 228 of them were confirmed as H1N1 flu patients.

China's strategy for fighting swine-flu is "resource intensive." But China's strategy worked to greatly slow the spread of swine flu infections in their country; Australia's strategy didn't do the same for their country. China's way of fighting the pandemic was to greatly limit its spread at the start. That doesn't mean they caught every case. It means they caught every case that came to the attention of a physician or hospital. The CDC pandemic-fighting model is based largely on analysis of the 1918 swine flu pandemic. China's model is based on empirical observations of the way 'airborne' or aerosolized infectious disease first spreads from country to country in the 21st Century; namely, via globalized, heavy-volume commercial airline travel. There is no way to erase the data on China's success at doing the Number One thing a national pandemic fighting plan is supposed to do: greatly slow the spread of the disease in a country.



Report from the summer of 2009 on the H1N1 procedures:
When the plane lands, current procedure is that four medical personnel come on board and check everyone's temperature with a very fast "ear" thermometer. We were on a smaller plane and waited in total about 35 minutes until they were finished and cleared the plane so that we could all leave.



China has far fewer cases relative to India, Australia, and the United States.

India's H1N1 Situation



India has had 351 deaths from H1N1 as of October 11, 2009, but China only had its second H1N1 death on October 18, 2009.

China ignored the World Health Organization recommendation not to test for H1N1 and not to try to quarantine sick inbound travellers.

CDC guidelines called for separating people by 6 feet or more of distance but not putting them into mandatory isolation.

A 27-page document discusses the options for managing H1N1. At the end of it:

In the face of the current situation of the new influenza A(H1N1), WHO does not currently recommend international border closures or international travel restrictions.


October 21, 2009

China Building New Grand Canals


Red lines show the new canals that are being built

The Grand Canal of China, also known as the Beijing-Hangzhou Grand Canal is the longest ancient canal or artificial river in the world. Starting at Beijing it passes through Tianjin and the provinces of Hebei, Shandong, Jiangsu and Zhejiang to the city of Hangzhou. The oldest parts of the canal date back to the 5th century BC, although the various sections were finally combined into one during the Sui Dynasty (581–618 AD).

BBC News reports China has begun relocating people for the next stage of major work on a modern $62 billion water diversion project, which will provide more water to Beijing, Tianjin, Weihai and other northern areas. 330,000 people are being relocated to make way for one section of the canals.

The Grand Canal is currently being upgraded to serve as the Eastern Route of the South-North Water Transfer Project.

From Ritchiewiki: China has about seven percent of the world's water resources and roughly 20 percent of its population. About four-fifths of the country's water supply is in the south. The water transfer project has been divided into three separate sections: the Eastern, Central, and Western routes. It will divert 58 billion cubic yards (44 billion cubic meters) of water annually, providing the drier north with a more reliable water source. The project was debated and scrutinized for 50 years before being approved by China's State Council on August 23, 2002. The construction is expected to take almost as long, with an estimated completion date of 2050. Water is expected to flow from the Yangtze and its tributaries to Beijing in 2014 along the central route.

The South-North Water Transfer Project is described at wikipedia.



The Eastern route will supply water to Shandong Province and the northern part of Jiangsu. Construction of the Eastern route began in December 2002.

The Central route will supply water to Hebei, Henan, Beijing and Tianjin. The completed line will be approximately 785.5 miles (1,264 km) long, initially providing 12.5 billion cubic yards (9.5 billion m^3) of water annually. By 2030, it is expected to increase its water transfer to 16 to 17 billion cubic yards (12 to 13 billion m^3) annually.

The Western route is the most challenging and also the most controversial of the diversion lines. This route is designed to bring five billion cubic yards (3.8 billion m^3) of water from three tributaries of the Yangtze River (the Tongtian, Yalong and Dadu), nearly 300 miles (483 km) across the Bayankala Mountains to northwest China. The diversion of this water could affect a number of other nations, including Burma, Thailand, Laos, Cambodia and Vietnam, which rely on this water downstream.

RELATED

The 2008 global desalination industry had 13,869 plants providing 52 million cubic meters of water per day (19 billion cubic meters per year)

1.8 to 2 billion cubic meters per year of desalination capacity is being installed each year.

Desalination currently costs about 40 to 55 cents per cubic meter.

The equivalent desalination value of 10 billion cubic meters/year of Chinese water diversion (by 2015) would be $4 billion/year at the 40 cents per cubic meter. The full 44 billion cubic meters/year would be $18 billion/year at 40 cents per cubic meter or $9 billion/year if future desalination prices were halved.
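
The conversion behind those dollar figures is straightforward; a rough check using only the numbers quoted above:

```python
# Rough check of the desalination-equivalent value of the water diversion.
cost_per_m3 = 0.40                    # US$ per cubic meter, the low end quoted above
print(10e9 * cost_per_m3 / 1e9)       # 10 billion m^3/yr by 2015 -> ~$4 billion/yr
print(44e9 * cost_per_m3 / 1e9)       # full 44 billion m^3/yr -> ~$17.6 billion/yr (~$18B)
print(44e9 * cost_per_m3 / 2 / 1e9)   # ~$9 billion/yr if desalination prices were halved
```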

Siemens beat 35 other groups with a unique spin on desalination.

Using electricity instead of high pressure or heat to remove salt from seawater, the team produced a cubic meter of pure drinking water with 1.5 kWh.

Advanced desalination methods around the world currently use twice that amount of energy.

Recycled water, known in Singapore as Newater, requires 0.7 kWh per cubic meter to produce.

By passing seawater through electric fields, salt is drawn out - the complete opposite of conventional methods which push water through a membrane, explained Siemens vice-president of R&D Ruediger Knauf.


Get the H1N1 Vaccine When It is Available For You


With normal flu, 90% of the roughly 30,000 people who die in a year in the USA are 65 years or older, but with H1N1, 90% of those who are dying are YOUNGER than 65 years of age.

New Scientist covers the relative risks of the H1N1 Vaccine and the H1N1 Flu

* The risk of getting Guillain-Barré from a flu vaccine is almost certainly less than 1 in a million
* the risk of getting Guillain-Barré from the flu itself is more than 40 in a million.
* Swine flu is estimated to have killed 800 people in the US already, or more than 2 in every million so far. These numbers could go up by 50 to 1,000 times as the flu season progresses, since we are not yet into the typical peak of the flu season, which is January and February.
* And during the first wave of swine flu this summer, 1 out of every 20,000 children aged 4 or under in the US ended up in hospital.

The odds of dying from H1N1 after you get it could be as high as 1 in 200. So if 100 million people got the H1N1 flu in the USA, then 500,000 people could die of it. Having more at-risk people get vaccinated could reduce the spread of the H1N1 flu.
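
Putting those per-million risks side by side, here is the arithmetic behind the numbers above (illustrative only; the population figure is a rough 2009 estimate):

```python
# Rough comparison of the per-person risks quoted above.
vaccine_gbs_risk = 1 / 1e6             # Guillain-Barre from the vaccine: < ~1 per million
flu_gbs_risk = 40 / 1e6                # Guillain-Barre from the flu itself
print(flu_gbs_risk / vaccine_gbs_risk) # the flu carries ~40x the Guillain-Barre risk

us_population = 300e6                  # rough 2009 US population
deaths_so_far = 800
print(deaths_so_far / us_population * 1e6)   # ~2.7 deaths per million so far

case_fatality = 1 / 200                # upper-end estimate quoted above
print(100e6 * case_fatality)           # 100 million infections -> 500,000 deaths
```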

UPDATE: Canada will start vaccinating this week. The US has been vaccinating, and about 6 million people (mostly health care workers and the most at-risk people) have been vaccinated.
END UPDATE

H1N1 vaccine supply is behind the production schedule

The CDC had hoped, per their last estimate made several weeks ago, that by the end of October there would be around 40 million doses of vaccine. The CDC now thinks it might be about 10 to 12 million doses less than that by the end of the month, so about 30 million doses of vaccine available and distributed by the end of October.

Mid-November should see good availability.

People at greatest risk for 2009 H1N1 infection include children, pregnant women, and people with chronic health conditions like asthma, diabetes or heart and lung disease.

Ask your doctor if you should get a 2009 H1N1 vaccine.




Three steps to take

Get vaccinated, take disease prevention steps (wash hands, avoid contact with sick people) and take anti-viral drugs if your doctor tells you to.



Interview of Artificial General Intelligence Researcher Itamar Arel by Sander Olson

Here is an interview of Dr. Itamar Arel by Sander Olson. Dr. Arel runs the Machine Intelligence Lab at the University of Tennessee. Dr. Arel believes that "baby" AIs are possible within 3 years and computers with human-level intelligence are feasible within ten. Dr. Arel is further convinced that the individual components necessary for AI have largely been developed, and that building an Artificial General Intelligence (AGI) should cost only $10-15 million.

Question 1: What were your thoughts on the Singularity Summit?

Answer: I consider it a complete success. There were twice as many attendees as in previous years, and it was more practically oriented than in earlier years. This year's conference was more focused, and that may explain the increased attendance.

Video of Arel from Singularity Summit 2009
Here is a clip from Dr. Arel at the 2009 Singularity Summit:




Question 2: You made some rather provocative arguments at the Singularity Summit.

Answer: At the Singularity Summit, I argued that the technologies needed to drive Artificial General Intelligence (AGI) are readily available. Given sufficient funding, we could create "baby" AGIs within 3-5 years and human level AGIs within a decade. At this point what is needed is a focused engineering effort rather than a dramatic breakthrough.

Question 3: On what grounds do you base these bold predictions?

Answer: I run the Machine Intelligence Lab at the University of Tennessee. We focus our attention on building intelligent machines, but we have a unique approach. We believe that general sentience is based on two critically important subsystems. These subsystems, when combined, could lead to general sentience and intelligence.

Question 4: Tell us more about these subsystems.

Answer: We call the first subsystem the situation inference subsystem. The agent or system infers the state of the world with which it interacts. This is obtained using a subsystem which utilizes a technique known as deep machine learning. The inferred state information is passed to a second subsystem which maps situations to actions using reinforcement learning. Deep machine learning, which is necessary for situation inference subsystems, has recently been developed and should be ready within 3-5 years. So the pieces of the puzzle are already around for us to make a quantum leap in thinking machines.
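
As a rough illustration of that two-subsystem idea (and not Arel's actual system), here is a minimal Python sketch in which a "situation inference" encoder compresses raw observations into a compact state code and a simple reinforcement-learning rule maps states to actions. The layer sizes, the random toy environment and the TD(0) update are all illustrative assumptions:

import numpy as np

# Subsystem 1: "situation inference" - a random linear encoder standing in for
# a deep-learning feature hierarchy that compresses observations into a state.
# Subsystem 2: decision making - a linear Q-function trained with a one-step
# temporal-difference (TD) update, standing in for reinforcement learning.

rng = np.random.default_rng(0)
OBS_DIM, STATE_DIM, N_ACTIONS = 64, 8, 4

W_enc = rng.normal(size=(STATE_DIM, OBS_DIM)) / np.sqrt(OBS_DIM)

def infer_situation(observation):
    return np.tanh(W_enc @ observation)      # compressed state code

W_q = np.zeros((N_ACTIONS, STATE_DIM))
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1        # learning rate, discount, exploration

def choose_action(state):
    if rng.random() < EPSILON:               # occasional random exploration
        return int(rng.integers(N_ACTIONS))
    return int(np.argmax(W_q @ state))

for step in range(1000):
    obs = rng.normal(size=OBS_DIM)           # stand-in for raw sensory input
    state = infer_situation(obs)
    action = choose_action(state)
    reward = float(action == step % N_ACTIONS)   # toy reward signal
    next_state = infer_situation(rng.normal(size=OBS_DIM))
    td_target = reward + GAMMA * np.max(W_q @ next_state)
    td_error = td_target - (W_q @ state)[action]
    W_q[action] += ALPHA * td_error * state  # TD(0) update of the Q-values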

Question 5: So are the advances more software or hardware related?

Answer: It is actually both. The two subsystems - the inference engine and the decision making engine - can be realized in software. But the great advancements in VLSI technology now allow us to pack billions of transistors on a die and thus implement these systems in custom hardware. Each transistor can correspond to a synaptic gap, so we are within a few orders of magnitude of a mammalian brain.
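
The "few orders of magnitude" remark can be sanity-checked with rough numbers. The synapse counts below are commonly cited ballpark figures and are not from the interview:

import math

# Order-of-magnitude gap between a modern die and ballpark mammalian synapse counts.
transistors_per_die = 2e9                  # "billions of transistors on a die"
synapse_counts = {
    "mouse brain (~1e11 synapses)": 1e11,  # ballpark figure, assumption
    "human brain (~1e14 synapses)": 1e14,  # ballpark figure, assumption
}

for label, count in synapse_counts.items():
    gap = math.log10(count / transistors_per_die)
    print(f"{label}: ~{gap:.1f} orders of magnitude beyond a 2-billion-transistor die")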


Question 6: You have argued that the Turing Test is an insufficient guide to assessing true intelligence. What would a computer need to do in order to convince you that it was both sentient and intelligent?

Answer: I am co-chairing a workshop that explicitly deals with establishing an AGI roadmap. This is a first of its kind effort that is designed to generate a roadmap that wouldn't simply strive to create a machine that would pass the Turing test. Rather, we want to devise a better way of evaluating machine intelligence. The goal is to present an increasingly challenging series of tasks to a computer. A computer that passed all of these tests would be deemed human-level intelligent.


Question 7: What role are you playing in creating this AGI roadmap?

Answer: I am the co-organizer of the conference, along with Ben Goertzel. The term “AI” now pertains to narrow AI - AI that performs specific tasks well. By contrast, AGI systems would need to exhibit a broad intelligence, quickly learn new tasks, and be able to readily adapt to an unstructured environment. The goal of the AGI conferences is to specify and codify a series of metrics by which we could measure AGI performance and the challenges that need to be surmounted in order to create a truly intelligent machine.


Question 8: Is reverse-engineering the human brain a necessary prerequisite for AGI?

Answer: There are two schools of thought on this subject. One school of thought advocates reverse-engineering the brain as a necessary precursor to creating a sentient machine. This is often referred to as "whole brain emulation". The other school of thought argues that replicating the human brain is an unnecessary task that would take decades. I agree with the latter - there are quicker and easier ways to impart intelligence to a machine.


Question 9: Much AI involves giving "weights" to values. But how useful is such a system in the real world?

Answer: An AGI system will need to interact with its environment in a manner similar to humans. There will probably need to be some sort of positive feedback mechanism, so we will need to discover a way to give a computer a "rush" from doing a task correctly. This reward system will need to be both internal and external.

Question 10: What funding levels would be required to bring about AGI within a decade?

Answer: Although I cannot provide a precise number, the funding requirements would be relatively modest. Assembling a team and equipping them with the necessary compute power could probably be done for $10 million.

Question 11: Are you troubled by critics of AI arguing that sentience is a quantum phenomenon?

Answer: I am not. I had a stimulating conversation with Stewart Hameroff who was arguing for quantum effects associated with consciousness. He argues that there are synchronized regions of neural activity that lead to consciousness. But he and I agreed that it should be possible to emulate any such quantum effects with digital logic. So that was very encouraging for me.


Question 12: If you were given a petaflop supercomputer, could you create an AGI now?

Answer: The computational resources are actually readily available. We could probably achieve rudimentary AGI with a fairly modest cluster of servers. That is one of the main advantages of not trying to emulate the human brain - accurately simulating neurons and synapses requires prodigious quantities of compute power.


Question 13: Do you believe that AGI will quickly and necessarily lead to superintelligence?

Answer: At this point it isn't clear how long it will take to transition from human level AI to greater-than-human intelligence. For many tasks superintelligence simply isn't needed. It is logical to assume that once we achieve human level AI, that superintelligence will follow relatively quickly. Transitioning from human level AI to a superintelligent AI might simply be a matter of upgrading hardware.


Question 14: Assuming sufficient funding, how much progress do you anticipate by 2019?

Answer: With sufficient funding, I am confident that a breakthrough in AI could be demonstrated within 3 years. This breakthrough would result in the creation of a "baby" AI that would exhibit rudimentary sentience and would have the reasoning capabilities of a 3 year old child. Once a "baby" AI is created, funding issues should essentially disappear since it will be obvious at that point that AGI is finally within reach. So by 2019 we could see AGI equivalent to a human adult, and at that point it would only be a matter of time before superintelligent machines emerge.

FURTHER READING
AGI Roadmap wiki

Google is Planning for 10 Million Servers and an Exabyte of Information



A recent presentation (73 page pdf) by a Google engineer shows that the company is preparing to manage as many as 10 million servers in the future. (H/T Sander Olson)

Google's Jeff Dean discussed a new storage and computation system called Spanner, which will seek to automate management of Google services across multiple data centers. That includes automated allocation of resources across “entire fleets of machines.”

The goal will be “automatic, dynamic world-wide placement of data & computation to minimize latency or cost.”


Future scale of Spanner (a rough per-machine breakdown is sketched after the list):
* ~10^6 to 10^7 machines (up to 10 million machines)
* ~10^13 directories (10 trillion directories)
* ~10^18 bytes of storage (one exabyte), spread across hundreds to thousands of locations around the world
* ~10^9 client machines
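
Dividing the stated totals gives a rough per-machine picture; these are simple ratios of the numbers above, not Google's own figures:

# Rough per-machine arithmetic for the Spanner target scale quoted above.
machines = 1e7          # ~10^6 to 10^7 machines (upper end)
directories = 1e13      # ~10^13 directories
storage_bytes = 1e18    # ~1 exabyte of storage
clients = 1e9           # ~10^9 client machines

print(f"Storage per machine:     {storage_bytes / machines / 1e9:.0f} GB")
print(f"Directories per machine: {directories / machines:.0e}")
print(f"Clients per machine:     {clients / machines:.0f}")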


RELATED READING

There is a proposal called server sky for placing millions of servers into orbit as a lower energy alternative.

October 20, 2009

Electron Spin Controlled with Electric Fields


The holy grail in spintronics is to address spin with something other than magnets. Ohio University researchers have now provided theoretical modeling for a recent experiment that was the first to successfully control an electron's spin using purely electric fields.

The team collaborated with a research group at the University of Cincinnati, led by Philippe Debray and Marc Cahay. Debray conceived and designed the experiments. The Ohio University researchers’ calculations explained the behavior of the electrons in Debray’s experimental conditions and predicted how strong the electric field’s control over the spin would be.

Their research also revealed one of the key conditions of the experiment—that the tiny connection along which the electrons travel in the device must be asymmetrical.

Controlling spin electronically has major implications for the future of novel devices such as transistors, but this experiment is only the first step of many, Ulloa said. The next step would be to rework the experiment so that it could be performed at a higher, more practical temperature not requiring the use of liquid helium.



All-electric quantum point contact spin-polarizer

The controlled creation, manipulation and detection of spin-polarized currents by purely electrical means remains a central challenge of spintronics. Efforts to meet this challenge by exploiting the coupling of the electron orbital motion to its spin, in particular Rashba spin–orbit coupling, have so far been unsuccessful. Recently, it has been shown theoretically that the confining potential of a small current-carrying wire with high intrinsic spin–orbit coupling leads to the accumulation of opposite spins at opposite edges of the wire, though not to a spin-polarized current. Here, we present experimental evidence that a quantum point contact—a short wire—made from a semiconductor with high intrinsic spin–orbit coupling can generate a completely spin-polarized current when its lateral confinement is made highly asymmetric. By avoiding the use of ferromagnetic contacts or external magnetic fields, such quantum point contacts may make feasible the development of a variety of semiconductor spintronic devices.


17 pages of supplemental information

Nanoscale and Quantum Phenomena Institute (NQPI) Web site of Ohio University

Long carbon fibers could improve blast resistance of concrete structures


Long, coated carbon fibers, like those pictured in Volz's left hand, could significantly improve a structure's ability to withstand blasts, hurricanes and other natural disasters. In his right hand are short, uncoated fibers, which resemble clumps of human hair.

Researchers at Missouri University of Science and Technology have received $567,000 to explore how adding carbon fibers could improve the blast and impact resistance of conventional reinforced concrete.

Today short carbon fibers - measuring no more than 1.5 inches - are found in buildings, bridges and slabs to limit the size of cracks. But in the future, Volz says the carbon fibers could be up to 6 inches in length, significantly improving a structure's ability to withstand blasts, hurricanes and other natural disasters.

"The long fibers will absorb more energy as they pull-out during the pressure wave or impact, cutting down on the potential for failure during an explosion or earthquake," Volz explains. "The fibers will also significantly diminish secondary fragmentation, reducing one of the leading causes of damage to surrounding personnel and materials. First responders will be able to get to the scene faster because they won't have to clear chunks of concrete out of their way."



Previous efforts by other researchers to incorporate longer carbon fibers have failed for two reasons. First, longer carbon fibers are more likely to ball up as the concrete is mixed. Second, it's difficult to disperse the carbon fibers throughout the concrete.

Coating the fibers can reduce their tendency to ball up. The team plans to study a variety of formulas to find a coating that balances flexibility and rigidity.

In addition, the team plans to study how a negative electric charge, applied to a polymer coating, could force the fibers to disperse more uniformly during mixing.


Carnival of Space 125

Carnival of Space 125 is up at Orbiting Frog.

Superconducting magnets are reaching 30-35 tesla now and appear on track to reach 60-70 tesla. This would make it possible to test a proposed superpropulsion theory in which the magnets are used to shunt a craft into hyperdrive.

The basic concept is this: according to the paper's authors - Jochem Häuser, a physicist and professor of computer science at the University of Applied Sciences in Salzgitter and Walter Dröscher, a retired Austrian patent officer - if you put a huge rotating ring above a superconducting coil and pump enough current through the coil, the resulting large magnetic field will "reduce the gravitational pull on the ring to the point where it floats free".

The origins of this "repulsive anti-gravity force" and the hyperdrive it might power lie in the work of German scientist Burkhard Heim, who - as part of his attempts to reconcile quantum mechanics and Einstein's general theory of relativity - formulated a theoretical six-dimensional universe by bolting two new sub-dimensions onto Einstein's generally accepted four (three of space, one of time).


Adam Crowl of Crowlspace looks at several interesting physics papers that consider attoscale blackholes for powering spaceships and antimatter propulsion. This site will follow up with a detailed examination of those research papers.



Check out the Carnival of Space 125 at Orbiting Frog for a lot more articles on astronomy, telescopes, space missions and more.


Nanostructured Nickel Magnesium Oxide Will Enable One Terabyte Computer Chip and 80 mpg Cars

North Carolina State University engineers have created a new material that would allow a fingernail-size computer chip to store the equivalent of 20 high-definition DVDs or 250 million pages of text, far exceeding the storage capacities of today’s computer memory systems. (H/T Brett Coalson)

The engineers made their breakthrough using the process of selective doping, in which an impurity is added to a material to change its properties. The process also shows promise for boosting vehicles' fuel economy and reducing heat produced by semiconductors, a potentially important development for more efficient energy production.

Working at the nanometer level — a pinhead has a diameter of 1 million nanometers — the engineers added metal nickel to magnesium oxide, a ceramic. The resulting material contained clusters of nickel atoms no bigger than 10 square nanometers, a 90 percent size reduction compared to today’s techniques and an advancement that could boost computer storage capacity.

“Instead of making a chip that stores 20 gigabytes, you have one that can handle one terabyte, or 50 times more data,” Narayan says.
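
The capacity comparison in that quote is straightforward arithmetic (assuming the decimal convention of 1,000 GB per terabyte):

# Quick check of the storage comparison quoted above.
gb_per_tb = 1000   # decimal convention assumed
print(f"1 TB vs 20 GB: {gb_per_tb / 20:.0f}x more data")   # prints 50x, matching the quote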

By introducing metallic properties into ceramics, Narayan says engineers could develop a new generation of ceramic engines able to withstand twice the temperatures of normal engines and achieve fuel economy of 80 miles per gallon. And since the thermal conductivity of the material would be improved, the technique could also have applications in harnessing alternative energy sources like solar energy.




The engineers’ discovery also advances knowledge in the emerging field of “spintronics,” which is dedicated to harnessing energy produced by the spinning of electrons. Most energy used today is harnessed through the movement of current and is limited by the amount of heat that it produces, but the energy created by the spinning of electrons produces no heat. The NC State engineers were able to manipulate the nanomaterial so the electrons’ spin within the material could be controlled, which could prove valuable to harnessing the electrons’ energy. The finding could be important for engineers working to produce more efficient semiconductors.

Working with Narayan on the study were Dr. Sudhakar Nori, a research associate at NC State, Shankar Ramachandran, a former NC State graduate student, and J.T. Prater, an adjunct professor of materials science and engineering. Their findings are published as “The Synthesis and Magnetic Properties of a Nanostructured Ni-MgO System,” which appeared in the June edition of JOM, the journal of the Minerals, Metals and Materials Society. The research was sponsored by the National Science Foundation.


FURTHER READING

The synthesis and magnetic properties of a nanostructured Ni-MgO system

We have investigated the magnetic properties of the Ni-MgO system with an Ni concentration of 0.5 at.%. In as-grown crystals, Ni ions occupy substitutional Mg sites. Under these conditions the Ni-MgO system behaves as a perfect paramagnet. By using a controlled annealing treatment in a reducing atmosphere, we were able to induce clustering and form pure Ni precipitates in the nanometer size range. The size distribution of precipitates or nanodots is varied by changing annealing time and temperature. Magnetic properties of specimens ranging from perfect paramagnetic to ferromagnetic characteristics have been studied systematically to establish structure-property correlations. The spontaneous magnetization data for the samples, where Ni was precipitated randomly in the MgO host, fits well to Bloch's T^(3/2) law and has been explained within the framework of spin wave theory predictions.


1 page pdf preview of the full article


China, Taiwan Free Trade Pact Talks

A free-trade agreement between China and Taiwan could boost China's gross domestic product growth by 0.36-0.4 percentage points, and by 0.63-0.67 percentage points after the members of the ASEAN+3 trade bloc begin removing import tariffs, which could happen as soon as 2010.

Taiwan's annual gross domestic product (GDP) would increase by an estimated 1.72 percent if Taiwan and China sign an economic cooperation framework agreement (ECFA), according to the Ministry of Economic Affairs.

The Ministry of Economic Affairs used the standard GTAP Model, a global computable general equilibrium model that is widely used by major countries around the world in measuring the impact of forming a free trade area with major trading partners.


Taiwan hopes to sign an economic cooperation framework agreement (ECFA), similar to a free trade agreement, with China as soon as possible, Minister of Economic Affairs Shih Yen-shiang said Thursday, Oct 15, 2009.

"We expect the proposed ECFA accord can be signed next year, but if the goal can be reached earlier, it would be more than welcome, " Shih said, while fielding questions at a Legislative Yuan committee meeting.

Officials from both sides of the Taiwan Strait will hold a fourth round of informal talks on the ECFA deal late this month and formal talks are expected to get underway later this year.


Taiwan has been pushing for the trade agreement with China, its largest export market, fearing it could be marginalized when the ASEAN+3 bloc starts removing the tariffs.


Taiwan's Ministry of Economic Affairs (MOEA) will handle the issue of naming the economic cooperation agreement which the island hopes to sign with China, said Mainland Affairs Council Deputy Chairman Liu Te-shun.

Taiwan has called it an "economic cooperation framework agreement (ECFA)" while China has called it a "cross-strait economic cooperation agreement", without the word framework.




The Washington Post reported in Feb, 2009 on the beginning of free trade talks between China and Taiwan


If Taiwan fails to strike an ECFA deal with China, its economic growth rate could fall by 0.176 percentage points after the free trade agreement between China and ASEAN countries takes effect next year.

Once "ASEAN plus 3" becomes a reality, with which China, Japan and South Korea forming a free trade area with ASEAN countries, Taiwan's annual GDP growth rate might see an even bigger drop of 0.836 percent, the CIER report conjectured.

As to how the ministry will promote the ECFA project, Yiin said the proposed pact will only outline negotiation goals and timeframes for reaching each policy goal or cooperation project in a gradual manner in order to minimize any possible negative impact on local industries.