October 10, 2009

Coded Programmable Magnets


Inventor Larry Fullerton applies signal correlation methods and coding theory to magnetism to precisely control magnetic fields. Coded magnetic structures correlate to produce stronger bonding force, programmable precision alignment, and deterministic magnetic field interactions, which promise to accelerate product performance and innovation.

Coded magnetic structures can be designed to deliver precise holding strength characteristics, customized release behavior, prescribed alignment tolerances and even unique identities that can discriminate among other programmed magnets and determine which devices will interact. Rare-earth materials, ferrites and electromagnets alike can be programmed using one-, two- or three-dimensional arrays of magnetic elements that alternate polarities in a prescribed spatial pattern.

“We applied signal correlation methods that are well understood and widely used in radio communications today,” said company founder and Chief Scientist Larry Fullerton. “By alternating the polarity of individual magnetic elements on the magnet surface, we can alter the shape and density of the magnetic field.

“I initially programmed a pair of correlated magnets to produce a peak attractive force at one alignment and one alignment only. What this means in practical terms is that two very strong magnets will lock together in one particular alignment, but then can be easily released by twisting the patterns away from the correlated position,” Fullerton said.
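
A quick way to see the one-alignment lock is to treat each magnet face as a row of +1/-1 poles and correlate the two patterns at every relative shift. Here is a minimal NumPy sketch, using a Barker-7 sequence purely as an illustrative low-autocorrelation code (not necessarily any of Fullerton's actual patterns):

import numpy as np

# One row of magnetic elements, poles encoded as +1 (north up) / -1 (south up).
# Barker-7 is a classic low-autocorrelation code from radar and communications.
code = np.array([+1, +1, +1, -1, -1, +1, -1])

# Net attraction between two identically coded faces at each lateral shift is
# roughly proportional to the correlation of the overlapping elements.
forces = np.correlate(code, code, mode="full")
print(forces)
# [-1  0 -1  0 -1  0  7  0 -1  0 -1  0 -1]
# Full-strength lock at exactly one alignment; sliding one element away drops
# the net force to roughly zero, which is why a small twist releases the pair.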

Coded magnetic devices also offer a dramatic improvement in safety for applications involving strong industrial magnets, because the engagement distance or “reach” can be precisely controlled.

For example, a coded magnet strong enough to lift a large metal cargo container won’t attract metal until it’s within inches of its intended target. Door locks and other hardware can be programmed so pacemakers and credit cards are not affected.
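
The controllable reach falls out of the same coding idea: a face whose polarity alternates produces fields that largely cancel at range, so the field collapses within a few element spacings. A point-dipole sketch of that trend (idealized geometry and arbitrary units, not any vendor's field solver):

import numpy as np

# Row of N point dipoles with spacing s; moments point along z.
# "uniform" = all north-up; "coded" = the simplest code, alternating polarity.
N, s = 8, 1.0
xs = np.arange(N) * s
uniform = np.ones(N)
coded = (-1.0) ** np.arange(N)

def bz(signs, z, x0=0.0):
    """Axial field component at height z above point x0 (point-dipole sum,
    constants dropped -- only the falloff trend matters here)."""
    dx = xs - x0
    r = np.sqrt(dx ** 2 + z ** 2)
    cos2 = (z / r) ** 2
    return float(np.sum(signs * (3 * cos2 - 1) / r ** 3))

for z in (1.0, 2.0, 4.0, 8.0):
    print(f"z = {z:3.0f}*s   uniform: {bz(uniform, z):+8.4f}   "
          f"coded: {bz(coded, z):+8.4f}")
# The coded row's field dies off within a few element spacings while the
# uniform row's field persists: same material, much shorter "reach".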

By using programmed magnets, designers can increase magnet performance, or decrease the size and weight required to achieve a particular design objective. This realization comes at a time when the cost of magnetic materials is increasing rapidly, and the availability of rare-earth materials is becoming less reliable.

Nvidia Fermi, AMD Radeon and Intel Larrabee

IEEE Spectrum predicted that Intel's Larrabee chip would be a technological winner.

We believe that Larrabee is a winner—because of the one feature that it alone offers: C++ programmability. The first test chips came out only a few weeks ago (January 2009), and the product won’t reach store shelves until late 2009.


Intel's Larrabee looks to be 6-12 months later than planned, and Nvidia's next-generation GPU, Fermi, will be out with C++ support.

The last release date Intel stated was either late this year or early 2010. That no longer seems likely.

"I never thought they were on time and I don't think they are on track," said Jim McGregor, chief technology analyst with In-Stat. "And I don't think they are going to make their goal. Their goal that Pat [Gelsinger, former senior vice president who headed the Larrabee project] said last year was if it can't compete on the highest end, they won't release it."

(May 2009) Intel VP of corporate technology Joseph Schultz mentioned that Intel is now looking at a release date in the first half of 2010, moved back from the company’s original late-2009 target.

The Top 10 Innovations in the New NVIDIA Fermi Architecture, and the Top 3 Next Challenges, by David Patterson, Director, Parallel Computing Research Laboratory (Par Lab), U.C. Berkeley

I believe the Fermi architecture is as big an architectural advance over G80 as G80 was over NV40. The combined result represents a giant step towards bringing GPUs into mainstream computing. The table previews my take on the Top 10 most important innovations in the new Fermi architecture. This list is from a computer architect’s perspective, as a user would surely rank performance higher. At the end of the paper, I offer 3 challenges on how to bring future GPUs even closer to mainstream computing, which the table also lists.

Past GPUs had a variety of different types of memories, each in its own address space. Although these could be used to achieve excellent performance, such architectures are problematic for programming languages that rely on pointers to any piece of data in memory, such as C, C++, and CUDA.

Fermi has rectified that problem by placing those separate memories—the local scratchpad memory, the graphics memory, and system memory—into a single 64-bit address space, thereby making it much easier to compile and run C and C++ programs on Fermi. Once again, PTX enabled this relatively dramatic architecture change without the legacy binary compatibility problems of mainstream computing.


21 page Nvidia white paper on Fermi

The implementation of a unified address space enables Fermi to support true C++ programs. In C++, all variables and functions reside in objects which are passed via pointers. PTX 2.0 makes it possible to use unified pointers to pass objects in any memory space, and Fermi’s hardware address translation unit automatically maps pointer references to the correct memory space. Fermi and the PTX 2.0 ISA also add support for C++ virtual functions, function pointers, and ‘new’ and ‘delete’ operators for dynamic object allocation and de-allocation. C++ exception handling operations ‘try’ and ‘catch’ are also supported.



Nvidia Fermi versus AMD Radeon

Fermi versus AMD Radeon: Who Wins, Who Loses in Supercomputing Applications?

While AMD and Nvidia battle for supremacy in the GPU computing market, there's one obvious loser, Intel. AMD's 5870 appeared on schedule. Although Nvidia's Fermi is late, its prior generation GTX 280 still has some life left in it. But Intel's many-core Larrabee is still a no-show, and the company's Larrabee demo at its recent Developers' Forum was universally regarded as brain-dead, if not an outright embarrassment. End users with high performance computational requirements previously filled their data centers with racks of x86 servers. They have now discovered and validated an alternative approach that requires fewer x86 CPUs, less power, and less space. GPU computing won't solve all the world's computing problems, but it will give users who buy their systems by the teraflop a new, more cost-effective alternative that will take some of the wind out of Intel's high performance computing sales.

H1N1 Swine Flu and Regular Flu Update


The CDC (Centers for Disease Control and Prevention) H1N1 site has the following information on the swine flu.

UPDATE: Journal of the American Medical Association studies of H1N1/swine flu deaths in Canada and Mexico.

From the Wall Street Journal: Anand Kumar, lead author of one of the studies and an ICU attending physician for the Winnipeg Regional Health Authority in Canada, said, "There's almost two diseases. Patients are either mildly ill or critically ill and require aggressive ICU care. There isn't that much of a middle ground." About 3.9% of the total reported flu cases in Canada over the period involved people who became extremely ill, and more people are getting sick and not reporting it.
END UPDATE

Total hospitalization rates for laboratory-confirmed influenza are higher than expected for this time of year for both adults and children. For children 5-17 and adults 18-49 years of age, hospitalization rates from April through October 2009 already exceed average rates for a full flu season (October through April).

The proportion of deaths attributed to pneumonia and influenza (P&I) based on the 122 Cities Report has increased and now exceeds what is normally expected at this time of year. In addition, 19 flu-related pediatric deaths were reported this week; 16 of these deaths were confirmed 2009 H1N1 and 3 were unsubtyped influenza A and likely to be 2009 H1N1. A total of 76 laboratory confirmed 2009 H1N1 pediatric deaths have been reported to CDC since April.

Around the World Almost All Flu is H1N1

Light blue is H1N1 and dark purple is other kinds of flu.

According to WHO, the majority of 2009 H1N1 influenza isolates tested worldwide remain sensitive to oseltamivir, an antiviral medicine used to treat influenza disease. Only 31 2009 H1N1 isolates tested worldwide have been found to be resistant to oseltamivir – 12 of these isolates were detected in the United States.

Flu Vaccine Situation

This week (per an October 6, 2009 press briefing), the flu vaccine became available in the intranasal variety. Next week, it will become available in the injectable variety. The first vaccinations were given yesterday, with priority on health care workers and children, as well as people who care for infants. FluMist can only be used by people age 2 to 49 who do not have an underlying health problem. With the production of this strain, no corners were cut: this flu vaccine is made as flu vaccine is made each year, by the same companies, in the same production facilities, with the same procedures and the same safety safeguards. Literally hundreds of millions of people have been vaccinated against flu with vaccine made in this way.




What to Do if you Get Sick

How do I know if I have the flu?


You may have the flu if you have some or all of these symptoms:

fever*
cough
sore throat
runny or stuffy nose
body aches
headache
chills
fatigue
sometimes diarrhea and vomiting
*It’s important to note that not everyone with flu will have a fever.

What should I do if I get sick?
If you get sick with flu-like symptoms this flu season, you should stay home and avoid contact with other people except to get medical care. Most people with 2009 H1N1 have had mild illness and have not needed medical care or antiviral drugs and the same is true of seasonal flu.

However, some people are more likely to get flu complications and they should talk to a health care provider about whether they need to be examined if they get flu symptoms this season. They are:

Children younger than 5, but especially children younger than 2 years old
People 65 and older
Pregnant women
People who have:
Cancer
Blood disorders (including sickle cell disease)
Chronic lung disease [including asthma or chronic obstructive pulmonary disease (COPD)]
Diabetes
Heart disease
Kidney disorders
Liver disorders
Neurological disorders (including nervous system, brain or spinal cord)
Neuromuscular disorders (including muscular dystrophy and multiple sclerosis)
Weakened immune systems (including people with AIDS)
Also, it’s possible for healthy people to develop severe illness from the flu so anyone concerned about their illness should consult a health care provider.

There are emergency warning signs. Anyone who has them should get medical care right away.

What are the emergency warning signs?
In children

Fast breathing or trouble breathing
Bluish skin color
Not drinking enough fluids
Not waking up or not interacting
Being so irritable that the child does not want to be held
Flu-like symptoms improve but then return with fever and worse cough
Fever with a rash
In adults

Difficulty breathing or shortness of breath
Pain or pressure in the chest or abdomen
Sudden dizziness
Confusion
Severe or persistent vomiting
Do I need to go to the emergency room if I am only a little sick?
No. The emergency room should be used for people who are very sick. You should not go to the emergency room if you are only mildly ill. If you have the emergency warning signs of flu sickness, you should go to the emergency room. If you get sick with flu symptoms and are at high risk of flu complications, or you are concerned about your illness, call your health care provider for advice. If you go to the emergency room and you are not sick with the flu, you may catch it from people who do have it.

Are there medicines to treat 2009 H1N1?
Yes. There are drugs your doctor may prescribe for treating both seasonal and 2009 H1N1 called “antiviral drugs.” These drugs can make you better faster and may also prevent serious complications. This flu season, antiviral drugs are being used mainly to treat people who are very sick, such as people who need to be hospitalized, and to treat sick people who are more likely to get serious flu complications. Your health care provider will decide whether antiviral drugs are needed to treat your illness. Remember, most people with 2009 H1N1 have had mild illness and have not needed medical care or antiviral drugs and the same is true of seasonal flu.

How long should I stay home if I’m sick?
CDC recommends that you stay home for at least 24 hours after your fever is gone except to get medical care or for other things you have to do and no one else can do for you. (Your fever should be gone without the use of a fever-reducing medicine, such as Tylenol®.) You should stay home from work, school, travel, shopping, social events, and public gatherings.

What should I do while I’m sick?
Stay away from others as much as possible to keep from making them sick. If you must leave home, for example to get medical care, wear a facemask if you have one, or cover coughs and sneezes with a tissue. And wash your hands often to keep from spreading flu to others. CDC has information on “Taking Care of a Sick Person in Your Home” on its website.

Homecare guidance

Wash hands thoroughly with soap.

* check with their health care provider about any special care they might need if they are pregnant or have a health condition such as diabetes, heart disease, asthma, or emphysema
* check with their health care provider about whether they should take antiviral medications
* keep away from others as much as possible. This is to keep from making others sick. Do not go to work or school while ill
* stay home for at least 24 hours after fever is gone, except to seek medical care or for other necessities. (Fever should be gone without the use of a fever-reducing medicine.)
* get plenty of rest
* drink clear fluids (such as water, broth, sports drinks, electrolyte beverages for infants) to keep from being dehydrated
* cover coughs and sneezes. Wash hands often with soap and water. If soap and water are not available, use an alcohol-based hand rub.*
* wear a facemask – if available and tolerable – when sharing common spaces with other household members to help prevent spreading the virus to others. This is especially important if other household members are at high risk for complications from influenza. For more information, see the Interim Recommendations for Facemask and Respirator Use
* be watchful for emergency warning signs (see below) that might indicate you need to seek medical attention.


Get medical care right away if the sick person at home:

* has difficulty breathing or chest pain
* has purple or blue discoloration of the lips
* is vomiting and unable to keep liquids down
* has signs of dehydration such as dizziness when standing, absence of urination, or in infants, a lack of tears when they cry
* has seizures (for example, uncontrolled convulsions)
* is less responsive than normal or becomes confused

October 09, 2009

Tom Craver Idea - Earth to Mars in 20 Days Each Way



Recently there has been discussion of the near-term future of a space-rated VASIMR plasma rocket, which can be scaled up to several megawatts to enable 39-day (one-way) trips to Mars. This site has looked at combining a small nuclear reactor that has a commercialization target of 2013 with the VASIMR, which is also expected to be tested in space in 2013. Both would need a few more years of work and production, but a combined nuclear-VASIMR is technically possible before 2020.

Nextbigfuture reader Tom Craver had the idea of using VASIMR to accelerate and decelerate an Earth-Mars Transit Habitat:

VASIMR could accelerate a transit hab and tanks of rocket fuel on an orbit that swings back past Earth. A light and fast crew shuttle (too cramped for long trips) would rendezvous. At Mars the cramped crew shuttle would be refueled and the crew would decelerate into orbit. This should cut transit time roughly in half again (to about 20 days), because the hab would be mostly up to speed by the time the crew boarded, and the ship wouldn't slow down nearly as much as it approaches Mars.


UPDATE: Will need to look at this more closely based on orbits, as well as acceleration, to see how much the low-speed portion of the trip is reduced with a relay system of vehicles. A powerful VASIMR has pretty strong low-gear acceleration, so there may not be as much benefit from this relay transfer system.
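
For a rough feel of the numbers: a flip-and-burn trip at constant acceleration a over straight-line distance d takes t = 2*sqrt(d/a), and the slow part is short, since the first half of the acceleration leg (all speeds below half of peak) covers only 1/8 of the total distance. A sketch under those simplifying assumptions (straight line, constant thrust, an assumed distance of 0.7 AU):

import math

AU = 1.496e11   # meters
d = 0.7 * AU    # assumed straight-line Earth-Mars distance; it varies widely

def flip_and_burn(distance, accel):
    """One-way time under constant thrust: accelerate to the midpoint,
    flip, then decelerate; t = 2*sqrt(d/a)."""
    t = 2.0 * math.sqrt(distance / accel)
    v_peak = accel * t / 2.0
    return t, v_peak

for a in (0.01, 0.035, 0.10):   # m/s^2 -- milli-g-class thrust levels
    t, v = flip_and_burn(d, a)
    print(f"a = {a:5.3f} m/s^2 -> {t / 86400:5.1f} days, peak {v / 1000:5.1f} km/s")

# a = 0.035 m/s^2 reproduces the ~40-day one-way figure. The first half of
# the acceleration leg (all speeds below half of peak) covers only 1/8 of
# the distance, which caps how much a boarding-in-flight relay can save.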

There was a 1994 NASA study (65 page pdf), "Ion engine propelled Earth-Mars cycler with nuclear thermal propelled transfer vehicle, volume 2" by Meyer, Rudolf X.; Baker, Myles; and Melko, Joseph. However, that study was looking at far weaker ion engines, so the calculated trip times were slower than for a straight-up VASIMR.

The goal of this project was to perform a preliminary design of a long-term, reusable transportation system between Earth and Mars which would be capable of providing both artificial gravity and shelter from solar flare radiation. The heart of this system was assumed to be a Cycler spacecraft propelled by an ion propulsion system. The crew transfer vehicle was designed to be propelled by a nuclear-thermal propulsion system. Several Mars transportation system architectures and their associated space vehicles were designed.


There was an eight page article on the various near-term ways to get from Earth to Mars.

Buzz Aldrin has been advocating an Earth-Mars Cycler that takes 5 months for each one-way trip between Earth and Mars. Tom is suggesting speeding up the cycler with plasma rockets (and also using plasma rockets to decelerate for transfers) rather than depending on gravity alone.

A cycler trajectory is a special kind of spacecraft trajectory that encounters two or more bodies on a regular basis. Cyclers are potentially useful for transporting people or materials between those bodies using little or no propellant. Instead, they rely on gravity assist maneuvers to keep them going.

Once we have nuclear powered VASIMR spaceships, we can pre-deliver well shielded (more massive) habitats to human destinations, such as Mars orbit. So we don't need to take the risk and expense of going to Mars surface just for exploration - we can stay in orbit and use remote presence robots to "virtually" visit and explore Mars surface.

Analysis of Various Two Synodic Period Earth-Mars Cycler Trajectories (8 page pdf)

A megawatt-class VASIMR engine cluster (five 200 kW VASIMRs) for 2013

Artificial Intelligence, Brain Emulation and Singularity Analysis

Anders Sandberg discusses his view of the summit as a speaker and a participant. Anders provides a view into the follow-up discussions that occurred at lunch and other breaks. Anders also provides good analysis of the Artificial Intelligence talks from the perspective of someone in the field of AI and brain emulation. (H/T Michael Anissimov at Accelerating Future)

I of course talked about whole brain emulation, sketching out my usual arguments for how complex the undertaking is. Randall Koene presented more on the case for why we should go for it, and in an earlier meeting Kenneth Hayworth and Todd Huffman told us about some of the simply amazing progress on the scanning side. Ed Boyden described the amazing progress of optically controlled neurons. I can hardly wait to see what happens when this is combined with some of the scanning techniques. Stuart Hameroff of course thought we needed microtubule quantum processing; I had the fortune to participate in a lunch discussion with him and Max Tegmark on this. I think Stuart's model suffers from the problem that it seems to just explain global gamma synchrony; the quantum part doesn't seem to do any heavy lifting. Overall, among the local neuroscientists there was some discussion about how many people in the singularity community make rather bold claims about neuroscience that are not well supported; even emulation enthusiasts like me get worried when the auditory system just gets reduced to a signal processing pipeline.

Stephen Wolfram and Gregory Benford talked about the singularity and especially about what can be "mined" from the realm of simple computational structures ("some of these universes are complete losers"). During dinner this evolved into an interesting discussion with Robin Hanson about whether we should expect future civilizations to look just like rocks (computronium), especially since the principle of computational equivalence seems to suggest that there might not be any fundamental difference between normal rocks and posthuman rocks. There is also the issue of whether we will become very rich (Wolfram's position) or relatively poor posthumans (Robin's position); this depends on the level of possible coordination.

During the workshop afterwards we discussed a wide range of topics. Some of the major issues were: what are the limiting factors of intelligence explosions? What are the factual grounds for disagreeing about whether the singularity may be local (self-improving AI program in a cellar) or global (self-improving global economy)? Will uploads or AGI come first? Can we do anything to influence this?

One surprising discovery was that we largely agreed that a singularity due to emulated people (as in Robin's economic scenarios) has a better chance, given current knowledge, of being human-friendly than AGI does. After all, it is based on emulated humans and is likely to be a broad institutional and economic transition. So until we think we have a perfect friendliness theory we should support WBE - because we could not reach any useful consensus on whether AGI or WBE would come first. WBE has a somewhat measurable timescale, while AGI might crop up at any time. There are feedbacks between them, making it likely that if both happen they will happen closely together, but no drivers seem to be strong enough to really push one further into the future. This means that we ought to push for WBE, but work hard on friendly AGI just in case. There were some discussions about whether supplying AI researchers with heroin and philosophers to discuss with would reduce risks.



J Storrs Hall analysis of the Singularity Summit and topics of the summit.
Riffing on robocars
Robocars would save us a trillion dollars of wasted time with the current amount of driving, and they are likely to enable more than a trillion dollars of totally new transportation. And that’s a stimulus that would actually work.


Why we need Artificial General Intelligence as soon as possible.

One of the best arguments for developing AI as fast as possible and putting it into use in the real world without delay: humans making these decisions are messing up big time. We don’t need superintelligence to do better, just human-level perception combined with rational decision-making — rational decision-making, I might add, that we already know how to do and believe and understand is the right way to do it, but just don’t bother to use for most of our decisions. It’s a low bar.

“AI — when and how?”

I claim, though, that we do have an existence proof for superintelligence: it’s not humans, but human societies. Put a thousand (emulated) brains in a box, and crank up the clock speed to whatever you can. Build in all the communications substrate they might need, and turn them loose. You can try different forms or internal organization — literally, try them, experimentally — and give the internal brains the ability to mate electronically, have children, teach them in various ways. Some forms of human organization, for example the scientific community over the past 500 years, have clearly demonstrated the ability to grow in knowledge and capability at an exponential rate.

In what way could you argue such a box would not be a superintelligence? Indeed, some very smart people such as Marvin Minsky believe that this is pretty much the way our minds already work. And yet this “Society of Minds” would be a model we intuitively understand. And it would help us understand that, in a sense, we have already constructed superintelligent machines.




The real question isn’t whether people are stupid. The real question is whether people make decisions that matter a lot incorrectly.

We’ve replaced kings — human beings — with artificial rule-based decision procedures based on vote-counting and other random esoterica. Likewise the governance of large business enterprises. We don't need friendliness in markets or politics, we need competence.


Other Reviews and Analysis
Ronald Bailey has an analysis and synthesis of the whole Singularity Conference

Peter Thiel began his talk on the economics of the singularity by asking the audience to vote on which of seven scenarios they are most worried about. (See Reason's interview with Thiel here.) The totals below are my estimates from watching the audience as they raised their hands:

A. Singularity happens and robots kill us all, the Skynet scenario, (5 percent)
B. Biotech terrorism using something more virulent than smallpox and Ebola combined (30 percent)
C. Nanotech grey goo escapes and eats up all organic matter (5 percent)
D. Israel and Iran engage in a thermonuclear war that goes global (25 percent)
E. A one-world totalitarian state arises (10 percent)
F. Runaway global warming (5 percent)
G. The singularity takes too long to happen (30 percent)

Thiel argued that the last one—that the singularity is going to take too long to happen—is what worries him.


Dresden Codak's review of the Singularity Summit

Five Year Project To Make Nanostructured Boron Doped Superconducting Wires


Wires made up of yarns spun from millions of carbon nanotube bundles may help make superconductivity practical.

Researchers from UT Dallas, Clemson University and Yale University are using science on the nanoscale to address one of the most elusive challenges in physics—the discovery of room temperature superconductivity. With that as the ultimate goal, the team is working to develop superconducting wires made from nanotubes that carry high currents at the temperature of liquid nitrogen, or higher.

With a $3 million research grant from the Air Force Office of Scientific Research (AFOSR), the team has embarked on a five-year project to invent new superconducting wires based on highly engineered nanomaterials, each component thousands of times smaller than a human hair. Such wires would be used for applications ranging from magnets for magnetic resonance imaging (MRI) to replacing energy-wasting copper in power transmission lines.

“Modern high-temperature superconducting materials are too brittle, expensive and deficient in electronic properties for wide-scale application. We hope to overcome those limitations by fabricating wires from nanotubes, using carbon nanotubes or other nanotubes enhanced by atoms like boron, nitrogen or sulfur.”

According to Zakhidov, who is a professor of physics, as much as 30 percent of electrical energy can be lost as heat when electricity travels through power lines. Superconducting materials promise enormous environmental and energy savings.

“Making superconducting wires and cables from nanofibers and nanoparticles presents special challenges that go beyond the discovery of new superconductors,” Baughman said. “For example, for each pound of superconducting wire, it may be necessary to assemble more than 3 billion miles of individual nanotubes—and the goal is to achieve this assembly at commercially useful rates. For this task, we are inventing radically new methods for making superconducting wires.”
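
Baughman's figure is easy to sanity-check from the mass of a single tube. A rough estimate, assuming a ~1.4 nm single-wall tube whose wall has graphene's areal mass density (about 0.77 mg per square meter):

import math

# Assumptions: single-wall carbon nanotube, ~1.4 nm diameter, wall with
# graphene's areal mass density.
sigma = 7.6e-7                       # kg/m^2, graphene areal density
d_tube = 1.4e-9                      # m, tube diameter
mu = sigma * math.pi * d_tube        # kg per meter of one tube

pound = 0.4536                       # kg
miles = (pound / mu) / 1609.34

print(f"one tube: {mu:.2e} kg/m -> {miles:.1e} miles of tube per pound")
# ~8e10 miles -- tens of billions, so "more than 3 billion miles" is, if
# anything, conservative for single-wall tubes.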

Dr. Lisa Pfefferle, professor of chemical engineering at Yale University and member of the research team, is experimenting with new types of nanofibers that have been synthesized by her team using elements like boron.

Team member Dr. Apparao Rao, professor of physics at Clemson University, has already produced superconducting nanotubes by a process called pulsed laser ablation. The process results in carbon nanotubes “doped” with boron that superconduct at higher temperatures than other carbon based materials—but still at relatively low temperatures.



Dr. Myron Salamon, dean of the School of Natural Sciences and Mathematics, will evaluate the team’s new superconductors to test the maximum temperature of superconductivity as a function of current and power transmitted, which is a crucial factor for using these materials in power systems.

“There’s always been a sense that we can enhance superconductivity by using lighter materials,” Salamon said. “Wires made from ultra-light nanotubes can allow atoms to vibrate easily, which helps with superconductivity. There’s good evidence that carbon-based materials, like dopant modified carbon nanotubes, might make good superconductors.”


Atomtronic Circuits of Diodes and Transistors

Atomtronics has the goal of developing a one-to-one analogy of electronic systems, components and devices with ultracold atoms trapped in optical lattices. It is being researched at the University of Colorado by the Anderson Group of Optical Physics.

Their atom-optical analogy to electronic circuits begins with the definition of the `atomtronic battery', which is composed of two reservoirs of ultracold atoms having different chemical potentials (corresponding to different electric potentials at the terminals of a conventional battery). The `wires' and atomtronic components are composed of optical lattices, and current refers to the number of atoms that pass a specific point in a given amount of time.

Atomtronic Diode
The atomtronic diode is a device that allows an atomic flux to flow across it in essentially only one direction. It is made by adding a potential step, which emulates a semiconductor junction (the boundary between p-type and n-type solid-state materials), to an energetically flat optical lattice.


Atomtronic analogy to a simple diode circuit. The atomtronic analogy of a diode formed from the joining of p-type and n-type semiconductor materials. Electrons are replaced by ultracold atoms, the battery is replaced by high and low chemical potential reservoirs, and the metallic crystal lattices (the microscopic medium that the electrons traverse) are replaced by an optical lattice. The atomtronic diode is achieved by energetically shifting one half of the optical lattice with respect to the other.
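
A toy rate model reproduces the diode's one-way behavior described above: an ultracold cloud has essentially no thermal energy, so atoms hop freely "down" the step but almost never "up" it. This is only a caricature with made-up rates, not the quantum master equation the group actually derives:

import math

GAMMA = 1.0   # bare hop rate, arbitrary units (assumed, not from the paper)
STEP = 5.0    # potential-step height in units of the cloud's thermal energy

def atom_current(bias):
    """Net atom flux for a chemical-potential bias across the step.
    Forward bias pushes atoms down the step (full rate); reverse bias
    asks them to climb it, which costs energy a cold cloud doesn't have,
    so that rate carries a Boltzmann suppression factor."""
    if bias >= 0:
        return GAMMA * bias
    return GAMMA * bias * math.exp(-STEP)   # exp(-STEP/kT) with kT = 1

for mu in (-3.0, -2.0, -1.0, 1.0, 2.0, 3.0):
    print(f"bias {mu:+.1f} -> current {atom_current(mu):+.5f}")
# Forward current grows linearly; reverse current is suppressed by ~e^-5,
# i.e. the lattice passes atomic flux in essentially one direction.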



The atomtronic transistor
The desired function of an atomtronic transistor is to enable a weak atomtronic current to be amplified, or to switch, either on or off, a much larger one. Transistor action requires at least three lattice sites connected to three independent reservoirs. The resonance condition for this device is found to be an extension of the diode case to account for the third well: the left external energy is shifted above the middle site by the on-site interaction energy and is of equal energy to that of the right site.


Dynamics of the atomtronic transistor. (a) A cartoon of the atomtronic transistor as a three-well system, where each well is connected to its own independent reservoir. (b) An energy schematic of the relevant states of the system under the assumed resonance condition. In both illustrated cases, there is a fixed chemical potential difference across the system. In case 1, the middle chemical potential maintains an occupancy of zero particles on the middle site and most of the population remains on the left site. In case 2, the base potential is raised to put one particle on the middle site. This triggers two competing cycles that, given weak coupling of the middle reservoir, cause an avalanche of current to flow across the system. (c) An exact calculation of the current responses of the atomtronic transistor. The middle reservoir here has one-tenth the coupling strength of the left and right reservoirs. For a fixed chemical potential difference across the device, we vary the middle potential and record the response of currents leaving the system from both the right site (blue) and out of the middle site (red). The differential current gain for this specific system is both large and essentially linear.

Atomtronic Circuits of Diodes and Transistors in Physical Review Letters

From Physorg: Atomtronics probably won’t replace electronics. “Atoms are sluggish compared to electrons, and that means that you probably won’t see atomtronics replace current electronic devices. What atomtronics might be useful for is the field of quantum information.”

“The dynamics of our atomtronic devices would be coherent and potentially useful in quantum computing.” He also suggests that there is the possibility that atomtronics could be useful in obtaining sensitive measurements. At the very least, he concludes, “atomtronic systems provide a nice test of fundamental concepts in condensed matter physics.”

While these ideas have been modeled, they have yet to be built. Pepino says that an effort is under way to set up experiments that could provide a proof of principle for the work being done at JILA and the University of Colorado by experimental collaborator and co-author Dana Anderson.


We illustrate that open quantum systems composed of neutral, ultracold atoms in one-dimensional optical lattices can exhibit behavior analogous to semiconductor electronic circuits. A correspondence is demonstrated for bosonic atoms, and the experimental requirements to realize these devices are established. The analysis follows from a derivation of a quantum master equation for this general class of open quantum systems.



Atomtronics: Ultracold-atom analogs of electronic devices (from 2007)

Atomtronics focuses on atom analogs of electronic materials, devices, and circuits. A strongly interacting ultracold Bose gas in a lattice potential is analogous to electrons in solid-state crystalline media. As a consequence of the gapped many-body energy spectrum, cold atoms in a lattice exhibit insulatorlike or conductorlike properties. P-type and N-type material analogs are created by introducing impurity sites into the lattice. Current through an atomtronic wire is generated by connecting the wire to an atomtronic battery which maintains the two contacts at different chemical potentials. The design of an atomtronic diode with a strongly asymmetric current-voltage curve exploits the existence of superfluid and insulating regimes in the phase diagram. The atom analog of a bipolar junction transistor exhibits large negative gain. Our results provide the building blocks for more advanced atomtronic devices and circuits such as amplifiers, oscillators, and fundamental logic gates.


Quantum Computer Algorithms for Exponentially Speeding the Solution of Linear and Differential Equations

In a paper appearing today in Physical Review Letters, however, MIT researchers present a new algorithm that could exponentially speed the solution of systems of linear equations, whose solution is crucial to image processing, video processing, signal processing, robot control, weather modeling, genetic analysis and population analysis, to name just a few applications.

Researchers at the University of London have already expanded on the MIT researchers' approach to develop a new quantum algorithm for solving differential equations. Early in their paper, they describe the MIT algorithm, then say, "This promises to allow the solution of, e.g., vast engineering problems. This result is inspirational in many ways and suggests that quantum computers may be good at solving more than linear equations."


Quantum Algorithm for Linear Systems of Equations by Aram W. Harrow, Avinatan Hassidim, and Seth Lloyd

Solving linear systems of equations is a common problem that arises both on its own and as a subroutine in more complex problems: given a matrix A and a vector b, find a vector x such that Ax = b. We consider the case where one does not need to know the solution x itself, but rather an approximation of the expectation value of some operator associated with x, e.g., x†Mx for some matrix M. In this case, when A is sparse, N×N and has condition number κ, the fastest known classical algorithms can find x and estimate x†Mx in time scaling roughly as N√κ. Here, we exhibit a quantum algorithm for estimating x†Mx whose runtime is a polynomial of log(N) and κ. Indeed, for small values of κ [i.e., polylog(N)], we prove (using some common complexity-theoretic assumptions) that any classical algorithm for this problem generically requires exponentially more time than our quantum algorithm.
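
To make the objects concrete, here is the classical baseline the abstract describes: solve the sparse system Ax = b outright, then form the scalar x†Mx. The quantum algorithm produces an estimate of that same scalar in time polylogarithmic in N, without ever writing out the full vector x. A toy NumPy/SciPy sketch with random matrices (sizes and values are arbitrary):

import numpy as np
from scipy.sparse import identity, random as sparse_random
from scipy.sparse.linalg import spsolve

rng = np.random.default_rng(0)
N = 200

# Sparse Hermitian A, made diagonally dominant so the condition number
# kappa stays small (the regime where the quantum speedup is largest).
A = sparse_random(N, N, density=0.02, random_state=0)
A = (A + A.T + 10.0 * identity(N)).tocsr()

b = rng.standard_normal(N)
M = np.diag(rng.standard_normal(N))   # the observable whose expectation we want

x = spsolve(A, b)                     # classical solve: cost grows with N
print("x† M x =", x @ M @ x)          # the scalar the quantum algorithm
                                      # estimates in polylog(N) time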


11 pages of supplemental information

In this supplementary material, we describe and analyze our algorithm in full detail. While the main paper attempted to convey the spirit of the procedure and left out various improvements, here we take the opposite approach and describe everything, albeit possibly in a less intuitive way. We also describe in more detail our reductions from non-Hermitian matrix inversion to Hermitian matrix inversion and from a general quantum computation to matrix inversion.

Greg Kuperberg, a mathematician at the University of California, Davis, who works on quantum algebra, says that the MIT algorithm "could be important," but that he's "not sure yet how important it will be or when." Kuperberg cautions that in applications that process empirical data, loading the data into quantum memory could be just as time consuming as extracting it would be. "If you have to spend a year loading in the data," he says, "it doesn't matter that you can then do this linear-algebra step in 10 seconds."

But Hassidim argues that there could be applications that allow time for data gathering but still require rapid calculation. For instance, to yield accurate results, a weather prediction model might require data from millions of sensors transmitted continuously over high-speed optical fibers for hours. Such quantities of data would have to be loaded into quantum memory, since they would overwhelm all the conventional storage in the world. Once all the data are in, however, the resulting forecast needs to be calculated immediately to be of any use.

Still, Hassidim concedes that no one has yet come up with a "killer app" for the algorithm.


Other Quantum Computer Research

Six Qubits Have Stable Entanglement



An entangled state of six photons could potentially carry quantum information over large distances and between different reference frames.

RELATED

Quantum Computer Algorithm Review

A Quantum algorithm for quantum simulation

Understanding the nature of quantum computer algorithms

A list of the efficient quantum algorithms (far faster than the best known algorithms for regular computers):
Shor's algorithm, an algorithm for factoring numbers, which is key to decryption
Solving Pell's equation, which is useful for approximating quantities like the square root of 2
Estimating certain Gauss sums
Solving hidden shift problems
Solving certain hidden subgroup problems

Every known quantum algorithm is built from Fourier transforms and classical computation.
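
That building block is concrete: the quantum Fourier transform on n qubits is just the unitary discrete Fourier transform applied to the vector of 2^n amplitudes. A small NumPy check of the equivalence:

import numpy as np

def qft_matrix(n_qubits):
    """Unitary matrix of the quantum Fourier transform on n qubits."""
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(3)                                   # 8x8, three qubits
print(np.allclose(F.conj().T @ F, np.eye(8)))       # unitary: True

state = np.zeros(8, dtype=complex)
state[5] = 1.0                                      # basis state |5>
# The QFT of an amplitude vector equals the ordinary (unitary-normalized)
# inverse DFT of that vector -- same transform, different sign convention.
print(np.allclose(F @ state, np.fft.ifft(state, norm="ortho")))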


A summary of Quantum computers

October 08, 2009

Better heart repair patches from stem cells and Closed Heart Surgery from Gene Therapy

University of Washington (UW) researchers have succeeded in engineering human tissue patches free of some problems that have stymied stem-cell repair for damaged hearts. Pre-formed blood vessels in patches connect to rodents' heart circulation.

The disk-shaped patches can be fabricated in sizes ranging from less than a millimeter to a half-inch in diameter. Until now, engineering tissue for heart repair has been hampered by cells dying at the transplant core, because nutrients and oxygen reached the edges of the patch but not the center. To make matters worse, the scaffolding materials to position the cells often proved to be harmful.

In contrast to the heart muscle cell-only tissue, which failed to survive transplantation and which remained apart from the rat's heart circulatory system, the pre-formed vessels in the mixed-cell tissue joined with the rat's heart circulatory system and delivered rat blood to the transplanted graft.

"The viability of the transplanted graft was remarkably improved," Murry observed. "We think the gain in viability is due to the ability for the tissue to form blood vessels."

Equally exciting, the scientists observed that the patches of engineered tissue actively contracted. Moreover, these contractions could be electronically paced, up to what would translate to 120 beats per minute. Beyond that point, the tissue patch didn't relax fully and the contractions weakened. However, the average resting adult heart beats about 70 times per minute. This suggests that the engineered tissue could, within limits, theoretically keep pace with typical adult heart muscle, according to the study authors.

Another physical quality that made the mixed-cell tissue patches superior to heart muscle-cell patches was their mechanical stiffness, which more closely resembled human heart muscle. This was probably due to the addition of supporting cells, which created connective tissues. Passive stiffness allows the heart to fill properly with blood before it contracts.

When the researchers implanted these mixed celled, pre-vascularized tissue patches into rodents, the patches grew into cell grafts that were ten times larger than the too-small results from tissue composed of heart muscle cells only. The rodents were bred without an immune system that rejects tissue transplants.

Murry noted that these results have significance beyond their contribution to the ongoing search for ways to treat heart attack damage by regenerating heart tissue with stem cells.

The study findings, he observed, suggest that researchers consider including blood vessel-generating and vascular-supporting elements when designing human tissues for certain other types of regenerative therapies unrelated to heart disease.


2. Scientists from the Universities of Michigan and Minnesota show in a research report published online in the FASEB Journal that gene therapy may be used to improve an ailing heart's ability to contract properly. [Scientists Jump-start The Heart By Gene Transfer]

In addition to showing gene therapy's potential for reversing the course of heart failure, it also offers a tantalizing glimpse of a day when "closed heart surgery" via gene therapy is as commonly prescribed as today's cocktail of drugs.



To make this advance, Herron and colleagues treated heart muscle cells from the failing hearts of rabbits and humans with a virus (adenovirus) modified to carry either a gene which produces a protein that enables heart cells to contract normally (a fast molecular motor) or a gene that becomes active in failing hearts and is believed to be part of the body's way of coping with its perilous situation (a slow molecular motor). Heart cells treated with the gene expressing the fast molecular motor contracted better, while those treated with the gene expressing the slow molecular motor were unaffected.

"Helping hearts heal themselves, rather than prescribing yet another drug to sustain a failing organ, would be a major advance for doctors and patients alike," said Gerald Weissmann, M.D., Editor-in-Chief of the FASEB Journal. "Equally important, it shows that gene therapy remains one of the most promising approaches to treating the world's most common and deadliest diseases."


Ca2+-independent positive molecular inotropy for failing rabbit and human cardiac muscle by α-myosin motor gene transfer

Current inotropic therapies used to increase cardiac contractility of the failing heart center on increasing the amount of calcium available for contraction, but their long-term use is associated with increased mortality due to fatal arrhythmias. Thus, there is a need to develop and explore novel inotropic therapies that can act via calcium-independent mechanisms. The purpose of this study was to determine whether fast α-myosin molecular motor gene transfer can confer calcium-independent positive inotropy in slow β-myosin-dominant rabbit and human failing ventricular myocytes. To this end, we generated a recombinant adenovirus (AdMYH6) to deliver the full-length human α-myosin gene to adult rabbit and human cardiac myocytes in vitro. Fast α-myosin motor expression was determined by Western blotting and immunocytochemical analysis and confocal imaging. In experiments using electrically stimulated myocytes from ischemic failing hearts, AdMYH6 increased the contractile amplitude of failing human [23.9±7.8 nm (n=10) vs. AdMYH6 amplitude 78.4±16.5 nm (n=6)] and rabbit myocytes. The intracellular calcium transient amplitude was not altered. Control experiments included the use of a green fluorescent protein or a β-myosin heavy chain adenovirus. Our data provide evidence for a novel form of calcium-independent positive inotropy in failing cardiac myocytes by fast α-myosin motor protein gene transfer.—Herron, T. J., Devaney, E., Mundada, L., Arden, E., Day, S., Guerrero-Serna, G., Turner, I., Westfall, M., Metzger, J. M. Ca2+-independent positive molecular inotropy for failing rabbit and human cardiac muscle by α-myosin motor gene transfer.


Genetically Engineered Stem Cells for Tissue Regeneration


Genetic engineering of human stem cells for enhanced angiogenesis using biodegradable polymeric nanoparticles (6 page pdf). Genetically engineered stem cells regrew blood vessels and helped salvage damaged limbs and tissue in mice.

This study suggests that stem cells transiently modified with biodegradable polymeric nanoparticles can promote therapeutic angiogenesis. This technology may facilitate engineering and regeneration of large masses of various tissues such as bone and muscle, as well as complex structures that encompass multiple tissue types. We further hypothesize that this approach could be useful in treating other types of ischemic diseases such as myocardial infarction and cerebral ischemia.


Stem cells hold great potential as cell-based therapies to promote vascularization and tissue regeneration. However, the use of stem cells alone to promote angiogenesis remains limited because of insufficient expression of angiogenic factors and low cell viability after transplantation. Here, we have developed vascular endothelial growth factor (VEGF) high-expressing, transiently modified stem cells for the purposes of promoting angiogenesis. Nonviral, biodegradable polymeric nanoparticles were developed to deliver hVEGF gene to human mesenchymal stem cells (hMSCs) and human embryonic stem cell-derived cells (hESdCs). Treated stem cells demonstrated markedly enhanced hVEGF production, cell viability, and engraftment into target tissues. S.c. implantation of scaffolds seeded with VEGF-expressing stem cells (hMSCs and hESdCs) led to 2- to 4-fold-higher vessel densities 2 weeks after implantation, compared with control cells or cells transfected with VEGF by using Lipofectamine 2000, a leading commercial reagent. Four weeks after intramuscular injection into mouse ischemic hindlimbs, genetically modified hMSCs substantially enhanced angiogenesis and limb salvage while reducing muscle degeneration and tissue fibrosis. These results indicate that stem cells engineered with biodegradable polymer nanoparticles may be therapeutic tools for vascularizing tissue constructs and treating ischemic disease.




8 pages of supporting information

Canadian and European Lab on a Chip Systems for Breast Cancer Detection and 1024 Reactions at Once for a Lab on a Chip

1. IMEC, a leading European research center in nanotechnology, has achieved a major milestone in the development of a lab-on-chip for the detection and therapy evaluation of breast cancer. This is the first time that a lab-on-chip system including many complex sample preparation steps and multiplexed detection was conceived and is being implemented. All modules for sample preprocessing and detection are ready for further miniaturization and integration in a single lab-on-chip platform. The system will be clinically validated in a breast cancer therapy study in Oslo.

The project partners developed a modular platform where each module has its specific task and autonomy and as such can also be used for many different medical applications. The first module is the incubation module performing the mixing of the blood sample with functionalized magnetic beads which specifically bind the tumor cells. The second module is used for tumor cell isolation and counting using a combination of dielectrophoresis and magnetic sensing with single cell sensitivity. In the third module, the amplification module, the cell wall of the tumor cells is destroyed and the genetic material (i.e. the mRNA) is extracted and amplified based on multiplex ligation dependent probe amplification (MLPA).

Within this module, specific assays amplify about 20 markers that are expressed in breast carcinoma cells. In the final detection module, the amplified genetic material is detected using an array of electrochemical sensors. The different building blocks have been developed and validated on spiked blood samples.


2. Lab-on-a-Chip Performs 1024 Chemical Reactions At Once

An integrated microfluidic device has been developed to perform 1024 in situ click chemistry reactions in parallel, using the bovine carbonic anhydrase II (bCAII) click chemistry system as a proof of concept, together with a rapid hit-identification approach using SPE purification and electrospray-ionization mass spectrometry with multiple reaction monitoring (MRM) analysis, all of which improves the sensitivity and throughput of the downstream analysis.

3. In a move to quicken detection for women at risk of breast cancer, Canadian researchers said they had developed a hormone testing technique that could eventually be used in a handheld device.

While the results are several years away from usage, the new "lab-on-a-chip" technique developed at the University of Toronto can analyze "tiny samples of blood and breast tissue to identify women at risk of breast cancer much more quickly than ever before," researchers said.



Clinical Trials for White blood cell infusions to treat cancer

The Wake Forest Clinical trial for granulocyte infusions to cure cancer was cancelled but a new South Florida clinical trial is proceeding. The clinical trials are to see if simple blood transfusions can transfer cancer immunity from people with strong cancer immunity to those without such strong immunity. This procedure has been shown to cure cancer in mice. A mouse is given a lot of cancer and then is cured of it with blood transfusions from super-immune mice.
Here is the link to the cancelled clinical trial.

South Florida Bone Marrow/Stem Cell Transplant Institute is sponsoring the white blood cell transfusion clinical trial. The phase I/II trial is running now.

South Florida Bone Marrow/Stem Cell Transplant Institute page on the granulocyte anti-cancer clinical trial

The South Florida Institute information page for cancer patients cites the Wake Forest work and history of the project.


This is the cancer treatment from Zheng Cui, which has been covered extensively at this site.
Update on the GIFT cancer treatment

White blood cells from a strain of cancer-resistant mice cured advanced cancers in ordinary laboratory mice, researchers at Wake Forest University School of Medicine reported.

At the SENS3 conference in September 2007, Dr. Cui presented the next logical step in his research: work demonstrating the existence of, and characterizing, high-potency cancer-killing granulocytes in humans. These same cancer-killing cells provide mice with immunity to cancer.

A Phase I/II Study For the Use of White Blood Cells From Healthy Donor-Participants To Treat Subjects With Solid Cancers

About 75% of the US population living today will not die of cancer. It is not uncommon for some people to remain cancer-free into their 80s and 90s, even if they are regularly exposed to environmental carcinogens such as air pollutants, cigarette smoke, etc. A frequently asked but unanswered question is why these individuals do not get cancer. There has been a recent report of a colony of cancer-resistant mice developed from a single male mouse that unexpectedly survived challenges of lethal cancer cell injections. In these so-called spontaneous regression/complete resistance (SR/CR) mice, cancer cells are killed by rapid infiltration of leukocytes, mainly of innate immunity. This highly effective natural cancer immunity is inherited and mediated entirely by white blood cells. Moreover, this cancer resistance can be transferred to wild type mice through the transfer of various immune cell types including granulocytes.

This observation raises the possibility that infusion of white blood cells, particularly cells of innate immunity, is a viable anticancer therapy in humans as well.

This proposed trial will test whether white blood cell infusions from healthy unrelated donors can be used to treat cancer. The trial is designed to determine whether responses can be seen in cancer patients after infusion of HLA-mismatched white cells from healthy donors. It is important that the donors and recipients be unrelated and HLA-mismatched to avoid the possibility of transfusion-related Graft vs. Host Disease.




The white blood cells from the healthy donors are being collected via apheresis following granulocyte mobilization with dexamethasone and filgrastim. The investigators will refer to the white blood cells as 'granulocytes' because 75-90% of the white blood cells collected through the apheresis will consist of granulocytes.

A dose of at least 2×10^11 granulocytes will be given from 4-5 donors at a rate of no more than one donor per day for each recipient. There will be only one infusion per day and no more than 5 infusions per week, though in many scenarios there may be only 3 infusion days per week. Thus, a typical treatment in the study would span 1-2 weeks, with up to a 4-day interval between the 3rd and 4th infusions. After each infusion, the patients will be monitored carefully for possible adverse events. If adverse events occur at any time point during or after an individual infusion, the treatment can be stopped until the adverse events can be managed. The daily dose of each infusion is a frequently used level that has a long safety record.

The trial will observe the subject's cancer for 3 months after the granulocyte infusions are completed. Response at 90 days will be based on comparison of tumor measurements at baseline.

The trial has 3 major endpoints: dose response and tolerance, safety, and efficacy.


Aubrey de Grey Big Think Videos

Immortality as a Cure for Climate Change by Aubrey de Grey
Chief Science Officer, SENS Foundation

Would an ageless society be a more humane society? Aubrey de Grey explains why he believes that, when we defeat aging, the world will band together to finally solve the major crises of our time.


Two videos are below.

Wimpy Cars and Other Implications of Immortality



Anti-aging expert Aubrey de Grey speculates about how an ageless society would operate differently than the world does today; expect changes in preferred careers and religion, but don’t expect a new outlook on suicide.

China and the World Economy From Now to 2018

UK Telegraph: You can date the end of dollar hegemony from China's decision last month to sell its first batch of sovereign bonds in Chinese yuan to foreigners

Beijing does not need to raise money abroad since it has $2 trillion (£1.26 trillion) in reserves. The sole purpose is to prepare the way for the emergence of the yuan as a full-fledged global currency.

"It's the tolling of the bell," said Michael Power from Investec Asset Management. "We are only beginning to grasp the enormity and historical significance of what has happened."

It is this shift in China and other parts of rising Asia and Latin America that threatens dollar domination, not the pricing of oil contracts. The markets were rattled yesterday by reports – since denied – that China, France, Japan, Russia, and Gulf states were plotting to replace the Greenback as the currency for commodity sales, but it makes little difference whether crude is sold in dollars, euros, or Venetian Ducats.

What matters is where OPEC oil producers and rising export powers choose to invest their surpluses. If they cease to rotate this wealth into US Treasuries, mortgage bonds, and other US assets, the dollar must weaken over time.

Clearly this is more than a dollar problem. It is a mismatch between the old guard – US, Europe, Japan – and the new powers that require stronger currencies to reflect their dynamism and growing wealth. The longer this goes on, the more havoc it will cause to the global economy.

The new order may look like the 1920s, with four or five global currencies as regional anchors – the yuan, rupee, euro, real – and the dollar first among equals but not hegemon. The US will be better for it.


From the UK Independent: China's Push for Power is Irresistible

World market [movements] show that investors simply do not believe the denials, issued by several countries, of the story [that there is a move to phase out the dollar for pricing oil contracts by 2018]. Saying so may not suit those countries which have strong political relationships with the US to worry about, but it would be remarkable if talks about repricing oil had not taken place. Indeed, China, the prime mover in these talks, has already openly floated the idea of ending the dollar's reserve-currency status, and has never made a secret of its views on the matter.








The NY Times reports on the testing-the-waters 6 billion yuan ($879 million) bond sale.

There is a shift in IMF rules, still being negotiated, that will increase the power of the rising economies of Brazil, Russia, India and China.

Business Week: Portfolio Investments for a Weakening Dollar

Multicurrency bank deposits, U.S. multinational equities, and Asian bonds are a few ways to play what some regard as a necessary dollar correction


Nouriel Roubini has a survey of the economies of Asia through the end of 2010.

My analysts and I forecast Asia will grow a mere 2.6% in 2009 and 5.4% in 2010. Asia ex-Japan (AXJ) will grow 4.9% in 2009 and 6.6% in 2010. As the impact of policy measures fade in 2010, the pace of Asia's recovery will hinge on the recovery of global export demand and continued risk appetite. I project that Japan will contract sharply in 2009 and grow below 1.0% in 2010. Fiscal stimulus will push China's growth to over 8.0% during 2009 and 2010. India will grow less than 6.0% in 2009 and below potential in 2010. The Asian Tigers (Singapore, Taiwan and Hong Kong), Thailand, Malaysia and New Zealand will contract in 2009 while the contraction in South Korea will be mild and Australia will barely grow. The Philippines, Indonesia, Vietnam, Pakistan and Sri Lanka will slow sharply in 2009.


Stem Cells could be used for Colon Cancer Vaccine and Possibly Lead to Universal Cancer Vaccine

Scientists from the United States and China have revealed the potential for human stem cells to provide a vaccination against colon cancer, reports a study published in the journal Stem Cells.

"Although we have only tested the protection against colon cancer, we believe that stem cells might be useful for generating an immune response against a broad-spectrum of cancers, thus serving as a universal cancer vaccine." " concluded Dr. Bei Liu.

The team vaccinated laboratory mice with human embryonic stem (hES) cells and discovered a consistent immune response against colon cancer cells. The team witnessed a dramatic decline in tumor growth within the immunized mice. This revealed that immunized mice could generate a strong anti-tumour response through the application of hES cells.

The team also discovered that while natural embryonic stem cells are able to provide a response, artificially induced pluripotent stem cells (iPSC) are not. This is significant as it challenges the theory that iPSC are the same as hES cells and may replace them at the forefront of stem cell research.




This discovery, led by experts in immunology, Dr. Bei Liu and Dr. Zihai Li, builds upon a century-old theory that immunizing with embryonic materials may generate an anti-tumour response. However, this theory had never before been advanced beyond animal research, so the discovery that human stem cells are able to immunize against colon cancer is both new and unexpected.

"This finding potentially opens up a new paradigm for cancer vaccine research," said Dr. Zihai Li. "Cancer and stem cells share many molecular and biological features. By immunizing the host with stem cells, we are able to 'fool' the immune system to believe that cancer cells are present and thus to initiate a tumor-combating immune program."

The research is the first of its kind to implicate the role of human stem cells in vaccinating against colon cancer, and represents collaboration between the prestigious laboratories of Dr. Zihai Li and stem cell expert Dr. Renhe Xu at the University of Connecticut Stem Cell Institute.


October 07, 2009

A Relativistic Gravity Theory Could be Tested at Large Hadron Collider


Test of relativistic gravity for propulsion at the Large Hadron Collider, 13 page pdf

A design is presented of a laboratory experiment that could test the suitability of relativistic gravity for propulsion of spacecraft to relativistic speeds. The first exact time-dependent solutions of Einstein’s gravitational field equation confirm that even the weak field of a mass moving at relativistic speeds could serve as a driver to accelerate a much lighter payload from rest to a good fraction of the speed of light. The time-dependent field of ultrarelativistic particles in a collider ring is calculated. An experiment is proposed as the first test of the predictions of general relativity in the ultrarelativistic limit by measuring the repulsive gravitational field of bunches of protons in the Large Hadron Collider (LHC). The estimated ‘antigravity beam’ signal strength at a resonant detector of each proton bunch is 3 nm/s2 for 2 ns during each revolution of the LHC. This experiment can be performed off-line, without interfering with the normal operations of the LHC.
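
To put the quoted signal in perspective, below is a short sketch of the per-pass impulse. The 3 nm/s2 amplitude and 2 ns duration come from the abstract; the LHC revolution frequency of roughly 11.2 kHz is a standard machine parameter, and treating the kicks as simply additive is an idealization.

# Per-pass velocity impulse from the 'antigravity beam' figures above.
accel = 3e-9          # m/s^2, peak signal at the detector (from the abstract)
duration = 2e-9       # s, signal duration each revolution (from the abstract)
f_rev = 11.245e3      # Hz, LHC revolution frequency (standard machine figure)

dv_per_pass = accel * duration            # ~6e-18 m/s per bunch passage
dv_per_second = dv_per_pass * f_rev       # ~7e-14 m/s accumulated per second
print(f"impulse per pass: {dv_per_pass:.1e} m/s")
print(f"one bunch, one second: {dv_per_second:.1e} m/s")

A per-pass kick of order 10^-18 m/s is why the proposal depends on a resonant detector integrating over many hours rather than on detecting any single impulse.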



Exact ‘antigravity-field’ solutions of Einstein’s equation, 3 page pdf


‘Antigravity’ propulsion and relativistic hyperdrive, 4 page pdf

Exact payload trajectories in the strong gravitational fields of compact masses moving with constant relativistic velocities are calculated. The strong field of a suitable driver mass at relativistic speeds can quickly propel a heavy payload from rest to a speed significantly faster than the driver, a condition called hyperdrive. Hyperdrive thresholds and maxima are calculated as functions of driver mass and velocity.

Technology Review covers the arxiv paper

In 1924, the influential German mathematician David Hilbert published a paper called "The Foundations of Physics" in which he outlined an extraordinary side effect of Einstein's theory of relativity.

Hilbert was studying the interaction between a relativistic particle moving towards or away from a stationary mass. His conclusion was that if the relativistic particle had a velocity greater than about half the speed of light, a stationary mass should repel it. At least, that's how it would appear to a distant inertial observer.

That's an interesting result, and one that has been more or less forgotten, says Franklin Felber, an independent physicist based in the US (Hilbert's paper was written in German).

Felber has turned this idea on its head, predicting that a relativistic particle should also repel a stationary mass. He says this effect could be exploited to propel an initially stationary mass to a good fraction of the speed of light.

The basis for Felber's "hypervelocity propulsion" drive is that the repulsive effect allows a relativistic particle to deliver a specific impulse that is greater than its specific momentum, thereby achieving speeds greater than the driving particle's speed. He says this is analogous to the elastic collision of a heavy mass with a much lighter, stationary mass, from which the lighter mass rebounds with about twice the speed of the heavy mass.
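
The elastic-collision analogy is easy to verify with the standard one-dimensional collision formula; this is a quick Newtonian check, not Felber's relativistic calculation.

# 1-D elastic collision: heavy mass M at speed V strikes light mass m at rest.
# The light mass leaves at v = 2*M*V / (M + m), which tends to 2V for M >> m.
def rebound_speed(M, m, V):
    return 2.0 * M * V / (M + m)

V = 1.0                          # driver speed, arbitrary units
for ratio in (10, 100, 1000):
    v = rebound_speed(ratio, 1.0, V)
    print(f"M/m = {ratio:4d}: light mass rebounds at {v:.3f} V")
# M/m = 1000 gives ~1.998 V, i.e. about twice the driver speed, as stated.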

What's more, Felber predicts that this speed can be achieved without generating the severe stresses that could damage a space vehicle or its occupants. That's because the spacecraft follows a geodesic trajectory in which the only stresses arise from tidal forces (although it's not clear why those forces wouldn't be substantial).






The experiment would measure the repulsive gravitational impulses of proton bunches delivered in their forward direction to resonant detectors just outside the beam pipe. This test could provide accurate measurements of post-Newtonian parameters and the first observation of ‘antigravity’, as well as validating the potential utility of relativistic gravity for spacecraft propulsion in the distant future.

A new exact time-dependent field solution of Einstein’s equation is given in Eq. (8) by (Felber, 2008 and 2009). This exact strong-field solution provides further support for the weak-field results presented in this paper. According to Table 1 and Eq. (9), the exact field solution in Eq. (8) for a mass moving with constant velocity corresponds precisely in the weak-field approximation to the weak-field solution in Eq. (2), for the special case of constant velocity.

A simple Lorentz transformation of the well-known unbound orbit of a payload in a Schwarzschild field gives the exact payload trajectory in the strong field of a relativistic driver with constant velocity, as seen by a distant inertial observer. The calculations of these payload trajectories by this two-step approach, and their animated versions (Felber, 2006b), clearly show that suitable drivers at relativistic speeds can quickly propel a heavy payload from rest to speeds close to the speed of light.

The strong field of a compact driver mass can even propel a payload from rest to speeds faster than the driver itself – a condition called hyperdrive. Hyperdrive is analogous to the elastic collision of a heavy mass with a much lighter, initially stationary mass, from which the lighter mass rebounds with about twice the speed of the heavy mass. Hyperdrive thresholds and maxima were calculated and shown in Figure 5 as functions of driver mass and velocity. Substantial payload propulsion can be achieved in weak driver fields, especially at relativistic speeds.

The exact time-dependent gravitational-field solutions of Einstein’s equation in (Felber, 2008 and 2009) for a mass moving with constant velocity, the two-step approach in (Felber, 2005b, 2006a, 2006b and 2006c) to calculating exact orbits in dynamic fields, and the retarded fields calculated in (Felber, 2005a) all give the same result: even weak gravitational fields of moving masses are repulsive in the forward and backward directions at source speeds greater than 3^(-1/2) c, about 0.577c.

The field solutions in this paper have potential theoretical and experimental applications in the near term and potential propulsion applications in the long term. In the near term, the solutions can be used in the laboratory to test relativistic gravity for the first time. Performing such a test at an accelerator facility has many advantages over similar space-based tests of relativity that have been performed and contemplated for the future, including low cost, quickness, convenience, ease of data acquisition and data processing, and an ability to modify and iterate tests in real time. Such a test could provide accurate measurements of post-Newtonian parameters in the extreme relativistic regime and the first observation of ‘antigravity’. Our estimates suggest that each proton bunch in the LHC beam would produce an ‘antigravity beam’ with a signal strength of 3 nm/s2 and a duration of 2 ns at a detector. With a suitable high-Q resonant detector, a typical proton circulation time of 10 hours, and an impulse frequency at peak luminosity of 31.6 MHz, the SPL of the ‘antigravity beam’ at the LHC could be resonantly amplified to exceed 160 dB re 1 μPa.
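
The amplification claim is, at bottom, a counting argument, sketched below. The 31.6 MHz impulse rate, 10-hour circulation time, and per-pass figures come from the paragraph above; the perfectly coherent sum is the idealization that a high-Q resonator tuned to the impulse frequency approximates.

# How many kicks a resonant detector sees during one LHC fill.
f_impulse = 31.6e6              # Hz, impulse frequency at peak luminosity
t_circ = 10 * 3600              # s, typical proton circulation time
dv_per_pass = 3e-9 * 2e-9       # m/s, per-pass impulse (3 nm/s^2 for 2 ns)

n_impulses = f_impulse * t_circ             # ~1.1e12 kicks per fill
dv_coherent = n_impulses * dv_per_pass      # ~7e-6 m/s if summed coherently
print(f"impulses in 10 h: {n_impulses:.1e}")
print(f"idealized coherent sum: {dv_coherent:.1e} m/s")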

Robert Freitas Wins the 2009 Feynman Prize for Theory

The winner of the 2009 Feynman Prize for Theory is Robert A. Freitas Jr. (IMM), in recognition of his pioneering theoretical work in mechanosynthesis, in which he proposed specific molecular tools and analyzed them using ab initio quantum chemistry to validate their ability to build complex molecular structures. This Prize also recognizes his previous work in the systems design of molecular machines, including replicating molecular manufacturing systems which should eventually be able to make large atomically precise products economically, and the design of medical nanodevices which should eventually revolutionize medicine. The Foresight Institute, a nanotechnology education and public policy think tank based in Palo Alto, awards the Feynman Prizes.

The winner of the 2009 Feynman Prize for Experimental work is the team of Yoshiaki Sugimoto, Masayuki Abe (Osaka University), and Oscar Custance (National Institute for Materials Science, Japan), in recognition of their pioneering experimental demonstrations of mechanosynthesis, specifically the use of atomic resolution dynamic force microscopy — also known as non-contact atomic force microscopy (NC-AFM) — for vertical and lateral manipulation of single atoms on semiconductor surfaces. Their work, published in Nature, Science, and other prestigious scientific journals, has demonstrated a level of control over the ability to identify and position atoms on surfaces at room temperature which opens up new possibilities for the manufacture of atomically precise structures.

Congratulations to the winners. Below is the latest work from Robert Freitas.

Recent Nanotechnology and Nanomedicine Publications by Robert Freitas

Robert A. Freitas Jr., “Chapter 22. Comprehensive Nanorobotic Control of Human Morbidity and Aging,” in Gregory M. Fahy, Michael D. West, L. Stephen Coles, and Steven B. Harris, eds, The Future of Aging: Pathways to Human Life Extension, Springer, New York, 2009. In press.

Denis Tarasov, Natalia Akberova, Ekaterina Izotova, Diana Alisheva, Maksim Astafiev, Robert A. Freitas Jr., “Optimal Tooltip Trajectories in a Hydrogen Abstraction Tool Recharge Reaction Sequence for Positionally Controlled Diamond Mechanosynthesis,” J. Comput. Theor. Nanosci. 6(2009). In press.

Robert A. Freitas Jr., “Medical Nanorobotics: The Long-Term Goal for Nanomedicine,” in Mark J. Schulz, Vesselin N. Shanov, YeoHeung Yun, eds., Nanomedicine Science and Engineering, Artech House, Norwood MA, 2009, Chapter 14, pp. 367-392. In press.


Nanomedicine, Nanorobotics, Nanofactories, Molecular Assemblers and Machine-Phase Nanotechnology Publications of Robert A. Freitas Jr. in 2009

"Welcome to the future of medicine," Studies in Health Technol. Inform.

A chapter describing the negative consequences of medical technology development and commercialization that proceeds too slowly, and making the case for an immediate large-scale investment in medical nanorobots to save 52 million lives a year. It also explains the essence of nanotechnology, its life-saving applications, the engineering challenges, and the possibility of a 1000-fold improvement over our current human biological abilities. Every decade that we delay the development and commercialization of medical nanorobotics, half a billion people (52 million per year for ten years) perish who could have been saved.


Chemical Power for Microscopic Robots in Capillaries (arxiv), by Tad Hogg and Robert A. Freitas Jr.



The power available to microscopic robots (nanorobots) that oxidize bloodstream glucose while aggregated in circumferential rings on capillary walls is evaluated with a numerical model using axial symmetry and time-averaged release of oxygen from passing red blood cells. Robots about one micron in size can produce up to several tens of picowatts, in steady-state, if they fully use oxygen reaching their surface from the blood plasma. Robots with pumps and tanks for onboard oxygen storage could collect oxygen to support burst power demands two to three orders of magnitude larger. We evaluate effects of oxygen depletion and local heating on surrounding tissue. These results give the power constraints when robots rely entirely on ambient available oxygen and identify aspects of the robot design significantly affecting available power. More generally, our numerical model provides an approach to evaluating robot design choices for nanomedicine treatments in and near capillaries.
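
As a crude cross-check on the "tens of picowatts" figure, one can bound the diffusion-limited oxygen supply to an isolated micron-scale sphere. This is not the paper's axisymmetric capillary model; the diffusivity, the dissolved-oxygen concentration, and the energy yield per mole of O2 are textbook-order assumptions.

import math

# Upper-bound power for a ~1-micron robot fed by diffusion-limited O2 uptake.
D = 2e-9       # m^2/s, O2 diffusivity in water (textbook order of magnitude)
C = 0.1        # mol/m^3, dissolved O2 in plasma (assumed, order of magnitude)
R = 0.5e-6     # m, robot radius
E_O2 = 4.8e5   # J per mol O2 consumed (~2880 kJ per mole glucose / 6 O2)

flux = 4 * math.pi * D * C * R   # mol/s, diffusion-limited current to a sphere
power = flux * E_O2              # W
print(f"O2 uptake: {flux:.1e} mol/s")            # ~1.3e-15 mol/s
print(f"power bound: {power * 1e12:.0f} pW")     # ~600 pW for an isolated sphere

Robots packed in rings on a capillary wall compete for the same oxygen and receive it only as red cells pass, which is why the paper's detailed model lands an order of magnitude or more below this isolated-sphere bound.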



Meeting the Challenge of Building Diamondoid Medical Nanorobots

The technologies that are needed for the atomically precise fabrication of diamondoid nanorobots in macroscale quantities at low cost require the development of a new nanoscale manufacturing technology called positional diamondoid molecular manufacturing, enabling diamondoid nanofactories that can build nanorobots. Achieving this new technology will require the significant further development of four closely related technical capabilities: (1) diamond mechanosynthesis, (2) programmable positional assembly, (3) massively parallel positional assembly, and (4) nanomechanical design. The Nanofactory Collaboration is coordinating a combined experimental and theoretical effort involving direct collaboration among dozens of researchers at multiple institutions in four countries to explore the feasibility of positionally controlled mechanosynthesis of diamondoid structures using simple molecular feedstocks, which is the first step along a direct pathway to developing working nanofactories that can fabricate diamondoid medical nanorobots.


Nanorobot Control, 39 page pdf

Medical nanorobots may be constructed of diamondoid nanometer-scale parts and subsystems including onboard sensors, motors, manipulators, power plants, and molecular computers. The presence of onboard nanocomputers will allow in vivo medical nanorobots to perform numerous complex behaviors which must be conditionally executed on at least a semiautonomous basis, guided by receipt of local sensor data, constrained by preprogrammed settings, activity scripts, and event clocking, and further limited by a variety of simultaneously executing real-time control protocols.

Such nanorobots cannot yet be manufactured, but preliminary scaling studies for several classes of medical nanorobots including respirocytes, microbivores, clottocytes and chromallocytes have been published in the literature. These designs allow an analysis of basic computational tasks and a summation of major computational control functions common to all complex medical nanorobots. These functions include the control and management of pumping, sensing, configuration, energy, communication, navigation, manipulation, locomotion, computation, and the use of redundancy management and flawless compact software.

Nanorobot control protocols are required to ensure that each nanorobot completes its intended mission accurately, completely, safely, and in a timely manner according to plan. Six major classes of nanorobot control protocols have been identified: operational, biocompatibility, theater, safety, security, and group protocols. Six important subclasses of theater protocols include locational, functional, situational, phenotypic, temporal, and identity control protocols.
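
The protocol taxonomy in the excerpt maps naturally onto a small data structure. The sketch below is purely illustrative: the class and subclass names come from the text above, while the Python representation is ours, not Freitas's.

from enum import Enum

class ControlProtocol(Enum):
    """Six major classes of nanorobot control protocols (named in the excerpt)."""
    OPERATIONAL = 1
    BIOCOMPATIBILITY = 2
    THEATER = 3
    SAFETY = 4
    SECURITY = 5
    GROUP = 6

class TheaterProtocol(Enum):
    """Six subclasses of theater protocols (named in the excerpt)."""
    LOCATIONAL = 1
    FUNCTIONAL = 2
    SITUATIONAL = 3
    PHENOTYPIC = 4
    TEMPORAL = 5
    IDENTITY = 6

# A mission plan could require every major protocol class to sign off before
# a behavior executes, e.g. by checking membership:
required = set(ControlProtocol)
assert ControlProtocol.THEATER in required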


Alcor Cryonics Has Published a Detailed Denial of the False Claims About Abuse of Ted Williams' Head

A lot of people are letting misguided emotion and the cryonics "ick factor" lead them into believing the false accusations. It seems clear that the accuser, Larry Johnson, planned to make a buck by finding fault with Alcor. It looks like he was also attracted to the scheme because of the Ted Williams angle: Ted Williams was his "hero", and Johnson started working at Alcor 6 months after Williams was placed there. Johnson was at Alcor for 8 months, and for about half of that time he was actively recording conversations and stealing documents.

ABC confirms that the head freezing was what was ordered. A Huffington Post piece describes how the son of Ted Williams was trashed by sports writers who emotionally did not want Ted frozen.

October 7, 2009: Alcor Response to ABC Nightline

Last night, Larry Johnson appeared on ABC's Nightline to promote the sale of his book, Frozen: My Journey into Cryonics, Deception and Death.

Mr. Johnson has had numerous opportunities to defend his actions in a court of law — both in Arizona and New York. He has failed to appear in court in both states and has taken extreme steps to avoid service of process, and yet has no problem appearing on national television to slander innocent people and attempt to destroy a 40-year-old nonprofit organization that has worked hard to gain respect among many in the scientific and medical communities.

Nightline made efforts to investigate Mr. Johnson's many fallacious claims. Mr. Johnson was caught in his own web of deceit when one of his claimed errors in the Ted Williams case was exposed as false. He was also forced to admit that he tried to profit from the death of baseball great Ted Williams by charging visitors to his website $20 to view alleged photos of Mr. Williams' cryopreserved head. Such photos, some of which are part of internal case documentation files, were removed from Alcor without authorization by Mr. Johnson.

In his book and during the Nightline segment, Mr. Johnson claimed he witnessed Alcor staff striking Ted Williams' head with a wrench. Mr. Johnson, who was an executive with authority over the procedure in question, also claimed he said nothing about the purported incident when it allegedly occurred, nor did he bring it to the attention of any other staff or board member. In fact, multiple individuals verified as documented witnesses to patient transfer procedures state without hesitation that Mr. Johnson's claims are pure fabrication. Alcor's internal investigation did not reveal any reports or recollections of any Alcor patient ever being struck by a wrench or any other object, accidentally or otherwise. Yet this fictional and unsubstantiated report continues to echo, as if it were fact, over and over again in the media.

It is important to note that Mr. Johnson came to Alcor with supposed medical experience, and he was paid and entrusted to improve procedures and ensure the safety and privacy of Alcor members. In his short tenure, Mr. Johnson misappropriated Alcor property for his own financial gain; he invaded the privacy of private individuals by secretly recording their conversations; he absconded with medical records and technical photographs that were taken for documentation purposes and has presented these out of the context in which they were intended in order to make Alcor and its well-founded and documented procedures seem ghoulish in the eyes of the unsuspecting public. Mr. Johnson's actions violated the trust of Alcor, breached the confidence of its members and damaged the reputation of the science of cryonics.

As Nightline asked in the lead-in to the segment, "is this self-styled whistleblower just out to make money?" The answer is a resounding yes.


Ralph Merkle said that Johnson's main area of responsibility during his tenure at Alcor in 2003 was the supervision of the cryopreservation of Alcor members. According to Merkle, "Johnson expressed none of his lurid and sensationalistic concerns during his employment — when preventing and correcting any such alleged mistakes would have been a major part of his duties. Only afterwards, when he could profit from exaggerations and misrepresentations, did he start to complain about how Alcor performed cryopreservations."

Some of Johnson's most derogatory attacks on Alcor involve alleged mistakes during the cryopreservation of baseball great Ted Williams. Merkle said, "It is absurd for Johnson to make these allegations because he had yet to be hired by Alcor when Williams was cryopreserved."




Bloomberg reports that Alcor Life Extension Foundation Inc., the Arizona cryonics company, sued a former employee to block publication of his exposé claiming the organization mishandled the remains of baseball star Ted Williams.

ABC Coverage

In one of the most potent allegations in Johnson's book, he said Alcor cut off Williams' head without prior approval from his family.

"He was supposed to be a whole-body suspension," Johnson said. "He was supposed to be in one piece. They got him to the O.R. at Alcor and proceeded to cut through his neck."

But, in this instance at least, Johnson's version seemed to be incorrect. ABC News found notarized agreements, signed by Williams' oldest son and youngest daughter, allowing Alcor the option of removing their father's head. The papers were signed in Florida just after 9 p.m. ET -- at least an hour before the operation began in Arizona, according to the log Johnson cites in his book.


Slanted Media Against the Son of Ted Williams
Huffington Post's Brian Ross: Ted Williams' Head: How SI and the Mainstream Sports Media Gave John Henry Williams a Bum Rap

With the Larry Johnson kiss-and-tell book about Alcor Life Extension Labs coming out, the pot is being stirred again about Ted Williams' frozen head and its treatment.

Forget that Mr. Johnson, as COO, ran the lab, and could have cleaned up the very things that he is now lamenting... for profit.

Disregard that Tom Verducci, the sports writer who began ringing the fire bell at Sports Illustrated (SI) about John Henry Williams' handling of his father's remains, did little more than source Bobby-Jo Williams Ferrell and those friends and family members who didn't like the idea. He also did very little follow-up when the court upheld that Ted Williams indeed had co-authored the request along with the children to whom he was still speaking.