
June 13, 2009

Anti-Recession Fiber Internet for Multi-Trillion Boost to the Economy


High speed fiber internet is being deployed faster and with higher penetration around the world than in the United States. The 5-10+% boost to GDP [$700 billion to $1.4 trillion per year initially; a nextbigfuture article covers many studies that connect broadband to economic stimulus] that would come from 100+ Mbps symmetrical access would quickly pay for initial subsidies. Successful implementation in countries such as Japan shows that the United States could have the same thing by adjusting policies and rules so that incumbent companies and groups cannot block a successful rollout. The first example is super-broadband. The economic benefits of super-broadband have been shown. It is to the economic benefit of a country and its people to enable super-broadband (at least 100 mbps both up and down). Having a system set up that slows and prevents this rollout is stupid.

Japan is rolling out 10 gigabit per second (symmetrical, upload and download) fiber internet connections. Speeds of up to 160 gigabits per second have been demonstrated and 200+ gigabits per second is possible. Wireless speeds of 10 gigabits per second have also been demonstrated.

There is no societal or technological reason to settle for lesser connection speed targets.

The Fiber-to-the-Home Council proposed [2007] a goal of providing affordable access to next-generation broadband networks to a majority of Americans by 2010, with universal access by 2015. To ensure that consumers can both receive and transmit video and other high-speed services, applications, and content, these networks should have transmission speeds in excess of 100 Mbps and symmetrical access capabilities. About $7.8 billion of the stimulus plan has been allocated for broadband. Another $80-200 billion needs to be spent to build out a full all-fiber, multi-gigabit symmetrical network for everyone in the United States. To get the full GDP benefits [and real stimulus], we need to speed up the fiber build-out and crank the speed up from 10-50 mbps to the multi-gigabit range. $80-200 billion in government and private money is quite a bit, but it would be buying $700 billion to $1.4 trillion per year in increased economic activity. This seems like a better bet than car company bailouts. Super-broadband is an investment in the future.
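As a rough back-of-envelope check of those figures, a quick sketch in Python (the ~$14 trillion 2009 US GDP value is an assumption used only for illustration):

```python
# Back-of-envelope check of the GDP boost and payback figures cited above.
us_gdp = 14e12                         # approximate 2009 US GDP in dollars (assumed for illustration)

for pct in (0.05, 0.10):               # the 5-10% GDP boost range cited above
    print(f"{pct:.0%} of GDP = ${us_gdp * pct / 1e12:.2f} trillion per year")

build_low, build_high = 80e9, 200e9    # national fiber build-out cost estimate from the article
benefit_low, benefit_high = 0.7e12, 1.4e12
print(f"Annual benefit / build cost: {benefit_low / build_high:.1f}x to {benefit_high / build_low:.1f}x")
```

Even at the pessimistic end (the $200 billion build and the $700 billion per year boost), the annual benefit is several times the one-time cost.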

How did a Norwegian electricity company become the biggest fiber-to-the-home provider in the country?

Lyse's business model is different from companies like Verizon, which is currently rolling out fiber across its service area and then trying to sign up customers. Lyse instead sends people into unserved areas, knocks on all the doors, and passes out information on the new fiber service. Only when 60 percent of the people in an area sign up in advance for the service does Lyse start the actual fiber install.

Sixty percent sounds like a tough threshold, but the company says that it has been "very successful" so far by offering people far greater Internet speeds for the same price they are currently paying. Lyse's Altibox service offers 10Mbps, 30Mbps, or 50Mbps connections—all of them fully symmetrical (upload and download speeds are identical). In many areas, the uptake rate tops 80 percent, though competitors have boosted speeds and started deploying fiber of their own in an effort to retain customers.

In addition to entering an area with tremendous support already lined up, Lyse also does something innovative: it allows prospective customers to dig their own fiber trenches from the street to their homes. In return, customers can save about $400. "They can arrange things just the way they want," says Herbjørn Tjeltveit of Lyse, which makes for happier customers; apparently, nothing angers a Norwegian more than having some faceless corporation tunnel through his flower garden.

Lyse can ramp up the speed dramatically once all that precious fiber is in the ground; its partners are already testing both 100Mbps and 1,000Mbps connections.


The Fiber to the Home Council has a lot of information on the benefits and what the status is of fiber to the home.

Here is the fiber to the home primer for 2009 (32 pages)



It cost $84 billion for the cable companies to pass about 100 million households a decade ago, or roughly $850 a household ($1,500 in today's dollars). The cost of installing fiber to the home has now dropped to similar levels.
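A quick sketch of that arithmetic (the adjustment factor is simply what the article's own numbers imply):

```python
# Per-household cost of the historical cable build-out cited above.
total_cost = 84e9            # dollars spent by cable companies
households_passed = 100e6
per_household = total_cost / households_passed
print(f"${per_household:.0f} per household passed")          # ~$840, i.e. roughly $850

today_dollars = 1500         # the article's inflation-adjusted figure
print(f"Implied adjustment to today's dollars: {today_dollars / per_household:.2f}x")
```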








Verizon's pure FTTP FiOS service was extremely risky at first, but it's clearly the long-term strategic winner. It can even be argued to be a near-term winner, since Verizon is winning over so many new triple-play TV/Voice/Internet customers and the costs of laying fiber to the home have been slashed significantly.

As of 2009, the number of homes with FiOS [Fiber Optic Service] access was 12.7 million, of which 2.5 million subscribe to the Internet service and 2.04 million to FiOS TV.



The 2007 cost of fiber to the home and how costs are being driven down: in 2006, costs for fiber to the home were about $1,350 in the USA and dropping 5-20% per year.

Fiber to the premises by country.

FTTP/FTTH (fiber to the premises or fiber to the home) was first introduced in Japan in 1999 and did not become a large player until 2001. In 2003-2004, FTTH grew at a remarkable rate while DSL's growth slowed. 10.5 million FTTH connections were reported in Japan as of September 2007. Many people are switching from DSL to FTTH and the use of DSL is decreasing, with DSL usage peaking in March 2006. On September 17, 2008, the Ministry of Internal Affairs and Communications reported that, for the first time, the number of FTTH connections (13.08 million) eclipsed that of DSL (12.29 million), making fiber the biggest means of broadband connection in Japan at 45% of the total, compared to 42% for DSL.

In South Korea, FTTP is offered by various Internet service providers including KT (formerly Korea Telecom), Hanaro Telecom, and LG Powercom. The connection speed for both downloading and uploading is 100 Mbit/s. The monthly subscription fee ranges between USD 20 and USD 30 depending on the subscription period.

Taiwan's Chunghwa Telecom Co offers FTTB for around USD 30. Taiwan has the 4th highest FTTB penetration rate in the world.




Fiber to the home explained at wikipedia.



Active optical networks rely on some sort of electrically powered equipment to distribute the signal, such as a switch, router, or multiplexer. Each signal leaving the central office is directed only to the customer for which it is intended. Incoming signals from the customers avoid colliding at the intersection because the powered equipment there provides buffering.

Passive optical network (PON) is a point-to-multipoint, fiber to the premises network architecture in which unpowered optical splitters are used to enable a single optical fiber to serve multiple premises, typically 32-128. A PON configuration reduces the amount of fiber and central office equipment required compared with point to point architectures.
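To see why the split ratio matters, here is a minimal sketch (the 2.488 Gbps figure is the standard GPON downstream line rate; treating it as evenly shared among subscribers is a simplifying assumption):

```python
# Per-subscriber share of a single GPON downstream link at common split ratios.
GPON_DOWNSTREAM_GBPS = 2.488            # standard GPON downstream line rate

for split in (32, 64, 128):             # typical PON split ratios mentioned above
    per_premises_mbps = GPON_DOWNSTREAM_GBPS * 1000 / split
    print(f"1:{split} split -> ~{per_premises_mbps:.0f} Mbps per premises if fully shared")
```

In practice traffic is bursty and statistically multiplexed, so each premises usually sees far more than its even share at any given moment.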




June 12, 2009

Cancer Breakthroughs : Unraveling Genetic Sequences for Cancer Cures, Ultrasound/Magnets for Zapping Tumors, 2-in-1 Breast Cancer Treatment and More

1. Colorectal cancer is especially difficult to diagnose in its early stages - usually, people are in advanced stages when the cancer is discovered, and the diagnostic process itself requires the removal of entire polyps as well as a laboratory assessment that may take weeks.

Vernick's [a doctoral student in the Department of Physical Electronics at Tel Aviv University] lab-on-a-chip solution works by recognizing tell-tale biomarkers that lab technicians cannot see with the naked eye. Cancer biomarkers are molecular changes detectable in the tumor or in the blood, urine, or other body fluids of cancer patients. These biomarkers are produced either by the tumor itself or by the body in response to the presence of cancer. The most commonly used biomarker tests today are the off-the-shelf pregnancy test and the test used by diabetics to monitor blood-sugar levels.



With his tool, Vernick can scan up to four different biomarkers for colon cancer, an extraordinarily effective method for finding elusive colon cancer malignancies.

The chip is essentially an electrochemical biosensor programmed to recognize and bind to colorectal cancer biomarkers with high specificity. "Following this bio-recognition event, the electrodes on the chip transduce the signal it receives into an electric current, which can be easily measured and quantified by us," says Vernick.

In addition to the lab-on-a-chip technology, Vernick and his fellow researchers believe they are well on the way to establishing a blood test for colon cancer, which, when used together with colonoscopies, offers a comprehensive package of colon cancer detection.

"When you combine all these methods together, you increase the level of confidence in the results, eliminating false positives and negatives which are dominant today in tests for colorectal cancer," says Vernick. This research, which is funded in part by American-Israeli businessman and philanthropist Lester Crown, is to be commercialized as a complete method of cancer detection, combining blood screening and biopsy.

The ultimate goal would be for patients to have the ability to test themselves at home. "Glucose sensors used by diabetics are the best example today of a hand-held home biosensor test," says Vernick. In the future, he would like to offer patients a similar technology for colorectal cancer detection, in partnership with their physicians. "A person could submit the results of a home test directly online or to their doctor. This is my ultimate goal," he says.


2. Researchers at the BC Cancer Agency and the Vancouver Coastal Health Research Institute discovered that a single genetic mutation is responsible for granulosa cell tumours, a rare and often untreatable form of ovarian cancer.



The discovery is akin to finding a needle in a haystack, as there are three billion components to the genetic code of the tumour.

The research, published this week in the New England Journal of Medicine, can be applied to more than ovarian cancer. The use of state-of-the-art technology to identify the single mutation in ovarian cancer's DNA means the same technology can be used to unravel the genetic sequences of other cancers. The breakthrough discovery has the potential to lead to a host of new cancer diagnostics and treatments. The ability to decode the genetic sequences of specific cancers will be part of a road map to truly personalized medicine, in which doctors will be able to come up with an individualized "recipe" for every patient.

Mutation of FOXL2 in Granulosa-Cell Tumors of the Ovary [New England Journal of Medicine]



Full text of the ovarian cancer discovery paper

32 page supplemental pdf

3. Friedwardt Winterberg (a major researcher in nuclear fusion and originator of ideas that led to the global positioning system) has proposed treating cancer with the combination of a strong magnetic field and intense ultrasound. At the low electrical conductivity of tissue, the magnetic field is not frozen into the tissue and oscillates against the tissue, which is brought into rapid oscillation by the ultrasound. As a result, a rapidly oscillating electric field is induced in the tissue, strong enough to disrupt cancer cell replication. Unlike radio frequency waves, which have been proposed for this purpose, ultrasound can be easily focused onto the regions to be treated. This method has the potential for the complete eradication of the tumor. [H/T Joel Campbell, NASA]






4. Scientists from the Breakthrough Breast Cancer Research Centre based at the Institute of Cancer Research have shown for the first time that it is possible for one drug to simultaneously attack cancer cells in two completely different ways. Researchers now hope this discovery could lead to further two-in-one treatments - meaning breast cancer patients could potentially need to take fewer drugs to treat tumours in the future.

The team showed that an experimental compound called PTK/ZK, originally developed as an 'angiogenesis inhibitor' to block a tumour's blood supply and slow its growth, also acted as an 'aromatase inhibitor'. In this way it prevents the growth of hormone sensitive breast cancers reliant on oestrogen for their growth and survival. This type of breast cancer accounts for over 70% of all cases of breast cancer.


5. A team of scientists claims to have developed a drug capable of treating skin cancer even in its most-advanced stages.

Presenting their findings at a meeting of the American Society of Clinical Oncology in Florida, researchers from Roche and Plexxikon explained that the drug PLX4032, which is still in its experimental stages, could work to combat malignant melanomas.

6. Researchers at McGill University and the University of Pennsylvania have discovered that a widely used anti-diabetic drug can boost the immune system and increase the potency of vaccines and cancer treatments. Their findings will be published June 3 in the journal Nature.

Few talk about cancer and diabetes in the same breath. However, recent advances have uncovered common links between cancer and diabetes, in particular how metabolic pathways, the basic chemical reactions that happen in our cells, are controlled in these diseases. The recent findings suggest a new link between the metabolic pathways deregulated in cancer and diabetes and their role in immune cell function. The results suggest that common diabetic therapies which alter cellular metabolism may enhance T-cell memory, providing a boost to the immune system. This could lead to novel strategies for vaccine and anti-cancer therapies.


7. University of Florida researchers have come up with a new gene therapy method to disrupt cancer growth by using a synthetic protein to induce blood clotting that cuts off a tumor's blood and nutrient supply.

In mice implanted with human colorectal cancer cells, tumor volume decreased 53 percent and cancer cell growth slowed by 49 percent in those treated with a gene that encodes for the artificial protein, compared with those that were untreated.


8. MicroRNA Replacement Therapy May Stop Cancer In Its Tracks

Scientists at Johns Hopkins have discovered a potential strategy for cancer therapy by focusing on what's missing in tumors. A new study suggests that delivering small RNAs, known as microRNAs, to cancer cells could help to stop the disease in its tracks. MicroRNAs control gene expression and are commonly lost in cancerous tumors.

Publishing results of the study June 12 in Cell, the researchers say they have provided one of the first demonstrations that microRNA replacement provides an effective therapy in an animal model of human disease.

"This work suggests that microRNA replacement may be a highly effective and nontoxic treatment strategy for some cancers or even other diseases," says Josh Mendell, M.D., Ph.D., an associate professor in the McKusick-Nathans Institute of Genetic Medicine, Johns Hopkins University School of Medicine. "We set out to learn whether tumors in a mouse model of liver cancer had reduced levels of specific microRNAs and to determine the effects of restoring normal levels of these microRNAs to these cancer cells. We were very excited to see that the tumors were, in fact, very vulnerable to microRNA replacement."


9. Cancer detection progress from the Canary Foundation. Article by Melanie Swan.

The Canary Foundation’s annual symposium held May 4-6, 2009 indicated progress in two dimensions of a systemic approach to cancer detection: blood biomarker identification and molecular imaging analysis.

Systems approach to cancer detection
A systems approach is required for effective cancer detection, as assays show that many proteins, miRNAs, gene variants and other biomarkers found in cancer are also present in healthy organisms. The two current methods are (1) looking comprehensively at the full suite of genes and proteins, checking for over-expression, under-expression, mutation, quantity, proximity and other factors in a tapestry of biological interactions, and (2) seeking to identify biomarkers that are truly unique to cancer, for example those resulting from post-translational modifications like glycosylation and phosphorylation. Establishing mathematical simulation models has also been an important step in identifying baseline normal variation, treatment windows and cost trade-offs.

Blood biomarker analysis
There are several innovative approaches to blood biomarker analysis, including blood-based protein assays (identifying and quantifying novel proteins related to cancer), methylation analysis (looking at abnormal methylation as a cancer biomarker) and miRNA biomarker studies (distinguishing miRNAs which originated from tumors). Creating antibodies and assays for better discovery is also advancing, particularly protein detection approaches using zero, one and two antibodies.

Molecular Imaging
The techniques for imaging have been improving to molecular-level resolution. It is becoming possible to dial in to any set of 3D coordinates in the body with high-frequency ultrasound, increase the temperature and destroy only that area of tissue. Three molecular imaging technologies appear especially promising: targeted microbubble ultrasound imaging (where targeted proteins attach to cancer cells and microbubbles are attached to the proteins, which make the cancerous cells visible via ultrasound; a 10-20x cheaper technology than the CT scan alternative), Raman spectroscopy (adding light-based imaging to endoscopes) and a new imaging strategy using photoacoustics (light in/sound out).

Tools: Cancer Genome Atlas and nextgen sequencing
As with other high-growth science and technology areas, tools and research findings evolve in lockstep. The next generation of tools for cancer detection includes a vast cataloging of baseline and abnormal data and a more detailed level of assaying and sequencing. In the U.S., the NIH's Cancer Genome Atlas is completing a pilot phase and being expanded to include 50 tumor types (vs. the pilot phase's three types: glioblastoma, ovarian and lung) and abnormalities in 25,000 tumors. The project performs a whole genomic scan of cancer tumors, analyzing mutations, methylation, coordination, pathways, copy number, miRNAs and expression. A key tool is sequencing technology itself, which is starting to broaden out from basic genomic scanning to targeted sequencing, whole RNA sequencing, methylome sequencing, histone modification sequencing, DNA methylation by arrays and RNA analysis by arrays. The next level would be to include another layer of detail, in areas such as acetylation and phosphorylation.

Future paradigm shifts: prevention, omnisequencing, nanoscience and synthetic biology
Only small percentages of annual cancer research budgets are spent on detection vs. treatment, but it is possible that the focus will move further upstream to prevention and health maintenance as more is understood about the disease mechanisms of cancer. Life sciences technology is not just moving at a Moore's Law pace; there are probably also some paradigm shifts coming.



Lawrenceville Plasma Physics : Focus Fusion/Dense Plasma Focus update



Lawrenceville Plasma Physics' (LPP) design team has now completed the six-month design phase of the project, finishing design work on the device, the shielding wall and the vacuum chamber. We are now into the fabrication and construction phase, which will last three months.

LPP has submitted a concept paper to the new Advanced Research Projects Agency-Energy. ARPA-E is a new agency, modeled on DARPA, which is soliciting proposals for "transformational" energy technology. Based on the 8-page concept papers, ARPA-E will ask selected proposers to submit a full 50-page application. We will know if we are selected for that step by late June.






A dense plasma focus (DPF) is a plasma machine that produces, by electromagnetic acceleration and compression, short-lived plasma that is so hot and dense that it becomes a copious multi-radiation source. It was invented in the early 1960s by J.W. Mather and also independently by N.V. Filippov. The electromagnetic compression of a plasma is called a "pinch". The plasma focus is similar to the high-intensity plasma gun device (HIPGD) (or just plasma gun), which ejects plasma in the form of a plasmoid, without pinching it.

Reciprocal DNA Nanomechanical Devices Controlled by the Same Set Strands



Reciprocating devices are key features in macroscopic machines. We have adapted the DNA PX-JX2 device to a reciprocal format. The PX-JX2 device is a robust sequence-dependent nanomachine whose state is established by a pair of control strands that set it to be either in the PX state or in the JX2 state. The two states differ by a half-turn rotation between their ends. Here we report the construction of a pair of reciprocal PX-JX2 devices, wherein the control strands leading to the PX state in one device lead to the JX2 state in the other device and vice versa. The formation, transformation, and reciprocal motions of these two device systems are confirmed using gel electrophoresis and atomic force microscopy. This system is likely to be of use for molecular robotic applications where reciprocal motions are of value, in addition to its inherent contribution to molecular choreography and molecular aesthetics.

Nanowerk has information on these nanoscale DNA pistons.

The synchronous reciprocal motion demonstrated by Seeman's team, i.e. a complete cycle of operation to reach equilibrium, took almost half a day.

Seeman points out that the strategy of using the same control strands to set distinct states of two devices enables many nanoscale capabilities from reciprocating machines to molecular choreography.

"The ability to correlate the motions of molecular devices can lead to complex behavior on the nanoscale" he says. "For example, in a recent device the legs of a bipedal walker (see "A Bipedal DNA Brownian Motor with Coordinated Legs") communicate with each other, producing an autonomous walker."








8 page pdf with supplemental information

June 11, 2009

Nvidia's Next GPGPU to Have 3 Teraflop Performance

Previously we had an article about cloud computing, distributed computing and the importance of GPGPU acceleration.

HPCwire reports that Nvidia's next-generation GPU design, the G300, may turn out to be the biggest architectural leap the graphics chip maker has ever attempted. If the early rumors are true, NVIDIA has decided to move the architecture a step closer to the CPU and make GPU computing even more compelling for HPC (high performance computing/supercomputers).

According to Valich's sources, the GT300 will offer up to 512 cores, up from 240 cores in NVIDIA's current high-end GPU. Since the new chips will be on the 40nm process node, NVIDIA could also crank up the clock. The current Tesla GPUs are running at 1.3-1.4 GHz and deliver about 1 teraflop, single precision, and less than 100 gigaflops, double precision. Valich speculates that a 2 GHz clock could up that to 3 teraflops of single precision performance, and, because of other architectural changes, double precision performance would get an even larger boost.

In a later post Valich writes that the upcoming GPU will sport a 512-bit interface connected to GDDR5 memory. If true, he says, "we are looking at memory bandwidth of 256GB/s per single GPU."
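A sketch of where those headline numbers could come from (the 3 flops per core per clock factor and the 4 Gbit/s effective GDDR5 pin rate are assumptions chosen to match the figures above, not confirmed specifications):

```python
# Rough peak-throughput arithmetic for the rumored GT300 configuration.
cores = 512
clock_ghz = 2.0
flops_per_core_per_clock = 3            # assumed: multiply-add plus a multiply per clock
peak_tflops = cores * clock_ghz * flops_per_core_per_clock / 1000
print(f"Peak single precision: ~{peak_tflops:.2f} teraflops")        # ~3.07 TFLOPS

bus_width_bits = 512
gddr5_gbits_per_pin = 4.0               # assumed effective GDDR5 data rate per pin
bandwidth_gb_s = bus_width_bits * gddr5_gbits_per_pin / 8
print(f"Memory bandwidth: ~{bandwidth_gb_s:.0f} GB/s")               # 256 GB/s
```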

More importantly though, NVIDIA is said to be moving from the traditional SIMD (single instruction, multiple data) GPU computing model to MIMD (multiple instruction, multiple data) or at least MIMD-like. As the name suggests, MIMD means you can run different instruction streams on different processing units in parallel. It offers a much more flexible way of doing all sorts of vector computing, and is a standard way to do technical programming on SMP machines and clusters. Presumably CUDA will incorporate MIMD extensions to support the new hardware.

There was some speculation that the GT300 would hit the streets this year, but reports of trouble with TSMC's 40nm manufacturing technology may have slowed NVIDIA's plans.


Nvidia will also soon have the ION. It is a system/motherboard platform that includes NVIDIA's GeForce 9400M (MCP79) GPU and Intel's Atom on a Pico-ITXe motherboard designed for netbook and nettop devices. In February 2009, Microsoft certified the upcoming ION-based PCs as Vista-capable. The small form factor ION-based PCs are expected to be released in the summer of 2009, starting at $299.99. The ION will have about 50 gigaflops of performance at a netbook price.






Latest Solid State Drives and Hard Drives

Transcend SSD18M holds 128GB of flash for about $340 [at time of writing]

Corsair CMFSSD-256GB flash drive for $699

Western Digital has a 4 terabyte hard drive for about $650.

Solid State Drives and Hard Drive Enterprise Combo for Database I/O

A memory system set up to handle 53,850 simultaneous online game users.

The IT group now reserves its DRAM SSD-based RamSan-400 for the part of the database that's accessed the most and uses its newer "cached flash" RamSan-500 -- which has 64 GB of DRAM cache and 2 TB of RAID-protected single-level cell (SLC) NAND flash -- for the bulk of the database calls. It credits solid-state drives with helping the system to handle 53,850 simultaneous users in mid-May.

The DRAM solid-state drive-based RamSan-400 system claims to be able to attain 400,000 IOPS, for both reads and writes. Although CCP hasn't verified that number through testing, it appeared fairly accurate based on database usage and percentage usage of the RamSan device, according to Mayes. The RamSan-500, which has both DRAM and NAND flash technology, claims read performance of 100,000 IOPS and write performance of 25,000 IOPS, according to Texas Memory Systems.

CCP is also testing a flash-only RamSan-20, which has 450 GB of SLC NAND flash storage attached via PCI Express. The RamSan-20 claims to handle 120,000 read IOPS vs. 50,000 write IOPS, illustrating the difference in performance for reads/write that users might expect to see in a dedicated SLC-based flash system.

For systems produced by Texas Memory Systems, the list price of DRAM SSD is $300 per gigabyte, while SLC-based flash SSD is $40 to $70 per gigabyte, according to Woody Hutsell, the company's president.

The 2 TB Flash-based RamSan-500, which has 32 GB of DRAM cache, lists at $150,000, whereas pricing for the all-Flash 5 TB RamSan-620 is $220,000, Hutsell said. The latest DRAM SSD-based 512 GB RamSan-440 lists at $180,000.
Today's list price for the RamSan-400 is $61,000, while the RamSan-20 is $18,000.
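Cross-checking those list prices against the quoted per-gigabyte figures, as a quick sketch (usable capacities are taken at face value from the article):

```python
# Dollars per gigabyte implied by the Texas Memory Systems list prices above.
systems = {
    "RamSan-500 (2 TB flash, DRAM cache)": (150_000, 2000),
    "RamSan-620 (5 TB flash)":             (220_000, 5000),
    "RamSan-440 (512 GB DRAM)":            (180_000, 512),
}
for name, (price_usd, capacity_gb) in systems.items():
    print(f"{name}: ${price_usd / capacity_gb:.0f}/GB")
```

The flash systems come out around $44-75/GB and the DRAM system around $350/GB, roughly in line with the $40-70 and $300 per gigabyte figures quoted.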

Server Farms

Apple is prepping a new US data center that may cost as much as $1bn, which would nearly double what Google typically spends on the mega data centers backing its web-based applications and services.

Google has about 35 data centers

Google data center FAQ


Physical Intelligence Follow Up


J Storrs Hall is at the DARPA workshops for physical intelligence. It turns out they are trying to advance the concepts of entropy and cybernetics. The picture is an example of cybernetics from wikipedia.

I’m currently at the Proposer’s Workshop for the program, and it turns out that what they’re actually talking about is a lot more like cybernetics. The “thermodynamics” they are talking about is a bit more like the entropy in information theory (Shannon, you will remember, was a student of Wiener, founder of cybernetics). The term cybernetics itself isn’t much used anymore but the reason is more historical than anything else — there was a strange soap opera that broke up the intellectual cadre of cybernetics in the 50s for personal reasons, and computers and symbolic AI stepped into the vacuum, but the core discoveries are still valid.

There’s a chapter about cybernetics in Beyond AI, including the soap opera.


The original post about physical intelligence.

Darpa's latest venture, called "Physical Intelligence" (PI), is to prove, mathematically, that the human mind is nothing more than parts and energy. In other words, all brain activities — reasoning, emoting, processing sights and smells — derive from physical mechanisms at work, acting according to the principles of "thermodynamics in open systems." Thermodynamics is founded on the conversion of energy into work and heat within a system (which could be anything from a test-tube solution to a planet).







Entropy is discussed at wikipedia.

There is also an introduction to entropy.

In thermodynamics, entropy is a measure of certain aspects of energy in relation to absolute temperature. It is one of the three basic thermodynamic potentials: U (internal energy), S (entropy) and A (Helmholtz energy). Entropy is a measure of the uniformity of the distribution of energy.

The thermodynamic entropy S, often simply called the entropy in the context of thermodynamics, can provide a measure of the amount of energy in a physical system that cannot be used to do work.
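For reference, the two standard forms of entropy being alluded to here, the thermodynamic (Clausius) and statistical (Boltzmann) definitions, written in LaTeX (these are general textbook formulas, not taken from the article):

```latex
% Clausius (thermodynamic) definition: entropy change for a reversible transfer
% of heat \delta Q_{\mathrm{rev}} at absolute temperature T
dS = \frac{\delta Q_{\mathrm{rev}}}{T}

% Boltzmann (statistical) definition: W is the number of microstates and k_B is
% Boltzmann's constant; this is the form that connects to Shannon's
% information-theoretic entropy mentioned above
S = k_B \ln W
```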


Cybernetics at wikipedia.

Cybernetics is the interdisciplinary study of the structure of regulatory systems. Cybernetics is closely related to control theory and systems theory. Both in its origins and in its evolution in the second-half of the 20th century, cybernetics is equally applicable to physical and social (that is, language-based) systems.

Cybernetics is preeminent when the system under scrutiny is involved in a closed signal loop, where action by the system in an environment causes some change in the environment and that change is manifest to the system via information / feedback that causes the system to adapt to new conditions: the system changes its behaviour. This "circular causal" relationship is necessary and sufficient for a cybernetic perspective.

(Image caption: Example of cybernetic thinking. On the one hand a company is approached as a system in an environment; on the other hand a cybernetic factory can be modeled as a control system.)

Contemporary cybernetics began as an interdisciplinary study connecting the fields of control systems, electrical network theory, mechanical engineering, logic modeling, evolutionary biology, neuroscience, anthropology, and psychology in the 1940s, often attributed to the Macy Conferences.

Other fields of study which have influenced or been influenced by cybernetics include game theory, system theory (a mathematical counterpart to cybernetics), psychology (especially neuropsychology, behavioral psychology, cognitive psychology), philosophy, and architecture.


Human Enhancement Overview

My thinking is that there are three main paths to bone strengthening, with different timelines to get them done. [Link references will be added later, check the labels for references]
Plus the years for FDA approvals (but probably faster for any non-FDA military program or in places like China or North Korea)

1. Replicate the biological action of the genes that strengthen bones. Make pills or other treatments that activate the proteins or pathways that cause denser bones. This requires cracking the code of what is going on.

2. Genetically engineer stem cells taken from you, then replace some bone marrow or inject the cells back in so that they end up acting to strengthen bones.

3. Replace bones with pre-hardened bones via surgery. If we get regeneration and tissue replacement working a lot better first, then it could be easier to do replacements that way. Maybe some kind of microsurgery process with MEMS/NEMS or MNT.

Same deal for muscles and other tissue.






Brain: there is a bunch of work on detailed mapping and analysis of the brain, and figuring out what to do to improve it is progressing rapidly. Better brain simulation might also let you trial-run changes to see what would happen before you actually made them in real brains. Ideally you would want personalized scans and simulations.

20 years seems like a reasonable timetable for these things to be a significant niche.

Earlier if you are willing to take risks (a Mengele-style or "Weapon X"-style military program), or for people who, through disease or age, need to get the boost or would otherwise be impaired; i.e. clinical trials in 3-10 years.

If we were to run the full gamut of enhancements that could be done, among the most important for individuals would be:
- immune system boosting (a possible way to cure cancer and reduce cardiovascular problems)
- enhanced longevity

- brain enhancement (how much and how soon depends on how far from optimal we really are now). Is everyone losing 20-150 IQ points because of rampant defects, the kind of thing where, with analysis, we would see "wow, this problem is endemic"? Air pollution is probably costing 20 IQ points and imperfect nutrition is probably costing another 20-30 IQ points. How easy is that damage to fix after living that way for 20-50 years?

Brain/productivity enhancement would have the most impact on the overall betterment of humanity and civilization.

I.e., eat right and exercise and everyone could be 2-4 times stronger and healthier. Why wouldn't the same thing apply to brain function? And if we can make the corrections with pills and other adjustments, then maybe it is easy to get people to what is now 200-400 IQ. The other issue is that the whole IQ score thing is imperfect.

Super virtual reality training and wearable computer cognitive aids could make everyone test out great.

The measure that I think is more important is expertise and productivity rather than IQ. Let everyone start getting the results and achievements of an Edison or Henry Ford or da Vinci or Dean Kamen. Who cares about paper tests/SATs?


A lot of the rest are nice, but you could always wear a good exoskeleton (just like you have glasses instead of Lasik, or drive a car/Segway instead of training like an Olympian):
- myostatin inhibitors for strength
- bone strengthening
- re-activate (toad/lizard) regeneration in humans
- radiation resistance enhancement (in case of dirty bombs or nukes; carbon nanotube drugs help a lot)
- transgenic gene therapy (muscle like an ape or cheetah)
- metal particles in ligaments for more toughness there

Fujitsu Achieves World's First Impulse Radio-Based Millimeter-Band Transmissions Exceeding 10 Gbps, Plans 2012 Commercialization

Fujitsu Limited and Fujitsu Laboratories Ltd. today announced the development of the world's first impulse radio-based high-capacity wireless transmission equipment using millimeter-band transmissions in the 70-100 gigahertz (GHz) band, resulting in throughput exceeding 10 gigabits-per-second (10 Gbps). This technology dispenses with the oscillators and other components that have been required in conventional wireless transmission technologies, enabling compact and inexpensive millimeter-band transmission equipment. The new technology is suitable as an alternative to fiber-optic trunk lines in regions where those lines would be difficult to lay, as a way to bridge the digital divide, and can also be used for ultra-fast wireless LANs.

Fujitsu will begin field testing this new technology with the aim of developing commercial systems by around 2012.





Dispensing with oscillators and mixers which are required in conventional technologies makes millimeter-band transmitters more compact and less costly. This technology can be used as an alternative to fiber-optic networks, to provide a trunk-line equivalent to bridge the digital divide. In addition, it can be used for a wide range of potential applications, including indoor ultra-fast wireless LANs and high-resolution radar.


Optical superlens resolves 30 nanometers, 1/12th the wavelength of light


The optical superlens resolves 30 nanometer features that are 1/12th the wavelength of light (15 page pdf).

We demonstrate a smooth and low loss silver (Ag) optical superlens capable of resolving features at 1/12th of the illumination wavelength with high fidelity. This is made possible by utilizing state-of-the-art nanoimprint technology and an intermediate wetting layer of germanium (Ge) for the growth of flat silver films with surface roughness at sub-nanometer scales. Our measurement of the resolved lines of 30nm half-pitch shows a full-width at half-maximum better than 37nm, in excellent agreement with theoretical predictions. The development of this unique optical superlens holds promise for parallel imaging and nanofabrication in a single snapshot, a feat that is not yet available with other nanoscale imaging techniques such as the atomic force microscope or scanning electron microscope.

It has been demonstrated experimentally that a silver superlens makes it possible to resolve features well below the working wavelength. Resolution as high as 60nm half-pitch, or 1/6th of the wavelength, has been achieved.

Theoretically, it was predicted that a resolution as high as λ/20 (where λ is the illumination wavelength) is feasible with careful design of the silver superlens. However, challenges remain to realize such a high resolution imaging system, such as minimizing the information loss due to evanescent decay, absorption or scattering. Our numerical simulations have indicated that the thickness of the spacer layer (separating the object and the lens) and that of the silver film are the two major governing factors that determine subwavelength information loss due to evanescent decay and material absorption.
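A quick sketch of what those wavelength fractions work out to at the 380 nm illumination used in this work (earlier demonstrations used slightly different UV wavelengths, so these are approximate):

```python
# Half-pitch resolution implied by each wavelength fraction at 380 nm illumination.
wavelength_nm = 380
for label, divisor in (("lambda/6  (earlier superlens work)", 6),
                       ("lambda/12 (this work)", 12),
                       ("lambda/20 (theoretical prediction)", 20)):
    print(f"{label}: ~{wavelength_nm / divisor:.0f} nm")
```

This lines up with the ~60 nm, 30 nm and ~19 nm figures discussed above.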






In conclusion, we have demonstrated a new approach to realize ultra-smooth Ag superlenses with an unprecedented λ/12 resolution capability. Incorporating a few monolayers of Ge drastically improves Ag film quality and minimizes the subwavelength information loss due to scattering. Our theoretical and experimental results clearly indicate subdiffraction imaging down to 30nm half-pitch resolution with 380nm illumination. This ultra-high image resolution capability can also be extended to the far-field by incorporating a corrugated silver surface on top of the Ag-Ge superlens.

Fabrication of sub-20nm thick smooth Ag films will also enable development of novel multilayer (Ag-dielectric-Ag) superlenses operating at visible wavelengths. The development of visible superlenses will allow the use of white light sources instead of specialized UV sources and facilitate possible integration of superlenses with optical microscopes. The development of such novel superlenses would enable real-time, dynamic imaging at the molecular level.

June 10, 2009

Enhancing Brains and Bones

New Scientist covers progress in understanding how to speed up the brain.

Van den Heuvel's team built connectivity networks for each volunteer and measured the efficiency of each network. "It more or less reflects how many steps a [brain] region has to take to send information from one region to another," he says.

This measure proved a decent predictor of each person's IQ, explaining about 30 per cent of the differences between subjects, van den Heuvel says.

Intriguingly, the researchers found no link between the total number of connections in a subject's brain network and their IQ. "We show that more intelligent people don't have more connections, but they have more efficiently placed connections," he says.

"If it's genetic, genes work through biology and, once we understand the biology, we have lots of ways to manipulate biology," says Richard Haier, a neuroscientist at the University of California, Irvine. "In my mind, one of the important directions of this kind of research should lead to ways to improve intelligence on a neurochemical basis."



Picking and choosing genetic copying of extreme humans: superhard bones





The genomes of outliers (extreme people) can be examined to find the biology that could be used to modify other people.

In the late 1990s, a surprised radiologist in Connecticut came across a real-life version of Bruce Willis's character in the movie Unbreakable. The patient came to the hospital after a motor vehicle accident, but rather than revealing broken bones, the x-rays revealed an extremely high bone density. (Bone-density testing later confirmed it to be the highest ever recorded, eight times higher than the bone density for a man his age.) In 2002, Richard Lifton, a geneticist at Yale who specializes in genetic analysis of human outliers--people with extreme phenotypes--discovered that a mutation in a gene called LDL-related receptor protein 5 was responsible for the man's high bone density, a condition shared by about half of his family. (While mutations in this gene can sometimes lead to health problems, Lifton says that this family's only complaint was that they couldn't float in water because their bones are so dense.)

Lifton's team went on to study the molecular pathway affected by this mutation--and just seven years later, a drug targeting one of these molecules is in late-stage clinical testing for osteoporosis, a progressive disease of brittle bones that leads to fractures and a substantially increased risk of disability and death among the elderly.




Ultimate Specific Energy for Batteries, Ultracapacitors


A comparison of practical and theoretical specific energy limits for various battery technologies. Others predict higher practical and theoretical levels.

The determination of the theoretical maximum capacity of a lithium-air battery is complex, and there isn't a flat statement of fact in the Handbook of Batteries, Third Edition, as there is for many more well-developed chemistries. To provide the most accurate value for the maximum capacity, BD asked Dr. Arthur Dobley to provide an expert opinion, which we quote as follows:
“Specific capacity:
* For lithium metal alone 13 kWh/kg.
* For the lithium and air, theoretical, 11,100 Wh/kg, not including the weight of oxygen, and 5,200 Wh/kg including the weight of oxygen. This was checked by calculation and agrees with K.M. Abraham's publication, JECS 1996.
* For the Lithium air cell, practical, 3,700 Wh/kg, not including the weight of oxygen, and 1,700 Wh/kg with the weight of oxygen. These numbers are predictions and are made with the presumption that 33% of the theoretical energy will be obtained. The battery industry typically obtains 25% to 50% of the theoretical energy (Handbook of Batteries). Metal air batteries are higher in the range. Zinc-air is about 44% (Handbook of Batteries, 3rd Ed. pg 1.12 and 1.16 table and fig).

We selected a conservative 33%. You may quote these numbers above and make any comments with them. The theoretical numbers are similar to the numbers in the ECS 2004 abstract. (The difference is due to mathematical rounding.)”
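Reproducing the practical estimates from those theoretical numbers, a small sketch using the 33% utilization assumption stated in the quote:

```python
# Practical lithium-air energy estimates from the theoretical values quoted above.
theoretical_excl_o2 = 11_100   # Wh/kg, not counting the weight of oxygen
theoretical_incl_o2 = 5_200    # Wh/kg, counting the weight of oxygen
utilization = 0.33             # conservative fraction of theoretical energy assumed above

print(f"Practical, excluding O2: ~{theoretical_excl_o2 * utilization:,.0f} Wh/kg")  # ~3,700
print(f"Practical, including O2: ~{theoretical_incl_o2 * utilization:,.0f} Wh/kg")  # ~1,700
```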



PolyPlus Battery Company is developing novel lithium/air batteries with unprecedented energy density, rivaling that possible for hydrocarbon fuel cells. The technology is based on proprietary encapsulated water stable lithium metal enabling the practical realization of unique galvanic couples such as Li/Air and Li/Water batteries. The theoretical specific energy of lithium metal/aqueous couples is greater than 10,000 Wh/kg and commercial batteries are expected to exceed 1000 Wh/l and Wh/kg.

IBM is starting research on lithium air batteries as well.

Only a handful of labs around the world, including those at PolyPlus Battery, in Berkeley, CA, Japan's AIST, and St. Andrews University, in Scotland, are currently working on lithium-air batteries. Lithium metal-air batteries can store a tremendous amount of energy--in theory, more than 5,000 watt-hours per kilogram. That's more than ten times as much as today's high-performance lithium-ion batteries, and more than another class of energy-storage devices: fuel cells. Instead of containing a second reactant inside the cell, these batteries react with oxygen in the air that's pulled in as needed, making them lightweight and compact.

Metal Air Batteries estimated specific energy:

Polyplus has approached the challenge of the Lithium metal electrode with a coating of a glass-ceramic membrane, sealing the Lithium from an aqueous catholyte. The resultant structure exhibits very small self discharge, ordinarily a large contributor to cell failure. Test cells have produced 0.5 mAh/cm2 for 230 hours exhibiting approximately 100% Coulombic efficiency.

A production oriented cell construction with double sided lithium anode, solid electrolyte and double sided air/cathode is anticipated to have 600 to 1000 Wh/kg energy density.







Carbon-air batteries are the focus of a program being performed by St. Andrews University (UK). The free energy of carbon oxidation is 9,100 Wh/kg and a fuel-only specific energy of 7,200 Wh/kg is possible. The final system is anticipated to have a device specific energy of 2,000-3,000 Wh/kg. A major consideration is the need to maintain the operational temperature of the electrolyte at 700°C or greater. This limits the carry-it-around-in-your-pocket and turn-it-on-in-a-moment possibilities.

Metal-air batteries/fuel cells by eVionyx have been able to overcome many limitations of self-corrosion and passivation while increasing specific energy, specific power and Coulombic efficiency.

Its air cathodes provide up to 10 times the current of conventional air cathodes, using proprietary electrolytes and patented processes. Solid polymer electrolyte membranes have reduced the problems of electrolyte loss due to dry-out. Zinc-air cells produce up to 450 Wh/kg, while aluminum-air cells can produce a specific energy of more than 550 Wh/kg, and specific energies up to 650 Wh/kg are expected. Ordinarily, magnesium-air fuel cells utilize only up to 60% of the alloy, but with the electrolyte additives they can utilize up to 94%.

Lithium-sulfur batteries from the University of Waterloo have an energy density of about 1200 Wh/kg for just the positive electrode, "which would put the energy density of the cell at about 500 Wh/kg or more, but this depends on the other components of the cell," Dr. Nazar said. "That is about a factor of 3 to 5 times more than a conventional lithium-ion battery."



New Scientist covers lithium air batteries.

EEStor ultracapacitor estimates if they work.

                          Prototype /     LightEVs      Mass
                          Low Volume      Estimate      Production
Energy density (Wh/l)        606             700           1513
Specific energy (Wh/kg)      273             450            682
Price ($ USD/kWh)            $61             n/a            $40


Lithium battery update from 2007.

Primary zinc-carbon batteries produce less than 100 Wh/kg, primary alkaline less than 200 Wh/kg, and lithium-thionyl chloride batteries 730 Wh/kg.

Interconnected Carbon Nanostructures Made With Graphene and An Engineered Tunable Bandgap Made With Graphene

Two separate research developments with graphene advance applications with graphene for electronics. One is the engineering of tunable bandgaps and the other is formation of nanostructures that connect graphene layers.

1. Feng Wang of Lawrence Berkeley National Laboratory, who is also an assistant professor in the Department of Physics at the University of California at Berkeley, has engineered a bandgap in bilayer graphene that can be precisely controlled from 0 to 250 milli-electron volts (250 meV, or 0.25 eV).


On the left, a microscope image looking down through the bilayer-graphene field-effect transistor. The diagram on the right identifies the elements. (Image: Feng Wang and colleagues, Lawrence Berkeley National Laboratory)

Moreover, their experiment was conducted at room temperature, requiring no refrigeration of the device. Among the applications made possible by this breakthrough are new kinds of nanotransistors and – because of its narrow bandgap – nano-LEDs and other nanoscale optical devices in the infrared range. Researchers can precisely tune a bandgap in bilayer graphene from zero to the infrared.
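A quick sketch of why a tunable gap of up to 250 meV puts the emitted light in the infrared (using the standard photon energy-to-wavelength conversion; the specific bandgap values chosen are just illustrative points within the 0-250 meV range):

```python
# Convert a bandgap energy (eV) to the corresponding photon wavelength (nm).
def bandgap_to_wavelength_nm(energy_ev: float) -> float:
    return 1239.84 / energy_ev        # hc expressed in eV*nm

for gap_ev in (0.10, 0.25):           # illustrative points in the tunable 0-250 meV range
    print(f"{gap_ev*1000:.0f} meV gap -> ~{bandgap_to_wavelength_nm(gap_ev)/1000:.1f} micron emission")
```

A 250 meV gap corresponds to roughly 5 micron light, well into the mid-infrared, which is why the authors point to nano-LEDs and other infrared optical devices.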




2. Engineers from the University of Pennsylvania, Sandia National Laboratories and Rice University have demonstrated the formation of interconnected carbon nanostructures on a graphene substrate in a simple assembly process that involves heating few-layer graphene sheets to sublimation using electric current, and that may eventually lead to a new paradigm for building integrated carbon-based devices.










The "knife" and "welding torch" used in the experiments, which were performed inside an electron microscope, was electrical current from a Nanofactory scanning probe, generating up to 2000°C of heat. Upon applying the electrical current to few-layer graphene, they observed the in situ creation of many interconnected, curved carbon nanostructures, such as "fractional nanotube"-like graphene bi-layer edges, or BLEs; BLE rings on graphene equivalent to "anti quantum-dots"; and nanotube-BLE assembly connecting multiple layers of graphene.

Remarkably, researchers observed that more than 99 percent of the graphene edges formed during sublimation were curved BLEs rather than flat monolayer edges, indicating that BLEs are the stable edges in graphene, in agreement with predictions based on symmetry considerations and energetic calculations. Theory also predicts these BLEs, or "fractional nanotubes," possess novel properties of their own and may find applications in devices.


Researchers induced the sublimation of multilayer graphene by Joule heating, making it thermodynamically favorable for the carbon atoms at the edge of the material to escape into the gas phase, leaving freshly exposed edges on the solid graphene. The remaining graphene edges curl and often weld together to form BLEs. Researchers attribute this behavior to nature's driving force to reduce capillary energy (dangling bonds on the open edges of monolayer graphene), at the cost of increased bending energy.

"This study demonstrates it is possible to make and integrate curved nanostructures directly on flat graphene, which is extended and electrically conducting," said Li, associate professor in the Department of Materials Science and Engineering in Penn's School of Engineering and Applied Science. "Furthermore, it demonstrates that multiple graphene sheets can be intentionally interconnected. And the quality of the plumbing is exceptionally high, better than anything people have used for electrical contacts with carbon nanotubes so far. We are currently investigating the fundamental properties of graphene bi-layer edges, BLE rings and nanotube-BLE junctions."


Abstract: In situ observation of graphene sublimation and multi-layer edge reconstructions.

We induced sublimation of suspended few-layer graphene by in situ Joule-heating inside a transmission electron microscope. The graphene sublimation fronts consisted of mostly {1100} zigzag edges. Under appropriate conditions, a fractal-like “coastline” morphology was observed. Extensive multiple-layer reconstructions at the graphene edges led to the formation of unique carbon nanostructures, such as sp2-bonded bilayer edges (BLEs) and nanotubes connected to BLEs. Flat fullerenes/nanopods and nanotubes tunneling multiple layers of graphene sheets were also observed. Remarkably, >99% of the graphene edges observed during sublimation are BLEs rather than monolayer edges (MLEs), indicating that BLEs are the stable edges in graphene at high temperatures. We reproduced the “coastline” sublimation morphologies by kinetic Monte Carlo (kMC) simulations. The simulation revealed geometrical and topological features unique to quasi-2-dimensional (2D) graphene sublimation and reconstructions. These reconstructions were enabled by bending, which cannot occur in first-order phase transformations of 3D bulk materials. These results indicate that substrate of multiple-layer graphene can offer unique opportunities for tailoring carbon-based nanostructures and engineering novel nano-devices with complex topologies.


8 pages of supporting information in a pdf.



June 09, 2009

Ultra-Broadband Worldwide and GDP Boost


Information and Communications Technology (ICT) and broadband can have a direct and measurable impact on GDP, and a number of studies have indicated that 'true' broadband (that is, symmetrical bandwidth in excess of 100 Mb/s) can increase GDP by up to 5%. Other studies indicate a return ten times greater than the investment in broadband.

The only current technology that will support symmetric 100 Mbps speeds is fiber. The US is well behind leading countries in fiber penetration, which reduces our average download speed. The median download speed in the United States is 2.3 Mbps to 8.9 Mbps, compared to 63 Mbps to 93.7 Mbps in Japan, some 10 to 27 times faster. Estimates of the share of all US broadband connections on fiber range from as low as 3% to as high as 4% (see below). In contrast, leading countries have from 23.1% to 45% of broadband subscribers on fiber. Japan (45%), Korea (39%), and China (23.1%) have the largest share of broadband subscribers on fiber, some 5 to 15 times more fiber subscribers per population than the US.
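Checking the "10 to 27 times faster" figure against the median speeds just cited, as a quick sketch:

```python
# Ratio of Japan's median download speeds to the US range quoted above.
us_low, us_high = 2.3, 8.9          # Mbps, US median download range
japan_low, japan_high = 63, 93.7    # Mbps, Japan median download range

print(f"Low ends:  {japan_low / us_low:.0f}x   (63 / 2.3)")
print(f"High ends: {japan_high / us_high:.1f}x (93.7 / 8.9)")
```

The low ends of the ranges give roughly 27x and the high ends roughly 10.5x, matching the claim.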


A Gartner study from 2003 estimated that California could obtain a $376 billion increase in Gross State Product by 2010 if a "One Gigabit or Bust" broadband initiative were implemented.

*There is no new single killer application that will justify Broadband deployment. The killer app remains improved communications.

Japan is planning to have 100% broadband penetration by 2011. Japan already has 1 Gbps symmetrical fiber broadband. The average advertised download speed in Japan is about 100 Mbps now.

The next step towards ever more breakneck speeds is commercialisation of 10 Gbps fibre optic delivery. Telecoms firm Oki Japan has successfully tested a 160 Gbps long-distance, high-speed optical connection that delivers the equivalent of "four full movies" worth of data every second. Oki expects it to be commercialized late next year, maintaining Japan's bragging rights for some time to come.
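A rough sense check of the "four full movies" per second description (the ~4.7 GB per movie figure, a single-layer DVD, is an assumption used for illustration):

```python
# How many DVD-sized movies fit through a 160 Gbps link each second.
link_gbps = 160
gigabytes_per_second = link_gbps / 8       # 20 GB/s
movie_size_gb = 4.7                        # assumed: single-layer DVD
print(f"~{gigabytes_per_second / movie_size_gb:.1f} movies per second")   # ~4.3
```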


South Korea plans to spend about $24.6 billion to boost broadband speeds to 1 gigabit per second by 2012. South Korea is already at 100 mbps speed.



By the end of 2009, Shanghai Telecom will provide more than 750,000 families with access to fiber optic lines connecting their homes, and these work at up to 100 megabits per second (Mbps) compared with one or two Mbps now, according to the MONET (Metro Optical Network) plan. By 2010, about 1.5 million families will have access to fiber optic lines and 3.0 million by 2011, compared with the current 3.6 million family broadband users, according to Shanghai Telecom.

Australia announced a A$43 billion, eight-year broadband plan to deliver broadband speeds of 100 Mbps to 90% of Australians.

Singapore has funded an ultra-broadband 1 gbps plan for most people in the city by 2015.

OpenNet has proposed wholesale prices of S$15 (US$10) per month per residential fiber connection and S$50 (US$35) per month per non-residential connection. StarHub Limited recently announced the launch of its HSPA+ mobile service. HSPA+ increases StarHub’s mobile broadband network capacity to support speeds of up to 21Mbps, from 14.4Mbps previously. Singapore launched a maritime WiMAX network last year. It offers mobile internet access to ships in the Port of Singapore and up to 15km from Singapore’s southern coastline.







In December 2008, Pyramid Research predicted that FTTB/FTTH operators would pass around 212 million homes by the end of 2013, which is only about 12 percent of all households globally. But Asia stands out because of how aggressively its countries are passing homes with fiber.

The Asia/Pacific region had more than 68 million homes passed by fiber infrastructure at the end of 2008. Nearly 25 million households throughout Asia/Pacific subscribe to a fiber-based service. So Asia/Pac countries represent about 78 percent of all residential FTTH connections across the planet.

South Korea, Hong Kong, Japan, and Taiwan have the highest proportion of households connected to fiber networks globally.

Japan, home of NTT Group (NYSE: NTT), and Hong Kong, home of PCCW Ltd. (NYSE: PCW; Hong Kong: 0008) and Hong Kong Broadband Network Ltd. (HKBN), each boast fiber penetration above 20 percent.

During the next five years, additional network upgrades in five Asian markets (Japan, South Korea, China, Hong Kong, and Taiwan) will increase penetration to 122 million homes passed, the Pyramid report states.


Passive Optical Networks could have 50 million subscribers in the world by 2010.

While Ethernet PON (EPON) and Gigabit PON (GPON) are the dominant fiber to the home (FTTH) technologies, this presentation will give a quick overview of other FTTH technologies and how DOCSIS PON (DPON) [a mix of optical fiber and cable to speed up cable internet access] can map on top of them.


California has had a broadband plan proposed.

OECD broadband data for 2008

FURTHER READING

Some of the communication predictions that I made in 2006 are true or coming true in Japan and South Korea.
Fiber to the home (100 Mbps-1000 Mbps) 2010-2015 [Japan is there now, and the fastest DOCSIS 3 cable connections can reach about 300 Mbps.] Fiber could perhaps reach 18-20% penetration worldwide, versus the 12% [212 million homes by 2013] forecast by Pyramid Research. High-speed cable and DSL could put another 20-50% of homes in the 100+ Mbps download range.

Next-generation communication (1000 Mbps-10,000 Mbps) 2013-2020 [upgraded fiber, rolling out in Japan next year. The components are easy to upgrade for faster speeds because the same fiber from earlier rollouts is reused.]

Wireless superbroadband (50-1000 Mbps) 2009-2012 [the faster versions of whitespace modems, advanced 3.5G and 4G mobile, faster WiMax implementations, free-space optics]

UPDATE: Fujitsu has demonstrated 10 Gbps [10,000 Mbps] millimeter-wave wireless and plans to commercialize it by 2012.

LTE [Long Term Evolution/System Architecture Evolution] has surpassed the technical requirements outlined by the 3GPP, achieving a peak downlink rate of 154 Mbps in a field drive test using 2x2 MIMO. Another test using a 4x4 MIMO configuration yielded downlink peak rates close to 250 Mbps. Robson adds that LTE doesn't suffer much throughput loss from additional factors like differing RF conditions between users and application overhead, and that it also meets the 3GPP's latency requirements of 10 milliseconds for the air interface, 20 milliseconds end to end, and 100 milliseconds for the control plane.

But for all the hoopla over LTE's imminent arrival, that won't equate to massive rollouts any time soon. The standardization process is still ongoing, and apart from some early rollouts in the US and Japan (where NTT DoCoMo has already begun deploying its pre-LTE standard "Super3G" technology) in 2010, even the most optimistic projections don't see serious LTE commercial rollouts before 2011.


UQ Communications has been building a network of WiMax transmitters in Tokyo and neighboring cities of Kawasaki and Yokohama and aims to reach 90% of Japan’s population by 2012.

The promise of WiMax isn't that it offers another phone network for voice calls. Rather, the network is expected to make wireless e-mail and Internet surfing available from more places. WiMax resembles Wi-Fi, but it can reach up to 30 miles compared to Wi-Fi's far more limited range of a few hundred feet. That means anyone with a laptop computer or other portable gizmo that comes with WiMax technology can tap into the Net wirelessly over a zippy wireless network without a Wi-Fi router or a cable connection.

Three manufacturers (Toshiba, Panasonic, and Onkyo) showed off laptops that will run on Intel chips with WiMax capability when UQ's services start. WiMax download speeds in Japan will be as fast as 40 Mbps, comparable to Wi-Fi connections already in use and faster than broadband Internet connections over a land-based line in most other countries. (Upload speeds are slower, at 10 Mbps.)
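To put the quoted WiMax speeds in practical terms, here is a rough transfer-time estimate. The 700 MB example file is an assumption chosen purely for illustration, and real-world throughput would fall short of the advertised peak rates.

```python
# Illustrative transfer times on UQ's WiMax service at the quoted peak speeds.
# The 700 MB file size is an assumed example (roughly a CD image).

down_mbps, up_mbps = 40, 10                  # advertised download / upload speeds
file_mb = 700                                # example file size in megabytes

download_seconds = file_mb * 8 / down_mbps   # 8 bits per byte
upload_seconds = file_mb * 8 / up_mbps
print(f"Download: {download_seconds/60:.1f} min, upload: {upload_seconds/60:.1f} min")
# roughly 2.3 minutes down and 9.3 minutes up at peak rates
```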


Fiber to the premises by country at Wikipedia

Japan has the Kizuna satellite for internet access across Asia.

The Kizuna satellite communication system aims for a maximum speed of 155 Mbps (receiving) / 6 Mbps (transmitting) for households with 45-centimeter aperture antennas (the same size as existing Communications Satellite antennas), and ultra-high-speed 1.2 Gbps communication for offices with five-meter antennas.
In addition to establishing a domestic ultra-high-speed Internet network, the project also aims to construct ultra-high-speed international Internet access, especially with Asia-Pacific countries and regions that have close ties to Japan.


Z-RAM and Other Next-Generation Computer Memories

1. Dr. Tae-Su Jang, a member of the technical staff in the Hynix R&D Division, will deliver a paper on June 17, 2009 that highlights the operating characteristics of Z-RAM memory technology fabricated on a 50nm DRAM process. Using a 54nm x 54nm floating-body memory bitcell, the paper presents the longest floating-body retention time reported (longer than 8 seconds at 93 degrees Celsius) as well as an extremely large programming window of 1.6 volts. These improvements were obtained through DRAM technology optimizations such as junction engineering, thermal treatments, and improved passivation processes. The paper concludes by demonstrating the suitability of floating-body memories for DRAM applications.

Z-RAM, short for "zero capacitor RAM," is a new type of computer memory in development by Innovative Silicon Inc. Z-RAM offers performance similar to the standard six-transistor SRAM cell used in cache memory but uses only a single transistor, therefore offering much higher densities. Z-RAM offers twice the density of DRAM, and five times that of SRAM. Although Z-RAM's individual cells are not as fast as SRAM's, the denser layout shortens the signal lines, so a similar amount of cache can run at roughly the same overall data rates while taking up less space. Response times as low as 3 ns have been stated.
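The density claims translate into a simple area trade-off. The sketch below assumes a hypothetical 4 MB SRAM cache as a baseline (a number chosen only for illustration) and applies the stated 5x-over-SRAM and 2x-over-DRAM density ratios.

```python
# Rough illustration of the stated density ratios: for a fixed die area, a
# Z-RAM cache could hold about 5x the bits of an SRAM cache and about 2x a
# DRAM array. The 4 MB SRAM baseline is an assumed example, not a figure
# from the article.

sram_cache_mb = 4        # assumed SRAM cache capacity in a given die area
zram_vs_sram = 5         # Z-RAM density relative to SRAM (stated)
zram_vs_dram = 2         # Z-RAM density relative to DRAM (stated)

zram_mb = sram_cache_mb * zram_vs_sram
dram_mb = zram_mb // zram_vs_dram
print(f"Same-area Z-RAM cache: ~{zram_mb} MB")   # ~20 MB
print(f"Same-area DRAM array:  ~{dram_mb} MB")   # ~10 MB
```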

Innovative Silicon has a technology overview of Z-RAM.

2. Atomic layer deposition is being used for next-generation flash and other non-volatile memories.





Technological innovations in superior cell architectures, new materials, and advanced deposition techniques such as atomic layer deposition (ALD) will be required to enable the continued growth in flash memory. As a result, technology development efforts in the flash memory market parallel leading research in logic transistor development, such as the use of high-k dielectrics and work function engineered metal gates. Furthermore, novel memory architectures utilizing phase change materials and ferroelectric thin films are being investigated to respond to the mounting scaling challenges of non-volatile memory (NVM) cells.

Flash accounted for 8% of the total $277 billion semiconductor industry in 2008, and is expected to post higher than average growth rates of 18% annually. The demand for memory capacity has resulted in aggressive scaling of flash memory cells far in excess of projections by the International Technology Roadmap for Semiconductors (ITRS).
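For scale, 8 percent of a $277 billion market is roughly a $22 billion flash segment. Compounding the quoted 18 percent annual growth gives a feel for how quickly that could expand; this is a simple projection for illustration, not a forecast from the article.

```python
# Quick check of the flash market figures: 8% of a $277B semiconductor market,
# compounding at the quoted 18% per year. Purely illustrative arithmetic.

semiconductor_market_b = 277          # 2008 semiconductor market, $ billions
flash_share = 0.08                    # flash share of the total
growth_rate = 0.18                    # expected annual growth rate

flash_2008 = semiconductor_market_b * flash_share
print(f"Flash market 2008: ~${flash_2008:.0f}B")                   # ~$22B
for years in (3, 5):
    projected = flash_2008 * (1 + growth_rate) ** years
    print(f"After {years} years at 18%/yr: ~${projected:.0f}B")    # ~$36B, ~$51B
```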

FeRAM

Several alternative technologies, like FeRAM (shown above on the right), can potentially provide faster programming times, lower programming voltages, and increased endurance compared with flash.








Phase Change Memory

Chalcogenide-based phase change memory (PCM) is another alternative novel memory being actively investigated. Phase change can occur on the order of 10 ns, and the material can be cycled between 10^9 and 10^13 times, considerably in excess of the roughly 10^6 write/erase cycles required of modern-day flash technologies. Besides speed, endurance, and low-voltage operation, PCM is highly scalable. Because PCM uses a deposited thin-film structure that is not inherently tied to the silicon substrate, there is potential to stack these memory arrays on back-end metallization layers, resulting in higher densities. Denser arrays and the need for low thermal budgets and high conformality have resulted in active study of ALD techniques for the deposition of GST, resistive heater metal layers, and post-GST dielectrics.
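The endurance gap is easier to appreciate as a lifetime under a constant write load. The one-write-per-second-per-cell workload below is an assumed figure used only to make the quoted cycle counts concrete.

```python
# Endurance comparison under a constant write load, using the cycle counts
# quoted above. The one-write-per-second-per-cell rate is an assumed workload.

writes_per_second = 1.0                       # assumed sustained writes to one cell
seconds_per_year = 365 * 24 * 3600

flash_cycles = 1e6                            # typical flash write/erase endurance
pcm_cycles_low, pcm_cycles_high = 1e9, 1e13   # quoted PCM endurance range

print(f"Flash cell lifetime: {flash_cycles / writes_per_second / 86400:.0f} days")      # ~12 days
print(f"PCM cell lifetime:   {pcm_cycles_low / writes_per_second / seconds_per_year:,.0f}"
      f" to {pcm_cycles_high / writes_per_second / seconds_per_year:,.0f} years")       # ~32 to ~317,000 years
```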


Oak Ridge Jaguar Supercomputer Going to About 2.7 Petaflops by Yearend


New six-core "Istanbul" processors from AMD are expected to arrive later this summer and will rev up Jaguar's peak processing capability to "well over 2 petaflops." That's more than 2,000 trillion mathematical calculations per second.

The switch from quad-core to six-core processors will result in about a 70 percent performance gain and also enhance memory performance. So Jaguar's 1.6 petaflop peak processing should rise to about 2.7 petaflops.
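The 2.7 petaflop figure is simply the quoted 70 percent gain applied to the current 1.6 petaflop peak:

```python
# The 2.7 petaflop estimate follows directly from the quoted 70 percent gain
# applied to Jaguar's current 1.6 petaflop peak.

current_peak_pf = 1.6            # current Jaguar peak, petaflops
performance_gain = 0.70          # estimated gain from the six-core upgrade

upgraded_peak_pf = current_peak_pf * (1 + performance_gain)
print(f"Estimated upgraded peak: {upgraded_peak_pf:.1f} petaflops")   # ~2.7
```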

Separately, the University of Tennessee's "Kraken," which is housed at ORNL and funded by the National Science Foundation, is also being upgraded with new six-core processors and will join Jaguar as a member of the exclusive petaflops club -- capable of more than 1,000 trillion calculations per second. The Kraken supercomputer currently runs at 600 teraflops.





AMD says it will release Istanbul in June 2009 and believes Istanbul will keep it competitive against Intel over the next year. The company believes that a planned 12-core processor, code-named Magny-Cours, due in 2010, will give it a major advantage.

And that may be why Advanced Micro Devices Inc. said this week that it is releasing its six-core Opteron chip in June, well ahead of schedule, and plans to follow it early next year with a chip code-named Magny-Cours that will ship in eight- and 12-core models. After that, it plans a 16-core chip in 2011.


If there were a 2010 chip refresh on the Jaguar supercomputer, then it could reach 5 petaflops of peak performance, and a 2011 chip refresh could provide 7 petaflops.

Pricing of top-of-the-line AMD Opteron chips is about $1,000 each. So 50,000 to 200,000 top-of-the-line chips would be $50 million to $200 million as a ballpark processor cost for a petaflop supercomputer refresh upgrade.
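The ballpark follows directly from the quoted chip price; note that it covers processors only, not memory, interconnect, storage, power, or facilities, which would add substantially to the total.

```python
# Ballpark processor-only hardware cost for a refresh, as described above.

price_per_chip = 1000                         # approximate top-of-the-line Opteron price, $
for chip_count in (50_000, 200_000):
    cost_m = chip_count * price_per_chip / 1e6
    print(f"{chip_count:,} chips: ~${cost_m:.0f} million")   # $50M and $200M
```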

June 08, 2009

iPhone 3GS will have Voice Control, 2X Speed and Video Camera



The next iPhone, the iPhone 3GS, will be available June 19, 2009. It will have twice the speed for browsing, a video camera, 32 Gigabytes of storage and voice control.

TechCrunch notes that upgrading from an existing iPhone to the iPhone 3GS would be a sucker's bet.






iPhone 3GS details are at the Apple site.

* Voice Control recognizes the names in your Contacts and knows the music on your iPod. So if you want to place a call or play a song, all you have to do is ask.
* The first thing you’ll notice about iPhone 3G S is how quickly you can launch applications. Web pages render in a fraction of the time, and you can view email attachments faster. Improved performance and updated 3D graphics deliver an incredible gaming experience, too. In fact, everything you do on iPhone 3G S is up to 2x faster and more responsive than iPhone 3G.
* Now you can shoot video, edit it, and share it, all on your iPhone 3G S. Shoot high-quality VGA video in portrait or landscape.
* Cut, copy, and paste words and photos, even between applications. Copy and paste images and content from the web, too. [So perhaps blogging from the iPhone will be practical?]
* With a built-in digital compass, iPhone 3G S can point the way. Use the new Compass app, or watch as it automatically reorients maps to match the direction you're facing.
* The new 3-megapixel camera takes great still photos, too, thanks to built-in autofocus and a handy new feature that lets you tap the display to focus on anything (or anyone) you want.

Interview with Ben Goertzel, Who Is Working on Artificial General Intelligence

Here is an interview with Ben Goertzel. The interview is by Sander Olson, who has also interviewed Dr. Richard Nebel. (This link is to all Sander Olson interviews on this site.) Dr. Goertzel has a PhD in mathematics and is currently one of the world's top Artificial General Intelligence (AGI) researchers. In this interview, Dr. Goertzel makes some noteworthy points:

- There is an 80% chance of creating a sentient AI within 10 years with proper funding; adequate funding would be only $3 million per year. Even without that modest funding, Goertzel is still confident that AGI will arrive within 20 years (he puts the probability at 70% at current funding levels).

- The pace of AI research is clearly accelerating: forums and conferences are occurring at an increasing pace, and corporations are increasingly interested in AI.

- Several industries, including robotics and search engines, could become major drivers of AI research in the next few years.

- Goertzel is working with J Storrs Hall and others to create an artificial intelligence roadmap similar to the Foresight/Battelle roadmap unveiled last year. The creation of this roadmap will be challenging but should further spur AI development.

Note: An interview with Fusion researcher Eric Lerner will be completed shortly.

Question: Your company, Novamente, is doing research on Artificial General Intelligence (AGI). How is that research progressing?

Answer: AGI is a long term pursuit aimed at achieving human level and eventually superhuman level intelligence. But given the enormousness of the challenge you shouldn't expect to be interviewing a superhuman AI anytime in the next several years. However, we do believe that we are developing a system that has a reasonable chance of achieving human-level intelligence and sentience within the next decade, especially if our project gets properly funded.






Question: Tell us about the Novamente Cognition Engine (NCE). Does it really possess the ability to improve itself?

Answer: There is a fine line between learning and self-improvement - to an extent any life form or system that learns is capable of improving itself. The cognition engine differentiates itself from most current AI systems because it changes its knowledge and its strategies as it progresses. But it doesn’t yet have the power of full self-improvement, in the sense of being able to rewrite all its own code according to its own ideas. That will come a little later!

Question: The NCE already controls a dog in second life. How long before it controls a convincing human?

Answer: It is already possible to create an AI bot that can pass for human in many casual online interactions, but that is not our objective. Rather, we are striving to create a program that is genuinely intelligent and which would pass rigorous testing. We are years from that goal. How many years it takes will depend on funding and the effectiveness of our algorithms. But this project should be doable within a decade, given adequate funding.

Question: What funding levels do you consider adequate?

Answer: We are capable of operating on a shoestring budget, so an annual budget of $3 million per year should be sufficient, including staff and hardware. Maybe even less; depending on the hardware requirements, which aren’t yet fully clear, we might be able to make do with half that. The only input costs are for skilled labor, perhaps a dozen researchers, and for sufficiently large and capable server farms. Finding a source of funding for high-risk, long-term research is a continuing challenge. But AGI really could be developed on a shoestring budget, relative to a lot of other technologies.

Question: You don't believe that reverse engineering the brain is the quickest way to achieving AI. Why do you believe that other approaches are superior?

Answer: Ray Kurzweil's argument that brain-scanning technologies will improve exponentially in the next few decades is plausible, but some of his exponential growth curves are more reliable than others. There simply aren't enough data points to be able to make extrapolations about the future accuracy of brain scanning with a high degree of confidence. By contrast, the Moore's law and digital computing trends are more clearly established and will directly benefit all AI approaches. I think that AGI via brain scanning and emulation will work, I just think there’s a possibility to create powerful AGI faster via using other methods, like the ones we’re working on.

Question: J. Storrs Hall has argued that the hardware necessary for general intelligence already exists. Do you agree?

Answer: Yes. I would be surprised if one could run an AGI program on my MacBook, but I wouldn't be surprised if Google's or Amazon's server farms have sufficient computational capacity to achieve human-level intelligence. Current AI researchers may be constrained by their lack of access to sufficient computer power, but Moore's law will eventually eliminate that problem. Although better hardware always helps, the primary problem at this point is software and algorithms, not hardware.

Question: Is the pace of AGI research accelerating?

Answer: The pace is clearly accelerating. Ten years ago talks on general artificial intelligence at conferences were virtually nonexistent. Now conferences and symposia are springing up on AGI at an ever increasing pace. The biggest problem with AGI research at this point is funding. But with the increasingly broad interest that we’re seeing, increasing funding may come. Another problematic issue is the lack of metrics for measuring progress. How do you know when you are a quarter of the way to your goal? This is something I’m putting some effort into lately.

Question: What is the likelihood of the development of AGI in the next 10, 20, 30 years?

Answer: The answer to that depends largely on the resources that society dedicates to the problem. Assuming current funding levels, I would guess a 70% chance within the next twenty years, and a 98% chance of general AI occurring within the next 30 years. But with generous funding there is an 80% chance of creating AGI within the next ten years. And superhuman AIs will very likely emerge within a few years of human-level AIs.

Question: What is the single biggest impediment to AGI research?

Answer: Funding is currently the biggest impediment. The ideas already exist, but building complex software is a nontrivial task. Microsoft employs hundreds of coders to develop an operating system. By contrast we have a handful of engineers working on AGI. We would also benefit from having our own server farms, which are expensive to build and maintain.

Question: During the next decade, what will be the main driver of AI research?

Answer: At a certain point, service robotics is going to take off. If the robotics industry manages to solve the issues regarding low-level processing (walking without falling, manual dexterity, object recognition, and so forth) quickly enough, then service robotics could become a major driver of AI research. Another major driver could be online search and question answering applications, which would benefit enormously from having natural language search. Another possible driver of AGI could be the finance industry, since that industry already makes extensive use of narrow AI systems; before long some of the visionaries of the financial world may realize the overwhelming advantages of being able to utilize a general AI system.


Question: Will it be possible to achieve AGI by combining numerous narrow AI programs?

Answer: Although general intelligence systems might make use of narrow AI programs, something besides a combination of narrow AI programs needs to be involved in order to have a true general intelligence. Even if you integrated together a lot of great narrow AI programs, without the transfer of knowledge from one of the narrow-AI programs to another, the system wouldn't be able to derive insights and reason in a general way. And this kind of “transfer learning” is what general intelligence is all about. A grab-bag of narrow-AI algorithms would also lack a sense of self, which is a prerequisite of any true AGI.

Question: The Battelle/Foresight nanotechnology roadmap was recently unveiled. Is there a similar roadmap for AGI?

Answer: There isn't yet, and we are trying to remedy that. A colleague and I are organizing a workshop for fall 2009 with the aim of formulating an AGI roadmap. This is an important step, since there are currently more approaches to AGI than AGI researchers, and the commonalities are sometimes obscured by the different terminologies different researchers use. J Storrs Hall, one of the key formulators of the nanotechnology roadmap, is going to be involved with the AGI roadmap as well. The creation of an AGI roadmap should be a boon to the field of artificial intelligence.

FURTHER READING
Novamente blog.

Novamente Cognition Engine.