October 08, 2007

Petaflop plans progress

The Blue Gene/P machine at Argonne is expected to reach one sustained petaflop — one quadrillion floating-point operations per second — in 2008, and to hit a peak speed of three petaflops by the end of 2008.

Turek said IBM's goal was 10 petaflops by 2011 and 20 petaflops by 2017. The Japanese have announced their intent to reach 10 petaflops by 2012.

IBM and Google have dedicated a large cluster of several hundred computers (a combination of Google machines and IBM BladeCenter and System x servers), which is planned to grow to more than 1,600 processors.
Students will access the cluster over the Internet to test their parallel programming course projects. The servers will run open-source software, including the Linux operating system, Xen virtualization, and Apache's Hadoop project, an open-source implementation of Google's published computing infrastructure, specifically MapReduce and the Google File System (GFS).
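The MapReduce model that Hadoop implements can be sketched in miniature with the canonical word-count example. This is an illustrative sketch only: real Hadoop jobs are written in Java against the framework's Mapper/Reducer interfaces, and the function names below are my own, not Hadoop's API.

```python
from collections import defaultdict

def map_phase(documents):
    # Emit (word, 1) pairs, as a mapper would.
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["the"])  # 3
print(counts["fox"])  # 2
```

The appeal for teaching parallel programming is that each phase parallelizes naturally: mappers run independently on document splits, and reducers run independently per key.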

Sony PS3s helped the Folding@home project pass a petaflop of processing power in September 2007.


Sigma said...

Brian, have you heard about this site:
intelligence realm
They are trying to brute-force their way to A.I. with distributed computing.

bw said...

I had not heard about it. It is only a small project so far.

Clever application of brute force is a proven, useful approach.

It is how checkers and chess were conquered, and how Go soon could be. Take a large search space, apply clever evaluation to know whether you are getting better solutions, then optimize and use tricks to shrink the search space by factors of millions or billions.
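One of the classic search-space-shrinking "tricks" used by checkers and chess programs is alpha-beta pruning on the minimax game tree: branches the opponent would never allow are cut off without being explored. A minimal sketch, using a hypothetical toy tree rather than a real game:

```python
def alphabeta(node, depth, alpha, beta, maximizing, children, value):
    kids = children(node)
    if depth == 0 or not kids:
        return value(node)
    if maximizing:
        best = float("-inf")
        for child in kids:
            best = max(best, alphabeta(child, depth - 1, alpha, beta,
                                       False, children, value))
            alpha = max(alpha, best)
            if beta <= alpha:
                break  # prune: the minimizing opponent will avoid this branch
        return best
    else:
        best = float("inf")
        for child in kids:
            best = min(best, alphabeta(child, depth - 1, alpha, beta,
                                       True, children, value))
            beta = min(beta, best)
            if beta <= alpha:
                break  # prune symmetrically for the minimizing player
        return best

# Toy two-ply game tree; leaves hold position scores.
tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
scores = {"a1": 3, "a2": 5, "b1": 2, "b2": 9}
result = alphabeta("root", 2, float("-inf"), float("inf"), True,
                   lambda n: tree.get(n, []), lambda n: scores.get(n, 0))
print(result)  # 3
```

In this toy tree the leaf b2 is never evaluated: once b1 scores 2, branch "b" can never beat the 3 already guaranteed under "a", which is exactly the kind of reduction that, combined with good evaluation functions, makes huge search spaces tractable.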

IEEE Spectrum article on progress toward cracking Go

Sigma said...

I have always wondered if it is possible to create a system where you start by evolving the base components of a design, say a computer chip, then add criteria at every scale from the nano to the macro, and see if it could essentially produce the most optimized design possible.

Of course, the programming would be incredibly difficult, but perhaps someone could write an evolutionary algorithm to find the best possible way to create such a program.

In essence, what I am thinking of is an invention machine like John Koza's, but with the ability to create and integrate all facets of design and operation into the most optimal result.
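The core loop behind Koza-style evolved design can be sketched very simply: score candidates against the design criteria, keep the fittest, and mutate survivors. Everything here is a placeholder assumption — a bit string standing in for a "design" and a match-the-target fitness standing in for real multi-scale criteria — not Koza's actual genetic programming system, which evolves program trees rather than bit strings.

```python
import random

random.seed(0)
TARGET = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # hypothetical "ideal design"

def fitness(candidate):
    # Score a candidate by how many positions satisfy the design criteria.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    # Flip each bit with a small probability.
    return [1 - bit if random.random() < rate else bit for bit in candidate]

def evolve(pop_size=20, generations=100):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            break  # perfect design found
        # Keep the top half unchanged, refill with mutated copies of survivors.
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return pop[0]

best = evolve()
print(fitness(best))
```

The hard part the comment points at is not this loop but the fitness function: encoding "optimal at every scale from nano to macro" as a computable score is where nearly all of the difficulty lives.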

Anonymous said...

My prediction is that a superintelligence will never be built at the size of the brain and with the brain's power consumption (20 W). The brain has about 10^17 FLOPS and about 10^17 bits of storage. My philosophy is that nature is a faster and better route than manufactured chips. Nature uses what you cannot, because you have limitations. It is very hard to simulate neurons with a computer, and hard to know how they are connected... The brain is a general-purpose computer built in a better way than a programmed computer.