$16 million Synthetic Biology funding

The Synthetic Biology Engineering Research Center at the University of California, Berkeley, has been funded by the NSF with $16 million. Synthetic biology today is where chips were 50 years ago: a researcher who discovers, say, a potentially useful DNA fragment has no reliable way of mass-producing it. Instead, to create large quantities, she must rely on a collection of laborious, hit-and-miss processes, which is the best the field currently has to offer.

Prof. Keasling says he envisions a day when a biologist can concentrate on difficult science questions and leave production and engineering matters to others. That’s the way many chip companies work, creating the designs for their chips themselves, but then shipping off the patterns to “fabs” to get the products made.

Drew Endy, an MIT professor of biological engineering who is involved in the effort, says researchers like him have learned from the computer industry the importance of three main ideas: standardization, decoupling and abstraction.

Standardization is the process of establishing a common technical standard among otherwise independent parties. In synthetic biology, this is what is happening with BioBricks and the Registry of Standard Biological Parts.
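
As a rough software-flavored illustration of what standardization buys, here is a minimal Python sketch in which every part exposes the same flanking "interface" sequence, so one generic composition rule works for any pair of conforming parts. The class, the compose function and the prefix/suffix strings are invented placeholders, not the actual BioBricks specification.

    # Illustrative sketch: a "standard part" with a fixed prefix/suffix interface,
    # so any two conforming parts can be joined by the same assembly procedure.
    # The sequences and names below are hypothetical, not the BioBricks spec.
    from dataclasses import dataclass

    STANDARD_PREFIX = "GAATTCGCGGCCGCTTCTAGAG"   # placeholder interface sequence
    STANDARD_SUFFIX = "TACTAGTAGCGGCCGCTGCAG"    # placeholder interface sequence

    @dataclass
    class StandardPart:
        name: str
        insert: str  # the biologically meaningful sequence between the interfaces

        def full_sequence(self) -> str:
            # Every conforming part exposes the same flanking interface.
            return STANDARD_PREFIX + self.insert + STANDARD_SUFFIX

    def compose(upstream: StandardPart, downstream: StandardPart) -> StandardPart:
        # Because both parts share the interface, composition is one generic rule
        # rather than a one-off protocol for each pair of parts.
        return StandardPart(
            name=f"{upstream.name}+{downstream.name}",
            insert=upstream.insert + "TACTAG" + downstream.insert,  # illustrative scar
        )

    promoter = StandardPart("promoter", "TTGACA" * 3)
    reporter = StandardPart("reporter", "ATG" + "GCT" * 20)
    device = compose(promoter, reporter)
    print(device.name, len(device.full_sequence()))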

Decoupling refers to splitting a task into multiple parts, the way the computer industry has different suppliers for disk drives, memory and CPUs. Currently, says Prof. Endy, most biology labs do everything themselves.
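
In software terms, decoupling looks roughly like the hypothetical sketch below: the code that designs a construct depends only on a generic "synthesizer" interface, so a lab could switch between an outside supplier and its own bench without touching the design logic, much like the chip-design/fab split described above. All names are invented for illustration.

    # Illustrative sketch of decoupling: design and fabrication sit behind
    # separate interfaces, so the designer can swap synthesis suppliers
    # without changing its own code. All names here are hypothetical.
    from typing import Protocol

    class Synthesizer(Protocol):
        def synthesize(self, sequence: str) -> str:
            """Return an order/tracking identifier for the synthesized DNA."""

    class MailOrderFab:
        def synthesize(self, sequence: str) -> str:
            return f"mail-order-{len(sequence)}bp"

    class InHouseBench:
        def synthesize(self, sequence: str) -> str:
            return f"in-house-{len(sequence)}bp"

    def build_construct(design: str, fab: Synthesizer) -> str:
        # The designer depends only on the Synthesizer interface,
        # not on who actually makes the DNA.
        return fab.synthesize(design)

    design = "ATG" + "GCA" * 100
    print(build_construct(design, MailOrderFab()))
    print(build_construct(design, InHouseBench()))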

Abstraction takes a cue from what has happened in programming languages over the decades; software has advanced to the point where programmers increasingly are able to use English-like statements in their code, as opposed to the 1s and 0s of the early days of computing. Prof. Endy says he hopes that future biologists won’t need to work on the sort of molecule-by-molecule basis that is used today.
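
As a loose analogy for that abstraction point, the hypothetical sketch below "compiles" an English-like statement of intent into a lower-level parts list, hiding the base-by-base detail much as a high-level language hides machine code. The mapping table and part names are invented, not drawn from any real registry.

    # Illustrative sketch of abstraction layers: a high-level statement of intent
    # is "compiled" down to a lower-level parts list, hiding sequence-level detail.
    # The mapping table and part names are invented for illustration.

    # Low-level layer: named parts standing in for actual DNA sequences.
    PART_LIBRARY = {
        "signal_sensor": "sensor_promoter_part",
        "reporter": "gfp_coding_part",
        "terminator": "standard_terminator_part",
    }

    def compile_device(intent: str) -> list:
        # High-level layer: translate an English-like intent into a parts list.
        if intent == "express reporter when signal present":
            return [PART_LIBRARY["signal_sensor"],
                    PART_LIBRARY["reporter"],
                    PART_LIBRARY["terminator"]]
        raise ValueError(f"no compilation rule for: {intent!r}")

    print(compile_device("express reporter when signal present"))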

The parts in the Registry of Standard Biological Parts currently average about 1,000 base pairs in length.