
January 26, 2008

Bakken and Torquay Formations - A Saudi Arabia of oil under Saskatchewan, North Dakota, South Dakota, Montana and Manitoba

MORE NEWS
There is a separate North Dakota study of the Bakken and a link to where the state publishes online reports of completed wells and wells with reported production. A number of wells were reported on April 28, 2008.

BREAKING NEWS
My guess was far too optimistic. The USGS 2008 figure was 3.65 billion barrels of oil for the US part of the Bakken formation.

UPDATE: Bakken in the context of the world's oil megaprojects: about 7 million barrels per day is being added in 2008 and 2009, from about 50 projects per year.

The Bakken oil formation is possibly the largest conventional oil discovery in Canada since 1957. If this oil formation plays out toward the higher end of size and recoverability then it will change the geopolitics of oil and the economies of the United States and Canada. If a lot of the oil proves difficult to recover now, new technologies could still drastically improve the percent recoverable. The motivation to pull out another 100 billion barrels would be $9 trillion at today's prices. UPDATE: Here is my article on multi-stage fracturing (Stackfrac) horizontal drilling.

This Thursday, April 10, 2008, the new US Geological Survey assessment of the potential of the Bakken will be released.

Estimates are anywhere from a conservative 25 billion barrels of oil in place, to a high estimate by the United States Geological Survey of 400 billion barrels of oil in the Bakken formation. Not only is the oil plentiful, but it's high quality too, 41 degree light sweet crude. The Bakken formation is a formation of black shale, siltstone, and sandstone. The formation lies beneath the Mississippian formation, Saskatchewan's current source of light sweet crude. The Bakken formation is situated beneath southeastern Saskatchewan, southwestern Manitoba, and North Dakota.


In 2007, EOG Resources out of Houston, Texas reported that a single well it had drilled into an oil-rich layer of shale below Parshall, North Dakota is anticipated to produce 700,000 barrels of oil.


The resources of the Bakken Formation are defined by the United States Geological Survey (USGS) as unconventional “continuous-type” oil resources. This means the hydrocarbons within the Bakken have not accumulated into discrete reservoirs of limited areal extent. With new horizontal drilling and completion technology taken into account, the technically recoverable resource base for the entire Bakken Formation is potentially much larger.



Isopach map of the Bakken formation in Saskatchewan, Canada (Map of the areal extent and thickness variation of a stratigraphic unit; used in geological exploration for oil and for underground structural analysis)



The Williston Basin covers approximately 300,000 square miles over parts of North Dakota, South Dakota, and Montana and parts of the adjacent Canadian provinces of Saskatchewan and Manitoba. The Bakken formation can be encountered throughout the Williston Basin.

Application of new drilling and completion technology has begun to unlock new potential in this legacy basin. There is speculation that the total resource in the play could be in the billions of barrels.



Hydrocarbon Potential of the Bakken and Torquay Formations,
Southeastern Saskatchewan by L.K. Kreis and A. Costa


Much of the oil reservoired within the Bakken shale likely resides in a network of enhanced porosity and permeability related to microfractures.

• Upper and Lower Bakken shales showing anomalously high resistivity values in southeastern Saskatchewan suggest that they are saturated with oil that has either been generated in place or has migrated into these locations.

• Basement structures, such as those associated with the Brockton-Froid lineament, and compactional features in regions of Middle Devonian salt dissolution may control fractures that serve as primary migration pathways for Bakken-sourced oils into possible plays in the Bakken and Torquay formations.

• The relatively low permeability of Bakken and Torquay reservoirs is likely best exploited through horizontal wells.

• A large untested and poorly evaluated rock volume remains in the Bakken and Torquay formations of southeastern Saskatchewan, within which there may be significant potential for finding new oil.


Porosities in the Bakken average about 5%, and permeabilities are very low, averaging 0.04 millidarcies—much lower than typical oil reservoirs. However, the presence of horizontal fractures makes the Bakken an excellent candidate for horizontal drilling techniques in which a well drills along the extent of the rock layer, rather than punching a hole vertically through it. In this way, many thousands of feet of oil reservoir rock can be penetrated in a unit that reaches a maximum thickness of only about 140 feet. Production is also enhanced by artificially fracturing the rock.



Oils with an API gravity of 40 to 45 command the highest market price, and those with values outside this range sell for less. Above an API gravity of 45, the molecular chains become shorter and are less valuable to a refinery. Crude oil is classified as light, medium or heavy on the following basis:
Light crude oil has an API gravity above 31.1°.
Medium oil has an API gravity between 22.3° and 31.1°.
Heavy oil has an API gravity below 22.3°.
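
As a quick illustration, the classification and pricing rules above boil down to a few cutoffs. A minimal Python sketch (the cutoffs and the 40-45 premium band are from the text; the function names are mine):

def classify_crude(api_gravity):
    """Classify crude oil by API gravity, using the cutoffs listed above."""
    if api_gravity > 31.1:
        return "light"
    if api_gravity >= 22.3:
        return "medium"
    return "heavy"

def premium_priced(api_gravity):
    """Oils with an API gravity of 40 to 45 command the highest market price."""
    return 40 <= api_gravity <= 45

print(classify_crude(41), premium_priced(41))  # Bakken crude at 41 degrees: light True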


FURTHER READING
EOG Resources has about $5 million in direct drilling costs and is recovering that in about 6 months.

Petrobank in Saskatchewan has about $1.7 million drilling and completion costs with costs recovered well within one year.

In 1992, Energy and Mines estimated there was roughly 100 billion barrels of oil in the Bakken formation throughout the entire Williston Basin.

Dancsok, who co-authored the 1991 study, said the prevailing view in the geoscience community at the time was "the potential of the Bakken was immense, but the price of oil in 1991 was not such that people wanted to risk (exploration and development dollars)."

Dancsok estimated roughly 25 per cent of the Williston Basin, which covers some 200,000 square miles (518,000 square kilometres), is located in Saskatchewan. Based on that simple arithmetic, the estimate of Bakken oil in the province could range anywhere from 25 billion barrels to 100 billion barrels of oil in place.

Of course, geology isn't that simple.

"Whether the Bakken is evenly distributed throughout the basin is one question," Dancsok said. "It is deeper in North Dakota. But is the distribution of Bakken oil equal in Saskatchewan to North Dakota or Montana? That's a big question mark."


Research documents for purchase from the Saskatchewan government

North Dakota's Bakken reserve estimates

New estimates of the amount of hydrocarbons generated by the Bakken were presented by Meissner and Banks (2000) and by Flannery and Kraus (2006). The first of these papers tested a newly developed computer model with existing Bakken data. The data used was not as extensive as in some of the other studies mentioned in this discussion; the estimate of generated oil presented was 32 BBbls. The second paper, by Flannery and Kraus, used a more sophisticated computer program with extensive data input supplied by the ND Geological Survey and Oil and Gas Division. Early numbers generated from this information placed the value at 200 BBbls (pers. comm. Jack Flannery, 2005). Estimates had been revised to 300 BBbls when the paper was presented in 2006. Even if the lower value of 32 BBbls is correct, the amount that may be potentially recovered from the Bakken is significant.

How much of the oil that has been generated is technically recoverable is still to be determined. Price places the value as high as 50% recoverable reserves. A primary recovery factor of 18% was recently presented by Headington Oil Company for their Richland County, Montana wells. Values presented in ND Industrial Commission Oil and Gas Hearings have ranged from 3 to 10%. The Bakken play on the North Dakota side of the basin is still in the learning curve. North Dakota wells are still undergoing adjustments and modifications to the drilling and completion practices used for this formation. It is apparent that technology and the price of oil will dictate what is potentially recoverable from this formation.
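
Putting those two paragraphs together: technically recoverable oil is just generated oil times a recovery factor. A Python sketch crossing the quoted generation estimates with the quoted recovery factors (pairing them this way is my illustration, not the papers'):

generated_bbbls = {"Meissner & Banks (2000)": 32,
                   "Flannery & Kraus (2006)": 300}
recovery_factors = {"ND hearings low": 0.03, "ND hearings high": 0.10,
                    "Headington primary": 0.18, "Price": 0.50}

for study, generated in generated_bbbls.items():
    for source, factor in recovery_factors.items():
        print(f"{study} x {source}: {generated * factor:.1f} BBbls recoverable")
# Even the low case, 32 BBbls at a 3% recovery factor, is about a billion barrels.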


New cheap computer modeling could allow access to about 218 billion barrels of oil that is still in the ground in the USA in old wells and less economical fields.

Microwave oil recovery could make it cheaper to extract a lot more oil from oil shale (up to 800 billion barrels extractable in the USA) and from the oil sands (up to 2 trillion barrels in Canada's oilsands).

Al Fin also has a feature on the Toe to Heel Air Injection (THAI) technology for extracting oil from tar sands deposits.

A posting and message board related to the EOG Resources well

EOG Resources website

EOG announced 2008 total company production growth targets, ranging from 13 to 17 percent, depending on drilling economics and North American natural gas prices. Production growth in 2008 will be driven by United States operations, particularly the Fort Worth Basin Barnett Shale natural gas and the North Dakota Bakken crude oil plays, both very high rate of return programs.



2007 EOG Resources SEC filings

In the United States, EOG's total crude oil and condensate production increased 23 percent compared to the same quarter a year ago, driven by continued drilling success in North Dakota and the Mid Continent.

In Mountrail County, North Dakota, EOG has reported successful drilling from the Bakken Formation. The Wenco #1-30H, in which EOG has a 52 percent working interest, was completed to sales at the end of September at an initial production rate of 1,930 barrels of oil per day (Bopd), gross. Also in Mountrail County, the Austin #1-02H was completed to sales in October at an initial production rate of 2,000 Bopd. EOG has a 100 percent working interest in the well, which is located nine miles north of existing production. This is the northernmost location that EOG has drilled to date. To further confirm the northern extension of the field, following completion of the Austin #1-02H, EOG drilled an offset well, the Austin #2-03H that will be completed in November. Based on shows during drilling, EOG expects the well to produce at a rate similar to that of the Austin #1-02H. EOG has an 81 percent working interest in the Austin #2-03H. In the North Dakota Bakken Play, where it has accumulated over 175,000 net acres, EOG plans to increase drilling activity from six to eight rigs in early 2008.

"The results from the two Austin wells have given us the confidence to increase estimated reserves in the Bakken Play from the previously announced 60 million barrels of oil to approximately 80 million barrels, net to EOG. By extending the perimeter of the field, we have also increased our inventory of firm drilling locations. Therefore, we expect this area to have a significant impact on EOG's oil production in 2008 and beyond. The Bakken is currently the highest rate of return play in our drilling program," said Mark G. Papa, Chairman and Chief Executive Officer


North Dakota Bakken

Zacher 1-24H - EOG has a 75 percent working interest in the Zacher 1-24H that was completed in June with a peak production rate of 1,774 barrels of oil per day (Bopd), gross.
Hoff 1-10H - EOG has a 75 percent working interest in the Hoff 1-10H, which began flowing to sales in June at a peak rate of 2,034 Bopd, gross.
N&D 1-05H - EOG holds a 67 percent working interest in the N&D 1-05H, which was completed in July at an initial peak production rate of 1,610 Bopd, gross.


Rocky Mountain Oil Journal: EOG Confirmed to Have Significant Producer North of Parshall Field

EOG Resources (EOG) has confirmed that its wildcat 10 miles north of Parshall Field is a large volume, horizontal Bakken producer. According to EOG's 2007 Third Quarter results, the Austin #1-02H, a single-lateral test in the sw-se 2-154n-90w, Mountrail County, has been producing at an initial rate of 2,000 bopd. EOG has a 100 percent working interest in the #1-02H. Parshall Field is the largest Bakken oil pool discovered in North Dakota, with monthly production exceeding 200,000 bo. The company believes that this well is connected to Parshall Field, and if geological data supports this, Parshall Field could be a Class A oil field (100 mmbo+). Not only did EOG confirm the huge rates on the Austin #1-02H, the company also said that its first stepout to this well, the Austin #2-03H, sw-se 3-154n-90w, will also produce at similar rates based on shows encountered during the drilling of this well. The #2-03H scales about one mile west of the #1-02H.




Drillers added fracturing technology to horizontal drilling. In hydraulic fracturing, fluid is forced into the drilled hole under immense pressure to "frack," or break up, the shale further. The deeper cracks allow more oil to flow to the pipe.

FURTHER READING
How oil is used in the United States

Optimum home energy efficiency

Enhanced oil recovery and a view of possible sources for increased North American oil production, which does not yet count the Bakken oil

Bakken oil production is currently about 100,000 barrels per day and seems to be heading to 250,000 barrels per day by the end of 2008. It could take 4-6 years to get to one or two million barrels per day, since it takes time to develop the fields and build the pipelines and other infrastructure. Add to that a continued build-up of oilsands oil, Gulf of Mexico oil, and a continued increase in biofuels. Shifting the US off of non-North-American oil imports is still a big job even with the Bakken oil. However, it could be part of a substantial shift and allow more time for the development of more nuclear and other power sources and for more efficiency from thermoelectrics, ultracapacitor/battery hybrids and electric vehicles.

An informative 2006 analysis of the Bakken oil formation.



January 25, 2008

Nanochip, an Intel-backed startup, plans to make 100 gigabyte persistent memory chips by 2010

Nanochip will use an array of MEMS probes (similar to the IBM Millipede project) to store 100 gigabytes (GB) of memory on one chip. The latest flash memory chips have 64 gigabits, or 8 gigabytes, of storage in the lab. Samsung expects production of 64-Gb flash devices to begin in 2009. Samsung aims to have 128-gigabit (16 gigabyte) flash chips ready in the second half of 2008 in the lab (commercial in 2010). Nanochip uses polarization instead of IBM Millipede's heat to store data. If Nanochip can get the 100 GB chip working in volume then they would have a 2-3 year lead at the higher density over flash. If the 100 GB chip is delayed in getting into high volume then flash would continue to dominate. Of course, with Intel backing Nanochip, if they get this working properly, Intel will be able to finance a serious move for market share. Also, the processes seem promising and look good for scaling at a good price with superior features.




Nanochip uses one-micron semiconductor fabs to make its MEMS chips. This type of equipment was used over ten years ago for most semiconductor products. Therefore, the cost of building a MEMS fab to make its chips is in the tens of millions of dollars, unlike the several billion dollars needed for a 70 nm, and soon 45 nm, semiconductor fabrication facility. Furthermore, the same initial semiconductor/MEMS fab can make future generations of Nanochips, since there is no requirement to change the lithography as density doubles every year. The design is scalable to 1 Terabyte (TB) chips, according to Nanochip.
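
If the density-doubling claim holds, the 1 TB design limit is only a few doublings away from the 100 GB chip. A trivial Python check (the 2010 start date and doubling rate are Nanochip's claims; the extrapolation is mine):

import math

doublings = math.log2(1000 / 100)   # 100 GB -> 1,000 GB
print(f"{doublings:.1f} doublings; at one per year from 2010, "
      f"roughly {2010 + math.ceil(doublings)}")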


FURTHER READING
Another competing technology for future computer memory is programmable metallization cell (nanoionic)

Programmable-metallization-cell (PMC) memory, or nano-ionic memory, could start replacing flash memory in 18 months (2009)

A new type of memory technology could lead to thumb drives or digital-camera memory cards that store a terabyte of information--more than most hard drives hold today. The first examples of the new technology, which could also slash energy consumption by more than 99 percent, could be on the market within 18 months.


Endoscope in pill form, 33% thinner than current scopes

A fundamentally new design has created a smaller endoscope (current scopes are 9 mm wide; the new one is 6 mm wide with a thinner tether) that is more comfortable for the patient and cheaper to use than current technology. Its first use on a human was scanning for early signs of esophageal cancer.


The UW's scanning fiber endoscope fits in a pill that can be comfortably swallowed. The casing measures 6 millimeters wide and 18 millimeters long. (Credit: Image courtesy of University of Washington)


This is the image of the map produced by the endoscope. The device records 15 color images per second with a resolution of more than 500 lines per inch. (Credit: Image courtesy of University of Washington)

An endoscope is a flexible camera that travels into the body's cavities to directly investigate the digestive tract, colon or throat. Most of today's endoscopes capture the image using a traditional approach where each part of the camera captures a different section of the image. These tools are long, flexible cords about 9 mm wide, about the width of a human fingernail. Because the cord is so wide patients must be sedated during the scan.

The scanning endoscope developed at the UW is fundamentally different. It consists of just a single optical fiber for illumination and six fibers for collecting light, all encased in a pill. Eric Seibel, who led the development, acted as the human volunteer in the first test of the UW device. He reports that it felt like swallowing a regular pill, and the tether, which is 1.4 mm wide, did not bother him.

Once swallowed, an electric current flowing through the UW endoscope causes the fiber to bounce back and forth so that its lone electronic eye sees the whole scene, one pixel at a time. At the same time the fiber spins and its tip projects red, green and blue laser light. The image processing then combines all this information to create a two-dimensional color picture.

In the tested model the fiber swings 5,000 times per second, creating 15 color pictures per second. The resolution is better than 100 microns, or more than 500 lines per inch. Although conventional endoscopes produce images at higher resolution, the tethered-capsule endoscope is designed specifically for low-cost screening.



Printable electronics with 10 times better resolution and up to a million times faster

Chemical engineers at Princeton developed a method for shooting stable jets of electrically charged liquids from a wide nozzle. The technique, which produced lines just 100 nanometers wide (about one ten-thousandth of a millimeter), offers at least 10 times better resolution than ink-jet printing and far more speed and ease than conventional nanotechnology. The new technique can lay down lines at the rate of meters per second as opposed to millionths of a meter per second. The researchers were able to use a nozzle that is half a millimeter wide, or 5,000 times wider than the lines it produced. This also can improve some kinds of rapid prototyping, rapid manufacturing, fabbers and for medical applications like printing organs.

The key to the process is something called an "electrohydrodynamic (EHD) jet" -- a stream of liquid forced from a nozzle by a very strong electric field. In the past, the stream from such a process has been unstable, but the researchers produced a stable stream.




Schematic of EHDP. The suspension is first deployed by field-assisted flow in the form of a thin continuous filament. Rapid evaporation suppresses the Rayleigh instability and the feature shape is fixed by radiation or heating.

The result is highly practical not only because of the fineness of the stream but also because the large size of the nozzle and the distance from the nozzle to the printed surface will prevent clogs or jams.

A chief use for the technique could be in printing electrically conducting organic polymers (plastics) that could be the basis for large electronic devices. Conventional techniques for making wires of that size (100 nanometers) require laboriously etching the lines with a beam of electrons, which can only be done in very small areas.

Another application would be to use a liquid that solidifies into a fiber for making precise three-dimensional lattices. Such a product could be used as a scaffold to promote blood clotting in wounds and in other medical devices.

Princeton University has filed for a patent on the discovery and has licensed rights to Vorbeck Materials Corp., a specialty chemical company based in Maryland.

“Electronics is a huge potential application for this discovery,” said John Lettow, president of Vorbeck and a 1995 chemical engineering alumnus of Princeton. “The printing technique could greatly increase the size of video displays and the speed with which high performance displays are made.” Lettow said the technique also could be used in creating large sensors that collect information over a wide area, such as a sensor printed onto an airplane wing to detect metal fatigue.


FURTHER READING
Link to a video image of the straight and whipping jets.

Electrohydrodynamic Printing (EHDP) site at Princeton, which is part of the larger Ceramic Materials Laboratory. The Ceramic Materials Laboratory is doing a lot of interesting work.

The ability to decorate surfaces with micron or nanometer-scale features is of increasing importance for various applications, such as photonic materials [1,2], high-density magnetic data storage devices [3], microchip reactors [4] and biosensors [5]. One method of preparing such structures is through the hierarchical assembly of colloidal particles [6-10]. Colloidal particles are used since they can be synthesized in a variety of shapes and sizes from different precursor materials. Micropatterned colloidal assemblies have been produced with lithographically patterned electrodes [5,11] or micromolds [12,13]. An alternative approach is the category of direct writing techniques where patterns are formed by direct transfer of precursor materials without using masks or molds. Printed circuit board [14], transistor circuits [15], array-based nanostructures [16] and biosensors [17] have been made.


The research paper: (RJ148 ) S. Korkut, D.A. Saville, I.A. Aksay, "Enhanced Stability of Electrohydrodynamic Jets through Gas Ionization," Phys. Rev. Lett. 100 (in press) (2008)

Links to publications of the Electrohydrodynamic Printing (EHDP) group at Princeton

January 24, 2008

On the brink of synthetic life: DNA synthesis has increased twenty times, to full bacterium size

A 582,970 base pair sequence of DNA has been synthesized.
It is the first time a genome the size of a bacterium's has been chemically synthesized; it is about 20 times longer than any DNA molecule synthesized before.


This is a huge increase in capability. It has broad implications for DNA nanotechnology and synthetic biology.

This means that the Venter Institute is on the brink of synthesizing a new bacterial life form.

The process to synthesize and assemble the synthetic version of the M. genitalium chromosome began with resequencing the native M. genitalium genome to ensure that the team was starting with an error-free sequence. After obtaining this correct version of the native genome, the team specially designed fragments of chemically synthesized DNA to build 101 "cassettes" of 5,000 to 7,000 base pairs of genetic code. As a measure to differentiate the synthetic genome from the native genome, the team created "watermarks" in the synthetic genome. These are short inserted or substituted sequences that encode information not typically found in nature. Other changes the team made to the synthetic genome included disrupting a gene to block infectivity. To obtain the cassettes the JCVI team worked primarily with the DNA synthesis company Blue Heron Technology, as well as DNA 2.0 and GENEART.

From here, the team devised a five stage assembly process where the cassettes were joined together in subassemblies to make larger and larger pieces that would eventually be combined to build the whole synthetic M. genitalium genome. In the first step, sets of four cassettes were joined to create 25 subassemblies, each about 24,000 base pairs (24kb). These 24kb fragments were cloned into the bacterium Escherichia coli to produce sufficient DNA for the next steps, and for DNA sequence validation.

The next step involved combining three 24kb fragments together to create 8 assembled blocks, each about 72,000 base pairs. These 1/8th fragments of the whole genome were again cloned into E. coli for DNA production and DNA sequencing. Step three involved combining two 1/8th fragments together to produce large fragments approximately 144,000 base pairs or 1/4th of the whole genome.

At this stage the team could not obtain half genome clones in E. coli, so the team experimented with yeast and found that it tolerated the large foreign DNA molecules well, and that they were able to assemble the fragments together by homologous recombination. This process was used to assemble the last cassettes, from 1/4 genome fragments to the final genome of more than 580,000 base pairs. The final chromosome was again sequenced in order to validate the complete accurate chemical structure.
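
The five-stage assembly is essentially a tree of joins, with piece counts shrinking and sizes growing at each stage. A small Python table of the stages as described (sizes are approximate; joined pieces overlap, so the totals do not sum exactly):

stages = [
    ("cassettes (synthesized)",     101,   5.8),   # 5,000-7,000 bp each
    ("A-series: join 4 cassettes",   25,  24.0),   # cloned in E. coli
    ("B-series: join 3 A-pieces",     8,  72.0),   # cloned in E. coli
    ("quarters: join 2 B-pieces",     4, 144.0),   # E. coli limit reached here
    ("full genome (yeast)",           1, 582.97),  # homologous recombination
]
for name, count, size_kb in stages:
    print(f"{name:28s} {count:3d} pieces of ~{size_kb:g} kb")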

The synthetic M. genitalium has a molecular weight of 360,110 kilodaltons (kDa). Printed in 10 point font, the letters of the M. genitalium JCVI-1.0 genome span 147 pages.


The US is buying the modern equivalent of a Maginot Line

Tom Craver posits in a comment that the current financial troubles of the United States come from spending too much money on the military and on the wars in Iraq and Afghanistan and the war on terror, not from the subprime mortgage situation. The USA is overspending on the wrong kinds of defense, buying the modern equivalent of a Maginot Line. The Maginot Line was a belt of fixed defenses built by the French before WW2 that the Nazis simply circumvented.

The US needs some traditional defense spending, but a lot less of it, along with more technological spending and a stronger economy.

Another point: against new kinds of warfare, like the Russian hacker attack on Estonia, your defenses are not improved by your 4th, 5th, 6th and 7th aircraft carrier groups.

Supporting this is a cost estimate by Jim Cramer for buying the mortgage bond insurers: $250 billion maximum, and probably $125 billion or less.

The government needs to buy these mortgage insurers and mortgage-backed and municipal bond insurers – MBIA (MBI), PMI Group (PMI), MGIC (MTG) and Ambac (ABK), he said. The insurance covering municipal bonds could be sold to Warren Buffett or the highest bidder. Then Washington could guarantee the loans at 50 cents on the dollar. That way, even if the whole $500 billion worth defaulted, it would only cost $250 billion to lift the economy out of this rut. But most likely no more than half of that $500 billion would need to be covered.


So the likely cost is about $125 billion: covering half of the $500 billion at 50 cents on the dollar.
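
Spelled out in a few lines of Python (Cramer's numbers, my arithmetic):

insured = 500e9                    # insured bonds, dollars
guarantee = 0.50                   # Washington guarantees 50 cents on the dollar
worst_case = insured * guarantee   # every bond defaults: $250B
likely = worst_case * 0.50         # only about half need covering: $125B
print(f"worst case ${worst_case/1e9:.0f}B, likely ${likely/1e9:.0f}B")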

So that is roughly a $125 billion subprime problem, set against trillions spent on the wars and on military overspending.

I do support prudent military spending and national security. However, excessive and inefficient spending does not provide more security.

Plenty of security could be had from a $200-250 billion/year military budget instead of $440 billion/year (total military-related expenses are approximately $626.1 billion). Note: it would take time to make the adjustments from the current situation.

Adding up all of the national security spending gets to a total of about $1 trillion. Note: 25% of that number is interest payments on debt, so we would have to get to budget surpluses and then to paying down the debt to reverse that situation.

The existing level of US defenses (nuclear and conventional) is sufficient to deter all of the other possible big-nation enemies (Russia, China), and it should not be that expensive to keep Iran, the Middle East and terrorists under control. An economically stronger America with balanced budgets, very little or no debt, a more modest military and the right technological infrastructure would be stronger than an economically stretched America with a lot of debt, a big military and an infrastructure dependent on foreign oil. The Soviet Union was the classic example of a country with a military that was too big and an economy that was too weak and out of balance.

Better fiscal discipline would provide for more security and a better economy. A better economy is critical to a stronger nation.

Note: Fiscal/budgetary discipline could also be achieved by cutting way back on entitlement programs, but defense and security seem like areas with more excess that is not achieving real goals.

The Atlantic Review also makes the case that the US defense budget is too big and the European one is too small

A Yale professor also makes the case that the US defense budget is too big

Better technological and policy choices are also critical. Building a lot of nuclear and some renewable power so that there is no dependence on foreign oil would change the geopolitical situation and reduce the need to spend money in the Middle East.

I have plenty of articles on my site about the right technological and infrastructure choices.

China is passing the USA in technology development

According to a worldwide technology competitiveness study by the Georgia Institute of Technology, China may soon rival the United States as the principal driver of the world's economy and become the technology development leader.

1993-2007 world technology competitiveness

The study’s indicators predict that China will soon pass the United States in the critical ability to develop basic science and technology, turn those developments into products and services – and then market them to the world.


The 2007 statistics show China with a technological standing of 82.8, compared to 76.1 for the United States, 66.8 for Germany and 66.0 for Japan. Just 11 years ago, China’s score was only 22.5. The United States peaked in 1999 with a score of 95.4.

“China has really changed the world economic landscape in technology,” said Alan Porter, another study co-author and co-director of the Georgia Tech Technology Policy and Assessment Center, which conducted the research. “When you take China’s low-cost manufacturing and focus on technology, then combine them with the increasing emphasis on research and development, the result ultimately won’t leave much room for other countries.”

Recent statistics for the value of technology products exported – a key component of technological standing – put China behind the United States by the amount of "a rounding error": about $100 million. If that trend continues, Newman noted, China will shortly pass the United States in that measure of technological leadership.

China’s emphasis on training scientists and engineers – who conduct the research needed to maintain technological competitiveness – suggests it will continue to grow its ability to innovate. In the United States, the training of scientists and engineers has lagged, and post-9/11 immigration barriers have kept out international scholars who could help fill the gap.

China is becoming a leader in research and development, Porter noted. For instance, China now leads the world in publications on nanotechnology, though U.S. papers still receive more citations.

China has been dramatically improving its input scores, which portends even stronger technological competitiveness in the future.

“It’s like being 40 years old and playing basketball against a competitor who’s only 12 years old – but is already at your height,” Newman said. “You are a little better right now and have more experience, but you’re not going to squeeze much more performance out. The future clearly doesn’t look good for the United States.”


RELATED READING
I have noted that the exchange-rate-based size of the Chinese economy is on course to pass the United States in about 2018, plus or minus 3 years.


Industrial scalable process for bulk alignment of carbon nanotubes

Researchers from Seoul National University and Sungkyunkwan University in South Korea have developed a technique for aligning nanotubes over large areas, based on the flow of a nanotube-containing solution through nanochannels. The inability to align carbon nanotubes in a cheap and simple way has been a roadblock to making the superior commercial electronic devices that should be possible with carbon nanotubes.

This technique is especially attractive because of its simplicity; no external stimuli such as the application of an electric field or syringe pumping are required to align the nanotubes.


This novel approach for aligning carbon nanotubes is based on the simple flow of a nanotube solution through a nanochannel fabricated from a charged polymeric mold. The nanotubes are ordered within the channels by the influence of the capillary force existing within the confines of the channel. When the channels are of the correct geometry, aqueous solutions containing nanotubes enter from both ends, and upon evaporation leave behind dense and highly oriented arrays of nanotubes. Suh cautions that the mechanical properties and surface chemistry of the polymeric mold used for making the nanochannels are of paramount importance. “The stiffness of the polymer has to be just right”, says Suh, “it has to be rigid enough to keep the nanochannels from collapsing but flexible enough to bond well with the substrate over a large area”. Good adhesion is required between the nanochannel and the substrate to prevent the polymer nanochannels from coming unstuck upon the introduction of the aqueous nanotube solution. The researchers have found that polyethylene glycol diacrylate has the right combination of properties for use as the polymer mold. It is negatively charged and facilitates conformal contact with the substrate. Moreover, it is hydrophilic and thus the nanotube solution is able to enter and flow through the channels without need for additional pumping.



Carnival of Space week 38

Carnival of Space week 38 is up at Sorting Out Science

My contribution was my discussion of Virgin Galactic and how NASA should learn some lessons as it makes new plans

HobbySpace also talked about SpaceShipTwo and White Knight Two

An interesting aspect is that White Knight Two could be the first stage able to launch a one-person module into low Earth orbit.

Virgin Galactic has begun studying a SpaceShipFour (SS4) that could serve as a satellite launcher. By 2014 a version of WK2, with its 13,300 kg (30,000 lb) payload capability, could air-launch the two-stage SS4 vehicle. SpaceShipThree would be an orbital version of SS2.



Selenian Boondocks talks about orbital access methods in multiple parts. The second part, on two-stage-to-orbit reusable launch vehicles, is here

Centauri Dreams talks about extinction odds and moving into space.

January 23, 2008

NanoDynamics will be the first US company to IPO on the Dubai Stock Exchange

NanoDynamics is holding an IPO on the Dubai International Financial Exchange (DIFX).

The intent of this offering is to place 9,100,000 shares at a filing price between US$10 and US$12.50, raising about US$100 million. Pricing is expected to take place the week of January 28, 2008.



Alan Shalleck of NanoClarity has a lot of useful coverage at Nanotech-Now.

In February, 2008, the Dubai exchange will be renamed the NASDAQ-DIFX because NASDAQ has taken a 20% stake in the Dubai exchange while the Dubai exchange has taken a similar stake in the NASDAQ allowing the Dubai exchange to carry the NASDAQ name. By going public on the DIFX, a nanotech company can be listed, after February, on the international portion of the NASDAQ and be traded on the NASDAQ exchange.

Because many of the ND applications for its nanotech water filters and fuel cells are in third world nations and the Far East, going public on the Dubai exchange makes strategic business sense for ND. In addition, a series of green applications for specific versions of ND's developments will be coming to technical and economic fruition during the spring. These opportunities will provide an immediate income boost to ND.

Last month I recommended a 2008 strategy of nanotech company consolidation. Consolidate or die. This month I have exposed the second part of my recommended strategy. The reason to merge is first to create a critical operational mass and second to go public on the NASDAQ-DIFX exchange to finance your next growth stage. ND is leading the way… follow them while international big money is looking for US technology and for US based growing nanotech companies.

Microwave oil recovery of oil shale and oil sands

Schlumberger has bought Raytheon's microwave oil recovery technology. It can retrieve four to five barrels of oil for every barrel of oil consumed in the process (EROEI 4-5). Other methods have reported 1.5 to three barrels for each barrel consumed (EROEI 1.5-3).

Because microwaves can generate heat faster than convection heating, shale can be adequately heated to extract oil within a month or two of beginning production activities, rather than the year or longer for other methods, Raytheon says.
This technology could help unlock 800 billion barrels of recoverable oil from oil shale in the USA.

For tarsands and heavy oil the process could yield 10-15 barrels per barrel consumed (EROEI 10-15).
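
EROEI here is barrels recovered per barrel of energy consumed, so the net yield per barrel spent is simply EROEI minus one. A quick Python comparison using midpoints of the quoted ranges (my framing, not Raytheon's):

for method, eroei in [("microwave, oil shale", 4.5),
                      ("other shale methods", 2.25),
                      ("microwave, tarsands/heavy oil", 12.5)]:
    print(f"{method}: EROEI {eroei} -> {eroei - 1:.2f} net barrels per barrel consumed")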

Hat tip to Al Fin



Virgin Galactic revealed and compared to shifting NASA plans

The new designs for Virgin Galactic's SpaceShipTwo rocket plane and WhiteKnightTwo mothership were unveiled in New York.

The biggest twist is that the WhiteKnightTwo plane has spread out and sprouted another passenger cabin on its 140-foot-long wing. The two cabins and four Pratt & Whitney jet engines straddle a central mount for the rocket plane, which will be carried to an altitude of 50,000 feet and dropped. Then SpaceShipTwo will light up its hybrid rocket engine for the final push to the edge of outer space.


UPDATE:
Scaled Composites plans to build 40-50 launch aircraft. At least 15 will be used for space tourism, with the rest used for satellites and other payloads.

Virgin Galactic says it thinks it could launch small satellites in the range of 50-100kg into low-Earth orbit using an unmanned rocket hung from White Knight Two for less than $2.5m.

They have spent $70m already and will spend another $130 million. Virgin Galactic expects to break even in 2014. Reducing the price of a trip into space to attract more customers is also part of the plan, as is exploiting every possible form of additional income, such as selling media rights.


Live coverage from Wired of the event

A related piece of news is that on Feb 12-13, "leaders of the space community" will meet to look at forming an alternative to the lunar-centered Vision for Space Exploration, built around manned asteroid landings. If the alternative-vision planners have their way, the mission could instead be flown to an asteroid in about 2025, as opposed to the original and uncertain 2020 target for a lunar landing. The contrast between NASA's plans and Virgin Galactic is discussed below.


Artistic representation of Scaled Composites' SpaceShipTwo in space: an eight-seat craft, carrying 2 pilots and 6 passengers, which will be used by Virgin Galactic passengers.



White Knight 2

The new White Knight 2 mothership, says Whitehorn, will have four jet engines and is a significant departure from the first White Knight.

The mothership is 70 percent complete and will be the largest all-carbon-composite airplane in the world. Test flights are expected to begin this summer.


More on the shifting NASA plans:

The asteroid visit and Lagrangian mission concepts would use much of the same CEV, Ares I and Ares V heavy-lift booster infrastructure, but in ways that would be much faster stepping-stones to Mars than developing a manned lunar base. Asteroid and Lagrangian point missions would each last several weeks or months. Both the libration points and asteroids would be about 1 million mi. from Earth, requiring operations more like those of the much longer trips to Mars, at least 40-100 million mi. away.

Robotic options for all mission elements also will be reviewed, and one working group will be devoted to better defining manned versus robotic trade­offs.



White Knight 2 model

Contrast of NASA plans with Virgin Galactic.

NASA's plans, if they are successful, do not achieve the main program milestones until 2020 or 2025. Virgin Galactic could start flying passengers in late 2009.

Virgin Galactic could expand the number of people (passengers) who get to fly by 10 to 1000 times versus the NASA plans. Seats on SpaceShipTwo cost $200,000. Virgin Galactic says more than 200 individuals have booked, and another 85,000 have registered an interest to fly. Tens of millions of dollars in deposits have already been taken. If seat prices drop to $100,000 each then 85,000 people would generate $8.5 billion in revenue. This could make SpaceShipThree (an orbital system) fully fundable from Virgin Galactic operational profits.
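
The revenue arithmetic, as a quick Python check (registered interest is not a firm booking, so treat these as upper bounds):

registered = 85_000
for seat_price in (200_000, 100_000):
    revenue = registered * seat_price
    print(f"${seat_price:,} per seat -> ${revenue / 1e9:.1f} billion potential revenue")
# -> $17.0 billion at current prices, $8.5 billion at the lower price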

Virgin Galactic appears to be offering a path to safer (100 times or more safer) and cheaper travel into space for a lot more people. NASA's plans are for multi-billion-dollar new hardware (which does not have interesting new capabilities beyond what is currently available), on-the-ground jobs for the current bureaucracy, and short-term missions for a few elite astronauts.

Virgin Galactic has systems that leverage and build upon the best of what has gone before. NASA plans restart development every few years and costs are constantly escalating.

NASA plans are centered around justifying building the CEV, Ares I and Ares V heavy-lift booster infrastructure. Before that it was around justifying building the International Space Station. Before that it was around justifying the Hubble Space Telescope and the Space Shuttle. The hardware should be about supporting the missions and finding ways to increase safety, lower costs and expand capability.

NASA should look at shorter development cycles. What can NASA deliver in 4 years that can reduce costs and add new capability?

Low-orbit fuel depot stations would fit that short development timeline and provide useful value for later missions by both NASA and private industry.

Missions for better propulsion (better ion drives, VASIMR, laser array launches, etc.). Fuel-efficient electric engines for LEO to the moon, Lagrange points, asteroids, etc.

Development of space mining and resource utilization. Generating oxygen from regolith, etc.

January 22, 2008

More autonomous robots through new instantaneous 3D freeze-frame LADAR

iRobot (NASDAQ: IRBT), maker of millions of Roomba vacuums and of military PackBots, may be on the verge of creating a new generation of robots with a much higher degree of autonomy than is currently possible. This is another example of how important the sensing systems and other non-artificial-intelligence parts of a robot are to improving functionality. By giving a system better "artificial eyes," a far more capable and useful system is created without any advance in processing power or coding.

The robotics maker is announcing a deal with Santa Barbara, CA-based Advanced Scientific Concepts under which it has obtained the exclusive rights (in exchange for future purchasing commitments) to ASC’s 3-D Flash LADAR (for Laser Detection and Ranging) technology.

The camera and laser setup for the 3D flash LADAR




Schematic design of part of the 3D flash LADAR system by Advanced Scientific Concepts

Laser range detectors have been employed in the robotics community for years, iRobot co-founder Helen Greiner points out. "In fact, we've done it for many, many years," she says. The problem is, the systems are not especially rugged or durable, they're susceptible to glare from the sun, and they have trouble cutting through dust or fog. In short, they don't do as well as they should in the extreme conditions in which they must typically be deployed.

What ASC brings to the table is a new approach to the problem. As Greiner puts it, “These guys have invented a new type of LADAR system.” The Flash-based, solid-state system has no moving parts, one factor in improving its ruggedness. Instead of scanning the terrain one line at a time like traditional LADAR, ASC’s system illuminates an entire scene at once with diffuse laser light, providing a full, instantaneous 3D picture of the territory around it. The flash technology, Greiner says, can “visually freeze the entire geometry.”

The new LADARs could go not only on PackBots but on larger vehicles like Humvees or even tanks. Outside the military, Greiner envisions it enabling robot-driven tractors or other large industrial robots.


(5 pages) Three Dimensional Flash LADAR Focal Planes and Time Dependent Imaging

Accurate three-dimensional data can be acquired a frame at a time, with frame rates of at least 30 Hz, using a flash ladar 3-D camera invented and fabricated by ASC. Each frame of data is acquired instantaneously with respect to the mechanical motion of the objects within the 3-D flash ladar camera field of view. Although only 50-100 m time-dependent data was presented, the static image at 1 km suggests that this data could also be acquired at many kilometers. The current camera has an FPA of 128 x 128 pixels, but there is no technological limitation restricting the array size. Furthermore, although in general larger arrays take longer to read out, 30 Hz is certainly not the upper-bound frame rate and the camera can quantitatively describe high-speed or rapidly contorting objects.

Applications of the 3-D flash camera abound and are seemingly limited only by imagination. However, the camera appears to offer an immediate breakthrough in collision avoidance or navigation of unmanned or manned vehicles. Vehicle motion can distort the 3-D image of conventionally scanned ladar systems, which collect a full data frame over time rather than instantaneously. By making a precision scanner unnecessary, 3-D flash ladar systems offer low weight, small size, high reliability, optical zooming and eventually low price.


This paper reviews the progress of Advanced Scientific Concepts, Inc. (ASC) flash ladar 3-D imaging systems. The system has the ability to look past obstructions and obscuring objects (like smoke, or things in the way of or hiding a target).

FURTHER READING
Wikipedia on LIDAR

LIDAR (Light Detection and Ranging) is an optical remote sensing technology that measures properties of scattered light to find range and/or other information about a distant target. Other terms for LIDAR include ALSM (Airborne Laser Swath Mapping) and laser altimetry. The acronym LADAR (Laser Detection and Ranging) is often used in military contexts. The term laser radar is also in use but is misleading because it uses laser light and not the radio waves that are the basis of conventional radar.


Real time comes to LADAR

In MEMS ladar, which is a “single-point” or scanning ladar, a collimated laser pulse hits a MEMS mirror tilted toward a point in the field of view (FOV) that represents an image pixel. After the pulse is reflected from the target, the portion reflected back to the ladar’s photodetector (PD) is then analyzed for time-of-flight and intensity.


To yield the same energy at the target area, the pulse from a flash ladar's laser would have to be at least 65,000 times more powerful than that from a MEMS ladar's laser (assuming all other characteristics are the same) to produce a 256 x 256 pixel image (top). Furthermore, because the pulse reflected back from the target is being distributed over the entire FPA, the flash ladar's laser would have to be an additional 65,000 times more powerful to yield the same pulse energy per pixel as the MEMS ladar (bottom).
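
The 65,000x figure is just the pixel count of a 256 x 256 focal plane array, and per the caption the return-path spreading compounds with the outbound spreading. A quick Python check (my arithmetic from the figures quoted):

rows = cols = 256
pixels = rows * cols                         # 65,536, the "65,000 times" above
print(f"outbound spreading penalty: ~{pixels:,}x")
print(f"with return-path spreading: ~{pixels**2:.1e}x total")   # ~4.3e9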


Google TechTalks on the 3-D cameras of ASC

Nanothin polymer films hide drug delivery from immune system for months

Nanoscale polymer films, about four nanometers per layer, were used to build a sort of matrix or platform to hold and slowly release an anti-inflammatory drug. The films are orders of magnitude thinner than conventional drug delivery coatings, said Genhong Cheng, a researcher at UCLA's Jonsson Comprehensive Cancer Center and one of the study's authors. A nanometer is one billionth of a meter.

“Using this system, drugs could be released slowly and under control for weeks or longer,” said Cheng, a professor of microbiology, immunology and molecular genetics. “A drug that is given orally or through the bloodstream travels throughout the system and dissipates from the body much more quickly. Using a more localized and controlled approach could limit side effects, particularly with chemotherapy drugs.”



Researchers coated tiny chips with layers of the nanoscale polymer films, which are inert and helped provide a Harry Potter-like invisibility cloak for the chips, hiding them from the body’s natural defenses. They then added Dexamethasone, an anti-inflammatory drug, between the layers. The chips were implanted in mice, and researchers found that the Dexamethasone-coated films suppressed the expression of cytokines, proteins released by the cells of the immune system to initiate a response to a foreign invader. Mice without implants and those with uncoated implants were studied to compare immune response.

The uncoated implants generated an inflammatory response from the surrounding tissue, which ultimately would have led to the body’s rejection of the implant and the breakdown of its functionality. However, tissue from the mice without implants and the mice with the nano-cloaked implants were virtually identical, proving that the film-coated implants were effectively shielded from the body’s defense system, said Edward Chow, a former UCLA graduate student who participated in the study and is one of its authors.

The nanomaterial technology serves as a non-invasive and biocompatible platform for the delivery of a broad range of therapeutics, said Dean Ho, an assistant professor of biomedical and mechanical engineering with the McCormick School of Engineering and Applied Science, a member of the Robert H. Lurie Comprehensive Cancer Center of Northwestern University and the study’s senior author.

The technology also may prove to be an effective approach for delivering multiple drugs, controlling the sequence of multi-drug delivery strategies and enhancing the life spans of commonly implanted devices such as cardiac stents, pacemakers and continuous glucose monitors.

Natural gas discovery and waste

Many places online have been discussing the 168 trillion to 516 trillion cubic feet of natural gas discovered within the north Appalachian Plateau of the USA.

Al Fin: Peak Oil: Meet Marcellus Black Shale

Roland Piquepaille's Technology Trends: Giant gas field found in Appalachia

The yearly consumption of natural gas worldwide is slightly above 100 trillion cubic feet. The U.S. currently produces roughly 30 trillion cubic feet of gas a year. Horizontal drilling techniques could help to recover about 50 trillion cubic feet of gas from the Marcellus (a conservative estimate that 10% of the resource can be accessed).


There are over 150,000 NGVs on U.S. roads today and over 5 million worldwide. 3% of the natural gas is used for transportation and 97% is used for heating and cooking. If nuclear power were used instead to generate electricity for the heating and cooking, then 32 times more vehicles could be natural-gas powered. Natural gas vehicles produce less pollution than gasoline-powered vehicles.
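
The "32 times more" claim follows directly from those usage shares. A one-line Python check (my arithmetic):

transport_share, heating_share = 0.03, 0.97
print(f"{heating_share / transport_share:.0f} times more gas available for vehicles")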

As of January 1, 2007, proved world natural gas reserves, as reported by Oil & Gas Journal, were estimated at 6,183 trillion cubic feet. Worldwide undiscovered natural gas is estimated at 4,136 trillion cubic feet.
For 2008, proved reserves are estimated at 6,185 trillion cubic feet.



The World Bank’s GGFR estimates that 150 billion cubic meters (or 5.3 trillion cubic feet) of natural gas is being flared and vented annually.

That is equivalent to 25 per cent of the United States’ gas consumption or 30 per cent of the European Union’s gas consumption per year. It is also estimated that global gas flaring releases about 390 million tons of CO2 per year into the atmosphere.
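
A quick unit check on those flaring figures in Python (the bcm-to-tcf conversion is standard; the implied US consumption of roughly 21 tcf/year is consistent with the production figure quoted earlier):

flared_bcm = 150                  # billion cubic meters flared/vented per year
tcf_per_bcm = 0.03531             # trillion cubic feet per billion cubic meters
flared_tcf = flared_bcm * tcf_per_bcm
print(f"{flared_tcf:.2f} tcf per year")                      # ~5.3 tcf
print(f"implied US consumption: ~{flared_tcf / 0.25:.0f} tcf/year")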

According to the World Bank, the gas flared in Africa could generate half of the continent's power consumption. Nigeria is probably the world's largest flarer of natural gas. Nigerian officials want a venture to tackle gas flaring, but western oil companies say they cannot meet a deadline to end flaring by 2009.


RELATED READING
Using corncob waste as a starting material, researchers have created carbon briquettes with complex nanopores capable of storing natural gas at an unprecedented density of 180 times their own volume and at one seventh the pressure of conventional natural gas tanks.

Researchers report development of a sponge-like material with the highest methane storage capacity ever measured. It can hold almost one-third more methane than the U.S. Department of Energy's (DOE) target level for methane-powered cars.

DOE info on natural gas vehicles and fuel

Unconventional Natural Gas Reservoir In Pennsylvania Poised To Dramatically Increase US Production

GE breakthrough for cheap solar cells

GE Global Research, the centralized research organization of the General Electric Company (NYSE: GE), announced that scientists on their Nano Photovoltaics (PV) team have demonstrated a scalable silicon nanowire-based solar cell, which has the potential to achieve up to 18% efficiency and be produced at a dramatically lower cost than conventional solar cells.




“GE’s demonstration of the silicon nanowire-based cell represents a significant breakthrough in our efforts to enable higher efficiency cells that can be produced at much lower production costs,” said Dr. Loucas Tsakalakos, Project Leader of GE’s Nano PV team. “Today, higher efficiency often comes with a higher price tag. Through the unique processing and materials property benefits enabled by nanotechnology, we’re aiming to break that paradigm and pave the way to making solar power more affordable for consumers while maintaining and even improving cell performance.”


Silicon nanowire solar cells by
L. Tsakalakos, J. Balch, J. Fronheiser, and B. A. Korevaar
General Electric-Global Research Center, Niskayuna, New York 12309, USA

O. Sulima and J. Rand
GE Energy-Solar Technologies, Newark, Delaware 19702, USA

Silicon nanowire-based solar cells on metal foil are described. The key benefits of such devices are discussed, followed by optical reflectance, current-voltage, and external quantum efficiency data for a cell design employing a thin amorphous silicon layer deposited on the nanowire array to form the p-n junction. A promising current density of ~1.6 mA/cm2 for 1.8 cm2 cells was obtained, and a broad external quantum efficiency was measured with a maximum value of ~12% at 690 nm. The optical reflectance of the silicon nanowire solar cells is reduced by one to two orders of magnitude compared to planar cells.


Loucas Tsakalakos, Project Leader of GE’s Nano PV team, has a blog entry on this

The cells were fabricated on a metal foil substrate, thus showing potential for future roll-to-roll manufacturing of such devices. We used standard, scaleable processes to grow the nanowires and to fabricate p-n junctions conformally around the nanowires. The use of conformal p-n junctions allows for de-coupling light absorption from charge transport. In a standard solar cell the active material must be thick enough to absorb all the sunlight (for silicon this is > 125 micrometers), however, as charge carriers diffuse back to the p-n junction many are lost due to non-radiative recombination. In these nanowire-based devices the minority carriers must only diffuse a few hundred nanometers to reach the charge-separating junction. The nanowire cells also showed the expected improvements in their optical properties. While the power conversion efficiency in these devices is still low, and much work remains to improve the performance, this nanoscale solar cell architecture and processing approach has promise to create a new paradigm in solar cell manufacturing and device design in the future.


Hat tip to Al fin

0.5 Angstrom transmission electron microscope

TEAM 0.5 (Transmission Electron Aberration-corrected Microscope), the world's most powerful transmission electron microscope — capable of producing images with half-angstrom resolution (half a ten-billionth of a meter), less than the diameter of a single hydrogen atom — has been installed at the Department of Energy's National Center for Electron Microscopy (NCEM) at Lawrence Berkeley National Laboratory.

This continues the testing and operational installation that began in September 2007, when the instrument was tested before being shipped to the lab.

Correcting spherical aberration makes it possible to use the TEAM 0.5 not only for broad-beam, "wide-angle" images but also for scanning transmission electron microscopy (STEM), in which the tightly focused electron beam is moved across the sample as a probe and can perform spectroscopy on one atom at a time. This is an ideal way to precisely locate impurities in an otherwise homogeneous sample, such as individual dopant atoms in a semiconductor material.

Aberration correction is also essential for another advanced feature of TEAM 0.5: its ability to maintain high resolution with lower electron beam energies.




TEAM 0.5, the world's best transmission electron microscope, is being assembled at the National Center for Electron Microscopy. (Photo Roy Kaltschmidt, Berkeley Lab CSO)

Installation of the new stage must await the next phase of the TEAM Project: the TEAM I microscope, due to be set up at NCEM early in 2009.

While TEAM 0.5 corrects spherical aberration in both the "probe" beam (the electron beam before it strikes the sample) and the image beam (after it exits the sample, but before it reaches the detector), TEAM I will also correct chromatic aberration in the image beam, which has never been accomplished before. Spherical aberration is caused by the shape of a lens; chromatic aberration results when a lens refracts light or electrons of different wavelengths (different colors or energies) at different angles.
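A short calculation shows why aberration, not wavelength, sets the resolution limit. These are standard electron-optics formulas, not figures from the TEAM project:

import math

# Relativistic electron wavelength at common TEM accelerating
# voltages (Python).
H = 6.626e-34    # Planck constant, J*s
M0 = 9.109e-31   # electron rest mass, kg
E = 1.602e-19    # elementary charge, C
C = 2.998e8      # speed of light, m/s

def electron_wavelength_pm(kilovolts):
    V = kilovolts * 1e3
    p = math.sqrt(2 * M0 * E * V * (1 + E * V / (2 * M0 * C**2)))
    return H / p * 1e12  # picometers

for kv in (80, 300):
    print(f"{kv} kV: {electron_wavelength_pm(kv):.2f} pm")  # ~4.18, ~1.97 pm

At 300 kV the wavelength is about 2 picometers, some 25 times smaller than the 50 picometer (half-angstrom) resolution, so lens aberrations rather than diffraction are the bottleneck that the correctors attack.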




FURTHER READING
The TEAM project

Part of a 20 year roadmap for improving science facilities in the USA

January 21, 2008

EEStor ultracapacitor system expected for mid-2008

Lockheed has signed an exclusive international license to use EEStor's power system for military and homeland-security applications--everything from advanced remote sensors and missile systems to mobile power packs and electric vehicles.

Zenn Motors is now expecting delivery of the energy-storage unit in mid-2008. It will be a mass-produced commercial product.

Future tallest building contenders - twice as tall as the Sears Tower [pics]

As of Dec 27, 2007 the Burj Dubai is 598.5 meters tall and 158 stories. It will end up over 800 meters tall including its antenna. These supertall skyscrapers are taking 4-5 years to complete.

Burj Dubai's last two milestones will be to surpass the 628.8 m (2,063 ft) height of the KVLY-TV Mast in North Dakota, United States to become the world's tallest standing structure, and then to pass the Warsaw radio mast in Gąbin, Poland (646.4 m (2,121 ft) until it collapsed in 1991) to become the tallest structure of any type ever built.






Skyscraper profiles

Another skyscraper that seems likely to be built is the Al Burj. The secrecy behind Al Burj's height originally suggested that it would also compete for this title, and rumours even suggest it will break the 1,000 m (3,281 ft) mark. There is talk that it will be 1,200 meters (3,900 ft) tall, with 5,295,800 to 6 million sq ft of mixed-use development.


Here is the profile if it were 1,200 meters tall

Burj Mubarak al-Kabir is another proposed 1,000+ meter tall building. It would be part of a massive $85 billion development in Kuwait.


Burj Mubarak al-Kabir

The Murjan City Tallest Tower is a supertall skyscraper planned to be built in Bahrain, currently intended to reach 1,022 m. The 9.3 million square metre edifice aims to be "the most luxurious ever built" and is also intended to contain what the developer considers the world's largest shopping mall.

The Buenos Aires Forum is also a proposed 1,000 meter tall building, but it is not expected to break ground until 2010 or to be completed until 2016.

Less likely proposals
The Bionic Tower is a proposed 1,128 to 1,228 meter tall building.
Bionic tower

Sky City 1000 is a building 1,000 meters (3,280.8 feet) tall and 400 meters (1,312 feet) wide at the base, with a total floor area of 8 km² (3.1 miles² or 1,976.8 acres). Sky City 1000 is a possible future urban project aimed at helping relieve the major congestion and lack of green space in Tokyo's ward area.



The Shimizu TRY 2004 Mega-City Pyramid is a proposed project for construction of a massive pyramid over Tokyo Bay in Japan. The structure would be 12 times higher than the Great Pyramid at Giza and would house 750,000 people. If built, it would be the largest man-made structure on Earth. The structure would be 2,004 meters (6,575 feet) high and would answer Tokyo's increasing lack of space.


Shimizu TRY 2004 Mega-City Pyramid

The Aeropolis 2001 is a proposed project for construction of a massive 500-story (2001 meter tall) high-rise building over Tokyo Bay in Japan.

The X-Seed 4000 is the tallest building ever fully envisioned, meaning that the designs for construction have been completed. Its proposed 4 kilometer (13,123 foot) height, 6 kilometer wide sea base, and 800 floors could accommodate five hundred thousand to one million inhabitants. The purpose of the plan was to earn some recognition for the firm, and it worked. It was never intended to be built.

X-seed 4000

FURTHER READING
Nanomaterial enhanced steel and concrete will enable safe super skyrises

Accelerating future on Xseed 4000



Cigarette size plasma jet powered UAVs

There is interesting progress being made toward a nano air vehicle that weighs less than 10 grams and is shorter than 3 inches. A promising design uses electricity and plasma jets.

Plasma micro thruster powered UAV the size of a cigarette

“It’s a new propulsion technology to be used by micro and nano-unmanned aerial vehicles, or UAV,” Jacob said. “By micro, we mean smaller than a foot, and by nano, we mean smaller than six inches.”

This is part of the DARPA Nano Air Vehicle program

The Nano Air Vehicle (NAV) Program will develop and demonstrate an extremely small (less than 7.5 cm), ultra-lightweight (less than 10 grams) air vehicle system with the potential to perform indoor and outdoor military missions.

Technical Area Figures of Merit - Phase I targets

1. Aerodynamic Performance and Airfoil/Wing/Rotor Design and Manufacture
• Develop computational aerodynamic modeling tools to design a high performance airfoil at a low Reynolds number.
• Demonstrate reliable wing manufacturing principles and achieve a wing loading of > 0.1 kg/m2.
• Demonstrate airfoil section steady lift-to-drag capability over 8 at low Reynolds number (Re < 15,000).

2. Propulsion and Power
• Demonstrate a system electrical-to-mechanical power conversion efficiency of at least 20 percent.
• Demonstrate the ability to meet power requirements for a notional mission of 1 kilometer with a total hover time of over one minute.
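The Re < 15,000 figure is easy to sanity-check with the textbook formula Re = ρvL/μ. The chord and airspeed below are my assumed numbers for a vehicle at this scale, not DARPA's:

# Reynolds number sanity check for a nano air vehicle (Python).
RHO = 1.225     # air density at sea level, kg/m^3
MU = 1.81e-5    # dynamic viscosity of air, Pa*s

def reynolds(speed_mps, chord_m):
    return RHO * speed_mps * chord_m / MU

# Assume a ~2.5 cm wing chord (vehicle < 7.5 cm) at 5 m/s cruise.
print(f"Re ~ {reynolds(5.0, 0.025):,.0f}")  # ~8,460, inside Re < 15,000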



19 page pdf: Santhanakrishnan, A. and Jacob, J.D., "On Plasma Synthetic Jet Actuators."
Plasma actuators, also known as dielectric barrier discharge actuators (or OAUGDP™, one atmosphere uniform glow discharge plasma), typically refer to an asymmetric arrangement of two electrodes separated by a dielectric material. Back in January 2006 they were creating 10,000 pulses of 1 m/s jets.

FURTHER READING
Hydrodynamics & Aerodynamics Laboratory at Oklahoma State University is working on novel applications of fluid mechanics, particularly to aerospace, including flow control, UAV design, and bio-fluid mechanics.

Singularity perspectives using hindsight and optimal algorithms: AGI raised by wolves

In a thought experiment, we can place a hypothetical superior intelligence 20, 30, or 40 years in the past and use our hindsight knowledge of superior algorithms (developed between then and now) and of new technologies to approximate the improvements an AGI could make. The size of that advantage can then be used to approximate the advantage of a current or future AGI.

The technological Singularity is described as the creation of [significantly] smarter-than-human intelligence.

Combine faster intelligence, smarter intelligence, and recursively self-improving intelligence, and the result is an event so huge that there are no metaphors left. There's nothing remaining to compare it to. The Singularity is beyond huge, but it can begin with something small. If one smarter-than-human intelligence exists, that mind will find it easier to create still smarter minds.


I feel that the recursively self-improving aspect is important because it requires time and physical revisions to properly implement the full scope of improvements. Setting aside some of the substantial technical issues with achieving significantly smarter-than-human intelligence, I want to look at what I feel are the limitations on what can be achieved until an AGI commandeers resources and directs several iterations of improvements.

I feel some useful perspectives can be gained with some thought experiments. If we assume that something with vastly superior intelligence and insight were placed at different points in technological history, we could assume that all of the insights we have developed since that time would be available to it. The effect of those insights would give us an idea of the advantages of vastly superior insight and the limitations of inferior resources. The superior intelligence initially only has whatever crap we have made with regular human intelligence. If something with all of our knowledge went back in time, it would only have the technology of that time period to work with. The Artificial General Intelligence raised by wolves has to overcome its upbringing.


There are various estimates of the total computing power and memory that is in existence in the world.

There were more than 5 exabytes (10^18 bytes) of information stored in the world in 2003, but most of this was kept offline on paper, film, CDs, and DVDs. Since then online storage has mushroomed. Today the Machine's memory totals some 246 exabytes of information (246 billion gigabytes).
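Taking the two figures at face value implies an extraordinary growth rate (the ~4-year span from 2003 is my assumption, and the two estimates measure somewhat different things):

# Implied compound growth of stored information (Python).
start_eb, end_eb, years = 5.0, 246.0, 4.0
cagr = (end_eb / start_eb) ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.0%} per year")  # roughly 165% per year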


If we were to deposit a superior intelligence back 20, 30 or 40 years, there would be far less computing power and vastly slower communications, and parallel processing and distributed computing were almost non-existent.

Many (but not all) algorithms and compression schemes were vastly inferior to what is possible with superior insight.

Superior compression allows effective communication speed to be increased several times. Better algorithms provide varying amounts of improvement (from very little to thousands of times or more). Improved parallel and distributed computing allows more resources to be used together.
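A trivial example of the algorithm point: the same problem on the same hardware can run thousands of times faster with a better method. This is a generic computer science illustration, not something from the post:

from functools import lru_cache

# Naive recursion: ~30 million function calls for n = 35.
def fib_naive(n):
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

# Memoized version: only 36 distinct evaluations for n = 35.
@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(35))  # 9227465, effectively instant even on old hardware

Nothing about the hardware changed; the speedup is pure insight, which is exactly the kind of advantage a back-in-time intelligence would carry.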

A superior intelligence is limited with inferior inputs (sensors) and faulty starting data (our current understanding has errors and misinterpretations).

An AGI that develops at a time when it has access to better technology, such as molecular fabrication systems, would be able to apply its superior insights more readily and develop a larger advantage.

RELATED READING
Artificial intuition