
UKLight/StarLight/TeraGrid



UKLight



Particle physics, radio astronomy, computing visualisation experiments, remote viewing of high-resolution mammography images: these are just some of the project areas researchers are working on that suck up network bandwidth.



Researchers increasingly require data communications between institutions in the gigabit-per-second range and above. Examples include scientific data transport, computational science, visualisation, environmental modelling, and medical applications requiring high-volume remote imaging. Such applications require connections between a set of end-points that must be sustained for hours or days, and which may have stringent constraints on the level of service required from the network. These requirements are difficult and costly to meet with traditional shared IP-based networking, and may be best addressed through the use of channel-switched networks.
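
To put these rates in perspective, here is a rough back-of-the-envelope sketch in Python. The dataset sizes are illustrative assumptions rather than figures from any particular project; the point is how transfer time collapses once a dedicated gigabit-class channel replaces a shared IP path.

    # Rough transfer-time arithmetic for bulk science data.
    # Dataset sizes below are illustrative assumptions, not project figures.

    datasets_tb = {
        "particle physics run": 10,      # hypothetical 10 TB experiment output
        "mammography image archive": 1,  # hypothetical 1 TB image set
    }
    links_gbps = {
        "shared 100 Mb/s IP path": 0.1,
        "1 Gb/s channel": 1.0,
        "10 Gb/s channel": 10.0,
    }

    for data, size_tb in datasets_tb.items():
        bits = size_tb * 1e12 * 8  # terabytes -> bits
        for link, gbps in links_gbps.items():
            hours = bits / (gbps * 1e9) / 3600.0
            print(f"{data} ({size_tb} TB) over {link}: {hours:,.1f} hours")

A 10 TB run that ties up a shared 100 Mb/s path for over nine days moves in a couple of hours on a dedicated 10 Gb/s channel, which is the case for channel switching in a nutshell.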




StarLight



StarLight is a 1GigE and 10GigE switch/router facility for high-performance access to participating networks, and a true optical switching facility for wavelengths. Since summer 2001, StarLight's management and engineering teams have been working with the international academic and commercial communities to create a proving ground in support of grid-intensive e-Science applications, network performance measurement and analysis, and computing and networking technology evaluations.



StarLight users include a global scientific community conducting advanced networking, database, visualization and computing research over IP-over-lambda networks. StarLight also supports experimental protocol and middleware research into the provisioning of lightpaths over optical networks by high-performance applications.
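
To make "application provisioning of lightpaths" concrete, here is a minimal sketch of the kind of reservation an application might hand to an optical control plane. Every name, field, and function here is hypothetical; real provisioning systems (GMPLS-based and otherwise) expose quite different interfaces.

    from dataclasses import dataclass

    @dataclass
    class LightpathRequest:
        # All fields are illustrative assumptions, not a real control-plane API.
        src: str              # source end-point
        dst: str              # destination end-point
        wavelength_nm: float  # requested lambda
        rate_gbps: float      # sustained rate the application needs
        hold_hours: float     # how long the circuit must stay up

    def submit(req: LightpathRequest) -> None:
        # Stand-in for handing the request to an optical control plane.
        print(f"Requesting {req.rate_gbps} Gb/s lambda at {req.wavelength_nm} nm, "
              f"{req.src} -> {req.dst}, held for {req.hold_hours} h")

    submit(LightpathRequest("StarLight/Chicago", "NetherLight/Amsterdam",
                            1550.12, 10.0, 48.0))

The essential difference from ordinary IP networking is visible in the fields themselves: the application asks for a whole wavelength at a guaranteed rate for a fixed duration, rather than sharing best-effort packet capacity.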


TeraGrid



TeraGrid is a multi-year effort to build and deploy the world's largest and fastest distributed infrastructure for open scientific research. When completed, the TeraGrid will include 20 teraflops of computing power distributed across nine sites, facilities capable of managing and storing nearly 1 petabyte of data, high-resolution visualization environments, and toolkits for grid computing. These components will be tightly integrated and connected through a network operating at 40 gigabits per second, making it the fastest research network on the planet.
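
Those headline numbers are easier to feel with a little arithmetic: even at the quoted 40 gigabits per second, moving a full petabyte across the backbone is a multi-day job. A minimal Python check, ignoring all protocol overhead:

    # Time to move ~1 PB across a 40 Gb/s backbone, ignoring protocol overhead.
    petabyte_bits = 1e15 * 8   # 1 petabyte expressed in bits
    backbone_bps = 40e9        # 40 gigabits per second
    seconds = petabyte_bits / backbone_bps
    print(f"{seconds / 3600:.0f} hours (~{seconds / 86400:.1f} days)")

That works out to roughly 56 hours, which is why the storage facilities and the network are planned together rather than as separate systems.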



NCSA is home to the bulk of the computers that power the TeraGrid. When fully installed, NCSA's TeraGrid system will consist of 10 teraflops of computing capacity in IBM Linux clusters powered by Intel® Itanium™ 2 processors. The clusters include 2 teraflops of computing power already on the floor when the TeraGrid project began and another 8 teraflops paid for by the NSF TeraGrid awards. The system at NCSA also includes 240 terabytes of secondary storage. The center recently announced that the first TeraGrid computing systems are in production mode, making 4.5 teraflops of distributed computing power available to scientists across the country. NCSA is the leading-edge site for the National Computational Science Alliance.

Supercomputing over optically linked grids of thousands of computers is currently aimed at scientific and research applications such as weather forecasting and particle physics experiments. Before too long, this massive parallel computing power will be repurposed to provide resources for consumer augmented and virtual reality applications.
