NI named a finalist in supercomputing competition

02 December 2008

National Instruments Corporation (UK) Ltd


National Instruments was recently announced as a finalist in the 2008 Supercomputing Conference Analytics Challenge for accomplishments in high-performance computing (HPC) with the NI LabVIEW graphical system design platform. The award recognises the most innovative solutions to the most complex problems in supercomputing applications. For the competition, the National Instruments LabVIEW research and development team submitted a technical paper establishing multicore programming benchmarks for developing real-time control for the forthcoming European Extremely Large Telescope (E-ELT), a project that presents unprecedented computational challenges.

Dr James Truchard, CEO, Cofounder and President of National Instruments, says: "We are excited to be a finalist in this challenge because it recognises the parallel programming potential National Instruments has been developing since introducing LabVIEW more than 20 years ago. In addition to acknowledging the impressive high-performance computing capabilities of LabVIEW and our work on the European Extremely Large Telescope, this honour positions National Instruments as a leader in real-time control applications. This achievement also complements the major solutions National Instruments has facilitated for the Max Planck Institute for Plasma Physics in the field of nuclear fusion and for CERN in particle acceleration, which represent two of the biggest technical challenges of our time."

The Analytics Challenge was held in conjunction with SC08, the international conference on high-performance computing, networking, storage and analysis, 15 to 21 November 2008 in Austin, Texas. Each year, the Analytics Challenge provides a forum for researchers and industry representatives to present solutions that embody all facets of high-performance computing, such as comprehensive computational approaches, large-data-set processing and innovative analysis and visualisation techniques.

6000 sensors, 3000 actuators, 1 millisecond

For their Analytics Challenge submission, National Instruments engineers documented their breakthrough work with the European Southern Observatory (ESO) on the E-ELT project, which is currently in the proof-of-concept phase and, when constructed, will be the world's largest telescope. The ESO needed to prove the viability of a commercial off-the-shelf (COTS) option for controlling the two most complex of the telescope's five mirrors. The primary active mirror will be 42m in diameter and will comprise 984 hexagonal segments, all of which must be kept in strict alignment continuously, even in windy conditions. To maintain segment alignment, the control system must read 6000 sensor inputs, send control signals to 3000 actuators, and complete this input-output cycle up to 1000 times per second.
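As a back-of-envelope check, these channel counts are consistent with the 36MB-per-second data rate quoted later in the article if each sample is a 4-byte, single-precision value; the sample width is an assumption made here for illustration, not a figure stated by NI.

```python
# Rough data-rate check for the segment-alignment control loop.
# Assumption (not stated in the article): each sensor reading and each
# actuator command is a 4-byte, single-precision value.
sensors = 6000        # sensor inputs per cycle
actuators = 3000      # actuator outputs per cycle
loop_hz = 1000        # closed-loop rate: up to 1000 cycles per second
bytes_per_sample = 4  # assumed 32-bit samples

bytes_per_cycle = (sensors + actuators) * bytes_per_sample  # 36,000 bytes
mb_per_second = bytes_per_cycle * loop_hz / 1_000_000
print(mb_per_second)  # → 36.0
```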

To solve this problem, NI engineers used the multicore programming capabilities of LabVIEW Real-Time to create a highly deterministic, hardware-in-the-loop (HIL) communication network that moves 36MB of data per second. The benchmarks achieved included distributing the control algorithms across up to eight cores simultaneously and performing a 3000-by-6000 matrix-vector multiplication within 0.5ms. This meets a monumental computational challenge while maintaining the determinism required of real-time applications and beating the 1ms closed-loop target.
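To illustrate the kind of decomposition involved, a matrix-vector multiply can be distributed across cores by splitting the control matrix into contiguous blocks of rows and computing each block in parallel. The sketch below uses Python and NumPy purely to show the idea; it is not NI's LabVIEW implementation, and the worker count and matrix sizes are illustrative.

```python
# Sketch: row-partitioned matrix-vector multiply, the same decomposition
# strategy described above (names and sizes are illustrative, not NI's code).
import numpy as np
from concurrent.futures import ThreadPoolExecutor

N_ACTUATORS, N_SENSORS = 3000, 6000  # rows and columns of the control matrix
N_WORKERS = 8                        # one row block per core, as in the benchmark

rng = np.random.default_rng(0)
A = rng.standard_normal((N_ACTUATORS, N_SENSORS))  # control (gain) matrix
x = rng.standard_normal(N_SENSORS)                 # one frame of sensor readings

def parallel_matvec(A, x, workers=N_WORKERS):
    """Split A into contiguous row blocks and multiply each block in parallel.
    NumPy releases the GIL inside dot products, so threads can use all cores."""
    blocks = np.array_split(np.arange(A.shape[0]), workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda idx: A[idx] @ x, blocks)
    return np.concatenate(list(parts))

y = parallel_matvec(A, x)
assert np.allclose(y, A @ x)  # identical to the serial product
```

Each worker owns a slice of the actuator outputs, so no synchronisation is needed during the multiply; the results are simply concatenated at the end, which is what makes this decomposition attractive for a deterministic real-time loop.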

NI's team also documented its work on the even larger problem of developing control for the telescope's 2.5m adaptive mirror, which will comprise a thin, flexible mirror membrane spread across 8000 actuators. Instead of maintaining alignment, this mirror will deform continuously to compensate for wavefront aberrations caused by atmospheric disturbances. The computational requirements for controlling this mirror are nearly 15 times greater than those of the large primary mirror. NI engineers determined that the problem could be solved only with a state-of-the-art multicore blade system, and they tested their solution on the Dell M1000, a 16-blade system in which each blade features eight cores. Although the work is still in progress, results from the Dell system show that the LabVIEW solution has already distributed the control problem effectively across 128 cores, a groundbreaking achievement in itself.

Distributed computing

Greg Weir, Senior Manager of Worldwide Business Development for Dell Precision Workstations, says: "The leading-edge power of the Dell Precision workstation and PowerEdge servers, together with the real-time and graphical programming capabilities of NI LabVIEW, deliver impressive capabilities to efficiently distribute computing loads across all the nodes in HPC applications. The full memory and graphics potential of our workstations is realised with the key visualisation functions of LabVIEW that HPC applications require."

Other parallel hardware that may add processing power to the final E-ELT solution includes field-programmable gate arrays (FPGAs), which LabVIEW already supports through the NI LabVIEW FPGA Module, and general-purpose graphics processing units (GPGPUs), which are being researched as a viable acceleration platform. In addition to the Dell proof of concept, a prototype coupling LabVIEW with NVIDIA's CUDA technology has been benchmarked, with impressive computational results.

For more information about LabVIEW implementation in the E-ELT project, read the full case study.
