International Science Grid This Week, an international science publication, touted the Nautilus supercomputer, managed by the National Institute for Computational Sciences, and other Extreme Science and Engineering Discovery Environment (XSEDE) resources used in research on competition in the financial and insurance industries. The researchers are using the supercomputing resources to consider information frictions, which are
Supernovae are among the most energetic explosions in the universe, dispersing the elements that make life possible. However, the energy source driving the violent deaths of these massive stars is not known. Researchers using UT’s Kraken supercomputer have created three-dimensional simulations that have made great strides in uncovering that source.
Jacek Jakowski, a computational scientist at the National Institute for Computational Sciences, was interviewed on an HPCwire podcast about a new computational capability he and his team developed to study the dynamics of prospective energy materials under diverse environmental conditions. He discussed how he and his team are using the Kraken supercomputer to explore
Research being done on the Kraken supercomputer holds promise for overcoming limitations in the study of energy and materials applications. The method employs quantum mechanics to understand how nuclear effects change the dynamics of materials at the microscopic scale.
A blogger for the Wall Street Journal covered research conducted by UT and Oak Ridge National Laboratory scientists. First using a smaller supercomputer named Anton, scientists at ORNL, UT, and the UT-ORNL Joint Institute for Computational Sciences simulated the behavior of 140,000 atoms from the biological signaling mechanisms in E. coli cells. Identifying this amino
Cellulase enzymes, found in nature in sources such as wood-degrading fungi and the stomach compartments of cows, are among the key catalysts for breaking down plant biomass to make biofuels. But they remain quite expensive. Compute allocations from the Extreme Science and Engineering Discovery Environment (XSEDE) have made possible a breakthrough that could have big cost implications.
The UT–Oak Ridge National Laboratory Joint Institute for Computational Sciences and UT’s Office of Information Technology have announced final plans to upgrade the bandwidth of UT’s wide area network for research and education to 100 gigabits per second (100G) by July 2014. The project makes UT an early adopter of the technology and will improve a wide range of big-data and other scientific data flows.
As disease progresses over space and time in the body, high-resolution imaging can capture the changes taking place down to the subcellular level; meanwhile, huge sets of hereditary (genomic) information hold clues about the dynamics of illness. Comparing certain characteristics in the images with genomic and clinical data may be key to predicting disease progression and targeting new treatments. The current work of a research team at UT’s National Institute for Computational Sciences revolves around making those very connections.
Jack Dongarra, distinguished professor of computer science at UT, is designing software that will be critical in making the next generation of supercomputers operational. For decades, supercomputers have been tackling the world’s most pressing challenges, from sequencing the human genome to predicting climate change. But their power is limited, and thus, so is our knowledge.
The way the power of supercomputers is measured is about to change. Since 1993, Jack Dongarra, distinguished professor of computer science at UT, has led the ranking of the world’s top 500 supercomputers. The much-celebrated twice-yearly TOP500 list is compiled using Dongarra’s benchmark, called Linpack. But Dongarra says Linpack hasn’t kept pace with supercomputing needs and must be updated.