While the rest of the country prepared to view August’s total solar eclipse with their own eyes, researchers at Predictive Science Inc. got a sneak peek of the phenomenon with the help of one of UT’s supercomputers.
Scientists at Predictive Science, a San Diego research company that studies solar physics, ran simulations of the solar eclipse using three supercomputers. Among them was the newly upgraded Stampede2 supercomputer at UT’s Texas Advanced Computing Center (TACC). The simulation was intended to study the sun’s atmosphere, called the corona, according to Predictive Science research scientist Cooper Downs.
“When you ask a regular person what the sun looks like, they’re going to draw you a big, orange ball,” he said. “It turns out, that’s the tip of the iceberg for the rest of the sun’s atmosphere, which extends and reaches out to Earth.”
Niall Gaffney, TACC director for data-intensive computing, said that a solar eclipse is the only time when the entire corona is visible to telescopes and satellites.
Downs said Predictive Science wants to learn more about the complex structure of the corona because the sun’s magnetic field, acting on the corona, drives the solar wind, a stream of high-energy particles ejected from the sun. UT astronomy assistant professor Stella Offner said that when the corona becomes very active, the solar wind gets strong, and the particles can affect satellites and electronics in space or high in the atmosphere.
“If you can predict (solar wind), just like we can predict the weather on the Earth to some extent, then you can take steps to shield the electronics from damages from the high-energy particles,” Offner said.
Stampede2 was one of three supercomputers used, along with Comet at the San Diego Supercomputer Center and NASA’s Pleiades supercomputer. Predictive Science ran different simulations on different machines to finish the work faster, Downs said.
Supercomputers, such as Stampede2, contain thousands of small computers, similar to personal computers, that work together on one problem at the same time, Gaffney said.
“It gives you the ability to model very complex systems, whether it’s the atmospheres of stars or the atmosphere of Earth, with what we’re modeling in Houston for the hurricane,” he said. “What these systems let us do is tackle large-scale, complex problems where a pen and paper or a laptop just won’t do it.”
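In rough terms, that division of labor looks like the toy Python sketch below (a hypothetical illustration, not the scientists’ actual code): several worker processes each handle one slice of a large calculation, the way a real solar simulation divides up a physical grid.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    # Each worker handles one slice of the full problem.
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 8
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # matches sum(range(n)), but the work was split eight ways
```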
Stampede2, which went online in August, is an upgraded version of the Stampede supercomputer. The previous system was composed of 6,800 individual computers with 16 processors each, for a total performance of 2.2 petaflops, or 2.2 quadrillion floating-point calculations per second, Gaffney said. The new Stampede2 runs at about 9.8 petaflops and will be upgraded again, to between 15 and 18 petaflops, within the next couple of months.
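For scale, a back-of-envelope calculation using the figures quoted above (as reported, with round numbers) shows what each individual processor contributed to the original system:

```python
# Rough arithmetic from the reported figures; nothing here is official TACC data.
nodes = 6_800           # individual computers in the original Stampede
procs_per_node = 16     # processors in each one
total_flops = 2.2e15    # 2.2 petaflops = 2.2 quadrillion floating-point ops per second

per_processor = total_flops / (nodes * procs_per_node)
print(f"~{per_processor / 1e9:.0f} gigaflops per processor")  # roughly 20 gigaflops
```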
“It’s faster and it has more memory, so it can do things at a much higher resolution than it could before,” Gaffney said.
Offner, whose research includes computer simulations of star formation, said simulations such as this can take millions of CPU hours, the total processing time summed across all of a computer’s central processing units.
“What we do instead is parallel simulations, so we divide up the calculation among hundreds or thousands of processors,” she said. “That reduces the calculation that takes millions of CPU hours into taking only maybe a year or a month of … actual waiting time. That’s much more manageable in terms of the researcher’s lifetime.”
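The arithmetic behind that point is simple. Assuming a hypothetical 10-million-CPU-hour job and perfect parallel scaling (both are assumptions for illustration, not figures from the researchers):

```python
# Hypothetical job size; perfect scaling assumed for illustration.
cpu_hours = 10_000_000

for processors in (1, 1_000, 10_000):
    years = cpu_hours / processors / (24 * 365)
    print(f"{processors:>6} processors: ~{years:,.2f} years of wall-clock time")
```

On a single processor the job would take more than a thousand years; spread across ten thousand processors, the wait drops to roughly six weeks, in line with Offner’s estimate of a year or a month.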
Offner said she liked the visualizations of the data that the simulations produced.
“There’s just so much data that comes out of the simulation, so one of the best ways to really understand the data and communicate it is to make visualizations,” she said. “You can just look at it and get a sense for what the calculation is actually doing and what it is actually discovering.”