From designing new airplane wings to understanding how fuel sprays ignite in a combustion engine, researchers have long sought to understand how chaotic, turbulent motions affect fluid flows under a variety of conditions. Despite decades of focused research on the topic, physicists still consider a fundamental understanding of turbulence statistics to be among the last major unsolved problems in physics.
Because of that complexity, researchers have come to rely on a combination of experiments, semi-empirical turbulence models, and computer simulation to advance the field. Supercomputers have played an essential role in deepening researchers’ understanding of turbulence physics, but even today’s most computationally expensive approaches have limitations.
Recently, researchers at the Technical University of Darmstadt (TU Darmstadt), led by Prof. Dr. Martin Oberlack, and the Universitat Politècnica de València, headed by Prof. Dr. Sergio Hoyas, began using a new approach to understanding turbulence, and with the help of supercomputing resources at the Leibniz Supercomputing Centre (LRZ), the team was able to run the largest turbulence simulation of its kind. Specifically, the team generated turbulence statistics through this large simulation of the Navier-Stokes equations, providing the critical database underpinning a new theory of turbulence.
“Turbulence is statistical, because of the random behaviour we observe,” Oberlack said. “We believe the Navier-Stokes equations describe it very well, and with them we are able to study the entire range of scales down to the smallest ones. But that is also the problem: all of these scales play a role in turbulent motion, so we have to resolve all of them in simulations. The biggest problem is resolving the smallest turbulent scales, which shrink as the Reynolds number grows (a number that indicates how turbulently a fluid is moving, based on a ratio of velocity, length scale, and viscosity). For airplanes like the Airbus A380, the Reynolds number is so large, and thus the smallest turbulent scales so small, that they cannot be represented even on SuperMUC-NG.”
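To put that number in concrete terms, the short sketch below computes a Reynolds number from its defining ratio. The aircraft-scale inputs are rough illustrative assumptions, not values taken from the team’s work.

```python
# Illustrative only: the Reynolds number Re = U * L / nu that Oberlack refers to.

def reynolds_number(velocity_m_s: float, length_m: float,
                    kinematic_viscosity_m2_s: float) -> float:
    """Ratio of inertial to viscous effects in a flow."""
    return velocity_m_s * length_m / kinematic_viscosity_m2_s

# Assumed, aircraft-scale values (not from the article): cruise speed
# ~250 m/s, wing chord ~10 m, kinematic viscosity of air ~1.5e-5 m^2/s.
re_estimate = reynolds_number(250.0, 10.0, 1.5e-5)
print(f"Re is on the order of {re_estimate:.1e}")  # ~1.7e+08
```

At values like these, the smallest eddies are a tiny fraction of the wing size, which is exactly the resolution barrier Oberlack describes.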
Statistical averages show promise for closing an unending equation loop
In 2009, while visiting the University of Cambridge, Oberlack had an epiphany: while thinking about turbulence, he turned to symmetry theory, a concept that forms a fundamental basis for all areas of physics research. In essence, a symmetry of an equation is a transformation of its variables, such as a shift, rotation, or rescaling, under which the equation keeps exactly the same form.
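As a concrete illustration (a standard textbook example, not one drawn from the team’s publications), the incompressible Navier-Stokes equations admit, among others, a Galilean symmetry and a scaling symmetry:

```latex
% The incompressible Navier-Stokes momentum equation
\[
  \partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\tfrac{1}{\rho}\nabla p + \nu\,\nabla^2 \mathbf{u}
\]
% keeps exactly the same form in a frame moving at constant velocity U,
\[
  t' = t, \qquad \mathbf{x}' = \mathbf{x} - \mathbf{U}t, \qquad
  \mathbf{u}' = \mathbf{u} - \mathbf{U}, \qquad p' = p,
\]
% and under the one-parameter scaling group
\[
  t' = \lambda^{2} t, \qquad \mathbf{x}' = \lambda\,\mathbf{x}, \qquad
  \mathbf{u}' = \lambda^{-1}\mathbf{u}, \qquad p' = \lambda^{-2} p.
\]
```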
Oberlack realized that the equations of turbulence do, in fact, obey such symmetries. With this in mind, researchers could in principle forgo the extremely large, dense computational grids in which the governing equations are solved in every grid cell (a common approach for turbulence simulations) and instead focus on computing accurate statistical mean values for air pressure, speed, and other characteristics. The problem is that this averaging approach requires “transforming” the Navier-Stokes equations, and the averaging unleashes a never-ending chain of equations, one for each statistical moment, that even the world’s fastest supercomputers could never solve.
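A minimal sketch, in standard textbook notation rather than the team’s specific formulation, shows where that chain comes from. Splitting the velocity into a mean and a fluctuation and averaging the momentum equation leaves behind an unknown correlation of fluctuations:

```latex
% Reynolds decomposition: u_i = \bar{u}_i + u_i'. Averaging the
% incompressible momentum equation yields
\[
  \partial_t \bar{u}_i + \bar{u}_j\,\partial_j \bar{u}_i
  = -\tfrac{1}{\rho}\,\partial_i \bar{p}
    + \nu\,\partial_j\partial_j \bar{u}_i
    - \partial_j\,\overline{u_i' u_j'}\,.
\]
% The new unknowns \overline{u_i' u_j'} (the Reynolds stresses) need
% their own equations, which in turn contain triple correlations
% \overline{u_i' u_j' u_k'}, and so on without end.
```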
The team realized it needed another accurate method, one that did not depend on such a computationally intensive grid full of equations. It therefore developed a “symmetry-based turbulence theory” and attacked the problem through mathematical analysis.
“When you think of computations and you see these nice pictures of flows around airplanes or cars, you often see grids,” Oberlack said. “What people have done in the past is compute the physical quantities in each volume element, whether it is velocity, temperature, pressure, or the like, so we have local information about the physics. The ‘symmetry-based turbulence theory’ now allows us to drastically reduce this extreme resolution requirement, and at the same time it directly provides the sought-after mean values, such as the mean velocity and the variance.”
Using an almost 100-year-old mathematical turbulence law, the logarithmic law of the wall, the team tested the symmetry theory on a simple geometric shape: in this case, flow over a flat surface. In this simplified setting, the theory proved successful. The researchers found that this law serves as a foundational solution of the first equation in the seemingly unending chain, and that it therefore provides the basis from which all subsequent equations in the chain can be solved.
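For reference, the law in question is usually written in wall units as below; the constants are typical textbook values rather than figures from this study:

```latex
% Logarithmic law of the wall (von Karman, ca. 1930):
\[
  u^{+} = \frac{1}{\kappa}\,\ln y^{+} + B,
  \qquad \kappa \approx 0.41, \quad B \approx 5.0,
\]
% where u^{+} is the mean streamwise velocity scaled by the friction
% velocity and y^{+} is the wall distance scaled by the viscous length.
```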
This is significant because researchers studying turbulence ordinarily must choose a point at which to cut, or close, this infinite chain of equations, introducing modeling assumptions and potential inaccuracies into their simulations. This is known as the closure problem of turbulence, and its solution has long eluded physicists and other researchers trying to better understand the turbulent motion of fluids.
Of course, as with any mathematical theory, the researchers had to verify what they had found. To that end, the team needed computationally expensive direct numerical simulation (DNS), which most researchers consider the most accurate method for simulating turbulence, as the benchmark for its results. DNS of even simple geometries, however, can only run on world-leading computational resources such as LRZ’s SuperMUC-NG supercomputer, which Prof. Oberlack’s team has been using extensively for years.
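A standard back-of-the-envelope scaling argument, not a figure quoted by the researchers, shows why DNS demands machines of this class:

```latex
% Kolmogorov's estimate: the ratio of the largest eddies L to the
% smallest (dissipative) scale \eta grows with Reynolds number as
\[
  L/\eta \sim Re^{3/4},
\]
% so a 3D grid that resolves every scale needs on the order of
\[
  N \sim (L/\eta)^{3} \sim Re^{9/4}
\]
% points: each tenfold increase in Re multiplies the grid size by
% more than a factor of one hundred.
```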
“We wanted the most reliable database possible at the time for comparing our symmetry theory against data,” Oberlack said. “For that reason, we had no choice but to do DNS, because we didn’t want any empirical influence beyond the assumptions contained in the Navier-Stokes equations themselves.”
The team found excellent agreement between the simulation results and its theory, demonstrating that its approach shows promise for helping fluid dynamics researchers solve the elusive closure problem of turbulence.
Closing in on a long-time goal
Oberlack indicated that the team was highly motivated to use its theory in other contexts, and as supercomputing resources continue to get faster, the team hopes to test this theory on more complex geometries.
Oberlack mentioned that he appreciated the role LRZ played in the work. Several team members have participated in LRZ training courses, and while the team is highly experienced with HPC resources, it received prompt, responsive help from LRZ’s user support staff. “It is really important to actually have humans behind these machines who are dedicated to helping users,” he said.