High Performance Computing Will Take Us to the Far Reaches of Space

By Kelly McSweeney

With America’s spaceflight program setting its sights on the Moon and Mars, rockets now must be powerful enough to launch spacecraft farther into space. To do this, Northrop Grumman’s propulsion systems engineers use high performance computing capabilities to design the rocket boosters for NASA’s Space Launch System (SLS).

“We’re truly advancing humankind’s capabilities and understanding of the universe,” said Mark Tobias, deputy chief engineer for the SLS program and chief engineer for the Booster Obsolescence and Life Extension (BOLE) program. On the value of high performance computing, Tobias said, “In terms of time-savings and capability, it’s like going from a drafting board to a CAD terminal.”

[Image: a rocket motor lying horizontal and firing during a ground test]

Ready for Liftoff

After eight SLS launches, Tobias and his team will have used all of the existing steel casings for the rocket boosters, and new composite cases will be required to sustain the program for additional launches. These obsolescence-driven changes also permitted complementary design enhancements so that the rockets can meet NASA’s objective of launching spacecraft farther into space.

To run simulations incorporating the new design elements, Northrop Grumman engineers used TETON, a high performance computing (HPC) cluster made up of racks of processors (10,240 in all) working together to process trillions of calculations.

“The TETON system is larger and more capable than anything we’ve had before. It’s a highly parallel computer system,” said Brad Roe, an aerothermal analyst on the BOLE program.

Rocket Optimization, A Million Simulations at a Time

The engineers started the project with a set of loose design requirements, which meant there were millions of possible combinations of booster capabilities and system requirements. This freedom made the initial design work very computationally expensive.

Historically, the design evaluation process has been manual and time-consuming: a booster performance analyst would design a motor and pass the thrust profile to a trajectory analyst, who would evaluate system-level performance. The two would iterate back and forth to find the best possible design. In this case, the design space was too large for a manual approach to be effective.

For this effort, the first order of business was to build an integrated system-level computer model in which the booster design and performance were plugged directly into the vehicle trajectory model. With this process in place, a given booster design could be evaluated against requirements automatically within a minute or so. This automated process could be plugged into an optimization algorithm, but even with the significantly reduced design evaluation time, evaluating millions of designs would take a regular PC far too long to crunch the numbers.
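The workflow described above can be sketched in a few lines. Everything here is illustrative: evaluate_design is a hypothetical stand-in for the integrated booster/trajectory model, and the design parameters, ranges, and performance surface are invented for the example, not real SLS values.

```python
import random

# Hypothetical stand-in for the integrated system-level model: given
# booster design parameters, return a system-level performance score.
# The real model couples a booster thrust profile to a vehicle
# trajectory simulation; this toy version is a smooth function of two
# invented parameters (propellant mass and nozzle expansion ratio).
def evaluate_design(propellant_mass, expansion_ratio):
    # Toy performance surface with a single optimum (not real data).
    return -(propellant_mass - 630.0) ** 2 - 5.0 * (expansion_ratio - 7.5) ** 2

def random_search(n_evals, seed=0):
    """Evaluate many candidate designs automatically; keep the best."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_evals):
        design = (rng.uniform(500, 700), rng.uniform(5, 10))
        score = evaluate_design(*design)
        if best is None or score > best[1]:
            best = (design, score)
    return best

best_design, best_score = random_search(100_000)
```

Random search is used here only because it is the simplest optimizer to show; the article does not say which optimization algorithm the team plugged the automated evaluation into.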

“So, we put it on TETON, which allowed us to evaluate hundreds of designs simultaneously,” said Roe.

TETON cut the total runtime to only a few days, assessing system-level performance on a scale that would not have been possible previously. At times, the team used 30 to 40 percent of the HPC cluster’s capacity, a far larger share than any single job typically consumes.
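Because each candidate design can be evaluated independently, the work fans out naturally. The single-machine sketch below, using only Python's standard library, shows that shape; on the actual cluster a scheduler would distribute evaluations across thousands of processors, and evaluate_design is again a hypothetical placeholder, not the team's model.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical placeholder for the integrated system-level model:
# score one candidate booster design by its identifier.
def evaluate_design(design_id):
    return design_id, -(design_id - 500) ** 2  # toy scoring function

# Fan hundreds of independent design evaluations out in parallel.
# A local thread pool stands in for the cluster scheduler here; on an
# HPC system each evaluation would run as its own job on its own cores.
def evaluate_batch(design_ids, workers=16):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(evaluate_design, design_ids))

scores = evaluate_batch(range(1000))
best_id = max(scores, key=scores.get)
```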

Uncertainty Quantification Simulation Flips the Analysis

Once the team had the optimal propulsion system design, they moved on to assess how the SLS vehicle would fly with realistic performance variability and prediction uncertainty. By again running millions of simulations, the team could assess the likelihood that the selected booster design would meet the overall requirement, specifically SLS vehicle payload capability. This uncertainty quantification (UQ) activity provided an initial requirement verification much earlier than for previous programs.
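A Monte Carlo uncertainty quantification of this kind can be sketched as follows. All numbers are assumptions made up for the example (the requirement, nominal prediction, and uncertainty are invented, not actual SLS payload figures); each trial perturbs the prediction by its assumed uncertainty and checks whether the payload requirement is still met.

```python
import random

# Illustrative values only, not real SLS numbers.
PAYLOAD_REQUIREMENT_T = 42.0   # assumed payload requirement, metric tons
NOMINAL_PAYLOAD_T = 43.5       # assumed nominal prediction
PREDICTION_SIGMA_T = 0.8       # assumed 1-sigma prediction uncertainty

def run_uq(n_trials, seed=1):
    """Estimate the probability the payload requirement is met."""
    rng = random.Random(seed)
    # Draw one perturbed payload prediction per simulated trial.
    payloads = [rng.gauss(NOMINAL_PAYLOAD_T, PREDICTION_SIGMA_T)
                for _ in range(n_trials)]
    meets = sum(p >= PAYLOAD_REQUIREMENT_T for p in payloads)
    return meets / n_trials, payloads

probability, samples = run_uq(1_000_000)
```

Running millions of such trials is trivially parallel, which is why this kind of assessment benefits so directly from an HPC cluster.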

The final step, which is currently in progress, is to use the statistical results of the UQ simulation to derive booster-level performance requirements that ensure system requirements are met. This is much different than the typical process, where booster requirements are pushed up based on expectation rather than flowed down based on need.
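One common way to flow a requirement down from UQ statistics is to take a conservative percentile of the simulated distribution. The sketch below shows only the general shape of such a derivation, with invented, normalized numbers; the article does not state which booster parameters or statistical criteria the team actually uses.

```python
import random

def derive_floor(samples, quantile):
    """Return the value at the given quantile of the sorted samples."""
    ordered = sorted(samples)
    return ordered[int(quantile * (len(ordered) - 1))]

rng = random.Random(2)
# Hypothetical booster total-impulse samples from a system-level
# Monte Carlo, in normalized units (mean 1.0, 2 percent scatter).
impulse_samples = [rng.gauss(1.0, 0.02) for _ in range(100_000)]

# A 0.135th-percentile floor roughly corresponds to a -3-sigma bound:
# a derived booster-level requirement flowed down from vehicle need.
impulse_floor = derive_floor(impulse_samples, 0.00135)
```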

TETON Defines Possible Using Numerical Methodology

The design and analysis process implemented for the BOLE program provides confidence that the optimal design was found and that, even with reasonable prediction uncertainty, the booster will deliver the performance the vehicle needs. The analysis also provides confidence that the booster requirements to be used throughout the program are closely aligned with vehicle needs. This numerical methodology simply would not have been possible without an HPC tool like TETON, which made the design and performance validation process more efficient and added new capabilities.
