3D electromagnetic (EM) simulation technology has been steadily advancing over the past 30 years. Integration of multi-core and multi-GPU capabilities, extension to cloud and high-performance computing resources and the development of novel solution techniques have reduced the time required for large, complex simulations.1 However, many of these features are not native to commercial software suites; they require additional software modules or are paywalled.

Traditionally, there have been limitations to the design optimizations that legacy EM simulation codes could perform, largely because most well-known, commercially distributed EM codes originated three to four decades ago.2 Embracing new EM simulation software tools, designed from the ground up with modern computing technology and advanced algorithms in mind, can revolutionize the practical utility of EM simulation for complex RF applications.

This article discusses EM simulation challenges for modern RF technologies, provides updates on recently released advances in EM simulation technology and examines the potential benefits of faster, more accurate EM simulation software purpose-built to efficiently simulate electrically large bodies.

Current Challenges in EM Simulation

Figure 1 Output of an AESA radar simulation.

Trends in RF technology are creating greater challenges for legacy EM solvers. First, there are historical challenges, such as performing accurate simulations of electrically large bodies. Most current EM solvers originated decades ago, often in academic settings, and the original software suites that support them were developed to leverage the technology of that time. These suites have advanced somewhat over the years to include new simulation techniques, add-ons and support for multi-core and GPU processing, but these accelerating technologies are not typically innate functions of the core solvers. As complex phased array antenna systems with many antenna elements have evolved for commercial and defense applications, simulations of these electrically large and complex objects have become time- and computing-resource intensive. Figure 1 shows the output from a simulation of an electrically large and complex active electronically scanned array (AESA). For these types of structures, radar cross-section analysis and co-site interference assessment are typically prohibitively expensive from a computational standpoint.
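To put "electrically large" in perspective, an object's electrical size is its physical dimension measured in wavelengths. The numbers below are illustrative rather than taken from the array in Figure 1:

```latex
% Electrical size: physical dimension D expressed in wavelengths
% Assumed example values: D = 1.5 m aperture at f = 10 GHz
\lambda = \frac{c}{f} = \frac{3\times10^{8}\ \mathrm{m/s}}{10\times10^{9}\ \mathrm{Hz}} = 3\ \mathrm{cm},
\qquad
\frac{D}{\lambda} = \frac{1.5\ \mathrm{m}}{0.03\ \mathrm{m}} = 50\ \text{wavelengths}
```

Because full-wave surface meshes typically need on the order of ten elements per wavelength, the unknown count grows roughly with the square of that ratio, which is why electrically large structures strain legacy solvers.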

As is often the case with common EM software suites, much of the capability of these tools is delivered through optional upgrade modules that require additional licensing for GPU acceleration or other accelerating methods. The most common licensing approach ties the simulation capability to a single computer, known as node-locked licensing, though floating licenses are sometimes available at additional cost. Some EM software suites are extensible or allow scripting to automate or organize complex and parametric simulations, but this functionality is not universal.

In the past, there may not have been adequate standardization between CPU and GPU processing technologies, which made it difficult to design software that natively took advantage of the scalability and processing benefits of multi-core CPUs and multi-GPU technology. Many legacy EM solvers are also well-suited to only a narrow range of models and problems, and the most widely known commercial EM solvers require large amounts of computational resources to accurately simulate electrically large bodies.

Benefits of High Efficiency EM Simulation Technology

EM simulation of electrically large bodies and complex structures has always involved a trade-off between available computational resources, time and accuracy. The further a simulation model departs from physical reality, the harder it becomes to verify the simulation tool and process against real experimentation. Inadequate accuracy in an EM simulation can also cost many man-hours chasing artifacts of a poor-quality simulation.

These artifacts can be caused either by limitations of the modeling technique or by the omission or inclusion of aberrant model data, such as CAD or material information or approximations in the model’s excitations or post-processed data. Examples include simulations with inadequate low-order meshing, leading to geometry-approximation errors, or low-order physics approximations.
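To make the geometry-approximation point concrete, the short sketch below (purely illustrative, not taken from any particular solver) estimates how far a flat-faceted, low-order mesh deviates from a true circular cross-section of radius r when the circle is divided into N segments; the maximum deviation (sagitta) is r·(1 − cos(π/N)).

```python
import math

def facet_error(radius_m: float, num_segments: int) -> float:
    """Maximum deviation (sagitta) of an N-sided polygon from a circle of the given radius."""
    return radius_m * (1.0 - math.cos(math.pi / num_segments))

# Illustrative numbers: a 0.1 m radius pipe meshed with flat facets
for n in (16, 64, 256):
    err_mm = facet_error(0.1, n) * 1e3
    print(f"{n:>4} facets -> max geometry error ~ {err_mm:.3f} mm")
```

For context, at the 8 GHz used later in this article the wavelength is about 37.5 mm, so even a millimeter or two of facet error is a non-negligible fraction of a wavelength, which is how meshing error feeds directly into simulation error.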

Figure 2 Comparing Nullspace solver performance versus a leading competitor.

With a flexible and efficient EM simulation tool, users can perform more early-stage virtual prototyping and integrate these models into larger model-based systems engineering environments. Capturing system concepts and performance early, with confidence in the accuracy of the models, provides tremendous potential for improved customer satisfaction, cost savings and reduced project schedule risk. To illustrate this, Figure 2 compares the same EM simulation, run on the same hardware, using legacy commercial EM simulation software and the Nullspace solver technology. The figure illustrates how errors accumulate during a typical design process and how reducing simulation and meshing errors yields the benefits of innovation, cost reduction and shorter time to market.3

Advances in EM Simulation Technology Optimized for Electrically Large Bodies

Efficiently handling realistic simulations of electrically large bodies requires extensive optimization throughout the simulation process, as well as a small geometry error. Reducing the initial modeling error can have a dramatic impact on the overall accuracy of an EM simulation because geometry errors tend to compound with simulation complexity. This places a hard limit on the simulation accuracy of EM solvers not built to reduce geometry error.

Figure 3 (a) Surface current depiction for a PEC at a particular radar look-angle. (b) PEC open pipe RCS simulation for varying angles of attack.

In most cases, EM software consumes large amounts of system memory, often prohibitively so, while attempting to run compute-intensive simulations. Highly efficient compression algorithms can help reduce the amount of data that needs to be stored, reduce the time that the data is stored and reduce the time needed to process and redeploy the data generated by EM simulation. A more advanced version of this concept is adaptive and on-the-fly compression, which intelligently handles data in a way that optimizes the use of memory and storage space. This approach enables an EM simulation tool to handle much larger problems more rapidly with the same computational resources.
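A minimal sketch of the on-the-fly idea is shown below, assuming a hypothetical solver that stores large intermediate matrix blocks: blocks are compressed with a general-purpose codec when they are written and decompressed only when the solver touches them again. A production solver would likely use lossy, rank-based or hardware-aware compression, but the memory trade-off is the same.

```python
import zlib
import numpy as np

class CompressedBlockStore:
    """Illustrative store that keeps simulation matrix blocks compressed in memory."""

    def __init__(self, level: int = 6):
        self.level = level
        self._blocks = {}  # block id -> (compressed bytes, shape, dtype)

    def put(self, block_id: str, block: np.ndarray) -> None:
        raw = np.ascontiguousarray(block).tobytes()
        self._blocks[block_id] = (zlib.compress(raw, self.level), block.shape, block.dtype)

    def get(self, block_id: str) -> np.ndarray:
        data, shape, dtype = self._blocks[block_id]
        return np.frombuffer(zlib.decompress(data), dtype=dtype).reshape(shape)

# Usage: store a (partly redundant) impedance-matrix block and read it back on demand
store = CompressedBlockStore()
block = np.zeros((512, 512), dtype=np.complex128)
block[:64, :64] = np.random.default_rng(0).standard_normal((64, 64))
store.put("Z_0_0", block)
restored = store.get("Z_0_0")
assert np.array_equal(restored, block)
```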

The last major efficiency hurdle for EM simulation tools is the handling of multi-core, multi-CPU and multi-GPU acceleration. Most established EM software tools were designed when multi-core CPUs were still evolving and multi-CPU platforms were often proprietary. Multi-GPU acceleration is a more recent feature in some EM simulation software suites and can be a crucial factor in reducing overall simulation times.4 However, many EM software suites put multi-CPU and multi-GPU acceleration behind paywalls that require the purchase of additional modules, which leads to the same trade-off between cost and computational resources. With the cost of this hardware having dropped enormously in recent years, many EM simulation users would benefit from full support for the computational hardware they already have.
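One common way to exploit multiple GPUs without solver-level changes is to partition an embarrassingly parallel sweep, for example a set of monostatic RCS look angles, across devices with one worker process per GPU. The sketch below is a generic pattern, not the Nullspace implementation; `solve_angle` is a hypothetical stand-in for a single-angle solve.

```python
import os
from concurrent.futures import ProcessPoolExecutor

NUM_GPUS = 4  # assumed; query the GPU runtime in practice

def solve_angle(task):
    """Hypothetical single look-angle solve, pinned to one GPU via CUDA_VISIBLE_DEVICES."""
    angle_deg, gpu_id = task
    os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu_id)  # must be set before the GPU library initializes
    # ... launch the GPU-accelerated solve for this angle and return its RCS value ...
    return angle_deg, 0.0  # placeholder result

def sweep(angles_deg):
    # Round-robin the look angles over the available GPUs
    tasks = [(a, i % NUM_GPUS) for i, a in enumerate(angles_deg)]
    with ProcessPoolExecutor(max_workers=NUM_GPUS) as pool:
        return dict(pool.map(solve_angle, tasks))

if __name__ == "__main__":
    results = sweep(range(0, 91, 1))  # 0 to 90 degrees in 1 degree steps
    print(f"Completed {len(results)} look angles")
```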

Why EM Simulation Efficiency & Accuracy Matter

With more efficient EM simulation, the same computing hardware completes simulations in less time. If the efficiency and speed of the simulation are substantially better, that time can be used in other areas, like additional optimizations, uncertainty analysis or faster product introductions. This concept is illustrated in Figures 3a and 3b.

Simulating the radar cross-section (RCS) of electrically large bodies is generally a very demanding task. Figure 3a shows a perfect electric conductor (PEC) hollow cylindrical tube at 8 GHz. The RCS analysis requires a model that is relatively simple to set up in most EM simulation software tools. This reference solution provides a useful comparison between tools because it allows accuracy and speed to be assessed simultaneously, and it can also be useful for tuning and calibrating a simulation setup. While the model is easy to set up, the open pipe geometry exhibits complex scattering physics that is generally challenging for most EM codes.
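For reference, the quantity plotted in Figure 3b is the standard far-field RCS, usually quoted in dB relative to one square meter (dBsm):

```latex
% Radar cross-section from scattered and incident fields, and its dBsm form
\sigma \;=\; \lim_{r \to \infty} 4\pi r^{2}\,
\frac{\lvert \mathbf{E}_{\mathrm{s}} \rvert^{2}}{\lvert \mathbf{E}_{\mathrm{i}} \rvert^{2}},
\qquad
\sigma_{\mathrm{dBsm}} \;=\; 10 \log_{10}\!\left(\frac{\sigma}{1\ \mathrm{m}^{2}}\right)
```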

Figure 3b plots RCS in dBsm versus angle of attack in degrees for the PEC open pipe at 8 GHz. The plot shows that the Nullspace EM simulation software matches the reference very closely. The reduced geometric error and the high-order physics of the Nullspace solution result in a much more accurate simulation than a leading commercial EM simulation software.

In addition to the accuracy benefits, the Nullspace solution is much faster. A greater level of optimization, built-in compression algorithms and intrinsic integration of multi-CPU and multi-GPU acceleration lead to an enormous reduction in simulation time with only a marginal increase in CAD/meshing time. Figure 4 shows a side-by-side comparison of the simulation process times for a leading commercial EM simulation software tool and Nullspace EM on identical hardware, simulating an electrically large PEC hollow cylindrical tube in free space.

Figure 4 Comparing simulation process times for an electrically large PEC hollow cylindrical tube in free space.

Figure 5 Speed-up factor versus number of GPUs to simulate PEC hollow cylindrical tube RCS.

To illustrate the value of intrinsic multi-GPU support, the plot in Figure 5 shows a roughly linear increase in simulation speed with additional GPUs. In addition, the speed-up factor increases with frequency: of the three frequencies tested, the greatest benefit came at the highest frequency, 12 GHz. This is not surprising because simulations become more resource intensive as frequency increases.
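The speed-up factor plotted in Figure 5 is simply the single-GPU wall-clock time divided by the N-GPU time. The short check below, using made-up timings rather than the measured data, shows what near-linear scaling looks like in those terms.

```python
# Illustrative timings in seconds (assumed, not measured): wall-clock time vs. GPU count
timings_s = {1: 3600.0, 2: 1850.0, 4: 950.0, 8: 500.0}

baseline = timings_s[1]
for gpus, t in sorted(timings_s.items()):
    speedup = baseline / t
    efficiency = speedup / gpus  # 1.0 would be perfectly linear scaling
    print(f"{gpus} GPU(s): speed-up {speedup:4.2f}x, parallel efficiency {efficiency:.0%}")
```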