Introduction

My first experience with real RF design was as an electrical engineering student intern at Texas Instruments (TI) during one summer in the early 1980s. I was eager to put my HP-41CV, running a program based on Allen and Medley’s “Microwave Circuit Design Using Programmable Calculators,” to work on a real microwave circuit design. Instead, I was introduced to a then-new computer-aided design program called Touchstone [23], which ran on a TI PC. That was the good news. The bad news was that in order to make sure the circuit worked as simulated, I first had to figure out where to get the latest version of the MESFET model. Fortunately, the GaAs fab was co-located with the circuit design team, so it seemed like a good place to start. But instead of a model file, I was handed the “golden” wafer of the day and shown to the test lab, where I could take all the data I wanted to build my own Spice model. How did we handle yield for our finished products? Every part we delivered was tweaked and tuned by hand.

Just imagine this scenario for modern CMOS RFIC designs. I think we lost the tall, thin RF design engineer around the 0.25-µm CMOS node. The point of this story is that if things had remained the same, we never would have realized such explosive growth in the wireless IC market. Luckily, over the years, the RF design process has evolved. The advances in EDA, semiconductor manufacturing, design, and test have yielded great commercial and consumer benefit, but they have also created divisions between the various design tasks and the information and knowledge needed to keep pace with the demand for faster, smaller, cheaper, and more connected wireless devices. The dilemma over how best to verify advanced RFICs and design for yield is a direct consequence of this evolution. Today’s complex high-performance, low-power, low-cost, and high-volume RFIC requirements continue to put tremendous strain on the ecosystem to provide a solution to this dilemma.

Simulation at the Center


Figure 1 depicts the various sources of requirements and information needed as part of the RFIC design and verification process.

Let’s look at the importance of each of these and where we are with respect to integrating and using the information necessary to enable proper verification and design-for-yield at the RFIC design level.

Figure 1. Shown here is the simulation-centric RFIC design ecosystem.

System/Product Design

System- and product-level design tools for electronics, or the Electronic System Level (ESL) category of Electronic Design Automation (EDA) tools, are available from various vendors. However, few are focused on accurately modeling the entire wireless RF subsystem shown in Figure 2. Most of the attention is either focused outside this subsystem entirely, or only on baseband algorithmic development or other DSP functions that may reside on the radio itself.

In this paradigm, simple models are used for the radio and RF front-end blocks. While more specific tools for RF architecture design (radio and front-end module) exist, more often than not spreadsheets or numerical programming languages seem to be the tools of choice. There are two main weaknesses to this scheme. Where general tools are used, the information flow between system and circuit design is manual, and iterations targeted at improving either performance or yield are prohibitively time-consuming. On the other hand, where more RF-specific tools are used [1], the lack of a standard interface for signals and specifications between system and circuit simulation environments limits the development and business model for standards-based or custom wireless verification intellectual property (IP). It also means that performance information captured at the circuit level has little chance of being used for overall system verification. An emerging standard for simulation- and measurement-based nonlinear models [2] offers a glimmer of hope for the design of RF front-end building blocks, and it seems to be gaining both momentum and support [3] [4]. Although theory supports extending the use of these models to transceiver functional path modeling, the implementation and operational aspects required for a successful solution are still being worked out. One thing is certain: until RFIC verification follows the path of digital and builds verification teams to write behavioral models, simulation- and measurement-based approaches will remain more tractable for the RF designer.

Figure 2. A simplified wireless RF subsystem.

Packaging/Module Development

Until fairly recently, RFIC designers always found ways to either include or avoid, “by design,” the respective chip/package/module/board interactions that could render their product (assembled from well-designed individual pieces) unusable. What has changed is the market demand for highly functional, small form factor, wirelessly connected consumer devices. Not only is the demand high, but since consumers are willing to pay a premium for these “smart,” connected devices, manufacturers are now more incentivized to produce them [5]. The old engineering paradigms of separation of variables and divide-and-conquer just don’t cut it anymore.

Remember RF “keep out” regions under critical areas of potential IC-to-package/board coupling? The evolution of flip-chip technology, die stacking, and wafer-level packaging, combined with small form factor area requirements [6], has driven deeper analysis of how to accommodate these technologies. The lack of a unified 3D, high-capacity EDA solution that crosses the chip/package/module/board boundaries is problematic. I have heard anecdotally that at the RF subsystem level, dozens of placement-centric circuit iterations are required to build prototypes, let alone a working product. While there have been a few good initial attempts at addressing various parts of the problem [7], one big issue still remains: a unified design database between RF SOC and package/module/board design environments.

As the industry moves forward with the adoption of Open Access [8] (OA) as a standard electronic design database, this particular issue may be solved. This represents forward movement on the custom IC, microwave, and MMIC fronts [9]. However, until mainstream packaging/board vendors deliver OA-based products, current interoperability efforts [10] [11] extend to include their requisite data structures, and package houses deliver IC-like process design kits (PDKs), the design community will continue to create its own RF-centric models to analyze the effects of bond wires, bumps, through-wafer vias, off-chip components, and high-frequency/data-rate interconnects the best way it can. The expense of multiple iterations continues to plague the industry.

IC Manufacturing

Probably the most critical inputs for RFIC design and verification are the process design kits (PDKs) delivered by captive fabs and commercial foundries. These continue to be the bedrock that all RFIC design and verification rests on. Many, if not most, mainstream fabless RFIC chip design companies augment these kits to create a more RF-accurate and design-specific representation of the manufacturing data from their own experience and measured data. While there has been a lot of improvement in the support of RF-centric models [12] [13], RF-specific process technology, and RF-oriented foundry EDA programs [14] [15], when it comes to RF verification and yield, the support of statistical Spice models has not been enough to convince RFIC designers to embrace them.

There are various reasons for this response from RFIC designers, including model/process alignment, the cost of licensing to support comprehensive verification, time constraints on product schedules, and the value of the information derived. Let’s take a closer look at model/process alignment.

After much development time, expense, and pre-production silicon with key customers, a given process technology is deemed ready for production release only when process and electrical parametric variations meet the release criteria. These criteria are usually some mean value and number of standard deviations of the distribution for each of the parameters the foundry tracks. These distributions are the foundry’s guarantee that the process is under control and are also used for determining whether to re-process wafers at certain steps (e.g., chemical mechanical polishing or photo) or to reject wafers at final electrical probe. It’s important to note that in most cases, the statistical Spice models are aligned with this release.

The simplest process indices used for describing measured vs. target specification alignment are Cp and Cpk [16] [17]. In simple terms, Cp relates the spread between the upper and lower specification limits to the measured process variation (Figure 3). Cpk measures how well centered the measured data is relative to the specification limits.

Figure 3. A graphical depiction of the process capability index Cp.

The more measured standard deviations that fit between the specification limits, the tighter the process control. It’s in the foundry’s interest, therefore, to reduce variability in the process. Doing so creates both margin on wafer yield and the opportunity to tune the mean values of these parameters within the guaranteed specification limits to address product-dependent yield sensitivities. In general, foundries do not update the statistical models to reflect this. Consequently, just like skew or corner models, actual RFICs may never see the process that these models represent.
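As a worked illustration, Cp and Cpk can be computed directly from a tracked parameter’s sample mean and standard deviation. The sheet-resistance readings and specification limits below are invented for the example, not from any foundry release:

```python
import statistics

def cp(sigma, lsl, usl):
    """Cp: allowed specification width relative to the 6-sigma process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(mu, sigma, lsl, usl):
    """Cpk: distance from the mean to the nearest spec limit, in 3-sigma units."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical sheet-resistance readings (ohms/sq) against limits 9.0..11.0
samples = [10.1, 9.9, 10.3, 10.0, 9.8, 10.2, 10.4, 10.0]
mu = statistics.mean(samples)
sigma = statistics.stdev(samples)
cp_val, cpk_val = cp(sigma, 9.0, 11.0), cpk(mu, sigma, 9.0, 11.0)
```

Note that Cpk ≤ Cp always holds, with equality only when the measured mean sits exactly at the center of the specification window.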

There are various tactics to deal with this issue, but design managers generally see very little return on investment in this level of verification due to the model alignment issues and the expense in terms of time and licenses. Even if statistical Spice models were continuously re-aligned and the cost issues could be overcome, one question would still remain: “What do you do with the information you obtain?” For key RFIC product specifications, for example, such analysis may tell you that you have poor yield, but it doesn’t tell you the root cause.

Test

There are many important inputs to the RFIC design and verification flow that are test related. One is from the modeling aspect and we’ve already touched on statistical transistor-level modeling in the previous section. Now let’s focus on two other areas: behavioral modeling and functional-level testing.

As previously mentioned, nonlinear distortion models [2] are gaining momentum even at the system level [3]. Measurement can play a critical role in situations where a compact model representation does not exist or does not accurately capture real nonlinear effects for an RF component. As a result, test solutions have come to market to address this capability [4]. While transistor-level simulation across technology boundaries is possible (transceiver and RF front-end, for example), a transistor-level approach with inaccurate results is not very useful. The ability to use measurement-based nonlinear distortion models in either system or circuit-level simulations can be very beneficial under these conditions.

A critical stage of the product development cycle is product testing, especially on first silicon for a complex RFIC design. It’s usually the first chance you have to see the effect of all the verification runs that you didn’t have time for, the post-layout extraction re-simulation issues that never got resolved, or the chip-to-chip, package, and board effects that couldn’t be analyzed due to the lack of a comprehensive EDA flow. It is the place where everything comes together and usually where debug begins. Normally, there is little alignment between the test benches used for product design and test. Due to the highly integrated nature of complex RFICs, block-level testing is seldom useful. Instead, system-level RF functional path performance and programmability is what is tested on the actual silicon. One solution is to perform functional path simulations with the same system-level test benches used for testing the hardware. While this pushes the capacity of both RF simulators and available compute resources, self-consistent solutions do exist [18].

RFIC Design and Verification

After reviewing the various inputs, capabilities, and limitations from the rest of the RFIC design ecosystem, we can now focus on what RF simulation environments need to support for the design and verification of complex RFICs. For transistor-level functional path simulation to align with system test, the simulation environment needs to support the same complex modulated sources for the various standards of interest. While multi-tone analysis has been the standard approach for many years, the evolution and complexity of wireless standards is now driving the use of real-world complex modulated signals, especially to see the effects of cross modulation and interference between the various wireless emitters. 4G LTE, for example, requires backward compatibility with W-CDMA, as the two share most of the same frequency band allocations. The typical approach for handling these types of input signals is some form of envelope transient simulation. While these approaches can handle the complex signaling and the circuit’s nonlinear RF behavior [19], simulation times can still be prohibitive for more complex analyses.
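To make the envelope idea concrete, here is a minimal baseband-equivalent sketch: the RF carrier is removed and only the complex envelope is simulated, so the time step tracks the modulation bandwidth rather than the carrier frequency. The third-order coefficient `c3` and the crude QPSK-like source are invented for illustration, not a standards-compliant stimulus:

```python
import math
import random

random.seed(3)

def pa_envelope(x, c3=-0.08):
    """Memoryless third-order nonlinearity in complex-envelope form.

    y = x + c3*|x|^2*x is the baseband equivalent of odd-order distortion
    around the carrier; c3 here is an invented illustrative coefficient.
    """
    return [xi + c3 * abs(xi) ** 2 * xi for xi in x]

# Crude QPSK-like complex envelope: unit-magnitude symbols (±1 ± 1j)/sqrt(2)
symbols = [complex(random.choice([-1, 1]), random.choice([-1, 1])) / math.sqrt(2)
           for _ in range(256)]
out = pa_envelope(symbols)  # compressed envelope; FFT of `out` would show regrowth
```

Because every input symbol has unit magnitude, the output magnitude is |1 + c3| = 0.92 for all samples, i.e., pure gain compression; with a filtered (non-constant-envelope) input the same model produces spectral regrowth.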

Advanced “Fast Envelope” methods [20] have been developed that yield speedups of many orders of magnitude with little loss of accuracy. These models are created once at run-time for a particular circuit configuration. This allows the designer to push the envelope with respect to simulated circuit size vs. simulation speed, and enables maximum reuse for additional analyses, including yield. Figure 4 and Table 1 give the results of “Fast Envelope” transient simulation for an IS95 up-converter from the previously referenced paper. These advances make the use of real-world signaling for verification and yield analysis possible for complex RF circuits and systems.

Figure 4. An IS95 transmitter output spectrum for various fast envelope techniques.

Table 1. Fast envelope simulation results for an IS95 transmitter.

While we are on the topic of yield analysis, let’s try to make some sense out of how to deal with statistical models and investigate a more comprehensive methodology for both block and functional path statistical analysis. The primary techniques included in RF simulators that target advanced (Bi)CMOS technologies cover simulating the effects of varying process, voltage, and temperature (PVT) and mismatch. Such PVT analysis takes two forms: either worst-case corner analysis or full statistical sampling, also known as Monte-Carlo process or mismatch analysis. Keep in mind that process models are global variation models, usually representing large variations in process, while mismatch models represent local variations.
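The global/local distinction can be sketched with a toy Monte-Carlo loop for a pair of matched transistors. The threshold-voltage numbers below are invented placeholders, not values from any real PDK:

```python
import random
import statistics

random.seed(1)

def sample_vth_pair(vth_nom=0.45, sigma_process=0.030, sigma_mismatch=0.004):
    """One Monte-Carlo draw of threshold voltage (V) for two 'matched' devices.

    Global (process) variation shifts both devices together, while local
    (mismatch) variation perturbs each device independently.
    """
    global_shift = random.gauss(0.0, sigma_process)
    vth_a = vth_nom + global_shift + random.gauss(0.0, sigma_mismatch)
    vth_b = vth_nom + global_shift + random.gauss(0.0, sigma_mismatch)
    return vth_a, vth_b

pairs = [sample_vth_pair() for _ in range(10000)]
# Mismatch appears as a device-to-device offset; the common process shift cancels.
offset_sigma = statistics.stdev(a - b for a, b in pairs)
# The absolute Vth spread is dominated by the much larger global process term.
vth_sigma = statistics.stdev(a for a, _ in pairs)
```

The offset spread (~sqrt(2) × sigma_mismatch) is an order of magnitude smaller than the absolute Vth spread, which is why differential circuits tolerate global process shift but remain sensitive to local mismatch.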

It is technically feasible for foundries to align worst-case corners with the specification limits of the statistical models, and with either sigma- or design-of-experiments-based sampling, the line between these techniques has blurred. What hasn’t changed is the fact that no matter which technique you choose, neither will yield any statistical insight into the sources of variation or how the various circuits interact with respect to their impact on performance. While others recognize the issues [21], they take a very data-, simulation- and modeling-intensive approach to the problem. While such approaches certainly merit further investigation, they do not provide a practical everyday solution for RF designers. Instead, what’s required is a fast and accurate statistics-based variational analysis that can also give the insight that traditional Monte-Carlo process, mismatch, and corner analysis cannot. To ensure alignment with regular Monte-Carlo, it should also use the same statistical Spice models. This would fill the gap between traditional circuit- and block-level design and full-fledged yield analysis.

An example of the insight that such an analysis [22] could provide is shown in Figures 5 and 6. Here, the Harmonic Balance-based output voltage of a receiver down-converter design, including low-noise amplifier (LNA), mixer, and bias circuit, was analyzed using a fast yield contributor and mismatch analysis technique. The important metric here is not the amount of variation contributed by a block at the output, nor the correlation between the two, but the product of the standard deviation and the correlation. In effect, this is the impact that the various blocks have on the overall output variation for the measure of interest. The longer the bar, the greater the impact. One important item to note is that different blocks can drive the variation in different directions. You can see in Figure 5, for example, that the mixer block’s negative correlation shows up in the results. This is an important insight to understand when taking the next step and trying to reduce the effective output variation (when necessary).
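The signed impact metric described above, a block’s standard deviation times its correlation with the output, can be reproduced from ordinary Monte-Carlo samples. The per-block contributions below are synthetic and invented purely to mirror the qualitative story of Figure 5, including a mixer that pulls the output the opposite way:

```python
import random
import statistics

random.seed(7)

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    return cov / (statistics.stdev(xs) * statistics.stdev(ys))

# Synthetic per-block contributions to the down-converter output voltage (mV).
n = 5000
contribs = {
    "LNA":   [random.gauss(0.0, 3.0) for _ in range(n)],
    "mixer": [random.gauss(0.0, 2.0) for _ in range(n)],
    "bias":  [random.gauss(0.0, 0.5) for _ in range(n)],
}
# The mixer enters with opposite sign, giving it a negative output correlation.
total = [l - m + b
         for l, m, b in zip(contribs["LNA"], contribs["mixer"], contribs["bias"])]

# Impact = (block standard deviation) x (correlation with the output).
impacts = {name: statistics.stdev(c) * pearson(c, total)
           for name, c in contribs.items()}
```

For independent contributions the signed impacts sum to the total output standard deviation, which is why a negatively correlated block (the mixer here) shows up with a bar pointing the other way.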

Figure 5. The block-level impact of process variations with fast yield contributor analysis.

In addition to block-level insight, this technique can also provide the same kind of insight all the way down to the device level as shown in Figure 6. Since the device model parameter link to process is also available, the future may bring a solution for circuit-dependent corner generation.

Figure 6. The mixer device-level impact of process variation with fast yield contributor analysis.

This technique is an important tool for understanding sources of variation, derived using the standard statistical Spice models for global variation effects. It does not, however, completely replace corner or Monte-Carlo process analysis, as those represent large global variational effects. For mismatch analysis, where we are dealing with local variations only, this technique can be a direct replacement, offering orders-of-magnitude improvement in speed. Since most designers perform mismatch and corner analysis together, this can reduce the overall analysis time dramatically. And, because this fast statistical variation and mismatch technique is applicable at the circuit, block, and functional path level, it gives RF and mixed-signal designers a useful tool that can be used at any stage of the design.

Conclusion

This article has described the various sources and inputs needed for RFIC verification and design-for-yield, the current status of the various parts of the RFIC design ecosystem, and their strengths and weaknesses. Finally, it gave an overview of a few of the key analyses needed at the RFIC design level to overcome the challenges of verifying highly integrated, high-performance, low-power, low-cost, and high-volume RFICs. This is a very complex topic, and while this article only scratches the surface, I believe it provides a good overview of where we are, the choices RF designers have in verification and yield methodology, and the areas that need improvement.

References
[1] SystemVue RF System Design Kit.
[2] D. E. Root, “X-parameters: Commercial implementations of the latest technology enable mainstream applications,” Microwave Journal, September 10, 2009, http://www.mwjournal.com/resources/ExpertAdvice.asp?page=0&HH_ID=RES_200.
[3] Using Analog/RF X-ParameterModels in System-Level Design http://www.home.agilent.com/agilent/product.jspx?cc=US&lc=eng&ckey=1455073&nid=-34261.804610.00&id=1455073.
[4] http://www.agilent.com/find/nvna.
[5] http://www.htc.com/us/products/evo-sprint#overview and http://www.apple.com/iphone/.
[6] TECHINSIGHTS, Apple iPhone 4 Teardown http://www.ubmtechinsights.com/reports-and-subscriptions/outlook-and-analysis/apple-iphone-4/teardown/.
[7] http://www.cadence.com/products/pkg/Pages/default.aspx.
[8] Si2 Open Access Coalition, http://www.si2.org/?page=69.
[9] http://www.si2.org/?page=86.
[10] Interoperable PDK Libraries, http://www.iplnow.com/index.php.
[11] Si2 Open PDK Coalition, http://www.si2.org/?page=1118.
[12] http://www-device.eecs.berkeley.edu/~bsim3/BSIM4/BSIM400/slide/slide.pdf.
[13] A.J. Scholten, G.D.J. Smit, B.A. De Vries, L.F. Tiemeijer, J.A. Croon, D.B.M. Klaassen, R. van Langevelde, X. Li, W. Wu, and G. Gildenblat, “The new CMC standard compact MOS model PSP: advantages for RF applications,” IEEE Journal of Solid-State Circuits, Vol. 44, No. 5, May 2009, pp. 1415-1424.
[14] “TSMC Unveils New 65-Nanometer Mixed-Signal and RF Tool Qualification Program,” http://www.tsmc.com.tw/tsmcdotcom/PRListingNewsAction.do?action=detail&newsid=2426&newsdate=2007/12/13.
[15] “Agilent Technologies Announces United Microelectronics Corporation Certification of GoldenGate Software,” http://www.agilent.com/about/newsroom/presrel/2010/04jun-em10082.html.
[16] Wikipedia, Process Capability Index, http://en.wikipedia.org/wiki/Process_capability_index.
[17] Process Capability Indices – Visual Animation, http://elsmar.com/Cp_vs_Cpk.html.
[18] http://www.agilent.com/find/eesof-deslibs and http://www.agilent.com/find/eesof-goldengate.
[19] E. Ngoya, R. Larcheveque, “Envelop transient analysis: a new method for the transient and steady state analysis of microwave communication circuits and systems,” IEEE MTT-S International Microwave Symposium, pp. 1365-1368, June 1996.
[20] A.Soury, E. Ngoya, “Using Sub-Systems Behavioral Modeling to Speed-up RFIC Design Optimizations and Verification,” Integrated Nonlinear Microwave and Millimetre-Wave Circuits, 2008. INMMIC 2008.
[21] Amit Gupta, “Variation in custom ICs: It’s not just a foundry issue,” Chip Design Magazine, http://chipdesignmag.com/display.php?articleId=4172.
[22] David Vye, “Accelerating Advanced Node CMOS RFIC Design,” Microwave Journal, December 7 2009, http://www.mwjournal.com/search/article.asp?HH_ID=AR_8441.
[23] David Vye, “How Design Software Changed the World,” Microwave Journal, July 2009, http://mwjournal.com/search/article.asp?HH_ID=AR_7817.