Test and Measurement

Agilent Offers Market Perspective and Identifies Five Future Trends

May 27, 2010

As 2010 enters its halfway point, Greg Peters, Vice President and General Manager of the Component Test Division of Agilent Technologies, offered a perspective on the current state of the test and measurement market and identified five current and future trends in electronic design and characterization under the headings – Investments Move from Core to Edge, Bifurcation of the Electronic Food Chain, Modeling the Real World, Lowering the Cost of Test, and Nanoscale is the New Frontier.

He began by giving his general market perspective, saying, “At last year’s IMS in Boston we felt that we had just reached the bottom of the downturn. That was a correct call because starting in about July of last year our business started to pick back up as customers decided that they needed to reinvest in a limited sense in the industry and in their own capacity to deliver to their customers. At that time most of that activity came from R&D. However, we started to see some manufacturing orders come in, especially from Asia, and that began in August. So, we have been on an upswing since July and August and we have seen that upswing continue in the second quarter of 2010 (ended May 1).

“The good news is that the electronics economy and the industry have recovered to about 85 percent of the peak in 2008. We have not fully recovered but we have come up fairly sharply from what was a very sharp downturn. However, the indications going forward are that we are going to see continued volatility.”

Focusing on the supply chain and particularly the semiconductor sector, Peters stated: “Semiconductor customers are still very cautious about capital equipment spending and expansion. In general there are very few semiconductor companies that are choosing to expand at this point. Those that have the strongest market position, are well capitalized and secure about their future are the ones that are continuing to invest.”

Against this background, the first trend that Peters identified was the move from the core to the edge, with his definition of an edge device being anything that touches the real world.

He explained, “Back in the 1980s the investment was in personal computers. The big computer companies were investing in new servers to move away from the mainframe to distribute to the desktop, etc. That was the first set of trends that moved from the core to the edge. In the 1990s cellphones took off, which was the next step.”

He continued, “What we are seeing now is that edge devices of all sorts are really starting to attract attention and the investment dollars of the venture capitalist community. These are devices that measure the real world yet at the same time are connected to the internet.

“With the development of such devices comes the need to measure them in operation, particularly in the RF range and to be able to monitor the signals and understand what is going on. Agilent’s investment in this area is in handheld products. What we are doing is taking the best of our bench equipment and putting it into handheld form factors that meet the needs of those customers that are working at the edge.”

Moving on to consider bifurcation of the electronics industry, Peters concentrated on the very high performance needs of those customers for whom electronics is the core contribution.

He gave an example of the IBM computer that is used for localized weather monitoring, saying, “This is an example of where electronics is the core contribution. That particular IBM computer is full of high performance silicon and high speed interconnects. Not surprisingly such companies are looking at optical interconnects, whether that is board-to-board at present or chip-to-chip in the future. This is where a lot of investment in the high end is going. For such customers what is needed is best in class, very high performance test equipment.”

He then presented the range of products that were on the show floor that addressed these requirements – X-series spectrum analyzers, the new X-series oscilloscope, a high performance signal generator, MIMO fading signal generator, etc.

Peters then considered “when electronics augments the core contribution”. He said, “What is needed is good performance general purpose products targeted at specific applications backed up by services and support from the test and measurement vendor. We are seeing this bifurcation happening, with the number of customers in the middle general purpose pool declining and migrating to the high end, which is not a surprise in a mature industry.”

With respect to modeling the real world, the third trend that Peters identified was the transition of RF measurements from the linear to the nonlinear. He said, “We have been building X-parameter technology to help our customers understand, model and simulate with regard to measuring in the real world, which is not linear.

“We are continuing to develop X-parameters, which are open and published. A growing number of manufacturers have approached us to use the trademarked name, go through the validation process and support X-parameters. This is a developing trend – over the last year two vendors, Avago Technologies and Skyworks, have publicly announced that they support X-parameters, and over the next six months we will see more and more component vendors doing the same.”
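For readers unfamiliar with the formalism, X-parameters generalize S-parameters to large-signal, nonlinear operation. A simplified sketch of the published relation (background added here, not part of Peters’ remarks) expresses the scattered wave B at port p and harmonic m as a function of all incident waves A at ports q and harmonics n:

```latex
% Simplified X-parameter relation (polyharmonic distortion model);
% it reduces to ordinary S-parameters in the small-signal limit.
% P = e^{j\phi(A_{11})} carries the phase of the large-signal drive A_{11}.
B_{pm} = X^{F}_{pm}\!\left(|A_{11}|\right) P^{m}
       + \sum_{(q,n)\neq(1,1)} X^{S}_{pm,qn}\!\left(|A_{11}|\right) P^{\,m-n} A_{qn}
       + \sum_{(q,n)} X^{T}_{pm,qn}\!\left(|A_{11}|\right) P^{\,m+n} A^{*}_{qn}
```

The X^F, X^S and X^T terms are measured functions of drive level rather than constants, and the conjugate (X^T) terms capture the nonlinear mixing behavior that linear S-parameters cannot describe.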

The next trend Peters addressed was lowering the cost of test, which, although not a new issue, is accelerating as a result of the economic downturn and particularly the need to get fixed costs out of organizations. He said, “A large portion of fixed costs revolves around test development time, the required expertise to write test programs, maintenance, etc. One of the approaches we are taking is to develop the PNA-X Rack in a Box, which takes a number of measurements that might have been made with a rack of equipment in the past and puts them in a box. The benefits are integration, speed, measurement accuracy, etc., but the additional benefit is that we can now put in the box the measurement capability that used to be written by the test engineer.

“What we are doing is taking measurements and algorithms, putting them in the box, validating them and making them traceable back to standards, so that customers not only have confidence in the measurement but they can also reduce their test development time.”

Elaborating on ‘nanoscale activity,’ he focused particularly on semiconductors and cited recent announcements from both Intel and TSMC that their latest nodes are going to be in the 10 to 15 nm range.

He used an illustration to demonstrate the operation of a scanning microwave microscope, which combines the power of a vector network analyzer with an atomic force microscope, and emphasized the increasing importance of being able to examine semiconductor dimensions.

Peters commented, “This is becoming increasingly significant because as the structures get smaller the channels get narrower and all of the dimensions become more important. You will see this in microwave in the future as microwave is a great tool to be applied to these types of measurements. You can achieve the resolution required and can augment other tools to scan devices and ascertain what’s going on beneath the surface.”

He concluded, “Nano is certainly a hot topic for the future, particularly for Agilent, and is also the bridge into the Life Sciences side of our business.”


Recent Articles by Richard Mumford

