“Increasingly, the functionality of complex devices is being defined by the software embedded in them. This is challenging for many test engineers because most standalone instruments cannot change their functionality as fast as the device under test (DUT) changes, owing to the fixed user interface and firmware that must be developed and embedded in the instrument. Thus test engineers are turning to a software-defined approach to instrumentation, so that they can quickly customise their equipment to meet specific application needs and integrate testing directly into the design process,” says Eric Starkloff, director of NI Test Product Marketing.
Thakare shares two major advantages of software-defined instrumentation: “First, it can dramatically reduce the number of hardware components in all the mixed-signal designs, which means smaller chip size for system-on-chip implementation. Second, it can provide automatic adjustment or compensation for circuit component variations due to temperature dependence, ageing and manufacturing tolerances.”
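To make the second point concrete, here is a minimal Python sketch of the idea rather than code from any particular instrument: a temperature-dependent gain correction applied entirely in software. The function name, the 25°C reference and the calibration coefficients are hypothetical placeholders.

```python
# Illustrative sketch: software compensation for component drift.
# The calibration coefficients below are hypothetical placeholders,
# e.g. values that a factory characterisation run might produce.

def compensate(raw_reading, temperature_c):
    """Correct a raw measurement for temperature-dependent gain drift."""
    GAIN_AT_25C = 1.000
    GAIN_DRIFT_PER_C = -0.0004   # fractional gain change per degree Celsius

    gain = GAIN_AT_25C + GAIN_DRIFT_PER_C * (temperature_c - 25.0)
    return raw_reading / gain

# The same raw reading is re-interpreted as conditions change,
# with no change to the measurement hardware itself.
print(compensate(1.2345, 25.0))   # nominal conditions
print(compensate(1.2345, 60.0))   # hot enclosure: gain correction applied
```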
[stextbox id="info" caption="Test engineering: A strategic asset"]
According to NI’s recent global survey of test engineering leaders, a shift has emerged within electronics manufacturing firms, which now use product test for competitive differentiation. This shift culminates in the elevation of the test engineering function to a strategic asset for the company. The test leaders surveyed said that their primary goal over the next few years was to reorganise their test organisation structures for enhanced efficiency.
(Source: NI Automated Test Outlook 2012)
[/stextbox]
Software-defined instrumentation looks set to become an essential component of scalable, high-performance test systems. Singh agrees: “We predict a bright future for software-defined instrumentation. Software-defined instruments, also known as virtual instruments, are modular hardware with user-defined software, giving the flexibility to combine standard and user-defined measurements with custom data processing on common hardware components. This flexibility helps electronic devices like advanced navigation systems and communication devices like smartphones integrate diverse capabilities and adopt new communication standards.”
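As a rough illustration of that flexibility, the Python sketch below derives a standard measurement (RMS level) and a user-defined one (the fraction of signal power near 10kHz) from a single acquired record. The NumPy-generated samples and the chosen sample rate are stand-ins for whatever a real modular digitiser would return.

```python
# Illustrative sketch, not vendor API code: one acquired record feeds both a
# standard measurement and a user-defined one, which is the essence of the
# software-defined approach. Synthetic samples stand in for driver output.
import numpy as np

fs = 1_000_000                     # sample rate in Hz (hypothetical)
t = np.arange(4096) / fs
samples = np.sin(2 * np.pi * 10_000 * t) + 0.01 * np.random.randn(t.size)

# Standard measurement: RMS level.
rms = np.sqrt(np.mean(samples ** 2))

# User-defined measurement: fraction of power in a narrow band around 10kHz,
# implemented entirely in software against the same data.
spectrum = np.abs(np.fft.rfft(samples)) ** 2
freqs = np.fft.rfftfreq(samples.size, d=1 / fs)
band = (freqs > 9_000) & (freqs < 11_000)
band_fraction = spectrum[band].sum() / spectrum.sum()

print(f"RMS = {rms:.3f}, fraction of power near 10kHz = {band_fraction:.3f}")
```

Swapping the user-defined block for a different analysis requires only a software change; the acquisition hardware stays the same.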
5. Use of multicore and parallel test systems
As the complexity and functionality of electronic devices grow exponentially (in sync with Moore’s law), so does the cost of testing them. Minimising the cost of test can be challenging, but one way is to test more with less. The inherent parallelism of graphical programming environments such as LabVIEW from National Instruments and FlowStone DSP from DSP Robotics lets engineers benefit from multicore processors immediately, without the complexity of managing threads in traditional text-based languages.
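Graphical dataflow cannot be reproduced in print, but the underlying idea of spreading independent DUT tests across processor cores can be sketched in any language. In the Python example below, test_one_dut() is a hypothetical placeholder for a real measurement routine.

```python
# Illustrative sketch: run independent DUT tests in parallel so a multicore
# test controller is kept busy. test_one_dut() is a hypothetical stand-in.
from concurrent.futures import ProcessPoolExecutor
import time

def test_one_dut(dut_id):
    """Pretend to exercise one device under test."""
    time.sleep(0.5)               # placeholder for acquisition and analysis
    return dut_id, "PASS"

if __name__ == "__main__":
    start = time.perf_counter()
    # Each DUT test runs in its own process, so independent tests use
    # separate cores instead of queueing behind one another.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(test_one_dut, range(8)))
    print(results)
    print(f"Elapsed: {time.perf_counter() - start:.2f} s")
```

On a controller with eight or more cores, the eight tests finish in roughly the time of a single test rather than eight run back to back, which is the ‘test more with less’ payoff.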
The trend of increasing clock speed to get better performance ended back in the early 2000s. Since then, processor manufacturers have turned to other technologies to ramp up performance while keeping clock speeds around 3 GHz: multiple cores on a single chip, hyperthreading, wider buses and HyperTransport. Moreover, the advance of process technology to the current 22nm node, using 3D transistors, has resulted in significantly faster, leaner and more efficient processors for use in embedded controllers and modular instrumentation.
Denver D’Souza, senior technical consultant at National Instruments India, says, “The reality that transistor density doubles every 18 months has led to significant advances in the performance of electronic devices. This is evident not only in the latest Intel Core i7 processors but also in the shrinking of technology such as 64GB solid-state drives, which are now the size of a postage stamp. These technological advances translate into considerable cost reductions.”
6. Merging of EDA tools and hardware test platforms
The extremely competitive environment in which electronics companies now operate is evident in how next-generation communication protocols barely become standards before products appear in the market. For instance, Broadcom has already brought out 802.11ac solutions even though the standard is yet to be ratified. In situations like these, companies go all out to get a jump-start on the competition, and what better way to do this than to merge design and test in order to accelerate ‘time to market’.
Adesh Jain, applications consultant at Agilent Technologies, explains why the traditional method is slow: “Traditionally, for any complete electronic product to be ready for the market, each component of the complete system is first designed and verified with EDA tools, then prototypes are fabricated and tested, before the final product is released to the market. If discrepancies are found in the hardware at later stages, the whole cycle has to be repeated, which would result in loss of time as well as money for any organisation.”
Proper verification at earlier stages reduces this time and effort to a great extent. The tests, specifications, algorithms and plots used in the early EDA stages are the same ones applied on the test bench, so prototype measurements can be compared directly against simulation results. The aim is to merge the two worlds and see whether streamlining the flow can save design engineers’ time, improving productivity while reducing the time taken to get the product to market.
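One way to picture this reuse, purely as an illustration, is a single pass/fail limit applied to both exported simulation data and bench measurements. The ripple limit and the data arrays in the Python sketch below are made up; in practice they would come from the EDA tool’s export and the instrument’s readings.

```python
# Illustrative sketch: design and test share one definition of "good".
# The limit and the arrays below are hypothetical placeholders.
import numpy as np

SPEC_MAX_RIPPLE_DB = 1.0   # hypothetical passband ripple limit

def passes(response_db):
    """Apply the shared specification to any frequency response."""
    ripple = response_db.max() - response_db.min()
    return ripple <= SPEC_MAX_RIPPLE_DB

simulated_db = np.array([-0.1, 0.2, 0.3, -0.2, 0.1])   # exported from EDA
measured_db = np.array([-0.3, 0.4, 0.5, -0.4, 0.2])    # read on the bench

print("Simulation passes:", passes(simulated_db))
print("Measurement passes:", passes(measured_db))
```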