Big Data in Test and Measurement

“Differentiation is no longer about who can collect the most data; it’s about who can quickly make sense of the data they collect”—Measurements Outlook 2013, National Instruments Corporation.

Challenges encountered with the ‘Big’ in Big Data
There was a time when hardware sampling rates were limited by the speed of analogue-to-digital conversion, physically restricting how much data could be acquired. Today, hardware is no longer the limiting factor in acquisition applications; the challenge has shifted to managing the data once it has been acquired.

As it becomes both necessary and easier to capture large amounts of data at high speeds, engineers will inevitably face challenges in creating end-to-end solutions, which demand a close relationship between automated test and IT equipment. This is driving test and measurement system providers to partner with IT providers to offer bundled, integrated solutions for automated test applications.

Contextual data mining. Data mining is the practice of using the contextual information saved along with data to search through and pare down large data sets into more manageable, relevant volumes. Storing raw data alongside its original context, or ‘metadata,’ makes it easier to accumulate, locate and, later, manipulate and understand. This context is one of the biggest aids to analysing the collected data, as the sketch below illustrates.
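As a minimal sketch of the sidecar-metadata idea (in Python, with illustrative file formats and field names, not any vendor's actual scheme), each acquisition below is saved with a JSON description of its context, which can later be mined to pare the archive down:

```python
import json
import time
from pathlib import Path

def save_with_metadata(samples, directory, **context):
    """Save raw samples next to a JSON 'sidecar' file describing their context."""
    directory = Path(directory)
    directory.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%dT%H%M%S")
    (directory / f"{stamp}.csv").write_text("\n".join(str(s) for s in samples))
    meta = {"timestamp": stamp, "num_samples": len(samples), **context}
    (directory / f"{stamp}.json").write_text(json.dumps(meta, indent=2))

def find_runs(directory, **criteria):
    """Mine the metadata: yield every run whose context matches the criteria."""
    for sidecar in Path(directory).glob("*.json"):
        meta = json.loads(sidecar.read_text())
        if all(meta.get(k) == v for k, v in criteria.items()):
            yield meta

# Tag each acquisition with its test context (field names are hypothetical),
# then pare the archive down to just the runs of interest.
save_with_metadata([0.12, 0.48, 0.41], "runs", dut="amp-7", temperature_c=85)
hot_runs = list(find_runs("runs", temperature_c=85))
```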

Intelligent data acquisition nodes. Though it is common to stream test data to a host PC over standard buses like Ethernet and USB, high-channel-count measurements with fast sampling rates can easily overload the communication bus. An alternative approach is to store data locally and transfer the files for post-processing after a test is run, but this increases the time taken to reach useful results. To overcome these hurdles, the latest measurement systems integrate leading technology from Intel, ARM and Xilinx to offer increased performance and processing capabilities, along with off-the-shelf storage components that provide high-throughput streaming to disk.
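One simple way to picture high-throughput streaming to disk is a bounded producer-consumer buffer that decouples acquisition from the disk. The sketch below assumes a hypothetical acquire_block() driver call and substitutes fixed bytes for it:

```python
import queue
import threading

def writer(buf: queue.Queue, path: str) -> None:
    """Consumer thread: drains buffered blocks to disk so acquisition never stalls."""
    with open(path, "wb") as f:
        while True:
            block = buf.get()
            if block is None:        # sentinel: acquisition has finished
                break
            f.write(block)

buf: queue.Queue = queue.Queue(maxsize=64)   # bounded buffer absorbs bursts
t = threading.Thread(target=writer, args=(buf, "capture.bin"))
t.start()

for _ in range(1000):
    # acquire_block() is hypothetical; it stands in for the driver call that
    # returns the next block of digitised samples. Placeholder bytes used here.
    buf.put(b"\x00" * 4096)
buf.put(None)                                # tell the writer to finish
t.join()
```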

U4431A MIPI M-PHY protocol analyser

Ensuring interoperability. With on-board processors, the intelligence of measurement systems has become more decentralised, with processing elements moving closer to the sensor and the measurement itself.

Mohanram shares, “Advanced data acquisition hardware includes high-performance multi-core processors that can run acquisition software and processing-intensive analysis algorithms inline with the measurements. These measurement systems are so intelligent that they can analyse and deliver results more quickly, without waiting for large amounts of data to transfer or having to store it all in the first place. This optimises the system’s ability to use disk space efficiently.”
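The pay-off of inline processing can be sketched simply: the node reduces each block of samples to a summary statistic (RMS here) and forwards only that, so the raw samples never have to cross the bus or occupy the disk. The block size and reduction ratio below are illustrative:

```python
import math

def rms(block):
    """Reduce a block of raw samples to a single figure of merit on the node."""
    return math.sqrt(sum(x * x for x in block) / len(block))

# Instead of shipping all 4096 raw samples to the host, the node forwards
# one summary value per block: a 4096:1 reduction in what crosses the bus.
raw_block = [math.sin(2 * math.pi * i / 64) for i in range(4096)]  # stand-in data
summary = rms(raw_block)   # ~0.707 for a unit-amplitude sine wave
```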

Bhatia adds, “The biggest challenge with these buses is ensuring interoperability between systems designed by different companies, and compliance with the design specifications. Standards bodies create compliance test specifications to ensure that interoperability and compliance.”
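In spirit, a compliance test boils down to checking measured behaviour against the limits a specification allows. The sketch below is a deliberately simplified limit-line check, not any standards body's actual procedure, and the limit values are made up:

```python
def check_limits(samples, low, high):
    """Simplified limit-line test: pass only if every sample stays inside the
    band the specification allows; report violating indices otherwise."""
    violations = [i for i, s in enumerate(samples) if not (low <= s <= high)]
    return len(violations) == 0, violations

# A device under test whose third reading breaches a 1.2 V upper limit:
passed, bad = check_limits([0.2, 0.8, 1.3, 0.5], low=0.0, high=1.2)
# passed is False; bad == [2]
```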

Breaking the resolution barrier. Today, hardware vendors have accelerated data collection rates to such an extent that engineers and scientists have rapidly broken through rate and resolution barriers, triggering a new wave of data-management consequences.

Bhatia shares, “With data rates going up, validation and compliance procedures have become more complex, as jitter budgets become very tight. Higher data rates have also pushed oscilloscope bandwidth requirements higher and created the need for faster BERTs in the future.”
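To see why tighter jitter budgets complicate validation, consider the widely used dual-Dirac model, which extrapolates total jitter at a target bit error ratio from its deterministic and random components. The data rate, budget fraction and component values below are illustrative, not drawn from any particular standard:

```python
from statistics import NormalDist

def total_jitter(dj_pp, rj_rms, ber=1e-12):
    """Dual-Dirac estimate: TJ(BER) = DJ(pk-pk) + 2 * Q(BER) * RJ(rms),
    where Q is the Gaussian quantile for the target bit error ratio."""
    q = NormalDist().inv_cdf(1 - ber)          # Q ~= 7.03 at BER = 1e-12
    return dj_pp + 2 * q * rj_rms

# Illustrative budget check at 8Gbit/s (unit interval UI = 125 ps):
tj = total_jitter(dj_pp=30e-12, rj_rms=1.5e-12)   # ~51 ps total jitter
margin = 0.3 * 125e-12 - tj                       # negative: a 0.3 UI budget is blown
```

Even 1.5 ps of rms random jitter costs roughly 21 ps at a 10^-12 bit error ratio, which is why small increases in data rate can push a design out of budget.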

Mohanram paints a slightly different picture: “Advancements in computing technology, including increasing microprocessor speed and hard drive storage capacity, combined with decreasing costs of hardware and software have caused an explosion of data coming in at a blistering pace. In measurement applications in particular, engineers and scientists can collect vast amounts of data every second of every day.”

‘Big’ is here to stay, deal with it
Technology research firm IDC recently conducted a study of digital data, covering the world’s measurement files, video, music and many other file types. It estimates that the amount of data in the world is doubling every two years, a growth rate that mimics one of electronics’ most famous laws: Moore’s law. Forty-eight years after it was formulated, Moore’s law still influences many aspects of IT and electronics. If digital data production continues to follow this trajectory, success as an organisation will hinge on how quickly acquired data can be turned into useful knowledge.
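The arithmetic behind that doubling is worth making explicit, since compounding every two years multiplies volume thirty-two-fold in a decade. The starting figure below is purely illustrative:

```python
def projected_volume(current, years, doubling_period=2.0):
    """Volume after `years` of growth that doubles every `doubling_period` years."""
    return current * 2 ** (years / doubling_period)

# Doubling every two years compounds quickly: a 1 PB archive today
# becomes 2**5 = 32 PB within a decade.
print(projected_volume(1.0, 10))   # 32.0
```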

The Big Data phenomenon adds new challenges to data analysis, search, integration, reporting and system maintenance, all of which must be met to keep pace with the exponential growth of data. The sources of data are many, but data derived from the physical world is among the most interesting to engineers and scientists. This is analogue data that is captured and digitised; hence it can be called ‘Big Analogue Data.’

