
What’s Happening in Geological T&M


In-situ tests

In-situ tests can greatly increase the volume of geomaterial investigated at a foundation site and reduce costs compared with sampling and lab testing. Historically, these tests have been developed to evaluate specific parameters for geotechnical design. Some tests, such as the plate load test and pile load test, measure the response to a particular type of load. These tests verify design assumptions and help to determine soil or rock properties by inversion.

Standard penetration test (SPT). The standard penetration test is an in-situ dynamic penetration test primarily designed to provide information on the geotechnical engineering properties of soil. The main information gained from this test pertains to the relative density of granular deposits, such as sand and gravel, from which it is virtually impossible to obtain undisturbed samples. The main reasons for its widespread use are its relatively low cost and simplicity. The usefulness of SPT results depends on the soil type: fine-grained sands give the most useful results, while coarser sands and silty sands give reasonably useful results.
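To illustrate how raw SPT blow counts are typically turned into a design-ready number, here is a minimal Python sketch. The energy and overburden corrections and the density bands are common textbook correlations (for example, the Liao and Whitman overburden factor and the Terzaghi and Peck density descriptions), assumed here purely for illustration and not prescribed by this article.

import math

PA_KPA = 100.0  # atmospheric pressure, kPa

def corrected_blow_count(n_field, energy_ratio_pct, sigma_v_eff_kpa):
    """Return (N1)60: field N corrected for hammer energy and overburden."""
    n60 = n_field * energy_ratio_pct / 60.0             # energy correction to the 60 per cent standard
    cn = min(math.sqrt(PA_KPA / sigma_v_eff_kpa), 1.7)  # overburden correction (assumed Liao-Whitman form)
    return cn * n60

def density_description(n1_60):
    """Qualitative relative-density label for granular soil (assumed textbook bands)."""
    if n1_60 < 4:
        return "very loose"
    if n1_60 < 10:
        return "loose"
    if n1_60 < 30:
        return "medium dense"
    if n1_60 < 50:
        return "dense"
    return "very dense"

# Example: 18 blows, 75 per cent hammer energy, 90 kPa effective overburden (illustrative values)
n1_60 = corrected_blow_count(18, 75, 90)
print(round(n1_60, 1), density_description(n1_60))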


Dynamic cone penetrometer (DCP). In this in-situ test, a weight is manually lifted and dropped on a cone that penetrates the ground. The penetration depth (in millimetres) per blow is recorded and used to estimate certain soil properties.
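As a rough illustration of how a DCP log is used, the short Python sketch below averages the penetration per blow and converts it to an approximate California Bearing Ratio. The correlation CBR = 292/DPI^1.12 is a widely quoted empirical relation (US Army Corps of Engineers); it is assumed here for illustration only, and other correlations are equally common.

def penetration_index(depths_mm):
    """Average penetration per blow (mm/blow) from cumulative depth readings."""
    increments = [b - a for a, b in zip(depths_mm, depths_mm[1:])]
    return sum(increments) / len(increments)

def estimated_cbr(dpi_mm_per_blow):
    """Rough CBR estimate from the DCP index (assumed empirical correlation)."""
    return 292.0 / dpi_mm_per_blow ** 1.12

# Example: cumulative penetration recorded after each blow (illustrative values)
log_mm = [0, 8, 17, 25, 34, 42]
dpi = penetration_index(log_mm)
print(round(dpi, 1), "mm/blow ->", round(estimated_cbr(dpi), 1), "per cent CBR (approx.)")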

Cone penetration test (CPT). This in-situ test is performed using an instrumented probe with a conical tip, pushed into the soil hydraulically at a constant rate. Early applications of the CPT mainly determined the bearing capacity of soil. The original cone penetrometers involved simple mechanical measurements of the total resistance to pushing a tool with a conical tip into the soil. The latest electronic cones also employ a pressure transducer with a filter to gather pore water pressure data.
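A minimal sketch of two quantities routinely derived from an electronic cone's readings is given below: the pore-pressure-corrected tip resistance qt = qc + u2(1 - a) and the friction ratio Rf. The cone area ratio and the readings are assumed illustrative values, not data from any particular sounding.

def corrected_tip_resistance(qc_mpa, u2_mpa, area_ratio=0.8):
    """Tip resistance corrected for pore pressure acting behind the cone tip."""
    return qc_mpa + u2_mpa * (1.0 - area_ratio)

def friction_ratio(fs_mpa, qt_mpa):
    """Sleeve friction as a percentage of corrected tip resistance."""
    return 100.0 * fs_mpa / qt_mpa

# Illustrative readings: qc = 2.5 MPa, u2 = 0.30 MPa, fs = 0.045 MPa
qt = corrected_tip_resistance(qc_mpa=2.5, u2_mpa=0.30)
print(round(qt, 2), "MPa, Rf =", round(friction_ratio(0.045, qt), 2), "per cent")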


Studies estimate that the amount of data being created is doubling every two years. A small data set often limits the accuracy of conclusions and predictions. A classic example is that of a gold mine where only 20 per cent of the gold is visible; analysing Big Data is akin to finding the remaining 80 per cent, which is in the dirt, hidden from view. This analogy gives rise to the term ‘digital dirt,’ which means that digitised data, more often than not, has concealed value. Hence, Big Data analytics, i.e., data mining, is required to uncover new insights.

“In test and measurement field, data can be acquired at astronomical rates (as high as several terabytes per day). Big Analog Data issues are growing challenges for automated test and analysis systems. When there are many devices under test, distributed automated test nodes (DATNs) are needed, which are often connected to computer networks in parallel. Since DATNs are effectively computer systems with software drivers and images, the need arises for remote network-based systems management tools to automate their configurations, maintenance and upgrades,” explains a whitepaper on the issue.

The volume of test and measurement data is fuelling a growing need in global companies to offer access to this data to many more engineers than in the past. This requires network gear and data management systems that can accommodate multi-user access, which, in turn, drives the need to geographically distribute the data and its access. A growing approach to providing this distributed data access is the use of cloud technologies.
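As a hedged sketch of what such cloud-based distribution can look like in practice, the snippet below pushes one acquired data file to an object store using the standard boto3 S3 client; the bucket name, prefix and file path are hypothetical placeholders and do not refer to any vendor's actual solution.

import boto3

def publish_measurement(local_path, bucket="example-test-data", prefix="site-a"):
    """Upload one data file and return the object key other teams can fetch."""
    key = f"{prefix}/{local_path.rsplit('/', 1)[-1]}"
    boto3.client("s3").upload_file(local_path, bucket, key)  # bucket and prefix are hypothetical
    return key

# Example usage (assumes cloud credentials are already configured):
# publish_measurement("/data/cpt_run_042.csv")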

It is well known that Big Analog Data applications create a strong dependency on IT equipment such as servers, networking gear and storage. In addition, requisite software is needed to manage, organise and analyse the data. Thus, traditional IT technologies are being seen as part of the total solution after data capture, ensuring efficient data movement, archiving, and execution of analytics and visualisation for both in-motion and at-rest data.

Several vendors, viz., Averna, Virinco, National Instruments and OptimalTest, already offer solutions to help manage Big Analog Data. In order to analyse, manage and organise billions of data points from millions of files, engineers and scientists use software tools like NI’s DIAdem to quickly locate, load, visualise and report measurement data collected during data acquisition and/or generated during simulation. These tools are designed to meet the demands of today’s testing environments, in which quick access to, processing of, and reporting on large volumes of scattered data in multiple custom formats are required to make informed decisions. Such software helps interface the collected data with existing IT systems or with new servers that can be accessed globally, enabling faster decisions.
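DIAdem and similar vendor tools have their own interfaces; purely to convey the idea of bulk indexing of scattered measurement files, here is a plain Python sketch (not DIAdem's API) that walks a directory tree of CSV files and builds a small summary index so individual channels can be located quickly.

import csv
import os

def index_measurements(root):
    """Map each CSV measurement file under root to its row count and channel names."""
    index = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith(".csv"):
                continue
            path = os.path.join(dirpath, name)
            with open(path, newline="") as f:
                reader = csv.reader(f)
                header = next(reader, [])  # first row assumed to hold channel names
                rows = sum(1 for _ in reader)
            index[path] = {"channels": header, "rows": rows}
    return index

# Example usage:
# for path, info in index_measurements("/data/test_runs").items():
#     print(path, info["rows"], "rows,", len(info["channels"]), "channels")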

Reliability is always king
Needless to say, geotechnical investigations play a vital role in ensuring adequate performance of a structure. Geotechnical engineers are under pressure to develop reliable and economical designs for heavier loads and difficult soil conditions, which, in turn, demands strong performance from the test and measurement tools they use.

The need of the hour is to predict the behaviour of structures to a very high degree of reliability. This has driven advanced in-situ testing methods that predict behaviour more rationally and accurately, enabling the most stable and economical foundation designs. With increased dataflow set to become an industry default rather than a differentiator in the test and measurement domain, engineers have to work on solutions capable of collecting, processing and analysing huge amounts of data, helping geotechnical investigators gain useful insights.


The author is a tech correspondent at EFY Bengaluru
