
Improving Cyber-Security: Stringent Analysis for Software


A. Considering current trends, both in terms of applications and new regulations, it seems certain that almost all software will have to be subject to quite stringent analysis. Connectivity means that it is no longer obvious whether a simple piece of software doing something apparently innocuous can be compromised by an aggressor, perhaps by using it to access a more critical part of the system.

Q. You have mentioned that you expect there to be a lot of new users wanting to use your tools to secure and certify their systems. Do you foresee a situation mirroring that in the Test & Measurement sector, where the number of skilled people in software automation and test teams could not meet demand? In that case, the market response was to provide test-as-a-service through test centres.

A. This seems very likely to become a trend. We already do this ourselves for some of our customers who do not have the skills to negotiate the regulations by themselves – for large projects at the moment, but I can imagine it being scaled down for smaller firms in the consumer markets. I also suspect that other companies will enter this market as it develops.

Q. What are your views on whether consumer devices should be subjected to a similar level of certification as your traditional markets? After all, the advent of IoT has seen consumer products interacting with sensitive and critical equipment.

A. Although these devices may well function perfectly in isolation, a common problem we have witnessed is that insufficient thought has gone into the design of their interfaces. For example, suppose an infotainment head unit and a more critical control system are both placed on a common CAN bus in a car. If communications between the devices are tightly controlled, then it is difficult for an aggressor to compromise the critical system from the head unit. Too often in such situations, we find that very general interfaces are used and so that defence is gone. Clearly, such an approach does not represent best practice, meaning that it would be very difficult to defend in court.
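As a rough illustration of what "tightly controlled" communications might look like, the sketch below shows a gateway that forwards only a short whitelist of CAN identifiers from the infotainment side towards the critical bus and rejects everything else. The frame structure, identifier values and function names are invented for illustration; a production gateway would be built on the vehicle's actual CAN stack and message catalogue.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Minimal CAN frame representation (illustrative only; a real design
   would use the platform's CAN driver types). */
typedef struct {
    uint32_t id;       /* 11-bit standard identifier */
    uint8_t  dlc;      /* data length code, 0..8 */
    uint8_t  data[8];
} can_frame_t;

/* Hypothetical whitelist: the only identifiers the infotainment unit
   is ever allowed to send towards the critical bus. */
static const uint32_t allowed_ids[] = { 0x120u, 0x121u };

static bool id_is_allowed(uint32_t id)
{
    for (size_t i = 0u; i < sizeof(allowed_ids) / sizeof(allowed_ids[0]); i++) {
        if (allowed_ids[i] == id) {
            return true;
        }
    }
    return false;
}

static uint32_t dropped_frames = 0u;

/* Gateway policy: forward a frame only if its identifier is whitelisted
   and its length is plausible; drop (and count) everything else. */
bool gateway_forward(const can_frame_t *frame)
{
    if ((frame == NULL) || (frame->dlc > 8u) || !id_is_allowed(frame->id)) {
        dropped_frames++;
        return false;   /* rejected: never reaches the critical bus */
    }
    /* transmit_on_critical_bus(frame);  -- platform-specific, omitted */
    return true;
}
```

The point is not the specific identifiers but that the attack surface shrinks from "any message the head unit can construct" to a handful of explicitly reviewed ones.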

Q. With the impending rise of self-driving cars and the level of development work required to implement them, do you anticipate that their development cycle will be as lengthy as those in the aerospace sector ten years ago?


A. It will certainly have to be rigorous, and it’s even debatable whether existing best practices are going to be good enough. Because at the end of the day, self-driving cars could baulk at an unanticipated set of circumstances, do something silly, and kill people. The automotive industry has already witnessed the consequences of un-commanded acceleration in a conventional car – imagine that happening in the middle of town under computer control! We already have too much evidence of how effective cars can be as killing machines. So, I suspect that there is a great deal of work to be done.

Q. Any thoughts on how it would be possible to test semi-intelligent systems? Will we need something beyond today’s static and dynamic analysis?

A. The problem is combinatorics. It is necessary to consider the myriad of possible combinations of circumstances, and to be sure that they all work together in a satisfactory way. At present we don't consider combinatorics; we just look at circumstances in isolation. I don't think that is going to be good enough. So the effort required to provide convincing evidence that this software is acceptable is going to be enormous. Combinatorial mathematical techniques alone are unlikely to provide the answer, because understanding the requirements of these complex systems is stunningly difficult. Countries like India see a huge variety of vehicles and objects on the road, and I am not sure that a computer system will be able to handle them all.
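To make the combinatorics point concrete, the toy calculation below contrasts testing each input in isolation with testing every combination of inputs. The scenario (eight inputs, each abstracted to ten distinguishable states) is invented purely to show how quickly the exhaustive case outgrows the isolated one.

```c
#include <stdint.h>
#include <stdio.h>

/* Toy model: N independent inputs, each with K distinguishable states.
   Testing inputs in isolation needs roughly N * K cases; testing every
   combination needs K^N. The numbers are illustrative, not drawn from
   any real vehicle. */
int main(void)
{
    const unsigned inputs = 8u;    /* e.g. sensors/conditions considered */
    const unsigned states = 10u;   /* distinguishable states per input   */

    uint64_t isolated = (uint64_t)inputs * states;   /* 80 cases          */
    uint64_t combined = 1u;
    for (unsigned i = 0u; i < inputs; i++) {
        combined *= states;                          /* 10^8 = 100,000,000 */
    }

    printf("in isolation : %llu test cases\n", (unsigned long long)isolated);
    printf("combinatorial: %llu test cases\n", (unsigned long long)combined);
    return 0;
}
```

Even this crude model suggests why combinatorial test-generation techniques can only mitigate the problem when the requirements themselves are so hard to pin down.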

Q. Given that no connected software can be perfectly secure, what would your golden rules or guidelines be for engineers working on connected critical systems?

A. I think the problem is that outside the safety critical world, there remains a casual approach to software development in general. New regulations are likely to mean a requirement to demonstrate that you have applied best practice to minimize the potential for other people to be harmed as a result of your software being compromised, however indirectly. That is going to be very painful for the software industry.
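One concrete habit that falls under "best practice" here is to treat every externally reachable interface as hostile and validate what arrives before acting on it. The fragment below is a hypothetical sketch of that idea for a length-prefixed command received over a connected interface; the message layout, opcode and limits are assumptions, not taken from any particular standard.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define MAX_PAYLOAD 64u            /* assumed upper bound for this command */

/* Hypothetical command accepted from an untrusted, connected interface:
   [ 1 byte opcode | 1 byte length | payload ... ] */
bool handle_command(const uint8_t *buf, size_t buf_len,
                    uint8_t *out, size_t out_len)
{
    /* Validate before use: pointers, lengths and opcodes are checked at
       the trust boundary rather than assumed to be well formed. */
    if ((buf == NULL) || (out == NULL) || (buf_len < 2u)) {
        return false;
    }

    uint8_t opcode = buf[0];
    uint8_t length = buf[1];

    if ((length > MAX_PAYLOAD) ||          /* declared length within bounds   */
        ((size_t)length + 2u > buf_len) || /* declared length fits the buffer */
        (length > out_len)) {              /* and fits the destination        */
        return false;
    }

    if (opcode != 0x01u) {                 /* only explicitly known opcodes   */
        return false;
    }

    memcpy(out, &buf[2], length);          /* copy only the validated payload */
    return true;
}
```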

Q. Can lessons learned by individual manufacturers be learned by the industry as a whole?

A. Yes. For example, the Lexus problem has been discussed in great depth within the Japanese automotive association. I know that because of their feedback to the MISRA organisation in Britain. Because MISRA was originally focused on the automotive sector, we have retained close relationships with the industry and we frequently receive requests to implement appropriate rules and guidelines.
