Bots & Robotic Software, the next threat surface

Posted on October 18, 2016

Following on from the theme of my last blog, ‘Insider Threat – Wetware issue’, one of the big themes at the moment is the attraction of Robotic Software and ‘Bots’, on the pretext that they can reduce the Insider Threat surface area, among other suggested efficiency benefits. Robotic Software needs to be considered on its merits, in the full awareness that it is not a solution to the Insider Threat but yet another new technology that comes with its own hidden costs.

Why?

It is not hard to draw a parallel in the wake of the latest Internet of Things (IoT) Distributed Denial of Service (DDoS) attack that took down leading websites – ‘IoT-based DDoS attacks on the rise’.

IoT entered the public domain in a largely uncontrolled and, in many cases, commercially cavalier manner. It is proving the biggest threat to Internet integrity. With billions of IoT devices connected to the public network, either directly or via a private network, the predominant nature of their economics (cheap production) means they are not secure and in many cases cannot be patched. The responsibility lies with the vendors to ensure fitness, form and function. If a user fails to maintain a system that has a friendly user interface to support ease of maintenance, then the vendor can fairly blame end-user error, but that is rarely the case with the current state of IoT.

“Technological progress is like an axe in the hands of a pathological criminal.”

Albert Einstein

The experience with IoT demonstrates the irresponsibility of vendors driven by commercial ends. It is not the first time and will not be the last. Just think of all the bugs from rushed software production processes that have gone into the wild over the years. The trend, though, is that with each evolution the impact is getting bigger, spreading faster, adapting in real time and causing collateral fallout, jumping the gap to impact our offline environments. We have already been hearing of the security challenges with autonomous vehicles, yet we blithely push ahead with those in the spirit of innovation (global-scale commercial interests). Let me turn your attention to the next use of technology coming to a workplace near you, with a worryingly similar profile to how IoT was allowed into the wild to run out of control, and an even greater impact area.

Robotic Software and Bots

  • Bots – personalised Digital Assistants that can do anything from booking a cinema ticket for you to responding to all your Social Media interactions, either according to parameters you set or by learning over time as they monitor your habits online: typing style, likes, dislikes, favourite sites, news themes, music, films, clothing, colours and so on – ‘getting to know you’, you get the idea. Bots are being heralded as the future on mobile platforms, set to make Apps redundant. Well, I for one disable anything to do with Digital Assistants and advise you to do the same.
  • Robotic Software – software that can automate / virtualise repetitive computational tasks and save companies millions by replacing human labour. Think of these in terms of Digital Labour, with each instance replacing a human worker.

Yes, you guessed it: those wonderful new Digital Labour units you thought would eradicate human error come with their own dirty little secret. Think of a buggy Digital Labour unit or a personal Bot. It’s one thing having a single human as the weakest link; imagine a chain where every link repeats the same error blindly, or worse still, a hacker finds a bug and exploits it to recruit all that Digital Labour to their own ends. Oh yes, and they do it at the speed of silicon, 24 hours a day, 365 days a year. So not only are they the new risk surface, but they can take it nuclear faster than some free-clicking human.

I repeat the truth I have quoted before: with every 1,000 lines of software code come between 15 and 50 errors. So even taking out all the users, with the complexity of IT systems today, an exploit is almost guaranteed somewhere.

Quoting from the book ‘Code Complete: A Practical Handbook of Software Construction, Second Edition’:

  1. Industry Average: “about 15 – 50 errors per 1,000 lines of delivered code.”
  2. Microsoft Applications: “about 10 – 20 defects per 1,000 lines of code during in-house testing, and 0.5 defect per KLOC (KLOC = 1,000 lines of code) in released product (Moore 1992).”
  3. Cleanroom development: “A technique pioneered by Harlan Mills that has been able to achieve rates as low as 3 defects per 1,000 lines of code during in-house testing and 0.1 defect per 1,000 lines of code in released product (Cobb and Mills 1990). A few projects – for example, the space-shuttle software – achieved a level of 0 defects in 500,000 lines of code using a system of formal development methods, peer reviews, and statistical testing.”
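Those per-KLOC rates scale linearly, so it is easy to estimate what they imply for a real system. A minimal sketch (the 2-million-line codebase size is an illustrative assumption, not a figure from the book):

```python
# Scale the per-KLOC defect rates quoted above to a whole codebase.

RATES = {  # defects per 1,000 lines of delivered code
    "Industry average (low)": 15,
    "Industry average (high)": 50,
    "Microsoft released product": 0.5,
    "Cleanroom released product": 0.1,
}

def expected_defects(lines_of_code: int, rate_per_kloc: float) -> float:
    """Multiply a per-KLOC rate by the number of thousands of lines."""
    return lines_of_code / 1000 * rate_per_kloc

# Assumed, illustrative codebase size: 2 million lines.
LOC = 2_000_000
for label, rate in RATES.items():
    print(f"{label}: ~{expected_defects(LOC, rate):,.0f} expected defects")
```

Even at the best released-product rate above (0.1 per KLOC), a two-million-line system would still be expected to ship with around 200 latent defects.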

Those software production failure statistics are the ground zero of our future challenge with the Internet. As we become more hyper-connected, the quality of the software – the control medium that everything relies on – is where we need to focus. If we do not address this now, we are building our future hyper-connected society on quicksand. If the software acreage magnifies exponentially without this foundation principle being addressed, there will be a point at which we experience a digital equivalent of a nuclear China Syndrome: a runaway event that takes everything down and destroys the good with the bad.

Organisations resting in the belief that their adoption of a DevOps strategy for code development addresses this need to look a little closer. Whilst the rapid / Agile development nature of DevOps lends itself to more frequent code-checking cycles and the ability to integrate more frequent security reviews, it does not mean more secure unless you are actually including dedicated security checks. A recent survey by HP Enterprise revealed that whilst 99% of organisations viewed DevOps as more secure, only 20% have integrated dedicated security testing into their DevOps code lifecycle and 17% are doing NONE at all. The reality is that DevOps is no more secure; in fact it can be less secure, as it can introduce defects faster and more frequently – it is all in the implementation. I fear this goes back to an organisation’s attitude to IT Security: if it has a poor or lazy one, that will prevail in the end product.

This is symptomatic of a vacuous world of urban myth that drives many trends in IT. With so much information thrown at us, we rarely go more than skin deep and fail to get into the real meat of a subject. Today’s perception of IT proficiency is a person who can access an app store and use an app without reading a manual, and everyone is an IT ‘guru’ with an opinion because they can switch on an iPad. Cannon fodder for a world of hacking script kiddies.

If we take a leap of faith and imagine a world where software is written in a way that eliminates all bugs, that still does not solve the other software-related problems that compromise security. It requires ALL parties in the development, deployment, support and maintenance of software to become responsible and accountable. A start would be to adopt a simple principle: ‘If it cannot afford the security attention it deserves, it should not be created’ – or at least NOT deployed in a live environment to burden an unsuspecting end-user audience and connected world. That gives us a head start on the threat actors who are so creative in compromising software, but even that nirvana is a dangerous false sense of security.

The truth is that even error-free code can introduce security flaws in the way it is architected to perform. The design of systems themselves needs to be similarly resilient against unpredicted scenarios. Think of a car being used as a battering ram to rob a bank. The car, road and bank may all be 100% secure and subject to their individual compliance regimes, but provide an attractive target (bank) with a means of access (road) and a tool (car), mix in a little creative thinking, and you get the unexpected and unconventional cashpoint drive-through, aka the Ram Raid. A somewhat over-simplified scenario, but it illustrates the point. For a current software example, look at how every version of the Windows Operating System has suddenly become exposed by the ‘AtomBombing’ hack. This uses an existing legitimate process in the Windows Operating System in a way it was not designed to be used. The challenge is that this is not a bug, but a fundamental design flaw in the way the system as a whole works. Bugs can be patched (fixing a leak in a pipe); design flaws can be significant engineering exercises (ripping out all the plumbing and replacing it), a bit like finding you have asbestos in your house. It does what it was designed to do, but you are now aware of some undesirable side effects.

Few in IT will not have heard of Ransomware by now: the malicious act on a digital asset by a hacker that allows them to hold the user to ransom for the return of that asset to a usable condition. Well, imagine a world where, instead of an obvious digital-asset ransom, the hacker compromised your Digital Labour via a software flaw, introducing micro changes to data records that only become discernible over time, or compromised the analytical reporting Robots in the output cycle so that they tell lies at inconvenient times. The impact on data-driven decision making in organisations and ecosystems is frightening to consider. The trust impact on the integrity of data-driven reporting for decision making in a digitally driven world would touch all our lives. When you can no longer trust the output from your multi-million-dollar Big Data investments, where does a business go, its share price in freefall? Backups – do you trust them? How can you tell? They are software as well.
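One partial defence against such silent micro changes is to checksum records when they are written and verify the checksum before the data is trusted. A minimal sketch using keyed hashes from the Python standard library (the record format and the key handling are illustrative assumptions; in practice the key must live outside the data store being protected):

```python
import hashlib
import hmac
import json

# Assumed key management: in reality this would come from a secrets vault,
# not a constant in the code.
SECRET_KEY = b"example-key-kept-outside-the-data-store"

def sign_record(record: dict) -> str:
    """Compute a keyed digest over a canonical serialisation of the record."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def verify_record(record: dict, digest: str) -> bool:
    """Return False if the record has changed at all since it was signed."""
    return hmac.compare_digest(sign_record(record), digest)

record = {"account": "12345", "balance": 1000.00}
digest = sign_record(record)

record["balance"] = 1000.01  # a hacker's "micro change"
print(verify_record(record, digest))  # → False
```

This does not stop the compromise, but it turns a change that might stay hidden for months into one that is detectable the next time the record is read, provided the verification path itself has not been subverted.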

It’s one thing getting users to pay attention to monthly Cyber Security training. What hope do you think there is when the brave new world of Digital Labour units gets up to full speed? Some humans may be subject to social engineering, but ALL Software Robots are subject to code compromises, because ALL code has bugs.
