8th IET International System Safety Conference incorporating the Cyber Security Conference 2013
14 - 17 October 2013 | Cardiff, UK
Eminent keynote speakers have agreed to present during the conference. These keynote speakers are:
Ian Bryant is the Technical Director of the United Kingdom’s Trustworthy Software Initiative (TSI)
TSI is a cross-sector, public-private partnership remitted to “Make Software Better”. This role is based at the Cyber Security Centre (CSC) at De Montfort University (DMU), where he is on academic attachment from the Ministry of Defence (MOD). He is also involved in a number of national and international Information Security committees, including chairing the UK National IA Forum (NIAF), serving as Deputy
Chair of BSI Panel IST/033/-/4 (Information Security Controls) with lead responsibility for ISO/IEC activity in Architecture, Cybersecurity, Incident Management and Software matters, and contributing to ETSI’s MTS Security SIG.
Who wants stovepipes?
It is a fact universally acknowledged that “stovepipes” of communities of interest are seldom in the Public Good. Yet stovepipes are very much a reality – indeed the title of this conference (“System Safety Conference, incorporating the Cyber Security Conference”) provides tangible evidence of this.
This talk considers the question as to whether a more holistic approach to trustworthiness (safety, reliability, availability, resilience and security) would be preferable, including an examination of the behavioural economic factors that need to be understood.
Ivan Lucic, Professional Head of System Safety, London Underground
Ivan has acquired international experience through a number of railway systems and safety-related projects in the UK and Europe.
Currently, as the Professional Head of System Safety, Ivan is responsible for embedding the system safety strategy and framework, and for the delivery of Engineering Safety and Assurance Cases across the LU Upgrade Programmes.
How to deliver a safe system and not be loathed by everyone involved with the project?
Delivering safe systems in an environment of increased complexity, coupled with commercial pressures, is difficult at the best of times. Ethical issues, timescales, cost, societal expectations, resources, competencies, business needs, legislation and standards appear to be conflicting energies unleashed on project teams, which are in turn driven to take sides and adopt entrenched positions.
From their trenches, different factions throw heavy words at each other like grenades. Things like 'schedule', 'ALARP', 'illegal', 'un-missable milestones', 'CENELEC', 'GRIP' or 'PMF', 'effective' and 'efficient', 'system', 'system of systems', 'sub-system', 'federation of systems', 'UML', 'EMI', 'SIL', 'business case', 'ROGS', 'ISA', 'Safety Taleban', 'Project Management Cowboys' etc. are fired at will, and as in any trench conflict, casualties are high on all sides.
In the end we deliver a safe system. But does it really need to be so difficult? Is it really a black art?
No one wakes up in the morning and goes to work with a firm conviction that doing a bad job and endangering others is great.
I want to challenge us, the safety professionals, to think hard not only about what others need to change or do better, but also about what we can, or indeed must, do to make it easier for us to understand the needs and drivers of others in the organisation, to be understood ourselves, and to stop this wasteful conflict.
We cannot blame the majority of the human race for being unable to appreciate the eternal saintly beauty of a perfect GSN, the dazzling attractiveness of an FFT model describing the EMI behaviour of track-circuits in the Wimbledon area, or the excitement of reading thousands of accident reports as part of the data analysis needed to populate risk assessments.
This speech suggests some of the reasons behind the conflict and proposes an approach that could aid the peace process.
Les Hatton, CEO, Oakwood Computing Associates and Professor of Forensic Software Engineering, Kingston University
Les Hatton has held various roles in industry and academia over the years. He still does one day a week as Professor of Forensic Software Engineering at Kingston University although he is really a secret nerd who lives behind a keyboard in a shed in New Malden, researching, writing and poring over enormous piles of code.
Some reflections on software systems: where does safety stop and security start?
System safety has rightly been a mainstream part of software systems engineering for a long time. However, as most control systems were essentially closed in the early days, security has generally been treated as somebody else's problem. This situation has completely changed now for various reasons.
First of all, embedded systems have grown massively in their resource requirements and frequently sit on top of a standard OS stack such as Linux. In other words, they look more and more like normal systems. Secondly, and partly resulting from the first point, they have become open to the outside world in some sense. The arguments in favour of this include compelling ones such as the logistics of patching software defects.
Thirdly, while this has been going on, software security has become a major growth area in its own right in response to the very considerable increase in sophistication and intensity of attack vectors, both malicious and espionage related. An early warning shot of the confluence of these factors was the Stuxnet virus.
In this talk, I will muse over system properties in general before concentrating on the rapidly growing overlap between software system safety and software system security. It's not particularly good news.
Rich Jones, Oversight Programme Manager, CAA Gatwick
During 32 years serving in the RAF, Rich Jones flew in the fighter role, operating in over 40 countries. Other tours included Commands in the UK, Falklands, and Middle East. In his final tour in the RAF, he was Air Commodore Operations responsible for the planning of all the RAF’s involvement in operations in Iraq and Afghanistan and for the RAF Flight Safety Organisation.
On leaving the RAF in 2008, Rich took over as the Chief Executive of the United Kingdom Flight Safety Committee for four years, where his primary focus was the promotion of aviation safety in commercial aviation around the world. In his current CAA role, Rich is responsible for designing and implementing a risk and performance based approach to regulation across UK aviation.
Enhancing safety performance in UK civil aviation
Tony Cant, Former DSTO
Dr Tony Cant has 22 years of experience in DSTO in the field of technical and policy aspects of trusted systems. During this time, he has both carried out and managed research and development into the application of formal methods (i.e. formal modelling, specification and verification techniques) to critical systems, with applications to both safety-critical and security-critical systems. Most recently, he led R&D into tool support for structured documentation, called the Hierarchical Verification Environment (HiVe).
System safety: where next?
System safety is a widely practised discipline that is built on the familiar everyday notions of risk and hazard. There are a number of well-known standards covering the safety of defence systems, railway systems, automotive systems and aircraft.
Central to system safety is the idea of a safety case: a reasoned argument for the safety of a system that is based on evidence. However, the concepts on which system safety is built, i.e. risk and hazard, can sometimes be confusing, are often poorly understood and are frequently misapplied in the context of safety engineering for software-intensive systems. In this paper we discuss these matters, and speculate on where system safety might be heading.
We describe some common issues with -- and desirable attributes of -- safety standards and safety cases. We present the idea of a structured document as a means of understanding these issues. We discuss first how the notion of a structured document was used in a lightweight way in the Australian Defence standard DEF(AUST)5679, and then how the HiVe tool, currently under development at DSTO, offers a more powerful means of building structured documents.
We conclude with brief comments on how the notion of hazard could be replaced by that of a safety protocol.