We’ve been using a framework for a couple of years to explain how the Internet of Things and Cyber Safety are different from Enterprise IT and most other high tech products we’re familiar with. It’s proven useful for framing discussions, particularly with non-technical audiences, and it builds a base of agreement for more substantive conversations. Most of our (debatably) “best practices” are highly tuned for financially motivated adversaries, confidentiality impacts, managed corporate environments, common technologies, and established economics and time scales. Compared with Enterprise IT, the Internet of Things and Cyber Safety have several differences that must be appreciated and accounted for throughout the device lifecycle by all stakeholders who design, build, or operate these systems.

  • Consequences – When software is a dependency for safety-critical systems, the consequences of security failure may manifest as direct, individual harm, including loss of life. Wide-scale harm can shatter confidence in the firm or the market, as well as trust in government to safeguard citizens through oversight and regulation.
  • Adversaries – Different adversaries have different goals, motivations, methods, and capabilities. While some adversaries may be deterred by the potential for harm from safety-impacting systems, others may seek these systems out. For instance, ideological actors may wish to inflict harm, and criminal groups may expect owners to pay higher ransoms.
  • Composition – Some components in Internet of Things devices, including safety systems, are not found in typical IT environments. Elements such as sensors, programmable logic controllers, low-power chips, and embedded controllers, along with constraints like limited battery life, restrict the capabilities available to the manufacturer in both design and response.
  • Economics – Components for safety systems may require a high degree of resourcing to protect while carrying a very low cost of goods, and margins may also be smaller. Security capabilities built for million-dollar data centers are likely cost-prohibitive in 42-cent microchips.
  • Context and Environment – Safety-critical systems often exist in unique operational, environmental, physical, network, real-time, and legal contexts. For instance, a pacemaker is implanted in a human body, has no IT staff, must respond immediately, cannot accept bolt-on security measures, and carries strict regulatory requirements.
  • Timescales – Timescales for design, development, implementation, operation, and retirement are often measured in decades. Response time may also be extended because of composition, context, and environment. Safety systems in design today may be with us for 10, 20, or even 40 years.