Interesting talk about security given in 2011:
Analytic Framework for Cyber Security (Keynote)
During the talk, Peiter described a report in which a number of security exploits were found in the very security software that had been deployed to prevent security problems, and then made the comment "complexity is often times at odds with security". A parallel observation can probably be made about reliability. Even when designing complex systems, there are ways to keep them simple. Layering on more stuff to fix something is often not the best solution – it may be required in the short term, but it is not sustainable in the long term.
The problem with dynamic linking of C++ applications is also illustrated in the talk – even though your application may be tiny, it is dynamically linked against a huge set of libraries (DLLs on Windows) that expose tens of thousands of functions, which greatly increases your attack surface. This illustrates the need to do things differently – using statically linked, safe languages like Go or Rust gets you much further. For example, a pure Go application does not even link to libc. This may also be one of the reasons Google started from scratch when creating Go instead of building on the existing C libraries, compilers, and ecosystem. A system is only as strong as its weakest link.
Another interesting comment – IBM uses a metric that 1–5 bugs are introduced for every 1000 lines of code. Hmm, I wonder if my software is that bad? Again, this shows the fallibility of humans as programmers – we really need to do things differently if we want better results, both in the technology base we use and in how we use it. I am fairly convinced that if we do things in the open, we do better work. This does not have to be an OSS project on GitHub – in whatever group we are working, we can gain some of the same benefits by doing our work in an open/transparent manner. Closed work and processes tend toward sloppiness – the great enemy of security.
Another observation is that we bring some of these problems on ourselves by demanding mono-cultures, because mono-cultures are easier to manage. In an organization, IT may require that the same OS and patch level be used on all computers. This makes the machines easier to manage, but it also makes the organization very vulnerable if that particular version is insecure.
The US approach to cyber security is dominated by a strategy that layers security onto a uniform architecture. We do this to create tactical breathing space, but it is not convergent with an evolving threat.
Throughout the presentation, Peiter illustrates how our current approach to security is not sustainable and how threat and response efforts are diverging.
Technology is not the only culprit … nor the only answer.
Real-world security skills are still largely taught on an apprenticeship basis.
At the end of the presentation, Peiter describes how valuable security work is being done in informal organizations and communities, and how difficult it is for the Government to engage with these small, informal groups.