I just read an interesting post by Michael Howard (Security is bigger than finding and fixing bugs). He refers to a statement Google seems to have made about its development practices (Google shares its security secrets):
In order to keep its products safe, Google has adopted a philosophy of ‘security as a cultural value’. The programme includes mandatory security training for developers, a set of in-house security libraries, and code reviews both by Google developers and outside security researchers.
This reminds me of my days back at university: I learned a whole lot about software engineering, data modeling, and the like. I learned about programming as well (up to the point where I could read Niklaus Wirth’s Modula-2 compiler – but that is a different story). Then I started my first job in the industry – and all of a sudden I had to learn that nobody there actually cared about design. Just write the code! Nobody “had time to do a design on paper; that is just a waste of time.” Did it work? Not really.
Now we come to security, and what do we do? Look at the code. Look for security vulnerabilities in the code. What about the design? What about the threat models? This drives me nuts: why are we not ready to learn from…
- … the past?
- … the lessons others have already learned?
I know that our Security Development Lifecycle is pretty successful, which can be shown by a number of different metrics – Michael gives a few in his blog. Additionally, we are working with SafeCode to share our experience and learn from others. Why don’t other companies join in?