Banners advertising cybersecurity on houses, buses, and everywhere else: the city of San Francisco talks security during the week of the RSA Conference. The event itself is huge, with many tracks running in parallel. I even got lost once at the expo, which seems to be twice the size of the German it-sa. But I had to refill my stock of pens anyway after the pause in in-person events caused by COVID-19 😉 However, I did not only bring home pens and shirts, but also the new insights described in this article.
Threat Modelling made easy
Threat modelling often fails because the methods used are too complex. Alyssa Miller, one of the authors of the Threat Modelling Manifesto, pointed out in her presentation that threat modelling can be easy and that everyone is able to do it. We already do it every day, for example when deciding whether or not to wear a mask at the conference. She also showed, with the example in the picture above, how natural language can be used to simplify threat modelling. At this point I was reminded of the threat modelling requirements in the automotive cybersecurity standard ISO/SAE 21434 and wondered whether they might be too detailed (cf. the anti-pattern “Tendency to Overfocus” from the Threat Modelling Manifesto). I will spend some more thought on this later on. There is definitely a tendency to focus (too?) much on details and methods in automotive TARA (Threat Analysis and Risk Assessment).
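To make the natural-language idea concrete, here is a minimal sketch of how plain-language threat statements could be captured in code. This is my own illustration, not material from the talk; the template fields (actor, action, asset, impact) are assumptions in the spirit of the Threat Modelling Manifesto's "what can go wrong" question.

```python
# Hedged sketch: plain-language threat statements as a simple template.
# Field names are illustrative assumptions, not from Alyssa Miller's talk.
def threat_statement(actor: str, action: str, asset: str, impact: str) -> str:
    """Render a natural-language threat statement anyone can read and review."""
    return f"A {actor} could {action} {asset}, resulting in {impact}."

# Example in the spirit of the everyday decisions mentioned above:
print(threat_statement(
    "conference attendee without a mask",
    "infect",
    "the people around them",
    "further spread of COVID-19",
))
```

The point is not the code itself but that a one-sentence template is often enough to get non-experts contributing to a threat model.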
Privacy on the rise, but still a way to go
Privacy is definitely on the rise, also at the RSA Conference. There was a keynote with the Chief Privacy Officers (CPOs) of Google and Apple, who pointed out that privacy has the highest attention in their companies. But imho there is still a long way to go, and most tech giants (except Apple) are struggling with practical implementation because their business models do not align well with privacy. Google’s CPO Keith Enright mentioned consent as the gold standard. I disagreed in my presentation, because “Consent on Everything” is often misused to legitimize the processing of personal data. That’s why it became risk #4 of the updated OWASP Top 10 Privacy Risks.
Bruce Schneier’s new book
Bruce Schneier presented content from his new book, which will be published in January. It is about hacking, but not only IT hacks: it covers gaming systems and laws in general, such as the tax system or emissions regulation, and how AI could speed up the discovery of such hacks. There is more information on his talk in this article.
Business Information Security Officers needed
Nicole Dove pointed out that BISOs (Business Information Security Officers) are on the rise and are needed to support business teams in implementing cybersecurity. They should move from a “NO” to a “YES, and” attitude to get better results. There is also a video in which Nicole talks about the role of the BISO.
Cybersecurity Workforce Gap
The cybersecurity workforce gap and how to address it was the topic of several presentations. Bryan Palma proposed in his keynote “Soulless to Soulful: Security’s Chance to Save Tech” coming up with new ideas and collaborating across company borders to close the workforce gap. He announced the new campaign “I do #soulfulwork”, because for many employees it is important to do something good and valuable in their work, which is definitely the case in cybersecurity.
The closing keynote, “The Hugh Thompson Show”, started on a humorous note, only to later turn to one of the most serious topics in today’s world: disinformation and how it threatens our democratic values. The panelists proposed several ideas for addressing it, but in the end they pointed out that education and awareness will be key to challenging (fake) news and validating sources. They also recommended talking to each other in real life and not only on social media.
Risk management is daily business in cybersecurity, and when following the COVID-19 news I am sometimes surprised that, at least in some countries, the decisions on countermeasures like social distancing and lockdowns are mainly taken by virologists. This makes sense at first glance because they know best how COVID-19 spreads, but they might not be experts in risk management and social behavior.
Take Sweden as an example, where staying at home is only recommended and no enforced lockdowns take place. Most decisions there are taken by the chief virologist, Anders Tegnell. He might be a good virologist, but the Swedish government seems to lack risk management expertise, or follows a strategy of preferring economic or political interests over saving human lives. Even though the strategy of herd immunity might be a valid option for COVID-19, it is a risky one, and comparatively high death rates prove that for Sweden and the UK (which recently changed its strategy). Of course, many people did not expect such a pandemic to ever occur. Still, every professional government should have prepared a crisis plan covering such a severe situation beforehand. As in every good information security strategy or business continuity plan, the risk appetite and priorities should be defined in this crisis plan, e.g. “Human life should be protected over economic interests”, or maybe the other way around.
In times of capitalism and market economy it is not surprising that many people strive to optimize life for themselves and their close friends and family, but do not consider the well-being of society as a whole if the social or financial burden on themselves gets too high. Surely many people consider lockdown restrictions a high burden and the risk of getting severely sick relatively low. Those people will ignore government recommendations, and some of them even binding guidelines, and thus further spread the disease. People also tend to misjudge risks in general: they underestimate risks they feel they can influence themselves, e.g. when driving a car, and overestimate risks they have no influence on, like being a passenger on an airplane (not the pilot) or being hit by a terrorist attack. People also perceive risks as higher if they are personally affected. You can read more about perceived vs. actual risks in this article.
With those factors in mind, it should be a given from a government perspective to adopt and enforce rules that limit the spread of COVID-19 and keep death rates as low as possible. Lockdowns should be kept in place until sufficient compensating measures, like widespread antibody testing and a (privacy-friendly) Corona app, are available to keep the curve flat.
So be patient if your government enforces or keeps lockdowns longer than you would wish. It might have a limited risk appetite that helps to avoid deaths. And you can speed up re-opening by supporting compensating measures: stay home if possible, keep your distance & wear a mask when in public, …
It has been a while since I last posted here, but my family and my job have kept me quite busy. Especially my role as information security officer in my company, with a successful ISO 27001 certification, took its toll over the last couple of months.
Anyway, new ideas regarding security and privacy are popping up in my head all the time, and I am glad to find a moment to write one of them down and spread it.
This post is about a topic I have been thinking about for quite a while now. Being the father of a 2-year-old son, I see quite some similarities between parenthood and security management, and I even think that being a parent has made me a better information security manager, because I stay calmer in difficult situations like incidents.
As a parent you have to deal with many more situations that require an urgent response to some kind of incident, like overfull diapers or your kid running towards a busy street, and you learn to stay calmer. You also sharpen your sense for what could go wrong; I call it my daily risk analysis. I regularly have to judge what my kid(s) can do, like climbing somewhere or playing outside by themselves, and what I want to (try to) prevent them from doing, like running onto the street, by telling them not to do it and/or locking the fence.
As a parent, just like in the role of an information security manager, you will not be able to mitigate all risks, because kids should not be overprotected and parents do not have the time and energy to control everything. There are also other stakeholders to consider, like your partner or your employer, who might have different thoughts on how much protection or time your kids need.
This is very similar in a business environment. You will usually not be able to mitigate all information security risks, because cost and effort would be too high and too much restriction might have a negative impact on your business. Time and resources for information security are also limited, so controls have to be prioritized according to the risk level.
But finally there is one huge difference: when raising kids, parents have to deal with a lot more safety issues than an average information security manager. And what is one of the most important things you learn in your information security education? Safety (the protection of human lives) is always more important than security 😉
Apparently those people have a different understanding of security. In my eyes, security protects assets like data in order to reduce risks. Risks are usually determined by multiplying the likelihood with the impact. For example, the risk that an administrator maliciously steals your data by downloading it from the database could be reduced by lowering the number of admins by 50%. This lowers the likelihood, and thus the corresponding risk, by 50% as well. The impact is influenced by the amount of data and its criticality. If you practice data minimization, the amount of data, and consequently the impact, is reduced. Thus data minimization, like the need-to-know principle, is very important for security, because it lowers the impact not only for one risk like data theft by administrators, but for all risks associated with this data set. Furthermore, anonymization and privacy by design can help to enable data analysis anyway.
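The arithmetic above can be sketched in a few lines. All numbers below are made up for illustration; the point is only that halving the likelihood (fewer admins) and halving the impact (data minimization) each halve the resulting risk, and that they compound.

```python
# Minimal sketch of risk = likelihood x impact, with hypothetical numbers.
def risk(likelihood: float, impact: float) -> float:
    """Risk as the product of likelihood and impact (simple quantitative model)."""
    return likelihood * impact

baseline     = risk(likelihood=0.10, impact=100_000)  # e.g. 10 admins, full data set
fewer_admins = risk(likelihood=0.05, impact=100_000)  # 50% fewer admins -> half the likelihood
minimized    = risk(likelihood=0.05, impact=50_000)   # plus data minimization -> half the impact

print(baseline, fewer_admins, minimized)  # 10000.0 5000.0 2500.0
```

Note that the impact reduction from data minimization applies to every risk that touches the same data set, not just the admin-theft scenario, which is why it is such a powerful control.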
And IMHO the digital future in Germany is rather hindered by a risk-averse and a bit old-fashioned culture than by data minimization 😉
According to the study “Net Losses – Estimating the Global Cost of Cybercrime”, published by the Center for Strategic and International Studies (CSIS), Germany loses 1.6 percent of its gross domestic product (GDP) to cybercrime. This is more than any other country. Second are the Netherlands with 1.5 percent, followed by Norway and the USA with 0.64 percent each. One reason for Germany’s high loss numbers could be the recent efforts to collect and publish cybercrime incidents, but of course also a lack of security measures in German companies.
The World Health Organization (WHO) recently published a report with new figures on deaths caused by air pollution: 7 million people worldwide each year, most of them due to stroke and heart and lung diseases.
This is more than twice as high as previously estimated, making air pollution the biggest environmental health risk today. Even in Europe, 600,000 deaths are linked to pollution according to the WHO study, and the German SPIEGEL reports that particulate matter pollution in some German cities, like Berlin or Leipzig, is significant.
Of course, pollution is not a direct risk to data centers or IT equipment like an earthquake or a flood, but it has a significant influence when it comes to sustaining one of the most important resources for IT: skilled people. Pollution already influences decisions about whether to outsource IT to certain countries, especially in Southeast Asia. It becomes harder to find employees from Western countries who are willing to build up a subsidiary or manage partners in these countries if there is significant pollution in the area.
And pollution might even be a risk to IT operations if administrators are unable to leave their homes due to high pollution when critical maintenance has to be done on-site. So pollution is definitely becoming a topic for information security considerations, and a gas mask might be standard equipment in your future emergency kit.
I recently read an article about eight people in a mid-sized German city who died last year because they crossed at red traffic lights and were hit by cars. They apparently had the feeling that they had everything under control, but they underestimated the risk. The story reminded me of the daily business of information security people, who have to deal with the risk perception of their managers and employees, luckily without casualties, but in a much more complex environment that is hard to oversee. False risk perception is typical for human beings. There are studies saying that the number of deaths in the 9/11 attacks was exceeded by the number of deaths caused by additional car accidents, because people chose to drive instead of flying, and driving is much more dangerous than flying.
As it seems, there is also a false risk perception regarding terror attacks and NSA spying. The NSA only contributed to the prevention of 4 out of 225 terror cases since 9/11, according to a study by the New America Foundation; the rest were prevented by the police and other agencies. If even a small number of people committed suicide after being wrongly identified as terrorists due to misleading NSA correlation results, the spying would not only help to save lives, but would also “kill” additional people. But I doubt there are public reports on the impact of such false positives.
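The underlying problem with such mass screening is the base-rate fallacy, which can be shown with back-of-the-envelope arithmetic. Every number below is a hypothetical assumption of mine (not from the New America Foundation study): even a screening system with a very low error rate flags vastly more innocent people than real suspects when the condition it looks for is extremely rare.

```python
# Base-rate sketch with entirely hypothetical numbers: when actual terrorists
# are extremely rare, even a tiny false-positive rate swamps the true positives.
population = 100_000_000       # assumed screened population
prevalence = 0.000001          # assumed fraction who are actual terrorists
false_positive_rate = 0.001    # assumed: 0.1% of innocents wrongly flagged

actual_cases = population * prevalence                          # 100 real cases
false_positives = (population - actual_cases) * false_positive_rate

print(int(actual_cases), int(false_positives))  # 100 99999
```

Under these assumptions, roughly a thousand innocent people are flagged for every real case, which is exactly why the human cost of false positives deserves public reporting.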