Insights from RSA Conference 2022

Banners advertising cybersecurity on houses, buses and everywhere: the city of San Francisco talks security during the week of RSA Conference. The event itself is really huge, with many tracks in parallel. I even got lost once at the expo, which seems to be twice the size of the German it-sa. But I had to refill my stock of pens anyway after the break from in-person events caused by Corona 😉
However, I did not only bring home pens and shirts, but also the new insights described in this article.

Threat Modelling made easy

Threat modelling often fails because the methods are too complex. Alyssa Miller, one of the authors of the Threat Modelling Manifesto, pointed out in her presentation that threat modelling can be easy and that everyone is able to do it. We already do it every day, for example when deciding whether or not to wear a mask at the conference. She also showed with the example in the picture above how natural language can be used to simplify threat modelling. At this point I was reminded of the threat modelling requirements in the automotive cybersecurity standard ISO/SAE 21434 and wondered whether they might be too detailed (cf. the anti-pattern “Tendency to Overfocus” from the Threat Modelling Manifesto). I will give this some more thought later on. There is definitely a tendency to focus (too?) much on details and methods in automotive TARA (Threat Analysis and Risk Assessment).
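To give a flavour of the natural-language idea, here is a minimal sketch of my own (assuming a simple actor/action/asset/impact template; it is not Alyssa’s exact method or tooling) that turns a few structured fields into plain-language threat statements anyone on the team can read and challenge:

```python
# Minimal sketch of natural-language threat statements (my own illustration,
# not Alyssa Miller's exact template or tooling).
from dataclasses import dataclass

@dataclass
class Threat:
    actor: str    # who could cause harm
    action: str   # what they could do
    asset: str    # what we care about
    impact: str   # why it matters

    def statement(self) -> str:
        return (f"A {self.actor} could {self.action} the {self.asset}, "
                f"which would {self.impact}.")

threats = [
    Threat("malicious insider", "export", "customer database",
           "expose personal data"),
    Threat("remote attacker", "tamper with", "firmware update process",
           "allow persistent control of a vehicle ECU"),
]

for threat in threats:
    print(threat.statement())
```

The point is not the code but the sentence structure: a threat written this way can be discussed by developers, product owners and testers alike, without anyone having to learn a heavyweight methodology first.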

Privacy on the rise, but still a way to go

Privacy is definitely on the rise, also at RSA Conference. There was a keynote with the Chief Privacy Officers (CPOs) of Google and Apple, and they pointed out that privacy gets the highest attention in their companies. But IMHO there is still a way to go, and most tech giants (except Apple) are struggling with the practical implementation because their business model does not align well with privacy. Google’s CPO Keith Enright mentioned consent as the gold standard. I disagreed in my presentation because “Consent on Everything” is often misused to legitimize the processing of personal data. That’s why it became risk #4 of the updated OWASP Top 10 Privacy Risks.

Bruce Schneier’s new book

Bruce Schneier presented content from his new book, which will be published in January. It’s about hacking, but not only about IT hacks. It covers gaming systems and laws in general, like the tax system or emission regulations, and how AI could speed up finding such hacks. There is more information on his talk in this article.

Business Information Security Officers needed

Nicole Dove pointed out that BISOs (Business Information Security Officers) are on the rise and are needed to support business teams in implementing cybersecurity. They should move from a “NO” to a “YES, and” attitude to get better results. There is also a video with Nicole in which she talks about the role of the BISO.

Cybersecurity Workforce Gap

The cybersecurity workforce gap and how to address it was the topic of several presentations. Bryan Palma proposed in his keynote “Soulless to Soulful: Security’s Chance to Save Tech” to come up with new ideas and collaborate across company borders to close the workforce gap. He also proposed the new campaign “I do #soulfulwork”, because for many employees it is important to do something good and valuable in their work, which is definitely the case in cybersecurity.

Disinformation

The closing keynote “The Hugh Thompson Show” started out very funny, only to then discuss one of the most serious topics in today’s world: disinformation and how it threatens our democratic values. The panelists proposed several ideas on how to address it, but in the end they pointed out that education and awareness will be key to challenging (fake) news and validating sources. They also recommended talking to each other in real life and not only on social media.

COVID-19 from a risk management perspective

Risk management is daily business in cybersecurity, and when following the COVID-19 news I am sometimes surprised that, at least in some countries, the decisions on countermeasures like social distancing and lockdowns are mainly taken by virologists. This makes sense at first glance because they know best how COVID-19 spreads, but they might not be experts on risk management and social behavior.

Take Sweden as an example, where staying at home is only recommended and no enforced lockdowns take place. Most decisions there are taken by their chief virologist Anders Tegnell. He might be a good virologist, but the Swedish government seems to lack risk management expertise or follows a strategy that prefers economic or its own interests over saving human lives. Even though the strategy of herd immunity might be a valid option for COVID-19, it is a risky one, and comparatively high death rates prove that for Sweden and the UK (which recently changed its strategy). Of course, many people did not expect such a pandemic to ever occur. Nevertheless, every professional government should have prepared a crisis plan beforehand that covers such a severe situation. As in every good information security strategy or business continuity plan, the risk appetite and priorities should be defined in this crisis plan, e.g. “human life should be protected over economic interests” or maybe the other way around.

In times of capitalism and market economy it is not surprising that many people strive to optimize life for themselves and their close friends and family, but do not consider the well-being of society as a whole if the social or financial burden on themselves gets too high. For sure there are many people who consider lockdown restrictions a high burden and the risk of getting severely sick relatively low. Those people will ignore government recommendations, and some of them even mandatory guidelines, and thus further spread the disease. Also, people tend to evaluate risks wrongly in general. They underestimate risks if they feel they can influence them themselves, e.g. when driving a car, and overestimate risks they have no influence on, like being a passenger on an airplane or being hit by a terrorist attack. People also perceive risks as higher if they are personally affected. You can read more about perceived vs. actual risks in this article.

With those factors in mind, it should be unavoidable from a government perspective to adopt and enforce rules to limit the spread of COVID-19 and keep death rates as low as possible. Lockdowns should be kept in place until sufficient compensating measures, like widespread antibody testing and a (privacy-friendly) Corona app, are in place to keep the curve flat.

So be patient if your government enforces or keeps lockdowns longer than you would wish. It might have a limited risk appetite that helps to avoid deaths. And you can speed up re-opening by supporting the compensating measures: stay home if possible, keep your distance & wear a mask when in public, …

How to pack your cyberwar emergency kit

In Germany there have been recent reports that hacking attempts on water and power suppliers have increased. It is not really surprising that such institutions are primary targets in case of a cyber-attack by foreign countries, because a power or water outage could quickly lead to a very critical situation. Marc Elsberg’s book Blackout describes quite well that a widespread blackout could be hard to recover from due to power grid instabilities and could thus lead to severe consequences after only a few days: shortages of medical supplies, fuel, food, heating, communication and so on, and consequently also a rise in criminal activity due to the lack of life-critical resources in many areas. Hopefully this scenario will never occur, but experts consider it a genuine risk, and there have been cases like in Ukraine in 2015 where cyber-attacks led to power outages. The German government, among others, has published a checklist to prepare for such situations. I agree that you should have the following basic supplies at home at all times:

  • Sufficient drinking water for one week (2 liters per person and day)
  • Food supply for one week (rice, pasta, tinned food)
  • Portable, battery-powered radio
  • Flashlights
  • Candles
  • Spare batteries
  • Camping stove
  • Some way to heat at least one room
  • Sufficient fuel at all times (in your car or a canister)

You should also be prepared for the possibility that you cannot communicate via internet or phone and that electronic devices like a garage door opener will not work in case of a power outage.

Facebook’s hindsight?

Dear Mark Zuckerberg,

apparently you are now also considering what I tried to convince your Head of Brussels Office of at a panel discussion at the IAPP Data Protection Congress in Brussels back in 2013: a paid version of Facebook without ads and without analysis of user behavior, as an alternative to protect privacy. But my suggestion was dismissed and my concerns regarding a loss of trust were not taken seriously.

Some years ago we also recommended assessing and whitelisting third parties and only handing over personal data to those who are trustworthy, in order to reduce OWASP Top 10 Privacy Risk #7. There are further serious privacy issues in Facebook that I will report to your data abuse bug bounty program. Facebook should start listening to privacy experts before your business gets damaged even more. I would be happy to discuss.

Privacy Engineering Workshop in Leuven

I was invited to the Future of Privacy Forum’s (FPF) workshop on Privacy Engineering in Leuven (Belgium). The Internet Privacy Engineering Network (IPEN) supported it, and compared to past IPEN workshops the number of participants increased a lot, which also shows the growing importance of the topic. Privacy Engineering, as a subset of Data Protection by Design, which is a requirement of the GDPR, is becoming an important discipline for implementing privacy in software, and the workshop aimed to discuss practical issues and develop guidelines for software developers and architects.

The morning started with presentations from Giovanni Buttarelli (European Data Protection Supervisor) and Norman Sadeh from Carnegie Mellon University, who has developed tools to semi-automatically analyze privacy policies (see https://usableprivacy.org/). Later on, Wojciech Wiewiorowski (Assistant European Data Protection Supervisor) pointed out that there is not always the easy solution (or “silver bullet”) for Privacy by Design that people would like to have, and compared the situation to his 10-year-old daughter, who would rather pretend to believe in Santa Claus and send him her wish list than discuss it with her parents.

In the afternoon five breakout sessions were organized. My group discussed the challenges arising from development and deployment practices and how data protection by design methodologies can be integrated into existing software development approaches. The most important questions were how to integrate Data Protection Impact Assessments (DPIAs) and a risk-based approach into the software development process, and how to handle the challenges of software modularity and the integration of third-party code.

The EDPS aims to publish Privacy by Design guidance at the beginning of next year that will consider the issues identified in the workshop. Further information about the event and its results is available on Twitter.

IPEN Workshop on 9 June in Vienna

The 4th workshop of the Internet Privacy Engineering Network (IPEN) will take place adjacent to the ENISA Annual Privacy Forum, on 9 June 2017 in Vienna at the following address: OCG Austrian Computing Society – Wollzeile 1.

IPEN invites participants from different areas such as data protection authorities, academia, open source and business development, and other individuals who are committed to finding engineering solutions to privacy challenges.

The event is free and a great opportunity to learn about the latest developments in privacy engineering and to network. Registration and submission of proposals is possible via ipen@edps.europa.eu

Are parents better security managers?

It has been a while since I last posted here, but my family and my job have kept me quite busy. Especially my role as information security officer in my company, including a successful ISO 27001 certification, took its toll over the last couple of months.

Anyway, new ideas regarding security and privacy are popping up in my head all the time, and I am glad to find a moment to write one of them down and share it.

This post is about a topic that I have been thinking about for quite a while now. Being the father of a two-year-old son, I see quite a few similarities between parenthood and security management, and I even think that being a parent has made me a better information security manager because I stay calmer in difficult situations like incidents.

As a parent you have many more situations to deal with that need an urgent response to some kind of incident, like overfull diapers or your kid running towards a busy street, and you learn to stay calmer. You also sharpen your sense for what could go wrong – I call it my daily risk analysis. I regularly have to judge what my kid(s) can do, like climbing somewhere or playing outside on their own, and what I want to (try to) keep them from doing, like running into the street, by telling them not to do it and/or locking the fence.

As in the role of an information security manager, as a parent you will not be able to mitigate all risks, because kids should not be overprotected and parents do not have the time and energy to control everything. Also, there are other stakeholders you have to consider, like your partner or your employer, who might have different thoughts on how much protection or time your kids need.

This is very similar in a business environment. You will usually not be able to mitigate all information security risks, because the cost and effort would be too high and too much restriction might have a negative impact on your business. Also, time and resources for information security are limited, so controls have to be prioritized according to risk level.

But finally there is one huge difference: when raising kids, parents have to deal with a lot more safety issues than the average information security manager. And what is one of the most important things you learn in your information security education? Safety (the protection of human lives) always comes before security 😉

How to boost application privacy

Privacy engineering is on the rise, and the upcoming EU Data Protection Regulation further pushes the requirement of Privacy by Design (PbD). So there is a need for guidelines and patterns on how to implement PbD, and there are some recent developments and publications. The OWASP Top 10 Privacy Risks Project published hints and best practices on how to avoid privacy risks in web applications. On privacypatterns.eu you can read, among other things, how to implement pseudonymous messaging or protection against tracking. Also, privacypatterns.org provides valuable information.
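To make this a bit more concrete, here is a minimal sketch of one such pattern, pseudonymization of identifiers, the way I would illustrate it (my own example with made-up key handling, not code taken from the pattern catalogues):

```python
# Minimal sketch of a pseudonymization pattern (illustrative only):
# replace direct identifiers with a keyed hash so records can still be
# linked for analysis without storing the original identifier.
import hashlib
import hmac

# In a real system the key must be kept separate from the pseudonymized
# data (e.g. in a secrets manager) and rotated according to your policy.
PSEUDONYM_KEY = b"replace-with-a-real-secret"

def pseudonymize(identifier: str) -> str:
    """Return a stable pseudonym for an identifier such as an e-mail address."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Example: store the pseudonym instead of the e-mail address in analytics data.
print(pseudonymize("alice@example.com"))
```

The nice property of a keyed hash over a plain hash is that an attacker who only obtains the pseudonymized data set cannot simply brute-force common identifiers without also obtaining the key.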

Voices from it-sa

I was at it-sa, the biggest German security expo and congress, last week. One of the highlights was definitely the speech by Edward Snowden (German report), in particular because the European Court of Justice had declared Safe Harbor invalid two days earlier. I gave a presentation about the OWASP Top 10 Privacy Risks and a radio interview for Deutschlandfunk about the importance of a holistic approach to information security. Furthermore, heise TechConsult published an interesting study with 5 steps for IT security (in German) and recommends that companies spend 1% of their turnover on information security.

Data minimization, Digital innovation & Security

I recently joined the KITS Conference in Berlin, where there were lively discussions about privacy as an enabler or disabler of the digital future in Germany. Startup consortia and trade associations like BITKOM think that data minimization is no longer acceptable because it hinders digital innovation. And also at the German BSI IT Security Congress there were statements like “We no longer need data minimization. We need to secure our data.”

Apparently those people have a different understanding of security. In my eyes, security protects assets like data in order to reduce risks. A risk is usually determined by multiplying its likelihood with its impact. For example, the risk that an administrator maliciously steals your data by downloading it from the database could be reduced by lowering the number of admins by 50%. This will lower the likelihood, and thus the corresponding risk, by 50% as well. The impact is influenced by the amount of data and its criticality. If you practice data minimization, the amount of data and consequently the impact will be reduced. Thus, data minimization is, like the need-to-know principle, very important for security, because it lowers the impact not only of one risk like data theft by administrators, but of all risks associated with this data set (the small calculation below illustrates this). Furthermore, anonymization and privacy by design can help to perform data analysis anyway.
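Here is a minimal sketch of that argument with made-up numbers (the likelihood and the impact, expressed here simply as the number of exposed records, are illustrative, not from a real assessment):

```python
# Minimal sketch of the risk = likelihood x impact argument above
# (illustrative numbers only, not from a real assessment).

def risk(likelihood: float, impact: float) -> float:
    return likelihood * impact

baseline       = risk(likelihood=0.10, impact=100_000)  # 10% chance, 100k records exposed
fewer_admins   = risk(likelihood=0.05, impact=100_000)  # halving the admins halves the likelihood
minimized_data = risk(likelihood=0.10, impact=50_000)   # data minimization halves the impact instead
both           = risk(likelihood=0.05, impact=50_000)   # both measures combined

print(baseline, fewer_admins, minimized_data, both)     # 10000.0 5000.0 5000.0 2500.0
```

The key difference: fewer admins only reduces the likelihood of this one threat, while the reduced impact from data minimization applies to every other threat against the same data set as well.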

And IMHO the digital future in Germany is hindered more by a risk-averse and somewhat old-fashioned culture than by data minimization 😉