Insights from RSA Conference 2022

Banners advertising cybersecurity on houses, buses and everywhere: the city of San Francisco talks security during the week of RSA Conference. The event itself is huge, with many tracks running in parallel. I even got lost once at the expo, which seems to be twice the size of the German it-sa. But I had to refill my stock of pens anyway after the pause in in-person events caused by COVID-19 😉
However, I brought home not only pens and shirts, but also the new insights described in this article.

Threat Modelling made easy

Threat modelling often fails because the methods used are too complex. Alyssa Miller, one of the authors of the Threat Modeling Manifesto, pointed out in her presentation that threat modelling can be easy and that everyone is able to do it. We already do it every day, for example when deciding whether or not to wear a mask at the conference. She also showed, with the example in the picture above, how natural language can be used to simplify threat modelling. At this point I was reminded of the threat modelling requirements in the automotive cybersecurity standard ISO/SAE 21434 and wondered whether they might be too detailed (cf. the anti-pattern "Tendency to Overfocus" from the Threat Modeling Manifesto). I will spend some more thought on this later on. There is definitely a tendency in automotive TARA (Threat Analysis and Risk Assessment) to focus (too?) much on details and methods.

Privacy on the rise, but still a way to go

Privacy is definitely on the rise, also at RSA Conference. There was a keynote with the Chief Privacy Officers (CPOs) of Google and Apple, who pointed out that privacy gets the highest attention in their companies. But IMHO there is still a way to go, and most tech giants (except Apple) are struggling with the practical implementation because their business models do not go well with privacy. Google's CPO Keith Enright called consent the gold standard. I disagreed in my presentation, because "consent on everything" is often misused to legitimize the processing of personal data. That is why it became risk #4 in the updated OWASP Top 10 Privacy Risks.

Bruce Schneier‘s new book

Bruce Schneier presented content from his new book, which will be published in January. It is about hacking, but not only IT hacks: it covers cheating on systems and laws in general, such as the tax system or emission regulations, and how AI could speed up finding such hacks. There is more information on his talk in this article.

Business Information Security Officers needed

Nicole Dove pointed out that BISOs (Business Information Security Officers) are on the rise and are needed to support business teams in implementing cybersecurity. They should move from a "NO" to a "YES, and" attitude to get better results. There is also a video in which Nicole talks about the role of the BISO.

Cybersecurity Workforce Gap

The cybersecurity workforce gap and how to address it was the topic of several presentations. In his keynote "Soulless to Soulful: Security's Chance to Save Tech", Bryan Palma proposed coming up with new ideas and collaborating across company borders to close the workforce gap. He proposed the new campaign "I do #soulfulwork", because for many employees it is important to do something good and valuable in their work, which is definitely the case in cybersecurity.

Disinformation

The closing keynote "The Hugh Thompson Show" started out very funny, only to later discuss one of the most serious topics in today's world: disinformation and how it threatens our democratic values. The panelists proposed several ideas for addressing it, but in the end they pointed out that education and awareness will be key to challenging (fake) news and validating sources. They also recommended talking to each other in real life, not only on social media.

COVID-19 from a risk management perspective

Risk management is daily business in cybersecurity, and when following the COVID-19 news I am sometimes surprised that, at least in some countries, decisions on countermeasures like social distancing and lockdowns are mainly taken by virologists. This makes sense at first glance because they know best how COVID-19 spreads, but they might not be experts in risk management and social behavior.

Take Sweden as an example, where staying at home is only recommended and no lockdowns are enforced. Most decisions there are taken by the chief virologist Anders Tegnell. He might be a good virologist, but the Swedish government seems to lack risk management expertise, or follows a strategy that prefers economic or other interests over saving human lives. Even though the strategy of herd immunity might be a valid option for COVID-19, it is a risky one, and the comparatively high death rates in Sweden and the UK (which recently changed its strategy) prove that. Of course, many people did not expect such a pandemic to ever occur. Still, every professional government should have prepared a crisis plan covering such a severe situation beforehand. As in every good information security strategy or business continuity plan, the risk appetite and priorities should be defined in this crisis plan, e.g. "human life should be protected over economic interests", or maybe the other way around.

In times of capitalism and market economy it is not surprising that many people optimize life for themselves and their close friends and family, but do not consider the well-being of society as a whole once the social or financial burden on themselves gets too high. There are certainly many people who consider lockdown restrictions a high burden and the risk of getting severely sick relatively low. Those people will ignore government recommendations, some of them even binding rules, and thus further spread the disease. People also tend to misjudge risks in general: they underestimate risks they feel they can influence themselves, e.g. when driving a car, and overestimate risks they have no influence on, like being a passenger on an airplane or being hit by a terrorist attack. People also perceive risks as higher if they are personally affected. You can read more about perceived vs. actual risks in this article.

With those factors in mind, it seems unavoidable from a government perspective to adopt and enforce rules that limit the spread of COVID-19 and keep death rates as low as possible. Lockdowns should be kept in place until sufficient compensating measures, like widespread antibody testing and a (privacy-friendly) Corona app, are available to keep the curve flat.

So be patient if your government enforces or keeps lockdowns longer than you would wish. It might have a limited risk appetite that helps to avoid deaths. And you can speed up re-opening by supporting compensating measures: stay home if possible, keep your distance and wear a mask when in public, …

How to boost application privacy

Privacy engineering is on the rise, and the upcoming EU Data Protection Regulation further pushes the requirement of Privacy by Design (PbD). So there is a need for guidelines and patterns on how to implement PbD, and there are some recent developments and publications. The OWASP Top 10 Privacy Risks project published hints and best practices on how to avoid privacy risks in web applications. On privacypatterns.eu you can read, among other things, how to implement pseudonymous messaging or protection against tracking. privacypatterns.org also provides valuable information.

Data minimization, Digital innovation & Security

I recently attended the KITS Conference in Berlin, where there were lively discussions about privacy as an enabler or disabler of the digital future in Germany. Startup consortia and trade associations like BITKOM think that data minimization is no longer acceptable because it hinders digital innovation. And at the German BSI IT Security Congress there were statements like "We no longer need data minimization. We need to secure our data."

Apparently those people have a different understanding of security. In my eyes, security protects assets like data in order to reduce risks. Risks are usually determined by multiplying likelihood by impact. For example, the risk that an administrator maliciously steals your data by downloading it from the database could be reduced by lowering the number of admins by 50%. This lowers the likelihood, and with it the risk, by 50% as well. The impact is influenced by the amount of data and its criticality. If you practice data minimization, the amount of data and consequently the impact are reduced. Thus, data minimization is, like the need-to-know principle, very important for security, because it lowers the impact not only for one risk like data theft by administrators, but for all risks associated with this data set. Furthermore, anonymization and privacy by design can still allow data analysis to be performed.
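The argument above can be made concrete with a small sketch. This is purely illustrative: the numbers and the simple multiplicative score are my own assumptions, not from any risk standard.

```python
def risk(likelihood: float, impact: float) -> float:
    """Simple multiplicative risk score: risk = likelihood x impact."""
    return likelihood * impact

# Baseline (illustrative units): relative likelihood score 10,
# impact measured as 1,000,000 stored customer records.
baseline = risk(likelihood=10, impact=1_000_000)

# Halving the number of admins halves the likelihood of admin data theft,
# and with it this one risk.
fewer_admins = risk(likelihood=5, impact=1_000_000)

# Data minimization halves the impact instead. The point of the argument:
# this reduction applies to ALL risks on the data set, not just admin theft.
minimized = risk(likelihood=10, impact=500_000)

assert fewer_admins == baseline / 2
assert minimized == baseline / 2
```

Both measures cut this particular risk in half, but only the impact reduction from data minimization carries over to every other threat against the same data.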

And IMHO the digital future in Germany is hindered more by a risk-averse and somewhat old-fashioned culture than by data minimization 😉

Do Not Track Blog & Linkography

German and French TV channels founded Do Not Track, a very informative blog about privacy and big data with the goal of raising awareness and providing transparency. They call themselves "a personalized documentary series about privacy and the web economy". The latest article is about Apple and Google participating in a confidential spy summit in a remote English mansion. They also published some hints on how to protect privacy on your smartphone.

In case you are further interested in trustworthy mobile apps for users and related instructions for developers, the Guardian Project is a good source.

Privacy is not only about laws …

… it is about ethical principles and providing full transparency and choice.

[Image: Martin Luther King Jr. Memorial]

I was reminded of that when I walked by the Martin Luther King Jr. Memorial in Washington, DC last week and wondered whether justice with transparency and choice is still present today, and how far capitalism already undermines fundamental laws and even democracy. Companies often try everything to gain a financial advantage, and not only by challenging or influencing privacy laws through their lobbyists.

Take the defense industry, which sells weapons to unstable countries to raise its profits and afterwards wonders that there is war again. Or the clothing industry, which produces in countries like Bangladesh with poorly paid workers and terrible safety conditions, which has already led to the deaths of many workers in fires or collapsing roofs. I sometimes wonder whether capitalism and acquisitiveness have grown too strong, or whether people just do not understand the interconnections.

What do you think?

Germans willing to pay 900 million Euro for Privacy

According to a study by the German Institute for Trust and Security on the Internet (DIVSI), 35% of all Germans would be willing to pay for a guarantee that their data is used according to their wishes and not for profit without consent. There would be even more, but many think that even if they paid, it could not be guaranteed that their data is not misused. The average payment would be 41 Euro, which adds up to roughly 900 million Euro.
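As a back-of-the-envelope check of that figure: the study summary does not state the population base, so the roughly 63 million adult Germans assumed below is my own assumption, not a number from DIVSI.

```python
# Rough plausibility check of the 900 million Euro figure.
adults = 63_000_000       # ASSUMPTION: approx. number of adult Germans
share_willing = 0.35      # 35% willing to pay (from the study)
avg_payment_eur = 41      # average payment per person (from the study)

total_eur = adults * share_willing * avg_payment_eur
print(round(total_eur / 1e6))  # prints 904
```

With that population assumption, the total lands at about 904 million Euro, consistent with the rounded 900 million in the study.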

[Chart from the DIVSI study "Daten – Ware und Währung"]

Source: DIVSI

The chart shows that 85% think that the misuse of personal data should be investigated and prosecuted more rigorously, and 84% want foreign Internet companies to have to comply with German laws. Thus, the European data protection reform is definitely overdue.

Encryption not sufficient to protect personal data

I could hardly believe it when I read about the latest developments in the negotiations of the EU Data Protection Regulation in an article on the German heise news site. Of course it is nice to hear that there is progress at all, but the suggestion to skip notifying the authorities after an incident if the data was stored encrypted is nonsense. Encryption is an important piece of the puzzle, but it does not protect against many threats. It mainly helps while the device (server) is turned off or during data transfer. It does not help against the most common problems like application vulnerabilities, unpatched servers, weak authentication or insecure passwords. A device might easily be hacked even though the data is encrypted. Trusting in encryption alone is like trusting only in your car's safety systems like the airbag while ignoring traffic rules and speed limits: it won't work.

Hopefully data protection experts will not let themselves be influenced too much by lobbyists, and will listen a bit more to people who understand how personal data is protected in real-life scenarios: by a comprehensive approach supported by technology and processes.

IPEN workshop Berlin

[Image: Berlin State Parliament]

On Friday the first workshop of the Internet Privacy Engineering Network (IPEN) took place in the Berlin State Parliament, with leading data protection experts like Peter Hustinx (European Data Protection Supervisor, EDPS), Peter Schaar (EAID), and several Data Protection Authority (DPA) representatives from all over Europe. IPEN was founded by Achim Klabunde (Head of IT Policy at the EDPS) and aims to build privacy into everyday tools and to bring legal people and engineers closer together. George Danezis from University College London said he had never seen so many legal experts and engineers at one table, and that this is promising for pushing privacy in software engineering. Carlo from Lynx stated that the internet is broken and surveillance cannot be prevented as long as we have insecure protocols.

Anyway, there are many more things to improve besides protocols, and quick wins are possible to reduce the misuse of personal data as practiced by many companies nowadays. We from OWASP presented the initial version of the Top 10 Privacy Risks, which gives engineers and business architects guidance and raises awareness of common privacy risks in web applications.

Among other things, IPEN decided to develop a privacy cookbook for engineers and one for business architects, and to start a project to boost secure communication over several channels like email and SMS. Further information about the event was published in a press release and on Twitter.