Insights from RSA Conference 2022

Banners advertising cybersecurity on houses, buses and everywhere: the city of San Francisco talks security during the week of RSA Conference. The event itself is huge, with many tracks in parallel. I even got lost once at the expo, which seems to be twice the size of the German it-sa. But I had to refill my stock of pens anyway after the break in in-person events caused by Corona 😉
However, I did not only bring home pens and shirts, but also the new insights described in this article.

Threat Modelling made easy

Threat modelling often fails because the methods are too complex. Alyssa Miller, one of the authors of the Threat Modelling Manifesto, pointed out in her presentation that threat modelling can be easy and that everyone is able to do it. We already do it every day, for example when deciding whether or not to wear a mask at the conference. She also showed with the example in the picture above how natural language can be used to simplify threat modelling. At this point I was reminded of the threat modelling requirements in the automotive cybersecurity standard ISO/SAE 21434 and wondered whether they might be too detailed (cf. the anti-pattern “Tendency to Overfocus” from the Threat Modelling Manifesto). I will give this some more thought later on. There is definitely a tendency to focus (too?) much on details and methods in automotive TARA (Threat Analysis and Risk Assessment).

Privacy on the rise, but still a way to go

Privacy is definitely on the rise, also at RSA Conference. There was a keynote with the Chief Privacy Officers (CPOs) of Google and Apple, and they pointed out that privacy receives the highest attention in their companies. But IMHO there is still a long way to go, and most tech giants (except Apple) are struggling with the practical implementation because their business models do not go well together with privacy. Google’s CPO Keith Enright mentioned consent as the gold standard. I disagreed in my presentation because “Consent on Everything” is often misused to legitimize the processing of personal data. That’s why it became risk #4 of the updated OWASP Top 10 Privacy Risks.

Bruce Schneier‘s new book

Bruce Schneier presented the contents of his new book, which will be published in January. It’s about hacking, but not only about IT hacks. It covers cheating of systems and laws in general, such as the tax system or emission regulations, and how AI could speed up finding such hacks. There is more information on his talk in this article.

Business Information Security Officers needed

Nicole Dove pointed out that BISOs (Business Information Security Officers) are on the rise and are needed to support business teams in implementing cybersecurity. They should move from a “NO” to a “YES, and” attitude to get better results. There is also a video with Nicole where she talks about the role of the BISO.

Cybersecurity Workforce Gap

The cybersecurity workforce gap and how to address it was the topic of several presentations. Bryan Palma proposed in his keynote “Soulless to Soulful: Security’s Chance to Save Tech” to come up with new ideas and collaborate across company borders to close the workforce gap. He proposed the new campaign “I do #soulfulwork”, because for many employees it is important to do something good and valuable in their work, which is definitely the case in the area of cybersecurity.

Disinformation

The closing keynote “The Hugh Thompson Show” started out very funny, only to later discuss one of the most serious topics in today’s world: disinformation and how it threatens our democratic values. The panelists proposed several ideas on how to address it, but in the end they pointed out that education and awareness will be key to challenging (fake) news and validating sources. They also recommended talking to each other in real life and not only on social media.

Facebook’s hindsight?

Dear Mark Zuckerberg,

apparently you are now also considering what I tried to convince your Head of Brussels Office of at a panel discussion at the IAPP Data Protection Congress in Brussels back in 2013: a paid version of Facebook without ads and analysis of user behavior, as an alternative to protect privacy. But my suggestion was dismissed and my concerns regarding a loss of trust were not taken seriously.

Some years ago we also recommended assessing and whitelisting third parties and only handing over personal data to those who are trustworthy, in order to reduce OWASP Top 10 Privacy Risk #7. There are further serious privacy issues in Facebook that I will report to your data abuse bug bounty program. Facebook should start listening to privacy experts before your business gets damaged even more. I would be happy to discuss.

Privacy Engineering Workshop in Leuven

I was invited to the Future of Privacy Forum’s (FPF) workshop on Privacy Engineering in Leuven (Belgium). The Internet Privacy Engineering Network (IPEN) supported it, and compared to past IPEN workshops the number of participants increased a lot, which also shows the growing importance of the topic. Privacy engineering, as a subset of Data Protection by Design (a requirement of the GDPR), is becoming an important discipline for implementing privacy in software, and the workshop aimed to discuss practical issues and develop guidelines for software developers and architects.

The morning started with presentations from Giovanni Buttarelli (European Data Protection Supervisor) and Norman Sadeh from Carnegie Mellon University, who developed tools to semi-automatically analyze privacy policies (see https://usableprivacy.org/). Later on, Wojciech Wiewiorowski (Assistant European Data Protection Supervisor) pointed out that there is not always the easy solution (or “silver bullet”) for Privacy by Design that people would like to have, and compared the situation to his 10-year-old daughter, who would rather pretend to believe in Santa Claus and send him her wish list than discuss it with her parents.

In the afternoon, five breakout sessions were organized. In my group, we discussed the challenges arising from development and deployment practices and how data protection by design methodologies can be integrated into existing software development approaches. The most important questions were how to integrate Data Protection Impact Assessments (DPIAs) and a risk-based approach into the software development process, and how to deal with the challenges of software modularity and the integration of third-party code.

The EDPS aims to publish Privacy by Design guidance at the beginning of next year that will take into account the issues identified in the workshop. Further information about the event and its results is available on Twitter.

IPEN Workshop on 9 June in Vienna

The 4th workshop of the Internet Privacy Engineering Network (IPEN) will take place adjacent to the ENISA Annual Privacy Forum, on 9 June 2017 in Vienna at the following address: OCG Austrian Computing Society – Wollzeile 1.

IPEN invites participants from different areas such as data protection authorities, academia, open source and business development, and other individuals who are committed to finding engineering solutions to privacy challenges.

The event is free of charge and a great opportunity to learn about the latest developments in privacy engineering and to network. Registration and the submission of proposals are possible via ipen@edps.europa.eu.

How to boost application privacy

Privacy engineering is on the rise, and the upcoming EU Data Protection Regulation further pushes the requirement of Privacy by Design (PbD). So there is a need for guidelines and patterns on how to implement PbD, and there are some recent developments and publications. The OWASP Top 10 Privacy Risks Project published hints and best practices on how to avoid privacy risks in web applications. On privacypatterns.eu you can read, among other things, how to implement pseudonymous messaging or protection against tracking. privacypatterns.org also provides valuable information.
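To make one of these patterns more tangible, here is a minimal sketch (my own illustration, not code from the projects above) of how pseudonymous identifiers could be derived with a keyed hash: the application can still link a user’s events, but the real identifier is never stored. The function and key names are hypothetical.

```python
import hmac
import hashlib

def pseudonymize(user_id: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym for a user ID with HMAC-SHA256.

    The pseudonym is the same for the same user and key, so events can be
    linked, but the real ID cannot be recovered without the secret key.
    """
    return hmac.new(secret_key, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: store only the pseudonym in analytics or log data
SECRET_KEY = b"replace-with-a-key-from-a-secure-vault"  # illustrative placeholder
print(pseudonymize("alice@example.com", SECRET_KEY))
```

A keyed hash is preferable to a plain hash here because, without the secret key, an attacker cannot simply re-hash known e-mail addresses to reverse the pseudonyms.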

Data minimization, Digital innovation & Security

I recently joined the KITS Conference in Berlin, and there were lively discussions about privacy as an enabler or disabler of the digital future in Germany. Startup consortia and industry associations like BITKOM think that data minimization is no longer acceptable because it hinders digital innovation. And at the German BSI IT Security Congress there were statements like “We no longer need data minimization. We need to secure our data.”

Apparently those people have a different understanding of security. In my eyes, security protects assets like data in order to reduce risks. Risk is usually determined by multiplying likelihood by impact. For example, the risk that an administrator maliciously steals your data by downloading it from the database could be reduced by cutting the number of admins by 50%; this lowers the likelihood, and thus the risk, by 50% as well. The impact is driven by the amount of data and its criticality. If you practice data minimization, the amount of data and consequently the impact are reduced. Thus data minimization, like the need-to-know principle, is very important for security, because it lowers the impact not only for one risk such as data theft by administrators, but for all risks associated with this data set. Furthermore, anonymization and privacy by design can still allow data analysis to be performed.
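To make the arithmetic concrete, here is a small sketch of the risk = likelihood × impact reasoning above; the numbers are made up purely for illustration.

```python
def risk(likelihood: float, impact: float) -> float:
    """Simple risk score: likelihood (0..1) multiplied by impact (e.g. affected records)."""
    return likelihood * impact

# Made-up baseline: some likelihood of insider theft, 1,000,000 records stored
baseline = risk(likelihood=0.10, impact=1_000_000)

# Halving the number of admins roughly halves the likelihood of this one insider risk
fewer_admins = risk(likelihood=0.05, impact=1_000_000)

# Data minimization halves the stored data, lowering the impact of *every* risk on this data set
minimized_data = risk(likelihood=0.10, impact=500_000)

print(baseline, fewer_admins, minimized_data)  # 100000.0 50000.0 50000.0
```

Reducing the number of admins only lowers the likelihood of that single insider risk, whereas minimizing the stored data lowers the impact of every risk affecting the data set.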

And IMHO the digital future in Germany is hindered more by a risk-averse and somewhat old-fashioned culture than by data minimization 😉

Do Not Track Blog & Linkography

German and French TV channels founded Do Not Track, a very informative blog about privacy and big data with the goal of raising awareness and providing transparency. They call themselves “a personalized documentary series about privacy and the web economy”. The latest article is about Apple and Google participating in a confidential spy summit in a remote English mansion. They also published some hints on how to protect privacy on your smartphone.

In case you are further interested in trustworthy mobile apps for users and related instructions for developers, the Guardian Project is a good source.

Privacy is not only about laws …

… it is about ethical principles and providing full transparency and choice.

(Photo: Martin Luther King Jr. Memorial, Washington DC)

I was reminded of that when I walked by Martin Luther King Jr.’s memorial in Washington DC last week and thought about whether justice with transparency and choice is still present today, and to what extent capitalism already undermines fundamental laws and even democracy. Companies often try everything to gain a financial advantage, not only by challenging or influencing privacy laws through their lobbyists.

Think of the defense industry selling weapons to unstable countries to raise its profits and afterwards wondering why there is war again. Or the clothing industry producing in countries like Bangladesh with poorly paid workers and terrible safety conditions, which has already led to the deaths of many workers caused by fires or collapsing roofs. I sometimes wonder whether capitalism and acquisitiveness have grown too strong or whether people just don’t understand the interconnections.

What do you think?

Austrian study about Online Tracking, Big Data and surveillance

The Austrian research institute Cracked Labs has published a very interesting study (in German) about online tracking, big data and commercial surveillance, with many examples showing what information companies derive from the data we provide on the internet and how they violate privacy best practices like user consent and purpose limitation.

One quote from Google’s Eric Schmidt (2013), mentioned in the online version of the report, is also a good slogan for 2015: “You have to fight for your privacy or you will lose it.”

Having said that, I wish you Merry Christmas and a Happy New Year!