I was invited to the Future of Privacy Forum’s (FPF) workshop on Privacy Engineering in Leuven, Belgium. The Internet Privacy Engineering Network (IPEN) supported it, and compared to past IPEN workshops the number of participants increased considerably, which shows the growing importance of the topic. Privacy Engineering, as a subset of Data Protection by Design (a requirement of the GDPR), is becoming an important discipline for implementing privacy in software. The workshop aimed to discuss practical issues and to develop guidelines for software developers and architects.
The morning started with presentations by Giovanni Buttarelli (European Data Protection Supervisor) and Norman Sadeh from Carnegie Mellon University, who developed tools to semi-automatically analyze privacy policies (see https://usableprivacy.org/). Later on, Wojciech Wiewiorowski (Assistant European Data Protection Supervisor) pointed out that there is not always the easy solution (or “silver bullet”) for Privacy by Design that people would like to have, and compared the situation to his 10-year-old daughter, who would rather pretend to believe in Santa Claus and send him her wish list than discuss it with her parents.
In the afternoon, five breakout sessions were organized. My group discussed the challenges arising from development and deployment practices and how data protection by design methodologies can be integrated into existing software development approaches. The most important questions were how to integrate Data Protection Impact Assessments (DPIA) and a risk-based approach into the software development process, and how to deal with the challenges of software modularity and the integration of third-party code.
The EDPS aims to publish Privacy by Design guidance at the beginning of next year that will consider the issues identified in the workshop. Further information about the event and its results is available on Twitter.
The 4th workshop of the Internet Privacy Engineering Network (IPEN) will take place adjacent to the ENISA Annual Privacy Forum, on 9 June 2017 in Vienna at the following address: OCG Austrian Computing Society – Wollzeile 1.
IPEN invites participants from different areas such as data protection authorities, academia, open source and business development, and other individuals who are committed to finding engineering solutions to privacy challenges.
The event is free of charge and a great opportunity to learn about the latest developments in privacy engineering and to network. Registration and submission of proposals is possible via email@example.com
Privacy engineering is on the rise, and the upcoming EU Data Protection Regulation further pushes the requirement of Privacy by Design (PbD). So there is a need for guidelines and patterns on how to implement PbD, and there have been some recent developments and publications. The OWASP Top 10 Privacy Risks Project published hints and best practices on how to avoid privacy risks in web applications. On privacypatterns.eu you can read, among other things, how to implement pseudonymous messaging or protection against tracking. privacypatterns.org also provides valuable information.
I recently attended the KITS Conference in Berlin, where there were lively discussions about privacy as an enabler or disabler of the digital future in Germany. Startup consortia and trade associations like BITKOM argue that data minimization is no longer acceptable because it hinders digital innovation. And at the German BSI IT Security Congress there were statements like “We no longer need data minimization. We need to secure our data.”
Apparently those people have a different understanding of security. In my eyes, security protects assets like data in order to reduce risks. Risks are usually determined by multiplying the likelihood by the impact. For example, the risk that an administrator maliciously steals your data by downloading it from the database could be reduced by lowering the number of admins by 50%. This lowers the likelihood, and the corresponding risk, by 50% as well. The impact is influenced by the amount of data and its criticality. If you practice data minimization, the amount of data and consequently the impact will be reduced. Thus, data minimization, like the need-to-know principle, is very important for security, because it lowers the impact not only for one risk like data theft by administrators, but for all risks associated with this data set. Furthermore, anonymization and privacy by design can help to perform data analysis anyway.
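The argument above can be sketched in a few lines of code. This is a toy model with made-up numbers, purely to illustrate why reducing impact (data minimization) is at least as effective as reducing likelihood (fewer admins):

```python
# Toy risk model: risk = likelihood * impact.
# All numbers below are illustrative assumptions, not real assessment data.

def risk(likelihood: float, impact: float) -> float:
    """Simple multiplicative risk score."""
    return likelihood * impact

# Baseline: data theft by a malicious administrator
# (assume a 10% likelihood and an impact score of 1,000,000 records).
baseline = risk(likelihood=0.10, impact=1_000_000)

# Halving the number of admins halves the likelihood, and thus the risk ...
fewer_admins = risk(likelihood=0.05, impact=1_000_000)

# ... while data minimization halves the impact instead, which lowers
# *every* risk tied to this data set, not just the admin-theft scenario.
minimized = risk(likelihood=0.10, impact=500_000)

print(baseline, fewer_admins, minimized)  # 100000.0 50000.0 50000.0
```

Both measures cut this particular risk in half, but only the impact reduction carries over to all other threats against the same data.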
And IMHO the digital future in Germany is hindered more by a risk-averse and somewhat old-fashioned culture than by data minimization 😉
German and French TV channels founded Do Not Track, a very informative blog about privacy and big data that aims to raise awareness and provide transparency. They call themselves “a personalized documentary series about privacy and the web economy”. The latest article is about Apple and Google participating in a confidential spy summit in a remote English mansion. They have also published some hints on how to protect privacy on your smartphone.
In case you are further interested in trustworthy mobile apps for users and related instructions for developers, the Guardian Project is a good source.
… it is about ethical principles and providing full transparency and choice.
I was reminded of that when I walked by Martin Luther King Jr.’s memorial in Washington DC last week and wondered whether justice with transparency and choice is still present, and to what extent capitalism already undermines fundamental laws and even democracy. Companies often try everything to gain a financial advantage, and not only by challenging or influencing privacy laws through their lobbyists.
Take also the defense industry, which sells weapons to unstable countries to raise its profits and afterwards wonders why there is war again. Or the clothing industry, which produces in countries like Bangladesh with poorly paid workers and terrible safety conditions, which has already led to the death of many workers in fires or building collapses. I sometimes wonder whether capitalism and acquisitiveness have grown too strong, or whether people just don’t understand the interconnections.
What do you think?
OWASP published an infographic banner about its Top 10 Privacy Risks.
The Austrian research institute Cracked Labs has published a very interesting study (in German) about online tracking, big data, and commercial surveillance, with many examples showing what information companies derive from the data we provide on the internet and how they violate privacy best practices like user consent and purpose limitation.
One quote from Google’s Eric Schmidt (2013) mentioned in the online version of the report is also a good slogan for 2015: “You have to fight for your privacy or you will lose it”.
Having said that, I wish you Merry Christmas and a Happy New Year!
According to a study by the German Institute for Trust and Security on the Internet (DIVSI), 35% of all Germans would be willing to pay for a guarantee that their data is used according to their wishes and not for profit without consent. There would be even more, but many believe that even if they paid, it could not be guaranteed that their data would not be misused. The average payment would be 41 Euro, which adds up to a total of about 900 million Euro.
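A quick back-of-the-envelope check makes the 900 million figure plausible. The 35% share and the 41-Euro average come from the study as cited above; the number of internet users in Germany (roughly 63 million) is my own assumption for the sake of the calculation:

```python
# Back-of-the-envelope check of the DIVSI total.
# internet_users is an assumed figure, not taken from the study itself.

internet_users = 63_000_000   # assumption: ~63 million internet users in Germany
willing_share = 0.35          # 35% willing to pay (from the study)
average_payment_eur = 41      # average payment in Euro (from the study)

total_eur = internet_users * willing_share * average_payment_eur
print(f"{total_eur / 1e6:.0f} million Euro")  # roughly 900 million
```

With those assumptions the product lands at about 904 million Euro, consistent with the sum reported by the study.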
The chart shows that 85% think that the misuse of personal data should be prosecuted more rigorously, and 84% want foreign internet companies to have to comply with German law. Thus, the European data protection reform is definitely overdue.
I could hardly believe it when I read the latest developments in the negotiations on the EU Data Protection Regulation in an article on the German heise news site. Of course it is nice to hear that there is progress at all, but the suggestion to skip notifying the authorities after an incident if the data was stored encrypted is nonsense. Encryption is an important piece of the puzzle, but it does not protect against many threats. It mainly helps if the device (server) is turned off, or during data transfer. It does not help against the most common problems like application vulnerabilities, unpatched servers, weak authentication, or insecure passwords. A device might easily be hacked even though the data is encrypted. Trusting in encryption alone is like trusting only in your car’s safety systems, like the airbag, while ignoring traffic rules and speed limits: it won’t work.
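The threat-model argument can be sketched as a simple enumeration. The threat list and the mitigation flags below are my own illustrative assumptions, chosen to mirror the examples in the paragraph above:

```python
# Toy threat model: which common threats does encryption at rest address?
# The threat list and True/False flags are illustrative assumptions only.

threats = {
    "stolen or powered-off device": True,    # disk encryption helps here
    "eavesdropping on data transfer": True,  # transport encryption helps here
    "application vulnerability (e.g. SQL injection)": False,
    "unpatched server": False,
    "weak authentication / insecure passwords": False,
}

covered = [t for t, mitigated in threats.items() if mitigated]
uncovered = [t for t, mitigated in threats.items() if not mitigated]

print(f"encryption covers {len(covered)} of {len(threats)} threat classes")
# The remaining threats reach the data while it is decrypted in use,
# which is why an encrypted database alone should not exempt anyone
# from breach notification.
```

Even in this tiny model, most attack paths go through a running system that must decrypt the data to use it, so encryption at rest simply does not apply.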
Hopefully data protection experts do not let themselves be influenced too much by lobbyists, and listen a bit more to people who understand how personal data is protected in real-life scenarios: by a comprehensive approach supported by technology and processes.