Wednesday 25 September 16:00-17:30
New ICT solutions aim to care for people who are older, sometimes frail, and usually not tech-savvy. They use people’s data to improve the technology in general and, on a personal level, to keep an eye on the users’ wellbeing. We expect these solutions to keep users ‘safe’ when collecting and analyzing their data, because caring for people – who are mostly unaware of the ‘worth’ of their information – also means taking responsibility for their safety by protecting their data and privacy.
However, technology does not police itself. Some technologies, notably in the health sector, are well framed in terms of data protection, ethical concerns and privacy, but many others are flourishing with no framework beyond the GDPR and no adequate supervision. How can we make European policy that keeps people who use non-medical technology solutions ‘safe’, and raise their awareness of data and privacy so they can make ‘safe’ decisions themselves? The aim is not to scare anyone, but to understand how the common citizen might be unaware of what is coming down the pipeline – or is already in their home or workplace.
The goal of this workshop is to open a discussion on how we can make sure that older – mostly not tech-savvy – people who use non-medical technology solutions are ‘safe’ where their data and privacy are concerned. Policy making at a European level on an ethical framework is one way to protect them, but it is also essential to raise their awareness so they can make ‘safe’ choices themselves. How to achieve these two sides of the same coin is our objective.
Areas (tables) to be discussed on the issue of European policy making:
Areas to be discussed on the issue of awareness of the user:
‘Living well with Anne’ (short: Anne) is a personal virtual assistant with a voice and an expressive face, easy and fun to use and specially made for forgetful elderly or mentally impaired people, making their life at home safe and comfortable. She speaks, listens, talks back, picks up commands and executes them. Anne is an example of developers who think it is important to keep their (often fragile) users safe, taking responsibility for their data protection and privacy. So they put in extra effort, for example by not using the cloud for speech recognition and by not basing their income on the use of user data (the ‘new gold’). But do the users and their (family) carers value that, given that these products come at a higher price? Or not? Are people aware of the ‘danger’ and the ‘real price’ they pay for cheap or even free solutions? Do we have to tell them what can happen to their privacy when using IoT solutions in the cloud? How? Can we give people the choice to use their own data to pay for services?
Elisa Irlandese (European Commission | Policy Officer)
Carina Dantas (Cáritas Coimbra)
Ellen Steenmeijer (Virtask)
Stephanie Koenderink (De Parabool)
Edith Birrer and Daniel Bolliger (iHomeLab)
Sonja Hansen (Aarhus Municipality)
Peter Mayer (Technical University Vienna)
Ana Jegundo (Cáritas Coimbra)
Valentina Tageo (ECHAlliance)