Aiming to engage people with informatics by focusing on the digital footprint and its acknowledged advantages, I want people to understand the shadow behind AI. For individuals, confronting the vast network of interests underlying AI where personal privacy is concerned, such as cookies on websites, can be a particularly hard task. I propose developing a projection-based interactive experience that replicates the user's data shadow as a real-life shadow and presents it in a novel and approachable manner. By interacting with the virtual shadow, users would be inspired to investigate the story behind the data they unintentionally submit online. By observing, they could compare their real shadow with the fabricated ones.
When talking about AI, most companies tend to emphasize the benefits and avoid discussing privacy issues. This framing overlooks the fact that AI is fed mountains of human-generated data: records of human movements, observations, measurements, utterances, categories, choices and preferences. These data comprise not only the digital footprint of individuals, meaning the data they knowingly leave behind, but also, and more significantly, their data shadow, meaning the information they generate unintentionally. Shadow, in this context, means to follow: our data shadow follows us.
Once digital traces are generated and transferred, they usually escape our control and end up in the hands of others, kept on servers where they are not readily forgotten. These traces can give others deep insight into our lives, and they can also be completely wrong.