Grants and Contributions:
Grant or Award spanning more than one fiscal year. (2017-2018 to 2018-2019)
Our personal information is continually shared amongst personal devices, 'things' in the Internet of Things (IoT), online social media and other websites, and more. Whilst sometimes seemingly innocuous, this information is potentially linkable and can be mined to build quite accurate pictures of us. Tools such as Privacy Enhancing Technologies are available to help us better manage this, and other tools, Transparency Enhancing Technologies, help in both legal and practical settings to enhance our awareness of where our information is and how it may be used and shared. However, many of these tools are difficult to use or understand, and the result is a lack of needed awareness and a lack of use. We are examining how human social concepts, such as trust, regret, comfort and wisdom, can enhance these technologies and so increase understanding, awareness and use. This will result in active, socially enhanced Transparency Technologies that are contextually aware and human oriented. We are further engaged in practical applications of these social norms in the development of autonomous agents that embed the notion of active Transparency and Privacy, enabling information to travel between systems whilst remaining protected, aware of its context, and aware of the requirements of its owners (us) for how it is shared.
This work will advance the current state of the art in privacy technologies, in particular in (1) the human awareness of privacy and the implications of information flow in young and emerging technologies such as IoT and ubiquitous computing, and in established but growing technologies such as mobile and ultra-personal computing, and (2) the technological awareness of privacy and information flow in these systems. In the former, the benefit to Canada and Canadians is clear: privacy awareness is a public social good, and an aware community is a strong, enabled community. In the case of the latter, the benefits are more diffuse: privacy-aware artificial systems can help enforce standards of information flow, incorporating strong protective behaviours for how information is shared and processed, and result in a more resilient information society comprised of people and technology.
The work will fund the training of HQP (PhD students) who will enter the workforce as soft security experts sensitive to privacy and information use. The combination of technical and social understanding will serve them well in their chosen career paths.