Today the Privacy Commissioner of Canada is hosting a one-day public consultation about the various new privacy challenges citizens face when navigating the online world. The motivation for the consultation, which is the first in a series of three events across Canada, was the upcoming review process for the Personal Information Protection and Electronic Documents Act (PIPEDA), which has not been modified in 10 years.
Commissioner Jennifer Stoddart identified three challenges that are of particular concern for her office: behavioural or targeted advertising; location-based or geospatial tracking; and the online tracking of children. The general idea, she said, was to understand our “digital footprint” – what it means for our privacy and how we can minimize its impact on the protection of our data.
David Vladeck, director of the Federal Trade Commission’s Bureau of Consumer Protection, gave an accessible talk, peppered with good one-liners, about the international context of the consultations. “We learned the hard way that existing privacy frameworks have limitations that are no longer tolerable,” he commented. Lengthy and complicated “privacy agreements” place undue burdens on consumers to read and understand, he said. In fact, “privacy agreements” are legal disclaimers written by lawyers that explain how companies will not protect users’ privacy. The harm-based model of privacy is problematic, Vladeck said, because it recognizes only a narrow set of harms and does not treat privacy as a value unto itself. Further, this model is applied retrospectively.
Vladeck said that what became clear from roundtable consultations in the US, as well as from some prominent cases before the FTC, was that consumers did not understand the breadth of personal data collection engaged in by online corporations, or the uses to which their data would be put. He cited a recent case where 7500 Americans “sold their souls” to an online computer gaming company. The hoax was intended to make the point that consumers do not read privacy agreements when shopping online. There was an opt-out option, which almost no one selected:
If you a) do not believe you have an immortal soul, b) have already given it to another party, or c) do not wish to grant Us such a license, please click the link below to nullify this sub-clause and proceed with your transaction.
One potential new model is “privacy by design”, the philosophy and approach of embedding privacy into the design of technology, business practices and physical design. Developed by Dr. Ann Cavoukian in the 1990s, privacy by design acknowledges that “the future of privacy cannot be assured solely by compliance with regulatory frameworks; rather, privacy assurance must ideally become an organization’s default mode of operation.”
Another question Vladeck raised was whether we can build security and privacy into the internet after the fact – that is to say, on top of a foundation that was built to be trusting and open. Related to that is the question of a common understanding of what characterizes private information, and whether consensus around this is needed if people are to control the online flows of their own data.
Elizabeth Denham, the Assistant Privacy Commissioner of Canada, said that PIPEDA is a robust piece of legislation, whose principle-based approach makes it flexible, enabling PIPEDA to serve its purpose despite its age. “We have been able to apply the law to technologies and business models that didn’t exist when PIPEDA came into force. It’s a neutral law that does not thwart technological innovation,” she added.
Denham described PIPEDA as a law that applies to personal information used in commercial activity and that has strengthened Canadians’ privacy rights. She reminded the audience, however, that the Privacy Commissioner does not have order-making powers and cannot enforce the law. Thus PIPEDA’s success depends on businesses and organizations submitting to the authority of Canadian privacy law. The main concern she raised was the profiling of consumers via behavioural advertising, largely enabled by users who routinely and unquestioningly volunteer significant amounts of personal information.
Other interesting highlights:
Jules Polonetsky, from The Future of Privacy Forum, used a visual map to explain what happens to users’ data before they even click through a website.
Anne Toth, head of privacy for Yahoo!, explained her company’s efforts to allow users to control targeted ads. The idea is that transparency in data use will empower users – for example, by providing privacy notices outside privacy policies and making privacy choices relevant and contextual to consumers. I’m not sure that just showing people how they are being targeted by marketers, or allowing them to manipulate their advertising experience, is enough, however: they need to know how and why their data is being used, and what the broader implications are. Nevertheless, I suppose Ad Interest Manager is a baby step…
Ian Kerr discussed the problem of social graphing by websites, in particular Facebook’s open graph (opt out here), which he called Google Street View for people. He said social graphing is transformative for marketers, in that it allows them to build a smarter, more “social” web that weaves connections from a long list of multiple and diverse ties. This new marketing paradigm is “soft surveillance,” heavily reliant on what Gary T. Marx refers to as “mandatory volunteerism.” The tricky bit about this new model is that it is consensual, and Kerr said this is one reason, among others, that new privacy law is likely to be retrofitted to fit new technology. Interpreting privacy by design as privacy by default is one approach to legislation that can thwart the slippery slope of Zuckerberg’s privacy-phobic “social web.”
Teresa Scassa gave a kickass talk on tracking and surveillance. She pointed out that this is not a new phenomenon – Canadian courts and lawmakers have experience in dealing with these issues, most commonly in employment or law enforcement. The character of tracking and surveillance is changing, however, with new technologies, actors and social contexts emerging and objectives expanding.
Scassa said there are new issues for which established responses are no longer adequate. In the old days, surveillance was imposed upon us, for example through surveillance cameras. Corporations and governments were required to justify transgressions of privacy and create a balance between these and the preservation of dignity and autonomy.
In today’s data protection context, things are less focused on privacy, dignity and autonomy and more on accommodating commerce. Whereas tracking and surveillance were formerly specific, and limited to government and employers, the new context is dominated by the private sector. Rather than surveillance being imposed on us, we are now being asked to be the enablers of our own surveillance. Thus it is difficult to argue against it from a privacy perspective, because we have consented. We facilitate our own online tracking through the data trails we shed.
Scassa added that it is difficult to cover our digital tracks, or to avoid leaving them altogether, because many feel the benefits outweigh the drawbacks. However – and this is important – the drawbacks are certainly there, but unlike the benefits, which are immediate, the drawbacks are nebulous, remote and non-transparent.
Tracking has become “social” in the Facebook redefinition of the word. Personal information drawn from our online consumption practices is used to profile us, and Scassa said it’s important to understand the implications for autonomy, dignity, privacy, fairness and liberty.