Privacy by Design – New OASIS Technical Committee for Software Engineers

Session Topic: Privacy by Design Documentation for Software Engineers – New OASIS Technical Committee Tuesday 1C Convener: Dawn Jutla and Craig Burton

Notes-taker(s): Dawn Jutla, Peter Brown Tags for the session - technology discussed/ideas considered: Privacy by Design; OASIS Privacy by Design Documentation for Software Engineers; PbD-SE; Data-as-currency; Personal information-Free Rider Issue

Discussion notes, key understandings, outstanding questions, observations, and, if appropriate to this discussion: action items, next steps:

Dawn Notes: Dawn spoke about the OASIS TC on Privacy by Design Documentation for Software Engineers that she recently convened. She mentioned the TC’s initial seeding of ideas around embedding the 7 Privacy by Design principles, ranging from a positive-sum scenario to user respect, AND the 7 Cs principles (e.g. consent, confinement, etc.) as user-centric privacy requirements. Further, the TC will be looking at generating privacy by design documentation using modeling and programming languages. The TC’s coverage will include, but is not limited to, extending UML use case diagrams, scenario diagrams, class diagrams, and UI diagrams to help software engineers embed privacy by design in their resulting software and services. Emphasis is also put on the use of systems analysis and design documentation to help software organizations show compliance with privacy best practices and regulations.

The group of approximately 16-20 people, with active participation from John Biccum (Microsoft), William Yasnoff (Health Record Banking Alliance), Peter Brown (Independent Consultant), and Drummond Reed (Respect Network), discussed some drivers of new privacy requirements. The consensus was that people do not like that they are not getting sufficient economic benefit from third parties’ use of their data. We do get free services in exchange (e.g. free email, free social network communication among friends and colleagues), but there is a perception that people are not extracting enough relative value from giving up their personal data as compared to the value that companies extract. We framed this concern as a Personal Information-Free Rider issue. Dawn mentioned that a positive-sum scenario for privacy and the advertising business model in use today by popular Internet giants could be created. Peter Brown, a passionate privacy advocate, exclaimed that she was perhaps being too polite!

In addition, we highlighted data-as-currency in future business models such as Drummond Reed’s Respect Network where the free-rider issue may be rectified. As future collaborative action, Dawn invited the audience members to join the TC and to contribute further to the effectiveness of its output.

This session was a pre-cursor of a joint session with Craig Burton on Wednesday, W1F: Session 1 9:30-10:30, Room F on Identity and API Economy plus Privacy by Design. Please link to that session’s notes for more information on the OASIS Technical Committee on Privacy by Design Documentation for Software Engineers and for a larger picture of the importance of embedding privacy by design in future business models.

_______________________________________________________________________

ADDENDUM:

Below are convenient descriptions of the 7 Privacy by Design (PbD) Principles and the 7 Cs Privacy Control Principles that lend themselves to a user-centric view of privacy requirements.

The Seven Cs (7 Cs) for User Privacy Control Requirements

The 7 Cs for User Privacy Control adopted three initial control elements (comprehension, consciousness, and consent), from Andrew Patrick and colleagues’ research on human–computer interaction in privacy. Dawn added 4 other constructs from user behavior theories to round out the 7 Cs. Together they describe the ways in which users perceive they have some measure of privacy control; that is, through understanding, being aware, choosing explicitly, giving consent, adapting privacy rules according to context, setting limits, and anticipating the familiar through consistency. The 7 Cs are not restricted to the user interface. They can be embedded at the data and behavioral modeling stages of analysis and design.

CONTROL CATEGORY DESCRIPTION as adopted from [1].

1.	Comprehension: Users should understand how personally identifiable information (PII) is handled, who’s collecting it and for what purpose, and who will process the PII and for what purpose. Users are entitled to know all parties that can access their PII, the limits to processing transparency, why the PII is being requested, when the data will expire (either from a collection or database), and what happens to it after that. This category also includes legal rights around PII, and the implications of a contract when one is formed.

2.	Consciousness: Users should be aware of when data collection occurs, when a contract is being formed between a user and a data collector, when their PII is set to expire, who’s collecting the data, with whom the data will be shared, how to subsequently access the PII, and the purposes for which the data is being collected.

3.	Choice: Users should have choices regarding data collection activities in terms of opting in or out, whether or not to provide data, and how to correct their data.

4.	Consent: Users must first consent (meaning informed, explicit, unambiguous agreement) to data collection, use, and storage proposals for any PII. Privacy consent mechanisms should explicitly incorporate mechanisms of comprehension, consciousness, limitations, and choice.

5.	Context: Users should be able to change privacy preferences according to context. Situational or physical context (for example, a crowded service desk where several people can listen in when you provide a phone number, an online community chat room, or a room with cameras, where digitization makes the information permanent and unmistakably you) differs from transactional context, such as a purchase on Amazon.com. Data context (the sensitivity of the data, for example, health data) could likewise dictate different actions on the same PII in different contexts.

6.	Confinement: Users should be able to set limits on who may access their PII, for what purposes, and where and possibly when it may be stored. Setting limits could provide some good opportunities for future negotiation between vendors and users.

7.	Consistency: Users should be able to anticipate with reasonable certainty what will occur if any action involving their PII is taken. That is, the outcome of accessing or giving out PII should be predictable.

PRIVACY-BY-DESIGN’s 7 Foundational Principles:

1.	Proactive not Reactive; Preventative not Remedial

The PbD framework is characterized by proactive rather than reactive measures. It anticipates and prevents privacy-invasive events before they can occur. PbD does not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy infractions once they have occurred − it aims to prevent them from occurring altogether. In short, PbD comes before the fact, not afterwards.

2.	Privacy as the Default Setting

We can all be certain of one thing − the default rules! The power of the default cannot be overstated. PbD seeks to deliver the maximum degree of privacy by ensuring that personal data are automatically protected in any given IT system or business practice by default. If individuals do nothing, their privacy still remains intact. No action is required on the part of the individual to protect their privacy − it should be built into the system, by default.
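Principle 2 can be illustrated with a minimal sketch (the class and flag names are hypothetical, not from any PbD specification): a settings object in which every disclosure flag defaults to off, so a user who does nothing keeps their privacy intact.

```python
from dataclasses import dataclass


@dataclass
class SharingSettings:
    """Privacy as the default setting: every disclosure flag starts off,
    so if the individual does nothing, their privacy remains intact."""
    share_location: bool = False
    share_contacts: bool = False
    personalized_ads: bool = False

    def anything_shared(self) -> bool:
        # True only after the user has explicitly opted in to something.
        return self.share_location or self.share_contacts or self.personalized_ads


# A fresh account shares nothing until the user explicitly opts in.
settings = SharingSettings()
```

The design choice is simply that opting in, never opting out, is the only action a user can take.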

3. Privacy Embedded into Design

PbD is embedded into the design and architecture of IT systems and business practices. It is not bolted on as an add-on, after the fact. The result is that privacy becomes an essential component of the core functionality being delivered. Privacy is integral to the system, without diminishing functionality.

4. Full Functionality – Positive-Sum, not Zero-Sum

PbD seeks to accommodate all legitimate interests and objectives in a positive-sum “win-win” manner, not through the dated, zero-sum approach, where unnecessary trade-offs are made. PbD avoids the pretense of false dichotomies, such as privacy vs. security, demonstrating that it is possible to have both.

5. End-to-End Security – Full Lifecycle Protection

PbD, having been embedded into the system prior to the first element of information being collected, extends securely throughout the entire lifecycle of the data involved — strong security measures are essential to privacy, from start to finish. This ensures that all data are securely retained, and then securely destroyed at the end of the process, in a timely fashion. Thus, PbD ensures cradle to grave, secure lifecycle management of information, end-to-end. There can be no privacy without strong security.

6. Visibility and Transparency – Keep it Open

PbD seeks to assure all stakeholders that whatever the business practice or technology involved, it is in fact, operating according to the stated promises and objectives, subject to independent verification. Its component parts and operations remain visible and transparent, to both users and providers alike. Remember, trust but verify!

7. Respect for User Privacy – Keep it User-Centric

Above all, PbD requires architects and operators to keep the interests of the individual uppermost by offering such measures as strong privacy defaults, appropriate notice, and empowering user-friendly options. Keep it user-centric – respect for the user is paramount.

Reference:

[1] Dawn N. Jutla, Peter Bodorik, "Sociotechnical Architecture for Online Privacy," IEEE Security and Privacy, vol. 3, no. 2, pp. 29-39, March-April 2005, doi:10.1109/MSP.2005.50

Peter Notes: Presenters: Craig Burton, Distinguished Analyst at Kuppinger Cole, and Dr. Dawn Jutla, Sobey School of Business, Saint Mary’s University.

Craig Burton (presentation slides here)

5 API tenets:
 * Everything and everyone will be API-enabled (28bn APIs by 2015), each with a provider (inside > out) and/or consumer (outside > in) type of API;
 * The API Ecosystem is core to any cloud strategy – Amazon stores 260bn objects; Twitter handles 13bn API calls/day; SalesForce over 50% of all traffic via APIs; etc.;
 * Baking core competency into an API set is an economic imperative;
 * Enterprise inside-out
 * Enterprise outside-in

Open APIs are growing in a “Cambrian Explosion,” leading to a great number and diversity of APIs available to work with. This creates an intractable management problem in the API economy, where thousands of APIs exist in a broken one-to-one or one-to-many publish-subscribe consumption model. Instead, what is needed is a federated, many-to-many evented API model, where events automatically trigger a possible cascade of evented API actions. Implied in the automation are event managers that understand rule-based or semantic contexts. These event managers will manage event prioritization and will be needed in future cloud operating systems. Clearly, the identity of people and things is an intrinsic part of the fix and of the future many-to-many cloud-based solutions. There is concern about SAML because its identity model, as deployed on current, admin-centric systems, does not scale.
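The many-to-many evented model described above can be sketched in miniature (all names are hypothetical, and a real event manager would be distributed and rule- or semantics-driven rather than a simple numeric ordering): many publishers and many subscribers share an event bus, and an event manager decides handler priority.

```python
from collections import defaultdict


class EventManager:
    """Toy event manager: many publishers, many subscribers per event type,
    with rule-based (here simply numeric) prioritization of handlers."""

    def __init__(self):
        self._subs = defaultdict(list)  # event type -> [(priority, handler)]

    def subscribe(self, event_type, handler, priority=0):
        self._subs[event_type].append((priority, handler))

    def publish(self, event_type, payload):
        # Lower priority number runs first; a handler may itself publish,
        # producing the cascade of evented API actions described above.
        return [handler(payload)
                for _, handler in sorted(self._subs[event_type],
                                         key=lambda pair: pair[0])]
```

For example, a privacy check could be subscribed at a higher priority than an audit logger, so it always runs first when any party publishes a `user.updated` event.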

Dawn Jutla (presentation slides here)

We need to intentionally create a positive-sum scenario for privacy, security, and the advertising business model. Several original diagrams illustrated how personal data profiles, in the example of the mobile space, flow among a stack of interdependent and partnering platforms, such as carrier networks, device, operating system, applications and apps, and marketing aggregators. As governments adopt cloud solutions to lower costs, in many instances citizens’ data profiles remain vulnerable to collection by the stack of popular platforms residing between the citizen and the government online service. Furthermore, cloud vendors will voluntarily keep government cloud service instances separate. However, in many countries, does that mean the service interaction will not be subject to the same personal data leakage that occurs in the consumer space?

The 7 Cs Privacy Principles:
 * Comprehension (user understanding of how PII is handled);
 * Consciousness (user awareness of what is happening and when);
 * Choice (to opt-in or out, divulge or refuse to share PII);
 * Consent (informed, explicit, unambiguous);
 * Context (user adjusting preferences as conditions require);
 * Confinement (data minimization and user-controlled re-use of data);
 * Consistency (user predictability of outcome of transactions)

The goal is an international Privacy by Design standard to responsibly embed Privacy by Design in online services. The 7 Cs principles also appear in Dawn’s 2005 IEEE Security and Privacy publication, along with a privacy- and rules-based architecture for user control that implemented rudimentary Vendor Relationship Management. The very important Privacy by Design principles were created by Dr. Ann Cavoukian, Information and Privacy Commissioner of Ontario. They have been translated into over 25 languages.

The OASIS TC on Privacy by Design Documentation for Software Engineers has been convened by Dawn in partnership with Commissioner Cavoukian. This TC intends to create a specification that will help software engineers embed and document privacy by design in their output at the analysis and design phases of software development. She mentioned how software engineers may use other standards, such as OASIS PMRM, to document data flows at the early analysis stage. She also showed a Visio-extended screen with icons for Privacy Services that she and her student created to help people visualize some of the tool specifications that the TC may output.

Discussion around whether privacy is an issue for the software engineer (should they have to be burdened with guidelines for enforcing PbD in their work? Would they even follow them?) or for the software development environment (should PbD rules be embedded into development environments so that an engineer cannot make mistakes and inadvertently collect or release PII?).
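The second option might look like the following sketch, assuming a hypothetical development-environment hook (the tag set and function names are invented for illustration): fields tagged as PII in a schema are scrubbed before any record can reach logs or analytics, so the engineer cannot inadvertently release them.

```python
# Hypothetical guard baked into the development environment: PII tags
# would in practice come from a data schema, not a hard-coded set.
PII_FIELDS = {"email", "phone", "ssn"}


def scrub(record: dict) -> dict:
    """Redact tagged PII values, leaving non-PII fields untouched,
    so no code path can log or release raw PII by mistake."""
    return {key: ("<redacted>" if key in PII_FIELDS else value)
            for key, value in record.items()}
```

Routing every logging call through such a guard shifts PbD enforcement from engineer discipline to the environment itself.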

Possible fit with future personal data services.