Big Data and Privacy: A Technological Perspective; Executive Office of the President, President’s Council of Advisors on Science and Technology (PCAST); 2014-05-01; 76 pages; landing.
Related
Workshops
- White House / UC Berkeley School of Information / Berkeley Center for Law and Technology; John Podesta; 2014-04-01; transcript, video.
- White House / Data & Society Research Institute / NYU Information Law Institute; John Podesta; 2014-03-17; video.
- White House / MIT; John Podesta; 2014-03-04; transcript, video.
Who
PCAST Big Data and Privacy Working Group.
- Susan L. Graham, co-chair.
- William Press, co-chair.
- S. James Gates, Jr.
- Mark Gorenberg.
- John Holdren.
- Eric S. Lander.
- Craig Mundie.
- Maxine Savitz.
- Eric Schmidt.
- Marjory S. Blumenthal, Executive Director of PCAST; coordination & framing.
PCAST
- John P. Holdren, co-chair, OSTP
- Eric S. Lander, co-chair, Broad Institute (Harvard & MIT)
- William Press, co-vice chair, U. Texas
- Maxine Savitz, co-vice chair, National Academy of Engineering
- Rosina Bierbaum, U. Michigan
- Christine Cassel, National Quality Forum
- Christopher Chyba, Princeton
- S. James Gates, Jr., U. Maryland
- Mark Gorenberg, Zetta Venture Partners
- Susan L. Graham, UCB
- Shirley Ann Jackson, Rensselaer Polytechnic
- Richard C. Levin, Yale
- Chad Mirkin, Northwestern
- Mario Molina, UCSD
- Craig Mundie, Microsoft
- Ed Penhoet, UCB
- Barbara Schaal, Washington University
- Eric Schmidt, Google
- Daniel Schrag, Harvard
Staff
- Marjory S. Blumenthal
- Michael Johnson
Recommendations
From the Executive Summary [page xiii], and also from Section 5.2 [page 49]
- Recommendation 1 [consider uses over collection activities]
Policy attention should focus more on the actual uses of big data and less on its collection and analysis.
- Recommendation 2 [no Microsoft lock-in; no national champion]
Policies and regulation, at all levels of government, should not embed particular technological solutions, but rather should be stated in terms of intended outcomes.
- Recommendation 3 [fund]
With coordination and encouragement from [The White House Office of Science and Technology Policy] OSTP, the [Networking and Information Technology Research and Development] NITRD agencies should strengthen U.S. research in privacy‐related technologies and in the relevant areas of social science that inform the successful application of those technologies.
- Recommendation 4 [talk]
OSTP, together with the appropriate educational institutions and professional societies, should encourage increased education and training opportunities concerning privacy protection, including career paths for professionals.
- Recommendation 5 [talk & buy]
The United States should take the lead both in the international arena and at home by adopting policies that stimulate the use of practical privacy-protecting technologies that exist today. It can exhibit leadership both by its convening power (for instance, by promoting the creation and adoption of standards) and also by its own procurement practices (such as its own use of privacy-preserving cloud services).
Table of Contents
- Executive Summary
- Introduction
- Context and outline of this report
- Technology has long driven the meaning of privacy
- What is different today?
- Values, harms, and rights
- Examples and Scenarios
- Things happening today or very soon
- Scenarios of the near future in healthcare and education
- Healthcare: personalized medicine
- Healthcare: detection of symptoms by mobile devices
- Education
- Challenges to the home’s special status
- Tradeoffs among privacy, security, and convenience
- Collection, Analytics, and Supporting Infrastructure
- Electronic sources of personal data
- “Born digital” data
- Data from sensors
- Big data analytics
- Data mining
- Data fusion and information integration
- Image and speech recognition
- Social‐network analysis
- The infrastructure behind big data
- Data centers
- The cloud
- Technologies and Strategies for Privacy Protection
- The relationship between cybersecurity and privacy
- Cryptography and encryption
- Well Established encryption technology
- Encryption frontiers
- Notice and consent
- Other strategies and techniques
- Anonymization or de‐identification
- Deletion and non‐retention
- Robust technologies going forward
- A Successor to Notice and Consent
- Context and Use
- Enforcement and deterrence
- Operationalizing the Consumer Privacy Bill of Rights
- PCAST Perspectives and Conclusions
- Technical feasibility of policy interventions
- Recommendations
- Final Remarks
- Appendix A. Additional Experts Providing Input
- Special Acknowledgment
Mentions
- The President’s Council of Advisors on Science and Technology (PCAST)
- PCAST Big Data and Privacy Working Group
- Enabling Event
- President Barack Obama
- Remarks, 2014-01-17
- Counselor John Podesta
- New Concerns
- Born digital vs born analog
- standardized components
- particular limited purpose vs repurposed, reused.
- data fusion
- algorithms
- inferences
- Provenance of data, recording and tracing the provenance of data
- Trusted Data Format (TDF)
Claims
- The right to forget / right to be forgotten is unenforceable and infeasible [page 48].
- Prior redress of prospective harms is a reasonable framework [page 49].
- Conceptualized via vulnerable groups who are stipulated as harmed a priori (harmed as constituted).
- Government may be forbidden from certain classes of uses, despite their being available in the private sector.
- Government is allowed some activities and powers
- Private industry is allowed some activities and powers
- It is feasible in practice to mix & match
- government coercion => private privilege => result
- private privilege => private coercion => result
Consumer Privacy Bill of Rights (CPBR)
Obligations [of service providers, as powerful organizations]
- Respect for Context => use consistent with collection context.
- Focused Collection => limited collection.
- Security => handling techniques
- Accountability => handling techniques.
Empowerments [of consumers, as individuals]
- Individual Control => control of collection, control of use.
- Transparency => of practices [by service providers]
- Access and Accuracy => right to review & edit [something about proportionality]
Definition of Privacy
The definition is unclear and evolving; privacy is frequently defined in terms of the harms incurred when it is lost.
Privacy Framework via Harms
The Prosser Harms, <quote> page 6.
- Intrusion upon seclusion. A person who intentionally intrudes, physically or otherwise (now including electronically), upon the solitude or seclusion of another person or her private affairs or concerns, can be subject to liability for the invasion of her privacy, but only if the intrusion would be highly offensive to a reasonable person.
- Public disclosure of private facts. Similarly, a person can be sued for publishing private facts about another person, even if those facts are true. Private facts are those about someone’s personal life that have not previously been made public, that are not of legitimate public concern, and that would be offensive to a reasonable person.
- “False light” or publicity. Closely related to defamation, this harm results when false facts are widely published about an individual. In some states, false light includes untrue implications, not just untrue facts as such.
- Misappropriation of name or likeness. Individuals have a “right of publicity” to control the use of their name or likeness in commercial settings.
</quote>
Adjacencies
<quote>One perspective informed by new technologies and technology‐mediated communication suggests that privacy is about the “continual management of boundaries between different spheres of action and degrees of disclosure within those spheres,” with privacy and one’s public face being balanced in different ways at different times. See: Leysia Palen, Paul Dourish; Unpacking ‘Privacy’ for a Networked World; In Proceedings of CHI 2003, Association for Computing Machinery, 2003-04-05.</quote>, footnote, page 7.
Adjacency Theory
An oppositional framework wherein harms are “adjacent to” benefits:
- Invasion of private communications
- Invasion of privacy in a person’s virtual home.
- Public disclosure of inferred private facts
- Tracking, stalking and violations of locational privacy.
- Harm arising from false conclusions about individuals, based on personal profiles from big‐data analytics.
- Foreclosure of individual autonomy or self‐determination
- Loss of anonymity and private association.
Mosaic Theory
Obliquely referenced via a quote from Justice Sotomayor:
<quote>“I would ask whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables the Government to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on.” United States v. Jones (10‐1259), Sotomayor concurrence.</quote>
Yet, not cited, but related (at least):
Definition of Roles [of data processors]
- data collectors
- data analyzers
- data users
The data generators or producers in this roles framework are substantially only customers or consumers (sic).
Definitions
- Definition of analysis versus use
- <quote>Analysis, per se, does not directly touch the individual (it is neither collection nor, without additional action, use) and may have no external visibility.
- & by contrast, it is the use of a product of analysis, whether in commerce, by government, by the press, or by individuals, that can cause adverse consequences to individuals.</quote>
- Big Data => definitions
- “[comprises data with] high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making,” attributed to Gartner Inc.
- “a term describing the storage and analysis of large and/or complex data sets using a series of techniques including, but not limited to, NoSQL, MapReduce, and machine learning,” attributed to “computer scientists” on arXiv.
Quoted
The strong, direct, unequivocal, un-nuanced, provocative language…
<quote>For a variety of reasons, PCAST judges anonymization, data deletion, and distinguishing data from metadata (defined below) to be in this category. The framework of notice and consent is also becoming unworkable as a useful foundation for policy.</quote>
<quote>Anonymization is increasingly easily defeated by the very techniques that are being developed for many legitimate applications of big data. In general, as the size and diversity of available data grows, the likelihood of being able to re‐identify individuals (that is, re‐associate their records with their names) grows substantially. While anonymization may remain somewhat useful as an added safeguard in some situations, approaches that deem it, by itself, a sufficient safeguard need updating. </quote>
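As a concrete illustration of the preceding quote, here is a minimal sketch (not from the report) of a linkage attack in Python: it joins a hypothetical “anonymized” medical extract to a hypothetical public roll on Sweeney-style quasi-identifiers (ZIP code, birth date, sex). The file names and column names are illustrative assumptions, not anything the report specifies.

import csv
from collections import defaultdict

QUASI_IDENTIFIERS = ("zip_code", "birth_date", "sex")

def index_by_quasi_identifiers(rows):
    # Group rows by their quasi-identifier tuple.
    index = defaultdict(list)
    for row in rows:
        key = tuple(row[field] for field in QUASI_IDENTIFIERS)
        index[key].append(row)
    return index

def reidentify(anonymized_rows, public_rows):
    # Yield (name, anonymized_record) pairs where the quasi-identifier
    # combination is unique in both datasets, i.e. the linkage is unambiguous.
    public_index = index_by_quasi_identifiers(public_rows)
    anon_index = index_by_quasi_identifiers(anonymized_rows)
    for key, anon_matches in anon_index.items():
        public_matches = public_index.get(key, [])
        if len(anon_matches) == 1 and len(public_matches) == 1:
            yield public_matches[0]["name"], anon_matches[0]

if __name__ == "__main__":
    # Hypothetical file names; each CSV carries the quasi-identifier columns,
    # plus "diagnosis" (the "anonymized" medical file) or "name" (the public roll).
    with open("anonymized_medical.csv", newline="") as f:
        medical = list(csv.DictReader(f))
    with open("public_voter_roll.csv", newline="") as f:
        voters = list(csv.DictReader(f))
    for name, record in reidentify(medical, voters):
        print(name, "->", record["diagnosis"])

The report’s point is exactly this mechanism: as more auxiliary datasets become available, the fraction of quasi-identifier combinations that are unique, and hence re-identifiable, grows.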
<quote>Notice and consent is the practice of requiring individuals to give positive consent to the personal data collection practices of each individual app, program, or web service. Only in some fantasy world do users actually read these notices and understand their implications before clicking to indicate their consent. <snip/>The conceptual problem with notice and consent is that it fundamentally places the burden of privacy protection on the individual. Notice and consent creates a non‐level playing field in the implicit privacy negotiation between provider and user. The provider offers a complex, take‐it‐or‐leave‐it set of terms, while the user, in practice, can allocate only a few seconds to evaluating the offer. This is a kind of market failure. </quote>
<quote>Also rapidly changing are the distinctions between government and the private sector as potential threats to individual privacy. Government is not just a “giant corporation.” It has a monopoly in the use of force; it has no direct competitors who seek market advantage over it and may thus motivate it to correct missteps. Governments have checks and balances, which can contribute to self‐imposed limits on what they may do with people’s information. Companies decide how they will use such information in the context of such factors as competitive advantages and risks, government regulation, and perceived threats and consequences of lawsuits. It is thus appropriate that there are different sets of constraints on the public and private sectors. But government has a set of authorities – particularly in the areas of law enforcement and national security – that place it in a uniquely powerful position, and therefore the restraints placed on its collection and use of data deserve special attention. Indeed, the need for such attention is heightened because of the increasingly blurry line between public and private data. While these differences are real, big data is to some extent a leveler of the differences between government and companies. Both governments and companies have potential access to the same sources of data and the same analytic tools. Current rules may allow government to purchase or otherwise obtain data from the private sector that, in some cases, it could not legally collect itself, or to outsource to the private sector analyses it could not itself legally perform. [emphasis here] The possibility of government exercising, without proper safeguards, its own monopoly powers and also having unfettered access to the private information marketplace is unsettling.</quote>
Referenced
Substantially in order of appearance in the footnotes, without repeats.
- Remarks by the President on Review of Signals Intelligence; The White House; 2014-01-17.
- Jonathan Stuart Ward, Adam Barker; Undefined By Data: A Survey of Big Data Definitions; In arXiv; 2013-09-20; 2 pages.
- David J. Seipp; The Right to Privacy in American History; Program on Information Resources Policy; Harvard University; 1978-07; 218 pages.
- Samuel D. Warren, Louis D. Brandeis; “The Right to Privacy”; Harvard Law Review; 4:5, 193; 1890-12-15.
- Publishing Personal and Private Information; Digital Media Law Project; 2008-09-08.
- Griswold v. Connecticut; 381 U.S. 479; 1965
- Olmstead v. United States; 277 U.S. 438; 1928
- McIntyre v. Ohio Elections Commission; 514 U.S. 334, 340‐41; 1995.
Cited & quoted on the position & importance of anonymous speech.
- Privacy Online: Fair Information Practices in the Electronic Marketplace; Federal Trade Commission (FTC); 2000-05-01; landing.
- Genetic Information Nondiscrimination Act of 2008; PL 110–233, 122 Stat 881; 2008-05-21.
- Privacy: The use of commercial information resellers by federal agencies; Hearing before the Subcommittee on Information Policy, Census, and National Archives of the Committee on Oversight and Government Reform, House of Representatives; One Hundred Tenth Congress; 2008-03-11; transcript; U.S. Government Printing Office.
- Having trouble proving your identity to HealthCare.gov? Here’s how the process works; staff; In Consumer Reports; 2013-12-18.
- William L. Prosser; Privacy; In California Law Review; 48:383, 389; 1960.
- Elements of an Intrusion Claim; Digital Media Law Project
- United States v. Jones; 10‐1259; Sotomayor concurrence
- Leysia Palen, Paul Dourish; Unpacking ‘Privacy’ for a Networked World; In Proceedings of CHI 2003, Association for Computing Machinery, 2003-04-05; 8 pages.
- Philip K. Dick; “The Minority Report”; In Fantastic Universe; 1956; reprinted in Selected Stories of Philip K. Dick, Pantheon; 2002.
- Dina ElBoghdady; Advertisers Tune In to New Radio Gauge; In The Washington Post; 2004-10-25.
- Staff (ACLU); You Are Being Tracked: How License Plate Readers Are Being Used To Record Americans’ Movements; American Civil Liberties Union; 2013-07; 37 pages; landing.
- How Urban Anonymity Disappears When All Data Is Tracked; Quentin Hardy; In The New York Times; 2014-04-19.
- Predictive Policing: Using Machine Learning to Detect Patterns of Crime; Cynthia Rudin; In Wired; 2013-08-22.
- Benjamin Reed Schiller; First Degree Price Discrimination Using Big Data; Brandeis University; 2014-01-30; 36 pages.
- Will Big Data Bring More Price Discrimination; Adam Ozimek; In Forbes; 2013-09-01
- William W. Fisher III; When Should We Permit Differential Pricing of Information?; In UCLA Law Review; 55:1; 2007; 38 pages
- John Burn‐Murdoch; UK technology firm uses machine learning to combat gambling addiction; In The Guardian, 2012-08-01.
- “Using Data to Stage-Manage Paths to the Prescription Counter”; Stephanie Clifford; In The New York Times; 2013-06-19.
- Attention, Shoppers: Store Is Tracking Your Cell; Stephanie Clifford; In The New York Times; 2013-07-14.
- How Companies Learn Your Secrets; Charles Duhigg; In The New York Times Magazine; 2012-02-12.
- Eugene Volokh; Outing Anonymous Bloggers; In His Blog; 2009-06-08.
- Arvind Narayanan, Hristo Paskov, Neil Zhenqiang Gong, John Bethencourt, Emil Stefanov, Eui Chul Richard Shin, Dawn Song; On the Feasibility of Internet-Scale Author Identification; In Proceedings of IEEE Symposium on Security and Privacy; 2012-05.
- The Graph API; Developer Documentation; Facebook.
- UN tackles socio‐economic crises with big data; Julia King; In Computerworld, 2013-06-03.
- This May Be The Most Vital Use Of “Big Data” We’ve Ever Seen; Neal Ungerleider; In Fast Company; 2013-07-12.
- 100 Data Innovations; Center for Data Innovations, Information Technology and Innovation Foundation, Washington, DC; 2014-01.
- Data Open Doors to Financial Innovation; Richard Waters; In Financial Times; 2013-12-13.
- Jenna Wiens, John Guttag, Eric Horvitz; A Study in Transfer Learning: Leveraging Data from Multiple Hospitals to Enhance Hospital‐Specific Predictions; In Journal of the American Medical Informatics Association; 2014-01.
- Daniel J. Weitzner, Hal Abelson, Cynthia Dwork, Cameron Kerry, Daniela Rus, Sandy Pentland, Salil Vadhan; Consumer Privacy Bill of Rights and Big Data: Response to White House Office of Science and Technology Policy Request for Information; Privacy Tools for Sharing Research Data, A National Science Foundation Secure and Trustworthy Cyberspace Project, School of Engineering and Applied Sciences, Harvard University; 2014-04-04; 15 pages; landing.
- MIT Computer Program Reveals Invisible Motion in Video; Bryant Frazer; In The New York Times (NYT); 2013-02-27; video
- PCAST; Re: Tuition pricing for college; Letter from PCAST to President Barack Obama; 2013-12; 9 pages.
tl;dr => an estimation of the MOOC phenomenon.
- Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices; U.S. Department of Education; 2014-02
A regulatory clarification
- Kenneth Cukier, Viktor Mayer-Schönberger; How Big Data Will Haunt You Forever; In Quartz; 2014-03-11.
- Nest Gives the Lowly Smoke Detector a Brain; Kenji Aoki; In Wired, 2013-10.
- Apple acquires Israeli 3D chip developer PrimeSense; Staff; In Reuters; 2013-11-25.
- Glass Gestures; product documentation; Google
- Omer Tene, Jules Polonetsky; A Theory of Creepy: Technology, Privacy and Shifting Social Norms; In Yale Journal of Law and Technology; 16:59, 2013; pages 59‐100.
- FTC Staff Revises Online Behavioral Advertising Principles; press release; Federal Trade Commission (FTC); 2009-02-12.
- What They Know; a series; In The Wall Street Journal (WSJ); WHEN?
- Joseph Turow; The Daily You: How the Advertising Industry is Defining your Identity and Your Worth; Yale University Press; 2012.
- DuckDuckGo
- The Web Cookie Is Dying. Here’s The Creepier Technology That Comes Next; Adam Tanner; In Forbes; 2013-06-17.
- Gunes Acar, Marc Juarez, Nick Nikiforakis, Claudia Diaz, Seda Gürses, Frank Piessens, Bart Preneel; FPDetective: Dusting the Web for Fingerprinters; In Proceedings of Computer and Communications Security (CCS); 2013-11-04; 13 pages; previously noted.
- Android Flashlight App Developer Settles FTC Charges It Deceived Consumers; press release, Federal Trade Commission (FTC); 2013-12-05.
- Fitbit
- Steven E. Koonin, Gregory Dobler, Jonathan S. Wurtele; Urban Physics; American Physical Society News, 2014-03; also Urban Physics; Elizabeth Thomson; In Spectrum; 2014-Winter; pdf; 24 pages.
- MIT Computer Program Reveals Invisible Motion in Video; Fredo Durand, et al.; In The New York Times; 2014-02-27; video.
- Ronen Feldman; Techniques and Applications for Sentiment Analysis; In Communications of the ACM; 56:4; 2013-04; pages 82-89.
- Viktor Mayer‐Schönberger, Kenneth Cukier; Big Data: A Revolution That Will Transform How We Live, Work, and Think; Houghton Mifflin Harcourt; 2013;
- Frontiers in Massive Data Analysis; National Research Council, National Academies Press, 2013; 129 pages; landing.
- Brent Thill, Nicole Hayashi; Big Data = Big Disruption: One of the Most Transformative IT Trends Over the Next Decade; UBS Securities LLC, 2013-10.
tl;dr => boosterism.
- Open data: Unlocking innovation and performance with liquid information; Center for Government, and Business Technology Office, McKinsey Global Institute, McKinsey & Company; 2013-10
tl;dr => boosterism.
- Quoc V. Le, Marc’Aurelio Ranzato, Rajat Monga, Matthieu Devin, Kai Chen, Greg S. Corrado, Jeff Dean, Andrew Y. Ng (Google); Building High‐level Features Using Large Scale Unsupervised Learning; In Proceedings of the 29th International Conference on Machine Learning (ICML); 2012; 11 pages
- Max Bramer; Principles of Data Mining; Springer; 2013-02-25; 454 pages; Amazon: kindle: $37, paper: $39.
- Tom M. Mitchell; The Discipline of Machine Learning; Technical Report CMU‐ML‐06‐108; Carnegie Mellon University; 2006-07; 9 pages.
- Big Mechanism; DARPA
- Solon Barocas, Helen Nissenbaum; “Big Data’s End Run Around Anonymity and Consent”; Chapter 10; In Julia Lane, et al.; Privacy, Big Data, and the Public Good; Cambridge University Press, 2014; Amazon: kindle: no, paper: $24.
- J. Manyika, et al.; “Big Data: The next frontier for innovation, competition, and productivity”; McKinsey Global Institute, 2011.
tl;dr => boosterism.
- Guillermo Navarro-Arribas, V. Torra; Information fusion in data privacy: A survey; In Information Fusion; 13:4; 2012; pages 235-244.
- B. Khaleghi, et al.; Multisensor data fusion: A review of the state‐of‐the‐art; In Information Fusion; 14:1; 2013; pp. 28‐44.
- J. Lam, et al.; Urban scene extraction from mobile ground based lidar data; In Proceedings of 3DPVT; 2010.
- S. Agarwal, et al.; “Building Rome in a Day”; In Communications of the ACM; 54:10; 2011; pages 105‐112.
- Workshop on Frontiers in Image and Video Analysis; National Science Foundation, Federal Bureau of Investigation, Defense Advanced Research Projects Agency, and University of Maryland Institute for Advanced Computer Studies, 2014-01-28 & 2014-01-29.
- Sensity
Newark Airport recently installed a system of 171 LED lights with cameras in them.
- Skybox
- Social Media Explosion: Do social networking sites threaten privacy rights?; In CQ Researcher; Thomas J. Billitteri, editor; 23:84-104; 2013-01-25.
- B.H. Juang, Lawrence R. Rabiner; Automated Speech Recognition – A Brief History of the Technology Development; 2004-10-08.
- Where Speech Recognition is Going; In Technology Review; 2012-08-29
- S. Wasserman; Social Network Analysis: Methods and Applications; Cambridge University Press; 1994; 116 pages; Amazon; kindle: $27, paper: $40.
- D. Crandall, Lars Backstrom, D. Cosley, S. Suri, D. Huttenlocher, J. Kleinberg; Inferring Social Ties from Geographic Coincidences; In Proceedings of the National Academy of Sciences; 2010.
- Lars Backstrom; Wherefore Art Thou R3579X? Anonymized Social Networks, Hidden Patterns, and Structural Steganography; In Proceedings of the International World Wide Web Conference (WWW); 2007-04-12.
- Tools for Network (Graph) Datasets
- AllegroGraph
- Cytoscape
- Gephi
- Graphviz
- Netviz
- R
- Wolfram Alpha
- visone
- Lise Getoor, E. Zheleva; Preserving the privacy of sensitive relationships in graph data; In Proceedings of Privacy, Security, and Trust KDD; 2008; pages 153‐171 (10 pages).
- Bimal Viswanath, Krishna P. Gummadi, Ansley Post, Alan Mislove; An Analysis of Social Network-Based Sybil Defenses; In Proceedings of ACM SIGCOMM Computer Communication Review; 2010-09-03; 12 pages.
- Lars Backstrom, et al.; “Find Me If You Can: Improving Geographic Prediction with Social and Spatial Proximity”; In Proceedings of the 19th International Conference on World Wide Web (WWW), 2010.
- L. Backstrom, J. Kleinberg; Romantic Partnerships and the Dispersion of Social Ties: A Network Analysis of Relationship Status on Facebook; In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW), 2014; landing.
- A. Narayanan, V. Shmatikov; De‐anonymizing social networks; In Proceedings of the 30th IEEE Symposium on Security and Privacy; 2009; pages 173‐187.
- David J. Crandall, Lars Backstrom, Dan Cosley, Siddharth Suri, Daniel Huttenlocher, Jon Kleinberg; Inferring Social Ties From Geographic Coincidences; In Proceedings of the National Academy of Sciences; 107:52, 2010; landing.
- L. Backstrom, C. Dwork, J. Kleinberg; Wherefore Art Thou R3579X? Anonymized Social Networks, Hidden Patterns, and Structural Steganography; In Proceedings of the 16th International World Wide Web Conference (WWW); 2007.
- Jari Saramäki, et al.; Persistence of Social Signatures in Human Communication; In Proceedings of the National Academy of Sciences; 111:3; 2014; pages 942-947; landing (paywalled).
- S.E. Fienberg; Is the Privacy of Network Data an Oxymoron?; In Journal of Privacy and Confidentiality; 4:2; 2013; 6 pages.
- V.E. Krebs; Mapping networks of terrorist cells; In Connections; 24:3; 2002; pages 43-52.
- Pål Roe Sundsøy, Johannes Bjelland, Geoffrey Canright, Kenth Engø-Monsen, Richard Ling; Product adoption networks and their growth in a large mobile phone network; In Advances in Social Networks Analysis and Mining (ASONAM), 2010.
- Bob Hodgson; A Vital New Marketing Metric: The Network Value of a Customer; Predictive Marketing; Tag: Optimize Your ROI With Analytics.
- Lars Backstrom, et al.; “Find me if you can: improving geographical prediction with social and spatial proximity”; In Proceedings of the 19th International Conference on World Wide Web (WWW), 2010.
- Top 20 social media monitoring vendors for business; Socialmedia.biz; 2011-01-12.
- The 21st Century Data Center: An Overview; Charles McLellan; In ZDNet; 2013-04-02.
- Apache Accumulo
- AMPLab software
- Big Data Working Group: Comment on Big Data and the Future of Privacy; Cloud Security Alliance; 2014-03.
- Han Qi, Abdullah Gani; Research on mobile cloud computing: Review, Trend and Perspectives; In Proceedings of the Second International Conference on Digital Information and Communication Technology and Its Applications (DICTAP); 2012-06-12; 8 pages; landing.
- K. Jeffery, et al.; “A vision for better cloud applications”; In Proceedings of the 2013 International Workshop on Multi‐Cloud Applications and Federated Clouds (MODAClouds), 2013-04-22;
- Immediate Opportunities for Strengthening the Nation’s Cybersecurity; a report; PCAST; 2013.
- PCAST has addressed issues in cybersecurity, both in reviewing the NITRD programs and directly in a 2013 report.
- Who Goes There: Authentication Through the Lens of Privacy; Computer Science and Telecommunications Board, National Academies Press, 2003; promotion; 13 slides; landing.
- Travis D. Breaux, Ashwini Rao; Formal Analysis of Privacy Requirements Specifications for Multi‐Tier Applications; In Proceedings of the 21st IEEE Requirements Engineering Conference (RE 2013); 2013-07.
- Joan Feigenbaum, Aaron D. Jaggard, Rebecca N. Wright; Towards a Formal Model of Accountability; In Proceedings of the New Security Paradigms Workshop 2011; 2011-09-12.
- Carl Landwehr; “Engineered Controls for Dealing with Big Data”; Chapter 10; In Privacy, Big Data, and the Public Good; Julia Lane, Victoria Stodden, Stefan Bender, Helen Nissenbaum, editors; Cambridge University Press, 2014.
- Fred P. Brooks; No silver bullet – Essence and Accidents of Software Engineering; In IEEE Computer; 20:4; 1987-04; pages 10‐19.
- Brian Krebs; Collected Posts On the Target Data Breach; In His Blog entitled Krebs on Security; 2014.
- Dennis Fisher; Final Report on DigiNotar Hack Shows Total Compromise of CA Servers; In ThreatPost; 2012-10-31.
- Tony Bradley; VeriSign Hacked: What We Don’t Know Might Hurt Us; In PC World; 2012-02-02.
- Someone (EFF); Encrypt the Web Report: Who’s Doing What; In Their Blog; 2013-11.
- Whitfield Diffie, Paul C. Van Oorschot, Michael J. Wiener; Authentication and Authenticated Key Exchanges; In Designs, Codes and Cryptography; 2:2; 1992; pages 107-125.
- Cynthia Dwork; Differential Privacy; In Proceedings of the 33rd International Colloquium on Automata, Languages and Programming, 2006.
- Cynthia Dwork; A Firm Foundation for Private Data Analysis; In Communications of the ACM; 54:1; 2011; landing.
- Susan E. Gindin; Nobody Reads Your Privacy Policy or Online Contract: Lessons Learned and Questions Raised by the FTC’s Action against Sears; In Northwestern Journal of Technology and Intellectual Property; 1:8; 2009‐2010.
- Response to Request for Information Filed by U.S. Public Policy Council of the Association for Computing Machinery; Association for Computing Machinery (ACM); 2014-03.
- Sweeney, et al., Identifying Participants in the Personal Genome Project by Name; White Paper 1021‐1, Data Privacy Lab, Harvard University; 2013-04-25.
- Ryan Whitwam; Snap Save for iPhone Defeats the Purpose of Snapchat, Saves Everything Forever; In PC Magazine; 2013-08-12.
- Hal Abelson, Lalana Kagal; Access Control is an Inadequate Framework for Privacy Protection; 2013-07-12.
- Craig Mundie; “Privacy Pragmatism: Focus on Data Use, Not Data Collection”; In Foreign Affairs, 2014-03/2014-04; regwalled.
- Helen Nissenbaum; Privacy in Context: Technology, Policy, and the Integrity of Social Life; Stanford Law Books; 2009-11-24; 304 pages.
- Daniel J. Weitzner, Harold Abelson, Tim Berners-Lee, Joan Feigenbaum, James Hendler, Gerald Jay Sussman; Information Accountability; In Communications of the ACM; 2008-06; pages 82‐87.
- Michael Carl Tschantz, Anupam Datta, Jeannette M. Wing; Formalizing and Enforcing Purpose Restrictions in Privacy Policies; Carnegie Mellon University; 2012;
- CyLab Usable Privacy and Security Laboratory, Lorrie Cranor, director.
- Proceedings of the 2nd International Workshop on Accountability: Science, Technology and Policy, MIT Computer Science and Artificial Intelligence Laboratory, 2014-01-29 & 30.
- Oracle’s eXtensible Access Control Markup Language (XACML) for identity management systems; cited as personal communication: Mark Gorenberg and Peter Guerra of Booz Allen.
- IC CIO Enterprise Integration & Architecture: Trusted Data Format; Office of the Director of National Intelligence; 2014?
- OpenStack
- NITRD Agencies List; Networking and Information Technology Research and Development (NITRD) program.
- Designing a Digital Future: Federally Funded Research and Development in Networking and Information Technology; Networking and Information Technology Research and Development (NITRD).
- Report on Privacy Research Within NITRD [Networking and Information Technology Research and Development]; National Coordination Office for NITRD; Federal Networking and Information Technology Research and Development Program; 2014-04-23.
- The Secure and Trustworthy Cyberspace Program of the National Science Foundation (NSF)
- Joint Solicitation for Privacy-Related Research; Directorates for Computer and Social Science; National Science Foundation (NSF); 2013-12.
- Security and Privacy Assurance Research (SPAR); at IARPA
- National Strategy for Trusted Identities in Cyberspace (NSTIC)
- W.A. Pike, et al., “PNNL [Pacific Northwest National Laboratory] Response to OSTP Big Data RFI,” 2014-03
- Curriculum Guidance (link failure); Association for Computing Machinery; 2013.
- FedRAMP, a federal program for certifying cloud services
Privacy Threshold Analysis and Privacy Impact Assessment, a guidance document.
- Privacy Threshold Analysis
- Privacy Impact Assessment
- The Office of the U.S. Chief Information Officer
Via: backfill, backfill
Snide
And yet, even with all the letters and the professional editing and technical-writing staff available to this national- and historical-level enterprise, we still see [Footnote 101, page 31]:
Qi, H. and A. Gani, “Research on mobile cloud computing: Review, trend and perspectives,” Digital Information and Communication Technology and it’s Applications (DICTAP), 2012 Second International Conference on, 2012.
The correct listing is at Springer:
Digital Information and Communication Technology and Its Applications; International Conference, DICTAP 2011, Dijon, France, June 21-23, 2011; Proceedings, Part I; Series: Communications in Computer and Information Science, Vol. 166; Cherifi, Hocine, Zain, Jasni Mohamad, El-Qawasmeh, Eyas (Eds.); 2011; XIV, 806 p.
But:
- it’s → is a contraction for it is
- its → is a possessive
Ergo: s/it's/its/g;