Overview of the Digital Object Architecture (DOA) | ISOC

Overview of the Digital Object Architecture (DOA); an Information Paper, The Internet Society; 2016-10-25; 8 pages; landing.
contributor credit: Chip Sharp, scrivener


  • Introduction
  • What is the Digital Object Architecture?
  • What is a Handle?
  • Handle Resolution
  • Who Runs the Global Handle System?
  • Examples of systems based on the Digital Object Architecture/Handle System
  • Standards and the Handle System
  • Policy Considerations
  • Trademarks and Service Marks
  • Resources


<quote>The Digital Object Architecture (DOA) and associated Handle System® originated at the Corporation for National Research Initiatives (CNRI) in the early 1990s, based on its work on digital libraries under contract for the Defense Advanced Research Projects Agency (DARPA). One of the original motivations for its design was the need to identify and retrieve information over long periods of time (on the order of tens or hundreds of years), so persistence was a critical design requirement. At the time it was developed, the Digital Object Architecture was an attempt to shift from a view of the Internet as organized around a set of hosts and the transport to reach them to a view in which the Internet was organized around the discovery and delivery of information in the form of digital objects.</quote>


  • Digital Object Architecture (DOA)
  • associated Handle System®
    Yes, that’s a registration mark, ®.
  • Corporation for National Research Initiatives (CNRI)
  • Defense Advanced Research Projects Agency (DARPA)
  • um, like “digital libraries”
  • DONA Foundation
  • International Telecommunication Union (ITU)
  • Memoranda of Understanding (MOUs)
  • Multi-Primary Administrators (MPAs), which manage the partitions of the namespace served by the root servers of the top-level GHR.
  • Policy Development Process (PDP)


Digital Objects
The records: blobs of bits.
The containers of records: objects.
The names: globally scoped, universal, persistent.
There is a Handle Protocol, in (at least) version v2.1.
Resolution System and Registries
Like DNS, but different; maps handles (names) to their typed values.

  • Global Handle Registry (GHR) is the root service.
  • Local Handle Services (LHS) are the regional delegates.
The names are, e.g., GUIDs, UUIDs:
  • unique
  • persistent
  • location-agnostic
  • taxonomy-agnostic
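The two-tier GHR/LHS resolution above can be sketched as a toy lookup. All hostnames and registry contents here are hypothetical illustrations; the real wire protocol is RFC 3652 and is binary and far richer.

```python
# Toy two-step handle resolution, analogous to DNS delegation.
# Step 1: the Global Handle Registry (GHR) maps a handle's prefix to the
#         Local Handle Service (LHS) responsible for it.
# Step 2: that LHS maps the full handle to its typed values.
# Hostnames and registry contents are hypothetical illustrations.

GHR = {
    "10.1000": "lhs.doi.example",    # a DOI naming authority (under prefix 10)
    "21.T11148": "lhs.epic.example", # an ePIC naming authority (under prefix 21)
}

LHS = {
    "lhs.doi.example": {
        "10.1000/182": [("URL", "https://doi.example/handbook")],
    },
}

def resolve(handle: str):
    prefix, _, _ = handle.partition("/")
    service = GHR[prefix]                          # step 1: delegate to the LHS
    values = LHS.get(service, {}).get(handle, [])  # step 2: query that LHS
    return service, values
```

Note the resemblance to a DNS referral chain: the GHR answers only "who serves this prefix," never the handle's values themselves.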


Defined as a string:
prefix “/” identifier
prefix is “like” a reversed FQDN.
identifier is “like” a filename on the FQDN so referenced.
Thus handles are

  • unique
  • persistent
  • location-agnostic
  • taxonomy-agnostic

For the FQDN of the U.S. Library of Congress (LOC):
FQDN: ye-auguste-national-librarye.loc.gov
Handle: gov.loc.ye-auguste-national-librarye
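Mechanically, a handle splits at the first “/”; a minimal sketch (the example handles in the comments are illustrative):

```python
def split_handle(handle: str):
    """Split a handle into (prefix, identifier) at the first '/'.

    Prefixes may contain dots (derived prefixes), so only the first '/'
    delimits; identifiers may themselves contain further '/' characters.
    """
    prefix, sep, identifier = handle.partition("/")
    if not sep:
        raise ValueError(f"not a handle: {handle!r}")
    return prefix, identifier

# e.g. split_handle("10.1000/182") yields the pair ("10.1000", "182")
```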


  • Coalition for Handle Services (ETIRI, CDI and CHC)
  • Communications and Information Technology Commission (CITC)
  • Corporation for National Research Initiatives (CNRI)
  • Gesellschaft für Wissenschaftliche Datenverarbeitung mbH Göttingen (GWDG)/ePIC
  • International DOI Foundation (IDF)


DOI® System
  • International DOI Foundation (IDF)
    UK, non-profit, 1998
  • Prefix 10
  • Handle semantics are idiosyncratic, known only to themselves.
  • Examples
Persistent Identifier Consortium for eResearch (ePIC)
  • [for the benefit of the] European Research Community
  • Prefix 21
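Both namespaces live under the same Global Handle Registry, and in practice a handle can be dereferenced through a public web proxy such as hdl.handle.net (DOIs, prefix 10, also via doi.org). A sketch of the URL construction; the proxy hosts are real, the behavior is simplified:

```python
def proxy_url(handle: str, proxy: str = "hdl.handle.net") -> str:
    """Build the web-proxy URL that resolves a handle over HTTP.

    hdl.handle.net fronts the Handle System generally; doi.org fronts
    the DOI namespace (prefix 10) specifically.
    """
    return f"https://{proxy}/{handle}"
```

For example, `proxy_url("10.1000/182")` builds the proxy URL for the DOI of the DOI Handbook itself.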


RFC 3650
S. Sun, L. Lannom, B. Boesch. Handle System Overview, RFC 3650, 2003-11.
RFC 3651
S. Sun, S. Reilly, L. Lannom. Handle System Namespace and Service Definition, RFC 3651, 2003-11.
RFC 3652
S. Sun, S. Reilly, L. Lannom, J. Petrone. Handle System Protocol (ver 2.1) Specification, RFC 3652, 2003-11.
RFC 4452
H. Van de Sompel, T. Hammond, E. Neylon, S. Weibel. The “info” URI Scheme for Information Assets with Identifiers in Public Namespaces, RFC 4452, 2006-04.
ISO 26324:2012
International Organization for Standardization (ISO), “ISO 26324:2012 Information and documentation — Digital object identifier system”, ISO Standard 26324, 2012-06.
ANSI/NISO Z39.84-2005 (R2010)
ANSI/NISO Z39.84-2005 (R2010) Syntax for the Digital Object Identifier. (revised 2010)
ITU-T Recommendation X.1255, Framework for discovery of identity management information, ITU-T, 2014

Trademarks and Service Marks

International DOI Foundation, Inc.
DOI, DOI.ORG, “short DOI” are registered service marks of …
DONA Foundation
DONA, GLOBAL HANDLE REGISTRY, HANDLE SYSTEM are registered service marks of…
Corporation for National Research Initiatives (CNRI)
HANDLE.NET, HDL, HDL.NET, CNRI are registered service marks of …
HDL, HDL.NET are registered trademarks of …
Internet Society
Internet Society is a registered service mark of the …



  • Documents at the DONA Foundation
  • open-stand.org
  • ANSI/NISO Z39.84-2005 (R2010) Syntax for the Digital Object Identifier. (revised 2010)
  • Corporation for National Research Initiatives, Overview of the Digital Object Architecture, July 28, 2012
  • DONA Foundation. DONA Foundation Statutes. Geneva. 2014.
  • International Organization for Standardization (ISO), “ISO 26324:2012 Information and documentation — Digital object identifier system”, ISO Standard 26324, 2012-06.
  • ITU-T Recommendation X.1255, Framework for discovery of identity management information, ITU-T, 2014.
  • R. Kahn, R. Wilensky. “A Framework for Distributed Digital Object Services”, In International Journal of Digital Libraries (2006) 6: 115.
  • Norman Paskin. The Digital Object Identifier: From Ad Hoc to National to International. In The Critical Component: Standards in the Information Exchange Environment, edited by Todd Carpenter, ALCTS, 2015.
  • Peter J. Denning & Robert E. Kahn, The Long Quest for Universal Information Access. In Communications of the ACM, Vol. 53 No. 12, Pages 34-36.
  • RFC 3650. S. Sun, L. Lannom, B. Boesch. Handle System Overview, IETF, RFC 3650, 2003-11.
  • RFC 3651. S. Sun, S. Reilly, L. Lannom. Handle System Namespace and Service Definition, IETF, RFC 3651, 2003-11.
  • RFC 3652. S. Sun, S. Reilly, L. Lannom. J. Petrone, Handle System Protocol (ver 2.1) Specification, IETF, RFC 3652, 2003-11.
  • RFC 4452. H. Van de Sompel, T. Hammond, E. Neylon, S. Weibel, The “info” URI Scheme for Information Assets with Identifiers in Public Namespaces, IETF, RFC 4452, 2006-04.

Previously filled.

Surviving on a Diet of Poisoned Fruit: Reducing the National Security Risks of America’s Cyber Dependencies | Danzig (CNAS)

Richard J. Danzig; Surviving on a Diet of Poisoned Fruit Reducing the National Security Risks of America’s Cyber Dependencies; Center for a New American Security; 2014-07; 64 pages; landing.

tl;dr → a metaphor for an ambivalent relationship with the technical platforms upon which all things depend; writ large, the relationship with a supply chain that we do not control and that is inimical to our interests.

Executive Summary

Digital technologies, commonly referred to as cyber systems, are a security paradox: Even as they grant unprecedented powers, they also make users less secure. Their communicative capabilities enable collaboration and networking, but in so doing they open doors to intrusion. Their concentration of data and manipulative power vastly improves the efficiency and scale of operations, but this concentration in turn exponentially increases the amount that can be stolen or subverted by a successful attack. The complexity of their hardware and software creates great capability, but this complexity spawns vulnerabilities and lowers the visibility of intrusions. Cyber systems’ responsiveness to instruction makes them invaluably flexible; but it also permits small changes in a component’s design or direction to degrade or subvert system behavior. These systems’ empowerment of users to retrieve and manipulate data democratizes capabilities, but this great benefit removes safeguards present in systems that require hierarchies of human approvals. In sum, cyber systems nourish us, but at the same time they weaken and poison us.

The first part of this paper illuminates this intertwining. The second part surveys the evolution of strategies to achieve greater cybersecurity. Disadvantaged by early design choices that paid little attention to security, these strategies provide some needed protection, especially when applied collectively as a coordinated “defense in depth.” But they do not and never can assure comprehensive protection; these strategies are typically costly, and users will commonly choose to buy less security than they could obtain because of the operational, financial or convenience costs of obtaining that security.

Three other factors, discussed in Section V, amplify cyber insecurity. First, the cyber domain is an area of conflict. Cyberspace is adversarial, contested territory. Our adversaries (including criminals, malevolent groups and opposing states) co-evolve with us. The resulting ecosystem is not static or stable. Second, the speed of cyber dissemination and change outpaces our recognition of problems and adoption of individual and societal safeguards to respond to them. Protective actions are likely to continue to lag behind security needs. Third, in cyberspace America confronts greater-than customary limits to U.S. government power because of the global proliferation of cyber capabilities, cyber attackers’ ability to remain outside the United States even while operating within the country’s systems and our likely inability, over the long term, to avoid technological surprise. Two-thirds of a century of technological dominance in national security matters has left the United States intuitively ill-prepared for technology competitions that it probably will not continue to dominate and in which there is a high likelihood of surprise.

What then is to be done? The concluding part of this paper does not attempt to recapitulate or evaluate efforts now extensively debated or in progress. It focuses instead on recommending initiatives that deserve fresh attention from U.S. government decision-makers. These include:

  1. Articulate a national security standard defining what it is imperative to protect in cyberspace. The suggested standard is: “The United States cannot allow the insecurity of our cyber systems to reach a point where weaknesses in those systems would likely render the United States unwilling to make a decision or unable to act on a decision fundamental to our national security.” A more stringent standard may later be in order, but this standard can now secure a consensus, illuminate the minimum that the United States needs to do and therefore provide an anvil against which the nation can hammer out programs and priorities.
  2. Pursue a strategy that self-consciously sacrifices some cyber benefits in order to ensure greater security for key systems on which security depends. Methods for pursuing this strategy include stripping down systems so they do less but have fewer vulnerabilities; integrating humans and other out-of-band (i.e., non-cyber) factors so the nation is not solely dependent on digital systems; integrating diverse and redundant cyber alternatives; and making investments for graceful degradation. Determining the trade-offs between operational loss and security gain through abnegating choices will require and reward the development of a new breed of civilian policymakers, managers and military officers able to understand both domains.
  3. Recognize that some private-sector systems fall within the national security standard. Use persuasion, federal acquisition policies, subsidy and regulation to apply the abnegating approach to these systems. While doing this, reflect an appreciation of the rapidity of cyber change by focusing on required ends while avoiding specification of means. Refrain from regulating systems that are not critical.
  4. Bolster cyber strategic stability between the United States and other major nation-states by seeking agreement on cyber constraints and confidence-building measures. As an early initiative of this kind, focus on buttressing the fragile norm of not using cyber as a means of physical attack between China, Russia and the United States.
  5. Evaluate degradation in the sought-after certainties of mutually assured destruction (MAD) as a result of uncertainties inherent in cyber foundations for nuclear command, control and attack warning. If we are moving to a regime of mutually unassured destruction (MUD), suggest to China and Russia that we are all becoming less secure. Then pursue agreements that all parties refrain from cyber intrusions into nuclear command, control and warning systems.
  6. Map the adversarial ecosystem of cyberspace in anthropological detail with the aim of increasing our understanding of our adversaries and our own incentives and methods of operation.
  7. Use the model of voluntary reporting of near-miss incidents in aviation to establish a data collection consortium that will illuminate the character and magnitude of cyber attacks against the U.S. private sector. Use this enterprise as well to help develop common terminology and metrics about cybersecurity.
  8. Establish a federally funded research and development center focused on providing an elite cyber workforce for the federal government. Hire that workforce by cyber competition rather than traditional credentials, and promote, train, retain and assign (including to the private sector) that workforce by standards different from those currently used in federal hiring.

Previously filled.

Some hard questions for the self-identifying Open Source Community

The Questions

In the form of provocative statements against which a panel of Greybeards and Elders could react in a plenary session. Presented in the style of Nine Point Five Theses.

1. On the continued inability to make a living “performing” Open Source

There have been a few very notable failures in Open Source of late, which have been attributed to the inability of anyone (anyone at all) to fund the continued development and maintenance of the critical componentry of the internet. The failure to fund has manifested in the core contributors being unable to fund even a meager lifestyle based upon their work in Open Source. One can point to openssl (as a group of individuals) [1], pgp (gpg) [2], ntp [3], at least. One can assert that this is “all different now” with the Core Infrastructure Initiative [4][5], but is it? Why?

  1. Tech giants, chastened by Heartbleed, finally agree to fund OpenSSL; Jon Brodkin; In Ars Technica; 2014-04-24.
    Teaser: IBM, Intel, Microsoft, Facebook, Google, and others pledge millions to open source.
  2. The Open-Source Question; In Slate; 2015-02-12.
    Teaser: Some of the Web’s most important infrastructure is barely funded. How can we preserve it?
  3. NTP’s Fate Hinges On ‘Father Time’; Charles Babcock; In InformationWeek; 2015-03-11.
    Teaser: The Network Time Protocol provides a foundation to modern computing. So why does NTP’s support hinge so much on the shaky finances of one 59-year-old developer?
  4. Core Infrastructure Initiative
  5. OpenSSL, OpenSSH, NTP Get Funding From Core Infrastructure Initiative; Eduard Kovacs; In Security Week; 2014-05-30.

2. Open Source is priced right (free) because testing is omitted (end-users, consumers, are the testers)

One claim about open source software is that it is so cheap (free) because the quality is commensurate with the price. With all bugs being shallow to the many eyes [1], every consumer becomes front-line on the quality assurance team. Similarly, the criticism from the pundits follows the line that “if someone isn’t paying you to do it, then it is a hobby.” Now we are hearing that the cost and efficiency gains of open source software are being replicated in the hardware realm as well [2], namely by cutting the cost out of the system by making the end users do the testing; i.e. skipping the testing cycles altogether. With hardware and software combinations being increasingly composed into (human-)life-critical applications, this seems like an unwise direction. Should one use, could one use, open source in a home automation rig, in a drive-by-wire car, in a fly-by-wire airplane, in a power plant, in a nuclear power plant? Are there still areas where the open source model is not appropriate; i.e. where quality “really matters” or where trade secrecy really is important (because lives are at stake, property is at stake, “think of the children”, etc.)?

  1. Linus’ Law; In Jimi Wales Wiki.
    Stated: “given enough eyeballs, all bugs are shallow”; restated: “Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix will be obvious to someone.”
    Attributed to Eric Raymond, The Cathedral and the Bazaar, 1999.
  2. Open Compute Project testing is a ‘complete and total joke’; Chris Mellor; In The Register; 2015-07-07.
    Teaser: Source questions integrity, neutrality of certification programme

3. Graphics in Open Source is second class, by corollary X11 will never be bettered, will it?

Why is it that the open source world has never produced a successor to the graphics stack of X11? A view is that if one wants “kickass” graphics, one has to go closed source (i.e. Microsoft with Windows drivers). One is always hearing that there are new improvements to the open source versions of various graphics drivers, but that they are a ways off from the baseline (i.e. from Windows). One hears of fewer total, utterly unworkable incompatibilities nowadays, but the essence remains: the closed source stuff is “better.” Why?

4. Open Source has been captured by copyrighting the very APIs that it uses

Apparently, from legal jurisprudence, one can create and copyright Application Programming Interfaces (APIs) within a programming language. Is it still reasonable to believe that one can create valid open source based upon such closed interfaces? Of course, I’m thinking about Android’s application interfaces being based upon Java, whose owner has asserted various copyright claims. Is it unreasonable to believe that “true open source” cannot be written in terms of “locked APIs?” Wasn’t guarding against just such an eventuality the whole impetus behind the open source movement in the first place?

5. Open Source OSes live at the behest of BIOS and Unified Extensible Firmware Interface (UEFI) will squeeze it away

What will become of the open source operating systems when their boot images must be signed with a Microsoft key? <quote>Microsoft refuses to sign binaries distributed under certain open source licenses, including the GPLv3, which GRUB 2 and rEFInd both use.</quote> The “shim trick” seems like a slim kindness that will go away at some point, and soon. Is it unreasonable to believe that at some point they will no longer allow [your] work to boot on available hardware? A rebuttal could be that open source could live on within virtual machine containers, but that doesn’t feel very comforting.

  1. James Bottomley; Linux Foundation Secure Boot Released; In His Blog; 2013-02-09.
  2. James Bottomley; Adventures in Microsoft UEFI Signing; In His Blog; 2012-11-20.
  3. Microsoft UEFI Certification Authority; Jeremiah Cox (Microsoft); At UEFI PlugFest; 2013-09-19; 25 slides.
  4. UEFI Secure Boot: Big Hassle, Questionable Benefit; Carla Schroder; In Linux.com (Magazine); 2012-06-12.
  5. Matthew Garrett; UEFI Secure Booting; In His Blog; 2011-09-20.

6. Closed Source has money and they Play Rough

Consider the case where an open source project achieves success and a large user base only to be taken “off the market” by a closed source competitor. Not immediately of course, but slowly over time the open source (community version) drifts and atrophies. I’m particularly thinking of the MySQL case here, but there may be others. Is this just fair game in the art and science of businesses in a closed-end marketplace?

7. Like crabs in a bucket they are…

systemd, upstart, sysvinit, /etc/rc.local? Tomato-tomaaaato, let’s call the whole thing off. Don’t you [panelists] think that this sort of squabbling keeps open source software “small” and “hobbyist?”

8. Open Source has hit ‘Peak Toolchain’


As a counterpoint, though, the Java culture folks will point out that with their JIT compiler, VM technology and multi-tenant mega-container client-server architectures (J2EE), they simply don’t run into the class of problems that Docker and container systems attempt to address. No remediation needed; they just don’t have the problem.

So a question to the panel is whether the open source operating system world has hit “Peak Complexity” and every move after this is merely staving off the ultimate collapse of the whole regime. Even Koji, the Fedora build system, is really a sourdough system, where one must seed the system with the packages & components of the last known working release and build/rebuild to a fixed point in the new stack (kernel, glibc, systemd, service daemons, applications, window system, etc.).

The original reference is 18 months old and the direct criticism may have been addressed. Red Hat now sponsors Project Atomic (Atomic App, Nulecule, Atomic Host).

An appropriate response to the direct technical criticism of Docker, in a 5-minute response from a panelist, would be that these are the normal teething pains of a new technology. They should be more solid in the future. “Let’s move on.” But the greater criticism, of addressing dependencies and complexity to stave off impending collapse, is worthy of a plenary panel.

In real life … it is best to wait until these get solved, and the nested, nasty dependencies only get worse.

For example, we have today’s emergency-mode announcement from OpenSSL of CVE-2015-1793 (2015-07-09): they have repudiated OpenSSL 1.0.2 and prior outright, and the whole wide world of the internet must upgrade to the new code that has no bugs. Yet for most systems it simply isn’t possible to “upgrade,” because of the dependency issues: we will have to upgrade from bare metal upwards to get access to the new OpenSSL, and that means upgrading storage systems, databases, networks, and applications. Some systems just won’t be upgraded.



  • Alexander Larsson; Adventures in Docker land; In GNOME Blog; 2013-10-15.
    <quote>Unfortunately Docker relies on AUFS, a union filesystem that is not in the upstream kernel, nor is it likely to ever be there. Also, while AUFS is in the current Ubuntu kernel it is deprecated there and will eventually be removed. This means Docker doesn’t run on Fedora which has a primarily-upstream approach to packaging.</quote>

9. An Open Source strategy is basically “beggar thy neighbor”

There exists thinking in technology strategy circles, and in cocktail party circles (same thing really, no?), that Open Source is a weapon that must be used appropriately and sparingly.  The thinking runs like this:

Use open source
  • for the peripheral competency of your business; think: operating systems of “the cloud.”
  • to devalue the core competency of your competitor’s business; think: Hadoop vs. (Google) MapReduce.
Use closed source (trade secrets)
  • for the core competency of your business; keep how you add value to yourself.

This mode of thinking explains much about open source and how it behaves in the marketplace.  Rebut, assent or explain.

9.5 Software-as-a-Service (SaaS) is just a way of packaging Open Source, now isn’t it?

Somewhat of a softball here. SaaS forces payments to happen on a regular basis. But the thinking is also aligned with the Peak Toolchain concepts.

ATIS Open Web Alliance

Open Web Alliance of ATIS


  • These are the telecoms and network infrastructure vendors.
  • They have jurisdictional obligations to monitor the traffic on their networks; cf. the Communications Assistance for Law Enforcement Act (CALEA) and other in-country operating laws, regulations, rules.
  • They base their future business aspirations on Deep Packet Inspection/Insertion (DPI), which they call Value Added Services.
  • In particular they are looking to track-and-target persons on their network for at least advertising purposes.
  • In particular, they are looking to insert ad creative into the entertainment-type network traffic.
  • They hope to avoid being low-value “dumb pipes” supporting a high-value “smart edge.”
  • A web with universal E2E crypto breaks this.
  • They are against end-to-end encryption
  • They are against ubiquitous SSL/TLS
  • They are against SPDY
  • They are against HTTP/2.0 (which will require e2e crypto)
  • They are substantially an anti-Google axis (yet Google is a member)
  • Their counterproposal is Open Transparent Proxy
    • They crack the SSL at their edge.
    • From their proxy head end, they will transport the communications to their upline obligations: government, advertisers, vendors, monitors, and also to the consumer’s desired end point.
    • Value Added Services (VAS)


(conveners, staff)

  • Sanjay Mishra
    Distinguished Member of the Technical Staff, Network Infrastructure Planning
    Corporate Technology
  • Kevin Shatzkamer
    Distinguished Architect
    Mobility, Web and Media
  • Jim McEachern
    Senior Technology Consultant


  • Alcatel-Lucent
  • AT&T
  • Cisco
  • Ericsson
  • Google
  • GSMA
  • Hitachi
  • Hughes
  • iconectiv
  • Intrado
  • Leidos
  • NTT
  • Openwave Mobility
  • Orange
  • Rogers
  • TDS
  • Time Warner Cable
  • T-Mobile
  • Verizon





Big Data and Privacy: A Technological Perspective | PCAST

Big Data and Privacy: A Technological Perspective; Executive Office of the President, President’s Council of Advisors on Science and Technology (PCAST); 2014-05-01; 76 pages; landing.



  • White House / UC Berkeley School of Information / Berkeley Center for Law and Technology; John Podesta; 2014-04-01; transcript, video.
  • White House / Data & Society Research Institute / NYU Information Law Institute; John Podesta; 2014-03-17; video.
  • White House / MIT; John Podesta; 2014-03-04; transcript, video.


PCAST Big Data and Privacy Working Group.
  • Susan L. Graham, co-chair.
  • William Press, co-chair.
  • S. James Gates, Jr.,
  • Mark Gorenberg,
  • John Holdren,
  • Eric S. Lander,
  • Craig Mundie,
  • Maxine Savitz,
  • Eric Schmidt.
  • Marjory S. Blumenthal, Executive Director of PCAST; coordination & framing.


  • John P Holdren, co-chair, OSTP
  • Eric S. Lander, co-chair, Broad Institute (Harvard&MIT)
  • William Press, co-vice chair, U. Texas
  • Maxine Savitz, co-vice chair, National Academy of Engineering
  • Rosina Bierbaum, U. Michigan
  • Christine Cassel, National Quality Forum
  • Christopher Chyba, Princeton
  • S. James Gates, Jr., U. Maryland
  • Mark Gorenberg, Zetta Venture Partners
  • Susan L. Graham, UCB
  • Shirley Ann Jackson, Rensselaer Polytechnic
  • Richard C. Levin, Yale
  • Chad Mirkin, Northwestern
  • Mario Molina, UCSD
  • Craig Mundie, Microsoft
  • Ed Penhoet, UCB
  • Barbara Schaal, Washington University
  • Eric Schmidt, Google
  • Daniel Schrag, Harvard


  • Marjory S. Blumenthal
  • Michael Johnson


From the Executive Summary [page xiii], and also from Section 5.2 [page 49]

  • Recommendation 1 [consider uses over collection activities]
    Policy attention should focus more on the actual uses of big data and less on its collection and analysis.
  • Recommendation 2 [no Microsoft lockin; no national champion]
    Policies and regulation, at all levels of government, should not embed particular technological solutions, but rather should be stated in terms of intended outcomes.
  • Recommendation 3 [fund]
    With coordination and encouragement from [The White House Office of Science and Technology Policy] OSTP, the [Networking and Information Technology Research and Development] NITRD agencies should strengthen U.S. research in privacy‐related technologies and in the relevant areas of social science that inform the successful application of those technologies.
  • Recommendation 4 [talk]
    OSTP, together with the appropriate educational institutions and professional societies, should encourage increased education and training opportunities concerning privacy protection, including career paths for professionals.
  • Recommendation 5 [talk & buy]
    The United States should take the lead both in the international arena and at home by adopting policies that stimulate the use of practical privacy‐protecting technologies that exist today. It can exhibit leadership both by its convening power (for instance, by promoting the creation and adoption of standards) and also by its own procurement practices (such as its own use of privacy‐preserving cloud services)

Table of Contents

  1. Executive Summary
  2. Introduction
    1. Context and outline of this report
    2. Technology has long driven the meaning of privacy
    3. What is different today?
    4. Values, harms, and rights
  3. Examples and Scenarios
    1. Things happening today or very soon
    2. Scenarios of the near future in healthcare and education
    3. Healthcare: personalized medicine,
    4. Healthcare: detection of symptoms by mobile devices
    5. Education
    6. Challenges to the home’s special status
    7. Tradeoffs among privacy, security, and convenience
  4. Collection, Analytics, and Supporting Infrastructure
    1. Electronic sources of personal data
      1. “Born digital” data
      2. Data from sensors
    1. Big data analytics
      1. Data mining
      2. Data fusion and information integration
      3. Image and speech recognition
      4. Social‐network analysis
    2. The infrastructure behind big data
      1. Data centers
      2. The cloud
  5. Technologies and Strategies for Privacy Protection
    1. The relationship between cybersecurity and privacy
    2. Cryptography and encryption
      1. Well Established encryption technology
      2. Encryption frontiers
    3. Notice and consent
      1. Other strategies and techniques
        1. Anonymization or de‐identification
        2. Deletion and non‐retention
    4. Robust technologies going forward
      1. A Successor to Notice and Consent
      2. Context and Use
      3. Enforcement and deterrence
      4. Operationalizing the Consumer Privacy Bill of Rights
  6. PCAST Perspectives and Conclusions
    1. Technical feasibility of policy interventions
    2. Recommendations
    3. Final Remarks
  7. Appendix A. Additional Experts Providing Input
  8. Special Acknowledgment


  • The President’s Council of Advisors on Science and Technology (PCAST)
  • PCAST Big Data and Privacy Working Group
  • Enabling Event
    • President Barack Obama
    • Remarks, 2014-01-17
    • Counselor John Podesta
  • New Concerns
    • Born digital vs born analog
    • standardized components
    • particular limited purpose vs repurposed, reused.
    • data fusion
    • algorithms
    • inferences
  • Provenance of data, recording and tracing the provenance of data
  • Trusted Data Format (TDF)


  • Right to forget, right to be forgotten: unenforceable, infeasible [page 48].
  • Prior redress of prospective harms is a reasonable framework [page 49]
    • Conceptualized as vulnerable groups who are stipulated as harmed a priori or are harmed sunt constitua.
  • Government may be forbidden from certain classes of uses, despite their being available in the private sector.

    • Government is allowed some activities and powers
    • Private industry is allowed some activities and powers
    • It is feasible in practice to mix & match
      • government coercion => private privilege => result
      • private privilege => private coercion => result

Consumer Privacy Bill of Rights (CPBR)

Obligations [of service providers, as powerful organizations]

  • Respect for Context => use consistent with collection context.
  • Focused Collection => limited collection.
  • Security => handling techniques
  • Accountability => handling techniques.

Empowerments [of consumers, as individuals]

  • Individual Control => control of collection, control of use.
  • Transparency => of practices [by service providers]
  • Access and Accuracy => right to review & edit [something about proportionality]

Definition of Privacy

The definition is unclear and evolving. It is frequently defined in terms of the harms incurred when it is lost.

Privacy Framework via Harms

The Prosser Harms, quoted from page 6:

  1. Intrusion upon seclusion. A person who intentionally intrudes, physically or otherwise (now including electronically), upon the solitude or seclusion of another person or her private affairs or concerns, can be subject to liability for the invasion of her privacy, but only if the intrusion would be highly offensive to a reasonable person.
  2. Public disclosure of private facts. Similarly, a person can be sued for publishing private facts about another person, even if those facts are true. Private facts are those about someone’s personal life that have not previously been made public, that are not of legitimate public concern, and that would be offensive to a reasonable person.
  3. “False light” or publicity. Closely related to defamation, this harm results when false facts are widely published about an individual. In some states, false light includes untrue implications, not just untrue facts as such.
  4. Misappropriation of name or likeness. Individuals have a “right of publicity” to control the use of their name or likeness in commercial settings.



<quote>One perspective informed by new technologies and technology‐mediated communication suggests that privacy is about the “continual management of boundaries between different spheres of action and degrees of disclosure within those spheres,” with privacy and one’s public face being balanced in different ways at different times. See: Leysia Palen, Paul Dourish; Unpacking ‘Privacy’ for a Networked World; In Proceedings of CHI 2003, Association for Computing Machinery, 2003-04-05.</quote>, footnote, page 7.

Adjacency Theory

An oppositional framework wherein harms are “adjacent to” benefits:

  • Invasion of private communications
  • Invasion of privacy in a person’s virtual home.
  • Public disclosure of inferred private facts
  • Tracking, stalking and violations of locational privacy.
  • Harm arising from false conclusions about individuals, based on personal profiles from big‐data analytics.
  • Foreclosure of individual autonomy or self‐determination
  • Loss of anonymity and private association.

Mosaic Theory

Obliquely referenced via a quote from Justice Sotomayor.
<quote>“I would ask whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables the Government to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on.” United States v. Jones (10‐1259), Sotomayor concurrence.</quote>

Yet, not cited, but related (at least):

Definition of Roles [of data processors]

  • data collectors
  • data analyzers
  • data users

The data generators or producers in this roles framework are substantially only customers or consumers (sic).


  • Definition of analysis versus use
    • <quote>Analysis, per se, does not directly touch the individual (it is neither collection nor, without additional action, use) and may have no external visibility.
    • & by contrast, it is the use of a product of analysis, whether in commerce, by government, by the press, or by individuals, that can cause adverse consequences to individuals.</quote>
  • Big Data => definitions
    • “[comprises data with] high‐volume, high‐velocity and high‐variety information assets that demand cost‐effective, innovative forms of information processing for enhanced insight and decision making,” attributed to Gartner Inc.
    • “a term describing the storage and analysis of large and/or complex data sets using a series of techniques including, but not limited to, NoSQL, MapReduce, and machine learning,” attributed to “computer scientists” on arXiv.


The strong, direct, unequivocal, un-nuanced, provocative language…

<quote>For a variety of reasons, PCAST judges anonymization, data deletion, and distinguishing data from metadata (defined below) to be in this category. The framework of notice and consent is also becoming unworkable as a useful foundation for policy.</quote>

<quote>Anonymization is increasingly easily defeated by the very techniques that are being developed for many legitimate applications of big data. In general, as the size and diversity of available data grows, the likelihood of being able to re‐identify individuals (that is, re‐associate their records with their names) grows substantially. While anonymization may remain somewhat useful as an added safeguard in some situations, approaches that deem it, by itself, a sufficient safeguard need updating. </quote>
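The re‐identification mechanism behind that claim can be sketched with toy data. Everything below is invented for illustration; the join on quasi‐identifiers (ZIP code, birth date, sex) follows the classic linkage‐attack demonstrations.

```python
# Toy linkage attack: an "anonymized" table (names removed) is re-identified
# by joining on quasi-identifiers against an auxiliary, identified data set.
anonymized = [
    {"zip": "02139", "dob": "1945-07-31", "sex": "M", "diagnosis": "hypertension"},
    {"zip": "94110", "dob": "1980-02-14", "sex": "F", "diagnosis": "asthma"},
]
voter_roll = [
    {"name": "A. Smith", "zip": "02139", "dob": "1945-07-31", "sex": "M"},
    {"name": "B. Jones", "zip": "94110", "dob": "1980-02-14", "sex": "F"},
]

def reidentify(anon, aux):
    """Join on (zip, dob, sex); a unique match re-associates a name with a record."""
    out = []
    for rec in anon:
        key = (rec["zip"], rec["dob"], rec["sex"])
        matches = [p for p in aux if (p["zip"], p["dob"], p["sex"]) == key]
        if len(matches) == 1:  # unique in the auxiliary data => identity recovered
            out.append({"name": matches[0]["name"], "diagnosis": rec["diagnosis"]})
    return out

print(reidentify(anonymized, voter_roll))
```

As the auxiliary data grows in size and variety, more keys become unique, which is exactly the effect the quote describes.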

<quote>Notice and consent is the practice of requiring individuals to give positive consent to the personal data collection practices of each individual app, program, or web service. Only in some fantasy world do users actually read these notices and understand their implications before clicking to indicate their consent. <snip/>The conceptual problem with notice and consent is that it fundamentally places the burden of privacy protection on the individual. Notice and consent creates a non‐level playing field in the implicit privacy negotiation between provider and user. The provider offers a complex, take‐it‐or‐leave‐it set of terms, while the user, in practice, can allocate only a few seconds to evaluating the offer. This is a kind of market failure. </quote>

<quote>Also rapidly changing are the distinctions between government and the private sector as potential threats to individual privacy. Government is not just a “giant corporation.” It has a monopoly in the use of force; it has no direct competitors who seek market advantage over it and may thus motivate it to correct missteps. Governments have checks and balances, which can contribute to self‐imposed limits on what they may do with people’s information. Companies decide how they will use such information in the context of such factors as competitive advantages and risks, government regulation, and perceived threats and consequences of lawsuits. It is thus appropriate that there are different sets of constraints on the public and private sectors. But government has a set of authorities – particularly in the areas of law enforcement and national security – that place it in a uniquely powerful position, and therefore the restraints placed on its collection and use of data deserve special attention. Indeed, the need for such attention is heightened because of the increasingly blurry line between public and private data. While these differences are real, big data is to some extent a leveler of the differences between government and companies. Both governments and companies have potential access to the same sources of data and the same analytic tools. Current rules may allow government to purchase or otherwise obtain data from the private sector that, in some cases, it could not legally collect itself, or to outsource to the private sector analyses it could not itself legally perform. [emphasis here] The possibility of government exercising, without proper safeguards, its own monopoly powers and also having unfettered access to the private information marketplace is unsettling.</quote>


Substantially in order of appearance in the footnotes, without repeats.

Via: backfill, backfill


And yet, even with all the letters and the professional editing and tech-writing staff available to this national- and historical-level enterprise, we still see [Footnote 101, page 31]:

Qi, H. and A. Gani, “Research on mobile cloud computing: Review, trend and perspectives,” Digital Information and Communication Technology and it’s Applications (DICTAP), 2012 Second International Conference on, 2012.

The correct listing is at Springer

Digital Information and Communication Technology and Its Applications: International Conference, DICTAP 2011, Dijon, France, June 21-23, 2011, Proceedings, Part I; Series: Communications in Computer and Information Science, Vol. 166; Hocine Cherifi, Jasni Mohamad Zain, Eyas El-Qawasmeh (Eds.); 2011; XIV, 806 pages.


  • it’s → a contraction of “it is”
  • its → the possessive

Ergo: s/it's/its/g;
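The same substitution, sketched in Python rather than sed; it handles both apostrophe forms, and it is the same blanket fix, so it would equally clobber any legitimate “it’s”.

```python
import re

def fix_its(text: str) -> str:
    # Replace the contraction "it's" (straight or curly apostrophe) with the
    # possessive "its" -- the s/it's/its/g proposed above, applied globally.
    return re.sub(r"it[’']s", "its", text)

print(fix_its("Digital Information and Communication Technology and it's Applications"))
```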

Diffusion of Innovations | Everett M. Rogers

Everett M. Rogers; Diffusion of Innovations, 5th Edition Paperback; Free Press; 5th Edition; 2003-08-16; 576 pages; kindle: $25, paper: $13+SHT; earlier editions kindle: $24, paper: $0.01+SHT.

Table of Contents

  1. Elements of diffusion.
  2. A history of diffusion research.
  3. The generation of innovations.
  4. The Innovation-decision process.
  5. Attributes of innovations.
  6. Innovativeness and adopter categories.
  7. Diffusion networks.
  8. The change agent.
  9. Innovation in organizations.
  10. Consequences of innovations.


Individual Decision Life Cycle Model

  1. Knowledge
    … of the innovation.
  2. Persuasion
    i.e. forming a favorable or unfavorable attitude toward it.
  3. Decision
    to accept or reject.
  4. Implementation
    … of the innovation.
  5. Confirmation
    i.e. seeking reinforcement of the decision from others.

Mass Adoption Life Cycle Model

  1. innovators,
  2. early adopters,
  3. early majority,
  4. late majority,
  5. laggards.
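Rogers defines the five categories by standard-deviation cutoffs on the (roughly normal) distribution of adoption times; the familiar shares of 2.5 / 13.5 / 34 / 34 / 16 percent fall out of the curve. A sketch:

```python
def adopter_category(t, mean, sd):
    """Classify an adoption time t by Rogers' standard-deviation cutoffs."""
    if t < mean - 2 * sd:
        return "innovator"        # earliest ~2.5%
    if t < mean - sd:
        return "early adopter"    # next ~13.5%
    if t < mean:
        return "early majority"   # ~34%
    if t < mean + sd:
        return "late majority"    # ~34%
    return "laggard"              # last ~16%

# e.g. with mean adoption at month 24 and a standard deviation of 6 months:
print(adopter_category(10, 24, 6))
```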


Via: backfill

The Bell-Mason Diagnostic for (Venture Capital) Investing

The Instrument

  • Space
    • 12 dimensions
  • Time
    • 4 stages, company development
    • 7 sub-stages, product development
  • Quantification
    • rules => yes/no
  • Visualization
    • a graph


  1. Business Plan
  2. Marketing
  3. Sales
  4. CEO
  5. Team
  6. Board
  7. Cash
  8. Financeability
  9. Control
  10. Technology/Engineering
  11. Product
  12. Manufacturing


  1. Concept
    • 0-12 months
  2. Seed
    • 3-12 months
  3. Product Development
    • 12–48 months
  4. Market Development
    • 24-48 months
  5. Steady-State
    • exit => IPO
  • Stage-to-stage transition => event-based state-transitions
    • continue in the state
    • exit the state
    • cease operations
    • etc.
  • Sub-state model of product development
  1. Product Development
    1. Hire & Plan
    2. Alpha Test
    3. Beta Test
  2. Market Development
    1. Calibrate the Market
    2. Market Expansion
    3. Steady-State Operation


  • Heuristic -> Rule -> Question
    • e.g. is there a design walkthrough or code review process?
  • Staged Evolution of Questions
    • questions & level of detail appropriate to the stage
  • Standard Questionnaire


In the form of a Kiviat diagram.


  • “You don’t have to understand the technology to ask the right business questions”
  • Companies that scored 75 or higher had a business success rate of 95%; slide 26.
  • Justification
    • Factory-like, repeatable, optimizable => executable by lower skill units
    • Similar to Medical Schools
    • Alternatives
      • Case based => Biz School
      • Statistical Factor Analysis
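A minimal sketch of the quantification and visualization steps, using the 12 dimensions listed above. The yes/no rules and the 75-point threshold are from the slides; the per-dimension averaging below is my assumption, not the actual Bell-Mason scoring rules.

```python
DIMENSIONS = ["Business Plan", "Marketing", "Sales", "CEO", "Team", "Board",
              "Cash", "Financeability", "Control", "Technology/Engineering",
              "Product", "Manufacturing"]

def score(answers):
    """answers: {dimension: [bool, ...]} -- yes/no rule outcomes per dimension.
    Returns per-dimension scores (0-100, the spokes of the Kiviat diagram)
    and their mean (the overall score compared against the 75 threshold)."""
    spokes = {d: 100 * sum(a) / len(a) for d, a in answers.items()}
    return spokes, sum(spokes.values()) / len(spokes)

# Hypothetical company where 3 of 4 rules pass on every dimension:
answers = {d: [True, True, True, False] for d in DIMENSIONS}
spokes, total = score(answers)
print(total)
```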


Via: backfill

Revisiting the convergence of Metcalfe’s Law, Shannon’s Law and McLuhan’s “The Medium is the Message” | Excapite

Revisiting the convergence of Metcalfe’s Law, Shannon’s Law and McLuhan’s “The Medium is the Message”; In Excapite; 2014-02-23.


  • MobCon’s Law => as the rate of network adoption increases, the price of software decreases
  • <quote>Over the 3 generations the network effect translates into 100:60:30. i.e. 2nd Generation generates 60% of the first Generation Revenues. The 3rd Generation generates 30% of the Second Generation Revenues. This suggests the 4th Generation will be generating 15% of the 3rd Generation Revenues when it achieves 1 Billion Users.</quote>
  • <quote>The new market reality for software developers is there has never been more customers but prices have never been this low.</quote>
  • <quote>Media has been unbundled by the network. And by that I mean what once was sold as a wholesale product (Think Newspaper, Magazine or Music Album) is now sold as fragments (Think pages and songs). So too with software. Software is under going the same unbundling. What was sold as a wholesale bundle of function points (Think: COTS software) is now being unbundled and sold off as function points and limited functionality (Think: API’s and Apps). The reason being of course, when it comes to the long wave of the product cycle the spreadsheet and the word processor, 30 to 40 years on, is looking very much like end of cycle, and is under the types of market pressures one would expect of a mature market.</quote>
  • <quote>Google’s Search Engine is a function point endlessly combinatorial. Facebook is a collection of function points. To disrupt Google you need to build a better function point. To disrupt Facebook, as with Microsoft, you need merely to fragment the Platform</quote>
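Read literally, the 100:60:30(:15) series is a one-time 40% drop followed by halving each generation; that reading, as a sketch (the halving interpretation is mine, the quote's own percentages are self-inconsistent):

```python
def generation_revenue(n, first=100.0):
    """Revenue index for generation n >= 1 under the 100:60:30:15 pattern:
    generation 2 earns 60% of generation 1, and each later generation
    earns half of the one before it (at the same 1-billion-user mark)."""
    if n == 1:
        return first
    rev = first * 0.6          # the one-time 40% drop
    for _ in range(n - 2):
        rev *= 0.5             # halving thereafter
    return rev

print([generation_revenue(n) for n in range(1, 5)])
```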



Table of Contents

Via: backfill

Internet of Things – Privacy and Security in a Connected World | FTC

Internet of Things – Privacy and Security in a Connected World (IoT); Federal Trade Commission (FTC); 2013-11-19.


Via: backfill


National Science Foundation

Keith Marzullo

Collateral, slides 11-21

  • Precious Nomenclature
    • Ubiquitous Computing
    • Pervasive Computing
    • Distributed Sensor Networks
    • Internet of Things
    • Cyber-Physical Systems
  • NSF CPS Program
  • Paul Ford, Some Opinement; Hemispheres; 2013-11; pages 66-68.
  • Highlighted Programs
    • Networked Embedded Sensor-Rich Systems (ActionWebs)
      • Claire Tomlin, Edward Lee, S. Shankar Sastry, David Culler (Berkeley)
      • Hamsa Balakrishnan (MIT)
    • Foundations Of Resilient Cyber-physical Systems (FORCES)
      • who?
    • Advanced Transportation Systems
      • Raj Rajkumar, Ed Clarke, John Dolan, Sicun Gao, Paul Rybski, David Wettergreen, Paolo Zuliani (CMU)
    • Environment Monitoring (Intelligent River)
    • Semantic Security Monitoring for Industrial Control Systems (ICS)
      • Robin Sommer (Berkeley)
      • Adam Slagell & Ravishankar Iyer (Illinois)
    • Reprogramming a Pacemaker
      • Kevin Fu (UMass Amherst; Michigan)
    • Reprogramming Automobiles
      • Tadayoshi Kohno & Shwetak Patel (U Washington)
      • Stefan Savage & Ingolf Krueger (UCSD)
    • Security and Privacy in Vehicular Cyber-Physical Systems
      • Hari Balakrishnan, Samuel Madden, Daniela Rus (MIT)
    • Secure Telerobotics
      • Howard Jay Chizeck & Tadayoshi Kohno (Washington)


M.H. Carolyn Nguyen
Director, Technology Policy Group, Microsoft

Collateral, slides 22-39

Panel 1: The Smart Home

  • Michael Beyerle, GE Appliances
  • Jeff Hagins, SmartThings
  • Craig Heffner, Tactical Network Solutions
  • Eric Lightner, Department of Energy
  • Lee Tien, Electronic Frontier Foundation

Collateral, slides 40-57

  • Connected Platform
    • ACM controller to appliances
    • GEA server (cloud controlled)
    • iOS & Android apps
  • SmartThings
  • Smart home
  • SmartSense Product Line: Multi, Presence, Hub, Motion, Outlet

An Internet of Things

Vint Cerf
Slides 58-72


  • Gee Whiz, my how far we’ve come, what a long strange trip it’s been
  • Smart Cities
  • Self-Driving Cars
  • Implications, Challenges & Opportunities

Panel 2: Connected Health & Fitness

Moderator: Commissioner Maureen Ohlhausen

  • Stan Crosley, Indiana University
  • Joseph Lorenzo Hall, Center for Democracy & Technology
  • Anand Iyer, WellDoc Communications
  • Scott Peppet, University of Colorado School of Law
  • Jay Radcliffe, InGuardians

Collateral, slides 73-75

  • Insulin Pump
  • BlueStar

Panel 3: Connected Cars

  • Yoshi Kohno, University of Washington
  • John Nielsen, American Automobile Association
  • Wayne Powell, Toyota Technical Center
  • Christopher Wolf, Future of Privacy Forum


  • none

Panel 4: Privacy and Security in a Connected World

  • Ryan Calo, University of Washington Law School
  • Dan Caprio, McKenna Long & Aldridge LLP
  • Michelle Chibba, Office of Information & Privacy Commissioner of Ontario
  • Drew Hickerson, Happtique
  • David Jacobs, Electronic Privacy Information Center
  • Marc Rodgers, Lookout Security

Collateral, slides 79-85

  • Four Scenarios (user stories)


The Dark Mail Alliance of Lavabit & Silent Circle




  • Jon Callas, CTO, Silent Circle
  • Ladar Levison, founder, Lavabit



In archaeological order, derivatives and copypaste on top, original work further below

Via: backfill

Why Our Privacy Problem is a Democracy Problem in Disguise | Evgeny Morozov, MIT Technology Review

Evgeny Morozov; Evgeny Morozov on Why Our Privacy Problem is a Democracy Problem in Disguise; In MIT Technology Review; 2013-10-21.


Via: backfill

Bruce Schneier: Talks at Google, at DEFCON 20

Bruce Schneier; Talks at Google, at Google; On YouTube; 2012-06-19; 55:23.


  • Insights about Technology and Power
  • Security isn’t part of products as sold today; products today aren’t “complete.”
  • Four Trends
    1. The Cloud
      • Computers owned by someone else.
      • Located somewhere else.
      • 3rd party holds “your data.”
    2. Locked-down Endpoints (Closed CPE)
      • More closed is more successful for the host of the ecosystem.
      • Mobile is lockdown.
      • Desktop (Windows 8, Mountain Lion) is moving to lockdown.
      • Curated “stores” for executables.
  • Trust
    • Users have to trust vendors
    • Users give up control in return for functionality.
  • Cost of Computing drives architecture, product definition and social configuration of usage.
  • Feudal Security
    • Big Host takes care of Users.
    • Users are not customers.
    • Vendors can act arbitrarily.
    • Vendors are allowed to make “mistakes.”
    • Vendors can shade the rules (cheat) to tie users to the system even more strongly.
    • Based on deceit on many dimensions.
    • Based on power and the exercise of it.
  • Agenda of the Net
    • The natural laws of the internet
    • Exhibits quotes as a flavor of the thinking “back then”
    • John Perry Barlow, 1996; Declaration of Independence of Cyberspace
    • John Gilmore, 1993; <quote>the internet interprets censorship as damage and routes around it</quote>
  • Theory of Power and Technology relative to The Internet
    • The Internet magnifies power
    • The powerless got some, they got it fast.
    • The powerful got more, but they were very very slow.
  • Four classes of the use of power
    • have dual use; police/military as well as market/consumer
    • The List
      1. Censorship / Content Filter
      2. Propaganda / Marketing
      3. Surveillance / Track-N-Targ (the business model of The Internet)
      4. Use Control / DRM & AppStores & Code Licensing
  • Comingling of the corporate and governance
    • Charlie Stross => The End of Pre-History (where we save everything)
    • Changing the social norms: sharing.
    • Industry lobbies for laws that benefit their business model.
  • Cybernationalism
    • ITU wants to “take over the Internet”
  • Militarization of Cyberspace
    • Something about Snowden.
    • Something about China.
    • Something about ARAMCO attack attributed to Iran.
  • Legal Theory
    • Two types of law
      1. Constitutional Law (limits government acts)
      2. Business Law (regulation limits business acts)
    • Each side uses the laws/regulations of their domain to control the “consumer in the middle.”
    • Facebook CIA Onion satire is now truth.
      CIA’s ‘Facebook’ Program Dramatically Cut Agency’s Costs
    • Thought Experiments (what if government said…; what if private industry said…)
      • Each citizen must carry a continuous tracking device
        yet cellphones
      • Each citizen must register each new contact (friend) each time an acquaintance is made
        yet Facebook
      • Business can reach out and destroy data that does not suit them
        Yet RIAA proposes a law “attack back” to destroy rogue copies of files “out there.”
  • Military Strategic Theory / Social Theory
    • Attackers have an advantage with technology (usually mil-theory says that defenders are considered to have a 3:1 advantage).
    • Analogies are cited: cars for bank robbers vs police.
    • Cybercrime took a decade before police understood it.
    • Security Gap is the time before The Establishment figures out how to use it.
  • The Classes
    • The Nimble (The Dissidents)
    • The Powerful (The Establishment)
    • The Rest of Us (users in the middle)
  • Complex Social Questions which are power struggles
    • an Algorithms Judge
    • Can information be corrected
    • Can information be forgotten
    • Can certain data files be prevented from being executed
      • Music files
      • Design files: Gun designs, Barbie Dolls, Mickey Mouse
    • Weapons of Mass Destruction
      • If they exist in the wrong hands, what level of control is warranted?
  • Claim:
    • The powerful are winning right now.
    • Need innovative solutions.
    • This isn’t where government can be involved.
  • The Internet, origins and future
    • Technolibertarian origins
    • Geopolitical regulatory now
  • Suggestions
    • Researchers: study more the 4 dual use technologies
      • Areas
        1. Censorship
        2. Propaganda
        3. Surveillance
        4. Use Control
      • Examples
        • Fake yelp reviews
        • Fake Amazon reviews
        • Astroturfing on Twitter?
      • Need
        • Safe places to anonymously publish
        • Wikileaks is not safe
        • Strongbox New Yorker is under review
      • Vendors: every technology is dual-use
        • Blue Coat
        • Social network monitoring
        • FBI wants CALEA-II in the U.S., but not abroad
    • Policy
      • Keep circumvention legal
      • Keep network neutrality
      • Can’t have both ways: privacy at home requires privacy abroad
    • Laws will come, and they will be bad. Maybe they can be headed off
    • Power must be leveled; as was the case in the Rise of the Nation State
      • Rights & Responsibilities
      • Limitations on Use
      • Transparency on Rules
  • Power, coded in
    • Money
    • Social Control
    • Marketing (Advertising)

Via: backfill


The basic theme of the Feudalism metaphor is one of prognostication: future prediction. Its premise is that history repeats itself in fundamental ways (it rhymes), and so if we all agree to choose and understand the appropriate metaphor for the current time, then we can by implication predict the future outcome.

Doesn’t Mention

  • Marc Davis, who has been using this metaphor for a while: Digital Feudalism; cf. slides, especially slides
  • Doc Searls, who has a calf/cow metaphor for client/server in the VRM activism; cf. blog
  • Tim Wu, who has the same basic idea in his popularization and some academic output.
  • Anil Dash, who talks about online entertainment systems using the metaphor of the legal structure of privately-owned public spaces (POPS); cf. notes and notes


DEFCON 20 Bruce Schneier Answers Your Questions; On YouTube; 2013-06-14; 47:52; also here (with better audio)
The DEFCON talk is much more freeform but covers the same basic material. The Google talk is structured and positioned as the input material for his next book. This is a promotional tour for Liars & Outliers, or maybe not, since the book has been out for a year.

Gloating & Fingerwagging about the Technical Hiring Process

On the occasion of the gloating over the Mea Culpa by the self-styled Thought Leaders in the field.


Wow … this area is evergreen in blogland, no shortage of advice, and definitely. not. self. serving.


If Entrepreneurs are the New Labor, then programmers are the New Labor’s labor. The advice is on how to hire labor. Programmers are thus Old Labor in the complete sense: of variable-cost time and materials.

Venkatesh Rao; in Forbes; 2012-09-03.

<quote>There is dignity to labor just as there is romance to entrepreneurship.</quote>.  There is even a whole day given over annually to the celebration of that.

Internet Trends 2013 | Mary Meeker, Liang Wu @ KPCB

Mary Meeker, Liang Wu (KPCB); Internet Trends 2013; At the D11 Conference; 2013-05-29; 117 slides.


  • Gee Whiz!
  • Tempest & Teapot
    • Ms. Meeker was once an “analyst” producing “analyzed true facts” for a bank-brokerage in service of promoting the industry sector at large as a place to do the M&A.
    • Now she is at a venture firm and (by definition) she’s talking their book.
    • Intuition, insight and viewpoint have replaced components of the previous work product.


Backend-as-a-Service (BaaS) and Mobile Enterprise Application Platform (MEAP)


  • Applicasa
    • Lior Malenbiom, CEO
    • Features: Database, CMS, SDK, cloud
    • the “Start-App” program; freemium.
  • Cocoafish (Appcelerator)
    • 10-person team
    • co-working space in San Francisco
    • Michael Goff & Wei Kong, founders
    • iOS, Android, JavaScript and REST
    • Objective-C, Java, PhoneGap, Sencha, HTML5
    • Appcelerator purchase 2012-02-08
      • Jeff Haynie, CEO (circa 2011-09)
      • Scott Schwarzhoff, VP Marketing (circa 2011-09)
      • Spencer Chen, director of bizdev (circa 2013-01)
      • Mountain View
      • Open Mobile Marketplace (like Salesforce AppExchange)
        Modules: Get Glue, PayPal, AdMob, Box.net, Greystripe, Twilio, OpenGL and Urban Airship
      • Titanium Platform
        • Windows, Mac, Linux, HTML5
        • Aptana Studio’s Eclipse-based IDE
        • Integrations
      • Links with StackMob & Parse in Marketplace
  • Crashlytics
    • Acquired by Twitter, 2013-01-28.
    • Jeff Seibert, Wayne Chang, founders
  • FeedHenry
  • Kinvey
    • Sravish Sridhar, CEO, Co-founder
    • Joe Chernov, “head” of marketing
    • Feature: Facebook Open Graph extended to mobile applications.
    • Function: Something about mobile app objects (metadata) hosted on Kinvey servers for Facebook to crawl for insertion on the Timeline.
  • Parse
    • iOS & Android SDKs
    • Y-Combinator “batch”
    • Tikhon Bernstam & James Yu, founders
      • were co-founders at Scribd along with Ilya Sukhar (Ooyala) and Kevin Lacker (Google)
    • Raised $1.1M in Y! Combinator, circa 2011-08
  • PhoneGap
  • Sencha (Sencha.io)
  • StackMob
    • A single integration point to backend services
    • API creation and management
    • Java, Ruby, Python, Lua or any JVM supported language.
    • Analytics across all StackMob services
      • geographic location
      • platform
      • app version
      • demographic info
      • “and more”
    • Messaging
      • Push
      • Email by SendGrid, Inc.
    • Social
      • Integrations: Twitter, Facebook, Google
      • Login (Single Sign-On)
      • Sharing
    • Future (circa 2011-01)
      • Location services
      • Advertising
      • Monetization
      • Android
    • SDKs
      • Open source
      • iOS
      • Android (future)
  • Tiggzi (Appery.io)

The Walled Gardens

  • Facebook
  • Twitter
    • Crashlytics
    • Bluefin
    • Vine
    • Cards

The Conglomerates

  • Apple (iCloud)
  • IBM
  • SAP (Sybase Unwired Platform)


In archaeological order, all from Read Write as they recirculate you through their own past output.



The diagram is in the style of a faux subway map. It is not clear how that metaphor relates to the domain. Mentioned on the diagram:

Service Provider

  • China Mobile
  • Verizon
  • Sprint
  • T-Mobile
  • Airtel
  • Vodafone
  • Smart
  • AT&T


  • GoGrid
  • Cloud.com (Citrix)
  • Rackspace
  • Terremark
  • HP
  • Flexiscale
  • IBM
  • AT&T
  • Microsoft Azure
  • Google


  • SAP NetWeaver Cloud
  • OpenShift, Red Hat
  • AppFog
  • Joyent
  • Cloud Foundry, VMware
  • Engine Yard
  • CloudBees
  • Oracle

MEAP (Mobile Enterprise Application Platform)

  • Sybase Unwired Platform (SAP)
  • Webalo
  • Verivo
  • Kony Solutions
  • Antenna
  • Syclo
  • IBM Worklight


  • Sencha.io
  • iKnode
  • iCloud
  • Deployd
  • CloudyRec
  • FeedHenry
  • Applicasa
  • ScottyApp
  • YorAPI
  • mobDB
  • Flurry AppCloud (TrestleApp)
  • Kumulos
  • Netmera
  • Apstrata
  • AppGlu
  • Usergrid
  • CloudMine
  • Firebase
  • ACS (Cocoafish)
  • Buddy
  • Kinvey
  • StackMob
  • Parse
  • Meteor

Mobile Services

  • Facebook
  • iAd
  • Where
  • PayPal, eBay
  • Urban Airship
  • Weibo
  • InfoChimps
  • Brightcove
  • Fiksu
  • Flurry
  • foursquare
  • JumpTap
  • Twitter
  • GooglePlaces
  • AdMob
  • Apigee

Mobile SDK

  • Sencha
  • iOS
  • jQuery Mobile
  • Appcelerator
  • Cabana App
  • PhoneGap, Adobe
  • Tiggzi
  • Trigger.io
  • App Cloud
  • AnyPresence
  • MoSync
  • Marmalade
  • Windows (rly?)
  • bada
  • Haxe
  • Android
  • Qt
  • Symbian
  • Rhomobile

Handset OEM

  • Apple
  • Sony Ericsson
  • ZTE
  • TCL/Alcatel
  • HTC
  • RIM
  • LG
  • Huawei
  • Nokia
  • Samsung
  • Motorola Mobility

The New Digital Age | Eric Schmidt, Jared Cohen

Eric Schmidt, Jared Cohen; The New Digital Age: Reshaping the Future of People, Nations and Business; Knopf; 2013-04-23; 336 pages.


From Amazon.

ERIC SCHMIDT is executive chairman of Google, where he served as chief executive officer from 2001 to 2011. A member of the President’s Council of Advisors on Science and Technology, Schmidt also chairs the board of the New America Foundation and is a trustee of the Institute for Advanced Study in Princeton, New Jersey.

JARED COHEN is director of Google Ideas and an Adjunct Senior Fellow at the Council on Foreign Relations. He is a Rhodes Scholar and the author of several books, including Children of Jihad and One Hundred Days of Silence. He is a member of the Director’s Advisory Board at the National Counterterrorism Center.


  • Eric Schmidt and Jared Cohen; Your life in 2033; In The Guardian; 2013-04-12.
    Teaser: ‘You skim through the day’s news on translucent screens while a freshly cleaned suit is retrieved from your automated closet…’ An extract from Google chairman Eric Schmidt and Jared Cohen’s new book

    • It’s all good.
    • It’s all from the tech.
    • Lightweight, fast, powerful, fun, safe, meaningful, personal, professional, clean, safe, etc.

STEM Labor Shortages? | Economic Policy Institute | Daniel Costa

Promoted from backfill

Daniel Costa; STEM labor shortages?; Economic Policy Institute; 2012-11-19.
Teaser: Microsoft report distorts reality about computing occupations


  • <quote>Conclusion: Now is not the time to increase the number of H-1B visas and STEM green cards</quote>
  • <quote>Microsoft is proposing that the government increase the supply of STEM workers with college degrees even though their unemployment rate is already double the rate at which full employment occurs for such workers. Microsoft’s proposal is unsurprising, since adding workers to the STEM labor supply during times of high unemployment and insufficient job creation would propel STEM unemployment rates even higher, thereby preventing wages in these occupations from rising. If this occurs, more STEM workers would have little choice but to accept whatever terms and conditions are offered by employers. This wage suppression is already occurring in computer and mathematical occupations. Figure D shows the average hourly wage for college-educated workers in computer and mathematical occupations over the last 11 years.</quote>
  • <quote>The first significant problem with Microsoft’s report is the assumption that job openings “in computing” not filled by college graduates with computer science (CS) degrees will go unfilled. It is a well-known fact that computer science graduates are not the only source of new hires in computing.</quote>
  • <quote>They add that there is also an adequate supply of experienced STEM workers, writing, “Purported labor market shortages for scientists and engineers are anecdotal and also not supported by the available evidence” (Lowell and Salzman 2007, 43).</quote>


Original Event

Summary: need more H-1B to keep labor rates low and lowering (a.k.a. “be competitive”)

Comcast ConstantGuard: capabilities and purpose



  • Surely there are some …
  • They have to do something about all the Windows machines on their networks


  1. Customer HTTP traffic (port 80) is run through a transparent gateway.
  2. Customer HTTP fly-by traffic may be inspected in situ.
  3. Messages to the end user can be inserted into the returning HTML.
    This mirrors somewhat the capability on the video stream of inserting messages on top of the picture.
And by “may be inspected,” the intended sense here is “is inspected,” for reasonable definitions of “is.”
Engineers have been interviewed who claimed to have worked on this sort of thing using the carrier-grade routers’ WCCP hook; specifically: extracting keywords out of Amazon URL traffic for sale to consumer trak-n-targ operators.
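The insertion step described above can be sketched in a few lines. This is a hypothetical illustration, not Comcast's actual implementation: a gateway holding a rewritable HTML response splices a notice banner in just before the closing `</body>` tag. All names (`BANNER`, `inject_notice`) are invented for the sketch.

```python
# Hypothetical sketch of in-band message insertion by a transparent
# gateway: rewrite a port-80 HTML response so a notice banner appears
# just before </body>. Names and banner text are illustrative only.

BANNER = '<div class="isp-notice">A message from your ISP.</div>'

def inject_notice(html: str, banner: str = BANNER) -> str:
    """Insert a notice just before the last </body>, if one exists."""
    marker = "</body>"
    idx = html.lower().rfind(marker)
    if idx == -1:
        return html  # not rewritable HTML; pass the payload through untouched
    return html[:idx] + banner + html[idx:]

page = "<html><body><p>Hello</p></body></html>"
print(inject_notice(page))
```

Real middleboxes also have to cope with chunked transfer encoding, compression, and Content-Length fixups, which this sketch deliberately ignores.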


Circa 2012-02-28 through 2012-03-02, during the lead-up to the expected chaos pursuant to the DNS Changer name server shutdown.


Where does that leave you if you aren’t interested in having your ISP inspect and modify your traffic as they haul it?

Routing Out Beyond Comcast ConstantGuard with OpenVPN
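The route-out amounts to tunneling everything, port 80 included, past the ISP's gateway. A minimal client configuration might look like the following sketch; the server name, port, and file names are placeholders, not values from the original article.

```
# Hypothetical minimal OpenVPN client config (client.ovpn).
# vpn.example.net and the credential file names are placeholders.
client
dev tun
proto udp
remote vpn.example.net 1194
resolv-retry infinite
nobind
persist-key
persist-tun
redirect-gateway def1   ; route all traffic, including port 80, through the tunnel
remote-cert-tls server  ; refuse servers lacking a server-role certificate
ca ca.crt
cert client.crt
key client.key
verb 3
```

With `redirect-gateway def1` in place, the ISP sees only encrypted UDP to the VPN endpoint, so the transparent HTTP gateway has nothing to inspect or rewrite.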

EFF Patent Project Gets Half-Million-Dollar Boost from Mark Cuban and ‘Notch’

EFF; EFF Patent Project Gets Half-Million-Dollar Boost from Mark Cuban and ‘Notch’; 2012-12-19.
Teaser: New Funds Dedicated to Protecting Innovation and Reforming Software Patents


  • EFF’s Defend Innovation Project
  • $500,000 donation
    • $250,000 from Mark Cuban
    • $250,000 from Markus “Notch” Persson
  • Attribution & Fame
    • Mark Cuban
      • Sold Yahoo! a video content delivery system for Six BILLION dollars back in Web Bubble I, back when a billion dollars was real money
      • Owns a basketball team
      • Distributes Films
      • Man about town
    • Markus “Notch” Persson


  • Quotes & Quips
    • ‘The Mark Cuban Chair to Eliminate Stupid Patents’ attributed to Julie Samuels, Staff Attorney, EFF.
  • Personalities Cited
    • Shari Steele, Executive Director, EFF.
    • Julie Samuels, Staff Attorney, EFF
    • Daniel Nazer, Staff Attorney, will join EFF in 2013-01
  • EFF’s intellectual property team
    • Corynne McSherry, Intellectual Property Director
    • Kurt Opsahl, Senior Staff Attorney
    • Mitch Stoltz, Staff Attorney
    • Michael Barclay, Fellow
    • Jason Schultz, Fellow