The Three Laws of Robotics in the Age of Big Data | Balkin

Jack M. Balkin (Yale); The Three Laws of Robotics in the Age of Big Data; Ohio State Law Journal, Vol. 78 (2017), forthcoming (real soon now, RSN); Yale Law School, Public Law Research Paper No. 592; 2016-12-29 → 2017-09-10; 45 pages; ssrn:2890965.

tl;dr → administrative laws [should be] directed at human beings and human organizations, not at [machines].

Laws

  1. machine operators are responsible
    [for the operations of their machines, always & everywhere]
  2. businesses are responsible
    [for the operation of their machines, always & everywhere]
  3. machines must not pollute
    [in a sense to be defined later: e.g. by a "tussle"]

None of this requires new legal theory; cf. licensing for planes, trains & automobiles; and on to nuclear plants and steel mills, unto any intellectual business operation of any kind (ahem, medical, architecture, legal services; and anything at all under the Commerce Clause, no?)

Mentions

  • Isaac Asimov, the stories of
    …and the whole point of the stories was the problematic nature of The Three Laws. They seemed fun and clear, but they were problematized and they don’t work as a supervisory apparatus. Maybe they don’t work at all. Is the same true here? Not shown.
  • Laws of Robotics,
    Three Laws of Robotics.
  • [redefined] the “laws of robotics” are the legal and policy principles that govern [non-persons, unnatural-persons].

Concepts & Principles (HF/SE/IF/AN)

  1. homunculus, a fallacy
  2. substitution, an effect
  3. information fiduciaries, a role
  4. algorithmic nuisance, an ideal (an anti-pattern)

Analysis

A matrix, the cross product, of twelve (12) combinations:

Requirement of (TAdP)
  1. Transparency
  2. Accountability
  3. due Process
Principles of (HF/SE/IF/AN)
  • [the] homunculus fallacy
  • [a] substitution effect
  • information fiduciaries
  • algorithmic nuisance
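The 3×4 cross product above can be enumerated mechanically to confirm the count of twelve; a minimal sketch, with the short labels (TAdP, HF/SE/IF/AN) taken from the outline above:

```python
from itertools import product

# Requirements (TAdP) crossed with principles (HF/SE/IF/AN)
requirements = ["transparency", "accountability", "due process"]
principles = [
    "homunculus fallacy",
    "substitution effect",
    "information fiduciaries",
    "algorithmic nuisance",
]

matrix = list(product(requirements, principles))
assert len(matrix) == 12  # twelve (12) combinations, as claimed
```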

Argot

The Suitcase Words
  • Isaac Asimov.
  • three laws of robotics.
  • programmed,
    programmed into every robot.
  • govern.
  • robots.
  • algorithms.
  • artificial intelligence agents.
  • legal principles,
    basic legal principles.
  • the homunculus fallacy.
  • the substitution effect.
  • information fiduciaries.
  • algorithmic nuisance.
  • homunculus fallacy.
  • attribution.
  • human intention.
  • human agency.
  • robots.
  • belief,
    false belief.
  • person
    little person.
  • robot.
  • program.
  • intentions,
    good intentions.
  • substitution effect.
  • social power.
  • social relations.
  • robots.
  • Artificial Intelligence (AI).
  • AI agents.
  • algorithms.
  • substitute,
    algorithms substitute for human beings.
  • operate,
    algorithms operate as special-purpose people.
  • mediated,
    mediated through new technologies.
  • three laws of robotics
    Three Laws of Robotics.
  • Algorithmic Society.
  • robots.
  • artificial intelligence agents.
  • algorithms.
  • governments.
  • businesses.
  • staffed.
  • Algorithmic Society.
  • asymmetries,
    asymmetries of information,
    asymmetries of monitoring capacity,
    asymmetries of computational power.
  • Algorithmic Society.
  • operators,
    operators of robots,
    operators of algorithms,
    operators of artificial intelligence agents.
  • information fiduciaries.
  • special duties,
    special duties of good faith,
    special duties of fair dealing.
  • end-users, clients, customers, data subjects.
  • businesses,
    privately owned businesses.
  • the public,
    the general public.
  • duty,
    central public duty.
  • algorithmic nuisances.
  • leverage, utilize, use.
  • asymmetries of information,
    asymmetries of monitoring capacity,
    asymmetries of computational power.
  • externalize,
    externalize the costs,
    externalize the costs of their activities.
  • algorithmic nuisance.
  • harms
    harms of algorithmic decision making.
  • discrimination
    intentional discrimination.
  • pollution,
    unjustified pollution,
    socially unjustified pollution,
    contra (socially-)justified pollution.
  • power
    computational power.
  • obligations,
    obligations of transparency,
    obligations of due process,
    obligations of accountability.
  • obligations flow.
  • requirements,
    substantive requirements,
    three substantive requirements.
  • transparency.
  • accountability.
  • due process.
  • obligation,
    an obligation of.
  • fiduciary relations.
  • public duties.
  • measure,
    a measure,
    a prophylactic measure.
  • externalization,
    unjustified externalization,
    unjustified externalization of harms.
  • remedy,
    remedy for harm.

Previously filled.

AFrame: Isolating Advertisements From Mobile Applications in Android | Zhang, Ahlawat, Du

Xiao Zhang, Amit Ahlawat, Wenliang Du; AFrame: Isolating Advertisements From Mobile Applications in Android; In Proceedings of Annual Computer Security Applications Conference (ACSAC); 2013-12-09; 10 pages.

Abstract

Android uses a permission-based security model to restrict applications from accessing private data and privileged resources. However, the permissions are assigned at the application level, so even untrusted third-party libraries, such as advertisement, once incorporated, can share the same privileges as the entire application, leading to over-privileged problems.
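The application-level granularity can be illustrated with a hypothetical AndroidManifest.xml fragment: every permission declared here is held by all code running in the application's process, bundled ad libraries included (the package name and permission choices below are illustrative only, not from the paper):

```xml
<!-- Hypothetical manifest: permissions are granted to the whole application -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.hostapp">
    <!-- Requested for the app's own features, yet an embedded advertisement
         library in the same process can exercise them just as freely. -->
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
    <uses-permission android:name="android.permission.READ_CONTACTS" />
    <application android:label="HostApp" />
</manifest>
```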

We present AFrame, a developer friendly method to isolate untrusted third-party code from the host applications. The isolation achieved by AFrame covers not only the process/permission isolation, but also the display and input isolation. Our AFrame framework is implemented through a minimal change to the existing Android code base; our evaluation results demonstrate that it is effective in isolating the privileges of untrusted third-party code from applications with reasonable performance overhead.

Via: backfill