As IBM Ramps Up Its AI-Powered Advertising, Can Watson Crack the Code of Digital Marketing? | Ad Week

As IBM Ramps Up Its AI-Powered Advertising, Can Watson Crack the Code of Digital Marketing?; ; In Ad Week; 2017-09-24.
Teaser: Acquisition of The Weather Company fuels a new division

tl;dr → Watson (a service bureau, AI-as-a-Service) is open for business.

Mentions

The Weather Company

  • lines of business
    • location-based targeted audiences, delivered to the trade.
    • weather indicia, delivered to consumers.
  • forecasts for 2.2 billion locations, refreshed every 15 minutes
  • Dates
    • 2015-10 announced, 2016-01 closed; acquisition by IBM
    • 2016-01, new business strategy,
      “AI” as a service (AIaaS)
  • Artificial Intelligence (AI)
  • Cloud Computing
  • Products
    • WeatherFx
    • JourneyFx
  • The Weather Company is a <quote>legacy business</quote> (deprecated).
  • AIaaS is a <quote>cutting-edge advertising powerhouse</quote> (house of power).

Watson Advertising

  • Cognitive Advertising
    • contra Computational Advertising, circa the ‘oughties (2005)
    • something about
      • <buzzz>transform every aspect of marketing, from image and voice recognition to big data analysis and custom content</buzzz>
  • What is it? (What is Watson-as-a-Service?)
    • Count: <quote>dozens</quote>
    • Interfaces
      • API
      • Projects <quote>studio-like</quote>
    • Pricing: <quote>millions of dollars</quote>
    • Structure: four (4) sub-units
  • “<snip/>It’s not been designed to target consumers the same way that Alexa or Siri have been,” attributed to Cameron Clayton.

Units

The 4 pillars of Watson Advertising.
  1. Targeting, Audience construction & activation
  2. Optimization, Bidding & buying
  3. Advertising, Synthesis of copy and creative
  4. Planning, media planning, the buy plan, the execution plan

Audience Targeting

  • the flagship service
  • neural networks
  • scoring users, propensity scoring <quote>based on how likely they are to take an action</quote> (toy sketch after this list)
  • towards CPA or CCPV or CPVisit or <more!>
  • Performable not only on the Weather Company O&O properties,
    • <quote>but on TV, print, radio and other platforms.</quote>
    • Partnerships
      • Cognitiv
      • Equals 3
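
A toy sketch of the propensity-scoring idea above (mine, not IBM's): a small neural network scores users by their likelihood to take an action, and the "audience" is whoever clears a cutoff. The features, labels, and cutoff are all hypothetical.

```python
# Illustrative only: a toy propensity scorer in the spirit of "score users by how
# likely they are to take an action." Features, model, and threshold are hypothetical;
# nothing here reflects IBM's actual Watson Advertising models.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical per-user features: [recent_site_visits, weather_severity, ad_exposures]
X = rng.random((5000, 3))
# Hypothetical label: 1 if the user took the desired action (e.g., a store visit)
y = (0.6 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.1, 5000) > 0.5).astype(int)

# A small neural network stands in for the article's "neural networks"
model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
model.fit(X, y)

# Propensity score = predicted probability of taking the action
scores = model.predict_proba(X)[:, 1]

# Build the audience: users whose propensity clears a campaign-specific cutoff
AUDIENCE_CUTOFF = 0.7   # hypothetical; in practice tuned to a CPA / CPVisit goal
audience = np.where(scores >= AUDIENCE_CUTOFF)[0]
print(f"{len(audience)} of {len(scores)} users selected for activation")
```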

Optimization

Bidding Optimization
  • Is too boring for details early in the article.
  • Optimize against brand-specific KPIs.
  • Uses <buzzz>deep learning and neural networks</buzzz>
  • Optimize Cost Per Action (CPA).
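
A minimal sketch of the CPA arithmetic such an optimizer rests on (the textbook break-even relationship, not IBM's algorithm; the numbers are hypothetical): given a predicted conversion probability and a CPA goal, the maximum affordable bid follows directly.

```python
# Illustrative only: the break-even bid for a CPA goal, not IBM's optimizer.
def max_bid_cpm(p_conversion: float, target_cpa: float) -> float:
    """Highest CPM payable while still hitting the CPA goal in expectation.

    Expected cost per action = price_per_impression / p_conversion, so the
    break-even price per impression is target_cpa * p_conversion; CPM is that
    price scaled to 1,000 impressions.
    """
    return target_cpa * p_conversion * 1000.0

# Example: a model (e.g., the propensity scorer sketched earlier) predicts a 0.2%
# conversion rate for this impression, and the brand KPI is a $50 CPA.
print(max_bid_cpm(p_conversion=0.002, target_cpa=50.0))   # 100.0, i.e. a $100 CPM ceiling
```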

Advertising

  • Badged as Watson Ads and Watson Advertising
  • Services
    • content creation
    • content copywriting
  • Launched: 2016-06.
  • Is merely: Interest-Based Advertising (IBA),
    which in turn is but a regulatory term of art that covers a wide range of in-trade practices.
  • Sectors, aspirational
    • <fancy>aviation</fancy> (airline ticket booking?)
    • insurance
    • energy
    • finance
  • Cognitive Media Council,
    • a focus group.
    • a user group, “friends & family” of the business.
    • a group of important customer representatives,
      <quote>senior-level executives from agencies and brands</quote>

Reference Customers
Toyota
  • Mirai
  • Prius Prime
  • Benefits
    Attributed to Eunice Kim, Toyota (TMNA), something about…

    • <buzzz>create a one-to-one conversational engagement</buzzz>
    • <buzzz>garner insights about the consumer thought process that could potentially inform our communication strategies elsewhere</buzzz>
Campbell’s
  • the Soup people
  • Something about creative synthesis
    themed as: recipe generation keyed to flu symptoms and location
H&R Block
  • Something about creative synthesis
    themed as: automated robot tax expert, suggest tax deductions.
UM [You and Em]
  • An agency. Offshore? They have a “U.S. CEO.” Maybe one of those English Invasion thingies.
  • Refused to name their client.
  • Something about auto dealerships.
  • <quote>meshing Watson data with client stats to analyze metrics across a large number of car dealerships in a way that optimizes ad spend while also checking local inventory to see whether or not it should personalize an ad to someone in that market.</quote>
  • <quote>combination of weather data, Google searches and pollen counts to trigger when media should be bought in various markets.</quote>
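
A toy sketch of that kind of trigger rule (mine; the thresholds, field names, and data feeds are hypothetical, not UM's or IBM's):

```python
# Illustrative only: a rule-of-thumb media-buy trigger in the spirit of the quoted
# UM setup (weather + search interest + pollen counts gating a local buy, with an
# inventory check). All thresholds and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class MarketSignal:
    market: str
    pollen_index: float        # e.g., a 0-12 scale
    search_interest: float     # e.g., a 0-100 index for relevant queries
    rain_probability: float    # 0.0-1.0
    local_inventory: int       # units available in this market

def should_buy_media(s: MarketSignal) -> bool:
    """Buy local media only when demand signals spike and local supply exists."""
    demand_spike = s.pollen_index >= 8.0 or s.search_interest >= 60.0
    fair_weather = s.rain_probability < 0.5   # hypothetical foot-traffic proxy
    return demand_spike and fair_weather and s.local_inventory > 0

signals = [
    MarketSignal("Austin, TX", pollen_index=9.5, search_interest=72, rain_probability=0.2, local_inventory=14),
    MarketSignal("Buffalo, NY", pollen_index=2.0, search_interest=31, rain_probability=0.8, local_inventory=0),
]
for s in signals:
    print(s.market, "→ buy" if should_buy_media(s) else "→ hold")
```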

Planning

  • <quote>AI-powered planning</quote>

Partners

Cognitiv
Something about a partnership for understanding marketing texts.
Jeremy Fain, CEO and co-founder
Equals 3
Lucy, a product-service-platform.
Something about <quote>to uncover extra insights and research.</quote>

Fairness & Balance

Promotions

Ogilvy & Mather
  • Honorific <quote>longtime agency</quote> [agency of record for IBM].
Stunts
2011
Jeopardy! (the quiz-show win).
2015
[Television] campaign, with Bob Dylan.
2016
Synthesis of the trailer for Morgan (a movie; genre: science fiction).
2017-02
Performance, an “analysis” of the stylings of Antoni Gaudi, to <quote>inspire an art installation</quote> (what does that mean?)
The “art installation” was exhibited at the Mobile World Congress in Barcelona.
Statista

…is quoted
the future is boosted.

Sectors
  • “AI services”
  • “Big Data services”

Themes

Problem
  • The people are “afraid” of AI.
  • The people need to be groomed to accept AI.
Remediation

Ensmoothen & enpitchen the Artificial Intelligence (AI) as…

  • humble
  • friendly
  • an “I’m here to help” type personality

Attributed to Lou Aversano, Ogilvy.

Detractors

James Kisner, Jefferies

Via: James Kisner, A Report, Jefferies, 2017-07.
Jefferies is an opinion vendor in support of an M&A banking operation.
tl;dr → Watson is a failing product-service. <quote>IBM is being “outgunned” in the race…</quote> (yup, he mixed the metaphor).

  • as evidenced in measured job listings at Monster.com
    Apple had more listings booked thereon than IBM.
  • Customers were interviewed.
    Watson’s performance/price ratio was low (the rate card is very high).
    2016-10, IBM reduced the rate card for API access <quote>by 70 percent</quote>
  • Lots of press
  • Not a lot of monetary results, as evidenced in the quarterly & annual reports.
Joe Stanhope, Forrester

Via: an interview, perhaps;
Forrester is an opinion vendor.

  • Too much hype, but it can be forgiven.
  • (Gartner, a peer opinion vendor, runs the Hype Cycle brand.)
  • Claims: <quote>IBM does seem to be all-in with Watson.</quote> (be nice to hear that from IBM, not as a “hot take” from a newshour pundit.)
DemandBase, Wakefield Research

A Report; attributed to “staff”; DemandBase and Wakefield Research

  • A survey,
    • “how do you feel?”
    • Do you “have plans to …” in the next N months?
  • There are a lot of uncertainties

Uncertainties

Training Data
  • Just isn’t there.
  • And … computers can only give answers; they can’t pose questions.
Does it [even] Work?
  • No one knows.
  • Many are nervous.
  • No one wants to be first to fail
    (& be fired for outsourcing their job function to The AI).

Competitors

  • Einstein, of Salesforce(.com)
  • Sensei, of Adobe
In-House
  • Buying operations, Xaxis of WPP
    the “AI” is a “co-pilot” to the trading desk operator; optimization recommendations towards CPM and viewability; North American operations only.
  • others?
    Surely everyone nowadays has some initiative that does “co-pilot”-level decision support for ad ops.
Research Efforts
  • Amazon
  • Facebook
  • Google
Venture Capital
  • Albert
  • Amenity Analytics
  • LiftIgniter
  • Persado
Amenity Analytics

An exemplar of the smaller-nimbler-smarter clones of the Watson genre.

  • A Watson-type experience, but cheaper
  • Does text mining of press releases
  • Reference customers:
    Pepsi
  • A spin-out from some hedge fund, <quote>origins in the hedge fund world</quote>
  • Nathaniel Storch, CEO, Amenity Analytics.
  • <zing!>“Think of it as ‘moneyball’ for media companies,”</zing!> attributed to Nathaniel Storch.

Consumer

  • Siri, of Apple
  • Cortana, of Microsoft
  • Now, of Google

Who

  • Lou Aversano, U.S. CEO, Ogilvy & Mather (Ogilvy, O&M).
  • Jordan Bitterman, CMO, Watson (Business Unit), IBM.
    attributed in quoted material as of “earlier this year” (2017?); cf. Michael Mendenhall
  • Kasha Cacy, U.S. CEO, UM
    UM is an agency.
  • Cameron Clayton,
    • General Manager, Content and IoT Platform, Watson (Business Unit), IBM.
    • ex-CEO, The Weather Company
  • Jacob Colker, “entrepreneur in residence,” The Allen Institute
    …quoted for color, background & verisimilitude. The Allen Institute is a tank for thinkers.
  • Jeremy Fain, CEO and co-founder, Cognitiv.
  • Chris Jacob, director of product marketing, Marketing Cloud, Salesforce(.com).
  • Eunice Kim, media planner, Toyota Motor North America (TMNA).
    …quoted for color, background & verisimilitude.
  • James Kisner, staff, Jefferies.
    …quoted for color, background & verisimilitude.
    Jefferies is an advice shop, like Gartner, but different.
  • Francesco Marconi,
    …quoted for color, background & verisimilitude.

    • strategy manager and AI co-lead, Associated Press
    • visitor, MIT Media Lab
  • Michael Mendenhall, CMO, Watson (BU), IBM.
    announced as CMO in prior press [Ad Week, Marty Swant, 2017-07-07].
  • Sara Robertson, VP of Product Engineering, Xaxis of WPP.
  • Joe Stanhope, staff, Forrester
    …quoted for color, background & verisimilitude.
  • Nathaniel Storch, CEO, Amenity Analytics.
  • Marty Wetherall, director of innovation, Fallon. Fallon is the agency that booked the H&R Block campaign on Watson.

Pantheon

  • Antoni Gaudi, architect (per civil engineering), citizen of Spain.

Previously

In archaeological order, within Advertising Week

Previously filled.

Big Data and Privacy: A Technological Perspective | PCAST

Big Data and Privacy: A Technological Perspective; Executive Office of the President, President’s Council of Advisors on Science and Technology (PCAST); 2014-05-01; 76 pages; landing.

Related

Workshops

  • White House / UC Berkeley School of Information / Berkeley Center for Law and Technology; John Podesta; 2014-04-01; transcript, video.
  • White House / Data & Society Research Institute / NYU Information Law Institute; John Podesta; 2014-03-17; video.
  • White House / MIT; John Podesta; 2014-03-04; transcript, video.

Who

PCAST Big Data and Privacy Working Group.
  • Susan L. Graham, co-chair.
  • William Press, co-chair.
  • S. James Gates, Jr.,
  • Mark Gorenberg,
  • John Holdren,
  • Eric S. Lander,
  • Craig Mundie,
  • Maxine Savitz,
  • Eric Schmidt.
  • Marjory S. Blumenthal, Executive Director of PCAST; coordination & framing.

PCAST

  • John P. Holdren, co-chair, OSTP
  • Eric S. Lander, co-chair, Broad Institute (Harvard & MIT)
  • William Press, co-vice chair, U. Texas
  • Maxine Savitz, co-vice chair, National Academy of Engineering
  • Rosina Bierbaum, U. Michigan
  • Christine Cassel, National Quality Forum
  • Christopher Chyba, Princeton
  • S. James Gates, Jr., U. Maryland
  • Mark Gorenberg, Zetta Venture Partners
  • Susan L. Graham, UCB
  • Shirley Ann Jackson, Rensselaer Polytechnic
  • Richard C. Levin, Yale
  • Chad Mirkin, Northwestern
  • Mario Molina, UCSD
  • Craig Mundie, Microsoft
  • Ed Penhoet, UCB
  • Barbara Schaal, Washington University
  • Eric Schmidt, Google
  • Daniel Schrag, Harvard

Staff

  • Marjory S. Blumenthal
  • Michael Johnson

Recommendations

From the Executive Summary [page xiii], and also from Section 5.2 [page 49]

  • Recommendation 1 [consider uses over collection activities]
    Policy attention should focus more on the actual uses of big data and less on its collection and analysis.
  • Recommendation 2 [no Microsoft lock-in; no national champion]
    Policies and regulation, at all levels of government, should not embed particular technological solutions, but rather should be stated in terms of intended outcomes.
  • Recommendation 3 [fund]
    With coordination and encouragement from [The White House Office of Science and Technology Policy] OSTP, the [Networking and Information Technology Research and Development] NITRD agencies should strengthen U.S. research in privacy‐related technologies and in the relevant areas of social science that inform the successful application of those technologies.
  • Recommendation 4 [talk]
    OSTP, together with the appropriate educational institutions and professional societies, should encourage increased education and training opportunities concerning privacy protection, including career paths for professionals.
  • Recommendation 5 [talk & buy]
    The United States should take the lead both in the international arena and at home by adopting policies that stimulate the use of practical privacy‐protecting technologies that exist today. It can exhibit leadership both by its convening power (for instance, by promoting the creation and adoption of standards) and also by its own procurement practices (such as its own use of privacy‐preserving cloud services).

Table of Contents

  1. Executive Summary
  2. Introduction
    1. Context and outline of this report
    2. Technology has long driven the meaning of privacy
    3. What is different today?
    4. Values, harms, and rights
  3. Examples and Scenarios
    1. Things happening today or very soon
    2. Scenarios of the near future in healthcare and education
    3. Healthcare: personalized medicine
    4. Healthcare: detection of symptoms by mobile devices
    5. Education
    6. Challenges to the home’s special status
    7. Tradeoffs among privacy, security, and convenience
  4. Collection, Analytics, and Supporting Infrastructure
    1. Electronic sources of personal data
      1. “Born digital” data
      2. Data from sensors
    2. Big data analytics
      1. Data mining
      2. Data fusion and information integration
      3. Image and speech recognition
      4. Social‐network analysis
    3. The infrastructure behind big data
      1. Data centers
      2. The cloud
  5. Technologies and Strategies for Privacy Protection
    1. The relationship between cybersecurity and privacy
    2. Cryptography and encryption
      1. Well Established encryption technology
      2. Encryption frontiers
    3. Notice and consent
    4. Other strategies and techniques
      1. Anonymization or de‐identification
      2. Deletion and non‐retention
    5. Robust technologies going forward
      1. A Successor to Notice and Consent
      2. Context and Use
      3. Enforcement and deterrence
      4. Operationalizing the Consumer Privacy Bill of Rights
  6. PCAST Perspectives and Conclusions
    1. Technical feasibility of policy interventions
    2. Recommendations
    3. Final Remarks
  7. Appendix A. Additional Experts Providing Input
  8. Special Acknowledgment

Mentions

  • The President’s Council of Advisors on Science and Technology (PCAST)
  • PCAST Big Data and Privacy Working Group
  • Enabling Event
    • President Barack Obama
    • Remarks, 2014-01-17
    • Counselor John Podesta
  • New Concerns
    • Born digital vs born analog
    • standardized components
    • particular limited purpose vs repurposed, reused.
    • data fusion
    • algorithms
    • inferences
  • Provenance of data, recording and tracing the provenance of data
  • Trusted Data Format (TDF)

Claims

  • The right to forget (right to be forgotten) is unenforceable and infeasible [page 48].
  • Prior redress of prospective harms is a reasonable framework [page 49]
    • Conceptualized as vulnerable groups who are stipulated as harmed a priori or are harmed sunt constitua.
  • Government may be forbidden from certain classes of uses, despite their being available in the private sector.

    • Government is allowed some activities and powers
    • Private industry is allowed some activities and powers
    • It is feasible in practice to mix & match
      • government coercion => private privilege => result
      • private privilege => private coercion => result

Consumer Privacy Bill of Rights (CPBR)

Obligations [of service providers, as powerful organizations]

  • Respect for Context => use consistent with collection context.
  • Focused Collection => limited collection.
  • Security => handling techniques
  • Accountability => handling techniques.

Empowerments [of consumers, as individuals]

  • Individual Control => control of collection, control of use.
  • Transparency => of practices [by service providers]
  • Access and Accuracy => right to review & edit [something about proportionality]

Definition of Privacy

The definition is unclear and evolving. It is frequently defined in terms of the harms incurred when it is lost.

Privacy Framework via Harms

The Prosser Harms, page 6. <quote>

  1. Intrusion upon seclusion. A person who intentionally intrudes, physically or otherwise (now including electronically), upon the solitude or seclusion of another person or her private affairs or concerns, can be subject to liability for the invasion of her privacy, but only if the intrusion would be highly offensive to a reasonable person.
  2. Public disclosure of private facts. Similarly, a person can be sued for publishing private facts about another person, even if those facts are true. Private facts are those about someone’s personal life that have not previously been made public, that are not of legitimate public concern, and that would be offensive to a reasonable person.
  3. “False light” or publicity. Closely related to defamation, this harm results when false facts are widely published about an individual. In some states, false light includes untrue implications, not just untrue facts as such.
  4. Misappropriation of name or likeness. Individuals have a “right of publicity” to control the use of their name or likeness in commercial settings.

</quote>

Adjacencies

<quote>One perspective informed by new technologies and technology‐mediated communication suggests that privacy is about the “continual management of boundaries between different spheres of action and degrees of disclosure within those spheres,” with privacy and one’s public face being balanced in different ways at different times. See: Leysia Palen, Paul Dourish; Unpacking ‘Privacy’ for a Networked World; In Proceedings of CHI 2003, Association for Computing Machinery, 2003-04-05.</quote>, footnote, page 7.

Adjacency Theory

An oppositional framework wherein harms are “adjacent to” benefits:

  • Invasion of private communications
  • Invasion of privacy in a person’s virtual home.
  • Public disclosure of inferred private facts
  • Tracking, stalking and violations of locational privacy.
  • Harm arising from false conclusions about individuals, based on personal profiles from big‐data analytics.
  • Foreclosure of individual autonomy or self‐determination
  • Loss of anonymity and private association.

Mosaic Theory

Obliquely referenced via a quote from Justice Sotomayor.
<quote>“I would ask whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables the Government to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on.” United States v. Jones (10‐1259), Sotomayor concurrence.</quote>

Yet, not cited, but related (at least):

Definition of Roles [of data processors]

  • data collectors
  • data analyzers
  • data users

The data generators or producers in this roles framework are substantially only customers or consumers (sic).

Definitions

  • Definition of analysis versus use
    • <quote>Analysis, per se, does not directly touch the individual (it is neither collection nor, without additional action, use) and may have no external visibility.
    • & by contrast, it is the use of a product of analysis, whether in commerce, by government, by the press, or by individuals, that can cause adverse consequences to individuals.</quote>
  • Big Data => definitions
    • [comprises data with] <quote>high‐volume, high‐velocity and high‐variety information assets that demand cost‐effective, innovative forms of information processing for enhanced insight and decision making</quote>, attributed to Gartner Inc.
    • <quote>a term describing the storage and analysis of large and/or complex data sets using a series of techniques including, but not limited to, NoSQL, MapReduce, and machine learning</quote>, attributed to “computer scientists” on arXiv.

Quoted

The strong, direct, unequivocal, un-nuanced, provocative language…

<quote>For a variety of reasons, PCAST judges anonymization, data deletion, and distinguishing data from metadata (defined below) to be in this category. The framework of notice and consent is also becoming unworkable as a useful foundation for policy.</quote>

<quote>Anonymization is increasingly easily defeated by the very techniques that are being developed for many legitimate applications of big data. In general, as the size and diversity of available data grows, the likelihood of being able to re‐identify individuals (that is, re‐associate their records with their names) grows substantially. While anonymization may remain somewhat useful as an added safeguard in some situations, approaches that deem it, by itself, a sufficient safeguard need updating. </quote>
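
For concreteness, the classic quasi-identifier linkage attack that passage alludes to, as a toy sketch (fabricated data; a simple join suffices):

```python
# Illustrative only: re-identification by joining an "anonymized" release against a
# public, identified dataset on shared quasi-identifiers. All rows are fabricated.
import pandas as pd

# "Anonymized" release: names stripped, sensitive attribute retained
released = pd.DataFrame({
    "zip":       ["02139", "02139", "73301"],
    "birthdate": ["1970-07-31", "1985-01-02", "1990-06-15"],
    "sex":       ["F", "M", "F"],
    "diagnosis": ["hypertension", "asthma", "diabetes"],
})

# Public, identified dataset (e.g., a voter roll) sharing the quasi-identifiers
public = pd.DataFrame({
    "name":      ["Alice Example", "Beth Example"],
    "zip":       ["02139", "73301"],
    "birthdate": ["1970-07-31", "1990-06-15"],
    "sex":       ["F", "F"],
})

# Re-identification is just a join on the quasi-identifiers
reidentified = public.merge(released, on=["zip", "birthdate", "sex"])
print(reidentified[["name", "diagnosis"]])
```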

<quote>Notice and consent is the practice of requiring individuals to give positive consent to the personal data collection practices of each individual app, program, or web service. Only in some fantasy world do users actually read these notices and understand their implications before clicking to indicate their consent. <snip/>The conceptual problem with notice and consent is that it fundamentally places the burden of privacy protection on the individual. Notice and consent creates a non‐level playing field in the implicit privacy negotiation between provider and user. The provider offers a complex, take‐it‐or‐leave‐it set of terms, while the user, in practice, can allocate only a few seconds to evaluating the offer. This is a kind of market failure. </quote>

<quote>Also rapidly changing are the distinctions between government and the private sector as potential threats to individual privacy. Government is not just a “giant corporation.” It has a monopoly in the use of force; it has no direct competitors who seek market advantage over it and may thus motivate it to correct missteps. Governments have checks and balances, which can contribute to self‐imposed limits on what they may do with people’s information. Companies decide how they will use such information in the context of such factors as competitive advantages and risks, government regulation, and perceived threats and consequences of lawsuits. It is thus appropriate that there are different sets of constraints on the public and private sectors. But government has a set of authorities – particularly in the areas of law enforcement and national security – that place it in a uniquely powerful position, and therefore the restraints placed on its collection and use of data deserve special attention. Indeed, the need for such attention is heightened because of the increasingly blurry line between public and private data. While these differences are real, big data is to some extent a leveler of the differences between government and companies. Both governments and companies have potential access to the same sources of data and the same analytic tools. Current rules may allow government to purchase or otherwise obtain data from the private sector that, in some cases, it could not legally collect itself, or to outsource to the private sector analyses it could not itself legally perform. [emphasis here] The possibility of government exercising, without proper safeguards, its own monopoly powers and also having unfettered access to the private information marketplace is unsettling.</quote>

Referenced

Substantially in order of appearance in the footnotes, without repeats.

Via: backfill, backfill


Snide

And yet, even with all the letters and the professional editing and tech-writing staff available to this national- and historical-level enterprise, we still see [Footnote 101, page 31]:

Qi, H. and A. Gani, “Research on mobile cloud computing: Review, trend and perspectives,” Digital Information and Communication Technology and it’s Applications (DICTAP), 2012 Second International Conference on, 2012.

The correct listing is at Springer:

Digital Information and Communication Technology and Its Applications; International Conference, DICTAP 2011, Dijon, France, June 21-23, 2011; Proceedings, Part I; Series: Communications in Computer and Information Science, Vol. 166; Cherifi, Hocine, Zain, Jasni Mohamad, El-Qawasmeh, Eyas (Eds.); 2011, XIV, 806 p.

But:

  • it’s → a contraction of “it is”
  • its → the possessive

Ergo: s/it's/its/g;