
Terrible things, yes, but great

Wow, what a couple of weeks!  Who would have thought that a company would be in the middle of a scandal about using Facebook data without permission against us?  Oh wait, that’s happened before… Well, at least Cambridge Analytica did something really new with it.  It sort of feels like we are living in the middle of Wag The Dog.  I wanted to write a post that tied together KM, sharing, best practices, trust, etc., and this Cambridge Analytica scandal, but I don’t think I can do that by COB tomorrow (a personal goal, no real reason).  Having just submitted a paper to class for my peers to review, I can see the threads of information provenance running through the news stories about Cambridge and the data scientist who blew the whistle, Chris Wylie. BuzzFeed said,

According to documents obtained by BuzzFeed News and interviews with people who knew Wylie, the young data scientist was enthralled by the idea of a tool that years later would be used to create detailed psychological profiles of the US electorate ahead of the 2016 presidential election. These documents, provided by three sources who verified their identities and connections to Wylie, but declined to be named, include presentation slides, online correspondence, employment contracts, and work prospectuses.

But while the documents do not always cast Wylie — touted by some as “The Millennials’ first great whistleblower” — in a flattering light, they also do not call his claims into question. In fact, they bolster the substance of the reporting by the New York Times and the Observer about Cambridge Analytica, and add credibility to the allegations Wylie has made to the press since last Friday. (Emphasis mine)

https://www.buzzfeednews.com/article/ryanmac/christopher-wylie-cambridge-analytica-scandal#.wuLqo0R3Z

And in doing some research, I came across this prank by someone’s roommate – hysterical, but prophetic.  It strikes me that people like to tell stories – see the article by Brown and Duguid, as mentioned by ebcollier, or the way the prank post talks about how-to – see Jenn Nippert’s blog as well, or Bob Prestley’s blog post about communities – wouldn’t this prank apply?  It seems harmless enough.  And what about all of us on Facebook?  Didn’t we contribute to the problem by letting them watch us in action so often that we could be codified?  That is, our tacit understandings of ourselves were externally codified into explicit – and exploitable – knowledge about ourselves.  It would seem harmless at the individual level, as long as it didn’t happen to you.  The Cambridge Analytica story ties together various pieces of knowledge management – how else does a 20-something learn about hyper-targeting and compliance techniques?  (Run that through Web of Science and you won’t find anything per se – try looking around for compliance-gaining techniques and piece in hyper-targeting.)  Hyper-targeting is the idea that one can layer multiple audience targeting techniques to segment an audience down to a narrow slice, and then adjust the story to suit each targeted audience.  Because the tailored story fits our existing perceptions of the world, it can be used to incite people (dog whistle politics, anyone?).  But using big data science practices to adjust stories and influence people on a huge scale – that might be terrible genius.  A toy sketch of the idea appears below.
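To make the hyper-targeting idea concrete, here is a minimal, purely illustrative sketch in Python: layer several audience attributes into narrow segments, then pick a message variant per segment. Every field name, segment, and message variant here is hypothetical – this is not Cambridge Analytica’s method, just the general shape of the technique described above.

    # Toy sketch of hyper-targeting: intersect several audience attributes
    # into narrow segments, then tailor the "story" shown to each segment.
    # All profiles, attributes, and messages are made up for illustration.

    from collections import defaultdict

    # Hypothetical user profiles derived from observed (once-tacit) behavior.
    profiles = [
        {"id": 1, "region": "midwest", "age_band": "18-29", "top_issue": "jobs"},
        {"id": 2, "region": "midwest", "age_band": "50-64", "top_issue": "security"},
        {"id": 3, "region": "south",   "age_band": "18-29", "top_issue": "jobs"},
    ]

    # Message variants keyed by the intersection of attributes (the "segment").
    message_variants = {
        ("midwest", "18-29", "jobs"):     "Variant A: local hiring story",
        ("midwest", "50-64", "security"): "Variant B: safety-framed story",
    }

    def segment(profile):
        """Combine multiple targeting attributes into one narrow segment key."""
        return (profile["region"], profile["age_band"], profile["top_issue"])

    # Group users by segment, then assign each segment its tailored story.
    audiences = defaultdict(list)
    for p in profiles:
        audiences[segment(p)].append(p["id"])

    for seg, user_ids in audiences.items():
        story = message_variants.get(seg, "Default: generic broadcast message")
        print(f"segment={seg} users={user_ids} -> {story}")

The point of the sketch is only that the mechanics are trivial once the behavioral data exists; the hard (and troubling) part is collecting enough explicit knowledge about people to make the segments meaningful.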

Note that in the middle of the prank post, there is a section on how to get around current Facebook policies on audience size.  This could be treated like a best practice – Szulanski would likely have something to say about that – but Bob Prestley’s blog about Lucas and trust is a nice counterpoint to Szulanski; in this scenario the prank poster is enhancing his reputation by showing off his mastery of the technology, something Wasko and Faraj talked about in their article, “Why Should I Share?”.

I still don’t know where Chris Wylie got his inspiration.  In reading the article by Daneshgar & Parirokh, whose research intent is “How can the existing bulk of customer knowledge accumulated in many of today’s academic libraries be used in more effective ways?”, they examine the concept of “market orientation,” which is closely linked to the organizational learning process.  They examine the value of a largely unleveraged resource – the library clients themselves – and suggest that market orientation is also tied to the organization-wide generation of knowledge about current and future customer needs, the dissemination of that knowledge across departments, and organization-wide responsiveness to it.  Normally I would call that market research and trending – because I do call it that at work – but the point is well taken.  I spend significant time interviewing the field sales people to get insights about what is trending and changing across the customer base.  Daneshgar and Parirokh suggest that libraries should do the same, so that their collections are used more effectively.  The only question I have is, where is the adult supervision in all this?  (No, it’s not the library’s job to provide adult supervision to people bent on nefarious ends.)  Chris Wylie is quoted: “Our goal is first to make it an extremely profitable company, then we will cleanse our souls with other projects, like using the data for good rather than evil. But evil pays more.”  The really interesting tidbit in this scandal is that they tried to sell their services to the Liberal party first, because they are supposed to have an open mind about things.  But the Liberals didn’t snap at the bait, so they turned to the (UK opposition – which I forget) and the Republicans.

Well, we know the rest.  Why is everyone mad at Facebook?  The big scandal is that the electorate was tampered with; sadly, that’s not a crime (yet).

I guess Uncle Ben did say it best: with great power comes great responsibility.

Bibliography

Daneshgar, F. & Parirokh, M. (2012). An Integrated Customer Knowledge Management Framework for Academic Libraries. The Library Quarterly, 82, 7–28. doi: 10.1086/662943

Brown, J. S. & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2, 40–57.

Szulanski, G. (1996). Exploring Internal Stickiness: Impediments to the Transfer of Best Practice Within the Firm. Strategic Management Journal, 17, 27-43.

Wasko, M. M. & Faraj, S. (2005). Why Should I Share? Examining Social Capital and Knowledge Contribution in Electronic Networks of Practice. MIS Quarterly, 29, 35–57.


7 thoughts on “Terrible things, yes, but great”

  1. The Daneshgar article did not initially catch my eye, but I think I’ll have to take a look at it now after reading your assessment. I too have concerns with trying to leverage patron information to provide better collections and services, if only because libraries already tend to possess a great deal of personal information that could be put to nefarious purposes if one so wished. Even if we only use information provided anonymously and voluntarily for a specified purpose, an astute employee could link a survey with an individual based on similarities in the library’s patron records, and thus have an idea of how useful that patron’s contribution really is. Unlike at bigger organizations, like, say, Target, low-level library employees can have easy access to sensitive patron information, but there is an established level of trust between the two parties which makes the situation seem more acceptable.

    I don’t want to get too far into it without having read the article, but those are my initial concerns when I think of market research in the context of libraries, especially in light of the scandals you mentioned with Facebook and Cambridge.

    1. Hi Alex, surprisingly the Target teams do have access to patron behavior – I just returned from a conference where it turns out we both use the same software to track loyalty programs and customer behavior. I haven’t seen nefarious behavior from Target, but Uber was an awfully good example of bad monitoring behavior.

  2. I wouldn’t say that we’re mad at Facebook, I would say that we’re disappointed in how they have approached discussing privacy management in a retroactive fashion.

    That being said, I think we should disentangle this notion of “what we should share,” since many of these rules are implicit among those who use the platform. How can Facebook attempt to keep their users in mind when users have a tacit set of rules?

    1. Now that is an excellent question. Perhaps Facebook should be much more explicit about how much information they generate. That is what got ‘borrowed’ after all – the analyzed data that categorizes us all.

      1. True, I think if Facebook were more upfront about what data they were “sharing,” it might have given some people pause to think about what they want to post. On the other hand, it seems to be easier to ask for forgiveness than it is to ask for permission.

    2. I haven’t been following this story too closely because I’ve been on a news diet / strike for a couple of months now and deleted FB a long while ago, but weren’t academics involved? I used to think that public API + public communication = game/fair-play, but in the last year, I’ve decided that I’m much too uncomfortable with using any of this data for any kind of research. In any case, I think there’s some blame among the academics, too.
