This post continues the saga of organizational learning practices; tacit, or unwritten, knowledge; and how things ultimately get better.  Based on commentary by JennNippert, I elected to read the article by Brown and Duguid.  I also added some other articles that sounded interesting, and since I worked on the Space Shuttle program, a chance to revisit the Challenger disaster via the Kumar & Chakrabarti article shot to the top of the list.

Astronaut Bruce McCandless II approaches his maximum distance from the Earth-orbiting Space Shuttle Challenger. McCandless is in the midst of the first “field” tryout of the nitrogen-propelled, hand-controlled back-pack device called the manned maneuvering unit (MMU).

In 1985 I went to work on the Space Shuttle program.  As the senior Quality Engineer for the depot work on the Manned Maneuvering Unit (MMU) and the Remote Manipulator System (RMS, aka the ‘arm’), I became a member of the professional community of engineering staff working on the space shuttle. When I saw that one of the articles we could read was about the Challenger accident (Kumar & Chakrabarti, 2012), I could not resist. The authors present the events leading up to the disaster from a very interesting perspective – what is it about tacit knowledge that can blind managers and ultimately limit their awareness and acceptance of new information? To start, the authors identify three facts from the Rogers Commission Report:

  1. NASA tasted plenty of success before the Challenger disaster.  The authors contend that this success blinds people, somewhat, to the possibility of failure.  Ostensibly, if it worked last time, why won’t it work this time?
  2. Managers and engineers had been aware of problems with the SRB joints and O-rings since 1977, but never considered them a threat.  At one point, the O-rings had degraded due to erosion by about 30%.  The team considered that a good sign – two-thirds of the material remained, so there was plenty of safety margin.  The reverse was the correct perception: the O-rings were not designed to tolerate any erosion, and any erosion should have been treated as a major risk item.
  3. NASA and Thiokol managers proceeded with the launch on January 28, 1986, despite strong appeals from Thiokol engineers not to.  From the article:
Astronaut Scott Parazynski totes a Direct Current Switching Unit while anchored on the end of the Canadian-built Remote Manipulator System (RMS) robotic arm.

…MSFC [Marshall Space Flight Center] manager Larry Mulloy first asked for the opinion of Kilminster, Thiokol’s Vice President of Space Booster Programs. Kilminster sided with his engineers’ advice to not launch. Mulloy then asked for his colleague George Hardy’s opinion, who said that though he was appalled at the recommendation to delay launch he would still go by engineers’ recommendations. Mulloy, who was not happy with this, then stated that the engineers’ data was inconclusive, and did not support their recommendations. The Thiokol managers then had a ‘‘private’’ five minute meeting (in the presence of the engineers) during which they decided to take a ‘management decision’.

Along the way, the engineers had apparently followed a different path to knowledge than the management team.  As a result, they were more open to new ideas and possibilities, based upon the tacit knowledge they held.

…Engineers had noted that O-ring related problems almost certainly occurred in earlier flights launched at ambient temperatures below 65°F

So when temperatures at the pad dropped to 26°F, they were able to extrapolate and recognize the danger.  Management, who had not walked that particular path, could not accept the danger for what it was.  Since this was completely new territory for both, it is accurate and correct to say that both points of view are valid. However, you can just imagine management thinking, “Well, we’ve launched in cold weather before and the O-rings have held together, so this might be something we can ignore again”.   (I am reminded of times in high school math and science courses when the teacher would say, “in this case, the term is nearly zero, so we can ignore it – the effect is minuscule”.)  Of course, in this scenario, the effect is not minuscule, and the O-rings’ elasticity is a very big issue.

And both points of view were reasonable at the time.

Chua examines the ability of various governmental and public safety institutions to learn from disasters and to incorporate the lessons they learn – to become, in effect, a learning organization.  My favorite line, and one with which I completely agree, was:

Knowledge creation is heavily social in nature.

Chua makes the point that through dialog, storytelling, etc., knowledge is exchanged, justified, examined, and refined into new knowledge.  I strongly suspect that people create mnemonics to easily communicate and reinforce the lessons of the past.  We often call these adages, but idioms and proverbs are also relevant terms.  By incorporating these simple mnemonics into everyday patterns, we can instill the lessons broadly across the social fabric.  For this post, focused as it is on disasters, I would select the following adages:

  • “Hindsight is 20-20”,
  • “It seemed like a good idea at the time”,
  • “If I knew then what I know now…”,
  • “Failure is the best teacher”

One of the aspects of any social community – and a community of practice is an excellent example – is the issue of trust.  (See my previous post, Spieglein, spieglein…, which discusses trust as an aspect of anyone’s deliberate hiding of information.)  Chua writes,

“However, the fact that arduous relationships and a culture of distrust evident among agencies in Katrina were quickly replaced by collegiality and the willingness to stand in solidarity [during Hurricane] Rita cast a new light on the notion of trust. It is likely that those agencies were drawn together by a similar sense of mission and became resolute in overcoming the second disaster after having paid a hefty price for the first. Thus, it appears that trust among agencies could be expedited if there is a clear alignment to a common set of superordinate objectives. It remains to be seen, nonetheless, if trust fostered under such conditions endures”.

Note here that I am not accusing the agencies of deliberately hiding information during Hurricane Katrina; rather, they did not know enough about disaster management to know what to share, how much to share, or how to enforce the need to share.  Intriguingly, trust issues can apparently be overcome if an organization or person is sufficiently motivated to share, as the agencies were in the fallout from their response to Hurricane Katrina.

Which brings me to the idea of an organization that learns.  Brown and Duguid (1991) examine the variance between a major organization’s “formal descriptions of work both in its training programs and manuals and the actual work practices performed by its members”, drawing on research by Orr.  What gets written down in the manuals and training programs is referred to as canonical practice; yet it often bears little resemblance to the actual work done in the field.  Brown and Duguid focus on Orr’s ethnographic work, emphasizing the attempt of a field team to resolve an equipment problem that does not fit the organization’s idea of a possible problem, thus building explicit knowledge from their tacit knowledge:

The story-telling process continued throughout the morning, over lunch, and back in front of the machine, throughout the afternoon, forming a long but purposeful progression from incoherence to coherence: “The final trouble-shooting session was a five hour effort…. This session yielded a dozen anecdotes told during the trouble shooting, taking a variety of forms and serving a variety of purposes” (Orr 1990b, 10).

…An important part of the reps’ skill, though not recognized by the corporation, comprises the ability to create, to trade, and to understand highly elliptical, highly referential, and to the initiated, highly informative war stories.

…in telling these stories an individual rep contributes to the construction and development of his or her own identity as a rep and reciprocally to the construction and development of the community of reps in which he or she works. Individually, in telling stories the rep is becoming a member.

I think that “war stories” represent the explicit embodiment of organizational learning.  In the retelling, and the implicit reexamination of the story, the community learns or reinforces the skills and knowledge the story imparts.  A mnemonic, such as an adage, would help the community remember.

As an example “war-story”, let me tell you about one of the Quality Engineers I first worked with (and sat across from) during my time on the Space Shuttle program: Maynard.  Maynard was at least twice my age at the time and had “been around the block more than once”.  He loved to tell jokes.  They were the same jokes over and over.  Eventually, we numbered the jokes and he could often just look up, say “Joke #4”, and we’d all laugh.


Notes and bibliography:

From the McGraw-Hill classroom site:

Idioms, adages, and proverbs are types of common expressions and sayings that have meanings beyond what can be understood by their individual words.

An idiom is an expression common to a particular culture that does not mean what it literally says. You have to learn the meanings of idioms, just like you learn the meanings of words. For example, the idiom Break a leg! means “Good luck!” People often say this to performers before a show.

A proverb is a statement of practical wisdom expressed in a simple way. An example of a proverb is “A stitch in time saves nine,” which means that doing something in a timely way saves you from having to do more work later. An adage is a well-known proverb that has been used for a long time. An example of an adage would be “Where there’s smoke, there’s fire,” which means that if there is evidence that something is happening, it probably is actually happening. Adages and proverbs are so closely related that the terms are often used interchangeably.

Bibliography:

Brown, J. S., & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2, 40–57.

Chua, A. Y. (2007). A tale of two hurricanes: Comparing Katrina and Rita through a knowledge management perspective. Journal of the American Society for Information Science and Technology, 58, 1518–1528. doi: 10.1002/asi.20640

Kumar, A., & Chakrabarti, A. (2012). Bounded awareness and tacit knowledge: Revisiting the Challenger disaster. Journal of Knowledge Management, 16, 934–949.

McGraw-Hill. (n.d.). Understand Common Expressions and Sayings. Retrieved February 03, 2018, from https://mhschool.com/lead_21/grade5/ccslh_g5_lv_6_3f.html


Comments

One response to “Adages, Idioms, Proverbs, are Lessons Learned”

  1. mcoblentz@comcast.net

    It occurs to me now that there is an adage for the article’s ‘manager’s bounded awareness’ issue – “he had blinders on”, referring, of course, to the black square leather patches placed on a horse to keep it from bolting.
