Who’s Zoomin’ Who?

How do you know that? Why do you think that?  How does that make any sense?  

I was a highly opinionated child with a lot of crazy ideas. But my Dad was patient. He never told me “that’s crazy” or “that’s wrong”.  Instead he usually greeted my pronouncements with some variation of those three questions and often he strung them together into a dialogue.  I’d answer and he’d ask the next question or repeat the first.  At some age, I don’t really recall when,  I began to internalize those questions and the resulting dialogue.  When I got to college I had the chance to study rhetoric and semantics. I added my own questions to his three.

Why these words? What do they want me to think/feel/do? Why are they saying this?

I guess these questions are what the education folks call “critical thinking”. What I know is that we’d be better off asking these questions when we read. I’ve been reading lots of stories, tweets, and posts about “fake news” websites and the need for improved “fact-checking” and digital literacy.  But I’m not too sure we’re getting at the problem. The problem is a lack of critical thinking of the kind my Dad practiced.  Instead, people seem to be emphasizing the following questions:

What are the “facts”? Is this true? Is this a “legitimate” news site? Should I trust this source? How do we filter out the “fake news”?

These are the wrong questions. They won’t lead to critical insight. They’ll only lead to more deception and propaganda.  I see two problems with these questions people are posing.

First, not everything can be reduced to some “fact” status as either true or not true. I don’t want to get into some deep philosophical exploration of the nature of truth; I just want to point out that any statement about the future or about intentions is inherently speculative and cannot be “fact-checked”. All statements of policy intent are statements about the future.  A person can lie about their intentions (and even lie to themselves), but such a lie cannot be “fact-checked”. The lie can only be challenged by building a reasoned argument for why the person should not be believed. Further, the class of things that can be called “facts” includes only objectively verifiable things. Yet subjective things matter too. Feelings, preferences, and perceptions cannot be “fact-checked”. Culture is made more of feelings and perceptions than of facts.

I could elaborate on the inadequacy of “fact-checking” and likely will in some future post, but right now I want to focus on the second issue: the problems involved in focusing on “legitimate” vs. “fake” news sites.  This isn’t really critical thinking at all. It’s a reliance on authority as the sole arbiter of truth. It’s actually the approach that says we don’t have to engage the actual message itself or think critically about it. This approach advises us to divide the world into approved “legitimate” news sources, presumably nice establishment entities such as the New York Times, the Washington Post, or ABC/CBS/NBC/CNN.  I suppose whether Fox News qualifies depends on whether you’re Republican or Democrat.  But other sources are deemed suspicious and likely to be “fake”.  Folks, the problem isn’t whether the news publisher is “legit”; it’s whether the news story itself is “legit”.  Big difference.

Let me use a story that has made the rounds in the last day or so.  The Washington Post published a story with the headline:
Russian propaganda effort helped spread ‘fake news’ during election, experts say

Almost instantly, the Twittersphere and blogosphere lit up with mostly unhappy Clinton supporters claiming this is the biggest news story and everybody is missing it.  And yet the Washington Post story fails on all my Dad’s questions. There’s nothing really there. And when I examine its semantics and ask “cui bono?” of this piece, I find it seriously lacking.  I don’t have to take it apart for you because Fortune magazine and journalist Caitlin Johnstone, quoting Glenn Greenwald, did it for me.  You can read for yourself:

Fortune:  Russian Fake News

Caitlin Johnstone on Newslogue: Glenn Greenwald Just Beat The Snot Out Of Fake News Rag ‘The Washington Post’

(update 28Nov2016: An even better critical thinking take-down of the Washington Post article from William Black at New Economic Perspectives: The Washington Post’s Propaganda About Russian Propaganda )

I’ll reiterate what I’ve said on Twitter and FB.  We shouldn’t be calling out “fake news” sites. We shouldn’t even be calling out “fake news”.  We should call it what it is: propaganda.  Calling it “fake news” will mislead us and get all of us into trouble.  It leads to binary thinking: is this “true” or “fake”?  The problem is propaganda. The most effective propaganda is neither true nor fake. It contains at least some elements of truth or facts but uses rhetorical sleight of hand to get you to believe something you really don’t know. We used to call it spin, but I guess that’s gone out of style.

Let’s remember “legitimate” news sources can and often do deliver propaganda, “fake news” if you will, just as easily and even more effectively than any “fake news sites” spun up by some troll teenager in his basement.

I’m old enough to remember that the legitimate news sources delivered the news to us about the Gulf of Tonkin incident and Saddam Hussein’s weapons of mass destruction and anthrax.   Those were propaganda, “fake news”, spun up to work the nation up to war. Unfortunately, they worked, and hundreds of thousands died. Indeed, the march to war is always accompanied by the wholehearted support of the merchants of death and the “legitimate” news sources.

Crying “Russians! Russians!” is dangerous. Accepting such stories uncritically is even more dangerous.  It allows people, especially establishment Democrats, to ignore their own culpability in creating this disaster of an impending Trump presidency. But even more dangerous, it feeds the war machine. We have a populace that wants to look elsewhere for someone to blame for its problems: Republicans want to blame Arabs, Muslims, and immigrants.  Now Democrats are crying out to blame Russians.  That way lies madness. Let’s remember, when it comes to world wars, it’s three strikes and we’re all out.

So I humbly ask that we all ask ourselves as we read these days: who’s zoomin’ who here?

Hat tip to the Queen of Soul, Aretha Franklin, for the inspiration for the post.  Enjoy:

 

 

Critical Analytics: It’s Stories All the Way Down

I’ve been hearing much lately about stories, narratives, analytics, data, and “big data”.  I have no need to call out exactly who or which pieces of writing. You know who you are. My aim here is not to criticize, oppose, or take sides. It’s to take a brief critical look at what’s being discussed.

Much of the discussion strikes me as one tribe (I’ll call them non-quants) pleading that stories and narratives are important too!  All of which is an understandable reaction to how the other tribe (I’ll call them quants) has seemingly gained a favored position and perceived superiority at divining the “truth” because they are evidence-based!  Because data! I’m actually a member of both tribes and find the posturing of stories and narratives as an alternative to quantitative analysis disheartening.

The most encouraging blog piece I’ve read recently comes from Michael Feldstein: his lengthy (and excellent) post called Analytics Literacy is a Major Limiter of Edtech Growth.  Please do read it.   He argues for dissolving this false juxtaposition between “stories” and “data”.

…some of these arguments position analytics in opposition to narratives. That part is not right. Analytics are narratives. They are stories that we tell, or that machines tell, in order to make meaning out of data points. The problem is that most of us aren’t especially literate in this kind of narrative and don’t know how to critique it well.

I wholeheartedly agree.  Feldstein is (correctly) arguing that data points are nothing without stories.  The meaning we take from the data is itself nothing but a story we weave using the data points as we might use punctuation or particular words.  In essence, quantitative analysis is itself a story.

This really isn’t news, or at least it shouldn’t be.  I remember how powerful McCloskey’s Rhetoric of Economics was for me when I read it decades ago.  McCloskey powerfully made the point that no matter how much we wrapped an idea in data, mathematical formalism, or econometric analysis, everything we said in economics was just a metaphor or a story we imposed on the data. Alan Grossman long ago pointed out that even that high temple of data-driven evidence, Science(tm), is still just rhetoric and still just stories.

Yes, the meaning we attach to a set of data is itself a story.  So stories are not alternatives to data. Data is a story.  But it’s not just the obvious story we tell with the data. There’s an unstated story underneath the data that we use. Our choice of particular data variables constitutes a story itself. We (or at least the data collectors) have in mind a story and narrative of what’s important before the data gets collected.  They don’t collect data about the context they don’t see as important or relevant (or easy enough to collect), so they assume a story: that the uncollected contextual data holds no meaning.  There’s a story underneath the story we tell with the data.

But it keeps getting deeper. Much like the philosophical turtles, it’s stories all the way down. That measure of the data you’re using. The one you think is just basic stats or math, something like the average (properly called arithmetic mean), or the variance, or correlation, or whatever.  It has a story too.  Let’s take that arithmetic mean (average) and each observation’s difference from the average. We think of that average as “the norm” – but that’s just a story invented by a couple of different statisticians in the 19th century.

I can’t really do justice here to the story of how that idea of the average, or norm, came to be.  I strongly urge you to read The End of Average by Todd Rose.  It’s fully accessible to members of both tribes, quants and non-quants.  You’ll never use your quantitative data the same way again. Clark Quinn, writing in eLearn Magazine of the ACM, had the same kind of dramatic reaction as I did.

I’ve finished reading Todd Rose’s The End of Average, and I have to say it was transformative in ways that few books are. I read a fair bit, and sometimes what I read adds some nuance to my thinking, and other times I think the books could stand to extend their own nuances. Few books fundamentally make me “think different,” but The End of Average was one that did, and I believe it has important implications for learning and business.

Rose’s point is pretty simple: All our efforts to try to categorize people on a dimension like GPA or SAT or IQ are, essentially, nonsensical.

But going another level down, as Rose explains in End of Average, there are assumptions beneath the calculation and use of ordinary stats like the average or the variance.  Let’s face it, “assumptions” is another way of saying “believed a story to be so true that it didn’t need to be stated”.  In the case of the average and the calculation of differences from “the norm”, that assumed story has to do with the ergodic properties of what’s being examined.  So what are “ergodic properties”? Well, here’s Wikipedia’s attempt to explain ergodicity. It’s not very accessible to non-quants (or even most quants!).  Again, I would refer you to Rose’s book for a beginning glimpse of what ergodicity means. I can’t explain it fully here, but the essence is that, mathematically and statistically, the vast majority of the stories being told with quantitative analytics are complete nonsense. Garbage. Invalid. Wishful alchemy.
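
To make the ergodicity point a bit more concrete, here is a minimal sketch in Python. It’s my own toy illustration (not from Rose’s book or Wikipedia), with made-up numbers: for an ergodic process, the average across a group at one moment and the average of any one individual over time tell the same story; give each individual their own baseline and the two averages come apart.

```python
import numpy as np

rng = np.random.default_rng(42)
n_people, n_weeks = 1000, 200

# Ergodic case: every person's weekly score is drawn from the same distribution.
ergodic = rng.normal(loc=70, scale=5, size=(n_people, n_weeks))

# Non-ergodic case: each person has their own personal baseline (hypothetical numbers).
baselines = rng.normal(loc=70, scale=15, size=(n_people, 1))
non_ergodic = baselines + rng.normal(scale=5, size=(n_people, n_weeks))

for label, scores in [("ergodic", ergodic), ("non-ergodic", non_ergodic)]:
    group_avg = scores[:, 0].mean()    # average across everyone, in one week
    person_avgs = scores.mean(axis=1)  # each person's own average over time
    print(f"{label:12s} group average: {group_avg:5.1f}   "
          f"individual averages range: {person_avgs.min():5.1f} to {person_avgs.max():5.1f}")
```

In the first case the group average is a fair summary of any individual; in the second, the individual averages are scattered all over while the group average sits in the middle, describing almost no one in particular. That, roughly, is the story assumed whenever we treat “the norm” as meaningful.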

It’s stories all the way down.  At first this might seem discouraging. But it’s not. I’m calling for not just analytics literacy but a critical analytics.  We need to investigate and become aware of not only the stories we tell using data, but also the assumed stories we slide under the table by choosing particular measures and statistical techniques without thinking about them. We wouldn’t let the semantics of narratives escape critical examination. Why should we let analytics?

 

Open is Alive

OpenEd16 isn’t your normal higher ed conference.  This year it had all the normal features of a higher ed conference: keynotes, stimulating concurrent presentations, food, and evening socializing by academics feeling just a little more freedom from being out of town.   But it also had something new. A jam session.

Yes, that’s right. In addition to organizing the usual conference, David Wiley (@opencontent) rented a drum kit and who knows what other instruments and somehow convinced the Hilton Hotel to allow us to take over the lobby bar from 8 to 10 last night. Anybody from the conference was free to step up to the microphones, grab an instrument, and make music with their peers. Peers they had never practiced with. Peers they were playing with for the first time. Peers who were all at different stages of experience in playing. Peers who had varying levels of talent and skill (I’m assuming that, since being at the zero level on that scale I can’t really judge). Does this sound like an open pedagogy class to anyone yet?

When I heard of the plan, it sounded crazy. But it wasn’t. It was brilliant. It was fun. It was energizing. Some people got up and danced. Many watched and listened intently. Many others were actively engaged in conversations around the room with the music of their peers as background. I think everybody there had fun. And I know I at least had a moment of insight, that exquisite moment when the blood surges in the brain near the right temple that Gardner Campbell told us about in the opening keynote yesterday.

Somebody called it the OpenEd band (although membership was rather fluid). I have to agree. The band actually demonstrated why open education (open pedagogy) works. I’m no music expert, but even I know that objectively they weren’t “great”. They certainly weren’t as polished or slick as the original bands that sold platinum albums of those songs. But that peer-reviewed, objective standard of “great” didn’t matter. Nobody wanted to sit around and hear the albums of The Monkees, The Rolling Stones, Dylan, Bonnie Raitt, Lynyrd Skynyrd, and all the other bands whose songs got played. What mattered was who was playing and that they were playing – creating – live. Live music beats polished recorded music. Everywhere.

Why?  Why would we rather listen to flawed music, complete with mistakes, than all gather to listen together to the perfect, polished recording?  Because it’s live. And live means alive.

I think the same is true with students and learning.  Alive matters. Alive gets us the real learning, not the “picture of the learning”.  But for learning to be alive, somebody has to be actively creating something. We have to be part of a live experience.  To me, the core of open learning is being in that space where things and ideas are created. The best space for that is for both instructors and students to create, share, and publish their own work.  Simply reading or viewing the flawless, peer-reviewed, polished, perfected work of some publisher is like listening to an album in public. It becomes background noise. If directed, we can attend to a small part of it, maybe.  But mostly, it has no effect on us.  On the other hand, reading, viewing, and listening to each other’s creations in the same time and space as they’re being created engages us. It even inspires us to create ourselves.  The flaws don’t matter.  The creating does.

Open works because it’s live. And live means we’re alive.

David Wiley, the leader in the background.


David Wiley, you modeled the proper role of a professor tonight perfectly. You set up the space. You provided the assignment. You mixed the sounds to pull in everybody. Folks engaged in the risky experiment because they trusted you. And then you let the students open it up and create. Open. Live. Alive.

Running Errands for Open Learning Ideas

This is my presentation for Open Ed 2016 in Richmond, VA.   It’s kind of a progress report on the LCC Open Learning Lab project.  It’s very much a work-in-progress (the Lab project, not the presentation).   Assuming the universe cooperates, I’ll follow up on this posting of the slides with a few long-form posts explaining what I said and going into some more detail.

If perchance your browser or Internet connection takes too long to load the above presentation, you can download the file here.

 

Out of the Traps: The Paradoxes of Successful Change

Note: This is the second of three posts that summarize the presentation Sue-Anne Sweeney of Madonna University and I made at the Higher Learning Commission 2016 annual conference in Chicago in April 2016. The first post in this series is The Leadership Traps That Stop Transformative Change in Higher Ed. The slides for the whole presentation are available at the original post Iterating Toward Disruption: The Paradox of Becoming Agile (HLC 2016). The thoughts are our own, based upon our research and our more than eight combined decades of change management experience as both leaders and consultants in both higher ed and many other organizations. They do not necessarily reflect those of our respective institutions.

In this Part II, we explain two of the paradoxes by which we can escape the six leadership traps we identified in Part I. These six traps, though born of good intentions, actually prove self-defeating in our quest to achieve transformative change in higher education.  We labelled those traps as

  • Unquestioned Brilliance
  • Urgency
  • Vision Delusion
  • Technical Solutionism
  • “They just….”
  • Telling, Not Listening

Solving the Puzzle With Paradox

These traps are largely of our own making as leaders. We choose how to react. The traps are made of our own values and perceptions.  The traps intertwine with each other. We choose to emphasize urgency and discount studied learning, which in turn leads us to emphasize Grand Vision to the neglect of detailed context. We choose to make quick analyses and choices, which leads to discounting contradictory information via confirmation bias, which in turn leads us to conclude “they just…don’t get it” or, worse, “they just…don’t want to change”.

But the reality is that people do embrace change. In fact, our very institutions of higher education are monuments to just how much people not only embrace life-altering change, they seek it out. Life-altering change is what we’re about. Every day, students enroll and attend classes, often at great hardship or difficulty to themselves, simply because they want their lives to change. Our research faculty dedicate their lives and work extensive hours in search of information that will change lives and change how we all think. Our teaching faculty dedicate their lives to helping others learn so they can change lives.

Yet as institutions we often seem stuck. Trapped. Change as an institution is difficult and frustrating, often due to the leadership traps we’ve identified. The intertwining of the traps combines with the nature of higher education (if the confidence of unquestioned brilliance is a trap, then higher education is certainly a target-rich environment!) to make change seem intractable. It’s a giant puzzle – a kind of n-dimensional Rubik’s cube. The key to solving the institutional change puzzle lies in Paradox.

So let’s revisit what a paradox is. We turn to Merriam-Webster:

something (such as a situation) that is made up of two opposite things and that seems impossible but is actually true or possible

How can a paradox help us? The seeming contradiction embedded in a paradox opens our thinking. The paradox acts as a siren to our reasoning mind. We want to “solve” the paradox, so we begin to question assumptions and terms, and we open ourselves to new perspectives. It’s fun, especially for intellectuals like us in higher education. Therein lie the solutions to our puzzles – and to our puzzle of change leadership. Edward Teller, the famous theoretical physicist, observed that

two paradoxes are better than one; they may even suggest a solution.

Well, if two paradoxes are better than one, we’ll do even better. We are going to offer five paradoxes that can help unlock the transformative change puzzle and help us escape the traps. Each paradox takes the form of a simple directive or rule (I actually prefer guideline) that at first glance seems to be internally contradictory.

Paradox #1: To Move Faster, Start Slower

We feel the urgency. We know we need large-scale change and we need it as soon as possible. We want to move fast – the urgency trap. The key to moving fast, though, is to start slow. Traditionally, especially in the U.S., we stress and reward accomplishment, and tend to place less importance on the planning needed to effectively prepare for the implementation.  I recall many years ago, back in the early 1980’s, when I first began to learn and absorb Continuous Quality Improvement concepts and the teachings of W. Edwards Deming and William Ouchi. At that time Japanese industry, particularly electronics, precision machinery, and automotive, far exceeded their American counterparts in quality, both real and perceived. I remember an illustration of how the typical American firm at the time approached change. Americans were accomplishment-focused and in a hurry. They were proud of their “bias for action”. It looked like this:

(presentation slide: the traditional American approach to change)

The contrast was this diagram, which illustrated the approach of high-quality, continuous-improvement-focused organizations. Many Japanese firms led the world in quality (and in most any other metric of success) in the early 1980’s because they spent more time up front planning and studying. It was once they really knew and understood the whole system, as a system, that they could take decisive action. Organizations with successful change management reverse the traditional model and spend more time and energy in planning in order to be able to implement rapidly and efficiently.

(presentation slide: the continuous-improvement approach to change)

 

It is indeed ironic that in higher education leadership there seems to be a real fear of “analysis paralysis” and a real desire to move quickly through the planning phase to the doing. We seem to measure leaders by their appearance of doing rather than by their understanding. It is ironic because higher education is all about study and learning. Whether it’s research or teaching, that’s why we exist. Yet when it comes to our own affairs we skimp on the learning part. It’s the Unquestioned Brilliance trap in action.

So we need to start slow. But how does slow become faster? There are three reasons why starting slow can actually lead to a faster overall implementation of change.

  • First, we need to thoroughly understand the phenomenon and how it affects everyone involved with it. Once we know and really understand, we can identify the right actions to take. Study, planning, and learning are vastly cheaper and less disruptive than implementing actions. It’s better to be less efficient at planning & studying and more efficient at doing. Planning helps set better priorities. Precision helps when defining the destination.
  • Second, a more in-depth planning and study phase enables us to not only identify the change we want to have happen, but also to identify the best method of making that change happen.  There may be many routes to our destination. Planning enables us to consider the most efficient and efficacious.
  • Finally, starting slow allows us to practice and refine our skills, new processes, and new tools. What we do in practice, even when slow, determines how we react when we speed up. Again I turn to a lesson learned from my adventures in auto racing. When I first started racing I attended a race-drivers’ school.  I was so excited. Real race cars. A real race track (sports car road course). I was geeked to prove how fast I could go. But the order of the first day was to lap the track in these 150 mph cars at 40-50 mph. We were scrutinized in incredible detail for the lines and approaches we made to each turn on each lap – when it was so slow. But that practice at slow speed built the habits and reactions that enabled us to go really fast a few days later. What we do slowly, we will  attempt to do fast. So we need to practice the right way first.

Paradox #2: To Achieve Big Change, Rapidly Iterate Small Changes

Higher education is in love with the Grand Initiative. That is perhaps why higher education leaders have been so taken by Christensen’s “disruptive innovation” hype, despite Christensen’s own revisions and clarifications and the many criticisms of it. It sounds so good – so attractive to how we think in higher ed. A single “innovation”, a single initiative that conquers all. What greater legacy could a college president or university provost ask for? It seems so plausible too. After all, we see what appears to be dramatic change in many industries led by some significant change in some organization, right? So, despite the evidence that the theory really doesn’t work that way and the evidence that it really isn’t applicable to higher education, we embrace the rhetoric. It sounds so exciting. But rhetoric has consequences, and we may get misled by our own rhetoric. (Confession: I, Jim, have used the term “disruption” in titles of presentations at conferences a few times – mostly as a blatant attempt to use the latest buzzword to attract attendees so I could evangelize what I knew better. Sue is innocent of this charge.)

We want BIG change, but we can get there by biting off small pieces, making small changes, and then rapidly repeating.  Change one aspect of a system quickly, then repeat on other parts. Change one unit and then repeat on the other units.  Learn from experience and practice.

The big bang concept very rarely works. What does work is iteration. The easiest way to achieve dramatic change, so-called “innovative disruption” or transformative change, is to focus on smaller changes and to rapidly iterate or cycle through them. Change, by definition, is doing things differently. We need practice. As we practice, the changed behavior becomes second nature. It enters procedural memory. Iteration is how some of the most successful organizations have accomplished dramatic change. The core of Deming-inspired Continuous Quality Improvement processes is a cycling of small, incremental improvements made on a relentless schedule, with the result that a once-small organization such as Honda or Toyota eventually conquers its world industry.

In a more modern context, the software package WordPress has largely conquered the World Wide Web in less than a decade. Over 25% of all websites are now powered by WordPress. Yet WordPress doesn’t produce dramatic new versions. Instead, the entire open source WordPress community – it’s not even a single company, but rather a community of thousands of volunteers – creates powerful software by adhering to a philosophy of releasing updated versions every 4 months. Sometimes the changes are big and sometimes small. But the key is to relentlessly keep making modest changes. Eventually the changes accumulate like a snowball rolling downhill.  Web browsers work the same way. Remember the old days when there were major updates with dramatic changes at long intervals? Nowadays, Firefox and Chrome get updated regularly at frequent intervals with small changes each time. Gradually the experience changes dramatically, but without the drama of the big bang.

How does iteration work its magic? One way is that small changes allow people to manage the amount of distress and extra energy involved in doing things differently.  As the changes are iterated, people develop some “change stamina” and skills, such as collaborating across silos and routinely engaging in debriefs and lessons learned to improve the process for the next go-round. Practice makes perfect.

A second way iteration works its magic is that it lowers risk. Regardless of whether you measure risk as “the potential damage done by an error” or “the probability of making an error”, risk slows change efforts. A higher perceived risk means people proceed cautiously and slowly. With the grand initiative approach, risk is higher. We can’t risk any mistakes. So we use a “proven solution” even though it doesn’t really produce the transformative change we want or need, because it’s safe and predictable. The big bang or Grand Initiative approach is like having the whole organization take the train together. All parts of the organization are going to go through the same journey at the same time and are expected to arrive at the same central station even if their real destination is a few blocks or a mile away from that station. To move the entire organization safely down the same path at the same time requires us to build a railroad to get to our destination. That takes time and investment. We could use somebody else’s railroad, but then we won’t get to our destination, we’ll arrive at theirs. If we rush the track-building process, the result can be ugly. The whole organization derails and fails.

Transformative change necessarily involves innovation & creativity. And creativity & innovation require the ability to make mistakes and then correct and learn from them. If you’ve got all the eggs in that one basket, you’re not going to run with it. Iteration lowers risk. We can move faster. We can try different approaches or tools. If one doesn’t work we can easily regroup and the damage, if any, is limited. Instead of sending the entire organization down the same track at the same time and same speed (which by the way is too slow for some and too fast for others in our institution), we could use fleets of cars or trucks that communicate. They can follow similar paths and learn from each other. They can deviate slightly to better fit their circumstance. And, if one fails, the others survive and thrive.

The great inventor and innovator Charles Kettering knew that iteration, repeated attempts, was the key to innovation and creativity. He knew that to minimize risk we need to learn from each attempt so that the next attempt has a greater chance of success.  He said:

We often say that the biggest job we have is to teach a newly hired employee how to fail intelligently.

Unfortunately, “failure” is too often a four-letter word in higher education. Again it is ironic. Our researchers know that learning from failure in the labs is how science advances, but we don’t apply the concept to our own institutions.

A word on “pilots” is appropriate here. Doing a “pilot” of some initiative is not iterating. Pilot projects can be useful. They can be great laboratories to learn and explore. But too often they are treated as “let’s do a small version of the concept first and when we show some evidence that it works, we’ll turn around and ‘roll it out’ to the whole organization”. Pilots have many issues. One is that the real purpose often isn’t to learn or explore – we’re convinced with unquestioned brilliance that it’s the right or best solution, so what’s to learn? Instead the real purpose is often simply to demonstrate that it will work. We’re trying to demonstrate that our idea is as brilliant as we think. Never mind that we’re likely to fall prey to confirmation bias in our judgement that it works. The pilot cannot be allowed to fail. So additional resources, energy, and attention are given to the pilot – resources that cannot be scaled or applied to the larger organization. Further, doing a pilot really is just a slight postponement of the big bang. Iteration doesn’t work that way.

Let’s consider an example that many institutions are pursuing today. In the name of increasing student completion rates, the assumption is that students have too many options and choices in the curriculum. The favoured idea is that if we reduce the choices and provide a clearer pathway to a degree that has no options for getting “off-track”, then students will arrive at the destination (degree) in larger numbers. I don’t want to discuss the relative merits of the concept here – that’s a topic for other future posts. What we want to consider is how an institution goes about changing all of its degree plans of study to conform with the pathway idea. A common approach is to do a pilot first. Pick one or maybe two degree programs and redesign them. Often the pilot programs are either smaller, already more coherent, or just staffed by true believers. The pilot is deemed a success because the degree pathways have been redesigned. Note that success has already been redefined. Success is now actually doing the idea, not achieving the transformative goal the idea is supposed to accomplish. The pilot succeeded because we have proof of a redesigned plan of study, not because we have proof of higher completion rates. With the success of the pilot, the institution’s leadership is eager to “roll out” the innovation to all programs and all degrees. Phase II begins with a massive simultaneous effort to redesign all degree pathways. It takes time. Conflicts arise. Questions and issues not seen in the pilot arise. It seems each program has its own problem fitting the template. That’s not iteration. That’s big bang Grand Initiative.

Iteration approaches the challenge differently. Iteration appears more like a series of “pilots”. It might start with a pilot too, but the purpose of the first pilot is different. It’s to learn and to see what’s involved in doing this. A second pilot can be rapidly dispatched with some modification based on lessons of the first pilot. Then a third and fourth “pilot” might be started in parallel. Rather than doing all programs at the same time, iteration calls for repeatedly taking each program individually but learning the whole way. Early efforts are used to create tools to accelerate and simplify the next program. Rather than emphasizing all programs doing the same thing simultaneously, each program is urged to make it fit to their needs, pass along lessons to the next program, and complete the change rapidly.

That should bring us to the third paradox, but this post is already getting quite long. So in the interest of iteration, we’ll stop here and pick up the story with paradoxes 3-5 in another post soon. I hope to get the next post up this weekend.

 

 

 

 

The Leadership Traps that Stop Transformative Change in Higher Ed

Note: This is the first of two posts that summarize the presentation Sue-Anne Sweeney and I made at the Higher Learning Commission annual conference 2016 in Chicago in April 2016.  The slides for the whole presentation are available at the original post Iterating Toward Disruption: The Paradox of Becoming Agile (HLC 2016).  The thoughts are our own based upon our research and our years of change management and consulting experience in both higher ed and many other organizations.  They do not necessarily reflect those of our respective institutions. 

The message is clear. Higher education needs to change. We, the leaders of higher education – and although I’m only a teaching Professor, I consider myself one of those leaders – are constantly getting the message. Sometimes it’s delivered from outside by those politicians, venture capitalists, entrepreneurs, philanthropists, and self-appointed thought leaders who say we need to be “disrupted” or “unbundled” or just “fixed”. More often the message is from inside – the realization that student loan debt is sky-rocketing, costs are difficult to control, funding sources are drying up, grades are inflating, and enrollment is declining. While the nature of the needed changes may not always be clear, one thing is clear: We need to change. And not just small changes or small additions. We need large-scale, transformative change. We need to change everything.

Virtually all colleges and universities are in the midst of major change initiatives and have been for many years. Indeed, all the accreditation pathways of the HLC now focus entirely on Continuous Quality Improvement efforts of some kind – in other words accreditation depends on successfully changing.  Yet it often seems we’re running in place. Lots of action. Lots of effort. Not necessarily a lot of successful, positive change to show for it.


Change can be frustrating and challenging.

For higher ed leaders, trying to trigger transformative change in their institutions is often challenging and frustrating.  Hanging out in the halls at any conference with higher ed leaders, one will often hear the repeated refrain that “people don’t like change”, that “they” (whoever “they” are – I suspect it’s most often faculty) are just too resistant.  Caught between a “resistive” organization that seems stuck in place and the increasing pain and complaints posed by finances, accreditors, regulators, and boards, we leaders often find ourselves in a stressful, discouraging position.

In this presentation, we want to take a closer look at how to lead change in higher education organizations.  We apply to higher ed the lessons learned from a variety of organizations and from the research on change management of the last 20-40 years.  Yes, most of the research and ideas we bring originated in “business”, that part of society that higher ed sometimes views with suspicion. However, the ideas have been applied, demonstrated, and proven in many non-business sectors including government, non-profits, NGO’s, healthcare, volunteer open source software communities, and even education.

We split our comments into two parts. In the first part, which I will summarize in this blog post, we diagnose the change challenge as lying within ourselves. The reason our institutions fail to complete the transformative changes we desire lies as much within ourselves as leaders as it does within the organization. Simply put, we get caught in the traps of leadership.

Why? Why Is Transformative Change So Difficult?

Leadership. It sounds so powerful, so romantic. The word conjures images of heroes saving the day with some brilliant, daring, or inspiring action. Or perhaps we imagine generals leading the troops to victory against the odds through their brilliance and strength of character. Yet our very expectations of ourselves as “leaders” set the trap of our own undoing.

Trap #1: Unquestioned Brilliance

This is the biggest leadership trap. It is the phenomenon former Penn State and U. Washington management professor John Austin describes in his book of the same name, Unquestioned Brilliance. It is the combination of urgency and expectations that feeds on confirmation bias. We charge off to lead the organization, sometimes with its blind support, fully confident of the brilliance of our solutions and diagnoses.

As academics we are often particularly vulnerable to the unquestioned brilliance trap.  The trap depends on the expectation, both our own and that of our people, that we are very smart people – that we know the answers.  Since nobody rises to any level of success or prominence in higher education without having demonstrated that they are indeed very smart, it’s a natural for us.  Nobody rises to increasing leadership roles in higher ed, be it in administration or in faculty, by adopting a constant persona of “I don’t know…” throughout their career.  Yet this skill that gets us recognized as leaders is often our very own undoing.  We start to believe the press releases in our heads.

How does it work?  A typical cycle goes like this. It’s the cognitive path of least resistance.

  1. We feel an urgent need for a solution.
    — People look to us for an answer partly because they think that’s our job, partly because they don’t know, and we’re eager to show why we’re the leader by providing the answer.
  2. We assert a solution based on our limited experience or on other authorities. Often these other authorities might be the funders/foundations or speakers at some conference we attended.
    — People expect us to know, and we’re eager to prove why we’re the smart leader by giving them a solution: “Here, this one looks good!”
  3. We seek supporting evidence, and confirmation bias sets in.
    — We become more convinced of the rightness and brilliance of our proposals/ideas/initiatives because we collect confirming evidence. But we discount and ignore critical contrary information and data that might cause us to re-think or adjust our approach. In the interest of sounding positive and providing what we believe to be “leadership and direction” the meetings and efforts of the organization become exclusively focused on implementing this solution.  
  4. We see the organization as split into supporters and resisters.
    — Some people are eagerly following the leader’s direction and pushing for action in implementing this solution. But others, those who question the solution and want to look closer at the “problem” are marginalized and discounted as anti-change resisters. We don’t hear them.  We think we know what they’re feeling, but we’re wrong.
  5. Things stall or go awry.
    — Our solution peg turns out to have square corners and our institution’s hole has round corners. We either jam the solution in and damage either the solution or the organization, or we eventually abandon the attempt. That brings a new perception of a crisis begging for a solution.
    Rinse and Repeat. 

Trap #2: Urgency

We feel the need for change so powerfully that we feel we have to hurry. We, college leaders, are being bombarded with messages of how “unsustainable” our current models are, of how technology is going to “disrupt” our “industry”. We see enrollment declining. We see the debts being piled up by students. We see students taking entirely too long to complete. We feel either the institution or we personally are running out of time. We feel the pain.  We feel the tentativeness of our current situation. It feels as if we are running on a cracked glass floor. We don’t know when or how soon it might all collapse. So we run.

We are again affected by our own expectations of what it means to be “leader”. This is particularly true in our Western and uniquely American culture. We believe leaders do things. Nobody ever imagines a leader in our culture as somebody who, in response to a crisis, says “hmm. Interesting. Let me think about this awhile and get back to you.”  No, leaders take action and they do it in a hurry. And being good leaders we want to take dramatic action. In particular we want to be seen as taking action. Being able to report that all our plans of study have been revised takes precedence over making sure that the revisions are really well thought out. After a while, organizations tune out. They come to see such crisis-du-jour, action-oriented leadership for what it often is: theater.

Fundamentals have to be in place.  Doing it wrong, using an ineffective process, isn’t really any “faster”.  It’s just the appearance of action without effective or positive change. It’s more theater.

In sports, where one might expect competitive dynamics to foster urgency, one sees the opposite amongst champions. The winningest college basketball coach, John Wooden (and countless Continuous Quality Improvement coaches), asked “If you don’t have time to do it right, when will you have time to do it over?” In my favorite sport, auto racing, two-time Formula One World Champion Graham Hill was quoted as saying the objective was to win the race at the slowest possible speed. His observation aligns with one of the oldest aphorisms in racing: there are old drivers and bold drivers, but no old, bold drivers. When urgency is allowed to override proper process and practice, the outcome is usually not pretty.

Trap #3: Vision Delusion


The seriousness of the need for change and the urgency of that change cause us to describe, define, and envision larger, more dramatic change. We don’t want to tweak things and polish the edges. We want TRANSFORMATIVE change. It also appeals to our egos to be the author of a grand solution. Again, our expectations of what it is to be the “leader” come into play. We expect leaders to provide grand, inspiring vision.

Don’t get us wrong. Vision is a good thing – but only in context and when grounded. A grand vision is often necessary. If the degree of change needed is great, then the vision also needs to offer corresponding grandness. Nobody wants to plunge forward tearing up things without an idea of what or how to replace them with something better. The problem is when we focus only on the grand vision.  A grand vision of how the major parts fit together is useful. It’s necessary. The big picture is needed. It provides clarity of purpose. It also helps motivate. But the 30,000 foot level “future picture” looks like this:

(presentation slide: the 30,000-foot “future picture”)

The reality on the ground is different and more complex. The big picture abstracts and glosses over critical inter-relationships, parts, processes, and details.  The reality of the organization is more like this:

(presentation slide: the on-the-ground reality, with many more parts and inter-relationships)

The larger the change, the bigger the transformation, the more foreign and overwhelming it is to the organization. There are more inter-related parts or details that aren’t understood. And uncertainty can lead to fear. And fear can lead to paralysis or even resistance. It’s likely not that the people in your institution are resisting your grand vision so much as that the uncertainty and incompleteness of your grand vision engender fear and concern.

Trap #4: Technical Solutionism

In order to flesh out the large-scale vision of how the organization will be different, we feel we need to “engineer” the details, the parts.  Working out project workplans becomes a technical exercise in designing the new. We feel like we have more control. We are designing the future. The problem of change seems to be one of implementing some kind of solution – a “technical solution”.  Technical solutions are not necessarily IT data systems, although they might be. If we think the solution(s) to our problems lie primarily in adopting/implementing a new data system or tool, in creating a new policy, or in making some structural change to the organization, we are likely pursuing a technical solution. Technical solutions are the things that we already know how to do. We apply those solutions when there is disequilibrium (Heifetz and Linsky’s term, more on them below) in the system. For those of us in education, we would call those solutions best practices.

In higher ed, technical solutions abound. Attend any conference and walk the show floor. An army of vendors will gladly tell you how this data system will transform your institution into a paragon of assessment or how student retention and completion is only a different data system away. All technical solutions. Now move away from the show floor and attend the sessions, especially the keynotes. Consultants and other higher ed leaders explain exactly how what they are doing, or what they want you to do, is the way. It’s a best practice. Best practices. It sounds like the solution. Who wouldn’t want to adopt the best practice? I mean, we don’t seriously want to adopt worst practices, do we?  Of course not. But the language deceives. No practice stands alone. Context and the institution matter. No two institutional contexts or exigencies are the same. But it’s a trap. We think all we have to do is follow and copy what they did. Adopt the best practice. But what worked for them might not (actually likely won’t) work the same way for our institution. We aren’t them. The best practice ignores all those detailed interrelationships we talked about in the vision delusion trap. We could try to learn from the best practices – identify what nugget of new insight is there and adapt it to our institution and our students. But typically we don’t. The best practice itself becomes the magic bullet – a technical solution. We ignore the evidence that the situation at that other place might not be the same as ours (that confirmation bias in unquestioned brilliance again). We’re in a hurry (urgency) and this solution seems to provide the promise (the grand vision).

Ronald Heifetz (and co-author Marty Linsky) have written about and researched this problem extensively over the past 20 years. Their work has led to the concept of adaptive leadership. It’s based on understanding the difference between challenges or needed changes that can be solved with a technical solution and the adaptive challenges that face most organizations today. Heifetz and Linsky ask:

  • Is this a problem that an expert can fix? Or is this a problem that requires people …to change their values, behavior, or attitudes? If an expert can fix it, then it’s a technical problem. If it requires people to change, then it’s an adaptive challenge.
  • Do people have to learn new ways of doing business? If so, then it’s an adaptive challenge.

Why does it matter?  Because how we go about successfully inducing change – the role of the leader – is different in each context.

“Indeed, the single most common source of leadership failure … is that people, especially those in positions of authority, treat adaptive challenges like technical problems.” – Ronald Heifetz

The problems we face in higher education, whether you think of retention, completion, finances, new technology, or whatever, are generally adaptive challenges. We have to jointly, collaboratively develop solutions with all stakeholders in a way that produces adaptive learning.  No matter how much Silicon Valley or the philanthropic foundations want it, there are no magic-bullet technical solutions or policies that will get us out of our challenges.

Trap #5: “They just…”

Pushing a technical solution on an institution facing an adaptive challenge is a recipe for what appears to us to be resistance. The people in the organization itself, the staff, the faculty, and perhaps even the students, don’t seem to perceive the brilliance of our technical solution and grand vision. They just …. don’t get it. They just … don’t want to change.  They just…are adversarial. They just…don’t care.

That’s what we tell ourselves when the initiatives bog down. We find ourselves frustrated and saying “they just…..” They just don’t want to cooperate. They just don’t get it. They just don’t want change. That’s an indication that we don’t understand the root cause of the change problem. It’s easier to fall back on a stereotype than to do the hard investigative work of root cause analysis. It’s true that change takes an extra effort. The trick is finding out why people aren’t willing or able to invest the extra energy to make the change happen.

People fear what they don’t understand. Remember all those detailed relationships we glossed over with the Grand Vision? Those details, which seem like “stuff down in the weeds” to those in the corner C-suite or Dean’s office, are reality for the people of the organization.  It’s necessary to get into the weeds. If you don’t, you’ll never know where the poisonous snakes are or whether you’re actually standing in a swamp. But your people know. They feel the fear and the distress.  And the additional effort.  It takes a lot of energy to adopt new changes and processes – at least until the new becomes old habit.  Example: when they ask you to change your password, don’t you keep re-typing the old one for a while?

To us, the leaders with the urgent Grand Vision and our beautiful technical solution (it must be beautiful, it looks so good in the brochure!), we think we’re the Agents of Change. We too easily discount what our people say (after all, we’re unquestionably brilliant!) and assume they are simply recalcitrant or slow. Either way, we push harder. Big mistake. We can easily appear to be the Angel of Death instead.

To overcome the apparent resistance, we first need to back off our own assumption that it is resistance. We need to understand why people aren’t doing what we want right away. To do that, we need to respect them, listen, and ask questions. We need to borrow a practice from Continuous Quality Improvement: ask the 5 Whys.  Why are they saying or doing that? And why again, and again. We need to keep investigating what is happening, asking “why”, until we understand the underlying root cause of the reluctance to get on board.

 

 

Trap #6: Telling, Not Listening

We all acknowledge that communication is critical to successful change. But what do we really mean or do?  As leaders we too often feel that people look to us for direction.  We feel like we have to “have something to say, have a strategy, have an answer, have a solution”. So communication too often becomes one of the last tasks in the change initiative. Indeed, how can we do the communication until we know what we need to tell them?  Communication with stakeholders then comes as part of implementation, which is seen as a separate activity that follows planning and design.

It’s communications, right? We’re supposed to be the knowledgeable leader with the answers, right? So we wait till late in the project to begin to “communicate” TO the faculty, TO the staff, TO the students, what the changes are. But sending a message out isn’t listening. We miss the listening half of communication.  By not listening, we reinforce our confirmation bias. By not listening, we miss the details and critical relationships that could put the Grand Vision in context. By not listening, we miss the opportunity to engage in the adaptive leadership needed to change habits, beliefs, processes, and norms. 

We want to reassure people. We want to have the whole solution in a nice package. And that’s part of the reason why we wait so late to communicate the change. It also helps, sometimes, that it can feed our own need, and theirs, for us to appear like the leader who’s got the answer, who’s in charge. The leader who is unquestionably brilliant.

But it’s all a trap. It’s theater.  It won’t really produce the transformative change we wanted and needed.  Now if you’re suitably discouraged or depressed, take heart.  In the next post we propose some ways out of the traps. They may seem paradoxical at first, but they work.