Data and strategy revisited: three essential frameworks


In a recent article I identified the yawning gap between business strategy and data analysis, as both are commonly practiced. I argued that business strategy — despite its roots in research — is often applied in the manner of folklore, and that data analysis is often aimlessly concrete, creating a culture of repetitive reporting without any clear purpose. I concluded that no effective, workable synthesis of these disciplines exists in real practice inside the typical business, whatever the theoretical claims of each.

What, then, is the best way to bring data and strategy together — not just in a single PowerPoint or board report, but as an integrated, ongoing part of regular operations? As always, the true answer will be found by individual teams in the trenches, working hard to realize this ideal in practice for their own companies. We need a starting point, however. The starting point I’d suggest is taking some of the best business strategy frameworks and using them to identify, collect, and analyze real-world data.


Strategy frameworks have a mixed reputation, and rightly so. They are taught in business school as a method for cracking case studies. Some of the most famous frameworks are not particularly rigorous, as a recent parody on HBO’s Silicon Valley (the “Let Blaine Die?” SWOT analysis) demonstrated.

That aside, the best business frameworks provide a relatively simple shorthand for understanding complex situations. They summarize diverse and complex information in a way that an intelligent person can readily understand. They provide a starting point for breaking down problems — a starting point that most businesses lack. As I have learned from painful experience, the “strategy” of most businesses is simply mirroring their competition; or even worse, subscribing to the “Vision, Mission, Values” pseudo-thought that Richard Rumelt has labeled bad strategy.

High-quality strategy frameworks also provide a bulwark against availability bias, the tendency to overvalue information that is readily at hand and that fits our preconceptions. Availability bias makes companies overvalue their own enterprise data and assume that it will necessarily yield up great competitive or operational insights. A good framework, in contrast, directs attention outward towards the customer and the competitive environment. It prevents a company from confirming and reconfirming its own biases and ignoring contradictory evidence.


I collect frameworks like some people collect knick-knacks. All have value, all have hazards, but in my mind three stand out for being comprehensive, logical, and descriptive (rather than prescriptive or proscriptive). These are:

  • Alexander Osterwalder’s Business Model Canvas (BMC), which nicely illustrates how a firm creates value for its customers
  • Michael Porter’s Five Forces (FF), which directs attention to competition and the external environment
  • Richard Rumelt’s Strategic Kernel (SK), which forces a business to define its core challenge and its approach for meeting that challenge

There are areas of overlap and redundancy between these frameworks — putting them together does not yield a ready-made set of measures and metrics for charting the course of a business. But I believe that together, they define the correct areas of inquiry, which is what frameworks are supposed to do.
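To make “areas of inquiry” concrete, one could map each framework’s components to candidate data sources. The sketch below is a hypothetical illustration, not part of any framework’s canon: the component lists paraphrase the frameworks, and the data sources are my own assumptions.

```python
# Hypothetical mapping of strategy-framework components to candidate
# data sources. Components paraphrase each framework; the data sources
# listed are illustrative assumptions, not prescriptions.
FRAMEWORKS = {
    "Business Model Canvas": {
        "Customer Segments": ["CRM records", "market surveys"],
        "Value Propositions": ["win/loss interviews", "NPS verbatims"],
        "Revenue Streams": ["billing data", "pricing experiments"],
    },
    "Five Forces": {
        "Rivalry": ["competitor pricing scrapes", "market-share reports"],
        "Buyer Power": ["customer concentration analysis"],
        "Threat of Substitutes": ["adjacent-category spend data"],
    },
    "Strategic Kernel": {
        "Diagnosis": ["all of the above"],
        "Guiding Policy": [],
        "Coherent Actions": ["project portfolio and per-action KPIs"],
    },
}

def uninstrumented(frameworks):
    """Return the framework components with no data source identified yet."""
    return [
        (name, component)
        for name, components in frameworks.items()
        for component, sources in components.items()
        if not sources
    ]

print(uninstrumented(FRAMEWORKS))  # → [('Strategic Kernel', 'Guiding Policy')]
```

The point of such an inventory is exactly the discipline described above: it surfaces where a company is collecting data by habit and where a genuine area of inquiry has no data behind it at all.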


Data and strategy: art of the frame


In this data-crazed era, businesspeople are prone to assume that data and strategy are closely related — or in some extreme interpretations, even synonymous. Vendors, technologists and executives all seem equally convinced that data is key to realizing strategic advantage, and that by immersing themselves in data, companies will improve across the board in efficiency, productivity and competitiveness. The existence of young, flourishing, data-driven companies like Google and Facebook helps strengthen this impression.

In practice, however, data and strategy are not easy and intuitive to reconcile. The discipline of business data analysis has developed in IT, finance, and other internal departments where it is most often a form of cyclical reporting. Business strategy has grown up separately as an academic and somewhat abstract branch of knowledge. Generally speaking the business executives and consultants most likely to drive strategy work have been situated far from the data. A true reconciliation of these disciplines has never really taken place.

Business strategy as we understand it today came to fruition in the 60s, 70s, and 80s, a history richly chronicled by Walter Kiechel in his book The Lords of Strategy. The original groundbreaking strategy work of people like Bruce Henderson and Michael Porter was fairly empirical and grounded in primary research. But these strategy pioneers were also marketers and salespeople for their own firms and consulting practices. As their theories and methods spread through the business world, they drifted further and further from their empirical roots. Business strategy became more like a folkloric practice based on high-level models, such as the BCG Matrix or Porter’s Five Forces. These models, while intellectually interesting and often useful, were not rigorously tested or supported with data as they were applied to the situations of individual companies.

The gap between these disciplines no longer makes practical or economic sense. In an age of proliferating data and cheap processing power, it should be possible to bring data analysis and business strategy into much closer alignment. We have the data, we have the strategic literature. What we don’t have is practitioners committed to working across these disciplines in a careful and thoughtful way.

I don’t believe that big data — or even traditional enterprise data — will yield up its potential value until it is wedded to strategy. Data most certainly does not speak for itself. Without context, it is meaningless; and without an interpretive framework, it cannot drive decision-making. As the history of departmental BI shows, without strategy data analysis tends to devolve into a bland form of non-financial accounting, generating lots of reports but not much insight or action. Data analysis performed without strategy is the reason that so many organizations complain, even today, that “We have all this data, but we don’t know what to do with it.”

A rapprochement of data and strategy, in contrast, would raise the bar for both disciplines. The use of a strategic frame would force us to think about how we acquire, interpret, and apply data, and free us from the bureaucratic disease of report proliferation. Conversely, the use of data would make strategy more empirical and impose greater rigor on a field that often hovers at the level of folk practice.

In my next article, I’ll explore the practical application of data to an older framework (Porter’s Five Forces) and to a newer one (the Business Model Canvas).

The platform era—or, revenge of the enterprise architect

The platform era, which has begun, will ultimately be one of shockingly total domination by shockingly few players.

The dominant platform player achieves massive returns to scale. Its platform grinds competitors, middlemen and other intermediaries to dust, as the 90s word “disintermediation” suggested. The term wasn’t wrong; it just failed to acknowledge that in many cases one supreme intermediary would be left standing. The control of data and processing assets is turning out to be the late-modern equivalent of control of distribution. It creates a position of great, seemingly unassailable, and relaxed power. The “relaxed” part is important — because a power that is relaxed and not tense can take its time, pick its targets, and generally do as it pleases.

There are already dominant platform players operating in broad daylight, and they are well-known: Amazon for retail, Google for search, Apple for mobile platforms and applications (and Google successfully imitating them with Android). As HBR correctly summarized a couple of years ago, the strategy of such platform players is to play “matchmaker” between a group of producers and a group of consumers. But a truly dominant platform player goes well beyond this. While it may cloak its intentions in friendly (especially consumer-friendly) rhetoric, in reality it creates a broad and steadily growing ecosystem with the express purpose of controlling and exploiting it. The platform player achieves this objective through ambition, stepwise strategy, and purposeful design.

Platform dominance starts with a broad and ambitious goal, such as Bezos’ plan to create an Everything Store or Page and Brin’s mission to “organize the world’s information and make it universally accessible and useful.” The platform player’s first concrete step is to dominate a relatively limited niche (say, books); the subsequent steps are to leverage advantages created in the first niche to take new territory (clothes, electronics, toasters, watches), and then use those new learnings to launch into newer territory altogether (Internet infrastructure). No move is random, though it might appear so from a distance.

The details of platform design may be complex, but the axioms are simple. Create a service-oriented, componentized architecture that is highly scalable and highly modular, with strict separation of concerns. Build numerous APIs that can be made available to internal and external clients whenever tactics dictate. Make the analysis, understanding, and applied use of collected data a core part of the business strategy from day one.
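Those axioms can be sketched in a few lines. Everything below is a toy illustration with entirely hypothetical names: one bounded component behind a narrow API, with event capture wired in from the start.

```python
# Toy sketch of the platform axioms: a modular component behind a narrow
# API, with data capture built in from day one. All names here are
# hypothetical illustrations, not any real platform's design.
from collections import defaultdict

class EventLog:
    """Records every API call for later analysis: data as a first-class
    concern, not an afterthought."""
    def __init__(self):
        self.counts = defaultdict(int)

    def record(self, event):
        self.counts[event] += 1

class CatalogService:
    """One bounded component. It exposes a small API and knows nothing
    about other components (strict separation of concerns)."""
    def __init__(self, log):
        self._items = {}
        self._log = log

    def add(self, sku, price):
        self._log.record("catalog.add")
        self._items[sku] = price

    def get(self, sku):
        self._log.record("catalog.get")
        return self._items.get(sku)

log = EventLog()
catalog = CatalogService(log)
catalog.add("book-001", 12.99)
catalog.get("book-001")
print(dict(log.counts))  # {'catalog.add': 1, 'catalog.get': 1}
```

Note that the event log, not the catalog, is the strategic asset here: the same API that serves a customer also feeds the analysis layer, which is the whole point of the third axiom.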

What must a company have to achieve these types of platform objectives? First, as we’ve said, ambition. The end objective must be big enough to make the incremental steps individually worth the effort. Second, a dearth of things to lose. Disruptive platform players have little or nothing to defend, which makes them fearless and able to draw powerful conclusions that incumbents are afraid to see. Third though, and this is perhaps most important, a platform player must be led by engineering talent.

A common conceit of the dot-com era was that wild-eyed visionary kids would found a company, and then “mature adults” would be brought in to run it. How has that worked out in practice? Consider a couple of platform players where the alleged adults were brought in (AOL, Yahoo), and contrast them to a couple where the visionaries are still in charge (Amazon, Google). Who is doing better? Who is still growing and expanding? Who is moving effectively into adjacent and even non-adjacent industries?

It would seem that adulthood is overrated. Or that perhaps the entire conceit is incorrect. It is not a matter of adult vs. child; it is a matter of engineer vs. business manager. The latter increasingly looks like a relic of late 20th-century capitalism, which is now well over a decade in the rearview mirror. The 20th-century business manager suffers today from having too many things to unlearn. For a great illustration of this, read the 2006 book Enterprise Architecture as Strategy, by Ross, Weill, and Robertson. The book is an extended argument made to mature executives about how they need to rethink their businesses, build flexible and enabling technologies, exploit the value of data, and so on. The book also explains the complex journey from their current state to this end state, including how to bring individual business units along for the ride, placate executives who prioritize their own functional needs over those of the enterprise, and so on.

Does anyone at Google need to read this book? Would anyone at Google be able to stifle laughter at the idea of having to talk executives into understanding basic enterprise architecture? I say no and no. That is why I say that the platform era is the revenge of the enterprise architect. The principles the architect has tried to preach to traditional companies are not only correct; they are the entire foundation of the future of business.

You may feel I’m overstating the potential of platform players or their ability to expand their spheres of influence. Very well then. Tell me, what are the obstacles to Apple extending Apple Pay and becoming a full-service bank? To Google leveraging its user data and becoming a highly informed insurance company? To Amazon using its logistics knowledge and becoming a manufacturer? I would say habit, convention, and certain temporary advantages of incumbents, including legal underpinnings. But all of these obstacles can be overcome with time, including the legal ones. And the war chests that platform players are accumulating provide all the resources needed for these future battles. I could close with a comment about how you need to start listening to your enterprise architects, but if you really need to hear that now, it is quite likely too late.

Your industry is not unique

Management consultants are, on the whole, reverent towards their clients. I have never met any who acted anything like Marty Kaan from House of Lies. If anything, consultants tend to be a bit too austere and serious, and develop a deep awe for their patron-benefactors. Over time they often begin looking, acting, and dressing like them. They reach epic heights of sympathetic identification, and in the most extreme cases, develop a type of client Stockholm Syndrome that we in our own industry call “going native.”



There is, however, one client belief that will make consultants laugh one hundred times out of one hundred, even if silently. This is every client’s conviction that “our industry is unique.” This comment is usually offered by a 25-year veteran of the client’s industry who has never worked anywhere else. While consultants will go to great lengths to find ways to agree with their clients, this is one issue on which they can rarely manage anything better than stifled non-assent. Consultants spend their own careers mostly moving across industries, where they see patterns, repetition, and predictable dynamics — hearing all the while, of course, about how unique these dynamics are.

Industry uniqueness bias is a genuinely bad and damaging thing. First, it shuts a company off from significant sources of talent and knowledge, most notably all the valuable people with relevant skills who happened to develop elsewhere. Second, it shuts off the process of learning. The people who grew up in an industry share the same formation, perspectives, and cognitive biases. Over time they mirror each other to an extreme degree. Organizations composed exclusively of them tend to stagnate, as their wellsprings of curiosity and innovation dry up.

The ability to flourish in a job requires traits like intelligence, charisma, persistence, steadiness, functional knowledge, and industry knowledge. Put otherwise, industry knowledge is one of myriad factors in professional job success. When companies focus on industry experience and make it the sine qua non of hiring decisions, they potentially under-optimize for all other relevant qualities in their team members. The same goes for receptiveness to outside ideas. When companies believe they can only learn from watching their peers, they and those peers begin resembling each other to an astonishing degree, and find themselves locked in an undifferentiated race to the bottom in margin and profit.

In an era of innovation and disruption, it is a grave mistake not to regularly draw talent and ideas from other industries. Industry boundaries are being drawn and redrawn almost daily today. Business models and technologies are being imported from one industry to another with startling frequency. BMW’s iDrive technology, for example, came from the gaming industry. Throughout the 90s Commerce Bancorp won depositors by running its branches like retail stores. The TRIZ method, a systematic method for innovation derived by studying patents, catalogs countless examples of cross-industry borrowing and breakthroughs.

Steady-state operation is of course an important part of any business, and it is absolutely acceptable to learn from one’s peers. The important thing is to do this learning consciously and not let it degenerate into mindless imitation. Too often, it does. Regularly importing talent and knowledge from outside your industry is the best inoculation.

Are we being digital yet?

Being “digital” in 2016 means any number of things, ranging from the utterly real to the utterly unreal. There are on the one hand genuinely transformative things that are happening in our economy due to the expansion of mobile and cloud technology. Existing business models are indeed threatened by these developments, and true opportunities to benefit really do abound.


There are on the other hand digital cultural gestures and signifiers that have little or nothing to do with how business is really being transacted. You can let your employees wear jeans, put whiteboards all over corporate HQ, cover them with post-its, and fundamentally change nothing. The flexibility in defining digital allows the real and the unreal to coexist, and also enables a continuous war of one-upmanship waged by those who claim to “get it” against those who allegedly don’t.

Take away the relatively new realities of smartphones, tablets and the cloud, and those remaining cultural signifiers turn out not to be very original at all. They are simply the same old gestures of youth: informality, anti-authoritarianism, and the conviction that one’s views are novel and unprecedented in the history of the world. Such beliefs are always just true enough to get and keep traction. Mobile technology really is different than anything before it. As for open workspaces, modular furniture, and so on, those are simply the kinds of things that young people like. They have always liked them and they always will. They fit the cognitive style of youth and reflect its preference for openness and conceptual novelty.

Stylistic change always accompanies substantive change. I learned this firsthand during the dot-com era fifteen short years ago. The period was indeed transformative. The “web” – the term makes us chuckle now – changed our ideas about commerce, cultural production and social relations. Nothing could be more real. But a lot of dot-com style turned out to be empty symbolism. Ironically, many elements of that style were the same we see today with digital — jeans and whiteboards most notably. Others, like the Herman Miller chair, have either been absorbed into the mainstream or forgotten.

While dot-com fashion was mostly harmless, the era’s version of the “who gets it” game was positively toxic. Saying that some company or executive “got it” was the highest and vaguest compliment. No one ever pressed hard enough on the phrase to figure out what it might actually mean. As it turned out, it meant nothing. Or, everything. It was infinitely flexible and hence meaningless. Yet projects and even companies were conceived and funded based on this vapid distinction, this sludgy, subjective idea. With no one questioning the concept of “getting it” or the resulting business models, billions in capital were wasted. The Silicon Valley glorification of failure aside, a society never gets back that kind of wasted capital. The associated effort and brainpower are gone forever.

I recently read Jeff Sutherland’s book on Scrum. Scrum was one of the first Agile methodologies and Sutherland was one of its originators in the early 1990s. The method is one of the cornerstones of digital business culture. The book was brilliant and the method is too. Scrum is based on the idea that we should design teams for how work really gets done. Its track record for building physical and software products is exceptional. When companies tell me they are “looking into” Scrum or another Agile method, my only response is, why? Why deprive yourself of improved productivity and better products? Why stick to obsolete methods like waterfall development that you know are ineffective and unproductive? If it is because you secretly resent the culture of whiteboards and post-its and know-it-all kids – shame on you for confusing style and substance.

To be clear, I do not even think that the style of “digital” is malign. As a committed follower of Warhol I reject simplistic contrasts between surface and depth. As neither a moralist nor a bore, I have no desire to complain about “those digital kids today” or tell them to get out of my yard, or company. But I do study the interplay between style and substance closely to make sure I’m grasping and understanding the bits that actually matter. A lot of digital style, like all fashion, is really just about youthful identity. There are those who are unconsciously caught up in it, those who worry they’re past it, and those who feel definitively past it and thus resent it. Those types of emotions should not cloud your business thinking. Missing an economic revolution because you dislike changing fashion is a grave mistake.

Digital vs. clerical IT

What is digital? I believe the best way to answer the question is to identify and analyze the opposite of digital, which I call “clerical IT.”


If this exercise seems circuitous, consider that most attempts to define digital are not that illuminating. If you look at Deloitte Digital’s promotional video, for example, you might conclude that digital means hip office workers congregating around whiteboards and 3D printers. While I’ve already argued that there’s nothing wrong with digital style, I believe that understanding digital this way reduces it to a fashion statement and encourages either shallow adoption or shallow rejection of some genuinely important changes in business culture. The importance of digital becomes much clearer when we consider the nature of the clerical IT that it is replacing.

Clerical IT is characterized by the idea that “IT supports the business.” It begins with the unstated and incorrect presumption that “the business” exists externally and prior to any technology. The clerical IT mentality is rooted in the history of IT as data processing, a heritage we have not yet completely overcome. In this schema, IT equals the automation of back-office functions. It is “clerical” in that its primary purpose is to automate the basic transactions of office work: filing, sorting, archiving, and so on. Even when we propose performing such activities on a mobile device (“Let’s have salespeople enter their call data on their iPads!”), we are still trapped in a clerical IT mentality.

The markers of a clerical IT culture are heavy governance processes and a generalized atmosphere of distrust between “IT” and “the business” — as if that distinction made any sense. The dominant mode of development is waterfall or, increasingly, “Wagile,” which is another way of saying that enlightened people are pushing for Agile but fighting the ingrained culture of the organization. Development is based on continual negotiation of requirements and a contractual mindset. Paranoid financial controls may be in place, making it difficult to operate, but in such an environment they are actually rational, given that they are the only bulwark against runaway spending.

Perhaps the clearest way to understand clerical IT is through how it addresses the classic triangle of “people, process, technology.” In clerical IT the modus operandi is to talk to the people, ask them their process, then automate that process through technology. Note the premise that the business exists prior to and outside the technology. Note the other premise that people, in this case “the business,” can determine their future needs in an informational vacuum and clearly articulate them to developers. As anyone who has ever worked on any development project can tell you, this is simply not true.

With clerical IT defined, we are in a much better position to understand what digital is. Digital is not just the automation of back-office processing, nor a thinly disguised version of such executed on a mobile device. In digital there is no assumption that “the business” exists outside or prior to technology. Digital begins with a holistic look at all of the human and technical resources available in some bounded environment. It then attempts to solve the business problems in that environment in a new way, or eliminate them, or even redraw the boundaries of the environment altogether. A digital mindset is fundamentally continuous with an innovation mindset, which as I’ve argued elsewhere is fundamentally different than a competitive one.

Digital’s approach to people, process, and technology is fundamentally different than that of clerical IT. Digital begins with the people, meaning all impacted parties, working in equal roles in the manner of a GE-style Work-Out. These people are then poised to break apart and solve a complex problem using all the insights at their disposal. The outcome is a “people, process, technology” solution in which any one element, or no element, may dominate. The result is not simply a new way to execute clerical work.

Hegel once commented that history repeats itself. Marx then added “He forgot to mention: the first time as tragedy, the second time as farce.” I believe that in the developing histories of dot-com and digital, the traditional order of tragedy and farce has been reversed. The dot-com era was a farce, even if a somewhat impactful one. The digital era will be a tragedy for anyone who fails to grapple with it or take it seriously. Leaving behind clerical IT is the first and most important step.

Why traditional consulting is doomed

In its current form traditional management consulting, the business of providing advice for money, is doomed.


This may seem quite a bold statement. But I believe that anyone who understands the industry and takes a long unemotional look at the developing technological landscape will conclude the same thing. A couple of years ago Christensen et al.’s HBR article Consulting on the Cusp of Disruption caused quite a stir. I think that in ten years we will look back on the article and it will read like a rather tame advertisement for McKinsey Solutions.

The thesis of “Cusp” — that sometimes clients just want data and an answer instead of a heavy and complicated engagement with a consulting firm — is absolutely true. But it rather understates the pressures consulting is going to face in light of developments in artificial intelligence and big data. Some people, somewhere, have undoubtedly had this realization already and are hard at work building the consulting firm of the future. In the meantime, the rest of us in the industry owe it to ourselves to wake up.

Of course any broad prediction about a “traditional” industry being “doomed in its current form” can degenerate into a clever sort of shell game. Industries are always changing and it’s easy to be right if you don’t define all your terms clearly up front. Consulting has always been changing. In the 60s and 70s, for example, strategy firms like BCG had production departments that produced transparencies for presentations delivered via overhead projectors (hence, incidentally, the term “deck”). Nowadays, of course, we have PowerPoint, and even senior partners can crank out their own presentations when they have to. These sorts of shifts in the division of labor are going on constantly in consulting, as are growth and shrinkage in the size of the industry and of individual firms. Even new rent-a-consultant companies like SpareHire and the cringeworthily-named HourlyNerd are not truly disruptive. These brands are fundamentally parasitic; advertising that you can offer “ex-McKinsey consultants starting at $60 an hour” assumes that McKinsey exists and commands buyer respect. Such companies may nibble at the fringes of big-firm revenue, but they will not replace those firms.

What we are discussing here is not “change as usual” but a much more fundamental set of changes that will be wrought by technology. The issue of consulting’s disruption becomes clearer when you focus on what consultants actually do. A careful examination reveals that the four principal activities of consultants are all inherently disruptable. Taking them in order:

Diagnosis and problem identification, the art of determining “what is the problem,” is a machine-duplicable process that has been managed by expert systems in other fields for decades. The medical expert system Mycin beat doctors in diagnosing bacterial infections way back in the 1970s. Expert systems gained traction in numerous fields throughout the 1980s, limited only by the expense of computing resources and the high cost of engaging experts to build knowledge bases. Nowadays, computing resources are cheap and the developing world is full of experts on every topic, who can be engaged easily and remotely. It is only a matter of time before expert systems are applied to simpler, less life-critical problems like “What is the best way to structure and run my project office?”
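The mechanism behind such expert systems, forward chaining over if-then rules, is simple enough to sketch in a few lines. The rules below are invented for illustration (loosely themed on this article’s own arguments) and have nothing to do with MYCIN’s actual knowledge base.

```python
# Toy forward-chaining expert system in the spirit of 1970s-80s
# diagnostic systems. Rules and facts are invented for illustration:
# a rule fires when all its premises are among the known facts.
RULES = [
    ({"reports multiply", "no decisions change"},
     "reporting without strategy"),
    ({"projects overrun", "requirements renegotiated"},
     "contractual development mindset"),
    ({"reporting without strategy", "contractual development mindset"},
     "clerical IT culture"),
]

def diagnose(observed):
    """Repeatedly fire rules until no new conclusions can be derived."""
    facts = set(observed)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

symptoms = {"reports multiply", "no decisions change",
            "projects overrun", "requirements renegotiated"}
print("clerical IT culture" in diagnose(symptoms))  # True
```

Note how the third rule chains off the conclusions of the first two; that chaining, scaled up to thousands of rules elicited from experts, is essentially what diagnostic expert systems did decades ago, and why “what is the problem” is a machine-duplicable question.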

Data collection is traditionally done through interviews, surveys, and research. All of these activities can be replaced or at a minimum improved through the use of big data. The ability to crawl the web and create a broader, richer data set will inevitably change how consulting research is done. Having a big-data answer to a business problem will become an expectation rather than a novelty.

Analysis, the art of crunching data and using it to develop insights, will become computationally driven analytics. Of course in many cases consulting firms are using analytics already; the difference is again that this will become an expectation rather than a novelty, applied to a broader range of problems as the world’s store of easily accessible data increases.

Reporting, the task of writing and developing reports and presentations, will become increasingly automated due to the use of natural language generation software. Companies like Narrative Science are already developing sophisticated products that greatly simplify such work and create human-readable sentences and paragraphs.
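The core mechanism, turning structured numbers into readable sentences, can be illustrated in a few lines of templating. This is a toy sketch with invented data, nothing like the statistical and linguistic sophistication of commercial NLG products:

```python
# Bare-bones natural language generation: verbalize a metric change.
# The metric names and values are invented for illustration.
def verbalize(metric, current, prior):
    """Render a percentage change as a plain-English sentence."""
    change = (current - prior) / prior * 100
    direction = "rose" if change > 0 else "fell"
    return f"{metric} {direction} {abs(change):.1f}% versus the prior period."

print(verbalize("Churn", 4.6, 5.0))      # Churn fell 8.0% versus the prior period.
print(verbalize("Revenue", 110.0, 100.0))  # Revenue rose 10.0% versus the prior period.
```

Real products layer angle selection, narrative structure, and variation on top of this kind of templating, but the economics are the same: once written, the generator produces the thousandth report at the same cost as the first.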

Of course the common objection to all such arguments is “Machines could never do those tasks as well as people.” This objection, even if true (that remains to be seen), is irrelevant. Even if machines “can’t replace human beings” in consulting work, they can certainly augment human capabilities and undermine the current division of labor in an unprecedented manner. Under new technological conditions the traditional pyramid structure of most firms and the economics of the billable hour might no longer make defensible business sense. The management consulting firm of the future might consist of a handful of skilled experts and data practitioners, providing statistically informed recommendations in highly specialized areas of expertise.

As routine work moved offshore during the 1990s and 2000s, many firms worked to become “more strategic” and instead solve higher-level problems for client sponsors. In the consulting world of the near future, this might turn out to be exactly the wrong approach. Implementation work may turn out to be far more durable, since it involves complicated day-to-day interaction with client personnel on real projects. Conversely, strategy work may end up becoming the most disruptable consulting of all, insofar as it involves providing speculative recommendations based on research and experience, which is ideal work for expert systems. Indeed, business school case-cracking is the exact sort of thing that expert systems excel at, insofar as it involves abstract reasoning and the application of knowledge from similar cases. That is one reason that prestigious firms have always been able to deliver this work with bright young MBAs who are highly educated, but not necessarily experienced.

If consulting were disrupted in the above manner, who would benefit? I believe the winners would be a select few disruptors, plus consulting clients themselves, who would have access to better insight and advice at much lower cost. The losers would be anyone in consulting with a muddled or unclear value proposition. Disruption might also open up completely new markets, such as a “retail” model of consulting at a lower price point, that could be used by more buyers with smaller budgets.

Regardless of how consulting develops in the next 5-10 years, consulting firms themselves should beware anyone who says they can help “guard against disruption” by helping them do any or all of the above preemptively. This claim misses the entire point about disruption which Christensen made in The Innovator’s Dilemma, which is that any step you take to preempt your own disruption may in fact bring it about. That is why disruption is a “dilemma” and not simply a problem to manage.