The Strategy Paradox
Michael E. Raynor
Business strategists observe that companies that significantly outperform the average company employ “pure” strategies. They are relentlessly focused on operational excellence (think Walmart) or product differentiation (think Apple). Average companies wanting to improve returns are encouraged to become more focused. The paradox is that many companies that fail also have pure strategies.
The book’s main theme is that success and failure are blamed on the strategic choices companies made, when those successes and failures may simply be due to luck. The future is not predictable enough for a company to fully commit to a strategy and ensure success through good execution.
The book uses Sony’s Betamax video recorder effort to illustrate this perspective. Sony’s motto was “Always lead, never follow”, justified by its leadership in developing the Walkman and Discman personal music players. Sony was a major producer of TVs and stereo equipment with global distribution. It identified a growing desire among consumers to time-shift their TV viewing, partly from a desire to watch competing programs and partly from simple schedule conflicts (school events conflicting with TV programs, for example). It also realized that consumers expected recorded programs to look and sound as good as broadcast TV. A couple of companies had tried to launch video recorders, but low-quality manufacturing and images made those products unattractive to consumers. Sony developed the Betamax to provide a high-quality viewing experience, focused on consumers recording regular TV programming for later viewing. A previous joint venture to produce a predecessor product had failed due to manufacturing flaws, so Sony decided not to risk the brand by licensing out the technology. In the end, it launched a high-quality video recorder that was pricey but perfect for recording TV shows. This was a pure strategy of product differentiation. It should have succeeded, but it did not.
Matsushita launched a competitive recorder using a different technology (VHS) almost immediately with a focus on low price – and they licensed their technology to many manufacturers. For the first few years, Sony had a slightly greater market share – until consumers began to watch a lot of movies on their recorders. Video rental stores could not afford to stock every movie in two formats and for a variety of reasons VHS came to dominate rentals – and this was the death of Betamax. The company’s strategies failed not because they were bad strategies but because they were great strategies.*
When Sony made its major decisions, it could not have known or anticipated that rental movies would become important – all available evidence suggested that it would not. They could not have anticipated the importance of low cost compared to quality because consumers said they preferred quality. What was not clear was how much quality was good enough at what price. Given the information about competition and consumers, Sony devised a good strategy to dominate the market and maximize returns. It failed to obtain that outcome. It is interesting to compare the Sony Betamax strategy with the Apple iPod/iPhone strategy. Both had a focus on quality/design, higher prices than the alternative, a closed system and a need to enlist content partners. Apple has been wildly successful using the same strategy as Sony. Both organizations have good execution skills, so this can’t explain the different outcome. Sony had other big successes and failures in and around the period when Betamax was launched. Apple had had other successes and failures in its history. Apple had the good fortune to offer their products at the time consumers wanted what Apple was offering.
What could Sony have done? Sony was not the victim of bad planning or lousy strategy or poor execution….The short answer…is that Sony focused too much on the pursuit of strategic success and not enough on the management of strategic uncertainty. Strategic uncertainty is the realm of future events that are essentially unknowable and unpredictable.
For simplicity, the book describes two pure strategies – cost leadership and product differentiation. There are variants on these, but both imply a commitment to one extreme and avoidance of the medium-price, medium-differentiation approach. Seeking competitive advantage in one direction is incompatible with seeking it in the other. This is the basis for the idea that strategic commitment creates greater risk than strategic ambiguity. Many companies fear a high-quality, low-cost competitor, but such a competitor is probably mythical. What may be true is that technological advance can move the frontier of price or quality so that current quality (at a low price) is superior to past quality. This advancing frontier means that the “acceptable” absolute price and quality may be changing relative to a company’s past accomplishments.
Creating the kind of superiority that market winners enjoy is complex and time consuming. A company that embraces cost leadership will not develop that capability without sustained effort (in this strategic context, cost leadership is different from cost cutting). If that company were to switch to product differentiation leadership, it would need to rebuild all of its processes, facilities and messaging – none of which is easily or quickly done. There is a general underappreciation of the complexity of achieving either cost leadership or product differentiation leadership. Past commitment to one strategy inhibits the ability to change to another. Strategically ambiguous companies do not face that switching cost, but they also never develop the associated strategic advantage. This is the root of the greater risk associated with commitment to a pure strategy.
Companies consider two approaches to overcoming strategic uncertainty in the hope of picking the “right” strategy: adaptation and prediction. Adapting companies hope to detect change in its early phases and then react quickly enough to get it right just in time. Predicting companies study the market, customer needs and so forth in order to foresee the future well enough to pick the strategy that will fit it. Neither approach really works.
The problem with adaptation is timing. It is difficult to change fast, and the required change in such a case could be extreme. Even if a company knew when to begin adapting, it might not be able to adapt in time to obtain a competitive advantage. Alternatively, it might begin adapting to a potential future as quickly as possible, and be “adapted” before the adaptation makes sense. To take the problem a bit further, it can be quite difficult to usefully detect a change in the business environment. Many businesses struggle to recognize an important change in their environment because the individual clues are small, and incremental (non-fundamental) adjustments appear to solve the problem. Raynor uses the example of large US integrated steel companies facing disruptive mini-mills. In the early stages, the mini-mills produced low-margin products with low quality requirements. The integrated mills were able to abandon those products and shift capacity to higher-margin products. Competition actually raised profitability. But over time, the mini-mills increased quality, invaded more product lines and began to cut away the basis of profitability. By the time the integrated steel companies detected the nature of the problem, it was too late to adapt. The same thing happened in passenger automobiles and magnetic memories. In a classic paradox, during good times there was no need for change, and by the time it was obvious that radical change was needed, there was no capacity to effect it.
The alternative approach is to try to predict the future in order to choose the proper strategy. Prediction works adequately when there is a mechanistic connection between the future and current states. For example, weather forecasting is moderately successful and is improving over time. The same can’t be said for GDP forecasting. For example, one study concluded that only 2 out of 60 recessions…were forecast a year in advance, 40 remained undetected by April of the year of recession, and 25 percent of forecasts were still for positive growth in October of the recessionary year. These numbers represent the consensus; obviously a few people would have been predicting the recession and been ignored. Raynor calls this “monkeys at typewriters”: if there are enough predictions of anything, some will get it “right” by chance, which is not really helpful at all. It is also misleading in another way, because it suggests that the future fits only two cases, “the predicted outcome” and “not the predicted outcome”, when there are many possible futures. Predictions are rarely presented as distributions of outcomes with associated probabilities – they are presented as point values. “Next year’s sales will be $25MM with a 30% profit” is a common type of prediction. You rarely see a prediction that reads something like, “There is a 10% chance that profits will be less than $5MM, a 30% chance that they will be between $5-10MM, a 50% chance that they will be between $10MM and $25MM and a 10% chance that they will be between $25MM and $30MM, assuming that GDP growth is between 1.5 and 2.5%.”
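A distribution-style forecast like the one above can be turned into something actionable with a few lines of arithmetic. This is a sketch using the hypothetical profit brackets quoted in the text; the midpoint approximation for each bracket is my own simplification, not the book’s.

```python
# Profit brackets in $MM (low, high) with their stated probabilities,
# taken from the hypothetical forecast in the text.
forecast = [
    ((0, 5), 0.10),
    ((5, 10), 0.30),
    ((10, 25), 0.50),
    ((25, 30), 0.10),
]

# A point forecast hides the spread; a distribution yields both an
# expected value and a sense of the downside risk.
# (Each bracket is approximated by its midpoint -- my assumption.)
expected_profit = sum(((lo + hi) / 2) * p for (lo, hi), p in forecast)
downside_risk = sum(p for (lo, hi), p in forecast if hi <= 10)

print(f"Expected profit: ${expected_profit:.1f}MM")  # $14.0MM
print(f"P(profit <= $10MM): {downside_risk:.0%}")    # 40%
```

The point is not the particular numbers but that a single figure like “$25MM with a 30% profit” conceals a 40% chance of landing at $10MM or below.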
Many things make it hard to make useful predictions. For one thing, many predictions are based on observation of multiple similar cases of outcome distributions. Weather forecasting owes much of its usefulness to a statistical view of the past, but most business situations don’t have many similar previous cases to build a base for prediction. Indeed, the range of things that form the background for any prediction in society or business is huge. One thing depends on many things – and it is very hard to know which things to ignore. Is potential default on Greek bonds a reasonable thing to consider when building a business case for a new server design? You would not think so, if it had not happened at about the time this book was published. In some cases, systems are fairly mechanistic and thus predictable – IF you can assess the initial conditions. This means that you must know when things “begin”, which is equally hard. Did the problem that triggered the potential default of Greek bonds begin when Greece joined the euro, or was it already present in the 1980s – or even before? Just to make it worse, many events represent a convergence of events, so you must understand a whole range of timings to set the so-called initial conditions. Prediction is not a practical exercise – and it is a poor basis for understanding strategic uncertainty and risk.
To summarize the situation to this point, creating an above-average return requires committing to a relatively pure strategy, which also creates greater risk. That risk can’t be offset by adaptability or prediction. The remainder of the book suggests an approach to dealing with the dilemma of making risky commitments in the face of strategic uncertainty; what is the right structure for such a company?
The discussion of structure sets out some perspective before making suggestions. Consider two dimensions: environmental uncertainty and the “benefit of committing”. An environment might have low or high uncertainty. Similarly, there may be environments where there is little benefit to making a commitment and others where there is a big advantage. Both uncertainty and benefit are relative to the specific competitive circumstance. These two dimensions create a grid that describes three conventional organizational structure choices.
The situation is fairly clear when the benefit of commitment is low, but the combination of high uncertainty and large benefits create a dilemma for organizations. A typical response is that senior executives should set strategy and mid-level managers should execute it. Raynor suggests a modification to this view, but this suggestion requires some context.
When an organization is young, it typically has one product and a simple hierarchy is sufficient. Everybody works in a functional department and all departments report to the CEO. As the company grows in scale, nothing needs to change. But when the company decides to diversify its offerings, the usual approach is to break the business into divisions or business units, in each of which every function is replicated. This creates a need to integrate across the divisions – a role that often falls to the corporate center. Integration involves the allocation of resources, alignment of processes, and setting of goals. But integration requires both technical knowledge and exposure to detail – neither of which is the emphasis of executive positions. For this reason, integration is in fact better conducted at the upper middle levels of the organization. This can be understood better through the following.
Elliott Jaques studied whom people sought out for decisions, and found that most people have more than one “boss”. Certain decisions went to their direct supervisor, but others went to their “real” boss. This seemed like a hierarchical failure, but Jaques observed that the difference between the two bosses was their time frame. For example, a sales manager may have annual sales goals and emphasizes those goals in decision making. But a sales person may have monthly goals that relate to some different manager’s monthly goals. The sales person will consult the person whose time frame matches the problem rather than the person the hierarchy suggests. For the same reason, senior people may skip down a few ranks to gain information they need for a particular type of decision. Level skipping is an intelligent adaptation that matches decisions to time frames. Jaques identified seven relevant time frames and suggested that an organization therefore needs only seven levels. This theory is generally known as “requisite organization” and helps explain how organizations deal with different kinds of uncertainty. Lower levels of an organization face certain types of uncertainty, but they are near-term in nature. At higher levels, the time frame of uncertainty shifts to longer periods, and higher points in the organization must spend less effort dealing with near-term uncertainty. Division leaders should spend little time on very short-term uncertainty and much more on medium-term uncertainty. This rests on another simple fact: it is very hard to think about and balance many time frames. Few people, maybe fewer than 5%, can manage the intellectual and emotional strain of considering many time frames simultaneously. Consequently, one purpose of a hierarchy is to divide the time frames so that people can focus on one time frame only.
Strategy is one of those decisions that involve uncertainty over longer time frames. One way to distinguish strategic uncertainty from operational uncertainty is that strategic uncertainty affects how you make money – its creation or capture. Sony’s uncertainty was in how people would use video recorders; had consumers chosen time shifting as the primary purpose, Sony would have made money. This sort of strategic uncertainty is rarely a short-term problem, which helps explain why strategic uncertainty and risk are a higher-level concern.
In practice, companies rarely have strategy set by the CEO and their cohort. In most companies, corporate leadership sets targets and general approaches for execution, but delegates the specific activities to achieve those targets to subordinate levels. Approaches mostly relate to the two aspects of hierarchy previously mentioned: effort can be devoted to increasing scale without changing “type”, and/or effort can be devoted to changing the scope of the business (a different type of value creation). Seen in this light, acquisitions and divestitures can be viewed as efforts to change the scale or scope of the business, and re-organizations as efforts to change the integration of the component divisions. These approaches dictate the highest level of action in reaction to uncertainty, while the subsequent actions of mid-level managers are dedicated to delivering on the potential of the strategic approaches imposed by the corporate center.
From a strategic perspective, these strategic approaches can take the form of commitments or real options. For example, a commitment would be to make a large investment by buying a company to add to the portfolio. In contrast, a real option would be to buy 20% of the same company, with an option to gain control at a later date. As described in this book, the total acquisition creates the potential for both great success and complete failure. Taking a real option (as opposed to a financial option) is an example of confronting the strategy paradox. If the investment fails, the loss is less than the full commitment. If the investment succeeds, further investment is possible with greater rewards. The book describes several examples of real options. These mostly take the form of venture-capital-like activities involving investments in small companies with exit options. For example, a telecoms company acquired a share of an IT support company and merged its own support division into that company without taking a large stake in the resulting company. Eventually, it became clear that there was no synergy (at the time) between telecoms and IT support, so the telecoms company sold its share and moved on. The same company had bought a share in a small mobile phone company when it was unclear how landline and mobile services would evolve. This share turned out to be very valuable when it became clear that mobile would displace consumer landlines, and the telecoms company exercised its option to acquire the rest of the company. In both cases, the telecoms company was unsure how the market would evolve and positioned itself with options that could be exercised as the uncertainty declined.
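The commitment-versus-option tradeoff can be shown with a back-of-the-envelope expected-value calculation. All the numbers below (stake sizes, success odds, payoffs) are invented for illustration; only the structure of the comparison – limited downside, preserved upside – reflects the book’s argument.

```python
# Back-of-the-envelope sketch: every number here is an assumption made
# up for illustration, not a figure from the book.

p_success = 0.5       # assumed probability the market evolves favorably
venture_value = 300   # value of the whole target business if it succeeds ($MM)

# Full commitment: buy 100% now for $100MM; failure is a total loss.
ev_commit = p_success * venture_value - 100

# Real option: buy 20% now for $20MM, with an option to buy the remaining
# 80% for $100MM once the uncertainty clears; failure costs only the stake.
ev_option = p_success * (venture_value - 20 - 100) - (1 - p_success) * 20

print(ev_commit, ev_option)  # 50.0 vs 80.0
```

Even though the full commitment pays more when things go well, the staged option has the higher expected value here because a failure costs $20MM instead of $100MM.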
One alternative for the telecoms company was to invest in building its own divisions dedicated to mobile or IT support, which would have been more expensive and involved a large commitment. Instead, promising companies were found with other stakeholders to share the risk. The book spends some time discussing diversification in this context. Was the telecoms company diversifying by taking options in a mobile phone or an IT support company? In this case, the question boiled down to whether there was a future synergy between the option and the operating business. It eventually became clear that there would be synergy between landline and mobile, but not between IT services and telephones. Implementing the IT services option would have been diversifying because there was no strategic synergy. It is the element of synergy that often explains when companies should acquire or divest business units. Changes to the external environment create the potential for new synergies and degrade existing ones. This is the other side of strategic uncertainty; over time, existing business combinations that were perfectly functional may lose their rationale – and may be better fit to be combined with a different group of businesses or even to stand alone.
Synergies also explain a problem in innovation. In a corporation where the onus for all innovation is on operating divisions, synergies will generally be missed because the operating divisions will be hamstrung by the organizational constraints that define their mandate….A second problem is that individual divisions will…be constrained from exploring opportunities that are “too big” for them to handle. Yet another problem is rewards for managers. It is relatively easy to judge the current performance of a business based on earnings compared to targets. In contrast, it is very hard to judge the quality of their decisions about the future. This makes it hard for the managers of an operating business to invest (raise their current costs, perhaps decrease their current rewards) in uncertain innovations or future opportunities.
With this perspective, it is clear that strategic flexibility needs to be created somewhere above the operating businesses and below the CEO; in some companies this takes place at the “group” level. The book suggests a process for creating this strategic flexibility – one that has nothing to do with adaptability, but everything to do with optionality. The process for creating these options encompasses four steps: anticipate, formulate, accumulate, and operate.
Anticipation is not prediction; it is a search for plausible futures roughly ten years out. Scenario planning is the preferred tool for anticipation. Scenarios would consider general economic and political conditions, potential market dynamics and evolving technology. The book suggests that the best insight will come from considering “edge” scenarios that embody the extreme plausible range of outcomes; the probable futures would be less exaggerated cases of one of these scenarios. The goal of scenario planning is to describe a variety of potential futures. Potential scenarios should not “collapse” to a single scenario – that completely misses the point. The group should not strive for consensus on the correct scenario, but should agree that the scenarios developed are plausible.
The first step in developing a set of scenarios is to ask the right question. There is a temptation to ask a specific question rather than ask about the possible conditions that would apply in the future. The book suggests that a question like “Should we build in China?” might be too narrow; a better question might be “What kind of manufacturing base will be most advantageous in 7-10 years?” The second step is to identify the key dimensions of the scenario. For example, regulatory environments could be “restrictive” or “permissive”, and the difference would be important to companies that face regulatory hurdles in their basic business. Other dimensions could relate to factors like who makes the buying decision, general economic conditions, industry consolidation or technology change. A dimension should be something that is out of your control, and there should be a big difference between the extremes of the dimension. Although the initial examination might identify many potential dimensions, recognize that the number of scenarios to be developed equals 2^n, where n is the number of dimensions. For example, 3 dimensions require the development of 8 scenarios and then 8 strategies. It is a rare case that is impacted by only one dimension, but few organizations can manage the analysis of 5 or more. Two or three is ideal, so the team should prune its thinking to the dimensions of greatest probable impact. The third step is to define the conditions at each end of a dimension. As mentioned above, a regulatory dimension might have extremes described as restrictive and permissive, only more specific as applied to the business situation. What do restrictive and permissive mean? Outside authorities are a useful resource to combine with inside thinking. One must scan the environment for credible but outlying opinions; consensus opinions entirely miss the point because you are seeking diversity. The fourth step is to determine the final set of scenarios.
Upon examination, some scenarios are internally inconsistent (one aspect of the scenario contradicts another aspect of the same scenario), which means they can be discarded. Other scenarios may resemble each other enough to suggest only one needs to be retained. Developing strategies is a lot of work and there is no value in creating “extras”. After all, each strategy will call for some effort and investment to execute; too many options will begin to cost more than the realized value of the few that come to pass. The fifth step is to determine the relative probabilities of the remaining scenarios. This should be done by some kind of secret voting procedure to minimize the effects of persuasion and influence. This is another point where the diversity of the group has the potential to produce a more accurate assessment than any individual or group consensus.
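The combinatorics behind the scenario count can be made concrete: each scenario picks one extreme per dimension, so n two-extreme dimensions yield 2^n scenarios before any pruning. The dimensions and labels below are invented examples in the spirit of the text, not the book’s own case.

```python
from itertools import product

# Invented example dimensions, each with two extremes (my assumption,
# in the style of the regulatory example in the text).
dimensions = {
    "regulation": ("restrictive", "permissive"),
    "economy": ("recession", "growth"),
    "technology": ("incremental", "disruptive"),
}

# Each scenario is one choice of extreme per dimension, so the full set
# has 2**n members -- each of which would need its own strategy.
scenarios = [
    dict(zip(dimensions, combo))
    for combo in product(*dimensions.values())
]

print(len(scenarios))  # 8 scenarios (and 8 strategies) for 3 dimensions
```

This is why the text recommends pruning to two or three dimensions: at five dimensions the raw set is already 32 scenarios, far more than any team can usefully develop.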
For each scenario in the final set, a strategy should be formulated that identifies the corresponding core competencies, assets, threats and opportunities. These futures are compared to existing capabilities to understand what gaps may exist. Another outcome of the analysis is the identification of “core” and “contingent” aspects of the scenarios. Core aspects are those for which there is little uncertainty. For example, a biomedical company would consider the future emergence of a large elderly population in the US to be fairly certain. In contrast, the structure of the health insurance market is fairly uncertain. Coping with core aspects is properly the role of operating businesses, since there is little uncertainty. Contingent aspects are the purview of the group level. In the accumulate stage, the group identifies and acquires the options needed to address the contingent futures previously identified.
During the operate stage, managers keep watch on the environment to determine which of the potential scenarios developed in the anticipate stage is most similar to the evolving reality. In other words, they must determine which options to exercise. They must also determine what to do with the options that will not be useful to the company.
Once the scenario process is complete and a set of options is created, it is important to ensure that the organization remembers what each option is “on”. The option may be related to a technology, a specific market or regulatory future, or some other uncertain future situation. Operating the set of options requires that the “purpose” of the option inform maintenance, exercise, and abandonment decisions. This sort of “real” option is different from a financial option, which has strict exercise criteria. The company operating the set of options must decide when the uncertainty has cleared enough to know which options to exercise and which to sell or abandon – and this is a function of the strategic potential of the option rather than the cost of letting the option go. Business is rarely thought of as a series of experiments, but that is a useful way to consider a real option. Some experiments work and some do not; there is no reason to hold onto a failed experiment. In many ways, it is the ability to productively exit an option that defines successful management. Many such investments are not failures so much as they are no longer answers to that company’s strategic needs.
There is an ongoing argument in the strategy world about whether the best strategies result from planning or from reaction to emergent conditions. Raynor says that this is a false dichotomy. At the level of corporate strategy, the future is most uncertain and thus strategy must be emergent; the sort of scenario planning and strategic flexibility described here must be applied. But at lower levels in the same company, the uncertainty is much less, and planning is the best way to execute against that limited uncertainty. Senior executives who have advanced due to good planning skills need to recognize that those are not the skills required in the face of greater uncertainty. In fact, belief in their own skill can be a serious risk by itself. Much of business culture celebrates the hero CEO, and it is hard to resist the impact of such adulation. Yet, given the uncertainty, there is an important role for a humble approach – one that accepts an inability to predict or control the future and yet commits to a method for thriving despite the uncertainty.
Comment and interpretation:
- As I read the first part of the book, I was struck by a potential parallel between the idea of managing a venture to cope with strategic uncertainty and the ideas of effectuation as an entrepreneurial strategy (as presented by Saras Sarasvathy). The moderation of commitment permits reaction to unexpected conditions and flexibility in responding. Most importantly, entrepreneurs, in this interpretation, are strongly focused on survival and manage accordingly. Entrepreneurs can’t afford many options, yet they explore via options using affordable loss and the crazy-quilt principle.
- Many years ago, I visited a Chinese company that sought to diversify its capabilities and products. It had a pretty good position and could have made a lot of money by scaling up that position. But the managers were uninterested in scaling up, responding that if they focused the business more, they ran the risk of the market turning and putting them out of business. That would mean losing their jobs, and the entire village that depended on the business would be destitute. By diversifying, they might not get as rich – but they would not risk getting as poor either. I think there are many companies, especially family companies, with exactly this perspective. They prefer moderate returns and survival.
- You must wonder how many industries are in the middle of a significant change in customer desires. Brands in so many industries seem to be losing their impact and companies seem to have fewer solutions to retain their positions. It is interesting to consider Raynor’s suggestion that the decline is hidden behind other events or causes (our sales are down because of the recession) that delay sensing the background trend. It is easy to say that they should know better, but slow change is not easy to detect. If you don’t track it carefully, it is easy to not notice that you are gaining weight, your car’s tires are deflating slowly or that your air conditioner is losing efficiency due to lack of maintenance. Slow change creeps on us until we can see it everywhere. Adapting to change is hard because we can’t see this kind of change.
- We know that it is knowledge content that drives profits more than asset base, and this explains the structural decline in the significance of the steel industry versus health care. But in 1974, how could we have known that increasing the knowledge content of the steel industry was not a viable long-term solution? This is a fascinating comment because it emphasizes the difference between a commodity – an invariant “thing” whose value is governed only by supply and demand – and a non-commodity, which is differentiated in some respect and whose value ALSO depends on the users’ specific needs. It may be in the interest of some buyers or sellers to create the impression that particular products are commodities and convert all decisions into price decisions (where they think they are advantaged). But an alternative view is that a commodity can be de-commoditized if you can demonstrate that the information embodied in the product is increasing. This works especially well for natural products where the natural variation can be converted into a predictable variation. Location-based wines are one example of this approach. It further leads one to think that operational excellence can become the basis for de-commoditization when efficient processes can “add” information to otherwise commodity inputs through sophisticated process control.
- I read the section on strategic uncertainty and time frames and thought about how common the phrase “what’s our strategy?” is when discussing relatively near-term items. “What is our strategy for the upcoming meeting?” is used when a better question is “What is our plan?” But it also leads me to notice that many activities occurring in the middle of an organization are actually fairly strategic in that they determine how a business intends to make money in the future (3-10 years). Is that activity misplaced, or is it mislabeled (the actual decision was made at higher levels and the activity is simply the resulting action, one example of the strategy already selected)? Finally, are those activities actually the stimulus for strategic decisions? If people in a division decide to undertake a project that forces senior managers to ask “What is our strategy?”, is strategy still being created by the senior managers who approve the emerging activity? Would companies that enforced strategy creation in the uppermost ranks lose significant opportunity?
- According to the book, one indicator that strategy is not the primary role of a CEO is the connection between strategic timeframe and CEO tenure. It is quite common for a CEO to hold the position for no more than 5 years and for new CEOs to be brought in from another company. There is little chance that the results of a change in strategy can be seen in less than 5 years, unless it is a complete disaster, so it is easier to understand the role of the CEO as setting the conditions (the approaches) for the organization. If the CEO is taking too little or too much risk through those approaches, that could be grounds for a change. The assertion that CEOs are not the agents of strategy development runs contrary to common thinking, but it also makes sense. In a large company, not many strategies will work across all businesses, but decisions related to scale, scope and integration can impact the entirety. The ability to explain these decisions to both shareholders and employees is something more easily assessed. Apparently, this is still a high standard, to judge from the short tenures of CEOs.
*Text in italics is quoted directly from the book