Design for Change and HBS article
A key principle of Excellence by Design is that of ‘Design for Change’. While the ability of any system to change over time has always been important, it is becoming more so due to the accelerating pace of change in technology, globalization, and consumer demands. Simply put, the world is changing much faster these days than it did even just 5 years ago, and systems (both technological and organizational) must be far more agile and responsive to this change.
That means ‘Design for Change’ is more critical than ever. Intentionally designing a system to be more capable of change (in a managed, effective way) should now be a paramount consideration. It is also one reason why systems that are rigid and inflexible, regardless of their maturity and function, are becoming greater sources of dissatisfaction. Agility is becoming more important than function. The Apple iPhone is the perfect example…its ecosystem of thousands of apps allows it to flex to new needs, while the typical cell phone makers struggled to add more ‘fixed’ features to their phones. Theirs is fast becoming an unsustainable business, and ‘smartphones’ the norm…because they are ‘designed for change’.
An article by the Harvard Business School does an outstanding job of explaining this in more depth. While it is very technical, the critical results can be summarized, and they have great implications for Design for Change. The full article can be found here.
Summary:
The article analyzed complex software systems in terms of Core and Periphery Subsystems. The authors set out to measure the proportion of core vs periphery components and understand its implications. The study analyzed large software systems with a minimum level of complexity and usage, as measured by a large number of end-user deployments.
Any complex technological system can be decomposed into a number of subsystems and associated components, some of which are core to system function while others are only peripheral. Core components are those that are tightly coupled to other components. Peripheral components are those that are only loosely coupled to other components.
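The core/periphery distinction can be made concrete with a small dependency analysis. The sketch below is my own illustration (hypothetical module names; the study itself worked with Design Structure Matrices of real codebases): compute transitive reachability over a dependency graph, then treat components that both reach and are reached by other components as candidates for the core.

```python
# Toy core/periphery classification over a dependency graph.
# Module names and the core threshold are illustrative only.
from itertools import product

# Hypothetical dependency graph: module -> modules it calls directly.
deps = {
    "ui":     {"api"},
    "api":    {"db", "auth"},
    "auth":   {"db"},
    "db":     set(),
    "report": {"api"},
    "plugin": set(),   # loosely coupled: no dependencies in this graph
}

def transitive_closure(graph):
    """Floyd-Warshall-style reachability: everything each module
    depends on, directly or indirectly."""
    nodes = list(graph)
    reach = {n: set(graph[n]) for n in nodes}
    for k, i in product(nodes, nodes):   # k is the outer (pivot) loop
        if k in reach[i]:
            reach[i] |= reach[k]
    return reach

reach = transitive_closure(deps)
reached_by = {n: {m for m in deps if n in reach[m]} for n in deps}

# "Core" here = modules with both outbound and inbound reachability
# after closure; a threshold of 1 is arbitrary for this toy example.
core = [n for n in deps if len(reach[n]) >= 1 and len(reached_by[n]) >= 1]
```

Running this on the toy graph flags the tightly coupled middle layers as core, while the entry points and the standalone plug-in land in the periphery.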
Some key findings:
- “How such “core-periphery” structures evolve and become embedded in a firm’s innovation routines has been shown to be a major factor in predicting survival, especially in turbulent technology-based industries.”
- They found “tremendous variation in the number of core components across systems, even when controlling for system size and function. Critically, these differences appear to be driven by differences in the structure of the developing organization.”
- The article notes research showing: that “tightly coupled (‘core’) components tend to survive longer and are more costly to maintain, as compared to loosely coupled equivalents… higher levels of component coupling are associated with more frequent changes and higher defect levels…teams developing components with higher levels of coupling require increased amounts of communication to achieve a given level of quality.”
- There are substantial variations in the number of core components across systems of similar size. For example, the portion of a system classified as core varied from 7% (Linux) to 64% (myboks).
- Most interestingly, different organizational forms appear to yield designs with different structures. The difference between systems constructed by distributed (especially open source) methods and those built in closed/commercial (single company/team) environments was striking. Even when comparing similar functionality, the closed/commercial offerings relied significantly more on core subsystems, averaging over 50% core, compared to an average of less than 10% core in systems constructed by distributed, independent teams.
- And finally, their summary, which is telling: “it is significant that a substantial number of systems lack such a structure. This implies that a considerable amount of managerial discretion exists when choosing the “best” architecture for a system. Such a conclusion is supported by the large variations we observe with respect to the characteristics of such systems. In particular, there are major differences in the number of core components across a range of systems of similar size and function, indicating that the differences in design are not driven solely by system requirements. These differences appear to be driven instead, by the characteristics of the organization within which system development occurs.”
Implications/Recommendations:
Architecture is a key consideration in the design of systems, yet this study shows that structure (in terms of subsystems) is variable, is influenced by the design of the organization, and has very important ramifications for future extensibility, flexibility, and resilience to change. My experience has been heavily influenced by a focus on core/periphery subsystem design, and these findings match my observations. As such, an organization developing a system (whether software, hardware, process, or organizational) would be wise to remember:
- constructing your ‘system’ with a minimum of ‘core’ components, and well-interfaced (standardized, loosely coupled, implementation-independent) ‘periphery’ components, will lead to lower costs for change…which is inevitable, and increasingly the norm.
- avoid having your organizational structure unduly influence your system structure. For example, while a ‘small tight team’ may be good for driving an effective design effort, ensure they design the system with the intention of ‘least core/maximum periphery’.
- these recommendations apply equally to business design. This is especially worth thinking about as a company expands globally, when it is critical to determine (design!) the right balance between adherence to corporate consistency (core) and regional/market adaptability (periphery).
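The ‘least core/maximum periphery’ recommendation can be sketched in code. Below is a minimal illustration (my own, with hypothetical names): the core is nothing but a small, standardized interface plus dispatch, while every feature is a periphery component that can be added or swapped without touching the core.

```python
# Minimal core / maximum periphery: the core knows only one stable
# interface; features are independent, loosely coupled plug-ins.
from typing import Protocol

class Feature(Protocol):
    """The standardized, implementation-independent interface."""
    name: str
    def run(self, data: str) -> str: ...

class Core:
    """Minimal core: registration and dispatch only, nothing feature-specific."""
    def __init__(self) -> None:
        self._features: dict[str, Feature] = {}

    def register(self, feature: Feature) -> None:
        self._features[feature.name] = feature

    def handle(self, name: str, data: str) -> str:
        return self._features[name].run(data)

# A periphery component: cheap to replace, no coupling to other features.
class Upper:
    name = "upper"
    def run(self, data: str) -> str:
        return data.upper()

core = Core()
core.register(Upper())
result = core.handle("upper", "design for change")
```

New capabilities arrive by registering new `Feature` implementations; the cost of change stays flat because the core contract never moves.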
In any case, some good aspects to consider when you ‘Design for Change’ as part of Excellence by Design.
How Ford got its groove back
An article this week in CIO reviews how the IT transformation at Ford Motor Company helped drive, and support, the turnaround at that company. Before I comment and provide some personal experience from my participation, let’s review some of the very impressive news from Ford:
- Profit is back. Ford reported its fifth consecutive profitable quarter, earning $2.6B in the last period (2Q10).
- US market share has grown. In fact, it has grown in every month of the last two years except one.
- US brand impression is MUCH higher. That goes for perception of quality and, very importantly, innovation.
- US vehicles now receive high ratings. As measured by Consumer Reports and other consumer testing sources (and Ford’s internal research).
- US product winners abound. The Taurus (especially the SHO) is back in a big way and finally sheds the ‘500’ fiasco years. The Fusion continues to do very well. The F-150 is taking share and winning awards (as usual). The revamped Mustang is a hot hit, again. The new Edge interior, with Ford’s new driver UI, is gorgeous. The small Fiesta has entered the market to warm reviews. There really are few duds; perhaps the only complaints are that Lincoln is still not performing as well as one might like (though customer satisfaction, especially with the dealer experience, is very high) and that some vehicles, like the Flex, are not the runaway hits one might have hoped for.
- Europe leadership grows. Ford is the #1 or #2 selling brand in Europe (depending on the period you select over the last two years), and its design leadership there has influenced the Company’s direction, leading to better perception, higher sales, and better US products. The Ka, Transit, Focus, C-Max, and Mondeo are product hits that demonstrate a strength and foundation for future success.
The article in CIO describes IT’s actions relevant to this transformation and quotes CIO Nick Smither, who also gives due credit to the prior CIO, Marv Adams. I was at Ford from 2002 to 2008, worked for both CIOs, and was fortunate to participate closely in much of the work done to help IT become more effective and to help drive the corporate revitalization. That effort, which started before Alan Mulally arrived at Ford, really took hold once he took the reins. But the principles were the same. Here are a few of the key ones.
- Reduce Complexity. This was a key IT strategy starting in 2002. Initially it focused on areas IT could control, like infrastructure, and then moved slowly upward toward business applications and information. Over time, this effort helped not only shed duplicate assets, but also gain greater focus on the assets that remained, so they became better. This occurred in servers, storage, and networks, but also in key enterprise-wide application services like collaboration, data warehousing, and application hosting. This IT strategy bled into the business and took hold in product development, where Ford finally began to take seriously the needless complexity in platforms and components. The benefits IT saw also occurred in vehicles. Product engineering costs fell, quality rose, and capability increased. IT customers saw better service levels. Ford customers saw better products. Complexity kills and focus saves. Of course, it’s not just reduction. You have to design for greater commonality.
- Be truly Global. Ford had always acted like a multinational, not a single global company. IT did too. But over the last few years this balkanization of organizations finally ended. IT started working to leverage global talent, consolidate facilities, and share best practices. The business side of Ford has done so too. While Ford still seems very skillful at providing market-unique offerings when required, the ‘back office’ of IT and business functions now works together much more effectively as a global entity. But note, importantly, that the reduction in complexity and greater commonality of IT and vehicle products is what makes this all possible. You can’t maximize global potential if you act like a million separate entities. You have to redesign your systems and processes to enable globalization.
- Leverage the Community. Ford (IT and the business) has moved toward a model of true teaming, and toward using methods that enable it. This not only builds camaraderie, it builds best practices, and it increases momentum. A single person’s great idea can be absorbed and magnified, instead of possibly resented or ignored. Team sport is something IT built with Computing Patterns, Centers of Excellence, and Communities of Practice. Alan Mulally brought it into Ford’s executive suite (where it had, ahem, been lacking) with his common Business Plan Review (BPR) process, which encouraged open, efficient dialogue about issues and where help was needed, and a fresh attitude of working together. The lesson here is that you have to design enablers and solutions to help leverage the community, not just yell at people to work together more (as many companies often do).
- Pursue Product Leadership. Both IT and the Ford business rededicated themselves to building outstanding products (and in IT’s case, services). In IT, Ford led the industry in moving toward utility computing, introduced better methods of developing applications ‘like a product’ rather than as one-off order taking, and helped introduce new innovations like Sync. The business re-energized itself too. Under Derrick Kuzak, Global VP of Engineering, new methods, along with more ambitious objectives, were employed to better define the key attributes of excellence and aggressively design for them.
The points above could sound like motherhood and apple pie, but Ford (IT and the business) made them real by designing for them. It was the perfect example of excellence by design. Many IT and business leaders can talk the talk, but few have walked the walk as we did at Ford. It was truly a transformation in leadership, strategy, tactics, and results that Americans should be proud of. I know I am blessed to have been a part of it. I hope to share more insights soon in my forthcoming book, because the lessons learned in Ford’s successful transformation should be regularly taught in any business, and in any IT organization.
Applying Excellence by Design…for Healthcare
Much of my professional time over the last few months has been focused on the area of Healthcare and considering the application of Excellence by Design techniques to it.
Here’s a look at Healthcare using just some of the Excellence by Design model facets:
- Environment: Challenging! The Healthcare industry is perhaps the leading example today of a challenging Environment that exhibits the paradox of Chaos vs Control. (Control) The industry is facing unprecedented standardization and regulatory pressures driven by government entities. These cover things like basic interoperability of protocols based on the National Information Exchange Model (NIEM), through which the US will guide the development of a health information exchange framework. There are also new content standards for specifying clinical diagnoses and procedures, among others. These new standards are significantly affecting the Environment that all players must live in, whether they are software product vendors, information value-added services vendors, hospitals, insurance carriers, or others. (Chaos) Of course, at the same time, the desire to drive new competitive innovations marches on, in medical devices, in information (i.e. business) intelligence services, and in solutions that drive cost down and effectiveness up. But don’t forget that many (if not most) Healthcare systems are based on pretty antiquated technology, so all this change is occurring against a landscape that badly needs modernization of basic infrastructure. From my perspective, the Healthcare industry, which has been a laggard in IT evolution compared to other industries (in particular Manufacturing, Finance, and Travel) in both optimization (Control) and innovation (Chaos), now seems to be paying the piper by having to face simultaneous pressures from multiple directions, in a shorter (government-imposed, politically energized) timeframe.
- Systems as Strategy: A Paradox. A key facet of Excellence by Design is the use of ‘systems as strategy’ (meaning structured approaches to problems and the design of systemic solutions to them). The Healthcare industry seems to have a dual personality in this regard. The medical/clinical side of the industry is the poster child for developing structured approaches to disease discovery, diagnosis, and treatment. It is a hallmark of the industry. Yet IT has not adopted this same level of rigor. Why? Typical reasons given are underinvestment in IT in general; relatively low competency (in staff and even in CIO roles, which are being posted with a flourish these days, as if the role was never regarded as important before!); and a lack of cross-industry desire to solve some of the broader IT challenges, as Automotive did with CAD and supply chain, or as Finance did with bank funds transfer interoperability and stock trade processing. The Healthcare industry and its functional organizations have generally tended to remain ‘islands’ that did not seek cooperation among competing entities, technology providers, or even across functions within a company. There was little application of broad ‘systems’ of execution as a strategic approach to business process design and technology solutions planning.
- Product as Platform: An Opportunity (again). As an industry, the IT solutions employed for Healthcare are very ‘siloed’, both in design and in implementation. Other industries have shown the advantages of greater integration of IT solutions into broad platforms that enable a wider class of functionality and information insight, in a more consistent and approachable (same UI, same interfaces, etc.) form. The classic examples are, of course, the ERP vendors, although their offerings have become so bloated and complex they are not the model I would recommend. Better examples are Salesforce.com, Amazon, and eBay. These have become very successful not only due to their function and content, but because of their capability as extensible ‘platforms’. Other companies are following this trend. Facebook and Twitter are among the many social networking offerings trying to grow beyond being ‘an app’ to become ‘a platform’. So what is happening in Healthcare? Not clear yet. While there is some noise in this direction, I cannot say I have been impressed that what I have seen is more than marketing spin. Just adding function to an existing offering, or rebranding/bundling applications, does not a platform make. In my forthcoming book (or a future blog post) I’ll provide some general characteristics that I believe define a great product-as-platform.
In summary, Healthcare is either a scary place to be, or the best game to be in right now. The industry is facing great change, is ripe for all kinds of improvement, is forced to act with a sense of urgency by government, and has a noble mission to improve the lives of people. It can be a great podium for those wise and skilled enough to apply smart approaches to meet the challenge. It can also be a vast graveyard for those who are unable to think broadly and try to save the patient with the ‘one more band-aid and pray’ approach.
I am optimistic that, driven by the forces of today, the industry (and IT especially) will leverage the good capabilities that abound to improve efficiency of operations, as well as patient outcomes. But of course I also believe the key to doing this most effectively is not brute force but Excellence by Design.
Simplicity and Design
I have emphasized the issue of Complexity in Design before in this blog. It is an ongoing and critical aspect of understanding Excellence by Design.
In the talk above, George Whitesides does a nice job of providing a very simple introduction to Simplicity and Complexity. Excellence by Design requires the designer to be adept at using simplicity to create complex capabilities through what George refers to as stacking. This is not a new concept; he simply reminds us of the basic value of using small elements to build bigger things. He also tries to define what ‘simplicity’ is. Interestingly, he defines it as:
- Cheap (low cost, so easy to reuse on a massive scale)
- Functional (must provide some utility)
- Reliable (does what it says with extreme predictability and consistency)
- Stackable (has some characteristic to enable easy combination/connection with other things)
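These four properties show up in something as humble as function composition. The toy sketch below is my own illustration (not from the talk): simple, reliable, stackable pieces combined into a more complex capability.

```python
# "Stacking": combine simple building blocks into a bigger capability.
from functools import reduce

def stack(*steps):
    """Compose simple steps, applied left to right, into one pipeline."""
    return lambda value: reduce(lambda acc, step: step(acc), steps, value)

# Each building block is cheap, functional, reliable, and stackable.
word_count = stack(str.strip,   # remove surrounding whitespace
                   str.lower,   # normalize case
                   str.split,   # break into words
                   len)         # count them

result = word_count("  Simplicity enables Complexity  ")
```

Each block is trivial on its own; the value comes from the standardized way they connect, which is exactly the ‘stackable’ property.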
Although George claims little study has been made of simplicity in general, the use of stacking is certainly not new. It is a basic concept that engineers (whether mechanical, chemical, or information technology) rely on as a fundamental part of their jobs.
I might note, however, that engineers typically strive to 1) ‘shorten the distance’ from building blocks to complex solutions by using the highest-level building blocks they can (use an off-the-shelf light switch instead of designing and manufacturing your own), and 2) build complex designs that are predictable and stable, not emergent.
Said another way, the traditional (engineering) view of simplicity and complexity is to SHORTEN the ‘distance’ between the two needed to accomplish a SPECIFIC result. What this yields is less understanding of the truly simple building blocks, in favor of using more complex ones. No problem if the issue lends itself to a ‘static’ goal, like building construction.
But below is a vastly different presentation discussing the effects and factors that have contributed to the destruction of ocean life. The ‘distance’ between the simplest elements of ocean life and the ultimate effects their destruction will have on life on our planet is obviously a huge challenge to understand, because it is a dynamic, emergent system without fixed, predictable results.
Moral of this post: In business, when considering how to achieve Excellence by Design, the designer must be careful to understand whether the solution they are designing is really
- one best served by shortening the distance to a specific/static solution
or
- one that must enable dynamic/emergent behavior
or some combination of the two…
This ability to determine what level of ‘simplification’ to use, and how, and the effects it will enable, is a very challenging task. It would, frankly, be a great subject for a college course in advanced design…but perhaps we’ll get to that level of detail another day.
Einstein of Design
Several years ago I was astounded upon reading the book ‘A New Kind of Science’ by Stephen Wolfram. It provides a point of view I highly concur with, that the universe of complexity can be explained via computational models. Essentially, in my terms, it points out how brilliant design (that is, at its core, quite simple!) can produce infinite variety.
The video above is a talk by Stephen at TED, in which he provides an update on some new capabilities he and his team have since produced (like Wolfram Alpha and WolframTones), but more importantly, expounds on his belief/vision that computation can provide the basis for understanding the fundamentals of the universe…indeed, for modeling alternative universes as well.
I believe Mr. Wolfram is well on his way to being the next Einstein, for several reasons, and they are worth touching upon because they relate directly to the theme of Excellence by Design.
- Great Design can be simple, yet yield infinite variety. This is a core theme of Stephen’s work, my own beliefs, this blog, and is a key characteristic of great designers. It is interesting to me to see, in the universe of IT professionals and organizations, how some embrace this deeply and some do not. It is a capability I watch for in peers and colleagues, and a capability that this blog tries to show how to enable for IT organizations especially.
- Models may be simple, but Results are irreducible. This is a very interesting paradox, and again something many people may react strongly against. Stephen declares (and shows) that while simple designs that enable infinite diversity are understandable, their results are not predictable in reduced form. This has huge ramifications. It means you could design something that evolves with unintended consequences…a scary thought if you work in biotech or some other field where the outcome of an experiment could create a deadly pathogen! On the brighter side (a lot brighter), it means designers can be charged to create more ‘organic’ solutions that evolve and react to new needs, not just mindless programs that do only what they were originally coded for.
- Model modularity is a powerful concept. In the IT world, ‘SOA’ has followed ‘OO programming’, and ‘modular programming’ before that, as a more organized approach to producing, and reusing, functionality. Stephen certainly understands the concept, but extends the theory into his notion of computational modeling and into his products (like Wolfram Alpha). I love what Stephen is doing, both conceptually and practically.
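The ‘simple model, irreducible results’ idea is easiest to see in Wolfram’s favorite example, the elementary cellular automaton Rule 30, which his book discusses at length. A minimal sketch:

```python
# Rule 30: a one-line update rule whose output is so irregular that its
# center column has been used as a randomness source in Mathematica.
def rule30_step(cells):
    """One generation of Rule 30: new cell = left XOR (center OR right)."""
    padded = (0,) + tuple(cells) + (0,)   # treat cells beyond the edge as 0
    return tuple(padded[i - 1] ^ (padded[i] | padded[i + 1])
                 for i in range(1, len(padded) - 1))

# Evolve from a single black cell and print the emerging triangle.
row = tuple(1 if i == 15 else 0 for i in range(31))
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```

The rule fits in one line, yet no shortcut is known for predicting a distant cell without running every intervening step: the model is simple, the result is irreducible.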
There is a lot more to Stephen Wolfram, his contributions, and his concepts than I highlight here. But if I may make two grand statements:
Statement 1 (not SO grand): Any IT organization (or any business, for that matter) would be wise to deeply study what Stephen has done and is proposing to do, and to develop a core competence in applying it to IT and business. There are deep implications for how to organize work, design products and solutions, and deliver value to your customers. I would argue that just as concepts like industrialization, mass production, process reengineering, and six sigma quality had their times of birth, adoption, and eventual incorporation into the DNA of business management, so will the concept of computational modeling be incorporated into the methods of planning, production, integration, and service of businesses. It is certainly happening today in many areas (again, SOA being a trivial example) but is not yet recognized for the broader value it can provide.
Statement 2 (very grand): I believe the idea of simple computational models as the basis for understanding systems (whether they be mathematical systems, physical systems, biological systems, or the universe itself) is not only correct, but is, frankly, how God would have done it. Seriously. If you were God, would you build the world in 7 days by painstakingly creating and positioning every molecule? Or would you, as the Great Designer, craft the ability for systems (the universe) to start, and computationally evolve using simple models over eons of time? The idea is so appealing. And it can fit whether you are deeply religious, spiritual, or atheist. Given the fact of irreducibility, this Great Designer had ideas on what might evolve, yet enabled the freedom of evolution.
I hope you are intrigued enough by Stephen’s talk above to take a bit more time and think about this. He has done a fabulous job of providing a fantastic view of, and methods for, Design, and one that still has very practical applications today. He may well go down as the next Einstein in terms of contributing to the understanding of science, physics, and the universe.
Science, Society, and Excellence by Design
Michael Specter does a nice job reminding us of the importance and value of science-based understanding and decision making. I highly agree with his concern that while the world has become more connected and more capable, and science has contributed so many advances, many people are still willing to believe falsehoods or unsubstantiated theories, and to confuse issues of fact and science with policy and politics.
This is important to understand for the Designer, because while good design should be rooted in facts, science, and engineering, it must also face the reality of populism and politics. Take health care information technology, or genetically modified foods as two good examples. Both are subjects for which there is a rich and broad potential for designing solutions and improvements that can benefit mankind, yet both are subjected to highly charged debate, filled with both prejudice and confusion.
One must be careful to understand and differentiate between the science/engineering/fact based aspects of the design, and those aspects that are not so grounded. This does not mean the political/emotional/prejudicial is unimportant. It simply means be careful to distinguish the two and address each appropriately.
I have found this in many types of design challenges. When doing process reengineering, for example, it is easy for an organization to react with fear to the idea of simplifying operations. The facts/science/engineering may show a far better method of organizing work execution, yet the designer must be cognizant of the potential for the organization to resist the changes for reasons that are factually groundless even if personally very real. This is a trivial example.
The examples Michael discusses are real and much larger. As a human race we must become more skilled at dealing with this challenge because, as scientific capabilities accelerate (and they are and will, due in great degree to advancements in computer technology), the opportunities for improvement…and debate…will increase.
Several hundred years ago the world debated the science that said the world was round. That argument was one of the few of its kind, and it went on for many decades. Today such scientific discoveries happen all the time, and have much greater consequences. As a society we must become skilled at the process of learning about, absorbing, accepting, and reacting to this increasing pace of scientific advancement.
So Excellence by Design should not only include design based on the underlying principles of science and engineering; it must also take into account the very possible, and in some cases likely, resistance to the design.
Using Excellence by Design to manage Complexity
Nature does it Better. Something to really consider is how immature we are compared to nature. Nature supports infinite complexity, yet does so by design. Biology and chemistry form the design basis for nature to support broad complexity. Every leaf, tree, and flower is different, yet they are all formed from the same design principles. Man has a long way to go to form design principles as robust as nature’s, but what the heck, we have only been at this for the blink of an eye compared to the age of the earth.
Business in general, and IT in particular, has become much more complex. There are several ways to think about this subject. One is simply how business (and life in general!) has become more complex. This is due to a wider variety of options (in products and services), a greater breadth of customer base and relationships in general, and more rules/regulations/considerations driven by government, social, economic, environmental, and legal factors. Yes, the world in general is getting more complex.
Another way to look at complexity is from an IT perspective. Certainly technology has gotten more complex for the same reasons noted above, plus the advancements in technology itself, which provide an ever-increasing set of alternatives in hardware, software, and networking and, perhaps most influential, the rise of independent offerings that must be ‘integrated’ into a solution. It is not unusual today to find an IT solution that mixes cell phones, web servers, third-party hosted applications, remote storage, and enterprise databases.
In fact, these two trends are compounding and influencing each other. McKinsey recently issued a report on Tackling IT Complexity in Product Design. Should we be concerned, and if so, what can be done about it?
Actually, this is a subject I have spent a great amount of effort on over my career. Since college, where I studied systems science (BS, MSU, ’80), I have been involved in understanding complex systems and forming models and solutions to explain and address this complexity in ways that are sustainable (i.e. not via spaghetti code that implements every complex feature!).
Some great examples of solutions that support complex behavior through simple, consistent, excellent designs are operating systems (which are able to run an infinite variety of applications), networking (able to transport infinite data payloads over an incredible variety of communication types, including data, voice, and video), and, perhaps most understandable to many, the spreadsheet (which, after all, is probably the most widely used IT tool in business, and is able to manage an infinite variety of calculations and structures for reporting).
So complexity of need is inevitable (people want to run all kinds of applications, send all kinds of data, execute all kinds of calculations), but designing solutions that address this complexity in simple, well-structured, sustainable ways is still possible. It is another example of Excellence by Design.
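The spreadsheet example is worth dwelling on: its entire design is one uniform idea, a cell that holds either a value or a formula over other cells, yet that single idea supports an unbounded variety of calculations. Here is a toy sketch of the principle (my own illustration, using Python callables in place of real formula syntax):

```python
# A miniature "spreadsheet": one simple, uniform design supporting an
# unbounded variety of calculations.
def evaluate(sheet, cell):
    """Resolve a cell: a plain number is a value; a callable is a
    formula that receives a resolver so it can reference other cells."""
    entry = sheet[cell]
    if callable(entry):
        return entry(lambda ref: evaluate(sheet, ref))
    return entry

sheet = {
    "A1": 3,
    "A2": 4,
    "A3": lambda get: get("A1") + get("A2"),   # like =A1+A2
    "A4": lambda get: get("A3") * get("A3"),   # like =A3*A3
}

total = evaluate(sheet, "A4")
```

The design never needs to know what calculations users will invent; it only defines how cells reference each other, which is precisely why such a simple structure absorbs so much complexity of need.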
Not too surprisingly, there are more examples of complex challenges solved via poor, complex designs than there are examples of elegant, excellent design. And the problem is growing. Creating a great design for a minimally complex world is not too hard; creating one for a highly complex world is much tougher.
McKinsey provides a nice summary of some of the considerations that tend to result in poor design and overly complex products. While not complete (call me if you want a full discussion 😉), it hits some good highlights, including: growth in the technology inside the product itself; poor architecture for the product; weak or myopic understanding of the business needs (creating a product for a fixed set of requirements is inflexible and shows not only poor architecture but a poor understanding of long-term business needs); poor collaboration/teaming among the parties who influence product design (marketing, engineering, manufacturing, etc.); and weak competency in the overall product design and development process.
In the Excellence by Design framework I use as the basis for this blog, I hit these points and a few others. Here are some highlights of how they help address complexity and guide an organization to Excellence by Design:
Chaos vs Control: The world is complex and not all requirements are the same. Deeply understanding, and in fact embracing, which aspects of a product must thrive in a chaotic environment, versus which aspects must ensure very disciplined control, is a key part of designing for complexity. The internet protocols are very controlled and precise in order to ensure interoperability, yet they are designed to enable a wildly chaotic set of data to be transported. Very few companies or teams I have worked with really try to differentiate requirements in this way.
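The ‘precise envelope, chaotic payload’ idea behind the internet protocols can be made concrete with a tiny sketch. This is my own illustration, not any actual protocol; the names and the header layout are invented for the example. The framing rules are rigid and fully checkable, while the payload they carry is left completely unconstrained:

```python
import struct

HEADER_FMT = "!BI"  # rigid envelope: 1-byte version, 4-byte big-endian payload length
HEADER_SIZE = struct.calcsize(HEADER_FMT)
VERSION = 1

def pack_message(payload: bytes) -> bytes:
    """Wrap an arbitrary payload in a strictly specified header (the 'control' side)."""
    return struct.pack(HEADER_FMT, VERSION, len(payload)) + payload

def unpack_message(frame: bytes) -> bytes:
    """Validate the controlled envelope, then hand back the chaotic payload untouched."""
    version, length = struct.unpack(HEADER_FMT, frame[:HEADER_SIZE])
    if version != VERSION:
        raise ValueError("unsupported version")
    payload = frame[HEADER_SIZE:]
    if len(payload) != length:
        raise ValueError("length mismatch")
    return payload

# The envelope is precise and interoperable; the payload can be anything at all.
assert unpack_message(pack_message(b"any data: text, voice, video")) == b"any data: text, voice, video"
```

The design choice is the point: all of the discipline lives in the envelope, none of it in the payload, which is exactly the chaos-vs-control separation described above.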
Systems as Strategy: Creating ‘systems of execution’ that operate reliably, yet support broad usage types, is very useful. As a simple example, it is surprising how many companies have financial processes that are still not systemized in any robust way, and still rely on an (often constantly changing) variety of custom spreadsheets, personnel, and submission timings for budgeting, forecasting, and final reporting of costs. The same is true for HR in most companies. I could go on, but the point is that you can address complexity in part through excellent design of the operational aspects of the organization.
Craftsmanship to Community: Enabling an organization to leverage both wise, competent experts and the broad community of participants inside and outside the organization can help address complexity, by making the subject more of a priority and by seeking the best ideas for how to design more holistically. Yet few organizations utilize this potential.
Architecture Advantage, Design for Change, Product as Platform: These three Excellence by Design principles are core to addressing complexity in product design. Combined, they can make a huge difference in how products are designed, and they result in better products (higher quality, greater customer satisfaction) that are more resilient to change (lowering costs and improving competitive advantage) and have a higher value proposition (a product that can easily be extended and/or combined with other capabilities generally has a much greater value in the marketplace).
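A rough sketch of the ‘Product as Platform’ / ‘Design for Change’ idea, in code. This is a minimal illustration of my own (all names are hypothetical, and real platforms involve far more): the core is stable and never edited when needs change; change is absorbed at the periphery through registered extensions, the same pattern the iPhone’s app ecosystem uses at a vastly larger scale.

```python
from typing import Callable, Dict

class Platform:
    """A stable core: new capabilities arrive without any change to this class."""

    def __init__(self) -> None:
        # registry of peripheral extensions, keyed by capability name
        self._extensions: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, handler: Callable[[str], str]) -> None:
        """Plug a new capability into the platform at runtime."""
        self._extensions[name] = handler

    def handle(self, name: str, request: str) -> str:
        """Dispatch a request to whichever extension claims it."""
        if name not in self._extensions:
            raise KeyError(f"no extension registered for {name!r}")
        return self._extensions[name](request)

platform = Platform()
# Each new need is met by adding an extension, not by rewriting the core.
platform.register("greet", lambda req: f"hello, {req}")
platform.register("shout", lambda req: req.upper())
```

A product built this way is ‘designed for change’ in the literal sense: its cost of adding the Nth capability stays flat, while a fixed-feature design pays for a core rewrite each time.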
Service Excellence: Not an obvious principle for reducing complexity, but an increasingly important one. As the world becomes more dynamic, changing, and complex, the ability of a product to promise ‘service excellence’ over time becomes more important AND a key differentiator from competitors. Again, though, this is a subject for which few organizations have developed a core strategy and strength.
Yes, in a world that is inevitably and increasingly complex, developing enhanced organizational capabilities that help manage complexity is a key success factor for business and IT organizations. Using the Excellence by Design principles is a start.
Reducing Complexity through The Architecture Advantage
One of the principles of Excellence by Design is called “The Architecture Advantage”. It promotes the idea that just as excellence is rooted in great design, great design is rooted in great architecture. This truth is apparent all around us, as we encounter the products and services of life. Those that seem to work well are usually well architected (if one cares to look deep). It is also true that most that work poorly are in some part, based on poor architecture (although there are a vast number of other reasons they may perform poorly).
Another perspective on the value and advantage of architecture comes from an unlikely source. The book ‘The Invisible Edge’ focuses on the value of intellectual property as a strategic tool. My experience has been that too often IP is seen simply as an ownership issue, where more is better. The book does a good job of explaining what makes good IP. The chapter labeled ‘Simplify’ provides the architectural perspective, in 40 pages of very insightful reading.
It starts by describing the danger of business complexity: “Complexity can kill a business. It saps energy. It increases transaction costs. It erodes focus. It distracts attention. Complexity, though, is the inevitable outcome of the kind of economic interdependency that characterizes our modern economy….businesses need to make deliberate choices to reduce it.”
The book then answers the question (and in a way, demonstrates ‘Excellence by Design’) by stating how: “Design strategies lie at the heart of meaningful simplification.” What is really illuminating is that the book respects the importance of good design (and in their focus, IP strategies related to that design) in achieving simplification. In fact they state “simplification strategies are rarely easy to pull off; in fact, executing a successful simplification strategy can be the hardest challenge of all.”
This advice is true for all aspects of a business including product complexity, process complexity, marketing complexity, human resources management complexity, supply chain complexity, etc. Again the authors are right on when they state “Important design choices can be made at every level of aggregation, from the smallest detail of a product’s architecture, to the design of the manufacturing floor, all the way to the design of the organization, and even to the design of the entire network of relationships in the business ecosystem”.
What makes this point so valuable, and so related to the ExD principle of ‘The Architecture Advantage’, is that the book pays homage to the role and importance of architecture as the key to good design and to the valuable IP that drives simplification and reduces complexity.
Easily said, but as a colleague of mine is fond of saying: architecting and designing well is not a job for amateurs. The book goes on to provide an excellent discussion of what architecture is and what characteristics are found in ‘good architecture’. It covers several examples and discusses the tension between architectural features that are more ‘closed/controlling’ versus ‘open/collaborative’.
A most eloquent quote in this chapter sums it up brilliantly. “Some of the most powerful and sophisticated strategies in modern business involve alignment of IP and design strategies behind a new architecture that breaks the compromise between complete control and overly complex collaboration…strategies like this simplify by rejecting complexity instead of redesigning it.”
Clearly there is an Architecture Advantage to Excellence by Design.