Tuesday, April 26, 2011

We've Been Busy

You may have noticed that we have not posted to our blog in a while. We’ve been very busy with other writing!

Forthcoming Book in 2012

Execution Dynamics: Align to the Customer Quantum for the Best Possible Business Performance

Let us know if you would like to review an advance copy. We are always appreciative of a few kind words.

Detailed Articles


Lean success, as enjoyed by Toyota Motor Co, has eluded most Western companies. Recent studies have documented and identified various causes for this abject failure. In this e-article, we analyze the situation and propose six attributes necessary to achieve success. We further describe the Profit Mapping methodology that incorporates the six attributes and provides a holistic and systematic way to rapidly achieve success with lean.


Predictive Operational Analytics is the convergence of business methods, systems thinking, and business intelligence software for rapid decision making to maximize performance and profit. In this article, we describe the individual elements of Predictive Operational Analytics and present its practical application using Profit Mapping methods and tools.

Adam Garfein

Thursday, November 05, 2009

Two Tiers of Continuous Improvement

Part I in a Series of Observations from Biology for Business

Running a business is akin to maintaining health and well-being. In our book "Profit Mapping" we devoted a lot of attention to lean thinking, systems theory, business goals, and integrated financials. Profit Mapping is a set of methods and tools for business analysis and optimization. Fundamentally, Profit Mapping is inspired by cell biology -- the cell being the basic unit of life. Because the book did not say much about the biology itself, we are writing a series of blog posts to illustrate parallels between business improvement and what we were taught in high school biology.

Continuous Improvement "Tiers"

Over many years and across varied organizations we have observed that many successful companies use a two-tiered approach to continuous improvement.

The first tier is akin to emergency-response firefighting. Here, managers and workers battle daily operational “fires.” Problems are immediate, often mission critical, and must be resolved quickly or the risks may threaten operations and/or the business. Effective managers are good at this, calling upon their wealth of real-world experience. In addition, low-hanging fruit for change is often targeted in this tier.

Actions taken in Tier I are tactical in nature. The objective is to quickly get operations back on track. Living to fight another day is an ongoing struggle.

The second tier tackles harder and more complex opportunities to maximize process and financial performance simultaneously. A big challenge for managers is figuring out how to optimally configure multiple product, process, resource, supply chain, and financial parameters all at once to achieve the best possible performance.

Managers frequently confide in us how difficult this is to do well. On the one hand, we are trained to reduce problems to narrower definitions and improve a few variables at a time. Reduce waste in any particular part of the system, we are taught, and the impact will percolate up to show real business value.

Yet, the reality is that everything is interconnected. Even the smallest operational or policy changes have ripple effects, which are dependent on both the specific and collective changes within the environment. In other words, a business is not the sum of its individual parts, but the collection of its interdependent operational systems.

Thus, managers must both constantly fight fires on demand (Tier I) as well as systematically drive complex change for the purpose of meeting multiple business objectives, including financial performance (Tier II). You can't do just one or the other! You have to do both effectively or the organization will get sick and may perish.

The Biology of Continuous Improvement

A little background on us helps explain our thinking. I am a gerontologist by training -- one who studies adult development and aging. Anil Menawat, who is the creator of Profit Mapping, comes from a cellular metabolism and systems theory background. How we ended up in business together is a story for another day. One might say our interests span from individual cells to entire human beings, with emphasis on facilitating growth and maintaining health and well-being.

You can begin to see the connection to business. Swap the above terms with those such as products, operations, factories, business units, supply chains, people, etc., and the "living" system comparison comes to life in a business context.


The first tier of continuous improvement is analogous to disease and the immune system. For instance, we are all exposed to viruses and bacteria on a daily basis. This is our risk exposure. Sometimes we develop infections; some are life-threatening. Some people may obsess over prevention. When disease strikes, we take different actions depending on its severity, our pre-existing conditions, and our current situation. The resulting symptoms are the "fires" that we all fight sooner or later.


The second tier of continuous improvement is analogous to growth and cell division. Growth relies on the production of new cells through cell division. This biological process is one of the most complex systems in the world, particularly when you extend this to an entire human being. Its parallel in the business world is also complex and dynamic. We've all heard the slogans concerning the "DNA" of business. The analogy seems to be that strategy is the genetic code guiding growth. This raises several business questions. Does the organization have the right strategy and roadmap? Is the cell division carried out properly? What are the risks?

The objectives for growth in both biology and business seem to be to "get it right" and sustain. Yet, the risks are great with plenty of opportunity for "error" in both systems, whether in cell division or implementing the right operational changes.

In the Tier I example of a virus or bacterial threat, our actions are tactical, such as taking a couple of Tylenols or getting a prescription for antibiotics. However, that does not change the larger context of cell growth and life (or death).

In Tier II, disturbances in cell division can also cause problems. Sometimes the implications are benign, but they can also have dire consequences. In addition, uncontrolled cell growth is cancer. In business this might be viewed as irresponsible management. Both of these biological and business situations are unsustainable without swift and appropriate action.

As we know from biology, there are more than two “tiers” to human existence and functioning. Likewise, organizations have many interconnected functions and tiers. We have simply focused on a couple to illustrate the dual approach to business improvement.

Adam Garfein

Future Blogs in this Biology-Inspired Series
   • Planning for Growth and Sustenance
   • Two Tiers of Toyota Production System
   • Systems of the Human Body and Business Management
   • Interdependence of Metabolism in Cells
   • Please share your ideas for others too


Friday, May 15, 2009

Why Progressive Companies Should Embrace Their Complexity

A corporation is like a sponge in the sea of commercial energy which surges around it. Taking in goods and services, it makes a unique conversion, and then sends out something of value to others. Like the biosphere, the output of one part becomes the input of the next.

The memory of this activity is represented by the countless records of every transaction, internal and external, created by the conversion process. To become fully in tune with what’s going on, the organization needs to somehow be mindful of all the data flowing through it, and find a way of metabolizing it. By doing so, the company mediates its functions and remains in harmony both internally with its own constituent parts, and externally with the wider commercial world.

The balancing process, like the sea, is fraught with danger, however. The chaotic energy of the world, like fluctuating market forces, pulls and pushes against the innate self-organizing ability of the enterprise, and if we can’t react quickly enough, our business becomes inflexible. This eventually results in points of starvation and blockage in the company’s throughput, and ultimately in a decline in the commercial health of the enterprise.

The information we need to stay nimble is all there in the transaction dynamics, but interpreting such a rich stream in time to take the right action is not an easy task, and without the correct analytical tools, most companies struggle. Viewed in isolation, the numbers are useless fragments, like individual pixels in a giant TV screen. But composed into a grand mosaic, they form a meaningful picture of the corporate metabolism, and once the movie is synchronized, decisions can be taken in the best interest of the company with very low risk of failure.

Learn to read the pulse of the organization and you will benefit from a new source of profit-generating energy and risk deflection.

James O’Sullivan
Ireland

Monday, March 23, 2009

Integration of Finance and Operations

Recently, an operations executive expressed his frustration about his management’s inability to steer the company in these hard economic times. He complained that limited integration between Finance and Operations hampers their decision making. The CFO runs business models in Excel spreadsheets, but the models are not tied to the operations systems, so the scenarios cannot be validated as operationally realistic. Nevertheless, they set the expectations for operations.

Without a “reality check” the decisions become surgical mandates. If the CFO demands a 10% drop in inventories, operations managers cannot evaluate the consequences for customer service. They don’t know whether they could afford even deeper cuts, or what the right level of inventory would be for the current business environment. Such single-dimensional approaches often result in unintended consequences elsewhere. More importantly, managers lose the real opportunity to make a number of smaller changes that together achieve strategic business goals.

On the flip side, the value of an operational improvement is not known until the end of the quarter when the financials accrue. The lack of integration hampers forward-looking projections of financial KPIs. The impact of a proposed initiative is unknown until it is implemented. Some initiatives succeed while others don’t. In reality the situation is even more troublesome: with many ongoing initiatives, it is difficult to isolate the successful ones.

On the surface it sounds like the typical silo mentality that exists in most companies. The CFO’s office builds its model and throws a mandate over the wall to operations. Operations managers do not know why they are being asked to act, and they have no opportunity to contribute to a solution that meets the real objectives. Because it is a mandate, they work busily to meet it without understanding the full situation.

Fortunately, the situation is usually not this critical. The operations managers do talk to the financial managers to understand the situation. This, however, is only a partial solution. Both sides speak from their respective expertise, but without real facts and data. Without integrated analytical capability, neither can substantiate their proposals with any real analysis of the expected “future” environment. The product mix, volumes, and a whole lot of other variables change frequently over time. If the goal is to reduce overall cost, there are probably many options that must be considered along with inventory reduction to make it work. A single-variable approach is dangerous and often results in moving costs around the strategic business unit, netting little to nothing on the bottom line. This is a far too frequent occurrence in continuous improvement programs.

There are many who believe that such problems can never be solved with any mathematical analysis. They argue that no matter how sophisticated the mathematics, it cannot capture the complexity of the real situation. In my humble opinion this is an uninformed position. Indeed, there is no single formula or piece of software that can solve the problem of integrating process and finance. But it can be achieved by using problem-solving methods and mathematical analysis tools in tandem. Rudimentary mathematical analyses such as OEE calculations or static capacity analyses are inadequate for the job.

Commonly used continuous improvement approaches, and even the mentality behind them, are not sufficient for the required restructuring. It requires holistic thinking, which is different from the “divide to conquer” approach of continuous improvement. It demands a systematic approach and the expertise to identify the correct set of parameters to change simultaneously. You have to create many such scenarios before finding the most feasible solution with minimum risk. Detailed dynamic analyses of processes, resource requirements, and financials with “appropriate” mathematical rigor provide the basis for comparison among scenarios. It’s an iterative process that results in a collaborative solution to meet the strategic business goals. That’s what the discipline of risk management is all about.

We have been studying this problem for over 20 years and have confronted it as senior managers and consultants in many different industries. Our consulting business is dedicated solely to this single issue of connecting operations and financials to strategic business goals. We have developed a set of methods and tools called Profit Mapping focused on rapid restructuring. We invite you to learn more about them at www.menawat.com.

Anil Menawat

Saturday, March 07, 2009

The Winds of Change

A business can be like a hanging mobile. Pull on any part of it, and the whole thing will go into a dance until some kind of equilibrium is restored. That's just what happens when you take one part of a production process and change it. We typically focus on that one part, but all the other parts begin to dance around it, sometimes with unexpected results.





Likewise, if the wind comes along, every part of the mobile starts changing all at once. It’s like that now for businesses everywhere, as shockwaves of the global banking crisis reverberate around the world.

Because we’re powerless over external factors, it’s now vital to optimize the profitability of the processes inside our businesses; and we can’t do that without looking deeply into their dynamic interactions. Tools like Value Stream Maps and measurements like Overall Equipment Effectiveness have served us well, as a compass might serve an explorer or a ship’s captain at sea. But now we’re navigating narrow streets and alleyways for survival itself, and we need something with the pinpoint accuracy of a GPS if all is not to be lost. In short, we need a way to put a financial value on the disposition of every workflow, asset, unit of labor, utility, raw material, and consumable for every conceivable situation…and to be able to quickly what-if the whole thing as each new situation presents itself. Impossible? Up until recently, yes.

Now, it is possible for a company of any size to embrace this level of complexity. You need Profit Mapping’s analytical engine, PAL, and a way to hook it up to your process. That connection is made with a Business Execution Profile, from which emerges your Process and Policy Map (a mechanism we’ll talk a little more about another time). For now the message is this: the technology is here to turn the current situation into an unprecedented opportunity. Thousands of companies are going to fail in the near future, but yours doesn’t have to be one of them. Instead you can position it not just to survive, but to absorb the huge amount of additional business that will become available when the shakeout is over. Business you couldn’t win in any other circumstances. But to do it will take a level of agility few companies currently possess. Profit Mapping can give you that edge. Your time has arrived; seize the day.

James O'Sullivan

Tuesday, December 09, 2008

Optimize the Value Stream: Closing the Loop Between Continuous Improvement, Execution and Business Objectives

A culture of continuous improvement is vital for success. Leading companies ingrain this mindset at all levels of the organization and provide the necessary resources and management support to sustain initiatives. However, many senior managers are beginning to ask for more. They want to know precisely how any change in the operational environment will impact the financials at the end of the reporting period.

Unless you can demonstrate the bottom-line impact of a proposed operational improvement on a financial statement, there is no real improvement! Managers are less willing to sign on for changes offering only the “promise of a better tomorrow.” They need something more substantial before they commit valuable time and resources to implementing the best ideas of your domain experts. In addition, they want to leapfrog continuous improvements and proceed directly to the “optimum” process and policy configuration to maximize performance.

Critical Management Questions in a Multi-Product and Multi-Workflow Environment

Managers convey to us several critical questions when it comes to reconfiguring processes and policies for superior performance.
  • How do you know that the proposed change will improve the business?
  • Can you guarantee that it won’t actually make things worse?
  • How much will it cost and what is the return on investment?
  • What is the financial performance of each product and workflow in our multi-product and multi-workflow environment?
Until now, it has not been easy to obtain clear answers to these questions. Savvy managers can see right through hollow answers: either show them the data or move on to something else. Fortunately, answering the last question also yields the answers to the first three. Financial insight at the level of products and workflows tells you your precise costs, and it is a starting point for improving performance throughout the value stream.

Optimize the Value Stream Flow

Optimizing the value stream flow creates the best possible operational and financial performance for any given business context. The following figure represents all of the operations in a Strategic Business Unit (SBU). The rows represent different products and the columns represent the different operations and activities involved in the production of a product or service. Collectively, the products and workflows represent all of the value streams for an SBU.
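As a minimal stand-in for the figure, and purely as an illustration with hypothetical product and operation names, the value streams of an SBU can be sketched as a grid in which each row is a product and each column an operation it flows through:

    # Hypothetical SBU grid: rows are products, entries are the operations
    # (columns) each product flows through. All names are illustrative only.
    sbu_value_streams = {
        "Product A": ["stamping", "welding", "painting", "assembly", "packaging"],
        "Product B": ["stamping", "machining", "assembly", "packaging"],
        "Product C": ["molding", "painting", "assembly", "packaging"],
    }

    for product, workflow in sbu_value_streams.items():
        print(f"{product}: {' -> '.join(workflow)}")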



Typically, companies take the best advice from their domain experts as they set out to resolve operational challenges. Investing in new technologies and reconfiguring a line is a common solution approach. Yet, senior managers often confide that neither they nor their experts know which one is the best new configuration to adopt out of several reasonable possibilities. They are also not sure whether the new equipment is worth the investment.

The problem is they have been burned in the past. Previous investments in automation may have modernized production processes, but the cost savings never materialized over the years. Moreover, managers often report that previous technology investments have actually reduced their flexibility to quickly respond to market changes.

Drive Effectiveness In and Costs Out of the SBU

We frequently see how technology issues such as the deployment of robots impact business performance. A robot, for instance, is installed to feed parts into three independent grinding or polishing stations. If either the robot or any of the stations fails, the whole operation, comprising the four pieces of equipment (1 robot, 3 stations), ceases production.

Unintended consequences of this configuration can include: (a) exacerbated failure time behavior because all three grinding operations come to a halt if any of the four operations fail, (b) reduced flexibility for handling changes in product mix due to constraints imposed by the configuration of the robot and three stations, and (c) increased indirect labor cost for the more expensive technical personnel who are on call to support the technology.
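To make point (a) concrete, here is a rough availability sketch, assuming independent failures and purely illustrative availability figures: because the cell stops if any of the four pieces of equipment stops, the individual availabilities multiply.

    # Series-system availability sketch (hypothetical numbers).
    robot_availability = 0.95
    station_availability = 0.97   # each of the three grinding/polishing stations

    # The cell produces only when the robot AND all three stations are up.
    cell_availability = robot_availability * station_availability ** 3
    print(f"Combined cell availability: {cell_availability:.1%}")   # about 86.7%

    # If the three stations were fed independently, a single station failure
    # would idle only that station, not the whole cell.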

The central management issue is not whether automation is helpful or not, but how well the proposed changes in automation plus all other changes in the SBU collectively roll up to produce measurable financial results. Local improvements may look quite reasonable when viewed in isolation in the SBU, but all such changes exist in the context of multiple products and workflows. The activities required to produce the given product mix coexist and dynamically interact with the other functions in the SBU.

As observed above, several factors conspire against achieving true improvement in financial performance. Finicky automation behavior results in a line being run slower than its rated speed. Breakdowns are more severe in terms of their collective reduction in throughput. The particular configuration of a value stream flow is tied to a specific product mix; it isn’t practical to run different product mixes through these operations, resulting in less agility to respond to market conditions. And the “visible” reduction in direct labor is offset by a “hidden” increase in expensive indirect technical personnel added to the SBU for support.

These types of management challenges are becoming more problematic for companies as they grapple with how to handle an increasingly complex and fluctuating product mix. Negative past experiences and a difficult economy conspire against committing to needed improvements.

Only when you can “see,” coordinate, and holistically balance improvements for the collective good within the entire SBU will you begin to achieve superior performance. The financial benefits flow when you optimize the interactive value streams in alignment to your business objectives for the entire SBU.

Adam Garfein

Note: The next posting will explore the central role Cost of Goods Manufactured and Unit Cost financial reports play in optimizing value stream flows. Hint: Optimization is easy when you quickly produce these reports at the level of products and workflows.

Thursday, July 03, 2008

What’s Right and Wrong with OEE?

Lately, Overall Equipment Effectiveness (OEE) has taken center stage as the measurement of choice among manufacturers. Most have adopted it as the measure of their performance. A significant segment uses it in “real time” to control the shop floor. It is common to see electronic boards hanging over manufacturing lines prominently displaying OEE along with other manufacturing measurements such as parts produced and quality.

A measure of manufacturing efficiency

OEE incorporates three critical aspects of manufacturing efficiency – productivity, equipment availability, and product quality. The following equation defines it:

OEE = Productivity x Availability x Quality
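As a minimal illustration, using one common way of computing the three factors from shift data and purely hypothetical numbers, the composite value rolls up as follows:

    # OEE illustration with hypothetical shift data.
    planned_time_min = 480            # planned production time for the shift
    downtime_min = 45                 # unplanned stops
    run_time_min = planned_time_min - downtime_min

    ideal_cycle_time_min = 0.5        # ideal minutes per part
    total_count = 800                 # parts produced
    good_count = 760                  # parts meeting the quality standard

    availability = run_time_min / planned_time_min                     # ~0.91
    productivity = ideal_cycle_time_min * total_count / run_time_min   # ~0.92
    quality = good_count / total_count                                 # 0.95

    oee = productivity * availability * quality
    print(f"OEE = {oee:.1%}")         # roughly 79%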

OEE condenses the complexity of a manufacturing operation into a single value. A decline in OEE is often viewed as lost profitability and is thus highly undesirable. Foremen on shop floors work diligently to meet or exceed the target.

The popularity of OEE stems from a strong belief that high productivity coupled with high quality and availability is a prescription for high manufacturing efficiency, and that a highly efficient process operating at or above the target OEE not only meets the customer requirement of on-time delivery of good-quality products but also does so at low cost. In other words, a high OEE represents little or no waste in the process.

OEE has served manufacturing well in growth markets. As demand increased, it became imperative to have high productivity and availability. Of course, meeting quality standards goes without saying; it is essential to staying in business these days. Manufacturers continued to improve OEE performance, and maintaining high OEE became an important strategy to ward off competition from low-cost producers. It is therefore not surprising that improving OEE has become a cottage industry within the manufacturing improvement consultancies. It is hard to find a manufacturing enterprise software vendor that does not have an OEE module in its offerings.

Changing operating environments

Many manufacturers are facing the wrath of increased oil prices and a declining US auto industry in the form of reduced demand for their products. Some have lost more than half of their volumes. They have limited choices: either run at full rates for shorter times, or run at slower speeds. Devising new operating policies is not simple, as many other policy factors confound the issue. In any scenario, OEE takes a tumble. High productivity is difficult to maintain in a declining market because of lower demand, unless assets are taken out of service.

The correlations between manufacturing cost and OEE, which apparently worked in the narrow range of near-capacity operations, are now failing. The absence of such simple relationships is leaving managers to wonder about their decision-making process. The apparent “linearity” of the correlation in that narrow range is no longer valid, as operations have moved far beyond the vicinity of the earlier operating conditions. The relationship between cost and OEE is, in many cases, exhibiting sudden changes like quantum shifts. Some managers are starting to ask how OEE is related to cost; some have started to wonder whether OEE is related to cost at all. They want to move beyond the simple correlative approach to understanding the functional relationship. Can they continue to depend on OEE as a good measure of their performance? They don’t know any more.

OEE as measurement in setting operating policy

Making correct decisions requires apposite measurements. This premise is elementary to every feedback decision-making process, regardless of the type of decision – whether to control a machine, a process, or the whole business.

Most manufacturing improvement consultants and practitioners strongly believe that maintaining high process efficiency is the way to control cost. In other words, can OEE, which is a measurement of process efficiency, act as a surrogate for business performance? Let’s answer this by comparing OEE as a measure of performance in two instances. The scenarios represent two completely distinct operating policies for the same manufacturing line. Both are equally viable options. The scenarios are designed to have identical OEE but differ vastly in operating conditions.

The table and the figure below summarize the two scenarios. The blue scenario has high productivity and lower quality, while the red scenario has high quality and lower productivity and availability. Which one is better to select? The answer, of course, depends upon your business objective, not process performance. If quality is of greater importance, you pick the red one; if throughput is most critical, you pick the blue one. This selection is made without regard to the cost to add value in either case. If OEE and cost were related by a direct and linear relationship, we would expect the cost structures of the two scenarios to be at least similar, if not identical.

Table 1: Two scenarios with identical OEE



Figure 1: Comparing two scenarios with same OEE
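As a stand-in for the table and chart, the following sketch constructs two hypothetical scenarios of the kind described: blue with higher productivity and lower quality, red with higher quality but lower productivity and availability, where the red quality level is chosen so that both composites come out identical. The numbers are illustrative only.

    # Two operating policies with identical composite OEE (illustrative numbers).
    def oee(s):
        return s["productivity"] * s["availability"] * s["quality"]

    blue = {"productivity": 0.95, "availability": 0.93, "quality": 0.82}

    # Red: lower productivity and availability; quality set so OEE matches blue.
    red = {"productivity": 0.85, "availability": 0.88}
    red["quality"] = oee(blue) / (red["productivity"] * red["availability"])

    print(f"blue OEE = {oee(blue):.3f}, red OEE = {oee(red):.3f}")
    print(f"red quality required: {red['quality']:.3f}")   # about 0.97
    # Identical OEE, yet the two policies imply very different inspection,
    # rework, and maintenance activity, and therefore different costs.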


Relationship of OEE to cost and business performance

Since both scenarios have identical OEE, do they both have the same cost structure as well? The answer is a most definite “NO.” In order to maintain high quality in the red scenario, we have to inspect more parts. This requires additional inspection stations and the associated resources. There is more rework as well as increased scrap in order to meet the higher quality standard. Increased rework takes time away from what would otherwise be first-time-quality production, resulting in lower productivity as the manufacturing line produces fewer good parts per hour. With the lower productivity and the increased rework activity, we would expect the cost to add value per part to be higher.

In addition, a more stringent preventive maintenance program may be necessary to reduce variation in machine performance. This would reduce machine availability for production. In the red scenario, compared with the blue one, increased quality does not increase the composite value of OEE because both productivity and availability decrease.

The two scenarios are distinct modes of operation on the same manufacturing line. Despite both reporting the same composite value of OEE, their process dynamics, resource requirements, and financial performance are vastly different. The single-value measurement of OEE does not reflect these differences. This example makes it clear that there is no one-to-one relationship between OEE, which is a measure of process performance, and financial or business performance.

In fact, one could question the claim of OEE, a composite value, as a suitable measurement for any purpose at all. Fortunately the situation is not that hopeless. OEE is a good indicator of process performance only when it is close to the 100% mark on all three dimensions and the deviations are not large. In such cases the triangle on the radar chart approaches its boundary. If any one of the dimensions moves farther away from the 100% mark, OEE loses its sensitivity and starts to break down.

Conclusion

OEE, whose simplicity lured many to adopt it as a primary measurement, is not a good measure of process performance if the line is operating far from full capacity. It is rarely a good indicator of business performance. OEE is a good indicator for the foreman on the shop floor only when deviations in all three dimensions are small. Using OEE for continuous improvement projects or to devise operating policies in the face of a constantly changing marketplace, however, is risky and warrants caution.

Anil Menawat

Thursday, April 10, 2008

Beyond Continuous Improvement: Continuous Optimization

Ensure Process Improvement Meets or Exceeds Business Expectations

Your organization is guided by lean principles, invests in people and lean practices, and deploys tools that help identify and reduce wasteful activities while increasing customer value. Your plant has smoothed out uneven production flows, converted to cellular operations, moved to smaller batches, transformed the culture, and so on. Your technical or business services group (e.g., call center, insurance processing, financial services, software development team) has incorporated similar lean principles to improve performance.

Despite doing all of the right things, management is disappointed that the return on investment in equipment, technology, and people has not produced the expected contributions to the bottom line – that the benefits did not warrant the investment. What happened?

Continuous Improvement is Important

The leading continuous improvement frameworks are grounded in sound principles and concepts, and most organizations have very competent people involved in the deployment and adoption of the particular best practices. Given this environment, why do so many efforts produce uneven results? That is, why isn’t everyone successful?

We believe that the fundamental stumbling block is not with the people or technology, but with the lack of tools that are specifically designed to inform the practitioner of the holistic impact of any proposed change on the organization. Change does not occur in isolation, and we should stop creating solutions based on a limited view of the potential interactive impact.

Divide to Conquer Doesn’t Add Up

In practice, we have been taught to deal with the complexity of operations and performance by breaking challenges into smaller “manageable” pieces. The accepted thinking is that by making piecemeal or “isolated” process improvements, it is perfectly reasonable to expect the activities to roll up and produce a total impact that is greater than the sum of the individual pieces. Upgrade to higher technology equipment, invest in IT improvements, reduce the workforce or do more with the same, etc. The assumption is that these or other changes in isolation will produce performance gains expected by management.

Unfortunately, while these and other actions may seem like the right things to do at the time, all of the individual changes have dynamic interdependent effects on performance – and the leading continuous improvement tools are not capable of shedding light on these effects. Thus, management is literally operating in the dark.

How do you know whether any individual or collective change will bring the organization one step closer to achieving the overall business objectives? The short answer is, you don’t. This is not a philosophical debate, but a stark reality. We rely on our intuition and expertise and hope that we are making the right decision. The risks, however, are too high today. We need new tools that provide the necessary insight, complement our own expertise, and give us greater confidence that we are making the right decisions.

Continuous Improvement Benefits from a Holistic Balance

A plant manager recently shared with us that his team had at least three different and good ideas for how to reduce costs while meeting quality requirements and fluctuating demand. That is the good news. His company is in a fiercely cost competitive global business and this particular plant must quickly improve its financial performance or risk losing business. The company did not want to invest any more in the facility and expected to see radical improvement within the current asset base and capabilities.

The plant manager matter-of-factly stated that because there were “no good tools” to evaluate the “business impact” of each operational change option, he was going to implement his own ideas. It wasn’t that his thoughts were necessarily any better than the others. He admitted that he had “no idea” which improvement option was in fact the best one. His team was very competent and also had access to expert consultants.

However, since he had “to live with the final decision” he “might as well” follow his own recommendations! Thus, in the absence of an integrated method for holistically quantifying the process and business impact of his team’s ideas, his confidence in selecting the “right” option was vastly reduced. Does this situation sound familiar?

Continuous Optimization is “Guided” Continuous Improvement

Continuous optimization involves a slightly different yet wholly complementary way of thinking. It is a method of making operations as fully effective or functional as possible. The definition of “effectiveness” is not arbitrary, but comes directly from the organization’s business objectives.

Whereas continuous improvement asks, “how can it be made better,” continuous optimization asks “how do we make it most effective within its larger operational and business context.” The senior management team frankly does not care if you make the process better. They are concerned with identifying the most effective path for transforming operations to better serve customers while achieving the business objectives. Thus, you must do it all.

The central difference between improvement and optimization thinking, therefore, lies in the explicit focus on business objectives. One can be “successful” at continuous improvement by meeting the definition of success provided for the particular continuous improvement approach (level flow, cellular design, defect reduction, etc.). This is why gaps are often observed when results are rolled up through successive management levels, where process improvements do not always produce the expected benefits as defined from a financial or customer perspective.

With this definition, “effective” is always a moving target as objectives, customer needs, and internal capabilities and limitations are always changing. Thus, continuous optimization is a “guided” implementation of continuous improvement, ensuring that every change brings the organization at least one step closer to achieving its customer and business objectives. Conversely, changes that cannot achieve both process and business goals are bypassed, freeing up valuable resources to apply elsewhere.

Continuous Optimization and Continuous Improvement: A Partnership for Success

Continuous improvement and optimization should be treated as a partnership that has the best interests of the organization in mind. Business optimization requires sound strategies for improvement and improvement can be wasteful if it does not ultimately improve the business.

Partnership for Success: Attributes of Both Approaches

Continuous Improvement
- Seeks to “make it better”
- Process improvement focus
- Assumes local optimization rolls up
- Implicitly linked to business objectives
- Static approach
- "Divide to conquer" philosophy

Continuous Optimization
- Seeks to optimize the business
- Holistic thinking for the larger benefit
- Process and financial integration focus
- Explicitly linked to business objectives
- Dynamic approach
- "Unite to win" philosophy

When systematically deployed in partnership, these two approaches ensure that every activity – ranging from changes on the shop or services “floor” to upper management activities – moves the organization at least one step forward in the optimal business direction. This partnership for success thus requires that (a) all process changes contribute in some meaningful way to optimizing the business and (b) business optimization be treated as an ongoing endeavor, because operational constraints and capabilities as well as the business environment are never static. This is the domain of Profit Mapping.

Adam Garfein

Tuesday, March 18, 2008

It’s All About the Right Business Balance: Use Lean to Improve the Business, not as Carte Blanche for Workforce Reduction

As I reflect upon my “Lean” experiences with various German manufacturers it strikes me that I am often left wondering, “What is the real issue here?”

We want to sell a product at the best possible market price and at a high profit, and the only way to do that is to have a company in which all parties involved in production, directly or indirectly, dovetail their work and think and cooperate with one another.

If we look at the beginning of modern automation in Germany, the main reason for it was to let machines take over simple and repetitive work in order to free people for more demanding tasks, given the limited resources available after the war. At that time companies invested both in automation and in up-skilling their employees, which struck the right balance between machines and workers and created the success of the 50s, 60s, and 70s.

Later on, due to greater competition in the market and changes in the ownership of companies, the situation changed. The current management thinking seems to be that to save money, companies must invest in even more machines and fire more skilled (expensive) people. Unfortunately this has reached the point of diminishing returns between automation and workforce, and in the process many companies have lost their know-how and competence through the loss of highly skilled and devoted employees.

Due to the pressure from the stock market, many managers have tried to perform miracles to show a good financial result, by relocating parts of their production into other countries, etc. Or they sell parts of the company to satisfy the shareholders. Shareholders want to maximize their profit and management faces high pressure to achieve profit targets. These actions may work for a short time but in the end the risk is too high and sometimes kills the company.

For me, it seems that the “automation-driven” mentality has passed its zenith. Yet hiring or re-hiring more skilled workers isn’t necessarily the answer either. Unfortunately, the solution is not as simple as “it is better to have more of x and less of y.” So, what are we to do?

The answer is easier to identify if one asks a different question. Rather than asking “Which is better?” we should be asking “What is the right balance?” In this case, the question is, “What is the right balance between automation and people?” This leads us down a different pathway for creating the right solution.

We often conveniently “forget” or ignore the fact that automation has its own issues. An expensive piece of equipment typically requires very expensive support personnel. From an accounting point of view, these support personnel are part of the indirect budget. Yet their costs are very real, and the company doesn’t see any improvement at the end of the day when all such “indirect” costs are tallied.

Lean tells us to use automation to reduce cycle time variation to level out production. This is often coupled with redesigning into work cells. But these same principles can lead to other challenges. For example, I have seen many instances where managers question if operations are “over-automated.” A machine breaks down or demand changes suddenly. The automation does not provide enough flexibility (or is too expensive) to adjust on the spot to support the required configuration. In some instances, having lower levels of automation and an appropriate level of operators may actually improve flexibility and lower overall costs at the same time.

In my opinion, the only solution that makes sense is the one that balances the total costs of the machines, labor and salaried personnel, etc. while satisfying customer demand – as well as the needs of the internal and external stakeholders. In the absence of finding the right balance between the process and financial implications relative to serving our customers, we are all just fooling ourselves that any changes we make in the short term will guarantee positive longer-term and sustainable impact.

Michaela Gerweler
Germany

Monday, January 28, 2008

Financial Performance in the Life Sciences: Costs Are Moving from an Afterthought to the Forefront

Changing market conditions, increased competition from generics, difficulty bringing new products to market, high regulatory costs, etc. are putting intense pressure on life sciences companies. The recent headline in the Wall Street Journal, “Drug Maker Wyeth May Cut 10% off Work Force Over Three Years,” (1/25/08) is just the latest in a string of news reflecting the economic realities facing this industry.

The article mentioned Effexor, one of Wyeth’s leading drugs, as facing generic competition. In a coincidence of geography, Effexor is manufactured in Ireland, which was the setting the day before for a talk given by James O’Sullivan, Managing Director of Menawat & Co. Ireland, at a MESA International working group meeting hosted by Bausch & Lomb in Waterford. James’ research focused on the opportunities for MES to contribute to financial and other performance in the life sciences (see presentation).

Traditionally, reasons to implement MES include: optimizing production efficiencies for increased profitability, increasing asset utilization by reducing downtime and unscheduled maintenance, improving product quality, and reducing waste. Yet, in the life sciences, there are more compelling reasons to implement MES.

Historically, companies have approached enterprise IT solutions, whether MES, PLM, etc. with a tactical mindset. However, in today’s intensely competitive global environment, the expectations of the returns on such investments are changing dramatically. Tactical improvements are often no longer enough to justify such investments; they must also help the company achieve its strategic priorities. Choosing the right tactical path is still imperative, but it is critical to frame the opportunity in the larger strategic context to be successful from a business perspective – and to receive the project funding.

The most striking revelation of James’ research is the true cost reduction opportunity of MES solutions in the life sciences. If you look at the “Cost to Add Value” (direct costs minus materials) in industrial manufacturing, it represents about 44% of the business. This presents a large field of opportunity to optimize production, quality, costs, and so on. Even a small change can have significant improvement implications due to the size of the improvement pool.

In contrast, the Cost to Add Value in the life sciences shrinks to approximately 10% of the business. Other cost drivers, such as increased R&D and regulatory requirements expand dramatically compared to other industries. In other words, once you design the process, make the capital investments, and obtain certification, the research shows that 90% of the costs are outside of manufacturing. Thus, the promise of a “big” cost improvement may not percolate up given the small opportunity relative to other cost drivers in the organization.
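A back-of-the-envelope calculation shows why. Taking the shares cited above and an assumed, purely illustrative 10% reduction in the cost to add value, the same operational gain moves total cost by very different amounts in the two settings:

    # Illustrative effect of the same operational improvement on total cost.
    cost_to_add_value_share = {"industrial manufacturing": 0.44,
                               "life sciences": 0.10}
    improvement = 0.10   # hypothetical 10% cut in the cost to add value

    for setting, share in cost_to_add_value_share.items():
        print(f"{setting}: {share * improvement:.1%} reduction in total cost")
    # industrial manufacturing: 4.4%   vs   life sciences: 1.0%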

Implications for Life Science Companies: Optimize the Business not Operations

Forward thinking companies are elevating MES, PLM, and other related investments into strategic advantage. Compliance and product development are prime examples. MES and PLM represent significant investments, and their potential value expands vastly when couched in terms of strategic priorities. For example, reducing service and warranty costs, improving traceability, increasing speed to market, and so on are “business” issues of central importance to corporate management, which transcend plant level operational issues.

This kind of integrated business thinking (i.e., strategic + tactical = success) can be a challenge for operational personnel who have not traditionally been pressured to think beyond their local domain. These are not simply efficiency tools that reduce process, supply chain, and product development costs. They support regulatory and strategic priorities. It appears that MES and PLM vendors are catching on to this new message and are beginning to talk about how they serve the business goals, not simply how their tools help plug local tactical gaps.

What are some of the “takeaways” for operational managers? Think in terms of planning and aligning investments to strategic value. Right size deployment and support efforts and costs to where they will have the greatest business impact and avoid investments that will not provide sustainable business improvement. To do so, one must validate the business value before implementation. Most importantly, maintain a clear focus on the business objectives before, during, and after installation. This will help facilitate a change in thinking from process improvement to optimizing the business. Profit Mapping provides a structured and holistic approach for accomplishing all of the above.

MESA International’s rapidly expanding membership is perhaps a reflection of this new integrated thinking. For instance, what does the term MES mean to you – Manufacturing Execution System? MESA is short for Manufacturing Enterprise Solutions Association. If we take their lead and substitute the word Enterprise for Execution, Manufacturing Enterprise System (MES) takes on a completely new strategic meaning – one that companies can choose to ignore at their own peril.

Look for an expanded business optimization talk for the life sciences by James in the April timeframe in Ireland. It promises to generate some lively discussion.

It seems that the more things change from industry to industry, the more the fundamental competitive issues remain the same. At least, that is what the life sciences industry is painfully discovering about their costs and business improvement opportunities.

Adam Garfein