Business Research
We must also collect that information (data) in a fair and systematic way. For example, we should think about who we ask for information, and how they will understand our questions. If we cannot ask everyone involved, then we must be able to justify why we ask only a certain section of that population.
Global Business Environment
Today, the world market is dominated largely by well-established global brands. Over the last three decades there has been a steady trend of global market convergence: the tendency of indigenous markets to converge on a similar set of products and services across the world. The end result of this convergence is that companies that have succeeded with their products or services now have the whole world to embrace, for marketing as well as sourcing.
The rationale for global market convergence lies partly in the irreversible growth of global mass media, including the Internet, television, radio, newspapers and film, through which our planet has become a truly small global village. Everybody knows what everybody else is doing, and everyone wants the same thing if it is perceived to be good. It also lies in the rise of emerging economic powers, led by the BRICs (Brazil, Russia, India and China), which has significantly improved the living standards and purchasing power of millions, if not billions, of people.
For organizations and their supply chains, the logic of going global is also clearly recognizable from an economic perspective. They are seeking growth opportunities by expanding their markets to wherever there is greater potential for profit, and by sourcing from wherever resources are cheaper in order to reduce overall supply chain costs. Inter-organizational collaboration at the technological frontier, and the pursuit of market presence in predominantly non-homogeneous markets, can also be strong drivers behind the scenes.
From a more theoretical perspective, the trend of globalization can also be understood through Adam Smith’s principle of the “division of labour”. A global supply chain is destined to be stronger than a local one because it takes advantage of the international division of labour. Specialization and cooperation on a global scale yield a higher level of economy than any local supply chain can achieve. Thus the growth of global supply chains tends to increase the need for coordination between the specialized activities along the chain on a global scale.
As the newly appointed Harvard Business School dean, Professor Nitin Nohria, put it: “If the 20th century was America’s century, then the 21st century is definitely going to be the global century.” The shift of economic and political power around the world is all too visible and has become much more dynamic and complex. One thing, however, is certain: there will be significantly and increasingly more participation by diverse industries from all around the world in the global supply chain network, bringing with it the influence of many emerging economies. Their roles in the globally stretched network of multinational supply chains are going to be pivotal and will lead towards a profoundly changed competitive landscape.
Definition of Six Sigma
Before we study the subject of Six Sigma in any depth, we need to define the term. Perhaps unusually, Six Sigma has three distinct elements to its definition:
• A Measure: A statistical definition of how far a process deviates from perfection.
• A Target: 3.4 defects per million opportunities (the sketch below shows how this figure is conventionally derived).
• A Philosophy: A long-term business strategy focused on the reduction of cost through the reduction of variability in products and processes.
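To make the measure and the target concrete: the sigma level of a process is conventionally converted to defects per million opportunities (DPMO) using the normal distribution, allowing for the customary 1.5-sigma long-term shift. The following minimal sketch (an illustration under those standard assumptions, not taken from the sources cited here) shows that a six-sigma process corresponds to roughly 3.4 DPMO:

```python
from scipy.stats import norm

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities for a given sigma level,
    allowing for the conventional 1.5-sigma long-term process shift."""
    # Probability mass beyond the (shifted) specification limit
    defect_rate = norm.sf(sigma_level - shift)
    return defect_rate * 1_000_000

for level in (3, 4, 5, 6):
    print(f"{level} sigma -> {dpmo(level):10,.1f} DPMO")
# 3 sigma -> ~66,807 DPMO; 6 sigma -> ~3.4 DPMO
```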
Accordingly, Six Sigma is defined in a variety of ways by different authors, but for the purposes of these notes the definition from Pande et al (2000), which focuses on the more comprehensive philosophy of Six Sigma, will be used:
“A comprehensive and flexible system for achieving, sustaining and maximizing business success. Six Sigma is uniquely driven by close understanding of customer needs, disciplined use of facts, data, and statistical analysis, and diligent attention to managing, improving, and reinventing business processes.”
A strong structure and clear alignment to organizational goals (particularly financial) are a key part of the Six Sigma approach as defined by Eckes (2001). Leadership is provided by a team of Champions (Senior Champion, Deployment Champion and Project Champion at corporate, unit and department levels respectively), supported by a team of experts. The experts are referred to as Black Belts (who work full time on projects at process level to solve critical problems and achieve bottom-line results) and Master Black Belts (who provide mentoring, training and expert support to the Black Belts). Ingle and Roe (2001) note that this significant organizational structure can range from 4000 Black Belts in a corporate population of 340,000 in GE to 120 Black Belts in a corporate population of 100,000 in Motorola. Black Belt training is typically 16–20 weeks in GE and a year in Motorola (Ingle and Roe, 2001), although both are interspersed with projects that bring value to the organization.
Normative Decision Model
When beginning a home repair project, it is helpful to have all necessary tools close at hand. It is often advisable to even have extra tools within reach should the project be more complicated than originally thought. The larger the variety of tools in a handyman’s toolbox, the more likely he will be to fix the problems that he encounters.
In a way, a manager is like an organizational handyman. Managers identify and solve many types of problems (e.g., personnel, planning, scheduling, budgeting, technology, operations, facilities, policies, resources, etc.) with the best interests of their organizations in mind. Some problems are straightforward and predictable, and others are more complicated. A good manager, like a good handyman, is able to quickly determine the types of tools that he needs to fix the problems that he encounters.
Sometimes the tools that are needed to solve organizational problems are co-workers and the knowledge, insight, and creativity that they possess. People use the knowledge that they gain from past experiences to define and remedy the problems that they encounter. Knowledge can be gained from direct personal experiences or from the experiences of others. Groups are able to outperform individuals on mental tasks in large part because of the diversity of experiences that members bring to their groups. When the experiences of group members are used to solve problems instead of just those of a single manager, better solutions usually arise. The benefits of group problem solving, however, come with costs—primarily, the time spent by group members away from their normal work responsibilities.
Not all of the decisions that managers make require the help of co-workers; some can be made with little or no input from workers. Effective managers know when to solicit input from others and when to solve problems by themselves. The Normative Decision Model, developed by Victor Vroom and his associates, explains the appropriate level of worker involvement in the decision-making process. Decision acceptance and decision quality drive the model. When it is important that workers buy into and accept the decision, they should be included in the decision-making process. Likewise, when it is important that exceptional, high-quality solutions be developed, more people should be included in the process. Time should also be considered when selecting the appropriate degree of worker involvement: as the time available to make decisions decreases, more autocratic decision styles become appropriate.
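The logic of the model can be sketched as a simple decision rule. The style names and conditions below are a hypothetical simplification of Vroom's model, intended only to illustrate how acceptance, quality and time pressure jointly drive the choice of decision style:

```python
def decision_style(acceptance_important: bool,
                   quality_important: bool,
                   time_pressure_high: bool) -> str:
    """A hypothetical, simplified reading of the Normative Decision Model:
    more worker involvement as acceptance and quality matter more, and
    more autocratic styles as the time available shrinks."""
    if time_pressure_high:
        # Less time available pushes the style toward the autocratic end
        return "Autocratic: decide alone with the information at hand"
    if acceptance_important and quality_important:
        return "Group: share the problem and decide together"
    if acceptance_important or quality_important:
        return "Consultative: gather input from workers, then decide"
    return "Autocratic: decide alone with the information at hand"

# Example: buy-in and quality both matter, and time is not pressing
print(decision_style(acceptance_important=True,
                     quality_important=True,
                     time_pressure_high=False))  # Group style
```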
Strategy and Information Systems
Traditionally, business organizations are divided into three levels: operational, management and strategic. These levels exist in nearly all businesses irrespective of their size or sector of operations, although in small companies some levels may converge.
At the operational level, decisions are made to ensure the smooth running of operational processes, or day-to-day business. At this level it is necessary to ensure that resources are used efficiently, inventory is up to date, production levels are as planned, and so on. Decision making at this level requires information that is almost entirely internal to the company, although it may be extremely detailed and real-time.
Information for decision making at the management level has a typical time-frame ranging from weeks to several months or a year. Middle management usually controls medium-term scheduling, forecasting and budgeting operations. These rely on internal as well as occasional external information. For instance, setting the quarterly budget requires knowledge of current expenditure as well as external pricing information.
Senior management will focus on general, or strategic, issues related to overall business development in the long term. At this level decisions tend to relate to long-term issues such as restructuring, major financial investments and other strategic undertakings concerned with the company’s future rather than its present. The information necessary for decision making at this level is gathered not only from the company’s internal sources but also from external ones, such as data on the economic situation or on the sector as a whole.
Businesses that rely heavily on information develop an information strategy to establish how to manage information for business advantage and to comply with government regulations. An information strategy is a planning document usually created at the strategic level by the Chief Information Officer (CIO), possibly together with a Chief Technology Officer (CTO) and the IT manager.
An information strategy is developed to support the overall business strategy of an organization and explains how information should be captured, processed, used and disposed of throughout its life cycle. Although the structure of an information strategy varies from business to business, there are some common areas included in most information strategies.
Hoshin Kanri Planning Principles
• Five-year vision: This should include a draft plan by the president and executive group, normally an improvement plan based on internal and external obstacles, followed by revision based on input from all managers on the draft plan. This enables top management to develop a revised vision that they know will produce the desired action.
• The one-year plan: This involves the selection of activities based on feasibility and the likelihood of achieving desired results. Ideas are generated from the five-year vision, the environment and last year’s performance. The tentative plans are rated against a selection of criteria and a decision is made on the best action plans.
• Deployment to departments: This involves the identification of key implementation items and a consideration of how departments can systematically accomplish the plan. The individual plans developed are evaluated using the criteria that were used for the one-year plans.
• Detailed implementation: This is the implementation of the deployment plans. The major focus is on contingency planning. The steps to accomplish the tasks are identified and arranged in order. Things that could go wrong at each stage are listed and appropriate countermeasures selected. The aim here is to achieve a level of self-diagnosis, self-correction and visual presentation of action.
• Monthly diagnosis: This is the analysis of things that helped or hindered progress and the activities to benefit from this learning. It focuses attention on the process rather than the target and the root cause rather than the symptoms. Management problems are identified and corrective actions are systematically developed and implemented.
Six Sigma
There are those who will tell you that Six Sigma is radical and new. The fact is that Six Sigma (done properly) is a recognisable evolution of TQM. De Mast (2006) sees it as an on-going phase in the evolution of methods and approaches for quality and efficiency improvement. Six Sigma can be seen as the accumulation of principles and practices developed in management statistics and quality engineering, all of which matured significantly over the course of the Twentieth Century.
The Six Sigma approach was first developed in the late 1980s within a mass manufacturing environment at Motorola (Harry, 1998), as the company struggled to meet demanding quality targets on complex manufactured products; it became widely known when GE adopted it in the mid-90s (Folaron and Morgan, 2003; Thawani, 2004), when, arguably, it evolved from being a process improvement methodology to a broader, companywide philosophy. Both companies still consider Six Sigma the basis for their ongoing strategic improvement approach. Since the 1980s Six Sigma has become one of the most popular improvement initiatives, widely implemented around the world in a wide range of sectors (by companies such as Boeing, DuPont, Toshiba, Seagate, Allied Signal, Kodak, Honeywell, Texas Instruments, Sony, Bombardier and Lockheed Martin) that have all declared considerable financial savings (Harry, 1998; Antony and Banuelas, 2001; Kwak and Anbari, 2006).
Other benefits claimed for Six Sigma include increased stock price, improved process and product quality, shorter cycle times, improved design and increased customer satisfaction (Lee, 2002; McAdam et al, 2005). Six Sigma has undergone considerable evolution since its early manifestations (Folaron and Morgan, 2003; Abramowich, 2005). Initially it was a quality measurement approach based on statistical principles. It then developed into a disciplined process improvement technique, based on reducing variation within the system with the help of a number of statistical tools. For example, Snee (1999) defined Six Sigma as an ‘approach that seeks to find and eliminate causes of mistakes or defects in business processes by focusing on outputs that are of critical importance to customers’. Harry and Schroeder (1999) similarly defined Six Sigma as ‘a disciplined method of using extremely rigorous data gathering and statistical analysis to pinpoint sources of errors and ways of eliminating them’. In its current incarnation it is commonly presented as a ‘breakthrough strategy’ and even a holistic quality philosophy (Pande, 2002; Eckes, 2001). It is now generally accepted that Six Sigma is applicable to various environments, such as services, transactions or the software industry, regardless of the size of the business (Pande, 2002; Lee, 2002), and that, suitably adapted, it may lead to nearly perfect products and services. Moreover, Six Sigma is widening its areas of application very rapidly, and there are examples of applying it to predicting the probability of company bankruptcy (Neagu and Hoerl, 2005) or to finding opportunities for growth (Abramowich, 2005).

In the past five years, hundreds of organizations have indicated their interest in making Six Sigma their management philosophy of choice. While many of the businesses attempting to implement Six Sigma are well intentioned and want to implement it properly, just as General Electric did, there are also impatient executives who now look on Six Sigma in the same way as they look on downsizing. This quick-fix approach to Six Sigma is a sure path to the same short-term results that prevent long-term profitability. It is worth noting that the evolution of Six Sigma is continuing with, for example, the integration of Lean principles and the development of a product/service design variant (Design for Six Sigma), amongst others (De Mast, 2006).
The Role of Mean-Variance Efficiency
We began the chapter with an idealized picture of investors (including management) who are rational and risk-averse and who formally analyse one course of action in relation to another. What concerns them is not only profitability but also the likelihood of it arising: a risk-return trade-off with which they feel comfortable and which may also be unique to them.
Thus, in a sophisticated mixed market economy where ownership is divorced from control, it follows that the objective of strategic financial management should be to implement optimum investment-financing decisions using risk-adjusted wealth-maximizing criteria, which satisfy a multiplicity of shareholders (who may already hold a diverse portfolio of investments) by placing them all in an equal, optimum financial position.
No easy task!
But remember, we have not only assumed that investors are rational but that capital markets are also reasonably efficient at processing information. And this greatly simplifies matters for management. Because today’s price is independent of yesterday’s price, efficient markets have no memory and individual security price movements are random. Moreover, investors who comprise the market are so large in number that no one individual has a comparative advantage. In the short run, “you win some, you lose some” but long term, investment is a fair game for all, what is termed a “martingale”. As a consequence, management can now afford to take a linear view of investor behavior (as new information replaces old information) and model its own plans accordingly.
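To make the “fair game” idea concrete, the following minimal sketch (an illustration only, not a model drawn from the text) simulates a price series whose daily changes are independent with zero expected value, so that the best forecast of tomorrow’s price is simply today’s price:

```python
import random

random.seed(42)

# Simulate a "fair game" price series: each day's change is an independent,
# zero-mean shock, so the series has "no memory" of past movements.
price = 100.0
prices = [price]
for _ in range(250):                 # roughly one trading year
    price += random.gauss(0.0, 1.0)  # today's change ignores yesterday's
    prices.append(price)

# Short-run wins and losses cancel out: the average daily change is near
# zero, and no participant has a systematic edge over the market.
changes = [b - a for a, b in zip(prices, prices[1:])]
print(f"mean daily change: {sum(changes) / len(changes):+.4f}")
```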
Like Fisher’s Separation Theorem, the concept of linearity offers management a lifeline, because in efficient capital markets rational investors (including management) can now assess anticipated investment returns (r_i) by reference to their probability of occurrence (p_i) using classical statistical theory. What rational market participants require from companies is a diversified investment portfolio that delivers a maximum return at minimum risk.
What management need to satisfy this objective are investment-financing strategies that maximize corporate wealth, validated by simple linear models that statistically quantify the market’s risk-return trade-off.
If the returns from investments are assumed to be random, it follows that their expected return (R) is the expected monetary value (EMV) of a symmetrical, normal distribution (the familiar “bell shaped curve” sketched overleaf). Risk is defined as the variance (or dispersion) of individual returns: the greater the variability, the greater the risk.
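As a worked illustration of this definition, the expected return and the variance can be computed directly from a discrete probability distribution of returns. The figures below are hypothetical, chosen only to show the arithmetic of E(R) = sum of p_i * r_i and VAR(R) = sum of p_i * (r_i - E(R))^2:

```python
# Hypothetical discrete distribution of annual returns (r_i) with their
# probabilities of occurrence (p_i); the probabilities must sum to one.
returns = [-0.10, 0.00, 0.10, 0.20]   # possible returns: -10% to +20%
probs   = [ 0.10, 0.20, 0.40, 0.30]

assert abs(sum(probs) - 1.0) < 1e-9

# Expected return: the probability-weighted mean, E(R) = sum(p_i * r_i)
expected_r = sum(p * r for p, r in zip(probs, returns))

# Risk: the variance of returns around E(R), sum(p_i * (r_i - E(R))**2)
variance = sum(p * (r - expected_r) ** 2 for p, r in zip(probs, returns))

print(f"E(R) = {expected_r:.3f}")        # 0.090, i.e. an expected 9% return
print(f"VAR  = {variance:.4f}")          # 0.0089
print(f"SD   = {variance ** 0.5:.4f}")   # 0.0943
```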
The Incremental IRR
Despite their apparent wealth maximization defects, IRR project rankings that conflict with NPV can be brought into line by a supplementary IRR procedure whereby management:

Determine the incremental yield (IRR) from an incremental investment, which measures marginal profitability by subtracting one project’s cash inflows and outflows from those of another to create a sub-project (sometimes termed a ghost or shadow project).
To prove the point, let us apply the incremental approach to the data from Section 3.1: two projects that differ not only in their cash flow patterns (size and timing) but also in their investment cost.
Project    Year 0  Year 1  Year 2  Year 3  Year 4  Year 5  IRR (%)  NPV (10%)
1 less 2    (35)    (30)     -       20      40      50      15       11.1
You will recall that IRR maximization favored a higher percentage return on the smaller, more liquid investment (Project 1), whereas NPV maximization focused on higher money profits overall (Project 2). Now see how the incremental IRR (15%) on the incremental investment (Project 1 minus Project 2 = £35k) exceeds the discount rate (10%), so Project 1 is accepted. Moreover, this corresponds to Equation (1) on single project acceptance. The incremental NPV is positive (£11.1k) because its discount rate r < incremental IRR.
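The arithmetic can be checked with a short script. The following sketch (illustrative only; the helper functions are not from the text) uses the incremental cash flows from the table above, confirms the incremental NPV at the 10% discount rate, and finds the incremental IRR by bisection:

```python
# Incremental cash flows (Project 1 less Project 2), in £k:
# outflows in Years 0-1, nothing in Year 2, inflows in Years 3-5.
cash_flows = [-35, -30, 0, 20, 40, 50]

def npv(rate: float, flows: list[float]) -> float:
    """Net present value of a series of year-end cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows: list[float], lo: float = 0.0, hi: float = 1.0) -> float:
    """Internal rate of return, found by bisection on the NPV function."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid   # NPV still positive, so the IRR lies above mid
        else:
            hi = mid
    return (lo + hi) / 2

print(f"Incremental NPV at 10%: {npv(0.10, cash_flows):.1f}")  # ~11.1
print(f"Incremental IRR:        {irr(cash_flows):.1%}")        # ~15%
```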