Baldrige Criteria

Baldrige Criteria Acceptance Problems Grow

Can it possibly get any worse than this?

More than 20 million American businesses are actively seeking ways to improve and become more successful. The Baldrige Award was established under President Reagan as a world-class business excellence award to improve America’s competitiveness. However, the Baldrige Award Program data in the graphic below show zero business participation after strong initial participation.

 Baldrige Award Failure

 List of Baldrige Criteria Flaws

Criteria flaws and impractical complexity have long been cited as major reasons for the sharp, prolonged decline in business participation.

Now a new and potentially stronger reason for Criteria rejection has emerged. New research comparing the Criteria from the era when they were highly popular with businesses to today’s Criteria, which face near-100% rejection, shows stark contrasts between how businesses think and how Baldrige thinks.

All American businesses seek to improve their competitiveness. The Baldrige Mission is to improve the competitiveness of American businesses. The graphic above indicates total failure of the Baldrige Award with respect to its Mission.

Paul Steel

MISSION: Accelerating the total organizational improvement rate beyond the capabilities of all Business Excellence approaches combined. - Paul


Breaking BADrige: The news is not so good for some Baldrige Winners . . .


2015 Criteria Changes:

  • More than 100 new Criteria requirements have been added for 2015-2016: the new Criteria contain more than 800 individual requirements (questions) in Categories 1 through 6 and more than 1,100 individual requirements in total, up from 985 in the previous version.

  • Big changes to the Core Values . . . none are the same as the originals, though a few are similar. Core Value titles that are no more include: Continuous Improvement, Valuing Partners, Focus on the Future, Employee Participation, Partnership Development, Design Quality, and Personal Learning. In addition, "Agility" has been demoted from a lead role to a shared role.

  • "Continuous Improvement" has been redefined and is now at odds with the ASQ definition and most other widely-accepted definitions. (See: Baldrige and Continuous Improvement Conflict).

More on why Baldrige is underperforming soon, but first, congratulations to our customers and newest Baldrige winners: St. David's HealthCare & Hill Country Memorial!

2015 Criteria Developments

The illusion of listening?: To the Baldrige Program's credit, it has reached out more than perhaps ever before. But soliciting input and listening with an open mind are not the same thing. In fact, the improvement feedback solicited was narrowed to only a few topics. History indicates that the Criteria will retain most flaws and introduce new ones. Update: They did just that.

Why Criteria Terminology Changes are Necessary: In May of 2014, about 50 Baldrige Senior Examiners and Judges participated in a work session to analyze Criteria terminology, including 'work systems', 'work processes', 'innovation', 'alignment', and 'integration'. The participants represented more than 500 years of Baldrige experience. Baldrige Program officials facilitated the session in Gaithersburg. All of these terms were unanimously reported as confusing and/or difficult to understand. So, if the experts find these terms difficult to understand, how can applicants possibly understand them?

Short (?) 2015 Criteria Version: There has been a lot of speculation about a new shorter version of the Criteria for 2015. A draft of the 'Short Criteria' has been developed and it appears that they will consist of nearly 200 separate requirements . . . if this is true, the lack of practicality in the existing Criteria will be preserved in the short version . . . and that is not good.


Eleven is a Charm!

Vodafone was my 11th telecommunications client in 11 different countries. I am pleased to report that all eleven achieved their project objectives . . . which for most was to win their 2013 national quality/excellence award. Thank you, Vodafone, for keeping the streak alive, and thank you for the special honor of inviting me to work with a truly outstanding organization that never stops improving.

Fiji FBEA Vodafone

Has the Baldrige Award Program Gone Out of 'Business'?

To put this in perspective, not one of the 20 million for-profit businesses in the United States applied for the Baldrige Award this year.

The Baldrige Criteria are regarded by some (myself included) as a valuable pre-improvement tool for organizational and performance excellence assessment.

However, there is a wide gap between their actual value and their perceived value, as measured by the prolonged decline in private-sector participation shown here.


Why such a disastrous performance? The growing consensus is that Criteria issues are the cause. At a high level, key problems identified by users include:

  • The Criteria have become exceedingly complex and therefore impractical, which discourages their use. For example, 71 separate and distinct questions (requirements) literally use the word 'engage', yet there are no 'engage' questions for most other stakeholder types, including suppliers, shareholders, the community, and the board of directors.

  • Confusing 'out-of-the-mainstream' terminology is used leading to resistance

  • The Criteria are imposed for a 2-year period without adequate user review or acceptance prior to imposition

  • Criteria errors are perpetuated until the next 2-year cycle and sometimes for more than 10 years

  • The Criteria are overly complex with 1,100+ individual non-overlapping response requirements (questions) for 2015

  • Major topics including suppliers, customer relationship management, and innovation are cyclically emphasized and de-emphasized . . . without any apparent reason

We can all agree or disagree with the reasons for the decline. What is clear is that the Criteria purport to offer a means to improve organizational performance excellence, and for-profit organizations are desperately in need of, and open to, such a means. This defies the basic laws of economics: it is an ideal market scenario with both inexhaustible supply and unquenchable demand . . . yet something [arguably the Criteria] has not only prevented growth in the number of for-profit applicants but caused a steady exodus.


What follows is a summary of Criteria improvement opportunities (OFIs) identified by users including winners, Examiners, Judges, a former Baldrige Foundation Chairman, national award leaders, and an advisor to two US Presidents. Everyone wants to see the Baldrige Program regain its former stature.

Please Note: This summary of the new Criteria changes is being continuously updated. It is possible that in some cases the findings presented may be incorrect. However, if the Criteria are correct and the perception is that they are not, there would appear to be an important opportunity for the Criteria to better communicate to the users.

How many words is too many? There are 5,878 words in the 2013 Criteria and 4,595 additional words in the explanatory notes under the Criteria. Some users think this is too many.
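For readers who want to sanity-check counts like these against their own copy of the Criteria text, a minimal word-count sketch is shown below. The sample strings are invented placeholders, not actual Criteria wording, and the 5,878/4,595 figures above come from the article, not from this code.

```python
# Minimal word-count sketch for checking claims like "5,878 words".
# The sample strings below are invented placeholders, NOT actual
# Criteria text; run the function on your own copy of the booklet text.
def word_count(text: str) -> int:
    """Count whitespace-separated tokens as a rough proxy for words."""
    return len(text.split())

criteria_text = "How do senior leaders set your organization's vision and values?"
notes_text = "Senior leaders include the head of the organization and direct reports."

print(word_count(criteria_text), word_count(notes_text))
```

Applying the same function separately to the full requirements text and the explanatory notes would reproduce (or refute) the two totals quoted above.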


Reinstate 'world-class': As Baldrige celebrates its 27th anniversary, it may want to consider reinstating the original 'world-class' requirement in both the process and results scoring guidelines. America ranks lower internationally in manufacturing, health care (worst among large nations), and education than it did when the Baldrige Program began in 1988. Lowering the competitiveness bar for winning to national or regional benchmark levels is not compatible with the award's original purpose and is not the best approach to improving America's competitiveness. Reinstating the original 'world-class' threshold will result in fewer winners, but State and international excellence award programs have addressed this effectively by using tiered award levels.

The Migratory Habits of 'Work Systems'

'Work Systems' were relocated from Operations Focus to Strategic Planning for 2013.
During the past 20 years, Work Systems requirements have appeared in 13 Criteria Items, the Organizational Profile, and six of the seven Categories. Work Systems' never-ending quest for a permanent home has long confused and frustrated users, and confusion and frustration are not conducive to improvement.

 Problematic: In the 2013 Criteria, the design, development, implementation, control, improvement, and sustainability of approximately 95% of the 'key processes' of a large manufacturing organization (e.g., automotive) are not covered by the 'work processes' or 'work systems' requirements, simply because those processes are performed by suppliers.
  • 'Support Processes' are back after a 6-year absence, now located in the Operations Focus Category, which is problematic for those who view support processes as not being constrained to operations. Support processes (AKA 'support services') enjoyed their very own Criteria Item from 1992 through 2004 before sharing an Item with Operational Planning in 2005 and 2006. They met their Waterloo after 15 years of loyal Criteria service when the less-than-precisely-defined 'Value Creation Processes' were introduced under the definition that support processes are not value creation processes. The protest by support-process staff who passionately believed that their processes created value for their organizations was unsuccessful. They were never to be seen or heard from again . . . until now. Caution: If you are optimistic that the Criteria learned from the previous issue, by all means don't read the notes under the new Item 6.1.
    Did you know?: 'Support processes' are not part of 'work processes', 'value creation processes', or 'work systems' . . . even if they are performed by members of your workforce.

  • 'Support processes' have been included as a subset of the Operations Focus Category. Does this mean that all Criteria-using organizations now need to have their accounting, sales, marketing, strategy, billing, and human resources organizations reporting to the COO?

  • Work Process Confusion Acknowledged: The Baldrige Program publicly acknowledged, after decades of feedback, that the term 'work process' was confusing Criteria users. However, it appears that no improvement action has been taken. At the heart of the concern is the Criteria definition that work done by members of your organization is part of a 'work process', while the same work done by a supplier is not.

    Now You See Them . . . Now You Don't!: Work processes do not appear to subscribe to Dr. Deming's "Constancy of Purpose" commandment as illustrated below:

    • 1991 - 'work processes' were added to the Criteria in Item 4.5

    • 1995 - 'work processes' were deleted from the Criteria

    • 1997 - 'work processes' were re-added to the Criteria in Item 5.1

    • 1999 - 'work processes' were re-deleted from the Criteria

    • 2007 - 'work processes' were re-added to the Criteria in Items 3.1, 6.1, 6.2, and 7.5

    • 2009 - 'work processes' were re-deleted from Criteria Item 3.1

    • 2011 - 'work processes' were re-added to the Criteria in Item 4.1 and deleted from Item 7.5

    • 2013 - 'work processes' were re-deleted from Item 4.1 and added to Item 7.1

Some users perceive that the relationships between 'work processes' and 'work systems' are convoluted or, in some cases, illogical. Here is an example: a 'work process' is defined as being performed internally. The same process, when performed by a supplier, is (as of 2013) included under 'work systems' in Strategy Development. If the process is performed internally, it must be assessed for design, development, implementation, control, improvement, and sustainability. If a supplier performs the work, these requirements are not part of a Baldrige assessment.

Guide to Knowing What 'Is' or 'Is Not' a 'Work Process': Assembly of Samsung Galaxy smartphones is a 'work process' . . . but assembly of Apple iPhones is not . . . is there any wonder why the number of Baldrige Award business applicants dropped to zero in 2013?

  • 'Do or Source' Decision: Sourcing decisions for 'key processes' (an undefined term) are strategic in nature and have appropriately been moved from the downstream Operations Focus Category upstream to Strategic Planning. But organizations source more than processes: they also source materials, components, and services, which for a manufacturing organization may represent more than half of total expenditures. The Criteria do not appear to address this non-process sourcing. Supplier selection, which is different from sourcing decision-making, is addressed downstream in Operations Focus. However, supplier selection is often a critical upstream function, especially when it relates to innovation or to new products, services, or programs. For example, a new-product proposal needs to determine whether the product can be made reliably and economically shortly after the concept phase.

  • Business Processes??? Business processes are included in the definitions of both Work Processes and Work Systems . . . but what exactly are business processes? The Criteria never say. Using an undefined term that different people and organizations define differently to define Work Systems and Work Processes can only add to the already high level of confusion.

  • Taking the "Total" out of 'Total Quality': The 'Cost Control' Criteria area has been refocused from 'work systems' to operations in general . . . that is arguably an improvement. But some may be curious as to why the Process Items do not directly address cost control on a total-organization basis.

  • Support for Innovation Has Vacillated: The very first Criteria featured innovation as a means to improve organizational competitiveness in the requirements of at least one Item in all seven Categories; in fact, 'innovation' literally appeared in the titles of those Items and Categories. Innovation Dark Ages: During the 1990s, the Criteria deemphasized innovation. For example, the 1993 through 1997 Criteria referenced innovation in only one obscure sub-area of one Item (Employee Involvement). For 2013, innovation has been reinstated into at least one Item in every Category.

  • The new 'Innovation Management' area in "Operations Focus" is a promising addition . . . except the 2013 Criteria deleted the most important Criteria question from the 2012 Criteria . . .

  • Criteria Area 2.1a(4) now addresses outsourcing of "processes". This was appropriately relocated from the Operations Focus Category because sourcing decisions are strategic. So why not also address outsourcing of products, components, materials, and/or services? For example, does a hospital outsource the 'process' of making medicines? Does a K-12 school outsource the 'process' of writing textbooks? The bigger issue is that the Criteria again undervalue suppliers, in this case by minimizing their critical sourcing value. [Action: If the Criteria really want to provide an example of effectively outsourcing a process, they should consider outsourcing the process of Criteria writing . . . just kidding, of course.]

SUPPLIER DENIERS: There Would Not Have Been a Baldrige Award if Not for Suppliers

Baldrige stunned the business community in 2001 by deleting all Supplier-dedicated Criteria Items, Criteria Areas, and Criteria questions.

There would not have been a Baldrige Award if not for suppliers: President Reagan was concerned that the Baldrige Award Program could fail if it did not receive wide support from the manufacturing community. As a condition of his support, he required that financial backing for the award be funded 50% by manufacturing suppliers: for every $300,000 asked of each major manufacturing sponsor, an additional $300,000 was required from its suppliers.


Most Baldrige applicants in the early years were suppliers: The chart at the top of this page paints a bleak trend for manufacturing/supplier applicants . . . from nearly 100 in the early years to only six in 2012.


The 2001 deletion of all supplier-dedicated Process and Results Criteria Items, Areas, and questions led to the December 2001 Quality Digest Magazine cover titled "Is the Baldrige Award Still About Quality". The feature story was written by Richard J. Schonberger, a world-acclaimed author and expert in Lean, Six Sigma, and world-class manufacturing. He cites several flaws in the Criteria, including the removal of the Items dedicated to suppliers. Dr. Schonberger's position accurately reflected sentiment throughout the business community and reinforced the already declining business participation rate. Today, a consensus is growing in other sectors, including education and health care, that the Criteria must accountably address the supplier and partner organizations that represent on the order of 50% of total expenditures. These costs often take the form of management, design, HR, maintenance, contract workforce, IT, customer support, and operational functions.
Good News (sort of):
For 2013, supplier-dedicated Criteria Areas (not Items, as before) have returned to the Criteria. This marks a significant first step toward restoring the importance of suppliers in achieving excellence. More pressure needs to be applied to restore the Criteria to their 1988-2000 level of supplier recognition, but even that will be insufficient.

Baldrige may want to check out the European Model to gain a better perspective on the importance of suppliers and partners to achieving organizational excellence.

Suppliers and partners are not second-class stakeholders. Give them equal status with other major stakeholders (e.g., workforce, customers) by adding a dedicated Suppliers and Partners Focus Category and a corresponding Results Item. Doing so will establish applicant accountability for addressing these valuable stakeholders. The winners’ application summaries (especially in health care) make a compelling case that it is no longer necessary to meaningfully address supplier and partner organizations to win the Baldrige Award . . . and nobody wins when the value of suppliers is minimized.


“On Her Majesty’s Secret Product” . . . Product? Didn’t you mean ‘service’, Mr. Bond? No, M. The Queen is using the Baldrige Criteria now. This silly analogy refers to the Criteria’s references to products and services being deleted in 2008 (with a few exceptions) and replaced with references to products only. In 2012, I had the opportunity to introduce the Criteria through assessment training to more than 30 public and private service organizations in several countries. In every session, someone new to the Criteria asked a question equivalent to this: ‘The Criteria do not apply to us because they ask how we address products and what our product results are. We don’t have any products. We are a service organization.’ Disillusioning potential Criteria users, and handing ammunition to Criteria critics within existing user organizations, is not a good strategy for gaining acceptance from service organizations. In fairness to the Criteria, the Glossary explains that products, programs, and services are all intended to be covered by the word 'products'. Good luck finding that, though: it is well hidden and not included in the Index of Key Terms. This hidden information exemplifies what is arguably one of the Criteria's worst problems: practicality. You would need to search through all 64 pages of the Criteria booklet to find the definition of any term in the requirements, retain what you read, and repeat this process thousands of times to effectively understand the Criteria's meaning. This impracticality discourages people and organizations from embracing the Criteria and severely limits the efficiency of using them. The Criteria could benefit from being written more from the perspective of the users.

Strategy Results Have Been Deported Back to Their Home Country for 2013: In 2011, 'Strategy Results' ran away from their 'Leadership Results' (Item 7.4) home and illegally entered 'Product Results' (Item 7.1) at a remote border crossing under cover of darkness. Complaints led Passport Control to eventually locate the adventuresome 'Strategy Results' and unceremoniously deport them back to their home country of 'Leadership Results'. [Action Consideration: Take away the power to impose new Criteria without user review and acceptance, and these types of changes will be filtered out.]



Proposed Baldrige Criteria Improvements Based on User Inputs

2013 Baldrige Criteria Framework Improvement Opportunity

 Baldrige Criteria Development Process Improvements

  • Why all the secrecy? Make the Criteria improvement process transparent to the public, including the improvement inputs and summaries of the analyses of those inputs. Inform people who suggest improvements of their status: whether the suggestion was given to an improvement team, the team's recommendation, and the rationale for accepting or rejecting it.

  • Require user acceptance of Criteria changes before imposing a two-year freeze (AKA, taking away the authority to impose the Criteria without user approval). Doing this will prevent major mistakes. For example, the importance of customer relationship management was recently downgraded to a minor role; the error was immediately evident to most users worldwide, but no correction was made for years. In addition, the importance of suppliers and partners would never have been allowed to be minimized, and portraying projections as results would have been DOA. Errors of this magnitude would never have occurred if Criteria user approval were required.

2013 Baldrige Criteria Improvements

  • Eliminate the Criteria bloat by reducing the total number of words by more than half to make the Criteria more practical for all users, especially smaller organizations. One way to reduce the bloat is to use the Singapore Quality Award as a role model. Making criteria more practical has proven effective elsewhere at engaging and retaining private-sector (manufacturing, service, and small-business) organizations, which has not occurred for the Baldrige Program, as is evident in the applicant charts above. Tweaking and patching the existing Criteria is not the answer. See: Graphic example of Criteria bloat.

  • Don't ever surrender to complexity: Does anyone know of an excellent organization that accepts complexity as ‘business as usual’? Interestingly, the Baldrige philosophy appears diametrically opposed to the practices of excellent organizations. The 2012 Criteria booklet states: “Complexity is a fact of organizational life. To succeed in today’s global, competitive, uncertain environment, organizations must accept complexity.” This philosophy manifests itself in Criteria perceived as overly complex (impractical) by several users. For example, if you separate the run-on sentences in Item 5.1, you will find no fewer than 97 discrete questions, which Baldrige refers to as requirements. Excellent organizations see complexity as waste and as an improvement opportunity. They succeed by reducing complexity . . . not accepting it. Baldrige would be well served to learn from these organizations and advocate simplification. Better yet, it could simultaneously strengthen and simplify the Criteria and serve as a role model.
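Tallies like the 97 discrete questions mentioned above can be approximated mechanically by counting question marks in an Item's text. A minimal sketch follows; the sample text is invented for illustration (it is not actual Item 5.1 wording), and real counts depend on how the run-on sentences are separated.

```python
# Rough tally of discrete questions in a Criteria Item: each '?' is
# treated as terminating one question. The sample text is invented
# for illustration only; it is not actual Item 5.1 wording.
def count_questions(item_text: str) -> int:
    """Count '?' occurrences as a proxy for discrete requirements."""
    return item_text.count("?")

sample_item = (
    "How do you assess workforce capability? How do you assess workforce "
    "capacity? How do you recruit, hire, and onboard new workforce members?"
)
print(count_questions(sample_item))  # -> 3
```

Bundled run-on requirements (several questions joined by "and" before a single '?') would undercount here, which is exactly why separating them first matters.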

  • Use a consistent writing style across all areas of the Criteria, using the 2012 version of Item 2.1 as a model. It is the most enduring of all Items, and it preserves the systems-perspective style that previously typified all process Items. By contrast, Items 3.1, 3.2, 5.1, and 5.2 have evolved (?) toward an ISO-checklist format (example), which diminishes their systems perspective.

  • Separate the individual Criteria requirements (questions) to improve readability, understanding, and ease of use for assessment, improvement and application submission writing purposes. The bundling of sometimes disparate questions has long had an adverse effect on understanding and ease-of-use. [Update: Unfortunately, the 2013 Criteria failed to take advantage of this improvement opportunity.]

  • Replace the term "Category" with "System" to better reflect the systems perspective underlying the Criteria and integral to the Baldrige Framework

Category 6 Improvements?

  • Category 6 has no "constancy of purpose", as a famed former mentor was fond of saying. The terminology has changed frequently and errantly, the process and systems questions have been juxtaposed, and Items 6.1 and 6.2 became embarrassingly redundant and convoluted in the 2009-2012 versions. A shift from the practical to the theoretical is widely perceived, and scorn from business users is pervasive.
    Action: Start with a clean sheet and rewrite Category 6 to improve understanding for all users, to provide a basis for re-engaging business-sector (service, manufacturing, and small-business) users, and to make it beneficial to all others.

  • Process management is applicable to all processes and systems in Categories 1 through 6. However, the Criteria now exclude all but operational processes from accountability for process management, including design, implementation, cost control, improvement, and control/sustainability. So why not either restore Category 6 to a more global focus or include its questions in the other five process Categories to ensure consistent deployment?

  • From 1988 through 2006, the Category 6 equivalent Criteria covered designing, improving, and controlling most to all processes. Starting in 2007, the focus of process management was narrowed to operational processes only. This appears counter to the total-process focus (e.g., TQC, TQM) that evolved almost fifty years ago. In 2011, the Category was renamed "Operations Focus" and now appears to exclude all non-operational processes. This creates a gap: Criteria users no longer need to address how they design and innovate non-operational processes (e.g., leadership, strategic planning, workforce engagement, customer engagement, and knowledge management) that are essential to organizational success. The Criteria are delinquent in addressing process management for these key processes.

  • Combine Items 6.1 and 6.2 from the 2012 Criteria to eliminate redundancy. For example, why ask applicants to describe the process they use to design and innovate twice? Doing so implies that designing systems should require a different process than designing processes, as opposed to the single robust process capable of designing both that is abundantly evident in mature organizations.

  • Use layman's terminology. 'Work Processes' and 'Work Systems' are not mainstream terms, although some within the Baldrige community may have fatigued and succumbed to thinking that they are. For others, these terms limit acceptance of the Category 6 Criteria. The fact that 'Work Systems' have now been unsuccessfully trialed in six Categories (no joke) suggests that enamorment with terminology has taken precedence. It may be helpful to know that the Criteria have trialed, replaced, and in some cases cycled back through the following types of process nomenclature: production, delivery, business, supplier, partnering, support, service, work, and value creation. To some, this is a compelling argument that the Criteria authors are practicing trial and error, and Criteria users deserve better. Recommendation: Rewrite Category 6 starting with a clean sheet. There is no shortage of people in the Examiner and user community capable of assisting.

  • Move strategic outsourcing to Category 2 (Strategic Planning). The process of deciding which internal processes are best done internally or externally is an upstream strategic decision. It belongs in Strategic Planning (Category 2) and not downstream. [Update: The 2013 Criteria listened to this user feedback.]

  • Emergency readiness pertains organization-wide. Given that the focus of Category 6 has been narrowed to operations only, move Emergency Readiness to Leadership or another suitable Category/Item that addresses this important area universally.

Core Terminology Improvements

Baldrige Faux Integration Graphic


Does Baldrige understand what 'integration' is? Unless some users are really, really wrong, this may be the single most damaging integrity problem for the award, because integration is a core element of both the Criteria and the Scoring Guidelines. The first words of the definition introduced in 2002 ("'integration' refers to the harmonization of . . .") are vague and create a sense of doubt. When the "Integrated Approaches (70% - 100%)" graphic at the left was added to the Criteria booklet that same year, it appeared to confirm the worst-case scenario: this graphic does not depict integration.

Additional confirmation that alignment is improperly substituted for integration can be found in the 100% scoring-band wording of both the Process and Results Scoring Guidelines. Degrading the meaning of integration to mere alignment greatly lowers the bar for excellence and impedes the learning of Criteria users. If Baldrige were to better define and illustrate integration, and correct the wording in the Scoring Guidelines, this issue would go away . . . and the effectiveness of the Criteria could improve dramatically.

  • Remove all forms of the word "align" from the Criteria and Scoring Guidelines and replace them with "integrate"/"integrate with" as appropriate. When Baldrige uses the term 'integration' in the Scoring Guidelines (Process and Results), it actually means 'aligned with', not 'integrated'. [Note: The Scoring Guidelines did not introduce "align" until 1999; see "Does Baldrige understand what 'integration' is?" above.] For this reason, 'align' is unnecessary and in fact confusing to many users.

  • Remove the word "work" when referring to processes and systems to eliminate the confusion experienced by many to most users. Some breaking good news: Baldrige has finally admitted (in September 2012, after six years of outcry) that Criteria users are confused by the terms "work processes" and "work systems".

Scoring Points Improvements

  • Restore the 65 scoring points previously assigned to Product, Process, and Strategy Effectiveness performance measures in the 2010 Criteria. This is yet another example of a major Criteria problem that could have been prevented if the authority to impose Criteria changes without user acceptance were taken away.

Criteria Engagement Improvements

  • Restore "services" to the Criteria. Mysteriously, the term "services" has been progressively removed from the Criteria since 2008, drawing negative feedback from service, nonprofit, and public sector users worldwide. For example, Item 7.1 was formerly named "Product and Service Outcomes". The removal causes confusion during training and application writing for service-based organizations. It also appears to run counter to the Baldrige Vision: "To be the partner of choice for excellence in every sector of the economy." Ironically, one of the main criticisms of the original 1988 Criteria was that they did not focus on service organizations. The Criteria were quickly modified to address this early concern, but now the pendulum appears to have swung back 25 years.

2013 Baldrige Scoring Guidelines Improvements

  • Reduce the total number of words in the Scoring Guidelines by more than half to make them more practical and effective. For example, is anyone going to read the 400 words in the Results Scoring Guidelines and remember what they read each time they assign a score? . . . certainly not everyone. If the objective is to assess trends, comparisons, segmentation, and whether the right results are being presented, there is no need for more than 200 words.

  • Add a "Segmentation" scoring dimension to the Results Scoring Guidelines to reflect the popularity of this type of feedback comment.

  • Remove all references to non-results (e.g., projections) from the Results Scoring Guidelines. The outcry when 'projections' were first introduced in Results was unanimous among the many users I encountered, including judges . . . yet they still remain at the 100% scoring band.

  • The terms "Multiple Requirements", "Overall Requirements", and "Basic Requirements" are confusing to most users and contribute to assessment variation. Guidance that the "requirements" don't really mean "requirements" doesn't help either. Advice to take a holistic view and not hold applicants accountable to the "requirements" . . . well, you get the picture.

  • Results are quantitative by nature. So, why use judgmental terms (e.g., ‘important’, ‘poor’, ‘good’, ‘good relative’, ‘very good’, ‘good to excellent’, ‘excellent’, or my personal long-time favorite ‘early good’)? They are not needed. They introduce variation into the assessment. Get rid of them.

  • How is "fully deployed without significant gaps" different from "fully deployed with significant gaps"? (BTW, how can something that is fully deployed have significant gaps?) . . . one of several examples where the Scoring Guidelines can be improved through more careful word choice, simplification, and word-count reduction. "Sustained over time" is another.

  • Improve the coherency of the Results Scoring Guidelines language, including the use of 'few', 'little', 'little to no', 'limited', 'limited or no', 'some', 'many', 'many to most', 'most', 'majority', 'fully', or my personal favorite, 'mainly'. Examples: Is 'majority' closer to 'many' or to 'most'? Is 'majority' a simple majority? Is 'mainly' more or less than 'majority'? Is 'majority' between 'many' and 'many to most', or between 'many to most' and 'most'? How does 'many' relate to 'mainly'? . . . this act needs to be cleaned up, folks.

  • Why does the "accomplishment of Mission" verbiage switch from the Trend scoring dimension to the Integration Scoring dimension in the middle scoring range?

  • Eliminate confounded terminology. For example, how should the terms "important", "high priority", and "key" be used in scoring results? Further, which of them is most important? Which should be given the highest priority? Are they all key terms? This variation in terminology is unnecessary and confusing; it contributes to scoring variation and reduces assessment validity.

  • The Results Scoring Guidelines reference customers directly but not other key stakeholders such as the workforce, suppliers and partners, and the community.

Baldrige Case Study Writing Improvement

Have the Baldrige Case Study writers 'lost the plot'? Warning: The answer is graphic in nature. Viewer discretion is advised.

Baldrige Glossary Improvements

"Imitation is the sincerest form of flattery" . . . but, it is not 'innovation' . . . unfortunately for the Baldrige Glossary

Eliminate the words "Innovation involves the adoption of . . ." from the definition for Innovation to ensure a focus on new and not only copied improvement.

Baldrige Faux Innovation Graphic

Why? The word innovation comes from the Latin word 'nova', which means new. The Baldrige definition of innovation is based on the action verb "adopt", which does not have the same meaning as new. Adopting something that is not new, and defining it as innovation because it is new to the adopting organization, does not make something that is already in use . . . 'new'. For example, one Baldrige winner adopted a process that had been used elsewhere for more than fifty years and presented it in their application and in post-award presentations as innovation . . . not good. I suppose one could salvage some face-saving value by arguing that at least this was an innovative application of an old process. But that is simply not good enough for a role-model winner. The point is that the organization should not be faulted . . . rather, the Baldrige definition is the enabler of this degradation of the meaning of the term 'innovation'.

Further, it appears that Baldrige may have mistaken a classic corrective action process for innovation in the Criteria Booklet graphic to the left.

Most importantly, allowing imitation to be credited as innovation adversely affects the competitiveness improvement rate of organizations using the Criteria.



The information provided on this website is not intended in any way to represent the views of the Baldrige Performance Excellence Program.
