Just The Facts Please! The Quantitative Approach To Rating Suppliers

Author(s):

Patrick S. Woods, C.P.M., CPIM, A.P.P., Commodity Manager, Emerson Electric/Fisher Controls, Sherman, TX 75091, 903-868-8160

83rd Annual International Conference Proceedings - 1998 

Overview. How often have you heard expressions such as "they're a bad supplier," "they need to shape up or ship out" or "I think I have a good supplier?" Although such expressions are spoken with sincerity, they convey a subjective, qualitative and sometimes vague approach to rating suppliers. In contrast, this presentation focuses on an objective, quantitative and very specific approach to rating suppliers.

This presentation focuses on three progressive levels of supplier acceptability, which are Accepted, Certified and Preferred, including the specific rating criteria to meet each level. Rating criteria are established via formulas, and each formula is developed and reviewed in such areas as quality (overall Parts Per Million (PPM)), delivery (by customer request date and by supplier promise date) as well as supplier lead time.

The implementation of this program is discussed via the Ratings Implementation Committee (RIC), which is a cross-functional team made up of various in-plant disciplines directly or indirectly working with the supplier. The presentation is rounded out by discussing a Supplier Quality Recognition Program for suppliers who exceed the ratings and a Probationary Program for suppliers falling below the ratings.

The Precursor To Developing A Rating System. The purpose for progressive levels is to encourage the supplier to always strive to do better. A supplier who is complacent or who is "resting on their laurels" will not continue to improve or worse, may disintegrate in their overall service to you. In the real world, there is no "perfect" supplier and there is always room for improvement.

In developing a supplier rating system, ask yourself two questions. Number 1, what types of performance measures of a supplier are important to my firm? Number 2, what are the minimum acceptable criteria in each of these measures?

To answer question number 1, you may wish to meet with other department heads in your organization (form a committee) such as quality assurance, production and inventory control, customer service, etc. To list all possible factors for ratings would be beyond the length and scope of this paper. We will discuss three key measures (quality, delivery and supplier lead time) in terms of examples, and hopefully you can adapt other measures, if desired, using the same type of format.

To answer question number 2, again as a committee, come up with the various acceptable criteria (examples to be provided below).

The Rating Criteria. Based on the three measures listed above, the next step is to develop specific rating formulas.

Quality Rating: One of the latest techniques in quality assurance is the Parts Per Million (PPM) rating. For every 1 million parts shipped, this rating detects how many are defective. The lower the PPM, the higher the quality level the supplier is maintaining. Obviously, every part shipped to you by your supplier may not reach a volume of 1 million, but as shown below, the actual volume can be scaled to still reflect a PPM rating.

Quality PPM Formula: PPM = TREJ / (TRPD / 1,000,000)

  • TREJ = Total number of parts rejected back to the supplier in the last 12 months
  • TRPD = Total number of parts received from the supplier in the last 12 months
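The PPM formula above can be sketched in a few lines of code. This is a minimal illustration, not from the paper; the function name and sample figures are assumptions.

```python
def quality_ppm(total_rejected, total_received):
    """Defective parts per million parts received over the last 12 months.

    Implements PPM = TREJ / (TRPD / 1,000,000), scaling whatever volume
    was actually received up to a per-million basis.
    """
    if total_received <= 0:
        raise ValueError("No receipts recorded for this supplier")
    return total_rejected / (total_received / 1_000_000)

# Example: 12 parts rejected out of 4,800 received -> 2,500 PPM
print(quality_ppm(12, 4_800))  # 2500.0
```

Note that a supplier shipping far fewer than a million parts can still post a meaningful PPM, since the denominator scales the actual volume.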

Delivery Rating: Listed below are two basic types of delivery ratings. One is based on the supplier promise date. For example, you place an order on March 1, 1998. The supplier promises delivery to your dock on May 1, 1998. If delivery is made after May 1, 1998, then for that particular order, the supplier would be counted as late.

Delivery By Supplier Promise Date Formula: % ONTIME = OTR / TR

  • OTR = Total number of on-time receipts by supplier promise date (items not $) for supplier for the last 12 months.
  • TR = Total number of receipts (items not $) for the supplier in the last 12 months.

The above delivery rating reflects the supplier's ability to meet its commitments to you and to do what it says it is going to do. The second type of delivery rating is based on customer request date. Technically, per the above rationale, the supplier could promise delivery in 5 years, and as long as it met the 5-year time frame, delivery would be acceptable (obviously not acceptable to your customer, who will not wait 5 years!). Delivery by request date is an attempt to rate the supplier on its ability to meet your customer's request for your product that incorporates the supplier's item.

Delivery By Customer Request Date Formula: % ONTIME = OT / TR

  • OT = Total number of on-time receipts by customer request date (items not $) for supplier for the last 12 months.
  • TR = Total number of receipts (items not $) for the supplier in the last 12 months.
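Both delivery ratings share the same shape: on-time receipts divided by total receipts, counted by item rather than dollars. A minimal sketch (the function name and example counts are assumptions):

```python
def on_time_percent(on_time_receipts, total_receipts):
    """Percentage of receipts (counted by item, not $) that arrived on time
    over the last 12 months. Works for either measurement basis: pass
    counts measured against supplier promise date or against customer
    request date.
    """
    if total_receipts <= 0:
        raise ValueError("No receipts recorded for this supplier")
    return 100.0 * on_time_receipts / total_receipts

# Example: 57 of 60 receipts met the supplier promise date -> 95.0%
print(on_time_percent(57, 60))  # 95.0
```

The same function is run twice per supplier, once with each set of on-time counts, so the two ratings can be compared side by side.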

Obviously, the customer could request delivery in 2 hours, and it would not be fair to penalize the supplier for impossible delivery requests. You may wish to offset this requirement (e.g., the supplier must meet customer requests for delivery of 3 weeks or greater). In order to meet the above rating, you should encourage your supplier to reduce its lead time (which will also help the rating below) and/or develop just-in-time stocking programs to reduce delivery time.

Supplier Lead Time Rating: The last rating to be discussed is the supplier's lead time. Lead time is defined as the total time it takes the supplier to provide product to your dock from when you first place the purchase order. Although there are different schools of thought on how to reduce lead time, true lead time reduction is the supplier's ability to improve its process and reduce set-ups to produce product(s) faster, as opposed to carrying inventory (which you will eventually pay for anyway). This is not a contradiction with stocking programs, but there should be a balance between the two.

Lead Time (Days) Formula: LDTIME = SLT/TNP

  • SLT = Sum Of (Days) lead time for all parts assigned to a given supplier
  • TNP = Total number of parts assigned to a given supplier

Note: If your company prefers to state lead time in weeks (based on either 5 or 7 days) then weeks can be substituted for days with the same result.
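The lead time rating is a simple average across the supplier's assigned parts. A minimal sketch, assuming lead times are kept as a per-part list (names and figures are illustrative):

```python
def average_lead_time(lead_times):
    """Average lead time across all parts assigned to a supplier
    (LDTIME = SLT / TNP). Pass values in days or weeks; the result is in
    whatever unit the inputs use, as the paper notes.
    """
    if not lead_times:
        raise ValueError("No parts assigned to this supplier")
    return sum(lead_times) / len(lead_times)

# Example: three parts with lead times of 30, 45 and 15 days
print(average_lead_time([30, 45, 15]))  # 30.0
```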

Developing Data Into 3 Progressive Levels. The next step is to organize the findings into expectations as they relate to the 3 progressive levels. The names of the three levels chosen in this paper are Accepted (minimum level), Certified (next level) and Preferred (highest level), but you may choose different names or possibly more than 3 levels of expectation. The purpose here is to provide the format and examples. Again, we assume that the committee will determine the specific rating acceptability within the various levels.

Example:
ACCEPTED LEVEL

  • Quality By PPM: 7,500
  • Delivery By Supplier Promise: 95%
  • Delivery By Customer Request: 75%*
  • Lead Time (Days): 40

* Note that the delivery by customer request is more lenient than by supplier promise due to its difficulty to achieve.

CERTIFIED LEVEL

  • Quality By PPM: 3,000
  • Delivery By Supplier Promise: 98%
  • Delivery By Customer Request: 85%
  • Lead Time (Days): 30

PREFERRED LEVEL

  • Quality By PPM: 1,000
  • Delivery By Supplier Promise: 99.5%
  • Delivery By Customer Request: 95%
  • Lead Time (Days): 20
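Assigning a supplier to a level then amounts to finding the highest level whose criteria it meets on all four measures. The sketch below uses the example thresholds from this paper; the function and data-structure names are assumptions.

```python
# Example thresholds from the levels above, ordered best to worst so the
# first level a supplier satisfies on every measure is the one assigned.
LEVELS = [
    ("Preferred", {"ppm": 1_000, "promise": 99.5, "request": 95.0, "lead_days": 20}),
    ("Certified", {"ppm": 3_000, "promise": 98.0, "request": 85.0, "lead_days": 30}),
    ("Accepted",  {"ppm": 7_500, "promise": 95.0, "request": 75.0, "lead_days": 40}),
]

def rate_supplier(ppm, promise_pct, request_pct, lead_days):
    """Return the highest level whose criteria the supplier meets on all
    four measures, or 'Below Accepted' if it meets none (a candidate for
    probationary status)."""
    for name, c in LEVELS:
        if (ppm <= c["ppm"]
                and promise_pct >= c["promise"]
                and request_pct >= c["request"]
                and lead_days <= c["lead_days"]):
            return name
    return "Below Accepted"

# Example: 2,500 PPM, 98.5% by promise, 90% by request, 25-day lead time
print(rate_supplier(2_500, 98.5, 90.0, 25))  # Certified
```

Because the check requires all four criteria at once, one weak measure (say, a 35-day lead time) holds the supplier at a lower level even if its quality is Preferred-grade, which is exactly the incentive to improve that the progressive levels are meant to create.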

Note: The data above can be collected on a monthly, quarterly or semi-annual basis or at any given point in time. Monthly is the most common interval and will allow you to see trends toward improvement or disintegration.

Communicating The Results With The Supplier. Hopefully, by now you can see that the above data is an objective, quantitative approach to discussing a given supplier's performance. By focusing on specific rating areas, the supplier can clearly see where it needs to improve or be proud of its accomplishments. Also, this data can be used as a basis to either reward the supplier or place it on probationary status (see below).

The Ratings Implementation Committee (RIC). The RIC should be made up of key customer representatives and can include purchasing (obviously), quality assurance, production and inventory control, users of the supplier's product and any other representatives that you deem important. The purpose of the RIC is to meet, usually on a bi-monthly basis, to review the ratings of each supplier. The RIC can specifically focus on problem areas and communicate with the supplier, requesting its improvement plan. If a supplier does not heed the advice of the RIC and does not continue to make improvement, then the RIC can recommend that the supplier be placed on "probationary" status, which should be communicated via letter from the RIC. Probationary status means that the supplier must make drastic improvement within the next ___ months (to be determined by the RIC) or purchasing will take steps to re-source all the supplier's products (assuming you do not have a sole source supplier).

On a more positive note, if a supplier is exceeding the ratings and continues to maintain Preferred status, then the RIC can nominate and approve the supplier for your company's quality award. The quality award is an opportunity for the RIC to personally visit the supplier, its entire operation and reward them for their efforts and hard work. Not all tasks of the RIC are unpleasant.

Parting Thoughts. Obviously, for any type of rating system to exist, you must have a system in place to collect data; in this example, quality, delivery and lead time data. Also, you may have to begin with an education program both for your staff as well as the supplier, particularly if you are starting from square one. The supplier(s) may also be initially hesitant if they have never before received constructive criticism or a request for a corrective action plan. Once you have overcome these hurdles, this program should prove to be rewarding especially if the supplier(s) realize that you are evaluating them on objective data and not just a whim.
