Week 5 Blog / Hamed AL Foori: Evaluation Methods of Supervisory Consultants (Part 2)
Problem Statement:
The previous weekly blog discussed the criteria for hiring technical supervisory consultants to support the Implementation Team from the start of project execution until commercial operation. Because it is extremely important to award the supervisory services contract to a highly qualified and competent engineering firm, the goal of this blog is to examine two approaches to selecting the “best bidder”.
Feasible Alternatives:
For multi-attribute decision making, there are two models used in the evaluation:
- Non-compensatory approach
- Compensatory approach
Outcomes of the Alternatives:
- Non-compensatory approach. In this method the attributes are considered individually, so a strength in one attribute cannot offset a weakness in another; examples of non-compensatory tools include dominance, satisficing, disjunctive reasoning and lexicography.
- Compensatory approach. This method enables the user to weight the attributes differently and combine them into a single quantitative score, so a strong attribute can compensate for a weak one.
Acceptance Criteria:
The acceptance criteria will be the selection criteria used in the evaluation of supervisory consultant services. These criteria will be compared between the alternatives mentioned earlier.
Table 1: Acceptance Criteria of Supervisory Consultant Services

| Evaluation Categories | Selection Criteria |
| --- | --- |
| Technical Proposal | Quality of submission |
| Plan of Work | Methodology |
|  | Completeness |
| Personal Profile | Project Manager's Profile |
|  | Team's Profile |
|  | Qualification |
| Firm Profile | Relevant Projects |
|  | Omanisation |
|  | Reputation |
| Financial Proposal | Cost of services ($) |
We use a pairwise comparison to rank the selection criteria above and determine which are more important than others. In the matrix below, a 1 means the row attribute is judged more important than the column attribute, a 0 means it is not, and each row total gives the attribute's overall rank:
| Attributes | Quality | Method | Completeness | PM | Team | Qualification | Projects | Omanis | Reputation | Cost | Total |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Quality | - | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 5 |
| Method | 0 | - | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 3 |
| Completeness | 0 | 0 | - | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
| PM | 1 | 1 | 1 | - | 1 | 1 | 0 | 1 | 1 | 1 | 8 |
| Team | 1 | 1 | 1 | 0 | - | 0 | 0 | 1 | 0 | 1 | 5 |
| Qualification | 1 | 1 | 1 | 0 | 1 | - | 1 | 1 | 1 | 1 | 8 |
| Projects | 1 | 1 | 1 | 1 | 1 | 0 | - | 1 | 1 | 1 | 8 |
| Omanis | 0 | 0 | 1 | 0 | 0 | 0 | 0 | - | 0 | 0 | 1 |
| Reputation | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | - | 0 | 2 |
| Cost | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | - | 4 |
We can see that the proposed project manager's profile, the qualifications and the previous projects are the highest-ranked criteria, while completeness of the submission, Omanisation and the reputation of the bidder are the lowest.
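As a sanity check on this ranking, the short Python sketch below recomputes each attribute's total from the 0/1 judgements in the matrix above and sorts the criteria by importance (the values are typed in from the table; the code itself is only illustrative):

```python
attributes = ["Quality", "Method", "Completeness", "PM", "Team",
              "Qualification", "Projects", "Omanis", "Reputation", "Cost"]

# 1 = the row attribute is judged more important than the column attribute,
# 0 = it is not; the diagonal (self-comparison) is simply left as 0.
matrix = [
    [0, 1, 1, 0, 0, 0, 0, 1, 1, 1],  # Quality        -> total 5
    [0, 0, 1, 0, 0, 0, 0, 1, 1, 0],  # Method         -> total 3
    [0, 0, 0, 0, 0, 0, 0, 1, 0, 0],  # Completeness   -> total 1
    [1, 1, 1, 0, 1, 1, 0, 1, 1, 1],  # PM             -> total 8
    [1, 1, 1, 0, 0, 0, 0, 1, 0, 1],  # Team           -> total 5
    [1, 1, 1, 0, 1, 0, 1, 1, 1, 1],  # Qualification  -> total 8
    [1, 1, 1, 1, 1, 0, 0, 1, 1, 1],  # Projects       -> total 8
    [0, 0, 1, 0, 0, 0, 0, 0, 0, 0],  # Omanis         -> total 1
    [0, 0, 0, 0, 1, 0, 0, 1, 0, 0],  # Reputation     -> total 2
    [0, 1, 1, 0, 0, 0, 0, 1, 1, 0],  # Cost           -> total 4
]

# Row totals give the relative rank of each attribute.
totals = {name: sum(row) for name, row in zip(attributes, matrix)}
for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {total}")   # PM, Qualification and Projects top the list with 8
```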
Suppose we have three bidders, A, B and C, with the following bids:
| Attributes | A | B | C |
| --- | --- | --- | --- |
| Quality of submission | Good | Excellent | Good |
| Method of work | Excellent | Excellent | Good |
| Completeness | 100% completed | 90% completed | 80% completed |
| PM | 20 years' experience | 16 years' experience | 15 years' experience |
| Team | 6 members | 8 members | 5 members |
| Qualification | Very high | High | Medium |
| Previous Projects | 5 projects | 6 projects | 4 projects |
| Omanis | 3 | 1 | 5 |
| Reputation | High | Medium | Low |
| Cost | $3,000/month | $5,000/month | $1,000/month |
- Non-compensatory model: Lexicography Approach. Using the lexicography approach to evaluate bidders A, B and C, and based on the ordinal ranking of the attributes, the following conclusion can be reached:
| Attributes | Rank | Relative Ranking of Bidders |
| --- | --- | --- |
| Project Manager | 8 | A > B > C |
| Qualification | 8 | A > B > C |
| Previous Projects | 8 | B > A > C |
| Quality of submission | 5 | B > A = C |
| Team's Profile | 5 | B > A > C |
| Cost | 4 | C > A > B |
| Methodology | 3 | A = B > C |
| Reputation | 2 | A > B > C |
| Omani Members | 1 | C > A > B |
| Completeness | 1 | A > B > C |
From the above it is concluded that, based on the lexicography approach, bidder A leads on most of the evaluation attributes/criteria. However, this approach does not tell us how much better bidder A is than bidder B.
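To make the lexicography rule explicit, here is a minimal Python sketch: bidders are compared on the highest-ranked attribute first, and the next attribute is consulted only when there is a tie. The 3/2/1 scores are my own reading of the relative rankings in the table above (3 = best on that attribute), so treat them as illustrative rather than part of the original evaluation:

```python
# Attributes listed from most to least important (from the pairwise ranking).
priority = ["PM", "Qualification", "Projects", "Quality", "Team",
            "Cost", "Method", "Reputation", "Omanis", "Completeness"]

# Illustrative 3/2/1 scores read off the relative rankings above.
ratings = {
    "A": {"PM": 3, "Qualification": 3, "Projects": 2, "Quality": 2, "Team": 2,
          "Cost": 2, "Method": 3, "Reputation": 3, "Omanis": 2, "Completeness": 3},
    "B": {"PM": 2, "Qualification": 2, "Projects": 3, "Quality": 3, "Team": 3,
          "Cost": 1, "Method": 3, "Reputation": 2, "Omanis": 1, "Completeness": 2},
    "C": {"PM": 1, "Qualification": 1, "Projects": 1, "Quality": 2, "Team": 1,
          "Cost": 3, "Method": 2, "Reputation": 1, "Omanis": 3, "Completeness": 1},
}

# Tuple comparison is lexicographic: the highest-priority attribute decides,
# and later attributes are only used to break ties.
order = sorted(ratings, key=lambda b: tuple(ratings[b][a] for a in priority),
               reverse=True)
print("Lexicographic order:", " > ".join(order))   # A > B > C
```

Under this strict rule bidder A wins on the very first (highest-ranked) attribute, the project manager's profile, without even looking at the remaining criteria.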
- Compensatory model: Additive Weighting Technique (with bidders rated from 1 to 3 on each attribute).
| Attributes | Relative Rank | Normalized Weight (X) | Bidder A Rating (A) | X × A | Bidder B Rating (B) | X × B | Bidder C Rating (C) | X × C |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Project Manager | 8 | 0.18 | 3 | 0.54 | 2 | 0.36 | 2 | 0.36 |
| Qualification | 8 | 0.18 | 3 | 0.54 | 2 | 0.36 | 1 | 0.18 |
| Previous Projects | 8 | 0.18 | 2 | 0.36 | 3 | 0.54 | 1 | 0.18 |
| Quality of submission | 5 | 0.11 | 2 | 0.22 | 3 | 0.33 | 2 | 0.22 |
| Team's Profile | 5 | 0.11 | 2 | 0.22 | 3 | 0.33 | 1 | 0.11 |
| Cost | 4 | 0.09 | 2 | 0.18 | 1 | 0.09 | 3 | 0.27 |
| Methodology | 3 | 0.07 | 3 | 0.21 | 3 | 0.21 | 2 | 0.14 |
| Reputation | 2 | 0.04 | 3 | 0.12 | 2 | 0.08 | 1 | 0.04 |
| Omani Members | 1 | 0.02 | 2 | 0.04 | 1 | 0.02 | 3 | 0.06 |
| Completeness | 1 | 0.02 | 3 | 0.06 | 2 | 0.04 | 1 | 0.02 |
| Sum | 45 | 1 |  | 2.49 |  | 2.36 |  | 1.58 |
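The weighted scores in the table can be reproduced with a few lines of Python. The sketch below normalises the relative ranks into weights and takes the weighted sum of each bidder's 1-3 ratings (all values are typed in from the table above; names are illustrative):

```python
# Relative ranks of the attributes from the pairwise comparison.
ranks = {"PM": 8, "Qualification": 8, "Projects": 8, "Quality": 5, "Team": 5,
         "Cost": 4, "Method": 3, "Reputation": 2, "Omanis": 1, "Completeness": 1}

# Bidder ratings (1 to 3) from the additive weighting table.
ratings = {
    "A": {"PM": 3, "Qualification": 3, "Projects": 2, "Quality": 2, "Team": 2,
          "Cost": 2, "Method": 3, "Reputation": 3, "Omanis": 2, "Completeness": 3},
    "B": {"PM": 2, "Qualification": 2, "Projects": 3, "Quality": 3, "Team": 3,
          "Cost": 1, "Method": 3, "Reputation": 2, "Omanis": 1, "Completeness": 2},
    "C": {"PM": 2, "Qualification": 1, "Projects": 1, "Quality": 2, "Team": 1,
          "Cost": 3, "Method": 2, "Reputation": 1, "Omanis": 3, "Completeness": 1},
}

total_rank = sum(ranks.values())                        # 45
weights = {a: r / total_rank for a, r in ranks.items()}  # e.g. 8/45 ≈ 0.18

for bidder, scores in ratings.items():
    total = sum(weights[a] * scores[a] for a in ranks)
    print(f"Bidder {bidder}: {total:.2f}")   # A: 2.49, B: 2.36, C: 1.58
```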
Comparing the Outcomes and Selection of the Best Alternative
The additive weighting technique from the compensatory approach tells us how much each bidder differs from the others by providing a quantitative comparison. We can see that there is not a big difference between bidders A and B (2.49 vs. 2.36), while bidder C is not recommended.
How to utilize the best approach in the procurement process
Depending on the assigned project and budget, and to ensure that the best bidder is selected, the procurement department may have to adopt the compensatory approach when evaluating the bids from the consultants, since it also provides a quantitative measure of how much each bidder differs from another.
Comments:
Interesting case study Hamed. Let me throw this idea out as a challenge to you: how about if, instead of COMBINING the cost and the technical, you looked only at the technical, took the SUM and divided it by the cost to generate a "benefit to cost" ratio? As an example, Bidder A had a score of 2.49 and a cost of $3,000/month: 2.49 / 3 = 0.83 benefit-to-cost ratio. On the other hand, Bidder C had a score of 1.58 and a cost of $1,000/month: 1.58 / 1 = 1.58 benefit-to-cost ratio. This should give you a "value for money" analysis. Certainly an interesting experiment, BUT you need to back costs out of the initial set of calculations. Worth thinking about.
BR,
Dr. PDG, Jakarta
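For what it is worth, here is a rough Python sketch of the value-for-money check Dr. PDG suggests, with the cost attribute backed out of the weighting first. Expressing the monthly cost in thousands of dollars, and renormalising the remaining weights, are my own assumptions following the 2.49 / 3 example in the comment:

```python
# Drop the Cost attribute, renormalise the remaining weights,
# recompute each bidder's technical score and divide by monthly cost.
ranks = {"PM": 8, "Qualification": 8, "Projects": 8, "Quality": 5, "Team": 5,
         "Method": 3, "Reputation": 2, "Omanis": 1, "Completeness": 1}  # Cost removed

ratings = {   # same 1-3 ratings as in the additive weighting table
    "A": {"PM": 3, "Qualification": 3, "Projects": 2, "Quality": 2, "Team": 2,
          "Method": 3, "Reputation": 3, "Omanis": 2, "Completeness": 3},
    "B": {"PM": 2, "Qualification": 2, "Projects": 3, "Quality": 3, "Team": 3,
          "Method": 3, "Reputation": 2, "Omanis": 1, "Completeness": 2},
    "C": {"PM": 2, "Qualification": 1, "Projects": 1, "Quality": 2, "Team": 1,
          "Method": 2, "Reputation": 1, "Omanis": 3, "Completeness": 1},
}
monthly_cost = {"A": 3, "B": 5, "C": 1}   # assumed unit: thousands of $ per month

total_rank = sum(ranks.values())          # 41 once Cost is excluded
for bidder, scores in ratings.items():
    technical = sum(ranks[a] / total_rank * scores[a] for a in ranks)
    print(f"Bidder {bidder}: technical {technical:.2f}, "
          f"benefit/cost {technical / monthly_cost[bidder]:.2f}")
```

Under these assumptions bidder C still comes out ahead on benefit-to-cost ratio, in line with Dr. PDG's rough calculation.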