Evaluation of RFP responses


A previous article looked at the initial creation of an RFx document, the challenges in creating the RFP pack, and some pitfalls for companies responding to one. Our recent experience confirms that it pays to consider a few key points during creation. This article focuses on the next step in the RFP process: the evaluation of responses.

To recap quickly: in the RFx creation process, ensure that the critical requirements are explained clearly, so that respondents can be certain about what needs to be delivered. The art in creating the RFx document lies in striking the right balance between describing the requirements and still allowing a flexible approach. The latter becomes especially important with an agile or DevOps approach. Generally, companies will be given several weeks to respond to an RFP (or RFQ). Depending on the amount and complexity of work required, 2 to 6 weeks is a typical timeframe for briefings and questions until the submission date.

Diaxion recommends deciding on the evaluation process prior to the RFP submission date. This offers several benefits:
1. Responders can be advised of the evaluation and scoring criteria
2. Internal resources (evaluators) can be informed to allow them to set aside time for evaluation
3. A probity plan can be created for good governance ensuring transparency and traceability

Some decisions that have to be made prior to and during the evaluation process concern how the engagement with the selected company will work, i.e. the contractual terms:

1. Master Service Agreement – which one to use?
Depending on the complexity of the client’s – or vendor’s – MSA, the negotiation process for a new vendor / new MSA can take anywhere from a few days to several months, including reviews by the legal team. If done well, it will structure the ongoing engagement and fast-track future engagements with the vendor.

2. Contractual terms and conditions within e.g. a Statement of Work (SoW) including:

  • Pricing (fixed vs. variable, milestones, use of a rate card)
  • Deliverables and assumptions
  • Some of these items will already be at least outlined in the RFx documents; however, vendors are often reluctant to commit fully in RFx responses. It may be that additional items are discovered during the negotiation phase, that the scope is adjusted, or that the vendor’s delivery organisation only learns at this point what the vendor’s sales team has committed to.

Probity plan

The probity plan describes the overall RFx process and may include items such as:

  • How were the RFP participants selected (open / closed tender; selection criteria)
  • Submission process and details covering items like
    1. planned timeline for the RFx evaluation process
    2. planned activities, including the number and level of workshops or similar
  • Evaluation criteria and associated weighting of criteria
  • Evaluation team and members’ responsibilities
  • Commercial and financial criteria

Evaluation process

At a high level, the evaluation process – whether or not a probity plan is in place – will follow the activities mentioned above.

One of the main challenges in Diaxion’s experience is that the evaluation process imposes significant additional workload on the evaluation team members. This can be reduced somewhat by splitting the review into meaningful sections, e.g. technical, financial, governance or implementation, project management and commercial. That way only a small number of staff may have to review the complete response.

Weighting the criteria is another challenge, as there will naturally be tension between the various areas, e.g. between technical and financial: where does one draw the line beyond which a technically superior solution is simply not worth the quoted cost?

As long as the evaluation process is consistent for all staff evaluating the responses, there are no firm rules, i.e. most approaches are acceptable – whether there are different multipliers for mandatory/desirable/optional criteria (3, 2, 1 or 5, 3, 1 or…) or the evaluation results are ranked separately. Again, this is where it is good to have a probity plan that outlines these decisions and can be referred to.
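The multiplier approach above can be sketched in a few lines of code – a minimal illustration only, with hypothetical criteria names and the 3/2/1 multipliers for mandatory/desirable/optional criteria mentioned as one of the possible schemes:

```python
# Illustrative sketch of a weighted scoring scheme; criteria names, scores
# and multipliers are assumptions, not a prescribed method.
MULTIPLIERS = {"mandatory": 3, "desirable": 2, "optional": 1}

def weighted_total(scores: dict, priorities: dict) -> float:
    """Sum each raw score (e.g. on a 0-10 scale) times its priority multiplier."""
    return sum(score * MULTIPLIERS[priorities[c]] for c, score in scores.items())

# Hypothetical example: one vendor's scores per criterion.
priorities = {"security": "mandatory", "reporting": "desirable", "branding": "optional"}
scores = {"security": 8.0, "reporting": 6.0, "branding": 4.0}

print(weighted_total(scores, priorities))  # 8*3 + 6*2 + 4*1 = 40.0
```

Whatever the chosen multipliers, the same formula must be applied to every response so the totals remain comparable.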

Likewise, each item can be scored on a continuous scale (0 to 5, 1 to 10, etc.) or on a set of distinct scores (for example: 0 = not provided/insufficient; 2.5 = bare minimum/severely lacking; 5 = acceptable; 7.5 = above average; 10 = outstanding/well above expectations).
Diaxion recommends a minimum of 3 reviewers for each section of each response to detect any outliers. These outliers, i.e. where one person scores “0” and another “10”, should then be discussed as part of the evaluation process. Where the difference is not significant, no such discussion is required – this speeds up reaching a common ranking of responses.
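The outlier check can be expressed as a simple spread test – a minimal sketch assuming three reviewers per section and a hypothetical threshold (here, a spread of more than 4 points on a 0–10 scale) beyond which a discussion is triggered:

```python
def needs_discussion(scores: list, threshold: float = 4.0) -> bool:
    """Flag a criterion for discussion when reviewer scores diverge too widely.

    `threshold` is an assumed cut-off for illustration, not a fixed rule;
    each evaluation team should set its own in the probity plan.
    """
    return max(scores) - min(scores) > threshold

print(needs_discussion([0.0, 5.0, 10.0]))  # wide spread -> True
print(needs_discussion([7.0, 7.5, 8.0]))   # close agreement -> False
```

Scores within the threshold are accepted as-is, which keeps the discussion time focused on genuine disagreements.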

Depending on the number and spread of responses, a second round of evaluation between the top 2 or 3 responses may include workshops, proof of concept, demo sessions or further financial refinement.

It is quite common to ask the top two vendors for a “best and final offer” (BAFO), which gives responders an opportunity to reduce the price, provide additional services or similar – with the aim of demonstrating their appetite to win the business. This BAFO phase can produce rather unexpected outcomes – one vendor recently reduced the scope to come back with a financially viable offer, but ultimately increased the total amount over the lifetime of the contract.

Outcomes like this may be somewhat disappointing when the preferred and best solution is simply not financially viable; however, they allow a joint decision to be reached quickly.

Once the successful vendor has been notified, a common next phase is the creation of a Statement of Work, which is then followed by the steps outlined in the SoW – be that technical implementation, transition of services or some other project initiation.