Re-using the same questionnaires can bring great efficiencies with SupplierSelect. Respondents can import and edit a previous set of responses, and scores are silently copied at the same time. This means that scoring a response becomes an exercise in exception processing - you only read/score what has changed. But scoring like this can lead to inconsistencies, because the comparison is against a project of the respondent's choosing (i.e. the response they imported from). SupplierSelect addresses this problem with a new feature: Benchmark Issues.
Benchmark Issues allow the evaluator to designate any previous response from a respondent organisation as the "benchmark" for that organisation's current response. Scores are then imported from the benchmark, and answers in the current response are compared against it. The benefit is that the benchmark may belong to a project that featured an unusually high degree of research and due diligence - you know those scores are good, so it makes sense to compare new responses against that benchmark.
For example, Acme Consultants are specialists in the Fashion industry. They evaluate clothing manufacturers for fashion brands. They issue very similar RFP questionnaires to subsets of the same 25-30 suppliers.
In June 2005 they undertook a sourcing project for a very big fashion label - F.U.B.A. It was a high value project, so they flew to each supplier and conducted numerous interviews to back up the information gained from the RFP questionnaire. They then scored and re-scored each response.
Later, in May 2006, Acme are working on a smaller project. This time there is no budget to fly consultants around the world, but they issue a very similar RFP to 80% of the same suppliers. Birmingham Textile Group is one of the respondents. They have answered the same RFP questionnaire 3 times previously, and they need to decide which one to import their answers from. One of the previous responses was for the F.U.B.A. project, but they didn't do very well on it (they lost the contract), so they don't think those responses were good enough. So they import from the response set of an earlier, smaller project where they did get shortlisted.
When Acme come to score the responses to the current project, they would usually compare imported answers to the answers/scores from the source project. Where the answers are the same, it's assumed there is no need to re-score them. But in this case Acme know they have one set of answers and scores in which they have great confidence - the F.U.B.A. project. So what they'd like is to compare the current response's answers against the F.U.B.A. project.
Benchmark Issues allow them to do exactly that. By defining the "Benchmark Issue" for a current response, SupplierSelect will compare answers and scores against that project, not the project from which the answers were imported.
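The exception-processing idea behind this is simple: where an answer matches the benchmark, the benchmark's score is carried over; where it differs, the evaluator must score it by hand. The sketch below is purely illustrative - SupplierSelect's internals are not public, and every name and data structure here is hypothetical:

```python
# Illustrative sketch only: SupplierSelect's internals are not public.
# All function names and data structures here are hypothetical.

def score_against_benchmark(current_answers, benchmark_answers, benchmark_scores):
    """Carry over scores for unchanged answers; flag changed ones for review."""
    carried_scores = {}
    needs_review = []
    for question, answer in current_answers.items():
        if benchmark_answers.get(question) == answer:
            # Answer matches the benchmark: carry the trusted score over.
            carried_scores[question] = benchmark_scores[question]
        else:
            # Answer differs (or is new): an evaluator must score it by hand.
            needs_review.append(question)
    return carried_scores, needs_review

# Example: only Q2 changed since the benchmark, so only Q2 needs re-scoring.
current   = {"Q1": "Yes", "Q2": "500 units/day", "Q3": "ISO 9001"}
benchmark = {"Q1": "Yes", "Q2": "300 units/day", "Q3": "ISO 9001"}
scores    = {"Q1": 5, "Q2": 3, "Q3": 4}

carried, review = score_against_benchmark(current, benchmark, scores)
# carried -> {"Q1": 5, "Q3": 4}; review -> ["Q2"]
```

The point of choosing a well-researched benchmark is that the carried-over scores are ones you already trust, so the manual work shrinks to genuinely changed answers.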
To set a Benchmark Issue, go to Projects > Project Title > Issue Title, and click the link to "Select Benchmark Response".
Once Benchmark Issues are set, you can use the links under Analysis > Scoring Differences to see a table showing the differences between the scores awarded to benchmark answers and the scores awarded in the current project. TODO - elaborate and screenshot.