Agencies are required to evaluate
proposals based solely on the factors identified in the
solicitation, and must adequately document the bases for
their evaluation conclusions. Intercon Assocs., Inc.,
B-298282, B-298282.2, Aug. 10, 2006, 2006 CPD ¶ 121 at 5.
While agencies properly may apply evaluation
considerations that are not expressly outlined in the RFP
if those considerations are reasonably and logically
encompassed within the stated criteria, an agency may not
give importance to specific factors, subfactors, or
criteria beyond that which would reasonably be expected by
offerors. See Risk Analysis and Mitigation Partners,
B-409687, B-409687.2, July 15, 2014, 2014 CPD ¶ 214 at
6-13; Raytheon Co., B-404998, July 25, 2011, 2011 CPD ¶
232 at 15-16; Consolidated Eng'g Servs., Inc., B-311313,
June 10, 2008, 2008 CPD ¶ 146 at 6-11; Lloyd H. Kessler,
Inc., B-284693, May 24, 2000, 2000 CPD ¶ 96 at 3. While we
will not substitute our judgment for that of the agency,
we will question the agency's conclusions where they are
inconsistent with the solicitation criteria and applicable
procurement statutes and regulations, undocumented, or
unreasonable. Public Commc'ns Servs., Inc., B-400058, B-400058.3, July 18, 2008, 2009 CPD ¶ 154 at 17.
(section deleted)
Here, as discussed above, the
solicitation advised offerors that, in evaluating
experience, the agency would consider the relevance of an
offeror's experience, taking into consideration various
aspects of that performance, including "experience
performing deliveries as a full line food service
distributor, dollar value, and number of customers." RFP
at 202. However, nothing in the solicitation either
stated, or could be reasonably construed as: (1)
precluding a rating of acceptable under the experience
factor if the level of relevant experience fell below 80%
of the estimated dollar value of the competed contract; or
(2) limiting the experience considered relevant to only
experience as a "full line food service distributor." As
noted above, although agencies properly may apply
evaluation considerations that are reasonably encompassed
within the solicitation's stated criteria, an agency may
not give importance to specific criteria beyond that which
would reasonably be expected by offerors. In our view,
based on the language of the solicitation, the offerors
could not have reasonably understood that a level of
experience below 80% of the competed contract's estimated
dollar value would preclude a rating of acceptable or that
only experience as a "full line food service distributor"
would be considered relevant for purposes of applying the
unstated 80% factor. (EFS
Ebrex SARL B-416076: Jun 4, 2018)
In reviewing protests challenging an
agency's evaluation of proposals, we do not independently
evaluate proposals. Rather we review the record to
determine whether the agency's evaluation was reasonable
and consistent with the terms of the solicitation and
applicable statutes and regulations. Metis Solutions, LLC
et al., B-411173.2 et al., July 20, 2015, 2015 CPD ¶ 221
at 4. While we will not substitute our judgment for that
of the agency, we will sustain a protest where the
agency's conclusions are inconsistent with the
solicitation's evaluation criteria, inadequately
documented, or not reasonably based. Id.
We conclude that the agency's review of ME's proposal was
not consistent with the terms of the solicitation's
evaluation criteria. The record shows that the agency
performed only a superficial, perfunctory review of the ME
proposal to identify instances where ME allegedly did not
fully comply with the instructions for proposal
preparation. However, the agency did not meaningfully
evaluate the substance of the ME proposal, as required by
the solicitation's evaluation criteria.
As an initial matter, we point out that nothing in the
RFP's evaluation criteria advised offerors that the agency
would perform a preliminary, pass/fail compliance check of
proposals to determine whether the offerors had complied
strictly with the solicitation's proposal preparation
instructions. Agencies are required to evaluate proposals
exclusively based on the evaluation factors stated in the
solicitation. While a solicitation may establish
additional informational, technical, administrative, or
other requirements in the instructions for proposal
preparation, those requirements may not properly be
considered in connection with the evaluation of
proposals--and correspondingly may not provide a basis for
eliminating a proposal from consideration--unless those
additional requirements also are specified as a basis for
proposal evaluation. Metis Solutions, LLC et al., supra, at 5 n.6; Veterans Evaluation Services, Inc., et al., B-412940 et al., July 13, 2016, 2016 CPD ¶ 185 at 16 n.21.
Here, the record shows that the agency identified only instances where ME allegedly failed to
comply strictly with the solicitation's proposal
preparation instructions as a basis for finding the
proposal unacceptable. See AR, exh. 15, Contracting Officer's Compliance Checklist; exh. 16, Cost Price Analyst
Review Memorandum; exh. 17, Contract Specialist's
Compliance Memorandum; exh. 18, Source Selection
Authority's Clearance Memorandum, Noncompliance Briefing,
and Memorandum for the Record. However, as noted, nothing
in the solicitation's evaluation criteria advised offerors
that the agency would perform a preliminary pass/fail
compliance check to determine whether the firms had
prepared their proposals in strict conformance with the
solicitation's proposal preparation instructions.
The agency argues that the solicitation's evaluation
factors did, in fact, advise offerors of (and contemplate)
the compliance review performed by the agency. In this
connection, the agency points to the following language
included in the solicitation:
Award will be made to a single
Offeror who is deemed responsible, whose proposal
conforms to the solicitation requirements (to include
all stated terms, conditions, representations,
certifications, and all other information required by
Section L and Section M of this solicitation), and whose
proposal represents the most advantageous offer to the
Government.
RFP at 154. While this language
generically describes the basis for the agency's award
decision, it does not describe the evaluation factors that
the agency would use to evaluate proposals, and, in fact,
appears under the heading "Basis for Contract Award." That
clause is followed by another section of the solicitation
entitled "Evaluation Criteria" which is where the actual
evaluation factors are described. Notwithstanding this
distinction, neither the "Basis for Contract Award"
section, nor the evaluation criteria, advised offerors
that their proposals could be rejected for failure to
adhere to the proposal preparation requirements. (McCann-Erickson
USA, Inc. B-414787: Sep 18, 2017)
As described above, the RFP
established two criteria for offer acceptability, one of
which was proposing aircraft that met a list of minimum
requirements. RFP § D7. In its combined
comments/supplemental protest, Phoenix argues that Tempus
identified seven specific aircraft (each identified by its
tail number) in volume I of its proposal, but failed to
provide the required aircraft questionnaire form for one
of them. In this regard, Tempus listed a [DELETED] bearing
tail number [DELETED] as the second on-call aircraft in
its pricing proposal. See AR Tab 8, Tempus Proposal, vol.
I, at 2. However, Tempus’s technical proposal did not
include an aircraft questionnaire form for that aircraft.
Instead, Tempus provided an aircraft questionnaire form
for a different aircraft (a different tail number, also a
[DELETED]) as its second on-call aircraft. AR Tab 8,
Tempus Proposal, vol. II, at 44.
Initially, Interior responded that the offer acceptability
evaluation was to determine whether the proposal offered
to meet basic aspects of the RFP; thus, the RFP required
specific information about each aircraft the offeror was
proposing to use “as well as [its] FAA Registration
Number” for offer acceptability, and “Tempus provided the
required information.” AR at 5.
However, after Phoenix pointed out that Tempus had
identified two different aircraft for its second on‑call
aircraft, the agency abandoned its initial contention that
the RFP required specific information for each proposed
aircraft, and its related contention that Tempus had
provided the required information. Instead, the agency now
argues that only the aircraft make and model were
“integral to the offer.” Compare AR at 3, with
Supplemental AR at 3-4. As a result, the agency now
contends that the conflicting aircraft registration
numbers have no impact on the offer acceptability
evaluation. Supplemental AR at 4.
Our Office will review a protester’s challenge to an
agency’s technical evaluation by examining the record to
determine whether the agency’s evaluation conclusions were
reasonable and consistent with the terms of the
solicitation and applicable procurement laws and
regulations. CACI-ISS, Inc., B-412074, Dec. 21, 2015, 2016
CPD ¶ 35 at 6 n.6.
Interior’s view that Tempus’s proposal met the
acceptability criteria, despite proposing one aircraft in
its § A2 table but providing an aircraft information form
about a different aircraft, is unreasonable and
inconsistent with the RFP. The RFP expressly required
offerors to identify each aircraft by its unique FAA
registration number. The RFP also required specific
information for each of those aircraft, such as its total
hours of service for the airframe and for each engine, the
most recent weight and balance test, and all modifications
to the aircraft. Tempus’s proposal identified a specific
aircraft, but then provided information about a different
aircraft. As a result, the conclusion that Tempus’s
proposal was acceptable under the offer acceptability
criteria was unreasonable, so we sustain this ground of
protest. (Phoenix Air
Group, Inc. B-412796.2,B-412796.3: Sep 26, 2016)
Ascella maintains that the agency
applied an unstated evaluation criterion when it evaluated
the protester’s quotation under the technical excellence
evaluation factor. The RFQ provided that quotations would
be evaluated under the technical excellence factor based
on the following narrative provision:
The Offeror will be evaluated to
determine the extent to which the Offeror demonstrates
an understanding of the Performance Work Statement,
Attachment 1 of the RFQ. The Offeror’s plan should
demonstrate requisite experience and expertise in
supporting Enterprise Resource Planning (ERP) accounting
systems and interfaces. The Plan shall be evaluated to
determine the extent to which the Offeror demonstrates
experience and expertise in providing business
architecture support and software development. The Plan
shall be evaluated to determine the extent to which the
Offeror demonstrates their ability to provide an
applications helpdesk. The Offeror will be evaluated on
their ability to provide surge support.
RFQ at 5. According to the protester,
the agency unreasonably downgraded its
quotation--assigning it only an acceptable, rather than a
higher, rating--for failing to provide details of its
approach to accomplishing various aspects of the
requirement. Ascella argues that the RFQ did not require
such details, but only required Ascella to demonstrate its
experience, expertise and understanding of the agency’s
requirement.
We find no merit to this allegation. In reviewing protests
challenging an agency’s evaluation of quotations or
proposals, our Office does not reevaluate the quotations
or proposals, or substitute our judgment for that of the
agency; rather, we review the record to determine whether
the agency’s evaluation was reasonable and consistent with
the solicitation’s evaluation criteria, as well as
applicable statutes and regulations. FP-FAA Seattle, LLC,
B-411544, B-411544.2, Aug. 26, 2015, 2015 CPD ¶ 274 at 7.
While solicitations must inform offerors of the basis for
quotation or proposal evaluation, and the evaluation must
be based on the factors set forth in the solicitation,
agencies are not required specifically to list every area
that may be taken into account, provided such areas are
reasonably related to, or encompassed by, the stated
criteria. MicroTechnologies LLC, B-403713.6, June 9, 2011,
2012 CPD ¶ 131 at 3.
Here, the RFQ specifically advised firms that their
quotations would be evaluated to determine the extent to
which they demonstrated an understanding of the
performance work statement (PWS). RFQ at 5. It is
axiomatic that demonstrating an understanding of the PWS
necessarily involves the vendor detailing the actions and
approaches it intends to use to perform the agency’s requirement.
Ascella has not taken issue with any of the agency’s
substantive evaluation findings made in connection with
various elements of the PWS, and apparently believes
instead that it was not required to provide any
information beyond generalized representations concerning
its experience, expertise and understanding of the
agency’s requirements. However, the record shows that the
agency made a number of reasonable, specific evaluation
findings where Ascella simply failed to provide any
information concerning how it would accomplish certain of
the solicitation’s requirements. The record shows that
Ascella failed to provide any details for how it would
maintain standard operating procedures and work
instructions, as required by PWS element C.1.2.a; failed
to provide any information about a transition out plan, as
required by PWS element C.1.2.c; failed to describe a plan
to install desktop software, as required by PWS element
E.1.1.c; failed to describe a plan for annual account
review, as required by PWS element E.1.1.g; and failed to
address software configuration and turnover, as required
by PWS element D.1.1.d. AR, exh. 30, Technical Evaluation
Report, at 7.
(sections deleted)
The protest is denied in part and
dismissed in part. (Ascella
Technologies Inc. B-412679, B-412679.2: Apr 27, 2016)
As noted, one of the RFP’s evaluation
factors required offerors to demonstrate their experience
in performing services of a similar nature to those being
solicited. RFP at 375. The evaluation factor specifically
required offerors to demonstrate their prior experience in
performing one or more prior contracts for a minimum of
one year as either a prime contractor, or as a joint
venture, and precluded consideration of an offeror’s prior
experience if that experience had been gained as a
subcontractor, or had been gained by the offeror’s
proposed key personnel. Id. The RFP’s instructions to
offerors provided as follows:
The Offeror must demonstrate
experience as either a PRIME CONTRACTOR OR JOINT
VENTURE within the last 5 years from the date of
this solicitation providing contract services that are
similar to the scope of Operations and
Maintenance, custodial, and grounds maintenance
services being required by this solicitation.
Experience as a SUBCONTRACTOR or KEY PERSONNEL will not
be considered . . . .
RFP at 367 (emphasis in original).
The record shows that SDS submitted five examples of its
experience. AR, exh. 8, SDS Proposal Excerpt. In
evaluating the SDS proposal, the evaluators concluded that
four of SDS’s experience examples could not be considered
because they were for less than the 1-year duration
specified in the RFP for experience examples. AR, exh. 14,
SSEB Report, at 28. The evaluators did consider SDS’s
remaining experience example and concluded that it was
very relevant, and for work very similar to the solicited
requirement; based on these conclusions, the evaluators
assigned the SDS proposal a good rating for the experience
factor. Id. at 28-29. The record shows that this
experience example was performed by a joint venture known
as DL JV; the agency credited SDS with this experience
because one of its joint venturers, Dae Sung, was one of
the DL JV joint venturers. AR, exh. 12, SDS Experience
Evaluation, at 2.
QSI argues that the RFP only permitted consideration of
the experience of the offeror itself and, since SDS has no
experience as a joint venture, its proposal should have
been found unacceptable under the experience factor.[3]
QSI argues that the agency improperly evaluated SDS’s
experience by considering the experience of DL JV, even
though SDS did not perform the contract either as a prime
contractor in its own right, or as a joint venture
comprised of the two concerns that make up SDS, Dae Sung,
LLC, and Siege Enterprises, LLC.
We agree with the protester that the agency improperly
credited SDS with the experience of DL JV. As noted above,
the RFP expressly limited the offerors’ experience
examples to those where “the offeror” was either the
“prime contractor” or “joint venture” performing the
experience example. RFP at 367. Here, the contract
considered by the agency in evaluating SDS was one
performed not by “the offeror,” SDS, but by an entirely
different entity, DL JV. While it is true that one of
SDS’s joint venturers also was a member of DL JV, that
fact does not meet the RFP’s criteria for consideration; that
is, the actual offeror here, SDS, was not the joint
venture that had performed the experience example.
We point out that, in contrast to the evaluation of SDS,
the agency apparently used a strict interpretation of the
RFP in its evaluation of QSI. The record shows that three
of QSI’s experience examples had been performed by QSI’s
major subcontractor, [DELETED]. AR, exh. 9, QSI Proposal
Excerpt. GSA apparently concluded that it was required to
exclude consideration of these experience examples--even
though [DELETED] had been the prime contractor performing
them--because it determined that [DELETED] was not the
“prime contractor” or a “joint venture” proposing to
perform the solicited requirement. AR, exh. 13, QSI’s
Experience Evaluation, at 5-7. (Quality
Services International, LLC, B-410156, B-410156.2,
B-410156.3: Nov 3, 2014)
As an initial matter, we note that
GPO procurements are governed by GPO’s Printing
Procurement Regulation (PPR),[3] rather than the Federal
Acquisition Regulation (FAR). Nevertheless (just as in the
case of ordinary quotations and purchase orders under the
FAR), GPO purchasing regulations specify that a quotation
is not an offer; rather, the quotation is informational,
the purchase order is the offer, which the vendor may
accept expressly or through performance (unless the
purchase order specifies the means of acceptance). PPR
Part VII, § 4; see generally FAR § 13.004.
P&S argues that GPO applied an unstated evaluation factor
by failing to allow P&S a reasonable time--two hours--to
review and confirm its quotation, and that GPO lacks a
valid basis to issue the purchase order to a higher-priced
vendor. Protest at 3-4.
GPO argues that P&S should have expected a
review-and-confirm call between 2:00 p.m. (when quotations
were due), and 4:00 p.m. (when, GPO explains, its business
day ends). In this regard, the agency points to a
statement in the RFQ that “Furnished Material will be
available for pickup by 08/06/2014.” RFQ at 1. The agency
argues that this statement implied that it planned to
issue the order on August 6 between the 2:00 p.m. deadline
and GPO’s 4:00 p.m. closing time.
The parties agree that GPO has a practice of making
review-and-confirm calls to vendors. This
review-and-confirm process is not addressed in the PPR,
however. The agency argues that its process has practical
benefits, such as alerting the agency to the possibility
that a vendor may no longer wish to receive the order. On
the other hand, GPO has not shown that the process is
required (or even anticipated) by the PPR or any other GPO
policy.
It is not the purpose of our bid protest function to
squelch the use of innovative techniques by contracting
officials. When a protest is filed, however, we will
review the record to assess whether an agency has complied
with requirements for competition, has conducted the
procurement consistent with regulations and the terms of
its solicitation, and has treated vendors fairly. See,
e.g., Tipton Textile Rental, Inc., B-406372, May 9, 2012,
2012 CPD ¶ 156 at 12 (although agency was not obliged to
conduct discussions in simplified acquisition, once it did
so, it was required to treat competitors fairly); Russell
Enters. of N. Carolina, Inc., B-292320, July 17, 2003,
2003 CPD ¶ 134 at 3 (GAO reviews simplified acquisitions
to ensure that the procurements are conducted consistent
with a concern for fair and equitable competition and with
the terms of the solicitation).
GPO’s actions violate a fundamental premise of government
procurements: that offerors must be advised of the bases
upon which their proposals (or quotations, as here) will
be evaluated. H.J. Group Ventures, Inc., B-246139, Feb.
19, 1992, 92-1 CPD ¶ 203 at 4. Specifically, it was
unreasonable for the contracting specialist to provide P&S
approximately 48 minutes to respond to a telephone call,
when nothing in the RFQ alerted P&S that the agency’s
process anticipated such a short response time. Further,
there is nothing in the record suggesting that P&S would
fail to deliver the pens just as its quotation submitted
several hours earlier indicated it would. In short, GPO
points to nothing that would cast doubt on P&S’s ability
or intention to accept and perform the order. See AeroSage
LLC, B-409627, July 2, 2014, 2014 CPD ¶ 192 at 5.
Based on our review of the contemporaneous record and
GPO’s explanations to our Office, the agency’s decision to
effectively reject P&S’s quotation, and make award to
KD8 at its higher price, lacks a valid legal basis. As P&S
does not challenge the review-and-confirm process itself,
we express no view on its use per se. Rather, the record
reflects that GPO did not provide notice that, as a
condition for award, the successful vendor would have to
respond affirmatively to a review-and-confirm call in an
unusually short time of less than an hour.
We are not persuaded by GPO’s arguments that, as an
experienced contractor, P&S should have been able to infer
what the agency intended to do, or that the language in
the RFQ regarding “Furnished Material . . . available for
pickup by 08/06/2014,” RFQ at 1, provided notice to
vendors that they should be prepared to confirm their
interest to GPO shortly after the RFQ closing time. If GPO
wanted such a short “review and confirm” period, it should
have made its intentions known in the solicitation, so
that vendors would be on notice that GPO would “move on”
to another vendor if no response was received in that
time. GPO issued the order to a higher-priced vendor
without a valid legal basis, so we sustain the protest.
(Premiums & Specialties, Inc.
B-410247: Nov 13, 2014)
Here, we conclude that the evaluation
was consistent with the terms of the RFP, and not premised
on unstated evaluation criteria. As reviewed above, the
PWS requires a staffing model that delivers required field
services/deskside support. RFP at 40. Regarding the
submission of proposals, the RFP established that the
proposals should address “the extent to which the offeror
proposes a workforce with the requisite skillsets,
knowledge and capabilities that demonstrates the proposed
staffing is adequate and sufficient to perform the PWS
requirements.” Id. at 145. The RFP further advised
offerors that USCIS [U.S. Citizenship and Immigration
Services] would evaluate various aspects of staffing,
including the extent to which an offeror’s proposal
presented a “staffing plan, organizational structure and
model capable of supporting Program Management,” the
“extent to which the offeror proposes a workforce with the
requisite skillsets, knowledge and capabilities in support
of their technical solutions,” and would conduct “an
assessment of the offeror’s proposed labor categories and
mix that details the sufficiency and adequacy as it
relates to their proposed technical solution.” RFP at 145.
In this context, the protester incorrectly argues that the
agency was prohibited from evaluating whether an offeror’s
proposed staffing approach was adequate to fulfill the
requirements of the PWS for each USCIS location, or from
assigning weaknesses if an offeror’s proposal failed to
provide sufficiently detailed staffing information to
allow for such an assessment. Further, Harris’s contention that the agency’s analysis was improper because offerors were not required to include a “staffing plan” with their proposals is misplaced. As the agency points
out, the required post‑award “staffing plan” was a
specific document addressing the certifications of various
personnel and did not provide for the submission of the
offeror’s detailed approach to staffing to fulfill the
terms of the PWS. Rather, we conclude that the offeror’s
staffing model was to be provided in the proposal and
evaluated by the agency. See, e.g., RFP at 40, 145.
In sum, although the RFP may not have specifically
directed offerors to substantiate the adequacy of their
staffing models to meet the requirements of the PWS on a
site-by-site basis, the RFP did clearly provide for an
evaluation of the capability of the staffing models to
meet the PWS requirements, which included providing
support at various USCIS field office and service center
locations. In evaluating proposals, an agency properly may
take into account specific, albeit not expressly
identified, matters that are logically encompassed by, or
related to, the stated evaluation criteria. Independence
Constr., Inc., B-292052, May 19, 2003, 2003 CPD ¶ 105 at
4. It is an offeror’s responsibility to submit a proposal
that establishes its capability and the technical merits
of its proposed approach, and allows for a meaningful
review by the procuring agency in accordance with the
evaluation terms of the solicitation. See Verizon Fed.,
Inc., B-293527, Mar. 26, 2004, 2004 CPD ¶ 186 at 4.
(Harris Corporation,
B-409869: Sep 4, 2014)
Logistics first contends that the National Geospatial-Intelligence Agency (NGA) improperly abandoned
the stated evaluation scheme when reviewing the proposals
of Logistics and AC4S under the technical/management
factor. We agree. Based upon our review of the
contemporaneous record, we find that the agency failed to
follow the solicitation’s stated scheme in evaluating
proposals under the personnel subfactor, and we sustain
the protest on this basis. We find no merit to Logistics’
remaining challenges to the agency’s technical/management
evaluation.
As discussed above, the solicitation stated that the
agency would assign one of five adjectival ratings to
proposals under each of the three subfactors of the
technical/management factor: outstanding, good,
acceptable, marginal, or unacceptable. The solicitation
provided that the personnel subfactor would be evaluated
based upon “the extent that” the proposed personnel meet
the solicitation’s requirements. RFP at 122.
In its report on this protest, NGA acknowledged that the
highest assessment an offeror could achieve under the
personnel subfactor was “Meets the Standard.” Supp. AR
(Sept. 5, 2013), at 16. In this regard, the agency
concluded that it would not assign any minor, major, or
significant strengths under this subfactor, and instead
would only assign weaknesses if an offer failed to meet
the solicitation’s requirements. Id. The agency further
explained that, in its view, “[a]n offeror may propose
personnel that are above the solicitation requirements,
but that does not [mean] those people will perform
better.” Id. Further, the agency “did not give more credit
for people [with] more experience/education because more
experience/education was not considered to provide any
advantage for the employee performance of the positions
under this [contract].” Id. For these reasons, the agency
explained, the maximum rating an offeror could receive for
this evaluation subfactor was “good,” which denoted that
the offeror had met the minimum requirements; offerors
were not eligible to receive an outstanding rating, as
indicated in the solicitation. Id. at 16.
Where, as here, the RFP provides for award on a best value
basis, as opposed to selection of the lowest-priced,
technically acceptable offer, the evaluation of proposals
cannot be limited to determining whether a proposal is
merely technically acceptable. Systems Research &
Applications Corp.; Booz Allen Hamilton, Inc., B-299818
et al., Sept. 6, 2007, 2008 CPD ¶ 28 at 24. Rather,
proposals must be evaluated to identify their relative
qualities under the stated evaluation factors, including
the degree to which technically acceptable proposals
exceed the stated minimum requirements, or will better
satisfy the agency’s needs. Id.
When evaluation criteria are written in the manner
utilized here by the agency, and where proposals are to be
evaluated for technical merit on a qualitative basis, an
offeror can reasonably expect that a proposal that exceeds
the minimum requirement will receive a more favorable
evaluation than one that merely meets the requirement. See
Trijicon, Inc., B-244546, Oct. 25, 1991, 91-2 CPD ¶ 375 at
5. Based upon our review, we conclude that the terms of
this solicitation clearly advised offerors that
technically superior proposals would be favored over
proposals that merely met the minimum solicitation
requirements. See RFP at 122.
In addition, our review of the record, and the agency’s
own statements, lead us to conclude that the agency’s
assessment under the personnel subfactor failed to follow
the solicitation’s evaluation scheme. Specifically, the
RFP did not advise offerors of the agency’s view that an
offer to exceed the solicitation’s personnel requirements
provided no benefit to the government, and did not advise
offerors they would not receive credit for exceeding the
minimum requirements of the solicitation. As a result, the
agency’s evaluation did not consider whether the
protester’s proposal offered any advantages based on the
experience and capabilities of Logistics’ proposed staff,
or whether this experience provided a basis to
discriminate between Logistics’ and AC4S’s proposals under
this subfactor. Instead, the agency simply found that both
offerors met the minimum requirements. As a result, the
proposals were viewed as equal in technical merit, despite
any differences that may have existed in this area.
We also find that Logistics was prejudiced by the agency’s
improper evaluation. As discussed above, the agency did
not advise offerors that it would not assign a higher
rating for a proposal that exceeded the solicitation’s
minimum requirements regarding personnel. Since the agency
failed to consider the relative merits of the offerors’
proposed personnel, we have no basis to conclude whether
this review would or would not have changed the agency’s
view of this competition. See Kellogg, Brown & Root Servs.,
Inc.--Recon., B-309752.8, Dec. 20, 2007, 2008 CPD ¶ 84 at
5. Accordingly, we sustain the protest on this basis.
(Logistics 2020, Inc.
B-408543, B-408543.3, Nov 6, 2013)
Coburn asserts that the VA improperly evaluated Coburn’s
experience/capability based on an unstated evaluation
criterion. Protest at 8-9. The protester argues that the
RFP did not require offerors to submit a list of
subcontractors, but that the RFP only requires the
successful contractor to submit a list of subcontractors
for each individual task order. Id. The protester points
out that failure to submit a subcontractor list was the
sole weakness assessed against Coburn’s proposal under the
experience/capability factor, and it argues that, but for
this improper assessment, Coburn would have received a
higher rating under that factor, and overall, because the
agency identified other strengths in Coburn’s proposal.
Comments at 5. Coburn contends that the VA’s best value
determination and source selection decision was flawed
because it was based on improper technical evaluations.
Id. at 2-4; Protest at 9-10.
The VA concedes that the RFP did not identify submission
of a subcontractor list as an evaluation criterion. AR at
7-8; Contracting Officer’s (CO) Statement at 1. However,
the agency argues that, contrary to the protester’s
belief, Coburn’s negative past performance rating, not its
failure to include a subcontractor list, contributed to
Coburn’s overall satisfactory rating. The agency suggests
that the protester was not prejudiced by the SSEB’s
assessment of a weakness under the experience/capability
factor, because it was not assessed as a “significant
weakness” and because it did otherwise impact Coburn’s
overall rating. See AR at 7-8, 11.
In reviewing protests of an agency's evaluation, our
Office does not reevaluate proposals, rather, we review
the evaluation to determine if it was reasonable,
consistent with the solicitation’s evaluation scheme, as
well as procurement statutes and regulations, and
adequately documented. Wackenhut Servs., Inc., B-400240,
B-400240.2, Sept. 10, 2008, 2008 CPD ¶ 184 at 6; Cherry
Road Techs.; Elec. Data Sys. Corp., B-296915 et al., Oct.
24, 2005, 2005 CPD ¶ 197 at 6.
Here, the RFP explicitly stated that “[t]he contractor
shall submit a list of subcontractors to be used on each
individual task order.” RFP at 17 (emphasis added). As the
protester points out, and the agency acknowledges, the RFP
did not require offerors, in their proposals, to list
their proposed subcontractors. Nor did the RFP state that
the agency would evaluate proposals in that regard. As
described above, the solicitation indicated that the
agency would evaluate a list of items that offerors were
to provide in their technical proposals. Id. at 9-11.
Specifically, the RFP enumerated six criteria under the
experience/capability evaluation factor that offerors were
to address in their proposals: experience and capability
managing construction projects; proposed key personnel;
current workload and staffing; quality control; infection
control; and safety. Id. at 10. Nowhere does the RFP
instruct or otherwise suggest that offerors were to
identify or list proposed subcontractors, either under the
experience/capability evaluation factor, or the past
performance and price factors.
Contrary to the agency’s arguments, it is uncertain
whether Coburn’s experience/capability rating, and ultimately Coburn’s overall rating, would have remained the same, where the only weakness assessed against Coburn’s proposal was improper, and where the proposal was otherwise assessed multiple strengths. In other words, we
have no basis--and we decline the VA’s invitation--to
speculate about how the SSA would have viewed the relative
merits of Coburn’s and Abrams’ proposals; we cannot
conclude that the agency’s source selection decision would
not have proceeded differently given the paucity of
information in the contemporaneous record regarding the
agency’s cost/technical trade-off. Since we resolve any
doubts regarding competitive prejudice in favor of the
protester, and since the protester has shown a reasonable
possibility that it was prejudiced by the VA’s action, we
sustain Coburn’s protest that the agency applied an
unstated evaluation criterion in evaluating the
protester’s technical proposal. See Kellogg, Brown & Root
Servs., Inc.--Recon., B-309752.8, Dec. 20, 2007, 2008 CPD
¶ 84 at 5. (Coburn
Contractors, LLC B-408279.2, Sep 30, 2013)
Cost Savings Evaluation
Next, NPP and INPS protest that the agency’s evaluation of
the offerors’ proposed cost savings failed to comply with
the provisions of the solicitation. More specifically, the
protesters assert that the agency’s source selection
decision ignored section M of the solicitation wherein the
agency committed to assess the feasibility and size of each offeror’s proposed cost savings. We agree.
It is a fundamental procurement principle that agencies
must evaluate proposals consistent with the terms of a
solicitation and, while evaluation of offerors’ proposals
is generally a matter within the procuring agency’s
discretion, our Office will question an agency’s
evaluation where it is unreasonable, inconsistent with the
solicitation’s stated evaluation criteria, or
undocumented. Exelis Systems Corp., B-407111 et al., Nov.
13, 2012, 2012 CPD ¶ 340 at 5; Public Commc’ns Servs.,
Inc., B-400058, B-400058.3, July 18, 2008, 2009 CPD ¶ 154
at 17. Contracting officials do not have the discretion to
announce in the solicitation that they will use one
evaluation plan, and then follow another without informing
offerors of the changed plan. Kumasi Ltd./Kukawa Ltd. et
al., B-247975.7 et al., May 3, 1993, 93-1 CPD ¶ 352 at 7.
Further, while source selection officials may reasonably
disagree with the evaluation ratings and results of
lower-level evaluations, they are nonetheless bound by the
fundamental requirements that their independent judgments
be reasonable, consistent with the stated evaluation
factors, and adequately documented. AT&T Corp.,
B-299542.3, B-299542.4, Nov. 16, 2007, 2008 CPD ¶ 65 at
16; AIU N. Am., Inc., B-283743.2, Feb. 16, 2000, 2000 CPD
¶ 39 at 8-9.
Here, in evaluating offerors’ proposals under one of the
most heavily weighted evaluation factors, management
approach/cost savings, the solicitation expressly stated
that the agency: (1) would evaluate the feasibility and
quality of an offeror’s management/cost savings approach,
and (2) would “also evaluate and assess the feasibility
and the size of the proposed cumulative savings to the
Government.” RFP at 334. The record establishes that the
SSA’s source selection decision failed to reflect any
meaningful assessment of the feasibility and size of
proposed cost savings.
As discussed above, although he performed no independent
cost savings analysis, the SSA elected to consider
feasible every cost savings dollar proposed by every
offeror--effectively ignoring the documented contrary
conclusions of the agency’s own financial management
specialists. COSF/MOL for NPP Protest at 21; COSF/MOL for
INPS Protest at 16; Tr. at 356. Specifically, in addition
to considering “feasible” portions of the offerors’ cost
savings that the [cost savings advisory committee] CSAC
had specifically determined to be “not reasonable,” the
SSA did not consider the CSAC’s documented conclusion that
a significant portion of the awardee’s proposed cost
savings were “NOT clearly documented,” appeared to
constitute “soft savings,” or were otherwise unsupported.
See AR, Tab E.9, Cost Savings Summary. Further,
notwithstanding the CSAC’s documented concerns regarding
inadequate support, the agency declined to seek additional
information from the offerors during discussions.
In reviewing this matter, we have considered, and
rejected, various agency arguments and rationalizations
purporting to justify the agency’s actions. By way of
overview, the agency asserts that (1) the solicitation did
not require quantification of feasible cost savings; (2)
requests for additional information from the offerors were
not warranted; (3) the solicitation’s “gateway” provision
eliminated the requirement to consider the feasibility and
size of proposed cost savings; and (4) the protesters were
not prejudiced by the agency’s actions. We discuss each of
these assertions below.
First, the agency maintains that the SSA appropriately
determined that all of the cost savings proposed by all of
the offerors were feasible because the solicitation “never
contemplated that the agency would attempt to quantify--to
the point where a specific dollar amount was identified
for each offeror--the proposed cost savings to be
achieved.” NNSA Post-Hearing Brief, Apr. 2, 2013, at 5.
We agree that the solicitation did not require the agency
to calculate a “specific dollar amount” of feasible cost
savings for each offeror. However, by providing that the
agency would “evaluate and assess the feasibility and the
size of the proposed cost savings," the solicitation required the agency, at a minimum, to make a reasonable assessment of
the relative magnitude of feasible cost savings proposed
by each offeror. That obligation was not met by the SSA’s
blanket conclusion that all proposed cost savings by every
offeror were feasible, nor by the SEB’s conclusion that
all proposed cost savings, except those the CSAC had
specifically determined to be unreasonable, were feasible.
While we agree that something less than calculation of a
“specific dollar amount” was required, the unsupported
assumptions by both the SEB and the SSA were inadequate
and unreasonable.
Next, with regard to its failure to seek additional
information from the offerors during discussions, the
agency maintains that its inaction was appropriate because
“we would have [had] to go back and ask for [information
for] all 161 initiatives” and, in any event, the agency
“did not think [the offerors] could give us the
information.” Tr. at 1120.
Regarding the amount of additional information at issue,
the agency has provided no explanation as to why it
believes it would have had to seek additional information
for all 161 initiatives--since the CSAC clearly identified
a portion of those initiatives as being reasonable and
supported. With regard to the agency’s contention that the
offerors would be unable to provide additional supporting
information, we think the agency’s contention is
illogical. On the one hand, the agency maintains that the
cost savings are feasible while, on the other hand, it
contends that the very offerors who proposed those
“feasible” cost savings will be unable to provide further
explanation and support for them. Finally, the agency’s
assertion regarding the unavailability of supporting cost
savings information appears to be inconsistent with the
record here. As noted above, the majority of the offerors’
proposed cost savings relate to workforce reductions. NNSA
Post-Hearing Brief at 14. In this regard, the SSA’s source
selection decision specifically states that, “under the
M&O model . . . personnel/HR costs are specific, hard
numbers that are base-lined and traceable.” AR, Tab G.1,
Source Selection Decision, at 11. On the record before us,
we find unpersuasive the agency’s position that requests
for additional information from the offerors were
unwarranted.
Next, the agency maintains that the solicitation’s
“gateway” provision effectively excused the agency from
performing a meaningful pre-award analysis of the offerors’
proposed cost savings. NNSA Post-Hearing Brief at 17, 20;
see RFP at 21. Under this provision, in order to be
eligible for exercise of a contract option, the awardee
must have achieved, by the end of the third year, at least
80 percent of the cost savings that it proposes to achieve
by that time. The agency maintains that the SSA reasonably
concluded that all offerors’ proposed cost savings were
feasible because the offerors knew that the contract will
not be extended beyond the 5-year base period if the
awardee fails to achieve a significant portion of the
savings proposed. Accordingly, the agency maintains that
the solicitation’s “gateway” provision established a
safeguard against the offerors’ proposals of unrealistic
cost savings. We disagree.
Even if the agency enforces the solicitation’s “gateway”
provision, the safeguard on which the agency relies has no
effect until the end of the five-year base period, when
the agency may decline to exercise a contract option. At
that point, the agency will have spent a significant
portion of the $22.8 billion estimated value of this
contract. The purpose for performing the required cost
savings evaluation before award is to provide some level
of confidence that the source selection decision reflects
the savings that are most likely to be achieved.
Accordingly, the agency’s reliance on the solicitation’s
“gateway” provision, which takes effect only after five
years of contract performance, does not absolve the agency
of the requirement to meaningfully assess the feasibility
and size of the proposed cost savings prior to award.
Finally, the agency asserts that there is no prejudice to
the protesters because they were all treated equally.
Among other things, the agency contends that NPP actually
benefitted from the agency’s approach because it was
[redacted]. See AR, Tabs E.11 at 6, E.12 at 6.
On the record here, there is no reasonable basis to
determine whether little, much, or most of each offeror’s
proposed cost savings are, in fact, feasible. For example,
the record provides virtually no basis for determining
that more than [redacted] percent of the awardee’s
proposed cost savings are feasible. On the record here, we
have no basis to determine whether a meaningful analysis,
based on additional information, would have established
that significantly higher portions of the protesters’ cost
savings were feasible. Accordingly, we reject the agency’s
assertion that there is no potential prejudice to the
protesters.
In conclusion, we sustain the protests based on the
agency’s failure to reasonably evaluate the feasibility
and size of the offerors’ proposed cost savings, as
required by the terms of the solicitation. (Nuclear
Production Partners LLC; Integrated Nuclear Production
Solutions LLC, B-407948, B-407948.2, B-407948.3,
B-407948.4, B-407948.5, B-407948.6, B-407948.7,
B-407948.8, Apr 29, 2013)
Caduceus also
claims that the agency used unstated evaluation criteria
in identifying the following as weaknesses in the
protester’s proposal: its failure to offer a dedicated
quality control manager, to identify any specific quality
control plan, and to devise quality control metrics
related to patient outcomes. Caduceus asserts that the RFP
did not require offerors to provide this information.
Although agencies are required to identify in a
solicitation all major evaluation factors, they are not
required to identify all areas of each factor that might
be taken into account in an evaluation, provided that the
unidentified areas are reasonably related to or
encompassed by the stated factors. Master Lock Company,
LLC, B-309982.3, Dec. 10, 2008, 2009 CPD ¶ 6 at 5.
We do not find that the Army used unstated evaluation
criteria in evaluating Caduceus’s proposal. With regard to
the lack of a dedicated quality control manager, the RFP
specifically informed offerors that their quality control
plans would be evaluated for the ability to resolve and
prevent quality assurance problems. RFP at 70. Addressing
project quality management in its proposal, Caduceus
indicated that its project director has overall quality
control and project management responsibility, and stated
that, to properly monitor performance, Caduceus’s project
director performs [DELETED]. AR, Tab 15, Protester’s
Proposal, Vol. II, at 6. In its evaluation, the Army
questioned the plausibility of having one person be
responsible for the competing (and potentially
conflicting) demands of operation and quality assurance
requirements, and concluded that it was “not an ideal
solution for an effective [quality assurance] program”—in
short, a weakness. AR, Tab 17, Consensus Technical
Evaluation, at 4. The agency explains, and the protester
does not dispute, that a “basic [tenet] of Quality Systems
is the separation of production/management and quality
control/assurance responsibilities.” Id. Caduceus’s
proposal suggested that the same person would be
responsible for these roles, and did not indicate anyone
else would be assigned to manage quality assurance or
quality control, and the agency took that information into
account in evaluating the protester’s approach. In our
view, the agency’s consideration of the protester’s
approach to managing or staffing its quality control plan
is reasonably encompassed within the quality control plan
evaluation subfactor. (Caduceus
Healthcare, Inc., B-407791, Feb 21, 2013)
Price Evaluation
Colt argues that the Army did not apply the 5-percent royalty
adjustment in the solicitation to the other offerors’ total
evaluated prices. Instead, Colt contends that the agency
improperly applied the royalty adjustment only to certain
portions of the proposed prices, based on an undisclosed
interpretation of the license agreement between the government
and Colt.
In response, the Army argues that the terms of the license
agreement mandate the application of the royalty only to the
portions of the M4/M4A1 carbine that are proprietary to Colt.
Additionally, the agency alleges that Colt’s interpretation of
the solicitation is unreasonable, and is inconsistent with the
license agreement the solicitation provision is implementing.
As shown by the Army’s response, this dispute requires a review
of the license agreement between the government and Colt--which
was not a part of this solicitation. During the course of this
protest, the Army provided a copy of the license agreement,
which provides as follows:
[DELETED] by GAO
AR, Tab 19, M4 Carbine Addendum to Technical Data Sales and
Patent License Agreement, Article VII Royalties, at 13-14.
Our review of the record shows that the Army did not apply a
5-percent royalty adjustment to the total proposed prices for
the non-Colt offerors. Instead, as Colt has argued, the agency
interpreted the solicitation in conjunction with the license
agreement, quoted above. AR, Tab 11-1-1, SSEB Price Evaluation,
at 4.
In general terms, the agency identified those parts of the
M4/M4A1 carbine that were proprietary to Colt, and applied the
5-percent royalty adjustment only to the costs associated with
those parts. Id. Specifically, the agency concluded that
[DELETED] of the carbine’s parts were proprietary to Colt, or
partially proprietary to Colt, and that only the portion of the
price related to those parts was therefore subject to the
5-percent royalty adjustment. Id. The agency then referenced
historical data from a unit price database to assign a price to
each M4/M4A1 component. Using these prices, the agency
calculated a royalty base expressed as a percentage of the
historical price of the carbine that is proprietary to Colt. Id.
at 5. For example, CLIN 0003 was assigned a proprietary
percentage of 49.41 percent, meaning 49.41 percent of the
carbine’s historical price for CLIN 0003 was determined to be
for parts that were proprietary to Colt. Id. at 5. As a result
of this calculation, the agency multiplied the prices of the
other offerors by 49.41 percent, and applied the 5-percent
royalty to the result. It was this number that was added to the
non-proprietary portion of the other offerors’ prices to
calculate a total price for each CLIN. Each CLIN, inclusive of
royalty, was then added together to arrive at a total evaluated
price for each offeror.
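For illustration only, the following sketch contrasts the two readings of the royalty provision. The proposed CLIN price is a hypothetical figure; only the 49.41 percent proprietary share and the 5 percent royalty rate appear in the record.

# Hypothetical sketch (Python) contrasting the agency's royalty calculation
# with Colt's reading of the RFP. The proposed price is invented; only the
# 49.41 percent proprietary share and 5 percent royalty come from the record.
proposed_price = 1_000_000.00   # a non-Colt offeror's proposed price for one CLIN (hypothetical)
proprietary_share = 0.4941      # share of the historical price the agency deemed proprietary to Colt
royalty_rate = 0.05             # royalty rate stated in the RFP

# Agency's approach: apply the royalty only to the proprietary portion.
proprietary_portion = proposed_price * proprietary_share                  # 494,100.00
non_proprietary_portion = proposed_price - proprietary_portion            # 505,900.00
agency_evaluated_price = (non_proprietary_portion
                          + proprietary_portion * (1 + royalty_rate))     # 1,024,705.00

# Colt's reading: apply the 5-percent royalty to the total proposed price.
colt_evaluated_price = proposed_price * (1 + royalty_rate)                # 1,050,000.00

print(agency_evaluated_price, colt_evaluated_price)

Under either reading the adjustment scales with the offeror's price; the dispute turns entirely on the royalty base, that is, the whole proposed price under Colt's interpretation or only the portion the agency attributed to Colt's proprietary parts under the agency's approach.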
In essence, this dispute arises from a blurring of two distinct
concepts--a price adjustment provision set out in a
solicitation, and a royalty payment provision between the
government and Colt designed to compensate Colt for the use of
its proprietary technology. While the Army’s calculation may (or
may not) accurately represent the amount due Colt under the
license agreement, the solicitation’s price adjustment provision
provided no notice to Colt--or other offerors--that the
provision would be applied in this way.
In cases where a dispute exists between the parties as to the
actual meaning of a particular solicitation provision, our
Office will resolve the matter by reading the solicitation as a
whole and in a manner that gives effect to all its provisions;
to be reasonable, an interpretation of a solicitation must be
consistent with such a reading. The Boeing Co., B-311344 et al.,
June 18, 2008, 2008 CPD ¶ 114 at 34.
The plain language of the RFP stated that “[t]he royalty rate is
5%” and that, for offerors who do not own or have licenses for
the Colt technology, their proposed prices “will be evaluated by
adding an amount equal to the royalty to [their] proposed
prices.” RFP at 88. We agree with Colt that the only reasonable
interpretation of the RFP would be to apply the 5-percent
royalty to an offeror’s total evaluated price. The agency’s
interpretation is simply at odds with the plain meaning of the
solicitation language and, as such, is unreasonable. See
Raytheon Co., B-404998, July 25, 2011, 2011 CPD ¶ 232 at 17.
The RFP here did not indicate that the agency would perform an
analysis of the proprietary and non-proprietary parts, nor did
it provide notice to the offerors that the agency would
calculate a royalty base expressed as proprietary percentage for
each CLIN. Thus, the agency departed from the RFP’s price
evaluation criteria when it created its own undisclosed formula
to apply the 5-percent royalty to only those portions of the
M4/M4A1 carbine which it had independently determined were
proprietary to Colt. As stated above, we find that the
solicitation’s price evaluation clause provided no notice that
the evaluation would be conducted in such a manner.
To the extent the Army contends that Colt’s protest is an
untimely challenge to the terms of the solicitation, we
disagree. Specifically, the agency contends that Colt was aware
that the license agreement between the Army and Colt contained
language that was inconsistent with Colt’s interpretation of the
RFP. In effect, the agency argues that Colt’s special knowledge
of the license agreement created an ambiguity that obligated the
protester to challenge this solicitation prior to the time for
submitting proposals. See Bid Protest Regulations, 4 C.F.R. §
21.2(a)(1) (2012).
An ambiguity exists where two or more reasonable interpretations
of the terms or specifications of the solicitation are possible.
DynCorp International LLC, B-289863, B-289863.2, May 13, 2002,
2002 CPD ¶ 83. A patent ambiguity exists where the solicitation
contains an obvious, gross, or glaring error (e.g., where the
solicitation provisions appear inconsistent on their face),
while a latent ambiguity is more subtle. Ashe Facility Servs.,
Inc., B-292218.3, B-292218.4, Mar. 31, 2004, 2004 CPD ¶ 80 at
11. Where there is a latent ambiguity and both parties’ interpretations of the provision are reasonable, the
appropriate course of action is to clarify the requirement and
afford offerors an opportunity to submit proposals based on the
clarified requirement. Allied Signal, Inc., Elec. Sys., B-275032, B-275032.2, Jan. 17, 1997, 97-1 CPD ¶ 136 at 11.
Although, as noted above, we conclude that a solicitation’s
price adjustment clause and the appropriate calculation of a
royalty payment are two distinct concepts, if there is any
ambiguity here, it was latent, rather than patent. As a result,
we find that this protest is timely.
To the extent that the Army argues that Colt’s interpretation of
the RFP would render ineffective a portion of the license
agreement, we again disagree. Clearly Colt was aware of the
terms of the license agreement--as the Army argues--and Colt was
aware that the agreement stated that the royalty amount due to
Colt would be based on parts or components that are produced
from the proprietary portions of the carbine. However, the terms
of the license agreement are not at issue here. Put differently,
the RFP and license agreement need not be read in a manner that
puts either in conflict.[6] See The Boeing Co., supra, at 35.
We also find no merit to the Army’s contention that Colt should
have known that the agency would evaluate the other offerors’
prices in the manner employed by the SSEB. As discussed above,
the agency unilaterally identified the portions of the carbine
that are proprietary to Colt, used historical prices to assign a
price to these portions, calculated a percentage of the price
that could be considered tied to the proprietary items, and
applied that percentage to the offerors’ evaluated prices. There
is no dispute that the Army and Colt have yet to reach an
agreement on a per-weapon value of the non-proprietary portion
of the technical data package to be utilized to reduce the
royalty base. See Protester’s Response to Agency’s Dismissal
Request, at 6. Thus, Colt could not have known that the agency
would apply the 5-percent royalty as it did. We therefore
sustain the protest based upon the Army’s decision to evaluate
prices in a manner different from the approach established in
the solicitation. (Colt Defense,
LLC, B-406696, Jul 24, 2012)
Rocamar contends
that requiring its wall system to undergo a second fire/burn
test, after it began dismantling its wall system, was unfair and
improper. Rocamar argues that the second fire/burn test on the
rear non-threat side of the wall system was not contemplated by
the solicitation’s test provisions. Rocamar also notes that none
of the other offerors’ wall systems were required to pass a
second burn test on the non-threat side of their wall systems
after partial demolition. Email from DOS (May 7, 2012).
Our Office will review an allegedly improper technical
evaluation of product samples to determine whether the
evaluation was fair, reasonable, and consistent with the
evaluation criteria. We will not make an independent
determination of the merits of an offeror’s proposal; rather, we
will review the evaluation record, including the results of any
test demonstration, to ensure that the agency’s technical
judgment has a rational basis and is consistent with the stated
evaluation criteria. Optical Sys. Tech, Inc., B-296516.2,
B-296516.3, Mar. 17, 2006, 2006 CPD ¶ 63 at 5. Our Office
affords particular deference to the technical expertise of
agency personnel regarding judgments that involve matters of
human life and safety, and the conduct of qualification testing
is an area where contracting agencies have broad discretion so
long as their action is reasonable and does not prejudice
potential offerors by, for example, treating them unequally. See
Science Applications Int’l Corp., B-405612 et al., Dec. 5, 2011,
2012 CPD ¶ 8 at 7; Chemonics Indus., Inc., B-260284, Apr. 19,
1995, 95-1 CPD ¶ 206 at 4. As discussed below, we cannot
conclude that DOS fairly or reasonably conducted the second
fire/burn test of Rocamar’s proposed system.
Our review of the video tape of the testing process confirms
that the second fire/burn test was conducted by the DOS testing
official, after Rocamar began to dismantle the components of its
wall system. See VT at 27:47-36:34. Indeed, the protester notes
that the video shows that prior to the unannounced second
fire/burn test, the mesh was cut by bolt cutters approximately
210 times, materials were removed from the system with shovels,
and the containment material was cut with a razor. See Protester’s
Supp. Comments (May 8, 2012) at 5-6; Rocamar President Affidavit
(May 15, 2012) at 6.
As noted, the agency argues that such action was justified
because the testing official observed that Rocamar’s wall system
was constructed with a different material on the rear wall. See
DOS Reply Comments (May 1, 2012) at 2. However, there is nothing
in the RFP that would reasonably suggest that testing would be
performed on anything other than the fully assembled system. See
RFP, amend. No. 0009, attach. 8, § 2.1.2. Nor was any other
offeror subjected to a similar test of its system.
In a sworn statement, Rocamar’s president, who is a licensed
engineer, attributes the failure of its wall system to pass the
DOS’s second burn test to the loss of the wall system’s
compaction resulting from partially dismantling the wall system.
He also states that the demolition directly compromised the
structural integrity of its system and caused the wall system to
lose its compaction, which was a key component of the design of
its system. He explains that the pre-test demolition allowed
excess oxygen into Rocamar’s wall system, which caused the fill
inside the protester’s system to move outside the structure and
the system to fail the test. Affidavit of Rocamar’s
President (May 15, 2012) at 6-7. The agency has not refuted the
protester’s engineering analysis of the design of its system and
what would happen if there was a loss of compaction in the
system. Based on this record, we find that DOS’s second
fire/burn test of Rocamar’s wall system was neither fair nor
reasonable, and sustain the protest on this basis.
We recommend that the agency review its test results to
determine whether Rocamar’s system should be considered
acceptable. The agency should also review the test results of
the other offerors’ systems to determine whether further testing
is required to determine their acceptability. In the event the
agency determines that Rocamar’s system requires further
testing, we recommend that Rocamar be provided an opportunity to
provide its system for testing and that it be tested in
accordance with the RFP and consistent with the procedures used
in testing the other offerors’ systems. We also recommend that
the protester be reimbursed for the costs of filing and pursuing
its protest, including reasonable attorneys' fees. 4 C.F.R. §
21.8(d)(1) (2012). The protester should submit its certified
claim for such costs, detailing the time expended and costs
incurred, directly with the agency within 60 days of receiving
this decision. 4 C.F.R. § 21.8(f)(1). (Rocamar
Engineering Services, Inc., B-406514, Jun 20, 2012) (pdf)
Key Personnel
Experience Evaluation
Y&K protests that the Army unreasonably found that the awardee’s
proposal satisfied the RFP’s minimum requirements for key
personnel experience at a TJC accredited hospital. Protest at
7-15; Protester’s Comments at 3-5. Specifically, the protester
contends that the awardee’s two proposed area managers do not
meet the requirement for 4 continuous years of experience
working at a TJC accredited hospital. Rather, the protester
asserts that the proposed managers only have experience at
ambulatory facilities, that is, outpatient clinics. Protester’s
1st Supp. Comments at 3-5. Moreover, the protester notes that,
although identified ambulatory clinics had at one time been
accredited by TJC, the TJC accreditation standards for
ambulatory care clinics are different than those for hospitals.
Protester’s 2nd Supp. Comments at 2.
Agencies are required to evaluate proposals based solely on the
factors identified in the solicitation, and must adequately
document the bases for their evaluation conclusions. Intercon
Assocs., Inc., B-298282, B-298282.2, Aug. 10, 2006, 2006 CPD ¶
121 at 5. While agencies properly may apply evaluation
considerations that are not expressly outlined in the RFP, where
those considerations are reasonably and logically encompassed
within the stated evaluation criteria, there must be a clear
nexus between the stated criteria and the unstated
considerations. Global Analytic Info. Tech. Servs., Inc.,
B-298840.2, Feb. 6, 2007, 2007 CPD ¶ 57 at 4. Although we will
not substitute our judgment for that of the agency, we will
question the agency’s conclusions where they are inconsistent
with the solicitation criteria, undocumented, or not reasonably
based. Sonetronics, Inc., B-289459.2, Mar. 18, 2002, 2002 CPD ¶
48 at 2, 3.
SEM’s proposal provides resumes for the two individuals proposed
as its area managers. The first shows experience at
the Moody Air Force Base (AFB) Medical Treatment Facility (MTF),
Georgia, for 13 years from 1998. The second shows experience at
the Cannon AFB MTF, New Mexico, for 3 years from 2008. See SEM’s
Proposal, Key Personnel Resumes, at 33, 35. The Army contends
that this shows that SEM’s proposed area managers satisfied the
key personnel experience requirements, although the agency
acknowledges that the Moody and Cannon AFB MTFs are not
currently TJC accredited. Supp. AR at 2. The Army states,
however, that the Moody AFB MTF was accredited from 1998 to
2005, and that the Cannon AFB MTF was accredited from 1998 to
2006.[8] See Supp. AR at 2-3; encl. 1, TJC Accreditations,
Cannon AFB, Moody AFB.
As discussed above, the RFP required offerors to propose area
managers/foremen who had a minimum of 4 continuous years of TJC
experience working at a TJC accredited hospital. RFP amend. 1 at
5-6; RFP at 81-83. Offerors were advised that key personnel
resumes would be evaluated to ensure that the offeror proposed
personnel that were qualified and had actual experience working
at a TJC accredited hospital and, in this regard, offerors were
required to provide documentation describing the facilities at
which key personnel had worked, including the facility’s bed
count and square footage. See RFP at 81-83, 89-90.
Here, as the protester points out, SEM’s proposal and the
resumes provided for its proposed area managers do not show that
the awardee’s area managers have experience working at a TJC
accredited hospital, as required by the RFP. SEM’s Proposal, Key
Personnel Resumes, at 25, 33-35. Rather, the resumes plainly
indicate that each worked at an Air Force MTF. The accreditation
certificates for each of these facilities show that they have
been accredited by TJC as Ambulatory Health Care facilities, not
as hospitals. Supp. AR, encl. 1, TJC Accreditations, Cannon AFB,
Moody AFB. TJC has different accreditation standards for
hospitals and for ambulatory care facilities, reflecting that
hospitals provide for inpatient, confined care, whereas
ambulatory facilities do not. See TJC, Hospital Accreditation
Standards, 2009, at 470; TJC, Standards for Ambulatory Care, at
402 (ambulatory healthcare is defined as “[a]ll types of health
services provided to patients who are not confined to an
institutional bed as inpatients during the time services are
rendered”). Moreover, we note that SEM did not demonstrate in
its proposal, as required by the RFP, that the facilities at
which the proposed area managers had worked were hospitals, and
did not include documentation of, among other things, the
facilities’ square footage and bed count. See RFP at 81, 90.
The record also shows that, even accepting the Air Force MTF
facilities as hospitals, one of the awardee’s proposed area
managers would still not satisfy the RFP’s requirements for 4
years of continuous experience at a TJC accredited facility.
That is, the resume for the area manager who had experience at
Cannon AFB MTF, only shows 3 years of experience at this
facility. In addition, the record shows that this facility was
not accredited by TJC during the time he was employed there.
As a result, we conclude that the agency did not evaluate SEM’s
proposed key personnel consistent with the terms of the RFP, and
we sustain the protest on that basis. See, e.g., Mechanical
Contractors, S.A., B-277916, Oct. 27, 1997, 97-2 CPD ¶ 121 at 4
(protest sustained where agency failed to evaluate certification
relevant to RFP’s specialized experience subfactor). (Y&K
Maintenance, Inc., B-405310.6, Feb 2, 2012) (pdf)
FULL OPERATING
CAPABILITY (FOC)
IBM asserts that the agency improperly credited Jacobs’s
proposal, but not IBM’s proposal, for offering to achieve FOC
sooner than required. There are two elements to this aspect of
IBM’s protest. First, according to the protester, the RFP makes
no mention of FOC, either in terms of what the phrase means, or
in terms of when it was supposed to have been achieved. IBM
therefore asserts that the agency improperly used an unstated
evaluation factor. Second, IBM asserts that the agency erred in
concluding that Jacobs, but not IBM, offered to achieve FOC on
an accelerated basis. In this regard, IBM asserts that while the
agency evaluators found that Jacobs had proposed to achieve FOC
[deleted] months “ahead of schedule,” they improperly failed to
credit IBM with its proposal to achieve FOC within [deleted]
months of being awarded the contract.
In response, the agency maintains that it properly gave Jacobs
credit for proposing to achieve FOC in an accelerated fashion.
According to the agency, achieving FOC was an implicit
requirement of the solicitation that was properly considered
under the technical/management evaluation factor. Furthermore,
asserts the agency, Jacobs in fact proposed to achieve FOC
earlier than did IBM.
As a general rule, agencies are required to advise offerors of
the evaluation criteria against which proposals will be
evaluated. Although agencies are not required to identify each
and every element encompassed within the solicitation’s
evaluation scheme, unstated evaluation considerations must
reasonably be subsumed within the stated considerations.
Mnemonics, Inc., B-290961, Oct. 28, 2002, 2003 CPD ¶ 39 at 6.
Here, the RFP neither defined FOC nor established a schedule for
achieving it. Further, the stated evaluation approach in the RFP
did not provide for consideration of FOC, nor was such
consideration reasonably subsumed within the stated criteria.
As an initial matter, the term FOC is not defined in the
solicitation. The only specific references to FOC appear in
certain materials disseminated at an industry day meeting prior
to issuance of the RFP, and in the agency’s evaluation and
source selection materials. Specifically, the agency’s industry
day briefing materials included a slide entitled “ITMO
[information technology management office] Approach,” which
provided as follows:
Provide the ITIL [information technology infrastructure
library] expertise
--Initial Operating Capability (IOC) –1 April 2011
--Full Operating Capability (FOC) –1 April 2013.
AR, exh. 6, at 49. Among the industry day questions and answers
was the following specific question and answer:
Q: Will the requirements for IOC/FOC be defined/allocated to
IOC/FOC?
A: USSOCOM expects to define/achieve Initial Operating
Capability (IOC) by 1 April 2011. USSOCOM expects to achieve
Full Operating Capability (FOC) no later than 1 April 2013.
AR, exh. 14, at 24.
However, these specific pre-solicitation references to FOC were
not reflected in the subsequently issued RFP. Specifically,
there was no RFP definition of what constitutes FOC and no RFP
schedule requirement for achieving FOC at a particular time.
Further, there was no provision in the RFP for evaluation of FOC,
nor were any of these considerations raised during discussions.
Notwithstanding the absence in the RFP of either a definition
of, or schedule for, achieving FOC, the record shows that this
was a prominent consideration in the agency’s evaluation of the
Jacobs proposal as well as in its source selection decision. In
the agency’s final technical evaluation report, the evaluators
specifically note, with respect to the Jacobs proposal:
The commitment to achieve the Final Operating Capability for
process maturity [deleted] months ahead of schedule will
enable the Government to focus on performance and process
improvements that much quicker. The offeror’s commitment and
corporate investment of [deleted] on these tools will
significantly enhance the ITMO’s ability to implement the ITIL
management processes and increases the confidence of the
offeror’s claim to reach FOC [deleted] months ahead of
schedule.
AR, exh. 27, at 29. Similarly, the briefing materials presented
to the agency’s source selection authority (SSA) also favorably
referred to Jacobs’ evaluated ability to achieve FOC in
[deleted] versus [deleted] months. AR, exh. 26, at 18. Those
same briefing materials include a comparison of the Jacobs and
IBM proposals, noting as a strength in favor of award to Jacobs,
the agency’s conclusion that Jacobs would achieve FOC [deleted]
months earlier than IBM. Id. at 41. The agency’s source
selection advisory council also made specific favorable
reference to Jacobs’ alleged proposal to achieve FOC in an
accelerated manner. AR, exh. 29, at 1.
Finally, the SSA, in his tradeoff analysis, after discussing the
strengths in the Jacobs proposal (including its evaluated
ability to achieve FOC [deleted] months ahead of IBM),
specifically stated as follows:
The increased cost resulting from choosing Jacobs over IBM is
less than [deleted] per year Average ($[deleted] Total) and
represents less than [deleted] percent increase in the MPC
[most probable cost]. The benefit of achieving FOC ahead of
schedule provided by the Jacobs proposal will have a
significant impact on the management and oversight of all
SITEC efforts that more than justify this increase in MPC.
Based on this, I have determined that the proposal provided by
Jacobs provides significantly more value and benefits to the
Government than IBM.
AR, exh. 30, at 6.
The record thus shows that the evaluators and SSA acted on the
assumption that there was a schedule for implementing FOC
included in the RFP, and that Jacobs’ evaluated advantage with
respect to achieving FOC represented the principal technical
discriminator between the IBM and Jacobs proposals. However, as
noted, the RFP did not define or specifically refer to FOC; did
not establish a schedule for achieving FOC; and did not provide
for evaluation of offerors’ schedules for FOC.
For the reasons discussed above, we sustain IBM’s protest that
the agency improperly applied an unstated evaluation factor. As
a final matter, as noted, the agency and IBM disagree concerning
what constitutes FOC and whether or not IBM proposed to achieve
it by a certain point in time. We need not resolve this dispute,
since our recommendation below renders these considerations
immaterial. We point out, however, that this disagreement
between the agency and IBM serves to underscore the absence of
an established, objective definition of, and schedule for
achieving, FOC in the solicitation. (IBM
Global Business Services, B-404498, B-404498.2, Feb 23, 2011)
(pdf)
APEX protests the
agency’s technical evaluation, arguing that the agency did not
evaluate its proposal in accordance with the RFP’s stated
evaluation criteria and, in this respect, complains that the RFP
did not require offerors to identify staff deployments at each
installation or propose tools, supplies, and equipment. See
Protest at 17; Comments at 15-17.
The Army argues that APEX’s technical proposal was found
unacceptable because it did not comply with the RFP’s
requirements and did not adequately address the technical
acceptability factors. See AR at 10-11. In this regard, the
agency asserts that the protester failed to provide adequate
details for evaluation, specifically, a staffing plan, proposed
supplies and equipment, or quality control procedures. See id.
at 7, 9-10.
In reviewing protests challenging the evaluation of proposals,
we do not conduct a new evaluation or substitute our judgment
for that of the agency, but examine the record to determine
whether the agency’s judgment was reasonable and in accord with
the solicitation’s evaluation criteria. Abt Assocs., Inc.,
B-237060.2, Feb. 26, 1990, 90-1 CPD ¶ 223 at 4. Where a
protester challenges the agency’s evaluation and source
selection, we will review the evaluation and award decision to
determine if they were reasonable, consistent with the
solicitation’s evaluation scheme, as well as procurement
statutes and regulations, and adequately documented. See, e.g.,
Wackenhut Servs., Inc., B-400240, B-400240.2, Sept. 10, 2008,
2008 CPD ¶ 184 at 6; S4, Inc., B-299817, B-299817.2, Aug. 23,
2007, 2007 CPD ¶ 164 at 9.
Based on our review of the record here, we agree with the
protester that the agency evaluated its technical proposal and
rejected it as unacceptable based on unstated criteria. As set
forth above, the RFP required technical proposals to
describe--in no more than 15 pages--the offeror’s proposed
plans, processes, procedures, and systems, and provide all
supporting data and documentation. RFP at 119. No further
substantive instructions were provided in this regard. More
specifically, the RFP did not request that offerors identify
their proposed staff deployments at each installation or the
tools, supplies and equipment that would be used. Id. at 118-33.
Rather, the PWS requires the contractor to submit--after
award--work schedules and a list of all supplies and equipment
needed to perform the requirements.[7] See PWS at 13-14, 16.
Moreover, the PWS requires the contractor to coordinate cleaning
schedules with the agency and, in this regard, requires the
contractor to provide work schedules at least 30 calendar days
prior to the month of performance. Id. at 7, 16. Under these
circumstances, we do not think that an offeror would reasonably
understand that it was required to propose staff deployments for
each of the three medical facilities, as well as tools, supplies
and equipment, particularly here where the PWS requires such
details be coordinated with, or approved by, the agency after
contract award.
Accordingly, we find that the agency’s determination that APEX’s
proposal was unacceptable because it failed to propose staff
deployments at each installation or supplies and equipment, was
based on unstated evaluation criteria, and we sustain the
protest. See, e.g., DRS ICAS, LLC, B-401852.4, B-401852.5, Sept.
8, 2010, 2010 CPD ¶ 261 at 10-11 (agency applied unstated
evaluation criteria when it assessed a technical weakness based
on protester’s failure to identify and address specific threats
and technologies for future systems upgrades, and offerors would
not reasonably realize that they should submit such information,
where RFP only required offerors to address how the
“architecture” of their proposed systems would accommodate
upgrades and systems growth); RJO Enter., Inc., B-260126.2, July
20, 1995, 95-2 CPD ¶ 93 at 13-16 (agency applied unstated
evaluation criteria when it downgraded protester’s proposal for
failure to submit resumes for certain labor skill levels where
RFP did not require resumes for every skill level and
contemplated that contractor would propose staffing plans and
labor mixes for future task orders). (APEX-MBM,
JV, B-405107.3, October 3, 2011) (pdf)
Unstated
Evaluation Criteria
The solicitation included three sample task orders (STO 1, STO
2, and STO 3), the responses to which were to be evaluated as
follows:
The Sample Tasks are designed to test the offeror's expertise
and innovative capabilities to respond to the types of
situations that may be encountered in performance. . .
Accordingly, the offerors will not be given an opportunity to
correct or revise a sample task response.
The evaluation of each sample task will consider the following:
(1) Understanding of Problems‑‑The proposal will be evaluated
to determine the extent to which the offeror demonstrates a
clear understanding of all features involved in solving the
problems and meeting the requirements presented by the sample
task; and the extent to which uncertainties are identified and
resolutions proposed.
(2) Feasibility of Approach‑‑The proposal will be evaluated to
determine whether the offeror's methods and approach to
meeting the sample task requirements provided the Government
with a high level of confidence of successful completion. The
evaluation will also consider the realism of the labor
categories being proposed in the offeror's response to the
sample tasks.
RFP sect. M.2.C.
DSS was rated acceptable for STO 1, unacceptable for STO 2, and
good for STO 3. DSS asserts that the agency used unstated
criteria in evaluating its responses to the STOs. Specifically,
DSS notes that for each STO the agency compiled a list of task
areas and subtasks that an offeror was required to address to
receive evaluation credit. We have reviewed this assertion and
find it without merit. We discuss the issue with respect to STO
2 below.
The solicitation required offerors under STO 2 (software
development) to design, develop and field a single financial
system to replace three current systems used by approximately
100,000 VA personnel. Offerors were required to describe their
approach to how they would execute all tasks necessary for this
effort, and list the labor categories required to perform each
effort of the task. RFP sect. J, attach. 15,
STO 2.
As part of its evaluation of offerors' STO responses, the agency
prepared a list of key focus areas and lower level focus areas
that it believed an offeror would have to address to demonstrate
that it understood the task and had a feasible approach to
solving it. With respect to STO 2, the key focus areas and lower
level focus areas were: project management (stakeholders,
resources/labor categories, project schedule, and project
management processes including change management, configuration
management, risk management, quality assurance, communications,
tools, and integrated product teams); policy understanding
(compliance with government policy and standards for a technical
reference model, VA Enterprise Architecture (EA), Office of the
Management and Budget financial guidelines, and security
policies); software solution (business process gap analysis,
system architecture design, development process that takes into
account system capacity requirements, and data
warehouse/reporting capability); data migration (data migration
planning, data requirements sessions with legacy system
owners/users, data cleansing strategy, and migration process);
training (training strategy and plan, developing training
materials, training environment, and training delivery); testing
(test planning, methods for test support, conducting tests, and
providing test activities); and deployment/post go-live support
(support planning, fielding strategy, help desk, and support
teams). Technical Factor Ratings at 18. The agency asserts that
the above considerations were related to the stated evaluation
criteria and thus were properly considered.
While solicitations must inform offerors of the basis for
proposal evaluation, and the evaluation must be based on the
factors set forth in the solicitation, Federal Acquisition
Regulation sect. 15.304, agencies are not required to
specifically list every area that may be taken into account,
provided such areas are reasonably related to or encompassed by
the stated criteria. AIA-Todini-Lotos, B-294337, Oct. 15, 2004,
2004 CPD para. 211 at 8.
Here, the key focus and lower level focus areas that the agency
considered in evaluating STO 2 were reasonably related to or
encompassed by the stated criteria, and thus were not improper
unstated evaluation criteria. As noted, the solicitation
provided that the offeror's response to the sample task would be
evaluated to determine whether the offeror understood the
problem and had a feasible approach to solving it. RFP sect.
M.2.C. Understanding and feasibility of approach, however, are
not factors that can be evaluated on their own, without
reference to the work to be accomplished. In this regard, STO 2
instructed offerors to describe how they would execute all tasks
necessary for the effort. RFP sect. J, attach. 15, STO 2. It
follows then, that offerors were on notice that, in evaluating
understanding and feasibility of approach, the agency was going
to consider the tasks it believed were necessary to perform the
sample task. DSS has furnished no basis for us to conclude that
such areas considered by the agency as project management,
policy understanding, testing and training, and the elements
that comprise these broader areas such as stakeholder
involvement and compliance with government policies, were not
reasonably related to performing the overall STO 2. (DSS
Healthcare Solutions, LLC, B-403713.3, June 22, 2011)
(pdf)
MEP takes issue
with multiple aspects of the agency's evaluation, arguing, among
other things, that the agency failed to evaluate vendors'
quotations under the resumes factor as contemplated by the RFQ.
We sustain MEP's protest on this issue.
When an agency conducts a formal competition under the FSS
program for award of a task order contract, we will review the
agency's actions to ensure that the evaluation was reasonable
and consistent with the terms of the solicitation. The MIL
Corp., B-297508, B-297508.2, Jan. 26, 2006, 2006 CPD para. 34 at
4.
MEP argues that the agency did not evaluate vendors' quotations
under the resumes factor as provided for by the terms of the RFQ.
As noted above, the solicitation required vendors to submit
resumes for several key personnel and provided for evaluation of
each resume against the SOW requirements. In addition, the RFQ
provided that each resume would receive its own rating, that
resumes failing to meet minimum requirements would be rated as
unacceptable, and that the agency would determine an overall
rating under the resume factor by averaging the rating of each
panel member for each resume. RFQ, at 62.
The agency concedes that it did not evaluate the resumes in the
manner described by the RFQ. Contracting Officer's (CO)
Statement at 6. Instead, the agency explains that due to an
"administrative oversight," the evaluators were given an
incorrect evaluation standard for the resumes factor. That is,
rather than being advised of the above standard, the evaluators
were instructed simply to verify that the vendors had furnished
resumes for the key personnel positions. Applying this
instruction, the evaluators essentially reviewed the resumes
factor on a pass/fail basis--i.e., they merely determined
whether vendors had provided resumes for their key personnel and
did not further review the resumes to determine any strengths or
weaknesses for the purpose of determining a rating under the
resumes factor. Since both MEP and SAIC submitted the required
resumes, they both simply received an [deleted] rating. Id. The
fact that the evaluators did not further evaluate vendors'
quotations with respect to the resumes factor is further
confirmed by the fact that each evaluator worksheet for this factor
is completely blank, with no documentation of any evaluation or
assignment of relative strengths or weaknesses. See AR, Tab 17,
MEP worksheets, at 4, 8, 12; Tab 18, SAIC worksheets, at 4, 18,
12.
Notwithstanding this clear deviation from the evaluation
criteria established by the RFQ, the agency attempts to excuse
its admitted error by suggesting that it effectively considered
the qualifications of the vendors' key personnel under the
management plan factor, which provided for consideration of the
relevant experience of the proposed management team in relation
to the SOW.
The agency's analysis conflates two evaluation factors that the
RFQ established as separate and distinct from one another, and,
in so doing, undermines the significance of the resumes factor.
By considering the resumes factor as subsumed under the
management plan factor, rather than assigning it the separate
adjectival rating and weight provided for in the RFQ, the agency
conducted its evaluation in a manner that was contrary to the
evaluation scheme expressly established by the RFQ. (Mission
Essential Personnel, LLC, B-404218.2; B-404218.3, June 14,
2011) (pdf)
Wood Cuts
complains that the agency's assessment of its proposal under the
cabin design and materials evaluation factor was unreasonable
because the agency did not credit Wood Cuts for its offer to use
environmentally preferable products (EPPs) in its alternative
design. In this regard, Wood Cuts argues that the agency was
required by statute and regulation to apply a preference for
"young-growth" timber over the use of "old-growth" timber, such
as Alaskan Yellow cedar. See Comments at 3-6. In support of its
arguments, Wood Cuts cites to the Multiple-Use Sustained-Yield
Act of 1960 (MUSYA), Pub. L. No. 86‑517, 74 Stat. 215, 16 U.S.C.
sections 528-531 (2006), and FAR subpart 23.7, Contracting for
Environmentally Preferable Products and Services.
In reviewing protests against allegedly improper evaluations,
our Office examines the record to determine whether the agency's
evaluation was, in fact, in accord with the stated evaluation
factors. Computer Prods., Inc., B-284702, May 24, 2000, 2000 CPD
para. 95 at 4‑5; Computer Assocs. Int'l, Inc., B‑292077.3 et
al., Jan. 22, 2004, 2004 CPD para. 163 at 6.
Here, the record shows that the agency evaluated Wood Cuts' and
Ketchikan's proposals in accordance with the evaluation criteria
identified in the solicitation. Although the protester contends
that it should have received enhanced credit for offering (in
its alternative design) to use EPPs, the RFP did not provide
that offerors would receive enhanced credit for using EPPs.[2]
In fact, offerors were specifically informed (as Wood Cuts
itself recognized in its basic proposal) that the agency
preferred the use of Alaska Yellow Cedar. Wood Cuts' objections
are grounded upon its belief that statutes and regulations, such
as MUSYA and FAR subpart 23.7, require the agency to give an
evaluation preference for the use of EPPs, even where the
solicitation's evaluation criteria do not provide for such a
preference. There is no merit to this argument. As an initial
matter, there is no provision in the cited statute or regulation
that requires the use of an evaluation preference in any
particular procurement. In any event, it is a fundamental
principle of federal procurement law that a contracting agency
must evaluate all offerors' proposals against the solicitation's
stated evaluation criteria. See, e.g., Source Diversified, Inc.,
B‑403437.2, Dec. 16, 2010, 2010 CPD para. 297 at 6. (Wood
Cuts, B-403960.3, May 19, 2011) (pdf)
Retail Clean
complains that the Army evaluated unstated evaluation criteria
and improperly awarded the contract to Farrington. Retail Clean
asserts that the agency gave Farrington credit for providing
additional services beyond what was required by the
solicitation, that is, stripping and waxing floors more
frequently than was required by the solicitation, and shampooing
carpets when only vacuuming was required. Protest at 9.
We review an agency's evaluation to ensure that it is fair and
reasonable, and consistent with the evaluation criteria stated
in the solicitation and with applicable procurement statutes and
regulations. Godwin Corp., B-290291, June 17, 2002, 2002 CPD
para. 103 at 3.
As noted above, the awardee proposed to strip and wax the floors
four times per year, and shampoo the carpet twice a year. AR,
Tab 6, Farrington's Quotation, at 24. The protester offered to
strip and wax the floors only once a year, and scrub the floor
and reapply the wax one other time during the year. AR, Tab 5,
Retail Clean's Final Quotation, at 103. The protester did not
offer to shampoo carpeted areas.
Although the protester is correct that the solicitation did not
identify desired frequencies for performing the required tasks,
the solicitation nonetheless contemplated that the frequency of
performing tasks would be evaluated. As stated above, the RFQ
required vendors to provide the frequency for performance of
each task as a part of the vendor's quality control plan, RFQ at
49, and vendors were specifically advised that it was the
vendor's responsibility to provide a "frequency chart"
identifying the frequency with which tasks would be performed.
RFQ amend. 1 at 3 (Question No. 11). In addition, the RFQ
specifically required shampooing of the carpeted areas. RFQ at
18. Based on this RFQ, we find reasonable the agency's
consideration of the frequency of performance, as well as
whether carpets would be shampooed. (Retail
Clean Management Systems, B-403651; B-403651.2, November 18,
2010) (pdf)
PMO-JV filed the
current protest with our Office on July 1 challenging its
elimination from the competition. PMO-JV argues that FTA failed
to follow the solicitation criteria for evaluation because the
RFP did not require the submission of cost or pricing data, and
yet, PMO-JV was evaluated against, and excluded for, not meeting
the requirements of FAR sect. 15.408, Table 15-2, which only
apply when cost or pricing data are required. See RFP at 88. We
agree.
The RFP advised that FTA would evaluate and assess offerors'
cost proposals for cost reasonableness and realism, and to
determine that all items proposed have been included in
accordance with FAR Subpart 15.4-Contract Pricing. RFP at 98.
The RFP also expressly advised that because the contracting
officer expected adequate price competition, "this action is
therefore exempt from the requirement for submission of cost or
pricing data."[6] RFP at 88. Instead of cost or pricing data,
the RFP requested information other than cost and pricing data
and required, "all Offerors (prime and subcontractors) to submit
a budget summary for the entire contract period of
performance."[7] Id. The RFP informed offerors that the
requested information was necessary to determine the adequacy of
the offeror's proposal, but noted that such information was "not
considered cost or pricing data, and will not require
certification in accordance with FAR [sect.] 15.406-2." RFP at
89. The RFP also stated if the contracting officer concluded
there was insufficient information available to determine price
reasonableness and none of the exceptions described in FAR sect.
15.403‑1 applies, then cost or pricing data shall be obtained.
Id.
In accordance with the terms of the solicitation, PMO-JV did not
submit cost or pricing data with its cost proposal, nor did it
submit data in the format specified at FAR sect. 15.408, Table
15-2. PMO-JV instead submitted other than cost or pricing data
on Attachment J‑6, Contract Pricing Summary, with supporting
back-up material, and a budget summary as requested by the RFP.
However, the FTA contracting officer limited DCAA's review of
PMO's cost proposal to verifying whether the data was presented
as required by FAR sect. 15.408, Table 15‑2. This was improper
because the use of these requirements is only appropriate where
cost or pricing data is required by the solicitation. We also
note that this table was neither referenced nor incorporated
into the RFP, and there is nothing in the RFP to put offerors on
notice that the agency would evaluate cost proposals against FAR
15.408, Table 15-2; to the contrary, the solicitation expressly
stated that cost or pricing data was not required.
As indicated, DCAA's constrained adequacy review found various
inadequacies in PMO's cost proposal because supporting data
required by FAR sect. 15.408, Table 15‑2 was not included. For
example, DCAA reported that PMO-JV's Contract Pricing Summary
Sheet (Attachment J-6) failed to include a price analysis of all
subcontractor proposals and a cost analysis for subcontract
proposals exceeding the threshold for cost or pricing data
($650,000), as required by FAR 15.408, Table 15‑2, II.A.(2).
However, the RFP's cost proposal instructions did not indicate
that PMO-JV had to conduct and submit such analyses.
An agency may not induce offerors to prepare and submit
proposals based on one premise, then make source selection
decisions based on another. DynCorp, B‑245289, B-245289.2, Dec.
23, 1991, 91-2 CPD para. 575 at 6; Hattal & Assocs., B-243357,
B‑243357.2, July 25, 1991, 91-2 CPD para. 90 at 7. The problems
found by DCAA were based upon FAR requirements that are only
applicable when cost or pricing data is required. Because the
RFP expressly provided that cost or pricing data was not
required, and because the RFP did not otherwise indicate that
the data should be presented in this format, the agency's
evaluation of PMO-JV's cost proposal was unreasonable.
Moreover, the record, which includes numerous audits of the cost
proposals of the other offerors (including the 18 awardees),
shows that none of these cost proposals were evaluated for
adequacy based on the instructions contained in FAR sect.
15.408, Table 15-2. It is fundamental that the contracting
agency must treat all offerors equally, and therefore it must
evaluate offers evenhandedly against common requirements.
Tidewater Homes Realty, Inc., B-274689, Dec. 26, 1996, 96-2 CPD
para. 241 at 3.
Furthermore, the agency's prohibition on DCAA communications
with PMO-JV concerning the adequacy of its submitted cost data
appears inconsistent not only with DCAA practice, but with FAR
sect. 15.404-2(d), which states:
The [administrative contracting officer] or the auditor, as
appropriate, shall notify the contracting officer immediately
if the data provided for review is so deficient as to preclude
review or audit. . . . The contracting officer immediately
shall take appropriate action to obtain the required data.
Should the offeror/contractor again refuse to provide adequate
data, or provide access to necessary data, the contracting
officer shall withhold the award . . .
In this case, we think that questions about the adequacy of the
submitted cost data should have been a subject of dialogue
between the agency (or DCAA) and PMO-JV before that firm's
proposal was rejected for this reason, particularly given that
the previous awards under this solicitation were made over a
year ago.
The protest is sustained. (PMO
Partnership Joint Venture, B-403214; B-403214.2, October 12,
2010) (pdf)
Wackenhut
challenges the agency's evaluation of proposals under the
management approach factor and argues that the source selection
decision was not reasonable. Specifically, Wackenhut complains
that the agency's evaluation of Coastal's approach for providing
relief to guard workers was inconsistent with the RFP, and that
offerors' proposals were evaluated disparately under the
management approach factor.
An agency is required to evaluate proposals in a manner that is
consistent with the terms of the solicitation, and to evaluate
offers on a common basis. Contingency Mgmt. Group, LLC; IAP
Worldwide Servs., Inc., B-309752 et al., Oct. 5, 2007, 2008 CPD
para. 83 at 10. Where the agency's evaluation does not adhere to
the solicitation's evaluation criteria, or where proposals are
evaluated disparately, our Office will sustain the protest. Id.
at 10-11. Our review of the record confirms that the agency
failed to properly evaluate proposals here.
Despite the clear direction provided by the agency in answers to
questions from potential offerors about not using supervisors
for relief, Coastal's proposal indicated that the firm would use
on-site supervisors to provide relief for guard workers at
outlying sites at times when the supervisors are not performing
supervisory duties. AR, Tab 11, Coastal's Technical Proposal, at
13. As noted above, the agency credited this as a benefit to the
government that not only met, but exceeded the requirements of
the RFP. AR, Tab 13, Technical Executive Summary, at 66-67. The
agency specifically stated, with regard to Coastal's approach,
that "relief will be provided in accordance with the contract's
requirements," and it cited Coastal's approach as an
"innovation" that contributed to the proposal's highly
acceptable rating. Id.
The agency contends that it was reasonable to credit this aspect
of the proposal because, although the RFP prohibited supervisors
from providing relief, Coastal's use of supervisors here was
limited to time when these personnel were acting in a
non-supervisory capacity, and were not being billed at
supervisory rates. Supplemental AR at 9. In the agency's view:
since the person who will be providing relief will be doing so
in a non‑supervisory capacity and will not be charging the
Government at supervisory rates, this does not conflict with
the terms of the solicitation and there is no merit to [Wackenhut's]
allegation.
Id.
We are unpersuaded by this argument. The solicitation was very
clear that the agency did not want supervisors to provide
relief. Wackenhut structured its proposal to provide relief to
guard workers without relying on supervisors,
and the agency specifically noted that Wackenhut "understand[s]
the [RFP] requirement that supervisors do not provide relief
('replacements')." AR, Tab 13, Technical Executive Summary, at
25. The agency took the opposite approach with regard to its
evaluation of Coastal's proposal when it ignored this
requirement and found that Coastal's plan to have supervisors
provide relief was "innovat[ive]," "creative," and "exceeds the
requirements" of the RFP. Id. at 66‑67.
Based on the record here, we find that the agency waived for
Coastal the requirement that supervisors cannot provide relief,
and that Wackenhut was prejudiced by this action. We sustain the
protest on this basis.
It also appears from the record that the agency may have
evaluated proposals disparately in other ways under the
management approach factor. In this regard, Wackenhut's proposal
contained some of the same features that the agency viewed as
advantageous in Coastal's proposal, but the record does not
indicate that Wackenhut received the same favorable
consideration in the evaluation. For example, as noted above,
the agency cited as support for Coastal's highly acceptable
rating that Coastal had its own training academy and a
transition plan that included hiring the incumbent workforce.
AR, Tab 16, Post Negotiation Memorandum, at 17. However, the
record shows that Wackenhut also had its own training academy in
the District of Columbia area, and its transition plan already
consisted of at least a portion of the incumbent workforce
because Wackenhut is the incumbent on the DEA headquarters
contract. The record does not explain why Wackenhut was found
only to have met the requirements of the RFP for these features,
yet Coastal was found to have exceeded the requirements. AR, Tab
13, Technical Executive Summary, at 25, 66-67; Tab 16, Post
Negotiation Memorandum, at 17. (Wackenhut
Services, Inc., B-402550.2, June 7, 2010) (pdf)
TrailBlazer
challenges the agency's technical and cost evaluations in two
principal respects. First, TrailBlazer argues that CMS
improperly credited Highmark with staffing reductions based on
the agency's belief that Highmark would be able to reduce the
suspense rate through its use of [DELETED]. This was improper,
according to TrailBlazer, because the RFP established the
suspense rate as a mandatory workload assumption and CMS was
therefore required to evaluate Highmark's technical and cost
proposals based on its ability to perform at the RFP's 25
percent suspense rate, not some lower value.
(Sentence deleted).
Suspense Rate
Where a protest challenges an agency's evaluation, we will
review the evaluation record to determine whether the agency's
judgments were reasonable and consistent with the stated
evaluation criteria and applicable procurement statutes and
regulations. W. Gohman Constr. Co., B-401877, Dec. 2, 2009, 2010
CPD para. 11 at 3. Further, when an agency evaluates a proposal
for the award of a cost-reimbursement contract, as in this case,
the agency must perform a cost realism analysis to determine the
extent to which an offeror's proposed costs are realistic for
the work to be performed. FAR sect. 15.404-1(d)(1). An offeror's
proposed estimated costs are not dispositive because, regardless
of the costs proposed, the government is bound to pay the
contractor its actual and allowable costs. FAR sections
15.305(a)(1); 15.404-1(d); Palmetto GBA, LLC, B-298962,
B-298962.2, Jan. 16, 2007, 2007 CPD para. 25 at 7. We review an
agency's evaluation in this area only to see that the agency's
cost realism evaluation was reasonably based and not arbitrary.
Hanford Envtl. Health Found., B-292858.2, B-292858.5, Apr. 7,
2004, 2004 CPD para. 164 at 9.
TrailBlazer asserts, and the record reflects, that Highmark's
lower evaluated cost, as compared to the cost of TrailBlazer's
proposal, was driven, in part, by Highmark's [DELETED] to
introduce greater automation with respect to claims processing.
One aspect [DELETED], as explained by Highmark's proposal, is
its ability to reduce the rate at which claims are
suspended--the suspense rate. In challenging CMS's evaluation of
Highmark's proposal, TrailBlazer does not argue that the
staffing reductions attributed to Highmark's use of increased
automation are not possible. See Protester's Comments on Supp.
Report, at 14. Rather, TrailBlazer argues that accepting
Highmark's reduced staffing based on the ability [DELETED] to
reduce the suspense rate was improper since the RFP clearly
required offerors to assume a 25 percent suspense rate. Id.
TrailBlazer contends, and the record confirms, that Highmark
used the 25 percent suspense rate workload assumption set forth
in the RFP as the basis for its proposed staffing levels. As
noted above, in responding to the concern raised by CMS during
discussions that its claims processing staffing was too low,
Highmark expressly reaffirmed its use of [DELETED] FTEs for the
claims processing work and that its staffing was predicated on
performing this function at the assumed 25 percent suspense rate
set forth in the RFP. See AR, Tab 25, Highmark Discussion
Response, at 1. TrailBlazer further argues--and the record
appears to support its contention--that CMS did not believe
[DELETED] FTEs would be adequate. See AR, Tab 21, TEP Report for
Business Proposals, at 1. Rather, CMS appears to have only
accepted Highmark's staffing levels due to Highmark's use of
[DELETED], which, as explained by Highmark in its proposal, has
the ability to reduce the suspense rate; and, as determined by
CMS, has the ability to reduce staffing levels by approximately
one half. Id.
Thus, the record supports one aspect of TrailBlazer's thesis,
that CMS credited Highmark with reduced staffing and cost
resulting from its use of [DELETED], and its resulting ability
to achieve a reduction in the suspense rate. What is not
supported, however, is TrailBlazer's premise that the agency's
evaluation considerations in this regard were contrary to the
terms of the RFP. While TrailBlazer correctly notes that the RFP
required all offerors to use the workload assumptions in
preparing their proposals, there is nothing in the RFP which
precluded CMS from considering each offeror's unique technical
approach or proposed innovations, and their impact on the
workload assumptions (in this case the suspense rate), and then
accounting for such impacts as part of its evaluation. To the
contrary, as noted above, the RFP expressly provided that CMS
would evaluate the extent to which offerors proposed
"innovations" and "efficiencies" with respect to the various
requirements to include claims processing. RFP at 93. Moreover,
pursuant to FAR sect. 15.404-1(d)(1), a cost realism analysis,
which CMS was required to perform, provides for independently
reviewing whether specific cost elements proposed by an offeror
are realistic for the work to be performed considering the
unique methods of performance described in the offeror's
proposal. Thus, having identified Highmark's use of [DELETED] as
an innovative, and viable, technical approach, it was
appropriate for CMS to consider the impact of this approach on
Highmark's staffing levels and thereby its overall cost to the
government.
TrailBlazer alternatively argues that even if it was appropriate
for CMS to consider Highmark's use of [DELETED] to reduce its
staffing, CMS failed to perform any meaningful analysis of
Highmark's ability to achieve the staffing reductions with which
it credited Highmark's proposal. In support of its position,
TrailBlazer relies on our decision in National City Bank of
Indiana, B-287608.3, Aug. 7, 2002, 2002 CPD para. 190. The
record, however, reflects that, unlike the agency's cost realism
evaluation in National City Bank, which consisted entirely of
conclusory statements regarding the viability of proposed cost
reductions, CMS expressly considered Highmark's [DELETED],
analyzed the degree to which it would reduce staffing, and
provided a contemporaneous explanation for its conclusions in
this regard. As noted above, the TEP determined that [DELETED]
had the potential to reduce staffing by half [DELETED]. AR, Tab
20, Addendum to Technical Evaluation Report, at 1; AR, Tab 22,
Addendum to TEB Report for Business Proposals, at 1. Since the
Section 1011 program is essentially a replica of the Medicare
Part A and B processing system, CMS reasonably had little doubt
that similar staffing reductions could be achieved. Moreover,
the evaluation record identifies several of the specific
features of [DELETED], which the agency believes would be useful
in reducing staffing, including, [DELETED]. Id. Based on this
record, we have no basis to conclude that the agency's
consideration of Highmark's staffing, in conjunction with its
use of [DELETED], was improper or otherwise unreasonable.
(TrailBlazer Health Enterprises, LLC,
B-402751; B-402751.2, July 20, 2010) (pdf)
With respect to
its evaluation, T-C challenges each of the deficiencies assessed
against its proposal under the technical evaluation factor--the
most heavily-weighted factor. We agree with the protester that
the evaluation of the technical factor was not based on the
criteria stated in the RFP, and that the criteria in the RFP
were not applied fairly to the information in the protester's
proposal. We will discuss the evaluation of the protester's
proposal under technical subfactor 1 to provide an example of
the flaws found throughout the evaluation of the technical
factor.
First, T-C argues that it was unreasonable for the VA to assess
both a strength and a deficiency for T-C's proposal on the same
issue--i.e., whether its transcriptionists would have experience
transcribing for physicians who were not native English
speakers. One of the strengths identified for T-C's proposal
explained that "[a]ll transcriptionists must have experience
with ESL (English as second language) physicians." AR, Tab 14,
Evaluation of T-C, at 1. At the same time, one of the
deficiencies identified for T-C's proposal under this same
subfactor was that the proposal "[d]oes not mention skill with
ESL (English as second language) physicians." There is nothing
in the contemporaneous record reconciling the apparent
contradiction in these assessments. When asked by our Office to
address the issue directly, the VA acknowledged that the
evaluation was erroneous on this point. We agree.
Second, T-C argues that it was unreasonably assigned a
deficiency under technical subfactor 1 for failing to provide a
resume for each transcriptionist, because the RFP did not
require proposals to include that information. Protester's
Comments at 3, 5. Our Office asked the VA to identify where the
RFP required this information. Fax from GAO to Parties, Aug. 3,
2009, at 2 (Question 5). In its supplemental report, the VA
responded that "[t]he factor referenced paragraph 10 of the PWS
which indicates that resumes will be provided." Thus, the VA
argued that it was proper for the VA to evaluate T-C's proposal
as "satisfactory," and eTrans's proposal as "marginal" because
those firms failed to provide resumes, and to evaluate K&R's
proposal as "excellent" because it did provide resumes. Supp. AR
at 2 (Answer 5). The RFP, as quoted above, does not support the
VA's argument on this point.
The RFP instructed offerors to include in their proposals
information to "indicate their ability to provide personnel that
meet the requirements of paragraph 10 of the PWS." RFP at 61. In
turn, paragraph 10 of the PWS described the certification and
experience required. It does not require that proposals include
resumes. The only reference to resumes is in the statement in
the PWS that "Curriculum vitae of employees will be provided to
each of the facilities." RFP at 16. The language in this
sentence, however, plainly describes an event that will occur in
the future ("will be"), and specifically after award of
particular locations ("to each of the facilities"). Therefore,
we think that this sentence described a performance requirement,
and not information that an offeror needed to provide in its
proposal. Since the RFP did not require offerors to submit a
resume for each employee, the downgrading of the protester's
proposal on this basis was improper.
In conclusion, although the examples above are taken from
subfactor 1, they demonstrate the flawed evaluation approach
taken by the VA evaluators with respect to the technical factor
generally. From our review of the record, we conclude that the
flawed evaluation also affected the other three subfactors.
(T-C Transcription, Inc.,
B-401470, September 16, 2009) (pdf)
Master Lock
challenges the award of 11 of the 13 remaining lock types to
Evergreen during the reevaluation. The protester argues that the
agency improperly considered the anticipated “demand” for each
lock type in its price/technical tradeoff. The protester also
argues, in the alternative, that if demand was a proper
evaluation criterion, the agency’s selection decision elevated
its importance in a manner inconsistent with the solicitation’s
stated evaluation scheme. We disagree with both of the
protester’s arguments.
Use of “Demand” as an Evaluation Criterion
First, Master Lock argues that DLA’s consideration of
anticipated demand for each lock type constituted an undisclosed
evaluation criterion. For the reasons discussed below, we
disagree.
Agencies are required to evaluate proposals based solely on the
factors identified in the solicitation, and must adequately
document the bases for their evaluation conclusions. Intercon
Assocs., Inc., B-298282, B-298282.2, Aug. 10, 2006, 2006 CPD
para. 121 at 5. While agencies properly may apply evaluation
considerations that are not expressly outlined in the RFP where
those considerations are reasonably and logically encompassed
within the stated evaluation criteria, there must be a clear
nexus between the stated criteria and the unstated
consideration. Global Analytic Info. Tech. Servs., Inc.,
B-298840.2, Feb. 6, 2007, 2007 CPD para. 57 at 4.
As discussed above, DLA’s selection decision considered the
anticipated volume of orders for each lock type, as identified
in the RFP. Specifically, the selection decision considered
whether offerors’ past performance demonstrated that they could
meet the “demand” for the higher-volume lock types. AR, Tab 10,
Revised SSAD, at 5. Master Lock argues that the solicitation did
not state that DLA would consider “demand” as an evaluation
factor, and that the agency had therefore departed from the
evaluation scheme.
The RFP stated that the agency would consider the overall
relevance of offerors’ past performance as follows: “Past
performance information will be used in the determination of the
best value to the Government. This evaluation process will focus
on information that demonstrates quality of performance relative
to the type, size and complexity of this acquisition.” RFP at
66. The RFP also stated that the delivery subfactor of the past
performance factor would evaluate offerors’ performance record
to assess their ability to deliver the required volumes of
locks. Specifically, the delivery subfactor stated as follows:
The Government will evaluate the offeror’s Delivery. We will
review the offeror’s record of on-time performance.
Accordingly, the offeror’s past performance record will be
evaluated to ensure its ability to handle the volume of sales
similar to the solicitation’s estimated requirements with
on-time delivery and to assess its experience in complex
contracting with an emphasis on Stock Delivery.
Id. at 67.
With regard to the two lock types for which the RFP anticipated
the highest demand, DLA concluded that Master Lock’s past
performance record provided the agency a high degree of
confidence that the offeror would be able to meet the
requirements for providing the high-demand lock types. Id. at 6.
With regard to the remaining 11 lock types, the agency concluded
that although Evergreen’s past performance record demonstrated a
“slightly higher performance risk” because the offeror did not
have as much experience with high-demand item delivery as
compared to Master Lock, this risk was acceptable because
Evergreen would be providing the lower-demand lock types. Id. at
6-7.
We think the solicitation advised offerors that their past
performance records would be evaluated based on relevance,
thereby making the offeror’s performance of higher and
lower-demand requirements relevant. We further think that the
agency’s consideration of the demand for each lock type and the
offerors’ delivery records was reasonably and logically
encompassed within the express terms of the delivery subfactor.
On this record we find no merit to the protester’s argument that
the agency used an undisclosed evaluation criterion.
Weight of “Delivery” Subfactor in the Selection Decision
Next, Master Lock argues that even if DLA properly considered
demand for each lock type in its evaluation of offerors’ past
performance, the agency placed an improper emphasis on this
criterion in a way that was inconsistent with the stated
evaluation scheme. For the reasons discussed below, we disagree.
As discussed above, the RFP stated that past performance was the
most important non-price evaluation factor. The RFP also stated
that the three past performance subfactors of delivery, business
relations/customer satisfaction, and quality were of equal
importance. Master Lock argues that DLA improperly treated
delivery as more important than the other past performance
subfactors, and thereby departed from the solicitation’s stated
evaluation scheme. Specifically, the protester contends that
although the selection decision discusses the offerors’ delivery
records, the agency did not address the offerors’ evaluations
under the other two subfactors. We find these arguments without
merit.
The record does not show, as the protester argues, that DLA
considered only the delivery subfactor, and ignored the business
relations/customer satisfaction and quality subfactors. Instead,
the selection decision discussed the degree to which the ratings
under all three subfactors demonstrated confidence in the
offeror’s ability to meet the solicitation requirements. Id.
With regard to each offeror, the agency also addressed the
relevance of each offeror’s past performance, particularly with
regard to each offeror’s ability to deliver the volume of locks
required under the contract. AR, Tab 10, Revised SSAD, at 3.
As discussed above, we think the agency’s consideration of
“demand” was reasonably related to both the overall evaluation
of the relevance of the offerors’ past performance, and to the
specific subfactor of delivery. We do not agree with the
protester that because more words were devoted to the discussion
of the delivery subfactor than to the specific ratings under the
business relations/customer satisfaction and quality subfactors,
the agency departed from the evaluation scheme. In this regard,
the selection decision shows that DLA considered the overall
past performance evaluation in a tradeoff between the price and
non-price evaluation factors. The record further shows that for
each lock type, the agency weighed Evergreen’s lower price
against Master Lock’s higher ratings under the past performance,
socio-economic considerations and JWOD considerations factors.
In light of Master Lock’s higher prices, the agency determined
that the protester’s advantage under the non-price evaluation
factors merited award to the protester only for those lock types
where the high volume of orders decreased the confidence in
Evergreen’s ability to perform.
In sum, we think the record shows that DLA considered all of the
evaluation factors and made its selection decision in a manner
consistent with the solicitation. (Master
Lock Company, LLC, B-309982.3, December 10, 2008) (pdf)
Master Lock
argues that the selection decision was flawed for two reasons:
(1) the selection decision relied on a mechanical comparison of
offerors’ proposed prices, and (2) the selection decision
excluded consideration of Evergreen’s evaluation under the
socio-economic and JWOD considerations evaluation factors. We
disagree with the first argument, but agree with the second one.
First, the protester argues that the selection decision was
based solely on a mechanical comparison of the offerors’
proposed price, rather than a tradeoff between price and
non-price factors, as required by the solicitation.
Specifically, the protester contends that the selection decision
was made based on an arbitrary price evaluation that “split the
difference” between the offerors’ prices.
Where a price/technical tradeoff is made, the selection decision
must be documented, and the documentation must include the
rationale for any tradeoffs made, including the benefits
associated with additional costs. Federal Acquisition Regulation
(FAR) sect. 15.308. It is improper for an agency to rely on a
purely mathematical price/technical tradeoff methodology.
Opti-Lite Optical, B-281693, Mar. 22, 1999, 99-1 CPD para. 61 at
4-5.
Here, the record shows that the SSA used the average of the
offerors’ prices as a starting point for the tradeoff
determination, but also considered the non-price factors. The
SSA noted that the offerors’ prices differed from one another by
1% to 17%, with the average difference being approximately 8%.
AR, Tab 12, SSD, at 1. The SSA determined
that, based on Master Lock’s superior past performance, award
would be made to Master Lock for all lock types where that
offeror’s price was either lower, or was no more than 8% higher
than Evergreen’s price. Id. The SSA determined that, based on
Evergreen’s lower-rated past performance, award would be made to
Evergreen for the balance of the lock types, i.e., where that
offeror’s price represented a greater than 8% price savings as
compared to Master Lock. Id.
We think this evaluation was reasonable because the non-price
evaluation for each offeror was to be based on factors that did
not depend on which lock was being offered--i.e., past
performance, socio-economic considerations, and JWOD
considerations. Consequently, price was the only difference
between the offerors regarding each of the distinct lock types
offered. The SSA’s determination was for each lock type, in
essence, that an 8% difference represented a reasonable price
premium for Master Lock’s advantage on the non-price evaluation.
In sum, we conclude that this tradeoff decision was not merely a
mechanical comparison of price, but instead represented a
reasonable exercise of business judgment by the SSA.
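To make the mechanics of that tradeoff concrete, the sketch
below applies a threshold rule of the kind described above: the
higher-rated offeror wins an item unless the other offeror's
price savings exceed the average price difference. This is only
an illustration; the item names and prices are hypothetical
placeholders, and the 8 percent threshold simply mirrors the
approximate average difference cited in the decision.

# Illustrative sketch only: a threshold-based award rule like the
# one described above, using invented lock types and prices.
AVERAGE_DIFFERENCE = 0.08  # approximate average price difference cited above

# (item, higher_rated_offeror_price, lower_rated_offeror_price) -- hypothetical
items = [
    ("lock type A", 10.00, 9.70),   # about 3% savings, within the threshold
    ("lock type B", 10.00, 8.50),   # about 15% savings, beyond the threshold
]

for name, higher_rated_price, lower_rated_price in items:
    savings = (higher_rated_price - lower_rated_price) / higher_rated_price
    if savings <= AVERAGE_DIFFERENCE:
        decision = "higher-rated offeror (premium worth its non-price advantage)"
    else:
        decision = "lower-priced offeror (savings exceed the acceptable premium)"
    print(f"{name}: savings of {savings:.0%} -> award to {decision}")

As the decision explains, such a percentage served only as a
starting point; the approach was found reasonable because the
non-price evaluation did not vary by lock type, so the threshold
functioned as the price premium the SSA judged worthwhile for
the higher-rated offeror's non-price advantage.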
Finally, Master Lock argues that the SSA’s decision to rate
Evergreen’s proposal “neutral” under the socio-economic and JWOD
evaluation factors was inconsistent with the solicitation. DLA
contends that its selection decision did take these factors into
consideration, and that the “neutral” ratings were consistent
with the solicitation. We agree with the protester that the
agency’s assignment of “neutral” evaluation ratings was
unreasonable, and further conclude that the evaluation was
inconsistent with the solicitation’s stated award scheme because
it resulted in a selection decision that ignored the
socio-economic and JWOD evaluation factors.
The record here shows that the tradeoff decision considered only
one of the non-price evaluation factors, past performance. For
the 22 lock types awarded to Master Lock, the SSA concluded that
the protester’s “outstanding rating for past performance
demonstrated the best overall value to the government, with
price and past performance weighted equally.” AR, Tab 12, SSD,
at 1. For the 13 lock types awarded to Evergreen and challenged
by the protester, the record also shows that the agency did not
consider the socio-economic or JWOD considerations evaluation
factors. The SSA concluded that “[f]or the remaining 13 items .
. . award is recommended to Evergreen,” because “Evergreen is
the lowest priced offeror, and their past performance is rated
overall as good.” Id.
Thus, the record shows, the tradeoff decision did not consider
the socio-economic or JWOD considerations evaluation factors.
The solicitation stated, however, that price would be considered
along with three non-price factors: past performance,
socio-economic considerations, and JWOD considerations. RFP at
66. Furthermore, the RFP specifically advised offerors that
“[a]n offeror who proposes a higher percentage, complexity
level, and variety of participation by” small business and JWOD
concerns, “will receive a higher rating.” RFP at 67-68. On this
record, we conclude that the selection decision was inconsistent
with the solicitation’s award scheme.
DLA contends that, notwithstanding the solicitation’s
identification of three non-price factors, the agency had a
reasonable basis for rating Evergreen’s proposal as “neutral”
under these factors, and thus not factoring them into the award
determination.[7] The agency contends that the “neutral” ratings
were reasonable because Evergreen was itself a small business
offeror, and thus “has fewer opportunities to subcontract to
meet the socio-economic and JWOD factors.” AR, Tab 11, PNM, at
4; Tab 12, SSD, at 1. The agency also argues that a “neutral”
rating for a small business was reasonable because it furthered
the agency’s goal of promoting opportunities for small
businesses. SAR at 13.
The clear effect of these neutral ratings, as discussed above,
was that the agency only considered one non-price factor: past
performance. Nothing in the RFP stated that small business
offerors were exempt from evaluation under these factors, or
that an offeror’s status as a small business could offset an
otherwise negative rating under these factors. For the same
reason, we think the agency unreasonably concluded that
Evergreen’s status as a small business, notwithstanding its
“poor” ratings, made that offeror the equal of Master Lock and
its “good” ratings for these evaluation factors. Furthermore,
DLA’s rationale for
rating Evergreen as “neutral” under the socio-economic
considerations factor because the offeror was itself a small
business has no rational relevance to the evaluation of the
offeror under the JWOD considerations factor.
In sum, we conclude that DLA’s tradeoff decision for the 13 lock
types awarded to Evergreen relied on a tradeoff that only
considered past performance and price, based on the agency’s
unreasonable determination that Evergreen’s proposal should be
rated “neutral” under the socio-economic and JWOD considerations
evaluation factors. We further conclude that because Master Lock
had a higher rating than Evergreen for these factors, the
protester was prejudiced by this error, and sustain the protest
because of the flawed tradeoff decision. See McDonald Bradley,
B-270126, Feb. 8, 1996, 96-1 CPD para. 54 at 3; Statistica, Inc.
v. Christopher, 102 F.3d 1577, 1581 (Fed. Cir. 1996). (Master
Lock Company, LLC, B-309982.2, June 24, 2008) (pdf)
During the
hearing, counsel for the agency asked the CO to explain “what
the government was looking for in the overall solicitation.” AD
at 1:57:45. The CO stated the agency was seeking to identify
offerors that had “experience in providing all services under
all the categories under one contract.” Id. at 1:57:55. The CO
further stated that in order to be considered equivalent in
terms of experience to the incumbent, an offeror would need to
have performed at least one contract involving every service
sought under the solicitation. Id. at 2:02:45, 2:04:50.
Agencies are required to evaluate proposals based solely on the
factors identified in the solicitation, and must adequately
document the bases for their evaluation conclusions. Intercon
Assocs., Inc., B-298282, B-298282.2, Aug. 10, 2006, 2006 CPD
para. 121 at 5. While agencies properly may apply evaluation
considerations that are not expressly outlined in the RFP where
those considerations are reasonably and logically encompassed
within the stated evaluation criteria, there must be a clear
nexus between the stated criteria and the unstated
consideration. Global Analytic Info. Tech. Servs., Inc.,
B-298840.2, Feb. 6, 2007, 2007 CPD para. 57 at 4.
The solicitation stated that offerors’ experience would be
evaluated as follows: “Experience will be evaluated in terms of
the size, scope and complexity of the operation. Experience in
the following [eight] critical areas, listed in alphabetical
order, will be evaluated.” RFP sect. H-1(d)(1). We agree with
the agency that the solicitation anticipated award of a single
contract, and that the ability to perform all of the services
under a single contract is a basis for evaluation. We do not
agree, however, that the CO’s “one contract” criterion for
evaluating the relevance of offerors’ experience was a
reasonable proxy for that evaluation. Instead, we think that the
RFP does not support the agency’s “one contract” interpretation
for evaluating experience, and that the interpretation was not
reasonably related to the stated evaluation criteria.
The RFP does not state that the agency was seeking proposals
from offerors who had performed all of the eight critical areas
of experience under "one contract"; instead, the solicitation
states that the agency will evaluate offerors' experience to
determine whether their experience with the eight critical areas
is similar to the requirements of the solicitation. The CO's
"one contract" approach meant that the agency did not consider
any offeror to have equivalent experience to the incumbent
contractor unless the offeror had performed the exact same
services required under the solicitation in the same manner as
the incumbent, i.e., under one contract.
In addition to being inconsistent with the RFP, the CO's
interpretation appears to have unduly restricted competition by
unreasonably disfavoring non-incumbent firms competing with the
incumbent. Specifically, the agency's action prejudiced CES by
precluding consideration of whether offerors' references
demonstrated, individually or collectively, experience that was
similar in size, scope or complexity to the solicitation
requirements. In this regard, the CO simply assumed, without
meaningful analysis, that CES's collective experience, which the
record shows clearly included multi-function contracts involving
all of the services required under the solicitation, could not
have been the equal of MMC's under the incumbent contract. On
this record, we conclude that the agency's evaluation was not
reasonably consistent with the stated evaluation factors. (Consolidated
Engineering Services, Inc., B-311313, June 10, 2008) (pdf)
With respect to
the testing of the array prototypes, Lockheed argues that it was
improperly denied the opportunity specified in the RFP to repair
its array before phase 2 of the towed testing. The Navy responds
that Lockheed had already been provided an opportunity to repair
its array before the RFP was issued, and the RFP had warned
offerors that "the government may not complete all the tests
listed" and thus the Navy had the discretion to omit phase 1,
and even if the firms had been given a repair opportunity,
Lockheed might not have been able to successfully repair its
array.
We have held that where an agency provides in a solicitation for
the evaluation of a product sample--a situation analogous to the
prototypes being tested here--that evaluation must be fair,
reasonable, and consistent with the evaluation criteria. Design
Contempo, Inc., B‑252589.2, Aug. 11, 1993, 93-2 CPD para. 90 at
4. This includes an obligation to perform product tests as
described in a solicitation. Mine Safety Appliances Co.,
B‑238597.2, July 5, 1990, 90‑2 CPD para. 11 at 11. Thus, while
our Office will not make an independent determination of the
merits of an offeror's proposal or, in the case of a product
demonstration, the performance of the offeror's product, we will
review the evaluation record, including the results of any test
demonstration, to ensure that the agency's technical judgment
has a rational basis and is consistent with the stated
evaluation criteria. Exploration Prods., B-279251.2, B-279251.3,
June 1, 1998, 98‑2 CPD para. 15 at 5.
The RFP provisions quoted above anticipated the possible failure
of this equipment before the beginning of phase 2 towed testing,
and provided certain repair rights to each vendor in such a
situation. Specifically, the RFP provided that in phase 1 the
offeror would be given a single repair opportunity of up to 7
days in which to return a failed prototype array to a working
state. Lockheed was denied the opportunity that the RFP "ground
rules" had promised.
Additionally, given the specific language of the RFP here, we
think it is not reasonable for the Navy to argue that it was
allowed to omit the phase 1 repair opportunity--particularly
where that omission brought about exactly the type of harm that
the RFP seemed to be designed to avoid. The Navy emphasizes that
phase 1 was also intended, in part, to allow the offeror to
train Navy personnel in the proper handling and operation of the
array, and we accept the Navy's argument that its personnel
needed no further training. Nevertheless, the phase 1 testing
plainly promised a repair opportunity, apparently in an effort
to ensure that the prototype arrays would be operational at the
start of phase 2 towed testing.
The Navy points out that both offerors repaired their arrays
prior to issuance of the RFP, and argues that Lockheed was not
entitled to a second repair opportunity. In our view, however,
the earlier repair opportunity cannot be relied upon by the Navy
to meet a promise it had not yet made. In any event, the
mid-March repairs (made before the prototypes were shipped from
Florida to Idaho) cannot reasonably be seen as a substitute for
the RFP-provided right to restore a failed array to an
operational status at the most crucial time under the RFP, that
is, pierside at Lake Pend Oreille, immediately before the
beginning of the towing testing.
We also are not persuaded by the Navy's speculative assertions
that Lockheed might have been unable to repair its array, if it
had been offered that opportunity. On this subject, Lockheed
answers with its own speculation that it could have repaired its
array quickly and easily using spares that it had on hand. Our
process is ill-suited to deciding protests on the basis of
speculation--by either party. In our view, we need not determine
that Lockheed would have successfully repaired its array; we
need only conclude that it was denied the opportunity to do so
immediately before phase 2 testing, and that this was contrary
to the ground rules specified in the RFP.
(Lockheed Martin Maritime Systems &
Sensors, B-299766; B-299766.2, August 10, 2007) (pdf)
JPDT asserts that
the agency improperly relaxed the SFO's requirements only for
Stonebridge by accepting its letter of commitment as sufficient
proof that the above described proposed amenities will exist.
The protester states that, had it known that a mere letter of
commitment to provide future amenities would suffice, it could
have used this information to propose new amenities that would
have increased its score under the access to amenities subfactor.
It is a fundamental principle of government procurement that a
competition must be conducted on an equal basis, that is,
offerors must be treated equally and be provided with a common
basis for the preparation of their proposals. Continental
RPVs, B-292768.2, B-292768.3, Dec. 11, 2003, 2004 CPD para.
56 at 8; Systems Mgmt., Inc.; Qualimetrics, Inc.,
B-287032.3, B-287032.4, Apr. 16, 2001, 2001 CPD para. 85 at 8.
We will sustain a protest that an agency improperly relaxed its
requirements only for the awardee where the protester
establishes a reasonable possibility that it was prejudiced by
the agency's actions. Datastream Sys., Inc., B-291653,
Jan. 24, 2003, 2003 CPD para. 30 at 6.
We find that the agency's evaluation of Stonebridge's proposed
amenities was inconsistent with the solicitation. As stated
above, the SFO provided that “the offeror must demonstrate to
the reasonable satisfaction of the Government (i.e., through
evidence of signed leases, construction contracts, etc.) that
such amenities will exist by the Government's required occupancy
date.” AR, Tab 2, SFO at 7. We think this language informed
offerors that some level of evidence of a proposed amenity to be
furnished by a third party, beyond a mere promise of its
existence, had to be provided in order for an offer to receive
evaluation credit for the proposed amenity. The agency asserts
that Stonebridge's promise is contractually enforceable, and
that it thus was reasonably satisfied that the awardee would
provide the amenity. However, the key consideration here is not
whether a promise could be viewed as providing some level of
assurance that a third party amenity would be available, but
whether the agency's reliance on the promise alone was
consistent with the terms of the SFO. Again, we find that it was
not. The SFO stated that offerors "must demonstrate" that the
proposed amenities would exist, indicated in the parenthetical
that this was to be “through evidence,” and then gave examples
of satisfactory evidence. We do not think a bare promise
satisfied the SFO requirement that offerors "demonstrate" the
existence of future amenities through "evidence." It follows
that, in accepting Stonebridge’s promise, the agency relaxed the
evidence requirement only for the benefit of Stonebridge,
without advising JPDT and the other offerors of the reduced
evidence standard.
The agency asserts that "all offerors were informed" during oral
discussions that they could improve their offers by providing
amenities in their building, and that "the government would
accept, in addition to letters of intent or other proof, the
guarantee of a financially responsible offeror as proof that the
amenity will be 'existing' at the time of occupancy." AR, Tab 1,
Contracting Officer’s Statement, at 3. The agency cites the
protester’s addition of a fitness center reserved to its
building after discussions as evidence that JPDT was aware that
future amenities could be supported merely by an offeror’s
contractual guarantee. Agency Response to Comments, May 16,
2008, at 7.
The record simply does not support the agency's position. JPDT
denies that it was told that the agency would accept an
offeror’s guarantee as the sole evidence that the amenity would
exist at the time of occupancy. In support of this denial, JPDT
has furnished: (1) a declaration executed by the vice president
of one of its teaming partners who attended the discussions, in
which he reports the information conveyed by the agency and
specifically denies that the agency advised that an offeror’s
guarantee would suffice to verify the existence of a future
amenity; and (2) a contemporaneous e-mail reporting in detail
the matters raised by the agency during discussions, which makes
no mention of the agency’s having relaxed or even discussed the
evidence of future amenities requirements. Declaration of JPDT
Partner Boston Properties Vice President; E-mail From Bennett
Group DC to Boston Properties [JPDT partners], 9‑12-07 Meeting
Notes, Sept. 12, 2007.
In addition, the contemporaneous record of discussions provides
no support for the agency’s position. Specifically, GSA's
statement of objectives for the negotiations makes no reference
at all to the level of evidence required to establish future
amenities, let alone any reference to permitting JPDT to rely on
a promise or guarantee as evidence of third party amenities. AR,
Tab 22, DOJ Negotiation Objectives. There is nothing else in the
record that confirms that the agency advised JPDT of the relaxed
evidence requirement. Further, we do not agree with the agency
that JPDT's offer of a fitness center in its building indicates
it understood the level of evidence required for third party
amenities, since the fitness center was to be furnished by JPDT,
not a third party entity. We conclude that there is no evidence
that the agency advised JPDT of the relaxed requirements.
(New Jersey & H Street, LLC,
B-311314.3, June 30, 2008) (pdf), (See
Trammell Crow Company, B-311314.2,
June 20, 2008) (pdf)
An agency is obligated to conduct an evaluation consistent with
the evaluation scheme set forth in the RFP. FAR sect. 15.305(a);
Serco, Inc., B-298266, Aug. 9, 2006, 2006 CPD para. 120 at 8. We
recognize that proposal evaluation judgments are by their nature
often subjective; nevertheless, the exercise of these judgments
in the evaluation of proposals must be reasonable and must bear
a rational relationship to the announced criteria upon which
competing offers are to be selected. Systems Research and
Applications Corp.; Booz Allen Hamilton, Inc., B‑299818 et al.,
Sept. 6, 2007, 2008 CPD para. 28 at 11. In order for our Office
to perform a meaningful review, the record must contain adequate
documentation showing the bases for the evaluation conclusions
and source selection decision. Southwest Marine, Inc.; American
Sys. Eng’g Corp., B-265865.3, B-265865.4, Jan. 23, 1996, 96-1
CPD para. 56 at 10.
Here, we agree that the SSET’s evaluation identified and
documented the SRD requirements under which the firms’ evaluated
strengths and weaknesses were assessed. Nevertheless, the record
does not establish that the SSAC and SSA, in considering those
strengths and weaknesses, applied the relative weights
identified in the RFP for the various SRD requirements (under
which the KPPs were most important). Moreover, the record does
not show any consideration by the SSAC or SSA of the fact that
Boeing’s proposal was evaluated as satisfying significantly more
SRD requirements than Northrop Grumman’s.
For example, the record shows that most of Boeing’s evaluated
“major discriminators” in the aerial refueling area were
assessed under KPP requirements, and conversely most of Northrop
Grumman’s evaluated “major discriminators” were assessed under
less important non-KPP/KSA “requirements.” Specifically, the
SSAC identified as “major discriminators” the following
requirements that Boeing’s aircraft satisfied but Northrop
Grumman’s aircraft did not: (1) the capability to [Deleted] (a
KPP No. 1 objective); (2) the capability, [Deleted] (another KPP
No. 1 objective); (3) the capability to [Deleted] (another KPP
No. 1 objective); and (4) the capability to [Deleted] (a
“noteworthy” non‑KPP/KSA requirement). See AR, Tab 55, PAR, at
13. The SSAC identified as a “major discriminator” for Northrop
Grumman that firm’s satisfaction of one KPP objective (KPP No. 2
objective for exceeding the fuel offload unrefueled range),
where Boeing also satisfied this objective but to a lesser
degree. The Air Force also identified as “major discriminators”
for Northrop Grumman under this area the firm’s better air
refueling efficiency, larger boom envelope, and better offload
and receive rates, all of which were non-KPP/KSA requirements.
Although the record thus evidences that most of Boeing’s
evaluated “major discriminators” were assessed under KPP
requirements, and conversely most of Northrop Grumman’s
evaluated “major discriminators” were assessed under less
important non-KPP/KSA requirements, we have found no document in
the contemporaneous evaluation record that shows that the SSAC
or SSA gave any meaningful consideration to the weights that
were to be assigned to the various KPP, KSA, and other
requirements. That is, the SSAC’s briefing slides to the SSA and
its PAR do no more than identify the SRD requirements for which
the evaluated discriminators were assessed, but do not evidence
any consideration of the descending order of importance assigned
to these various SRD requirements.
In its briefing to the SSA, the SSAC merely reports each of the
firms’ “advantages” without any analysis of whether or not
Boeing’s “advantages” (which as indicated above are mostly
derived from KPP objectives) were entitled to greater weight
than Northrop Grumman’s advantages (which are mostly derived
from less important non‑KPP/KSA requirements). See, e.g., AR,
Tab 55, SSAC Recommendation Briefing to SSA, at 6-7 (aerial
refueling discriminators). Similarly, in the PAR, the SSAC duly
reports the relative order of importance that was to be assigned
to the KPP, KSA and other requirements, see AR, Tab 55, PAR, at
4, but there is no suggestion that the assigned weights to these
requirements were applied in any of the SSAC’s comparative
analyses of the firms’ evaluated discriminators. See, e.g., id.
at 13‑14 (aerial refueling discriminators). Thus, although it is
true that the SSAC reported in the PAR that it considered the
“priorities of KPP, KSA, and non‑KPP/KSAs,” see id. at 29, the
record does not provide any evidence of such a weighing.
Similarly, in [the SSA’s] selection decision, the SSA reports
that the evaluation of the “offerors’ proposed capabilities and
approaches against the SRD requirements were made in the
following order of importance: KPPs, KSAs, and all other non-KPP/KSA
requirements.” See AR, Tab 54, Source Selection Decision, at 5.
Despite this reported recognition of the varying weights
assigned to the different SRD requirements, the SSA’s decision
document does not evidence any consideration of the fact that
Boeing’s assessed “major discriminators” were derived from
requirements that were identified as being more important than
most of the requirements from which Northrop Grumman’s
discriminators were derived. See id. at 5-7.
We agree with the Air Force that it is permissible to identify
relative strengths found under less important evaluation factors
to be discriminators for selection purposes, where there are
lesser relative differences favoring another proposal under more
important evaluation factors. However, we find no evidence in
this record that any such analysis, which considered the
relative weight of the KPPs, KSAs and non‑KPP/KSA requirements,
was performed here.
The Air Force also identified more “discriminators offering less
benefit” for Boeing’s proposal than for Northrop Grumman’s
proposal in the aerial refueling area. Specifically, the SSAC
identified five such discriminators for Boeing that were
assessed under 13 different SRD requirements, and only one such
discriminator for Northrop Grumman that was assessed under 2 SRD
requirements. As noted above, the RFP requested that offerors
satisfy as many of the “trade space” SRD requirements “as
possible.” See RFP, SOO for KC-X SDD, at 2. Despite having
solicited proposals that satisfy as many SRD requirements as
possible, there is no evidence in the record showing that either
the SSAC or the SSA accounted for the fact that Boeing’s
proposal was evaluated as satisfying significantly more SRD
requirements in the aerial refueling area than did Northrop
Grumman’s proposal. In fact, in deciding that Northrop Grumman
had a significant advantage in the aerial refueling area, the
SSA did not even discuss the fact that Boeing had more
“discriminators offering less benefit” than did Northrop
Grumman, much less that Boeing satisfied far more SRD
requirements than did Northrop Grumman in this area.
As noted by the Air Force, the assignment of adjectival ratings
and the source selection should generally not be based upon a
simple count of strengths and weaknesses, but upon a qualitative
assessment of the proposals. See Kellogg Brown & Root Servs.,
Inc., B‑298694.7, June 22, 2007, 2007 CPD para. 124 at 5. Such a
qualitative assessment must be consistent with the evaluation
scheme, however. Here, although the RFP expressly encouraged
offerors to satisfy as many of the “trade space” SRD
requirements “as possible,” see RFP, SOO for KC-X SDD, at 2, the
record shows no evidence that the Air Force gave any
consideration to Boeing’s offer to satisfy significantly more
trade space SRD requirements. This, in our view, is not a matter
of simply counting strengths and weaknesses, but of evaluating
the advantages and disadvantages of competing proposals
consistent with the RFP’s evaluation scheme. See, e.g., Systems
Research and Applications Corp.; Booz Allen Hamilton, Inc.,
supra, at 14.
In short, our review of the record indicates that, as
illustrated by the aerial refueling area examples above, the Air
Force failed to evaluate proposals in accordance with the RFP’s
evaluation criteria. That is, the record evidences that the Air
Force failed to assess the relative merits of the offerors’
proposals based upon the importance assigned to the various SRD
requirements by the RFP or to account for the fact that Boeing
proposed to satisfy far more SRD requirements than did Northrop
Grumman.
(The
Boeing Company, B-311344; B-311344.3; B-311344.4;
B-311344.6; B-311344.7; B-311344.8; B-311344.10; B-311344.11,
June 18, 2008) (pdf)
Operational
Utility Area
Boeing also complains that the Air Force unreasonably evaluated
the firms’ proposals in the operational utility area under the
key system requirements subfactor. The RFP provided that
evaluation of this area would consist of an assessment of the
offeror’s approach to meeting (or exceeding, where appropriate)
SRD requirements, “including the following: aircraft
maneuverability, worldwide airspace operations,
communication/information systems (including Net-Ready
capability), treaty compliance support, formation flight,
intercontinental range, 7,000-foot runway operations, bare base
airfield operations, and growth provisions for upgrades.” RFP
sect. M.2.2.1.2.c. Boeing contends that its proposal should have
been found technically superior to Northrop Grumman’s in this
area, and not essentially equal, as the SSA and SSAC concluded.
See AR, Tab 54, Source Selection Decision, at 8.
As noted above, the SSET found that both offerors satisfied the
three KPP thresholds identified in this area, and partially met
the one KPP objective identified; the SSET also found that both
offerors met all associated KSA thresholds and objectives. See
AR, Tab 55, PAR, at 20-21. The SSAC also identified two “major
discriminators” in each of the firms’ respective proposals; the
discriminators for Boeing were the firm’s (1) [Deleted] and (2)
[Deleted], and for Northrop Grumman were (1) the ability of the
KC‑30 to operate from a 7,000-foot runway carrying more fuel
than the KC-767, and (2) the KC-30’s longer ferry range compared
to the KC-767’s. Id. Boeing’s two “major discriminators” were
assessed under 17 different SRD requirements, while Northrop
Grumman’s two discriminators were assessed under only 2 SRD
requirements. The SSAC also identified a number of
“discriminators offering less benefit” for each firm: six such
discriminators for Boeing assessed under 19 SRD requirements,
and five such discriminators for Northrop Grumman assessed under
6 SRD requirements. Id. at 22‑24.
Here, too, as we described above with respect to the aerial
refueling area, the record does not evidence that the SSAC and
SSA, in determining that the firms’ proposals were essentially
equal in the operational utility area, gave any consideration to
the fact that Boeing’s proposal was evaluated as satisfying more
SRD requirements than Northrop Grumman’s in this area, as was
sought by the RFP. Given this failure by the SSAC and SSA to
address Boeing’s apparent advantage in meeting more SRD
requirements than Northrop Grumman, we conclude that the
agency’s evaluation and selection decision was unreasonable in
this regard.
Boeing also complains that the agency conducted misleading
discussions with Boeing with respect to whether Boeing had fully
satisfied the KPP No. 7 objective, Net-Ready Capability. RFP,
SRD sect. 3.2.4.1.1; app. A, Net-Ready Capability KPP, at 3. The
KPP No. 7 objective provides that the offeror’s “system should
be capable of accomplishing all operational activities
identified in Table 5.” RFP, SRD, app. A, at 4. Table 5 of the
appendix identified a number of information exchange
requirements. Id. at 15-25. (The
Boeing Company, B-311344; B-311344.3; B-311344.4;
B-311344.6; B-311344.7; B-311344.8; B-311344.10; B-311344.11,
June 18, 2008) (pdf)
First, the
agency’s argument that the protester has no way of knowing
whether Alon provided information regarding the value of its TSA
contract at its oral presentation clearly is not dispositive of
the issue, given that the agency has not asserted that Alon did
in fact furnish such information. There simply is no evidence in
the record that Alon’s TSA contract (or its MDA or NWS
contracts) was similar in magnitude to the effort here.
Accordingly, the issue before us is whether it was consistent
with the terms of the RFQ and reasonable for the evaluators to
have considered relevant Alon’s experience on projects that were
highly similar in scope and nature, but not similar in magnitude
to the effort solicited here. In our view, it was not
consistent with the terms of the RFQ, which essentially provided
that prior work efforts had to be similar in scope, magnitude,
and nature to be considered relevant, for the evaluators to have
considered Alon’s contracts with the TSA, NWS, and MDA relevant.
See Si-Nor, Inc., B-292748.2 et al., Jan. 7, 2004, 2004 CPD para.
10 at 17. We think that this case is distinguishable from
Computer Sys. Dev. Corp., cited by the agency above, in that in
the cited case, the solicitation identified size, complexity,
and participation as factors to be considered in determining the
significance of a prior work effort, but did not require that a
project have been similar in all three respects to be considered
significant, whereas here, the RFQ defines relevant work as work
of similar scope, magnitude, and nature, meaning that all three
factors need to be present for a contract to be considered
relevant.
That does not end our analysis, however, given that it is the
agency’s position that the terms of the RFQ notwithstanding,
contracts of lesser magnitude than the effort here may in fact
be relevant where sufficiently similar in scope and nature. In
other words, the agency’s position is, in essence, that the RFQ
overstates the agency’s requirements pertaining to prior work
effort relevance. Where an agency takes the position that
a solicitation overstates its requirements, and proposes to
evaluate vendor responses on the basis of its actual, as opposed
to its stated, requirements, the relevant inquiry is whether the
protester was prejudiced by the overstatement. Unfair
competitive prejudice from a waiver or relaxation of the terms
of the RFQ exists where the terms were not similarly waived or
relaxed for the protester, or the protester would be able to
alter its quotation to its competitive advantage if it were
given the opportunity to respond to the relaxed terms. See 4-D
Neuroimaging, B-286155.2 et al., Oct. 10, 2001, 2001 CPD para.
183 at 10. Here, the evaluators considered prior work efforts of
lesser magnitude in their evaluation of both Alon’s and DMSJV’s
corporate experience; that is, there is no evidence that the
evaluators waived the requirement pertaining to similarity in
magnitude for Alon while failing to waive it for DMSJV. Further,
we see no basis to conclude that DMSJV would have cited
different contracts as examples of its own corporate experience
had it recognized that contracts dissimilar in magnitude might
be considered relevant. Indeed, since three of the five
contracts cited by the protester in its quotation as examples of
its corporate experience have present values of less than 5
percent of its evaluated price for the services here, it appears
that the protester, like the agency and the successful vendor,
assumed that contracts dissimilar in dollar value might
nonetheless be considered relevant if sufficiently similar in
other respects. Given the apparent lack of prejudice to DMSJV,
the agency’s consideration of Alon’s prior contracts despite
their dissimilarity in magnitude provides no basis to object to
the evaluation. (Data Management
Services Joint Venture, B-299702; B-299702.2, July 24, 2007)
(pdf)
As discussed above, DRC clearly stated in its quotation that it
did not intend to comply with the on-site requirement for at
least one of the web page designer/developer positions, and the
record establishes that the agency clearly recognized this
aspect of DRC’s quotation; yet, the agency evaluated the
personnel proposed to fill these positions as constituting a
“Significant Strength.” Further, in performing the contract,
none of the personnel DRC proposed to perform either of the two
web page designer/developer positions has ever performed
on-site. Finally, as noted above, LAI maintains that, had it
been permitted to similarly propose off‑site personnel to fill
these positions, it could have proposed more qualified
personnel. On this record, it is clear that the agency permitted
DRC to propose to perform the contract requirements on a basis
that was materially different than that required by the
solicitation. That is, the agency effectively relaxed the
solicitation’s stated requirements for on-site performance
without providing an opportunity for the other competing firms
to submit quotations on a similar basis. See SeaSpace, supra. It
is also clear that the agency’s relaxation of this requirement
was detrimental to, and prejudiced, LAI’s competitive position.
(Low & Associates, Inc., B-297444.2, April 13, 2006) (pdf)
It appears that the agency accorded little or no weight to the
degree to which offerors proposed to accelerate completion ahead
of the 2012 required site closure date, instead evaluating the
acceleration aspect of the proposed schedules solely in
comparison to Fluor Hanford’s ultimate 2018 closure date. The
agency’s disregarding the differences in the offerors’ proposed
completion dates was inconsistent with the solicitation. While
it may be the case that SCA’s more relaxed proposed schedule
after the deactivation phase offered more flexibility to deal
with transitions in the project work, potential problems or
unexpected decreases in agency funding, the fact is that the
solicitation made it clear that acceleration of the project was
desired. In this regard, the solicitation schedule stated that
“DOE is committed to achiev[ing] accelerated closure that does
not jeopardize safety and protection of workers, the public, or
the environment.” RFP sect. B.1. Likewise, the SOW advised that
“[t]he purpose of the FFTF Closure Project is to safely
accelerate the deactivation and decommissioning [of] the FFTF
and its support structures.” SOW sect. C.1. Lest offerors be
unclear as to the role of the contemplated contract in meeting
that goal, the SOW added that “[t]he contractor has the
flexibility to develop the project structure and to sequence the
work, subject to [National Environmental Policy Act]
requirements, to optimize the project schedule to achieve safe,
cost‑effective and accelerated closure of the site.” RFP sect.
C.3. Likewise, section H of the solicitation stated that “[t]he
FFTF Project and this contract have a mission of accelerated
closure.” RFP sect. H.1(a). In this context, we think the
reference in section M (quoted above) to evaluating “the extent
to which” an “offeror’s approach to deactivation,
decommissioning, and demolition . . . optimizes project
execution and schedule” could reasonably be understood by the
offerors as including consideration of how soon closure of the
site would be accomplished. Indeed, all three offerors
apparently read the RFP in this manner, as reflected by the fact
that all proposed accelerated completion, ahead of the 2012
mandatory completion date. While the agency may have intended to
limit the extent to which greater acceleration would be rewarded
in the evaluation, this intent was not reflected in the RFP.
Further, we agree with the protesters that, because the
evaluation criterion in question concerned evaluation of an
offeror’s technical proposal, and evaluation of the offeror’s
cost proposal was already accounted for under another criterion,
it was improper for the agency to consider SCA’s low cost in its
evaluation under the technical approach criterion. This is
especially significant in light of the fact that the
technical/management proposal was to be significantly more
important in the source selection than the cost/fee proposal.
RFP sect. M.2. We conclude that the agency’s evaluation of
schedule‑‑which furnished a moderate discriminator in SCA’s
favor and was specifically cited by the SSO in the SSD in
concluding that SCA’s proposal had an advantage over FRC’s under
the technical approach criterion, the most important
technical/management evaluation criterion‑‑failed to take into
account the fact that FRC’s schedule offered what could be
viewed as significantly greater acceleration of FFTF site
closure than SCA’s. Further, while DOE may be correct that FRC
would be less likely to meet its aggressive schedule, the agency
does not show, nor does the record furnish any basis for
concluding, that FRC would not be likely to complete closure
earlier than SCA. (In contrast, as discussed below, while EPW’s
proposed schedule also offered greater acceleration than SCA’s,
the agency evaluated EPW’s schedule as being premised on an
essentially unacceptable technical approach.) Accordingly, we
conclude that the agency did not reasonably evaluate the
proposals in this area. (EPW Closure
Services, LLC; FFTF Restoration Co., LLC, B-294910;
B-294910.2; B-294910.3; B-294910.4; B-294910.5; B-294910.6;
B-294910.7, January 12, 2005) (pdf)
We agree with ProTech that the source selection decision was
unreasonable because the SSA in making his cost/technical
tradeoff determination accorded the betterments factor less
weight than identified by the RFP. As noted above, the SSA
stated that the betterments factor was the "fourth, and least
most important factor," but this factor was identified in the
RFP as being equal in weight to the experience and past
performance factors. Compare AR, Tab 21, Source Selection
Decision, at 9, with RFP at 44. In fact, the SSA erroneously
stated that all of the technical evaluation factors were listed
in the RFP in descending order of importance. See AR, Tab 21,
Source Selection Decision, at 1, 8-9. Thus, when the SSA
performed his cost/technical tradeoff to determine whether
Atherton's apparently superior technical proposal outweighed
ProTech's significant price advantage, the SSA diminished the
importance of the betterments factor, which was the only factor
under which ProTech's proposal was evaluated as having a
technical advantage over Atherton's proposal. The SSA's failure
to accord the betterments factor the evaluation weight
identified in the solicitation was exacerbated by the SSA's
judgment that, under this factor, Atherton's offer of no
betterments represented 63 percent of the value of ProTech's
offer of six betterments. See id. at 9. This may have had the
effect of further diminishing the value of ProTech's offer of
betterments, where the awardee chose to offer only the
solicitation's basic requirement. We fail to see how such an
evaluation was in accord with the solicitation's evaluation
scheme that informed offerors that "[m]ore betterments will be
considered more favorably than fewer betterments." [9] RFP at
44. If the SSA had applied the correct weight to the betterment
factor, there is a reasonable possibility that ProTech's lower
priced proposal would have been selected for the award;
accordingly, we find that ProTech was prejudiced by the agency's
flawed source selection analysis. See McDonald-Bradley,
B-270126, Feb. 8, 1996, 96-1 CPD para. 54 at 3; see Statistica,
Inc. v. Christopher, 102 F.3d 1577, 1581 (Fed. Cir. 1996). We
therefore sustain the protest on this basis. (ProTech
Corporation, B-294818, December 30, 2004) (pdf)
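As a purely illustrative sketch of the weighting issue discussed
in the ProTech decision above, the following compares a weighted
tradeoff under a scheme in which the last three factors are
equal in weight against one that erroneously treats the final
factor as least important. Every factor name, weight, and score
below is hypothetical and is not drawn from the record.

# Hypothetical illustration of how mis-weighting a single factor can
# flip a tradeoff. All weights and scores are invented for this sketch.
def weighted_score(scores, weights):
    # Weighted sum of factor scores; weights here are assumed to sum to 1.
    return sum(scores[factor] * weight for factor, weight in weights.items())

# Scheme resembling the stated evaluation plan: first factor most
# important, the remaining three (including betterments) equal in weight.
stated_weights = {"technical": 0.40, "experience": 0.20,
                  "past performance": 0.20, "betterments": 0.20}
# Erroneous reading that treats betterments as the least important factor.
erroneous_weights = {"technical": 0.40, "experience": 0.30,
                     "past performance": 0.20, "betterments": 0.10}

offeror_a = {"technical": 9, "experience": 9, "past performance": 9, "betterments": 2}
offeror_b = {"technical": 8, "experience": 8, "past performance": 8, "betterments": 9}

for label, weights in (("stated", stated_weights), ("erroneous", erroneous_weights)):
    a, b = weighted_score(offeror_a, weights), weighted_score(offeror_b, weights)
    print(f"{label} weights: A={a:.2f}, B={b:.2f} -> "
          f"non-price leader: {'A' if a > b else 'B'}")

In the protest itself, the flaw mattered because betterments was
the only factor on which the protester held an evaluated
advantage, so understating that factor's weight directly
diminished the protester's standing in the cost/technical
tradeoff.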
The record clearly shows, however, that the soap bubble test
will not demonstrate compliance with the strength requirements
because that test does not measure the strength of the bonding
between the fittings and the bladder fabric, but merely
determines whether the bladder leaks air after inflation. RFP
Purchase Description, 4.5.2.1. As evidenced by the purchase
description and as described by MPC's technical consultant, the
leak test does not evaluate the strength of the bond between the
fittings and the fabrics with specific pounds/inch loads exerted
on the samples as contemplated by Table IV's strength
requirements and the required tests contained in the
specifications and referenced in the table. At best, the soap
bubble test will determine whether, after inflation with air,
the tanks leak at the areas where the fittings are bonded to the
fabric. Additionally, the soap bubble test, as described in the
purchase description and by MPC, does not involve subjecting the
product sample to fuel immersion as required by the tests set
forth in Table IV; thus, no information will be provided
regarding the strengths that samples must demonstrate after
immersion in the fuels. Consequently, since MPC's revised
proposal fails to reasonably address the test requirements, we
find that the agency improperly determined that MPC's proposal
was technically acceptable and we sustain this basis of Cooley's
protest. (Cooley/Engineered
Membranes; GTA Containers, Inc., B-294896.2; B-294896.3;
B-294896.4, January 21, 2005) (pdf)
With respect to STEM's complaint that the agency relied
upon an unstated criterion when it evaluated offerors' reach back
capability, USMC asserts that this capability, which refers to
an offeror's ability to call upon the personnel and other
resources available from both itself and its team partners, is
reasonably related to evaluation factors (1), (2) and (3). Based
on our review, we agree with the agency that these factors all
contemplated that the agency would evaluate the capability of
the offeror's team to call upon its personnel and resources to
perform the work. For example, the RFP stated that offerors
would be evaluated for their experience and capabilities that
demonstrate familiarity and expertise with [USMC] programs as
described in the Statement of Work; their demonstrated
understanding of [the USMC] Installations and Logistics program
missions, functions and goals; and employee technical expertise
and experience . . . related to support of installations ... and
military logistics programs. RFP sect. M.1.2. These statements
contemplate that the agency would evaluate an offeror's
capability and resources, which would include its ability to
reach back to the resources of its team members. Accordingly, we
find that the agency did not apply an unstated evaluation
criterion in its evaluation. (STEM
International, Inc., B-295471, January 24, 2005) (pdf)
As for Knit-Rite's challenge to the agency's technical
evaluation methodology, we find that the methodology was
reasonable. Indeed, we already held--in Apex Foot Health Indus.,
supra--that the agency's evaluation methodology under this
RFP was reasonable. Apex objected there--as Knit-Rite objects
here--to the subjective nature of the evaluation. We rejected
this argument, noting that, while the evaluation was subjective,
this is what the RFP provided for; the RFP did not state that an
objective, scientific process would be used in testing offered
socks, but that a subjective evaluation of the socks would be
performed. [4] RFP at 28. The evaluation consisted of the
evaluators visually inspecting the socks, with special attention
given to the seams, and then walking in the socks, with shoes,
on carpeted and concrete floors, for approximately 10 minutes.
The evaluators wore shoes to simulate actual usage and to assist
in determining whether pressure was placed on the foot from any
seam or crease. AR, exh. 6, Declaration of TEP Chairperson, at
2. It remains our view that the agency's methodology was
reasonable and consistent with the evaluation scheme set forth
in the RFP. (Knit-Rite, Inc.,
B-293088.3, August 5, 2004) (pdf)
CDC first argues that it was unreasonable for VA to
consider the additional features Braun offered because these
features were not specified in the solicitation. We disagree. As
noted above, the RFQ expressly stated that VA would consider a
vendor’s ability to offer additional features not required by
the specifications. VA’s consideration of the Braun equipment’s
ability to link with VA’s existing systems thus is entirely
consistent with the RFQ. Further, VA had no obligation to hold
discussions with CDC about its ability to offer this feature
given that the RFQ stated that VA did not intend to hold
discussions before selecting a vendor, see OMNIPLEX World Servs.
Corp., B-282630.2, Sept. 22, 1999, 99-2 CPD ¶ 64 at 4, and, in
fact, specifically advised vendors to include their best terms
from a price and technical standpoint in their initial
quotations. Similarly, VA was under no obligation to specify its
dimensional requirements before downgrading the CDC equipment’s
technical score for its inability to fit properly into the VA
laundry facilities. The RFQ stated that equipment must be
suitable for installation in the available space and that space
limitations do exist for installation of the new equipment. It
also advised vendors to perform a site visit to ensure that they
could move or reinstall existing equipment alongside new
equipment. It clearly was reasonable for VA to give CDC a
less-than-maximum technical score for equipment that did not fit
properly into the available space, especially when CDC had the
opportunity to--and in fact did--visit the site for the very
purpose of assessing such questions as whether its equipment
might cause congestion of cart pathways. (Chicago
Dryer Company, B-293940, June 30, 2004) (pdf)
Where
detailed technical proposals are sought and technical evaluation
criteria are used to enable the agency to make comparative
judgments about the relative merits of competing proposals,
vendors are on notice that qualitative distinctions among
competing proposals will be made under the various evaluation
factors. See Leach Mgmt. Consulting Corp., B-292493.2, Oct. 3,
2003, 2003 CPD ¶ 175 at 4-5. In making such distinctions, an
agency may properly take into account specific, albeit not
expressly identified, matters that are logically encompassed by
or related to the stated evaluation criteria. Si-Nor, Inc.,
B-292748.2 et al., Jan. 7, 2004, 2004 CPD ¶ 10 at ____. Here,
the RFP required management of four different functional areas
and required offerors to demonstrate their past performance and
corporate experience with contracts of similar “complexity.”
Clearly, an offeror’s past performance history and experience
with managing the four functional areas to be performed is
relevant to the offeror’s history and experience with regard to
contracts that were similar in “complexity” to the RFP
requirements. See Leach Mgmt. Consulting Corp., supra.
Accordingly, it was reasonable for the agency to consider the
extent of the offerors’ multifunction contracting history and
experience when evaluating the past performance and corporate
experience factors. (Ashe
Facility Services, Inc., B-292218.3; B-292218.4, March 31,
2004) (pdf)
In our view, the agency’s evaluation was clearly inconsistent
with the terms of the solicitation. As indicated above, the RFP,
while initially referring only to the evaluation of proposals to
determine the extent of each offeror’s proposed subcontracting
plan, was amended to provide that the “[e]valuation will include
the extent of participation of small businesses in terms of the
total value of the acquisition.” RFP, amend. 4, at L-11. B&R’s
proposal plainly states that Ferguson-Williams, a small
business, would perform approximately 40 percent of the work if
B&R were awarded the contract.[7] AR, Tab 5, B&R Technical
Proposal, at 157. In light of this, we find that the agency’s
evaluation of B&R’s proposal under the “Support for the Small
Business, Small Disadvantaged Business, and Woman-Owned Business
Program” evaluation criterion was unreasonable. See Summit
Research Corp., B-287523; B-287523.3, July 12, 2001, 2001 CPD ¶
176 at 7 (in evaluating proposals under small business
participation factor, agency was required by the terms of the
solicitation to consider whether the offeror itself was a small
business in addition to whether the offeror proposed to rely on
small business subcontractors). (Burns
and Roe Services Corporation, B-291530, January 23, 2003) (pdf)
Although this solicitation is not a model of clarity, the
protester's interpretation that the solicitation required a
two-step process, with the second step providing for something
other than the consideration of the offerors' technical
proposals on a “pass/fail” basis, is unreasonable. As set forth
above, the RFP clearly provided in a number of places that each
proposal would be evaluated under the technical capability
factor “on a pass/fail basis” to determine technical
acceptability, and in describing the technical capability factor
and the relevant evaluation subfactors, included provisions that
“define[] an acceptable (pass) rating.” RFP §§ M‑2, M-3(a). The
solicitation also referred in a number of instances to the use
of the PPT procedures for the selection of the best-value
proposal, referenced the relevant Air Force Federal Acquisition
Regulation Supplement section that details the PPT procedure,
and explained that the PPT procedure here provides for the
best‑value determination to be made from those “technically
acceptable” proposals based upon a trade-off of past performance
and price. RFP § M-2. Given the RFP's repeated references to the
evaluation of proposals under the technical capability factor on
a “pass/fail” basis, and the RFP's statements that only those
proposals found to be technically acceptable would be eligible
for award, we find the protester's argument that the RFP
provided for the consideration of technical merit on a basis
other than technical acceptability (or “pass/fail”) to be
without merit. (All Phase
Environmental, Inc., B-292919.2; B-292919.3; B-292919.4;
B-292919.5; B-292919.6; B-292919.7, February 4, 2004) (pdf)
Here, the
RFP specifications were primarily stated in terms of functional
or performance requirements, which permitted an offeror to
propose its own unique approach. In particular, the solicitation
was silent with respect to whether the software should be
web-based or client-server-based, or some combination of the
two, and we see no legal defect in that. While the RFP thus left
it to the offeror to propose what it believed to be the best
approach for satisfying the RFP requirements, there was no
guarantee that the agency would consider different approaches to
be equally effective. Moreover, the agency was not required to
advise a technically acceptable offeror during discussions that
it considered another approach to be superior to that proposed
by the offeror. See Canadian Commercial Corp./Canadian Marconi
Co., B-250699.4, Mar. 5, 1993, 93-1 CPD ¶ 251 at 7. (Cerner
Corporation, B-293093; B-293093.2, February 2, 2004) (pdf)
In evaluating the revised CIS proposal, four of the five
evaluators again prepared only cursory narrative materials. In
terms of scoring, three of these four evaluators raised CIS’s
score by [deleted] points; the agency’s final consensus
technical evaluation report states that two of the three
evaluators increased their scores based on their conclusion that
the CIS proposal now met the personnel experience requirements,
and that the third increased his score based on CIS’s providing
“additional information” in its revised proposal. AR, exh. 7, at
5-6. Among these four evaluators, two assigned a final overall
technical score of [deleted] points and two assigned a score of
[deleted] points. Id. at 7. The fifth evaluator scored CIS’s
revised proposal dramatically differently, reducing CIS’s score
from [deleted] total points initially to [deleted] points on
reevaluation. Unlike the other four evaluators, he prepared
extensive narrative materials during his rescoring of the
proposal, AR, exh. 4, at 25-26, and his unedited comments were
ultimately incorporated into the final consensus technical
evaluation report, along with a summary of the other evaluators’
limited comments. AR, exh. 7, at 5. The agency’s source
selection decision document, AR, exh. 18, does not reflect any
critical or independent analysis or evaluation of the proposals
by the source selection official (SSO); instead, it relies
entirely upon the numeric scores for purposes of the agency’s
source selection decision. This being the case, the comments of
the fifth evaluator regarding deficiencies in CIS’s proposal
represent the sole support in the evaluation record for the
relatively low ranking of CIS’s revised proposal. This is
problematic for the agency because we find that the conclusions
expressed by the fifth evaluator are unreasonable. (Computer
Information Specialist, Inc., B-293049; B-293049.2, January
23, 2004) (pdf)
The solicitation specified both minimum or threshold (T)
requirements and desired or objective (O) requirements for the
RIS and other MDNS subsystems. In addition, it identified Key
Performance Parameters (KPP) and Additional Performance
Parameters (APP). The solicitation described KPPs as “must-pass
testing events” and provided that “[a]ny offering failing the
(T) value of any of the KPPs will be removed from further
testing and will not be considered for contract award.” RFP,
amend. 5, § 3.1.2. The solicitation described APPs as “tradable
parameters . . . used to measure effectiveness and performance,”
and provided that “[f]ailure to meet either (T) or (O)
requirement values specified in an APP does not remove a
submission from further testing or from consideration for
contract award. APPs are evaluated to provide information
leading to a best value award determination.” To the extent
that by noting that “the SIR decreased the SOF operators’
ability to better and more rapidly acquire, identify, and
accurately fire on enemy targets,” the contracting officer meant
that the SIR configurations failed to improve the operators’
ability to sight and fire on enemy targets, improvement in
firing accuracy was an APP (i.e., an objective or desired
improvement), as opposed to a KPP (i.e., a requirement). See
Subsection 3.2.1. Accordingly, to the extent that the evaluators
found ARMS’s SIRs unsuitable on the basis that they failed to
improve operators’ ability to acquire, identify, and fire on
enemy targets, the evaluators in effect found the SIRs
unsuitable for failing to meet the values specified in an APP.
In our view, rejection on such a basis was contrary to the terms
of the RFP, which, as previously noted, provided that “[f]ailure
to meet either (T) or (O) specified in an APP does not remove a
submission from further testing or from consideration for
contract award.” (Atlantic
Research Marketing Systems, Inc., B-292743, December 1,
2003)
Next, Dismas protests the agency's evaluation of offerors' past
performance, the most heavily weighted evaluation factor, on the
basis that the agency failed to request the same type of
information from all references and in a manner consistent with
that specified by the RFP. In this regard, the RFP stated: “The
Contractor Evaluation Form [CEF], located in Section J, will be
used to collect [past performance] information.” Agency Report,
Tab 1, RFP at 61. The CEF requested references to assign
numerical ratings, ranging from 1 to 5,[12] in three contract
performance areas--contract compliance, customer satisfaction,
and business relations. Agency Report Tab 7, CEF, at 2-3. Of
significance, the CEF also contained the following directive to
references: “If performance is over and above the minimum
requirements of the SOW, add either one, two or three points [to
the aggregate point score].” Agency Report Tab 7, CEF, at 2.
The agency does not dispute that it failed to use the
RFP-specified CEF for many of Dismas's references. Rather, for
any reference other than the Bureau of Prisons, the agency
states that it prepared a shorter “letter/questionnaire.” Agency
Report, Contracting Officer's Statement, at 15. Although the
letter/questionnaire sought the same numerical/adjectival
ratings for the three contract performance areas identified in
the CEF, it did not direct references to increase an offeror's
rating “[i]f performance is over and above the [SOW] minimum
requirements.” Agency Report, Tab 7, Letter/Questionnaire, at
38. (Dismas Charities,
Inc. , B-292091, June 25, 2003) (pdf)
Independence specifically contends that it was improperly
downgraded for failing to provide items not required by the RFP,
such as documentation of project management, project timelines
or flowcharts, organizational charts, key personnel resumes,
quality control plan, safety plan, and the identity of
subcontractors. However, offerors were specifically instructed
by the RFP to provide a detailed technical proposal that
contained “sufficient information to reflect a thorough
understanding of the requirements and a detailed description of
the techniques, procedures and program for achieving the
objective of the specifications/statement of work.” RFP at 82.
Offerors were also specifically instructed to “at a minimum”
provide information concerning the “name and qualifications of
key personnel” and to “address[] supervision and communication,”
“crew size and experience,” “quality control plan,” “proposed
schedule of operations,” and “safety.” RFP at 84. Given
these requirements, we think that the items noted as missing or
deficient in Independence's proposal were reasonably encompassed
under the four listed subfactors of the technical
approach/organization structure factor: (1) how the project will
be managed and staffed, addressing contractor's proposed
schedule of operations--which reasonably encompassed project
management information and organizational charts as an indicator
of how the project will be managed, subcontractor identity as an
indicator of staffing, and the time frame for completion of the
work as an indicator of scheduling; (2) skills and skill levels
within the firm to accomplish the project--which reasonably
encompassed key personnel qualifications and experience, as may
be reflected in resumes; and (3) quality control and (4)
safety--which both reasonably encompassed providing plans. RFP
at 84, 86. (Independence
Construction, Inc. , B-292052, May 19, 2003) (pdf)
An agency may not induce offerors to prepare and submit
proposals based on one premise, then make source selection
decisions based on another. Hattal & Assocs., B‑243357;
B-243357.2, July 25, 1991, 91-2 CPD ¶ 90. Accordingly, once
offerors are informed of the evaluation criteria against which
proposals will be evaluated, the agency must adhere to the
stated criteria, or inform all offerors of all significant
changes. DynCorp, B-245289, B-245289.2, Dec. 23, 1991, 91-2 CPD
¶ 575 at 5. Here, the agency clearly advised offerors that
proposals would be evaluated under 14 of the 17 technical
evaluation factors based on an objective assessment as to
whether or not the proposed IBR would meet the stated
performance requirements. The agency's stated basis of
evaluation for these factors was specifically described in
contrast to the subjective grading by which the remaining three
technical factors would be evaluated. Notwithstanding these
publicly stated ground rules for the competition, the agency
proceeded to make qualitative distinctions between the two
proposals based on factors which were neither disclosed, nor
reasonably subsumed within the stated requirements. Ultimately,
the agency concluded that [deleted] “among the most highly
rated” based on these distinctions. (Mnemonics,
Inc., B-290961, October 28, 2002) (pdf) (txt
version)
The agency's action here effectively restricted the competition
to firms with direct corporate detention/custody experience and
represents the application of an unstated minimum requirement in
the evaluation. If this represented the agency's actual minimum
needs, the agency was required to amend the RFP to reflect those
needs.[9] FAR § 15.206(a);
Canberra Indus., Inc., B-271016, June 5, 1996, 96-1 CPD ¶ 269 at
4. (Omniplex
World Services Corporation, B-290996.2, January 27, 2003)
(txt
version)
In any case, while agencies are
required to identify the major evaluation factors in a
solicitation, they are not required to identify all areas of
each factor that might be taken into account, provided that the
unidentified areas are reasonably related to or encompassed by
the stated criteria. S3 LTD, B-288195, Sept. 10, 2001, 2001 CPD
¶ 164 at 10. Here, a system's high frequency performance was
indicated by the model (NVTHERM) the offerors knew would be used
in the evaluation, and increasing frequency was related to range
in that it indicated an ability to perform the same task at a
greater range. Tr. at 29-30, 193. At a minimum, in these
circumstances, high frequency performance was reasonably related
to the thermal channel performance specified in the RFP. (DRS
Systems, Inc., B-289928.3; B-289928.7, September 18,
2002) (pdf)
With respect to the merits of the
protest, the agency basically contends that there is no language
in section M of the RFP stating how CLINs for existing INMARSAT
services would be awarded, so that the awards, based on the
evaluation of 3-year pricing, are consistent with section G.6
because they may save the government money, inasmuch as it is
not certain that the options would be exercised, as offerors
were advised in section M. This ID/IQ contract had two
types of CLINs. The CLINs for existing INMARSAT services
were, in accordance with section G.6, to be awarded to the
“winning Contractor” “based on a comparison of all
awardee[s'] contract prices.” The solicitation
contemplated no future competitions for this work among the
awardees. In contrast, work covered by the other CLINs
would, in accordance with section G.5, be generally awarded
based on competition among the awardees. The agency's
basic argument here is that it is not bound by section M in
making the awards of the CLINs for existing INMARSAT services
because section M did not specify how the awardees of these
CLINs would be selected and because it cautioned that options
may not be exercised. However, a solicitation must be read
and interpreted as a whole, and, where appropriate, sections
other than section M of the solicitation must be considered in
determining how proposals should be evaluated. See Joseph
W. Beausoliel, B-285643, Aug. 31, 2000, 2001 CPD ¶ 26
at 4; Recon Optical, Inc. B-232125, Dec. 1, 1988, 88-2
CPD ¶ 544 at 8-9. Here, section G.6 stated that the
awards for the existing INMARSAT services “will be based upon
low price,” based on a “comparison of all awardee[s']
contract prices” with the low price controlling which awardee
would receive the award of these CLINs. Section M.4.2.b of
the RFP then expressly provided that “the evaluation period is
120 months.” No other period of evaluation is provided
for in section M. Thus, we find that the RFP unambiguously
provided that the awards for each CLIN covering existing
INMARSAT services would be made to the awardee offering the
lowest 10‑year pricing for these services, as reflected in
its proposal under the RFP. (Global
Communications Solutions, Inc., B-291113, November 15, 2002)
There
is another defect in the agency's evaluation under the objective
suitability factor. The RFP stated that the evaluation under
this factor would be under a descending-order-of-importance
weighting scheme for both criteria and sub-criteria. The
agency's evaluation erroneously applied equal weight to all
criteria and sub-criteria in awarding Carr's proposal [DELETED]
points and M&S Farms' proposal [DELETED] points. During the
course of this protest, the agency acknowledged this defect and
re-scored the proposals applying the following weights to the
criteria:
Criterion 1 --- 120 points
Criterion 2 --- 90 points
Criterion 3 --- 80 points
Criterion 4 --- 60 points
Criterion 5 --- 50 points
The new technical scores resulted
in Carr's proposal still receiving a higher point score with
[DELETED] points as compared to M&S Farms' proposal with
[DELETED] points. Agency Report at 14-15. However, the
re-scoring applied equal weight to the sub-criteria under each
criterion that had them. This re-scoring thus fails to accord
sub-criteria descending weights consistent with the evaluation
scheme stated in the RFP. Since the sub-criteria under which the
protester's proposal was scored lower than the awardee's were
all at or near the bottom of the lists of sub-criteria, and thus
should be accorded less weight than under any of the evaluation
schemes applied by the agency to date, the difference in overall
technical scores (prior to any correction of other defects
identified in this decision) should be less than calculated by
the agency. (M&S
Farms, Inc., B-290599, September 5, 2002) (pdf)
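As a rough numerical sketch of the weighting point discussed in the M&S Farms excerpt above (the actual scores are redacted, so every rating and weight below is hypothetical), the following Python snippet shows how switching from equal sub-criterion weights to a descending-order weighting, as the RFP's scheme contemplated, can narrow the gap between two proposals when the lower-rated sub-criteria sit at the bottom of the importance list:

    # Illustrative only: hypothetical ratings showing why descending
    # sub-criterion weights can narrow the scoring gap when the protester's
    # weaker ratings fall under the least important sub-criteria.

    def weighted_score(ratings, weights):
        """Combine 0-10 sub-criterion ratings using the given weights."""
        return sum(r * w for r, w in zip(ratings, weights))

    # Hypothetical ratings for five sub-criteria, listed in descending order
    # of importance; the protester trails only on the last two (least
    # important) sub-criteria.
    awardee   = [9, 9, 8, 9, 9]
    protester = [9, 9, 8, 6, 5]

    equal_weights      = [4, 4, 4, 4, 4]   # equal weighting (as applied)
    descending_weights = [8, 5, 4, 2, 1]   # descending-order weighting

    for label, w in [("equal", equal_weights),
                     ("descending", descending_weights)]:
        gap = weighted_score(awardee, w) - weighted_score(protester, w)
        print(f"{label:>10} weights: gap = {gap}")

    # The gap shrinks under descending weights (here from 28 to 10 points),
    # consistent with the decision's observation that the score difference
    # should be less than the agency calculated.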
This
argument is without merit. While the RFP included standard
"best value" evaluation language providing for award
based on the "most advantageous" proposal, it did not
go on to establish any evaluation factors or a scheme for
conducting a comparative evaluation. To the contrary, in lieu of
evaluation factors, the RFP stated only that proposals had to
"[meet] all the terms and conditions of the
solicitation," that the offeror had to be responsible, and
that the price had to be considered fair and reasonable.
Consistent with these limited evaluation considerations, the RFP
required offerors to provide only a "technical description
of the items being offered in sufficient detail to evaluate
compliance with the requirements in the solicitation." RFP
at 10. Reading these provisions together, there was no basis for
the agency to conduct a comparative evaluation; rather,
notwithstanding inclusion of the standard "best value"
evaluation language, it was sufficiently clear from the RFP that
the award was to be made to the low-priced offeror that
demonstrated compliance with the RFP requirements, i.e.,
acceptability. See Vistron, Inc., B-277497, Oct. 17, 1997,
97-2 CPD ¶ 107 at 4. (Duncan
Security Consultants, Inc., B-290574, August 8, 2002)
(pdf)
Oral advice that conflicts with
the solicitation is not binding on the government, and an
offeror therefore relies on an oral explanation of a
solicitation at its own risk. Input/Output Tech., Inc.,
B-280585, B-280585.2, Oct. 21, 1998, 98-2 CPD para. 131 at 5.
Here, while the response to Diamond's questions was provided in
the form of an e-mail, rather than orally, the result is the
same. The solicitation put offerors, including Diamond, on
notice that proposals would be evaluated against stated
evaluation factors and specific requirements; if an offered
model did not satisfy one or more of the RFP requirements, it
would be rejected as unacceptable. No informal advice--oral, or
otherwise--could change this basis for evaluation, since the
advice would not amend the solicitation. (Diamond
Aircraft Industries, Inc., B-289309, February 4, 2002)
While procuring agencies are
required to identify the significant evaluation factors and
subfactors in a solicitation, they are not required to identify
the various aspects of each factor which might be taken into
account, provided that such aspects are reasonably related to or
encompassed by the RFP's stated evaluation criteria. Farnham
Security, Inc., B-280959.5, Feb. 9, 1999, 99-1 CPD para. 100 at
3. GSA asserts that organizational structure and
transition/start-up plan are logically related to the staffing
plan factor, and therefore were properly considered under that
factor. NCLN20 does not dispute that these items are
logically related to the staffing plan factor, and we think the
agency reasonably could expect offerors to recognize the need to
address these areas, even without an express RFP requirement for
the information. We therefore find no basis for questioning the
agency's position. (NCLN20,
Inc., B-287692, July 25, 2001)
In a negotiated procurement, an
agency has broad discretion in determining the manner and extent
to which it will make use of evaluation results. TRW, Inc.,
B-234558, June 21, 1989, 89-1 CPD ¶ 584 at 4. Agencies do not,
however, have the discretion to announce in the solicitation
that they will use one evaluation plan and then follow another;
once offerors are informed of the criteria against which their
proposals will be evaluated and the source selection decision
made, the agency must adhere to those criteria or inform all
offerors of significant changes. DynCorp, B-245289,
B-245289.2, Dec. 23, 1991, 91-2 CPD ¶ 575 at 5. Specifically,
it is improper to induce an offeror to prepare and submit a
proposal emphasizing technical excellence, then evaluate
proposals only for technical acceptability, and make the source
selection decision on the basis of technical acceptability and
lowest cost or price. Hattal & Assocs., B-243357,
B-243357.2, July 25, 1991, 91-2 CPD ¶ 90 at 7-9. (Special Operations Group, Inc., B-287013; B-287013.2, March
30, 2001)
While agencies have broad discretion in making source selection decisions, their decisions must be rational and consistent with the solicitation’s stated evaluation scheme; an agency may not announce one basis for evaluation and award in the RFP and then evaluate proposals and make award on a different basis. Marquette Med. Sys., Inc., B-277827.5, B-277827.7, Apr. 29, 1999, 99-1 CPD ¶ 90 at 5-6. The source selection decision document articulates three considerations other than past performance that formed the basis for the agency’s decision to select TAC’s higher-priced offer for award. None of these considerations is either identified in the RFP as an evaluation criterion, or logically encompassed within the stated criteria.
(Tennier
Industries, Inc., B-286706.2; B-286706.3, March 14, 2001)
IGC primarily argues that the
agency improperly made award on the basis of an undisclosed
criterion--proven installation compatibility of a third party's
gradient shim assembly with the offered magnet, including
consideration of the areas of mechanical vibrations and
inductive coupling. The evaluation here was
unobjectionable. While installation compatibility of the magnet
with a third party's gradient shim assembly was not specifically
required in the solicitation, the need for such compatibility
should have been clear from the discussion questions. (Intermagnetics
General Corporation, B-286596, January 19, 2001)
It is fundamental that offerors
must be advised of the bases upon which their proposals will be
evaluated. Sci-Tec Gauging, Inc.; Sarasota Measurements &
Controls, Inc., B-252406, B-252406.2, June 25, 1993, 93-1 CPD
para. 494 at 7. It is also fundamental that the contracting
agency must treat all offerors equally, which includes providing
a common basis for the preparation and submission of proposals
and not disparately evaluating offerors with respect to the same
requirements. Id. at 8. In reviewing a protest of an agency's
evaluation of proposals, we examine the record to ensure that
the agency's evaluation was reasonable and consistent with the
stated evaluation criteria. Ogden Support Servs., Inc.,
B-270012.2, Mar. 19, 1996, 96-1 CPD para. 177 at 5. (Rockwell
Electronic Commerce Corporation, B-286201; B-286201.2;
B-286201.3, December 14, 2000)
Requirements stated in source
selection plans which are not disclosed to offerors do not give
outside parties any rights. Mandex, Inc.; Tero Tek Int'l, Inc.,
B-241759 et al., Mar. 5, 1991, 91-1 CPD para. (Wilson
5 Service Company, Inc., B-285343.2; B-285343.3, October 10,
2000)
The most significant example in this regard is G&N's claim that its own experience significantly exceeded both the RFP requirements and BMAR's experience, so that G&N should have been more highly rated than BMAR in this area. Because the RFP does not call for awarding extra evaluation credit for exceeding the stated requirements, the agency was not obligated to credit additional evaluation points for doing so. See SeaSpace, B-241564, Feb. 15, 1991, 91-1 CPD ¶ 179 at 4-5. Since the evaluators determined that G&N's proposal met the RFP requirements "to the fullest extent," they rated G&N's proposal outstanding. To the extent G&N is arguing that the agency was required to reserve the highest rating for a proposal that exceeded the requirements, as noted above, there is nothing in the RFP to support this view. The issue is not, as G&N repeatedly insists, whether BMAR's and G&N's proposals demonstrated identical or precisely equivalent technical merit or equivalent experience vis-à-vis each other, but rather, whether each satisfied the RFP requirements and met the RFP definition of "outstanding."
(G&N,
L.L.C., B-285118; B-285118.2; B-285118.3, July 19, 2000)
Moreland's underlying premise that the enhancements proposed by Heidorn are unrelated to the evaluation factors established in the SFO is misplaced. While the precise "extras" offered by Heidorn were not included as part of the minimum requirements listed in the SFO, they are very closely related to the stated evaluation criteria. For example, the "quality of building and design concept" and its subfactor, "building design," expressly include "the way natural daylight will be accessed within the building design, i.e., the number of windows, skylights and open courtyards" as considerations. SFO § 2.4(2)(a)(2). The SFO even mentions as an alternative to the courtyards depicted in the SFO floorplans "maintaining the courtyard space within the building and providing a minimum 15' X 15' skylight centered in the three courtyards," just as Heidorn proposed in its revised final offer. Id. Thus, the SFO very clearly placed offerors on notice that the agency was interested in a design concept that would maximize the elements of natural light and energy efficiency, and the enhancements offered in Heidorn's revised proposal simply reflected a better implementation of the agency's stated preferences in this area. Accordingly, the agency's recognition of these enhancements in evaluating Heidorn's proposal is unobjectionable.
(The
Moreland Corporation, B-283685, December 17, 1999)
Interlog's protest raises but
one issue for our review: whether the agency's evaluation scheme
reserved such a significant portion of available points for
proposals exceeding the stated requirements of the RFP that the
evaluation turned on unstated criteria. When an RFP states
a best value evaluation plan--as opposed to selection of the
lowest-priced, technically acceptable proposal--evaluation of
proposals is not limited to determining whether a proposal is
merely technically acceptable; rather, proposals may be further
differentiated to distinguish their relative quality by
considering the degree to which technically acceptable proposals
exceed the standard minimum requirements or will better satisfy
the agency's needs. Israel Aircraft Indus., Ltd., MATA
Helicopters Div., B-274389 et al., Dec. 6, 1996, 97-1 CPD para.
41 at 5-6; Meridian Corp., B-246330.3, July 19, 1993, 93-2 CPD
para. 29 at 6-7. (Interlog,
Inc., B-282139, April 27, 1999)
In evaluating a proposal, an
agency properly may take into account specific, albeit not
expressly identified, matters that are logically encompassed by
or related to the stated evaluation criteria. (Farnham
Security, Inc., B-280959.5, February 9, 1999)
While adjectival ratings and
point scores are useful as guides to decision-making, they
generally are not controlling, but rather, must be supported by
documentation of the relative differences between proposals,
their weaknesses and risks, and the basis and reasons for the
selection decision. FAR sec. 15.608(a)(3), 15.612(d)(2) (June
1997); Century Envtl. Hygiene, Inc., B-279378, June 5, 1998,
98-1 CPD para. 164 at 4; Engineering and Computation, Inc.,
supra, at 3. Here, the record is devoid of any
documentation of the agency's evaluation of final revised
proposals. There is no indication of an analysis of the revised
proposals, no information in the record regarding proposal
strengths and weaknesses after discussions, and no discussion as
to why the strengths and weaknesses from the initial evaluation
remained the same. (Biospherics
Incorporated, B-278508.4; B-278508.5; B-278508.6, October 6,
1998)
Agency's source selection
decision in which price is considered only as an eligibility
factor (i.e., price does or does not fall within agency's
budget) is improper because it fails to consider price as a
significant evaluation factor, as required by 10 U.S.C. sec.
2305(a)(2)(A) and sec. 2305(a)(3)(A) (1994) and as contemplated
by the best value evaluation plan stated in the solicitation.
(Electronic
Design, Inc., B-279662.2; B-279662.3; B-279662.4, August 31,
1998)
Concerning the lack of
industrial hygiene technician resumes, Century argues that this
was improperly considered a weakness since the RFP did not
require resumes for industrial hygiene technicians. In response
to a question asked by our Office, HRSA concedes that the RFP
required resumes only for professional personnel and that
industrial hygiene technicians are not considered professional
personnel. The evaluator that made this comment assigned only 7
points (out of 20) to Century's BAFO. The record includes no
explanation of how many points this comment cost Century's
proposal in this evaluator's scoring; since it was one of only
two negative comments, it appears to have been a substantial
number. (Century
Environmental Hygiene, Inc., B-279378, June 5, 1998)
First, it was improper for the
Army not to consider the results of the sample delivery order
cost evaluation in reaching its award decision. The
Competition in Contracting Act requires that a solicitation for
competitive proposals state all significant factors that will be
considered in the evaluation of proposals and ultimate award
decision, 10 U.S.C. sec. 2305(a)(2)(A) (1994), and that agencies
evaluate proposals based solely on the factors included in the
solicitation. 10 U.S.C. sec. 2305(b)(1). By virtue of this
statutory requirement, an agency may not represent that it will
evaluate proposals on one basis, and then use another. Thus,
given the RFP's specific representation that sample delivery
order costs would be considered in the cost evaluation, the Army
could not ignore that information in making its source
selection. Group Techs. Corp., B-240736, Dec. 19, 1990, 90-2 CPD
para. 502 at 9-10 (agency is required to use cost evaluation
tools provided in the evaluation scheme). (FC
Business Systems, Inc., B-278730, March 6, 1998) (pdf)