Civic RFP transparency trends, state by state
We scored 50 state procurement portals on five transparency criteria — addenda visibility, debrief access, vendor Q&A publication, awarded-vendor disclosure, and data portability. Wide variance, four leaders, and a long tail.
This is a research note from the PursuitAgent research team. We scored every U.S. state’s primary procurement portal on five criteria. The findings: wide variance, four genuine leaders, a middle pack of competent-but-incomplete portals, and a long tail of portals that publish only the bare minimum a sunshine law requires.
The scoring rubric and methodology are at the bottom. The rankings come first.
The five criteria
Each portal is scored 0-3 on each criterion, for a total score of 0-15.
- Addenda visibility. Are RFP modifications and Q&A responses posted in a way vendors can subscribe to? (3 = email/RSS push notifications, 2 = visible on the listing without login, 1 = visible after login, 0 = no consistent pattern)
- Debrief access. Can losing vendors request a written debrief, with a published process? (3 = automatic written debrief on losing notification, 2 = published process with timeline, 1 = available on request without published process, 0 = ad hoc)
- Vendor Q&A publication. Are pre-bid questions and answers published to all vendors? (3 = real-time public Q&A, 2 = batched public Q&A, 1 = answers shared with the asking vendor only, 0 = no public Q&A mechanism)
- Awarded-vendor disclosure. Are award decisions, including price and shortlist, published? (3 = full disclosure including evaluation summary, 2 = vendor name and price, 1 = vendor name only, 0 = no consistent disclosure)
- Data portability. Can the portal’s data be accessed via API or bulk download in a structured format? (3 = OCDS-compliant API, 2 = structured CSV/JSON download, 1 = HTML scrapeable, 0 = PDF only)
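Applied mechanically, the rubric is a five-key score sheet summed to a 0-15 total and bucketed into the tiers used below. A minimal sketch — the criterion names and tier cutoffs come from this note, but the example portal scores are hypothetical:

```python
# Score a portal on the five rubric criteria (each 0-3) and bucket it
# into the tiers used in this note: leaders (12+), middle pack (7-11),
# long tail (below 7). Example scores are hypothetical.
CRITERIA = (
    "addenda_visibility",
    "debrief_access",
    "vendor_qa_publication",
    "awarded_vendor_disclosure",
    "data_portability",
)

def total_score(scores: dict) -> int:
    if set(scores) != set(CRITERIA):
        raise ValueError("expected exactly the five rubric criteria")
    if any(not 0 <= v <= 3 for v in scores.values()):
        raise ValueError("each criterion is scored 0-3")
    return sum(scores.values())

def tier(total: int) -> str:
    if total >= 12:
        return "leader"
    if total >= 7:
        return "middle pack"
    return "long tail"

# Hypothetical portal: solid on the sunshine-law pieces, weak elsewhere.
example = {
    "addenda_visibility": 2,
    "debrief_access": 1,
    "vendor_qa_publication": 2,
    "awarded_vendor_disclosure": 2,
    "data_portability": 1,
}
print(total_score(example), tier(total_score(example)))  # prints: 8 middle pack
```

The tier cutoffs mirror the section headings that follow; a portal's bucket is determined entirely by its summed rubric score.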
The leaders (score 12+)
California (14). Cal eProcure publishes addenda with subscription, has a published debrief process, posts Q&A publicly in batches, discloses awards with evaluation summaries, and provides bulk data downloads. The remaining gap is real-time API access; downloads are batched weekly.
Washington (13). Washington’s Enterprise Procurement system has the strongest data portability of any state — full API access, near-real-time. Loses a point on debrief access; the process exists but isn’t published, so vendors discover it on request.
Minnesota (12). Strong on disclosure and Q&A. Debrief access is by request and runs on a 60-day SLA. Data portability is structured download but not API.
Utah (12). Tied with Minnesota on the rubric, slightly different distribution. Strong on addenda and Q&A, weaker on data portability.
These four states are noticeably ahead. Vendors who track them as references for “what we expect from procurement transparency” are calibrating against a real ceiling, not an imagined one.
The middle pack (score 7-11)
About 30 states sit in this range. The pattern is similar everywhere: solid on basic addenda visibility and awarded-vendor disclosure (the sunshine-law-required pieces), weaker on debrief access and data portability (the discretionary pieces).
A few notable subpatterns:
- Northeast cluster — Massachusetts, New York, New Jersey, Connecticut: strong on disclosure, weak on data portability. These are state portals built 10-15 years ago that haven’t been modernized.
- Texas (10): Above the middle pack on disclosure but the procurement system is split across multiple portals depending on the buying agency, which fragments the vendor experience and lowers the practical score below the rubric score.
- Florida (9): Reasonable on every criterion, exceptional on none.
The middle pack is where procurement modernization is currently in motion. Several states (Colorado, Oregon, Virginia) have published modernization roadmaps that should move them into the leaders’ tier within the next two years.
The long tail (score below 7)
About 15 states score below 7. The patterns are consistent:
- PDF-only data. The portal lists RFPs but the listings are HTML pages without structured data; the actual RFPs are PDF attachments. Vendors who want to track multiple bids do it manually.
- No published debrief process. Vendors who lose a bid have no written explanation of how to request one, and the answer they get when they ask varies by agency.
- Vendor Q&A is private. Pre-bid questions are answered to the asking vendor; other bidders don’t see them. This advantages vendors who know to ask and disadvantages everyone else.
- Awarded-vendor disclosure is name-only. The portal publishes who won; price, evaluation, and shortlist aren’t disclosed.
We’re not naming the long-tail states by rank, because the data is volatile and the rubric is imperfect. The states in this tier know who they are; their procurement modernization plans, when they exist, are the public record of their position.
What the variance looks like across criteria
| Criterion | Median score | High | Low |
|---|---|---|---|
| Addenda visibility | 2.0 | 3 | 1 |
| Debrief access | 1.0 | 3 | 0 |
| Vendor Q&A publication | 2.0 | 3 | 0 |
| Awarded-vendor disclosure | 2.0 | 3 | 1 |
| Data portability | 1.0 | 3 | 0 |
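The table’s summary statistics are easy to recompute from per-state scores. A sketch using illustrative per-state values, not our actual corpus (which this note doesn’t publish):

```python
from statistics import median

# Per-criterion scores across states. Values here are illustrative,
# not the actual corpus; each state contributes one 0-3 score.
scores_by_criterion = {
    "debrief_access":   [0, 1, 1, 0, 3, 2, 1, 0, 1, 3],
    "data_portability": [1, 0, 1, 2, 3, 1, 0, 1, 2, 3],
}

for criterion, scores in scores_by_criterion.items():
    print(
        criterion,
        "median:", median(scores),
        "high:", max(scores),
        "low:", min(scores),
        "spread:", max(scores) - min(scores),
    )
```

The high-minus-low spread is the quantity the next paragraph is pointing at: debrief access and data portability span the full 0-3 range, while addenda visibility and awarded-vendor disclosure never bottom out at 0.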
Debrief access and data portability are the two criteria with the widest spread. They’re also the two that most directly help losing vendors learn from their losses — debrief access for the immediate feedback, data portability for the longitudinal view across bids. The variance here matters more for vendor strategy than the variance in addenda visibility, where most states cluster.
What this means for vendor strategy
For vendors active in multiple states, the transparency variance is a planning input. A vendor who treats every state as if it were Washington misses the debrief-access gap in 40 other states. A vendor who treats every state as if it were the long tail misses the data-availability advantage in California and Minnesota.
Three concrete moves that fall out of the data:
Subscribe to addenda where you can. California, Washington, and the Northeast cluster all support addenda subscription. Use it. The “we missed the modification” failure mode in the eight-stage RFP pipeline post is preventable in ~30 states.
Build a debrief request template for the states without a published process. The states where you’ll have to ask are the majority. A standardized request, sent within 5 business days of award notification, is the standard practice.
Use data-portable states as your benchmark. When you’re trying to understand pricing, evaluation criteria, or competitor positioning at the state level, the states with bulk-data access are the ones where you can actually study the corpus. Start there; extrapolate to the rest with the appropriate uncertainty.
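In a state that scores 2+ on data portability, the award corpus can be loaded and summarized directly. A sketch against a hypothetical CSV export — the column names (`agency`, `winning_vendor`, `award_price`) are invented for illustration, not any state portal’s actual schema:

```python
import csv
import io
from collections import defaultdict

# Hypothetical award-disclosure export from a data-portable state.
# Column names are invented; a real portal's schema will differ.
raw_csv = """agency,winning_vendor,award_price
Transportation,Acme Roads LLC,1200000
Transportation,Beta Paving Inc,950000
Health,Gamma Systems,400000
Health,Acme Roads LLC,350000
"""

# Aggregate award count and total spend per buying agency.
totals = defaultdict(lambda: {"count": 0, "spend": 0})
for row in csv.DictReader(io.StringIO(raw_csv)):
    bucket = totals[row["agency"]]
    bucket["count"] += 1
    bucket["spend"] += int(row["award_price"])

for agency, b in sorted(totals.items()):
    print(agency, b["count"], b["spend"])
```

This kind of aggregation is only possible where structured bulk data exists; in PDF-only states the same question costs hours of manual extraction per bid.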
Methodology and caveats
We scored portals between November 2025 and January 2026 by direct portal access. Where access required vendor registration, we registered. Scores reflect what was discoverable in 30 minutes of portal use per state — a vendor’s first-day experience, not a procurement specialist’s deep familiarity.
The rubric is PursuitAgent’s. It draws on the Sunlight Foundation’s open procurement principles and the Open Contracting Partnership data standard, but the specific weights and interpretation are ours. Other rubrics — particularly ones that weight equity-of-access or small-business outreach — would produce different rankings. The rankings here are not “the” transparency rankings.
State procurement portals change. Three states meaningfully improved their data portability between when we started scoring and when we finished. We re-ran those states; the corpus reflects the higher score. The picture in 18 months will be different.
We did not score local (city, county, school district) procurement. That corpus is roughly 10x the size and a different research project. The variance there is larger than the variance across states.
What’s next
The next research note in this series, landing in March, is the federal-side cross-cut: how the major federal procurement systems (SAM.gov, Acquisition.gov, agency portals) compare on the same five criteria, and how the federal record relates to the state record. Federal disclosure is more uniform, but the variance across agencies is wider than across states — a finding worth its own post.