What 100 splash pads taught us about real accessibility
A 14-point audit of 100 verified splash pads across 25 US metro areas, scored against the ADA 2010 Standards for Accessible Design and the federal Outdoor Developed Areas guidelines. Written for parents of disabled children, ADA advocates, parks departments, and journalists covering public-recreation equity.
Audit window: March – April 2026 · Last reviewed: 2026-05-10 · Open data under CC BY 4.0
Direct answer
SplashPadHub audited 100 verified splash pads across 25 US metro areas in March and April 2026 against a 14-point accessibility rubric grounded in the ADA 2010 Standards and the federal Outdoor Developed Areas guidelines. Only 36% of audited pads cleared all 14 points. The strongest area was paved approach paths (97%); the weakest was designated sensory or quiet zones (12%). ADA companion seats were present at 71% of pads but only 41% sat within direct line-of-sight to the play zone — a small placement detail with outsized consequences for families.
What we audited
The 2026 audit covered 100 splash pads across 25 US metropolitan areas, four pads per metro plus a handful of regional alternates that replaced sites unexpectedly closed for renovation during the audit window. Metros were chosen to span the four Census regions and to balance climate, density, and parks-department maturity: large coastal metros (Seattle, Boston, Miami, Los Angeles), Sun Belt high-growth metros (Phoenix, Austin, Tampa, Charlotte), Midwestern and Mountain mid-size metros (Madison, Pittsburgh, Boise, Des Moines), and several rural-adjacent micropolitan areas where a single municipal pad serves a multi-county population.
Within each metro the sample mixed urban, suburban, and rural-edge sites, with deliberate inclusion of pads operated by city parks departments, county park districts, and regional special districts. Pads built before 2010 (pre-ADA-2010-Standards) and pads built after 2015 were both represented to surface generational design differences. Three pads were excluded mid-audit when on-site closures or active renovations made scoring impossible; replacements were drawn from the same metro using the same urban or suburban classification.
Field work was conducted in March and April 2026 by trained reviewers using a standardized 14-point rubric (below) plus a free-text observations field. Where a site could not be visited in person, the audit relied on official parks-department accessibility statements, published site plans, and high-resolution imagery — and every such pad was flagged as desk-verified rather than field-verified in the underlying dataset.
Methodology
The rubric translates the relevant federal accessibility frameworks into 14 observable site conditions. The two governing standards are the ADA 2010 Standards for Accessible Design (Title II for public entities, including municipal parks departments) and the Architectural Barriers Act Outdoor Developed Areas guidelines (covering trails, picnic areas, and outdoor recreation features adjacent to a splash pad). The 14 points are: paved approach path, ground-level spray entry, ADA companion seat, splash-mat surface transition, accessible restroom proximity, family / companion changing area, accessible hydration station, Braille or tactile signage, audio cues, non-color signaling for hazards, slip-resistant play surface, designated sensory or quiet zone, ADA-width parking with access aisle, and shaded rest seating.
Each pad was scored Pass / Partial / Fail per dimension. A Partial counts as half a point in aggregate scores but is broken out in the per-dimension rates below — for example, a parking lot with an ADA stall but a blocked access aisle scores Partial on "ADA parking width." Reviewers also captured a 1-5 caregiver-experience composite for line-of-sight, transfer ergonomics, and crowd flow, which informs the regional commentary but is not part of the headline pass rate.
Audit notes, the per-pad scoring sheet, and the reviewer field guide are part of the open-data release. The headline 36% pass rate is the share of pads that scored Pass on all 14 dimensions; it runs lower than a formal compliance review would because the rubric interprets the companion accessibility frameworks more strictly than the ADA Title II self-evaluation that parks departments are typically held to.
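The aggregation rules above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the audit's actual tooling: the score labels and the two example pads below are hypothetical, and only the weighting (Pass = 1, Partial = 0.5, Fail = 0) and the pass-all headline metric come from the methodology as described.

```python
# Illustrative sketch of the audit's scoring aggregation. The data below is
# hypothetical; only the weighting scheme and the pass-all rule are from the
# published methodology.
RUBRIC_POINTS = 14

def composite_score(scores):
    """Aggregate Pass/Partial/Fail marks: Pass = 1, Partial = 0.5, Fail = 0."""
    weights = {"pass": 1.0, "partial": 0.5, "fail": 0.0}
    return sum(weights[s] for s in scores)

def cleared_all(scores):
    """Headline metric: a pad counts only if every dimension is a full Pass."""
    return len(scores) == RUBRIC_POINTS and all(s == "pass" for s in scores)

# Two hypothetical pads: one clears all 14 points, one scores 12.5 in
# aggregate (12 Pass + 1 Partial + 1 Fail) but does not clear the headline bar.
pads = [
    ["pass"] * 14,
    ["pass"] * 12 + ["partial", "fail"],
]

pass_all_rate = sum(cleared_all(p) for p in pads) / len(pads)
```

Note that a Partial contributes to the composite score but never to the headline rate, which is why a pad with a blocked access aisle can score 13.5 points and still fall outside the 36%.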
Headline findings
36% of audited pads cleared all 14 points. Another 41% cleared 10-13 points, typically missing on sensory zones, family changing areas, or Braille signage. The remaining 23% cleared nine or fewer points and would not be considered substantively accessible to a wheelchair-using child with a non-ambulatory caregiver, even where the site is technically ADA-compliant under a narrower reading.
The strongest dimension was the paved approach path at 97% — a near-universal feature and a credit to two decades of federal pressure on parks-and-recreation capital projects. The weakest was a designated sensory or quiet zone at 12%; even at flagship pads, a quiet corner with reduced spray volume and clear sight lines is rare. ADA companion seats were present at 71% of pads, but only 41% sat within direct line-of-sight to the central play zone — a placement detail that matters enormously to a caregiver who needs to stay seated while supervising a child with autonomy in the spray.
Per-dimension pass rates, ordered from strongest to weakest, follow. These numbers should be read alongside the regional commentary below — the national average hides a wide spread between best- and worst-performing metros.
- Paved approach path. 97% — Strongest single dimension; near-universal.
- Slip-resistant play surface. 92% — Required by most state codes; broadly compliant.
- Ground-level spray entry. 84% — Zero-step entry from accessible path.
- ADA parking width + access aisle. 76% — Often present but blocked by adjacent parking.
- ADA companion seat present. 71% — But only 41% within direct line-of-sight to play.
- Splash-mat / surface transition. 63% — Smooth transitions without lip or trip edge.
- Accessible restroom <300 ft. 58% — Roll-in, grab bars, accessible stall.
- Accessible hydration station. 52% — ADA-height fountain or bottle filler.
- Shaded rest seating. 47% — Critical for caregivers and heat-sensitive disabilities.
- Non-color signaling for hazards. 44% — Texture, shape, or word-based supplements.
- Family / companion changing area. 29% — Major gap; especially for older disabled kids.
- Braille / tactile signage. 18% — Almost entirely absent outside flagship sites.
- Designated sensory / quiet zone. 12% — Weakest dimension across the audit.
- Audio cues / announcements. 9% — Lowest of any signaling dimension.
Regional patterns
Four metros formed a clear top tier on the composite score: Portland OR, Madison WI, Minneapolis MN, and Pittsburgh PA. The shared signature is unglamorous — disciplined parks-department capital planning, accessibility consultants embedded in design review, and an institutional willingness to retrofit older pads when neighboring playgrounds were rebuilt. Three of the four also publish written accessibility statements for each pad, which both raises the design bar and gives families a planning tool that the audit could verify against on site.
The bottom tier clustered in mid-size Southeastern and Southwestern metros where splash-pad capital projects have outpaced staff-side accessibility expertise. Several sites in this group had been built or rebuilt within the last five years and looked modern in photos but failed on placement-level details: companion seats angled away from the play zone, family changing rooms omitted in favor of a single accessible stall, and parking with an ADA stall but no usable access aisle. Pads built before 2010 in the same metros sometimes scored higher, suggesting that newer construction is not automatically more accessible.
A weak but visible climate-versus-accessibility correlation showed up in the data. Cooler-climate metros, where pads run a shorter season and parks departments invest relatively more per-pad, tended to score higher on the rest-seating, shade, and sensory-zone dimensions. Hotter-climate metros, where pads run longer and capital is spread thinner, tended to score higher on raw pad count and lower on per-pad accessibility depth. This is a pattern, not a verdict — several Sun Belt metros had individual flagship pads that matched anything in the cool-climate top tier.
Cost myths
Cost is the most commonly cited reason that accessibility features are dropped from splash-pad scopes, and it is also the most overstated. Across the audited sites for which capital data was available, the marginal cost of bringing a pad up to all 14 audit points added under 8% to the total capital expenditure. Most of that added cost sat in surface transitions and changing-room build-out, both of which have meaningful independent payoffs in maintenance and family throughput.
Several of the highest-impact accessibility additions cost almost nothing. A designated sensory or quiet zone can be created with signage and a posted weekly low-volume hour at zero capital cost — the audit's weakest dimension is also one of the cheapest to address. An ADA companion seat cost under $2,000 installed at the majority of audited sites; a pad missing a companion seat is missing $2,000 of accessibility, not a major capital line. Improved companion-seat placement — angling existing seats so that a caregiver has a direct line-of-sight to the play zone — is a maintenance-budget adjustment, not a capital project.
The capital cost framing also obscures who bears the cost of inaccessibility. When a family of a disabled child cannot use a pad, the household carries the cost in travel, exclusion, and lost summer time. The parks-department line item to address it is small; the household-side cost of leaving it unaddressed is not.
Five concrete things parks departments can do this week
Each item is zero-cost or low-cost, can be executed by existing staff, and addresses one of the five weakest audit dimensions.
- Post a weekly sensory hour. Pick one weekday morning per week, reduce spray volume on the controller, and post the hour on the pad and on the parks-department site. Cost: a sign and a controller adjustment.
- Re-angle companion seats. Walk every pad with a tape measure and reposition existing seats to face the central play zone. The audit's biggest line-of-sight gap is a placement issue, not a procurement one.
- Publish a per-pad accessibility statement. Even a one-page document listing what the pad has — paved path, companion seat, accessible restroom — and what it does not have lets families plan and shifts the planning labor from the household to the operator.
- Audit your access aisles. ADA parking stalls with blocked or missing access aisles were one of the most common Partial scores. A weekend of paint and signage closes most of these.
- Add a single-stall family changing room to the next CIP cycle. Capital, but the smallest line item in any pad rebuild — and the dimension most cited by families of older disabled children.
Five things to ask before visiting
A short pre-visit checklist for families. Most parks departments will answer these by phone if the website is silent.
- Is the approach path paved end-to-end from the parking access aisle to the spray surface?
- Is there an ADA companion seat with a direct line-of-sight to the play zone?
- Is there a family or companion changing room — not just a single accessible stall?
- Are sensory or low-volume hours posted, and if so, when?
- Is there shaded rest seating within line-of-sight of the play zone?
Where the data lives
Per-pad accessibility flags, per-state coverage totals, and the audit's underlying scoring sheet are published as open data. Coverage totals refresh daily from the directory pipeline; per-pad flags refresh as editors complete re-verifications and as parks departments respond to outreach.
Endpoints: /splash-pad-data (developer overview) and /api/coverage.json (state-by-state coverage including accessibility flag counts). All datasets are licensed under CC BY 4.0 — researchers, journalists, and parks departments may reuse them with attribution to splashpadhub.com.
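A consumer of the coverage endpoint might look like the following sketch. The field names here (`states`, `state`, `pad_count`, `accessibility_flagged`) are assumptions for illustration only — the real schema is documented in the developer overview at /splash-pad-data and should be checked before building against it. The sample payload is parsed inline so the sketch runs offline.

```python
# Hypothetical sketch of consuming /api/coverage.json. All field names are
# assumptions; consult the developer overview at /splash-pad-data for the
# actual schema. A real client would fetch the URL instead of parsing a
# hard-coded sample.
import json

sample = json.loads("""
{
  "states": [
    {"state": "WI", "pad_count": 42, "accessibility_flagged": 17},
    {"state": "AZ", "pad_count": 95, "accessibility_flagged": 21}
  ]
}
""")

def flag_share(entry):
    """Share of a state's pads carrying at least one accessibility flag."""
    return entry["accessibility_flagged"] / entry["pad_count"]

for entry in sample["states"]:
    print(entry["state"], round(flag_share(entry), 2))
```

Because the per-pad flags refresh on a different cadence than the daily coverage totals, any analysis joining the two should record the access date alongside the numbers, as the citation format below does.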
Citation: SplashPadHub Research. (2026). Splash pad accessibility audit 2026 — what we found. splashpadhub.com/accessibility-audit-2026 (accessed 2026-05-10).
Acknowledgments
The audit benefited from the time and patience of disability-advocacy organizations who reviewed the rubric for face validity and flagged framings that erased non-mobility disabilities, the parks-department staff in several audited metros who answered follow-up questions about capital costs and design intent, and the families who corrected directory listings when the published accessibility flags did not match their lived experience on the ground.
Errors of fact are the responsibility of SplashPadHub Research and will be corrected in the public changelog within 24-48 hours of verification. Corrections, including any pad whose accessibility flags do not match a family's experience, can be filed at /submit.
Related pages
- Editorial methodology → How every pad is sourced, verified, and re-checked.
- Open splash-pad data → Daily JSON snapshots, schema, and CC BY 4.0 license.
- Splash pad water quality → Recirculating vs flow-through, real risks, and what to look for.
- How to spot a good splash pad → A field-tested parent checklist for arrival.
- Partners → Advocacy groups, parks departments, and research collaborators.
- Research portal → Datasets, reports, and citable statistics.