From discovered to verified — every splash pad's journey
The operational walkthrough of how SplashPadHub takes a splash pad from a first lead to a published, verified, indexed listing — written for parents who want to understand what "verified" actually means, parks departments curious about the workflow that covers their facility, and journalists auditing the directory's process.
Last reviewed: 2026-05-10 · Distinct from /methodology (formal verification rules) and /editorial-standards (writing standards).
Direct answer
Every splash pad on SplashPadHub goes through 8 steps: discovery, source ranking, geocoding, feature audit, accessibility audit, photo verification, editorial review, publishing. Average turnaround is 2–3 weeks from first lead to live page. A last-verified date is posted on every park page so visitors can see exactly when the record was last touched.
Step 01 · Discovery
Where new pads enter the queue
New splash pad leads enter the queue from five monitored channels. The first is parks-and-recreation department feeds — RSS subscriptions to municipal parks blogs, federal recreation feeds where applicable, and county press-release pages. The second is OpenStreetMap edits in the leisure=splash_pad and amenity=fountain categories, watched through a daily diff against a saved query. The third is local news: regional papers, public-radio summer-feature roundups, and chamber-of-commerce coverage that frequently announces new openings before the parks page is updated.
The fourth channel is community submissions through /submit. Parents, parks staff, and neighborhood association members report new pads, missing pads, and pads that have closed. Every submission lands in the editor queue with a timestamp and the submitter's stated source. The fifth channel is parks-department press releases sent directly to a monitored inbox — a small but rising stream as the directory becomes recognized within the parks-and-recreation trade.
Discovery is deliberately wide. The cost of missing a real pad is higher than the cost of triaging a duplicate or a rumor, so the queue accepts everything and lets later steps filter. A discovered lead is not a public listing; it is a candidate record that owes the directory seven more steps before it appears.
Step 02 · Source ranking
Tier 1 wins on operational fields; tertiary sources never publish alone
Each candidate record is anchored to documented sources, and every source carries a tier weight. The official municipal parks-and-recreation department page (or county or regional park district equivalent) is the primary source for hours, season dates, fees, accessibility statements, and rules. Local news and tourism-bureau coverage are secondary — useful for confirming existence and capturing community context, but weighted lower on operational fields because they describe a moment rather than the live state.
Community submissions and aggregated map placemarks are tertiary. They serve as verification triggers, not publication triggers. A parent reporting a new pad prompts an editor to find the parks-department page that confirms it; a Google Maps pin alone does not justify a listing. Where two sources disagree — for example, the city site says hours are 10am–8pm and a GIS extract says 9am–9pm — Tier 1 wins on operational fields and the disagreement is logged in the editor notes.
The full conflict-resolution rule, including how the directory handles unresolved disagreements, lives on the methodology page. Source ranking is the gate where weak leads quietly fail; a candidate without at least one documented Tier 1 or Tier 2 source is held back until one is found.
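The tier-weighted rule above can be sketched in a few lines. This is an illustration under assumed names, not the directory's actual code: the tier labels, the `resolve_field` helper, and the tuple shape are all hypothetical.

```python
# Lower weight = more authoritative. Labels are illustrative stand-ins for
# the directory's Tier 1 (official), Tier 2 (news), Tier 3 (community) ranking.
TIER_WEIGHT = {"official": 1, "news": 2, "community": 3}

def resolve_field(claims):
    """Pick the value from the highest-ranked source for one operational field.

    `claims` is a list of (tier, value) pairs, e.g. competing hours claims.
    Any lower-tier disagreement is returned alongside the winner so an
    editor can log it in the notes rather than silently discard it.
    """
    ranked = sorted(claims, key=lambda c: TIER_WEIGHT[c[0]])
    winner_tier, winner_value = ranked[0]
    conflicts = [c for c in ranked[1:] if c[1] != winner_value]
    return winner_value, conflicts

# The example from the text: city site vs. a GIS extract on hours.
value, disputed = resolve_field([
    ("news", "9am-9pm"),       # secondary GIS/news claim
    ("official", "10am-8pm"),  # Tier 1 parks-department page
])
# Tier 1 wins; the 9am-9pm claim is surfaced for the editor notes.
```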
Step 03 · Geocoding
Pinned to the splash pad surface, not the parking lot
Once sources are ranked, an editor pins precise coordinates to the splash pad itself — the play surface, not the park entrance and not the parking lot. Where the parks department or city GIS layer publishes a feature-level latitude/longitude pair, the published value is captured to six-decimal precision. Where only a park-level point is available, the editor opens current satellite imagery and visually places the pin on the visible spray deck.
Every coordinate is sanity-checked against three rules: the point must be inside the listed state and county, it must not fall in a body of water or on a roadway, and it must sit within a plausible park parcel. Records that fail the sanity check are quarantined for human review rather than published with a default. Where the spray deck is not visible in current imagery — common for newly built pads — the record is flagged as 'coordinate provisional' and re-checked once imagery refreshes.
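The three sanity rules reduce to a small validation pass. A minimal sketch, assuming a bounding-box stand-in for the state/county test; the `in_water` and `on_roadway` flags are placeholders for the GIS-layer lookups a real check would perform, and the parcel test is omitted.

```python
def sanity_check(lat, lon, bounds, in_water=False, on_roadway=False):
    """Return the failed rules for a candidate pin; an empty list passes.

    `bounds` is a (min_lat, min_lon, max_lat, max_lon) box approximating
    the listed state/county. The water/roadway flags stand in for GIS
    lookups this sketch does not implement; a third rule (plausible park
    parcel) would be checked the same way.
    """
    failures = []
    min_lat, min_lon, max_lat, max_lon = bounds
    if not (min_lat <= lat <= max_lat and min_lon <= lon <= max_lon):
        failures.append("outside listed state/county")
    if in_water or on_roadway:
        failures.append("falls in a body of water or on a roadway")
    return failures

# A pin inside the box on dry land passes; one in water is quarantined
# for human review rather than published with a default.
box = (43.5, -97.3, 49.4, -89.5)
ok = sanity_check(44.95, -93.10, box)
bad = sanity_check(44.95, -93.10, box, in_water=True)
```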
Map presentation depends on coordinate quality. A pad without verified spray-deck coordinates does not appear in the interactive national map at full precision; instead it shows in the city index until a precise pin is captured. The discipline matters because parents driving to the pad rely on the map deeplink to land them at the splash pad, not the far end of a 40-acre park.
Step 04 · Feature audit
12-point feature checklist, only when the source explicitly says so
Each candidate record clears a 12-point feature checklist before any features are claimed on the public page. The checklist covers zero-depth entry, ground spray jets, geyser jets, dump bucket, water table or interactive water features, themed sculptural elements (animals, nautical, abstract), misters, shade structures, restrooms, on-site parking, drinking-water access, and indoor-versus-outdoor classification. Each item is marked present, absent, or unknown, and the assessment is signed by the editor who reviewed it.
A feature is marked present only when the operator page or a documented secondary source explicitly says so, or when the feature is unambiguously visible in operator-published imagery. Photos in news coverage are accepted as documentation when a caption identifies the feature; aesthetic guesses from a hero image are not. If the source is silent on a feature, the field is rendered empty on the public page rather than imputed from the pad's name, era, or category.
The 12-point checklist is one of the directory's most labor-intensive steps; it is also why pad pages render specific feature lists rather than the generic 'splash pad features' boilerplate that aggregator sites default to. A feature flag that ships on the public page can be defended back to a documented source.
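The present/absent/unknown discipline can be shown as a data shape. The item names below are paraphrased from the checklist in this section, not the directory's real schema; the point is that anything not explicitly documented defaults to unknown rather than being imputed.

```python
# Illustrative names for the 12 checklist items described above.
FEATURE_CHECKLIST = [
    "zero_depth_entry", "ground_spray_jets", "geyser_jets", "dump_bucket",
    "water_table", "themed_elements", "misters", "shade_structures",
    "restrooms", "on_site_parking", "drinking_water", "indoor_outdoor",
]

def audit(findings):
    """Render the full 12-point assessment for one candidate record.

    `findings` maps checklist items to 'present' or 'absent' only where a
    documented source explicitly says so; every other item stays 'unknown'
    and renders as an empty field on the public page.
    """
    return {item: findings.get(item, "unknown") for item in FEATURE_CHECKLIST}

# Only the documented items are claimed; the rest remain unknown.
record = audit({"misters": "present", "restrooms": "absent"})
```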
Step 05 · Accessibility audit
14-point audit, four-tier rating, no inferred claims
Accessibility is held to a stricter standard than other features because families plan trips around it. The 14-point audit covers the approach path (paved or firm-stable surface, no steps, ADA-compliant gradient and width), the deck perimeter (transfer-friendly bench, accessible companion seating, level transition from path to deck), at least one ground-level spray feature reachable from the accessible path, accessible parking with the required curb cut, accessible restrooms with the required clearances, and a published operator accessibility statement.
Each pad receives a four-tier rating: fully accessible (all 14 conditions documented), substantially accessible (path and at least one ground-level spray plus accessible seating, with one or two minor gaps), partially accessible (path verified but other conditions undocumented or absent), and unverified (audit not yet completed). The full criteria for each tier and how the directory handles edge cases are published at /accessibility-tier-explained, which every pad page links to.
The directory does not infer accessibility from age of build, build cost, or the presence of a single curb cut, and partial conditions are never collapsed into a binary 'wheelchair accessible' yes-or-no. A page that lists a pad as substantially accessible enumerates which conditions are met and which are not, so a family planning a visit can decide whether the gap matters to their party.
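The tier logic described above can be sketched as a mapping from documented conditions to a rating. The condition names and the core-set heuristic below are illustrative assumptions; the authoritative criteria live at /accessibility-tier-explained.

```python
def accessibility_tier(documented):
    """Map a set of documented audit conditions to one of the four tiers.

    `documented` is the set of condition names confirmed out of a
    hypothetical 14 (names are illustrative); None means the audit has
    not been completed yet.
    """
    if documented is None:
        return "unverified"
    # Illustrative core conditions for the 'substantially' tier.
    core = {"accessible_path", "ground_level_spray", "accessible_seating"}
    if len(documented) == 14:
        return "fully accessible"
    if core <= documented and len(documented) >= 12:  # one or two minor gaps
        return "substantially accessible"
    if "accessible_path" in documented:
        return "partially accessible"
    # A completed audit without a verified path is not defined by the
    # published tiers; this sketch conservatively reports it unverified.
    return "unverified"

core = {"accessible_path", "ground_level_spray", "accessible_seating"}
full = core | {f"cond_{i}" for i in range(11)}   # 14 conditions documented
```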
Step 06 · Photo verification
At least one photo from a public source, with attribution
Before a record can publish, at least one photo of the actual splash pad must be sourced from a public channel. Eligible sources are parks-department websites and public social channels, tourism-bureau photo libraries, local news coverage with a documented photo credit, Wikimedia Commons under a permissive license, and family submissions where the submitter has explicitly granted publication permission through /submit. Stock photography of generic splash pads is never substituted; an unrelated 'representative' image misleads parents about what they will find on arrival.
Every photo carries an attribution line on the pad page identifying the source and license. Wikimedia Commons photos display the photographer's credit and the CC license string. Family-submitted photos credit the community without exposing the submitter's name or contact information. AI-generated images are not used as documentary photography; where an SVG illustration is rendered as a placeholder for a pad without a photo, the illustration is visually distinct and the page is flagged for photo backfill in the editor dashboard.
The photo step is the most common reason a record sits in the queue past two weeks. Coverage backfill rolls in over each summer season as parks departments publish opening photos and as parents submit, and the editor dashboard prioritizes photo-less pads on every weekly sweep.
Step 07 · Editorial review
Staff editor cross-checks against /editorial-standards
A staff editor opens the candidate record alongside its sources and runs the editorial-standards checklist. Voice and framing are reviewed for neutrality — phrases like 'one of the best' or 'world-class' are flagged unless backed by a documented superlative. Numerical claims are cross-checked against their cited sources; an install year, a fee figure, or a season window that does not appear in the linked source is corrected or removed. Named claims about parks staff, manufacturers, or design firms are fact-checked against the operator page or the relevant trade coverage.
Where AI assistance was used to draft a description or summary, the editor verifies every factual claim against a primary source before the draft ships. Hallucinated claims are treated as a corrections-grade workflow failure: the AI-drafted line is removed, the underlying source is re-verified, and if the failure pattern repeats it is logged in the editor notes for workflow audit. The standing rule is that AI accelerates the work; it does not replace editorial judgment.
Sensitive topic handling is reviewed at this step. Accessibility framing, mentions of identity or community, and any reference to parks-department staff or municipal politics are checked for fairness, completeness, and the avoidance of stereotype. The full criteria are published at /editorial-standards. The editor signs the record with a handle and a timestamp before it can move to publishing.
Step 08 · Publishing
Dateline, sources, last-verified timestamp, listed everywhere
On clearing editorial review, the record publishes to the live directory. The pad page renders with a dateline showing the date of first publication, a sources block listing the documented Tier 1 and Tier 2 URLs that anchor the listing, and a last-verified timestamp that updates on every subsequent re-verification pass. The page is added to the relevant state index, city index, and the interactive national map at full coordinate precision.
Publishing also updates the open-data feed at /splash-pad-data, the daily JSON snapshot that powers the public site, and any regional or category guides that surface the pad (best for toddlers, best for accessibility, best free pads, etc.). Schema.org JSON-LD for LocalBusiness and TouristAttraction is emitted alongside the page so search engines and AI agents can ingest the structured record. Where the listing falls into a featured rotation — splash pad of the day, weekly trending, regional roundup — the editor confirms the dateline aligns with the rotation calendar.
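The structured-data emission mentioned above looks roughly like this. A minimal sketch for the TouristAttraction record only; the `pad_jsonld` helper and the keys on `pad` (`name`, `lat`, `lon`, `last_verified`) are assumed names, and a real page also emits a companion LocalBusiness record.

```python
import json

def pad_jsonld(pad):
    """Emit a minimal Schema.org TouristAttraction record for a pad page.

    Field names on `pad` are illustrative stand-ins for the directory's
    record schema; `dateModified` carries the last-verified timestamp.
    """
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "TouristAttraction",
        "name": pad["name"],
        "geo": {
            "@type": "GeoCoordinates",
            "latitude": pad["lat"],
            "longitude": pad["lon"],
        },
        "dateModified": pad["last_verified"],
    }, indent=2)

doc = pad_jsonld({"name": "Riverside Splash Pad", "lat": 44.95,
                  "lon": -93.10, "last_verified": "2026-05-10"})
```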
Average end-to-end turnaround from first discovery to live page is two to three weeks. The bottleneck is rarely the directory's own queue; it is photo backfill and the wait for a parks-department page to update its season window, which the directory will not pre-empt with a guess.
After publish · Maintenance loop
Daily snapshots, weekly sweeps, season-change pulses, 48-hour correction response
Publishing is not the end of the record. The directory rebuilds from a daily JSON snapshot every 24 hours, regenerating affected static pages so any correction landed the previous day appears on the public site without waiting for a weekly batch. On top of the daily rebuild, editors run a weekly sweep against a rotating subset of pads, prioritized by traffic, age of last verification, and any pending community reports.
Twice a year — early May before Memorial Day opening and early September before Labor Day closing — the directory runs a season-change pulse: a directory-wide re-verification of opening dates, closing dates, fee figures, and feature flags. Pads built during the prior off-season are added; pads that closed are flagged in the changelog with a closure note rather than silently removed. Pages that have not been re-verified within the current cycle are surfaced on the editor dashboard for priority review.
Corrections submitted through /submit are acknowledged within 48 hours of arrival, including weekends during peak season. Acknowledgement does not imply verification; it confirms that the report has entered the queue and an editor has been assigned. Confirmed corrections are landed within 24–48 hours of verification against a primary source, logged on the public changelog with the prior value, the new value, and the source URL that triggered the change. The maintenance loop is what keeps the directory honest after a pad opens for the season, breaks a feature, or quietly closes.
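The changelog record described above has a simple shape. The key names below are illustrative, not the directory's actual feed schema; the point is that every landed correction carries the prior value, the new value, and the triggering source.

```python
def changelog_entry(pad_slug, field, prior, new, source_url, verified_at):
    """One public changelog record, as described in the text.

    All parameter and key names here are hypothetical illustrations of
    the prior-value / new-value / source-URL triple the changelog logs.
    """
    return {
        "pad": pad_slug,
        "field": field,
        "prior_value": prior,
        "new_value": new,
        "source": source_url,
        "verified_at": verified_at,
    }

entry = changelog_entry("riverside-splash-pad", "hours",
                        "10am-8pm", "9am-9pm",
                        "https://example.gov/parks/riverside", "2026-05-10")
```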
Related pages
- Methodology → Formal verification: source priority, three-pass verification, conflict resolution.
- Editorial standards → Writing voice, sourcing hierarchy, AI usage policy, corrections process.
- Accessibility tiers → The four-tier accessibility rating and what each tier requires.
- Submit a correction → Report a closure, fee change, or a missing pad — 48-hour acknowledgement.
- Trust & methodology overview → Editorial principles, independence, contact channels, known limitations.
- Public changelog → Every correction, addition, and removal with a timestamp.