URL Parameters
Understanding parameter strategies to optimize crawl budget and indexation
The Parameter Problem
Why unchecked parameters are an SEO risk
URL parameters combine combinatorially into duplicate content that wastes crawl budget and dilutes ranking signals. A single product category with just five filters (color, size, price, sort, gender) can generate 100,000+ URL variations pointing to the same products.
Without a parameter strategy, search engines waste time crawling duplicates instead of discovering valuable new content.
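To make the scale concrete, here is a back-of-the-envelope calculation in Python. The per-filter value counts are illustrative assumptions, not real catalog figures; the point is that optional filters multiply, and parameter ordering multiplies the total again.

```python
from math import factorial

# Hypothetical value counts per filter (assumptions for illustration).
facets = {"color": 12, "size": 10, "price": 6, "sort": 5, "gender": 3}

# Every filter is optional: a URL either omits it or picks one value,
# so each facet contributes (values + 1) choices.
combos = 1
for values in facets.values():
    combos *= values + 1
print(combos)  # 13 * 11 * 7 * 6 * 4 = 24,024 distinct filter sets

# Crawlers treat ?color=red&size=10 and ?size=10&color=red as different
# URLs, so each filter set can also appear in up to 5! orderings.
print(combos * factorial(5))  # upper bound: 2,882,880 URL variations
```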
Strategic Decision Framework
Three approaches based on search intent and implementation
Clean Paths
/shoes/running
Use when: The page represents a real search intent users actually type (e.g. “running shoes”).
SEO: Indexable, self-canonical, included in sitemap.
Rule: If an intent matters, give it a clean, permanent URL.
Single Parameter
/shoes?color=red
Use when: Clean paths cannot be implemented yet; this is a temporary fallback only.
SEO: Not indexed by default; canonicalized to the clean path.
Rule: If a filter deserves to rank, it deserves a clean URL.
Multiple Parameters
/shoes?color=red&size=10&sort=price
Use when: Never for SEO; only for on-site filtering UX.
SEO: Not indexed, excluded from sitemap, canonicalized to clean URL.
Rule: Multi-parameter URLs are crawl traps, not landing pages.
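The framework reduces to a small routing decision. Below is a minimal sketch in Python, not tied to any framework; parameter_policy is a hypothetical helper that applies the table above: no parameters means indexable and self-canonical, any parameters mean noindex with a canonical pointing at the clean path.

```python
from urllib.parse import urlsplit, parse_qsl

def parameter_policy(url: str) -> dict:
    """Sketch: classify a URL per the three-tier framework above."""
    parts = urlsplit(url)
    params = parse_qsl(parts.query)
    clean_path = parts.path  # canonical target: the URL minus its query string

    if not params:
        # Clean path: indexable, self-canonical, belongs in the sitemap.
        return {"index": True, "canonical": url, "in_sitemap": True}
    # Any parameters (one or many): crawlable but not indexed,
    # canonicalized to the clean path, excluded from the sitemap.
    return {"index": False, "canonical": clean_path, "in_sitemap": False}

print(parameter_policy("/shoes/running"))
# {'index': True, 'canonical': '/shoes/running', 'in_sitemap': True}
print(parameter_policy("/shoes?color=red&size=10&sort=price"))
# {'index': False, 'canonical': '/shoes', 'in_sitemap': False}
```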
Parameter Policies - Best Practices
Recommended parameter handling strategies for optimal SEO performance
These parameters create meaningful variations. In this playground they stay noindex variants with canonicals to the base; promote to clean paths if you want them indexed.
color: Color is a meaningful facet. Best practice: keep as variant (noindex) and canonical to base.
size: Size is a meaningful attribute. Best practice: keep as variant (noindex) and canonical to base.
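Concretely, the head of a variant page would carry a canonical to the base plus a noindex directive, along these lines (example.com is a placeholder domain):

```html
<!-- Head of /shoes?color=red: crawlable variant, never indexed -->
<link rel="canonical" href="https://example.com/shoes">
<meta name="robots" content="noindex,follow">
```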
These parameters create duplicate or low-value pages. Use noindex,follow to allow crawling but prevent indexing. Strip from canonical URLs.
sort: Sorting changes order only; no unique value.
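"Strip from canonical" is mechanical: rebuild the query string without the offending keys before emitting the canonical tag. A standard-library sketch follows; the strip list mirrors this page's policies, and under the full policy the surviving ?color=red variant would itself canonicalize to /shoes, as in the earlier sketch.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Keys that never belong in a canonical URL, per the policies on this page.
STRIP_PARAMS = {"sort", "utm_source", "utm_medium", "utm_campaign",
                "gclid", "fbclid", "sid", "view", "per_page",
                "price_min", "price_max"}

def strip_for_canonical(url: str) -> str:
    """Rebuild the URL without zero-value keys; sort keys for stable output."""
    parts = urlsplit(url)
    kept = sorted((k, v) for k, v in parse_qsl(parts.query)
                  if k not in STRIP_PARAMS)
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_for_canonical("/shoes?sort=price&color=red&utm_source=news"))
# /shoes?color=red
```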
These parameters should be completely blocked from crawling via robots.txt and always stripped from canonical URLs to prevent crawl waste.
utm_source: Tracking parameter; strip from canonical and block in robots.
utm_medium: Tracking parameter; strip from canonical and block in robots.
utm_campaign: Tracking parameter; strip from canonical and block in robots.
gclid: Google Click ID tracking parameter.
fbclid: Facebook Click ID tracking parameter.
sid: Session ID tracking parameter.
view: UI preference parameter. Creates noise; block via robots.txt.
per_page: Items-per-page UI preference. Creates an explosive crawl space; block via robots.txt.
price_min: Numeric range filter. Creates near-infinite combinations; block via robots.txt.
price_max: Numeric range filter. Creates near-infinite combinations; block via robots.txt.
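One way to express these blocks in robots.txt. Google and Bing honor the * wildcard in Disallow patterns, but not every crawler does, so treat this as a sketch rather than a universal config. Each parameter gets two rules so it matches whether it appears first (after ?) or later (after &) in the query string.

```
User-agent: *
# utm_ covers utm_source, utm_medium and utm_campaign in one prefix
Disallow: /*?utm_
Disallow: /*&utm_
# Click IDs and session IDs
Disallow: /*?gclid=
Disallow: /*&gclid=
Disallow: /*?fbclid=
Disallow: /*&fbclid=
Disallow: /*?sid=
Disallow: /*&sid=
# UI preferences and numeric range filters
Disallow: /*?view=
Disallow: /*&view=
Disallow: /*?per_page=
Disallow: /*&per_page=
Disallow: /*?price_min=
Disallow: /*&price_min=
Disallow: /*?price_max=
Disallow: /*&price_max=
```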
Other Parameters
q: Search query parameter. Search pages default to noindex,follow to avoid indexing thin content.