SEO Pattern Gallery
Production-ready patterns for URL management, filtering, and crawl control
Every SEO pattern demonstrated in this app, organized by category with live examples. Filter by risk level or category to find the pattern you need.
🔴Multi-Select Parameters
Multiple color selections create exponential URL combinations (2^N). These are blocked via robots.txt to prevent crawl waste.
When users select multiple values like color=black,blue, each combination creates a unique URL. With 5 available colors, that's 2^5 = 32 possible selection subsets, each with its own URL.
When to Avoid:
- ✗Never allow on indexable pages
- ✗Don't reach for multi-select where single-select or separate pages would serve
- ✗Avoid checkbox multi-select without robots.txt blocking
Strategy:
Block via robots.txt: Disallow: /*?*color=*,*
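A minimal robots.txt excerpt for this rule; the color parameter comes from the example above, and any other multi-select parameter would get its own Disallow line:

```
User-agent: *
# Block any URL whose color parameter contains a comma,
# i.e. any multi-select combination like ?color=black,blue
Disallow: /*?*color=*,*
```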
🟡Multiple Stable Filters
Combining stable filters like color + size creates N×M URLs. Uses noindex,follow to prevent index bloat while maintaining discoverability.
5 colors × 4 sizes = 20 URL variations. Risk of index bloat but manageable with proper robots directives.
When to Use:
- ✓For user filtering with noindex,follow
- ✓When combinations are limited (< 100 URLs)
- ✓To maintain link equity flow
When to Avoid:
- ✗If combinations exceed 100 URLs
- ✗When crawl budget is limited
- ✗For infinite combination possibilities
Strategy:
Apply noindex,follow with a canonical to the single-filter version
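A minimal sketch of this directive in a Next.js App Router generateMetadata; the /products route path and the Next.js ≤14-style synchronous searchParams are assumptions:

```tsx
// app/products/page.tsx — hypothetical listing route (Next.js ≤14-style searchParams)
import type { Metadata } from 'next';

interface Props {
  searchParams: { color?: string; size?: string };
}

export function generateMetadata({ searchParams }: Props): Metadata {
  const { color, size } = searchParams;
  const filters = [color && `color=${color}`, size && `size=${size}`]
    .filter(Boolean) as string[];

  if (filters.length <= 1) {
    // Zero or one stable filter: indexable (see the single-filter pattern below).
    return { robots: { index: true, follow: true } };
  }

  // Two or more stable filters: keep crawlers following links, but
  // canonicalize to the single-filter version to avoid index bloat.
  // (A relative canonical resolves against metadataBase.)
  return {
    robots: { index: false, follow: true },
    alternates: { canonical: `/products?${filters[0]}` },
  };
}
```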
🟢Single Stable Filter
Single filters like color=black represent real user intent with limited variations. These are index,follow and ideal for clean path conversion.
Linear growth (one URL per value). Safe to index as they represent meaningful product segments.
When to Use:
- ✓For meaningful product segments
- ✓When users search for specific attributes
- ✓As candidates for clean path conversion
When to Avoid:
- ✗For temporary UI state
- ✗For infinite value ranges
- ✗When values change frequently
Strategy:
index,follow, or convert to a clean path for better keyword targeting
🟢Gender Filter
Gender-based clean paths create stable product segmentation. Ideal for e-commerce with clear gender differentiation.
4 gender options (women, men, girls, boys) create 4 valuable landing pages per category.
When to Use:
- ✓Clear product differentiation by gender
- ✓Significant product count per gender (10+ products)
- ✓Users search by gender ('women's shoes')
When to Avoid:
- ✗Truly unisex products
- ✗Insufficient products per gender segment
- ✗Gender not relevant to product type
Strategy:
Clean path with index,follow and static generation
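A sketch of the static generation mentioned above, assuming a hypothetical /for/[gender] route:

```tsx
// app/for/[gender]/page.tsx — hypothetical clean-path route (Next.js ≤14-style params)
export const dynamicParams = false; // anything outside the four segments 404s

export function generateStaticParams() {
  // The four stable gender segments become four prebuilt landing pages.
  return ['women', 'men', 'girls', 'boys'].map((gender) => ({ gender }));
}

export default function GenderPage({ params }: { params: { gender: string } }) {
  return <h1>Products for {params.gender}</h1>;
}
```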
🔄Sort Parameters
Sorting changes layout but not content. Use noindex,follow or create curated clean paths for high-value sorts.
Sorted pages duplicate the default listing's content. noindex,follow allows crawler discovery without indexing the variations.
When to Use:
- ✓For user convenience only
- ✓To allow crawler discovery of products
- ✓When you need link equity flow
When to Avoid:
- ✗Never index sorted variations
- ✗Don't create infinite sort combinations
- ✗Avoid when sort doesn't change content significantly
Strategy:
noindex,follow or curated clean paths (/cheapest/, /bestsellers/)
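One way to implement the curated clean paths, sketched as Next.js rewrites; the /cheapest and /bestsellers paths come from the strategy line, while the /products route and sort values are hypothetical (next.config.ts needs Next.js ≥15, otherwise use next.config.js minus the types):

```ts
// next.config.ts — map curated, indexable clean paths onto sort parameters
import type { NextConfig } from 'next';

const config: NextConfig = {
  async rewrites() {
    return [
      // The clean path is what users and crawlers see; the sorted
      // query-parameter URL stays internal and never gets indexed.
      { source: '/cheapest', destination: '/products?sort=price_asc' },
      { source: '/bestsellers', destination: '/products?sort=sales_desc' },
    ];
  },
};

export default config;
```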
🎨View Preferences
Layout preferences like grid/list view are purely UI state and should be blocked from indexing.
View parameters don't change content, only presentation. Should never be indexed.
When to Use:
- ✓For layout preferences only
- ✓To remember user display settings
- ✓Client-side UI state management
When to Avoid:
- ✗Never for content differentiation
- ✗Avoid putting in URLs if possible
- ✗Don't use for SEO purposes
Strategy:
Block via robots.txt: Disallow: /*?*view=*
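A sketch of keeping view state out of the URL entirely, per the "avoid putting in URLs" advice; the component and storage key are illustrative:

```tsx
'use client';
// Hypothetical toggle: persist the grid/list preference in localStorage
// so it never appears in a crawlable URL.
import { useEffect, useState } from 'react';

export function ViewToggle() {
  const [view, setView] = useState<'grid' | 'list'>('grid');

  useEffect(() => {
    const saved = localStorage.getItem('view');
    if (saved === 'grid' || saved === 'list') setView(saved);
  }, []);

  const toggle = () => {
    const next = view === 'grid' ? 'list' : 'grid';
    setView(next);
    localStorage.setItem('view', next);
  };

  return <button onClick={toggle}>View: {view}</button>;
}
```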
💰Price Ranges
Numeric ranges like price_min/max create infinite URLs. Always blocked via robots.txt to protect crawl budget.
Any combination of min/max values creates a unique URL. Potentially infinite combinations.
When to Use:
- ✓For user filtering only
- ✓With proper robots.txt blocking
- ✓When using manual apply button
When to Avoid:
- ✗Never allow indexing
- ✗Don't use without robots.txt protection
- ✗Avoid auto-updating URLs while typing
Strategy:
Always block via robots.txt: Disallow: /*?*price_min=*, Disallow: /*?*price_max=*
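A sketch of the manual apply pattern: the URL only changes on an explicit click, never on every keystroke. The price_min/price_max names follow the strategy line; everything else is illustrative:

```tsx
'use client';
import { useState } from 'react';
import { useRouter, usePathname } from 'next/navigation';

export function PriceFilter() {
  const router = useRouter();
  const pathname = usePathname();
  const [min, setMin] = useState('');
  const [max, setMax] = useState('');

  // Typing only updates local state; the (robots.txt-blocked) URL
  // is written once, when the user explicitly applies the range.
  const apply = () => {
    const params = new URLSearchParams();
    if (min) params.set('price_min', min);
    if (max) params.set('price_max', max);
    router.push(`${pathname}?${params.toString()}`);
  };

  return (
    <div>
      <input value={min} onChange={(e) => setMin(e.target.value)} placeholder="Min" />
      <input value={max} onChange={(e) => setMax(e.target.value)} placeholder="Max" />
      <button onClick={apply}>Apply</button>
    </div>
  );
}
```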
📅Date Ranges
Date range filters create infinite URL combinations and should always be blocked from crawlers.
Similar to price ranges, date parameters create unlimited URL variations.
When to Use:
- ✓For event filtering
- ✓For booking date selection
- ✓With robots.txt blocking
When to Avoid:
- ✗Never for indexable pages
- ✗Don't allow crawling
- ✗Avoid in public URLs if possible
Strategy:
Block via robots.txt pattern matching
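A robots.txt excerpt for this pattern; the date_from/date_to parameter names are hypothetical placeholders for whatever your app actually uses:

```
User-agent: *
# Every min/max date pair is a unique URL, so block both parameters outright.
Disallow: /*?*date_from=*
Disallow: /*?*date_to=*
```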
🛣️Clean Path Routes
Convert stable parameters to clean paths for better SEO. Example: /for/women/ instead of ?gender=women.
Semantic URLs are better for keywords, user intent, and click-through rates.
When to Use:
- ✓Stable filters with high search volume
- ✓When SEO is priority over flexibility
- ✓For primary navigation paths
When to Avoid:
- ✗Unstable or infinite combinations
- ✗When flexibility is needed
- ✗For user-specific or temporary filters
Strategy:
Static generation with generateStaticParams at build time
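Static generation pairs well with a redirect from the legacy query-parameter form, so link equity consolidates on the clean path. A sketch, assuming the ?gender= parameter and /for/ prefix from the example above; the matcher path is illustrative:

```ts
// middleware.ts — hypothetical sketch: 301 ?gender=women to /for/women/
import { NextRequest, NextResponse } from 'next/server';

export function middleware(request: NextRequest) {
  const gender = request.nextUrl.searchParams.get('gender');
  if (gender) {
    const url = request.nextUrl.clone();
    url.pathname = `/for/${gender}`;
    url.searchParams.delete('gender');
    return NextResponse.redirect(url, 301); // permanent: consolidate equity
  }
  return NextResponse.next();
}

// Only run on listing routes, not on every request.
export const config = { matcher: ['/products/:path*'] };
```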
📄Path Parameters
Dynamic path segments for product pages. Always indexable with proper metadata and content.
Product pages are core content and should always be indexed with unique titles and descriptions.
When to Use:
- ✓For individual product pages
- ✓For blog posts and articles
- ✓For any unique content pages
When to Avoid:
- ✗Never block product pages
- ✗Don't use for filtering
- ✗Avoid duplicate content
Strategy:
Always indexable with proper metadata
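A sketch of per-product metadata, assuming a hypothetical /products/[slug] route and getProduct loader:

```tsx
// app/products/[slug]/page.tsx — hypothetical product route
import type { Metadata } from 'next';
import { getProduct } from '@/lib/products'; // hypothetical data loader

export async function generateMetadata(
  { params }: { params: { slug: string } },
): Promise<Metadata> {
  const product = await getProduct(params.slug);
  return {
    title: product.name,
    description: product.summary,          // unique per product
    alternates: { canonical: `/products/${params.slug}` },
    robots: { index: true, follow: true }, // core content: always indexable
  };
}
```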
❓Query Parameters
Traditional query strings with parameter classification. Indexability depends on parameter policy.
Query params are flexible but require careful SEO policy: stable, unstable, or blocked.
When to Use:
- ✓For flexible filtering
- ✓When clean paths aren't practical
- ✓For unstable or blocked parameters
When to Avoid:
- ✗For primary navigation
- ✗When clean paths would work better
- ✗Without parameter policy defined
Strategy:
Apply parameter policy (stable/unstable/blocked)
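A sketch of one way to encode such a policy, with hypothetical parameter names drawn from the patterns above:

```ts
// Hypothetical parameter policy table for this gallery's examples.
type ParamPolicy = 'stable' | 'unstable' | 'blocked';

const PARAM_POLICIES: Record<string, ParamPolicy> = {
  color: 'stable',      // indexable as a single filter
  size: 'stable',
  sort: 'unstable',     // noindex,follow
  view: 'blocked',      // robots.txt disallow
  price_min: 'blocked',
  price_max: 'blocked',
};

// Derive the page-level directive from the strictest policy present;
// unknown parameters default to blocked.
export function policyFor(params: URLSearchParams): ParamPolicy {
  const policies = [...params.keys()].map((k) => PARAM_POLICIES[k] ?? 'blocked');
  if (policies.includes('blocked')) return 'blocked';
  if (policies.includes('unstable')) return 'unstable';
  return 'stable';
}
```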
📖Pagination
Page 2+ creates duplicate content. Use noindex,follow to prevent indexing while allowing discovery.
Page 1 is indexable, page 2+ uses noindex,follow to avoid duplicate content issues.
When to Use:
- ✓For large result sets
- ✓To maintain crawl efficiency
- ✓With self-referencing canonicals
When to Avoid:
- ✗Never block pagination in robots.txt
- ✗Don't canonical all pages to page 1
- ✗Avoid infinite scroll without fallback
Strategy:
Page 1 = index,follow | Page 2+ = noindex,follow
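A sketch of the page-1 / page-2+ split, assuming a ?page= parameter on a hypothetical /products route:

```tsx
import type { Metadata } from 'next';

export function generateMetadata(
  { searchParams }: { searchParams: { page?: string } },
): Metadata {
  const page = Number(searchParams.page ?? '1');
  return {
    // Self-referencing canonical: never point page 2+ at page 1.
    alternates: { canonical: page > 1 ? `/products?page=${page}` : '/products' },
    // Page 1 is indexed; deeper pages stay crawlable but unindexed.
    robots: { index: page <= 1, follow: true },
  };
}
```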
🍞Breadcrumb Navigation
Structured internal linking that helps both users and search engines understand site hierarchy.
Breadcrumbs improve UX and provide valuable internal links and structured data.
When to Use:
- ✓On all content pages
- ✓For hierarchical site structures
- ✓With structured data markup
When to Avoid:
- ✗Never hide from users or search engines
- ✗Don't create fake hierarchies
- ✗Avoid inconsistent trails
Strategy:
Always beneficial for SEO and UX
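A sketch of the structured-data half, emitting schema.org BreadcrumbList JSON-LD from a trail; the Crumb shape is illustrative:

```tsx
// Hypothetical breadcrumb component emitting schema.org JSON-LD.
type Crumb = { name: string; url: string };

export function BreadcrumbJsonLd({ trail }: { trail: Crumb[] }) {
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement: trail.map((crumb, i) => ({
      '@type': 'ListItem',
      position: i + 1, // 1-based, per the schema.org spec
      name: crumb.name,
      item: crumb.url,
    })),
  };
  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
    />
  );
}
```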
🎯Faceted Navigation
Multi-dimensional filtering requires careful parameter classification to avoid crawl traps.
Faceted nav creates many URL combinations. Success depends on proper parameter policies.
When to Use:
- ✓For complex product catalogs
- ✓With proper parameter classification
- ✓When crawl budget allows
When to Avoid:
- ✗Without parameter policy
- ✗With unlimited combinations
- ✗When simpler navigation works
Strategy:
Careful parameter classification and canonical strategy
🔒Protected Routes
User-specific pages like account areas should be noindex,nofollow and blocked via robots.txt.
Private user data should never be indexed or crawled by search engines.
When to Use:
- ✓For user account pages
- ✓For checkout processes
- ✓For any private user data
When to Avoid:
- ✗Never allow indexing
- ✗Don't forget robots.txt block
- ✗Avoid exposing in sitemaps
Strategy:
noindex,nofollow + robots.txt: Disallow: /account/
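A sketch of the meta-directive half of that combination, applied once at the layout level so every account page inherits it; the /account route follows the strategy line:

```tsx
// app/account/layout.tsx — hypothetical account area
import type { Metadata } from 'next';
import type { ReactNode } from 'react';

// Meta robots covers any URL a crawler reaches via an external link;
// the robots.txt Disallow: /account/ rule stops routine crawling.
export const metadata: Metadata = {
  robots: { index: false, follow: false },
};

export default function AccountLayout({ children }: { children: ReactNode }) {
  return <>{children}</>;
}
```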
🔍Search Pages
Search result pages are thin content, but crawling them aids content discovery. Use noindex,follow.
Search pages don't add unique value but help crawlers discover content.
When to Use:
- ✓For site search functionality
- ✓To help content discovery
- ✓With noindex,follow directive
When to Avoid:
- ✗Never index search results
- ✗Don't block in robots.txt
- ✗Avoid infinite query variations
Strategy:
noindex,follow for discovery without indexing
⚙️API Routes
System endpoints should be blocked except for specific routes like /api/robots and /api/sitemap.
APIs aren't meant for search engines, except specific SEO endpoints.
When to Use:
- ✓Allow /api/robots
- ✓Allow /api/sitemap
- ✓Block all other /api/* routes
When to Avoid:
- ✗Never expose API endpoints to crawlers
- ✗Don't forget explicit allows
- ✗Avoid indexing API responses
Strategy:
Allow: /api/robots, Allow: /api/sitemap, Disallow: /api/
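As a robots.txt excerpt (for Googlebot the most specific matching rule wins, so the Allow lines exempt the two SEO endpoints regardless of ordering):

```
User-agent: *
Allow: /api/robots
Allow: /api/sitemap
Disallow: /api/
```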