SEO Pattern Gallery

Production-ready patterns for URL management, filtering, and crawl control

Every SEO pattern demonstrated in this app, organized by category with live examples. Filter by risk level or category to find the pattern you need.

All 17 patterns are shown below.

🔴Multi-Select Parameters

High Risk

Multiple color selections create exponential URL combinations (2^N). These are blocked via robots.txt to prevent crawl waste.

When users select multiple values such as color=black,blue, each combination creates a unique URL. With 5 available colors, there are 2^5 = 32 possible subsets, so 32 potential URLs (including the unfiltered page).

Example:

/shop/t-shirts?color=black,blue,red

When to Avoid:

  • Never allow for indexable pages
  • Always use single-select or separate pages
  • Avoid checkbox multi-select without proper blocking

Strategy:

Block via robots.txt: Disallow: /*?*color=*,*
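
A minimal sketch (a hypothetical helper, not code from this app) of how a middleware or audit script might flag URLs that match the intent of the robots.txt rule above, i.e. any query value carrying a comma-separated multi-select list:

```typescript
// Returns true when any query parameter carries a comma-separated
// multi-select value (e.g. color=black,blue), mirroring the
// robots.txt rule `Disallow: /*?*color=*,*`.
function hasMultiSelectParam(url: string): boolean {
  // Relative URLs need a base; the host is irrelevant to the check.
  const query = new URL(url, "https://example.com").searchParams;
  for (const value of query.values()) {
    if (value.includes(",")) return true;
  }
  return false;
}
```

Such a check is useful in tests that assert no internal link ever emits a multi-select URL without the corresponding robots block.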

SEO Impact:

Indexable
✗ No
Robots
Blocked
Sitemap
✗ No

🟡Multiple Stable Filters

Medium Risk

Combining stable filters like color + size creates N×M URLs. Uses noindex,follow to prevent index bloat while maintaining discoverability.

5 colors × 4 sizes = 20 URL variations. Risk of index bloat but manageable with proper robots directives.

Example:

/shop/t-shirts?color=black&size=M

When to Use:

  • For user filtering with noindex,follow
  • When combinations are limited (< 100 URLs)
  • To maintain link equity flow

When to Avoid:

  • If combinations exceed 100 URLs
  • When crawl budget is limited
  • For infinite combination possibilities

Strategy:

Apply noindex,follow with canonical to single-filter version
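
A sketch of that strategy as a pure function (hypothetical, for illustration; which filter "wins" the canonical is an assumption here, the first one applied):

```typescript
// With two or more stable filters applied, emit noindex,follow and
// canonicalize to the single-filter URL; with zero or one filter,
// the page stays index,follow.
function buildFilterSeo(
  basePath: string,
  filters: Record<string, string>,
): { robots: string; canonical: string } {
  const entries = Object.entries(filters);
  if (entries.length <= 1) {
    const qs = entries.length ? `?${entries[0][0]}=${entries[0][1]}` : "";
    return { robots: "index,follow", canonical: `${basePath}${qs}` };
  }
  // Assumption: canonicalize to the first filter only.
  const [name, value] = entries[0];
  return { robots: "noindex,follow", canonical: `${basePath}?${name}=${value}` };
}
```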

SEO Impact:

Indexable
✗ No
Robots
Allowed
Sitemap
✗ No

🟢Single Stable Filter

Low Risk

Single filters like color=black represent real user intent with limited variations. These are index,follow and ideal for clean path conversion.

Linear growth (one URL per value). Safe to index as they represent meaningful product segments.

Example:

/shop/t-shirts?color=black

When to Use:

  • For meaningful product segments
  • When users search for specific attributes
  • As candidates for clean path conversion

When to Avoid:

  • For temporary UI state
  • For infinite value ranges
  • When values change frequently

Strategy:

index,follow OR convert to clean path for better keywords
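
One way the clean-path conversion could look (the `/name/value/` URL shape is an assumption for illustration; this app's gender pattern uses `/for/women/`):

```typescript
// Promote a single stable filter to a clean path segment,
// e.g. ?color=black -> /shop/t-shirts/color/black/.
function toCleanPath(basePath: string, name: string, value: string): string {
  return `${basePath}/${encodeURIComponent(name)}/${encodeURIComponent(value)}/`;
}
```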

SEO Impact:

Indexable
✓ Yes
Robots
Allowed
Sitemap
✓ Yes

🟢Gender Filter

Low Risk

Gender-based clean paths create stable product segmentation. Ideal for e-commerce with clear gender differentiation.

4 gender options (women, men, girls, boys) create 4 valuable landing pages per category.

Example:

/shop/t-shirts/for/women/

When to Use:

  • Clear product differentiation by gender
  • Significant product count per gender (10+ products)
  • Users search by gender ('women's shoes')

When to Avoid:

  • Truly unisex products
  • Insufficient products per gender segment
  • Gender not relevant to product type

Strategy:

Clean path with index,follow and static generation
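
A sketch of the static-generation half in Next.js style (the route shape `/shop/[category]/for/[gender]/` is inferred from the example URL above, not confirmed):

```typescript
// The four gender segments become four statically generated
// landing pages per category at build time.
const GENDERS = ["women", "men", "girls", "boys"] as const;

export function generateStaticParams(): Array<{ gender: string }> {
  return GENDERS.map((gender) => ({ gender }));
}
```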

SEO Impact:

Indexable
✓ Yes
Robots
Allowed
Sitemap
✓ Yes

🔄Sort Parameters

Medium Risk

Sorting changes layout but not content. Use noindex,follow or create curated clean paths for high-value sorts.

Sorted pages duplicate the unsorted listing's content. noindex,follow allows crawlers to discover the linked products without indexing the sorted variations.

Example:

/shop/t-shirts?sort=price_desc

When to Use:

  • For user convenience only
  • To allow crawler discovery of products
  • When you need link equity flow

When to Avoid:

  • Never index sorted variations
  • Don't create infinite sort combinations
  • Avoid when sort doesn't change content significantly

Strategy:

noindex,follow or curated clean paths (/cheapest/, /bestsellers/)
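
A sketch of that two-tier policy (the mapping table is hypothetical; only the `/cheapest/` and `/bestsellers/` paths come from the strategy above):

```typescript
// High-value sorts get curated, indexable clean paths; every other
// sort stays a noindex,follow query URL.
const CURATED_SORTS: Record<string, string> = {
  price_asc: "/cheapest/",
  bestsellers: "/bestsellers/",
};

function sortUrlPolicy(sort: string): { path?: string; robots: string } {
  const path = CURATED_SORTS[sort];
  return path ? { path, robots: "index,follow" } : { robots: "noindex,follow" };
}
```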

SEO Impact:

Indexable
✗ No
Robots
Allowed
Sitemap
✗ No

🎨View Preferences

Medium Risk

Layout preferences like grid/list view are purely UI state and should be blocked from indexing.

View parameters don't change content, only presentation. Should never be indexed.

Example:

/shop/t-shirts?view=grid

When to Use:

  • For layout preferences only
  • To remember user display settings
  • Client-side UI state management

When to Avoid:

  • Never for content differentiation
  • Avoid putting in URLs if possible
  • Don't use for SEO purposes

Strategy:

Block via robots.txt: Disallow: /*?*view=*

SEO Impact:

Indexable
✗ No
Robots
Blocked
Sitemap
✗ No

💰Price Ranges

High Risk

Numeric ranges like price_min/max create infinite URLs. Always blocked via robots.txt to protect crawl budget.

Any combination of min/max values creates a unique URL. Potentially infinite combinations.

Example:

/shop/t-shirts?price_min=20&price_max=50

When to Use:

  • For user filtering only
  • With proper robots.txt blocking
  • When using manual apply button

When to Avoid:

  • Never allow indexing
  • Don't use without robots.txt protection
  • Avoid auto-updating URLs while typing

Strategy:

Always block via robots.txt: Disallow: /*?*price_min=*, Disallow: /*?*price_max=*

SEO Impact:

Indexable
✗ No
Robots
Blocked
Sitemap
✗ No

📅Date Ranges

High Risk

Date range filters create infinite URL combinations and should always be blocked from crawlers.

Similar to price ranges, date parameters create unlimited URL variations.

Example:

/events?start_date=2024-01-01&end_date=2024-12-31

When to Use:

  • For event filtering
  • For booking date selection
  • With robots.txt blocking

When to Avoid:

  • Never for indexable pages
  • Don't allow crawling
  • Avoid in public URLs if possible

Strategy:

Block via robots.txt pattern matching

SEO Impact:

Indexable
✗ No
Robots
Blocked
Sitemap
✗ No

🛣️Clean Path Routes

Low Risk

Convert stable parameters to clean paths for better SEO. Example: /for/women/ instead of ?gender=women.

Semantic URLs are better for keywords, user intent, and click-through rates.

Example:

/shop/t-shirts/for/women/

When to Use:

  • Stable filters with high search volume
  • When SEO is priority over flexibility
  • For primary navigation paths

When to Avoid:

  • Unstable or infinite combinations
  • When flexibility is needed
  • For user-specific or temporary filters

Strategy:

Static generation with generateStaticParams at build time

SEO Impact:

Indexable
✓ Yes
Robots
Allowed
Sitemap
✓ Yes

📄Path Parameters

Low Risk

Dynamic path segments for product pages. Always indexable with proper metadata and content.

Product pages are core content and should always be indexed with unique titles and descriptions.

Example:

/shop/t-shirts/classic-white-tee/

When to Use:

  • For individual product pages
  • For blog posts and articles
  • For any unique content pages

When to Avoid:

  • Never block product pages
  • Don't use for filtering
  • Avoid duplicate content

Strategy:

Always indexable with proper metadata

SEO Impact:

Indexable
✓ Yes
Robots
Allowed
Sitemap
✓ Yes

Query Parameters

Varies

Traditional query strings with parameter classification. Indexability depends on parameter policy.

Query params are flexible but require careful SEO policy: stable, unstable, or blocked.

Example:

/shop/t-shirts?size=M

When to Use:

  • For flexible filtering
  • When clean paths aren't practical
  • For unstable or blocked parameters

When to Avoid:

  • For primary navigation
  • When clean paths would work better
  • Without parameter policy defined

Strategy:

Apply parameter policy (stable/unstable/blocked)
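
A sketch of what such a policy table might look like (parameter names and bucket assignments are illustrative, drawn from the patterns on this page):

```typescript
type ParamPolicy = "stable" | "unstable" | "blocked";

const PARAM_POLICIES: Record<string, ParamPolicy> = {
  color: "stable",      // index,follow candidates
  size: "stable",
  sort: "unstable",     // noindex,follow
  page: "unstable",
  view: "blocked",      // robots.txt Disallow
  price_min: "blocked",
  price_max: "blocked",
};

function classifyParam(name: string): ParamPolicy {
  // Unknown parameters default to blocked: the safe choice for crawl budget.
  return PARAM_POLICIES[name] ?? "blocked";
}
```

Defaulting unknown parameters to blocked means a new filter shipped without an explicit policy fails safe rather than opening a crawl trap.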

SEO Impact:

Indexable
Varies
Robots
Varies
Sitemap
Varies

📖Pagination

Medium Risk

Page 2+ creates duplicate content. Use noindex,follow to prevent indexing while allowing discovery.

Page 1 is indexable, page 2+ uses noindex,follow to avoid duplicate content issues.

Example:

/shop/t-shirts?page=2

When to Use:

  • For large result sets
  • To maintain crawl efficiency
  • With self-referencing canonicals

When to Avoid:

  • Never block pagination in robots.txt
  • Don't canonical all pages to page 1
  • Avoid infinite scroll without fallback

Strategy:

Page 1 = index,follow | Page 2+ = noindex,follow
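
The rule is simple enough to state directly (a sketch; the function name is illustrative):

```typescript
// Page 1 is the canonical, indexable listing; every deeper page
// keeps follow so link equity flows, but drops out of the index.
function pageRobots(page: number): string {
  return page <= 1 ? "index,follow" : "noindex,follow";
}
```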

SEO Impact:

Indexable
✗ No
Robots
Allowed
Sitemap
✗ No

🍞Breadcrumb Navigation

Low Risk

Structured internal linking that helps both users and search engines understand site hierarchy.

Breadcrumbs improve UX and provide valuable internal links and structured data.

Example:

Home > Shop > T-Shirts > Women's

When to Use:

  • On all content pages
  • For hierarchical site structures
  • With structured data markup

When to Avoid:

  • Never hide from users or search engines
  • Don't create fake hierarchies
  • Avoid inconsistent trails

Strategy:

Always beneficial for SEO and UX
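
The structured-data half can be sketched as a small schema.org BreadcrumbList builder (a generic sketch, not this app's implementation):

```typescript
interface Crumb {
  name: string;
  url: string;
}

interface BreadcrumbList {
  "@context": string;
  "@type": string;
  itemListElement: Array<{
    "@type": string;
    position: number;
    name: string;
    item: string;
  }>;
}

// Builds schema.org BreadcrumbList JSON-LD from an ordered trail.
function breadcrumbJsonLd(crumbs: Crumb[]): BreadcrumbList {
  return {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    itemListElement: crumbs.map((crumb, i) => ({
      "@type": "ListItem",
      position: i + 1, // positions are 1-based
      name: crumb.name,
      item: crumb.url,
    })),
  };
}
```

The resulting object is serialized into a `<script type="application/ld+json">` tag on each page.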

SEO Impact:

Indexable
✓ Yes
Robots
Allowed
Sitemap
✓ Yes

🎯Faceted Navigation

Varies

Multi-dimensional filtering requires careful parameter classification to avoid crawl traps.

Faceted nav creates many URL combinations. Success depends on proper parameter policies.

Example:

/shop/t-shirts with color, size, price filters

When to Use:

  • For complex product catalogs
  • With proper parameter classification
  • When crawl budget allows

When to Avoid:

  • Without parameter policy
  • With unlimited combinations
  • When simpler navigation works

Strategy:

Careful parameter classification and canonical strategy

SEO Impact:

Indexable
Varies
Robots
Varies
Sitemap
Varies

🔒Protected Routes

N/A

User-specific pages like account areas should be noindex,nofollow and blocked via robots.txt.

Private user data should never be indexed or crawled by search engines.

Example:

/account/orders

When to Use:

  • For user account pages
  • For checkout processes
  • For any private user data

When to Avoid:

  • Never allow indexing
  • Don't forget robots.txt block
  • Avoid exposing in sitemaps

Strategy:

noindex,nofollow + robots.txt: Disallow: /account/

SEO Impact:

Indexable
✗ No
Robots
Blocked
Sitemap
✗ No

🔍Search Pages

Medium Risk

Search result pages are thin content but allow discovery. Use noindex,follow.

Search pages don't add unique value but help crawlers discover content.

Example:

/search?q=shoes

When to Use:

  • For site search functionality
  • To help content discovery
  • With noindex,follow directive

When to Avoid:

  • Never index search results
  • Don't block in robots.txt
  • Avoid infinite query variations

Strategy:

noindex,follow for discovery without indexing

SEO Impact:

Indexable
✗ No
Robots
Allowed
Sitemap
✗ No

⚙️API Routes

N/A

System endpoints should be blocked except for specific routes like /api/robots and /api/sitemap.

APIs aren't meant for search engines, except specific SEO endpoints.

Example:

/api/robots or /api/sitemap

When to Use:

  • Allow /api/robots
  • Allow /api/sitemap
  • Block all other /api/* routes

When to Avoid:

  • Never expose API endpoints to crawlers
  • Don't forget explicit allows
  • Avoid indexing API responses

Strategy:

Allow: /api/robots, Allow: /api/sitemap, Disallow: /api/
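
Pulling the blocked patterns from across this page together, the rules could be assembled like this (a hypothetical builder, not the app's actual /api/robots code; the path list comes from the strategies above):

```typescript
// Collects the Allow/Disallow rules described on this page into one place.
function buildRobotsRules(): { allow: string[]; disallow: string[] } {
  return {
    allow: ["/api/robots", "/api/sitemap"],
    disallow: [
      "/api/",            // everything else under /api/
      "/account/",        // protected routes
      "/*?*view=*",       // UI state
      "/*?*price_min=*",  // infinite numeric ranges
      "/*?*price_max=*",
      "/*?*color=*,*",    // comma multi-select
    ],
  };
}

// Serializes the rules to robots.txt. The specific Allow rules override
// the broader `Disallow: /api/` via longest-match precedence, so rule
// order does not matter to standards-compliant crawlers.
function toRobotsTxt(rules: { allow: string[]; disallow: string[] }): string {
  return [
    "User-agent: *",
    ...rules.allow.map((p) => `Allow: ${p}`),
    ...rules.disallow.map((p) => `Disallow: ${p}`),
  ].join("\n");
}
```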

SEO Impact:

Indexable
✗ No
Robots
Blocked
Sitemap
✗ No

See These Patterns in Action

Visit the shop pages to see how these SEO patterns are implemented. The SEO Receipt panel shows real-time details about each pattern's impact.