Pages That Have Not Been Requested in the Past 30 Days

Identify and manage stale URLs that haven't been visited by bots in the last 30 days.

Screenshot Placeholder – Cache Manager Filter (image-1726474608111)

Summary / Overview

Index render’s “Pages that have not been requested in the past 30 days” view helps you quickly locate stale content that hasn't been accessed by crawlers recently. This feature is available in the Cache Manager and helps teams clean up old content, optimize cache usage, and prioritize high-value pages for recaching or retention.

Detailed Explanation / How It Works

What It Does

When enabled, this toggle filters your cached page list to show only URLs that have not been requested by bots (such as Googlebot or Bingbot) in the past 30 days.

These pages are considered inactive from a bot visibility standpoint and may be candidates for cache cleanup or deprioritization.
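As a rough sketch, the filter's logic amounts to comparing each cached page's last bot request against a 30-day cutoff. The field names below are hypothetical and do not reflect Index render's actual data model:

```python
from datetime import datetime, timedelta

def stale_pages(cached_pages, now=None, window_days=30):
    """Return cached pages with no bot request in the last `window_days`.

    `cached_pages` is assumed to be a list of dicts with a `url` and a
    `last_bot_request` datetime (None if never requested) -- hypothetical
    fields for illustration only.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    return [
        p for p in cached_pages
        if p["last_bot_request"] is None or p["last_bot_request"] < cutoff
    ]

pages = [
    {"url": "/pricing", "last_bot_request": datetime.utcnow()},
    {"url": "/old-campaign", "last_bot_request": datetime.utcnow() - timedelta(days=45)},
    {"url": "/never-crawled", "last_bot_request": None},
]
print([p["url"] for p in stale_pages(pages)])  # → ['/old-campaign', '/never-crawled']
```

Note that pages never requested at all also count as inactive, which is why the sketch treats a missing timestamp the same as an old one.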

Why It Matters

  • Provides insight into which pages are no longer being crawled.
  • Helps identify forgotten or deprecated URLs still living in cache.
  • Allows teams to focus recache or crawl efforts on pages that are more SEO-relevant.

Step-by-Step Usage

1. Go to the Cache Manager from the Index render Dashboard.
2. Just above the cached pages table, locate the filter toggle labeled “Pages that have not been requested in the past 30 days”.
3. Click the toggle to enable the filter.
4. The table will refresh to show only pages that have had zero bot requests in the last month.

Screenshot Placeholder – Filter Enabled (image-1726474608111)

Important: This filter tracks inactivity from bots only—not human visitors.

Common Pitfalls / Tips

  • Some inactive pages may still be important for seasonal or campaign-based SEO—review URLs before purging.
  • Use this view together with Rendering Queues to evaluate content freshness strategies.
  • Combine bot traffic insights with your analytics platform to get a full picture of page performance.
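To illustrate the last tip, here is a minimal sketch (with hypothetical inputs, not an Index render API) that cross-references stale cached URLs against human pageview counts from an analytics export before flagging anything for purge:

```python
def purge_candidates(stale_urls, human_pageviews, min_views=10):
    """Flag stale URLs for purge only when humans aren't visiting them either.

    `stale_urls` would come from the 30-day filter; `human_pageviews` maps
    URL -> pageview count exported from your analytics platform. Both inputs
    and the `min_views` threshold are illustrative assumptions.
    """
    return [u for u in stale_urls if human_pageviews.get(u, 0) < min_views]

stale = ["/old-campaign", "/seasonal-sale", "/legacy-doc"]
views = {"/seasonal-sale": 250, "/legacy-doc": 3}
print(purge_candidates(stale, views))  # → ['/old-campaign', '/legacy-doc']
```

Here "/seasonal-sale" survives the purge list despite bot inactivity because humans still visit it—exactly the seasonal-content case the first tip warns about.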