[Feature Request] Implement Batch Processing (Chunking) to stabilize large library scans #26

@dgomesbr

Description

Is your feature request related to a problem? Please describe.
Yes. Currently, the application in internal/media/processor.go iterates through the entire movie/series list in a single continuous loop. For large libraries (e.g., 4,000+ items), this can flood the Radarr/Sonarr API with thousands of requests in a short burst.

This causes:

  1. Radarr/Sonarr Instability: The SQLite database locks up under the read pressure.
  2. Container Crashes: The HTTP requests issued from client.go consume file descriptors and memory until the container is killed.

Describe the solution you'd like
I suggest modifying the main processing loop in internal/media/processor.go to process items in batches (e.g., 100 items at a time), with a configurable delay between batches.

Key Changes:

  1. Batch Size: Default to 100.
  2. Batch Delay: Default to 10 seconds (allows Radarr to "cool down" and write to DB).
  3. Environment Variables: Add BATCH_SIZE and BATCH_DELAY_SECONDS to cmd/labelarr/main.go.
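To make the proposal concrete, here is a minimal sketch of the batching loop. The names `envInt`, `processInBatches`, and the `[]int` stand-in for the media list are illustrative, not the project's actual types; only the env-var names (`BATCH_SIZE`, `BATCH_DELAY_SECONDS`) and defaults come from the request above.

```go
package main

import (
	"fmt"
	"os"
	"strconv"
	"time"
)

// envInt reads a positive integer from the environment, falling back to def.
func envInt(key string, def int) int {
	if v := os.Getenv(key); v != "" {
		if n, err := strconv.Atoi(v); err == nil && n > 0 {
			return n
		}
	}
	return def
}

// processInBatches walks items in chunks of batchSize, calling handle for
// each item and sleeping for delay between chunks so the downstream API
// (Radarr/Sonarr) can flush writes. It returns the number of batches run.
func processInBatches(items []int, batchSize int, delay time.Duration, handle func(int)) int {
	batches := 0
	for start := 0; start < len(items); start += batchSize {
		end := start + batchSize
		if end > len(items) {
			end = len(items)
		}
		for _, it := range items[start:end] {
			handle(it)
		}
		batches++
		if end < len(items) {
			time.Sleep(delay) // cool-down between batches, skipped after the last one
		}
	}
	return batches
}

func main() {
	batchSize := envInt("BATCH_SIZE", 100)
	delay := time.Duration(envInt("BATCH_DELAY_SECONDS", 10)) * time.Second
	fmt.Println("config:", batchSize, delay)

	items := make([]int, 250) // stand-in for the movie/series list
	n := processInBatches(items, batchSize, 0, func(int) {}) // zero delay so the demo runs instantly
	fmt.Printf("processed %d items in %d batches\n", len(items), n)
}
```

With the defaults, a 250-item library would be handled in three batches, with two 10-second pauses in between.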
