@jmagly commented Dec 11, 2025

Summary

This PR adds retry utilities to utils.py, addressing E2.6 from the roadmap ("Add retry logic for network failures").

What's Included

Two new functions in src/skill_seekers/cli/utils.py:

  1. retry_with_backoff() - Sync version for requests operations
  2. retry_with_backoff_async() - Async version for httpx.AsyncClient operations

Features

  • Configurable max attempts (default: 3)
  • Exponential backoff with configurable base delay (doubles each retry)
  • Operation name parameter for meaningful log messages
  • Proper exception handling (re-raises last exception if all retries fail)
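For reference, the behavior described above can be sketched as follows. This is a hypothetical, minimal version, not the actual code in src/skill_seekers/cli/utils.py, which may differ in details:

```python
# Minimal sketch of a sync retry helper matching the described behavior.
# Hypothetical; the real implementation lives in src/skill_seekers/cli/utils.py.
import logging
import time

logger = logging.getLogger(__name__)


def retry_with_backoff(operation, max_attempts=3, base_delay=1.0,
                       operation_name="operation"):
    """Call operation(), retrying with exponential backoff on exception."""
    delay = base_delay
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception as exc:
            if attempt == max_attempts:
                logger.error("%s failed after %d attempts: %s",
                             operation_name, max_attempts, exc)
                raise  # re-raise the last exception
            logger.warning("%s failed (attempt %d/%d), retrying in %.1fs: %s",
                           operation_name, attempt, max_attempts, delay, exc)
            time.sleep(delay)
            delay *= 2  # backoff doubles on each retry
```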

Roadmap Alignment

  • E2.6: Add retry logic for network failures ✅

Usage Example

import requests

from skill_seekers.cli.utils import retry_with_backoff

url = "https://example.com"  # illustrative target URL

def fetch_page():
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.text

content = retry_with_backoff(
    fetch_page,
    max_attempts=3,
    base_delay=1.0,
    operation_name=f"fetch {url}"
)

Async version:

import httpx

from skill_seekers.cli.utils import retry_with_backoff_async

url = "https://example.com"  # illustrative target URL
client = httpx.AsyncClient()

async def fetch_page():
    response = await client.get(url, timeout=30.0)
    response.raise_for_status()
    return response.text

content = await retry_with_backoff_async(
    fetch_page,
    operation_name=f"fetch {url}"
)
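The async helper can be sketched the same way as the sync one. Again, this is a hypothetical illustration of the documented behavior, not the project's actual code:

```python
# Minimal sketch of the async retry helper; mirrors the sync version but
# awaits the operation and uses asyncio.sleep for the backoff delay.
# Hypothetical; the real implementation lives in src/skill_seekers/cli/utils.py.
import asyncio
import logging

logger = logging.getLogger(__name__)


async def retry_with_backoff_async(operation, max_attempts=3, base_delay=1.0,
                                   operation_name="operation"):
    """Await operation(), retrying with exponential backoff on exception."""
    delay = base_delay
    for attempt in range(1, max_attempts + 1):
        try:
            return await operation()
        except Exception as exc:
            if attempt == max_attempts:
                logger.error("%s failed after %d attempts: %s",
                             operation_name, max_attempts, exc)
                raise  # re-raise the last exception
            logger.warning("%s failed (attempt %d/%d), retrying in %.1fs: %s",
                           operation_name, attempt, max_attempts, delay, exc)
            await asyncio.sleep(delay)
            delay *= 2  # backoff doubles on each retry
```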

Log Output Example

WARNING: fetch https://example.com failed (attempt 1/3), retrying in 1.0s: Connection refused
WARNING: fetch https://example.com failed (attempt 2/3), retrying in 2.0s: Connection refused
ERROR: fetch https://example.com failed after 3 attempts: Connection refused

Tests Included

7 new unit tests in TestRetryWithBackoff and TestRetryWithBackoffAsync:

  • test_successful_operation_first_try
  • test_successful_operation_after_retry
  • test_all_retries_fail
  • test_exponential_backoff_timing
  • test_async_successful_operation
  • test_async_retry_then_success
  • test_async_all_retries_fail
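As a rough illustration of how the timing test might verify the backoff schedule (hypothetical; the real tests live in the project's test suite, and the retry helper is stubbed inline here so the example is self-contained):

```python
# Sketch of a backoff-timing test: patch time.sleep so the test runs
# instantly and the requested delays can be inspected.
import time
from unittest.mock import patch


def retry_with_backoff(operation, max_attempts=3, base_delay=1.0,
                       operation_name="operation"):
    # Inline stub of the helper under test, for self-containedness.
    delay = base_delay
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(delay)
            delay *= 2


def test_exponential_backoff_timing():
    with patch("time.sleep") as mock_sleep:
        attempts = []

        def always_fails():
            attempts.append(1)
            raise ConnectionError("refused")

        try:
            retry_with_backoff(always_fails, max_attempts=3, base_delay=1.0)
        except ConnectionError:
            pass
        # Two sleeps before the final failing attempt: 1.0s, then 2.0s.
        assert [c.args[0] for c in mock_sleep.call_args_list] == [1.0, 2.0]
        assert len(attempts) == 3
```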

Integration Note

This PR provides the utility functions but does not modify existing scrapers. Once merged, the doc_scraper can be updated to use these utilities:

# In doc_scraper.py scrape_page():
def _fetch():
    response = requests.get(url, headers=headers, timeout=30)
    response.raise_for_status()
    return response

response = retry_with_backoff(_fetch, operation_name=f"fetch {url}")

This keeps the PR focused and reviewable. Integration can be done in a follow-up PR if desired.


Contributed by the AI Writing Guide project.
