Reject basebackup requests for archived timelines (#10828)
For archived timelines, we would like to prevent all non-pageserver-issued getpage requests, as users are not supposed to send these. Instead, they should unarchive the timeline before issuing any external read traffic.

As this is non-trivial to do, at least prevent launches of new computes by erroring on basebackup requests for archived timelines. In #10688, we started issuing a warning instead of an error, because an error would have meant a stuck project. Now that we have confirmed the warning has not appeared in the logs for about a week, we can issue errors.

Follow-up of #10688 
Related: #9548
arpad-m authored Feb 24, 2025
1 parent 5fad4a4 commit df362de
Showing 1 changed file with 2 additions and 3 deletions.
5 changes: 2 additions & 3 deletions pageserver/src/page_service.rs
@@ -2097,9 +2097,8 @@ impl PageServerHandler {
         set_tracing_field_shard_id(&timeline);
 
         if timeline.is_archived() == Some(true) {
-            // TODO after a grace period, turn this log line into a hard error
-            tracing::warn!("timeline {tenant_id}/{timeline_id} is archived, but got basebackup request for it.");
-            //return Err(QueryError::NotFound("timeline is archived".into()))
+            tracing::info!("timeline {tenant_id}/{timeline_id} is archived, but got basebackup request for it.");
+            return Err(QueryError::NotFound("timeline is archived".into()));
         }
 
         let latest_gc_cutoff_lsn = timeline.get_applied_gc_cutoff_lsn();
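For context, here is a minimal self-contained sketch of the guard this commit introduces. Timeline and QueryError below are simplified stand-ins rather than the pageserver's real types, and check_basebackup_allowed is a hypothetical helper; the actual code lives in PageServerHandler in pageserver/src/page_service.rs.

// Sketch only: simplified stand-ins for the pageserver's Timeline and QueryError.
#[derive(Debug)]
enum QueryError {
    NotFound(String),
}

struct Timeline {
    // None: archival state not known; Some(true): archived.
    archived: Option<bool>,
}

impl Timeline {
    fn is_archived(&self) -> Option<bool> {
        self.archived
    }
}

// Mirrors the change above: basebackup requests for archived timelines
// now get a hard error instead of only a log line.
fn check_basebackup_allowed(timeline: &Timeline) -> Result<(), QueryError> {
    if timeline.is_archived() == Some(true) {
        return Err(QueryError::NotFound("timeline is archived".into()));
    }
    Ok(())
}

fn main() {
    let archived = Timeline { archived: Some(true) };
    assert!(check_basebackup_allowed(&archived).is_err());

    let active = Timeline { archived: Some(false) };
    assert!(check_basebackup_allowed(&active).is_ok());
}

With the error returned (rather than the previous warning), a new compute can no longer start against an archived timeline; the timeline must be unarchived first.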

1 comment on commit df362de

@github-actions

7141 tests run: 6784 passed, 13 failed, 344 skipped (full report)


Failures on Postgres 16

  • test_throughput[github-actions-selfhosted-50-pipelining_config7-30-100-128-batchable {'max_batch_size': 1, 'execution': 'tasks', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config8-30-100-128-batchable {'max_batch_size': 2, 'execution': 'concurrent-futures', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config9-30-100-128-batchable {'max_batch_size': 2, 'execution': 'tasks', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config10-30-100-128-batchable {'max_batch_size': 4, 'execution': 'concurrent-futures', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config11-30-100-128-batchable {'max_batch_size': 4, 'execution': 'tasks', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config12-30-100-128-batchable {'max_batch_size': 8, 'execution': 'concurrent-futures', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config13-30-100-128-batchable {'max_batch_size': 8, 'execution': 'tasks', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config14-30-100-128-batchable {'max_batch_size': 16, 'execution': 'concurrent-futures', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config15-30-100-128-batchable {'max_batch_size': 16, 'execution': 'tasks', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config16-30-100-128-batchable {'max_batch_size': 32, 'execution': 'concurrent-futures', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config17-30-100-128-batchable {'max_batch_size': 32, 'execution': 'tasks', 'mode': 'pipelined'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config5-30-100-128-batchable {'mode': 'serial'}]: release-x86-64-with-lfc
  • test_throughput[github-actions-selfhosted-50-pipelining_config6-30-100-128-batchable {'max_batch_size': 1, 'execution': 'concurrent-futures', 'mode': 'pipelined'}]: release-x86-64-with-lfc
# Run all failed tests locally:
scripts/pytest -vv -n $(nproc) -k "test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config7-30-100-128-batchable {'max_batch_size': 1, 'execution': 'tasks', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config8-30-100-128-batchable {'max_batch_size': 2, 'execution': 'concurrent-futures', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config9-30-100-128-batchable {'max_batch_size': 2, 'execution': 'tasks', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config10-30-100-128-batchable {'max_batch_size': 4, 'execution': 'concurrent-futures', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config11-30-100-128-batchable {'max_batch_size': 4, 'execution': 'tasks', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config12-30-100-128-batchable {'max_batch_size': 8, 'execution': 'concurrent-futures', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config13-30-100-128-batchable {'max_batch_size': 8, 'execution': 'tasks', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config14-30-100-128-batchable {'max_batch_size': 16, 'execution': 'concurrent-futures', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config15-30-100-128-batchable {'max_batch_size': 16, 'execution': 'tasks', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config16-30-100-128-batchable {'max_batch_size': 32, 'execution': 'concurrent-futures', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config17-30-100-128-batchable {'max_batch_size': 32, 'execution': 'tasks', 'mode': 'pipelined'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config5-30-100-128-batchable {'mode': 'serial'}] or test_throughput[release-pg16-github-actions-selfhosted-50-pipelining_config6-30-100-128-batchable {'max_batch_size': 1, 'execution': 'concurrent-futures', 'mode': 'pipelined'}]"
Flaky tests (1)

Postgres 17

Test coverage report is not available

The comment gets automatically updated with the latest test results
df362de at 2025-02-24T20:58:02.232Z :recycle:
