Hello ECMWF / C3S team,
I am a researcher at Özyeğin University (Istanbul, Turkey) working on
a hail event modeling study covering European and Turkish domains.
The project requires the full CERRA reanalysis (single-levels and
pressure-levels) over 20 years (2005–2024), totaling ~7.7 TB.
Due to a project submission deadline on Wednesday, 2026-04-22,
I am requesting your support for completing this retrieval in time.
Research context
- University: Özyeğin University
- Purpose: Historical hail event analysis and atmospheric modeling
  (surface variables + pressure-level profiles) for Europe and Turkey
- Non-commercial, academic research
Current setup
- Account email: guven.fidan@ozu.edu.tr
- Using cdsapi with monthly-split requests (each variable × each month, 96 requests/year)
- Running 7 parallel script instances on different years, 4–6 workers each
- Observed per-script throughput: ~7.6 GB/hour
- Combined aggregate throughput: ~25–30 GB/hour
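For reference, the monthly-split request structure looks like the sketch below. The dataset name follows the CDS catalogue entry for CERRA single levels; the exact request keys (e.g. "level_type", "product_type") are assumptions based on the CDS web-form pattern and may not match my production scripts exactly.

```python
def build_request(variable: str, year: int, month: int) -> dict:
    """Build one (variable, month) CERRA request, matching the monthly split."""
    return {
        "variable": variable,
        "level_type": "surface_or_atmosphere",  # assumed CERRA single-levels key
        "data_type": "reanalysis",
        "product_type": "analysis",
        "year": str(year),
        "month": f"{month:02d}",
        "day": [f"{d:02d}" for d in range(1, 32)],
        "time": [f"{h:02d}:00" for h in range(0, 24, 3)],  # CERRA 3-hourly analyses
        "format": "grib",
    }

if __name__ == "__main__":
    import cdsapi  # credentials read from ~/.cdsapirc

    c = cdsapi.Client()
    c.retrieve(
        "reanalysis-cerra-single-levels",
        build_request("2m_temperature", 2014, 6),
        "cerra_t2m_2014_06.grib",
    )
```

Each script instance iterates such requests over its assigned year, with 4–6 of these retrievals in flight at once.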
Progress so far (~1.54 TB completed)
- Completed: 2020, 2022, 2023, 2024
- In progress: 2013–2019 via CDS
- Remaining: ~3.5 TB
Issue
At the current aggregate rate, the remaining ~3.5 TB will take roughly
120 hours (about five days), which misses the Wednesday deadline by a
significant margin.
Questions
- Is it possible to temporarily increase my queue / concurrent request
  limit for the next 4 days to expedite this academic retrieval?
- Is there any bulk/institutional download path for multi-TB CERRA
  research campaigns (e.g., MARS direct access, alternative endpoint)?
- Any guidance on request structure to accelerate tape-archive retrieval?
I have already attempted WEkEO HDA as a parallel path, but per-connection
throughput is ~0.2 MB/s (server-throttled), so only CDS is viable.
Thank you very much for any assistance you can provide for this
time-sensitive research work.
Best regards,
Güven Fidan
Özyeğin University