Changes to the GRIB to netCDF converter on CDS-Beta/ADS-Beta

The new CDS-Beta/ADS-Beta system uses cfgrib for the GRIB to netCDF conversion. This converter is labelled as “experimental”, as it is still under development.

The same converter is used for direct downloads and for post-processed data.

There will be some differences from the netCDF produced by the previous CDS/ADS system (which used grib_to_netcdf).

In addition, the way the data are encoded will vary from dataset to dataset.

The changes from the previous version of the converter include:

  • NetCDF3 → NetCDF4 (including compression options)
  • Changes to metadata attributes in files
  • Ordering of dimensions
  • Changes to time dimension names
  • Splitting of files when incompatibilities are detected, with a zip file returned rather than a single netCDF file
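Because a request may now come back either as a single netCDF file or as a zip archive of several, download scripts may want to detect which they received before opening the data. A minimal sketch (the function name and file layout are illustrative, not part of the CDS API):

```python
import zipfile

def unpack_download(path, outdir="."):
    """Return a list of netCDF file paths from a CDS/ADS download.

    The new converter may return either a single netCDF file, or a
    zip archive containing several netCDF files when incompatible
    fields were split apart.
    """
    if zipfile.is_zipfile(path):
        with zipfile.ZipFile(path) as zf:
            names = [n for n in zf.namelist() if n.endswith(".nc")]
            zf.extractall(outdir)
        return [f"{outdir}/{n}" for n in names]
    # Not a zip: assume the download is already a single netCDF file.
    return [path]
```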

Further changes may be introduced in the future, and will be documented accordingly.


Hello Kevin,
thank you for addressing this very important compatibility issue.
Is the current format documented somewhere, and if not, will it be before the September 16 migration?
Thank you.

Hi Nicolas,
The converter format is still under development (and is labelled as “Experimental” on the CDS-Beta/ADS-Beta download forms). As such, there will probably be changes to the netCDF produced over the coming weeks, but we will attempt to document the key changes as they become finalised.
Hope that helps,
Kevin

Hello,

When will the final version of the converter be released?

We rely on the netCDF format for our services, and the change in structure requires us to refactor our codebase to comply with the new format. The shutdown of the old CDS/ADS on 16/09 gives us a very short time frame to upgrade our services to the new CDS-beta/ADS-beta.

Thanks,
Greg

Hi Kevin,

We have the same issue - all of our operational services use the ADS API to access data in netCDF format, so the change in file format requires us to change our internal processor and implement this in all our systems. The shutdown of the old ADS on 16 September already presents a very challenging timescale to make these changes.
Can you confirm that the final version of the GRIB to netCDF processor will be ready before 16 September?

Many of our systems use the European gridded forecast product, which is not yet available in the ADS-Beta. We need time to switch these systems over to ADS-Beta, but we cannot even start testing because the dataset we need is not yet available.

Will you consider extending the old ADS for a short time if the new ADS-Beta components are not ready in sufficient time for users to make the transition?

Thanks,
Amy Stidworthy

Hi Amy,

We’re hoping to release the European AQ forecast on ADS-Beta this week. This dataset won’t use the new converter - it will continue to use the same custom one, which should mean you won’t see any difference in the returned results. At some point we’d like to switch to using the new converter for this dataset, but it won’t be any time soon.

Luke.

Hello Luke,
What is the difference between the dataset you are mentioning and the ERA5 datasets? Does this mean that the new converter won’t be ready any time soon? If so, is it also possible to use the old one for the ERA5 datasets for as long as the new one is not ready, to avoid disruptions?

Thank you.

Thanks Luke, that’s good to hear! When you do come to switch the European AQ dataset to the new GRIB to netCDF converter, it would be great if you could provide some sample data in advance, so that we can test our processor with the new-format data.

Thanks,
Amy

I have just downloaded a netCDF file from the new dataset, but I cannot open the .nc file with the “sdfopen” command of the GrADS software. Why? Looking forward to your reply. Thank you.


Does your download include wave variables? For me, that caused problems with the CDS-Beta API.

We are also having some problems opening the new ERA5 netCDF with GrADS and CDO; it seems to me that the added variables and the change in the time dimension make the file unreadable in these programs.

The solution I found was to use the GrADS command xdfopen to open the netCDF with a descriptor file, but this process is very time-consuming because a descriptor needs to be written for each file.

I have tried downloading the GRIB file and then using the grib_to_netcdf tool, but it doesn’t seem to work with all variable types.

It would be nice to have a more robust and general solution for this.
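If the main blocker is the renamed time dimension, one possible workaround is to rename it back with xarray before handing the file to GrADS or CDO. This is only a sketch: it assumes the new files use a dimension called `valid_time`, which may vary by dataset (check with `ncdump -h` first).

```python
import xarray as xr

def restore_time_dim(ds, old="valid_time", new="time"):
    """Rename the time dimension so that older tools expecting a
    plain 'time' dimension can read the file. The dimension names
    here are assumptions; inspect your own files first."""
    return ds.rename({old: new}) if old in ds.dims else ds

# Usage sketch ("era5_new.nc" / "era5_fixed.nc" are placeholder paths):
# ds = xr.open_dataset("era5_new.nc")
# restore_time_dim(ds).to_netcdf("era5_fixed.nc")
```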


Hi Kevin,

Respectfully, why would CDS do this? This is a BIG change, and a change that breaks many applications that aggregate datasets via dimension name and data type! I maintain a repository of hundreds of TB of your ERA5 data, and now all of the new files will no longer be maximally compatible with the files that have already been stored! That is, unless I re-download all of the previous files or write a script that restores the dimension/data types to what they were before.

Might I suggest giving users an option to use the previous converter or converter version, perhaps with a warning saying that the previous one is no longer supported?

Thanks,

Kris R.

Hello,

I seem to have a similar problem to the ones stated above following the switch to CDS-Beta. The netCDF files that I download from the new API seem to be missing some variables, in my case specifically surface solar radiation downwards and total precipitation from ERA5 hourly data.

I have tried downloading a GRIB file instead and then converting it through cfgrib, but ran into a problem similar to this one: Cfgrib only reads some variables · Issue #245 · ecmwf/cfgrib · GitHub (with the difference that my missing variables have a different dataType (fc) from the other ones in the file (an)).
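For the an/fc mix specifically, cfgrib can be told to load one dataType at a time via `filter_by_keys`, so the forecast-type variables are not silently dropped. A sketch, assuming xarray with the cfgrib engine is installed (the helper names are mine):

```python
def cfgrib_filter(data_type):
    """backend_kwargs that restrict cfgrib to one GRIB dataType."""
    return {"filter_by_keys": {"dataType": data_type}}

def open_grib_by_datatype(path, data_types=("an", "fc")):
    """Open each GRIB dataType as a separate xarray dataset, so that
    cfgrib does not drop the variables whose dataType differs from
    the others (the an/fc mix described above)."""
    import xarray as xr  # requires the cfgrib engine to be installed
    return {
        dt: xr.open_dataset(path, engine="cfgrib",
                            backend_kwargs=cfgrib_filter(dt))
        for dt in data_types
    }
```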

As @Kristopher_Rand already suggested, it would be beneficial to have the option to choose between different converter versions in order to avoid these types of problems. Also, as a first step, it would at least be good to warn users that some of their variables get lost in the conversion process. Are there any updates on the converter currently in use, or are further developments expected soon to resolve this before the old API is decommissioned?

Thank you in advance.


As a follow-up to my previous post, and for anyone else who is having issues: netCDF files downloaded using the previous CDS should record the conversion step in the “history” global attribute. The tool used was grib_to_netcdf, specifically version 2.28.1. The conversion typically took the form of:

grib_to_netcdf -S param -o /path/to/convertedFile.nc /path/to/originalFile.grib

More info:
https://confluence.ecmwf.int/display/ECC/grib_to_netcdf

If you have access to this tool, try simply downloading the original .grib file and then performing the conversion separately. Doing this should produce a file almost identical to what you would have gotten from the previous CDS.

Kris R.

Hello,

many thanks for your updates. I tried to use this to convert the SEAS5 GRIB files with all members to netCDF, but when I open them with ncview or try to manipulate them with CDO, I get several warnings and errors; it seems the converted netCDF is not in standard nc4 format. In more detail, with the following command:

cdo -f nc4 copy d68bffce2ea6e76461b4910fbd7eb47d.grib SEAS5_PRATE_GLOBAL_ABS_M202409_51ENS.nc
cdo copy: Open failed on >SEAS5_PRATE_GLOBAL_ABS_M202409_51ENS.nc<
Unsupported file type (library support not compiled in)
CDO was build with a NetCDF version which doesn’t support NetCDF4 data!

But I compiled CDO with all the standard netCDF4, HDF5, etc. libraries, and it works with the data retrieved from the old CDS or from NOAA.

If I run the basic command:
cdo -f nc copy d68bffce2ea6e76461b4910fbd7eb47d.grib SEAS5_PRATE_GLOBAL_ABS_M202409_51ENS.nc

it works properly, but the netCDF file contains only one member; it loses the others.

Does anyone have an idea how to fix the issue with the SEAS5 files?

thank you in advance.
Marco

Hi @Marco_Formenton !

Can I ask you to provide a bit more detail about this?

It is not clear to me if you are having issues with GRIB data while you use CDO (which would be unrelated to this topic of the new GRIB to netCDF conversion) or if you are having issues with the netCDF data coming from the new CDS.

Thanks!
Edu

Hi Eduardo,

actually I am having issues converting the GRIB files to netCDF in the case of ensemble-member monthly files. I did several tests:

1 - I tried with the grib_to_netcdf.py provided by ECMWF; it seems to work, but when I do an operation with CDO (for example sellonlatbox) it gives me back the error shown earlier. When I try to read the file with ncview, it prints several lines of “ncview: netcdf_dim_value: unknown data type (10) for dimension number”.

2 - The same issue is present when I download the netCDF directly from the CDS-Beta webpage or by means of the API, something that did not happen with the old Copernicus.

3 - With cdo -f nc the netCDF seems fine and the manipulation works well, but it appends all the members as time steps: the netCDF has 51 × 6 time steps rather than two distinct dimensions (6 time steps and 51 members).

We are making many attempts, even recompiling CDO with all the HDF5 and netCDF4 libraries, but without success.
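One avenue worth trying (untested against this particular SEAS5 file): open the GRIB directly with xarray/cfgrib, which normally decodes the ensemble members into a separate `number` dimension rather than appending them to time, and write NetCDF4 from there. File paths are placeholders.

```python
def convert_seas5(grib_path, nc_path):
    """Convert an ensemble GRIB to netCDF while keeping the members
    as a 'number' dimension (cfgrib usually maps perturbationNumber
    to 'number'; verify this on your own file)."""
    import xarray as xr  # requires the cfgrib engine to be installed
    ds = xr.open_dataset(grib_path, engine="cfgrib")
    ds.to_netcdf(nc_path)  # NetCDF4, with time and number kept distinct
    return dict(ds.sizes)
```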

Thank you a lot for your support with this disruptive issue.

Best,
Marco

Hi everyone,
an informative page about the new GRIB to netCDF converter used on the new CDS and ADS systems is available here: GRIB to netCDF conversion on CDS-Beta and ADS-Beta - Copernicus Knowledge Base - ECMWF Confluence Wiki

Thanks
ECMWF Support


Hi,

I'm also having issues opening netCDF files after download, using the Python package iris.

Traceback (most recent call last):
  File "/gws/nopw/j04/4dhydro/data/jules/164_deg/code/drivers/rhine/../create_jules_met_era5_mp.py", line 660, in <module>
    main()
  File "/gws/nopw/j04/4dhydro/data/jules/164_deg/code/drivers/rhine/../create_jules_met_era5_mp.py", line 657, in main
    process_driving_data(region_bounds, year, month, timestep, grid_file, era5_type, outputdir, remove_temps, dl_only, mp_run)
  File "/gws/nopw/j04/4dhydro/data/jules/164_deg/code/drivers/rhine/../create_jules_met_era5_mp.py", line 621, in process_driving_data
    driver_generator(i)
  File "/gws/nopw/j04/4dhydro/data/jules/164_deg/code/drivers/rhine/../create_jules_met_era5_mp.py", line 501, in driver_generator
    cube = iris.load_cube(jules_map[f]['filepath'], jules_map[f]['era5_names'])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/__init__.py", line 358, in load_cube
    cubes = _load_collection(uris, constraints, callback).cubes()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/__init__.py", line 298, in _load_collection
    result = _CubeFilterCollection.from_cubes(cubes, constraints)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/cube.py", line 95, in from_cubes
    for cube in cubes:
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/__init__.py", line 279, in _generate_cubes
    for cube in iris.io.load_files(part_names, callback, constraints):
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/io/__init__.py", line 219, in load_files
    for cube in handling_format_spec.handler(fnames, callback, constraints):
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/netcdf/loader.py", line 633, in load_cubes
    cube = _load_cube(engine, cf, cf_var, cf.filename)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/netcdf/loader.py", line 328, in _load_cube
    return _load_cube_inner(engine, cf, cf_var, filename)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/netcdf/loader.py", line 357, in _load_cube_inner
    engine.activate()
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/_nc_load_rules/engine.py", line 95, in activate
    run_actions(self)
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/_nc_load_rules/actions.py", line 575, in run_actions
    action_build_auxiliary_coordinate(engine, auxcoord_fact)
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/_nc_load_rules/actions.py", line 92, in inner
    rule_name = func(engine, *args, **kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/_nc_load_rules/actions.py", line 410, in action_build_auxiliary_coordinate
    hh.build_auxiliary_coordinate(engine, cf_var, coord_name=coord_name)
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/_nc_load_rules/helpers.py", line 1240, in build_auxiliary_coordinate
    points_data = _get_cf_var_data(cf_coord_var, engine.filename)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/apps/jasmin/jaspy/miniforge_envs/jaspy3.11/mf3-23.11.0-0/envs/jaspy3.11-mf3-23.11.0-0-v20240815/lib/python3.11/site-packages/iris/fileformats/netcdf/loader.py", line 218, in _get_cf_var_data
    total_bytes = cf_var.size * cf_var.dtype.itemsize
                                ^^^^^^^^^^^^^^^^^^^^^
AttributeError: type object 'str' has no attribute 'itemsize'

The workaround I currently have is to download the file in GRIB format and do a straight conversion with xarray. The result then seems to open without issue, albeit not completely CF-compliant. This also seems to be OK for CDO use (@Marco_Formenton).
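For reference, the straight conversion described above can be sketched like this (paths are placeholders; assumes xarray with the cfgrib engine installed):

```python
def grib_to_nc(grib_path, nc_path):
    """Stopgap GRIB -> netCDF conversion via xarray/cfgrib. The
    output opens fine in most tools but, as noted above, is not
    completely CF-compliant."""
    import xarray as xr  # requires the cfgrib engine to be installed
    ds = xr.open_dataset(grib_path, engine="cfgrib")
    ds.to_netcdf(nc_path)
```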

Please let me know if there’s somewhere more appropriate to put this post.

Thanks, Matt

hi Jelena,
if the missing data are from requests that include the first day of July in netCDF, it may be this ‘experiment version’ (expver) issue, caused by the request including a mix of ERA5 and ERA5T data:

The workaround is to download the missing variables in a separate request. (The issue is due to the GRIB to netCDF converter used on CDS-Beta.)
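A sketch of that workaround with the cdsapi client: build two requests that differ only in the variable list and retrieve them separately. The dataset name, variables, and dates below are illustrative, not taken from the original request.

```python
def split_requests(base, ok_vars, missing_vars):
    """Build two CDS API request dicts, so that variables affected
    by the ERA5/ERA5T expver mix can be retrieved on their own."""
    return ({**base, "variable": list(ok_vars)},
            {**base, "variable": list(missing_vars)})

# Usage sketch (requires cdsapi and valid CDS credentials):
# import cdsapi
# base = {"product_type": "reanalysis", "year": "2024", "month": "07",
#         "day": ["01"], "time": ["00:00"], "format": "netcdf"}
# req_a, req_b = split_requests(base, ["2m_temperature"],
#                               ["total_precipitation"])
# c = cdsapi.Client()
# c.retrieve("reanalysis-era5-single-levels", req_a, "main.nc")
# c.retrieve("reanalysis-era5-single-levels", req_b, "missing.nc")
```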