
Releases: neuralhydrology/neuralhydrology

v.1.0.0 First major release

24 Oct 20:11
a710d61

A long time has passed since we started this library for internal research use, and also since its publication as an open-source library. After an extended beta period, we are happy to finally bump NeuralHydrology to its first major release version.

New Features

  • We removed the dependency on the pickle library for storing the scaler and basin-id-to-integer dictionaries. pickle has caused a lot of headaches in the past, because it can be troublesome to run old models after upgrading Python libraries. This PR removes the pickle dependencies; from now on, the scaler (used for normalization) and the basin-id-to-integer dictionaries (used with basin one-hot encoding) are stored in YAML files. The old format is still supported, so existing runs can still be used.
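As a minimal sketch of the idea (the file name and dictionary layout below are illustrative, not the library's exact format), round-tripping such a dictionary through YAML keeps it human-readable and robust across Python and library upgrades, unlike pickle:

```python
import os
import tempfile

import yaml  # PyYAML

# Illustrative scaler dictionary; the library's actual structure may differ.
scaler = {"center": {"precip": 2.1}, "scale": {"precip": 1.3}}

run_dir = tempfile.mkdtemp()
path = os.path.join(run_dir, "scaler.yml")  # hypothetical file name

# Write the scaler as plain YAML instead of a pickle binary.
with open(path, "w") as fp:
    yaml.safe_dump(scaler, fp)

# Reading it back requires no matching class definitions or library versions.
with open(path, "r") as fp:
    restored = yaml.safe_load(fp)
```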

Additionally

  • Updated the descriptions of the config arguments metrics and clip_targets_to_zero

Update of data use in tutorial notebooks

19 Aug 08:51
cd73563
Pre-release

In some of the tutorials, we used the updated forcing data of the CAMELS US data set but failed to provide the corresponding download links. This has been fixed, and information on the data requirements was added. We also removed the requirement for the updated forcing data from the very first tutorial, so that it can be run with only the original CAMELS US data set.

Small performance improvements, minor fix

04 Jun 09:05
ffbc600
  • Small performance improvements while loading CAMELS US forcing data #38
  • Fixed the requirement for local installation of Git in the Logger class #41

Bugfixes for MTS-LSTM uncertainty sampling and ensemble script

06 May 09:48
23dba64
  • MTS-LSTM failed with uncertainty heads, due to a missing class attribute.
  • Ensemble script failed due to issues with the new datetime logic (#30) and type incompatibilities.

Multi-target, multi-freq uncertainty support

07 Apr 15:21
a49f2e4

With this update we transition into the beta phase of v.1.0.0, as it includes the last feature we had planned for the first major release: full multi-target, multi-frequency (and combined) support for both regression and uncertainty heads. We will most likely add a tutorial showcasing some of the new features. If you use the library, we would be grateful if you reported any bugs you encounter, so we can get rid of them.

New Features/Additions

  • Support for multi-frequency and multi-target (and the combination of both) uncertainty models.
  • Added tests for frequency handling

Fixes

  • Fixed issues around various frequencies and combinations thereof. Closes #30

Added MC-LSTM, multi-target support for all losses

23 Mar 15:06
f320b41

Another minor update before the final v.1.0.0 release.

New Features

  • (weighted) multi-target support for all losses
  • added MC-LSTM to modelzoo, as proposed in this publication. The publication also includes benchmarking results using the CAMELS US dataset.
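The multi-target support above can be pictured as a weighted sum of per-target loss values. The helper below is a hypothetical pure-Python sketch of that idea, not the library's API; the function name and default equal weighting are assumptions for illustration:

```python
def combine_target_losses(per_target_losses, weights=None):
    """Combine per-target loss values into a single scalar.

    Hypothetical sketch: equal weights by default, or explicit
    per-target weights (e.g. taken from a config entry).
    """
    n = len(per_target_losses)
    if weights is None:
        weights = [1.0 / n] * n  # equal weighting across targets
    if len(weights) != n:
        raise ValueError("need exactly one weight per target")
    return sum(w * loss for w, loss in zip(weights, per_target_losses))

# equal weighting: (2.0 + 4.0) / 2
total = combine_target_losses([2.0, 4.0])
# explicit weights: 0.25 * 2.0 + 0.75 * 4.0
weighted = combine_target_losses([2.0, 4.0], weights=[0.25, 0.75])
```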

Fixes

  • Loss names in the config are now matched case-insensitively

Uncertainty heads and minor additions/fixes

11 Feb 11:49
e63bf3e

NeuralHydrology 1.0.0-alpha Release Notes

This version marks the first step towards our first major version release. The biggest addition is the set of newly added uncertainty heads, which are now available for all models. That is, it is now possible to train any of the implemented models with probabilistic model heads. For details on these probabilistic model heads, see Klotz et al. (2021). The uncertainty heads support all models, including multi-frequency models like the MTS-LSTM. The only limitation so far is per-frequency multi-target support, which is currently missing but will be added within the next days/weeks. We also plan to release a tutorial on the uncertainty heads soon.
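To make the idea of a probabilistic head concrete: instead of predicting a single value, such a head predicts the parameters of a distribution (e.g. a Gaussian mixture for the gmm head), and predictions are obtained by sampling from it. The snippet below is a self-contained, purely illustrative sketch of mixture sampling; it is not the library's model.sample_* API, and the weights/means/stds would in practice come from the trained network's output:

```python
import random

def sample_mixture(weights, means, stds, n_samples, seed=0):
    """Draw samples from a 1-D Gaussian mixture density.

    Illustrative stand-in for sampling from a trained mixture head.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        # Pick a mixture component proportional to its weight ...
        k = rng.choices(range(len(weights)), weights=weights)[0]
        # ... then draw from that component's Gaussian.
        samples.append(rng.gauss(means[k], stds[k]))
    return samples

samples = sample_mixture([0.7, 0.3], [0.0, 5.0], [1.0, 0.5], n_samples=5000)
mean = sum(samples) / len(samples)  # close to 0.7*0.0 + 0.3*5.0 = 1.5
```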

New Features

  • The optional embedding of static and dynamic inputs was re-implemented and made more flexible for possible future extensions. It is now possible to specify distinct architectures for dynamic and static input embeddings. The config arguments statics_embedding/dynamics_embedding should now be dictionaries with the keys 'type', 'hiddens', 'activation', and 'dropout', or None (no embedding). The old way of specifying embeddings via embedding_hiddens etc. still works but is deprecated.
  • The config argument lagged_features now accepts a list of integers (in addition to a single integer, as before) to specify multiple lagged copies of a single feature.
  • The warning message of the metric functions now includes the basin id if all simulations or observations are NaN, to facilitate the inspection of potential problems with the model or data.
  • Different model heads for uncertainty estimation (see the corresponding publication by Klotz et al.). It is now possible to combine any model with different heads (config argument head; see Config arguments), where previously only regression was supported. The new heads are cmal, umal, and gmm (see the linked publication for more details). All heads require special loss functions, which are also provided, and all models now have a sample function for each head (e.g. model.sample_cmal() for a model trained with the CMAL head) to sample from the trained model. There are also a couple of new config arguments that are required or optional for these heads, so make sure to check out the specific sections in the documentation.
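As a hedged illustration of how these pieces fit together in a run configuration (the key names follow the release notes above; the concrete values, feature name, and comments are placeholders, not documented defaults), a config excerpt might look like:

```yaml
# Illustrative snippet only -- consult the Config arguments documentation
# for the authoritative set of keys and defaults.

# distinct embedding architecture for static inputs
statics_embedding:
  type: fc            # fully-connected embedding
  hiddens: [32, 20]   # layer sizes
  activation: tanh
  dropout: 0.0

# multiple lagged copies of a single feature
lagged_features:
  prcp(mm/day): [1, 2, 3]

# probabilistic head instead of the default regression head
head: cmal
```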

Fixes

  • Added the missing raise in the dump_config method of the Config class if the file already exists.
  • Fixed a problem when evaluating the validation period outside of the model training process with log_n_figures > 0 in the config.
  • Fixed a problem in continue-training mode when specifying device as a special input argument.