tsdat.io.storage

Classes

FileSystem

Handles data storage and retrieval for file-based data formats.

FileSystemS3

Handles data storage and retrieval for file-based data in an AWS S3 bucket.

ZarrLocalStorage

Handles data storage and retrieval for zarr archives on a local filesystem.

class tsdat.io.storage.FileSystem[source]

Bases: tsdat.io.base.Storage

Handles data storage and retrieval for file-based data formats.

Formats that write to directories (such as zarr) are not supported by the FileSystem storage class.

Parameters:
  • parameters (Parameters) – File-system specific parameters, such as the root path to where files should be saved, or additional keyword arguments to specific functions used by the storage API. See the FileSystem.Parameters class for more details.

  • handler (FileHandler) – The FileHandler class that should be used to handle data I/O within the storage API.
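
In most pipelines the storage class is configured through the pipeline's storage configuration file rather than constructed by hand. The sketch below instantiates it directly only to show how the pieces fit together; the storage_root value and the choice of NetCDFHandler are illustrative assumptions, not requirements:

    from pathlib import Path

    from tsdat.io.handlers import NetCDFHandler
    from tsdat.io.storage import FileSystem

    # Hedged sketch: store output files under ./storage/root using a
    # NetCDF file handler (hypothetical choices for illustration only).
    storage = FileSystem(
        parameters=FileSystem.Parameters(storage_root=Path("storage/root")),
        handler=NetCDFHandler(),
    )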

class Parameters[source]

Bases: tsdat.io.base.Storage.Parameters

data_filename_template: str = '{datastream}.{date_time}.{extension}'[source]

Template string to use for data filenames.

Allows substitution of the following parameters using curly braces ‘{}’:

  • extension: the file extension from the storage data handler

  • datastream: the datastream id from the dataset’s global attributes

  • location_id: the location_id from the dataset’s global attributes

  • data_level: the data_level from the dataset’s global attributes

  • date_time: the first timestamp in the file, formatted as “YYYYMMDD.hhmmss”

  • Any other global attribute that has a string or integer data type.

At a minimum the template must include {date_time}.
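
As a purely illustrative sketch (the actual substitution is performed by the storage class), standard str.format substitution over the default template shows how a filename is assembled for a hypothetical datastream:

    # Illustrative only; "sgp.met_data.b1" is a hypothetical datastream id.
    template = "{datastream}.{date_time}.{extension}"
    filename = template.format(
        datastream="sgp.met_data.b1",
        date_time="20230101.000000",  # first timestamp, YYYYMMDD.hhmmss
        extension="nc",               # extension from the data handler
    )
    print(filename)  # sgp.met_data.b1.20230101.000000.nc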

data_storage_path: pathlib.Path[source]

The directory structure under storage_root where data files are saved.

Allows substitution of the following parameters using curly braces ‘{}’:

  • storage_root: the value from the storage_root parameter.

  • datastream: the datastream as defined in the dataset config file.

  • location_id: the location_id as defined in the dataset config file.

  • data_level: the data_level as defined in the dataset config file.

  • year: the year of the first timestamp in the file.

  • month: the month of the first timestamp in the file.

  • day: the day of the first timestamp in the file.

  • extension: the file extension used by the output file writer.

Defaults to data/{location_id}/{datastream}.
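
The sketch below resolves the default directory template for a hypothetical dataset and joins it with the storage root and a data filename; the specific ids and paths are assumptions for illustration:

    from pathlib import Path

    # Illustrative only: the resulting on-disk layout under storage_root.
    storage_root = Path("storage/root")
    data_dir = "data/{location_id}/{datastream}".format(
        location_id="sgp",
        datastream="sgp.met_data.b1",
    )
    full_path = storage_root / data_dir / "sgp.met_data.b1.20230101.000000.nc"
    print(full_path)  # storage/root/data/sgp/sgp.met_data.b1.20230101.000000.nc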

merge_fetched_data_kwargs: Dict[str, Any][source]

Keyword arguments passed to xr.merge.

Note that xr.merge is only called if the DataReader returns a dictionary of xr.Datasets for a single input key.
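
A hedged sketch of overriding the merge behavior; the keyword arguments shown are standard xr.merge options, and the override itself is an assumed configuration choice:

    from tsdat.io.storage import FileSystem

    # Assumed illustration: forward compat/join options to xr.merge when a
    # DataReader returns multiple xr.Datasets for one input key.
    params = FileSystem.Parameters(
        merge_fetched_data_kwargs={"compat": "override", "join": "outer"},
    )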

handler: tsdat.io.handlers.FileHandler[source]
parameters: FileSystem.Parameters[source]

Class Methods

fetch_data

Fetches data for a given datastream between a specified time range.

save_ancillary_file

Saves an ancillary file to the datastream's ancillary storage area.

save_data

Saves a dataset to the storage area.

Method Descriptions

fetch_data(start: datetime.datetime, end: datetime.datetime, datastream: str, metadata_kwargs: Dict[str, str] | None = None, **kwargs: Any) → xarray.Dataset[source]

Fetches data for a given datastream between a specified time range.

Parameters:
  • start (datetime) – The minimum datetime to fetch.

  • end (datetime) – The maximum datetime to fetch.

  • datastream (str) – The datastream id to search for.

  • metadata_kwargs (dict[str, str], optional) – Metadata substitutions used to resolve the data storage path. Only required if the data storage path template includes properties other than datastream or fields that can be parsed from the datastream. Defaults to None.

Returns:

xr.Dataset – A dataset containing all the data in the storage area that spans the specified datetimes.
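
A minimal usage sketch, assuming a configured FileSystem instance and a hypothetical datastream id:

    from datetime import datetime

    import xarray as xr

    from tsdat.io.storage import FileSystem

    def fetch_one_day(storage: FileSystem) -> xr.Dataset:
        # Fetch everything stored for Jan 1, 2023 for a hypothetical datastream.
        return storage.fetch_data(
            start=datetime(2023, 1, 1),
            end=datetime(2023, 1, 2),
            datastream="sgp.met_data.b1",
        )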

save_ancillary_file(filepath: pathlib.Path, target_path: pathlib.Path | None = None)[source]

Saves an ancillary file to the datastream’s ancillary storage area.

NOTE: In most cases this function should not be used directly. Instead, prefer using the self.uploadable_dir(*args, **kwargs) method.

Parameters:
  • filepath (Path) – The path to the ancillary file. This is expected to have a standardized filename and should be saved under the ancillary storage path.

  • target_path (Path, optional) – The path to where the file should be saved. Defaults to None.
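
A minimal sketch, assuming the ancillary file (for example, a quicklook plot produced during processing) already follows the expected filename conventions:

    from pathlib import Path

    from tsdat.io.storage import FileSystem

    def save_plot(storage: FileSystem, plot_path: Path) -> None:
        # plot_path is a hypothetical, already-standardized ancillary filename.
        storage.save_ancillary_file(plot_path)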

save_data(dataset: xarray.Dataset, **kwargs: Any)[source]

Saves a dataset to the storage area.

At a minimum, the dataset must have a ‘datastream’ global attribute and must have a ‘time’ variable with a np.datetime64-like data type.

Parameters:

dataset (xr.Dataset) – The dataset to save.
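
A minimal sketch of a dataset that satisfies the documented requirements (a 'datastream' global attribute and a datetime64 'time' variable); the datastream id and variable values are hypothetical:

    import numpy as np
    import pandas as pd
    import xarray as xr

    from tsdat.io.storage import FileSystem

    def save_example(storage: FileSystem) -> None:
        dataset = xr.Dataset(
            data_vars={"temperature": ("time", np.array([10.2, 10.4, 10.1]))},
            coords={"time": pd.date_range("2023-01-01", periods=3, freq="h")},
            attrs={"datastream": "sgp.met_data.b1"},  # hypothetical id
        )
        storage.save_data(dataset)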

class tsdat.io.storage.FileSystemS3[source]

Bases: FileSystem

Handles data storage and retrieval for file-based data in an AWS S3 bucket.

Parameters:
  • parameters (Parameters) – File-system and AWS-specific parameters, such as the path to where files should be saved or additional keyword arguments to specific functions used by the storage API. See the FileSystemS3.Parameters class for more details.

  • handler (FileHandler) – The FileHandler class that should be used to handle data I/O within the storage API.

class Parameters[source]

Bases: FileSystem.Parameters

Additional parameters for S3 storage.

Note that all settings and parameters from FileSystem.Parameters are also supported by FileSystemS3.Parameters.

bucket: str[source]

The name of the S3 bucket that the storage class should use.

Note

This parameter can also be set via the TSDAT_S3_BUCKET_NAME environment variable.

region: str[source]

The AWS region of the storage bucket.

Note

This parameter can also be set via the AWS_DEFAULT_REGION environment variable.

Defaults to us-west-2.
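
A hedged sketch of the two ways these values can be supplied; the bucket name is a placeholder and other parameters are left at their defaults:

    import os

    from tsdat.io.storage import FileSystemS3

    # Option 1 (assumed): rely on the environment variables noted above.
    os.environ["TSDAT_S3_BUCKET_NAME"] = "my-example-bucket"  # placeholder
    os.environ["AWS_DEFAULT_REGION"] = "us-west-2"

    # Option 2 (assumed): set the same values explicitly on the Parameters model.
    params = FileSystemS3.Parameters(bucket="my-example-bucket", region="us-west-2")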

parameters: FileSystemS3.Parameters[source]

Class Methods

last_modified

Returns the datetime of the last modification to the datastream's storage area.

modified_since

Returns the data times of all files modified after the specified datetime.

save_ancillary_file

Saves an ancillary file to the datastream's ancillary storage area.

save_data

Saves a dataset to the storage area.

Method Descriptions

last_modified(datastream: str) → datetime.datetime | None[source]

Returns the datetime of the last modification to the datastream’s storage area.

modified_since(datastream: str, last_modified: datetime.datetime) → List[datetime.datetime][source]

Returns the data times of all files modified after the specified datetime.
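
A hedged sketch that combines the two methods to detect new data since an earlier check; the cutoff time and datastream id are hypothetical:

    from datetime import datetime
    from typing import List

    from tsdat.io.storage import FileSystemS3

    def find_new_data(storage: FileSystemS3, datastream: str) -> List[datetime]:
        last_checked = datetime(2023, 1, 1)  # hypothetical previous check
        modified = storage.last_modified(datastream)
        if modified is not None and modified > last_checked:
            return storage.modified_since(datastream, last_checked)
        return []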

save_ancillary_file(filepath: pathlib.Path, target_path: pathlib.Path | None = None)[source]

Saves an ancillary file to the datastream’s ancillary storage area.

NOTE: In most cases this function should not be used directly. Instead, prefer using the self.uploadable_dir(*args, **kwargs) method.

Parameters:
  • filepath (Path) – The path to the ancillary file. This is expected to have a standardized filename and should be saved under the ancillary storage path.

  • target_path (Path, optional) – The path to where the file should be saved. Defaults to None.

save_data(dataset: xarray.Dataset, **kwargs: Any)[source]

Saves a dataset to the storage area.

At a minimum, the dataset must have a ‘datastream’ global attribute and must have a ‘time’ variable with a np.datetime64-like data type.

Parameters:

dataset (xr.Dataset) – The dataset to save.

class tsdat.io.storage.ZarrLocalStorage[source]

Bases: FileSystem

Handles data storage and retrieval for zarr archives on a local filesystem.

Zarr is a format that writes chunked data to a number of files underneath a given directory. This chunked, multi-file layout makes zarr well-suited to quickly storing and retrieving large quantities of data.

Parameters:
  • parameters (Parameters) – File-system specific parameters, such as the root path to where the Zarr archives should be saved, or additional keyword arguments to specific functions used by the storage API. See the Parameters class for more details.

  • handler (ZarrHandler) – The ZarrHandler class that should be used to handle data I/O within the storage API.
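
A minimal construction sketch, assuming direct instantiation with a ZarrHandler and an illustrative storage root (in practice this is usually configured through the pipeline's storage configuration file):

    from pathlib import Path

    from tsdat.io.handlers import ZarrHandler
    from tsdat.io.storage import ZarrLocalStorage

    # Hedged sketch: zarr archives written under ./storage/root (illustrative path).
    storage = ZarrLocalStorage(
        parameters=ZarrLocalStorage.Parameters(storage_root=Path("storage/root")),
        handler=ZarrHandler(),
    )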

class Parameters[source]

Bases: FileSystem.Parameters

data_filename_template: str = '{datastream}.{extension}'[source]

Template string to use for data filenames.

Allows substitution of the following parameters using curly braces ‘{}’:

  • extension: the file extension from the storage data handler

  • datastream: the datastream id from the dataset’s global attributes

  • location_id: the location_id from the dataset’s global attributes

  • data_level: the data_level from the dataset’s global attributes

  • Any other global attribute that has a string or integer data type.

data_storage_path: pathlib.Path[source]

The directory structure under storage_root where data files are saved.

Allows substitution of the following parameters using curly braces ‘{}’:

  • storage_root: the value from the storage_root parameter.

  • datastream: the datastream as defined in the dataset config file.

  • location_id: the location_id as defined in the dataset config file.

  • data_level: the data_level as defined in the dataset config file.

  • year: the year of the first timestamp in the file.

  • month: the month of the first timestamp in the file.

  • day: the day of the first timestamp in the file.

  • extension: the file extension used by the output file writer.

handler: tsdat.io.handlers.ZarrHandler[source]
parameters: ZarrLocalStorage.Parameters[source]