# storage

## Classes

### FileSystem

Bases: `Storage`

Handles data storage and retrieval for file-based data formats.

Formats that write to directories (such as zarr) are not supported by the FileSystem storage class.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `parameters` | `Parameters` | File-system specific parameters, such as the root path to where files should be saved, or additional keyword arguments to specific functions used by the storage API. See the `FileSystemStorage.Parameters` class for more details. | *required* |
| `handler` | `FileHandler` | The `FileHandler` class that should be used to handle data I/O within the storage API. | *required* |
#### Attributes

- `handler` *(class-attribute, instance-attribute)*
- `parameters` *(class-attribute, instance-attribute)*
#### Classes

##### Parameters

Bases: `Parameters`

###### Attributes

`data_filename_template` *(class-attribute, instance-attribute)*

Template string to use for data filenames.

Allows substitution of the following parameters using curly braces '{}':

- `ext`: the file extension from the storage data handler
- `datastream`: from the dataset's global attributes
- `location_id`: from the dataset's global attributes
- `data_level`: from the dataset's global attributes
- `date_time`: the first timestamp in the file formatted as "YYYYMMDD.hhmmss"
- Any other global attribute that has a string or integer data type.

At a minimum the template must include `{date_time}`.
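The substitution itself is standard Python `str.format` templating. A minimal sketch, where the template string and the attribute values are illustrative assumptions, not values taken from this page:

```python
from datetime import datetime

# Illustrative template; check your storage config for the actual value.
data_filename_template = "{datastream}.{date_time}.{ext}"

# Values tsdat would pull from the dataset's global attributes and the
# storage handler; hard-coded here purely for illustration.
substitutions = {
    "datastream": "sgp.met.b1",
    "date_time": datetime(2022, 4, 1, 13, 30, 0).strftime("%Y%m%d.%H%M%S"),
    "ext": "nc",
}

filename = data_filename_template.format(**substitutions)
print(filename)  # sgp.met.b1.20220401.133000.nc
```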
`data_storage_path` *(class-attribute, instance-attribute)*

The directory structure under storage_root where data files are saved.

Allows substitution of the following parameters using curly braces '{}':

- `storage_root`: the value from the `storage_root` parameter.
- `datastream`: the `datastream` as defined in the dataset config file.
- `location_id`: the `location_id` as defined in the dataset config file.
- `data_level`: the `data_level` as defined in the dataset config file.
- `year`: the year of the first timestamp in the file.
- `month`: the month of the first timestamp in the file.
- `day`: the day of the first timestamp in the file.
- `extension`: the file extension used by the output file writer.

Defaults to `data/{location_id}/{datastream}`.
`merge_fetched_data_kwargs` *(class-attribute, instance-attribute)*

Keyword arguments passed to xr.merge.

Note that this will only be called if the DataReader returns a dictionary of xr.Datasets for a single input key.
#### Functions

##### fetch_data

    fetch_data(
        start: datetime,
        end: datetime,
        datastream: str,
        metadata_kwargs: Union[Dict[str, str], None] = None,
        **kwargs: Any,
    ) -> xr.Dataset
Fetches data for a given datastream between a specified time range.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `start` | `datetime` | The minimum datetime to fetch. | *required* |
| `end` | `datetime` | The maximum datetime to fetch. | *required* |
| `datastream` | `str` | The datastream id to search for. | *required* |
| `metadata_kwargs` | `dict[str, str]` | Metadata substitutions to help resolve the data storage path. This is only required if the template data storage path includes any properties other than datastream or fields contained in the datastream. Defaults to None. | `None` |

Returns:

| Type | Description |
|---|---|
| `xr.Dataset` | A dataset containing all the data in the storage area that spans the specified datetimes. |
Source code in tsdat/io/storage.py
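Conceptually, fetch_data finds the stored files whose `date_time` stamp falls within the requested range and merges them into one dataset. The selection step can be sketched in plain Python; the filenames and the parsing helper below are illustrative, not tsdat's internal code:

```python
from datetime import datetime

def parse_date_time(filename: str) -> datetime:
    """Pull the 'YYYYMMDD.hhmmss' date_time stamp out of a templated filename."""
    # Assumes the illustrative '{datastream}.{date_time}.{ext}' layout.
    parts = filename.split(".")
    return datetime.strptime(parts[-3] + "." + parts[-2], "%Y%m%d.%H%M%S")

# Hypothetical files in the storage area
stored = [
    "sgp.met.b1.20220401.000000.nc",
    "sgp.met.b1.20220402.000000.nc",
    "sgp.met.b1.20220403.000000.nc",
]

start, end = datetime(2022, 4, 1, 12), datetime(2022, 4, 3)
selected = [f for f in stored if start <= parse_date_time(f) <= end]
print(selected)  # ['sgp.met.b1.20220402.000000.nc', 'sgp.met.b1.20220403.000000.nc']
```

The real method would then read each selected file with the configured handler and combine the results (using the `merge_fetched_data_kwargs` parameter when multiple datasets come back for one key).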
##### save_ancillary_file

Saves an ancillary filepath to the datastream's ancillary storage area.

NOTE: In most cases this function should not be used directly. Instead, prefer using the `self.uploadable_dir(*args, **kwargs)` method.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `filepath` | `Path` | The path to the ancillary file. This is expected to have a standardized filename and should be saved under the ancillary storage path. | *required* |
| `target_path` | `str` | The path to where the data should be saved. | `None` |
Source code in tsdat/io/storage.py
##### save_data

Saves a dataset to the storage area.

At a minimum, the dataset must have a 'datastream' global attribute and must have a 'time' variable with a np.datetime64-like data type.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `dataset` | `Dataset` | The dataset to save. | *required* |
Source code in tsdat/io/storage.py
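The two save_data preconditions can be checked up front. A sketch of such a pre-flight check, using an attrs dict and a numpy array as stand-ins for the real `dataset.attrs` and `dataset["time"].values` (the helper name is hypothetical, not part of tsdat):

```python
import numpy as np

def check_saveable(attrs: dict, time_values: np.ndarray) -> list:
    """Return a list of problems that would prevent save_data from working."""
    problems = []
    if "datastream" not in attrs:
        problems.append("missing 'datastream' global attribute")
    if not np.issubdtype(time_values.dtype, np.datetime64):
        problems.append("'time' variable is not np.datetime64-like")
    return problems

times = np.array(["2022-04-01T00:00:00"], dtype="datetime64[ns]")
print(check_saveable({"datastream": "sgp.met.b1"}, times))  # []
print(check_saveable({}, np.array([0.0])))  # two problems reported
```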
### FileSystemS3

Bases: `FileSystem`

Handles data storage and retrieval for file-based data in an AWS S3 bucket.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `parameters` | `Parameters` | File-system and AWS-specific parameters, such as the path to where files should be saved or additional keyword arguments to specific functions used by the storage API. See the `FileSystemS3.Parameters` class for more details. | *required* |
| `handler` | `FileHandler` | The `FileHandler` class that should be used to handle data I/O within the storage API. | *required* |
#### Attributes

- `parameters` *(class-attribute, instance-attribute)*
#### Classes

##### Parameters

Bases: `Parameters`

Additional parameters for S3 storage.

Note that all settings and parameters from `FileSystem.Parameters` are also supported by `FileSystemS3.Parameters`.

###### Attributes

`bucket` *(class-attribute, instance-attribute)*

The name of the S3 bucket that the storage class should use.

Note: This parameter can also be set via the `TSDAT_S3_BUCKET_NAME` environment variable.

`region` *(class-attribute, instance-attribute)*

The AWS region of the storage bucket.

Note: This parameter can also be set via the `AWS_DEFAULT_REGION` environment variable. Defaults to `us-west-2`.
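The environment-variable fallback behavior described above can be reproduced in a few lines. This is a sketch of the documented behavior, not tsdat's implementation, and the bucket name is a made-up example:

```python
import os

# Illustrative value; in practice the variable would be set in your shell or CI.
os.environ.setdefault("TSDAT_S3_BUCKET_NAME", "my-tsdat-bucket")

bucket = os.environ["TSDAT_S3_BUCKET_NAME"]
region = os.environ.get("AWS_DEFAULT_REGION", "us-west-2")  # documented default

print(bucket, region)
```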
#### Functions

##### last_modified

Returns the datetime of the last modification to the datastream's storage area.
Source code in tsdat/io/storage.py
##### modified_since

Returns the data times of all files modified after the specified datetime.
Source code in tsdat/io/storage.py
##### save_ancillary_file

Saves an ancillary filepath to the datastream's ancillary storage area.

NOTE: In most cases this function should not be used directly. Instead, prefer using the `self.uploadable_dir(*args, **kwargs)` method.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `filepath` | `Path` | The path to the ancillary file. This is expected to have a standardized filename and should be saved under the ancillary storage path. | *required* |
| `target_path` | `str` | The path to where the data should be saved. | `None` |
Source code in tsdat/io/storage.py
##### save_data
Source code in tsdat/io/storage.py
### ZarrLocalStorage

Bases: `FileSystem`

Handles data storage and retrieval for zarr archives on a local filesystem.

Zarr is a special format that writes chunked data to a number of files underneath a given directory. This distribution of data into chunks and distinct files makes zarr an extremely well-suited format for quickly storing and retrieving large quantities of data.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `parameters` | `Parameters` | File-system specific parameters, such as the root path to where the Zarr archives should be saved, or additional keyword arguments to specific functions used by the storage API. See the Parameters class for more details. | *required* |
| `handler` | `ZarrHandler` | The `ZarrHandler` class that should be used to handle data I/O within the storage API. | *required* |
#### Attributes

- `handler` *(class-attribute, instance-attribute)*
- `parameters` *(class-attribute, instance-attribute)*
#### Classes

##### Parameters

Bases: `Parameters`

###### Attributes

`data_filename_template` *(class-attribute, instance-attribute)*

Template string to use for data filenames.

Allows substitution of the following parameters using curly braces '{}':

- `ext`: the file extension from the storage data handler
- `datastream`: from the dataset's global attributes
- `location_id`: from the dataset's global attributes
- `data_level`: from the dataset's global attributes
- Any other global attribute that has a string or integer data type.
`data_storage_path` *(class-attribute, instance-attribute)*

The directory structure under storage_root where data files are saved.

Allows substitution of the following parameters using curly braces '{}':

- `storage_root`: the value from the `storage_root` parameter.
- `datastream`: the `datastream` as defined in the dataset config file.
- `location_id`: the `location_id` as defined in the dataset config file.
- `data_level`: the `data_level` as defined in the dataset config file.
- `year`: the year of the first timestamp in the file.
- `month`: the month of the first timestamp in the file.
- `day`: the day of the first timestamp in the file.
- `extension`: the file extension used by the output file writer.
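Resolving such a path is again plain `str.format` substitution, with `year`, `month`, and `day` derived from the first timestamp in the file. A sketch, where the template and every substituted value are illustrative assumptions:

```python
from datetime import datetime

# Illustrative template using the substitutions described above.
data_storage_path = "{storage_root}/data/{location_id}/{datastream}/{year}/{month}/{day}"

first_time = datetime(2022, 4, 1, 13, 30)  # first timestamp in the file

path = data_storage_path.format(
    storage_root="storage/root",
    location_id="sgp",
    datastream="sgp.met.b1",
    year=first_time.strftime("%Y"),
    month=first_time.strftime("%m"),
    day=first_time.strftime("%d"),
)
print(path)  # storage/root/data/sgp/sgp.met.b1/2022/04/01
```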