file Package

exceptions

Exceptions for File Storage layer.

exception tvb.storage.h5.file.exceptions.FileMigrationException(message)[source]

Bases: TVBException

Exception to be thrown in case of an unexpected problem when migrating an H5 file to a newer version.

exception tvb.storage.h5.file.exceptions.FileStorageException(message)[source]

Bases: TVBException

Generic exception when storing data in files.

exception tvb.storage.h5.file.exceptions.FileStructureException(message)[source]

Bases: TVBException

Exception to be thrown in case of a problem related to File Structure Storage.

exception tvb.storage.h5.file.exceptions.FileVersioningException(message)[source]

Bases: TVBException

A base exception class for all TVB file storage version conversion custom exceptions.

exception tvb.storage.h5.file.exceptions.IncompatibleFileManagerException(message)[source]

Bases: FileVersioningException

Exception that should be raised in case a file is handled by a file manager which is incompatible with that version of TVB file storage.

exception tvb.storage.h5.file.exceptions.MissingDataFileException(message)[source]

Bases: FileStorageException

Exception when the file associated to some manager does not exist on disk for some reason.

exception tvb.storage.h5.file.exceptions.MissingDataSetException(message)[source]

Bases: FileStorageException

Exception raised when a dataset is accessed but no written entry exists for it in the HDF5 file. We will consider the attribute None.

exception tvb.storage.h5.file.exceptions.RenameWhileSyncEncryptingException(message)[source]

Bases: TVBException

Exception to be thrown in case a project is to be renamed during sync encryption.

exception tvb.storage.h5.file.exceptions.UnsupportedFileStorageException(message)[source]

Bases: TVBException

Exception to be thrown in case an unsupported file storage is chosen. Currently only H5 storage is supported.
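
Since FileVersioningException and FileStorageException act as base classes, callers can catch these exceptions at whatever granularity they need. A minimal sketch (the manager object and dataset name below are hypothetical):

    from tvb.storage.h5.file.exceptions import (FileStorageException,
                                                MissingDataSetException)

    def read_weights(manager):
        try:
            return manager.get_data(dataset_name="weights")
        except MissingDataSetException:
            # No written entry exists in the HDF5 file: consider the attribute None.
            return None
        except FileStorageException:
            # Catches the remaining storage errors, e.g. MissingDataFileException.
            raise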

files_helper

class tvb.storage.h5.file.files_helper.FilesHelper[source]

Bases: object

This class manages all Structure related operations, using File storage. It will handle creating meaningful entities and retrieving existing ones.

ALLEN_MOUSE_CONNECTIVITY_CACHE_FOLDER = 'ALLEN_MOUSE_CONNECTIVITY_CACHE'
PROJECTS_FOLDER = 'PROJECTS'
check_created(**kw)

New function will actually write the Lock.

static compute_recursive_h5_disk_usage(start_path)[source]

Computes the disk usage of all h5 files under the given directory.

Parameters:
  • start_path – directory under which to compute the disk usage

Returns:

A tuple of size in kiB

static compute_size_on_disk(file_path)[source]

Given a file’s path, return the size occupied on disk by that file. The size is a number representing KB.
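
A minimal sketch of the two size helpers above; the paths are hypothetical:

    from tvb.storage.h5.file.files_helper import FilesHelper

    # Total size of all H5 files under a project directory.
    total = FilesHelper.compute_recursive_h5_disk_usage("/opt/tvb/PROJECTS/Default_Project")
    # Size of a single file.
    single = FilesHelper.compute_size_on_disk("/opt/tvb/PROJECTS/Default_Project/1/Connectivity.h5")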

static copy_file(source, dest, dest_postfix, buffer_size)[source]

Copy a file from source to dest. source and dest can either be strings or any object with a read or write method, like StringIO for example.

get_allen_mouse_cache_folder(project_name)[source]
get_images_folder(project_name, images_folder)[source]

Computes the name/path of the folder where to store images.

get_project_folder(project_name, *sub_folders)[source]

Retrieve the root path for the given project. If the root folder has not been created yet, it will be created.
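
A minimal usage sketch, assuming the project name below exists (or may be created):

    from tvb.storage.h5.file.files_helper import FilesHelper

    helper = FilesHelper()
    # Root folder of the project; created on first access if missing.
    project_root = helper.get_project_folder("Default_Project")
    # Sub-folder for a specific operation, e.g. operation id 42.
    operation_folder = helper.get_project_folder("Default_Project", "42")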

static get_project_folder_from_h5(h5_file)[source]
get_project_meta_file_path(project_name, project_file)[source]

Retrieve project meta info file path.

Returns:

File path for storing Project meta-data. The file might not exist yet, but its parent folder is created by this method call.

static get_projects_folder()[source]
move_datatype(new_project_name, new_op_id, full_path)[source]

Move H5 storage into a new location

static read_project_metadata(project_path, project_file)[source]
remove_figure(figure, images_folder)[source]

Remove the file storing the image and its metadata.

static remove_files(file_list, ignore_exception=False)[source]
Parameters:
  • file_list – list of file paths to be removed.

  • ignore_exception – When False and one of the specified files could not be removed, an exception is raised.

static remove_folder(folder_path, ignore_errors)[source]

Given a folder path, try to remove that folder from disk.

Parameters:
  • folder_path – Folder to be removed

  • ignore_errors – When False, throw FileStructureException if folder_path is invalid

remove_operation_data(project_name, operation_id)[source]

Remove H5 storage fully.

rename_project_structure(project_name, new_name)[source]

Rename the Project folder, or throw FileStructureException.

write_image_metadata(figure, meta_entity, images_folder)[source]

Writes figure meta-data into an XML file.

write_project_metadata(meta_dictionary, project_file)[source]
Parameters:
  • meta_dictionary – Project metadata

  • project_file – Project file path

static write_project_metadata_from_dict(project_path, meta_entity, project_file)[source]
class tvb.storage.h5.file.files_helper.TvbZip(dest_path, mode)[source]

Bases: ZipFile

unpack_zip(folder_path)[source]

Simple method to unpack ZIP archive in a given folder.

write_zip_folder(folder, exclude, need_parent_folder=False)[source]

Write folder contents into the archive.

Parameters:
  • folder – root folder in the archive; defaults to "", the archive root

  • exclude – a list of file or folder names that will be recursively excluded

  • need_parent_folder – if True, the parent folder will be added as well to the zip file name

write_zip_folders(folders, exclude)[source]

This method creates a ZIP file with all folders provided as parameters.

Parameters:
  • folders – array with the FULL names/paths of the folders to add into the ZIP

  • exclude – a list of file or folder names that will be recursively excluded
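
A minimal sketch of exporting a project folder and unpacking it elsewhere; all paths and the exclude list are hypothetical. Since TvbZip extends ZipFile, it can be used as a context manager:

    from tvb.storage.h5.file.files_helper import TvbZip

    with TvbZip("/tmp/project_export.zip", "w") as archive:
        archive.write_zip_folder("/opt/tvb/PROJECTS/Default_Project", ["TEMP"])

    with TvbZip("/tmp/project_export.zip", "r") as archive:
        archive.unpack_zip("/tmp/restored_project")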

hdf5_storage_manager

Persistence of data in HDF5 format.

class tvb.storage.h5.file.hdf5_storage_manager.HDF5StorageManager(storage_full_name)[source]

Bases: object

This class is responsible for saving / loading data in HDF5 file / format.

BOOL_VALUE_PREFIX = 'bool:'
DATETIME_VALUE_PREFIX = 'datetime:'
DATE_TIME_FORMAT = '%Y-%m-%d %H:%M:%S.%f'
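
These prefixes and the datetime format describe how boolean and datetime values are serialized into HDF5 attributes. An illustrative sketch of the convention implied by the constants above (the variables are local, not part of the class API):

    from datetime import datetime

    BOOL_VALUE_PREFIX = 'bool:'
    DATETIME_VALUE_PREFIX = 'datetime:'
    DATE_TIME_FORMAT = '%Y-%m-%d %H:%M:%S.%f'

    flag = BOOL_VALUE_PREFIX + str(True)  # 'bool:True'
    when = DATETIME_VALUE_PREFIX + datetime(2024, 2, 27).strftime(DATE_TIME_FORMAT)
    # -> 'datetime:2024-02-27 00:00:00.000000'
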
class H5pyStorageBuffer(h5py_dataset, buffered_data=None, grow_dimension=-1)[source]

Bases: object

Helper class that buffers data for append operations, in order to limit the number of actual HDD I/O operations.

buffer_data(data_list)[source]

Add data_list to an internal buffer in order to improve performance for append_data type of operations.

Returns:

True if the buffer is still fine, False if a flush is necessary since the buffer is full

flush_buffered_data()[source]

Append the data buffered so far to the input dataset, using grow_dimension as the dimension that will be expanded.

LOCKS = {}
ROOT_NODE_PATH = '/'
TVB_ATTRIBUTE_PREFIX = 'TVB_'
append_data(data_list, dataset_name='', grow_dimension=-1, close_file=True, where='/')[source]

This method appends data to an existing data set; if the data set does not exist, it is created first. See the sketch after the parameter list.

Parameters:
  • dataset_name – Name of the data set where to store data

  • data_list – Data to be stored / appended

  • grow_dimension – The dimension to be used to grow stored array. By default will grow on the LAST dimension

  • close_file – Specify if the file should be closed automatically after the write operation. If not, you have to close the file by calling close_file().

  • where – represents the path where to store our dataset (e.g. /data/info)
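
A minimal sketch of the documented signature, growing a dataset chunk by chunk; the file name, dataset name and data are hypothetical:

    import numpy
    from tvb.storage.h5.file.hdf5_storage_manager import HDF5StorageManager

    manager = HDF5StorageManager("/tmp/timeseries.h5")
    for chunk in (numpy.zeros((4, 10)), numpy.ones((4, 10))):
        # Grow along the last dimension; keep the file open between writes.
        manager.append_data(chunk, dataset_name="data", grow_dimension=-1,
                            close_file=False)
    manager.close_file()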

close_file()[source]

The synchronization of open/close no longer seems to be needed for h5py (in contrast to PyTables) for concurrent reads. However, since it should not add much overhead in most situations, we leave it like this for now: in case of concurrent (metadata) writes it provides extra safety.

get_data(dataset_name='', data_slice=None, where='/', ignore_errors=False, close_file=True)[source]

This method reads data from the given data set, based on the slice specification.

Parameters:
  • close_file – Automatically close after reading the current field

  • ignore_errors – return None in case of error, or throw exception

  • dataset_name – Name of the data set from where to read data

  • data_slice – Specify how to retrieve data from the array, e.g. (slice(1, 10, 1), slice(1, 6, 2))

  • where – represents the path where dataset is stored (e.g. /data/info)

Returns:

a numpy.ndarray containing filtered data

get_data_shape(dataset_name='', where='/')[source]

This method reads the data size from the given data set.

Parameters:
  • dataset_name – Name of the data set from where to read data

  • where – represents the path where dataset is stored (e.g. /data/info)

Returns:

a tuple containing the data shape

get_file_data_version(data_version, dataset_name='', where='/')[source]

Checks the data version for the current file.

get_metadata(dataset_name='', where='/')[source]

Retrieve ALL meta-data information for root node or for a given data set.

Parameters:
  • dataset_name – name of the dataset for which to read metadata. If None, read metadata from ROOT node.

  • where – represents the path where dataset is stored (e.g. /data/info)

Returns:

a dictionary containing all metadata associated with the node

is_valid_tvb_file()[source]

This method checks if the specified file exists and if it has the correct HDF5 format.

Returns:

True if the file exists and has HDF5 format, False otherwise.

remove_data(dataset_name='', where='/')[source]

Delete a data set from the H5 file.

Parameters:
  • dataset_name – name of the data set to be deleted

  • where – represents the path where dataset is stored (e.g. /data/info)

remove_metadata(meta_key, dataset_name='', tvb_specific_metadata=True, where='/', check_existence=False)[source]

Remove meta-data information for root node or for a given data set.

Parameters:
  • meta_key – name of the metadata attribute to be removed

  • dataset_name – name of the dataset from where to delete metadata. If None, metadata will be removed from ROOT node.

  • tvb_specific_metadata – specify if the provided metadata is specific to TVB (keys will have a TVB prefix).

  • where – represents the path where dataset is stored (e.g. /data/info)

static serialize_bool(value)[source]
set_metadata(meta_dictionary, dataset_name='', tvb_specific_metadata=True, where='/')[source]

Set meta-data information for the root node or for a given data set (see the sketch after the parameter list).

Parameters:
  • meta_dictionary – dictionary containing meta info to be stored on node

  • dataset_name – name of the dataset where to assign metadata. If None, metadata is assigned to ROOT node.

  • tvb_specific_metadata – specify if the provided metadata is TVB specific (All keys will have a TVB prefix)

  • where – represents the path where dataset is stored (e.g. /data/info)
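
A hedged sketch of the metadata round trip, based on the documented signatures; the keys and values are hypothetical:

    from tvb.storage.h5.file.hdf5_storage_manager import HDF5StorageManager

    manager = HDF5StorageManager("/tmp/timeseries.h5")
    # Attach metadata to the "data" dataset (keys get the TVB_ prefix).
    manager.set_metadata({"Subject": "John Doe", "Nr_of_steps": 500},
                         dataset_name="data")
    meta = manager.get_metadata(dataset_name="data")
    manager.remove_metadata("Nr_of_steps", dataset_name="data")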

store_data(data_list, dataset_name='', where='/')[source]

This method stores the provided data list into a data set in the H5 file (see the round-trip sketch after the parameter list).

Parameters:
  • dataset_name – Name of the data set where to store data

  • data_list – Data to be stored

  • where – represents the path where to store our dataset (e.g. /data/info)
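
A minimal store/read round-trip sketch, based on the documented signatures; the file and dataset names are hypothetical:

    import numpy
    from tvb.storage.h5.file.hdf5_storage_manager import HDF5StorageManager

    manager = HDF5StorageManager("/tmp/example.h5")
    manager.store_data(numpy.arange(100).reshape(10, 10), dataset_name="matrix")

    # Rows 1..9, and columns 1, 3, 5 (step 2), as in the data_slice example.
    block = manager.get_data(dataset_name="matrix",
                             data_slice=(slice(1, 10, 1), slice(1, 6, 2)))
    print(manager.get_data_shape(dataset_name="matrix"))  # (10, 10)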

xml_metadata_handlers

This module contains logic for meta-data handling.

It handles read/write operations on XML files for retrieving/storing meta-data. More specifically, it contains an XML Reader/Writer utility for a generic metadata dictionary.

class tvb.storage.h5.file.xml_metadata_handlers.XMLReader(xml_path)[source]

Bases: object

Reader for XML with meta-data on generic entities (e.g. Project, Operation).

static get_node_text(node)[source]

From an XML node, read the string content.

read_metadata_from_xml()[source]

Return a dictionary, filled with data read from XML file.

class tvb.storage.h5.file.xml_metadata_handlers.XMLWriter(entity)[source]

Bases: object

Writer for XML with meta-data on generic entities (e.g. Project, Operation).

ELEM_ROOT = 'tvb_data'
FILE_EXTENSION = '.xml'
write_metadata_in_xml(final_path)[source]

From a meta-data dictionary for an entity, create the XML file.
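
A hedged round-trip sketch, based on the documented constructors and assuming the entity is a plain metadata dictionary; the dictionary and target path are hypothetical:

    from tvb.storage.h5.file.xml_metadata_handlers import XMLReader, XMLWriter

    metadata = {"name": "Default_Project", "description": "demo project"}
    writer = XMLWriter(metadata)
    writer.write_metadata_in_xml("/tmp/Project" + XMLWriter.FILE_EXTENSION)

    # Read it back into a dictionary.
    meta = XMLReader("/tmp/Project.xml").read_metadata_from_xml()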