The Virtual Brain Project


uploaders Package

Defines the list of all Python modules in which the introspection mechanism should search for Import Adapters.

abcuploader

class tvb.adapters.uploaders.abcuploader.ABCUploader[source]

Bases: tvb.core.adapters.abcadapter.ABCSynchronous

Base class of the uploaders

LOGGER = <logging.Logger object at 0x7fefe4c165d0>
ensure_db()[source]

Ensure algorithm exists in DB and add it if not

get_input_tree()[source]
Returns: the result of get_upload_input_tree concatenated with the “subject” input field.
get_required_disk_size(**kwargs)[source]

As this is an upload algorithm and we do not have information about the data, we cannot approximate this.

get_required_memory_size(**kwargs)[source]

Return the required memory to run this algorithm. As this is an upload algorithm and we do not have information about the data, we cannot approximate this.

get_upload_input_tree()[source]

Build the list of dictionaries describing the input required for this uploader.

Returns: the input tree specific for this uploader

static read_list_data(full_path, dimensions=None, dtype=<type 'numpy.float64'>, skiprows=0, usecols=None)[source]

Read a numpy.array from a text file or an npy/npz file.

static read_matlab_data(path, matlab_data_name=None)[source]

Read an array from a MATLAB file.
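These two static helpers essentially wrap the NumPy and SciPy loaders. A minimal sketch of the behaviour the docstrings describe (hypothetical stand-alone functions, not the actual TVB implementation):

```python
import numpy

def read_list_data(full_path, dtype=numpy.float64, skiprows=0, usecols=None):
    # Sketch: dispatch on the file extension, as the docstring above describes.
    if full_path.endswith(('.npy', '.npz')):
        return numpy.load(full_path)
    return numpy.loadtxt(full_path, dtype=dtype, skiprows=skiprows, usecols=usecols)

def read_matlab_data(path, matlab_data_name=None):
    # Sketch: pull a single named variable out of a .mat file.
    # scipy is assumed to be available (TVB depends on it).
    from scipy import io as scipy_io
    return scipy_io.loadmat(path)[matlab_data_name]
```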

brco_importer

class tvb.adapters.uploaders.brco_importer.BRCOImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Import connectivity data stored in the networkx gpickle format

get_output()[source]
get_upload_input_tree()[source]

Take as input a MAT file

launch(*args, **kwargs)[source]

cff_importer

class tvb.adapters.uploaders.cff_importer.CFF_Importer[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Upload Connectivity Matrix from a CFF archive.

get_output()[source]
get_upload_input_tree()[source]

Define a CFF archive as the input parameter.

launch(*args, **kwargs)[source]

Process the uploaded CFF and convert the read data into our internal DataTypes.

Parameters: cff – CFF uploaded file to process.

logger = <logging.Logger object at 0x7fefe52ad0d0>

connectivity_measure_importer

class tvb.adapters.uploaders.connectivity_measure_importer.ConnectivityMeasureImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

This imports a series of connectivity measures from a .mat file

get_output()[source]
get_upload_input_tree()[source]

Take as input a MAT file

launch(*args, **kwargs)[source]

Execute import operations:

csv_connectivity_importer

class tvb.adapters.uploaders.csv_connectivity_importer.CSVConnectivityImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Handler for uploading a Connectivity CSV from the DTI pipeline

DELIMITER_OPTIONS = [{'name': 'comma', 'value': ','}, {'name': 'semicolon', 'value': ';'}, {'name': 'tab', 'value': '\t'}, {'name': 'space', 'value': ' '}, {'name': 'colon', 'value': ':'}]
TRACT_FILE = 'tract_lengths.txt'
WEIGHTS_FILE = 'weights.txt'
get_output()[source]
get_upload_input_tree()[source]
launch(weights, weights_delimiter, tracts, tracts_delimiter, input_data)[source]

Execute import operations: process the weights and tracts csv files, then use the reference connectivity passed as input_data for the rest of the attributes.

Parameters:
  • weights – csv file containing the weights measures
  • tracts – csv file containing the tracts measures
  • input_data – a reference connectivity with the additional attributes
Raises LaunchException: when the number of nodes in the CSV files doesn’t match the one in the connectivity

class tvb.adapters.uploaders.csv_connectivity_importer.CSVConnectivityParser(csv_file, delimiter=', ')[source]

Bases: object

Parser for a connectivity CSV file. Such a file may begin with an optional header of ordinal integers. The body of the file is a square matrix of floats; -1 is interpreted as 0. If a header is present, the matrix columns and rows are permuted so that the header ordinals end up in ascending order.

permutation = None

A permutation represented as a list mapping index -> new_index. Defaults to the identity permutation.
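The parsing rules above (optional ordinal header, square float body, -1 read as 0, rows and columns permuted into header order) can be sketched as follows; `parse_connectivity_csv` is a hypothetical helper, not the actual parser, and it assumes a header row is one whose cells are all integers:

```python
import csv
import numpy

def parse_connectivity_csv(lines, delimiter=','):
    rows = [row for row in csv.reader(lines, delimiter=delimiter) if row]
    permutation = None
    # An optional first row of ordinal integers acts as a header.
    if all(cell.strip().lstrip('-').isdigit() for cell in rows[0]):
        header = [int(cell) for cell in rows[0]]
        # Permute so the header ordinals end up in ascending order.
        permutation = numpy.argsort(header)
        rows = rows[1:]
    matrix = numpy.array([[float(cell) for cell in row] for row in rows])
    matrix[matrix == -1] = 0  # -1 is interpreted as 0
    if permutation is not None:
        matrix = matrix[permutation][:, permutation]
    return matrix
```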

fieldtrip_importer

Provides facilities to import FieldTrip data sets into TVB as time series and sensor data.

class tvb.adapters.uploaders.fieldtrip_importer.FieldTripUploader[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Upload time series and sensor data via a MAT file containing “dat” and “hdr” variables from the ft_read_data and ft_read_header functions.

For the moment, we treat all data coming from FieldTrip as MEG data though the channels may be of heterogeneous type.

get_output()[source]
get_upload_input_tree()[source]
launch(matfile)[source]
logger = <logging.Logger object at 0x7fefe5420fd0>

gifti_surface_importer

class tvb.adapters.uploaders.gifti_surface_importer.GIFTISurfaceImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

This importer is responsible for importing surfaces from the GIFTI format (XML files) and storing them in TVB as Surface.

get_output()[source]
get_upload_input_tree()[source]

Take as input a .GII file.

launch(file_type, data_file, data_file_part2, should_center=False)[source]

Execute import operations:

gifti_timeseries_importer

class tvb.adapters.uploaders.gifti_timeseries_importer.GIFTITimeSeriesImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

This importer is responsible for importing a TimeSeries from the GIFTI format (XML file) and storing it in TVB.

get_output()[source]
get_upload_input_tree()[source]

Take as input a .GII file.

launch(data_file, surface=None)[source]

Execute import operations:

mat_timeseries_importer

class tvb.adapters.uploaders.mat_timeseries_importer.MatTimeSeriesImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Import time series from a .mat file.

TS_EEG = 'EEG'
TS_REGION = 'region'
create_eeg_ts(data, sensors)[source]
create_region_ts(data, connectivity)[source]
get_output()[source]
get_upload_input_tree()[source]
launch(*args, **kwargs)[source]
ts_builder = {'region': <function create_region_ts at 0x7fefe536fd70>, 'EEG': <function create_eeg_ts at 0x7fefe536fde8>}

networkx_importer

class tvb.adapters.uploaders.networkx_importer.NetworkxConnectivityImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Import connectivity data stored in the networkx gpickle format

get_output()[source]
get_upload_input_tree()[source]
launch(*args, **kwargs)[source]

nifti_importer

class tvb.adapters.uploaders.nifti_importer.NIFTIImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

This importer is responsible for loading data from the NIFTI format (nii or nii.gz files) and storing it in TVB as TimeSeriesVolume or RegionVolumeMapping.

get_output()[source]
get_upload_input_tree()[source]

Take as input a GZ archive or NII file.

launch(*args, **kwargs)[source]

Execute import operations:

obj_importer

class tvb.adapters.uploaders.obj_importer.ObjSurfaceImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

This imports geometry data stored in the Wavefront OBJ format

get_output()[source]
get_upload_input_tree()[source]

Take as input an OBJ file

launch(*args, **kwargs)[source]

Execute import operations:

projection_matrix_importer

class tvb.adapters.uploaders.projection_matrix_importer.BrainstormGainMatrixImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Import a Brainstorm file containing an sEEG, EEG or MEG gain matrix / lead field / projection matrix.

Brainstorm calculates the gain matrix for a set of three orthogonally oriented dipoles at each source location. However, we assume that these source points correspond to the cortical surface to which this head model shall be linked; thus we can use the source orientations to weight the three dipoles’ gain vectors, producing a gain matrix whose number of rows matches the number of sensors and whose number of columns matches the number of vertices in the linked cortical surface.

get_output()[source]
get_upload_input_tree()[source]

Defines input parameters for this uploader

launch(filename, surface, sensors)[source]
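The orientation weighting described above amounts to contracting each vertex’s three orthogonal dipole columns with that vertex’s orientation vector. A sketch, under the assumption that the Brainstorm gain matrix arrives with shape (n_sensors, n_vertices * 3) and the three dipoles of each vertex stored contiguously:

```python
import numpy

def collapse_dipole_gain(gain_three_dipole, orientations):
    # gain_three_dipole: (n_sensors, n_vertices * 3)
    # orientations: (n_vertices, 3) unit source orientations
    n_sensors = gain_three_dipole.shape[0]
    n_vertices = orientations.shape[0]
    # View the columns as (n_sensors, n_vertices, 3) and contract the dipole axis.
    gain = gain_three_dipole.reshape(n_sensors, n_vertices, 3)
    return numpy.einsum('svd,vd->sv', gain, orientations)
```

The result has one row per sensor and one column per vertex of the linked cortical surface, as described above.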
class tvb.adapters.uploaders.projection_matrix_importer.ProjectionMatrixSurfaceEEGImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Upload ProjectionMatrix Cortical Surface -> EEG/MEG/SEEG Sensors from a MAT or NPY file.

get_output()[source]
get_upload_input_tree()[source]

Define input parameters for this importer.

launch(projection_file, surface, sensors, dataset_name='ProjectionMatrix')[source]

Creates ProjectionMatrix entity from uploaded data.

Raises LaunchException: when
  • no projection_file or sensors are specified
  • the dataset is invalid
  • the number of sensors is different from the one in the dataset
logger = <logging.Logger object at 0x7fefde03fc10>
tvb.adapters.uploaders.projection_matrix_importer.build_projection_instance(sensors, storage_path)[source]

region_mapping_importer

class tvb.adapters.uploaders.region_mapping_importer.RegionMapping_Importer[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Upload RegionMapping from a TXT, ZIP or BZ2 file.

get_output()[source]
get_upload_input_tree()[source]

Define input parameters for this importer.

launch(mapping_file, surface, connectivity)[source]

Creates region mapping from uploaded data.

Parameters: mapping_file – an archive containing data for mapping surface to connectivity
Raises LaunchException: when
  • a parameter is None or missing
  • the archive has more than one file
  • the uploaded files are empty
  • the number of vertices in the imported file differs from the number of surface vertices
  • the imported file has negative values
  • the imported file has regions which are not in the connectivity
logger = <logging.Logger object at 0x7fefddf77e50>
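The validation rules listed under Raises can be sketched as a stand-alone check (hypothetical helper, not the importer’s actual code):

```python
import numpy

def validate_region_mapping(mapping, n_vertices, n_regions):
    # Returns the mapping as an int array, or raises ValueError
    # mirroring the failure cases listed above.
    arr = numpy.asarray(mapping, dtype=numpy.int64)
    if arr.size == 0:
        raise ValueError("uploaded file is empty")
    if arr.size != n_vertices:
        raise ValueError("vertex count differs from the surface")
    if (arr < 0).any():
        raise ValueError("imported file has negative values")
    if arr.max() >= n_regions:
        raise ValueError("file references regions not in the connectivity")
    return arr
```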

sensors_importer

class tvb.adapters.uploaders.sensors_importer.BrainstormSensorUploader[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Upload sensors from Brainstorm database files

get_output()[source]
get_upload_input_tree()[source]
launch(filename)[source]
class tvb.adapters.uploaders.sensors_importer.Sensors_Importer[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Upload Sensors from a TXT file.

EEG_SENSORS = 'EEG Sensors'
INTERNAL_SENSORS = 'Internal Sensors'
MEG_SENSORS = 'MEG sensors'
get_output()[source]
get_upload_input_tree()[source]

Define input parameters for this importer.

launch(sensors_file, sensors_type)[source]

Creates required sensors from the uploaded file.

Parameters:
  • sensors_file – the file containing sensor data
  • sensors_type – a string from “EEG Sensors”, “MEG sensors”, “Internal Sensors”
Returns: a list of sensor instances of the specified type
Raises LaunchException: when
  • no sensors_file is specified
  • sensors_type is invalid (not one of the mentioned options)
  • sensors_type is “MEG sensors” and no orientation is specified

logger = <logging.Logger object at 0x7fefddea1610>

signals_importer

Provides facilities to import FieldTrip data sets into TVB as time series and sensor data.

class tvb.adapters.uploaders.signals_importer.EEGLAB[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

EEGLAB .set file

get_output()[source]
get_upload_input_tree()[source]
launch(matfile, fdtfile)[source]
logger = <logging.Logger object at 0x7fefdddc5750>
class tvb.adapters.uploaders.signals_importer.FieldTripUploader[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Upload time series and sensor data via a MAT file containing “dat” and “hdr” variables from the ft_read_data and ft_read_header functions.

For the moment, we treat all data coming from FieldTrip as MEG data though the channels may be of heterogeneous type.

get_output()[source]
get_upload_input_tree()[source]
launch(matfile)[source]
logger = <logging.Logger object at 0x7fefdddc5750>
class tvb.adapters.uploaders.signals_importer.VHDR[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Upload a BrainVision Analyser file.

get_output()[source]
get_upload_input_tree()[source]
launch(vhdr, dat)[source]
read_data(mmap=False, dt='float32', mode='r')[source]

VHDR stores data in a channel-contiguous way, so reading disparate pieces in time is fast when using memmap.
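A sketch of the access pattern this layout enables, assuming the raw data file can be viewed as a (n_channels, n_samples) array (hypothetical helper; the real reader also parses the VHDR header for these dimensions):

```python
import numpy

def read_time_windows(dat_path, n_channels, n_samples, windows, dt='float32'):
    # Memory-map the recording; nothing is read from disk yet.
    data = numpy.memmap(dat_path, dtype=dt, mode='r', shape=(n_channels, n_samples))
    # Each window slice only touches the pages it covers, so gathering
    # disparate pieces in time stays cheap even for long recordings.
    return [numpy.array(data[:, start:stop]) for start, stop in windows]
```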

tract_importer

class tvb.adapters.uploaders.tract_importer.TrackvizTractsImporter[source]

Bases: tvb.adapters.uploaders.tract_importer._TrackImporterBase

This imports tracts from the trackviz format

launch(*args, **kwargs)[source]
class tvb.adapters.uploaders.tract_importer.ZipTxtTractsImporter[source]

Bases: tvb.adapters.uploaders.tract_importer._TrackImporterBase

This imports tracts from a ZIP containing txt files, one txt file per tract.

get_upload_input_tree()[source]
launch(*args, **kwargs)[source]
tvb.adapters.uploaders.tract_importer.chunk_iter(iterable, n)[source]

Reads a generator in chunks. Yields lists; the last one may be smaller than n.
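A minimal implementation matching this contract (a sketch; the actual TVB function may differ in details):

```python
from itertools import islice

def chunk_iter(iterable, n):
    # Reads a generator in chunks of n; yields lists,
    # the last of which may be smaller than n.
    iterator = iter(iterable)
    while True:
        chunk = list(islice(iterator, n))
        if not chunk:
            return
        yield chunk
```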

tvb_importer

class tvb.adapters.uploaders.tvb_importer.TVBImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

This importer is responsible for loading data types exported from other systems in TVB format (a simple H5 file, or a ZIP file containing multiple H5 files)

get_output()[source]
get_upload_input_tree()[source]

Take as input a ZIP archive or H5 file.

launch(data_file)[source]

Execute import operations: unpack ZIP, build and store generic DataType objects.

Parameters: data_file – an archive (ZIP / HDF5) containing the DataType
Raises LaunchException: when data_file is None, nonexistent, or invalid (e.g. incomplete metadata, not in ZIP / HDF5 format, etc.)

zip_connectivity_importer

class tvb.adapters.uploaders.zip_connectivity_importer.ZIPConnectivityImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Handler for uploading a Connectivity archive, with files holding text exports of connectivity data from NumPy arrays.

AREA_TOKEN = 'area'
CENTRES_TOKEN = 'centres'
CENTRES_TOKEN2 = 'centers'
CORTICAL_INFO = 'cortical'
HEMISPHERE_INFO = 'hemisphere'
ORIENTATION_TOKEN = 'orientation'
TRACT_TOKEN = 'tract'
WEIGHT_TOKEN = 'weight'
get_output()[source]
get_upload_input_tree()[source]

Take as input a ZIP archive.

launch(uploaded, normalization=None)[source]

Execute import operations: unpack ZIP and build Connectivity object as result.

Parameters: uploaded – an archive containing the Connectivity data to be imported
Returns: Connectivity
Raises:
  • LaunchException – when uploaded is empty or nonexistent
  • Exception – when the weights or tracts matrix is invalid (negative values, wrong shape), or when the length of any of the vectors orientation, areas, cortical or hemisphere differs from the expected number of nodes
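The matrix and vector checks listed above can be sketched as a stand-alone validation (hypothetical helper, not the importer’s actual code):

```python
import numpy

def validate_connectivity_arrays(weights, tracts, vectors):
    # Returns the number of nodes, or raises on the failure
    # cases listed above.
    n = weights.shape[0]
    if weights.shape != (n, n) or tracts.shape != weights.shape:
        raise Exception("weights or tracts matrix has the wrong shape")
    if (weights < 0).any() or (tracts < 0).any():
        raise Exception("weights or tracts matrix has negative values")
    for name, vector in vectors.items():
        if len(vector) != n:
            raise Exception("%s length differs from the number of nodes" % name)
    return n
```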

zip_surface_importer

class tvb.adapters.uploaders.zip_surface_importer.ZIPSurfaceImporter[source]

Bases: tvb.adapters.uploaders.abcuploader.ABCUploader

Handler for uploading a Surface Data archive, with files holding vertices, normals and triangles that represent a surface.

get_output()[source]
get_upload_input_tree()[source]

Take as input a ZIP archive.

launch(uploaded, surface_type, zero_based_triangles=False, should_center=False)[source]

Execute import operations: unpack ZIP and build Surface object as result.

Parameters:
  • uploaded – an archive containing the Surface data to be imported
  • surface_type – a string from the following: “Skin Air”, “Skull Skin”, “Brain Skull”, “Cortical Surface”, “EEG Cap”, “Face”
Returns:

a subclass of Surface DataType

Raises:
  • LaunchException – when uploaded is missing or surface_type is invalid
  • RuntimeError – when triangles contain an invalid vertex index
logger = <logging.Logger object at 0x7fefdddf3190>