TenetoBIDS

TenetoBIDS allows use of Teneto functions to analyse entire datasets in just a few lines of code. The output from Teneto is then ready to be placed in statistical models, machine learning algorithms and/or plotted.

Prerequisites

To use TenetoBIDS you need preprocessed fMRI data in the BIDS format. It is tested and optimized for fMRIPrep, but other preprocessing software following BIDS should (in theory) work too. fMRIPrep version 1.4 or later is required. This preprocessed data should be in the ~BIDS_dir/derivatives/ directory. The output from teneto will always be found in …/BIDS_dir/derivatives/ in directories that begin with teneto- (the exact name depends on the function you use).

Contents of this tutorial

This tutorial will run a complete analysis on some test data.

For this tutorial, we will use some dummy data which is included with teneto. This section details what is in this data.

[1]:
import teneto
import os
dataset_path = teneto.__path__[0] + '/data/testdata/dummybids/'
print(os.listdir(dataset_path))
print(os.listdir(dataset_path + '/derivatives'))

['participants.tsv', 'dataset_description.json', 'sub-001', 'derivatives', 'sub-002']
['teneto-censor-timepoints', 'teneto-derive-temporalnetwork', 'teneto-volatility', 'teneto-exclude-runs', 'teneto-tests', 'teneto-make-parcellation', 'fmriprep', 'teneto-binarize', 'teneto-remove-confounds']

From the above we can see that there are two subjects in our dataset, and there is an fMRIPrep folder in the derivatives directory. Only subject 1 has any dummy data, so we will have to select subject 1.

A complete analysis

Below is a complete analysis of this test data. We will then go through each step in turn.

[2]:

#Imports.
from teneto import TenetoBIDS
from teneto import __path__ as tenetopath
import numpy as np
#Set the path of the dataset.
datdir = tenetopath[0] + '/data/testdata/dummybids/'

# Step 1:
bids_filter = {'subject': '001',
               'run': 1,
               'task': 'a'}
tnet = TenetoBIDS(datdir, selected_pipeline='fmriprep', bids_filter=bids_filter, exist_ok=True)

# Step 2: create a parcellation
parcellation_params = {'atlas': 'Schaefer2018',
                       'atlas_desc': '100Parcels7Networks',
                       'parc_params': {'detrend': True}}
tnet.run('make_parcellation', parcellation_params)

# Step 3: Regress out confounds
remove_params = {'confound_selection': ['confound1']}
tnet.run('remove_confounds', remove_params)

# Step 4: Additional preprocessing
exclude_params = {'confound_name': 'confound1',
                   'exclusion_criteria': '<-0.99'}
tnet.run('exclude_runs', exclude_params)
censor_params = {'confound_name': 'confound1',
                   'exclusion_criteria': '<-0.99',
                   'replace_with': 'cubicspline',
                   'tol': 0.25}
tnet.run('censor_timepoints', censor_params)

# Step 5: Calculate time-varying connectivity
derive_params = {'params': {'method': 'jackknife',
                            'postpro': 'standardize'}}
tnet.run('derive_temporalnetwork', derive_params)

# Step 6: Binarize the network
binarize_params = {'threshold_type': 'percent',
                   'threshold_level': 0.1}
tnet.run('binarize', binarize_params)

# Step 7: Calculate a network measure
measure_params = {'distance_func': 'hamming'}
tnet.run('volatility', measure_params)

# Step 8: load data
vol = tnet.load_data()
print(vol)

{'sub-001_run-1_task-a_vol.tsv':           0
0  0.103733}

Big Picture

While the above code may seem overwhelming at first, it is quite little code for what it does. It starts with nifti images and ends with a single measure summarizing a time-varying connectivity estimate of the network.

There is one recurring theme used in the code above:

tnet.run(function_name, function_parameters)

function_name is a string and function_parameters is a dictionary. function_name can be most functions in teneto, provided the data is in the correct format; function_parameters are the inputs to that function. You never need to pass the input data (e.g. time series or network), or the sidecar input for any functions that have one.

TenetoBIDS will also automatically try and find a confounds file in the derivatives when needed, so, this does not need to be specified either.
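The tnet.run pattern is essentially name-based dispatch: a function is looked up by its string name and called with the parameter dictionary, while the data itself is supplied automatically. A minimal toy sketch of the idea (these functions are illustrative stand-ins, not teneto's actual implementation):

```python
import numpy as np

# Toy "pipeline" functions standing in for teneto's processing steps.
def demean(data):
    return data - data.mean(axis=-1, keepdims=True)

def scale(data, factor=1.0):
    return data * factor

FUNCTIONS = {'demean': demean, 'scale': scale}

def run(function_name, function_parameters, data):
    # Look the function up by name and pass the parameters as keyword
    # arguments; the input data itself is supplied automatically.
    return FUNCTIONS[function_name](data, **function_parameters)

ts = np.array([[1.0, 2.0, 3.0]])
out = run('scale', {'factor': 2.0}, ts)
print(out)  # [[2. 4. 6.]]
```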

Once you have grasped the above, the rest is pretty straightforward. But we will go through each step in turn.

Step 1 - defining the TenetoBIDS object.

[3]:
#Set the path of the dataset.
datdir = tenetopath[0] + '/data/testdata/dummybids/'
# Step 1:
bids_filter = {'subject': '001',
               'run': 1,
               'task': 'a'}
tnet = TenetoBIDS(datdir, selected_pipeline='fmriprep', bids_filter=bids_filter, exist_ok=True)

selected_pipeline

This states where teneto will go looking for files. This example shows it should look in the fMRIPrep derivative directory (i.e. in: datadir + '/derivatives/fmriprep/').

bids_filter

teneto uses pybids to select different files. The bids_filter argument is a dictionary of arguments that get passed into the BIDSLayout.get. In the example above, we are saying we want subject 001, run 1 and task a. If you do not provide any arguments for bids_filter, all data found within the derivatives folder gets selected for analyses.
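Conceptually, the filter matches key-value pairs against the BIDS entities pybids extracts from each filename. A toy illustration of that matching (a conceptual sketch, not pybids itself; the filenames and entity dictionaries here are made up):

```python
# Hypothetical file entities, mimicking what pybids parses from filenames.
files = [
    {'name': 'sub-001_task-a_run-01_bold.nii.gz',
     'subject': '001', 'task': 'a', 'run': 1},
    {'name': 'sub-002_task-b_run-01_bold.nii.gz',
     'subject': '002', 'task': 'b', 'run': 1},
]

bids_filter = {'subject': '001', 'run': 1, 'task': 'a'}

def matches(entities, bids_filter):
    # A file is selected only when every filter entity matches.
    return all(entities.get(key) == value for key, value in bids_filter.items())

selected = [f['name'] for f in files if matches(f, bids_filter)]
print(selected)  # ['sub-001_task-a_run-01_bold.nii.gz']
```

With an empty bids_filter, every file matches, mirroring how all derivatives data gets selected when no filter is given.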

exist_ok (default: False)

This checks whether it is ok to overwrite any previous calculations. The output data is saved in a new directory. If the new output directory already exists, the teneto step has previously been run, and an error will be raised because data may otherwise be overwritten. To overrule this error, set exist_ok to True.

We can now look at what files are selected that will be run on the next step.

[4]:
tnet.get_selected_files()

[4]:
[<BIDSDataFile filename='/home/william/anaconda3/lib/python3.6/site-packages/teneto/data/testdata/dummybids/derivatives/fmriprep/sub-001/func/sub-001_task-a_run-01_desc-confounds_regressors.tsv'>,
 <BIDSImageFile filename='/home/william/anaconda3/lib/python3.6/site-packages/teneto/data/testdata/dummybids/derivatives/fmriprep/sub-001/func/sub-001_task-a_run-01_desc-preproc_bold.nii.gz'>]

If there are files here you do not want, you can add to the bids filter with tnet.update_bids_filter. Or, you can set tnet.bids_filter to a new dictionary if you want.

Next, you might want to see what functions you can run on these selected files. The following will specify which functions can be run specifically on the selected data. If you want all options, you can add for_selected=False.

[5]:
tnet.get_run_options()

[5]:
'make_parcellation, exclude_runs'

The output here (exclude_runs and make_parcellation) lists the functions that, with the selected files, can be called in tnet.run. Once different functions have been called, the options change.

Step 2 Calling the run function to make a parcellation.

When selecting preprocessed files, these will often be nifti images. From these images, we want to make time series of regions of interest. This can be done with make_parcellation. This function uses TemplateFlow atlases to make the parcellation.

[6]:
parcellation_params = {'atlas': 'Schaefer2018',
                       'atlas_desc': '100Parcels7Networks',
                       'parc_params': {'detrend': True}}
tnet.run('make_parcellation', parcellation_params)

The atlas and atlas_desc are used to identify TemplateFlow atlases.

Teneto uses nilearn’s NiftiLabelsMasker to make the parcellation. Any arguments to this function (e.g. preprocessing steps) can be passed using the ‘parc_params’ argument (here detrend is used).
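Conceptually, a parcellation reduces many voxel time series to one time series per atlas region by averaging within each label. A minimal numpy sketch of that idea with random data (NiftiLabelsMasker does this on nifti images, plus optional preprocessing such as detrending):

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_timepoints = 6, 10
voxel_ts = rng.normal(size=(n_voxels, n_timepoints))
# Atlas labels assigning each voxel to a region (two regions here).
labels = np.array([1, 1, 1, 2, 2, 2])

# Average the voxel time series within each label.
region_ts = np.stack([voxel_ts[labels == lab].mean(axis=0)
                      for lab in np.unique(labels)])
print(region_ts.shape)  # (2, 10)
```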

Step 3 Regress out confounds

[7]:
remove_params = {'confound_selection': ['confound1']}
tnet.run('remove_confounds', remove_params)

Confounds can be removed by calling remove_confounds.

The confounds tsv file is automatically located, as long as it is in a derivatives folder and there is only one such file.

Here ‘confound1’ is a column name in the confounds tsv file.

Similarly to make_parcellation, it uses nilearn (nilearn.signal.clean). clean_params is a possible argument; like parc_params, these are inputs passed to the nilearn function.
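Confound removal amounts to regressing the confound time series out of each region's time series and keeping the residuals. A minimal least-squares sketch of that operation on synthetic data (nilearn.signal.clean does this, with extra options such as filtering and standardizing):

```python
import numpy as np

rng = np.random.default_rng(1)
n_timepoints = 50
confound = rng.normal(size=n_timepoints)
# A region time series contaminated by the confound.
signal = rng.normal(size=n_timepoints) + 0.8 * confound

# Build a design matrix (confound + intercept) and keep the residuals.
design = np.column_stack([confound, np.ones(n_timepoints)])
beta, *_ = np.linalg.lstsq(design, signal, rcond=None)
cleaned = signal - design @ beta

# The cleaned series is (numerically) uncorrelated with the confound.
print(np.corrcoef(cleaned, confound)[0, 1])
```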

Step 4: Additional preprocessing

[8]:
exclude_params = {'confound_name': 'confound1',
                   'exclusion_criteria': '<-0.99'}
tnet.run('exclude_runs', exclude_params)
censor_params = {'confound_name': 'confound1',
                   'exclusion_criteria': '<-0.99',
                   'replace_with': 'cubicspline',
                   'tol': 0.25}
tnet.run('censor_timepoints', censor_params)


These two calls to tnet.run exclude both time-points and runs that are problematic. The first, exclude_runs, rejects any run where the mean of confound1 is less than -0.99. Excluded runs will no longer be part of the loaded data in later calls of tnet.run().

Censoring time-points here means that whenever confound1 at a time-point is less than -0.99, that time-point is “censored” (set to not a number). We have also set the argument replace_with to ‘cubicspline’. This means that the censored values are now simulated using a cubic spline. The parameter tol sets the proportion of time-points that are allowed to be censored before the run gets ignored.
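The censoring step can be sketched with numpy and scipy: flag bad time-points against the criterion, check the flagged fraction against the tolerance, and replace the flagged values with a cubic spline fitted to the remaining time-points (a hand-rolled illustration on made-up numbers, not teneto's implementation):

```python
import numpy as np
from scipy.interpolate import CubicSpline

confound = np.array([0.1, 0.2, -1.0, 0.3, 0.1, -1.5, 0.2, 0.4])
ts = np.array([1.0, 2.0, 9.0, 4.0, 5.0, 9.0, 7.0, 8.0])

# Censor time-points where the confound violates the criterion (< -0.99).
bad = confound < -0.99
tol = 0.25
assert bad.mean() <= tol, 'too many censored time-points; run would be ignored'

# Replace censored values with a cubic spline fitted to the good time-points.
good_idx = np.flatnonzero(~bad)
spline = CubicSpline(good_idx, ts[good_idx])
ts_censored = ts.copy()
ts_censored[bad] = spline(np.flatnonzero(bad))
print(ts_censored)
```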

Step 5: Calculate time-varying connectivity

The code below now derives time-varying connectivity matrices. There are multiple different methods that can be called. See teneto.timeseries.derive_temporalnetwork for more options.

[9]:
derive_params = {'params': {'method': 'jackknife',
                            'postpro': 'standardize'}}
tnet.run('derive_temporalnetwork', derive_params)
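The jackknife method estimates connectivity at each time-point by computing the correlation over all time-points except that one, yielding one connectivity estimate per time-point (the 'standardize' post-processing then standardizes the resulting series). A simplified two-region numpy sketch of the leave-one-out idea (the sign flip is a common convention so that stronger-coupled time-points get higher values; see teneto's documentation for the exact implementation):

```python
import numpy as np

rng = np.random.default_rng(2)
n_timepoints = 30
x = rng.normal(size=n_timepoints)
y = 0.5 * x + rng.normal(size=n_timepoints)

# Leave-one-out (jackknife) correlation: one estimate per time-point.
jackknife = np.empty(n_timepoints)
for t in range(n_timepoints):
    keep = np.arange(n_timepoints) != t
    # Sign-flipped by convention so higher values mean stronger coupling
    # at the left-out time-point.
    jackknife[t] = -np.corrcoef(x[keep], y[keep])[0, 1]

print(jackknife.shape)  # (30,)
```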

Step 6: Binarize the network

Once you have a network representation, there are multiple ways it can be transformed. One example is to binarize the network so all values are 0 or 1. The code below converts the top 10% of edges to 1s and the rest to 0s.

[10]:
binarize_params = {'threshold_type': 'percent',
                   'threshold_level': 0.1}
tnet.run('binarize', binarize_params)
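Percent-based binarization keeps only the strongest edges. A numpy sketch of thresholding the top 10% of edges in a single symmetric connectivity matrix (a simplified stand-in for teneto's binarize, which does this per time-point):

```python
import numpy as np

rng = np.random.default_rng(3)
n_nodes = 10
# A symmetric connectivity matrix for a single time-point.
w = rng.normal(size=(n_nodes, n_nodes))
w = (w + w.T) / 2
np.fill_diagonal(w, 0)

# Set edges above the 90th percentile of edge weights to 1, the rest to 0.
edges = w[np.triu_indices(n_nodes, k=1)]
threshold = np.percentile(edges, 90)
binarized = (w > threshold).astype(int)
print(binarized.sum())
```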

Step 7: Calculate a network measure

We are now ready to calculate a property of the temporal network. Here we calculate volatility (i.e. how much the network changes per time-point). This generates one value per subject.

[11]:
measure_params = {'distance_func': 'hamming'}
tnet.run('volatility', measure_params)
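Volatility is the average distance between the network at consecutive time-points; for a binary network, the hamming distance is simply the fraction of edges that differ. A numpy sketch on a random binary temporal network:

```python
import numpy as np

rng = np.random.default_rng(4)
n_nodes, n_timepoints = 5, 20
# A binary temporal network: node x node x time.
tnet_array = rng.integers(0, 2, size=(n_nodes, n_nodes, n_timepoints))

# Hamming distance between consecutive snapshots, averaged over time.
diffs = tnet_array[:, :, 1:] != tnet_array[:, :, :-1]
volatility = diffs.mean()
print(float(volatility))
```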

Step 8: load data

[12]:
vol = tnet.load_data()
print(vol)

{'sub-001_run-1_task-a_vol.tsv':           0
0  0.103733}

Now that we have a measure of volatility for the network, we can load it and view the measure.