nimare.meta.kernel.ALEKernel
- class ALEKernel(fwhm=None, sample_size=None)[source]
Bases: KernelTransformer
Generate ALE modeled activation images from coordinates and sample size.
By default (if neither fwhm nor sample_size is provided), the FWHM of the kernel will be determined on a study-wise basis, based on the sample sizes available in the input, via the method described in Eickhoff et al. [1].
Changed in version 0.0.12: Removed the low-memory option in favor of sparse arrays for kernel transformers.
- Parameters
  - fwhm (float, optional) – Full-width half-max for the Gaussian kernel, if you want to have a constant kernel across Contrasts. Mutually exclusive with sample_size.
  - sample_size (int, optional) – Sample size, used to derive the FWHM for the Gaussian kernel based on formulae from Eickhoff et al. (2012). This sample size overwrites the Contrast-specific sample sizes in the dataset, in order to hold the kernel constant across Contrasts. Mutually exclusive with fwhm.
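Because the two parameters select competing ways of setting the kernel width, they cannot be combined. A minimal sketch of that contract (a hypothetical `ALEKernelSketch` class, not NiMARE's implementation) looks like this:

```python
# Hypothetical sketch of the documented constraint: fwhm and sample_size
# are mutually exclusive, so passing both should fail fast.
class ALEKernelSketch:
    def __init__(self, fwhm=None, sample_size=None):
        if fwhm is not None and sample_size is not None:
            raise ValueError("fwhm and sample_size are mutually exclusive.")
        self.fwhm = fwhm
        self.sample_size = sample_size


# Derive FWHM from a fixed sample size across all Contrasts:
kernel = ALEKernelSketch(sample_size=20)
```

Leaving both arguments as None corresponds to the default, study-wise behavior described above.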
References
- [1] Simon B. Eickhoff, Danilo Bzdok, Angela R. Laird, Florian Kurth, and Peter T. Fox. Activation likelihood estimation meta-analysis revisited. NeuroImage, 59(3):2349–2361, 2012. doi:10.1016/j.neuroimage.2011.09.017.
Methods
- get_params([deep]) – Get parameters for this estimator.
- load(filename[, compressed]) – Load a pickled class instance from file.
- save(filename[, compress]) – Pickle the class instance to the provided file.
- set_params(**params) – Set the parameters of this estimator.
- transform(dataset[, masker, return_type]) – Generate modeled activation images for each Contrast in dataset.
- classmethod load(filename, compressed=True)[source]
Load a pickled class instance from file.
- Parameters
  - filename (str) – Name of the file containing the pickled object.
  - compressed (bool, optional) – If True, the pickled file is assumed to be compressed. Default is True.
- Returns
  obj – Loaded class object.
- Return type
  class object
- set_params(**params)[source]
Set the parameters of this estimator.
The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter>, so that it's possible to update each component of a nested object.
- Return type
  self
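The `<component>__<parameter>` convention can be sketched with a small stand-in class (hypothetical `SketchEstimator`; neither NiMARE's nor scikit-learn's actual implementation): the part before the double underscore names a nested attribute, and the remainder is forwarded to it.

```python
class SketchEstimator:
    """Minimal sketch of the <component>__<parameter> routing convention."""

    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

    def set_params(self, **params):
        for key, value in params.items():
            if "__" in key:
                # "kernel__fwhm" -> forward fwhm to the nested "kernel" object.
                component, _, subkey = key.partition("__")
                getattr(self, component).set_params(**{subkey: value})
            else:
                setattr(self, key, value)
        return self


inner = SketchEstimator(fwhm=None)
outer = SketchEstimator(kernel=inner)
outer.set_params(kernel__fwhm=8.0)  # routed to the nested kernel object
```

After the call, the nested object's fwhm has been updated without touching the outer estimator's own attributes.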
- transform(dataset, masker=None, return_type='image')[source]
Generate modeled activation images for each Contrast in dataset.
- Parameters
  - dataset (Dataset or pandas.DataFrame) – Dataset for which to make images. Can be a DataFrame if necessary.
  - masker (img_like or None, optional) – Mask to apply to MA maps. Required if dataset is a DataFrame. If None (and dataset is a Dataset), the Dataset's masker attribute will be used. Default is None.
  - return_type ({'sparse', 'array', 'image', 'dataset'}, optional) – Whether to return a sparse array ('sparse'), a numpy array ('array'), a list of niimgs ('image'), or a Dataset with MA images saved as files ('dataset'). Default is 'image'.
- Returns
  imgs – If return_type is 'sparse', a 4D sparse array (E x S), where E is the number of unique experiments and the remaining three dimensions are equal to the shape of the images. If return_type is 'array', a 2D numpy array (C x V), where C is contrast and V is voxel. If return_type is 'image', a list of modeled activation images (one for each of the Contrasts in the input dataset). If return_type is 'dataset', a new Dataset object with modeled activation images saved to files and referenced in the Dataset.images attribute.
- Return type
  (C x V) numpy.ndarray or list of nibabel.Nifti1Image or Dataset
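Conceptually, a modeled activation map places a Gaussian at each reported focus and, following the revised ALE approach cited above, combines overlapping foci from the same experiment by a voxel-wise maximum rather than a sum. The sketch below illustrates that idea in 1D with a hypothetical `ma_map_1d` helper (the real kernel is 3D, and the FWHM is normally derived from sample size):

```python
import math


def ma_map_1d(foci, n_voxels, fwhm):
    """Toy 1D modeled-activation map: Gaussian per focus, voxel-wise maximum."""
    # Convert FWHM to the Gaussian standard deviation.
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    ma = [0.0] * n_voxels
    for focus in foci:
        for v in range(n_voxels):
            g = math.exp(-((v - focus) ** 2) / (2.0 * sigma ** 2))
            # Voxel-wise maximum across this experiment's foci.
            ma[v] = max(ma[v], g)
    return ma


ma = ma_map_1d(foci=[10, 14], n_voxels=30, fwhm=6.0)
```

Each value stays in [0, 1], peaking at 1 exactly at a focus; summing instead of taking the maximum would let nearby foci from one experiment inflate the map, which is what the maximum avoids.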