nimare.meta.cbma.kernel
Methods for estimating thresholded cluster maps for neuroimaging contrasts (Contrasts) from sets of foci and optional additional information (e.g., sample size and test statistic values).
NOTE: Output from "dataset.get_coordinates" is currently assumed to be a DataFrame of peak coordinates and sample sizes/statistics (a la Neurosynth).
Classes

- ALEKernel([fwhm, n]) – Generate ALE modeled activation images from coordinates and sample size.
- KDAKernel([r, value]) – Generate KDA modeled activation images from coordinates.
- KernelTransformer() – Base class for modeled activation-generating methods.
- MKDAKernel([r, value]) – Generate MKDA modeled activation images from coordinates.
- Peaks2MapsKernel([resample_to_mask]) – Generate peaks2maps modeled activation images from coordinates.
class ALEKernel(fwhm=None, n=None)

Generate ALE modeled activation images from coordinates and sample size.

Parameters:
- fwhm (float, optional) – Full-width half-max for the Gaussian kernel, if you want to have a constant kernel across Contrasts. Mutually exclusive with n.
- n (int, optional) – Sample size, used to derive the FWHM for the Gaussian kernel based on formulae from Eickhoff et al. (2012). This sample size overwrites the Contrast-specific sample sizes in the dataset, in order to hold the kernel constant across Contrasts. Mutually exclusive with fwhm.
Methods

- get_params(self[, deep]) – Get parameters for this estimator.
- load(filename[, compressed]) – Load a pickled class instance from file.
- save(self, filename[, compress]) – Pickle the class instance to the provided file.
- set_params(self, **params) – Set the parameters of this estimator.
- transform(self, dataset[, mask, masked]) – Generate ALE modeled activation images for each Contrast in dataset.
get_params(self, deep=True)

Get parameters for this estimator.

Parameters:
- deep (bool, optional) – If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns:
- params (mapping of string to any) – Parameter names mapped to their values.
classmethod load(filename, compressed=True)

Load a pickled class instance from file.

Parameters:
- filename (str) – Name of the file containing the pickled object.
- compressed (bool, optional) – If True, the file is assumed to be compressed. Default is True.

Returns:
- obj (class object) – Loaded class object.
save(self, filename, compress=True)

Pickle the class instance to the provided file.

Parameters:
- filename (str) – Name of the output file.
- compress (bool, optional) – If True, compress the output file. Default is True.
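The save/load pair follows the usual pickle-to-file pattern. As an illustrative sketch of the idea (not NiMARE's exact implementation; the helper names here are hypothetical), a compressed save/load round trip might look like:

```python
import gzip
import os
import pickle
import tempfile

def save_obj(obj, filename, compress=True):
    """Pickle obj to filename, optionally gzip-compressing the output."""
    opener = gzip.open if compress else open
    with opener(filename, "wb") as fo:
        pickle.dump(obj, fo)

def load_obj(filename, compressed=True):
    """Load a pickled object, mirroring the compression used at save time."""
    opener = gzip.open if compressed else open
    with opener(filename, "rb") as fo:
        return pickle.load(fo)

# Round trip: the loaded object matches what was saved.
params = {"fwhm": 8.0, "n": None}
path = os.path.join(tempfile.mkdtemp(), "kernel.pkl.gz")
save_obj(params, path)
restored = load_obj(path)
```

Note that `compressed` at load time must match `compress` at save time, since gzip and plain pickle files are not interchangeable.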
set_params(self, **params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter>, so that it's possible to update each component of a nested object.

Returns: self
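The <component>__<parameter> convention can be illustrated with a minimal, hypothetical estimator (this is a toy class to show the routing, not NiMARE code):

```python
class Toy:
    """Minimal estimator with scikit-learn-style get_params/set_params."""

    def __init__(self, fwhm=None, kernel=None):
        self.fwhm = fwhm
        self.kernel = kernel  # may itself be another estimator

    def get_params(self):
        return {"fwhm": self.fwhm, "kernel": self.kernel}

    def set_params(self, **params):
        for key, value in params.items():
            if "__" in key:
                # "<component>__<parameter>": route to the nested object
                component, _, sub_key = key.partition("__")
                getattr(self, component).set_params(**{sub_key: value})
            else:
                setattr(self, key, value)
        return self

inner = Toy(fwhm=6.0)
outer = Toy(kernel=inner)
outer.set_params(kernel__fwhm=10.0)  # updates inner.fwhm through the outer object
```

Here `kernel__fwhm=10.0` is split at the double underscore, so the call is forwarded to the nested `kernel` component's own set_params.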
transform(self, dataset, mask=None, masked=False)

Generate ALE modeled activation images for each Contrast in dataset.

Parameters:
- dataset (nimare.dataset.Dataset or pandas.DataFrame) – Dataset for which to make images. Can be a DataFrame if necessary.
- mask (img_like, optional) – Only used if dataset is a DataFrame.
- masked (bool, optional) – Return an array instead of a niimg.

Returns:
- imgs (list of nibabel.Nifti1Image) – A list of modeled activation images (one for each of the Contrasts in the input dataset).
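As a rough illustration of the ALE approach, each focus is modeled with a Gaussian whose width comes from the FWHM, using sigma = fwhm / (2 * sqrt(2 * ln 2)). The following is a simplified 1-D sketch of that relationship, not NiMARE's implementation (which builds 3-D kernels and derives per-Contrast FWHMs from sample size per Eickhoff et al., 2012):

```python
import numpy as np

def ale_profile_1d(focus, grid, fwhm):
    """Gaussian modeled-activation profile around a single focus (1-D sketch)."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-((grid - focus) ** 2) / (2.0 * sigma ** 2))

grid = np.arange(-20.0, 21.0)  # 1 mm spacing, centered on zero
profile = ale_profile_1d(focus=0.0, grid=grid, fwhm=8.0)
# The profile peaks at the focus and falls to half its maximum
# at a distance of fwhm / 2 from the focus.
```

This makes the fwhm/n trade-off above concrete: fixing fwhm holds this profile constant across Contrasts, while supplying n lets the width vary with sample size.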
class MKDAKernel(r=10, value=1)

Generate MKDA modeled activation images from coordinates.

Parameters:
- r (int, optional) – Sphere radius, in mm. Default is 10.
- value (int, optional) – Value assigned to voxels within each sphere. Default is 1.

Methods

- get_params(self[, deep]) – Get parameters for this estimator.
- load(filename[, compressed]) – Load a pickled class instance from file.
- save(self, filename[, compress]) – Pickle the class instance to the provided file.
- set_params(self, **params) – Set the parameters of this estimator.
- transform(self, dataset[, mask, masked]) – Generate MKDA modeled activation images for each Contrast in dataset.
get_params(self, deep=True)

Get parameters for this estimator.

Parameters:
- deep (bool, optional) – If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns:
- params (mapping of string to any) – Parameter names mapped to their values.
classmethod load(filename, compressed=True)

Load a pickled class instance from file.

Parameters:
- filename (str) – Name of the file containing the pickled object.
- compressed (bool, optional) – If True, the file is assumed to be compressed. Default is True.

Returns:
- obj (class object) – Loaded class object.
save(self, filename, compress=True)

Pickle the class instance to the provided file.

Parameters:
- filename (str) – Name of the output file.
- compress (bool, optional) – If True, compress the output file. Default is True.
set_params(self, **params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter>, so that it's possible to update each component of a nested object.

Returns: self
transform(self, dataset, mask=None, masked=False)

Generate MKDA modeled activation images for each Contrast in dataset. For each Contrast, a binary sphere of radius r is placed around each coordinate. Voxels within overlapping regions between proximal coordinates are set to 1, rather than the sum.

Parameters:
- dataset (nimare.dataset.Dataset or pandas.DataFrame) – Dataset for which to make images. Can be a DataFrame if necessary.
- mask (img_like, optional) – Only used if dataset is a DataFrame.
- masked (bool, optional) – Return an array instead of a niimg.

Returns:
- imgs (list of nibabel.Nifti1Image) – A list of modeled activation images (one for each of the Contrasts in the input dataset).
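The described behavior (binary spheres, with overlap clipped to 1) can be sketched in 1-D with NumPy. This is an illustration of the idea, not NiMARE's 3-D implementation:

```python
import numpy as np

def mkda_map_1d(coords, length, r):
    """Binary 'spheres' (intervals) of radius r; overlaps stay at 1 (MKDA-style)."""
    ma_map = np.zeros(length, dtype=int)
    positions = np.arange(length)
    for c in coords:
        ma_map[np.abs(positions - c) <= r] = 1  # assignment, not addition
    return ma_map

# Two nearby foci: the intervals [6, 14] and [10, 18] overlap on [10, 14],
# but overlapping positions remain 1 rather than summing to 2.
mkda = mkda_map_1d(coords=[10, 14], length=30, r=4)
```

Using assignment rather than addition inside the loop is what makes the resulting map binary regardless of how close the coordinates are.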
class KDAKernel(r=6, value=1)

Generate KDA modeled activation images from coordinates.

Parameters:
- r (int, optional) – Sphere radius, in mm. Default is 6.
- value (int, optional) – Value assigned to voxels within each sphere. Default is 1.

Methods

- get_params(self[, deep]) – Get parameters for this estimator.
- load(filename[, compressed]) – Load a pickled class instance from file.
- save(self, filename[, compress]) – Pickle the class instance to the provided file.
- set_params(self, **params) – Set the parameters of this estimator.
- transform(self, dataset[, mask, masked]) – Generate KDA modeled activation images for each Contrast in dataset.
get_params(self, deep=True)

Get parameters for this estimator.

Parameters:
- deep (bool, optional) – If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns:
- params (mapping of string to any) – Parameter names mapped to their values.
classmethod load(filename, compressed=True)

Load a pickled class instance from file.

Parameters:
- filename (str) – Name of the file containing the pickled object.
- compressed (bool, optional) – If True, the file is assumed to be compressed. Default is True.

Returns:
- obj (class object) – Loaded class object.
save(self, filename, compress=True)

Pickle the class instance to the provided file.

Parameters:
- filename (str) – Name of the output file.
- compress (bool, optional) – If True, compress the output file. Default is True.
set_params(self, **params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter>, so that it's possible to update each component of a nested object.

Returns: self
transform(self, dataset, mask=None, masked=False)

Generate KDA modeled activation images for each Contrast in dataset. Differs from MKDA images in that binary spheres are summed together in the map (i.e., the resulting image is not binary if coordinates are close to one another).

Parameters:
- dataset (nimare.dataset.Dataset or pandas.DataFrame) – Dataset for which to make images. Can be a DataFrame if necessary.
- mask (img_like, optional) – Only used if dataset is a DataFrame.
- masked (bool, optional) – Return an array instead of a niimg.

Returns:
- imgs (list of nibabel.Nifti1Image) – A list of modeled activation images (one for each of the Contrasts in the input dataset).
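The contrast with MKDA (spheres summed rather than clipped to 1) can be shown with the same kind of 1-D sketch; again, this illustrates the idea rather than reproducing NiMARE's 3-D implementation:

```python
import numpy as np

def kda_map_1d(coords, length, r):
    """Binary 'spheres' of radius r summed together (KDA-style)."""
    ma_map = np.zeros(length, dtype=int)
    positions = np.arange(length)
    for c in coords:
        ma_map += (np.abs(positions - c) <= r).astype(int)  # addition, not assignment
    return ma_map

# Two nearby foci: positions covered by both intervals sum to 2,
# so the resulting map is no longer binary.
kda = kda_map_1d(coords=[10, 14], length=30, r=4)
```

The only change from the MKDA sketch is `+=` in place of assignment, which is exactly the distinction this docstring draws.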
class Peaks2MapsKernel(resample_to_mask=True)

Generate peaks2maps modeled activation images from coordinates.

Methods

- get_params(self[, deep]) – Get parameters for this estimator.
- load(filename[, compressed]) – Load a pickled class instance from file.
- save(self, filename[, compress]) – Pickle the class instance to the provided file.
- set_params(self, **params) – Set the parameters of this estimator.
- transform(self, dataset[, mask, masked]) – Generate peaks2maps modeled activation images for each Contrast in dataset.
get_params(self, deep=True)

Get parameters for this estimator.

Parameters:
- deep (bool, optional) – If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns:
- params (mapping of string to any) – Parameter names mapped to their values.
classmethod load(filename, compressed=True)

Load a pickled class instance from file.

Parameters:
- filename (str) – Name of the file containing the pickled object.
- compressed (bool, optional) – If True, the file is assumed to be compressed. Default is True.

Returns:
- obj (class object) – Loaded class object.
save(self, filename, compress=True)

Pickle the class instance to the provided file.

Parameters:
- filename (str) – Name of the output file.
- compress (bool, optional) – If True, compress the output file. Default is True.
set_params(self, **params)

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form <component>__<parameter>, so that it's possible to update each component of a nested object.

Returns: self
transform(self, dataset, mask=None, masked=False)

Generate peaks2maps modeled activation images for each Contrast in dataset.

Parameters:
- dataset (nimare.dataset.Dataset or pandas.DataFrame) – Dataset for which to make images. Can be a DataFrame if necessary.
- mask (img_like, optional) – Only used if dataset is a DataFrame.
- masked (bool, optional) – Whether to mask the maps generated by peaks2maps.

Returns:
- imgs (list of nibabel.Nifti1Image) – A list of modeled activation images (one for each of the Contrasts in the input dataset).