interfaces.afni.preprocess

AlignEpiAnatPy

Link to code

Wraps the executable command align_epi_anat.py.

Align EPI to anatomical datasets or vice versa. This Python script computes the alignment between two datasets, typically an EPI and an anatomical structural dataset, and applies the resulting transformation to one or the other to bring them into alignment.

This script computes the transforms needed to align EPI and anatomical datasets using a cost function designed for this purpose. The script combines multiple transformations, thereby minimizing the amount of interpolation applied to the data.

Basic Usage:
align_epi_anat.py -anat anat+orig -epi epi+orig -epi_base 5

The user must provide EPI and anatomical datasets and specify the EPI sub-brick to use as a base in the alignment.

Internally, the script always aligns the anatomical to the EPI dataset, and the resulting transformation is saved to a 1D file. As a user option, the inverse of this transformation may be applied to the EPI dataset in order to align it to the anatomical data instead.

This program generates several kinds of output in the form of datasets and transformation matrices which can be applied to other datasets if needed. Time-series volume registration, oblique data transformations and Talairach (standard template) transformations will be combined as needed and requested (with options to turn on and off each of the steps) in order to create the aligned datasets.

For complete details, see the align_epi_anat.py Documentation.
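The "combines multiple transformations" point above can be illustrated with plain affine matrices (a conceptual numpy sketch with made-up values, not AFNI code): composing the matrices first means the data are resampled, and hence interpolated, only once.

```python
import numpy as np

# Two hypothetical 4x4 affine transforms: a volume-registration shift
# and an anatomical alignment rotation (illustrative values only).
volreg = np.array([[1.0, 0.0, 0.0, 2.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0, -1.5],
                   [0.0, 0.0, 0.0, 1.0]])
theta = np.deg2rad(5.0)
align = np.array([[np.cos(theta), -np.sin(theta), 0.0, 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0, 0.0],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])

# Compose once, then apply the single combined matrix to the data.
# Resampling with `combined` interpolates the image one time instead of
# once per step, which is the interpolation-minimizing idea above.
combined = align @ volreg

point = np.array([10.0, 20.0, 30.0, 1.0])
assert np.allclose(combined @ point, align @ (volreg @ point))
```

The same principle lets the script fold volume registration, oblique, and Talairach transforms into one resampling step.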

Examples

>>> from nipype.interfaces import afni
>>> al_ea = afni.AlignEpiAnatPy()
>>> al_ea.inputs.anat = "structural.nii"
>>> al_ea.inputs.in_file = "functional.nii"
>>> al_ea.inputs.epi_base = 0
>>> al_ea.inputs.epi_strip = '3dAutomask'
>>> al_ea.inputs.volreg = 'off'
>>> al_ea.inputs.tshift = 'off'
>>> al_ea.inputs.save_skullstrip = True
>>> al_ea.cmdline # doctest: +ELLIPSIS
'python2 ...align_epi_anat.py -anat structural.nii -epi_base 0 -epi_strip 3dAutomask -epi functional.nii -save_skullstrip -suffix _al -tshift off -volreg off'
>>> res = al_ea.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        EPI dataset to align
        argument: ``-epi %s``
anat: (a pathlike object or string representing an existing file)
        name of structural dataset
        argument: ``-anat %s``
epi_base: (a long integer >= 0 or 'mean' or 'median' or 'max')
        the EPI base used in alignment; should be one of
        (0/mean/median/max/subbrick#)
        argument: ``-epi_base %s``

[Optional]
anat2epi: (a boolean)
        align anatomical to EPI dataset (default)
        argument: ``-anat2epi``
epi2anat: (a boolean)
        align EPI to anatomical dataset
        argument: ``-epi2anat``
save_skullstrip: (a boolean)
        save skull-stripped (not aligned) volume
        argument: ``-save_skullstrip``
suffix: (a unicode string, nipype default value: _al)
        append suffix to the original anat/epi dataset to use in the
        resulting dataset names (default is "_al")
        argument: ``-suffix %s``
epi_strip: ('3dSkullStrip' or '3dAutomask' or 'None')
        method to mask brain in EPI data; should be one of
        [3dSkullStrip]/3dAutomask/None
        argument: ``-epi_strip %s``
volreg: ('on' or 'off', nipype default value: on)
        do volume registration on EPI dataset before alignment; should be
        'on' or 'off', defaults to 'on'
        argument: ``-volreg %s``
tshift: ('on' or 'off', nipype default value: on)
        do time shifting of EPI dataset before alignment; should be 'on' or
        'off', defaults to 'on'
        argument: ``-tshift %s``
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
py27_path: (a pathlike object or string representing an existing file
          or 'python2', nipype default value: python2)
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

anat_al_orig: (a pathlike object or string representing a file)
        A version of the anatomy that is aligned to the EPI
epi_al_orig: (a pathlike object or string representing a file)
        A version of the EPI dataset aligned to the anatomy
epi_tlrc_al: (a pathlike object or string representing a file)
        A version of the EPI dataset aligned to a standard template
anat_al_mat: (a pathlike object or string representing a file)
        matrix to align anatomy to the EPI
epi_al_mat: (a pathlike object or string representing a file)
        matrix to align EPI to anatomy
epi_vr_al_mat: (a pathlike object or string representing a file)
        matrix to volume register EPI
epi_reg_al_mat: (a pathlike object or string representing a file)
        matrix to volume register and align EPI to anatomy
epi_al_tlrc_mat: (a pathlike object or string representing a file)
        matrix to volume register and align EPI to anatomy and put into
        standard space
epi_vr_motion: (a pathlike object or string representing a file)
        motion parameters from EPI time-series registration ('tsh' included
        in name if slice-timing correction is also included).
skullstrip: (a pathlike object or string representing a file)
        skull-stripped (not aligned) volume


Allineate

Link to code

Wraps the executable command 3dAllineate.

Program to align one dataset (the ‘source’) to a base dataset

For complete details, see the 3dAllineate Documentation.

Examples

>>> from nipype.interfaces import afni
>>> allineate = afni.Allineate()
>>> allineate.inputs.in_file = 'functional.nii'
>>> allineate.inputs.out_file = 'functional_allineate.nii'
>>> allineate.inputs.in_matrix = 'cmatrix.mat'
>>> allineate.cmdline
'3dAllineate -source functional.nii -prefix functional_allineate.nii -1Dmatrix_apply cmatrix.mat'
>>> res = allineate.run()  # doctest: +SKIP
>>> allineate = afni.Allineate()
>>> allineate.inputs.in_file = 'functional.nii'
>>> allineate.inputs.reference = 'structural.nii'
>>> allineate.inputs.allcostx = 'out.allcostX.txt'
>>> allineate.cmdline
'3dAllineate -source functional.nii -base structural.nii -allcostx |& tee out.allcostX.txt'
>>> res = allineate.run()  # doctest: +SKIP
>>> allineate = afni.Allineate()
>>> allineate.inputs.in_file = 'functional.nii'
>>> allineate.inputs.reference = 'structural.nii'
>>> allineate.inputs.nwarp_fixmot = ['X', 'Y']
>>> allineate.cmdline
'3dAllineate -source functional.nii -nwarp_fixmotX -nwarp_fixmotY -prefix functional_allineate -base structural.nii'
>>> res = allineate.run()  # doctest: +SKIP
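The intensity-based matching behind options such as ``-cost mutualinfo`` can be sketched with a joint histogram. This is a simplified numpy illustration of the idea only, not 3dAllineate's implementation:

```python
import numpy as np

def mutual_info(x, y, bins=32):
    """Histogram-based mutual information between two intensity arrays."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
base = rng.normal(size=(16, 16, 16))                        # toy "base" image
aligned = 2.0 * base + 0.1 * rng.normal(size=base.shape)    # related image
unrelated = rng.normal(size=base.shape)                     # mismatched image

# A good alignment yields higher mutual information than a mismatch;
# the optimizer searches warp parameters that maximize such a score.
assert mutual_info(base, aligned) > mutual_info(base, unrelated)
```

The other cost functionals (leastsq, hellinger, corratio variants) plug different similarity measures into the same optimization loop.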

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dAllineate
        argument: ``-source %s``

[Optional]
reference: (a pathlike object or string representing an existing
          file)
        file to be used as reference; if not given, the reference will be
        the first volume of in_file.
        argument: ``-base %s``
out_file: (a pathlike object or string representing a file)
        output file from 3dAllineate
        argument: ``-prefix %s``
        mutually_exclusive: allcostx
out_param_file: (a pathlike object or string representing a file)
        Save the warp parameters in ASCII (.1D) format.
        argument: ``-1Dparam_save %s``
        mutually_exclusive: in_param_file, allcostx
in_param_file: (a pathlike object or string representing an existing
          file)
        Read warp parameters from file and apply them to the source dataset,
        and produce a new dataset
        argument: ``-1Dparam_apply %s``
        mutually_exclusive: out_param_file
out_matrix: (a pathlike object or string representing a file)
        Save the transformation matrix for each volume.
        argument: ``-1Dmatrix_save %s``
        mutually_exclusive: in_matrix, allcostx
in_matrix: (a pathlike object or string representing a file)
        matrix to align input file
        argument: ``-1Dmatrix_apply %s``, position: -3
        mutually_exclusive: out_matrix
overwrite: (a boolean)
        overwrite output file if it already exists
        argument: ``-overwrite``
allcostx: (a pathlike object or string representing a file)
        Compute and print ALL available cost functionals for the un-warped
        inputs AND THEN QUIT. If you use this option, none of the other
        expected outputs will be produced.
        argument: ``-allcostx |& tee %s``, position: -1
        mutually_exclusive: out_file, out_matrix, out_param_file,
          out_weight_file
cost: ('leastsq' or 'ls' or 'mutualinfo' or 'mi' or 'corratio_mul' or
          'crM' or 'norm_mutualinfo' or 'nmi' or 'hellinger' or 'hel' or
          'corratio_add' or 'crA' or 'corratio_uns' or 'crU')
        Defines the 'cost' function that defines the matching between the
        source and the base
        argument: ``-cost %s``
interpolation: ('nearestneighbour' or 'linear' or 'cubic' or
          'quintic')
        Defines interpolation method to use during matching
        argument: ``-interp %s``
final_interpolation: ('nearestneighbour' or 'linear' or 'cubic' or
          'quintic' or 'wsinc5')
        Defines interpolation method used to create the output dataset
        argument: ``-final %s``
nmatch: (an integer (int or long))
        Use at most n scattered points to match the datasets.
        argument: ``-nmatch %d``
no_pad: (a boolean)
        Do not use zero-padding on the base image.
        argument: ``-nopad``
zclip: (a boolean)
        Replace negative values in the input datasets (source & base) with
        zero.
        argument: ``-zclip``
convergence: (a float)
        Convergence test in millimeters (default 0.05mm).
        argument: ``-conv %f``
usetemp: (a boolean)
        temporary file use
        argument: ``-usetemp``
check: (a list of items which are 'leastsq' or 'ls' or 'mutualinfo'
          or 'mi' or 'corratio_mul' or 'crM' or 'norm_mutualinfo' or 'nmi'
          or 'hellinger' or 'hel' or 'corratio_add' or 'crA' or
          'corratio_uns' or 'crU')
        After cost functional optimization is done, start at the final
        parameters and re-optimize using these new cost functions. If the
        results are too different, a warning message will be printed.
        However, the final parameters from the original optimization will be
        used to create the output dataset.
        argument: ``-check %s``
one_pass: (a boolean)
        Use only the refining pass -- do not try a coarse resolution pass
        first. Useful if you know that only small amounts of image alignment
        are needed.
        argument: ``-onepass``
two_pass: (a boolean)
        Use a two pass alignment strategy for all volumes, searching for a
        large rotation+shift and then refining the alignment.
        argument: ``-twopass``
two_blur: (a float)
        Set the blurring radius for the first pass in mm.
        argument: ``-twoblur %f``
two_first: (a boolean)
        Use -twopass on the first image to be registered, and then on all
        subsequent images from the source dataset, use results from the
        first image's coarse pass to start the fine pass.
        argument: ``-twofirst``
two_best: (an integer (int or long))
        In the coarse pass, use the best 'bb' set of initial points to
        search for the starting point for the fine pass. If bb==0, then no
        search is made for the best starting point, and the identity
        transformation is used as the starting point. [Default=5; min=0
        max=11]
        argument: ``-twobest %d``
fine_blur: (a float)
        Set the blurring radius to use in the fine resolution pass to 'x'
        mm. A small amount (1-2 mm?) of blurring at the fine step may help
        with convergence, if there is some problem, especially if the base
        volume is very noisy. [Default == 0 mm = no blurring at the final
        alignment pass]
        argument: ``-fineblur %f``
center_of_mass: (a unicode string)
        Use the center-of-mass calculation to bracket the shifts.
        argument: ``-cmass%s``
autoweight: (a unicode string)
        Compute a weight function using the 3dAutomask algorithm plus some
        blurring of the base image.
        argument: ``-autoweight%s``
automask: (an integer (int or long))
        Compute a mask function, set a value for dilation or 0.
        argument: ``-automask+%d``
autobox: (a boolean)
        Expand the -automask function to enclose a rectangular box that
        holds the irregular mask.
        argument: ``-autobox``
nomask: (a boolean)
        Don't compute the autoweight/mask; if -weight is not also used, then
        every voxel will be counted equally.
        argument: ``-nomask``
weight_file: (a pathlike object or string representing an existing
          file)
        Set the weighting for each voxel in the base dataset; larger
        weights mean that voxel counts more in the cost function. Must be
        defined on the same grid as the base dataset.
        argument: ``-weight %s``
weight: (a pathlike object or string representing an existing file or
          a float)
        Set the weighting for each voxel in the base dataset; larger
        weights mean that voxel counts more in the cost function. If an
        image file is given, the volume must be defined on the same grid as
        the base dataset.
        argument: ``-weight %s``
out_weight_file: (a pathlike object or string representing a file)
        Write the weight volume to disk as a dataset
        argument: ``-wtprefix %s``
        mutually_exclusive: allcostx
source_mask: (a pathlike object or string representing an existing
          file)
        mask the input dataset
        argument: ``-source_mask %s``
source_automask: (an integer (int or long))
        Automatically mask the source dataset with dilation or 0.
        argument: ``-source_automask+%d``
warp_type: ('shift_only' or 'shift_rotate' or 'shift_rotate_scale' or
          'affine_general')
        Set the warp type.
        argument: ``-warp %s``
warpfreeze: (a boolean)
        Freeze the non-rigid body parameters after first volume.
        argument: ``-warpfreeze``
replacebase: (a boolean)
        If the source has more than one volume, then after the first volume
        is aligned to the base, the aligned first volume is used as the
        base for the remaining volumes.
        argument: ``-replacebase``
replacemeth: ('leastsq' or 'ls' or 'mutualinfo' or 'mi' or
          'corratio_mul' or 'crM' or 'norm_mutualinfo' or 'nmi' or
          'hellinger' or 'hel' or 'corratio_add' or 'crA' or 'corratio_uns'
          or 'crU')
        After first volume is aligned, switch method for later volumes. For
        use with '-replacebase'.
        argument: ``-replacemeth %s``
epi: (a boolean)
        Treat the source dataset as being composed of warped EPI slices, and
        the base as comprising anatomically 'true' images. Only phase-
        encoding direction image shearing and scaling will be allowed with
        this option.
        argument: ``-EPI``
maxrot: (a float)
        Maximum allowed rotation in degrees.
        argument: ``-maxrot %f``
maxshf: (a float)
        Maximum allowed shift in mm.
        argument: ``-maxshf %f``
maxscl: (a float)
        Maximum allowed scaling factor.
        argument: ``-maxscl %f``
maxshr: (a float)
        Maximum allowed shearing factor.
        argument: ``-maxshr %f``
master: (a pathlike object or string representing an existing file)
        Write the output dataset on the same grid as this file.
        argument: ``-master %s``
newgrid: (a float)
        Write the output dataset using isotropic grid spacing in mm.
        argument: ``-newgrid %f``
nwarp: ('bilinear' or 'cubic' or 'quintic' or 'heptic' or 'nonic' or
          'poly3' or 'poly5' or 'poly7' or 'poly9')
        Experimental nonlinear warping: bilinear or Legendre polynomials.
        argument: ``-nwarp %s``
nwarp_fixmot: (a list of items which are 'X' or 'Y' or 'Z' or 'I' or
          'J' or 'K')
        To fix motion along directions.
        argument: ``-nwarp_fixmot%s...``
nwarp_fixdep: (a list of items which are 'X' or 'Y' or 'Z' or 'I' or
          'J' or 'K')
        To fix non-linear warp dependency along directions.
        argument: ``-nwarp_fixdep%s...``
verbose: (a boolean)
        Print out verbose progress reports.
        argument: ``-verb``
quiet: (a boolean)
        Don't print out verbose progress reports.
        argument: ``-quiet``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output image file name
out_matrix: (a pathlike object or string representing an existing
          file)
        matrix to align input file
out_param_file: (a pathlike object or string representing an existing
          file)
        warp parameters
out_weight_file: (a pathlike object or string representing an
          existing file)
        weight volume
allcostx: (a pathlike object or string representing a file)
        Compute and print ALL available cost functionals for the un-warped
        inputs


AutoTLRC

Link to code

Wraps the executable command @auto_tlrc.

A minimal wrapper for the @auto_tlrc script. The only option currently supported is no_ss. For complete details, see the 3dQwarp Documentation.

Examples

>>> from nipype.interfaces import afni
>>> autoTLRC = afni.AutoTLRC()
>>> autoTLRC.inputs.in_file = 'structural.nii'
>>> autoTLRC.inputs.no_ss = True
>>> autoTLRC.inputs.base = "TT_N27+tlrc"
>>> autoTLRC.cmdline
'@auto_tlrc -base TT_N27+tlrc -input structural.nii -no_ss'
>>> res = autoTLRC.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        Original anatomical volume (+orig). The skull is removed by this
        script unless instructed otherwise (-no_ss).
        argument: ``-input %s``
base: (a unicode string)
        Reference anatomical volume. Usually this volume is in some
        standard space like TLRC or MNI space, with an AFNI dataset view of
        (+tlrc). Preferably, this reference volume should have had the
        skull removed, but that is not mandatory. AFNI's distribution
        contains several templates; for a longer list, use "whereami
        -show_templates".
        TT_N27+tlrc --> Single subject, skull-stripped volume. This volume
        is also known as N27_SurfVol_NoSkull+tlrc elsewhere in AFNI and
        SUMA land. (www.loni.ucla.edu, www.bic.mni.mcgill.ca) This template
        has a full set of FreeSurfer (surfer.nmr.mgh.harvard.edu) surface
        models that can be used in SUMA. For details, see the
        Talairach-related link: https://afni.nimh.nih.gov/afni/suma
        TT_icbm452+tlrc --> Average volume of 452 normal brains. Skull
        stripped. (www.loni.ucla.edu)
        TT_avg152T1+tlrc --> Average volume of 152 normal brains. Skull
        stripped. (www.bic.mni.mcgill.ca)
        TT_EPI+tlrc --> EPI template from spm2, masked as TT_avg152T1.
        TT_avg152 and TT_EPI volume sources are from SPM's distribution.
        (www.fil.ion.ucl.ac.uk/spm/)
        If you do not specify a path for the template, the script will
        attempt to locate the template in AFNI's binaries directory.
        NOTE: These datasets have been slightly modified from their
        original size to match the standard TLRC dimensions (Jean Talairach
        and Pierre Tournoux, Co-Planar Stereotaxic Atlas of the Human
        Brain, Thieme Medical Publishers, New York, 1988). That was done
        for internal consistency in AFNI. You may use the original form of
        these volumes if you choose, but your TLRC coordinates will not be
        consistent with AFNI's TLRC database (San Antonio Talairach Daemon
        database), for example.
        argument: ``-base %s``

[Optional]
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
no_ss: (a boolean)
        Do not strip the skull of the input dataset (because the skull has
        already been removed, or because the template still has the skull).
        NOTE: The -no_ss option is not all that optional. Here is a table
        of when you should and should not use -no_ss:

                                Template WITH skull   Template WITHOUT skull
        Dset. WITH skull        -no_ss                xxx
        Dset. WITHOUT skull     No Cigar              -no_ss

        Template means: Your template of choice
        Dset. means: Your anatomical dataset
        -no_ss means: Skull stripping should not be attempted on Dset
        xxx means: Don't put anything, the script will strip Dset
        No Cigar means: Don't try that combination, it makes no sense.
        argument: ``-no_ss``
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file


AutoTcorrelate

Link to code

Wraps the executable command 3dAutoTcorrelate.

Computes the correlation coefficient between the time series of each pair of voxels in the input dataset, and stores the output into a new anatomical bucket dataset [scaled to shorts to save memory space].

For complete details, see the 3dAutoTcorrelate Documentation.

Examples

>>> from nipype.interfaces import afni
>>> corr = afni.AutoTcorrelate()
>>> corr.inputs.in_file = 'functional.nii'
>>> corr.inputs.polort = -1
>>> corr.inputs.eta2 = True
>>> corr.inputs.mask = 'mask.nii'
>>> corr.inputs.mask_only_targets = True
>>> corr.cmdline  # doctest: +ELLIPSIS
'3dAutoTcorrelate -eta2 -mask mask.nii -mask_only_targets -prefix functional_similarity_matrix.1D -polort -1 functional.nii'
>>> res = corr.run()  # doctest: +SKIP
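The voxel-pair correlation that 3dAutoTcorrelate computes can be sketched in plain numpy. This is a conceptual illustration operating on a toy voxels-by-timepoints array, not an AFNI dataset:

```python
import numpy as np

rng = np.random.default_rng(1)
n_voxels, n_timepoints = 5, 50
data = rng.normal(size=(n_voxels, n_timepoints))   # toy voxel time series

# Correlate every voxel's time series with every other voxel's,
# yielding an n_voxels x n_voxels matrix -- the same quantity
# 3dAutoTcorrelate stores (scaled to shorts) per voxel pair.
corr = np.corrcoef(data)

assert corr.shape == (n_voxels, n_voxels)
assert np.allclose(np.diag(corr), 1.0)   # each voxel matches itself
```

The ``mask`` / ``mask_only_targets`` inputs simply restrict which rows and columns of this matrix are computed.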

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        timeseries x space (volume or surface) file
        argument: ``%s``, position: -1

[Optional]
polort: (an integer (int or long))
        Remove polynomial trend of order m, or -1 for no detrending
        argument: ``-polort %d``
eta2: (a boolean)
        eta^2 similarity
        argument: ``-eta2``
mask: (a pathlike object or string representing an existing file)
        mask of voxels
        argument: ``-mask %s``
mask_only_targets: (a boolean)
        use mask only on targets voxels
        argument: ``-mask_only_targets``
        mutually_exclusive: mask_source
mask_source: (a pathlike object or string representing an existing
          file)
        mask for source voxels
        argument: ``-mask_source %s``
        mutually_exclusive: mask_only_targets
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file


Automask

Link to code

Wraps the executable command 3dAutomask.

Create a brain-only mask of the image using AFNI 3dAutomask command

For complete details, see the 3dAutomask Documentation.

Examples

>>> from nipype.interfaces import afni
>>> automask = afni.Automask()
>>> automask.inputs.in_file = 'functional.nii'
>>> automask.inputs.dilate = 1
>>> automask.inputs.outputtype = 'NIFTI'
>>> automask.cmdline  # doctest: +ELLIPSIS
'3dAutomask -apply_prefix functional_masked.nii -dilate 1 -prefix functional_mask.nii functional.nii'
>>> res = automask.run()  # doctest: +SKIP
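Conceptually, 3dAutomask thresholds the volume at a clip level controlled by ``clfrac`` and can then dilate or erode the result. A crude numpy caricature of the threshold-then-dilate idea (illustrative only; AFNI's actual clip-level estimate and clustering are more sophisticated):

```python
import numpy as np

def toy_automask(volume, clfrac=0.5):
    """Keep voxels above clfrac * mean intensity -- a crude stand-in
    for 3dAutomask's clip-level threshold (illustrative only)."""
    return volume > clfrac * volume.mean()

def dilate_once(mask):
    """One step of 6-neighbour binary dilation, like ``-dilate 1``."""
    out = mask.copy()
    for axis in range(mask.ndim):
        out |= np.roll(mask, 1, axis) | np.roll(mask, -1, axis)
    return out

vol = np.zeros((8, 8, 8))
vol[2:6, 2:6, 2:6] = 100.0            # bright "brain" block in a dark field
mask = toy_automask(vol)
assert mask.sum() == 4 * 4 * 4        # only the bright block survives
assert dilate_once(mask).sum() > mask.sum()
```

A smaller ``clfrac`` lowers the threshold and therefore grows the mask, matching the "small value will tend to make the mask larger" note above.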

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dAutomask
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
brain_file: (a pathlike object or string representing a file)
        output file from 3dAutomask
        argument: ``-apply_prefix %s``
clfrac: (a float)
        sets the clip level fraction (must be 0.1-0.9). A small value will
        tend to make the mask larger [default = 0.5].
        argument: ``-clfrac %s``
dilate: (an integer (int or long))
        dilate the mask outwards
        argument: ``-dilate %s``
erode: (an integer (int or long))
        erode the mask inwards
        argument: ``-erode %s``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        mask file
brain_file: (a pathlike object or string representing an existing
          file)
        brain file (skull stripped)


Bandpass

Link to code

Wraps the executable command 3dBandpass.

Program to lowpass and/or highpass each voxel time series in a dataset, offering more/different options than Fourier

For complete details, see the 3dBandpass Documentation.

Examples

>>> from nipype.interfaces import afni
>>> from nipype.testing import example_data
>>> bandpass = afni.Bandpass()
>>> bandpass.inputs.in_file = 'functional.nii'
>>> bandpass.inputs.highpass = 0.005
>>> bandpass.inputs.lowpass = 0.1
>>> bandpass.cmdline
'3dBandpass -prefix functional_bp 0.005000 0.100000 functional.nii'
>>> res = bandpass.run()  # doctest: +SKIP
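Per voxel, the core operation is an FFT-based band-pass: zero the frequency bins outside [highpass, lowpass] and invert. A minimal numpy sketch of that idea, ignoring 3dBandpass's detrending, despiking, and other refinements:

```python
import numpy as np

def bandpass(ts, tr, highpass=0.005, lowpass=0.1):
    """Zero FFT bins outside [highpass, lowpass] Hz and invert."""
    freqs = np.fft.rfftfreq(len(ts), d=tr)       # bin frequencies in Hz
    spectrum = np.fft.rfft(ts)
    spectrum[(freqs < highpass) | (freqs > lowpass)] = 0.0
    return np.fft.irfft(spectrum, n=len(ts))

tr = 2.0                               # seconds per volume
t = np.arange(200) * tr
slow = np.sin(2 * np.pi * 0.01 * t)    # in-band component (0.01 Hz)
fast = np.sin(2 * np.pi * 0.2 * t)     # out-of-band component (0.2 Hz)
filtered = bandpass(slow + fast, tr)

# The in-band component survives; the out-of-band one is removed.
assert np.corrcoef(filtered, slow)[0, 1] > 0.99
```

The ``tr`` input matters because it fixes the mapping from FFT bins to Hz, which is why 3dBandpass lets you override the dataset header's TR with ``-dt``.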

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dBandpass
        argument: ``%s``, position: -1
lowpass: (a float)
        lowpass cutoff frequency (in Hz)
        argument: ``%f``, position: -2
highpass: (a float)
        highpass cutoff frequency (in Hz)
        argument: ``%f``, position: -3

[Optional]
out_file: (a pathlike object or string representing a file)
        output file from 3dBandpass
        argument: ``-prefix %s``, position: 1
mask: (a pathlike object or string representing an existing file)
        mask file
        argument: ``-mask %s``, position: 2
despike: (a boolean)
        Despike each time series before other processing. Hopefully, you
        don't actually need to do this, which is why it is optional.
        argument: ``-despike``
orthogonalize_file: (a list of items which are a pathlike object or
          string representing an existing file)
        Also orthogonalize input to columns in f.1D. Multiple '-ort' options
        are allowed.
        argument: ``-ort %s``
orthogonalize_dset: (a pathlike object or string representing an
          existing file)
        Orthogonalize each voxel to the corresponding voxel time series in
        dataset 'fset', which must have the same spatial and temporal grid
        structure as the main input dataset. At present, only one '-dsort'
        option is allowed.
        argument: ``-dsort %s``
no_detrend: (a boolean)
        Skip the quadratic detrending of the input that occurs before the
        FFT-based bandpassing. You would only want to do this if the dataset
        had been detrended already in some other program.
        argument: ``-nodetrend``
tr: (a float)
        Set time step (TR) in sec [default=from dataset header].
        argument: ``-dt %f``
nfft: (an integer (int or long))
        Set the FFT length [must be a legal value].
        argument: ``-nfft %d``
normalize: (a boolean)
        Make all output time series have L2 norm = 1 (i.e., sum of squares =
        1).
        argument: ``-norm``
automask: (a boolean)
        Create a mask from the input dataset.
        argument: ``-automask``
blur: (a float)
        Blur (inside the mask only) with a filter width (FWHM) of 'fff'
        millimeters.
        argument: ``-blur %f``
localPV: (a float)
        Replace each vector by the local Principal Vector (AKA first
        singular vector) from a neighborhood of radius 'rrr' millimeters.
        Note that the PV time series is L2 normalized. This option is mostly
        for Bob Cox to have fun with.
        argument: ``-localPV %f``
notrans: (a boolean)
        Don't check for initial positive transients in the data. The test is
        a little slow, so skipping it is OK, if you KNOW the data time
        series are transient-free.
        argument: ``-notrans``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file


BlurInMask

Link to code

Wraps the executable command 3dBlurInMask.

Blurs a dataset spatially inside a mask. That’s all. Experimental.

For complete details, see the 3dBlurInMask Documentation.

Examples

>>> from nipype.interfaces import afni
>>> bim = afni.BlurInMask()
>>> bim.inputs.in_file = 'functional.nii'
>>> bim.inputs.mask = 'mask.nii'
>>> bim.inputs.fwhm = 5.0
>>> bim.cmdline  # doctest: +ELLIPSIS
'3dBlurInMask -input functional.nii -FWHM 5.000000 -mask mask.nii -prefix functional_blur'
>>> res = bim.run()  # doctest: +SKIP
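The "blur only inside the mask" behaviour can be illustrated with a normalized masked convolution. This is a 1-D numpy caricature of the idea; 3dBlurInMask itself works on 3-D volumes with a Gaussian kernel of the requested FWHM:

```python
import numpy as np

def blur_in_mask(signal, mask, kernel):
    """Smooth ``signal`` where ``mask`` is True. Voxels outside the mask
    are zeroed in the output and excluded from the averaging, so values
    do not bleed across the mask edge."""
    masked = np.where(mask, signal, 0.0)
    num = np.convolve(masked, kernel, mode="same")
    den = np.convolve(mask.astype(float), kernel, mode="same")
    out = np.zeros_like(signal, dtype=float)
    inside = mask & (den > 0)
    out[inside] = num[inside] / den[inside]   # renormalize near the edge
    return out

signal = np.array([0.0, 0.0, 10.0, 20.0, 30.0, 0.0, 0.0])
mask = np.array([False, False, True, True, True, False, False])
kernel = np.array([0.25, 0.5, 0.25])
out = blur_in_mask(signal, mask, kernel)

assert np.all(out[~mask] == 0.0)    # outside the mask stays zero
assert abs(out[3] - 20.0) < 1e-9    # interior voxel: plain weighted average
```

The ``preserve`` input corresponds to keeping the original out-of-mask values instead of the zeros written here.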

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dBlurInMask
        argument: ``-input %s``, position: 1
fwhm: (a float)
        fwhm kernel size
        argument: ``-FWHM %f``

[Optional]
out_file: (a pathlike object or string representing a file)
        output to the file
        argument: ``-prefix %s``, position: -1
mask: (a pathlike object or string representing a file)
        Mask dataset, if desired. Blurring will occur only within the mask.
        Voxels NOT in the mask will be set to zero in the output.
        argument: ``-mask %s``
multimask: (a pathlike object or string representing a file)
        Multi-mask dataset -- each distinct nonzero value in dataset will be
        treated as a separate mask for blurring purposes.
        argument: ``-Mmask %s``
automask: (a boolean)
        Create an automask from the input dataset.
        argument: ``-automask``
preserve: (a boolean)
        Normally, voxels not in the mask will be set to zero in the output.
        If you want the original values in the dataset to be preserved in
        the output, use this option.
        argument: ``-preserve``
float_out: (a boolean)
        Save dataset as floats, no matter what the input data type is.
        argument: ``-float``
options: (a unicode string)
        options
        argument: ``%s``, position: 2
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

BlurToFWHM

Link to code

Wraps the executable command 3dBlurToFWHM.

Blurs a ‘master’ dataset until it reaches a specified FWHM smoothness (approximately).

For complete details, see the 3dBlurToFWHM Documentation

Examples

>>> from nipype.interfaces import afni
>>> blur = afni.preprocess.BlurToFWHM()
>>> blur.inputs.in_file = 'epi.nii'
>>> blur.inputs.fwhm = 2.5
>>> blur.cmdline  # doctest: +ELLIPSIS
'3dBlurToFWHM -FWHM 2.500000 -input epi.nii -prefix epi_afni'
>>> res = blur.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        The dataset that will be smoothed
        argument: ``-input %s``

[Optional]
automask: (a boolean)
        Create an automask from the input dataset.
        argument: ``-automask``
fwhm: (a float)
        Blur until the 3D FWHM reaches this value (in mm)
        argument: ``-FWHM %f``
fwhmxy: (a float)
        Blur until the 2D (x,y)-plane FWHM reaches this value (in mm)
        argument: ``-FWHMxy %f``
blurmaster: (a pathlike object or string representing an existing
          file)
        The dataset whose smoothness controls the process.
        argument: ``-blurmaster %s``
mask: (a pathlike object or string representing an existing file)
        Mask dataset, if desired. Voxels NOT in mask will be set to zero in
        output.
        argument: ``-mask %s``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

ClipLevel

Link to code

Wraps the executable command 3dClipLevel.

Estimates the value at which to clip the anatomical dataset so
that background regions are set to zero.

For complete details, see the 3dClipLevel Documentation.

Examples

>>> from nipype.interfaces.afni import preprocess
>>> cliplevel = preprocess.ClipLevel()
>>> cliplevel.inputs.in_file = 'anatomical.nii'
>>> cliplevel.cmdline
'3dClipLevel anatomical.nii'
>>> res = cliplevel.run()  # doctest: +SKIP
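
The clip level is derived from the intensity distribution of the dataset. The sketch below is an illustration of the general idea behind ``mfrac`` (repeatedly take ``mfrac`` times the median of the voxels above the current clip value until it stabilizes); it is not 3dClipLevel's exact algorithm:

```python
from statistics import median

def clip_level(values, mfrac=0.5, n_iter=10):
    """Illustrative median-based clip-level estimate (NOT AFNI's
    exact 3dClipLevel algorithm): iterate clip = mfrac * median of
    the supra-threshold positive voxels."""
    vals = [v for v in values if v > 0]
    clip = mfrac * median(vals)
    for _ in range(n_iter):
        above = [v for v in vals if v >= clip] or vals
        clip = mfrac * median(above)
    return clip
```

Lowering ``mfrac`` lowers the resulting clip value, so more low-intensity voxels survive the clip.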

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dClipLevel
        argument: ``%s``, position: -1

[Optional]
mfrac: (a float)
        Use the given fraction ff in place of the default 0.50 in the algorithm
        argument: ``-mfrac %s``, position: 2
doall: (a boolean)
        Apply the algorithm to each sub-brick separately.
        argument: ``-doall``, position: 3
        mutually_exclusive: grad
grad: (a pathlike object or string representing a file)
        Also compute a 'gradual' clip level as a function of voxel position,
        and output that to a dataset.
        argument: ``-grad %s``, position: 3
        mutually_exclusive: doall
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

clip_val: (a float)
        output

DegreeCentrality

Link to code

Wraps the executable command 3dDegreeCentrality.

Performs degree centrality on a dataset using a given maskfile via 3dDegreeCentrality

For complete details, see the 3dDegreeCentrality Documentation.

Examples

>>> from nipype.interfaces import afni
>>> degree = afni.DegreeCentrality()
>>> degree.inputs.in_file = 'functional.nii'
>>> degree.inputs.mask = 'mask.nii'
>>> degree.inputs.sparsity = 1 # keep the top one percent of connections
>>> degree.inputs.out_file = 'out.nii'
>>> degree.cmdline
'3dDegreeCentrality -mask mask.nii -prefix out.nii -sparsity 1.000000 functional.nii'
>>> res = degree.run()  # doctest: +SKIP
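
Degree centrality scores each voxel by how many other voxels its time series correlates with. A toy sketch of the binarized variant using ``-thresh``-style thresholding (the ``-sparsity`` option used above instead keeps the top percentage of connections):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def degree_centrality(series, thresh=0.6):
    """Binarized degree: for each voxel, count how many other voxels
    correlate above `thresh` (illustrative sketch, not 3dDegreeCentrality)."""
    n = len(series)
    return [sum(1 for j in range(n)
                if j != i and pearson(series[i], series[j]) > thresh)
            for i in range(n)]
```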

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dDegreeCentrality
        argument: ``%s``, position: -1

[Optional]
sparsity: (a float)
        only take the top percent of connections
        argument: ``-sparsity %f``
oned_file: (a unicode string)
        output filepath to text dump of correlation matrix
        argument: ``-out1D %s``
mask: (a pathlike object or string representing an existing file)
        mask file to mask input data
        argument: ``-mask %s``
thresh: (a float)
        threshold to exclude connections where corr <= thresh
        argument: ``-thresh %f``
polort: (an integer (int or long))
        argument: ``-polort %d``
autoclip: (a boolean)
        Clip off low-intensity regions in the dataset
        argument: ``-autoclip``
automask: (a boolean)
        Mask the dataset to target brain-only voxels
        argument: ``-automask``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

oned_file: (a pathlike object or string representing a file)
        The text output of the similarity matrix computed after thresholding
        with one-dimensional and ijk voxel indices, correlations, image
        extents, and affine matrix.
out_file: (a pathlike object or string representing an existing file)
        output file

Despike

Link to code

Wraps the executable command 3dDespike.

Removes ‘spikes’ from the 3D+time input dataset

For complete details, see the 3dDespike Documentation.

Examples

>>> from nipype.interfaces import afni
>>> despike = afni.Despike()
>>> despike.inputs.in_file = 'functional.nii'
>>> despike.cmdline
'3dDespike -prefix functional_despike functional.nii'
>>> res = despike.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dDespike
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

Detrend

Link to code

Wraps the executable command 3dDetrend.

This program removes components from voxel time series using linear least squares

For complete details, see the 3dDetrend Documentation.

Examples

>>> from nipype.interfaces import afni
>>> detrend = afni.Detrend()
>>> detrend.inputs.in_file = 'functional.nii'
>>> detrend.inputs.args = '-polort 2'
>>> detrend.inputs.outputtype = 'AFNI'
>>> detrend.cmdline
'3dDetrend -polort 2 -prefix functional_detrend functional.nii'
>>> res = detrend.run()  # doctest: +SKIP
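
With ``-polort 2``, the components removed are polynomials up to order 2. For the order-1 case the least-squares fit has a closed form; a self-contained sketch of that idea (illustrative only):

```python
def linear_detrend(ts):
    """Remove the best-fit line from a time series (the -polort 1 case)
    using closed-form least squares."""
    n = len(ts)
    t = list(range(n))
    tm, ym = sum(t) / n, sum(ts) / n
    slope = (sum((a - tm) * (b - ym) for a, b in zip(t, ts))
             / sum((a - tm) ** 2 for a in t))
    # Subtract the fitted line; the residuals have zero mean and no trend.
    return [y - (ym + slope * (a - tm)) for a, y in zip(t, ts)]
```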

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dDetrend
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

ECM

Link to code

Wraps the executable command 3dECM.

Performs eigenvector centrality on a dataset using a given maskfile via the 3dECM command

For complete details, see the 3dECM Documentation.

Examples

>>> from nipype.interfaces import afni
>>> ecm = afni.ECM()
>>> ecm.inputs.in_file = 'functional.nii'
>>> ecm.inputs.mask = 'mask.nii'
>>> ecm.inputs.sparsity = 0.1 # keep top 0.1% of connections
>>> ecm.inputs.out_file = 'out.nii'
>>> ecm.cmdline
'3dECM -mask mask.nii -prefix out.nii -sparsity 0.100000 functional.nii'
>>> res = ecm.run()  # doctest: +SKIP
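
Eigenvector centrality weights each voxel by the centrality of its neighbors, i.e. the dominant eigenvector of the (shifted and scaled, non-negative) similarity matrix. The ``eps`` and ``max_iter`` inputs control a power iteration; a toy sketch of that iteration (illustrative, tiny matrix, not 3dECM itself):

```python
from math import sqrt

def power_iteration(mat, eps=0.001, max_iter=1000):
    """Dominant eigenvector of a non-negative square matrix via power
    iteration; stops when l2|v_new - v_old| < eps (v is unit-normalized,
    mirroring the documented eps/max_iter stopping controls)."""
    n = len(mat)
    v = [1.0 / sqrt(n)] * n
    for _ in range(max_iter):
        w = [sum(mat[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sqrt(sum(x * x for x in w))
        w = [x / norm for x in w]
        if sqrt(sum((a - b) ** 2 for a, b in zip(w, v))) < eps:
            return w
        v = w
    return v
```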

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dECM
        argument: ``%s``, position: -1

[Optional]
sparsity: (a float)
        only take the top percent of connections
        argument: ``-sparsity %f``
full: (a boolean)
        Full power method; enables thresholding; automatically selected if
        -thresh or -sparsity are set
        argument: ``-full``
fecm: (a boolean)
        Fast centrality method; substantial speed increase but cannot
        accommodate thresholding; automatically selected if -thresh or
        -sparsity are not set
        argument: ``-fecm``
shift: (a float)
        shift correlation coefficients in similarity matrix to enforce non-
        negativity, s >= 0.0; default = 0.0 for -full, 1.0 for -fecm
        argument: ``-shift %f``
scale: (a float)
        scale correlation coefficients in similarity matrix to after
        shifting, x >= 0.0; default = 1.0 for -full, 0.5 for -fecm
        argument: ``-scale %f``
eps: (a float)
        sets the stopping criterion for the power iteration; l2|v_old -
        v_new| < eps*|v_old|; default = 0.001
        argument: ``-eps %f``
max_iter: (an integer (int or long))
        sets the maximum number of iterations to use in the power iteration;
        default = 1000
        argument: ``-max_iter %d``
memory: (a float)
        Limit memory consumption on system by setting the amount of GB to
        limit the algorithm to; default = 2GB
        argument: ``-memory %f``
mask: (a pathlike object or string representing an existing file)
        mask file to mask input data
        argument: ``-mask %s``
thresh: (a float)
        threshold to exclude connections where corr <= thresh
        argument: ``-thresh %f``
polort: (an integer (int or long))
        argument: ``-polort %d``
autoclip: (a boolean)
        Clip off low-intensity regions in the dataset
        argument: ``-autoclip``
automask: (a boolean)
        Mask the dataset to target brain-only voxels
        argument: ``-automask``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

Fim

Link to code

Wraps the executable command 3dfim+.

Program to calculate the cross-correlation of an ideal reference waveform with the measured FMRI time series for each voxel.

For complete details, see the 3dfim+ Documentation.

Examples

>>> from nipype.interfaces import afni
>>> fim = afni.Fim()
>>> fim.inputs.in_file = 'functional.nii'
>>> fim.inputs.ideal_file= 'seed.1D'
>>> fim.inputs.out_file = 'functional_corr.nii'
>>> fim.inputs.out = 'Correlation'
>>> fim.inputs.fim_thr = 0.0009
>>> fim.cmdline
'3dfim+ -input functional.nii -ideal_file seed.1D -fim_thr 0.000900 -out Correlation -bucket functional_corr.nii'
>>> res = fim.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dfim+
        argument: ``-input %s``, position: 1
ideal_file: (a pathlike object or string representing an existing
          file)
        ideal time series file name
        argument: ``-ideal_file %s``, position: 2

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-bucket %s``
fim_thr: (a float)
        fim internal mask threshold value
        argument: ``-fim_thr %f``, position: 3
out: (a unicode string)
        Flag to output the specified parameter
        argument: ``-out %s``, position: 4
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

Fourier

Link to code

Wraps the executable command 3dFourier.

Program to lowpass and/or highpass each voxel time series in a dataset, via the FFT

For complete details, see the 3dFourier Documentation.

Examples

>>> from nipype.interfaces import afni
>>> fourier = afni.Fourier()
>>> fourier.inputs.in_file = 'functional.nii'
>>> fourier.inputs.retrend = True
>>> fourier.inputs.highpass = 0.005
>>> fourier.inputs.lowpass = 0.1
>>> fourier.cmdline
'3dFourier -highpass 0.005000 -lowpass 0.100000 -prefix functional_fourier -retrend functional.nii'
>>> res = fourier.run()  # doctest: +SKIP
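
Band-pass filtering via the FFT amounts to zeroing frequency bins outside [highpass, lowpass]. A tiny pure-Python sketch using a naive DFT (illustrative only; 3dFourier's exact band-edge handling may differ):

```python
import cmath

def dft(x):
    """Naive O(n^2) discrete Fourier transform."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]

def bandpass(x, dt, highpass, lowpass):
    """Zero frequency bins outside [highpass, lowpass] Hz (sketch)."""
    n = len(x)
    X = dft(x)
    for k in range(n):
        f = min(k, n - k) / (n * dt)   # two-sided bin frequency in Hz
        if not (highpass <= f <= lowpass):
            X[k] = 0
    return idft(X)
```

Removing the DC bin (f = 0) is what strips the mean, which is why 3dFourier offers ``-retrend`` to restore the trend afterwards.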

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dFourier
        argument: ``%s``, position: -1
lowpass: (a float)
        lowpass
        argument: ``-lowpass %f``
highpass: (a float)
        highpass
        argument: ``-highpass %f``

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
retrend: (a boolean)
        Any mean and linear trend are removed before filtering. This will
        restore the trend after filtering.
        argument: ``-retrend``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

Hist

Link to code

Wraps the executable command 3dHist.

Computes a histogram of the input dataset

For complete details, see the 3dHist Documentation.

Examples

>>> from nipype.interfaces import afni
>>> hist = afni.Hist()
>>> hist.inputs.in_file = 'functional.nii'
>>> hist.cmdline
'3dHist -input functional.nii -prefix functional_hist'
>>> res = hist.run()  # doctest: +SKIP
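
The histogram is controlled by ``nbin``, ``min_value``, and ``max_value`` together (bin width = (max - min) / nbin). A minimal sketch of that binning rule (a hypothetical helper, not 3dHist's implementation):

```python
def histogram(values, nbin, vmin, vmax):
    """Fixed-range histogram in the spirit of -nbin/-min/-max (sketch)."""
    counts = [0] * nbin
    width = (vmax - vmin) / nbin
    for v in values:
        if vmin <= v < vmax:
            counts[min(int((v - vmin) / width), nbin - 1)] += 1
        elif v == vmax:          # fold the top edge into the last bin
            counts[-1] += 1
    return counts
```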

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dHist
        argument: ``-input %s``, position: 1

[Optional]
out_file: (a pathlike object or string representing a file)
        Write histogram to niml file with this prefix
        argument: ``-prefix %s``
showhist: (a boolean, nipype default value: False)
        write a text visual histogram
        argument: ``-showhist``
out_show: (a pathlike object or string representing a file)
        output image file name
        argument: ``> %s``, position: -1
mask: (a pathlike object or string representing an existing file)
        mask dataset restricting the voxels included in the histogram
        argument: ``-mask %s``
nbin: (an integer (int or long))
        number of bins
        argument: ``-nbin %d``
max_value: (a float)
        maximum intensity value
        argument: ``-max %f``
min_value: (a float)
        minimum intensity value
        argument: ``-min %f``
bin_width: (a float)
        bin width
        argument: ``-binwidth %f``
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file
out_show: (a pathlike object or string representing a file)
        output visual histogram

LFCD

Link to code

Wraps the executable command 3dLFCD.

Performs local functional connectivity density (LFCD) on a dataset using a given maskfile via the 3dLFCD command

For complete details, see the 3dLFCD Documentation.

Examples

>>> from nipype.interfaces import afni
>>> lfcd = afni.LFCD()
>>> lfcd.inputs.in_file = 'functional.nii'
>>> lfcd.inputs.mask = 'mask.nii'
>>> lfcd.inputs.thresh = 0.8 # keep all connections with corr >= 0.8
>>> lfcd.inputs.out_file = 'out.nii'
>>> lfcd.cmdline
'3dLFCD -mask mask.nii -prefix out.nii -thresh 0.800000 functional.nii'
>>> res = lfcd.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dLFCD
        argument: ``%s``, position: -1

[Optional]
mask: (a pathlike object or string representing an existing file)
        mask file to mask input data
        argument: ``-mask %s``
thresh: (a float)
        threshold to exclude connections where corr <= thresh
        argument: ``-thresh %f``
polort: (an integer (int or long))
        argument: ``-polort %d``
autoclip: (a boolean)
        Clip off low-intensity regions in the dataset
        argument: ``-autoclip``
automask: (a boolean)
        Mask the dataset to target brain-only voxels
        argument: ``-automask``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

Maskave

Link to code

Wraps the executable command 3dmaskave.

Computes average of all voxels in the input dataset which satisfy the criterion in the options list

For complete details, see the 3dmaskave Documentation.

Examples

>>> from nipype.interfaces import afni
>>> maskave = afni.Maskave()
>>> maskave.inputs.in_file = 'functional.nii'
>>> maskave.inputs.mask= 'seed_mask.nii'
>>> maskave.inputs.quiet= True
>>> maskave.cmdline  # doctest: +ELLIPSIS
'3dmaskave -mask seed_mask.nii -quiet functional.nii > functional_maskave.1D'
>>> res = maskave.run()  # doctest: +SKIP
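
The core computation is a plain average over the voxels where the mask is nonzero, repeated per sub-brick to produce the output 1D time series. A one-function sketch of that core (illustrative, not the 3dmaskave implementation):

```python
def maskave(values, mask):
    """Average of `values` at positions where `mask` is nonzero
    (the per-sub-brick core of a masked average)."""
    inside = [v for v, m in zip(values, mask) if m]
    return sum(inside) / len(inside)
```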

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dmaskave
        argument: ``%s``, position: -2

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``> %s``, position: -1
mask: (a pathlike object or string representing an existing file)
        mask file; only voxels within the mask are averaged
        argument: ``-mask %s``, position: 1
quiet: (a boolean)
        print only the averages, suppressing extra output
        argument: ``-quiet``, position: 2
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

Means

Link to code

Wraps the executable command 3dMean.

Takes the voxel-by-voxel mean of all input datasets using 3dMean

For complete details, see the 3dMean Documentation.

Examples

>>> from nipype.interfaces import afni
>>> means = afni.Means()
>>> means.inputs.in_file_a = 'im1.nii'
>>> means.inputs.in_file_b = 'im2.nii'
>>> means.inputs.out_file =  'output.nii'
>>> means.cmdline
'3dMean -prefix output.nii im1.nii im2.nii'
>>> res = means.run()  # doctest: +SKIP
>>> from nipype.interfaces import afni
>>> means = afni.Means()
>>> means.inputs.in_file_a = 'im1.nii'
>>> means.inputs.out_file =  'output.nii'
>>> means.inputs.datum = 'short'
>>> means.cmdline
'3dMean -datum short -prefix output.nii im1.nii'
>>> res = means.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file_a: (a pathlike object or string representing an existing
          file)
        input file to 3dMean
        argument: ``%s``, position: -2

[Optional]
in_file_b: (a pathlike object or string representing an existing
          file)
        another input file to 3dMean
        argument: ``%s``, position: -1
datum: (a unicode string)
        Sets the data type of the output dataset
        argument: ``-datum %s``
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
scale: (a unicode string)
        scaling of output
        argument: ``-%sscale``
non_zero: (a boolean)
        use only non-zero values
        argument: ``-non_zero``
std_dev: (a boolean)
        calculate std dev
        argument: ``-stdev``
sqr: (a boolean)
        mean square instead of value
        argument: ``-sqr``
summ: (a boolean)
        take sum (not average)
        argument: ``-sum``
count: (a boolean)
        compute count of non-zero voxels
        argument: ``-count``
mask_inter: (a boolean)
        create intersection mask
        argument: ``-mask_inter``
mask_union: (a boolean)
        create union mask
        argument: ``-mask_union``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

OutlierCount

Link to code

Wraps the executable command 3dToutcount.

Calculates the number of ‘outliers’ at each time point of a 3D+time dataset.

For complete details, see the 3dToutcount Documentation

Examples

>>> from nipype.interfaces import afni
>>> toutcount = afni.OutlierCount()
>>> toutcount.inputs.in_file = 'functional.nii'
>>> toutcount.cmdline  # doctest: +ELLIPSIS
'3dToutcount -qthr 0.00100 functional.nii'
>>> res = toutcount.run()  # doctest: +SKIP
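
3dToutcount counts time points that sit far from a voxel's trend, measured in MAD (median absolute deviation) units, with the cutoff derived from ``-qthr``. A simplified sketch with a fixed cutoff (the ``alpha`` parameter here is illustrative and stands in for the ``-qthr``-derived threshold):

```python
from statistics import median

def count_outliers(ts, alpha=3.5):
    """Count points more than alpha*MAD from the median of the series
    (sketch; 3dToutcount detrends first and derives alpha from -qthr)."""
    med = median(ts)
    mad = median([abs(v - med) for v in ts])   # median absolute deviation
    return sum(1 for v in ts if abs(v - med) > alpha * mad)
```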

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input dataset
        argument: ``%s``, position: -2

[Optional]
mask: (a pathlike object or string representing an existing file)
        only count voxels within the given mask
        argument: ``-mask %s``
        mutually_exclusive: autoclip, automask
qthr: (0.0 <= a floating point number <= 1.0, nipype default value:
          0.001)
        indicate a value for q to compute alpha
        argument: ``-qthr %.5f``
autoclip: (a boolean, nipype default value: False)
        clip off small voxels
        argument: ``-autoclip``
        mutually_exclusive: mask
automask: (a boolean, nipype default value: False)
        clip off small voxels
        argument: ``-automask``
        mutually_exclusive: mask
fraction: (a boolean, nipype default value: False)
        write out the fraction of masked voxels which are outliers at each
        timepoint
        argument: ``-fraction``
interval: (a boolean, nipype default value: False)
        write out the median + 3.5 MAD of outlier count with each timepoint
        argument: ``-range``
save_outliers: (a boolean, nipype default value: False)
        enables out_file option
outliers_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-save %s``
polort: (an integer (int or long))
        detrend each voxel timeseries with polynomials
        argument: ``-polort %d``
legendre: (a boolean, nipype default value: False)
        use Legendre polynomials
        argument: ``-legendre``
out_file: (a pathlike object or string representing a file)
        capture standard output
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_outliers: (a pathlike object or string representing an existing
          file)
        output image file name
out_file: (a pathlike object or string representing a file)
        capture standard output

QualityIndex

Link to code

Wraps the executable command 3dTqual.

Computes a ‘quality index’ for each sub-brick in a 3D+time dataset. The output is a 1D time series with the index for each sub-brick; the results are written to stdout.

For complete details, see the 3dTqual Documentation

Examples

>>> from nipype.interfaces import afni
>>> tqual = afni.QualityIndex()
>>> tqual.inputs.in_file = 'functional.nii'
>>> tqual.cmdline  # doctest: +ELLIPSIS
'3dTqual functional.nii > functional_tqual'
>>> res = tqual.run()  # doctest: +SKIP
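
The default (``spearman``) index is 1 minus the Spearman rank correlation of each sub-brick with the median sub-brick, so an identical brick scores 0 and an anti-correlated brick scores 2. A sketch using the classic rank-difference formula (valid only when values are untied; not 3dTqual's implementation):

```python
def quality_index(brick, median_brick):
    """1 minus Spearman rank correlation via rho = 1 - 6*sum(d^2)/(n*(n^2-1)),
    assuming no tied values (illustrative sketch)."""
    def ranks(v):
        order = sorted(range(len(v)), key=v.__getitem__)
        r = [0] * len(v)
        for rk, i in enumerate(order):
            r[i] = rk
        return r
    rx, ry = ranks(brick), ranks(median_brick)
    n = len(brick)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    rho = 1 - 6 * d2 / (n * (n ** 2 - 1))
    return 1 - rho
```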

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input dataset
        argument: ``%s``, position: -2

[Optional]
mask: (a pathlike object or string representing an existing file)
        compute correlation only across masked voxels
        argument: ``-mask %s``
        mutually_exclusive: autoclip, automask
spearman: (a boolean, nipype default value: False)
        Quality index is 1 minus the Spearman (rank) correlation coefficient
        of each sub-brick with the median sub-brick. (default).
        argument: ``-spearman``
quadrant: (a boolean, nipype default value: False)
        Similar to -spearman, but using 1 minus the quadrant correlation
        coefficient as the quality index.
        argument: ``-quadrant``
autoclip: (a boolean, nipype default value: False)
        clip off small voxels
        argument: ``-autoclip``
        mutually_exclusive: mask
automask: (a boolean, nipype default value: False)
        clip off small voxels
        argument: ``-automask``
        mutually_exclusive: mask
clip: (a float)
        clip off values below
        argument: ``-clip %f``
interval: (a boolean, nipype default value: False)
        write out the median + 3.5 MAD of outlier count with each timepoint
        argument: ``-range``
out_file: (a pathlike object or string representing a file)
        capture standard output
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing a file)
        file containing the captured standard output

Qwarp

Link to code

Wraps the executable command 3dQwarp.

Nonlinearly warps a ‘source’ dataset to match a ‘base’ dataset via 3dQwarp. Allineate your images prior to passing them to this workflow.

For complete details, see the 3dQwarp Documentation.

Examples

>>> from nipype.interfaces import afni
>>> qwarp = afni.Qwarp()
>>> qwarp.inputs.in_file = 'sub-01_dir-LR_epi.nii.gz'
>>> qwarp.inputs.nopadWARP = True
>>> qwarp.inputs.base_file = 'sub-01_dir-RL_epi.nii.gz'
>>> qwarp.inputs.plusminus = True
>>> qwarp.cmdline
'3dQwarp -base sub-01_dir-RL_epi.nii.gz -source sub-01_dir-LR_epi.nii.gz -nopadWARP -prefix ppp_sub-01_dir-LR_epi -plusminus'
>>> res = qwarp.run()  # doctest: +SKIP
>>> from nipype.interfaces import afni
>>> qwarp = afni.Qwarp()
>>> qwarp.inputs.in_file = 'structural.nii'
>>> qwarp.inputs.base_file = 'mni.nii'
>>> qwarp.inputs.resample = True
>>> qwarp.cmdline
'3dQwarp -base mni.nii -source structural.nii -prefix ppp_structural -resample'
>>> res = qwarp.run()  # doctest: +SKIP
>>> from nipype.interfaces import afni
>>> qwarp = afni.Qwarp()
>>> qwarp.inputs.in_file = 'structural.nii'
>>> qwarp.inputs.base_file = 'epi.nii'
>>> qwarp.inputs.out_file = 'anatSSQ.nii.gz'
>>> qwarp.inputs.resample = True
>>> qwarp.inputs.lpc = True
>>> qwarp.inputs.verb = True
>>> qwarp.inputs.iwarp = True
>>> qwarp.inputs.blur = [0,3]
>>> qwarp.cmdline
'3dQwarp -base epi.nii -blur 0.0 3.0 -source structural.nii -iwarp -prefix anatSSQ.nii.gz -resample -verb -lpc'
>>> res = qwarp.run()  # doctest: +SKIP
>>> from nipype.interfaces import afni
>>> qwarp = afni.Qwarp()
>>> qwarp.inputs.in_file = 'structural.nii'
>>> qwarp.inputs.base_file = 'mni.nii'
>>> qwarp.inputs.duplo = True
>>> qwarp.inputs.blur = [0,3]
>>> qwarp.cmdline
'3dQwarp -base mni.nii -blur 0.0 3.0 -duplo -source structural.nii -prefix ppp_structural'
>>> res = qwarp.run()  # doctest: +SKIP
>>> from nipype.interfaces import afni
>>> qwarp = afni.Qwarp()
>>> qwarp.inputs.in_file = 'structural.nii'
>>> qwarp.inputs.base_file = 'mni.nii'
>>> qwarp.inputs.duplo = True
>>> qwarp.inputs.minpatch = 25
>>> qwarp.inputs.blur = [0,3]
>>> qwarp.inputs.out_file = 'Q25'
>>> qwarp.cmdline
'3dQwarp -base mni.nii -blur 0.0 3.0 -duplo -source structural.nii -minpatch 25 -prefix Q25'
>>> res = qwarp.run()  # doctest: +SKIP
>>> qwarp2 = afni.Qwarp()
>>> qwarp2.inputs.in_file = 'structural.nii'
>>> qwarp2.inputs.base_file = 'mni.nii'
>>> qwarp2.inputs.blur = [0,2]
>>> qwarp2.inputs.out_file = 'Q11'
>>> qwarp2.inputs.inilev = 7
>>> qwarp2.inputs.iniwarp = ['Q25_warp+tlrc.HEAD']
>>> qwarp2.cmdline
'3dQwarp -base mni.nii -blur 0.0 2.0 -source structural.nii -inilev 7 -iniwarp Q25_warp+tlrc.HEAD -prefix Q11'
>>> res2 = qwarp2.run()  # doctest: +SKIP
>>> qwarp3 = afni.Qwarp()
>>> qwarp3.inputs.in_file = 'structural.nii'
>>> qwarp3.inputs.base_file = 'mni.nii'
>>> qwarp3.inputs.allineate = True
>>> qwarp3.inputs.allineate_opts = '-cose lpa -verb'
>>> qwarp3.cmdline
"3dQwarp -allineate -allineate_opts '-cose lpa -verb' -base mni.nii -source structural.nii -prefix ppp_structural"
>>> res3 = qwarp3.run()  # doctest: +SKIP
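The cmdline strings in the examples above are produced by substituting each input's value into its ``argument:`` template (listed with every input below). The following is a deliberately simplified sketch of that idea; the helper name and logic are illustrative only, not nipype's actual traits machinery:

```python
# Deliberately simplified sketch of how each input's ``argument:``
# template (e.g. ``-source %s``, ``-blur %s``) is filled in and joined
# into a command line. The helper below is hypothetical -- nipype's
# real implementation also handles positions, xor constraints, etc.
def build_cmdline(executable, argstrs, inputs):
    """argstrs maps input name -> flag template; inputs maps name -> value."""
    parts = [executable]
    for name in sorted(inputs):        # sorted() here just for determinism
        template, value = argstrs[name], inputs[name]
        if value is True:              # boolean flags have no %-slot
            parts.append(template)
        elif isinstance(value, list):  # e.g. blur=[0, 3] -> '-blur 0.0 3.0'
            parts.append(template % " ".join(str(float(v)) for v in value))
        else:
            parts.append(template % value)
    return " ".join(parts)

argstrs = {"in_file": "-source %s", "base_file": "-base %s",
           "nopadWARP": "-nopadWARP", "blur": "-blur %s"}
values = {"in_file": "structural.nii", "base_file": "mni.nii",
          "nopadWARP": True, "blur": [0, 3]}
print(build_cmdline("3dQwarp", argstrs, values))
# -> 3dQwarp -base mni.nii -blur 0.0 3.0 -source structural.nii -nopadWARP
```

Note how the list-valued ``blur`` input is rendered as floats, matching the ``-blur 0.0 3.0`` seen in the doctest cmdlines above.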

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        Source image (opposite phase encoding direction than base image).
        argument: ``-source %s``
base_file: (a pathlike object or string representing an existing
          file)
        Base image (opposite phase encoding direction than source image).
        argument: ``-base %s``

[Optional]
out_file: (a pathlike object or string representing a file)
        Sets the prefix/suffix for the output datasets.
        * The source dataset is warped to match the base
          and gets prefix 'ppp' (except if '-plusminus' is used).
        * The final interpolation to this output dataset is
        done using the 'wsinc5' method. See the output of
         3dAllineate -HELP
        (in the "Modifying '-final wsinc5'" section) for
        the lengthy technical details.
        * The 3D warp used is saved in a dataset with
        prefix 'ppp_WARP' -- this dataset can be used
        with 3dNwarpApply and 3dNwarpCat, for example.
        * To be clear, this is the warp from source dataset
         coordinates to base dataset coordinates, where the
         values at each base grid point are the xyz displacements
         needed to move that grid point's xyz values to the
         corresponding xyz values in the source dataset:
         base( (x,y,z) + WARP(x,y,z) ) matches source(x,y,z)
         Another way to think of this warp is that it 'pulls'
         values back from source space to base space.
        * 3dNwarpApply would use 'ppp_WARP' to transform datasets
        aligned with the source dataset to be aligned with the
        base dataset.
        ** If you do NOT want this warp saved, use the option '-nowarp'.
        -->> (However, this warp is usually the most valuable possible
        output!)
        * If you want to calculate and save the inverse 3D warp,
        use the option '-iwarp'. This inverse warp will then be
        saved in a dataset with prefix 'ppp_WARPINV'.
        * This inverse warp could be used to transform data from base
        space to source space, if you need to do such an operation.
        * You can easily compute the inverse later, say by a command like
         3dNwarpCat -prefix Z_WARPINV 'INV(Z_WARP+tlrc)'
        or the inverse can be computed as needed in 3dNwarpApply, like
         3dNwarpApply -nwarp 'INV(Z_WARP+tlrc)' -source Dataset.nii ...
        argument: ``-prefix %s``
resample: (a boolean)
        This option simply resamples the source dataset to match the
        base dataset grid. You can use this if the two datasets
        overlap well (as seen in the AFNI GUI), but are not on the
        same 3D grid.
        * If they don't overlap well, allineate them first.
        * The resampling here is done with the 'wsinc5' method, which
          has very little blurring artifact.
        * If the base and source datasets ARE on the same 3D grid,
          then the -resample option will be ignored.
        * You CAN use -resample with these 3dQwarp options:
          -plusminus -inilev -iniwarp -duplo
        argument: ``-resample``
allineate: (a boolean)
        This option will make 3dQwarp run 3dAllineate first, to align the
        source dataset to the base with an affine transformation. It will
        then use that alignment as a starting point for the nonlinear
        warping.
        argument: ``-allineate``
allineate_opts: (a unicode string)
        add extra options to the 3dAllineate command to be run by 3dQwarp.
        argument: ``-allineate_opts %s``
        requires: allineate
nowarp: (a boolean)
        Do not save the _WARP file.
        argument: ``-nowarp``
iwarp: (a boolean)
        Do compute and save the _WARPINV file.
        argument: ``-iwarp``
        mutually_exclusive: plusminus
pear: (a boolean)
        Use strict Pearson correlation for matching.
        * Not usually recommended, since the 'clipped Pearson' method
          used by default will reduce the impact of outlier values.
        argument: ``-pear``
noneg: (a boolean)
        Replace negative values in either input volume with 0.
        * If there ARE negative input values, and you do NOT use
          -noneg, then strict Pearson correlation will be used, since
          the 'clipped' method is only implemented for non-negative
          volumes.
        * '-noneg' is not the default, since there might be situations
          where you want to align datasets with positive and negative
          values mixed.
        * But, in many cases, the negative values in a dataset are
          just the result of interpolation artifacts (or other
          peculiarities), and so they should be ignored. That is what
          '-noneg' is for.
        argument: ``-noneg``
nopenalty: (a boolean)
        Don't use a penalty on the cost functional; the goal of the
        penalty is to reduce grid distortions. Note that '-nopenalty'
        is the same as '-penfac 0' (see the 'penfac' input below).
        argument: ``-nopenalty``
penfac: (a float)
        Use this value to weight the penalty. The default value is 1.
        Larger values mean the penalty counts more, reducing grid
        distortions, insha'Allah; '-nopenalty' is the same as
        '-penfac 0'.
        -->>* [23 Sep 2013] -- Zhark increased the default value of
          the penalty by a factor of 5, and also made it get
          progressively larger with each level of refinement. Thus,
          warping results will vary from earlier instances of 3dQwarp.
        * The progressive increase in the penalty at higher levels
          means that the 'cost function' can actually look like the
          alignment is getting worse when the levels change.
        * IF you wish to turn off this progression, for whatever
          reason (e.g., to keep compatibility with older results), use
          the option '-penold'. To be completely compatible with the
          older 3dQwarp, you'll also have to use '-penfac 0.2'.
        argument: ``-penfac %f``
noweight: (a boolean)
        If you want a binary weight (the old default), use this
        option. That is, each voxel in the base volume automask will
        be weighted the same in the computation of the cost
        functional.
        argument: ``-noweight``
weight: (a pathlike object or string representing an existing file)
        Instead of computing the weight from the base dataset,
        directly input the weight volume from dataset 'www'.
        * Useful if you know over what parts of the base image you
          want to emphasize or de-emphasize the matching functional.
        argument: ``-weight %s``
wball: (a list of from 5 to 5 items which are an integer (int or
          long))
        -wball x y z r f
        Enhance automatic weight from '-useweight' by a factor of
        1+f*Gaussian(FWHM=r) centered in the base image at DICOM
        coordinates (x,y,z) and with radius 'r'. The goal of this
        option is to try and make the alignment better in a specific
        part of the brain.
        * Example: -wball 0 14 6 30 40
          to emphasize the thalamic area (in MNI/Talairach space).
        * The 'r' parameter must be positive!
        * The 'f' parameter must be between 1 and 100 (inclusive).
        * '-wball' does nothing if you input your own weight with the
          '-weight' option.
        * '-wball' does change the binary weight created by the
          '-noweight' option.
        * You can only use '-wball' once in a run of 3dQwarp.
        *** The effect of '-wball' is not dramatic. The example above
          makes the average brain image across a collection of
          subjects a little sharper in the thalamic area, which might
          have some small value. If you care enough about alignment
          to use '-wball', then you should examine the results from
          3dQwarp for each subject, to see if the alignments are good
          enough for your purposes.
        argument: ``-wball %s``
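As a concrete reading of the 1+f*Gaussian(FWHM=r) enhancement, here is a small sketch of how the weight boost falls off with distance from the ball center. The exact Gaussian convention inside 3dQwarp is an assumption here; only FWHM=r is taken from the description above, and `wball_factor` is not part of nipype or AFNI:

```python
import math

def wball_factor(d, f, r):
    """Hypothetical weight enhancement 1 + f * Gaussian(FWHM=r) at
    distance d from the wball center; a Gaussian with FWHM r is
    exp(-4*ln(2) * (d/r)**2)."""
    return 1.0 + f * math.exp(-4.0 * math.log(2.0) * (d / r) ** 2)

# Using the thalamus example '-wball 0 14 6 30 40' (r=30, f=40):
print(wball_factor(0.0, 40.0, 30.0))   # -> 41.0 (full boost at the center)
print(wball_factor(15.0, 40.0, 30.0))  # approximately 21.0 (half the boost at d = r/2)
```

By definition of FWHM, the boost drops to f/2 at distance r/2 from the center, which is why 'r' behaves as a width rather than a hard cutoff.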
wmask: (a tuple of the form: (a pathlike object or string
          representing an existing file, a float))
        -wmask ws f
        Similar to '-wball', but here, you provide a dataset 'ws'
        that indicates where to increase the weight.
        * The 'ws' dataset must be on the same 3D grid as the base
          dataset.
        * 'ws' is treated as a mask -- it only matters where it is
          nonzero -- otherwise, the values inside are not used.
        * After 'ws' comes the factor 'f' by which to increase the
          automatically computed weight. Where 'ws' is nonzero, the
          weighting will be multiplied by (1+f).
        * As with '-wball', the factor 'f' should be between 1 and
          100.
        * You cannot use '-wball' and '-wmask' together!
        argument: ``-wpass %s %f``
out_weight_file: (a pathlike object or string representing a file)
        Write the weight volume to disk as a dataset
        argument: ``-wtprefix %s``
blur: (a list of from 1 to 2 items which are a float)
        Gaussian blur the input images by 'bb' (FWHM) voxels before
        doing the alignment (the output dataset will not be blurred).
        The default is 2.345 (for no good reason).
        * Optionally, you can provide 2 values for 'bb', and then the
          first one is applied to the base volume, the second to the
          source volume.
        -->>* e.g., '-blur 0 3' to skip blurring the base image (if
          the base is a blurry template, for example).
        * A negative blur radius means to use 3D median filtering,
          rather than Gaussian blurring. This type of filtering will
          better preserve edges, which can be important in alignment.
        * If the base is a template volume that is already blurry,
          you probably don't want to blur it again, but blurring the
          source volume a little is probably a good idea, to help the
          program avoid trying to match tiny features.
        * Note that -duplo will blur the volumes some extra amount
          for the initial small-scale warping, to make that phase of
          the program converge more rapidly.
        argument: ``-blur %s``
pblur: (a list of from 1 to 2 items which are a float)
        Use progressive blurring; that is, for larger patch sizes,
        the amount of blurring is larger. The general idea is to
        avoid trying to match finer details when the patch size and
        incremental warps are coarse. When '-blur' is used as well,
        it sets a minimum amount of blurring that will be used. [06
        Aug 2014 -- '-pblur' may become the default someday].
        * You can optionally give the fraction of the patch size that
          is used for the progressive blur by providing a value
          between 0 and 0.25 after '-pblur'. If you provide TWO
          values, the first fraction is used for progressively
          blurring the base image and the second for the source
          image. The default parameters when just '-pblur' is given
          is the same as giving the options as '-pblur 0.09 0.09'.
        * '-pblur' is useful when trying to match 2 volumes with high
          amounts of detail; e.g., warping one subject's brain image
          to match another's, or trying to warp to match a detailed
          template.
        * Note that using negative values with '-blur' means that the
          progressive blurring will be done with median filters,
          rather than Gaussian linear blurring.
        -->>*** The combination of the -allineate and -pblur options
          will make the results of using 3dQwarp to align to a
          template somewhat less sensitive to initial head position
          and scaling.
        argument: ``-pblur %s``
emask: (a pathlike object or string representing an existing file)
        Here, 'ee' is a dataset to specify a mask of voxels to
        EXCLUDE from the analysis -- all voxels in 'ee' that are
        NONZERO will not be used in the alignment.
        * The base image is always automasked -- the emask is extra,
          to indicate voxels you definitely DON'T want included in
          the matching process, even if they are inside the brain.
        argument: ``-emask %s``
noXdis: (a boolean)
        Warp will not displace in x direction
        argument: ``-noXdis``
noYdis: (a boolean)
        Warp will not displace in y direction
        argument: ``-noYdis``
noZdis: (a boolean)
        Warp will not displace in z direction
        argument: ``-noZdis``
iniwarp: (a list of items which are a pathlike object or string
          representing an existing file)
        A dataset with an initial nonlinear warp to use.
        * If this option is not used, the initial warp is the
          identity.
        * You can specify a catenation of warps (in quotes) here, as
          in program 3dNwarpApply.
        * As a special case, if you just input an affine matrix in a
          .1D file, that will work also -- it is treated as giving
          the initial warp via the string
          "IDENT(base_dataset) matrix_file.aff12.1D".
        * You CANNOT use this option with -duplo !!
        * -iniwarp is usually used with -inilev to re-start 3dQwarp
          from a previous stopping point.
        argument: ``-iniwarp %s``
        mutually_exclusive: duplo
inilev: (an integer (int or long))
        The initial refinement 'level' at which to start.
        * Usually used with -iniwarp; CANNOT be used with -duplo.
        * The combination of -inilev and -iniwarp lets you take the
          results of a previous 3dQwarp run and refine them further.
          Note that the source dataset in the second run is the SAME
          as in the first run. If you don't see why this is
          necessary, then you probably need to seek help from an AFNI
          guru.
        argument: ``-inilev %d``
        mutually_exclusive: duplo
minpatch: (an integer (int or long))
        * The value of mm should be an odd integer.
        * The default value of mm is 25.
        * For more accurate results than mm=25, try 19 or 13.
        * The smallest allowed patch size is 5.
        * You may want to stop at a larger patch size (say 7 or 9)
          and use the -Qfinal option to run that final level with
          quintic warps, which might run faster and provide the same
          degree of warp detail.
        * Trying to make two different brain volumes match in fine
          detail is usually a waste of time, especially in humans.
          There is too much variability in anatomy to match gyrus to
          gyrus accurately. For this reason, the default minimum
          patch size is 25 voxels. Using a smaller '-minpatch' might
          try to force the warp to match features that do not match,
          and the result can be useless image distortions -- another
          reason to LOOK AT THE RESULTS.
        argument: ``-minpatch %d``
maxlev: (an integer (int or long))
        The maximum refinement 'level' to use.
        * This is an alternate way (to '-minpatch') of specifying
          when the program should stop the refinement process.
        * CANNOT be used with -duplo.
        argument: ``-maxlev %d``, position: -1
        mutually_exclusive: duplo
gridlist: (a pathlike object or string representing an existing file)
        This option provides an alternate way to specify the patch
        grid sizes used in the warp optimization process. 'gl' is a
        1D file with a list of patches to use -- in most cases, you
        will want to use it in the following form:
        -gridlist '1D: 0 151 101 75 51'
        * Here, a 0 patch size means the global domain. Patch sizes
          otherwise should be odd integers >= 5.
        * If you use the '0' patch size again after the first
          position, you will actually get an iteration at the size of
          the default patch level 1, where the patch sizes are 75% of
          the volume dimension. There is no way to force the program
          to literally repeat the sui generis step of lev=0.
        * You cannot use -gridlist with -duplo or -plusminus!
        argument: ``-gridlist %s``
        mutually_exclusive: duplo, plusminus
allsave: (a boolean)
        This option lets you save the output warps from each level of
        the refinement process. Mostly used for experimenting.
        * Cannot be used with -nopadWARP, -duplo, or -plusminus.
        * Will only save all the outputs if the program terminates
          normally -- if it crashes, or freezes, then all these warps
          are lost.
        argument: ``-allsave``
        mutually_exclusive: nopadWARP, duplo, plusminus
duplo: (a boolean)
        Start off with 1/2 scale versions of the volumes, for getting
        a speedy coarse first alignment.
        * Then scales back up to register the full volumes. The goal
          is greater speed, and it seems to help this positively
          piggish program to be more expeditious.
        * However, accuracy is somewhat lower with '-duplo', for
          reasons that currently elude Zhark; for this reason, the
          Emperor does not usually use '-duplo'.
        argument: ``-duplo``
        mutually_exclusive: gridlist, maxlev, inilev, iniwarp, plusminus,
          allsave
workhard: (a boolean)
        Iterate more times, which can help when the volumes are hard
        to align at all, or when you hope to get a more precise
        alignment.
        * Slows the program down (possibly a lot), of course.
        * When you combine '-workhard' with '-duplo', only the full
          size volumes get the extra iterations.
        * For finer control over which refinement levels work hard,
          you can use this option in the form (for example)
          -workhard:4:7
          which implies the extra iterations will be done at levels
          4, 5, 6, and 7, but not otherwise.
        * You can also use '-superhard' to iterate even more, but
          this extra option will REALLY slow things down.
        -->>* Under most circumstances, you should not need to use
          either -workhard or -superhard.
        -->>* The fastest way to register to a template image is via
          the -duplo option, and without the -workhard or -superhard
          options.
        -->>* If you use this option in the form '-Workhard' (first
          letter in upper case), then the second iteration at each
          level is done with quintic polynomial warps.
        argument: ``-workhard``
        mutually_exclusive: boxopt, ballopt
Qfinal: (a boolean)
        At the finest patch size (the final level), use Hermite
        quintic polynomials for the warp instead of cubic
        polynomials.
        * In a 3D 'patch', there are 2x2x2x3=24 cubic polynomial
          basis function parameters over which to optimize (2
          polynomials dependent on each of the x,y,z directions, and
          3 different directions of displacement).
        * There are 3x3x3x3=81 quintic polynomial parameters per
          patch.
        * With -Qfinal, the final level will have more detail in the
          allowed warps, at the cost of yet more CPU time.
        * However, no patch below 7x7x7 in size will be done with
          quintic polynomials.
        * This option is also not usually needed, and is
          experimental.
        argument: ``-Qfinal``
Qonly: (a boolean)
        Use Hermite quintic polynomials at all levels.
        * Very slow (about 4 times longer). Also experimental.
        * Will produce a (discrete representation of a) C2 warp.
        argument: ``-Qonly``
plusminus: (a boolean)
        Normally, the warp displacements dis(x) are defined to match
        base(x) to source(x+dis(x)). With this option, the match is
        between base(x-dis(x)) and source(x+dis(x)) -- the two images
        'meet in the middle'.
        * One goal is to mimic the warping done to MRI EPI data by
          field inhomogeneities, when registering between a 'blip up'
          and a 'blip down' volume, which will have opposite
          distortions.
        * Define Wp(x) = x+dis(x) and Wm(x) = x-dis(x). Then since
          base(Wm(x)) matches source(Wp(x)), by substituting
          INV(Wm(x)) wherever we see x, we have base(x) matches
          source(Wp(INV(Wm(x)))); that is, the warp V(x) that one
          would get from the 'usual' way of running 3dQwarp is
          V(x) = Wp(INV(Wm(x))).
        * Conversely, we can calculate Wp(x) in terms of V(x) as
          follows: if V(x) = x + dv(x), define Vh(x) = x + dv(x)/2;
          then Wp(x) = V(INV(Vh(x))).
        * With the above formulas, it is possible to compute Wp(x)
          from V(x) and vice-versa, using program 3dNwarpCalc. The
          requisite commands are left as an exercise for the aspiring
          AFNI Jedi Master.
        * You can use the semi-secret '-pmBASE' option to get the
          V(x) warp and the source dataset warped to base space, in
          addition to the Wp(x) '_PLUS' and Wm(x) '_MINUS' warps.
        -->>* Alas: -plusminus does not work with -duplo or
          -allineate :-(
        * However, you can use -iniwarp with -plusminus :-)
        -->>* The outputs have _PLUS (from the source dataset) and
          _MINUS (from the base dataset) in their filenames, in
          addition to the prefix. The -iwarp option, if present, will
          be ignored.
        argument: ``-plusminus``
        mutually_exclusive: duplo, allsave, iwarp
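The Wp/Wm algebra in the '-plusminus' description can be spot-checked numerically in one dimension. Everything below (the sinusoidal displacement field, the bisection inverter) is an illustrative toy, not anything 3dQwarp actually computes:

```python
import math

# Numeric 1D illustration of the '-plusminus' identities:
# Wp(x) = x + dis(x), Wm(x) = x - dis(x), V = Wp o INV(Wm),
# and Wp = V o INV(Vh) with Vh(x) = x + dv(x)/2 where
# V(x) = x + dv(x). The displacement field 'dis' is made up
# purely for demonstration.
dis = lambda x: 0.1 * math.sin(x)
Wp = lambda x: x + dis(x)
Wm = lambda x: x - dis(x)

def invert(f, y, lo=-10.0, hi=10.0):
    """Invert a monotone-increasing 1D map f at y by bisection."""
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

V = lambda x: Wp(invert(Wm, x))        # the 'usual' 3dQwarp warp
Vh = lambda x: x + (V(x) - x) / 2.0    # half-displacement warp

x = 1.3
print(abs(Wp(x) - V(invert(Vh, x))))   # ~0: confirms Wp = V o INV(Vh)
```

In 3D the same inversions and compositions are what 3dNwarpCalc performs on warp datasets, per the description above.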
nopad: (a boolean)
        Do NOT use zero-padding on the 3D base and source images.
        [Default == zero-pad, if needed]
        * The underlying model for deformations goes to zero at the
          edge of the volume being warped. However, if there is
          significant data near an edge of the volume, then it won't
          get displaced much, and so the results might not be good.
        * Zero padding is designed as a way to work around this
          potential problem. You should NOT need the '-nopad' option
          for any reason that Zhark can think of, but it is here to
          be symmetrical with 3dAllineate.
        * Note that the output (warped from source) dataset will be
          on the base dataset grid whether or not zero-padding is
          allowed. However, unless you use the following option,
          allowing zero-padding (i.e., the default operation) will
          make the output WARP dataset(s) be on a larger grid (also
          see '-expad' below).
        argument: ``-nopad``
nopadWARP: (a boolean)
        If for some reason you require the warp volume to match the
        base volume, then use this option to have the output WARP
        dataset(s) truncated.
        argument: ``-nopadWARP``
        mutually_exclusive: allsave, expad
expad: (an integer (int or long))
        This option instructs the program to pad the warp by an extra
        'EE' voxels (and then 3dQwarp starts optimizing it).
        * This option is seldom needed, but can be useful if you
          might later catenate the nonlinear warp -- via 3dNwarpCat
          -- with an affine transformation that contains a large
          shift. Under that circumstance, the nonlinear warp might be
          shifted partially outside its original grid, so expanding
          that grid can avoid this problem.
        * Note that this option perforce turns off '-nopadWARP'.
        argument: ``-expad %d``
        mutually_exclusive: nopadWARP
ballopt: (a boolean)
        Normally, the incremental warp parameters are optimized
        inside a rectangular 'box' (24 dimensional for cubic patches,
        81 for quintic patches), whose limits define the amount of
        distortion allowed at each step. Using '-ballopt' switches
        these limits to be applied to a 'ball' (interior of a
        hypersphere), which can allow for larger incremental
        displacements. Use this option if you think things need to be
        able to move farther.
        argument: ``-ballopt``
        mutually_exclusive: workhard, boxopt
baxopt: (a boolean)
        Use the 'box' optimization limits instead of the 'ball' [this
        is the default at present].
        * Note that if '-workhard' is used, then ball and box
          optimization are alternated in the different iterations at
          each level, so these two options have no effect in that
          case.
        argument: ``-boxopt``
        mutually_exclusive: workhard, ballopt
verb: (a boolean)
        more detailed description of the process
        argument: ``-verb``
        mutually_exclusive: quiet
quiet: (a boolean)
        Cut out most of the fun fun fun progress messages :-(
        argument: ``-quiet``
        mutually_exclusive: verb
overwrite: (a boolean)
        Overwrite outputs
        argument: ``-overwrite``
lpc: (a boolean)
        Local Pearson minimization (i.e., EPI-T1 registration).
        This option has not been extensively tested.
        If you use '-lpc', then '-maxlev 0' is automatically set. If
        you want to go to more refined levels, you can set '-maxlev'.
        This should be set up to have lpc as the second to last
        argument and maxlev as the last argument, as needed by AFNI.
        Using maxlev > 1 is not recommended for EPI-T1 alignment.
        argument: ``-lpc``, position: -2
        mutually_exclusive: nmi, mi, hel, lpa, pear
lpa: (a boolean)
        Local Pearson maximization.
        This option has not been extensively tested.
        argument: ``-lpa``
        mutually_exclusive: nmi, mi, lpc, hel, pear
hel: (a boolean)
        Hellinger distance: a matching function for the adventurous.
        This option has NOT been extensively tested for usefulness
        and should be considered experimental at this infundibulum.
        argument: ``-hel``
        mutually_exclusive: nmi, mi, lpc, lpa, pear
mi: (a boolean)
        Mutual Information: a matching function for the adventurous.
        This option has NOT been extensively tested for usefulness
        and should be considered experimental at this infundibulum.
        argument: ``-mi``
        mutually_exclusive: nmi, hel, lpc, lpa, pear
nmi: (a boolean)
        Normalized Mutual Information: a matching function for the
        adventurous.
        This option has NOT been extensively tested for usefulness
        and should be considered experimental at this infundibulum.
        argument: ``-nmi``
        mutually_exclusive: mi, hel, lpc, lpa, pear
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

warped_source: (a pathlike object or string representing a file)
        Warped source file. If plusminus is used, this is the
        undistorted source file.
warped_base: (a pathlike object or string representing a file)
        Undistorted base file.
source_warp: (a pathlike object or string representing a file)
        Displacement in mm for the source image. If plusminus is
        used, this is the field susceptibility correction warp (in
        'mm') for the source image.
base_warp: (a pathlike object or string representing a file)
        Displacement in mm for the base image. If plusminus is used,
        this is the field susceptibility correction warp (in 'mm')
        for the base image. This is only output if the plusminus or
        iwarp options are passed.
weights: (a pathlike object or string representing a file)
        Auto-computed weight volume.

QwarpPlusMinus

Link to code

Wraps the executable command 3dQwarp.

A version of 3dQwarp for performing field susceptibility correction using two images with opposing phase encoding directions.

For complete details, see the 3dQwarp Documentation.

Examples

>>> from nipype.interfaces import afni
>>> qwarp = afni.QwarpPlusMinus()
>>> qwarp.inputs.in_file = 'sub-01_dir-LR_epi.nii.gz'
>>> qwarp.inputs.nopadWARP = True
>>> qwarp.inputs.base_file = 'sub-01_dir-RL_epi.nii.gz'
>>> qwarp.cmdline
'3dQwarp -prefix Qwarp.nii.gz -plusminus -base sub-01_dir-RL_epi.nii.gz     -source sub-01_dir-LR_epi.nii.gz -nopadWARP'
>>> res = qwarp.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        Source image (opposite phase encoding direction than base image).
        argument: ``-source %s``
base_file: (a pathlike object or string representing an existing
          file)
        Base image (opposite phase encoding direction than source image).
        argument: ``-base %s``

[Optional]
source_file: (a pathlike object or string representing an existing
          file)
        Source image (opposite phase encoding direction than base image)
        argument: ``-source %s``
out_file: (a pathlike object or string representing a file, nipype
          default value: Qwarp.nii.gz)
        Output file
        argument: ``-prefix %s``, position: 0
plusminus: (a boolean, nipype default value: True)
        Normally, the warp displacements dis(x) are defined to match
        base(x) to source(x+dis(x)). With this option, the match is
        between base(x-dis(x)) and source(x+dis(x)) -- the two images
        'meet in the middle'. For more info, see the Qwarp interface.
        argument: ``-plusminus``, position: 1
        mutually_exclusive: duplo, allsave, iwarp
resample: (a boolean)
        This option simply resamples the source dataset to match the
        base dataset grid. You can use this if the two datasets
        overlap well (as seen in the AFNI GUI), but are not on the
        same 3D grid.
        * If they don't overlap well, allineate them first.
        * The resampling here is done with the 'wsinc5' method, which
          has very little blurring artifact.
        * If the base and source datasets ARE on the same 3D grid,
          then the -resample option will be ignored.
        * You CAN use -resample with these 3dQwarp options:
          -plusminus -inilev -iniwarp -duplo
        argument: ``-resample``
allineate: (a boolean)
        This option will make 3dQwarp run 3dAllineate first, to align the
        source dataset to the base with an affine transformation. It will
        then use that alignment as a starting point for the nonlinear
        warping.
        argument: ``-allineate``
allineate_opts: (a unicode string)
        add extra options to the 3dAllineate command to be run by 3dQwarp.
        argument: ``-allineate_opts %s``
        requires: allineate
nowarp: (a boolean)
        Do not save the _WARP file.
        argument: ``-nowarp``
iwarp: (a boolean)
        Do compute and save the _WARPINV file.
        argument: ``-iwarp``
        mutually_exclusive: plusminus
pear: (a boolean)
        Use strict Pearson correlation for matching.
        * Not usually recommended, since the 'clipped Pearson' method used
          by default will reduce the impact of outlier values.
        argument: ``-pear``
noneg: (a boolean)
        Replace negative values in either input volume with 0.
        * If there ARE negative input values, and you do NOT use -noneg,
          then strict Pearson correlation will be used, since the 'clipped'
          method is only implemented for non-negative volumes.
        * '-noneg' is not the default, since there might be situations where
          you want to align datasets with positive and negative values
          mixed.
        * But, in many cases, the negative values in a dataset are just the
          result of interpolation artifacts (or other peculiarities), and so
          they should be ignored. That is what '-noneg' is for.
        argument: ``-noneg``
nopenalty: (a boolean)
        Don't apply the penalty to the cost functional; the penalty exists
        to reduce grid distortions. Equivalent to '-penfac 0'.
        argument: ``-nopenalty``
penfac: (a float)
        Use this value to weight the penalty. The default value is 1. Larger
        values mean the penalty counts more, reducing grid distortions,
        insha'Allah; '-nopenalty' is the same as '-penfac 0'.
        * [23 Sep 2013] -- Zhark increased the default value of the penalty
          by a factor of 5, and also made it get progressively larger with
          each level of refinement. Thus, warping results will vary from
          earlier instances of 3dQwarp.
        * The progressive increase in the penalty at higher levels means
          that the 'cost function' can actually look like the alignment is
          getting worse when the levels change.
        * If you wish to turn off this progression, for whatever reason
          (e.g., to keep compatibility with older results), use the option
          '-penold'. To be completely compatible with the older 3dQwarp,
          you'll also have to use '-penfac 0.2'.
        argument: ``-penfac %f``
noweight: (a boolean)
        If you want a binary weight (the old default), use this option. That
        is, each voxel in the base volume automask will be weighted the same
        in the computation of the cost functional.
        argument: ``-noweight``
weight: (a pathlike object or string representing an existing file)
        Instead of computing the weight from the base dataset, directly
        input the weight volume from dataset 'www'.
        * Useful if you know which parts of the base image you want to
          emphasize or de-emphasize in the matching functional.
        argument: ``-weight %s``
wball: (a list of from 5 to 5 items which are an integer (int or
          long))
        -wball x y z r f
        Enhance automatic weight from '-useweight' by a factor of
        1+f*Gaussian(FWHM=r) centered in the base image at DICOM coordinates
        (x,y,z) and with radius 'r'. The goal of this option is to try and
        make the alignment better in a specific part of the brain.
        * Example: -wball 0 14 6 30 40
          to emphasize the thalamic area (in MNI/Talairach space).
        * The 'r' parameter must be positive!
        * The 'f' parameter must be between 1 and 100 (inclusive).
        * '-wball' does nothing if you input your own weight with the
          '-weight' option.
        * '-wball' does change the binary weight created by the '-noweight'
          option.
        * You can only use '-wball' once in a run of 3dQwarp.
        *** The effect of '-wball' is not dramatic. The example above makes
          the average brain image across a collection of subjects a little
          sharper in the thalamic area, which might have some small value.
          If you care enough about alignment to use '-wball', then you
          should examine the results from 3dQwarp for each subject, to see
          if the alignments are good enough for your purposes.
        argument: ``-wball %s``
wmask: (a tuple of the form: (a pathlike object or string
          representing an existing file, a float))
        -wmask ws f
        Similar to '-wball', but here, you provide a dataset 'ws' that
        indicates where to increase the weight.
        * The 'ws' dataset must be on the same 3D grid as the base dataset.
        * 'ws' is treated as a mask -- it only matters where it is nonzero
          -- otherwise, the values inside are not used.
        * After 'ws' comes the factor 'f' by which to increase the
          automatically computed weight. Where 'ws' is nonzero, the
          weighting will be multiplied by (1+f).
        * As with '-wball', the factor 'f' should be between 1 and 100.
        * You cannot use '-wball' and '-wmask' together!
        argument: ``-wpass %s %f``
out_weight_file: (a pathlike object or string representing a file)
        Write the weight volume to disk as a dataset
        argument: ``-wtprefix %s``
blur: (a list of from 1 to 2 items which are a float)
        Gaussian blur the input images by 'bb' (FWHM) voxels before doing
        the alignment (the output dataset will not be blurred). The default
        is 2.345 (for no good reason).
        * Optionally, you can provide 2 values for 'bb', and then the first
          one is applied to the base volume, the second to the source
          volume.
        * e.g., '-blur 0 3' to skip blurring the base image (if the base is
          a blurry template, for example).
        * A negative blur radius means to use 3D median filtering, rather
          than Gaussian blurring. This type of filtering will better
          preserve edges, which can be important in alignment.
        * If the base is a template volume that is already blurry, you
          probably don't want to blur it again, but blurring the source
          volume a little is probably a good idea, to help the program avoid
          trying to match tiny features.
        * Note that -duplo will blur the volumes some extra amount for the
          initial small-scale warping, to make that phase of the program
          converge more rapidly.
        argument: ``-blur %s``
pblur: (a list of from 1 to 2 items which are a float)
        Use progressive blurring; that is, for larger patch sizes, the
        amount of blurring is larger. The general idea is to avoid trying to
        match finer details when the patch size and incremental warps are
        coarse. When '-blur' is used as well, it sets a minimum amount of
        blurring that will be used. [06 Aug 2014 -- '-pblur' may become the
        default someday].
        * You can optionally give the fraction of the patch size that is
          used for the progressive blur by providing a value between 0 and
          0.25 after '-pblur'. If you provide TWO values, the first fraction
          is used for progressively blurring the base image and the second
          for the source image. The default parameters when just '-pblur' is
          given are the same as giving the options as '-pblur 0.09 0.09'.
        * '-pblur' is useful when trying to match 2 volumes with high
          amounts of detail; e.g., warping one subject's brain image to
          match another's, or trying to warp to match a detailed template.
        * Note that using negative values with '-blur' means that the
          progressive blurring will be done with median filters, rather than
          Gaussian linear blurring.
        *** The combination of the -allineate and -pblur options will make
          the results of using 3dQwarp to align to a template somewhat less
          sensitive to initial head position and scaling.
        argument: ``-pblur %s``
emask: (a pathlike object or string representing an existing file)
        Here, 'ee' is a dataset to specify a mask of voxels to EXCLUDE from
        the analysis -- all voxels in 'ee' that are NONZERO will not be used
        in the alignment.
        * The base image is always automasked -- the emask is extra, to
          indicate voxels you definitely DON'T want included in the matching
          process, even if they are inside the brain.
        argument: ``-emask %s``
noXdis: (a boolean)
        Warp will not displace in x direction
        argument: ``-noXdis``
noYdis: (a boolean)
        Warp will not displace in y direction
        argument: ``-noYdis``
noZdis: (a boolean)
        Warp will not displace in z direction
        argument: ``-noZdis``
iniwarp: (a list of items which are a pathlike object or string
          representing an existing file)
        A dataset with an initial nonlinear warp to use.
        * If this option is not used, the initial warp is the identity.
        * You can specify a catenation of warps (in quotes) here, as in
          program 3dNwarpApply.
        * As a special case, if you just input an affine matrix in a .1D
          file, that will work also -- it is treated as giving the initial
          warp via the string "IDENT(base_dataset) matrix_file.aff12.1D".
        * You CANNOT use this option with -duplo !!
        * -iniwarp is usually used with -inilev to re-start 3dQwarp from a
          previous stopping point.
        argument: ``-iniwarp %s``
        mutually_exclusive: duplo
inilev: (an integer (int or long))
        The initial refinement 'level' at which to start.
        * Usually used with -iniwarp; CANNOT be used with -duplo.
        * The combination of -inilev and -iniwarp lets you take the results
          of a previous 3dQwarp run and refine them further. Note that the
          source dataset in the second run is the SAME as in the first run.
          If you don't see why this is necessary, then you probably need to
          seek help from an AFNI guru.
        argument: ``-inilev %d``
        mutually_exclusive: duplo
minpatch: (an integer (int or long))
        * The value of mm should be an odd integer.
        * The default value of mm is 25.
        * For more accurate results than mm=25, try 19 or 13.
        * The smallest allowed patch size is 5.
        * You may want to stop at a larger patch size (say 7 or 9) and use
          the -Qfinal option to run that final level with quintic warps,
          which might run faster and provide the same degree of warp detail.
        * Trying to make two different brain volumes match in fine detail is
          usually a waste of time, especially in humans. There is too much
          variability in anatomy to match gyrus to gyrus accurately. For
          this reason, the default minimum patch size is 25 voxels. Using a
          smaller '-minpatch' might try to force the warp to match features
          that do not match, and the result can be useless image distortions
          -- another reason to LOOK AT THE RESULTS.
        argument: ``-minpatch %d``
maxlev: (an integer (int or long))
        The maximum refinement 'level' to use; the warp optimization stops
        after this level instead of proceeding to the finest patch sizes.
        CANNOT be used with -duplo.
        argument: ``-maxlev %d``, position: -1
        mutually_exclusive: duplo
gridlist: (a pathlike object or string representing an existing file)
        This option provides an alternate way to specify the patch grid
        sizes used in the warp optimization process. 'gl' is a 1D file with
        a list of patches to use -- in most cases, you will want to use it
        in the following form: -gridlist '1D: 0 151 101 75 51'
        * Here, a 0 patch size means the global domain. Patch sizes
          otherwise should be odd integers >= 5.
        * If you use the '0' patch size again after the first position, you
          will actually get an iteration at the size of the default patch
          level 1, where the patch sizes are 75% of the volume dimension.
          There is no way to force the program to literally repeat the sui
          generis step of lev=0.
        * You cannot use -gridlist with -duplo or -plusminus!
        argument: ``-gridlist %s``
        mutually_exclusive: duplo, plusminus
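
The patch-size rules above (0 for the global domain, otherwise odd
integers >= 5) can be checked with a small helper; the function name is
hypothetical, not part of nipype or AFNI:

```python
def valid_patch_size(p):
    """0 means the global domain; otherwise odd and at least 5."""
    return p == 0 or (p >= 5 and p % 2 == 1)

# The example grid list from the text above passes the check:
sizes = [0, 151, 101, 75, 51]
assert all(valid_patch_size(p) for p in sizes)
assert not valid_patch_size(4)   # even and below the minimum
```
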
allsave: (a boolean)
        This option lets you save the output warps from each level of the
        refinement process. Mostly used for experimenting.
        * Cannot be used with -nopadWARP, -duplo, or -plusminus.
        * Will only save all the outputs if the program terminates normally
          -- if it crashes, or freezes, then all these warps are lost.
        argument: ``-allsave``
        mutually_exclusive: nopadWARP, duplo, plusminus
duplo: (a boolean)
        Start off with 1/2 scale versions of the volumes, for getting a
        speedy coarse first alignment.
        * Then scales back up to register the full volumes. The goal is
          greater speed, and it seems to help this positively piggish
          program to be more expeditious.
        * However, accuracy is somewhat lower with '-duplo', for reasons
          that currently elude Zhark; for this reason, the Emperor does not
          usually use '-duplo'.
        argument: ``-duplo``
        mutually_exclusive: gridlist, maxlev, inilev, iniwarp, plusminus,
          allsave
workhard: (a boolean)
        Iterate more times, which can help when the volumes are hard to
        align at all, or when you hope to get a more precise alignment.
        * Slows the program down (possibly a lot), of course.
        * When you combine '-workhard' with '-duplo', only the full size
          volumes get the extra iterations.
        * For finer control over which refinement levels work hard, you can
          use this option in the form (for example) -workhard:4:7 which
          implies the extra iterations will be done at levels 4, 5, 6, and
          7, but not otherwise.
        * You can also use '-superhard' to iterate even more, but this
          extra option will REALLY slow things down.
        * Under most circumstances, you should not need to use either
          -workhard or -superhard.
        * The fastest way to register to a template image is via the -duplo
          option, and without the -workhard or -superhard options.
        * If you use this option in the form '-Workhard' (first letter in
          upper case), then the second iteration at each level is done with
          quintic polynomial warps.
        argument: ``-workhard``
        mutually_exclusive: boxopt, ballopt
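
The '-workhard:4:7' form names an inclusive range of levels; a rough
sketch of expanding it (the parsing code is illustrative, not nipype's):

```python
# Expand a '-workhard:lo:hi' spec into the list of levels that get the
# extra iterations (inclusive range, per the description above).
spec = '-workhard:4:7'
lo, hi = (int(s) for s in spec.split(':')[1:])
hard_levels = list(range(lo, hi + 1))   # levels 4, 5, 6, and 7
```
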
Qfinal: (a boolean)
        At the finest patch size (the final level), use Hermite quintic
        polynomials for the warp instead of cubic polynomials.
        * In a 3D 'patch', there are 2x2x2x3=24 cubic polynomial basis
          function parameters over which to optimize (2 polynomials
          dependent on each of the x,y,z directions, and 3 different
          directions of displacement).
        * There are 3x3x3x3=81 quintic polynomial parameters per patch.
        * With -Qfinal, the final level will have more detail in the allowed
          warps, at the cost of yet more CPU time.
        * However, no patch below 7x7x7 in size will be done with quintic
          polynomials.
        * This option is also not usually needed, and is experimental.
        argument: ``-Qfinal``
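
The parameter counts quoted above follow directly from the basis sizes
(2 cubic or 3 quintic polynomials per axis, times 3 displacement
directions):

```python
# Per-patch parameter counts for the two warp bases described above.
cubic_params = 2 * 2 * 2 * 3     # 24 parameters per cubic patch
quintic_params = 3 * 3 * 3 * 3   # 81 parameters per quintic patch
```
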
Qonly: (a boolean)
        Use Hermite quintic polynomials at all levels.
        * Very slow (about 4 times longer). Also experimental.
        * Will produce a (discrete representation of a) C2 warp.
        argument: ``-Qonly``
nopad: (a boolean)
        Do NOT use zero-padding on the 3D base and source images.
        [Default == zero-pad, if needed]
        * The underlying model for deformations goes to zero at the edge of
          the volume being warped. However, if there is significant data
          near an edge of the volume, then it won't get displaced much, and
          so the results might not be good.
        * Zero padding is designed as a way to work around this potential
          problem. You should NOT need the '-nopad' option for any reason
          that Zhark can think of, but it is here to be symmetrical with
          3dAllineate.
        * Note that the output (warped from source) dataset will be on the
          base dataset grid whether or not zero-padding is allowed. However,
          unless you use the following option, allowing zero-padding (i.e.,
          the default operation) will make the output WARP dataset(s) be on
          a larger grid (also see '-expad' below).
        argument: ``-nopad``
nopadWARP: (a boolean)
        If for some reason you require the warp volume to match the base
        volume, then use this option to have the output WARP dataset(s)
        truncated.
        argument: ``-nopadWARP``
        mutually_exclusive: allsave, expad
expad: (an integer (int or long))
        This option instructs the program to pad the warp by an extra 'EE'
        voxels (and then 3dQwarp starts optimizing it).
        * This option is seldom needed, but can be useful if you might later
          catenate the nonlinear warp -- via 3dNwarpCat -- with an affine
          transformation that contains a large shift. Under that
          circumstance, the nonlinear warp might be shifted partially
          outside its original grid, so expanding that grid can avoid this
          problem.
        * Note that this option perforce turns off '-nopadWARP'.
        argument: ``-expad %d``
        mutually_exclusive: nopadWARP
ballopt: (a boolean)
        Normally, the incremental warp parameters are optimized inside a
        rectangular 'box' (24 dimensional for cubic patches, 81 for quintic
        patches), whose limits define the amount of distortion allowed at
        each step. Using '-ballopt' switches these limits to be applied to a
        'ball' (interior of a hypersphere), which can allow for larger
        incremental displacements. Use this option if you think things need
        to be able to move farther.
        argument: ``-ballopt``
        mutually_exclusive: workhard, boxopt
baxopt: (a boolean)
        Use the 'box' optimization limits instead of the 'ball' [this is the
        default at present].
        * Note that if '-workhard' is used, then ball and box optimization
          are alternated in the different iterations at each level, so these
          two options have no effect in that case.
        argument: ``-boxopt``
        mutually_exclusive: workhard, ballopt
verb: (a boolean)
        more detailed description of the process
        argument: ``-verb``
        mutually_exclusive: quiet
quiet: (a boolean)
        Cut out most of the fun fun fun progress messages :-(
        argument: ``-quiet``
        mutually_exclusive: verb
overwrite: (a boolean)
        Overwrite outputs
        argument: ``-overwrite``
lpc: (a boolean)
        Local Pearson minimization (i.e., EPI-T1 registration).
        This option has not been extensively tested.
        If you use '-lpc', then '-maxlev 0' is automatically set. If you
        want to go to more refined levels, you can set '-maxlev'. This
        should be set up to have lpc as the second-to-last argument and
        maxlev as the last argument, as needed by AFNI. Using maxlev > 1 is
        not recommended for EPI-T1 alignment.
        argument: ``-lpc``, position: -2
        mutually_exclusive: nmi, mi, hel, lpa, pear
lpa: (a boolean)
        Local Pearson maximization. This option has not been extensively
        tested.
        argument: ``-lpa``
        mutually_exclusive: nmi, mi, lpc, hel, pear
hel: (a boolean)
        Hellinger distance: a matching function for the adventurous. This
        option has NOT been extensively tested for usefulness and should be
        considered experimental at this infundibulum.
        argument: ``-hel``
        mutually_exclusive: nmi, mi, lpc, lpa, pear
mi: (a boolean)
        Mutual Information: a matching function for the adventurous. This
        option has NOT been extensively tested for usefulness and should be
        considered experimental at this infundibulum.
        argument: ``-mi``
        mutually_exclusive: nmi, hel, lpc, lpa, pear
nmi: (a boolean)
        Normalized Mutual Information: a matching function for the
        adventurous. This option has NOT been extensively tested for
        usefulness and should be considered experimental at this
        infundibulum.
        argument: ``-nmi``
        mutually_exclusive: mi, hel, lpc, lpa, pear
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

warped_source: (a pathlike object or string representing a file)
        Warped source file. If plusminus is used, this is the undistorted
        source file.
warped_base: (a pathlike object or string representing a file)
        Undistorted base file.
source_warp: (a pathlike object or string representing a file)
        Displacement in mm for the source image. If plusminus is used, this
        is the field susceptibility correction warp (in 'mm') for the source
        image.
base_warp: (a pathlike object or string representing a file)
        Displacement in mm for the base image. If plusminus is used, this is
        the field susceptibility correction warp (in 'mm') for the base
        image. This is only output if the plusminus or iwarp options are
        passed.
weights: (a pathlike object or string representing a file)
        Auto-computed weight volume.


ROIStats

Link to code

Wraps the executable command 3dROIstats.

Display statistics over masked regions

For complete details, see the 3dROIstats Documentation

Examples

>>> from nipype.interfaces import afni
>>> roistats = afni.ROIStats()
>>> roistats.inputs.in_file = 'functional.nii'
>>> roistats.inputs.mask_file = 'skeleton_mask.nii.gz'
>>> roistats.inputs.stat = ['mean', 'median', 'voxels']
>>> roistats.inputs.nomeanout = True
>>> roistats.cmdline
'3dROIstats -mask skeleton_mask.nii.gz -nomeanout -nzmean -nzmedian -nzvoxels functional.nii > functional_roistat.1D'
>>> res = roistats.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input dataset
        argument: ``%s``, position: -2

[Optional]
mask: (a pathlike object or string representing an existing file)
        input mask
        argument: ``-mask %s``, position: 3
mask_file: (a pathlike object or string representing an existing
          file)
        input mask
        argument: ``-mask %s``
mask_f2short: (a boolean)
        Tells the program to convert a float mask to short integers, by
        simple rounding.
        argument: ``-mask_f2short``
num_roi: (an integer (int or long))
        Forces the assumption that the mask dataset's ROIs are denoted by 1
        to n inclusive. Normally, the program figures out the ROIs on its
        own. This option is useful if a) you are certain that the mask
        dataset has no values outside the range [0 n], b) there may be some
        ROIs missing between [1 n] in the mask dataset, and c) you want
        those columns in the output anyway so the output lines up with the
        output from other invocations of 3dROIstats.
        argument: ``-numroi %s``
zerofill: (a unicode string)
        For ROI labels not found, use the provided string instead of a '0'
        in the output file. Only active if `num_roi` is enabled.
        argument: ``-zerofill %s``
        requires: num_roi
roisel: (a pathlike object or string representing an existing file)
        Only considers ROIs denoted by values found in the specified file.
        Note that the order of the ROIs as specified in the file is not
        preserved. So an SEL.1D of '2 8 20' produces the same output as '8
        20 2'.
        argument: ``-roisel %s``
debug: (a boolean)
        print debug information
        argument: ``-debug``
quiet: (a boolean)
        execute quietly
        argument: ``-quiet``
nomeanout: (a boolean)
        Do not include the (zero-inclusive) mean among computed stats
        argument: ``-nomeanout``
nobriklab: (a boolean)
        Do not print the sub-brick label next to its index
        argument: ``-nobriklab``
format1D: (a boolean)
        Output results in a 1D format that includes commented labels
        argument: ``-1Dformat``
        mutually_exclusive: format1DR
format1DR: (a boolean)
        Output results in a 1D format that includes uncommented labels. May
        not work optimally with typical 1D functions, but is useful for R
        functions.
        argument: ``-1DRformat``
        mutually_exclusive: format1D
stat: (a list of items which are 'mean' or 'sum' or 'voxels' or
          'minmax' or 'sigma' or 'median' or 'mode' or 'summary' or
          'zerominmax' or 'zerosigma' or 'zeromedian' or 'zeromode')
        statistics to compute. Options include: * mean = Compute the mean
        using only nonzero voxels; the zero-inclusive mean is computed by
        default (see nomeanout).
         * median = Compute the median of nonzero voxels
         * mode = Compute the mode of nonzero voxels. (integral valued sets
        only)
         * minmax = Compute the min/max of nonzero voxels
         * sum = Compute the sum using only nonzero voxels.
         * voxels = Compute the number of nonzero voxels
         * sigma = Compute the standard deviation of nonzero voxels
        Statistics that include zero-valued voxels:
         * zerominmax = Compute the min/max of all voxels.
         * zerosigma = Compute the standard deviation of all voxels.
         * zeromedian = Compute the median of all voxels.
         * zeromode = Compute the mode of all voxels.
         * summary = Only output a summary line with the grand mean across
        all briks in the input dataset. This option cannot be used with
        nomeanout.
        More than one option can be specified.
        argument: ``%s...``
out_file: (a pathlike object or string representing a file)
        output file
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output tab-separated values file
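
Judging from the doctest above (where ['mean', 'median', 'voxels']
becomes '-nzmean -nzmedian -nzvoxels'), the stat names appear to map
onto 3dROIstats flags roughly as follows. This mapping is an inference
from the example, not a verbatim copy of nipype's internals; verify it
against your nipype version:

```python
# Assumed mapping from 'stat' values to 3dROIstats flags: the default
# stats use the nonzero ('-nz*') flags, while the 'zero*' variants use
# the zero-inclusive forms.
STAT_FLAGS = {
    'mean': '-nzmean', 'median': '-nzmedian', 'mode': '-nzmode',
    'minmax': '-nzminmax', 'sum': '-nzsum', 'voxels': '-nzvoxels',
    'sigma': '-nzsigma', 'summary': '-summary',
    'zerominmax': '-minmax', 'zerosigma': '-sigma',
    'zeromedian': '-median', 'zeromode': '-mode',
}

flags = [STAT_FLAGS[s] for s in ['mean', 'median', 'voxels']]
# matches the flags shown in the doctest: -nzmean -nzmedian -nzvoxels
```
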

Retroicor

Link to code

Wraps the executable command 3dretroicor.

Performs Retrospective Image Correction for physiological motion effects, using a slightly modified version of the RETROICOR algorithm.

The durations of the physiological inputs are assumed to equal the duration of the dataset. Any constant sampling rate may be used, but 40 Hz seems to be acceptable. This program’s cardiac peak detection algorithm is rather simplistic, so you might try using the scanner’s cardiac gating output (transform it to a spike wave if necessary).

This program uses slice timing information embedded in the dataset to estimate the proper cardiac/respiratory phase for each slice. It makes sense to run this program before any program that may destroy the slice timings (e.g. 3dvolreg for motion correction).

For complete details, see the 3dretroicor Documentation.

Examples

>>> from nipype.interfaces import afni
>>> ret = afni.Retroicor()
>>> ret.inputs.in_file = 'functional.nii'
>>> ret.inputs.card = 'mask.1D'
>>> ret.inputs.resp = 'resp.1D'
>>> ret.inputs.outputtype = 'NIFTI'
>>> ret.cmdline
'3dretroicor -prefix functional_retroicor.nii -resp resp.1D -card mask.1D functional.nii'
>>> res = ret.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dretroicor
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``, position: 1
card: (a pathlike object or string representing an existing file)
        1D cardiac data file for cardiac correction
        argument: ``-card %s``, position: -2
resp: (a pathlike object or string representing an existing file)
        1D respiratory waveform data for correction
        argument: ``-resp %s``, position: -3
threshold: (an integer (int or long))
        Threshold for detection of R-wave peaks in input (Make sure it is
        above the background noise level, Try 3/4 or 4/5 times range plus
        minimum)
        argument: ``-threshold %d``, position: -4
order: (an integer (int or long))
        The order of the correction (2 is typical)
        argument: ``-order %s``, position: -5
cardphase: (a pathlike object or string representing a file)
        Filename for 1D cardiac phase output
        argument: ``-cardphase %s``, position: -6
respphase: (a pathlike object or string representing a file)
        Filename for 1D resp phase output
        argument: ``-respphase %s``, position: -7
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables
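
The threshold hint above ('3/4 or 4/5 times range plus minimum') amounts
to simple arithmetic on the cardiac trace; the sample values below are
hypothetical:

```python
# Hypothetical 1D cardiac samples; pick a starting R-wave threshold at
# 3/4 of the signal range above the minimum, per the hint above.
card = [2.0, 3.5, 9.0, 3.0, 2.5, 8.5, 3.2]
lo, hi = min(card), max(card)
threshold = lo + 0.75 * (hi - lo)
# here: 2.0 + 0.75 * 7.0 = 7.25, safely above the ~3.x background values
```
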

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file


Seg

Link to code

Wraps the executable command 3dSeg.

3dSeg segments brain volumes into tissue classes. The program allows for adding a variety of global and voxelwise priors. However, for the moment, only mixing fractions and MRF are documented.

For complete details, see the 3dSeg Documentation.

Examples

>>> from nipype.interfaces.afni import preprocess
>>> seg = preprocess.Seg()
>>> seg.inputs.in_file = 'structural.nii'
>>> seg.inputs.mask = 'AUTO'
>>> seg.cmdline
'3dSeg -mask AUTO -anat structural.nii'
>>> res = seg.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        ANAT is the volume to segment
        argument: ``-anat %s``, position: -1
mask: ('AUTO' or a pathlike object or string representing an existing
          file)
        only non-zero voxels in mask are analyzed. mask can either be a
        dataset or the string "AUTO" which would use AFNI's automask
        function to create the mask.
        argument: ``-mask %s``, position: -2

[Optional]
blur_meth: ('BFT' or 'BIM')
        set the blurring method for bias field estimation
        argument: ``-blur_meth %s``
bias_fwhm: (a float)
        The amount of blurring used when estimating the field bias with the
        Wells method
        argument: ``-bias_fwhm %f``
classes: (a unicode string)
        CLASS_STRING is a semicolon delimited string of class labels
        argument: ``-classes %s``
bmrf: (a float)
        Weighting factor controlling spatial homogeneity of the
        classifications
        argument: ``-bmrf %f``
bias_classes: (a unicode string)
        A semicolon delimited string of classes that contribute to the
        estimation of the bias field
        argument: ``-bias_classes %s``
prefix: (a unicode string)
        the prefix for the output folder containing all output volumes
        argument: ``-prefix %s``
mixfrac: (a unicode string)
        MIXFRAC sets up the volume-wide (within mask) tissue fractions while
        initializing the segmentation (see IGNORE for exception)
        argument: ``-mixfrac %s``
mixfloor: (a float)
        Set the minimum value for any class's mixing fraction
        argument: ``-mixfloor %f``
main_N: (an integer (int or long))
        Number of iterations to perform.
        argument: ``-main_N %d``
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

SkullStrip

Link to code

Wraps the executable command 3dSkullStrip.

A program to extract the brain from the surrounding tissue in MRI T1-weighted images.

For complete details, see the 3dSkullStrip Documentation.

Examples

>>> from nipype.interfaces import afni
>>> skullstrip = afni.SkullStrip()
>>> skullstrip.inputs.in_file = 'functional.nii'
>>> skullstrip.inputs.args = '-o_ply'
>>> skullstrip.cmdline
'3dSkullStrip -input functional.nii -o_ply -prefix functional_skullstrip'
>>> res = skullstrip.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dSkullStrip
        argument: ``-input %s``, position: 1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

TCorr1D

Link to code

Wraps the executable command 3dTcorr1D.

Computes the correlation coefficient between each voxel time series in the input 3D+time dataset and a 1D time series.

For complete details, see the 3dTcorr1D Documentation.

Examples

>>> from nipype.interfaces import afni
>>> tcorr1D = afni.TCorr1D()
>>> tcorr1D.inputs.xset = 'u_rc1s1_Template.nii'
>>> tcorr1D.inputs.y_1d = 'seed.1D'
>>> tcorr1D.cmdline
'3dTcorr1D -prefix u_rc1s1_Template_correlation.nii.gz  u_rc1s1_Template.nii  seed.1D'
>>> res = tcorr1D.run()  # doctest: +SKIP
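
Conceptually, with the default -pearson option each voxel's time series in xset is correlated against the y_1d series using the standard Pearson coefficient. A minimal pure-Python sketch of that computation (illustrative only, not how 3dTcorr1D is implemented internally):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])  # perfectly correlated series
```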

Inputs:

[Mandatory]
xset: (a pathlike object or string representing an existing file)
        3d+time dataset input
        argument: `` %s``, position: -2
y_1d: (a pathlike object or string representing an existing file)
        1D time series file input
        argument: `` %s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output filename prefix
        argument: ``-prefix %s``
pearson: (a boolean)
        Correlation is the normal Pearson correlation coefficient
        argument: `` -pearson``, position: 1
        mutually_exclusive: spearman, quadrant, ktaub
spearman: (a boolean)
        Correlation is the Spearman (rank) correlation coefficient
        argument: `` -spearman``, position: 1
        mutually_exclusive: pearson, quadrant, ktaub
quadrant: (a boolean)
        Correlation is the quadrant correlation coefficient
        argument: `` -quadrant``, position: 1
        mutually_exclusive: pearson, spearman, ktaub
ktaub: (a boolean)
        Correlation is the Kendall's tau_b correlation coefficient
        argument: `` -ktaub``, position: 1
        mutually_exclusive: pearson, spearman, quadrant
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file containing correlations

TCorrMap

Link to code

Wraps the executable command 3dTcorrMap.

For each voxel time series, computes the correlation between it and all other voxels, and combines this set of values into the output dataset(s) in some way.

For complete details, see the 3dTcorrMap Documentation.

Examples

>>> from nipype.interfaces import afni
>>> tcm = afni.TCorrMap()
>>> tcm.inputs.in_file = 'functional.nii'
>>> tcm.inputs.mask = 'mask.nii'
>>> tcm.inputs.mean_file = 'functional_meancorr.nii'
>>> tcm.cmdline # doctest: +SKIP
'3dTcorrMap -input functional.nii -mask mask.nii -Mean functional_meancorr.nii'
>>> res = tcm.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        argument: ``-input %s``

[Optional]
seeds: (a pathlike object or string representing an existing file)
        argument: ``-seed %s``
        mutually_exclusive: seeds_width
mask: (a pathlike object or string representing an existing file)
        argument: ``-mask %s``
automask: (a boolean)
        argument: ``-automask``
polort: (an integer (int or long))
        argument: ``-polort %d``
bandpass: (a tuple of the form: (a float, a float))
        argument: ``-bpass %f %f``
regress_out_timeseries: (a pathlike object or string representing an
          existing file)
        argument: ``-ort %s``
blur_fwhm: (a float)
        argument: ``-Gblur %f``
seeds_width: (a float)
        argument: ``-Mseed %f``
        mutually_exclusive: seeds
mean_file: (a pathlike object or string representing a file)
        argument: ``-Mean %s``
zmean: (a pathlike object or string representing a file)
        argument: ``-Zmean %s``
qmean: (a pathlike object or string representing a file)
        argument: ``-Qmean %s``
pmean: (a pathlike object or string representing a file)
        argument: ``-Pmean %s``
thresholds: (a list of items which are an integer (int or long))
absolute_threshold: (a pathlike object or string representing a file)
        argument: ``-Thresh %f %s``
        mutually_exclusive: absolute_threshold, var_absolute_threshold,
          var_absolute_threshold_normalize
var_absolute_threshold: (a pathlike object or string representing a
          file)
        argument: ``-VarThresh %f %f %f %s``
        mutually_exclusive: absolute_threshold, var_absolute_threshold,
          var_absolute_threshold_normalize
var_absolute_threshold_normalize: (a pathlike object or string
          representing a file)
        argument: ``-VarThreshN %f %f %f %s``
        mutually_exclusive: absolute_threshold, var_absolute_threshold,
          var_absolute_threshold_normalize
correlation_maps: (a pathlike object or string representing a file)
        argument: ``-CorrMap %s``
correlation_maps_masked: (a pathlike object or string representing a
          file)
        argument: ``-CorrMask %s``
expr: (a unicode string)
average_expr: (a pathlike object or string representing a file)
        argument: ``-Aexpr %s %s``
        mutually_exclusive: average_expr, average_expr_nonzero, sum_expr
average_expr_nonzero: (a pathlike object or string representing a
          file)
        argument: ``-Cexpr %s %s``
        mutually_exclusive: average_expr, average_expr_nonzero, sum_expr
sum_expr: (a pathlike object or string representing a file)
        argument: ``-Sexpr %s %s``
        mutually_exclusive: average_expr, average_expr_nonzero, sum_expr
histogram_bin_numbers: (an integer (int or long))
histogram: (a pathlike object or string representing a file)
        argument: ``-Hist %d %s``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

mean_file: (a pathlike object or string representing a file)
zmean: (a pathlike object or string representing a file)
qmean: (a pathlike object or string representing a file)
pmean: (a pathlike object or string representing a file)
absolute_threshold: (a pathlike object or string representing a file)
var_absolute_threshold: (a pathlike object or string representing a
          file)
var_absolute_threshold_normalize: (a pathlike object or string
          representing a file)
correlation_maps: (a pathlike object or string representing a file)
correlation_maps_masked: (a pathlike object or string representing a
          file)
average_expr: (a pathlike object or string representing a file)
average_expr_nonzero: (a pathlike object or string representing a
          file)
sum_expr: (a pathlike object or string representing a file)
histogram: (a pathlike object or string representing a file)

TCorrelate

Link to code

Wraps the executable command 3dTcorrelate.

Computes the correlation coefficient between corresponding voxel time series in two input 3D+time datasets ‘xset’ and ‘yset’.

For complete details, see the 3dTcorrelate Documentation.

Examples

>>> from nipype.interfaces import afni
>>> tcorrelate = afni.TCorrelate()
>>> tcorrelate.inputs.xset = 'u_rc1s1_Template.nii'
>>> tcorrelate.inputs.yset = 'u_rc1s2_Template.nii'
>>> tcorrelate.inputs.out_file = 'functional_tcorrelate.nii.gz'
>>> tcorrelate.inputs.polort = -1
>>> tcorrelate.inputs.pearson = True
>>> tcorrelate.cmdline
'3dTcorrelate -prefix functional_tcorrelate.nii.gz -pearson -polort -1 u_rc1s1_Template.nii u_rc1s2_Template.nii'
>>> res = tcorrelate.run()  # doctest: +SKIP

Inputs:

[Mandatory]
xset: (a pathlike object or string representing an existing file)
        input xset
        argument: ``%s``, position: -2
yset: (a pathlike object or string representing an existing file)
        input yset
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
pearson: (a boolean)
        Correlation is the normal Pearson correlation coefficient
        argument: ``-pearson``
polort: (an integer (int or long))
        Remove polynomical trend of order m
        argument: ``-polort %d``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

TNorm

Link to code

Wraps the executable command 3dTnorm.

Normalizes each voxel time series in a 3D+time dataset, producing a new 3D+time dataset in which each time series has the chosen norm (unit L2 norm by default).

For complete details, see the 3dTnorm Documentation.

Examples

>>> from nipype.interfaces import afni
>>> tnorm = afni.TNorm()
>>> tnorm.inputs.in_file = 'functional.nii'
>>> tnorm.inputs.norm2 = True
>>> tnorm.inputs.out_file = 'rm.errts.unit errts+tlrc'
>>> tnorm.cmdline
'3dTnorm -norm2 -prefix rm.errts.unit errts+tlrc functional.nii'
>>> res = tnorm.run()  # doctest: +SKIP
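
For intuition, the three normalization modes are simple per-voxel vector norms. A minimal pure-Python sketch (illustrative only; the variable names are made up and not part of the interface):

```python
import math

ts = [3.0, 4.0, 0.0]                     # one voxel's time series

l2 = math.sqrt(sum(x * x for x in ts))   # Euclidean length
norm2 = [x / l2 for x in ts]             # -norm2: sum of squares = 1

l1 = sum(abs(x) for x in ts)
norm1 = [x / l1 for x in ts]             # -norm1: sum of absolute values = 1

linf = max(abs(x) for x in ts)
normx = [x / linf for x in ts]           # -normx: max absolute value = 1
```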

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dTNorm
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
norm2: (a boolean)
        L2 normalize (sum of squares = 1) [DEFAULT]
        argument: ``-norm2``
normR: (a boolean)
        normalize so sum of squares = number of time points, e.g., so RMS =
        1.
        argument: ``-normR``
norm1: (a boolean)
        L1 normalize (sum of absolute values = 1)
        argument: ``-norm1``
normx: (a boolean)
        Scale so max absolute value = 1 (L_infinity norm)
        argument: ``-normx``
polort: (an integer (int or long))
        Detrend with polynomials of order p before normalizing
         [DEFAULT = don't do this]
         * Use '-polort 0' to remove the mean, for example
        argument: ``-polort %s``
L1fit: (a boolean)
        Detrend with L1 regression (L2 is the default)
         * This option is here just for the hell of it
        argument: ``-L1fit``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

TProject

Link to code

Wraps the executable command 3dTproject.

This program projects (detrends) out various ‘nuisance’ time series from each voxel in the input dataset. Note that all the projections are done via linear regression, including the frequency-based options such as ‘-passband’. In this way, you can bandpass time-censored data and, at the same time, remove other time series of no interest (e.g., physiological estimates, motion parameters).

For complete details, see the 3dTproject Documentation.

Examples

>>> from nipype.interfaces import afni
>>> tproject = afni.TProject()
>>> tproject.inputs.in_file = 'functional.nii'
>>> tproject.inputs.bandpass = (0.00667, 99999)
>>> tproject.inputs.polort = 3
>>> tproject.inputs.automask = True
>>> tproject.inputs.out_file = 'projected.nii.gz'
>>> tproject.cmdline
'3dTproject -input functional.nii -automask -bandpass 0.00667 99999 -polort 3 -prefix projected.nii.gz'
>>> res = tproject.run()  # doctest: +SKIP
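
The bandpass tuple is given in Hz; a common way to choose the low cut-off is the reciprocal of the slowest period of interest (purely illustrative arithmetic; the helper name is made up, not part of the interface):

```python
def bandpass_low(period_s):
    """Low cut-off frequency in Hz for a slowest period of interest (seconds)."""
    return 1.0 / period_s

low = bandpass_low(150.0)  # ~0.00667 Hz, matching the example above
```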

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dTproject
        argument: ``-input %s``, position: 1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``, position: -1
censor: (a pathlike object or string representing an existing file)
        filename of censor .1D time series
         * This is a file of 1s and 0s, indicating which
         time points are to be included (1) and which are
         to be excluded (0).
        argument: ``-censor %s``
censortr: (a list of items which are a unicode string)
        list of strings that specify time indexes
         to be removed from the analysis. Each string is
         of one of the following forms:
         37 => remove global time index #37
         2:37 => remove time index #37 in run #2
         37..47 => remove global time indexes #37-47
         37-47 => same as above
         2:37..47 => remove time indexes #37-47 in run #2
         *:0-2 => remove time indexes #0-2 in all runs
         +Time indexes within each run start at 0.
         +Run indexes start at 1 (just to be confusing).
         +N.B.: 2:37,47 means index #37 in run #2 and
         global time index 47; it does NOT mean
         index #37 in run #2 AND index #47 in run #2.
        argument: ``-CENSORTR %s``
cenmode: ('KILL' or 'ZERO' or 'NTRP')
        specifies how censored time points are treated in
         the output dataset:
         + mode = ZERO ==> put zero values in their place
         ==> output dataset is same length as input
         + mode = KILL ==> remove those time points
         ==> output dataset is shorter than input
         + mode = NTRP ==> censored values are replaced by interpolated
         neighboring (in time) non-censored values,
         BEFORE any projections, and then the
         analysis proceeds without actual removal
         of any time points -- this feature is to
         keep the Spanish Inquisition happy.
         * The default mode is KILL !!!
        argument: ``-cenmode %s``
concat: (a pathlike object or string representing an existing file)
        The catenation file, as in 3dDeconvolve, containing the
         TR indexes of the start points for each contiguous run
         within the input dataset (the first entry should be 0).
         ++ Also as in 3dDeconvolve, if the input dataset is
         automatically catenated from a collection of datasets,
         then the run start indexes are determined directly,
         and '-concat' is not needed (and will be ignored).
         ++ Each run must have at least 9 time points AFTER
         censoring, or the program will not work!
         ++ The only use made of this input is in setting up
         the bandpass/stopband regressors.
         ++ '-ort' and '-dsort' regressors run through all time
         points, as read in. If you want separate projections
         in each run, then you must either break these ort files
         into appropriate components, OR you must run 3dTproject
         for each run separately, using the appropriate pieces
         from the ort files via the '{...}' selector for the
         1D files and the '[...]' selector for the datasets.
        argument: ``-concat %s``
noblock: (a boolean)
        Also as in 3dDeconvolve, if you want the program to treat
         an auto-catenated dataset as one long run, use this option.
         ++ However, '-noblock' will not affect catenation if you use
         the '-concat' option.
        argument: ``-noblock``
ort: (a pathlike object or string representing an existing file)
        Remove each column in file
         ++ Each column will have its mean removed.
        argument: ``-ort %s``
polort: (an integer (int or long))
        Remove polynomials up to and including degree pp.
         ++ Default value is 2.
         ++ It makes no sense to use a value of pp greater than
         2, if you are bandpassing out the lower frequencies!
         ++ For catenated datasets, each run gets a separate set
         of pp+1 Legendre polynomial regressors.
         ++ Use of -polort -1 is not advised (if data mean != 0),
         even if -ort contains constant terms, as all means are
         removed.
        argument: ``-polort %d``
dsort: (a list of items which are a pathlike object or string
          representing an existing file)
        Remove the 3D+time time series in dataset fset.
         ++ That is, 'fset' contains a different nuisance time
         series for each voxel (e.g., from AnatICOR).
         ++ Multiple -dsort options are allowed.
        argument: ``-dsort %s...``
bandpass: (a tuple of the form: (a float, a float))
        Remove all frequencies EXCEPT those in the range
        argument: ``-bandpass %g %g``
stopband: (a tuple of the form: (a float, a float))
        Remove all frequencies in the range
        argument: ``-stopband %g %g``
TR: (a float)
        Use time step dd for the frequency calculations,
         rather than the value stored in the dataset header.
        argument: ``-TR %g``
mask: (a pathlike object or string representing an existing file)
        Only operate on voxels nonzero in the mset dataset.
         ++ Voxels outside the mask will be filled with zeros.
         ++ If no masking option is given, then all voxels
         will be processed.
        argument: ``-mask %s``
automask: (a boolean)
        Generate a mask automatically
        argument: ``-automask``
        mutually_exclusive: mask
blur: (a float)
        Blur (inside the mask only) with a filter that has
         width (FWHM) of fff millimeters.
         ++ Spatial blurring (if done) is after the time
         series filtering.
        argument: ``-blur %g``
norm: (a boolean)
        Normalize each output time series to have sum of
         squares = 1. This is the LAST operation.
        argument: ``-norm``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

TShift

Link to code

Wraps the executable command 3dTshift.

Shifts voxel time series from input so that separate slices are aligned to the same temporal origin.

For complete details, see the 3dTshift Documentation.

Examples

Slice timing details may be specified explicitly via the slice_timing input:

>>> from nipype.interfaces import afni
>>> import numpy as np
>>> TR = 2.5
>>> tshift = afni.TShift()
>>> tshift.inputs.in_file = 'functional.nii'
>>> tshift.inputs.tzero = 0.0
>>> tshift.inputs.tr = '%.1fs' % TR
>>> tshift.inputs.slice_timing = list(np.arange(40) / TR)
>>> tshift.cmdline
'3dTshift -prefix functional_tshift -tpattern @slice_timing.1D -TR 2.5s -tzero 0.0 functional.nii'

When the slice_timing input is used, the timing_file output is populated, in this case with the generated file.

>>> tshift._list_outputs()['timing_file']  # doctest: +ELLIPSIS
'.../slice_timing.1D'
>>> np.loadtxt(tshift._list_outputs()['timing_file']).tolist()[:5]
[0.0, 0.4, 0.8, 1.2, 1.6]

If slice_encoding_direction is set to 'k-', the slice timing is reversed:

>>> tshift.inputs.slice_encoding_direction = 'k-'
>>> tshift.cmdline
'3dTshift -prefix functional_tshift -tpattern @slice_timing.1D -TR 2.5s -tzero 0.0 functional.nii'
>>> np.loadtxt(tshift._list_outputs()['timing_file']).tolist()[:5]
[15.6, 15.2, 14.8, 14.4, 14.0]

This method creates a slice_timing.1D file to be passed to 3dTshift. A pre-existing slice-timing file may be used in the same way:

>>> tshift = afni.TShift()
>>> tshift.inputs.in_file = 'functional.nii'
>>> tshift.inputs.tzero = 0.0
>>> tshift.inputs.tr = '%.1fs' % TR
>>> tshift.inputs.slice_timing = 'slice_timing.1D'
>>> tshift.cmdline
'3dTshift -prefix functional_tshift -tpattern @slice_timing.1D -TR 2.5s -tzero 0.0 functional.nii'

When a pre-existing file is provided, timing_file is simply passed through.

>>> tshift._list_outputs()['timing_file']  # doctest: +ELLIPSIS
'.../slice_timing.1D'

Alternatively, pre-specified slice timing patterns may be specified with the tpattern input. For example, to specify an alternating, ascending slice timing pattern:

>>> tshift = afni.TShift()
>>> tshift.inputs.in_file = 'functional.nii'
>>> tshift.inputs.tzero = 0.0
>>> tshift.inputs.tr = '%.1fs' % TR
>>> tshift.inputs.tpattern = 'alt+z'
>>> tshift.cmdline
'3dTshift -prefix functional_tshift -tpattern alt+z -TR 2.5s -tzero 0.0 functional.nii'

For backwards compatibility, tpattern may also take filenames prefixed with @. However, in this case, filenames are not validated, so this usage will be deprecated in future versions of Nipype.

>>> tshift = afni.TShift()
>>> tshift.inputs.in_file = 'functional.nii'
>>> tshift.inputs.tzero = 0.0
>>> tshift.inputs.tr = '%.1fs' % TR
>>> tshift.inputs.tpattern = '@slice_timing.1D'
>>> tshift.cmdline
'3dTshift -prefix functional_tshift -tpattern @slice_timing.1D -TR 2.5s -tzero 0.0 functional.nii'

In these cases, timing_file is undefined.

>>> tshift._list_outputs()['timing_file']  # doctest: +ELLIPSIS
<undefined>

In any configuration, the interface may be run as usual:

>>> res = tshift.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dTshift
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
tr: (a unicode string)
        manually set the TR. You can attach suffix "s" for seconds or "ms"
        for milliseconds.
        argument: ``-TR %s``
tzero: (a float)
        align each slice to given time offset
        argument: ``-tzero %s``
        mutually_exclusive: tslice
tslice: (an integer (int or long))
        align each slice to time offset of given slice
        argument: ``-slice %s``
        mutually_exclusive: tzero
ignore: (an integer (int or long))
        ignore the first set of points specified
        argument: ``-ignore %s``
interp: ('Fourier' or 'linear' or 'cubic' or 'quintic' or 'heptic')
        different interpolation methods (see 3dTshift for details) default =
        Fourier
        argument: ``-%s``
tpattern: ('alt+z' or 'altplus' or 'alt+z2' or 'alt-z' or 'altminus'
          or 'alt-z2' or 'seq+z' or 'seqplus' or 'seq-z' or 'seqminus' or a
          unicode string)
        use specified slice time pattern rather than one in header
        argument: ``-tpattern %s``
        mutually_exclusive: slice_timing
slice_timing: (a pathlike object or string representing an existing
          file or a list of items which are a float)
        time offsets from the volume acquisition onset for each slice
        argument: ``-tpattern @%s``
        mutually_exclusive: tpattern
slice_encoding_direction: ('k' or 'k-', nipype default value: k)
        Direction in which slice_timing is specified (default: k). If
        negative,slice_timing is defined in reverse order, that is, the
        first entry corresponds to the slice with the largest index, and the
        final entry corresponds to slice index zero. Only in effect when
        slice_timing is passed as list, not when it is passed as file.
rlt: (a boolean)
        Before shifting, remove the mean and linear trend
        argument: ``-rlt``
rltplus: (a boolean)
        Before shifting, remove the mean and linear trend and later put back
        the mean
        argument: ``-rlt+``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

timing_file: (a pathlike object or string representing a file)
        AFNI formatted timing file, if ``slice_timing`` is a list
out_file: (a pathlike object or string representing an existing file)
        output file

TSmooth

Link to code

Wraps the executable command 3dTsmooth.

Smooths each voxel time series in a 3D+time dataset and produces as output a new 3D+time dataset (e.g., lowpass filter in time).

For complete details, see the 3dTsmooth Documentation.

Examples

>>> from nipype.interfaces import afni
>>> smooth = afni.TSmooth()
>>> smooth.inputs.in_file = 'functional.nii'
>>> smooth.inputs.adaptive = 5
>>> smooth.cmdline
'3dTsmooth -adaptive 5 -prefix functional_smooth functional.nii'
>>> res = smooth.run()  # doctest: +SKIP
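
For intuition, the default 3-point linear smoother described under the lin input (0.15*a + 0.70*b + 0.15*c) can be sketched in pure Python; the endpoint handling here (copying the first and last points through unchanged) is an assumption for illustration only:

```python
def lin3(ts, m=0.70):
    """3-point linear filter: 0.5*(1-m)*a + m*b + 0.5*(1-m)*c.

    With m=0.70 the weights are 0.15/0.70/0.15, i.e. the -lin default.
    """
    out = list(ts)                # endpoints kept as-is (assumption)
    w = 0.5 * (1.0 - m)
    for i in range(1, len(ts) - 1):
        out[i] = w * ts[i - 1] + m * ts[i] + w * ts[i + 1]
    return out

smoothed = lin3([0.0, 1.0, 0.0])  # the spike is spread over its neighbors
```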

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dTSmooth
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output file from 3dTSmooth
        argument: ``-prefix %s``
datum: (a unicode string)
        Sets the data type of the output dataset
        argument: ``-datum %s``
lin: (a boolean)
        3 point linear filter: 0.15*a + 0.70*b + 0.15*c [This is the default
        smoother]
        argument: ``-lin``
med: (a boolean)
        3 point median filter: median(a,b,c)
        argument: ``-med``
osf: (a boolean)
        3 point order statistics filter: 0.15*min(a,b,c) + 0.70*median(a,b,c)
        + 0.15*max(a,b,c)
        argument: ``-osf``
lin3: (an integer (int or long))
        3 point linear filter: 0.5*(1-m)*a + m*b + 0.5*(1-m)*c. Here, 'm' is
        a number strictly between 0 and 1.
        argument: ``-3lin %d``
hamming: (an integer (int or long))
        Use N point Hamming windows (N must be odd and bigger than 1).
        argument: ``-hamming %d``
blackman: (an integer (int or long))
        Use N point Blackman windows (N must be odd and bigger than 1).
        argument: ``-blackman %d``
custom: (a pathlike object or string representing a file)
        odd # of coefficients must be in a single column in ASCII file
        argument: ``-custom %s``
adaptive: (an integer (int or long))
        use adaptive mean filtering of width N (where N must be odd and
        bigger than 3).
        argument: ``-adaptive %d``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        output file

Volreg

Link to code

Wraps the executable command 3dvolreg.

Register input volumes to a base volume using AFNI 3dvolreg command

For complete details, see the 3dvolreg Documentation.

Examples

>>> from nipype.interfaces import afni
>>> volreg = afni.Volreg()
>>> volreg.inputs.in_file = 'functional.nii'
>>> volreg.inputs.args = '-Fourier -twopass'
>>> volreg.inputs.zpad = 4
>>> volreg.inputs.outputtype = 'NIFTI'
>>> volreg.cmdline  # doctest: +ELLIPSIS
'3dvolreg -Fourier -twopass -1Dfile functional.1D -1Dmatrix_save functional.aff12.1D -prefix functional_volreg.nii -zpad 4 -maxdisp1D functional_md.1D functional.nii'
>>> res = volreg.run()  # doctest: +SKIP
>>> from nipype.interfaces import afni
>>> volreg = afni.Volreg()
>>> volreg.inputs.in_file = 'functional.nii'
>>> volreg.inputs.interp = 'cubic'
>>> volreg.inputs.verbose = True
>>> volreg.inputs.zpad = 1
>>> volreg.inputs.basefile = 'functional.nii'
>>> volreg.inputs.out_file = 'rm.epi.volreg.r1'
>>> volreg.inputs.oned_file = 'dfile.r1.1D'
>>> volreg.inputs.oned_matrix_save = 'mat.r1.tshift+orig.1D'
>>> volreg.cmdline
'3dvolreg -cubic -1Dfile dfile.r1.1D -1Dmatrix_save mat.r1.tshift+orig.1D -prefix rm.epi.volreg.r1 -verbose -base functional.nii -zpad 1 -maxdisp1D functional_md.1D functional.nii'
>>> res = volreg.run()  # doctest: +SKIP
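
The matrix written by ``oned_matrix_save`` (the ``.aff12.1D`` file in the first example above) stores one row of 12 numbers per volume, which to our understanding encodes the 3x4 affine ``[R | t]`` in row-major order. A minimal sketch of applying such a row to a coordinate with NumPy, using invented values in place of a real file:

```python
import numpy as np

# One row from a hypothetical .aff12.1D file written by
# -1Dmatrix_save: 12 values forming the 3x4 affine [R | t]
# in row-major order (identity rotation, 2 mm shift here).
row = np.array([1.0, 0.0, 0.0, 2.0,
                0.0, 1.0, 0.0, 0.0,
                0.0, 0.0, 1.0, 0.0])
affine = row.reshape(3, 4)

# Apply the transform to a coordinate (mm).
xyz = np.array([10.0, -5.0, 3.0])
moved = affine[:, :3] @ xyz + affine[:, 3]
```

When the file holds one transform per time point, the same reshaping applies row by row.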

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dvolreg
        argument: ``%s``, position: -1

[Optional]
in_weight_volume: (a tuple of the form: (a pathlike object or string
          representing an existing file, an integer (int or long)) or a
          pathlike object or string representing an existing file)
        weights for each voxel specified by a file with an optional volume
        number (defaults to 0)
        argument: ``-weight '%s[%d]'``
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
basefile: (a pathlike object or string representing an existing file)
        base file for registration
        argument: ``-base %s``, position: -6
zpad: (an integer (int or long))
        Zeropad around the edges by 'n' voxels during rotations
        argument: ``-zpad %d``, position: -5
md1d_file: (a pathlike object or string representing a file)
        max displacement output file
        argument: ``-maxdisp1D %s``, position: -4
oned_file: (a pathlike object or string representing a file)
        1D movement parameters output file
        argument: ``-1Dfile %s``
verbose: (a boolean)
        more detailed description of the process
        argument: ``-verbose``
timeshift: (a boolean)
        time shift to mean slice time offset
        argument: ``-tshift 0``
copyorigin: (a boolean)
        copy base file origin coords to output
        argument: ``-twodup``
oned_matrix_save: (a pathlike object or string representing a file)
        Save the matrix transformation
        argument: ``-1Dmatrix_save %s``
interp: ('Fourier' or 'cubic' or 'heptic' or 'quintic' or 'linear')
        spatial interpolation methods [default = heptic]
        argument: ``-%s``
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        registered file
md1d_file: (a pathlike object or string representing an existing
          file)
        max displacement info file
oned_file: (a pathlike object or string representing an existing
          file)
        movement parameters info file
oned_matrix_save: (a pathlike object or string representing an
          existing file)
        matrix transformation from base to input

Warp

Link to code

Wraps the executable command 3dWarp.

Use 3dWarp for spatially transforming a dataset

For complete details, see the 3dWarp Documentation.

Examples

>>> from nipype.interfaces import afni
>>> warp = afni.Warp()
>>> warp.inputs.in_file = 'structural.nii'
>>> warp.inputs.deoblique = True
>>> warp.inputs.out_file = 'trans.nii.gz'
>>> warp.cmdline
'3dWarp -deoblique -prefix trans.nii.gz structural.nii'
>>> res = warp.run()  # doctest: +SKIP
>>> warp_2 = afni.Warp()
>>> warp_2.inputs.in_file = 'structural.nii'
>>> warp_2.inputs.newgrid = 1.0
>>> warp_2.inputs.out_file = 'trans.nii.gz'
>>> warp_2.cmdline
'3dWarp -newgrid 1.000000 -prefix trans.nii.gz structural.nii'
>>> res = warp_2.run()  # doctest: +SKIP

Inputs:

[Mandatory]
in_file: (a pathlike object or string representing an existing file)
        input file to 3dWarp
        argument: ``%s``, position: -1

[Optional]
out_file: (a pathlike object or string representing a file)
        output image file name
        argument: ``-prefix %s``
tta2mni: (a boolean)
        transform dataset from Talairach to MNI152
        argument: ``-tta2mni``
mni2tta: (a boolean)
        transform dataset from MNI152 to Talairach
        argument: ``-mni2tta``
matparent: (a pathlike object or string representing an existing
          file)
        apply transformation from 3dWarpDrive
        argument: ``-matparent %s``
oblique_parent: (a pathlike object or string representing an existing
          file)
        Read in the oblique transformation matrix from an oblique dataset
        and make cardinal dataset oblique to match
        argument: ``-oblique_parent %s``
deoblique: (a boolean)
        transform dataset from oblique to cardinal
        argument: ``-deoblique``
interp: ('linear' or 'cubic' or 'NN' or 'quintic')
        spatial interpolation methods [default = linear]
        argument: ``-%s``
gridset: (a pathlike object or string representing an existing file)
        copy grid of specified dataset
        argument: ``-gridset %s``
newgrid: (a float)
        specify grid of this size (mm)
        argument: ``-newgrid %f``
zpad: (an integer (int or long))
        pad input dataset with N planes of zero on all sides.
        argument: ``-zpad %d``
verbose: (a boolean)
        Print out some information along the way.
        argument: ``-verb``
save_warp: (a boolean)
        save warp as .mat file
        requires: verbose
num_threads: (an integer (int or long), nipype default value: 1)
        set number of threads
outputtype: ('NIFTI' or 'AFNI' or 'NIFTI_GZ')
        AFNI output filetype
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a pathlike object or string representing an existing file)
        Warped file.
warp_file: (a pathlike object or string representing a file)
        warp transform .mat file

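The ``deoblique`` option resamples a dataset whose axes are tilted relative to the cardinal axes onto a cardinal grid. The obliquity itself can be estimated from the image affine as the angle between each voxel axis and its nearest cardinal axis; a rough sketch with NumPy (the affine below is invented for illustration):

```python
import numpy as np

# Invented 3x3 direction/scaling part of an image affine:
# 2 mm voxels with a slight rotation about the y axis.
affine = np.array([
    [ 2.0, 0.0, 0.3],
    [ 0.0, 2.0, 0.0],
    [-0.3, 0.0, 2.0],
])

# Normalize each voxel-axis column, then take the cosine of its
# angle to the closest cardinal axis; the worst column sets the
# dataset's obliquity.
cols = affine / np.linalg.norm(affine, axis=0)
cos_best = np.abs(cols).max(axis=0)
obliquity_deg = np.degrees(np.arccos(cos_best.min()))
```

A cardinal dataset yields zero obliquity under this measure; here the tilt works out to roughly 8.5 degrees.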