interfaces.camino.convert

AnalyzeHeader

Link to code

Wraps the executable command analyzeheader.

Create or read an Analyze 7.5 header file.

Analyze image header; provides support for the most common header fields. Some fields, such as patient_id, are not currently supported. The program allows three nonstandard options: the field image_dimension.funused1 holds the image scale, so the intensity of each pixel in the associated .img file is (image value from file) * scale. Also, the origin of the Talairach coordinates (the midline of the anterior commissure) is encoded in the field data_history.originator. These extensions are included for compatibility with SPM.

All headers written with this program are big endian by default.
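
As a plain-Python illustration of the scale convention and big-endian default described above. The byte offset used for image_dimension.funused1 (112) is an assumption based on the standard Analyze 7.5 layout, not something read from this interface:

```python
import struct

# Sketch: the nonstandard scale field and how it is applied.
HEADER_SIZE = 348
FUNUSED1_OFFSET = 112  # assumed offset of image_dimension.funused1 in Analyze 7.5

header = bytearray(HEADER_SIZE)
# Write a scale factor of 2.5 as a big-endian float ('>'), the default byte order.
struct.pack_into('>f', header, FUNUSED1_OFFSET, 2.5)

scale = struct.unpack_from('>f', header, FUNUSED1_OFFSET)[0]
raw_voxels = [10, 20, 40]
scaled = [v * scale for v in raw_voxels]  # intensity = (image value from file) * scale
```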

Example

>>> import nipype.interfaces.camino as cmon
>>> hdr = cmon.AnalyzeHeader()
>>> hdr.inputs.in_file = 'tensor_fitted_data.Bdouble'
>>> hdr.inputs.scheme_file = 'A.scheme'
>>> hdr.inputs.data_dims = [256,256,256]
>>> hdr.inputs.voxel_dims = [1,1,1]
>>> hdr.run()

Inputs:

[Mandatory]
in_file: (an existing file name)
        Tensor-fitted data filename
        argument: ``< %s``, position: 1
datatype: ('byte' or 'char' or '[u]short' or '[u]int' or 'float' or
          'complex' or 'double')
        The char datatype is 8 bit (not the 16 bit char of Java), as
        specified by the Analyze 7.5 standard. The byte, ushort and uint
        types are not part of the Analyze specification but are supported by
        SPM.
        argument: ``-datatype %s``

[Optional]
scheme_file: (an existing file name)
        Camino scheme file (b values / vectors, see camino.fsl2scheme)
        argument: ``%s``, position: 2
readheader: (an existing file name)
        Reads header information from file and prints to stdout. If this
        option is not specified, then the program writes a header based on
        the other arguments.
        argument: ``-readheader %s``, position: 3
printimagedims: (an existing file name)
        Prints image data and voxel dimensions as Camino arguments and
        exits.
        argument: ``-printimagedims %s``, position: 3
printprogargs: (an existing file name)
        Prints data dimension (and type, if relevant) arguments for a
        specific Camino program, where prog is one of shredder,
        scanner2voxel, vcthreshselect, pdview, track.
        argument: ``-printprogargs %s``, position: 3
printintelbyteorder: (an existing file name)
        Prints 1 if the header is little-endian, 0 otherwise.
        argument: ``-printintelbyteorder %s``, position: 3
printbigendian: (an existing file name)
        Prints 1 if the header is big-endian, 0 otherwise.
        argument: ``-printbigendian %s``, position: 3
initfromheader: (an existing file name)
        Reads header information from file and initializes a new header with
        the values read from the file. You may replace any combination of
        fields in the new header by specifying subsequent options.
        argument: ``-initfromheader %s``, position: 3
data_dims: (a list of from 3 to 3 items which are an integer (int or
          long))
        data dimensions in voxels
        argument: ``-datadims %s``
voxel_dims: (a list of from 3 to 3 items which are a float)
        voxel dimensions in mm
        argument: ``-voxeldims %s``
centre: (a list of from 3 to 3 items which are an integer (int or
          long))
        Voxel specifying origin of Talairach coordinate system for SPM,
        default [0 0 0].
        argument: ``-centre %s``
picoseed: (a list of from 3 to 3 items which are an integer (int or
          long))
        Voxel specifying the seed (for PICo maps), default [0 0 0].
        argument: ``-picoseed %s``
nimages: (an integer (int or long))
        Number of images in the img file. Default 1.
        argument: ``-nimages %d``
offset: (an integer (int or long))
        According to the Analyze 7.5 standard, this is the byte offset in
        the .img file at which voxels start. This value can be negative to
        specify that the absolute value is applied for every image in the
        file.
        argument: ``-offset %d``
greylevels: (a list of from 2 to 2 items which are an integer (int or
          long))
        Minimum and maximum greylevels. Stored as shorts in the header.
        argument: ``-gl %s``
scaleslope: (a float)
        Intensities in the image are scaled by this factor by SPM and
        MRICro. Default is 1.0.
        argument: ``-scaleslope %d``
scaleinter: (a float)
        Constant to add to the image intensities. Used by SPM and MRIcro.
        argument: ``-scaleinter %d``
description: (a string)
        Short description - No spaces, max length 79 bytes. Will be null
        terminated automatically.
        argument: ``-description %s``
intelbyteorder: (a boolean)
        Write header in intel byte order (little-endian).
        argument: ``-intelbyteorder``
networkbyteorder: (a boolean)
        Write header in network byte order (big-endian). This is the default
        for new headers.
        argument: ``-networkbyteorder``
out_file: (a file name)
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

header: (an existing file name)
        Analyze header

DT2NIfTI

Link to code

Wraps the executable command dt2nii.

Converts Camino tensor data to NIfTI format.

Reads Camino diffusion tensors and converts them to NIfTI format as three .nii files.
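
The relationship between output_root and the three output files can be sketched as below. The "dt", "exitcode" and "lns0" suffixes mirror this interface's outputs, but the exact naming scheme is an assumption rather than a quote from the dt2nii documentation:

```python
# Hypothetical sketch of how output_root is prepended onto the three
# output filenames (suffixes assumed from this interface's output names).
def output_names(output_root):
    return [output_root + suffix + '.nii' for suffix in ('dt', 'exitcode', 'lns0')]
```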

Inputs:

[Mandatory]
in_file: (an existing file name)
        tract file
        argument: ``-inputfile %s``, position: 1
header_file: (an existing file name)
        A NIfTI .nii or .hdr file containing the header information
        argument: ``-header %s``, position: 3

[Optional]
output_root: (a file name)
        filename root prepended onto the names of three output files.
        argument: ``-outputroot %s``, position: 2
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

dt: (an existing file name)
        diffusion tensors in NIfTI format
exitcode: (an existing file name)
        exit codes from Camino reconstruction in NIfTI format
lns0: (an existing file name)
        estimated lns0 from Camino reconstruction in NIfTI format

Image2Voxel

Link to code

Wraps the executable command image2voxel.

Converts Analyze / NIFTI / MHA files to voxel order.

Converts scanner-order data in a supported image format to voxel-order data. Either takes a 4D file (all measurements in single image) or a list of 3D images.
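
Conceptually, the scanner-order to voxel-order conversion is a transpose: scanner order stores one whole volume per measurement, while voxel order stores all measurements for a voxel contiguously. A toy sketch using plain lists, not the real binary format:

```python
# Scanner order: volumes[m][v] holds voxel v of measurement m.
# Voxel order:   voxels[v][m] holds measurement m of voxel v.
def scanner_to_voxel(volumes):
    """Transpose measurement-major data into voxel-major data."""
    return [list(measurements) for measurements in zip(*volumes)]

# Two measurements of a three-voxel "image":
volumes = [[1, 2, 3],     # measurement 0
           [10, 20, 30]]  # measurement 1
```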

Examples

>>> import nipype.interfaces.camino as cmon
>>> img2vox = cmon.Image2Voxel()
>>> img2vox.inputs.in_file = '4d_dwi.nii'
>>> img2vox.run()

Inputs:

[Mandatory]
in_file: (an existing file name)
        4d image file
        argument: ``-4dimage %s``, position: 1

[Optional]
out_type: ('float' or 'char' or 'short' or 'int' or 'long' or
          'double', nipype default value: float)
        "i.e. Bfloat". Can be "char", "short", "int", "long", "float" or
        "double"
        argument: ``-outputdatatype %s``, position: 2
out_file: (a file name)
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

voxel_order: (an existing file name)
        path/name of 4D volume in voxel order

NIfTIDT2Camino

Link to code

Wraps the executable command niftidt2camino.

Converts NIFTI-1 diffusion tensors to Camino format. The program reads the NIFTI header but does not apply any spatial transformations to the data. The NIFTI intensity scaling parameters are applied.

The output is the tensors in Camino voxel ordering: [exit, ln(S0), dxx, dxy, dxz, dyy, dyz, dzz].

The exit code is set to 0 unless a background mask is supplied, in which case the code is 0 in brain voxels and -1 in background voxels.

The value of ln(S0) in the output is taken from a file if one is supplied, otherwise it is set to 0.

NOTE FOR FSL USERS - FSL’s dtifit can output NIFTI tensors, but they are not stored in the usual way (NIFTI_INTENT_SYMMATRIX, lower-triangular order). FSL’s tensors follow the ITK / VTK “upper-triangular” convention, so you will need to use the -uppertriangular option to convert them correctly.
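
As a toy illustration of the component reordering (the exit code and ln(S0) columns are omitted), assuming the NIFTI_INTENT_SYMMATRIX lower-triangular order from the NIfTI standard:

```python
# NIFTI_INTENT_SYMMATRIX stores the lower triangle row by row:
#   [dxx, dxy, dyy, dxz, dyz, dzz]
# while the Camino voxel order above uses:
#   [dxx, dxy, dxz, dyy, dyz, dzz]
LOWER = ('dxx', 'dxy', 'dyy', 'dxz', 'dyz', 'dzz')
CAMINO = ('dxx', 'dxy', 'dxz', 'dyy', 'dyz', 'dzz')

def nifti_lower_to_camino(components):
    """Reorder one voxel's six tensor components into Camino order."""
    by_name = dict(zip(LOWER, components))
    return [by_name[name] for name in CAMINO]
```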

Inputs:

[Mandatory]
in_file: (an existing file name)
        A NIFTI-1 dataset containing diffusion tensors. The tensors are
        assumed to be in lower-triangular order as specified by the NIFTI
        standard for the storage of symmetric matrices. This file should be
        either a .nii or a .hdr file.
        argument: ``-inputfile %s``, position: 1

[Optional]
s0_file: (an existing file name)
        File containing the unweighted signal for each voxel, may be a raw
        binary file (specify type with -inputdatatype) or a supported image
        file.
        argument: ``-s0 %s``
lns0_file: (an existing file name)
        File containing the log of the unweighted signal for each voxel, may
        be a raw binary file (specify type with -inputdatatype) or a
        supported image file.
        argument: ``-lns0 %s``
bgmask: (an existing file name)
        Binary valued brain / background segmentation, may be a raw binary
        file (specify type with -maskdatatype) or a supported image file.
        argument: ``-bgmask %s``
scaleslope: (a float)
        A value v in the diffusion tensor is scaled to v * s + i. This is
        applied after any scaling specified by the input image. Default is
        1.0.
        argument: ``-scaleslope %s``
scaleinter: (a float)
        A value v in the diffusion tensor is scaled to v * s + i. This is
        applied after any scaling specified by the input image. Default is
        0.0.
        argument: ``-scaleinter %s``
uppertriangular: (a boolean)
        Specifies input in upper-triangular (VTK style) order.
        argument: ``-uppertriangular %s``
out_file: (a file name)
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

out_file: (a file name)
        diffusion tensors data in Camino format

ProcStreamlines

Link to code

Wraps the executable command procstreamlines.

Process streamline data.

This program does post-processing of streamline output from track. It can either output streamlines or connection probability maps.

Examples

>>> import nipype.interfaces.camino as cmon
>>> proc = cmon.ProcStreamlines()
>>> proc.inputs.in_file = 'tract_data.Bfloat'
>>> proc.run()

Inputs:

[Mandatory]
in_file: (an existing file name)
        data file
        argument: ``-inputfile %s``, position: 1

[Optional]
inputmodel: ('raw' or 'voxels', nipype default value: raw)
        input model type (raw or voxels)
        argument: ``-inputmodel %s``
maxtractpoints: (an integer (int or long))
        maximum number of tract points
        argument: ``-maxtractpoints %d``
mintractpoints: (an integer (int or long))
        minimum number of tract points
        argument: ``-mintractpoints %d``
maxtractlength: (an integer (int or long))
        maximum length of tracts
        argument: ``-maxtractlength %d``
mintractlength: (an integer (int or long))
        minimum length of tracts
        argument: ``-mintractlength %d``
datadims: (a list of from 3 to 3 items which are an integer (int or
          long))
        data dimensions in voxels
        argument: ``-datadims %s``
voxeldims: (a list of from 3 to 3 items which are an integer (int or
          long))
        voxel dimensions in mm
        argument: ``-voxeldims %s``
seedpointmm: (a list of from 3 to 3 items which are an integer (int
          or long))
        The coordinates of a single seed point for tractography in mm
        argument: ``-seedpointmm %s``
seedpointvox: (a list of from 3 to 3 items which are an integer (int
          or long))
        The coordinates of a single seed point for tractography in voxels
        argument: ``-seedpointvox %s``
seedfile: (a file name)
        Image Containing Seed Points
        argument: ``-seedfile %s``
regionindex: (an integer (int or long))
        index of specific region to process
        argument: ``-regionindex %d``
iterations: (a float)
        Number of streamlines generated for each seed. Not required when
        outputting streamlines, but needed to create PICo images. The
        default is 1 if the output is streamlines, and 5000 if the output is
        connection probability images.
        argument: ``-iterations %d``
targetfile: (a file name)
        Image containing target volumes.
        argument: ``-targetfile %s``
allowmultitargets: (a boolean)
        Allows streamlines to connect to multiple target volumes.
        argument: ``-allowmultitargets``
directional: (a list of from 3 to 3 items which are an integer (int
          or long))
        Splits the streamlines at the seed point and computes separate
        connection probabilities for each segment. Streamline segments are
        grouped according to their dot product with the vector (X, Y, Z).
        The ideal vector will be tangential to the streamline trajectory at
        the seed, such that the streamline projects from the seed along (X,
        Y, Z) and -(X, Y, Z). However, it is only necessary for the
        streamline trajectory to not be orthogonal to (X, Y, Z).
        argument: ``-directional %s``
waypointfile: (a file name)
        Image containing waypoints. Waypoints are defined as regions of the
        image with the same intensity, where 0 is background and any value >
        0 is a waypoint.
        argument: ``-waypointfile %s``
truncateloops: (a boolean)
        This option allows streamlines to enter a waypoint exactly once.
        After the streamline leaves the waypoint, it is truncated upon a
        second entry to the waypoint.
        argument: ``-truncateloops``
discardloops: (a boolean)
        This option allows streamlines to enter a waypoint exactly once.
        After the streamline leaves the waypoint, the entire streamline is
        discarded upon a second entry to the waypoint.
        argument: ``-discardloops``
exclusionfile: (a file name)
        Image containing exclusion ROIs. This should be an Analyze 7.5
        header / image file.hdr and file.img.
        argument: ``-exclusionfile %s``
truncateinexclusion: (a boolean)
        Retain segments of a streamline before entry to an exclusion ROI.
        argument: ``-truncateinexclusion``
endpointfile: (a file name)
        Image containing endpoint ROIs. This should be an Analyze 7.5 header
        / image file.hdr and file.img.
        argument: ``-endpointfile %s``
resamplestepsize: (a float)
        Each point on a streamline is tested for entry into target,
        exclusion or waypoint volumes. If the length between points on a
        tract is not much smaller than the voxel length, then streamlines
        may pass through part of a voxel without being counted. To avoid
        this, the program resamples streamlines such that the step size is
        one tenth of the smallest voxel dimension in the image. This
        increases the size of raw or oogl streamline output and incurs some
        performance penalty. The resample resolution can be controlled with
        this option or disabled altogether by passing a negative step size
        or by passing the -noresample option.
        argument: ``-resamplestepsize %d``
noresample: (a boolean)
        Disables resampling of input streamlines. Resampling is
        automatically disabled if the input model is voxels.
        argument: ``-noresample``
outputtracts: (a boolean)
        Output streamlines in raw binary format.
        argument: ``-outputtracts``
outputroot: (a file name)
        Prepended onto all output file names.
        argument: ``-outputroot %s``
gzip: (a boolean)
        save the output image in gzip format
        argument: ``-gzip``
outputcp: (a boolean)
        output the connection probability map (Analyze image, float)
        argument: ``-outputcp``
        requires: outputroot, seedfile
outputsc: (a boolean)
        output the connection probability map (raw streamlines, int)
        argument: ``-outputsc``
        requires: outputroot, seedfile
outputacm: (a boolean)
        output all tracts in a single connection probability map (Analyze
        image)
        argument: ``-outputacm``
        requires: outputroot, seedfile
outputcbs: (a boolean)
        outputs connectivity-based segmentation maps; requires target
        outputfile
        argument: ``-outputcbs``
        requires: outputroot, targetfile, seedfile
out_file: (a file name)
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

proc: (an existing file name)
        Processed Streamlines
outputroot_files: (a list of items which are an existing file name)

Shredder

Link to code

Wraps the executable command shredder.

Extracts periodic chunks from a data stream.

Shredder makes an initial offset of offset bytes. It then reads and outputs chunksize bytes, skips space bytes, and repeats until there is no more input.

If the chunksize is negative, chunks of size |chunksize| are read and the byte ordering of each chunk is reversed. Because the whole chunk is reversed, the chunk size must match the size of the data type; otherwise the order of the values within the chunk, as well as their endianness, will be reversed.
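
The loop described above can be sketched in plain Python. This illustrates the documented behaviour only, not the shredder implementation (for instance, a trailing partial chunk is simply dropped here):

```python
# Skip `offset` bytes, then repeatedly emit `chunksize` bytes and skip
# `space` bytes. A negative chunksize reverses each |chunksize|-byte chunk.
def shred(data, offset, chunksize, space):
    out = bytearray()
    pos = offset
    size = abs(chunksize)
    while pos + size <= len(data):
        chunk = data[pos:pos + size]
        out += chunk[::-1] if chunksize < 0 else chunk
        pos += size + space
    return bytes(out)
```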

Examples

>>> import nipype.interfaces.camino as cam
>>> shred = cam.Shredder()
>>> shred.inputs.in_file = 'SubjectA.Bfloat'
>>> shred.inputs.offset = 0
>>> shred.inputs.chunksize = 1
>>> shred.inputs.space = 2
>>> shred.run()

Inputs:

[Mandatory]
in_file: (an existing file name)
        raw binary data file
        argument: ``< %s``, position: -2

[Optional]
offset: (an integer (int or long))
        initial offset of offset bytes
        argument: ``%d``, position: 1
chunksize: (an integer (int or long))
        reads and outputs a chunk of chunksize bytes
        argument: ``%d``, position: 2
space: (an integer (int or long))
        skips space bytes
        argument: ``%d``, position: 3
out_file: (a file name)
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

shredded: (an existing file name)
        Shredded binary data file

TractShredder

Link to code

Wraps the executable command tractshredder.

Extracts bunches of streamlines.

tractshredder works in a similar way to shredder, but processes streamlines instead of scalar data. The input is raw streamlines, in the format produced by track or procstreamlines.

The program first makes an initial offset of offset tracts. It then reads and outputs a group of bunchsize tracts, skips space tracts, and repeats until there is no more input.
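
The same offset / bunch / space logic can be sketched with each tract as one list element; the real input is raw binary streamlines, so this only illustrates the grouping:

```python
# Skip `offset` tracts, then repeatedly keep `bunchsize` tracts and skip
# `space` tracts until the input is exhausted (toy sketch, not tractshredder).
def tract_shred(tracts, offset, bunchsize, space):
    kept, pos = [], offset
    while pos < len(tracts):
        kept.extend(tracts[pos:pos + bunchsize])
        pos += bunchsize + space
    return kept
```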

Examples

>>> import nipype.interfaces.camino as cmon
>>> shred = cmon.TractShredder()
>>> shred.inputs.in_file = 'tract_data.Bfloat'
>>> shred.inputs.offset = 0
>>> shred.inputs.bunchsize = 1
>>> shred.inputs.space = 2
>>> shred.run()

Inputs:

[Mandatory]
in_file: (an existing file name)
        tract file
        argument: ``< %s``, position: -2

[Optional]
offset: (an integer (int or long))
        initial offset of offset tracts
        argument: ``%d``, position: 1
bunchsize: (an integer (int or long))
        reads and outputs a group of bunchsize tracts
        argument: ``%d``, position: 2
space: (an integer (int or long))
        skips space tracts
        argument: ``%d``, position: 3
out_file: (a file name)
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

shredded: (an existing file name)
        Shredded tract file

VtkStreamlines

Link to code

Wraps the executable command vtkstreamlines.

Use vtkstreamlines to convert raw or voxel-format streamlines to VTK polydata.

Examples

>>> import nipype.interfaces.camino as cmon
>>> vtk = cmon.VtkStreamlines()
>>> vtk.inputs.in_file = 'tract_data.Bfloat'
>>> vtk.inputs.voxeldims = [1,1,1]
>>> vtk.run()

Inputs:

[Mandatory]
in_file: (an existing file name)
        data file
        argument: `` < %s``, position: -2

[Optional]
inputmodel: ('raw' or 'voxels', nipype default value: raw)
        input model type (raw or voxels)
        argument: ``-inputmodel %s``
voxeldims: (a list of from 3 to 3 items which are an integer (int or
          long))
        voxel dimensions in mm
        argument: ``-voxeldims %s``, position: 4
seed_file: (a file name)
        image containing seed points
        argument: ``-seedfile %s``, position: 1
target_file: (a file name)
        image containing integer-valued target regions
        argument: ``-targetfile %s``, position: 2
scalar_file: (a file name)
        image that is in the same physical space as the tracts
        argument: ``-scalarfile %s``, position: 3
colourorient: (a boolean)
        Each point on the streamline is coloured by the local orientation.
        argument: ``-colourorient``
interpolatescalars: (a boolean)
        the scalar value at each point on the streamline is calculated by
        trilinear interpolation
        argument: ``-interpolatescalars``
interpolate: (a boolean)
        the scalar value at each point on the streamline is calculated by
        trilinear interpolation
        argument: ``-interpolate``
out_file: (a file name)
        argument: ``> %s``, position: -1
args: (a unicode string)
        Additional parameters to the command
        argument: ``%s``
environ: (a dictionary with keys which are a bytes or None or a value
          of class 'str' and with values which are a bytes or None or a
          value of class 'str', nipype default value: {})
        Environment variables

Outputs:

vtk: (an existing file name)
        Streamlines in VTK format