
cardiotensor

Modules:

analysis

Modules:

analysis_functions

Functions:

calculate_intensities

calculate_intensities(img_helix: ndarray, start_point: tuple[int, int], end_point: tuple[int, int], angle_range: float = 5, N_line: int = 10, max_value: float | None = None, min_value: float | None = None) -> list[ndarray]

Calculate intensity profiles along multiple lines.

Parameters:

  • img_helix (np.ndarray): The image array.
  • start_point (Tuple[int, int]): The starting point of the line.
  • end_point (Tuple[int, int]): The ending point of the line.
  • angle_range (float, optional): The range of angles to consider, in degrees. Default is 5.
  • N_line (int, optional): The number of lines to generate. Default is 10.
  • max_value (Optional[float], optional): Maximum value for intensity normalization. Default is None.
  • min_value (Optional[float], optional): Minimum value for intensity normalization. Default is None.

Returns: List[np.ndarray]: List of intensity profiles for each line.

find_end_points

find_end_points(start_point: tuple[float, float], end_point: tuple[float, float], angle_range: float, N_line: int) -> ndarray

Find the end points for lines at different angles within a range.

Parameters:

  • start_point (Tuple[float, float]): The starting point of the main line.
  • end_point (Tuple[float, float]): The ending point of the main line.
  • angle_range (float): The range of angles to consider, in degrees.
  • N_line (int): The number of lines to generate within the range.

Returns: np.ndarray: Array of end points for the generated lines.

plot_intensity

plot_intensity(intensity_profiles: list[ndarray], label_y: str = '', x_max_lim: float | None = None, x_min_lim: float | None = None, y_max_lim: float | None = None, y_min_lim: float | None = None) -> None

Plot intensity profiles with mean and percentile shading.

Parameters:

  • intensity_profiles (List[np.ndarray]): List of intensity profiles.
  • label_y (str, optional): Label for the y-axis. Default is an empty string.
  • x_max_lim (Optional[float], optional): Maximum x-axis limit. Default is None.
  • x_min_lim (Optional[float], optional): Minimum x-axis limit. Default is None.
  • y_max_lim (Optional[float], optional): Maximum y-axis limit. Default is None.
  • y_min_lim (Optional[float], optional): Minimum y-axis limit. Default is None.

save_intensity

save_intensity(intensity_profiles: list[ndarray], save_path: str) -> None

Save intensity profiles to a CSV file.

Parameters:

  • intensity_profiles (List[np.ndarray]): List of intensity profiles.
  • save_path (str): Path to save the CSV file.

Returns: None
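
Taken together, these functions support a small measurement workflow: sample intensity profiles between two points, plot them, and save them to CSV. A minimal sketch, assuming the functions are importable from cardiotensor.analysis.analysis_functions and that helix_map.tif is a hypothetical helix-angle image:

    import tifffile
    from cardiotensor.analysis.analysis_functions import (
        calculate_intensities,
        plot_intensity,
        save_intensity,
    )

    # Hypothetical 2D helix-angle map and a line across the ventricular wall.
    img_helix = tifffile.imread("helix_map.tif")
    start_point, end_point = (120, 300), (180, 420)

    # Sample 10 lines spread over a 5-degree angular range around the main line.
    profiles = calculate_intensities(
        img_helix, start_point, end_point, angle_range=5, N_line=10
    )

    plot_intensity(profiles, label_y="Helix angle (degrees)")
    save_intensity(profiles, "profiles.csv")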

gui_analysis_tool

Functions:

  • convert_slice_for_display

    Return normalized float slice for display, in the mode's physical domain, then min-max to 0..1.

  • discover_modes

    Return list of available subfolders among HA, IA, AZ, EL, FA, in a stable order.

convert_slice_for_display

convert_slice_for_display(slice2d: ndarray, mode: str) -> ndarray

Return normalized float slice for display, in the mode's physical domain, then min-max to 0..1.

discover_modes

discover_modes(base: Path) -> list[str]

Return list of available subfolders among HA, IA, AZ, EL, FA, in a stable order.
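
A minimal sketch combining both helpers, assuming the module is importable as cardiotensor.analysis.gui_analysis_tool, that results/ is a hypothetical output directory containing some of the HA, IA, AZ, EL, FA subfolders, and that slices are stored as tif for simplicity:

    from pathlib import Path
    import tifffile
    from cardiotensor.analysis.gui_analysis_tool import (
        convert_slice_for_display,
        discover_modes,
    )

    base = Path("results")        # hypothetical results directory
    modes = discover_modes(base)  # e.g. ["HA", "IA", "FA"], depending on what exists

    # Normalize one slice per discovered mode to 0..1 for display.
    for mode in modes:
        first_slice = sorted((base / mode).glob("*.tif"))[0]
        slice2d = tifffile.imread(first_slice)
        display = convert_slice_for_display(slice2d, mode)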

launcher

Modules:

slurm_launcher

Functions:

  • is_chunk_done

    Check if all output files (HA, IA, FA) for a given chunk are already present.

  • monitor_job_output

    Monitor OUTPUT_DIR/HA until total_images files appear (subset-aware).

  • slurm_launcher

    Launch Slurm array jobs for a subset [start_index, end_index] (inclusive) of the volume.

  • submit_job_to_slurm

    Submit a Slurm job and return its job ID.

is_chunk_done

is_chunk_done(output_dir: str, start: int, end: int, output_format: str = 'jp2') -> bool

Check if all output files (HA, IA, FA) for a given chunk are already present.

Parameters:

  • output_dir
    (str) –

    Base output directory containing HA/IA/FA folders.

  • start
    (int) –

    Start index of the chunk (inclusive).

  • end
    (int) –

    End index of the chunk (exclusive).

  • output_format
    (str, default: 'jp2' ) –

    File extension for the output images (e.g., "jp2", "tif").

Returns:

  • bool ( bool ) –

    True if all expected output files exist, False otherwise.

monitor_job_output

monitor_job_output(output_directory: str, total_images: int, file_extension: str) -> None

Monitor OUTPUT_DIR/HA until total_images files appear (subset-aware).

slurm_launcher

slurm_launcher(conf_file_path: str, start_index: int = 0, end_index: int | None = None) -> None

Launch Slurm array jobs for a subset [start_index, end_index] (inclusive) of the volume. If end_index is None, the last slice of the volume is used.

submit_job_to_slurm

submit_job_to_slurm(executable_path: str, conf_file_path: str, start_image: int, end_image: int, N_chunk: int = 10, mem_needed: int = 64) -> int

Submit a Slurm job and return its job ID.

Parameters:

Returns:

  • int ( int ) –

    The Slurm job ID.
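
A minimal sketch of how the launcher pieces fit together, assuming the module is importable as cardiotensor.launcher.slurm_launcher and that parameters.conf is a hypothetical configuration file:

    from cardiotensor.launcher.slurm_launcher import is_chunk_done, slurm_launcher

    # High-level entry point: submit Slurm array jobs covering slices 0..999 (inclusive).
    slurm_launcher("parameters.conf", start_index=0, end_index=999)

    # The chunk-completion check can also be called directly, for example to
    # decide whether a chunk needs to be resubmitted.
    if not is_chunk_done("output", start=0, end=100, output_format="jp2"):
        print("Chunk [0, 100) still needs processing")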

orientation

Modules:

orientation_computation_functions

Functions:

adjust_start_end_index

adjust_start_end_index(start_index: int, end_index: int, N_img: int, padding_start: int = 0, padding_end: int = 0, is_test: bool = False, n_slice: int = 0) -> tuple[int, int]

Adjusts start and end indices for image processing, considering padding and test mode.

Parameters:

Returns:

  • tuple[int, int]

    Tuple[int, int]: Adjusted start and end indices.

calculate_center_vector

calculate_center_vector(points: ndarray) -> ndarray

Compute the linear regression vector for a given set of 3D points.

Parameters:

  • points
    (ndarray) –

    An Nx3 array of (x, y, z) coordinates representing the curved line.

Returns:

  • ndarray

    np.ndarray: A single 3D unit vector representing the direction of the best-fit line.

calculate_structure_tensor

calculate_structure_tensor(volume: ndarray, sigma: float, rho: float, truncate: float = 4.0, devices: list[str] | None = None, block_size: int = 200, use_gpu: bool = False, dtype: type = float32) -> tuple[ndarray, ndarray]

Calculates the structure tensor of a volume.

Parameters:

  • volume
    (ndarray) –

    The 3D volume data.

  • sigma
    (float) –

    sigma value for Gaussian smoothing.

  • rho
    (float) –

    rho value for Gaussian smoothing.

  • devices
    (Optional[list[str]], default: None ) –

    List of devices for parallel processing (e.g., ['cpu', 'cuda:0']).

  • block_size
    (int, default: 200 ) –

    Size of the blocks for processing. Default is 200.

  • use_gpu
    (bool, default: False ) –

    If True, uses GPU for calculations. Default is False.

Returns:

  • tuple[ndarray, ndarray]

    tuple[np.ndarray, np.ndarray]: Eigenvalues and eigenvectors of the structure tensor.
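
A minimal usage sketch, assuming the function is importable from cardiotensor.orientation.orientation_computation_functions and using a small placeholder volume:

    import numpy as np
    from cardiotensor.orientation.orientation_computation_functions import (
        calculate_structure_tensor,
    )

    volume = np.random.rand(64, 256, 256).astype(np.float32)  # placeholder 3D volume

    # sigma is the noise (derivative) scale, rho the integration scale.
    val, vec = calculate_structure_tensor(
        volume, sigma=1.0, rho=3.0, block_size=200, use_gpu=False
    )
    # val holds the eigenvalues and vec the eigenvectors of the structure tensor.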

compute_azimuth_and_elevation

compute_azimuth_and_elevation(vector_field_2d: ndarray) -> tuple[ndarray, ndarray]

Computes azimuth and elevation angles from a 2D vector field. Azimuth is the angle in the XY plane from +X toward +Y, in degrees [-180, 180]; elevation is the angle from the XY plane toward +Z, in degrees [-90, 90].
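
These definitions match the usual spherical-angle formulas. An illustrative sketch for unit vectors with components (vx, vy, vz); this is an assumption about how such angles can be computed, not necessarily the library's exact implementation:

    import numpy as np

    def azimuth_elevation_deg(vx, vy, vz):
        """Illustrative only: azimuth in [-180, 180], elevation in [-90, 90]."""
        azimuth = np.degrees(np.arctan2(vy, vx))                   # in the XY plane, from +X toward +Y
        elevation = np.degrees(np.arcsin(np.clip(vz, -1.0, 1.0)))  # from the XY plane toward +Z
        return azimuth, elevation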

compute_fraction_anisotropy

compute_fraction_anisotropy(eigenvalues_2d: ndarray) -> ndarray

Computes Fractional Anisotropy (FA) from eigenvalues of a structure tensor.

Parameters:

Returns:

  • ndarray

    np.ndarray: Fractional Anisotropy values.
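
For reference, the standard fractional anisotropy of three eigenvalues l1, l2, l3 is sqrt(3/2) times the norm of their deviation from the mean, divided by the norm of the eigenvalues. An illustrative NumPy sketch of that definition (not necessarily the library's exact implementation):

    import numpy as np

    def fractional_anisotropy(l1, l2, l3):
        """Illustrative only: standard FA from three structure-tensor eigenvalues."""
        mean = (l1 + l2 + l3) / 3.0
        num = np.sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
        den = np.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
        return np.sqrt(1.5) * num / np.maximum(den, 1e-12)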

compute_helix_and_transverse_angles

compute_helix_and_transverse_angles(vector_field_2d: ndarray, center_point: tuple[int, int, int]) -> tuple[ndarray, ndarray]

Computes helix and transverse angles from a 2D vector field.

Parameters:

Returns:

  • tuple[ndarray, ndarray]

    Tuple[np.ndarray, np.ndarray]: Helix and transverse angle arrays.

interpolate_points

interpolate_points(points: list[tuple[float, float, float]], N_img: int) -> ndarray

Generates interpolated points using cubic spline interpolation for a given set of 3D points.

Parameters:

  • points
    (list[tuple[float, float, float]]) –

    A list of (x, y, z) points.

  • N_img
    (int) –

    The number of slices in the z-dimension.

Returns:

  • ndarray

    np.ndarray: Array of interpolated points.

plot_images

plot_images(img: ndarray, img_angle1: ndarray, img_angle2: ndarray, img_FA: ndarray, center_point: tuple[int, int, int], colormap_angle=None, colormap_FA=None, angle1_title: str = 'Helix Angle', angle2_title: str = 'Intrusion Angle') -> None

Render a 2x2 figure of source, angle1, angle2, FA for a single slice.

Parameters

  • img (np.ndarray): 2D grayscale slice of the anatomical image.
  • img_angle1 (np.ndarray): 2D float map for the first angle, for example HA or AZ, in degrees.
  • img_angle2 (np.ndarray): 2D float map for the second angle, for example IA or EL, in degrees.
  • img_FA (np.ndarray): 2D float map of fractional anisotropy in [0, 1].
  • center_point (tuple[int, int, int]): Integer voxel coordinates (z, y, x) of the centerline point on this slice. Only (y, x) is used here for the marker.
  • colormap_angle (matplotlib colormap, optional): Colormap for angles. Defaults to plt.cm.hsv if None.
  • colormap_FA (matplotlib colormap, optional): Colormap for FA. Defaults to plt.cm.magma if None.
  • angle1_title (str): Title for the first angle panel.
  • angle2_title (str): Title for the second angle panel.

Notes

This function shows the centerline marker on the source panel.

remove_padding

remove_padding(volume: ndarray, val: ndarray, vec: ndarray, padding_start: int, padding_end: int) -> tuple[ndarray, ndarray, ndarray]

Removes padding from the volume, eigenvalues, and eigenvectors.

Parameters:

Returns:

  • tuple[ndarray, ndarray, ndarray]

    Tuple[np.ndarray, np.ndarray, np.ndarray]: Adjusted data without padding.

rotate_vectors_to_new_axis

rotate_vectors_to_new_axis(vector_field_slice: ndarray, new_axis_vec: ndarray) -> ndarray

Rotates a vector field slice to align with a new axis.

Parameters:

Returns:

  • ndarray

    np.ndarray: Rotated vectors with the same shape as input.

write_images

write_images(img_angle1: ndarray, img_angle2: ndarray, img_FA: ndarray, start_index: int, output_dir: str, output_format: str, output_type: str, z: int, colormap_angle=None, colormap_FA=None, angle_names: tuple[str, str] = ('HA', 'IA'), angle_ranges: tuple[tuple[float, float], tuple[float, float]] = ((-90, 90), (-90, 90))) -> None

Write per-slice angle1, angle2, and FA images to disk with flexible naming and ranges.

Parameters

  • img_angle1 (np.ndarray): 2D float array for the first angle, for example HA or AZ, in degrees.
  • img_angle2 (np.ndarray): 2D float array for the second angle, for example IA or EL, in degrees.
  • img_FA (np.ndarray): 2D float array for FA in [0, 1].
  • start_index (int): Global starting index for z numbering in filenames.
  • output_dir (str): Base output directory. Subfolders angle_names[0], angle_names[1], and FA are created.
  • output_format ({"jp2", "tif"}): Output file format. Uses glymur for jp2 and tifffile for tif.
  • output_type ({"8bit", "rgb"}): Write mode. "8bit" writes single-channel uint8, "rgb" writes colormapped RGB.
  • z (int): Current z offset used to compute the running index in filenames.
  • colormap_angle (matplotlib colormap, optional): Colormap for angles in "rgb" mode. Defaults to plt.cm.hsv if None.
  • colormap_FA (matplotlib colormap, optional): Colormap for FA in "rgb" mode. Defaults to plt.cm.magma if None.
  • angle_names (tuple[str, str]): Names used for subfolders and file prefixes, for example ("HA", "IA") or ("AZ", "EL").
  • angle_ranges (tuple[tuple[float, float], tuple[float, float]]): Min and max for normalization of angle1 and angle2. For example, HA and IA use (-90, 90), AZ uses (-180, 180), and EL uses (-90, 90).

Raises

  • RuntimeError: If required IO backends are missing for the chosen format.
  • ValueError: If output_format or output_type is unsupported.

Notes

The function creates subdirectories and writes files named like {output_dir}/{name}/{name}{index:06d}.{ext} and {output_dir}/FA/FA{index:06d}.{ext}.

write_img_rgb

write_img_rgb(img: ndarray, out_path: str, vmin: float, vmax: float, colormap: object | None = None, output_format: str = 'jp2') -> None

Write a single 2D float image as an RGB file using a matplotlib colormap.

Parameters

  • img (np.ndarray): 2D float array to color map.
  • out_path (str): Full output path without extension handling.
  • vmin (float): Minimum for normalization.
  • vmax (float): Maximum for normalization.
  • colormap (matplotlib colormap, optional): Colormap to apply, for example plt.cm.hsv. If None, plt.cm.hsv is used.
  • output_format ({"jp2", "tif"}): Output format. Uses glymur for jp2 and tifffile for tif.

Notes

The function normalizes to [0, 1], applies the colormap, then writes uint8 RGB.

write_vector_field

write_vector_field(vector_field_slice: ndarray, start_index: int, output_dir: str, slice_idx: int) -> None

Saves a vector field slice to the specified directory in .npy format.

Parameters:

Returns:

  • None

    None

orientation_computation_pipeline

Functions:

check_already_processed

check_already_processed(output_dir: str, start_index: int, end_index: int, write_vectors: bool, write_angles: bool, output_format: str, angle_names: tuple[str, str] = ('HA', 'IA'), fa_name: str = 'FA', extra_expected: Sequence[str] | None = None) -> bool

Check whether all required output files already exist for every slice index.

Parameters

  • output_dir (str): Base output directory.
  • start_index (int): First global slice index to check (inclusive).
  • end_index (int): Last global slice index to check (exclusive).
  • write_vectors (bool): If True, expect eigenvector .npy files (e.g., eigen_vec_{idx:06d}.npy).
  • write_angles (bool): If True, expect angle images for angle_names[0], angle_names[1], and FA.
  • output_format (str): Image format/extension for the angle maps, for example "jp2" or "tif".
  • angle_names (tuple[str, str], optional): Names of the two angle outputs, e.g. ("HA", "IA") or ("AZ", "EL").
  • fa_name (str, optional): Name of the FA subfolder, default "FA".
  • extra_expected (sequence of str, optional): Additional per-slice path templates to check. Each template must contain "{idx}", which will be formatted as a zero-padded integer (06d), and may also contain "{ext}" for the image extension.

Returns

bool: True if all expected files for all indices exist (and pass the quick corruption filter), False otherwise.
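
A minimal resume-style sketch, assuming the function is importable from cardiotensor.orientation.orientation_computation_pipeline:

    from cardiotensor.orientation.orientation_computation_pipeline import (
        check_already_processed,
    )

    # Skip recomputation if every expected HA/IA/FA image for slices 0..99 exists.
    if check_already_processed(
        "output",
        start_index=0,
        end_index=100,
        write_vectors=False,
        write_angles=True,
        output_format="jp2",
    ):
        print("Slices 0-99 are already processed, nothing to do")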

compute_orientation

compute_orientation(volume_path: str, mask_path: str | None = None, output_dir: str = './output', output_format: str = 'jp2', output_type: str = '8bit', sigma: float = 1.0, rho: float = 3.0, truncate: float = 4.0, axis_points: ndarray | None = None, vertical_padding: float | None = None, write_vectors: bool = False, angle_mode: str = 'ha_ia', write_angles: bool = True, use_gpu: bool = True, is_test: bool = False, n_slice_test: int | None = None, start_index: int = 0, end_index: int | None = None) -> None

Compute the orientation for a volume dataset.

Parameters:

  • volume_path
    (str) –

    Path to the 3D volume.

  • mask_path
    (str | None, default: None ) –

    Optional binary mask path.

  • output_dir
    (str, default: './output' ) –

    Output directory for results.

  • output_format
    (str, default: 'jp2' ) –

    Image format for results.

  • output_type
    (str, default: '8bit' ) –

    Image type ("8bit" or "rgb").

  • sigma
    (float, default: 1.0 ) –

    Noise scale for structure tensor.

  • rho
    (float, default: 3.0 ) –

    Integration scale for structure tensor.

  • truncate
    (float, default: 4.0 ) –

    Gaussian kernel truncation.

  • axis_points
    (ndarray | None, default: None ) –

    3D points defining LV axis for cylindrical coordinates.

  • vertical_padding
    (float | None, default: None ) –

    Padding slices for tensor computation.

  • write_vectors
    (bool, default: False ) –

    Whether to save eigenvectors.

  • write_angles
    (bool, default: True ) –

    Whether to save HA/IA/FA maps.

  • use_gpu
    (bool, default: True ) –

    Use GPU acceleration for tensor computation.

  • is_test
    (bool, default: False ) –

    If True, runs in test mode and outputs plots.

  • n_slice_test
    (int | None, default: None ) –

    Number of slices to process in test mode.

  • start_index
    (int, default: 0 ) –

    Start slice index.

  • end_index
    (int | None, default: None ) –

    End slice index (None = last slice).
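
A minimal call sketch, assuming the function is importable from cardiotensor.orientation.orientation_computation_pipeline; the paths are hypothetical:

    from cardiotensor.orientation.orientation_computation_pipeline import (
        compute_orientation,
    )

    compute_orientation(
        volume_path="data/heart_volume",   # hypothetical volume path
        mask_path="data/heart_mask",       # optional binary mask
        output_dir="output",
        output_format="jp2",
        output_type="8bit",
        sigma=1.0,
        rho=3.0,
        write_vectors=True,                # also save eigenvectors for tractography
        use_gpu=False,
    )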

compute_slice_angles_and_anisotropy

compute_slice_angles_and_anisotropy(z: int, vector_field_slice: ndarray, img_slice: ndarray, center_point: ndarray, eigen_val_slice: ndarray, center_line: ndarray, output_dir: str, output_format: str = 'jp2', output_type: str = '8bit', start_index: int = 0, write_vectors: bool = False, write_angles: bool = True, is_test: bool = False, angle_mode: str = 'ha_ia') -> None

Compute either HA/IA or Azimuth/Elevation plus FA for a single slice, then plot and/or write outputs depending on flags.

tractography

Modules:

generate_streamlines

Functions:

generate_streamlines_from_params

generate_streamlines_from_params(vector_field_dir: str | Path, output_dir: str | Path, fa_dir: str | Path, angle_dir: str | Path, mask_path: str | Path | None = None, start_xyz: tuple[int, int, int] = (0, 0, 0), end_xyz: tuple[int | None, int | None, int | None] = (None, None, None), bin_factor: int = 1, num_seeds: int = 20000, fa_seed_min: float = 0.4, fa_threshold: float = 0.1, step_length: float = 0.5, max_steps: int | None = None, angle_threshold: float = 60.0, min_length_pts: int = 10, bidirectional: bool = True, voxel_sizes_zyx: tuple[float, float, float] = (1.0, 1.0, 1.0), save_trk_file: bool = True) -> None

Generate streamlines from the eigenvector field, then export:

  • .trk with all discovered per-point angle fields
  • .am with all per-edge mean angle scalars

Angle discovery

If angle_dir is one of HA, IA, AZ, EL, discover siblings with those names and include all that exist. If angle_dir is a parent, include all subfolders named HA, IA, AZ, EL that exist. If none are found, treat angle_dir as a single custom angle and include it.
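
A minimal call sketch, assuming the function is importable from cardiotensor.tractography.generate_streamlines; the directory layout is hypothetical:

    from cardiotensor.tractography.generate_streamlines import (
        generate_streamlines_from_params,
    )

    generate_streamlines_from_params(
        vector_field_dir="output/eigen_vec",   # per-slice eigenvector .npy files
        output_dir="output/streamlines",
        fa_dir="output/FA",
        angle_dir="output/HA",                 # sibling IA/AZ/EL folders are discovered if present
        num_seeds=20000,
        fa_seed_min=0.4,
        fa_threshold=0.1,
        step_length=0.5,
        angle_threshold=60.0,
        min_length_pts=10,
        save_trk_file=True,
    )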

save_trk_dipy_from_vox_zyx

save_trk_dipy_from_vox_zyx(streamlines_zyx: list[list[tuple[float, float, float]]], out_path: str | Path, vol_shape_zyx: tuple[int, int, int], voxel_sizes_zyx: tuple[float, float, float] = (1.0, 1.0, 1.0), data_values: list[ndarray] | None = None, data_name: str | None = None)

Save streamlines given in voxel indices (z,y,x) as TrackVis .trk using DIPY. Optionally attach one per-point scalar list under data_name.

save_trk_dipy_from_vox_zyx_multi

save_trk_dipy_from_vox_zyx_multi(streamlines_zyx: list[list[tuple[float, float, float]]], out_path: str | Path, vol_shape_zyx: tuple[int, int, int], voxel_sizes_zyx: tuple[float, float, float] = (1.0, 1.0, 1.0), data_per_point: dict[str, list[ndarray]] | None = None) -> None

Save streamlines in voxel indices (z,y,x) as TrackVis .trk using DIPY. Accepts multiple per-point scalar lists via data_per_point dict, with keys like "HA", "IA", "AZ", "EL". Each list must align with streamlines.

trilinear_interpolate_scalar

trilinear_interpolate_scalar(volume: ndarray, pt: tuple[float, float, float]) -> float

Trilinearly interpolate a scalar volume at fractional point (z, y, x). Clamps to valid range.

trilinear_interpolate_vector

trilinear_interpolate_vector(vector_field: ndarray, pt: tuple[float, float, float]) -> ndarray

Given a fractional (z,y,x), returns the trilinearly‐interpolated 3‐vector from vector_field (shape = (3, Z, Y, X)). Clamps to nearest voxel if out‐of‐bounds.
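
For reference, trilinear interpolation blends the eight voxels surrounding a fractional position, weighted by the fractional offsets along each axis. An illustrative scalar-volume sketch (not necessarily the library's exact implementation), clamping to the volume bounds as described above:

    import numpy as np

    def trilinear_scalar(volume, z, y, x):
        """Illustrative only: interpolate a (Z, Y, X) volume at fractional (z, y, x)."""
        z = float(np.clip(z, 0, volume.shape[0] - 1))
        y = float(np.clip(y, 0, volume.shape[1] - 1))
        x = float(np.clip(x, 0, volume.shape[2] - 1))
        z0, y0, x0 = int(z), int(y), int(x)
        z1 = min(z0 + 1, volume.shape[0] - 1)
        y1 = min(y0 + 1, volume.shape[1] - 1)
        x1 = min(x0 + 1, volume.shape[2] - 1)
        dz, dy, dx = z - z0, y - y0, x - x0
        # Interpolate along x, then y, then z.
        c00 = volume[z0, y0, x0] * (1 - dx) + volume[z0, y0, x1] * dx
        c01 = volume[z0, y1, x0] * (1 - dx) + volume[z0, y1, x1] * dx
        c10 = volume[z1, y0, x0] * (1 - dx) + volume[z1, y0, x1] * dx
        c11 = volume[z1, y1, x0] * (1 - dx) + volume[z1, y1, x1] * dx
        c0 = c00 * (1 - dy) + c01 * dy
        c1 = c10 * (1 - dy) + c11 * dy
        return float(c0 * (1 - dz) + c1 * dz)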

utils

Modules:

DataReader

Classes:

DataReader

DataReader(path: str | Path)

Initializes the DataReader with a path to the volume.

Parameters:

  • path
    (str | Path) –

    Path to the volume directory or file.

Methods:

Attributes:

  • dtype (dtype) –

    Returns the data type of the volume.

  • shape (tuple[int, ...]) –

    Returns the shape of the volume as (Z, Y, X) or (Z, Y, X, C).

  • volume_size_gb (float) –

    Returns the total size of the volume in GB.

dtype property
dtype: dtype

Returns the data type of the volume.

shape property
shape: tuple[int, ...]

Returns the shape of the volume as (Z, Y, X) or (Z, Y, X, C).

volume_size_gb property
volume_size_gb: float

Returns the total size of the volume in GB.

check_memory_requirement
check_memory_requirement(shape, dtype, safety_factor=0.8)

Check if the dataset can fit in available memory.

Parameters:

  • shape
    (tuple[int]) –

    Shape of the array.

  • dtype
    (dtype) –

    NumPy dtype of the array.

  • safety_factor
    (float, default: 0.8 ) –

    Fraction of available memory allowed to be used.

load_volume
load_volume(start_index: int = 0, end_index: int | None = None, unbinned_shape: tuple[int, int, int] | None = None) -> ndarray

Loads the volume and resizes it to unbinned_shape if provided, using fast integer-only resampling:

  • np.repeat for upsampling
  • block_reduce (max) for downsampling

Parameters:

  • start_index
    (int, default: 0 ) –

    Start index for slicing (for stacks).

  • end_index
    (int, default: None ) –

    End index for slicing (for stacks). If None, loads the entire stack.

  • unbinned_shape
    (tuple, default: None ) –

    Desired shape (Z, Y, X). If None, no resizing is done.

Returns:

  • ndarray

    np.ndarray: Loaded volume.
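
A minimal usage sketch, assuming DataReader is importable from cardiotensor.utils.DataReader; the path is hypothetical:

    from cardiotensor.utils.DataReader import DataReader

    reader = DataReader("data/heart_volume")  # directory of slices or a single volume file
    print(reader.shape, reader.dtype, f"{reader.volume_size_gb:.1f} GB")

    # Load slices 0..99 of the stack into memory.
    sub_volume = reader.load_volume(start_index=0, end_index=100)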

am_utils

Functions:

write_spatialgraph_am

write_spatialgraph_am(out_path: str | Path, streamlines_xyz: list[ndarray], point_thickness: Sequence[ndarray] | ndarray | None = None, edge_scalar: Sequence[float] | ndarray | Mapping[str, Sequence[float] | ndarray] | None = None, edge_scalar_name: str | None = None) -> None

Minimal Amira SpatialGraph writer with optional EDGE scalar blocks.

Writes blocks

@1 VERTEX float[3]
@2 EDGE int[2]
@3 EDGE int NumEdgePoints
@4 POINT float[3] EdgePointCoordinates
@5 POINT float thickness
@6.. EDGE float, one or more per-edge scalars

Parameters

  • out_path (str | Path): Output .am path.
  • streamlines_xyz (list[np.ndarray]): List of polylines (x, y, z). Each array shape = (Ni, 3), Ni >= 2.
  • point_thickness (np.ndarray | list[np.ndarray], optional): Per-point thickness. Either a flat array of length sum(Ni) or a list aligned to streamlines with lengths Ni.
  • edge_scalar (array-like | dict[str, array-like], optional): If a 1D array-like, one scalar per edge; use edge_scalar_name. If a dict, multiple scalars; each value must be 1D with length = n_edges, and keys become field names.
  • edge_scalar_name (str, optional): Name for the single-scalar case. Ignored if edge_scalar is a dict.

Notes
  • Field names are lightly validated for Amira compatibility.
  • Values are written as ASCII floats with 6 decimal places.
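
A minimal usage sketch, assuming the function is importable from cardiotensor.utils.am_utils; the streamlines and per-edge scalars are synthetic placeholders:

    import numpy as np
    from cardiotensor.utils.am_utils import write_spatialgraph_am

    # Two short polylines in (x, y, z).
    streamlines_xyz = [
        np.array([[0.0, 0.0, 0.0], [1.0, 0.5, 0.2], [2.0, 1.0, 0.4]]),
        np.array([[5.0, 5.0, 5.0], [5.5, 5.2, 5.1]]),
    ]

    # One scalar per edge for each named EDGE field (dict keys become field names).
    write_spatialgraph_am(
        "streamlines.am",
        streamlines_xyz,
        edge_scalar={"HA_mean": [30.0, -45.0], "IA_mean": [5.0, 12.0]},
    )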

downsampling

Functions:

chunked_downsample_vector_volume_mp

chunked_downsample_vector_volume_mp(input_npy_dir: Path, bin_factor: int, output_dir: Path) -> None

Multiprocess version of chunked_downsample_vector_volume with a progress bar. Reads a directory of per-slice NumPy files (each of shape (3, H, W)), groups every bin_factor consecutive slices into blocks, averages, downsamples, and renormalizes them, and writes each block as one coarse slice in output_dir/bin{bin_factor}/eigen_vec/.

Parameters

  • input_npy_dir (Path): Directory containing fine-scale "eigen_vec_*.npy" files, each of shape (3, H, W).
  • bin_factor (int): Number of fine Z-slices per block.
  • output_dir (Path): Base output directory. Creates output_dir/bin{bin_factor}/eigen_vec/.

downsample_vector_volume

downsample_vector_volume(input_npy: Path, bin_factor: int, output_dir: Path) -> None

Downsamples a vector volume using multiprocessing.

Parameters:

downsample_volume

downsample_volume(input_path: Path, bin_factor: int, output_dir: Path, subfolder: str = 'HA', out_ext: str = 'tif', min_value: float = 0, max_value: float = 255) -> None

Downsamples a 3D image volume along the Z and XY axes and saves as 8-bit images.

This function reads a volumetric image dataset (e.g. TIFF stack) using DataReader, performs block averaging along the Z-axis and spatial downsampling in XY, then saves each resulting slice in a specified output directory as 8-bit images.

Parameters:

  • input_path
    (Path) –

    Path to the directory containing the image stack.

  • bin_factor
    (int) –

    Factor to downsample in XY and the number of Z-slices to average per output slice.

  • output_dir
    (Path) –

    Path to the output root directory.

  • subfolder
    (str, default: 'HA' ) –

    Subdirectory name under binX/ to place results (default: "HA").

  • out_ext
    (str, default: 'tif' ) –

    Output image format extension (e.g., 'tif', 'png').

  • min_value
    (float, default: 0 ) –

    Minimum value for intensity normalization to 8-bit.

  • max_value
    (float, default: 255 ) –

    Maximum value for intensity normalization to 8-bit.

Returns:

  • None

    None
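
A minimal usage sketch, assuming both functions are importable from cardiotensor.utils.downsampling; the paths are hypothetical:

    from pathlib import Path
    from cardiotensor.utils.downsampling import (
        downsample_vector_volume,
        downsample_volume,
    )

    # Bin the HA image stack by 4 in Z and XY; results go under output/bin4/HA/.
    downsample_volume(Path("output/HA"), bin_factor=4, output_dir=Path("output"),
                      subfolder="HA", out_ext="tif")

    # Downsample the per-slice eigenvector volume with the same factor.
    downsample_vector_volume(Path("output/eigen_vec"), bin_factor=4,
                             output_dir=Path("output"))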

process_image_block

process_image_block(file_list, block_idx, bin_factor, out_file, min_value, max_value)

Process a Z-block of images by averaging along the Z axis, downsampling in XY, converting to 8-bit, and writing to disk.

Parameters:

process_vector_block

process_vector_block(block: list[Path], bin_factor: int, h: int, w: int, output_dir: Path, idx: int) -> None

Processes a single block of numpy files and saves the downsampled output.

Parameters:

  • block
    (List[Path]) –

    List of file paths to the numpy files in the block.

  • bin_factor
    (int) –

    Binning factor for downsampling.

  • h
    (int) –

    Height of the data block.

  • w
    (int) –

    Width of the data block.

  • output_dir
    (Path) –

    Path to the output directory.

  • idx
    (int) –

    Index of the current block.

streamlines_io_utils

Functions:

compute_elevation_angles

compute_elevation_angles(streamlines_xyz: list[ndarray]) -> list[ndarray]

Compute per-vertex elevation angle from streamline geometry: elevation = arcsin(z-component of unit tangent) in degrees. The last vertex copies the previous value to keep lengths aligned.

ha_to_degrees_per_streamline

ha_to_degrees_per_streamline(ha_list: list[ndarray]) -> list[ndarray]

Convert HA values that might be byte-scaled (0..255) to degrees (-90..90). Leaves values unchanged if they already look like degrees.

load_npz_streamlines

load_npz_streamlines(p: Path) -> tuple[list[ndarray], dict[str, list[ndarray]]]

Load streamlines from a .npz file saved as object arrays. Expects 'streamlines' in (z, y, x). Converts to (x, y, z). Collects any per-point arrays whose keys end with '_values' and exposes them as uppercase names without the suffix, e.g. 'ha_values' -> 'HA'.

Returns:

  • streamlines_xyz ( list[ndarray] ) –

    list[np.ndarray], each (N_i, 3) in (x, y, z)

  • per_point ( dict[str, list[ndarray]] ) –

    dict[str, list[np.ndarray]] keyed by field, each list aligned to streamlines

load_trk_streamlines

load_trk_streamlines(p: Path) -> tuple[list[ndarray], dict[str, list[ndarray]]]

Load streamlines and all per-point fields from a TrackVis .trk file. Returns streamlines in (x, y, z) voxel/world space (as stored in the TRK), and a dict of per-point fields, one list per field aligned with streamlines.

normalize_attrs_to_degrees

normalize_attrs_to_degrees(attrs: dict | None) -> dict[str, list[ndarray]]

Normalize HA, IA, AZ, EL fields to degrees if stored as 0–255. If already in degrees or unit vectors, returns unchanged except cast to float32.

Input

attrs: dict[str, list[np.ndarray]] from TRK (e.g., {"HA":[...], "IA":[...], ...})

Returns:

  • normalized ( dict[str, list[ndarray]] ) –

    same keys, each entry = list of np.ndarray (float32) in degrees.

reduce_per_edge

reduce_per_edge(values_per_point: list[ndarray], how: str = 'mean') -> ndarray

Reduce per-point values along each streamline to a single scalar per edge.
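
A minimal post-processing sketch combining these helpers, assuming they are importable from cardiotensor.utils.streamlines_io_utils and that streamlines.trk is a hypothetical tractography output:

    from pathlib import Path
    from cardiotensor.utils.streamlines_io_utils import (
        load_trk_streamlines,
        normalize_attrs_to_degrees,
        reduce_per_edge,
    )

    streamlines_xyz, per_point = load_trk_streamlines(Path("streamlines.trk"))
    per_point = normalize_attrs_to_degrees(per_point)

    # Collapse per-point helix angles to one mean value per streamline (edge).
    if "HA" in per_point:
        ha_per_edge = reduce_per_edge(per_point["HA"], how="mean")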

utils

Functions:

convert_to_8bit

convert_to_8bit(img: ndarray, perc_min: int = 0, perc_max: int = 100, min_value: float | None = None, max_value: float | None = None) -> ndarray

Converts a NumPy array to an 8-bit image.

Parameters:

  • img
    (ndarray) –

    Input image array.

  • perc_min
    (int, default: 0 ) –

    Minimum percentile for normalization. Default is 0.

  • perc_max
    (int, default: 100 ) –

    Maximum percentile for normalization. Default is 100.

  • min_value
    (Optional[float], default: None ) –

    Optional explicit minimum value.

  • max_value
    (Optional[float], default: None ) –

    Optional explicit maximum value.

Returns:

  • ndarray

    np.ndarray: 8-bit converted image.
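
A minimal usage sketch, assuming the function is importable from cardiotensor.utils.utils and using a placeholder angle map:

    import numpy as np
    from cardiotensor.utils.utils import convert_to_8bit

    img = np.random.rand(256, 256).astype(np.float32) * 180.0 - 90.0  # placeholder map in degrees

    # Map the [-90, 90] degree range onto 0..255 using explicit bounds.
    img8 = convert_to_8bit(img, min_value=-90.0, max_value=90.0)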

read_conf_file

read_conf_file(file_path: str) -> dict[str, Any]

Reads and parses a configuration file into a dictionary.

Parameters:

Returns:

  • dict[str, Any]

    Dict[str, Any]: Parsed configuration parameters.

Raises:

  • FileNotFoundError

    If the configuration file does not exist.

  • ValueError

    If expected numerical or array values are incorrectly formatted.