Status: Needs Review
This page has not been reviewed for accuracy and completeness. Content may be outdated or contain errors.
Training API¶
Training-related components including losses and metrics.
Overview¶
Training functionality in CUVIS.AI is provided through loss functions, metrics, and monitoring nodes that integrate with PyTorch Lightning.
Loss Functions¶
losses
¶
Loss nodes for training pipeline (port-based architecture).
LossNode
¶
Bases: Node
Base class for loss nodes that restricts execution to training stages.
Loss nodes should not execute during inference; they run only during the train, val, and test stages.
Source code in cuvis_ai/node/losses.py
OrthogonalityLoss
¶
Bases: LossNode
Orthogonality regularization loss for TrainablePCA.
Encourages PCA components to remain orthonormal during training. Loss = weight * ||W @ W.T - I||^2_F
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weight` | `float` | Weight for orthogonality loss | `1.0` |
Source code in cuvis_ai/node/losses.py
forward
¶
Compute weighted orthogonality loss from PCA components.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `components` | `Tensor` | PCA components matrix [n_components, n_features] | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Tensor]` | Dictionary with "loss" key containing weighted loss |
Source code in cuvis_ai/node/losses.py
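The documented formula, `weight * ||W @ W.T - I||^2_F`, is easy to verify by hand. The standalone helper below is an illustrative sketch of that computation, not the library's implementation:

```python
import torch

def orthogonality_loss(components: torch.Tensor, weight: float = 1.0) -> torch.Tensor:
    # weight * ||W @ W.T - I||_F^2 for a [n_components, n_features] matrix
    n = components.shape[0]
    gram = components @ components.T
    identity = torch.eye(n, dtype=components.dtype, device=components.device)
    return weight * ((gram - identity) ** 2).sum()

# A matrix with orthonormal rows incurs (near-)zero loss:
W = torch.eye(3, 5)
print(orthogonality_loss(W))
```

The loss vanishes exactly when the rows of `W` are orthonormal, which is the property the regularizer preserves during training.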
AnomalyBCEWithLogits
¶
Bases: LossNode
Binary cross-entropy loss for anomaly detection with logits.
Computes BCE loss between predicted anomaly scores and ground truth masks. Uses BCEWithLogitsLoss for numerical stability.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weight` | `float` | Overall weight for this loss component | `1.0` |
| `pos_weight` | `float` | Weight for positive class (anomaly) to handle class imbalance | `None` |
| `reduction` | `str` | Reduction method: 'mean', 'sum', or 'none' | `'mean'` |
Source code in cuvis_ai/node/losses.py
forward
¶
Compute weighted BCE loss.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `predictions` | `Tensor` | Predicted scores [B, H, W, 1] | *required* |
| `targets` | `Tensor` | Ground truth masks [B, H, W, 1] | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Tensor]` | Dictionary with "loss" key containing scalar loss |
Source code in cuvis_ai/node/losses.py
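A minimal sketch of the same computation using PyTorch's `nn.BCEWithLogitsLoss` directly (the helper name and standalone form are illustrative, not the node's API):

```python
import torch
import torch.nn as nn

def anomaly_bce(predictions, targets, weight=1.0, pos_weight=None, reduction="mean"):
    # BCE-with-logits between raw scores [B, H, W, 1] and binary masks;
    # pos_weight up-weights anomalous pixels to counter class imbalance
    pw = torch.tensor([pos_weight]) if pos_weight is not None else None
    criterion = nn.BCEWithLogitsLoss(pos_weight=pw, reduction=reduction)
    return weight * criterion(predictions, targets.float())

scores = torch.randn(2, 8, 8, 1)                    # logits from a detector
mask = (torch.rand(2, 8, 8, 1) > 0.9).float()       # sparse ground-truth anomalies
loss = anomaly_bce(scores, mask, pos_weight=10.0)
```

Because the loss consumes logits rather than probabilities, no sigmoid is applied upstream; that is what gives `BCEWithLogitsLoss` its numerical stability.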
MSEReconstructionLoss
¶
Bases: LossNode
Mean squared error reconstruction loss.
Computes MSE between reconstruction and target. Useful for autoencoder-style architectures.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weight` | `float` | Weight for this loss component | `1.0` |
| `reduction` | `str` | Reduction method: 'mean', 'sum', or 'none' | `'mean'` |
Source code in cuvis_ai/node/losses.py
forward
¶
Compute MSE reconstruction loss.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `reconstruction` | `Tensor` | Reconstructed data | *required* |
| `target` | `Tensor` | Target for reconstruction | *required* |
| `**_` | `Any` | Additional arguments (e.g., context); ignored but accepted for compatibility | `{}` |

Returns:

| Type | Description |
|---|---|
| `dict[str, Tensor]` | Dictionary with "loss" key containing scalar loss |
Source code in cuvis_ai/node/losses.py
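The computation reduces to a weighted `mse_loss`; a one-line sketch (illustrative, not the node's implementation):

```python
import torch
import torch.nn.functional as F

def mse_reconstruction_loss(reconstruction, target, weight=1.0, reduction="mean"):
    # weighted mean squared error between a reconstruction and its target
    return weight * F.mse_loss(reconstruction, target, reduction=reduction)

x = torch.randn(2, 4, 4, 3)
print(mse_reconstruction_loss(x, x))   # perfect reconstruction -> zero loss
```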
DistinctnessLoss
¶
Bases: LossNode
Repulsion loss encouraging different selectors to choose different bands.
This loss is designed for band/channel selector nodes that output a
2D weight matrix [output_channels, input_channels]. It computes the
mean pairwise cosine similarity between all pairs of selector weight
vectors and penalizes high similarity:
$$
L_\text{repel} = \frac{1}{N_\text{pairs}} \sum_{i < j} \cos(\mathbf{w}_i, \mathbf{w}_j)
$$
Minimizing this loss encourages selectors to focus on different bands, preventing the common failure mode where all channels collapse onto the same band.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weight` | `float` | Overall weight for this loss component | `0.1` |
| `eps` | `float` | Small constant for numerical stability when normalizing | `1e-06` |
Source code in cuvis_ai/node/losses.py
forward
¶
Compute mean pairwise cosine similarity penalty.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `selection_weights` | `Tensor` | Weight matrix of shape [output_channels, input_channels] | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Tensor]` | Dictionary with a single "loss" key |
Source code in cuvis_ai/node/losses.py
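The formula above can be sketched directly in PyTorch. The helper name and standalone form are illustrative assumptions, not the node's API:

```python
import torch
import torch.nn.functional as F

def distinctness_loss(selection_weights, weight=0.1, eps=1e-6):
    # unit-normalize each selector's weight vector (one row per output channel)
    w = F.normalize(selection_weights, dim=1, eps=eps)
    sim = w @ w.T                       # pairwise cosine similarities
    n = sim.shape[0]
    # mean over strictly upper-triangular pairs i < j
    n_pairs = n * (n - 1) / 2
    return weight * torch.triu(sim, diagonal=1).sum() / n_pairs

# identical selectors -> cosine similarity 1 for every pair (maximal loss);
# orthogonal selectors -> zero loss
```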
SelectorEntropyRegularizer
¶
Bases: LossNode
Entropy regularization for SoftChannelSelector.
Encourages exploration by penalizing low-entropy (over-confident) selections. Computes entropy from selection weights and applies regularization.
Higher entropy = more uniform selection (encouraged early in training); lower entropy = more peaked selection (emerges naturally as training progresses).
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weight` | `float` | Weight for entropy regularization. Positive weight encourages exploration (maximizes entropy); negative weight encourages exploitation (minimizes entropy) | `0.01` |
| `target_entropy` | `float` | Target entropy for regularization; if set, uses squared error: (entropy - target)^2 | `None` |
| `eps` | `float` | Small constant for numerical stability | `1e-06` |
Source code in cuvis_ai/node/losses.py
forward
¶
Compute entropy regularization loss from selection weights.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weights` | `Tensor` | Channel selection weights [n_channels] | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Tensor]` | Dictionary with "loss" key containing regularization loss |
Source code in cuvis_ai/node/losses.py
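An illustrative sketch of the regularizer, assuming non-negative selection weights that are normalized to a distribution before the entropy is taken (the exact normalization inside the node may differ):

```python
import torch

def selector_entropy_loss(weights, weight=0.01, target_entropy=None, eps=1e-6):
    # normalize non-negative selection weights to a probability distribution
    p = weights / (weights.sum() + eps)
    entropy = -(p * torch.log(p + eps)).sum()
    if target_entropy is not None:
        return weight * (entropy - target_entropy) ** 2
    # negated so a positive weight rewards high entropy (exploration),
    # matching the documented sign convention
    return -weight * entropy

uniform = torch.ones(16)     # maximal entropy -> most negative loss
peaked = torch.zeros(16)
peaked[0] = 1.0              # minimal entropy -> loss near zero
```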
SelectorDiversityRegularizer
¶
Bases: LossNode
Diversity regularization for SoftChannelSelector.
Encourages diverse channel selection by penalizing concentration on few channels. Uses negative variance to encourage spread (higher variance = more diverse).
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weight` | `float` | Weight for diversity regularization | `0.01` |
Source code in cuvis_ai/node/losses.py
forward
¶
Compute weighted diversity loss from selection weights.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weights` | `Tensor` | Channel selection weights [n_channels] | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Tensor]` | Dictionary with "loss" key containing weighted loss |
Source code in cuvis_ai/node/losses.py
DeepSVDDSoftBoundaryLoss
¶
Bases: LossNode
Soft-boundary Deep SVDD objective operating on BHWD embeddings.
Source code in cuvis_ai/node/losses.py
forward
¶
Compute Deep SVDD soft-boundary loss.
The loss consists of the hypersphere radius R² plus a slack penalty for points outside the hypersphere. The radius R is learned via an unconstrained parameter with softplus activation.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `embeddings` | `Tensor` | Embedded feature representations [B, H, W, D] from the network | *required* |
| `center` | `Tensor` | Center of the hypersphere [D] computed during initialization | *required* |
| `**_` | `Any` | Additional unused keyword arguments | `{}` |

Returns:

| Type | Description |
|---|---|
| `dict[str, Tensor]` | Dictionary with "loss" key containing the scalar loss value |
Notes
The loss formula is: loss = weight * (R² + (1/ν) * mean(ReLU(dist - R²))) where dist is the squared distance from embeddings to the center.
Source code in cuvis_ai/node/losses.py
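The formula in the notes can be sketched as follows. The radius parameterization and the helper's signature (`r_raw` as an unconstrained scalar, `nu` as the slack fraction) follow the docstring but are otherwise illustrative:

```python
import torch
import torch.nn.functional as F

def svdd_soft_boundary_loss(embeddings, center, r_raw, nu=0.1, weight=1.0):
    # R is parameterized as softplus(r_raw) so the learned radius stays positive
    r_sq = F.softplus(r_raw) ** 2
    # squared distance of each pixel embedding [B, H, W, D] to the center [D]
    dist = ((embeddings - center) ** 2).sum(dim=-1)
    # slack penalty for points outside the hypersphere, scaled by 1/nu
    slack = F.relu(dist - r_sq).mean()
    return weight * (r_sq + slack / nu)

emb = torch.randn(2, 4, 4, 8)
c = emb.mean(dim=(0, 1, 2))          # center computed from data at initialization
loss = svdd_soft_boundary_loss(emb, c, torch.tensor(0.0))
```

Minimizing this objective shrinks the radius while allowing a `nu`-controlled fraction of points to fall outside the boundary.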
IoULoss
¶
Bases: LossNode
Differentiable IoU (Intersection over Union) loss.
Computes: 1 - (|A ∩ B| + smooth) / (|A ∪ B| + smooth). Works directly on continuous scores (not binary decisions), preserving gradients.
The scores are normalized to [0, 1] range using sigmoid or clamp before computing IoU, ensuring differentiability.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weight` | `float` | Overall weight for this loss component | `1.0` |
| `smooth` | `float` | Small constant for numerical stability | `1e-06` |
| `normalize_method` | `('sigmoid', 'clamp', 'minmax')` | Method to normalize predictions to [0, 1]: "sigmoid" applies a sigmoid activation (good for unbounded scores); "clamp" clamps to [0, 1] (good for scores already in a reasonable range); "minmax" applies per-batch min-max normalization (good for varying score ranges) | `"sigmoid"` |
Examples:
>>> iou_loss = IoULoss(weight=1.0, smooth=1e-6)
>>> # Use with AdaClip scores directly (no thresholding needed)
>>> loss = iou_loss.forward(predictions=adaclip_scores, targets=ground_truth_mask)
Source code in cuvis_ai/node/losses.py
forward
¶
Compute differentiable IoU loss.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `predictions` | `Tensor` | Predicted anomaly scores [B, H, W, 1] (any real values) | *required* |
| `targets` | `Tensor` | Ground truth binary masks [B, H, W, 1] | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Tensor]` | Dictionary with "loss" key containing scalar IoU loss |
Source code in cuvis_ai/node/losses.py
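A self-contained sketch of the soft IoU computation (illustrative; the node's internal implementation may differ in details such as per-sample reduction):

```python
import torch

def iou_loss(predictions, targets, weight=1.0, smooth=1e-6, normalize="sigmoid"):
    # map raw scores into [0, 1] while keeping the operation differentiable
    if normalize == "sigmoid":
        p = torch.sigmoid(predictions)
    elif normalize == "clamp":
        p = predictions.clamp(0.0, 1.0)
    else:  # "minmax"
        p = (predictions - predictions.min()) / (predictions.max() - predictions.min() + smooth)
    t = targets.float()
    intersection = (p * t).sum()
    union = p.sum() + t.sum() - intersection
    return weight * (1.0 - (intersection + smooth) / (union + smooth))

mask = torch.ones(1, 4, 4, 1)
confident = torch.full((1, 4, 4, 1), 100.0)   # sigmoid -> ~1 everywhere
print(iou_loss(confident, mask))              # near-perfect overlap -> loss near 0
```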
Metrics¶
metrics
¶
Metric nodes for training pipeline (port-based architecture).
ExplainedVarianceMetric
¶
Bases: Node
Track explained variance ratio for PCA components.
Executes only during validation and test stages.
Source code in cuvis_ai/node/metrics.py
forward
¶
Compute explained variance metrics.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `explained_variance_ratio` | `Tensor` | Explained variance ratios from PCA node | *required* |
| `context` | `Context` | Execution context with stage, epoch, batch_idx | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | Dictionary with "metrics" key containing list of Metric objects |
Source code in cuvis_ai/node/metrics.py
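For reference, explained-variance ratios of the kind this metric consumes can be computed from a centered data matrix via SVD. This sketch is illustrative background, not part of the node:

```python
import torch

def explained_variance_ratio(X, n_components):
    # center the data, then derive per-component variances from singular values
    Xc = X - X.mean(dim=0)
    s = torch.linalg.svdvals(Xc)          # singular values, descending
    var = s ** 2 / (X.shape[0] - 1)
    return (var / var.sum())[:n_components]

X = torch.randn(50, 6)
print(explained_variance_ratio(X, 3))     # fractions of total variance, descending
```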
AnomalyDetectionMetrics
¶
Bases: Node
Compute anomaly detection metrics (precision, recall, F1, etc.).
Uses torchmetrics for GPU-optimized, robust metric computation. Expects binary decisions and targets to be binary masks. Executes only during validation and test stages.
Source code in cuvis_ai/node/metrics.py
forward
¶
Compute anomaly detection metrics using torchmetrics.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `decisions` | `Tensor` | Binary anomaly decisions [B, H, W, 1] | *required* |
| `targets` | `Tensor` | Ground truth binary masks [B, H, W, 1] | *required* |
| `context` | `Context` | Execution context with stage, epoch, batch_idx | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | Dictionary with "metrics" key containing list of Metric objects |
Source code in cuvis_ai/node/metrics.py
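The node itself uses torchmetrics; the equivalent pixel-wise computation in plain PyTorch looks like this (illustrative sketch, not the node's code):

```python
import torch

def detection_metrics(decisions, targets, eps=1e-8):
    # pixel-wise confusion counts over flattened binary masks
    d = decisions.bool().flatten()
    t = targets.bool().flatten()
    tp = (d & t).sum().float()
    fp = (d & ~t).sum().float()
    fn = (~d & t).sum().float()
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    f1 = 2 * precision * recall / (precision + recall + eps)
    return {"precision": precision, "recall": recall, "f1": f1}
```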
ScoreStatisticsMetric
¶
Bases: Node
Compute statistical properties of score distributions.
Tracks mean, std, min, max, median, and quantiles of scores. Executes only during validation and test stages.
Source code in cuvis_ai/node/metrics.py
forward
¶
Compute score statistics.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `scores` | `Tensor` | Score values [B, H, W] | *required* |
| `context` | `Context` | Execution context with stage, epoch, batch_idx | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | Dictionary with "metrics" key containing list of Metric objects |
Source code in cuvis_ai/node/metrics.py
ComponentOrthogonalityMetric
¶
Bases: Node
Track orthogonality of PCA components during training.
Measures how close the component matrix is to being orthonormal. Executes only during validation and test stages.
Source code in cuvis_ai/node/metrics.py
forward
¶
Compute component orthogonality metrics.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `components` | `Tensor` | PCA components matrix [n_components, n_features] | *required* |
| `context` | `Context` | Execution context with stage, epoch, batch_idx | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | Dictionary with "metrics" key containing list of Metric objects |
Source code in cuvis_ai/node/metrics.py
SelectorEntropyMetric
¶
Bases: Node
Track entropy of channel selection distribution.
Measures the uncertainty/diversity in channel selection weights. Higher entropy indicates more uniform selection (less confident). Lower entropy indicates more peaked selection (more confident).
Executes only during validation and test stages.
Source code in cuvis_ai/node/metrics.py
forward
¶
Compute entropy of selection weights.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weights` | `Tensor` | Channel selection weights [n_channels] | *required* |
| `context` | `Context` | Execution context with stage, epoch, batch_idx | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | Dictionary with "metrics" key containing list of Metric objects |
Source code in cuvis_ai/node/metrics.py
SelectorDiversityMetric
¶
Bases: Node
Track diversity of channel selection.
Measures how spread out the selection weights are across channels. Uses Gini coefficient - lower values indicate more diverse selection.
Executes only during validation and test stages.
Source code in cuvis_ai/node/metrics.py
forward
¶
Compute diversity metrics for selection weights.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `weights` | `Tensor` | Channel selection weights [n_channels] | *required* |
| `context` | `Context` | Execution context with stage, epoch, batch_idx | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | Dictionary with "metrics" key containing list of Metric objects |
Source code in cuvis_ai/node/metrics.py
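The Gini coefficient the metric is built on can be sketched as follows (illustrative, not the node's implementation): 0 means perfectly uniform weights, and values approaching 1 mean the mass is concentrated on a few channels.

```python
import torch

def gini_coefficient(weights):
    # sort ascending, then apply the standard index-weighted Gini formula
    w, _ = torch.sort(weights.flatten())
    n = w.numel()
    index = torch.arange(1, n + 1, dtype=w.dtype)
    return ((2 * index - n - 1) * w).sum() / (n * w.sum() + 1e-12)

print(gini_coefficient(torch.ones(4) / 4))              # uniform -> 0
print(gini_coefficient(torch.tensor([0., 0., 0., 1.]))) # one-hot -> (n-1)/n
```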
AnomalyPixelStatisticsMetric
¶
Bases: Node
Compute anomaly pixel statistics from binary decisions.
Calculates total pixels, anomalous pixels count, and anomaly percentage. Useful for monitoring the proportion of detected anomalies in batches. Executes only during validation and test stages.
Source code in cuvis_ai/node/metrics.py
forward
¶
Compute anomaly pixel statistics.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `decisions` | `Tensor` | Binary anomaly decisions [B, H, W, 1] | *required* |
| `context` | `Context` | Execution context with stage, epoch, batch_idx | *required* |

Returns:

| Type | Description |
|---|---|
| `dict[str, Any]` | Dictionary with "metrics" key containing list of Metric objects |
Source code in cuvis_ai/node/metrics.py
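The three statistics the node reports reduce to simple counts; an illustrative sketch:

```python
import torch

def anomaly_pixel_stats(decisions):
    # counts and percentage of anomalous pixels in a batch of binary masks
    total = decisions.numel()
    anomalous = int(decisions.bool().sum())
    return {"total_pixels": total,
            "anomalous_pixels": anomalous,
            "anomaly_percentage": 100.0 * anomalous / total}

d = torch.zeros(1, 4, 4, 1)
d[0, 0, 0, 0] = 1.0
print(anomaly_pixel_stats(d))   # 1 anomalous pixel out of 16
```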
Monitoring¶
monitor
¶
TensorBoard Monitoring Nodes.
This module provides nodes for logging artifacts and metrics to TensorBoard during pipeline execution. The monitoring nodes are sink nodes that accept artifacts (visualizations) and metrics from upstream nodes and write them to TensorBoard logs for visualization and analysis.
The primary use case is logging training and validation metrics, along with visualizations like heatmaps, RGB renderings, and PCA plots during model training.
See Also
cuvis_ai.node.visualizations : Nodes that generate artifacts for monitoring
TensorBoardMonitorNode
¶
Bases: Node
TensorBoard monitoring node for logging artifacts and metrics.
This is a SINK node that logs visualizations (artifacts) and metrics to TensorBoard. Accepts optional inputs for artifacts and metrics, allowing predecessors to be filtered by execution_stage without causing errors.
Executes during all stages (ALWAYS).
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `output_dir` | `str` | Directory for TensorBoard logs | `'./runs'` |
| `comment` | `str` | Comment to append to log directory name | `''` |
| `flush_secs` | `int` | How often to flush pending events to disk | `120` |
Examples:
>>> heatmap_viz = AnomalyHeatmap(cmap='hot', up_to=10)
>>> tensorboard_node = TensorBoardMonitorNode(output_dir="./runs")
>>> graph.connect(
... (heatmap_viz.artifacts, tensorboard_node.artifacts),
... )
Source code in cuvis_ai/node/monitor.py
forward
¶
Log artifacts and metrics to TensorBoard.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `context` | `Context` | Execution context with stage, epoch, batch_idx, global_step | `None` |
| `artifacts` | `list[Artifact]` | List of artifacts to log | `None` |
| `metrics` | `list[Metric]` | List of metrics to log | `None` |

Returns:

| Type | Description |
|---|---|
| `dict` | Empty dict (sink node has no outputs) |
Source code in cuvis_ai/node/monitor.py
log
¶
Log a scalar value to TensorBoard.
This method provides a simple interface for external trainers to log metrics directly, complementing the port-based logging. Used by GradientTrainer to log train/val losses to the same TensorBoard directory as graph metrics and artifacts.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str` | Name/tag for the scalar (e.g., "train/loss", "val/accuracy") | *required* |
| `value` | `float` | Scalar value to log | *required* |
| `step` | `int` | Global step number | *required* |
Examples:
>>> tensorboard_node = TensorBoardMonitorNode(output_dir="./runs")
>>> # From external trainer
>>> tensorboard_node.log("train/loss", 0.5, step=100)