save_detection_arrays_to_store¶

save_detection_arrays_to_store(detection_arrays, scale_factor=(1.0, 1.0), class_dict=None, save_path=None, batch_size=5000)[source]¶

Write nucleus detection arrays to an SQLite-backed AnnotationStore.

Converts the detection arrays to NumPy form, applies coordinate scaling and optional class-ID remapping, and writes the results into an in-memory SQLiteStore. If save_path is provided, the store is committed and saved to disk as a .db file. This function provides a unified interface for converting Dask-based detection outputs into persistent annotation storage.

Parameters:
  • detection_arrays (dict[str, da.Array]) – A dictionary containing the detection fields:
      - "x": dask array of x coordinates (np.uint32).
      - "y": dask array of y coordinates (np.uint32).
      - "classes": dask array of class IDs (np.uint32).
      - "probabilities": dask array of detection scores (np.float32).

  • scale_factor (tuple[float, float], optional) – Multiplicative factors applied to the x and y coordinates before saving. The scaled coordinates are rounded to integer pixel locations. Defaults to (1.0, 1.0).

  • class_dict (dict or None) – Optional mapping of class IDs to class names or remapped IDs. If None, an identity mapping is used based on the detected class IDs.

  • save_path (Path or None) – Destination path for saving the .db file. If None, the resulting SQLiteStore is returned in memory. If provided, the parent directory is created if needed, and the final store is written as save_path.with_suffix(".db").

  • batch_size (int) – Number of detection records to write per batch. Defaults to 5000.
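For illustration, the expected input dictionary can be assembled from dask arrays as sketched below. The field names and dtypes follow the parameter description above; the coordinate values themselves are made up, and the last lines only demonstrate how scale_factor is applied (multiply, then round to integer pixels), not the function's internals:

```python
import dask.array as da
import numpy as np

# Assemble the detection_arrays dict with the four required fields.
detection_arrays = {
    "x": da.from_array(np.array([10, 250, 4000], dtype=np.uint32)),
    "y": da.from_array(np.array([20, 300, 4100], dtype=np.uint32)),
    "classes": da.from_array(np.array([0, 1, 1], dtype=np.uint32)),
    "probabilities": da.from_array(np.array([0.9, 0.75, 0.6], dtype=np.float32)),
}

# scale_factor is applied multiplicatively, then rounded to integer pixels.
scale_factor = (2.0, 2.0)
scaled_x = np.round(detection_arrays["x"].compute() * scale_factor[0]).astype(int)
print(scaled_x)  # [  20  500 8000]
```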

Returns:

  • If save_path is provided: the path to the saved .db file.

  • If save_path is None: an in-memory SQLiteStore containing all detections.

Return type:

Path or SQLiteStore

Notes

  • The heavy lifting is delegated to _write_detection_arrays_to_store(), which performs coordinate scaling, class mapping, and batch writing.
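The batched writing pattern can be sketched with the standard-library sqlite3 module. This is a simplified stand-in to show why batch_size bounds per-commit work, not the actual schema or code of SQLiteStore or _write_detection_arrays_to_store(); the table and column names here are invented for the example:

```python
import sqlite3

def write_in_batches(conn, records, batch_size=5000):
    """Insert (x, y, class_id, prob) rows in chunks of batch_size."""
    cur = conn.cursor()
    for start in range(0, len(records), batch_size):
        cur.executemany(
            "INSERT INTO detections (x, y, class_id, prob) VALUES (?, ?, ?, ?)",
            records[start:start + batch_size],
        )
        conn.commit()  # one commit per batch keeps transactions small

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE detections (x INT, y INT, class_id INT, prob REAL)")
records = [(i, i, 0, 0.5) for i in range(12000)]
write_in_batches(conn, records, batch_size=5000)  # 3 batches: 5000 + 5000 + 2000
n = conn.execute("SELECT COUNT(*) FROM detections").fetchone()[0]
print(n)  # 12000
```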