Added autogenerated sphinx documentation

This commit is contained in:
2025-05-28 15:12:44 +01:00
parent 0ad146179c
commit 0b0a0ca5c5
13 changed files with 1824 additions and 2 deletions

.gitignore vendored

@@ -83,6 +83,7 @@ instance/
.scrapy
# Sphinx documentation
docs/rust/crates
docs/_build/
# PyBuilder

Cargo.toml

@@ -27,6 +27,7 @@ flexbuffers = "25.2.10"
float-cmp = "0.10.0"
ndarray = { version = "0.16.1", features = ["rayon", "serde"] }
serde = "1.0.200"
sphinx-rustdocgen = "0.8.1"
tar = "0.4.40"
tempfile = "3.10.1"
xz = "0.1.0"

docs/conf.py Normal file

@@ -0,0 +1,86 @@
# Configuration file for the Sphinx documentation builder.
# For the full list of built-in configuration values, see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = "read_aconity_layers"
copyright = "2024, Cian Hughes"
author = "Cian Hughes"
release = "0.4.3"
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
extensions = [
"sphinxcontrib_rust",
"sphinx.ext.autodoc",
"sphinx.ext.autosummary",
"sphinx.ext.napoleon",
"sphinx.ext.viewcode",
"sphinx.ext.intersphinx",
"sphinx_autodoc_typehints",
"sphinx_copybutton",
"myst_parser",
]
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
# -- Options for HTML output -------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output
html_theme = "sphinx_rtd_theme"
# -- Extension configuration -------------------------------------------------
# Napoleon settings
napoleon_google_docstring = True
napoleon_numpy_docstring = True
napoleon_include_init_with_doc = False
napoleon_include_private_with_doc = False
napoleon_include_special_with_doc = True
napoleon_use_admonition_for_examples = False
napoleon_use_admonition_for_notes = False
napoleon_use_admonition_for_references = False
napoleon_use_ivar = False
napoleon_use_param = True
napoleon_use_rtype = True
napoleon_preprocess_types = False
napoleon_type_aliases = None
napoleon_attr_annotations = True
# Autodoc settings
autodoc_typehints = "description"
autodoc_member_order = "bysource"
autosummary_generate = True
# Intersphinx mapping
intersphinx_mapping = {
"python": ("https://docs.python.org/3", None),
"numpy": ("https://numpy.org/doc/stable/", None),
}
# Sphinx-rust settings
rust_crates = {
"read_aconity_layers": ".",
}
rust_doc_dir = "rust/crates"
rust_rustdoc_fmt = "md"
# MyST settings
myst_enable_extensions = [
"deflist",
"html_image",
"attrs_block",
"colon_fence",
"html_admonition",
"replacements",
"smartquotes",
"strikethrough",
"tasklist",
]
# Copy button settings
copybutton_prompt_text = r">>> |\.\.\. |\$ |In \[\d*\]: | {2,5}\.\.\.: | {5,8}: "
copybutton_prompt_is_regexp = True
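The copy-button prompt regex above strips interpreter and shell prompts before text is copied from rendered code blocks; its behaviour can be checked directly with a small stand-alone snippet:

```python
import re

# Same pattern as copybutton_prompt_text above
prompt = r">>> |\.\.\. |\$ |In \[\d*\]: | {2,5}\.\.\.: | {5,8}: "

# Lines as they would appear in rendered code blocks
assert re.match(prompt, ">>> print('hi')")
assert re.match(prompt, "$ pip install read-aconity-layers")
assert re.match(prompt, "In [3]: data.shape")
assert re.match(prompt, "... continued line")
```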

docs/development.rst Normal file

@@ -0,0 +1,275 @@
Development
===========

This guide covers setting up a development environment and contributing to ``read_aconity_layers``.

Development Setup
-----------------

Prerequisites
~~~~~~~~~~~~~

* Rust 1.70+ with Cargo
* Python 3.11+
* Poetry
* Git

Environment Setup
~~~~~~~~~~~~~~~~~

1. **Clone and setup the repository:**

   .. code-block:: bash

      git clone https://github.com/Cian-H/read_aconity_layers.git
      cd read_aconity_layers

2. **Install dependencies:**

   .. code-block:: bash

      poetry install --with dev,docs

3. **Setup pre-commit hooks:**

   .. code-block:: bash

      poetry run pre-commit install

4. **Build the Rust extension:**

   .. code-block:: bash

      poetry run maturin develop
Code Style and Quality
----------------------

This project uses several tools to maintain code quality:

Python Code
~~~~~~~~~~~

* **Ruff**: For linting and formatting
* **MyPy**: For type checking
* **Pytest**: For testing

Run quality checks:

.. code-block:: bash

   # Format and lint Python code
   poetry run ruff format .
   poetry run ruff check .

   # Type checking
   poetry run mypy .

   # Run tests
   poetry run pytest

Rust Code
~~~~~~~~~

* **rustfmt**: For formatting
* **clippy**: For linting
* **cargo test**: For testing

Run quality checks:

.. code-block:: bash

   # Format Rust code
   cargo fmt

   # Lint Rust code
   cargo clippy

   # Run Rust tests
   cargo test
Testing
-------

The project includes comprehensive tests for both Python and Rust components.

Running Tests
~~~~~~~~~~~~~

.. code-block:: bash

   # Run all tests
   poetry run pytest

   # Run with coverage
   poetry run pytest --cov=read_aconity_layers

   # Run Rust tests
   cargo test

Test Structure
~~~~~~~~~~~~~~

* **Python tests**: Located in the ``tests/`` directory
* **Rust tests**: Integrated into ``src/rust_fn/mod.rs``
* **Property-based tests**: Use ``arbtest`` for Rust property testing
* **Regression tests**: Validate against known good outputs

Adding Tests
~~~~~~~~~~~~

When adding new functionality:

1. **Add Rust tests** in the appropriate module
2. **Add Python integration tests** in ``tests/``
3. **Update regression tests** if the output format changes
4. **Add property tests** for mathematical functions
Documentation
-------------

Building Documentation
~~~~~~~~~~~~~~~~~~~~~~

**Prerequisites**: You need the Rust toolchain installed for ``sphinxcontrib-rust`` to work.

.. code-block:: bash

   # Install documentation dependencies
   poetry install --with docs

   # Build documentation
   cd docs
   just html

   # Or build manually
   poetry run sphinx-build -b html . _build/html

   # Serve locally (optional)
   just serve

Documentation Structure
~~~~~~~~~~~~~~~~~~~~~~~

* **docs/conf.py**: Sphinx configuration
* **docs/index.rst**: Main documentation page
* **docs/python/**: Python API documentation
* **docs/rust/**: Rust API documentation
* **docs/*.rst**: User guides and tutorials

The documentation automatically generates API references from:

* Python docstrings and type hints
* Rust documentation comments (``///`` and ``//!``)
* Type stub files (``*.pyi``)
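On the Python side, Napoleon parses Google- and NumPy-style docstrings into the rendered API reference. A minimal sketch of the NumPy style that autodoc would pick up; the function name and behaviour here are purely illustrative, not part of the library:

```python
import numpy as np

def correct_coordinates(xy: np.ndarray) -> np.ndarray:
    """Apply a calibration offset to raw X/Y coordinates.

    Illustrative example only; the real correction formulas live
    in the Rust core.

    Parameters
    ----------
    xy : np.ndarray
        Array of shape (N, 2) with raw X and Y columns.

    Returns
    -------
    np.ndarray
        Array of the same shape with corrected coordinates.
    """
    return xy + 0.5  # placeholder offset for demonstration

out = correct_coordinates(np.zeros((3, 2)))
```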
**Note**: For Rust API documentation to work properly, you need:

1. The Rust toolchain installed (cargo, rustfmt)
2. Proper Rust doc comments in your source code
3. The ``sphinxcontrib-rust`` extension configured correctly
Contributing
------------

Workflow
~~~~~~~~

1. **Fork the repository** on GitHub
2. **Create a feature branch** from ``main``
3. **Make your changes** following the coding standards
4. **Add tests** for new functionality
5. **Update documentation** as needed
6. **Run the full test suite** to ensure everything works
7. **Submit a pull request**

Pre-commit Checks
~~~~~~~~~~~~~~~~~

The project uses pre-commit hooks that run automatically:

* Code formatting (Ruff, rustfmt)
* Linting (Ruff, Clippy)
* Type checking (MyPy)
* Version bump validation
* Poetry validation

These checks must pass before commits are accepted.

Release Process
~~~~~~~~~~~~~~~

1. **Update the version** in ``Cargo.toml`` (triggers version validation)
2. **Update the changelog** if applicable
3. **Ensure all tests pass**
4. **Create a release** on GitHub
5. **CI automatically builds and publishes** wheels to PyPI
Architecture Notes
------------------

The library is structured in two main components:

Rust Core (``src/rust_fn/``)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

* **High-performance file I/O** using a CSV reader
* **Parallel processing** with Rayon
* **Memory-efficient array operations** with ndarray
* **Coordinate correction algorithms**

Python Bindings (``src/lib.rs``)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

* **PyO3 integration** for seamless Python interop
* **Error handling** conversion from Rust to Python exceptions
* **NumPy integration** for zero-copy array passing
* **Type annotations** via stub files

Performance Considerations
~~~~~~~~~~~~~~~~~~~~~~~~~~

* File I/O is the primary bottleneck
* Parallel processing scales well with core count
* Memory usage is proportional to dataset size
* Coordinate corrections use vectorized operations
Common Development Tasks
------------------------

Adding a New Function
~~~~~~~~~~~~~~~~~~~~~~

1. **Implement in Rust** (``src/rust_fn/mod.rs``)
2. **Add a Python binding** (``src/lib.rs``)
3. **Update the type stubs** (``read_layers.pyi``)
4. **Add tests** for both Rust and Python
5. **Update the documentation**

Debugging Build Issues
~~~~~~~~~~~~~~~~~~~~~~

* **Check the Rust version**: Must be 1.70+
* **Verify PyO3 compatibility**: Should match the Python version
* **Clear the build caches**: ``cargo clean`` and ``poetry env remove --all``
* **Check dependencies**: Ensure all dev dependencies are installed

Profiling Performance
~~~~~~~~~~~~~~~~~~~~~

For Rust code:

.. code-block:: bash

   # Profile with perf (Linux)
   cargo build --release
   perf record --call-graph=dwarf ./target/release/your_binary
   perf report

For the Python integration:

.. code-block:: bash

   # Profile with py-spy
   pip install py-spy
   py-spy record -o profile.svg -- python your_script.py

docs/index.rst Normal file

@@ -0,0 +1,52 @@
read_aconity_layers Documentation
==================================

A utility for fast reading of layer data from the Aconity mini powder bed fusion machine.

.. toctree::
   :maxdepth: 2
   :caption: Contents:

   installation
   quickstart
   python/index
   rust/index
   development

Overview
--------

``read_aconity_layers`` is a high-performance Python library for reading and processing layer data from Aconity mini powder bed fusion machines. It is built with Rust for maximum performance and uses PyO3 for seamless Python integration.

Features
--------

* **Fast**: Built with Rust for high-performance data processing
* **Simple**: Easy-to-use Python API
* **Parallel**: Leverages Rayon for parallel processing of multiple files
* **Type-safe**: Full type annotations and stub files included

Quick Example
-------------

.. code-block:: python

   import read_aconity_layers as ral

   # Read all layers from a directory
   data = ral.read_layers("/path/to/layer/files/")

   # Read specific layer files
   files = ["/path/to/layer1.pcd", "/path/to/layer2.pcd"]
   data = ral.read_selected_layers(files)

   # Read a single layer
   layer = ral.read_layer("/path/to/layer.pcd")

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

docs/installation.rst Normal file

@@ -0,0 +1,98 @@
Installation
============

Requirements
------------

* Python 3.11 or higher
* NumPy 2.0.0 or higher

From PyPI (Recommended)
-----------------------

.. code-block:: bash

   pip install read-aconity-layers

From Source
-----------

If you want to build from source or contribute to the project:

Prerequisites
~~~~~~~~~~~~~

* Rust 1.70 or higher
* Python 3.11 or higher
* Poetry (for development)

Build Steps
~~~~~~~~~~~

1. Clone the repository:

   .. code-block:: bash

      git clone https://github.com/Cian-H/read_aconity_layers.git
      cd read_aconity_layers

2. Install Python dependencies:

   .. code-block:: bash

      poetry install

3. Build the Rust extension:

   .. code-block:: bash

      poetry run maturin develop

4. Run tests to verify the installation:

   .. code-block:: bash

      poetry run pytest

Development Installation
------------------------

For development work, you'll also want the development dependencies:

.. code-block:: bash

   poetry install --with dev,docs

This installs additional tools for:

* Code formatting (ruff)
* Type checking (mypy)
* Testing (pytest)
* Documentation building (sphinx)

Troubleshooting
---------------

Common Issues
~~~~~~~~~~~~~

**Import Error**: If you get import errors, make sure you've run ``maturin develop``
to build the Rust extension.

**Performance Issues**: The library uses parallel processing by default. If you
encounter memory issues with very large datasets, consider processing files in
smaller batches.

**Rust Compilation Errors**: Make sure you have a recent version of Rust installed.
The minimum supported version is 1.70.

Platform Notes
~~~~~~~~~~~~~~

**Windows**: You may need to install the Microsoft C++ Build Tools if you don't
already have them.

**macOS**: The Xcode command line tools are required for Rust compilation.

**Linux**: Most distributions should work out of the box. You may need to install
``build-essential`` on Debian/Ubuntu systems.

docs/justfile Normal file

@@ -0,0 +1,50 @@
# Justfile for Sphinx documentation

sphinxopts := ""
sphinxbuild := "poetry run sphinx-build"
sourcedir := "."
builddir := "./_build"

# Default recipe - equivalent to "just" without arguments
default: help

# Show help
help:
    @{{sphinxbuild}} -M help "{{sourcedir}}" "{{builddir}}" {{sphinxopts}}

# Build the Rust extension first (required for autodoc)
build-extension:
    @echo "Checking Rust toolchain..."
    @which cargo > /dev/null || (echo "Error: Rust/Cargo not found. Please install Rust toolchain." && exit 1)
    cd .. && poetry run maturin develop

# Clean build directory
clean:
    rm -rf {{builddir}}/*

# Build HTML documentation
html: build-extension
    {{sphinxbuild}} -b html "{{sourcedir}}" "{{builddir}}/html" {{sphinxopts}}
    @echo
    @echo "Build finished. The HTML pages are in {{builddir}}/html."

# Build and serve documentation locally
serve: html
    @echo "Serving documentation at http://localhost:8000"
    cd {{builddir}}/html && python -m http.server 8000

# Build documentation with live reload (requires sphinx-autobuild)
livehtml: build-extension
    poetry run sphinx-autobuild "{{sourcedir}}" "{{builddir}}/html" {{sphinxopts}}

# Check for broken links
linkcheck:
    {{sphinxbuild}} -b linkcheck "{{sourcedir}}" "{{builddir}}/linkcheck" {{sphinxopts}}

# Build all formats
all: html

# Catch-all recipe for Sphinx targets (equivalent to Make's % target)
# Usage: just sphinx <target> [options]
sphinx target *options="":
    @{{sphinxbuild}} -M {{target}} "{{sourcedir}}" "{{builddir}}" {{sphinxopts}} {{options}}

docs/python/index.rst Normal file

@@ -0,0 +1,25 @@
Python API Reference
====================

This section contains the complete Python API reference for ``read_aconity_layers``.

Module
------

.. automodule:: read_aconity_layers
   :members:
   :undoc-members:
   :show-inheritance:

Return Type Information
-----------------------

The library includes comprehensive type stubs for full IDE support and type checking.

All functions return NumPy arrays with the following structure:

* **Column 0**: X coordinates (corrected)
* **Column 1**: Y coordinates (corrected)
* **Column 2**: Z coordinates (layer height)
* **Column 3**: Pyrometer 1 readings
* **Column 4**: Pyrometer 2 readings
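A sketch of unpacking that column layout, using a zero-filled array as a stand-in for the result of a real call such as ``read_layers(...)``:

```python
import numpy as np

# Stand-in for the (N, 5) array returned by the library's read functions
data = np.zeros((100, 5))

x, y, z = data[:, 0], data[:, 1], data[:, 2]  # corrected coordinates + layer height
pyro1, pyro2 = data[:, 3], data[:, 4]         # pyrometer readings
```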

docs/quickstart.rst Normal file

@@ -0,0 +1,190 @@
Quickstart Guide
================

This guide will get you up and running with ``read_aconity_layers`` in just a few minutes.

Basic Usage
-----------

Reading All Layers from a Directory
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The most common use case is reading all layer files from a directory:

.. code-block:: python

   import read_aconity_layers as ral

   # Read all .pcd files from a directory
   data = ral.read_layers("/path/to/your/layer/files/")

   print(f"Loaded {data.shape[0]} data points")
   print(f"Data shape: {data.shape}")
   print("Columns: [x, y, z, data1, data2]")

Reading Specific Files
~~~~~~~~~~~~~~~~~~~~~~

If you want to read only specific layer files:

.. code-block:: python

   import read_aconity_layers as ral

   # List of specific files to read
   files = [
       "/path/to/0.1.pcd",
       "/path/to/0.2.pcd",
       "/path/to/0.3.pcd",
   ]
   data = ral.read_selected_layers(files)

Reading a Single Layer
~~~~~~~~~~~~~~~~~~~~~~

For processing individual layers:

.. code-block:: python

   import read_aconity_layers as ral

   # Read just one layer file
   layer_data = ral.read_layer("/path/to/single_layer.pcd")

   # Extract coordinates
   x_coords = layer_data[:, 0]
   y_coords = layer_data[:, 1]
   z_coords = layer_data[:, 2]
Working with the Data
---------------------

Understanding the Data Format
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

All functions return NumPy arrays with 5 columns:

* **Column 0**: X coordinates (corrected)
* **Column 1**: Y coordinates (corrected)
* **Column 2**: Z coordinates (layer height)
* **Column 3**: Original data column 3
* **Column 4**: Original data column 4

The X and Y coordinates are automatically corrected using the calibration
formulas built into the library.

Example: Basic Data Analysis
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. code-block:: python

   import read_aconity_layers as ral
   import numpy as np
   import matplotlib.pyplot as plt

   # Load data
   data = ral.read_layers("/path/to/layers/")

   # Basic statistics
   print(f"X range: {data[:, 0].min():.2f} to {data[:, 0].max():.2f}")
   print(f"Y range: {data[:, 1].min():.2f} to {data[:, 1].max():.2f}")
   print(f"Z range: {data[:, 2].min():.2f} to {data[:, 2].max():.2f}")

   # Plot layer distribution
   unique_z = np.unique(data[:, 2])
   layer_counts = [np.sum(data[:, 2] == z) for z in unique_z]

   plt.figure(figsize=(10, 6))
   plt.plot(unique_z, layer_counts)
   plt.xlabel('Layer Height (Z)')
   plt.ylabel('Number of Points')
   plt.title('Points per Layer')
   plt.show()
Example: Processing by Layer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. code-block:: python

   import read_aconity_layers as ral
   import numpy as np

   # Read data
   data = ral.read_layers("/path/to/layers/")

   # Group by Z coordinate (layer)
   unique_z = np.unique(data[:, 2])
   layer_stats = []

   for z in unique_z:
       layer_mask = data[:, 2] == z
       layer_points = data[layer_mask]

       stats = {
           'z': z,
           'point_count': len(layer_points),
           'x_mean': layer_points[:, 0].mean(),
           'y_mean': layer_points[:, 1].mean(),
           'data1_mean': layer_points[:, 3].mean(),
           'data2_mean': layer_points[:, 4].mean(),
       }
       layer_stats.append(stats)

   # layer_stats is now a list of per-layer dictionaries; convert it to a
   # structured array or DataFrame if you need columnar analysis
Performance Tips
----------------

Parallel Processing
~~~~~~~~~~~~~~~~~~~

The library automatically uses parallel processing for multiple files.
For best performance:

* Use ``read_layers()`` for directories with many files
* The library will automatically use all available CPU cores
* Larger numbers of files will see better speedup

Memory Usage
~~~~~~~~~~~~

For very large datasets:

* Consider processing files in batches if memory is limited
* Use ``read_selected_layers()`` to process subsets
* The library streams data efficiently, but the final arrays are held in memory
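A minimal batching sketch along those lines: split the file list into fixed-size chunks so only one batch's array is in memory at a time. The actual loading call is left as a comment, since it needs real layer files to run:

```python
def batched(files, batch_size):
    """Yield successive fixed-size batches from a list of file paths."""
    for i in range(0, len(files), batch_size):
        yield files[i:i + batch_size]

# Hypothetical file names following the 0.1.pcd naming convention
files = [f"/path/to/{i / 10:.1f}.pcd" for i in range(1, 8)]

results = []
for batch in batched(files, batch_size=3):
    # data = ral.read_selected_layers(batch)   # real call, omitted here
    # results.append(data[:, 3].mean())        # e.g. a per-batch reduction
    results.append(len(batch))
```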
File Organization
~~~~~~~~~~~~~~~~~

For optimal performance:

* Keep layer files in a single directory when using ``read_layers()``
* Use consistent naming (the Z coordinate is extracted from the filename)
* Ensure files are properly formatted space-delimited text
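The exact filename parsing lives in the Rust core; a sketch of what "Z extracted from the filename" suggests, assuming the file stem is the layer height as in the ``0.1.pcd`` examples earlier in this guide:

```python
from pathlib import Path

def z_from_filename(path):
    """Parse the layer height from a name like '0.1.pcd' (assumed convention)."""
    return float(Path(path).stem)

z = z_from_filename("/path/to/0.1.pcd")
```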
Error Handling
--------------

The library provides detailed error messages for common issues:

.. code-block:: python

   import read_aconity_layers as ral

   try:
       data = ral.read_layers("/path/to/layers/")
   except IOError as e:
       print(f"File read error: {e}")
   except RuntimeError as e:
       print(f"Processing error: {e}")

Next Steps
----------

* Check out the full :doc:`python/index` for detailed function documentation
* See :doc:`development` if you want to contribute to the project
* For performance-critical applications, review the :doc:`rust/index`

docs/rust/index.rst Normal file

@@ -0,0 +1,57 @@
Rust API Reference
==================

This section documents the internal Rust implementation of ``read_aconity_layers``.

.. note::
   This documentation is primarily intended for contributors and developers
   who want to understand the internal implementation. Most users should
   refer to the :doc:`../python/index` instead.

.. toctree::
   :glob:
   :caption: Rust API:

   crates/read_aconity_layers/lib

Overview
--------

The Rust implementation provides the high-performance core of ``read_aconity_layers``.
Key characteristics include:

Performance Features
~~~~~~~~~~~~~~~~~~~~

* **Parallel Processing**: Uses Rayon for parallel file reading across all CPU cores
* **Memory Efficiency**: Streams data rather than loading everything into memory at once
* **SIMD Operations**: Leverages vectorized operations for coordinate corrections
* **Zero-Copy**: Minimizes data copying between Rust and Python using PyO3

Architecture
~~~~~~~~~~~~

The crate is organized into two main components:

* **Public API** (``src/lib.rs``): PyO3 bindings that expose Rust functions to Python
* **Core Logic** (``src/rust_fn/``): Pure Rust implementation of file reading and processing

Error Handling
~~~~~~~~~~~~~~

The Rust code uses a comprehensive ``ReadError`` enum that covers all possible
failure modes, from I/O errors to parsing failures. These are automatically
converted to appropriate Python exceptions through the PyO3 integration.

Dependencies
~~~~~~~~~~~~

Key Rust dependencies that power the performance:

* ``ndarray`` - N-dimensional arrays with BLAS integration
* ``rayon`` - Data parallelism library
* ``csv`` - Fast CSV parsing
* ``pyo3`` - Python bindings
* ``numpy`` - NumPy integration for PyO3
* ``glob`` - File path pattern matching
* ``indicatif`` - Progress bars for long operations

justfile Normal file

@@ -0,0 +1,7 @@
# Lists available just commands
default:
    @just --list

# Documentation commands - delegates to docs/justfile
docs command *args:
    cd docs && just {{command}} {{args}}

poetry.lock generated

File diff suppressed because it is too large Load Diff

pyproject.toml

@@ -34,6 +34,15 @@ mypy = "^1.15.0"
pytest = "^8.3.5"
loguru = "^0.7.3"
[tool.poetry.group.docs.dependencies]
sphinx = "^8.2.3"
sphinx-autodoc-typehints = "^3.2.0"
sphinx-copybutton = "^0.5.2"
sphinx-rtd-theme = "^3.0.2"
sphinxcontrib-rust = "^0.8.1"
myst-parser = "^4.0.1"
sphinx-autobuild = "^2024.10.3"
[build-system]
requires = ["maturin>=1.7,<2.0"]
build-backend = "maturin"