17 Commits

Author SHA1 Message Date
dependabot[bot]
91be22ab31 build(deps-dev): bump types-docutils (#781)
Bumps [types-docutils](https://github.com/typeshed-internal/stub_uploader) from 0.22.2.20251006 to 0.22.3.20251115.
- [Commits](https://github.com/typeshed-internal/stub_uploader/commits)

---
updated-dependencies:
- dependency-name: types-docutils
  dependency-version: 0.22.3.20251115
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 20:23:19 +01:00
dependabot[bot]
1652e507d8 build(deps-dev): bump pre-commit from 4.4.0 to 4.5.0 (#782)
Bumps [pre-commit](https://github.com/pre-commit/pre-commit) from 4.4.0 to 4.5.0.
- [Release notes](https://github.com/pre-commit/pre-commit/releases)
- [Changelog](https://github.com/pre-commit/pre-commit/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pre-commit/pre-commit/compare/v4.4.0...v4.5.0)

---
updated-dependencies:
- dependency-name: pre-commit
  dependency-version: 4.5.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 20:22:59 +01:00
dependabot[bot]
d4a8c93665 build(deps): bump rich-toolkit from 0.15.1 to 0.16.0 (#780)
Bumps rich-toolkit from 0.15.1 to 0.16.0.

---
updated-dependencies:
- dependency-name: rich-toolkit
  dependency-version: 0.16.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 20:22:50 +01:00
dependabot[bot]
ab568ef37d build(deps): bump fastapi[standard-no-fastapi-cloud-cli] (#776)
Bumps [fastapi[standard-no-fastapi-cloud-cli]](https://github.com/fastapi/fastapi) from 0.121.2 to 0.121.3.
- [Release notes](https://github.com/fastapi/fastapi/releases)
- [Commits](https://github.com/fastapi/fastapi/compare/0.121.2...0.121.3)

---
updated-dependencies:
- dependency-name: fastapi[standard-no-fastapi-cloud-cli]
  dependency-version: 0.121.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-20 16:30:18 +01:00
dependabot[bot]
d7b19c7169 build(deps-dev): bump coverage from 7.11.3 to 7.12.0 (#774)
Bumps [coverage](https://github.com/nedbat/coveragepy) from 7.11.3 to 7.12.0.
- [Release notes](https://github.com/nedbat/coveragepy/releases)
- [Changelog](https://github.com/nedbat/coveragepy/blob/master/CHANGES.rst)
- [Commits](https://github.com/nedbat/coveragepy/compare/7.11.3...7.12.0)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.12.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-20 16:30:06 +01:00
dependabot[bot]
ec53665f5e build(deps): bump python-fasthtml from 0.12.33 to 0.12.35 (#775)
Bumps [python-fasthtml](https://github.com/AnswerDotAI/fasthtml) from 0.12.33 to 0.12.35.
- [Release notes](https://github.com/AnswerDotAI/fasthtml/releases)
- [Changelog](https://github.com/AnswerDotAI/fasthtml/blob/main/CHANGELOG.md)
- [Commits](https://github.com/AnswerDotAI/fasthtml/commits)

---
updated-dependencies:
- dependency-name: python-fasthtml
  dependency-version: 0.12.35
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-20 16:29:53 +01:00
Bobby Noelte
976a2c8405 chore: automate development version and release generation (#772)
This change introduces a GitHub Action to automate release creation, including
proper tagging and automatic addition of a development marker to the version.

A hash is also appended to development versions to make their state easier to
distinguish.

Tests and release documentation have been updated to reflect the revised
release workflow. Several files now retrieve the current version dynamically.

The test --full-run option has been renamed to --finalize to make
clear that it is for commit finalization testing.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-20 00:10:19 +01:00
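The development-version scheme this commit describes (a dev marker plus an appended hash) can be sketched roughly as follows; the function name and the PEP 440-style layout are illustrative assumptions, not the repo's actual script:

```python
# Hypothetical sketch of a dev-version string with an appended commit hash
# (PEP 440 layout assumed; not the actual scripts/bump_dev_version.py).
def dev_version(base: str, commits_since_tag: int, short_hash: str) -> str:
    if commits_since_tag == 0:
        return base  # exactly on a release tag: plain release version
    # .devN marks a development version; +g<hash> makes its state distinguishable
    return f"{base}.dev{commits_since_tag}+g{short_hash}"

print(dev_version("0.2.0", 0, "91be22a"))  # -> 0.2.0
print(dev_version("0.2.0", 5, "91be22a"))  # -> 0.2.0.dev5+g91be22a
```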
dependabot[bot]
bdbb0b060d build(deps): bump numpy from 2.3.4 to 2.3.5 (#771)
Bumps [numpy](https://github.com/numpy/numpy) from 2.3.4 to 2.3.5.
- [Release notes](https://github.com/numpy/numpy/releases)
- [Changelog](https://github.com/numpy/numpy/blob/main/doc/RELEASE_WALKTHROUGH.rst)
- [Commits](https://github.com/numpy/numpy/compare/v2.3.4...v2.3.5)

---
updated-dependencies:
- dependency-name: numpy
  dependency-version: 2.3.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-18 18:28:03 +01:00
dependabot[bot]
08d7c2ac5b build(deps-dev): bump pytest from 9.0.0 to 9.0.1 (#768)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 9.0.0 to 9.0.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/9.0.0...9.0.1)

---
updated-dependencies:
- dependency-name: pytest
  dependency-version: 9.0.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-16 13:48:44 +01:00
dependabot[bot]
e255718240 build(deps): bump fastapi[standard-no-fastapi-cloud-cli] (#769)
Bumps [fastapi[standard-no-fastapi-cloud-cli]](https://github.com/fastapi/fastapi) from 0.121.1 to 0.121.2.
- [Release notes](https://github.com/fastapi/fastapi/releases)
- [Commits](https://github.com/fastapi/fastapi/compare/0.121.1...0.121.2)

---
updated-dependencies:
- dependency-name: fastapi[standard-no-fastapi-cloud-cli]
  dependency-version: 0.121.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-16 13:48:08 +01:00
Bobby Noelte
4c2997dbd6 feat: add bidding zone to energy charts price prediction (#765)
Energy charts supports bidding zones. Allow the bidding zone to be specified in the configuration.

Extend and simplify the ElecPrice configuration structure and set up config migration to automatically
update the configuration file.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-16 13:26:18 +01:00
Bobby Noelte
edff649a5e chore: improve enhancement template (#766)
Improve the enhancement template so it does not use fenced Python sections.

Add a section for describing the enhancement.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-16 13:25:58 +01:00
Bobby Noelte
bad99fc62d chore: bump python version to 3.13.9 (#767)
2025-11-16 13:25:45 +01:00
Bobby Noelte
7bf9dd723e chore: improve doc generation and test (#762)
Improve documentation generation and add tests for the documentation.
Extend sphinx with the todo directive.

The configuration table is now split into several tables. The test
is adapted accordingly.

There is a new test that checks that the docstrings comply with the
RST format used by sphinx to create the documentation. We cannot
use Markdown in docstrings. The docstrings are adapted accordingly.

An additional test checks that the documentation can be built with sphinx.
This test takes very long and is therefore only enabled in full-run (i.e. CI) mode.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-13 22:53:46 +01:00
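A minimal, stdlib-only sketch of the kind of docstring check described above (hypothetical; the repository's actual test validates full RST compliance via sphinx/docutils). It merely flags docstrings containing Markdown-style code fences, which RST does not support:

```python
import ast

# Hypothetical checker: list functions/classes whose docstrings contain
# Markdown-style ``` fences (invalid in RST docstrings).
def docstrings_with_md_fences(source: str) -> list[str]:
    offenders = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.Module, ast.ClassDef,
                             ast.FunctionDef, ast.AsyncFunctionDef)):
            doc = ast.get_docstring(node)
            if doc and "```" in doc:
                offenders.append(getattr(node, "name", "<module>"))
    return offenders

code = (
    'def good():\n'
    '    """Uses an RST literal block::\n\n        print("ok")\n    """\n'
    'def bad():\n'
    '    """Uses a Markdown fence:\n\n    ```python\n    print("no")\n    ```\n    """\n'
)
print(docstrings_with_md_fences(code))  # -> ['bad']
```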
Bobby Noelte
8da137f8f1 fix: cached_method deprecated and test
cachebox deprecated the cached_method decorator. cached is used instead.

Fix cache integration tests that were accessing real-world addresses.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-13 19:35:08 +01:00
dependabot[bot]
cab3a3dd21 build(deps-dev): bump pre-commit from 4.3.0 to 4.4.0 (#758)
Bumps [pre-commit](https://github.com/pre-commit/pre-commit) from 4.3.0 to 4.4.0.
- [Release notes](https://github.com/pre-commit/pre-commit/releases)
- [Changelog](https://github.com/pre-commit/pre-commit/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pre-commit/pre-commit/compare/v4.3.0...v4.4.0)

---
updated-dependencies:
- dependency-name: pre-commit
  dependency-version: 4.4.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-11 01:07:46 +01:00
dependabot[bot]
74a9271e88 build(deps-dev): bump commitizen from 4.9.1 to 4.10.0 (#760)
Bumps [commitizen](https://github.com/commitizen-tools/commitizen) from 4.9.1 to 4.10.0.
- [Release notes](https://github.com/commitizen-tools/commitizen/releases)
- [Changelog](https://github.com/commitizen-tools/commitizen/blob/master/CHANGELOG.md)
- [Commits](https://github.com/commitizen-tools/commitizen/compare/v4.9.1...v4.10.0)

---
updated-dependencies:
- dependency-name: commitizen
  dependency-version: 4.10.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-11 01:07:33 +01:00
69 changed files with 4158 additions and 2635 deletions


@@ -8,18 +8,20 @@ body:
- type: markdown
attributes:
value: >
Please post your idea first as a [Discussion](https://github.com/Akkudoktor-EOS/EOS/discussions)
to validate it and bring attention to it. After validation,
you can open this issue for a more technical developer discussion.
Check the [Contributor Guide](https://github.com/Akkudoktor-EOS/EOS/blob/main/CONTRIBUTING.md)
if you need more information.
- type: textarea
attributes:
label: "Describe the enhancement or feature request:"
validations:
required: true
- type: textarea
attributes:
label: "Link to discussion and related issues"
description: >
<link here>
render: python
validations:
required: false
@@ -28,6 +30,5 @@ body:
label: "Proposed implementation"
description: >
How it could be implemented with a high level API.
render: python
validations:
required: false

.github/workflows/bump-version.yml vendored Normal file

@@ -0,0 +1,99 @@
name: Bump Version
# Trigger the workflow on any push to main
on:
push:
branches:
- main
jobs:
bump-version:
runs-on: ubuntu-latest
name: Bump Version Workflow
steps:
# --- Step 1: Checkout the repository ---
- name: Checkout repo
uses: actions/checkout@v4
with:
fetch-depth: 0 # Needed to create tags and see full history
persist-credentials: true # Needed for pushing commits and tags
# --- Step 2: Set up Python ---
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.11"
# --- Step 3: Calculate version dynamically ---
- name: Calculate version
id: calc
run: |
# Call custom version calculation script
VERSION=$(python scripts/get_version.py)
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "Computed version: $VERSION"
# --- Step 4: Skip workflow for development versions ---
- name: Skip if version contains 'dev'
run: |
# Exit workflow early if the version contains 'dev'
if [[ "${{ steps.calc.outputs.version }}" == *dev* ]]; then
echo "Version contains 'dev', skipping bump version workflow."
exit 0
fi
# --- Step 5: Update files and commit if necessary ---
- name: Update files and commit
run: |
# Define files to update
UPDATE_FILES="haaddon/config.yaml"
# Call general Python version replacement script
python scripts/update_version.py "${{ steps.calc.outputs.version }}" $UPDATE_FILES
# Commit changes if any
git config user.name "github-actions"
git config user.email "actions@github.com"
git add $UPDATE_FILES
if git diff --cached --quiet; then
echo "No files changed. Skipping commit."
else
git commit -m "chore: bump version to ${{ steps.calc.outputs.version }}"
git push
fi
# --- Step 6: Create release tag ---
- name: Create release tag if it does not exist
id: tagging
run: |
TAG="v${{ steps.calc.outputs.version }}"
if git rev-parse --verify "$TAG" >/dev/null 2>&1; then
echo "Tag $TAG already exists. Skipping tag creation."
echo "created=false" >> $GITHUB_OUTPUT
else
git tag -a "v${{ steps.calc.outputs.version }}" -m "Release ${{ steps.calc.outputs.version }}"
git push origin "v${{ steps.calc.outputs.version }}"
echo "created=true" >> $GITHUB_OUTPUT
fi
# --- Step 7: Bump to development version ---
- name: Bump dev version
id: bump_dev
run: |
VERSION_BASE=$(python scripts/bump_dev_version.py | tail -n1)
if [ -z "$VERSION_BASE" ]; then
echo "Error: bump_dev_version.py returned an empty version."
exit 1
fi
echo "version_base=$VERSION_BASE" >> $GITHUB_OUTPUT
git config user.name "github-actions"
git config user.email "actions@github.com"
git add src/akkudoktoreos/core/version.py
if git diff --cached --quiet; then
echo "version.py not changed. Skipping commit."
else
git commit -m "chore: bump dev version to ${VERSION_BASE}"
git push
fi


@@ -16,7 +16,7 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.12"
python-version: "3.13.9"
- name: Install dependencies
run: |
@@ -26,7 +26,7 @@ jobs:
- name: Run Pytest
run: |
pip install -e .
python -m pytest --full-run --check-config-side-effect -vs --cov src --cov-report term-missing
python -m pytest --finalize --check-config-side-effect -vs --cov src --cov-report term-missing
- name: Upload test artifacts
uses: actions/upload-artifact@v4


@@ -37,7 +37,9 @@ repos:
additional_dependencies:
- types-requests==2.32.4.20250913
- pandas-stubs==2.3.2.250926
- tokenize-rt==3.2.0
- tokenize-rt==6.2.0
- types-docutils==0.22.2.20251006
- types-PyYaml==6.0.12.20250915
pass_filenames: false
# --- Markdown linter ---
@@ -46,7 +48,6 @@ repos:
hooks:
- id: pymarkdown
files: ^docs/
exclude: ^docs/_generated
args:
- --config=docs/pymarkdown.json
- scan


@@ -1,5 +1,8 @@
# syntax=docker/dockerfile:1.7
ARG PYTHON_VERSION=3.12.7
# Dockerfile
# Set base image first
ARG PYTHON_VERSION=3.13.9
FROM python:${PYTHON_VERSION}-slim
LABEL source="https://github.com/Akkudoktor-EOS/EOS"
@@ -32,28 +35,25 @@ RUN adduser --system --group --no-create-home eos \
&& mkdir -p "${EOS_CONFIG_DIR}" \
&& chown eos "${EOS_CONFIG_DIR}"
# Install requirements
COPY requirements.txt .
RUN --mount=type=cache,target=/root/.cache/pip \
pip install --no-cache-dir -r requirements.txt
# Copy source
COPY src/ ./src
COPY pyproject.toml .
RUN mkdir -p src && pip install --no-cache-dir -e .
COPY src src
# Create version information
COPY scripts/get_version.py ./scripts/get_version.py
RUN python scripts/get_version.py > ./version.txt
RUN rm ./scripts/get_version.py
# Create minimal default configuration for Docker to fix EOSDash accessibility (#629)
# This ensures EOSDash binds to 0.0.0.0 instead of 127.0.0.1 in containers
RUN echo '{\n\
"server": {\n\
"host": "0.0.0.0",\n\
"port": 8503,\n\
"startup_eosdash": true,\n\
"eosdash_host": "0.0.0.0",\n\
"eosdash_port": 8504\n\
}\n\
}' > "${EOS_CONFIG_DIR}/EOS.config.json" \
&& chown eos:eos "${EOS_CONFIG_DIR}/EOS.config.json"
RUN echo "Building Akkudoktor-EOS with Python $PYTHON_VERSION"
# Install akkudoktoreos package in editable form (-e)
# pyproject-toml will read the version from version.txt
RUN pip install --no-cache-dir -e .
USER eos
ENTRYPOINT []
@@ -61,6 +61,7 @@ ENTRYPOINT []
EXPOSE 8503
EXPOSE 8504
CMD ["python", "src/akkudoktoreos/server/eos.py", "--host", "0.0.0.0"]
# Ensure EOS and EOSdash bind to 0.0.0.0
CMD ["python", "-m", "akkudoktoreos.server.eos", "--host", "0.0.0.0"]
VOLUME ["${MPLCONFIGDIR}", "${EOS_CACHE_DIR}", "${EOS_OUTPUT_DIR}", "${EOS_CONFIG_DIR}"]


@@ -1,5 +1,8 @@
# Define the targets
.PHONY: help venv pip install dist test test-full test-system test-ci test-profile docker-run docker-build docs read-docs clean format gitlint mypy run run-dev run-dash run-dash-dev bumps
.PHONY: help venv pip install dist test test-full test-system test-ci test-profile docker-run docker-build docs read-docs clean format gitlint mypy run run-dev run-dash run-dash-dev prepare-version test-version
# - Take VERSION from version.py
VERSION := $(shell python3 scripts/get_version.py)
# Default target
all: help
@@ -25,18 +28,19 @@ help:
@echo " run-dash - Run EOSdash production server in virtual environment."
@echo " run-dash-dev - Run EOSdash development server in virtual environment (automatically reloads)."
@echo " test - Run tests."
@echo " test-full - Run tests with full optimization."
@echo " test-full - Run all tests (e.g. to finalize a commit)."
@echo " test-system - Run tests with system tests enabled."
@echo " test-ci - Run tests as CI does. No user config file allowed."
@echo " test-profile - Run single test optimization with profiling."
@echo " dist - Create distribution (in dist/)."
@echo " clean - Remove generated documentation, distribution and virtual environment."
@echo " bump - Bump version to next release version."
@echo " prepare-version - Prepare a version defined in setup.py."
# Target to set up a Python 3 virtual environment
venv:
python3 -m venv .venv
@echo "Virtual environment created in '.venv'. Activate it using 'source .venv/bin/activate'."
@PYVER=$$(./.venv/bin/python --version) && \
echo "Virtual environment created in '.venv' with $$PYVER. Activate it using 'source .venv/bin/activate'."
# Target to install dependencies from requirements.txt
pip: venv
@@ -49,8 +53,12 @@ pip-dev: pip
.venv/bin/pip install -r requirements-dev.txt
@echo "Dependencies installed from requirements-dev.txt."
# Target to create a version.txt
version-txt:
echo "$(VERSION)" > version.txt
# Target to install EOS in editable form (development mode) into virtual environment.
install: pip-dev
install: pip-dev version-txt
.venv/bin/pip install build
.venv/bin/pip install -e .
@echo "EOS installed in editable form (development mode)."
@@ -62,7 +70,7 @@ dist: pip
@echo "Distribution created (see dist/)."
# Target to generate documentation
gen-docs: pip-dev
gen-docs: pip-dev version-txt
.venv/bin/pip install -e .
.venv/bin/python ./scripts/generate_config_md.py --output-file docs/_generated/config.md
.venv/bin/python ./scripts/generate_openapi_md.py --output-file docs/_generated/openapi.md
@@ -71,12 +79,13 @@ gen-docs: pip-dev
# Target to build HTML documentation
docs: pip-dev
.venv/bin/sphinx-build -M html docs build/docs
.venv/bin/pytest --full-run tests/test_docsphinx.py
@echo "Documentation build to build/docs/html/."
# Target to read the HTML documentation
read-docs: docs
read-docs:
@echo "Read the documentation in your browser"
.venv/bin/pytest --full-run tests/test_docsphinx.py
.venv/bin/python -m webbrowser build/docs/html/index.html
# Clean Python bytecode
@@ -125,7 +134,7 @@ test:
# Target to run tests as done by CI on Github.
test-ci:
@echo "Running tests as CI..."
.venv/bin/pytest --full-run --check-config-side-effect -vs --cov src --cov-report term-missing
.venv/bin/pytest --finalize --check-config-side-effect -vs --cov src --cov-report term-missing
# Target to run tests including the system tests.
test-system:
@@ -135,7 +144,7 @@ test-system:
# Target to run all tests.
test-full:
@echo "Running all tests..."
.venv/bin/pytest --full-run
.venv/bin/pytest --finalize
# Target to run tests including the single test optimization with profiling.
test-profile:
@@ -156,21 +165,26 @@ mypy:
# Run entire setup on docker
docker-run:
@docker pull python:3.13.9-slim
@docker compose up --remove-orphans
docker-build:
@docker compose build --pull
@docker pull python:3.13.9-slim
@docker compose build
# Bump Akkudoktoreos version
VERSION ?= 0.2.0+dev
NEW_VERSION ?= $(VERSION)+dev
# Propagate version info to all version files
# Take UPDATE_FILES from GitHub action bump-version.yml
UPDATE_FILES := $(shell sed -n 's/^[[:space:]]*UPDATE_FILES[[:space:]]*=[[:space:]]*"\([^"]*\)".*/\1/p' \
.github/workflows/bump-version.yml)
prepare-version: #pip-dev
@echo "Update version to $(VERSION) from version.py in files $(UPDATE_FILES) and doc"
.venv/bin/python ./scripts/update_version.py $(VERSION) $(UPDATE_FILES)
.venv/bin/python ./scripts/convert_lightweight_tags.py
.venv/bin/python ./scripts/generate_config_md.py --output-file docs/_generated/config.md
.venv/bin/python ./scripts/generate_openapi_md.py --output-file docs/_generated/openapi.md
.venv/bin/python ./scripts/generate_openapi.py --output-file openapi.json
.venv/bin/pytest -vv --finalize tests/test_version.py
bump: pip-dev
@echo "Bumping akkudoktoreos version from $(VERSION) to $(NEW_VERSION) (dry-run: $(EXTRA_ARGS))"
.venv/bin/python scripts/convert_lightweight_tags.py
.venv/bin/python scripts/bump_version.py $(VERSION) $(NEW_VERSION) $(EXTRA_ARGS)
bump-dry: pip-dev
@echo "Bumping akkudoktoreos version from $(VERSION) to $(NEW_VERSION) (dry-run: --dry-run)"
.venv/bin/python scripts/convert_lightweight_tags.py
.venv/bin/python scripts/bump_version.py $(VERSION) $(NEW_VERSION) --dry-run
test-version:
echo "Test that version information is correctly set in all version files"
.venv/bin/pytest -vv tests/test_version.py
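The sed one-liner above, which pulls UPDATE_FILES out of bump-version.yml, can be mimicked in Python as a rough sketch (same quoted-assignment format assumed; the file path comes from the Makefile):

```python
import re

# Sketch: extract the quoted UPDATE_FILES assignment from a workflow file,
# mirroring the Makefile's sed expression above.
def extract_update_files(workflow_text: str) -> list[str]:
    m = re.search(r'^\s*UPDATE_FILES\s*=\s*"([^"]*)"', workflow_text, re.MULTILINE)
    return m.group(1).split() if m else []

sample = '          UPDATE_FILES="haaddon/config.yaml"\n'
print(extract_update_files(sample))  # -> ['haaddon/config.yaml']
```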


@@ -1,6 +1,7 @@
---
networks:
default:
external: true
name: "eos"
services:
eos:
@@ -38,18 +39,6 @@ services:
- "${EOS_SERVER__EOSDASH_PORT}:8504"
# Volume mount configuration (optional)
# IMPORTANT: When mounting local directories, the default config won't be available.
# You must create an EOS.config.json file in your local config directory with:
# {
# "server": {
# "host": "0.0.0.0", # Required for Docker container accessibility
# "port": 8503,
# "startup_eosdash": true,
# "eosdash_host": "0.0.0.0", # Required for Docker container accessibility
# "eosdash_port": 8504
# }
# }
#
# Example volume mounts (uncomment to use):
# volumes:
# - ./config:/opt/eos/config # Mount local config directory

File diff suppressed because it is too large.


@@ -0,0 +1,28 @@
## Cache Configuration
<!-- pyml disable line-length -->
:::{table} cache
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| cleanup_interval | `EOS_CACHE__CLEANUP_INTERVAL` | `float` | `rw` | `300` | Interval in seconds for EOS file cache cleanup. |
| subpath | `EOS_CACHE__SUBPATH` | `Optional[pathlib.Path]` | `rw` | `cache` | Sub-path for the EOS cache data directory. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"cache": {
"subpath": "cache",
"cleanup_interval": 300.0
}
}
```
<!-- pyml enable line-length -->


@@ -0,0 +1,405 @@
## Base configuration for devices simulation settings
<!-- pyml disable line-length -->
:::{table} devices
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| batteries | `EOS_DEVICES__BATTERIES` | `Optional[list[akkudoktoreos.devices.devices.BatteriesCommonSettings]]` | `rw` | `None` | List of battery devices |
| electric_vehicles | `EOS_DEVICES__ELECTRIC_VEHICLES` | `Optional[list[akkudoktoreos.devices.devices.BatteriesCommonSettings]]` | `rw` | `None` | List of electric vehicle devices |
| home_appliances | `EOS_DEVICES__HOME_APPLIANCES` | `Optional[list[akkudoktoreos.devices.devices.HomeApplianceCommonSettings]]` | `rw` | `None` | List of home appliances |
| inverters | `EOS_DEVICES__INVERTERS` | `Optional[list[akkudoktoreos.devices.devices.InverterCommonSettings]]` | `rw` | `None` | List of inverters |
| max_batteries | `EOS_DEVICES__MAX_BATTERIES` | `Optional[int]` | `rw` | `None` | Maximum number of batteries that can be set |
| max_electric_vehicles | `EOS_DEVICES__MAX_ELECTRIC_VEHICLES` | `Optional[int]` | `rw` | `None` | Maximum number of electric vehicles that can be set |
| max_home_appliances | `EOS_DEVICES__MAX_HOME_APPLIANCES` | `Optional[int]` | `rw` | `None` | Maximum number of home_appliances that can be set |
| max_inverters | `EOS_DEVICES__MAX_INVERTERS` | `Optional[int]` | `rw` | `None` | Maximum number of inverters that can be set |
| measurement_keys | | `Optional[list[str]]` | `ro` | `N/A` | None |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_batteries": 1,
"electric_vehicles": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_electric_vehicles": 1,
"inverters": [],
"max_inverters": 1,
"home_appliances": [],
"max_home_appliances": 1
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_batteries": 1,
"electric_vehicles": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_electric_vehicles": 1,
"inverters": [],
"max_inverters": 1,
"home_appliances": [],
"max_home_appliances": 1,
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w",
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
}
```
<!-- pyml enable line-length -->
### Inverter devices base settings
<!-- pyml disable line-length -->
:::{table} devices::inverters::list
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| battery_id | `Optional[str]` | `rw` | `None` | ID of battery controlled by this inverter. |
| device_id | `str` | `rw` | `<unknown>` | ID of device |
| max_power_w | `Optional[float]` | `rw` | `None` | Maximum power [W]. |
| measurement_keys | `Optional[list[str]]` | `ro` | `N/A` | None |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"inverters": [
{
"device_id": "battery1",
"max_power_w": 10000.0,
"battery_id": null
}
]
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"inverters": [
{
"device_id": "battery1",
"max_power_w": 10000.0,
"battery_id": null,
"measurement_keys": []
}
]
}
}
```
<!-- pyml enable line-length -->
### Home Appliance devices base settings
<!-- pyml disable line-length -->
:::{table} devices::home_appliances::list
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| consumption_wh | `int` | `rw` | `required` | Energy consumption [Wh]. |
| device_id | `str` | `rw` | `<unknown>` | ID of the device. |
| duration_h | `int` | `rw` | `required` | Usage duration in hours [0 ... 24]. |
| measurement_keys | `Optional[list[str]]` | `ro` | `N/A` | None |
| time_windows | `Optional[akkudoktoreos.utils.datetimeutil.TimeWindowSequence]` | `rw` | `None` | Sequence of allowed time windows. Defaults to optimization general time window. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"home_appliances": [
{
"device_id": "battery1",
"consumption_wh": 2000,
"duration_h": 1,
"time_windows": {
"windows": [
{
"start_time": "10:00:00.000000 Europe/Berlin",
"duration": "2 hours",
"day_of_week": null,
"date": null,
"locale": null
}
]
}
}
]
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"home_appliances": [
{
"device_id": "battery1",
"consumption_wh": 2000,
"duration_h": 1,
"time_windows": {
"windows": [
{
"start_time": "10:00:00.000000 Europe/Berlin",
"duration": "2 hours",
"day_of_week": null,
"date": null,
"locale": null
}
]
},
"measurement_keys": []
}
]
}
}
```
<!-- pyml enable line-length -->
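In the example above, the appliance run (`duration_h` of 1) must fit inside the allowed time window (10:00 Europe/Berlin for 2 hours). A simplified sketch of that containment check, using whole hours only and ignoring day-of-week, date, and timezone handling (the helper name is a hypothetical illustration):

```python
def allowed_start_hours(window_start_h: int, window_duration_h: int,
                        run_duration_h: int) -> list[int]:
    """Whole-hour start times at which a run of run_duration_h hours fits
    entirely inside the window [window_start_h, window_start_h + window_duration_h).
    """
    window_end_h = window_start_h + window_duration_h
    return list(range(window_start_h, window_end_h - run_duration_h + 1))
```

For the example window (10:00, 2 hours) and `duration_h` of 1, the feasible start hours are 10:00 and 11:00.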
### Battery devices base settings
<!-- pyml disable line-length -->
:::{table} devices::batteries::list
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| capacity_wh | `int` | `rw` | `8000` | Capacity [Wh]. |
| charge_rates | `Optional[numpydantic.vendor.npbase_meta_classes.NDArray]` | `rw` | `[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]` | Charge rates as factor of maximum charging power [0.00 ... 1.00]. None triggers fallback to default charge-rates. |
| charging_efficiency | `float` | `rw` | `0.88` | Charging efficiency [0.01 ... 1.00]. |
| device_id | `str` | `rw` | `<unknown>` | ID of the device. |
| discharging_efficiency | `float` | `rw` | `0.88` | Discharge efficiency [0.01 ... 1.00]. |
| levelized_cost_of_storage_kwh | `float` | `rw` | `0.0` | Levelized cost of storage (LCOS), the average lifetime cost of delivering one kWh [€/kWh]. |
| max_charge_power_w | `Optional[float]` | `rw` | `5000` | Maximum charging power [W]. |
| max_soc_percentage | `int` | `rw` | `100` | Maximum state of charge (SOC) as percentage of capacity [%]. |
| measurement_key_power_3_phase_sym_w | `str` | `ro` | `N/A` | None |
| measurement_key_power_l1_w | `str` | `ro` | `N/A` | None |
| measurement_key_power_l2_w | `str` | `ro` | `N/A` | None |
| measurement_key_power_l3_w | `str` | `ro` | `N/A` | None |
| measurement_key_soc_factor | `str` | `ro` | `N/A` | None |
| measurement_keys | `Optional[list[str]]` | `ro` | `N/A` | None |
| min_charge_power_w | `Optional[float]` | `rw` | `50` | Minimum charging power [W]. |
| min_soc_percentage | `int` | `rw` | `0` | Minimum state of charge (SOC) as percentage of capacity [%]. This is the target SoC for charging. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.12,
"max_charge_power_w": 5000.0,
"min_charge_power_w": 50.0,
"charge_rates": "[0. 0.25 0.5 0.75 1. ]",
"min_soc_percentage": 10,
"max_soc_percentage": 100
}
]
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.12,
"max_charge_power_w": 5000.0,
"min_charge_power_w": 50.0,
"charge_rates": "[0. 0.25 0.5 0.75 1. ]",
"min_soc_percentage": 10,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
]
}
}
```
<!-- pyml enable line-length -->
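The `charge_rates` are described above as factors of the maximum charging power, and the charging and discharging efficiencies multiply into a round-trip efficiency. A sketch of how the fields combine (the filtering against `min_charge_power_w` is a plausible reading of the field descriptions, not EOS's confirmed selection logic):

```python
def feasible_charge_powers_w(charge_rates: list[float],
                             max_charge_power_w: float,
                             min_charge_power_w: float) -> list[float]:
    """Translate charge-rate factors into absolute charge powers [W],
    dropping non-zero rates that fall below the minimum charging power.
    """
    powers = [rate * max_charge_power_w for rate in charge_rates]
    return [p for p in powers if p == 0.0 or p >= min_charge_power_w]

# Round-trip efficiency for the example battery (0.88 in, 0.88 out):
round_trip = 0.88 * 0.88  # roughly 0.774
```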

## Electricity Price Prediction Configuration
<!-- pyml disable line-length -->
:::{table} elecprice
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| charges_kwh | `EOS_ELECPRICE__CHARGES_KWH` | `Optional[float]` | `rw` | `None` | Electricity price charges [€/kWh]. Will be added to variable market price. |
| elecpriceimport | `EOS_ELECPRICE__ELECPRICEIMPORT` | `ElecPriceImportCommonSettings` | `rw` | `required` | Import provider settings. |
| energycharts | `EOS_ELECPRICE__ENERGYCHARTS` | `ElecPriceEnergyChartsCommonSettings` | `rw` | `required` | Energy Charts provider settings. |
| provider | `EOS_ELECPRICE__PROVIDER` | `Optional[str]` | `rw` | `None` | ID of the electricity price provider to be used. |
| vat_rate | `EOS_ELECPRICE__VAT_RATE` | `Optional[float]` | `rw` | `1.19` | VAT rate factor applied to electricity price when charges are used. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"elecprice": {
"provider": "ElecPriceAkkudoktor",
"charges_kwh": 0.21,
"vat_rate": 1.19,
"elecpriceimport": {
"import_file_path": null,
"import_json": null
},
"energycharts": {
"bidding_zone": "DE-LU"
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for Energy Charts electricity price provider
<!-- pyml disable line-length -->
:::{table} elecprice::energycharts
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| bidding_zone | `<enum 'EnergyChartsBiddingZones'>` | `rw` | `EnergyChartsBiddingZones.DE_LU` | Bidding Zone: 'AT', 'BE', 'CH', 'CZ', 'DE-LU', 'DE-AT-LU', 'DK1', 'DK2', 'FR', 'HU', 'IT-NORTH', 'NL', 'NO2', 'PL', 'SE4' or 'SI' |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"elecprice": {
"energycharts": {
"bidding_zone": "AT"
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for elecprice data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} elecprice::elecpriceimport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import elecprice data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of electricity price forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"elecprice": {
"elecpriceimport": {
"import_file_path": null,
"import_json": "{\"elecprice_marketprice_wh\": [0.0003384, 0.0003318, 0.0003284]}"
}
}
}
```
<!-- pyml enable line-length -->
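Note the unit mismatch between the import example and the tables: `elecprice_marketprice_wh` holds per-Wh values, while `charges_kwh` and the rest of the documentation use €/kWh. The conversion is a factor of 1000:

```python
def eur_per_wh_to_eur_per_kwh(price_wh: float) -> float:
    """Convert a per-Wh price (as used in elecprice_marketprice_wh) to €/kWh."""
    return price_wh * 1000.0

# First value from the import example above: 0.0003384 €/Wh, i.e. about 0.3384 €/kWh.
market_kwh = eur_per_wh_to_eur_per_kwh(0.0003384)
```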

## Energy Management Configuration
<!-- pyml disable line-length -->
:::{table} ems
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| interval | `EOS_EMS__INTERVAL` | `Optional[float]` | `rw` | `None` | Interval in seconds between EOS energy management runs. |
| mode | `EOS_EMS__MODE` | `Optional[akkudoktoreos.core.emsettings.EnergyManagementMode]` | `rw` | `None` | Energy management mode [OPTIMIZATION \| PREDICTION]. |
| startup_delay | `EOS_EMS__STARTUP_DELAY` | `float` | `rw` | `5` | Startup delay in seconds for EOS energy management runs. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"ems": {
"startup_delay": 5.0,
"interval": 300.0,
"mode": "OPTIMIZATION"
}
}
```
<!-- pyml enable line-length -->
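The environment variable names in these tables (`EOS_EMS__INTERVAL`, `EOS_ELECPRICE__CHARGES_KWH`, …) follow an `EOS_` prefix with `__` as the nesting delimiter, mapping onto the nested JSON shown in the examples. A sketch of that mapping (EOS itself presumably delegates this to its settings library; this is only an illustration of the convention):

```python
def env_to_nested(env: dict[str, str], prefix: str = "EOS_") -> dict:
    """Map variables such as EOS_EMS__INTERVAL onto nested config keys
    (ems -> interval), lower-casing the path segments."""
    nested: dict = {}
    for name, value in env.items():
        if not name.startswith(prefix):
            continue
        path = name[len(prefix):].lower().split("__")
        node = nested
        for key in path[:-1]:
            node = node.setdefault(key, {})
        node[path[-1]] = value
    return nested
```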

## Full example Config
<!-- pyml disable line-length -->
```json
{
"cache": {
"subpath": "cache",
"cleanup_interval": 300.0
},
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_batteries": 1,
"electric_vehicles": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_electric_vehicles": 1,
"inverters": [],
"max_inverters": 1,
"home_appliances": [],
"max_home_appliances": 1
},
"elecprice": {
"provider": "ElecPriceAkkudoktor",
"charges_kwh": 0.21,
"vat_rate": 1.19,
"elecpriceimport": {
"import_file_path": null,
"import_json": null
},
"energycharts": {
"bidding_zone": "DE-LU"
}
},
"ems": {
"startup_delay": 5.0,
"interval": 300.0,
"mode": "OPTIMIZATION"
},
"feedintariff": {
"provider": "FeedInTariffFixed",
"provider_settings": {
"FeedInTariffFixed": null,
"FeedInTariffImport": null
}
},
"general": {
"version": "0.2.0+dev.4dbc2d",
"data_folder_path": null,
"data_output_subpath": "output",
"latitude": 52.52,
"longitude": 13.405
},
"load": {
"provider": "LoadAkkudoktor",
"provider_settings": {
"LoadAkkudoktor": null,
"LoadVrm": null,
"LoadImport": null
}
},
"logging": {
"console_level": "TRACE",
"file_level": "TRACE"
},
"measurement": {
"load_emr_keys": [
"load0_emr"
],
"grid_export_emr_keys": [
"grid_export_emr"
],
"grid_import_emr_keys": [
"grid_import_emr"
],
"pv_production_emr_keys": [
"pv1_emr"
]
},
"optimization": {
"horizon_hours": 24,
"interval": 3600,
"algorithm": "GENETIC",
"genetic": {
"individuals": 400,
"generations": 400,
"seed": null,
"penalties": {
"ev_soc_miss": 10
}
}
},
"prediction": {
"hours": 48,
"historic_hours": 48
},
"pvforecast": {
"provider": "PVForecastAkkudoktor",
"provider_settings": {
"PVForecastImport": null,
"PVForecastVrm": null
},
"planes": [
{
"surface_tilt": 10.0,
"surface_azimuth": 180.0,
"userhorizon": [
10.0,
20.0,
30.0
],
"peakpower": 5.0,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 0,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 6000,
"modules_per_string": 20,
"strings_per_inverter": 2
},
{
"surface_tilt": 20.0,
"surface_azimuth": 90.0,
"userhorizon": [
5.0,
15.0,
25.0
],
"peakpower": 3.5,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 1,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 4000,
"modules_per_string": 20,
"strings_per_inverter": 2
}
],
"max_planes": 1
},
"server": {
"host": "127.0.0.1",
"port": 8503,
"verbose": false,
"startup_eosdash": true,
"eosdash_host": "127.0.0.1",
"eosdash_port": 8504
},
"utils": {},
"weather": {
"provider": "WeatherImport",
"provider_settings": {
"WeatherImport": null
}
}
}
```
<!-- pyml enable line-length -->

## Feed In Tariff Prediction Configuration
<!-- pyml disable line-length -->
:::{table} feedintariff
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| provider | `EOS_FEEDINTARIFF__PROVIDER` | `Optional[str]` | `rw` | `None` | ID of the feed in tariff provider to be used. |
| provider_settings | `EOS_FEEDINTARIFF__PROVIDER_SETTINGS` | `FeedInTariffCommonProviderSettings` | `rw` | `required` | Provider settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"feedintariff": {
"provider": "FeedInTariffFixed",
"provider_settings": {
"FeedInTariffFixed": null,
"FeedInTariffImport": null
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for feed in tariff data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} feedintariff::provider_settings::FeedInTariffImport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import feed in tariff data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of feed in tariff forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"feedintariff": {
"provider_settings": {
"FeedInTariffImport": {
"import_file_path": null,
"import_json": "{\"fead_in_tariff_wh\": [0.000078, 0.000078, 0.000023]}"
}
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for fixed feed in tariff
<!-- pyml disable line-length -->
:::{table} feedintariff::provider_settings::FeedInTariffFixed
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| feed_in_tariff_kwh | `Optional[float]` | `rw` | `None` | Electricity feed in tariff [€/kWh]. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"feedintariff": {
"provider_settings": {
"FeedInTariffFixed": {
"feed_in_tariff_kwh": 0.078
}
}
}
}
```
<!-- pyml enable line-length -->
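With a fixed tariff, compensation for exported energy is simply the exported amount times the tariff. A trivial sketch using the 0.078 €/kWh from the example (the function name is illustrative, not an EOS API):

```python
def feed_in_revenue_eur(exported_kwh: float, feed_in_tariff_kwh: float) -> float:
    """Compensation [€] for exported energy at a fixed feed-in tariff."""
    return exported_kwh * feed_in_tariff_kwh

# 3 kWh exported at 0.078 €/kWh yields roughly 0.234 €.
revenue = feed_in_revenue_eur(3.0, 0.078)
```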
### Feed In Tariff Prediction Provider Configuration
<!-- pyml disable line-length -->
:::{table} feedintariff::provider_settings
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| FeedInTariffFixed | `Optional[akkudoktoreos.prediction.feedintarifffixed.FeedInTariffFixedCommonSettings]` | `rw` | `None` | FeedInTariffFixed settings |
| FeedInTariffImport | `Optional[akkudoktoreos.prediction.feedintariffimport.FeedInTariffImportCommonSettings]` | `rw` | `None` | FeedInTariffImport settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"feedintariff": {
"provider_settings": {
"FeedInTariffFixed": null,
"FeedInTariffImport": null
}
}
}
```
<!-- pyml enable line-length -->

## Settings for common configuration
General configuration to set the directories of cache and output files and the system location
(latitude and longitude).

Validators ensure each parameter is within a specified range. A computed property, `timezone`,
determines the time zone based on latitude and longitude.

Attributes:

- `latitude` (Optional[float]): Latitude in degrees, must be between -90 and 90.
- `longitude` (Optional[float]): Longitude in degrees, must be between -180 and 180.

Properties:

- `timezone` (Optional[str]): Computed time zone string based on the specified latitude and longitude.
<!-- pyml disable line-length -->
:::{table} general
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| config_file_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | None |
| config_folder_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | None |
| data_folder_path | `EOS_GENERAL__DATA_FOLDER_PATH` | `Optional[pathlib.Path]` | `rw` | `None` | Path to EOS data directory. |
| data_output_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | None |
| data_output_subpath | `EOS_GENERAL__DATA_OUTPUT_SUBPATH` | `Optional[pathlib.Path]` | `rw` | `output` | Sub-path for the EOS output data directory. |
| latitude | `EOS_GENERAL__LATITUDE` | `Optional[float]` | `rw` | `52.52` | Latitude in decimal degrees, between -90 and 90, north is positive (ISO 19115) (°) |
| longitude | `EOS_GENERAL__LONGITUDE` | `Optional[float]` | `rw` | `13.405` | Longitude in decimal degrees, within -180 to 180 (°) |
| timezone | | `Optional[str]` | `ro` | `N/A` | None |
| version | `EOS_GENERAL__VERSION` | `str` | `rw` | `0.2.0+dev.4dbc2d` | Configuration file version. Used to check compatibility. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"general": {
"version": "0.2.0+dev.4dbc2d",
"data_folder_path": null,
"data_output_subpath": "output",
"latitude": 52.52,
"longitude": 13.405
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"general": {
"version": "0.2.0+dev.4dbc2d",
"data_folder_path": null,
"data_output_subpath": "output",
"latitude": 52.52,
"longitude": 13.405,
"timezone": "Europe/Berlin",
"data_output_path": null,
"config_folder_path": "/home/user/.config/net.akkudoktoreos.net",
"config_file_path": "/home/user/.config/net.akkudoktoreos.net/EOS.config.json"
}
}
```
<!-- pyml enable line-length -->

## Load Prediction Configuration
<!-- pyml disable line-length -->
:::{table} load
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| provider | `EOS_LOAD__PROVIDER` | `Optional[str]` | `rw` | `None` | ID of the load provider to be used. |
| provider_settings | `EOS_LOAD__PROVIDER_SETTINGS` | `LoadCommonProviderSettings` | `rw` | `required` | Provider settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider": "LoadAkkudoktor",
"provider_settings": {
"LoadAkkudoktor": null,
"LoadVrm": null,
"LoadImport": null
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for load data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} load::provider_settings::LoadImport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import load data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of load forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider_settings": {
"LoadImport": {
"import_file_path": null,
"import_json": "{\"load0_mean\": [676.71, 876.19, 527.13]}"
}
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for VRM API
<!-- pyml disable line-length -->
:::{table} load::provider_settings::LoadVrm
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| load_vrm_idsite | `int` | `rw` | `12345` | VRM-Installation-ID |
| load_vrm_token | `str` | `rw` | `your-token` | Token for connecting to the VRM API. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider_settings": {
"LoadVrm": {
"load_vrm_token": "your-token",
"load_vrm_idsite": 12345
}
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for the LoadAkkudoktor load prediction provider
<!-- pyml disable line-length -->
:::{table} load::provider_settings::LoadAkkudoktor
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| loadakkudoktor_year_energy_kwh | `Optional[float]` | `rw` | `None` | Yearly energy consumption [kWh]. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider_settings": {
"LoadAkkudoktor": {
"loadakkudoktor_year_energy_kwh": 40421.0
}
}
}
}
```
<!-- pyml enable line-length -->
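A quick plausibility check for `loadakkudoktor_year_energy_kwh`: the yearly energy implies an average power draw. The provider presumably distributes this energy over a standard load profile rather than uniformly; the uniform average is only a sanity check:

```python
def mean_load_w(yearly_energy_kwh: float, hours_per_year: float = 8760.0) -> float:
    """Average power [W] implied by a yearly energy consumption [kWh]."""
    return yearly_energy_kwh * 1000.0 / hours_per_year

# For the example value of 40421.0 kWh/year this is about 4614 W on average.
avg_w = mean_load_w(40421.0)
```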
### Load Prediction Provider Configuration
<!-- pyml disable line-length -->
:::{table} load::provider_settings
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| LoadAkkudoktor | `Optional[akkudoktoreos.prediction.loadakkudoktor.LoadAkkudoktorCommonSettings]` | `rw` | `None` | LoadAkkudoktor settings |
| LoadImport | `Optional[akkudoktoreos.prediction.loadimport.LoadImportCommonSettings]` | `rw` | `None` | LoadImport settings |
| LoadVrm | `Optional[akkudoktoreos.prediction.loadvrm.LoadVrmCommonSettings]` | `rw` | `None` | LoadVrm settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider_settings": {
"LoadAkkudoktor": null,
"LoadVrm": null,
"LoadImport": null
}
}
}
```
<!-- pyml enable line-length -->

## Logging Configuration
<!-- pyml disable line-length -->
:::{table} logging
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| console_level | `EOS_LOGGING__CONSOLE_LEVEL` | `Optional[str]` | `rw` | `None` | Logging level when logging to console. |
| file_level | `EOS_LOGGING__FILE_LEVEL` | `Optional[str]` | `rw` | `None` | Logging level when logging to file. |
| file_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | None |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"logging": {
"console_level": "TRACE",
"file_level": "TRACE"
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"logging": {
"console_level": "TRACE",
"file_level": "TRACE",
"file_path": "/home/user/.local/share/net.akkudoktor.eos/output/eos.log"
}
}
```
<!-- pyml enable line-length -->

## Measurement Configuration
<!-- pyml disable line-length -->
:::{table} measurement
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| grid_export_emr_keys | `EOS_MEASUREMENT__GRID_EXPORT_EMR_KEYS` | `Optional[list[str]]` | `rw` | `None` | The keys of the measurements that are energy meter readings of energy export to grid [kWh]. |
| grid_import_emr_keys | `EOS_MEASUREMENT__GRID_IMPORT_EMR_KEYS` | `Optional[list[str]]` | `rw` | `None` | The keys of the measurements that are energy meter readings of energy import from grid [kWh]. |
| keys | | `list[str]` | `ro` | `N/A` | None |
| load_emr_keys | `EOS_MEASUREMENT__LOAD_EMR_KEYS` | `Optional[list[str]]` | `rw` | `None` | The keys of the measurements that are energy meter readings of a load [kWh]. |
| pv_production_emr_keys | `EOS_MEASUREMENT__PV_PRODUCTION_EMR_KEYS` | `Optional[list[str]]` | `rw` | `None` | The keys of the measurements that are PV production energy meter readings [kWh]. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"measurement": {
"load_emr_keys": [
"load0_emr"
],
"grid_export_emr_keys": [
"grid_export_emr"
],
"grid_import_emr_keys": [
"grid_import_emr"
],
"pv_production_emr_keys": [
"pv1_emr"
]
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"measurement": {
"load_emr_keys": [
"load0_emr"
],
"grid_export_emr_keys": [
"grid_export_emr"
],
"grid_import_emr_keys": [
"grid_import_emr"
],
"pv_production_emr_keys": [
"pv1_emr"
],
"keys": [
"grid_export_emr",
"grid_import_emr",
"load0_emr",
"pv1_emr"
]
}
}
```
<!-- pyml enable line-length -->
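The read-only `keys` field in the example output appears to be the sorted, de-duplicated union of the four configured EMR key lists. A sketch of that derivation (an inference from the example, not the confirmed EOS implementation):

```python
def measurement_keys(load: list[str], grid_export: list[str],
                     grid_import: list[str], pv_production: list[str]) -> list[str]:
    """Combine the configured EMR key lists into one sorted, de-duplicated
    list, matching the read-only `keys` field in the example output."""
    return sorted(set(load) | set(grid_export) | set(grid_import) | set(pv_production))
```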

## General Optimization Configuration
<!-- pyml disable line-length -->
:::{table} optimization
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| algorithm | `EOS_OPTIMIZATION__ALGORITHM` | `Optional[str]` | `rw` | `GENETIC` | The optimization algorithm. |
| genetic | `EOS_OPTIMIZATION__GENETIC` | `Optional[akkudoktoreos.optimization.optimization.GeneticCommonSettings]` | `rw` | `None` | Genetic optimization algorithm configuration. |
| horizon_hours | `EOS_OPTIMIZATION__HORIZON_HOURS` | `Optional[int]` | `rw` | `24` | The general time window within which the energy optimization goal shall be achieved [h]. Defaults to 24 hours. |
| interval | `EOS_OPTIMIZATION__INTERVAL` | `Optional[int]` | `rw` | `3600` | The optimization interval [sec]. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"optimization": {
"horizon_hours": 24,
"interval": 3600,
"algorithm": "GENETIC",
"genetic": {
"individuals": 400,
"generations": 400,
"seed": null,
"penalties": {
"ev_soc_miss": 10
}
}
}
}
```
<!-- pyml enable line-length -->
### General Genetic Optimization Algorithm Configuration
<!-- pyml disable line-length -->
:::{table} optimization::genetic
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| generations | `Optional[int]` | `rw` | `400` | Number of generations to evaluate the optimal solution [>= 10]. Defaults to 400. |
| individuals | `Optional[int]` | `rw` | `300` | Number of individuals (solutions) to generate for the (initial) generation [>= 10]. Defaults to 300. |
| penalties | `Optional[dict[str, Union[float, int, str]]]` | `rw` | `None` | A dictionary of penalty function parameters consisting of a penalty function parameter name and the associated value. |
| seed | `Optional[int]` | `rw` | `None` | Fixed seed for genetic algorithm. Defaults to 'None' which means random seed. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"optimization": {
"genetic": {
"individuals": 300,
"generations": 400,
"seed": null,
"penalties": {
"ev_soc_miss": 10
}
}
}
}
```
<!-- pyml enable line-length -->

## General Prediction Configuration
This class provides configuration for prediction settings, allowing users to specify
parameters such as the forecast duration (in hours).

Validators ensure each parameter is within a specified range.

Attributes:

- `hours` (Optional[int]): Number of hours into the future for predictions. Must be non-negative.
- `historic_hours` (Optional[int]): Number of hours into the past for historical data. Must be non-negative.

Validators:

- `validate_hours` (int): Ensures `hours` is a non-negative integer.
- `validate_historic_hours` (int): Ensures `historic_hours` is a non-negative integer.
<!-- pyml disable line-length -->
:::{table} prediction
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| historic_hours | `EOS_PREDICTION__HISTORIC_HOURS` | `Optional[int]` | `rw` | `48` | Number of hours into the past for historical predictions data |
| hours | `EOS_PREDICTION__HOURS` | `Optional[int]` | `rw` | `48` | Number of hours into the future for predictions |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"prediction": {
"hours": 48,
"historic_hours": 48
}
}
```
<!-- pyml enable line-length -->

## PV Forecast Configuration
<!-- pyml disable line-length -->
:::{table} pvforecast
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| max_planes | `EOS_PVFORECAST__MAX_PLANES` | `Optional[int]` | `rw` | `0` | Maximum number of planes that can be set |
| planes | `EOS_PVFORECAST__PLANES` | `Optional[list[akkudoktoreos.prediction.pvforecast.PVForecastPlaneSetting]]` | `rw` | `None` | Plane configuration. |
| planes_azimuth | | `List[float]` | `ro` | `N/A` | None |
| planes_inverter_paco | | `Any` | `ro` | `N/A` | None |
| planes_peakpower | | `List[float]` | `ro` | `N/A` | None |
| planes_tilt | | `List[float]` | `ro` | `N/A` | None |
| planes_userhorizon | | `Any` | `ro` | `N/A` | None |
| provider | `EOS_PVFORECAST__PROVIDER` | `Optional[str]` | `rw` | `None` | PVForecast provider id of provider to be used. |
| provider_settings | `EOS_PVFORECAST__PROVIDER_SETTINGS` | `PVForecastCommonProviderSettings` | `rw` | `required` | Provider settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider": "PVForecastAkkudoktor",
"provider_settings": {
"PVForecastImport": null,
"PVForecastVrm": null
},
"planes": [
{
"surface_tilt": 10.0,
"surface_azimuth": 180.0,
"userhorizon": [
10.0,
20.0,
30.0
],
"peakpower": 5.0,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 0,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 6000,
"modules_per_string": 20,
"strings_per_inverter": 2
},
{
"surface_tilt": 20.0,
"surface_azimuth": 90.0,
"userhorizon": [
5.0,
15.0,
25.0
],
"peakpower": 3.5,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 1,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 4000,
"modules_per_string": 20,
"strings_per_inverter": 2
}
],
"max_planes": 1
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider": "PVForecastAkkudoktor",
"provider_settings": {
"PVForecastImport": null,
"PVForecastVrm": null
},
"planes": [
{
"surface_tilt": 10.0,
"surface_azimuth": 180.0,
"userhorizon": [
10.0,
20.0,
30.0
],
"peakpower": 5.0,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 0,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 6000,
"modules_per_string": 20,
"strings_per_inverter": 2
},
{
"surface_tilt": 20.0,
"surface_azimuth": 90.0,
"userhorizon": [
5.0,
15.0,
25.0
],
"peakpower": 3.5,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 1,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 4000,
"modules_per_string": 20,
"strings_per_inverter": 2
}
],
"max_planes": 1,
"planes_peakpower": [
5.0,
3.5
],
"planes_azimuth": [
180.0,
90.0
],
"planes_tilt": [
10.0,
20.0
],
"planes_userhorizon": [
[
10.0,
20.0,
30.0
],
[
5.0,
15.0,
25.0
]
],
"planes_inverter_paco": [
6000.0,
4000.0
]
}
}
```
<!-- pyml enable line-length -->
### Common settings for VRM API
<!-- pyml disable line-length -->
:::{table} pvforecast::provider_settings::PVForecastVrm
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| pvforecast_vrm_idsite | `int` | `rw` | `12345` | VRM installation ID |
| pvforecast_vrm_token | `str` | `rw` | `your-token` | Token for connecting to the VRM API |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider_settings": {
"PVForecastVrm": {
"pvforecast_vrm_token": "your-token",
"pvforecast_vrm_idsite": 12345
}
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for pvforecast data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} pvforecast::provider_settings::PVForecastImport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import PV forecast data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of PV forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider_settings": {
"PVForecastImport": {
"import_file_path": null,
"import_json": "{\"pvforecast_ac_power\": [0, 8.05, 352.91]}"
}
}
}
}
```
<!-- pyml enable line-length -->
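Because `import_json` is a JSON string embedded inside JSON, its quotes must be escaped, as in the example above. Building the payload with `json.dumps` handles the escaping automatically (field names taken from the table; the surrounding script is illustrative):

```python
import json

forecast = {"pvforecast_ac_power": [0, 8.05, 352.91]}

settings = {
    "pvforecast": {
        "provider_settings": {
            # import_json carries the forecast as a string, so dump it first.
            "PVForecastImport": {"import_file_path": None,
                                 "import_json": json.dumps(forecast)}
        }
    }
}

payload = json.dumps(settings)  # ready to send as a request body
```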
### PV Forecast Provider Configuration
<!-- pyml disable line-length -->
:::{table} pvforecast::provider_settings
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| PVForecastImport | `Optional[akkudoktoreos.prediction.pvforecastimport.PVForecastImportCommonSettings]` | `rw` | `None` | PVForecastImport settings |
| PVForecastVrm | `Optional[akkudoktoreos.prediction.pvforecastvrm.PVForecastVrmCommonSettings]` | `rw` | `None` | PVForecastVrm settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider_settings": {
"PVForecastImport": null,
"PVForecastVrm": null
}
}
}
```
<!-- pyml enable line-length -->
### PV Forecast Plane Configuration
<!-- pyml disable line-length -->
:::{table} pvforecast::planes::list
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| albedo | `Optional[float]` | `rw` | `None` | Proportion of incident light that the ground reflects back. |
| inverter_model | `Optional[str]` | `rw` | `None` | Model of the inverter of this plane. |
| inverter_paco | `Optional[int]` | `rw` | `None` | AC power rating of the inverter [W]. |
| loss | `Optional[float]` | `rw` | `14.0` | Sum of PV system losses in percent |
| module_model | `Optional[str]` | `rw` | `None` | Model of the PV modules of this plane. |
| modules_per_string | `Optional[int]` | `rw` | `None` | Number of PV modules per string of this plane. |
| mountingplace | `Optional[str]` | `rw` | `free` | Type of mounting for PV system. Options are 'free' for free-standing and 'building' for building-integrated. |
| optimal_surface_tilt | `Optional[bool]` | `rw` | `False` | Calculate the optimum tilt angle. Ignored for two-axis tracking. |
| optimalangles | `Optional[bool]` | `rw` | `False` | Calculate the optimum tilt and azimuth angles. Ignored for two-axis tracking. |
| peakpower | `Optional[float]` | `rw` | `None` | Nominal power of PV system in kW. |
| pvtechchoice | `Optional[str]` | `rw` | `crystSi` | PV technology. One of 'crystSi', 'CIS', 'CdTe', 'Unknown'. |
| strings_per_inverter | `Optional[int]` | `rw` | `None` | Number of strings connected to the inverter of this plane. |
| surface_azimuth | `Optional[float]` | `rw` | `180.0` | Orientation (azimuth angle) of the (fixed) plane. Clockwise from north (north=0, east=90, south=180, west=270). |
| surface_tilt | `Optional[float]` | `rw` | `30.0` | Tilt angle from horizontal plane. Ignored for two-axis tracking. |
| trackingtype | `Optional[int]` | `rw` | `None` | Type of suntracking. 0=fixed, 1=single horizontal axis aligned north-south, 2=two-axis tracking, 3=vertical axis tracking, 4=single horizontal axis aligned east-west, 5=single inclined axis aligned north-south. |
| userhorizon | `Optional[List[float]]` | `rw` | `None` | Elevation of horizon in degrees, at equally spaced azimuth clockwise from north. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"planes": [
{
"surface_tilt": 10.0,
"surface_azimuth": 180.0,
"userhorizon": [
10.0,
20.0,
30.0
],
"peakpower": 5.0,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 0,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 6000,
"modules_per_string": 20,
"strings_per_inverter": 2
}
]
}
}
```
<!-- pyml enable line-length -->
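The angle conventions in the table — azimuth measured clockwise from north, tilt measured from the horizontal plane — can be illustrated with a small helper (hypothetical, for orientation only):

```python
# Azimuth per the table: clockwise from north (north=0, east=90, south=180, west=270).
COMPASS_AZIMUTH = {"north": 0.0, "east": 90.0, "south": 180.0, "west": 270.0}

def plane(direction, tilt):
    # tilt is measured from the horizontal plane: 0 is flat, 90 is vertical
    return {"surface_azimuth": COMPASS_AZIMUTH[direction], "surface_tilt": tilt}

print(plane("south", 30.0))  # {'surface_azimuth': 180.0, 'surface_tilt': 30.0}
```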

## Server Configuration
<!-- pyml disable line-length -->
:::{table} server
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| eosdash_host | `EOS_SERVER__EOSDASH_HOST` | `Optional[str]` | `rw` | `None` | EOSdash server IP address. Defaults to EOS server IP address. |
| eosdash_port | `EOS_SERVER__EOSDASH_PORT` | `Optional[int]` | `rw` | `None` | EOSdash server IP port number. Defaults to EOS server IP port number + 1. |
| host | `EOS_SERVER__HOST` | `Optional[str]` | `rw` | `127.0.0.1` | EOS server IP address. Defaults to 127.0.0.1. |
| port | `EOS_SERVER__PORT` | `Optional[int]` | `rw` | `8503` | EOS server IP port number. Defaults to 8503. |
| startup_eosdash | `EOS_SERVER__STARTUP_EOSDASH` | `Optional[bool]` | `rw` | `True` | EOS server to start EOSdash server. Defaults to True. |
| verbose | `EOS_SERVER__VERBOSE` | `Optional[bool]` | `rw` | `False` | Enable debug output |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"server": {
"host": "127.0.0.1",
"port": 8503,
"verbose": false,
"startup_eosdash": true,
"eosdash_host": "127.0.0.1",
"eosdash_port": 8504
}
}
```
<!-- pyml enable line-length -->
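The documented fallbacks — EOSdash defaulting to the EOS host and to the EOS port plus one — amount to the following (a sketch of the documented behaviour, not the actual server code):

```python
def eosdash_defaults(host="127.0.0.1", port=8503, eosdash_host=None, eosdash_port=None):
    # When unset, EOSdash falls back to the EOS host and EOS port + 1.
    return (eosdash_host or host,
            eosdash_port if eosdash_port is not None else port + 1)

print(eosdash_defaults())           # ('127.0.0.1', 8504)
print(eosdash_defaults(port=9000))  # ('127.0.0.1', 9001)
```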

## Utils Configuration
<!-- pyml disable line-length -->
:::{table} utils
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"utils": {}
}
```
<!-- pyml enable line-length -->

## Weather Forecast Configuration
<!-- pyml disable line-length -->
:::{table} weather
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| provider | `EOS_WEATHER__PROVIDER` | `Optional[str]` | `rw` | `None` | Weather provider id of provider to be used. |
| provider_settings | `EOS_WEATHER__PROVIDER_SETTINGS` | `WeatherCommonProviderSettings` | `rw` | `required` | Provider settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"weather": {
"provider": "WeatherImport",
"provider_settings": {
"WeatherImport": null
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for weather data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} weather::provider_settings::WeatherImport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import weather data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of weather forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"weather": {
"provider_settings": {
"WeatherImport": {
"import_file_path": null,
"import_json": "{\"weather_temp_air\": [18.3, 17.8, 16.9]}"
}
}
}
}
```
<!-- pyml enable line-length -->
### Weather Forecast Provider Configuration
<!-- pyml disable line-length -->
:::{table} weather::provider_settings
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| WeatherImport | `Optional[akkudoktoreos.prediction.weatherimport.WeatherImportCommonSettings]` | `rw` | `None` | WeatherImport settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"weather": {
"provider_settings": {
"WeatherImport": null
}
}
}
```
<!-- pyml enable line-length -->

# Akkudoktor-EOS
**Version**: `v0.2.0+dev.4dbc2d`
<!-- pyml disable line-length -->
**Description**: This project provides a comprehensive solution for simulating and optimizing an energy system based on renewable energy sources. With a focus on photovoltaic (PV) systems, battery storage (batteries), load management (consumer requirements), heat pumps, electric vehicles, and consideration of electricity price data, this system enables forecasting and optimization of energy flow and costs over a specified period.
<!-- pyml enable line-length -->
**Base URL**: `No base URL provided.`
## POST /gesamtlast
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_gesamtlast_gesamtlast_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_gesamtlast_gesamtlast_post)
<!-- pyml enable line-length -->
Fastapi Gesamtlast
<!-- pyml disable line-length -->
```python
"""
Deprecated: Total Load Prediction with adjustment.
Endpoint to handle total load prediction adjusted by latest measured data.
'/v1/measurement/series' or
'/v1/measurement/dataframe' or
'/v1/measurement/data'
"""
```
<!-- pyml enable line-length -->
**Request Body**:
## GET /gesamtlast_simple
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_gesamtlast_simple_gesamtlast_simple_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_gesamtlast_simple_gesamtlast_simple_get)
<!-- pyml enable line-length -->
Fastapi Gesamtlast Simple
<!-- pyml disable line-length -->
```python
"""
Deprecated: Total Load Prediction.
Endpoint to handle total load prediction.
'/v1/prediction/update'
and then request data with
'/v1/prediction/list?key=loadforecast_power_w' instead.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
## POST /optimize
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_optimize_optimize_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_optimize_optimize_post)
<!-- pyml enable line-length -->
Fastapi Optimize
<!-- pyml disable line-length -->
```python
"""
Deprecated: Optimize.
Endpoint to handle optimization.
Note:
Use automatic optimization instead.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
## GET /pvforecast
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_pvforecast_pvforecast_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_pvforecast_pvforecast_get)
<!-- pyml enable line-length -->
Fastapi Pvforecast
<!-- pyml disable line-length -->
```python
"""
Deprecated: PV Forecast Prediction.
Endpoint to handle PV forecast prediction.
and then request data with
'/v1/prediction/list?key=pvforecast_ac_power' and
'/v1/prediction/list?key=pvforecastakkudoktor_temp_air' instead.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## GET /strompreis
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_strompreis_strompreis_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_strompreis_strompreis_get)
<!-- pyml enable line-length -->
Fastapi Strompreis
<!-- pyml disable line-length -->
```python
"""
Deprecated: Electricity Market Price Prediction per Wh (€/Wh).
Electricity prices start at 00:00 today and are provided for 48 hours.
and then request data with
'/v1/prediction/list?key=elecprice_marketprice_wh' or
'/v1/prediction/list?key=elecprice_marketprice_kwh' instead.
"""
```
<!-- pyml enable line-length -->
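This endpoint reports prices per Wh, while the replacement prediction keys are offered per Wh and per kWh; the conversion is a plain factor of 1000 (helper name illustrative):

```python
def marketprice_wh(marketprice_kwh):
    # elecprice_marketprice_kwh -> elecprice_marketprice_wh: divide by 1000
    return marketprice_kwh / 1000.0

print(marketprice_wh(0.30))  # €/Wh for a 0.30 €/kWh market price
```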
**Responses**:
## GET /v1/admin/cache
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_get_v1_admin_cache_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_get_v1_admin_cache_get)
<!-- pyml enable line-length -->
Fastapi Admin Cache Get
<!-- pyml disable line-length -->
```python
"""
Current cache management data.
Returns:
data (dict): The management data.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## POST /v1/admin/cache/clear
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_clear_post_v1_admin_cache_clear_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_clear_post_v1_admin_cache_clear_post)
<!-- pyml enable line-length -->
Fastapi Admin Cache Clear Post
<!-- pyml disable line-length -->
```python
"""
Clear the cache.
Deletes all cache files.
Returns:
data (dict): The management data after cleanup.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## POST /v1/admin/cache/clear-expired
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_clear_expired_post_v1_admin_cache_clear-expired_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_clear_expired_post_v1_admin_cache_clear-expired_post)
<!-- pyml enable line-length -->
Fastapi Admin Cache Clear Expired Post
<!-- pyml disable line-length -->
```python
"""
Clear the cache from expired data.
Deletes expired cache files.
Returns:
data (dict): The management data after cleanup.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## POST /v1/admin/cache/load
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_load_post_v1_admin_cache_load_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_load_post_v1_admin_cache_load_post)
<!-- pyml enable line-length -->
Fastapi Admin Cache Load Post
<!-- pyml disable line-length -->
```python
"""
Load cache management data.
Returns:
data (dict): The management data that was loaded.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## POST /v1/admin/cache/save
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_save_post_v1_admin_cache_save_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_save_post_v1_admin_cache_save_post)
<!-- pyml enable line-length -->
Fastapi Admin Cache Save Post
<!-- pyml disable line-length -->
```python
"""
Save the current cache management data.
Returns:
data (dict): The management data that was saved.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## POST /v1/admin/server/restart
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_server_restart_post_v1_admin_server_restart_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_server_restart_post_v1_admin_server_restart_post)
<!-- pyml enable line-length -->
Fastapi Admin Server Restart Post
<!-- pyml disable line-length -->
```python
"""
Restart the server.
Restart EOS properly by starting a new instance before exiting the old one.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## POST /v1/admin/server/shutdown
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_server_shutdown_post_v1_admin_server_shutdown_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_server_shutdown_post_v1_admin_server_shutdown_post)
<!-- pyml enable line-length -->
Fastapi Admin Server Shutdown Post
<!-- pyml disable line-length -->
```python
"""
Shutdown the server.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## GET /v1/config
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_get_v1_config_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_get_v1_config_get)
<!-- pyml enable line-length -->
Fastapi Config Get
<!-- pyml disable line-length -->
```python
"""
Get the current configuration.
Returns:
configuration (ConfigEOS): The current configuration.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## PUT /v1/config
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_put_v1_config_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_put_v1_config_put)
<!-- pyml enable line-length -->
Fastapi Config Put
<!-- pyml disable line-length -->
```python
"""
Update the current config with the provided settings.
Note that for any setting value that is None or unset, the configuration will fall back to
Returns:
configuration (ConfigEOS): The current configuration after the write.
"""
```
<!-- pyml enable line-length -->
**Request Body**:
## GET /v1/config/backup
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_backup_get_v1_config_backup_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_backup_get_v1_config_backup_get)
<!-- pyml enable line-length -->
Fastapi Config Backup Get
<!-- pyml disable line-length -->
```python
"""
Get the EOS configuration backup identifiers and backup metadata.
Returns:
dict[str, dict[str, Any]]: Mapping of backup identifiers to metadata.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## PUT /v1/config/file
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_file_put_v1_config_file_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_file_put_v1_config_file_put)
<!-- pyml enable line-length -->
Fastapi Config File Put
<!-- pyml disable line-length -->
```python
"""
Save the current configuration to the EOS configuration file.
Returns:
configuration (ConfigEOS): The current configuration that was saved.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## POST /v1/config/reset
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_reset_post_v1_config_reset_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_reset_post_v1_config_reset_post)
<!-- pyml enable line-length -->
Fastapi Config Reset Post
<!-- pyml disable line-length -->
```python
"""
Reset the configuration to the EOS configuration file.
Returns:
configuration (ConfigEOS): The current configuration after update.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## PUT /v1/config/revert
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_revert_put_v1_config_revert_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_revert_put_v1_config_revert_put)
<!-- pyml enable line-length -->
Fastapi Config Revert Put
<!-- pyml disable line-length -->
```python
"""
Revert the configuration to an EOS configuration backup.
Returns:
configuration (ConfigEOS): The current configuration after revert.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
## GET /v1/config/{path}
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_get_key_v1_config__path__get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_get_key_v1_config__path__get)
<!-- pyml enable line-length -->
Fastapi Config Get Key
<!-- pyml disable line-length -->
```python
"""
Get the value of a nested key or index in the config model.
Args:
Returns:
value (Any): The value of the selected nested key.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
## PUT /v1/config/{path}
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_put_key_v1_config__path__put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_put_key_v1_config__path__put)
<!-- pyml enable line-length -->
Fastapi Config Put Key
<!-- pyml disable line-length -->
```python
"""
Update a nested key or index in the config model.
Args:
Returns:
configuration (ConfigEOS): The current configuration after the update.
"""
```
<!-- pyml enable line-length -->
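The `{path}` segment addresses a nested key with URL path separators. Assuming segments map one-to-one onto the config nesting (a reading of the route shape, not confirmed by this document), a request URL can be assembled like this:

```python
def config_url(base, *segments):
    # e.g. updating server.port -> PUT /v1/config/server/port (assumed mapping)
    return base.rstrip("/") + "/v1/config/" + "/".join(segments)

print(config_url("http://localhost:8503", "server", "port"))
# http://localhost:8503/v1/config/server/port
```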
**Parameters**:
## GET /v1/energy-management/optimization/solution
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_energy_management_optimization_solution_get_v1_energy-management_optimization_solution_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_energy_management_optimization_solution_get_v1_energy-management_optimization_solution_get)
<!-- pyml enable line-length -->
Fastapi Energy Management Optimization Solution Get
<!-- pyml disable line-length -->
```python
"""
Get the latest solution of the optimization.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## GET /v1/energy-management/plan
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_energy_management_plan_get_v1_energy-management_plan_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_energy_management_plan_get_v1_energy-management_plan_get)
<!-- pyml enable line-length -->
Fastapi Energy Management Plan Get
<!-- pyml disable line-length -->
```python
"""
Get the latest energy management plan.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## GET /v1/health
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_health_get_v1_health_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_health_get_v1_health_get)
<!-- pyml enable line-length -->
Fastapi Health Get
<!-- pyml disable line-length -->
```python
"""
Health check endpoint to verify that the EOS server is alive.
"""
```
<!-- pyml enable line-length -->
**Responses**:
## GET /v1/logging/log
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_logging_get_log_v1_logging_log_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_logging_get_log_v1_logging_log_get)
<!-- pyml enable line-length -->
Fastapi Logging Get Log
<!-- pyml disable line-length -->
```python
"""
Get structured log entries from the EOS log file.
Filters and returns log entries based on the specified query parameters. The log
Returns:
JSONResponse: A JSON list of log entries.
"""
```
<!-- pyml enable line-length -->
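Server-side, this endpoint filters structured entries by the query parameters; the effect of a level filter can be sketched client-side as follows (the entry shape is assumed for illustration):

```python
entries = [
    {"level": "INFO", "msg": "server started"},
    {"level": "ERROR", "msg": "provider timeout"},
    {"level": "INFO", "msg": "cache saved"},
]

# Equivalent of requesting only ERROR entries from the endpoint.
errors = [e for e in entries if e["level"] == "ERROR"]
print(errors)  # [{'level': 'ERROR', 'msg': 'provider timeout'}]
```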
**Parameters**:
## PUT /v1/measurement/data
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_data_put_v1_measurement_data_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_data_put_v1_measurement_data_put)
<!-- pyml enable line-length -->
Fastapi Measurement Data Put
<!-- pyml disable line-length -->
```python
"""
Merge the measurement data given as datetime data into EOS measurements.
"""
```
<!-- pyml enable line-length -->
**Request Body**:
## PUT /v1/measurement/dataframe
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_dataframe_put_v1_measurement_dataframe_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_dataframe_put_v1_measurement_dataframe_put)
<!-- pyml enable line-length -->
Fastapi Measurement Dataframe Put
<!-- pyml disable line-length -->
```python
"""
Merge the measurement data given as dataframe into EOS measurements.
"""
```
<!-- pyml enable line-length -->
**Request Body**:
@@ -662,13 +820,19 @@ Merge the measurement data given as dataframe into EOS measurements.
## GET /v1/measurement/keys
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_keys_get_v1_measurement_keys_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_keys_get_v1_measurement_keys_get)
<!-- pyml enable line-length -->
Fastapi Measurement Keys Get
<!-- pyml disable line-length -->
```python
"""
Get a list of available measurement keys.
"""
```
<!-- pyml enable line-length -->
**Responses**:
@@ -678,13 +842,19 @@ Get a list of available measurement keys.
## GET /v1/measurement/series
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_series_get_v1_measurement_series_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_series_get_v1_measurement_series_get)
<!-- pyml enable line-length -->
Fastapi Measurement Series Get
<!-- pyml disable line-length -->
```python
"""
Get the measurements of given key as series.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -700,13 +870,19 @@ Get the measurements of given key as series.
## PUT /v1/measurement/series
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_series_put_v1_measurement_series_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_series_put_v1_measurement_series_put)
<!-- pyml enable line-length -->
Fastapi Measurement Series Put
<!-- pyml disable line-length -->
```python
"""
Merge measurement given as series into given key.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -728,13 +904,19 @@ Merge measurement given as series into given key.
## PUT /v1/measurement/value
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_value_put_v1_measurement_value_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_value_put_v1_measurement_value_put)
<!-- pyml enable line-length -->
Fastapi Measurement Value Put
<!-- pyml disable line-length -->
```python
"""
Merge the measurement of given key and value into EOS measurements at given datetime.
"""
```
<!-- pyml enable line-length -->
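As a rough illustration, merging a single value could look like the sketch below. The parameter names (`key`, `value`, `datetime`) and the measurement key are assumptions inferred from the docstring, not confirmed by this excerpt; check the Swagger UI linked above for the real schema.

```python
from urllib.parse import urlencode

# Hypothetical sketch: parameter names and the measurement key are assumptions.
base_url = "http://localhost:8503"  # local EOS instance, as in the links above
params = {
    "key": "load_meter_reading",        # hypothetical measurement key
    "value": 1250.0,
    "datetime": "2025-01-01T12:00:00",
}
url = f"{base_url}/v1/measurement/value?{urlencode(params)}"
# import requests
# requests.put(url, timeout=10).raise_for_status()  # uncomment against a running EOS
print(url)
```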
**Parameters**:
@@ -754,11 +936,15 @@ Merge the measurement of given key and value into EOS measurements at given date
## GET /v1/prediction/dataframe
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_dataframe_get_v1_prediction_dataframe_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_dataframe_get_v1_prediction_dataframe_get)
<!-- pyml enable line-length -->
Fastapi Prediction Dataframe Get
<!-- pyml disable line-length -->
```python
"""
Get prediction for given key within given date range as series.
Args:
@@ -768,7 +954,9 @@ Args:
end_datetime (Optional[str]): Ending datetime (exclusive).
Defaults to end datetime of latest prediction.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -790,11 +978,15 @@ Defaults to end datetime of latest prediction.
## PUT /v1/prediction/import/{provider_id}
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_import_provider_v1_prediction_import__provider_id__put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_import_provider_v1_prediction_import__provider_id__put)
<!-- pyml enable line-length -->
Fastapi Prediction Import Provider
<!-- pyml disable line-length -->
```python
"""
Import prediction for given provider ID.
Args:
@@ -802,7 +994,9 @@ Args:
data: Prediction data.
force_enable: Update data even if provider is disabled.
Defaults to False.
"""
```
<!-- pyml enable line-length -->
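A hedged sketch of importing data for a provider follows. The provider ID, the payload key, and the values are placeholders, not taken from this excerpt; the real request schema is in the Swagger UI linked above.

```python
import json
from urllib.parse import urlencode

# Hypothetical sketch: provider ID and payload shape are assumptions.
base_url = "http://localhost:8503"
provider_id = "ElecPriceImport"              # assumed provider ID
query = urlencode({"force_enable": "true"})  # update even if the provider is disabled
url = f"{base_url}/v1/prediction/import/{provider_id}?{query}"
body = json.dumps({"elecprice_marketprice_wh": [0.0003, 0.00032]})  # assumed key
# import requests
# requests.put(url, data=body, headers={"Content-Type": "application/json"}, timeout=10)
print(url)
```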
**Parameters**:
@@ -841,13 +1035,19 @@ Args:
## GET /v1/prediction/keys
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_keys_get_v1_prediction_keys_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_keys_get_v1_prediction_keys_get)
<!-- pyml enable line-length -->
Fastapi Prediction Keys Get
<!-- pyml disable line-length -->
```python
"""
Get a list of available prediction keys.
"""
```
<!-- pyml enable line-length -->
**Responses**:
@@ -857,11 +1057,15 @@ Get a list of available prediction keys.
## GET /v1/prediction/list
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_list_get_v1_prediction_list_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_list_get_v1_prediction_list_get)
<!-- pyml enable line-length -->
Fastapi Prediction List Get
<!-- pyml disable line-length -->
```python
"""
Get prediction for given key within given date range as value list.
Args:
@@ -872,7 +1076,9 @@ Args:
Defaults to end datetime of latest prediction.
interval (Optional[str]): Time duration for each interval.
Defaults to 1 hour.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -894,16 +1100,22 @@ Args:
## GET /v1/prediction/providers
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_providers_get_v1_prediction_providers_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_providers_get_v1_prediction_providers_get)
<!-- pyml enable line-length -->
Fastapi Prediction Providers Get
<!-- pyml disable line-length -->
```python
"""
Get a list of available prediction providers.
Args:
enabled (bool): Return enabled/disabled providers. If unset, return all providers.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -919,11 +1131,15 @@ Args:
## GET /v1/prediction/series
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_series_get_v1_prediction_series_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_series_get_v1_prediction_series_get)
<!-- pyml enable line-length -->
Fastapi Prediction Series Get
<!-- pyml disable line-length -->
```python
"""
Get prediction for given key within given date range as series.
Args:
@@ -932,7 +1148,9 @@ Args:
Defaults to start datetime of latest prediction.
end_datetime (Optional[str]): Ending datetime (exclusive).
Defaults to end datetime of latest prediction.
"""
```
<!-- pyml enable line-length -->
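A minimal sketch of querying a prediction series, assuming a hypothetical prediction key (real keys come from `GET /v1/prediction/keys`):

```python
from urllib.parse import urlencode

# Sketch only: "pvforecast_ac_power" is a hypothetical key, not from this excerpt.
base_url = "http://localhost:8503"
params = {
    "key": "pvforecast_ac_power",
    "start_datetime": "2025-01-01",  # optional; defaults to start of latest prediction
    "end_datetime": "2025-01-02",    # optional; exclusive
}
url = f"{base_url}/v1/prediction/series?{urlencode(params)}"
# import requests
# series = requests.get(url, timeout=10).json()
print(url)
```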
**Parameters**:
@@ -952,11 +1170,15 @@ Args:
## POST /v1/prediction/update
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_update_v1_prediction_update_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_update_v1_prediction_update_post)
<!-- pyml enable line-length -->
Fastapi Prediction Update
<!-- pyml disable line-length -->
```python
"""
Update predictions for all providers.
Args:
@@ -964,7 +1186,9 @@ Args:
Defaults to False.
force_enable: Update data even if provider is disabled.
Defaults to False.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -982,11 +1206,15 @@ Args:
## POST /v1/prediction/update/{provider_id}
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_update_provider_v1_prediction_update__provider_id__post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_update_provider_v1_prediction_update__provider_id__post)
<!-- pyml enable line-length -->
Fastapi Prediction Update Provider
<!-- pyml disable line-length -->
```python
"""
Update predictions for given provider ID.
Args:
@@ -995,7 +1223,9 @@ Args:
Defaults to False.
force_enable: Update data even if provider is disabled.
Defaults to False.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -1015,16 +1245,22 @@ Args:
## GET /v1/resource/status
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_devices_status_get_v1_resource_status_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_devices_status_get_v1_resource_status_get)
<!-- pyml enable line-length -->
Fastapi Devices Status Get
<!-- pyml disable line-length -->
```python
"""
Get the latest status of a resource/device.
Return:
latest_status: The latest status of a resource/device.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -1042,16 +1278,22 @@ Return:
## PUT /v1/resource/status
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_devices_status_put_v1_resource_status_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_devices_status_put_v1_resource_status_put)
<!-- pyml enable line-length -->
Fastapi Devices Status Put
<!-- pyml disable line-length -->
```python
"""
Update the status of a resource/device.
Return:
latest_status: The latest status of a resource/device.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -1105,7 +1347,9 @@ Return:
## GET /visualization_results.pdf
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/get_pdf_visualization_results_pdf_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/get_pdf_visualization_results_pdf_get)
<!-- pyml enable line-length -->
Get Pdf
@@ -1114,3 +1358,5 @@ Get Pdf
- **200**: Successful Response
---
Auto generated from openapi.json.

View File

@@ -124,8 +124,9 @@ Configuration options:
- `charges_kwh`: Electricity price charges (€/kWh).
- `vat_rate`: VAT rate factor applied to electricity price when charges are used (default: 1.19).
- `provider_settings.import_file_path`: Path to the file to import electricity price forecast data from.
- `provider_settings.import_json`: JSON string, dictionary of electricity price forecast value lists.
- `elecpriceimport.import_file_path`: Path to the file to import electricity price forecast data from.
- `elecpriceimport.import_json`: JSON string, dictionary of electricity price forecast value lists.
- `energycharts.bidding_zone`: Bidding zone Energy Charts shall provide price data for.
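Assembled into a configuration file, the renamed provider-specific sections might look as follows. The enclosing `elecprice` object, the file path, and the numeric values are illustrative assumptions, not taken from this excerpt:

```json
{
  "elecprice": {
    "charges_kwh": 0.21,
    "vat_rate": 1.19,
    "elecpriceimport": {
      "import_file_path": "/path/to/prices.json"
    },
    "energycharts": {
      "bidding_zone": "DE-LU"
    }
  }
}
```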
### ElecPriceAkkudoktor Provider

View File

@@ -7,13 +7,20 @@ https://www.sphinx-doc.org/en/master/usage/configuration.html
import sys
from pathlib import Path
# Add the src directory to sys.path so Sphinx can import akkudoktoreos
PROJECT_ROOT = Path(__file__).parent.parent
SRC_DIR = PROJECT_ROOT / "src"
sys.path.insert(0, str(SRC_DIR))
from akkudoktoreos.core.version import __version__
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = "Akkudoktor EOS"
copyright = "2024, Andreas Schmitz"
copyright = "2025, Andreas Schmitz"
author = "Andreas Schmitz"
release = "0.0.1"
release = __version__
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
@@ -22,6 +29,7 @@ extensions = [
"sphinx.ext.autodoc",
"sphinx.ext.autosummary",
"sphinx.ext.napoleon",
"sphinx.ext.todo",
"sphinx_rtd_theme",
"myst_parser",
"sphinx_tabs.tabs",

View File

@@ -393,6 +393,13 @@ At a minimum, you should run the module tests:
make test
```
:::{admonition} Note
:class: note
Depending on your changes, you may also need to update version.py and the documentation files.
Follow the suggestions given by the tests. You may defer the version.py and documentation
changes until you finalize your change.
:::
You should also run the system tests. These include additional tests that interact with real
resources:

View File

@@ -5,10 +5,10 @@
This guide provides different methods to install AkkudoktorEOS:
- Installation from Source (GitHub)
- Installation from Release Package (GitHub)
- Installation with Docker (DockerHub)
- Installation with Docker (docker-compose)
- Installation from Source (GitHub) (M1)
- Installation from Release Package (GitHub) (M2)
- Installation with Docker (DockerHub) (M3)
- Installation with Docker (docker-compose) (M4)
Choose the method that best suits your needs.
@@ -34,6 +34,9 @@ Before installing, ensure you have the following:
- Docker Engine 20.10 or higher
- Docker Compose (optional, recommended)
See [Install Docker Engine](https://docs.docker.com/engine/install/) for how to install Docker
on your Linux distro.
## Installation from Source (GitHub) (M1)
Recommended for developers or users wanting the latest updates.

View File

@@ -13,8 +13,8 @@ and how to set a **development version** after the release.
| 1 | Contributor | Prepare a release branch **in your fork** using Commitizen |
| 2 | Contributor | Open a **Pull Request to upstream** (`Akkudoktor-EOS/EOS`) |
| 3 | Maintainer | Review and **merge the release PR** |
| 4 | Maintainer | Create the **GitHub Release and tag** |
| 5 | Maintainer | Set the **development version marker** via a follow-up PR |
| 4 | CI | Create the **GitHub Release and tag** |
| 5 | CI | Set the **development version marker** via a follow-up PR |
## 🔄 Detailed Workflow
@@ -40,24 +40,26 @@ git checkout -b release/vX.Y.Z
#### Bump the version information
At least update
Set `__version__` in src/akkudoktoreos/core/version.py
- pyproject.toml
- src/akkudoktoreos/core/version.py
- src/akkudoktoreos/data/default.config.json
- Makefile
```python
__version__ = "0.3.0"
```
Prepare version by updating versioned files, e.g.:
- haaddon/config.yaml
and the generated documentation:
```bash
make bump VERSION=0.1.0+dev NEW_VERSION=X.Y.Z
make gen-docs
make prepare-version
```
You may check the changes by:
Check the changes by:
```bash
git diff
make test-version
```
#### Create a new CHANGELOG.md entry
@@ -66,19 +68,20 @@ Edit CHANGELOG.md
#### Create the new release commit
Add all the changed version files and all other changes to the commit.
```bash
git add pyproject.toml src/akkudoktoreos/core/version.py \
src/akkudoktoreos/data/default.config.json Makefile CHANGELOG.md
git commit -s -m "chore(release): Release vX.Y.Z"
git add src/akkudoktoreos/core/version.py CHANGELOG.md ...
git commit -s -m "chore: Prepare Release v0.3.0"
```
#### Push the branch to your fork
```bash
git push --set-upstream origin release/vX.Y.Z
git push --set-upstream origin release/v0.3.0
```
### 2⃣ Contributor: Open the Release Pull Request
### 2⃣ Contributor: Open the Release Preparation Pull Request
| From | To |
| ------------------------------------ | ------------------------- |
@@ -87,13 +90,13 @@ git push --set-upstream origin release/vX.Y.Z
**PR Title:**
```text
chore(release): release vX.Y.Z
chore: prepare release vX.Y.Z
```
**PR Description Template:**
```markdown
## Release vX.Y.Z
## Prepare Release vX.Y.Z
This pull request prepares release **vX.Y.Z**.
@@ -119,94 +122,26 @@ See `CHANGELOG.md` for full details.
**Merge Strategy:**
- Prefer **Merge Commit** (or **Squash Merge**, per project preference)
- Use commit message: `chore(release): Release vX.Y.Z`
- Use commit message: `chore: Prepare Release vX.Y.Z`
### 4⃣ Maintainer: Publish the GitHub Release
### 4⃣ CI: Publish the GitHub Release
1. Go to **GitHub → Releases → Draft a new release**
2. **Choose tag** → enter `vX.Y.Z` (GitHub creates the tag on publish)
3. **Release title:** `vX.Y.Z`
4. **Paste changelog entry** from `CHANGELOG.md`
5. Optionally enable **Set as latest release**
6. Click **Publish release** 🎉
The new release will automatically be published by the GitHub CI action.
### 5⃣ Maintainer: Prepare the Development Version Marker
See `.github/workflows/bump-version.yml` for details.
**Sync local copy:**
### 5⃣ CI: Prepare the Development Version Marker
```bash
git fetch eos
git checkout main
git pull eos main
```
The development version marker will automatically be set by the GitHub CI action.
**Create a development version branch:**
```bash
git checkout -b release/vX.Y.Z_dev
```
**Set development version marker manually:**
```bash
make bump VERSION=X.Y.Z NEW_VERSION=X.Y.Z+dev
make gen-docs
```
```bash
git add pyproject.toml src/akkudoktoreos/core/version.py \
src/akkudoktoreos/data/default.config.json Makefile
git commit -s -m "chore: set development version marker X.Y.Z+dev"
```
```bash
git push --set-upstream origin release/vX.Y.Z_dev
```
### 6⃣ Maintainer (or Contributor): Open the Development Version PR
| From | To |
| ---------------------------------------- | ------------------------- |
| `<your-username>/EOS:release/vX.Y.Z_dev` | `Akkudoktor-EOS/EOS:main` |
**PR Title:**
```text
chore: development version vX.Y.Z+dev
```
**PR Description Template:**
```markdown
## Development version vX.Y.Z+dev
This pull request marks the repository as back in active development.
### Changes
- Set version to `vX.Y.Z+dev`
No changelog entry is needed.
```
### 7⃣ Maintainer: Review and Merge the Development Version PR
**Checklist:**
- ✅ Only version files updated to `+dev`
- ✅ No unintended changes
**Merge Strategy:**
- Merge with commit message: `chore: development version vX.Y.Z+dev`
See `.github/workflows/bump-version.yml` for details.
## ✅ Quick Reference
| Step | Actor | Action |
| ---- | ----- | ------ |
| **1. Prepare release branch** | Contributor | Bump version & changelog via Commitizen |
| **1. Prepare release branch** | Contributor | Bump version & changelog |
| **2. Open release PR** | Contributor | Submit release for review |
| **3. Review & merge release PR** | Maintainer | Finalize changes into `main` |
| **4. Publish GitHub Release** | Maintainer | Create tag & notify users |
| **5. Prepare development version branch** | Maintainer | Set development marker |
| **6. Open development PR** | Maintainer (or Contributor) | Propose returning to development state |
| **7. Review & merge development PR** | Maintainer | Mark repository as back in development |
| **4. Publish GitHub Release** | CI | Create tag & notify users |
| **5. Prepare development version branch** | CI | Set development marker |

View File

@@ -3,7 +3,7 @@
"info": {
"title": "Akkudoktor-EOS",
"description": "This project provides a comprehensive solution for simulating and optimizing an energy system based on renewable energy sources. With a focus on photovoltaic (PV) systems, battery storage (batteries), load management (consumer requirements), heat pumps, electric vehicles, and consideration of electricity price data, this system enables forecasting and optimization of energy flow and costs over a specified period.",
"version": "v0.2.0+dev"
"version": "v0.2.0+dev.4dbc2d"
},
"paths": {
"/v1/admin/cache/clear": {
@@ -2406,7 +2406,7 @@
"general": {
"$ref": "#/components/schemas/GeneralSettings-Output",
"default": {
"version": "0.2.0+dev",
"version": "0.2.0+dev.4dbc2d",
"data_output_subpath": "output",
"latitude": 52.52,
"longitude": 13.405,
@@ -2469,7 +2469,10 @@
"$ref": "#/components/schemas/ElecPriceCommonSettings-Output",
"default": {
"vat_rate": 1.19,
"provider_settings": {}
"elecpriceimport": {},
"energycharts": {
"bidding_zone": "DE-LU"
}
}
},
"feedintariff": {
@@ -2519,7 +2522,7 @@
"additionalProperties": false,
"type": "object",
"title": "ConfigEOS",
"description": "Singleton configuration handler for the EOS application.\n\nConfigEOS extends `SettingsEOS` with support for default configuration paths and automatic\ninitialization.\n\n`ConfigEOS` ensures that only one instance of the class is created throughout the application,\nallowing consistent access to EOS configuration settings. This singleton instance loads\nconfiguration data from a predefined set of directories or creates a default configuration if\nnone is found.\n\nInitialization Process:\n - Upon instantiation, the singleton instance attempts to load a configuration file in this order:\n 1. The directory specified by the `EOS_CONFIG_DIR` environment variable\n 2. The directory specified by the `EOS_DIR` environment variable.\n 3. A platform specific default directory for EOS.\n 4. The current working directory.\n - The first available configuration file found in these directories is loaded.\n - If no configuration file is found, a default configuration file is created in the platform\n specific default directory, and default settings are loaded into it.\n\nAttributes from the loaded configuration are accessible directly as instance attributes of\n`ConfigEOS`, providing a centralized, shared configuration object for EOS.\n\nSingleton Behavior:\n - This class uses the `SingletonMixin` to ensure that all requests for `ConfigEOS` return\n the same instance, which contains the most up-to-date configuration. Modifying the configuration\n in one part of the application reflects across all references to this class.\n\nAttributes:\n config_folder_path (Optional[Path]): Path to the configuration directory.\n config_file_path (Optional[Path]): Path to the configuration file.\n\nRaises:\n FileNotFoundError: If no configuration file is found, and creating a default configuration fails.\n\nExample:\n To initialize and access configuration attributes (only one instance is created):\n ```python\n config_eos = ConfigEOS() # Always returns the same instance\n print(config_eos.prediction.hours) # Access a setting from the loaded configuration\n ```"
"description": "Singleton configuration handler for the EOS application.\n\nConfigEOS extends `SettingsEOS` with support for default configuration paths and automatic\ninitialization.\n\n`ConfigEOS` ensures that only one instance of the class is created throughout the application,\nallowing consistent access to EOS configuration settings. This singleton instance loads\nconfiguration data from a predefined set of directories or creates a default configuration if\nnone is found.\n\nInitialization Process:\n - Upon instantiation, the singleton instance attempts to load a configuration file in this order:\n 1. The directory specified by the `EOS_CONFIG_DIR` environment variable\n 2. The directory specified by the `EOS_DIR` environment variable.\n 3. A platform specific default directory for EOS.\n 4. The current working directory.\n - The first available configuration file found in these directories is loaded.\n - If no configuration file is found, a default configuration file is created in the platform\n specific default directory, and default settings are loaded into it.\n\nAttributes from the loaded configuration are accessible directly as instance attributes of\n`ConfigEOS`, providing a centralized, shared configuration object for EOS.\n\nSingleton Behavior:\n - This class uses the `SingletonMixin` to ensure that all requests for `ConfigEOS` return\n the same instance, which contains the most up-to-date configuration. Modifying the configuration\n in one part of the application reflects across all references to this class.\n\nAttributes:\n config_folder_path (Optional[Path]): Path to the configuration directory.\n config_file_path (Optional[Path]): Path to the configuration file.\n\nRaises:\n FileNotFoundError: If no configuration file is found, and creating a default configuration fails.\n\nExample:\n To initialize and access configuration attributes (only one instance is created):\n .. code-block:: python\n\n config_eos = ConfigEOS() # Always returns the same instance\n print(config_eos.prediction.hours) # Access a setting from the loaded configuration"
},
"DDBCActuatorStatus": {
"properties": {
@@ -2975,27 +2978,6 @@
"title": "DevicesCommonSettings",
"description": "Base configuration for devices simulation settings."
},
"ElecPriceCommonProviderSettings": {
"properties": {
"ElecPriceImport": {
"anyOf": [
{
"$ref": "#/components/schemas/ElecPriceImportCommonSettings"
},
{
"type": "null"
}
],
"description": "ElecPriceImport settings",
"examples": [
null
]
}
},
"type": "object",
"title": "ElecPriceCommonProviderSettings",
"description": "Electricity Price Prediction Provider Configuration."
},
"ElecPriceCommonSettings-Input": {
"properties": {
"provider": {
@@ -3046,12 +3028,13 @@
1.19
]
},
"provider_settings": {
"$ref": "#/components/schemas/ElecPriceCommonProviderSettings",
"description": "Provider settings",
"examples": [
{}
]
"elecpriceimport": {
"$ref": "#/components/schemas/ElecPriceImportCommonSettings",
"description": "Import provider settings."
},
"energycharts": {
"$ref": "#/components/schemas/ElecPriceEnergyChartsCommonSettings",
"description": "Energy Charts provider settings."
}
},
"type": "object",
@@ -3108,18 +3091,34 @@
1.19
]
},
"provider_settings": {
"$ref": "#/components/schemas/ElecPriceCommonProviderSettings",
"description": "Provider settings",
"examples": [
{}
]
"elecpriceimport": {
"$ref": "#/components/schemas/ElecPriceImportCommonSettings",
"description": "Import provider settings."
},
"energycharts": {
"$ref": "#/components/schemas/ElecPriceEnergyChartsCommonSettings",
"description": "Energy Charts provider settings."
}
},
"type": "object",
"title": "ElecPriceCommonSettings",
"description": "Electricity Price Prediction Configuration."
},
"ElecPriceEnergyChartsCommonSettings": {
"properties": {
"bidding_zone": {
"$ref": "#/components/schemas/EnergyChartsBiddingZones",
"description": "Bidding Zone: 'AT', 'BE', 'CH', 'CZ', 'DE-LU', 'DE-AT-LU', 'DK1', 'DK2', 'FR', 'HU', 'IT-NORTH', 'NL', 'NO2', 'PL', 'SE4' or 'SI'",
"default": "DE-LU",
"examples": [
"AT"
]
}
},
"type": "object",
"title": "ElecPriceEnergyChartsCommonSettings",
"description": "Common settings for Energy Charts electricity price provider."
},
"ElecPriceImportCommonSettings": {
"properties": {
"import_file_path": {
@@ -3375,6 +3374,29 @@
"title": "ElectricVehicleResult",
"description": "Result class containing information related to the electric vehicle's charging and discharging behavior."
},
"EnergyChartsBiddingZones": {
"type": "string",
"enum": [
"AT",
"BE",
"CH",
"CZ",
"DE-LU",
"DE-AT-LU",
"DK1",
"DK2",
"FR",
"HU",
"IT-NORTH",
"NL",
"NO2",
"PL",
"SE4",
"SI"
],
"title": "EnergyChartsBiddingZones",
"description": "Energy Charts Bidding Zones."
},
"EnergyManagementCommonSettings": {
"properties": {
"startup_delay": {
@@ -4062,7 +4084,7 @@
"type": "string",
"title": "Version",
"description": "Configuration file version. Used to check compatibility.",
"default": "0.2.0+dev"
"default": "0.2.0+dev.4dbc2d"
},
"data_folder_path": {
"anyOf": [
@@ -4136,7 +4158,7 @@
"type": "string",
"title": "Version",
"description": "Configuration file version. Used to check compatibility.",
"default": "0.2.0+dev"
"default": "0.2.0+dev.4dbc2d"
},
"data_folder_path": {
"anyOf": [
@@ -7153,7 +7175,7 @@
},
"type": "object",
"title": "PydanticDateTimeData",
"description": "Pydantic model for time series data with consistent value lengths.\n\nThis model validates a dictionary where:\n- Keys are strings representing data series names\n- Values are lists of numeric or string values\n- Special keys 'start_datetime' and 'interval' can contain string values\nfor time series indexing\n- All value lists must have the same length\n\nExample:\n {\n \"start_datetime\": \"2024-01-01 00:00:00\", # optional\n \"interval\": \"1 Hour\", # optional\n \"loadforecast_power_w\": [20.5, 21.0, 22.1],\n \"load_min\": [18.5, 19.0, 20.1]\n }"
"description": "Pydantic model for time series data with consistent value lengths.\n\nThis model validates a dictionary where:\n- Keys are strings representing data series names\n- Values are lists of numeric or string values\n- Special keys 'start_datetime' and 'interval' can contain string values\nfor time series indexing\n- All value lists must have the same length\n\nExample:\n .. code-block:: python\n\n {\n \"start_datetime\": \"2024-01-01 00:00:00\", # optional\n \"interval\": \"1 Hour\", # optional\n \"loadforecast_power_w\": [20.5, 21.0, 22.1],\n \"load_min\": [18.5, 19.0, 20.1]\n }"
},
"PydanticDateTimeDataFrame": {
"properties": {

View File

@@ -1,6 +1,6 @@
[project]
name = "akkudoktor-eos"
version = "0.2.0+dev"
dynamic = ["version"] # Get version information dynamically
authors = [
{ name="Andreas Schmitz", email="author@example.com" },
]
@@ -25,6 +25,8 @@ build-backend = "setuptools.build_meta"
[tool.setuptools.dynamic]
dependencies = {file = ["requirements.txt"]}
optional-dependencies = {dev = { file = ["requirements-dev.txt"] }}
# version.txt must be generated
version = { file = "version.txt" }
[tool.setuptools.packages.find]
where = ["src/"]
@@ -109,29 +111,10 @@ module = "xprocess.*"
ignore_missing_imports = true
[tool.commitizen]
# Only used as linter
name = "cz_conventional_commits"
version_scheme = "semver"
version = "0.2.0+dev" # <-- Set your current version here
tag_format = "v$version"
# Files to automatically update when bumping version
update_changelog_on_bump = true
changelog_incremental = true
annotated_tag = true
bump_message = "chore(release): $current_version → $new_version"
# Branch validation settings
# Enforce commit message and branch style:
branch_validation = true
branch_pattern = "^(feat|fix|chore|docs|refactor|test)/[a-z0-9._-]+$"
# Customize changelog generation
[tool.commitizen.changelog]
path = "CHANGELOG.md"
template = "keepachangelog"
# If your version is stored in multiple files (Python modules, docs etc.), add them here
[tool.commitizen.files]
version = [
"pyproject.toml", # Auto-update project version
"src/akkudoktoreos/core/version.py",
"src/akkudoktoreos/data/default.config.json"
]

View File

@@ -7,12 +7,16 @@
# - mypy (mirrors-mypy) - sync with requirements-dev.txt (if on pypi)
# - pymarkdown
# - commitizen - sync with requirements-dev.txt (if on pypi)
pre-commit==4.3.0
#
# !!! Sync .pre-commit-config.yaml and requirements-dev.txt !!!
pre-commit==4.5.0
mypy==1.18.2
types-requests==2.32.4.20250913 # for mypy
pandas-stubs==2.3.2.250926 # for mypy
tokenize-rt==6.2.0 # for mypy
commitizen==4.9.1
types-docutils==0.22.3.20251115 # for mypy
types-PyYAML==6.0.12.20250915 # for mypy
commitizen==4.10.0
deprecated==1.3.1 # for commitizen
# Sphinx
@@ -23,7 +27,7 @@ GitPython==3.1.45
myst-parser==4.0.1
# Pytest
pytest==9.0.0
pytest==9.0.1
pytest-cov==7.0.0
coverage==7.11.3
coverage==7.12.0
pytest-xprocess==1.0.2

View File

@@ -1,14 +1,14 @@
babel==2.17.0
beautifulsoup4==4.14.2
cachebox==5.1.0
numpy==2.3.4
numpy==2.3.5
numpydantic==1.7.0
matplotlib==3.10.7
contourpy==1.3.3
fastapi[standard-no-fastapi-cloud-cli]==0.121.1
fastapi[standard-no-fastapi-cloud-cli]==0.121.3
fastapi_cli==0.0.16
rich-toolkit==0.15.1
python-fasthtml==0.12.33
rich-toolkit==0.16.0
python-fasthtml==0.12.35
MonsterUI==1.0.32
markdown-it-py==3.0.0
mdit-py-plugins==0.5.0

View File

@@ -0,0 +1,70 @@
#!/usr/bin/env python3
"""
Update VERSION_BASE in version.py after a release tag.
Behavior:
- Read VERSION_BASE from version.py
- Strip ANY existing "+dev" suffix
- Append exactly one "+dev"
- Write back the updated file
This ensures:
0.2.0 --> 0.2.0+dev
0.2.0+dev --> 0.2.0+dev
0.2.0+dev+dev --> 0.2.0+dev
"""
import re
import sys
from pathlib import Path
ROOT = Path(__file__).resolve().parent.parent
VERSION_FILE = ROOT / "src" / "akkudoktoreos" / "core" / "version.py"
def bump_dev_version_file(file: Path) -> str:
text = file.read_text(encoding="utf-8")
# Extract current version
m = re.search(r'^VERSION_BASE\s*=\s*["\']([^"\']+)["\']',
text, flags=re.MULTILINE)
if not m:
raise ValueError("VERSION_BASE not found")
base_version = m.group(1)
# Remove trailing +dev if present → ensure idempotency
cleaned = re.sub(r'(\+dev)+$', '', base_version)
# Append +dev
new_version = f"{cleaned}+dev"
# Replace inside file content
new_text = re.sub(
r'^VERSION_BASE\s*=\s*["\']([^"\']+)["\']',
f'VERSION_BASE = "{new_version}"',
text,
flags=re.MULTILINE
)
file.write_text(new_text, encoding="utf-8")
return new_version
def main():
# Use CLI argument or fallback default path
version_file = Path(sys.argv[1]) if len(sys.argv) > 1 else VERSION_FILE
try:
new_version = bump_dev_version_file(version_file)
except Exception as e:
print(f"Error: {e}", file=sys.stderr)
sys.exit(1)
# MUST print to stdout
print(new_version)
if __name__ == "__main__":
main()
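The suffix normalization at the heart of this script can be exercised on its own; the sketch below reuses the same regex, independent of the file I/O:

```python
import re

def normalize_dev_suffix(version: str) -> str:
    """Strip any run of trailing '+dev' suffixes, then append exactly one."""
    cleaned = re.sub(r"(\+dev)+$", "", version)
    return f"{cleaned}+dev"

print(normalize_dev_suffix("0.2.0"))          # 0.2.0+dev
print(normalize_dev_suffix("0.2.0+dev"))      # 0.2.0+dev
print(normalize_dev_suffix("0.2.0+dev+dev"))  # 0.2.0+dev
```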


@@ -1,170 +0,0 @@
"""Update version strings in multiple project files only if the old version matches.
This script updates version information in:
- pyproject.toml
- src/akkudoktoreos/core/version.py
- src/akkudoktoreos/data/default.config.json
- Makefile
Supported version formats:
- __version__ = "<version>"
- version = "<version>"
- "version": "<version>"
- VERSION ?= <version>
It will:
- Replace VERSION → NEW_VERSION if the old version is found.
- Report which files were updated.
- Report which files contained mismatched versions.
- Report which files had no version.
Usage:
python bump_version.py VERSION NEW_VERSION
Args:
VERSION (str): Version expected before replacement.
NEW_VERSION (str): Version to write.
"""
#!/usr/bin/env python3
import argparse
import glob
import os
import re
import shutil
from pathlib import Path
from typing import List, Tuple
# Patterns to match version strings
VERSION_PATTERNS = [
re.compile(r'(__version__\s*=\s*")(?P<ver>[^"]+)(")'),
re.compile(r'(version\s*=\s*")(?P<ver>[^"]+)(")'),
re.compile(r'("version"\s*:\s*")(?P<ver>[^"]+)(")'),
re.compile(r'(VERSION\s*\?=\s*)(?P<ver>[^\s]+)'), # For Makefile: VERSION ?= 0.2.0
]
# Default files to process
DEFAULT_FILES = [
"pyproject.toml",
"src/akkudoktoreos/core/version.py",
"src/akkudoktoreos/data/default.config.json",
"Makefile",
]
def backup_file(file_path: str) -> str:
"""Create a backup of the given file with a .bak suffix.
Args:
file_path: Path to the file to backup.
Returns:
Path to the backup file.
"""
backup_path = f"{file_path}.bak"
shutil.copy2(file_path, backup_path)
return backup_path
def replace_version_in_file(
file_path: Path, old_version: str, new_version: str, dry_run: bool = False
) -> Tuple[bool, bool]:
"""
Replace old_version with new_version in the given file if it matches.
Args:
file_path: Path to the file to modify.
old_version: The old version to replace.
new_version: The new version to set.
dry_run: If True, don't actually modify files.
Returns:
Tuple[bool, bool]: (file_would_be_updated, old_version_found)
"""
content = file_path.read_text()
new_content = content
old_version_found = False
file_would_be_updated = False
for pattern in VERSION_PATTERNS:
def repl(match):
nonlocal old_version_found, file_would_be_updated
ver = match.group("ver")
if ver == old_version:
old_version_found = True
file_would_be_updated = True
# Some patterns have 3 groups (like quotes)
if len(match.groups()) == 3:
return f"{match.group(1)}{new_version}{match.group(3)}"
else:
return f"{match.group(1)}{new_version}"
return match.group(0)
new_content = pattern.sub(repl, new_content)
if file_would_be_updated:
if dry_run:
print(f"[DRY-RUN] Would update {file_path}")
else:
backup_path = file_path.with_suffix(file_path.suffix + ".bak")
shutil.copy(file_path, backup_path)
file_path.write_text(new_content)
print(f"Updated {file_path} (backup saved to {backup_path})")
elif not old_version_found:
print(f"[SKIP] {file_path}: old version '{old_version}' not found")
return file_would_be_updated, old_version_found
def main():
parser = argparse.ArgumentParser(description="Bump version across project files.")
parser.add_argument("old_version", help="Old version to replace")
parser.add_argument("new_version", help="New version to set")
parser.add_argument(
"--dry-run", action="store_true", help="Show what would be changed without modifying files"
)
parser.add_argument(
"--glob", nargs="*", help="Optional glob patterns to include additional files"
)
args = parser.parse_args()
updated_files = []
not_found_files = []
# Determine files to update
files_to_update: List[Path] = [Path(f) for f in DEFAULT_FILES]
if args.glob:
for pattern in args.glob:
files_to_update.extend(Path(".").glob(pattern))
files_to_update = list(dict.fromkeys(files_to_update)) # remove duplicates
any_updated = False
for file_path in files_to_update:
if file_path.exists() and file_path.is_file():
updated, _ = replace_version_in_file(
file_path, args.old_version, args.new_version, args.dry_run
)
any_updated |= updated
if updated:
updated_files.append(file_path)
else:
print(f"[SKIP] {file_path}: file does not exist")
not_found_files.append(file_path)
print("\nSummary:")
if updated_files:
print(f"Updated files ({len(updated_files)}):")
for f in updated_files:
print(f" {f}")
else:
print("No files were updated.")
if not_found_files:
print(f"Files where old version was not found ({len(not_found_files)}):")
for f in not_found_files:
print(f" {f}")
if __name__ == "__main__":
main()


@@ -8,7 +8,7 @@ import re
import sys
import textwrap
from pathlib import Path
from typing import Any, Type, Union
from typing import Any, Optional, Type, Union, get_args
from loguru import logger
from pydantic.fields import ComputedFieldInfo, FieldInfo
@@ -24,13 +24,29 @@ undocumented_types: dict[PydanticBaseModel, tuple[str, list[str]]] = dict()
global_config_dict: dict[str, Any] = dict()
def get_title(config: PydanticBaseModel) -> str:
def get_model_class_from_annotation(field_type: Any) -> type[PydanticBaseModel] | None:
"""Given a type annotation (possibly Optional or Union), return the first Pydantic model class."""
origin = getattr(field_type, "__origin__", None)
if origin is Union:
# unwrap Union/Optional
for arg in get_args(field_type):
cls = get_model_class_from_annotation(arg)
if cls is not None:
return cls
return None
elif isinstance(field_type, type) and issubclass(field_type, PydanticBaseModel):
return field_type
else:
return None
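The unwrapping logic can be sketched with `typing.get_origin`/`get_args` (equivalent to the `__origin__` check above); `Model` here is a stand-in for the project's `PydanticBaseModel`:

```python
from typing import Optional, Union, get_args, get_origin

class Model:  # stand-in for the project's PydanticBaseModel
    pass

def first_model_class(annotation):
    """Return the first Model subclass inside a (possibly Optional/Union) annotation."""
    if get_origin(annotation) is Union:
        # Unwrap Union/Optional and recurse into each member
        for arg in get_args(annotation):
            cls = first_model_class(arg)
            if cls is not None:
                return cls
        return None
    if isinstance(annotation, type) and issubclass(annotation, Model):
        return annotation
    return None

print(first_model_class(Optional[Model]) is Model)  # True
print(first_model_class(int))                       # None
```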
def get_title(config: type[PydanticBaseModel]) -> str:
if config.__doc__ is None:
raise NameError(f"Missing docstring: {config}")
return config.__doc__.strip().splitlines()[0].strip(".")
def get_body(config: PydanticBaseModel) -> str:
def get_body(config: type[PydanticBaseModel]) -> str:
if config.__doc__ is None:
raise NameError(f"Missing docstring: {config}")
return textwrap.dedent("\n".join(config.__doc__.strip().splitlines()[1:])).strip()
@@ -124,7 +140,7 @@ def get_model_structure_from_examples(
def create_model_from_examples(
model_class: PydanticBaseModel, multiple: bool
model_class: type[PydanticBaseModel], multiple: bool
) -> list[PydanticBaseModel]:
"""Create a model instance with default or example values, respecting constraints."""
return [
@@ -163,7 +179,7 @@ def get_type_name(field_type: type) -> str:
def generate_config_table_md(
config: PydanticBaseModel,
config: type[PydanticBaseModel],
toplevel_keys: list[str],
prefix: str,
toplevel: bool = False,
@@ -199,22 +215,28 @@ def generate_config_table_md(
table += "\n\n"
table += (
"<!-- pyml disable line-length -->\n"
":::{table} "
+ f"{'::'.join(toplevel_keys)}\n:widths: 10 {env_width}10 5 5 30\n:align: left\n\n"
)
table += f"| Name {env_header}| Type | Read-Only | Default | Description |\n"
table += f"| ---- {env_header_underline}| ---- | --------- | ------- | ----------- |\n"
for field_name, field_info in list(config.model_fields.items()) + list(
config.model_computed_fields.items()
):
fields = {}
for field_name, field_info in config.model_fields.items():
fields[field_name] = field_info
for field_name, field_info in config.model_computed_fields.items():
fields[field_name] = field_info
for field_name in sorted(fields.keys()):
field_info = fields[field_name]
regular_field = isinstance(field_info, FieldInfo)
config_name = field_name if extra_config else field_name.upper()
field_type = field_info.annotation if regular_field else field_info.return_type
default_value = get_default_value(field_info, regular_field)
description = field_info.description if field_info.description else "-"
deprecated = field_info.deprecated if field_info.deprecated else None
description = config.field_description(field_name)
deprecated = config.field_deprecated(field_name)
read_only = "rw" if regular_field else "ro"
type_name = get_type_name(field_type)
@@ -270,7 +292,7 @@ def generate_config_table_md(
undocumented_types.setdefault(new_type, (info[0], info[1]))
if toplevel:
table += ":::\n\n" # Add an empty line after the table
table += ":::\n<!-- pyml enable line-length -->\n\n" # Add an empty line after the table
has_examples_list = toplevel_keys[-1] == "list"
instance_list = create_model_from_examples(config, has_examples_list)
@@ -288,9 +310,13 @@ def generate_config_table_md(
same_output = ins_out_dict_list == ins_dict_list
same_output_str = "/Output" if same_output else ""
table += f"#{heading_level} Example Input{same_output_str}\n\n"
table += "```{eval-rst}\n"
table += ".. code-block:: json\n\n"
# -- code block heading
table += "<!-- pyml disable no-emphasis-as-heading -->\n"
table += f"**Example Input{same_output_str}**\n"
table += "<!-- pyml enable no-emphasis-as-heading -->\n\n"
# -- code block
table += "<!-- pyml disable line-length -->\n"
table += "```json\n"
if has_examples_list:
input_dict = build_nested_structure(toplevel_keys[:-1], ins_dict_list)
if not extra_config:
@@ -300,20 +326,24 @@ def generate_config_table_md(
if not extra_config:
global_config_dict[toplevel_keys[0]] = ins_dict_list[0]
table += textwrap.indent(json.dumps(input_dict, indent=4), " ")
table += "\n"
table += "```\n\n"
table += "\n```\n<!-- pyml enable line-length -->\n\n"
# -- end code block
if not same_output:
table += f"#{heading_level} Example Output\n\n"
table += "```{eval-rst}\n"
table += ".. code-block:: json\n\n"
# -- code block heading
table += "<!-- pyml disable no-emphasis-as-heading -->\n"
table += f"**Example Output**\n"
table += "<!-- pyml enable no-emphasis-as-heading -->\n\n"
# -- code block
table += "<!-- pyml disable line-length -->\n"
table += "```json\n"
if has_examples_list:
output_dict = build_nested_structure(toplevel_keys[:-1], ins_out_dict_list)
else:
output_dict = build_nested_structure(toplevel_keys, ins_out_dict_list[0])
table += textwrap.indent(json.dumps(output_dict, indent=4), " ")
table += "\n"
table += "```\n\n"
table += "\n```\n<!-- pyml enable line-length -->\n\n"
# -- end code block
while undocumented_types:
extra_config_type, extra_info = undocumented_types.popitem()
@@ -325,7 +355,7 @@ def generate_config_table_md(
return table
def generate_config_md(config_eos: ConfigEOS) -> str:
def generate_config_md(file_path: Optional[Union[str, Path]], config_eos: ConfigEOS) -> str:
"""Generate configuration specification in Markdown with extra tables for prefixed values.
Returns:
@@ -337,44 +367,103 @@ def generate_config_md(config_eos: ConfigEOS) -> str:
)
GeneralSettings._config_folder_path = config_eos.general.config_file_path.parent
markdown = "# Configuration Table\n\n"
markdown = ""
# Generate tables for each top level config
for field_name, field_info in config_eos.__class__.model_fields.items():
field_type = field_info.annotation
markdown += generate_config_table_md(
field_type, [field_name], f"EOS_{field_name.upper()}__", True
if file_path:
file_path = Path(file_path)
# -- table of content
markdown += "```{toctree}\n"
markdown += ":maxdepth: 1\n"
markdown += ":caption: Configuration Table\n\n"
else:
markdown += "# Configuration Table\n\n"
markdown += (
"The configuration table describes all the configuration options of Akkudoktor-EOS\n\n"
)
# Generate tables for each top level config
for field_name in sorted(config_eos.__class__.model_fields.keys()):
field_info = config_eos.__class__.model_fields[field_name]
field_type = field_info.annotation
model_class = get_model_class_from_annotation(field_type)
if model_class is None:
raise ValueError(f"Can not find class of top level field {field_name}.")
table = generate_config_table_md(
model_class, [field_name], f"EOS_{field_name.upper()}__", True
)
if file_path:
# Write table to extra document
table_path = file_path.with_name(file_path.stem + f"{field_name.lower()}.md")
write_to_file(table_path, table)
markdown += f"../_generated/{table_path.name}\n"
else:
# We will write to stdout
markdown += "---\n\n"
markdown += table
# Generate full example
example = ""
# Full config
markdown += "## Full example Config\n\n"
markdown += "```{eval-rst}\n"
markdown += ".. code-block:: json\n\n"
example += "## Full example Config\n\n"
# -- code block
example += "<!-- pyml disable line-length -->\n"
example += "```json\n"
# Test for valid config first
config_eos.merge_settings_from_dict(global_config_dict)
markdown += textwrap.indent(json.dumps(global_config_dict, indent=4), " ")
markdown += "\n"
markdown += "```\n\n"
example += textwrap.indent(json.dumps(global_config_dict, indent=4), " ")
example += "\n"
example += "```\n<!-- pyml enable line-length -->\n\n"
# -- end code block end
if file_path:
example_path = file_path.with_name(file_path.stem + f"example.md")
write_to_file(example_path, example)
markdown += f"../_generated/{example_path.name}\n"
markdown += "```\n\n"
# -- end table of content
else:
markdown += "---\n\n"
markdown += example
# Assure there is no double \n at end of file
markdown = markdown.rstrip("\n")
markdown += "\n"
markdown += "\nAuto generated from source code.\n"
# Write markdown to file or stdout
write_to_file(file_path, markdown)
return markdown
def write_to_file(file_path: Optional[Union[str, Path]], config_md: str):
if os.name == "nt":
config_md = config_md.replace("\\\\", "/")
# Assure log path does not leak to documentation
markdown = re.sub(
config_md = re.sub(
r'(?<=["\'])/[^"\']*/output/eos\.log(?=["\'])',
'/home/user/.local/share/net.akkudoktoreos.net/output/eos.log',
markdown
'/home/user/.local/share/net.akkudoktor.eos/output/eos.log',
config_md
)
# Assure timezone name does not leak to documentation
tz_name = to_datetime().timezone_name
markdown = re.sub(re.escape(tz_name), "Europe/Berlin", markdown, flags=re.IGNORECASE)
config_md = re.sub(re.escape(tz_name), "Europe/Berlin", config_md, flags=re.IGNORECASE)
# Also replace UTC, as GitHub CI always is on UTC
markdown = re.sub(re.escape("UTC"), "Europe/Berlin", markdown, flags=re.IGNORECASE)
config_md = re.sub(re.escape("UTC"), "Europe/Berlin", config_md, flags=re.IGNORECASE)
# Assure no extra lines at end of file
config_md = config_md.rstrip("\n")
config_md += "\n"
return markdown
if file_path:
# Write to file
with open(Path(file_path), "w", encoding="utf-8", newline="\n") as f:
f.write(config_md)
else:
# Write to std output
print(config_md)
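The lookbehind/lookahead in the log-path substitution above keep the surrounding quote characters untouched; a minimal demonstration with a sample path:

```python
import re

sample = '{"file_path": "/home/ci/work/output/eos.log"}'
sanitized = re.sub(
    r'(?<=["\'])/[^"\']*/output/eos\.log(?=["\'])',
    "/home/user/.local/share/net.akkudoktor.eos/output/eos.log",
    sample,
)
print(sanitized)
# {"file_path": "/home/user/.local/share/net.akkudoktor.eos/output/eos.log"}
```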
def main():
@@ -384,23 +473,14 @@ def main():
"--output-file",
type=str,
default=None,
help="File to write the Configuration Specification to",
help="File to write the top level configuration specification to.",
)
args = parser.parse_args()
config_eos = get_config()
try:
config_md = generate_config_md(config_eos)
if os.name == "nt":
config_md = config_md.replace("\\\\", "/")
if args.output_file:
# Write to file
with open(args.output_file, "w", encoding="utf-8", newline="\n") as f:
f.write(config_md)
else:
# Write to std output
print(config_md)
config_md = generate_config_md(args.output_file, config_eos)
except Exception as e:
print(f"Error during Configuration Specification generation: {e}", file=sys.stderr)


@@ -194,6 +194,8 @@ def format_endpoint(path: str, method: str, details: dict, devel: bool = False)
markdown = f"## {method.upper()} {path}\n\n"
# -- links
markdown += "<!-- pyml disable line-length -->\n"
markdown += f"**Links**: {local_path}, {akkudoktoreos_main_path}"
if devel:
# Add link to akkudoktor branch the development has used
@@ -206,7 +208,8 @@ def format_endpoint(path: str, method: str, details: dict, devel: bool = False)
+ link_method
)
markdown += f", {akkudoktoreos_base_path}"
markdown += "\n\n"
markdown += "\n<!-- pyml enable line-length -->\n\n"
# -- links end
summary = details.get("summary", None)
if summary:
@@ -214,9 +217,14 @@ def format_endpoint(path: str, method: str, details: dict, devel: bool = False)
description = details.get("description", None)
if description:
markdown += "```\n"
markdown += f"{description}"
markdown += "\n```\n\n"
# -- code block
markdown += "<!-- pyml disable line-length -->\n"
markdown += "```python\n"
markdown += '"""\n'
markdown += f"{description}\n"
markdown += '"""\n'
markdown += "```\n<!-- pyml enable line-length -->\n\n"
# -- end code block end
markdown += format_parameters(details.get("parameters", []))
markdown += format_request_body(details.get("requestBody", {}).get("content", {}))
@@ -239,7 +247,11 @@ def openapi_to_markdown(openapi_json: dict, devel: bool = False) -> str:
info = extract_info(openapi_json)
markdown = f"# {info['title']}\n\n"
markdown += f"**Version**: `{info['version']}`\n\n"
markdown += f"**Description**: {info['description']}\n\n"
# -- description
markdown += "<!-- pyml disable line-length -->\n"
markdown += f"**Description**: {info['description']}\n"
markdown += "<!-- pyml enable line-length -->\n\n"
# -- end description
markdown += f"**Base URL**: `{info['base_url']}`\n\n"
security_schemes = openapi_json.get("components", {}).get("securitySchemes", {})
@@ -257,6 +269,8 @@ def openapi_to_markdown(openapi_json: dict, devel: bool = False) -> str:
markdown = markdown.rstrip("\n")
markdown += "\n"
markdown += "\nAuto generated from openapi.json.\n"
return markdown

scripts/get_version.py Normal file

@@ -0,0 +1,15 @@
#!.venv/bin/python
"""Get version of EOS"""
import sys
from pathlib import Path
# Add the src directory to sys.path so Sphinx can import akkudoktoreos
PROJECT_ROOT = Path(__file__).parent.parent
SRC_DIR = PROJECT_ROOT / "src"
sys.path.insert(0, str(SRC_DIR))
from akkudoktoreos.core.version import __version__
if __name__ == "__main__":
print(__version__)

scripts/update_version.py Normal file

@@ -0,0 +1,113 @@
#!.venv/bin/python
"""General version replacement script.
Usage:
python scripts/update_version.py <version> <file1> [file2 ...]
"""
#!/usr/bin/env python3
import re
import sys
from pathlib import Path
from typing import List
# --- Patterns to match version strings ---
VERSION_PATTERNS = [
# Python: __version__ = "1.2.3"
re.compile(
r'(?<![A-Za-z0-9])(__version__\s*=\s*")'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
r'(")'
),
# Python: version = "1.2.3"
re.compile(
r'(?<![A-Za-z0-9])(version\s*=\s*")'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
r'(")'
),
# JSON: "version": "1.2.3"
re.compile(
r'(?<![A-Za-z0-9])("version"\s*:\s*")'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
r'(")'
),
# Makefile-style: VERSION ?= 1.2.3
re.compile(
r'(?<![A-Za-z0-9])(VERSION\s*\?=\s*)'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
),
# YAML: version: "1.2.3"
re.compile(
r'(?m)^(version\s*:\s*["\']?)'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
r'(["\']?)\s*$'
),
]
def update_version_in_file(file_path: Path, new_version: str) -> bool:
"""
Replace version strings in a file based on VERSION_PATTERNS.
Returns True if the file was updated.
"""
content = file_path.read_text()
new_content = content
file_would_be_updated = False
for pattern in VERSION_PATTERNS:
def repl(match):
nonlocal file_would_be_updated
ver = match.group("ver")
if ver != new_version:
file_would_be_updated = True
# Three-group patterns (__version__, JSON, YAML)
if len(match.groups()) == 3:
return f"{match.group(1)}{new_version}{match.group(3)}"
# Two-group patterns (Makefile)
return f"{match.group(1)}{new_version}"
return match.group(0)
new_content = pattern.sub(repl, new_content)
if file_would_be_updated:
file_path.write_text(new_content)
return file_would_be_updated
def main(version: str, files: List[str]):
if not version:
raise ValueError("No version provided")
if not files:
raise ValueError("No files provided")
updated_files = []
for f in files:
path = Path(f)
if not path.exists():
print(f"Warning: {path} does not exist, skipping")
continue
if update_version_in_file(path, version):
updated_files.append(str(path))
if updated_files:
print(f"Updated files: {', '.join(updated_files)}")
else:
print("No files updated.")
if __name__ == "__main__":
if len(sys.argv) < 3:
print("Usage: python update_version.py <version> <file1> [file2 ...]")
sys.exit(1)
version_arg = sys.argv[1]
files_arg = sys.argv[2:]
main(version_arg, files_arg)
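The patterns can be tried in isolation; here the Makefile-style pattern (same shape as in `VERSION_PATTERNS` above) rewrites a sample line:

```python
import re

# Same shape as the Makefile-style pattern in VERSION_PATTERNS
MAKEFILE_PATTERN = re.compile(
    r"(?<![A-Za-z0-9])(VERSION\s*\?=\s*)"
    r"(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)"
)

line = "VERSION ?= 0.2.0+dev"
# Two-group pattern: keep group 1 (the 'VERSION ?= ' prefix), swap the version
updated = MAKEFILE_PATTERN.sub(lambda m: m.group(1) + "0.3.0", line)
print(updated)  # VERSION ?= 0.3.0
```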


@@ -11,7 +11,7 @@ Key features:
import json
import os
import shutil
import tempfile
from pathlib import Path
from typing import Any, ClassVar, Optional, Type
@@ -154,7 +154,7 @@ class GeneralSettings(SettingsBaseModel):
if v not in cls.compatible_versions:
error = (
f"Incompatible configuration version '{v}'. "
f"Expected one of: {', '.join(cls.compatible_versions)}."
f"Expected: {', '.join(cls.compatible_versions)}."
)
logger.error(error)
raise ValueError(error)
@@ -287,10 +287,10 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
Example:
To initialize and access configuration attributes (only one instance is created):
```python
config_eos = ConfigEOS() # Always returns the same instance
print(config_eos.prediction.hours) # Access a setting from the loaded configuration
```
.. code-block:: python
config_eos = ConfigEOS() # Always returns the same instance
print(config_eos.prediction.hours) # Access a setting from the loaded configuration
"""
@@ -335,32 +335,44 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
file_secret_settings (pydantic_settings.PydanticBaseSettingsSource): Unused (needed for parent class interface).
Returns:
tuple[pydantic_settings.PydanticBaseSettingsSource, ...]: A tuple of settings sources in the order they should be applied.
tuple[pydantic_settings.PydanticBaseSettingsSource, ...]: A tuple of settings sources in the order they should be applied.
Behavior:
1. Checks for the existence of a JSON configuration file in the expected location.
2. If the configuration file does not exist, creates the directory (if needed) and attempts to copy a
default configuration file to the location. If the copy fails, uses the default configuration file directly.
3. Creates a `pydantic_settings.JsonConfigSettingsSource` for both the configuration file and the default configuration file.
2. If the configuration file does not exist, creates the directory (if needed) and
attempts to create a default configuration file in the location. If the creation
fails, a temporary configuration directory is used.
3. Creates a `pydantic_settings.JsonConfigSettingsSource` for the configuration
file.
4. Updates class attributes `GeneralSettings._config_folder_path` and
`GeneralSettings._config_file_path` to reflect the determined paths.
5. Returns a tuple containing all provided and newly created settings sources in the desired order.
5. Returns a tuple containing all provided and newly created settings sources in
the desired order.
Notes:
- This method logs a warning if the default configuration file cannot be copied.
- It ensures that a fallback to the default configuration file is always possible.
- This method logs an error if the default configuration file in the normal
configuration directory cannot be created.
- It ensures that a fallback to a default configuration file is always possible.
"""
# Ensure we know and have the config folder path and the config file
config_file, exists = cls._get_config_file_path()
config_dir = config_file.parent
if not exists:
config_dir.mkdir(parents=True, exist_ok=True)
# Create minimum config file
config_minimum_content = '{ "general": { "version": "' + __version__ + '" } }'
try:
shutil.copy2(cls.config_default_file_path, config_file)
config_file.write_text(config_minimum_content, encoding="utf-8")
except Exception as exc:
logger.warning(f"Could not copy default config: {exc}. Using default config...")
config_file = cls.config_default_file_path
config_dir = config_file.parent
# Create minimum config in temporary config directory as last resort
error_msg = f"Could not create minimum config file in {config_dir}: {exc}"
logger.error(error_msg)
temp_dir = Path(tempfile.mkdtemp())
info_msg = f"Using temporary config directory {temp_dir}"
logger.info(info_msg)
config_dir = temp_dir
config_file = temp_dir / config_file.name
config_file.write_text(config_minimum_content, encoding="utf-8")
# Remember config_dir and config file
GeneralSettings._config_folder_path = config_dir
GeneralSettings._config_file_path = config_file
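The create-or-fall-back flow can be sketched as a standalone helper (a simplified illustration, not the actual settings-source code):

```python
import json
import tempfile
from pathlib import Path

def ensure_minimum_config(config_file: Path, version: str) -> Path:
    """Write a minimal config file; fall back to a temp directory on failure."""
    content = json.dumps({"general": {"version": version}})
    try:
        config_file.parent.mkdir(parents=True, exist_ok=True)
        config_file.write_text(content, encoding="utf-8")
        return config_file
    except OSError:
        # Last resort: a fresh temporary config directory
        fallback = Path(tempfile.mkdtemp()) / config_file.name
        fallback.write_text(content, encoding="utf-8")
        return fallback
```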
@@ -387,19 +399,8 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
f"Error reading config file '{config_file}' (falling back to default config): {ex}"
)
# Append default settings to sources
default_settings = pydantic_settings.JsonConfigSettingsSource(
settings_cls, json_file=cls.config_default_file_path
)
setting_sources.append(default_settings)
return tuple(setting_sources)
@classproperty
def config_default_file_path(cls) -> Path:
"""Compute the default config file path."""
return cls.package_root_path.joinpath("data/default.config.json")
@classproperty
def package_root_path(cls) -> Path:
"""Compute the package root path."""
@@ -461,9 +462,12 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
ValidationError: If the data contains invalid values for the defined fields.
Example:
>>> config = get_config()
>>> new_data = {"prediction": {"hours": 24}, "server": {"port": 8000}}
>>> config.merge_settings_from_dict(new_data)
.. code-block:: python
config = get_config()
new_data = {"prediction": {"hours": 24}, "server": {"port": 8000}}
config.merge_settings_from_dict(new_data)
"""
self._setup(**merge_models(self, data))
@@ -518,8 +522,7 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
The returned dictionary uses `backup_id` (suffix) as keys. The value for
each key is a dictionary including:
- ``storage_time``: The file modification timestamp in ISO-8601 format.
- ``version``: Version information found in the backup file
(defaults to ``"unknown"``).
- ``version``: Version information found in the backup file (defaults to ``"unknown"``).
Returns:
dict[str, dict[str, Any]]: Mapping of backup identifiers to metadata.


@@ -21,11 +21,14 @@ if TYPE_CHECKING:
# - tuple[str, Callable[[Any], Any]] (new path + transform)
# - None (drop)
MIGRATION_MAP: Dict[str, Union[str, Tuple[str, Callable[[Any], Any]], None]] = {
# 0.1.0 -> 0.2.0
# 0.2.0 -> 0.2.0+dev
"elecprice/provider_settings/ElecPriceImport/import_file_path": "elecprice/elecpriceimport/import_file_path",
"elecprice/provider_settings/ElecPriceImport/import_json": "elecprice/elecpriceimport/import_json",
# 0.1.0 -> 0.2.0+dev
"devices/batteries/0/initial_soc_percentage": None,
"devices/electric_vehicles/0/initial_soc_percentage": None,
"elecprice/provider_settings/import_file_path": "elecprice/provider_settings/ElecPriceImport/import_file_path",
"elecprice/provider_settings/import_json": "elecprice/provider_settings/ElecPriceImport/import_json",
"elecprice/provider_settings/import_file_path": "elecprice/elecpriceimport/import_file_path",
"elecprice/provider_settings/import_json": "elecprice/elecpriceimport/import_json",
"load/provider_settings/import_file_path": "load/provider_settings/LoadImport/import_file_path",
"load/provider_settings/import_json": "load/provider_settings/LoadImport/import_json",
"load/provider_settings/loadakkudoktor_year_energy": "load/provider_settings/LoadAkkudoktor/loadakkudoktor_year_energy_kwh",


@@ -90,7 +90,10 @@ class CacheEnergyManagementStore(SingletonMixin):
the application lifecycle.
Example:
>>> cache = CacheEnergyManagementStore()
.. code-block:: python
cache = CacheEnergyManagementStore()
"""
if hasattr(self, "_initialized"):
return
@@ -112,7 +115,10 @@ class CacheEnergyManagementStore(SingletonMixin):
AttributeError: If the cache object does not have the requested method.
Example:
>>> result = cache.get("key")
.. code-block:: python
result = cache.get("key")
"""
# This will return a method of the target cache, or raise an AttributeError
target_attr = getattr(self.cache, name)
@@ -134,7 +140,10 @@ class CacheEnergyManagementStore(SingletonMixin):
KeyError: If the key does not exist in the cache.
Example:
>>> value = cache["user_data"]
.. code-block:: python
value = cache["user_data"]
"""
return CacheEnergyManagementStore.cache[key]
@@ -146,7 +155,10 @@ class CacheEnergyManagementStore(SingletonMixin):
value (Any): The value to store.
Example:
>>> cache["user_data"] = {"name": "Alice", "age": 30}
.. code-block:: python
cache["user_data"] = {"name": "Alice", "age": 30}
"""
CacheEnergyManagementStore.cache[key] = value
@@ -166,7 +178,10 @@ class CacheEnergyManagementStore(SingletonMixin):
management system run).
Example:
>>> cache.clear()
.. code-block:: python
cache.clear()
"""
if hasattr(self.cache, "clear") and callable(getattr(self.cache, "clear")):
CacheEnergyManagementStore.cache.clear()
@@ -179,64 +194,35 @@ class CacheEnergyManagementStore(SingletonMixin):
raise AttributeError(f"'{self.cache.__class__.__name__}' object has no method 'clear'")
def cachemethod_energy_management(method: TCallable) -> TCallable:
"""Decorator for in memory caching the result of an instance method.
def cache_energy_management(callable: TCallable) -> TCallable:
"""Decorator for in memory caching the result of a callable.
This decorator caches the method's result in `CacheEnergyManagementStore`, ensuring
that subsequent calls with the same arguments return the cached result until the
This decorator caches the method or function's result in `CacheEnergyManagementStore`,
ensuring that subsequent calls with the same arguments return the cached result until the
next energy management start.
Args:
method (Callable): The instance method to be decorated.
Returns:
Callable: The wrapped method with caching functionality.
Example:
>>> class MyClass:
>>> @cachemethod_energy_management
>>> def expensive_method(self, param: str) -> str:
>>> # Perform expensive computation
>>> return f"Computed {param}"
"""
@cachebox.cachedmethod(
cache=CacheEnergyManagementStore().cache, callback=cache_energy_management_store_callback
)
@functools.wraps(method)
def wrapper(self: Any, *args: Any, **kwargs: Any) -> Any:
result = method(self, *args, **kwargs)
return result
return wrapper
def cache_energy_management(func: TCallable) -> TCallable:
"""Decorator for in memory caching the result of a standalone function.
This decorator caches the function's result in `CacheEnergyManagementStore`, ensuring
that subsequent calls with the same arguments return the cached result until the
next energy management start.
Args:
func (Callable): The function to be decorated.
callable (Callable): The function or method to be decorated.
Returns:
Callable: The wrapped function with caching functionality.
Example:
>>> @cache_until_next_update
>>> def expensive_function(param: str) -> str:
>>> # Perform expensive computation
>>> return f"Computed {param}"
.. code-block:: python
@cache_energy_management
def expensive_function(param: str) -> str:
# Perform expensive computation
return f"Computed {param}"
"""
@cachebox.cached(
cache=CacheEnergyManagementStore().cache, callback=cache_energy_management_store_callback
)
@functools.wraps(func)
@functools.wraps(callable)
def wrapper(*args: Any, **kwargs: Any) -> Any:
result = func(*args, **kwargs)
result = callable(*args, **kwargs)
return result
return wrapper
@@ -277,12 +263,15 @@ class CacheFileStore(ConfigMixin, SingletonMixin):
with their associated keys and dates.
Example:
.. code-block:: python
cache_store = CacheFileStore()
cache_store.create('example_file')
cache_file = cache_store.get('example_file')
cache_file.write('Some data')
cache_file.seek(0)
print(cache_file.read()) # Output: 'Some data'
"""
def __init__(self, *args: Any, **kwargs: Any) -> None:
@@ -491,10 +480,13 @@ class CacheFileStore(ConfigMixin, SingletonMixin):
file_obj: A file-like object representing the cache file.
Example:
.. code-block:: python
cache_file = cache_store.create('example_file', suffix='.txt')
cache_file.write('Some cached data')
cache_file.seek(0)
print(cache_file.read()) # Output: 'Some cached data'
"""
cache_file_key, until_datetime_dt, ttl_duration = self._generate_cache_file_key(
key, until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl
@@ -543,7 +535,10 @@ class CacheFileStore(ConfigMixin, SingletonMixin):
ValueError: If the key is already in store.
Example:
.. code-block:: python
cache_store.set('example_file', io.BytesIO(b'Some binary data'))
"""
cache_file_key, until_datetime_dt, ttl_duration = self._generate_cache_file_key(
key, until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl
@@ -599,10 +594,13 @@ class CacheFileStore(ConfigMixin, SingletonMixin):
file_obj: The file-like cache object, or None if no file is found.
Example:
.. code-block:: python
cache_file = cache_store.get('example_file')
if cache_file:
cache_file.seek(0)
print(cache_file.read()) # Output: Cached data (if exists)
"""
if until_datetime or until_date:
until_datetime, _ttl_duration = self._until_datetime_by_options(
@@ -881,13 +879,15 @@ def cache_in_file(
A decorated function that caches its result in a temporary file.
Example:
.. code-block:: python
from datetime import date
@cache_in_file(suffix='.txt')
def expensive_computation(until_date=None):
# Perform some expensive computation
return 'Some large result'
result = expensive_computation(until_date=date.today())
Notes:
- The cache key is based on the function arguments after excluding those in `ignore_params`.
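The note on cache keys can be illustrated: a key derived from the call arguments with `ignore_params` removed might be built roughly like this (a sketch; `make_cache_key` is a hypothetical helper, not the actual implementation):

```python
import hashlib
import inspect
from typing import Any, Callable

def make_cache_key(func: Callable, args: tuple, kwargs: dict, ignore_params: set) -> str:
    """Bind the call to the signature, drop ignored parameters, hash the rest."""
    bound = inspect.signature(func).bind(*args, **kwargs)
    bound.apply_defaults()
    relevant = {k: v for k, v in bound.arguments.items() if k not in ignore_params}
    # Sorting makes the key independent of argument order
    return hashlib.sha256(repr(sorted(relevant.items())).encode()).hexdigest()

def expensive_computation(until_date=None, scale: int = 1) -> str:
    return "Some large result"

# Calls differing only in an ignored parameter share one cache key:
k1 = make_cache_key(expensive_computation, (), {"until_date": "2024-01-01"}, {"until_date"})
k2 = make_cache_key(expensive_computation, (), {"until_date": "2024-06-01"}, {"until_date"})
k3 = make_cache_key(expensive_computation, (), {"until_date": "2024-01-01", "scale": 2}, {"until_date"})
```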

View File

@@ -39,11 +39,12 @@ class ConfigMixin:
config (ConfigEOS): Property to access the global EOS configuration.
Example:
.. code-block:: python
class MyEOSClass(ConfigMixin):
def my_method(self):
if self.config.myconfigval:
"""
@classproperty
@@ -78,12 +79,13 @@ class MeasurementMixin:
measurement (Measurement): Property to access the global EOS measurement data.
Example:
.. code-block:: python
class MyOptimizationClass(MeasurementMixin):
def analyze_mymeasurement(self):
measurement_data = self.measurement.mymeasurement
# Perform analysis
"""
@classproperty
@@ -118,12 +120,13 @@ class PredictionMixin:
prediction (Prediction): Property to access the global EOS prediction data.
Example:
.. code-block:: python
class MyOptimizationClass(PredictionMixin):
def analyze_myprediction(self):
prediction_data = self.prediction.mypredictionresult
# Perform analysis
"""
@classproperty
@@ -159,12 +162,13 @@ class EnergyManagementSystemMixin:
ems (EnergyManagementSystem): Property to access the global EOS energy management system.
Example:
.. code-block:: python
class MyOptimizationClass(EnergyManagementSystemMixin):
def analyze_myprediction(self):
ems_data = self.ems.the_ems_method()
# Perform analysis
"""
@classproperty
@@ -224,22 +228,25 @@ class SingletonMixin:
- Avoid using `__init__` to reinitialize the singleton instance after it has been created.
Example:
.. code-block:: python

    class MySingletonModel(SingletonMixin, PydanticBaseModel):
        name: str

        # implement __init__ to avoid re-initialization of parent classes:
        def __init__(self, *args: Any, **kwargs: Any) -> None:
            if hasattr(self, "_initialized"):
                return
            # Your initialisation here
            ...
            super().__init__(*args, **kwargs)

    instance1 = MySingletonModel(name="Instance 1")
    instance2 = MySingletonModel(name="Instance 2")

    assert instance1 is instance2  # True
    print(instance1.name)  # Output: "Instance 1"
"""
_lock: ClassVar[threading.Lock] = threading.Lock()
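The singleton behaviour described above can be sketched without Pydantic. This minimal version (an illustration, not the EOS implementation) shows the lock-guarded `__new__` and the `_initialized` guard working together:

```python
import threading
from typing import Any, ClassVar

class SingletonSketch:
    _lock: ClassVar[threading.Lock] = threading.Lock()
    _instance: ClassVar["SingletonSketch | None"] = None

    def __new__(cls, *args: Any, **kwargs: Any) -> "SingletonSketch":
        with cls._lock:  # lock keeps instance creation thread-safe
            if cls._instance is None:
                cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self, name: str) -> None:
        if hasattr(self, "_initialized"):  # skip re-initialization on later calls
            return
        self.name = name
        self._initialized = True

instance1 = SingletonSketch(name="Instance 1")
instance2 = SingletonSketch(name="Instance 2")
```

As in the docstring example, the second constructor call returns the first instance and its arguments are ignored.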

View File

@@ -432,20 +432,23 @@ class DataSequence(DataBase, MutableSequence):
Derived classes have to provide their own records field with correct record type set.
Usage:
# Example of creating, adding, and using DataSequence
class DerivedSequence(DataSquence):
records: List[DerivedDataRecord] = Field(default_factory=list, json_schema_extra={ "description": "List of data records" })
.. code-block:: python
seq = DerivedSequence()
seq.insert(DerivedDataRecord(date_time=datetime.now(), temperature=72))
seq.insert(DerivedDataRecord(date_time=datetime.now(), temperature=75))
# Example of creating, adding, and using DataSequence
class DerivedSequence(DataSquence):
records: List[DerivedDataRecord] = Field(default_factory=list, json_schema_extra={ "description": "List of data records" })
# Convert to JSON and back
json_data = seq.to_json()
new_seq = DerivedSequence.from_json(json_data)
seq = DerivedSequence()
seq.insert(DerivedDataRecord(date_time=datetime.now(), temperature=72))
seq.insert(DerivedDataRecord(date_time=datetime.now(), temperature=75))
# Convert to JSON and back
json_data = seq.to_json()
new_seq = DerivedSequence.from_json(json_data)
# Convert to Pandas Series
series = seq.key_to_series('temperature')
# Convert to Pandas Series
series = seq.key_to_series('temperature')
"""
# To be overloaded by derived classes.
@@ -737,9 +740,12 @@ class DataSequence(DataBase, MutableSequence):
**kwargs: Key-value pairs as keyword arguments
Examples:
.. code-block:: python
update_value(date, 'temperature', 25.5)
update_value(date, {'temperature': 25.5, 'humidity': 80})
update_value(date, temperature=25.5, humidity=80)
"""
# Process input arguments into a dictionary
values: Dict[str, Any] = {}
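The three calling conventions accepted by `update_value` can be normalized into one dictionary roughly like this (a sketch of the argument processing, not the exact implementation; `normalize_values` is a hypothetical helper):

```python
from typing import Any, Dict

def normalize_values(*args: Any, **kwargs: Any) -> Dict[str, Any]:
    """Accept (key, value), a single dict, or key=value keyword forms."""
    values: Dict[str, Any] = {}
    if len(args) == 1 and isinstance(args[0], dict):
        values.update(args[0])          # update_value(date, {'temperature': 25.5})
    elif len(args) == 2:
        values[args[0]] = args[1]       # update_value(date, 'temperature', 25.5)
    elif args:
        raise ValueError("expected (key, value) or a single dict")
    values.update(kwargs)               # update_value(date, temperature=25.5)
    return values
```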
@@ -1378,15 +1384,18 @@ class DataImportMixin:
"""Mixin class for import of generic data.
This class is designed to handle generic data provided in the form of a key-value dictionary.
- **Keys**: Represent identifiers from the record keys of a specific data.
- **Values**: Are lists of data values starting at a specified `start_datetime`, where
  each value corresponds to a subsequent time interval (e.g., hourly).
Two special keys are handled. `start_datetime` may be used to define the starting datetime of
the values. `interval` may be used to define the fixed time interval between two values.
On import `self.update_value(datetime, key, value)` is called, which has to be provided.
Also `self.ems_start_datetime` may be necessary as a default in case `start_datetime` is not given.
"""
# Attributes required but defined elsewhere.
@@ -1418,16 +1427,20 @@ class DataImportMixin:
Behavior:
- Skips invalid timestamps during DST spring forward transitions.
- Includes both instances of repeated timestamps during DST fall back transitions.
- Ensures the list contains exactly `value_count` entries.
Example:
.. code-block:: python
start_datetime = pendulum.datetime(2024, 11, 3, 0, 0, tz="America/New_York")
import_datetimes(start_datetime, 5)
[(DateTime(2024, 11, 3, 0, 0, tzinfo=Timezone('America/New_York')), 0),
(DateTime(2024, 11, 3, 1, 0, tzinfo=Timezone('America/New_York')), 1),
(DateTime(2024, 11, 3, 1, 0, tzinfo=Timezone('America/New_York')), 1), # Repeated hour
(DateTime(2024, 11, 3, 2, 0, tzinfo=Timezone('America/New_York')), 2),
(DateTime(2024, 11, 3, 3, 0, tzinfo=Timezone('America/New_York')), 3)]
"""
timestamps_with_indices: List[Tuple[DateTime, int]] = []
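The DST fall-back behaviour in the example can be reproduced with the standard library alone (a sketch using `zoneinfo` instead of pendulum; the index handling here simply counts intervals and is simpler than the real method):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

def import_datetimes_sketch(start: datetime, count: int) -> list:
    """Generate `count` hourly timestamps from `start`, advancing in absolute time.

    Advancing in UTC means the fall-back transition repeats a wall-clock hour
    and the spring-forward transition skips one, as described above.
    """
    tz = start.tzinfo
    utc = start.astimezone(timezone.utc)
    return [((utc + timedelta(hours=i)).astimezone(tz), i) for i in range(count)]

# 2024-11-03 is the DST fall-back date in America/New_York
start = datetime(2024, 11, 3, 0, 0, tzinfo=ZoneInfo("America/New_York"))
stamps = import_datetimes_sketch(start, 5)
```

The wall-clock hours come out as 0, 1, 1, 2, 3: the 01:00 hour appears twice, once with the EDT offset and once with the EST offset.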
@@ -1665,17 +1678,18 @@ class DataImportMixin:
JSONDecodeError: If the file content is not valid JSON.
Example:
Given a JSON string with the following content and `key_prefix = "load"`, only the
"loadforecast_power_w" key will be processed even though both keys are in the record.

.. code-block:: json

    {
        "start_datetime": "2024-11-10 00:00:00",
        "interval": "30 minutes",
        "loadforecast_power_w": [20.5, 21.0, 22.1],
        "other_xyz": [10.5, 11.0, 12.1]
    }
"""
# Try pandas dataframe with orient="split"
try:
@@ -1741,15 +1755,16 @@ class DataImportMixin:
JSONDecodeError: If the file content is not valid JSON.
Example:
Given a JSON file with the following content and `key_prefix = "load"`, only the
"loadforecast_power_w" key will be processed even though both keys are in the record.

.. code-block:: json

    {
        "loadforecast_power_w": [20.5, 21.0, 22.1],
        "other_xyz": [10.5, 11.0, 12.1]
    }
"""
with import_file_path.open("r", encoding="utf-8", newline=None) as import_file:
import_str = import_file.read()
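The prefix filtering both examples describe amounts to selecting record keys that start with `key_prefix`; a sketch (the real method additionally handles `start_datetime`, `interval` and pandas-formatted input):

```python
import json

def select_by_prefix(import_str: str, key_prefix: str) -> dict:
    """Keep only list-valued keys starting with key_prefix."""
    data = json.loads(import_str)
    return {k: v for k, v in data.items()
            if k.startswith(key_prefix) and isinstance(v, list)}

import_str = '{"loadforecast_power_w": [20.5, 21.0, 22.1], "other_xyz": [10.5, 11.0, 12.1]}'
selected = select_by_prefix(import_str, "load")
```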
@@ -1762,9 +1777,10 @@ class DataImportProvider(DataImportMixin, DataProvider):
"""Abstract base class for data providers that import generic data.
This class is designed to handle generic data provided in the form of a key-value dictionary.
- **Keys**: Represent identifiers from the record keys of a specific data.
- **Values**: Are lists of data values starting at a specified `start_datetime`, where
each value corresponds to a subsequent time interval (e.g., hourly).
Subclasses must implement the logic for managing generic data based on the imported records.
"""

View File

@@ -12,14 +12,16 @@ class classproperty:
the class rather than any instance of the class.
Example:
.. code-block:: python

    class MyClass:
        _value = 42

        @classproperty
        def value(cls):
            return cls._value

    print(MyClass.value)  # Outputs: 42
Methods:
__get__: Retrieves the value of the class property by calling the

View File

@@ -6,10 +6,12 @@ These enhancements facilitate the use of Pydantic models in applications requiri
datetime fields and consistent data serialization.
Key Features:
- Custom type adapter for `pendulum.DateTime` fields with automatic serialization to ISO 8601 strings.
- Utility methods for converting models to and from dictionaries and JSON strings.
- Validation tools for maintaining data consistency, including specialized support for
pandas DataFrames and Series with datetime indexes.
"""
import inspect
@@ -157,16 +159,19 @@ class PydanticModelNestedValueMixin:
or an invalid transition is made (such as an attribute on a non-model).
Example:
.. code-block:: python

    class Address(PydanticBaseModel):
        city: str

    class User(PydanticBaseModel):
        name: str
        address: Address

    user = User(name="Alice", address=Address(city="NY"))
    user._validate_path_structure("address/city")  # OK
    user._validate_path_structure("address/zipcode")  # Raises ValueError
"""
path_elements = path.strip("/").split("/")
# The model we are currently working on
@@ -264,18 +269,19 @@ class PydanticModelNestedValueMixin:
IndexError: If a list index is out of bounds or invalid.
Example:
.. code-block:: python

    class Address(PydanticBaseModel):
        city: str

    class User(PydanticBaseModel):
        name: str
        address: Address

    user = User(name="Alice", address=Address(city="New York"))
    city = user.get_nested_value("address/city")
    print(city)  # Output: "New York"
"""
path_elements = path.strip("/").split("/")
model: Any = self
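The slash-separated path convention can be sketched for plain objects and containers; this is an illustration of the traversal idea, not the Pydantic-aware implementation:

```python
from typing import Any

def get_nested(obj: Any, path: str) -> Any:
    """Walk 'a/b/0/c' style paths across attributes, dicts, and lists."""
    for part in path.strip("/").split("/"):
        if isinstance(obj, dict):
            obj = obj[part]
        elif isinstance(obj, (list, tuple)):
            obj = obj[int(part)]  # numeric segments index into sequences
        else:
            obj = getattr(obj, part)
    return obj

class Address:
    def __init__(self, city: str) -> None:
        self.city = city

class User:
    def __init__(self, name: str, address: Address) -> None:
        self.name = name
        self.address = address

user = User("Alice", Address("New York"))
```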
@@ -318,22 +324,23 @@ class PydanticModelNestedValueMixin:
TypeError: If a missing field cannot be initialized.
Example:
.. code-block:: python

    class Address(PydanticBaseModel):
        city: Optional[str]

    class User(PydanticBaseModel):
        name: str
        address: Optional[Address]
        settings: Optional[Dict[str, Any]]

    user = User(name="Alice", address=None, settings=None)
    user.set_nested_value("address/city", "Los Angeles")
    user.set_nested_value("settings/theme", "dark")

    print(user.address.city)  # Output: "Los Angeles"
    print(user.settings)  # Output: {'theme': 'dark'}
"""
path = path.strip("/")
# Store old value (if possible)
@@ -753,18 +760,21 @@ class PydanticBaseModel(PydanticModelNestedValueMixin, BaseModel):
gracefully by returning an empty dictionary.
Examples:
.. code-block:: python

    class User(Base):
        name: str = Field(
            json_schema_extra={"description": "User name"}
        )

    field = User.model_fields["name"]
    User.get_field_extra_dict(field)
    # {'description': 'User name'}

    missing = User.model_fields.get("unknown", None)
    User.get_field_extra_dict(missing) if missing else {}
    # {}
"""
if model_field is None:
return {}
@@ -873,12 +883,15 @@ class PydanticDateTimeData(RootModel):
- All value lists must have the same length
Example:
.. code-block:: python
{
"start_datetime": "2024-01-01 00:00:00", # optional
"interval": "1 Hour", # optional
"loadforecast_power_w": [20.5, 21.0, 22.1],
"load_min": [18.5, 19.0, 20.1]
}
"""
root: Dict[str, Union[str, List[Union[float, int, str, None]]]]
@@ -1275,9 +1288,12 @@ class PydanticDateTimeSeries(PydanticBaseModel):
ValueError: If series index is not datetime type.
Example:
.. code-block:: python
dates = pd.date_range('2024-01-01', periods=3)
s = pd.Series([1.1, 2.2, 3.3], index=dates)
model = PydanticDateTimeSeries.from_series(s)
"""
index = pd.Index([to_datetime(dt, as_string=True, in_timezone=tz) for dt in series.index])
series.index = index

View File

@@ -1,5 +1,156 @@
"""Version information for akkudoktoreos."""
import hashlib
import re
from fnmatch import fnmatch
from pathlib import Path
from typing import Optional
# For development add `+dev` to previous release
# For release omit `+dev`.
VERSION_BASE = "0.2.0+dev"
# Project hash of relevant files
HASH_EOS = ""
# ------------------------------
# Helpers for version generation
# ------------------------------
def is_excluded_dir(path: Path, excluded_dir_patterns: set[str]) -> bool:
"""Check whether a directory should be excluded based on name patterns."""
return any(fnmatch(path.name, pattern) for pattern in excluded_dir_patterns)
def hash_tree(
paths: list[Path],
allowed_suffixes: set[str],
excluded_dir_patterns: set[str],
excluded_files: Optional[set[Path]] = None,
) -> str:
"""Return SHA256 hash for files under `paths`.
Restricted by suffix, excluding excluded directory patterns and excluded_files.
"""
h = hashlib.sha256()
excluded_files = excluded_files or set()
for root in paths:
if not root.exists():
raise ValueError(f"Root path does not exist: {root}")
for p in sorted(root.rglob("*")):
# Skip excluded directories
if p.is_dir() and is_excluded_dir(p, excluded_dir_patterns):
continue
# Skip files inside excluded directories
if any(is_excluded_dir(parent, excluded_dir_patterns) for parent in p.parents):
continue
# Skip excluded files
if p.resolve() in excluded_files:
continue
# Hash only allowed file types
if p.is_file() and p.suffix.lower() in allowed_suffixes:
h.update(p.read_bytes())
digest = h.hexdigest()
return digest
def _version_hash() -> str:
"""Calculate project hash.
Only package files in src/akkudoktoreos are hashed, to make this work also for installed packages.
"""
DIR_PACKAGE_ROOT = Path(__file__).resolve().parent.parent
# Allowed file suffixes to consider
ALLOWED_SUFFIXES: set[str] = {".py", ".md", ".json"}
# Directory patterns to exclude (glob-like)
EXCLUDED_DIR_PATTERNS: set[str] = {"*_autosum", "*__pycache__", "*_generated"}
# Files to exclude
EXCLUDED_FILES: set[Path] = set()
# Directories whose changes shall be part of the project hash
watched_paths = [DIR_PACKAGE_ROOT]
hash_current = hash_tree(
watched_paths, ALLOWED_SUFFIXES, EXCLUDED_DIR_PATTERNS, excluded_files=EXCLUDED_FILES
)
return hash_current
def _version_calculate() -> str:
"""Compute version."""
global HASH_EOS
HASH_EOS = _version_hash()
if VERSION_BASE.endswith("+dev"):
return f"{VERSION_BASE}.{HASH_EOS[:6]}"
else:
return VERSION_BASE
# ---------------------------
# Project version information
# ----------------------------
# The version
__version__ = _version_calculate()
# -------------------
# Version info access
# -------------------
# Regular expression to split the version string into pieces
VERSION_RE = re.compile(
r"""
^(?P<base>\d+\.\d+\.\d+) # x.y.z
(?:\+ # +dev.hash starts here
(?:
(?P<dev>dev) # literal 'dev'
(?:\.(?P<hash>[A-Za-z0-9]+))? # optional .hash
)
)?
$
""",
re.VERBOSE,
)
def version() -> dict[str, Optional[str]]:
"""Parses the version string.
The version string shall be of the form:
x.y.z
x.y.z+dev
x.y.z+dev.HASH
Returns:
.. code-block:: python
{
"version": "0.2.0+dev.a96a65",
"base": "x.y.z",
"dev": "dev" or None,
"hash": "<hash>" or None,
}
"""
global __version__
match = VERSION_RE.match(__version__)
if not match:
raise ValueError(f"Invalid version format: {__version__}")
info = match.groupdict()
info["version"] = __version__
return info
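The `VERSION_RE` pattern and the `version()` parsing can be exercised directly; this reproduces the regular expression standalone for illustration:

```python
import re

VERSION_RE = re.compile(
    r"""
    ^(?P<base>\d+\.\d+\.\d+)              # x.y.z
    (?:\+                                 # +dev.hash starts here
        (?:
            (?P<dev>dev)                  # literal 'dev'
            (?:\.(?P<hash>[A-Za-z0-9]+))? # optional .hash
        )
    )?
    $
    """,
    re.VERBOSE,
)

match = VERSION_RE.match("0.2.0+dev.a96a65")
info = match.groupdict() if match else {}
```

A plain release string like `"0.2.0"` matches too, with `dev` and `hash` left as `None`.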

View File

@@ -1,5 +0,0 @@
{
"general": {
"version": "0.2.0+dev"
}
}

View File

@@ -171,25 +171,28 @@ class Battery:
Two **exclusive** modes:
**Mode 1:**
- `wh is not None` and `charge_factor == 0`
- The raw requested charge energy is `wh` (pre-efficiency).
- If remaining capacity is insufficient, charging is automatically limited.
- No exception is raised due to capacity limits.

**Mode 2:**
- `wh is None` and `charge_factor > 0`
- The raw requested energy is `max_charge_power_w * charge_factor`.
- If the request exceeds remaining capacity, the algorithm tries to find a lower
  `charge_factor` that is compatible. If such a charge factor exists, this hour's
  `charge_factor` is replaced.
- If no charge factor can accommodate charging, the request is ignored (``(0.0, 0.0)`` is
  returned) and a penalty is applied elsewhere.

Charging is constrained by:
- Available SoC headroom (``max_soc_wh - soc_wh``)
- ``max_charge_power_w``
- ``charging_efficiency``
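A compact sketch of the two modes (names are illustrative; efficiency handling and the penalty mechanics are deliberately omitted):

```python
def requested_charge_wh(wh, charge_factor, max_charge_power_w, headroom_wh):
    """Return the raw (pre-efficiency) charge energy for one hour."""
    if wh is not None and charge_factor == 0:
        # Mode 1: limit silently to remaining capacity, never raise
        return min(wh, headroom_wh)
    if wh is None and charge_factor > 0:
        # Mode 2: derive the request from the charge factor
        request = max_charge_power_w * charge_factor
        if request <= headroom_wh:
            return request
        # try a reduced factor that fits the remaining capacity
        reduced_factor = headroom_wh / max_charge_power_w
        return max_charge_power_w * reduced_factor if reduced_factor > 0 else 0.0
    raise ValueError("wh and charge_factor modes are exclusive")
```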
Args:
wh (float | None):

View File

@@ -212,15 +212,14 @@ class GeneticSolution(ConfigMixin, GeneticParametersBaseModel):
discharge_allowed (bool): Whether discharging is permitted.
Returns:
tuple[BatteryOperationMode, float]: A tuple containing
- `BatteryOperationMode`: the representative high-level operation mode.
- `float`: the operation factor corresponding to the active signal.
Notes:
- The mapping prioritizes AC charge > DC charge > discharge.
- Multiple strategies can produce the same low-level signals; this function
returns a representative mode based on a defined priority order.
"""
# (0,0,0) → Nothing allowed
if ac_charge <= 0.0 and dc_charge <= 0.0 and not discharge_allowed:

View File

@@ -4,6 +4,9 @@ from pydantic import Field, field_validator
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.prediction.elecpriceabc import ElecPriceProvider
from akkudoktoreos.prediction.elecpriceenergycharts import (
ElecPriceEnergyChartsCommonSettings,
)
from akkudoktoreos.prediction.elecpriceimport import ElecPriceImportCommonSettings
from akkudoktoreos.prediction.prediction import get_prediction
@@ -17,15 +20,6 @@ elecprice_providers = [
]
class ElecPriceCommonProviderSettings(SettingsBaseModel):
"""Electricity Price Prediction Provider Configuration."""
ElecPriceImport: Optional[ElecPriceImportCommonSettings] = Field(
default=None,
json_schema_extra={"description": "ElecPriceImport settings", "examples": [None]},
)
class ElecPriceCommonSettings(SettingsBaseModel):
"""Electricity Price Prediction Configuration."""
@@ -53,17 +47,14 @@ class ElecPriceCommonSettings(SettingsBaseModel):
},
)
provider_settings: ElecPriceCommonProviderSettings = Field(
default_factory=ElecPriceCommonProviderSettings,
json_schema_extra={
"description": "Provider settings",
"examples": [
# Example 1: Empty/default settings (all providers None)
{
"ElecPriceImport": None,
},
],
},
elecpriceimport: ElecPriceImportCommonSettings = Field(
default_factory=ElecPriceImportCommonSettings,
json_schema_extra={"description": "Import provider settings."},
)
energycharts: ElecPriceEnergyChartsCommonSettings = Field(
default_factory=ElecPriceEnergyChartsCommonSettings,
json_schema_extra={"description": "Energy Charts provider settings."},
)
# Validators

View File

@@ -7,21 +7,44 @@ format, enabling consistent access to forecasted and historical electricity pric
"""
from datetime import datetime
from enum import Enum
from typing import Any, List, Optional, Union
import numpy as np
import pandas as pd
import requests
from loguru import logger
from pydantic import ValidationError
from pydantic import Field, ValidationError
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.core.cache import cache_in_file
from akkudoktoreos.core.pydantic import PydanticBaseModel
from akkudoktoreos.prediction.elecpriceabc import ElecPriceProvider
from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration
class EnergyChartsBiddingZones(str, Enum):
"""Energy Charts Bidding Zones."""
AT = "AT"
BE = "BE"
CH = "CH"
CZ = "CZ"
DE_LU = "DE-LU"
DE_AT_LU = "DE-AT-LU"
DK1 = "DK1"
DK2 = "DK2"
FR = "FR"
HU = "HU"
IT_North = "IT-NORTH"
NL = "NL"
NO2 = "NO2"
PL = "PL"
SE4 = "SE4"
SI = "SI"
class EnergyChartsElecPrice(PydanticBaseModel):
license_info: str
unix_seconds: List[int]
@@ -30,6 +53,21 @@ class EnergyChartsElecPrice(PydanticBaseModel):
deprecated: bool
class ElecPriceEnergyChartsCommonSettings(SettingsBaseModel):
"""Common settings for Energy Charts electricity price provider."""
bidding_zone: EnergyChartsBiddingZones = Field(
default=EnergyChartsBiddingZones.DE_LU,
json_schema_extra={
"description": (
"Bidding Zone: 'AT', 'BE', 'CH', 'CZ', 'DE-LU', 'DE-AT-LU', 'DK1', 'DK2', 'FR', "
"'HU', 'IT-NORTH', 'NL', 'NO2', 'PL', 'SE4' or 'SI'"
),
"examples": ["AT"],
},
)
class ElecPriceEnergyCharts(ElecPriceProvider):
"""Fetch and process electricity price forecast data from Energy-Charts.
@@ -95,7 +133,8 @@ class ElecPriceEnergyCharts(ElecPriceProvider):
)
last_date = to_datetime(self.end_datetime, as_string="YYYY-MM-DD")
url = f"{source}/price?bzn=DE-LU&start={start_date}&end={last_date}"
bidding_zone = str(self.config.elecprice.energycharts.bidding_zone)
url = f"{source}/price?bzn={bidding_zone}&start={start_date}&end={last_date}"
response = requests.get(url, timeout=30)
logger.debug(f"Response from {url}: {response}")
response.raise_for_status() # Raise an error for bad responses
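With the new setting, the request URL is built from the configured bidding zone instead of the hard-coded `DE-LU`. Stripped to its essentials (the values below are hypothetical stand-ins for `source`, the configured zone, and the date range):

```python
# hypothetical values standing in for self.config / self.start_datetime
source = "https://api.energy-charts.info"
bidding_zone = "AT"  # from config.elecprice.energycharts.bidding_zone
start_date = "2024-11-01"
last_date = "2024-11-03"

url = f"{source}/price?bzn={bidding_zone}&start={start_date}&end={last_date}"
```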

View File

@@ -9,7 +9,6 @@ format, enabling consistent access to forecasted and historical elecprice attrib
from pathlib import Path
from typing import Optional, Union
from loguru import logger
from pydantic import Field, field_validator
from akkudoktoreos.config.configabc import SettingsBaseModel
@@ -65,16 +64,13 @@ class ElecPriceImport(ElecPriceProvider, PredictionImportProvider):
return "ElecPriceImport"
def _update_data(self, force_update: Optional[bool] = False) -> None:
if self.config.elecprice.provider_settings.ElecPriceImport is None:
logger.debug(f"{self.provider_id()} data update without provider settings.")
return
if self.config.elecprice.provider_settings.ElecPriceImport.import_file_path:
if self.config.elecprice.elecpriceimport.import_file_path:
self.import_from_file(
self.config.elecprice.provider_settings.ElecPriceImport.import_file_path,
self.config.elecprice.elecpriceimport.import_file_path,
key_prefix="elecprice",
)
if self.config.elecprice.provider_settings.ElecPriceImport.import_json:
if self.config.elecprice.elecpriceimport.import_json:
self.import_from_json(
self.config.elecprice.provider_settings.ElecPriceImport.import_json,
self.config.elecprice.elecpriceimport.import_json,
key_prefix="elecprice",
)

View File

@@ -5,7 +5,7 @@ from pathlib import Path
import numpy as np
from scipy.interpolate import RegularGridInterpolator
from akkudoktoreos.core.cache import cachemethod_energy_management
from akkudoktoreos.core.cache import cache_energy_management
from akkudoktoreos.core.coreabc import SingletonMixin
@@ -24,7 +24,7 @@ class SelfConsumptionProbabilityInterpolator:
points = np.array([np.full_like(partial_loads, load_1h_power), partial_loads]).T
return points, partial_loads
@cachemethod_energy_management
@cache_energy_management
def calculate_self_consumption(self, load_1h_power: float, pv_power: float) -> float:
"""Calculate the PV self-consumption rate using RegularGridInterpolator.

View File

@@ -70,20 +70,23 @@ class PredictionSequence(DataSequence):
Derived classes have to provide their own records field with correct record type set.
Usage:
.. code-block:: python

    # Example of creating, adding, and using PredictionSequence
    class DerivedSequence(PredictionSequence):
        records: List[DerivedPredictionRecord] = Field(default_factory=list, json_schema_extra={ "description": "List of prediction records" })

    seq = DerivedSequence()
    seq.insert(DerivedPredictionRecord(date_time=datetime.now(), temperature=72))
    seq.insert(DerivedPredictionRecord(date_time=datetime.now(), temperature=75))

    # Convert to JSON and back
    json_data = seq.to_json()
    new_seq = DerivedSequence.from_json(json_data)

    # Convert to Pandas Series
    series = seq.key_to_series('temperature')
"""
# To be overloaded by derived classes.
@@ -224,9 +227,10 @@ class PredictionImportProvider(PredictionProvider, DataImportProvider):
"""Abstract base class for prediction providers that import prediction data.
This class is designed to handle prediction data provided in the form of a key-value dictionary.
- **Keys**: Represent identifiers from the record keys of a specific prediction.
- **Values**: Are lists of prediction values starting at a specified `start_datetime`, where
each value corresponds to a subsequent time interval (e.g., hourly).
Subclasses must implement the logic for managing prediction data based on the imported records.
"""

View File

@@ -12,51 +12,53 @@ Classes:
PVForecastAkkudoktor: Primary class to manage PV power forecasts, handle data retrieval, caching, and integration with Akkudoktor.net.
Example:
    .. code-block:: python

        # Set up the configuration with necessary fields for URL generation
        settings_data = {
            "general": {
                "latitude": 52.52,
                "longitude": 13.405,
            },
            "prediction": {
                "hours": 48,
                "historic_hours": 24,
            },
            "pvforecast": {
                "provider": "PVForecastAkkudoktor",
                "planes": [
                    {
                        "peakpower": 5.0,
                        "surface_azimuth": 170,
                        "surface_tilt": 7,
                        "userhorizon": [20, 27, 22, 20],
                        "inverter_paco": 10000,
                    },
                    {
                        "peakpower": 4.8,
                        "surface_azimuth": 90,
                        "surface_tilt": 7,
                        "userhorizon": [30, 30, 30, 50],
                        "inverter_paco": 10000,
                    },
                ],
            },
        }

        # Create the config instance from the provided data
        config = PVForecastAkkudoktorSettings(**settings_data)

        # Initialize the forecast object with the generated configuration
        forecast = PVForecastAkkudoktor(settings=config)

        # Get an actual forecast
        forecast.update_data()

        # Update the AC power measurement for a specific date and time
        forecast.update_value(to_datetime(None, to_maxtime=False), "pvforecastakkudoktor_ac_power_measured", 1000.0)

        # Report the DC and AC power forecast along with AC measurements
        print(forecast.report_ac_power_and_measurement())
Attributes:
hours (int): Number of hours into the future to forecast. Default is 48.


@@ -117,17 +117,25 @@ class WeatherClearOutside(WeatherProvider):
Workflow:
1. **Retrieve Web Content**: Uses a helper method to fetch or retrieve cached ClearOutside HTML content.
2. **Extract Forecast Date and Timezone**:
- Parses the forecast's start and end dates and the UTC offset from the "Generated"
  header.
3. **Extract Weather Data**:
- For each day in the 7-day forecast, the function finds detailed weather parameters
and associates values for each hour.
- Parameters include cloud cover, temperature, humidity, visibility, and
precipitation type, among others.
4. **Irradiance Calculation**:
- Calculates irradiance (GHI, DNI, DHI) values using cloud cover data and the
`pvlib` library.
5. **Store Data**:
- Combines all hourly data into `WeatherDataRecord` objects, with keys
standardized according to `WeatherDataRecord` attributes.
"""
# Get ClearOutside web content - either from site or cached
response = self._request_forecast(force_update=force_update) # type: ignore
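Step 4 above can be approximated without the full `pvlib` pipeline. The formula below is a common Kasten-Czeplak-style cloud-cover attenuation model, shown as an assumption for illustration — it is not necessarily the exact model this provider uses:

```python
def ghi_from_cloud_cover(clearsky_ghi: float, cloud_cover_pct: float) -> float:
    """Estimate GHI from clear-sky GHI and total cloud cover N (in percent).

    Uses GHI = GHI_clear * (1 - 0.75 * (N/100)**3.4): a clear sky passes
    through unchanged, while full overcast retains roughly 25%.
    """
    n = max(0.0, min(cloud_cover_pct, 100.0)) / 100.0
    return clearsky_ghi * (1.0 - 0.75 * n ** 3.4)
```

In the real workflow, `pvlib` would supply the clear-sky GHI for the site, and DNI/DHI would then be derived from the estimated GHI.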


@@ -1,3 +1,4 @@
import hashlib
import json
import logging
import os
@@ -7,6 +8,7 @@ import sys
import tempfile
import time
from contextlib import contextmanager
from fnmatch import fnmatch
from http import HTTPStatus
from pathlib import Path
from typing import Generator, Optional, Union
@@ -21,12 +23,14 @@ from loguru import logger
from xprocess import ProcessStarter, XProcess
from akkudoktoreos.config.config import ConfigEOS, get_config
from akkudoktoreos.core.version import _version_hash, version
from akkudoktoreos.server.server import get_default_host
# -----------------------------------------------
# Adapt pytest logging handling to Loguru logging
# -----------------------------------------------
@pytest.fixture
def caplog(caplog: LogCaptureFixture):
"""Propagate Loguru logs to the pytest caplog handler."""
@@ -88,7 +92,7 @@ def disable_debug_logging(scope="session", autouse=True):
def pytest_addoption(parser):
parser.addoption(
"--full-run", action="store_true", default=False, help="Run with all optimization tests."
"--finalize", action="store_true", default=False, help="Run with all tests."
)
parser.addoption(
"--check-config-side-effect",
@@ -105,8 +109,8 @@ def pytest_addoption(parser):
@pytest.fixture
def is_full_run(request):
yield bool(request.config.getoption("--full-run"))
def is_finalize(request):
yield bool(request.config.getoption("--finalize"))
@pytest.fixture(autouse=True)
@@ -123,6 +127,12 @@ def is_system_test(request):
yield bool(request.config.getoption("--system-test"))
@pytest.fixture
def is_ci() -> bool:
"""Returns True if running on GitHub Actions CI, False otherwise."""
return os.getenv("CI") == "true"
@pytest.fixture
def prediction_eos():
from akkudoktoreos.prediction.prediction import get_prediction
@@ -528,6 +538,25 @@ def server_setup_for_function(xprocess) -> Generator[dict[str, Union[str, int]],
yield result
# --------------------------------------
# Provide version and hash check support
# --------------------------------------
@pytest.fixture(scope="session")
def version_and_hash() -> Generator[dict[str, Optional[str]], None, None]:
"""Return version info as in in version.py and calculate current hash.
Runs once per test session.
"""
info = version()
info["hash_current"] = _version_hash()
yield info
# After all tests
# ------------------------------
# Provide pytest timezone change
# ------------------------------


@@ -16,7 +16,6 @@ from akkudoktoreos.core.cache import (
CacheFileStore,
cache_energy_management,
cache_in_file,
cachemethod_energy_management,
)
from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_duration
@@ -64,10 +63,10 @@ class TestCacheEnergyManagementStore:
class TestCacheUntilUpdateDecorators:
def test_cachemethod_energy_management(self, cache_energy_management_store):
"""Test that cachemethod_energy_management caches method results."""
"""Test that cache_energy_management caches method results."""
class MyClass:
@cachemethod_energy_management
@cache_energy_management
def compute(self, value: int) -> int:
return value * 2
@@ -102,7 +101,7 @@ class TestCacheUntilUpdateDecorators:
"""Test that caching works for different arguments."""
class MyClass:
@cachemethod_energy_management
@cache_energy_management
def compute(self, value: int) -> int:
return value * 2
@@ -123,7 +122,7 @@ class TestCacheUntilUpdateDecorators:
"""Test that cache is cleared between EMS update cycles."""
class MyClass:
@cachemethod_energy_management
@cache_energy_management
def compute(self, value: int) -> int:
return value * 2
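A minimal sketch of what `cache_energy_management` is expected to do in these tests — memoize method results until the next EMS update cycle clears the cache. This is a hypothetical stand-in for illustration; the real decorator lives in `akkudoktoreos.core.cache`:

```python
import functools

_cycle_cache: dict = {}

def cache_energy_management_sketch(func):
    """Cache results keyed by function, args, and kwargs until cleared."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        key = (func.__qualname__, args, tuple(sorted(kwargs.items())))
        if key not in _cycle_cache:
            _cycle_cache[key] = func(*args, **kwargs)
        return _cycle_cache[key]
    return wrapper

def clear_energy_management_cache():
    """Invalidate all cached results; called once per EMS update cycle."""
    _cycle_cache.clear()
```

Because `self` is part of the key, different instances cache independently, matching the per-instance behavior the tests exercise.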


@@ -120,15 +120,6 @@ def test_singleton_behavior(config_eos, config_default_dirs):
assert instance1.general.config_file_path == initial_cfg_file
def test_default_config_path(config_eos, config_default_dirs):
"""Test that the default config file path is computed correctly."""
_, _, config_default_dir_default, _ = config_default_dirs
expected_path = config_default_dir_default.joinpath("default.config.json")
assert config_eos.config_default_file_path == expected_path
assert config_eos.config_default_file_path.is_file()
def test_config_file_priority(config_default_dirs):
"""Test config file priority.


@@ -1,5 +1,6 @@
import json
import os
import shutil
import sys
from pathlib import Path
from unittest.mock import patch
@@ -9,6 +10,9 @@ import pytest
DIR_PROJECT_ROOT = Path(__file__).parent.parent
DIR_TESTDATA = Path(__file__).parent / "testdata"
DIR_DOCS_GENERATED = DIR_PROJECT_ROOT / "docs" / "_generated"
DIR_TEST_GENERATED = DIR_TESTDATA / "docs" / "_generated"
def test_openapi_spec_current(config_eos):
"""Verify the openapi spec hasn´t changed."""
@@ -74,11 +78,14 @@ def test_openapi_md_current(config_eos):
def test_config_md_current(config_eos):
"""Verify the generated configuration markdown hasn´t changed."""
expected_config_md_path = DIR_PROJECT_ROOT / "docs" / "_generated" / "config.md"
new_config_md_path = DIR_TESTDATA / "config-new.md"
assert DIR_DOCS_GENERATED.exists()
with expected_config_md_path.open("r", encoding="utf-8", newline=None) as f_expected:
expected_config_md = f_expected.read()
# Remove any leftover files from last run
if DIR_TEST_GENERATED.exists():
shutil.rmtree(DIR_TEST_GENERATED)
# Ensure test dir exists
DIR_TEST_GENERATED.mkdir(parents=True, exist_ok=True)
# Patch get_config and import within guard to patch global variables within the eos module.
with patch("akkudoktoreos.config.config.get_config", return_value=config_eos):
@@ -87,17 +94,33 @@ def test_config_md_current(config_eos):
sys.path.insert(0, str(root_dir))
from scripts import generate_config_md
config_md = generate_config_md.generate_config_md(config_eos)
# Get all the top level fields
field_names = sorted(config_eos.__class__.model_fields.keys())
if os.name == "nt":
config_md = config_md.replace("\\\\", "/")
with new_config_md_path.open("w", encoding="utf-8", newline="\n") as f_new:
f_new.write(config_md)
# Create the file paths
expected = [ DIR_DOCS_GENERATED / "config.md", DIR_DOCS_GENERATED / "configexample.md", ]
tested = [ DIR_TEST_GENERATED / "config.md", DIR_TEST_GENERATED / "configexample.md", ]
for field_name in field_names:
file_name = f"config{field_name.lower()}.md"
expected.append(DIR_DOCS_GENERATED / file_name)
tested.append(DIR_TEST_GENERATED / file_name)
try:
assert config_md == expected_config_md
except AssertionError as e:
pytest.fail(
f"Expected {new_config_md_path} to equal {expected_config_md_path}.\n"
+ f"If ok: `make gen-docs` or `cp {new_config_md_path} {expected_config_md_path}`\n"
)
# Create test files
config_md = generate_config_md.generate_config_md(tested[0], config_eos)
# Check test files are the same as the expected files
for i, expected_path in enumerate(expected):
tested_path = tested[i]
with expected_path.open("r", encoding="utf-8", newline=None) as f_expected:
expected_config_md = f_expected.read()
with tested_path.open("r", encoding="utf-8", newline=None) as f_expected:
tested_config_md = f_expected.read()
try:
assert tested_config_md == expected_config_md
except AssertionError as e:
pytest.fail(
f"Expected {tested_path} to equal {expected_path}.\n"
+ f"If ok: `make gen-docs` or `cp {tested_path} {expected_path}`\n"
)

tests/test_docsphinx.py Normal file

@@ -0,0 +1,140 @@
import json
import os
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path
from typing import Optional
import pytest
DIR_PROJECT_ROOT = Path(__file__).absolute().parent.parent
DIR_BUILD = DIR_PROJECT_ROOT / "build"
DIR_BUILD_DOCS = DIR_PROJECT_ROOT / "build" / "docs"
DIR_DOCS = DIR_PROJECT_ROOT / "docs"
DIR_SRC = DIR_PROJECT_ROOT / "src"
HASH_FILE = DIR_BUILD / ".sphinx_hash.json"
def find_sphinx_build() -> str:
venv = os.getenv("VIRTUAL_ENV")
paths = [Path(venv)] if venv else []
paths.append(DIR_PROJECT_ROOT / ".venv")
for base in paths:
cmd = base / ("Scripts" if os.name == "nt" else "bin") / ("sphinx-build.exe" if os.name == "nt" else "sphinx-build")
if cmd.exists():
return str(cmd)
return "sphinx-build"
@pytest.fixture(scope="session")
def sphinx_changed(version_and_hash) -> Optional[str]:
"""Returns new hash if any watched files have changed since last run.
Hash is stored in .sphinx_hash.json.
"""
new_hash = None
# Load previous hash
try:
previous = json.loads(HASH_FILE.read_text())
previous_hash = previous.get("hash")
except Exception:
previous_hash = None
changed = (previous_hash != version_and_hash["hash_current"])
if changed:
new_hash = version_and_hash["hash_current"]
return new_hash
class TestSphinxDocumentation:
"""Test class to verify Sphinx documentation generation.
Ensures no major warnings are emitted.
"""
SPHINX_CMD = [
find_sphinx_build(),
"-M",
"html",
str(DIR_DOCS),
str(DIR_BUILD_DOCS),
]
def _cleanup_autosum_dirs(self):
"""Delete all *_autosum folders inside docs/."""
for folder in DIR_DOCS.rglob("*_autosum"):
if folder.is_dir():
shutil.rmtree(folder)
def _cleanup_build_dir(self):
"""Delete build/docs directory if present."""
if DIR_BUILD_DOCS.exists():
shutil.rmtree(DIR_BUILD_DOCS)
def test_sphinx_build(self, sphinx_changed: Optional[str], is_finalize: bool):
"""Build Sphinx documentation and ensure no major warnings appear in the build output."""
# Ensure docs folder exists
if not DIR_DOCS.exists():
pytest.skip(f"Skipping Sphinx build test - docs folder not present: {DIR_DOCS}")
if not sphinx_changed:
pytest.skip(f"Skipping Sphinx build — no relevant file changes detected: {HASH_FILE}")
if not is_finalize:
pytest.skip("Skipping Sphinx test — not full run")
# Clean directories
self._cleanup_autosum_dirs()
self._cleanup_build_dir()
# Set environment for sphinx run (sphinx will make eos create a config file)
eos_tmp_dir = tempfile.TemporaryDirectory()
eos_dir = str(eos_tmp_dir.name)
env = os.environ.copy()
env["EOS_DIR"] = eos_dir
env["EOS_CONFIG_DIR"] = eos_dir
try:
# Run sphinx-build
project_dir = Path(__file__).parent.parent
process = subprocess.run(
self.SPHINX_CMD,
check=True,
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
text=True,
cwd=project_dir,
)
# Combine output
output = process.stdout + "\n" + process.stderr
returncode = process.returncode
except Exception:
output = f"ERROR: Could not start sphinx-build - {self.SPHINX_CMD}"
returncode = -1
# Remove temporary EOS_DIR
eos_tmp_dir.cleanup()
assert returncode == 0
# Possible markers: ERROR: WARNING: TRACEBACK:
major_markers = ("ERROR:", "TRACEBACK:")
bad_lines = [
line for line in output.splitlines()
if any(marker in line for marker in major_markers)
]
assert not bad_lines, f"Sphinx build contained errors:\n" + "\n".join(bad_lines)
# Update stored hash
HASH_FILE.parent.mkdir(parents=True, exist_ok=True)
HASH_FILE.write_text(json.dumps({"hash": sphinx_changed}, indent=2))

tests/test_docstringrst.py Normal file

@@ -0,0 +1,374 @@
import importlib
import importlib.util
import inspect
import pkgutil
import re
import sys
from difflib import SequenceMatcher
from pathlib import Path
from docutils import nodes
from docutils.core import publish_parts
from docutils.frontend import OptionParser
from docutils.parsers.rst import Directive, Parser, directives
from docutils.utils import Reporter, new_document
from sphinx.ext.napoleon import Config as NapoleonConfig
from sphinx.ext.napoleon.docstring import GoogleDocstring
DIR_PROJECT_ROOT = Path(__file__).absolute().parent.parent
DIR_DOCS = DIR_PROJECT_ROOT / "docs"
PACKAGE_NAME = "akkudoktoreos"
# ---------------------------------------------------------------------------
# Location ignore rules (regex)
# ---------------------------------------------------------------------------
# Locations to ignore (regex). Note the escaped dot for literal '.'
IGNORE_LOCATIONS = [
r"\.__new__$",
# Pydantic
r"\.model_copy$",
r"\.model_dump$",
r"\.model_dump_json$",
r"\.field_serializer$",
r"\.field_validator$",
r"\.model_validator$",
r"\.computed_field$",
r"\.Field$",
r"\.FieldInfo.*",
r"\.ComputedFieldInfo.*",
r"\.PrivateAttr$",
# pathlib
r"\.Path.*",
# MarkdownIt
r"\.MarkdownIt.*",
# FastAPI
r"\.FastAPI.*",
r"\.FileResponse.*",
r"\.PdfResponse.*",
r"\.HTTPException$",
# bokeh
r"\.bokeh.*",
r"\.figure.*",
r"\.ColumnDataSource.*",
r"\.LinearAxis.*",
r"\.Range1d.*",
# BeautifulSoup
r"\.BeautifulSoup.*",
# ExponentialSmoothing
r"\.ExponentialSmoothing.*",
# Pendulum
r"\.Date$",
r"\.DateTime$",
r"\.Duration$",
# ABC
r"\.abstractmethod$",
# numpytypes
r"\.NDArray$",
# typing
r"\.ParamSpec",
r"\.TypeVar",
r"\.Annotated",
# contextlib
r"\.asynccontextmanager$",
# concurrent
r"\.ThreadPoolExecutor.*",
# asyncio
r"\.Lock.*",
# scipy
r"\.RegularGridInterpolator.*",
# pylogging
r"\.InterceptHandler.filter$",
# itertools
r"\.chain$",
# functools
r"\.partial$",
# fnmatch
r"\.fnmatch$",
]
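These patterns are matched with `re.search` against each docstring location, so a location is skipped if any pattern matches anywhere in it. A standalone sketch of that filter (with a trimmed pattern list for illustration):

```python
import re

IGNORE_PATTERNS = [r"\.model_dump$", r"\.Path.*"]

def is_ignored(location: str, patterns: list[str]) -> bool:
    """Return True if any ignore pattern matches the dotted location."""
    return any(re.search(pat, location) for pat in patterns)

# A Pydantic helper is filtered out; an ordinary project method is not.
is_ignored("akkudoktoreos.core.pydantic.PydanticBaseModel.model_dump", IGNORE_PATTERNS)
```

Anchored patterns (`$`) hit only trailing method names, while open-ended ones (`.*`) also cover attributes of the matched class.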
# ---------------------------------------------------------------------------
# Error message ignore rules by location (regex)
# ---------------------------------------------------------------------------
IGNORE_ERRORS_BY_LOCATION = {
r"^akkudoktoreos.*": [
r"Unexpected possible title overline or transition.*",
],
}
# --- Use your global paths ---
conf_path = DIR_DOCS / "conf.py"
spec = importlib.util.spec_from_file_location("sphinx_conf", conf_path)
if spec is None:
raise AssertionError(f"Cannot import sphinx_conf from {conf_path}")
sphinx_conf = importlib.util.module_from_spec(spec)
sys.modules["sphinx_conf"] = sphinx_conf
if spec.loader is None:
raise AssertionError(f"Cannot import sphinx_conf from {conf_path}")
spec.loader.exec_module(sphinx_conf)
# Build NapoleonConfig with all options
napoleon_config = NapoleonConfig(
napoleon_google_docstring=getattr(sphinx_conf, "napoleon_google_docstring", True),
napoleon_numpy_docstring=getattr(sphinx_conf, "napoleon_numpy_docstring", False),
napoleon_include_init_with_doc=getattr(sphinx_conf, "napoleon_include_init_with_doc", False),
napoleon_include_private_with_doc=getattr(sphinx_conf, "napoleon_include_private_with_doc", False),
napoleon_include_special_with_doc=getattr(sphinx_conf, "napoleon_include_special_with_doc", True),
napoleon_use_admonition_for_examples=getattr(sphinx_conf, "napoleon_use_admonition_for_examples", False),
napoleon_use_admonition_for_notes=getattr(sphinx_conf, "napoleon_use_admonition_for_notes", False),
napoleon_use_admonition_for_references=getattr(sphinx_conf, "napoleon_use_admonition_for_references", False),
napoleon_use_ivar=getattr(sphinx_conf, "napoleon_use_ivar", False),
napoleon_use_param=getattr(sphinx_conf, "napoleon_use_param", True),
napoleon_use_rtype=getattr(sphinx_conf, "napoleon_use_rtype", True),
napoleon_preprocess_types=getattr(sphinx_conf, "napoleon_preprocess_types", False),
napoleon_type_aliases=getattr(sphinx_conf, "napoleon_type_aliases", None),
napoleon_attr_annotations=getattr(sphinx_conf, "napoleon_attr_annotations", True),
)
FENCE_RE = re.compile(r"^```(\w*)\s*$")
def replace_fenced_code_blocks(doc: str) -> tuple[str, bool]:
"""Replace fenced code blocks (```lang) in a docstring with RST code-block syntax.
Returns:
(new_doc, changed):
new_doc: The docstring with replacements applied
changed: True if any fenced block was replaced
"""
out_lines = []
inside = False
lang = ""
buffer: list[str] = []
changed = False
lines = doc.split("\n")
for line in lines:
stripped = line.strip()
# Detect opening fence: ``` or ```python
m = FENCE_RE.match(stripped)
if m and not inside:
inside = True
lang = m.group(1) or ""
# Write RST code-block header
if lang:
out_lines.append(f" .. code-block:: {lang}")
else:
out_lines.append(" .. code-block::")
out_lines.append("") # blank line required by RST
changed = True
continue
# Detect closing fence ```
if stripped == "```" and inside:
# Emit fenced code content with indentation
for b in buffer:
out_lines.append(" " + b)
out_lines.append("") # trailing blank line to close environment
inside = False
buffer = []
continue
if inside:
buffer.append(line)
else:
out_lines.append(line)
# If doc ended while still in fenced code, flush
if inside:
changed = True
for b in buffer:
out_lines.append(" " + b)
out_lines.append("")
inside = False
return "\n".join(out_lines), changed
def prepare_docutils_for_sphinx():
class NoOpDirective(Directive):
has_content = True
required_arguments = 0
optional_arguments = 100
final_argument_whitespace = True
def run(self):
return []
for d in ["attribute", "data", "method", "function", "class", "event", "todo"]:
directives.register_directive(d, NoOpDirective)
def validate_rst(text: str) -> list[tuple[int, str]]:
"""Validate a string as reStructuredText.
Returns a list of tuples: (line_number, message).
"""
if not text or not text.strip():
return []
warnings: list[tuple[int, str]] = []
class RecordingReporter(Reporter):
"""Capture warnings/errors instead of halting."""
def system_message(self, level, message, *children, **kwargs):
line = kwargs.get("line", None)
warnings.append((line or 0, message))
return nodes.system_message(message, level=level, type=self.levels[level], *children, **kwargs)
# Create default settings
settings = OptionParser(components=(Parser,)).get_default_values()
document = new_document("<docstring>", settings=settings)
# Attach custom reporter
document.reporter = RecordingReporter(
source="<docstring>",
report_level=1, # capture warnings and above
halt_level=100, # never halt
stream=None,
debug=False
)
parser = Parser()
parser.parse(text, document)
return warnings
def iter_docstrings(package_name: str):
"""Yield docstrings of modules, classes, functions in the given package."""
package = importlib.import_module(package_name)
for module_info in pkgutil.walk_packages(package.__path__, package.__name__ + "."):
module = importlib.import_module(module_info.name)
# Module docstring
if module.__doc__:
yield f"Module {module.__name__}", inspect.getdoc(module)
# Classes + methods
for _, obj in inspect.getmembers(module):
if inspect.isclass(obj) or inspect.isfunction(obj):
if obj.__doc__:
yield f"{module.__name__}.{obj.__name__}", inspect.getdoc(obj)
# Methods of classes
if inspect.isclass(obj):
for _, meth in inspect.getmembers(obj, inspect.isfunction):
if meth.__doc__:
yield f"{module.__name__}.{obj.__name__}.{meth.__name__}", inspect.getdoc(meth)
def map_converted_to_original(orig: str, conv: str) -> dict[int,int]:
"""Map original docstring line to converted docstring line.
Returns:
mapping: key = converted line index (0-based), value = original line index (0-based).
"""
orig_lines = orig.splitlines()
conv_lines = conv.splitlines()
matcher = SequenceMatcher(None, orig_lines, conv_lines)
line_map = {}
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
if tag in ("equal", "replace"):
for o, c in zip(range(i1, i2), range(j1, j2)):
line_map[c] = o
elif tag == "insert":
for c in range(j1, j2):
line_map[c] = max(i1 - 1, 0)
return line_map
def test_all_docstrings_rst_compliant():
"""All docstrings must be valid reStructuredText."""
failures = []
for location, doc in iter_docstrings(PACKAGE_NAME):
# Skip ignored locations
if any(re.search(pat, location) for pat in IGNORE_LOCATIONS):
continue
# convert like sphinx napoleon does
doc_converted = str(GoogleDocstring(doc, napoleon_config))
# Register directives that sphinx knows - just to avoid errors
prepare_docutils_for_sphinx()
# Validate
messages = validate_rst(doc_converted)
if not messages:
continue
# Map converted line numbers back to original docstring
line_map = map_converted_to_original(doc, doc_converted)
# Filter messages
filtered_messages = []
ignore_msg_patterns = []
for loc_pattern, patterns in IGNORE_ERRORS_BY_LOCATION.items():
if re.search(loc_pattern, location):
ignore_msg_patterns.extend(patterns)
for conv_line, msg_text in messages:
orig_line = line_map.get(conv_line - 1, conv_line - 1) + 1
if any(re.search(pat, msg_text) for pat in ignore_msg_patterns):
continue
filtered_messages.append((orig_line, msg_text))
if filtered_messages:
failures.append((location, filtered_messages, doc, doc_converted))
# Raise AssertionError with nicely formatted output
if failures:
msg = "Invalid reST docstrings (see https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html for valid format):\n"
for location, errors, doc, doc_converted in failures:
msg += f"\n--- {location} ---\n"
msg += "\nConverted by Sphinx Napoleon:\n"
doc_lines = doc_converted.splitlines()
for i, line_content in enumerate(doc_lines, start=1):
line_str = f"{i:2}" # fixed-width
msg += f" L{line_str}: {line_content}\n"
msg += "\nOriginal:\n"
doc_lines = doc.splitlines()
error_map = {line: err for line, err in errors}
for i, line_content in enumerate(doc_lines, start=1):
line_str = f"{i:2}" # fixed-width
if i in error_map:
msg += f">>> L{line_str}: {line_content} <-- {error_map[i]}\n"
else:
msg += f" L{line_str}: {line_content}\n"
doc_fixed, changed = replace_fenced_code_blocks(doc)
if changed:
msg += "\nImproved for fenced code blocks:\n"
msg += '"""' + doc_fixed + '\n"""\n'
msg += f"Total: {len(failures)} docstrings"
raise AssertionError(msg)


@@ -173,11 +173,20 @@ def test_request_forecast_status_codes(
provider._request_forecast()
@patch("requests.get")
@patch("akkudoktoreos.core.cache.CacheFileStore")
def test_cache_integration(mock_cache, provider):
def test_cache_integration(mock_cache, mock_get, provider, sample_akkudoktor_1_json):
"""Test caching of 8-day electricity price data."""
# Mock response object
mock_response = Mock()
mock_response.status_code = 200
mock_response.content = json.dumps(sample_akkudoktor_1_json)
mock_get.return_value = mock_response
# Mock cache object
mock_cache_instance = mock_cache.return_value
mock_cache_instance.get.return_value = None # Simulate no cache
provider._update_data(force_update=True)
mock_cache_instance.create.assert_called_once()
mock_cache_instance.get.assert_called_once()


@@ -167,11 +167,20 @@ def test_request_forecast_status_codes(
provider._request_forecast()
@patch("requests.get")
@patch("akkudoktoreos.core.cache.CacheFileStore")
def test_cache_integration(mock_cache, provider):
def test_cache_integration(mock_cache, mock_get, provider, sample_energycharts_json):
"""Test caching of 8-day electricity price data."""
# Mock response object
mock_response = Mock()
mock_response.status_code = 200
mock_response.content = json.dumps(sample_energycharts_json)
mock_get.return_value = mock_response
# Mock cache object
mock_cache_instance = mock_cache.return_value
mock_cache_instance.get.return_value = None # Simulate no cache
provider._update_data(force_update=True)
mock_cache_instance.create.assert_called_once()
mock_cache_instance.get.assert_called_once()
@@ -195,7 +204,7 @@ def test_key_to_array_resampling(provider):
@pytest.mark.skip(reason="For development only")
def test_akkudoktor_development_forecast_data(provider):
def test_energycharts_development_forecast_data(provider):
"""Fetch data from real Energy-Charts server."""
# Preset, as this is usually done by update_data()
provider.ems_start_datetime = to_datetime("2024-10-26 00:00:00")


@@ -18,11 +18,9 @@ def provider(sample_import_1_json, config_eos):
settings = {
"elecprice": {
"provider": "ElecPriceImport",
"provider_settings": {
"ElecPriceImport": {
"import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
"import_json": json.dumps(sample_import_1_json),
},
"elecpriceimport": {
"import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
"import_json": json.dumps(sample_import_1_json),
},
}
}
@@ -56,10 +54,8 @@ def test_invalid_provider(provider, config_eos):
settings = {
"elecprice": {
"provider": "<invalid>",
"provider_settings": {
"ElecPriceImport": {
"import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
},
"elecpriceimport": {
"import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
},
}
}
@@ -90,11 +86,11 @@ def test_import(provider, sample_import_1_json, start_datetime, from_file, confi
ems_eos = get_ems()
ems_eos.set_start_datetime(to_datetime(start_datetime, in_timezone="Europe/Berlin"))
if from_file:
config_eos.elecprice.provider_settings.ElecPriceImport.import_json = None
assert config_eos.elecprice.provider_settings.ElecPriceImport.import_json is None
config_eos.elecprice.elecpriceimport.import_json = None
assert config_eos.elecprice.elecpriceimport.import_json is None
else:
config_eos.elecprice.provider_settings.ElecPriceImport.import_file_path = None
assert config_eos.elecprice.provider_settings.ElecPriceImport.import_file_path is None
config_eos.elecprice.elecpriceimport.import_file_path = None
assert config_eos.elecprice.elecpriceimport.import_file_path is None
provider.clear()
# Call the method


@@ -50,7 +50,7 @@ def test_optimize(
fn_out: str,
ngen: int,
config_eos: ConfigEOS,
is_full_run: bool,
is_finalize: bool,
):
"""Test optimierung_ems."""
# Test parameters
@@ -107,8 +107,8 @@ def test_optimize(
genetic_optimization = GeneticOptimization(fixed_seed=fixed_seed)
# Activate with pytest --full-run
if ngen > 10 and not is_full_run:
# Activate with pytest --finalize
if ngen > 10 and not is_finalize:
pytest.skip()
visualize_filename = str((DIR_TESTDATA / f"new_{fn_out}").with_suffix(".pdf"))

tests/test_version.py Normal file

@@ -0,0 +1,119 @@
# tests/test_version.py
import subprocess
import sys
from pathlib import Path
import pytest
import yaml
DIR_PROJECT_ROOT = Path(__file__).parent.parent
GET_VERSION_SCRIPT = DIR_PROJECT_ROOT / "scripts" / "get_version.py"
BUMP_DEV_SCRIPT = DIR_PROJECT_ROOT / "scripts" / "bump_dev_version.py"
UPDATE_SCRIPT = DIR_PROJECT_ROOT / "scripts" / "update_version.py"
# --- Helper to create test files ---
def write_file(path: Path, content: str):
path.write_text(content, encoding="utf-8")
return path
# --- 1. Test get_version.py ---
def test_get_version_prints_non_empty():
result = subprocess.run(
[sys.executable, str(GET_VERSION_SCRIPT)],
capture_output=True,
text=True,
check=True
)
version = result.stdout.strip()
assert version, "get_version.py should print a non-empty version"
assert len(version.split(".")) >= 3, "Version should have at least MAJOR.MINOR.PATCH"
# --- 2. Test update_version.py on multiple file types ---
def test_update_version_multiple_formats(tmp_path):
py_file = write_file(tmp_path / "version.py", '__version__ = "0.1.0"\n')
yaml_file = write_file(tmp_path / "config.yaml", 'version: "0.1.0"\n')
json_file = write_file(tmp_path / "package.json", '{"version": "0.1.0"}\n')
new_version = "0.2.0"
files = [py_file, yaml_file, json_file]
subprocess.run(
[sys.executable, str(UPDATE_SCRIPT), new_version] + [str(f.resolve()) for f in files],
check=True
)
# Verify updates
assert f'__version__ = "{new_version}"' in py_file.read_text()
assert yaml.safe_load(yaml_file.read_text())["version"] == new_version
assert f'"version": "{new_version}"' in json_file.read_text()
# --- 3. Test bump_dev_version.py ---
def test_bump_dev_version_appends_dev(tmp_path):
version_file = write_file(tmp_path / "version.py", 'VERSION_BASE = "0.2.0"\n')
result = subprocess.run(
[sys.executable, str(BUMP_DEV_SCRIPT), str(version_file.resolve())],
capture_output=True,
text=True,
check=True
)
new_version = result.stdout.strip()
assert new_version == "0.2.0+dev"
content = version_file.read_text()
assert f'VERSION_BASE = "{new_version}"' in content
# --- 4. Full workflow simulation with git ---
def test_workflow_git(tmp_path):
# Create git repo
subprocess.run(["git", "init"], cwd=tmp_path, check=True)
subprocess.run(["git", "config", "user.name", "test"], cwd=tmp_path, check=True)
subprocess.run(["git", "config", "user.email", "test@test.com"], cwd=tmp_path, check=True)
# Create files
version_file = write_file(tmp_path / "version.py", 'VERSION_BASE = "0.1.0"\n')
config_file = write_file(tmp_path / "config.yaml", 'version: "0.1.0"\n')
subprocess.run(["git", "add", "."], cwd=tmp_path, check=True)
subprocess.run(["git", "commit", "-m", "initial commit"], cwd=tmp_path, check=True)
# --- Step 1: Calculate version (mock) ---
new_version = "0.2.0"
# --- Step 2: Update files ---
subprocess.run(
[sys.executable, str(UPDATE_SCRIPT), new_version, str(config_file.resolve()), str(version_file.resolve())],
cwd=tmp_path,
check=True
)
# --- Step 3: Commit updated files if needed ---
subprocess.run(["git", "add", str(config_file.resolve()), str(version_file.resolve())], cwd=tmp_path, check=True)
diff_result = subprocess.run(["git", "diff", "--cached", "--quiet"], cwd=tmp_path)
assert diff_result.returncode == 1, "There should be staged changes to commit"
subprocess.run(["git", "commit", "-m", f"chore: bump version to {new_version}"], cwd=tmp_path, check=True)
# --- Step 4: Tag version ---
tag_name = f"v{new_version}"
subprocess.run(["git", "tag", "-a", tag_name, "-m", f"Release {new_version}"], cwd=tmp_path, check=True)
tags = subprocess.run(["git", "tag"], cwd=tmp_path, capture_output=True, text=True, check=True).stdout
assert tag_name in tags
# --- Step 5: Bump dev version ---
result = subprocess.run(
[sys.executable, str(BUMP_DEV_SCRIPT), str(version_file.resolve())],
cwd=tmp_path,
capture_output=True,
text=True,
check=True
)
dev_version = result.stdout.strip()
assert dev_version.endswith("+dev")
assert dev_version.count("+dev") == 1
content = version_file.read_text()
assert f'VERSION_BASE = "{dev_version}"' in content