24 Commits

Author SHA1 Message Date
dependabot[bot]
64b5482c9f build(deps): bump fastapi[standard-no-fastapi-cloud-cli]
Bumps [fastapi[standard-no-fastapi-cloud-cli]](https://github.com/fastapi/fastapi) from 0.121.3 to 0.122.0.
- [Release notes](https://github.com/fastapi/fastapi/releases)
- [Commits](https://github.com/fastapi/fastapi/compare/0.121.3...0.122.0)

---
updated-dependencies:
- dependency-name: fastapi[standard-no-fastapi-cloud-cli]
  dependency-version: 0.122.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-24 23:36:28 +00:00
dependabot[bot]
91be22ab31 build(deps-dev): bump types-docutils (#781)
Bumps [types-docutils](https://github.com/typeshed-internal/stub_uploader) from 0.22.2.20251006 to 0.22.3.20251115.
- [Commits](https://github.com/typeshed-internal/stub_uploader/commits)

---
updated-dependencies:
- dependency-name: types-docutils
  dependency-version: 0.22.3.20251115
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 20:23:19 +01:00
dependabot[bot]
1652e507d8 build(deps-dev): bump pre-commit from 4.4.0 to 4.5.0 (#782)
Bumps [pre-commit](https://github.com/pre-commit/pre-commit) from 4.4.0 to 4.5.0.
- [Release notes](https://github.com/pre-commit/pre-commit/releases)
- [Changelog](https://github.com/pre-commit/pre-commit/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pre-commit/pre-commit/compare/v4.4.0...v4.5.0)

---
updated-dependencies:
- dependency-name: pre-commit
  dependency-version: 4.5.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 20:22:59 +01:00
dependabot[bot]
d4a8c93665 build(deps): bump rich-toolkit from 0.15.1 to 0.16.0 (#780)
Bumps rich-toolkit from 0.15.1 to 0.16.0.

---
updated-dependencies:
- dependency-name: rich-toolkit
  dependency-version: 0.16.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 20:22:50 +01:00
dependabot[bot]
ab568ef37d build(deps): bump fastapi[standard-no-fastapi-cloud-cli] (#776)
Bumps [fastapi[standard-no-fastapi-cloud-cli]](https://github.com/fastapi/fastapi) from 0.121.2 to 0.121.3.
- [Release notes](https://github.com/fastapi/fastapi/releases)
- [Commits](https://github.com/fastapi/fastapi/compare/0.121.2...0.121.3)

---
updated-dependencies:
- dependency-name: fastapi[standard-no-fastapi-cloud-cli]
  dependency-version: 0.121.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-20 16:30:18 +01:00
dependabot[bot]
d7b19c7169 build(deps-dev): bump coverage from 7.11.3 to 7.12.0 (#774)
Bumps [coverage](https://github.com/coveragepy/coveragepy) from 7.11.3 to 7.12.0.
- [Release notes](https://github.com/coveragepy/coveragepy/releases)
- [Changelog](https://github.com/coveragepy/coveragepy/blob/main/CHANGES.rst)
- [Commits](https://github.com/coveragepy/coveragepy/compare/7.11.3...7.12.0)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.12.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-20 16:30:06 +01:00
dependabot[bot]
ec53665f5e build(deps): bump python-fasthtml from 0.12.33 to 0.12.35 (#775)
Bumps [python-fasthtml](https://github.com/AnswerDotAI/fasthtml) from 0.12.33 to 0.12.35.
- [Release notes](https://github.com/AnswerDotAI/fasthtml/releases)
- [Changelog](https://github.com/AnswerDotAI/fasthtml/blob/main/CHANGELOG.md)
- [Commits](https://github.com/AnswerDotAI/fasthtml/commits)

---
updated-dependencies:
- dependency-name: python-fasthtml
  dependency-version: 0.12.35
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-20 16:29:53 +01:00
Bobby Noelte
976a2c8405 chore: automate development version and release generation (#772)
This change introduces a GitHub Action to automate release creation, including
proper tagging and automatic addition of a development marker to the version.

A hash is also appended to development versions to make their state easier to
distinguish.
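For illustration only (this is not the repository's actual scripts/get_version.py), a development
version with an appended short git hash, matching the 0.2.0+dev.4dbc2d style visible in the
generated configuration docs below, could be derived like this:

```python
# Hypothetical sketch, not the project's real version script: build a development
# version string of the form "<base>.<short-hash>", e.g. "0.2.0+dev.4dbc2d".
import subprocess


def development_version(base: str = "0.2.0+dev") -> str:
    """Append the short HEAD hash so development states are easy to distinguish."""
    short_sha = subprocess.run(
        ["git", "rev-parse", "--short", "HEAD"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout.strip()
    return f"{base}.{short_sha}"


if __name__ == "__main__":
    print(development_version())  # e.g. 0.2.0+dev.4dbc2d
```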

Tests and release documentation have been updated to reflect the revised
release workflow. Several files now retrieve the current version dynamically.

The test --full-run option has been renamed to --finalize to make clear
that it is for commit finalization testing.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-20 00:10:19 +01:00
dependabot[bot]
bdbb0b060d build(deps): bump numpy from 2.3.4 to 2.3.5 (#771)
Bumps [numpy](https://github.com/numpy/numpy) from 2.3.4 to 2.3.5.
- [Release notes](https://github.com/numpy/numpy/releases)
- [Changelog](https://github.com/numpy/numpy/blob/main/doc/RELEASE_WALKTHROUGH.rst)
- [Commits](https://github.com/numpy/numpy/compare/v2.3.4...v2.3.5)

---
updated-dependencies:
- dependency-name: numpy
  dependency-version: 2.3.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-18 18:28:03 +01:00
dependabot[bot]
08d7c2ac5b build(deps-dev): bump pytest from 9.0.0 to 9.0.1 (#768)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 9.0.0 to 9.0.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/9.0.0...9.0.1)

---
updated-dependencies:
- dependency-name: pytest
  dependency-version: 9.0.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-16 13:48:44 +01:00
dependabot[bot]
e255718240 build(deps): bump fastapi[standard-no-fastapi-cloud-cli] (#769)
Bumps [fastapi[standard-no-fastapi-cloud-cli]](https://github.com/fastapi/fastapi) from 0.121.1 to 0.121.2.
- [Release notes](https://github.com/fastapi/fastapi/releases)
- [Commits](https://github.com/fastapi/fastapi/compare/0.121.1...0.121.2)

---
updated-dependencies:
- dependency-name: fastapi[standard-no-fastapi-cloud-cli]
  dependency-version: 0.121.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-16 13:48:08 +01:00
Bobby Noelte
4c2997dbd6 feat: add bidding zone to energy charts price prediction (#765)
Energy Charts supports bidding zones. Allow specifying the bidding zone in the configuration.

Extend and simplify the ElecPrice configuration structure and set up config migration to automatically
update the configuration file.
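For illustration, a sketch of the resulting nested structure (values copied from the generated
configuration docs further down, not from this change itself):

```python
# Sketch only: the nested ElecPrice configuration after the restructuring.
# Provider id, charges and the "DE-LU" default are taken from the generated
# config documentation below; other bidding zones such as "AT" or "NL" are allowed.
elecprice_config = {
    "elecprice": {
        "provider": "ElecPriceAkkudoktor",
        "charges_kwh": 0.21,
        "energycharts": {"bidding_zone": "DE-LU"},
    }
}
```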

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-16 13:26:18 +01:00
Bobby Noelte
edff649a5e chore: improve enhancement template (#766)
Improve the enhancement template to not use fenced Python sections.

Add a section to describe the enhancement.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-16 13:25:58 +01:00
Bobby Noelte
bad99fc62d chore: bump python version to 3.13.9 (#767) 2025-11-16 13:25:45 +01:00
Bobby Noelte
7bf9dd723e chore: improve doc generation and test (#762)
Improve documentation generation and add tests for documentation.
Extend sphinx by todo directive.

The configuration table is now split into several tables. The test
is adapted accordingly.

There is a new test that checks that the docstrings comply with the
RST format used by Sphinx to create the documentation. We cannot
use Markdown in docstrings. The docstrings are adapted accordingly.
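As a hypothetical illustration (not taken from the EOS sources), a docstring in the RST style
accepted by Sphinx uses field sections and double-backtick literals instead of Markdown:

```python
# Hypothetical example of a Sphinx-compatible docstring; not from the EOS code base.
def clamp_latitude(latitude: float) -> float:
    """Clamp a latitude value to the valid range.

    Arguments:
        latitude (float): Latitude in decimal degrees, between -90 and 90.

    Returns:
        float: The latitude limited to ``[-90.0, 90.0]``.
    """
    return max(-90.0, min(90.0, latitude))
```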

An additional test checks that the documentation can be built with Sphinx.
This test takes very long and is therefore only enabled in full-run (i.e. CI) mode.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-13 22:53:46 +01:00
Bobby Noelte
8da137f8f1 fix: cached_method deprecated and test
cachebox deprecated the cached_method decorator; cached is used instead.
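A minimal sketch of the swap (assuming cachebox's cached decorator and LRUCache class; this is
not the EOS code):

```python
# Minimal sketch, assuming cachebox's `cached` decorator and `LRUCache`;
# illustration only, not the EOS implementation.
from cachebox import LRUCache, cached


@cached(LRUCache(128))
def expensive_lookup(key: str) -> str:
    """The result is stored in the LRU cache instead of being recomputed."""
    return key.upper()
```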

Fix cache integration tests that were accessing real world addresses.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-13 19:35:08 +01:00
dependabot[bot]
cab3a3dd21 build(deps-dev): bump pre-commit from 4.3.0 to 4.4.0 (#758)
Bumps [pre-commit](https://github.com/pre-commit/pre-commit) from 4.3.0 to 4.4.0.
- [Release notes](https://github.com/pre-commit/pre-commit/releases)
- [Changelog](https://github.com/pre-commit/pre-commit/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pre-commit/pre-commit/compare/v4.3.0...v4.4.0)

---
updated-dependencies:
- dependency-name: pre-commit
  dependency-version: 4.4.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-11 01:07:46 +01:00
dependabot[bot]
74a9271e88 build(deps-dev): bump commitizen from 4.9.1 to 4.10.0 (#760)
Bumps [commitizen](https://github.com/commitizen-tools/commitizen) from 4.9.1 to 4.10.0.
- [Release notes](https://github.com/commitizen-tools/commitizen/releases)
- [Changelog](https://github.com/commitizen-tools/commitizen/blob/master/CHANGELOG.md)
- [Commits](https://github.com/commitizen-tools/commitizen/compare/v4.9.1...v4.10.0)

---
updated-dependencies:
- dependency-name: commitizen
  dependency-version: 4.10.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-11 01:07:33 +01:00
dependabot[bot]
5e681e4356 build(deps): bump fastapi[standard-no-fastapi-cloud-cli]
Bumps [fastapi[standard-no-fastapi-cloud-cli]](https://github.com/fastapi/fastapi) from 0.121.0 to 0.121.1.
- [Release notes](https://github.com/fastapi/fastapi/releases)
- [Commits](https://github.com/fastapi/fastapi/compare/0.121.0...0.121.1)

---
updated-dependencies:
- dependency-name: fastapi[standard-no-fastapi-cloud-cli]
  dependency-version: 0.121.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-11-10 22:40:53 +00:00
dependabot[bot]
1b5616b05c build(deps-dev): bump pytest from 8.4.2 to 9.0.0 (#754)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.4.2 to 9.0.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.4.2...9.0.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-version: 9.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-10 23:19:41 +01:00
dependabot[bot]
12bb7a5736 build(deps-dev): bump coverage from 7.11.1 to 7.11.3 (#756)
Bumps [coverage](https://github.com/coveragepy/coveragepy) from 7.11.1 to 7.11.3.
- [Release notes](https://github.com/coveragepy/coveragepy/releases)
- [Changelog](https://github.com/coveragepy/coveragepy/blob/main/CHANGES.rst)
- [Commits](https://github.com/coveragepy/coveragepy/compare/7.11.1...7.11.3)

---
updated-dependencies:
- dependency-name: coverage
  dependency-version: 7.11.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-10 23:16:27 +01:00
dependabot[bot]
1f4a975a55 build(deps): bump fastapi-cli from 0.0.14 to 0.0.16 (#757)
Bumps [fastapi-cli](https://github.com/fastapi/fastapi-cli) from 0.0.14 to 0.0.16.
- [Release notes](https://github.com/fastapi/fastapi-cli/releases)
- [Changelog](https://github.com/fastapi/fastapi-cli/blob/main/release-notes.md)
- [Commits](https://github.com/fastapi/fastapi-cli/compare/0.0.14...0.0.16)

---
updated-dependencies:
- dependency-name: fastapi-cli
  dependency-version: 0.0.16
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-10 23:15:54 +01:00
Bobby Noelte
e7b43782a4 fix: pydantic extra keywords deprecated (#753)
Pydantic deprecates using extra keyword arguments on Field.
json_schema_extra is used instead.

Deprecated in Pydantic V2.0 to be removed in V3.0.
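A hypothetical before/after sketch (model and field invented for illustration):

```python
# Hypothetical illustration; the model and field are invented.
from pydantic import BaseModel, Field


class ServerSettings(BaseModel):
    # Deprecated: arbitrary extra keyword arguments on Field, e.g.
    #   port: int = Field(default=8503, deprecated_extra="shown in schema")
    # Preferred: pass such data explicitly via json_schema_extra.
    port: int = Field(default=8503, json_schema_extra={"deprecated_extra": "shown in schema"})
```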

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-10 16:57:44 +01:00
Bobby Noelte
54b0622a96 chore: set development version marker 0.2.0+dev
* Development version v0.2.0+dev

This pull request marks the repository as back in active development.

* Changes

- Set version to `v0.2.0+dev`

No changelog entry is needed.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
2025-11-09 09:41:26 +01:00
102 changed files with 5979 additions and 3694 deletions


@@ -8,18 +8,20 @@ body:
   - type: markdown
     attributes:
       value: >
-        Please post your idea first as a [Discussion](https://github.com/Akkudoktor-EOS/EOS/discussions)
-        to validate it and bring attention to it. After validation,
-        you can open this issue for a more technical developer discussion.
         Check the [Contributor Guide](https://github.com/Akkudoktor-EOS/EOS/blob/main/CONTRIBUTING.md)
         if you need more information.
+  - type: textarea
+    attributes:
+      label: "Describe the enhancement or feature request:"
+    validations:
+      required: true
   - type: textarea
     attributes:
       label: "Link to discussion and related issues"
       description: >
         <link here>
-      render: python
     validations:
       required: false
@@ -28,6 +30,5 @@ body:
       label: "Proposed implementation"
       description: >
         How it could be implemented with a high level API.
-      render: python
     validations:
       required: false

.github/workflows/bump-version.yml (new file, 99 lines)

@@ -0,0 +1,99 @@
name: Bump Version

# Trigger the workflow on any push to main
on:
  push:
    branches:
      - main

jobs:
  bump-version:
    runs-on: ubuntu-latest
    name: Bump Version Workflow

    steps:
      # --- Step 1: Checkout the repository ---
      - name: Checkout repo
        uses: actions/checkout@v4
        with:
          fetch-depth: 0 # Needed to create tags and see full history
          persist-credentials: true # Needed for pushing commits and tags

      # --- Step 2: Set up Python ---
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"

      # --- Step 3: Calculate version dynamically ---
      - name: Calculate version
        id: calc
        run: |
          # Call custom version calculation script
          VERSION=$(python scripts/get_version.py)
          echo "version=$VERSION" >> $GITHUB_OUTPUT
          echo "Computed version: $VERSION"

      # --- Step 4: Skip workflow for development versions ---
      - name: Skip if version contains 'dev'
        run: |
          # Exit workflow early if the version contains 'dev'
          if [[ "${{ steps.calc.outputs.version }}" == *dev* ]]; then
            echo "Version contains 'dev', skipping bump version workflow."
            exit 0
          fi

      # --- Step 5: Update files and commit if necessary ---
      - name: Update files and commit
        run: |
          # Define files to update
          UPDATE_FILES="haaddon/config.yaml"
          # Call general Python version replacement script
          python scripts/update_version.py "${{ steps.calc.outputs.version }}" $UPDATE_FILES
          # Commit changes if any
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git add $UPDATE_FILES
          if git diff --cached --quiet; then
            echo "No files changed. Skipping commit."
          else
            git commit -m "chore: bump version to ${{ steps.calc.outputs.version }}"
            git push
          fi

      # --- Step 6: Create release tag ---
      - name: Create release tag if it does not exist
        id: tagging
        run: |
          TAG="v${{ steps.calc.outputs.version }}"
          if git rev-parse --verify "$TAG" >/dev/null 2>&1; then
            echo "Tag $TAG already exists. Skipping tag creation."
            echo "created=false" >> $GITHUB_OUTPUT
          else
            git tag -a "v${{ steps.calc.outputs.version }}" -m "Release ${{ steps.calc.outputs.version }}"
            git push origin "v${{ steps.calc.outputs.version }}"
            echo "created=true" >> $GITHUB_OUTPUT
          fi

      # --- Step 7: Bump to development version ---
      - name: Bump dev version
        id: bump_dev
        run: |
          VERSION_BASE=$(python scripts/bump_dev_version.py | tail -n1)
          if [ -z "$VERSION_BASE" ]; then
            echo "Error: bump_dev_version.py returned an empty version."
            exit 1
          fi
          echo "version_base=$VERSION_BASE" >> $GITHUB_OUTPUT
          git config user.name "github-actions"
          git config user.email "actions@github.com"
          git add src/akkudoktoreos/core/version.py
          if git diff --cached --quiet; then
            echo "version.py not changed. Skipping commit."
          else
            git commit -m "chore: bump dev version to ${VERSION_BASE}"
            git push
          fi


@@ -16,7 +16,7 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: "3.12"
+          python-version: "3.13.9"
       - name: Install dependencies
         run: |
@@ -26,7 +26,7 @@ jobs:
       - name: Run Pytest
         run: |
           pip install -e .
-          python -m pytest --full-run --check-config-side-effect -vs --cov src --cov-report term-missing
+          python -m pytest --finalize --check-config-side-effect -vs --cov src --cov-report term-missing
       - name: Upload test artifacts
         uses: actions/upload-artifact@v4


@@ -37,7 +37,9 @@ repos:
        additional_dependencies:
          - types-requests==2.32.4.20250913
          - pandas-stubs==2.3.2.250926
-          - tokenize-rt==3.2.0
+          - tokenize-rt==6.2.0
+          - types-docutils==0.22.2.20251006
+          - types-PyYaml==6.0.12.20250915
        pass_filenames: false

  # --- Markdown linter ---
@@ -46,7 +48,6 @@ repos:
    hooks:
      - id: pymarkdown
        files: ^docs/
-        exclude: ^docs/_generated
        args:
          - --config=docs/pymarkdown.json
          - scan


@@ -1,5 +1,8 @@
 # syntax=docker/dockerfile:1.7
-ARG PYTHON_VERSION=3.12.7
+# Dockerfile
+
+# Set base image first
+ARG PYTHON_VERSION=3.13.9
 FROM python:${PYTHON_VERSION}-slim

 LABEL source="https://github.com/Akkudoktor-EOS/EOS"
@@ -32,28 +35,25 @@ RUN adduser --system --group --no-create-home eos \
     && mkdir -p "${EOS_CONFIG_DIR}" \
     && chown eos "${EOS_CONFIG_DIR}"

+# Install requirements
 COPY requirements.txt .
 RUN --mount=type=cache,target=/root/.cache/pip \
     pip install --no-cache-dir -r requirements.txt

+# Copy source
+COPY src/ ./src
 COPY pyproject.toml .
-RUN mkdir -p src && pip install --no-cache-dir -e .
-COPY src src

+# Create version information
+COPY scripts/get_version.py ./scripts/get_version.py
+RUN python scripts/get_version.py > ./version.txt
+RUN rm ./scripts/get_version.py

-# Create minimal default configuration for Docker to fix EOSDash accessibility (#629)
-# This ensures EOSDash binds to 0.0.0.0 instead of 127.0.0.1 in containers
-RUN echo '{\n\
-"server": {\n\
-"host": "0.0.0.0",\n\
-"port": 8503,\n\
-"startup_eosdash": true,\n\
-"eosdash_host": "0.0.0.0",\n\
-"eosdash_port": 8504\n\
-}\n\
-}' > "${EOS_CONFIG_DIR}/EOS.config.json" \
-    && chown eos:eos "${EOS_CONFIG_DIR}/EOS.config.json"
+RUN echo "Building Akkudoktor-EOS with Python $PYTHON_VERSION"
+
+# Install akkudoktoreos package in editable form (-e)
+# pyproject-toml will read the version from version.txt
+RUN pip install --no-cache-dir -e .

 USER eos
 ENTRYPOINT []
@@ -61,6 +61,7 @@ ENTRYPOINT []
 EXPOSE 8503
 EXPOSE 8504

-CMD ["python", "src/akkudoktoreos/server/eos.py", "--host", "0.0.0.0"]
+# Ensure EOS and EOSdash bind to 0.0.0.0
+CMD ["python", "-m", "akkudoktoreos.server.eos", "--host", "0.0.0.0"]

 VOLUME ["${MPLCONFIGDIR}", "${EOS_CACHE_DIR}", "${EOS_OUTPUT_DIR}", "${EOS_CONFIG_DIR}"]


@@ -1,5 +1,8 @@
 # Define the targets
-.PHONY: help venv pip install dist test test-full test-system test-ci test-profile docker-run docker-build docs read-docs clean format gitlint mypy run run-dev run-dash run-dash-dev bumps
+.PHONY: help venv pip install dist test test-full test-system test-ci test-profile docker-run docker-build docs read-docs clean format gitlint mypy run run-dev run-dash run-dash-dev prepare-version test-version
+
+# - Take VERSION from version.py
+VERSION := $(shell python3 scripts/get_version.py)

 # Default target
 all: help
@@ -25,18 +28,19 @@ help:
 	@echo " run-dash - Run EOSdash production server in virtual environment."
 	@echo " run-dash-dev - Run EOSdash development server in virtual environment (automatically reloads)."
 	@echo " test - Run tests."
-	@echo " test-full - Run tests with full optimization."
+	@echo " test-full - Run all tests (e.g. to finalize a commit)."
 	@echo " test-system - Run tests with system tests enabled."
 	@echo " test-ci - Run tests as CI does. No user config file allowed."
 	@echo " test-profile - Run single test optimization with profiling."
 	@echo " dist - Create distribution (in dist/)."
 	@echo " clean - Remove generated documentation, distribution and virtual environment."
-	@echo " bump - Bump version to next release version."
+	@echo " prepare-version - Prepare a version defined in setup.py."

 # Target to set up a Python 3 virtual environment
 venv:
 	python3 -m venv .venv
-	@echo "Virtual environment created in '.venv'. Activate it using 'source .venv/bin/activate'."
+	@PYVER=$$(./.venv/bin/python --version) && \
+	echo "Virtual environment created in '.venv' with $$PYVER. Activate it using 'source .venv/bin/activate'."

 # Target to install dependencies from requirements.txt
 pip: venv
@@ -49,8 +53,12 @@ pip-dev: pip
 	.venv/bin/pip install -r requirements-dev.txt
 	@echo "Dependencies installed from requirements-dev.txt."

+# Target to create a version.txt
+version-txt:
+	echo "$(VERSION)" > version.txt
+
 # Target to install EOS in editable form (development mode) into virtual environment.
-install: pip-dev
+install: pip-dev version-txt
 	.venv/bin/pip install build
 	.venv/bin/pip install -e .
 	@echo "EOS installed in editable form (development mode)."
@@ -62,7 +70,7 @@ dist: pip
 	@echo "Distribution created (see dist/)."

 # Target to generate documentation
-gen-docs: pip-dev
+gen-docs: pip-dev version-txt
 	.venv/bin/pip install -e .
 	.venv/bin/python ./scripts/generate_config_md.py --output-file docs/_generated/config.md
 	.venv/bin/python ./scripts/generate_openapi_md.py --output-file docs/_generated/openapi.md
@@ -71,12 +79,13 @@ gen-docs: pip-dev
 # Target to build HTML documentation
 docs: pip-dev
-	.venv/bin/sphinx-build -M html docs build/docs
+	.venv/bin/pytest --full-run tests/test_docsphinx.py
 	@echo "Documentation build to build/docs/html/."

 # Target to read the HTML documentation
-read-docs: docs
+read-docs:
 	@echo "Read the documentation in your browser"
+	.venv/bin/pytest --full-run tests/test_docsphinx.py
 	.venv/bin/python -m webbrowser build/docs/html/index.html

 # Clean Python bytecode
@@ -125,7 +134,7 @@ test:
 # Target to run tests as done by CI on Github.
 test-ci:
 	@echo "Running tests as CI..."
-	.venv/bin/pytest --full-run --check-config-side-effect -vs --cov src --cov-report term-missing
+	.venv/bin/pytest --finalize --check-config-side-effect -vs --cov src --cov-report term-missing

 # Target to run tests including the system tests.
 test-system:
@@ -135,7 +144,7 @@ test-system:
 # Target to run all tests.
 test-full:
 	@echo "Running all tests..."
-	.venv/bin/pytest --full-run
+	.venv/bin/pytest --finalize

 # Target to run tests including the single test optimization with profiling.
 test-profile:
@@ -156,21 +165,26 @@ mypy:
 # Run entire setup on docker
 docker-run:
+	@docker pull python:3.13.9-slim
 	@docker compose up --remove-orphans

 docker-build:
-	@docker compose build --pull
+	@docker pull python:3.13.9-slim
+	@docker compose build

-# Bump Akkudoktoreos version
-VERSION ?= 0.2.0
-NEW_VERSION ?= $(VERSION)+dev
+# Propagete version info to all version files
+# Take UPDATE_FILES from GitHub action bump-version.yml
+UPDATE_FILES := $(shell sed -n 's/^[[:space:]]*UPDATE_FILES[[:space:]]*=[[:space:]]*"\([^"]*\)".*/\1/p' \
+	.github/workflows/bump-version.yml)
+
+prepare-version: #pip-dev
+	@echo "Update version to $(VERSION) from version.py in files $(UPDATE_FILES) and doc"
+	.venv/bin/python ./scripts/update_version.py $(VERSION) $(UPDATE_FILES)
+	.venv/bin/python ./scripts/convert_lightweight_tags.py
+	.venv/bin/python ./scripts/generate_config_md.py --output-file docs/_generated/config.md
+	.venv/bin/python ./scripts/generate_openapi_md.py --output-file docs/_generated/openapi.md
+	.venv/bin/python ./scripts/generate_openapi.py --output-file openapi.json
+	.venv/bin/pytest -vv --finalize tests/test_version.py

-bump: pip-dev
-	@echo "Bumping akkudoktoreos version from $(VERSION) to $(NEW_VERSION) (dry-run: $(EXTRA_ARGS))"
-	.venv/bin/python scripts/convert_lightweight_tags.py
-	.venv/bin/python scripts/bump_version.py $(VERSION) $(NEW_VERSION) $(EXTRA_ARGS)
-
-bump-dry: pip-dev
-	@echo "Bumping akkudoktoreos version from $(VERSION) to $(NEW_VERSION) (dry-run: --dry-run)"
-	.venv/bin/python scripts/convert_lightweight_tags.py
-	.venv/bin/python scripts/bump_version.py $(VERSION) $(NEW_VERSION) --dry-run
+test-version:
+	echo "Test version information to be correctly set in all version files"
+	.venv/bin/pytest -vv tests/test_version.py


@@ -1,6 +1,7 @@
 ---
 networks:
   default:
+    external: true
     name: "eos"
 services:
   eos:
@@ -38,18 +39,6 @@ services:
       - "${EOS_SERVER__EOSDASH_PORT}:8504"
     # Volume mount configuration (optional)
-    # IMPORTANT: When mounting local directories, the default config won't be available.
-    # You must create an EOS.config.json file in your local config directory with:
-    # {
-    #   "server": {
-    #     "host": "0.0.0.0",        # Required for Docker container accessibility
-    #     "port": 8503,
-    #     "startup_eosdash": true,
-    #     "eosdash_host": "0.0.0.0",  # Required for Docker container accessibility
-    #     "eosdash_port": 8504
-    #   }
-    # }
-    #
     # Example volume mounts (uncomment to use):
     # volumes:
     #   - ./config:/opt/eos/config  # Mount local config directory

File diff suppressed because it is too large.


@@ -0,0 +1,28 @@
## Cache Configuration
<!-- pyml disable line-length -->
:::{table} cache
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| cleanup_interval | `EOS_CACHE__CLEANUP_INTERVAL` | `float` | `rw` | `300` | Intervall in seconds for EOS file cache cleanup. |
| subpath | `EOS_CACHE__SUBPATH` | `Optional[pathlib.Path]` | `rw` | `cache` | Sub-path for the EOS cache data directory. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"cache": {
"subpath": "cache",
"cleanup_interval": 300.0
}
}
```
<!-- pyml enable line-length -->


@@ -0,0 +1,405 @@
## Base configuration for devices simulation settings
<!-- pyml disable line-length -->
:::{table} devices
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| batteries | `EOS_DEVICES__BATTERIES` | `Optional[list[akkudoktoreos.devices.devices.BatteriesCommonSettings]]` | `rw` | `None` | List of battery devices |
| electric_vehicles | `EOS_DEVICES__ELECTRIC_VEHICLES` | `Optional[list[akkudoktoreos.devices.devices.BatteriesCommonSettings]]` | `rw` | `None` | List of electric vehicle devices |
| home_appliances | `EOS_DEVICES__HOME_APPLIANCES` | `Optional[list[akkudoktoreos.devices.devices.HomeApplianceCommonSettings]]` | `rw` | `None` | List of home appliances |
| inverters | `EOS_DEVICES__INVERTERS` | `Optional[list[akkudoktoreos.devices.devices.InverterCommonSettings]]` | `rw` | `None` | List of inverters |
| max_batteries | `EOS_DEVICES__MAX_BATTERIES` | `Optional[int]` | `rw` | `None` | Maximum number of batteries that can be set |
| max_electric_vehicles | `EOS_DEVICES__MAX_ELECTRIC_VEHICLES` | `Optional[int]` | `rw` | `None` | Maximum number of electric vehicles that can be set |
| max_home_appliances | `EOS_DEVICES__MAX_HOME_APPLIANCES` | `Optional[int]` | `rw` | `None` | Maximum number of home_appliances that can be set |
| max_inverters | `EOS_DEVICES__MAX_INVERTERS` | `Optional[int]` | `rw` | `None` | Maximum number of inverters that can be set |
| measurement_keys | | `Optional[list[str]]` | `ro` | `N/A` | None |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_batteries": 1,
"electric_vehicles": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_electric_vehicles": 1,
"inverters": [],
"max_inverters": 1,
"home_appliances": [],
"max_home_appliances": 1
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_batteries": 1,
"electric_vehicles": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_electric_vehicles": 1,
"inverters": [],
"max_inverters": 1,
"home_appliances": [],
"max_home_appliances": 1,
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w",
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
}
```
<!-- pyml enable line-length -->
### Inverter devices base settings
<!-- pyml disable line-length -->
:::{table} devices::inverters::list
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| battery_id | `Optional[str]` | `rw` | `None` | ID of battery controlled by this inverter. |
| device_id | `str` | `rw` | `<unknown>` | ID of device |
| max_power_w | `Optional[float]` | `rw` | `None` | Maximum power [W]. |
| measurement_keys | `Optional[list[str]]` | `ro` | `N/A` | None |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"inverters": [
{
"device_id": "battery1",
"max_power_w": 10000.0,
"battery_id": null
}
]
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"inverters": [
{
"device_id": "battery1",
"max_power_w": 10000.0,
"battery_id": null,
"measurement_keys": []
}
]
}
}
```
<!-- pyml enable line-length -->
### Home Appliance devices base settings
<!-- pyml disable line-length -->
:::{table} devices::home_appliances::list
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| consumption_wh | `int` | `rw` | `required` | Energy consumption [Wh]. |
| device_id | `str` | `rw` | `<unknown>` | ID of device |
| duration_h | `int` | `rw` | `required` | Usage duration in hours [0 ... 24]. |
| measurement_keys | `Optional[list[str]]` | `ro` | `N/A` | None |
| time_windows | `Optional[akkudoktoreos.utils.datetimeutil.TimeWindowSequence]` | `rw` | `None` | Sequence of allowed time windows. Defaults to optimization general time window. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"home_appliances": [
{
"device_id": "battery1",
"consumption_wh": 2000,
"duration_h": 1,
"time_windows": {
"windows": [
{
"start_time": "10:00:00.000000 Europe/Berlin",
"duration": "2 hours",
"day_of_week": null,
"date": null,
"locale": null
}
]
}
}
]
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"home_appliances": [
{
"device_id": "battery1",
"consumption_wh": 2000,
"duration_h": 1,
"time_windows": {
"windows": [
{
"start_time": "10:00:00.000000 Europe/Berlin",
"duration": "2 hours",
"day_of_week": null,
"date": null,
"locale": null
}
]
},
"measurement_keys": []
}
]
}
}
```
<!-- pyml enable line-length -->
### Battery devices base settings
<!-- pyml disable line-length -->
:::{table} devices::batteries::list
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| capacity_wh | `int` | `rw` | `8000` | Capacity [Wh]. |
| charge_rates | `Optional[numpydantic.vendor.npbase_meta_classes.NDArray]` | `rw` | `[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]` | Charge rates as factor of maximum charging power [0.00 ... 1.00]. None triggers fallback to default charge-rates. |
| charging_efficiency | `float` | `rw` | `0.88` | Charging efficiency [0.01 ... 1.00]. |
| device_id | `str` | `rw` | `<unknown>` | ID of device |
| discharging_efficiency | `float` | `rw` | `0.88` | Discharge efficiency [0.01 ... 1.00]. |
| levelized_cost_of_storage_kwh | `float` | `rw` | `0.0` | Levelized cost of storage (LCOS), the average lifetime cost of delivering one kWh [€/kWh]. |
| max_charge_power_w | `Optional[float]` | `rw` | `5000` | Maximum charging power [W]. |
| max_soc_percentage | `int` | `rw` | `100` | Maximum state of charge (SOC) as percentage of capacity [%]. |
| measurement_key_power_3_phase_sym_w | `str` | `ro` | `N/A` | None |
| measurement_key_power_l1_w | `str` | `ro` | `N/A` | None |
| measurement_key_power_l2_w | `str` | `ro` | `N/A` | None |
| measurement_key_power_l3_w | `str` | `ro` | `N/A` | None |
| measurement_key_soc_factor | `str` | `ro` | `N/A` | None |
| measurement_keys | `Optional[list[str]]` | `ro` | `N/A` | None |
| min_charge_power_w | `Optional[float]` | `rw` | `50` | Minimum charging power [W]. |
| min_soc_percentage | `int` | `rw` | `0` | Minimum state of charge (SOC) as percentage of capacity [%]. This is the target SoC for charging |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.12,
"max_charge_power_w": 5000.0,
"min_charge_power_w": 50.0,
"charge_rates": "[0. 0.25 0.5 0.75 1. ]",
"min_soc_percentage": 10,
"max_soc_percentage": 100
}
]
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.12,
"max_charge_power_w": 5000.0,
"min_charge_power_w": 50.0,
"charge_rates": "[0. 0.25 0.5 0.75 1. ]",
"min_soc_percentage": 10,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
]
}
}
```
<!-- pyml enable line-length -->


@@ -0,0 +1,99 @@
## Electricity Price Prediction Configuration
<!-- pyml disable line-length -->
:::{table} elecprice
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| charges_kwh | `EOS_ELECPRICE__CHARGES_KWH` | `Optional[float]` | `rw` | `None` | Electricity price charges [€/kWh]. Will be added to variable market price. |
| elecpriceimport | `EOS_ELECPRICE__ELECPRICEIMPORT` | `ElecPriceImportCommonSettings` | `rw` | `required` | Import provider settings. |
| energycharts | `EOS_ELECPRICE__ENERGYCHARTS` | `ElecPriceEnergyChartsCommonSettings` | `rw` | `required` | Energy Charts provider settings. |
| provider | `EOS_ELECPRICE__PROVIDER` | `Optional[str]` | `rw` | `None` | Electricity price provider id of provider to be used. |
| vat_rate | `EOS_ELECPRICE__VAT_RATE` | `Optional[float]` | `rw` | `1.19` | VAT rate factor applied to electricity price when charges are used. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"elecprice": {
"provider": "ElecPriceAkkudoktor",
"charges_kwh": 0.21,
"vat_rate": 1.19,
"elecpriceimport": {
"import_file_path": null,
"import_json": null
},
"energycharts": {
"bidding_zone": "DE-LU"
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for Energy Charts electricity price provider
<!-- pyml disable line-length -->
:::{table} elecprice::energycharts
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| bidding_zone | `<enum 'EnergyChartsBiddingZones'>` | `rw` | `EnergyChartsBiddingZones.DE_LU` | Bidding Zone: 'AT', 'BE', 'CH', 'CZ', 'DE-LU', 'DE-AT-LU', 'DK1', 'DK2', 'FR', 'HU', 'IT-NORTH', 'NL', 'NO2', 'PL', 'SE4' or 'SI' |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"elecprice": {
"energycharts": {
"bidding_zone": "AT"
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for elecprice data import from file or JSON String
<!-- pyml disable line-length -->
:::{table} elecprice::elecpriceimport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import elecprice data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of electricity price forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"elecprice": {
"elecpriceimport": {
"import_file_path": null,
"import_json": "{\"elecprice_marketprice_wh\": [0.0003384, 0.0003318, 0.0003284]}"
}
}
}
```
<!-- pyml enable line-length -->


@@ -0,0 +1,30 @@
## Energy Management Configuration
<!-- pyml disable line-length -->
:::{table} ems
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| interval | `EOS_EMS__INTERVAL` | `Optional[float]` | `rw` | `None` | Intervall in seconds between EOS energy management runs. |
| mode | `EOS_EMS__MODE` | `Optional[akkudoktoreos.core.emsettings.EnergyManagementMode]` | `rw` | `None` | Energy management mode [OPTIMIZATION | PREDICTION]. |
| startup_delay | `EOS_EMS__STARTUP_DELAY` | `float` | `rw` | `5` | Startup delay in seconds for EOS energy management runs. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"ems": {
"startup_delay": 5.0,
"interval": 300.0,
"mode": "OPTIMIZATION"
}
}
```
<!-- pyml enable line-length -->


@@ -0,0 +1,215 @@
## Full example Config
<!-- pyml disable line-length -->
```json
{
"cache": {
"subpath": "cache",
"cleanup_interval": 300.0
},
"devices": {
"batteries": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_batteries": 1,
"electric_vehicles": [
{
"device_id": "battery1",
"capacity_wh": 8000,
"charging_efficiency": 0.88,
"discharging_efficiency": 0.88,
"levelized_cost_of_storage_kwh": 0.0,
"max_charge_power_w": 5000,
"min_charge_power_w": 50,
"charge_rates": "[0. 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1. ]",
"min_soc_percentage": 0,
"max_soc_percentage": 100,
"measurement_key_soc_factor": "battery1-soc-factor",
"measurement_key_power_l1_w": "battery1-power-l1-w",
"measurement_key_power_l2_w": "battery1-power-l2-w",
"measurement_key_power_l3_w": "battery1-power-l3-w",
"measurement_key_power_3_phase_sym_w": "battery1-power-3-phase-sym-w",
"measurement_keys": [
"battery1-soc-factor",
"battery1-power-l1-w",
"battery1-power-l2-w",
"battery1-power-l3-w",
"battery1-power-3-phase-sym-w"
]
}
],
"max_electric_vehicles": 1,
"inverters": [],
"max_inverters": 1,
"home_appliances": [],
"max_home_appliances": 1
},
"elecprice": {
"provider": "ElecPriceAkkudoktor",
"charges_kwh": 0.21,
"vat_rate": 1.19,
"elecpriceimport": {
"import_file_path": null,
"import_json": null
},
"energycharts": {
"bidding_zone": "DE-LU"
}
},
"ems": {
"startup_delay": 5.0,
"interval": 300.0,
"mode": "OPTIMIZATION"
},
"feedintariff": {
"provider": "FeedInTariffFixed",
"provider_settings": {
"FeedInTariffFixed": null,
"FeedInTariffImport": null
}
},
"general": {
"version": "0.2.0+dev.4dbc2d",
"data_folder_path": null,
"data_output_subpath": "output",
"latitude": 52.52,
"longitude": 13.405
},
"load": {
"provider": "LoadAkkudoktor",
"provider_settings": {
"LoadAkkudoktor": null,
"LoadVrm": null,
"LoadImport": null
}
},
"logging": {
"console_level": "TRACE",
"file_level": "TRACE"
},
"measurement": {
"load_emr_keys": [
"load0_emr"
],
"grid_export_emr_keys": [
"grid_export_emr"
],
"grid_import_emr_keys": [
"grid_import_emr"
],
"pv_production_emr_keys": [
"pv1_emr"
]
},
"optimization": {
"horizon_hours": 24,
"interval": 3600,
"algorithm": "GENETIC",
"genetic": {
"individuals": 400,
"generations": 400,
"seed": null,
"penalties": {
"ev_soc_miss": 10
}
}
},
"prediction": {
"hours": 48,
"historic_hours": 48
},
"pvforecast": {
"provider": "PVForecastAkkudoktor",
"provider_settings": {
"PVForecastImport": null,
"PVForecastVrm": null
},
"planes": [
{
"surface_tilt": 10.0,
"surface_azimuth": 180.0,
"userhorizon": [
10.0,
20.0,
30.0
],
"peakpower": 5.0,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 0,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 6000,
"modules_per_string": 20,
"strings_per_inverter": 2
},
{
"surface_tilt": 20.0,
"surface_azimuth": 90.0,
"userhorizon": [
5.0,
15.0,
25.0
],
"peakpower": 3.5,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 1,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 4000,
"modules_per_string": 20,
"strings_per_inverter": 2
}
],
"max_planes": 1
},
"server": {
"host": "127.0.0.1",
"port": 8503,
"verbose": false,
"startup_eosdash": true,
"eosdash_host": "127.0.0.1",
"eosdash_port": 8504
},
"utils": {},
"weather": {
"provider": "WeatherImport",
"provider_settings": {
"WeatherImport": null
}
}
}
```
<!-- pyml enable line-length -->


@@ -0,0 +1,126 @@
## Feed In Tariff Prediction Configuration
<!-- pyml disable line-length -->
:::{table} feedintariff
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| provider | `EOS_FEEDINTARIFF__PROVIDER` | `Optional[str]` | `rw` | `None` | Feed in tariff provider id of provider to be used. |
| provider_settings | `EOS_FEEDINTARIFF__PROVIDER_SETTINGS` | `FeedInTariffCommonProviderSettings` | `rw` | `required` | Provider settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"feedintariff": {
"provider": "FeedInTariffFixed",
"provider_settings": {
"FeedInTariffFixed": null,
"FeedInTariffImport": null
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for feed in tariff data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} feedintariff::provider_settings::FeedInTariffImport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import feed in tariff data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of feed in tariff forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"feedintariff": {
"provider_settings": {
"FeedInTariffImport": {
"import_file_path": null,
"import_json": "{\"fead_in_tariff_wh\": [0.000078, 0.000078, 0.000023]}"
}
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for elecprice fixed price
<!-- pyml disable line-length -->
:::{table} feedintariff::provider_settings::FeedInTariffFixed
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| feed_in_tariff_kwh | `Optional[float]` | `rw` | `None` | Electricity price feed in tariff [€/kWH]. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"feedintariff": {
"provider_settings": {
"FeedInTariffFixed": {
"feed_in_tariff_kwh": 0.078
}
}
}
}
```
<!-- pyml enable line-length -->
### Feed In Tariff Prediction Provider Configuration
<!-- pyml disable line-length -->
:::{table} feedintariff::provider_settings
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| FeedInTariffFixed | `Optional[akkudoktoreos.prediction.feedintarifffixed.FeedInTariffFixedCommonSettings]` | `rw` | `None` | FeedInTariffFixed settings |
| FeedInTariffImport | `Optional[akkudoktoreos.prediction.feedintariffimport.FeedInTariffImportCommonSettings]` | `rw` | `None` | FeedInTariffImport settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"feedintariff": {
"provider_settings": {
"FeedInTariffFixed": null,
"FeedInTariffImport": null
}
}
}
```
<!-- pyml enable line-length -->

View File

@@ -0,0 +1,73 @@
## Settings for common configuration
General configuration to set directories of cache and output files and system location (latitude
and longitude).

Validators ensure each parameter is within a specified range. A computed property, `timezone`,
determines the time zone based on latitude and longitude.

Attributes:

- `latitude` (Optional[float]): Latitude in degrees, must be between -90 and 90.
- `longitude` (Optional[float]): Longitude in degrees, must be between -180 and 180.

Properties:

- `timezone` (Optional[str]): Computed time zone string based on the specified latitude and longitude.
<!-- pyml disable line-length -->
:::{table} general
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| config_file_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | None |
| config_folder_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | None |
| data_folder_path | `EOS_GENERAL__DATA_FOLDER_PATH` | `Optional[pathlib.Path]` | `rw` | `None` | Path to EOS data directory. |
| data_output_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | None |
| data_output_subpath | `EOS_GENERAL__DATA_OUTPUT_SUBPATH` | `Optional[pathlib.Path]` | `rw` | `output` | Sub-path for the EOS output data directory. |
| latitude | `EOS_GENERAL__LATITUDE` | `Optional[float]` | `rw` | `52.52` | Latitude in decimal degrees, between -90 and 90, north is positive (ISO 19115) (°) |
| longitude | `EOS_GENERAL__LONGITUDE` | `Optional[float]` | `rw` | `13.405` | Longitude in decimal degrees, within -180 to 180 (°) |
| timezone | | `Optional[str]` | `ro` | `N/A` | None |
| version | `EOS_GENERAL__VERSION` | `str` | `rw` | `0.2.0+dev.4dbc2d` | Configuration file version. Used to check compatibility. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"general": {
"version": "0.2.0+dev.4dbc2d",
"data_folder_path": null,
"data_output_subpath": "output",
"latitude": 52.52,
"longitude": 13.405
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"general": {
"version": "0.2.0+dev.4dbc2d",
"data_folder_path": null,
"data_output_subpath": "output",
"latitude": 52.52,
"longitude": 13.405,
"timezone": "Europe/Berlin",
"data_output_path": null,
"config_folder_path": "/home/user/.config/net.akkudoktoreos.net",
"config_file_path": "/home/user/.config/net.akkudoktoreos.net/EOS.config.json"
}
}
```
<!-- pyml enable line-length -->
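As the Environment Variable column indicates, the writable settings can also be supplied through the process environment. A minimal sketch, assuming EOS reads these variables when the server starts; the values are illustrative:

```python
import os

# Override the system location and data directory via environment variables
# before starting EOS. Variable names are taken from the table above.
os.environ["EOS_GENERAL__LATITUDE"] = "48.137"
os.environ["EOS_GENERAL__LONGITUDE"] = "11.575"
os.environ["EOS_GENERAL__DATA_FOLDER_PATH"] = "/var/lib/eos"
```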

View File

@@ -0,0 +1,162 @@
## Load Prediction Configuration
<!-- pyml disable line-length -->
:::{table} load
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| provider | `EOS_LOAD__PROVIDER` | `Optional[str]` | `rw` | `None` | Load provider id of provider to be used. |
| provider_settings | `EOS_LOAD__PROVIDER_SETTINGS` | `LoadCommonProviderSettings` | `rw` | `required` | Provider settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider": "LoadAkkudoktor",
"provider_settings": {
"LoadAkkudoktor": null,
"LoadVrm": null,
"LoadImport": null
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for load data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} load::provider_settings::LoadImport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import load data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of load forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider_settings": {
"LoadImport": {
"import_file_path": null,
"import_json": "{\"load0_mean\": [676.71, 876.19, 527.13]}"
}
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for VRM API
<!-- pyml disable line-length -->
:::{table} load::provider_settings::LoadVrm
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| load_vrm_idsite | `int` | `rw` | `12345` | VRM installation ID |
| load_vrm_token | `str` | `rw` | `your-token` | Token for connecting to the VRM API |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider_settings": {
"LoadVrm": {
"load_vrm_token": "your-token",
"load_vrm_idsite": 12345
}
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for the LoadAkkudoktor provider
<!-- pyml disable line-length -->
:::{table} load::provider_settings::LoadAkkudoktor
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| loadakkudoktor_year_energy_kwh | `Optional[float]` | `rw` | `None` | Yearly energy consumption (kWh). |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider_settings": {
"LoadAkkudoktor": {
"loadakkudoktor_year_energy_kwh": 40421.0
}
}
}
}
```
<!-- pyml enable line-length -->
### Load Prediction Provider Configuration
<!-- pyml disable line-length -->
:::{table} load::provider_settings
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| LoadAkkudoktor | `Optional[akkudoktoreos.prediction.loadakkudoktor.LoadAkkudoktorCommonSettings]` | `rw` | `None` | LoadAkkudoktor settings |
| LoadImport | `Optional[akkudoktoreos.prediction.loadimport.LoadImportCommonSettings]` | `rw` | `None` | LoadImport settings |
| LoadVrm | `Optional[akkudoktoreos.prediction.loadvrm.LoadVrmCommonSettings]` | `rw` | `None` | LoadVrm settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"load": {
"provider_settings": {
"LoadAkkudoktor": null,
"LoadVrm": null,
"LoadImport": null
}
}
}
```
<!-- pyml enable line-length -->

View File

@@ -0,0 +1,45 @@
## Logging Configuration
<!-- pyml disable line-length -->
:::{table} logging
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| console_level | `EOS_LOGGING__CONSOLE_LEVEL` | `Optional[str]` | `rw` | `None` | Logging level when logging to console. |
| file_level | `EOS_LOGGING__FILE_LEVEL` | `Optional[str]` | `rw` | `None` | Logging level when logging to file. |
| file_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | None |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"logging": {
"console_level": "TRACE",
"file_level": "TRACE"
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"logging": {
"console_level": "TRACE",
"file_level": "TRACE",
"file_path": "/home/user/.local/share/net.akkudoktor.eos/output/eos.log"
}
}
```
<!-- pyml enable line-length -->

View File

@@ -0,0 +1,72 @@
## Measurement Configuration
<!-- pyml disable line-length -->
:::{table} measurement
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| grid_export_emr_keys | `EOS_MEASUREMENT__GRID_EXPORT_EMR_KEYS` | `Optional[list[str]]` | `rw` | `None` | The keys of the measurements that are energy meter readings of energy export to grid [kWh]. |
| grid_import_emr_keys | `EOS_MEASUREMENT__GRID_IMPORT_EMR_KEYS` | `Optional[list[str]]` | `rw` | `None` | The keys of the measurements that are energy meter readings of energy import from grid [kWh]. |
| keys | | `list[str]` | `ro` | `N/A` | None |
| load_emr_keys | `EOS_MEASUREMENT__LOAD_EMR_KEYS` | `Optional[list[str]]` | `rw` | `None` | The keys of the measurements that are energy meter readings of a load [kWh]. |
| pv_production_emr_keys | `EOS_MEASUREMENT__PV_PRODUCTION_EMR_KEYS` | `Optional[list[str]]` | `rw` | `None` | The keys of the measurements that are PV production energy meter readings [kWh]. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"measurement": {
"load_emr_keys": [
"load0_emr"
],
"grid_export_emr_keys": [
"grid_export_emr"
],
"grid_import_emr_keys": [
"grid_import_emr"
],
"pv_production_emr_keys": [
"pv1_emr"
]
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"measurement": {
"load_emr_keys": [
"load0_emr"
],
"grid_export_emr_keys": [
"grid_export_emr"
],
"grid_import_emr_keys": [
"grid_import_emr"
],
"pv_production_emr_keys": [
"pv1_emr"
],
"keys": [
"grid_export_emr",
"grid_import_emr",
"load0_emr",
"pv1_emr"
]
}
}
```
<!-- pyml enable line-length -->
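The read-only `keys` field in the output above appears to be the combined, sorted set of all configured measurement keys. A small illustration of that relationship (not the actual implementation):

```python
# Illustrative only: derive the read-only "keys" list of the example output
# from the configured key lists of the example input.
measurement = {
    "load_emr_keys": ["load0_emr"],
    "grid_export_emr_keys": ["grid_export_emr"],
    "grid_import_emr_keys": ["grid_import_emr"],
    "pv_production_emr_keys": ["pv1_emr"],
}

keys = sorted(key for key_list in measurement.values() for key in key_list)
print(keys)  # ['grid_export_emr', 'grid_import_emr', 'load0_emr', 'pv1_emr']
```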

View File

@@ -0,0 +1,76 @@
## General Optimization Configuration
<!-- pyml disable line-length -->
:::{table} optimization
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| algorithm | `EOS_OPTIMIZATION__ALGORITHM` | `Optional[str]` | `rw` | `GENETIC` | The optimization algorithm. |
| genetic | `EOS_OPTIMIZATION__GENETIC` | `Optional[akkudoktoreos.optimization.optimization.GeneticCommonSettings]` | `rw` | `None` | Genetic optimization algorithm configuration. |
| horizon_hours | `EOS_OPTIMIZATION__HORIZON_HOURS` | `Optional[int]` | `rw` | `24` | The general time window within which the energy optimization goal shall be achieved [h]. Defaults to 24 hours. |
| interval | `EOS_OPTIMIZATION__INTERVAL` | `Optional[int]` | `rw` | `3600` | The optimization interval [sec]. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"optimization": {
"horizon_hours": 24,
"interval": 3600,
"algorithm": "GENETIC",
"genetic": {
"individuals": 400,
"generations": 400,
"seed": null,
"penalties": {
"ev_soc_miss": 10
}
}
}
}
```
<!-- pyml enable line-length -->
### General Genetic Optimization Algorithm Configuration
<!-- pyml disable line-length -->
:::{table} optimization::genetic
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| generations | `Optional[int]` | `rw` | `400` | Number of generations to evaluate the optimal solution [>= 10]. Defaults to 400. |
| individuals | `Optional[int]` | `rw` | `300` | Number of individuals (solutions) to generate for the (initial) generation [>= 10]. Defaults to 300. |
| penalties | `Optional[dict[str, Union[float, int, str]]]` | `rw` | `None` | A dictionary of penalty function parameters consisting of a penalty function parameter name and the associated value. |
| seed | `Optional[int]` | `rw` | `None` | Fixed seed for genetic algorithm. Defaults to 'None' which means random seed. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"optimization": {
"genetic": {
"individuals": 300,
"generations": 400,
"seed": null,
"penalties": {
"ev_soc_miss": 10
}
}
}
}
```
<!-- pyml enable line-length -->
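These values can also be changed on a running instance, for example through the `PUT /v1/config` endpoint of the EOS REST API (documented further below). A hedged sketch using the `requests` package; it assumes the request body is the same partial settings structure shown in the examples and that the server runs on the default host and port:

```python
import requests

BASE_URL = "http://127.0.0.1:8503"  # default EOS host and port

# Partial settings update for the genetic algorithm (illustrative values).
settings = {
    "optimization": {
        "genetic": {
            "individuals": 300,
            "generations": 400,
        }
    }
}

# PUT /v1/config returns the configuration after the write (per its docstring).
response = requests.put(f"{BASE_URL}/v1/config", json=settings, timeout=10)
response.raise_for_status()
print(response.json().get("optimization"))
```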

View File

@@ -0,0 +1,42 @@
## General Prediction Configuration
This class provides configuration for prediction settings, allowing users to specify
parameters such as the forecast duration (in hours).

Validators ensure each parameter is within a specified range.

Attributes:

- `hours` (Optional[int]): Number of hours into the future for predictions. Must be non-negative.
- `historic_hours` (Optional[int]): Number of hours into the past for historical data. Must be non-negative.

Validators:

- `validate_hours` (int): Ensures `hours` is a non-negative integer.
- `validate_historic_hours` (int): Ensures `historic_hours` is a non-negative integer.
<!-- pyml disable line-length -->
:::{table} prediction
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| historic_hours | `EOS_PREDICTION__HISTORIC_HOURS` | `Optional[int]` | `rw` | `48` | Number of hours into the past for historical predictions data |
| hours | `EOS_PREDICTION__HOURS` | `Optional[int]` | `rw` | `48` | Number of hours into the future for predictions |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"prediction": {
"hours": 48,
"historic_hours": 48
}
}
```
<!-- pyml enable line-length -->

View File

@@ -0,0 +1,340 @@
## PV Forecast Configuration
<!-- pyml disable line-length -->
:::{table} pvforecast
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| max_planes | `EOS_PVFORECAST__MAX_PLANES` | `Optional[int]` | `rw` | `0` | Maximum number of planes that can be set |
| planes | `EOS_PVFORECAST__PLANES` | `Optional[list[akkudoktoreos.prediction.pvforecast.PVForecastPlaneSetting]]` | `rw` | `None` | Plane configuration. |
| planes_azimuth | | `List[float]` | `ro` | `N/A` | None |
| planes_inverter_paco | | `Any` | `ro` | `N/A` | None |
| planes_peakpower | | `List[float]` | `ro` | `N/A` | None |
| planes_tilt | | `List[float]` | `ro` | `N/A` | None |
| planes_userhorizon | | `Any` | `ro` | `N/A` | None |
| provider | `EOS_PVFORECAST__PROVIDER` | `Optional[str]` | `rw` | `None` | PVForecast provider id of provider to be used. |
| provider_settings | `EOS_PVFORECAST__PROVIDER_SETTINGS` | `PVForecastCommonProviderSettings` | `rw` | `required` | Provider settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider": "PVForecastAkkudoktor",
"provider_settings": {
"PVForecastImport": null,
"PVForecastVrm": null
},
"planes": [
{
"surface_tilt": 10.0,
"surface_azimuth": 180.0,
"userhorizon": [
10.0,
20.0,
30.0
],
"peakpower": 5.0,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 0,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 6000,
"modules_per_string": 20,
"strings_per_inverter": 2
},
{
"surface_tilt": 20.0,
"surface_azimuth": 90.0,
"userhorizon": [
5.0,
15.0,
25.0
],
"peakpower": 3.5,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 1,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 4000,
"modules_per_string": 20,
"strings_per_inverter": 2
}
],
"max_planes": 1
}
}
```
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider": "PVForecastAkkudoktor",
"provider_settings": {
"PVForecastImport": null,
"PVForecastVrm": null
},
"planes": [
{
"surface_tilt": 10.0,
"surface_azimuth": 180.0,
"userhorizon": [
10.0,
20.0,
30.0
],
"peakpower": 5.0,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 0,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 6000,
"modules_per_string": 20,
"strings_per_inverter": 2
},
{
"surface_tilt": 20.0,
"surface_azimuth": 90.0,
"userhorizon": [
5.0,
15.0,
25.0
],
"peakpower": 3.5,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 1,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 4000,
"modules_per_string": 20,
"strings_per_inverter": 2
}
],
"max_planes": 1,
"planes_peakpower": [
5.0,
3.5
],
"planes_azimuth": [
180.0,
90.0
],
"planes_tilt": [
10.0,
20.0
],
"planes_userhorizon": [
[
10.0,
20.0,
30.0
],
[
5.0,
15.0,
25.0
]
],
"planes_inverter_paco": [
6000.0,
4000.0
]
}
}
```
<!-- pyml enable line-length -->
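The read-only `planes_*` lists in the output above appear to collect one value per configured plane, in plane order. A small illustration (not the actual implementation):

```python
# Illustrative only: how the read-only planes_* lists line up with the
# per-plane settings from the example input.
planes = [
    {"peakpower": 5.0, "surface_azimuth": 180.0, "surface_tilt": 10.0},
    {"peakpower": 3.5, "surface_azimuth": 90.0, "surface_tilt": 20.0},
]

planes_peakpower = [plane["peakpower"] for plane in planes]
planes_azimuth = [plane["surface_azimuth"] for plane in planes]
planes_tilt = [plane["surface_tilt"] for plane in planes]

print(planes_peakpower)  # [5.0, 3.5]
print(planes_azimuth)    # [180.0, 90.0]
print(planes_tilt)       # [10.0, 20.0]
```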
### Common settings for VRM API
<!-- pyml disable line-length -->
:::{table} pvforecast::provider_settings::PVForecastVrm
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| pvforecast_vrm_idsite | `int` | `rw` | `12345` | VRM installation ID |
| pvforecast_vrm_token | `str` | `rw` | `your-token` | Token for connecting to the VRM API |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider_settings": {
"PVForecastVrm": {
"pvforecast_vrm_token": "your-token",
"pvforecast_vrm_idsite": 12345
}
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for pvforecast data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} pvforecast::provider_settings::PVForecastImport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import PV forecast data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of PV forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider_settings": {
"PVForecastImport": {
"import_file_path": null,
"import_json": "{\"pvforecast_ac_power\": [0, 8.05, 352.91]}"
}
}
}
}
```
<!-- pyml enable line-length -->
### PV Forecast Provider Configuration
<!-- pyml disable line-length -->
:::{table} pvforecast::provider_settings
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| PVForecastImport | `Optional[akkudoktoreos.prediction.pvforecastimport.PVForecastImportCommonSettings]` | `rw` | `None` | PVForecastImport settings |
| PVForecastVrm | `Optional[akkudoktoreos.prediction.pvforecastvrm.PVForecastVrmCommonSettings]` | `rw` | `None` | PVForecastVrm settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"provider_settings": {
"PVForecastImport": null,
"PVForecastVrm": null
}
}
}
```
<!-- pyml enable line-length -->
### PV Forecast Plane Configuration
<!-- pyml disable line-length -->
:::{table} pvforecast::planes::list
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| albedo | `Optional[float]` | `rw` | `None` | Proportion of the light hitting the ground that is reflected back. |
| inverter_model | `Optional[str]` | `rw` | `None` | Model of the inverter of this plane. |
| inverter_paco | `Optional[int]` | `rw` | `None` | AC power rating of the inverter [W]. |
| loss | `Optional[float]` | `rw` | `14.0` | Sum of PV system losses in percent |
| module_model | `Optional[str]` | `rw` | `None` | Model of the PV modules of this plane. |
| modules_per_string | `Optional[int]` | `rw` | `None` | Number of the PV modules of the strings of this plane. |
| mountingplace | `Optional[str]` | `rw` | `free` | Type of mounting for PV system. Options are 'free' for free-standing and 'building' for building-integrated. |
| optimal_surface_tilt | `Optional[bool]` | `rw` | `False` | Calculate the optimum tilt angle. Ignored for two-axis tracking. |
| optimalangles | `Optional[bool]` | `rw` | `False` | Calculate the optimum tilt and azimuth angles. Ignored for two-axis tracking. |
| peakpower | `Optional[float]` | `rw` | `None` | Nominal power of PV system in kW. |
| pvtechchoice | `Optional[str]` | `rw` | `crystSi` | PV technology. One of 'crystSi', 'CIS', 'CdTe', 'Unknown'. |
| strings_per_inverter | `Optional[int]` | `rw` | `None` | Number of the strings of the inverter of this plane. |
| surface_azimuth | `Optional[float]` | `rw` | `180.0` | Orientation (azimuth angle) of the (fixed) plane. Clockwise from north (north=0, east=90, south=180, west=270). |
| surface_tilt | `Optional[float]` | `rw` | `30.0` | Tilt angle from horizontal plane. Ignored for two-axis tracking. |
| trackingtype | `Optional[int]` | `rw` | `None` | Type of suntracking. 0=fixed, 1=single horizontal axis aligned north-south, 2=two-axis tracking, 3=vertical axis tracking, 4=single horizontal axis aligned east-west, 5=single inclined axis aligned north-south. |
| userhorizon | `Optional[List[float]]` | `rw` | `None` | Elevation of horizon in degrees, at equally spaced azimuth clockwise from north. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"pvforecast": {
"planes": [
{
"surface_tilt": 10.0,
"surface_azimuth": 180.0,
"userhorizon": [
10.0,
20.0,
30.0
],
"peakpower": 5.0,
"pvtechchoice": "crystSi",
"mountingplace": "free",
"loss": 14.0,
"trackingtype": 0,
"optimal_surface_tilt": false,
"optimalangles": false,
"albedo": null,
"module_model": null,
"inverter_model": null,
"inverter_paco": 6000,
"modules_per_string": 20,
"strings_per_inverter": 2
}
]
}
}
```
<!-- pyml enable line-length -->

View File

@@ -0,0 +1,36 @@
## Server Configuration
<!-- pyml disable line-length -->
:::{table} server
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| eosdash_host | `EOS_SERVER__EOSDASH_HOST` | `Optional[str]` | `rw` | `None` | EOSdash server IP address. Defaults to EOS server IP address. |
| eosdash_port | `EOS_SERVER__EOSDASH_PORT` | `Optional[int]` | `rw` | `None` | EOSdash server IP port number. Defaults to EOS server IP port number + 1. |
| host | `EOS_SERVER__HOST` | `Optional[str]` | `rw` | `127.0.0.1` | EOS server IP address. Defaults to 127.0.0.1. |
| port | `EOS_SERVER__PORT` | `Optional[int]` | `rw` | `8503` | EOS server IP port number. Defaults to 8503. |
| startup_eosdash | `EOS_SERVER__STARTUP_EOSDASH` | `Optional[bool]` | `rw` | `True` | EOS server to start EOSdash server. Defaults to True. |
| verbose | `EOS_SERVER__VERBOSE` | `Optional[bool]` | `rw` | `False` | Enable debug output |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"server": {
"host": "127.0.0.1",
"port": 8503,
"verbose": false,
"startup_eosdash": true,
"eosdash_host": "127.0.0.1",
"eosdash_port": 8504
}
}
```
<!-- pyml enable line-length -->
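With the defaults above, the EOS API listens on `http://127.0.0.1:8503`. A minimal liveness check against the `GET /v1/health` endpoint documented further below (assumes the `requests` package; the exact response body is not specified here):

```python
import requests

# Check that a locally running EOS server is alive (defaults from the table).
HOST = "127.0.0.1"
PORT = 8503

response = requests.get(f"http://{HOST}:{PORT}/v1/health", timeout=5)
response.raise_for_status()
print("EOS is alive, HTTP status:", response.status_code)
```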

View File

@@ -0,0 +1,23 @@
## Utils Configuration
<!-- pyml disable line-length -->
:::{table} utils
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"utils": {}
}
```
<!-- pyml enable line-length -->

View File

@@ -0,0 +1,92 @@
## Weather Forecast Configuration
<!-- pyml disable line-length -->
:::{table} weather
:widths: 10 20 10 5 5 30
:align: left
| Name | Environment Variable | Type | Read-Only | Default | Description |
| ---- | -------------------- | ---- | --------- | ------- | ----------- |
| provider | `EOS_WEATHER__PROVIDER` | `Optional[str]` | `rw` | `None` | Weather provider id of provider to be used. |
| provider_settings | `EOS_WEATHER__PROVIDER_SETTINGS` | `WeatherCommonProviderSettings` | `rw` | `required` | Provider settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"weather": {
"provider": "WeatherImport",
"provider_settings": {
"WeatherImport": null
}
}
}
```
<!-- pyml enable line-length -->
### Common settings for weather data import from file or JSON string
<!-- pyml disable line-length -->
:::{table} weather::provider_settings::WeatherImport
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| import_file_path | `Union[str, pathlib.Path, NoneType]` | `rw` | `None` | Path to the file to import weather data from. |
| import_json | `Optional[str]` | `rw` | `None` | JSON string, dictionary of weather forecast value lists. |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"weather": {
"provider_settings": {
"WeatherImport": {
"import_file_path": null,
"import_json": "{\"weather_temp_air\": [18.3, 17.8, 16.9]}"
}
}
}
}
```
<!-- pyml enable line-length -->
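Instead of embedding the data in `import_json`, the same dictionary of value lists can be written to a file that `import_file_path` points to. A hedged sketch; the file location is hypothetical and the assumption is that the file contains the same JSON structure as `import_json`:

```python
import json
from pathlib import Path

# Write weather forecast values to a file and reference it via
# import_file_path instead of the inline import_json string.
values = {"weather_temp_air": [18.3, 17.8, 16.9]}

import_file = Path("/tmp/weather_import.json")  # hypothetical location
import_file.write_text(json.dumps(values))

settings = {
    "weather": {
        "provider": "WeatherImport",
        "provider_settings": {
            "WeatherImport": {
                "import_file_path": str(import_file),
                "import_json": None,
            }
        },
    }
}

print(json.dumps(settings, indent=2))
```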
### Weather Forecast Provider Configuration
<!-- pyml disable line-length -->
:::{table} weather::provider_settings
:widths: 10 10 5 5 30
:align: left
| Name | Type | Read-Only | Default | Description |
| ---- | ---- | --------- | ------- | ----------- |
| WeatherImport | `Optional[akkudoktoreos.prediction.weatherimport.WeatherImportCommonSettings]` | `rw` | `None` | WeatherImport settings |
:::
<!-- pyml enable line-length -->
<!-- pyml disable no-emphasis-as-heading -->
**Example Input/Output**
<!-- pyml enable no-emphasis-as-heading -->
<!-- pyml disable line-length -->
```json
{
"weather": {
"provider_settings": {
"WeatherImport": null
}
}
}
```
<!-- pyml enable line-length -->

View File

@@ -1,8 +1,10 @@
# Akkudoktor-EOS # Akkudoktor-EOS
**Version**: `v0.2.0` **Version**: `v0.2.0+dev.4dbc2d`
<!-- pyml disable line-length -->
**Description**: This project provides a comprehensive solution for simulating and optimizing an energy system based on renewable energy sources. With a focus on photovoltaic (PV) systems, battery storage (batteries), load management (consumer requirements), heat pumps, electric vehicles, and consideration of electricity price data, this system enables forecasting and optimization of energy flow and costs over a specified period. **Description**: This project provides a comprehensive solution for simulating and optimizing an energy system based on renewable energy sources. With a focus on photovoltaic (PV) systems, battery storage (batteries), load management (consumer requirements), heat pumps, electric vehicles, and consideration of electricity price data, this system enables forecasting and optimization of energy flow and costs over a specified period.
<!-- pyml enable line-length -->
**Base URL**: `No base URL provided.` **Base URL**: `No base URL provided.`
@@ -10,11 +12,15 @@
## POST /gesamtlast ## POST /gesamtlast
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_gesamtlast_gesamtlast_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_gesamtlast_gesamtlast_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_gesamtlast_gesamtlast_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_gesamtlast_gesamtlast_post)
<!-- pyml enable line-length -->
Fastapi Gesamtlast Fastapi Gesamtlast
``` <!-- pyml disable line-length -->
```python
"""
Deprecated: Total Load Prediction with adjustment. Deprecated: Total Load Prediction with adjustment.
Endpoint to handle total load prediction adjusted by latest measured data. Endpoint to handle total load prediction adjusted by latest measured data.
@@ -30,7 +36,9 @@ Note:
'/v1/measurement/series' or '/v1/measurement/series' or
'/v1/measurement/dataframe' or '/v1/measurement/dataframe' or
'/v1/measurement/data' '/v1/measurement/data'
"""
``` ```
<!-- pyml enable line-length -->
**Request Body**: **Request Body**:
@@ -48,11 +56,15 @@ Note:
## GET /gesamtlast_simple ## GET /gesamtlast_simple
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_gesamtlast_simple_gesamtlast_simple_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_gesamtlast_simple_gesamtlast_simple_get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_gesamtlast_simple_gesamtlast_simple_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_gesamtlast_simple_gesamtlast_simple_get)
<!-- pyml enable line-length -->
Fastapi Gesamtlast Simple Fastapi Gesamtlast Simple
``` <!-- pyml disable line-length -->
```python
"""
Deprecated: Total Load Prediction. Deprecated: Total Load Prediction.
Endpoint to handle total load prediction. Endpoint to handle total load prediction.
@@ -69,7 +81,9 @@ Note:
'/v1/prediction/update' '/v1/prediction/update'
and then request data with and then request data with
'/v1/prediction/list?key=loadforecast_power_w' instead. '/v1/prediction/list?key=loadforecast_power_w' instead.
"""
``` ```
<!-- pyml enable line-length -->
**Parameters**: **Parameters**:
@@ -85,18 +99,24 @@ Note:
## POST /optimize ## POST /optimize
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_optimize_optimize_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_optimize_optimize_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_optimize_optimize_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_optimize_optimize_post)
<!-- pyml enable line-length -->
Fastapi Optimize Fastapi Optimize
``` <!-- pyml disable line-length -->
```python
"""
Deprecated: Optimize. Deprecated: Optimize.
Endpoint to handle optimization. Endpoint to handle optimization.
Note: Note:
Use automatic optimization instead. Use automatic optimization instead.
"""
``` ```
<!-- pyml enable line-length -->
**Parameters**: **Parameters**:
@@ -120,11 +140,15 @@ Note:
## GET /pvforecast ## GET /pvforecast
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_pvforecast_pvforecast_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_pvforecast_pvforecast_get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_pvforecast_pvforecast_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_pvforecast_pvforecast_get)
<!-- pyml enable line-length -->
Fastapi Pvforecast Fastapi Pvforecast
``` <!-- pyml disable line-length -->
```python
"""
Deprecated: PV Forecast Prediction. Deprecated: PV Forecast Prediction.
Endpoint to handle PV forecast prediction. Endpoint to handle PV forecast prediction.
@@ -139,7 +163,9 @@ Note:
and then request data with and then request data with
'/v1/prediction/list?key=pvforecast_ac_power' and '/v1/prediction/list?key=pvforecast_ac_power' and
'/v1/prediction/list?key=pvforecastakkudoktor_temp_air' instead. '/v1/prediction/list?key=pvforecastakkudoktor_temp_air' instead.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -149,11 +175,15 @@ Note:
## GET /strompreis ## GET /strompreis
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_strompreis_strompreis_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_strompreis_strompreis_get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_strompreis_strompreis_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_strompreis_strompreis_get)
<!-- pyml enable line-length -->
Fastapi Strompreis Fastapi Strompreis
``` <!-- pyml disable line-length -->
```python
"""
Deprecated: Electricity Market Price Prediction per Wh (€/Wh). Deprecated: Electricity Market Price Prediction per Wh (€/Wh).
Electricity prices start at 00.00.00 today and are provided for 48 hours. Electricity prices start at 00.00.00 today and are provided for 48 hours.
@@ -169,7 +199,9 @@ Note:
and then request data with and then request data with
'/v1/prediction/list?key=elecprice_marketprice_wh' or '/v1/prediction/list?key=elecprice_marketprice_wh' or
'/v1/prediction/list?key=elecprice_marketprice_kwh' instead. '/v1/prediction/list?key=elecprice_marketprice_kwh' instead.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -179,16 +211,22 @@ Note:
## GET /v1/admin/cache ## GET /v1/admin/cache
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_get_v1_admin_cache_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_get_v1_admin_cache_get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_get_v1_admin_cache_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_get_v1_admin_cache_get)
<!-- pyml enable line-length -->
Fastapi Admin Cache Get Fastapi Admin Cache Get
``` <!-- pyml disable line-length -->
```python
"""
Current cache management data. Current cache management data.
Returns: Returns:
data (dict): The management data. data (dict): The management data.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -198,18 +236,24 @@ Returns:
## POST /v1/admin/cache/clear ## POST /v1/admin/cache/clear
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_clear_post_v1_admin_cache_clear_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_clear_post_v1_admin_cache_clear_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_clear_post_v1_admin_cache_clear_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_clear_post_v1_admin_cache_clear_post)
<!-- pyml enable line-length -->
Fastapi Admin Cache Clear Post Fastapi Admin Cache Clear Post
``` <!-- pyml disable line-length -->
```python
"""
Clear the cache. Clear the cache.
Deletes all cache files. Deletes all cache files.
Returns: Returns:
data (dict): The management data after cleanup. data (dict): The management data after cleanup.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -219,18 +263,24 @@ Returns:
## POST /v1/admin/cache/clear-expired ## POST /v1/admin/cache/clear-expired
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_clear_expired_post_v1_admin_cache_clear-expired_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_clear_expired_post_v1_admin_cache_clear-expired_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_clear_expired_post_v1_admin_cache_clear-expired_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_clear_expired_post_v1_admin_cache_clear-expired_post)
<!-- pyml enable line-length -->
Fastapi Admin Cache Clear Expired Post Fastapi Admin Cache Clear Expired Post
``` <!-- pyml disable line-length -->
```python
"""
Clear the cache from expired data. Clear the cache from expired data.
Deletes expired cache files. Deletes expired cache files.
Returns: Returns:
data (dict): The management data after cleanup. data (dict): The management data after cleanup.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -240,16 +290,22 @@ Returns:
## POST /v1/admin/cache/load ## POST /v1/admin/cache/load
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_load_post_v1_admin_cache_load_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_load_post_v1_admin_cache_load_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_load_post_v1_admin_cache_load_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_load_post_v1_admin_cache_load_post)
<!-- pyml enable line-length -->
Fastapi Admin Cache Load Post Fastapi Admin Cache Load Post
``` <!-- pyml disable line-length -->
```python
"""
Load cache management data. Load cache management data.
Returns: Returns:
data (dict): The management data that was loaded. data (dict): The management data that was loaded.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -259,16 +315,22 @@ Returns:
## POST /v1/admin/cache/save ## POST /v1/admin/cache/save
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_save_post_v1_admin_cache_save_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_save_post_v1_admin_cache_save_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_cache_save_post_v1_admin_cache_save_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_cache_save_post_v1_admin_cache_save_post)
<!-- pyml enable line-length -->
Fastapi Admin Cache Save Post Fastapi Admin Cache Save Post
``` <!-- pyml disable line-length -->
```python
"""
Save the current cache management data. Save the current cache management data.
Returns: Returns:
data (dict): The management data that was saved. data (dict): The management data that was saved.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -278,15 +340,21 @@ Returns:
## POST /v1/admin/server/restart ## POST /v1/admin/server/restart
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_server_restart_post_v1_admin_server_restart_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_server_restart_post_v1_admin_server_restart_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_server_restart_post_v1_admin_server_restart_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_server_restart_post_v1_admin_server_restart_post)
<!-- pyml enable line-length -->
Fastapi Admin Server Restart Post Fastapi Admin Server Restart Post
``` <!-- pyml disable line-length -->
```python
"""
Restart the server. Restart the server.
Restart EOS properly by starting a new instance before exiting the old one. Restart EOS properly by starting a new instance before exiting the old one.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -296,13 +364,19 @@ Restart EOS properly by starting a new instance before exiting the old one.
## POST /v1/admin/server/shutdown ## POST /v1/admin/server/shutdown
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_server_shutdown_post_v1_admin_server_shutdown_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_server_shutdown_post_v1_admin_server_shutdown_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_admin_server_shutdown_post_v1_admin_server_shutdown_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_admin_server_shutdown_post_v1_admin_server_shutdown_post)
<!-- pyml enable line-length -->
Fastapi Admin Server Shutdown Post Fastapi Admin Server Shutdown Post
``` <!-- pyml disable line-length -->
```python
"""
Shutdown the server. Shutdown the server.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -312,16 +386,22 @@ Shutdown the server.
## GET /v1/config ## GET /v1/config
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_get_v1_config_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_get_v1_config_get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_config_get_v1_config_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_get_v1_config_get)
<!-- pyml enable line-length -->
Fastapi Config Get Fastapi Config Get
``` <!-- pyml disable line-length -->
```python
"""
Get the current configuration. Get the current configuration.
Returns: Returns:
configuration (ConfigEOS): The current configuration. configuration (ConfigEOS): The current configuration.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -331,11 +411,15 @@ Returns:
## PUT /v1/config ## PUT /v1/config
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_put_v1_config_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_put_v1_config_put) **Links**: [local](http://localhost:8503/docs#/default/fastapi_config_put_v1_config_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_put_v1_config_put)
<!-- pyml enable line-length -->
Fastapi Config Put Fastapi Config Put
``` <!-- pyml disable line-length -->
```python
"""
Update the current config with the provided settings. Update the current config with the provided settings.
Note that for any setting value that is None or unset, the configuration will fall back to Note that for any setting value that is None or unset, the configuration will fall back to
@@ -347,7 +431,9 @@ Args:
Returns: Returns:
configuration (ConfigEOS): The current configuration after the write. configuration (ConfigEOS): The current configuration after the write.
"""
``` ```
<!-- pyml enable line-length -->
**Request Body**: **Request Body**:
@@ -365,16 +451,22 @@ Returns:
## GET /v1/config/backup ## GET /v1/config/backup
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_backup_get_v1_config_backup_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_backup_get_v1_config_backup_get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_config_backup_get_v1_config_backup_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_backup_get_v1_config_backup_get)
<!-- pyml enable line-length -->
Fastapi Config Backup Get Fastapi Config Backup Get
``` <!-- pyml disable line-length -->
```python
"""
Get the EOS configuration backup identifiers and backup metadata. Get the EOS configuration backup identifiers and backup metadata.
Returns: Returns:
dict[str, dict[str, Any]]: Mapping of backup identifiers to metadata. dict[str, dict[str, Any]]: Mapping of backup identifiers to metadata.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -384,16 +476,22 @@ Returns:
## PUT /v1/config/file ## PUT /v1/config/file
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_file_put_v1_config_file_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_file_put_v1_config_file_put) **Links**: [local](http://localhost:8503/docs#/default/fastapi_config_file_put_v1_config_file_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_file_put_v1_config_file_put)
<!-- pyml enable line-length -->
Fastapi Config File Put Fastapi Config File Put
``` <!-- pyml disable line-length -->
```python
"""
Save the current configuration to the EOS configuration file. Save the current configuration to the EOS configuration file.
Returns: Returns:
configuration (ConfigEOS): The current configuration that was saved. configuration (ConfigEOS): The current configuration that was saved.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -403,16 +501,22 @@ Returns:
## POST /v1/config/reset ## POST /v1/config/reset
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_reset_post_v1_config_reset_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_reset_post_v1_config_reset_post) **Links**: [local](http://localhost:8503/docs#/default/fastapi_config_reset_post_v1_config_reset_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_reset_post_v1_config_reset_post)
<!-- pyml enable line-length -->
Fastapi Config Reset Post Fastapi Config Reset Post
``` <!-- pyml disable line-length -->
```python
"""
Reset the configuration to the EOS configuration file. Reset the configuration to the EOS configuration file.
Returns: Returns:
configuration (ConfigEOS): The current configuration after update. configuration (ConfigEOS): The current configuration after update.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -422,16 +526,22 @@ Returns:
## PUT /v1/config/revert ## PUT /v1/config/revert
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_revert_put_v1_config_revert_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_revert_put_v1_config_revert_put) **Links**: [local](http://localhost:8503/docs#/default/fastapi_config_revert_put_v1_config_revert_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_revert_put_v1_config_revert_put)
<!-- pyml enable line-length -->
Fastapi Config Revert Put Fastapi Config Revert Put
``` <!-- pyml disable line-length -->
```python
"""
Revert the configuration to a EOS configuration backup. Revert the configuration to a EOS configuration backup.
Returns: Returns:
configuration (ConfigEOS): The current configuration after revert. configuration (ConfigEOS): The current configuration after revert.
"""
``` ```
<!-- pyml enable line-length -->
**Parameters**: **Parameters**:
@@ -447,11 +557,15 @@ Returns:
## GET /v1/config/{path} ## GET /v1/config/{path}
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_get_key_v1_config__path__get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_get_key_v1_config__path__get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_config_get_key_v1_config__path__get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_get_key_v1_config__path__get)
<!-- pyml enable line-length -->
Fastapi Config Get Key Fastapi Config Get Key
``` <!-- pyml disable line-length -->
```python
"""
Get the value of a nested key or index in the config model. Get the value of a nested key or index in the config model.
Args: Args:
@@ -459,7 +573,9 @@ Args:
Returns: Returns:
value (Any): The value of the selected nested key. value (Any): The value of the selected nested key.
"""
``` ```
<!-- pyml enable line-length -->
**Parameters**: **Parameters**:
@@ -475,11 +591,15 @@ Returns:
## PUT /v1/config/{path} ## PUT /v1/config/{path}
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_config_put_key_v1_config__path__put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_put_key_v1_config__path__put) **Links**: [local](http://localhost:8503/docs#/default/fastapi_config_put_key_v1_config__path__put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_config_put_key_v1_config__path__put)
<!-- pyml enable line-length -->
Fastapi Config Put Key Fastapi Config Put Key
``` <!-- pyml disable line-length -->
```python
"""
Update a nested key or index in the config model. Update a nested key or index in the config model.
Args: Args:
@@ -488,7 +608,9 @@ Args:
Returns: Returns:
configuration (ConfigEOS): The current configuration after the update. configuration (ConfigEOS): The current configuration after the update.
"""
``` ```
<!-- pyml enable line-length -->
**Parameters**: **Parameters**:
@@ -517,13 +639,19 @@ Returns:
## GET /v1/energy-management/optimization/solution ## GET /v1/energy-management/optimization/solution
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_energy_management_optimization_solution_get_v1_energy-management_optimization_solution_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_energy_management_optimization_solution_get_v1_energy-management_optimization_solution_get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_energy_management_optimization_solution_get_v1_energy-management_optimization_solution_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_energy_management_optimization_solution_get_v1_energy-management_optimization_solution_get)
<!-- pyml enable line-length -->
Fastapi Energy Management Optimization Solution Get Fastapi Energy Management Optimization Solution Get
``` <!-- pyml disable line-length -->
```python
"""
Get the latest solution of the optimization. Get the latest solution of the optimization.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -533,13 +661,19 @@ Get the latest solution of the optimization.
## GET /v1/energy-management/plan ## GET /v1/energy-management/plan
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_energy_management_plan_get_v1_energy-management_plan_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_energy_management_plan_get_v1_energy-management_plan_get) **Links**: [local](http://localhost:8503/docs#/default/fastapi_energy_management_plan_get_v1_energy-management_plan_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_energy_management_plan_get_v1_energy-management_plan_get)
<!-- pyml enable line-length -->
Fastapi Energy Management Plan Get Fastapi Energy Management Plan Get
``` <!-- pyml disable line-length -->
```python
"""
Get the latest energy management plan. Get the latest energy management plan.
"""
``` ```
<!-- pyml enable line-length -->
**Responses**: **Responses**:
@@ -549,13 +683,19 @@ Get the latest energy management plan.
## GET /v1/health
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_health_get_v1_health_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_health_get_v1_health_get)
<!-- pyml enable line-length -->
Fastapi Health Get
<!-- pyml disable line-length -->
```python
"""
Health check endpoint to verify that the EOS server is alive.
"""
```
<!-- pyml enable line-length -->
**Responses**:
@@ -565,11 +705,15 @@ Health check endpoint to verify that the EOS server is alive.
## GET /v1/logging/log
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_logging_get_log_v1_logging_log_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_logging_get_log_v1_logging_log_get)
<!-- pyml enable line-length -->
Fastapi Logging Get Log
<!-- pyml disable line-length -->
```python
"""
Get structured log entries from the EOS log file.
Filters and returns log entries based on the specified query parameters. The log
@@ -586,7 +730,9 @@ Args:
Returns:
JSONResponse: A JSON list of log entries.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -614,13 +760,19 @@ Returns:
## PUT /v1/measurement/data
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_data_put_v1_measurement_data_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_data_put_v1_measurement_data_put)
<!-- pyml enable line-length -->
Fastapi Measurement Data Put
<!-- pyml disable line-length -->
```python
"""
Merge the measurement data given as datetime data into EOS measurements.
"""
```
<!-- pyml enable line-length -->
**Request Body**:
@@ -638,13 +790,19 @@ Merge the measurement data given as datetime data into EOS measurements.
## PUT /v1/measurement/dataframe
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_dataframe_put_v1_measurement_dataframe_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_dataframe_put_v1_measurement_dataframe_put)
<!-- pyml enable line-length -->
Fastapi Measurement Dataframe Put
<!-- pyml disable line-length -->
```python
"""
Merge the measurement data given as dataframe into EOS measurements.
"""
```
<!-- pyml enable line-length -->
**Request Body**:
@@ -662,13 +820,19 @@ Merge the measurement data given as dataframe into EOS measurements.
## GET /v1/measurement/keys
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_keys_get_v1_measurement_keys_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_keys_get_v1_measurement_keys_get)
<!-- pyml enable line-length -->
Fastapi Measurement Keys Get
<!-- pyml disable line-length -->
```python
"""
Get a list of available measurement keys.
"""
```
<!-- pyml enable line-length -->
**Responses**:
@@ -678,13 +842,19 @@ Get a list of available measurement keys.
## GET /v1/measurement/series
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_series_get_v1_measurement_series_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_series_get_v1_measurement_series_get)
<!-- pyml enable line-length -->
Fastapi Measurement Series Get
<!-- pyml disable line-length -->
```python
"""
Get the measurements of given key as series.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -700,13 +870,19 @@ Get the measurements of given key as series.
## PUT /v1/measurement/series
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_series_put_v1_measurement_series_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_series_put_v1_measurement_series_put)
<!-- pyml enable line-length -->
Fastapi Measurement Series Put
<!-- pyml disable line-length -->
```python
"""
Merge measurement given as series into given key.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -728,13 +904,19 @@ Merge measurement given as series into given key.
## PUT /v1/measurement/value
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_value_put_v1_measurement_value_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_value_put_v1_measurement_value_put)
<!-- pyml enable line-length -->
Fastapi Measurement Value Put
<!-- pyml disable line-length -->
```python
"""
Merge the measurement of given key and value into EOS measurements at given datetime.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -754,11 +936,15 @@ Merge the measurement of given key and value into EOS measurements at given date
## GET /v1/prediction/dataframe
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_dataframe_get_v1_prediction_dataframe_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_dataframe_get_v1_prediction_dataframe_get)
<!-- pyml enable line-length -->
Fastapi Prediction Dataframe Get
<!-- pyml disable line-length -->
```python
"""
Get prediction for given key within given date range as series.
Args:
@@ -768,7 +954,9 @@ Args:
end_datetime (Optional[str]): Ending datetime (exclusive).
Defaults to end datetime of latest prediction.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -790,11 +978,15 @@ Defaults to end datetime of latest prediction.
## PUT /v1/prediction/import/{provider_id}
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_import_provider_v1_prediction_import__provider_id__put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_import_provider_v1_prediction_import__provider_id__put)
<!-- pyml enable line-length -->
Fastapi Prediction Import Provider
<!-- pyml disable line-length -->
```python
"""
Import prediction for given provider ID.
Args:
@@ -802,7 +994,9 @@ Args:
data: Prediction data.
force_enable: Update data even if provider is disabled.
Defaults to False.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -841,13 +1035,19 @@ Args:
## GET /v1/prediction/keys
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_keys_get_v1_prediction_keys_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_keys_get_v1_prediction_keys_get)
<!-- pyml enable line-length -->
Fastapi Prediction Keys Get
<!-- pyml disable line-length -->
```python
"""
Get a list of available prediction keys.
"""
```
<!-- pyml enable line-length -->
**Responses**:
@@ -857,11 +1057,15 @@ Get a list of available prediction keys.
## GET /v1/prediction/list
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_list_get_v1_prediction_list_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_list_get_v1_prediction_list_get)
<!-- pyml enable line-length -->
Fastapi Prediction List Get
<!-- pyml disable line-length -->
```python
"""
Get prediction for given key within given date range as value list.
Args:
@@ -872,7 +1076,9 @@ Args:
Defaults to end datetime of latest prediction.
interval (Optional[str]): Time duration for each interval.
Defaults to 1 hour.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -894,16 +1100,22 @@ Args:
## GET /v1/prediction/providers
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_providers_get_v1_prediction_providers_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_providers_get_v1_prediction_providers_get)
<!-- pyml enable line-length -->
Fastapi Prediction Providers Get
<!-- pyml disable line-length -->
```python
"""
Get a list of available prediction providers.
Args:
enabled (bool): Return enabled/disabled providers. If unset, return all providers.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -919,11 +1131,15 @@ Args:
## GET /v1/prediction/series
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_series_get_v1_prediction_series_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_series_get_v1_prediction_series_get)
<!-- pyml enable line-length -->
Fastapi Prediction Series Get
<!-- pyml disable line-length -->
```python
"""
Get prediction for given key within given date range as series.
Args:
@@ -932,7 +1148,9 @@ Args:
Defaults to start datetime of latest prediction.
end_datetime (Optional[str]): Ending datetime (exclusive).
Defaults to end datetime of latest prediction.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -952,11 +1170,15 @@ Args:
## POST /v1/prediction/update
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_update_v1_prediction_update_post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_update_v1_prediction_update_post)
<!-- pyml enable line-length -->
Fastapi Prediction Update
<!-- pyml disable line-length -->
```python
"""
Update predictions for all providers.
Args:
@@ -964,7 +1186,9 @@ Args:
Defaults to False.
force_enable: Update data even if provider is disabled.
Defaults to False.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -982,11 +1206,15 @@ Args:
## POST /v1/prediction/update/{provider_id}
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_update_provider_v1_prediction_update__provider_id__post), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_update_provider_v1_prediction_update__provider_id__post)
<!-- pyml enable line-length -->
Fastapi Prediction Update Provider
<!-- pyml disable line-length -->
```python
"""
Update predictions for given provider ID.
Args:
@@ -995,7 +1223,9 @@ Args:
Defaults to False.
force_enable: Update data even if provider is disabled.
Defaults to False.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -1015,16 +1245,22 @@ Args:
## GET /v1/resource/status
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_devices_status_get_v1_resource_status_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_devices_status_get_v1_resource_status_get)
<!-- pyml enable line-length -->
Fastapi Devices Status Get
<!-- pyml disable line-length -->
```python
"""
Get the latest status of a resource/ device.
Return:
latest_status: The latest status of a resource/ device.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -1042,16 +1278,22 @@ Return:
## PUT /v1/resource/status
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/fastapi_devices_status_put_v1_resource_status_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_devices_status_put_v1_resource_status_put)
<!-- pyml enable line-length -->
Fastapi Devices Status Put
<!-- pyml disable line-length -->
```python
"""
Update the status of a resource/ device.
Return:
latest_status: The latest status of a resource/ device.
"""
```
<!-- pyml enable line-length -->
**Parameters**:
@@ -1105,7 +1347,9 @@ Return:
## GET /visualization_results.pdf
<!-- pyml disable line-length -->
**Links**: [local](http://localhost:8503/docs#/default/get_pdf_visualization_results_pdf_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/get_pdf_visualization_results_pdf_get)
<!-- pyml enable line-length -->
Get Pdf
@@ -1114,3 +1358,5 @@ Get Pdf
- **200**: Successful Response
---
Auto generated from openapi.json.
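For orientation, a minimal client call against the endpoints documented above could look like the following sketch. It is illustrative only (not part of the generated reference) and assumes a locally running EOS instance on port 8503, as used in the local links:

```python
# Illustrative only: query a locally running EOS server (http://localhost:8503).
import requests

BASE_URL = "http://localhost:8503"

# GET /v1/health - verify that the EOS server is alive.
health = requests.get(f"{BASE_URL}/v1/health", timeout=10)
health.raise_for_status()
print(health.json())

# GET /v1/energy-management/plan - fetch the latest energy management plan.
plan = requests.get(f"{BASE_URL}/v1/energy-management/plan", timeout=10)
plan.raise_for_status()
print(plan.json())
```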
View File
@@ -124,8 +124,9 @@ Configuration options:
- `charges_kwh`: Electricity price charges (€/kWh).
- `vat_rate`: VAT rate factor applied to electricity price when charges are used (default: 1.19).
- `elecpriceimport.import_file_path`: Path to the file to import electricity price forecast data from.
- `elecpriceimport.import_json`: JSON string, dictionary of electricity price forecast value lists.
- `energycharts.bidding_zone`: Bidding zone Energy Charts shall provide price data for.
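To make the option names above concrete, a configuration fragment might look like the sketch below. The surrounding `elecprice` key and all concrete values are assumptions for illustration; only the settings of the provider that is actually configured are used.

```python
# Illustrative sketch of an electricity price forecast configuration fragment.
# The surrounding "elecprice" key and the values are assumptions, not defaults.
elecprice_settings = {
    "elecprice": {
        "charges_kwh": 0.21,   # additional charges on top of the market price (€/kWh)
        "vat_rate": 1.19,      # VAT factor applied when charges are used
        "elecpriceimport": {
            "import_file_path": "/path/to/elecprice_forecast.json",
        },
        "energycharts": {
            "bidding_zone": "DE-LU",
        },
    }
}
```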
### ElecPriceAkkudoktor Provider
View File
@@ -7,13 +7,20 @@ https://www.sphinx-doc.org/en/master/usage/configuration.html
import sys
from pathlib import Path
# Add the src directory to sys.path so Sphinx can import akkudoktoreos
PROJECT_ROOT = Path(__file__).parent.parent
SRC_DIR = PROJECT_ROOT / "src"
sys.path.insert(0, str(SRC_DIR))
from akkudoktoreos.core.version import __version__
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = "Akkudoktor EOS"
copyright = "2025, Andreas Schmitz"
author = "Andreas Schmitz"
release = __version__
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
@@ -22,6 +29,7 @@ extensions = [
"sphinx.ext.autodoc", "sphinx.ext.autodoc",
"sphinx.ext.autosummary", "sphinx.ext.autosummary",
"sphinx.ext.napoleon", "sphinx.ext.napoleon",
"sphinx.ext.todo",
"sphinx_rtd_theme", "sphinx_rtd_theme",
"myst_parser", "myst_parser",
"sphinx_tabs.tabs", "sphinx_tabs.tabs",
View File
@@ -393,6 +393,13 @@ At a minimum, you should run the module tests:
make test
```
:::{admonition} Note
:class: Note
Depending on your changes you may also have to change the version.py and documentation files. Do as
suggested by the tests. You may ignore the version.py and documentation changes up until you
finalize your change.
:::
You should also run the system tests. These include additional tests that interact with real
resources:
View File
@@ -5,10 +5,10 @@
This guide provides different methods to install AkkudoktorEOS:
- Installation from Source (GitHub) (M1)
- Installation from Release Package (GitHub) (M2)
- Installation with Docker (DockerHub) (M3)
- Installation with Docker (docker-compose) (M4)
Choose the method that best suits your needs.
@@ -34,6 +34,9 @@ Before installing, ensure you have the following:
- Docker Engine 20.10 or higher
- Docker Compose (optional, recommended)
See [Install Docker Engine](https://docs.docker.com/engine/install/) on how to install docker on
your Linux distro.
## Installation from Source (GitHub) (M1)
Recommended for developers or users wanting the latest updates.
View File
@@ -13,8 +13,8 @@ and how to set a **development version** after the release.
| 1 | Contributor | Prepare a release branch **in your fork** using Commitizen |
| 2 | Contributor | Open a **Pull Request to upstream** (`Akkudoktor-EOS/EOS`) |
| 3 | Maintainer | Review and **merge the release PR** |
| 4 | CI | Create the **GitHub Release and tag** |
| 5 | CI | Set the **development version marker** via a follow-up PR |
## 🔄 Detailed Workflow
@@ -40,24 +40,26 @@ git checkout -b release/vX.Y.Z
#### Bump the version information
Set `__version__` in src/akkudoktoreos/core/version.py:
```python
__version__ = "0.3.0"
```
(Previously the version had to be bumped manually in pyproject.toml, src/akkudoktoreos/core/version.py, src/akkudoktoreos/data/default.config.json and the Makefile.)
Prepare version by updating versioned files, e.g.:
- haaddon/config.yaml
and the generated documentation:
```bash
make prepare-version
make gen-docs
```
Check the changes by:
```bash
make test-version
```
#### Create a new CHANGELOG.md entry
@@ -66,19 +68,20 @@ Edit CHANGELOG.md
#### Create the new release commit
Add all the changed version files and all other changes to the commit.
```bash
git add src/akkudoktoreos/core/version.py CHANGELOG.md ...
git commit -s -m "chore: Prepare Release v0.3.0"
```
#### Push the branch to your fork
```bash
git push --set-upstream origin release/v0.3.0
```
### 2⃣ Contributor: Open the Release Preparation Pull Request
| From | To |
| ------------------------------------ | ------------------------- |
@@ -87,13 +90,13 @@ git push --set-upstream origin release/vX.Y.Z
**PR Title:**
```text
chore: prepare release vX.Y.Z
```
**PR Description Template:**
```markdown
## Prepare Release vX.Y.Z
This pull request prepares release **vX.Y.Z**.
@@ -119,94 +122,26 @@ See `CHANGELOG.md` for full details.
**Merge Strategy:**
- Prefer **Merge Commit** (or **Squash Merge**, per project preference)
- Use commit message: `chore: Prepare Release vX.Y.Z`
### 4⃣ CI: Publish the GitHub Release
The new release will automatically be published by the GitHub CI action.
See `.github/workflows/bump-version.yml` for details.
### 5⃣ CI: Prepare the Development Version Marker
The development version marker will automatically be set by the GitHub CI action.
See `.github/workflows/bump-version.yml` for details.
(The previous manual procedure, in which a maintainer drafted the GitHub release by hand and then opened a follow-up PR to set the `vX.Y.Z+dev` development version marker, is replaced by this CI workflow.)
## ✅ Quick Reference
| Step | Actor | Action |
| ---- | ----- | ------ |
| **1. Prepare release branch** | Contributor | Bump version & changelog |
| **2. Open release PR** | Contributor | Submit release for review |
| **3. Review & merge release PR** | Maintainer | Finalize changes into `main` |
| **4. Publish GitHub Release** | CI | Create tag & notify users |
| **5. Prepare development version branch** | CI | Set development marker |
View File
@@ -3,7 +3,7 @@
"info": { "info": {
"title": "Akkudoktor-EOS", "title": "Akkudoktor-EOS",
"description": "This project provides a comprehensive solution for simulating and optimizing an energy system based on renewable energy sources. With a focus on photovoltaic (PV) systems, battery storage (batteries), load management (consumer requirements), heat pumps, electric vehicles, and consideration of electricity price data, this system enables forecasting and optimization of energy flow and costs over a specified period.", "description": "This project provides a comprehensive solution for simulating and optimizing an energy system based on renewable energy sources. With a focus on photovoltaic (PV) systems, battery storage (batteries), load management (consumer requirements), heat pumps, electric vehicles, and consideration of electricity price data, this system enables forecasting and optimization of energy flow and costs over a specified period.",
"version": "v0.2.0" "version": "v0.2.0+dev.4dbc2d"
}, },
"paths": { "paths": {
"/v1/admin/cache/clear": { "/v1/admin/cache/clear": {
@@ -2406,7 +2406,7 @@
"general": { "general": {
"$ref": "#/components/schemas/GeneralSettings-Output", "$ref": "#/components/schemas/GeneralSettings-Output",
"default": { "default": {
"version": "0.2.0", "version": "0.2.0+dev.4dbc2d",
"data_output_subpath": "output", "data_output_subpath": "output",
"latitude": 52.52, "latitude": 52.52,
"longitude": 13.405, "longitude": 13.405,
@@ -2469,7 +2469,10 @@
"$ref": "#/components/schemas/ElecPriceCommonSettings-Output", "$ref": "#/components/schemas/ElecPriceCommonSettings-Output",
"default": { "default": {
"vat_rate": 1.19, "vat_rate": 1.19,
"provider_settings": {} "elecpriceimport": {},
"energycharts": {
"bidding_zone": "DE-LU"
}
} }
}, },
"feedintariff": { "feedintariff": {
@@ -2519,7 +2522,7 @@
"additionalProperties": false, "additionalProperties": false,
"type": "object", "type": "object",
"title": "ConfigEOS", "title": "ConfigEOS",
"description": "Singleton configuration handler for the EOS application.\n\nConfigEOS extends `SettingsEOS` with support for default configuration paths and automatic\ninitialization.\n\n`ConfigEOS` ensures that only one instance of the class is created throughout the application,\nallowing consistent access to EOS configuration settings. This singleton instance loads\nconfiguration data from a predefined set of directories or creates a default configuration if\nnone is found.\n\nInitialization Process:\n - Upon instantiation, the singleton instance attempts to load a configuration file in this order:\n 1. The directory specified by the `EOS_CONFIG_DIR` environment variable\n 2. The directory specified by the `EOS_DIR` environment variable.\n 3. A platform specific default directory for EOS.\n 4. The current working directory.\n - The first available configuration file found in these directories is loaded.\n - If no configuration file is found, a default configuration file is created in the platform\n specific default directory, and default settings are loaded into it.\n\nAttributes from the loaded configuration are accessible directly as instance attributes of\n`ConfigEOS`, providing a centralized, shared configuration object for EOS.\n\nSingleton Behavior:\n - This class uses the `SingletonMixin` to ensure that all requests for `ConfigEOS` return\n the same instance, which contains the most up-to-date configuration. Modifying the configuration\n in one part of the application reflects across all references to this class.\n\nAttributes:\n config_folder_path (Optional[Path]): Path to the configuration directory.\n config_file_path (Optional[Path]): Path to the configuration file.\n\nRaises:\n FileNotFoundError: If no configuration file is found, and creating a default configuration fails.\n\nExample:\n To initialize and access configuration attributes (only one instance is created):\n ```python\n config_eos = ConfigEOS() # Always returns the same instance\n print(config_eos.prediction.hours) # Access a setting from the loaded configuration\n ```" "description": "Singleton configuration handler for the EOS application.\n\nConfigEOS extends `SettingsEOS` with support for default configuration paths and automatic\ninitialization.\n\n`ConfigEOS` ensures that only one instance of the class is created throughout the application,\nallowing consistent access to EOS configuration settings. This singleton instance loads\nconfiguration data from a predefined set of directories or creates a default configuration if\nnone is found.\n\nInitialization Process:\n - Upon instantiation, the singleton instance attempts to load a configuration file in this order:\n 1. The directory specified by the `EOS_CONFIG_DIR` environment variable\n 2. The directory specified by the `EOS_DIR` environment variable.\n 3. A platform specific default directory for EOS.\n 4. The current working directory.\n - The first available configuration file found in these directories is loaded.\n - If no configuration file is found, a default configuration file is created in the platform\n specific default directory, and default settings are loaded into it.\n\nAttributes from the loaded configuration are accessible directly as instance attributes of\n`ConfigEOS`, providing a centralized, shared configuration object for EOS.\n\nSingleton Behavior:\n - This class uses the `SingletonMixin` to ensure that all requests for `ConfigEOS` return\n the same instance, which contains the most up-to-date configuration. 
Modifying the configuration\n in one part of the application reflects across all references to this class.\n\nAttributes:\n config_folder_path (Optional[Path]): Path to the configuration directory.\n config_file_path (Optional[Path]): Path to the configuration file.\n\nRaises:\n FileNotFoundError: If no configuration file is found, and creating a default configuration fails.\n\nExample:\n To initialize and access configuration attributes (only one instance is created):\n .. code-block:: python\n\n config_eos = ConfigEOS() # Always returns the same instance\n print(config_eos.prediction.hours) # Access a setting from the loaded configuration"
}, },
"DDBCActuatorStatus": { "DDBCActuatorStatus": {
"properties": { "properties": {
@@ -2975,27 +2978,6 @@
"title": "DevicesCommonSettings", "title": "DevicesCommonSettings",
"description": "Base configuration for devices simulation settings." "description": "Base configuration for devices simulation settings."
}, },
"ElecPriceCommonProviderSettings": {
"properties": {
"ElecPriceImport": {
"anyOf": [
{
"$ref": "#/components/schemas/ElecPriceImportCommonSettings"
},
{
"type": "null"
}
],
"description": "ElecPriceImport settings",
"examples": [
null
]
}
},
"type": "object",
"title": "ElecPriceCommonProviderSettings",
"description": "Electricity Price Prediction Provider Configuration."
},
"ElecPriceCommonSettings-Input": { "ElecPriceCommonSettings-Input": {
"properties": { "properties": {
"provider": { "provider": {
@@ -3046,12 +3028,13 @@
1.19
]
},
"elecpriceimport": {
"$ref": "#/components/schemas/ElecPriceImportCommonSettings",
"description": "Import provider settings."
},
"energycharts": {
"$ref": "#/components/schemas/ElecPriceEnergyChartsCommonSettings",
"description": "Energy Charts provider settings."
}
},
"type": "object",
@@ -3108,18 +3091,34 @@
1.19
]
},
"elecpriceimport": {
"$ref": "#/components/schemas/ElecPriceImportCommonSettings",
"description": "Import provider settings."
},
"energycharts": {
"$ref": "#/components/schemas/ElecPriceEnergyChartsCommonSettings",
"description": "Energy Charts provider settings."
}
},
"type": "object",
"title": "ElecPriceCommonSettings",
"description": "Electricity Price Prediction Configuration."
},
"ElecPriceEnergyChartsCommonSettings": {
"properties": {
"bidding_zone": {
"$ref": "#/components/schemas/EnergyChartsBiddingZones",
"description": "Bidding Zone: 'AT', 'BE', 'CH', 'CZ', 'DE-LU', 'DE-AT-LU', 'DK1', 'DK2', 'FR', 'HU', 'IT-NORTH', 'NL', 'NO2', 'PL', 'SE4' or 'SI'",
"default": "DE-LU",
"examples": [
"AT"
]
}
},
"type": "object",
"title": "ElecPriceEnergyChartsCommonSettings",
"description": "Common settings for Energy Charts electricity price provider."
},
"ElecPriceImportCommonSettings": { "ElecPriceImportCommonSettings": {
"properties": { "properties": {
"import_file_path": { "import_file_path": {
@@ -3375,6 +3374,29 @@
"title": "ElectricVehicleResult", "title": "ElectricVehicleResult",
"description": "Result class containing information related to the electric vehicle's charging and discharging behavior." "description": "Result class containing information related to the electric vehicle's charging and discharging behavior."
}, },
"EnergyChartsBiddingZones": {
"type": "string",
"enum": [
"AT",
"BE",
"CH",
"CZ",
"DE-LU",
"DE-AT-LU",
"DK1",
"DK2",
"FR",
"HU",
"IT-NORTH",
"NL",
"NO2",
"PL",
"SE4",
"SI"
],
"title": "EnergyChartsBiddingZones",
"description": "Energy Charts Bidding Zones."
},
"EnergyManagementCommonSettings": { "EnergyManagementCommonSettings": {
"properties": { "properties": {
"startup_delay": { "startup_delay": {
@@ -4062,7 +4084,7 @@
"type": "string", "type": "string",
"title": "Version", "title": "Version",
"description": "Configuration file version. Used to check compatibility.", "description": "Configuration file version. Used to check compatibility.",
"default": "0.2.0" "default": "0.2.0+dev.4dbc2d"
}, },
"data_folder_path": { "data_folder_path": {
"anyOf": [ "anyOf": [
@@ -4136,7 +4158,7 @@
"type": "string", "type": "string",
"title": "Version", "title": "Version",
"description": "Configuration file version. Used to check compatibility.", "description": "Configuration file version. Used to check compatibility.",
"default": "0.2.0" "default": "0.2.0+dev.4dbc2d"
}, },
"data_folder_path": { "data_folder_path": {
"anyOf": [ "anyOf": [
@@ -7153,7 +7175,7 @@
},
"type": "object",
"title": "PydanticDateTimeData",
"description": "Pydantic model for time series data with consistent value lengths.\n\nThis model validates a dictionary where:\n- Keys are strings representing data series names\n- Values are lists of numeric or string values\n- Special keys 'start_datetime' and 'interval' can contain string values\nfor time series indexing\n- All value lists must have the same length\n\nExample:\n .. code-block:: python\n\n {\n \"start_datetime\": \"2024-01-01 00:00:00\", # optional\n \"interval\": \"1 Hour\", # optional\n \"loadforecast_power_w\": [20.5, 21.0, 22.1],\n \"load_min\": [18.5, 19.0, 20.1]\n }"
},
"PydanticDateTimeDataFrame": {
"properties": {
View File
@@ -1,6 +1,6 @@
[project]
name = "akkudoktor-eos"
dynamic = ["version"] # Get version information dynamically
authors = [
{ name="Andreas Schmitz", email="author@example.com" },
]
@@ -25,6 +25,8 @@ build-backend = "setuptools.build_meta"
[tool.setuptools.dynamic]
dependencies = {file = ["requirements.txt"]}
optional-dependencies = {dev = { file = ["requirements-dev.txt"] }}
# version.txt must be generated
version = { file = "version.txt" }
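How `version.txt` gets generated is not shown in this diff; as a rough sketch (a hypothetical helper, not the project's actual build tooling), it could be derived from the package version like this:

```python
# Hypothetical sketch: write version.txt for setuptools' dynamic version lookup.
# The project generates this file through its own tooling; this only
# illustrates the idea of deriving it from version.py.
from pathlib import Path

from akkudoktoreos.core.version import __version__

Path("version.txt").write_text(f"{__version__}\n", encoding="utf-8")
```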
[tool.setuptools.packages.find]
where = ["src/"]
@@ -109,29 +111,10 @@ module = "xprocess.*"
ignore_missing_imports = true
[tool.commitizen]
# Only used as linter
name = "cz_conventional_commits"
version_scheme = "semver"
tag_format = "v$version"
# Enforce commit message and branch style:
update_changelog_on_bump = true
changelog_incremental = true
annotated_tag = true
bump_message = "chore(release): $current_version → $new_version"
# Branch validation settings
branch_validation = true
branch_pattern = "^(feat|fix|chore|docs|refactor|test)/[a-z0-9._-]+$"
# Customize changelog generation
[tool.commitizen.changelog]
path = "CHANGELOG.md"
template = "keepachangelog"
# If your version is stored in multiple files (Python modules, docs etc.), add them here
[tool.commitizen.files]
version = [
"pyproject.toml", # Auto-update project version
"src/akkudoktoreos/core/version.py",
"src/akkudoktoreos/data/default.config.json"
]
View File
@@ -7,12 +7,16 @@
# - mypy (mirrors-mypy) - sync with requirements-dev.txt (if on pypi)
# - pymarkdown
# - commitizen - sync with requirements-dev.txt (if on pypi)
#
# !!! Sync .pre-commit-config.yaml and requirements-dev.txt !!!
pre-commit==4.5.0
mypy==1.18.2
types-requests==2.32.4.20250913 # for mypy
pandas-stubs==2.3.2.250926 # for mypy
tokenize-rt==6.2.0 # for mypy
types-docutils==0.22.3.20251115 # for mypy
types-PyYaml==6.0.12.20250915 # for mypy
commitizen==4.10.0
deprecated==1.3.1 # for commitizen
# Sphinx
@@ -23,7 +27,7 @@ GitPython==3.1.45
myst-parser==4.0.1
# Pytest
pytest==9.0.1
pytest-cov==7.0.0
coverage==7.12.0
pytest-xprocess==1.0.2
View File
@@ -1,14 +1,14 @@
babel==2.17.0
beautifulsoup4==4.14.2
cachebox==5.1.0
numpy==2.3.5
numpydantic==1.7.0
matplotlib==3.10.7
contourpy==1.3.3
fastapi[standard-no-fastapi-cloud-cli]==0.122.0
fastapi_cli==0.0.16
rich-toolkit==0.16.0
python-fasthtml==0.12.35
MonsterUI==1.0.32
markdown-it-py==3.0.0
mdit-py-plugins==0.5.0
View File
@@ -0,0 +1,70 @@
#!/usr/bin/env python3
"""
Update VERSION_BASE in version.py after a release tag.
Behavior:
- Read VERSION_BASE from version.py
- Strip ANY existing "+dev" suffix
- Append exactly one "+dev"
- Write back the updated file
This ensures:
0.2.0 --> 0.2.0+dev
0.2.0+dev --> 0.2.0+dev
0.2.0+dev+dev -> 0.2.0+dev
"""
import re
import sys
from pathlib import Path
ROOT = Path(__file__).resolve().parent.parent
VERSION_FILE = ROOT / "src" / "akkudoktoreos" / "core" / "version.py"
def bump_dev_version_file(file: Path) -> str:
text = file.read_text(encoding="utf-8")
# Extract current version
m = re.search(r'^VERSION_BASE\s*=\s*["\']([^"\']+)["\']',
text, flags=re.MULTILINE)
if not m:
raise ValueError("VERSION_BASE not found")
base_version = m.group(1)
# Remove trailing +dev if present → ensure idempotency
cleaned = re.sub(r'(\+dev)+$', '', base_version)
# Append +dev
new_version = f"{cleaned}+dev"
# Replace inside file content
new_text = re.sub(
r'^VERSION_BASE\s*=\s*["\']([^"\']+)["\']',
f'VERSION_BASE = "{new_version}"',
text,
flags=re.MULTILINE
)
file.write_text(new_text, encoding="utf-8")
return new_version
def main():
# Use CLI argument or fallback default path
version_file = Path(sys.argv[1]) if len(sys.argv) > 1 else VERSION_FILE
try:
new_version = bump_dev_version_file(version_file)
except Exception as e:
print(f"Error: {e}", file=sys.stderr)
sys.exit(1)
# MUST print to stdout
print(new_version)
if __name__ == "__main__":
main()
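As a usage sketch, the helper defined above can also be exercised directly outside the CI workflow; the path below is the version file the script defaults to:

```python
# Usage sketch for bump_dev_version_file() defined above.
from pathlib import Path

new_version = bump_dev_version_file(Path("src/akkudoktoreos/core/version.py"))
print(new_version)  # e.g. "0.2.0" becomes "0.2.0+dev"
```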
View File
@@ -1,170 +0,0 @@
"""Update version strings in multiple project files only if the old version matches.
This script updates version information in:
- pyproject.toml
- src/akkudoktoreos/core/version.py
- src/akkudoktoreos/data/default.config.json
- Makefile
Supported version formats:
- __version__ = "<version>"
- version = "<version>"
- "version": "<version>"
- VERSION ?: <version>
It will:
- Replace VERSION → NEW_VERSION if the old version is found.
- Report which files were updated.
- Report which files contained mismatched versions.
- Report which files had no version.
Usage:
python bump_version.py VERSION NEW_VERSION
Args:
VERSION (str): Version expected before replacement.
NEW_VERSION (str): Version to write.
"""
#!/usr/bin/env python3
import argparse
import glob
import os
import re
import shutil
from pathlib import Path
from typing import List, Tuple
# Patterns to match version strings
VERSION_PATTERNS = [
re.compile(r'(__version__\s*=\s*")(?P<ver>[^"]+)(")'),
re.compile(r'(version\s*=\s*")(?P<ver>[^"]+)(")'),
re.compile(r'("version"\s*:\s*")(?P<ver>[^"]+)(")'),
re.compile(r'(VERSION\s*\?=\s*)(?P<ver>[^\s]+)'), # For Makefile: VERSION ?= 0.2.0
]
# Default files to process
DEFAULT_FILES = [
"pyproject.toml",
"src/akkudoktoreos/core/version.py",
"src/akkudoktoreos/data/default.config.json",
"Makefile",
]
def backup_file(file_path: str) -> str:
"""Create a backup of the given file with a .bak suffix.
Args:
file_path: Path to the file to backup.
Returns:
Path to the backup file.
"""
backup_path = f"{file_path}.bak"
shutil.copy2(file_path, backup_path)
return backup_path
def replace_version_in_file(
file_path: Path, old_version: str, new_version: str, dry_run: bool = False
) -> Tuple[bool, bool]:
"""
Replace old_version with new_version in the given file if it matches.
Args:
file_path: Path to the file to modify.
old_version: The old version to replace.
new_version: The new version to set.
dry_run: If True, don't actually modify files.
Returns:
Tuple[bool, bool]: (file_would_be_updated, old_version_found)
"""
content = file_path.read_text()
new_content = content
old_version_found = False
file_would_be_updated = False
for pattern in VERSION_PATTERNS:
def repl(match):
nonlocal old_version_found, file_would_be_updated
ver = match.group("ver")
if ver == old_version:
old_version_found = True
file_would_be_updated = True
# Some patterns have 3 groups (like quotes)
if len(match.groups()) == 3:
return f"{match.group(1)}{new_version}{match.group(3)}"
else:
return f"{match.group(1)}{new_version}"
return match.group(0)
new_content = pattern.sub(repl, new_content)
if file_would_be_updated:
if dry_run:
print(f"[DRY-RUN] Would update {file_path}")
else:
backup_path = file_path.with_suffix(file_path.suffix + ".bak")
shutil.copy(file_path, backup_path)
file_path.write_text(new_content)
print(f"Updated {file_path} (backup saved to {backup_path})")
elif not old_version_found:
print(f"[SKIP] {file_path}: old version '{old_version}' not found")
return file_would_be_updated, old_version_found
def main():
parser = argparse.ArgumentParser(description="Bump version across project files.")
parser.add_argument("old_version", help="Old version to replace")
parser.add_argument("new_version", help="New version to set")
parser.add_argument(
"--dry-run", action="store_true", help="Show what would be changed without modifying files"
)
parser.add_argument(
"--glob", nargs="*", help="Optional glob patterns to include additional files"
)
args = parser.parse_args()
updated_files = []
not_found_files = []
# Determine files to update
files_to_update: List[Path] = [Path(f) for f in DEFAULT_FILES]
if args.glob:
for pattern in args.glob:
files_to_update.extend(Path(".").glob(pattern))
files_to_update = list(dict.fromkeys(files_to_update)) # remove duplicates
any_updated = False
for file_path in files_to_update:
if file_path.exists() and file_path.is_file():
updated, _ = replace_version_in_file(
file_path, args.old_version, args.new_version, args.dry_run
)
any_updated |= updated
if updated:
updated_files.append(file_path)
else:
print(f"[SKIP] {file_path}: file does not exist")
not_found_files.append(file_path)
print("\nSummary:")
if updated_files:
print(f"Updated files ({len(updated_files)}):")
for f in updated_files:
print(f" {f}")
else:
print("No files were updated.")
if not_found_files:
print(f"Files where old version was not found ({len(not_found_files)}):")
for f in not_found_files:
print(f" {f}")
if __name__ == "__main__":
main()
View File
@@ -8,7 +8,7 @@ import re
import sys
import textwrap
from pathlib import Path
from typing import Any, Optional, Type, Union, get_args
from loguru import logger
from pydantic.fields import ComputedFieldInfo, FieldInfo
@@ -24,13 +24,29 @@ undocumented_types: dict[PydanticBaseModel, tuple[str, list[str]]] = dict()
global_config_dict: dict[str, Any] = dict()
def get_model_class_from_annotation(field_type: Any) -> type[PydanticBaseModel] | None:
"""Given a type annotation (possibly Optional or Union), return the first Pydantic model class."""
origin = getattr(field_type, "__origin__", None)
if origin is Union:
# unwrap Union/Optional
for arg in get_args(field_type):
cls = get_model_class_from_annotation(arg)
if cls is not None:
return cls
return None
elif isinstance(field_type, type) and issubclass(field_type, PydanticBaseModel):
return field_type
else:
return None
def get_title(config: type[PydanticBaseModel]) -> str:
if config.__doc__ is None:
raise NameError(f"Missing docstring: {config}")
return config.__doc__.strip().splitlines()[0].strip(".")
def get_body(config: type[PydanticBaseModel]) -> str:
if config.__doc__ is None:
raise NameError(f"Missing docstring: {config}")
return textwrap.dedent("\n".join(config.__doc__.strip().splitlines()[1:])).strip()
@@ -53,17 +69,49 @@ def resolve_nested_types(field_type: Any, parent_types: list[str]) -> list[tuple
def get_example_or_default(field_name: str, field_info: FieldInfo, example_ix: int) -> Any:
"""Generate a default value for a field, considering constraints.
if field_info.examples is not None:
try:
return field_info.examples[example_ix]
except IndexError:
return field_info.examples[-1]
Priority:
1. field_info.examples
2. field_info.example
3. json_schema_extra['examples']
4. json_schema_extra['example']
5. field_info.default
"""
# 1. Old-style examples attribute
examples = getattr(field_info, "examples", None)
if examples is not None:
try:
return examples[example_ix]
except IndexError:
return examples[-1]
# 2. Old-style single example
example = getattr(field_info, "example", None)
if example is not None:
return example
# 3. Look into json_schema_extra (new style)
extra = getattr(field_info, "json_schema_extra", {}) or {}
examples = extra.get("examples")
if examples is not None:
try:
return examples[example_ix]
except IndexError:
return examples[-1]
example = extra.get("example")
if example is not None:
return example
# 5. Default
if getattr(field_info, "default", None) not in (None, ...):
return field_info.default
raise NotImplementedError(
f"No default or example provided for field '{field_name}': {field_info}"
)
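
A minimal standalone sketch of the lookup order documented above (old-style `examples`, single `example`, `json_schema_extra['examples']`/`['example']`, then the default); the `Demo` model and its field are illustrative, not EOS models:

```python
from pydantic import BaseModel, Field


class Demo(BaseModel):
    """Illustrative model only; not part of EOS."""

    port: int = Field(default=8503, json_schema_extra={"examples": [8503, 8504]})


def example_or_default(field_name: str, example_ix: int = 0):
    """Return an example value for the field, falling back to its default."""
    info = Demo.model_fields[field_name]
    extra = info.json_schema_extra or {}
    examples = getattr(info, "examples", None) or extra.get("examples")
    if examples:
        # Clamp the index instead of raising, mirroring the IndexError fallback above.
        return examples[min(example_ix, len(examples) - 1)]
    return info.default


print(example_or_default("port"))     # 8503 (from json_schema_extra["examples"])
print(example_or_default("port", 5))  # 8504 (index clamped to the last example)
```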
def get_model_structure_from_examples( def get_model_structure_from_examples(
@@ -92,7 +140,7 @@ def get_model_structure_from_examples(
def create_model_from_examples( def create_model_from_examples(
model_class: PydanticBaseModel, multiple: bool model_class: type[PydanticBaseModel], multiple: bool
) -> list[PydanticBaseModel]: ) -> list[PydanticBaseModel]:
"""Create a model instance with default or example values, respecting constraints.""" """Create a model instance with default or example values, respecting constraints."""
return [ return [
@@ -131,7 +179,7 @@ def get_type_name(field_type: type) -> str:
def generate_config_table_md( def generate_config_table_md(
config: PydanticBaseModel, config: type[PydanticBaseModel],
toplevel_keys: list[str], toplevel_keys: list[str],
prefix: str, prefix: str,
toplevel: bool = False, toplevel: bool = False,
@@ -167,22 +215,28 @@ def generate_config_table_md(
table += "\n\n" table += "\n\n"
table += ( table += (
"<!-- pyml disable line-length -->\n"
":::{table} " ":::{table} "
+ f"{'::'.join(toplevel_keys)}\n:widths: 10 {env_width}10 5 5 30\n:align: left\n\n" + f"{'::'.join(toplevel_keys)}\n:widths: 10 {env_width}10 5 5 30\n:align: left\n\n"
) )
table += f"| Name {env_header}| Type | Read-Only | Default | Description |\n" table += f"| Name {env_header}| Type | Read-Only | Default | Description |\n"
table += f"| ---- {env_header_underline}| ---- | --------- | ------- | ----------- |\n" table += f"| ---- {env_header_underline}| ---- | --------- | ------- | ----------- |\n"
for field_name, field_info in list(config.model_fields.items()) + list(
config.model_computed_fields.items() fields = {}
): for field_name, field_info in config.model_fields.items():
fields[field_name] = field_info
for field_name, field_info in config.model_computed_fields.items():
fields[field_name] = field_info
for field_name in sorted(fields.keys()):
field_info = fields[field_name]
regular_field = isinstance(field_info, FieldInfo) regular_field = isinstance(field_info, FieldInfo)
config_name = field_name if extra_config else field_name.upper() config_name = field_name if extra_config else field_name.upper()
field_type = field_info.annotation if regular_field else field_info.return_type field_type = field_info.annotation if regular_field else field_info.return_type
default_value = get_default_value(field_info, regular_field) default_value = get_default_value(field_info, regular_field)
description = field_info.description if field_info.description else "-" description = config.field_description(field_name)
deprecated = field_info.deprecated if field_info.deprecated else None deprecated = config.field_deprecated(field_name)
read_only = "rw" if regular_field else "ro" read_only = "rw" if regular_field else "ro"
type_name = get_type_name(field_type) type_name = get_type_name(field_type)
@@ -238,7 +292,7 @@ def generate_config_table_md(
undocumented_types.setdefault(new_type, (info[0], info[1])) undocumented_types.setdefault(new_type, (info[0], info[1]))
if toplevel: if toplevel:
table += ":::\n\n" # Add an empty line after the table table += ":::\n<!-- pyml enable line-length -->\n\n" # Add an empty line after the table
has_examples_list = toplevel_keys[-1] == "list" has_examples_list = toplevel_keys[-1] == "list"
instance_list = create_model_from_examples(config, has_examples_list) instance_list = create_model_from_examples(config, has_examples_list)
@@ -256,9 +310,13 @@ def generate_config_table_md(
same_output = ins_out_dict_list == ins_dict_list same_output = ins_out_dict_list == ins_dict_list
same_output_str = "/Output" if same_output else "" same_output_str = "/Output" if same_output else ""
table += f"#{heading_level} Example Input{same_output_str}\n\n" # -- code block heading
table += "```{eval-rst}\n" table += "<!-- pyml disable no-emphasis-as-heading -->\n"
table += ".. code-block:: json\n\n" table += f"**Example Input{same_output_str}**\n"
table += "<!-- pyml enable no-emphasis-as-heading -->\n\n"
# -- code block
table += "<!-- pyml disable line-length -->\n"
table += "```json\n"
if has_examples_list: if has_examples_list:
input_dict = build_nested_structure(toplevel_keys[:-1], ins_dict_list) input_dict = build_nested_structure(toplevel_keys[:-1], ins_dict_list)
if not extra_config: if not extra_config:
@@ -268,20 +326,24 @@ def generate_config_table_md(
if not extra_config: if not extra_config:
global_config_dict[toplevel_keys[0]] = ins_dict_list[0] global_config_dict[toplevel_keys[0]] = ins_dict_list[0]
table += textwrap.indent(json.dumps(input_dict, indent=4), " ") table += textwrap.indent(json.dumps(input_dict, indent=4), " ")
table += "\n" table += "\n```\n<!-- pyml enable line-length -->\n\n"
table += "```\n\n" # -- end code block
if not same_output: if not same_output:
table += f"#{heading_level} Example Output\n\n" # -- code block heading
table += "```{eval-rst}\n" table += "<!-- pyml disable no-emphasis-as-heading -->\n"
table += ".. code-block:: json\n\n" table += f"**Example Output**\n"
table += "<!-- pyml enable no-emphasis-as-heading -->\n\n"
# -- code block
table += "<!-- pyml disable line-length -->\n"
table += "```json\n"
if has_examples_list: if has_examples_list:
output_dict = build_nested_structure(toplevel_keys[:-1], ins_out_dict_list) output_dict = build_nested_structure(toplevel_keys[:-1], ins_out_dict_list)
else: else:
output_dict = build_nested_structure(toplevel_keys, ins_out_dict_list[0]) output_dict = build_nested_structure(toplevel_keys, ins_out_dict_list[0])
table += textwrap.indent(json.dumps(output_dict, indent=4), " ") table += textwrap.indent(json.dumps(output_dict, indent=4), " ")
table += "\n" table += "\n```\n<!-- pyml enable line-length -->\n\n"
table += "```\n\n" # -- end code block
while undocumented_types: while undocumented_types:
extra_config_type, extra_info = undocumented_types.popitem() extra_config_type, extra_info = undocumented_types.popitem()
@@ -293,7 +355,7 @@ def generate_config_table_md(
return table return table
def generate_config_md(config_eos: ConfigEOS) -> str: def generate_config_md(file_path: Optional[Union[str, Path]], config_eos: ConfigEOS) -> str:
"""Generate configuration specification in Markdown with extra tables for prefixed values. """Generate configuration specification in Markdown with extra tables for prefixed values.
Returns: Returns:
@@ -305,44 +367,103 @@ def generate_config_md(config_eos: ConfigEOS) -> str:
) )
GeneralSettings._config_folder_path = config_eos.general.config_file_path.parent GeneralSettings._config_folder_path = config_eos.general.config_file_path.parent
markdown = "# Configuration Table\n\n" markdown = ""
# Generate tables for each top level config if file_path:
for field_name, field_info in config_eos.__class__.model_fields.items(): file_path = Path(file_path)
field_type = field_info.annotation # -- table of content
markdown += generate_config_table_md( markdown += "```{toctree}\n"
field_type, [field_name], f"EOS_{field_name.upper()}__", True markdown += ":maxdepth: 1\n"
markdown += ":caption: Configuration Table\n\n"
else:
markdown += "# Configuration Table\n\n"
markdown += (
"The configuration table describes all the configuration options of Akkudoktor-EOS\n\n"
) )
# Generate tables for each top level config
for field_name in sorted(config_eos.__class__.model_fields.keys()):
field_info = config_eos.__class__.model_fields[field_name]
field_type = field_info.annotation
model_class = get_model_class_from_annotation(field_type)
if model_class is None:
raise ValueError(f"Can not find class of top level field {field_name}.")
table = generate_config_table_md(
model_class, [field_name], f"EOS_{field_name.upper()}__", True
)
if file_path:
# Write table to extra document
table_path = file_path.with_name(file_path.stem + f"{field_name.lower()}.md")
write_to_file(table_path, table)
markdown += f"../_generated/{table_path.name}\n"
else:
# We will write to stdout
markdown += "---\n\n"
markdown += table
# Generate full example
example = ""
# Full config # Full config
markdown += "## Full example Config\n\n" example += "## Full example Config\n\n"
markdown += "```{eval-rst}\n" # -- code block
markdown += ".. code-block:: json\n\n" example += "<!-- pyml disable line-length -->\n"
example += "```json\n"
# Test for valid config first # Test for valid config first
config_eos.merge_settings_from_dict(global_config_dict) config_eos.merge_settings_from_dict(global_config_dict)
markdown += textwrap.indent(json.dumps(global_config_dict, indent=4), " ") example += textwrap.indent(json.dumps(global_config_dict, indent=4), " ")
markdown += "\n" example += "\n"
markdown += "```\n\n" example += "```\n<!-- pyml enable line-length -->\n\n"
# -- end code block end
if file_path:
example_path = file_path.with_name(file_path.stem + f"example.md")
write_to_file(example_path, example)
markdown += f"../_generated/{example_path.name}\n"
markdown += "```\n\n"
# -- end table of content
else:
markdown += "---\n\n"
markdown += example
# Assure there is no double \n at end of file # Assure there is no double \n at end of file
markdown = markdown.rstrip("\n") markdown = markdown.rstrip("\n")
markdown += "\n" markdown += "\n"
markdown += "\nAuto generated from source code.\n"
# Write markdown to file or stdout
write_to_file(file_path, markdown)
return markdown
def write_to_file(file_path: Optional[Union[str, Path]], config_md: str):
if os.name == "nt":
config_md = config_md.replace("\\\\", "/")
# Assure log path does not leak to documentation # Assure log path does not leak to documentation
markdown = re.sub( config_md = re.sub(
r'(?<=["\'])/[^"\']*/output/eos\.log(?=["\'])', r'(?<=["\'])/[^"\']*/output/eos\.log(?=["\'])',
'/home/user/.local/share/net.akkudoktoreos.net/output/eos.log', '/home/user/.local/share/net.akkudoktor.eos/output/eos.log',
markdown config_md
) )
# Assure timezone name does not leak to documentation # Assure timezone name does not leak to documentation
tz_name = to_datetime().timezone_name tz_name = to_datetime().timezone_name
markdown = re.sub(re.escape(tz_name), "Europe/Berlin", markdown, flags=re.IGNORECASE) config_md = re.sub(re.escape(tz_name), "Europe/Berlin", config_md, flags=re.IGNORECASE)
# Also replace UTC, as GitHub CI always is on UTC # Also replace UTC, as GitHub CI always is on UTC
markdown = re.sub(re.escape("UTC"), "Europe/Berlin", markdown, flags=re.IGNORECASE) config_md = re.sub(re.escape("UTC"), "Europe/Berlin", config_md, flags=re.IGNORECASE)
# Assure no extra lines at end of file
config_md = config_md.rstrip("\n")
config_md += "\n"
return markdown if file_path:
# Write to file
with open(Path(file_path), "w", encoding="utf-8", newline="\n") as f:
f.write(config_md)
else:
# Write to std output
print(config_md)
def main(): def main():
@@ -352,23 +473,14 @@ def main():
"--output-file", "--output-file",
type=str, type=str,
default=None, default=None,
help="File to write the Configuration Specification to", help="File to write the top level configuration specification to.",
) )
args = parser.parse_args() args = parser.parse_args()
config_eos = get_config() config_eos = get_config()
try: try:
config_md = generate_config_md(config_eos) config_md = generate_config_md(args.output_file, config_eos)
if os.name == "nt":
config_md = config_md.replace("\\\\", "/")
if args.output_file:
# Write to file
with open(args.output_file, "w", encoding="utf-8", newline="\n") as f:
f.write(config_md)
else:
# Write to std output
print(config_md)
except Exception as e: except Exception as e:
print(f"Error during Configuration Specification generation: {e}", file=sys.stderr) print(f"Error during Configuration Specification generation: {e}", file=sys.stderr)

View File

@@ -194,6 +194,8 @@ def format_endpoint(path: str, method: str, details: dict, devel: bool = False)
markdown = f"## {method.upper()} {path}\n\n" markdown = f"## {method.upper()} {path}\n\n"
# -- links
markdown += "<!-- pyml disable line-length -->\n"
markdown += f"**Links**: {local_path}, {akkudoktoreos_main_path}" markdown += f"**Links**: {local_path}, {akkudoktoreos_main_path}"
if devel: if devel:
# Add link to akkudoktor branch the development has used # Add link to akkudoktor branch the development has used
@@ -206,7 +208,8 @@ def format_endpoint(path: str, method: str, details: dict, devel: bool = False)
+ link_method + link_method
) )
markdown += f", {akkudoktoreos_base_path}" markdown += f", {akkudoktoreos_base_path}"
markdown += "\n\n" markdown += "\n<!-- pyml enable line-length -->\n\n"
# -- links end
summary = details.get("summary", None) summary = details.get("summary", None)
if summary: if summary:
@@ -214,9 +217,14 @@ def format_endpoint(path: str, method: str, details: dict, devel: bool = False)
description = details.get("description", None) description = details.get("description", None)
if description: if description:
markdown += "```\n" # -- code block
markdown += f"{description}" markdown += "<!-- pyml disable line-length -->\n"
markdown += "\n```\n\n" markdown += "```python\n"
markdown += '"""\n'
markdown += f"{description}\n"
markdown += '"""\n'
markdown += "```\n<!-- pyml enable line-length -->\n\n"
# -- end code block end
markdown += format_parameters(details.get("parameters", [])) markdown += format_parameters(details.get("parameters", []))
markdown += format_request_body(details.get("requestBody", {}).get("content", {})) markdown += format_request_body(details.get("requestBody", {}).get("content", {}))
@@ -239,7 +247,11 @@ def openapi_to_markdown(openapi_json: dict, devel: bool = False) -> str:
info = extract_info(openapi_json) info = extract_info(openapi_json)
markdown = f"# {info['title']}\n\n" markdown = f"# {info['title']}\n\n"
markdown += f"**Version**: `{info['version']}`\n\n" markdown += f"**Version**: `{info['version']}`\n\n"
markdown += f"**Description**: {info['description']}\n\n" # -- description
markdown += "<!-- pyml disable line-length -->\n"
markdown += f"**Description**: {info['description']}\n"
markdown += "<!-- pyml enable line-length -->\n\n"
# -- end description
markdown += f"**Base URL**: `{info['base_url']}`\n\n" markdown += f"**Base URL**: `{info['base_url']}`\n\n"
security_schemes = openapi_json.get("components", {}).get("securitySchemes", {}) security_schemes = openapi_json.get("components", {}).get("securitySchemes", {})
@@ -257,6 +269,8 @@ def openapi_to_markdown(openapi_json: dict, devel: bool = False) -> str:
markdown = markdown.rstrip("\n") markdown = markdown.rstrip("\n")
markdown += "\n" markdown += "\n"
markdown += "\nAuto generated from openapi.json.\n"
return markdown return markdown
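
The `<!-- pyml disable/enable -->` comments added throughout both generators suppress pymarkdown rules (for instance `line-length`) around generated blocks. A tiny sketch of that pattern; the helper name is illustrative and not part of the scripts:

```python
def wrap_pyml(body: str, rule: str = "line-length") -> str:
    """Surround a generated Markdown block with pyml disable/enable comments."""
    return f"<!-- pyml disable {rule} -->\n{body}\n<!-- pyml enable {rule} -->\n\n"


print(wrap_pyml('```json\n{\n    "example": true\n}\n```'))
```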

scripts/get_version.py Normal file
View File

@@ -0,0 +1,15 @@
#!.venv/bin/python
"""Get version of EOS"""
import sys
from pathlib import Path
# Add the src directory to sys.path so Sphinx can import akkudoktoreos
PROJECT_ROOT = Path(__file__).parent.parent
SRC_DIR = PROJECT_ROOT / "src"
sys.path.insert(0, str(SRC_DIR))
from akkudoktoreos.core.version import __version__
if __name__ == "__main__":
print(__version__)

scripts/update_version.py Normal file
View File

@@ -0,0 +1,113 @@
#!.venv/bin/python
"""General version replacement script.
Usage:
python scripts/update_version.py <version> <file1> [file2 ...]
"""
#!/usr/bin/env python3
import re
import sys
from pathlib import Path
from typing import List
# --- Patterns to match version strings ---
VERSION_PATTERNS = [
# Python: __version__ = "1.2.3"
re.compile(
r'(?<![A-Za-z0-9])(__version__\s*=\s*")'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
r'(")'
),
# Python: version = "1.2.3"
re.compile(
r'(?<![A-Za-z0-9])(version\s*=\s*")'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
r'(")'
),
# JSON: "version": "1.2.3"
re.compile(
r'(?<![A-Za-z0-9])("version"\s*:\s*")'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
r'(")'
),
# Makefile-style: VERSION ?= 1.2.3
re.compile(
r'(?<![A-Za-z0-9])(VERSION\s*\?=\s*)'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
),
# YAML: version: "1.2.3"
re.compile(
r'(?m)^(version\s*:\s*["\']?)'
r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
r'(["\']?)\s*$'
),
]
def update_version_in_file(file_path: Path, new_version: str) -> bool:
"""
Replace version strings in a file based on VERSION_PATTERNS.
Returns True if the file was updated.
"""
content = file_path.read_text()
new_content = content
file_would_be_updated = False
for pattern in VERSION_PATTERNS:
def repl(match):
nonlocal file_would_be_updated
ver = match.group("ver")
if ver != new_version:
file_would_be_updated = True
# Three-group patterns (__version__, JSON, YAML)
if len(match.groups()) == 3:
return f"{match.group(1)}{new_version}{match.group(3)}"
# Two-group patterns (Makefile)
return f"{match.group(1)}{new_version}"
return match.group(0)
new_content = pattern.sub(repl, new_content)
if file_would_be_updated:
file_path.write_text(new_content)
return file_would_be_updated
def main(version: str, files: List[str]):
if not version:
raise ValueError("No version provided")
if not files:
raise ValueError("No files provided")
updated_files = []
for f in files:
path = Path(f)
if not path.exists():
print(f"Warning: {path} does not exist, skipping")
continue
if update_version_in_file(path, version):
updated_files.append(str(path))
if updated_files:
print(f"Updated files: {', '.join(updated_files)}")
else:
print("No files updated.")
if __name__ == "__main__":
if len(sys.argv) < 3:
print("Usage: python update_version.py <version> <file1> [file2 ...]")
sys.exit(1)
version_arg = sys.argv[1]
files_arg = sys.argv[2:]
main(version_arg, files_arg)
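
A quick, illustrative check of the replacement logic: the `__version__` pattern from `VERSION_PATTERNS` applied to a made-up line (the version numbers here are examples, not project releases):

```python
import re

PATTERN = re.compile(
    r'(?<![A-Za-z0-9])(__version__\s*=\s*")'
    r'(?P<ver>\d+\.\d+\.\d+(?:\+[0-9A-Za-z\.]+)?)'
    r'(")'
)

sample = '__version__ = "1.2.3+dev"'
updated = PATTERN.sub(lambda m: f"{m.group(1)}1.3.0{m.group(3)}", sample)
print(updated)  # __version__ = "1.3.0"
```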

View File

@@ -11,7 +11,7 @@ Key features:
import json import json
import os import os
import shutil import tempfile
from pathlib import Path from pathlib import Path
from typing import Any, ClassVar, Optional, Type from typing import Any, ClassVar, Optional, Type
@@ -85,28 +85,38 @@ class GeneralSettings(SettingsBaseModel):
_config_file_path: ClassVar[Optional[Path]] = None _config_file_path: ClassVar[Optional[Path]] = None
version: str = Field( version: str = Field(
default=__version__, description="Configuration file version. Used to check compatibility." default=__version__,
json_schema_extra={
"description": "Configuration file version. Used to check compatibility."
},
) )
data_folder_path: Optional[Path] = Field( data_folder_path: Optional[Path] = Field(
default=None, description="Path to EOS data directory.", examples=[None, "/home/eos/data"] default=None,
json_schema_extra={
"description": "Path to EOS data directory.",
"examples": [None, "/home/eos/data"],
},
) )
data_output_subpath: Optional[Path] = Field( data_output_subpath: Optional[Path] = Field(
default="output", description="Sub-path for the EOS output data directory." default="output",
json_schema_extra={"description": "Sub-path for the EOS output data directory."},
) )
latitude: Optional[float] = Field( latitude: Optional[float] = Field(
default=52.52, default=52.52,
ge=-90.0, ge=-90.0,
le=90.0, le=90.0,
description="Latitude in decimal degrees, between -90 and 90, north is positive (ISO 19115) (°)", json_schema_extra={
"description": "Latitude in decimal degrees, between -90 and 90, north is positive (ISO 19115) (°)"
},
) )
longitude: Optional[float] = Field( longitude: Optional[float] = Field(
default=13.405, default=13.405,
ge=-180.0, ge=-180.0,
le=180.0, le=180.0,
description="Longitude in decimal degrees, within -180 to 180 (°)", json_schema_extra={"description": "Longitude in decimal degrees, within -180 to 180 (°)"},
) )
# Computed fields # Computed fields
@@ -144,7 +154,7 @@ class GeneralSettings(SettingsBaseModel):
if v not in cls.compatible_versions: if v not in cls.compatible_versions:
error = ( error = (
f"Incompatible configuration version '{v}'. " f"Incompatible configuration version '{v}'. "
f"Expected one of: {', '.join(cls.compatible_versions)}." f"Expected: {', '.join(cls.compatible_versions)}."
) )
logger.error(error) logger.error(error)
raise ValueError(error) raise ValueError(error)
@@ -158,64 +168,49 @@ class SettingsEOS(pydantic_settings.BaseSettings, PydanticModelNestedValueMixin)
""" """
general: Optional[GeneralSettings] = Field( general: Optional[GeneralSettings] = Field(
default=None, default=None, json_schema_extra={"description": "General Settings"}
description="General Settings",
) )
cache: Optional[CacheCommonSettings] = Field( cache: Optional[CacheCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Cache Settings"}
description="Cache Settings",
) )
ems: Optional[EnergyManagementCommonSettings] = Field( ems: Optional[EnergyManagementCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Energy Management Settings"}
description="Energy Management Settings",
) )
logging: Optional[LoggingCommonSettings] = Field( logging: Optional[LoggingCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Logging Settings"}
description="Logging Settings",
) )
devices: Optional[DevicesCommonSettings] = Field( devices: Optional[DevicesCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Devices Settings"}
description="Devices Settings",
) )
measurement: Optional[MeasurementCommonSettings] = Field( measurement: Optional[MeasurementCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Measurement Settings"}
description="Measurement Settings",
) )
optimization: Optional[OptimizationCommonSettings] = Field( optimization: Optional[OptimizationCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Optimization Settings"}
description="Optimization Settings",
) )
prediction: Optional[PredictionCommonSettings] = Field( prediction: Optional[PredictionCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Prediction Settings"}
description="Prediction Settings",
) )
elecprice: Optional[ElecPriceCommonSettings] = Field( elecprice: Optional[ElecPriceCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Electricity Price Settings"}
description="Electricity Price Settings",
) )
feedintariff: Optional[FeedInTariffCommonSettings] = Field( feedintariff: Optional[FeedInTariffCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Feed In Tariff Settings"}
description="Feed In Tariff Settings",
) )
load: Optional[LoadCommonSettings] = Field( load: Optional[LoadCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Load Settings"}
description="Load Settings",
) )
pvforecast: Optional[PVForecastCommonSettings] = Field( pvforecast: Optional[PVForecastCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "PV Forecast Settings"}
description="PV Forecast Settings",
) )
weather: Optional[WeatherCommonSettings] = Field( weather: Optional[WeatherCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Weather Settings"}
description="Weather Settings",
) )
server: Optional[ServerCommonSettings] = Field( server: Optional[ServerCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Server Settings"}
description="Server Settings",
) )
utils: Optional[UtilsCommonSettings] = Field( utils: Optional[UtilsCommonSettings] = Field(
default=None, default=None, json_schema_extra={"description": "Utilities Settings"}
description="Utilities Settings",
) )
model_config = pydantic_settings.SettingsConfigDict( model_config = pydantic_settings.SettingsConfigDict(
@@ -292,10 +287,10 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
Example: Example:
To initialize and access configuration attributes (only one instance is created): To initialize and access configuration attributes (only one instance is created):
```python .. code-block:: python
config_eos = ConfigEOS() # Always returns the same instance
print(config_eos.prediction.hours) # Access a setting from the loaded configuration config_eos = ConfigEOS() # Always returns the same instance
``` print(config_eos.prediction.hours) # Access a setting from the loaded configuration
""" """
@@ -340,32 +335,44 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
file_secret_settings (pydantic_settings.PydanticBaseSettingsSource): Unused (needed for parent class interface). file_secret_settings (pydantic_settings.PydanticBaseSettingsSource): Unused (needed for parent class interface).
Returns: Returns:
tuple[pydantic_settings.PydanticBaseSettingsSource, ...]: A tuple of settings sources in the order they should be applied. tuple[pydantic_settings.PydanticBaseSettingsSource, ...]: A tuple of settings sources in the order they should be applied.
Behavior: Behavior:
1. Checks for the existence of a JSON configuration file in the expected location. 1. Checks for the existence of a JSON configuration file in the expected location.
2. If the configuration file does not exist, creates the directory (if needed) and attempts to copy a 2. If the configuration file does not exist, creates the directory (if needed) and
default configuration file to the location. If the copy fails, uses the default configuration file directly. attempts to create a default configuration file in the location. If the creation
3. Creates a `pydantic_settings.JsonConfigSettingsSource` for both the configuration file and the default configuration file. fails, a temporary configuration directory is used.
3. Creates a `pydantic_settings.JsonConfigSettingsSource` for the configuration
file.
4. Updates class attributes `GeneralSettings._config_folder_path` and 4. Updates class attributes `GeneralSettings._config_folder_path` and
`GeneralSettings._config_file_path` to reflect the determined paths. `GeneralSettings._config_file_path` to reflect the determined paths.
5. Returns a tuple containing all provided and newly created settings sources in the desired order. 5. Returns a tuple containing all provided and newly created settings sources in
the desired order.
Notes: Notes:
- This method logs a warning if the default configuration file cannot be copied. - This method logs an error if the default configuration file in the normal
- It ensures that a fallback to the default configuration file is always possible. configuration directory cannot be created.
- It ensures that a fallback to a default configuration file is always possible.
""" """
# Ensure we know and have the config folder path and the config file # Ensure we know and have the config folder path and the config file
config_file, exists = cls._get_config_file_path() config_file, exists = cls._get_config_file_path()
config_dir = config_file.parent config_dir = config_file.parent
if not exists: if not exists:
config_dir.mkdir(parents=True, exist_ok=True) config_dir.mkdir(parents=True, exist_ok=True)
# Create minimum config file
config_minimum_content = '{ "general": { "version": "' + __version__ + '" } }'
try: try:
shutil.copy2(cls.config_default_file_path, config_file) config_file.write_text(config_minimum_content, encoding="utf-8")
except Exception as exc: except Exception as exc:
logger.warning(f"Could not copy default config: {exc}. Using default config...") # Create minimum config in temporary config directory as last resort
config_file = cls.config_default_file_path error_msg = f"Could not create minimum config file in {config_dir}: {exc}"
config_dir = config_file.parent logger.error(error_msg)
temp_dir = Path(tempfile.mkdtemp())
info_msg = f"Using temporary config directory {temp_dir}"
logger.info(info_msg)
config_dir = temp_dir
config_file = temp_dir / config_file.name
config_file.write_text(config_minimum_content, encoding="utf-8")
# Remember config_dir and config file # Remember config_dir and config file
GeneralSettings._config_folder_path = config_dir GeneralSettings._config_folder_path = config_dir
GeneralSettings._config_file_path = config_file GeneralSettings._config_file_path = config_file
@@ -392,19 +399,8 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
f"Error reading config file '{config_file}' (falling back to default config): {ex}" f"Error reading config file '{config_file}' (falling back to default config): {ex}"
) )
# Append default settings to sources
default_settings = pydantic_settings.JsonConfigSettingsSource(
settings_cls, json_file=cls.config_default_file_path
)
setting_sources.append(default_settings)
return tuple(setting_sources) return tuple(setting_sources)
@classproperty
def config_default_file_path(cls) -> Path:
"""Compute the default config file path."""
return cls.package_root_path.joinpath("data/default.config.json")
@classproperty @classproperty
def package_root_path(cls) -> Path: def package_root_path(cls) -> Path:
"""Compute the package root path.""" """Compute the package root path."""
@@ -466,9 +462,12 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
ValidationError: If the data contains invalid values for the defined fields. ValidationError: If the data contains invalid values for the defined fields.
Example: Example:
>>> config = get_config() .. code-block:: python
>>> new_data = {"prediction": {"hours": 24}, "server": {"port": 8000}}
>>> config.merge_settings_from_dict(new_data) config = get_config()
new_data = {"prediction": {"hours": 24}, "server": {"port": 8000}}
config.merge_settings_from_dict(new_data)
""" """
self._setup(**merge_models(self, data)) self._setup(**merge_models(self, data))
@@ -523,8 +522,7 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
The returned dictionary uses `backup_id` (suffix) as keys. The value for The returned dictionary uses `backup_id` (suffix) as keys. The value for
each key is a dictionary including: each key is a dictionary including:
- ``storage_time``: The file modification timestamp in ISO-8601 format. - ``storage_time``: The file modification timestamp in ISO-8601 format.
- ``version``: Version information found in the backup file - ``version``: Version information found in the backup file (defaults to ``"unknown"``).
(defaults to ``"unknown"``).
Returns: Returns:
dict[str, dict[str, Any]]: Mapping of backup identifiers to metadata. dict[str, dict[str, Any]]: Mapping of backup identifiers to metadata.
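
A minimal sketch of the `json_schema_extra` convention adopted in this diff: field metadata such as the description moves into `json_schema_extra` and is read back from `model_fields`. The `Demo` model is illustrative, and the `field_description()` helper referenced in the diff is assumed to do something similar:

```python
from typing import Optional

from pydantic import BaseModel, Field


class Demo(BaseModel):
    """Illustrative model; not part of EOS."""

    latitude: Optional[float] = Field(
        default=52.52,
        ge=-90.0,
        le=90.0,
        json_schema_extra={"description": "Latitude in decimal degrees (°)"},
    )


info = Demo.model_fields["latitude"]
extra = info.json_schema_extra or {}
print(extra.get("description"))  # Latitude in decimal degrees (°)
print(info.default)              # 52.52
```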

View File

@@ -21,11 +21,14 @@ if TYPE_CHECKING:
# - tuple[str, Callable[[Any], Any]] (new path + transform) # - tuple[str, Callable[[Any], Any]] (new path + transform)
# - None (drop) # - None (drop)
MIGRATION_MAP: Dict[str, Union[str, Tuple[str, Callable[[Any], Any]], None]] = { MIGRATION_MAP: Dict[str, Union[str, Tuple[str, Callable[[Any], Any]], None]] = {
# 0.1.0 -> 0.2.0 # 0.2.0 -> 0.2.0+dev
"elecprice/provider_settings/ElecPriceImport/import_file_path": "elecprice/elecpriceimport/import_file_path",
"elecprice/provider_settings/ElecPriceImport/import_json": "elecprice/elecpriceimport/import_json",
# 0.1.0 -> 0.2.0+dev
"devices/batteries/0/initial_soc_percentage": None, "devices/batteries/0/initial_soc_percentage": None,
"devices/electric_vehicles/0/initial_soc_percentage": None, "devices/electric_vehicles/0/initial_soc_percentage": None,
"elecprice/provider_settings/import_file_path": "elecprice/provider_settings/ElecPriceImport/import_file_path", "elecprice/provider_settings/import_file_path": "elecprice/elecpriceimport/import_file_path",
"elecprice/provider_settings/import_json": "elecprice/provider_settings/ElecPriceImport/import_json", "elecprice/provider_settings/import_json": "elecprice/elecpriceimport/import_json",
"load/provider_settings/import_file_path": "load/provider_settings/LoadImport/import_file_path", "load/provider_settings/import_file_path": "load/provider_settings/LoadImport/import_file_path",
"load/provider_settings/import_json": "load/provider_settings/LoadImport/import_json", "load/provider_settings/import_json": "load/provider_settings/LoadImport/import_json",
"load/provider_settings/loadakkudoktor_year_energy": "load/provider_settings/LoadAkkudoktor/loadakkudoktor_year_energy_kwh", "load/provider_settings/loadakkudoktor_year_energy": "load/provider_settings/LoadAkkudoktor/loadakkudoktor_year_energy_kwh",

View File

@@ -90,7 +90,10 @@ class CacheEnergyManagementStore(SingletonMixin):
the application lifecycle. the application lifecycle.
Example: Example:
>>> cache = CacheEnergyManagementStore() .. code-block:: python
cache = CacheEnergyManagementStore()
""" """
if hasattr(self, "_initialized"): if hasattr(self, "_initialized"):
return return
@@ -112,7 +115,10 @@ class CacheEnergyManagementStore(SingletonMixin):
AttributeError: If the cache object does not have the requested method. AttributeError: If the cache object does not have the requested method.
Example: Example:
>>> result = cache.get("key") .. code-block:: python
result = cache.get("key")
""" """
# This will return a method of the target cache, or raise an AttributeError # This will return a method of the target cache, or raise an AttributeError
target_attr = getattr(self.cache, name) target_attr = getattr(self.cache, name)
@@ -134,7 +140,10 @@ class CacheEnergyManagementStore(SingletonMixin):
KeyError: If the key does not exist in the cache. KeyError: If the key does not exist in the cache.
Example: Example:
>>> value = cache["user_data"] .. code-block:: python
value = cache["user_data"]
""" """
return CacheEnergyManagementStore.cache[key] return CacheEnergyManagementStore.cache[key]
@@ -146,7 +155,10 @@ class CacheEnergyManagementStore(SingletonMixin):
value (Any): The value to store. value (Any): The value to store.
Example: Example:
>>> cache["user_data"] = {"name": "Alice", "age": 30} .. code-block:: python
cache["user_data"] = {"name": "Alice", "age": 30}
""" """
CacheEnergyManagementStore.cache[key] = value CacheEnergyManagementStore.cache[key] = value
@@ -166,7 +178,10 @@ class CacheEnergyManagementStore(SingletonMixin):
management system run). management system run).
Example: Example:
>>> cache.clear() .. code-block:: python
cache.clear()
""" """
if hasattr(self.cache, "clear") and callable(getattr(self.cache, "clear")): if hasattr(self.cache, "clear") and callable(getattr(self.cache, "clear")):
CacheEnergyManagementStore.cache.clear() CacheEnergyManagementStore.cache.clear()
@@ -179,64 +194,35 @@ class CacheEnergyManagementStore(SingletonMixin):
raise AttributeError(f"'{self.cache.__class__.__name__}' object has no method 'clear'") raise AttributeError(f"'{self.cache.__class__.__name__}' object has no method 'clear'")
def cachemethod_energy_management(method: TCallable) -> TCallable: def cache_energy_management(callable: TCallable) -> TCallable:
"""Decorator for in memory caching the result of an instance method. """Decorator for in memory caching the result of a callable.
This decorator caches the method's result in `CacheEnergyManagementStore`, ensuring This decorator caches the method or function's result in `CacheEnergyManagementStore`,
that subsequent calls with the same arguments return the cached result until the ensuring that subsequent calls with the same arguments return the cached result until the
next energy management start. next energy management start.
Args: Args:
method (Callable): The instance method to be decorated. callable (Callable): The function or method to be decorated.
Returns:
Callable: The wrapped method with caching functionality.
Example:
>>> class MyClass:
>>> @cachemethod_energy_management
>>> def expensive_method(self, param: str) -> str:
>>> # Perform expensive computation
>>> return f"Computed {param}"
"""
@cachebox.cachedmethod(
cache=CacheEnergyManagementStore().cache, callback=cache_energy_management_store_callback
)
@functools.wraps(method)
def wrapper(self: Any, *args: Any, **kwargs: Any) -> Any:
result = method(self, *args, **kwargs)
return result
return wrapper
def cache_energy_management(func: TCallable) -> TCallable:
"""Decorator for in memory caching the result of a standalone function.
This decorator caches the function's result in `CacheEnergyManagementStore`, ensuring
that subsequent calls with the same arguments return the cached result until the
next energy management start.
Args:
func (Callable): The function to be decorated.
Returns: Returns:
Callable: The wrapped function with caching functionality. Callable: The wrapped function with caching functionality.
Example: Example:
>>> @cache_until_next_update .. code-block:: python
>>> def expensive_function(param: str) -> str:
>>> # Perform expensive computation @cache_energy_management
>>> return f"Computed {param}" def expensive_function(param: str) -> str:
# Perform expensive computation
return f"Computed {param}"
""" """
@cachebox.cached( @cachebox.cached(
cache=CacheEnergyManagementStore().cache, callback=cache_energy_management_store_callback cache=CacheEnergyManagementStore().cache, callback=cache_energy_management_store_callback
) )
@functools.wraps(func) @functools.wraps(callable)
def wrapper(*args: Any, **kwargs: Any) -> Any: def wrapper(*args: Any, **kwargs: Any) -> Any:
result = func(*args, **kwargs) result = callable(*args, **kwargs)
return result return result
return wrapper return wrapper
@@ -251,10 +237,14 @@ RetType = TypeVar("RetType")
class CacheFileRecord(PydanticBaseModel): class CacheFileRecord(PydanticBaseModel):
cache_file: Any = Field(..., description="File descriptor of the cache file.") cache_file: Any = Field(
until_datetime: DateTime = Field(..., description="Datetime until the cache file is valid.") ..., json_schema_extra={"description": "File descriptor of the cache file."}
)
until_datetime: DateTime = Field(
..., json_schema_extra={"description": "Datetime until the cache file is valid."}
)
ttl_duration: Optional[Duration] = Field( ttl_duration: Optional[Duration] = Field(
default=None, description="Duration the cache file is valid." default=None, json_schema_extra={"description": "Duration the cache file is valid."}
) )
@@ -273,12 +263,15 @@ class CacheFileStore(ConfigMixin, SingletonMixin):
with their associated keys and dates. with their associated keys and dates.
Example: Example:
>>> cache_store = CacheFileStore() .. code-block:: python
>>> cache_store.create('example_file')
>>> cache_file = cache_store.get('example_file') cache_store = CacheFileStore()
>>> cache_file.write('Some data') cache_store.create('example_file')
>>> cache_file.seek(0) cache_file = cache_store.get('example_file')
>>> print(cache_file.read()) # Output: 'Some data' cache_file.write('Some data')
cache_file.seek(0)
print(cache_file.read()) # Output: 'Some data'
""" """
def __init__(self, *args: Any, **kwargs: Any) -> None: def __init__(self, *args: Any, **kwargs: Any) -> None:
@@ -487,10 +480,13 @@ class CacheFileStore(ConfigMixin, SingletonMixin):
file_obj: A file-like object representing the cache file. file_obj: A file-like object representing the cache file.
Example: Example:
>>> cache_file = cache_store.create('example_file', suffix='.txt') .. code-block:: python
>>> cache_file.write('Some cached data')
>>> cache_file.seek(0) cache_file = cache_store.create('example_file', suffix='.txt')
>>> print(cache_file.read()) # Output: 'Some cached data' cache_file.write('Some cached data')
cache_file.seek(0)
print(cache_file.read()) # Output: 'Some cached data'
""" """
cache_file_key, until_datetime_dt, ttl_duration = self._generate_cache_file_key( cache_file_key, until_datetime_dt, ttl_duration = self._generate_cache_file_key(
key, until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl key, until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl
@@ -539,7 +535,10 @@ class CacheFileStore(ConfigMixin, SingletonMixin):
ValueError: If the key is already in store. ValueError: If the key is already in store.
Example: Example:
>>> cache_store.set('example_file', io.BytesIO(b'Some binary data')) .. code-block:: python
cache_store.set('example_file', io.BytesIO(b'Some binary data'))
""" """
cache_file_key, until_datetime_dt, ttl_duration = self._generate_cache_file_key( cache_file_key, until_datetime_dt, ttl_duration = self._generate_cache_file_key(
key, until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl key, until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl
@@ -595,10 +594,13 @@ class CacheFileStore(ConfigMixin, SingletonMixin):
file_obj: The file-like cache object, or None if no file is found. file_obj: The file-like cache object, or None if no file is found.
Example: Example:
>>> cache_file = cache_store.get('example_file') .. code-block:: python
>>> if cache_file:
>>> cache_file.seek(0) cache_file = cache_store.get('example_file')
>>> print(cache_file.read()) # Output: Cached data (if exists) if cache_file:
cache_file.seek(0)
print(cache_file.read()) # Output: Cached data (if exists)
""" """
if until_datetime or until_date: if until_datetime or until_date:
until_datetime, _ttl_duration = self._until_datetime_by_options( until_datetime, _ttl_duration = self._until_datetime_by_options(
@@ -877,13 +879,15 @@ def cache_in_file(
A decorated function that caches its result in a temporary file. A decorated function that caches its result in a temporary file.
Example: Example:
>>> from datetime import date .. code-block:: python
>>> @cache_in_file(suffix='.txt')
>>> def expensive_computation(until_date=None): from datetime import date
>>> # Perform some expensive computation @cache_in_file(suffix='.txt')
>>> return 'Some large result' def expensive_computation(until_date=None):
>>> # Perform some expensive computation
>>> result = expensive_computation(until_date=date.today()) return 'Some large result'
result = expensive_computation(until_date=date.today())
Notes: Notes:
- The cache key is based on the function arguments after excluding those in `ignore_params`. - The cache key is based on the function arguments after excluding those in `ignore_params`.
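
A simplified stand-in for the merged `cache_energy_management` decorator (the real one above delegates to `cachebox` and `CacheEnergyManagementStore`): results are memoised in one shared store that can be cleared at the next energy-management run. All names here are illustrative:

```python
import functools
from typing import Any, Callable

_STORE: dict[Any, Any] = {}


def cache_energy_management(callable_: Callable[..., Any]) -> Callable[..., Any]:
    """Cache results of a function or method until the store is cleared."""

    @functools.wraps(callable_)
    def wrapper(*args: Any, **kwargs: Any) -> Any:
        # Arguments must be hashable for this simplified key.
        key = (callable_.__qualname__, args, tuple(sorted(kwargs.items())))
        if key not in _STORE:
            _STORE[key] = callable_(*args, **kwargs)
        return _STORE[key]

    return wrapper


@cache_energy_management
def expensive_function(param: str) -> str:
    print("computing...")
    return f"Computed {param}"


print(expensive_function("x"))  # computes
print(expensive_function("x"))  # served from the store
_STORE.clear()                  # analogous to clearing at the next EM run
print(expensive_function("x"))  # computed again
```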

View File

@@ -15,11 +15,13 @@ class CacheCommonSettings(SettingsBaseModel):
"""Cache Configuration.""" """Cache Configuration."""
subpath: Optional[Path] = Field( subpath: Optional[Path] = Field(
default="cache", description="Sub-path for the EOS cache data directory." default="cache",
json_schema_extra={"description": "Sub-path for the EOS cache data directory."},
) )
cleanup_interval: float = Field( cleanup_interval: float = Field(
default=5 * 60, description="Intervall in seconds for EOS file cache cleanup." default=5 * 60,
json_schema_extra={"description": "Intervall in seconds for EOS file cache cleanup."},
) )
# Do not make this a pydantic computed field. The pydantic model must be fully initialized # Do not make this a pydantic computed field. The pydantic model must be fully initialized

View File

@@ -39,11 +39,12 @@ class ConfigMixin:
config (ConfigEOS): Property to access the global EOS configuration. config (ConfigEOS): Property to access the global EOS configuration.
Example: Example:
```python .. code-block:: python
class MyEOSClass(ConfigMixin):
def my_method(self): class MyEOSClass(ConfigMixin):
if self.config.myconfigval: def my_method(self):
``` if self.config.myconfigval:
""" """
@classproperty @classproperty
@@ -78,12 +79,13 @@ class MeasurementMixin:
measurement (Measurement): Property to access the global EOS measurement data. measurement (Measurement): Property to access the global EOS measurement data.
Example: Example:
```python .. code-block:: python
class MyOptimizationClass(MeasurementMixin):
def analyze_mymeasurement(self): class MyOptimizationClass(MeasurementMixin):
measurement_data = self.measurement.mymeasurement def analyze_mymeasurement(self):
# Perform analysis measurement_data = self.measurement.mymeasurement
``` # Perform analysis
""" """
@classproperty @classproperty
@@ -118,12 +120,13 @@ class PredictionMixin:
prediction (Prediction): Property to access the global EOS prediction data. prediction (Prediction): Property to access the global EOS prediction data.
Example: Example:
```python .. code-block:: python
class MyOptimizationClass(PredictionMixin):
def analyze_myprediction(self): class MyOptimizationClass(PredictionMixin):
prediction_data = self.prediction.mypredictionresult def analyze_myprediction(self):
# Perform analysis prediction_data = self.prediction.mypredictionresult
``` # Perform analysis
""" """
@classproperty @classproperty
@@ -159,12 +162,13 @@ class EnergyManagementSystemMixin:
ems (EnergyManagementSystem): Property to access the global EOS energy management system. ems (EnergyManagementSystem): Property to access the global EOS energy management system.
Example: Example:
```python .. code-block:: python
class MyOptimizationClass(EnergyManagementSystemMixin):
def analyze_myprediction(self): class MyOptimizationClass(EnergyManagementSystemMixin):
ems_data = self.ems.the_ems_method() def analyze_myprediction(self):
# Perform analysis ems_data = self.ems.the_ems_method()
``` # Perform analysis
""" """
@classproperty @classproperty
@@ -224,22 +228,25 @@ class SingletonMixin:
- Avoid using `__init__` to reinitialize the singleton instance after it has been created. - Avoid using `__init__` to reinitialize the singleton instance after it has been created.
Example: Example:
class MySingletonModel(SingletonMixin, PydanticBaseModel): .. code-block:: python
name: str
# implement __init__ to avoid re-initialization of parent classes: class MySingletonModel(SingletonMixin, PydanticBaseModel):
def __init__(self, *args: Any, **kwargs: Any) -> None: name: str
if hasattr(self, "_initialized"):
return
# Your initialisation here
...
super().__init__(*args, **kwargs)
instance1 = MySingletonModel(name="Instance 1") # implement __init__ to avoid re-initialization of parent classes:
instance2 = MySingletonModel(name="Instance 2") def __init__(self, *args: Any, **kwargs: Any) -> None:
if hasattr(self, "_initialized"):
return
# Your initialisation here
...
super().__init__(*args, **kwargs)
instance1 = MySingletonModel(name="Instance 1")
instance2 = MySingletonModel(name="Instance 2")
assert instance1 is instance2 # True
print(instance1.name) # Output: "Instance 1"
assert instance1 is instance2 # True
print(instance1.name) # Output: "Instance 1"
""" """
_lock: ClassVar[threading.Lock] = threading.Lock() _lock: ClassVar[threading.Lock] = threading.Lock()
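
An illustrative, thread-safe singleton sketch consistent with the `SingletonMixin` docstring above (not the project's actual implementation), including the `_initialized` guard from the usage example:

```python
import threading
from typing import Any, ClassVar, Optional


class SingletonSketch:
    """Illustrative singleton via __new__ plus a class-level lock."""

    _lock: ClassVar[threading.Lock] = threading.Lock()
    _instance: ClassVar[Optional["SingletonSketch"]] = None

    def __new__(cls, *args: Any, **kwargs: Any) -> "SingletonSketch":
        with cls._lock:
            if cls._instance is None:
                cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self, name: str = "") -> None:
        if hasattr(self, "_initialized"):
            return  # keep the first initialisation, as in the docstring example
        self._initialized = True
        self.name = name


a = SingletonSketch(name="Instance 1")
b = SingletonSketch(name="Instance 2")
assert a is b
print(a.name)  # Instance 1
```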

View File

@@ -84,12 +84,16 @@ class DataRecord(DataBase, MutableMapping):
- Supports non-standard data types like `datetime`. - Supports non-standard data types like `datetime`.
""" """
date_time: Optional[DateTime] = Field(default=None, description="DateTime") date_time: Optional[DateTime] = Field(
default=None, json_schema_extra={"description": "DateTime"}
)
configured_data: dict[str, Any] = Field( configured_data: dict[str, Any] = Field(
default_factory=dict, default_factory=dict,
description="Configured field like data", json_schema_extra={
examples=[{"load0_mr": 40421}], "description": "Configured field like data",
"examples": [{"load0_mr": 40421}],
},
) )
# Pydantic v2 model configuration # Pydantic v2 model configuration
@@ -368,10 +372,11 @@ class DataRecord(DataBase, MutableMapping):
return None return None
# Get all descriptions from the fields # Get all descriptions from the fields
descriptions = { descriptions: dict[str, str] = {}
field_name: field_info.description for field_name in cls.model_fields.keys():
for field_name, field_info in cls.model_fields.items() desc = cls.field_description(field_name)
} if desc:
descriptions[field_name] = desc
# Use difflib to get close matches # Use difflib to get close matches
matches = difflib.get_close_matches( matches = difflib.get_close_matches(
@@ -427,25 +432,29 @@ class DataSequence(DataBase, MutableSequence):
Derived classes have to provide their own records field with correct record type set. Derived classes have to provide their own records field with correct record type set.
Usage: Usage:
# Example of creating, adding, and using DataSequence .. code-block:: python
class DerivedSequence(DataSquence):
records: List[DerivedDataRecord] = Field(default_factory=list,
description="List of data records")
seq = DerivedSequence() # Example of creating, adding, and using DataSequence
seq.insert(DerivedDataRecord(date_time=datetime.now(), temperature=72)) class DerivedSequence(DataSquence):
seq.insert(DerivedDataRecord(date_time=datetime.now(), temperature=75)) records: List[DerivedDataRecord] = Field(default_factory=list, json_schema_extra={ "description": "List of data records" })
# Convert to JSON and back seq = DerivedSequence()
json_data = seq.to_json() seq.insert(DerivedDataRecord(date_time=datetime.now(), temperature=72))
new_seq = DerivedSequence.from_json(json_data) seq.insert(DerivedDataRecord(date_time=datetime.now(), temperature=75))
# Convert to JSON and back
json_data = seq.to_json()
new_seq = DerivedSequence.from_json(json_data)
# Convert to Pandas Series
series = seq.key_to_series('temperature')
# Convert to Pandas Series
series = seq.key_to_series('temperature')
""" """
# To be overloaded by derived classes. # To be overloaded by derived classes.
records: List[DataRecord] = Field(default_factory=list, description="List of data records") records: List[DataRecord] = Field(
default_factory=list, json_schema_extra={"description": "List of data records"}
)
# Derived fields (computed) # Derived fields (computed)
@computed_field # type: ignore[prop-decorator] @computed_field # type: ignore[prop-decorator]
@@ -731,9 +740,12 @@ class DataSequence(DataBase, MutableSequence):
**kwargs: Key-value pairs as keyword arguments **kwargs: Key-value pairs as keyword arguments
Examples: Examples:
>>> update_value(date, 'temperature', 25.5) .. code-block:: python
>>> update_value(date, {'temperature': 25.5, 'humidity': 80})
>>> update_value(date, temperature=25.5, humidity=80) update_value(date, 'temperature', 25.5)
update_value(date, {'temperature': 25.5, 'humidity': 80})
update_value(date, temperature=25.5, humidity=80)
""" """
# Process input arguments into a dictionary # Process input arguments into a dictionary
values: Dict[str, Any] = {} values: Dict[str, Any] = {}
@@ -1313,7 +1325,7 @@ class DataProvider(SingletonMixin, DataSequence):
""" """
update_datetime: Optional[AwareDatetime] = Field( update_datetime: Optional[AwareDatetime] = Field(
None, description="Latest update datetime for generic data" None, json_schema_extra={"description": "Latest update datetime for generic data"}
) )
@abstractmethod @abstractmethod
@@ -1372,15 +1384,18 @@ class DataImportMixin:
"""Mixin class for import of generic data. """Mixin class for import of generic data.
This class is designed to handle generic data provided in the form of a key-value dictionary. This class is designed to handle generic data provided in the form of a key-value dictionary.
- **Keys**: Represent identifiers from the record keys of a specific data. - **Keys**: Represent identifiers from the record keys of a specific data.
- **Values**: Are lists of data values starting at a specified `start_datetime`, where - **Values**: Are lists of data values starting at a specified start_datetime, where
each value corresponds to a subsequent time interval (e.g., hourly). each value corresponds to a subsequent time interval (e.g., hourly).
Two special keys are handled. `start_datetime` may be used to defined the starting datetime of Two special keys are handled. start_datetime may be used to defined the starting datetime of
the values. `ìnterval` may be used to define the fixed time interval between two values. the values. ìnterval may be used to define the fixed time interval between two values.
On import self.update_value(datetime, key, value) is called which has to be provided.
Also self.ems_start_datetime may be necessary as a default in case start_datetime is not
given.
On import `self.update_value(datetime, key, value)` is called which has to be provided.
Also `self.ems_start_datetime` may be necessary as a default in case `start_datetime`is not given.
""" """
# Attributes required but defined elsehere. # Attributes required but defined elsehere.
@@ -1412,16 +1427,20 @@ class DataImportMixin:
Behavior: Behavior:
- Skips invalid timestamps during DST spring forward transitions. - Skips invalid timestamps during DST spring forward transitions.
- Includes both instances of repeated timestamps during DST fall back transitions. - Includes both instances of repeated timestamps during DST fall back transitions.
- Ensures the list contains exactly `value_count` entries. - Ensures the list contains exactly 'value_count' entries.
Example: Example:
>>> start_datetime = pendulum.datetime(2024, 11, 3, 0, 0, tz="America/New_York") .. code-block:: python
>>> import_datetimes(start_datetime, 5)
[(DateTime(2024, 11, 3, 0, 0, tzinfo=Timezone('America/New_York')), 0), start_datetime = pendulum.datetime(2024, 11, 3, 0, 0, tz="America/New_York")
(DateTime(2024, 11, 3, 1, 0, tzinfo=Timezone('America/New_York')), 1), import_datetimes(start_datetime, 5)
(DateTime(2024, 11, 3, 1, 0, tzinfo=Timezone('America/New_York')), 1), # Repeated hour
(DateTime(2024, 11, 3, 2, 0, tzinfo=Timezone('America/New_York')), 2), [(DateTime(2024, 11, 3, 0, 0, tzinfo=Timezone('America/New_York')), 0),
(DateTime(2024, 11, 3, 3, 0, tzinfo=Timezone('America/New_York')), 3)] (DateTime(2024, 11, 3, 1, 0, tzinfo=Timezone('America/New_York')), 1),
(DateTime(2024, 11, 3, 1, 0, tzinfo=Timezone('America/New_York')), 1), # Repeated hour
(DateTime(2024, 11, 3, 2, 0, tzinfo=Timezone('America/New_York')), 2),
(DateTime(2024, 11, 3, 3, 0, tzinfo=Timezone('America/New_York')), 3)]
""" """
timestamps_with_indices: List[Tuple[DateTime, int]] = [] timestamps_with_indices: List[Tuple[DateTime, int]] = []
@@ -1659,17 +1678,18 @@ class DataImportMixin:
JSONDecodeError: If the file content is not valid JSON. JSONDecodeError: If the file content is not valid JSON.
Example: Example:
Given a JSON string with the following content: Given a JSON string with the following content and `key_prefix = "load"`, only the
```json "loadforecast_power_w" key will be processed even though both keys are in the record.
{
"start_datetime": "2024-11-10 00:00:00" .. code-block:: json
"interval": "30 minutes"
"loadforecast_power_w": [20.5, 21.0, 22.1], {
"other_xyz: [10.5, 11.0, 12.1], "start_datetime": "2024-11-10 00:00:00",
} "interval": "30 minutes",
``` "loadforecast_power_w": [20.5, 21.0, 22.1],
and `key_prefix = "load"`, only the "loadforecast_power_w" key will be processed even though "other_xyz: [10.5, 11.0, 12.1]
both keys are in the record. }
""" """
# Try pandas dataframe with orient="split" # Try pandas dataframe with orient="split"
try: try:
@@ -1735,15 +1755,16 @@ class DataImportMixin:
JSONDecodeError: If the file content is not valid JSON. JSONDecodeError: If the file content is not valid JSON.
Example: Example:
Given a JSON file with the following content: Given a JSON file with the following content and `key_prefix = "load"`, only the
```json "loadforecast_power_w" key will be processed even though both keys are in the record.
{
"loadforecast_power_w": [20.5, 21.0, 22.1], .. code-block:: json
"other_xyz: [10.5, 11.0, 12.1],
} {
``` "loadforecast_power_w": [20.5, 21.0, 22.1],
and `key_prefix = "load"`, only the "loadforecast_power_w" key will be processed even though "other_xyz: [10.5, 11.0, 12.1],
both keys are in the record. }
""" """
with import_file_path.open("r", encoding="utf-8", newline=None) as import_file: with import_file_path.open("r", encoding="utf-8", newline=None) as import_file:
import_str = import_file.read() import_str = import_file.read()
@@ -1756,9 +1777,10 @@ class DataImportProvider(DataImportMixin, DataProvider):
"""Abstract base class for data providers that import generic data. """Abstract base class for data providers that import generic data.
This class is designed to handle generic data provided in the form of a key-value dictionary. This class is designed to handle generic data provided in the form of a key-value dictionary.
- **Keys**: Represent identifiers from the record keys of a specific data. - **Keys**: Represent identifiers from the record keys of a specific data.
- **Values**: Are lists of data values starting at a specified `start_datetime`, where - **Values**: Are lists of data values starting at a specified `start_datetime`, where
each value corresponds to a subsequent time interval (e.g., hourly). each value corresponds to a subsequent time interval (e.g., hourly).
Subclasses must implement the logic for managing generic data based on the imported records. Subclasses must implement the logic for managing generic data based on the imported records.
""" """
@@ -1780,7 +1802,7 @@ class DataContainer(SingletonMixin, DataBase, MutableMapping):
# To be overloaded by derived classes. # To be overloaded by derived classes.
providers: List[DataProvider] = Field( providers: List[DataProvider] = Field(
default_factory=list, description="List of data providers" default_factory=list, json_schema_extra={"description": "List of data providers"}
) )
@field_validator("providers", mode="after") @field_validator("providers", mode="after")

View File

@@ -12,14 +12,16 @@ class classproperty:
the class rather than any instance of the class. the class rather than any instance of the class.
Example: Example:
class MyClass: .. code-block:: python
_value = 42
@classproperty class MyClass:
def value(cls): _value = 42
return cls._value
print(MyClass.value) # Outputs: 42 @classproperty
def value(cls):
return cls._value
print(MyClass.value) # Outputs: 42
Methods: Methods:
__get__: Retrieves the value of the class property by calling the __get__: Retrieves the value of the class property by calling the

File diff suppressed because it is too large

View File

@@ -24,17 +24,23 @@ class EnergyManagementCommonSettings(SettingsBaseModel):
startup_delay: float = Field( startup_delay: float = Field(
default=5, default=5,
ge=1, ge=1,
description="Startup delay in seconds for EOS energy management runs.", json_schema_extra={
"description": "Startup delay in seconds for EOS energy management runs."
},
) )
interval: Optional[float] = Field( interval: Optional[float] = Field(
default=None, default=None,
description="Intervall in seconds between EOS energy management runs.", json_schema_extra={
examples=["300"], "description": "Intervall in seconds between EOS energy management runs.",
"examples": ["300"],
},
) )
mode: Optional[EnergyManagementMode] = Field( mode: Optional[EnergyManagementMode] = Field(
default=None, default=None,
description="Energy management mode [OPTIMIZATION | PREDICTION].", json_schema_extra={
examples=["OPTIMIZATION", "PREDICTION"], "description": "Energy management mode [OPTIMIZATION | PREDICTION].",
"examples": ["OPTIMIZATION", "PREDICTION"],
},
) )
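The recurring change in this and the following settings files is that `description` and `examples` move from `Field()` keyword arguments into `json_schema_extra`. A minimal sketch of the pattern with an illustrative model name, and of how the metadata can still be read back with plain Pydantic v2:

.. code-block:: python

    from typing import Optional

    from pydantic import BaseModel, Field

    class ExampleSettings(BaseModel):
        # Metadata now lives in json_schema_extra instead of Field(description=..., examples=...).
        interval: Optional[float] = Field(
            default=None,
            json_schema_extra={
                "description": "Interval in seconds between EOS energy management runs.",
                "examples": ["300"],
            },
        )

    # Still reachable on the field definition ...
    extra = ExampleSettings.model_fields["interval"].json_schema_extra
    print(extra["description"])

    # ... and merged into the generated JSON schema property.
    schema = ExampleSettings.model_json_schema()
    print(schema["properties"]["interval"]["examples"])  # ['300']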

View File

@@ -17,14 +17,18 @@ class LoggingCommonSettings(SettingsBaseModel):
console_level: Optional[str] = Field( console_level: Optional[str] = Field(
default=None, default=None,
description="Logging level when logging to console.", json_schema_extra={
examples=LOGGING_LEVELS, "description": "Logging level when logging to console.",
"examples": LOGGING_LEVELS,
},
) )
file_level: Optional[str] = Field( file_level: Optional[str] = Field(
default=None, default=None,
description="Logging level when logging to file.", json_schema_extra={
examples=LOGGING_LEVELS, "description": "Logging level when logging to file.",
"examples": LOGGING_LEVELS,
},
) )
@computed_field # type: ignore[prop-decorator] @computed_field # type: ignore[prop-decorator]

View File

@@ -6,10 +6,12 @@ These enhancements facilitate the use of Pydantic models in applications requiri
datetime fields and consistent data serialization. datetime fields and consistent data serialization.
Key Features: Key Features:
- Custom type adapter for `pendulum.DateTime` fields with automatic serialization to ISO 8601 strings. - Custom type adapter for `pendulum.DateTime` fields with automatic serialization to ISO 8601 strings.
- Utility methods for converting models to and from dictionaries and JSON strings. - Utility methods for converting models to and from dictionaries and JSON strings.
- Validation tools for maintaining data consistency, including specialized support for - Validation tools for maintaining data consistency, including specialized support for
pandas DataFrames and Series with datetime indexes. pandas DataFrames and Series with datetime indexes.
""" """
import inspect import inspect
@@ -43,6 +45,7 @@ from pydantic import (
ValidationInfo, ValidationInfo,
field_validator, field_validator,
) )
from pydantic.fields import ComputedFieldInfo, FieldInfo
from akkudoktoreos.utils.datetimeutil import DateTime, to_datetime, to_duration from akkudoktoreos.utils.datetimeutil import DateTime, to_datetime, to_duration
@@ -156,16 +159,19 @@ class PydanticModelNestedValueMixin:
or an invalid transition is made (such as an attribute on a non-model). or an invalid transition is made (such as an attribute on a non-model).
Example: Example:
class Address(PydanticBaseModel): .. code-block:: python
city: str
class User(PydanticBaseModel): class Address(PydanticBaseModel):
name: str city: str
address: Address
class User(PydanticBaseModel):
name: str
address: Address
user = User(name="Alice", address=Address(city="NY"))
user._validate_path_structure("address/city") # OK
user._validate_path_structure("address/zipcode") # Raises ValueError
user = User(name="Alice", address=Address(city="NY"))
user._validate_path_structure("address/city") # OK
user._validate_path_structure("address/zipcode") # Raises ValueError
""" """
path_elements = path.strip("/").split("/") path_elements = path.strip("/").split("/")
# The model we are currently working on # The model we are currently working on
@@ -263,18 +269,19 @@ class PydanticModelNestedValueMixin:
IndexError: If a list index is out of bounds or invalid. IndexError: If a list index is out of bounds or invalid.
Example: Example:
```python .. code-block:: python
class Address(PydanticBaseModel):
city: str
class User(PydanticBaseModel): class Address(PydanticBaseModel):
name: str city: str
address: Address
class User(PydanticBaseModel):
name: str
address: Address
user = User(name="Alice", address=Address(city="New York"))
city = user.get_nested_value("address/city")
print(city) # Output: "New York"
user = User(name="Alice", address=Address(city="New York"))
city = user.get_nested_value("address/city")
print(city) # Output: "New York"
```
""" """
path_elements = path.strip("/").split("/") path_elements = path.strip("/").split("/")
model: Any = self model: Any = self
@@ -317,22 +324,23 @@ class PydanticModelNestedValueMixin:
TypeError: If a missing field cannot be initialized. TypeError: If a missing field cannot be initialized.
Example: Example:
```python .. code-block:: python
class Address(PydanticBaseModel):
city: Optional[str]
class User(PydanticBaseModel): class Address(PydanticBaseModel):
name: str city: Optional[str]
address: Optional[Address]
settings: Optional[Dict[str, Any]]
user = User(name="Alice", address=None, settings=None) class User(PydanticBaseModel):
user.set_nested_value("address/city", "Los Angeles") name: str
user.set_nested_value("settings/theme", "dark") address: Optional[Address]
settings: Optional[Dict[str, Any]]
user = User(name="Alice", address=None, settings=None)
user.set_nested_value("address/city", "Los Angeles")
user.set_nested_value("settings/theme", "dark")
print(user.address.city) # Output: "Los Angeles"
print(user.settings) # Output: {'theme': 'dark'}
print(user.address.city) # Output: "Los Angeles"
print(user.settings) # Output: {'theme': 'dark'}
```
""" """
path = path.strip("/") path = path.strip("/")
# Store old value (if possible) # Store old value (if possible)
@@ -720,6 +728,149 @@ class PydanticBaseModel(PydanticModelNestedValueMixin, BaseModel):
data = json.loads(json_str) data = json.loads(json_str)
return cls.model_validate(data) return cls.model_validate(data)
@classmethod
def _field_extra_dict(
cls,
model_field: Union[FieldInfo, ComputedFieldInfo],
) -> Dict[str, Any]:
"""Return the ``json_schema_extra`` dictionary for a given model field.
This method provides a safe and unified way to access the
``json_schema_extra`` metadata associated with a Pydantic field
definition. It supports both standard fields defined via
``Field(...)`` and computed fields, and gracefully handles
cases where ``json_schema_extra`` is not present.
Args:
model_field (Union[FieldInfo, ComputedFieldInfo]):
The Pydantic field object from which to extract
``json_schema_extra`` metadata. This can be obtained
from ``model.model_fields[field_name]`` or
``model.model_computed_fields[field_name]``.
Returns:
Dict[str, Any]:
A dictionary containing the field's ``json_schema_extra``
metadata. If no metadata is available, an empty dictionary
is returned.
Raises:
None:
This method does not raise. Missing metadata is handled
gracefully by returning an empty dictionary.
Examples:
.. code-block:: python
class User(Base):
name: str = Field(
json_schema_extra={"description": "User name"}
)
field = User.model_fields["name"]
User._field_extra_dict(field)
{'description': 'User name'}
missing = User.model_fields.get("unknown", None)
User._field_extra_dict(missing) if missing else {}
{}
"""
if model_field is None:
return {}
# Pydantic v2 primary location
extra = getattr(model_field, "json_schema_extra", None)
if isinstance(extra, dict):
return extra
# Pydantic v1 compatibility fallback
fi = getattr(model_field, "field_info", None)
if fi is not None:
extra = getattr(fi, "json_schema_extra", None)
if isinstance(extra, dict):
return extra
return {}
@classmethod
def field_description(cls, field_name: str) -> Optional[str]:
"""Return the description metadata of a model field, if available.
This method retrieves the `Field` specification from the model's
`model_fields` registry and extracts its description from the field's
`json_schema_extra` / `extra` metadata (as provided by
`_field_extra_dict`). If the field does not exist or no description is
present, ``None`` is returned.
Args:
field_name (str):
Name of the field whose description should be returned.
Returns:
Optional[str]:
The textual description if present, otherwise ``None``.
"""
field = cls.model_fields.get(field_name)
if not field:
return None
extra = cls._field_extra_dict(field)
if "description" in extra:
return str(extra["description"])
return None
@classmethod
def field_deprecated(cls, field_name: str) -> Optional[str]:
"""Return the deprecated metadata of a model field, if available.
This method retrieves the `Field` specification from the model's
`model_fields` registry and extracts its deprecation info from the field's
`json_schema_extra` / `extra` metadata (as provided by
`_field_extra_dict`). If the field does not exist or no deprecation info is
present, ``None`` is returned.
Args:
field_name (str):
Name of the field whose deprecated info should be returned.
Returns:
Optional[str]:
The textual deprecated info if present, otherwise ``None``.
"""
field = cls.model_fields.get(field_name)
if not field:
return None
extra = cls._field_extra_dict(field)
if "deprecated" in extra:
return str(extra["deprecated"])
return None
@classmethod
def field_examples(cls, field_name: str) -> Optional[list[Any]]:
"""Return the examples metadata of a model field, if available.
This method retrieves the `Field` specification from the model's
`model_fields` registry and extracts its examples from the field's
`json_schema_extra` / `extra` metadata (as provided by
`_field_extra_dict`). If the field does not exist or no examples are
present, ``None`` is returned.
Args:
field_name (str):
Name of the field whose examples should be returned.
Returns:
Optional[list[Any]]:
The examples if present, otherwise ``None``.
"""
field = cls.model_fields.get(field_name)
if not field:
return None
extra = cls._field_extra_dict(field)
if "examples" in extra:
return extra["examples"]
return None
class PydanticDateTimeData(RootModel): class PydanticDateTimeData(RootModel):
"""Pydantic model for time series data with consistent value lengths. """Pydantic model for time series data with consistent value lengths.
@@ -732,12 +883,15 @@ class PydanticDateTimeData(RootModel):
- All value lists must have the same length - All value lists must have the same length
Example: Example:
{ .. code-block:: python
"start_datetime": "2024-01-01 00:00:00", # optional
"interval": "1 Hour", # optional {
"loadforecast_power_w": [20.5, 21.0, 22.1], "start_datetime": "2024-01-01 00:00:00", # optional
"load_min": [18.5, 19.0, 20.1] "interval": "1 Hour", # optional
} "loadforecast_power_w": [20.5, 21.0, 22.1],
"load_min": [18.5, 19.0, 20.1]
}
""" """
root: Dict[str, Union[str, List[Union[float, int, str, None]]]] root: Dict[str, Union[str, List[Union[float, int, str, None]]]]
@@ -795,9 +949,12 @@ class PydanticDateTimeDataFrame(PydanticBaseModel):
data: Dict[str, Dict[str, Any]] data: Dict[str, Dict[str, Any]]
dtypes: Dict[str, str] = Field(default_factory=dict) dtypes: Dict[str, str] = Field(default_factory=dict)
tz: Optional[str] = Field(default=None, description="Timezone for datetime values") tz: Optional[str] = Field(
default=None, json_schema_extra={"description": "Timezone for datetime values"}
)
datetime_columns: list[str] = Field( datetime_columns: list[str] = Field(
default_factory=lambda: ["date_time"], description="Columns to be treated as datetime" default_factory=lambda: ["date_time"],
json_schema_extra={"description": "Columns to be treated as datetime"},
) )
@field_validator("tz") @field_validator("tz")
@@ -1131,9 +1288,12 @@ class PydanticDateTimeSeries(PydanticBaseModel):
ValueError: If series index is not datetime type. ValueError: If series index is not datetime type.
Example: Example:
>>> dates = pd.date_range('2024-01-01', periods=3) .. code-block:: python
>>> s = pd.Series([1.1, 2.2, 3.3], index=dates)
>>> model = PydanticDateTimeSeries.from_series(s) dates = pd.date_range('2024-01-01', periods=3)
s = pd.Series([1.1, 2.2, 3.3], index=dates)
model = PydanticDateTimeSeries.from_series(s)
""" """
index = pd.Index([to_datetime(dt, as_string=True, in_timezone=tz) for dt in series.index]) index = pd.Index([to_datetime(dt, as_string=True, in_timezone=tz) for dt in series.index])
series.index = index series.index = index
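The new `_field_extra_dict`, `field_description`, `field_deprecated` and `field_examples` classmethods give one access path to this metadata. A small usage sketch: the model and field names are made up, and the import path for `PydanticBaseModel` is an assumption based on this diff.

.. code-block:: python

    from pydantic import Field

    from akkudoktoreos.core.pydantic import PydanticBaseModel  # assumed module path

    class Sensor(PydanticBaseModel):
        # Illustrative field; metadata is carried entirely in json_schema_extra.
        power_w: float = Field(
            default=0.0,
            json_schema_extra={
                "description": "Measured power [W].",
                "examples": [1500.0],
                "deprecated": "Use power_kw instead.",
            },
        )

    print(Sensor.field_description("power_w"))   # Measured power [W].
    print(Sensor.field_examples("power_w"))      # [1500.0]
    print(Sensor.field_deprecated("power_w"))    # Use power_kw instead.
    print(Sensor.field_description("missing"))   # None (unknown fields are handled gracefully)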

View File

@@ -1,5 +1,156 @@
"""Version information for akkudoktoreos.""" """Version information for akkudoktoreos."""
import hashlib
import re
from fnmatch import fnmatch
from pathlib import Path
from typing import Optional
# For development add `+dev` to previous release # For development add `+dev` to previous release
# For release omit `+dev`. # For release omit `+dev`.
__version__ = "0.2.0" VERSION_BASE = "0.2.0+dev"
# Project hash of relevant files
HASH_EOS = ""
# ------------------------------
# Helpers for version generation
# ------------------------------
def is_excluded_dir(path: Path, excluded_dir_patterns: set[str]) -> bool:
"""Check whether a directory should be excluded based on name patterns."""
return any(fnmatch(path.name, pattern) for pattern in excluded_dir_patterns)
def hash_tree(
paths: list[Path],
allowed_suffixes: set[str],
excluded_dir_patterns: set[str],
excluded_files: Optional[set[Path]] = None,
) -> str:
"""Return SHA256 hash for files under `paths`.
Only files with an allowed suffix are hashed; excluded directory patterns and excluded_files are skipped.
"""
h = hashlib.sha256()
excluded_files = excluded_files or set()
for root in paths:
if not root.exists():
raise ValueError(f"Root path does not exist: {root}")
for p in sorted(root.rglob("*")):
# Skip excluded directories
if p.is_dir() and is_excluded_dir(p, excluded_dir_patterns):
continue
# Skip files inside excluded directories
if any(is_excluded_dir(parent, excluded_dir_patterns) for parent in p.parents):
continue
# Skip excluded files
if p.resolve() in excluded_files:
continue
# Hash only allowed file types
if p.is_file() and p.suffix.lower() in allowed_suffixes:
h.update(p.read_bytes())
digest = h.hexdigest()
return digest
def _version_hash() -> str:
"""Calculate project hash.
Only package files in src/akkudoktoreos are hashed so that this also works for installed packages.
"""
DIR_PACKAGE_ROOT = Path(__file__).resolve().parent.parent
# Allowed file suffixes to consider
ALLOWED_SUFFIXES: set[str] = {".py", ".md", ".json"}
# Directory patterns to exclude (glob-like)
EXCLUDED_DIR_PATTERNS: set[str] = {"*_autosum", "*__pycache__", "*_generated"}
# Files to exclude
EXCLUDED_FILES: set[Path] = set()
# Directories whose changes shall be part of the project hash
watched_paths = [DIR_PACKAGE_ROOT]
hash_current = hash_tree(
watched_paths, ALLOWED_SUFFIXES, EXCLUDED_DIR_PATTERNS, excluded_files=EXCLUDED_FILES
)
return hash_current
def _version_calculate() -> str:
"""Compute version."""
global HASH_EOS
HASH_EOS = _version_hash()
if VERSION_BASE.endswith("+dev"):
return f"{VERSION_BASE}.{HASH_EOS[:6]}"
else:
return VERSION_BASE
# ---------------------------
# Project version information
# ----------------------------
# The version
__version__ = _version_calculate()
# -------------------
# Version info access
# -------------------
# Regular expression to split the version string into pieces
VERSION_RE = re.compile(
r"""
^(?P<base>\d+\.\d+\.\d+) # x.y.z
(?:\+ # +dev.hash starts here
(?:
(?P<dev>dev) # literal 'dev'
(?:\.(?P<hash>[A-Za-z0-9]+))? # optional .hash
)
)?
$
""",
re.VERBOSE,
)
def version() -> dict[str, Optional[str]]:
"""Parses the version string.
The version string shall be of the form:
x.y.z
x.y.z+dev
x.y.z+dev.HASH
Returns:
.. code-block:: python
{
"version": "0.2.0+dev.a96a65",
"base": "x.y.z",
"dev": "dev" or None,
"hash": "<hash>" or None,
}
"""
global __version__
match = VERSION_RE.match(__version__)
if not match:
raise ValueError(f"Invalid version format: {version}")
info = match.groupdict()
info["version"] = __version__
return info
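The new dev-version scheme appends a truncated project hash to the base version, and `VERSION_RE` splits it back apart. A short round-trip sketch; the regex is copied from the diff and the hash value is the illustrative one from the docstring:

.. code-block:: python

    import re

    VERSION_RE = re.compile(
        r"""
        ^(?P<base>\d+\.\d+\.\d+)               # x.y.z
        (?:\+                                  # +dev.hash starts here
            (?:
                (?P<dev>dev)                   # literal 'dev'
                (?:\.(?P<hash>[A-Za-z0-9]+))?  # optional .hash
            )
        )?
        $
        """,
        re.VERBOSE,
    )

    version_base = "0.2.0+dev"
    hash_eos = "a96a65"  # illustrative short digest
    version_str = f"{version_base}.{hash_eos}" if version_base.endswith("+dev") else version_base

    match = VERSION_RE.match(version_str)
    assert match is not None
    print(version_str)        # 0.2.0+dev.a96a65
    print(match.groupdict())  # {'base': '0.2.0', 'dev': 'dev', 'hash': 'a96a65'}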

View File

@@ -1,5 +0,0 @@
{
"general": {
"version": "0.2.0"
}
}

View File

@@ -25,74 +25,81 @@ class BatteriesCommonSettings(DevicesBaseSettings):
"""Battery devices base settings.""" """Battery devices base settings."""
capacity_wh: int = Field( capacity_wh: int = Field(
default=8000, default=8000, gt=0, json_schema_extra={"description": "Capacity [Wh].", "examples": [8000]}
gt=0,
description="Capacity [Wh].",
examples=[8000],
) )
charging_efficiency: float = Field( charging_efficiency: float = Field(
default=0.88, default=0.88,
gt=0, gt=0,
le=1, le=1,
description="Charging efficiency [0.01 ... 1.00].", json_schema_extra={
examples=[0.88], "description": "Charging efficiency [0.01 ... 1.00].",
"examples": [0.88],
},
) )
discharging_efficiency: float = Field( discharging_efficiency: float = Field(
default=0.88, default=0.88,
gt=0, gt=0,
le=1, le=1,
description="Discharge efficiency [0.01 ... 1.00].", json_schema_extra={
examples=[0.88], "description": "Discharge efficiency [0.01 ... 1.00].",
"examples": [0.88],
},
) )
levelized_cost_of_storage_kwh: float = Field( levelized_cost_of_storage_kwh: float = Field(
default=0.0, default=0.0,
description="Levelized cost of storage (LCOS), the average lifetime cost of delivering one kWh [€/kWh].", json_schema_extra={
examples=[0.12], "description": "Levelized cost of storage (LCOS), the average lifetime cost of delivering one kWh [€/kWh].",
"examples": [0.12],
},
) )
max_charge_power_w: Optional[float] = Field( max_charge_power_w: Optional[float] = Field(
default=5000, default=5000,
gt=0, gt=0,
description="Maximum charging power [W].", json_schema_extra={"description": "Maximum charging power [W].", "examples": [5000]},
examples=[5000],
) )
min_charge_power_w: Optional[float] = Field( min_charge_power_w: Optional[float] = Field(
default=50, default=50,
gt=0, gt=0,
description="Minimum charging power [W].", json_schema_extra={"description": "Minimum charging power [W].", "examples": [50]},
examples=[50],
) )
charge_rates: Optional[NDArray[Shape["*"], float]] = Field( charge_rates: Optional[NDArray[Shape["*"], float]] = Field(
default=BATTERY_DEFAULT_CHARGE_RATES, default=BATTERY_DEFAULT_CHARGE_RATES,
description=( json_schema_extra={
"Charge rates as factor of maximum charging power [0.00 ... 1.00]. " "description": (
"None triggers fallback to default charge-rates." "Charge rates as factor of maximum charging power [0.00 ... 1.00]. "
), "None triggers fallback to default charge-rates."
examples=[[0.0, 0.25, 0.5, 0.75, 1.0], None], ),
"examples": [[0.0, 0.25, 0.5, 0.75, 1.0], None],
},
) )
min_soc_percentage: int = Field( min_soc_percentage: int = Field(
default=0, default=0,
ge=0, ge=0,
le=100, le=100,
description=( json_schema_extra={
"Minimum state of charge (SOC) as percentage of capacity [%]. " "description": (
"This is the target SoC for charging" "Minimum state of charge (SOC) as percentage of capacity [%]. "
), "This is the target SoC for charging"
examples=[10], ),
"examples": [10],
},
) )
max_soc_percentage: int = Field( max_soc_percentage: int = Field(
default=100, default=100,
ge=0, ge=0,
le=100, le=100,
description="Maximum state of charge (SOC) as percentage of capacity [%].", json_schema_extra={
examples=[100], "description": "Maximum state of charge (SOC) as percentage of capacity [%].",
"examples": [100],
},
) )
@field_validator("charge_rates", mode="before") @field_validator("charge_rates", mode="before")
@@ -178,14 +185,15 @@ class InverterCommonSettings(DevicesBaseSettings):
max_power_w: Optional[float] = Field( max_power_w: Optional[float] = Field(
default=None, default=None,
gt=0, gt=0,
description="Maximum power [W].", json_schema_extra={"description": "Maximum power [W].", "examples": [10000]},
examples=[10000],
) )
battery_id: Optional[str] = Field( battery_id: Optional[str] = Field(
default=None, default=None,
description="ID of battery controlled by this inverter.", json_schema_extra={
examples=[None, "battery1"], "description": "ID of battery controlled by this inverter.",
"examples": [None, "battery1"],
},
) )
@computed_field # type: ignore[prop-decorator] @computed_field # type: ignore[prop-decorator]
@@ -200,28 +208,27 @@ class HomeApplianceCommonSettings(DevicesBaseSettings):
"""Home Appliance devices base settings.""" """Home Appliance devices base settings."""
consumption_wh: int = Field( consumption_wh: int = Field(
gt=0, gt=0, json_schema_extra={"description": "Energy consumption [Wh].", "examples": [2000]}
description="Energy consumption [Wh].",
examples=[2000],
) )
duration_h: int = Field( duration_h: int = Field(
gt=0, gt=0,
le=24, le=24,
description="Usage duration in hours [0 ... 24].", json_schema_extra={"description": "Usage duration in hours [0 ... 24].", "examples": [1]},
examples=[1],
) )
time_windows: Optional[TimeWindowSequence] = Field( time_windows: Optional[TimeWindowSequence] = Field(
default=None, default=None,
description="Sequence of allowed time windows. Defaults to optimization general time window.", json_schema_extra={
examples=[ "description": "Sequence of allowed time windows. Defaults to optimization general time window.",
{ "examples": [
"windows": [ {
{"start_time": "10:00", "duration": "2 hours"}, "windows": [
], {"start_time": "10:00", "duration": "2 hours"},
}, ],
], },
],
},
) )
@computed_field # type: ignore[prop-decorator] @computed_field # type: ignore[prop-decorator]
@@ -237,50 +244,62 @@ class DevicesCommonSettings(SettingsBaseModel):
batteries: Optional[list[BatteriesCommonSettings]] = Field( batteries: Optional[list[BatteriesCommonSettings]] = Field(
default=None, default=None,
description="List of battery devices", json_schema_extra={
examples=[[{"device_id": "battery1", "capacity_wh": 8000}]], "description": "List of battery devices",
"examples": [[{"device_id": "battery1", "capacity_wh": 8000}]],
},
) )
max_batteries: Optional[int] = Field( max_batteries: Optional[int] = Field(
default=None, default=None,
ge=0, ge=0,
description="Maximum number of batteries that can be set", json_schema_extra={
examples=[1, 2], "description": "Maximum number of batteries that can be set",
"examples": [1, 2],
},
) )
electric_vehicles: Optional[list[BatteriesCommonSettings]] = Field( electric_vehicles: Optional[list[BatteriesCommonSettings]] = Field(
default=None, default=None,
description="List of electric vehicle devices", json_schema_extra={
examples=[[{"device_id": "battery1", "capacity_wh": 8000}]], "description": "List of electric vehicle devices",
"examples": [[{"device_id": "battery1", "capacity_wh": 8000}]],
},
) )
max_electric_vehicles: Optional[int] = Field( max_electric_vehicles: Optional[int] = Field(
default=None, default=None,
ge=0, ge=0,
description="Maximum number of electric vehicles that can be set", json_schema_extra={
examples=[1, 2], "description": "Maximum number of electric vehicles that can be set",
"examples": [1, 2],
},
) )
inverters: Optional[list[InverterCommonSettings]] = Field( inverters: Optional[list[InverterCommonSettings]] = Field(
default=None, description="List of inverters", examples=[[]] default=None, json_schema_extra={"description": "List of inverters", "examples": [[]]}
) )
max_inverters: Optional[int] = Field( max_inverters: Optional[int] = Field(
default=None, default=None,
ge=0, ge=0,
description="Maximum number of inverters that can be set", json_schema_extra={
examples=[1, 2], "description": "Maximum number of inverters that can be set",
"examples": [1, 2],
},
) )
home_appliances: Optional[list[HomeApplianceCommonSettings]] = Field( home_appliances: Optional[list[HomeApplianceCommonSettings]] = Field(
default=None, description="List of home appliances", examples=[[]] default=None, json_schema_extra={"description": "List of home appliances", "examples": [[]]}
) )
max_home_appliances: Optional[int] = Field( max_home_appliances: Optional[int] = Field(
default=None, default=None,
ge=0, ge=0,
description="Maximum number of home_appliances that can be set", json_schema_extra={
examples=[1, 2], "description": "Maximum number of home_appliances that can be set",
"examples": [1, 2],
},
) )
@computed_field # type: ignore[prop-decorator] @computed_field # type: ignore[prop-decorator]
@@ -336,13 +355,17 @@ class ResourceRegistry(SingletonMixin, ConfigMixin, PydanticBaseModel):
latest: dict[ResourceKey, ResourceStatus] = Field( latest: dict[ResourceKey, ResourceStatus] = Field(
default_factory=dict, default_factory=dict,
description="Latest resource status that was reported per resource key.", json_schema_extra={
example=[], "description": "Latest resource status that was reported per resource key.",
"example": [],
},
) )
history: dict[ResourceKey, list[tuple[DateTime, ResourceStatus]]] = Field( history: dict[ResourceKey, list[tuple[DateTime, ResourceStatus]]] = Field(
default_factory=dict, default_factory=dict,
description="History of resource stati that were reported per resource key.", json_schema_extra={
example=[], "description": "History of resource stati that were reported per resource key.",
"example": [],
},
) )
@model_validator(mode="after") @model_validator(mode="after")

View File

@@ -12,8 +12,10 @@ class DevicesBaseSettings(SettingsBaseModel):
device_id: str = Field( device_id: str = Field(
default="<unknown>", default="<unknown>",
description="ID of device", json_schema_extra={
examples=["battery1", "ev1", "inverter1", "dishwasher"], "description": "ID of device",
"examples": ["battery1", "ev1", "inverter1", "dishwasher"],
},
) )

View File

@@ -171,25 +171,28 @@ class Battery:
Two **exclusive** modes: Two **exclusive** modes:
Mode 1: **Mode 1:**
- `wh is not None` and `charge_factor == 0`
→ The raw requested charge energy is `wh` (pre-efficiency).
→ If remaining capacity is insufficient, charging is automatically limited.
→ No exception is raised due to capacity limits.
Mode 2: - `wh is not None` and `charge_factor == 0`
- `wh is None` and `charge_factor > 0` - The raw requested charge energy is `wh` (pre-efficiency).
→ The raw requested energy is `max_charge_power_w * charge_factor`. - If remaining capacity is insufficient, charging is automatically limited.
→ If the request exceeds remaining capacity, the algorithm tries to - No exception is raised due to capacity limits.
find a lower charge_factor that is compatible. If such a charge factor
exists, this hour's charge_factor is replaced. **Mode 2:**
→ If no charge factor can accommodate charging, the request is ignored
(`(0.0, 0.0)` is returned) and a penalty is applied elsewhere. - `wh is None` and `charge_factor > 0`
- The raw requested energy is `max_charge_power_w * charge_factor`.
- If the request exceeds remaining capacity, the algorithm tries to find a lower
`charge_factor` that is compatible. If such a charge factor exists, this hour's
`charge_factor` is replaced.
- If no charge factor can accommodate charging, the request is ignored (``(0.0, 0.0)`` is
returned) and a penalty is applied elsewhere.
Charging is constrained by: Charging is constrained by:
• Available SoC headroom (max_soc_wh - soc_wh)
• max_charge_power_w - Available SoC headroom (``max_soc_wh - soc_wh``)
• charging_efficiency - ``max_charge_power_w``
- ``charging_efficiency``
Args: Args:
wh (float | None): wh (float | None):
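The two exclusive charge modes and the capacity clamp described above can be condensed into a small numeric sketch. This is a simplification under the stated constraints (headroom, power limit, charging efficiency), not the `Battery` implementation; the fallback search over lower charge factors is omitted.

.. code-block:: python

    from typing import Optional

    def charge_for_hour(
        wh: Optional[float],
        charge_factor: float,
        max_charge_power_w: float,
        soc_wh: float,
        max_soc_wh: float,
        charging_efficiency: float,
    ) -> tuple[float, float]:
        """Return (stored_wh, losses_wh) for one hour, simplified from the docstring."""
        if wh is not None and charge_factor == 0:
            requested = wh  # Mode 1: raw requested charge energy (pre-efficiency)
        elif wh is None and charge_factor > 0:
            requested = max_charge_power_w * charge_factor  # Mode 2
        else:
            raise ValueError("Exactly one of wh or charge_factor must be given")

        requested = min(requested, max_charge_power_w)   # power limit over one hour
        headroom = max_soc_wh - soc_wh                   # available SoC headroom
        stored = min(requested * charging_efficiency, headroom)
        losses = stored / charging_efficiency - stored   # raw energy drawn minus stored energy
        return stored, losses

    # Mode 2 request of 2500 Wh raw energy, but only 500 Wh of headroom remains:
    print(charge_for_hour(None, 0.5, max_charge_power_w=5000,
                          soc_wh=7500, max_soc_wh=8000, charging_efficiency=0.88))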

View File

@@ -24,26 +24,34 @@ class MeasurementCommonSettings(SettingsBaseModel):
load_emr_keys: Optional[list[str]] = Field( load_emr_keys: Optional[list[str]] = Field(
default=None, default=None,
description="The keys of the measurements that are energy meter readings of a load [kWh].", json_schema_extra={
examples=[["load0_emr"]], "description": "The keys of the measurements that are energy meter readings of a load [kWh].",
"examples": [["load0_emr"]],
},
) )
grid_export_emr_keys: Optional[list[str]] = Field( grid_export_emr_keys: Optional[list[str]] = Field(
default=None, default=None,
description="The keys of the measurements that are energy meter readings of energy export to grid [kWh].", json_schema_extra={
examples=[["grid_export_emr"]], "description": "The keys of the measurements that are energy meter readings of energy export to grid [kWh].",
"examples": [["grid_export_emr"]],
},
) )
grid_import_emr_keys: Optional[list[str]] = Field( grid_import_emr_keys: Optional[list[str]] = Field(
default=None, default=None,
description="The keys of the measurements that are energy meter readings of energy import from grid [kWh].", json_schema_extra={
examples=[["grid_import_emr"]], "description": "The keys of the measurements that are energy meter readings of energy import from grid [kWh].",
"examples": [["grid_import_emr"]],
},
) )
pv_production_emr_keys: Optional[list[str]] = Field( pv_production_emr_keys: Optional[list[str]] = Field(
default=None, default=None,
description="The keys of the measurements that are PV production energy meter readings [kWh].", json_schema_extra={
examples=[["pv1_emr"]], "description": "The keys of the measurements that are PV production energy meter readings [kWh].",
"examples": [["pv1_emr"]],
},
) )
## Computed fields ## Computed fields
@@ -78,7 +86,7 @@ class Measurement(SingletonMixin, DataImportMixin, DataSequence):
""" """
records: list[MeasurementDataRecord] = Field( records: list[MeasurementDataRecord] = Field(
default_factory=list, description="list of measurement data records" default_factory=list, json_schema_extra={"description": "list of measurement data records"}
) )
def __init__(self, *args: Any, **kwargs: Any) -> None: def __init__(self, *args: Any, **kwargs: Any) -> None:

View File

@@ -34,50 +34,74 @@ class GeneticSimulation(PydanticBaseModel):
) )
start_hour: int = Field( start_hour: int = Field(
default=0, ge=0, le=23, description="Starting hour on day for optimizations." default=0,
ge=0,
le=23,
json_schema_extra={"description": "Starting hour on day for optimizations."},
) )
optimization_hours: Optional[int] = Field( optimization_hours: Optional[int] = Field(
default=24, ge=0, description="Number of hours into the future for optimizations." default=24,
ge=0,
json_schema_extra={"description": "Number of hours into the future for optimizations."},
) )
prediction_hours: Optional[int] = Field( prediction_hours: Optional[int] = Field(
default=48, ge=0, description="Number of hours into the future for predictions" default=48,
ge=0,
json_schema_extra={"description": "Number of hours into the future for predictions"},
) )
load_energy_array: Optional[NDArray[Shape["*"], float]] = Field( load_energy_array: Optional[NDArray[Shape["*"], float]] = Field(
default=None, default=None,
description="An array of floats representing the total load (consumption) in watts for different time intervals.", json_schema_extra={
"description": "An array of floats representing the total load (consumption) in watts for different time intervals."
},
) )
pv_prediction_wh: Optional[NDArray[Shape["*"], float]] = Field( pv_prediction_wh: Optional[NDArray[Shape["*"], float]] = Field(
default=None, default=None,
description="An array of floats representing the forecasted photovoltaic output in watts for different time intervals.", json_schema_extra={
"description": "An array of floats representing the forecasted photovoltaic output in watts for different time intervals."
},
) )
elect_price_hourly: Optional[NDArray[Shape["*"], float]] = Field( elect_price_hourly: Optional[NDArray[Shape["*"], float]] = Field(
default=None, default=None,
description="An array of floats representing the electricity price in euros per watt-hour for different time intervals.", json_schema_extra={
"description": "An array of floats representing the electricity price in euros per watt-hour for different time intervals."
},
) )
elect_revenue_per_hour_arr: Optional[NDArray[Shape["*"], float]] = Field( elect_revenue_per_hour_arr: Optional[NDArray[Shape["*"], float]] = Field(
default=None, default=None,
description="An array of floats representing the feed-in compensation in euros per watt-hour.", json_schema_extra={
"description": "An array of floats representing the feed-in compensation in euros per watt-hour."
},
) )
battery: Optional[Battery] = Field(default=None, description="TBD.") battery: Optional[Battery] = Field(default=None, json_schema_extra={"description": "TBD."})
ev: Optional[Battery] = Field(default=None, description="TBD.") ev: Optional[Battery] = Field(default=None, json_schema_extra={"description": "TBD."})
home_appliance: Optional[HomeAppliance] = Field(default=None, description="TBD.") home_appliance: Optional[HomeAppliance] = Field(
inverter: Optional[Inverter] = Field(default=None, description="TBD.") default=None, json_schema_extra={"description": "TBD."}
)
inverter: Optional[Inverter] = Field(default=None, json_schema_extra={"description": "TBD."})
ac_charge_hours: Optional[NDArray[Shape["*"], float]] = Field(default=None, description="TBD") ac_charge_hours: Optional[NDArray[Shape["*"], float]] = Field(
dc_charge_hours: Optional[NDArray[Shape["*"], float]] = Field(default=None, description="TBD") default=None, json_schema_extra={"description": "TBD"}
)
dc_charge_hours: Optional[NDArray[Shape["*"], float]] = Field(
default=None, json_schema_extra={"description": "TBD"}
)
bat_discharge_hours: Optional[NDArray[Shape["*"], float]] = Field( bat_discharge_hours: Optional[NDArray[Shape["*"], float]] = Field(
default=None, description="TBD" default=None, json_schema_extra={"description": "TBD"}
)
ev_charge_hours: Optional[NDArray[Shape["*"], float]] = Field(
default=None, json_schema_extra={"description": "TBD"}
) )
ev_charge_hours: Optional[NDArray[Shape["*"], float]] = Field(default=None, description="TBD")
ev_discharge_hours: Optional[NDArray[Shape["*"], float]] = Field( ev_discharge_hours: Optional[NDArray[Shape["*"], float]] = Field(
default=None, description="TBD" default=None, json_schema_extra={"description": "TBD"}
) )
home_appliance_start_hour: Optional[int] = Field( home_appliance_start_hour: Optional[int] = Field(
default=None, description="Home appliance start hour - None denotes no start." default=None,
json_schema_extra={"description": "Home appliance start hour - None denotes no start."},
) )
def prepare( def prepare(

View File

@@ -9,27 +9,27 @@ from akkudoktoreos.utils.datetimeutil import TimeWindowSequence
class DeviceParameters(GeneticParametersBaseModel): class DeviceParameters(GeneticParametersBaseModel):
device_id: str = Field(description="ID of device", examples="device1") device_id: str = Field(json_schema_extra={"description": "ID of device", "examples": "device1"})
hours: Optional[int] = Field( hours: Optional[int] = Field(
default=None, default=None,
gt=0, gt=0,
description="Number of prediction hours. Defaults to global config prediction hours.", json_schema_extra={
examples=[None], "description": "Number of prediction hours. Defaults to global config prediction hours.",
"examples": [None],
},
) )
def max_charging_power_field(description: Optional[str] = None) -> float: def max_charging_power_field(description: Optional[str] = None) -> float:
if description is None: if description is None:
description = "Maximum charging power in watts." description = "Maximum charging power in watts."
return Field( return Field(default=5000, gt=0, json_schema_extra={"description": description})
default=5000,
gt=0,
description=description,
)
def initial_soc_percentage_field(description: str) -> int: def initial_soc_percentage_field(description: str) -> int:
return Field(default=0, ge=0, le=100, description=description, examples=[42]) return Field(
default=0, ge=0, le=100, json_schema_extra={"description": description, "examples": [42]}
)
def discharging_efficiency_field(default_value: float) -> float: def discharging_efficiency_field(default_value: float) -> float:
@@ -37,24 +37,32 @@ def discharging_efficiency_field(default_value: float) -> float:
default=default_value, default=default_value,
gt=0, gt=0,
le=1, le=1,
description="A float representing the discharge efficiency of the battery.", json_schema_extra={
"description": "A float representing the discharge efficiency of the battery."
},
) )
class BaseBatteryParameters(DeviceParameters): class BaseBatteryParameters(DeviceParameters):
"""Battery Device Simulation Configuration.""" """Battery Device Simulation Configuration."""
device_id: str = Field(description="ID of battery", examples=["battery1"]) device_id: str = Field(
json_schema_extra={"description": "ID of battery", "examples": ["battery1"]}
)
capacity_wh: int = Field( capacity_wh: int = Field(
gt=0, gt=0,
description="An integer representing the capacity of the battery in watt-hours.", json_schema_extra={
examples=[8000], "description": "An integer representing the capacity of the battery in watt-hours.",
"examples": [8000],
},
) )
charging_efficiency: float = Field( charging_efficiency: float = Field(
default=0.88, default=0.88,
gt=0, gt=0,
le=1, le=1,
description="A float representing the charging efficiency of the battery.", json_schema_extra={
"description": "A float representing the charging efficiency of the battery."
},
) )
discharging_efficiency: float = discharging_efficiency_field(0.88) discharging_efficiency: float = discharging_efficiency_field(0.88)
max_charge_power_w: Optional[float] = max_charging_power_field() max_charge_power_w: Optional[float] = max_charging_power_field()
@@ -65,19 +73,25 @@ class BaseBatteryParameters(DeviceParameters):
default=0, default=0,
ge=0, ge=0,
le=100, le=100,
description="An integer representing the minimum state of charge (SOC) of the battery in percentage.", json_schema_extra={
examples=[10], "description": "An integer representing the minimum state of charge (SOC) of the battery in percentage.",
"examples": [10],
},
) )
max_soc_percentage: int = Field( max_soc_percentage: int = Field(
default=100, default=100,
ge=0, ge=0,
le=100, le=100,
description="An integer representing the maximum state of charge (SOC) of the battery in percentage.", json_schema_extra={
"description": "An integer representing the maximum state of charge (SOC) of the battery in percentage."
},
) )
charge_rates: Optional[list[float]] = Field( charge_rates: Optional[list[float]] = Field(
default=None, default=None,
description="Charge rates as factor of maximum charging power [0.00 ... 1.00]. None denotes all charge rates are available.", json_schema_extra={
examples=[[0.0, 0.25, 0.5, 0.75, 1.0], None], "description": "Charge rates as factor of maximum charging power [0.00 ... 1.00]. None denotes all charge rates are available.",
"examples": [[0.0, 0.25, 0.5, 0.75, 1.0], None],
},
) )
@@ -90,7 +104,9 @@ class SolarPanelBatteryParameters(BaseBatteryParameters):
class ElectricVehicleParameters(BaseBatteryParameters): class ElectricVehicleParameters(BaseBatteryParameters):
"""Battery Electric Vehicle Device Simulation Configuration.""" """Battery Electric Vehicle Device Simulation Configuration."""
device_id: str = Field(description="ID of electric vehicle", examples=["ev1"]) device_id: str = Field(
json_schema_extra={"description": "ID of electric vehicle", "examples": ["ev1"]}
)
discharging_efficiency: float = discharging_efficiency_field(1.0) discharging_efficiency: float = discharging_efficiency_field(1.0)
initial_soc_percentage: int = initial_soc_percentage_field( initial_soc_percentage: int = initial_soc_percentage_field(
"An integer representing the current state of charge (SOC) of the battery in percentage." "An integer representing the current state of charge (SOC) of the battery in percentage."
@@ -100,33 +116,44 @@ class ElectricVehicleParameters(BaseBatteryParameters):
class HomeApplianceParameters(DeviceParameters): class HomeApplianceParameters(DeviceParameters):
"""Home Appliance Device Simulation Configuration.""" """Home Appliance Device Simulation Configuration."""
device_id: str = Field(description="ID of home appliance", examples=["dishwasher"]) device_id: str = Field(
json_schema_extra={"description": "ID of home appliance", "examples": ["dishwasher"]}
)
consumption_wh: int = Field( consumption_wh: int = Field(
gt=0, gt=0,
description="An integer representing the energy consumption of a household device in watt-hours.", json_schema_extra={
examples=[2000], "description": "An integer representing the energy consumption of a household device in watt-hours.",
"examples": [2000],
},
) )
duration_h: int = Field( duration_h: int = Field(
gt=0, gt=0,
description="An integer representing the usage duration of a household device in hours.", json_schema_extra={
examples=[3], "description": "An integer representing the usage duration of a household device in hours.",
"examples": [3],
},
) )
time_windows: Optional[TimeWindowSequence] = Field( time_windows: Optional[TimeWindowSequence] = Field(
default=None, default=None,
description="List of allowed time windows. Defaults to optimization general time window.", json_schema_extra={
examples=[ "description": "List of allowed time windows. Defaults to optimization general time window.",
[ "examples": [
{"start_time": "10:00", "duration": "2 hours"}, [
{"start_time": "10:00", "duration": "2 hours"},
],
], ],
], },
) )
class InverterParameters(DeviceParameters): class InverterParameters(DeviceParameters):
"""Inverter Device Simulation Configuration.""" """Inverter Device Simulation Configuration."""
device_id: str = Field(description="ID of inverter", examples=["inverter1"]) device_id: str = Field(
max_power_wh: float = Field(gt=0, examples=[10000]) json_schema_extra={"description": "ID of inverter", "examples": ["inverter1"]}
battery_id: Optional[str] = Field( )
default=None, description="ID of battery", examples=[None, "battery1"] max_power_wh: float = Field(gt=0, json_schema_extra={"examples": [10000]})
battery_id: Optional[str] = Field(
default=None,
json_schema_extra={"description": "ID of battery", "examples": [None, "battery1"]},
) )

View File

@@ -37,19 +37,29 @@ class GeneticEnergyManagementParameters(GeneticParametersBaseModel):
"""Encapsulates energy-related forecasts and costs used in GENETIC optimization.""" """Encapsulates energy-related forecasts and costs used in GENETIC optimization."""
pv_prognose_wh: list[float] = Field( pv_prognose_wh: list[float] = Field(
description="An array of floats representing the forecasted photovoltaic output in watts for different time intervals." json_schema_extra={
"description": "An array of floats representing the forecasted photovoltaic output in watts for different time intervals."
}
) )
strompreis_euro_pro_wh: list[float] = Field( strompreis_euro_pro_wh: list[float] = Field(
description="An array of floats representing the electricity price in euros per watt-hour for different time intervals." json_schema_extra={
"description": "An array of floats representing the electricity price in euros per watt-hour for different time intervals."
}
) )
einspeiseverguetung_euro_pro_wh: Union[list[float], float] = Field( einspeiseverguetung_euro_pro_wh: Union[list[float], float] = Field(
description="A float or array of floats representing the feed-in compensation in euros per watt-hour." json_schema_extra={
"description": "A float or array of floats representing the feed-in compensation in euros per watt-hour."
}
) )
preis_euro_pro_wh_akku: float = Field( preis_euro_pro_wh_akku: float = Field(
description="A float representing the cost of battery energy per watt-hour." json_schema_extra={
"description": "A float representing the cost of battery energy per watt-hour."
}
) )
gesamtlast: list[float] = Field( gesamtlast: list[float] = Field(
description="An array of floats representing the total load (consumption) in watts for different time intervals." json_schema_extra={
"description": "An array of floats representing the total load (consumption) in watts for different time intervals."
}
) )
@model_validator(mode="after") @model_validator(mode="after")
@@ -93,10 +103,15 @@ class GeneticOptimizationParameters(
dishwasher: Optional[HomeApplianceParameters] = None dishwasher: Optional[HomeApplianceParameters] = None
temperature_forecast: Optional[list[Optional[float]]] = Field( temperature_forecast: Optional[list[Optional[float]]] = Field(
default=None, default=None,
description="An array of floats representing the temperature forecast in degrees Celsius for different time intervals.", json_schema_extra={
"description": "An array of floats representing the temperature forecast in degrees Celsius for different time intervals."
},
) )
start_solution: Optional[list[float]] = Field( start_solution: Optional[list[float]] = Field(
default=None, description="Can be `null` or contain a previous solution (if available)." default=None,
json_schema_extra={
"description": "Can be `null` or contain a previous solution (if available)."
},
) )
@model_validator(mode="after") @model_validator(mode="after")

View File

@@ -28,29 +28,52 @@ from akkudoktoreos.utils.utils import NumpyEncoder
class DeviceOptimizeResult(GeneticParametersBaseModel): class DeviceOptimizeResult(GeneticParametersBaseModel):
device_id: str = Field(description="ID of device", examples=["device1"]) device_id: str = Field(
hours: int = Field(gt=0, description="Number of hours in the simulation.", examples=[24]) json_schema_extra={"description": "ID of device", "examples": ["device1"]}
)
hours: int = Field(
gt=0,
json_schema_extra={"description": "Number of hours in the simulation.", "examples": [24]},
)
class ElectricVehicleResult(DeviceOptimizeResult): class ElectricVehicleResult(DeviceOptimizeResult):
"""Result class containing information related to the electric vehicle's charging and discharging behavior.""" """Result class containing information related to the electric vehicle's charging and discharging behavior."""
device_id: str = Field(description="ID of electric vehicle", examples=["ev1"]) device_id: str = Field(
json_schema_extra={"description": "ID of electric vehicle", "examples": ["ev1"]}
)
charge_array: list[float] = Field( charge_array: list[float] = Field(
description="Hourly charging status (0 for no charging, 1 for charging)." json_schema_extra={
"description": "Hourly charging status (0 for no charging, 1 for charging)."
}
) )
discharge_array: list[int] = Field( discharge_array: list[int] = Field(
description="Hourly discharging status (0 for no discharging, 1 for discharging)." json_schema_extra={
"description": "Hourly discharging status (0 for no discharging, 1 for discharging)."
}
)
discharging_efficiency: float = Field(
json_schema_extra={"description": "The discharge efficiency as a float.."}
)
capacity_wh: int = Field(
json_schema_extra={"description": "Capacity of the EVs battery in watt-hours."}
)
charging_efficiency: float = Field(
json_schema_extra={"description": "Charging efficiency as a float.."}
)
max_charge_power_w: int = Field(
json_schema_extra={"description": "Maximum charging power in watts."}
) )
discharging_efficiency: float = Field(description="The discharge efficiency as a float..")
capacity_wh: int = Field(description="Capacity of the EVs battery in watt-hours.")
charging_efficiency: float = Field(description="Charging efficiency as a float..")
max_charge_power_w: int = Field(description="Maximum charging power in watts.")
soc_wh: float = Field( soc_wh: float = Field(
description="State of charge of the battery in watt-hours at the start of the simulation." json_schema_extra={
"description": "State of charge of the battery in watt-hours at the start of the simulation."
}
) )
initial_soc_percentage: int = Field( initial_soc_percentage: int = Field(
description="State of charge at the start of the simulation in percentage." json_schema_extra={
"description": "State of charge at the start of the simulation in percentage."
}
) )
@field_validator("discharge_array", "charge_array", mode="before") @field_validator("discharge_array", "charge_array", mode="before")
@@ -61,37 +84,49 @@ class ElectricVehicleResult(DeviceOptimizeResult):
class GeneticSimulationResult(GeneticParametersBaseModel): class GeneticSimulationResult(GeneticParametersBaseModel):
"""This object contains the results of the simulation and provides insights into various parameters over the entire forecast period.""" """This object contains the results of the simulation and provides insights into various parameters over the entire forecast period."""
Last_Wh_pro_Stunde: list[float] = Field(description="TBD") Last_Wh_pro_Stunde: list[float] = Field(json_schema_extra={"description": "TBD"})
EAuto_SoC_pro_Stunde: list[float] = Field( EAuto_SoC_pro_Stunde: list[float] = Field(
description="The state of charge of the EV for each hour." json_schema_extra={"description": "The state of charge of the EV for each hour."}
) )
Einnahmen_Euro_pro_Stunde: list[float] = Field( Einnahmen_Euro_pro_Stunde: list[float] = Field(
description="The revenue from grid feed-in or other sources in euros per hour." json_schema_extra={
"description": "The revenue from grid feed-in or other sources in euros per hour."
}
) )
Gesamt_Verluste: float = Field( Gesamt_Verluste: float = Field(
description="The total losses in watt-hours over the entire period." json_schema_extra={"description": "The total losses in watt-hours over the entire period."}
) )
Gesamtbilanz_Euro: float = Field( Gesamtbilanz_Euro: float = Field(
description="The total balance of revenues minus costs in euros." json_schema_extra={"description": "The total balance of revenues minus costs in euros."}
) )
Gesamteinnahmen_Euro: float = Field(description="The total revenues in euros.") Gesamteinnahmen_Euro: float = Field(
Gesamtkosten_Euro: float = Field(description="The total costs in euros.") json_schema_extra={"description": "The total revenues in euros."}
)
Gesamtkosten_Euro: float = Field(json_schema_extra={"description": "The total costs in euros."})
Home_appliance_wh_per_hour: list[Optional[float]] = Field( Home_appliance_wh_per_hour: list[Optional[float]] = Field(
description="The energy consumption of a household appliance in watt-hours per hour." json_schema_extra={
"description": "The energy consumption of a household appliance in watt-hours per hour."
}
)
Kosten_Euro_pro_Stunde: list[float] = Field(
json_schema_extra={"description": "The costs in euros per hour."}
) )
Kosten_Euro_pro_Stunde: list[float] = Field(description="The costs in euros per hour.")
Netzbezug_Wh_pro_Stunde: list[float] = Field( Netzbezug_Wh_pro_Stunde: list[float] = Field(
description="The grid energy drawn in watt-hours per hour." json_schema_extra={"description": "The grid energy drawn in watt-hours per hour."}
) )
Netzeinspeisung_Wh_pro_Stunde: list[float] = Field( Netzeinspeisung_Wh_pro_Stunde: list[float] = Field(
description="The energy fed into the grid in watt-hours per hour." json_schema_extra={"description": "The energy fed into the grid in watt-hours per hour."}
)
Verluste_Pro_Stunde: list[float] = Field(
json_schema_extra={"description": "The losses in watt-hours per hour."}
) )
Verluste_Pro_Stunde: list[float] = Field(description="The losses in watt-hours per hour.")
akku_soc_pro_stunde: list[float] = Field( akku_soc_pro_stunde: list[float] = Field(
description="The state of charge of the battery (not the EV) in percentage per hour." json_schema_extra={
"description": "The state of charge of the battery (not the EV) in percentage per hour."
}
) )
Electricity_price: list[float] = Field( Electricity_price: list[float] = Field(
description="Used Electricity Price, including predictions" json_schema_extra={"description": "Used Electricity Price, including predictions"}
) )
@field_validator( @field_validator(
@@ -115,24 +150,34 @@ class GeneticSolution(ConfigMixin, GeneticParametersBaseModel):
"""**Note**: The first value of "Last_Wh_per_hour", "Netzeinspeisung_Wh_per_hour", and "Netzbezug_Wh_per_hour", will be set to null in the JSON output and represented as NaN or None in the corresponding classes' data returns. This approach is adopted to ensure that the current hour's processing remains unchanged.""" """**Note**: The first value of "Last_Wh_per_hour", "Netzeinspeisung_Wh_per_hour", and "Netzbezug_Wh_per_hour", will be set to null in the JSON output and represented as NaN or None in the corresponding classes' data returns. This approach is adopted to ensure that the current hour's processing remains unchanged."""
ac_charge: list[float] = Field( ac_charge: list[float] = Field(
description="Array with AC charging values as relative power (0.0-1.0), other values set to 0." json_schema_extra={
"description": "Array with AC charging values as relative power (0.0-1.0), other values set to 0."
}
) )
dc_charge: list[float] = Field( dc_charge: list[float] = Field(
description="Array with DC charging values as relative power (0-1), other values set to 0." json_schema_extra={
"description": "Array with DC charging values as relative power (0-1), other values set to 0."
}
) )
discharge_allowed: list[int] = Field( discharge_allowed: list[int] = Field(
description="Array with discharge values (1 for discharge, 0 otherwise)." json_schema_extra={
"description": "Array with discharge values (1 for discharge, 0 otherwise)."
}
) )
eautocharge_hours_float: Optional[list[float]] = Field(description="TBD") eautocharge_hours_float: Optional[list[float]] = Field(json_schema_extra={"description": "TBD"})
result: GeneticSimulationResult result: GeneticSimulationResult
eauto_obj: Optional[ElectricVehicleResult] eauto_obj: Optional[ElectricVehicleResult]
start_solution: Optional[list[float]] = Field( start_solution: Optional[list[float]] = Field(
default=None, default=None,
description="An array of binary values (0 or 1) representing a possible starting solution for the simulation.", json_schema_extra={
"description": "An array of binary values (0 or 1) representing a possible starting solution for the simulation."
},
) )
washingstart: Optional[int] = Field( washingstart: Optional[int] = Field(
default=None, default=None,
description="Can be `null` or contain an object representing the start of washing (if applicable).", json_schema_extra={
"description": "Can be `null` or contain an object representing the start of washing (if applicable)."
},
) )
@field_validator( @field_validator(
@@ -167,15 +212,14 @@ class GeneticSolution(ConfigMixin, GeneticParametersBaseModel):
discharge_allowed (bool): Whether discharging is permitted. discharge_allowed (bool): Whether discharging is permitted.
Returns: Returns:
tuple[BatteryOperationMode, float]: tuple[BatteryOperationMode, float]: A tuple containing
A tuple containing:
- `BatteryOperationMode`: the representative high-level operation mode. - `BatteryOperationMode`: the representative high-level operation mode.
- `float`: the operation factor corresponding to the active signal. - `float`: the operation factor corresponding to the active signal.
Notes: Notes:
- The mapping prioritizes AC charge > DC charge > discharge. - The mapping prioritizes AC charge > DC charge > discharge.
- Multiple strategies can produce the same low-level signals; this function - Multiple strategies can produce the same low-level signals; this function
returns a representative mode based on a defined priority order. returns a representative mode based on a defined priority order.
""" """
# (0,0,0) → Nothing allowed # (0,0,0) → Nothing allowed
if ac_charge <= 0.0 and dc_charge <= 0.0 and not discharge_allowed: if ac_charge <= 0.0 and dc_charge <= 0.0 and not discharge_allowed:
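The priority order described in the Notes above (AC charge > DC charge > discharge) can be sketched as a simple cascade. The mode names below are placeholders, not actual `BatteryOperationMode` members, and the discharge factor of 1.0 is an assumption for the boolean signal.

.. code-block:: python

    def representative_mode(
        ac_charge: float, dc_charge: float, discharge_allowed: bool
    ) -> tuple[str, float]:
        """Pick one representative mode by priority: AC charge > DC charge > discharge (placeholders)."""
        if ac_charge > 0.0:
            return "AC_CHARGE (placeholder)", ac_charge
        if dc_charge > 0.0:
            return "DC_CHARGE (placeholder)", dc_charge
        if discharge_allowed:
            return "DISCHARGE (placeholder)", 1.0  # assumed factor for a boolean signal
        return "IDLE (placeholder)", 0.0  # (0, 0, 0): nothing allowed

    print(representative_mode(0.0, 0.0, False))  # idle
    print(representative_mode(0.5, 1.0, True))   # AC charge wins by priority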

View File

@@ -16,30 +16,38 @@ class GeneticCommonSettings(SettingsBaseModel):
individuals: Optional[int] = Field( individuals: Optional[int] = Field(
default=300, default=300,
ge=10, ge=10,
description="Number of individuals (solutions) to generate for the (initial) generation [>= 10]. Defaults to 300.", json_schema_extra={
examples=[300], "description": "Number of individuals (solutions) to generate for the (initial) generation [>= 10]. Defaults to 300.",
"examples": [300],
},
) )
generations: Optional[int] = Field( generations: Optional[int] = Field(
default=400, default=400,
ge=10, ge=10,
description="Number of generations to evaluate the optimal solution [>= 10]. Defaults to 400.", json_schema_extra={
examples=[400], "description": "Number of generations to evaluate the optimal solution [>= 10]. Defaults to 400.",
"examples": [400],
},
) )
seed: Optional[int] = Field( seed: Optional[int] = Field(
default=None, default=None,
ge=0, ge=0,
description="Fixed seed for genetic algorithm. Defaults to 'None' which means random seed.", json_schema_extra={
examples=[None], "description": "Fixed seed for genetic algorithm. Defaults to 'None' which means random seed.",
"examples": [None],
},
) )
penalties: Optional[dict[str, Union[float, int, str]]] = Field( penalties: Optional[dict[str, Union[float, int, str]]] = Field(
default=None, default=None,
description="A dictionary of penalty function parameters consisting of a penalty function parameter name and the associated value.", json_schema_extra={
examples=[ "description": "A dictionary of penalty function parameters consisting of a penalty function parameter name and the associated value.",
{"ev_soc_miss": 10}, "examples": [
], {"ev_soc_miss": 10},
],
},
) )
@@ -49,28 +57,33 @@ class OptimizationCommonSettings(SettingsBaseModel):
horizon_hours: Optional[int] = Field( horizon_hours: Optional[int] = Field(
default=24, default=24,
ge=0, ge=0,
description="The general time window within which the energy optimization goal shall be achieved [h]. Defaults to 24 hours.", json_schema_extra={
examples=[24], "description": "The general time window within which the energy optimization goal shall be achieved [h]. Defaults to 24 hours.",
"examples": [24],
},
) )
interval: Optional[int] = Field( interval: Optional[int] = Field(
default=3600, default=3600,
ge=15 * 60, ge=15 * 60,
le=60 * 60, le=60 * 60,
description="The optimization interval [sec].", json_schema_extra={
examples=[60 * 60, 15 * 60], "description": "The optimization interval [sec].",
"examples": [60 * 60, 15 * 60],
},
) )
algorithm: Optional[str] = Field( algorithm: Optional[str] = Field(
default="GENETIC", default="GENETIC",
description="The optimization algorithm.", json_schema_extra={"description": "The optimization algorithm.", "examples": ["GENETIC"]},
examples=["GENETIC"],
) )
genetic: Optional[GeneticCommonSettings] = Field( genetic: Optional[GeneticCommonSettings] = Field(
default=None, default=None,
description="Genetic optimization algorithm configuration.", json_schema_extra={
examples=[{"individuals": 400, "seed": None, "penalties": {"ev_soc_miss": 10}}], "description": "Genetic optimization algorithm configuration.",
"examples": [{"individuals": 400, "seed": None, "penalties": {"ev_soc_miss": 10}}],
},
) )
@model_validator(mode="after") @model_validator(mode="after")
@@ -85,57 +98,71 @@ class OptimizationCommonSettings(SettingsBaseModel):
class OptimizationSolution(PydanticBaseModel): class OptimizationSolution(PydanticBaseModel):
"""General Optimization Solution.""" """General Optimization Solution."""
id: str = Field(..., description="Unique ID for the optimization solution.") id: str = Field(
..., json_schema_extra={"description": "Unique ID for the optimization solution."}
)
generated_at: DateTime = Field(..., description="Timestamp when the solution was generated.") generated_at: DateTime = Field(
..., json_schema_extra={"description": "Timestamp when the solution was generated."}
)
comment: Optional[str] = Field( comment: Optional[str] = Field(
default=None, description="Optional comment or annotation for the solution." default=None,
json_schema_extra={"description": "Optional comment or annotation for the solution."},
) )
valid_from: Optional[DateTime] = Field( valid_from: Optional[DateTime] = Field(
default=None, description="Start time of the optimization solution." default=None, json_schema_extra={"description": "Start time of the optimization solution."}
) )
valid_until: Optional[DateTime] = Field( valid_until: Optional[DateTime] = Field(
default=None, default=None, json_schema_extra={"description": "End time of the optimization solution."}
description="End time of the optimization solution.",
) )
total_losses_energy_wh: float = Field( total_losses_energy_wh: float = Field(
description="The total losses in watt-hours over the entire period." json_schema_extra={"description": "The total losses in watt-hours over the entire period."}
) )
total_revenues_amt: float = Field(description="The total revenues [money amount].") total_revenues_amt: float = Field(
json_schema_extra={"description": "The total revenues [money amount]."}
)
total_costs_amt: float = Field(description="The total costs [money amount].") total_costs_amt: float = Field(
json_schema_extra={"description": "The total costs [money amount]."}
)
fitness_score: set[float] = Field(description="The fitness score as a set of fitness values.") fitness_score: set[float] = Field(
json_schema_extra={"description": "The fitness score as a set of fitness values."}
)
prediction: PydanticDateTimeDataFrame = Field( prediction: PydanticDateTimeDataFrame = Field(
description=( json_schema_extra={
"Datetime data frame with time series prediction data per optimization interval:" "description": (
"- pv_energy_wh: PV energy prediction (positive) in wh" "Datetime data frame with time series prediction data per optimization interval:"
"- elec_price_amt_kwh: Electricity price prediction in money per kwh" "- pv_energy_wh: PV energy prediction (positive) in wh"
"- feed_in_tariff_amt_kwh: Feed in tariff prediction in money per kwh" "- elec_price_amt_kwh: Electricity price prediction in money per kwh"
"- weather_temp_air_celcius: Temperature in °C" "- feed_in_tariff_amt_kwh: Feed in tariff prediction in money per kwh"
"- loadforecast_energy_wh: Load mean energy prediction in wh" "- weather_temp_air_celcius: Temperature in °C"
"- loadakkudoktor_std_energy_wh: Load energy standard deviation prediction in wh" "- loadforecast_energy_wh: Load mean energy prediction in wh"
"- loadakkudoktor_mean_energy_wh: Load mean energy prediction in wh" "- loadakkudoktor_std_energy_wh: Load energy standard deviation prediction in wh"
) "- loadakkudoktor_mean_energy_wh: Load mean energy prediction in wh"
)
}
) )
solution: PydanticDateTimeDataFrame = Field( solution: PydanticDateTimeDataFrame = Field(
description=( json_schema_extra={
"Datetime data frame with time series solution data per optimization interval:" "description": (
"- load_energy_wh: Load of all energy consumers in wh" "Datetime data frame with time series solution data per optimization interval:"
"- grid_energy_wh: Grid energy feed in (negative) or consumption (positive) in wh" "- load_energy_wh: Load of all energy consumers in wh"
"- costs_amt: Costs in money amount" "- grid_energy_wh: Grid energy feed in (negative) or consumption (positive) in wh"
"- revenue_amt: Revenue in money amount" "- costs_amt: Costs in money amount"
"- losses_energy_wh: Energy losses in wh" "- revenue_amt: Revenue in money amount"
"- <device-id>_operation_mode_id: Operation mode id of the device." "- losses_energy_wh: Energy losses in wh"
"- <device-id>_operation_mode_factor: Operation mode factor of the device." "- <device-id>_operation_mode_id: Operation mode id of the device."
"- <device-id>_soc_factor: State of charge of a battery/ electric vehicle device as factor of total capacity." "- <device-id>_operation_mode_factor: Operation mode factor of the device."
"- <device-id>_energy_wh: Energy consumption (positive) of a device in wh." "- <device-id>_soc_factor: State of charge of a battery/ electric vehicle device as factor of total capacity."
) "- <device-id>_energy_wh: Energy consumption (positive) of a device in wh."
)
}
) )

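The change repeated throughout these files is the same mechanical move: text previously passed as Field(description=..., examples=...) is relocated into a json_schema_extra dict. A small standalone sketch of the two spellings; the model and field names are illustrative, not taken from the repository.

from typing import Optional

from pydantic import BaseModel, Field


class ChargesBefore(BaseModel):
    charges_kwh: Optional[float] = Field(
        default=None,
        description="Electricity price charges [€/kWh].",
        examples=[0.21],
    )


class ChargesAfter(BaseModel):
    charges_kwh: Optional[float] = Field(
        default=None,
        json_schema_extra={
            "description": "Electricity price charges [€/kWh].",
            "examples": [0.21],
        },
    )


# Both spellings end up in the generated JSON schema for the field; only the
# second is stored under FieldInfo.json_schema_extra, which is what the helper
# functions added near the end of this change set appear to read back.
print(ChargesAfter.model_json_schema()["properties"]["charges_kwh"])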
View File

@@ -4,6 +4,9 @@ from pydantic import Field, field_validator
 from akkudoktoreos.config.configabc import SettingsBaseModel
 from akkudoktoreos.prediction.elecpriceabc import ElecPriceProvider
+from akkudoktoreos.prediction.elecpriceenergycharts import (
+    ElecPriceEnergyChartsCommonSettings,
+)
 from akkudoktoreos.prediction.elecpriceimport import ElecPriceImportCommonSettings
 from akkudoktoreos.prediction.prediction import get_prediction
@@ -17,44 +20,41 @@ elecprice_providers = [
 ]

-class ElecPriceCommonProviderSettings(SettingsBaseModel):
-    """Electricity Price Prediction Provider Configuration."""
-
-    ElecPriceImport: Optional[ElecPriceImportCommonSettings] = Field(
-        default=None, description="ElecPriceImport settings", examples=[None]
-    )
-
-
 class ElecPriceCommonSettings(SettingsBaseModel):
     """Electricity Price Prediction Configuration."""

     provider: Optional[str] = Field(
         default=None,
-        description="Electricity price provider id of provider to be used.",
-        examples=["ElecPriceAkkudoktor"],
+        json_schema_extra={
+            "description": "Electricity price provider id of provider to be used.",
+            "examples": ["ElecPriceAkkudoktor"],
+        },
     )
     charges_kwh: Optional[float] = Field(
         default=None,
         ge=0,
-        description="Electricity price charges [€/kWh]. Will be added to variable market price.",
-        examples=[0.21],
+        json_schema_extra={
+            "description": "Electricity price charges [€/kWh]. Will be added to variable market price.",
+            "examples": [0.21],
+        },
     )
     vat_rate: Optional[float] = Field(
         default=1.19,
         ge=0,
-        description="VAT rate factor applied to electricity price when charges are used.",
-        examples=[1.19],
+        json_schema_extra={
+            "description": "VAT rate factor applied to electricity price when charges are used.",
+            "examples": [1.19],
+        },
     )
-    provider_settings: ElecPriceCommonProviderSettings = Field(
-        default_factory=ElecPriceCommonProviderSettings,
-        description="Provider settings",
-        examples=[
-            # Example 1: Empty/default settings (all providers None)
-            {
-                "ElecPriceImport": None,
-            },
-        ],
-    )
+    elecpriceimport: ElecPriceImportCommonSettings = Field(
+        default_factory=ElecPriceImportCommonSettings,
+        json_schema_extra={"description": "Import provider settings."},
+    )
+
+    energycharts: ElecPriceEnergyChartsCommonSettings = Field(
+        default_factory=ElecPriceEnergyChartsCommonSettings,
+        json_schema_extra={"description": "Energy Charts provider settings."},
+    )

     # Validators

View File

@@ -21,7 +21,7 @@ class ElecPriceDataRecord(PredictionRecord):
""" """
elecprice_marketprice_wh: Optional[float] = Field( elecprice_marketprice_wh: Optional[float] = Field(
None, description="Electricity market price per Wh (€/Wh)" None, json_schema_extra={"description": "Electricity market price per Wh (€/Wh)"}
) )
# Computed fields # Computed fields
@@ -59,7 +59,8 @@ class ElecPriceProvider(PredictionProvider):
# overload # overload
records: List[ElecPriceDataRecord] = Field( records: List[ElecPriceDataRecord] = Field(
default_factory=list, description="List of ElecPriceDataRecord records" default_factory=list,
json_schema_extra={"description": "List of ElecPriceDataRecord records"},
) )
@classmethod @classmethod

View File

@@ -7,21 +7,44 @@ format, enabling consistent access to forecasted and historical electricity pric
""" """
from datetime import datetime from datetime import datetime
from enum import Enum
from typing import Any, List, Optional, Union from typing import Any, List, Optional, Union
import numpy as np import numpy as np
import pandas as pd import pandas as pd
import requests import requests
from loguru import logger from loguru import logger
from pydantic import ValidationError from pydantic import Field, ValidationError
from statsmodels.tsa.holtwinters import ExponentialSmoothing from statsmodels.tsa.holtwinters import ExponentialSmoothing
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.core.cache import cache_in_file from akkudoktoreos.core.cache import cache_in_file
from akkudoktoreos.core.pydantic import PydanticBaseModel from akkudoktoreos.core.pydantic import PydanticBaseModel
from akkudoktoreos.prediction.elecpriceabc import ElecPriceProvider from akkudoktoreos.prediction.elecpriceabc import ElecPriceProvider
from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration
class EnergyChartsBiddingZones(str, Enum):
"""Energy Charts Bidding Zones."""
AT = "AT"
BE = "BE"
CH = "CH"
CZ = "CZ"
DE_LU = "DE-LU"
DE_AT_LU = "DE-AT-LU"
DK1 = "DK1"
DK2 = "DK2"
FR = "FR"
HU = "HU"
IT_North = "IT-NORTH"
NL = "NL"
NO2 = "NO2"
PL = "PL"
SE4 = "SE4"
SI = "SI"
class EnergyChartsElecPrice(PydanticBaseModel): class EnergyChartsElecPrice(PydanticBaseModel):
license_info: str license_info: str
unix_seconds: List[int] unix_seconds: List[int]
@@ -30,6 +53,21 @@ class EnergyChartsElecPrice(PydanticBaseModel):
     deprecated: bool


+class ElecPriceEnergyChartsCommonSettings(SettingsBaseModel):
+    """Common settings for Energy Charts electricity price provider."""
+
+    bidding_zone: EnergyChartsBiddingZones = Field(
+        default=EnergyChartsBiddingZones.DE_LU,
+        json_schema_extra={
+            "description": (
+                "Bidding Zone: 'AT', 'BE', 'CH', 'CZ', 'DE-LU', 'DE-AT-LU', 'DK1', 'DK2', 'FR', "
+                "'HU', 'IT-NORTH', 'NL', 'NO2', 'PL', 'SE4' or 'SI'"
+            ),
+            "examples": ["AT"],
+        },
+    )
+
+
 class ElecPriceEnergyCharts(ElecPriceProvider):
     """Fetch and process electricity price forecast data from Energy-Charts.
@@ -95,7 +133,8 @@ class ElecPriceEnergyCharts(ElecPriceProvider):
         )
         last_date = to_datetime(self.end_datetime, as_string="YYYY-MM-DD")
-        url = f"{source}/price?bzn=DE-LU&start={start_date}&end={last_date}"
+        bidding_zone = str(self.config.elecprice.energycharts.bidding_zone)
+        url = f"{source}/price?bzn={bidding_zone}&start={start_date}&end={last_date}"
         response = requests.get(url, timeout=30)
         logger.debug(f"Response from {url}: {response}")
         response.raise_for_status()  # Raise an error for bad responses

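With the new setting, the Energy-Charts request is no longer pinned to DE-LU; the configured bidding zone is interpolated into the query string, as the hunk above shows. A small illustration of the resulting URL, under the assumption that the zone's plain string value is what belongs in the query; the base URL and helper function here are placeholders, not taken from the repository.

from enum import Enum


class EnergyChartsBiddingZones(str, Enum):
    """Two of the zones listed above, for illustration."""

    DE_LU = "DE-LU"
    AT = "AT"


def build_price_url(source: str, zone: EnergyChartsBiddingZones, start: str, end: str) -> str:
    # .value yields the plain zone string ("AT"), regardless of how str()
    # renders the enum member on a given Python version.
    return f"{source}/price?bzn={zone.value}&start={start}&end={end}"


print(build_price_url("https://api.energy-charts.info", EnergyChartsBiddingZones.AT, "2025-11-24", "2025-11-26"))
# https://api.energy-charts.info/price?bzn=AT&start=2025-11-24&end=2025-11-26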
View File

@@ -9,7 +9,6 @@ format, enabling consistent access to forecasted and historical elecprice attrib
 from pathlib import Path
 from typing import Optional, Union

-from loguru import logger
 from pydantic import Field, field_validator

 from akkudoktoreos.config.configabc import SettingsBaseModel
@@ -22,14 +21,18 @@ class ElecPriceImportCommonSettings(SettingsBaseModel):
import_file_path: Optional[Union[str, Path]] = Field( import_file_path: Optional[Union[str, Path]] = Field(
default=None, default=None,
description="Path to the file to import elecprice data from.", json_schema_extra={
examples=[None, "/path/to/prices.json"], "description": "Path to the file to import elecprice data from.",
"examples": [None, "/path/to/prices.json"],
},
) )
import_json: Optional[str] = Field( import_json: Optional[str] = Field(
default=None, default=None,
description="JSON string, dictionary of electricity price forecast value lists.", json_schema_extra={
examples=['{"elecprice_marketprice_wh": [0.0003384, 0.0003318, 0.0003284]}'], "description": "JSON string, dictionary of electricity price forecast value lists.",
"examples": ['{"elecprice_marketprice_wh": [0.0003384, 0.0003318, 0.0003284]}'],
},
) )
# Validators # Validators
@@ -61,16 +64,13 @@ class ElecPriceImport(ElecPriceProvider, PredictionImportProvider):
return "ElecPriceImport" return "ElecPriceImport"
def _update_data(self, force_update: Optional[bool] = False) -> None: def _update_data(self, force_update: Optional[bool] = False) -> None:
if self.config.elecprice.provider_settings.ElecPriceImport is None: if self.config.elecprice.elecpriceimport.import_file_path:
logger.debug(f"{self.provider_id()} data update without provider settings.")
return
if self.config.elecprice.provider_settings.ElecPriceImport.import_file_path:
self.import_from_file( self.import_from_file(
self.config.elecprice.provider_settings.ElecPriceImport.import_file_path, self.config.elecprice.elecpriceimport.import_file_path,
key_prefix="elecprice", key_prefix="elecprice",
) )
if self.config.elecprice.provider_settings.ElecPriceImport.import_json: if self.config.elecprice.elecpriceimport.import_json:
self.import_from_json( self.import_from_json(
self.config.elecprice.provider_settings.ElecPriceImport.import_json, self.config.elecprice.elecpriceimport.import_json,
key_prefix="elecprice", key_prefix="elecprice",
) )

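The _update_data change above also shows the simpler configuration path: import settings now hang directly off elecprice.elecpriceimport instead of elecprice.provider_settings.ElecPriceImport, and the early return for missing provider settings disappears because the nested model always exists via default_factory. A hypothetical settings fragment in the new shape; only the key names come from the diff, the surrounding layout is assumed.

settings = {
    "elecprice": {
        "provider": "ElecPriceImport",
        # previously: {"provider_settings": {"ElecPriceImport": {...}}}
        "elecpriceimport": {
            "import_file_path": "/path/to/prices.json",
            "import_json": None,
        },
    }
}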
View File

@@ -22,10 +22,12 @@ class FeedInTariffCommonProviderSettings(SettingsBaseModel):
"""Feed In Tariff Prediction Provider Configuration.""" """Feed In Tariff Prediction Provider Configuration."""
FeedInTariffFixed: Optional[FeedInTariffFixedCommonSettings] = Field( FeedInTariffFixed: Optional[FeedInTariffFixedCommonSettings] = Field(
default=None, description="FeedInTariffFixed settings", examples=[None] default=None,
json_schema_extra={"description": "FeedInTariffFixed settings", "examples": [None]},
) )
FeedInTariffImport: Optional[FeedInTariffImportCommonSettings] = Field( FeedInTariffImport: Optional[FeedInTariffImportCommonSettings] = Field(
default=None, description="FeedInTariffImport settings", examples=[None] default=None,
json_schema_extra={"description": "FeedInTariffImport settings", "examples": [None]},
) )
@@ -34,20 +36,24 @@ class FeedInTariffCommonSettings(SettingsBaseModel):
provider: Optional[str] = Field( provider: Optional[str] = Field(
default=None, default=None,
description="Feed in tariff provider id of provider to be used.", json_schema_extra={
examples=["FeedInTariffFixed", "FeedInTarifImport"], "description": "Feed in tariff provider id of provider to be used.",
"examples": ["FeedInTariffFixed", "FeedInTarifImport"],
},
) )
provider_settings: FeedInTariffCommonProviderSettings = Field( provider_settings: FeedInTariffCommonProviderSettings = Field(
default_factory=FeedInTariffCommonProviderSettings, default_factory=FeedInTariffCommonProviderSettings,
description="Provider settings", json_schema_extra={
examples=[ "description": "Provider settings",
# Example 1: Empty/default settings (all providers None) "examples": [
{ # Example 1: Empty/default settings (all providers None)
"FeedInTariffFixed": None, {
"FeedInTariffImport": None, "FeedInTariffFixed": None,
}, "FeedInTariffImport": None,
], },
],
},
) )
# Validators # Validators

View File

@@ -20,7 +20,9 @@ class FeedInTariffDataRecord(PredictionRecord):
""" """
feed_in_tariff_wh: Optional[float] = Field(None, description="Feed in tariff per Wh (€/Wh)") feed_in_tariff_wh: Optional[float] = Field(
None, json_schema_extra={"description": "Feed in tariff per Wh (€/Wh)"}
)
# Computed fields # Computed fields
@computed_field # type: ignore[prop-decorator] @computed_field # type: ignore[prop-decorator]
@@ -46,7 +48,8 @@ class FeedInTariffProvider(PredictionProvider):
# overload # overload
records: List[FeedInTariffDataRecord] = Field( records: List[FeedInTariffDataRecord] = Field(
default_factory=list, description="List of FeedInTariffDataRecord records" default_factory=list,
json_schema_extra={"description": "List of FeedInTariffDataRecord records"},
) )
@classmethod @classmethod

View File

@@ -16,8 +16,10 @@ class FeedInTariffFixedCommonSettings(SettingsBaseModel):
feed_in_tariff_kwh: Optional[float] = Field( feed_in_tariff_kwh: Optional[float] = Field(
default=None, default=None,
ge=0, ge=0,
description="Electricity price feed in tariff [€/kWH].", json_schema_extra={
examples=[0.078], "description": "Electricity price feed in tariff [€/kWH].",
"examples": [0.078],
},
) )

View File

@@ -21,13 +21,17 @@ class FeedInTariffImportCommonSettings(SettingsBaseModel):
import_file_path: Optional[Union[str, Path]] = Field( import_file_path: Optional[Union[str, Path]] = Field(
default=None, default=None,
description="Path to the file to import feed in tariff data from.", json_schema_extra={
examples=[None, "/path/to/feedintariff.json"], "description": "Path to the file to import feed in tariff data from.",
"examples": [None, "/path/to/feedintariff.json"],
},
) )
import_json: Optional[str] = Field( import_json: Optional[str] = Field(
default=None, default=None,
description="JSON string, dictionary of feed in tariff forecast value lists.", json_schema_extra={
examples=['{"fead_in_tariff_wh": [0.000078, 0.000078, 0.000023]}'], "description": "JSON string, dictionary of feed in tariff forecast value lists.",
"examples": ['{"fead_in_tariff_wh": [0.000078, 0.000078, 0.000023]}'],
},
) )
# Validators # Validators

View File

@@ -5,7 +5,7 @@ from pathlib import Path
 import numpy as np
 from scipy.interpolate import RegularGridInterpolator

-from akkudoktoreos.core.cache import cachemethod_energy_management
+from akkudoktoreos.core.cache import cache_energy_management
 from akkudoktoreos.core.coreabc import SingletonMixin
@@ -24,7 +24,7 @@ class SelfConsumptionProbabilityInterpolator:
         points = np.array([np.full_like(partial_loads, load_1h_power), partial_loads]).T
         return points, partial_loads

-    @cachemethod_energy_management
+    @cache_energy_management
     def calculate_self_consumption(self, load_1h_power: float, pv_power: float) -> float:
         """Calculate the PV self-consumption rate using RegularGridInterpolator.

View File

@@ -25,13 +25,14 @@ class LoadCommonProviderSettings(SettingsBaseModel):
"""Load Prediction Provider Configuration.""" """Load Prediction Provider Configuration."""
LoadAkkudoktor: Optional[LoadAkkudoktorCommonSettings] = Field( LoadAkkudoktor: Optional[LoadAkkudoktorCommonSettings] = Field(
default=None, description="LoadAkkudoktor settings", examples=[None] default=None,
json_schema_extra={"description": "LoadAkkudoktor settings", "examples": [None]},
) )
LoadVrm: Optional[LoadVrmCommonSettings] = Field( LoadVrm: Optional[LoadVrmCommonSettings] = Field(
default=None, description="LoadVrm settings", examples=[None] default=None, json_schema_extra={"description": "LoadVrm settings", "examples": [None]}
) )
LoadImport: Optional[LoadImportCommonSettings] = Field( LoadImport: Optional[LoadImportCommonSettings] = Field(
default=None, description="LoadImport settings", examples=[None] default=None, json_schema_extra={"description": "LoadImport settings", "examples": [None]}
) )
@@ -40,21 +41,25 @@ class LoadCommonSettings(SettingsBaseModel):
provider: Optional[str] = Field( provider: Optional[str] = Field(
default=None, default=None,
description="Load provider id of provider to be used.", json_schema_extra={
examples=["LoadAkkudoktor"], "description": "Load provider id of provider to be used.",
"examples": ["LoadAkkudoktor"],
},
) )
provider_settings: LoadCommonProviderSettings = Field( provider_settings: LoadCommonProviderSettings = Field(
default_factory=LoadCommonProviderSettings, default_factory=LoadCommonProviderSettings,
description="Provider settings", json_schema_extra={
examples=[ "description": "Provider settings",
# Example 1: Empty/default settings (all providers None) "examples": [
{ # Example 1: Empty/default settings (all providers None)
"LoadAkkudoktor": None, {
"LoadVrm": None, "LoadAkkudoktor": None,
"LoadImport": None, "LoadVrm": None,
}, "LoadImport": None,
], },
],
},
) )
# Validators # Validators

View File

@@ -16,7 +16,7 @@ class LoadDataRecord(PredictionRecord):
"""Represents a load data record containing various load attributes at a specific datetime.""" """Represents a load data record containing various load attributes at a specific datetime."""
loadforecast_power_w: Optional[float] = Field( loadforecast_power_w: Optional[float] = Field(
default=None, description="Predicted load mean value (W)." default=None, json_schema_extra={"description": "Predicted load mean value (W)."}
) )
@@ -42,7 +42,7 @@ class LoadProvider(PredictionProvider):
# overload # overload
records: List[LoadDataRecord] = Field( records: List[LoadDataRecord] = Field(
default_factory=list, description="List of LoadDataRecord records" default_factory=list, json_schema_extra={"description": "List of LoadDataRecord records"}
) )
@classmethod @classmethod

View File

@@ -15,7 +15,8 @@ class LoadAkkudoktorCommonSettings(SettingsBaseModel):
"""Common settings for load data import from file.""" """Common settings for load data import from file."""
loadakkudoktor_year_energy_kwh: Optional[float] = Field( loadakkudoktor_year_energy_kwh: Optional[float] = Field(
default=None, description="Yearly energy consumption (kWh).", examples=[40421] default=None,
json_schema_extra={"description": "Yearly energy consumption (kWh).", "examples": [40421]},
) )
@@ -23,11 +24,11 @@ class LoadAkkudoktorDataRecord(LoadDataRecord):
"""Represents a load data record with extra fields for LoadAkkudoktor.""" """Represents a load data record with extra fields for LoadAkkudoktor."""
loadakkudoktor_mean_power_w: Optional[float] = Field( loadakkudoktor_mean_power_w: Optional[float] = Field(
default=None, description="Predicted load mean value (W)." default=None, json_schema_extra={"description": "Predicted load mean value (W)."}
) )
loadakkudoktor_std_power_w: Optional[float] = Field( loadakkudoktor_std_power_w: Optional[float] = Field(
default=None, description="Predicted load standard deviation (W)." default=None, json_schema_extra={"description": "Predicted load standard deviation (W)."}
) )
@@ -35,7 +36,8 @@ class LoadAkkudoktor(LoadProvider):
"""Fetch Load forecast data from Akkudoktor load profiles.""" """Fetch Load forecast data from Akkudoktor load profiles."""
records: list[LoadAkkudoktorDataRecord] = Field( records: list[LoadAkkudoktorDataRecord] = Field(
default_factory=list, description="List of LoadAkkudoktorDataRecord records" default_factory=list,
json_schema_extra={"description": "List of LoadAkkudoktorDataRecord records"},
) )
@classmethod @classmethod

View File

@@ -22,13 +22,17 @@ class LoadImportCommonSettings(SettingsBaseModel):
import_file_path: Optional[Union[str, Path]] = Field( import_file_path: Optional[Union[str, Path]] = Field(
default=None, default=None,
description="Path to the file to import load data from.", json_schema_extra={
examples=[None, "/path/to/yearly_load.json"], "description": "Path to the file to import load data from.",
"examples": [None, "/path/to/yearly_load.json"],
},
) )
import_json: Optional[str] = Field( import_json: Optional[str] = Field(
default=None, default=None,
description="JSON string, dictionary of load forecast value lists.", json_schema_extra={
examples=['{"load0_mean": [676.71, 876.19, 527.13]}'], "description": "JSON string, dictionary of load forecast value lists.",
"examples": ['{"load0_mean": [676.71, 876.19, 527.13]}'],
},
) )
# Validators # Validators

View File

@@ -27,9 +27,15 @@ class LoadVrmCommonSettings(SettingsBaseModel):
"""Common settings for VRM API.""" """Common settings for VRM API."""
load_vrm_token: str = Field( load_vrm_token: str = Field(
default="your-token", description="Token for Connecting VRM API", examples=["your-token"] default="your-token",
json_schema_extra={
"description": "Token for Connecting VRM API",
"examples": ["your-token"],
},
)
load_vrm_idsite: int = Field(
default=12345, json_schema_extra={"description": "VRM-Installation-ID", "examples": [12345]}
) )
load_vrm_idsite: int = Field(default=12345, description="VRM-Installation-ID", examples=[12345])
class LoadVrm(LoadProvider): class LoadVrm(LoadProvider):

View File

@@ -70,13 +70,17 @@ class PredictionCommonSettings(SettingsBaseModel):
""" """
hours: Optional[int] = Field( hours: Optional[int] = Field(
default=48, ge=0, description="Number of hours into the future for predictions" default=48,
ge=0,
json_schema_extra={"description": "Number of hours into the future for predictions"},
) )
historic_hours: Optional[int] = Field( historic_hours: Optional[int] = Field(
default=48, default=48,
ge=0, ge=0,
description="Number of hours into the past for historical predictions data", json_schema_extra={
"description": "Number of hours into the past for historical predictions data"
},
) )
@@ -107,7 +111,9 @@ class Prediction(PredictionContainer):
WeatherClearOutside, WeatherClearOutside,
WeatherImport, WeatherImport,
] ]
] = Field(default_factory=list, description="List of prediction providers") ] = Field(
default_factory=list, json_schema_extra={"description": "List of prediction providers"}
)
# Initialize forecast providers, all are singletons. # Initialize forecast providers, all are singletons.

View File

@@ -70,26 +70,28 @@ class PredictionSequence(DataSequence):
Derived classes have to provide their own records field with correct record type set. Derived classes have to provide their own records field with correct record type set.
Usage: Usage:
# Example of creating, adding, and using PredictionSequence .. code-block:: python
class DerivedSequence(PredictionSquence):
records: List[DerivedPredictionRecord] = Field(default_factory=list,
description="List of prediction records")
seq = DerivedSequence() # Example of creating, adding, and using PredictionSequence
seq.insert(DerivedPredictionRecord(date_time=datetime.now(), temperature=72)) class DerivedSequence(PredictionSquence):
seq.insert(DerivedPredictionRecord(date_time=datetime.now(), temperature=75)) records: List[DerivedPredictionRecord] = Field(default_factory=list, json_schema_extra={ "description": "List of prediction records" })
# Convert to JSON and back seq = DerivedSequence()
json_data = seq.to_json() seq.insert(DerivedPredictionRecord(date_time=datetime.now(), temperature=72))
new_seq = DerivedSequence.from_json(json_data) seq.insert(DerivedPredictionRecord(date_time=datetime.now(), temperature=75))
# Convert to JSON and back
json_data = seq.to_json()
new_seq = DerivedSequence.from_json(json_data)
# Convert to Pandas Series
series = seq.key_to_series('temperature')
# Convert to Pandas Series
series = seq.key_to_series('temperature')
""" """
# To be overloaded by derived classes. # To be overloaded by derived classes.
records: List[PredictionRecord] = Field( records: List[PredictionRecord] = Field(
default_factory=list, description="List of prediction records" default_factory=list, json_schema_extra={"description": "List of prediction records"}
) )
@@ -225,9 +227,10 @@ class PredictionImportProvider(PredictionProvider, DataImportProvider):
"""Abstract base class for prediction providers that import prediction data. """Abstract base class for prediction providers that import prediction data.
This class is designed to handle prediction data provided in the form of a key-value dictionary. This class is designed to handle prediction data provided in the form of a key-value dictionary.
- **Keys**: Represent identifiers from the record keys of a specific prediction. - **Keys**: Represent identifiers from the record keys of a specific prediction.
- **Values**: Are lists of prediction values starting at a specified `start_datetime`, where - **Values**: Are lists of prediction values starting at a specified `start_datetime`, where
each value corresponds to a subsequent time interval (e.g., hourly). each value corresponds to a subsequent time interval (e.g., hourly).
Subclasses must implement the logic for managing prediction data based on the imported records. Subclasses must implement the logic for managing prediction data based on the imported records.
""" """
@@ -249,5 +252,5 @@ class PredictionContainer(PredictionStartEndKeepMixin, DataContainer):
# To be overloaded by derived classes. # To be overloaded by derived classes.
providers: List[PredictionProvider] = Field( providers: List[PredictionProvider] = Field(
default_factory=list, description="List of prediction providers" default_factory=list, json_schema_extra={"description": "List of prediction providers"}
) )

View File

@@ -23,77 +23,118 @@ pvforecast_providers = [
class PVForecastPlaneSetting(SettingsBaseModel): class PVForecastPlaneSetting(SettingsBaseModel):
"""PV Forecast Plane Configuration.""" """PV Forecast Plane Configuration."""
# latitude: Optional[float] = Field(default=None, description="Latitude in decimal degrees, between -90 and 90, north is positive (ISO 19115) (°)") # latitude: Optional[float] = Field(default=None, json_schema_extra={ "description": "Latitude in decimal degrees, between -90 and 90, north is positive (ISO 19115) (°)" })
surface_tilt: Optional[float] = Field( surface_tilt: Optional[float] = Field(
default=30.0, default=30.0,
ge=0.0, ge=0.0,
le=90.0, le=90.0,
description="Tilt angle from horizontal plane. Ignored for two-axis tracking.", json_schema_extra={
examples=[10.0, 20.0], "description": "Tilt angle from horizontal plane. Ignored for two-axis tracking.",
"examples": [10.0, 20.0],
},
) )
surface_azimuth: Optional[float] = Field( surface_azimuth: Optional[float] = Field(
default=180.0, default=180.0,
ge=0.0, ge=0.0,
le=360.0, le=360.0,
description="Orientation (azimuth angle) of the (fixed) plane. Clockwise from north (north=0, east=90, south=180, west=270).", json_schema_extra={
examples=[180.0, 90.0], "description": "Orientation (azimuth angle) of the (fixed) plane. Clockwise from north (north=0, east=90, south=180, west=270).",
"examples": [180.0, 90.0],
},
) )
userhorizon: Optional[List[float]] = Field( userhorizon: Optional[List[float]] = Field(
default=None, default=None,
description="Elevation of horizon in degrees, at equally spaced azimuth clockwise from north.", json_schema_extra={
examples=[[10.0, 20.0, 30.0], [5.0, 15.0, 25.0]], "description": "Elevation of horizon in degrees, at equally spaced azimuth clockwise from north.",
"examples": [[10.0, 20.0, 30.0], [5.0, 15.0, 25.0]],
},
) )
peakpower: Optional[float] = Field( peakpower: Optional[float] = Field(
default=None, description="Nominal power of PV system in kW.", examples=[5.0, 3.5] default=None,
json_schema_extra={
"description": "Nominal power of PV system in kW.",
"examples": [5.0, 3.5],
},
) )
pvtechchoice: Optional[str] = Field( pvtechchoice: Optional[str] = Field(
default="crystSi", description="PV technology. One of 'crystSi', 'CIS', 'CdTe', 'Unknown'." default="crystSi",
json_schema_extra={
"description": "PV technology. One of 'crystSi', 'CIS', 'CdTe', 'Unknown'."
},
) )
mountingplace: Optional[str] = Field( mountingplace: Optional[str] = Field(
default="free", default="free",
description="Type of mounting for PV system. Options are 'free' for free-standing and 'building' for building-integrated.", json_schema_extra={
"description": "Type of mounting for PV system. Options are 'free' for free-standing and 'building' for building-integrated."
},
)
loss: Optional[float] = Field(
default=14.0, json_schema_extra={"description": "Sum of PV system losses in percent"}
) )
loss: Optional[float] = Field(default=14.0, description="Sum of PV system losses in percent")
trackingtype: Optional[int] = Field( trackingtype: Optional[int] = Field(
default=None, default=None,
ge=0, ge=0,
le=5, le=5,
description="Type of suntracking. 0=fixed, 1=single horizontal axis aligned north-south, 2=two-axis tracking, 3=vertical axis tracking, 4=single horizontal axis aligned east-west, 5=single inclined axis aligned north-south.", json_schema_extra={
examples=[0, 1, 2, 3, 4, 5], "description": "Type of suntracking. 0=fixed, 1=single horizontal axis aligned north-south, 2=two-axis tracking, 3=vertical axis tracking, 4=single horizontal axis aligned east-west, 5=single inclined axis aligned north-south.",
"examples": [0, 1, 2, 3, 4, 5],
},
) )
optimal_surface_tilt: Optional[bool] = Field( optimal_surface_tilt: Optional[bool] = Field(
default=False, default=False,
description="Calculate the optimum tilt angle. Ignored for two-axis tracking.", json_schema_extra={
examples=[False], "description": "Calculate the optimum tilt angle. Ignored for two-axis tracking.",
"examples": [False],
},
) )
optimalangles: Optional[bool] = Field( optimalangles: Optional[bool] = Field(
default=False, default=False,
description="Calculate the optimum tilt and azimuth angles. Ignored for two-axis tracking.", json_schema_extra={
examples=[False], "description": "Calculate the optimum tilt and azimuth angles. Ignored for two-axis tracking.",
"examples": [False],
},
) )
albedo: Optional[float] = Field( albedo: Optional[float] = Field(
default=None, default=None,
description="Proportion of the light hitting the ground that it reflects back.", json_schema_extra={
examples=[None], "description": "Proportion of the light hitting the ground that it reflects back.",
"examples": [None],
},
) )
module_model: Optional[str] = Field( module_model: Optional[str] = Field(
default=None, description="Model of the PV modules of this plane.", examples=[None] default=None,
json_schema_extra={
"description": "Model of the PV modules of this plane.",
"examples": [None],
},
) )
inverter_model: Optional[str] = Field( inverter_model: Optional[str] = Field(
default=None, description="Model of the inverter of this plane.", examples=[None] default=None,
json_schema_extra={
"description": "Model of the inverter of this plane.",
"examples": [None],
},
) )
inverter_paco: Optional[int] = Field( inverter_paco: Optional[int] = Field(
default=None, description="AC power rating of the inverter [W].", examples=[6000, 4000] default=None,
json_schema_extra={
"description": "AC power rating of the inverter [W].",
"examples": [6000, 4000],
},
) )
modules_per_string: Optional[int] = Field( modules_per_string: Optional[int] = Field(
default=None, default=None,
description="Number of the PV modules of the strings of this plane.", json_schema_extra={
examples=[20], "description": "Number of the PV modules of the strings of this plane.",
"examples": [20],
},
) )
strings_per_inverter: Optional[int] = Field( strings_per_inverter: Optional[int] = Field(
default=None, default=None,
description="Number of the strings of the inverter of this plane.", json_schema_extra={
examples=[2], "description": "Number of the strings of the inverter of this plane.",
"examples": [2],
},
) )
@model_validator(mode="after") @model_validator(mode="after")
@@ -124,10 +165,12 @@ class PVForecastCommonProviderSettings(SettingsBaseModel):
"""PV Forecast Provider Configuration.""" """PV Forecast Provider Configuration."""
PVForecastImport: Optional[PVForecastImportCommonSettings] = Field( PVForecastImport: Optional[PVForecastImportCommonSettings] = Field(
default=None, description="PVForecastImport settings", examples=[None] default=None,
json_schema_extra={"description": "PVForecastImport settings", "examples": [None]},
) )
PVForecastVrm: Optional[PVForecastVrmCommonSettings] = Field( PVForecastVrm: Optional[PVForecastVrmCommonSettings] = Field(
default=None, description="PVForecastVrm settings", examples=[None] default=None,
json_schema_extra={"description": "PVForecastVrm settings", "examples": [None]},
) )
@@ -141,72 +184,80 @@ class PVForecastCommonSettings(SettingsBaseModel):
provider: Optional[str] = Field( provider: Optional[str] = Field(
default=None, default=None,
description="PVForecast provider id of provider to be used.", json_schema_extra={
examples=["PVForecastAkkudoktor"], "description": "PVForecast provider id of provider to be used.",
"examples": ["PVForecastAkkudoktor"],
},
) )
provider_settings: PVForecastCommonProviderSettings = Field( provider_settings: PVForecastCommonProviderSettings = Field(
default_factory=PVForecastCommonProviderSettings, default_factory=PVForecastCommonProviderSettings,
description="Provider settings", json_schema_extra={
examples=[ "description": "Provider settings",
# Example 1: Empty/default settings (all providers None) "examples": [
{ # Example 1: Empty/default settings (all providers None)
"PVForecastImport": None, {
"PVForecastVrm": None, "PVForecastImport": None,
}, "PVForecastVrm": None,
], },
],
},
) )
planes: Optional[list[PVForecastPlaneSetting]] = Field( planes: Optional[list[PVForecastPlaneSetting]] = Field(
default=None, default=None,
description="Plane configuration.", json_schema_extra={
examples=[ "description": "Plane configuration.",
[ "examples": [
{ [
"surface_tilt": 10.0, {
"surface_azimuth": 180.0, "surface_tilt": 10.0,
"userhorizon": [10.0, 20.0, 30.0], "surface_azimuth": 180.0,
"peakpower": 5.0, "userhorizon": [10.0, 20.0, 30.0],
"pvtechchoice": "crystSi", "peakpower": 5.0,
"mountingplace": "free", "pvtechchoice": "crystSi",
"loss": 14.0, "mountingplace": "free",
"trackingtype": 0, "loss": 14.0,
"optimal_surface_tilt": False, "trackingtype": 0,
"optimalangles": False, "optimal_surface_tilt": False,
"albedo": None, "optimalangles": False,
"module_model": None, "albedo": None,
"inverter_model": None, "module_model": None,
"inverter_paco": 6000, "inverter_model": None,
"modules_per_string": 20, "inverter_paco": 6000,
"strings_per_inverter": 2, "modules_per_string": 20,
}, "strings_per_inverter": 2,
{ },
"surface_tilt": 20.0, {
"surface_azimuth": 90.0, "surface_tilt": 20.0,
"userhorizon": [5.0, 15.0, 25.0], "surface_azimuth": 90.0,
"peakpower": 3.5, "userhorizon": [5.0, 15.0, 25.0],
"pvtechchoice": "crystSi", "peakpower": 3.5,
"mountingplace": "free", "pvtechchoice": "crystSi",
"loss": 14.0, "mountingplace": "free",
"trackingtype": 1, "loss": 14.0,
"optimal_surface_tilt": False, "trackingtype": 1,
"optimalangles": False, "optimal_surface_tilt": False,
"albedo": None, "optimalangles": False,
"module_model": None, "albedo": None,
"inverter_model": None, "module_model": None,
"inverter_paco": 4000, "inverter_model": None,
"modules_per_string": 20, "inverter_paco": 4000,
"strings_per_inverter": 2, "modules_per_string": 20,
}, "strings_per_inverter": 2,
] },
], ]
],
},
) )
max_planes: Optional[int] = Field( max_planes: Optional[int] = Field(
default=0, default=0,
ge=0, ge=0,
description="Maximum number of planes that can be set", json_schema_extra={
examples=[1, 2], "description": "Maximum number of planes that can be set",
"examples": [1, 2],
},
) )
# Validators # Validators

View File

@@ -16,8 +16,12 @@ from akkudoktoreos.prediction.predictionabc import PredictionProvider, Predictio
class PVForecastDataRecord(PredictionRecord): class PVForecastDataRecord(PredictionRecord):
"""Represents a pvforecast data record containing various pvforecast attributes at a specific datetime.""" """Represents a pvforecast data record containing various pvforecast attributes at a specific datetime."""
pvforecast_dc_power: Optional[float] = Field(default=None, description="Total DC power (W).") pvforecast_dc_power: Optional[float] = Field(
pvforecast_ac_power: Optional[float] = Field(default=None, description="Total AC power (W).") default=None, json_schema_extra={"description": "Total DC power (W)."}
)
pvforecast_ac_power: Optional[float] = Field(
default=None, json_schema_extra={"description": "Total AC power (W)."}
)
class PVForecastProvider(PredictionProvider): class PVForecastProvider(PredictionProvider):
@@ -42,7 +46,8 @@ class PVForecastProvider(PredictionProvider):
# overload # overload
records: List[PVForecastDataRecord] = Field( records: List[PVForecastDataRecord] = Field(
default_factory=list, description="List of PVForecastDataRecord records" default_factory=list,
json_schema_extra={"description": "List of PVForecastDataRecord records"},
) )
@classmethod @classmethod

View File

@@ -12,51 +12,53 @@ Classes:
PVForecastAkkudoktor: Primary class to manage PV power forecasts, handle data retrieval, caching, and integration with Akkudoktor.net. PVForecastAkkudoktor: Primary class to manage PV power forecasts, handle data retrieval, caching, and integration with Akkudoktor.net.
Example: Example:
# Set up the configuration with necessary fields for URL generation .. code-block:: python
settings_data = {
"general": { # Set up the configuration with necessary fields for URL generation
"latitude": 52.52, settings_data = {
"longitude": 13.405, "general": {
}, "latitude": 52.52,
"prediction": { "longitude": 13.405,
"hours": 48, },
"historic_hours": 24, "prediction": {
}, "hours": 48,
"pvforecast": { "historic_hours": 24,
"provider": "PVForecastAkkudoktor", },
"planes": [ "pvforecast": {
{ "provider": "PVForecastAkkudoktor",
"peakpower": 5.0, "planes": [
"surface_azimuth": 170, {
"surface_tilt": 7, "peakpower": 5.0,
"userhorizon": [20, 27, 22, 20], "surface_azimuth": 170,
"inverter_paco": 10000, "surface_tilt": 7,
}, "userhorizon": [20, 27, 22, 20],
{ "inverter_paco": 10000,
"peakpower": 4.8, },
"surface_azimuth": 90, {
"surface_tilt": 7, "peakpower": 4.8,
"userhorizon": [30, 30, 30, 50], "surface_azimuth": 90,
"inverter_paco": 10000, "surface_tilt": 7,
} "userhorizon": [30, 30, 30, 50],
] "inverter_paco": 10000,
}
]
}
} }
}
# Create the config instance from the provided data # Create the config instance from the provided data
config = PVForecastAkkudoktorSettings(**settings_data) config = PVForecastAkkudoktorSettings(**settings_data)
# Initialize the forecast object with the generated configuration # Initialize the forecast object with the generated configuration
forecast = PVForecastAkkudoktor(settings=config) forecast = PVForecastAkkudoktor(settings=config)
# Get an actual forecast # Get an actual forecast
forecast.update_data() forecast.update_data()
# Update the AC power measurement for a specific date and time # Update the AC power measurement for a specific date and time
forecast.update_value(to_datetime(None, to_maxtime=False), "pvforecastakkudoktor_ac_power_measured", 1000.0) forecast.update_value(to_datetime(None, to_maxtime=False), "pvforecastakkudoktor_ac_power_measured", 1000.0)
# Report the DC and AC power forecast along with AC measurements # Report the DC and AC power forecast along with AC measurements
print(forecast.report_ac_power_and_measurement()) print(forecast.report_ac_power_and_measurement())
Attributes: Attributes:
hours (int): Number of hours into the future to forecast. Default is 48. hours (int): Number of hours into the future to forecast. Default is 48.
@@ -157,13 +159,13 @@ class PVForecastAkkudoktorDataRecord(PVForecastDataRecord):
"""Represents a Akkudoktor specific pvforecast data record containing various pvforecast attributes at a specific datetime.""" """Represents a Akkudoktor specific pvforecast data record containing various pvforecast attributes at a specific datetime."""
pvforecastakkudoktor_ac_power_measured: Optional[float] = Field( pvforecastakkudoktor_ac_power_measured: Optional[float] = Field(
default=None, description="Total AC power measured (W)" default=None, json_schema_extra={"description": "Total AC power measured (W)"}
) )
pvforecastakkudoktor_wind_speed_10m: Optional[float] = Field( pvforecastakkudoktor_wind_speed_10m: Optional[float] = Field(
default=None, description="Wind Speed 10m (kmph)" default=None, json_schema_extra={"description": "Wind Speed 10m (kmph)"}
) )
pvforecastakkudoktor_temp_air: Optional[float] = Field( pvforecastakkudoktor_temp_air: Optional[float] = Field(
default=None, description="Temperature (°C)" default=None, json_schema_extra={"description": "Temperature (°C)"}
) )
# Computed fields # Computed fields
@@ -209,7 +211,8 @@ class PVForecastAkkudoktor(PVForecastProvider):
# overload # overload
records: List[PVForecastAkkudoktorDataRecord] = Field( records: List[PVForecastAkkudoktorDataRecord] = Field(
default_factory=list, description="List of PVForecastAkkudoktorDataRecord records" default_factory=list,
json_schema_extra={"description": "List of PVForecastAkkudoktorDataRecord records"},
) )
@classmethod @classmethod

View File

@@ -22,14 +22,18 @@ class PVForecastImportCommonSettings(SettingsBaseModel):
import_file_path: Optional[Union[str, Path]] = Field( import_file_path: Optional[Union[str, Path]] = Field(
default=None, default=None,
description="Path to the file to import PV forecast data from.", json_schema_extra={
examples=[None, "/path/to/pvforecast.json"], "description": "Path to the file to import PV forecast data from.",
"examples": [None, "/path/to/pvforecast.json"],
},
) )
import_json: Optional[str] = Field( import_json: Optional[str] = Field(
default=None, default=None,
description="JSON string, dictionary of PV forecast value lists.", json_schema_extra={
examples=['{"pvforecast_ac_power": [0, 8.05, 352.91]}'], "description": "JSON string, dictionary of PV forecast value lists.",
"examples": ['{"pvforecast_ac_power": [0, 8.05, 352.91]}'],
},
) )
# Validators # Validators

View File

@@ -27,10 +27,14 @@ class PVForecastVrmCommonSettings(SettingsBaseModel):
"""Common settings for VRM API.""" """Common settings for VRM API."""
pvforecast_vrm_token: str = Field( pvforecast_vrm_token: str = Field(
default="your-token", description="Token for Connecting VRM API", examples=["your-token"] default="your-token",
json_schema_extra={
"description": "Token for Connecting VRM API",
"examples": ["your-token"],
},
) )
pvforecast_vrm_idsite: int = Field( pvforecast_vrm_idsite: int = Field(
default=12345, description="VRM-Installation-ID", examples=[12345] default=12345, json_schema_extra={"description": "VRM-Installation-ID", "examples": [12345]}
) )

View File

@@ -23,7 +23,8 @@ class WeatherCommonProviderSettings(SettingsBaseModel):
"""Weather Forecast Provider Configuration.""" """Weather Forecast Provider Configuration."""
WeatherImport: Optional[WeatherImportCommonSettings] = Field( WeatherImport: Optional[WeatherImportCommonSettings] = Field(
default=None, description="WeatherImport settings", examples=[None] default=None,
json_schema_extra={"description": "WeatherImport settings", "examples": [None]},
) )
@@ -32,19 +33,23 @@ class WeatherCommonSettings(SettingsBaseModel):
provider: Optional[str] = Field( provider: Optional[str] = Field(
default=None, default=None,
description="Weather provider id of provider to be used.", json_schema_extra={
examples=["WeatherImport"], "description": "Weather provider id of provider to be used.",
"examples": ["WeatherImport"],
},
) )
provider_settings: WeatherCommonProviderSettings = Field( provider_settings: WeatherCommonProviderSettings = Field(
default_factory=WeatherCommonProviderSettings, default_factory=WeatherCommonProviderSettings,
description="Provider settings", json_schema_extra={
examples=[ "description": "Provider settings",
# Example 1: Empty/default settings (all providers None) "examples": [
{ # Example 1: Empty/default settings (all providers None)
"WeatherImport": None, {
}, "WeatherImport": None,
], },
],
},
) )
# Validators # Validators

View File

@@ -47,48 +47,68 @@ class WeatherDataRecord(PredictionRecord):
""" """
weather_total_clouds: Optional[float] = Field( weather_total_clouds: Optional[float] = Field(
default=None, description="Total Clouds (% Sky Obscured)" default=None, json_schema_extra={"description": "Total Clouds (% Sky Obscured)"}
) )
weather_low_clouds: Optional[float] = Field( weather_low_clouds: Optional[float] = Field(
default=None, description="Low Clouds (% Sky Obscured)" default=None, json_schema_extra={"description": "Low Clouds (% Sky Obscured)"}
) )
weather_medium_clouds: Optional[float] = Field( weather_medium_clouds: Optional[float] = Field(
default=None, description="Medium Clouds (% Sky Obscured)" default=None, json_schema_extra={"description": "Medium Clouds (% Sky Obscured)"}
) )
weather_high_clouds: Optional[float] = Field( weather_high_clouds: Optional[float] = Field(
default=None, description="High Clouds (% Sky Obscured)" default=None, json_schema_extra={"description": "High Clouds (% Sky Obscured)"}
)
weather_visibility: Optional[float] = Field(
default=None, json_schema_extra={"description": "Visibility (m)"}
)
weather_fog: Optional[float] = Field(default=None, json_schema_extra={"description": "Fog (%)"})
weather_precip_type: Optional[str] = Field(
default=None, json_schema_extra={"description": "Precipitation Type"}
) )
weather_visibility: Optional[float] = Field(default=None, description="Visibility (m)")
weather_fog: Optional[float] = Field(default=None, description="Fog (%)")
weather_precip_type: Optional[str] = Field(default=None, description="Precipitation Type")
weather_precip_prob: Optional[float] = Field( weather_precip_prob: Optional[float] = Field(
default=None, description="Precipitation Probability (%)" default=None, json_schema_extra={"description": "Precipitation Probability (%)"}
) )
weather_precip_amt: Optional[float] = Field( weather_precip_amt: Optional[float] = Field(
default=None, description="Precipitation Amount (mm)" default=None, json_schema_extra={"description": "Precipitation Amount (mm)"}
) )
weather_preciptable_water: Optional[float] = Field( weather_preciptable_water: Optional[float] = Field(
default=None, description="Precipitable Water (cm)" default=None, json_schema_extra={"description": "Precipitable Water (cm)"}
)
weather_wind_speed: Optional[float] = Field(
default=None, json_schema_extra={"description": "Wind Speed (kmph)"}
)
weather_wind_direction: Optional[float] = Field(
default=None, json_schema_extra={"description": "Wind Direction (°)"}
)
weather_frost_chance: Optional[str] = Field(
default=None, json_schema_extra={"description": "Chance of Frost"}
)
weather_temp_air: Optional[float] = Field(
default=None, json_schema_extra={"description": "Temperature (°C)"}
)
weather_feels_like: Optional[float] = Field(
default=None, json_schema_extra={"description": "Feels Like (°C)"}
)
weather_dew_point: Optional[float] = Field(
default=None, json_schema_extra={"description": "Dew Point (°C)"}
) )
weather_wind_speed: Optional[float] = Field(default=None, description="Wind Speed (kmph)")
weather_wind_direction: Optional[float] = Field(default=None, description="Wind Direction (°)")
weather_frost_chance: Optional[str] = Field(default=None, description="Chance of Frost")
weather_temp_air: Optional[float] = Field(default=None, description="Temperature (°C)")
weather_feels_like: Optional[float] = Field(default=None, description="Feels Like (°C)")
weather_dew_point: Optional[float] = Field(default=None, description="Dew Point (°C)")
weather_relative_humidity: Optional[float] = Field( weather_relative_humidity: Optional[float] = Field(
default=None, description="Relative Humidity (%)" default=None, json_schema_extra={"description": "Relative Humidity (%)"}
)
weather_pressure: Optional[float] = Field(
default=None, json_schema_extra={"description": "Pressure (mb)"}
)
weather_ozone: Optional[float] = Field(
default=None, json_schema_extra={"description": "Ozone (du)"}
) )
weather_pressure: Optional[float] = Field(default=None, description="Pressure (mb)")
weather_ozone: Optional[float] = Field(default=None, description="Ozone (du)")
weather_ghi: Optional[float] = Field( weather_ghi: Optional[float] = Field(
default=None, description="Global Horizontal Irradiance (W/m2)" default=None, json_schema_extra={"description": "Global Horizontal Irradiance (W/m2)"}
) )
weather_dni: Optional[float] = Field( weather_dni: Optional[float] = Field(
default=None, description="Direct Normal Irradiance (W/m2)" default=None, json_schema_extra={"description": "Direct Normal Irradiance (W/m2)"}
) )
weather_dhi: Optional[float] = Field( weather_dhi: Optional[float] = Field(
default=None, description="Diffuse Horizontal Irradiance (W/m2)" default=None, json_schema_extra={"description": "Diffuse Horizontal Irradiance (W/m2)"}
) )
@@ -114,7 +134,7 @@ class WeatherProvider(PredictionProvider):
# overload # overload
records: List[WeatherDataRecord] = Field( records: List[WeatherDataRecord] = Field(
default_factory=list, description="List of WeatherDataRecord records" default_factory=list, json_schema_extra={"description": "List of WeatherDataRecord records"}
) )
@classmethod @classmethod

View File

@@ -117,17 +117,25 @@ class WeatherClearOutside(WeatherProvider):
Workflow:
1. **Retrieve Web Content**: Uses a helper method to fetch or retrieve cached ClearOutside HTML content.
2. **Extract Forecast Date and Timezone**:
-     - Parses the forecast's start and end dates and the UTC offset from the "Generated" header.
+     - Parses the forecast's start and end dates and the UTC offset from the "Generated"
+       header.
3. **Extract Weather Data**:
    - For each day in the 7-day forecast, the function finds detailed weather parameters
      and associates values for each hour.
-     - Parameters include cloud cover, temperature, humidity, visibility, and precipitation type, among others.
+     - Parameters include cloud cover, temperature, humidity, visibility, and
+       precipitation type, among others.
4. **Irradiance Calculation**:
-     - Calculates irradiance (GHI, DNI, DHI) values using cloud cover data and the `pvlib` library.
+     - Calculates irradiance (GHI, DNI, DHI) values using cloud cover data and the
+       `pvlib` library.
5. **Store Data**:
    - Combines all hourly data into `WeatherDataRecord` objects, with keys
      standardized according to `WeatherDataRecord` attributes.
"""
# Get ClearOutside web content - either from site or cached
response = self._request_forecast(force_update=force_update)  # type: ignore
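Step 4 of this workflow derives GHI, DNI and DHI from cloud cover with pvlib. The following is a hedged sketch of that kind of calculation, not the provider's actual code; the coordinates, the cloud-cover scaling formula and the Erbs decomposition are assumptions for illustration:

import pandas as pd
import pvlib

location = pvlib.location.Location(latitude=52.52, longitude=13.405)  # assumed site
times = pd.date_range("2025-06-01 00:00", periods=24, freq="h", tz="Europe/Berlin")
cloud_cover = pd.Series(50.0, index=times)  # percent, as parsed from the forecast

# Scale clear-sky GHI by cloud cover, then split GHI into DNI/DHI with the Erbs model
clearsky = location.get_clearsky(times)
ghi = clearsky["ghi"] * (1 - 0.75 * (cloud_cover / 100.0) ** 3.4)
solpos = location.get_solarposition(times)
erbs = pvlib.irradiance.erbs(ghi, solpos["zenith"], times)
dni, dhi = erbs["dni"], erbs["dhi"]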

View File

@@ -22,14 +22,18 @@ class WeatherImportCommonSettings(SettingsBaseModel):
import_file_path: Optional[Union[str, Path]] = Field(
    default=None,
-     description="Path to the file to import weather data from.",
-     examples=[None, "/path/to/weather_data.json"],
+     json_schema_extra={
+         "description": "Path to the file to import weather data from.",
+         "examples": [None, "/path/to/weather_data.json"],
+     },
)
import_json: Optional[str] = Field(
    default=None,
-     description="JSON string, dictionary of weather forecast value lists.",
-     examples=['{"weather_temp_air": [18.3, 17.8, 16.9]}'],
+     json_schema_extra={
+         "description": "JSON string, dictionary of weather forecast value lists.",
+         "examples": ['{"weather_temp_air": [18.3, 17.8, 16.9]}'],
+     },
)
# Validators
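Based on the field examples above, a minimal usage sketch; whether import_file_path must point to an existing file depends on the model's validators, so treat the literal values as illustrative:

settings = WeatherImportCommonSettings(
    import_json='{"weather_temp_air": [18.3, 17.8, 16.9]}'
)
# Alternatively, point to a file on disk (path is illustrative):
settings = WeatherImportCommonSettings(
    import_file_path="/path/to/weather_data.json"
)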

View File

@@ -77,6 +77,65 @@ def get_nested_value(
return default
def get_field_extra_dict(
subfield_info: Union[FieldInfo, ComputedFieldInfo],
) -> Dict[str, Any]:
"""Extract json_schema_extra.
Extract regardless of whether it is defined directly
on the field (Pydantic v2) or inherited from v1 compatibility wrappers.
Always returns a dictionary.
"""
# Pydantic v2 location
extra = getattr(subfield_info, "json_schema_extra", None)
if isinstance(extra, dict):
return extra
# Pydantic v1 compatibility fallbacks
fi = getattr(subfield_info, "field_info", None)
if fi is not None:
extra = getattr(fi, "json_schema_extra", None)
if isinstance(extra, dict):
return extra
return {}
def get_description(
subfield_info: Union[FieldInfo, ComputedFieldInfo],
extra: Dict[str, Any],
) -> str:
"""Fetch description.
Priority:
1) json_schema_extra["description"]
2) field_info.description
3) empty string
"""
if "description" in extra:
return str(extra["description"])
desc = getattr(subfield_info, "description", None)
return str(desc) if desc is not None else ""
def get_deprecated(
subfield_info: Union[FieldInfo, ComputedFieldInfo],
extra: Dict[str, Any],
) -> Optional[Any]:
"""Fetch deprecated.
Priority:
1) json_schema_extra["deprecated"]
2) field_info.deprecated
3) None
"""
if "deprecated" in extra:
return extra["deprecated"]
return getattr(subfield_info, "deprecated", None)
def get_default_value(field_info: Union[FieldInfo, ComputedFieldInfo], regular_field: bool) -> Any:
    """Retrieve the default value of a field.
@@ -163,6 +222,7 @@ def configuration(
):
    if found_basic:
        continue
    extra = get_field_extra_dict(subfield_info)
    config: dict[str, Optional[Any]] = {}
    config["name"] = ".".join(values_prefix + parent_types)
@@ -170,12 +230,8 @@ def configuration(
        get_nested_value(values, values_prefix + parent_types, "<unknown>")
    )
    config["default"] = json.dumps(get_default_value(subfield_info, regular_field))
-     config["description"] = (
-         subfield_info.description if subfield_info.description else ""
-     )
-     config["deprecated"] = (
-         subfield_info.deprecated if subfield_info.deprecated else None
-     )
+     config["description"] = get_description(subfield_info, extra)
+     config["deprecated"] = get_deprecated(subfield_info, extra)
    if isinstance(subfield_info, ComputedFieldInfo):
        config["read-only"] = "ro"
        type_description = str(subfield_info.return_type)
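A minimal sketch of how the helpers added above resolve metadata from a Pydantic v2 FieldInfo; the Demo model is illustrative only:

from pydantic import BaseModel, Field

class Demo(BaseModel):  # illustrative model
    port: int = Field(default=8503, json_schema_extra={"description": "Server port"})

info = Demo.model_fields["port"]
extra = get_field_extra_dict(info)    # {'description': 'Server port'}
print(get_description(info, extra))   # Server port
print(get_deprecated(info, extra))    # None (no deprecation info anywhere)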

View File

@@ -153,31 +153,42 @@ class ServerCommonSettings(SettingsBaseModel):
host: Optional[str] = Field(
    default=get_default_host(),
-     description="EOS server IP address. Defaults to 127.0.0.1.",
-     examples=["127.0.0.1", "localhost"],
+     json_schema_extra={
+         "description": "EOS server IP address. Defaults to 127.0.0.1.",
+         "examples": ["127.0.0.1", "localhost"],
+     },
)
port: Optional[int] = Field(
    default=8503,
-     description="EOS server IP port number. Defaults to 8503.",
-     examples=[
-         8503,
-     ],
+     json_schema_extra={
+         "description": "EOS server IP port number. Defaults to 8503.",
+         "examples": [
+             8503,
+         ],
+     },
)
- verbose: Optional[bool] = Field(default=False, description="Enable debug output")
+ verbose: Optional[bool] = Field(
+     default=False, json_schema_extra={"description": "Enable debug output"}
+ )
startup_eosdash: Optional[bool] = Field(
-     default=True, description="EOS server to start EOSdash server. Defaults to True."
+     default=True,
+     json_schema_extra={"description": "EOS server to start EOSdash server. Defaults to True."},
)
eosdash_host: Optional[str] = Field(
    default=None,
-     description="EOSdash server IP address. Defaults to EOS server IP address.",
-     examples=["127.0.0.1", "localhost"],
+     json_schema_extra={
+         "description": "EOSdash server IP address. Defaults to EOS server IP address.",
+         "examples": ["127.0.0.1", "localhost"],
+     },
)
eosdash_port: Optional[int] = Field(
    default=None,
-     description="EOSdash server IP port number. Defaults to EOS server IP port number + 1.",
-     examples=[
-         8504,
-     ],
+     json_schema_extra={
+         "description": "EOSdash server IP port number. Defaults to EOS server IP port number + 1.",
+         "examples": [
+             8504,
+         ],
+     },
)
@field_validator("host", "eosdash_host", mode="before")

View File

@@ -835,31 +835,42 @@ class TimeWindow(BaseModel):
Supports day names in multiple languages via locale-aware parsing.
"""
- start_time: Time = Field(..., description="Start time of the time window (time of day).")
+ start_time: Time = Field(
+     ..., json_schema_extra={"description": "Start time of the time window (time of day)."}
+ )
duration: Duration = Field(
-     ..., description="Duration of the time window starting from `start_time`."
+     ...,
+     json_schema_extra={
+         "description": "Duration of the time window starting from `start_time`."
+     },
)
day_of_week: Optional[Union[int, str]] = Field(
    default=None,
-     description=(
-         "Optional day of the week restriction. "
-         "Can be specified as integer (0=Monday to 6=Sunday) or localized weekday name. "
-         "If None, applies every day unless `date` is set."
-     ),
+     json_schema_extra={
+         "description": (
+             "Optional day of the week restriction. "
+             "Can be specified as integer (0=Monday to 6=Sunday) or localized weekday name. "
+             "If None, applies every day unless `date` is set."
+         )
+     },
)
date: Optional[Date] = Field(
    default=None,
-     description=(
-         "Optional specific calendar date for the time window. Overrides `day_of_week` if set."
-     ),
+     json_schema_extra={
+         "description": (
+             "Optional specific calendar date for the time window. Overrides `day_of_week` if set."
+         )
+     },
)
locale: Optional[str] = Field(
    default=None,
-     description=(
-         "Locale used to parse weekday names in `day_of_week` when given as string. "
-         "If not set, Pendulum's default locale is used. "
-         "Examples: 'en', 'de', 'fr', etc."
-     ),
+     json_schema_extra={
+         "description": (
+             "Locale used to parse weekday names in `day_of_week` when given as string. "
+             "If not set, Pendulum's default locale is used. "
+             "Examples: 'en', 'de', 'fr', etc."
+         )
+     },
)
@field_validator("duration", mode="before")
@@ -1160,7 +1171,8 @@ class TimeWindowSequence(BaseModel):
""" """
windows: Optional[list[TimeWindow]] = Field( windows: Optional[list[TimeWindow]] = Field(
default_factory=list, description="List of TimeWindow objects that make up this sequence." default_factory=list,
json_schema_extra={"description": "List of TimeWindow objects that make up this sequence."},
) )
@field_validator("windows") @field_validator("windows")

View File

@@ -1,3 +1,4 @@
import hashlib
import json
import logging
import os
@@ -7,6 +8,7 @@ import sys
import tempfile
import time
from contextlib import contextmanager
from fnmatch import fnmatch
from http import HTTPStatus
from pathlib import Path
from typing import Generator, Optional, Union
@@ -21,12 +23,14 @@ from loguru import logger
from xprocess import ProcessStarter, XProcess
from akkudoktoreos.config.config import ConfigEOS, get_config
from akkudoktoreos.core.version import _version_hash, version
from akkudoktoreos.server.server import get_default_host
# -----------------------------------------------
# Adapt pytest logging handling to Loguru logging
# -----------------------------------------------
@pytest.fixture
def caplog(caplog: LogCaptureFixture):
    """Propagate Loguru logs to the pytest caplog handler."""
@@ -88,7 +92,7 @@ def disable_debug_logging(scope="session", autouse=True):
def pytest_addoption(parser):
    parser.addoption(
-         "--full-run", action="store_true", default=False, help="Run with all optimization tests."
+         "--finalize", action="store_true", default=False, help="Run with all tests."
    )
    parser.addoption(
        "--check-config-side-effect",
@@ -105,8 +109,8 @@ def pytest_addoption(parser):
@pytest.fixture
- def is_full_run(request):
-     yield bool(request.config.getoption("--full-run"))
+ def is_finalize(request):
+     yield bool(request.config.getoption("--finalize"))
@pytest.fixture(autouse=True)
@@ -123,6 +127,12 @@ def is_system_test(request):
yield bool(request.config.getoption("--system-test")) yield bool(request.config.getoption("--system-test"))
@pytest.fixture
def is_ci() -> bool:
"""Returns True if running on GitHub Actions CI, False otherwise."""
return os.getenv("CI") == "true"
@pytest.fixture
def prediction_eos():
    from akkudoktoreos.prediction.prediction import get_prediction
@@ -528,6 +538,25 @@ def server_setup_for_function(xprocess) -> Generator[dict[str, Union[str, int]],
yield result
# --------------------------------------
# Provide version and hash check support
# --------------------------------------
@pytest.fixture(scope="session")
def version_and_hash() -> Generator[dict[str, Optional[str]], None, None]:
"""Return version info as in in version.py and calculate current hash.
Runs once per test session.
"""
info = version()
info["hash_current"] = _version_hash()
yield info
# After all tests
# ------------------------------
# Provide pytest timezone change
# ------------------------------
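A minimal sketch of how the new --finalize option and the version_and_hash fixture might be consumed in a test; the test name is illustrative:

import pytest

def test_runs_only_on_finalize(is_finalize, version_and_hash):  # illustrative test
    if not is_finalize:
        pytest.skip("Skipping - pytest was not started with --finalize")
    # version_and_hash carries version() info plus the freshly computed source hash
    assert version_and_hash["hash_current"] is not None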

View File

@@ -16,7 +16,6 @@ from akkudoktoreos.core.cache import (
    CacheFileStore,
    cache_energy_management,
    cache_in_file,
-     cachemethod_energy_management,
)
from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_duration
@@ -64,10 +63,10 @@ class TestCacheEnergyManagementStore:
class TestCacheUntilUpdateDecorators:
    def test_cachemethod_energy_management(self, cache_energy_management_store):
-         """Test that cachemethod_energy_management caches method results."""
+         """Test that cache_energy_management caches method results."""
        class MyClass:
-             @cachemethod_energy_management
+             @cache_energy_management
            def compute(self, value: int) -> int:
                return value * 2
@@ -102,7 +101,7 @@ class TestCacheUntilUpdateDecorators:
"""Test that caching works for different arguments.""" """Test that caching works for different arguments."""
class MyClass: class MyClass:
@cachemethod_energy_management @cache_energy_management
def compute(self, value: int) -> int: def compute(self, value: int) -> int:
return value * 2 return value * 2
@@ -123,7 +122,7 @@ class TestCacheUntilUpdateDecorators:
"""Test that cache is cleared between EMS update cycles.""" """Test that cache is cleared between EMS update cycles."""
class MyClass: class MyClass:
@cachemethod_energy_management @cache_energy_management
def compute(self, value: int) -> int: def compute(self, value: int) -> int:
return value * 2 return value * 2

View File

@@ -120,15 +120,6 @@ def test_singleton_behavior(config_eos, config_default_dirs):
assert instance1.general.config_file_path == initial_cfg_file
def test_default_config_path(config_eos, config_default_dirs):
"""Test that the default config file path is computed correctly."""
_, _, config_default_dir_default, _ = config_default_dirs
expected_path = config_default_dir_default.joinpath("default.config.json")
assert config_eos.config_default_file_path == expected_path
assert config_eos.config_default_file_path.is_file()
def test_config_file_priority(config_default_dirs):
    """Test config file priority.

View File

@@ -1,5 +1,6 @@
import json
import os
import shutil
import sys
from pathlib import Path
from unittest.mock import patch
@@ -9,6 +10,9 @@ import pytest
DIR_PROJECT_ROOT = Path(__file__).parent.parent
DIR_TESTDATA = Path(__file__).parent / "testdata"
DIR_DOCS_GENERATED = DIR_PROJECT_ROOT / "docs" / "_generated"
DIR_TEST_GENERATED = DIR_TESTDATA / "docs" / "_generated"
def test_openapi_spec_current(config_eos):
    """Verify the openapi spec hasn´t changed."""
@@ -74,11 +78,14 @@ def test_openapi_md_current(config_eos):
def test_config_md_current(config_eos):
    """Verify the generated configuration markdown hasn´t changed."""
-     expected_config_md_path = DIR_PROJECT_ROOT / "docs" / "_generated" / "config.md"
-     new_config_md_path = DIR_TESTDATA / "config-new.md"
-     with expected_config_md_path.open("r", encoding="utf-8", newline=None) as f_expected:
-         expected_config_md = f_expected.read()
+     assert DIR_DOCS_GENERATED.exists()
+     # Remove any leftover files from last run
+     if DIR_TEST_GENERATED.exists():
+         shutil.rmtree(DIR_TEST_GENERATED)
+     # Ensure test dir exists
+     DIR_TEST_GENERATED.mkdir(parents=True, exist_ok=True)
    # Patch get_config and import within guard to patch global variables within the eos module.
    with patch("akkudoktoreos.config.config.get_config", return_value=config_eos):
@@ -87,17 +94,33 @@ def test_config_md_current(config_eos):
        sys.path.insert(0, str(root_dir))
        from scripts import generate_config_md
-         config_md = generate_config_md.generate_config_md(config_eos)
-         if os.name == "nt":
-             config_md = config_md.replace("\\\\", "/")
-         with new_config_md_path.open("w", encoding="utf-8", newline="\n") as f_new:
-             f_new.write(config_md)
-         try:
-             assert config_md == expected_config_md
-         except AssertionError as e:
-             pytest.fail(
-                 f"Expected {new_config_md_path} to equal {expected_config_md_path}.\n"
-                 + f"If ok: `make gen-docs` or `cp {new_config_md_path} {expected_config_md_path}`\n"
-             )
+         # Get all the top level fields
+         field_names = sorted(config_eos.__class__.model_fields.keys())
+         # Create the file paths
+         expected = [DIR_DOCS_GENERATED / "config.md", DIR_DOCS_GENERATED / "configexample.md"]
+         tested = [DIR_TEST_GENERATED / "config.md", DIR_TEST_GENERATED / "configexample.md"]
+         for field_name in field_names:
+             file_name = f"config{field_name.lower()}.md"
+             expected.append(DIR_DOCS_GENERATED / file_name)
+             tested.append(DIR_TEST_GENERATED / file_name)
+         # Create test files
+         config_md = generate_config_md.generate_config_md(tested[0], config_eos)
+         # Check test files are the same as the expected files
+         for i, expected_path in enumerate(expected):
+             tested_path = tested[i]
+             with expected_path.open("r", encoding="utf-8", newline=None) as f_expected:
+                 expected_config_md = f_expected.read()
+             with tested_path.open("r", encoding="utf-8", newline=None) as f_expected:
+                 tested_config_md = f_expected.read()
+             try:
+                 assert tested_config_md == expected_config_md
+             except AssertionError as e:
+                 pytest.fail(
+                     f"Expected {tested_path} to equal {expected_path}.\n"
+                     + f"If ok: `make gen-docs` or `cp {tested_path} {expected_path}`\n"
+                 )

140
tests/test_docsphinx.py Normal file
View File

@@ -0,0 +1,140 @@
import json
import os
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path
from typing import Optional
import pytest
DIR_PROJECT_ROOT = Path(__file__).absolute().parent.parent
DIR_BUILD = DIR_PROJECT_ROOT / "build"
DIR_BUILD_DOCS = DIR_PROJECT_ROOT / "build" / "docs"
DIR_DOCS = DIR_PROJECT_ROOT / "docs"
DIR_SRC = DIR_PROJECT_ROOT / "src"
HASH_FILE = DIR_BUILD / ".sphinx_hash.json"
def find_sphinx_build() -> str:
venv = os.getenv("VIRTUAL_ENV")
paths = [Path(venv)] if venv else []
paths.append(DIR_PROJECT_ROOT / ".venv")
for base in paths:
cmd = base / ("Scripts" if os.name == "nt" else "bin") / ("sphinx-build.exe" if os.name == "nt" else "sphinx-build")
if cmd.exists():
return str(cmd)
return "sphinx-build"
@pytest.fixture(scope="session")
def sphinx_changed(version_and_hash) -> Optional[str]:
"""Returns new hash if any watched files have changed since last run.
Hash is stored in .sphinx_hash.json.
"""
new_hash = None
# Load previous hash
try:
previous = json.loads(HASH_FILE.read_text())
previous_hash = previous.get("hash")
except Exception:
previous_hash = None
changed = (previous_hash != version_and_hash["hash_current"])
if changed:
new_hash = version_and_hash["hash_current"]
return new_hash
class TestSphinxDocumentation:
"""Test class to verify Sphinx documentation generation.
Ensures no major warnings are emitted.
"""
SPHINX_CMD = [
find_sphinx_build(),
"-M",
"html",
str(DIR_DOCS),
str(DIR_BUILD_DOCS),
]
def _cleanup_autosum_dirs(self):
"""Delete all *_autosum folders inside docs/."""
for folder in DIR_DOCS.rglob("*_autosum"):
if folder.is_dir():
shutil.rmtree(folder)
def _cleanup_build_dir(self):
"""Delete build/docs directory if present."""
if DIR_BUILD_DOCS.exists():
shutil.rmtree(DIR_BUILD_DOCS)
def test_sphinx_build(self, sphinx_changed: Optional[str], is_finalize: bool):
"""Build Sphinx documentation and ensure no major warnings appear in the build output."""
# Ensure docs folder exists
if not DIR_DOCS.exists():
pytest.skip(f"Skipping Sphinx build test - docs folder not present: {DIR_DOCS}")
if not sphinx_changed:
pytest.skip(f"Skipping Sphinx build — no relevant file changes detected: {HASH_FILE}")
if not is_finalize:
pytest.skip("Skipping Sphinx test — not full run")
# Clean directories
self._cleanup_autosum_dirs()
self._cleanup_build_dir()
# Set environment for sphinx run (sphinx will make eos create a config file)
eos_tmp_dir = tempfile.TemporaryDirectory()
eos_dir = str(eos_tmp_dir.name)
env = os.environ.copy()
env["EOS_DIR"] = eos_dir
env["EOS_CONFIG_DIR"] = eos_dir
try:
# Run sphinx-build
project_dir = Path(__file__).parent.parent
process = subprocess.run(
self.SPHINX_CMD,
check=True,
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
text=True,
cwd=project_dir,
)
# Combine output
output = process.stdout + "\n" + process.stderr
returncode = process.returncode
except:
output = f"ERROR: Could not start sphinx-build - {self.SPHINX_CMD}"
returncode = -1
# Remove temporary EOS_DIR
eos_tmp_dir.cleanup()
assert returncode == 0
# Possible markers: ERROR: WARNING: TRACEBACK:
major_markers = ("ERROR:", "TRACEBACK:")
bad_lines = [
line for line in output.splitlines()
if any(marker in line for marker in major_markers)
]
assert not bad_lines, f"Sphinx build contained errors:\n" + "\n".join(bad_lines)
# Update stored hash
HASH_FILE.parent.mkdir(parents=True, exist_ok=True)
HASH_FILE.write_text(json.dumps({"hash": sphinx_changed}, indent=2))

374
tests/test_docstringrst.py Normal file
View File

@@ -0,0 +1,374 @@
import importlib
import importlib.util
import inspect
import pkgutil
import re
import sys
from difflib import SequenceMatcher
from pathlib import Path
from docutils import nodes
from docutils.core import publish_parts
from docutils.frontend import OptionParser
from docutils.parsers.rst import Directive, Parser, directives
from docutils.utils import Reporter, new_document
from sphinx.ext.napoleon import Config as NapoleonConfig
from sphinx.ext.napoleon.docstring import GoogleDocstring
DIR_PROJECT_ROOT = Path(__file__).absolute().parent.parent
DIR_DOCS = DIR_PROJECT_ROOT / "docs"
PACKAGE_NAME = "akkudoktoreos"
# ---------------------------------------------------------------------------
# Location ignore rules (regex)
# ---------------------------------------------------------------------------
# Locations to ignore (regex). Note the escaped dot for literal '.'
IGNORE_LOCATIONS = [
r"\.__new__$",
# Pydantic
r"\.model_copy$",
r"\.model_dump$",
r"\.model_dump_json$",
r"\.field_serializer$",
r"\.field_validator$",
r"\.model_validator$",
r"\.computed_field$",
r"\.Field$",
r"\.FieldInfo.*",
r"\.ComputedFieldInfo.*",
r"\.PrivateAttr$",
# pathlib
r"\.Path.*",
# MarkdownIt
r"\.MarkdownIt.*",
# FastAPI
r"\.FastAPI.*",
r"\.FileResponse.*",
r"\.PdfResponse.*",
r"\.HTTPException$",
# bokeh
r"\.bokeh.*",
r"\.figure.*",
r"\.ColumnDataSource.*",
r"\.LinearAxis.*",
r"\.Range1d.*",
# BeautifulSoup
r"\.BeautifulSoup.*",
# ExponentialSmoothing
r"\.ExponentialSmoothing.*",
# Pendulum
r"\.Date$",
r"\.DateTime$",
r"\.Duration$",
# ABC
r"\.abstractmethod$",
# numpytypes
r"\.NDArray$",
# typing
r"\.ParamSpec",
r"\.TypeVar",
r"\.Annotated",
# contextlib
r"\.asynccontextmanager$",
# concurrent
r"\.ThreadPoolExecutor.*",
# asyncio
r"\.Lock.*",
# scipy
r"\.RegularGridInterpolator.*",
# pylogging
r"\.InterceptHandler.filter$",
# itertools
r"\.chain$",
# functools
r"\.partial$",
# fnmatch
r"\.fnmatch$",
]
# ---------------------------------------------------------------------------
# Error message ignore rules by location (regex)
# ---------------------------------------------------------------------------
IGNORE_ERRORS_BY_LOCATION = {
r"^akkudoktoreos.*": [
r"Unexpected possible title overline or transition.*",
],
}
# --- Use your global paths ---
conf_path = DIR_DOCS / "conf.py"
spec = importlib.util.spec_from_file_location("sphinx_conf", conf_path)
if spec is None:
raise AssertionError(f"Can not import sphinx_conf from {conf_path}")
sphinx_conf = importlib.util.module_from_spec(spec)
sys.modules["sphinx_conf"] = sphinx_conf
if spec.loader is None:
raise AssertionError(f"Can not import sphinx_conf from {conf_path}")
spec.loader.exec_module(sphinx_conf)
# Build NapoleonConfig with all options
napoleon_config = NapoleonConfig(
napoleon_google_docstring=getattr(sphinx_conf, "napoleon_google_docstring", True),
napoleon_numpy_docstring=getattr(sphinx_conf, "napoleon_numpy_docstring", False),
napoleon_include_init_with_doc=getattr(sphinx_conf, "napoleon_include_init_with_doc", False),
napoleon_include_private_with_doc=getattr(sphinx_conf, "napoleon_include_private_with_doc", False),
napoleon_include_special_with_doc=getattr(sphinx_conf, "napoleon_include_special_with_doc", True),
napoleon_use_admonition_for_examples=getattr(sphinx_conf, "napoleon_use_admonition_for_examples", False),
napoleon_use_admonition_for_notes=getattr(sphinx_conf, "napoleon_use_admonition_for_notes", False),
napoleon_use_admonition_for_references=getattr(sphinx_conf, "napoleon_use_admonition_for_references", False),
napoleon_use_ivar=getattr(sphinx_conf, "napoleon_use_ivar", False),
napoleon_use_param=getattr(sphinx_conf, "napoleon_use_param", True),
napoleon_use_rtype=getattr(sphinx_conf, "napoleon_use_rtype", True),
napoleon_preprocess_types=getattr(sphinx_conf, "napoleon_preprocess_types", False),
napoleon_type_aliases=getattr(sphinx_conf, "napoleon_type_aliases", None),
napoleon_attr_annotations=getattr(sphinx_conf, "napoleon_attr_annotations", True),
)
FENCE_RE = re.compile(r"^```(\w*)\s*$")
def replace_fenced_code_blocks(doc: str) -> tuple[str, bool]:
"""Replace fenced code blocks (```lang) in a docstring with RST code-block syntax.
Returns:
(new_doc, changed):
new_doc: The docstring with replacements applied
changed: True if any fenced block was replaced
"""
out_lines = []
inside = False
lang = ""
buffer: list[str] = []
changed = False
lines = doc.split("\n")
for line in lines:
stripped = line.strip()
# Detect opening fence: ``` or ```python
m = FENCE_RE.match(stripped)
if m and not inside:
inside = True
lang = m.group(1) or ""
# Write RST code-block header
if lang:
out_lines.append(f" .. code-block:: {lang}")
else:
out_lines.append(" .. code-block::")
out_lines.append("") # blank line required by RST
changed = True
continue
# Detect closing fence ```
if stripped == "```" and inside:
# Emit fenced code content with indentation
for b in buffer:
out_lines.append(" " + b)
out_lines.append("") # trailing blank line to close environment
inside = False
buffer = []
continue
if inside:
buffer.append(line)
else:
out_lines.append(line)
# If doc ended while still in fenced code, flush
if inside:
changed = True
for b in buffer:
out_lines.append(" " + b)
out_lines.append("")
inside = False
return "\n".join(out_lines), changed
def prepare_docutils_for_sphinx():
class NoOpDirective(Directive):
has_content = True
required_arguments = 0
optional_arguments = 100
final_argument_whitespace = True
def run(self):
return []
for d in ["attribute", "data", "method", "function", "class", "event", "todo"]:
directives.register_directive(d, NoOpDirective)
def validate_rst(text: str) -> list[tuple[int, str]]:
"""Validate a string as reStructuredText.
Returns a list of tuples: (line_number, message).
"""
if not text or not text.strip():
return []
warnings: list[tuple[int, str]] = []
class RecordingReporter(Reporter):
"""Capture warnings/errors instead of halting."""
def system_message(self, level, message, *children, **kwargs):
line = kwargs.get("line", None)
warnings.append((line or 0, message))
return nodes.system_message(message, level=level, type=self.levels[level], *children, **kwargs)
# Create default settings
settings = OptionParser(components=(Parser,)).get_default_values()
document = new_document("<docstring>", settings=settings)
# Attach custom reporter
document.reporter = RecordingReporter(
source="<docstring>",
report_level=1, # capture warnings and above
halt_level=100, # never halt
stream=None,
debug=False
)
parser = Parser()
parser.parse(text, document)
return warnings
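For illustration, validating a deliberately broken snippet returns (line, message) tuples, for example:

warnings = validate_rst("Title\n===\n\nBody text.")
# The too-short title underline is reported roughly as:
# [(2, 'Title underline too short.')]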
def iter_docstrings(package_name: str):
"""Yield docstrings of modules, classes, functions in the given package."""
package = importlib.import_module(package_name)
for module_info in pkgutil.walk_packages(package.__path__, package.__name__ + "."):
module = importlib.import_module(module_info.name)
# Module docstring
if module.__doc__:
yield f"Module {module.__name__}", inspect.getdoc(module)
# Classes + methods
for _, obj in inspect.getmembers(module):
if inspect.isclass(obj) or inspect.isfunction(obj):
if obj.__doc__:
yield f"{module.__name__}.{obj.__name__}", inspect.getdoc(obj)
# Methods of classes
if inspect.isclass(obj):
for _, meth in inspect.getmembers(obj, inspect.isfunction):
if meth.__doc__:
yield f"{module.__name__}.{obj.__name__}.{meth.__name__}", inspect.getdoc(meth)
def map_converted_to_original(orig: str, conv: str) -> dict[int,int]:
"""Map original docstring line to converted docstring line.
Returns:
mapping: key = converted line index (0-based), value = original line index (0-based).
"""
orig_lines = orig.splitlines()
conv_lines = conv.splitlines()
matcher = SequenceMatcher(None, orig_lines, conv_lines)
line_map = {}
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
if tag in ("equal", "replace"):
for o, c in zip(range(i1, i2), range(j1, j2)):
line_map[c] = o
elif tag == "insert":
for c in range(j1, j2):
line_map[c] = max(i1 - 1, 0)
return line_map
def test_all_docstrings_rst_compliant():
"""All docstrings must be valid reStructuredText."""
failures = []
for location, doc in iter_docstrings(PACKAGE_NAME):
# Skip ignored locations
if any(re.search(pat, location) for pat in IGNORE_LOCATIONS):
continue
# convert like sphinx napoleon does
doc_converted = str(GoogleDocstring(doc, napoleon_config))
# Register directives that sphinx knows - just to avoid errors
prepare_docutils_for_sphinx()
# Validate
messages = validate_rst(doc_converted)
if not messages:
continue
# Map converted line numbers back to original docstring
line_map = map_converted_to_original(doc, doc_converted)
# Filter messages
filtered_messages = []
ignore_msg_patterns = []
for loc_pattern, patterns in IGNORE_ERRORS_BY_LOCATION.items():
if re.search(loc_pattern, location):
ignore_msg_patterns.extend(patterns)
for conv_line, msg_text in messages:
orig_line = line_map.get(conv_line - 1, conv_line - 1) + 1
if any(re.search(pat, msg_text) for pat in ignore_msg_patterns):
continue
filtered_messages.append((orig_line, msg_text))
if filtered_messages:
failures.append((location, filtered_messages, doc, doc_converted))
# Raise AssertionError with nicely formatted output
if failures:
msg = "Invalid reST docstrings (see https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html for valid format):\n"
for location, errors, doc, doc_converted in failures:
msg += f"\n--- {location} ---\n"
msg += "\nConverted by Sphinx Napoleon:\n"
doc_lines = doc_converted.splitlines()
for i, line_content in enumerate(doc_lines, start=1):
line_str = f"{i:2}" # fixed-width
msg += f" L{line_str}: {line_content}\n"
msg += "\nOriginal:\n"
doc_lines = doc.splitlines()
error_map = {line: err for line, err in errors}
for i, line_content in enumerate(doc_lines, start=1):
line_str = f"{i:2}" # fixed-width
if i in error_map:
msg += f">>> L{line_str}: {line_content} <-- {error_map[i]}\n"
else:
msg += f" L{line_str}: {line_content}\n"
doc_fixed, changed = replace_fenced_code_blocks(doc)
if changed:
msg += "\nImproved for fenced code blocks:\n"
msg += '"""' + doc_fixed + '\n"""\n'
msg += f"Total: {len(failures)} docstrings"
raise AssertionError(msg)

View File

@@ -173,11 +173,20 @@ def test_request_forecast_status_codes(
    provider._request_forecast()
@patch("requests.get")
@patch("akkudoktoreos.core.cache.CacheFileStore")
- def test_cache_integration(mock_cache, provider):
+ def test_cache_integration(mock_cache, mock_get, provider, sample_akkudoktor_1_json):
    """Test caching of 8-day electricity price data."""
    # Mock response object
    mock_response = Mock()
    mock_response.status_code = 200
    mock_response.content = json.dumps(sample_akkudoktor_1_json)
    mock_get.return_value = mock_response
    # Mock cache object
    mock_cache_instance = mock_cache.return_value
    mock_cache_instance.get.return_value = None  # Simulate no cache
    provider._update_data(force_update=True)
    mock_cache_instance.create.assert_called_once()
    mock_cache_instance.get.assert_called_once()

View File

@@ -167,11 +167,20 @@ def test_request_forecast_status_codes(
    provider._request_forecast()
@patch("requests.get")
@patch("akkudoktoreos.core.cache.CacheFileStore")
- def test_cache_integration(mock_cache, provider):
+ def test_cache_integration(mock_cache, mock_get, provider, sample_energycharts_json):
    """Test caching of 8-day electricity price data."""
    # Mock response object
    mock_response = Mock()
    mock_response.status_code = 200
    mock_response.content = json.dumps(sample_energycharts_json)
    mock_get.return_value = mock_response
    # Mock cache object
    mock_cache_instance = mock_cache.return_value
    mock_cache_instance.get.return_value = None  # Simulate no cache
    provider._update_data(force_update=True)
    mock_cache_instance.create.assert_called_once()
    mock_cache_instance.get.assert_called_once()
@@ -195,7 +204,7 @@ def test_key_to_array_resampling(provider):
@pytest.mark.skip(reason="For development only")
- def test_akkudoktor_development_forecast_data(provider):
+ def test_energycharts_development_forecast_data(provider):
    """Fetch data from real Energy-Charts server."""
    # Preset, as this is usually done by update_data()
    provider.ems_start_datetime = to_datetime("2024-10-26 00:00:00")

View File

@@ -18,11 +18,9 @@ def provider(sample_import_1_json, config_eos):
    settings = {
        "elecprice": {
            "provider": "ElecPriceImport",
-             "provider_settings": {
-                 "ElecPriceImport": {
-                     "import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
-                     "import_json": json.dumps(sample_import_1_json),
-                 },
-             },
+             "elecpriceimport": {
+                 "import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
+                 "import_json": json.dumps(sample_import_1_json),
+             },
        }
    }
@@ -56,10 +54,8 @@ def test_invalid_provider(provider, config_eos):
    settings = {
        "elecprice": {
            "provider": "<invalid>",
-             "provider_settings": {
-                 "ElecPriceImport": {
-                     "import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
-                 },
-             },
+             "elecpriceimport": {
+                 "import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
+             },
        }
    }
@@ -90,11 +86,11 @@ def test_import(provider, sample_import_1_json, start_datetime, from_file, confi
    ems_eos = get_ems()
    ems_eos.set_start_datetime(to_datetime(start_datetime, in_timezone="Europe/Berlin"))
    if from_file:
-         config_eos.elecprice.provider_settings.ElecPriceImport.import_json = None
-         assert config_eos.elecprice.provider_settings.ElecPriceImport.import_json is None
+         config_eos.elecprice.elecpriceimport.import_json = None
+         assert config_eos.elecprice.elecpriceimport.import_json is None
    else:
-         config_eos.elecprice.provider_settings.ElecPriceImport.import_file_path = None
-         assert config_eos.elecprice.provider_settings.ElecPriceImport.import_file_path is None
+         config_eos.elecprice.elecpriceimport.import_file_path = None
+         assert config_eos.elecprice.elecpriceimport.import_file_path is None
    provider.clear()
    # Call the method

View File

@@ -50,7 +50,7 @@ def test_optimize(
    fn_out: str,
    ngen: int,
    config_eos: ConfigEOS,
-     is_full_run: bool,
+     is_finalize: bool,
):
    """Test optimierung_ems."""
    # Test parameters
@@ -107,8 +107,8 @@ def test_optimize(
    genetic_optimization = GeneticOptimization(fixed_seed=fixed_seed)
-     # Activate with pytest --full-run
-     if ngen > 10 and not is_full_run:
+     # Activate with pytest --finalize
+     if ngen > 10 and not is_finalize:
        pytest.skip()
    visualize_filename = str((DIR_TESTDATA / f"new_{fn_out}").with_suffix(".pdf"))

Some files were not shown because too many files have changed in this diff.