fix: test break docs and on data compaction (2) (#902)

Ensure that the snapping sequence generated in the test fixture
is within the boundaries expected by the test.

Ensure the _version_date.py info is read as a UTC datetime and that
no localtime conversion is applied.

Guard test_version.py against modifying the version date file.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
This commit is contained in:
Bobby Noelte
2026-02-24 18:56:11 +01:00
committed by GitHub
parent 90e2e8af7e
commit 378377b1ce
11 changed files with 119 additions and 53 deletions

View File

@@ -188,7 +188,7 @@ prepare-version: install
 	$(PYTHON) ./scripts/generate_config_md.py --output-file docs/_generated/config.md
 	$(PYTHON) ./scripts/generate_openapi_md.py --output-file docs/_generated/openapi.md
 	$(PYTHON) ./scripts/generate_openapi.py --output-file openapi.json
-	$(PYTEST) -vv --finalize tests/test_version.py
+	$(PYTEST) -vv --finalize tests/test_doc.py

 test-version:
 	echo "Test version information to be correctly set in all version files"

View File

@@ -6,7 +6,7 @@
 # the root directory (no add-on folder as usual).
 name: "Akkudoktor-EOS"
-version: "0.2.0.dev2602240695620513"
+version: "0.2.0.dev2602241754328029"
 slug: "eos"
 description: "Akkudoktor-EOS add-on"
 url: "https://github.com/Akkudoktor-EOS/EOS"

View File

@@ -120,7 +120,7 @@
     }
   },
   "general": {
-    "version": "0.2.0.dev2602240695620513",
+    "version": "0.2.0.dev2602241754328029",
     "data_folder_path": "/home/user/.local/share/net.akkudoktoreos.net",
     "data_output_subpath": "output",
     "latitude": 52.52,

View File

@@ -16,7 +16,7 @@
 | latitude | `EOS_GENERAL__LATITUDE` | `Optional[float]` | `rw` | `52.52` | Latitude in decimal degrees between -90 and 90. North is positive (ISO 19115) (°) |
 | longitude | `EOS_GENERAL__LONGITUDE` | `Optional[float]` | `rw` | `13.405` | Longitude in decimal degrees within -180 to 180 (°) |
 | timezone | | `Optional[str]` | `ro` | `N/A` | Computed timezone based on latitude and longitude. |
-| version | `EOS_GENERAL__VERSION` | `str` | `rw` | `0.2.0.dev2602240695620513` | Configuration file version. Used to check compatibility. |
+| version | `EOS_GENERAL__VERSION` | `str` | `rw` | `0.2.0.dev2602241754328029` | Configuration file version. Used to check compatibility. |
 :::
 <!-- pyml enable line-length -->
@@ -28,7 +28,7 @@
 ```json
 {
   "general": {
-    "version": "0.2.0.dev2602240695620513",
+    "version": "0.2.0.dev2602241754328029",
     "data_folder_path": "/home/user/.local/share/net.akkudoktoreos.net",
     "data_output_subpath": "output",
     "latitude": 52.52,
@@ -46,7 +46,7 @@
 ```json
 {
   "general": {
-    "version": "0.2.0.dev2602240695620513",
+    "version": "0.2.0.dev2602241754328029",
     "data_folder_path": "/home/user/.local/share/net.akkudoktoreos.net",
     "data_output_subpath": "output",
     "latitude": 52.52,

View File

@@ -1,6 +1,6 @@
 # Akkudoktor-EOS

-**Version**: `v0.2.0.dev2602240695620513`
+**Version**: `v0.2.0.dev2602241754328029`

 <!-- pyml disable line-length -->
 **Description**: This project provides a comprehensive solution for simulating and optimizing an energy system based on renewable energy sources. With a focus on photovoltaic (PV) systems, battery storage (batteries), load management (consumer requirements), heat pumps, electric vehicles, and consideration of electricity price data, this system enables forecasting and optimization of energy flow and costs over a specified period.

View File

@@ -8,7 +8,7 @@
       "name": "Apache 2.0",
       "url": "https://www.apache.org/licenses/LICENSE-2.0.html"
     },
-    "version": "v0.2.0.dev2602240695620513"
+    "version": "v0.2.0.dev2602241754328029"
   },
   "paths": {
     "/v1/admin/cache/clear": {
@@ -4451,7 +4451,7 @@
           "type": "string",
           "title": "Version",
           "description": "Configuration file version. Used to check compatibility.",
-          "default": "0.2.0.dev2602240695620513"
+          "default": "0.2.0.dev2602241754328029"
         },
         "data_folder_path": {
           "type": "string",
@@ -4514,7 +4514,7 @@
           "type": "string",
           "title": "Version",
           "description": "Configuration file version. Used to check compatibility.",
-          "default": "0.2.0.dev2602240695620513"
+          "default": "0.2.0.dev2602241754328029"
         },
         "data_folder_path": {
           "type": "string",

View File

@@ -8,11 +8,14 @@ Usage:
 #!/usr/bin/env python3
 import re
 import sys
+from datetime import timezone
 from pathlib import Path
 from typing import List

-# Add the src directory to sys.path so import akkudoktoreos works in all cases
 PROJECT_ROOT = Path(__file__).parent.parent
+PACKAGE_DIR = PROJECT_ROOT / "src" / "akkudoktoreos"
+
+# Add the src directory to sys.path so import akkudoktoreos works in all cases
 SRC_DIR = PROJECT_ROOT / "src"
 sys.path.insert(0, str(SRC_DIR))
@@ -95,17 +98,19 @@ def update_version_in_file(file_path: Path, new_version: str) -> bool:
 def update_version_date_file() -> str:
-    """Write current version date to __version_date__.py"""
+    """Write current version date to _version_date.py, only if changed."""
     from akkudoktoreos.core.version import VERSION_DATE_FILE, _version_date_hash

     version_date, _ = _version_date_hash()
-    version_date_str = version_date.strftime('%Y-%m-%dT%H:%M:%SZ')
-    content = f'VERSION_DATE = "{version_date_str}"\n'
-    VERSION_DATE_FILE.write_text(content)
+    version_date_utc = version_date.astimezone(timezone.utc)
+    version_date_str = version_date_utc.isoformat()
+    new_content = f'VERSION_DATE = "{version_date_str}"\n'
+
+    if VERSION_DATE_FILE.exists() and VERSION_DATE_FILE.read_text(encoding="utf-8") == new_content:
+        print(f"No change to {VERSION_DATE_FILE}")
+        return str(VERSION_DATE_FILE)
+
+    VERSION_DATE_FILE.write_text(new_content, encoding="utf-8")
     print(f"Updated {VERSION_DATE_FILE} with UTC date {version_date_str}")
     return str(VERSION_DATE_FILE)
@@ -124,10 +129,18 @@ def main(version: str, files: List[str]):
         if update_version_in_file(path, version):
             updated_files.append(str(path))

-    updated_files.append(update_version_date_file())
-
     if updated_files:
         print(f"Updated files: {', '.join(updated_files)}")
+
+        # Only update VERSION_DATE_FILE if a real package file was touched
+        # Exclude VERSION_DATE_FILE itself to avoid a self-referencing loop
+        from akkudoktoreos.core.version import VERSION_DATE_FILE
+
+        package_files_updated = any(
+            str(PACKAGE_DIR) in f and Path(f).resolve() != VERSION_DATE_FILE.resolve()
+            for f in updated_files
+        )
+        if package_files_updated:
+            updated_files.append(update_version_date_file())
     else:
         print("No files updated.")
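The rewritten `update_version_date_file` above combines two behaviors: serialize the date as UTC ISO-8601, and skip the write entirely when the content is unchanged. A minimal standalone sketch of that write-if-changed pattern (the `write_if_changed` helper and file name are illustrative, not part of the repo):

```python
import tempfile
from datetime import datetime, timezone
from pathlib import Path


def write_if_changed(path: Path, stamp: datetime) -> bool:
    """Write a VERSION_DATE line as UTC ISO-8601, skipping identical rewrites.

    Returns True when the file was (re)written, False when it already held
    the same content. Skipping the rewrite keeps the file's mtime stable,
    which is what lets a guard fixture detect unexpected modifications.
    """
    content = f'VERSION_DATE = "{stamp.astimezone(timezone.utc).isoformat()}"\n'
    if path.exists() and path.read_text(encoding="utf-8") == content:
        return False
    path.write_text(content, encoding="utf-8")
    return True


# Usage sketch against a throwaway file:
target = Path(tempfile.mkdtemp()) / "_version_date.py"
stamp = datetime(2026, 2, 24, 16, 58, 0, tzinfo=timezone.utc)
first = write_if_changed(target, stamp)   # file absent -> written
second = write_if_changed(target, stamp)  # identical content -> skipped
assert (first, second) == (True, False)
```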

View File

@@ -1 +1 @@
-VERSION_DATE = "2026-02-24T06:03:36Z"
+VERSION_DATE = "2026-02-24T16:58:00Z"

View File

@@ -247,7 +247,10 @@ def newest_commit_or_dirty_datetime(files: list[Path]) -> datetime:
         exec(VERSION_DATE_FILE.read_text(), {}, ns)  # noqa: S102
         date_str = ns.get("VERSION_DATE")
         if date_str:
-            return datetime.fromisoformat(date_str).astimezone(timezone.utc)
+            dt = datetime.fromisoformat(date_str)
+            if dt.tzinfo is None:
+                dt = dt.replace(tzinfo=timezone.utc)  # treat naive as UTC, don't convert
+            return dt.astimezone(timezone.utc)
     except Exception:  # noqa: S110
         pass
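The guard above matters because calling `.astimezone()` directly on a naive `datetime` first attaches the machine's *local* timezone, silently shifting the instant on non-UTC hosts. A small sketch of the same normalization in isolation (`parse_version_date` is an illustrative name, not the repo's):

```python
from datetime import datetime, timezone


def parse_version_date(date_str: str) -> datetime:
    """Parse an ISO-8601 VERSION_DATE string, treating a naive value as UTC."""
    dt = datetime.fromisoformat(date_str)
    if dt.tzinfo is None:
        # Attach UTC without converting; calling .astimezone() directly on a
        # naive datetime would interpret it in the host's local timezone.
        dt = dt.replace(tzinfo=timezone.utc)
    return dt.astimezone(timezone.utc)


# A naive stamp and its explicit-UTC spelling normalize to the same instant:
naive = parse_version_date("2026-02-24T16:58:00")
aware = parse_version_date("2026-02-24T16:58:00+00:00")
assert naive == aware and naive.tzinfo is not None
```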

View File

@@ -809,32 +809,45 @@ class TestDataSequenceSparseGuard:
         insert a "newest anchor" record 1 second before now so that
         db_max ≈ now, making cutoff = db_max - age_threshold ≈ now - age_minutes.

-        The test records are placed at now - (age_minutes + margin) + offset,
-        which puts them clearly before the cutoff and inside the compaction window.
-        resampled_count = age_minutes / interval_minutes (the window width in
-        buckets). We require len(offsets_minutes) > resampled_count so the
-        snapping path is entered rather than the pure-skip path.
+        Critically, _db_compact_tier FLOORS the cutoff to the interval boundary:
+            window_end_epoch = floor(anchor_epoch - age_sec, interval_sec)
+        We replicate that exact floor here so that all test records are
+        guaranteed to land before window_end regardless of what wall-clock
+        time the test runs at (UTC CI vs. local non-UTC machines).
+
+        The test records are placed at base + offset_minutes where base is
+        chosen so that base + max(offsets) < window_end.
+        resampled_count = window_width / interval_sec (ceiling).
+        We require len(offsets_minutes) > resampled_count so the snapping
+        path is entered rather than the pure-skip path.

         Returns (seq, age_threshold, target_interval, record_datetimes).
         """
         age_td = to_duration(f"{age_minutes} minutes")
         interval_td = to_duration(f"{interval_minutes} minutes")
         interval_sec = interval_minutes * 60
+        age_sec = age_minutes * 60

-        # Margin must be larger than the maximum offset so that ALL test records
-        # land before window_end = floor(now - age_minutes, interval_sec).
-        # We need: base + max(offsets) < now - age_minutes
-        #   => now - (age_minutes + margin) + max(offsets) < now - age_minutes
-        #   => max(offsets) < margin
-        # Use margin = max(offsets_minutes) + 2*interval_minutes + 1 (generous).
+        # Replicate the exact window_end the implementation will compute:
+        #   anchor = now - 1s
+        #   raw_cutoff = anchor - age_td
+        #   window_end = floor(raw_cutoff, interval_sec)
+        anchor_epoch = int(now.subtract(seconds=1).timestamp())
+        raw_cutoff_epoch = anchor_epoch - age_sec
+        window_end_epoch = (raw_cutoff_epoch // interval_sec) * interval_sec
+
+        # Place base interval_sec before window_end so all records
+        # (base + max_offset) are safely inside [window_start, window_end).
+        # We need: base_epoch + max(offsets)*60 < window_end_epoch
+        # Use: base_epoch = window_end_epoch - (max_offset + 2*interval_minutes + 1) * 60
+        # Then floor base to interval boundary.
         max_offset = max(offsets_minutes) if offsets_minutes else 0
-        margin = max_offset + 2 * interval_minutes + 1
-
-        # Floor base to interval boundary so snapping arithmetic is exact
-        raw_base = now.subtract(minutes=age_minutes + margin).set(second=0, microsecond=0)
-        base_epoch = int(raw_base.timestamp())
-        base = raw_base.subtract(seconds=base_epoch % interval_sec)
+        margin_sec = (max_offset + 2 * interval_minutes + 1) * 60
+        raw_base_epoch = window_end_epoch - margin_sec
+        base_epoch = (raw_base_epoch // interval_sec) * interval_sec
+        base = DateTime.fromtimestamp(base_epoch, tz="UTC")

         seq = EnergySequence()
         dts = []
@@ -858,12 +871,10 @@ class TestDataSequenceSparseGuard:
         """
         now = to_datetime().in_timezone("UTC")
         # 4 records at :03, :08, :13, :18 — all misaligned for a 10-min interval
-        seq, age_td, interval_td, _ = self._make_snapping_seq(
+        seq, age_td, interval_td, dts = self._make_snapping_seq(
            now, offsets_minutes=[3, 8, 13, 18]
         )
-        # before includes the anchor record which is NOT in the compaction window
-        # and therefore NOT deleted. Only the 4 test records are in-window.
-        n_test_records = len([3, 8, 13, 18])  # offsets_minutes
+        n_test_records = len([3, 8, 13, 18])

         deleted = seq._db_compact_tier(age_td, interval_td)
         after = seq.db_count_records()
@@ -871,16 +882,17 @@ class TestDataSequenceSparseGuard:
             f"All {n_test_records} in-window records must be deleted (whole-window delete); "
             f"got deleted={deleted}"
         )
-        # Net count after: anchor(1) + snapped buckets re-inserted.
-        # Implementation uses FLOOR division: (epoch // interval_sec) * interval_sec
-        # offsets [3,8,13,18] with interval=10min map to buckets:
-        #   3 // 10 = 0 → :00
-        #   8 // 10 = 0 → :00 (collision with :03)
-        #   13 // 10 = 1 → :10
-        #   18 // 10 = 1 → :10 (collision with :13)
-        # → 2 unique buckets
-        interval_minutes = 10
-        n_snapped = len({(off // interval_minutes) * interval_minutes for off in [3, 8, 13, 18]})
+        # Compute expected snapped buckets using the ABSOLUTE epochs of the
+        # inserted records (same arithmetic _db_compact_tier uses), not
+        # offset-relative floor division. This is correct on any host timezone.
+        interval_sec = 10 * 60
+        snapped_buckets = {
+            (int(dt.timestamp()) // interval_sec) * interval_sec
+            for dt in dts
+        }
+        n_snapped = len(snapped_buckets)
         assert after == 1 + n_snapped, (
             f"Expected 1 anchor + {n_snapped} snapped buckets = {1 + n_snapped} records; "
             f"got {after}"
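The floor-to-boundary bucketing that the fixture and test now share can be checked in a few lines. The epoch value below is illustrative; only the floor-division pattern comes from the diff:

```python
def floor_to_interval(epoch: int, interval_sec: int) -> int:
    """Floor an epoch timestamp to the previous interval boundary."""
    return (epoch // interval_sec) * interval_sec


interval_sec = 10 * 60  # 10-minute compaction buckets
base = 1_700_000_400    # any epoch already on a 10-minute boundary

# Records at minute offsets 3, 8, 13, 18 from the aligned base:
records = [base + m * 60 for m in (3, 8, 13, 18)]

# :03 and :08 collapse into the :00 bucket, :13 and :18 into the :10 bucket,
# so four misaligned records snap to exactly two unique buckets.
buckets = sorted({floor_to_interval(e, interval_sec) for e in records})
assert buckets == [base, base + interval_sec]
```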

View File

@@ -12,6 +12,7 @@ from akkudoktoreos.core.version import (
     DIR_PACKAGE_ROOT,
     EXCLUDED_DIR_PATTERNS,
     EXCLUDED_FILES,
+    VERSION_DATE_FILE,
     HashConfig,
     _version_calculate,
     _version_date_hash,
@@ -26,6 +27,43 @@ BUMP_DEV_SCRIPT = DIR_PROJECT_ROOT / "scripts" / "bump_dev_version.py"
 UPDATE_SCRIPT = DIR_PROJECT_ROOT / "scripts" / "update_version.py"

+
+@pytest.fixture(autouse=True)
+def guard_version_date_file():
+    """Ensure no test modifies the VERSION_DATE_FILE (_version_date.py)."""
+    # Record state before test
+    if VERSION_DATE_FILE.exists():
+        before_mtime = VERSION_DATE_FILE.stat().st_mtime
+        before_content = VERSION_DATE_FILE.read_text(encoding="utf-8")
+    else:
+        before_mtime = None
+        before_content = None
+
+    yield
+
+    # Check state after test
+    if VERSION_DATE_FILE.exists():
+        after_mtime = VERSION_DATE_FILE.stat().st_mtime
+        after_content = VERSION_DATE_FILE.read_text(encoding="utf-8")
+        if before_content is None:
+            pytest.fail(
+                f"Test created VERSION_DATE_FILE which should not exist: {VERSION_DATE_FILE}"
+            )
+        elif after_mtime != before_mtime or after_content != before_content:
+            # Restore the original content immediately to avoid polluting subsequent tests
+            VERSION_DATE_FILE.write_text(before_content, encoding="utf-8")
+            pytest.fail(
+                f"Test modified VERSION_DATE_FILE: {VERSION_DATE_FILE}\n"
+                f"Original content:\n{before_content}\n"
+                f"Modified content:\n{after_content}"
+            )
+    else:
+        if before_content is not None:
+            pytest.fail(
+                f"Test deleted VERSION_DATE_FILE: {VERSION_DATE_FILE}"
+            )
+
+
 # --- Git helpers ---
 def get_git_tracked_files(repo_path: Path) -> Optional[set[Path]]:
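The autouse fixture's snapshot/compare logic can also be expressed framework-free as a context manager. This sketch compares content only (the fixture additionally checks mtime) and uses a throwaway file; `guard_file` is an illustrative name, not part of the repo:

```python
import tempfile
from contextlib import contextmanager
from pathlib import Path


@contextmanager
def guard_file(path: Path):
    """Snapshot a file before a block and verify it is untouched afterwards.

    On violation, restore the original content first (so later code is not
    polluted), then raise. Content-only check; the pytest fixture above also
    compares mtime.
    """
    before = path.read_text(encoding="utf-8") if path.exists() else None
    yield
    after = path.read_text(encoding="utf-8") if path.exists() else None
    if after != before:
        if before is not None:
            path.write_text(before, encoding="utf-8")  # undo the pollution
        raise AssertionError(f"guarded file was modified: {path}")


original = 'VERSION_DATE = "2026-02-24T16:58:00Z"\n'
tmp = Path(tempfile.mkdtemp()) / "_version_date.py"
tmp.write_text(original, encoding="utf-8")

with guard_file(tmp):
    pass  # well-behaved block: file untouched, no error

violated = False
try:
    with guard_file(tmp):
        tmp.write_text("oops\n", encoding="utf-8")  # misbehaving block
except AssertionError:
    violated = True

assert violated and tmp.read_text(encoding="utf-8") == original
```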