feat: add Home Assistant and NodeRED adapters (#764)

Adapters for Home Assistant and NodeRED integration are added.
Akkudoktor-EOS can now be run as a Home Assistant add-on or standalone.

As a Home Assistant add-on, EOS uses ingress to fully integrate the EOSdash dashboard
into Home Assistant.

This change includes several bug fixes that are not directly related to the adapter
implementation but are necessary to keep EOS running properly and to test and
document the changes.

* fix: development version scheme

  The development versioning scheme is adapted to fit Docker and
  Home Assistant expectations. The new scheme is x.y.z and x.y.z.dev<hash>.
  The hash consists of digits only, as expected by Home Assistant. The
  development version is suffixed with .dev, as expected by Docker.
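The scheme can be sketched as a validation regex (pattern and helper name are illustrative, not the actual EOS code):

```python
import re

# Accepts "x.y.z" releases and "x.y.z.dev<digits>" development builds.
# The hash part is digits only (Home Assistant requirement) and the
# ".dev" suffix marks development versions (Docker convention).
VERSION_RE = re.compile(r"^\d+\.\d+\.\d+(\.dev\d+)?$")

def is_valid_version(version: str) -> bool:
    """Return True if the version string follows the new scheme."""
    return VERSION_RE.fullmatch(version) is not None
```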

* fix: use mean value in interval on resampling for array

  When downsampling data, use the mean of all values within the new
  sampling interval.
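The intended behaviour can be sketched with a small bucketing helper (illustrative only; the real implementation lives in the DataSequence resampling code):

```python
from statistics import mean

def downsample_mean(samples, interval_s):
    """Bucket (timestamp_seconds, value) samples into intervals of
    interval_s seconds and return the mean value per bucket."""
    buckets = {}
    for ts, value in samples:
        buckets.setdefault(ts // interval_s, []).append(value)
    return [mean(values) for _, values in sorted(buckets.items())]

# Four 15-minute samples collapse into one hourly value: their mean.
quarter_hours = [(0, 1.0), (900, 2.0), (1800, 3.0), (2700, 4.0)]
print(downsample_mean(quarter_hours, 3600))  # [2.5]
```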

* fix: default battery ev soc and appliance wh

  Make the genetic simulation return default values for battery SoC,
  electric vehicle SoC, and appliance load when these assets are not
  used.

* fix: import json string

  Strip outer quotes from JSON strings on import to comply with the
  expectations of json.loads().
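The idea in a minimal sketch (assumed helper, not the actual EOS code): strip one pair of outer quotes when they wrap what looks like a JSON object or array, then delegate to json.loads().

```python
import json

def loads_lenient(text: str):
    """Strip a single pair of outer quotes around JSON payloads before
    parsing; plain JSON string literals are left untouched."""
    text = text.strip()
    if len(text) >= 2 and text[0] == text[-1] and text[0] in ("'", '"'):
        inner = text[1:-1]
        if inner[:1] in ("{", "["):  # do not mangle plain string literals
            text = inner
    return json.loads(text)
```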

* fix: default interval definition for import data

  The default interval must be defined in lowercase human-readable form
  to be accepted by pendulum.
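A sketch of the normalization step (the helper name is assumed): pendulum's human duration parser only accepts lowercase unit names, so a default such as "1 Hour" must become "1 hour" before parsing.

```python
def normalize_interval(interval: str) -> str:
    """Lowercase and whitespace-normalize a human-readable interval so
    that pendulum's duration parser accepts it."""
    return " ".join(interval.split()).lower()
```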

* fix: clearoutside schema change

* feat: add adapters for integrations

  Adapters for Home Assistant and NodeRED integration are added.
  Akkudoktor-EOS can now be run as Home Assistant add-on and standalone.

  As Home Assistant add-on EOS uses ingress to fully integrate the EOSdash dashboard
  in Home Assistant.

* feat: allow EOS to be started with root permissions and drop privileges

  Home Assistant starts all add-ons with root permissions. EOS now drops
  root permissions if an applicable user is defined by the parameter
  --run_as_user. The docker image defines the user eos to be used.
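The usual shape of such a privilege drop, as a hedged sketch (not the actual EOS implementation):

```python
import os
import pwd

def drop_privileges(run_as_user: str) -> None:
    """Switch to an unprivileged user if running as root. The group id
    must be dropped before the user id, since a non-root process can no
    longer call setgid() afterwards."""
    if os.geteuid() != 0:
        return  # already unprivileged, nothing to do
    user = pwd.getpwnam(run_as_user)
    os.setgid(user.pw_gid)
    os.setuid(user.pw_uid)
    os.environ["HOME"] = user.pw_dir
```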

* feat: make eos supervise and monitor EOSdash

  EOS now not only starts EOSdash but also monitors it during runtime
  and restarts it on fault. EOSdash log output is captured by EOS and
  forwarded to the EOS log for better visibility.
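A minimal supervision loop sketch (assumed shape, not the actual EOS code): start the child, forward its output to our own log stream, and restart it on non-zero exit.

```python
import subprocess
import sys
import time

def supervise(cmd, restart_delay=1.0, max_restarts=3):
    """Run cmd, forwarding its combined stdout/stderr to our stderr,
    and restart it on faulty (non-zero) exit up to max_restarts times."""
    restarts = 0
    while restarts <= max_restarts:
        proc = subprocess.Popen(
            cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True
        )
        for line in proc.stdout:  # capture the child's log output
            sys.stderr.write(f"[eosdash] {line}")
        if proc.wait() == 0:
            return  # clean shutdown, do not restart
        restarts += 1
        time.sleep(restart_delay)
    raise RuntimeError("EOSdash kept failing; giving up")
```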

* feat: add duration to string conversion

  Make to_duration also return the duration as a string on request.
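The ISO 8601 output style can be approximated in pure Python (the real to_duration is pendulum-based and, per the tests in this change, also supports "human" and "pandas" styles):

```python
def duration_to_iso8601(total_seconds: int) -> str:
    """Format a duration given in seconds as an ISO 8601 duration
    string, e.g. 5400 -> "PT1H30M"."""
    days, rem = divmod(total_seconds, 86400)
    hours, rem = divmod(rem, 3600)
    minutes, seconds = divmod(rem, 60)
    result = "P" + (f"{days}D" if days else "")
    time_part = "".join(
        f"{v}{u}" for v, u in ((hours, "H"), (minutes, "M"), (seconds, "S")) if v
    )
    if time_part:
        result += "T" + time_part
    return result if result != "P" else "PT0S"
```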

* chore: Use info logging to report missing optimization parameters

  During parameter preparation for automatic optimization, an error was
  logged for missing parameters. Logging is now done at the info level.

* chore: make EOSdash use the EOS data directory for file import/ export

  EOSdash uses the EOS data directory for file import/export by default.
  This allows the configuration import/export function to be used within
  docker images as well.

* chore: improve EOSdash config tab display

  Improve display of JSON code and add more forms for config value update.

* chore: make docker image file system layout similar to home assistant

  Only the /data directory is used for persistent data; it is handled as
  a docker volume. When using docker compose, the /data volume is mapped
  to ~/.local/share/net.akkudoktor.eos.

* chore: add home assistant add-on development environment

  Add VSCode devcontainer and task definition for home assistant add-on
  development.

* chore: improve documentation
This commit is contained in:
Bobby Noelte
2025-12-30 22:08:21 +01:00
committed by GitHub
parent 02c794460f
commit 58d70e417b
111 changed files with 6815 additions and 1199 deletions


@@ -30,7 +30,6 @@ from akkudoktoreos.server.server import get_default_host
# Adapt pytest logging handling to Loguru logging
# -----------------------------------------------
@pytest.fixture
def caplog(caplog: LogCaptureFixture):
"""Propagate Loguru logs to the pytest caplog handler."""
@@ -430,13 +429,20 @@ def server_base(
eos_dir = str(eos_tmp_dir.name)
class Starter(ProcessStarter):
# Set environment for server run
env = os.environ.copy()
env["EOS_DIR"] = eos_dir
env["EOS_CONFIG_DIR"] = eos_dir
if extra_env:
env.update(extra_env)
# assure server to be installed
try:
project_dir = Path(__file__).parent.parent
subprocess.run(
[sys.executable, "-c", "import akkudoktoreos.server.eos"],
check=True,
env=os.environ,
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
cwd=project_dir,
@@ -444,20 +450,13 @@ def server_base(
except subprocess.CalledProcessError:
subprocess.run(
[sys.executable, "-m", "pip", "install", "-e", str(project_dir)],
env=os.environ,
env=env,
check=True,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
cwd=project_dir,
)
# Set environment for server run
env = os.environ.copy()
env["EOS_DIR"] = eos_dir
env["EOS_CONFIG_DIR"] = eos_dir
if extra_env:
env.update(extra_env)
# Set command to start server process
args = [
sys.executable,
@@ -487,6 +486,25 @@ def server_base(
logger.debug(f"[xprocess] Exception during health check: {e}")
return False
def wait_callback(self):
"""Assert that the process is ready to answer queries using the provided
callback function. Raises TimeoutError if self.startup_check does not
return True within self.timeout seconds."""
from datetime import datetime
while True:
time.sleep(1.0)
if self.startup_check():
return True
if datetime.now() > self._max_time:
info = self.process.getinfo("eos")
error_msg = (
f"The provided startup check could not assert process responsiveness\n"
f"within the specified time interval of {self.timeout} seconds.\n"
f"Server log is in '{info.logpath}'.\n"
)
raise TimeoutError(error_msg)
# Kill all running eos and eosdash process - just to be sure
cleanup_eos_eosdash(host, port, eosdash_host, eosdash_port, server_timeout)
@@ -494,10 +512,12 @@ def server_base(
config_file_path = Path(eos_dir).joinpath(ConfigEOS.CONFIG_FILE_NAME)
with config_file_path.open(mode="w", encoding="utf-8", newline="\n") as fd:
json.dump({}, fd)
logger.info(f"Created empty config file in {config_file_path}.")
# ensure process is running and return its logfile
pid, logfile = xprocess.ensure("eos", Starter)
logger.info(f"Started EOS ({pid}). This may take very long (up to {server_timeout} seconds).")
logger.info(f"EOS_DIR: {Starter.env['EOS_DIR']}, EOS_CONFIG_DIR: {Starter.env['EOS_CONFIG_DIR']}")
logger.info(f"View xprocess logfile at: {logfile}")
yield {
@@ -509,7 +529,7 @@ def server_base(
"timeout": server_timeout,
}
# clean up whole process tree afterwards
# clean up whole process tree afterwards
xprocess.getinfo("eos").terminate()
# Cleanup any EOS process left.

tests/test_adapter.py Normal file

@@ -0,0 +1,79 @@
"""
Tests for Adapter and AdapterContainer integration.
"""
from __future__ import annotations
from datetime import datetime
from typing import TypeAlias
import pytest
from akkudoktoreos.adapter.adapter import (
Adapter,
AdapterCommonSettings,
get_adapter,
)
from akkudoktoreos.adapter.adapterabc import AdapterContainer
from akkudoktoreos.adapter.homeassistant import HomeAssistantAdapter
from akkudoktoreos.adapter.nodered import NodeREDAdapter
# ---------- Typed aliases for fixtures ----------
AdapterFixture: TypeAlias = Adapter
SettingsFixture: TypeAlias = AdapterCommonSettings
# ---------- Fixtures ----------
@pytest.fixture
def adapter() -> AdapterFixture:
"""Fixture returning a fully initialized Adapter instance."""
return get_adapter()
@pytest.fixture
def settings() -> SettingsFixture:
"""Fixture providing default adapter common settings."""
return AdapterCommonSettings()
# ---------- Test Class ----------
class TestAdapter:
def test_is_adapter_container(self, adapter: AdapterFixture) -> None:
"""Adapter should be an AdapterContainer and an Adapter."""
assert isinstance(adapter, AdapterContainer)
assert isinstance(adapter, Adapter)
def test_providers_present(self, adapter: AdapterFixture) -> None:
"""Adapter must contain HA and NodeRED providers."""
assert len(adapter.providers) == 2
assert any(isinstance(p, HomeAssistantAdapter) for p in adapter.providers)
assert any(isinstance(p, NodeREDAdapter) for p in adapter.providers)
def test_adapter_order(self, adapter: AdapterFixture) -> None:
"""Provider order should match HomeAssistantAdapter -> NodeREDAdapter."""
assert isinstance(adapter.providers[0], HomeAssistantAdapter)
assert isinstance(adapter.providers[1], NodeREDAdapter)
# ----- AdapterCommonSettings -----
def test_settings_default_provider(self, settings: SettingsFixture) -> None:
"""Default provider should be None."""
assert settings.provider is None
def test_settings_accepts_single_provider(self, settings: SettingsFixture) -> None:
"""Settings should accept a single provider literal."""
settings.provider = ["HomeAssistant"]
assert settings.provider == ["HomeAssistant"]
def test_settings_accepts_multiple_providers(self, settings: SettingsFixture) -> None:
"""Settings should accept multiple provider literals."""
settings.provider = ["HomeAssistant", "NodeRED"]
assert isinstance(settings.provider, list)
assert settings.provider == ["HomeAssistant", "NodeRED"]
def test_provider_sub_settings(self, settings: SettingsFixture) -> None:
"""sub-settings (homeassistant & nodered) must be initialized."""
assert hasattr(settings, "homeassistant")
assert hasattr(settings, "nodered")
assert settings.homeassistant is not None
assert settings.nodered is not None


@@ -0,0 +1,127 @@
from __future__ import annotations
from datetime import datetime
from unittest.mock import MagicMock, patch
import pytest
from pydantic import BaseModel
from akkudoktoreos.adapter.adapter import AdapterCommonSettings
from akkudoktoreos.adapter.nodered import NodeREDAdapter, NodeREDAdapterCommonSettings
from akkudoktoreos.core.emplan import DDBCInstruction, FRBCInstruction
from akkudoktoreos.core.ems import EnergyManagementStage
from akkudoktoreos.utils.datetimeutil import DateTime, compare_datetimes, to_datetime
@pytest.fixture
def mock_ems() -> MagicMock:
m = MagicMock()
m.stage.return_value = EnergyManagementStage.DATA_ACQUISITION
m.plan.return_value.get_active_instructions.return_value = []
return m
@pytest.fixture
def adapter(config_eos, mock_ems: MagicMock) -> NodeREDAdapter:
"""Fully Pydantic-safe NodeREDAdapter fixture."""
# Set nested value - also fills None values
config_eos.set_nested_value("adapter/provider", ["NodeRED"])
ad = NodeREDAdapter()
# Mark update datetime invalid
ad.update_datetime = None
# Assign EMS
object.__setattr__(ad, "ems", mock_ems)
return ad
class TestNodeREDAdapter:
def test_provider_id(self, adapter: NodeREDAdapter):
assert adapter.provider_id() == "NodeRED"
def test_enabled_detection_single(self, adapter: NodeREDAdapter):
adapter.config.adapter.provider = ["NodeRED"]
assert adapter.enabled() is True
adapter.config.adapter.provider = ["HomeAssistant"]
assert adapter.enabled() is False
adapter.config.adapter.provider = ["HomeAssistant", "NodeRED"]
assert adapter.enabled() is True
@patch("requests.get")
def test_update_datetime(self, mock_get, adapter: NodeREDAdapter):
adapter.ems.stage.return_value = EnergyManagementStage.DATA_ACQUISITION
mock_get.return_value.status_code = 200
mock_get.return_value.json.return_value = {"foo": "bar"}
now = to_datetime()
adapter.update_data(force_enable=True)
mock_get.assert_called_once()
assert compare_datetimes(adapter.update_datetime, now).approximately_equal
@patch("requests.get")
def test_update_data_data_acquisition_success(self, mock_get, adapter: NodeREDAdapter):
adapter.ems.stage.return_value = EnergyManagementStage.DATA_ACQUISITION
mock_get.return_value.status_code = 200
mock_get.return_value.json.return_value = {"foo": "bar"}
adapter.update_data(force_enable=True)
mock_get.assert_called_once()
url, = mock_get.call_args[0]
assert "/eos/data_aquisition" in url
@patch("requests.get", side_effect=Exception("boom"))
def test_update_data_data_acquisition_failure(self, mock_get, adapter: NodeREDAdapter):
adapter.ems.stage.return_value = EnergyManagementStage.DATA_ACQUISITION
with pytest.raises(RuntimeError):
adapter.update_data(force_enable=True)
@patch("requests.post")
def test_update_data_control_dispatch_instructions(self, mock_post, adapter: NodeREDAdapter):
adapter.ems.stage.return_value = EnergyManagementStage.CONTROL_DISPATCH
instr1 = DDBCInstruction(
id="res1@extra", operation_mode_id="X", operation_mode_factor=0.5,
actuator_id="dummy", execution_time=to_datetime()
)
instr2 = FRBCInstruction(
id="resA", operation_mode_id="Y", operation_mode_factor=0.25,
actuator_id="dummy", execution_time=to_datetime()
)
adapter.ems.plan.return_value.get_active_instructions.return_value = [instr1, instr2]
mock_post.return_value.status_code = 200
mock_post.return_value.json.return_value = {}
adapter.update_data(force_enable=True)
_, kwargs = mock_post.call_args
payload = kwargs["json"]
assert payload["res1_op_mode"] == "X"
assert payload["res1_op_factor"] == 0.5
assert payload["resA_op_mode"] == "Y"
assert payload["resA_op_factor"] == 0.25
url, = mock_post.call_args[0]
assert "/eos/control_dispatch" in url
@patch("requests.post")
def test_update_data_disabled_provider(self, mock_post, adapter: NodeREDAdapter):
adapter.config.adapter.provider = ["HomeAssistant"] # NodeRED disabled
adapter.update_data(force_enable=False)
mock_post.assert_not_called()
@patch("requests.post")
def test_update_data_force_enable_overrides_disabled(self, mock_post, adapter: NodeREDAdapter):
adapter.config.adapter.provider = ["HomeAssistant"]
adapter.ems.stage.return_value = EnergyManagementStage.CONTROL_DISPATCH
mock_post.return_value.status_code = 200
mock_post.return_value.json.return_value = {}
adapter.update_data(force_enable=True)
mock_post.assert_called_once()


@@ -52,7 +52,9 @@ def test_config_constants(config_eos):
def test_computed_paths(config_eos):
"""Test computed paths for output and cache."""
# Don't actually try to create the data folder
with patch("pathlib.Path.mkdir"):
with patch("pathlib.Path.mkdir"), \
patch("pathlib.Path.is_dir", return_value=True), \
patch("pathlib.Path.exists", return_value=True):
config_eos.merge_settings_from_dict(
{
"general": {
@@ -371,7 +373,7 @@ def test_config_common_settings_timezone_none_when_coordinates_missing():
BATTERY_DEFAULT_CHARGE_RATES,
)
],
KeyError,
TypeError,
),
# Invalid index (no number)
(
@@ -383,7 +385,7 @@ def test_config_common_settings_timezone_none_when_coordinates_missing():
BATTERY_DEFAULT_CHARGE_RATES,
)
],
KeyError,
IndexError,
),
# Unset value (set None)
(


@@ -698,6 +698,33 @@ class TestDataSequence:
fill_method="invalid",
)
def test_key_to_array_resample_mean(self, sequence):
"""Test that numeric resampling uses mean when multiple values fall into one interval."""
interval = to_duration("1 hour")
# Insert values every 15 minutes within the same hour
record1 = self.create_test_record(pendulum.datetime(2023, 11, 6, 0, 0), 1.0)
record2 = self.create_test_record(pendulum.datetime(2023, 11, 6, 0, 15), 2.0)
record3 = self.create_test_record(pendulum.datetime(2023, 11, 6, 0, 30), 3.0)
record4 = self.create_test_record(pendulum.datetime(2023, 11, 6, 0, 45), 4.0)
sequence.insert_by_datetime(record1)
sequence.insert_by_datetime(record2)
sequence.insert_by_datetime(record3)
sequence.insert_by_datetime(record4)
# Resample to hourly interval, expecting the mean of the 4 values
array = sequence.key_to_array(
key="data_value",
start_datetime=pendulum.datetime(2023, 11, 6, 0),
end_datetime=pendulum.datetime(2023, 11, 6, 1),
interval=interval,
)
assert isinstance(array, np.ndarray)
assert len(array) == 1 # one interval: 0:00-1:00
# The first interval mean = (1+2+3+4)/4 = 2.5
assert array[0] == pytest.approx(2.5)
def test_to_datetimeindex(self, sequence2):
record1 = self.create_test_record(datetime(2023, 11, 5), 0.8)
record2 = self.create_test_record(datetime(2023, 11, 6), 0.9)


@@ -1657,47 +1657,162 @@ def test_to_datetime(
# to_duration
# -----------------------------
class TestToDuration:
# ------------------------------------------------------------------
# Valid input conversions (no formatting)
# ------------------------------------------------------------------
@pytest.mark.parametrize(
"input_value, expected_output",
[
# duration input
(pendulum.duration(days=1), pendulum.duration(days=1)),
# Test cases for valid duration inputs
@pytest.mark.parametrize(
"input_value, expected_output",
[
# duration input
(pendulum.duration(days=1), pendulum.duration(days=1)),
# String input
("2 days", pendulum.duration(days=2)),
("5 hours", pendulum.duration(hours=5)),
("47 hours", pendulum.duration(hours=47)),
("48 hours", pendulum.duration(seconds=48 * 3600)),
("30 minutes", pendulum.duration(minutes=30)),
("45 seconds", pendulum.duration(seconds=45)),
(
"1 day 2 hours 30 minutes 15 seconds",
pendulum.duration(days=1, hours=2, minutes=30, seconds=15),
),
("3 days 4 hours", pendulum.duration(days=3, hours=4)),
# Integer/Float input
(3600, pendulum.duration(seconds=3600)), # 1 hour
(86400, pendulum.duration(days=1)), # 1 day
(1800.5, pendulum.duration(seconds=1800.5)), # 30 minutes and 0.5 seconds
# Tuple/List input
((1, 2, 30, 15), pendulum.duration(days=1, hours=2, minutes=30, seconds=15)),
([0, 10, 0, 0], pendulum.duration(hours=10)),
],
)
def test_to_duration_valid(input_value, expected_output):
"""Test to_duration with valid inputs."""
assert to_duration(input_value) == expected_output
# String input
("1 hour", pendulum.duration(hours=1)),
("2 days", pendulum.duration(days=2)),
("5 hours", pendulum.duration(hours=5)),
("47 hours", pendulum.duration(hours=47)),
("48 hours", pendulum.duration(seconds=48 * 3600)),
("30 minutes", pendulum.duration(minutes=30)),
("45 seconds", pendulum.duration(seconds=45)),
(
"1 day 2 hours 30 minutes 15 seconds",
pendulum.duration(days=1, hours=2, minutes=30, seconds=15),
),
("3 days 4 hours", pendulum.duration(days=3, hours=4)),
# Integer / Float
(3600, pendulum.duration(seconds=3600)),
(86400, pendulum.duration(days=1)),
(1800.5, pendulum.duration(seconds=1800.5)),
def test_to_duration_summation():
start_datetime = to_datetime("2028-01-11 00:00:00")
index_datetime = start_datetime
for i in range(48):
expected_datetime = start_datetime + to_duration(f"{i} hours")
assert index_datetime == expected_datetime
index_datetime += to_duration("1 hour")
assert index_datetime == to_datetime("2028-01-13 00:00:00")
# Tuple / List
((1, 2, 30, 15), pendulum.duration(days=1, hours=2, minutes=30, seconds=15)),
([0, 10, 0, 0], pendulum.duration(hours=10)),
],
)
def test_to_duration_valid(self, input_value, expected_output):
"""Test that valid inputs convert to correct Duration objects."""
assert to_duration(input_value) == expected_output
# ------------------------------------------------------------------
# ISO-8601 output (`as_string=True`)
# ------------------------------------------------------------------
@pytest.mark.parametrize(
"input_value, expected",
[
("15 minutes", "PT15M"),
("1 hour 30 minutes", "PT1H30M"),
("45 seconds", "PT45S"),
("1 hour 5 seconds", "PT1H5S"),
("2 days", "P2D"),
("2 days 3 hours 4 minutes 5 seconds", "P2DT3H4M5S"),
("0 seconds", "PT0S"),
]
)
def test_as_string_true_iso8601(self, input_value, expected):
"""Test ISO-8601 duration strings for various inputs."""
assert to_duration(input_value, as_string=True) == expected
# ------------------------------------------------------------------
# Human readable (`as_string="human"`)
# ------------------------------------------------------------------
def test_as_string_human(self):
assert to_duration("90 seconds", as_string="human") == "1 minute 30 seconds"
# ------------------------------------------------------------------
# Pandas frequency (`as_string="pandas"`)
# ------------------------------------------------------------------
@pytest.mark.parametrize(
"input_value, expected",
[
("1 hour", "1h"),
("2 hours", "2h"),
("15 minutes", "15min"),
("90 minutes", "90min"),
("30 seconds", "30s"),
("900 seconds", "15min"),
],
)
def test_as_string_pandas(self, input_value, expected):
assert to_duration(input_value, as_string="pandas") == expected
# ------------------------------------------------------------------
# Custom format strings
# ------------------------------------------------------------------
def test_as_string_custom_seconds(self):
assert to_duration("75 seconds", as_string="Total: {S}s") == "Total: 75s"
def test_as_string_custom_minutes(self):
assert to_duration("15 minutes", as_string="{M}m total") == "15m total"
def test_as_string_custom_hours(self):
assert to_duration("7200 seconds", as_string="{H} hours") == "2 hours"
def test_as_string_custom_human_alias(self):
assert to_duration("30 minutes", as_string="{f}") == "30 minutes"
# ------------------------------------------------------------------
# Invalid input handling
# ------------------------------------------------------------------
@pytest.mark.parametrize(
"input_value",
[
"not a duration",
"5 lightyears",
(1, 2, 3), # wrong tuple size
{"a": 1}, # unsupported type
None,
],
)
def test_invalid_inputs_raise(self, input_value):
with pytest.raises(ValueError):
to_duration(input_value)
# ------------------------------------------------------------------
# Invalid as_string values
# ------------------------------------------------------------------
def test_invalid_as_string_raises(self):
with pytest.raises(ValueError):
to_duration("5 minutes", as_string=123) # type: ignore
def test_summation(self):
start_datetime = to_datetime("2028-01-11 00:00:00")
index_datetime = start_datetime
for i in range(48):
expected_datetime = start_datetime + to_duration(f"{i} hours")
assert index_datetime == expected_datetime
index_datetime += to_duration("1 hour")
assert index_datetime == to_datetime("2028-01-13 00:00:00")
def test_excessive_length_raises_valueerror(self):
"""Test that to_duration raises ValueError for strings exceeding max length.
This test covers the fix for the ReDoS vulnerability.
Related to: #494
"""
# String exceeds limits
long_string = "a" * (MAX_DURATION_STRING_LENGTH + 50)
# Expected error message, escaped for regex matching
expected_error_message = re.escape(
f"Input string exceeds maximum allowed length ({MAX_DURATION_STRING_LENGTH})."
)
# Check if error was raised
with pytest.raises(ValueError, match=expected_error_message):
to_duration(long_string)
# Optional: String exactly at the limit should NOT trigger the length check.
at_limit_string = "b" * MAX_DURATION_STRING_LENGTH
try:
to_duration(at_limit_string)
except ValueError as e:
if str(e) == f"Input string exceeds maximum allowed length ({MAX_DURATION_STRING_LENGTH}).":
pytest.fail(
f"to_duration raised length ValueError unexpectedly for string at limit: {at_limit_string}"
)
pass
# -----------------------------
@@ -1900,33 +2015,3 @@ def test_compare_datetimes_gt(dt1, dt2):
assert compare_datetimes(dt1, dt2).gt
assert compare_datetimes(dt1, dt2).le == False
assert compare_datetimes(dt1, dt2).lt == False
def test_to_duration_excessive_length_raises_valueerror():
"""Test that to_duration raises ValueError for strings exceeding max length.
This test covers the fix for the ReDoS vulnerability.
Related to: #494
"""
# String exceeds limits
long_string = "a" * (MAX_DURATION_STRING_LENGTH + 50)
# Expected error message, escaped for regex matching
expected_error_message = re.escape(
f"Input string exceeds maximum allowed length ({MAX_DURATION_STRING_LENGTH})."
)
# Check if error was raised
with pytest.raises(ValueError, match=expected_error_message):
to_duration(long_string)
# Optional: String exactly at the limit should NOT trigger the length check.
at_limit_string = "b" * MAX_DURATION_STRING_LENGTH
try:
to_duration(at_limit_string)
except ValueError as e:
if str(e) == f"Input string exceeds maximum allowed length ({MAX_DURATION_STRING_LENGTH}).":
pytest.fail(
f"to_duration raised length ValueError unexpectedly for string at limit: {at_limit_string}"
)
pass


@@ -25,6 +25,49 @@ def fixed_now():
class TestEnergyManagementPlan:
# ----------------------------------------------------------------------
# Helpers (only used inside the class)
# ----------------------------------------------------------------------
def _make_instr(self, resource_id, execution_time, duration=None):
if duration is None:
instr = OMBCInstruction(
id=resource_id,
execution_time=execution_time,
operation_mode_id="mode",
operation_mode_factor=1.0,
)
else:
instr = PEBCInstruction(
id=resource_id,
execution_time=execution_time,
power_constraints_id="pc-123",
power_envelopes=[
PEBCPowerEnvelope(
id="pebcpe@1234",
commodity_quantity=CommodityQuantity.ELECTRIC_POWER_L1,
power_envelope_elements=[
PEBCPowerEnvelopeElement(
duration=to_duration(duration),
upper_limit=1010.0,
lower_limit=990.0,
),
],
),
],
)
return instr
def _build_plan(self, instructions, now):
plan = EnergyManagementPlan(
id="plan-test",
generated_at=now,
instructions=instructions,
)
plan._update_time_range()
return plan
def test_add_instruction_and_time_range(self, fixed_now):
plan = EnergyManagementPlan(
id="plan-123",
@@ -46,12 +89,8 @@ class TestEnergyManagementPlan:
plan.add_instruction(instr1)
plan.add_instruction(instr2)
# Check that valid_from matches the earliest execution_time
assert plan.valid_from == fixed_now
# instr2 has infinite duration so valid_until must be None
assert plan.valid_until is None
assert plan.instructions == [instr1, instr2]
def test_clear(self, fixed_now):
@@ -72,36 +111,6 @@ class TestEnergyManagementPlan:
assert plan.valid_until is None
assert plan.valid_from is not None
def test_get_active_instructions(self, fixed_now):
instr1 = OMBCInstruction(
resource_id="dev-1",
execution_time=fixed_now.subtract(minutes=1),
operation_mode_id="mymode1",
operation_mode_factor=1.0,
)
instr2 = OMBCInstruction(
resource_id="dev-2",
execution_time=fixed_now.add(minutes=1),
operation_mode_id="mymode1",
operation_mode_factor=1.0,
)
instr3 = OMBCInstruction(
resource_id="dev-3",
execution_time=fixed_now.subtract(minutes=10),
operation_mode_id="mymode1",
operation_mode_factor=1.0,
)
plan = EnergyManagementPlan(
id="plan-123",
generated_at=fixed_now,
instructions=[instr1, instr2, instr3],
)
plan._update_time_range()
active = plan.get_active_instructions(now=fixed_now)
ids = {i.resource_id for i in active}
assert ids == {"dev-1", "dev-3"}
def test_get_next_instruction(self, fixed_now):
instr1 = OMBCInstruction(
resource_id="dev-1",
@@ -154,14 +163,12 @@ class TestEnergyManagementPlan:
assert len(dev1_instructions) == 1
assert dev1_instructions[0].resource_id == "dev-1"
def test_add_various_instructions(self, fixed_now):
plan = EnergyManagementPlan(
id="plan-123",
generated_at=fixed_now,
instructions=[]
)
instrs = [
DDBCInstruction(
id="actuatorA@123",
@@ -212,11 +219,11 @@ class TestEnergyManagementPlan:
PEBCPowerEnvelope(
id="pebcpe@1234",
commodity_quantity=CommodityQuantity.ELECTRIC_POWER_L1,
power_envelope_elements = [
power_envelope_elements=[
PEBCPowerEnvelopeElement(
duration = to_duration(10),
upper_limit = 1010.0,
lower_limit = 990.0,
duration=to_duration(10),
upper_limit=1010.0,
lower_limit=990.0,
),
],
),
@@ -228,8 +235,131 @@ class TestEnergyManagementPlan:
plan.add_instruction(instr)
assert len(plan.instructions) == len(instrs)
# Check that get_instructions_for_device returns the right instructions
assert any(
instr for instr in plan.get_instructions_for_resource("actuatorA")
if isinstance(instr, DDBCInstruction)
)
# -------------------------------------------
# Special testing for get_active_instructions
# -------------------------------------------
def test_get_active_instructions(self, fixed_now):
instr1 = OMBCInstruction(
resource_id="dev-1",
execution_time=fixed_now.subtract(minutes=1),
operation_mode_id="mymode1",
operation_mode_factor=1.0,
)
instr2 = OMBCInstruction(
resource_id="dev-2",
execution_time=fixed_now.add(minutes=1),
operation_mode_id="mymode1",
operation_mode_factor=1.0,
)
instr3 = OMBCInstruction(
resource_id="dev-3",
execution_time=fixed_now.subtract(minutes=10),
operation_mode_id="mymode1",
operation_mode_factor=1.0,
)
plan = EnergyManagementPlan(
id="plan-123",
generated_at=fixed_now,
instructions=[instr1, instr2, instr3],
)
plan._update_time_range()
resource_ids = plan.get_resources()
assert resource_ids == ["dev-1", "dev-2", "dev-3"]
active = plan.get_active_instructions(now=fixed_now)
ids = {i.resource_id for i in active}
assert ids == {"dev-1", "dev-3"}
def test_get_active_instructions_with_duration(self, fixed_now):
instr = self._make_instr(
"dev-1",
fixed_now.subtract(minutes=5),
duration=Duration(minutes=10),
)
plan = self._build_plan([instr], fixed_now)
active = plan.get_active_instructions(fixed_now)
assert {i.resource_id for i in active} == {"dev-1"}
def test_get_active_instructions_expired_duration(self, fixed_now):
instr = self._make_instr(
"dev-1",
fixed_now.subtract(minutes=20),
duration=Duration(minutes=10),
)
plan = self._build_plan([instr], fixed_now)
assert plan.get_active_instructions(fixed_now) == []
def test_get_active_instructions_end_exactly_now_not_active(self, fixed_now):
instr = self._make_instr(
"dev-1",
fixed_now.subtract(minutes=10),
duration=Duration(minutes=10),
)
plan = self._build_plan([instr], fixed_now)
assert plan.get_active_instructions(fixed_now) == []
def test_get_active_instructions_latest_supersedes(self, fixed_now):
instr1 = self._make_instr(
"dev-1",
fixed_now.subtract(minutes=10),
duration=Duration(minutes=30),
)
instr2 = self._make_instr("dev-1", fixed_now.subtract(minutes=1))
plan = self._build_plan([instr1, instr2], fixed_now)
active = plan.get_active_instructions(fixed_now)
assert len(active) == 1
assert active[0] is instr2
def test_get_active_instructions_mixed_resources(self, fixed_now):
instr1 = self._make_instr(
"r1",
fixed_now.subtract(minutes=5),
duration=Duration(minutes=10),
)
instr2 = self._make_instr("r2", fixed_now.subtract(minutes=1))
instr3 = self._make_instr("r3", fixed_now.add(minutes=10))
plan = self._build_plan([instr1, instr2, instr3], fixed_now)
ids = {i.resource_id for i in plan.get_active_instructions(fixed_now)}
assert ids == {"r1", "r2"}
def test_get_active_instructions_start_exactly_now(self, fixed_now):
instr = self._make_instr("dev-1", fixed_now)
plan = self._build_plan([instr], fixed_now)
assert {i.resource_id for i in plan.get_active_instructions(fixed_now)} == {"dev-1"}
def test_get_active_instructions_no_active(self, fixed_now):
instr = self._make_instr("dev-1", fixed_now.add(minutes=1))
plan = self._build_plan([instr], fixed_now)
assert plan.get_active_instructions(fixed_now) == []
def test_get_active_instructions_future_does_not_override_until_reached(self, fixed_now):
instr1 = self._make_instr("dev-1", fixed_now.subtract(minutes=5))
instr2 = self._make_instr("dev-1", fixed_now.add(minutes=5))
plan = self._build_plan([instr1, instr2], fixed_now)
active_before = plan.get_active_instructions(fixed_now)
assert {i.resource_id for i in active_before} == {"dev-1"}
def test_get_active_instructions_future_overrides_once_time_reached(self, fixed_now):
exec_future = fixed_now.add(minutes=5)
instr1 = self._make_instr("dev-1", fixed_now.subtract(minutes=5))
instr2 = self._make_instr("dev-1", exec_future)
plan = self._build_plan([instr1, instr2], fixed_now)
active_before = plan.get_active_instructions(fixed_now)
assert {i.resource_id for i in active_before} == {"dev-1"}
active_after = plan.get_active_instructions(exec_future)
assert {i.resource_id for i in active_after} == {"dev-1"}


@@ -14,7 +14,7 @@ from pydantic.fields import FieldInfo
from akkudoktoreos.core.pydantic import PydanticBaseModel
from akkudoktoreos.prediction.pvforecast import PVForecastPlaneSetting
from akkudoktoreos.server.dash.configuration import (
configuration,
create_config_details,
get_default_value,
get_nested_value,
resolve_nested_types,
@@ -68,38 +68,44 @@ class TestEOSdashConfig:
def test_configuration(self):
"""Test extracting configuration details from a Pydantic model based on provided values."""
values = {"field1": "custom_value", "field2": 20}
-config = configuration(SampleModel, values)
+config_details = create_config_details(SampleModel, values)
assert any(
-item["name"] == "field1" and item["value"] == '"custom_value"' for item in config
+item["name"] == "field1" and item["value"] == '"custom_value"'
+for key, item in config_details.items()
)
+assert any(
+item["name"] == "field2" and item["value"] == "20"
+for key, item in config_details.items()
+)
-assert any(item["name"] == "field2" and item["value"] == "20" for item in config)
def test_configuration_eos(self, config_eos):
"""Test extracting EOS configuration details from EOS config based on provided values."""
with FILE_TESTDATA_EOSSERVER_CONFIG_1.open("r", encoding="utf-8", newline=None) as fd:
values = json.load(fd)
-config = configuration(config_eos, values)
+config_details = create_config_details(config_eos, values)
assert any(
-item["name"] == "server.eosdash_port" and item["value"] == "8504" for item in config
+item["name"] == "server.eosdash_port" and item["value"] == "8504"
+for key, item in config_details.items()
)
assert any(
item["name"] == "server.eosdash_host" and item["value"] == '"127.0.0.1"'
-for item in config
+for key, item in config_details.items()
)
def test_configuration_pvforecast_plane_settings(self):
"""Test extracting EOS PV forecast plane configuration details from EOS config based on provided values."""
with FILE_TESTDATA_EOSSERVER_CONFIG_1.open("r", encoding="utf-8", newline=None) as fd:
values = json.load(fd)
-config = configuration(
+config_details = create_config_details(
PVForecastPlaneSetting(), values, values_prefix=["pvforecast", "planes", "0"]
)
assert any(
item["name"] == "pvforecast.planes.0.surface_azimuth" and item["value"] == "170"
-for item in config
+for key, item in config_details.items()
)
assert any(
item["name"] == "pvforecast.planes.0.userhorizon"
and item["value"] == "[20, 27, 22, 20]"
-for item in config
+for key, item in config_details.items()
)


@@ -1,7 +1,99 @@
import re
import time
from http import HTTPStatus
from types import SimpleNamespace
from unittest.mock import patch
import pytest
import requests
from bs4 import BeautifulSoup
from akkudoktoreos.server.dash.context import EOSDASH_ROOT, ROOT_PATH, request_url_for
# -----------------------------------------------------
# URL filtering logic
# -----------------------------------------------------
ALLOWED_PREFIXES = [
"/api/hassio_ingress/",
"http://", "https://", # external URLs
"mailto:", "tel:", # contact URLs
"#", # anchor links
]
def is_allowed_prefix(url: str) -> bool:
return any(url.startswith(p) for p in ALLOWED_PREFIXES)
ABSOLUTE_URL = re.compile(r"^/[^/].*")
RELATIVE_PARENT = re.compile(r"^\.\./")
WS_REGEX = re.compile(r'new\s+WebSocket\s*\(\s*[\'"]([^\'"]*)[\'"]')
# -----------------------------------------------------
# Core HTML parser
# -----------------------------------------------------
def scan_html_for_link_issues(html: str):
soup = BeautifulSoup(html, "html.parser")
found_absolute: list[str] = []
found_relative_up: list[str] = []
all_urls = []
def add_issue(lst, tag, attr, value) -> None:
lst.append(f"<{tag.name} {attr}='{value}'>")
for tag in soup.find_all(True):
for attr in ("href", "src", "action"):
if attr not in tag.attrs:
continue
value = tag[attr]
if not isinstance(value, str):
continue
all_urls.append(value)
# (1) absolute URL
if ABSOLUTE_URL.match(value) and not is_allowed_prefix(value):
add_issue(found_absolute, tag, attr, value)
# (2) relative going up
if RELATIVE_PARENT.match(value):
add_issue(found_relative_up, tag, attr, value)
# (3) mixed usage check: both absolute + relative appear
used_absolute = any(u.startswith("/") for u in all_urls if not is_allowed_prefix(u))
used_relative = any(not u.startswith("/") for u in all_urls if not is_allowed_prefix(u))
mixed_usage = used_absolute and used_relative
# (4) detect absolute WebSocket URLs in JS
ws_bad = []
for m in WS_REGEX.findall(html):
if m.startswith("/") and not is_allowed_prefix(m):
ws_bad.append(m)
return found_absolute, found_relative_up, mixed_usage, ws_bad
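The WebSocket detection above hinges on `WS_REGEX` pulling the URL argument out of inline JavaScript. A minimal standalone check of that pattern (the HTML snippet is illustrative):

```python
import re

# Same pattern as above: captures the first string argument of `new WebSocket(...)`.
WS_REGEX = re.compile(r'new\s+WebSocket\s*\(\s*[\'"]([^\'"]*)[\'"]')

html = 'const ws = new WebSocket("/eosdash/ws");'
matches = WS_REGEX.findall(html)
```

An absolute path like `/eosdash/ws` would then be flagged by the ingress-safety check, since it bypasses the ingress prefix.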
def collect_testable_routes(app):
urls = []
for r in app.routes:
if not hasattr(r, "path"):
continue
path = r.path
# skip API-style or binary endpoints:
if path.startswith("/api"):
continue
if path.endswith(".js") or path.endswith(".css"):
continue
urls.append(path)
return sorted(set(urls))
class TestEOSDash:
@@ -38,3 +130,73 @@ class TestEOSDash:
server = server_setup_for_class["server"]
timeout = server_setup_for_class["timeout"]
self._assert_server_alive(server, timeout)
def test_ingress_safe_links(self, server_setup_for_class, monkeypatch, tmp_path):
base = server_setup_for_class["eosdash_server"]
with patch("akkudoktoreos.server.dash.context.ROOT_PATH", "/api/hassio_ingress/TOKEN/"):
eos_dir = tmp_path
monkeypatch.setenv("EOS_DIR", str(eos_dir))
monkeypatch.setenv("EOS_CONFIG_DIR", str(eos_dir))
# Import with environment vars set to prevent creation of EOS.config.json in wrong dir.
from akkudoktoreos.server.eosdash import app
for path in collect_testable_routes(app):
url = f"{base}{path}"
resp = requests.get(url)
resp.raise_for_status()
abs_issues, rel_up_issues, mixed_usage, ws_issues = scan_html_for_link_issues(resp.text)
#assert not abs_issues, (
# f"Forbidden absolute paths detected on {path}:\n" +
# "\n".join(abs_issues)
#)
assert not rel_up_issues, (
f"Relative paths navigating up (`../`) detected on {path}:\n" +
"\n".join(rel_up_issues)
)
assert not mixed_usage, f"Mixed absolute/relative linking detected on page {path}"
assert not ws_issues, f"Forbidden WebSocket paths detected on {path}:\n" + "\n".join(ws_issues)
@pytest.mark.parametrize(
"root_path,path,expected",
[
("/", "/eosdash/footer", "/eosdash/footer"),
("/", "eosdash/footer", "/eosdash/footer"),
("/", "footer", "/eosdash/footer"),
("/", "eosdash/assets/logo.png", "/eosdash/assets/logo.png"),
("/api/hassio_ingress/TOKEN/", "/api/hassio_ingress/TOKEN/eosdash/footer", "/api/hassio_ingress/TOKEN/eosdash/footer"),
("/api/hassio_ingress/TOKEN/", "/eosdash/footer", "/api/hassio_ingress/TOKEN/eosdash/footer"),
("/api/hassio_ingress/TOKEN/", "eosdash/footer", "/api/hassio_ingress/TOKEN/eosdash/footer"),
("/api/hassio_ingress/TOKEN/", "footer", "/api/hassio_ingress/TOKEN/eosdash/footer"),
("/api/hassio_ingress/TOKEN/", "assets/logo.png", "/api/hassio_ingress/TOKEN/eosdash/assets/logo.png"),
],
)
def test_request_url_for(self, root_path, path, expected):
"""Test that request_url_for produces absolute non-rewritable URLs.
Args:
root_path (str): Root path.
path (str): Path passed to request_url_for().
expected (str): Final produced path.
"""
result = request_url_for(path, root_path = root_path)
assert result == expected, (
f"URL rewriting mismatch. "
f"root_path={root_path}, path={path}, expected={expected}, got={result}"
)
# Test fallback to global var
with patch("akkudoktoreos.server.dash.context.ROOT_PATH", root_path):
result = request_url_for(path, root_path = None)
assert result == expected, (
f"URL rewriting mismatch. "
f"root_path={root_path}, path={path}, expected={expected}, got={result}"
)
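The parametrized cases above imply a simple joining rule. The following is a rough, hypothetical re-implementation for illustration only (the real `request_url_for` lives in `akkudoktoreos.server.dash.context`): paths already carrying a non-trivial `root_path` pass through unchanged, everything else is rooted under `<root_path>eosdash/`.

```python
def request_url_for(path: str, root_path: str = "/") -> str:
    # Paths already prefixed with a non-trivial root_path are not rewritten.
    if root_path != "/" and path.startswith(root_path):
        return path
    base = root_path.rstrip("/") + "/eosdash/"
    if path.startswith("/"):
        path = path.lstrip("/")
    # Drop a duplicate "eosdash/" prefix so it is not doubled under base.
    if path.startswith("eosdash/"):
        path = path[len("eosdash/"):]
    return base + path
```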

tests/test_homeassistant.py

@@ -0,0 +1,343 @@
import os
import subprocess
from pathlib import Path
from typing import Optional
import pytest
import yaml
from pydantic import ValidationError
class TestHomeAssistantAddon:
"""Tests to ensure the repository root is a valid Home Assistant add-on.
Simulates the Home Assistant Supervisor's expectations.
"""
@property
def root(self):
"""Repository root (repo == addon)."""
return Path(__file__).resolve().parent.parent
def test_config_yaml_exists(self):
"""Ensure config.yaml exists in the repo root."""
cfg_path = self.root / "config.yaml"
assert cfg_path.is_file(), "config.yaml must exist in repository root."
def test_config_yaml_loadable(self):
"""Verify that config.yaml parses and contains required fields."""
cfg_path = self.root / "config.yaml"
with open(cfg_path) as f:
cfg = yaml.safe_load(f)
required_fields = ["name", "version", "slug", "description", "arch"]
for field in required_fields:
assert field in cfg, f"Missing required field '{field}' in config.yaml."
# Additional validation
assert isinstance(cfg["arch"], list), "arch must be a list"
assert len(cfg["arch"]) > 0, "arch list cannot be empty"
print(f"✓ config.yaml valid:")
print(f" Name: {cfg['name']}")
print(f" Version: {cfg['version']}")
print(f" Slug: {cfg['slug']}")
print(f" Architectures: {', '.join(cfg['arch'])}")
def test_readme_exists(self):
"""Ensure README.md exists and is not empty."""
readme_path = self.root / "README.md"
assert readme_path.is_file(), "README.md must exist in the repository root."
content = readme_path.read_text()
assert len(content.strip()) > 0, "README.md is empty"
print(f"✓ README.md exists ({len(content)} bytes)")
def test_docs_md_exists(self):
"""Ensure DOCS.md exists in the repo root (for Home Assistant add-on documentation)."""
docs_path = self.root / "DOCS.md"
assert docs_path.is_file(), "DOCS.md must exist in the repository root for add-on documentation."
content = docs_path.read_text()
assert len(content.strip()) > 0, "DOCS.md is empty"
print(f"✓ DOCS.md exists ({len(content)} bytes)")
@pytest.mark.docker
def test_dockerfile_exists(self):
"""Ensure Dockerfile exists in the repo root and has basic structure."""
dockerfile = self.root / "Dockerfile"
assert dockerfile.is_file(), "Dockerfile must exist in repository root."
content = dockerfile.read_text()
# Check for FROM statement
assert "FROM" in content, "Dockerfile must contain FROM statement"
# Check for common add-on patterns
if "ARG BUILD_FROM" in content:
print("✓ Dockerfile uses Home Assistant build args")
print("✓ Dockerfile exists and has valid structure")
@pytest.mark.docker
def test_docker_build_context_valid(self):
"""Runs a Docker build using the root of the repo as Home Assistant supervisor would.
Fails if the build context is invalid or Dockerfile has syntax errors.
"""
# Check if Docker is available
try:
subprocess.run(
["docker", "--version"],
capture_output=True,
check=True
)
except (FileNotFoundError, subprocess.CalledProcessError):
pytest.skip("Docker not found or not running")
cmd = [
"docker", "build",
"-t", "ha-addon-test:latest",
str(self.root),
]
print(f"\nBuilding Docker image from: {self.root}")
try:
result = subprocess.run(
cmd,
check=True,
capture_output=True,
text=True,
cwd=str(self.root)
)
print("✓ Docker build successful")
if result.stdout:
print("\nBuild output (last 20 lines):")
print('\n'.join(result.stdout.splitlines()[-20:]))
except subprocess.CalledProcessError as e:
print("\n✗ Docker build failed")
print("\nSTDOUT:")
print(e.stdout)
print("\nSTDERR:")
print(e.stderr)
pytest.fail(
f"Docker build failed with exit code {e.returncode}. "
"This simulates a Supervisor build failure."
)
@pytest.mark.docker
def test_addon_builder_validation(self, is_finalize: bool):
"""Validate add-on can be built using Home Assistant's builder tool.
This is the closest to what Supervisor does when installing an add-on.
"""
if not is_finalize:
pytest.skip("Skipping add-on builder validation test — not full run")
# Check if Docker is available
try:
subprocess.run(
["docker", "--version"],
capture_output=True,
check=True
)
except (FileNotFoundError, subprocess.CalledProcessError):
pytest.skip("Docker not found or not running")
print(f"\nValidating add-on with builder: {self.root}")
# Read config to get architecture info
cfg_path = self.root / "config.yaml"
with open(cfg_path) as f:
cfg = yaml.safe_load(f)
# Detect host architecture
import platform
machine = platform.machine().lower()
# Map Python's platform names to Home Assistant architectures
arch_map = {
"x86_64": "amd64",
"amd64": "amd64",
"aarch64": "aarch64",
"arm64": "aarch64",
"armv7l": "armv7",
"armv7": "armv7",
}
host_arch = arch_map.get(machine, "amd64")
# Check if config supports this architecture
if host_arch not in cfg["arch"]:
pytest.skip(
f"Add-on doesn't support host architecture {host_arch}. "
f"Supported: {', '.join(cfg['arch'])}"
)
print(f"Using builder for architecture: {host_arch}")
# The builder expects specific arguments for building
builder_image = f"ghcr.io/home-assistant/{host_arch}-builder:latest"
result = subprocess.run(
[
"docker", "run", "--rm", "--privileged",
"-v", f"{self.root}:/data",
"-v", "/var/run/docker.sock:/var/run/docker.sock",
builder_image,
"--generic", cfg["version"],
"--target", "/data",
f"--{host_arch}",
"--test"
],
capture_output=True,
text=True,
cwd=str(self.root),
check=False,
timeout=600
)
# Print output for debugging
if result.stdout:
print("\nBuilder stdout:")
print(result.stdout)
if result.stderr:
print("\nBuilder stderr:")
print(result.stderr)
# Check result
if result.returncode != 0:
# Check if it's just because the builder tool is unavailable
if "exec format error" in result.stderr or "not found" in result.stderr:
pytest.fail(
"Builder tool not compatible with this system."
)
pytest.fail(
f"Add-on builder validation failed with exit code {result.returncode}"
)
print("✓ Add-on builder validation passed")
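The architecture detection above maps Python's `platform.machine()` names onto Home Assistant architecture labels. As a small standalone sketch of that mapping (defaulting to `amd64` for unknown machines, as the test does):

```python
import platform

# Mapping from Python's platform names to Home Assistant architectures.
ARCH_MAP = {
    "x86_64": "amd64",
    "amd64": "amd64",
    "aarch64": "aarch64",
    "arm64": "aarch64",
    "armv7l": "armv7",
    "armv7": "armv7",
}


def host_ha_arch() -> str:
    # Unknown machines fall back to amd64, mirroring the test's default.
    return ARCH_MAP.get(platform.machine().lower(), "amd64")
```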
def test_build_yaml_if_exists(self):
"""If build.yaml exists, validate its structure."""
build_path = self.root / "build.yaml"
if not build_path.exists():
pytest.skip("build.yaml not present (optional)")
with open(build_path) as f:
build_cfg = yaml.safe_load(f)
assert "build_from" in build_cfg, "build.yaml must contain 'build_from'"
assert isinstance(build_cfg["build_from"], dict), "'build_from' must be a dictionary"
print("✓ build.yaml structure valid")
print(f" Architectures defined: {', '.join(build_cfg['build_from'].keys())}")
def test_addon_configuration_complete(self):
"""Comprehensive validation of add-on configuration.
Checks all required fields and common configuration issues.
"""
cfg_path = self.root / "config.yaml"
with open(cfg_path) as f:
cfg = yaml.safe_load(f)
# Required top-level fields
required_fields = ["name", "version", "slug", "description", "arch"]
for field in required_fields:
assert field in cfg, f"Missing required field: {field}"
# Validate specific fields
assert isinstance(cfg["arch"], list), "arch must be a list"
assert len(cfg["arch"]) > 0, "arch list cannot be empty"
valid_archs = ["aarch64", "amd64", "armhf", "armv7", "i386"]
for arch in cfg["arch"]:
assert arch in valid_archs, f"Invalid architecture: {arch}"
# Validate version format (should be semantic versioning)
version = cfg["version"]
assert isinstance(version, str), "version must be a string"
# Validate slug (lowercase, no special chars except dash)
slug = cfg["slug"]
assert slug.islower() or "-" in slug, "slug should be lowercase"
assert slug.replace("-", "").replace("_", "").isalnum(), \
"slug should only contain alphanumeric characters, dash, or underscore"
# Optional but common fields
if "startup" in cfg:
valid_startup = ["initialize", "system", "services", "application", "once"]
assert cfg["startup"] in valid_startup, \
f"Invalid startup value: {cfg['startup']}"
if "boot" in cfg:
valid_boot = ["auto", "manual"]
assert cfg["boot"] in valid_boot, f"Invalid boot value: {cfg['boot']}"
# Validate ingress configuration
if cfg.get("ingress"):
assert "ingress_port" in cfg, "ingress_port required when ingress is enabled"
ingress_port = cfg["ingress_port"]
assert isinstance(ingress_port, int), "ingress_port must be an integer"
assert 1 <= ingress_port <= 65535, "ingress_port must be a valid port number"
# Ingress port should NOT be in ports section
ports = cfg.get("ports", {})
port_key = f"{ingress_port}/tcp"
assert port_key not in ports, \
f"Port {ingress_port} is used for ingress and should not be in 'ports' section"
# Validate URL if present
if "url" in cfg:
url = cfg["url"]
assert url.startswith("http://") or url.startswith("https://"), \
"URL must start with http:// or https://"
# Validate map directories if present
if "map" in cfg:
assert isinstance(cfg["map"], list), "map must be a list"
valid_mappings = ["config", "ssl", "addons", "backup", "share", "media"]
for mapping in cfg["map"]:
# Handle both "config:rw" and "config" formats
base_mapping = mapping.split(":")[0]
assert base_mapping in valid_mappings, \
f"Invalid map directory: {base_mapping}"
print("✓ Add-on configuration validation passed")
print(f" Name: {cfg['name']}")
print(f" Version: {cfg['version']}")
print(f" Slug: {cfg['slug']}")
print(f" Architectures: {', '.join(cfg['arch'])}")
if "startup" in cfg:
print(f" Startup: {cfg['startup']}")
if cfg.get("ingress"):
print(f" Ingress: enabled on port {cfg['ingress_port']}")
def test_ingress_configuration_consistent(self):
"""If ingress is enabled, ensure port configuration is correct."""
cfg_path = self.root / "config.yaml"
with open(cfg_path) as f:
cfg = yaml.safe_load(f)
if not cfg.get("ingress"):
pytest.skip("Ingress not enabled")
# If ingress is enabled, check configuration
assert "ingress_port" in cfg, "ingress_port must be specified when ingress is enabled"
ingress_port = cfg["ingress_port"]
# The ingress port should NOT be in the ports section
ports = cfg.get("ports", {})
port_key = f"{ingress_port}/tcp"
if port_key in ports:
pytest.fail(
f"Port {ingress_port} is used for ingress but also listed in 'ports' section. "
f"Remove it from 'ports' to avoid conflicts."
)
print(f"✓ Ingress configuration valid (port {ingress_port})")
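The ingress consistency rule checked above boils down to one predicate: when ingress is enabled, its port must not also appear in the `ports` section. A minimal sketch of that check, using dicts in place of a parsed `config.yaml` (the port values are taken from this PR's 8503/8504 setup):

```python
def ingress_conflict(cfg: dict) -> bool:
    # True when the ingress port is also exposed in the add-on's ports
    # section, which the tests above treat as a configuration error.
    if not cfg.get("ingress"):
        return False
    return f"{cfg['ingress_port']}/tcp" in cfg.get("ports", {})


ok_cfg = {"ingress": True, "ingress_port": 8504, "ports": {"8503/tcp": 8503}}
bad_cfg = {"ingress": True, "ingress_port": 8504, "ports": {"8504/tcp": 8504}}
```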


@@ -8,7 +8,7 @@ from unittest.mock import patch
import pytest
from loguru import logger
-from akkudoktoreos.core.logging import track_logging_config
+from akkudoktoreos.core.logging import logging_track_config
# -----------------------------
# logsettings
@@ -20,14 +20,14 @@ class TestLoggingCommonSettings:
logger.remove()
def test_valid_console_level_sets_logging(self, config_eos, caplog):
-config_eos.track_nested_value("/logging", track_logging_config)
+config_eos.track_nested_value("/logging", logging_track_config)
config_eos.set_nested_value("/logging/console_level", "INFO")
assert config_eos.get_nested_value("/logging/console_level") == "INFO"
assert config_eos.logging.console_level == "INFO"
assert any("console: INFO" in message for message in caplog.messages)
def test_valid_console_level_calls_tracking_callback(self, config_eos):
-with patch("akkudoktoreos.core.logging.track_logging_config") as mock_setup:
+with patch("akkudoktoreos.core.logging.logging_track_config") as mock_setup:
config_eos.track_nested_value("/logging", mock_setup)
config_eos.set_nested_value("/logging/console_level", "INFO")
assert config_eos.get_nested_value("/logging/console_level") == "INFO"


@@ -1,3 +1,4 @@
import asyncio
import json
import os
import signal
@@ -44,73 +45,116 @@ class TestServer:
class TestServerStartStop:
-def test_server_start_eosdash(self, tmpdir):
-"""Test the EOSdash server startup from EOS."""
-# Do not use any fixture as this will make pytest the owner of the EOSdash port.
-host = get_default_host()
-port = 8503
-eosdash_host = host
-eosdash_port = 8504
+@pytest.mark.asyncio
+async def test_server_start_eosdash(self, config_eos, monkeypatch, tmp_path):
+"""Test the EOSdash server startup from EOS.
+Do not use any fixture as this will make pytest the owner of the EOSdash port.
+Tests that:
+1. EOSdash starts via the supervisor
+2. The /eosdash/health endpoint returns OK
+3. EOSdash reports correct status and version
+4. EOSdash can be terminated cleanly
+"""
+eos_dir = tmp_path
+monkeypatch.setenv("EOS_DIR", str(eos_dir))
+monkeypatch.setenv("EOS_CONFIG_DIR", str(eos_dir))
+# Import with environment vars set to prevent creation of EOS.config.json in wrong dir.
+from akkudoktoreos.server.rest.starteosdash import run_eosdash_supervisor
+config_eos.server.host = get_default_host()
+config_eos.server.port = 8503
+config_eos.server.eosdash_host = config_eos.server.host
+config_eos.server.eosdash_port = 8504
timeout = 120
-server = f"http://{host}:{port}"
-eosdash_server = f"http://{eosdash_host}:{eosdash_port}"
-eos_dir = str(tmpdir)
+eosdash_server = f"http://{config_eos.server.eosdash_host}:{config_eos.server.eosdash_port}"
# Cleanup any EOS and EOSdash process left.
-cleanup_eos_eosdash(host, port, eosdash_host, eosdash_port, timeout)
-# Import after test setup to prevent creation of config file before test
-from akkudoktoreos.server.eos import start_eosdash
-# Port may be blocked
-assert wait_for_port_free(eosdash_port, timeout=120, waiting_app_name="EOSdash")
-process = start_eosdash(
-host=eosdash_host,
-port=eosdash_port,
-eos_host=host,
-eos_port=port,
-log_level="DEBUG",
-access_log=False,
-reload=False,
-eos_dir=eos_dir,
-eos_config_dir=eos_dir,
+cleanup_eos_eosdash(
+host=config_eos.server.host,
+port=config_eos.server.port,
+eosdash_host=config_eos.server.eosdash_host,
+eosdash_port=config_eos.server.eosdash_port,
+server_timeout=timeout,
)
-# Assure EOSdash is up
+# Port may be blocked
+assert wait_for_port_free(config_eos.server.eosdash_port, timeout=120, waiting_app_name="EOSdash")
+"""Start the EOSdash supervisor as a background task for testing."""
+task = asyncio.create_task(run_eosdash_supervisor())
+# give the supervisor some time to begin starting EOSdash
+await asyncio.sleep(1)
+# ---------------------------------
+# Wait for health endpoint to come up
+# ---------------------------------
startup = False
error = ""
for retries in range(int(timeout / 3)):
try:
-result = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
-if result.status_code == HTTPStatus.OK:
+resp = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
+if resp.status_code == HTTPStatus.OK:
startup = True
break
-error = f"{result.status_code}, {str(result.content)}"
+error = f"{resp.status_code}, {str(resp.content)}"
except Exception as ex:
error = str(ex)
-time.sleep(3)
-assert startup, f"Connection to {eosdash_server}/eosdash/health failed: {error}"
-health = result.json()
-assert health["status"] == "alive"
-assert health["version"] == __version__
+await asyncio.sleep(3)
-# Shutdown eosdash
+# Graceful shutdown of the background task
+# Do it before any assert
+task.cancel()
try:
-result = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
-if result.status_code == HTTPStatus.OK:
-pid = result.json()["pid"]
-os.kill(pid, signal.SIGTERM)
-time.sleep(1)
-result = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
-assert result.status_code != HTTPStatus.OK
-except:
+await task
+except asyncio.CancelledError:
pass
-# Cleanup any EOS and EOSdash process left.
-cleanup_eos_eosdash(host, port, eosdash_host, eosdash_port, timeout)
+assert startup, f"Connection to {eosdash_server}/eosdash/health failed: {error}"
+health = resp.json()
+assert health.get("status") == "alive"
+assert health.get("version") == __version__
+# ---------------------------------
+# Shutdown EOSdash (as provided)
+# ---------------------------------
+try:
+resp = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
+if resp.status_code == HTTPStatus.OK:
+pid = resp.json().get("pid")
+assert pid is not None, "EOSdash did not report a PID"
+os.kill(pid, signal.SIGTERM)
+time.sleep(1)
+# After shutdown, the server should not respond OK anymore
+try:
+resp2 = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
+assert resp2.status_code != HTTPStatus.OK
+except Exception:
+pass # expected
+except Exception:
+pass # ignore shutdown errors for safety
+# ---------------------------------
+# Cleanup any leftover processes
+# ---------------------------------
+cleanup_eos_eosdash(
+host=config_eos.server.host,
+port=config_eos.server.port,
+eosdash_host=config_eos.server.eosdash_host,
+eosdash_port=config_eos.server.eosdash_port,
+server_timeout=timeout,
+)
@pytest.mark.skipif(os.name == "nt", reason="Server restart not supported on Windows")
def test_server_restart(self, server_setup_for_function, is_system_test):


@@ -6,6 +6,8 @@ from pathlib import Path
import pytest
import yaml
from akkudoktoreos.core.version import _version_calculate, _version_hash
DIR_PROJECT_ROOT = Path(__file__).parent.parent
GET_VERSION_SCRIPT = DIR_PROJECT_ROOT / "scripts" / "get_version.py"
BUMP_DEV_SCRIPT = DIR_PROJECT_ROOT / "scripts" / "bump_dev_version.py"
@@ -18,6 +20,54 @@ def write_file(path: Path, content: str):
return path
# --- Test version helpers ---
def test_version_non_dev(monkeypatch):
"""If VERSION_BASE does not end with 'dev', no hash digits are appended."""
monkeypatch.setattr("akkudoktoreos.core.version.VERSION_BASE", "0.2.0")
result = _version_calculate()
assert result == "0.2.0"
def test_version_dev_precision_8(monkeypatch):
"""Test that a dev version appends exactly 8 digits derived from the hash."""
fake_hash = "abcdef1234567890" # deterministic fake digest
monkeypatch.setattr("akkudoktoreos.core.version._version_hash", lambda: fake_hash)
monkeypatch.setattr("akkudoktoreos.core.version.VERSION_BASE", "0.2.0.dev")
monkeypatch.setattr("akkudoktoreos.core.version.VERSION_DEV_PRECISION", 8)
result = _version_calculate()
# compute expected suffix
hash_value = int(fake_hash, 16)
expected_digits = str(hash_value % (10 ** 8)).zfill(8)
expected = f"0.2.0.dev{expected_digits}"
assert result == expected
assert len(expected_digits) == 8
assert result.startswith("0.2.0.dev")
def test_version_dev_precision_8_different_hash(monkeypatch):
"""A different hash must produce a different 8-digit suffix."""
fake_hash = "1234abcd9999ffff"
monkeypatch.setattr("akkudoktoreos.core.version._version_hash", lambda: fake_hash)
monkeypatch.setattr("akkudoktoreos.core.version.VERSION_BASE", "0.2.0.dev")
monkeypatch.setattr("akkudoktoreos.core.version.VERSION_DEV_PRECISION", 8)
result = _version_calculate()
hash_value = int(fake_hash, 16)
expected_digits = str(hash_value % (10 ** 8)).zfill(8)
expected = f"0.2.0.dev{expected_digits}"
assert result == expected
assert len(expected_digits) == 8
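The dev-version tests above pin down the suffix derivation: the hex digest is read as an integer and reduced to a fixed number of decimal digits, since Home Assistant expects digits only and docker expects the `.dev` separator. A minimal sketch of that calculation (`dev_suffix` is an illustrative helper name, mirroring what `_version_calculate` is expected to do):

```python
def dev_suffix(hex_digest: str, precision: int = 8) -> str:
    # Interpret the hex digest as an integer and keep `precision` decimal
    # digits, zero-padded, so the suffix is digits only.
    return str(int(hex_digest, 16) % (10 ** precision)).zfill(precision)


# Using the deterministic fake digest from the test above:
version = "0.2.0.dev" + dev_suffix("abcdef1234567890")
```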
# --- 1⃣ Test get_version.py ---
def test_get_version_prints_non_empty():
result = subprocess.run(
@@ -62,7 +112,7 @@ def test_bump_dev_version_appends_dev(tmp_path):
check=True
)
new_version = result.stdout.strip()
assert new_version == "0.2.0+dev"
assert new_version == "0.2.0.dev"
content = version_file.read_text()
assert f'VERSION_BASE = "{new_version}"' in content
@@ -113,7 +163,7 @@ def test_workflow_git(tmp_path):
check=True
)
dev_version = result.stdout.strip()
assert dev_version.endswith("+dev")
assert dev_version.count("+dev") == 1
assert dev_version.endswith(".dev")
assert dev_version.count(".dev") == 1
content = version_file.read_text()
assert f'VERSION_BASE = "{dev_version}"' in content


@@ -276,44 +276,44 @@
"Gesamteinnahmen_Euro": 1.1542928225199272,
"Gesamtkosten_Euro": 2.6531771286148773,
"Home_appliance_wh_per_hour": [
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null,
-null
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0,
+0.0
],
"Kosten_Euro_pro_Stunde": [
0.027996119999999992,


@@ -232,7 +232,7 @@
1232.67,
871.26,
860.88,
-2658.03,
+2658.0299999999997,
1222.72,
1221.04,
949.99,
@@ -396,7 +396,7 @@
4.174095896658514e-14,
0.0003442778967139274,
0.0,
-0.3686370794492921,
+0.368637079449292,
0.0,
0.08231598,
0.174597189,
@@ -436,7 +436,7 @@
2.270998855635753e-10,
1.7179535764168035,
0.0,
-1623.9518918471017,
+1623.9518918471015,
0.0,
257.64,
566.69,