feat: add Home Assistant and NodeRED adapters (#764)

Adapters for Home Assistant and NodeRED integration are added.
Akkudoktor-EOS can now be run as a Home Assistant add-on or standalone.

As a Home Assistant add-on, EOS uses ingress to fully integrate the EOSdash
dashboard into Home Assistant.

The change includes several bug fixes that are not directly related to the
adapter implementation but are necessary to keep EOS running properly and to
test and document the changes.

* fix: development version scheme

  The development versioning scheme is adapted to fit Docker and
  Home Assistant expectations. The new scheme is x.y.z and x.y.z.dev<hash>.
  The hash consists of digits only, as expected by Home Assistant. The
  development version carries the .dev suffix, as expected by Docker.
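
A minimal sketch of a check for this scheme; the regex is an illustration,
not EOS code:

```python
import re

# Release versions are "x.y.z"; development versions are "x.y.z.dev<hash>",
# where <hash> is digits only (as Home Assistant expects).
VERSION_RE = re.compile(r"^\d+\.\d+\.\d+(\.dev\d+)?$")

def is_valid_version(version: str) -> bool:
    """Return True if the version string matches the scheme."""
    return VERSION_RE.fullmatch(version) is not None
```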

* fix: use mean value in interval on resampling for array

  When downsampling data, use the mean value of all values within the new
  sampling interval.
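
The behaviour can be sketched with pandas (series values and sampling rates
are illustrative, not from the EOS code base):

```python
import pandas as pd

# Four 15-minute samples within one hour.
values = pd.Series(
    [10.0, 20.0, 30.0, 40.0],
    index=pd.date_range("2025-01-01 00:00", periods=4, freq="15min"),
)

# Downsample from 15-minute to 1-hour resolution using the interval mean,
# instead of e.g. keeping only the first sample of each interval.
hourly = values.resample("1h").mean()
```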

* fix: default battery ev soc and appliance wh

  Make the genetic simulation return default values for the
  battery SoC, electric vehicle SoC and appliance load if these
  assets are not used.

* fix: import json string

  Strip outer quotes from JSON strings on import to match what json.loads()
  expects.
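
A hedged sketch of the import behaviour described above, assuming the outer
quotes are a single accidental pair wrapped around an otherwise valid JSON
document (the helper name is illustrative):

```python
import json
from typing import Any

def loads_lenient(text: str) -> Any:
    """Parse JSON, stripping one pair of accidental outer quotes if needed."""
    text = text.strip()
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        # A payload like '{"a": 1}' wrapped in extra quotes is not valid
        # JSON; remove the outer quote pair and retry.
        if len(text) >= 2 and text[0] == text[-1] and text[0] in ("'", '"'):
            return json.loads(text[1:-1])
        raise
```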

* fix: default interval definition for import data

  The default interval must be defined as a lowercase human-readable
  definition to be accepted by pendulum.

* fix: clearoutside schema change

* feat: add adapters for integrations

  Adapters for Home Assistant and NodeRED integration are added.
  Akkudoktor-EOS can now be run as a Home Assistant add-on or standalone.

  As a Home Assistant add-on, EOS uses ingress to fully integrate the EOSdash
  dashboard into Home Assistant.

* feat: allow eos to be started with root permissions and drop privileges

  Home Assistant starts all add-ons with root permissions. EOS now drops
  root permissions if an applicable user is defined by the parameter
  --run_as_user. The Docker image defines the user eos to be used.
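
The privilege drop can be sketched roughly as follows. This is a
simplification, not the EOS startup code; a production version would likely
also handle supplementary groups and environment cleanup:

```python
import os
import pwd

def drop_privileges(username: str) -> None:
    """Switch from root to the given unprivileged user (e.g. 'eos')."""
    if os.getuid() != 0:
        return  # already unprivileged, nothing to do
    user = pwd.getpwnam(username)
    os.setgid(user.pw_gid)  # drop group first, while still root
    os.setuid(user.pw_uid)  # then drop user; irreversible
```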

* feat: make eos supervise and monitor EOSdash

  EOS now not only starts EOSdash but also monitors EOSdash during runtime
  and restarts it on fault. EOSdash logging is captured by EOS and forwarded
  to the EOS log to provide better visibility.
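
A minimal supervision loop in the spirit of the description above; the
command, log prefix, and restart policy are illustrative, not EOS's actual
behaviour:

```python
import subprocess
import sys
import time

def supervise(cmd: list[str], max_restarts: int = 3) -> None:
    """Start a child process, forward its output, restart it on fault."""
    for _ in range(max_restarts + 1):
        proc = subprocess.Popen(
            cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True
        )
        assert proc.stdout is not None
        for line in proc.stdout:
            # Capture the child's logging and forward it to our own log.
            sys.stdout.write(f"[child] {line}")
        if proc.wait() == 0:
            return  # clean exit, stop supervising
        time.sleep(0.1)  # brief backoff before restarting the child
    # Give up silently after max_restarts faults (illustrative policy).
```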

* feat: add duration to string conversion

  Make to_duration also return the duration as a string on request.
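
A sketch of the requested behaviour, assuming a simple API; the real
to_duration() accepts more input types, and its signature and string format
may differ:

```python
from datetime import timedelta
from typing import Union

def to_duration(seconds: float, as_string: bool = False) -> Union[timedelta, str]:
    """Build a duration, optionally returned as a string on request."""
    duration = timedelta(seconds=seconds)
    if as_string:
        return str(duration)  # e.g. "1:00:00" for one hour
    return duration
```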

* chore: Use info logging to report missing optimization parameters

  In parameter preparation for automatic optimization, an error was logged
  for missing parameters. Logging is now done at the info level.

* chore: make EOSdash use the EOS data directory for file import/ export

  EOSdash uses the EOS data directory for file import/export by default.
  This also allows the configuration import/export function to be used
  within Docker images.

* chore: improve EOSdash config tab display

  Improve display of JSON code and add more forms for config value update.

* chore: make docker image file system layout similar to home assistant

  Only the /data directory is used for persistent data. It is handled as a
  Docker volume. The /data volume is mapped to ~/.local/share/net.akkudoktor.eos
  when using docker compose.

* chore: add home assistant add-on development environment

  Add a VSCode devcontainer and task definitions for Home Assistant add-on
  development.

* chore: improve documentation
This commit is contained in:
Bobby Noelte
2025-12-30 22:08:21 +01:00
committed by GitHub
parent 02c794460f
commit 58d70e417b
111 changed files with 6815 additions and 1199 deletions


@@ -0,0 +1,94 @@
from typing import TYPE_CHECKING, Optional, Union
from pydantic import Field, computed_field, field_validator
from akkudoktoreos.adapter.adapterabc import AdapterContainer
from akkudoktoreos.adapter.homeassistant import (
HomeAssistantAdapter,
HomeAssistantAdapterCommonSettings,
)
from akkudoktoreos.adapter.nodered import NodeREDAdapter, NodeREDAdapterCommonSettings
from akkudoktoreos.config.configabc import SettingsBaseModel
if TYPE_CHECKING:
adapter_providers: list[str]
class AdapterCommonSettings(SettingsBaseModel):
"""Adapter Configuration."""
provider: Optional[list[str]] = Field(
default=None,
json_schema_extra={
"description": ("List of adapter provider id(s) of provider(s) to be used."),
"examples": [["HomeAssistant"], ["HomeAssistant", "NodeRED"]],
},
)
homeassistant: HomeAssistantAdapterCommonSettings = Field(
default_factory=HomeAssistantAdapterCommonSettings,
json_schema_extra={"description": "Home Assistant adapter settings."},
)
nodered: NodeREDAdapterCommonSettings = Field(
default_factory=NodeREDAdapterCommonSettings,
json_schema_extra={"description": "NodeRED adapter settings."},
)
@computed_field # type: ignore[prop-decorator]
@property
def providers(self) -> list[str]:
"""Available electricity price provider ids."""
return adapter_providers
# Validators
@field_validator("provider", mode="after")
@classmethod
def validate_provider(cls, value: Optional[list[str]]) -> Optional[list[str]]:
if value is None:
return value
for provider_id in value:
if provider_id not in adapter_providers:
raise ValueError(
f"Provider '{provider_id}' is not a valid adapter provider: {adapter_providers}."
)
return value
class Adapter(AdapterContainer):
"""Adapter container to manage multiple adapter providers.
Attributes:
providers (list[Union[HomeAssistantAdapter, NodeREDAdapter]]):
List of adapter provider instances, in the order they should be updated.
Providers may depend on updates from others.
"""
providers: list[
Union[
HomeAssistantAdapter,
NodeREDAdapter,
]
] = Field(default_factory=list, json_schema_extra={"description": "List of adapter providers"})
# Initialize adapter providers, all are singletons.
homeassistant_adapter = HomeAssistantAdapter()
nodered_adapter = NodeREDAdapter()
def get_adapter() -> Adapter:
"""Gets the EOS adapter data."""
# Initialize Adapter instance with providers in the required order
# Care for provider sequence as providers may rely on others to be updated before.
adapter = Adapter(
providers=[
homeassistant_adapter,
nodered_adapter,
]
)
return adapter
# Valid adapter providers
adapter_providers = [provider.provider_id() for provider in get_adapter().providers]


@@ -0,0 +1,160 @@
"""Abstract and base classes for adapters."""
from abc import abstractmethod
from typing import Any, Optional
from loguru import logger
from pydantic import (
Field,
field_validator,
)
from akkudoktoreos.core.coreabc import (
ConfigMixin,
MeasurementMixin,
SingletonMixin,
StartMixin,
)
from akkudoktoreos.core.pydantic import PydanticBaseModel
from akkudoktoreos.utils.datetimeutil import (
DateTime,
)
class AdapterProvider(SingletonMixin, ConfigMixin, MeasurementMixin, StartMixin, PydanticBaseModel):
"""Abstract base class for adapter providers with singleton thread-safety and configurable data parameters.
Note:
Derived classes have to provide their own _update_data method.
"""
update_datetime: Optional[DateTime] = Field(
None, json_schema_extra={"description": "Latest update datetime for adapter data"}
)
@abstractmethod
def provider_id(self) -> str:
"""Return the unique identifier for the adapter provider.
To be implemented by derived classes.
"""
return "AdapterProvider"
def enabled(self) -> bool:
"""Return True if the provider is enabled according to configuration.
Can be overwritten by derived classes.
"""
if self.config.adapter is None:
return False
if isinstance(self.config.adapter.provider, str):
return self.provider_id() == self.config.adapter.provider
if isinstance(self.config.adapter.provider, list):
return self.provider_id() in self.config.adapter.provider
return False
@abstractmethod
def _update_data(self) -> None:
"""Abstract method for custom adapter data update logic, to be implemented by derived classes.
Data update may be requested at different stages of energy management. The stage can be
detected by self.ems.stage().
"""
pass
def __init__(self, *args: Any, **kwargs: Any) -> None:
if hasattr(self, "_initialized"):
return
super().__init__(*args, **kwargs)
def update_data(
self,
force_enable: Optional[bool] = False,
) -> None:
"""Calls the custom update function if enabled or forced.
Args:
force_enable (bool, optional): If True, forces the update even if the provider is disabled.
"""
# Check after configuration is updated.
if not force_enable and not self.enabled():
return
# Call the custom update logic
logger.debug(f"Update adapter provider: {self.provider_id()}")
self._update_data()
class AdapterContainer(SingletonMixin, ConfigMixin, PydanticBaseModel):
"""A container for managing multiple adapter provider instances.
This class enables control of multiple adapter providers.
"""
providers: list[AdapterProvider] = Field(
default_factory=list, json_schema_extra={"description": "List of adapter providers"}
)
@field_validator("providers")
def check_providers(cls, value: list[AdapterProvider]) -> list[AdapterProvider]:
# Check each item in the list
for item in value:
if not isinstance(item, AdapterProvider):
raise TypeError(
f"Each item in the adapter providers list must be an AdapterProvider, got {type(item).__name__}"
)
return value
@property
def enabled_providers(self) -> list[Any]:
"""List of providers that are currently enabled."""
enab = []
for provider in self.providers:
if provider.enabled():
enab.append(provider)
return enab
def __init__(self, *args: Any, **kwargs: Any) -> None:
if hasattr(self, "_initialized"):
return
super().__init__(*args, **kwargs)
def provider_by_id(self, provider_id: str) -> AdapterProvider:
"""Retrieves an adapter provider by its unique identifier.
This method searches through the list of all available providers and
returns the first provider whose `provider_id` matches the given
`provider_id`. If no matching provider is found, a ValueError is raised.
Args:
provider_id (str): The unique identifier of the desired data provider.
Returns:
AdapterProvider: The adapter provider matching the given `provider_id`.
Raises:
ValueError: If the provider id is unknown.
Example:
provider = adapter.provider_by_id("HomeAssistant")
"""
providers = {provider.provider_id(): provider for provider in self.providers}
if provider_id not in providers:
error_msg = f"Unknown provider id: '{provider_id}' of '{providers.keys()}'."
logger.error(error_msg)
raise ValueError(error_msg)
return providers[provider_id]
def update_data(
self,
force_enable: Optional[bool] = False,
) -> None:
"""Calls the custom update function of all adapters if enabled or forced.
Args:
force_enable (bool, optional): If True, forces the update even if the provider is disabled.
"""
# Call the custom update logic
if len(self.providers) > 0:
for provider in self.providers:
provider.update_data(force_enable=force_enable)


@@ -0,0 +1,524 @@
"""Home Assistant adapter."""
import os
from typing import Optional, Union
import pandas as pd
import requests
from loguru import logger
from pydantic import Field, computed_field, field_validator
from akkudoktoreos.adapter.adapterabc import AdapterProvider
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.core.emplan import (
DDBCInstruction,
FRBCInstruction,
)
from akkudoktoreos.core.ems import EnergyManagementStage
from akkudoktoreos.devices.devices import get_resource_registry
from akkudoktoreos.utils.datetimeutil import to_datetime
# Supervisor API endpoint and token (injected automatically in add-on container)
CORE_API = "http://supervisor/core/api"
TOKEN = os.environ.get("SUPERVISOR_TOKEN")
HEADERS = {
"Authorization": f"Bearer {TOKEN}",
"Content-Type": "application/json",
}
HOMEASSISTANT_ENTITY_ID_PREFIX = "sensor.eos_"
resources_eos = get_resource_registry()
class HomeAssistantAdapterCommonSettings(SettingsBaseModel):
"""Common settings for the home assistant adapter."""
config_entity_ids: Optional[dict[str, str]] = Field(
default=None,
json_schema_extra={
"description": (
"Mapping of EOS config keys to Home Assistant entity IDs.\n"
"The config key has to be given by a /-separated path\n"
"e.g. devices/batteries/0/capacity_wh"
),
"examples": [
{
"devices/batteries/0/capacity_wh": "sensor.battery1_capacity",
}
],
},
)
load_emr_entity_ids: Optional[list[str]] = Field(
default=None,
json_schema_extra={
"description": "Entity ID(s) of load energy meter reading [kWh]",
"examples": [
["sensor.load_energy_total_kwh"],
["sensor.load_emr1_kwh", "sensor.load_emr2_kwh"],
],
},
)
pv_production_emr_entity_ids: Optional[list[str]] = Field(
default=None,
json_schema_extra={
"description": "Entity ID(s) of PV production energy meter reading [kWh]",
"examples": [
["sensor.pv_energy_total_kwh"],
["sensor.pv_emr1_kwh", "sensor.pv_emr2_kwh"],
],
},
)
device_measurement_entity_ids: Optional[dict[str, str]] = Field(
default=None,
json_schema_extra={
"description": "Mapping of EOS measurement keys used by device (resource) simulations to Home Assistant entity IDs.",
"examples": [
{
"ev11_soc_factor": "sensor.ev11_soc_factor",
"battery1_soc_factor": "sensor.battery1_soc_factor",
}
],
},
)
device_instruction_entity_ids: Optional[list[str]] = Field(
default=None,
json_schema_extra={
"description": (
"Entity IDs for device (resource) instructions to be updated by EOS.\n"
f"The device ids (resource ids) have to be prepended by '{HOMEASSISTANT_ENTITY_ID_PREFIX}' to build the entity_id.\n"
f"E.g. The instruction for device id 'battery1' becomes the entity_id "
f"'{HOMEASSISTANT_ENTITY_ID_PREFIX}battery1'."
),
"examples": [
[
f"{HOMEASSISTANT_ENTITY_ID_PREFIX}battery1",
]
],
},
)
solution_entity_ids: Optional[list[str]] = Field(
default=None,
json_schema_extra={
"description": (
"Entity IDs for optimization solution keys to be updated by EOS.\n"
f"The solution keys have to be prepended by '{HOMEASSISTANT_ENTITY_ID_PREFIX}' to build the entity_id.\n"
f"E.g. solution key 'battery1_idle_op_mode' becomes the entity_id "
f"'{HOMEASSISTANT_ENTITY_ID_PREFIX}battery1_idle_op_mode'."
),
"examples": [
[
f"{HOMEASSISTANT_ENTITY_ID_PREFIX}battery1_idle_op_mode",
]
],
},
)
# Computed fields
@computed_field # type: ignore[prop-decorator]
@property
def homeassistant_entity_ids(self) -> list[str]:
"""Entity IDs available at Home Assistant."""
try:
from akkudoktoreos.adapter.adapter import get_adapter
adapter_eos = get_adapter()
result = adapter_eos.provider_by_id("HomeAssistant").get_homeassistant_entity_ids()
except Exception:
return []
return result
@computed_field # type: ignore[prop-decorator]
@property
def eos_solution_entity_ids(self) -> list[str]:
"""Entity IDs for optimization solution available at EOS."""
try:
from akkudoktoreos.adapter.adapter import get_adapter
adapter_eos = get_adapter()
result = adapter_eos.provider_by_id("HomeAssistant").get_eos_solution_entity_ids()
except Exception:
return []
return result
@computed_field # type: ignore[prop-decorator]
@property
def eos_device_instruction_entity_ids(self) -> list[str]:
"""Entity IDs for energy management instructions available at EOS."""
try:
from akkudoktoreos.adapter.adapter import get_adapter
adapter_eos = get_adapter()
result = adapter_eos.provider_by_id(
"HomeAssistant"
).get_eos_device_instruction_entity_ids()
except Exception:
return []
return result
# Validators
@field_validator("solution_entity_ids", mode="after")
@classmethod
def validate_solution_entity_ids(cls, value: Optional[list[str]]) -> Optional[list[str]]:
if value is None:
return None
for entity_id in value:
if not entity_id.startswith(HOMEASSISTANT_ENTITY_ID_PREFIX):
raise ValueError(
f"Invalid optimization solution entity id '{entity_id}': prefix '{HOMEASSISTANT_ENTITY_ID_PREFIX}' expected."
)
return value
@field_validator("device_instruction_entity_ids", mode="after")
@classmethod
def validate_device_instruction_entity_ids(
cls, value: Optional[list[str]]
) -> Optional[list[str]]:
if value is None:
return None
for entity_id in value:
if not entity_id.startswith(HOMEASSISTANT_ENTITY_ID_PREFIX):
raise ValueError(
f"Invalid instruction entity id '{entity_id}': prefix '{HOMEASSISTANT_ENTITY_ID_PREFIX}' expected."
)
return value
class HomeAssistantAdapter(AdapterProvider):
@classmethod
def provider_id(cls) -> str:
"""Return the unique identifier for the adapter provider."""
return "HomeAssistant"
def get_homeassistant_entity_ids(self) -> list[str]:
"""Retrieve the available entity IDs from Home Assistant.
Returns:
list[str]: The available entity IDs, or [].
Example:
>>> entity_ids = get_homeassistant_entity_ids()
>>> print(entity_ids)
["sensor.pv_all", "sensor.battery1_soc"]
"""
if not TOKEN:
raise RuntimeError("Missing SUPERVISOR_TOKEN environment variable.")
entity_ids = []
url = f"{CORE_API}/states"
resp = requests.get(url, headers=HEADERS, timeout=10)
if resp.ok:
data = resp.json()
entity_ids = [
entity["entity_id"]
for entity in data
if not entity["entity_id"].startswith(HOMEASSISTANT_ENTITY_ID_PREFIX)
]
debug_msg = f"homeassistant_entity_ids: {entity_ids}"
logger.debug(debug_msg)
else:
error_msg = f"Failed to read entity states: {resp.text}"
logger.error(error_msg)
raise ValueError(error_msg)
return sorted(entity_ids)
def _entity_id_from_solution_key(self, key: str) -> str:
return HOMEASSISTANT_ENTITY_ID_PREFIX + key
def get_eos_solution_entity_ids(self) -> list[str]:
"""Retrieve the available entity IDs for the EOS optimization solution.
Returns:
list[str]: The available entity IDs, or [].
"""
solution_entity_ids = []
try:
optimization_solution_keys = self.config.optimization.keys
for key in sorted(optimization_solution_keys):
solution_entity_ids.append(self._entity_id_from_solution_key(key))
except Exception:
solution_entity_ids = []
return solution_entity_ids
def _entity_id_from_resource_id(self, resource_id: str) -> str:
return HOMEASSISTANT_ENTITY_ID_PREFIX + resource_id
def get_eos_device_instruction_entity_ids(self) -> list[str]:
"""Retrieve the available entity IDs for the EOS energy management plan instructions.
Returns:
list[str]: The available entity IDs, or [].
"""
instruction_entity_ids = []
plan = self.ems.plan()
if plan:
resource_ids = plan.get_resources()
for resource_id in resource_ids:
instruction_entity_ids.append(self._entity_id_from_resource_id(resource_id))
return sorted(instruction_entity_ids)
def set_entity_state(
self, entity_id: str, state_value: str, attributes: dict | None = None
) -> None:
"""Post or update a Home Assistant entity state.
Args:
entity_id (str): The Home Assistant entity ID to update.
state_value (str): The new state value for the entity.
attributes (dict | None): Optional dictionary of additional attributes.
Raises:
requests.RequestException: If the HTTP request to Home Assistant fails.
Example:
>>> set_entity_state("sensor.energy_optimizer_status", "running")
"""
if not TOKEN:
raise RuntimeError("Missing SUPERVISOR_TOKEN environment variable.")
url = f"{CORE_API}/states/{entity_id}"
data = {"state": state_value, "attributes": attributes or {}}
resp = requests.post(url, headers=HEADERS, json=data, timeout=10)
if resp.status_code not in (200, 201):
error_msg = f"Failed to update {entity_id}: {resp.text}"
logger.error(error_msg)
raise ValueError(error_msg)
else:
debug_msg = f"Updated {entity_id} = {state_value}"
logger.debug(debug_msg)
def get_entity_state(self, entity_id: str) -> str:
"""Retrieve the current state of an entity from Home Assistant.
Args:
entity_id (str): The Home Assistant entity ID to query.
Returns:
str: The current state of the entity.
Example:
>>> state = get_entity_state("switch.living_room_lamp")
>>> print(state)
"on"
"""
if not TOKEN:
raise RuntimeError("Missing SUPERVISOR_TOKEN environment variable.")
url = f"{CORE_API}/states/{entity_id}"
resp = requests.get(url, headers=HEADERS, timeout=10)
if resp.ok:
data = resp.json()
debug_msg = f"{entity_id}: {data['state']}"
logger.debug(debug_msg)
return data["state"]
else:
error_msg = f"Failed to read {entity_id}: {resp.text}"
logger.error(error_msg)
raise ValueError(error_msg)
def _convert_entity_state(self, state: str) -> Union[bool, float, str, None]:
"""Convert a Home Assistant entity state to a Python value.
This method converts the raw ``state`` string of a Home Assistant entity
into an appropriate Python type, following Home Assistant's global
state model and commonly used domain semantics.
Conversion rules:
**Availability states**
- ``"unavailable"``, ``"unknown"``, ``"none"`` → ``None``
**Binary / boolean states**
Used by binary sensors and many device domains:
- ``"on"``, ``"true"``, ``"yes"``, ``"open"``, ``"opening"``,
``"locked"``, ``"home"``, ``"detected"``, ``"active"`` → ``True``
- ``"off"``, ``"false"``, ``"no"``, ``"closed"``, ``"closing"``,
``"unlocked"``, ``"not_home"``, ``"clear"``, ``"idle"`` → ``False``
**Numeric states**
- Values that can be parsed as numbers are converted to ``float``.
This covers most sensor entities (temperature, power, energy, etc.).
**Other states**
- Any remaining states (e.g. ``"playing"``, ``"paused"``,
``"cooling"``, ``"heating"``, ``"standby"``, ``"jammed"``) are
returned as their original string value.
The input state is normalized using ``strip()`` and ``lower()`` before
conversion. If numeric conversion fails, the original unmodified
state string is returned.
Args:
state: Raw entity state as provided by Home Assistant.
Returns:
The converted entity state as one of:
``None``, ``bool``, ``float``, or ``str``.
"""
raw_state = state
value = state.strip().lower()
# Availability / unknown states
if value in {"unavailable", "unknown", "none"}:
return None
# States that semantically represent True
if value in {
"on",
"true",
"yes",
"y",
"open",
"opening",
"locked",
"home",
"detected",
"active",
}:
return True
# States that semantically represent False
if value in {
"off",
"false",
"no",
"n",
"closed",
"closing",
"unlocked",
"not_home",
"clear",
"idle",
}:
return False
# Numeric states (sensors, counters, percentages, etc.)
try:
return float(value)
except ValueError:
# Preserve original state for enums and free-text states
return raw_state
def _update_data(self) -> None:
stage = self.ems.stage()
if stage == EnergyManagementStage.DATA_ACQUISITION:
# Sync configuration
entity_ids = self.config.adapter.homeassistant.config_entity_ids
if entity_ids:
for (
config_key,
entity_id,
) in entity_ids.items():
try:
state = self.get_entity_state(entity_id)
logger.debug(f"Entity {entity_id}: {state}")
value = self._convert_entity_state(state)
if value:
self.config.set_nested_value(config_key, value)
except Exception as e:
logger.error(f"{e}")
# Retrieve measurements necessary for device simulations
entity_ids = self.config.adapter.homeassistant.device_measurement_entity_ids
if entity_ids:
for (
measurement_key,
entity_id,
) in entity_ids.items():
if entity_id:
try:
state = self.get_entity_state(entity_id)
logger.debug(f"Entity {entity_id}: {state}")
if state:
measurement_value = float(state)
self.measurement.update_value(
self.ems_start_datetime, measurement_key, measurement_value
)
except Exception as e:
logger.error(f"{e}")
# Retrieve measurements for load prediction
entity_ids = self.config.adapter.homeassistant.load_emr_entity_ids
if entity_ids:
measurement_keys = self.config.measurement.load_emr_keys
if measurement_keys is None:
measurement_keys = []
for entity_id in entity_ids:
measurement_key = entity_id
if measurement_key not in measurement_keys:
measurement_keys.append(measurement_key)
self.config.measurement.load_emr_keys = measurement_keys
try:
state = self.get_entity_state(entity_id)
logger.debug(f"Entity {entity_id}: {state}")
if state:
measurement_value = float(state)
self.measurement.update_value(
self.ems_start_datetime, measurement_key, measurement_value
)
except Exception as e:
logger.error(f"{e}")
# Retrieve measurements for PV prediction
entity_ids = self.config.adapter.homeassistant.pv_production_emr_entity_ids
if entity_ids:
measurement_keys = self.config.measurement.pv_production_emr_keys
if measurement_keys is None:
measurement_keys = []
for entity_id in entity_ids:
measurement_key = entity_id
if measurement_key not in measurement_keys:
measurement_keys.append(measurement_key)
self.config.measurement.pv_production_emr_keys = measurement_keys
try:
state = self.get_entity_state(entity_id)
logger.debug(f"Entity {entity_id}: {state}")
if state:
measurement_value = float(state)
self.measurement.update_value(
self.ems_start_datetime, measurement_key, measurement_value
)
except Exception as e:
logger.error(f"{e}")
# We got data - mark the update time
self.update_datetime = to_datetime()
if stage == EnergyManagementStage.CONTROL_DISPATCH:
# Currently active optimization solution
optimization_solution = self.ems.optimization_solution()
entity_ids = self.config.adapter.homeassistant.solution_entity_ids
if optimization_solution and entity_ids:
df = optimization_solution.solution.to_dataframe()
now = pd.Timestamp.now(tz=df.index.tz)
row = df.loc[:now].iloc[-1] # Last known value before now
for entity_id in entity_ids:
solution_key = entity_id[len(HOMEASSISTANT_ENTITY_ID_PREFIX) :]
try:
self.set_entity_state(entity_id, row[solution_key])
except Exception as e:
logger.error(f"{e}")
# Currently active instructions
instructions = self.ems.plan().get_active_instructions()
entity_ids = self.config.adapter.homeassistant.device_instruction_entity_ids
if instructions and entity_ids:
for instruction in instructions:
entity_id = self._entity_id_from_resource_id(instruction.resource_id)
if entity_id in entity_ids:
if isinstance(instruction, (DDBCInstruction, FRBCInstruction)):
state = instruction.operation_mode_id.lower()
attributes = {
"operation_mode_factor": instruction.operation_mode_factor,
}
try:
self.set_entity_state(entity_id, state, attributes)
except Exception as e:
logger.error(f"{e}")


@@ -0,0 +1,128 @@
"""Nod-RED adapter."""
from typing import Optional, Union
import requests
from loguru import logger
from pydantic import Field, field_validator
from akkudoktoreos.adapter.adapterabc import AdapterProvider
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.core.emplan import DDBCInstruction, FRBCInstruction
from akkudoktoreos.core.ems import EnergyManagementStage
from akkudoktoreos.server.server import get_default_host, validate_ip_or_hostname
from akkudoktoreos.utils.datetimeutil import to_datetime
class NodeREDAdapterCommonSettings(SettingsBaseModel):
r"""Common settings for the NodeRED adapter.
The Node-RED adapter sends to HTTP IN nodes.
This is the example flow:
[HTTP In \\<URL\\>] -> [Function (parse payload)] -> [Debug] -> [HTTP Response]
There are two URLs that are used:
- GET /eos/data_acquisition
The GET is issued before the optimization.
- POST /eos/control_dispatch
The POST is issued after the optimization.
"""
host: Optional[str] = Field(
default=get_default_host(),
json_schema_extra={
"description": "Node-RED server IP address. Defaults to 127.0.0.1.",
"examples": ["127.0.0.1", "localhost"],
},
)
port: Optional[int] = Field(
default=1880,
json_schema_extra={
"description": "Node-RED server IP port number. Defaults to 1880.",
"examples": [
1880,
],
},
)
@field_validator("host", mode="before")
def validate_server_host(cls, value: Optional[str]) -> Optional[str]:
if isinstance(value, str):
value = validate_ip_or_hostname(value)
return value
@field_validator("port")
def validate_server_port(cls, value: Optional[int]) -> Optional[int]:
if value is not None and not (1024 <= value <= 49151):
raise ValueError("Server port number must be between 1024 and 49151.")
return value
class NodeREDAdapter(AdapterProvider):
def provider_id(self) -> str:
"""Return the unique identifier for the adapter provider."""
return "NodeRED"
def _update_data(self) -> None:
"""Custom adapter data update logic.
Data update may be requested at different stages of energy management. The stage can be
detected by self.ems.stage().
"""
server = f"http://{self.config.adapter.nodered.host}:{self.config.adapter.nodered.port}"
data: Optional[dict[str, Union[str, float]]] = None
stage = self.ems.stage()
if stage == EnergyManagementStage.CONTROL_DISPATCH:
data = {}
# currently active instructions
instructions = self.ems.plan().get_active_instructions()
for instruction in instructions:
idx = instruction.id.find("@")
resource_id = instruction.id[:idx] if idx != -1 else instruction.id
operation_mode_id = "<unknown>"
operation_mode_factor = 0.0
if isinstance(instruction, (DDBCInstruction, FRBCInstruction)):
operation_mode_id = instruction.operation_mode_id
operation_mode_factor = instruction.operation_mode_factor
data[f"{resource_id}_op_mode"] = operation_mode_id
data[f"{resource_id}_op_factor"] = operation_mode_factor
elif stage == EnergyManagementStage.DATA_ACQUISITION:
data = {}
if data is None:
return
logger.info(f"NodeRED {str(stage).lower()} at {server}: {data}")
try:
error_msg = None
if stage == EnergyManagementStage.CONTROL_DISPATCH:
response = requests.post(f"{server}/eos/{str(stage).lower()}", json=data, timeout=5)
elif stage == EnergyManagementStage.DATA_ACQUISITION:
response = requests.get(f"{server}/eos/{str(stage).lower()}", json=data, timeout=5)
response.raise_for_status()
except requests.exceptions.HTTPError as e:
try:
# Try to get 'detail' from the JSON response
detail = response.json().get(
"detail", f"No error details for data '{data}' '{response.text}'"
)
except ValueError:
# Response is not JSON
detail = f"No error details for data '{data}' '{response.text}'"
error_msg = f"NodeRED `{str(stage).lower()}` fails at `{server}`: {detail}"
except Exception as e:
error_msg = f"NodeRED `{str(stage).lower()}` fails at `{server}`: {e}"
if error_msg:
logger.error(error_msg)
raise RuntimeError(error_msg)
if stage == EnergyManagementStage.DATA_ACQUISITION:
data = response.json()
# We got data - mark the update time
self.update_datetime = to_datetime()


@@ -13,7 +13,7 @@ import json
import os
import tempfile
from pathlib import Path
from typing import Any, ClassVar, Optional, Type
from typing import Any, ClassVar, Optional, Type, Union
import pydantic_settings
from loguru import logger
@@ -21,6 +21,7 @@ from platformdirs import user_config_dir, user_data_dir
from pydantic import Field, computed_field, field_validator
# settings
from akkudoktoreos.adapter.adapter import AdapterCommonSettings
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.config.configmigrate import migrate_config_data, migrate_config_file
from akkudoktoreos.core.cachesettings import CacheCommonSettings
@@ -65,25 +66,17 @@ def get_absolute_path(
class GeneralSettings(SettingsBaseModel):
"""Settings for common configuration.
General configuration to set directories of cache and output files and system location (latitude
and longitude).
Validators ensure each parameter is within a specified range. A computed property, `timezone`,
determines the time zone based on latitude and longitude.
Attributes:
latitude (Optional[float]): Latitude in degrees, must be between -90 and 90.
longitude (Optional[float]): Longitude in degrees, must be between -180 and 180.
Properties:
timezone (Optional[str]): Computed time zone string based on the specified latitude
and longitude.
"""
"""General settings."""
_config_folder_path: ClassVar[Optional[Path]] = None
_config_file_path: ClassVar[Optional[Path]] = None
# Detect Home Assistant add-on environment
# Home Assistant sets this environment variable automatically
_home_assistant_addon: ClassVar[bool] = (
"HASSIO_TOKEN" in os.environ or "SUPERVISOR_TOKEN" in os.environ
)
version: str = Field(
default=__version__,
json_schema_extra={
@@ -109,21 +102,21 @@ class GeneralSettings(SettingsBaseModel):
ge=-90.0,
le=90.0,
json_schema_extra={
"description": "Latitude in decimal degrees, between -90 and 90, north is positive (ISO 19115) (°)"
"description": "Latitude in decimal degrees between -90 and 90. North is positive (ISO 19115) (°)"
},
)
longitude: Optional[float] = Field(
default=13.405,
ge=-180.0,
le=180.0,
json_schema_extra={"description": "Longitude in decimal degrees, within -180 to 180 (°)"},
json_schema_extra={"description": "Longitude in decimal degrees within -180 to 180 (°)"},
)
# Computed fields
@computed_field # type: ignore[prop-decorator]
@property
def timezone(self) -> Optional[str]:
"""Compute timezone based on latitude and longitude."""
"""Computed timezone based on latitude and longitude."""
if self.latitude and self.longitude:
return to_timezone(location=(self.latitude, self.longitude), as_string=True)
return None
@@ -131,7 +124,10 @@ class GeneralSettings(SettingsBaseModel):
@computed_field # type: ignore[prop-decorator]
@property
def data_output_path(self) -> Optional[Path]:
"""Compute data_output_path based on data_folder_path."""
"""Computed data_output_path based on data_folder_path."""
if self.home_assistant_addon:
# Only /data is persistent for home assistant add-on
return Path("/data/output")
return get_absolute_path(self.data_folder_path, self.data_output_subpath)
@computed_field # type: ignore[prop-decorator]
@@ -146,6 +142,12 @@ class GeneralSettings(SettingsBaseModel):
"""Path to EOS configuration file."""
return self._config_file_path
@computed_field # type: ignore[prop-decorator]
@property
def home_assistant_addon(self) -> bool:
"""EOS is running as home assistant add-on."""
return self._home_assistant_addon
compatible_versions: ClassVar[list[str]] = [__version__]
@field_validator("version")
@@ -160,6 +162,22 @@ class GeneralSettings(SettingsBaseModel):
raise ValueError(error)
return v
@field_validator("data_folder_path", mode="after")
@classmethod
def validate_data_folder_path(cls, value: Optional[Union[str, Path]]) -> Optional[Path]:
"""Ensure dir is available."""
if cls._home_assistant_addon:
# Force to home assistant add-on /data directory
return Path("/data")
if value is None:
return None
if isinstance(value, str):
value = Path(value)
value = value.resolve()
if not value.is_dir():
raise ValueError(f"Data folder path '{value}' is not a directory.")
return value
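Note that `Path.resolve()` returns a new path rather than mutating in place, so its result must be assigned to take effect. A quick illustration:

```python
from pathlib import Path

p = Path("a/../b")
p.resolve()  # return value discarded: p is unchanged
assert p.parts == ("a", "..", "b")

resolved = p.resolve()  # keep the returned Path instead
assert ".." not in resolved.parts
```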
class SettingsEOS(pydantic_settings.BaseSettings, PydanticModelNestedValueMixin):
"""Settings for all EOS.
@@ -212,6 +230,9 @@ class SettingsEOS(pydantic_settings.BaseSettings, PydanticModelNestedValueMixin)
utils: Optional[UtilsCommonSettings] = Field(
default=None, json_schema_extra={"description": "Utilities Settings"}
)
adapter: Optional[AdapterCommonSettings] = Field(
default=None, json_schema_extra={"description": "Adapter Settings"}
)
model_config = pydantic_settings.SettingsConfigDict(
env_nested_delimiter="__",
@@ -242,6 +263,7 @@ class SettingsEOSDefaults(SettingsEOS):
weather: WeatherCommonSettings = WeatherCommonSettings()
server: ServerCommonSettings = ServerCommonSettings()
utils: UtilsCommonSettings = UtilsCommonSettings()
adapter: AdapterCommonSettings = AdapterCommonSettings()
def __hash__(self) -> int:
# Just for usage in configmigrate, finally overwritten when used by ConfigEOS.
@@ -297,6 +319,7 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
APP_NAME: ClassVar[str] = "net.akkudoktor.eos" # reverse order
APP_AUTHOR: ClassVar[str] = "akkudoktor"
EOS_DIR: ClassVar[str] = "EOS_DIR"
EOS_DATA_DIR: ClassVar[str] = "EOS_DATA_DIR"
EOS_CONFIG_DIR: ClassVar[str] = "EOS_CONFIG_DIR"
ENCODING: ClassVar[str] = "UTF-8"
CONFIG_FILE_NAME: ClassVar[str] = "EOS.config.json"
@@ -355,27 +378,7 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
- It ensures that a fallback to a default configuration file is always possible.
"""
# Ensure we know and have the config folder path and the config file
config_file = cls._setup_config_file()
# All the settings sources in priority sequence
setting_sources = [
@@ -384,7 +387,7 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
dotenv_settings,
]
# Append file settings to sources
file_settings: Optional[pydantic_settings.JsonConfigSettingsSource] = None
try:
backup_file = config_file.with_suffix(f".{to_datetime(as_string='YYYYMMDDHHmmss')}")
@@ -426,7 +429,7 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
# (Re-)load settings - call base class init
SettingsEOSDefaults.__init__(self, *args, **kwargs)
# Init config file and data folder paths
self._setup_config_file()
self._update_data_folder_path()
self._initialized = True
logger.debug("Config setup:\n{}", self)
@@ -559,17 +562,6 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
return result
def _update_data_folder_path(self) -> None:
"""Updates path to the data directory."""
# From Settings
@@ -579,7 +571,16 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
self.general.data_folder_path = data_dir
return
except Exception as e:
logger.warning(f"Could not setup data dir: {e}")
logger.warning(f"Could not setup data dir {data_dir}: {e}")
# From EOS_DATA_DIR env
if env_dir := os.getenv(self.EOS_DATA_DIR):
try:
data_dir = Path(env_dir).resolve()
data_dir.mkdir(parents=True, exist_ok=True)
self.general.data_folder_path = data_dir
return
except Exception as e:
logger.warning(f"Could not setup data dir {data_dir}: {e}")
# From EOS_DIR env
if env_dir := os.getenv(self.EOS_DIR):
try:
@@ -588,7 +589,7 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
self.general.data_folder_path = data_dir
return
except Exception as e:
logger.warning(f"Could not setup data dir: {e}")
logger.warning(f"Could not setup data dir {data_dir}: {e}")
# From platform specific default path
try:
data_dir = Path(user_data_dir(self.APP_NAME, self.APP_AUTHOR))
@@ -597,9 +598,10 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
self.general.data_folder_path = data_dir
return
except Exception as e:
logger.warning(f"Could not setup data dir: {e}")
logger.warning(f"Could not setup data dir {data_dir}: {e}")
# Current working directory
data_dir = Path.cwd()
logger.warning(f"Using data dir {data_dir}")
self.general.data_folder_path = data_dir
@classmethod
@@ -611,16 +613,28 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
2. user configuration directory
3. current working directory
If running as Home Assistant add-on returns /data/config/EOS.config.json.
Returns:
tuple[Path, bool]: The path to the configuration file and whether a config file already exists there
"""
if GeneralSettings._home_assistant_addon:
# Only /data is persistent for home assistant add-on
cfile = Path("/data/config") / cls.CONFIG_FILE_NAME
logger.debug(f"Config file forced to: '{cfile}'")
return cfile, cfile.exists()
config_dirs = []
env_eos_dir = os.getenv(cls.EOS_DIR)
logger.debug(f"Environment EOS_DIR: '{env_eos_dir}'")
env_eos_config_dir = os.getenv(cls.EOS_CONFIG_DIR)
logger.debug(f"Environment EOS_CONFIG_DIR: '{env_eos_config_dir}'")
env_config_dir = get_absolute_path(env_eos_dir, env_eos_config_dir)
logger.debug(f"Resulting environment config dir: '{env_config_dir}'")
if env_config_dir is not None:
config_dirs.append(env_config_dir.resolve())
config_dirs.append(Path(user_config_dir(cls.APP_NAME, cls.APP_AUTHOR)))
config_dirs.append(Path.cwd())
for cdir in config_dirs:
@@ -628,8 +642,52 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
if cfile.exists():
logger.debug(f"Found config file: '{cfile}'")
return cfile, True
return config_dirs[0].joinpath(cls.CONFIG_FILE_NAME), False
@classmethod
def _setup_config_file(cls) -> Path:
"""Setup config file.
Creates an initial config file if it does not exist.
Returns:
config_file_path (Path): Path to config file
"""
config_file_path, exists = cls._get_config_file_path()
if (
GeneralSettings._config_file_path
and GeneralSettings._config_file_path != config_file_path
):
debug_msg = (
f"Config file changed from '{GeneralSettings._config_file_path}' to "
f"'{config_file_path}'"
)
logger.debug(debug_msg)
if not exists:
# Create minimum config file
config_minimum_content = '{ "general": { "version": "' + __version__ + '" } }'
try:
config_file_path.parent.mkdir(parents=True, exist_ok=True)
config_file_path.write_text(config_minimum_content, encoding="utf-8")
except Exception as exc:
# Create minimum config in temporary config directory as last resort
error_msg = (
f"Could not create minimum config file in {config_file_path.parent}: {exc}"
)
logger.error(error_msg)
temp_dir = Path(tempfile.mkdtemp())
info_msg = f"Using temporary config directory {temp_dir}"
logger.info(info_msg)
config_file_path = temp_dir / config_file_path.name
config_file_path.write_text(config_minimum_content, encoding="utf-8")
# Remember config_dir and config file
GeneralSettings._config_folder_path = config_file_path.parent
GeneralSettings._config_file_path = config_file_path
return config_file_path
def to_config_file(self) -> None:
"""Saves the current configuration to the configuration file.

View File

@@ -21,6 +21,11 @@ if TYPE_CHECKING:
# - tuple[str, Callable[[Any], Any]] (new path + transform)
# - None (drop)
MIGRATION_MAP: Dict[str, Union[str, Tuple[str, Callable[[Any], Any]], None]] = {
# 0.2.0.dev -> 0.2.0.dev
"adapter/homeassistant/optimization_solution_entity_ids": (
"adapter/homeassistant/solution_entity_ids",
lambda v: v if isinstance(v, list) else None,
),
# 0.2.0 -> 0.2.0+dev
"elecprice/provider_settings/ElecPriceImport/import_file_path": "elecprice/elecpriceimport/import_file_path",
"elecprice/provider_settings/ElecPriceImport/import_json": "elecprice/elecpriceimport/import_json",

View File

@@ -27,7 +27,10 @@ class CacheCommonSettings(SettingsBaseModel):
# Do not make this a pydantic computed field. The pydantic model must be fully initialized
# to have access to config.general, which may not be the case if it is a computed field.
def path(self) -> Optional[Path]:
"""Compute cache path based on general.data_folder_path."""
"""Computed cache path based on general.data_folder_path."""
if self.config.general.home_assistant_addon:
# Only /data is persistent for home assistant add-on
return Path("/data/cache")
data_cache_path = self.config.general.data_folder_path
if data_cache_path is None or self.subpath is None:
return None

View File

@@ -18,12 +18,53 @@ from loguru import logger
from akkudoktoreos.core.decorators import classproperty
from akkudoktoreos.utils.datetimeutil import DateTime
adapter_eos: Any = None
config_eos: Any = None
measurement_eos: Any = None
prediction_eos: Any = None
ems_eos: Any = None
class AdapterMixin:
"""Mixin class for managing EOS adapter.
This class serves as a foundational component for EOS-related classes requiring access
to the global EOS adapters. It provides an `adapter` property that dynamically retrieves
the adapter instance.
Usage:
Subclass this base class to gain access to the `adapter` attribute, which retrieves the
global adapter instance lazily to avoid import-time circular dependencies.
Attributes:
adapter (Adapter): Property to access the global EOS adapter.
Example:
.. code-block:: python
class MyEOSClass(AdapterMixin):
def my_method(self):
self.adapter.update_date()
"""
@classproperty
def adapter(cls) -> Any:
"""Convenience class method/ attribute to retrieve the EOS adapters.
Returns:
Adapter: The adapters.
"""
# avoid circular dependency at import time
global adapter_eos
if adapter_eos is None:
from akkudoktoreos.adapter.adapter import get_adapter
adapter_eos = get_adapter()
return adapter_eos
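The lazy-global pattern behind `adapter` can be sketched with a stand-in `classproperty` (all names below are illustrative, not the EOS implementation): the module-level global starts as `None` and is resolved on first access, which is what breaks the import-time circular dependency.

```python
_adapter = None  # module-level singleton, resolved on first access


class classproperty:
    """Minimal classproperty: a read-only property accessible on the class itself."""

    def __init__(self, fget):
        self.fget = fget

    def __get__(self, _instance, owner):
        return self.fget(owner)


class AdapterMixinSketch:
    @classproperty
    def adapter(cls):
        # Lookup happens lazily to avoid circular imports at module load time.
        global _adapter
        if _adapter is None:
            _adapter = object()  # stands in for get_adapter()
        return _adapter


# Same instance on every access, from class or instance alike.
assert AdapterMixinSketch.adapter is AdapterMixinSketch().adapter
```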
class ConfigMixin:
"""Mixin class for managing EOS configuration data.

View File

@@ -1018,7 +1018,7 @@ class DataSequence(DataBase, MutableSequence):
end_datetime: Optional[DateTime] = None,
interval: Optional[Duration] = None,
fill_method: Optional[str] = None,
dropna: Optional[bool] = True,
) -> NDArray[Shape["*"], Any]:
"""Extract an array indexed by fixed time intervals from data records within an optional date range.
@@ -1032,17 +1032,19 @@ class DataSequence(DataBase, MutableSequence):
- 'ffill': Forward fill missing values.
- 'bfill': Backward fill missing values.
- 'none': Defaults to 'linear' for numeric values, otherwise 'ffill'.
dropna: (bool, optional): Whether to drop NAN/ None values before processing.
Defaults to True.
Returns:
np.ndarray: A NumPy Array of the values at the chosen frequency extracted from the
specified key.
Raises:
KeyError: If the specified key is not found in any of the DataRecords.
"""
self._validate_key(key)
# Validate fill method
if fill_method not in ("ffill", "bfill", "linear", "none", None):
raise ValueError(f"Unsupported fill method: {fill_method}")
@@ -1050,13 +1052,17 @@ class DataSequence(DataBase, MutableSequence):
start_datetime = to_datetime(start_datetime, to_maxtime=False) if start_datetime else None
end_datetime = to_datetime(end_datetime, to_maxtime=False) if end_datetime else None
resampled = None
if interval is None:
interval = to_duration("1 hour")
resample_freq = "1h"
else:
resample_freq = to_duration(interval, as_string="pandas")
# Load raw lists (already sorted & filtered)
dates, values = self.key_to_lists(key=key, dropna=dropna)
values_len = len(values)
# Bring lists into shape
if values_len < 1:
# No values, assume at least one value set to None
if start_datetime is not None:
@@ -1092,40 +1098,40 @@ class DataSequence(DataBase, MutableSequence):
dates.append(end_datetime)
values.append(values[-1])
# Construct series
series = pd.Series(values, index=pd.DatetimeIndex(dates), name=key)
if series.index.inferred_type != "datetime64":
raise TypeError(
f"Expected DatetimeIndex, but got {type(series.index)} "
f"infered to {series.index.inferred_type}: {series}"
)
# Determine default fill method depending on dtype
if fill_method is None:
if pd.api.types.is_numeric_dtype(series):
fill_method = "linear"
else:
fill_method = "ffill"
# Perform the resampling
if pd.api.types.is_numeric_dtype(series):
# numeric → use mean
resampled = series.resample(interval, origin=resample_origin).mean()
else:
# non-numeric → fallback (first, last, mode, or ffill)
resampled = series.resample(interval, origin=resample_origin).first()
if fill_method == "ffill":
resampled = resampled.ffill()
elif fill_method == "bfill":
resampled = resampled.bfill()
elif fill_method != "none":
raise ValueError(f"Unsupported fill method for non-numeric data: {fill_method}")
# Handle missing values after resampling
if fill_method == "linear" and pd.api.types.is_numeric_dtype(series):
resampled = resampled.interpolate("linear")
elif fill_method == "ffill":
resampled = resampled.ffill()
elif fill_method == "bfill":
resampled = resampled.bfill()
elif fill_method == "none":
pass
else:
raise ValueError(f"Unsupported fill method: {fill_method}")
logger.debug(
"Resampled for '{}' with length {}: {}...{}",
@@ -1141,6 +1147,16 @@ class DataSequence(DataBase, MutableSequence):
if end_datetime is not None and len(resampled) > 0:
resampled = resampled.truncate(after=end_datetime.subtract(seconds=1))
array = resampled.values
# Convert NaN to None if there are actually NaNs
if (
isinstance(array, np.ndarray)
and np.issubdtype(array.dtype.type, np.floating)
and pd.isna(array).any()
):
array = array.astype(object)
array[pd.isna(array)] = None
logger.debug(
"Array for '{}' with length {}: {}...{}", key, len(array), array[:10], array[-10:]
)
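The behavioral change of this fix — downsampling now averages every sample that falls into the new interval instead of keeping only the first one — can be verified with plain pandas:

```python
import pandas as pd

idx = pd.date_range("2025-01-01 00:00", periods=4, freq="15min")
series = pd.Series([10.0, 20.0, 30.0, 40.0], index=idx)

# .mean() aggregates all four values inside the hour; .first() would drop three of them.
assert series.resample("1h").mean().iloc[0] == 25.0
assert series.resample("1h").first().iloc[0] == 10.0
```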
@@ -1691,6 +1707,14 @@ class DataImportMixin:
}
"""
# Strip quotes if provided - does not affect unquoted strings
json_str = json_str.strip() # strip white space at start and end
if (json_str.startswith("'") and json_str.endswith("'")) or (
json_str.startswith('"') and json_str.endswith('"')
):
json_str = json_str[1:-1] # strip outer quotes
json_str = json_str.strip() # strip remaining white space at start and end
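Why the outer-quote strip matters: `json.loads()` rejects a document wrapped in an extra shell-style quote pair. A standalone sketch of the same normalization (`strip_outer_quotes` is an illustrative name):

```python
import json


def strip_outer_quotes(json_str: str) -> str:
    json_str = json_str.strip()
    if (json_str.startswith("'") and json_str.endswith("'")) or (
        json_str.startswith('"') and json_str.endswith('"')
    ):
        json_str = json_str[1:-1]  # drop one outer quote pair
    return json_str.strip()


quoted = "'" + '{"a": 1}' + "'"  # e.g. pasted from a shell command line
assert json.loads(strip_outer_quotes(quoted)) == {"a": 1}
assert json.loads(strip_outer_quotes('{"a": 1}')) == {"a": 1}  # unquoted input unaffected
```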
# Try pandas dataframe with orient="split"
try:
import_data = PydanticDateTimeDataFrame.model_validate_json(json_str)
@@ -1720,10 +1744,15 @@ class DataImportMixin:
logger.debug(f"PydanticDateTimeData import: {error_msg}")
# Use simple dict format
try:
import_data = json.loads(json_str)
self.import_from_dict(
import_data, key_prefix=key_prefix, start_datetime=start_datetime, interval=interval
)
except Exception as e:
error_msg = f"Invalid JSON string '{json_str}': {e}"
logger.debug(error_msg)
raise ValueError(error_msg) from e
def import_from_file(
self,

View File

@@ -25,11 +25,11 @@ class classproperty:
Methods:
__get__: Retrieves the value of the class property by calling the
decorated method on the class.
Parameters:
fget (Callable[[Any], Any]): A method that takes the class as an
argument and returns a value.
Raises:
RuntimeError: If `fget` is not defined when `__get__` is called.

View File

@@ -10,9 +10,11 @@ Demand Driven Based Control.
import uuid
from abc import ABC, abstractmethod
from collections import defaultdict
from enum import Enum
from typing import Annotated, Literal, Optional, Union
from loguru import logger
from pydantic import Field, computed_field, model_validator
from akkudoktoreos.core.pydantic import PydanticBaseModel
@@ -2257,20 +2259,60 @@ class EnergyManagementPlan(PydanticBaseModel):
self.valid_from = to_datetime()
self.valid_until = None
def get_resources(self) -> list[str]:
"""Retrieves the resource_ids for the resources the plan currently holds instructions for.
Returns a list of resource ids.
"""
resource_ids = []
for instr in self.instructions:
resource_id = instr.resource_id
if resource_id not in resource_ids:
resource_ids.append(resource_id)
return resource_ids
def get_active_instructions(
self, now: Optional[DateTime] = None
) -> list["EnergyManagementInstruction"]:
"""Retrieves the currently active instruction for each resource at the specified time.
Semantics:
- For each resource, consider only instructions with execution_time <= now.
- Choose the instruction with the latest execution_time (the most recent).
- If that instruction has a duration (timedelta), it's active only if now < execution_time + duration.
- If that instruction has no duration (None), treat it as open-ended (active until superseded).
Returns a list with at most one instruction per resource (the active one).
"""
now = now or to_datetime()
# Group instructions by resource_id
by_resource: dict[str, list["EnergyManagementInstruction"]] = defaultdict(list)
for instr in self.instructions:
# skip instructions scheduled in the future
if instr.execution_time <= now:
by_resource[instr.resource_id].append(instr)
active: list["EnergyManagementInstruction"] = []
for resource_id, instrs in by_resource.items():
# pick latest instruction by execution_time
latest = max(instrs, key=lambda i: i.execution_time)
if len(instrs) == 0:
# No instructions, there shall be at least one
error_msg = f"No instructions for {resource_id}"
logger.error(error_msg)
raise ValueError(error_msg)
instr_duration = latest.duration()  # expected: Duration | None
if instr_duration is None:
# open-ended (active until replaced) -> active because we selected latest <= now
active.append(latest)
else:
# active only if now is strictly before execution_time + duration
if latest.execution_time + instr_duration > now:
active.append(latest)
return active
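The per-resource "latest instruction wins" semantics described above can be modeled with a hypothetical stand-in for `EnergyManagementInstruction` (all names here are illustrative):

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class Instr:
    resource_id: str
    execution_time: datetime
    duration: Optional[timedelta]


def active_instructions(instructions: list, now: datetime) -> list:
    by_resource = defaultdict(list)
    for instr in instructions:
        if instr.execution_time <= now:  # future instructions are ignored
            by_resource[instr.resource_id].append(instr)
    active = []
    for instrs in by_resource.values():
        latest = max(instrs, key=lambda i: i.execution_time)
        # Open-ended instructions stay active until superseded; timed ones
        # only while now < execution_time + duration.
        if latest.duration is None or latest.execution_time + latest.duration > now:
            active.append(latest)
    return active


now = datetime(2025, 1, 1, 12, 0)
plan = [
    Instr("battery", datetime(2025, 1, 1, 10, 0), timedelta(hours=1)),  # superseded
    Instr("battery", datetime(2025, 1, 1, 11, 30), None),               # open-ended
    Instr("ev", datetime(2025, 1, 1, 13, 0), timedelta(hours=1)),       # future
]
assert [i.resource_id for i in active_instructions(plan, now)] == ["battery"]
```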
def get_next_instruction(

View File

@@ -1,6 +1,7 @@
import traceback
from asyncio import Lock, get_running_loop
from concurrent.futures import ThreadPoolExecutor
from enum import Enum
from functools import partial
from typing import ClassVar, Optional
@@ -8,7 +9,12 @@ from loguru import logger
from pydantic import computed_field
from akkudoktoreos.core.cache import CacheEnergyManagementStore
from akkudoktoreos.core.coreabc import (
AdapterMixin,
ConfigMixin,
PredictionMixin,
SingletonMixin,
)
from akkudoktoreos.core.emplan import EnergyManagementPlan
from akkudoktoreos.core.emsettings import EnergyManagementMode
from akkudoktoreos.core.pydantic import PydanticBaseModel
@@ -24,7 +30,23 @@ from akkudoktoreos.utils.datetimeutil import DateTime, compare_datetimes, to_dat
executor = ThreadPoolExecutor(max_workers=1)
class EnergyManagementStage(Enum):
"""Enumeration of the main stages in the energy management lifecycle."""
IDLE = "IDLE"
DATA_ACQUISITION = "DATA_AQUISITION"
FORECAST_RETRIEVAL = "FORECAST_RETRIEVAL"
OPTIMIZATION = "OPTIMIZATION"
CONTROL_DISPATCH = "CONTROL_DISPATCH"
def __str__(self) -> str:
"""Return the string representation of the stage."""
return self.value
class EnergyManagement(
SingletonMixin, ConfigMixin, PredictionMixin, AdapterMixin, PydanticBaseModel
):
"""Energy management."""
# Start datetime.
@@ -33,6 +55,9 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
# last run datetime. Used by energy management task
_last_run_datetime: ClassVar[Optional[DateTime]] = None
# Current energy management stage
_stage: ClassVar[EnergyManagementStage] = EnergyManagementStage.IDLE
# energy management plan of latest energy management run with optimization
_plan: ClassVar[Optional[EnergyManagementPlan]] = None
@@ -81,6 +106,15 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
cls._start_datetime = start_datetime.set(minute=0, second=0, microsecond=0)
return cls._start_datetime
@classmethod
def stage(cls) -> EnergyManagementStage:
"""Get the the stage of the energy management.
Returns:
EnergyManagementStage: The current stage of energy management.
"""
return cls._stage
@classmethod
def plan(cls) -> Optional[EnergyManagementPlan]:
"""Get the latest energy management plan.
@@ -122,6 +156,7 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
"""Run the energy management.
This method initializes the energy management run by setting its
start datetime, updating predictions, and optionally starting
optimization depending on the selected mode or configuration.
@@ -157,6 +192,8 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
logger.info("Starting energy management run.")
cls._stage = EnergyManagementStage.DATA_ACQUISITION
# Remember/set the start datetime of this energy management run.
# None leads to the current datetime being used.
cls.set_start_datetime(start_datetime)
@@ -164,12 +201,23 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
# Throw away any memory cached results of the last energy management run.
CacheEnergyManagementStore().clear()
# Do data acquisition by adapters
try:
cls.adapter.update_data(force_enable)
except Exception as e:
trace = "".join(traceback.TracebackException.from_exception(e).format())
error_msg = f"Adapter update failed - phase {cls._stage}: {e}\n{trace}"
logger.error(error_msg)
cls._stage = EnergyManagementStage.FORECAST_RETRIEVAL
if mode is None:
mode = cls.config.ems.mode
if mode is None or mode == "PREDICTION":
# Update the predictions
cls.prediction.update_data(force_enable=force_enable, force_update=force_update)
logger.info("Energy management run done (predictions updated)")
cls._stage = EnergyManagementStage.IDLE
return
# Prepare optimization parameters
@@ -184,8 +232,12 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
logger.error(
"Energy management run canceled. Could not prepare optimisation parameters."
)
cls._stage = EnergyManagementStage.IDLE
return
cls._stage = EnergyManagementStage.OPTIMIZATION
logger.info("Starting energy management optimization.")
# Take values from config if not given
if genetic_individuals is None:
genetic_individuals = cls.config.optimization.genetic.individuals
@@ -195,7 +247,6 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
if cls._start_datetime is None: # Make mypy happy - already set by us
raise RuntimeError("Start datetime not set.")
logger.info("Starting energy management optimization.")
try:
optimization = GeneticOptimization(
verbose=bool(cls.config.server.verbose),
@@ -208,8 +259,11 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
)
except:
logger.exception("Energy management optimization failed.")
cls._stage = EnergyManagementStage.IDLE
return
cls._stage = EnergyManagementStage.CONTROL_DISPATCH
# Make genetic solution public
cls._genetic_solution = solution
@@ -224,6 +278,17 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
logger.debug("Energy management plan:\n{}", cls._plan)
logger.info("Energy management run done (optimization updated)")
# Do control dispatch by adapters
try:
cls.adapter.update_data(force_enable)
except Exception as e:
trace = "".join(traceback.TracebackException.from_exception(e).format())
error_msg = f"Adapter update failed - phase {cls._stage}: {e}\n{trace}"
logger.error(error_msg)
# energy management run finished
cls._stage = EnergyManagementStage.IDLE
async def run(
self,
start_datetime: Optional[DateTime] = None,

View File

@@ -65,7 +65,7 @@ console_handler_id = None
file_handler_id = None
def logging_track_config(config_eos: Any, path: str, old_value: Any, value: Any) -> None:
"""Track logging config changes."""
global console_handler_id, file_handler_id

View File

@@ -400,7 +400,21 @@ class PydanticModelNestedValueMixin:
# Get next value
next_value = None
if isinstance(model, RootModel):
# If this is the final key, set the value
if is_final_key:
try:
model.validate_and_set(key, value)
except Exception as e:
raise ValueError(f"Error updating model: {e}") from e
return
next_value = model.root
elif isinstance(model, BaseModel):
logger.debug(
f"Detected base model {model.__class__.__name__} of type {type(model)}"
)
# Track parent and key for possible assignment later
parent = model
parent_key = [
@@ -432,6 +446,7 @@ class PydanticModelNestedValueMixin:
next_value = getattr(model, key, None)
elif isinstance(model, list):
logger.debug(f"Detected list of type {type(model)}")
# Handle lists (ensure index exists and modify safely)
try:
idx = int(key)
@@ -468,6 +483,7 @@ class PydanticModelNestedValueMixin:
return
elif isinstance(model, dict):
logger.debug(f"Detected dict of type {type(model)}")
# Handle dictionaries (auto-create missing keys)
# Get next type from parent key type information
@@ -795,29 +811,61 @@ class PydanticBaseModel(PydanticModelNestedValueMixin, BaseModel):
@classmethod
def field_description(cls, field_name: str) -> Optional[str]:
"""Return the description metadata of a model field, if available.
"""Return a human-readable description for a model field.
Looks up descriptions for both regular and computed fields.
Resolution order:
Normal fields:
1) json_schema_extra["description"]
2) field.description
Computed fields:
1) ComputedFieldInfo.description
2) function docstring (func.__doc__)
3) json_schema_extra["description"]
If a field exists but no description is found, returns "-".
If the field does not exist, returns None.
Args:
field_name: Field name.
Returns:
Description string, "-" if missing, or None if not a field.
"""
# 1) Regular declared fields
field: FieldInfo | None = cls.model_fields.get(field_name)
if field is not None:
extra = cls._field_extra_dict(field)
if "description" in extra:
return str(extra["description"])
# some FieldInfo may also have .description directly
if getattr(field, "description", None):
return str(field.description)
return None
# 2) Computed fields live in a separate mapping
cfield: ComputedFieldInfo | None = cls.model_computed_fields.get(field_name)
if cfield is None:
return None
# 2a) ComputedFieldInfo may have a description attribute
if getattr(cfield, "description", None):
return str(cfield.description)
# 2b) fallback to wrapped property's docstring
func = getattr(cfield, "func", None)
if func and func.__doc__:
return func.__doc__.strip()
# 2c) last resort: json_schema_extra if you use it for computed fields
extra = cls._field_extra_dict(cfield)
if "description" in extra:
return str(extra["description"])
return "-"
@classmethod
def field_deprecated(cls, field_name: str) -> Optional[str]:
@@ -887,7 +935,7 @@ class PydanticDateTimeData(RootModel):
{
"start_datetime": "2024-01-01 00:00:00", # optional
"interval": "1 Hour", # optional
"interval": "1 hour", # optional
"loadforecast_power_w": [20.5, 21.0, 22.1],
"load_min": [18.5, 19.0, 20.1]
}

View File

@@ -6,13 +6,15 @@ from fnmatch import fnmatch
from pathlib import Path
from typing import Optional
# For development add `.dev` to previous release
# For release omit `.dev`.
VERSION_BASE = "0.2.0.dev"
# Project hash of relevant files
HASH_EOS = ""
# Number of digits to append to .dev to identify a development version
VERSION_DEV_PRECISION = 8
# ------------------------------
# Helpers for version generation
@@ -91,8 +93,11 @@ def _version_calculate() -> str:
"""Compute version."""
global HASH_EOS
HASH_EOS = _version_hash()
if VERSION_BASE.endswith("+dev"):
return f"{VERSION_BASE}.{HASH_EOS[:6]}"
if VERSION_BASE.endswith("dev"):
# After dev only digits are allowed - convert hexdigest to digits
hash_value = int(HASH_EOS, 16)
hash_digits = str(hash_value % (10**VERSION_DEV_PRECISION)).zfill(VERSION_DEV_PRECISION)
return f"{VERSION_BASE}{hash_digits}"
else:
return VERSION_BASE
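The digits-only constraint after `.dev` (Home Assistant rejects hex characters in the dev suffix) is what drives the `int(..., 16)` conversion above. A minimal standalone sketch of that folding step, with `dev_version` as a hypothetical helper name:

```python
VERSION_BASE = "0.2.0.dev"
VERSION_DEV_PRECISION = 8


def dev_version(hexdigest: str) -> str:
    """Append a digits-only suffix derived from a hex digest (sketch)."""
    # After '.dev' only digits are allowed - fold the hex digest into a
    # fixed-width decimal number so equal hashes yield equal versions.
    hash_value = int(hexdigest, 16)
    hash_digits = str(hash_value % (10**VERSION_DEV_PRECISION)).zfill(VERSION_DEV_PRECISION)
    return f"{VERSION_BASE}{hash_digits}"
```

Note the `zfill` keeps the suffix at a constant width, so short hash remainders still produce valid, sortable version strings.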
@@ -114,10 +119,10 @@ __version__ = _version_calculate()
VERSION_RE = re.compile(
r"""
^(?P<base>\d+\.\d+\.\d+) # x.y.z
(?:\+ # +dev.hash starts here
(?:[\.\+\-] # .dev<hash> starts here
(?:
(?P<dev>dev) # literal 'dev'
(?:\.(?P<hash>[A-Za-z0-9]+))? # optional .hash
(?:(?P<hash>[A-Za-z0-9]+))? # optional <hash>
)
)?
$
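The updated pattern (separator class `[.+-]` plus a hash with no dot before it) can be exercised directly; this mirrors the new regex from the hunk above with an assumed `re.VERBOSE` compile, as in the original module:

```python
import re

VERSION_RE = re.compile(
    r"""
    ^(?P<base>\d+\.\d+\.\d+)             # x.y.z
    (?:[\.\+\-]                          # .dev<hash> starts here
        (?:
            (?P<dev>dev)                 # literal 'dev'
            (?:(?P<hash>[A-Za-z0-9]+))?  # optional <hash>
        )
    )?
    $
    """,
    re.VERBOSE,
)

# Both release and development versions parse into named groups.
release = VERSION_RE.match("0.2.0")
dev = VERSION_RE.match("0.2.0.dev40236005")
```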
@@ -131,8 +136,8 @@ def version() -> dict[str, Optional[str]]:
The version string shall be of the form:
x.y.z
x.y.z+dev
x.y.z+dev.HASH
x.y.z.dev
x.y.z.dev<HASH>
Returns:
.. code-block:: python


@@ -18,7 +18,7 @@ from akkudoktoreos.devices.devicesabc import DevicesBaseSettings
from akkudoktoreos.utils.datetimeutil import DateTime, TimeWindowSequence, to_datetime
# Default charge rates for battery
BATTERY_DEFAULT_CHARGE_RATES = np.linspace(0.0, 1.0, 11) # 0.0, 0.1, ..., 1.0
BATTERY_DEFAULT_CHARGE_RATES: list[float] = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
class BatteriesCommonSettings(DevicesBaseSettings):
@@ -68,7 +68,7 @@ class BatteriesCommonSettings(DevicesBaseSettings):
json_schema_extra={"description": "Minimum charging power [W].", "examples": [50]},
)
charge_rates: Optional[NDArray[Shape["*"], float]] = Field(
charge_rates: Optional[list[float]] = Field(
default=BATTERY_DEFAULT_CHARGE_RATES,
json_schema_extra={
"description": (
@@ -165,10 +165,7 @@ class BatteriesCommonSettings(DevicesBaseSettings):
@computed_field # type: ignore[prop-decorator]
@property
def measurement_keys(self) -> Optional[list[str]]:
"""Measurement keys for the battery stati that are measurements.
Battery SoC, power.
"""
"""Measurement keys for the battery stati that are measurements."""
keys: list[str] = [
self.measurement_key_soc_factor,
self.measurement_key_power_l1_w,


@@ -25,7 +25,7 @@ class Battery:
self.discharging_efficiency = self.parameters.discharging_efficiency
# Charge rates, in case of None use default
self.charge_rates = BATTERY_DEFAULT_CHARGE_RATES
self.charge_rates = np.array(BATTERY_DEFAULT_CHARGE_RATES, dtype=float)
if self.parameters.charge_rates:
charge_rates = np.array(self.parameters.charge_rates, dtype=float)
charge_rates = np.unique(charge_rates)
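The settings model now holds plain `list[float]` (which serializes cleanly to JSON), while the simulation converts back to a numpy array and uses `np.unique` to sort and deduplicate in one step. A sketch of that fallback logic, with `resolve_charge_rates` as a hypothetical helper name:

```python
from typing import Optional

import numpy as np

BATTERY_DEFAULT_CHARGE_RATES: list[float] = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]


def resolve_charge_rates(configured: Optional[list[float]]) -> np.ndarray:
    """Fall back to defaults; deduplicate and sort configured rates (sketch)."""
    if not configured:
        # Plain Python lists keep the settings model JSON-serializable;
        # the simulation converts to a numpy array on use.
        return np.array(BATTERY_DEFAULT_CHARGE_RATES, dtype=float)
    return np.unique(np.array(configured, dtype=float))  # sorts and removes duplicates
```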


@@ -234,14 +234,14 @@ class GeneticSimulation(PydanticBaseModel):
consumption_energy_per_hour = np.full((total_hours), np.nan)
costs_per_hour = np.full((total_hours), np.nan)
revenue_per_hour = np.full((total_hours), np.nan)
soc_per_hour = np.full((total_hours), np.nan)
soc_ev_per_hour = np.full((total_hours), np.nan)
losses_wh_per_hour = np.full((total_hours), np.nan)
home_appliance_wh_per_hour = np.full((total_hours), np.nan)
electricity_price_per_hour = np.full((total_hours), np.nan)
# Set initial state
if battery_fast:
# Pre-allocate arrays for the results, optimized for speed
soc_per_hour = np.full((total_hours), np.nan)
soc_per_hour[0] = battery_fast.current_soc_percentage()
# Fill the charge array of the battery
dc_charge_hours_fast[0:start_hour] = 0
@@ -255,8 +255,14 @@ class GeneticSimulation(PydanticBaseModel):
bat_discharge_hours_fast[0:start_hour] = 0
bat_discharge_hours_fast[end_hour:] = 0
battery_fast.discharge_array = bat_discharge_hours_fast
else:
# Default return if no battery is available
soc_per_hour = np.full((total_hours), 0)
if ev_fast:
# Pre-allocate arrays for the results, optimized for speed
soc_ev_per_hour = np.full((total_hours), np.nan)
soc_ev_per_hour[0] = ev_fast.current_soc_percentage()
# Fill the charge array of the ev
ev_charge_hours_fast[0:start_hour] = 0
@@ -266,14 +272,22 @@ class GeneticSimulation(PydanticBaseModel):
ev_discharge_hours_fast[0:start_hour] = 0
ev_discharge_hours_fast[end_hour:] = 0
ev_fast.discharge_array = ev_discharge_hours_fast
else:
# Default return if no electric vehicle is available
soc_ev_per_hour = np.full((total_hours), 0)
if home_appliance_fast and self.home_appliance_start_hour:
home_appliance_enabled = True
# Pre-allocate arrays for the results, optimized for speed
home_appliance_wh_per_hour = np.full((total_hours), np.nan)
self.home_appliance_start_hour = home_appliance_fast.set_starting_time(
self.home_appliance_start_hour, start_hour
)
else:
home_appliance_enabled = False
# Default return if no home appliance is available
home_appliance_wh_per_hour = np.full((total_hours), 0)
for hour in range(start_hour, end_hour):
hour_idx = hour - start_hour
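The pattern introduced here is the same for battery, EV, and home appliance: pre-allocate a NaN array only when the asset exists (NaN marks "not yet simulated"), and return a zero-filled default when it is absent, so downstream consumers never see NaN for unused assets. A minimal sketch of the battery case, with `allocate_soc_per_hour` as an illustrative name:

```python
import numpy as np


def allocate_soc_per_hour(total_hours: int, battery_present: bool, initial_soc: float = 0.0) -> np.ndarray:
    """Result-array pattern (sketch): NaN placeholders when the asset is
    simulated, a defined zero default when it is absent."""
    if battery_present:
        soc = np.full(total_hours, np.nan)  # filled hour by hour by the simulation
        soc[0] = initial_soc
    else:
        soc = np.full(total_hours, 0)  # default return, never NaN
    return soc
```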


@@ -177,33 +177,33 @@ class GeneticOptimizationParameters(
# Check for general predictions conditions
if cls.config.general.latitude is None:
default_latitude = 52.52
logger.error(f"Latitude unknown - defaulting to {default_latitude}.")
logger.info(f"Latitude unknown - defaulting to {default_latitude}.")
cls.config.general.latitude = default_latitude
if cls.config.general.longitude is None:
default_longitude = 13.405
logger.error(f"Longitude unknown - defaulting to {default_longitude}.")
logger.info(f"Longitude unknown - defaulting to {default_longitude}.")
cls.config.general.longitude = default_longitude
if cls.config.prediction.hours is None:
logger.error("Prediction hours unknown - defaulting to 48 hours.")
logger.info("Prediction hours unknown - defaulting to 48 hours.")
cls.config.prediction.hours = 48
if cls.config.prediction.historic_hours is None:
logger.error("Prediction historic hours unknown - defaulting to 24 hours.")
logger.info("Prediction historic hours unknown - defaulting to 24 hours.")
cls.config.prediction.historic_hours = 24
# Check optimization definitions
if cls.config.optimization.horizon_hours is None:
logger.error("Optimization horizon unknown - defaulting to 24 hours.")
logger.info("Optimization horizon unknown - defaulting to 24 hours.")
cls.config.optimization.horizon_hours = 24
if cls.config.optimization.interval is None:
logger.error("Optimization interval unknown - defaulting to 3600 seconds.")
logger.info("Optimization interval unknown - defaulting to 3600 seconds.")
cls.config.optimization.interval = 3600
if cls.config.optimization.interval != 3600:
logger.error(
logger.info(
"Optimization interval '{}' seconds not supported - forced to 3600 seconds."
)
cls.config.optimization.interval = 3600
# Check genetic algorithm definitions
if cls.config.optimization.genetic is None:
logger.error(
logger.info(
"Genetic optimization configuration not configured - defaulting to demo config."
)
cls.config.optimization.genetic = {
@@ -215,16 +215,16 @@ class GeneticOptimizationParameters(
},
}
if cls.config.optimization.genetic.individuals is None:
logger.error("Genetic individuals unknown - defaulting to 300.")
logger.info("Genetic individuals unknown - defaulting to 300.")
cls.config.optimization.genetic.individuals = 300
if cls.config.optimization.genetic.generations is None:
logger.error("Genetic generations unknown - defaulting to 400.")
logger.info("Genetic generations unknown - defaulting to 400.")
cls.config.optimization.genetic.generations = 400
if cls.config.optimization.genetic.penalties is None:
logger.error("Genetic penalties unknown - defaulting to demo config.")
logger.info("Genetic penalties unknown - defaulting to demo config.")
cls.config.optimization.genetic.penalties = {"ev_soc_miss": 10}
if "ev_soc_miss" not in cls.config.optimization.genetic.penalties:
logger.error("ev_soc_miss penalty function parameter unknown - defaulting to 100.")
logger.info("ev_soc_miss penalty function parameter unknown - defaulting to 10.")
cls.config.optimization.genetic.penalties["ev_soc_miss"] = 10
# Get start solution from last run
@@ -262,7 +262,7 @@ class GeneticOptimizationParameters(
* power_to_energy_per_interval_factor
).tolist()
except:
logger.exception(
logger.info(
"No PV forecast data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -270,6 +270,7 @@ class GeneticOptimizationParameters(
{
"pvforecast": {
"provider": "PVForecastAkkudoktor",
"max_planes": 4,
"planes": [
{
"peakpower": 5.0,
@@ -314,7 +315,7 @@ class GeneticOptimizationParameters(
fill_method="ffill",
).tolist()
except:
logger.exception(
logger.info(
"No Electricity Marketprice forecast data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -330,7 +331,7 @@ class GeneticOptimizationParameters(
fill_method="ffill",
).tolist()
except:
logger.exception(
logger.info(
"No Load forecast data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -357,7 +358,7 @@ class GeneticOptimizationParameters(
fill_method="ffill",
).tolist()
except:
logger.exception(
logger.info(
"No feed in tariff forecast data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -384,7 +385,7 @@ class GeneticOptimizationParameters(
fill_method="ffill",
).tolist()
except:
logger.exception(
logger.info(
"No weather forecast data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -397,14 +398,14 @@ class GeneticOptimizationParameters(
# Batteries
# ---------
if cls.config.devices.max_batteries is None:
logger.error("Number of battery devices not configured - defaulting to 1.")
logger.info("Number of battery devices not configured - defaulting to 1.")
cls.config.devices.max_batteries = 1
if cls.config.devices.max_batteries == 0:
battery_params = None
battery_lcos_kwh = 0
else:
if cls.config.devices.batteries is None:
logger.error("No battery device data available - defaulting to demo data.")
logger.info("No battery device data available - defaulting to demo data.")
cls.config.devices.batteries = [{"device_id": "battery1", "capacity_wh": 8000}]
try:
battery_config = cls.config.devices.batteries[0]
@@ -418,7 +419,7 @@ class GeneticOptimizationParameters(
max_soc_percentage=battery_config.max_soc_percentage,
)
except:
logger.exception(
logger.info(
"No battery device data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -427,7 +428,7 @@ class GeneticOptimizationParameters(
continue
# Levelized cost of ownership
if battery_config.levelized_cost_of_storage_kwh is None:
logger.error(
logger.info(
"No battery device LCOS data available - defaulting to 0 €/kWh. Parameter preparation attempt {}.",
attempt,
)
@@ -449,7 +450,7 @@ class GeneticOptimizationParameters(
except:
initial_soc_percentage = None
if initial_soc_percentage is None:
logger.error(
logger.info(
f"No battery device SoC data (measurement key = '{battery_config.measurement_key_soc_factor}') available - defaulting to 0."
)
initial_soc_percentage = 0
@@ -458,13 +459,13 @@ class GeneticOptimizationParameters(
# Electric Vehicles
# -----------------
if cls.config.devices.max_electric_vehicles is None:
logger.error("Number of electric_vehicle devices not configured - defaulting to 1.")
logger.info("Number of electric_vehicle devices not configured - defaulting to 1.")
cls.config.devices.max_electric_vehicles = 1
if cls.config.devices.max_electric_vehicles == 0:
electric_vehicle_params = None
else:
if cls.config.devices.electric_vehicles is None:
logger.error(
logger.info(
"No electric vehicle device data available - defaulting to demo data."
)
cls.config.devices.max_electric_vehicles = 1
@@ -489,7 +490,7 @@ class GeneticOptimizationParameters(
max_soc_percentage=electric_vehicle_config.max_soc_percentage,
)
except:
logger.exception(
logger.info(
"No electric_vehicle device data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -520,7 +521,7 @@ class GeneticOptimizationParameters(
except:
initial_soc_percentage = None
if initial_soc_percentage is None:
logger.error(
logger.info(
f"No electric vehicle device SoC data (measurement key = '{electric_vehicle_config.measurement_key_soc_factor}') available - defaulting to 0."
)
initial_soc_percentage = 0
@@ -529,13 +530,13 @@ class GeneticOptimizationParameters(
# Inverters
# ---------
if cls.config.devices.max_inverters is None:
logger.error("Number of inverter devices not configured - defaulting to 1.")
logger.info("Number of inverter devices not configured - defaulting to 1.")
cls.config.devices.max_inverters = 1
if cls.config.devices.max_inverters == 0:
inverter_params = None
else:
if cls.config.devices.inverters is None:
logger.error("No inverter device data available - defaulting to demo data.")
logger.info("No inverter device data available - defaulting to demo data.")
cls.config.devices.inverters = [
{
"device_id": "inverter1",
@@ -551,7 +552,7 @@ class GeneticOptimizationParameters(
battery_id=inverter_config.battery_id,
)
except:
logger.exception(
logger.info(
"No inverter device data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -568,14 +569,14 @@ class GeneticOptimizationParameters(
# Home Appliances
# ---------------
if cls.config.devices.max_home_appliances is None:
logger.error("Number of home appliance devices not configured - defaulting to 1.")
logger.info("Number of home appliance devices not configured - defaulting to 1.")
cls.config.devices.max_home_appliances = 1
if cls.config.devices.max_home_appliances == 0:
home_appliance_params = None
else:
home_appliance_params = None
if cls.config.devices.home_appliances is None:
logger.error(
logger.info(
"No home appliance device data available - defaulting to demo data."
)
cls.config.devices.home_appliances = [
@@ -606,7 +607,7 @@ class GeneticOptimizationParameters(
time_windows=home_appliance_config.time_windows,
)
except:
logger.exception(
logger.info(
"No home appliance device data available - defaulting to demo data. Parameter preparation attempt {}.",
attempt,
)
@@ -639,7 +640,7 @@ class GeneticOptimizationParameters(
start_solution=start_solution,
)
except:
logger.exception(
logger.info(
"Can not prepare optimization parameters - will retry. Parameter preparation attempt {}.",
attempt,
)


@@ -416,9 +416,35 @@ class GeneticSolution(ConfigMixin, GeneticParametersBaseModel):
solution[key] = operation[key]
# Add home appliance data
if self.washingstart:
if self.config.devices.max_home_appliances and self.config.devices.max_home_appliances > 0:
# Use config and not self.washingstart as washingstart may be None (no start)
# even if configured to be started.
# result starts at start_day_hour
solution["homeappliance1_energy_wh"] = self.result.Home_appliance_wh_per_hour[:n_points]
operation = {
"homeappliance1_run_op_mode": [],
"homeappliance1_run_op_factor": [],
"homeappliance1_off_op_mode": [],
"homeappliance1_off_op_factor": [],
}
for hour_idx, energy in enumerate(solution["homeappliance1_energy_wh"]):
if energy > 0.0:
operation["homeappliance1_run_op_mode"].append(1.0)
operation["homeappliance1_run_op_factor"].append(1.0)
operation["homeappliance1_off_op_mode"].append(0.0)
operation["homeappliance1_off_op_factor"].append(0.0)
else:
operation["homeappliance1_run_op_mode"].append(0.0)
operation["homeappliance1_run_op_factor"].append(0.0)
operation["homeappliance1_off_op_mode"].append(1.0)
operation["homeappliance1_off_op_factor"].append(1.0)
for key in operation.keys():
if len(operation[key]) != n_points:
error_msg = f"instruction {key} has invalid length {len(operation[key])} - expected {n_points}"
logger.error(error_msg)
raise ValueError(error_msg)
solution[key] = operation[key]
# Fill prediction into dataframe with correct column names
# - pvforecast_ac_energy_wh_energy_wh: PV energy prediction (positive) in wh
@@ -633,19 +659,33 @@ class GeneticSolution(ConfigMixin, GeneticParametersBaseModel):
)
# Add home appliance instructions (demand driven based control)
if self.washingstart:
if self.config.devices.max_home_appliances and self.config.devices.max_home_appliances > 0:
# Use config and not self.washingstart as washingstart may be None (no start)
# even if configured to be started.
resource_id = "homeappliance1"
operation_mode = ApplianceOperationMode.RUN # type: ignore[assignment]
operation_mode_factor = 1.0
execution_time = start_datetime.add(hours=self.washingstart - start_day_hour)
plan.add_instruction(
DDBCInstruction(
resource_id=resource_id,
execution_time=execution_time,
actuator_id=resource_id,
operation_mode_id=operation_mode,
operation_mode_factor=operation_mode_factor,
)
)
last_energy: Optional[float] = None
for hours, energy in enumerate(self.result.Home_appliance_wh_per_hour):
# hours starts at start_datetime with 0
if energy is None:
raise ValueError(
f"Unexpected value {energy} in {self.result.Home_appliance_wh_per_hour}"
)
if last_energy is None or energy != last_energy:
if energy > 0.0:
operation_mode = ApplianceOperationMode.RUN # type: ignore[assignment]
else:
operation_mode = ApplianceOperationMode.OFF # type: ignore[assignment]
operation_mode_factor = 1.0
execution_time = start_datetime.add(hours=hours)
plan.add_instruction(
DDBCInstruction(
resource_id=resource_id,
execution_time=execution_time,
actuator_id=resource_id,
operation_mode_id=operation_mode,
operation_mode_factor=operation_mode_factor,
)
)
last_energy = energy
return plan
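The instruction emission above is change-driven: an instruction is added only when the appliance energy differs from the previous hour, so a RUN or OFF command is not repeated for every hour of a steady state. Stripped of the plan/DDBC plumbing, the core loop looks like this (illustrative function and return shape, not EOS API):

```python
from typing import Optional


def run_off_transitions(energy_wh_per_hour: list[float]) -> list[tuple[int, str]]:
    """Emit one (hour, mode) instruction per energy-level change (sketch)."""
    instructions: list[tuple[int, str]] = []
    last_energy: Optional[float] = None
    for hour, energy in enumerate(energy_wh_per_hour):
        if last_energy is None or energy != last_energy:
            # Only transitions produce an instruction; steady hours are skipped.
            mode = "RUN" if energy > 0.0 else "OFF"
            instructions.append((hour, mode))
        last_energy = energy
    return instructions
```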


@@ -1,6 +1,6 @@
from typing import Optional, Union
from pydantic import Field, model_validator
from pydantic import Field, computed_field, model_validator
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.core.pydantic import (
@@ -86,6 +86,22 @@ class OptimizationCommonSettings(SettingsBaseModel):
},
)
# Computed fields
@computed_field # type: ignore[prop-decorator]
@property
def keys(self) -> list[str]:
"""The keys of the solution."""
from akkudoktoreos.core.ems import get_ems
key_list = []
optimization_solution = get_ems().optimization_solution()
if optimization_solution:
# Prepare mapping
df = optimization_solution.solution.to_dataframe()
key_list = df.columns.tolist()
return sorted(set(key_list))
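The new computed `keys` field flattens the solution dataframe's column names, guarding against the case where no optimization has run yet. That de-duplication step in isolation (pandas assumed available; `solution_keys` is an illustrative name):

```python
from typing import Optional

import pandas as pd


def solution_keys(df: Optional[pd.DataFrame]) -> list[str]:
    """Sorted, de-duplicated solution column names; empty when no solution yet (sketch)."""
    key_list: list[str] = []
    if df is not None:
        key_list = df.columns.tolist()
    return sorted(set(key_list))
```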
# Validators
@model_validator(mode="after")
def _enforce_algorithm_configuration(self) -> "OptimizationCommonSettings":
"""Ensure algorithm default configuration is set."""


@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import Field, field_validator
from pydantic import Field, computed_field, field_validator
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.prediction.elecpriceabc import ElecPriceProvider
@@ -57,6 +57,12 @@ class ElecPriceCommonSettings(SettingsBaseModel):
json_schema_extra={"description": "Energy Charts provider settings."},
)
@computed_field # type: ignore[prop-decorator]
@property
def providers(self) -> list[str]:
"""Available electricity price provider ids."""
return elecprice_providers
# Validators
@field_validator("provider", mode="after")
@classmethod
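The same `providers` computed-field pattern is added to the electricity price, feed-in tariff, load, PV forecast, and weather settings. Because pydantic includes computed fields in `model_dump()` and the JSON schema, clients such as EOSdash can discover valid provider ids from the serialized config. A self-contained sketch (the provider ids and class name here are illustrative, not the real registry):

```python
from typing import Optional

from pydantic import BaseModel, computed_field

# Illustrative ids - the real list is built from the registered providers.
elecprice_providers = ["ElecPriceAkkudoktor", "ElecPriceEnergyCharts", "ElecPriceImport"]


class ElecPriceSettingsSketch(BaseModel):
    provider: Optional[str] = None

    @computed_field  # type: ignore[prop-decorator]
    @property
    def providers(self) -> list[str]:
        """Available electricity price provider ids."""
        return elecprice_providers
```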


@@ -1,6 +1,6 @@
from typing import Optional
from pydantic import Field, field_validator
from pydantic import Field, computed_field, field_validator
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.prediction.feedintariffabc import FeedInTariffProvider
@@ -56,6 +56,12 @@ class FeedInTariffCommonSettings(SettingsBaseModel):
},
)
@computed_field # type: ignore[prop-decorator]
@property
def providers(self) -> list[str]:
"""Available feed in tariff provider ids."""
return feedintariff_providers
# Validators
@field_validator("provider", mode="after")
@classmethod


@@ -2,7 +2,7 @@
from typing import Optional
from pydantic import Field, field_validator
from pydantic import Field, computed_field, field_validator
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.prediction.loadabc import LoadProvider
@@ -62,6 +62,12 @@ class LoadCommonSettings(SettingsBaseModel):
},
)
@computed_field # type: ignore[prop-decorator]
@property
def providers(self) -> list[str]:
"""Available load provider ids."""
return load_providers
# Validators
@field_validator("provider", mode="after")
@classmethod


@@ -39,11 +39,11 @@ class LoadImportCommonSettings(SettingsBaseModel):
@field_validator("import_file_path", mode="after")
@classmethod
def validate_loadimport_file_path(cls, value: Optional[Union[str, Path]]) -> Optional[Path]:
"""Ensure file is available."""
if value is None:
return None
if isinstance(value, str):
value = Path(value)
"""Ensure file is available."""
value = value.resolve()
if not value.is_file():
raise ValueError(f"Import file path '{value}' is not a file.")


@@ -24,7 +24,7 @@ class VrmForecastResponse(PydanticBaseModel):
class LoadVrmCommonSettings(SettingsBaseModel):
"""Common settings for VRM API."""
"""Common settings for load forecast VRM API."""
load_vrm_token: str = Field(
default="your-token",


@@ -52,22 +52,7 @@ from akkudoktoreos.prediction.weatherimport import WeatherImport
class PredictionCommonSettings(SettingsBaseModel):
"""General Prediction Configuration.
This class provides configuration for prediction settings, allowing users to specify
parameters such as the forecast duration (in hours).
Validators ensure each parameter is within a specified range.
Attributes:
hours (Optional[int]): Number of hours into the future for predictions.
Must be non-negative.
historic_hours (Optional[int]): Number of hours into the past for historical data.
Must be non-negative.
Validators:
validate_hours (int): Ensures `hours` is a non-negative integer.
validate_historic_hours (int): Ensures `historic_hours` is a non-negative integer.
"""
"""General Prediction Configuration."""
hours: Optional[int] = Field(
default=48,


@@ -260,6 +260,12 @@ class PVForecastCommonSettings(SettingsBaseModel):
},
)
@computed_field # type: ignore[prop-decorator]
@property
def providers(self) -> list[str]:
"""Available PVForecast provider ids."""
return pvforecast_providers
# Validators
@field_validator("provider", mode="after")
@classmethod


@@ -193,20 +193,6 @@ class PVForecastAkkudoktor(PVForecastProvider):
from the PVForecastAkkudoktor API and maps it to `PVForecastDataRecord` fields, applying
any necessary scaling or unit corrections. It manages the forecast over a range
of hours into the future and retains historical data.
Attributes:
hours (int, optional): Number of hours in the future for the forecast.
historic_hours (int, optional): Number of past hours for retaining data.
latitude (float, optional): The latitude in degrees, validated to be between -90 and 90.
longitude (float, optional): The longitude in degrees, validated to be between -180 and 180.
start_datetime (datetime, optional): Start datetime for forecasts, defaults to the current datetime.
end_datetime (datetime, computed): The forecast's end datetime, computed based on `start_datetime` and `hours`.
keep_datetime (datetime, computed): The datetime to retain historical data, computed from `start_datetime` and `historic_hours`.
Methods:
provider_id(): Returns a unique identifier for the provider.
_request_forecast(): Fetches the forecast from the Akkudoktor API.
_update_data(): Processes and updates forecast data from Akkudoktor in PVForecastDataRecord format.
"""
# overload


@@ -24,7 +24,7 @@ class VrmForecastResponse(PydanticBaseModel):
class PVForecastVrmCommonSettings(SettingsBaseModel):
"""Common settings for VRM API."""
"""Common settings for PV forecast VRM API."""
pvforecast_vrm_token: str = Field(
default="your-token",


@@ -2,7 +2,7 @@
from typing import Optional
from pydantic import Field, field_validator
from pydantic import Field, computed_field, field_validator
from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.prediction.prediction import get_prediction
@@ -52,6 +52,12 @@ class WeatherCommonSettings(SettingsBaseModel):
},
)
@computed_field # type: ignore[prop-decorator]
@property
def providers(self) -> list[str]:
"""Available weather provider ids."""
return weather_providers
# Validators
@field_validator("provider", mode="after")
@classmethod


@@ -37,7 +37,7 @@ WheaterDataClearOutsideMapping: List[Tuple[str, Optional[str], Optional[float]]]
("Precipitation Type", "Precipitation Type", None),
("Precipitation Probability (%)", "Precipitation Probability (%)", 1),
("Precipitation Amount (mm)", "Precipitation Amount (mm)", 1),
("Wind Speed (mph)", "Wind Speed (kmph)", 1.60934),
("Wind Speed/Direction (mph)", "Wind Speed (kmph)", 1.60934),
("Chance of Frost", "Chance of Frost", None),
("Temperature (°C)", "Temperature (°C)", 1),
("Feels Like (°C)", "Feels Like (°C)", 1),
@@ -218,7 +218,7 @@ class WeatherClearOutside(WeatherProvider):
for detail_name in detail_names:
if detail_name not in clearoutside_key_mapping:
warning_msg = (
f"Clearoutside schema change. Unexpected detail name {detail_name}."
f"Clearoutside schema change. Unexpected detail name '{detail_name}'."
)
logger.warning(warning_msg)
@@ -226,17 +226,13 @@ class WeatherClearOutside(WeatherProvider):
# Beware there is one ul paragraph before that is not associated to a detail
p_detail_tables = p_day.find_all("ul")
if len(p_detail_tables) != len(detail_names) + 1:
error_msg = f"Clearoutside schema change. Unexpected number ({p_detail_tables}) of `ul` for details {len(detail_names)}. Should be one extra only."
error_msg = f"Clearoutside schema change. Unexpected number ({p_detail_tables}) of 'ul' for details {len(detail_names)}. Should be one extra only."
logger.error(error_msg)
raise ValueError(error_msg)
p_detail_tables.pop(0)
# Create clearout data
clearout_data = {}
# Replace some detail names that we use differently
detail_names = [
s.replace("Wind Speed/Direction (mph)", "Wind Speed (mph)") for s in detail_names
]
# Number of detail values. On last day may be less than 24.
detail_values_count = None
# Add data values
@@ -266,7 +262,7 @@ class WeatherClearOutside(WeatherProvider):
extra_detail_name = None
extra_detail_data = []
for p_detail_value in p_detail_values:
if detail_name == "Wind Speed (mph)":
if detail_name == "Wind Speed/Direction (mph)":
# Get the usual value
value_str = p_detail_value.get_text()
# Also extract extra data
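The schema fix keeps the combined "Wind Speed/Direction (mph)" detail name and splits speed and direction while parsing, converting mph to km/h (factor 1.60934). The exact ClearOutside markup is not shown here, so the following sketch assumes a simplified "speed direction" text shape purely for illustration; `split_wind` is a hypothetical helper:

```python
import re

MPH_TO_KMPH = 1.60934


def split_wind(value_str: str) -> tuple[float, "str | None"]:
    """Split a combined speed/direction string (assumed format) into km/h and direction.

    Assumed shape like "16 WNW" - the real ClearOutside markup differs.
    """
    m = re.match(r"\s*(\d+(?:\.\d+)?)\s*([A-Z]{1,3})?", value_str)
    if not m:
        raise ValueError(f"Unexpected wind value: {value_str!r}")
    speed_kmph = float(m.group(1)) * MPH_TO_KMPH  # mph -> km/h
    return round(speed_kmph, 2), m.group(2)
```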


@@ -19,6 +19,8 @@ over a specified period.
Documentation can be found at [Akkudoktor-EOS](https://akkudoktor-eos.readthedocs.io/en/latest/).
---
## Version Information
**Current Version:** {__version__}
@@ -29,4 +31,5 @@ Documentation can be found at [Akkudoktor-EOS](https://akkudoktor-eos.readthedoc
def About(**kwargs: Any) -> Div:
global about_md
return Markdown(about_md, **kwargs)


@@ -5,17 +5,13 @@ for the EOS dashboard.
"""
import json
from pathlib import Path
from typing import Any, Optional, Union
import requests
from fasthtml.common import Select
from loguru import logger
from monsterui.foundations import stringify
from monsterui.franken import ( # Select, TODO: Select from FrankenUI does not work - using Select from FastHTML instead
H3,
Button,
ButtonT,
Card,
Details,
Div,
@@ -28,33 +24,12 @@ from monsterui.franken import ( # Select, TODO: Select from FrankenUI does not
Summary,
UkIcon,
)
from platformdirs import user_config_dir
from akkudoktoreos.server.dash.components import Error, Success
from akkudoktoreos.server.dash.components import ConfigButton, Error, Success
from akkudoktoreos.server.dash.configuration import get_nested_value
from akkudoktoreos.server.dash.context import export_import_directory, request_url_for
from akkudoktoreos.utils.datetimeutil import to_datetime
# Directory to export files to, or to import files from
export_import_directory = Path(user_config_dir("net.akkudoktor.eosdash", "akkudoktor"))
def AdminButton(*c: Any, cls: Optional[Union[str, tuple]] = None, **kwargs: Any) -> Button:
"""Creates a styled button for administrative actions.
Args:
*c (Any): Positional arguments representing the button's content.
cls (Optional[Union[str, tuple]]): Additional CSS classes for styling. Defaults to None.
**kwargs (Any): Additional keyword arguments passed to the `Button`.
Returns:
Button: A styled `Button` component for admin actions.
"""
new_cls = f"{ButtonT.primary}"
if cls:
new_cls += f" {stringify(cls)}"
kwargs["cls"] = new_cls
return Button(*c, submit=False, **kwargs)
def AdminCache(
eos_host: str, eos_port: Union[str, int], data: Optional[dict], config: Optional[dict[str, Any]]
@@ -111,9 +86,9 @@ def AdminCache(
Grid(
DivHStacked(
UkIcon(icon="play"),
AdminButton(
ConfigButton(
"Clear all",
hx_post="/eosdash/admin",
hx_post=request_url_for("/eosdash/admin"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='{"category": "cache", "action": "clear"}',
@@ -132,9 +107,9 @@ def AdminCache(
Grid(
DivHStacked(
UkIcon(icon="play"),
AdminButton(
ConfigButton(
"Clear expired",
hx_post="/eosdash/admin",
hx_post=request_url_for("/eosdash/admin"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='{"category": "cache", "action": "clear-expired"}',
@@ -301,14 +276,16 @@ def AdminConfig(
)
# Update for display, in case we added a new file before
import_from_file_names = [f.name for f in list(export_import_directory.glob("*.json"))]
import_from_file_names = sorted([f.name for f in list(export_import_directory.glob("*.json"))])
if config_backup is None:
revert_to_backup_metadata_list = ["Backup list not available"]
else:
revert_to_backup_metadata_list = [
f"{backup_meta['date_time']} {backup_meta['version']}"
for backup_id, backup_meta in config_backup.items()
]
revert_to_backup_metadata_list = sorted(
[
f"{backup_meta['date_time']} {backup_meta['version']}"
for backup_id, backup_meta in config_backup.items()
]
)
return (
category,
@@ -319,9 +296,9 @@ def AdminConfig(
Grid(
DivHStacked(
UkIcon(icon="play"),
AdminButton(
ConfigButton(
"Save to file",
hx_post="/eosdash/admin",
hx_post=request_url_for("/eosdash/admin"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='{"category": "configuration", "action": "save_to_file"}',
@@ -341,9 +318,9 @@ def AdminConfig(
Grid(
DivHStacked(
UkIcon(icon="play"),
AdminButton(
ConfigButton(
"Revert to backup",
hx_post="/eosdash/admin",
hx_post=request_url_for("/eosdash/admin"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='js:{ "category": "configuration", "action": "revert_to_backup", "backup_metadata": document.querySelector("[name=\'selected_backup_metadata\']").value }',
@@ -352,6 +329,7 @@ def AdminConfig(
*Options(*revert_to_backup_metadata_list),
id="backup_metadata",
name="selected_backup_metadata", # Name of hidden input field with selected value
cls="border rounded px-3 py-2 mr-2",
placeholder="Select backup",
),
),
@@ -368,9 +346,9 @@ def AdminConfig(
Grid(
DivHStacked(
UkIcon(icon="play"),
AdminButton(
ConfigButton(
"Export to file",
hx_post="/eosdash/admin",
hx_post=request_url_for("/eosdash/admin"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='js:{"category": "configuration", "action": "export_to_file", "export_to_file_tag": document.querySelector("[name=\'chosen_export_file_tag\']").value }',
@@ -398,9 +376,9 @@ def AdminConfig(
Grid(
DivHStacked(
UkIcon(icon="play"),
AdminButton(
ConfigButton(
"Import from file",
hx_post="/eosdash/admin",
hx_post=request_url_for("/eosdash/admin"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='js:{ "category": "configuration", "action": "import_from_file", "import_file_name": document.querySelector("[name=\'selected_import_file_name\']").value }',
@@ -409,6 +387,7 @@ def AdminConfig(
*Options(*import_from_file_names),
id="import_file_name",
name="selected_import_file_name", # Name of hidden input field with selected value
cls="border rounded px-3 py-2 mr-2",
placeholder="Select file",
),
),
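Every `hx_post="/eosdash/admin"` becomes `hx_post=request_url_for("/eosdash/admin")` because Home Assistant ingress serves EOSdash below a per-add-on prefix, and hardcoded absolute paths would escape that sandbox. The real `request_url_for` in `akkudoktoreos.server.dash.context` presumably derives the prefix from the incoming request; this standalone sketch passes it explicitly just to illustrate the idea:

```python
def request_url_for(path: str, root_path: str = "") -> str:
    """Prefix an absolute dashboard path with an ingress root path (sketch).

    With no ingress prefix the path is returned unchanged, so the same
    templates work both standalone and behind Home Assistant ingress.
    """
    return f"{root_path.rstrip('/')}{path}"
```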


@@ -2,35 +2,13 @@
# MIT license
from typing import Optional
import bokeh
from bokeh.embed import components
from bokeh.models import Plot
from monsterui.franken import H4, Card, NotStr, Script
from bokeh.resources import INLINE
from monsterui.franken import H4, Card, NotStr
bokeh_version = bokeh.__version__
BokehJS = [
Script(
src=f"https://cdn.bokeh.org/bokeh/release/bokeh-{bokeh_version}.min.js",
crossorigin="anonymous",
),
Script(
src=f"https://cdn.bokeh.org/bokeh/release/bokeh-widgets-{bokeh_version}.min.js",
crossorigin="anonymous",
),
Script(
src=f"https://cdn.bokeh.org/bokeh/release/bokeh-tables-{bokeh_version}.min.js",
crossorigin="anonymous",
),
Script(
src=f"https://cdn.bokeh.org/bokeh/release/bokeh-gl-{bokeh_version}.min.js",
crossorigin="anonymous",
),
Script(
src=f"https://cdn.bokeh.org/bokeh/release/bokeh-mathjax-{bokeh_version}.min.js",
crossorigin="anonymous",
),
]
# Javascript for bokeh - to be included by the page
BokehJS = [NotStr(INLINE.render_css()), NotStr(INLINE.render_js())]
def bokey_apply_theme_to_plot(plot: Plot, dark: bool) -> None:

View File

@@ -1,28 +1,36 @@
from typing import Any, Optional, Union
import json
from typing import Any, Callable, Optional, Union
from fasthtml.common import H1, Button, Div, Li
from fasthtml.common import H1, Button, Div, Li, Select
from monsterui.daisy import (
Alert,
AlertT,
)
from monsterui.foundations import stringify
from monsterui.franken import ( # Button, Does not pass hx_vals
from monsterui.franken import ( # Select: Does not work - using Select from FastHTML instead; Button: Does not pass hx_vals - using Button from FastHTML instead
H3,
ButtonT,
Card,
Code,
Container,
ContainerT,
Details,
DivHStacked,
DivLAligned,
DivRAligned,
Form,
Grid,
Input,
Option,
P,
Pre,
Summary,
TabContainer,
UkIcon,
)
from akkudoktoreos.server.dash.context import request_url_for
scrollbar_viewport_styles = (
"scrollbar-width: none; -ms-overflow-style: none; -webkit-overflow-scrolling: touch;"
)
@@ -71,11 +79,59 @@ def ScrollArea(
)
def JsonView(data: Any) -> Pre:
"""Render structured data as formatted JSON inside a styled <pre> block.
The data is serialized to JSON using indentation for readability and
UTF-8 characters are preserved. The JSON is wrapped in a <code> element
with a JSON language class to support syntax highlighting, and then
placed inside a <pre> container with MonsterUI-compatible styling.
The JSON output is height-constrained and scrollable to safely display
large payloads without breaking the page layout.
Args:
data: Any JSON-serializable Python object to render.
Returns:
A FastHTML `Pre` element containing a formatted JSON representation
of the input data.
"""
code_str = json.dumps(data, indent=2, ensure_ascii=False)
return Pre(
Code(code_str, cls="language-json"),
cls="rounded-lg bg-muted p-3 max-h-[30vh] overflow-y-auto overflow-x-hidden whitespace-pre-wrap",
)
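The formatting in `JsonView` boils down to a single `json.dumps` call; a minimal sketch of just that serialization step (the helper name is illustrative, not from the codebase):

```python
import json

def render_json_code(data):
    """Serialize data the way JsonView does: indented, UTF-8 characters preserved."""
    return json.dumps(data, indent=2, ensure_ascii=False)

# ensure_ascii=False keeps non-ASCII characters literal instead of \uXXXX escapes
snippet = render_json_code({"battery": {"soc": 55, "unit": "%"}, "ort": "Köln"})
```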
def TextView(*c: Any, cls: Optional[Union[str, tuple]] = None, **kwargs: Any) -> Pre:
"""Render plain text with preserved line breaks and wrapped long lines.
This view uses a <pre> element with whitespace wrapping enabled so that
newline characters are respected while long lines are wrapped instead
of causing horizontal scrolling.
Args:
*c (Any): Positional arguments representing the TextView content.
cls (Optional[Union[str, tuple]]): Additional CSS classes for styling. Defaults to None.
**kwargs (Any): Additional keyword arguments passed to the `Pre`.
Returns:
A FastHTML `Pre` element that displays the text with preserved
formatting and line wrapping.
"""
new_cls = "whitespace-pre-wrap"
if cls:
new_cls += f" {stringify(cls)}"
kwargs["cls"] = new_cls
return Pre(*c, **kwargs)
def Success(*c: Any) -> Alert:
return Alert(
DivLAligned(
UkIcon("check"),
P(*c),
TextView(*c),
),
cls=AlertT.success,
)
@@ -85,12 +141,321 @@ def Error(*c: Any) -> Alert:
return Alert(
DivLAligned(
UkIcon("triangle-alert"),
P(*c),
TextView(*c),
),
cls=AlertT.error,
)
def ConfigButton(*c: Any, cls: Optional[Union[str, tuple]] = None, **kwargs: Any) -> Button:
"""Creates a styled button for configuration actions.
Args:
*c (Any): Positional arguments representing the button's content.
cls (Optional[Union[str, tuple]]): Additional CSS classes for styling. Defaults to None.
**kwargs (Any): Additional keyword arguments passed to the `Button`.
Returns:
Button: A styled `Button` component for configuration actions.
"""
new_cls = f"px-4 py-2 rounded {ButtonT.primary}"
if cls:
new_cls += f" {stringify(cls)}"
kwargs["cls"] = new_cls
return Button(*c, submit=False, **kwargs)
def make_config_update_form() -> Callable[[str, str], Grid]:
"""Factory for a form that sets a single configuration value.
Returns:
A function (config_name: str, value: str) -> Grid
"""
def ConfigUpdateForm(config_name: str, value: str) -> Grid:
config_id = config_name.lower().replace(".", "-")
return Grid(
DivRAligned(P("update")),
Grid(
Form(
Input(value="update", type="hidden", id="action"),
Input(value=config_name, type="hidden", id="key"),
Input(value=value, type="text", id="value"),
hx_put=request_url_for("/eosdash/configuration"),
hx_target="#page-content",
hx_swap="innerHTML",
),
),
id=f"{config_id}-update-form",
)
return ConfigUpdateForm
def make_config_update_value_form(
available_values: list[str],
) -> Callable[[str, str], Grid]:
"""Factory for a form that sets a single configuration value with pre-set avaliable values.
Args:
available_values: Allowed values for the configuration
Returns:
A function (config_name: str, value: str) -> Grid
"""
def ConfigUpdateValueForm(config_name: str, value: str) -> Grid:
config_id = config_name.lower().replace(".", "-")
return Grid(
DivRAligned(P("update value")),
DivHStacked(
ConfigButton(
"Set",
hx_put=request_url_for("/eosdash/configuration"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals=f"""js:{{
action: "update",
key: "{config_name}",
value: document
.querySelector("[name='{config_id}_selected_value']")
.value
}}""",
),
Select(
Option("Select a value...", value="", selected=True, disabled=True),
*[
Option(
val,
value=val,
selected=(val == value),
)
for val in available_values
],
id=f"{config_id}-value-select",
name=f"{config_id}_selected_value",
required=True,
cls="border rounded px-3 py-2 mr-2 col-span-4",
),
),
id=f"{config_id}-update-value-form",
)
return ConfigUpdateValueForm
def make_config_update_list_form(available_values: list[str]) -> Callable[[str, str], Grid]:
"""Factory function that creates a ConfigUpdateListForm with pre-set available values.
Args:
available_values: List of available values to choose from
Returns:
A function that creates ConfigUpdateListForm instances with the given available_values.
The returned function takes (config_name: str, value: str) and returns a Grid.
"""
def ConfigUpdateListForm(config_name: str, value: str) -> Grid:
"""Creates a card with a form to add/remove values from a list.
Sends to "/eosdash/configuration":
The form sends an HTTP PUT request with the following parameters:
- key (str): The configuration key name (value of config_name parameter)
- value (str): A JSON string representing the updated list of values
The value parameter will always be a valid JSON string representation of a list.
Args:
config_name: The name of the configuration
value (str): The current value of the configuration, a list of values in json format.
"""
current_values = json.loads(value)
if current_values is None:
current_values = []
config_id = config_name.lower().replace(".", "-")
return Grid(
DivRAligned(P("update list")),
Grid(
# Form to add new value to list
DivHStacked(
ConfigButton(
"Add",
hx_put=request_url_for("/eosdash/configuration"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals=f"""js:{{
action: "update",
key: "{config_name}",
value: JSON.stringify(
[...new Set([
...{json.dumps(current_values)},
document.querySelector("[name='{config_id}_selected_add_value']").value.trim()
])].filter(v => v !== "")
)
}}""",
),
Select(
Option("Select a value...", value="", selected=True, disabled=True),
*[
Option(val, value=val, disabled=val in current_values)
for val in available_values
],
id=f"{config_id}-add-value-select",
name=f"{config_id}_selected_add_value", # Name of hidden input with selected value
required=True,
cls="border rounded px-3 py-2 mr-2 col-span-4",
),
),
# Form to delete value from list
DivHStacked(
ConfigButton(
"Delete",
hx_put=request_url_for("/eosdash/configuration"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals=f"""js:{{
action: "update",
key: "{config_name}",
value: JSON.stringify(
[...new Set([
...{json.dumps(current_values)}
])].filter(v => v !== document.querySelector("[name='{config_id}_selected_delete_value']").value.trim())
)
}}""",
),
Select(
Option("Select a value...", value="", selected=True, disabled=True),
*[Option(val, value=val) for val in current_values],
id=f"{config_id}-delete-value-select",
name=f"{config_id}_selected_delete_value", # Name of hidden input with selected value
required=True,
cls="border rounded px-3 py-2 mr-2 col-span-4",
),
),
cols=1,
),
id=f"{config_id}-update-list-form",
)
# Return the function that creates a ConfigUpdateListForm instance
return ConfigUpdateListForm
def make_config_update_map_form(
available_keys: list[str] | None = None,
available_values: list[str] | None = None,
) -> Callable[[str, str], Grid]:
"""Factory function that creates a ConfigUpdateMapForm.
Args:
available_keys: Optional list of allowed keys (None = free text)
available_values: Optional list of allowed values (None = free text)
Returns:
A function that creates ConfigUpdateMapForm instances.
The returned function takes (config_name: str, value: str) and returns a Grid.
"""
def ConfigUpdateMapForm(config_name: str, value: str) -> Grid:
"""Creates a card with a form to add/update/delete entries in a map."""
current_map: dict[str, str] = json.loads(value) or {}
config_id = config_name.lower().replace(".", "-")
return Grid(
DivRAligned(P("update map")),
Grid(
# Add / update key-value pair
DivHStacked(
ConfigButton(
"Set",
hx_put=request_url_for("/eosdash/configuration"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals=f"""js:{{
action: "update",
key: "{config_name}",
value: JSON.stringify(
Object.assign(
{json.dumps(current_map)},
{{
[document.querySelector("[name='{config_id}_set_key']").value.trim()]:
document.querySelector("[name='{config_id}_set_value']").value.trim()
}}
)
)
}}""",
),
(
Select(
Option("Select key...", value="", selected=True, disabled=True),
*[Option(k, value=k) for k in sorted(available_keys)],
name=f"{config_id}_set_key",
cls="border rounded px-3 py-2 col-span-2",
)
if available_keys
else Input(
name=f"{config_id}_set_key",
placeholder="Key",
required=True,
cls="border rounded px-3 py-2 col-span-2",
),
),
(
Select(
Option("Select value...", value="", selected=True, disabled=True),
*[Option(k, value=k) for k in sorted(available_values)],
name=f"{config_id}_set_value",
cls="border rounded px-3 py-2 col-span-2",
)
if available_values
else Input(
name=f"{config_id}_set_value",
placeholder="Value",
required=True,
cls="border rounded px-3 py-2 col-span-2",
),
),
),
# Delete key
DivHStacked(
ConfigButton(
"Delete",
hx_put=request_url_for("/eosdash/configuration"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals=f"""js:{{
action: "update",
key: "{config_name}",
value: JSON.stringify(
Object.fromEntries(
Object.entries({json.dumps(current_map)})
.filter(([k]) =>
k !== document.querySelector("[name='{config_id}_delete_key']").value
)
)
)
}}""",
),
Select(
Option("Select key...", value="", selected=True, disabled=True),
*[Option(k, value=k) for k in sorted(current_map.keys())],
name=f"{config_id}_delete_key",
required=True,
cls="border rounded px-3 py-2 col-span-4",
),
),
cols=1,
),
id=f"{config_id}-update-map-form",
)
return ConfigUpdateMapForm
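Likewise for the map form: the Set button merges one entry via `Object.assign`, and Delete rebuilds the map with `Object.fromEntries` plus a filter. A Python sketch of those semantics (hypothetical helper names):

```python
import json

def set_map_entry(current_json: str, key: str, value: str) -> str:
    """Mirror of the 'Set' button's JS (Object.assign): add or overwrite one entry."""
    current = json.loads(current_json) or {}
    current[key.strip()] = value.strip()
    return json.dumps(current)

def delete_map_entry(current_json: str, key: str) -> str:
    """Mirror of the 'Delete' button's JS (Object.fromEntries + filter): drop one key."""
    current = json.loads(current_json) or {}
    return json.dumps({k: v for k, v in current.items() if k != key})
```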
def ConfigCard(
config_name: str,
config_type: str,
@@ -102,6 +467,7 @@ def ConfigCard(
update_error: Optional[str],
update_value: Optional[str],
update_open: Optional[bool],
update_form_factory: Optional[Callable[[str, str], Grid]] = None,
) -> Card:
"""Creates a styled configuration card for displaying configuration details.
@@ -113,7 +479,7 @@ def ConfigCard(
config_name (str): The name of the configuration.
config_type (str): The type of the configuration.
read_only (str): Indicates if the configuration is read-only ("rw" for read-write,
any other value indicates read-only).
any other value indicates read-only).
value (str): The current value of the configuration.
default (str): The default value of the configuration.
description (str): A description of the configuration.
@@ -121,7 +487,9 @@ def ConfigCard(
update_error (Optional[str]): The error message, if any, during the update process.
update_value (Optional[str]): The value to be updated, if different from the current value.
update_open (Optional[bool]): A flag indicating whether the update section of the card
should be initially expanded.
should be initially expanded.
update_form_factory (Optional[Callable[[str, str], Grid]]): The factory to create a form to
use to update the configuration value. Defaults to simple text input.
Returns:
Card: A styled Card component containing the configuration details.
@@ -131,6 +499,11 @@ def ConfigCard(
update_value = value
if not update_open:
update_open = False
if not update_form_factory:
# Default update form
update_form = make_config_update_form()(config_name, update_value)
else:
update_form = update_form_factory(config_name, update_value)
if deprecated:
if isinstance(deprecated, bool):
deprecated = "Deprecated"
@@ -147,12 +520,12 @@ def ConfigCard(
P(read_only),
),
),
P(value),
JsonView(json.loads(value)),
),
cls="list-none",
),
Grid(
P(description),
TextView(description),
P(config_type),
)
if not deprecated
@@ -171,27 +544,18 @@ def ConfigCard(
if read_only == "rw" and not deprecated
else None,
# Set value
Grid(
DivRAligned(P("update")),
Grid(
Form(
Input(value=config_name, type="hidden", id="key"),
Input(value=update_value, type="text", id="value"),
hx_put="/eosdash/configuration",
hx_target="#page-content",
hx_swap="innerHTML",
),
),
)
if read_only == "rw" and not deprecated
else None,
update_form if read_only == "rw" and not deprecated else None,
# Last error
Grid(
DivRAligned(P("update error")),
P(update_error),
TextView(update_error),
)
if update_error
else None,
# Provide minimal update form on error if complex update_form is used
make_config_update_form()(config_name, update_value)
if update_error and update_form_factory is not None
else None,
cls="space-y-4 gap-4",
open=update_open,
),
@@ -226,7 +590,7 @@ def DashboardFooter(*c: Any, path: str) -> Card:
"""
return Card(
Container(*c, id="footer-content"),
hx_get=f"{path}",
hx_get=request_url_for(path),
hx_trigger="every 5s",
hx_target="#footer-content",
hx_swap="innerHTML",
@@ -266,7 +630,7 @@ def DashboardTabs(dashboard_items: dict[str, str]) -> Card:
Li(
DashboardTrigger(
H3(menu),
hx_get=f"{path}",
hx_get=request_url_for(path),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='js:{ "dark": window.matchMedia("(prefers-color-scheme: dark)").matches }',

View File

@@ -1,5 +1,6 @@
import json
from typing import Any, Dict, List, Optional, Sequence, TypeVar, Union
from collections.abc import Sequence
from typing import Any, Dict, List, Optional, TypeVar, Union
import requests
from loguru import logger
@@ -7,6 +8,7 @@ from monsterui.franken import (
H3,
H4,
Card,
CardTitle,
Details,
Div,
DividerLine,
@@ -15,6 +17,7 @@ from monsterui.franken import (
Form,
Grid,
Input,
LabelCheckboxX,
P,
Summary,
UkIcon,
@@ -25,7 +28,15 @@ from pydantic_core import PydanticUndefined
from akkudoktoreos.config.config import ConfigEOS
from akkudoktoreos.core.pydantic import PydanticBaseModel
from akkudoktoreos.prediction.pvforecast import PVForecastPlaneSetting
from akkudoktoreos.server.dash.components import ConfigCard
from akkudoktoreos.server.dash.components import (
ConfigCard,
JsonView,
TextView,
make_config_update_list_form,
make_config_update_map_form,
make_config_update_value_form,
)
from akkudoktoreos.server.dash.context import request_url_for
T = TypeVar("T")
@@ -33,6 +44,14 @@ T = TypeVar("T")
# Dictionary of config names and associated dictionary with keys "value", "result", "error", "open".
config_update_latest: dict[str, dict[str, Optional[Union[str, bool]]]] = {}
# Current state of config displayed
config_visible: dict[str, dict] = {
"config-visible-read-only": {
"label": "Configuration (read-only)",
"visible": False,
},
}
def get_nested_value(
dictionary: Union[Dict[str, Any], List[Any]],
@@ -178,9 +197,9 @@ def resolve_nested_types(field_type: Any, parent_types: list[str]) -> list[tuple
return resolved_types
def configuration(
def create_config_details(
model: type[PydanticBaseModel], values: dict, values_prefix: list[str] = []
) -> list[dict]:
) -> dict[str, dict]:
"""Generate configuration details based on provided values and model metadata.
Args:
@@ -189,9 +208,9 @@ def configuration(
values_prefix (list[str]): A list of parent type names that prefixes the model values in the values.
Returns:
list[dict]: A sorted list of configuration details, each represented as a dictionary.
dict[str, dict]: A dictionary of configuration details keyed by configuration name, each represented as a dictionary.
"""
configs = []
config_details: dict[str, dict] = {}
inner_types: set[type[PydanticBaseModel]] = set()
for field_name, field_info in list(model.model_fields.items()) + list(
@@ -244,7 +263,7 @@ def configuration(
.replace("NoneType", "None")
.replace("<class 'float'>", "float")
)
configs.append(config)
config_details[str(config["name"])] = config
found_basic = True
else:
new_parent_types = parent_types + nested_parent_types
@@ -258,18 +277,18 @@ def configuration(
)
extract_nested_models(field_info, [field_name])
return sorted(configs, key=lambda x: x["name"])
return config_details
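Returning a dict keyed by the dot-notation name (instead of the previous sorted list) allows direct lookups such as `config_details["pvforecast.max_planes"]`, while `sorted(config_details.keys())` reproduces the old ordering. A small illustration with made-up entries:

```python
# Hypothetical, minimal stand-in for the structure create_config_details() now returns:
# a dict keyed by the dot-notation configuration name instead of a sorted list.
config_details = {
    "pvforecast.max_planes": {"name": "pvforecast.max_planes", "value": "2"},
    "ems.mode": {"name": "ems.mode", "value": '"OPTIMIZATION"'},
}

# Direct lookup replaces scanning the old list for a matching "name":
max_planes = int(config_details["pvforecast.max_planes"]["value"])

# Iterating sorted keys reproduces the old sorted-list ordering:
ordered_names = sorted(config_details.keys())
```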
def get_configuration(eos_host: str, eos_port: Union[str, int]) -> list[dict]:
"""Fetch and process configuration data from the specified EOS server.
def get_config(eos_host: str, eos_port: Union[str, int]) -> dict[str, Any]:
"""Fetch configuration data from the specified EOS server.
Args:
eos_host (str): The hostname of the EOS server.
eos_port (Union[str, int]): The port of the EOS server.
Returns:
List[dict]: A list of processed configuration entries.
dict[str, Any]: A dict of configuration data.
"""
server = f"http://{eos_host}:{eos_port}"
@@ -284,7 +303,7 @@ def get_configuration(eos_host: str, eos_port: Union[str, int]) -> list[dict]:
warning_msg = f"Can not retrieve configuration from {server}: {e}, {detail}"
logger.warning(warning_msg)
return configuration(ConfigEOS, config)
return config
def ConfigPlanesCard(
@@ -341,7 +360,7 @@ def ConfigPlanesCard(
# Create cards for all planes
rows = []
for i in range(0, max_planes):
plane_config = configuration(
plane_config = create_config_details(
PVForecastPlaneSetting(),
eos_planes_config,
values_prefix=["pvforecast", "planes", str(i)],
@@ -352,10 +371,12 @@ def ConfigPlanesCard(
plane_value = json.dumps(eos_planes[i])
else:
plane_value = json.dumps(None)
for config in plane_config:
for config_key in sorted(plane_config.keys()):
config = plane_config[config_key]
update_error = config_update_latest.get(config["name"], {}).get("error") # type: ignore
update_value = config_update_latest.get(config["name"], {}).get("value") # type: ignore
update_open = config_update_latest.get(config["name"], {}).get("open") # type: ignore
update_form_factory = None
if update_open:
planes_update_open = True
plane_update_open = True
@@ -368,6 +389,12 @@ def ConfigPlanesCard(
error_msg = "update_error or update_value or update_open of wrong type."
logger.error(error_msg)
raise TypeError(error_msg)
if config["name"].endswith("pvtechchoice"):
update_form_factory = make_config_update_value_form(
["crystSi", "CIS", "CdTe", "Unknown"]
)
elif config["name"].endswith("mountingplace"):
update_form_factory = make_config_update_value_form(["free", "building"])
plane_rows.append(
ConfigCard(
config["name"],
@@ -380,6 +407,7 @@ def ConfigPlanesCard(
update_error,
update_value,
update_open,
update_form_factory,
)
)
rows.append(
@@ -396,7 +424,7 @@ def ConfigPlanesCard(
P(read_only),
),
),
P(plane_value),
JsonView(json.loads(plane_value)),
),
cls="list-none",
),
@@ -421,12 +449,12 @@ def ConfigPlanesCard(
P(read_only),
),
),
P(value),
JsonView(json.loads(value)),
),
cls="list-none",
),
Grid(
P(description),
TextView(description),
P(config_type),
),
# Default
@@ -441,9 +469,10 @@ def ConfigPlanesCard(
DivRAligned(P("update")),
Grid(
Form(
Input(value="update", type="hidden", id="action"),
Input(value=config_name, type="hidden", id="key"),
Input(value=planes_update_value, type="text", id="value"),
hx_put="/eosdash/configuration",
hx_put=request_url_for("/eosdash/configuration"),
hx_target="#page-content",
hx_swap="innerHTML",
),
@@ -454,7 +483,7 @@ def ConfigPlanesCard(
# Last error
Grid(
DivRAligned(P("update error")),
P(planes_update_error),
TextView(planes_update_error),
)
if planes_update_error
else None,
@@ -468,33 +497,150 @@ def ConfigPlanesCard(
def Configuration(
eos_host: str, eos_port: Union[str, int], configuration: Optional[list[dict]] = None
eos_host: str,
eos_port: Union[str, int],
data: Optional[dict] = None,
) -> Div:
"""Create a visual representation of the configuration.
Args:
eos_host (str): The hostname of the EOS server.
eos_port (Union[str, int]): The port of the EOS server.
configuration (Optional[list[dict]]): Optional configuration. If not provided it will be
retrieved from EOS.
data (Optional[dict], optional): Incoming data to trigger config actions. Defaults to None.
Returns:
Div: A Div component containing the configuration menu and rows of configuration details.
"""
if not configuration:
configuration = get_configuration(eos_host, eos_port)
global config_visible
dark = False
if data and data.get("action", None):
if data.get("dark", None) == "true":
dark = True
if data["action"] == "visible":
renderer = data.get("renderer", None)
if renderer:
config_visible[renderer]["visible"] = bool(data.get(f"{renderer}-visible", False))
elif data["action"] == "update":
# This data contains a new value for key
key = data["key"]
value_json_str: str = data.get("value", "")
try:
value = json.loads(value_json_str)
except Exception:
if value_json_str in ("None", "none", "Null", "null"):
value = None
else:
value = value_json_str
error = None
config = None
try:
server = f"http://{eos_host}:{eos_port}"
path = key.replace(".", "/")
response = requests.put(f"{server}/v1/config/{path}", json=value, timeout=10)
response.raise_for_status()
config = response.json()
except requests.exceptions.HTTPError as err:
try:
# Try to get 'detail' from the JSON response
detail = response.json().get(
"detail", f"No error details for value '{value}' '{response.text}'"
)
except ValueError:
# Response is not JSON
detail = f"No error details for value '{value}' '{response.text}'"
error = f"Can not set {key} on {server}: {err}, {detail}"
# Mark all updates as closed
for k in config_update_latest:
config_update_latest[k]["open"] = False
# Remember this update as latest one
config_update_latest[key] = {
"error": error,
"result": config,
"value": value_json_str,
"open": True,
}
# (Re-)read configuration details to be sure we display up-to-date data
config = get_config(eos_host, eos_port)
# Process configuration data
config_details = create_config_details(ConfigEOS, config)
ConfigMenu = Card(
# CheckboxGroup to toggle config data visibility
Grid(
*[
LabelCheckboxX(
label=config_visible[renderer]["label"],
id=f"{renderer}-visible",
name=f"{renderer}-visible",
value="true",
checked=config_visible[renderer]["visible"],
hx_post=request_url_for("/eosdash/configuration"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='js:{ "action": "visible", "renderer": '
+ '"'
+ f"{renderer}"
+ '", '
+ '"dark": window.matchMedia("(prefers-color-scheme: dark)").matches '
+ "}",
# lbl_cls=f"text-{solution_color[renderer]}",
)
for renderer in list(config_visible.keys())
],
cols=4,
),
header=CardTitle("Choose What's Shown"),
)
rows = []
last_category = ""
# find some special configuration values
max_planes = 0
for config in configuration:
if config["name"] == "pvforecast.max_planes":
try:
max_planes = int(config["value"])
except:
max_planes = 0
try:
max_planes = int(config_details["pvforecast.max_planes"]["value"])
except Exception:
max_planes = 0
logger.debug(f"max_planes: {max_planes}")
try:
homeassistant_entity_ids = json.loads(
config_details["adapter.homeassistant.homeassistant_entity_ids"]["value"]
)
except Exception:
homeassistant_entity_ids = []
logger.debug(f"homeassistant_entity_ids: {homeassistant_entity_ids}")
eos_solution_entity_ids = []
try:
eos_solution_entity_ids = json.loads(
config_details["adapter.homeassistant.eos_solution_entity_ids"]["value"]
)
except Exception:
eos_solution_entity_ids = []
logger.debug(f"eos_solution_entity_ids {eos_solution_entity_ids}")
eos_device_instruction_entity_ids = []
try:
eos_device_instruction_entity_ids = json.loads(
config_details["adapter.homeassistant.eos_device_instruction_entity_ids"]["value"]
)
except Exception:
eos_device_instruction_entity_ids = []
logger.debug(f"eos_device_instruction_entity_ids {eos_device_instruction_entity_ids}")
devices_measurement_keys = []
try:
devices_measurement_keys = json.loads(config_details["devices.measurement_keys"]["value"])
except Exception:
devices_measurement_keys = []
logger.debug(f"devices_measurement_keys {devices_measurement_keys}")
# build visual representation
for config in configuration:
for config_key in sorted(config_details.keys()):
config = config_details[config_key]
category = config["name"].split(".")[0]
if category != last_category:
rows.append(H3(category))
@@ -512,6 +658,12 @@ def Configuration(
error_msg = "update_error or update_value or update_open of wrong type."
logger.error(error_msg)
raise TypeError(error_msg)
if (
not config_visible["config-visible-read-only"]["visible"]
and config["read-only"] != "rw"
):
# Do not display read only values
continue
if (
config["type"]
== "Optional[list[akkudoktoreos.prediction.pvforecast.PVForecastPlaneSetting]]"
@@ -532,7 +684,47 @@ def Configuration(
update_open,
)
)
else:
elif not config["deprecated"]:
update_form_factory = None
if config["name"].endswith(".provider"):
# Special configuration for prediction provider setting
try:
provider_ids = json.loads(config_details[config["name"] + "s"]["value"])
except Exception:
provider_ids = []
if config["type"].startswith("Optional[list"):
update_form_factory = make_config_update_list_form(provider_ids)
else:
provider_ids.append("None")
update_form_factory = make_config_update_value_form(provider_ids)
elif config["name"].startswith("adapter.homeassistant.config_entity_ids"):
# Home Assistant adapter config entities
update_form_factory = make_config_update_map_form(None, homeassistant_entity_ids)
elif config["name"].startswith("adapter.homeassistant.load_emr_entity_ids"):
# Home Assistant adapter load energy meter readings entities
update_form_factory = make_config_update_list_form(homeassistant_entity_ids)
elif config["name"].startswith("adapter.homeassistant.pv_production_emr_entity_ids"):
# Home Assistant adapter pv energy meter readings entities
update_form_factory = make_config_update_list_form(homeassistant_entity_ids)
elif config["name"].startswith("adapter.homeassistant.device_measurement_entity_ids"):
# Home Assistant adapter device measurement entities
update_form_factory = make_config_update_map_form(
devices_measurement_keys, homeassistant_entity_ids
)
elif config["name"].startswith("adapter.homeassistant.device_instruction_entity_ids"):
# Home Assistant adapter device instruction entities
update_form_factory = make_config_update_list_form(
eos_device_instruction_entity_ids
)
elif config["name"].startswith("adapter.homeassistant.solution_entity_ids"):
# Home Assistant adapter optimization solution entities
update_form_factory = make_config_update_list_form(eos_solution_entity_ids)
elif config["name"].startswith("ems.mode"):
# Energy management mode
update_form_factory = make_config_update_value_form(
["OPTIMIZATION", "PREDICTION", "None"]
)
rows.append(
ConfigCard(
config["name"],
@@ -545,61 +737,8 @@ def Configuration(
update_error,
update_value,
update_open,
update_form_factory,
)
)
return Div(*rows, cls="space-y-4")
def ConfigKeyUpdate(eos_host: str, eos_port: Union[str, int], key: str, value: str) -> P:
"""Update configuration key and create a visual representation of the configuration.
Args:
eos_host (str): The hostname of the EOS server.
eos_port (Union[str, int]): The port of the EOS server.
key (str): configuration key in dot notation
value (str): configuration value as json string
Returns:
rows: Rows of configuration details.
"""
server = f"http://{eos_host}:{eos_port}"
path = key.replace(".", "/")
try:
data = json.loads(value)
except:
if value in ("None", "none", "Null", "null"):
data = None
else:
data = value
error = None
config = None
try:
response = requests.put(f"{server}/v1/config/{path}", json=data, timeout=10)
response.raise_for_status()
config = response.json()
except requests.exceptions.HTTPError as err:
try:
# Try to get 'detail' from the JSON response
detail = response.json().get(
"detail", f"No error details for data '{data}' '{response.text}'"
)
except ValueError:
# Response is not JSON
detail = f"No error details for data '{data}' '{response.text}'"
error = f"Can not set {key} on {server}: {err}, {detail}"
# Mark all updates as closed
for k in config_update_latest:
config_update_latest[k]["open"] = False
# Remember this update as latest one
config_update_latest[key] = {
"error": error,
"result": config,
"value": value,
"open": True,
}
if error or config is None:
# Reread configuration to be sure we display up-to-date data
return Configuration(eos_host, eos_port)
# Use configuration already provided
return Configuration(eos_host, eos_port, configuration(ConfigEOS, config))
return Div(ConfigMenu, *rows, cls="space-y-3")

View File

@@ -0,0 +1,169 @@
import os
from pathlib import Path
from typing import Awaitable, Callable, Optional
from loguru import logger
from platformdirs import user_config_dir
from starlette.middleware.base import BaseHTTPMiddleware
from starlette.requests import Request
from starlette.responses import Response
# Home Assistant token, if running under Home Assistant
HASSIO_TOKEN = os.environ.get("HASSIO_TOKEN")
# Compute global root path at startup
# Will be replaced on first request if Ingress is active
ROOT_PATH = "/"
# EOSdash path prefix
EOSDASH_ROOT = "eosdash/"
# Directory to export files to, or to import files from
export_import_directory = (
Path(os.environ.get("EOS_DATA_DIR", user_config_dir("net.akkudoktor.eosdash", "akkudoktor")))
if not HASSIO_TOKEN
else Path("/data")
)
class IngressMiddleware(BaseHTTPMiddleware):
"""Middleware to handle Home Assistant Ingress path prefixes.
This middleware enables FastHTML applications to work seamlessly both with
and without Home Assistant Ingress. When deployed as a Home Assistant add-on
with Ingress enabled, it automatically handles the path prefix routing.
Home Assistant Ingress proxies add-on traffic through paths like
`/api/hassio_ingress/<token>/`, which requires setting the application's
root_path for correct URL generation. This middleware detects the Ingress
path from the X-Ingress-Path header and configures the request scope
accordingly.
When running standalone (development or direct access), the middleware
passes requests through unchanged, allowing normal operation.
Attributes:
None
Examples:
>>> from fasthtml.common import FastHTML
>>> from starlette.middleware import Middleware
>>>
>>> app = FastHTML(middleware=[Middleware(IngressMiddleware)])
>>>
>>> @app.get("/")
... def home():
... return "Hello World"
Notes:
- All htmx and route URLs should use relative paths (e.g., "/api/data")
- The middleware automatically adapts to both Ingress and direct access
- No code changes needed when switching between deployment modes
"""
async def dispatch(
self, request: Request, call_next: Callable[[Request], Awaitable[Response]]
) -> Response:
"""Process the request and set root_path if running under Ingress.
Args:
request: The incoming Starlette Request object.
call_next: Callable to invoke the next middleware or route handler.
Returns:
Response: The response from the application after processing.
Note:
The X-Ingress-Path header is automatically added by Home Assistant
when proxying requests through Ingress.
"""
global ROOT_PATH
# Home Assistant passes the ingress path in this header
# Try multiple header variations (case-insensitive)
ingress_path = (
request.headers.get("X-Ingress-Path", "")
or request.headers.get("x-ingress-path", "")
or request.headers.get("X-INGRESS-PATH", "")
)
# Debug logging - remove after testing
logger.debug(f"All headers: {dict(request.headers)}")
logger.debug(f"Ingress path: {ingress_path}")
logger.debug(f"Request path: {request.url.path}")
# Only set root_path if we have an ingress path
if ingress_path:
ROOT_PATH = ingress_path
request.scope["root_path"] = ingress_path
# Otherwise, root_path remains empty (normal operation)
response = await call_next(request)
return response
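The core of `dispatch` is a single scope mutation. It can be sketched outside Starlette as a plain function over an ASGI-style scope dict and a lowercase header mapping (the helper name `apply_ingress_root_path` is made up for this sketch):

```python
def apply_ingress_root_path(scope: dict, headers: dict) -> dict:
    # When Home Assistant proxies the request, X-Ingress-Path carries the
    # external prefix and becomes the ASGI root_path; otherwise the scope
    # is left untouched and the app behaves as in direct access.
    ingress_path = headers.get("x-ingress-path", "")
    if ingress_path:
        scope["root_path"] = ingress_path
    return scope
```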
# Helper functions
def request_url_for(path: str, root_path: Optional[str] = None) -> str:
"""Generate a full URL including the root_path.
Args:
path: Relative path **inside the app** (e.g., "eosdash/footer" or "eosdash/assets/logo.png").
root_path: Root path.
Returns:
str: Absolute URL including the root_path.
"""
global ROOT_PATH, EOSDASH_ROOT
# Step 1: fallback to global root
if root_path is None:
root_path = ROOT_PATH
# Normalize root path
root_path = root_path.rstrip("/") + "/"
# Normalize path
if path.startswith(root_path):
# Strip root_path prefix
path = path[len(root_path) :]
# Remove leading / if any
path = path.lstrip("/")
# Strip EOSDASH_ROOT if present
if path.startswith(EOSDASH_ROOT):
path = path[len(EOSDASH_ROOT) :]
# Build final URL
result = root_path + EOSDASH_ROOT + path.lstrip("/")
# Normalize accidental double slashes (except leading)
while "//" in result[1:]:
result = result.replace("//", "/")
logger.debug(f"URL for path '{path}' with root path '{root_path}': '{result}'")
return result
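The prefix juggling in `request_url_for` can be exercised with a self-contained copy that takes the EOSdash prefix as a parameter (the value `"eosdash/"` for `EOSDASH_ROOT` is an assumption for this sketch):

```python
def url_for(path: str, root_path: str, eosdash_root: str = "eosdash/") -> str:
    # Standalone copy of the prefix handling: strip an already-present
    # root_path or EOSdash prefix from the path, then rebuild the URL as
    # root_path + eosdash_root + path and collapse double slashes.
    root_path = root_path.rstrip("/") + "/"
    if path.startswith(root_path):
        path = path[len(root_path):]
    path = path.lstrip("/")
    if path.startswith(eosdash_root):
        path = path[len(eosdash_root):]
    result = root_path + eosdash_root + path.lstrip("/")
    while "//" in result[1:]:
        result = result.replace("//", "/")
    return result
```

The same relative path thus resolves correctly both under an ingress prefix and in direct access.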
def safe_asset_path(filepath: str) -> Path:
"""Return a safe filesystem path for an asset under dash/assets/.
This prevents directory traversal attacks by restricting paths to
the assets folder.
Args:
filepath (str): Relative asset path requested by the client.
Returns:
Path: Absolute Path object pointing to the asset file.
Raises:
ValueError: If the filepath attempts to traverse directories using '../'.
"""
if ".." in filepath or filepath.startswith("/"):
raise ValueError(f"Forbidden file path: {filepath}")
asset_path = Path(__file__).parent / "dash/assets" / filepath
return asset_path
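The traversal check can be isolated into a small helper for illustration (`ASSETS_BASE` and `safe_join` are names invented for this sketch):

```python
from pathlib import Path

ASSETS_BASE = Path("dash/assets")  # hypothetical assets root

def safe_join(base: Path, filepath: str) -> Path:
    # Reject parent references and absolute paths before joining, so every
    # accepted request resolves below the assets folder.
    if ".." in filepath or filepath.startswith("/"):
        raise ValueError(f"Forbidden file path: {filepath}")
    return base / filepath
```

The substring check is deliberately conservative: it also rejects legal names that merely contain `..`. A stricter variant would resolve the joined path and verify it is still relative to the base directory.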

View File


@@ -9,8 +9,6 @@ from requests.exceptions import RequestException
import akkudoktoreos.server.dash.eosstatus as eosstatus
from akkudoktoreos.config.config import get_config
config_eos = get_config()
def get_alive(eos_host: str, eos_port: Union[str, int]) -> str:
"""Fetch alive information from the specified EOS server.
@@ -42,9 +40,9 @@ def get_alive(eos_host: str, eos_port: Union[str, int]) -> str:
def Footer(eos_host: Optional[str], eos_port: Optional[Union[str, int]]) -> str:
if eos_host is None:
eos_host = config_eos.server.host
eos_host = get_config().server.host
if eos_port is None:
eos_port = config_eos.server.port
eos_port = get_config().server.port
alive_icon = None
if eos_host is None or eos_port is None:
alive = "EOS server not given: {eos_host}:{eos_port}"

View File


@@ -1,5 +1,8 @@
"""Markdown rendering with MonsterUI HTML classes."""
import base64
import mimetypes
from pathlib import Path
from typing import Any, List, Optional, Union
from fasthtml.common import FT, Div, NotStr
@@ -8,113 +11,138 @@ from markdown_it.renderer import RendererHTML
from markdown_it.token import Token
from monsterui.foundations import stringify
# Where to find the static data assets
ASSETS_DIR = Path(__file__).parent / "assets"
ASSETS_PREFIX = "/eosdash/assets/"
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".gif", ".webp", ".svg", ".ico"}
def file_to_data_uri(file_path: Path) -> str:
"""Convert a file to a data URI.
Args:
file_path: Path to the file to convert.
Returns:
str: Data URI string with format data:mime/type;base64,encoded_data
"""
ext = file_path.suffix.lower()
# Determine MIME type
mime, _ = mimetypes.guess_type(str(file_path))
if mime is None:
mime = f"image/{ext.lstrip('.')}"
# Read file as bytes and encode to base64
raw = file_path.read_bytes()
encoded = base64.b64encode(raw).decode("ascii")
return f"data:{mime};base64,{encoded}"
def render_heading(
self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict
) -> str:
"""Custom renderer for Markdown headings.
Adds specific CSS classes based on the heading level.
Parameters:
self: The renderer instance.
tokens: List of tokens to be rendered.
idx: Index of the current token.
options: Rendering options.
env: Environment sandbox for plugins.
Returns:
The rendered token as a string.
"""
"""Custom renderer for Markdown headings with MonsterUI styling."""
if tokens[idx].markup == "#":
tokens[idx].attrSet("class", "uk-heading-divider uk-h1 uk-margin")
tokens[idx].attrSet(
"class",
"scroll-m-20 text-4xl font-extrabold tracking-tight lg:text-5xl mt-8 mb-4 border-b pb-2",
)
elif tokens[idx].markup == "##":
tokens[idx].attrSet("class", "uk-heading-divider uk-h2 uk-margin")
tokens[idx].attrSet(
"class", "scroll-m-20 border-b pb-2 text-3xl font-semibold tracking-tight mt-6 mb-3"
)
elif tokens[idx].markup == "###":
tokens[idx].attrSet("class", "uk-heading-divider uk-h3 uk-margin")
tokens[idx].attrSet("class", "scroll-m-20 text-2xl font-semibold tracking-tight mt-5 mb-2")
elif tokens[idx].markup == "####":
tokens[idx].attrSet("class", "uk-heading-divider uk-h4 uk-margin")
tokens[idx].attrSet("class", "scroll-m-20 text-xl font-semibold tracking-tight mt-4 mb-2")
# pass token to default renderer.
return self.renderToken(tokens, idx, options, env)
def render_paragraph(
self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict
) -> str:
"""Custom renderer for Markdown paragraphs.
Adds specific CSS classes.
Parameters:
self: The renderer instance.
tokens: List of tokens to be rendered.
idx: Index of the current token.
options: Rendering options.
env: Environment sandbox for plugins.
Returns:
The rendered token as a string.
"""
tokens[idx].attrSet("class", "uk-paragraph")
# pass token to default renderer.
"""Custom renderer for Markdown paragraphs with MonsterUI styling."""
tokens[idx].attrSet("class", "leading-7 [&:not(:first-child)]:mt-6")
return self.renderToken(tokens, idx, options, env)
def render_blockquote(
self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict
) -> str:
"""Custom renderer for Markdown blockquotes.
Adds specific CSS classes.
Parameters:
self: The renderer instance.
tokens: List of tokens to be rendered.
idx: Index of the current token.
options: Rendering options.
env: Environment sandbox for plugins.
Returns:
The rendered token as a string.
"""
tokens[idx].attrSet("class", "uk-blockquote")
# pass token to default renderer.
"""Custom renderer for Markdown blockquotes with MonsterUI styling."""
tokens[idx].attrSet("class", "mt-6 border-l-2 pl-6 italic border-primary")
return self.renderToken(tokens, idx, options, env)
def render_list(self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict) -> str:
"""Custom renderer for lists with MonsterUI styling."""
tokens[idx].attrSet("class", "my-6 ml-6 list-disc [&>li]:mt-2")
return self.renderToken(tokens, idx, options, env)
def render_image(
self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict
) -> str:
"""Custom renderer for Markdown images with MonsterUI styling."""
token = tokens[idx]
src = token.attrGet("src")
alt = token.content or ""
if src:
pos = src.find(ASSETS_PREFIX)
if pos != -1:
asset_rel = src[pos + len(ASSETS_PREFIX) :]
fs_path = ASSETS_DIR / asset_rel
if fs_path.exists():
data_uri = file_to_data_uri(fs_path)
token.attrSet("src", data_uri)
# MonsterUI/shadcn styling for images
token.attrSet("class", "rounded-lg border my-6 max-w-full h-auto")
return self.renderToken(tokens, idx, options, env)
def render_link(self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict) -> str:
"""Custom renderer for Markdown links.
Adds the target attribute to open links in a new tab.
Parameters:
self: The renderer instance.
tokens: List of tokens to be rendered.
idx: Index of the current token.
options: Rendering options.
env: Environment sandbox for plugins.
Returns:
The rendered token as a string.
"""
tokens[idx].attrSet("class", "uk-link")
tokens[idx].attrSet("target", "_blank")
# pass token to default renderer.
"""Custom renderer for Markdown links with MonsterUI styling."""
token = tokens[idx]
href = token.attrGet("href")
if href:
pos = href.find(ASSETS_PREFIX)
if pos != -1:
asset_rel = href[pos + len(ASSETS_PREFIX) :]
key = asset_rel.rsplit(".", 1)[0]
if key in env:
return str(env[key])
# MonsterUI link styling
token.attrSet(
"class", "font-medium text-primary underline underline-offset-4 hover:text-primary/80"
)
token.attrSet("target", "_blank")
return self.renderToken(tokens, idx, options, env)
# Register all renderers
markdown = MarkdownIt("gfm-like")
markdown.add_render_rule("heading_open", render_heading)
markdown.add_render_rule("paragraph_open", render_paragraph)
markdown.add_render_rule("blockquote_open", render_blockquote)
markdown.add_render_rule("link_open", render_link)
markdown.add_render_rule("image", render_image)
markdown.add_render_rule("bullet_list_open", render_list)
markdown.add_render_rule("ordered_list_open", render_list)
markdown_cls = "bg-background text-lg ring-offset-background placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
# Updated wrapper class to match shadcn/ui theme
markdown_cls = "text-foreground space-y-4"
# markdown_cls = "bg-background text-lg ring-offset-background placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
def Markdown(*c: Any, cls: Optional[Union[str, tuple]] = None, **kwargs: Any) -> FT:

View File


@@ -29,6 +29,7 @@ from akkudoktoreos.core.emplan import (
from akkudoktoreos.optimization.optimization import OptimizationSolution
from akkudoktoreos.server.dash.bokeh import Bokeh, bokey_apply_theme_to_plot
from akkudoktoreos.server.dash.components import Error
from akkudoktoreos.server.dash.context import request_url_for
from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime
# bar width for 1 hour bars (time given in milliseconds)
@@ -385,7 +386,7 @@ def SolutionCard(solution: OptimizationSolution, config: SettingsEOS, data: Opti
name=f"{renderer}-visible",
value="true",
checked=solution_visible[renderer],
hx_post="/eosdash/plan",
hx_post=request_url_for("/eosdash/plan"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='js:{ "category": "solution", "action": "visible", "renderer": '
@@ -412,7 +413,7 @@ def SolutionCard(solution: OptimizationSolution, config: SettingsEOS, data: Opti
name=f"{renderer}-visible",
value="true",
checked=solution_visible[renderer],
hx_post="/eosdash/plan",
hx_post=request_url_for("/eosdash/plan"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='js:{ "category": "solution", "action": "visible", "renderer": '
@@ -439,7 +440,7 @@ def SolutionCard(solution: OptimizationSolution, config: SettingsEOS, data: Opti
name=f"{renderer}-visible",
value="true",
checked=solution_visible[renderer],
hx_post="/eosdash/plan",
hx_post=request_url_for("/eosdash/plan"),
hx_target="#page-content",
hx_swap="innerHTML",
hx_vals='js:{ "category": "solution", "action": "visible", "renderer": '
@@ -595,7 +596,7 @@ def Plan(eos_host: str, eos_port: Union[str, int], data: Optional[dict] = None)
result.raise_for_status()
except requests.exceptions.HTTPError as err:
detail = result.json()["detail"]
return Error(f"Can not retrieve configuration from {server}: {err}, {detail}")
return Error(f"Can not retrieve configuration from {server}: {err},\n{detail}")
eosstatus.eos_config = SettingsEOS(**result.json())
# Get the optimization solution
@@ -607,7 +608,7 @@ def Plan(eos_host: str, eos_port: Union[str, int], data: Optional[dict] = None)
solution_json = result.json()
except requests.exceptions.HTTPError as e:
detail = result.json()["detail"]
warning_msg = f"Can not retrieve optimization solution from {server}: {e}, {detail}"
warning_msg = f"Can not retrieve optimization solution from {server}: {e},\n{detail}"
logger.warning(warning_msg)
return Error(warning_msg)
except Exception as e:
@@ -623,7 +624,7 @@ def Plan(eos_host: str, eos_port: Union[str, int], data: Optional[dict] = None)
plan_json = result.json()
except requests.exceptions.HTTPError as e:
detail = result.json()["detail"]
warning_msg = f"Can not retrieve plan from {server}: {e}, {detail}"
warning_msg = f"Can not retrieve plan from {server}: {e},\n{detail}"
logger.warning(warning_msg)
return Error(warning_msg)
except Exception as e:

View File


@@ -9,7 +9,6 @@ import subprocess
import sys
import traceback
from contextlib import asynccontextmanager
from pathlib import Path
from typing import Annotated, Any, AsyncGenerator, Dict, List, Optional, Union
import psutil
@@ -33,7 +32,7 @@ from akkudoktoreos.core.emplan import EnergyManagementPlan, ResourceStatus
from akkudoktoreos.core.ems import get_ems
from akkudoktoreos.core.emsettings import EnergyManagementMode
from akkudoktoreos.core.logabc import LOGGING_LEVELS
from akkudoktoreos.core.logging import read_file_log, track_logging_config
from akkudoktoreos.core.logging import logging_track_config, read_file_log
from akkudoktoreos.core.pydantic import (
PydanticBaseModel,
PydanticDateTimeData,
@@ -54,11 +53,13 @@ from akkudoktoreos.prediction.loadakkudoktor import LoadAkkudoktorCommonSettings
from akkudoktoreos.prediction.prediction import get_prediction
from akkudoktoreos.prediction.pvforecast import PVForecastCommonSettings
from akkudoktoreos.server.rest.error import create_error_page
from akkudoktoreos.server.rest.starteosdash import run_eosdash_supervisor
from akkudoktoreos.server.rest.tasks import repeat_every
from akkudoktoreos.server.server import (
drop_root_privileges,
fix_data_directories_permissions,
get_default_host,
get_host_ip,
validate_ip_or_hostname,
wait_for_port_free,
)
from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration
@@ -70,15 +71,18 @@ prediction_eos = get_prediction()
ems_eos = get_ems()
resource_registry_eos = get_resource_registry()
# ------------------------------------
# Logging configuration at import time
# ------------------------------------
logger.remove()
track_logging_config(config_eos, "logging", None, None)
config_eos.track_nested_value("/logging", track_logging_config)
logging_track_config(config_eos, "logging", None, None)
# -----------------------------
# Configuration change tracking
# -----------------------------
config_eos.track_nested_value("/logging", logging_track_config)
# ----------------------------
# Safe argparse at import time
@@ -114,6 +118,11 @@ parser.add_argument(
default=None,
help="Enable or disable automatic EOSdash startup. Options: True or False (default: value from config)",
)
parser.add_argument(
"--run_as_user",
type=str,
help="The unprivileged user account the EOS server shall switch to after performing root-level startup tasks.",
)
# Command line arguments
args: argparse.Namespace
@@ -137,7 +146,7 @@ if args and args.log_level is not None:
# Ensure log_level from command line is in config settings
if log_level in LOGGING_LEVELS:
# Setup console logging level using nested value
# - triggers logging configuration by track_logging_config
# - triggers logging configuration by logging_track_config
config_eos.set_nested_value("logging/console_level", log_level)
logger.debug(f"logging/console_level configuration set by argument to {log_level}")
@@ -188,105 +197,6 @@ if config_eos.server.startup_eosdash:
config_eos.set_nested_value("server/eosdash_port", port + 1)
# ----------------------
# EOSdash server startup
# ----------------------
def start_eosdash(
host: str,
port: int,
eos_host: str,
eos_port: int,
log_level: str,
access_log: bool,
reload: bool,
eos_dir: str,
eos_config_dir: str,
) -> subprocess.Popen:
"""Start the EOSdash server as a subprocess.
This function starts the EOSdash server by launching it as a subprocess. It checks if the server
is already running on the specified port and either returns the existing process or starts a new
one.
Args:
host (str): The hostname for the EOSdash server.
port (int): The port for the EOSdash server.
eos_host (str): The hostname for the EOS server.
eos_port (int): The port for the EOS server.
log_level (str): The logging level for the EOSdash server.
access_log (bool): Flag to enable or disable access logging.
reload (bool): Flag to enable or disable auto-reloading.
eos_dir (str): Path to the EOS data directory.
eos_config_dir (str): Path to the EOS configuration directory.
Returns:
subprocess.Popen: The process of the EOSdash server.
Raises:
RuntimeError: If the EOSdash server fails to start.
"""
try:
validate_ip_or_hostname(host)
validate_ip_or_hostname(eos_host)
except Exception as ex:
error_msg = f"Could not start EOSdash: {ex}"
logger.error(error_msg)
raise RuntimeError(error_msg)
eosdash_path = Path(__file__).parent.resolve().joinpath("eosdash.py")
# Do a one time check for port free to generate warnings if not so
wait_for_port_free(port, timeout=0, waiting_app_name="EOSdash")
cmd = [
sys.executable,
"-m",
"akkudoktoreos.server.eosdash",
"--host",
str(host),
"--port",
str(port),
"--eos-host",
str(eos_host),
"--eos-port",
str(eos_port),
"--log_level",
log_level,
"--access_log",
str(access_log),
"--reload",
str(reload),
]
# Set environment before any subprocess run, to keep custom config dir
env = os.environ.copy()
env["EOS_DIR"] = eos_dir
env["EOS_CONFIG_DIR"] = eos_config_dir
try:
server_process = subprocess.Popen( # noqa: S603
cmd,
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
start_new_session=True,
)
logger.info(f"Started EOSdash with '{cmd}'.")
except subprocess.CalledProcessError as ex:
error_msg = f"Could not start EOSdash: {ex}"
logger.error(error_msg)
raise RuntimeError(error_msg)
# Check EOSdash is still running
if server_process.poll() is not None:
error_msg = f"EOSdash finished immediatedly with code: {server_process.returncode}"
logger.error(error_msg)
raise RuntimeError(error_msg)
return server_process
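The removed `start_eosdash` launched the dashboard once; the new `run_eosdash_supervisor` keeps it monitored and restarts it. The launch-and-verify pattern it builds on can be sketched with the stdlib alone (the command here is a harmless stand-in for the EOSdash module invocation):

```python
import subprocess
import sys

def launch_child(cmd: list[str]) -> subprocess.Popen:
    # Start the child in its own session so it is detached from the parent's
    # process group, then verify it did not exit immediately.
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        start_new_session=True,
    )
    if proc.poll() is not None:
        raise RuntimeError(f"child exited immediately with code {proc.returncode}")
    return proc

# Hypothetical stand-in for the EOSdash subprocess.
child = launch_child([sys.executable, "-c", "import time; time.sleep(0.2)"])
child.wait()
```

A supervisor then loops on `poll()` and relaunches the child when it terminates.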
# ----------------------
# EOS REST Server
# ----------------------
@@ -389,41 +299,7 @@ async def server_shutdown_task() -> None:
async def lifespan(app: FastAPI) -> AsyncGenerator[None, None]:
"""Lifespan manager for the app."""
# On startup
if config_eos.server.startup_eosdash:
try:
if (
config_eos.server.eosdash_host is None
or config_eos.server.eosdash_port is None
or config_eos.server.host is None
or config_eos.server.port is None
):
raise ValueError(
f"Invalid configuration for EOSdash server startup.\n"
f"- server/startup_eosdash: {config_eos.server.startup_eosdash}\n"
f"- server/eosdash_host: {config_eos.server.eosdash_host}\n"
f"- server/eosdash_port: {config_eos.server.eosdash_port}\n"
f"- server/host: {config_eos.server.host}\n"
f"- server/port: {config_eos.server.port}"
)
log_level = (
config_eos.logging.console_level if config_eos.logging.console_level else "info"
)
eosdash_process = start_eosdash(
host=str(config_eos.server.eosdash_host),
port=config_eos.server.eosdash_port,
eos_host=str(config_eos.server.host),
eos_port=config_eos.server.port,
log_level=log_level,
access_log=True,
reload=False,
eos_dir=str(config_eos.general.data_folder_path),
eos_config_dir=str(config_eos.general.config_folder_path),
)
except Exception as e:
logger.error(f"Failed to start EOSdash server. Error: {e}")
sys.exit(1)
asyncio.create_task(run_eosdash_supervisor())
load_eos_state()
@@ -606,7 +482,7 @@ async def fastapi_admin_server_shutdown_post() -> dict:
}
@app.get("/v1/health")
@app.get("/v1/health", tags=["health"])
def fastapi_health_get(): # type: ignore
"""Health check endpoint to verify that the EOS server is alive."""
return JSONResponse(
@@ -1190,7 +1066,7 @@ def fastapi_energy_management_optimization_solution_get() -> OptimizationSolutio
if solution is None:
raise HTTPException(
status_code=404,
detail="Can not get the optimization solution. Did you configure automatic optimization?",
detail="Can not get the optimization solution.\nDid you configure automatic optimization?",
)
return solution
@@ -1202,7 +1078,7 @@ def fastapi_energy_management_plan_get() -> EnergyManagementPlan:
if plan is None:
raise HTTPException(
status_code=404,
detail="Can not get the energy management plan. Did you configure automatic optimization?",
detail="Can not get the energy management plan.\nDid you configure automatic optimization?",
)
return plan
@@ -1256,7 +1132,7 @@ async def fastapi_strompreis() -> list[float]:
except Exception as e:
raise HTTPException(
status_code=404,
detail=f"Can not get the electricity price forecast: {e}. Did you configure the electricity price forecast provider?",
detail=f"Can not get the electricity price forecast: {e}.\nDid you configure the electricity price forecast provider?",
)
return elecprice
@@ -1360,7 +1236,7 @@ async def fastapi_gesamtlast(request: GesamtlastRequest) -> list[float]:
except Exception as e:
raise HTTPException(
status_code=404,
detail=f"Can not get the total load forecast: {e}. Did you configure the load forecast provider?",
detail=f"Can not get the total load forecast: {e}.\nDid you configure the load forecast provider?",
)
return prediction_list
@@ -1421,7 +1297,7 @@ async def fastapi_gesamtlast_simple(year_energy: float) -> list[float]:
except Exception as e:
raise HTTPException(
status_code=404,
detail=f"Can not get the total load forecast: {e}. Did you configure the load forecast provider?",
detail=f"Can not get the total load forecast: {e}.\nDid you configure the load forecast provider?",
)
return prediction_list
@@ -1616,6 +1492,17 @@ def run_eos() -> None:
Returns:
None
"""
if args:
run_as_user = args.run_as_user
else:
run_as_user = None
# Switch data directories ownership to user
fix_data_directories_permissions(run_as_user=run_as_user)
# Switch privileges to run_as_user
drop_root_privileges(run_as_user=run_as_user)
# Wait for EOS port to be free - e.g. in case of restart
wait_for_port_free(port, timeout=120, waiting_app_name="EOS")
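Home Assistant starts add-ons as root, so the startup sequence above first fixes data directory ownership and then drops privileges before binding the server. A minimal sketch of such a drop on Unix (`drop_privileges` is a stand-in name for the actual `drop_root_privileges` helper; this is not the project's implementation):

```python
import os
import pwd

def drop_privileges(run_as_user):
    # Only meaningful when the process starts as root, as Home Assistant
    # starts add-ons; otherwise the call is a no-op.
    if run_as_user is None or os.geteuid() != 0:
        return False
    info = pwd.getpwnam(run_as_user)
    os.setgroups([])        # drop supplementary groups first
    os.setgid(info.pw_gid)  # group before user, or setgid would fail
    os.setuid(info.pw_uid)
    return True
```

The ordering matters: once `setuid` has succeeded, the process can no longer change its group IDs.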
@@ -1628,6 +1515,8 @@ def run_eos() -> None:
log_level="info", # Fix log level for uvicorn to info
access_log=True, # Fix server access logging to True
reload=reload,
proxy_headers=True,
forwarded_allow_ips="*",
)
except Exception as e:
logger.exception("Failed to start uvicorn server.")

View File


@@ -6,25 +6,37 @@ from pathlib import Path
import psutil
import uvicorn
from fasthtml.common import FileResponse, JSONResponse
from fasthtml.common import Base, FileResponse, JSONResponse
from loguru import logger
from monsterui.core import FastHTML, Theme
from starlette.middleware import Middleware
from starlette.requests import Request
from akkudoktoreos.config.config import get_config
from akkudoktoreos.core.logabc import LOGGING_LEVELS
from akkudoktoreos.core.logging import track_logging_config
from akkudoktoreos.core.logging import logging_track_config
from akkudoktoreos.core.version import __version__
from akkudoktoreos.server.dash.about import About
# Pages
from akkudoktoreos.server.dash.about import About
from akkudoktoreos.server.dash.admin import Admin
# helpers
from akkudoktoreos.server.dash.bokeh import BokehJS
from akkudoktoreos.server.dash.components import Page
from akkudoktoreos.server.dash.configuration import ConfigKeyUpdate, Configuration
from akkudoktoreos.server.dash.configuration import Configuration
from akkudoktoreos.server.dash.context import (
IngressMiddleware,
safe_asset_path,
)
from akkudoktoreos.server.dash.footer import Footer
from akkudoktoreos.server.dash.plan import Plan
from akkudoktoreos.server.dash.prediction import Prediction
from akkudoktoreos.server.server import get_default_host, wait_for_port_free
from akkudoktoreos.server.server import (
drop_root_privileges,
get_default_host,
wait_for_port_free,
)
from akkudoktoreos.utils.stringutil import str2bool
config_eos = get_config()
@@ -35,8 +47,8 @@ config_eos = get_config()
# ------------------------------------
logger.remove()
track_logging_config(config_eos, "logging", None, None)
config_eos.track_nested_value("/logging", track_logging_config)
logging_track_config(config_eos, "logging", None, None)
config_eos.track_nested_value("/logging", logging_track_config)
# ----------------------------
@@ -83,6 +95,12 @@ parser.add_argument(
default=False,
help="Enable or disable auto-reload. Useful for development. Options: True or False (default: False)",
)
parser.add_argument(
"--run_as_user",
type=str,
help="The unprivileged user account the EOSdash server shall run if started in root-level.",
)
# Command line arguments
args: argparse.Namespace
@@ -110,7 +128,7 @@ else:
# Ensure log_level from command line is in config settings
if config_eosdash["log_level"] in LOGGING_LEVELS:
# Setup console logging level using nested value
# - triggers logging configuration by track_logging_config
# - triggers logging configuration by logging_track_config
config_eos.set_nested_value("logging/console_level", config_eosdash["log_level"])
logger.debug(
f"logging/console_level configuration set by argument to {config_eosdash['log_level']}"
@@ -180,9 +198,11 @@ hdrs = (
# The EOSdash application
app: FastHTML = FastHTML(
title="EOSdash",
hdrs=hdrs,
secret_key=os.getenv("EOS_SERVER__EOSDASH_SESSKEY"),
title="EOSdash", # Default page title
hdrs=hdrs, # Additional FT elements to add to <HEAD>
# htmx=True, # Include HTMX header?
middleware=[Middleware(IngressMiddleware)],
secret_key=os.getenv("EOS_SERVER__EOSDASH_SESSKEY"), # Signing key for sessions
)
@@ -199,37 +219,60 @@ def eos_server() -> tuple[str, int]:
return config_eosdash["eos_host"], config_eosdash["eos_port"]
# -------------------------------------------------------------------
# Routes
# -------------------------------------------------------------------
@app.get("/favicon.ico")
def get_eosdash_favicon(): # type: ignore
"""Get favicon."""
def get_eosdash_favicon(request: Request): # type: ignore
"""Get the EOSdash favicon.
Args:
request (Request): The incoming FastHTML request.
Returns:
FileResponse: The favicon file.
"""
return FileResponse(path=favicon_filepath)
@app.get("/")
def get_eosdash(): # type: ignore
"""Serves the main EOSdash page.
def get_eosdash(request: Request): # type: ignore
"""Serve the main EOSdash page with navigation links.
Args:
request (Request): The incoming FastHTML request.
Returns:
Page: The main dashboard page with navigation links and footer.
"""
return Page(
None,
{
"Plan": "/eosdash/plan",
"Prediction": "/eosdash/prediction",
"Config": "/eosdash/configuration",
"Admin": "/eosdash/admin",
"About": "/eosdash/about",
},
About(),
Footer(*eos_server()),
"/eosdash/footer",
root_path: str = request.scope.get("root_path", "")
return (
Base(href=f"{root_path}/") if root_path else None,
Page(
None,
{
"Plan": "/eosdash/plan",
"Prediction": "/eosdash/prediction",
"Config": "/eosdash/configuration",
"Admin": "/eosdash/admin",
"About": "/eosdash/about",
},
About(),
Footer(*eos_server()),
"/eosdash/footer",
),
)
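The `Base(href=...)` element is what makes the page's relative links resolve below the ingress prefix the middleware stored in `root_path`. The href computation can be sketched as (`ingress_base_href` is a name invented for this sketch):

```python
def ingress_base_href(scope: dict):
    # Under ingress the middleware has set root_path; a trailing slash is
    # required so relative URLs resolve inside the prefix. Without ingress,
    # no <base> element is emitted at all.
    root_path = scope.get("root_path", "")
    return f"{root_path}/" if root_path else None
```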
@app.get("/eosdash/footer")
def get_eosdash_footer(): # type: ignore
"""Serves the EOSdash Foooter information.
def get_eosdash_footer(request: Request): # type: ignore
"""Serve the EOSdash Footer information.
Args:
request (Request): The incoming FastHTML request.
Returns:
Footer: The Footer component.
@@ -238,8 +281,11 @@ def get_eosdash_footer(): # type: ignore
@app.get("/eosdash/about")
def get_eosdash_about(): # type: ignore
"""Serves the EOSdash About page.
def get_eosdash_about(request: Request): # type: ignore
"""Serve the EOSdash About page.
Args:
request (Request): The incoming FastHTML request.
Returns:
About: The About page component.
@@ -248,8 +294,11 @@ def get_eosdash_about(): # type: ignore
@app.get("/eosdash/admin")
def get_eosdash_admin(): # type: ignore
"""Serves the EOSdash Admin page.
def get_eosdash_admin(request: Request): # type: ignore
"""Serve the EOSdash Admin page.
Args:
request (Request): The incoming FastHTML request.
Returns:
Admin: The Admin page component.
@@ -258,10 +307,12 @@ def get_eosdash_admin(): # type: ignore
@app.post("/eosdash/admin")
def post_eosdash_admin(data: dict): # type: ignore
def post_eosdash_admin(request: Request, data: dict): # type: ignore
"""Provide control data to the Admin page.
This endpoint is called from within the Admin page on user actions.
Args:
request (Request): The incoming FastHTML request.
data (dict): User-submitted data from the Admin page.
Returns:
Admin: The Admin page component.
@@ -270,8 +321,11 @@ def post_eosdash_admin(data: dict): # type: ignore
@app.get("/eosdash/configuration")
def get_eosdash_configuration(): # type: ignore
"""Serves the EOSdash Configuration page.
def get_eosdash_configuration(request: Request): # type: ignore
"""Serve the EOSdash Configuration page.
Args:
request (Request): The incoming FastHTML request.
Returns:
Configuration: The Configuration page component.
@@ -280,13 +334,40 @@ def get_eosdash_configuration(): # type: ignore
@app.put("/eosdash/configuration")
def put_eosdash_configuration(data: dict): # type: ignore
return ConfigKeyUpdate(*eos_server(), data["key"], data["value"])
def put_eosdash_configuration(request: Request, data: dict): # type: ignore
"""Update a configuration key/value pair.
Args:
request (Request): The incoming FastHTML request.
data (dict): Dictionary containing 'key' and 'value' to trigger configuration update.
Returns:
Configuration: The Configuration page component with updated configuration.
"""
return Configuration(*eos_server(), data)
@app.post("/eosdash/configuration")
def post_eosdash_configuration(request: Request, data: dict): # type: ignore
"""Provide control data to the configuration page.
Args:
request (Request): The incoming FastHTML request.
data (dict): User-submitted data from the configuration page.
Returns:
Configuration: The Configuration page component with updated configuration.
"""
return Configuration(*eos_server(), data)
@app.get("/eosdash/plan")
def get_eosdash_plan(data: dict): # type: ignore
"""Serves the EOSdash Plan page.
def get_eosdash_plan(request: Request, data: dict): # type: ignore
"""Serve the EOSdash Plan page.
Args:
request (Request): The incoming FastHTML request.
data (dict): Optional query data.
Returns:
Plan: The Plan page component.
@@ -295,10 +376,12 @@ def get_eosdash_plan(data: dict): # type: ignore
@app.post("/eosdash/plan")
def post_eosdash_plan(data: dict): # type: ignore
def post_eosdash_plan(request: Request, data: dict): # type: ignore
"""Provide control data to the Plan page.
This endpoint is called from within the Plan page on user actions.
Args:
request (Request): The incoming FastHTML request.
data (dict): User-submitted data from the Plan page.
Returns:
Plan: The Plan page component.
@@ -307,8 +390,12 @@ def post_eosdash_plan(data: dict): # type: ignore
@app.get("/eosdash/prediction")
def get_eosdash_prediction(data: dict): # type: ignore
"""Serves the EOSdash Prediction page.
def get_eosdash_prediction(request: Request, data: dict): # type: ignore
"""Serve the EOSdash Prediction page.
Args:
request (Request): The incoming FastHTML request.
data (dict): Optional query data.
Returns:
Prediction: The Prediction page component.
@@ -317,8 +404,15 @@ def get_eosdash_prediction(data: dict): # type: ignore
@app.get("/eosdash/health")
def get_eosdash_health(): # type: ignore
"""Health check endpoint to verify that the EOSdash server is alive."""
def get_eosdash_health(request: Request): # type: ignore
"""Health check endpoint to verify the EOSdash server is alive.
Args:
request (Request): The incoming FastHTML request.
Returns:
JSONResponse: Server status including PID and version.
"""
return JSONResponse(
{
"status": "alive",
@@ -328,13 +422,37 @@ def get_eosdash_health(): # type: ignore
)
@app.get("/eosdash/assets/{fname:path}.{ext:static}")
def get_eosdash_assets(fname: str, ext: str): # type: ignore
"""Get assets."""
asset_filepath = Path(__file__).parent.joinpath(f"dash/assets/{fname}.{ext}")
@app.get("/eosdash/assets/{filepath:path}")
def get_eosdash_assets(request: Request, filepath: str): # type: ignore
"""Serve static assets for EOSdash safely.
Args:
request (Request): The incoming FastHTML request.
filepath (str): Relative path of the asset under dash/assets/.
Returns:
FileResponse: The requested asset file if it exists.
Raises:
404: If the file does not exist.
403: If the file path is forbidden (directory traversal attempt).
"""
try:
asset_filepath = safe_asset_path(filepath)
except ValueError:
return {"error": "Forbidden"}, 403
if not asset_filepath.exists() or not asset_filepath.is_file():
return {"error": "File not found"}, 404
return FileResponse(path=asset_filepath)
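`safe_asset_path` itself is not part of this hunk; a minimal sketch of the traversal check it presumably performs (the base directory constant and exact error behavior are assumptions, not the real helper):

```python
from pathlib import Path

# Assumed base directory for EOSdash static assets
ASSETS_DIR = Path("dash/assets").resolve()

def safe_asset_path(filepath: str) -> Path:
    """Resolve filepath under ASSETS_DIR, rejecting directory traversal.

    Raises ValueError if the resolved path escapes the assets directory,
    which the route above maps to a 403 response.
    """
    candidate = (ASSETS_DIR / filepath).resolve()
    # is_relative_to (Python 3.9+) rejects '..' traversal and absolute paths
    if not candidate.is_relative_to(ASSETS_DIR):
        raise ValueError(f"Forbidden path: {filepath}")
    return candidate

print(safe_asset_path("style.css").name)  # style.css
```

The real helper may additionally guard against symlinks inside the assets tree.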
# ----------------------
# Run the EOSdash server
# ----------------------
def run_eosdash() -> None:
"""Run the EOSdash server with the specified configurations.
@@ -348,6 +466,14 @@ def run_eosdash() -> None:
Returns:
None
"""
if args:
run_as_user = args.run_as_user
else:
run_as_user = None
# Drop root privileges if running as root
drop_root_privileges(run_as_user=run_as_user)
# Wait for EOSdash port to be free - e.g. in case of restart
wait_for_port_free(config_eosdash["eosdash_port"], timeout=120, waiting_app_name="EOSdash")
@@ -359,6 +485,8 @@ def run_eosdash() -> None:
log_level=config_eosdash["log_level"].lower(),
access_log=config_eosdash["access_log"],
reload=config_eosdash["reload"],
proxy_headers=True,
forwarded_allow_ips="*",
)
except Exception as e:
logger.error(

View File

@@ -0,0 +1,269 @@
import asyncio
import os
import re
import sys
from pathlib import Path
from loguru import logger
from akkudoktoreos.config.config import get_config
from akkudoktoreos.server.server import (
validate_ip_or_hostname,
wait_for_port_free,
)
config_eos = get_config()
# Loguru to HA stdout
logger.add(sys.stdout, format="{time} | {level} | {message}", enqueue=True)
LOG_PATTERN = re.compile(
r"""
(?:(?P<timestamp>^\S+\s+\S+)\s*\|\s*)? # Optional timestamp
(?P<level>TRACE|DEBUG|INFO|WARNING|ERROR|CRITICAL)\s*\|\s* # Log level
(?:
(?P<file_path>[A-Za-z0-9_\-./]+) # Full file path or filename
:
(?P<line>\d+) # Line number
\s*\|\s*
)?
(?:(?P<function>[A-Za-z0-9_<>-]+)\s*\|\s*)? # Optional function name
(?P<msg>.*) # Message
""",
re.VERBOSE,
)
async def forward_stream(stream: asyncio.StreamReader, prefix: str = "") -> None:
"""Continuously read log lines from a subprocess and re-log them via Loguru.
The function reads lines from an ``asyncio.StreamReader`` originating from a
subprocess (typically the subprocess's stdout or stderr), parses the log
metadata if present (log level, file path, line number, function), and
forwards the log entry to Loguru. If the line cannot be parsed, it is logged
as an ``INFO`` message with generic metadata.
Args:
stream (asyncio.StreamReader):
An asynchronous stream to read from, usually ``proc.stdout`` or
``proc.stderr`` from ``asyncio.create_subprocess_exec``.
prefix (str, optional):
A string prefix added to each forwarded log line. Useful for
distinguishing between multiple subprocess sources.
Defaults to an empty string.
Notes:
- If the subprocess log line includes a file path (e.g.,
``/app/server/main.py:42``), both ``file.name`` and ``file.path`` will
be set accordingly in the forwarded Loguru log entry.
- If metadata cannot be extracted, fallback values
(``subprocess.py`` and ``/subprocess/subprocess.py``) are used.
- The function runs until ``stream`` reaches EOF.
"""
while True:
line = await stream.readline()
if not line:
break # End of stream
raw = line.decode(errors="replace").rstrip()
match = LOG_PATTERN.search(raw)
if match:
data = match.groupdict()
level = data["level"] or "INFO"
message = data["msg"]
# ---- Extract file path and name ----
file_path = data["file_path"]
if file_path:
if "/" in file_path:
file_name = file_path.rsplit("/", 1)[1]
else:
file_name = file_path
else:
file_name = "subprocess.py"
file_path = f"/subprocess/{file_name}"
# ---- Extract function and line ----
func_name = data["function"] or "<subprocess>"
line_no = int(data["line"]) if data["line"] else 1
# ---- Patch logger with realistic metadata ----
patched = logger.patch(
lambda r: r.update(
{
"file": {
"name": file_name,
"path": file_path,
},
"line": line_no,
"function": func_name,
"name": "EOSdash",
}
)
)
patched.log(level, f"{prefix}{message}")
else:
# Fallback: unstructured log line
file_name = "subprocess.py"
file_path = f"/subprocess/{file_name}"
logger.patch(
lambda r: r.update(
{
"file": {
"name": file_name,
"path": file_path,
},
"line": 1,
"function": "<subprocess>",
"name": "EOSdash",
}
)
).info(f"{prefix}{raw}")
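To see what `LOG_PATTERN` extracts from a structured subprocess line, here is a small standalone demo. The sample log line is invented; the pattern is the one above, rewritten without `re.VERBOSE` but otherwise equivalent:

```python
import re

LOG_PATTERN = re.compile(
    r"(?:(?P<timestamp>^\S+\s+\S+)\s*\|\s*)?"
    r"(?P<level>TRACE|DEBUG|INFO|WARNING|ERROR|CRITICAL)\s*\|\s*"
    r"(?:(?P<file_path>[A-Za-z0-9_\-./]+):(?P<line>\d+)\s*\|\s*)?"
    r"(?:(?P<function>[A-Za-z0-9_<>-]+)\s*\|\s*)?"
    r"(?P<msg>.*)"
)

# A structured line as the EOSdash subprocess might emit it (sample values)
line = "2025-01-01 12:00:00 | INFO | akkudoktoreos/server/eosdash.py:42 | run_eosdash | EOSdash started"
m = LOG_PATTERN.search(line)
assert m is not None
print(m.group("level"))      # INFO
print(m.group("file_path"))  # akkudoktoreos/server/eosdash.py
print(m.group("line"))       # 42
print(m.group("function"))   # run_eosdash
print(m.group("msg"))        # EOSdash started

# An unstructured line does not match and takes the INFO fallback branch
assert LOG_PATTERN.search("some raw stdout noise") is None
```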
async def run_eosdash_supervisor() -> None:
"""Start EOSdash, pipe its logs, and restart it if it crashes.
Runs forever.
"""
eosdash_path = Path(__file__).parent.resolve().joinpath("eosdash.py")
while True:
await asyncio.sleep(5)
if not config_eos.server.startup_eosdash:
continue
if (
config_eos.server.eosdash_host is None
or config_eos.server.eosdash_port is None
or config_eos.server.host is None
or config_eos.server.port is None
):
error_msg = (
f"Invalid configuration for EOSdash server startup.\n"
f"- server/eosdash_host: {config_eos.server.eosdash_host}\n"
f"- server/eosdash_port: {config_eos.server.eosdash_port}\n"
f"- server/host: {config_eos.server.host}\n"
f"- server/port: {config_eos.server.port}"
)
logger.error(error_msg)
continue
# Get all the parameters
host = str(config_eos.server.eosdash_host)
port = config_eos.server.eosdash_port
eos_host = str(config_eos.server.host)
eos_port = config_eos.server.port
access_log = True
reload = False
log_level = config_eos.logging.console_level if config_eos.logging.console_level else "info"
try:
validate_ip_or_hostname(host)
validate_ip_or_hostname(eos_host)
except Exception as ex:
error_msg = f"Could not start EOSdash: {ex}"
logger.error(error_msg)
continue
if eos_host != host:
# EOSdash runs on a different server - we cannot start it.
error_msg = (
f"EOSdash server startup not possible on different hosts.\n"
f"- server/eosdash_host: {config_eos.server.eosdash_host}\n"
f"- server/host: {config_eos.server.host}"
)
logger.error(error_msg)
continue
# Do a one time check for port free to generate warnings if not so
wait_for_port_free(port, timeout=0, waiting_app_name="EOSdash")
cmd = [
sys.executable,
"-m",
"akkudoktoreos.server.eosdash",
"--host",
str(host),
"--port",
str(port),
"--eos-host",
str(eos_host),
"--eos-port",
str(eos_port),
"--log_level",
log_level,
"--access_log",
str(access_log),
"--reload",
str(reload),
]
# Set environment before any subprocess run, to keep custom config dir
eos_dir = str(config_eos.package_root_path)
eos_data_dir = str(config_eos.general.data_folder_path)
eos_config_dir = str(config_eos.general.config_folder_path)
env = os.environ.copy()
env["EOS_DIR"] = eos_dir
env["EOS_DATA_DIR"] = eos_data_dir
env["EOS_CONFIG_DIR"] = eos_config_dir
logger.info("Starting EOSdash subprocess...")
# Start EOSdash server
try:
proc = await asyncio.create_subprocess_exec(
*cmd, env=env, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE
)
except FileNotFoundError:
logger.error(
f"Failed to start EOSdash: 'python' executable '{sys.executable}' not found."
)
continue
except PermissionError:
logger.error("Failed to start EOSdash: permission denied on 'eosdash.py'.")
continue
except asyncio.CancelledError:
logger.warning("EOSdash startup cancelled (shutdown?).")
return
except Exception as e:
logger.exception(f"Unexpected error launching EOSdash: {e}")
continue
if proc.stdout is None:
logger.error("Failed to forward EOSdash output to EOS pipe.")
else:
# Forward log
asyncio.create_task(forward_stream(proc.stdout, prefix="[EOSdash] "))
if proc.stderr is None:
logger.error("Failed to forward EOSdash error output to EOS pipe.")
else:
# Forward log
asyncio.create_task(forward_stream(proc.stderr, prefix="[EOSdash-ERR] "))
# If we reach here, the subprocess started successfully
logger.info("EOSdash subprocess started successfully.")
# Wait for exit
try:
exit_code = await proc.wait()
logger.error(f"EOSdash exited with code {exit_code}")
except asyncio.CancelledError:
logger.warning("EOSdash wait cancelled (shutdown?).")
return
except Exception as e:
logger.exception(f"Error while waiting for EOSdash to terminate: {e}")
# Restart after a delay
logger.info("Restarting EOSdash...")
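The supervisor calls `wait_for_port_free` before each start. Its definition is not shown in this diff; a dependency-free sketch of what such a helper might do, based only on its signature (poll until the port can be bound, or the timeout elapses):

```python
import socket
import time

def wait_for_port_free(port: int, timeout: int = 0, waiting_app_name: str = "App") -> bool:
    """Return True once TCP `port` can be bound, False if still busy after `timeout` seconds.

    A sketch only; the real helper in akkudoktoreos.server.server may differ
    (e.g. it may log warnings mentioning waiting_app_name).
    """
    deadline = time.monotonic() + timeout
    while True:
        try:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
                s.bind(("127.0.0.1", port))
            return True  # bind succeeded, port is free
        except OSError:
            if time.monotonic() >= deadline:
                return False  # still occupied after timeout
            time.sleep(0.5)
```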

View File

@@ -3,10 +3,10 @@
from __future__ import annotations
import asyncio
import logging
from functools import wraps
from typing import Any, Callable, Coroutine, Union
import loguru
from starlette.concurrency import run_in_threadpool
NoArgsNoReturnFuncT = Callable[[], None]
@@ -37,7 +37,7 @@ def repeat_every(
*,
seconds: float,
wait_first: float | None = None,
logger: logging.Logger | None = None,
logger: loguru.Logger | None = None,
raise_exceptions: bool = False,
max_repetitions: int | None = None,
on_complete: NoArgsNoReturnAnyFuncT | None = None,

View File

@@ -1,6 +1,9 @@
"""Server Module."""
import grp
import ipaddress
import os
import pwd
import re
import socket
import time
@@ -148,6 +151,179 @@ def wait_for_port_free(port: int, timeout: int = 0, waiting_app_name: str = "App
return True
def drop_root_privileges(run_as_user: Optional[str] = None) -> bool:
"""Drop root privileges and switch execution to a less privileged user.
This function transitions the running process from root (UID 0) to the
specified unprivileged user. It sets UID, GID, supplementary groups, and
updates environment variables to reflect the new user context.
If the process is not running as root, no privilege changes are made.
Args:
run_as_user (str | None):
The name of the target user to switch to.
If ``None`` (default), the current effective user is used and
no privilege change is attempted.
Returns:
bool:
``True`` if privileges were successfully dropped OR the process is
already running as the target user.
``False`` if privilege dropping failed.
Notes:
- This must be called very early during startup, before opening files,
creating sockets, or starting threads.
- Dropping privileges is irreversible within the same process.
- The target user must exist inside the container (valid entry in
``/etc/passwd`` and ``/etc/group``).
"""
# Determine current user
current_user = pwd.getpwuid(os.geteuid()).pw_name
# No action needed if already running as the desired user
if run_as_user is None or run_as_user == current_user:
return True
# Cannot switch users unless running as root
if os.geteuid() != 0:
logger.error(
f"Privilege switch requested to '{run_as_user}' "
f"but process is not root (running as '{current_user}')."
)
return False
# Resolve target user info
try:
pw_record = pwd.getpwnam(run_as_user)
except KeyError:
logger.error(f"Privilege switch failed: user '{run_as_user}' does not exist.")
return False
user_uid: int = pw_record.pw_uid
user_gid: int = pw_record.pw_gid
try:
# Get all groups where the user is listed as a member
supplementary_groups: list[int] = [
g.gr_gid for g in grp.getgrall() if run_as_user in g.gr_mem
]
# Ensure the primary group is included (it usually is NOT in gr_mem)
if user_gid not in supplementary_groups:
supplementary_groups.append(user_gid)
# Apply groups, gid, uid (in that order)
os.setgroups(supplementary_groups)
os.setgid(user_gid)
os.setuid(user_uid)
except Exception as e:
logger.error(f"Privilege switch failed: {e}")
return False
# Update environment variables to reflect the new user identity
os.environ["HOME"] = pw_record.pw_dir
os.environ["LOGNAME"] = run_as_user
os.environ["USER"] = run_as_user
# Restrictive umask
os.umask(0o077)
# Verify that privilege drop was successful
if os.geteuid() != user_uid or os.getegid() != user_gid:
logger.error(
f"Privilege drop sanity check failed: now uid={os.geteuid()}, gid={os.getegid()}, "
f"expected uid={user_uid}, gid={user_gid}"
)
return False
logger.info(
f"Switched privileges to user '{run_as_user}' "
f"(uid={user_uid}, gid={user_gid}, groups={supplementary_groups})"
)
return True
def fix_data_directories_permissions(run_as_user: Optional[str] = None) -> None:
"""Ensure correct ownership for data directories.
This function recursively updates the owner and group of the data directories and all of its
subdirectories and files so that they belong to the given user.
The function may require root privileges to change file ownership. It logs an error message
if a path's ownership cannot be updated.
Args:
run_as_user (Optional[str]): The user who should own the data directories and files.
Defaults to the current user.
"""
from akkudoktoreos.config.config import get_config
config_eos = get_config()
base_dirs = [
config_eos.general.data_folder_path,
config_eos.general.data_output_path,
config_eos.general.config_folder_path,
config_eos.cache.path(),
]
error_msg: Optional[str] = None
if run_as_user is None:
# Get current user - try to ensure current user can access the data directories
run_as_user = pwd.getpwuid(os.geteuid()).pw_name
try:
pw_record = pwd.getpwnam(run_as_user)
except KeyError:
error_msg = f"Data directories '{base_dirs}' permission fix failed: user '{run_as_user}' does not exist."
logger.error(error_msg)
return
uid = pw_record.pw_uid
gid = pw_record.pw_gid
# Walk directory tree and fix permissions
for base_dir in base_dirs:
if base_dir is None:
continue
# ensure base dir exists
try:
base_dir.mkdir(parents=True, exist_ok=True)
except Exception as e:
logger.error(f"Could not setup data dir '{base_dir}': {e}")
continue
for root, dirs, files in os.walk(base_dir):
for name in dirs + files:
path = os.path.join(root, name)
try:
os.chown(path, uid, gid)
except PermissionError as e:
error_msg = f"Permission denied while updating ownership of '{path}' to user '{run_as_user}'"
logger.error(error_msg)
except Exception as e:
error_msg = (
f"Failed to update ownership of '{path}' for user '{run_as_user}': {e}"
)
logger.error(error_msg)
# Also fix the base directory itself
try:
os.chown(base_dir, uid, gid)
except PermissionError:
error_msg = (
f"Permission denied while updating ownership of '{base_dir}' to user '{run_as_user}'"
)
logger.error(error_msg)
except Exception as e:
error_msg = f"Failed to update ownership of '{base_dir}' for user '{run_as_user}': {e}"
logger.error(error_msg)
if error_msg is None:
logger.info(f"Updated ownership of '{base_dirs}' recursively to user '{run_as_user}'.")
class ServerCommonSettings(SettingsBaseModel):
"""Server Configuration."""

View File

@@ -665,11 +665,14 @@ def to_time(
- int (e.g. 14 → 14:00)
- float (e.g. 14.5 → 14:30)
- tuple like (14,), (14, 30), (14, 30, 15)
in_timezone: Optional timezone name or object (e.g., "Europe/Berlin").
Defaults to the local timezone.
to_naive: If True, return a timezone-naive Time object.
as_string: If True, return time as "HH:mm:ss ZZ".
If a format string is provided, it's passed to `pendulum.Time.format()`.
If a format string is provided, it's passed to `pendulum.Time.format()`.
Returns:
Time or str: A time object or its formatted string.
@@ -1637,106 +1640,233 @@ def to_datetime(
return dt
# to duration helper
def duration_to_iso8601(duration: pendulum.Duration) -> str:
"""Convert pendulum.Duration to ISO-8601 duration string."""
total_seconds = int(duration.total_seconds())
days, rem = divmod(total_seconds, 86400)
hours, rem = divmod(rem, 3600)
minutes, seconds = divmod(rem, 60)
parts = ["P"]
if days:
parts.append(f"{days}D")
time_parts = []
if hours:
time_parts.append(f"{hours}H")
if minutes:
time_parts.append(f"{minutes}M")
if seconds:
time_parts.append(f"{seconds}S")
if time_parts:
parts.append("T")
parts.extend(time_parts)
elif len(parts) == 1: # zero duration
parts.append("T0S")
return "".join(parts)
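Since exercising the function above requires pendulum, here is the same algorithm run against `datetime.timedelta` (which also provides `total_seconds()`), so the behavior can be checked with the standard library alone:

```python
from datetime import timedelta

def duration_to_iso8601(duration: timedelta) -> str:
    """Same algorithm as above, with timedelta so the sketch is dependency-free."""
    total_seconds = int(duration.total_seconds())
    days, rem = divmod(total_seconds, 86400)
    hours, rem = divmod(rem, 3600)
    minutes, seconds = divmod(rem, 60)
    parts = ["P"]
    if days:
        parts.append(f"{days}D")
    time_parts = []
    if hours:
        time_parts.append(f"{hours}H")
    if minutes:
        time_parts.append(f"{minutes}M")
    if seconds:
        time_parts.append(f"{seconds}S")
    if time_parts:
        parts.append("T")
        parts.extend(time_parts)
    elif len(parts) == 1:  # zero duration
        parts.append("T0S")
    return "".join(parts)

print(duration_to_iso8601(timedelta(minutes=15)))       # PT15M
print(duration_to_iso8601(timedelta(days=1, hours=2)))  # P1DT2H
print(duration_to_iso8601(timedelta(0)))                # PT0S
```

Note that a pure-date duration like two days renders without the `T` separator (`P2D`), which the `time_parts` check handles.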
@overload
def to_duration(
input_value: Union[
Duration, datetime.timedelta, str, int, float, Tuple[int, int, int, int], List[int]
],
) -> Duration:
"""Converts various input types into a Duration object using pendulum.
as_string: Literal[False] | None = None,
) -> Duration: ...
@overload
def to_duration(
input_value: Union[
Duration, datetime.timedelta, str, int, float, Tuple[int, int, int, int], List[int]
],
as_string: str | Literal[True] = True,
) -> str: ...
def to_duration(
input_value: Union[
Duration, datetime.timedelta, str, int, float, Tuple[int, int, int, int], List[int]
],
as_string: Optional[Union[str, bool]] = None,
) -> Union[Duration, str]:
"""Converts various input types into a `pendulum.Duration` or a formatted duration string.
Args:
input_value (Union[Duration, timedelta, str, int, float, tuple, list]): Input to be converted
into a timedelta:
- str: A duration string like "2 days", "5 hours", "30 minutes", or a combination.
- int/float: Number representing seconds.
- tuple/list: A tuple or list in the format (days, hours, minutes, seconds).
input_value (Union[Duration, timedelta, str, int, float, tuple, list]):
The input value to convert into a duration.
Supported types include:
- `pendulum.Duration`: Returned unchanged unless formatting is requested.
- `datetime.timedelta`: Converted based on total seconds.
- `str`: A duration expression (e.g., `"15 minutes"`, `"2 hours"`),
or a string parsed by Pendulum.
- `int` or `float`: Interpreted as a number of seconds.
- `tuple` or `list`: Must be `(days, hours, minutes, seconds)`.
as_string (Optional[Union[str, bool]]):
Controls the output format of the returned duration:
- `None` or `False` (default):
Returns a `pendulum.Duration` object.
- `True`:
Returns an ISO-8601 duration string (e.g., `"PT15M"`).
- `"human"`:
Returns a human-readable form (e.g., `"15 minutes"`).
- `"pandas"`:
Returns a Pandas frequency string such as:
- `"1h"` for 1 hour
- `"15min"` for 15 minutes
- `"900s"` for 900 seconds
- `str`:
A custom format pattern. The following format tokens are supported:
- `{S}` → total seconds
- `{M}` → total minutes (integer)
- `{H}` → total hours (integer)
- `{f}` → human-friendly representation (Pendulum `in_words()`)
Example:
`"Duration: {M} minutes"` → `"Duration: 15 minutes"`
Returns:
duration: A Duration object corresponding to the input value.
Union[Duration, str]:
- A `pendulum.Duration` if no formatting is requested.
- A formatted string depending on the `as_string` option.
Raises:
ValueError: If the input format is not supported.
ValueError:
- If the input type is unsupported.
- If a duration string cannot be parsed.
- If `as_string` contains an unsupported format option.
Examples:
>>> to_duration("2 days 5 hours")
timedelta(days=2, seconds=18000)
>>> to_duration("15 minutes")
<Duration [900 seconds]>
>>> to_duration(3600)
timedelta(seconds=3600)
>>> to_duration("15 minutes", as_string=True)
'PT15M'
>>> to_duration((1, 2, 30, 15))
timedelta(days=1, seconds=90315)
>>> to_duration("15 minutes", as_string="human")
'15 minutes'
>>> to_duration("90 seconds", as_string="pandas")
'90s'
>>> to_duration("15 minutes", as_string="{M}m")
'15m'
"""
# ---- normalize to pendulum.Duration ----
duration = None
if isinstance(input_value, Duration):
return input_value
duration = input_value
if isinstance(input_value, datetime.timedelta):
return pendulum.duration(seconds=input_value.total_seconds())
elif isinstance(input_value, datetime.timedelta):
duration = pendulum.duration(seconds=input_value.total_seconds())
if isinstance(input_value, (int, float)):
# Handle integers or floats as seconds
return pendulum.duration(seconds=input_value)
elif isinstance(input_value, (int, float)):
duration = pendulum.duration(seconds=input_value)
elif isinstance(input_value, (tuple, list)):
# Handle tuple or list: (days, hours, minutes, seconds)
if len(input_value) == 4:
days, hours, minutes, seconds = input_value
return pendulum.duration(days=days, hours=hours, minutes=minutes, seconds=seconds)
else:
error_msg = f"Expected a tuple or list of length 4, got {len(input_value)}"
if len(input_value) != 4:
error_msg = f"Expected tuple/list length 4, got {len(input_value)}"
logger.error(error_msg)
raise ValueError(error_msg)
days, hours, minutes, seconds = input_value
duration = pendulum.duration(days=days, hours=hours, minutes=minutes, seconds=seconds)
elif isinstance(input_value, str):
# Use pendulum's parsing for human-readable duration strings
# first try pendulum.parse
try:
parsed = pendulum.parse(input_value)
if isinstance(parsed, pendulum.Duration):
return parsed # Already a duration
duration = parsed # Already a duration
else:
# It's a DateTime, calculate duration from start of day
return parsed - parsed.start_of("day")
duration = parsed - parsed.start_of("day")
except pendulum.parsing.exceptions.ParserError as e:
logger.trace(f"Invalid Pendulum time string format '{input_value}': {e}")
# Handle strings like "2 days 5 hours 30 minutes"
total_seconds = 0
time_units = {
"day": 86400, # 24 * 60 * 60
"hour": 3600,
"minute": 60,
"second": 1,
}
# Mitigate ReDoS vulnerability (#494) by checking input string length.
if len(input_value) > MAX_DURATION_STRING_LENGTH:
raise ValueError(
f"Input string exceeds maximum allowed length ({MAX_DURATION_STRING_LENGTH})."
)
# Regular expression to match time components like '2 days', '5 hours', etc.
matches = re.findall(r"(\d+)\s*(days?|hours?|minutes?|seconds?)", input_value)
if not matches:
error_msg = f"Invalid time string format '{input_value}'"
logger.error(error_msg)
raise ValueError(error_msg)
for value, unit in matches:
unit = unit.lower().rstrip("s") # Normalize unit
if unit in time_units:
total_seconds += int(value) * time_units[unit]
else:
error_msg = f"Unsupported time unit: {unit}"
# Mitigate ReDoS vulnerability (#494) by checking input string length.
if len(input_value) > MAX_DURATION_STRING_LENGTH:
error_msg = (
f"Input string exceeds maximum allowed length ({MAX_DURATION_STRING_LENGTH})."
)
logger.error(error_msg)
raise ValueError(error_msg)
return pendulum.duration(seconds=total_seconds)
# Handle strings like "2 days 5 hours 30 minutes"
matches = re.findall(r"(\d+)\s*(days?|hours?|minutes?|seconds?)", input_value)
if not matches:
error_msg = f"Invalid time string format '{input_value}'"
logger.error(error_msg)
raise ValueError(error_msg)
total_seconds = 0
time_units = {
"day": 86400,
"hour": 3600,
"minute": 60,
"second": 1,
}
for value, unit in matches:
unit = unit.lower().rstrip("s") # Normalize unit
if unit in time_units:
total_seconds += int(value) * time_units[unit]
else:
error_msg = f"Unsupported time unit: {unit}"
logger.error(error_msg)
raise ValueError(error_msg)
duration = pendulum.duration(seconds=total_seconds)
else:
error_msg = f"Unsupported input type: {type(input_value)}"
logger.error(error_msg)
raise ValueError(error_msg)
# ---- now apply as_string rules ----
if not as_string:
return duration
total_seconds = int(duration.total_seconds())
# Boolean True → ISO-8601
if as_string is True:
return duration_to_iso8601(duration)
# Human-readable
if as_string == "human":
return duration.in_words()
# Pandas frequency
if as_string == "pandas":
# hours?
if total_seconds % 3600 == 0:
return f"{total_seconds // 3600}h"
# minutes?
if total_seconds % 60 == 0:
return f"{total_seconds // 60}min"
# else seconds (fallback)
return f"{total_seconds}s"
# Custom format string
if isinstance(as_string, str):
return as_string.format(
S=total_seconds,
M=total_seconds // 60,
H=total_seconds // 3600,
f=duration.in_words(),
)
error_msg = f"Unsupported as_string value: {as_string}"
logger.error(error_msg)
raise ValueError(error_msg)
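The fallback string parser and the `"pandas"` formatting branch of `to_duration` can be demonstrated in isolation. This sketch mirrors the regex and unit table above and the hour/minute/second cascade of the pandas branch (standalone helper names are invented for the demo):

```python
import re

TIME_UNITS = {"day": 86400, "hour": 3600, "minute": 60, "second": 1}

def parse_duration_seconds(text: str) -> int:
    """Parse strings like '2 days 5 hours 30 minutes' into total seconds.

    Same lowercase-only regex as the fallback branch in to_duration.
    """
    matches = re.findall(r"(\d+)\s*(days?|hours?|minutes?|seconds?)", text)
    if not matches:
        raise ValueError(f"Invalid time string format '{text}'")
    return sum(int(value) * TIME_UNITS[unit.rstrip("s")] for value, unit in matches)

def to_pandas_freq(total_seconds: int) -> str:
    """Render seconds as a lowercase pandas frequency string ('1h', '15min', '90s')."""
    if total_seconds % 3600 == 0:
        return f"{total_seconds // 3600}h"
    if total_seconds % 60 == 0:
        return f"{total_seconds // 60}min"
    return f"{total_seconds}s"

print(parse_duration_seconds("2 days 5 hours"))              # 190800
print(to_pandas_freq(parse_duration_seconds("15 minutes")))  # 15min
print(to_pandas_freq(parse_duration_seconds("90 seconds")))  # 90s
```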
@overload
def to_timezone(