fix: logging, prediction update, multiple bugs (#584)

* Fix logging configuration issues that caused logging to stop working. Switch from Python
  logging to Loguru. Enable console and file logging with different log levels.
  Add logging documentation.

* Fix logging configuration and EOS configuration getting out of sync. Added tracking support
  for nested value updates of Pydantic models. This is used to update the logging configuration
  when the EOS configuration for logging is changed. Keeps the logging config and the EOS
  config in sync as long as all changes to the EOS logging configuration are done by
  set_nested_value(), which is the case for the REST API.
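  The tracking idea can be sketched as follows. This is a hypothetical, minimal dict-based
  sketch of the mechanism, not the actual EOS implementation (which works on nested Pydantic
  models); the names `track_nested_value` and the `/`-separated path are assumptions for
  illustration only:

  ```python
  from typing import Any, Callable

  # Callbacks registered for a path prefix (e.g. "logging") fire whenever
  # set_nested_value() touches a value under that subtree.
  _trackers: dict[str, Callable[[str, Any, Any], None]] = {}

  def track_nested_value(path_prefix: str, callback: Callable[[str, Any, Any], None]) -> None:
      """Register a callback for changes under the given path prefix."""
      _trackers[path_prefix] = callback

  def set_nested_value(config: dict, path: str, value: Any) -> None:
      """Set a value at a '/'-separated path and notify matching trackers."""
      keys = path.split("/")
      node = config
      for key in keys[:-1]:
          node = node.setdefault(key, {})
      old_value = node.get(keys[-1])
      node[keys[-1]] = value
      for prefix, callback in _trackers.items():
          if path.startswith(prefix):
              callback(path, old_value, value)

  # Example: keep a (stand-in) logging backend in sync with the config.
  applied: list[tuple[str, Any]] = []
  track_nested_value("logging", lambda path, old, new: applied.append((path, new)))

  config: dict = {"logging": {"console_level": None}}
  set_nested_value(config, "logging/console_level", "DEBUG")
  ```

  As long as every write goes through the tracked setter, the callback sees each change and
  can reconfigure the logging backend immediately.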

* Fix energy management task looping endlessly after the second update when trying to update
  the last_update datetime.
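  The root cause of the endless loop is immutability: pendulum's `DateTime.add()` returns a
  new instance instead of mutating in place, so calling it without rebinding discards the
  result. A standard-library sketch of the buggy versus fixed pattern (dates and interval are
  illustrative):

  ```python
  from datetime import datetime, timedelta

  interval = 300  # management interval in seconds

  last_update = datetime(2025, 6, 10, 22, 0, 0)
  current = datetime(2025, 6, 10, 22, 20, 0)

  # Buggy pattern: the addition result is thrown away, so last_update never
  # advances and a catch-up loop on it would spin forever.
  _ = last_update + timedelta(seconds=interval)
  assert last_update == datetime(2025, 6, 10, 22, 0, 0)  # unchanged

  # Fixed pattern: rebind the name to the returned instance until caught up.
  while (current - last_update).total_seconds() >= interval:
      last_update = last_update + timedelta(seconds=interval)
  ```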

* Fix get_nested_value() to correctly take values from the dicts in a Pydantic model instance.

* Fix usage of model classes instead of model instances in nested value access when evaluating
  the value type associated with each key.
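  The class-versus-instance distinction can be illustrated with a plain annotated class as a
  hypothetical stand-in for a Pydantic model: field and type metadata live on the class, so
  type lookups must go through `type(instance)` rather than the instance itself (in the diff,
  `config_eos.__class__.model_fields`):

  ```python
  from typing import Optional, get_type_hints

  class LoggingSettings:  # illustrative stand-in, not the real EOS model
      console_level: Optional[str] = None
      file_level: Optional[str] = None

  settings = LoggingSettings()

  # Resolve type annotations from the class of the instance.
  hints = get_type_hints(type(settings))
  ```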

* Fix invalid JSON format in the prediction documentation for the PVForecastAkkudoktor provider.

* Fix documentation quirks and add EOS Connect to integrations.

* Support deprecated fields in configuration in documentation generation and EOSdash.

* Enhance the EOSdash demo to show BrightSky humidity data (which is often missing).

* Update documentation reference to German EOS installation videos.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
Bobby Noelte 2025-06-10 22:00:28 +02:00 committed by GitHub
parent 9d46f3c08e
commit bd38b3c5ef
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
70 changed files with 5927 additions and 5035 deletions

View File

@@ -76,6 +76,11 @@ read-docs: docs
 	@echo "Read the documentation in your browser"
 	.venv/bin/python -m webbrowser build/docs/html/index.html

+# Clean Python bytecode
+clean-bytecode:
+	find . -type d -name "__pycache__" -exec rm -r {} +
+	find . -type f -name "*.pyc" -delete
+
 # Clean target to remove generated documentation and documentation artefacts
 clean-docs:
 	@echo "Searching and deleting all '_autosum' directories in docs..."

View File

@@ -15,10 +15,6 @@ Properties:
     timezone (Optional[str]): Computed time zone string based on the specified latitude
         and longitude.

-Validators:
-    validate_latitude (float): Ensures `latitude` is within the range -90 to 90.
-    validate_longitude (float): Ensures `longitude` is within the range -180 to 180.
-
 :::{table} general
 :widths: 10 20 10 5 5 30
 :align: left
@@ -127,8 +123,10 @@ Validators:
 | Name | Environment Variable | Type | Read-Only | Default | Description |
 | ---- | -------------------- | ---- | --------- | ------- | ----------- |
-| level | `EOS_LOGGING__LEVEL` | `Optional[str]` | `rw` | `None` | EOS default logging level. |
-| root_level | | `str` | `ro` | `N/A` | Root logger logging level. |
+| level | `EOS_LOGGING__LEVEL` | `Optional[str]` | `rw` | `None` | This is deprecated. Use console_level and file_level instead. |
+| console_level | `EOS_LOGGING__CONSOLE_LEVEL` | `Optional[str]` | `rw` | `None` | Logging level when logging to console. |
+| file_level | `EOS_LOGGING__FILE_LEVEL` | `Optional[str]` | `rw` | `None` | Logging level when logging to file. |
+| file_path | | `Optional[pathlib.Path]` | `ro` | `N/A` | Computed log file path based on data output path. |
 :::

 ### Example Input
@@ -138,7 +136,9 @@ Validators:
 {
   "logging": {
-    "level": "INFO"
+    "level": null,
+    "console_level": "TRACE",
+    "file_level": "TRACE"
   }
 }
 ```
@@ -150,8 +150,10 @@ Validators:
 {
   "logging": {
-    "level": "INFO",
-    "root_level": "INFO"
+    "level": null,
+    "console_level": "TRACE",
+    "file_level": "TRACE",
+    "file_path": "/home/user/.local/share/net.akkudoktoreos.net/output/eos.log"
   }
 }
 ```
@@ -946,7 +948,9 @@ Validators:
     "interval": 300.0
   },
   "logging": {
-    "level": "INFO"
+    "level": null,
+    "console_level": "TRACE",
+    "file_level": "TRACE"
   },
   "devices": {
     "batteries": [

View File

@@ -464,6 +464,55 @@ Health check endpoint to verify that the EOS server is alive.

 ---

+## GET /v1/logging/log
+
+**Links**: [local](http://localhost:8503/docs#/default/fastapi_logging_get_log_v1_logging_log_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_logging_get_log_v1_logging_log_get)
+
+Fastapi Logging Get Log
+
+```
+Get structured log entries from the EOS log file.
+
+Filters and returns log entries based on the specified query parameters. The log
+file is expected to contain newline-delimited JSON entries.
+
+Args:
+    limit (int): Maximum number of entries to return.
+    level (Optional[str]): Filter logs by severity level (e.g., DEBUG, INFO).
+    contains (Optional[str]): Return only logs that include this string in the message.
+    regex (Optional[str]): Return logs that match this regular expression in the message.
+    from_time (Optional[str]): ISO 8601 timestamp to filter logs not older than this.
+    to_time (Optional[str]): ISO 8601 timestamp to filter logs not newer than this.
+    tail (bool): If True, fetch the most recent log entries (like `tail`).
+
+Returns:
+    JSONResponse: A JSON list of log entries.
+```
+
+**Parameters**:
+
+- `limit` (query, optional): Maximum number of log entries to return.
+
+- `level` (query, optional): Filter by log level (e.g., INFO, ERROR).
+
+- `contains` (query, optional): Filter logs containing this substring.
+
+- `regex` (query, optional): Filter logs by matching regex in message.
+
+- `from_time` (query, optional): Start time (ISO format) for filtering logs.
+
+- `to_time` (query, optional): End time (ISO format) for filtering logs.
+
+- `tail` (query, optional): If True, returns the most recent lines (tail mode).
+
+**Responses**:
+
+- **200**: Successful Response
+
+- **422**: Validation Error
+
+---
+
 ## PUT /v1/measurement/data

 **Links**: [local](http://localhost:8503/docs#/default/fastapi_measurement_data_put_v1_measurement_data_put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_measurement_data_put_v1_measurement_data_put)

View File

@@ -28,7 +28,7 @@ management.

 Energy management is the overall process to provide planning data for scheduling the different
 devices in your system in an optimal way. Energy management cares for the update of predictions and
 the optimization of the planning based on the simulated behavior of the devices. The planning is on
-the hour. Sub-hour energy management is left
+the hour.

 ### Optimization

View File

@@ -20,8 +20,8 @@ Andreas Schmitz uses [Node-RED](https://nodered.org/) as part of his home automa

 ### Node-Red Resources

-- [Installation Guide (German)](https://meintechblog.de/2024/09/05/andreas-schmitz-joerg-installiert-mein-energieoptimierungssystem/)
-  — A detailed guide on integrating an early version of EOS with `Node-RED`.
+- [Installation Guide (German)](https://www.youtube.com/playlist?list=PL8_vk9A-s7zLD865Oou6y3EeQLlNtu-Hn)
+  — A detailed guide on integrating EOS with `Node-RED`.

 ## Home Assistant

@@ -34,3 +34,8 @@ emphasizes local control and user privacy.

 - Duetting's [EOS Home Assistant Addon](https://github.com/Duetting/ha_eos_addon) — Additional
   details can be found in this [discussion thread](https://github.com/Akkudoktor-EOS/EOS/discussions/294).
+
+## EOS Connect
+
+[EOS connect](https://github.com/ohAnd/EOS_connect) uses `EOS` for energy management and optimization,
+and connects to smart home platforms to monitor, forecast, and control energy flows.

View File

@@ -174,7 +174,7 @@ but usually find good local optima very quickly in a large solution space.

 ## Links

-- [German Video explaining the basic concept and installation process for the early version of EOS (YouTube)](https://www.youtube.com/live/ftQULW4-1ts?si=oDdBBifCpUmiCXaY)
+- [German Videos explaining the basic concept and installation process of EOS (YouTube)](https://www.youtube.com/playlist?list=PL8_vk9A-s7zLD865Oou6y3EeQLlNtu-Hn)
 - [German Forum of Akkudoktor EOS](https://akkudoktor.net/c/der-akkudoktor/eos)
 - [Akkudoktor-EOS GitHub Repository](https://github.com/Akkudoktor-EOS/EOS)
 - [Latest EOS Documentation](https://akkudoktor-eos.readthedocs.io/en/latest/)

View File

@@ -0,0 +1,75 @@
+% SPDX-License-Identifier: Apache-2.0
+(logging-page)=
+
+# Logging
+
+EOS automatically records important events and messages to help you understand what's happening and
+to troubleshoot problems.
+
+## How Logging Works
+
+- By default, logs are shown in your terminal (console).
+- You can also save logs to a file for later review.
+- Log files are rotated automatically to avoid becoming too large.
+
+## Controlling Log Details
+
+### 1. Command-Line Option
+
+Set the amount of log detail shown on the console by using `--log-level` when starting EOS.
+
+Example:
+
+```{eval-rst}
+.. tabs::
+
+   .. tab:: Windows
+
+      .. code-block:: powershell
+
+         .venv\Scripts\python src/akkudoktoreos/server/eos.py --log-level DEBUG
+
+   .. tab:: Linux
+
+      .. code-block:: bash
+
+         .venv/bin/python src/akkudoktoreos/server/eos.py --log-level DEBUG
+```
+
+Common levels:
+
+- DEBUG (most detail)
+- INFO (default)
+- WARNING
+- ERROR
+- CRITICAL (least detail)
+
+### 2. Configuration File
+
+You can also set logging options in your EOS configuration file (`EOS.config.json`).
+
+```json
+{
+  "logging": {
+    "console_level": "INFO",
+    "file_level": "DEBUG"
+  }
+}
+```
+
+### 3. Environment Variable
+
+You can also control the log level by setting the `EOS_LOGGING__CONSOLE_LEVEL` and the
+`EOS_LOGGING__FILE_LEVEL` environment variables.
+
+```bash
+EOS_LOGGING__CONSOLE_LEVEL="INFO"
+EOS_LOGGING__FILE_LEVEL="DEBUG"
+```
+
+## File Logging
+
+If the `file_level` configuration is set, log records are written to a rotating log file. The log
+file is in the data output directory and named `eos.log`. You may read the file directly or use
+the `/v1/logging/log` endpoint to access the file log.

View File

@@ -375,28 +375,28 @@ Example:
         "surface_azimuth": -10,
         "surface_tilt": 7,
         "userhorizon": [20, 27, 22, 20],
-        "inverter_paco": 10000,
+        "inverter_paco": 10000
       },
       {
         "peakpower": 4.8,
         "surface_azimuth": -90,
         "surface_tilt": 7,
         "userhorizon": [30, 30, 30, 50],
-        "inverter_paco": 10000,
+        "inverter_paco": 10000
       },
       {
         "peakpower": 1.4,
         "surface_azimuth": -40,
         "surface_tilt": 60,
         "userhorizon": [60, 30, 0, 30],
-        "inverter_paco": 2000,
+        "inverter_paco": 2000
       },
       {
         "peakpower": 1.6,
         "surface_azimuth": 5,
         "surface_tilt": 45,
         "userhorizon": [45, 25, 30, 60],
-        "inverter_paco": 1400,
+        "inverter_paco": 1400
       }
     ]
 }

View File

@@ -40,6 +40,7 @@ akkudoktoreos/optimization.md
 akkudoktoreos/prediction.md
 akkudoktoreos/measurement.md
 akkudoktoreos/integration.md
+akkudoktoreos/logging.md
 akkudoktoreos/serverapi.md
 akkudoktoreos/api.rst

File diff suppressed because it is too large

View File

@@ -22,3 +22,4 @@ pydantic==2.11.5
 statsmodels==0.14.4
 pydantic-settings==2.9.1
 linkify-it-py==2.0.3
+loguru==0.7.3

View File

@@ -4,22 +4,20 @@
 import argparse
 import json
 import os
+import re
 import sys
 import textwrap
 from pathlib import Path
 from typing import Any, Union

+from loguru import logger
 from pydantic.fields import ComputedFieldInfo, FieldInfo
 from pydantic_core import PydanticUndefined

 from akkudoktoreos.config.config import ConfigEOS, GeneralSettings, get_config
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import PydanticBaseModel
 from akkudoktoreos.utils.docs import get_model_structure_from_examples

-logger = get_logger(__name__)
-
 documented_types: set[PydanticBaseModel] = set()
 undocumented_types: dict[PydanticBaseModel, tuple[str, list[str]]] = dict()
@@ -145,6 +143,7 @@ def generate_config_table_md(
         field_type = field_info.annotation if regular_field else field_info.return_type
         default_value = get_default_value(field_info, regular_field)
         description = field_info.description if field_info.description else "-"
+        deprecated = field_info.deprecated if field_info.deprecated else None
         read_only = "rw" if regular_field else "ro"
         type_name = get_type_name(field_type)

@@ -154,6 +153,11 @@ def generate_config_table_md(
             env_entry = f"| `{prefix}{config_name}` "
         else:
             env_entry = "| "
+        if deprecated:
+            if isinstance(deprecated, bool):
+                description = "Deprecated!"
+            else:
+                description = deprecated
         table += f"| {field_name} {env_entry}| `{type_name}` | `{read_only}` | `{default_value}` | {description} |\n"

     inner_types: dict[PydanticBaseModel, tuple[str, list[str]]] = dict()
@@ -259,7 +263,7 @@ def generate_config_md(config_eos: ConfigEOS) -> str:
     markdown = "# Configuration Table\n\n"

     # Generate tables for each top level config
-    for field_name, field_info in config_eos.model_fields.items():
+    for field_name, field_info in config_eos.__class__.model_fields.items():
         field_type = field_info.annotation
         markdown += generate_config_table_md(
             field_type, [field_name], f"EOS_{field_name.upper()}__", True
@@ -279,6 +283,13 @@ def generate_config_md(config_eos: ConfigEOS) -> str:
     markdown = markdown.rstrip("\n")
     markdown += "\n"

+    # Assure log path does not leak to documentation
+    markdown = re.sub(
+        r'(?<=["\'])/[^"\']*/output/eos\.log(?=["\'])',
+        '/home/user/.local/share/net.akkudoktoreos.net/output/eos.log',
+        markdown
+    )
+
     return markdown

View File

@@ -42,6 +42,9 @@ def generate_openapi() -> dict:
     general = openapi_spec["components"]["schemas"]["ConfigEOS"]["properties"]["general"]["default"]
     general["config_file_path"] = "/home/user/.config/net.akkudoktoreos.net/EOS.config.json"
     general["config_folder_path"] = "/home/user/.config/net.akkudoktoreos.net"
+    # Fix file path for logging settings to not show local/test file path
+    logging = openapi_spec["components"]["schemas"]["ConfigEOS"]["properties"]["logging"]["default"]
+    logging["file_path"] = "/home/user/.local/share/net.akkudoktoreos.net/output/eos.log"

     return openapi_spec

View File

@@ -12,15 +12,12 @@ import numpy as np

 from akkudoktoreos.config.config import get_config
 from akkudoktoreos.core.ems import get_ems
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.optimization.genetic import (
     OptimizationParameters,
     optimization_problem,
 )
 from akkudoktoreos.prediction.prediction import get_prediction

-get_logger(__name__, logging_level="DEBUG")
-

 def prepare_optimization_real_parameters() -> OptimizationParameters:
     """Prepare and return optimization parameters with real world data.

View File

@@ -14,6 +14,7 @@ import shutil
 from pathlib import Path
 from typing import Any, ClassVar, Optional, Type

+from loguru import logger
 from platformdirs import user_config_dir, user_data_dir
 from pydantic import Field, computed_field
 from pydantic_settings import (
@@ -29,7 +30,6 @@ from akkudoktoreos.core.cachesettings import CacheCommonSettings
 from akkudoktoreos.core.coreabc import SingletonMixin
 from akkudoktoreos.core.decorators import classproperty
 from akkudoktoreos.core.emsettings import EnergyManagementCommonSettings
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.logsettings import LoggingCommonSettings
 from akkudoktoreos.core.pydantic import PydanticModelNestedValueMixin, merge_models
 from akkudoktoreos.devices.settings import DevicesCommonSettings
@@ -44,8 +44,6 @@ from akkudoktoreos.server.server import ServerCommonSettings
 from akkudoktoreos.utils.datetimeutil import to_timezone
 from akkudoktoreos.utils.utils import UtilsCommonSettings

-logger = get_logger(__name__)
-

 def get_absolute_path(
     basepath: Optional[Path | str], subpath: Optional[Path | str]
@@ -80,10 +78,6 @@ class GeneralSettings(SettingsBaseModel):
     Properties:
         timezone (Optional[str]): Computed time zone string based on the specified latitude
             and longitude.
-
-    Validators:
-        validate_latitude (float): Ensures `latitude` is within the range -90 to 90.
-        validate_longitude (float): Ensures `longitude` is within the range -180 to 180.
     """

     _config_folder_path: ClassVar[Optional[Path]] = None
@@ -283,6 +277,16 @@ class ConfigEOS(SingletonMixin, SettingsEOSDefaults):
     ENCODING: ClassVar[str] = "UTF-8"
     CONFIG_FILE_NAME: ClassVar[str] = "EOS.config.json"

+    def __hash__(self) -> int:
+        # ConfigEOS is a singleton
+        return hash("config_eos")
+
+    def __eq__(self, other: Any) -> bool:
+        if not isinstance(other, ConfigEOS):
+            return False
+        # ConfigEOS is a singleton
+        return True
+
     @classmethod
     def settings_customise_sources(
         cls,

View File

@@ -27,17 +27,14 @@ from typing import (
 )

 import cachebox
+from loguru import logger
 from pendulum import DateTime, Duration
 from pydantic import Field

 from akkudoktoreos.core.coreabc import ConfigMixin, SingletonMixin
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import PydanticBaseModel
 from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_duration

-logger = get_logger(__name__)
-
 # ---------------------------------
 # In-Memory Caching Functionality
 # ---------------------------------

View File

@@ -13,13 +13,10 @@ Classes:
 import threading
 from typing import Any, ClassVar, Dict, Optional, Type

+from loguru import logger
 from pendulum import DateTime
 from pydantic import computed_field

-from akkudoktoreos.core.logging import get_logger
-
-logger = get_logger(__name__)
-
 config_eos: Any = None
 measurement_eos: Any = None
 prediction_eos: Any = None

View File

@@ -19,6 +19,7 @@ from typing import Any, Dict, Iterator, List, Optional, Tuple, Type, Union, over
 import numpy as np
 import pandas as pd
 import pendulum
+from loguru import logger
 from numpydantic import NDArray, Shape
 from pendulum import DateTime, Duration
 from pydantic import (
@@ -31,7 +32,6 @@ from pydantic import (
 )

 from akkudoktoreos.core.coreabc import ConfigMixin, SingletonMixin, StartMixin
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import (
     PydanticBaseModel,
     PydanticDateTimeData,
@@ -39,8 +39,6 @@ from akkudoktoreos.core.pydantic import (
 )
 from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_duration

-logger = get_logger(__name__)
-

 class DataBase(ConfigMixin, StartMixin, PydanticBaseModel):
     """Base class for handling generic data.

View File

@@ -1,10 +1,6 @@
 from collections.abc import Callable
 from typing import Any, Optional

-from akkudoktoreos.core.logging import get_logger
-
-logger = get_logger(__name__)
-

 class classproperty:
     """A decorator to define a read-only property at the class level.

View File

@@ -2,6 +2,7 @@ import traceback
 from typing import Any, ClassVar, Optional

 import numpy as np
+from loguru import logger
 from numpydantic import NDArray, Shape
 from pendulum import DateTime
 from pydantic import ConfigDict, Field, computed_field, field_validator, model_validator
@@ -9,7 +10,6 @@ from typing_extensions import Self

 from akkudoktoreos.core.cache import CacheUntilUpdateStore
 from akkudoktoreos.core.coreabc import ConfigMixin, PredictionMixin, SingletonMixin
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import ParametersBaseModel, PydanticBaseModel
 from akkudoktoreos.devices.battery import Battery
 from akkudoktoreos.devices.generic import HomeAppliance
@@ -17,8 +17,6 @@ from akkudoktoreos.devices.inverter import Inverter
 from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime
 from akkudoktoreos.utils.utils import NumpyEncoder

-logger = get_logger(__name__)
-

 class EnergyManagementParameters(ParametersBaseModel):
     pv_prognose_wh: list[float] = Field(
@@ -283,6 +281,8 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
         self.prediction.update_data(force_enable=force_enable, force_update=force_update)
         # TODO: Create optimisation problem that calls into devices.update_data() for simulations.

+        logger.info("Energy management run (crippled version - prediction update only)")
+
     def manage_energy(self) -> None:
         """Repeating task for managing energy.
@@ -302,6 +302,7 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
         Note: The task maintains the interval even if some intervals are missed.
         """
         current_datetime = to_datetime()
+        interval = self.config.ems.interval  # interval maybe changed in between

         if EnergyManagement._last_datetime is None:
             # Never run before
@@ -316,13 +317,13 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
                 logger.error(message)
                 return

-        if self.config.ems.interval is None or self.config.ems.interval == float("nan"):
+        if interval is None or interval == float("nan"):
             # No Repetition
             return

         if (
-            compare_datetimes(current_datetime, self._last_datetime).time_diff
-            < self.config.ems.interval
+            compare_datetimes(current_datetime, EnergyManagement._last_datetime).time_diff
+            < interval
         ):
             # Wait for next run
             return
@@ -337,9 +338,9 @@ class EnergyManagement(SingletonMixin, ConfigMixin, PredictionMixin, PydanticBas
         # Remember the energy management run - keep on interval even if we missed some intervals
         while (
             compare_datetimes(current_datetime, EnergyManagement._last_datetime).time_diff
-            >= self.config.ems.interval
+            >= interval
         ):
-            EnergyManagement._last_datetime.add(seconds=self.config.ems.interval)
+            EnergyManagement._last_datetime = EnergyManagement._last_datetime.add(seconds=interval)

View File

@@ -1,20 +1,3 @@
 """Abstract and base classes for logging."""

-import logging
-
-
-def logging_str_to_level(level_str: str) -> int:
-    """Convert log level string to logging level."""
-    if level_str == "DEBUG":
-        level = logging.DEBUG
-    elif level_str == "INFO":
-        level = logging.INFO
-    elif level_str == "WARNING":
-        level = logging.WARNING
-    elif level_str == "CRITICAL":
-        level = logging.CRITICAL
-    elif level_str == "ERROR":
-        level = logging.ERROR
-    else:
-        raise ValueError(f"Unknown loggin level: {level_str}")
-    return level
+LOGGING_LEVELS: list[str] = ["TRACE", "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]

View File

@ -1,95 +1,241 @@
"""Utility functions for handling logging tasks. """Utility for configuring Loguru loggers."""
Functions:
----------
- get_logger: Creates and configures a logger with console and optional rotating file logging.
Example usage:
--------------
# Logger setup
>>> logger = get_logger(__name__, log_file="app.log", logging_level="DEBUG")
>>> logger.info("Logging initialized.")
Notes:
------
- The logger supports rotating log files to prevent excessive log file size.
"""
import json
import logging as pylogging import logging as pylogging
import os import os
from logging.handlers import RotatingFileHandler import re
from typing import Optional import sys
from pathlib import Path
from types import FrameType
from typing import Any, List, Optional
from akkudoktoreos.core.logabc import logging_str_to_level import pendulum
from loguru import logger
from akkudoktoreos.core.logabc import LOGGING_LEVELS
def get_logger( class InterceptHandler(pylogging.Handler):
name: str, """A logging handler that redirects standard Python logging messages to Loguru.
log_file: Optional[str] = None,
logging_level: Optional[str] = None,
max_bytes: int = 5000000,
backup_count: int = 5,
) -> pylogging.Logger:
"""Creates and configures a logger with a given name.
The logger supports logging to both the console and an optional log file. File logging is This handler ensures consistency between the `logging` module and Loguru by intercepting
handled by a rotating file handler to prevent excessive log file size. logs sent to the standard logging system and re-emitting them through Loguru with proper
formatting and context (including exception info and call depth).
Attributes:
loglevel_mapping (dict): Mapping from standard logging levels to Loguru level names.
"""
loglevel_mapping: dict[int, str] = {
50: "CRITICAL",
40: "ERROR",
30: "WARNING",
20: "INFO",
10: "DEBUG",
5: "TRACE",
0: "NOTSET",
}
def emit(self, record: pylogging.LogRecord) -> None:
"""Emits a logging record by forwarding it to Loguru with preserved metadata.
Args: Args:
name (str): The name of the logger, typically `__name__` from the calling module. record (logging.LogRecord): A record object containing log message and metadata.
log_file (Optional[str]): Path to the log file for file logging. If None, no file logging is done. """
logging_level (Optional[str]): Logging level (e.g., "INFO", "DEBUG"). Defaults to "INFO". try:
max_bytes (int): Maximum size in bytes for log file before rotation. Defaults to 5 MB. level = logger.level(record.levelname).name
backup_count (int): Number of backup log files to keep. Defaults to 5. except AttributeError:
level = self.loglevel_mapping.get(record.levelno, "INFO")
frame: Optional[FrameType] = pylogging.currentframe()
depth: int = 2
while frame and frame.f_code.co_filename == pylogging.__file__:
frame = frame.f_back
depth += 1
log = logger.bind(request_id="app")
log.opt(depth=depth, exception=record.exc_info).log(level, record.getMessage())
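The numeric-to-name fallback in `emit()` can be exercised without Loguru; a stdlib-only sketch where the hypothetical `CollectingHandler` stands in for the Loguru sink:

```python
import logging

# Numeric-to-name fallback used when a level name is unknown.
LOGLEVEL_MAPPING = {
    50: "CRITICAL", 40: "ERROR", 30: "WARNING",
    20: "INFO", 10: "DEBUG", 5: "TRACE", 0: "NOTSET",
}

class CollectingHandler(logging.Handler):
    """Collects (level_name, message) tuples instead of forwarding to Loguru."""

    def __init__(self) -> None:
        super().__init__()
        self.records: list = []

    def emit(self, record: logging.LogRecord) -> None:
        # Unknown numeric levels fall back to "INFO", as in InterceptHandler.
        level = LOGLEVEL_MAPPING.get(record.levelno, "INFO")
        self.records.append((level, record.getMessage()))

handler = CollectingHandler()
log = logging.getLogger("demo")
log.addHandler(handler)
log.propagate = False
log.warning("intercepted")
```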
console_handler_id = None
file_handler_id = None
def track_logging_config(config_eos: Any, path: str, old_value: Any, value: Any) -> None:
"""Track logging config changes."""
global console_handler_id, file_handler_id
if not path.startswith("logging"):
raise ValueError(f"Logging shall not track '{path}'")
if not config_eos.logging.console_level:
# No value given - check environment value - may also be None
config_eos.logging.console_level = os.getenv("EOS_LOGGING__LEVEL")
if not config_eos.logging.file_level:
# No value given - check environment value - may also be None
config_eos.logging.file_level = os.getenv("EOS_LOGGING__LEVEL")
# Remove handlers
if console_handler_id:
try:
logger.remove(console_handler_id)
except Exception as e:
logger.debug("Exception on logger.remove: {}", e, exc_info=True)
console_handler_id = None
if file_handler_id:
try:
logger.remove(file_handler_id)
except Exception as e:
logger.debug("Exception on logger.remove: {}", e, exc_info=True)
file_handler_id = None
# Create handlers with new configuration
# Always add console handler
if config_eos.logging.console_level not in LOGGING_LEVELS:
logger.error(
f"Invalid console log level '{config_eos.logging.console_level}' - forced to INFO."
)
config_eos.logging.console_level = "INFO"
console_handler_id = logger.add(
sys.stderr,
enqueue=True,
backtrace=True,
level=config_eos.logging.console_level,
# format=_console_format
)
# Add file handler
if config_eos.logging.file_level and config_eos.logging.file_path:
if config_eos.logging.file_level not in LOGGING_LEVELS:
logger.error(
f"Invalid file log level '{config_eos.logging.file_level}' - forced to INFO."
)
config_eos.logging.file_level = "INFO"
file_handler_id = logger.add(
sink=config_eos.logging.file_path,
rotation="100 MB",
retention="3 days",
enqueue=True,
backtrace=True,
level=config_eos.logging.file_level,
serialize=True, # JSON dict formatting
# format=_file_format
)
# Redirect standard logging to Loguru
pylogging.basicConfig(handlers=[InterceptHandler()], level=0)
# Redirect uvicorn and fastapi logging to Loguru
pylogging.getLogger("uvicorn.access").handlers = [InterceptHandler()]
for pylogger_name in ["uvicorn", "uvicorn.error", "fastapi"]:
pylogger = pylogging.getLogger(pylogger_name)
pylogger.handlers = [InterceptHandler()]
pylogger.propagate = False
logger.info(
f"Logger reconfigured - console: {config_eos.logging.console_level}, file: {config_eos.logging.file_level}."
)
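The level-resolution order used above (explicit config value, then the `EOS_LOGGING__LEVEL` environment variable, then a forced `INFO` fallback) can be sketched as a small helper; `resolve_level` is a hypothetical name, not part of the module:

```python
import os
from typing import Optional

LOGGING_LEVELS = ["TRACE", "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]

def resolve_level(configured: Optional[str], default: str = "INFO") -> str:
    """Resolve a log level: explicit value, then EOS_LOGGING__LEVEL, then default."""
    level = configured or os.getenv("EOS_LOGGING__LEVEL")
    if level is None or level.upper() not in LOGGING_LEVELS:
        # Invalid or missing levels are forced to the default, as above.
        return default
    return level.upper()
```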
def read_file_log(
log_path: Path,
limit: int = 100,
level: Optional[str] = None,
contains: Optional[str] = None,
regex: Optional[str] = None,
from_time: Optional[str] = None,
to_time: Optional[str] = None,
tail: bool = False,
) -> List[dict]:
"""Read and filter structured log entries from a JSON-formatted log file.
Args:
log_path (Path): Path to the JSON-formatted log file.
limit (int, optional): Maximum number of log entries to return. Defaults to 100.
level (Optional[str], optional): Filter logs by log level (e.g., "INFO", "ERROR"). Defaults to None.
contains (Optional[str], optional): Filter logs that contain this substring in their message. Case-insensitive. Defaults to None.
regex (Optional[str], optional): Filter logs whose message matches this regular expression. Defaults to None.
from_time (Optional[str], optional): ISO 8601 datetime string to filter logs not earlier than this time. Defaults to None.
to_time (Optional[str], optional): ISO 8601 datetime string to filter logs not later than this time. Defaults to None.
tail (bool, optional): If True, read the last lines of the file (like `tail -n`). Defaults to False.
Returns: Returns:
logging.Logger: Configured logger instance. List[dict]: A list of filtered log entries as dictionaries.
Example: Raises:
logger = get_logger(__name__, log_file="app.log", logging_level="DEBUG") FileNotFoundError: If the log file does not exist.
logger.info("Application started") ValueError: If the datetime strings are invalid or improperly formatted.
Exception: For other unforeseen I/O or parsing errors.
""" """
# Create a logger with the specified name if not log_path.exists():
logger = pylogging.getLogger(name) raise FileNotFoundError("Log file not found")
logger.propagate = True
# This is already supported by pydantic-settings in LoggingCommonSettings, however in case
# loading the config itself fails and to set the level before we load the config, we set it here manually.
if logging_level is None and (env_level := os.getenv("EOS_LOGGING__LEVEL")) is not None:
logging_level = env_level
if logging_level is not None:
level = logging_str_to_level(logging_level)
logger.setLevel(level)
# The log message format try:
formatter = pylogging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") from_dt = pendulum.parse(from_time) if from_time else None
to_dt = pendulum.parse(to_time) if to_time else None
except Exception as e:
raise ValueError(f"Invalid date/time format: {e}") from e
# Prevent loggers from being added multiple times regex_pattern = re.compile(regex) if regex else None
# There may already be a logger from pytest
if not logger.handlers:
# Create a console handler with a standard output stream
console_handler = pylogging.StreamHandler()
if logging_level is not None:
console_handler.setLevel(level)
console_handler.setFormatter(formatter)
# Add the console handler to the logger def matches_filters(log: dict) -> bool:
logger.addHandler(console_handler) if level and log.get("level", {}).get("name") != level.upper():
return False
if contains and contains.lower() not in log.get("message", "").lower():
return False
if regex_pattern and not regex_pattern.search(log.get("message", "")):
return False
if from_dt or to_dt:
try:
log_time = pendulum.parse(log["time"])
except Exception:
return False
if from_dt and log_time < from_dt:
return False
if to_dt and log_time > to_dt:
return False
return True
if log_file and len(logger.handlers) < 2: # We assume a console logger to be the first logger matched_logs = []
# If a log file path is specified, create a rotating file handler lines: list[str] = []
# Ensure the log directory exists if tail:
log_dir = os.path.dirname(log_file) with log_path.open("rb") as f:
if log_dir and not os.path.exists(log_dir): f.seek(0, 2)
os.makedirs(log_dir) end = f.tell()
buffer = bytearray()
pointer = end
# Create a rotating file handler while pointer > 0 and len(lines) < limit * 5:
file_handler = RotatingFileHandler(log_file, maxBytes=max_bytes, backupCount=backup_count) pointer -= 1
if logging_level is not None: f.seek(pointer)
file_handler.setLevel(level) byte = f.read(1)
file_handler.setFormatter(formatter) if byte == b"\n":
if buffer:
line = buffer[::-1].decode("utf-8", errors="ignore")
lines.append(line)
buffer.clear()
else:
buffer.append(byte[0])
if buffer:
line = buffer[::-1].decode("utf-8", errors="ignore")
lines.append(line)
lines = lines[::-1]
else:
with log_path.open("r", encoding="utf-8", newline=None) as f_txt:
lines = f_txt.readlines()
# Add the file handler to the logger for line in lines:
logger.addHandler(file_handler) if not line.strip():
continue
try:
log = json.loads(line)
except json.JSONDecodeError:
continue
if matches_filters(log):
matched_logs.append(log)
if len(matched_logs) >= limit:
break
return logger return matched_logs
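The filter logic of `read_file_log()` can be reduced to a stdlib-only sketch; `filter_log_lines` is an illustrative stand-in that uses a flat `level` key instead of the nested Loguru record layout, and omits the pendulum-based time filters:

```python
import json
import re
from typing import Optional

def filter_log_lines(
    lines: list,
    level: Optional[str] = None,
    contains: Optional[str] = None,
    pattern: Optional[str] = None,
) -> list:
    """Parse JSON log lines and keep the entries matching all given filters."""
    regex = re.compile(pattern) if pattern else None
    matched = []
    for line in lines:
        if not line.strip():
            continue
        try:
            log = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip unparseable lines, like read_file_log() does
        if level and log.get("level") != level.upper():
            continue
        if contains and contains.lower() not in log.get("message", "").lower():
            continue
        if regex and not regex.search(log.get("message", "")):
            continue
        matched.append(log)
    return matched
```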

View File

@ -3,13 +3,13 @@
Kept in an extra module to avoid cyclic dependencies on package import. Kept in an extra module to avoid cyclic dependencies on package import.
""" """
import logging from pathlib import Path
from typing import Optional from typing import Optional
from pydantic import Field, computed_field, field_validator from pydantic import Field, computed_field, field_validator
from akkudoktoreos.config.configabc import SettingsBaseModel from akkudoktoreos.config.configabc import SettingsBaseModel
from akkudoktoreos.core.logabc import logging_str_to_level from akkudoktoreos.core.logabc import LOGGING_LEVELS
class LoggingCommonSettings(SettingsBaseModel): class LoggingCommonSettings(SettingsBaseModel):
@ -17,27 +17,47 @@ class LoggingCommonSettings(SettingsBaseModel):
level: Optional[str] = Field( level: Optional[str] = Field(
default=None, default=None,
description="EOS default logging level.", deprecated="This is deprecated. Use console_level and file_level instead.",
examples=["INFO", "DEBUG", "WARNING", "ERROR", "CRITICAL"],
) )
# Validators console_level: Optional[str] = Field(
@field_validator("level", mode="after") default=None,
@classmethod description="Logging level when logging to console.",
def set_default_logging_level(cls, value: Optional[str]) -> Optional[str]: examples=LOGGING_LEVELS,
if isinstance(value, str) and value.upper() == "NONE": )
value = None
if value is None: file_level: Optional[str] = Field(
return None default=None,
level = logging_str_to_level(value) description="Logging level when logging to file.",
logging.getLogger().setLevel(level) examples=LOGGING_LEVELS,
return value )
# Computed fields
@computed_field # type: ignore[prop-decorator] @computed_field # type: ignore[prop-decorator]
@property @property
def root_level(self) -> str: def file_path(self) -> Optional[Path]:
"""Root logger logging level.""" """Computed log file path based on data output path."""
level = logging.getLogger().getEffectiveLevel() try:
level_name = logging.getLevelName(level) path = SettingsBaseModel.config.general.data_output_path / "eos.log"
return level_name except:
# Config may not be fully set up
path = None
return path
# Validators
@field_validator("console_level", "file_level", mode="after")
@classmethod
def validate_level(cls, value: Optional[str]) -> Optional[str]:
"""Validate logging level string."""
if value is None:
# Nothing to set
return None
if isinstance(value, str):
level = value.upper()
if level == "NONE":
return None
if level not in LOGGING_LEVELS:
raise ValueError(f"Logging level {value} not supported")
value = level
else:
raise TypeError(f"Invalid type {type(value)} for logging level {value}")
return value
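The validator's normalization rules (upper-casing, `"NONE"` mapped to `None`, rejection of unknown names) can be tried in isolation; `normalize_level` is a hypothetical stand-alone version of the logic, without Pydantic:

```python
from typing import Optional

LOGGING_LEVELS = ["TRACE", "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]

def normalize_level(value: Optional[str]) -> Optional[str]:
    """Normalize a logging level string as validate_level() does."""
    if value is None:
        return None
    if not isinstance(value, str):
        raise TypeError(f"Invalid type {type(value)} for logging level {value}")
    level = value.upper()
    if level == "NONE":
        return None
    if level not in LOGGING_LEVELS:
        raise ValueError(f"Logging level {value} not supported")
    return level
```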

View File

@ -12,20 +12,35 @@ Key Features:
pandas DataFrames and Series with datetime indexes. pandas DataFrames and Series with datetime indexes.
""" """
import inspect
import json import json
import re import re
import uuid
import weakref
from copy import deepcopy from copy import deepcopy
from typing import Any, Dict, List, Optional, Type, Union, get_args, get_origin from typing import (
Any,
Callable,
Dict,
List,
Optional,
Type,
Union,
get_args,
get_origin,
)
from zoneinfo import ZoneInfo from zoneinfo import ZoneInfo
import pandas as pd import pandas as pd
import pendulum import pendulum
from loguru import logger
from pandas.api.types import is_datetime64_any_dtype from pandas.api.types import is_datetime64_any_dtype
from pydantic import ( from pydantic import (
AwareDatetime, AwareDatetime,
BaseModel, BaseModel,
ConfigDict, ConfigDict,
Field, Field,
PrivateAttr,
RootModel, RootModel,
TypeAdapter, TypeAdapter,
ValidationError, ValidationError,
@ -35,6 +50,10 @@ from pydantic import (
from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration
# Global weakref dictionary to hold external state per model instance
# Used as a workaround for PrivateAttr not working in e.g. Mixin Classes
_model_private_state: "weakref.WeakKeyDictionary[Union[PydanticBaseModel, PydanticModelNestedValueMixin], Dict[str, Any]]" = weakref.WeakKeyDictionary()
def merge_models(source: BaseModel, update_dict: dict[str, Any]) -> dict[str, Any]: def merge_models(source: BaseModel, update_dict: dict[str, Any]) -> dict[str, Any]:
def deep_update(source_dict: dict[str, Any], update_dict: dict[str, Any]) -> dict[str, Any]: def deep_update(source_dict: dict[str, Any], update_dict: dict[str, Any]) -> dict[str, Any]:
@ -83,13 +102,164 @@ class PydanticTypeAdapterDateTime(TypeAdapter[pendulum.DateTime]):
class PydanticModelNestedValueMixin: class PydanticModelNestedValueMixin:
"""A mixin providing methods to get and set nested values within a Pydantic model. """A mixin providing methods to get, set and track nested values within a Pydantic model.
The methods use a '/'-separated path to denote the nested values. The methods use a '/'-separated path to denote the nested values.
Supports handling `Optional`, `List`, and `Dict` types, ensuring correct initialization of Supports handling `Optional`, `List`, and `Dict` types, ensuring correct initialization of
missing attributes. missing attributes.
Example:
class Address(PydanticBaseModel):
city: str
class User(PydanticBaseModel):
name: str
address: Address
def on_city_change(old, new, path):
print(f"{path}: {old} -> {new}")
user = User(name="Alice", address=Address(city="NY"))
user.track_nested_value("address/city", on_city_change)
user.set_nested_value("address/city", "LA") # triggers callback
""" """
def track_nested_value(self, path: str, callback: Callable[[Any, str, Any, Any], None]) -> None:
"""Register a callback for a specific path (or subtree).
The callback is triggered when a value is set at this path or at any deeper path below it.
Args:
path (str): '/'-separated path to track.
callback (callable): Function called as callback(model_instance, set_path, old_value, new_value).
"""
try:
self._validate_path_structure(path)
except Exception as e:
raise ValueError(f"Path '{path}' is invalid: {e}") from e
path = path.strip("/")
# Use private data workaround
# Should be:
# _nested_value_callbacks: dict[str, list[Callable[[str, Any, Any], None]]]
# = PrivateAttr(default_factory=dict)
nested_value_callbacks = get_private_attr(self, "nested_value_callbacks", dict())
if path not in nested_value_callbacks:
nested_value_callbacks[path] = []
nested_value_callbacks[path].append(callback)
set_private_attr(self, "nested_value_callbacks", nested_value_callbacks)
logger.debug("Nested value callbacks {}", nested_value_callbacks)
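The prefix matching that decides which registered callbacks fire can be shown on plain dicts; `matching_callbacks` is an illustrative helper mirroring the loop in `set_nested_value()`:

```python
def matching_callbacks(registered: dict, set_path: str) -> list:
    """Return callbacks registered at set_path or at any of its parent paths."""
    hits = []
    for cb_path, callbacks in registered.items():
        # A callback fires when the set path equals its path or lies below it.
        if set_path == cb_path or set_path.startswith(cb_path + "/"):
            hits.extend(callbacks)
    return hits
```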
def _validate_path_structure(self, path: str) -> None:
"""Validate that a '/'-separated path is structurally valid for this model.
Checks that each segment of the path corresponds to a field or index in the model's type structure,
without requiring that all intermediate values are currently initialized. This method is intended
to ensure that the path could be valid for nested access or assignment, according to the model's
class definition.
Args:
path (str): The '/'-separated attribute/index path to validate (e.g., "address/city" or "items/0/value").
Raises:
ValueError: If any segment of the path does not correspond to a valid field in the model,
or an invalid transition is made (such as an attribute on a non-model).
Example:
class Address(PydanticBaseModel):
city: str
class User(PydanticBaseModel):
name: str
address: Address
user = User(name="Alice", address=Address(city="NY"))
user._validate_path_structure("address/city") # OK
user._validate_path_structure("address/zipcode") # Raises ValueError
"""
path_elements = path.strip("/").split("/")
# The model we are currently working on
model: Any = self
# The model we get the type information from. It is a pydantic BaseModel
parent: BaseModel = model
# The field that provides type information for the current key
# Fields may have nested types that translates to a sequence of keys, not just one
# - my_field: Optional[list[OtherModel]] -> e.g. "myfield/0" for index 0
# parent_key = ["myfield",] ... ["myfield", "0"]
# parent_key_types = [list, OtherModel]
parent_key: list[str] = []
parent_key_types: list = []
for i, key in enumerate(path_elements):
is_final_key = i == len(path_elements) - 1
# Add current key to parent key to enable nested type tracking
parent_key.append(key)
# Get next value
next_value = None
if isinstance(model, BaseModel):
# Track parent and key for possible assignment later
parent = model
parent_key = [
key,
]
parent_key_types = self._get_key_types(model.__class__, key)
# If this is the final key, set the value
if is_final_key:
return
# Attempt to access the next attribute, handling None values
next_value = getattr(model, key, None)
# Handle missing values (initialize dict/list/model if necessary)
if next_value is None:
next_type = parent_key_types[len(parent_key) - 1]
next_value = self._initialize_value(next_type)
elif isinstance(model, list):
# Handle lists
try:
idx = int(key)
except Exception as e:
raise IndexError(
f"Invalid list index '{key}' at '{path}': key = '{key}'; parent = '{parent}', parent_key = '{parent_key}'; model = '{model}'; {e}"
)
# Get next type from parent key type information
next_type = parent_key_types[len(parent_key) - 1]
if len(model) > idx:
next_value = model[idx]
else:
return
if is_final_key:
return
elif isinstance(model, dict):
# Handle dictionaries (auto-create missing keys)
# Get next type from parent key type information
next_type = parent_key_types[len(parent_key) - 1]
if is_final_key:
return
if key not in model:
return
else:
next_value = model[key]
else:
raise KeyError(f"Key '{key}' not found in model.")
# Move deeper
model = next_value
def get_nested_value(self, path: str) -> Any: def get_nested_value(self, path: str) -> Any:
"""Retrieve a nested value from the model using a '/'-separated path. """Retrieve a nested value from the model using a '/'-separated path.
@ -128,6 +298,11 @@ class PydanticModelNestedValueMixin:
model = model[int(key)] model = model[int(key)]
except (ValueError, IndexError) as e: except (ValueError, IndexError) as e:
raise IndexError(f"Invalid list index at '{path}': {key}; {e}") raise IndexError(f"Invalid list index at '{path}': {key}; {e}")
elif isinstance(model, dict):
try:
model = model[key]
except Exception as e:
raise KeyError(f"Invalid dict key at '{path}': {key}; {e}")
elif isinstance(model, BaseModel): elif isinstance(model, BaseModel):
model = getattr(model, key) model = getattr(model, key)
else: else:
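The traversal, including the dict branch added here, can be reproduced on plain data structures; `get_nested` is an illustrative stand-in for the mixin method:

```python
def get_nested(obj, path: str):
    """Walk a '/'-separated path across dicts, lists and object attributes."""
    for key in path.strip("/").split("/"):
        if isinstance(obj, list):
            try:
                obj = obj[int(key)]
            except (ValueError, IndexError) as e:
                raise IndexError(f"Invalid list index at '{path}': {key}; {e}")
        elif isinstance(obj, dict):
            try:
                obj = obj[key]
            except KeyError as e:
                raise KeyError(f"Invalid dict key at '{path}': {key}; {e}")
        else:
            obj = getattr(obj, key)
    return obj
```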
@ -142,6 +317,8 @@ class PydanticModelNestedValueMixin:
Automatically initializes missing `Optional`, `Union`, `dict`, and `list` fields if necessary. Automatically initializes missing `Optional`, `Union`, `dict`, and `list` fields if necessary.
If a missing field cannot be initialized, raises an exception. If a missing field cannot be initialized, raises an exception.
Triggers the callbacks registered by track_nested_value().
Args: Args:
path (str): A '/'-separated path to the nested attribute (e.g., "key1/key2/0"). path (str): A '/'-separated path to the nested attribute (e.g., "key1/key2/0").
value (Any): The new value to set. value (Any): The new value to set.
@ -170,6 +347,44 @@ class PydanticModelNestedValueMixin:
print(user.settings) # Output: {'theme': 'dark'} print(user.settings) # Output: {'theme': 'dark'}
``` ```
""" """
path = path.strip("/")
# Store old value (if possible)
try:
old_value = self.get_nested_value(path)
except Exception:
# The old (current) value may legitimately be missing, e.g. for a not yet
# initialized optional field - report None in that case.
old_value = None
# Proceed with core logic
self._set_nested_value(path, value)
# Trigger all callbacks whose path is a prefix of set path
triggered = set()
nested_value_callbacks = get_private_attr(self, "nested_value_callbacks", dict())
for cb_path, callbacks in nested_value_callbacks.items():
# Match: cb_path == path, or cb_path is a prefix (parent) of path
if path == cb_path or path.startswith(cb_path + "/"):
for cb in callbacks:
# Prevent duplicate calls
if (cb_path, id(cb)) not in triggered:
cb(self, path, old_value, value)
triggered.add((cb_path, id(cb)))
def _set_nested_value(self, path: str, value: Any) -> None:
"""Set a nested value core logic.
Args:
path (str): A '/'-separated path to the nested attribute (e.g., "key1/key2/0").
value (Any): The new value to set.
Raises:
KeyError: If a key is not found in the model.
IndexError: If a list index is out of bounds or invalid.
ValueError: If a validation error occurs.
TypeError: If a missing field cannot be initialized.
"""
path_elements = path.strip("/").split("/") path_elements = path.strip("/").split("/")
# The model we are currently working on # The model we are currently working on
model: Any = self model: Any = self
@ -191,6 +406,13 @@ class PydanticModelNestedValueMixin:
# Get next value # Get next value
next_value = None next_value = None
if isinstance(model, BaseModel): if isinstance(model, BaseModel):
# Track parent and key for possible assignment later
parent = model
parent_key = [
key,
]
parent_key_types = self._get_key_types(model.__class__, key)
# If this is the final key, set the value # If this is the final key, set the value
if is_final_key: if is_final_key:
try: try:
@ -199,13 +421,6 @@ class PydanticModelNestedValueMixin:
raise ValueError(f"Error updating model: {e}") from e raise ValueError(f"Error updating model: {e}") from e
return return
# Track parent and key for possible assignment later
parent = model
parent_key = [
key,
]
parent_key_types = self._get_key_types(model, key)
# Attempt to access the next attribute, handling None values # Attempt to access the next attribute, handling None values
next_value = getattr(model, key, None) next_value = getattr(model, key, None)
@ -227,7 +442,7 @@ class PydanticModelNestedValueMixin:
idx = int(key) idx = int(key)
except Exception as e: except Exception as e:
raise IndexError( raise IndexError(
f"Invalid list index '{key}' at '{path}': key = {key}; parent = {parent}, parent_key = {parent_key}; model = {model}; {e}" f"Invalid list index '{key}' at '{path}': key = '{key}'; parent = '{parent}', parent_key = '{parent_key}'; model = '{model}'; {e}"
) )
# Get next type from parent key type information # Get next type from parent key type information
@ -309,6 +524,9 @@ class PydanticModelNestedValueMixin:
Raises: Raises:
TypeError: If the key does not exist or lacks a valid type annotation. TypeError: If the key does not exist or lacks a valid type annotation.
""" """
if not inspect.isclass(model):
raise TypeError(f"Model '{model}' is not of class type.")
if key not in model.model_fields: if key not in model.model_fields:
raise TypeError(f"Field '{key}' does not exist in model '{model.__name__}'.") raise TypeError(f"Field '{key}' does not exist in model '{model.__name__}'.")
@ -408,11 +626,13 @@ class PydanticModelNestedValueMixin:
raise TypeError(f"Unsupported type hint '{type_hint}' for initialization.") raise TypeError(f"Unsupported type hint '{type_hint}' for initialization.")
class PydanticBaseModel(BaseModel, PydanticModelNestedValueMixin): class PydanticBaseModel(PydanticModelNestedValueMixin, BaseModel):
"""Base model class with automatic serialization and deserialization of `pendulum.DateTime` fields. """Base model with pendulum datetime support, nested value utilities, and stable hashing.
This model serializes pendulum.DateTime objects to ISO 8601 strings and This class provides:
deserializes ISO 8601 strings to pendulum.DateTime objects. - ISO 8601 serialization/deserialization of `pendulum.DateTime` fields.
- Nested attribute access and mutation via `PydanticModelNestedValueMixin`.
- A consistent hash using a UUID for use in sets and as dictionary keys
""" """
# Enable custom serialization globally in config # Enable custom serialization globally in config
@ -422,6 +642,17 @@ class PydanticBaseModel(BaseModel, PydanticModelNestedValueMixin):
validate_assignment=True, validate_assignment=True,
) )
_uuid: str = PrivateAttr(default_factory=lambda: str(uuid.uuid4()))
"""str: A private UUID string generated on instantiation, used for hashing."""
def __hash__(self) -> int:
"""Returns a stable hash based on the instance's UUID.
Returns:
int: Hash value derived from the model's UUID.
"""
return hash(self._uuid)
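The UUID-based hashing can be demonstrated without Pydantic; `Identified` is a hypothetical minimal class showing why the hash stays stable even when field values change:

```python
import uuid

class Identified:
    """Minimal stand-in: the hash comes from a per-instance UUID, not field values."""

    def __init__(self, value: int) -> None:
        self._uuid = str(uuid.uuid4())
        self.value = value

    def __hash__(self) -> int:
        return hash(self._uuid)

a = Identified(1)
h = hash(a)
a.value = 2  # mutating fields does not change the hash
```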
@field_validator("*", mode="before") @field_validator("*", mode="before")
def validate_and_convert_pendulum(cls, value: Any, info: ValidationInfo) -> Any: def validate_and_convert_pendulum(cls, value: Any, info: ValidationInfo) -> Any:
"""Validator to convert fields of type `pendulum.DateTime`. """Validator to convert fields of type `pendulum.DateTime`.
@ -839,3 +1070,27 @@ class PydanticDateTimeSeries(PydanticBaseModel):
class ParametersBaseModel(PydanticBaseModel): class ParametersBaseModel(PydanticBaseModel):
model_config = ConfigDict(extra="forbid") model_config = ConfigDict(extra="forbid")
def set_private_attr(
model: Union[PydanticBaseModel, PydanticModelNestedValueMixin], key: str, value: Any
) -> None:
"""Set a private attribute for a model instance (not stored in model itself)."""
if model not in _model_private_state:
_model_private_state[model] = {}
_model_private_state[model][key] = value
def get_private_attr(
model: Union[PydanticBaseModel, PydanticModelNestedValueMixin], key: str, default: Any = None
) -> Any:
"""Get a private attribute or return default."""
return _model_private_state.get(model, {}).get(key, default)
def del_private_attr(
model: Union[PydanticBaseModel, PydanticModelNestedValueMixin], key: str
) -> None:
"""Delete a private attribute."""
if model in _model_private_state and key in _model_private_state[model]:
del _model_private_state[model][key]
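The helpers above keep per-instance state outside the model itself; a self-contained sketch with a plain stand-in class (the `WeakKeyDictionary` drops an entry automatically when its model is garbage-collected):

```python
import weakref

# External per-instance state; entries disappear when the model is collected.
_private_state: "weakref.WeakKeyDictionary" = weakref.WeakKeyDictionary()

class Model:
    """Stand-in for a Pydantic model instance (must be weak-referenceable)."""

def set_attr(model, key: str, value) -> None:
    _private_state.setdefault(model, {})[key] = value

def get_attr(model, key: str, default=None):
    return _private_state.get(model, {}).get(key, default)

m = Model()
set_attr(m, "nested_value_callbacks", {"address/city": []})
```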

View File

@ -3,7 +3,6 @@ from typing import Any, Optional
import numpy as np import numpy as np
from pydantic import Field, field_validator from pydantic import Field, field_validator
from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.devices.devicesabc import ( from akkudoktoreos.devices.devicesabc import (
DeviceBase, DeviceBase,
DeviceOptimizeResult, DeviceOptimizeResult,
@ -11,8 +10,6 @@ from akkudoktoreos.devices.devicesabc import (
) )
from akkudoktoreos.utils.utils import NumpyEncoder from akkudoktoreos.utils.utils import NumpyEncoder
logger = get_logger(__name__)
def max_charging_power_field(description: Optional[str] = None) -> float: def max_charging_power_field(description: Optional[str] = None) -> float:
if description is None: if description is None:

View File

@ -1,15 +1,12 @@
from typing import Optional from typing import Optional
from akkudoktoreos.core.coreabc import SingletonMixin from akkudoktoreos.core.coreabc import SingletonMixin
from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.devices.battery import Battery from akkudoktoreos.devices.battery import Battery
from akkudoktoreos.devices.devicesabc import DevicesBase from akkudoktoreos.devices.devicesabc import DevicesBase
from akkudoktoreos.devices.generic import HomeAppliance from akkudoktoreos.devices.generic import HomeAppliance
from akkudoktoreos.devices.inverter import Inverter from akkudoktoreos.devices.inverter import Inverter
from akkudoktoreos.devices.settings import DevicesCommonSettings from akkudoktoreos.devices.settings import DevicesCommonSettings
logger = get_logger(__name__)
class Devices(SingletonMixin, DevicesBase): class Devices(SingletonMixin, DevicesBase):
def __init__(self, settings: Optional[DevicesCommonSettings] = None): def __init__(self, settings: Optional[DevicesCommonSettings] = None):

View File

@ -3,6 +3,7 @@
from enum import Enum from enum import Enum
from typing import Optional, Type from typing import Optional, Type
from loguru import logger
from pendulum import DateTime from pendulum import DateTime
from pydantic import Field, computed_field from pydantic import Field, computed_field
@ -12,12 +13,9 @@ from akkudoktoreos.core.coreabc import (
EnergyManagementSystemMixin, EnergyManagementSystemMixin,
PredictionMixin, PredictionMixin,
) )
from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.core.pydantic import ParametersBaseModel from akkudoktoreos.core.pydantic import ParametersBaseModel
from akkudoktoreos.utils.datetimeutil import to_duration from akkudoktoreos.utils.datetimeutil import to_duration
logger = get_logger(__name__)
class DeviceParameters(ParametersBaseModel): class DeviceParameters(ParametersBaseModel):
device_id: str = Field(description="ID of device", examples="device1") device_id: str = Field(description="ID of device", examples="device1")

View File

@ -3,11 +3,8 @@ from typing import Optional
import numpy as np import numpy as np
from pydantic import Field from pydantic import Field
from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.devices.devicesabc import DeviceBase, DeviceParameters from akkudoktoreos.devices.devicesabc import DeviceBase, DeviceParameters
logger = get_logger(__name__)
class HomeApplianceParameters(DeviceParameters): class HomeApplianceParameters(DeviceParameters):
"""Home Appliance Device Simulation Configuration.""" """Home Appliance Device Simulation Configuration."""

View File

@ -1,6 +1,6 @@
from typing import List, Sequence from typing import List, Sequence
from akkudoktoreos.core.logging import get_logger from loguru import logger
class Heatpump: class Heatpump:
@ -22,7 +22,6 @@ class Heatpump:
def __init__(self, max_heat_output: int, hours: int): def __init__(self, max_heat_output: int, hours: int):
self.max_heat_output = max_heat_output self.max_heat_output = max_heat_output
self.hours = hours self.hours = hours
self.log = get_logger(__name__)
def __check_outside_temperature_range__(self, temp_celsius: float) -> bool: def __check_outside_temperature_range__(self, temp_celsius: float) -> bool:
"""Check if temperature is in valid range between -100 and 100 degree Celsius. """Check if temperature is in valid range between -100 and 100 degree Celsius.
@ -59,7 +58,7 @@ class Heatpump:
f"Outside temperature '{outside_temperature_celsius}' not in range " f"Outside temperature '{outside_temperature_celsius}' not in range "
"(min: -100 Celsius, max: 100 Celsius)" "(min: -100 Celsius, max: 100 Celsius)"
) )
self.log.error(err_msg) logger.error(err_msg)
raise ValueError(err_msg) raise ValueError(err_msg)
def calculate_heating_output(self, outside_temperature_celsius: float) -> float: def calculate_heating_output(self, outside_temperature_celsius: float) -> float:
@ -87,7 +86,7 @@ class Heatpump:
f"Outside temperature '{outside_temperature_celsius}' not in range " f"Outside temperature '{outside_temperature_celsius}' not in range "
"(min: -100 Celsius, max: 100 Celsius)" "(min: -100 Celsius, max: 100 Celsius)"
) )
self.log.error(err_msg) logger.error(err_msg)
raise ValueError(err_msg) raise ValueError(err_msg)
def calculate_heat_power(self, outside_temperature_celsius: float) -> float: def calculate_heat_power(self, outside_temperature_celsius: float) -> float:
@ -111,7 +110,7 @@ class Heatpump:
f"Outside temperature '{outside_temperature_celsius}' not in range " f"Outside temperature '{outside_temperature_celsius}' not in range "
"(min: -100 Celsius, max: 100 Celsius)" "(min: -100 Celsius, max: 100 Celsius)"
) )
self.log.error(err_msg) logger.error(err_msg)
raise ValueError(err_msg) raise ValueError(err_msg)
def simulate_24h(self, temperatures: Sequence[float]) -> List[float]: def simulate_24h(self, temperatures: Sequence[float]) -> List[float]:

View File

@@ -1,13 +1,11 @@
 from typing import Optional

+from loguru import logger
 from pydantic import Field

-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.devices.devicesabc import DeviceBase, DeviceParameters
 from akkudoktoreos.prediction.interpolator import get_eos_load_interpolator

-logger = get_logger(__name__)

 class InverterParameters(DeviceParameters):
     """Inverter Device Simulation Configuration."""

View File

@@ -3,13 +3,10 @@ from typing import Optional
 from pydantic import Field

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.devices.battery import BaseBatteryParameters
 from akkudoktoreos.devices.generic import HomeApplianceParameters
 from akkudoktoreos.devices.inverter import InverterParameters

-logger = get_logger(__name__)

 class DevicesCommonSettings(SettingsBaseModel):
     """Base configuration for devices simulation settings."""

View File

@@ -9,6 +9,7 @@ The measurements can be added programmatically or imported from a file or JSON s
 from typing import Any, ClassVar, List, Optional

 import numpy as np
+from loguru import logger
 from numpydantic import NDArray, Shape
 from pendulum import DateTime, Duration
 from pydantic import Field, computed_field
@@ -16,11 +17,8 @@ from pydantic import Field, computed_field
 from akkudoktoreos.config.configabc import SettingsBaseModel
 from akkudoktoreos.core.coreabc import SingletonMixin
 from akkudoktoreos.core.dataabc import DataImportMixin, DataRecord, DataSequence
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.utils.datetimeutil import to_duration

-logger = get_logger(__name__)

 class MeasurementCommonSettings(SettingsBaseModel):
     """Measurement Configuration."""

View File

@@ -4,6 +4,7 @@ from typing import Any, Optional
 import numpy as np
 from deap import algorithms, base, creator, tools
+from loguru import logger
 from pydantic import Field, field_validator, model_validator
 from typing_extensions import Self
@@ -13,7 +14,6 @@ from akkudoktoreos.core.coreabc import (
     EnergyManagementSystemMixin,
 )
 from akkudoktoreos.core.ems import EnergyManagementParameters, SimulationResult
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import ParametersBaseModel
 from akkudoktoreos.devices.battery import (
     Battery,
@@ -25,8 +25,6 @@ from akkudoktoreos.devices.generic import HomeAppliance, HomeApplianceParameters
 from akkudoktoreos.devices.inverter import Inverter, InverterParameters
 from akkudoktoreos.utils.utils import NumpyEncoder

-logger = get_logger(__name__)

 class OptimizationParameters(ParametersBaseModel):
     ems: EnergyManagementParameters

View File

@@ -3,9 +3,6 @@ from typing import List, Optional
 from pydantic import Field

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger

-logger = get_logger(__name__)

 class OptimizationCommonSettings(SettingsBaseModel):

View File

@@ -3,11 +3,8 @@
 from pydantic import ConfigDict

 from akkudoktoreos.core.coreabc import ConfigMixin, PredictionMixin
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import PydanticBaseModel

-logger = get_logger(__name__)

 class OptimizationBase(ConfigMixin, PredictionMixin, PydanticBaseModel):
     """Base class for handling optimization data.

View File

@@ -9,11 +9,8 @@ from typing import List, Optional
 from pydantic import Field, computed_field

-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.predictionabc import PredictionProvider, PredictionRecord

-logger = get_logger(__name__)

 class ElecPriceDataRecord(PredictionRecord):
     """Represents a electricity price data record containing various price attributes at a specific datetime.

View File

@@ -11,17 +11,15 @@ from typing import Any, List, Optional, Union
 import numpy as np
 import pandas as pd
 import requests
+from loguru import logger
 from pydantic import ValidationError
 from statsmodels.tsa.holtwinters import ExponentialSmoothing

 from akkudoktoreos.core.cache import cache_in_file
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import PydanticBaseModel
 from akkudoktoreos.prediction.elecpriceabc import ElecPriceProvider
 from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration

-logger = get_logger(__name__)

 class AkkudoktorElecPriceMeta(PydanticBaseModel):
     start_timestamp: str

View File

@@ -9,15 +9,13 @@ format, enabling consistent access to forecasted and historical elecprice attrib
 from pathlib import Path
 from typing import Optional, Union

+from loguru import logger
 from pydantic import Field, field_validator

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.elecpriceabc import ElecPriceProvider
 from akkudoktoreos.prediction.predictionabc import PredictionImportProvider

-logger = get_logger(__name__)

 class ElecPriceImportCommonSettings(SettingsBaseModel):
     """Common settings for elecprice data import from file or JSON String."""

View File

@@ -5,13 +5,11 @@ from typing import Optional, Union
 from pydantic import Field, field_validator

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.loadabc import LoadProvider
 from akkudoktoreos.prediction.loadakkudoktor import LoadAkkudoktorCommonSettings
 from akkudoktoreos.prediction.loadimport import LoadImportCommonSettings
 from akkudoktoreos.prediction.prediction import get_prediction

-logger = get_logger(__name__)

 prediction_eos = get_prediction()

 # Valid load providers

View File

@@ -9,11 +9,8 @@ from typing import List, Optional
 from pydantic import Field

-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.predictionabc import PredictionProvider, PredictionRecord

-logger = get_logger(__name__)

 class LoadDataRecord(PredictionRecord):
     """Represents a load data record containing various load attributes at a specific datetime."""

View File

@@ -3,15 +3,13 @@
 from typing import Optional

 import numpy as np
+from loguru import logger
 from pydantic import Field

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.loadabc import LoadProvider
 from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_duration

-logger = get_logger(__name__)

 class LoadAkkudoktorCommonSettings(SettingsBaseModel):
     """Common settings for load data import from file."""

View File

@@ -9,15 +9,13 @@ format, enabling consistent access to forecasted and historical load attributes.
 from pathlib import Path
 from typing import Optional, Union

+from loguru import logger
 from pydantic import Field, field_validator

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.loadabc import LoadProvider
 from akkudoktoreos.prediction.predictionabc import PredictionImportProvider

-logger = get_logger(__name__)

 class LoadImportCommonSettings(SettingsBaseModel):
     """Common settings for load data import from file or JSON string."""

View File

@@ -10,6 +10,7 @@ and manipulation of configuration and prediction data in a clear, scalable, and
 from typing import List, Optional

+from loguru import logger
 from pendulum import DateTime
 from pydantic import Field, computed_field
@@ -22,11 +23,8 @@ from akkudoktoreos.core.dataabc import (
     DataRecord,
     DataSequence,
 )
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.utils.datetimeutil import to_duration

-logger = get_logger(__name__)

 class PredictionBase(DataBase, MeasurementMixin):
     """Base class for handling prediction data.

View File

@@ -5,13 +5,11 @@ from typing import Any, List, Optional, Self
 from pydantic import Field, computed_field, field_validator, model_validator

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.prediction import get_prediction
 from akkudoktoreos.prediction.pvforecastabc import PVForecastProvider
 from akkudoktoreos.prediction.pvforecastimport import PVForecastImportCommonSettings
 from akkudoktoreos.utils.docs import get_model_structure_from_examples

-logger = get_logger(__name__)

 prediction_eos = get_prediction()

 # Valid PV forecast providers

View File

@@ -7,13 +7,11 @@ Notes:
 from abc import abstractmethod
 from typing import List, Optional

+from loguru import logger
 from pydantic import Field

-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.predictionabc import PredictionProvider, PredictionRecord

-logger = get_logger(__name__)

 class PVForecastDataRecord(PredictionRecord):
     """Represents a pvforecast data record containing various pvforecast attributes at a specific datetime."""

View File

@@ -78,10 +78,10 @@ Methods:
 from typing import Any, List, Optional, Union

 import requests
+from loguru import logger
 from pydantic import Field, ValidationError, computed_field, field_validator

 from akkudoktoreos.core.cache import cache_in_file
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import PydanticBaseModel
 from akkudoktoreos.prediction.pvforecastabc import (
     PVForecastDataRecord,
@@ -89,8 +89,6 @@ from akkudoktoreos.prediction.pvforecastabc import (
 )
 from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime

-logger = get_logger(__name__)

 class AkkudoktorForecastHorizon(PydanticBaseModel):
     altitude: int

View File

@@ -9,15 +9,13 @@ format, enabling consistent access to forecasted and historical pvforecast attri
 from pathlib import Path
 from typing import Optional, Union

+from loguru import logger
 from pydantic import Field, field_validator

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.predictionabc import PredictionImportProvider
 from akkudoktoreos.prediction.pvforecastabc import PVForecastProvider

-logger = get_logger(__name__)

 class PVForecastImportCommonSettings(SettingsBaseModel):
     """Common settings for pvforecast data import from file or JSON string."""

View File

@@ -14,11 +14,8 @@ import pandas as pd
 import pvlib
 from pydantic import Field

-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.predictionabc import PredictionProvider, PredictionRecord

-logger = get_logger(__name__)

 class WeatherDataRecord(PredictionRecord):
     """Represents a weather data record containing various weather attributes at a specific datetime.

View File

@@ -13,15 +13,12 @@ import numpy as np
 import pandas as pd
 import pvlib
 import requests
+from loguru import logger

 from akkudoktoreos.core.cache import cache_in_file
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.weatherabc import WeatherDataRecord, WeatherProvider
 from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration

-logger = get_logger(__name__)

 WheaterDataBrightSkyMapping: List[Tuple[str, Optional[str], Optional[Union[str, float]]]] = [
     # brightsky_key, description, corr_factor
     ("timestamp", "DateTime", "to datetime in timezone"),

View File
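The `(brightsky_key, description, corr_factor)` tuples above map raw provider fields onto the internal record keys, optionally scaling numeric values. A simplified, hypothetical sketch of how such a mapping might be applied per record (the keys, factors, and function name below are illustrative, not the repo's actual table):

```python
from typing import List, Optional, Tuple, Union

# Hypothetical simplified mapping: (source_key, target_description, correction_factor)
MAPPING: List[Tuple[str, Optional[str], Optional[Union[str, float]]]] = [
    ("temperature", "Temperature [°C]", None),
    ("cloud_cover", "Total Clouds [% Cover]", 1.0),
    ("relative_humidity", "Relative Humidity [%]", 1.0),
]


def map_record(raw: dict) -> dict:
    """Translate one raw provider record, scaling values that carry a float factor."""
    mapped = {}
    for source_key, description, corr_factor in MAPPING:
        value = raw.get(source_key)  # missing source keys stay None
        if value is not None and isinstance(corr_factor, float):
            value = value * corr_factor
        mapped[description] = value
    return mapped
```

Keeping the mapping as a flat data table makes it easy to spot which provider fields (humidity, in this commit) are wired through to the records.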

@@ -18,15 +18,12 @@ from typing import Dict, List, Optional, Tuple
 import pandas as pd
 import requests
 from bs4 import BeautifulSoup
+from loguru import logger

 from akkudoktoreos.core.cache import cache_in_file
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.weatherabc import WeatherDataRecord, WeatherProvider
 from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration, to_timezone

-logger = get_logger(__name__)

 WheaterDataClearOutsideMapping: List[Tuple[str, Optional[str], Optional[float]]] = [
     # clearoutside_key, description, corr_factor
     ("DateTime", "DateTime", None),

View File

@@ -9,15 +9,13 @@ format, enabling consistent access to forecasted and historical weather attribut
 from pathlib import Path
 from typing import Optional, Union

+from loguru import logger
 from pydantic import Field, field_validator

 from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.predictionabc import PredictionImportProvider
 from akkudoktoreos.prediction.weatherabc import WeatherProvider

-logger = get_logger(__name__)

 class WeatherImportCommonSettings(SettingsBaseModel):
     """Common settings for weather data import from file or JSON string."""

View File

@@ -10,6 +10,7 @@ from typing import Any, Optional, Union
 import requests
 from fasthtml.common import Select
+from loguru import logger
 from monsterui.foundations import stringify
 from monsterui.franken import (  # Select, TODO: Select from FrankenUI does not work - using Select from FastHTML instead
     H3,
@@ -29,13 +30,10 @@ from monsterui.franken import (  # Select, TODO: Select from FrankenUI does not
 )
 from platformdirs import user_config_dir

-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.server.dash.components import Error, Success
 from akkudoktoreos.server.dash.configuration import get_nested_value
 from akkudoktoreos.utils.datetimeutil import to_datetime

-logger = get_logger(__name__)

 # Directory to export files to, or to import files from
 export_import_directory = Path(user_config_dir("net.akkudoktor.eosdash", "akkudoktor"))

View File

@@ -100,6 +100,7 @@ def ConfigCard(
     value: str,
     default: str,
     description: str,
+    deprecated: Optional[Union[str, bool]],
     update_error: Optional[str],
     update_value: Optional[str],
     update_open: Optional[bool],
@@ -118,6 +119,7 @@ def ConfigCard(
         value (str): The current value of the configuration.
         default (str): The default value of the configuration.
         description (str): A description of the configuration.
+        deprecated (Optional[Union[str, bool]]): The deprecated marker of the configuration.
         update_error (Optional[str]): The error message, if any, during the update process.
         update_value (Optional[str]): The value to be updated, if different from the current value.
         update_open (Optional[bool]): A flag indicating whether the update section of the card
@@ -131,6 +133,9 @@ def ConfigCard(
         update_value = value
     if not update_open:
         update_open = False
+    if deprecated:
+        if isinstance(deprecated, bool):
+            deprecated = "Deprecated"
     return Card(
         Details(
             Summary(
@@ -151,13 +156,21 @@ def ConfigCard(
             Grid(
                 P(description),
                 P(config_type),
-            ),
+            )
+            if not deprecated
+            else None,
+            Grid(
+                P(deprecated),
+                P("DEPRECATED!"),
+            )
+            if deprecated
+            else None,
             # Default
             Grid(
                 DivRAligned(P("default")),
                 P(default),
             )
-            if read_only == "rw"
+            if read_only == "rw" and not deprecated
             else None,
             # Set value
             Grid(
@@ -172,7 +185,7 @@ def ConfigCard(
                 ),
             ),
         )
-        if read_only == "rw"
+        if read_only == "rw" and not deprecated
         else None,
         # Last error
         Grid(

View File
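The normalization added at the top of `ConfigCard` is worth isolating: Pydantic's `deprecated` marker may be a bool or a deprecation message, while the card only renders strings. A standalone sketch of that logic (the function name is illustrative, not from the repo):

```python
from typing import Optional, Union


def normalize_deprecated(deprecated: Optional[Union[str, bool]]) -> Optional[str]:
    """Turn a Pydantic deprecated marker into a display string.

    A bare True becomes a generic "Deprecated" label, a deprecation message
    passes through unchanged, and falsy values mean "not deprecated".
    """
    if deprecated:
        if isinstance(deprecated, bool):
            return "Deprecated"
        return deprecated
    return None
```

With the marker normalized to `Optional[str]`, the card body only needs a single `if deprecated` check to decide between the normal and the "DEPRECATED!" grid.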

@@ -2,6 +2,7 @@ import json
 from typing import Any, Dict, List, Optional, Sequence, TypeVar, Union

 import requests
+from loguru import logger
 from monsterui.franken import (
     H3,
     H4,
@@ -22,13 +23,10 @@ from pydantic.fields import ComputedFieldInfo, FieldInfo
 from pydantic_core import PydanticUndefined

 from akkudoktoreos.config.config import ConfigEOS
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import PydanticBaseModel
 from akkudoktoreos.prediction.pvforecast import PVForecastPlaneSetting
 from akkudoktoreos.server.dash.components import ConfigCard

-logger = get_logger(__name__)

 T = TypeVar("T")

 # Latest configuration update results
@@ -166,7 +164,7 @@ def configuration(
                 if found_basic:
                     continue

-                config = {}
+                config: dict[str, Optional[Any]] = {}
                 config["name"] = ".".join(values_prefix + parent_types)
                 config["value"] = json.dumps(
                     get_nested_value(values, values_prefix + parent_types, "<unknown>")
@@ -175,6 +173,9 @@ def configuration(
                 config["description"] = (
                     subfield_info.description if subfield_info.description else ""
                 )
+                config["deprecated"] = (
+                    subfield_info.deprecated if subfield_info.deprecated else None
+                )
                 if isinstance(subfield_info, ComputedFieldInfo):
                     config["read-only"] = "ro"
                     type_description = str(subfield_info.return_type)
@@ -319,6 +320,7 @@ def ConfigPlanesCard(
             config["value"],
             config["default"],
             config["description"],
+            config["deprecated"],
             update_error,
             update_value,
             update_open,
@@ -457,6 +459,7 @@ def Configuration(
         if (
             config["type"]
             == "Optional[list[akkudoktoreos.prediction.pvforecast.PVForecastPlaneSetting]]"
+            and not config["deprecated"]
         ):
             # Special configuration for PV planes
             rows.append(
@@ -482,6 +485,7 @@ def Configuration(
             config["value"],
             config["default"],
             config["description"],
+            config["deprecated"],
             update_error,
             update_value,
             update_open,

View File
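The commit message notes that `get_nested_value()` was fixed to correctly descend into plain dicts held inside a Pydantic model. A plausible stdlib-only sketch of such a helper, under the assumption that the real one follows a key path through dicts, sequences, and attribute objects (the actual signature in the repo may differ):

```python
from typing import Any, Sequence


def get_nested_value(obj: Any, path: Sequence[str], default: Any = None) -> Any:
    """Follow a key path through nested dicts, lists, and attribute objects.

    Dicts are indexed by key, lists by integer-like keys, and anything else
    by attribute access; any miss along the way yields `default`.
    """
    current = obj
    for key in path:
        if isinstance(current, dict):
            if key not in current:
                return default
            current = current[key]
        elif isinstance(current, (list, tuple)):
            try:
                current = current[int(key)]
            except (ValueError, IndexError):
                return default
        elif hasattr(current, key):
            current = getattr(current, key)
        else:
            return default
    return current
```

The dict branch is the one the commit repaired: without it, dict-valued fields on a model could only be reached via attribute access and lookups silently fell back to the default.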

@@ -8,7 +8,6 @@ from bokeh.models import ColumnDataSource, LinearAxis, Range1d
 from bokeh.plotting import figure
 from monsterui.franken import FT, Grid, P

-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.core.pydantic import PydanticDateTimeDataFrame
 from akkudoktoreos.server.dash.bokeh import Bokeh

@@ -17,8 +16,6 @@ FILE_DEMOCONFIG = DIR_DEMODATA.joinpath("democonfig.json")
 if not FILE_DEMOCONFIG.exists():
     raise ValueError(f"File does not exist: {FILE_DEMOCONFIG}")

-logger = get_logger(__name__)

 # bar width for 1 hour bars (time given in millseconds)
 BAR_WIDTH_1HOUR = 1000 * 60 * 60

@@ -76,25 +73,37 @@ def DemoElectricityPriceForecast(predictions: pd.DataFrame, config: dict) -> FT:
     return Bokeh(plot)

-def DemoWeatherTempAir(predictions: pd.DataFrame, config: dict) -> FT:
+def DemoWeatherTempAirHumidity(predictions: pd.DataFrame, config: dict) -> FT:
     source = ColumnDataSource(predictions)
     provider = config["weather"]["provider"]

     plot = figure(
         x_axis_type="datetime",
-        y_range=Range1d(
-            predictions["weather_temp_air"].min() - 1.0, predictions["weather_temp_air"].max() + 1.0
-        ),
-        title=f"Air Temperature Prediction ({provider})",
+        title=f"Air Temperature and Humidity Prediction ({provider})",
         x_axis_label="Datetime",
         y_axis_label="Temperature [°C]",
         sizing_mode="stretch_width",
         height=400,
     )
+    # Add secondary y-axis for humidity
+    plot.extra_y_ranges["humidity"] = Range1d(start=-5, end=105)
+    y2_axis = LinearAxis(y_range_name="humidity", axis_label="Relative Humidity [%]")
+    y2_axis.axis_label_text_color = "green"
+    plot.add_layout(y2_axis, "left")
     plot.line(
         "date_time", "weather_temp_air", source=source, legend_label="Air Temperature", color="blue"
     )
+    plot.line(
+        "date_time",
+        "weather_relative_humidity",
+        source=source,
+        legend_label="Relative Humidity [%]",
+        color="green",
+        y_range_name="humidity",
+    )

     return Bokeh(plot)

@@ -150,7 +159,10 @@ def DemoLoad(predictions: pd.DataFrame, config: dict) -> FT:
         sizing_mode="stretch_width",
         height=400,
     )
-    plot.extra_y_ranges["stddev"] = Range1d(0, 1000)
+    # Add secondary y-axis for stddev
+    stddev_min = predictions["load_std"].min()
+    stddev_max = predictions["load_std"].max()
+    plot.extra_y_ranges["stddev"] = Range1d(start=stddev_min - 5, end=stddev_max + 5)
     y2_axis = LinearAxis(y_range_name="stddev", axis_label="Load Standard Deviation [W]")
     y2_axis.axis_label_text_color = "green"
     plot.add_layout(y2_axis, "left")

@@ -230,6 +242,7 @@ def Demo(eos_host: str, eos_port: Union[str, int]) -> str:
             "keys": [
                 "pvforecast_ac_power",
                 "elecprice_marketprice_kwh",
+                "weather_relative_humidity",
                 "weather_temp_air",
                 "weather_ghi",
                 "weather_dni",

@@ -260,7 +273,7 @@ def Demo(eos_host: str, eos_port: Union[str, int]) -> str:
     return Grid(
         DemoPVForecast(predictions, democonfig),
         DemoElectricityPriceForecast(predictions, democonfig),
-        DemoWeatherTempAir(predictions, democonfig),
+        DemoWeatherTempAirHumidity(predictions, democonfig),
         DemoWeatherIrradiance(predictions, democonfig),
         DemoLoad(predictions, democonfig),
         cols_max=2,


@@ -1,14 +1,13 @@
from typing import Optional, Union
import requests
from loguru import logger
from monsterui.daisy import Loading, LoadingT
from monsterui.franken import A, ButtonT, DivFullySpaced, P
from requests.exceptions import RequestException
from akkudoktoreos.config.config import get_config
-from akkudoktoreos.core.logging import get_logger
-logger = get_logger(__name__)
config_eos = get_config()


@@ -25,11 +25,13 @@ from fastapi.responses import (
RedirectResponse,
Response,
)
from loguru import logger
from akkudoktoreos.config.config import ConfigEOS, SettingsEOS, get_config
from akkudoktoreos.core.cache import CacheFileStore
from akkudoktoreos.core.ems import get_ems
-from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.core.logabc import LOGGING_LEVELS
from akkudoktoreos.core.logging import read_file_log, track_logging_config
from akkudoktoreos.core.pydantic import (
PydanticBaseModel,
PydanticDateTimeData,
@@ -56,7 +58,6 @@ from akkudoktoreos.server.server import (
)
from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration
-logger = get_logger(__name__)
config_eos = get_config()
measurement_eos = get_measurement()
prediction_eos = get_prediction()
@@ -66,6 +67,14 @@ ems_eos = get_ems()
args = None
# ----------------------
# Logging configuration
# ----------------------
logger.remove()
track_logging_config(config_eos, "logging", None, None)
config_eos.track_nested_value("/logging", track_logging_config)
# ----------------------
# EOSdash server startup
# ----------------------
@@ -177,18 +186,32 @@ def cache_save() -> dict:
return CacheFileStore().save_store()
-@repeat_every(seconds=float(config_eos.cache.cleanup_interval))
def cache_cleanup_on_exception(e: Exception) -> None:
logger.error("Cache cleanup task caught an exception: {}", e, exc_info=True)
@repeat_every(
seconds=float(config_eos.cache.cleanup_interval),
on_exception=cache_cleanup_on_exception,
)
def cache_cleanup_task() -> None:
"""Repeating task to clear cache from expired cache files."""
logger.debug("Clear cache")
cache_clear()
def energy_management_on_exception(e: Exception) -> None:
logger.error("Energy management task caught an exception: {}", e, exc_info=True)
@repeat_every(
seconds=10,
wait_first=config_eos.ems.startup_delay,
on_exception=energy_management_on_exception,
)
def energy_management_task() -> None:
"""Repeating task for energy management."""
logger.debug("Check EMS run")
ems_eos.manage_energy()
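The two hunks above wire an `on_exception` callback into each repeating task so a single failing iteration is reported instead of silently killing the loop (which is how the energy management task could stall). The pattern can be sketched with plain asyncio; `repeat_with_callback` here is a hypothetical stand-in, not the fastapi-utils decorator itself:

```python
import asyncio
from typing import Callable, Optional

def repeat_with_callback(seconds: float, on_exception: Optional[Callable[[Exception], None]] = None):
    """Hypothetical stand-in for repeat_every(..., on_exception=...): a failing
    iteration is handed to the callback and the loop keeps running."""
    def decorator(func: Callable[[], None]):
        async def looped(max_iterations: Optional[int] = None) -> None:
            n = 0
            while max_iterations is None or n < max_iterations:
                try:
                    func()
                except Exception as e:
                    # report, never re-raise: one bad run must not stop the task
                    if on_exception is not None:
                        on_exception(e)
                n += 1
                await asyncio.sleep(seconds)
        return looped
    return decorator

caught: list[Exception] = []

@repeat_with_callback(seconds=0.001, on_exception=caught.append)
def flaky_task() -> None:
    raise RuntimeError("boom")

asyncio.run(flaky_task(max_iterations=3))
```

Every iteration raises, yet the loop completes all three runs and the callback sees each exception.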
@@ -219,6 +242,7 @@ async def server_shutdown_task() -> None:
os.kill(pid, signal.SIGTERM)  # type: ignore[attr-defined,unused-ignore]
logger.info(f"🚀 EOS terminated, PID {pid}")
sys.exit(0)
@asynccontextmanager
@@ -544,6 +568,52 @@ def fastapi_config_get_key(
raise HTTPException(status_code=400, detail=str(e))
@app.get("/v1/logging/log", tags=["logging"])
async def fastapi_logging_get_log(
limit: int = Query(100, description="Maximum number of log entries to return."),
level: Optional[str] = Query(None, description="Filter by log level (e.g., INFO, ERROR)."),
contains: Optional[str] = Query(None, description="Filter logs containing this substring."),
regex: Optional[str] = Query(None, description="Filter logs by matching regex in message."),
from_time: Optional[str] = Query(
None, description="Start time (ISO format) for filtering logs."
),
to_time: Optional[str] = Query(None, description="End time (ISO format) for filtering logs."),
tail: bool = Query(False, description="If True, returns the most recent lines (tail mode)."),
) -> JSONResponse:
"""Get structured log entries from the EOS log file.
Filters and returns log entries based on the specified query parameters. The log
file is expected to contain newline-delimited JSON entries.
Args:
limit (int): Maximum number of entries to return.
level (Optional[str]): Filter logs by severity level (e.g., DEBUG, INFO).
contains (Optional[str]): Return only logs that include this string in the message.
regex (Optional[str]): Return logs that match this regular expression in the message.
from_time (Optional[str]): ISO 8601 timestamp to filter logs not older than this.
to_time (Optional[str]): ISO 8601 timestamp to filter logs not newer than this.
tail (bool): If True, fetch the most recent log entries (like `tail`).
Returns:
JSONResponse: A JSON list of log entries.
"""
log_path = config_eos.logging.file_path
try:
logs = read_file_log(
log_path=log_path,
limit=limit,
level=level,
contains=contains,
regex=regex,
from_time=from_time,
to_time=to_time,
tail=tail,
)
return JSONResponse(content=logs)
except Exception as e:
return JSONResponse(content={"error": str(e)}, status_code=500)
@app.get("/v1/measurement/keys", tags=["measurement"]) @app.get("/v1/measurement/keys", tags=["measurement"])
def fastapi_measurement_keys_get() -> list[str]: def fastapi_measurement_keys_get() -> list[str]:
"""Get a list of available measurement keys.""" """Get a list of available measurement keys."""
@ -1245,23 +1315,18 @@ Did you want to connect to <a href="{url}" class="back-button">EOSdash</a>?
return RedirectResponse(url="/docs", status_code=303) return RedirectResponse(url="/docs", status_code=303)
-def run_eos(host: str, port: int, log_level: str, access_log: bool, reload: bool) -> None:
def run_eos(host: str, port: int, log_level: str, reload: bool) -> None:
"""Run the EOS server with the specified configurations.
-This function starts the EOS server using the Uvicorn ASGI server. It accepts
-arguments for the host, port, log level, access log, and reload options. The
-log level is converted to lowercase to ensure compatibility with Uvicorn's
-expected log level format. If an error occurs while attempting to bind the
-server to the specified host and port, an error message is logged and the
-application exits.
Starts the EOS server using the Uvicorn ASGI server. Logs an error and exits if
binding to the host and port fails.
-Parameters:
Args:
-host (str): The hostname to bind the server to.
host (str): Hostname to bind the server to.
-port (int): The port number to bind the server to.
port (int): Port number to bind the server to.
-log_level (str): The log level for the server. Options include "critical", "error",
-"warning", "info", "debug", and "trace".
log_level (str): Log level for the server. One of
"critical", "error", "warning", "info", "debug", or "trace".
-access_log (bool): Whether to enable or disable the access log. Set to True to enable.
-reload (bool): Whether to enable or disable auto-reload. Set to True for development.
reload (bool): Enable or disable auto-reload. True for development.
Returns:
None
@@ -1270,20 +1335,26 @@ def run_eos(host: str, port: int, log_level: str, access_log: bool, reload: bool
if host == "0.0.0.0" and os.name == "nt":  # noqa: S104
host = "localhost"
# Setup console logging level using nested value
# - triggers logging configuration by track_logging_config
if log_level.upper() in LOGGING_LEVELS:
config_eos.set_nested_value("logging/console_level", log_level.upper())
# Wait for EOS port to be free - e.g. in case of restart
wait_for_port_free(port, timeout=120, waiting_app_name="EOS")
try:
# Let uvicorn run the fastAPI app
uvicorn.run(
"akkudoktoreos.server.eos:app",
host=host,
port=port,
-log_level=log_level.lower(),  # Convert log_level to lowercase
log_level="info",
-access_log=access_log,
access_log=True,
reload=reload,
)
except Exception as e:
-logger.error(f"Could not bind to host {host}:{port}. Error: {e}")
logger.exception("Failed to start uvicorn server.")
raise e
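The console-level update above goes through `set_nested_value("logging/console_level", ...)`, which is also what triggers the `track_logging_config` callback. The path-based access it relies on can be sketched over plain nested dicts (`get_nested_value`/`set_nested_value` here are a standalone sketch; the project's versions operate on Pydantic models and fire change-tracking callbacks):

```python
from typing import Any

def get_nested_value(data: dict, path: str) -> Any:
    """Walk a '/'-separated path like '/logging/console_level' through nested dicts."""
    node: Any = data
    for key in path.strip("/").split("/"):
        node = node[key]
    return node

def set_nested_value(data: dict, path: str, value: Any) -> None:
    """Set a leaf value, creating intermediate dicts as needed."""
    parts = path.strip("/").split("/")
    node = data
    for key in parts[:-1]:
        node = node.setdefault(key, {})
    node[parts[-1]] = value

config = {"logging": {"console_level": "NONE"}}
set_nested_value(config, "/logging/console_level", "INFO")
```

Routing all writes through one setter is what lets a single hook keep the Loguru configuration in sync with the EOS configuration.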
@@ -1298,7 +1369,7 @@ def main() -> None:
Command-line Arguments:
--host (str): Host for the EOS server (default: value from config).
--port (int): Port for the EOS server (default: value from config).
---log_level (str): Log level for the server. Options: "critical", "error", "warning", "info", "debug", "trace" (default: "info").
--log_level (str): Log level for the server console. Options: "critical", "error", "warning", "info", "debug", "trace" (default: "info").
--access_log (bool): Enable or disable access log. Options: True or False (default: False).
--reload (bool): Enable or disable auto-reload. Useful for development. Options: True or False (default: False).
"""
@@ -1322,14 +1393,8 @@ def main() -> None:
parser.add_argument(
"--log_level",
type=str,
-default="info",
default="none",
-help='Log level for the server. Options: "critical", "error", "warning", "info", "debug", "trace" (default: "info")',
help='Log level for the server console. Options: "critical", "error", "warning", "info", "debug", "trace" (default: "none")',
-)
-parser.add_argument(
-"--access_log",
-type=bool,
-default=False,
-help="Enable or disable access log. Options: True or False (default: True)",
)
parser.add_argument(
"--reload",
@@ -1344,7 +1409,7 @@ def main() -> None:
port = args.port if args.port else 8503
try:
-run_eos(host, port, args.log_level, args.access_log, args.reload)
run_eos(host, port, args.log_level, args.reload)
except:
sys.exit(1)


@@ -8,10 +8,10 @@ from typing import Optional
import psutil
import uvicorn
from fasthtml.common import FileResponse, JSONResponse
from loguru import logger
from monsterui.core import FastHTML, Theme
from akkudoktoreos.config.config import get_config
-from akkudoktoreos.core.logging import get_logger
# Pages
from akkudoktoreos.server.dash.admin import Admin
@@ -23,7 +23,6 @@ from akkudoktoreos.server.dash.footer import Footer
from akkudoktoreos.server.dash.hello import Hello
from akkudoktoreos.server.server import get_default_host, wait_for_port_free
-logger = get_logger(__name__)
config_eos = get_config()
# The favicon for EOSdash


@@ -6,12 +6,10 @@ import time
from typing import Optional, Union
import psutil
from loguru import logger
from pydantic import Field, IPvAnyAddress, field_validator
from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
-logger = get_logger(__name__)
def get_default_host() -> str:


@@ -27,13 +27,11 @@ from datetime import date, datetime, timedelta
from typing import Any, List, Literal, Optional, Tuple, Union, overload
import pendulum
from loguru import logger
from pendulum import Date, DateTime, Duration
from pendulum.tz.timezone import Timezone
from timezonefinder import TimezoneFinder
-from akkudoktoreos.core.logging import get_logger
-logger = get_logger(__name__)
MAX_DURATION_STRING_LENGTH = 350
@@ -154,22 +152,22 @@ def to_datetime(
fmt_tz = f"{fmt} z"
dt_tz = f"{date_input} {in_timezone}"
dt = pendulum.from_format(dt_tz, fmt_tz)
-logger.debug(
logger.trace(
f"Str Fmt converted: {dt}, tz={dt.tz} from {date_input}, tz={in_timezone}"
)
break
except ValueError as e:
-logger.debug(f"{date_input}, {fmt}, {e}")
logger.trace(f"{date_input}, {fmt}, {e}")
dt = None
else:
# DateTime input with timezone info
try:
dt = pendulum.parse(date_input)
-logger.debug(
logger.trace(
f"Pendulum Fmt converted: {dt}, tz={dt.tz} from {date_input}, tz={in_timezone}"
)
except pendulum.parsing.exceptions.ParserError as e:
-logger.debug(f"Date string {date_input} does not match any Pendulum formats: {e}")
logger.trace(f"Date string {date_input} does not match any Pendulum formats: {e}")
dt = None
if dt is None:
# Some special values
@@ -181,7 +179,7 @@ def to_datetime(
timestamp = float(date_input)
dt = pendulum.from_timestamp(timestamp, tz="UTC")
except (ValueError, TypeError) as e:
-logger.debug(f"Date string {date_input} does not match timestamp format: {e}")
logger.trace(f"Date string {date_input} does not match timestamp format: {e}")
dt = None
if dt is None:
raise ValueError(f"Date string {date_input} does not match any known formats.")
@@ -202,7 +200,7 @@ def to_datetime(
# Represent in target timezone
dt_in_tz = dt.in_timezone(in_timezone)
-logger.debug(
logger.trace(
f"\nTimezone adapted to: {in_timezone}\nfrom: {dt} tz={dt.timezone}\nto: {dt_in_tz} tz={dt_in_tz.tz}"
)
dt = dt_in_tz
@@ -277,7 +275,7 @@ def to_duration(
duration = pendulum.parse(input_value)
return duration - duration.start_of("day")
except pendulum.parsing.exceptions.ParserError as e:
-logger.debug(f"Invalid Pendulum time string format '{input_value}': {e}")
logger.trace(f"Invalid Pendulum time string format '{input_value}': {e}")
# Handle strings like "2 days 5 hours 30 minutes"
total_seconds = 0


@@ -35,6 +35,8 @@ def get_model_structure_from_examples(
example_data: list[dict[str, Any]] = [{} for _ in range(example_max_length)]
for field_name, field_info in model_class.model_fields.items():
if field_info.deprecated:
continue
for example_ix in range(example_max_length):
example_data[example_ix][field_name] = get_example_or_default(
field_name, field_info, example_ix
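The added `if field_info.deprecated: continue` keeps deprecated fields out of generated documentation examples. The idea can be sketched with a hypothetical stand-in for pydantic's `FieldInfo` (`FieldStub` and `example_field_names` are illustrative names, not the project's or pydantic's API):

```python
from dataclasses import dataclass

@dataclass
class FieldStub:
    """Hypothetical stand-in for pydantic's FieldInfo, keeping only 'deprecated'."""
    deprecated: bool = False

# stand-in for model_class.model_fields
model_fields = {"console_level": FieldStub(), "level": FieldStub(deprecated=True)}

def example_field_names(fields: dict) -> list[str]:
    # deprecated fields still validate, but should not be filled with example data
    return [name for name, info in fields.items() if not info.deprecated]
```

This way a field can stay accepted for backward compatibility without being advertised in fresh example configurations.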


@@ -4,9 +4,6 @@ from typing import Any
import numpy as np
from akkudoktoreos.config.configabc import SettingsBaseModel
-from akkudoktoreos.core.logging import get_logger
-logger = get_logger(__name__)
class UtilsCommonSettings(SettingsBaseModel):


@@ -13,16 +13,17 @@ from matplotlib.backends.backend_pdf import PdfPages
from akkudoktoreos.core.coreabc import ConfigMixin
from akkudoktoreos.core.ems import EnergyManagement
-from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.optimization.genetic import OptimizationParameters
from akkudoktoreos.utils.datetimeutil import to_datetime
-logger = get_logger(__name__)
matplotlib.use(
"Agg"
)  # non-interactive backend that can only write to files, backend needed to stay in main thread.
debug_visualize: bool = False
class VisualizationReport(ConfigMixin):
def __init__(
self,
@@ -440,6 +441,8 @@ def prepare_visualize(
filename: str = "visualization_results.pdf",
start_hour: int = 0,
) -> None:
global debug_visualize
report = VisualizationReport(filename)
next_full_hour_date = EnergyManagement.set_start_datetime()
# Group 1:
@@ -642,7 +645,7 @@ def prepare_visualize(
if filtered_balance.size > 0 or filtered_losses.size > 0:
report.finalize_group()
-if logger.level == "DEBUG" or results["fixed_seed"]:
if debug_visualize or results["fixed_seed"]:
report.create_line_chart(
0,
[
@@ -667,6 +670,8 @@ def prepare_visualize(
def generate_example_report(filename: str = "example_report.pdf") -> None:
"""Generate example visualization report."""
global debug_visualize
report = VisualizationReport(filename, "test") report = VisualizationReport(filename, "test")
x_hours = 0 # Define x-axis start values (e.g., hours) x_hours = 0 # Define x-axis start values (e.g., hours)
@@ -738,9 +743,9 @@ def generate_example_report(filename: str = "example_report.pdf") -> None:
report.finalize_group()  # Finalize the third group of charts
-logger.setLevel("DEBUG")  # set level for example report
debug_visualize = True  # set level for example report
-if logger.level == "DEBUG":
if debug_visualize:
report.create_line_chart(
x_hours,
[np.array([0.2, 0.25, 0.3, 0.35])],


@@ -16,35 +16,75 @@ import pendulum
import psutil
import pytest
import requests
from _pytest.logging import LogCaptureFixture
from loguru import logger
from xprocess import ProcessStarter, XProcess
from akkudoktoreos.config.config import ConfigEOS, get_config
-from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.server.server import get_default_host
-logger = get_logger(__name__)
# -----------------------------------------------
# Adapt pytest logging handling to Loguru logging
# -----------------------------------------------
@pytest.fixture
def caplog(caplog: LogCaptureFixture):
"""Propagate Loguru logs to the pytest caplog handler."""
handler_id = logger.add(
caplog.handler,
format="{message}",
level=0,
filter=lambda record: record["level"].no >= caplog.handler.level,
enqueue=False, # Set to 'True' if your test is spawning child processes.
)
yield caplog
try:
logger.remove(handler_id)
except:
# May already be deleted
pass
@pytest.fixture
def reportlog(pytestconfig):
"""Propagate Loguru logs to the pytest terminal reporter."""
logging_plugin = pytestconfig.pluginmanager.getplugin("logging-plugin")
handler_id = logger.add(logging_plugin.report_handler, format="{message}")
yield
try:
logger.remove(handler_id)
except:
# May already be deleted
pass
@pytest.fixture(autouse=True)
def propagate_logs():
"""Deal with the pytest --log-cli-level command-line flag.
This option controls the standard logging logs, not loguru ones.
For this reason, we first install a PropagateHandler for compatibility.
"""
class PropagateHandler(logging.Handler):
def emit(self, record):
if logging.getLogger(record.name).isEnabledFor(record.levelno):
logging.getLogger(record.name).handle(record)
logger.remove()
logger.add(PropagateHandler(), format="{message}")
yield
@pytest.fixture()
def disable_debug_logging(scope="session", autouse=True):
"""Automatically disable debug logging for all tests."""
-original_levels = {}
logger.remove()  # Remove all loggers
-root_logger = logging.getLogger()
logger.add(sys.stderr, level="INFO")  # Only show INFO and above
-original_levels[root_logger] = root_logger.level
-root_logger.setLevel(logging.INFO)
-for logger_name, logger in logging.root.manager.loggerDict.items():
-if isinstance(logger, logging.Logger):
-original_levels[logger] = logger.level
-if logger.level <= logging.DEBUG:
-logger.setLevel(logging.INFO)
-yield
-for logger, level in original_levels.items():
-logger.setLevel(level)
# -----------------------------------------------
# Provide pytest options for specific test setups
# -----------------------------------------------
def pytest_addoption(parser):
parser.addoption(
@@ -144,6 +184,7 @@ def cfg_non_existent(request):
@pytest.fixture(autouse=True)
def user_cwd(config_default_dirs):
"""Patch cwd provided by module pathlib.Path.cwd."""
with patch(
"pathlib.Path.cwd",
return_value=config_default_dirs[1],
@@ -153,6 +194,7 @@ def user_cwd(config_default_dirs):
@pytest.fixture(autouse=True)
def user_config_dir(config_default_dirs):
"""Patch user_config_dir provided by module platformdirs."""
with patch(
"akkudoktoreos.config.config.user_config_dir",
return_value=str(config_default_dirs[0]),
@@ -162,6 +204,7 @@ def user_config_dir(config_default_dirs):
@pytest.fixture(autouse=True)
def user_data_dir(config_default_dirs):
"""Patch user_data_dir provided by module platformdirs."""
with patch(
"akkudoktoreos.config.config.user_data_dir",
return_value=str(config_default_dirs[-1] / "data"),
@@ -189,14 +232,18 @@ def config_eos(
config_file_cwd = config_default_dirs[1] / ConfigEOS.CONFIG_FILE_NAME
assert not config_file.exists()
assert not config_file_cwd.exists()
config_eos = get_config()
config_eos.reset_settings()
assert config_file == config_eos.general.config_file_path
assert config_file.exists()
assert not config_file_cwd.exists()
# Check user data directory pathes (config_default_dirs[-1] == data_default_dir_user)
assert config_default_dirs[-1] / "data" == config_eos.general.data_folder_path assert config_default_dirs[-1] / "data" == config_eos.general.data_folder_path
assert config_default_dirs[-1] / "data/cache" == config_eos.cache.path() assert config_default_dirs[-1] / "data/cache" == config_eos.cache.path()
assert config_default_dirs[-1] / "data/output" == config_eos.general.data_output_path assert config_default_dirs[-1] / "data/output" == config_eos.general.data_output_path
assert config_default_dirs[-1] / "data/output/eos.log" == config_eos.logging.file_path
return config_eos


@@ -4,12 +4,10 @@ from typing import Union
from unittest.mock import patch
import pytest
from loguru import logger
from pydantic import ValidationError
from akkudoktoreos.config.config import ConfigEOS, GeneralSettings
-from akkudoktoreos.core.logging import get_logger
-logger = get_logger(__name__)
# overwrite config_mixin fixture from conftest


@@ -156,7 +156,7 @@ class TestDataRecord:
assert "data_value" in record_dict
assert record_dict["data_value"] == 20.0
record2 = DerivedRecord.from_dict(record_dict)
-assert record2 == record
assert record2.model_dump() == record.model_dump()
def test_to_json(self):
record = self.create_test_record(datetime(2024, 1, 3, tzinfo=timezone.utc), 10.0)
@@ -165,7 +165,7 @@
assert "data_value" in json_str
assert "20.0" in json_str
record2 = DerivedRecord.from_json(json_str)
-assert record2 == record
assert record2.model_dump() == record.model_dump()
class TestDataSequence:
@@ -526,7 +526,7 @@ class TestDataSequence:
data_dict = sequence.to_dict()
assert isinstance(data_dict, dict)
sequence_other = sequence.from_dict(data_dict)
-assert sequence_other == sequence
assert sequence_other.model_dump() == sequence.model_dump()
def test_to_json(self, sequence):
record = self.create_test_record(datetime(2023, 11, 6), 0.8)


@@ -5,10 +5,10 @@ from unittest.mock import Mock, patch
import numpy as np
import pytest
import requests
from loguru import logger
from akkudoktoreos.core.cache import CacheFileStore
from akkudoktoreos.core.ems import get_ems
-from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.prediction.elecpriceakkudoktor import (
AkkudoktorElecPrice,
AkkudoktorElecPriceValue,
@@ -22,8 +22,6 @@ FILE_TESTDATA_ELECPRICEAKKUDOKTOR_1_JSON = DIR_TESTDATA.joinpath(
"elecpriceforecast_akkudoktor_1.json"
)
-logger = get_logger(__name__)
@pytest.fixture
def provider(monkeypatch, config_eos):
@@ -94,7 +92,7 @@ def test_request_forecast(mock_get, provider, sample_akkudoktor_1_json):
akkudoktor_data = provider._request_forecast()
assert isinstance(akkudoktor_data, AkkudoktorElecPrice)
-assert akkudoktor_data.values[0] == AkkudoktorElecPriceValue(
assert akkudoktor_data.values[0].model_dump() == AkkudoktorElecPriceValue(
start_timestamp=1733785200000,
end_timestamp=1733788800000,
start="2024-12-09T23:00:00.000Z",
@@ -102,7 +100,7 @@ def test_request_forecast(mock_get, provider, sample_akkudoktor_1_json):
marketprice=92.85,
unit="Eur/MWh",
marketpriceEurocentPerKWh=9.29,
-)
).model_dump()
@patch("requests.get")


@@ -1,69 +1,35 @@
"""Test Module for logging Module."""
import logging
-from logging.handlers import RotatingFileHandler
import os
import sys
from pathlib import Path
from unittest.mock import patch
import pytest
from loguru import logger
-from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.core.logging import track_logging_config
# -----------------------------
-# get_logger
# logsettings
# -----------------------------
class TestLoggingCommonSettings:
def teardown_method(self):
"""Reset Loguru after each test to avoid handler contamination."""
logger.remove()
def test_valid_console_level_sets_logging(self, config_eos, caplog):
config_eos.track_nested_value("/logging", track_logging_config)
config_eos.set_nested_value("/logging/console_level", "INFO")
assert config_eos.get_nested_value("/logging/console_level") == "INFO"
assert config_eos.logging.console_level == "INFO"
assert any("console: INFO" in message for message in caplog.messages)
def test_valid_console_level_calls_tracking_callback(self, config_eos):
with patch("akkudoktoreos.core.logging.track_logging_config") as mock_setup:
config_eos.track_nested_value("/logging", mock_setup)
config_eos.set_nested_value("/logging/console_level", "INFO")
assert config_eos.get_nested_value("/logging/console_level") == "INFO"
assert config_eos.logging.console_level == "INFO"
mock_setup.assert_called_once()
-def test_get_logger_console_logging():
-"""Test logger creation with console logging."""
-logger = get_logger("test_logger", logging_level="DEBUG")
-# Check logger name
-assert logger.name == "test_logger"
-# Check logger level
-assert logger.level == logging.DEBUG
-# Check console handler is present
-assert len(logger.handlers) == 1
-assert isinstance(logger.handlers[0], logging.StreamHandler)
-def test_get_logger_file_logging(tmpdir):
-"""Test logger creation with file logging."""
-log_file = Path(tmpdir).joinpath("test.log")
-logger = get_logger("test_logger", log_file=str(log_file), logging_level="WARNING")
-# Check logger name
-assert logger.name == "test_logger"
-# Check logger level
-assert logger.level == logging.WARNING
-# Check console handler is present
-assert len(logger.handlers) == 2  # One for console and one for file
-assert isinstance(logger.handlers[0], logging.StreamHandler)
-assert isinstance(logger.handlers[1], RotatingFileHandler)
-# Check file existence
-assert log_file.exists()
-def test_get_logger_no_file_logging():
-"""Test logger creation without file logging."""
-logger = get_logger("test_logger")
-# Check logger name
-assert logger.name == "test_logger"
-# Check logger level
-assert logger.level == logging.INFO
-# Check no file handler is present
-assert len(logger.handlers) >= 1  # First is console handler (maybe be pytest handler)
-assert isinstance(logger.handlers[0], logging.StreamHandler)
-def test_get_logger_with_invalid_level():
-"""Test logger creation with an invalid logging level."""
-with pytest.raises(ValueError, match="Unknown loggin level: INVALID"):
-logger = get_logger("test_logger", logging_level="INVALID")


@@ -3,9 +3,9 @@ from pathlib import Path
 from unittest.mock import Mock, patch

 import pytest
+from loguru import logger

 from akkudoktoreos.core.ems import get_ems
-from akkudoktoreos.core.logging import get_logger
 from akkudoktoreos.prediction.prediction import get_prediction
 from akkudoktoreos.prediction.pvforecastakkudoktor import (
     AkkudoktorForecastHorizon,
@@ -24,8 +24,6 @@ FILE_TESTDATA_PV_FORECAST_INPUT_SINGLE_PLANE = DIR_TESTDATA.joinpath(
 )
 FILE_TESTDATA_PV_FORECAST_RESULT_1 = DIR_TESTDATA.joinpath("pv_forecast_result_1.txt")

-logger = get_logger(__name__)
-

 @pytest.fixture
 def sample_settings(config_eos):


@@ -66,6 +66,11 @@ class TestPydanticModelNestedValueMixin:
         with pytest.raises(TypeError):
             PydanticModelNestedValueMixin._get_key_types(User, "unknown_field")

+    def test_get_key_types_for_instance_raises(self, user_instance):
+        """Test _get_key_types raises an error for an instance."""
+        with pytest.raises(TypeError):
+            PydanticModelNestedValueMixin._get_key_types(user_instance, "unknown_field")
+
     def test_set_nested_value_in_model(self, user_instance):
         """Test setting nested value in a model field (Address -> city)."""
         assert user_instance.addresses is None
@@ -123,7 +128,7 @@ class TestPydanticModelNestedValueMixin:
         """Test attempting to set value for a non-existent field."""
         user = User(name="John")

-        with pytest.raises(ValueError):
+        with pytest.raises(TypeError):
             user.set_nested_value("non_existent_field", "Some Value")

     def test_set_nested_value_with_invalid_type(self, user_instance):
@@ -144,6 +149,91 @@ class TestPydanticModelNestedValueMixin:
             "The first address should be an instance of Address"
         )

+    def test_track_nested_value_simple_callback(self, user_instance):
+        user_instance.set_nested_value("addresses/0/city", "NY")
+        assert user_instance.addresses is not None
+        assert user_instance.addresses[0].city == "NY"
+
+        callback_calls = []
+
+        def cb(model, path, old, new):
+            callback_calls.append((path, old, new))
+
+        user_instance.track_nested_value("addresses/0/city", cb)
+        user_instance.set_nested_value("addresses/0/city", "LA")
+
+        assert user_instance.addresses is not None
+        assert user_instance.addresses[0].city == "LA"
+        assert callback_calls == [("addresses/0/city", "NY", "LA")]
+
+    def test_track_nested_value_prefix_triggers(self, user_instance):
+        user_instance.set_nested_value("addresses/0", Address(city="Berlin", postal_code="10000"))
+        assert user_instance.addresses is not None
+        assert user_instance.addresses[0].city == "Berlin"
+
+        cb_prefix = []
+        cb_exact = []
+
+        def cb1(model, path, old, new):
+            cb_prefix.append((path, old, new))
+
+        def cb2(model, path, old, new):
+            cb_exact.append((path, old, new))
+
+        user_instance.track_nested_value("addresses/0", cb1)
+        user_instance.track_nested_value("addresses/0/city", cb2)
+        user_instance.set_nested_value("addresses/0/city", "Munich")
+
+        assert user_instance.addresses is not None
+        assert user_instance.addresses[0].city == "Munich"
+        # Both callbacks should be triggered
+        assert cb_prefix == [("addresses/0/city", "Berlin", "Munich")]
+        assert cb_exact == [("addresses/0/city", "Berlin", "Munich")]
+
+    def test_track_nested_value_multiple_callbacks_same_path(self, user_instance):
+        user_instance.set_nested_value("addresses/0/city", "Berlin")
+
+        calls1 = []
+        calls2 = []
+        user_instance.track_nested_value("addresses/0/city", lambda lib, path, o, n: calls1.append((path, o, n)))
+        user_instance.track_nested_value("addresses/0/city", lambda lib, path, o, n: calls2.append((path, o, n)))
+        user_instance.set_nested_value("addresses/0/city", "Stuttgart")
+
+        assert calls1 == [("addresses/0/city", "Berlin", "Stuttgart")]
+        assert calls2 == [("addresses/0/city", "Berlin", "Stuttgart")]
+
+    def test_track_nested_value_invalid_path_raises(self, user_instance):
+        with pytest.raises(ValueError) as excinfo:
+            user_instance.track_nested_value("unknown_field", lambda model, path, o, n: None)
+        assert "is invalid" in str(excinfo.value)
+
+        with pytest.raises(ValueError) as excinfo:
+            user_instance.track_nested_value("unknown_field/0/city", lambda model, path, o, n: None)
+        assert "is invalid" in str(excinfo.value)
+
+    def test_track_nested_value_list_and_dict_path(self):
+        class Book(PydanticBaseModel):
+            title: str
+
+        class Library(PydanticBaseModel):
+            books: list[Book]
+            meta: dict[str, str] = {}
+
+        lib = Library(books=[Book(title="A")], meta={"location": "center"})
+        assert lib.meta["location"] == "center"
+
+        calls = []
+        # For list, only root attribute structure is checked, not indices
+        lib.track_nested_value("books/0/title", lambda lib, path, o, n: calls.append((path, o, n)))
+        lib.set_nested_value("books/0/title", "B")
+        assert lib.books[0].title == "B"
+        assert calls == [("books/0/title", "A", "B")]
+
+        # For dict, only root attribute structure is checked
+        meta_calls = []
+        lib.track_nested_value("meta/location", lambda lib, path, o, n: meta_calls.append((path, o, n)))
+        assert lib.meta["location"] == "center"
+        lib.set_nested_value("meta/location", "north")
+        assert lib.meta["location"] == "north"
+        assert meta_calls == [("meta/location", "center", "north")]
+
+
 class TestPydanticBaseModel:
     def test_valid_pendulum_datetime(self):
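The `track_nested_value` tests added in this commit pin down a small observer contract: a callback registered for a path prefix fires on any change at or below that prefix and receives `(model, path, old_value, new_value)`. A self-contained toy sketch of that contract (a dict-backed stand-in for illustration, not the actual EOS Pydantic mixin) might look like:

```python
from typing import Any, Callable


class TrackedModel:
    """Toy sketch of path-based change tracking with prefix-matched callbacks."""

    def __init__(self, data: dict[str, Any]):
        self._data = data
        self._trackers: list[tuple[str, Callable]] = []

    def track_nested_value(self, path: str, callback: Callable) -> None:
        """Register a callback for changes at or below the given path prefix."""
        self._trackers.append((path.strip("/"), callback))

    def get_nested_value(self, path: str) -> Any:
        node = self._data
        for key in path.strip("/").split("/"):
            node = node[key]
        return node

    def set_nested_value(self, path: str, value: Any) -> None:
        keys = path.strip("/").split("/")
        node = self._data
        for key in keys[:-1]:
            node = node.setdefault(key, {})
        old = node.get(keys[-1])
        node[keys[-1]] = value
        full = "/".join(keys)
        for prefix, cb in self._trackers:
            # Fire on an exact match or when the changed path lies below the prefix.
            if full == prefix or full.startswith(prefix + "/"):
                cb(self, full, old, value)
```

Registering a callback on `/logging` and then changing `/logging/console_level`, as the logging tests in this commit do with `track_logging_config`, would invoke the callback with the full changed path plus the old and new values.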