Mirror of https://github.com/Akkudoktor-EOS/EOS.git, synced 2025-10-29 13:56:21 +00:00
fix: automatic optimization (#596)
This fix implements the long-term goal of having the EOS server run optimization (or
energy management) automatically at regular intervals. Clients can thus request the
current energy management plan at any time, and the plan is updated at regular
intervals without any interaction by the client.
This fix started out to "only" make automatic optimization (or energy management)
runs work. It turned out that several endpoints update predictions or run the
optimization in some way. To lock against such concurrent attempts, the code had to
be refactored to allow control of execution. During refactoring it became clear that
some classes and files were named without a proper reference to their usage, so
renaming became necessary in addition to the refactoring. The names are still not
the best, but I hope they are more intuitive.
The fix includes several bug fixes that are not directly related to the automatic
optimization but are necessary to keep EOS running properly for the automatic
optimization and to test and document the changes.
This is a breaking change as the configuration structure changed once again and
the server API was also enhanced and streamlined. The server API that is used by
Andreas and Jörg in their videos has not changed.
* fix: automatic optimization
Allow the optimization to run automatically at configured intervals, gathering all
optimization parameters from configuration and predictions. The automatic run can be
configured to only run prediction updates, skipping the optimization. Extend the
documentation to also cover automatic optimization. Lock automatic runs against runs
initiated by the /optimize or other endpoints. Provide new endpoints to retrieve the
energy management plan and the genetic solution of the latest automatic optimization
run. Offload energy management to a thread pool executor to keep the app more
responsive during the CPU-heavy optimization run.
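For illustration, a minimal sketch of the offloading pattern (names are hypothetical,
not the actual EOS symbols): the CPU-heavy run goes to a worker thread via
run_in_executor, and an asyncio lock serializes automatic runs against
endpoint-triggered ones.

    import asyncio
    from concurrent.futures import ThreadPoolExecutor

    _executor = ThreadPoolExecutor(max_workers=1)  # one optimization at a time
    _run_lock = asyncio.Lock()  # serializes interval runs and /optimize calls

    def run_optimization() -> dict:
        # Placeholder for the CPU-heavy genetic optimization run.
        return {"status": "done"}

    async def energy_management_run() -> dict:
        async with _run_lock:
            loop = asyncio.get_running_loop()
            # The event loop stays free for health checks and other requests.
            return await loop.run_in_executor(_executor, run_optimization)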
* fix: EOS servers recognize environment variables on startup
Force initialisation of the EOS configuration on server startup to ensure all
sources of EOS configuration are properly set up and read. Adapt server tests and
configuration tests to also test configuration via environment variables.
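The environment variable naming follows the nested configuration structure, as
exercised by the adapted tests below (EOS_<SECTION>__<FIELD>):

    import os

    os.environ["EOS_SERVER__PORT"] = "8553"          # maps to server.port
    os.environ["EOS_SERVER__EOSDASH_PORT"] = "8555"  # maps to server.eosdash_port
    # Start EOS afterwards; the forced config initialisation picks these up.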
* fix: Remove 0.0.0.0 to localhost translation under Windows
EOS imposed a 0.0.0.0-to-localhost translation under Windows for convenience. This
caused some trouble in user configurations. Now that the default IP address
configuration is 127.0.0.1, the user is responsible for setting up a correct,
Windows-compliant IP address.
* fix: allow host names in addition to IP addresses
* fix: access pydantic model fields by class
Access by instance is deprecated.
* fix: down sampling key_to_array
* fix: make cache clear endpoint clear all cache files
Make /v1/admin/cache/clear clear all cache files. Before, it only cleared expired
cache files by default. Add the new endpoint /v1/admin/cache/clear-expired to clear
only the expired cache files.
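Example client calls (a sketch; assuming the server runs on the default
127.0.0.1:8503 and the admin endpoints accept POST):

    import requests

    base = "http://127.0.0.1:8503"
    requests.post(f"{base}/v1/admin/cache/clear")          # clears ALL cache files
    requests.post(f"{base}/v1/admin/cache/clear-expired")  # clears only expired files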
* fix: timezonefinder returns Europe/Paris instead of Europe/Berlin
timezonefinder 8.10 became less accurate for timezones in Europe, as it now maps
several countries to a common timezone (e.g. Europe/Paris instead of Europe/Berlin).
Use the new package tzfpy instead, which still returns Europe/Berlin if you are in
Germany. tzfpy also claims to be faster than timezonefinder.
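A quick check of the replacement package (note that tzfpy takes longitude first):

    from tzfpy import get_tz

    print(get_tz(13.405, 52.52))  # Berlin coordinates -> "Europe/Berlin"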
* fix: provider settings configuration
Provider configuration used to be a union holding the settings for several
providers. Pydantic union handling does not always find the correct type for a
provider setting, which led to exceptions in specific configurations. Now provider
settings are explicit configuration items, one for each possible provider. This is
a breaking change, as the configuration structure was changed.
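A simplified illustration with hypothetical models of why the union was fragile: a
payload that fits several members lets pydantic pick the wrong one, while explicit
per-provider items are unambiguous.

    from typing import Optional, Union
    from pydantic import BaseModel

    class ProviderA(BaseModel):
        url: str

    class ProviderB(BaseModel):
        url: str
        token: Optional[str] = None

    class SettingsBefore(BaseModel):
        provider: Union[ProviderA, ProviderB]  # {"url": "..."} matches both members

    class SettingsAfter(BaseModel):
        provider_a: Optional[ProviderA] = None  # explicit item per provider
        provider_b: Optional[ProviderB] = None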
* fix: ClearOutside weather prediction irradiance calculation
Pvlib needs a pandas time index, so the time index is converted accordingly.
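Sketch of the conversion (the exact call site differs in the provider code):

    import pandas as pd

    times = ["2025-10-29T12:00:00+01:00", "2025-10-29T13:00:00+01:00"]
    index = pd.DatetimeIndex(pd.to_datetime(times))  # tz-aware index as pvlib expects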
* fix: test config file priority
Do not use the config_eos fixture, as this fixture already creates a config file.
* fix: optimization sample request documentation
Provide all data in the documentation of the optimization sample request.
* fix: gitlint blocking pip dependency resolution
Replace gitlint with commitizen. Gitlint is not actively maintained anymore, and
its dependencies blocked pip from resolving dependencies.
* fix: sync pre-commit config to actual dependency requirements
.pre-commit-config.yaml was out of sync, as was requirements-dev.txt.
* fix: missing babel in requirements.txt
Add babel to requirements.txt
* feat: setup default device configuration for automatic optimization
In case the parameters for automatic optimization are not fully defined, a default
configuration is set up to allow the automatic energy management run. The default
configuration may help the user to correctly define the device configuration.
* feat: allow configuration of genetic algorithm parameters
The genetic algorithm parameters for the number of individuals, the number of
generations, the seed, and the penalty function parameters are now available as
configuration options.
* feat: allow configuration of home appliance time windows
The time windows in which a home appliance is allowed to run are now configurable
via the configuration (for the /v1 API) and also via the home appliance parameters
(for the classic /optimize API). If there is no such configuration, the time window
defaults to the optimization hours, which was the standard before this change.
Documentation on how to configure time windows is added.
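A hedged configuration sketch (the key names are assumptions; see the added
documentation for the real schema): restricting an appliance to a morning window
via the /v1 configuration.

    settings = {
        "devices": {
            "home_appliances": [
                {
                    "time_windows": [  # hypothetical key
                        {"start_time": "08:00", "duration": "4 hours"},
                    ],
                }
            ],
        }
    }
    config_eos.merge_settings_from_dict(settings)  # as used in the tests below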
* feat: standardize measurement keys for battery/EV SoC measurements
The standardized measurement keys to report battery SoC to the device
simulations can now be retrieved from the device configuration as a
read-only config option.
* feat: feed-in tariff prediction
Add the feed-in tariff predictions needed for automatic optimization. The feed-in
tariff can be retrieved as a fixed feed-in tariff or can be imported. Also add
tests for the different feed-in tariff providers. Extend the documentation to
cover the feed-in tariff providers.
* feat: add energy management plan based on S2 standard instructions
EOS can generate an energy management plan as a list of simple instructions, which
may be retrieved via the /v1/energy-management/plan endpoint. The instructions
loosely follow the S2 energy management standard.
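Retrieving the plan is a plain GET; the response layout sketched here (e.g. an
"instructions" list) is an assumption, not the verified schema:

    import requests

    plan = requests.get("http://127.0.0.1:8503/v1/energy-management/plan", timeout=2).json()
    for instruction in plan.get("instructions", []):
        print(instruction)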
* feat: make measurement keys configurable by EOS configuration.
The fixed measurement keys are replaced by configurable measurement keys.
* feat: make pendulum DateTime, Date, Duration types usable for pydantic models
Use pydantic_extra_types.pendulum_dt to get pydantic pendulum types. The types are
added to the datetimeutil utility. Remove the custom-made pendulum adaptations
from the EOS pydantic module. Make EOS modules use the pydantic pendulum types
managed by the datetimeutil module instead of the core pendulum types.
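The pydantic-aware pendulum types come from pydantic_extra_types, roughly like this
(the field names are illustrative):

    from pydantic import BaseModel
    from pydantic_extra_types.pendulum_dt import DateTime, Duration

    class Interval(BaseModel):
        start: DateTime   # parses ISO strings into pendulum.DateTime
        length: Duration  # parses ISO 8601 durations into pendulum.Duration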
* feat: Add Time, TimeWindow, TimeWindowSequence and to_time to datetimeutil.
The time window classes are added to support the home appliance time window
configuration. All time classes are also pydantic models. Time is the base class
for time definitions and is derived from pendulum.Time.
* feat: Extend DataRecord by configurable field-like data.
Configurable field-like data was added to support the configuration of measurement
records.
* feat: Add additional information to health information
Version information is added to the health endpoints of EOS and EOSdash. The start
time of the last optimization and the latest run time of the energy management are
added to the EOS health information.
* feat: add pydantic merge model tests
* feat: add plan tab to EOSdash
The plan tab displays the current energy management instructions.
* feat: add predictions tab to EOSdash
The predictions tab displays the current predictions.
* feat: add cache management to EOSdash admin tab
The admin tab is extended by a section for cache management that allows clearing
the cache.
* feat: add about tab to EOSdash
The about tab resembles the former hello tab and provides extra information.
* feat: Adapt changelog and prepare for release management
Release management using commitizen is added. The changelog file is adapted, and
the changelog and a description of release management are added to the
documentation.
* feat(doc): Improve install and development documentation
Provide a more concise installation description in Readme.md and add dedicated
installation and development pages to the documentation.
* chore: Use memory cache for interpolation instead of dict in inverter
Decorate calculate_self_consumption() with @cachemethod_until_update to cache
results in memory during an energy management/optimization run. Replacing the
dict-based caching in the inverter is now possible because all optimization runs
are properly locked and the memory cache CacheUntilUpdateStore is properly cleared
at the start of any energy management/optimization operation.
* chore: refactor genetic
Refactor the genetic algorithm modules for an improved module structure and better
readability. Remove the unnecessary and overly complex devices singleton. Also
split the devices configuration from the genetic algorithm parameters to allow
further development independently of the genetic algorithm parameter format. Move
the charge rates configuration for electric vehicles from the optimization to the
devices configuration to allow different charge rates for different cars in the
future.
* chore: Rename memory cache to CacheEnergyManagementStore
The name better reflects the task of the cache: caching function and method
results for an energy management run. The decorator functions are renamed
accordingly: cachemethod_energy_management, cache_energy_management.
* chore: use class properties for config/ems/prediction mixin classes
* chore: skip debug logs from matplotlib
Matplotlib is very noisy in debug mode.
* chore: automatically sync bokeh js to bokeh python package
bokeh was updated to 3.8.0; make the JS CDN automatically follow the package version.
* chore: rename hello.py to about.py
The former hello.py becomes the adapted EOSdash about page.
* chore: remove demo page from EOSdash
As the plan and prediction pages now work without configuration, the demo page is
no longer necessary.
* chore: split test_server.py for system test
Split test_server.py to create explicit test_system.py for system tests.
* chore: move doc utils to generate_config_md.py
The doc utils are only used in scripts/generate_config_md.py. Move them there for
stronger cohesion.
* chore: improve pydantic merge model documentation
* chore: remove pendulum warning from readme
* chore: remove GitHub discussions from contributing documentation
GitHub Discussions is to be replaced by Akkudoktor.net.
* chore(release): bump version to 0.1.0+dev for development
* build(deps): bump fastapi[standard] from 0.115.14 to 0.117.1
Bump fastapi and make the coverage version (for pytest-cov) explicit to avoid a pip break.
* build(deps): bump uvicorn from 0.36.0 to 0.37.0
BREAKING CHANGE: EOS configuration changed. V1 API changed.
- The available_charge_rates_percent configuration is removed from optimization.
  Use the new charge_rates configuration of the electric vehicle instead.
- Optimization configuration parameter hours renamed to horizon_hours
- Device configuration now has to provide the number of devices and device
properties per device.
- Specific prediction provider configuration to be provided by explicit
configuration item (no union for all providers).
- Measurement keys to be provided as a list.
- The new feed-in tariff providers have to be configured.
- The /v1/measurement/loadxxx endpoints are removed. Use the generic measurement endpoints.
- /v1/admin/cache/clear now clears all cache files. Use
  /v1/admin/cache/clear-expired to clear only the expired cache files.
Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
@@ -130,25 +130,6 @@ def prediction_eos():
    return get_prediction()


@pytest.fixture
def devices_eos(config_mixin):
    from akkudoktoreos.devices.devices import get_devices

    devices = get_devices()
    print("devices_eos reset!")
    devices.reset()
    return devices


@pytest.fixture
def devices_mixin(devices_eos):
    with patch(
        "akkudoktoreos.core.coreabc.DevicesMixin.devices", new_callable=PropertyMock
    ) as devices_mixin_patch:
        devices_mixin_patch.return_value = devices_eos
        yield devices_mixin_patch


# Test if test has side effect of writing to system (user) config file
# Before activating, make sure that no user config file exists (e.g. ~/.config/net.akkudoktoreos.eos/EOS.config.json)
@pytest.fixture(autouse=True)
@@ -273,12 +254,144 @@ def config_default_dirs(tmpdir):
    )


# ------------------------------------
# Provide pytest EOS server management
# ------------------------------------


def cleanup_eos_eosdash(
    host: str,
    port: int,
    eosdash_host: str,
    eosdash_port: int,
    server_timeout: float = 10.0,
) -> None:
    """Clean up any running EOS and EOSdash processes.

    Args:
        host (str): EOS server host (e.g., "127.0.0.1").
        port (int): Port number used by the EOS process.
        eosdash_host (str): EOSdash server host.
        eosdash_port (int): Port used by EOSdash.
        server_timeout (float): Timeout in seconds before giving up.
    """
    server = f"http://{host}:{port}"
    eosdash_server = f"http://{eosdash_host}:{eosdash_port}"

    sigkill = signal.SIGTERM if os.name == "nt" else signal.SIGKILL

    # Attempt to shut down EOS via health endpoint
    try:
        result = requests.get(f"{server}/v1/health", timeout=2)
        if result.status_code == HTTPStatus.OK:
            pid = result.json()["pid"]
            os.kill(pid, sigkill)
            time.sleep(1)
            result = requests.get(f"{server}/v1/health", timeout=2)
            assert result.status_code != HTTPStatus.OK
    except Exception:
        pass

    # Fallback: kill processes bound to the EOS port
    pids: list[int] = []
    for _ in range(int(server_timeout / 3)):
        for conn in psutil.net_connections(kind="inet"):
            if conn.laddr.port == port and conn.pid is not None:
                try:
                    process = psutil.Process(conn.pid)
                    cmdline = process.as_dict(attrs=["cmdline"])["cmdline"]
                    if "akkudoktoreos.server.eos" in " ".join(cmdline):
                        pids.append(conn.pid)
                except Exception:
                    pass
        for pid in pids:
            os.kill(pid, sigkill)
        running = False
        for pid in pids:
            try:
                proc = psutil.Process(pid)
                status = proc.status()
                if status != psutil.STATUS_ZOMBIE:
                    running = True
                    break
            except psutil.NoSuchProcess:
                continue
        if not running:
            break
        time.sleep(3)

    # Check for processes still running (maybe zombies).
    for pid in pids:
        try:
            proc = psutil.Process(pid)
            status = proc.status()
            assert status == psutil.STATUS_ZOMBIE, f"Cleanup EOS expected zombie, got {status} for PID {pid}"
        except psutil.NoSuchProcess:
            # Process already reaped (possibly by init/systemd)
            continue

    # Attempt to shut down EOSdash via health endpoint
    for srv in (eosdash_server, "http://127.0.0.1:8504", "http://127.0.0.1:8555"):
        try:
            result = requests.get(f"{srv}/eosdash/health", timeout=2)
            if result.status_code == HTTPStatus.OK:
                pid = result.json()["pid"]
                os.kill(pid, sigkill)
                time.sleep(1)
                result = requests.get(f"{srv}/eosdash/health", timeout=2)
                assert result.status_code != HTTPStatus.OK
        except Exception:
            pass

    # Fallback: kill EOSdash processes bound to known ports
    pids = []
    for _ in range(int(server_timeout / 3)):
        for conn in psutil.net_connections(kind="inet"):
            if conn.laddr.port in (eosdash_port, 8504, 8555) and conn.pid is not None:
                try:
                    process = psutil.Process(conn.pid)
                    cmdline = process.as_dict(attrs=["cmdline"])["cmdline"]
                    if "akkudoktoreos.server.eosdash" in " ".join(cmdline):
                        pids.append(conn.pid)
                except Exception:
                    pass
        for pid in pids:
            os.kill(pid, sigkill)
        running = False
        for pid in pids:
            try:
                proc = psutil.Process(pid)
                status = proc.status()
                if status != psutil.STATUS_ZOMBIE:
                    running = True
                    break
            except psutil.NoSuchProcess:
                continue
        if not running:
            break
        time.sleep(3)

    # Check for processes still running (maybe zombies).
    for pid in pids:
        try:
            proc = psutil.Process(pid)
            status = proc.status()
            assert status == psutil.STATUS_ZOMBIE, f"Cleanup EOSdash expected zombie, got {status} for PID {pid}"
        except psutil.NoSuchProcess:
            # Process already reaped (possibly by init/systemd)
            continue


@contextmanager
def server_base(xprocess: XProcess) -> Generator[dict[str, Union[str, int]], None, None]:
def server_base(
    xprocess: XProcess,
    extra_env: Optional[dict[str, str]] = None
) -> Generator[dict[str, Union[str, int]], None, None]:
    """Fixture to start the server with temporary EOS_DIR and default config.

    Args:
        xprocess (XProcess): The pytest-xprocess fixture to manage the server process.
        extra_env (Optional[dict[str, str]]): Environment variables to set before server startup.

    Yields:
        dict[str, str]: A dictionary containing:
@@ -287,14 +400,22 @@ def server_base(xprocess: XProcess) -> Generator[dict[str, Union[str, int]], Non
    """
    host = get_default_host()
    port = 8503
    eosdash_port = 8504
    server = f"http://{host}:{port}"

    # Port of server may be still blocked by a server usage despite the other server already
    # shut down. CLOSE_WAIT, TIME_WAIT may typically take up to 120 seconds.
    server_timeout = 120

    server = f"http://{host}:{port}"
    eosdash_server = f"http://{host}:{eosdash_port}"
    if extra_env and extra_env.get("EOS_SERVER__EOSDASH_HOST", None):
        eosdash_host = extra_env["EOS_SERVER__EOSDASH_HOST"]
    else:
        eosdash_host = host
    if extra_env and extra_env.get("EOS_SERVER__EOSDASH_PORT", None):
        eosdash_port: int = int(extra_env["EOS_SERVER__EOSDASH_PORT"])
    else:
        eosdash_port = 8504
    eosdash_server = f"http://{eosdash_host}:{eosdash_port}"

    eos_tmp_dir = tempfile.TemporaryDirectory()
    eos_dir = str(eos_tmp_dir.name)
@@ -324,8 +445,10 @@ def server_base(xprocess: XProcess) -> Generator[dict[str, Union[str, int]], Non
        env = os.environ.copy()
        env["EOS_DIR"] = eos_dir
        env["EOS_CONFIG_DIR"] = eos_dir
        if extra_env:
            env.update(extra_env)

        # command to start server process
        # Set command to start server process
        args = [
            sys.executable,
            "-m",
@@ -345,88 +468,17 @@ def server_base(xprocess: XProcess) -> Generator[dict[str, Union[str, int]], Non
        # checks if our server is ready
        def startup_check(self):
            try:
                result = requests.get(f"{server}/v1/health", timeout=2)
                if result.status_code == 200:
                response = requests.get(f"{server}/v1/health", timeout=10)
                logger.debug(f"[xprocess] Health check: {response.status_code}")
                if response.status_code == 200:
                    return True
            except:
                pass
                logger.debug(f"[xprocess] Health check: {response}")
            except Exception as e:
                logger.debug(f"[xprocess] Exception during health check: {e}")
            return False

    def cleanup_eos_eosdash():
        # Cleanup any EOS process left.
        if os.name == "nt":
            # Windows does not provide SIGKILL
            sigkill = signal.SIGTERM
        else:
            sigkill = signal.SIGKILL  # type: ignore
        # - Use pid on EOS health endpoint
        try:
            result = requests.get(f"{server}/v1/health", timeout=2)
            if result.status_code == HTTPStatus.OK:
                pid = result.json()["pid"]
                os.kill(pid, sigkill)
                time.sleep(1)
                result = requests.get(f"{server}/v1/health", timeout=2)
                assert result.status_code != HTTPStatus.OK
        except:
            pass
        # - Use pids from processes on EOS port
        for retries in range(int(server_timeout / 3)):
            pids: list[int] = []
            for conn in psutil.net_connections(kind="inet"):
                if conn.laddr.port == port:
                    if conn.pid not in pids:
                        # Get fresh process info
                        try:
                            process = psutil.Process(conn.pid)
                            process_info = process.as_dict(attrs=["pid", "cmdline"])
                            if "akkudoktoreos.server.eos" in process_info["cmdline"]:
                                pids.append(conn.pid)
                        except:
                            # PID may already be dead
                            pass
            for pid in pids:
                os.kill(pid, sigkill)
            if len(pids) == 0:
                break
            time.sleep(3)
        assert len(pids) == 0
        # Cleanup any EOSdash processes left.
        # - Use pid on EOSdash health endpoint
        try:
            result = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
            if result.status_code == HTTPStatus.OK:
                pid = result.json()["pid"]
                os.kill(pid, sigkill)
                time.sleep(1)
                result = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
                assert result.status_code != HTTPStatus.OK
        except:
            pass
        # - Use pids from processes on EOSdash port
        for retries in range(int(server_timeout / 3)):
            pids = []
            for conn in psutil.net_connections(kind="inet"):
                if conn.laddr.port == eosdash_port:
                    if conn.pid not in pids:
                        # Get fresh process info
                        try:
                            process = psutil.Process(conn.pid)
                            process_info = process.as_dict(attrs=["pid", "cmdline"])
                            if "akkudoktoreos.server.eosdash" in process_info["cmdline"]:
                                pids.append(conn.pid)
                        except:
                            # PID may already be dead
                            pass
            for pid in pids:
                os.kill(pid, sigkill)
            if len(pids) == 0:
                break
            time.sleep(3)
        assert len(pids) == 0

    # Kill all running eos and eosdash process - just to be sure
    cleanup_eos_eosdash()
    cleanup_eos_eosdash(host, port, eosdash_host, eosdash_port, server_timeout)

    # Ensure there is an empty config file in the temporary EOS directory
    config_file_path = Path(eos_dir).joinpath(ConfigEOS.CONFIG_FILE_NAME)
@@ -440,7 +492,9 @@ def server_base(xprocess: XProcess) -> Generator[dict[str, Union[str, int]], Non

    yield {
        "server": server,
        "port": port,
        "eosdash_server": eosdash_server,
        "eosdash_port": eosdash_port,
        "eos_dir": eos_dir,
        "timeout": server_timeout,
    }
@@ -449,16 +503,21 @@ def server_base(xprocess: XProcess) -> Generator[dict[str, Union[str, int]], Non
    xprocess.getinfo("eos").terminate()

    # Cleanup any EOS process left.
    cleanup_eos_eosdash()
    cleanup_eos_eosdash(host, port, eosdash_host, eosdash_port, server_timeout)

    # Remove temporary EOS_DIR
    eos_tmp_dir.cleanup()


@pytest.fixture(scope="class")
def server_setup_for_class(xprocess) -> Generator[dict[str, Union[str, int]], None, None]:
    """A fixture to start the server for a test class."""
    with server_base(xprocess) as result:
def server_setup_for_class(request, xprocess) -> Generator[dict[str, Union[str, int]], None, None]:
    """A fixture to start the server for a test class.

    Get env vars from the test class attribute `eos_env`, if defined
    """
    extra_env = getattr(request.cls, "eos_env", None)

    with server_base(xprocess, extra_env=extra_env) as result:
        yield result
@@ -469,66 +528,9 @@ def server_setup_for_function(xprocess) -> Generator[dict[str, Union[str, int]],
        yield result


@pytest.fixture
def server(xprocess, config_eos, config_default_dirs) -> Generator[str, None, None]:
    """Fixture to start the server.

    Provides URL of the server.
    """
    # create url/port info to the server
    url = "http://127.0.0.1:8503"

    class Starter(ProcessStarter):
        # Set environment before any subprocess run, to keep custom config dir
        env = os.environ.copy()
        env["EOS_DIR"] = str(config_default_dirs[-1])
        project_dir = config_eos.package_root_path.parent.parent

        # assure server to be installed
        try:
            subprocess.run(
                [sys.executable, "-c", "import", "akkudoktoreos.server.eos"],
                check=True,
                env=env,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
                cwd=project_dir,
            )
        except subprocess.CalledProcessError:
            subprocess.run(
                [sys.executable, "-m", "pip", "install", "-e", str(project_dir)],
                check=True,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
            )

        # command to start server process
        args = [sys.executable, "-m", "akkudoktoreos.server.eos"]

        # will wait for xx seconds before timing out
        timeout = 10

        # xprocess will now attempt to clean up upon interruptions
        terminate_on_interrupt = True

        # checks if our server is ready
        def startup_check(self):
            try:
                result = requests.get(f"{url}/v1/health")
                if result.status_code == 200:
                    return True
            except:
                pass
            return False

    # ensure process is running and return its logfile
    pid, logfile = xprocess.ensure("eos", Starter)
    print(f"View xprocess logfile at: {logfile}")

    yield url

    # clean up whole process tree afterwards
    xprocess.getinfo("eos").terminate()
# ------------------------------
# Provide pytest timezone change
# ------------------------------


@pytest.fixture
@@ -1,7 +1,7 @@
import numpy as np
import pytest

from akkudoktoreos.devices.battery import Battery, SolarPanelBatteryParameters
from akkudoktoreos.devices.genetic.battery import Battery, SolarPanelBatteryParameters


@pytest.fixture
@@ -15,7 +15,10 @@ def setup_pv_battery():
        max_charge_power_w=8000,
        hours=24,
    )
    battery = Battery(params)
    battery = Battery(
        params,
        prediction_hours=48,
    )
    battery.reset()
    return battery

@@ -150,7 +153,7 @@ def test_charge_energy_not_allowed_hour(setup_pv_battery):
    battery = setup_pv_battery

    # Disable charging for all hours
    battery.set_charge_per_hour(np.zeros(battery.hours))
    battery.set_charge_per_hour(np.zeros(battery.prediction_hours))

    charged_wh, losses_wh = battery.charge_energy(wh=4000, hour=3)

@@ -177,7 +180,9 @@ def test_charge_energy_relative_power(setup_pv_battery):

@pytest.fixture
def setup_car_battery():
    from akkudoktoreos.devices.battery import ElectricVehicleParameters
    from akkudoktoreos.optimization.genetic.geneticparams import (
        ElectricVehicleParameters,
    )

    params = ElectricVehicleParameters(
        device_id="ev1",
@@ -188,7 +193,10 @@ def setup_car_battery():
        max_charge_power_w=7000,
        hours=24,
    )
    battery = Battery(params)
    battery = Battery(
        params,
        prediction_hours=48,
    )
    battery.reset()
    return battery
@@ -11,12 +11,12 @@ import cachebox
import pytest

from akkudoktoreos.core.cache import (
    CacheEnergyManagementStore,
    CacheFileRecord,
    CacheFileStore,
    CacheUntilUpdateStore,
    cache_energy_management,
    cache_in_file,
    cache_until_update,
    cachemethod_until_update,
    cachemethod_energy_management,
)
from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_duration

@@ -27,103 +27,103 @@ from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_

# Fixtures for testing
@pytest.fixture
def cache_until_update_store():
    """Ensures CacheUntilUpdateStore is reset between tests."""
    cache = CacheUntilUpdateStore()
    CacheUntilUpdateStore().clear()
def cache_energy_management_store():
    """Ensures CacheEnergyManagementStore is reset between tests."""
    cache = CacheEnergyManagementStore()
    CacheEnergyManagementStore().clear()
    assert len(cache) == 0
    return cache


class TestCacheUntilUpdateStore:
    def test_cache_initialization(self, cache_until_update_store):
        """Test that CacheUntilUpdateStore initializes with the correct properties."""
        cache = CacheUntilUpdateStore()
class TestCacheEnergyManagementStore:
    def test_cache_initialization(self, cache_energy_management_store):
        """Test that CacheEnergyManagementStore initializes with the correct properties."""
        cache = CacheEnergyManagementStore()
        assert isinstance(cache.cache, cachebox.LRUCache)
        assert cache.maxsize == 100
        assert len(cache) == 0

    def test_singleton_behavior(self, cache_until_update_store):
        """Test that CacheUntilUpdateStore is a singleton."""
        cache1 = CacheUntilUpdateStore()
        cache2 = CacheUntilUpdateStore()
    def test_singleton_behavior(self, cache_energy_management_store):
        """Test that CacheEnergyManagementStore is a singleton."""
        cache1 = CacheEnergyManagementStore()
        cache2 = CacheEnergyManagementStore()
        assert cache1 is cache2

    def test_cache_storage(self, cache_until_update_store):
    def test_cache_storage(self, cache_energy_management_store):
        """Test that items can be added and retrieved from the cache."""
        cache = CacheUntilUpdateStore()
        cache = CacheEnergyManagementStore()
        cache["key1"] = "value1"
        assert cache["key1"] == "value1"
        assert len(cache) == 1

    def test_cache_getattr_invalid_method(self, cache_until_update_store):
    def test_cache_getattr_invalid_method(self, cache_energy_management_store):
        """Test that accessing an invalid method raises an AttributeError."""
        with pytest.raises(AttributeError):
            CacheUntilUpdateStore().non_existent_method()  # This should raise AttributeError
            CacheEnergyManagementStore().non_existent_method()  # This should raise AttributeError


class TestCacheUntilUpdateDecorators:
    def test_cachemethod_until_update(self, cache_until_update_store):
        """Test that cachemethod_until_update caches method results."""
    def test_cachemethod_energy_management(self, cache_energy_management_store):
        """Test that cachemethod_energy_management caches method results."""

        class MyClass:
            @cachemethod_until_update
            @cachemethod_energy_management
            def compute(self, value: int) -> int:
                return value * 2

        obj = MyClass()

        # Call method and assert caching
        assert CacheUntilUpdateStore.miss_count == 0
        assert CacheUntilUpdateStore.hit_count == 0
        assert CacheEnergyManagementStore.miss_count == 0
        assert CacheEnergyManagementStore.hit_count == 0
        result1 = obj.compute(5)
        assert CacheUntilUpdateStore.miss_count == 1
        assert CacheUntilUpdateStore.hit_count == 0
        assert CacheEnergyManagementStore.miss_count == 1
        assert CacheEnergyManagementStore.hit_count == 0
        result2 = obj.compute(5)
        assert CacheUntilUpdateStore.miss_count == 1
        assert CacheUntilUpdateStore.hit_count == 1
        assert CacheEnergyManagementStore.miss_count == 1
        assert CacheEnergyManagementStore.hit_count == 1
        assert result1 == result2

    def test_cache_until_update(self, cache_until_update_store):
        """Test that cache_until_update caches function results."""
    def test_cache_energy_management(self, cache_energy_management_store):
        """Test that cache_energy_management caches function results."""

        @cache_until_update
        @cache_energy_management
        def compute(value: int) -> int:
            return value * 3

        # Call function and assert caching
        result1 = compute(4)
        assert CacheUntilUpdateStore.last_event == cachebox.EVENT_MISS
        assert CacheEnergyManagementStore.last_event == cachebox.EVENT_MISS
        result2 = compute(4)
        assert CacheUntilUpdateStore.last_event == cachebox.EVENT_HIT
        assert CacheEnergyManagementStore.last_event == cachebox.EVENT_HIT
        assert result1 == result2

    def test_cache_with_different_arguments(self, cache_until_update_store):
    def test_cache_with_different_arguments(self, cache_energy_management_store):
        """Test that caching works for different arguments."""

        class MyClass:
            @cachemethod_until_update
            @cachemethod_energy_management
            def compute(self, value: int) -> int:
                return value * 2

        obj = MyClass()

        assert CacheUntilUpdateStore.miss_count == 0
        assert CacheEnergyManagementStore.miss_count == 0
        result1 = obj.compute(3)
        assert CacheUntilUpdateStore.last_event == cachebox.EVENT_MISS
        assert CacheUntilUpdateStore.miss_count == 1
        assert CacheEnergyManagementStore.last_event == cachebox.EVENT_MISS
        assert CacheEnergyManagementStore.miss_count == 1
        result2 = obj.compute(5)
        assert CacheUntilUpdateStore.last_event == cachebox.EVENT_MISS
        assert CacheUntilUpdateStore.miss_count == 2
        assert CacheEnergyManagementStore.last_event == cachebox.EVENT_MISS
        assert CacheEnergyManagementStore.miss_count == 2

        assert result1 == 6
        assert result2 == 10

    def test_cache_clearing(self, cache_until_update_store):
    def test_cache_clearing(self, cache_energy_management_store):
        """Test that cache is cleared between EMS update cycles."""

        class MyClass:
            @cachemethod_until_update
            @cachemethod_energy_management
            def compute(self, value: int) -> int:
                return value * 2

@@ -131,26 +131,26 @@ class TestCacheUntilUpdateDecorators:
        obj.compute(5)

        # Clear cache
        CacheUntilUpdateStore().clear()
        CacheEnergyManagementStore().clear()

        with pytest.raises(KeyError):
            _ = CacheUntilUpdateStore()["<invalid>"]
            _ = CacheEnergyManagementStore()["<invalid>"]

    def test_decorator_works_for_standalone_function(self, cache_until_update_store):
        """Test that cache_until_update works with standalone functions."""
    def test_decorator_works_for_standalone_function(self, cache_energy_management_store):
        """Test that cache_energy_management works with standalone functions."""

        @cache_until_update
        @cache_energy_management
        def add(a: int, b: int) -> int:
            return a + b

        assert CacheUntilUpdateStore.miss_count == 0
        assert CacheUntilUpdateStore.hit_count == 0
        assert CacheEnergyManagementStore.miss_count == 0
        assert CacheEnergyManagementStore.hit_count == 0
        result1 = add(1, 2)
        assert CacheUntilUpdateStore.miss_count == 1
        assert CacheUntilUpdateStore.hit_count == 0
        assert CacheEnergyManagementStore.miss_count == 1
        assert CacheEnergyManagementStore.hit_count == 0
        result2 = add(1, 2)
        assert CacheUntilUpdateStore.miss_count == 1
        assert CacheUntilUpdateStore.hit_count == 1
        assert CacheEnergyManagementStore.miss_count == 1
        assert CacheEnergyManagementStore.hit_count == 1

        assert result1 == result2
@@ -1,104 +0,0 @@
import json
from pathlib import Path
from typing import Any
from unittest.mock import patch

import pytest

from akkudoktoreos.config.config import ConfigEOS
from akkudoktoreos.optimization.genetic import (
    OptimizationParameters,
    OptimizeResponse,
    optimization_problem,
)
from akkudoktoreos.utils.visualize import (
    prepare_visualize,  # Import the new prepare_visualize
)

DIR_TESTDATA = Path(__file__).parent / "testdata"


def compare_dict(actual: dict[str, Any], expected: dict[str, Any]):
    assert set(actual) == set(expected)

    for key, value in expected.items():
        if isinstance(value, dict):
            assert isinstance(actual[key], dict)
            compare_dict(actual[key], value)
        elif isinstance(value, list):
            assert isinstance(actual[key], list)
            assert actual[key] == pytest.approx(value)
        else:
            assert actual[key] == pytest.approx(value)


@pytest.mark.parametrize(
    "fn_in, fn_out, ngen",
    [
        ("optimize_input_1.json", "optimize_result_1.json", 3),
        ("optimize_input_2.json", "optimize_result_2.json", 3),
        ("optimize_input_2.json", "optimize_result_2_full.json", 400),
    ],
)
def test_optimize(
    fn_in: str,
    fn_out: str,
    ngen: int,
    config_eos: ConfigEOS,
    is_full_run: bool,
):
    """Test optimierung_ems."""
    # Assure configuration holds the correct values
    config_eos.merge_settings_from_dict(
        {"prediction": {"hours": 48}, "optimization": {"hours": 48}}
    )

    # Load input and output data
    file = DIR_TESTDATA / fn_in
    with file.open("r") as f_in:
        input_data = OptimizationParameters(**json.load(f_in))

    file = DIR_TESTDATA / fn_out
    # In case a new test case is added, we don't want to fail here, so the new output is written to disk before
    try:
        with file.open("r") as f_out:
            expected_result = OptimizeResponse(**json.load(f_out))
    except FileNotFoundError:
        pass

    opt_class = optimization_problem(fixed_seed=42)
    start_hour = 10

    # Activate with pytest --full-run
    if ngen > 10 and not is_full_run:
        pytest.skip()

    visualize_filename = str((DIR_TESTDATA / f"new_{fn_out}").with_suffix(".pdf"))

    with patch(
        "akkudoktoreos.utils.visualize.prepare_visualize",
        side_effect=lambda parameters, results, *args, **kwargs: prepare_visualize(
            parameters, results, filename=visualize_filename, **kwargs
        ),
    ) as prepare_visualize_patch:
        # Call the optimization function
        ergebnis = opt_class.optimierung_ems(
            parameters=input_data, start_hour=start_hour, ngen=ngen
        )
        # Write test output to file, so we can take it as new data on intended change
        TESTDATA_FILE = DIR_TESTDATA / f"new_{fn_out}"
        with TESTDATA_FILE.open("w", encoding="utf-8", newline="\n") as f_out:
            f_out.write(ergebnis.model_dump_json(indent=4, exclude_unset=True))

        assert ergebnis.result.Gesamtbilanz_Euro == pytest.approx(
            expected_result.result.Gesamtbilanz_Euro
        )

        # Assert that the output contains all expected entries.
        # This does not assert that the optimization always gives the same result!
        # Reproducibility and mathematical accuracy should be tested on the level of individual components.
        compare_dict(ergebnis.model_dump(), expected_result.model_dump())

        # The function creates a visualization result PDF as a side-effect.
        prepare_visualize_patch.assert_called_once()
        assert Path(visualize_filename).exists()
@@ -1,11 +1,11 @@
import tempfile
from pathlib import Path
from typing import Union
from typing import Any, Optional, Union
from unittest.mock import patch

import pytest
from loguru import logger
from pydantic import ValidationError
from pydantic import IPvAnyAddress, ValidationError

from akkudoktoreos.config.config import ConfigEOS, GeneralSettings

@@ -20,11 +20,11 @@ def test_fixture_new_config_file(config_default_dirs):
    """Assure fixture stash_config_file is working."""
    config_default_dir_user, config_default_dir_cwd, _, _ = config_default_dirs

    config_file_path_user = config_default_dir_user.joinpath(ConfigEOS.CONFIG_FILE_NAME)
    config_file_path_cwd = config_default_dir_cwd.joinpath(ConfigEOS.CONFIG_FILE_NAME)
    config_file_user = config_default_dir_user.joinpath(ConfigEOS.CONFIG_FILE_NAME)
    config_file_cwd = config_default_dir_cwd.joinpath(ConfigEOS.CONFIG_FILE_NAME)

    assert not config_file_path_user.exists()
    assert not config_file_path_cwd.exists()
    assert not config_file_user.exists()
    assert not config_file_cwd.exists()


def test_config_constants(config_eos):
@@ -62,6 +62,38 @@ def test_computed_paths(config_eos):
    config_eos.reset_settings()


def test_config_from_env(monkeypatch, config_eos):
    """Test configuration from env."""
    assert config_eos.server.port == 8503
    assert config_eos.server.eosdash_port is None

    monkeypatch.setenv("EOS_SERVER__PORT", "8553")
    monkeypatch.setenv("EOS_SERVER__EOSDASH_PORT", "8555")

    config_eos.reset_settings()

    assert config_eos.server.port == 8553
    assert config_eos.server.eosdash_port == 8555


def test_config_ipaddress(monkeypatch, config_eos):
    """Test configuration for IP addresses."""
    assert config_eos.server.host == "127.0.0.1"

    monkeypatch.setenv("EOS_SERVER__HOST", "0.0.0.0")
    config_eos.reset_settings()
    assert config_eos.server.host == "0.0.0.0"

    monkeypatch.setenv("EOS_SERVER__HOST", "mail.akkudoktor.net")
    config_eos.reset_settings()
    assert config_eos.server.host == "mail.akkudoktor.net"

    # keep last
    monkeypatch.setenv("EOS_SERVER__HOST", "localhost")
    config_eos.reset_settings()
    assert config_eos.server.host == "localhost"


def test_singleton_behavior(config_eos, config_default_dirs):
    """Test that ConfigEOS behaves as a singleton."""
    initial_cfg_file = config_eos.general.config_file_path
@@ -85,21 +117,34 @@ def test_default_config_path(config_eos, config_default_dirs):


def test_config_file_priority(config_default_dirs):
    """Test config file priority."""
    from akkudoktoreos.config.config import get_config
    """Test config file priority.

    Priority is:
    1. environment variable directory
    2. user configuration directory
    3. current working directory
    """

    config_default_dir_user, config_default_dir_cwd, _, _ = config_default_dirs
    config_file_cwd = Path(config_default_dir_cwd) / ConfigEOS.CONFIG_FILE_NAME
    config_file_user = Path(config_default_dir_user) / ConfigEOS.CONFIG_FILE_NAME

    config_file = Path(config_default_dir_cwd) / ConfigEOS.CONFIG_FILE_NAME
    config_file.write_text("{}")
    config_eos = get_config()
    assert config_eos.general.config_file_path == config_file
    assert not config_file_cwd.exists()
    assert not config_file_user.exists()

    config_file = Path(config_default_dir_user) / ConfigEOS.CONFIG_FILE_NAME
    config_file.parent.mkdir()
    config_file.write_text("{}")
    # current working directory (prio 3)
    config_file_cwd.write_text("{}")

    config_eos = ConfigEOS()
    config_eos.update()
    assert config_eos.general.config_file_path == config_file
    assert config_eos.general.config_file_path == config_file_cwd

    # user configuration directory (prio 2)
    config_file_user.parent.mkdir()
    config_file_user.write_text("{}")

    config_eos.update()
    assert config_eos.general.config_file_path == config_file_user


@patch("akkudoktoreos.config.config.user_config_dir")
@@ -220,6 +265,7 @@ def test_config_common_settings_timezone_none_when_coordinates_missing():
    assert config_no_coords.timezone is None


# Test partial assignments and possible side effects
@pytest.mark.parametrize(
    "path, value, expected, exception",
@@ -280,13 +326,20 @@ def test_config_common_settings_timezone_none_when_coordinates_missing():
        [("general.latitude", 52.52), ("general.longitude", 13.405)],
        ValueError,
    ),
    # Correct value assignment - preparation for list
    (
        "devices/max_electric_vehicles",
        1,
        [("devices.max_electric_vehicles", 1), ],
        None,
    ),
    # Correct value for list
    (
        "optimization/ev_available_charge_rates_percent/0",
        0.1,
        "devices/electric_vehicles/0/charge_rates",
        [0.1, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0],
        [
            (
                "optimization.ev_available_charge_rates_percent",
                "devices.electric_vehicles[0].charge_rates",
                [0.1, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0],
            )
        ],
@@ -294,23 +347,23 @@ def test_config_common_settings_timezone_none_when_coordinates_missing():
    ),
    # Invalid value for list
    (
        "optimization/ev_available_charge_rates_percent/0",
        "devices/electric_vehicles/0/charge_rates",
        "invalid",
        [
            (
                "optimization.ev_available_charge_rates_percent",
                "devices.electric_vehicles[0].charge_rates",
                [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0],
            )
        ],
        TypeError,
        ValueError,
    ),
    # Invalid index (out of bound)
    (
        "optimization/ev_available_charge_rates_percent/10",
        "devices/electric_vehicles/0/charge_rates/10",
        0,
        [
            (
                "optimization.ev_available_charge_rates_percent",
                "devices.electric_vehicles[0].charge_rates",
                [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0],
            )
        ],
@@ -318,11 +371,11 @@ def test_config_common_settings_timezone_none_when_coordinates_missing():
    ),
    # Invalid index (no number)
    (
        "optimization/ev_available_charge_rates_percent/test",
        "devices/electric_vehicles/0/charge_rates/test",
        0,
        [
            (
                "optimization.ev_available_charge_rates_percent",
                "devices.electric_vehicles[0].charge_rates",
                [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0],
            )
        ],
@@ -330,11 +383,11 @@ def test_config_common_settings_timezone_none_when_coordinates_missing():
    ),
    # Unset value (set None)
    (
        "optimization/ev_available_charge_rates_percent",
        "devices/electric_vehicles/0/charge_rates",
        None,
        [
            (
                "optimization.ev_available_charge_rates_percent",
                "devices.electric_vehicles[0].charge_rates",
                None,
            )
        ],
@@ -373,18 +426,6 @@ def test_set_nested_key(path, value, expected, exception, config_eos):
        ("general/latitude", 52.52, None),
        ("general/latitude/", 52.52, None),
        ("general/latitude/test", None, KeyError),
        (
            "optimization/ev_available_charge_rates_percent/1",
            0.375,
            None,
        ),
        ("optimization/ev_available_charge_rates_percent/10", 0, IndexError),
        ("optimization/ev_available_charge_rates_percent/test", 0, IndexError),
        (
            "optimization/ev_available_charge_rates_percent",
            [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0],
            None,
        ),
    ],
)
def test_get_nested_key(path, expected_value, exception, config_eos):
@@ -409,7 +450,8 @@ def test_merge_settings_from_dict_invalid(config_eos):

def test_merge_settings_partial(config_eos):
    """Test merging only a subset of settings."""
    partial_settings: dict[str, dict[str, Union[float, None, str]]] = {

    partial_settings: dict[str, Any] = {
        "general": {
            "latitude": 51.1657  # Only latitude is updated
        },
@@ -419,6 +461,8 @@ def test_merge_settings_partial(config_eos):
    assert config_eos.general.latitude == 51.1657
    assert config_eos.general.longitude == 13.405  # Should remain unchanged

    #-----------------

    partial_settings = {
        "weather": {
            "provider": "BrightSky",
@@ -428,6 +472,8 @@ def test_merge_settings_partial(config_eos):
    config_eos.merge_settings_from_dict(partial_settings)
    assert config_eos.weather.provider == "BrightSky"

    #-----------------

    partial_settings = {
        "general": {
            "latitude": None,
@@ -446,6 +492,36 @@ def test_merge_settings_partial(config_eos):
    assert config_eos.general.latitude is None
    assert config_eos.weather.provider == "ClearOutside"

    #-----------------

    partial_settings = {
        "devices": {
            "max_electric_vehicles": 1,
            "electric_vehicles": [
                {
                    "charge_rates": [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0],
                }
            ],
        }
    }

    config_eos.merge_settings_from_dict(partial_settings)
    assert config_eos.devices.max_electric_vehicles == 1
    assert len(config_eos.devices.electric_vehicles) == 1
    assert config_eos.devices.electric_vehicles[0].charge_rates == [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0]

    # Assure re-apply generates the same config
    config_eos.merge_settings_from_dict(partial_settings)
    assert config_eos.devices.max_electric_vehicles == 1
    assert len(config_eos.devices.electric_vehicles) == 1
    assert config_eos.devices.electric_vehicles[0].charge_rates == [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0]

    # Assure update keeps same values
    config_eos.update()
    assert config_eos.devices.max_electric_vehicles == 1
    assert len(config_eos.devices.electric_vehicles) == 1
    assert config_eos.devices.electric_vehicles[0].charge_rates == [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0]


def test_merge_settings_empty(config_eos):
    """Test merging an empty dictionary does not change settings."""
tests/test_configmigrate.py (new file, 229 lines)
@@ -0,0 +1,229 @@
|
||||
import json
|
||||
import shutil
|
||||
import tempfile
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
import pytest
|
||||
|
||||
from akkudoktoreos.config import configmigrate
|
||||
from akkudoktoreos.config.config import ConfigEOS, SettingsEOSDefaults
|
||||
from akkudoktoreos.core.version import __version__
|
||||
|
||||
# Test data directory and known migration pairs
|
||||
DIR_TESTDATA = Path(__file__).absolute().parent.joinpath("testdata")
|
||||
|
||||
MIGRATION_PAIRS = [
|
||||
(
|
||||
DIR_TESTDATA / "eos_config_minimal_0_1_0.json",
|
||||
DIR_TESTDATA / "eos_config_minimal_now.json",
|
||||
),
|
||||
(
|
||||
DIR_TESTDATA / "eos_config_andreas_0_1_0.json",
|
||||
DIR_TESTDATA / "eos_config_andreas_now.json",
|
||||
),
|
||||
# Add more pairs here:
|
||||
# (DIR_TESTDATA / "old_config_X.json", DIR_TESTDATA / "expected_config_X.json"),
|
||||
]
|
||||
|
||||
|
||||
def _dict_contains(superset: Any, subset: Any, path="") -> list[str]:
|
||||
"""Recursively verify that all key-value pairs from a subset dictionary or list exist in a superset.
|
||||
|
||||
Supports nested dictionaries and lists. Extra keys in superset are allowed.
|
||||
Numeric values (int/float) are compared with tolerance.
|
||||
|
||||
Args:
|
||||
superset (Any): The dictionary or list that should contain all items from `subset`.
|
||||
subset (Any): The expected dictionary or list.
|
||||
path (str, optional): Current nested path used for error reporting. Defaults to "".
|
||||
|
||||
Returns:
|
||||
list[str]: A list of strings describing mismatches or missing keys. Empty list if all subset items are present.
|
||||
"""
|
||||
errors = []
|
||||
|
||||
if isinstance(subset, dict) and isinstance(superset, dict):
|
||||
for key, sub_value in subset.items():
|
||||
full_path = f"{path}/{key}" if path else key
|
||||
if key not in superset:
|
||||
errors.append(f"Missing key: {full_path}")
|
||||
continue
|
||||
errors.extend(_dict_contains(superset[key], sub_value, full_path))
|
||||
|
||||
elif isinstance(subset, list) and isinstance(superset, list):
|
||||
for i, elem in enumerate(subset):
|
||||
if i >= len(superset):
|
||||
full_path = f"{path}[{i}]" if path else f"[{i}]"
|
||||
errors.append(f"List too short at {full_path}: expected element {elem}")
|
||||
continue
|
||||
errors.extend(_dict_contains(superset[i], elem, f"{path}[{i}]" if path else f"[{i}]"))
|
||||
|
||||
else:
|
||||
# Compare values (with numeric tolerance)
|
||||
if isinstance(subset, (int, float)) and isinstance(superset, (int, float)):
|
||||
if abs(float(subset) - float(superset)) > 1e-6:
|
||||
errors.append(f"Value mismatch at {path}: expected {subset}, got {superset}")
|
||||
elif subset != superset:
|
||||
errors.append(f"Value mismatch at {path}: expected {subset}, got {superset}")
|
||||
|
||||
return errors
|
||||
|
||||
|
||||
class TestConfigMigration:
|
||||
"""Tests for migrate_config_file()"""
|
||||
|
||||
@pytest.fixture
|
||||
def tmp_config_file(self, config_default_dirs) -> Path:
|
||||
"""Create a temporary valid config file with an invalid version."""
|
||||
config_default_dir_user, _, _, _ = config_default_dirs
|
||||
config_file_user = config_default_dir_user.joinpath(ConfigEOS.CONFIG_FILE_NAME)
|
||||
|
||||
# Create a default config object (simulates the latest schema)
|
||||
default_config = SettingsEOSDefaults()
|
||||
|
||||
# Dump to JSON
|
||||
config_json = json.loads(default_config.model_dump_json())
|
||||
|
||||
# Corrupt the version (simulate outdated config)
|
||||
config_json["general"]["version"] = "0.0.0-old"
|
||||
|
||||
# Write file
|
||||
with config_file_user.open("w", encoding="utf-8") as f:
|
||||
json.dump(config_json, f, indent=4)
|
||||
|
||||
return config_file_user
|
||||
|
||||
|
||||
def test_migrate_config_file_from_invalid_version(self, tmp_config_file: Path):
|
||||
"""Test that migration updates an outdated config version successfully."""
|
||||
backup_file = tmp_config_file.with_suffix(".bak")
|
||||
|
||||
# Run migration
|
||||
result = configmigrate.migrate_config_file(tmp_config_file, backup_file)
|
||||
|
||||
# Verify success
|
||||
assert result is True, "Migration should succeed even from invalid version."
|
||||
|
||||
# Verify backup exists
|
||||
assert backup_file.exists(), "Backup file should be created before migration."
|
||||
|
||||
# Verify version updated
|
||||
with tmp_config_file.open("r", encoding="utf-8") as f:
|
||||
migrated_data = json.load(f)
|
||||
assert migrated_data["general"]["version"] == __version__, \
|
||||
"Migrated config should have updated version."
|
||||
|
||||
# Verify it still matches the structure of SettingsEOSDefaults
|
||||
new_model = SettingsEOSDefaults(**migrated_data)
|
||||
assert isinstance(new_model, SettingsEOSDefaults)
|
||||
|
||||
def test_migrate_config_file_already_current(self, tmp_path: Path):
|
||||
"""Test that a current config file returns True immediately."""
|
||||
config_path = tmp_path / "EOS_current.json"
|
||||
default = SettingsEOSDefaults()
|
||||
with config_path.open("w", encoding="utf-8") as f:
|
||||
f.write(default.model_dump_json(indent=4))
|
||||
|
||||
backup_file = config_path.with_suffix(".bak")
|
||||
|
||||
result = configmigrate.migrate_config_file(config_path, backup_file)
|
||||
assert result is True
|
||||
assert not backup_file.exists(), "No backup should be made if config is already current."
|
||||
|
||||
|
||||
@pytest.mark.parametrize("old_file, expected_file", MIGRATION_PAIRS)
|
||||
def test_migrate_old_version_config(self, tmp_path: Path, old_file: Path, expected_file: Path):
|
||||
"""Ensure migration from old → new schema produces the expected output."""
|
||||
# --- Prepare temporary working file based on expected file name ---
|
||||
working_file = expected_file.with_suffix(".new")
|
||||
shutil.copy(old_file, working_file)
|
||||
|
||||
# Backup file path (inside tmp_path to avoid touching repo files)
|
||||
backup_file = tmp_path / f"{old_file.stem}.bak"
|
||||
|
||||
failed = False
|
||||
try:
|
||||
assert working_file.exists(), f"Working config file is missing: {working_file}"
|
||||
|
||||
# --- Perform migration ---
|
||||
result = configmigrate.migrate_config_file(working_file, backup_file)
|
||||
|
||||
# --- Assertions ---
|
||||
assert result is True, f"Migration failed for {old_file.name}"
|
||||
|
||||
assert configmigrate.mapped_count >= 1, f"No mapped migrations for {old_file.name}"
|
||||
assert configmigrate.auto_count >= 1, f"No automatic migrations for {old_file.name}"
|
||||
|
||||
assert len(configmigrate.skipped_paths) <= 7, (
|
||||
f"Too many skipped paths in {old_file.name}: {configmigrate.skipped_paths}"
|
||||
)
|
||||
|
||||
assert backup_file.exists(), f"Backup file not created for {old_file.name}"
|
||||
|
||||
# --- Compare migrated result with expected output ---
|
||||
new_data = json.loads(working_file.read_text(encoding="utf-8"))
|
||||
expected_data = json.loads(expected_file.read_text(encoding="utf-8"))
|
||||
|
||||
# Check version
|
||||
assert new_data["general"]["version"] == __version__, (
|
||||
f"Expected version {__version__}, got {new_data['general']['version']}"
|
||||
)
|
||||
|
||||
# Recursive subset comparison
|
||||
errors = _dict_contains(new_data, expected_data)
|
||||
assert not errors, (
|
||||
f"Migrated config for {old_file.name} is missing or mismatched fields:\n" +
|
||||
"\n".join(errors)
|
||||
)
|
||||
|
||||
# --- Compare migrated result with migration map ---
|
||||
# Ensure all expected mapped fields are actually migrated and correct
|
||||
missing_migrations = []
|
||||
mismatched_values = []
|
||||
|
||||
for old_path, mapping in configmigrate.MIGRATION_MAP.items():
|
||||
if mapping is None:
|
||||
continue # skip intentionally dropped fields
|
||||
|
||||
# Determine new path (string or tuple)
|
||||
new_path = mapping[0] if isinstance(mapping, tuple) else mapping
|
||||
|
||||
# Get value from expected data (if present)
|
||||
expected_value = configmigrate._get_json_nested_value(expected_data, new_path)
|
||||
if expected_value is None:
|
||||
continue # new field not present in expected config
|
||||
|
||||
# Check that migration recorded this old path
|
||||
if old_path.strip("/") not in configmigrate.migrated_source_paths:
|
||||
missing_migrations.append(f"{old_path} → {new_path}")
|
||||
continue
|
||||
|
||||
# Verify the migrated value matches the expected one
|
||||
new_value = configmigrate._get_json_nested_value(new_data, new_path)
|
||||
if new_value != expected_value:
|
||||
mismatched_values.append(
|
||||
f"{old_path} → {new_path}: expected {expected_value!r}, got {new_value!r}"
|
||||
)
|
||||
|
||||
assert not missing_migrations, (
|
||||
"Some expected migration map entries were not migrated:\n"
|
||||
+ "\n".join(missing_migrations)
|
||||
)
|
||||
assert not mismatched_values, (
|
||||
"Migrated values differ from expected results:\n"
|
||||
+ "\n".join(mismatched_values)
|
||||
)
|
||||
|
||||
# Validate migrated config with schema
|
||||
new_model = SettingsEOSDefaults(**new_data)
|
||||
assert isinstance(new_model, SettingsEOSDefaults)
|
||||
|
||||
except Exception:
|
||||
# mark failure and re-raise so pytest records the error and the working_file is kept
|
||||
failed = True
|
||||
raise
|
||||
finally:
|
||||
# Remove the .new working file only if the test passed (failed == False)
|
||||
if not failed and working_file.exists():
|
||||
working_file.unlink(missing_ok=True)
|
||||
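For readers unfamiliar with the migration map consumed above: the loop treats each value as None (field dropped), a plain new path, or a tuple whose first element is the new path. A sketch of plausible entries (the concrete paths here are invented for illustration, not taken from the real map):

# Sketch of the MIGRATION_MAP value shapes the loop above handles (paths invented):
MIGRATION_MAP = {
    "/elecprice/provider_settings/import_file_path":
        "elecprice/provider_settings/ElecPriceImport/import_file_path",  # plain rename
    "/optimization/hours": ("optimization/horizon_hours",),  # tuple: new path is element 0
    "/server/startup_eosdash": None,  # intentionally dropped field
}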
@@ -22,7 +22,6 @@ from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_
# Derived classes for testing
# ---------------------------


class DerivedConfig(SettingsBaseModel):
    env_var: Optional[int] = Field(default=None, description="Test config by environment var")
    instance_field: Optional[str] = Field(default=None, description="Test config by instance field")
@@ -35,8 +34,19 @@ class DerivedBase(DataBase):


class DerivedRecord(DataRecord):
    """Data record derived from base class DataRecord.

    The derived data record adds the
    - `data_value` field and the
    - `dish_washer_emr`, `solar_power`, `temp` configurable field-like data.
    """

    data_value: Optional[float] = Field(default=None, description="Data Value")

    @classmethod
    def configured_data_keys(cls) -> Optional[list[str]]:
        return ["dish_washer_emr", "solar_power", "temp"]


class DerivedSequence(DataSequence):
    # overload
@@ -128,6 +138,13 @@ class TestDataRecord:
        """Helper function to create a test DataRecord."""
        return DerivedRecord(date_time=date, data_value=value)

    @pytest.fixture
    def record(self):
        """Fixture to create a sample DerivedRecord with some data set."""
        rec = DerivedRecord(date_time=None, data_value=10.0)
        rec.configured_data = {"dish_washer_emr": 123.0, "solar_power": 456.0}
        return rec

    def test_getitem(self):
        record = self.create_test_record(datetime(2024, 1, 3, tzinfo=timezone.utc), 10.0)
        assert record["data_value"] == 10.0
@@ -147,7 +164,7 @@ class TestDataRecord:
        record = self.create_test_record(datetime(2024, 1, 3, tzinfo=timezone.utc), 10.0)
        record.date_time = None
        record.data_value = 20.0
        assert len(record) == 2
        assert len(record) == 5  # 2 regular fields + 3 configured data "fields"

    def test_to_dict(self):
        record = self.create_test_record(datetime(2024, 1, 3, tzinfo=timezone.utc), 10.0)
@@ -167,6 +184,202 @@ class TestDataRecord:
        record2 = DerivedRecord.from_json(json_str)
        assert record2.model_dump() == record.model_dump()
    def test_record_keys_includes_configured_data_keys(self, record):
        """Ensure record_keys includes all configured data keys."""
        assert set(record.record_keys()) >= set(record.configured_data_keys())

    def test_record_keys_writable_includes_configured_data_keys(self, record):
        """Ensure record_keys_writable includes all configured data keys."""
        assert set(record.record_keys_writable()) >= set(record.configured_data_keys())

    def test_getitem_existing_field(self, record):
        """Test that __getitem__ returns correct value for existing native field."""
        record.date_time = "2024-01-01T00:00:00+00:00"
        assert record["date_time"] is not None

    def test_getitem_existing_configured_data(self, record):
        """Test that __getitem__ retrieves existing configured data values."""
        assert record["dish_washer_emr"] == 123.0
        assert record["solar_power"] == 456.0

    def test_getitem_missing_configured_data_returns_none(self, record):
        """Test that __getitem__ returns None for missing but known configured data keys."""
        assert record["temp"] is None

    def test_getitem_raises_keyerror(self, record):
        """Test that __getitem__ raises KeyError for completely unknown keys."""
        with pytest.raises(KeyError):
            _ = record["nonexistent"]

    def test_setitem_field(self, record):
        """Test setting a native field using __setitem__."""
        record["date_time"] = "2025-01-01T12:00:00+00:00"
        assert str(record.date_time).startswith("2025-01-01")

    def test_setitem_configured_data(self, record):
        """Test setting a known configured data key using __setitem__."""
        record["temp"] = 25.5
        assert record.configured_data["temp"] == 25.5

    def test_setitem_invalid_key_raises(self, record):
        """Test that __setitem__ raises KeyError for unknown keys."""
        with pytest.raises(KeyError):
            record["unknown_key"] = 123

    def test_delitem_field(self, record):
        """Test deleting a native field using __delitem__."""
        record["date_time"] = "2025-01-01T12:00:00+00:00"
        del record["date_time"]
        assert record.date_time is None

    def test_delitem_configured_data(self, record):
        """Test deleting a known configured data key using __delitem__."""
        del record["solar_power"]
        assert "solar_power" not in record.configured_data

    def test_delitem_unknown_raises(self, record):
        """Test that __delitem__ raises KeyError for unknown keys."""
        with pytest.raises(KeyError):
            del record["nonexistent"]

    def test_attribute_get_existing_field(self, record):
        """Test accessing a native field via attribute."""
        record.date_time = "2025-01-01T12:00:00+00:00"
        assert record.date_time is not None

    def test_attribute_get_existing_configured_data(self, record):
        """Test accessing an existing configured data value via attribute."""
        assert record.dish_washer_emr == 123.0

    def test_attribute_get_missing_configured_data(self, record):
        """Test accessing a missing but known configured data key returns None."""
        assert record.temp is None

    def test_attribute_get_invalid_raises(self, record):
        """Test accessing an unknown attribute raises AttributeError."""
        with pytest.raises(AttributeError):
            _ = record.nonexistent

    def test_attribute_set_existing_field(self, record):
        """Test setting a native field via attribute."""
        record.date_time = "2025-06-25T12:00:00+00:00"
        assert record.date_time is not None

    def test_attribute_set_existing_configured_data(self, record):
        """Test setting a known configured data key via attribute."""
        record.temp = 99.9
        assert record.configured_data["temp"] == 99.9

    def test_attribute_set_invalid_raises(self, record):
        """Test setting an unknown attribute raises AttributeError."""
        with pytest.raises(AttributeError):
            record.invalid = 123

    def test_delattr_field(self, record):
        """Test deleting a native field via attribute."""
        record.date_time = "2025-06-25T12:00:00+00:00"
        del record.date_time
        assert record.date_time is None

    def test_delattr_configured_data(self, record):
        """Test deleting a known configured data key via attribute."""
        record.temp = 88.0
        del record.temp
        assert "temp" not in record.configured_data

    def test_delattr_ignored_missing_configured_data_key(self, record):
        """Test deleting a known configured data key that was never set is a no-op."""
        del record.temp
        assert "temp" not in record.configured_data

    def test_len_and_iter(self, record):
        """Test that __len__ and __iter__ behave as expected."""
        keys = list(iter(record))
        assert set(record.record_keys_writable()) == set(keys)
        assert len(record) == len(keys)

    def test_in_operator_includes_configured_data(self, record):
        """Test that 'in' operator includes configured data keys."""
        assert "dish_washer_emr" in record
        assert "temp" in record  # known key, even if not yet set
        assert "nonexistent" not in record

    def test_hasattr_behavior(self, record):
        """Test that hasattr returns True for fields and known configured data keys."""
        assert hasattr(record, "date_time")
        assert hasattr(record, "dish_washer_emr")
        assert hasattr(record, "temp")  # allowed, even if not yet set
        assert not hasattr(record, "nonexistent")

    def test_model_validate_roundtrip(self, record):
        """Test that DerivedRecord can be serialized and revalidated."""
        dumped = record.model_dump()
        restored = DerivedRecord.model_validate(dumped)
        assert restored.dish_washer_emr == 123.0
        assert restored.solar_power == 456.0
        assert restored.temp is None  # not set

    def test_copy_preserves_configured_data(self, record):
        """Test that copying preserves configured data values."""
        record.temp = 22.2
        copied = record.model_copy()
        assert copied.dish_washer_emr == 123.0
        assert copied.temp == 22.2
        assert copied is not record

    def test_equality_includes_configured_data(self, record):
        """Test that equality includes the `configured_data` content."""
        other = record.model_copy()
        assert record == other

    def test_inequality_differs_with_configured_data(self, record):
        """Test that records with different configured data are not equal."""
        other = record.model_copy(deep=True)
        # Modify one configured data value in the copy
        other.configured_data["dish_washer_emr"] = 999.9
        assert record != other

    def test_in_operator_for_configured_data_and_fields(self, record):
        """Ensure 'in' works for both fields and configured data keys."""
        assert "dish_washer_emr" in record
        assert "solar_power" in record
        assert "date_time" in record  # standard field
        assert "temp" in record  # allowed but not yet set
        assert "unknown" not in record

    def test_hasattr_equivalence_to_getattr(self, record):
        """hasattr should return True for all valid keys and configured data keys."""
        assert hasattr(record, "dish_washer_emr")
        assert hasattr(record, "temp")
        assert hasattr(record, "date_time")
        assert not hasattr(record, "nonexistent")

    def test_dir_includes_configured_data_keys(self, record):
        """`dir(record)` should include configured data keys for introspection.

        It shall not include the internal 'configured_data' attribute.
        """
        keys = dir(record)
        assert "configured_data" not in keys
        for key in record.configured_data_keys():
            assert key in keys

    def test_init_configured_field_like_data_applies_before_model_init(self):
        """Test that keys listed in `_configured_data_keys` are moved to `configured_data` at init time."""
        record = DerivedRecord(
            date_time="2024-01-03T00:00:00+00:00",
            data_value=42.0,
            dish_washer_emr=111.1,
            solar_power=222.2,
            temp=333.3,  # assume `temp` is also a valid configured key
        )

        assert record.data_value == 42.0
        assert record.configured_data == {
            "dish_washer_emr": 111.1,
            "solar_power": 222.2,
            "temp": 333.3,
        }
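At a glance, the record protocol these tests pin down (a condensed sketch of behavior the assertions above already establish, not additional API):

# Condensed sketch of the DerivedRecord protocol exercised above:
rec = DerivedRecord(date_time=None, data_value=10.0, solar_power=456.0)
rec["temp"] = 25.5          # known configured key: stored in rec.configured_data
assert rec.temp == 25.5     # attribute access mirrors item access
assert "temp" in rec        # membership covers configured keys, set or not
del rec.temp                # deleting a set (or never-set) configured key is safe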
class TestDataSequence:
    @pytest.fixture
@@ -385,6 +598,25 @@
        assert array[1] == 0.8  # Forward-filled value
        assert array[2] == 1.0

    def test_key_to_array_ffill_one_value(self, sequence):
        """Test key_to_array with forward filling when only one value is available, at the end of the range."""
        interval = to_duration("1 hour")
        record1 = self.create_test_record(pendulum.datetime(2023, 11, 6, 2), 1.0)
        sequence.insert_by_datetime(record1)

        array = sequence.key_to_array(
            key="data_value",
            start_datetime=pendulum.datetime(2023, 11, 6),
            end_datetime=pendulum.datetime(2023, 11, 6, 4),
            interval=interval,
            fill_method="ffill",
        )
        assert len(array) == 4
        assert array[0] == 1.0  # Backward-filled value
        assert array[1] == 1.0  # Backward-filled value
        assert array[2] == 1.0
        assert array[3] == 1.0  # Forward-filled value
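To make the fill semantics concrete: with a single record at 02:00 resampled onto an hourly grid from 00:00 to 04:00 (exclusive), the expected array looks like this (a sketch of what the test above asserts, not library output):

# hourly grid:   00:00  01:00  02:00  03:00
# raw samples:   -      -      1.0    -
# fill_method="ffill" yields [1.0, 1.0, 1.0, 1.0]:
#   the leading gap is filled backward from the first sample,
#   the trailing slot is filled forward from the last sample.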
    def test_key_to_array_bfill(self, sequence):
        """Test key_to_array with backward filling for missing values."""
        interval = to_duration("1 hour")
@@ -543,6 +775,55 @@
        assert sequence[0].date_time == sequence2[0].date_time
        assert sequence[0].data_value == sequence2[0].data_value
    def test_key_to_value_exact_match(self, sequence):
        """Test key_to_value returns exact match when datetime matches a record."""
        dt = datetime(2023, 11, 5)
        record = self.create_test_record(dt, 0.75)
        sequence.append(record)
        result = sequence.key_to_value("data_value", dt)
        assert result == 0.75

    def test_key_to_value_nearest(self, sequence):
        """Test key_to_value returns value closest in time to the given datetime."""
        record1 = self.create_test_record(datetime(2023, 11, 5, 12), 0.6)
        record2 = self.create_test_record(datetime(2023, 11, 6, 12), 0.9)
        sequence.append(record1)
        sequence.append(record2)
        dt = datetime(2023, 11, 6, 10)  # closer to record2
        result = sequence.key_to_value("data_value", dt)
        assert result == 0.9

    def test_key_to_value_nearest_after(self, sequence):
        """Test key_to_value returns the value nearest after the given datetime."""
        record1 = self.create_test_record(datetime(2023, 11, 5, 10), 0.7)
        record2 = self.create_test_record(datetime(2023, 11, 5, 15), 0.8)
        sequence.append(record1)
        sequence.append(record2)
        dt = datetime(2023, 11, 5, 14)  # closer to record2
        result = sequence.key_to_value("data_value", dt)
        assert result == 0.8

    def test_key_to_value_empty_sequence(self, sequence):
        """Test key_to_value returns None when sequence is empty."""
        result = sequence.key_to_value("data_value", datetime(2023, 11, 5))
        assert result is None

    def test_key_to_value_missing_key(self, sequence):
        """Test key_to_value returns None when key is missing in records."""
        record = self.create_test_record(datetime(2023, 11, 5), None)
        sequence.append(record)
        result = sequence.key_to_value("data_value", datetime(2023, 11, 5))
        assert result is None

    def test_key_to_value_multiple_records_with_none(self, sequence):
        """Test key_to_value skips records with None values."""
        r1 = self.create_test_record(datetime(2023, 11, 5), None)
        r2 = self.create_test_record(datetime(2023, 11, 6), 1.0)
        sequence.append(r1)
        sequence.append(r2)
        result = sequence.key_to_value("data_value", datetime(2023, 11, 5, 12))
        assert result == 1.0
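In short, key_to_value performs a nearest-in-time lookup that skips records whose value is None. A condensed sketch, reusing the fixture and helper from this class:

# Condensed sketch of key_to_value semantics (values taken from the tests above):
sequence.append(self.create_test_record(datetime(2023, 11, 5), None))
sequence.append(self.create_test_record(datetime(2023, 11, 6), 1.0))
# The record nearest to the query time that carries a non-None value wins:
assert sequence.key_to_value("data_value", datetime(2023, 11, 5, 12)) == 1.0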
    def test_key_to_dict(self, sequence):
        record1 = self.create_test_record(datetime(2023, 11, 5), 0.8)
        record2 = self.create_test_record(datetime(2023, 11, 6), 0.9)
@@ -694,7 +975,7 @@ class TestDataProvider:
        ems_eos.set_start_datetime(sample_start_datetime)
        provider.update_data()

        assert provider.start_datetime == sample_start_datetime
        assert provider.ems_start_datetime == sample_start_datetime

    def test_update_method_force_enable(self, provider, monkeypatch):
        """Test that `update` executes when `force_enable` is True, even if `enabled` is False."""
@@ -857,7 +1138,7 @@ class TestDataContainer:
        del container_with_providers["non_existent_key"]

    def test_len(self, container_with_providers):
        assert len(container_with_providers) == 3
        assert len(container_with_providers) == 5

    def test_repr(self, container_with_providers):
        representation = repr(container_with_providers)
File diff suppressed because it is too large
@@ -128,7 +128,7 @@ def test_update_data(mock_get, provider, sample_akkudoktor_1_json, cache_store):
    # Assert we get hourly price values by resampling
    np_price_array = provider.key_to_array(
        key="elecprice_marketprice_wh",
        start_datetime=provider.start_datetime,
        start_datetime=provider.ems_start_datetime,
        end_datetime=provider.end_datetime,
    )
    assert len(np_price_array) == provider.total_hours
@@ -188,7 +188,7 @@ def test_key_to_array_resampling(provider):
    provider.update_data(force_update=True)
    array = provider.key_to_array(
        key="elecprice_marketprice_wh",
        start_datetime=provider.start_datetime,
        start_datetime=provider.ems_start_datetime,
        end_datetime=provider.end_datetime,
    )
    assert isinstance(array, np.ndarray)
@@ -204,7 +204,7 @@ def test_key_to_array_resampling(provider):
def test_akkudoktor_development_forecast_data(provider):
    """Fetch data from real Akkudoktor server."""
    # Preset, as this is usually done by update_data()
    provider.start_datetime = to_datetime("2024-10-26 00:00:00")
    provider.ems_start_datetime = to_datetime("2024-10-26 00:00:00")

    akkudoktor_data = provider._request_forecast()
@@ -125,7 +125,7 @@ def test_update_data(mock_get, provider, sample_energycharts_json, cache_store):
    # Assert we get hourly price values by resampling
    np_price_array = provider.key_to_array(
        key="elecprice_marketprice_wh",
        start_datetime=provider.start_datetime,
        start_datetime=provider.ems_start_datetime,
        end_datetime=provider.end_datetime,
    )
    assert len(np_price_array) == provider.total_hours
@@ -182,7 +182,7 @@ def test_key_to_array_resampling(provider):
    provider.update_data(force_update=True)
    array = provider.key_to_array(
        key="elecprice_marketprice_wh",
        start_datetime=provider.start_datetime,
        start_datetime=provider.ems_start_datetime,
        end_datetime=provider.end_datetime,
    )
    assert isinstance(array, np.ndarray)
@@ -198,7 +198,7 @@ def test_key_to_array_resampling(provider):
def test_akkudoktor_development_forecast_data(provider):
    """Fetch data from real Energy-Charts server."""
    # Preset, as this is usually done by update_data()
    provider.start_datetime = to_datetime("2024-10-26 00:00:00")
    provider.ems_start_datetime = to_datetime("2024-10-26 00:00:00")

    energy_charts_data = provider._request_forecast()
@@ -19,8 +19,10 @@ def provider(sample_import_1_json, config_eos):
        "elecprice": {
            "provider": "ElecPriceImport",
            "provider_settings": {
                "import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
                "import_json": json.dumps(sample_import_1_json),
                "ElecPriceImport": {
                    "import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
                    "import_json": json.dumps(sample_import_1_json),
                },
            },
        }
    }
@@ -55,7 +57,9 @@ def test_invalid_provider(provider, config_eos):
        "elecprice": {
            "provider": "<invalid>",
            "provider_settings": {
                "import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
                "ElecPriceImport": {
                    "import_file_path": str(FILE_TESTDATA_ELECPRICEIMPORT_1_JSON),
                },
            },
        }
    }
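These two hunks illustrate the reworked configuration layout: provider settings are no longer flat fields but are nested under a key named after the provider. A minimal sketch of the new shape (the file path is invented):

settings = {
    "elecprice": {
        "provider": "ElecPriceImport",
        "provider_settings": {
            "ElecPriceImport": {  # one sub-object per provider, keyed by provider name
                "import_file_path": "/path/to/prices.json",  # invented path
            },
        },
    }
}
config_eos.merge_settings_from_dict(settings)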
@@ -86,20 +90,20 @@ def test_import(provider, sample_import_1_json, start_datetime, from_file, config_eos):
    ems_eos = get_ems()
    ems_eos.set_start_datetime(to_datetime(start_datetime, in_timezone="Europe/Berlin"))
    if from_file:
        config_eos.elecprice.provider_settings.import_json = None
        assert config_eos.elecprice.provider_settings.import_json is None
        config_eos.elecprice.provider_settings.ElecPriceImport.import_json = None
        assert config_eos.elecprice.provider_settings.ElecPriceImport.import_json is None
    else:
        config_eos.elecprice.provider_settings.import_file_path = None
        assert config_eos.elecprice.provider_settings.import_file_path is None
        config_eos.elecprice.provider_settings.ElecPriceImport.import_file_path = None
        assert config_eos.elecprice.provider_settings.ElecPriceImport.import_file_path is None
    provider.clear()

    # Call the method
    provider.update_data()

    # Assert: Verify the result is as expected
    assert provider.start_datetime is not None
    assert provider.ems_start_datetime is not None
    assert provider.total_hours is not None
    assert compare_datetimes(provider.start_datetime, ems_eos.start_datetime).equal
    assert compare_datetimes(provider.ems_start_datetime, ems_eos.start_datetime).equal
    values = sample_import_1_json["elecprice_marketprice_wh"]
    value_datetime_mapping = provider.import_datetimes(ems_eos.start_datetime, len(values))
    for i, mapping in enumerate(value_datetime_mapping):
tests/test_emplan.py (new file, 235 lines)
@@ -0,0 +1,235 @@
from typing import Optional

import pytest

from akkudoktoreos.core.emplan import (
    BaseInstruction,
    CommodityQuantity,
    DDBCInstruction,
    EnergyManagementPlan,
    FRBCInstruction,
    OMBCInstruction,
    PEBCInstruction,
    PEBCPowerEnvelope,
    PEBCPowerEnvelopeElement,
    PPBCEndInterruptionInstruction,
    PPBCScheduleInstruction,
    PPBCStartInterruptionInstruction,
)
from akkudoktoreos.utils.datetimeutil import Duration, to_datetime, to_duration


@pytest.fixture
def fixed_now():
    return to_datetime("2025-06-01T12:00:00")


class TestEnergyManagementPlan:
    def test_add_instruction_and_time_range(self, fixed_now):
        plan = EnergyManagementPlan(
            id="plan-123",
            generated_at=fixed_now,
            instructions=[],
        )
        instr1 = OMBCInstruction(
            resource_id="dev-1",
            execution_time=fixed_now,
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        instr2 = OMBCInstruction(
            resource_id="dev-2",
            execution_time=fixed_now.add(minutes=5),
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        plan.add_instruction(instr1)
        plan.add_instruction(instr2)

        # Check that valid_from matches the earliest execution_time
        assert plan.valid_from == fixed_now

        # instr2 has infinite duration so valid_until must be None
        assert plan.valid_until is None

        assert plan.instructions == [instr1, instr2]
    def test_clear(self, fixed_now):
        plan = EnergyManagementPlan(
            id="plan-123",
            generated_at=fixed_now,
            instructions=[
                OMBCInstruction(
                    resource_id="dev-1",
                    execution_time=fixed_now,
                    operation_mode_id="mymode1",
                    operation_mode_factor=1.0,
                )
            ],
        )
        plan.clear()
        assert plan.instructions == []
        assert plan.valid_until is None
        assert plan.valid_from is not None
    def test_get_active_instructions(self, fixed_now):
        instr1 = OMBCInstruction(
            resource_id="dev-1",
            execution_time=fixed_now.subtract(minutes=1),
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        instr2 = OMBCInstruction(
            resource_id="dev-2",
            execution_time=fixed_now.add(minutes=1),
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        instr3 = OMBCInstruction(
            resource_id="dev-3",
            execution_time=fixed_now.subtract(minutes=10),
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        plan = EnergyManagementPlan(
            id="plan-123",
            generated_at=fixed_now,
            instructions=[instr1, instr2, instr3],
        )
        plan._update_time_range()

        active = plan.get_active_instructions(now=fixed_now)
        ids = {i.resource_id for i in active}
        assert ids == {"dev-1", "dev-3"}
    def test_get_next_instruction(self, fixed_now):
        instr1 = OMBCInstruction(
            resource_id="dev-1",
            execution_time=fixed_now.subtract(minutes=1),
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        instr2 = OMBCInstruction(
            resource_id="dev-2",
            execution_time=fixed_now.add(minutes=10),
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        instr3 = OMBCInstruction(
            resource_id="dev-3",
            execution_time=fixed_now.add(minutes=5),
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        plan = EnergyManagementPlan(
            id="plan-123",
            generated_at=fixed_now,
            instructions=[instr1, instr2, instr3],
        )
        plan._update_time_range()

        next_instr = plan.get_next_instruction(now=fixed_now)
        assert next_instr is not None
        assert next_instr.resource_id == "dev-3"
    def test_get_instructions_for_resource(self, fixed_now):
        instr1 = OMBCInstruction(
            resource_id="dev-1",
            execution_time=fixed_now,
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        instr2 = OMBCInstruction(
            resource_id="dev-2",
            execution_time=fixed_now,
            operation_mode_id="mymode1",
            operation_mode_factor=1.0,
        )
        plan = EnergyManagementPlan(
            id="plan-123",
            generated_at=fixed_now,
            instructions=[instr1, instr2],
        )
        dev1_instructions = plan.get_instructions_for_resource("dev-1")
        assert len(dev1_instructions) == 1
        assert dev1_instructions[0].resource_id == "dev-1"
    def test_add_various_instructions(self, fixed_now):
        plan = EnergyManagementPlan(
            id="plan-123",
            generated_at=fixed_now,
            instructions=[],
        )

        instrs = [
            DDBCInstruction(
                id="actuatorA@123",
                execution_time=fixed_now,
                actuator_id="actuatorA",
                operation_mode_id="mode123",
                operation_mode_factor=0.5,
            ),
            FRBCInstruction(
                id="actuatorB@456",
                execution_time=fixed_now.add(minutes=1),
                actuator_id="actuatorB",
                operation_mode_id="FRBC_Mode_1",
                operation_mode_factor=1.0,
            ),
            OMBCInstruction(
                id="controller@789",
                execution_time=fixed_now.add(minutes=2),
                operation_mode_id="OMBC_Mode_42",
                operation_mode_factor=0.8,
            ),
            PPBCEndInterruptionInstruction(
                id="end_int@001",
                execution_time=fixed_now.add(minutes=3),
                power_profile_id="profile-123",
                sequence_container_id="container-456",
                power_sequence_id="seq-789",
            ),
            PPBCStartInterruptionInstruction(
                id="start_int@002",
                execution_time=fixed_now.add(minutes=4),
                power_profile_id="profile-321",
                sequence_container_id="container-654",
                power_sequence_id="seq-987",
            ),
            PPBCScheduleInstruction(
                id="schedule@003",
                execution_time=fixed_now.add(minutes=5),
                power_profile_id="profile-999",
                sequence_container_id="container-888",
                power_sequence_id="seq-777",
            ),
            PEBCInstruction(
                id="pebc@004",
                execution_time=fixed_now.add(minutes=6),
                power_constraints_id="pc-123",
                power_envelopes=[
                    PEBCPowerEnvelope(
                        id="pebcpe@1234",
                        commodity_quantity=CommodityQuantity.ELECTRIC_POWER_L1,
                        power_envelope_elements=[
                            PEBCPowerEnvelopeElement(
                                duration=to_duration(10),
                                upper_limit=1010.0,
                                lower_limit=990.0,
                            ),
                        ],
                    ),
                ],
            ),
        ]

        for instr in instrs:
            plan.add_instruction(instr)

        assert len(plan.instructions) == len(instrs)
        # Check that get_instructions_for_resource returns the right instructions
        assert any(
            instr for instr in plan.get_instructions_for_resource("actuatorA")
            if isinstance(instr, DDBCInstruction)
        )
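Taken together, the plan API exercised by this class can be driven like so (a condensed, illustrative sketch; fixed_now stands in for any pendulum datetime as in the fixture above):

# Condensed sketch of the plan query API exercised by these tests:
plan = EnergyManagementPlan(id="plan-demo", generated_at=fixed_now, instructions=[])
plan.add_instruction(OMBCInstruction(
    resource_id="dev-1",
    execution_time=fixed_now,
    operation_mode_id="mymode1",
    operation_mode_factor=1.0,
))
active = plan.get_active_instructions(now=fixed_now)      # instructions already started
upcoming = plan.get_next_instruction(now=fixed_now)       # earliest future instruction, or None
per_device = plan.get_instructions_for_resource("dev-1")  # filter by resource id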
@@ -5,19 +5,16 @@ import requests


class TestEOSDash:
    def test_eosdash_started(self, server_setup_for_class, is_system_test):
        """Test the EOSdash server is started by EOS server."""
        server = server_setup_for_class["server"]
        eosdash_server = server_setup_for_class["eosdash_server"]
        eos_dir = server_setup_for_class["eos_dir"]
        timeout = server_setup_for_class["timeout"]

        # Assure EOSdash is up
    def _assert_server_alive(self, base_url: str, timeout: int):
        """Poll the /eosdash/health endpoint until it is alive or the timeout is reached."""
        startup = False
        error = ""
        for retries in range(int(timeout / 3)):
        result = None

        for _ in range(int(timeout / 3)):
            try:
                result = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
                result = requests.get(f"{base_url}/eosdash/health", timeout=2)
                if result.status_code == HTTPStatus.OK:
                    startup = True
                    break
@@ -25,27 +22,19 @@ class TestEOSDash:
            except Exception as ex:
                error = str(ex)
            time.sleep(3)
        assert startup, f"Connection to {eosdash_server}/eosdash/health failed: {error}"

        assert startup, f"Connection to {base_url}/eosdash/health failed: {error}"
        assert result is not None
        assert result.json()["status"] == "alive"

    def test_eosdash_started(self, server_setup_for_class, is_system_test):
        """Test the EOSdash server is started by EOS server."""
        eosdash_server = server_setup_for_class["eosdash_server"]
        timeout = server_setup_for_class["timeout"]
        self._assert_server_alive(eosdash_server, timeout)

    def test_eosdash_proxied_by_eos(self, server_setup_for_class, is_system_test):
        """Test the EOSdash server proxied by EOS server."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]
        timeout = server_setup_for_class["timeout"]

        # Assure EOSdash is up
        startup = False
        error = ""
        for retries in range(int(timeout / 3)):
            try:
                result = requests.get(f"{server}/eosdash/health", timeout=2)
                if result.status_code == HTTPStatus.OK:
                    startup = True
                    break
                error = f"{result.status_code}, {str(result.content)}"
            except Exception as ex:
                error = str(ex)
            time.sleep(3)
        assert startup, f"Connection to {server}/eosdash/health failed: {error}"
        assert result.json()["status"] == "alive"
        self._assert_server_alive(server, timeout)
tests/test_feedintarifffixed.py (new file, 62 lines)
@@ -0,0 +1,62 @@
import json
from pathlib import Path

import pytest

from akkudoktoreos.core.ems import get_ems
from akkudoktoreos.prediction.feedintarifffixed import FeedInTariffFixed
from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime

DIR_TESTDATA = Path(__file__).absolute().parent.joinpath("testdata")


@pytest.fixture
def provider(config_eos):
    """Fixture to create a FeedInTariffFixed provider instance."""
    settings = {
        "feedintariff": {
            "provider": "FeedInTariffFixed",
            "provider_settings": {
                "FeedInTariffFixed": {
                    "feed_in_tariff_kwh": 0.078,
                },
            },
        }
    }
    config_eos.merge_settings_from_dict(settings)
    assert config_eos.feedintariff.provider == "FeedInTariffFixed"
    provider = FeedInTariffFixed()
    assert provider.enabled()
    return provider


# ------------------------------------------------
# General forecast
# ------------------------------------------------


def test_singleton_instance(provider):
    """Test that FeedInTariffFixed behaves as a singleton."""
    another_instance = FeedInTariffFixed()
    assert provider is another_instance


def test_invalid_provider(provider, config_eos):
    """Test requesting an unsupported provider."""
    settings = {
        "feedintariff": {
            "provider": "<invalid>",
            "provider_settings": {
                "FeedInTariffFixed": {
                    "feed_in_tariff_kwh": 0.078,
                },
            },
        }
    }
    with pytest.raises(ValueError, match="not a valid feed in tariff provider"):
        config_eos.merge_settings_from_dict(settings)


# ------------------------------------------------
# Fixed feed-in tariff values
# ------------------------------------------------
tests/test_geneticoptimize.py (new file, 150 lines)
@@ -0,0 +1,150 @@
import json
from pathlib import Path
from typing import Any
from unittest.mock import patch

import pytest

from akkudoktoreos.config.config import ConfigEOS
from akkudoktoreos.core.cache import CacheEnergyManagementStore
from akkudoktoreos.core.ems import get_ems
from akkudoktoreos.optimization.genetic.genetic import GeneticOptimization
from akkudoktoreos.optimization.genetic.geneticparams import (
    GeneticOptimizationParameters,
)
from akkudoktoreos.optimization.genetic.geneticsolution import GeneticSolution
from akkudoktoreos.utils.datetimeutil import to_datetime
from akkudoktoreos.utils.visualize import (
    prepare_visualize,  # Import the new prepare_visualize
)

ems_eos = get_ems()

DIR_TESTDATA = Path(__file__).parent / "testdata"


def compare_dict(actual: dict[str, Any], expected: dict[str, Any]):
    assert set(actual) == set(expected)

    for key, value in expected.items():
        if isinstance(value, dict):
            assert isinstance(actual[key], dict)
            compare_dict(actual[key], value)
        elif isinstance(value, list):
            assert isinstance(actual[key], list)
            assert actual[key] == pytest.approx(value)
        else:
            assert actual[key] == pytest.approx(value)
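A quick sketch of what this helper tolerates (the numbers are invented; pytest.approx absorbs floating-point noise at every nesting level):

compare_dict(
    {"result": {"cost": 1.0000000001, "soc": [0.5, 0.80000001]}},
    {"result": {"cost": 1.0, "soc": [0.5, 0.8]}},
)  # passes; a differing key set or an out-of-tolerance value raises AssertionError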
@pytest.mark.parametrize(
    "fn_in, fn_out, ngen",
    [
        ("optimize_input_1.json", "optimize_result_1.json", 3),
        ("optimize_input_2.json", "optimize_result_2.json", 3),
        ("optimize_input_2.json", "optimize_result_2_full.json", 400),
    ],
)
def test_optimize(
    fn_in: str,
    fn_out: str,
    ngen: int,
    config_eos: ConfigEOS,
    is_full_run: bool,
):
    """Test optimierung_ems."""
    # Test parameters
    fixed_start_hour = 10
    fixed_seed = 42

    # Assure configuration holds the correct values
    config_eos.merge_settings_from_dict(
        {
            "prediction": {
                "hours": 48
            },
            "optimization": {
                "horizon_hours": 48,
                "genetic": {
                    "individuals": 300,
                    "generations": 10,
                    "penalties": {
                        "ev_soc_miss": 10
                    }
                }
            },
            "devices": {
                "max_electric_vehicles": 1,
                "electric_vehicles": [
                    {
                        "charge_rates": [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0],
                    }
                ],
            }
        }
    )

    # Load input and output data
    file = DIR_TESTDATA / fn_in
    with file.open("r") as f_in:
        input_data = GeneticOptimizationParameters(**json.load(f_in))

    file = DIR_TESTDATA / fn_out
    # If a new test case was added, the expected output file may not exist yet; we don't
    # want to fail here, because the new output is written to disk further below.
    try:
        with file.open("r") as f_out:
            expected_data = json.load(f_out)
            expected_result = GeneticSolution(**expected_data)
    except FileNotFoundError:
        pass

    # Fake energy management run start datetime
    ems_eos.set_start_datetime(to_datetime().set(hour=fixed_start_hour))

    # Throw away any cached results of the last energy management run.
    CacheEnergyManagementStore().clear()

    genetic_optimization = GeneticOptimization(fixed_seed=fixed_seed)

    # Activate with pytest --full-run
    if ngen > 10 and not is_full_run:
        pytest.skip()

    visualize_filename = str((DIR_TESTDATA / f"new_{fn_out}").with_suffix(".pdf"))

    with patch(
        "akkudoktoreos.utils.visualize.prepare_visualize",
        side_effect=lambda parameters, results, *args, **kwargs: prepare_visualize(
            parameters, results, filename=visualize_filename, **kwargs
        ),
    ) as prepare_visualize_patch:
        # Call the optimization function
        genetic_solution = genetic_optimization.optimierung_ems(
            parameters=input_data, start_hour=fixed_start_hour, ngen=ngen
        )
        # The function creates a visualization result PDF as a side-effect.
        prepare_visualize_patch.assert_called_once()
        assert Path(visualize_filename).exists()

    # Write test output to file, so we can take it as new data on intended change
    TESTDATA_FILE = DIR_TESTDATA / f"new_{fn_out}"
    with TESTDATA_FILE.open("w", encoding="utf-8", newline="\n") as f_out:
        f_out.write(genetic_solution.model_dump_json(indent=4, exclude_unset=True))

    assert genetic_solution.result.Gesamtbilanz_Euro == pytest.approx(
        expected_result.result.Gesamtbilanz_Euro
    )

    # Assert that the output contains all expected entries.
    # This does not assert that the optimization always gives the same result!
    # Reproducibility and mathematical accuracy should be tested on the level of
    # individual components.
    compare_dict(genetic_solution.model_dump(), expected_result.model_dump())

    # Check the correct generic optimization solution is created
    optimization_solution = genetic_solution.optimization_solution()
    # @TODO

    # Check the correct generic energy management plan is created
    plan = genetic_solution.energy_management_plan()
    # @TODO
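Until the @TODO assertions are filled in, checks along these lines would exercise the generic plan (method and field names taken from the EnergyManagementPlan tests elsewhere in this change; the assertions themselves are illustrative only):

# Illustrative follow-up checks (not part of the test yet):
assert plan.valid_from is not None
active = plan.get_active_instructions(now=plan.valid_from)
assert isinstance(active, list)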
@@ -1,32 +1,41 @@
import numpy as np
import pytest

from akkudoktoreos.core.ems import (
    EnergyManagement,
    EnergyManagementParameters,
    SimulationResult,
    get_ems,
)
from akkudoktoreos.devices.battery import (
    Battery,
from akkudoktoreos.devices.genetic.battery import Battery
from akkudoktoreos.devices.genetic.homeappliance import HomeAppliance
from akkudoktoreos.devices.genetic.inverter import Inverter
from akkudoktoreos.optimization.genetic.genetic import GeneticSimulation
from akkudoktoreos.optimization.genetic.geneticdevices import (
    ElectricVehicleParameters,
    HomeApplianceParameters,
    InverterParameters,
    SolarPanelBatteryParameters,
)
from akkudoktoreos.devices.generic import HomeAppliance, HomeApplianceParameters
from akkudoktoreos.devices.inverter import Inverter, InverterParameters
from akkudoktoreos.optimization.genetic.geneticparams import (
    GeneticEnergyManagementParameters,
    GeneticOptimizationParameters,
)
from akkudoktoreos.optimization.genetic.geneticsolution import GeneticSimulationResult
from akkudoktoreos.utils.datetimeutil import (
    TimeWindow,
    TimeWindowSequence,
    to_duration,
    to_time,
)

start_hour = 1


# Example initialization of necessary components
@pytest.fixture
def create_ems_instance(devices_eos, config_eos) -> EnergyManagement:
def genetic_simulation(config_eos) -> GeneticSimulation:
    """Fixture to create an EnergyManagement instance with given test parameters."""
    # Assure configuration holds the correct values
    config_eos.merge_settings_from_dict(
        {"prediction": {"hours": 48}, "optimization": {"hours": 24}}
    )
    assert config_eos.prediction.hours == 48
    assert config_eos.optimization.horizon_hours == 24

    # Initialize the battery and the inverter
    akku = Battery(
@@ -35,15 +44,15 @@ def create_ems_instance(devices_eos, config_eos) -> EnergyManagement:
        capacity_wh=5000,
        initial_soc_percentage=80,
        min_soc_percentage=10,
    )
        ),
        prediction_hours=config_eos.prediction.hours,
    )
    akku.reset()
    devices_eos.add_device(akku)

    inverter = Inverter(
        InverterParameters(device_id="inverter1", max_power_wh=10000, battery_id=akku.device_id)
        InverterParameters(device_id="inverter1", max_power_wh=10000, battery_id=akku.parameters.device_id),
        battery=akku,
    )
    devices_eos.add_device(inverter)

    # Household device (currently not used, set to None)
    home_appliance = HomeAppliance(
@@ -51,21 +60,21 @@ def create_ems_instance(devices_eos, config_eos) -> EnergyManagement:
        device_id="dishwasher1",
        consumption_wh=2000,
        duration_h=2,
        time_windows=None,
        ),
        optimization_hours=config_eos.optimization.horizon_hours,
        prediction_hours=config_eos.prediction.hours,
    )
    home_appliance.set_starting_time(2)
    devices_eos.add_device(home_appliance)

    # Example initialization of electric car battery
    eauto = Battery(
        ElectricVehicleParameters(
            device_id="ev1", capacity_wh=26400, initial_soc_percentage=10, min_soc_percentage=10
        ),
        prediction_hours=config_eos.prediction.hours,
    )
    eauto.set_charge_per_hour(np.full(config_eos.prediction.hours, 1))
    devices_eos.add_device(eauto)

    devices_eos.post_setup()

    # Parameters based on previous example data
    pv_prognose_wh = [
@@ -225,39 +234,41 @@ def create_ems_instance(devices_eos, config_eos) -> EnergyManagement:
    ]

    # Initialize the energy management system with the respective parameters
    ems = get_ems()
    ems.set_parameters(
        EnergyManagementParameters(
    simulation = GeneticSimulation()
    simulation.prepare(
        GeneticEnergyManagementParameters(
            pv_prognose_wh=pv_prognose_wh,
            strompreis_euro_pro_wh=strompreis_euro_pro_wh,
            einspeiseverguetung_euro_pro_wh=einspeiseverguetung_euro_pro_wh,
            preis_euro_pro_wh_akku=preis_euro_pro_wh_akku,
            gesamtlast=gesamtlast,
        ),
        optimization_hours=config_eos.optimization.horizon_hours,
        prediction_hours=config_eos.prediction.hours,
        inverter=inverter,
        ev=eauto,
        home_appliance=home_appliance,
    )

    return ems
    return simulation


def test_simulation(create_ems_instance):
def test_simulation(genetic_simulation):
    """Test the EnergyManagement simulation method."""
    ems = create_ems_instance
    simulation = genetic_simulation

    # Simulate starting from hour 1 (this value can be adjusted)

    result = ems.simulate(start_hour=start_hour)
    result = simulation.simulate(start_hour=start_hour)

    # visualisiere_ergebnisse(
    #     ems.gesamtlast,
    #     ems.pv_prognose_wh,
    #     ems.strompreis_euro_pro_wh,
    #     simulation.gesamtlast,
    #     simulation.pv_prognose_wh,
    #     simulation.strompreis_euro_pro_wh,
    #     result,
    #     ems.akku.discharge_array+ems.akku.charge_array,
    #     simulation.akku.discharge_array+simulation.akku.charge_array,
    #     None,
    #     ems.pv_prognose_wh,
    #     simulation.pv_prognose_wh,
    #     start_hour,
    #     48,
    #     np.full(48, 0.0),
@@ -278,7 +289,7 @@ def test_simulation(create_ems_instance):

    # Check that the result is a dictionary
    assert isinstance(result, dict), "Result should be a dictionary."
    assert SimulationResult(**result) is not None
    assert GeneticSimulationResult(**result) is not None

    # Check the length of the main arrays
    assert len(result["Last_Wh_pro_Stunde"]) == 47, (
@@ -341,8 +352,8 @@ def test_simulation(create_ems_instance):
    )

    # Check home appliances
    assert sum(ems.home_appliance.get_load_curve()) == 2000, (
        "The sum of 'ems.home_appliance.get_load_curve()' should be 2000."
    assert sum(simulation.home_appliance.get_load_curve()) == 2000, (
        "The sum of 'simulation.home_appliance.get_load_curve()' should be 2000."
    )

    assert (
@@ -1,46 +1,58 @@
|
||||
import numpy as np
|
||||
import pytest
|
||||
|
||||
from akkudoktoreos.core.ems import (
|
||||
EnergyManagement,
|
||||
EnergyManagementParameters,
|
||||
SimulationResult,
|
||||
get_ems,
|
||||
)
|
||||
from akkudoktoreos.devices.battery import (
|
||||
Battery,
|
||||
from akkudoktoreos.devices.genetic.battery import Battery
|
||||
from akkudoktoreos.devices.genetic.homeappliance import HomeAppliance
|
||||
from akkudoktoreos.devices.genetic.inverter import Inverter
|
||||
from akkudoktoreos.optimization.genetic.genetic import GeneticSimulation
|
||||
from akkudoktoreos.optimization.genetic.geneticdevices import (
|
||||
ElectricVehicleParameters,
|
||||
HomeApplianceParameters,
|
||||
InverterParameters,
|
||||
SolarPanelBatteryParameters,
|
||||
)
|
||||
from akkudoktoreos.devices.generic import HomeAppliance, HomeApplianceParameters
|
||||
from akkudoktoreos.devices.inverter import Inverter, InverterParameters
|
||||
from akkudoktoreos.optimization.genetic.geneticparams import (
|
||||
GeneticEnergyManagementParameters,
|
||||
GeneticOptimizationParameters,
|
||||
)
|
||||
from akkudoktoreos.optimization.genetic.geneticsolution import GeneticSimulationResult
|
||||
from akkudoktoreos.utils.datetimeutil import (
|
||||
TimeWindow,
|
||||
TimeWindowSequence,
|
||||
to_duration,
|
||||
to_time,
|
||||
)
|
||||
|
||||
start_hour = 0
|
||||
|
||||
|
||||
# Example initialization of necessary components
|
||||
@pytest.fixture
|
||||
def create_ems_instance(devices_eos, config_eos) -> EnergyManagement:
|
||||
def genetic_simulation_2(config_eos) -> GeneticSimulation:
|
||||
"""Fixture to create an EnergyManagement instance with given test parameters."""
|
||||
# Assure configuration holds the correct values
|
||||
config_eos.merge_settings_from_dict(
|
||||
{"prediction": {"hours": 48}, "optimization": {"hours": 24}}
|
||||
)
|
||||
assert config_eos.prediction.hours == 48
|
||||
assert config_eos.optimization.horizon_hours == 24
|
||||
|
||||
# Initialize the battery and the inverter
|
||||
akku = Battery(
|
||||
SolarPanelBatteryParameters(
|
||||
device_id="pv1", capacity_wh=5000, initial_soc_percentage=80, min_soc_percentage=10
|
||||
)
|
||||
device_id="battery1",
|
||||
capacity_wh=5000,
|
||||
initial_soc_percentage=80,
|
||||
min_soc_percentage=10,
|
||||
),
|
||||
prediction_hours = config_eos.prediction.hours,
|
||||
)
|
||||
akku.reset()
|
||||
devices_eos.add_device(akku)
|
||||
|
||||
inverter = Inverter(
|
||||
InverterParameters(device_id="iv1", max_power_wh=10000, battery_id=akku.device_id)
|
||||
InverterParameters(device_id="inverter1", max_power_wh=10000, battery_id=akku.parameters.device_id),
|
||||
battery = akku,
|
||||
)
|
||||
devices_eos.add_device(inverter)
|
||||
|
||||
# Household device (currently not used, set to None)
|
||||
home_appliance = HomeAppliance(
|
||||
@@ -48,20 +60,20 @@ def create_ems_instance(devices_eos, config_eos) -> EnergyManagement:
|
||||
device_id="dishwasher1",
|
||||
consumption_wh=2000,
|
||||
duration_h=2,
|
||||
)
|
||||
time_windows=None,
|
||||
),
|
||||
optimization_hours = config_eos.optimization.horizon_hours,
|
||||
prediction_hours = config_eos.prediction.hours,
|
||||
)
|
||||
home_appliance.set_starting_time(2)
|
||||
devices_eos.add_device(home_appliance)
|
||||
|
||||
# Example initialization of electric car battery
|
||||
eauto = Battery(
|
||||
ElectricVehicleParameters(
|
||||
device_id="ev1", capacity_wh=26400, initial_soc_percentage=100, min_soc_percentage=100
|
||||
device_id="ev1", capacity_wh=26400, initial_soc_percentage=10, min_soc_percentage=10
|
||||
),
|
||||
prediction_hours = config_eos.prediction.hours,
|
||||
)
|
||||
devices_eos.add_device(eauto)
|
||||
|
||||
devices_eos.post_setup()
|
||||
|
||||
# Parameters based on previous example data
|
||||
pv_prognose_wh = [0.0] * config_eos.prediction.hours
|
||||
@@ -128,15 +140,17 @@ def create_ems_instance(devices_eos, config_eos) -> EnergyManagement:
|
||||
]
|
||||
|
||||
# Initialize the energy management system with the respective parameters
|
||||
ems = get_ems()
|
||||
ems.set_parameters(
|
||||
EnergyManagementParameters(
|
||||
simulation = GeneticSimulation()
|
||||
simulation.prepare(
|
||||
GeneticEnergyManagementParameters(
|
||||
pv_prognose_wh=pv_prognose_wh,
|
||||
strompreis_euro_pro_wh=strompreis_euro_pro_wh,
|
||||
einspeiseverguetung_euro_pro_wh=einspeiseverguetung_euro_pro_wh,
|
||||
preis_euro_pro_wh_akku=preis_euro_pro_wh_akku,
|
||||
gesamtlast=gesamtlast,
|
||||
),
|
||||
optimization_hours = config_eos.optimization.horizon_hours,
|
||||
prediction_hours = config_eos.prediction.hours,
|
||||
inverter=inverter,
|
||||
ev=eauto,
|
||||
home_appliance=home_appliance,
|
||||
@@ -144,30 +158,30 @@ def create_ems_instance(devices_eos, config_eos) -> EnergyManagement:
|
||||
|
||||
ac = np.full(config_eos.prediction.hours, 0.0)
|
||||
ac[20] = 1
|
||||
ems.set_akku_ac_charge_hours(ac)
|
||||
simulation.set_akku_ac_charge_hours(ac)
|
||||
dc = np.full(config_eos.prediction.hours, 0.0)
|
||||
dc[11] = 1
|
||||
ems.set_akku_dc_charge_hours(dc)
|
||||
simulation.set_akku_dc_charge_hours(dc)
|
||||
|
||||
return ems
|
||||
return simulation
|
||||
|
||||
|
||||
def test_simulation(create_ems_instance):
|
||||
def test_simulation(genetic_simulation_2):
|
||||
"""Test the EnergyManagement simulation method."""
|
||||
ems = create_ems_instance
|
||||
simulation = genetic_simulation_2
|
||||
|
||||
# Simulate starting from hour 0 (this value can be adjusted)
|
||||
result = ems.simulate(start_hour=start_hour)
|
||||
result = simulation.simulate(start_hour=start_hour)
|
||||
|
||||
# --- Pls do not remove! ---
|
||||
# visualisiere_ergebnisse(
|
||||
# ems.gesamtlast,
|
||||
# ems.pv_prognose_wh,
|
||||
# ems.strompreis_euro_pro_wh,
|
||||
# simulation.gesamtlast,
|
||||
# simulation.pv_prognose_wh,
|
||||
# simulation.strompreis_euro_pro_wh,
|
||||
# result,
|
||||
# ems.akku.discharge_array+ems.akku.charge_array,
|
||||
# simulation.akku.discharge_array+simulation.akku.charge_array,
|
||||
# None,
|
||||
# ems.pv_prognose_wh,
|
||||
# simulation.pv_prognose_wh,
|
||||
# start_hour,
|
||||
# 48,
|
||||
# np.full(48, 0.0),
|
||||
@@ -178,7 +192,7 @@ def test_simulation(create_ems_instance):
|
||||
# Assertions to validate results
|
||||
assert result is not None, "Result should not be None"
|
||||
assert isinstance(result, dict), "Result should be a dictionary"
|
||||
assert SimulationResult(**result) is not None
|
||||
assert GeneticSimulationResult(**result) is not None
|
||||
assert "Last_Wh_pro_Stunde" in result, "Result should contain 'Last_Wh_pro_Stunde'"
|
||||
|
||||
"""
|
@@ -253,73 +267,64 @@ def test_simulation(create_ems_instance):
     print("All tests passed successfully.")


-def test_set_parameters(create_ems_instance):
+def test_set_parameters(genetic_simulation_2):
     """Test the set_parameters method of EnergyManagement."""
-    ems = create_ems_instance
+    simulation = genetic_simulation_2

     # Check if parameters are set correctly
-    assert ems.load_energy_array is not None, "load_energy_array should not be None"
-    assert ems.pv_prediction_wh is not None, "pv_prediction_wh should not be None"
-    assert ems.elect_price_hourly is not None, "elect_price_hourly should not be None"
-    assert ems.elect_revenue_per_hour_arr is not None, (
+    assert simulation.load_energy_array is not None, "load_energy_array should not be None"
+    assert simulation.pv_prediction_wh is not None, "pv_prediction_wh should not be None"
+    assert simulation.elect_price_hourly is not None, "elect_price_hourly should not be None"
+    assert simulation.elect_revenue_per_hour_arr is not None, (
         "elect_revenue_per_hour_arr should not be None"
     )


-def test_set_akku_discharge_hours(create_ems_instance):
+def test_set_akku_discharge_hours(genetic_simulation_2):
     """Test the set_akku_discharge_hours method of EnergyManagement."""
-    ems = create_ems_instance
-    discharge_hours = np.full(ems.config.prediction.hours, 1.0)
-    ems.set_akku_discharge_hours(discharge_hours)
-    assert np.array_equal(ems.battery.discharge_array, discharge_hours), (
+    simulation = genetic_simulation_2
+    discharge_hours = np.full(simulation.prediction_hours, 1.0)
+    simulation.set_akku_discharge_hours(discharge_hours)
+    assert np.array_equal(simulation.battery.discharge_array, discharge_hours), (
         "Discharge hours should be set correctly"
     )


-def test_set_akku_ac_charge_hours(create_ems_instance):
+def test_set_akku_ac_charge_hours(genetic_simulation_2):
     """Test the set_akku_ac_charge_hours method of EnergyManagement."""
-    ems = create_ems_instance
-    ac_charge_hours = np.full(ems.config.prediction.hours, 1.0)
-    ems.set_akku_ac_charge_hours(ac_charge_hours)
-    assert np.array_equal(ems.ac_charge_hours, ac_charge_hours), (
+    simulation = genetic_simulation_2
+    ac_charge_hours = np.full(simulation.prediction_hours, 1.0)
+    simulation.set_akku_ac_charge_hours(ac_charge_hours)
+    assert np.array_equal(simulation.ac_charge_hours, ac_charge_hours), (
         "AC charge hours should be set correctly"
     )


-def test_set_akku_dc_charge_hours(create_ems_instance):
+def test_set_akku_dc_charge_hours(genetic_simulation_2):
     """Test the set_akku_dc_charge_hours method of EnergyManagement."""
-    ems = create_ems_instance
-    dc_charge_hours = np.full(ems.config.prediction.hours, 1.0)
-    ems.set_akku_dc_charge_hours(dc_charge_hours)
-    assert np.array_equal(ems.dc_charge_hours, dc_charge_hours), (
+    simulation = genetic_simulation_2
+    dc_charge_hours = np.full(simulation.prediction_hours, 1.0)
+    simulation.set_akku_dc_charge_hours(dc_charge_hours)
+    assert np.array_equal(simulation.dc_charge_hours, dc_charge_hours), (
         "DC charge hours should be set correctly"
     )


-def test_set_ev_charge_hours(create_ems_instance):
+def test_set_ev_charge_hours(genetic_simulation_2):
     """Test the set_ev_charge_hours method of EnergyManagement."""
-    ems = create_ems_instance
-    ev_charge_hours = np.full(ems.config.prediction.hours, 1.0)
-    ems.set_ev_charge_hours(ev_charge_hours)
-    assert np.array_equal(ems.ev_charge_hours, ev_charge_hours), (
+    simulation = genetic_simulation_2
+    ev_charge_hours = np.full(simulation.prediction_hours, 1.0)
+    simulation.set_ev_charge_hours(ev_charge_hours)
+    assert np.array_equal(simulation.ev_charge_hours, ev_charge_hours), (
         "EV charge hours should be set correctly"
     )


-def test_reset(create_ems_instance):
+def test_reset(genetic_simulation_2):
     """Test the reset method of EnergyManagement."""
-    ems = create_ems_instance
-    ems.reset()
-    assert ems.ev.current_soc_percentage() == 100, "EV SOC should be reset to initial value"
-    assert ems.battery.current_soc_percentage() == 80, (
+    simulation = genetic_simulation_2
+    simulation.reset()
+    assert simulation.ev.current_soc_percentage() == simulation.ev.parameters.initial_soc_percentage, "EV SOC should be reset to initial value"
+    assert simulation.battery.current_soc_percentage() == simulation.battery.parameters.initial_soc_percentage, (
         "Battery SOC should be reset to initial value"
     )

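Note: asserting against `parameters.initial_soc_percentage` instead of the literals 100 and 80 keeps the test valid if the fixture's initial values ever change. A self-contained sketch of the pattern (all names hypothetical, not the EOS classes):

    class ParamsSketch:
        initial_soc_percentage = 80

    class BatterySketch:
        parameters = ParamsSketch()

        def __init__(self):
            self.soc = 42.0  # some drifted state

        def reset(self):
            self.soc = float(self.parameters.initial_soc_percentage)

        def current_soc_percentage(self):
            return self.soc

    battery = BatterySketch()
    battery.reset()
    assert battery.current_soc_percentage() == battery.parameters.initial_soc_percentage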
 def test_simulate_start_now(create_ems_instance):
     """Test the simulate_start_now method of EnergyManagement."""
     ems = create_ems_instance
     result = ems.simulate_start_now()
     assert result is not None, "Result should not be None"
     assert isinstance(result, dict), "Result should be a dictionary"
     assert "Last_Wh_pro_Stunde" in result, "Result should contain 'Last_Wh_pro_Stunde'"
@@ -1,6 +1,6 @@
 import pytest

-from akkudoktoreos.devices.heatpump import Heatpump
+from akkudoktoreos.devices.genetic.heatpump import Heatpump


 @pytest.fixture(scope="function")

@@ -2,7 +2,7 @@ from unittest.mock import Mock, patch

 import pytest

-from akkudoktoreos.devices.inverter import Inverter, InverterParameters
+from akkudoktoreos.devices.genetic.inverter import Inverter, InverterParameters


 @pytest.fixture
@@ -10,26 +10,24 @@ def mock_battery() -> Mock:
     mock_battery = Mock()
     mock_battery.charge_energy = Mock(return_value=(0.0, 0.0))
     mock_battery.discharge_energy = Mock(return_value=(0.0, 0.0))
-    mock_battery.device_id = "battery1"
+    mock_battery.parameters.device_id = "battery1"
     return mock_battery


 @pytest.fixture
-def inverter(mock_battery, devices_eos) -> Inverter:
-    devices_eos.add_device(mock_battery)
+def inverter(mock_battery) -> Inverter:
     mock_self_consumption_predictor = Mock()
     mock_self_consumption_predictor.calculate_self_consumption.return_value = 1.0
     with patch(
-        "akkudoktoreos.devices.inverter.get_eos_load_interpolator",
+        "akkudoktoreos.devices.genetic.inverter.get_eos_load_interpolator",
         return_value=mock_self_consumption_predictor,
     ):
         iv = Inverter(
             InverterParameters(
-                device_id="iv1", max_power_wh=500.0, battery_id=mock_battery.device_id
+                device_id="iv1", max_power_wh=500.0, battery_id=mock_battery.parameters.device_id
             ),
+            battery=mock_battery,
         )
-    devices_eos.add_device(iv)
-    devices_eos.post_setup()
     return iv

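Note: after the module move to akkudoktoreos.devices.genetic.inverter, every patch() target has to move with it, because unittest.mock.patch replaces the name where it is looked up, not where it is defined. A generic, runnable illustration using only the standard library:

    import os
    from unittest.mock import patch

    # Patch the lookup path; inside the context the attribute is replaced.
    with patch("os.getcwd", return_value="/tmp"):
        assert os.getcwd() == "/tmp"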
@@ -20,14 +20,18 @@ def provider(config_eos):
         "load": {
             "provider": "LoadAkkudoktor",
             "provider_settings": {
-                "load_name": "Akkudoktor Profile",
-                "loadakkudoktor_year_energy": "1000",
+                "LoadAkkudoktor": {
+                    "loadakkudoktor_year_energy": "1000",
+                },
             },
         },
+        "measurement": {
+            "load_emr_keys": ["load0_mr", "load1_mr"]
+        }
     }
     config_eos.merge_settings_from_dict(settings)
     assert config_eos.load.provider == "LoadAkkudoktor"
-    assert config_eos.load.provider_settings.loadakkudoktor_year_energy == 1000
+    assert config_eos.load.provider_settings.LoadAkkudoktor.loadakkudoktor_year_energy == 1000
     return LoadAkkudoktor()

@@ -19,8 +19,10 @@ def load_vrm_instance(config_eos):
         "load": {
             "provider": "LoadVrm",
             "provider_settings": {
-                "load_vrm_token": "dummy-token",
-                "load_vrm_idsite": 12345
+                "LoadVrm": {
+                    "load_vrm_token": "dummy-token",
+                    "load_vrm_idsite": 12345,
+                },
             }
         }
     }

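Note: the two hunks above show the shape change in provider_settings: settings are now nested under a key equal to the provider name, so each provider's settings are addressed explicitly instead of sitting flat next to each other. Side by side, as plain dicts:

    # old, flat
    {"load": {"provider": "LoadVrm",
              "provider_settings": {"load_vrm_token": "dummy-token",
                                    "load_vrm_idsite": 12345}}}

    # new, keyed by provider name
    {"load": {"provider": "LoadVrm",
              "provider_settings": {"LoadVrm": {"load_vrm_token": "dummy-token",
                                                "load_vrm_idsite": 12345}}}}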
@@ -10,217 +10,390 @@ from akkudoktoreos.measurement.measurement import (
 )


-@pytest.fixture
-def measurement_eos():
-    """Fixture to create a Measurement instance."""
-    measurement = get_measurement()
-    measurement.records = [
-        MeasurementDataRecord(
-            date_time=datetime(2023, 1, 1, hour=0),
-            load0_mr=100,
-            load1_mr=200,
-        ),
-        MeasurementDataRecord(
-            date_time=datetime(2023, 1, 1, hour=1),
-            load0_mr=150,
-            load1_mr=250,
-        ),
-        MeasurementDataRecord(
-            date_time=datetime(2023, 1, 1, hour=2),
-            load0_mr=200,
-            load1_mr=300,
-        ),
-        MeasurementDataRecord(
-            date_time=datetime(2023, 1, 1, hour=3),
-            load0_mr=250,
-            load1_mr=350,
-        ),
-        MeasurementDataRecord(
-            date_time=datetime(2023, 1, 1, hour=4),
-            load0_mr=300,
-            load1_mr=400,
-        ),
-        MeasurementDataRecord(
-            date_time=datetime(2023, 1, 1, hour=5),
-            load0_mr=350,
-            load1_mr=450,
-        ),
-    ]
-    return measurement
+class TestMeasurementDataRecord:
+    """Test suite for the MeasurementDataRecord class.
+
+    Ensuring that both dictionary-like and attribute-style access work correctly for fields and
+    configured measurements.
+    """
+
+    @pytest.fixture
+    def sample_config(self, config_eos):
+        """Fixture to configure the measurement keys on the global config."""
+        config_eos.measurement.load_emr_keys = ["dish_washer_mr", "temp"]
+        config_eos.measurement.pv_production_emr_keys = ["solar_power"]
+        return config_eos
+
+    @pytest.fixture
+    def record(self, sample_config):
+        """Fixture to create a sample MeasurementDataRecord with some measurements set."""
+        rec = MeasurementDataRecord(date_time=None)
+        rec.configured_data = {"dish_washer_mr": 123.0, "solar_power": 456.0}
+        return rec
+
+    def test_record_keys_includes_measurement_keys(self, record):
+        """Ensure record_keys includes all configured measurement keys."""
+        assert set(record.record_keys()) >= set(record.config.measurement.keys)
+
+    def test_record_keys_writable_includes_measurement_keys(self, record):
+        """Ensure record_keys_writable includes all configured measurement keys."""
+        assert set(record.record_keys_writable()) >= set(record.config.measurement.keys)
+
+    def test_getitem_existing_field(self, record):
+        """Test that __getitem__ returns correct value for existing native field."""
+        record.date_time = "2024-01-01T00:00:00+00:00"
+        assert record["date_time"] is not None
+
+    def test_getitem_existing_measurement(self, record):
+        """Test that __getitem__ retrieves existing measurement values."""
+        assert record["dish_washer_mr"] == 123.0
+        assert record["solar_power"] == 456.0
+
+    def test_getitem_missing_measurement_returns_none(self, record):
+        """Test that __getitem__ returns None for missing but known measurement keys."""
+        assert record["temp"] is None
+
+    def test_getitem_raises_keyerror(self, record):
+        """Test that __getitem__ raises KeyError for completely unknown keys."""
+        with pytest.raises(KeyError):
+            _ = record["nonexistent"]
+
+    def test_setitem_field(self, record):
+        """Test setting a native field using __setitem__."""
+        record["date_time"] = "2025-01-01T12:00:00+00:00"
+        assert str(record.date_time).startswith("2025-01-01")
+
+    def test_setitem_measurement(self, record):
+        """Test setting a known measurement key using __setitem__."""
+        record["temp"] = 25.5
+        assert record["temp"] == 25.5
+
+    def test_setitem_invalid_key_raises(self, record):
+        """Test that __setitem__ raises KeyError for unknown keys."""
+        with pytest.raises(KeyError):
+            record["unknown_key"] = 123
+
+    def test_delitem_field(self, record):
+        """Test deleting a native field using __delitem__."""
+        record["date_time"] = "2025-01-01T12:00:00+00:00"
+        del record["date_time"]
+        assert record.date_time is None
+
+    def test_delitem_measurement(self, record):
+        """Test deleting a known measurement key using __delitem__."""
+        del record["solar_power"]
+        assert record["solar_power"] is None
+
+    def test_delitem_unknown_raises(self, record):
+        """Test that __delitem__ raises KeyError for unknown keys."""
+        with pytest.raises(KeyError):
+            del record["nonexistent"]
+
+    def test_attribute_get_existing_field(self, record):
+        """Test accessing a native field via attribute."""
+        record.date_time = "2025-01-01T12:00:00+00:00"
+        assert record.date_time is not None
+
+    def test_attribute_get_existing_measurement(self, record):
+        """Test accessing an existing measurement via attribute."""
+        assert record.dish_washer_mr == 123.0
+
+    def test_attribute_get_missing_measurement(self, record):
+        """Test accessing a missing but known measurement returns None."""
+        assert record.temp is None
+
+    def test_attribute_get_invalid_raises(self, record):
+        """Test accessing an unknown attribute raises AttributeError."""
+        with pytest.raises(AttributeError):
+            _ = record.nonexistent
+
+    def test_attribute_set_existing_field(self, record):
+        """Test setting a native field via attribute."""
+        record.date_time = "2025-06-25T12:00:00+00:00"
+        assert record.date_time is not None
+
+    def test_attribute_set_existing_measurement(self, record):
+        """Test setting a known measurement key via attribute."""
+        record.temp = 99.9
+        assert record["temp"] == 99.9
+
+    def test_attribute_set_invalid_raises(self, record):
+        """Test setting an unknown attribute raises AttributeError."""
+        with pytest.raises(AttributeError):
+            record.invalid = 123
+
+    def test_delattr_field(self, record):
+        """Test deleting a native field via attribute."""
+        record.date_time = "2025-06-25T12:00:00+00:00"
+        del record.date_time
+        assert record.date_time is None
+
+    def test_delattr_measurement(self, record):
+        """Test deleting a known measurement key via attribute."""
+        record.temp = 88.0
+        del record.temp
+        assert record.temp is None
+
+    def test_delattr_ignored_missing_measurement_key(self, record):
+        """Test deleting a known measurement key that was never set is a no-op."""
+        del record.temp
+        assert record.temp is None
+
+    def test_len_and_iter(self, record):
+        """Test that __len__ and __iter__ behave as expected."""
+        keys = list(iter(record))
+        assert set(record.record_keys_writable()) == set(keys)
+        assert len(record) == len(keys)
+
+    def test_in_operator_includes_measurements(self, record):
+        """Test that 'in' operator includes measurement keys."""
+        assert "dish_washer_mr" in record
+        assert "temp" in record  # known key, even if not yet set
+        assert "nonexistent" not in record
+
+    def test_hasattr_behavior(self, record):
+        """Test that hasattr returns True for fields and known measurements."""
+        assert hasattr(record, "date_time")
+        assert hasattr(record, "dish_washer_mr")
+        assert hasattr(record, "temp")  # allowed, even if not yet set
+        assert not hasattr(record, "nonexistent")
+
+    def test_model_validate_roundtrip(self, record):
+        """Test that MeasurementDataRecord can be serialized and revalidated."""
+        dumped = record.model_dump()
+        restored = MeasurementDataRecord.model_validate(dumped)
+        assert restored.dish_washer_mr == 123.0
+        assert restored.solar_power == 456.0
+        assert restored.temp is None  # not set
+
+    def test_copy_preserves_measurements(self, record):
+        """Test that copying preserves measurement values."""
+        record.temp = 22.2
+        copied = record.model_copy()
+        assert copied.dish_washer_mr == 123.0
+        assert copied.temp == 22.2
+        assert copied is not record
+
+    def test_equality_includes_measurements(self, record):
+        """Test that equality includes the `measurements` content."""
+        other = record.model_copy()
+        assert record == other
+
+    def test_inequality_differs_with_measurements(self, record):
+        """Test that records with different measurements are not equal."""
+        other = record.model_copy(deep=True)
+        # Modify one measurement value in the copy
+        other["dish_washer_mr"] = 999.9
+        assert record != other
+
+    def test_in_operator_for_measurements_and_fields(self, record):
+        """Ensure 'in' works for both fields and configured measurement keys."""
+        assert "dish_washer_mr" in record
+        assert "solar_power" in record
+        assert "date_time" in record  # standard field
+        assert "temp" in record  # allowed but not yet set
+        assert "unknown" not in record
+
+    def test_hasattr_equivalence_to_getattr(self, record):
+        """hasattr should return True for all valid keys/measurements."""
+        assert hasattr(record, "dish_washer_mr")
+        assert hasattr(record, "temp")
+        assert hasattr(record, "date_time")
+        assert not hasattr(record, "nonexistent")
+
+    def test_dir_includes_measurement_keys(self, record):
+        """`dir(record)` should include measurement keys for introspection.
+
+        It shall not include the internal 'measurements' attribute.
+        """
+        keys = dir(record)
+        assert "measurements" not in keys
+        for key in record.config.measurement.keys:
+            assert key in keys

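Note: the suite above pins down a dual access model: dict-style and attribute-style reads agree, known-but-unset measurement keys read as None, and unknown keys raise. A minimal, self-contained sketch of that contract (not the EOS implementation):

    class RecordSketch:
        KNOWN = {"dish_washer_mr", "temp"}

        def __init__(self):
            self._data = {}

        def __getitem__(self, key):
            if key in self.KNOWN:
                return self._data.get(key)  # None when known but unset
            raise KeyError(key)

        def __setitem__(self, key, value):
            if key not in self.KNOWN:
                raise KeyError(key)
            self._data[key] = value

    rec = RecordSketch()
    rec["temp"] = 25.5
    assert rec["temp"] == 25.5
    assert rec["dish_washer_mr"] is None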
+class TestMeasurement:
+    """Test suite for the Measurement class."""
+
+    @pytest.fixture
+    def measurement_eos(self, config_eos):
+        """Fixture to create a Measurement instance."""
+        config_eos.measurement.load_emr_keys = ["load0_mr", "load1_mr", "load2_mr", "load3_mr"]
+        measurement = get_measurement()
+        record0 = MeasurementDataRecord(
+            date_time=datetime(2023, 1, 1, hour=0),
+            load0_mr=100,
+            load1_mr=200,
+        )
+        assert record0.load0_mr == 100
+        assert record0.load1_mr == 200
+        measurement.records = [
+            MeasurementDataRecord(
+                date_time=datetime(2023, 1, 1, hour=0),
+                load0_mr=100,
+                load1_mr=200,
+            ),
+            MeasurementDataRecord(
+                date_time=datetime(2023, 1, 1, hour=1),
+                load0_mr=150,
+                load1_mr=250,
+            ),
+            MeasurementDataRecord(
+                date_time=datetime(2023, 1, 1, hour=2),
+                load0_mr=200,
+                load1_mr=300,
+            ),
+            MeasurementDataRecord(
+                date_time=datetime(2023, 1, 1, hour=3),
+                load0_mr=250,
+                load1_mr=350,
+            ),
+            MeasurementDataRecord(
+                date_time=datetime(2023, 1, 1, hour=4),
+                load0_mr=300,
+                load1_mr=400,
+            ),
+            MeasurementDataRecord(
+                date_time=datetime(2023, 1, 1, hour=5),
+                load0_mr=350,
+                load1_mr=450,
+            ),
+        ]
+        return measurement

-def test_interval_count(measurement_eos):
-    """Test interval count calculation."""
-    start = datetime(2023, 1, 1, 0)
-    end = datetime(2023, 1, 1, 3)
-    interval = duration(hours=1)
-
-    assert measurement_eos._interval_count(start, end, interval) == 3
-
-
-def test_interval_count_invalid_end_before_start(measurement_eos):
-    """Test interval count raises ValueError when end_datetime is before start_datetime."""
-    start = datetime(2023, 1, 1, 3)
-    end = datetime(2023, 1, 1, 0)
-    interval = duration(hours=1)
-
-    with pytest.raises(ValueError, match="end_datetime must be after start_datetime"):
-        measurement_eos._interval_count(start, end, interval)
-
-
-def test_interval_count_invalid_non_positive_interval(measurement_eos):
-    """Test interval count raises ValueError when interval is non-positive."""
-    start = datetime(2023, 1, 1, 0)
-    end = datetime(2023, 1, 1, 3)
-
-    with pytest.raises(ValueError, match="interval must be positive"):
-        measurement_eos._interval_count(start, end, duration(hours=0))
-
-
-def test_energy_from_meter_readings_valid_input(measurement_eos):
-    """Test _energy_from_meter_readings with valid inputs and proper alignment of load data."""
-    key = "load0_mr"
-    start_datetime = datetime(2023, 1, 1, 0)
-    end_datetime = datetime(2023, 1, 1, 5)
-    interval = duration(hours=1)
-
-    load_array = measurement_eos._energy_from_meter_readings(
-        key, start_datetime, end_datetime, interval
-    )
-
-    expected_load_array = np.array([50, 50, 50, 50, 50])  # Differences between consecutive readings
-    np.testing.assert_array_equal(load_array, expected_load_array)
-
-
-def test_energy_from_meter_readings_empty_array(measurement_eos):
-    """Test _energy_from_meter_readings with no data (empty array)."""
-    key = "load0_mr"
-    start_datetime = datetime(2023, 1, 1, 0)
-    end_datetime = datetime(2023, 1, 1, 5)
-    interval = duration(hours=1)
-
-    # Use empty records array
-    measurement_eos.records = []
-
-    load_array = measurement_eos._energy_from_meter_readings(
-        key, start_datetime, end_datetime, interval
-    )
-
-    # Expected: an array of zeros with one less than the number of intervals
-    expected_size = (
-        measurement_eos._interval_count(start_datetime, end_datetime + interval, interval) - 1
-    )
-    expected_load_array = np.zeros(expected_size)
-    np.testing.assert_array_equal(load_array, expected_load_array)
-
-
-def test_energy_from_meter_readings_misaligned_array(measurement_eos):
-    """Test _energy_from_meter_readings with misaligned array size."""
-    key = "load1_mr"
-    start_datetime = measurement_eos.min_datetime
-    end_datetime = measurement_eos.max_datetime
-    interval = duration(hours=1)
-
-    # Use misaligned array, latest interval set to 2 hours (instead of 1 hour)
-    measurement_eos.records[-1].date_time = datetime(2023, 1, 1, 6)
-
-    load_array = measurement_eos._energy_from_meter_readings(
-        key, start_datetime, end_datetime, interval
-    )
-
-    expected_load_array = np.array([50, 50, 50, 50, 25])  # Differences between consecutive readings
-    np.testing.assert_array_equal(load_array, expected_load_array)
-
-
-def test_energy_from_meter_readings_partial_data(measurement_eos, caplog):
-    """Test _energy_from_meter_readings with partial data (misaligned but empty array)."""
-    key = "load2_mr"
-    start_datetime = datetime(2023, 1, 1, 0)
-    end_datetime = datetime(2023, 1, 1, 5)
-    interval = duration(hours=1)
-
-    with caplog.at_level("DEBUG"):
-        load_array = measurement_eos._energy_from_meter_readings(
-            key, start_datetime, end_datetime, interval
-        )
-
-    expected_size = (
-        measurement_eos._interval_count(start_datetime, end_datetime + interval, interval) - 1
-    )
-    expected_load_array = np.zeros(expected_size)
-    np.testing.assert_array_equal(load_array, expected_load_array)
-
-
-def test_energy_from_meter_readings_negative_interval(measurement_eos):
-    """Test _energy_from_meter_readings with a negative interval."""
-    key = "load3_mr"
-    start_datetime = datetime(2023, 1, 1, 0)
-    end_datetime = datetime(2023, 1, 1, 5)
-    interval = duration(hours=-1)
-
-    with pytest.raises(ValueError, match="interval must be positive"):
-        measurement_eos._energy_from_meter_readings(key, start_datetime, end_datetime, interval)
-
-
-def test_load_total(measurement_eos):
-    """Test total load calculation."""
-    start = datetime(2023, 1, 1, 0)
-    end = datetime(2023, 1, 1, 2)
-    interval = duration(hours=1)
-
-    result = measurement_eos.load_total(start_datetime=start, end_datetime=end, interval=interval)
-
-    # Expected total load per interval
-    expected = np.array([100, 100])  # Differences between consecutive meter readings
-    np.testing.assert_array_equal(result, expected)
-
-
-def test_load_total_no_data(measurement_eos):
-    """Test total load calculation with no data."""
-    measurement_eos.records = []
-    start = datetime(2023, 1, 1, 0)
-    end = datetime(2023, 1, 1, 3)
-    interval = duration(hours=1)
-
-    result = measurement_eos.load_total(start_datetime=start, end_datetime=end, interval=interval)
-    expected = np.zeros(3)  # No data, so all intervals are zero
-    np.testing.assert_array_equal(result, expected)
-
-
-def test_name_to_key(measurement_eos):
-    """Test name_to_key functionality."""
-    settings = SettingsEOS(
-        measurement=MeasurementCommonSettings(
-            load0_name="Household",
-            load1_name="Heat Pump",
-        )
-    )
-    measurement_eos.config.merge_settings(settings)
-
-    assert measurement_eos.name_to_key("Household", "load") == "load0_mr"
-    assert measurement_eos.name_to_key("Heat Pump", "load") == "load1_mr"
-    assert measurement_eos.name_to_key("Unknown", "load") is None
-
-
-def test_name_to_key_invalid_topic(measurement_eos):
-    """Test name_to_key with an invalid topic."""
-    settings = SettingsEOS(
-        MeasurementCommonSettings(
-            load0_name="Household",
-            load1_name="Heat Pump",
-        )
-    )
-    measurement_eos.config.merge_settings(settings)
-
-    assert measurement_eos.name_to_key("Household", "invalid_topic") is None
-
-
-def test_load_total_partial_intervals(measurement_eos):
-    """Test total load calculation with partial intervals."""
-    start = datetime(2023, 1, 1, 0, 30)  # Start in the middle of an interval
-    end = datetime(2023, 1, 1, 1, 30)  # End in the middle of another interval
-    interval = duration(hours=1)
-
-    result = measurement_eos.load_total(start_datetime=start, end_datetime=end, interval=interval)
-    expected = np.array([100])  # Only one complete interval covered
-    np.testing.assert_array_equal(result, expected)
+    def test_interval_count(self, measurement_eos):
+        """Test interval count calculation."""
+        start = datetime(2023, 1, 1, 0)
+        end = datetime(2023, 1, 1, 3)
+        interval = duration(hours=1)
+
+        assert measurement_eos._interval_count(start, end, interval) == 3
+
+    def test_interval_count_invalid_end_before_start(self, measurement_eos):
+        """Test interval count raises ValueError when end_datetime is before start_datetime."""
+        start = datetime(2023, 1, 1, 3)
+        end = datetime(2023, 1, 1, 0)
+        interval = duration(hours=1)
+
+        with pytest.raises(ValueError, match="end_datetime must be after start_datetime"):
+            measurement_eos._interval_count(start, end, interval)
+
+    def test_interval_count_invalid_non_positive_interval(self, measurement_eos):
+        """Test interval count raises ValueError when interval is non-positive."""
+        start = datetime(2023, 1, 1, 0)
+        end = datetime(2023, 1, 1, 3)
+
+        with pytest.raises(ValueError, match="interval must be positive"):
+            measurement_eos._interval_count(start, end, duration(hours=0))
+
+    def test_energy_from_meter_readings_valid_input(self, measurement_eos):
+        """Test _energy_from_meter_readings with valid inputs and proper alignment of load data."""
+        key = "load0_mr"
+        start_datetime = datetime(2023, 1, 1, 0)
+        end_datetime = datetime(2023, 1, 1, 5)
+        interval = duration(hours=1)
+
+        load_array = measurement_eos._energy_from_meter_readings(
+            key, start_datetime, end_datetime, interval
+        )
+
+        expected_load_array = np.array([50, 50, 50, 50, 50])  # Differences between consecutive readings
+        np.testing.assert_array_equal(load_array, expected_load_array)
+
+    def test_energy_from_meter_readings_empty_array(self, measurement_eos):
+        """Test _energy_from_meter_readings with no data (empty array)."""
+        key = "load0_mr"
+        start_datetime = datetime(2023, 1, 1, 0)
+        end_datetime = datetime(2023, 1, 1, 5)
+        interval = duration(hours=1)
+
+        # Use empty records array
+        measurement_eos.records = []
+
+        load_array = measurement_eos._energy_from_meter_readings(
+            key, start_datetime, end_datetime, interval
+        )
+
+        # Expected: an array of zeros with one less than the number of intervals
+        expected_size = (
+            measurement_eos._interval_count(start_datetime, end_datetime + interval, interval) - 1
+        )
+        expected_load_array = np.zeros(expected_size)
+        np.testing.assert_array_equal(load_array, expected_load_array)
+
+    def test_energy_from_meter_readings_misaligned_array(self, measurement_eos):
+        """Test _energy_from_meter_readings with misaligned array size."""
+        key = "load1_mr"
+        start_datetime = measurement_eos.min_datetime
+        end_datetime = measurement_eos.max_datetime
+        interval = duration(hours=1)
+
+        # Use misaligned array, latest interval set to 2 hours (instead of 1 hour)
+        measurement_eos.records[-1].date_time = datetime(2023, 1, 1, 6)
+
+        load_array = measurement_eos._energy_from_meter_readings(
+            key, start_datetime, end_datetime, interval
+        )
+
+        expected_load_array = np.array([50, 50, 50, 50, 25])  # Differences between consecutive readings
+        np.testing.assert_array_equal(load_array, expected_load_array)
+
+    def test_energy_from_meter_readings_partial_data(self, measurement_eos, caplog):
+        """Test _energy_from_meter_readings with partial data (misaligned but empty array)."""
+        key = "load2_mr"
+        start_datetime = datetime(2023, 1, 1, 0)
+        end_datetime = datetime(2023, 1, 1, 5)
+        interval = duration(hours=1)
+
+        with caplog.at_level("DEBUG"):
+            load_array = measurement_eos._energy_from_meter_readings(
+                key, start_datetime, end_datetime, interval
+            )
+
+        expected_size = (
+            measurement_eos._interval_count(start_datetime, end_datetime + interval, interval) - 1
+        )
+        expected_load_array = np.zeros(expected_size)
+        np.testing.assert_array_equal(load_array, expected_load_array)
+
+    def test_energy_from_meter_readings_negative_interval(self, measurement_eos):
+        """Test _energy_from_meter_readings with a negative interval."""
+        key = "load3_mr"
+        start_datetime = datetime(2023, 1, 1, 0)
+        end_datetime = datetime(2023, 1, 1, 5)
+        interval = duration(hours=-1)
+
+        with pytest.raises(ValueError, match="interval must be positive"):
+            measurement_eos._energy_from_meter_readings(key, start_datetime, end_datetime, interval)
+
+    def test_load_total(self, measurement_eos):
+        """Test total load calculation."""
+        start = datetime(2023, 1, 1, 0)
+        end = datetime(2023, 1, 1, 2)
+        interval = duration(hours=1)
+
+        result = measurement_eos.load_total(start_datetime=start, end_datetime=end, interval=interval)
+
+        # Expected total load per interval
+        expected = np.array([100, 100])  # Differences between consecutive meter readings
+        np.testing.assert_array_equal(result, expected)
+
+    def test_load_total_no_data(self, measurement_eos):
+        """Test total load calculation with no data."""
+        measurement_eos.records = []
+        start = datetime(2023, 1, 1, 0)
+        end = datetime(2023, 1, 1, 3)
+        interval = duration(hours=1)
+
+        result = measurement_eos.load_total(start_datetime=start, end_datetime=end, interval=interval)
+        expected = np.zeros(3)  # No data, so all intervals are zero
+        np.testing.assert_array_equal(result, expected)
+
+    def test_load_total_partial_intervals(self, measurement_eos):
+        """Test total load calculation with partial intervals."""
+        start = datetime(2023, 1, 1, 0, 30)  # Start in the middle of an interval
+        end = datetime(2023, 1, 1, 1, 30)  # End in the middle of another interval
+        interval = duration(hours=1)
+
+        result = measurement_eos.load_total(start_datetime=start, end_datetime=end, interval=interval)
+        expected = np.array([100])  # Only one complete interval covered
+        np.testing.assert_array_equal(result, expected)

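Note: the arithmetic these tests pin down is plain first-differencing of cumulative meter readings over a fixed interval grid. A self-contained sketch of both invariants:

    import math
    import numpy as np
    from datetime import datetime, timedelta

    # Interval count: 3 one-hour intervals between 00:00 and 03:00.
    start, end, interval = datetime(2023, 1, 1, 0), datetime(2023, 1, 1, 3), timedelta(hours=1)
    assert math.ceil((end - start) / interval) == 3

    # Energy per interval: first difference of the cumulative readings (load0_mr above).
    readings = np.array([100, 150, 200, 250, 300, 350])
    assert np.array_equal(np.diff(readings), np.array([50, 50, 50, 50, 50]))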
@@ -4,6 +4,8 @@ from pydantic import ValidationError
 from akkudoktoreos.prediction.elecpriceakkudoktor import ElecPriceAkkudoktor
 from akkudoktoreos.prediction.elecpriceenergycharts import ElecPriceEnergyCharts
 from akkudoktoreos.prediction.elecpriceimport import ElecPriceImport
+from akkudoktoreos.prediction.feedintarifffixed import FeedInTariffFixed
+from akkudoktoreos.prediction.feedintariffimport import FeedInTariffImport
 from akkudoktoreos.prediction.loadakkudoktor import LoadAkkudoktor
 from akkudoktoreos.prediction.loadimport import LoadImport
 from akkudoktoreos.prediction.loadvrm import LoadVrm
@@ -33,6 +35,8 @@ def forecast_providers():
         ElecPriceAkkudoktor(),
         ElecPriceEnergyCharts(),
         ElecPriceImport(),
+        FeedInTariffFixed(),
+        FeedInTariffImport(),
         LoadAkkudoktor(),
         LoadVrm(),
         LoadImport(),
@@ -76,15 +80,17 @@ def test_provider_sequence(prediction):
     assert isinstance(prediction.providers[0], ElecPriceAkkudoktor)
     assert isinstance(prediction.providers[1], ElecPriceEnergyCharts)
     assert isinstance(prediction.providers[2], ElecPriceImport)
-    assert isinstance(prediction.providers[3], LoadAkkudoktor)
-    assert isinstance(prediction.providers[4], LoadVrm)
-    assert isinstance(prediction.providers[5], LoadImport)
-    assert isinstance(prediction.providers[6], PVForecastAkkudoktor)
-    assert isinstance(prediction.providers[7], PVForecastVrm)
-    assert isinstance(prediction.providers[8], PVForecastImport)
-    assert isinstance(prediction.providers[9], WeatherBrightSky)
-    assert isinstance(prediction.providers[10], WeatherClearOutside)
-    assert isinstance(prediction.providers[11], WeatherImport)
+    assert isinstance(prediction.providers[3], FeedInTariffFixed)
+    assert isinstance(prediction.providers[4], FeedInTariffImport)
+    assert isinstance(prediction.providers[5], LoadAkkudoktor)
+    assert isinstance(prediction.providers[6], LoadVrm)
+    assert isinstance(prediction.providers[7], LoadImport)
+    assert isinstance(prediction.providers[8], PVForecastAkkudoktor)
+    assert isinstance(prediction.providers[9], PVForecastVrm)
+    assert isinstance(prediction.providers[10], PVForecastImport)
+    assert isinstance(prediction.providers[11], WeatherBrightSky)
+    assert isinstance(prediction.providers[12], WeatherClearOutside)
+    assert isinstance(prediction.providers[13], WeatherImport)


 def test_provider_by_id(prediction, forecast_providers):
@@ -100,6 +106,8 @@ def test_prediction_repr(prediction):
     assert "ElecPriceAkkudoktor" in result
     assert "ElecPriceEnergyCharts" in result
    assert "ElecPriceImport" in result
+    assert "FeedInTariffFixed" in result
+    assert "FeedInTariffImport" in result
     assert "LoadAkkudoktor" in result
     assert "LoadVrm" in result
     assert "LoadImport" in result

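Note: inserting the two feed-in tariff providers after the electricity price providers shifts every later provider by two index positions, which is all the renumbered asserts above express:

    providers = ["ElecPriceAkkudoktor", "ElecPriceEnergyCharts", "ElecPriceImport",
                 "FeedInTariffFixed", "FeedInTariffImport",
                 "LoadAkkudoktor", "LoadVrm", "LoadImport",
                 "PVForecastAkkudoktor", "PVForecastVrm", "PVForecastImport",
                 "WeatherBrightSky", "WeatherClearOutside", "WeatherImport"]
    assert providers.index("LoadAkkudoktor") == 5  # was 3 before the insertion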
@@ -101,7 +101,7 @@ class TestPredictionBase:
         assert base.config.prediction.hours == 2

     def test_config_value_from_field_default(self, base, monkeypatch):
-        assert base.config.prediction.model_fields["historic_hours"].default == 48
+        assert base.config.prediction.__class__.model_fields["historic_hours"].default == 48
         assert base.config.prediction.historic_hours == 48
         monkeypatch.setenv("EOS_PREDICTION__HISTORIC_HOURS", "128")
         base.config.reset_settings()
@@ -192,7 +192,7 @@ class TestPredictionProvider:

         assert provider.config.prediction.hours == config_eos.prediction.hours
         assert provider.config.prediction.historic_hours == 2
-        assert provider.start_datetime == sample_start_datetime
+        assert provider.ems_start_datetime == sample_start_datetime
         assert provider.end_datetime == sample_start_datetime + to_duration(
             f"{provider.config.prediction.hours} hours"
         )
@@ -416,7 +416,7 @@ class TestPredictionContainer:
             del container_with_providers["non_existent_key"]

     def test_len(self, container_with_providers):
-        assert len(container_with_providers) == 3
+        assert len(container_with_providers) == 2

     def test_repr(self, container_with_providers):
         representation = repr(container_with_providers)

@@ -276,7 +276,7 @@ def test_pvforecast_akkudoktor_update_with_sample_forecast(
     ems_eos = get_ems()
     ems_eos.set_start_datetime(sample_forecast_start)
     provider.update_data(force_enable=True, force_update=True)
-    assert compare_datetimes(provider.start_datetime, sample_forecast_start).equal
+    assert compare_datetimes(provider.ems_start_datetime, sample_forecast_start).equal
     assert compare_datetimes(provider[0].date_time, to_datetime(sample_forecast_start)).equal


@@ -328,7 +328,7 @@ def test_timezone_behaviour(
     ems_eos = get_ems()
     ems_eos.set_start_datetime(other_start_datetime)
     provider.update_data(force_update=True)
-    assert compare_datetimes(provider.start_datetime, other_start_datetime).equal
+    assert compare_datetimes(provider.ems_start_datetime, other_start_datetime).equal
     # Check whether first record starts at requested sample start time
     assert compare_datetimes(provider[0].date_time, sample_forecast_start).equal

@@ -19,8 +19,10 @@ def provider(sample_import_1_json, config_eos):
         "pvforecast": {
             "provider": "PVForecastImport",
             "provider_settings": {
-                "import_file_path": str(FILE_TESTDATA_PVFORECASTIMPORT_1_JSON),
-                "import_json": json.dumps(sample_import_1_json),
+                "PVForecastImport": {
+                    "import_file_path": str(FILE_TESTDATA_PVFORECASTIMPORT_1_JSON),
+                    "import_json": json.dumps(sample_import_1_json),
+                },
             },
         }
     }
@@ -55,7 +57,9 @@ def test_invalid_provider(provider, config_eos):
         "pvforecast": {
             "provider": "<invalid>",
             "provider_settings": {
-                "import_file_path": str(FILE_TESTDATA_PVFORECASTIMPORT_1_JSON),
+                "PVForecastImport": {
+                    "import_file_path": str(FILE_TESTDATA_PVFORECASTIMPORT_1_JSON),
+                },
             },
         }
     }
@@ -86,20 +90,20 @@ def test_import(provider, sample_import_1_json, start_datetime, from_file, confi
     ems_eos = get_ems()
     ems_eos.set_start_datetime(to_datetime(start_datetime, in_timezone="Europe/Berlin"))
     if from_file:
-        config_eos.pvforecast.provider_settings.import_json = None
-        assert config_eos.pvforecast.provider_settings.import_json is None
+        config_eos.pvforecast.provider_settings.PVForecastImport.import_json = None
+        assert config_eos.pvforecast.provider_settings.PVForecastImport.import_json is None
     else:
-        config_eos.pvforecast.provider_settings.import_file_path = None
-        assert config_eos.pvforecast.provider_settings.import_file_path is None
+        config_eos.pvforecast.provider_settings.PVForecastImport.import_file_path = None
+        assert config_eos.pvforecast.provider_settings.PVForecastImport.import_file_path is None
     provider.clear()

     # Call the method
     provider.update_data()

     # Assert: Verify the result is as expected
-    assert provider.start_datetime is not None
+    assert provider.ems_start_datetime is not None
     assert provider.total_hours is not None
-    assert compare_datetimes(provider.start_datetime, ems_eos.start_datetime).equal
+    assert compare_datetimes(provider.ems_start_datetime, ems_eos.start_datetime).equal
     values = sample_import_1_json["pvforecast_ac_power"]
     value_datetime_mapping = provider.import_datetimes(ems_eos.start_datetime, len(values))
     for i, mapping in enumerate(value_datetime_mapping):

@@ -19,8 +19,10 @@ def pvforecast_instance(config_eos):
         "pvforecast": {
             "provider": "PVForecastVrm",
             "provider_settings": {
-                "pvforecast_vrm_token": "dummy-token",
-                "pvforecast_vrm_idsite": 12345
+                "PVForecastVrm": {
+                    "pvforecast_vrm_token": "dummy-token",
+                    "pvforecast_vrm_idsite": 12345,
+                },
             }
         }
     }

@@ -1,4 +1,4 @@
-from typing import Optional
+from typing import Any, Optional

 import pandas as pd
 import pendulum
@@ -11,12 +11,13 @@ from akkudoktoreos.core.pydantic import (
     PydanticDateTimeDataFrame,
     PydanticDateTimeSeries,
     PydanticModelNestedValueMixin,
+    merge_models,
 )
-from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime
+from akkudoktoreos.utils.datetimeutil import DateTime, compare_datetimes, to_datetime


 class PydanticTestModel(PydanticBaseModel):
-    datetime_field: pendulum.DateTime = Field(
+    datetime_field: DateTime = Field(
         ..., description="A datetime field with pendulum support."
     )
     optional_field: Optional[str] = Field(default=None, description="An optional field.")
@@ -33,6 +34,108 @@ class User(PydanticBaseModel):
     settings: Optional[dict[str, str]] = None


+class SampleNestedModel(PydanticBaseModel):
+    threshold: int
+    enabled: bool = True
+
+
+class SampleModel(PydanticBaseModel):
+    name: str
+    count: int
+    config: SampleNestedModel
+    optional: str | None = None
+
+
+class TestMergeModels:
+    """Test suite for the merge_models utility function with None overriding."""
+
+    def test_flat_override(self):
+        """Top-level fields in update_dict override those in source, including None."""
+        source = SampleModel(name="Test", count=10, config={"threshold": 5})
+        update = {"name": "Updated"}
+        result = merge_models(source, update)
+
+        assert result["name"] == "Updated"
+        assert result["count"] == 10
+        assert result["config"]["threshold"] == 5
+
+    def test_flat_override_with_none(self):
+        """Update with None value should override source value."""
+        source = SampleModel(name="Test", count=10, config={"threshold": 5}, optional="keep me")
+        update = {"optional": None}
+        result = merge_models(source, update)
+
+        assert result["optional"] is None
+
+    def test_nested_override(self):
+        """Nested fields in update_dict override nested fields in source, including None."""
+        source = SampleModel(name="Test", count=10, config={"threshold": 5, "enabled": True})
+        update = {"config": {"threshold": 99, "enabled": False}}
+        result = merge_models(source, update)
+
+        assert result["config"]["threshold"] == 99
+        assert result["config"]["enabled"] is False
+
+    def test_nested_override_with_none(self):
+        """Nested update with None should override nested source values."""
+        source = SampleModel(name="Test", count=10, config={"threshold": 5, "enabled": True})
+        update = {"config": {"threshold": None}}
+        result = merge_models(source, update)
+
+        assert result["config"]["threshold"] is None
+        assert result["config"]["enabled"] is True  # untouched because not in update
+
+    def test_preserve_source_values(self):
+        """Source values are preserved if not overridden in update_dict."""
+        source = SampleModel(name="Source", count=7, config={"threshold": 1})
+        update: dict[str, Any] = {}
+        result = merge_models(source, update)
+
+        assert result["name"] == "Source"
+        assert result["count"] == 7
+        assert result["config"]["threshold"] == 1
+
+    def test_update_extends_source(self):
+        """Optional fields in update_dict are added to result."""
+        source = SampleModel(name="Test", count=10, config={"threshold": 5})
+        update = {"optional": "new value"}
+        result = merge_models(source, update)
+
+        assert result["optional"] == "new value"
+
+    def test_update_extends_source_with_none(self):
+        """Optional field with None in update_dict is added and overrides source."""
+        source = SampleModel(name="Test", count=10, config={"threshold": 5}, optional="value")
+        update = {"optional": None}
+        result = merge_models(source, update)
+
+        assert result["optional"] is None
+
+    def test_deep_merge_behavior(self):
+        """Nested updates merge with source, overriding only specified subkeys."""
+        source = SampleModel(name="Model", count=3, config={"threshold": 1, "enabled": False})
+        update = {"config": {"enabled": True}}
+        result = merge_models(source, update)
+
+        assert result["config"]["enabled"] is True
+        assert result["config"]["threshold"] == 1
+
+    def test_override_all(self):
+        """All fields in update_dict override all fields in source, including None."""
+        source = SampleModel(name="Orig", count=1, config={"threshold": 10, "enabled": True})
+        update = {
+            "name": "New",
+            "count": None,
+            "config": {"threshold": 50, "enabled": None}
+        }
+        result = merge_models(source, update)

+        assert result["name"] == "New"
+        assert result["count"] is None
+        assert result["config"]["threshold"] == 50
+        assert result["config"]["enabled"] is None
+
+
 class TestPydanticModelNestedValueMixin:
     """Umbrella test class to group all test cases for `PydanticModelNestedValueMixin`."""

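Note: the behavior TestMergeModels pins down is a key-wise deep merge where update values win, including an explicit None. A hedged sketch of an equivalent merge over plain dicts (not the EOS implementation):

    def merge(source: dict, update: dict) -> dict:
        out = dict(source)
        for key, value in update.items():
            if isinstance(value, dict) and isinstance(out.get(key), dict):
                out[key] = merge(out[key], value)   # recurse into nested dicts
            else:
                out[key] = value                    # update wins, None included
        return out

    merged = merge({"config": {"threshold": 5, "enabled": True}},
                   {"config": {"threshold": None}})
    assert merged == {"config": {"threshold": None, "enabled": True}}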
@@ -242,7 +345,7 @@ class TestPydanticBaseModel:
         assert model.datetime_field == dt

     def test_invalid_datetime_string(self):
-        with pytest.raises(ValidationError, match="Cannot convert 'invalid_datetime' to datetime"):
+        with pytest.raises(ValueError):
             PydanticTestModel(datetime_field="invalid_datetime")

     def test_iso8601_serialization(self):
@@ -299,6 +402,7 @@ class TestPydanticDateTimeData:

 class TestPydanticDateTimeDataFrame:
     def test_valid_dataframe(self):
+        """Ensure conversion from and to DataFrame preserves index and values."""
         df = pd.DataFrame(
             {
                 "value": [100, 200],
@@ -308,13 +412,101 @@ class TestPydanticDateTimeDataFrame:
         model = PydanticDateTimeDataFrame.from_dataframe(df)
         result = model.to_dataframe()

         # Check index
         assert len(result.index) == len(df.index)
         for i, dt in enumerate(df.index):
             expected_dt = to_datetime(dt)
             result_dt = to_datetime(result.index[i])
             assert compare_datetimes(result_dt, expected_dt).equal

+    def test_add_row(self):
+        """Verify that a new row can be inserted with matching columns."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
+        )
+        model.add_row("2024-12-22T00:00:00", {"value": 200})
+
+        # Normalize key the same way the model stores it
+        key = model._normalize_index("2024-12-22T00:00:00")
+
+        assert key in model.data
+        assert model.data[key]["value"] == 200
+
+    def test_add_row_column_mismatch_raises(self):
+        """Ensure adding a row with mismatched columns raises ValueError."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
+        )
+        with pytest.raises(ValueError):
+            model.add_row("2024-12-22T00:00:00", {"wrong": 200})
+
+    def test_update_row(self):
+        """Check updating an existing row's values works."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
+        )
+        model.update_row("2024-12-21T00:00:00", {"value": 999})
+
+        key = model._normalize_index("2024-12-21T00:00:00")
+        assert model.data[key]["value"] == 999
+
+    def test_update_row_missing_raises(self):
+        """Verify updating a non-existing row raises KeyError."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
+        )
+        with pytest.raises(KeyError):
+            model.update_row("2024-12-22T00:00:00", {"value": 999})
+
+    def test_delete_row(self):
+        """Ensure rows can be deleted by index."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
+        )
+        model.delete_row("2024-12-21T00:00:00")
+        assert "2024-12-21T00:00:00" not in model.data
+
+    def test_set_and_get_value(self):
+        """Confirm set_value and get_value operate correctly."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
+        )
+        model.set_value("2024-12-21T00:00:00", "value", 555)
+        assert model.get_value("2024-12-21T00:00:00", "value") == 555
+
+    def test_add_column(self):
+        """Check that a new column can be added with default value."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
+        )
+        model.add_column("extra", default=0, dtype="int64")
+
+        key = model._normalize_index("2024-12-21T00:00:00")
+        assert model.data[key]["extra"] == 0
+        assert model.dtypes["extra"] == "int64"
+
+    def test_rename_column(self):
+        """Ensure renaming a column updates all rows and dtypes."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
+        )
+        model.rename_column("value", "renamed")
+
+        key = model._normalize_index("2024-12-21T00:00:00")
+        assert "renamed" in model.data[key]
+        assert "value" not in model.data[key]
+        assert model.dtypes["renamed"] == "int64"
+
+    def test_drop_column(self):
+        """Verify dropping a column removes it from both data and dtypes."""
+        model = PydanticDateTimeDataFrame(
+            data={"2024-12-21T00:00:00": {"value": 100, "extra": 1}}, dtypes={"value": "int64", "extra": "int64"}
+        )
+        model.drop_column("extra")
+
+        key = model._normalize_index("2024-12-21T00:00:00")
+        assert "extra" not in model.data[key]
+        assert "extra" not in model.dtypes

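Note: the tests above sketch the whole mutable API of PydanticDateTimeDataFrame. Usage, with signatures taken directly from the tests:

    from akkudoktoreos.core.pydantic import PydanticDateTimeDataFrame

    model = PydanticDateTimeDataFrame(
        data={"2024-12-21T00:00:00": {"value": 100}}, dtypes={"value": "int64"}
    )
    model.add_row("2024-12-22T00:00:00", {"value": 200})
    model.set_value("2024-12-21T00:00:00", "value", 555)
    assert model.get_value("2024-12-21T00:00:00", "value") == 555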
 class TestPydanticDateTimeSeries:
     def test_valid_series(self):

@@ -8,12 +8,10 @@ from pathlib import Path
 import psutil
 import pytest
 import requests
-from conftest import cleanup_eos_eosdash
-
-from akkudoktoreos.server.server import get_default_host
+from akkudoktoreos.core.version import __version__
+from akkudoktoreos.server.server import get_default_host, wait_for_port_free

 DIR_TESTDATA = Path(__file__).absolute().parent.joinpath("testdata")

 FILE_TESTDATA_EOSSERVER_CONFIG_1 = DIR_TESTDATA.joinpath("eosserver_config_1.json")


 class TestServer:
@@ -22,7 +20,14 @@ class TestServer:
         server = server_setup_for_class["server"]
         eos_dir = server_setup_for_class["eos_dir"]

-        result = requests.get(f"{server}/v1/config")
+        # Assure server is running
+        result = requests.get(f"{server}/v1/health", timeout=2)
+        assert result.status_code == HTTPStatus.OK
+        health = result.json()
+        assert health["status"] == "alive"
+        assert health["version"] == __version__
+
+        result = requests.get(f"{server}/v1/config", timeout=2)
         assert result.status_code == HTTPStatus.OK

         # Get testing config
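Note: the liveness probe added above is also the cheapest way for a client to check an EOS instance before pulling config or plans. A hedged sketch (host and port are whatever the test fixture started; 127.0.0.1:8503 is an assumption):

    import requests

    result = requests.get("http://127.0.0.1:8503/v1/health", timeout=2)
    assert result.status_code == 200
    health = result.json()
    assert health["status"] == "alive"  # the payload also carries the server version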
||||
@@ -37,258 +42,36 @@ class TestServer:
|
||||
assert str(data_folder_path).startswith(eos_dir)
|
||||
assert str(data_ouput_path).startswith(eos_dir)
|
||||
|
||||
def test_prediction_brightsky(self, server_setup_for_class, is_system_test):
|
||||
"""Test weather prediction by BrightSky."""
|
||||
server = server_setup_for_class["server"]
|
||||
eos_dir = server_setup_for_class["eos_dir"]
|
||||
|
||||
result = requests.get(f"{server}/v1/config")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
# Get testing config
|
||||
config_json = result.json()
|
||||
config_folder_path = Path(config_json["general"]["config_folder_path"])
|
||||
# Assure we are working in test environment
|
||||
assert str(config_folder_path).startswith(eos_dir)
|
||||
|
||||
result = requests.put(f"{server}/v1/config/weather/provider", json="BrightSky")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
# Assure prediction is enabled
|
||||
result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
providers = result.json()
|
||||
assert "BrightSky" in providers
|
||||
|
||||
if is_system_test:
|
||||
result = requests.post(f"{server}/v1/prediction/update/BrightSky")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
result = requests.get(f"{server}/v1/prediction/series?key=weather_temp_air")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
data = result.json()
|
||||
assert len(data["data"]) > 24
|
||||
|
||||
else:
|
||||
pass
|
||||
|
||||
def test_prediction_clearoutside(self, server_setup_for_class, is_system_test):
|
||||
"""Test weather prediction by ClearOutside."""
|
||||
server = server_setup_for_class["server"]
|
||||
eos_dir = server_setup_for_class["eos_dir"]
|
||||
|
||||
result = requests.put(f"{server}/v1/config/weather/provider", json="ClearOutside")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
# Assure prediction is enabled
|
||||
result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
providers = result.json()
|
||||
assert "ClearOutside" in providers
|
||||
|
||||
if is_system_test:
|
||||
result = requests.post(f"{server}/v1/prediction/update/ClearOutside")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
result = requests.get(f"{server}/v1/prediction/series?key=weather_temp_air")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
data = result.json()
|
||||
assert len(data["data"]) > 24
|
||||
|
||||
else:
|
||||
pass
|
||||
|
||||
def test_prediction_pvforecastakkudoktor(self, server_setup_for_class, is_system_test):
|
||||
"""Test PV prediction by PVForecastAkkudoktor."""
|
||||
server = server_setup_for_class["server"]
|
||||
eos_dir = server_setup_for_class["eos_dir"]
|
||||
|
||||
# Reset config
|
||||
with FILE_TESTDATA_EOSSERVER_CONFIG_1.open("r", encoding="utf-8", newline=None) as fd:
|
||||
config = json.load(fd)
|
||||
config["pvforecast"]["provider"] = "PVForecastAkkudoktor"
|
||||
result = requests.put(f"{server}/v1/config", json=config)
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
# Assure prediction is enabled
|
||||
result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
providers = result.json()
|
||||
assert "PVForecastAkkudoktor" in providers
|
||||
|
||||
if is_system_test:
|
||||
result = requests.post(f"{server}/v1/prediction/update/PVForecastAkkudoktor")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
result = requests.get(f"{server}/v1/prediction/series?key=pvforecast_ac_power")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
data = result.json()
|
||||
assert len(data["data"]) > 24
|
||||
|
||||
else:
|
||||
pass
|
||||
|
||||
def test_prediction_elecpriceakkudoktor(self, server_setup_for_class, is_system_test):
|
||||
"""Test electricity price prediction by ElecPriceImport."""
|
||||
server = server_setup_for_class["server"]
|
||||
eos_dir = server_setup_for_class["eos_dir"]
|
||||
|
||||
# Reset config
|
||||
with FILE_TESTDATA_EOSSERVER_CONFIG_1.open("r", encoding="utf-8", newline=None) as fd:
|
||||
config = json.load(fd)
|
||||
config["elecprice"]["provider"] = "ElecPriceAkkudoktor"
|
||||
result = requests.put(f"{server}/v1/config", json=config)
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
# Assure prediction is enabled
|
||||
result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
providers = result.json()
|
||||
assert "ElecPriceAkkudoktor" in providers
|
||||
|
||||
if is_system_test:
|
||||
result = requests.post(f"{server}/v1/prediction/update/ElecPriceAkkudoktor")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
result = requests.get(f"{server}/v1/prediction/series?key=elecprice_marketprice_wh")
|
||||
assert result.status_code == HTTPStatus.OK
|
||||
|
||||
data = result.json()
|
||||
assert len(data["data"]) > 24
|
||||
|
||||
else:
|
||||
pass
|
||||
|
||||
    def test_prediction_loadakkudoktor(self, server_setup_for_class, is_system_test):
        """Test load prediction by LoadAkkudoktor."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]

        result = requests.put(f"{server}/v1/config/load/provider", json="LoadAkkudoktor")
        assert result.status_code == HTTPStatus.OK

        # Assure prediction is enabled
        result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
        assert result.status_code == HTTPStatus.OK
        providers = result.json()
        assert "LoadAkkudoktor" in providers

        if is_system_test:
            result = requests.post(f"{server}/v1/prediction/update/LoadAkkudoktor")
            assert result.status_code == HTTPStatus.OK

            result = requests.get(f"{server}/v1/prediction/series?key=load_mean")
            assert result.status_code == HTTPStatus.OK

            data = result.json()
            assert len(data["data"]) > 24

        else:
            pass

    def test_admin_cache(self, server_setup_for_class, is_system_test):
        """Test whether cache is reconstructed from cached files."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]

        result = requests.get(f"{server}/v1/admin/cache")
        assert result.status_code == HTTPStatus.OK
        cache = result.json()

        if is_system_test:
            # There should be some cache data
            assert cache != {}

            # Save cache
            result = requests.post(f"{server}/v1/admin/cache/save")
            assert result.status_code == HTTPStatus.OK
            cache_saved = result.json()
            assert cache_saved == cache

            # Clear cache - should clear nothing as all cache files expire in the future
            result = requests.post(f"{server}/v1/admin/cache/clear")
            assert result.status_code == HTTPStatus.OK
            cache_cleared = result.json()
            assert cache_cleared == cache

            # Force clear cache
            result = requests.post(f"{server}/v1/admin/cache/clear?clear_all=true")
            assert result.status_code == HTTPStatus.OK
            cache_cleared = result.json()
            assert cache_cleared == {}

            # Try to load already deleted cache entries
            result = requests.post(f"{server}/v1/admin/cache/load")
            assert result.status_code == HTTPStatus.OK
            cache_loaded = result.json()
            assert cache_loaded == {}

            # Cache should still be empty
            result = requests.get(f"{server}/v1/admin/cache")
            assert result.status_code == HTTPStatus.OK
            cache = result.json()
            assert cache == {}


class TestServerStartStop:
    def test_server_start_eosdash(self, tmpdir):
        """Test the EOSdash server startup from EOS."""
        # Do not use any fixture as this will make pytest the owner of the EOSdash port.
        host = get_default_host()
        if os.name == "nt":
            # Windows does not provide SIGKILL
            sigkill = signal.SIGTERM  # type: ignore[attr-defined,unused-ignore]
        else:
            sigkill = signal.SIGKILL  # type: ignore
        port = 8503
        eosdash_host = host
        eosdash_port = 8504
        timeout = 120

        server = f"http://{host}:{port}"
-       eosdash_server = f"http://{host}:{eosdash_port}"
+       eosdash_server = f"http://{eosdash_host}:{eosdash_port}"
        eos_dir = str(tmpdir)

        # Cleanup any EOSdash process left.
        try:
            result = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
            if result.status_code == HTTPStatus.OK:
                pid = result.json()["pid"]
                os.kill(pid, sigkill)
                time.sleep(1)
                result = requests.get(f"{eosdash_server}/eosdash/health", timeout=2)
                assert result.status_code != HTTPStatus.OK
        except:
            pass

        # Wait for EOSdash port to be freed
        process_info: list[dict] = []
        for retries in range(int(timeout / 3)):
            process_info = []
            pids: list[int] = []
            for conn in psutil.net_connections(kind="inet"):
                if conn.laddr.port == eosdash_port:
                    if conn.pid not in pids:
                        # Get fresh process info
                        process = psutil.Process(conn.pid)
                        pids.append(conn.pid)
                        process_info.append(process.as_dict(attrs=["pid", "cmdline"]))
            if len(process_info) == 0:
                break
            time.sleep(3)
        assert len(process_info) == 0
        # Cleanup any EOS and EOSdash process left.
        cleanup_eos_eosdash(host, port, eosdash_host, eosdash_port, timeout)

        # Import after test setup to prevent creation of config file before test
        from akkudoktoreos.server.eos import start_eosdash

        # Port may be blocked
        assert wait_for_port_free(eosdash_port, timeout=120, waiting_app_name="EOSdash")

        process = start_eosdash(
-           host=host,
+           host=eosdash_host,
            port=eosdash_port,
            eos_host=host,
            eos_port=port,
-           log_level="debug",
+           log_level="DEBUG",
            access_log=False,
            reload=False,
            eos_dir=eos_dir,
@@ -310,7 +93,9 @@ class TestServerStartStop:
        time.sleep(3)

        assert startup, f"Connection to {eosdash_server}/eosdash/health failed: {error}"
-       assert result.json()["status"] == "alive"
+       health = result.json()
+       assert health["status"] == "alive"
+       assert health["version"] == __version__

        # Shutdown eosdash
        try:
@@ -324,6 +109,9 @@ class TestServerStartStop:
        except:
            pass

        # Cleanup any EOS and EOSdash process left.
        cleanup_eos_eosdash(host, port, eosdash_host, eosdash_port, timeout)

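The port-freeing loop above is essentially what the wait_for_port_free helper imported by these tests has to do. A minimal sketch, assuming psutil semantics as used in the test; the real helper in the EOS test utilities may differ in signature, logging, and return details:

import time

import psutil


def wait_for_port_free(port: int, timeout: int = 120, waiting_app_name: str = "app") -> bool:
    """Poll until no inet connection is bound to `port` or the timeout expires (sketch)."""
    for _ in range(max(1, int(timeout / 3))):
        # Collect the pids of all processes still holding the port.
        holders = [
            conn.pid
            for conn in psutil.net_connections(kind="inet")
            if conn.laddr and conn.laddr.port == port
        ]
        if not holders:
            return True
        time.sleep(3)
    return False
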
    @pytest.mark.skipif(os.name == "nt", reason="Server restart not supported on Windows")
    def test_server_restart(self, server_setup_for_function, is_system_test):
        """Test server restart."""
@@ -403,7 +191,7 @@
        # Assure EOS is up again
        startup = False
        error = ""
-       for retries in range(int(timeout / 3)):
+       for retries in range(int(timeout / 5)):
            try:
                result = requests.get(f"{server}/v1/health", timeout=2)
                if result.status_code == HTTPStatus.OK:
@@ -412,7 +200,7 @@
                error = f"{result.status_code}, {str(result.content)}"
            except Exception as ex:
                error = str(ex)
-           time.sleep(3)
+           time.sleep(5)

        assert startup, f"Connection to {server}/v1/health failed: {error}"
        assert result.json()["status"] == "alive"
@@ -442,3 +230,24 @@ class TestServerStartStop:
        assert result.status_code == HTTPStatus.OK
        assert "Stopping EOS.." in result.json()["message"]
        new_pid = result.json()["pid"]


class TestServerWithEnv:
    eos_env = {
        "EOS_SERVER__EOSDASH_PORT": "8555",
    }

    def test_server_setup_for_class(self, server_setup_for_class):
        """Ensure server is started with environment passed to configuration."""
        server = server_setup_for_class["server"]

        assert server_setup_for_class["eosdash_port"] == int(self.eos_env["EOS_SERVER__EOSDASH_PORT"])

        result = requests.get(f"{server}/v1/config")
        assert result.status_code == HTTPStatus.OK

        # Get testing config
        config_json = result.json()

        # Assure config got configuration from environment
        assert config_json["server"]["eosdash_port"] == int(self.eos_env["EOS_SERVER__EOSDASH_PORT"])

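The EOS_SERVER__EOSDASH_PORT name maps onto the nested server.eosdash_port field via the double-underscore delimiter. A minimal sketch of that mechanism using pydantic-settings; the actual EOS configuration classes are richer and this is only the assumed shape of the env-var mapping:

from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict


class ServerSettings(BaseModel):
    eosdash_port: int = 8504


class EosSettings(BaseSettings):
    # "EOS_" prefix plus "__" delimiter lets EOS_SERVER__EOSDASH_PORT
    # address the nested server.eosdash_port field.
    model_config = SettingsConfigDict(env_prefix="EOS_", env_nested_delimiter="__")

    server: ServerSettings = ServerSettings()


# With EOS_SERVER__EOSDASH_PORT=8555 set in the environment:
# EosSettings().server.eosdash_port == 8555
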
tests/test_stringutil.py (new file, 51 lines)
@@ -0,0 +1,51 @@
"""Tests for the stringutil module."""
|
||||
|
||||
import pytest
|
||||
|
||||
from akkudoktoreos.utils.stringutil import str2bool
|
||||
|
||||
|
||||
class TestStr2Bool:
|
||||
"""Unit tests for the str2bool function."""
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"input_value",
|
||||
["yes", "YES", "y", "Y", "true", "TRUE", "t", "T", "1", "on", "ON"],
|
||||
)
|
||||
def test_truthy_values(self, input_value):
|
||||
"""Test that all accepted truthy string values return True."""
|
||||
assert str2bool(input_value) is True
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"input_value",
|
||||
["no", "NO", "n", "N", "false", "FALSE", "f", "F", "0", "off", "OFF"],
|
||||
)
|
||||
def test_falsy_values(self, input_value):
|
||||
"""Test that all accepted falsy string values return False."""
|
||||
assert str2bool(input_value) is False
|
||||
|
||||
def test_bool_input_returns_itself(self):
|
||||
"""Test that passing a boolean returns the same value."""
|
||||
assert str2bool(True) is True
|
||||
assert str2bool(False) is False
|
||||
|
||||
def test_whitespace_is_ignored(self):
|
||||
"""Test that surrounding whitespace does not affect the result."""
|
||||
assert str2bool(" yes ") is True
|
||||
assert str2bool("\tno\n") is False
|
||||
|
||||
def test_invalid_string_raises_value_error(self):
|
||||
"""Test that invalid strings raise a ValueError."""
|
||||
with pytest.raises(ValueError, match="Invalid boolean value"):
|
||||
str2bool("maybe")
|
||||
with pytest.raises(ValueError):
|
||||
str2bool("truthish")
|
||||
|
||||
def test_type_error_on_non_string_non_bool(self):
|
||||
"""Test that non-string, non-boolean inputs raise ValueError."""
|
||||
with pytest.raises(ValueError, match="Invalid boolean value"):
|
||||
str2bool(None)
|
||||
with pytest.raises(ValueError, match="Invalid boolean value"):
|
||||
str2bool(1.23)
|
||||
with pytest.raises(ValueError, match="Invalid boolean value"):
|
||||
str2bool([])
|
||||
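An implementation consistent with these tests might look like the following sketch; the actual code in akkudoktoreos.utils.stringutil may differ in detail:

def str2bool(value) -> bool:
    """Convert a string (or bool) to a boolean; raise ValueError otherwise (sketch)."""
    if isinstance(value, bool):
        # Booleans pass through unchanged.
        return value
    if isinstance(value, str):
        # Whitespace and case are ignored, matching the tests above.
        normalized = value.strip().lower()
        if normalized in ("yes", "y", "true", "t", "1", "on"):
            return True
        if normalized in ("no", "n", "false", "f", "0", "off"):
            return False
    # Non-string, non-bool inputs and unknown strings are rejected alike.
    raise ValueError(f"Invalid boolean value: {value!r}")
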
tests/test_system.py (new file, 210 lines)
@@ -0,0 +1,210 @@
import json
import os
import signal
import time
from http import HTTPStatus
from pathlib import Path

import pytest
import requests

DIR_TESTDATA = Path(__file__).absolute().parent.joinpath("testdata")

FILE_TESTDATA_EOSSERVER_CONFIG_1 = DIR_TESTDATA.joinpath("eosserver_config_1.json")


class TestSystem:
    def test_prediction_brightsky(self, server_setup_for_class, is_system_test):
        """Test weather prediction by BrightSky."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]

        result = requests.get(f"{server}/v1/config", timeout=2)
        assert result.status_code == HTTPStatus.OK

        # Get testing config
        config_json = result.json()
        config_folder_path = Path(config_json["general"]["config_folder_path"])
        # Assure we are working in test environment
        assert str(config_folder_path).startswith(eos_dir)

        result = requests.put(f"{server}/v1/config/weather/provider", json="BrightSky")
        assert result.status_code == HTTPStatus.OK

        # Assure prediction is enabled
        result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
        assert result.status_code == HTTPStatus.OK
        providers = result.json()
        assert "BrightSky" in providers

        if is_system_test:
            result = requests.post(f"{server}/v1/prediction/update/BrightSky")
            assert result.status_code == HTTPStatus.OK

            result = requests.get(f"{server}/v1/prediction/series?key=weather_temp_air")
            assert result.status_code == HTTPStatus.OK

            data = result.json()
            assert len(data["data"]) > 24

        else:
            pass

    def test_prediction_clearoutside(self, server_setup_for_class, is_system_test):
        """Test weather prediction by ClearOutside."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]

        result = requests.put(f"{server}/v1/config/weather/provider", json="ClearOutside")
        assert result.status_code == HTTPStatus.OK

        # Assure prediction is enabled
        result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
        assert result.status_code == HTTPStatus.OK
        providers = result.json()
        assert "ClearOutside" in providers

        if is_system_test:
            result = requests.post(f"{server}/v1/prediction/update/ClearOutside")
            assert result.status_code == HTTPStatus.OK, f"Failed: {result.headers} {result.text}"

            result = requests.get(f"{server}/v1/prediction/series?key=weather_temp_air")
            assert result.status_code == HTTPStatus.OK

            data = result.json()
            assert len(data["data"]) > 24

        else:
            pass

    def test_prediction_pvforecastakkudoktor(self, server_setup_for_class, is_system_test):
        """Test PV prediction by PVForecastAkkudoktor."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]

        # Reset config
        with FILE_TESTDATA_EOSSERVER_CONFIG_1.open("r", encoding="utf-8", newline=None) as fd:
            config = json.load(fd)
        config["pvforecast"]["provider"] = "PVForecastAkkudoktor"
        result = requests.put(f"{server}/v1/config", json=config)
        assert result.status_code == HTTPStatus.OK

        # Assure prediction is enabled
        result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
        assert result.status_code == HTTPStatus.OK
        providers = result.json()
        assert "PVForecastAkkudoktor" in providers

        if is_system_test:
            result = requests.post(f"{server}/v1/prediction/update/PVForecastAkkudoktor")
            assert result.status_code == HTTPStatus.OK, f"Failed: {result.headers} {result.text}"

            result = requests.get(f"{server}/v1/prediction/series?key=pvforecast_ac_power")
            assert result.status_code == HTTPStatus.OK

            data = result.json()
            assert len(data["data"]) > 24

        else:
            pass

    def test_prediction_elecpriceakkudoktor(self, server_setup_for_class, is_system_test):
        """Test electricity price prediction by ElecPriceAkkudoktor."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]

        # Reset config
        with FILE_TESTDATA_EOSSERVER_CONFIG_1.open("r", encoding="utf-8", newline=None) as fd:
            config = json.load(fd)
        config["elecprice"]["provider"] = "ElecPriceAkkudoktor"
        result = requests.put(f"{server}/v1/config", json=config)
        assert result.status_code == HTTPStatus.OK

        # Assure prediction is enabled
        result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
        assert result.status_code == HTTPStatus.OK
        providers = result.json()
        assert "ElecPriceAkkudoktor" in providers

        if is_system_test:
            result = requests.post(f"{server}/v1/prediction/update/ElecPriceAkkudoktor")
            assert result.status_code == HTTPStatus.OK

            result = requests.get(f"{server}/v1/prediction/series?key=elecprice_marketprice_wh")
            assert result.status_code == HTTPStatus.OK

            data = result.json()
            assert len(data["data"]) > 24

        else:
            pass

    def test_prediction_loadakkudoktor(self, server_setup_for_class, is_system_test):
        """Test load prediction by LoadAkkudoktor."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]

        result = requests.put(f"{server}/v1/config/load/provider", json="LoadAkkudoktor")
        assert result.status_code == HTTPStatus.OK

        # Assure prediction is enabled
        result = requests.get(f"{server}/v1/prediction/providers?enabled=true")
        assert result.status_code == HTTPStatus.OK
        providers = result.json()
        assert "LoadAkkudoktor" in providers

        if is_system_test:
            result = requests.post(f"{server}/v1/prediction/update/LoadAkkudoktor")
            assert result.status_code == HTTPStatus.OK

            result = requests.get(f"{server}/v1/prediction/series?key=load_mean")
            assert result.status_code == HTTPStatus.OK

            data = result.json()
            assert len(data["data"]) > 24

        else:
            pass

    def test_admin_cache(self, server_setup_for_class, is_system_test):
        """Test whether cache is reconstructed from cached files."""
        server = server_setup_for_class["server"]
        eos_dir = server_setup_for_class["eos_dir"]

        result = requests.get(f"{server}/v1/admin/cache")
        assert result.status_code == HTTPStatus.OK
        cache = result.json()

        if is_system_test:
            # There should be some cache data
            assert cache != {}

            # Save cache
            result = requests.post(f"{server}/v1/admin/cache/save")
            assert result.status_code == HTTPStatus.OK
            cache_saved = result.json()
            assert cache_saved == cache

            # Clear expired cache - should clear nothing as all cache files expire in the future
            result = requests.post(f"{server}/v1/admin/cache/clear-expired")
            assert result.status_code == HTTPStatus.OK
            cache_cleared = result.json()
            assert cache_cleared == cache

            # Force clear cache
            result = requests.post(f"{server}/v1/admin/cache/clear")
            assert result.status_code == HTTPStatus.OK
            cache_cleared = result.json()
            assert cache_cleared == {}

            # Try to load already deleted cache entries
            result = requests.post(f"{server}/v1/admin/cache/load")
            assert result.status_code == HTTPStatus.OK
            cache_loaded = result.json()
            assert cache_loaded == {}

            # Cache should still be empty
            result = requests.get(f"{server}/v1/admin/cache")
            assert result.status_code == HTTPStatus.OK
            cache = result.json()
            assert cache == {}
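Outside pytest, the same admin endpoints can be exercised with a few requests calls. A sketch against a locally running EOS instance (host and port are assumptions):

import requests

EOS = "http://127.0.0.1:8503"

# Snapshot the cache management info and persist it to cache files.
cache = requests.get(f"{EOS}/v1/admin/cache").json()
requests.post(f"{EOS}/v1/admin/cache/save")

# Drop only entries whose expiry lies in the past ...
requests.post(f"{EOS}/v1/admin/cache/clear-expired")
# ... or wipe everything regardless of expiry.
requests.post(f"{EOS}/v1/admin/cache/clear")
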
@@ -162,7 +162,7 @@ def test_update_data(mock_get, provider, sample_clearout_1_html, sample_clearout
    # Check for correct prediction time window
    assert provider.config.prediction.hours == 48
    assert provider.config.prediction.historic_hours == 48
-   assert compare_datetimes(provider.start_datetime, expected_start).equal
+   assert compare_datetimes(provider.ems_start_datetime, expected_start).equal
    assert compare_datetimes(provider.end_datetime, expected_end).equal
    assert compare_datetimes(provider.keep_datetime, expected_keep).equal


@@ -19,8 +19,10 @@ def provider(sample_import_1_json, config_eos):
        "weather": {
            "provider": "WeatherImport",
            "provider_settings": {
-               "import_file_path": str(FILE_TESTDATA_WEATHERIMPORT_1_JSON),
-               "import_json": json.dumps(sample_import_1_json),
+               "WeatherImport": {
+                   "import_file_path": str(FILE_TESTDATA_WEATHERIMPORT_1_JSON),
+                   "import_json": json.dumps(sample_import_1_json),
+               },
            },
        }
    }

@@ -55,7 +57,9 @@ def test_invalid_provider(provider, config_eos, monkeypatch):
        "weather": {
            "provider": "<invalid>",
            "provider_settings": {
-               "import_file_path": str(FILE_TESTDATA_WEATHERIMPORT_1_JSON),
+               "WeatherImport": {
+                   "import_file_path": str(FILE_TESTDATA_WEATHERIMPORT_1_JSON),
+               },
            },
        }
    }

@@ -86,20 +90,20 @@ def test_import(provider, sample_import_1_json, start_datetime, from_file, confi
    ems_eos = get_ems()
    ems_eos.set_start_datetime(to_datetime(start_datetime, in_timezone="Europe/Berlin"))
    if from_file:
-       config_eos.weather.provider_settings.import_json = None
-       assert config_eos.weather.provider_settings.import_json is None
+       config_eos.weather.provider_settings.WeatherImport.import_json = None
+       assert config_eos.weather.provider_settings.WeatherImport.import_json is None
    else:
-       config_eos.weather.provider_settings.import_file_path = None
-       assert config_eos.weather.provider_settings.import_file_path is None
+       config_eos.weather.provider_settings.WeatherImport.import_file_path = None
+       assert config_eos.weather.provider_settings.WeatherImport.import_file_path is None
    provider.clear()

    # Call the method
    provider.update_data()

    # Assert: Verify the result is as expected
-   assert provider.start_datetime is not None
+   assert provider.ems_start_datetime is not None
    assert provider.total_hours is not None
-   assert compare_datetimes(provider.start_datetime, ems_eos.start_datetime).equal
+   assert compare_datetimes(provider.ems_start_datetime, ems_eos.start_datetime).equal
    values = sample_import_1_json["weather_temp_air"]
    value_datetime_mapping = provider.import_datetimes(ems_eos.start_datetime, len(values))
    for i, mapping in enumerate(value_datetime_mapping):
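The fixture changes above reflect the reworked provider settings layout: settings now nest under the provider name instead of sharing one flat union object, so access goes through the provider key. A minimal illustration, with field names taken from the fixtures above:

# Old layout: one flat settings object shared by all weather providers,
# which forced pydantic to guess the right type out of a union.
# config_eos.weather.provider_settings.import_file_path = None

# New layout: one settings object per provider, keyed by provider name,
# so the type is unambiguous.
config_eos.weather.provider_settings.WeatherImport.import_file_path = None
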
tests/testdata/eos_config_andreas_0_1_0.json (vendored, new file, 101 lines)
@@ -0,0 +1,101 @@
{
  "general": {
    "data_folder_path": null,
    "data_output_subpath": "output",
    "latitude": 52.5,
    "longitude": 13.4
  },
  "cache": {
    "subpath": "cache",
    "cleanup_interval": 300.0
  },
  "ems": {
    "startup_delay": 5.0,
    "interval": 300.0
  },
  "logging": {
    "level": "INFO"
  },
  "devices": {
    "batteries": [
      {
        "device_id": "pv_akku",
        "hours": null,
        "capacity_wh": 30000,
        "charging_efficiency": 0.88,
        "discharging_efficiency": 0.88,
        "max_charge_power_w": 5000,
        "initial_soc_percentage": 0,
        "min_soc_percentage": 0,
        "max_soc_percentage": 100
      }
    ],
    "inverters": [],
    "home_appliances": []
  },
  "measurement": {
    "load0_name": "Household",
    "load1_name": null,
    "load2_name": null,
    "load3_name": null,
    "load4_name": null
  },
  "optimization": {
    "hours": 48,
    "penalty": 10,
    "ev_available_charge_rates_percent": [
      0.0, 37.5, 50.0, 62.5, 75.0, 87.5, 100.0
    ]
  },
  "prediction": {
    "hours": 48,
    "historic_hours": 48
  },
  "elecprice": {
    "provider": "ElecPriceAkkudoktor",
    "charges_kwh": 0.21,
    "provider_settings": null
  },
  "load": {
    "provider_settings": {
      "loadakkudoktor_year_energy": 13000
    }
  },
  "pvforecast": {
    "provider": "PVForecastAkkudoktor",
    "planes": [
      {
        "surface_tilt": 87.907,
        "surface_azimuth": 175.0,
        "userhorizon": [28.0, 34.0, 32.0, 60.0],
        "peakpower": 13.110,
        "pvtechchoice": "crystSi",
        "mountingplace": "free",
        "loss": 18.6,
        "trackingtype": 0,
        "optimal_surface_tilt": false,
        "optimalangles": false,
        "albedo": 0.25,
        "module_model": null,
        "inverter_model": null,
        "inverter_paco": 15000,
        "modules_per_string": 20,
        "strings_per_inverter": 2
      }
    ],
    "provider_settings": null
  },
  "weather": {
    "provider": "WeatherImport",
    "provider_settings": null
  },
  "server": {
    "host": "0.0.0.0",
    "port": 8503,
    "verbose": true,
    "startup_eosdash": true,
    "eosdash_host": "0.0.0.0",
    "eosdash_port": 8504
  },
  "utils": {}
}
tests/testdata/eos_config_andreas_now.json (vendored, new file, 100 lines)
@@ -0,0 +1,100 @@
{
  "general": {
    "data_folder_path": null,
    "data_output_subpath": "output",
    "latitude": 52.5,
    "longitude": 13.4
  },
  "cache": {
    "subpath": "cache",
    "cleanup_interval": 300.0
  },
  "ems": {
    "startup_delay": 5.0,
    "interval": 300.0
  },
  "logging": {
    "console_level": "INFO"
  },
  "devices": {
    "batteries": [
      {
        "device_id": "pv_akku",
        "capacity_wh": 30000,
        "charging_efficiency": 0.88,
        "discharging_efficiency": 0.88,
        "max_charge_power_w": 5000,
        "min_soc_percentage": 0,
        "max_soc_percentage": 100
      }
    ],
    "electric_vehicles": [
      {
        "charge_rates": [0.0, 0.375, 0.5, 0.625, 0.75, 0.875, 1.0]
      }
    ],
    "inverters": [],
    "home_appliances": []
  },
  "measurement": {
    "load_emr_keys": ["Household"]
  },
  "optimization": {
    "horizon_hours": 48,
    "genetic": {
      "penalties": {
        "ev_soc_miss": 10
      }
    }
  },
  "prediction": {
    "hours": 48,
    "historic_hours": 48
  },
  "elecprice": {
    "provider": "ElecPriceAkkudoktor",
    "charges_kwh": 0.21
  },
  "load": {
    "provider_settings": {
      "LoadAkkudoktor": {
        "loadakkudoktor_year_energy": 13000
      }
    }
  },
  "pvforecast": {
    "provider": "PVForecastAkkudoktor",
    "planes": [
      {
        "surface_tilt": 87.907,
        "surface_azimuth": 175.0,
        "userhorizon": [28.0, 34.0, 32.0, 60.0],
        "peakpower": 13.110,
        "pvtechchoice": "crystSi",
        "mountingplace": "free",
        "loss": 18.6,
        "trackingtype": 0,
        "optimal_surface_tilt": false,
        "optimalangles": false,
        "albedo": 0.25,
        "module_model": null,
        "inverter_model": null,
        "inverter_paco": 15000,
        "modules_per_string": 20,
        "strings_per_inverter": 2
      }
    ]
  },
  "weather": {
    "provider": "WeatherImport"
  },
  "server": {
    "host": "0.0.0.0",
    "port": 8503,
    "verbose": true,
    "startup_eosdash": true,
    "eosdash_host": "0.0.0.0",
    "eosdash_port": 8504
  },
  "utils": {}
}
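Comparing this file with eos_config_andreas_0_1_0.json above shows the 0.1.0 to current migration: optimization.hours becomes optimization.horizon_hours, the flat penalty moves under optimization.genetic.penalties as ev_soc_miss, the loadN_name measurement fields become measurement.load_emr_keys, and load provider settings nest under the provider name. A rough migration sketch, limited to the key differences visible in these two testdata files (a real migration would have to cover every settings group):

def migrate_config_0_1_0(old: dict) -> dict:
    """Rewrite the 0.1.0 config keys shown above to the current layout (sketch)."""
    new = dict(old)

    optimization = dict(old.get("optimization", {}))
    if "hours" in optimization:
        optimization["horizon_hours"] = optimization.pop("hours")
    if "penalty" in optimization:
        optimization["genetic"] = {"penalties": {"ev_soc_miss": optimization.pop("penalty")}}
    new["optimization"] = optimization

    load = dict(old.get("load", {}))
    settings = load.get("provider_settings")
    if isinstance(settings, dict) and "loadakkudoktor_year_energy" in settings:
        # Provider settings are now keyed by provider name.
        load["provider_settings"] = {"LoadAkkudoktor": settings}
    new["load"] = load

    measurement = old.get("measurement", {})
    names = [v for k, v in sorted(measurement.items()) if k.startswith("load") and v]
    if names:
        new["measurement"] = {"load_emr_keys": names}

    return new
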
tests/testdata/eos_config_minimal_0_1_0.json (vendored, new file, 24 lines)
@@ -0,0 +1,24 @@
{
  "elecprice": {
    "charges_kwh": 0.21,
    "provider": "ElecPriceImport"
  },
  "prediction": {
    "historic_hours": 48,
    "hours": 48
  },
  "optimization": {
    "hours": 48
  },
  "general": {
    "latitude": 52.5,
    "longitude": 13.4
  },
  "server": {
    "startup_eosdash": true,
    "host": "0.0.0.0",
    "port": 8503,
    "eosdash_host": "0.0.0.0",
    "eosdash_port": 8504
  }
}
tests/testdata/eos_config_minimal_now.json (vendored, new file, 24 lines)
@@ -0,0 +1,24 @@
{
  "elecprice": {
    "charges_kwh": 0.21,
    "provider": "ElecPriceImport"
  },
  "prediction": {
    "historic_hours": 48,
    "hours": 48
  },
  "optimization": {
    "horizon_hours": 48
  },
  "general": {
    "latitude": 52.5,
    "longitude": 13.4
  },
  "server": {
    "startup_eosdash": true,
    "host": "0.0.0.0",
    "port": 8503,
    "eosdash_host": "0.0.0.0",
    "eosdash_port": 8504
  }
}
tests/testdata/eosserver_config_1.json (vendored, 6 changed lines)
@@ -14,11 +14,13 @@
"load": {
|
||||
"provider": "LoadImport",
|
||||
"provider_settings": {
|
||||
"loadakkudoktor_year_energy": 20000
|
||||
"LoadAkkudoktor": {
|
||||
"loadakkudoktor_year_energy": 20000
|
||||
}
|
||||
}
|
||||
},
|
||||
"optimization": {
|
||||
"hours": 48
|
||||
"horizon_hours": 48
|
||||
},
|
||||
"pvforecast": {
|
||||
"planes": [
|
||||
|
||||
tests/testdata/optimize_input_1.json (vendored, 4 changed lines)
@@ -65,5 +65,5 @@
     1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 1, 1, 1, 1,
     1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1
   ]
 }

 }

tests/testdata/weatherforecast_brightsky_1.json (vendored, 408 changed lines)
@@ -21,13 +21,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -52,13 +52,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -83,13 +83,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -114,13 +114,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -145,13 +145,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -176,13 +176,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -207,13 +207,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -238,13 +238,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -269,13 +269,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -300,13 +300,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -331,13 +331,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -362,13 +362,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -393,13 +393,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -424,13 +424,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -455,13 +455,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -486,13 +486,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -517,13 +517,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -548,13 +548,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -579,13 +579,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "partly-cloudy-day"
     },
@@ -610,13 +610,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -641,13 +641,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -672,13 +672,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -703,13 +703,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -734,13 +734,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -765,13 +765,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -796,13 +796,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -827,13 +827,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -858,13 +858,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -889,13 +889,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -920,13 +920,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -951,13 +951,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -982,13 +982,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1013,13 +1013,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1044,13 +1044,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1075,13 +1075,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1106,13 +1106,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1137,13 +1137,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1168,13 +1168,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1199,13 +1199,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1230,13 +1230,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1261,13 +1261,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1292,13 +1292,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1323,13 +1323,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1354,13 +1354,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1385,13 +1385,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1416,13 +1416,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1447,13 +1447,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1478,13 +1478,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1509,13 +1509,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     },
@@ -1540,13 +1540,13 @@
       "solar": null,
       "fallback_source_ids": {
         "cloud_cover": 219419,
-        "wind_speed": 219419,
-        "wind_direction": 219419,
-        "pressure_msl": 219419,
         "visibility": 219419,
+        "wind_gust_speed": 219419,
+        "wind_speed": 219419,
         "wind_gust_direction": 219419,
+        "pressure_msl": 219419,
         "sunshine": 219419,
-        "wind_gust_speed": 219419
+        "wind_direction": 219419
       },
       "icon": "cloudy"
     }
@@ -1561,8 +1561,8 @@
       "height": 216.5,
       "station_name": "Arnstein-M\u00fcdesheim",
       "wmo_station_id": "P125",
-      "first_record": "2010-01-01T00:00:00+00:00",
-      "last_record": "2025-02-13T23:00:00+00:00",
+      "first_record": "2010-01-01T01:00:00+01:00",
+      "last_record": "2025-10-23T01:00:00+02:00",
       "distance": 7199.0
     },
     {
@@ -1574,8 +1574,8 @@
       "height": 281.73,
       "station_name": "Kissingen, Bad",
       "wmo_station_id": "10658",
-      "first_record": "2010-01-01T00:00:00+00:00",
-      "last_record": "2025-02-14T23:00:00+00:00",
+      "first_record": "2010-01-01T01:00:00+01:00",
+      "last_record": "2025-10-23T01:00:00+02:00",
       "distance": 25569.0
     }
   ]

tests/testdata/weatherforecast_brightsky_2.json (vendored, 216 changed lines)
@@ -2,6 +2,7 @@
   "records": [
     {
       "date_time": "2024-10-26 00:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -23,11 +24,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 01:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -49,11 +50,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 02:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -75,11 +76,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 03:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -101,11 +102,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 04:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -127,11 +128,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 05:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 87.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -153,11 +154,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 06:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -179,11 +180,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 07:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -205,11 +206,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 08:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -231,11 +232,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 09:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -257,11 +258,11 @@
       "weather_ozone": null,
       "weather_ghi": 22.22705922303379,
       "weather_dni": 0.0,
-      "weather_dhi": 22.22705922303379,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 22.22705922303379
     },
     {
       "date_time": "2024-10-26 10:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -283,11 +284,11 @@
       "weather_ozone": null,
       "weather_ghi": 68.16265202099999,
       "weather_dni": 0.0,
-      "weather_dhi": 68.16265202099999,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 68.16265202099999
     },
     {
       "date_time": "2024-10-26 11:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -309,11 +310,11 @@
       "weather_ozone": null,
       "weather_ghi": 108.0100746278567,
       "weather_dni": 0.0,
-      "weather_dhi": 108.0100746278567,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 108.0100746278567
     },
     {
       "date_time": "2024-10-26 12:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -335,11 +336,11 @@
       "weather_ozone": null,
       "weather_ghi": 134.2816493853918,
       "weather_dni": 0.0,
-      "weather_dhi": 134.2816493853918,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 134.2816493853918
     },
     {
       "date_time": "2024-10-26 13:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -361,11 +362,11 @@
       "weather_ozone": null,
       "weather_ghi": 144.04237088707308,
       "weather_dni": 0.0,
-      "weather_dhi": 144.04237088707308,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 144.04237088707308
     },
     {
       "date_time": "2024-10-26 14:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -387,11 +388,11 @@
       "weather_ozone": null,
       "weather_ghi": 136.35519419190516,
       "weather_dni": 0.0,
-      "weather_dhi": 136.35519419190516,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 136.35519419190516
     },
     {
       "date_time": "2024-10-26 15:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -413,11 +414,11 @@
       "weather_ozone": null,
       "weather_ghi": 111.94730962791996,
       "weather_dni": 0.0,
-      "weather_dhi": 111.94730962791996,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 111.94730962791996
     },
     {
       "date_time": "2024-10-26 16:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -439,11 +440,11 @@
       "weather_ozone": null,
       "weather_ghi": 73.45834328182735,
       "weather_dni": 0.0,
-      "weather_dhi": 73.45834328182735,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 73.45834328182735
     },
     {
       "date_time": "2024-10-26 17:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 87.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -465,11 +466,11 @@
       "weather_ozone": null,
       "weather_ghi": 34.07062080450064,
       "weather_dni": 0.0,
-      "weather_dhi": 34.07062080450064,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 34.07062080450064
     },
     {
       "date_time": "2024-10-26 18:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 62.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -491,11 +492,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.11256372587508819,
       "weather_dni": 0.0,
-      "weather_dhi": 0.11256372587508819,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.11256372587508819
     },
     {
       "date_time": "2024-10-26 19:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -517,11 +518,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 20:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -543,11 +544,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 21:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 87.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -569,11 +570,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 22:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 87.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -595,11 +596,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-26 23:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 87.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -621,11 +622,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 00:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -647,11 +648,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 01:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -673,11 +674,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 02:00:00+02:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -699,11 +700,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 02:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -725,11 +726,11 @@
       "weather_ozone": null,
       "weather_ghi": null,
       "weather_dni": null,
-      "weather_dhi": null,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": null
     },
     {
       "date_time": "2024-10-27 03:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -751,11 +752,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 04:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -777,11 +778,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 05:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -803,11 +804,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 06:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -829,11 +830,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 07:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -855,11 +856,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 08:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -881,11 +882,11 @@
       "weather_ozone": null,
       "weather_ghi": 20.901591088639343,
       "weather_dni": 0.0,
-      "weather_dhi": 20.901591088639343,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 20.901591088639343
     },
     {
       "date_time": "2024-10-27 09:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -907,11 +908,11 @@
       "weather_ozone": null,
       "weather_ghi": 66.41841804602629,
       "weather_dni": 0.0,
-      "weather_dhi": 66.41841804602629,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 66.41841804602629
     },
     {
       "date_time": "2024-10-27 10:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -933,11 +934,11 @@
       "weather_ozone": null,
       "weather_ghi": 106.12345605852113,
       "weather_dni": 0.0,
-      "weather_dhi": 106.12345605852113,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 106.12345605852113
     },
     {
       "date_time": "2024-10-27 11:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -959,11 +960,11 @@
       "weather_ozone": null,
       "weather_ghi": 132.31929512932624,
       "weather_dni": 0.0,
-      "weather_dhi": 132.31929512932624,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 132.31929512932624
     },
     {
       "date_time": "2024-10-27 12:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -985,11 +986,11 @@
       "weather_ozone": null,
       "weather_ghi": 142.03807516868267,
       "weather_dni": 0.0,
-      "weather_dhi": 142.03807516868267,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 142.03807516868267
     },
     {
       "date_time": "2024-10-27 13:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1011,11 +1012,11 @@
       "weather_ozone": null,
       "weather_ghi": 134.33853283469773,
       "weather_dni": 0.0,
-      "weather_dhi": 134.33853283469773,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 134.33853283469773
     },
     {
       "date_time": "2024-10-27 14:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1037,11 +1038,11 @@
       "weather_ozone": null,
       "weather_ghi": 109.95561941571053,
       "weather_dni": 0.0,
-      "weather_dhi": 109.95561941571053,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 109.95561941571053
     },
     {
       "date_time": "2024-10-27 15:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 87.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1063,11 +1064,11 @@
       "weather_ozone": null,
       "weather_ghi": 88.84019629738314,
       "weather_dni": 0.0,
-      "weather_dhi": 88.84019629738314,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 88.84019629738314
     },
     {
       "date_time": "2024-10-27 16:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1089,11 +1090,11 @@
       "weather_ozone": null,
       "weather_ghi": 25.90303659201319,
       "weather_dni": 0.0,
-      "weather_dhi": 25.90303659201319,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 25.90303659201319
     },
     {
       "date_time": "2024-10-27 17:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1115,11 +1116,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.027781191847857583,
       "weather_dni": 0.0,
-      "weather_dhi": 0.027781191847857583,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.027781191847857583
     },
     {
       "date_time": "2024-10-27 18:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1141,11 +1142,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 19:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1167,11 +1168,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 20:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1193,11 +1194,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 21:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1219,11 +1220,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 22:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1245,11 +1246,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-27 23:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1271,11 +1272,11 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     },
     {
       "date_time": "2024-10-28 00:00:00+01:00",
+      "configured_data": {},
       "weather_total_clouds": 100.0,
       "weather_low_clouds": null,
       "weather_medium_clouds": null,
@@ -1285,7 +1286,7 @@
       "weather_precip_type": null,
       "weather_precip_prob": null,
       "weather_precip_amt": 0.0,
-      "weather_preciptable_water": 2.174201406952952,
+      "weather_preciptable_water": null,
       "weather_wind_speed": 11.9,
       "weather_wind_direction": 180.0,
       "weather_frost_chance": null,
@@ -1297,14 +1298,12 @@
       "weather_ozone": null,
       "weather_ghi": 0.0,
       "weather_dni": 0.0,
-      "weather_dhi": 0.0,
-      "start_datetime": "2024-10-26 00:00:00+02:00"
+      "weather_dhi": 0.0
     }
   ],
-  "update_datetime": "2025-02-15T08:48:36.218971+01:00",
-  "start_datetime": "2024-10-26T00:00:00+02:00",
-  "min_datetime": "2024-10-26T00:00:00+02:00",
-  "max_datetime": "2024-10-28T00:00:00+01:00",
+  "update_datetime": "2025-10-23 17:16:44.461593+02:00",
+  "min_datetime": "2024-10-26 00:00:00+02:00",
+  "max_datetime": "2024-10-28 00:00:00+01:00",
   "record_keys": [
     "date_time",
     "weather_total_clouds",
@@ -1328,8 +1327,7 @@
     "weather_ozone",
     "weather_ghi",
     "weather_dni",
-    "weather_dhi",
-    "start_datetime"
+    "weather_dhi"
   ],
   "record_keys_writable": [
     "date_time",
@@ -1356,8 +1354,8 @@
     "weather_dni",
     "weather_dhi"
   ],
-  "end_datetime": "2024-10-28T00:00:00+01:00",
-  "keep_datetime": "2024-10-24T00:00:00+02:00",
+  "end_datetime": "2024-10-28 00:00:00+01:00",
+  "keep_datetime": "2024-10-24 00:00:00+02:00",
   "total_hours": 49,
   "keep_hours": 48
 }
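Net effect of the fixture rework above: every record gains a "configured_data" object, the per-record "start_datetime" duplicate is dropped (also from "record_keys"), and the series-level datetimes switch from the ISO "T" separator to a space-separated form. A minimal sanity check over the new layout might look like the following sketch (illustrative only, not part of the change; the path assumes the repository root as working directory):

    import json
    from pathlib import Path

    # Load the reworked fixture; the path assumes the repo root as cwd.
    fixture = Path("tests/testdata/weatherforecast_brightsky_2.json")
    data = json.loads(fixture.read_text())

    # Each record carries "configured_data" and no longer duplicates
    # the series-level "start_datetime".
    for record in data["records"]:
        assert "configured_data" in record
        assert "start_datetime" not in record

    # The series-level key list was trimmed to match.
    assert "start_datetime" not in data["record_keys"]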