Improve EOSdash.
Make EOSdash use UI components from MonsterUI to ease further development.

- Add a first menu with some dummy pages and the configuration page.
- Make the configuration scrollable.
- Add a markdown component that uses markdown-it-py (same as used by the myst-parser for documentation generation).
- Add a bokeh (https://docs.bokeh.org/) component for charts.
- Add several prediction charts to the demo.
- Add a footer that displays the connection status with the EOS server.
- Add logo and favicon.

Update EOS server:

- Move error message generation to an extra module.
- Use redirect instead of proxy.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
Makefile

@@ -19,8 +19,10 @@ help:
 	@echo " read-docs - Read HTML documentation in your browser."
 	@echo " gen-docs - Generate openapi.json and docs/_generated/*."
 	@echo " clean-docs - Remove generated documentation."
-	@echo " run - Run EOS production server in the virtual environment."
+	@echo " run - Run EOS production server in virtual environment."
-	@echo " run-dev - Run EOS development server in the virtual environment (automatically reloads)."
+	@echo " run-dev - Run EOS development server in virtual environment (automatically reloads)."
+	@echo " run-dash - Run EOSdash production server in virtual environment."
+	@echo " run-dash-dev - Run EOSdash development server in virtual environment (automatically reloads)."
 	@echo " dist - Create distribution (in dist/)."
 	@echo " clean - Remove generated documentation, distribution and virtual environment."

@@ -85,11 +87,19 @@ clean: clean-docs

 run:
 	@echo "Starting EOS production server, please wait..."
-	.venv/bin/python src/akkudoktoreos/server/eos.py
+	.venv/bin/python -m akkudoktoreos.server.eos

 run-dev:
 	@echo "Starting EOS development server, please wait..."
-	.venv/bin/python src/akkudoktoreos/server/eos.py --host localhost --port 8503 --reload true
+	.venv/bin/python -m akkudoktoreos.server.eos --host localhost --port 8503 --reload true

+run-dash:
+	@echo "Starting EOSdash production server, please wait..."
+	.venv/bin/python -m akkudoktoreos.server.eosdash
+
+run-dash-dev:
+	@echo "Starting EOSdash development server, please wait..."
+	.venv/bin/python -m akkudoktoreos.server.eosdash --host localhost --port 8504 --reload true
+
 # Target to setup tests.
 test-setup: pip-dev
@@ -871,9 +871,6 @@ Validators:

 ## Server Configuration

-Attributes:
-    To be added
-
 :::{table} server
 :widths: 10 20 10 5 5 30
 :align: left
@@ -366,7 +366,7 @@ Returns:
 Fastapi Config Reset Post

 ```
-Reset the configuration.
+Reset the configuration to the EOS configuration file.

 Returns:
     configuration (ConfigEOS): The current configuration after update.

@@ -674,6 +674,42 @@ Merge the measurement of given key and value into EOS measurements at given date

+---
+
+## GET /v1/prediction/dataframe
+
+**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_dataframe_get_v1_prediction_dataframe_get), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_dataframe_get_v1_prediction_dataframe_get)
+
+Fastapi Prediction Dataframe Get
+
+```
+Get prediction for given key within given date range as series.
+
+Args:
+    key (str): Prediction key
+    start_datetime (Optional[str]): Starting datetime (inclusive).
+        Defaults to start datetime of latest prediction.
+    end_datetime (Optional[str]): Ending datetime (exclusive).
+        Defaults to end datetime of latest prediction.
+```
+
+**Parameters**:
+
+- `keys` (query, required): Prediction keys.
+
+- `start_datetime` (query, optional): Starting datetime (inclusive).
+
+- `end_datetime` (query, optional): Ending datetime (exclusive).
+
+- `interval` (query, optional): Time duration for each interval. Defaults to 1 hour.
+
+**Responses**:
+
+- **200**: Successful Response
+
+- **422**: Validation Error
+
+---
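As a quick illustration, the new endpoint can be exercised with `requests` against a local EOS server; the prediction key used here is a hypothetical placeholder, not necessarily a real key:

```python
import requests

# Build (but do not send) a GET request against a local EOS server.
# "pvforecast_ac_power" is a made-up key used for illustration only.
req = requests.Request(
    "GET",
    "http://localhost:8503/v1/prediction/dataframe",
    params={
        "keys": ["pvforecast_ac_power"],  # list values become repeated query params
        "interval": "1 hour",
    },
).prepare()

print(req.url)
```

Sending the prepared request with `requests.Session().send(req)` would return a `PydanticDateTimeDataFrame` JSON body on success (200) or a validation error (422).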

## PUT /v1/prediction/import/{provider_id}

**Links**: [local](http://localhost:8503/docs#/default/fastapi_prediction_import_provider_v1_prediction_import__provider_id__put), [eos](https://petstore3.swagger.io/?url=https://raw.githubusercontent.com/Akkudoktor-EOS/EOS/refs/heads/main/openapi.json#/default/fastapi_prediction_import_provider_v1_prediction_import__provider_id__put)
openapi.json

@@ -2229,7 +2229,7 @@
       "type": "object"
     },
     "ServerCommonSettings": {
-      "description": "Server Configuration.\n\nAttributes:\n To be added",
+      "description": "Server Configuration.",
       "properties": {
         "eosdash_host": {
           "anyOf": [

@@ -3371,7 +3371,7 @@
     },
     "/v1/config/reset": {
       "post": {
-        "description": "Reset the configuration.\n\nReturns:\n configuration (ConfigEOS): The current configuration after update.",
+        "description": "Reset the configuration to the EOS configuration file.\n\nReturns:\n configuration (ConfigEOS): The current configuration after update.",
         "operationId": "fastapi_config_reset_post_v1_config_reset_post",
         "responses": {
           "200": {

@@ -3951,6 +3951,108 @@
         ]
       }
     },
+    "/v1/prediction/dataframe": {
+      "get": {
+        "description": "Get prediction for given key within given date range as series.\n\nArgs:\n key (str): Prediction key\n start_datetime (Optional[str]): Starting datetime (inclusive).\n Defaults to start datetime of latest prediction.\n end_datetime (Optional[str]: Ending datetime (exclusive).\n\nDefaults to end datetime of latest prediction.",
+        "operationId": "fastapi_prediction_dataframe_get_v1_prediction_dataframe_get",
+        "parameters": [
+          {
+            "description": "Prediction keys.",
+            "in": "query",
+            "name": "keys",
+            "required": true,
+            "schema": {
+              "description": "Prediction keys.",
+              "items": {
+                "type": "string"
+              },
+              "title": "Keys",
+              "type": "array"
+            }
+          },
+          {
+            "description": "Starting datetime (inclusive).",
+            "in": "query",
+            "name": "start_datetime",
+            "required": false,
+            "schema": {
+              "anyOf": [
+                {
+                  "type": "string"
+                },
+                {
+                  "type": "null"
+                }
+              ],
+              "description": "Starting datetime (inclusive).",
+              "title": "Start Datetime"
+            }
+          },
+          {
+            "description": "Ending datetime (exclusive).",
+            "in": "query",
+            "name": "end_datetime",
+            "required": false,
+            "schema": {
+              "anyOf": [
+                {
+                  "type": "string"
+                },
+                {
+                  "type": "null"
+                }
+              ],
+              "description": "Ending datetime (exclusive).",
+              "title": "End Datetime"
+            }
+          },
+          {
+            "description": "Time duration for each interval. Defaults to 1 hour.",
+            "in": "query",
+            "name": "interval",
+            "required": false,
+            "schema": {
+              "anyOf": [
+                {
+                  "type": "string"
+                },
+                {
+                  "type": "null"
+                }
+              ],
+              "description": "Time duration for each interval. Defaults to 1 hour.",
+              "title": "Interval"
+            }
+          }
+        ],
+        "responses": {
+          "200": {
+            "content": {
+              "application/json": {
+                "schema": {
+                  "$ref": "#/components/schemas/PydanticDateTimeDataFrame"
+                }
+              }
+            },
+            "description": "Successful Response"
+          },
+          "422": {
+            "content": {
+              "application/json": {
+                "schema": {
+                  "$ref": "#/components/schemas/HTTPValidationError"
+                }
+              }
+            },
+            "description": "Validation Error"
+          }
+        },
+        "summary": "Fastapi Prediction Dataframe Get",
+        "tags": [
+          "prediction"
+        ]
+      }
+    },
     "/v1/prediction/import/{provider_id}": {
       "put": {
         "description": "Import prediction for given provider ID.\n\nArgs:\n provider_id: ID of provider to update.\n data: Prediction data.\n force_enable: Update data even if provider is disabled.\n Defaults to False.",
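A small sanity check (a sketch, using a minimal stand-in for the fragment above rather than the full generated openapi.json) can confirm the new path exposes the expected query parameters:

```python
import json

# Minimal stand-in for the relevant part of openapi.json shown above.
openapi = json.loads("""
{
  "paths": {
    "/v1/prediction/dataframe": {
      "get": {
        "operationId": "fastapi_prediction_dataframe_get_v1_prediction_dataframe_get",
        "parameters": [
          {"name": "keys", "in": "query", "required": true},
          {"name": "start_datetime", "in": "query", "required": false},
          {"name": "end_datetime", "in": "query", "required": false},
          {"name": "interval", "in": "query", "required": false}
        ]
      }
    }
  }
}
""")

get_op = openapi["paths"]["/v1/prediction/dataframe"]["get"]
param_names = [p["name"] for p in get_op["parameters"]]
assert param_names == ["keys", "start_datetime", "end_datetime", "interval"]
# Only "keys" is required.
assert [p["name"] for p in get_op["parameters"] if p["required"]] == ["keys"]
```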
@@ -4,6 +4,10 @@ numpydantic==1.6.7
 matplotlib==3.10.0
 fastapi[standard]==0.115.7
 python-fasthtml==0.12.0
+MonsterUI==0.0.29
+markdown-it-py==3.0.0
+mdit-py-plugins==0.4.2
+bokeh==3.6.3
 uvicorn==0.34.0
 scikit-learn==1.6.1
 timezonefinder==6.5.8
@@ -1845,6 +1845,88 @@ class DataContainer(SingletonMixin, DataBase, MutableMapping):

         return array

+    def keys_to_dataframe(
+        self,
+        keys: list[str],
+        start_datetime: Optional[DateTime] = None,
+        end_datetime: Optional[DateTime] = None,
+        interval: Optional[Any] = None,  # Duration assumed
+        fill_method: Optional[str] = None,
+    ) -> pd.DataFrame:
+        """Retrieve a dataframe indexed by fixed time intervals for specified keys from the data in each DataProvider.
+
+        Generates a pandas DataFrame using the NumPy arrays for each specified key, ensuring a common time index.
+
+        Args:
+            keys (list[str]): A list of field names to retrieve.
+            start_datetime (datetime, optional): Start date for filtering records (inclusive).
+            end_datetime (datetime, optional): End date for filtering records (exclusive).
+            interval (duration, optional): The fixed time interval. Defaults to 1 hour.
+            fill_method (str, optional): Method to handle missing values during resampling.
+                - 'linear': Linearly interpolate missing values (for numeric data only).
+                - 'ffill': Forward fill missing values.
+                - 'bfill': Backward fill missing values.
+                - 'none': Defaults to 'linear' for numeric values, otherwise 'ffill'.
+
+        Returns:
+            pd.DataFrame: A DataFrame where each column represents a key's array with a common time index.
+
+        Raises:
+            KeyError: If no valid data is found for any of the requested keys.
+            ValueError: If any retrieved array has a different time index than the first one.
+        """
+        # Ensure datetime objects are normalized
+        start_datetime = to_datetime(start_datetime, to_maxtime=False) if start_datetime else None
+        end_datetime = to_datetime(end_datetime, to_maxtime=False) if end_datetime else None
+        if interval is None:
+            interval = to_duration("1 hour")
+        if start_datetime is None:
+            # Take earliest datetime of all providers that are enabled
+            for provider in self.enabled_providers:
+                if start_datetime is None:
+                    start_datetime = provider.min_datetime
+                elif (
+                    provider.min_datetime
+                    and compare_datetimes(provider.min_datetime, start_datetime).lt
+                ):
+                    start_datetime = provider.min_datetime
+        if end_datetime is None:
+            # Take latest datetime of all providers that are enabled
+            for provider in self.enabled_providers:
+                if end_datetime is None:
+                    end_datetime = provider.max_datetime
+                elif (
+                    provider.max_datetime
+                    and compare_datetimes(provider.max_datetime, end_datetime).gt
+                ):
+                    end_datetime = provider.max_datetime
+            if end_datetime:
+                end_datetime = end_datetime.add(seconds=1)
+
+        # Create a DatetimeIndex based on start, end, and interval
+        reference_index = pd.date_range(
+            start=start_datetime, end=end_datetime, freq=interval, inclusive="left"
+        )
+
+        data = {}
+        for key in keys:
+            try:
+                array = self.key_to_array(key, start_datetime, end_datetime, interval, fill_method)
+
+                if len(array) != len(reference_index):
+                    raise ValueError(
+                        f"Array length mismatch for key '{key}' (expected {len(reference_index)}, got {len(array)})"
+                    )
+
+                data[key] = array
+            except KeyError as e:
+                raise KeyError(f"Failed to retrieve data for key '{key}': {e}")
+
+        if not data:
+            raise KeyError(f"No valid data found for the requested keys {keys}.")
+
+        return pd.DataFrame(data, index=reference_index)
+
     def provider_by_id(self, provider_id: str) -> DataProvider:
         """Retrieves a data provider by its unique identifier.
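The core of `keys_to_dataframe` — a left-inclusive fixed-interval index shared by all key arrays — can be sketched with plain pandas. The key names and values below are synthetic illustrations, not real EOS provider data:

```python
import numpy as np
import pandas as pd

start = pd.Timestamp("2025-01-01 00:00")
end = pd.Timestamp("2025-01-01 04:00")

# Left-inclusive range, as in keys_to_dataframe: the end datetime is excluded,
# so 00:00..03:00 gives four hourly intervals.
reference_index = pd.date_range(start=start, end=end, freq="1h", inclusive="left")
assert len(reference_index) == 4

# Each key contributes one array; lengths must match the reference index,
# otherwise keys_to_dataframe raises ValueError.
data = {
    "load_mean": np.array([500.0, 480.0, 510.0, 530.0]),
    "elecprice_marketprice": np.array([0.30, 0.28, 0.27, 0.29]),
}
df = pd.DataFrame(data, index=reference_index)
```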
@@ -437,6 +437,10 @@ class PydanticDateTimeDataFrame(PydanticBaseModel):
         index = pd.Index([to_datetime(dt, in_timezone=self.tz) for dt in df.index])
         df.index = index
+
+        # Check if 'date_time' column exists, if not, create it
+        if "date_time" not in df.columns:
+            df["date_time"] = df.index
+
         dtype_mapping = {
             "int": int,
             "float": float,
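The added guard simply mirrors the datetime index into a `date_time` column when none exists; in isolation the effect is:

```python
import pandas as pd

df = pd.DataFrame(
    {"value": [1.0, 2.0]},
    index=pd.to_datetime(["2025-01-01", "2025-01-02"]),
)

# Check if 'date_time' column exists, if not, create it (as in the patch above).
if "date_time" not in df.columns:
    df["date_time"] = df.index

assert list(df.columns) == ["value", "date_time"]
assert (df["date_time"] == df.index).all()
```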
@@ -63,6 +63,9 @@ class ElecPriceImport(ElecPriceProvider, PredictionImportProvider):
         return "ElecPriceImport"

     def _update_data(self, force_update: Optional[bool] = False) -> None:
+        if self.config.elecprice.provider_settings is None:
+            logger.debug(f"{self.provider_id()} data update without provider settings.")
+            return
         if self.config.elecprice.provider_settings.import_file_path:
             self.import_from_file(
                 self.config.elecprice.provider_settings.import_file_path,
@@ -62,6 +62,9 @@ class LoadImport(LoadProvider, PredictionImportProvider):
         return "LoadImport"

     def _update_data(self, force_update: Optional[bool] = False) -> None:
+        if self.config.load.provider_settings is None:
+            logger.debug(f"{self.provider_id()} data update without provider settings.")
+            return
         if self.config.load.provider_settings.import_file_path:
             self.import_from_file(self.config.provider_settings.import_file_path, key_prefix="load")
         if self.config.load.provider_settings.import_json:
@@ -63,6 +63,9 @@ class PVForecastImport(PVForecastProvider, PredictionImportProvider):
         return "PVForecastImport"

     def _update_data(self, force_update: Optional[bool] = False) -> None:
+        if self.config.pvforecast.provider_settings is None:
+            logger.debug(f"{self.provider_id()} data update without provider settings.")
+            return
         if self.config.pvforecast.provider_settings.import_file_path is not None:
             self.import_from_file(
                 self.config.pvforecast.provider_settings.import_file_path,
@@ -63,6 +63,9 @@ class WeatherImport(WeatherProvider, PredictionImportProvider):
         return "WeatherImport"

     def _update_data(self, force_update: Optional[bool] = False) -> None:
+        if self.config.weather.provider_settings is None:
+            logger.debug(f"{self.provider_id()} data update without provider settings.")
+            return
         if self.config.weather.provider_settings.import_file_path:
             self.import_from_file(
                 self.config.weather.provider_settings.import_file_path, key_prefix="weather"
New dashboard package and binary assets:

- src/akkudoktoreos/server/dash/__init__.py (new, empty)
- three further binary image assets (22 KiB, 112 KiB, 20 KiB)
- src/akkudoktoreos/server/dash/assets/favicon/favicon-16x16.png (binary, 724 B)
- src/akkudoktoreos/server/dash/assets/favicon/favicon-32x32.png (binary, 1.6 KiB)
- src/akkudoktoreos/server/dash/assets/favicon/favicon.ico (binary, 15 KiB)

Web app manifest (one line added):

@@ -0,0 +1 @@
+{"name":"","short_name":"","icons":[{"src":"/android-chrome-192x192.png","sizes":"192x192","type":"image/png"},{"src":"/android-chrome-512x512.png","sizes":"512x512","type":"image/png"}],"theme_color":"#ffffff","background_color":"#ffffff","display":"standalone"}

- src/akkudoktoreos/server/dash/assets/icon.png (binary, 7.5 KiB)
- src/akkudoktoreos/server/dash/assets/logo.png (binary, 12 KiB)
src/akkudoktoreos/server/dash/bokeh.py (new file, 38 lines):

# Module taken from https://github.com/koaning/fh-altair
# MIT license

from typing import Optional

from bokeh.embed import components
from bokeh.models import Plot
from monsterui.franken import H4, Card, NotStr, Script

BokehJS = [
    Script(src="https://cdn.bokeh.org/bokeh/release/bokeh-3.6.3.min.js", crossorigin="anonymous"),
    Script(
        src="https://cdn.bokeh.org/bokeh/release/bokeh-widgets-3.6.3.min.js",
        crossorigin="anonymous",
    ),
    Script(
        src="https://cdn.bokeh.org/bokeh/release/bokeh-tables-3.6.3.min.js", crossorigin="anonymous"
    ),
    Script(
        src="https://cdn.bokeh.org/bokeh/release/bokeh-gl-3.6.3.min.js", crossorigin="anonymous"
    ),
    Script(
        src="https://cdn.bokeh.org/bokeh/release/bokeh-mathjax-3.6.3.min.js",
        crossorigin="anonymous",
    ),
]


def Bokeh(plot: Plot, header: Optional[str] = None) -> Card:
    """Converts a Bokeh plot to a FastHTML FT component."""
    script, div = components(plot)
    if header:
        header = H4(header, cls="mt-2")
    return Card(
        NotStr(div),
        NotStr(script),
        header=header,
    )
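The five CDN `Script` tags above hard-code the version pinned in requirements (`bokeh==3.6.3`). A small helper could derive all URLs from a single version constant so the pin only lives in one place; this is a sketch, and `bokeh_cdn_urls` is a made-up name, not part of the module:

```python
# Sketch only: builds the same CDN URLs as the BokehJS list above
# from a single version constant. The helper name is hypothetical.
BOKEH_VERSION = "3.6.3"

_BUNDLES = ["", "-widgets", "-tables", "-gl", "-mathjax"]


def bokeh_cdn_urls(version: str = BOKEH_VERSION) -> list[str]:
    """Return the Bokeh CDN script URLs for the core and add-on bundles."""
    return [
        f"https://cdn.bokeh.org/bokeh/release/bokeh{bundle}-{version}.min.js"
        for bundle in _BUNDLES
    ]


urls = bokeh_cdn_urls()
```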
src/akkudoktoreos/server/dash/components.py (new file, 224 lines):

from typing import Any, Optional, Union

from fasthtml.common import H1, Div, Li

# from mdit_py_plugins import plugin1, plugin2
from monsterui.foundations import stringify
from monsterui.franken import (
    Button,
    ButtonT,
    Card,
    Container,
    ContainerT,
    Details,
    DivLAligned,
    DivRAligned,
    Grid,
    Input,
    P,
    Summary,
    TabContainer,
    UkIcon,
)

scrollbar_viewport_styles = (
    "scrollbar-width: none; -ms-overflow-style: none; -webkit-overflow-scrolling: touch;"
)

scrollbar_cls = "flex touch-none select-none transition-colors p-[1px]"


def ScrollArea(
    *c: Any, cls: Optional[Union[str, tuple]] = None, orientation: str = "vertical", **kwargs: Any
) -> Div:
    """Creates a styled scroll area.

    Args:
        orientation (str): The orientation of the scroll area. Defaults to vertical.
    """
    new_cls = "relative overflow-hidden"
    if cls:
        new_cls += f" {stringify(cls)}"
    kwargs["cls"] = new_cls

    content = Div(
        Div(*c, style="min-width:100%;display:table;"),
        style=f"overflow: {'hidden scroll' if orientation == 'vertical' else 'scroll'}; {scrollbar_viewport_styles}",
        cls="w-full h-full rounded-[inherit]",
        data_ref="viewport",
    )

    scrollbar = Div(
        Div(cls="bg-border rounded-full hidden relative flex-1", data_ref="thumb"),
        cls=f"{scrollbar_cls} flex-col h-2.5 w-full border-t border-t-transparent"
        if orientation == "horizontal"
        else f"{scrollbar_cls} w-2.5 h-full border-l border-l-transparent",
        data_ref="scrollbar",
        style=f"position: absolute;{'right:0; top:0;' if orientation == 'vertical' else 'bottom:0; left:0;'}",
    )

    return Div(
        content,
        scrollbar,
        role="region",
        tabindex="0",
        data_orientation=orientation,
        data_ref_scrollarea=True,
        aria_label="Scrollable content",
        **kwargs,
    )


def ConfigCard(
    config_name: str, config_type: str, read_only: str, value: str, default: str, description: str
) -> Card:
    return Card(
        Details(
            Summary(
                Grid(
                    Grid(
                        DivLAligned(
                            UkIcon(icon="play"),
                            P(config_name),
                        ),
                        DivRAligned(
                            P(read_only),
                        ),
                    ),
                    Input(value=value) if read_only == "rw" else P(value),
                ),
                # cls="flex cursor-pointer list-none items-center gap-4",
                cls="list-none",
            ),
            Grid(
                P(description),
                P(config_type),
            ),
            Grid(
                DivRAligned(
                    P("default") if read_only == "rw" else P(""),
                ),
                P(default) if read_only == "rw" else P(""),
            )
            if read_only == "rw"
            else None,
            cls="space-y-4 gap-4",
        ),
        cls="w-full",
    )


def DashboardHeader(title: Optional[str]) -> Div:
    """Creates a styled header with a title.

    Args:
        title (Optional[str]): The title text for the header.

    Returns:
        Div: A styled `Div` element containing the header.
    """
    if title is None:
        return Div("", cls="header")
    return Div(H1(title, cls="text-2xl font-bold mb-4"), cls="header")


def DashboardFooter(*c: Any, path: str) -> Card:
    """Creates a styled footer with the provided information.

    The footer content is reloaded every 5 seconds from path.

    Args:
        path (str): Path to reload footer content from

    Returns:
        Card: A styled `Card` element containing the footer.
    """
    return Card(
        Container(*c, id="footer-content"),
        hx_get=f"{path}",
        hx_trigger="every 5s",
        hx_target="#footer-content",
        hx_swap="innerHTML",
    )


def DashboardTrigger(*c: Any, cls: Optional[Union[str, tuple]] = None, **kwargs: Any) -> Button:
    """Creates a styled button for the dashboard trigger.

    Args:
        *c: Positional arguments to pass to the button.
        cls (Optional[str]): Additional CSS classes for styling. Defaults to None.
        **kwargs: Additional keyword arguments for the button.

    Returns:
        Button: A styled `Button` component.
    """
    new_cls = f"{ButtonT.primary}"
    if cls:
        new_cls += f" {stringify(cls)}"
    kwargs["cls"] = new_cls
    return Button(*c, submit=False, **kwargs)


def DashboardTabs(dashboard_items: dict[str, str]) -> Card:
    """Creates a dashboard tab with dynamic dashboard items.

    Args:
        dashboard_items (dict[str, str]): A dictionary of dashboard items where keys are item names
            and values are paths for navigation.

    Returns:
        Card: A styled `Card` component containing the dashboard tabs.
    """
    dash_items = [
        Li(
            DashboardTrigger(
                menu,
                hx_get=f"{path}",
                hx_target="#page-content",
                hx_swap="innerHTML",
            ),
        )
        for menu, path in dashboard_items.items()
    ]
    return Card(TabContainer(*dash_items, cls="gap-4"), alt=True)


def DashboardContent(content: Any) -> Card:
    """Creates a content section within a styled card.

    Args:
        content (Any): The content to display.

    Returns:
        Card: A styled `Card` element containing the content.
    """
    return Card(ScrollArea(Container(content, id="page-content"), cls="h-[75vh] w-full rounded-md"))


def Page(
    title: Optional[str],
    dashboard_items: dict[str, str],
    content: Any,
    footer_content: Any,
    footer_path: str,
) -> Div:
    """Generates a full-page layout with a header, dashboard items, content, and footer.

    Args:
        title (Optional[str]): The page title.
        dashboard_items (dict[str, str]): A dictionary of dashboard items.
        content (Any): The main content for the page.
        footer_content (Any): Footer content.
        footer_path (Any): Path to reload footer content from.

    Returns:
        Div: A `Div` element representing the entire page layout.
    """
    return Container(
        DashboardHeader(title),
        DashboardTabs(dashboard_items),
        DashboardContent(content),
        DashboardFooter(footer_content, path=footer_path),
        cls=("bg-background text-foreground w-screen p-4 space-y-4", ContainerT.xl),
    )
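`ScrollArea` and `DashboardTrigger` share the same pattern for merging caller-supplied classes into a component's base class string. Isolated from MonsterUI, with a plain `str`/`tuple` join standing in for `stringify`, the pattern looks like this (sketch; `merge_cls` is a made-up name):

```python
from typing import Optional, Union


def merge_cls(base: str, cls: Optional[Union[str, tuple]] = None) -> str:
    """Append caller-supplied classes to a component's base classes."""
    # Stand-in for monsterui.foundations.stringify: join tuples with spaces.
    if cls is None:
        return base
    extra = " ".join(cls) if isinstance(cls, tuple) else cls
    return f"{base} {extra}"


assert merge_cls("relative overflow-hidden") == "relative overflow-hidden"
assert merge_cls("relative overflow-hidden", "h-[75vh]") == "relative overflow-hidden h-[75vh]"
assert merge_cls("base", ("a", "b")) == "base a b"
```

The components then place the merged string into `kwargs["cls"]` before delegating to the underlying FastHTML element.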
275
src/akkudoktoreos/server/dash/configuration.py
Normal file
@ -0,0 +1,275 @@
|
|||||||
|
from typing import Any, Dict, List, Optional, Sequence, TypeVar, Union
|
||||||
|
|
||||||
|
import requests
|
||||||
|
from monsterui.franken import Div, DividerLine, P, Table, Tbody, Td, Th, Thead, Tr
|
||||||
|
from pydantic.fields import ComputedFieldInfo, FieldInfo
|
||||||
|
from pydantic_core import PydanticUndefined
|
||||||
|
|
||||||
|
from akkudoktoreos.config.config import get_config
|
||||||
|
from akkudoktoreos.core.logging import get_logger
|
||||||
|
from akkudoktoreos.core.pydantic import PydanticBaseModel
|
||||||
|
from akkudoktoreos.server.dash.components import ConfigCard
|
||||||
|
|
||||||
|
logger = get_logger(__name__)
|
||||||
|
config_eos = get_config()
|
||||||
|
|
||||||
|
T = TypeVar("T")
|
||||||
|
|
||||||
|
|
||||||
|
def get_nested_value(
    dictionary: Union[Dict[str, Any], List[Any]],
    keys: Sequence[Union[str, int]],
    default: Optional[T] = None,
) -> Union[Any, T]:
    """Retrieve a nested value from a dictionary or list using a sequence of keys.

    Args:
        dictionary (Union[Dict[str, Any], List[Any]]): The nested dictionary or list to search.
        keys (Sequence[Union[str, int]]): A sequence of keys or indices representing the path to the desired value.
        default (Optional[T]): A value to return if the path is not found.

    Returns:
        Union[Any, T]: The value at the specified nested path, or the default value if not found.

    Raises:
        TypeError: If the input is not a dictionary or list, or if keys are not a sequence.
        KeyError: If a key is not found in a dictionary.
        IndexError: If an index is out of range in a list.
    """
    if not isinstance(dictionary, (dict, list)):
        raise TypeError("The first argument must be a dictionary or list")
    if not isinstance(keys, Sequence):
        raise TypeError("Keys must be provided as a sequence (e.g., list, tuple)")

    if not keys:
        return dictionary

    try:
        # Traverse the structure
        current = dictionary
        for key in keys:
            if isinstance(current, dict) and isinstance(key, str):
                current = current[key]
            elif isinstance(current, list) and isinstance(key, int):
                current = current[key]
            else:
                raise KeyError(f"Invalid key or index: {key}")
        return current
    except (KeyError, IndexError, TypeError):
        return default

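The path semantics used for the configuration lookups can be exercised in isolation. The sketch below is a simplified standalone mirror of the helper (without the strict key/type checks), not the module's actual implementation:

```python
from typing import Any, Optional, Sequence, TypeVar, Union

T = TypeVar("T")


# Simplified mirror of get_nested_value() for illustration: walk a mixed
# dict/list structure along a key path, falling back to a default on any miss.
def get_nested(
    data: Union[dict, list], keys: Sequence[Union[str, int]], default: Optional[T] = None
) -> Any:
    current: Any = data
    try:
        for key in keys:
            current = current[key]  # works for dict[str] and list[int] alike
    except (KeyError, IndexError, TypeError):
        return default
    return current


config = {"server": {"port": 8503}, "planes": [{"peakpower": 5.0}]}
assert get_nested(config, ["server", "port"]) == 8503
assert get_nested(config, ["planes", 0, "peakpower"]) == 5.0
assert get_nested(config, ["server", "missing"], "<unknown>") == "<unknown>"
```

The `"<unknown>"` fallback matches how the configuration page later renders values it cannot resolve.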
def get_default_value(field_info: Union[FieldInfo, ComputedFieldInfo], regular_field: bool) -> Any:
    """Retrieve the default value of a field.

    Args:
        field_info (Union[FieldInfo, ComputedFieldInfo]): The field metadata from Pydantic.
        regular_field (bool): Indicates if the field is a regular field.

    Returns:
        Any: The default value of the field or "N/A" if not a regular field.
    """
    default_value = ""
    if regular_field:
        if (val := field_info.default) is not PydanticUndefined:
            default_value = val
    else:
        default_value = "N/A"
    return default_value

def resolve_nested_types(field_type: Any, parent_types: list[str]) -> list[tuple[Any, list[str]]]:
    """Resolve nested types within a field and return their structure.

    Args:
        field_type (Any): The type of the field to resolve.
        parent_types (List[str]): A list of parent type names.

    Returns:
        List[tuple[Any, List[str]]]: A list of tuples containing resolved types and their parent hierarchy.
    """
    resolved_types: list[tuple[Any, list[str]]] = []

    origin = getattr(field_type, "__origin__", field_type)
    if origin is Union:
        for arg in getattr(field_type, "__args__", []):
            if arg is not type(None):
                resolved_types.extend(resolve_nested_types(arg, parent_types))
    else:
        resolved_types.append((field_type, parent_types))

    return resolved_types

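For reference, the `Union` unwrapping that `resolve_nested_types` performs via `__origin__`/`__args__` corresponds to the stdlib `typing` helpers; a quick demonstration of what gets stripped for an `Optional` field annotation:

```python
from typing import Optional, Union, get_args, get_origin

# Optional[int] is shorthand for Union[int, None]; the resolver recurses into
# the Union members and skips NoneType, leaving the concrete type(s).
field_type = Optional[int]
assert get_origin(field_type) is Union

concrete = [arg for arg in get_args(field_type) if arg is not type(None)]
assert concrete == [int]
```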
def configuration(values: dict) -> list[dict]:
    """Generate configuration details based on provided values and model metadata.

    Args:
        values (dict): A dictionary containing the current configuration values.

    Returns:
        List[dict]: A sorted list of configuration details, each represented as a dictionary.
    """
    configs = []
    inner_types: set[type[PydanticBaseModel]] = set()

    for field_name, field_info in list(config_eos.model_fields.items()) + list(
        config_eos.model_computed_fields.items()
    ):

        def extract_nested_models(
            subfield_info: Union[ComputedFieldInfo, FieldInfo], parent_types: list[str]
        ) -> None:
            regular_field = isinstance(subfield_info, FieldInfo)
            subtype = subfield_info.annotation if regular_field else subfield_info.return_type

            if subtype in inner_types:
                return

            nested_types = resolve_nested_types(subtype, [])
            found_basic = False
            for nested_type, nested_parent_types in nested_types:
                if not isinstance(nested_type, type) or not issubclass(
                    nested_type, PydanticBaseModel
                ):
                    if found_basic:
                        continue

                    config = {}
                    config["name"] = ".".join(parent_types)
                    config["value"] = str(get_nested_value(values, parent_types, "<unknown>"))
                    config["default"] = str(get_default_value(subfield_info, regular_field))
                    config["description"] = (
                        subfield_info.description if subfield_info.description else ""
                    )
                    if isinstance(subfield_info, ComputedFieldInfo):
                        config["read-only"] = "ro"
                        type_description = str(subfield_info.return_type)
                    else:
                        config["read-only"] = "rw"
                        type_description = str(subfield_info.annotation)
                    config["type"] = (
                        type_description.replace("typing.", "")
                        .replace("pathlib.", "")
                        .replace("[", "[ ")
                        .replace("NoneType", "None")
                    )
                    configs.append(config)
                    found_basic = True
                else:
                    new_parent_types = parent_types + nested_parent_types
                    inner_types.add(nested_type)
                    for nested_field_name, nested_field_info in list(
                        nested_type.model_fields.items()
                    ) + list(nested_type.model_computed_fields.items()):
                        extract_nested_models(
                            nested_field_info,
                            new_parent_types + [nested_field_name],
                        )

        extract_nested_models(field_info, [field_name])
    return sorted(configs, key=lambda x: x["name"])

def get_configuration(eos_host: Optional[str], eos_port: Optional[Union[str, int]]) -> list[dict]:
    """Fetch and process configuration data from the specified EOS server.

    Args:
        eos_host (Optional[str]): The hostname of the server.
        eos_port (Optional[Union[str, int]]): The port of the server.

    Returns:
        List[dict]: A list of processed configuration entries.
    """
    if eos_host is None:
        eos_host = config_eos.server.host
    if eos_port is None:
        eos_port = config_eos.server.port
    server = f"http://{eos_host}:{eos_port}"

    # Get current configuration from server
    try:
        result = requests.get(f"{server}/v1/config")
        result.raise_for_status()
    except requests.exceptions.HTTPError as e:
        detail = result.json()["detail"]
        warning_msg = f"Can not retrieve configuration from {server}: {e}, {detail}"
        logger.warning(warning_msg)
        return configuration({})
    config = result.json()

    return configuration(config)

def Configuration(eos_host: Optional[str], eos_port: Optional[Union[str, int]]) -> Div:
    """Create a visual representation of the configuration.

    Args:
        eos_host (Optional[str]): The hostname of the EOS server.
        eos_port (Optional[Union[str, int]]): The port of the EOS server.

    Returns:
        Div: A `monsterui.franken.Div` component displaying the configuration cards.
    """
    rows = []
    last_category = ""
    for config in get_configuration(eos_host, eos_port):
        category = config["name"].split(".")[0]
        if category != last_category:
            rows.append(P(category))
            rows.append(DividerLine())
            last_category = category
        rows.append(
            ConfigCard(
                config["name"],
                config["type"],
                config["read-only"],
                config["value"],
                config["default"],
                config["description"],
            )
        )
    return Div(*rows, cls="space-y-4")

def ConfigurationOrg(eos_host: Optional[str], eos_port: Optional[Union[str, int]]) -> Table:
    """Create a visual representation of the configuration.

    Args:
        eos_host (Optional[str]): The hostname of the EOS server.
        eos_port (Optional[Union[str, int]]): The port of the EOS server.

    Returns:
        Table: A `monsterui.franken.Table` component displaying configuration details.
    """
    flds = "Name", "Type", "RO/RW", "Value", "Default", "Description"
    rows = [
        Tr(
            Td(
                config["name"],
                cls="max-w-64 text-wrap break-all",
            ),
            Td(
                config["type"],
                cls="max-w-48 text-wrap break-all",
            ),
            Td(
                config["read-only"],
                cls="max-w-24 text-wrap break-all",
            ),
            Td(
                config["value"],
                cls="max-w-md text-wrap break-all",
            ),
            Td(config["default"], cls="max-w-48 text-wrap break-all"),
            Td(
                config["description"],
                cls="max-w-prose text-wrap",
            ),
            cls="",
        )
        for config in get_configuration(eos_host, eos_port)
    ]
    head = Thead(*map(Th, flds), cls="text-left")
    return Table(head, Tbody(*rows), cls="w-full uk-table uk-table-divider uk-table-striped")
86
src/akkudoktoreos/server/dash/data/democonfig.json
Normal file
@ -0,0 +1,86 @@
{
    "elecprice": {
        "charges_kwh": 0.21,
        "provider": "ElecPriceAkkudoktor"
    },
    "general": {
        "latitude": 52.5,
        "longitude": 13.4
    },
    "prediction": {
        "historic_hours": 48,
        "hours": 48
    },
    "load": {
        "provider": "LoadAkkudoktor",
        "provider_settings": {
            "loadakkudoktor_year_energy": 20000
        }
    },
    "optimization": {
        "hours": 48
    },
    "pvforecast": {
        "planes": [
            {
                "peakpower": 5.0,
                "surface_azimuth": -10,
                "surface_tilt": 7,
                "userhorizon": [
                    20,
                    27,
                    22,
                    20
                ],
                "inverter_paco": 10000
            },
            {
                "peakpower": 4.8,
                "surface_azimuth": -90,
                "surface_tilt": 7,
                "userhorizon": [
                    30,
                    30,
                    30,
                    50
                ],
                "inverter_paco": 10000
            },
            {
                "peakpower": 1.4,
                "surface_azimuth": -40,
                "surface_tilt": 60,
                "userhorizon": [
                    60,
                    30,
                    0,
                    30
                ],
                "inverter_paco": 2000
            },
            {
                "peakpower": 1.6,
                "surface_azimuth": 5,
                "surface_tilt": 45,
                "userhorizon": [
                    45,
                    25,
                    30,
                    60
                ],
                "inverter_paco": 1400
            }
        ],
        "provider": "PVForecastAkkudoktor"
    },
    "server": {
        "startup_eosdash": true,
        "host": "0.0.0.0",
        "port": 8503,
        "eosdash_host": "0.0.0.0",
        "eosdash_port": 8504
    },
    "weather": {
        "provider": "BrightSky"
    }
}
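The four PV planes in the demo configuration add up to the plant size the demo charts should reflect; a quick sanity check over just the `peakpower` values excerpted from the file above:

```python
import json

# Excerpt of the pvforecast section from democonfig.json (values as shipped).
democonfig = json.loads(
    """
    {
      "pvforecast": {
        "planes": [
          {"peakpower": 5.0}, {"peakpower": 4.8},
          {"peakpower": 1.4}, {"peakpower": 1.6}
        ],
        "provider": "PVForecastAkkudoktor"
      }
    }
    """
)

# Sum the peak power over all configured planes.
total_kwp = sum(plane["peakpower"] for plane in democonfig["pvforecast"]["planes"])
assert abs(total_kwp - 12.8) < 1e-9  # 5.0 + 4.8 + 1.4 + 1.6
```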
217
src/akkudoktoreos/server/dash/demo.py
Normal file
@ -0,0 +1,217 @@
import json
from pathlib import Path
from typing import Union

import pandas as pd
import requests
from bokeh.models import ColumnDataSource, Range1d
from bokeh.plotting import figure
from monsterui.franken import FT, Grid, P

from akkudoktoreos.core.logging import get_logger
from akkudoktoreos.core.pydantic import PydanticDateTimeDataFrame
from akkudoktoreos.server.dash.bokeh import Bokeh

DIR_DEMODATA = Path(__file__).absolute().parent.joinpath("data")
FILE_DEMOCONFIG = DIR_DEMODATA.joinpath("democonfig.json")
if not FILE_DEMOCONFIG.exists():
    raise ValueError(f"File does not exist: {FILE_DEMOCONFIG}")

logger = get_logger(__name__)

# bar width for 1 hour bars (time given in milliseconds)
BAR_WIDTH_1HOUR = 1000 * 60 * 60


def DemoPVForecast(predictions: pd.DataFrame, config: dict) -> FT:
    source = ColumnDataSource(predictions)
    provider = config["pvforecast"]["provider"]

    plot = figure(
        x_axis_type="datetime",
        title=f"PV Power Prediction ({provider})",
        x_axis_label="Datetime",
        y_axis_label="Power [W]",
        sizing_mode="stretch_width",
        height=400,
    )

    plot.vbar(
        x="date_time",
        top="pvforecast_ac_power",
        source=source,
        width=BAR_WIDTH_1HOUR * 0.8,
        legend_label="AC Power",
        color="lightblue",
    )

    return Bokeh(plot)


def DemoElectricityPriceForecast(predictions: pd.DataFrame, config: dict) -> FT:
    source = ColumnDataSource(predictions)
    provider = config["elecprice"]["provider"]

    plot = figure(
        x_axis_type="datetime",
        y_range=Range1d(
            predictions["elecprice_marketprice_kwh"].min() - 0.1,
            predictions["elecprice_marketprice_kwh"].max() + 0.1,
        ),
        title=f"Electricity Price Prediction ({provider})",
        x_axis_label="Datetime",
        y_axis_label="Price [€/kWh]",
        sizing_mode="stretch_width",
        height=400,
    )
    plot.vbar(
        x="date_time",
        top="elecprice_marketprice_kwh",
        source=source,
        width=BAR_WIDTH_1HOUR * 0.8,
        legend_label="Market Price",
        color="lightblue",
    )

    return Bokeh(plot)


def DemoWeatherTempAir(predictions: pd.DataFrame, config: dict) -> FT:
    source = ColumnDataSource(predictions)
    provider = config["weather"]["provider"]

    plot = figure(
        x_axis_type="datetime",
        y_range=Range1d(
            predictions["weather_temp_air"].min() - 1.0, predictions["weather_temp_air"].max() + 1.0
        ),
        title=f"Air Temperature Prediction ({provider})",
        x_axis_label="Datetime",
        y_axis_label="Temperature [°C]",
        sizing_mode="stretch_width",
        height=400,
    )
    plot.line(
        "date_time", "weather_temp_air", source=source, legend_label="Air Temperature", color="blue"
    )

    return Bokeh(plot)


def DemoWeatherIrradiance(predictions: pd.DataFrame, config: dict) -> FT:
    source = ColumnDataSource(predictions)
    provider = config["weather"]["provider"]

    plot = figure(
        x_axis_type="datetime",
        title=f"Irradiance Prediction ({provider})",
        x_axis_label="Datetime",
        y_axis_label="Irradiance [W/m2]",
        sizing_mode="stretch_width",
        height=400,
    )
    plot.line(
        "date_time",
        "weather_ghi",
        source=source,
        legend_label="Global Horizontal Irradiance",
        color="red",
    )
    plot.line(
        "date_time",
        "weather_dni",
        source=source,
        legend_label="Direct Normal Irradiance",
        color="green",
    )
    plot.line(
        "date_time",
        "weather_dhi",
        source=source,
        legend_label="Diffuse Horizontal Irradiance",
        color="blue",
    )

    return Bokeh(plot)


def Demo(eos_host: str, eos_port: Union[str, int]) -> str:
    server = f"http://{eos_host}:{eos_port}"

    # Get current configuration from server
    try:
        result = requests.get(f"{server}/v1/config")
        result.raise_for_status()
    except requests.exceptions.HTTPError as err:
        detail = result.json()["detail"]
        return P(
            f"Can not retrieve configuration from {server}: {err}, {detail}",
            cls="text-center",
        )
    config = result.json()

    # Set demo configuration
    with FILE_DEMOCONFIG.open("r", encoding="utf-8") as fd:
        democonfig = json.load(fd)
    try:
        result = requests.put(f"{server}/v1/config", json=democonfig)
        result.raise_for_status()
    except requests.exceptions.HTTPError as err:
        detail = result.json()["detail"]
        # Try to reset to original config
        requests.put(f"{server}/v1/config", json=config)
        return P(
            f"Can not set demo configuration on {server}: {err}, {detail}",
            cls="text-center",
        )

    # Update all predictions
    try:
        result = requests.post(f"{server}/v1/prediction/update")
        result.raise_for_status()
    except requests.exceptions.HTTPError as err:
        detail = result.json()["detail"]
        # Try to reset to original config
        requests.put(f"{server}/v1/config", json=config)
        return P(
            f"Can not update predictions on {server}: {err}, {detail}",
            cls="text-center",
        )

    # Get Forecasts
    try:
        params = {
            "keys": [
                "pvforecast_ac_power",
                "elecprice_marketprice_kwh",
                "weather_temp_air",
                "weather_ghi",
                "weather_dni",
                "weather_dhi",
            ],
        }
        result = requests.get(f"{server}/v1/prediction/dataframe", params=params)
        result.raise_for_status()
        predictions = PydanticDateTimeDataFrame(**result.json()).to_dataframe()
    except requests.exceptions.HTTPError as err:
        detail = result.json()["detail"]
        return P(
            f"Can not retrieve predictions from {server}: {err}, {detail}",
            cls="text-center",
        )
    except Exception as err:
        return P(
            f"Can not retrieve predictions from {server}: {err}",
            cls="text-center",
        )

    # Reset to original config
    requests.put(f"{server}/v1/config", json=config)

    return Grid(
        DemoPVForecast(predictions, democonfig),
        DemoElectricityPriceForecast(predictions, democonfig),
        DemoWeatherTempAir(predictions, democonfig),
        DemoWeatherIrradiance(predictions, democonfig),
        cols_max=2,
    )
92
src/akkudoktoreos/server/dash/footer.py
Normal file
@ -0,0 +1,92 @@
from typing import Optional, Union

import requests
from monsterui.daisy import Loading, LoadingT
from monsterui.franken import A, ButtonT, DivFullySpaced, P
from requests.exceptions import RequestException

from akkudoktoreos.config.config import get_config
from akkudoktoreos.core.logging import get_logger

logger = get_logger(__name__)
config_eos = get_config()


def get_alive(eos_host: str, eos_port: Union[str, int]) -> str:
    """Fetch alive information from the specified EOS server.

    Args:
        eos_host (str): The hostname of the server.
        eos_port (Union[str, int]): The port of the server.

    Returns:
        str: Alive data.
    """
    result = requests.Response()
    try:
        result = requests.get(f"http://{eos_host}:{eos_port}/v1/health")
        if result.status_code == 200:
            alive = result.json()["status"]
        else:
            alive = f"Server responded with status code: {result.status_code}"
    except RequestException as e:
        warning_msg = f"{e}"
        logger.warning(warning_msg)
        alive = warning_msg

    return alive


def Footer(eos_host: Optional[str], eos_port: Optional[Union[str, int]]) -> str:
    if eos_host is None:
        eos_host = config_eos.server.host
    if eos_port is None:
        eos_port = config_eos.server.port
    alive_icon = None
    if eos_host is None or eos_port is None:
        alive = f"EOS server not given: {eos_host}:{eos_port}"
    else:
        alive = get_alive(eos_host, eos_port)
        if alive == "alive":
            alive_icon = Loading(
                cls=(
                    LoadingT.ring,
                    LoadingT.sm,
                ),
            )
            alive = f"EOS {eos_host}:{eos_port}"
    if alive_icon:
        alive_cls = f"{ButtonT.primary} uk-link rounded-md"
    else:
        alive_cls = f"{ButtonT.secondary} uk-link rounded-md"
    return DivFullySpaced(
        P(
            alive_icon,
            A(alive, href=f"http://{eos_host}:{eos_port}/docs", target="_blank", cls=alive_cls),
        ),
        P(
            A(
                "Documentation",
                href="https://akkudoktor-eos.readthedocs.io/en/latest/",
                target="_blank",
                cls="uk-link",
            ),
        ),
        P(
            A(
                "Issues",
                href="https://github.com/Akkudoktor-EOS/EOS/issues",
                target="_blank",
                cls="uk-link",
            ),
        ),
        P(
            A(
                "GitHub",
                href="https://github.com/Akkudoktor-EOS/EOS/",
                target="_blank",
                cls="uk-link",
            ),
        ),
        cls="uk-padding-remove-top uk-padding-remove-bottom",
    )
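The status classification inside `get_alive` can be sketched independently of the HTTP call; the helper name below is hypothetical and exists only for illustration:

```python
# Hypothetical pure-function sketch of the decision get_alive() makes once it
# has a response: a 200 yields the JSON "status" field, anything else a message.
def classify_health(status_code: int, payload: dict) -> str:
    if status_code == 200:
        return payload.get("status", "unknown")
    return f"Server responded with status code: {status_code}"


assert classify_health(200, {"status": "alive"}) == "alive"
assert classify_health(503, {}) == "Server responded with status code: 503"
```

Keeping the classification separate from the request would also make the footer logic unit-testable without a running EOS server.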
24
src/akkudoktoreos/server/dash/hello.py
Normal file
@ -0,0 +1,24 @@
from typing import Any

from fasthtml.common import Div

from akkudoktoreos.server.dash.markdown import Markdown

hello_md = """
# Akkudoktor EOSdash

The dashboard for Akkudoktor EOS.

EOS provides a comprehensive solution for simulating and optimizing an energy system based
on renewable energy sources. With a focus on photovoltaic (PV) systems, battery storage (batteries),
load management (consumer requirements), heat pumps, electric vehicles, and consideration of
electricity price data, this system enables forecasting and optimization of energy flow and costs
over a specified period.

Documentation can be found at [Akkudoktor-EOS](https://akkudoktor-eos.readthedocs.io/en/latest/).
"""


def Hello(**kwargs: Any) -> Div:
    return Markdown(hello_md, **kwargs)
136
src/akkudoktoreos/server/dash/markdown.py
Normal file
@ -0,0 +1,136 @@
"""Markdown rendering with MonsterUI HTML classes."""

from typing import Any, List, Optional, Union

from fasthtml.common import FT, Div, NotStr
from markdown_it import MarkdownIt
from markdown_it.renderer import RendererHTML
from markdown_it.token import Token
from monsterui.foundations import stringify


def render_heading(
    self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict
) -> str:
    """Custom renderer for Markdown headings.

    Adds specific CSS classes based on the heading level.

    Parameters:
        self: The renderer instance.
        tokens: List of tokens to be rendered.
        idx: Index of the current token.
        options: Rendering options.
        env: Environment sandbox for plugins.

    Returns:
        The rendered token as a string.
    """
    if tokens[idx].markup == "#":
        tokens[idx].attrSet("class", "uk-heading-divider uk-h1 uk-margin")
    elif tokens[idx].markup == "##":
        tokens[idx].attrSet("class", "uk-heading-divider uk-h2 uk-margin")
    elif tokens[idx].markup == "###":
        tokens[idx].attrSet("class", "uk-heading-divider uk-h3 uk-margin")
    elif tokens[idx].markup == "####":
        tokens[idx].attrSet("class", "uk-heading-divider uk-h4 uk-margin")

    # pass token to default renderer.
    return self.renderToken(tokens, idx, options, env)


def render_paragraph(
    self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict
) -> str:
    """Custom renderer for Markdown paragraphs.

    Adds specific CSS classes.

    Parameters:
        self: The renderer instance.
        tokens: List of tokens to be rendered.
        idx: Index of the current token.
        options: Rendering options.
        env: Environment sandbox for plugins.

    Returns:
        The rendered token as a string.
    """
    tokens[idx].attrSet("class", "uk-paragraph")

    # pass token to default renderer.
    return self.renderToken(tokens, idx, options, env)


def render_blockquote(
    self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict
) -> str:
    """Custom renderer for Markdown blockquotes.

    Adds specific CSS classes.

    Parameters:
        self: The renderer instance.
        tokens: List of tokens to be rendered.
        idx: Index of the current token.
        options: Rendering options.
        env: Environment sandbox for plugins.

    Returns:
        The rendered token as a string.
    """
    tokens[idx].attrSet("class", "uk-blockquote")

    # pass token to default renderer.
    return self.renderToken(tokens, idx, options, env)


def render_link(self: RendererHTML, tokens: List[Token], idx: int, options: dict, env: dict) -> str:
    """Custom renderer for Markdown links.

    Adds the target attribute to open links in a new tab.

    Parameters:
        self: The renderer instance.
        tokens: List of tokens to be rendered.
        idx: Index of the current token.
        options: Rendering options.
        env: Environment sandbox for plugins.

    Returns:
        The rendered token as a string.
    """
    tokens[idx].attrSet("class", "uk-link")
    tokens[idx].attrSet("target", "_blank")

    # pass token to default renderer.
    return self.renderToken(tokens, idx, options, env)


markdown = MarkdownIt("gfm-like")
markdown.add_render_rule("heading_open", render_heading)
markdown.add_render_rule("paragraph_open", render_paragraph)
markdown.add_render_rule("blockquote_open", render_blockquote)
markdown.add_render_rule("link_open", render_link)


markdown_cls = "bg-background text-lg ring-offset-background placeholder:text-muted-foreground focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"


def Markdown(*c: Any, cls: Optional[Union[str, tuple]] = None, **kwargs: Any) -> FT:
    """Component to render Markdown content with custom styling.

    Parameters:
        c: Markdown content to be rendered.
        cls: Optional additional CSS classes to be added.
        kwargs: Additional keyword arguments for the Div component.

    Returns:
        An FT object representing the rendered HTML content wrapped in a Div component.
    """
    new_cls = markdown_cls
    if cls:
        new_cls += f" {stringify(cls)}"
    kwargs["cls"] = new_cls
    md_html = markdown.render(*c)
    return Div(NotStr(md_html), **kwargs)
@ -7,12 +7,10 @@ import os
import signal
import subprocess
import sys
import time
from contextlib import asynccontextmanager
from pathlib import Path
from typing import Annotated, Any, AsyncGenerator, Dict, List, Optional, Union

import httpx
import psutil
import uvicorn
from fastapi import Body, FastAPI
@ -48,8 +46,9 @@ from akkudoktoreos.prediction.load import LoadCommonSettings
from akkudoktoreos.prediction.loadakkudoktor import LoadAkkudoktorCommonSettings
from akkudoktoreos.prediction.prediction import PredictionCommonSettings, get_prediction
from akkudoktoreos.prediction.pvforecast import PVForecastCommonSettings
from akkudoktoreos.server.rest.error import create_error_page
from akkudoktoreos.server.rest.tasks import repeat_every
from akkudoktoreos.server.server import get_default_host
from akkudoktoreos.server.server import get_default_host, wait_for_port_free
from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration

logger = get_logger(__name__)
@ -61,98 +60,6 @@ ems_eos = get_ems()
# Command line arguments
args = None

ERROR_PAGE_TEMPLATE = """
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Energy Optimization System (EOS) Error</title>
    <style>
        body {
            font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, sans-serif;
            background-color: #f5f5f5;
            display: flex;
            justify-content: center;
            align-items: center;
            height: 100vh;
            margin: 0;
            padding: 20px;
            box-sizing: border-box;
        }
        .error-container {
            background: white;
            padding: 2rem;
            border-radius: 8px;
            box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
            max-width: 500px;
            width: 100%;
            text-align: center;
        }
        .error-code {
            font-size: 4rem;
            font-weight: bold;
            color: #e53e3e;
            margin: 0;
        }
        .error-title {
            font-size: 1.5rem;
            color: #2d3748;
            margin: 1rem 0;
        }
        .error-message {
            color: #4a5568;
            margin-bottom: 1.5rem;
        }
        .error-details {
            background: #f7fafc;
            padding: 1rem;
            border-radius: 4px;
            margin-bottom: 1.5rem;
            text-align: left;
            font-family: monospace;
            white-space: pre-wrap;
            word-break: break-word;
        }
        .back-button {
            background: #3182ce;
            color: white;
            border: none;
            padding: 0.75rem 1.5rem;
            border-radius: 4px;
            text-decoration: none;
            display: inline-block;
            transition: background-color 0.2s;
        }
        .back-button:hover {
            background: #2c5282;
        }
    </style>
</head>
<body>
    <div class="error-container">
        <h1 class="error-code">STATUS_CODE</h1>
        <h2 class="error-title">ERROR_TITLE</h2>
        <p class="error-message">ERROR_MESSAGE</p>
        <div class="error-details">ERROR_DETAILS</div>
        <a href="/docs" class="back-button">Back to Home</a>
    </div>
</body>
</html>
"""


def create_error_page(
    status_code: str, error_title: str, error_message: str, error_details: str
) -> str:
    """Create an error page by replacing placeholders in the template."""
    return (
        ERROR_PAGE_TEMPLATE.replace("STATUS_CODE", status_code)
        .replace("ERROR_TITLE", error_title)
        .replace("ERROR_MESSAGE", error_message)
        .replace("ERROR_DETAILS", error_details)
    )

||||||
# ----------------------
|
# ----------------------
|
||||||
# EOSdash server startup
|
# EOSdash server startup
|
||||||
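The `create_error_page()` helper removed here (relocated to `src/akkudoktoreos/server/rest/error.py` by this PR) fills the HTML template with chained `str.replace()` calls. A minimal sketch of the technique, on a trimmed hypothetical stand-in for the real template:

```python
# Trimmed stand-in for ERROR_PAGE_TEMPLATE; the real template carries the full HTML/CSS.
TEMPLATE = "<h1>STATUS_CODE</h1><h2>ERROR_TITLE</h2><p>ERROR_MESSAGE</p><div>ERROR_DETAILS</div>"


def render_error(status_code: str, title: str, message: str, details: str) -> str:
    # Same chained str.replace technique as create_error_page().
    return (
        TEMPLATE.replace("STATUS_CODE", status_code)
        .replace("ERROR_TITLE", title)
        .replace("ERROR_MESSAGE", message)
        .replace("ERROR_DETAILS", details)
    )
```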
@@ -194,18 +101,8 @@ def start_eosdash(
     """
     eosdash_path = Path(__file__).parent.resolve().joinpath("eosdash.py")

-    # Check if the EOSdash process is still/ already running, e.g. in case of server restart
-    process_info = None
-    for conn in psutil.net_connections(kind="inet"):
-        if conn.laddr.port == port:
-            process = psutil.Process(conn.pid)
-            # Get the fresh process info
-            process_info = process.as_dict(attrs=["pid", "cmdline"])
-            break
-    if process_info:
-        # Just warn
-        logger.warning(f"EOSdash port `{port}` still/ already in use.")
-        logger.warning(f"PID: `{process_info['pid']}`, CMD: `{process_info['cmdline']}`")
+    # Do a one-time check that the port is free, to generate warnings if it is not
+    wait_for_port_free(port, timeout=0, waiting_app_name="EOSdash")

     cmd = [
         sys.executable,
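`wait_for_port_free()` is imported from `akkudoktoreos.server.server`; its body is not part of this hunk. As a rough idea of what such a helper does, here is a stdlib-only sketch (names and the polling interval are assumptions, not the EOS implementation):

```python
import socket
import time


def port_is_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if nothing accepts connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        return sock.connect_ex((host, port)) != 0


def wait_for_port_free_sketch(port: int, timeout: float = 0.0) -> bool:
    """Poll until the port is free or the timeout expires.

    With timeout=0 this degenerates to a single check, matching how
    start_eosdash() uses the real helper merely to warn about a busy port.
    """
    deadline = time.monotonic() + timeout
    while True:
        if port_is_free(port):
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(0.5)
```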
@@ -391,9 +288,6 @@ app = FastAPI(
 )

-server_dir = Path(__file__).parent.resolve()
-
-
 class PdfResponse(FileResponse):
     media_type = "application/pdf"

@@ -523,7 +417,7 @@ def fastapi_health_get():  # type: ignore

 @app.post("/v1/config/reset", tags=["config"])
 def fastapi_config_reset_post() -> ConfigEOS:
-    """Reset the configuration.
+    """Reset the configuration to the EOS configuration file.

     Returns:
         configuration (ConfigEOS): The current configuration after update.
@@ -812,6 +706,49 @@ def fastapi_prediction_series_get(
     return PydanticDateTimeSeries.from_series(pdseries)


+@app.get("/v1/prediction/dataframe", tags=["prediction"])
+def fastapi_prediction_dataframe_get(
+    keys: Annotated[list[str], Query(description="Prediction keys.")],
+    start_datetime: Annotated[
+        Optional[str],
+        Query(description="Starting datetime (inclusive)."),
+    ] = None,
+    end_datetime: Annotated[
+        Optional[str],
+        Query(description="Ending datetime (exclusive)."),
+    ] = None,
+    interval: Annotated[
+        Optional[str],
+        Query(description="Time duration for each interval. Defaults to 1 hour."),
+    ] = None,
+) -> PydanticDateTimeDataFrame:
+    """Get prediction for the given keys within the given date range as a dataframe.
+
+    Args:
+        keys (list[str]): Prediction keys.
+        start_datetime (Optional[str]): Starting datetime (inclusive).
+            Defaults to start datetime of latest prediction.
+        end_datetime (Optional[str]): Ending datetime (exclusive).
+            Defaults to end datetime of latest prediction.
+    """
+    for key in keys:
+        if key not in prediction_eos.record_keys:
+            raise HTTPException(status_code=404, detail=f"Key '{key}' is not available.")
+    if start_datetime is None:
+        start_datetime = prediction_eos.start_datetime
+    else:
+        start_datetime = to_datetime(start_datetime)
+    if end_datetime is None:
+        end_datetime = prediction_eos.end_datetime
+    else:
+        end_datetime = to_datetime(end_datetime)
+    df = prediction_eos.keys_to_dataframe(
+        keys=keys, start_datetime=start_datetime, end_datetime=end_datetime, interval=interval
+    )
+    return PydanticDateTimeDataFrame.from_dataframe(df, tz=config_eos.general.timezone)
+
+
 @app.get("/v1/prediction/list", tags=["prediction"])
 def fastapi_prediction_list_get(
     key: Annotated[str, Query(description="Prediction key.")],
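Since FastAPI parses `keys: list[str]` from repeated query parameters, a client of the new `/v1/prediction/dataframe` endpoint repeats `keys` in the query string. A sketch of how such a request URL is built (host, port, and the prediction key names are illustrative, not taken from this diff):

```python
from urllib.parse import urlencode

# Hypothetical EOS server address; the EOS port defaults elsewhere in this diff to 8503.
base = "http://localhost:8503/v1/prediction/dataframe"

# List-valued query parameters are encoded by repeating the name.
params = [
    ("keys", "example_key_a"),  # hypothetical prediction key
    ("keys", "example_key_b"),  # hypothetical prediction key
    ("start_datetime", "2025-01-01T00:00:00"),
    ("interval", "1 hour"),
]

url = f"{base}?{urlencode(params)}"
# FastAPI collects the repeated `keys` entries into a single list[str].
```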
@@ -1223,75 +1160,66 @@ def site_map() -> RedirectResponse:
     return RedirectResponse(url="/docs")


-# Keep the proxy last to handle all requests that are not taken by the Rest API.
+# Keep the redirect last to handle all requests that are not taken by the Rest API.


 @app.delete("/{path:path}", include_in_schema=False)
-async def proxy_delete(request: Request, path: str) -> Response:
-    return await proxy(request, path)
+async def redirect_delete(request: Request, path: str) -> Response:
+    return redirect(request, path)


 @app.get("/{path:path}", include_in_schema=False)
-async def proxy_get(request: Request, path: str) -> Response:
-    return await proxy(request, path)
+async def redirect_get(request: Request, path: str) -> Response:
+    return redirect(request, path)


 @app.post("/{path:path}", include_in_schema=False)
-async def proxy_post(request: Request, path: str) -> Response:
-    return await proxy(request, path)
+async def redirect_post(request: Request, path: str) -> Response:
+    return redirect(request, path)


 @app.put("/{path:path}", include_in_schema=False)
-async def proxy_put(request: Request, path: str) -> Response:
-    return await proxy(request, path)
+async def redirect_put(request: Request, path: str) -> Response:
+    return redirect(request, path)


-async def proxy(request: Request, path: str) -> Union[Response | RedirectResponse | HTMLResponse]:
+def redirect(request: Request, path: str) -> Union[HTMLResponse, RedirectResponse]:
+    # Path is not for EOSdash
+    if not (path.startswith("eosdash") or path == ""):
+        host = config_eos.server.eosdash_host
+        if host is None:
+            host = config_eos.server.host
+        host = str(host)
+        port = config_eos.server.eosdash_port
+        if port is None:
+            port = 8504
+        # Make hostname Windows friendly
+        if host == "0.0.0.0" and os.name == "nt":
+            host = "localhost"
+        url = f"http://{host}:{port}/"
+        error_page = create_error_page(
+            status_code="404",
+            error_title="Page Not Found",
+            error_message=f"""<pre>
+URL is unknown: '{request.url}'
+Did you want to connect to <a href="{url}" class="back-button">EOSdash</a>?
+</pre>
+""",
+            error_details="Unknown URL",
+        )
+        return HTMLResponse(content=error_page, status_code=404)
+
     # Make hostname Windows friendly
     host = str(config_eos.server.eosdash_host)
     if host == "0.0.0.0" and os.name == "nt":
         host = "localhost"
     if host and config_eos.server.eosdash_port:
-        # Proxy to EOSdash server
+        # Redirect to EOSdash server
         url = f"http://{host}:{config_eos.server.eosdash_port}/{path}"
-        headers = dict(request.headers)
-
-        data = await request.body()
-
-        try:
-            async with httpx.AsyncClient() as client:
-                if request.method == "GET":
-                    response = await client.get(url, headers=headers)
-                elif request.method == "POST":
-                    response = await client.post(url, headers=headers, content=data)
-                elif request.method == "PUT":
-                    response = await client.put(url, headers=headers, content=data)
-                elif request.method == "DELETE":
-                    response = await client.delete(url, headers=headers, content=data)
-        except Exception as e:
-            error_page = create_error_page(
-                status_code="404",
-                error_title="Page Not Found",
-                error_message=f"""<pre>
-EOSdash server not reachable: '{url}'
-Did you start the EOSdash server
-or set 'startup_eosdash'?
-If there is no application server intended please
-set 'eosdash_host' or 'eosdash_port' to None.
-</pre>
-""",
-                error_details=f"{e}",
-            )
-            return HTMLResponse(content=error_page, status_code=404)
-
-        return Response(
-            content=response.content,
-            status_code=response.status_code,
-            headers=dict(response.headers),
-        )
-    else:
-        # Redirect the root URL to the site map
-        return RedirectResponse(url="/docs")
+        return RedirectResponse(url=url, status_code=303)
+
+    # Redirect the root URL to the site map
+    return RedirectResponse(url="/docs", status_code=303)


 def run_eos(host: str, port: int, log_level: str, access_log: bool, reload: bool) -> None:
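The catch-all `redirect()` handler above picks between three outcomes: an HTML 404 hint page for unknown paths, a 303 redirect to the EOSdash server, or a fallback to `/docs`. A pure-function sketch of that decision order (a simplification for illustration, not the actual implementation) is easy to unit-test:

```python
from typing import Optional


def redirect_target(
    path: str, eosdash_host: Optional[str], eosdash_port: Optional[int]
) -> Optional[str]:
    """Compute where the catch-all handler would send a request.

    Returns a URL to redirect to, or None when an HTML 404 error page
    (with a hint pointing at EOSdash) is served instead.
    """
    if not (path.startswith("eosdash") or path == ""):
        return None  # unknown URL -> 404 error page
    if eosdash_host and eosdash_port:
        return f"http://{eosdash_host}:{eosdash_port}/{path}"  # 303 to EOSdash
    return "/docs"  # no EOSdash configured -> fall back to the site map
```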
@@ -1320,26 +1248,7 @@ def run_eos(host: str, port: int, log_level: str, access_log: bool, reload: bool
         host = "localhost"

     # Wait for EOS port to be free - e.g. in case of restart
-    timeout = 120  # Maximum 120 seconds to wait
-    process_info: list[dict] = []
-    for retries in range(int(timeout / 10)):
-        process_info = []
-        pids: list[int] = []
-        for conn in psutil.net_connections(kind="inet"):
-            if conn.laddr.port == port:
-                if conn.pid not in pids:
-                    # Get fresh process info
-                    process = psutil.Process(conn.pid)
-                    pids.append(conn.pid)
-                    process_info.append(process.as_dict(attrs=["pid", "cmdline"]))
-        if len(process_info) == 0:
-            break
-        logger.info(f"EOS waiting for port `{port}` ...")
-        time.sleep(10)
-    if len(process_info) > 0:
-        logger.warning(f"EOS port `{port}` in use.")
-        for info in process_info:
-            logger.warning(f"PID: `{info["pid"]}`, CMD: `{info["cmdline"]}`")
+    wait_for_port_free(port, timeout=120, waiting_app_name="EOS")

     try:
         uvicorn.run(
@@ -1,127 +1,144 @@
 import argparse
 import os
 import sys
-import time
-from functools import reduce
-from typing import Any, Union
+import traceback
+from pathlib import Path
+from typing import Optional

 import psutil
 import uvicorn
-from fasthtml.common import H1, Table, Td, Th, Thead, Titled, Tr, fast_app
-from fasthtml.starlette import JSONResponse
-from pydantic.fields import ComputedFieldInfo, FieldInfo
-from pydantic_core import PydanticUndefined
+from fasthtml.common import FileResponse, JSONResponse
+from monsterui.core import FastHTML, Theme

 from akkudoktoreos.config.config import get_config
 from akkudoktoreos.core.logging import get_logger
-from akkudoktoreos.core.pydantic import PydanticBaseModel
+from akkudoktoreos.server.dash.bokeh import BokehJS
+from akkudoktoreos.server.dash.components import Page
+
+# Pages
+from akkudoktoreos.server.dash.configuration import Configuration
+from akkudoktoreos.server.dash.demo import Demo
+from akkudoktoreos.server.dash.footer import Footer
+from akkudoktoreos.server.dash.hello import Hello
+from akkudoktoreos.server.server import get_default_host, wait_for_port_free
+
+# from akkudoktoreos.server.dash.altair import AltairJS

 logger = get_logger(__name__)

 config_eos = get_config()

+# The favicon for EOSdash
+favicon_filepath = Path(__file__).parent.joinpath("dash/assets/favicon/favicon.ico")
+if not favicon_filepath.exists():
+    raise ValueError(f"Favicon does not exist: {favicon_filepath}")
+
 # Command line arguments
-args = None
-
-
-def get_default_value(field_info: Union[FieldInfo, ComputedFieldInfo], regular_field: bool) -> Any:
-    default_value = ""
-    if regular_field:
-        if (val := field_info.default) is not PydanticUndefined:
-            default_value = val
-        else:
-            default_value = "N/A"
-    return default_value
-
-
-def resolve_nested_types(field_type: Any, parent_types: list[str]) -> list[tuple[Any, list[str]]]:
-    resolved_types: list[tuple[Any, list[str]]] = []
-
-    origin = getattr(field_type, "__origin__", field_type)
-    if origin is Union:
-        for arg in getattr(field_type, "__args__", []):
-            if arg is not type(None):
-                resolved_types.extend(resolve_nested_types(arg, parent_types))
-    else:
-        resolved_types.append((field_type, parent_types))
-
-    return resolved_types
-
-
-configs = []
-inner_types: set[type[PydanticBaseModel]] = set()
-for field_name, field_info in list(config_eos.model_fields.items()) + list(
-    config_eos.model_computed_fields.items()
-):
-
-    def extract_nested_models(
-        subfield_info: Union[ComputedFieldInfo, FieldInfo], parent_types: list[str]
-    ) -> None:
-        regular_field = isinstance(subfield_info, FieldInfo)
-        subtype = subfield_info.annotation if regular_field else subfield_info.return_type
-
-        if subtype in inner_types:
-            return
-
-        nested_types = resolve_nested_types(subtype, [])
-        found_basic = False
-        for nested_type, nested_parent_types in nested_types:
-            if not isinstance(nested_type, type) or not issubclass(nested_type, PydanticBaseModel):
-                if found_basic:
-                    continue
-
-                config = {}
-                config["name"] = ".".join(parent_types)
-                try:
-                    config["value"] = reduce(getattr, [config_eos] + parent_types)
-                except AttributeError:
-                    # Parent value(s) are not set in current config
-                    config["value"] = ""
-                config["default"] = get_default_value(subfield_info, regular_field)
-                config["description"] = (
-                    subfield_info.description if subfield_info.description else ""
-                )
-                configs.append(config)
-                found_basic = True
-            else:
-                new_parent_types = parent_types + nested_parent_types
-                inner_types.add(nested_type)
-                for nested_field_name, nested_field_info in list(
-                    nested_type.model_fields.items()
-                ) + list(nested_type.model_computed_fields.items()):
-                    extract_nested_models(
-                        nested_field_info,
-                        new_parent_types + [nested_field_name],
-                    )
-
-    extract_nested_models(field_info, [field_name])
-configs = sorted(configs, key=lambda x: x["name"])
-
-
-app, rt = fast_app(
+args: Optional[argparse.Namespace] = None
+
+# Get frankenui and tailwind headers via CDN using Theme.green.headers()
+# Add altair headers
+# hdrs=(Theme.green.headers(highlightjs=True), AltairJS,)
+hdrs = (
+    Theme.green.headers(highlightjs=True),
+    BokehJS,
+)
+
+# The EOSdash application
+app: FastHTML = FastHTML(
+    title="EOSdash",
+    hdrs=hdrs,
     secret_key=os.getenv("EOS_SERVER__EOSDASH_SESSKEY"),
 )


-def config_table() -> Table:
-    rows = [
-        Tr(
-            Td(config["name"]),
-            Td(config["value"]),
-            Td(config["default"]),
-            Td(config["description"]),
-            cls="even:bg-purple/5",
-        )
-        for config in configs
-    ]
-    flds = "Name", "Value", "Default", "Description"
-    head = Thead(*map(Th, flds), cls="bg-purple/10")
-    return Table(head, *rows, cls="w-full")
-
-
-@rt("/")
-def get():  # type: ignore
-    return Titled("EOS Dashboard", H1("Configuration"), config_table())
+def eos_server() -> tuple[str, int]:
+    """Retrieve the EOS server host and port configuration.
+
+    If `args` is provided, it uses the `eos_host` and `eos_port` from `args`.
+    Otherwise, it falls back to the values from `config_eos.server`.
+
+    Returns:
+        tuple[str, int]: A tuple containing:
+            - `eos_host` (str): The EOS server hostname or IP.
+            - `eos_port` (int): The EOS server port.
+    """
+    if args is None:
+        eos_host = str(config_eos.server.host)
+        eos_port = config_eos.server.port
+    else:
+        eos_host = args.eos_host
+        eos_port = args.eos_port
+    eos_host = eos_host if eos_host else get_default_host()
+    eos_port = eos_port if eos_port else 8503
+
+    return eos_host, eos_port
+
+
+@app.get("/favicon.ico")
+def get_eosdash_favicon():  # type: ignore
+    """Get the favicon."""
+    return FileResponse(path=favicon_filepath)
+
+
+@app.get("/")
+def get_eosdash():  # type: ignore
+    """Serve the main EOSdash page.
+
+    Returns:
+        Page: The main dashboard page with navigation links and footer.
+    """
+    return Page(
+        None,
+        {
+            "EOSdash": "/eosdash/hello",
+            "Config": "/eosdash/configuration",
+            "Demo": "/eosdash/demo",
+        },
+        Hello(),
+        Footer(*eos_server()),
+        "/eosdash/footer",
+    )
+
+
+@app.get("/eosdash/footer")
+def get_eosdash_footer():  # type: ignore
+    """Serve the EOSdash footer information.
+
+    Returns:
+        Footer: The footer component.
+    """
+    return Footer(*eos_server())
+
+
+@app.get("/eosdash/hello")
+def get_eosdash_hello():  # type: ignore
+    """Serve the EOSdash Hello page.
+
+    Returns:
+        Hello: The Hello page component.
+    """
+    return Hello()
+
+
+@app.get("/eosdash/configuration")
+def get_eosdash_configuration():  # type: ignore
+    """Serve the EOSdash Configuration page.
+
+    Returns:
+        Configuration: The Configuration page component.
+    """
+    return Configuration(*eos_server())
+
+
+@app.get("/eosdash/demo")
+def get_eosdash_demo():  # type: ignore
+    """Serve the EOSdash Demo page.
+
+    Returns:
+        Demo: The Demo page component.
+    """
+    return Demo(*eos_server())


 @app.get("/eosdash/health")
@@ -135,7 +152,14 @@ def get_eosdash_health():  # type: ignore
     )


-def run_eosdash(host: str, port: int, log_level: str, access_log: bool, reload: bool) -> None:
+@app.get("/eosdash/assets/{fname:path}.{ext:static}")
+def get_eosdash_assets(fname: str, ext: str):  # type: ignore
+    """Get assets."""
+    asset_filepath = Path(__file__).parent.joinpath(f"dash/assets/{fname}.{ext}")
+    return FileResponse(path=asset_filepath)
+
+
+def run_eosdash() -> None:
     """Run the EOSdash server with the specified configurations.

     This function starts the EOSdash server using the Uvicorn ASGI server. It accepts
@@ -145,65 +169,77 @@ def run_eosdash(host: str, port: int, log_level: str, access_log: bool, reload:
     server to the specified host and port, an error message is logged and the
     application exits.

-    Args:
-        host (str): The hostname to bind the server to.
-        port (int): The port number to bind the server to.
-        log_level (str): The log level for the server. Options include "critical", "error",
-            "warning", "info", "debug", and "trace".
-        access_log (bool): Whether to enable or disable the access log. Set to True to enable.
-        reload (bool): Whether to enable or disable auto-reload. Set to True for development.
-
     Returns:
         None
     """
+    # Set up parameters from args, config_eos and defaults
+    # Remember parameters that are also in config
+    # - EOS host
+    if args and args.eos_host:
+        eos_host = args.eos_host
+    elif config_eos.server.host:
+        eos_host = config_eos.server.host
+    else:
+        eos_host = get_default_host()
+    config_eos.server.host = eos_host
+    # - EOS port
+    if args and args.eos_port:
+        eos_port = args.eos_port
+    elif config_eos.server.port:
+        eos_port = config_eos.server.port
+    else:
+        eos_port = 8503
+    config_eos.server.port = eos_port
+    # - EOSdash host
+    if args and args.host:
+        eosdash_host = args.host
+    elif config_eos.server.eosdash_host:
+        eosdash_host = config_eos.server.eosdash_host
+    else:
+        eosdash_host = get_default_host()
+    config_eos.server.eosdash_host = eosdash_host
+    # - EOSdash port
+    if args and args.port:
+        eosdash_port = args.port
+    elif config_eos.server.eosdash_port:
+        eosdash_port = config_eos.server.eosdash_port
+    else:
+        eosdash_port = 8504
+    config_eos.server.eosdash_port = eosdash_port
+    # - log level
+    if args and args.log_level:
+        log_level = args.log_level
+    else:
+        log_level = "info"
+    # - access log
+    if args and args.access_log:
+        access_log = args.access_log
+    else:
+        access_log = False
+    # - reload
+    if args and args.reload:
+        reload = args.reload
+    else:
+        reload = False
+
     # Make hostname Windows friendly
-    if host == "0.0.0.0" and os.name == "nt":
-        host = "localhost"
+    if eosdash_host == "0.0.0.0" and os.name == "nt":
+        eosdash_host = "localhost"

     # Wait for EOSdash port to be free - e.g. in case of restart
-    timeout = 120  # Maximum 120 seconds to wait
-    process_info: list[dict] = []
-    for retries in range(int(timeout / 3)):
-        process_info = []
-        pids: list[int] = []
-        for conn in psutil.net_connections(kind="inet"):
-            if conn.laddr.port == port:
-                if conn.pid not in pids:
-                    # Get fresh process info
-                    process = psutil.Process(conn.pid)
-                    pids.append(conn.pid)
-                    process_info.append(process.as_dict(attrs=["pid", "cmdline"]))
-        if len(process_info) == 0:
-            break
-        logger.info(f"EOSdash waiting for port `{port}` ...")
-        time.sleep(3)
-    if len(process_info) > 0:
-        logger.warning(f"EOSdash port `{port}` in use.")
-        for info in process_info:
-            logger.warning(f"PID: `{info["pid"]}`, CMD: `{info["cmdline"]}`")
-
-    # Setup config from args
-    if args:
-        if args.eos_host:
-            config_eos.server.host = args.eos_host
-        if args.eos_port:
-            config_eos.server.port = args.eos_port
-        if args.host:
-            config_eos.server.eosdash_host = args.host
-        if args.port:
-            config_eos.server.eosdash_port = args.port
+    wait_for_port_free(eosdash_port, timeout=120, waiting_app_name="EOSdash")

     try:
         uvicorn.run(
             "akkudoktoreos.server.eosdash:app",
-            host=host,
-            port=port,
-            log_level=log_level.lower(),  # Convert log_level to lowercase
+            host=eosdash_host,
+            port=eosdash_port,
+            log_level=log_level.lower(),
             access_log=access_log,
             reload=reload,
         )
     except Exception as e:
-        logger.error(f"Could not bind to host {host}:{port}. Error: {e}")
+        logger.error(f"Could not bind to host {eosdash_host}:{eosdash_port}. Error: {e}")
         raise e

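The new `run_eosdash()` resolves every parameter with the same precedence: command-line `args`, then `config_eos`, then a built-in default. EOS spells the chain out per parameter; the pattern itself can be factored into a tiny helper, sketched here with a hypothetical `first_set` name:

```python
from typing import Optional, TypeVar

T = TypeVar("T")


def first_set(*candidates: Optional[T], default: T) -> T:
    """Return the first truthy candidate, else the default.

    Mirrors the args -> config -> built-in-default resolution order
    that run_eosdash() applies to each of its parameters.
    """
    for candidate in candidates:
        if candidate:
            return candidate
    return default


# e.g. eosdash_port = first_set(args_port, config_port, default=8504)
```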
@@ -212,7 +248,7 @@ def main() -> None:

     This function sets up the argument parser to accept command-line arguments for
     host, port, log_level, access_log, and reload. It uses default values from the
-    config_eos module if arguments are not provided. After parsing the arguments,
+    config module if arguments are not provided. After parsing the arguments,
     it starts the EOSdash server with the specified configurations.

     Command-line Arguments:
@@ -226,7 +262,6 @@ def main() -> None:
     """
     parser = argparse.ArgumentParser(description="Start EOSdash server.")

-    # Host and port arguments with defaults from config_eos
     parser.add_argument(
         "--host",
         type=str,
@@ -239,8 +274,6 @@ def main() -> None:
         default=config_eos.server.eosdash_port,
         help="Port for the EOSdash server (default: value from config)",
     )
-
-    # EOS Host and port arguments with defaults from config_eos
     parser.add_argument(
         "--eos-host",
         type=str,
@@ -253,8 +286,6 @@ def main() -> None:
         default=config_eos.server.port,
         help="Port of the EOS server (default: value from config)",
     )
-
-    # Optional arguments for log_level, access_log, and reload
     parser.add_argument(
         "--log_level",
         type=str,
@@ -265,7 +296,7 @@ def main() -> None:
         "--access_log",
         type=bool,
         default=False,
-        help="Enable or disable access log. Options: True or False (default: True)",
+        help="Enable or disable access log. Options: True or False (default: False)",
     )
     parser.add_argument(
         "--reload",
@@ -274,13 +305,15 @@ def main() -> None:
         help="Enable or disable auto-reload. Useful for development. Options: True or False (default: False)",
     )

+    global args
     args = parser.parse_args()

     try:
-        run_eosdash(args.host, args.port, args.log_level, args.access_log, args.reload)
+        run_eosdash()
     except Exception as ex:
         error_msg = f"Failed to run EOSdash: {ex}"
         logger.error(error_msg)
+        traceback.print_exc()
         sys.exit(1)

src/akkudoktoreos/server/rest/error.py (new file, 91 additions)
@@ -0,0 +1,91 @@
+ERROR_PAGE_TEMPLATE = """
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <meta name="viewport" content="width=device-width, initial-scale=1.0">
+    <title>Energy Optimization System (EOS) Error</title>
+    <style>
+        body {
+            font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, sans-serif;
+            background-color: #f5f5f5;
+            display: flex;
+            justify-content: center;
+            align-items: center;
+            height: 100vh;
+            margin: 0;
+            padding: 20px;
+            box-sizing: border-box;
+        }
+        .error-container {
+            background: white;
+            padding: 2rem;
+            border-radius: 8px;
+            box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
+            max-width: 500px;
+            width: 100%;
+            text-align: center;
+        }
+        .error-code {
+            font-size: 4rem;
+            font-weight: bold;
+            color: #e53e3e;
+            margin: 0;
+        }
+        .error-title {
+            font-size: 1.5rem;
+            color: #2d3748;
+            margin: 1rem 0;
+        }
+        .error-message {
+            color: #4a5568;
+            margin-bottom: 1.5rem;
+        }
+        .error-details {
+            background: #f7fafc;
+            padding: 1rem;
+            border-radius: 4px;
+            margin-bottom: 1.5rem;
+            text-align: center;
+            font-family: monospace;
+            white-space: pre-wrap;
+            word-break: break-word;
+        }
+        .back-button {
+            background: #3182ce;
+            color: white;
+            border: none;
+            padding: 0.75rem 1.5rem;
+            border-radius: 4px;
+            text-decoration: none;
+            display: inline-block;
+            transition: background-color 0.2s;
+        }
+        .back-button:hover {
+            background: #2c5282;
+        }
+    </style>
+</head>
+<body>
+    <div class="error-container">
+        <h1 class="error-code">STATUS_CODE</h1>
+        <h2 class="error-title">ERROR_TITLE</h2>
+        <p class="error-message">ERROR_MESSAGE</p>
+        <div class="error-details">ERROR_DETAILS</div>
+        <a href="/docs" class="back-button">Back to Home</a>
+    </div>
+</body>
+</html>
+"""
+
+
+def create_error_page(
+    status_code: str, error_title: str, error_message: str, error_details: str
+) -> str:
+    """Create an error page by replacing placeholders in the template."""
+    return (
+        ERROR_PAGE_TEMPLATE.replace("STATUS_CODE", status_code)
+        .replace("ERROR_TITLE", error_title)
+        .replace("ERROR_MESSAGE", error_message)
+        .replace("ERROR_DETAILS", error_details)
+    )
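The placeholder substitution in `create_error_page` can be exercised directly. A condensed sketch with a trimmed stand-in template (the real `ERROR_PAGE_TEMPLATE` is the full HTML page from the new file):

```python
# Trimmed stand-in for the full ERROR_PAGE_TEMPLATE in error.py.
ERROR_PAGE_TEMPLATE = (
    "<h1>STATUS_CODE</h1><h2>ERROR_TITLE</h2>"
    "<p>ERROR_MESSAGE</p><div>ERROR_DETAILS</div>"
)


def create_error_page(
    status_code: str, error_title: str, error_message: str, error_details: str
) -> str:
    """Create an error page by replacing placeholders in the template."""
    return (
        ERROR_PAGE_TEMPLATE.replace("STATUS_CODE", status_code)
        .replace("ERROR_TITLE", error_title)
        .replace("ERROR_MESSAGE", error_message)
        .replace("ERROR_DETAILS", error_details)
    )


page = create_error_page("404", "Page Not Found", "The EOS endpoint does not exist.", "GET /nope")
print(page)
# → <h1>404</h1><h2>Page Not Found</h2><p>The EOS endpoint does not exist.</p><div>GET /nope</div>
```

Plain `str.replace` on sentinel tokens is deliberately simple; it avoids `str.format`/f-string escaping issues with the many literal `{}` braces in the CSS.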
@@ -1,8 +1,10 @@
 """Server Module."""

 import os
-from typing import Optional
+import time
+from typing import Optional, Union

+import psutil
 from pydantic import Field, IPvAnyAddress, field_validator

 from akkudoktoreos.config.configabc import SettingsBaseModel
@@ -17,12 +19,78 @@ def get_default_host() -> str:
     return "0.0.0.0"


-class ServerCommonSettings(SettingsBaseModel):
-    """Server Configuration.
+def wait_for_port_free(port: int, timeout: int = 0, waiting_app_name: str = "App") -> bool:
+    """Wait for a network port to become free, with timeout.

-    Attributes:
-        To be added
+    Checks if the port is currently in use and logs warnings with process details.
+    Retries every 3 seconds until timeout is reached.
+
+    Args:
+        port: The network port number to check
+        timeout: Maximum seconds to wait (0 means check once without waiting)
+        waiting_app_name: Name of the application waiting for the port
+
+    Returns:
+        bool: True if port is free, False if port is still in use after timeout
+
+    Raises:
+        ValueError: If port number or timeout is invalid
+        psutil.Error: If there are problems accessing process information
     """
+    if not 0 <= port <= 65535:
+        raise ValueError(f"Invalid port number: {port}")
+    if timeout < 0:
+        raise ValueError(f"Invalid timeout: {timeout}")
+
+    def get_processes_using_port() -> list[dict]:
+        """Get info about processes using the specified port."""
+        processes: list[dict] = []
+        seen_pids: set[int] = set()
+
+        try:
+            for conn in psutil.net_connections(kind="inet"):
+                if conn.laddr.port == port and conn.pid not in seen_pids:
+                    try:
+                        process = psutil.Process(conn.pid)
+                        seen_pids.add(conn.pid)
+                        processes.append(process.as_dict(attrs=["pid", "cmdline"]))
+                    except psutil.NoSuchProcess:
+                        continue
+        except psutil.Error as e:
+            logger.error(f"Error checking port {port}: {e}")
+            raise
+
+        return processes
+
+    retries = max(int(timeout / 3), 1) if timeout > 0 else 1
+
+    for _ in range(retries):
+        process_info = get_processes_using_port()
+
+        if not process_info:
+            return True
+
+        if timeout <= 0:
+            break
+
+        logger.info(f"{waiting_app_name} waiting for port {port} to become free...")
+        time.sleep(3)
+
+    if process_info:
+        logger.warning(
+            f"{waiting_app_name} port {port} still in use after waiting {timeout} seconds."
+        )
+        for info in process_info:
+            logger.warning(
+                f"Process using port - PID: {info['pid']}, Command: {' '.join(info['cmdline'])}"
+            )
+        return False
+
+    return True
+
+
+class ServerCommonSettings(SettingsBaseModel):
+    """Server Configuration."""
+
     host: Optional[IPvAnyAddress] = Field(
         default=get_default_host(), description="EOS server IP address."
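The retry loop in `wait_for_port_free` above polls every 3 seconds until the timeout elapses. A simplified sketch of that control flow, with the psutil port scan replaced by an injectable checker and the sleep interval shortened (both are test stand-ins, not the real implementation):

```python
import time
from typing import Callable


def wait_for_port_free_sketch(
    port_in_use: Callable[[], bool], timeout: int = 0, interval: float = 0.01
) -> bool:
    """Return True once the port is free, False if still busy after timeout."""
    if timeout < 0:
        raise ValueError(f"Invalid timeout: {timeout}")

    # Same retry math as the diff: poll roughly once per interval, at least once.
    retries = max(int(timeout / interval), 1) if timeout > 0 else 1
    for _ in range(retries):
        if not port_in_use():
            return True
        if timeout <= 0:
            break  # single check requested, do not wait
        time.sleep(interval)
    return False


# Simulate a port that frees up on the third poll.
polls = iter([True, True, False])
assert wait_for_port_free_sketch(lambda: next(polls), timeout=1) is True
# timeout=0 means one check without waiting.
assert wait_for_port_free_sketch(lambda: True, timeout=0) is False
```

Injecting the checker keeps the timing logic testable without opening real sockets; the production code instead walks `psutil.net_connections(kind="inet")` to find the owning processes.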
@@ -37,6 +105,15 @@ class ServerCommonSettings(SettingsBaseModel):
     )
     eosdash_port: Optional[int] = Field(default=8504, description="EOSdash server IP port number.")

+    @field_validator("host", "eosdash_host", mode="before")
+    def validate_server_host(
+        cls, value: Optional[Union[str, IPvAnyAddress]]
+    ) -> Optional[Union[str, IPvAnyAddress]]:
+        if isinstance(value, str):
+            if value.lower() in ("localhost", "loopback"):
+                value = "127.0.0.1"
+        return value
+
     @field_validator("port", "eosdash_port")
     def validate_server_port(cls, value: Optional[int]) -> Optional[int]:
         if value is not None and not (1024 <= value <= 49151):
@@ -9,6 +9,8 @@ import psutil
 import pytest
 import requests

+from akkudoktoreos.server.server import get_default_host
+
 DIR_TESTDATA = Path(__file__).absolute().parent.joinpath("testdata")

 FILE_TESTDATA_EOSSERVER_CONFIG_1 = DIR_TESTDATA.joinpath("eosserver_config_1.json")
@@ -235,12 +237,11 @@ class TestServerStartStop:
     def test_server_start_eosdash(self, tmpdir):
         """Test the EOSdash server startup from EOS."""
         # Do not use any fixture as this will make pytest the owner of the EOSdash port.
+        host = get_default_host()
         if os.name == "nt":
-            host = "localhost"
             # Windows does not provide SIGKILL
             sigkill = signal.SIGTERM
         else:
-            host = "0.0.0.0"
             sigkill = signal.SIGKILL
         port = 8503
         eosdash_port = 8504
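The test hunk above keeps a platform guard for the kill signal even after the host selection moves to `get_default_host()`. A minimal sketch of that guard (a generic helper, not code from the repository):

```python
import os
import signal


def kill_signal() -> signal.Signals:
    """Pick the hard-kill signal; Windows does not provide SIGKILL."""
    if os.name == "nt":
        return signal.SIGTERM
    return signal.SIGKILL


sig = kill_signal()
assert sig in (signal.SIGTERM, getattr(signal, "SIGKILL", signal.SIGTERM))
```

Centralizing the host default in `get_default_host()` removes the duplicated per-OS `host = ...` assignments the diff deletes, leaving only the signal choice platform-dependent.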