Fix2 config and predictions revamp. (#281)

measurement:

- Add new measurement class to hold real-world measurements.
- Handles load meter readings, grid import and export meter readings.
- Aggregates load meter readings (aka measurements) to total load.
- Can import measurements from files, pandas datetime series,
    pandas datetime dataframes, simple datetime arrays and
    programmatically.
- May be expanded to other measurement values.
- Should be used to adapt load predictions to real-world
    measurements.
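The aggregation idea can be sketched as follows (a minimal stand-in; the real measurement class builds on pandas and pydantic, and all names here are hypothetical):

```python
from datetime import datetime

# Hypothetical illustration: per-meter load readings (kW) keyed by timestamp.
# Only the aggregation of several load meters into a total load is shown.
meter_a = {datetime(2024, 12, 29, 10): 1.2, datetime(2024, 12, 29, 11): 0.8}
meter_b = {datetime(2024, 12, 29, 10): 0.5, datetime(2024, 12, 29, 11): 0.7}

def aggregate_total_load(*meters: dict) -> dict:
    """Sum the readings of all load meters per timestamp."""
    total: dict = {}
    for readings in meters:
        for dt, value in readings.items():
            total[dt] = total.get(dt, 0.0) + value
    return total

total_load = aggregate_total_load(meter_a, meter_b)
```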

core/coreabc:

- Add mixin class to access measurements

core/pydantic:

- Add pydantic models for pandas datetime series and dataframes.
- Add pydantic models for simple datetime array
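The models themselves are not shown in this message; a minimal sketch of what a model for a simple datetime array might look like (assuming pydantic v2; the class and field names are hypothetical, not the actual core/pydantic API):

```python
from datetime import datetime
from typing import List, Tuple

from pydantic import BaseModel, field_validator

class DatetimeArray(BaseModel):
    """Hypothetical model: a list of (datetime, value) pairs."""

    data: List[Tuple[datetime, float]]

    @field_validator("data")
    @classmethod
    def must_be_sorted(cls, v):
        # Require strictly increasing datetimes for a valid series.
        if any(a[0] >= b[0] for a, b in zip(v, v[1:])):
            raise ValueError("datetimes must be strictly increasing")
        return v

arr = DatetimeArray(data=[(datetime(2024, 1, 1), 1.0), (datetime(2024, 1, 2), 2.0)])
```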

core/dataabc:

- Provide DataImport mixin class for generic import handling.
    Imports from JSON strings and files, pandas datetime dataframes,
    and simple datetime arrays. The import method signature changed to
    allow import datetimes to be given programmatically and by data content.
- Use pydantic models for datetime series, dataframes and arrays.
- Validate generic imports by pydantic models.
- Provide new attributes min_datetime and max_datetime for DataSequence.
- Add parameter dropna to drop NaN/None values when creating lists, pandas series
    or numpy arrays from DataSequence.
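The dropna behavior can be sketched like this (a hypothetical stand-in; the real DataSequence operates on datetime-keyed records):

```python
import math

def to_list(values, dropna: bool = True):
    """Build a plain list, optionally dropping None/NaN entries."""
    if not dropna:
        return list(values)
    return [v for v in values
            if v is not None and not (isinstance(v, float) and math.isnan(v))]

clean = to_list([1.0, None, float("nan"), 2.0])                # gaps removed
raw = to_list([1.0, None, float("nan"), 2.0], dropna=False)    # gaps kept
```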

config/config:

- Add common settings for the measurement module.

predictions/elecpriceakkudoktor:

- Use mean values of the last 7 days to fill prediction values not provided by
    akkudoktor.net (which only provides 24 values).
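The gap-filling idea can be sketched as follows (the history layout and function name are illustrative, not the elecpriceakkudoktor implementation):

```python
def fill_with_weekly_mean(prices_24h, history_7d, hours_needed=48):
    """Extend a 24-value hourly price list using per-hour means of a 7-day history.

    history_7d: 7 * 24 hourly values, oldest first.
    """
    hour_mean = [
        sum(history_7d[day * 24 + hour] for day in range(7)) / 7
        for hour in range(24)
    ]
    filled = list(prices_24h)
    for i in range(24, hours_needed):
        # Hours beyond the provided 24 get the mean of the same hour of day.
        filled.append(hour_mean[i % 24])
    return filled

history = [float(h % 24) for h in range(7 * 24)]  # hour-of-day as dummy value
prices = fill_with_weekly_mean([0.30] * 24, history)
```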

prediction/loadabc:

- Extend the generic prediction keys by 'load_total_adjusted' for load predictions
    that adjust the predicted total load by measured load values.

prediction/loadakkudoktor:

- Extend the Akkudoktor load prediction by load adjustment using measured load
    values.
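One simple way such an adjustment can work is to scale the prediction by the ratio of measured to predicted energy over the overlap window (a hedged sketch only; the actual LoadAkkudoktor adjustment calculation may differ):

```python
def adjust_prediction(predicted, measured):
    """Scale a predicted load series so its overlap matches the measurements."""
    overlap = len(measured)
    predicted_overlap = sum(predicted[:overlap])
    if predicted_overlap == 0:
        return list(predicted)
    factor = sum(measured) / predicted_overlap
    return [p * factor for p in predicted]

# First two hours measured 50% higher than predicted ->
# the whole prediction is scaled by 1.5.
adjusted = adjust_prediction([1.0, 1.0, 1.0, 1.0], [1.5, 1.5])
```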

prediction/load_aggregator:

- Module removed. Load aggregation is now handled by the measurement module.

prediction/load_corrector:

- Module removed. Load correction (i.e. adjustment of the load prediction by
    measured load energy) is handled by the LoadAkkudoktor prediction and
    the generic 'load_mean_adjusted' prediction key.

prediction/load_forecast:

- Module removed. Functionality now completely handled by the LoadAkkudoktor
    prediction.

utils/cacheutil:

- Use pydantic.
- Fix potential bug in ttl (time to live) duration handling.
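The TTL semantics described here boil down to "valid until now + ttl, stale once the current time has passed that point" (names are illustrative, not the cacheutil API):

```python
from datetime import datetime, timedelta
from typing import Optional

def until_from_ttl(ttl_seconds: float, now: Optional[datetime] = None) -> datetime:
    """Validity end of a cache entry created now with the given time to live."""
    now = now or datetime.now()
    return now + timedelta(seconds=ttl_seconds)

def is_stale(until: datetime, now: Optional[datetime] = None) -> bool:
    """An entry is stale once the current time has passed its validity end."""
    return (now or datetime.now()) > until

created = datetime(2024, 12, 29, 12, 0, 0)
until = until_from_ttl(300, now=created)  # valid for 5 minutes
stale = is_stale(until, now=created + timedelta(minutes=6))
```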

utils/datetimeutil:

- Added missing handling of pendulum.DateTime and pendulum.Duration instances
    as input. These were previously handled as datetime.datetime and datetime.timedelta.
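The subtlety is that pendulum.DateTime subclasses datetime.datetime (and pendulum.Duration subclasses timedelta), so an isinstance() dispatch must check the more specific pendulum type first. Illustrated with a stand-in subclass so the sketch runs without pendulum installed:

```python
from datetime import datetime

class FancyDateTime(datetime):  # stand-in for pendulum.DateTime
    pass

def classify(value):
    # The subclass check must come before the datetime check; otherwise
    # pendulum-like inputs would fall into the generic datetime branch.
    if isinstance(value, FancyDateTime):
        return "pendulum-like"
    if isinstance(value, datetime):
        return "datetime"
    raise TypeError(type(value))

kind = classify(FancyDateTime(2024, 12, 29))
```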

utils/visualize:

- Move main to generate_example_report() for better testing support.

server/server:

- Added new configuration option server_fastapi_startup_server_fasthtml
  to make the startup of the FastHTML server by the FastAPI server conditional.
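The conditional startup can be sketched like this (config access is hypothetical; the real option lives in the EOS configuration):

```python
def maybe_start_fasthtml(config: dict, start) -> bool:
    """Call start() only if the config option enables the FastHTML server."""
    if config.get("server_fastapi_startup_server_fasthtml", False):
        start()
        return True
    return False

started = []
maybe_start_fasthtml({"server_fastapi_startup_server_fasthtml": True},
                     lambda: started.append("fasthtml"))
```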

server/fastapi_server:

- Add APIs for measurements.
- Improve APIs to provide or take pandas datetime series and
    datetime dataframes controlled by pydantic models.
- Improve APIs to provide or take simple datetime data arrays
    controlled by pydantic models.
- Move the FastAPI server API to v1 for new APIs.
- Update pre-v1 endpoints to use the new prediction and measurement capabilities.
- Only start the FastHTML server if the 'server_fastapi_startup_server_fasthtml'
    config option is set.

tests:

- Adapt import tests to the changed import method signature.
- Adapt server tests to use the v1 API.
- Extend the dataabc test to test array generation from data
    with several data interval scenarios.
- Extend the datetimeutil test to also test for correct handling
    of to_datetime() providing now().
- Adapt the LoadAkkudoktor test for the new adjustment calculation.
- Adapt the visualization test to use the example report function instead of
    running visualize.py as a process.
- Removed test_load_aggregator. Functionality is now tested in test_measurement.
- Added tests for the measurement module.

docs:

- Remove sphinxcontrib-openapi as it prevents the build of the documentation:
    "site-packages/sphinxcontrib/openapi/openapi31.py", line 305, in _get_type_from_schema
    for t in schema["anyOf"]: KeyError: 'anyOf'

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
This commit is contained in:
Bobby Noelte, 2024-12-29 18:42:49 +01:00, committed by GitHub.
Parent 2a8e11d7dc, commit 830af85fca.
38 changed files with 3671 additions and 948 deletions.


@@ -31,21 +31,23 @@ import os
import pickle
import tempfile
import threading
from datetime import date, datetime, time, timedelta
from typing import (
IO,
Any,
Callable,
Dict,
Generic,
List,
Literal,
Optional,
ParamSpec,
TypeVar,
Union,
)
from akkudoktoreos.utils.datetimeutil import to_datetime, to_duration
from pendulum import DateTime, Duration
from pydantic import BaseModel, ConfigDict, Field
from akkudoktoreos.utils.datetimeutil import compare_datetimes, to_datetime, to_duration
from akkudoktoreos.utils.logutil import get_logger
logger = get_logger(__name__)
@@ -56,6 +58,21 @@ Param = ParamSpec("Param")
RetType = TypeVar("RetType")
class CacheFileRecord(BaseModel):
# Enable custom serialization globally in config
model_config = ConfigDict(
arbitrary_types_allowed=True,
use_enum_values=True,
validate_assignment=True,
)
cache_file: Any = Field(..., description="File descriptor of the cache file.")
until_datetime: DateTime = Field(..., description="Datetime until the cache file is valid.")
ttl_duration: Optional[Duration] = Field(
default=None, description="Duration the cache file is valid."
)
class CacheFileStoreMeta(type, Generic[T]):
"""A thread-safe implementation of CacheFileStore."""
@@ -102,12 +119,36 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
This constructor sets up an empty key-value store (a dictionary) where each key
corresponds to a cache file that is associated with a given key and an optional date.
"""
self._store: dict[str, tuple[IO[bytes], datetime]] = {}
self._store: Dict[str, CacheFileRecord] = {}
self._store_lock = threading.Lock()
def _until_datetime_by_options(
self,
until_date: Optional[Any] = None,
until_datetime: Optional[Any] = None,
with_ttl: Optional[Any] = None,
) -> tuple[DateTime, Optional[Duration]]:
"""Get until_datetime and ttl_duration from the given options."""
ttl_duration = None
if until_datetime:
until_datetime = to_datetime(until_datetime)
elif with_ttl:
ttl_duration = to_duration(with_ttl)
until_datetime = to_datetime() + ttl_duration
elif until_date:
until_datetime = to_datetime(until_date).end_of("day")
else:
# end of today
until_datetime = to_datetime().end_of("day")
return (until_datetime, ttl_duration)
def _generate_cache_file_key(
self, key: str, until_datetime: Union[datetime, None]
) -> tuple[str, datetime]:
self,
key: str,
until_date: Optional[Any] = None,
until_datetime: Optional[Any] = None,
with_ttl: Optional[Any] = None,
) -> tuple[str, DateTime, Optional[Duration]]:
"""Generates a unique cache file key based on the key and date.
The cache file key is a combination of the input key and the date (if provided),
@@ -115,7 +156,7 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
Args:
key (str): The key that identifies the cache file.
until_datetime (Optional[Any]): The datetime
until_datetime (Optional[DateTime]): The datetime
until the cache file is valid. The default is the current date at maximum time
(23:59:59).
@@ -123,12 +164,18 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
A tuple of:
str: A hashed string that serves as the unique identifier for the cache file.
datetime: The datetime until the the cache file is valid.
Optional[ttl_duration]: Duration for ttl control.
"""
if until_datetime is None:
until_datetime = datetime.combine(date.today(), time.max)
key_datetime = to_datetime(until_datetime, as_string="UTC")
until_datetime_dt, ttl_duration = self._until_datetime_by_options(
until_date, until_datetime, with_ttl
)
if ttl_duration:
# We need a special key for with_ttl, only encoding the with_ttl
key_datetime = ttl_duration.in_words()
else:
key_datetime = to_datetime(until_datetime_dt, as_string="UTC")
cache_key = hashlib.sha256(f"{key}{key_datetime}".encode("utf-8")).hexdigest()
return (f"{cache_key}", until_datetime)
return (f"{cache_key}", until_datetime_dt, ttl_duration)
def _get_file_path(self, file_obj: IO[bytes]) -> Optional[str]:
"""Retrieve the file path from a file-like object.
@@ -147,37 +194,17 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
file_path = file_obj.name # Get the file path from the cache file object
return file_path
def _until_datetime_by_options(
self,
until_date: Optional[Any] = None,
until_datetime: Optional[Any] = None,
with_ttl: Union[timedelta, str, int, float, None] = None,
) -> datetime:
"""Get until_datetime from the given options."""
if until_datetime:
until_datetime = to_datetime(until_datetime)
elif with_ttl:
with_ttl = to_duration(with_ttl)
until_datetime = to_datetime(datetime.now() + with_ttl)
elif until_date:
until_datetime = to_datetime(to_datetime(until_date).date())
else:
# end of today
until_datetime = to_datetime(datetime.combine(date.today(), time.max))
return until_datetime
def _is_valid_cache_item(
self,
cache_item: tuple[IO[bytes], datetime],
until_datetime: Optional[datetime] = None,
at_datetime: Optional[datetime] = None,
before_datetime: Optional[datetime] = None,
cache_item: CacheFileRecord,
until_datetime: Optional[DateTime] = None,
at_datetime: Optional[DateTime] = None,
before_datetime: Optional[DateTime] = None,
) -> bool:
cache_file_datetime = cache_item[1] # Extract the datetime associated with the cache item
if (
(until_datetime and until_datetime == cache_file_datetime)
or (at_datetime and at_datetime <= cache_file_datetime)
or (before_datetime and cache_file_datetime < before_datetime)
(until_datetime and until_datetime == cache_item.until_datetime)
or (at_datetime and at_datetime <= cache_item.until_datetime)
or (before_datetime and cache_item.until_datetime < before_datetime)
):
return True
return False
@@ -188,7 +215,8 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
until_datetime: Optional[Any] = None,
at_datetime: Optional[Any] = None,
before_datetime: Optional[Any] = None,
) -> Optional[tuple[str, IO[bytes], datetime]]:
ttl_duration: Optional[Any] = None,
) -> tuple[str, Optional[CacheFileRecord]]:
"""Searches for a cached item that matches the key and falls within the datetime range.
This method looks for a cache item with a key that matches the given `key`, and whose associated
@@ -203,48 +231,62 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
before_datetime (Optional[Any]): The datetime to compare the cache item's datetime to be before.
Returns:
Optional[tuple]: Returns the cache_file_key, chache_file, cache_file_datetime if found,
otherwise returns `None`.
tuple[str, Optional[CacheFileRecord]]: Returns the cache_file_key, cache file record if found, otherwise returns `None`.
"""
# Convert input to datetime if they are not None
until_datetime_dt: Optional[datetime] = None
if until_datetime is not None:
until_datetime_dt = to_datetime(until_datetime)
at_datetime_dt: Optional[datetime] = None
if at_datetime is not None:
at_datetime_dt = to_datetime(at_datetime)
before_datetime_dt: Optional[datetime] = None
if before_datetime is not None:
before_datetime_dt = to_datetime(before_datetime)
if ttl_duration is not None:
# TTL duration - use current datetime
if until_datetime or at_datetime or before_datetime:
raise NotImplementedError(
f"Search with ttl_duration and datetime filter until:{until_datetime}, at:{at_datetime}, before:{before_datetime} is not implemented"
)
at_datetime = to_datetime()
else:
if until_datetime is not None:
until_datetime = to_datetime(until_datetime)
if at_datetime is not None:
at_datetime = to_datetime(at_datetime)
if before_datetime is not None:
before_datetime = to_datetime(before_datetime)
if until_datetime is None and at_datetime is None and before_datetime is None:
at_datetime = to_datetime().end_of("day")
for cache_file_key, cache_item in self._store.items():
# Check if the cache file datetime matches the given criteria
if self._is_valid_cache_item(
cache_item,
until_datetime=until_datetime_dt,
at_datetime=at_datetime_dt,
before_datetime=before_datetime_dt,
until_datetime=until_datetime,
at_datetime=at_datetime,
before_datetime=before_datetime,
):
# This cache file is within the given datetime range
# Extract the datetime associated with the cache item
cache_file_datetime = cache_item[1]
# Generate a cache file key based on the given key and the cache file datetime
generated_key, _until_dt = self._generate_cache_file_key(key, cache_file_datetime)
if cache_item.ttl_duration:
generated_key, _until_dt, _ttl_duration = self._generate_cache_file_key(
key, with_ttl=cache_item.ttl_duration
)
else:
generated_key, _until_dt, _ttl_duration = self._generate_cache_file_key(
key, until_datetime=cache_item.until_datetime
)
logger.debug(
f"Search: ttl:{ttl_duration}, until:{until_datetime}, at:{at_datetime}, before:{before_datetime} -> hit: {generated_key == cache_file_key}, item: {cache_item.cache_file.seek(0), cache_item.cache_file.read()}"
)
if generated_key == cache_file_key:
# The key matches, return the key and the cache item
return (cache_file_key, cache_item[0], cache_file_datetime)
# The key matches, return the cache item
return (cache_file_key, cache_item)
# Return None if no matching cache item is found
return None
return ("<not found>", None)
def create(
self,
key: str,
until_date: Optional[Any] = None,
until_datetime: Optional[Any] = None,
with_ttl: Union[timedelta, str, int, float, None] = None,
with_ttl: Optional[Any] = None,
mode: str = "wb+",
delete: bool = False,
suffix: Optional[str] = None,
@@ -261,8 +303,7 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
until_datetime (Optional[Any]): The datetime
until the cache file is valid. Time of day is set to maximum time (23:59:59) if not
provided.
with_ttl (Union[timedelta, str, int, float, None], optional): The time to live that
the cache file is valid. Time starts now.
with_ttl (Optional[Any]): The time to live that the cache file is valid. Time starts now.
mode (str, optional): The mode in which the tempfile is opened
(e.g., 'w+', 'r+', 'wb+'). Defaults to 'wb+'.
delete (bool, optional): Whether to delete the file after it is closed.
@@ -279,20 +320,22 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
>>> cache_file.seek(0)
>>> print(cache_file.read()) # Output: 'Some cached data'
"""
until_datetime_dt = self._until_datetime_by_options(
until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl
cache_file_key, until_datetime_dt, ttl_duration = self._generate_cache_file_key(
key, until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl
)
cache_file_key, _ = self._generate_cache_file_key(key, until_datetime_dt)
with self._store_lock: # Synchronize access to _store
if (cache_file_item := self._store.get(cache_file_key)) is not None:
if (cache_item := self._store.get(cache_file_key)) is not None:
# File already available
cache_file_obj = cache_file_item[0]
cache_file_obj = cache_item.cache_file
else:
cache_file_obj = tempfile.NamedTemporaryFile(
mode=mode, delete=delete, suffix=suffix
)
self._store[cache_file_key] = (cache_file_obj, until_datetime_dt)
self._store[cache_file_key] = CacheFileRecord(
cache_file=cache_file_obj,
until_datetime=until_datetime_dt,
ttl_duration=ttl_duration,
)
cache_file_obj.seek(0)
return cache_file_obj
@@ -302,7 +345,7 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
file_obj: IO[bytes],
until_date: Optional[Any] = None,
until_datetime: Optional[Any] = None,
with_ttl: Union[timedelta, str, int, float, None] = None,
with_ttl: Optional[Any] = None,
) -> None:
"""Stores a file-like object in the cache under the specified key and date.
@@ -317,8 +360,7 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
until_datetime (Optional[Any]): The datetime
until the cache file is valid. Time of day is set to maximum time (23:59:59) if not
provided.
with_ttl (Union[timedelta, str, int, float, None], optional): The time to live that
the cache file is valid. Time starts now.
with_ttl (Optional[Any]): The time to live that the cache file is valid. Time starts now.
Raises:
ValueError: If the key is already in store.
@@ -326,16 +368,26 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
Example:
>>> cache_store.set('example_file', io.BytesIO(b'Some binary data'))
"""
until_datetime_dt = self._until_datetime_by_options(
until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl
cache_file_key, until_datetime_dt, ttl_duration = self._generate_cache_file_key(
key, until_datetime=until_datetime, until_date=until_date, with_ttl=with_ttl
)
cache_file_key, until_date = self._generate_cache_file_key(key, until_datetime_dt)
with self._store_lock: # Synchronize access to _store
if cache_file_key in self._store:
raise ValueError(f"Key already in store: `{key}`.")
if ttl_duration:
# Special with_ttl case
if compare_datetimes(
self._store[cache_file_key].until_datetime, to_datetime()
).lt:
# File is outdated - replace by new file
self.delete(key=cache_file_key)
else:
raise ValueError(f"Key already in store: `{key}`.")
else:
raise ValueError(f"Key already in store: `{key}`.")
self._store[cache_file_key] = (file_obj, until_date)
self._store[cache_file_key] = CacheFileRecord(
cache_file=file_obj, until_datetime=until_datetime_dt, ttl_duration=ttl_duration
)
def get(
self,
@@ -344,6 +396,7 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
until_datetime: Optional[Any] = None,
at_datetime: Optional[Any] = None,
before_datetime: Optional[Any] = None,
ttl_duration: Optional[Any] = None,
) -> Optional[IO[bytes]]:
"""Retrieves the cache file associated with the given key and validity datetime.
@@ -362,6 +415,8 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
provided. Defaults to the current datetime if None is provided.
before_datetime (Optional[Any]): The datetime
to compare the cache files datetime to be before.
ttl_duration (Optional[Any]): The time to live to compare the cache files time to live
to be equal.
Returns:
file_obj: The file-like cache object, or None if no file is found.
@@ -373,21 +428,20 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
>>> print(cache_file.read()) # Output: Cached data (if exists)
"""
if until_datetime or until_date:
until_datetime = self._until_datetime_by_options(
until_datetime, _ttl_duration = self._until_datetime_by_options(
until_datetime=until_datetime, until_date=until_date
)
elif at_datetime:
at_datetime = to_datetime(at_datetime)
elif before_datetime:
before_datetime = to_datetime(before_datetime)
else:
at_datetime = to_datetime(datetime.now())
with self._store_lock: # Synchronize access to _store
search_item = self._search(key, until_datetime, at_datetime, before_datetime)
_cache_file_key, search_item = self._search(
key,
until_datetime=until_datetime,
at_datetime=at_datetime,
before_datetime=before_datetime,
ttl_duration=ttl_duration,
)
if search_item is None:
return None
return search_item[1]
return search_item.cache_file
def delete(
self,
@@ -418,17 +472,15 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
elif before_datetime:
before_datetime = to_datetime(before_datetime)
else:
today = datetime.now().date() # Get today's date
tomorrow = today + timedelta(days=1) # Add one day to get tomorrow's date
before_datetime = to_datetime(datetime.combine(tomorrow, time.min))
# Make before_datetime tommorow at start of day
before_datetime = to_datetime().add(days=1).start_of("day")
with self._store_lock: # Synchronize access to _store
search_item = self._search(key, until_datetime, None, before_datetime)
cache_file_key, search_item = self._search(
key, until_datetime=until_datetime, before_datetime=before_datetime
)
if search_item:
cache_file_key = search_item[0]
cache_file = search_item[1]
cache_file_datetime = search_item[2]
file_path = self._get_file_path(cache_file)
file_path = self._get_file_path(search_item.cache_file)
if file_path is None:
logger.warning(
f"The cache file with key '{cache_file_key}' is an in memory "
@@ -436,9 +488,10 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
)
self._store.pop(cache_file_key)
return
file_path = cache_file.name # Get the file path from the cache file object
# Get the file path from the cache file object
file_path = search_item.cache_file.name
del self._store[cache_file_key]
if os.path.exists(file_path):
if file_path and os.path.exists(file_path):
try:
os.remove(file_path)
logger.debug(f"Deleted cache file: {file_path}")
@@ -462,30 +515,31 @@ class CacheFileStore(metaclass=CacheFileStoreMeta):
OSError: If there's an error during file deletion.
"""
delete_keys = [] # List of keys to delete, prevent deleting when traversing the store
clear_timestamp = None
# Some weired logic to prevent calling to_datetime on clear_all.
# Clear_all may be set on __del__. At this time some info for to_datetime will
# not be available anymore.
if not clear_all:
if before_datetime is None:
before_datetime = to_datetime().start_of("day")
else:
before_datetime = to_datetime(before_datetime)
with self._store_lock: # Synchronize access to _store
for cache_file_key, cache_item in self._store.items():
cache_file = cache_item[0]
# Some weired logic to prevent calling to_datetime on clear_all.
# Clear_all may be set on __del__. At this time some info for to_datetime will
# not be available anymore.
clear_file = clear_all
if not clear_all:
if clear_timestamp is None:
before_datetime = to_datetime(before_datetime, to_maxtime=False)
# Convert the threshold date to a timestamp (seconds since epoch)
clear_timestamp = to_datetime(before_datetime).timestamp()
cache_file_timestamp = to_datetime(cache_item[1]).timestamp()
if cache_file_timestamp < clear_timestamp:
clear_file = True
if clear_all:
clear_file = True
else:
clear_file = compare_datetimes(cache_item.until_datetime, before_datetime).lt
if clear_file:
# We have to clear this cache file
delete_keys.append(cache_file_key)
file_path = self._get_file_path(cache_file)
file_path = self._get_file_path(cache_item.cache_file)
if file_path is None:
# In memory file like object
@@ -516,7 +570,7 @@ def cache_in_file(
force_update: Optional[bool] = None,
until_date: Optional[Any] = None,
until_datetime: Optional[Any] = None,
with_ttl: Union[timedelta, str, int, float, None] = None,
with_ttl: Optional[Any] = None,
mode: Literal["w", "w+", "wb", "wb+", "r", "r+", "rb", "rb+"] = "wb+",
delete: bool = False,
suffix: Optional[str] = None,
@@ -620,7 +674,7 @@ def cache_in_file(
elif param == "with_ttl":
until_datetime = None
until_date = None
with_ttl = kwargs[param] # type: ignore[assignment]
with_ttl = kwargs[param]
elif param == "until_date":
until_datetime = None
until_date = kwargs[param]
@@ -642,7 +696,9 @@ def cache_in_file(
result: Optional[RetType | bytes] = None
# Get cache file that is currently valid
cache_file = CacheFileStore().get(key)
cache_file = CacheFileStore().get(
key, until_date=until_date, until_datetime=until_datetime, ttl_duration=with_ttl
)
if not force_update and cache_file is not None:
# cache file is available
try:


@@ -19,7 +19,7 @@ Example usage:
>>> to_duration("2 days 5 hours")
# Timezone detection
>>> to_timezone(location={40.7128, -74.0060})
>>> to_timezone(location=(40.7128, -74.0060))
"""
import re
@@ -27,7 +27,7 @@ from datetime import date, datetime, timedelta
from typing import Any, List, Literal, Optional, Tuple, Union, overload
import pendulum
from pendulum import DateTime
from pendulum import Date, DateTime, Duration
from pendulum.tz.timezone import Timezone
from timezonefinder import TimezoneFinder
@@ -71,6 +71,7 @@ def to_datetime(
date_input (Optional[Any]): The date input to convert. Supported types include:
- `str`: A date string in various formats (e.g., "2024-10-13", "13 Oct 2024").
- `pendulum.DateTime`: A Pendulum DateTime object.
- `pendulum.Date`: A Pendulum Date object, which will be converted to a datetime at the start or end of the day.
- `datetime.datetime`: A standard Python datetime object.
- `datetime.date`: A date object, which will be converted to a datetime at the start or end of the day.
- `int` or `float`: A Unix timestamp, interpreted as seconds since the epoch (UTC).
@@ -123,6 +124,14 @@ def to_datetime(
if isinstance(date_input, DateTime):
dt = date_input
elif isinstance(date_input, Date):
dt = pendulum.datetime(
year=date_input.year, month=date_input.month, day=date_input.day, tz=in_timezone
)
if to_maxtime:
dt = dt.end_of("day")
else:
dt = dt.start_of("day")
elif isinstance(date_input, str):
# Convert to timezone aware datetime
dt = None
@@ -161,14 +170,22 @@ def to_datetime(
except pendulum.parsing.exceptions.ParserError as e:
logger.debug(f"Date string {date_input} does not match any Pendulum formats: {e}")
dt = None
if dt is None:
# Some special values
if date_input.lower() == "infinity":
# Subtract one year from max as max datetime will create an overflow error in certain context.
dt = DateTime.max.subtract(years=1)
if dt is None:
try:
timestamp = float(date_input)
dt = pendulum.from_timestamp(timestamp, tz="UTC")
except (ValueError, TypeError) as e:
logger.debug(f"Date string {date_input} does not match timestamp format: {e}")
dt = None
if dt is None:
raise ValueError(f"Date string {date_input} does not match any known formats.")
elif date_input is None:
dt = (
pendulum.today(tz=in_timezone).end_of("day")
if to_maxtime
else pendulum.today(tz=in_timezone).start_of("day")
)
dt = pendulum.now(tz=in_timezone)
elif isinstance(date_input, datetime):
dt = pendulum.instance(date_input)
elif isinstance(date_input, date):
@@ -206,19 +223,19 @@ def to_datetime(
def to_duration(
input_value: Union[timedelta, str, int, float, Tuple[int, int, int, int], List[int]],
) -> timedelta:
"""Converts various input types into a timedelta object using pendulum.
input_value: Union[Duration, timedelta, str, int, float, Tuple[int, int, int, int], List[int]],
) -> Duration:
"""Converts various input types into a Duration object using pendulum.
Args:
input_value (Union[timedelta, str, int, float, tuple, list]): Input to be converted
input_value (Union[Duration, timedelta, str, int, float, tuple, list]): Input to be converted
into a timedelta:
- str: A duration string like "2 days", "5 hours", "30 minutes", or a combination.
- int/float: Number representing seconds.
- tuple/list: A tuple or list in the format (days, hours, minutes, seconds).
Returns:
timedelta: A timedelta object corresponding to the input value.
duration: A Duration object corresponding to the input value.
Raises:
ValueError: If the input format is not supported.
@@ -233,18 +250,21 @@ def to_duration(
>>> to_duration((1, 2, 30, 15))
timedelta(days=1, seconds=90315)
"""
if isinstance(input_value, timedelta):
if isinstance(input_value, Duration):
return input_value
if isinstance(input_value, timedelta):
return pendulum.duration(seconds=input_value.total_seconds())
if isinstance(input_value, (int, float)):
# Handle integers or floats as seconds
return timedelta(seconds=input_value)
return pendulum.duration(seconds=input_value)
elif isinstance(input_value, (tuple, list)):
# Handle tuple or list: (days, hours, minutes, seconds)
if len(input_value) == 4:
days, hours, minutes, seconds = input_value
return timedelta(days=days, hours=hours, minutes=minutes, seconds=seconds)
return pendulum.duration(days=days, hours=hours, minutes=minutes, seconds=seconds)
else:
error_msg = f"Expected a tuple or list of length 4, got {len(input_value)}"
logger.error(error_msg)
@@ -340,7 +360,7 @@ def to_timezone(
>>> to_timezone(utc_offset=5.5, as_string=True)
'UTC+05:30'
>>> to_timezone(location={40.7128, -74.0060})
>>> to_timezone(location=(40.7128, -74.0060))
<Timezone [America/New_York]>
>>> to_timezone()


@@ -427,9 +427,9 @@ def prepare_visualize(
report.generate_pdf()
if __name__ == "__main__":
# Example usage
report = VisualizationReport("example_report.pdf")
def generate_example_report(filename: str = "example_report.pdf") -> None:
"""Generate example visualization report."""
report = VisualizationReport(filename)
x_hours = 0 # Define x-axis start values (e.g., hours)
# Group 1: Adding charts to be displayed on the same page
@@ -502,3 +502,7 @@ if __name__ == "__main__":
# Generate the PDF report
report.generate_pdf()
if __name__ == "__main__":
generate_example_report()