EOS/src/akkudoktoreos/logutil.py
Bobby Noelte d2ba0adb5f
Add test to PVForecast (#174)
* Add documentation to class_pv_forecast.py.

Added documentation. Beware: it is mostly generated by ChatGPT.

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>

* Add CacheFileStore, datetime and logger utilities.

The `CacheFileStore` class is a singleton-based, thread-safe key-value store for managing
temporary file objects, allowing the creation, retrieval, and management of cache files.

The utility modules offer a flexible logging setup (`get_logger`) and utilities to handle
different date-time formats (`to_datetime`, `to_timestamp`) and timezone detection
(`to_timezone`).

- Cache files are automatically valid for the current date unless specified otherwise.
  This mimics the current behaviour used in several classes.
- The logger supports rotating log files to prevent excessive log file size (see the sketch below).
- The `to_datetime` and `to_timestamp` functions support a wide variety of input types and formats.
  They provide the time conversion that is used, e.g., in PVForecast.
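
As a rough sketch of how the logger is meant to be used (the log message is only an example):

    from akkudoktoreos.logutil import get_logger

    # Console logging only; pass log_file to additionally enable rotating file logging.
    logger = get_logger(__name__, logging_level="INFO")
    logger.info("PV forecast update started.")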

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>

* Improve testability of PVForecast

Improvements for testing of PVForecast:
- Use common utility functions to allow for general testing in one place:
  - to_datetime
  - CacheFileStore
- Use logging instead of print to easily capture output in tests (see the sketch after this list).
- Add validation of the JSON schema for Akkudoktor PV forecast data.
- Allow creating an empty PVForecast instance as a base instance for testing.
- Make process_data() fully fill a PVForecast instance for testing.
- Normalize forecast datetimes to the system timezone given in the loaded data.
- Do not print the report but provide it for test checks.
- Get rid of the explicit cache file path by using the CacheFileStore to automate cache file usage.
- Improve module documentation.
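
For illustration, the print-to-logging switch amounts to something like this (the report string here is a placeholder, not the real PVForecast report):

    from akkudoktoreos.logutil import get_logger

    logger = get_logger(__name__)

    report = "AC power summary ..."  # placeholder; the real report comes from PVForecast
    # Replaces a bare print(report); tests can capture this via pytest's caplog fixture.
    logger.info(report)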

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>

* Add test for PVForecast and newly extracted utility modules.

- Add test for PVForecast
- Add test for CacheFileStore in the new cachefilestore module
- Add test for to_datetime, to_timestamp, to_timezone in the new
  datetimeutil module
- Add test for get_logger in the new logutil module

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>

---------

Signed-off-by: Bobby Noelte <b0661n0e17e@gmail.com>
Co-authored-by: Normann <github@koldrack.com>
2024-11-10 23:49:10 +01:00

96 lines
3.2 KiB
Python

"""Utility functions for handling logging tasks.
Functions:
----------
- get_logger: Creates and configures a logger with console and optional rotating file logging.
Example usage:
--------------
# Logger setup
>>> logger = get_logger(__name__, log_file="app.log", logging_level="DEBUG")
>>> logger.info("Logging initialized.")
Notes:
------
- The logger supports rotating log files to prevent excessive log file size.
"""
import logging
import os
from logging.handlers import RotatingFileHandler
from typing import Optional


def get_logger(
    name: str,
    log_file: Optional[str] = None,
    logging_level: Optional[str] = "INFO",
    max_bytes: int = 5000000,
    backup_count: int = 5,
) -> logging.Logger:
    """Creates and configures a logger with a given name.

    The logger supports logging to both the console and an optional log file. File logging is
    handled by a rotating file handler to prevent excessive log file size.

    Args:
        name (str): The name of the logger, typically `__name__` from the calling module.
        log_file (Optional[str]): Path to the log file for file logging. If None, no file logging is done.
        logging_level (Optional[str]): Logging level (e.g., "INFO", "DEBUG"). Defaults to "INFO".
        max_bytes (int): Maximum size in bytes for the log file before rotation. Defaults to 5 MB.
        backup_count (int): Number of backup log files to keep. Defaults to 5.

    Returns:
        logging.Logger: Configured logger instance.

    Example:
        logger = get_logger(__name__, log_file="app.log", logging_level="DEBUG")
        logger.info("Application started")
    """
    # Create a logger with the specified name
    logger = logging.getLogger(name)
    logger.propagate = True
    if logging_level == "DEBUG":
        level = logging.DEBUG
    elif logging_level == "INFO":
        level = logging.INFO
    elif logging_level == "WARNING":
        level = logging.WARNING
    elif logging_level == "ERROR":
        level = logging.ERROR
    else:
        level = logging.DEBUG
    logger.setLevel(level)

    # The log message format
    formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")

    # Prevent handlers from being added multiple times
    # (there may already be a handler, e.g. one installed by pytest)
    if not logger.handlers:
        # Create a console handler with a standard output stream
        console_handler = logging.StreamHandler()
        console_handler.setLevel(level)
        console_handler.setFormatter(formatter)
        # Add the console handler to the logger
        logger.addHandler(console_handler)

    # We assume the console handler to be the first handler
    if log_file and len(logger.handlers) < 2:
        # If a log file path is specified, create a rotating file handler

        # Ensure the log directory exists
        log_dir = os.path.dirname(log_file)
        if log_dir and not os.path.exists(log_dir):
            os.makedirs(log_dir)

        # Create a rotating file handler
        file_handler = RotatingFileHandler(log_file, maxBytes=max_bytes, backupCount=backup_count)
        file_handler.setLevel(level)
        file_handler.setFormatter(formatter)

        # Add the file handler to the logger
        logger.addHandler(file_handler)

    return logger
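

# Illustrative usage sketch; the log file name below is hypothetical.
if __name__ == "__main__":
    demo_logger = get_logger(__name__, log_file="eos_demo.log", logging_level="DEBUG")
    demo_logger.debug("File logging rotates at roughly 5 MB with up to 5 backups by default.")
    demo_logger.info("Console logging is always enabled.")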