diff --git a/CHANGELOG.md b/CHANGELOG.md
index 644d6a18..5e0d0472 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -4,18 +4,6 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
## Latest Changes
-### Resilient Sink Addition ([v0.13.1](https://github.com/devsetgo/devsetgo_lib/releases/tag/v0.13.1))
-
-#### What's Changed
-* Adding Resiliency to Logging Config (#429) @devsetgo
-* pip(deps): bump mkdocs-print-site-plugin from 2.4.1 to 2.5.0 (#422) @dependabot
-* pip(deps): bump ruff from 0.4.5 to 0.4.7 (#420) @dependabot
-* pip(deps): bump autopep8 from 2.1.1 to 2.2.0 (#421) @dependabot
-* pip(deps): bump mkdocs-material from 9.5.24 to 9.5.25 (#423) @dependabot
-
-
-Published Date: 2024 July 20, 20:51
-
### ([v0.13.0-republish](https://github.com/devsetgo/devsetgo_lib/releases/tag/v0.13.0-republish))
#### What's Changed
@@ -725,3 +713,60 @@ Published Date: 2021 March 18, 17:19
Published Date: 2021 March 18, 17:06
+
+### Minor updates and library updates. ([v0.4.1](https://github.com/devsetgo/devsetgo_lib/releases/tag/v0.4.1))
+
+#### What's Changed
+* Updates and Minor updates (#78) @devsetgo
+* Bump tqdm from 4.54.1 to 4.55.0 (#77) @dependabot
+* Bump twine from 3.2.0 to 3.3.0 (#76) @dependabot
+* Bump coverage from 5.3 to 5.3.1 (#74) @dependabot
+* Bump mkdocs-material from 6.1.7 to 6.2.2 (#75) @dependabot
+* Bump watchdog from 0.10.4 to 1.0.2 (#73) @dependabot
+* Bump pytest from 6.1.2 to 6.2.1 (#71) @dependabot
+* Bump wheel from 0.36.1 to 0.36.2 (#70) @dependabot
+* Bump tqdm from 4.54.0 to 4.54.1 (#67) @dependabot
+* Bump mkdocs-material from 6.1.6 to 6.1.7 (#68) @dependabot
+* Bump pre-commit from 2.9.2 to 2.9.3 (#69) @dependabot
+* Bump wheel from 0.36.0 to 0.36.1 (#66) @dependabot
+* Bump wheel from 0.35.1 to 0.36.0 (#64) @dependabot
+* Bump tqdm from 4.53.0 to 4.54.0 (#65) @dependabot
+* Bump pre-commit from 2.8.2 to 2.9.2 (#61) @dependabot
+* Bump mkdocs-material from 6.1.5 to 6.1.6 (#60) @dependabot
+* Bump tqdm from 4.52.0 to 4.53.0 (#62) @dependabot
+* Bump watchdog from 0.10.3 to 0.10.4 (#63) @dependabot
+* Bump tqdm from 4.51.0 to 4.52.0 (#59) @dependabot
+* Bump mkdocs-material from 6.1.4 to 6.1.5 (#58) @dependabot
+* Bump mkdocs-material from 6.1.2 to 6.1.4 (#57) @dependabot
+* Bump pre-commit from 2.8.0 to 2.8.2 (#55) @dependabot
+* Bump mkdocs-material from 6.1.0 to 6.1.2 (#56) @dependabot
+* Bump pytest from 6.1.1 to 6.1.2 (#52) @dependabot
+* Bump pre-commit from 2.7.1 to 2.8.0 (#53) @dependabot
+* Bump tqdm from 4.50.2 to 4.51.0 (#54) @dependabot
+* Bump mkdocs-material from 6.0.2 to 6.1.0 (#51) @dependabot
+* Bump tqdm from 4.50.1 to 4.50.2 (#49) @dependabot
+* Bump tox from 3.20.0 to 3.20.1 (#50) @dependabot
+* Bump pytest from 6.1.0 to 6.1.1 (#48) @dependabot
+* Bump mkdocs-material from 6.0.1 to 6.0.2 (#47) @dependabot
+* Bump flake8 from 3.8.3 to 3.8.4 (#45) @dependabot
+* Bump tqdm from 4.50.0 to 4.50.1 (#44) @dependabot
+* Bump bump2version from 1.0.0 to 1.0.1 (#46) @dependabot
+* Bump tqdm from 4.49.0 to 4.50.0 (#42) @dependabot
+* Bump black from 19.10b0 to 20.8b1 (#43) @dependabot
+* Bump tqdm from 4.46.0 to 4.49.0 (#40) @dependabot
+* Bump pytest from 5.4.2 to 6.1.0 (#39) @dependabot
+* Bump coverage from 5.1 to 5.3 (#38) @dependabot
+* Bump autoflake from 1.3.1 to 1.4 (#41) @dependabot
+* Bump twine from 3.1.1 to 3.2.0 (#37) @dependabot
+* Bump wheel from 0.34.2 to 0.35.1 (#34) @dependabot
+* Bump pytest-cov from 2.9.0 to 2.10.1 (#36) @dependabot
+* Bump watchdog from 0.10.2 to 0.10.3 (#35) @dependabot
+* Bump mkdocs-material from 5.2.2 to 6.0.1 (#33) @dependabot
+* Bump pylint from 2.5.2 to 2.6.0 (#32) @dependabot-preview
+* Bump pre-commit from 2.4.0 to 2.7.1 (#31) @dependabot-preview
+* Bump tox from 3.15.1 to 3.20.0 (#30) @dependabot-preview
+* Bump flake8 from 3.8.2 to 3.8.3 (#29) @dependabot-preview
+* Bump autopep8 from 1.5.2 to 1.5.4 (#28) @dependabot-preview
+
+
+Published Date: 2020 December 26, 23:51
diff --git a/README.md b/README.md
index 29ac883a..f882bb04 100644
--- a/README.md
+++ b/README.md
@@ -31,27 +31,30 @@ SonarCloud:
## Key Features
-- **File Operations**:
- - **CSV, JSON, and Text File Functions**: Create, read, write, and manipulate various file types with ease.
- - **Folder Functions**: Create and remove directories, list directory contents, and manage file system operations efficiently.
+### **Common Functions**:
+ - **File Operations**:
+ - **CSV, JSON, and Text File Functions**: Create, read, write, and manipulate various file types with ease.
+ - **Folder Functions**: Create and remove directories, list directory contents, and manage file system operations efficiently.
-- **Logging**:
- - Comprehensive logging setup using the `loguru` library. Provides extensive customization options for log configuration, including log rotation, retention, and formatting.
+ - **Logging**:
+    Comprehensive logging setup using the [Loguru Library](https://loguru.readthedocs.io/en/stable/overview.html). Provides extensive customization options for log configuration, including log rotation, retention, and formatting. Includes improvements for multiprocessing environments to ensure log messages are handled correctly across multiple processes.
-- **Calendar Functions**:
- - Convert between month names and numbers seamlessly.
+ - **Calendar Functions**:
+ Convert between month names and numbers seamlessly.
-- **Pattern Matching**:
- - Powerful tools for searching patterns in text using regular expressions.
+ - **Pattern Matching**:
+ Powerful tools for searching patterns in text using regular expressions.
-- **FastAPI Endpoints**:
+
+### **FastAPI Endpoints**:
- Pre-built endpoints for system health checks, status, and uptime monitoring.
- Functions to generate HTTP response codes easily.
-- **Async Database**:
+### **Async Database**:
- Configuration and management of asynchronous database sessions.
- CRUD operations with async support.
+---
## Installation
To install `devsetgo_lib`, use pip:
@@ -80,7 +83,7 @@ pip install devsetgo-lib[all]
Here's a quick example to demonstrate how you can use some of the key features of `devsetgo_lib`:
```python
-from devsetgo_lib import file_functions, logging_config, patterns, calendar_functions
+from dsg_lib.common_functions import file_functions, logging_config, patterns, calendar_functions
# File Operations
file_functions.create_sample_files("example", 100)
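For context on the quick-start above, here is a minimal runnable sketch. It assumes the `dsg_lib` import path used throughout this diff and touches only functions that appear in it (`save_json`, `open_json`, `get_month`, `get_month_number`):

```python
# Minimal quick-start sketch; assumes the dsg_lib package layout shown in this diff.
from dsg_lib.common_functions import calendar_functions, file_functions

# File operations: write then re-read a small JSON document
# (files land under data/json/ relative to the working directory).
file_functions.save_json("example.json", {"greeting": "hello"})
print(file_functions.open_json("example.json"))  # {'greeting': 'hello'}

# Calendar helpers: month number <-> month name.
print(calendar_functions.get_month(7))              # 'July'
print(calendar_functions.get_month_number("July"))  # 7
```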
diff --git a/coverage.svg b/coverage.svg
deleted file mode 100644
index 6bfc8faf..00000000
--- a/coverage.svg
+++ /dev/null
@@ -1,21 +0,0 @@
-[SVG coverage badge reading "coverage: 99%"; remaining markup not recoverable]
diff --git a/coverage.xml b/coverage.xml
index 4397aa12..d53f588d 100644
--- a/coverage.xml
+++ b/coverage.xml
[coverage.xml hunks: regenerated coverage report; the element content on these lines is not recoverable]
diff --git a/docs/index.md b/docs/index.md
index 29ac883a..f882bb04 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -31,27 +31,30 @@ SonarCloud:
## Key Features
-- **File Operations**:
- - **CSV, JSON, and Text File Functions**: Create, read, write, and manipulate various file types with ease.
- - **Folder Functions**: Create and remove directories, list directory contents, and manage file system operations efficiently.
+### **Common Functions**:
+ - **File Operations**:
+ - **CSV, JSON, and Text File Functions**: Create, read, write, and manipulate various file types with ease.
+ - **Folder Functions**: Create and remove directories, list directory contents, and manage file system operations efficiently.
-- **Logging**:
- - Comprehensive logging setup using the `loguru` library. Provides extensive customization options for log configuration, including log rotation, retention, and formatting.
+ - **Logging**:
+    Comprehensive logging setup using the [Loguru Library](https://loguru.readthedocs.io/en/stable/overview.html). Provides extensive customization options for log configuration, including log rotation, retention, and formatting. Includes improvements for multiprocessing environments to ensure log messages are handled correctly across multiple processes.
-- **Calendar Functions**:
- - Convert between month names and numbers seamlessly.
+ - **Calendar Functions**:
+ Convert between month names and numbers seamlessly.
-- **Pattern Matching**:
- - Powerful tools for searching patterns in text using regular expressions.
+ - **Pattern Matching**:
+ Powerful tools for searching patterns in text using regular expressions.
-- **FastAPI Endpoints**:
+
+### **FastAPI Endpoints**:
- Pre-built endpoints for system health checks, status, and uptime monitoring.
- Functions to generate HTTP response codes easily.
-- **Async Database**:
+### **Async Database**:
- Configuration and management of asynchronous database sessions.
- CRUD operations with async support.
+---
## Installation
To install `devsetgo_lib`, use pip:
@@ -80,7 +83,7 @@ pip install devsetgo-lib[all]
Here's a quick example to demonstrate how you can use some of the key features of `devsetgo_lib`:
```python
-from devsetgo_lib import file_functions, logging_config, patterns, calendar_functions
+from dsg_lib.common_functions import file_functions, logging_config, patterns, calendar_functions
# File Operations
file_functions.create_sample_files("example", 100)
diff --git a/dsg_lib/__init__.py b/dsg_lib/__init__.py
index 33d49249..9d2ac81d 100644
--- a/dsg_lib/__init__.py
+++ b/dsg_lib/__init__.py
@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
-__version__ = '0.13.3'
+__version__ = "0.13.3"
diff --git a/dsg_lib/async_database_functions/__import_sqlalchemy.py b/dsg_lib/async_database_functions/__import_sqlalchemy.py
index 4e961ed9..ac424a5a 100644
--- a/dsg_lib/async_database_functions/__import_sqlalchemy.py
+++ b/dsg_lib/async_database_functions/__import_sqlalchemy.py
@@ -1,4 +1,29 @@
# -*- coding: utf-8 -*-
+"""
+This module provides functionality to import and validate SQLAlchemy components, ensuring compatibility with the required version.
+
+The primary function in this module is `import_sqlalchemy`, which imports the required SQLAlchemy components, checks the installed SQLAlchemy version, and raises an ImportError if it is below the minimum required version.
+
+Usage example:
+ ```python
+    from dsg_lib.async_database_functions.__import_sqlalchemy import import_sqlalchemy
+
+ sqlalchemy_components = import_sqlalchemy()
+ sqlalchemy, MetaData, create_engine, text, Column, DateTime, String, IntegrityError, SQLAlchemyError, AsyncSession = sqlalchemy_components
+
+ # Example usage of imported components
+ engine = create_engine('sqlite:///example.db')
+ metadata = MetaData()
+ ```
+
+Author(s):
+ Mike Ryan
+
+Date Created:
+ 2024/05/16
+Date Updated:
+ 2024/07/26
+"""
from typing import Tuple
from loguru import logger
@@ -15,7 +40,7 @@ def import_sqlalchemy() -> Tuple:
and raises an ImportError if the version is less than the minimum required version.
Returns:
- tuple: A tuple containing the following SQLAlchemy components:
+ Tuple: A tuple containing the following SQLAlchemy components:
- sqlalchemy: The SQLAlchemy module.
- MetaData: The MetaData class from SQLAlchemy.
- create_engine: The create_engine function from SQLAlchemy.
@@ -26,23 +51,25 @@ def import_sqlalchemy() -> Tuple:
- IntegrityError: The IntegrityError exception from SQLAlchemy.
- SQLAlchemyError: The SQLAlchemyError exception from SQLAlchemy.
- AsyncSession: The AsyncSession class from SQLAlchemy.
- - create_async_engine: The create_async_engine function from SQLAlchemy.
- - select: The select function from SQLAlchemy.
- - declarative_base: The declarative_base function from SQLAlchemy.
- - sessionmaker: The sessionmaker function from SQLAlchemy.
- - func: The func object from SQLAlchemy.
- - NoResultFound: The NoResultFound exception from SQLAlchemy.
Raises:
- ImportError: If the SQLAlchemy version is less than the minimum required version.
+ ImportError: If SQLAlchemy is not installed or the version is below the minimum required version.
+
+ Example:
+ ```python
+    from dsg_lib.async_database_functions.__import_sqlalchemy import import_sqlalchemy
+
+ sqlalchemy_components = import_sqlalchemy()
+ sqlalchemy, MetaData, create_engine, text, Column, DateTime, String, IntegrityError, SQLAlchemyError, AsyncSession = sqlalchemy_components
- Author: Mike Ryan
- Date: 2024/05/16
- License: MIT
+ # Example usage of imported components
+ engine = create_engine('sqlite:///example.db')
+ metadata = MetaData()
+ ```
"""
- min_version = '2.0.0' # Minimum required version of SQLAlchemy
+ min_version = "2.0.0" # Minimum required version of SQLAlchemy
- logger.info('Attempting to import SQLAlchemy...')
+ logger.info("Attempting to import SQLAlchemy...")
try:
# Import SQLAlchemy and its components
@@ -55,11 +82,11 @@ def import_sqlalchemy() -> Tuple:
from sqlalchemy.orm.exc import NoResultFound
from sqlalchemy.sql import func
- logger.info('Successfully imported SQLAlchemy.')
+ logger.info("Successfully imported SQLAlchemy.")
except ImportError: # pragma: no cover
# Handle the case where SQLAlchemy is not installed
- logger.error('Failed to import SQLAlchemy.')
+ logger.error("Failed to import SQLAlchemy.")
create_engine = text = sqlalchemy = None # pragma: no cover
# Check SQLAlchemy version
@@ -68,13 +95,13 @@ def import_sqlalchemy() -> Tuple:
) < packaging_version.parse(min_version):
# If the installed version is less than the minimum required version, raise an error
logger.error(
- f'SQLAlchemy version >= {min_version} required, run `pip install --upgrade sqlalchemy`'
+ f"SQLAlchemy version >= {min_version} required, run `pip install --upgrade sqlalchemy`"
)
raise ImportError(
- f'SQLAlchemy version >= {min_version} required, run `pip install --upgrade sqlalchemy`'
+ f"SQLAlchemy version >= {min_version} required, run `pip install --upgrade sqlalchemy`"
) # pragma: no cover
- logger.info('Returning SQLAlchemy components.')
+ logger.info("Returning SQLAlchemy components.")
# Return the imported SQLAlchemy components
return (
@@ -118,4 +145,6 @@ def import_sqlalchemy() -> Tuple:
String, # The String class from SQLAlchemy
func, # The func object from SQLAlchemy
NoResultFound, # The NoResultFound exception from SQLAlchemy
-) = import_sqlalchemy() # Call the function that imports SQLAlchemy and checks its version
+) = (
+ import_sqlalchemy()
+) # Call the function that imports SQLAlchemy and checks its version
diff --git a/dsg_lib/async_database_functions/async_database.py b/dsg_lib/async_database_functions/async_database.py
index b29cc60c..41d1b3db 100644
--- a/dsg_lib/async_database_functions/async_database.py
+++ b/dsg_lib/async_database_functions/async_database.py
@@ -85,7 +85,7 @@ def __init__(self, db_config: DBConfig):
"""
self.db_config = db_config
self.Base = BASE
- logger.debug('AsyncDatabase initialized')
+ logger.debug("AsyncDatabase initialized")
def get_db_session(self):
"""This method returns a context manager that provides a new database
@@ -96,7 +96,7 @@ def get_db_session(self):
Returns: contextlib._GeneratorContextManager: A context manager that
provides a new database session.
"""
- logger.debug('Getting database session')
+ logger.debug("Getting database session")
return self.db_config.get_db_session()
async def create_tables(self):
@@ -106,7 +106,7 @@ async def create_tables(self):
Returns: None
"""
- logger.debug('Creating tables')
+ logger.debug("Creating tables")
try:
# Bind the engine to the metadata of the base class
self.Base.metadata.bind = self.db_config.engine
@@ -115,8 +115,8 @@ async def create_tables(self):
async with self.db_config.engine.begin() as conn:
# Run a function in a synchronous manner
await conn.run_sync(self.Base.metadata.create_all)
- logger.info('Tables created successfully')
+ logger.info("Tables created successfully")
except Exception as ex: # pragma: no cover
# Log the error and raise it
- logger.error(f'Error creating tables: {ex}') # pragma: no cover
+ logger.error(f"Error creating tables: {ex}") # pragma: no cover
raise # pragma: no cover
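`create_tables` above relies on SQLAlchemy 2.x asyncio conventions: open an async engine transaction, then bridge into the synchronous DDL API with `run_sync`. A self-contained sketch of that pattern, independent of this library's classes (the `Thing` model and the in-memory SQLite URI are illustrative):

```python
# Sketch of the async create-tables pattern (SQLAlchemy 2.x; requires aiosqlite).
import asyncio

from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.asyncio import create_async_engine
from sqlalchemy.orm import DeclarativeBase


class Base(DeclarativeBase):
    pass


class Thing(Base):
    __tablename__ = "things"
    pkid = Column(Integer, primary_key=True)
    name = Column(String(50))


async def main() -> None:
    engine = create_async_engine("sqlite+aiosqlite:///:memory:")
    async with engine.begin() as conn:
        # run_sync bridges into the synchronous metadata.create_all, exactly
        # as AsyncDatabase.create_tables does above.
        await conn.run_sync(Base.metadata.create_all)
    await engine.dispose()


asyncio.run(main())
```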
diff --git a/dsg_lib/async_database_functions/base_schema.py b/dsg_lib/async_database_functions/base_schema.py
index 7e782236..b2656899 100644
--- a/dsg_lib/async_database_functions/base_schema.py
+++ b/dsg_lib/async_database_functions/base_schema.py
@@ -56,14 +56,18 @@ class MyModel(base_schema.SchemaBaseSQLite):
String, # The String class from SQLAlchemy
func, # The func object from SQLAlchemy
NoResultFound, # The NoResultFound exception from SQLAlchemy
-) = import_sqlalchemy() # Call the function that imports SQLAlchemy and checks its version
+) = (
+ import_sqlalchemy()
+) # Call the function that imports SQLAlchemy and checks its version
# comments
-uuid_comment = 'Unique identifier for each record, a string representation of a UUID'
-date_created_comment = 'Date and time when a row was inserted, defaults to current UTC time'
+uuid_comment = "Unique identifier for each record, a string representation of a UUID"
+date_created_comment = (
+ "Date and time when a row was inserted, defaults to current UTC time"
+)
date_updated_comment = (
- 'Date and time when a row was last updated, defaults to current UTC time on update'
+ "Date and time when a row was last updated, defaults to current UTC time on update"
)
@@ -228,7 +232,7 @@ class MyModel(base_schema.SchemaBaseMySQL, BASE):
date_created = Column(
DateTime,
index=True,
- server_default=text('UTC_TIMESTAMP()'),
+ server_default=text("UTC_TIMESTAMP()"),
comment=date_created_comment,
)
@@ -237,8 +241,8 @@ class MyModel(base_schema.SchemaBaseMySQL, BASE):
date_updated = Column(
DateTime,
index=True,
- server_default=text('UTC_TIMESTAMP()'),
- onupdate=text('UTC_TIMESTAMP()'),
+ server_default=text("UTC_TIMESTAMP()"),
+ onupdate=text("UTC_TIMESTAMP()"),
comment=date_updated_comment,
)
@@ -287,7 +291,7 @@ class MyModel(base_schema.SchemaBaseOracle, BASE):
date_created = Column(
DateTime,
index=True,
- server_default=text('SYS_EXTRACT_UTC(SYSTIMESTAMP)'),
+ server_default=text("SYS_EXTRACT_UTC(SYSTIMESTAMP)"),
comment=date_created_comment,
)
@@ -296,8 +300,8 @@ class MyModel(base_schema.SchemaBaseOracle, BASE):
date_updated = Column(
DateTime,
index=True,
- server_default=text('SYS_EXTRACT_UTC(SYSTIMESTAMP)'),
- onupdate=text('SYS_EXTRACT_UTC(SYSTIMESTAMP)'),
+ server_default=text("SYS_EXTRACT_UTC(SYSTIMESTAMP)"),
+ onupdate=text("SYS_EXTRACT_UTC(SYSTIMESTAMP)"),
comment=date_updated_comment,
)
@@ -346,7 +350,7 @@ class MyModel(base_schema.SchemaBaseMSSQL, BASE):
date_created = Column(
DateTime,
index=True,
- server_default=text('GETUTCDATE()'),
+ server_default=text("GETUTCDATE()"),
comment=date_created_comment,
)
@@ -355,8 +359,8 @@ class MyModel(base_schema.SchemaBaseMSSQL, BASE):
date_updated = Column(
DateTime,
index=True,
- server_default=text('GETUTCDATE()'),
- onupdate=text('GETUTCDATE()'),
+ server_default=text("GETUTCDATE()"),
+ onupdate=text("GETUTCDATE()"),
comment=date_updated_comment,
)
@@ -405,8 +409,8 @@ class MyModel(base_schema.SchemaBaseFirebird, BASE):
date_created = Column(
DateTime,
index=True,
- server_default=text('CURRENT_TIMESTAMP'),
- comment='Date and time when a row was inserted, defaults to current time',
+ server_default=text("CURRENT_TIMESTAMP"),
+ comment="Date and time when a row was inserted, defaults to current time",
)
# The date and time when a particular row was last updated. It defaults to
@@ -414,9 +418,9 @@ class MyModel(base_schema.SchemaBaseFirebird, BASE):
date_updated = Column(
DateTime,
index=True,
- server_default=text('CURRENT_TIMESTAMP'),
- onupdate=text('CURRENT_TIMESTAMP'),
- comment='Date and time when a row was last updated, defaults to current time on update',
+ server_default=text("CURRENT_TIMESTAMP"),
+ onupdate=text("CURRENT_TIMESTAMP"),
+ comment="Date and time when a row was last updated, defaults to current time on update",
)
@@ -464,7 +468,7 @@ class MyModel(base_schema.SchemaBaseSybase, BASE):
date_created = Column(
DateTime,
index=True,
- server_default=text('GETUTCDATE()'),
+ server_default=text("GETUTCDATE()"),
comment=date_created_comment,
)
@@ -473,8 +477,8 @@ class MyModel(base_schema.SchemaBaseSybase, BASE):
date_updated = Column(
DateTime,
index=True,
- server_default=text('GETUTCDATE()'),
- onupdate=text('GETUTCDATE()'),
+ server_default=text("GETUTCDATE()"),
+ onupdate=text("GETUTCDATE()"),
comment=date_updated_comment,
)
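The dialect-specific schema bases above differ only in the SQL expression supplying the UTC timestamp defaults. A minimal sketch of that column pattern in plain SQLAlchemy, using MySQL's `UTC_TIMESTAMP()` as the example expression:

```python
# Sketch of the per-dialect timestamp-default pattern used by the schema bases.
from sqlalchemy import Column, DateTime, Integer, text
from sqlalchemy.orm import DeclarativeBase


class Base(DeclarativeBase):
    pass


class MySQLStyleModel(Base):
    __tablename__ = "mysql_style_example"
    pkid = Column(Integer, primary_key=True)
    # Server-side UTC default on insert; the other bases swap in
    # SYS_EXTRACT_UTC(SYSTIMESTAMP) (Oracle), GETUTCDATE() (MSSQL/Sybase),
    # or CURRENT_TIMESTAMP (Firebird).
    date_created = Column(DateTime, index=True, server_default=text("UTC_TIMESTAMP()"))
    # The same expression is re-applied on every UPDATE via onupdate.
    date_updated = Column(
        DateTime,
        index=True,
        server_default=text("UTC_TIMESTAMP()"),
        onupdate=text("UTC_TIMESTAMP()"),
    )
```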
diff --git a/dsg_lib/async_database_functions/database_config.py b/dsg_lib/async_database_functions/database_config.py
index eb9d1ff9..8f63ec08 100644
--- a/dsg_lib/async_database_functions/database_config.py
+++ b/dsg_lib/async_database_functions/database_config.py
@@ -67,7 +67,9 @@
String, # The String class from SQLAlchemy
func, # The func object from SQLAlchemy
NoResultFound, # The NoResultFound exception from SQLAlchemy
-) = import_sqlalchemy() # Call the function that imports SQLAlchemy and checks its version
+) = (
+ import_sqlalchemy()
+) # Call the function that imports SQLAlchemy and checks its version
# Now you can use declarative_base at the module level
@@ -118,15 +120,15 @@ class DBConfig:
"""
SUPPORTED_PARAMETERS = {
- 'sqlite': {'echo', 'future', 'pool_recycle'},
- 'postgresql': {
- 'echo',
- 'future',
- 'pool_pre_ping',
- 'pool_size',
- 'max_overflow',
- 'pool_recycle',
- 'pool_timeout',
+ "sqlite": {"echo", "future", "pool_recycle"},
+ "postgresql": {
+ "echo",
+ "future",
+ "pool_pre_ping",
+ "pool_size",
+ "max_overflow",
+ "pool_recycle",
+ "pool_timeout",
},
# Add other engines here...
}
@@ -175,11 +177,15 @@ def __init__(self, config: Dict):
```
"""
self.config = config
- engine_type = self.config['database_uri'].split('+')[0]
+ engine_type = self.config["database_uri"].split("+")[0]
supported_parameters = self.SUPPORTED_PARAMETERS.get(engine_type, set())
- unsupported_parameters = set(config.keys()) - supported_parameters - {'database_uri'}
+ unsupported_parameters = (
+ set(config.keys()) - supported_parameters - {"database_uri"}
+ )
if unsupported_parameters:
- error_message = f'Unsupported parameters for {engine_type}: {unsupported_parameters}'
+ error_message = (
+ f"Unsupported parameters for {engine_type}: {unsupported_parameters}"
+ )
logger.error(error_message)
raise Exception(error_message)
@@ -188,7 +194,9 @@ def __init__(self, config: Dict):
for param in supported_parameters
if self.config.get(param) is not None
}
- self.engine = create_async_engine(self.config['database_uri'], **engine_parameters)
+ self.engine = create_async_engine(
+ self.config["database_uri"], **engine_parameters
+ )
self.metadata = MetaData()
@asynccontextmanager
@@ -229,7 +237,7 @@ async def get_db_session(self):
# Perform your database operations here pass
```
"""
- logger.debug('Creating new database session')
+ logger.debug("Creating new database session")
try:
# Create a new database session
async with sessionmaker(
@@ -239,8 +247,8 @@ async def get_db_session(self):
yield session
except SQLAlchemyError as e: # pragma: no cover
# Log the error and raise it
- logger.error(f'Database error occurred: {str(e)}') # pragma: no cover
+ logger.error(f"Database error occurred: {str(e)}") # pragma: no cover
raise # pragma: no cover
finally: # pragma: no cover
# Log the end of the database session
- logger.debug('Database session ended') # pragma: no cover
+ logger.debug("Database session ended") # pragma: no cover
diff --git a/dsg_lib/async_database_functions/database_operations.py b/dsg_lib/async_database_functions/database_operations.py
index d7db98a0..d5eb3118 100644
--- a/dsg_lib/async_database_functions/database_operations.py
+++ b/dsg_lib/async_database_functions/database_operations.py
@@ -57,7 +57,9 @@
String, # The String class from SQLAlchemy
func, # The func object from SQLAlchemy
NoResultFound, # The NoResultFound exception from SQLAlchemy
-) = import_sqlalchemy() # Call the function that imports SQLAlchemy and checks its version
+) = (
+ import_sqlalchemy()
+) # Call the function that imports SQLAlchemy and checks its version
def handle_exceptions(ex: Exception) -> Dict[str, str]:
@@ -86,21 +88,21 @@ def handle_exceptions(ex: Exception) -> Dict[str, str]:
```
"""
# Extract the error message before the SQL statement
- error_only = str(ex).split('[SQL:')[0]
+ error_only = str(ex).split("[SQL:")[0]
# Check the type of the exception
if isinstance(ex, IntegrityError):
# Log the error and return the error details
- logger.error(f'IntegrityError occurred: {ex}')
- return {'error': 'IntegrityError', 'details': error_only}
+ logger.error(f"IntegrityError occurred: {ex}")
+ return {"error": "IntegrityError", "details": error_only}
elif isinstance(ex, SQLAlchemyError):
# Log the error and return the error details
- logger.error(f'SQLAlchemyError occurred: {ex}')
- return {'error': 'SQLAlchemyError', 'details': error_only}
+ logger.error(f"SQLAlchemyError occurred: {ex}")
+ return {"error": "SQLAlchemyError", "details": error_only}
else:
# Log the error and return the error details
- logger.error(f'Exception occurred: {ex}')
- return {'error': 'General Exception', 'details': str(ex)}
+ logger.error(f"Exception occurred: {ex}")
+ return {"error": "General Exception", "details": str(ex)}
class DatabaseOperations:
@@ -184,15 +186,14 @@ def __init__(self, async_db: AsyncDatabase):
```
"""
# Log the start of the initialization
- logger.debug('Initializing DatabaseOperations instance')
+ logger.debug("Initializing DatabaseOperations instance")
# Store the AsyncDatabase instance in the async_db attribute This
# instance will be used for performing asynchronous database operations
self.async_db = async_db
# Log the successful initialization
- logger.debug('DatabaseOperations instance initialized successfully')
-
+ logger.debug("DatabaseOperations instance initialized successfully")
async def get_columns_details(self, table):
"""
@@ -253,23 +254,25 @@ async def get_columns_details(self, table):
```
"""
# Log the start of the operation
- logger.debug(f'Starting get_columns_details operation for table: {table.__name__}')
+ logger.debug(
+ f"Starting get_columns_details operation for table: {table.__name__}"
+ )
try:
# Log the start of the column retrieval
- logger.debug(f'Getting columns for table: {table.__name__}')
+ logger.debug(f"Getting columns for table: {table.__name__}")
# Retrieve the details of the columns and store them in a dictionary
# The keys are the column names and the values are dictionaries
# containing the column details
columns = {
c.name: {
- 'type': str(c.type),
- 'nullable': c.nullable,
- 'primary_key': c.primary_key,
- 'unique': c.unique,
- 'autoincrement': c.autoincrement,
- 'default': (
+ "type": str(c.type),
+ "nullable": c.nullable,
+ "primary_key": c.primary_key,
+ "unique": c.unique,
+ "autoincrement": c.autoincrement,
+ "default": (
str(c.default.arg)
if c.default is not None and not callable(c.default.arg)
else None
@@ -279,17 +282,16 @@ async def get_columns_details(self, table):
}
# Log the successful column retrieval
- logger.debug(f'Successfully retrieved columns for table: {table.__name__}')
+ logger.debug(f"Successfully retrieved columns for table: {table.__name__}")
return columns
except Exception as ex: # pragma: no cover
# Handle any exceptions that occur during the column retrieval
logger.error(
- f'An error occurred while getting columns for table: {table.__name__}'
+ f"An error occurred while getting columns for table: {table.__name__}"
) # pragma: no cover
return handle_exceptions(ex) # pragma: no cover
-
async def get_primary_keys(self, table):
"""
Retrieves the primary keys of a given table.
@@ -346,26 +348,25 @@ async def get_primary_keys(self, table):
```
"""
# Log the start of the operation
- logger.debug(f'Starting get_primary_keys operation for table: {table.__name__}')
+ logger.debug(f"Starting get_primary_keys operation for table: {table.__name__}")
try:
# Log the start of the primary key retrieval
- logger.debug(f'Getting primary keys for table: {table.__name__}')
+ logger.debug(f"Getting primary keys for table: {table.__name__}")
# Retrieve the primary keys and store them in a list
primary_keys = table.__table__.primary_key.columns.keys()
# Log the successful primary key retrieval
- logger.debug(f'Primary keys retrieved successfully: {primary_keys}')
+ logger.debug(f"Primary keys retrieved successfully: {primary_keys}")
return primary_keys
except Exception as ex: # pragma: no cover
# Handle any exceptions that occur during the primary key retrieval
- logger.error(f'Exception occurred: {ex}') # pragma: no cover
+ logger.error(f"Exception occurred: {ex}") # pragma: no cover
return handle_exceptions(ex) # pragma: no cover
-
async def get_table_names(self):
"""
Retrieves the names of all tables in the database.
@@ -412,27 +413,26 @@ async def get_table_names(self):
```
"""
# Log the start of the operation
- logger.debug('Starting get_table_names operation')
+ logger.debug("Starting get_table_names operation")
try:
# Log the start of the table name retrieval
- logger.debug('Retrieving table names')
+ logger.debug("Retrieving table names")
# Retrieve the table names and store them in a list The keys of the
# metadata.tables dictionary are the table names
table_names = list(self.async_db.Base.metadata.tables.keys())
# Log the successful table name retrieval
- logger.debug(f'Table names retrieved successfully: {table_names}')
+ logger.debug(f"Table names retrieved successfully: {table_names}")
return table_names
except Exception as ex: # pragma: no cover
# Handle any exceptions that occur during the table name retrieval
- logger.error(f'Exception occurred: {ex}') # pragma: no cover
+ logger.error(f"Exception occurred: {ex}") # pragma: no cover
return handle_exceptions(ex) # pragma: no cover
-
async def create_one(self, record):
"""
Adds a single record to the database.
@@ -482,29 +482,28 @@ async def create_one(self, record):
```
"""
# Log the start of the operation
- logger.debug('Starting create_one operation')
+ logger.debug("Starting create_one operation")
try:
# Start a new database session
async with self.async_db.get_db_session() as session:
# Log the record being added
- logger.debug(f'Adding record to session: {record.__dict__}')
+ logger.debug(f"Adding record to session: {record.__dict__}")
# Add the record to the session and commit the changes
session.add(record)
await session.commit()
# Log the successful record addition
- logger.debug(f'Record added successfully: {record}')
+ logger.debug(f"Record added successfully: {record}")
return record
except Exception as ex:
# Handle any exceptions that occur during the record addition
- logger.error(f'Exception occurred: {ex}')
+ logger.error(f"Exception occurred: {ex}")
return handle_exceptions(ex)
-
async def create_many(self, records):
"""
Adds multiple records to the database.
@@ -557,7 +556,7 @@ async def create_many(self, records):
```
"""
# Log the start of the operation
- logger.debug('Starting create_many operation')
+ logger.debug("Starting create_many operation")
try:
# Start a timer to measure the operation time
@@ -566,7 +565,7 @@ async def create_many(self, records):
# Start a new database session
async with self.async_db.get_db_session() as session:
# Log the number of records being added
- logger.debug(f'Adding {len(records)} records to session')
+ logger.debug(f"Adding {len(records)} records to session")
# Add the records to the session and commit the changes
session.add_all(records)
@@ -574,24 +573,23 @@ async def create_many(self, records):
# Log the added records
records_data = [record.__dict__ for record in records]
- logger.debug(f'Records added to session: {records_data}')
+ logger.debug(f"Records added to session: {records_data}")
# Calculate the operation time and log the successful record
# addition
num_records = len(records)
t1 = time.time() - t0
logger.debug(
- f'Record operations were successful. {num_records} records were created in {t1:.4f} seconds.'
+ f"Record operations were successful. {num_records} records were created in {t1:.4f} seconds."
)
return records
except Exception as ex:
# Handle any exceptions that occur during the record addition
- logger.error(f'Exception occurred: {ex}')
+ logger.error(f"Exception occurred: {ex}")
return handle_exceptions(ex)
-
async def count_query(self, query):
"""
Executes a count query on the database and returns the number of records
@@ -643,29 +641,30 @@ async def count_query(self, query):
```
"""
# Log the start of the operation
- logger.debug('Starting count_query operation')
+ logger.debug("Starting count_query operation")
try:
# Start a new database session
async with self.async_db.get_db_session() as session:
# Log the query being executed
- logger.debug(f'Executing count query: {query}')
+ logger.debug(f"Executing count query: {query}")
# Execute the count query and retrieve the count
- result = await session.execute(select(func.count()).select_from(query.subquery()))
+ result = await session.execute(
+ select(func.count()).select_from(query.subquery())
+ )
count = result.scalar()
# Log the successful query execution
- logger.debug(f'Count query executed successfully. Result: {count}')
+ logger.debug(f"Count query executed successfully. Result: {count}")
return count
except Exception as ex:
# Handle any exceptions that occur during the query execution
- logger.error(f'Exception occurred: {ex}')
+ logger.error(f"Exception occurred: {ex}")
return handle_exceptions(ex)
-
async def read_one_record(self, query):
"""
Retrieves a single record from the database based on the provided query.
@@ -716,34 +715,33 @@ async def read_one_record(self, query):
```
"""
# Log the start of the operation
- logger.debug(f'Starting read_one_record operation for {query}')
+ logger.debug(f"Starting read_one_record operation for {query}")
try:
# Start a new database session
async with self.async_db.get_db_session() as session:
# Log the start of the record retrieval
- logger.debug(f'Getting record with query: {query}')
+ logger.debug(f"Getting record with query: {query}")
# Execute the query and retrieve the first record
result = await session.execute(query)
record = result.scalar_one()
# Log the successful record retrieval
- logger.debug(f'Record retrieved successfully: {record}')
+ logger.debug(f"Record retrieved successfully: {record}")
return record
except NoResultFound:
# No record was found
- logger.debug('No record found')
+ logger.debug("No record found")
return None
except Exception as ex: # pragma: no cover
# Handle any exceptions that occur during the record retrieval
- logger.error(f'Exception occurred: {ex}') # pragma: no cover
+ logger.error(f"Exception occurred: {ex}") # pragma: no cover
return handle_exceptions(ex) # pragma: no cover
-
async def read_query(self, query):
"""
Executes a fetch query on the database and returns a list of records
@@ -788,43 +786,44 @@ async def read_query(self, query):
```
"""
# Log the start of the operation
- logger.debug('Starting read_query operation')
+ logger.debug("Starting read_query operation")
try:
# Start a new database session
async with self.async_db.get_db_session() as session:
# Log the query being executed
- logger.debug(
- f'Executing fetch query: {query}'
- )
+ logger.debug(f"Executing fetch query: {query}")
# Execute the fetch query and retrieve the records
result = await session.execute(query)
records = result.scalars().all()
- logger.debug(f'read_query result: {records}')
+ logger.debug(f"read_query result: {records}")
# Log the successful query execution
- if all(isinstance(record, tuple) for record in records): #pragma: no cover
- logger.debug(f'read_query result is a tuple {type(records)}')
+ if all(
+ isinstance(record, tuple) for record in records
+ ): # pragma: no cover
+ logger.debug(f"read_query result is a tuple {type(records)}")
# If all records are tuples, convert them to dictionaries
records_data = [
- dict(zip(('request_group_id', 'count'), record, strict=False))
+ dict(zip(("request_group_id", "count"), record, strict=False))
for record in records
]
else:
- logger.debug(f'read_query result is a dictionary {type(records)}')
+ logger.debug(f"read_query result is a dictionary {type(records)}")
# Otherwise, try to convert the records to dictionaries using the __dict__ attribute
records_data = [record.__dict__ for record in records]
- logger.debug(f'Fetch query executed successfully. Records: {records_data}')
+ logger.debug(
+ f"Fetch query executed successfully. Records: {records_data}"
+ )
return records
except Exception as ex:
# Handle any exceptions that occur during the query execution
- logger.error(f'Exception occurred: {ex}')
+ logger.error(f"Exception occurred: {ex}")
return handle_exceptions(ex)
-
async def read_multi_query(self, queries: Dict[str, str]):
"""
Executes multiple fetch queries on the database and returns a dictionary
@@ -876,7 +875,7 @@ async def read_multi_query(self, queries: Dict[str, str]):
```
"""
# Log the start of the operation
- logger.debug('Starting read_multi_query operation')
+ logger.debug("Starting read_multi_query operation")
try:
results = {}
@@ -884,7 +883,7 @@ async def read_multi_query(self, queries: Dict[str, str]):
async with self.async_db.get_db_session() as session:
for query_name, query in queries.items():
# Log the query being executed
- logger.debug(f'Executing fetch query: {query}')
+ logger.debug(f"Executing fetch query: {query}")
# Execute the fetch query and retrieve the records
result = await session.execute(query)
@@ -900,10 +899,9 @@ async def read_multi_query(self, queries: Dict[str, str]):
except Exception as ex:
# Handle any exceptions that occur during the query execution
- logger.error(f'Exception occurred: {ex}')
+ logger.error(f"Exception occurred: {ex}")
return handle_exceptions(ex)
-
async def update_one(self, table, record_id: str, new_values: dict):
"""
Updates a single record in the database identified by its ID.
@@ -956,31 +954,31 @@ async def update_one(self, table, record_id: str, new_values: dict):
record = await db_ops.update_one(User, 1, {'name': 'John Smith'})
```
"""
- non_updatable_fields = ['id', 'date_created']
+ non_updatable_fields = ["id", "date_created"]
# Log the start of the operation
logger.debug(
- f'Starting update_one operation for record_id: {record_id} in table: {table.__name__}'
+ f"Starting update_one operation for record_id: {record_id} in table: {table.__name__}"
)
try:
# Start a new database session
async with self.async_db.get_db_session() as session:
# Log the record being fetched
- logger.debug(f'Fetching record with id: {record_id}')
+ logger.debug(f"Fetching record with id: {record_id}")
# Fetch the record
record = await session.get(table, record_id)
if not record:
# Log the error if no record is found
- logger.error(f'No record found with pkid: {record_id}')
+ logger.error(f"No record found with pkid: {record_id}")
return {
- 'error': 'Record not found',
- 'details': f'No record found with pkid {record_id}',
+ "error": "Record not found",
+ "details": f"No record found with pkid {record_id}",
}
# Log the record being updated
- logger.debug(f'Updating record with new values: {new_values}')
+ logger.debug(f"Updating record with new values: {new_values}")
# Update the record with the new values
for key, value in new_values.items():
@@ -989,15 +987,14 @@ async def update_one(self, table, record_id: str, new_values: dict):
await session.commit()
# Log the successful record update
- logger.debug(f'Record updated successfully: {record.pkid}')
+ logger.debug(f"Record updated successfully: {record.pkid}")
return record
except Exception as ex:
# Handle any exceptions that occur during the record update
- logger.error(f'Exception occurred: {ex}')
+ logger.error(f"Exception occurred: {ex}")
return handle_exceptions(ex)
-
async def delete_one(self, table, record_id: str):
"""
Deletes a single record from the database based on the provided table
@@ -1054,56 +1051,57 @@ async def delete_one(self, table, record_id: str):
"""
# Log the start of the operation
logger.debug(
- f'Starting delete_one operation for record_id: {record_id} in table: {table.__name__}'
+ f"Starting delete_one operation for record_id: {record_id} in table: {table.__name__}"
)
try:
# Start a new database session
async with self.async_db.get_db_session() as session:
# Log the record being fetched
- logger.debug(f'Fetching record with id: {record_id}')
+ logger.debug(f"Fetching record with id: {record_id}")
# Fetch the record
record = await session.get(table, record_id)
# If the record doesn't exist, return an error
if not record:
- logger.error(f'No record found with pkid: {record_id}')
+ logger.error(f"No record found with pkid: {record_id}")
return {
- 'error': 'Record not found',
- 'details': f'No record found with pkid {record_id}',
+ "error": "Record not found",
+ "details": f"No record found with pkid {record_id}",
}
# Log the record being deleted
- logger.debug(f'Deleting record with id: {record_id}')
+ logger.debug(f"Deleting record with id: {record_id}")
# Delete the record
await session.delete(record)
# Log the successful record deletion from the session
- logger.debug(f'Record deleted from session: {record}')
+ logger.debug(f"Record deleted from session: {record}")
# Log the start of the commit
- logger.debug(f'Committing changes to delete record with id: {record_id}')
+ logger.debug(
+ f"Committing changes to delete record with id: {record_id}"
+ )
# Commit the changes
await session.commit()
# Log the successful record deletion
- logger.debug(f'Record deleted successfully: {record_id}')
+ logger.debug(f"Record deleted successfully: {record_id}")
- return {'success': 'Record deleted successfully'}
+ return {"success": "Record deleted successfully"}
except Exception as ex:
# Handle any exceptions that occur during the record deletion
- logger.error(f'Exception occurred: {ex}')
+ logger.error(f"Exception occurred: {ex}")
return handle_exceptions(ex)
-
async def delete_many(
self,
table: Type[DeclarativeMeta],
- id_column_name: str = 'pkid',
+ id_column_name: str = "pkid",
id_values: List[int] = None,
) -> int:
"""
@@ -1145,7 +1143,7 @@ async def delete_many(
print(f"Deleted {deleted_count} records.")
```
"""
- if id_values is None: #pragma: no cover
+ if id_values is None: # pragma: no cover
id_values = []
try:
# Start a timer to measure the operation time
@@ -1154,10 +1152,12 @@ async def delete_many(
# Start a new database session
async with self.async_db.get_db_session() as session:
# Log the number of records being deleted
- logger.debug(f'Deleting {len(id_values)} records from session')
+ logger.debug(f"Deleting {len(id_values)} records from session")
# Create delete statement
- stmt = delete(table).where(getattr(table, id_column_name).in_(id_values))
+ stmt = delete(table).where(
+ getattr(table, id_column_name).in_(id_values)
+ )
# Execute the delete statement and fetch result
result = await session.execute(stmt)
@@ -1169,17 +1169,17 @@ async def delete_many(
deleted_count = result.rowcount
# Log the deleted records
- logger.debug(f'Records deleted from session: {deleted_count}')
+ logger.debug(f"Records deleted from session: {deleted_count}")
# Calculate the operation time and log the successful record deletion
t1 = time.time() - t0
logger.debug(
- f'Record operations were successful. {deleted_count} records were deleted in {t1:.4f} seconds.'
+ f"Record operations were successful. {deleted_count} records were deleted in {t1:.4f} seconds."
)
return deleted_count
except Exception as ex:
# Handle any exceptions that occur during the record deletion
- logger.error(f'Exception occurred: {ex}')
+ logger.error(f"Exception occurred: {ex}")
return handle_exceptions(ex)
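Taken together, the methods above form a small async CRUD façade: each call opens a session, logs its progress, and returns either the result or the error dictionary produced by `handle_exceptions`. A minimal wiring sketch (import paths assumed from this repo's layout):

```python
# Minimal wiring sketch for DatabaseOperations; assumes this repo's layout.
import asyncio

from dsg_lib.async_database_functions.async_database import AsyncDatabase
from dsg_lib.async_database_functions.database_config import DBConfig
from dsg_lib.async_database_functions.database_operations import DatabaseOperations


async def main() -> None:
    db_config = DBConfig({"database_uri": "sqlite+aiosqlite:///:memory:"})
    async_db = AsyncDatabase(db_config)
    await async_db.create_tables()  # creates tables for all registered models

    db_ops = DatabaseOperations(async_db)
    # Prints [] when no models are registered, but shows the call shape;
    # on failure the methods return {'error': ..., 'details': ...} instead.
    print(await db_ops.get_table_names())


asyncio.run(main())
```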
diff --git a/dsg_lib/common_functions/calendar_functions.py b/dsg_lib/common_functions/calendar_functions.py
index 74816876..efc4432e 100644
--- a/dsg_lib/common_functions/calendar_functions.py
+++ b/dsg_lib/common_functions/calendar_functions.py
@@ -70,18 +70,18 @@ def get_month(month: int) -> str:
# Define a tuple containing the names of all months
months = (
- 'January',
- 'February',
- 'March',
- 'April',
- 'May',
- 'June',
- 'July',
- 'August',
- 'September',
- 'October',
- 'November',
- 'December',
+ "January",
+ "February",
+ "March",
+ "April",
+ "May",
+ "June",
+ "July",
+ "August",
+ "September",
+ "October",
+ "November",
+ "December",
)
# Convert integer-like floats to integers
@@ -90,16 +90,18 @@ def get_month(month: int) -> str:
# Check if the input month is an integer
if not isinstance(month, int):
- logger.error('Invalid input: %s, integer is required', month)
- return 'Invalid input, integer is required'
+ logger.error("Invalid input: %s, integer is required", month)
+ return "Invalid input, integer is required"
# Check if the input month is within the range of 1-12
if 1 <= month <= 12:
- logger.info('Returning month name for month number: %s', month)
+ logger.info("Returning month name for month number: %s", month)
return months[month - 1]
else:
- logger.error('Invalid input: %s, month number should be between 1 and 12', month)
- return 'Invalid month number'
+ logger.error(
+ "Invalid input: %s, month number should be between 1 and 12", month
+ )
+ return "Invalid month number"
def get_month_number(month_name: str) -> int:
@@ -116,23 +118,23 @@ def get_month_number(month_name: str) -> int:
# Define a dictionary mapping month names to month numbers
month_dict = {
- 'January': 1,
- 'February': 2,
- 'March': 3,
- 'April': 4,
- 'May': 5,
- 'June': 6,
- 'July': 7,
- 'August': 8,
- 'September': 9,
- 'October': 10,
- 'November': 11,
- 'December': 12,
+ "January": 1,
+ "February": 2,
+ "March": 3,
+ "April": 4,
+ "May": 5,
+ "June": 6,
+ "July": 7,
+ "August": 8,
+ "September": 9,
+ "October": 10,
+ "November": 11,
+ "December": 12,
}
# Check if the input month name is a string
if not isinstance(month_name, str):
- logger.error('Invalid input, string is required')
+ logger.error("Invalid input, string is required")
return -1
# Convert the input string to title case and remove leading/trailing spaces
@@ -142,5 +144,5 @@ def get_month_number(month_name: str) -> int:
if month_name in month_dict:
return month_dict[month_name]
else:
- logger.error('Invalid month name: %s', month_name)
+ logger.error("Invalid month name: %s", month_name)
return -1
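Both helpers validate their input and return a sentinel (`-1` or an error string) rather than raising, so callers can branch on the return value; a short sketch:

```python
# Sketch of the sentinel-style error handling in calendar_functions.
from dsg_lib.common_functions.calendar_functions import get_month, get_month_number

print(get_month(10))               # 'October'
print(get_month(13))               # 'Invalid month number'
print(get_month_number(" july "))  # 7 -- input is title-cased and stripped first
print(get_month_number("Juli"))    # -1 -- sentinel for an unknown month name
```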
diff --git a/dsg_lib/common_functions/email_validation.py b/dsg_lib/common_functions/email_validation.py
index f1ec6e04..af35b50f 100644
--- a/dsg_lib/common_functions/email_validation.py
+++ b/dsg_lib/common_functions/email_validation.py
@@ -58,6 +58,7 @@ class DNSType(Enum):
DNS (str): Represents a standard DNS resolver.
TIMEOUT (str): Represents a DNS resolver with a specified timeout.
"""
+
DNS = "dns"
TIMEOUT = "timeout"
@@ -68,12 +69,12 @@ def validate_email_address(
test_environment: bool = False,
allow_smtputf8: bool = False,
allow_empty_local: bool = False,
- allow_quoted_local:bool=False,
+ allow_quoted_local: bool = False,
allow_display_name: bool = False,
allow_domain_literal: bool = False,
globally_deliverable: bool = None,
timeout: int = 10,
- dns_type: str = 'dns',
+ dns_type: str = "dns",
) -> Dict[str, Union[str, bool, Dict[str, Union[str, bool, List[str]]]]]:
"""
Validates an email address and returns a dictionary with the validation result and other information.
@@ -112,7 +113,9 @@ def validate_email_address(
try:
dns_type = DNSType(dns_type.lower())
except ValueError:
- raise ValueError("dns_type must be either 'dns' or 'timeout'. Default is 'dns' if not provided or input is None.")
+ raise ValueError(
+ "dns_type must be either 'dns' or 'timeout'. Default is 'dns' if not provided or input is None."
+ )
# Set up the DNS resolver based on the dns_type
if dns_type == DNSType.DNS:
@@ -158,34 +161,52 @@ def validate_email_address(
# Add the email info and parameters to the return dictionary
email_dict["email_data"] = dict(sorted(vars(emailinfo).items()))
- email_dict["parameters"]=dict(sorted(locals().items()))
+ email_dict["parameters"] = dict(sorted(locals().items()))
# return the dictionary
return email_dict
# Handle EmailUndeliverableError
except EmailUndeliverableError as e:
- error = str(e)
- parameters=dict(sorted(locals().items()))
- email_dict = {"valid": False, "email": email, "error": error,"error_type": "EmailUndeliverableError","parameters":parameters}
+ error = str(e)
+ parameters = dict(sorted(locals().items()))
+ email_dict = {
+ "valid": False,
+ "email": email,
+ "error": error,
+ "error_type": "EmailUndeliverableError",
+ "parameters": parameters,
+ }
logger.error(f"EmailUndeliverableError: {email} - {str(e)}")
logger.debug(f"EmailUndeliverableError: {email} - {str(e)}, - {parameters}")
return email_dict
# Handle EmailNotValidError
except EmailNotValidError as e:
- error = str(e)
- parameters=dict(sorted(locals().items()))
- email_dict = {"valid": False, "email": email, "error": error,"error_type": "EmailNotValidError","parameters":parameters}
+ error = str(e)
+ parameters = dict(sorted(locals().items()))
+ email_dict = {
+ "valid": False,
+ "email": email,
+ "error": error,
+ "error_type": "EmailNotValidError",
+ "parameters": parameters,
+ }
logger.error(f"EmailNotValidError: {email} - {str(e)}")
logger.debug(f"EmailNotValidError: {email} - {str(e)}, - {parameters}")
return email_dict
# Handle other exceptions
- except Exception as e: # pragma: no cover
- error = str(e)
- parameters=dict(sorted(locals().items()))
- email_dict = {"valid": False, "email": email, "error": error,"error_type": "Exception","parameters":parameters}
+ except Exception as e: # pragma: no cover
+ error = str(e)
+ parameters = dict(sorted(locals().items()))
+ email_dict = {
+ "valid": False,
+ "email": email,
+ "error": error,
+ "error_type": "Exception",
+ "parameters": parameters,
+ }
logger.error(f"Exception: {email} - {str(e)}")
logger.debug(f"Exception: {email} - {str(e)}, - {parameters}")
return email_dict
@@ -219,27 +240,20 @@ def validate_email_address(
'this is"not\\allowed@example.com', # spaces, quotes, and backslashes may only exist when within quoted strings and preceded by a backslash
'this\\ still\\"not\\\\allowed@example.com', # even if escaped (preceded by a backslash), spaces, quotes, and backslashes must still be contained by quotes
"1234567890123456789012345678901234567890123456789012345678901234+x@example.com", # local part is longer than 64 characters
-
# Emails with empty local part
"@example.com", # only valid if allow_empty_local is True
-
# Emails with non-ASCII characters
"üñîçøðé@example.com", # only valid if allow_smtputf8 is True
"user@üñîçøðé.com", # only valid if allow_smtputf8 is True
-
# Emails with quoted local part
'"john.doe"@example.com', # only valid if allow_quoted_local is True
'"john..doe"@example.com', # only valid if allow_quoted_local is True
-
# Emails with display name
-    'John Doe <john.doe@example.com>',  # only valid if allow_display_name is True
-
+ "John Doe ", # only valid if allow_display_name is True
# Emails with domain literal
- 'user@[192.0.2.1]', # only valid if allow_domain_literal is True
-
+ "user@[192.0.2.1]", # only valid if allow_domain_literal is True
# Emails with long local part
- "a"*65 + "@example.com", # local part is longer than 64 characters
-
+ "a" * 65 + "@example.com", # local part is longer than 64 characters
# Emails with invalid characters
"john doe@example.com", # space is not allowed
"john@doe@example.com", # only one @ is allowed
@@ -250,8 +264,30 @@ def validate_email_address(
# create a list of configurations
configurations = [
- {"check_deliverability": True, "test_environment": False, "allow_smtputf8": False, "allow_empty_local": False, "allow_quoted_local": False, "allow_display_name": False, "allow_domain_literal": False, "globally_deliverable": None, "timeout": 10, "dns_type": 'timeout'},
- {"check_deliverability": False, "test_environment": True, "allow_smtputf8": True, "allow_empty_local": True, "allow_quoted_local": True, "allow_display_name": True, "allow_domain_literal": True, "globally_deliverable": None, "timeout": 5, "dns_type": 'dns'},
+ {
+ "check_deliverability": True,
+ "test_environment": False,
+ "allow_smtputf8": False,
+ "allow_empty_local": False,
+ "allow_quoted_local": False,
+ "allow_display_name": False,
+ "allow_domain_literal": False,
+ "globally_deliverable": None,
+ "timeout": 10,
+ "dns_type": "timeout",
+ },
+ {
+ "check_deliverability": False,
+ "test_environment": True,
+ "allow_smtputf8": True,
+ "allow_empty_local": True,
+ "allow_quoted_local": True,
+ "allow_display_name": True,
+ "allow_domain_literal": True,
+ "globally_deliverable": None,
+ "timeout": 5,
+ "dns_type": "dns",
+ },
{"check_deliverability": True},
# add more configurations here
]
@@ -260,7 +296,7 @@ def validate_email_address(
import time
t0 = time.time()
- validity=[]
+ validity = []
for email in email_addresses:
for config in configurations:
@@ -268,9 +304,9 @@ def validate_email_address(
res = validate_email_address(email, **config)
validity.append(res)
t1 = time.time()
- validity = sorted(validity, key=lambda x: x['email'])
+ validity = sorted(validity, key=lambda x: x["email"])
for v in validity:
- pprint.pprint(v, indent=4)
+ pprint.pprint(v, indent=4)
print(f"Time taken: {t1 - t0:.2f}")
diff --git a/dsg_lib/common_functions/file_functions.py b/dsg_lib/common_functions/file_functions.py
index ec260547..6fd43ef7 100644
--- a/dsg_lib/common_functions/file_functions.py
+++ b/dsg_lib/common_functions/file_functions.py
@@ -49,10 +49,10 @@
from loguru import logger
# Set the path to the directory where the files are located
-directory_to_files: str = 'data'
+directory_to_files: str = "data"
# A dictionary that maps file types to directories
-directory_map = {'.csv': 'csv', '.json': 'json', '.txt': 'text'}
+directory_map = {".csv": "csv", ".json": "json", ".txt": "text"}
def delete_file(file_name: str) -> str:
@@ -82,18 +82,18 @@ def delete_file(file_name: str) -> str:
# Outputs: 'File deleted successfully'
```
"""
- logger.info(f'Deleting file: {file_name}')
+ logger.info(f"Deleting file: {file_name}")
# Check that the file name is a string
if not isinstance(file_name, str):
- raise TypeError(f'{file_name} is not a valid string')
+ raise TypeError(f"{file_name} is not a valid string")
# Split the file name into its name and extension components
file_name, file_ext = os.path.splitext(file_name)
# Check that the file name does not contain a forward slash or backslash
if os.path.sep in file_name:
- raise ValueError(f'{file_name} cannot contain {os.path.sep}')
+ raise ValueError(f"{file_name} cannot contain {os.path.sep}")
# Check that the file type is supported
if file_ext not in directory_map:
@@ -103,18 +103,18 @@ def delete_file(file_name: str) -> str:
# Construct the full file path
file_directory = Path.cwd() / directory_to_files / directory_map[file_ext]
- file_path = file_directory / f'{file_name}{file_ext}'
+ file_path = file_directory / f"{file_name}{file_ext}"
# Check that the file exists
if not file_path.is_file():
- raise FileNotFoundError(f'file not found: {file_name}{file_ext}')
+ raise FileNotFoundError(f"file not found: {file_name}{file_ext}")
# Delete the file
os.remove(file_path)
- logger.info(f'File {file_name}{file_ext} deleted from file path: {file_path}')
+ logger.info(f"File {file_name}{file_ext} deleted from file path: {file_path}")
# Return a string indicating that the file has been deleted
- return 'complete'
+ return "complete"
def save_json(file_name: str, data, root_folder: str = None) -> str:
@@ -153,19 +153,21 @@ def save_json(file_name: str, data, root_folder: str = None) -> str:
try:
# Validate inputs
if not isinstance(data, (list, dict)):
- raise TypeError(f'data must be a list or a dictionary instead of type {type(data)}')
- if '/' in file_name or '\\' in file_name:
- raise ValueError(f'{file_name} cannot contain / or \\')
+ raise TypeError(
+ f"data must be a list or a dictionary instead of type {type(data)}"
+ )
+ if "/" in file_name or "\\" in file_name:
+ raise ValueError(f"{file_name} cannot contain / or \\")
# Add extension if not present in file_name
- if not file_name.endswith('.json'): # pragma: no cover
- file_name += '.json' # pragma: no cover
+ if not file_name.endswith(".json"): # pragma: no cover
+ file_name += ".json" # pragma: no cover
if root_folder is None:
root_folder = directory_to_files
# Determine directory
- json_directory = Path(root_folder) / 'json'
+ json_directory = Path(root_folder) / "json"
# Construct file path
file_path = json_directory / file_name
@@ -174,16 +176,16 @@ def save_json(file_name: str, data, root_folder: str = None) -> str:
json_directory.mkdir(parents=True, exist_ok=True)
# Write data to file
- with open(file_path, 'w') as write_file:
+ with open(file_path, "w") as write_file:
json.dump(data, write_file)
# Log success message
- logger.info(f'File created: {file_path}')
+ logger.info(f"File created: {file_path}")
- return 'File saved successfully'
+ return "File saved successfully"
except (TypeError, ValueError) as e:
- logger.error(f'Error creating file {file_name}: {e}')
+ logger.error(f"Error creating file {file_name}: {e}")
raise
@@ -203,16 +205,16 @@ def open_json(file_name: str) -> dict:
"""
# Check if file name is a string
if not isinstance(file_name, str):
- error = f'{file_name} is not a valid string'
+ error = f"{file_name} is not a valid string"
logger.error(error)
raise TypeError(error)
- file_directory = Path(directory_to_files) / directory_map['.json']
+ file_directory = Path(directory_to_files) / directory_map[".json"]
file_save = file_directory / file_name
# Check if path correct
if not file_save.is_file():
- error = f'file not found error: {file_save}'
+ error = f"file not found error: {file_save}"
logger.exception(error)
raise FileNotFoundError(error)
@@ -221,7 +223,7 @@ def open_json(file_name: str) -> dict:
# load file into data variable
result = json.load(read_file)
- logger.info(f'File Opened: {file_name}')
+ logger.info(f"File Opened: {file_name}")
return result
@@ -232,7 +234,7 @@ def save_csv(
file_name: str,
data: list,
root_folder: str = None,
- delimiter: str = ',',
+ delimiter: str = ",",
quotechar: str = '"',
) -> str:
"""
@@ -275,43 +277,43 @@ def save_csv(
root_folder = directory_to_files
# Create the csv directory if it does not exist
- csv_directory = Path(root_folder) / 'csv'
+ csv_directory = Path(root_folder) / "csv"
csv_directory.mkdir(parents=True, exist_ok=True)
# Check that delimiter and quotechar are single characters
if len(delimiter) != 1:
- raise TypeError(f'{delimiter} can only be a single character')
+ raise TypeError(f"{delimiter} can only be a single character")
if len(quotechar) != 1:
- raise TypeError(f'{quotechar} can only be a single character')
+ raise TypeError(f"{quotechar} can only be a single character")
# Check that data is a list
if not isinstance(data, list):
- raise TypeError(f'{data} is not a valid list')
+ raise TypeError(f"{data} is not a valid list")
# Check that file_name is a string and does not contain invalid characters
- if not isinstance(file_name, str) or '/' in file_name or '\\' in file_name:
- raise TypeError(f'{file_name} is not a valid file name')
+ if not isinstance(file_name, str) or "/" in file_name or "\\" in file_name:
+ raise TypeError(f"{file_name} is not a valid file name")
# Add extension to file_name if needed
- if not file_name.endswith('.csv'):
- file_name += '.csv'
+ if not file_name.endswith(".csv"):
+ file_name += ".csv"
# Create the file path
file_path = csv_directory / file_name
# Write data to file
- with open(file_path, 'w', encoding='utf-8', newline='') as csv_file:
+ with open(file_path, "w", encoding="utf-8", newline="") as csv_file:
csv_writer = csv.writer(csv_file, delimiter=delimiter, quotechar=quotechar)
csv_writer.writerows(data)
- logger.info(f'File Create: {file_name}')
- return 'complete'
+ logger.info(f"File Create: {file_name}")
+ return "complete"
def open_csv(
file_name: str,
- delimiter: str = ',',
- quote_level: str = 'minimal',
+ delimiter: str = ",",
+ quote_level: str = "minimal",
skip_initial_space: bool = True,
) -> list:
"""
@@ -346,14 +348,14 @@ def open_csv(
"""
# A dictionary that maps quote levels to csv quoting constants
quote_levels = {
- 'none': csv.QUOTE_NONE,
- 'minimal': csv.QUOTE_MINIMAL,
- 'all': csv.QUOTE_ALL,
+ "none": csv.QUOTE_NONE,
+ "minimal": csv.QUOTE_MINIMAL,
+ "all": csv.QUOTE_ALL,
}
# Check that file name is a string
if not isinstance(file_name, str):
- error = f'{file_name} is not a valid string'
+ error = f"{file_name} is not a valid string"
logger.error(error)
raise TypeError(error)
@@ -366,19 +368,19 @@ def open_csv(
quoting = quote_levels[quote_level]
# Add extension to file name and create file path
- file_name = f'{file_name}.csv'
- file_directory = Path.cwd().joinpath(directory_to_files).joinpath('csv')
+ file_name = f"{file_name}.csv"
+ file_directory = Path.cwd().joinpath(directory_to_files).joinpath("csv")
file_path = file_directory.joinpath(file_name)
# Check that file exists
if not file_path.is_file():
- error = f'File not found: {file_path}'
+ error = f"File not found: {file_path}"
logger.error(error)
raise FileNotFoundError(error)
# Read CSV file
data = []
- with file_path.open(encoding='utf-8') as f:
+ with file_path.open(encoding="utf-8") as f:
reader = csv.DictReader(
f,
delimiter=delimiter,
@@ -388,48 +390,48 @@ def open_csv(
for row in reader:
data.append(dict(row))
- logger.info(f'File opened: {file_name}')
+ logger.info(f"File opened: {file_name}")
return data
# A list of first names to randomly select from
first_name: List[str] = [
- 'Amy',
- 'Adam',
- 'Catherine',
- 'Charlotte',
- 'Charles',
- 'Craig',
- 'Deloris',
- 'Doris',
- 'Donna',
- 'Eugene',
- 'Eileen',
- 'Emma',
- 'Gerald',
- 'Geraldine',
- 'Gordon',
- 'Jack',
- 'Jenny',
- 'Kelly',
- 'Kevin',
- 'Linda',
- 'Lyle',
- 'Michael',
- 'Monica',
- 'Nancy',
- 'Noel',
- 'Olive',
- 'Robyn',
- 'Robert',
- 'Ryan',
- 'Sarah',
- 'Sean',
- 'Teresa',
- 'Tim',
- 'Valerie',
- 'Wayne',
- 'William',
+ "Amy",
+ "Adam",
+ "Catherine",
+ "Charlotte",
+ "Charles",
+ "Craig",
+ "Deloris",
+ "Doris",
+ "Donna",
+ "Eugene",
+ "Eileen",
+ "Emma",
+ "Gerald",
+ "Geraldine",
+ "Gordon",
+ "Jack",
+ "Jenny",
+ "Kelly",
+ "Kevin",
+ "Linda",
+ "Lyle",
+ "Michael",
+ "Monica",
+ "Nancy",
+ "Noel",
+ "Olive",
+ "Robyn",
+ "Robert",
+ "Ryan",
+ "Sarah",
+ "Sean",
+ "Teresa",
+ "Tim",
+ "Valerie",
+ "Wayne",
+ "William",
]
@@ -455,11 +457,11 @@ def create_sample_files(file_name: str, sample_size: int) -> None:
# Creates 'test.csv' and 'test.json' each with 100 rows of random data
```
"""
- logger.debug(f'Creating sample files for {file_name} with {sample_size} rows.')
+ logger.debug(f"Creating sample files for {file_name} with {sample_size} rows.")
try:
# Generate the CSV data
- csv_header = ['name', 'birth_date', 'number']
+ csv_header = ["name", "birth_date", "number"]
csv_data: List[List[str]] = [csv_header]
# Generate rows for CSV data
@@ -470,7 +472,7 @@ def create_sample_files(file_name: str, sample_size: int) -> None:
csv_data.append(row)
# Save the CSV file
- csv_file = f'{file_name}.csv'
+ csv_file = f"{file_name}.csv"
save_csv(csv_file, csv_data)
# Generate the JSON data
@@ -481,21 +483,23 @@ def create_sample_files(file_name: str, sample_size: int) -> None:
r_int: int = random.randint(0, len(first_name) - 1)
name = first_name[r_int]
sample_dict: dict = {
- 'name': name,
- 'birthday_date': generate_random_date(),
+ "name": name,
+ "birthday_date": generate_random_date(),
}
json_data.append(sample_dict)
# Save the JSON file
- json_file: str = f'{file_name}.json'
+ json_file: str = f"{file_name}.json"
save_json(json_file, json_data)
# Log the data
- logger.debug(f'CSV Data: {csv_data}')
- logger.debug(f'JSON Data: {json_data}')
+ logger.debug(f"CSV Data: {csv_data}")
+ logger.debug(f"JSON Data: {json_data}")
except Exception as e: # pragma: no cover
- logger.exception(f'Error occurred while creating sample files: {e}') # pragma: no cover
+ logger.exception(
+ f"Error occurred while creating sample files: {e}"
+ ) # pragma: no cover
raise # pragma: no cover
@@ -529,7 +533,7 @@ def generate_random_date() -> str:
date_value: datetime = datetime(year, month, day, hour, minute, second)
# Format the datetime string and return it
- return f'{date_value:%Y-%m-%d %H:%M:%S.%f}'
+ return f"{date_value:%Y-%m-%d %H:%M:%S.%f}"
def save_text(file_name: str, data: str, root_folder: str = None) -> str:
@@ -567,10 +571,10 @@ def save_text(file_name: str, data: str, root_folder: str = None) -> str:
root_folder = directory_to_files # pragma: no cover
# Determine the directory for text files
- text_directory = Path(root_folder) / 'text'
+ text_directory = Path(root_folder) / "text"
# Construct the file path for text files
- file_path = text_directory / f'{file_name}.txt'
+ file_path = text_directory / f"{file_name}.txt"
# Create the text directory if it does not exist
text_directory.mkdir(parents=True, exist_ok=True)
@@ -578,18 +582,18 @@ def save_text(file_name: str, data: str, root_folder: str = None) -> str:
# Check that data is a string and that file_name does not contain invalid
# characters
if not isinstance(data, str):
- logger.error(f'{file_name} is not a valid string')
- raise TypeError(f'{file_name} is not a valid string')
- elif '/' in file_name or '\\' in file_name:
- logger.error(f'{file_name} cannot contain \\ or /')
- raise ValueError(f'{file_name} cannot contain \\ or /')
+ logger.error(f"{file_name} is not a valid string")
+ raise TypeError(f"{file_name} is not a valid string")
+ elif "/" in file_name or "\\" in file_name:
+ logger.error(f"{file_name} cannot contain \\ or /")
+ raise ValueError(f"{file_name} cannot contain \\ or /")
# Open or create the file and write the data
- with open(file_path, 'w+', encoding='utf-8') as file:
+ with open(file_path, "w+", encoding="utf-8") as file:
file.write(data)
- logger.info(f'File created: {file_path}')
- return 'complete'
+ logger.info(f"File created: {file_path}")
+ return "complete"
def open_text(file_name: str) -> str:
@@ -616,25 +620,25 @@ def open_text(file_name: str) -> str:
```
"""
# Replace backslashes with forward slashes in the file name
- if '\\' in file_name: # pragma: no cover
- file_name = file_name.replace('\\', '/') # pragma: no cover
+ if "\\" in file_name: # pragma: no cover
+ file_name = file_name.replace("\\", "/") # pragma: no cover
# Check that file_name does not contain invalid characters
- if '/' in file_name:
- logger.error(f'{file_name} cannot contain /')
- raise TypeError(f'{file_name} cannot contain /')
+ if "/" in file_name:
+ logger.error(f"{file_name} cannot contain /")
+ raise TypeError(f"{file_name} cannot contain /")
# Get the path to the text directory and the file path
- file_directory = os.path.join(directory_to_files, 'text')
+ file_directory = os.path.join(directory_to_files, "text")
file_path = Path.cwd().joinpath(file_directory, file_name)
# Check if the file exists
if not file_path.is_file():
- raise FileNotFoundError(f'file not found error: {file_path}')
+ raise FileNotFoundError(f"file not found error: {file_path}")
# Open the file and read the data
- with open(file_path, 'r', encoding='utf-8') as file:
+ with open(file_path, "r", encoding="utf-8") as file:
data = file.read()
- logger.info(f'File opened: {file_path}')
+ logger.info(f"File opened: {file_path}")
return data
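
Taken together, the helpers in this file read and write under `data/csv`, `data/json`, and `data/text` relative to the working directory (per `directory_map` above). A short usage sketch, based only on the docstrings and return values visible in this diff:

```python
# Usage sketch for file_functions; paths resolve under ./data/<type>/.
from dsg_lib.common_functions.file_functions import (
    create_sample_files,
    open_csv,
    open_json,
    save_csv,
    save_json,
    save_text,
)

save_json("example", {"status": "ok"})   # .json extension added if missing
print(open_json("example.json"))          # open_json expects the extension

save_csv("people", [["name", "number"], ["Amy", "1"]])
print(open_csv("people"))                 # open_csv appends .csv itself

print(save_text("note", "hello world"))   # returns 'complete'

# Creates test.csv and test.json, each with 100 rows of random data.
create_sample_files("test", 100)
```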
diff --git a/dsg_lib/common_functions/folder_functions.py b/dsg_lib/common_functions/folder_functions.py
index baf566e7..e8fb701b 100644
--- a/dsg_lib/common_functions/folder_functions.py
+++ b/dsg_lib/common_functions/folder_functions.py
@@ -45,8 +45,8 @@
from loguru import logger
# Define the directory where the files are located
-directory_to__files: str = 'data'
-file_directory = f'{directory_to__files}/csv'
+directory_to__files: str = "data"
+file_directory = f"{directory_to__files}/csv"
directory_path = Path.cwd().joinpath(file_directory)
@@ -84,7 +84,7 @@ def last_data_files_changed(directory_path: str) -> Tuple[datetime, str]:
# Log a message to indicate that the directory was checked for the last
# modified file
- logger.info(f'Directory checked for last change: {directory_path}')
+ logger.info(f"Directory checked for last change: {directory_path}")
# Return the modification time and path of the last modified file
return time_stamp, file_path
@@ -127,7 +127,7 @@ def get_directory_list(file_directory: str) -> List[str]:
direct_list = [x for x in file_path.iterdir() if x.is_dir()]
# Log a message indicating that the list of directories was retrieved
- logger.info(f'Retrieved list of directories: {file_directory}')
+ logger.info(f"Retrieved list of directories: {file_directory}")
# Return the list of directories
return direct_list
@@ -164,20 +164,20 @@ def make_folder(file_directory: str) -> bool:
# Check if the folder already exists
if file_directory.is_dir():
- error = f'Folder exists: {file_directory}'
+ error = f"Folder exists: {file_directory}"
logger.error(error)
raise FileExistsError(error)
# Check for invalid characters in folder name
invalid_chars = re.findall(r'[<>:"/\\|?*]', file_directory.name)
if invalid_chars:
- error = f'Invalid characters in directory name: {invalid_chars}'
+ error = f"Invalid characters in directory name: {invalid_chars}"
logger.error(error)
raise ValueError(error)
# Create the new folder
Path.mkdir(file_directory)
- logger.info(f'Directory created: {file_directory}')
+ logger.info(f"Directory created: {file_directory}")
return True
@@ -213,7 +213,7 @@ def remove_folder(file_directory: str) -> None:
path.rmdir()
# Log a message indicating that the folder was removed
- logger.info(f'Folder removed: {file_directory}')
+ logger.info(f"Folder removed: {file_directory}")
except FileNotFoundError as err:
# Log an error message if the specified directory does not exist
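
A quick tour of the folder helpers, for orientation. One caveat inferred from the diff rather than from separate docs: the signatures are annotated as `str`, but `make_folder` calls `.is_dir()` and `.name` on its argument, so a `Path` is passed below.

```python
# Usage sketch for folder_functions (assumes ./data already exists).
from pathlib import Path

from dsg_lib.common_functions.folder_functions import (
    get_directory_list,
    last_data_files_changed,
    make_folder,
    remove_folder,
)

scratch = Path.cwd() / "data" / "scratch"
make_folder(scratch)                        # raises FileExistsError if present

print(get_directory_list("data"))           # sub-directories of ./data
print(last_data_files_changed("data/csv"))  # (mtime, path) of newest file

remove_folder(scratch)                      # rmdir(); the folder must be empty
```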
diff --git a/dsg_lib/common_functions/logging_config.py b/dsg_lib/common_functions/logging_config.py
index 1da2a088..edc487a4 100644
--- a/dsg_lib/common_functions/logging_config.py
+++ b/dsg_lib/common_functions/logging_config.py
@@ -1,50 +1,214 @@
# -*- coding: utf-8 -*-
"""
-This module provides a comprehensive logging setup using the loguru library, facilitating easy logging management for Python applications. The `config_log` function, central to this module, allows for extensive customization of logging behavior. It supports specifying the logging directory, log file name, logging level, and controls for log rotation, retention, and formatting among other features. Additionally, it offers advanced options like backtrace and diagnose for in-depth debugging, and the ability to append the application name to the log file for clearer identification.
+This module provides a comprehensive logging setup using the loguru library, facilitating easy logging management for Python applications.
+
+The `config_log` function, central to this module, allows for extensive customization of logging behavior. It supports specifying the logging directory, log file name, logging level, and controls for log rotation, retention, and formatting among other features. Additionally, it offers advanced options like backtrace and diagnose for in-depth debugging, and the ability to append the application name to the log file for clearer identification.
Usage example:
-```python
-from dsg_lib.common_functions.logging_config import config_log
-
-config_log(
- logging_directory='logs', # Directory for storing logs
- log_name='log', # Base name for log files
- logging_level='DEBUG', # Minimum logging level
- log_rotation='100 MB', # Size threshold for log rotation
- log_retention='30 days', # Duration to retain old log files
- enqueue=True, # Enqueue log messages
-)
-
-# Example log messages
-logger.debug("This is a debug message")
-logger.info("This is an info message")
-logger.error("This is an error message")
-logger.warning("This is a warning message")
-logger.critical("This is a critical message")
-```
-
-Author: Mike Ryan
-DateCreated: 2021/07/16
-DateUpdated: 2024/07/24
-
-License: MIT
+ from dsg_lib.common_functions.logging_config import config_log
+
+ config_log(
+ logging_directory='logs', # Directory for storing logs
+ log_name='log', # Base name for log files
+ logging_level='DEBUG', # Minimum logging level
+ log_rotation='100 MB', # Size threshold for log rotation
+ log_retention='30 days', # Duration to retain old log files
+ enqueue=True, # Enqueue log messages
+ )
+
+ # Example log messages
+ logger.debug("This is a debug message")
+ logger.info("This is an info message")
+ logger.error("This is an error message")
+ logger.warning("This is a warning message")
+ logger.critical("This is a critical message")
+
+Attributes:
+ None
+
+Todo:
+ * Add support for additional logging handlers.
+ * Implement asynchronous logging.
+
+Author:
+ Mike Ryan
+
+Date Created:
+ 2021/07/16
+
+Date Updated:
+ 2024/07/27
+
+License:
+ MIT
"""
import logging
import os
-import time
+import shutil
+from datetime import datetime, timedelta
+from multiprocessing import Lock
from pathlib import Path
from uuid import uuid4
from loguru import logger
+rotation_lock = Lock()
+
+class SafeFileSink:
+ """
+ A class to handle safe file logging with rotation and retention policies.
+
+ This class provides mechanisms to manage log files by rotating them based on size and retaining them for a specified duration. It also supports optional compression of log files.
+
+ Attributes:
+ path (str): The path to the log file.
+ rotation_size (int): The size threshold for log rotation in bytes.
+ retention_days (timedelta): The duration to retain old log files.
+ compression (str, optional): The compression method to use for old log files.
+
+ Methods:
+ parse_size(size_str): Parses a size string (e.g., '100MB') and returns the size in bytes.
+ parse_duration(duration_str): Parses a duration string (e.g., '7 days') and returns a timedelta object.
+
+ Example:
+ safe_file_sink = SafeFileSink(
+ path='logs/app.log',
+ rotation='100 MB',
+ retention='30 days',
+ compression='zip'
+ )
+
+ # This will set up a log file at 'logs/app.log' with rotation at 100 MB,
+ # retention for 30 days, and compression using zip.
+ """
+ def __init__(self, path, rotation, retention, compression=None):
+ self.path = path
+ self.rotation_size = self.parse_size(rotation)
+ self.retention_days = self.parse_duration(retention)
+ self.compression = compression
+
+ @staticmethod
+ def parse_size(size_str): # pragma: no cover
+ """
+ Parses a size string and returns the size in bytes.
+
+ Args:
+ size_str (str): The size string (e.g., '100MB').
+
+ Returns:
+ int: The size in bytes.
+ """
+ size_str = size_str.upper()
+ if size_str.endswith('MB'):
+ return int(size_str[:-2]) * 1024 * 1024
+ elif size_str.endswith('GB'):
+ return int(size_str[:-2]) * 1024 * 1024 * 1024
+ elif size_str.endswith('KB'):
+ return int(size_str[:-2]) * 1024
+ else:
+ return int(size_str)
+
+ @staticmethod
+ def parse_duration(duration_str): # pragma: no cover
+ """
+ Parses a duration string and returns a timedelta object.
+
+ Args:
+ duration_str (str): The duration string (e.g., '7 days').
+
+ Returns:
+ timedelta: The duration as a timedelta object.
+ """
+ duration_str = duration_str.lower()
+ if 'day' in duration_str:
+ return timedelta(days=int(duration_str.split()[0]))
+ elif 'hour' in duration_str:
+ return timedelta(hours=int(duration_str.split()[0]))
+ elif 'minute' in duration_str:
+ return timedelta(minutes=int(duration_str.split()[0]))
+ else:
+ return timedelta(days=0)
+
+ def __call__(self, message): # pragma: no cover
+ """
+ Handles the logging of a message, including writing, rotating, and applying retention policies.
+
+ Args:
+ message (str): The log message to be written.
+
+ This method ensures thread-safe logging by acquiring a lock before writing the message,
+ rotating the logs if necessary, and applying the retention policy to remove old log files.
+ """
+ with rotation_lock:
+ self.write_message(message)
+ self.rotate_logs()
+ self.apply_retention()
+
+ def write_message(self, message): # pragma: no cover
+ """
+ Writes a log message to the log file.
+
+ Args:
+ message (str): The log message to be written.
+
+ This method opens the log file in append mode and writes the message to it.
+ """
+ with open(self.path, 'a') as f:
+ f.write(message)
+
+ def rotate_logs(self): # pragma: no cover
+ """
+ Rotates the log file if it exceeds the specified rotation size.
+
+ This method checks the size of the current log file. If the file size exceeds the specified rotation size, it renames the current log file by appending a timestamp to its name. Optionally, it compresses the rotated log file using the specified compression method and removes the original uncompressed file.
+
+ Args:
+ None
+
+ Returns:
+ None
+
+ Raises:
+ OSError: If there is an error renaming or compressing the log file.
+ """
+ if os.path.getsize(self.path) >= self.rotation_size:
+ timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
+ rotated_path = f"{self.path}.{timestamp}"
+ os.rename(self.path, rotated_path)
+ if self.compression:
+ shutil.make_archive(rotated_path, self.compression, root_dir=os.path.dirname(rotated_path), base_dir=os.path.basename(rotated_path))
+ os.remove(rotated_path)
+
+ def apply_retention(self): # pragma: no cover
+ """
+ Applies the retention policy to remove old log files.
+
+ This method iterates through the log files in the directory of the current log file. It checks the modification time of each log file and removes those that are older than the specified retention period.
+
+ Args:
+ None
+
+ Returns:
+ None
+
+ Raises:
+ OSError: If there is an error removing a log file.
+ """
+ now = datetime.now()
+ for filename in os.listdir(os.path.dirname(self.path)):
+ if filename.startswith(os.path.basename(self.path)) and len(filename.split('.')) > 1:
+ file_path = os.path.join(os.path.dirname(self.path), filename)
+ file_time = datetime.fromtimestamp(os.path.getmtime(file_path))
+ if now - file_time > self.retention_days:
+ os.remove(file_path)
def config_log(
- logging_directory: str = 'log',
- log_name: str = 'log',
- logging_level: str = 'INFO',
- log_rotation: str = '100 MB',
- log_retention: str = '30 days',
+ logging_directory: str = "log",
+ log_name: str = "log",
+ logging_level: str = "INFO",
+ log_rotation: str = "100 MB",
+ log_retention: str = "30 days",
log_backtrace: bool = False,
log_format: str = None,
log_serializer: bool = False,
@@ -53,138 +217,132 @@ def config_log(
append_app_name: bool = False,
enqueue: bool = True,
intercept_standard_logging: bool = True,
- file_sink: bool = True,
+ compression: str = 'zip',
):
"""
- Configures and sets up a logger using the loguru package.
-
- Parameters:
- - logging_directory (str): The directory where logs will be stored. Default is "log".
- - log_name (str): The name of the log file. Default is "log" (extension automatically set in 0.12.2).
- - logging_level (str): The logging level. Default is "INFO".
- - log_rotation (str): The log rotation size. Default is "100 MB".
- - log_retention (str): The log retention period. Default is "30 days".
- - log_backtrace (bool): Whether to enable backtrace. Default is False.
- - log_format (str): The log format. Default is None.
- - log_serializer (bool): Whether to disable log serialization. Default is False.
- - log_diagnose (bool): Whether to enable diagnose. Default is False.
- - app_name (str): The application name. Default is None.
- - append_app_name (bool): Whether to append the application name to the log file name. Default is False.
- - enqueue (bool): Whether to enqueue log messages. Default is True.
- - intercept_standard_logging (bool): Whether to intercept standard logging. Default is True.
- - file_sink (bool): Whether to use a file sink. Default is True.
-
- Raises:
- - ValueError: If the provided logging level is not valid.
-
- Usage Example:
- ```python
- from logging_config import config_log
-
- config_log(
- logging_directory='logs',
- log_name='app.log',
- logging_level='DEBUG',
- log_rotation='100 MB',
- log_retention='10 days',
- log_backtrace=True,
- log_format="{time:YYYY-MM-DD HH:mm:ss.SSSSSS} | {level: <8} | {name} :{function} :{line} - {message} ",
- log_serializer=False,
- log_diagnose=True,
- app_name='my_app',
- append_app_name=True,
- enqueue=True,
- intercept_standard_logging=True,
- file_sink=True
- )
- ```
+ Configures the logging settings for the application.
+
+ This function sets up the logging configuration, including the log directory, log file name, logging level, log rotation, retention policies, and other optional settings.
+
+ Args:
+ logging_directory (str): The directory where log files will be stored. Defaults to "log".
+ log_name (str): The base name of the log file. Defaults to "log".
+ logging_level (str): The logging level (e.g., "INFO", "DEBUG"). Defaults to "INFO".
+ log_rotation (str): The size threshold for log rotation (e.g., "100 MB"). Defaults to "100 MB".
+ log_retention (str): The duration to retain old log files (e.g., "30 days"). Defaults to "30 days".
+ log_backtrace (bool): Whether to include backtrace information in logs. Defaults to False.
+ log_format (str, optional): The format string for log messages. Defaults to a predefined format if not provided.
+ log_serializer (bool): Whether to serialize log messages. Defaults to False.
+ log_diagnose (bool): Whether to include diagnostic information in logs. Defaults to False.
+ app_name (str, optional): The name of the application. Defaults to None.
+ append_app_name (bool): Whether to append the application name to the log file name. Defaults to False.
+ enqueue (bool): Whether to enqueue log messages for asynchronous processing. Defaults to True.
+ intercept_standard_logging (bool): Whether to intercept standard logging calls. Defaults to True.
+ compression (str): The compression method for rotated log files (e.g., "zip"). Defaults to 'zip'.
+
+ Returns:
+ None
+
+ Example:
+ config_log(
+ logging_directory='logs',
+ log_name='app_log',
+ logging_level='DEBUG',
+ log_rotation='50 MB',
+ log_retention='7 days',
+ log_backtrace=True,
+ log_format='{time} - {level} - {message}',
+ log_serializer=True,
+ log_diagnose=True,
+ app_name='MyApp',
+ append_app_name=True,
+ enqueue=False,
+ intercept_standard_logging=False,
+ compression='gztar'
+ )
+
+ This will configure the logging settings with the specified parameters, setting up a log file at 'logs/app_log' with rotation at 50 MB, retention for 7 days, and other specified options.
"""
# If the log_name ends with ".log", remove the extension
- if log_name.endswith('.log'):
- log_name = log_name.replace('.log', '') # pragma: no cover
+ if log_name.endswith(".log"):
+ log_name = log_name.replace(".log", "") # pragma: no cover
# If the log_name ends with ".json", remove the extension
- if log_name.endswith('.json'):
- log_name = log_name.replace('.json', '') # pragma: no cover
+ if log_name.endswith(".json"):
+ log_name = log_name.replace(".json", "") # pragma: no cover
# Set default log format if not provided
if log_format is None: # pragma: no cover
- log_format = '{time:YYYY-MM-DD HH:mm:ss.SSSSSS} | {level: <8} | {name} :{function} :{line} - {message} ' # pragma: no cover
+ log_format = "{time:YYYY-MM-DD HH:mm:ss.SSSSSS} | {level: <8} | {name} :{function} :{line} - {message} " # pragma: no cover
if log_serializer is True:
- log_format = '{message}' # pragma: no cover
- log_name = f'{log_name}.json' # pragma: no cover
+ log_format = "{message}" # pragma: no cover
+ log_name = f"{log_name}.json" # pragma: no cover
else:
- log_name = f'{log_name}.log' # pragma: no cover
+ log_name = f"{log_name}.log" # pragma: no cover
# Validate logging level
- log_levels: list = ['DEBUG', 'INFO', 'ERROR', 'WARNING', 'CRITICAL']
+ log_levels: list = ["DEBUG", "INFO", "ERROR", "WARNING", "CRITICAL"]
if logging_level.upper() not in log_levels:
- raise ValueError(f'Invalid logging level: {logging_level}. Valid levels are: {log_levels}')
+ raise ValueError(
+ f"Invalid logging level: {logging_level}. Valid levels are: {log_levels}"
+ )
# Generate unique trace ID
trace_id: str = str(uuid4())
- logger.configure(extra={'app_name': app_name, 'trace_id': trace_id})
+ logger.configure(extra={"app_name": app_name, "trace_id": trace_id})
# Append app name to log format if provided
if app_name is not None:
- log_format += ' | app_name: {extra[app_name]}'
+ log_format += " | app_name: {extra[app_name]}"
# Remove any previously added sinks
logger.remove()
# Append app name to log file name if required
if append_app_name is True and app_name is not None:
- log_name = log_name.replace('.', f'_{app_name}.')
+ log_name = log_name.replace(".", f"_{app_name}.")
# Construct log file path
log_path = Path.cwd().joinpath(logging_directory).joinpath(log_name)
# Add loguru logger with specified configuration
logger.add(
- log_path,
+ SafeFileSink(log_path, rotation=log_rotation, retention=log_retention, compression=compression),
level=logging_level.upper(),
format=log_format,
enqueue=enqueue,
backtrace=log_backtrace,
- rotation=log_rotation,
- retention=log_retention,
- compression='zip',
serialize=log_serializer,
diagnose=log_diagnose,
)
- basic_config_handlers:list = []
+ basic_config_handlers: list = []
class InterceptHandler(logging.Handler):
"""
- Interceptor for standard logging.
-
- This class intercepts standard Python logging messages and redirects
- them to the Loguru logger. It is used as a handler for the standard
- Python logger.
-
- Attributes:
- level (str): The minimum severity level of messages that this
- handler should handle.
+ A logging handler that intercepts standard logging messages and redirects them to Loguru.
- Usage Example: ```python from dsg_lib.logging_config import
- InterceptHandler import logging
+ This handler captures log messages from the standard logging module and forwards them to Loguru, preserving the log level and message details.
- # Create a standard Python logger logger =
- logging.getLogger('my_logger')
+ Methods:
+ emit(record):
+ Emits a log record to Loguru.
+ """
- # Create an InterceptHandler handler = InterceptHandler()
+ def emit(self, record):
+ """
+ Emits a log record to Loguru.
- # Add the InterceptHandler to the logger logger.addHandler(handler)
+ This method captures the log record, determines the appropriate Loguru log level, and logs the message using Loguru. It also handles exceptions and finds the caller's frame to maintain accurate log information.
- # Now, when you log a message using the standard Python logger, it will
- be intercepted and redirected to the Loguru logger logger.info('This is
- an info message') ```
- """
+ Args:
+ record (logging.LogRecord): The log record to be emitted.
- def emit(self, record):
+ Returns:
+ None
+ """
# Get corresponding Loguru level if it exists
try:
level = logger.level(record.levelname).name
@@ -202,7 +360,6 @@ def emit(self, record):
level, record.getMessage()
) # pragma: no cover
-
if intercept_standard_logging:
# Add interceptor handler to all existing loggers
for name in logging.Logger.manager.loggerDict:
@@ -214,44 +371,9 @@ def emit(self, record):
# Set the root logger's level to the lowest level possible
logging.getLogger().setLevel(logging.NOTSET)
-
- class ResilientFileSink:
- def __init__(self, path, retry_count=5, retry_delay=1):
- self.path = path
- self.retry_count = retry_count
- self.retry_delay = retry_delay
- # Ensure the directory exists
- os.makedirs(os.path.dirname(path), exist_ok=True)
-
- def __call__(self, message): # pragma: no cover
- attempt = 0
- while attempt < self.retry_count:
- try:
- # Open the file in append mode and write the message
- with open(self.path, 'a') as file:
- file.write(message)
- break # Exit the loop if write succeeds
- except Exception as e:
- attempt += 1
- time.sleep(self.retry_delay)
- if attempt == self.retry_count:
- print(f"Failed to log message after {self.retry_count} attempts: {e}")
-
-
- if file_sink:
- # Create an instance of ResilientFileSink if file_sink is True
- resilient_sink = ResilientFileSink(str(log_path))
-
- # Append the ResilientFileSink instance to the handlers list
- basic_config_handlers.append(resilient_sink)
-
if intercept_standard_logging:
# Append an InterceptHandler instance to the handlers list if intercept_standard_logging is True
basic_config_handlers.append(InterceptHandler())
- if len(basic_config_handlers) > 0:
- # Configure the root logger if there are any handlers specified
- logging.basicConfig(handlers=basic_config_handlers, level=logging_level.upper())
-
if len(basic_config_handlers) > 0:
logging.basicConfig(handlers=basic_config_handlers, level=logging_level.upper())
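
The net effect of this file's changes: the built-in loguru file sink (and the old `ResilientFileSink`) is replaced by `SafeFileSink`, which serializes writes behind a multiprocessing lock and performs its own size-based rotation and age-based retention. A minimal configuration sketch mirroring the new docstring follows; note that the `compression` value is handed straight to `shutil.make_archive`, so it must be a valid archive format such as `zip` or `gztar`.

```python
# Minimal sketch of the new SafeFileSink-backed config_log() entry point.
import logging

from dsg_lib.common_functions.logging_config import config_log
from loguru import logger

config_log(
    logging_directory="logs",
    log_name="app_log",        # becomes logs/app_log.log
    logging_level="DEBUG",
    log_rotation="50 MB",      # SafeFileSink.parse_size -> 50 * 1024 * 1024
    log_retention="7 days",    # SafeFileSink.parse_duration -> timedelta(days=7)
    compression="zip",         # shutil.make_archive format for rotated files
)

logger.debug("written through SafeFileSink")

# intercept_standard_logging defaults to True, so standard-library loggers
# are routed into the same sink via InterceptHandler:
logging.getLogger("third_party").warning("also captured")
```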
diff --git a/dsg_lib/common_functions/patterns.py b/dsg_lib/common_functions/patterns.py
index e46fd990..394eb492 100644
--- a/dsg_lib/common_functions/patterns.py
+++ b/dsg_lib/common_functions/patterns.py
@@ -43,7 +43,9 @@
from loguru import logger
-def pattern_between_two_char(text_string: str, left_characters: str, right_characters: str) -> dict:
+def pattern_between_two_char(
+ text_string: str, left_characters: str, right_characters: str
+) -> dict:
"""
Searches for all patterns between two characters (left and right) in a given
string using regular expressions.
@@ -108,29 +110,29 @@ def pattern_between_two_char(text_string: str, left_characters: str, right_chara
# Create a regex pattern that matches all substrings between target
# characters
- pattern = f'{esc_left_char}(.+?){esc_right_char}'
+ pattern = f"{esc_left_char}(.+?){esc_right_char}"
# Replace \w with . to match any printable UTF-8 character
- pattern = pattern.replace(r'\w', r'.')
+ pattern = pattern.replace(r"\w", r".")
# Search for all patterns and store them in pattern_list variable
pattern_list = re.findall(pattern, esc_text)
# Create a dictionary to store match details
results: dict = {
- 'found': pattern_list,
- 'matched_found': len(pattern_list),
- 'pattern_parameters': {
- 'left_character': esc_left_char,
- 'right_character': esc_right_char,
- 'regex_pattern': pattern,
- 'text_string': esc_text,
+ "found": pattern_list,
+ "matched_found": len(pattern_list),
+ "pattern_parameters": {
+ "left_character": esc_left_char,
+ "right_character": esc_right_char,
+ "regex_pattern": pattern,
+ "text_string": esc_text,
},
}
# Log matched pattern(s) found using 'debug' log level
if len(pattern_list) > 0:
- logger.debug(f'Matched pattern(s): {pattern_list}')
+ logger.debug(f"Matched pattern(s): {pattern_list}")
# Log successful function execution using 'info' log level
logger.info("Successfully executed 'pattern_between_two_char' function")
@@ -139,15 +141,15 @@ def pattern_between_two_char(text_string: str, left_characters: str, right_chara
except ValueError as e: # pragma: no cover
# capture exception and return error in case of invalid input parameters
results: dict = {
- 'error': str(e),
- 'matched_found': 0,
- 'pattern_parameters': {
- 'left_character': left_characters,
- 'right_character': right_characters,
- 'regex_pattern': None,
- 'text_string': text_string,
+ "error": str(e),
+ "matched_found": 0,
+ "pattern_parameters": {
+ "left_character": left_characters,
+ "right_character": right_characters,
+ "regex_pattern": None,
+ "text_string": text_string,
},
}
# logger of regex error using 'critical' log level
- logger.critical(f'Failed to generate regex pattern with error: {e}')
+ logger.critical(f"Failed to generate regex pattern with error: {e}")
return results
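
For reference, the quote-style changes above do not alter the result shape; a small example of `pattern_between_two_char` and the dictionary it returns:

```python
# Example call matching the result dictionary built above.
from dsg_lib.common_functions.patterns import pattern_between_two_char

result = pattern_between_two_char(
    text_string="<one> and <two>",
    left_characters="<",
    right_characters=">",
)
print(result["found"])           # ['one', 'two']
print(result["matched_found"])   # 2
print(result["pattern_parameters"]["regex_pattern"])
```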
diff --git a/dsg_lib/fastapi_functions/_all_codes.py b/dsg_lib/fastapi_functions/_all_codes.py
index e7562dd4..80ce62f2 100644
--- a/dsg_lib/fastapi_functions/_all_codes.py
+++ b/dsg_lib/fastapi_functions/_all_codes.py
@@ -39,318 +39,318 @@
ALL_HTTP_CODES = {
100: {
- 'description': 'Continue',
- 'extended_description': 'The client should continue with its request. This interim response indicates that everything so far is OK and that the client should continue with the request, or ignore it if it is already finished.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/100',
+ "description": "Continue",
+ "extended_description": "The client should continue with its request. This interim response indicates that everything so far is OK and that the client should continue with the request, or ignore it if it is already finished.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/100",
},
101: {
- 'description': 'Switching Protocols',
- 'extended_description': "The server understands and is willing to comply with the client's request, via the Upgrade message header field, for a change in the application protocol being used on this connection.",
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/101',
+ "description": "Switching Protocols",
+ "extended_description": "The server understands and is willing to comply with the client's request, via the Upgrade message header field, for a change in the application protocol being used on this connection.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/101",
},
102: {
- 'description': 'Processing',
- 'extended_description': 'This code indicates that the server has received and is processing the request, but no response is available yet.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/102',
+ "description": "Processing",
+ "extended_description": "This code indicates that the server has received and is processing the request, but no response is available yet.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/102",
},
103: {
- 'description': 'Early Hints',
- 'extended_description': 'This status code is primarily intended to be used with the Link header, letting the user agent start preloading resources while the server is still preparing a response.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/103',
+ "description": "Early Hints",
+ "extended_description": "This status code is primarily intended to be used with the Link header, letting the user agent start preloading resources while the server is still preparing a response.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/103",
},
200: {
- 'description': 'OK',
- 'extended_description': 'The request has succeeded.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/200',
+ "description": "OK",
+ "extended_description": "The request has succeeded.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/200",
},
201: {
- 'description': 'Created',
- 'extended_description': 'The request has been fulfilled and has resulted in a new resource being created.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/201',
+ "description": "Created",
+ "extended_description": "The request has been fulfilled and has resulted in a new resource being created.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/201",
},
202: {
- 'description': 'Accepted',
- 'extended_description': 'The request has been accepted for processing, but the processing has not been completed.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/202',
+ "description": "Accepted",
+ "extended_description": "The request has been accepted for processing, but the processing has not been completed.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/202",
},
203: {
- 'description': 'Non-Authoritative Information',
- 'extended_description': 'The server successfully processed the request, but is returning information that may be from another source.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/203',
+ "description": "Non-Authoritative Information",
+ "extended_description": "The server successfully processed the request, but is returning information that may be from another source.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/203",
},
204: {
- 'description': 'No Content',
- 'extended_description': 'The server successfully processed the request and is not returning any content.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/204',
+ "description": "No Content",
+ "extended_description": "The server successfully processed the request and is not returning any content.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/204",
},
205: {
- 'description': 'Reset Content',
- 'extended_description': 'The server successfully processed the request, but is not returning any content. Unlike a 204 response, this response requires that the requester reset the document view.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/205',
+ "description": "Reset Content",
+ "extended_description": "The server successfully processed the request, but is not returning any content. Unlike a 204 response, this response requires that the requester reset the document view.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/205",
},
206: {
- 'description': 'Partial Content',
- 'extended_description': 'The server is delivering only part of the resource due to a range header sent by the client.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/206',
+ "description": "Partial Content",
+ "extended_description": "The server is delivering only part of the resource due to a range header sent by the client.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/206",
},
207: {
- 'description': 'Multi-Status',
- 'extended_description': 'The message body that follows is an XML message and can contain a number of separate response codes, depending on how many sub-requests were made.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/207',
+ "description": "Multi-Status",
+ "extended_description": "The message body that follows is an XML message and can contain a number of separate response codes, depending on how many sub-requests were made.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/207",
},
208: {
- 'description': 'Already Reported',
- 'extended_description': 'The members of a DAV binding have already been enumerated in a preceding part of the (multistatus) response, and are not being included again.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/208',
+ "description": "Already Reported",
+ "extended_description": "The members of a DAV binding have already been enumerated in a preceding part of the (multistatus) response, and are not being included again.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/208",
},
226: {
- 'description': 'IM Used',
- 'extended_description': 'The server has fulfilled a request for the resource, and the response is a representation of the result of one or more instance-manipulations applied to the current instance.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/226',
+ "description": "IM Used",
+ "extended_description": "The server has fulfilled a request for the resource, and the response is a representation of the result of one or more instance-manipulations applied to the current instance.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/226",
},
300: {
- 'description': 'Multiple Choices',
- 'extended_description': 'Indicates multiple options for the resource that the client may follow.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/300',
+ "description": "Multiple Choices",
+ "extended_description": "Indicates multiple options for the resource that the client may follow.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/300",
},
301: {
- 'description': 'Moved Permanently',
- 'extended_description': 'This and all future requests should be directed to the given URI.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/301',
+ "description": "Moved Permanently",
+ "extended_description": "This and all future requests should be directed to the given URI.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/301",
},
302: {
- 'description': 'Found',
- 'extended_description': 'Tells the client to look at (browse to) another URL.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/302',
+ "description": "Found",
+ "extended_description": "Tells the client to look at (browse to) another URL.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/302",
},
303: {
- 'description': 'See Other',
- 'extended_description': 'The response to the request can be found under another URI using the GET method.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/303',
+ "description": "See Other",
+ "extended_description": "The response to the request can be found under another URI using the GET method.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/303",
},
304: {
- 'description': 'Not Modified',
- 'extended_description': 'Indicates that the resource has not been modified since the version specified by the request headers If-Modified-Since or If-None-Match.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/304',
+ "description": "Not Modified",
+ "extended_description": "Indicates that the resource has not been modified since the version specified by the request headers If-Modified-Since or If-None-Match.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/304",
},
305: {
- 'description': 'Use Proxy',
- 'extended_description': 'The requested resource is available only through a proxy, the address for which is provided in the response.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/305',
+ "description": "Use Proxy",
+ "extended_description": "The requested resource is available only through a proxy, the address for which is provided in the response.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/305",
},
306: {
- 'description': '(Unused)',
- 'extended_description': "No longer used. Originally meant 'Subsequent requests should use the specified proxy.'",
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/306',
+ "description": "(Unused)",
+ "extended_description": "No longer used. Originally meant 'Subsequent requests should use the specified proxy.'",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/306",
},
307: {
- 'description': 'Temporary Redirect',
- 'extended_description': 'The request should be repeated with another URI, but future requests should still use the original URI.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/307',
+ "description": "Temporary Redirect",
+ "extended_description": "The request should be repeated with another URI, but future requests should still use the original URI.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/307",
},
308: {
- 'description': 'Permanent Redirect',
- 'extended_description': 'The request, and all future requests should be repeated using another URI.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/308',
+ "description": "Permanent Redirect",
+ "extended_description": "The request, and all future requests should be repeated using another URI.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/308",
},
400: {
- 'description': 'Bad Request',
- 'extended_description': 'The server could not understand the request due to invalid syntax.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400',
+ "description": "Bad Request",
+ "extended_description": "The server could not understand the request due to invalid syntax.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400",
},
401: {
- 'description': 'Unauthorized',
- 'extended_description': 'The request requires user authentication.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401',
+ "description": "Unauthorized",
+ "extended_description": "The request requires user authentication.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/401",
},
402: {
- 'description': 'Payment Required',
- 'extended_description': 'This code is reserved for future use.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/402',
+ "description": "Payment Required",
+ "extended_description": "This code is reserved for future use.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/402",
},
403: {
- 'description': 'Forbidden',
- 'extended_description': 'The server understood the request, but it refuses to authorize it.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403',
+ "description": "Forbidden",
+ "extended_description": "The server understood the request, but it refuses to authorize it.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403",
},
404: {
- 'description': 'Not Found',
- 'extended_description': 'The server can not find the requested resource.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404',
+ "description": "Not Found",
+ "extended_description": "The server can not find the requested resource.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404",
},
405: {
- 'description': 'Method Not Allowed',
- 'extended_description': 'The method specified in the request is not allowed for the resource identified by the request URI.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/405',
+ "description": "Method Not Allowed",
+ "extended_description": "The method specified in the request is not allowed for the resource identified by the request URI.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/405",
},
406: {
- 'description': 'Not Acceptable',
- 'extended_description': 'The resource identified by the request is only capable of generating response entities which have content characteristics not acceptable according to the accept headers sent in the request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/406',
+ "description": "Not Acceptable",
+ "extended_description": "The resource identified by the request is only capable of generating response entities which have content characteristics not acceptable according to the accept headers sent in the request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/406",
},
407: {
- 'description': 'Proxy Authentication Required',
- 'extended_description': 'The client must first authenticate itself with the proxy.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/407',
+ "description": "Proxy Authentication Required",
+ "extended_description": "The client must first authenticate itself with the proxy.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/407",
},
408: {
- 'description': 'Request Timeout',
- 'extended_description': 'The server timed out waiting for the request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/408',
+ "description": "Request Timeout",
+ "extended_description": "The server timed out waiting for the request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/408",
},
409: {
- 'description': 'Conflict',
- 'extended_description': 'The request could not be completed due to a conflict with the current state of the resource.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/409',
+ "description": "Conflict",
+ "extended_description": "The request could not be completed due to a conflict with the current state of the resource.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/409",
},
410: {
- 'description': 'Gone',
- 'extended_description': 'The requested resource is no longer available at the server and no forwarding address is known.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/410',
+ "description": "Gone",
+ "extended_description": "The requested resource is no longer available at the server and no forwarding address is known.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/410",
},
411: {
- 'description': 'Length Required',
- 'extended_description': 'The server refuses to accept the request without a defined Content-Length.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/411',
+ "description": "Length Required",
+ "extended_description": "The server refuses to accept the request without a defined Content-Length.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/411",
},
412: {
- 'description': 'Precondition Failed',
- 'extended_description': 'The precondition given in one or more of the request-header fields evaluated to false when it was tested on the server.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/412',
+ "description": "Precondition Failed",
+ "extended_description": "The precondition given in one or more of the request-header fields evaluated to false when it was tested on the server.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/412",
},
413: {
- 'description': 'Payload Too Large',
- 'extended_description': 'The server is refusing to process a request because the request payload is larger than the server is willing or able to process.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/413',
+ "description": "Payload Too Large",
+ "extended_description": "The server is refusing to process a request because the request payload is larger than the server is willing or able to process.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/413",
},
414: {
- 'description': 'URI Too Long',
- 'extended_description': 'The server is refusing to service the request because the request-target is longer than the server is willing to interpret.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/414',
+ "description": "URI Too Long",
+ "extended_description": "The server is refusing to service the request because the request-target is longer than the server is willing to interpret.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/414",
},
415: {
- 'description': 'Unsupported Media Type',
- 'extended_description': 'The media format of the requested data is not supported by the server, so the server is rejecting the request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/415',
+ "description": "Unsupported Media Type",
+ "extended_description": "The media format of the requested data is not supported by the server, so the server is rejecting the request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/415",
},
416: {
- 'description': 'Range Not Satisfiable',
- 'extended_description': "The range specified in the Range header field of the request can't be fulfilled; it's possible that the range is outside the size of the target URI's data.",
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/416',
+ "description": "Range Not Satisfiable",
+ "extended_description": "The range specified in the Range header field of the request can't be fulfilled; it's possible that the range is outside the size of the target URI's data.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/416",
},
417: {
- 'description': 'Expectation Failed',
- 'extended_description': 'The expectation given in the Expect request header could not be met by the server.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/417',
+ "description": "Expectation Failed",
+ "extended_description": "The expectation given in the Expect request header could not be met by the server.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/417",
},
418: {
- 'description': "I'm a teapot",
- 'extended_description': "The server refuses to brew coffee because it is, permanently, a teapot. A combined coffee/tea pot that is temporarily out of coffee should instead return 503. This error is a reference to Hyper Text Coffee Pot Control Protocol defined in April Fools' jokes in 1998 and 2014.",
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/418',
+ "description": "I'm a teapot",
+ "extended_description": "The server refuses to brew coffee because it is, permanently, a teapot. A combined coffee/tea pot that is temporarily out of coffee should instead return 503. This error is a reference to Hyper Text Coffee Pot Control Protocol defined in April Fools' jokes in 1998 and 2014.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/418",
},
421: {
- 'description': 'Misdirected Request',
- 'extended_description': 'The request was directed at a server that is not able to produce a response. This can be sent by a server that is not configured to produce responses for the combination of scheme and authority that are included in the request URI.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/421',
+ "description": "Misdirected Request",
+ "extended_description": "The request was directed at a server that is not able to produce a response. This can be sent by a server that is not configured to produce responses for the combination of scheme and authority that are included in the request URI.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/421",
},
422: {
- 'description': 'Unprocessable Entity',
- 'extended_description': 'The server understands the content type of the request entity, and the syntax of the request entity is correct, but it was unable to process the contained instructions.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/422',
+ "description": "Unprocessable Entity",
+ "extended_description": "The server understands the content type of the request entity, and the syntax of the request entity is correct, but it was unable to process the contained instructions.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/422",
},
423: {
- 'description': 'Locked',
- 'extended_description': 'The resource that is being accessed is locked.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/423',
+ "description": "Locked",
+ "extended_description": "The resource that is being accessed is locked.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/423",
},
424: {
- 'description': 'Failed Dependency',
- 'extended_description': 'The request failed due to failure of a previous request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/424',
+ "description": "Failed Dependency",
+ "extended_description": "The request failed due to failure of a previous request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/424",
},
425: {
- 'description': 'Too Early',
- 'extended_description': 'Indicates that the server is unwilling to risk processing a request that might be replayed.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/425',
+ "description": "Too Early",
+ "extended_description": "Indicates that the server is unwilling to risk processing a request that might be replayed.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/425",
},
426: {
- 'description': 'Upgrade Required',
- 'extended_description': 'The server refuses to perform the request using the current protocol but might be willing to do so after the client upgrades to a different protocol. The server sends an Upgrade header in a 426 response to indicate the required protocol(s).',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/426',
+ "description": "Upgrade Required",
+ "extended_description": "The server refuses to perform the request using the current protocol but might be willing to do so after the client upgrades to a different protocol. The server sends an Upgrade header in a 426 response to indicate the required protocol(s).",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/426",
},
428: {
- 'description': 'Precondition Required',
- 'extended_description': "The origin server requires the request to be conditional. This response is intended to prevent the 'lost update' problem, where a client GETs a resource's state, modifies it, and PUTs it back to the server, when meanwhile a third party has modified the state on the server, leading to a conflict.",
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/428',
+ "description": "Precondition Required",
+ "extended_description": "The origin server requires the request to be conditional. This response is intended to prevent the 'lost update' problem, where a client GETs a resource's state, modifies it, and PUTs it back to the server, when meanwhile a third party has modified the state on the server, leading to a conflict.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/428",
},
429: {
- 'description': 'Too Many Requests',
- 'extended_description': "The user has sent too many requests in a given amount of time ('rate limiting').",
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/429',
+ "description": "Too Many Requests",
+ "extended_description": "The user has sent too many requests in a given amount of time ('rate limiting').",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/429",
},
431: {
- 'description': 'Request Header Fields Too Large',
- 'extended_description': 'The server is unwilling to process the request because its header fields are too large.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/431',
+ "description": "Request Header Fields Too Large",
+ "extended_description": "The server is unwilling to process the request because its header fields are too large.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/431",
},
451: {
- 'description': 'Unavailable For Legal Reasons',
- 'extended_description': 'The server is denying access to the resource as a consequence of a legal demand.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/451',
+ "description": "Unavailable For Legal Reasons",
+ "extended_description": "The server is denying access to the resource as a consequence of a legal demand.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/451",
},
500: {
- 'description': 'Internal Server Error',
- 'extended_description': 'The server encountered an unexpected condition that prevented it from fulfilling the request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/500',
+ "description": "Internal Server Error",
+ "extended_description": "The server encountered an unexpected condition that prevented it from fulfilling the request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/500",
},
501: {
- 'description': 'Not Implemented',
- 'extended_description': 'The server does not support the functionality required to fulfill the request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/501',
+ "description": "Not Implemented",
+ "extended_description": "The server does not support the functionality required to fulfill the request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/501",
},
502: {
- 'description': 'Bad Gateway',
- 'extended_description': 'The server, while acting as a gateway or proxy, received an invalid response from an inbound server it accessed while attempting to fulfill the request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/502',
+ "description": "Bad Gateway",
+ "extended_description": "The server, while acting as a gateway or proxy, received an invalid response from an inbound server it accessed while attempting to fulfill the request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/502",
},
503: {
- 'description': 'Service Unavailable',
- 'extended_description': 'The server is currently unable to handle the request due to a temporary overload or scheduled maintenance, which will likely be alleviated after some delay.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503',
+ "description": "Service Unavailable",
+ "extended_description": "The server is currently unable to handle the request due to a temporary overload or scheduled maintenance, which will likely be alleviated after some delay.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503",
},
504: {
- 'description': 'Gateway Timeout',
- 'extended_description': 'The server, while acting as a gateway or proxy, did not receive a timely response from an upstream server it needed to access in order to complete the request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/504',
+ "description": "Gateway Timeout",
+ "extended_description": "The server, while acting as a gateway or proxy, did not receive a timely response from an upstream server it needed to access in order to complete the request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/504",
},
505: {
- 'description': 'HTTP Version Not Supported',
- 'extended_description': 'The server does not support, or refuses to support, the major version of HTTP that was used in the request message.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/505',
+ "description": "HTTP Version Not Supported",
+ "extended_description": "The server does not support, or refuses to support, the major version of HTTP that was used in the request message.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/505",
},
506: {
- 'description': 'Variant Also Negotiates',
- 'extended_description': 'The server has an internal configuration error: the chosen variant resource is configured to engage in transparent content negotiation itself, and is therefore not a proper end point in the negotiation process.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/506',
+ "description": "Variant Also Negotiates",
+ "extended_description": "The server has an internal configuration error: the chosen variant resource is configured to engage in transparent content negotiation itself, and is therefore not a proper end point in the negotiation process.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/506",
},
507: {
- 'description': 'Insufficient Storage',
- 'extended_description': 'The method could not be performed on the resource because the server is unable to store the representation needed to successfully complete the request.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/507',
+ "description": "Insufficient Storage",
+ "extended_description": "The method could not be performed on the resource because the server is unable to store the representation needed to successfully complete the request.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/507",
},
508: {
- 'description': 'Loop Detected',
- 'extended_description': "The server detected an infinite loop while processing a request with 'Depth: infinity'. This status indicates that the entire operation failed.",
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/508',
+ "description": "Loop Detected",
+ "extended_description": "The server detected an infinite loop while processing a request with 'Depth: infinity'. This status indicates that the entire operation failed.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/508",
},
510: {
- 'description': 'Not Extended',
- 'extended_description': 'Further extensions to the request are required for the server to fulfill it.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/510',
+ "description": "Not Extended",
+ "extended_description": "Further extensions to the request are required for the server to fulfill it.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/510",
},
511: {
- 'description': 'Network Authentication Required',
- 'extended_description': 'The client needs to authenticate to gain network access. Intended for use by intercepting proxies used to control access to the network.',
- 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/511',
+ "description": "Network Authentication Required",
+ "extended_description": "The client needs to authenticate to gain network access. Intended for use by intercepting proxies used to control access to the network.",
+ "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/511",
},
}
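
For reference, each entry in this status-code map carries a short description, an extended description, and an MDN link. A minimal lookup sketch, assuming the dictionary is exposed as ALL_HTTP_CODES in dsg_lib.fastapi_functions.http_codes (the module patched next):

    from dsg_lib.fastapi_functions.http_codes import ALL_HTTP_CODES

    # Look up the metadata for a single status code.
    entry = ALL_HTTP_CODES[503]
    print(entry["description"])  # "Service Unavailable"
    print(entry["link"])         # MDN reference page for the code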
diff --git a/dsg_lib/fastapi_functions/http_codes.py b/dsg_lib/fastapi_functions/http_codes.py
index bb25c65b..55282088 100644
--- a/dsg_lib/fastapi_functions/http_codes.py
+++ b/dsg_lib/fastapi_functions/http_codes.py
@@ -87,17 +87,19 @@ def generate_code_dict(codes, description_only=False):
if description_only:
# Log the operation
- logger.debug(f'description_only is True and returning HTTP codes: {codes}')
+ logger.debug(f"description_only is True and returning HTTP codes: {codes}")
# If description_only is True, return a dictionary where each key is an
# HTTP error code from the input list and each value is the
# corresponding description from the ALL_HTTP_CODES dictionary.
return {
- code: ALL_HTTP_CODES[code]['description'] for code in codes if code in ALL_HTTP_CODES
+ code: ALL_HTTP_CODES[code]["description"]
+ for code in codes
+ if code in ALL_HTTP_CODES
}
else:
# Log the operation
- logger.debug(f'returning HTTP codes: {codes}')
+ logger.debug(f"returning HTTP codes: {codes}")
# If description_only is False, return a dictionary where each key is an
# HTTP error code from the input list and each value is the
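
Based on the signature and comments above, generate_code_dict filters the requested codes against ALL_HTTP_CODES and returns either full entries or bare descriptions; unknown codes are silently dropped by the `if code in ALL_HTTP_CODES` guard. A minimal usage sketch:

    from dsg_lib.fastapi_functions.http_codes import generate_code_dict

    # Full metadata per code, e.g. for FastAPI's `responses=` parameter.
    responses = generate_code_dict([404, 422, 500])

    # Descriptions only, e.g. {404: "Not Found", 422: "Unprocessable Entity", ...}
    names = generate_code_dict([404, 422, 500], description_only=True)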
diff --git a/dsg_lib/fastapi_functions/system_health_endpoints.py b/dsg_lib/fastapi_functions/system_health_endpoints.py
index 97c968f4..b7a32cb4 100644
--- a/dsg_lib/fastapi_functions/system_health_endpoints.py
+++ b/dsg_lib/fastapi_functions/system_health_endpoints.py
@@ -147,15 +147,17 @@ def create_health_router(config: dict):
from fastapi import APIRouter, HTTPException, status
from fastapi.responses import ORJSONResponse
except ImportError: # pragma: no cover
- APIRouter = HTTPException = status = ORJSONResponse = fastapi = None # pragma: no cover
+ APIRouter = HTTPException = status = ORJSONResponse = fastapi = (
+ None # pragma: no cover
+ )
# Check FastAPI version
- min_version = '0.100.0' # replace with your minimum required version
+ min_version = "0.100.0" # replace with your minimum required version
if fastapi is not None and packaging_version.parse(
fastapi.__version__
) < packaging_version.parse(min_version):
raise ImportError(
- f'FastAPI version >= {min_version} required, run `pip install --upgrade fastapi`'
+ f"FastAPI version >= {min_version} required, run `pip install --upgrade fastapi`"
) # pragma: no cover
# Store the start time of the application
@@ -170,11 +172,11 @@ def create_health_router(config: dict):
router = APIRouter()
# Check if the status endpoint is enabled in the configuration
- if config.get('enable_status_endpoint', True):
+ if config.get("enable_status_endpoint", True):
# Define the status endpoint
@router.get(
- '/status',
- tags=['system-health'],
+ "/status",
+ tags=["system-health"],
status_code=status.HTTP_200_OK,
response_class=ORJSONResponse,
responses=status_response,
@@ -223,14 +225,14 @@ async def health_status():
```
"""
# Log the status request
- logger.info('Health status of up returned')
+ logger.info("Health status of up returned")
# Return a dictionary with the status of the application
- return {'status': 'UP'}
+ return {"status": "UP"}
# Check if the uptime endpoint is enabled in the configuration
- if config.get('enable_uptime_endpoint', True):
+ if config.get("enable_uptime_endpoint", True):
# Define the uptime endpoint
- @router.get('/uptime', response_class=ORJSONResponse, responses=status_response)
+ @router.get("/uptime", response_class=ORJSONResponse, responses=status_response)
async def get_uptime():
"""
Calculate and return the uptime of the application.
@@ -289,23 +291,25 @@ async def get_uptime():
# Log the uptime
logger.info(
- f'Uptime: {int(days)} days, {int(hours)} hours, {int(minutes)} minutes, {round(seconds, 2)} seconds'
+ f"Uptime: {int(days)} days, {int(hours)} hours, {int(minutes)} minutes, {round(seconds, 2)} seconds"
)
# Return a dictionary with the uptime. The dictionary has keys for
# days, hours, minutes, and seconds
return {
- 'uptime': {
- 'Days': int(days),
- 'Hours': int(hours),
- 'Minutes': int(minutes),
- 'Seconds': round(seconds, 2),
+ "uptime": {
+ "Days": int(days),
+ "Hours": int(hours),
+ "Minutes": int(minutes),
+ "Seconds": round(seconds, 2),
}
}
- if config.get('enable_heapdump_endpoint', True):
+ if config.get("enable_heapdump_endpoint", True):
- @router.get('/heapdump', response_class=ORJSONResponse, responses=status_response)
+ @router.get(
+ "/heapdump", response_class=ORJSONResponse, responses=status_response
+ )
async def get_heapdump():
"""
Returns a heap dump of the application.
@@ -361,7 +365,7 @@ async def get_heapdump():
# Take a snapshot of the current memory usage
snapshot = tracemalloc.take_snapshot()
# Get the top 10 lines consuming memory
- top_stats = snapshot.statistics('traceback')
+ top_stats = snapshot.statistics("traceback")
heap_dump = []
for stat in top_stats[:10]:
@@ -370,24 +374,26 @@ async def get_heapdump():
# Add the frame to the heap dump
heap_dump.append(
{
- 'filename': frame.filename,
- 'lineno': frame.lineno,
- 'size': stat.size,
- 'count': stat.count,
+ "filename": frame.filename,
+ "lineno": frame.lineno,
+ "size": stat.size,
+ "count": stat.count,
}
)
- logger.debug(f'Heap dump returned {heap_dump}')
+ logger.debug(f"Heap dump returned {heap_dump}")
memory_use = tracemalloc.get_traced_memory()
return {
- 'memory_use': {
- 'current': f'{memory_use[0]:,}',
- 'peak': f'{memory_use[1]:,}',
+ "memory_use": {
+ "current": f"{memory_use[0]:,}",
+ "peak": f"{memory_use[1]:,}",
},
- 'heap_dump': heap_dump,
+ "heap_dump": heap_dump,
}
except Exception as ex:
- logger.error(f'Error in get_heapdump: {ex}')
- raise HTTPException(status_code=500, detail=f'Error in get_heapdump: {ex}')
+ logger.error(f"Error in get_heapdump: {ex}")
+ raise HTTPException(
+ status_code=500, detail=f"Error in get_heapdump: {ex}"
+ )
return router
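
Putting the pieces together: create_health_router takes a config dict whose three enable_* flags switch the endpoints above on or off. A minimal wiring sketch, mirroring the example application patched below:

    from fastapi import FastAPI

    from dsg_lib.fastapi_functions import system_health_endpoints

    app = FastAPI()
    config_health = {
        "enable_status_endpoint": True,
        "enable_uptime_endpoint": True,
        "enable_heapdump_endpoint": False,  # heap dumps are costly; opt in deliberately
    }
    app.include_router(
        system_health_endpoints.create_health_router(config=config_health),
        prefix="/api/health",
        tags=["system-health"],
    )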
diff --git a/examples/fastapi_example.py b/examples/fastapi_example.py
index 3d68ea88..92016403 100644
--- a/examples/fastapi_example.py
+++ b/examples/fastapi_example.py
@@ -26,12 +26,14 @@
from dsg_lib.common_functions import logging_config
from dsg_lib.fastapi_functions import system_health_endpoints # , system_tools_endpoints
-logging_config.config_log(logging_level='INFO', log_serializer=False, log_name='log.log')
+logging_config.config_log(
+ logging_level="INFO", log_serializer=False, log_name="log.log"
+)
@asynccontextmanager
async def lifespan(app: FastAPI):
- logger.info('starting up')
+ logger.info("starting up")
# Create the tables in the database
await async_db.create_tables()
@@ -39,17 +41,17 @@ async def lifespan(app: FastAPI):
if create_users:
await create_a_bunch_of_users(single_entry=2000, many_entries=20000)
yield
- logger.info('shutting down')
+ logger.info("shutting down")
# Create an instance of the FastAPI class
app = FastAPI(
- title='FastAPI Example', # The title of the API
- description='This is an example of a FastAPI application using the DevSetGo Toolkit.', # A brief description of the API
- version='0.1.0', # The version of the API
- docs_url='/docs', # The URL where the API documentation will be served
- redoc_url='/redoc', # The URL where the ReDoc documentation will be served
- openapi_url='/openapi.json', # The URL where the OpenAPI schema will be served
+ title="FastAPI Example", # The title of the API
+ description="This is an example of a FastAPI application using the DevSetGo Toolkit.", # A brief description of the API
+ version="0.1.0", # The version of the API
+ docs_url="/docs", # The URL where the API documentation will be served
+ redoc_url="/redoc", # The URL where the ReDoc documentation will be served
+ openapi_url="/openapi.json", # The URL where the OpenAPI schema will be served
debug=True, # Enable debug mode
middleware=[], # A list of middleware to include in the application
routes=[], # A list of routes to include in the application
@@ -57,7 +59,7 @@ async def lifespan(app: FastAPI):
)
-@app.get('/')
+@app.get("/")
async def root():
"""
Root endpoint of API
@@ -65,33 +67,32 @@ async def root():
Redirects to the OpenAPI docs
"""
# redirect to openapi docs
- logger.info('Redirecting to OpenAPI docs')
- response = RedirectResponse(url='/docs')
+ logger.info("Redirecting to OpenAPI docs")
+ response = RedirectResponse(url="/docs")
return response
-
config_health = {
- 'enable_status_endpoint': True,
- 'enable_uptime_endpoint': True,
- 'enable_heapdump_endpoint': True,
+ "enable_status_endpoint": True,
+ "enable_uptime_endpoint": True,
+ "enable_heapdump_endpoint": True,
}
app.include_router(
system_health_endpoints.create_health_router(config=config_health),
- prefix='/api/health',
- tags=['system-health'],
+ prefix="/api/health",
+ tags=["system-health"],
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
- 'database_uri': 'sqlite+aiosqlite:///:memory:?cache=shared',
- 'echo': False,
- 'future': True,
+ "database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
+ "echo": False,
+ "future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
- 'pool_recycle': 3600,
+ "pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
@@ -108,9 +109,9 @@ class User(base_schema.SchemaBaseSQLite, async_db.Base):
User table storing user details like first name, last name, and email
"""
- __tablename__ = 'users'
+ __tablename__ = "users"
__table_args__ = {
- 'comment': 'User table storing user details like first name, last name, and email'
+ "comment": "User table storing user details like first name, last name, and email"
}
first_name = Column(String(50), unique=False, index=True) # First name of the user
@@ -119,7 +120,7 @@ class User(base_schema.SchemaBaseSQLite, async_db.Base):
String(200), unique=True, index=True, nullable=True
) # Email of the user, must be unique
addresses = relationship(
- 'Address', order_by='Address.pkid', back_populates='user'
+ "Address", order_by="Address.pkid", back_populates="user"
) # Relationship to the Address class
@@ -128,97 +129,101 @@ class Address(base_schema.SchemaBaseSQLite, async_db.Base):
Address table storing address details like street, city, and zip code
"""
- __tablename__ = 'addresses'
+ __tablename__ = "addresses"
__table_args__ = {
- 'comment': 'Address table storing address details like street, city, and zip code'
+ "comment": "Address table storing address details like street, city, and zip code"
}
street = Column(String(200), unique=False, index=True) # Street of the address
city = Column(String(200), unique=False, index=True) # City of the address
zip = Column(String(50), unique=False, index=True) # Zip code of the address
- user_id = Column(String(36), ForeignKey('users.pkid')) # Foreign key to the User table
- user = relationship('User', back_populates='addresses') # Relationship to the User class
+ user_id = Column(
+ String(36), ForeignKey("users.pkid")
+ ) # Foreign key to the User table
+ user = relationship(
+ "User", back_populates="addresses"
+ ) # Relationship to the User class
async def create_a_bunch_of_users(single_entry=0, many_entries=0):
- logger.info(f'single_entry: {single_entry}')
+ logger.info(f"single_entry: {single_entry}")
await async_db.create_tables()
# Create a list to hold the user data
# Create a loop to generate user data
- for _ in tqdm(range(single_entry), desc='executing one'):
+ for _ in tqdm(range(single_entry), desc="executing one"):
value = secrets.token_hex(16)
user = User(
- first_name=f'First{value}',
- last_name=f'Last{value}',
- email=f'user{value}@example.com',
+ first_name=f"First{value}",
+ last_name=f"Last{value}",
+ email=f"user{value}@example.com",
)
- logger.info(f'created_users: {user}')
+ logger.info(f"created_users: {user}")
await db_ops.create_one(user)
users = []
# Create a loop to generate user data
- for i in tqdm(range(many_entries), desc='executing many'):
+ for i in tqdm(range(many_entries), desc="executing many"):
value_one = secrets.token_hex(4)
value_two = secrets.token_hex(8)
user = User(
- first_name=f'First{value_one}{i}{value_two}',
- last_name=f'Last{value_one}{i}{value_two}',
- email=f'user{value_one}{i}{value_two}@example.com',
+ first_name=f"First{value_one}{i}{value_two}",
+ last_name=f"Last{value_one}{i}{value_two}",
+ email=f"user{value_one}{i}{value_two}@example.com",
)
- logger.info(f'created_users: {user.first_name}')
+ logger.info(f"created_users: {user.first_name}")
users.append(user)
# Use db_ops to add the users to the database
await db_ops.create_many(users)
-@app.get('/database/get-primary-key', tags=['Database Examples'])
+@app.get("/database/get-primary-key", tags=["Database Examples"])
async def table_primary_key():
- logger.info('Getting primary key of User table')
+ logger.info("Getting primary key of User table")
pk = await db_ops.get_primary_keys(User)
- logger.info(f'Primary key of User table: {pk}')
- return {'pk': pk}
+ logger.info(f"Primary key of User table: {pk}")
+ return {"pk": pk}
-@app.get('/database/get-column-details', tags=['Database Examples'])
+@app.get("/database/get-column-details", tags=["Database Examples"])
async def table_column_details():
- logger.info('Getting column details of User table')
+ logger.info("Getting column details of User table")
columns = await db_ops.get_columns_details(User)
- logger.info(f'Column details of User table: {columns}')
- return {'columns': columns}
+ logger.info(f"Column details of User table: {columns}")
+ return {"columns": columns}
-@app.get('/database/get-tables', tags=['Database Examples'])
+@app.get("/database/get-tables", tags=["Database Examples"])
async def table_table_details():
- logger.info('Getting table names')
+ logger.info("Getting table names")
tables = await db_ops.get_table_names()
- logger.info(f'Table names: {tables}')
- return {'table_names': tables}
+ logger.info(f"Table names: {tables}")
+ return {"table_names": tables}
-@app.get('/database/get-count', tags=['Database Examples'])
+@app.get("/database/get-count", tags=["Database Examples"])
async def get_count():
- logger.info('Getting count of users')
+ logger.info("Getting count of users")
count = await db_ops.count_query(Select(User))
- logger.info(f'Count of users: {count}')
- return {'count': count}
+ logger.info(f"Count of users: {count}")
+ return {"count": count}
-@app.get('/database/get-all', tags=['Database Examples'])
+@app.get("/database/get-all", tags=["Database Examples"])
async def get_all(offset: int = 0, limit: int = Query(100, le=100000, ge=1)):
- logger.info(f'Getting all users with offset {offset} and limit {limit}')
+ logger.info(f"Getting all users with offset {offset} and limit {limit}")
records = await db_ops.read_query(Select(User).offset(offset).limit(limit))
- logger.info(f'Retrieved {len(records)} users')
- return {'records': records}
+ logger.info(f"Retrieved {len(records)} users")
+ return {"records": records}
-@app.get('/database/get-one-record', tags=['Database Examples'])
+@app.get("/database/get-one-record", tags=["Database Examples"])
async def read_one_record(record_id: str):
- logger.info(f'Reading one record with id {record_id}')
+ logger.info(f"Reading one record with id {record_id}")
record = await db_ops.read_one_record(Select(User).where(User.pkid == record_id))
- logger.info(f'Record with id {record_id}: {record}')
+ logger.info(f"Record with id {record_id}: {record}")
return record
@@ -232,102 +237,106 @@ class UserCreate(UserBase):
pass
-@app.post('/database/create-one-record', status_code=201, tags=['Database Examples'])
+@app.post("/database/create-one-record", status_code=201, tags=["Database Examples"])
async def create_one_record(new_user: UserCreate):
- logger.info(f'Creating one record: {new_user}')
+ logger.info(f"Creating one record: {new_user}")
user = User(**new_user.dict())
record = await db_ops.create_one(user)
- logger.info(f'Created record: {record}')
+ logger.info(f"Created record: {record}")
return record
-@app.post('/database/create-many-records', status_code=201, tags=['Database Examples'])
+@app.post("/database/create-many-records", status_code=201, tags=["Database Examples"])
async def create_many_records(number_of_users: int = Query(100, le=1000, ge=1)):
- logger.info(f'Creating {number_of_users} records')
+ logger.info(f"Creating {number_of_users} records")
t0 = time.time()
users = []
# Create a loop to generate user data
- for i in tqdm(range(number_of_users), desc='executing many'):
+ for i in tqdm(range(number_of_users), desc="executing many"):
value_one = secrets.token_hex(4)
value_two = secrets.token_hex(8)
user = User(
- first_name=f'First{value_one}{i}{value_two}',
- last_name=f'Last{value_one}{i}{value_two}',
- email=f'user{value_one}{i}{value_two}@example.com',
+ first_name=f"First{value_one}{i}{value_two}",
+ last_name=f"Last{value_one}{i}{value_two}",
+ email=f"user{value_one}{i}{value_two}@example.com",
)
- logger.info(f'Created user: {user.first_name}')
+ logger.info(f"Created user: {user.first_name}")
users.append(user)
# Use db_ops to add the users to the database
await db_ops.create_many(users)
t1 = time.time()
- process_time = format(t1 - t0, '.4f')
- logger.info(f'Created {number_of_users} records in {process_time} seconds')
- return {'number_of_users': number_of_users, 'process_time': process_time}
+ process_time = format(t1 - t0, ".4f")
+ logger.info(f"Created {number_of_users} records in {process_time} seconds")
+ return {"number_of_users": number_of_users, "process_time": process_time}
-@app.put('/database/update-one-record', status_code=200, tags=['Database Examples'])
+@app.put("/database/update-one-record", status_code=200, tags=["Database Examples"])
async def update_one_record(
id: str = Body(
...,
- description='UUID to update',
- examples=['6087cce8-0bdd-48c2-ba96-7d557dae843e'],
+ description="UUID to update",
+ examples=["6087cce8-0bdd-48c2-ba96-7d557dae843e"],
),
- first_name: str = Body(..., examples=['Agent']),
- last_name: str = Body(..., examples=['Smith']),
- email: str = Body(..., examples=['jim@something.com']),
+ first_name: str = Body(..., examples=["Agent"]),
+ last_name: str = Body(..., examples=["Smith"]),
+ email: str = Body(..., examples=["jim@something.com"]),
):
- logger.info(f'Updating one record with id {id}')
+ logger.info(f"Updating one record with id {id}")
# adding date_updated to new_values since it is not supported in sqlite
# and other databases may not support it either
new_values = {
- 'first_name': first_name,
- 'last_name': last_name,
- 'email': email,
- 'date_updated': datetime.datetime.now(datetime.timezone.utc),
+ "first_name": first_name,
+ "last_name": last_name,
+ "email": email,
+ "date_updated": datetime.datetime.now(datetime.timezone.utc),
}
record = await db_ops.update_one(table=User, record_id=id, new_values=new_values)
- logger.info(f'Updated record with id {id}')
+ logger.info(f"Updated record with id {id}")
return record
-@app.delete('/database/delete-one-record', status_code=200, tags=['Database Examples'])
+@app.delete("/database/delete-one-record", status_code=200, tags=["Database Examples"])
async def delete_one_record(record_id: str = Body(...)):
- logger.info(f'Deleting one record with id {record_id}')
+ logger.info(f"Deleting one record with id {record_id}")
record = await db_ops.delete_one(table=User, record_id=record_id)
- logger.info(f'Deleted record with id {record_id}')
+ logger.info(f"Deleted record with id {record_id}")
return record
@app.delete(
- '/database/delete-many-records-aka-this-is-a-bad-idea',
+ "/database/delete-many-records-aka-this-is-a-bad-idea",
status_code=201,
- tags=['Database Examples'],
+ tags=["Database Examples"],
)
-async def delete_many_records(id_values: list = Body(...), id_column_name: str = 'pkid'):
- logger.info(f'Deleting many records with ids {id_values}')
- record = await db_ops.delete_many(table=User, id_column_name='pkid', id_values=id_values)
- logger.info(f'Deleted records with ids {id_values}')
+async def delete_many_records(
+ id_values: list = Body(...), id_column_name: str = "pkid"
+):
+ logger.info(f"Deleting many records with ids {id_values}")
+ record = await db_ops.delete_many(
+ table=User, id_column_name="pkid", id_values=id_values
+ )
+ logger.info(f"Deleted records with ids {id_values}")
return record
@app.get(
- '/database/get-list-of-records-to-paste-into-delete-many-records',
- tags=['Database Examples'],
+ "/database/get-list-of-records-to-paste-into-delete-many-records",
+ tags=["Database Examples"],
)
async def read_list_of_records(
offset: int = Query(0, le=1000, ge=0), limit: int = Query(100, le=10000, ge=1)
):
- logger.info(f'Reading list of records with offset {offset} and limit {limit}')
+ logger.info(f"Reading list of records with offset {offset} and limit {limit}")
records = await db_ops.read_query(Select(User), offset=offset, limit=limit)
records_list = []
for record in records:
records_list.append(record.pkid)
- logger.info(f'Read list of records: {records_list}')
+ logger.info(f"Read list of records: {records_list}")
return records_list
-if __name__ == '__main__':
+if __name__ == "__main__":
import uvicorn
- uvicorn.run(app, host='127.0.0.1', port=5000)
+ uvicorn.run(app, host="127.0.0.1", port=5000)
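
A quick smoke test for the routes defined above; a minimal sketch assuming the example module is importable and its startup work (table creation, sample users) is acceptable in a test context. FastAPI's TestClient drives the app in-process:

    from fastapi.testclient import TestClient

    from examples.fastapi_example import app  # assumed import path for this example

    with TestClient(app) as client:  # the context manager runs the lifespan hooks
        resp = client.get("/api/health/status")
        assert resp.status_code == 200
        assert resp.json() == {"status": "UP"}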
diff --git a/examples/log_example.py b/examples/log_example.py
index 509a71ac..983a555a 100644
--- a/examples/log_example.py
+++ b/examples/log_example.py
@@ -5,6 +5,7 @@
License: MIT
"""
import logging
+import multiprocessing
import secrets
import threading
@@ -25,12 +26,11 @@
log_diagnose=True,
# app_name='my_app',
# append_app_name=True,
- file_sink=True,
intercept_standard_logging=True,
- enqueue=False
+ enqueue=False,
)
-
+# @logger.catch
def div_zero(x, y):
try:
return x / y
@@ -39,9 +39,13 @@ def div_zero(x, y):
logging.error(f'{e}')
-@logger.catch
+# @logger.catch
def div_zero_two(x, y):
- return x / y
+ try:
+ return x / y
+ except ZeroDivisionError as e:
+ logger.error(f'{e}')
+ logging.error(f'{e}')
@@ -67,19 +71,40 @@ def log_big_string(lqty=100, size=256):
logging.critical('This is a critical message')
-def worker(wqty=100, lqty=100, size=256):
- for _ in tqdm(range(wqty), ascii=True): # Adjusted for demonstration
+def worker(wqty=1000, lqty=100, size=256):
+ for _ in tqdm(range(wqty), ascii=True, leave=True): # Adjusted for demonstration
log_big_string(lqty=lqty, size=size)
-def main(wqty=100, lqty=100, size=256, workers=2):
- threads = []
- for _ in range(workers): # Create workers threads
- t = threading.Thread(target=worker, args=(wqty, lqty, size,))
- threads.append(t)
- t.start()
+def main(
+    wqty: int = 100,
+    lqty: int = 10,
+    size: int = 256,
+    workers: int = 16,
+    thread_test: bool = False,
+    process_test: bool = False,
+):
+ if process_test:
+ processes = []
+ # Create worker processes
+ for _ in tqdm(range(workers), desc="Multi-Processing Start", leave=True):
+            p = multiprocessing.Process(target=worker, args=(wqty, lqty, size))
+ processes.append(p)
+ p.start()
+
+        for p in tqdm(processes, desc="Multi-Processing Gather", leave=False):
+ p.join(timeout=60) # Timeout after 60 seconds
+ if p.is_alive():
+ logger.error(f"Process {p.name} is hanging. Terminating.")
+ p.terminate()
+ p.join()
+
+ if thread_test:
+ threads = []
+ for _ in tqdm(range(workers), desc="Threading Start", leave=True): # Create worker threads
+ t = threading.Thread(target=worker, args=(wqty, lqty, size,))
+ threads.append(t)
+ t.start()
+
+ for t in tqdm(threads, desc="Threading Gather", leave=False):
+ t.join()
- for t in threads:
- t.join()
if __name__ == "__main__":
- main(wqty=100, lqty=10, size=256, workers=10)
+ from time import time
+ start = time()
+ main(wqty=10, lqty=100, size=64, workers=16, thread_test=False, process_test=True)
+ print(f"Execution time: {time()-start:.2f} seconds")
diff --git a/examples/validate_emails.py b/examples/validate_emails.py
index fb118e51..77f9e4fa 100644
--- a/examples/validate_emails.py
+++ b/examples/validate_emails.py
@@ -1,4 +1,4 @@
- # -*- coding: utf-8 -*-
+# -*- coding: utf-8 -*-
"""
This module is used to validate a list of email addresses using various configurations.
@@ -71,27 +71,20 @@
'this is"not\\allowed@example.com', # spaces, quotes, and backslashes may only exist when within quoted strings and preceded by a backslash
'this\\ still\\"not\\\\allowed@example.com', # even if escaped (preceded by a backslash), spaces, quotes, and backslashes must still be contained by quotes
"1234567890123456789012345678901234567890123456789012345678901234+x@example.com", # local part is longer than 64 characters
-
# Emails with empty local part
"@example.com", # only valid if allow_empty_local is True
-
# Emails with non-ASCII characters
"üñîçøðé@example.com", # only valid if allow_smtputf8 is True
"user@üñîçøðé.com", # only valid if allow_smtputf8 is True
-
# Emails with quoted local part
'"john.doe"@example.com', # only valid if allow_quoted_local is True
'"john..doe"@example.com', # only valid if allow_quoted_local is True
-
# Emails with display name
- 'John Doe ', # only valid if allow_display_name is True
-
+ "John Doe ", # only valid if allow_display_name is True
# Emails with domain literal
- 'user@[192.0.2.1]', # only valid if allow_domain_literal is True
-
+ "user@[192.0.2.1]", # only valid if allow_domain_literal is True
# Emails with long local part
- "a"*65 + "@example.com", # local part is longer than 64 characters
-
+ "a" * 65 + "@example.com", # local part is longer than 64 characters
# Emails with invalid characters
"john doe@example.com", # space is not allowed
"john@doe@example.com", # only one @ is allowed
@@ -102,20 +95,119 @@
# create a list of configurations
configurations = [
- {"check_deliverability": True, "test_environment": False, "allow_smtputf8": False, "allow_empty_local": False, "allow_quoted_local": False, "allow_display_name": False, "allow_domain_literal": False, "globally_deliverable": None, "timeout": 10, "dns_type": 'timeout'},
- {"check_deliverability": False, "test_environment": True, "allow_smtputf8": True, "allow_empty_local": True, "allow_quoted_local": True, "allow_display_name": True, "allow_domain_literal": True, "globally_deliverable": None, "timeout": 5, "dns_type": 'dns'},
+ {
+ "check_deliverability": True,
+ "test_environment": False,
+ "allow_smtputf8": False,
+ "allow_empty_local": False,
+ "allow_quoted_local": False,
+ "allow_display_name": False,
+ "allow_domain_literal": False,
+ "globally_deliverable": None,
+ "timeout": 10,
+ "dns_type": "timeout",
+ },
+ {
+ "check_deliverability": False,
+ "test_environment": True,
+ "allow_smtputf8": True,
+ "allow_empty_local": True,
+ "allow_quoted_local": True,
+ "allow_display_name": True,
+ "allow_domain_literal": True,
+ "globally_deliverable": None,
+ "timeout": 5,
+ "dns_type": "dns",
+ },
{"check_deliverability": True},
- {"check_deliverability": False, "test_environment": False, "allow_smtputf8": True, "allow_empty_local": False, "allow_quoted_local": True, "allow_display_name": False, "allow_domain_literal": True, "globally_deliverable": None, "timeout": 15, "dns_type": 'timeout'},
- {"check_deliverability": True, "test_environment": True, "allow_smtputf8": False, "allow_empty_local": True, "allow_quoted_local": False, "allow_display_name": True, "allow_domain_literal": False, "globally_deliverable": None, "timeout": 20, "dns_type": 'dns'},
- {"check_deliverability": False, "test_environment": False, "allow_smtputf8": True, "allow_empty_local": True, "allow_quoted_local": True, "allow_display_name": True, "allow_domain_literal": True, "globally_deliverable": None, "timeout": 25, "dns_type": 'timeout'},
- {"check_deliverability": True, "test_environment": True, "allow_smtputf8": False, "allow_empty_local": False, "allow_quoted_local": False, "allow_display_name": False, "allow_domain_literal": False, "globally_deliverable": None, "timeout": 30, "dns_type": 'dns'},
- {"check_deliverability": False, "test_environment": True, "allow_smtputf8": True, "allow_empty_local": False, "allow_quoted_local": True, "allow_display_name": True, "allow_domain_literal": False, "globally_deliverable": None, "timeout": 35, "dns_type": 'timeout'},
- {"check_deliverability": True, "test_environment": False, "allow_smtputf8": False, "allow_empty_local": True, "allow_quoted_local": True, "allow_display_name": False, "allow_domain_literal": True, "globally_deliverable": None, "timeout": 40, "dns_type": 'dns'},
- {"check_deliverability": False, "test_environment": True, "allow_smtputf8": True, "allow_empty_local": False, "allow_quoted_local": False, "allow_display_name": True, "allow_domain_literal": True, "globally_deliverable": None, "timeout": 45, "dns_type": 'timeout'},
+ {
+ "check_deliverability": False,
+ "test_environment": False,
+ "allow_smtputf8": True,
+ "allow_empty_local": False,
+ "allow_quoted_local": True,
+ "allow_display_name": False,
+ "allow_domain_literal": True,
+ "globally_deliverable": None,
+ "timeout": 15,
+ "dns_type": "timeout",
+ },
+ {
+ "check_deliverability": True,
+ "test_environment": True,
+ "allow_smtputf8": False,
+ "allow_empty_local": True,
+ "allow_quoted_local": False,
+ "allow_display_name": True,
+ "allow_domain_literal": False,
+ "globally_deliverable": None,
+ "timeout": 20,
+ "dns_type": "dns",
+ },
+ {
+ "check_deliverability": False,
+ "test_environment": False,
+ "allow_smtputf8": True,
+ "allow_empty_local": True,
+ "allow_quoted_local": True,
+ "allow_display_name": True,
+ "allow_domain_literal": True,
+ "globally_deliverable": None,
+ "timeout": 25,
+ "dns_type": "timeout",
+ },
+ {
+ "check_deliverability": True,
+ "test_environment": True,
+ "allow_smtputf8": False,
+ "allow_empty_local": False,
+ "allow_quoted_local": False,
+ "allow_display_name": False,
+ "allow_domain_literal": False,
+ "globally_deliverable": None,
+ "timeout": 30,
+ "dns_type": "dns",
+ },
+ {
+ "check_deliverability": False,
+ "test_environment": True,
+ "allow_smtputf8": True,
+ "allow_empty_local": False,
+ "allow_quoted_local": True,
+ "allow_display_name": True,
+ "allow_domain_literal": False,
+ "globally_deliverable": None,
+ "timeout": 35,
+ "dns_type": "timeout",
+ },
+ {
+ "check_deliverability": True,
+ "test_environment": False,
+ "allow_smtputf8": False,
+ "allow_empty_local": True,
+ "allow_quoted_local": True,
+ "allow_display_name": False,
+ "allow_domain_literal": True,
+ "globally_deliverable": None,
+ "timeout": 40,
+ "dns_type": "dns",
+ },
+ {
+ "check_deliverability": False,
+ "test_environment": True,
+ "allow_smtputf8": True,
+ "allow_empty_local": False,
+ "allow_quoted_local": False,
+ "allow_display_name": True,
+ "allow_domain_literal": True,
+ "globally_deliverable": None,
+ "timeout": 45,
+ "dns_type": "timeout",
+ },
]
t0 = time.time()
- validity=[]
+ validity = []
for email in email_addresses:
for config in configurations:
@@ -123,9 +215,9 @@
res = validate_email_address(email, **config)
validity.append(res)
t1 = time.time()
- validity = sorted(validity, key=lambda x: x['email'])
+ validity = sorted(validity, key=lambda x: x["email"])
for v in validity:
- pprint.pprint(v, indent=4)
+ pprint.pprint(v, indent=4)
print(f"Time taken: {t1 - t0:.2f}")
diff --git a/mkdocs.yml b/mkdocs.yml
index 715c0873..53b2f3d9 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -109,8 +109,6 @@ markdown_extensions:
- pymdownx.superfences
- toc:
permalink: true
-
-
extra:
social:
- icon: fontawesome/brands/github-alt
diff --git a/requirements.txt b/requirements.txt
index 9e7268d4..d6516e06 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -2,39 +2,42 @@ aiomysql==0.2.0 # Vulnerabilities: None
aiosqlite==0.20.0 # Vulnerabilities: None
asyncpg==0.29.0 # Vulnerabilities: None
autoflake==2.3.1 # Vulnerabilities: None
-autopep8==2.3.1 # From 2.1.1 | Vulnerabilities: None
+autopep8==2.3.1 # Vulnerabilities: None
black==24.4.2 # Vulnerabilities: None
bump2version==1.0.1 # Vulnerabilities: None
click==8.1.7 # Vulnerabilities: None
cx-Oracle==8.3.0 # Vulnerabilities: None
-fastapi[all]==0.111.1 # From 0.111.0 | Vulnerabilities: None
-flake8==7.1.0 # From 7.0.0 | Vulnerabilities: None
+fastapi[all]==0.111.1 # Vulnerabilities: None
+flake8==7.1.0 # Vulnerabilities: None
genbadge[all]==1.1.1 # Vulnerabilities: None
-hatchling==1.25.0 # From 1.24.2 | Vulnerabilities: None
+hatchling==1.25.0 # Vulnerabilities: None
loguru==0.7.2 # Vulnerabilities: None
-mkdocs-material==9.5.29 # From 9.5.24 | Vulnerabilities: None
-mkdocs-print-site-plugin==2.5.0 # From 2.4.1 | Vulnerabilities: None
-mkdocstrings[python,shell]==0.25.1 # Vulnerabilities: None
-packaging==24.1 # From 24.0 | Vulnerabilities: None
+mkdocs-material==9.5.30 # From 9.5.29 | Vulnerabilities: None
+mkdocs-print-site-plugin==2.5.0 # Vulnerabilities: None
+mkdocstrings[python,shell]==0.25.2 # From 0.25.1 | Vulnerabilities: None
+packaging==24.1 # Vulnerabilities: None
pre-commit==3.7.1 # Vulnerabilities: None
psycopg2==2.9.9 # Vulnerabilities: None
Pygments==2.18.0 # Vulnerabilities: None
-pylint==3.2.5 # From 3.2.2 | Vulnerabilities: None
-pymdown-extensions==10.8.1 # Vulnerabilities: None
-pytest==8.3.1 # From 8.2.1 | Vulnerabilities: None
-pytest-asyncio==0.23.8 # From 0.23.7 | Vulnerabilities: None
+pylint==3.2.6 # From 3.2.5 | Vulnerabilities: None
+pymdown-extensions==10.9 # From 10.8.1 | Vulnerabilities: None
+pytest==8.3.2 # From 8.3.1 | Vulnerabilities: None
+pytest-asyncio==0.23.8 # Vulnerabilities: None
pytest-cov==5.0.0 # Vulnerabilities: None
pytest-mock==3.14.0 # Vulnerabilities: None
pytest-runner==6.0.1 # Vulnerabilities: None
pytest-xdist==3.6.1 # Vulnerabilities: None
+python-json-logger==2.0.7 # Vulnerabilities: None
pytz==2024.1 # Vulnerabilities: None
PyYAML==6.0.1 # Vulnerabilities: None
-ruff==0.5.4 # From 0.4.5 | Vulnerabilities: None
-SQLAlchemy==2.0.31 # From 2.0.30 | Vulnerabilities: None
+ruff==0.5.5 # From 0.5.4 | Vulnerabilities: None
+SQLAlchemy==2.0.31 # Vulnerabilities: None
+structlog==24.4.0 # Vulnerabilities: None
toml==0.10.2 # Vulnerabilities: None
-tox==4.16.0 # From 4.15.0 | Vulnerabilities: None
+tox==4.16.0 # Vulnerabilities: None
tqdm==4.66.4 # Vulnerabilities: None
-twine==5.1.1 # From 5.1.0 | Vulnerabilities: None
+twine==5.1.1 # Vulnerabilities: None
watchdog==4.0.1 # Vulnerabilities: None
wheel==0.43.0 # Vulnerabilities: None
xmltodict==0.13.0 # Vulnerabilities: None
+mike
diff --git a/tests/conftest.py b/tests/conftest.py
index 75527b47..8b1b3947 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -11,6 +11,6 @@ class PropogateHandler(logging.Handler):
def emit(self, record):
logging.getLogger(record.name).handle(record)
- handler_id = logger.add(PropogateHandler(), format='{message} {extra}')
+ handler_id = logger.add(PropogateHandler(), format="{message} {extra}")
yield _caplog
logger.remove(handler_id)
diff --git a/tests/e.py b/tests/e.py
index 1490e9b8..92138774 100644
--- a/tests/e.py
+++ b/tests/e.py
@@ -40,37 +40,37 @@
def run_ascii_two():
char_list = []
- char_list_csv = open_csv('ascii2.csv')
+ char_list_csv = open_csv("ascii2.csv")
for c in char_list_csv:
- char = c['Symbol']
+ char = c["Symbol"]
if char.isprintable() is True:
char_list.append(char)
err_list = []
count = 0
- for lc in tqdm(char_list, desc='left char', leave=False, ascii=True):
- for r in tqdm(char_list, desc='right char', leave=False, ascii=True):
- text = f'{lc}found one{r} {lc}found two{r}'
+ for lc in tqdm(char_list, desc="left char", leave=False, ascii=True):
+ for r in tqdm(char_list, desc="right char", leave=False, ascii=True):
+ text = f"{lc}found one{r} {lc}found two{r}"
data = pattern_between_two_char(text, lc, r)
- if 'Error' in data:
+ if "Error" in data:
err_list.append(data)
return err_list, char_list, count
-if __name__ == '__main__':
+if __name__ == "__main__":
t0 = time.time()
err_list, char_list, count = run_ascii_two()
t1 = time.time() - t0
combinations = len(char_list) * len(char_list)
print(
- f'process took {t1:.2f} seconds with {combinations:,} combinations and a count cycle of {count:,}'
+ f"process took {t1:.2f} seconds with {combinations:,} combinations and a count cycle of {count:,}"
)
if len(err_list) != 0:
- save_json('err.json', err_list)
+ save_json("err.json", err_list)
else:
- print('there were no errors')
+ print("there were no errors")
diff --git a/tests/sample_data_for_tests.py b/tests/sample_data_for_tests.py
index 9cdc89b5..3271dd9b 100644
--- a/tests/sample_data_for_tests.py
+++ b/tests/sample_data_for_tests.py
@@ -1,219 +1,219 @@
# -*- coding: utf-8 -*-
ASCII_LIST = [
- ' ',
- '!',
+ " ",
+ "!",
'""',
- '#',
- '$',
- '%',
- '&',
+ "#",
+ "$",
+ "%",
+ "&",
"'",
- '(',
- ')',
- '*',
- '+',
- ',',
- '-',
- '.',
- '/',
- '0',
- '1',
- '2',
- '3',
- '4',
- '5',
- '6',
- '7',
- '8',
- '9',
- ':',
- ';',
- '<',
- '=',
- '>',
- '?',
- '@',
- 'A',
- 'B',
- 'C',
- 'D',
- 'E',
- 'F',
- 'G',
- 'H',
- 'I',
- 'J',
- 'K',
- 'L',
- 'M',
- 'N',
- 'O',
- 'P',
- 'Q',
- 'R',
- 'S',
- 'T',
- 'U',
- 'V',
- 'W',
- 'X',
- 'Y',
- 'Z',
- '[',
- '\\',
- ']',
- '^',
- '_',
- '`',
- 'a',
- 'b',
- 'c',
- 'd',
- 'e',
- 'f',
- 'g',
- 'h',
- 'i',
- 'j',
- 'k',
- 'l',
- 'm',
- 'n',
- 'o',
- 'p',
- 'q',
- 'r',
- 's',
- 't',
- 'u',
- 'v',
- 'w',
- 'x',
- 'y',
- 'z',
- '{',
- '|',
- '}',
- '~',
- '€',
- '‚',
- 'ƒ',
- '„',
- '…',
- '†',
- '‡',
- 'ˆ',
- '‰',
- 'Š',
- '‹',
- 'Œ',
- 'Ž',
- '‘',
- '’',
- '“',
- '”',
- '•',
- '–',
- '—',
- '˜',
- '™',
- 'š',
- '›',
- 'œ',
- 'ž',
- 'Ÿ',
- '¡',
- '¢',
- '£',
- '¤',
- '¥',
- '¦',
- '§',
- '¨',
- '©',
- 'ª',
- '«',
- '¬',
- '®',
- '¯',
- '°',
- '±',
- '²',
- '³',
- '´',
- 'µ',
- '¶',
- '·',
- '¸',
- '¹',
- 'º',
- '»',
- '¼',
- '½',
- '¾',
- '¿',
- 'À',
- 'Á',
- 'Â',
- 'Ã',
- 'Ä',
- 'Å',
- 'Æ',
- 'Ç',
- 'È',
- 'É',
- 'Ê',
- 'Ë',
- 'Ì',
- 'Í',
- 'Î',
- 'Ï',
- 'Ð',
- 'Ñ',
- 'Ò',
- 'Ó',
- 'Ô',
- 'Õ',
- 'Ö',
- '×',
- 'Ø',
- 'Ù',
- 'Ú',
- 'Û',
- 'Ü',
- 'Ý',
- 'Þ',
- 'ß',
- 'à',
- 'á',
- 'â',
- 'ã',
- 'ä',
- 'å',
- 'æ',
- 'ç',
- 'è',
- 'é',
- 'ê',
- 'ë',
- 'ì',
- 'í',
- 'î',
- 'ï',
- 'ð',
- 'ñ',
- 'ò',
- 'ó',
- 'ô',
- 'õ',
- 'ö',
- '÷',
- 'ø',
- 'ù',
- 'ú',
- 'û',
- 'ü',
- 'ý',
- 'þ',
- 'ÿ',
+ "(",
+ ")",
+ "*",
+ "+",
+ ",",
+ "-",
+ ".",
+ "/",
+ "0",
+ "1",
+ "2",
+ "3",
+ "4",
+ "5",
+ "6",
+ "7",
+ "8",
+ "9",
+ ":",
+ ";",
+ "<",
+ "=",
+ ">",
+ "?",
+ "@",
+ "A",
+ "B",
+ "C",
+ "D",
+ "E",
+ "F",
+ "G",
+ "H",
+ "I",
+ "J",
+ "K",
+ "L",
+ "M",
+ "N",
+ "O",
+ "P",
+ "Q",
+ "R",
+ "S",
+ "T",
+ "U",
+ "V",
+ "W",
+ "X",
+ "Y",
+ "Z",
+ "[",
+ "\\",
+ "]",
+ "^",
+ "_",
+ "`",
+ "a",
+ "b",
+ "c",
+ "d",
+ "e",
+ "f",
+ "g",
+ "h",
+ "i",
+ "j",
+ "k",
+ "l",
+ "m",
+ "n",
+ "o",
+ "p",
+ "q",
+ "r",
+ "s",
+ "t",
+ "u",
+ "v",
+ "w",
+ "x",
+ "y",
+ "z",
+ "{",
+ "|",
+ "}",
+ "~",
+ "€",
+ "‚",
+ "ƒ",
+ "„",
+ "…",
+ "†",
+ "‡",
+ "ˆ",
+ "‰",
+ "Š",
+ "‹",
+ "Œ",
+ "Ž",
+ "‘",
+ "’",
+ "“",
+ "”",
+ "•",
+ "–",
+ "—",
+ "˜",
+ "™",
+ "š",
+ "›",
+ "œ",
+ "ž",
+ "Ÿ",
+ "¡",
+ "¢",
+ "£",
+ "¤",
+ "¥",
+ "¦",
+ "§",
+ "¨",
+ "©",
+ "ª",
+ "«",
+ "¬",
+ "®",
+ "¯",
+ "°",
+ "±",
+ "²",
+ "³",
+ "´",
+ "µ",
+ "¶",
+ "·",
+ "¸",
+ "¹",
+ "º",
+ "»",
+ "¼",
+ "½",
+ "¾",
+ "¿",
+ "À",
+ "Á",
+ "Â",
+ "Ã",
+ "Ä",
+ "Å",
+ "Æ",
+ "Ç",
+ "È",
+ "É",
+ "Ê",
+ "Ë",
+ "Ì",
+ "Í",
+ "Î",
+ "Ï",
+ "Ð",
+ "Ñ",
+ "Ò",
+ "Ó",
+ "Ô",
+ "Õ",
+ "Ö",
+ "×",
+ "Ø",
+ "Ù",
+ "Ú",
+ "Û",
+ "Ü",
+ "Ý",
+ "Þ",
+ "ß",
+ "à",
+ "á",
+ "â",
+ "ã",
+ "ä",
+ "å",
+ "æ",
+ "ç",
+ "è",
+ "é",
+ "ê",
+ "ë",
+ "ì",
+ "í",
+ "î",
+ "ï",
+ "ð",
+ "ñ",
+ "ò",
+ "ó",
+ "ô",
+ "õ",
+ "ö",
+ "÷",
+ "ø",
+ "ù",
+ "ú",
+ "û",
+ "ü",
+ "ý",
+ "þ",
+ "ÿ",
]
diff --git a/tests/test_common_functions/test_calendar_functions.py b/tests/test_common_functions/test_calendar_functions.py
index 2f61a6f0..7d8131ba 100644
--- a/tests/test_common_functions/test_calendar_functions.py
+++ b/tests/test_common_functions/test_calendar_functions.py
@@ -9,42 +9,42 @@
class TestGetMonth(unittest.TestCase):
def test_valid_input(self):
# Test with valid input month numbers
- self.assertEqual(get_month(1), 'January')
- self.assertEqual(get_month(6), 'June')
- self.assertEqual(get_month(12), 'December')
+ self.assertEqual(get_month(1), "January")
+ self.assertEqual(get_month(6), "June")
+ self.assertEqual(get_month(12), "December")
def test_invalid_input(self):
# Test with invalid input month numbers
- self.assertEqual(get_month(0), 'Invalid month number')
- self.assertEqual(get_month(13), 'Invalid month number')
- self.assertEqual(get_month(-1), 'Invalid month number')
+ self.assertEqual(get_month(0), "Invalid month number")
+ self.assertEqual(get_month(13), "Invalid month number")
+ self.assertEqual(get_month(-1), "Invalid month number")
def test_string_input(self):
# Test with string input
- self.assertEqual(get_month('January'), 'Invalid input, integer is required')
- self.assertEqual(get_month('12'), 'Invalid input, integer is required')
- self.assertEqual(get_month('invalid'), 'Invalid input, integer is required')
+ self.assertEqual(get_month("January"), "Invalid input, integer is required")
+ self.assertEqual(get_month("12"), "Invalid input, integer is required")
+ self.assertEqual(get_month("invalid"), "Invalid input, integer is required")
def test_float_input(self):
# Test with float input
- self.assertEqual(get_month(3.14), 'Invalid input, integer is required')
- self.assertEqual(get_month(8.0), 'August')
- self.assertEqual(get_month(12.99), 'Invalid input, integer is required')
+ self.assertEqual(get_month(3.14), "Invalid input, integer is required")
+ self.assertEqual(get_month(8.0), "August")
+ self.assertEqual(get_month(12.99), "Invalid input, integer is required")
# Tests for get_month_number() function
class TestGetMonthNumber(unittest.TestCase):
def test_valid_input(self):
# Test with valid input month names
- self.assertEqual(get_month_number('January'), 1)
- self.assertEqual(get_month_number('June'), 6)
- self.assertEqual(get_month_number('December'), 12)
+ self.assertEqual(get_month_number("January"), 1)
+ self.assertEqual(get_month_number("June"), 6)
+ self.assertEqual(get_month_number("December"), 12)
def test_invalid_input(self):
# Test with invalid input month names
- self.assertEqual(get_month_number('Invalid'), -1)
- self.assertEqual(get_month_number('13'), -1)
- self.assertEqual(get_month_number(''), -1)
+ self.assertEqual(get_month_number("Invalid"), -1)
+ self.assertEqual(get_month_number("13"), -1)
+ self.assertEqual(get_month_number(""), -1)
def test_integer_input(self):
# Test with integer input
@@ -54,15 +54,15 @@ def test_integer_input(self):
def test_lowercase_input(self):
# Test with lowercase input
- self.assertEqual(get_month_number('january'), 1)
- self.assertEqual(get_month_number('june'), 6)
- self.assertEqual(get_month_number('december'), 12)
+ self.assertEqual(get_month_number("january"), 1)
+ self.assertEqual(get_month_number("june"), 6)
+ self.assertEqual(get_month_number("december"), 12)
def test_spaced_input(self):
# Test with input containing leading/trailing spaces
- self.assertEqual(get_month_number(' January '), 1)
- self.assertEqual(get_month_number(' June '), 6)
- self.assertEqual(get_month_number(' December '), 12)
+ self.assertEqual(get_month_number(" January "), 1)
+ self.assertEqual(get_month_number(" June "), 6)
+ self.assertEqual(get_month_number(" December "), 12)
def test_invalid_type_input(self):
# Test with invalid type input
@@ -71,7 +71,7 @@ def test_invalid_type_input(self):
self.assertEqual(get_month_number(8), -1)
-if __name__ == '__main__':
+if __name__ == "__main__":
unittest.main()
# if __name__ == "__main__":
diff --git a/tests/test_common_functions/test_email_validation.py b/tests/test_common_functions/test_email_validation.py
index af712cb4..8430f806 100644
--- a/tests/test_common_functions/test_email_validation.py
+++ b/tests/test_common_functions/test_email_validation.py
@@ -4,37 +4,44 @@
def test_validate_email_address_valid():
- result = validate_email_address('test@google.com')
- assert result['valid'] is True
- assert result['email'] == 'test@google.com'
+ result = validate_email_address("test@google.com")
+ assert result["valid"] is True
+ assert result["email"] == "test@google.com"
+
def test_validate_email_address_invalid():
- result = validate_email_address('invalid')
- assert result['valid'] is False
- assert result['email'] == 'invalid'
- assert result['error_type'] == 'EmailNotValidError'
+ result = validate_email_address("invalid")
+ assert result["valid"] is False
+ assert result["email"] == "invalid"
+ assert result["error_type"] == "EmailNotValidError"
+
def test_validate_email_address_undeliverable():
- result = validate_email_address('test@example.com')
- assert result['valid'] is False
- assert result['email'] == 'test@example.com'
- assert result['error_type'] == 'EmailUndeliverableError'
+ result = validate_email_address("test@example.com")
+ assert result["valid"] is False
+ assert result["email"] == "test@example.com"
+ assert result["error_type"] == "EmailUndeliverableError"
def test_validate_email_address_dns_type():
with pytest.raises(ValueError):
- validate_email_address('test@google.com', dns_type='invalid')
+ validate_email_address("test@google.com", dns_type="invalid")
+
def test_validate_email_address_timeout():
- result = validate_email_address('test@google.com', timeout=0, dns_type="timeout")
- assert result['valid'] is True
- assert result['email'] == 'test@google.com'
+ result = validate_email_address("test@google.com", timeout=0, dns_type="timeout")
+ assert result["valid"] is True
+ assert result["email"] == "test@google.com"
+
def test_validate_email_address_timeout_invalid():
with pytest.raises(TypeError):
- validate_email_address('test@example.com', timeout='invalid', dns_type="timeout")
+ validate_email_address(
+ "test@example.com", timeout="invalid", dns_type="timeout"
+ )
+
def test_validate_email_address_check_delivery_false():
- result = validate_email_address('test@example.com', check_deliverability=False)
- assert result['valid'] is True
- assert result['email'] == 'test@example.com'
+ result = validate_email_address("test@example.com", check_deliverability=False)
+ assert result["valid"] is True
+ assert result["email"] == "test@example.com"
diff --git a/tests/test_common_functions/test_file_functions/test_create_data.py b/tests/test_common_functions/test_file_functions/test_create_data.py
index 68d5cbe2..c11ab0a6 100644
--- a/tests/test_common_functions/test_file_functions/test_create_data.py
+++ b/tests/test_common_functions/test_file_functions/test_create_data.py
@@ -8,9 +8,9 @@
class TestSampleGenerator(unittest.TestCase):
def setUp(self) -> None:
self.sample_size = 10
- self.file_name = 'test_file_create_sample_data'
- self.csv_file = f'data/csv/{self.file_name}.csv'
- self.json_file = f'data/json/{self.file_name}.json'
+ self.file_name = "test_file_create_sample_data"
+ self.csv_file = f"data/csv/{self.file_name}.csv"
+ self.json_file = f"data/json/{self.file_name}.json"
# def tearDown(self) -> None:
# os.remove(self.csv_file)
@@ -21,17 +21,17 @@ def test_files_created_successfully(self) -> None:
create_sample_files(self.file_name, self.sample_size)
# Print the CSV file path for troubleshooting
- print(f'CSV file path: {self.csv_file}')
+ print(f"CSV file path: {self.csv_file}")
# Check if the CSV file was created successfully
self.assertTrue(os.path.exists(self.csv_file))
# Print the JSON file path for troubleshooting
- print(f'JSON file path: {self.json_file}')
+ print(f"JSON file path: {self.json_file}")
# Check if the JSON file was created successfully
self.assertTrue(os.path.exists(self.json_file))
-if __name__ == '__main__':
+if __name__ == "__main__":
unittest.main()
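For reference, `create_sample_files` as exercised here takes a base file name and a sample size, then writes matching CSV and JSON files under `data/`. A sketch, assuming it lives in `file_functions` like the other helpers in this suite:

```python
# Assumed import path; the tests only show the function name and its side effects.
from dsg_lib.common_functions.file_functions import create_sample_files

create_sample_files("demo_sample", 10)
# Per the assertions above, this should produce:
#   data/csv/demo_sample.csv and data/json/demo_sample.json
```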
diff --git a/tests/test_common_functions/test_file_functions/test_file_functions_delete.py b/tests/test_common_functions/test_file_functions/test_file_functions_delete.py
index 722b48b6..d5f14ad3 100644
--- a/tests/test_common_functions/test_file_functions/test_file_functions_delete.py
+++ b/tests/test_common_functions/test_file_functions/test_file_functions_delete.py
@@ -10,21 +10,21 @@
class TestDeleteFile(unittest.TestCase):
def setUp(self):
self.tmpdir = TemporaryDirectory()
- self.datadir = Path(self.tmpdir.name) / 'data'
+ self.datadir = Path(self.tmpdir.name) / "data"
self.datadir.mkdir()
- self.csvdir = self.datadir / 'csv'
+ self.csvdir = self.datadir / "csv"
self.csvdir.mkdir()
- self.jsondir = self.datadir / 'json'
+ self.jsondir = self.datadir / "json"
self.jsondir.mkdir()
- self.textdir = self.datadir / 'text'
+ self.textdir = self.datadir / "text"
self.textdir.mkdir()
# Create some test files
- self.csvfile = self.csvdir / 'test.csv'
+ self.csvfile = self.csvdir / "test.csv"
self.csvfile.touch()
- self.jsonfile = self.jsondir / 'test.json'
+ self.jsonfile = self.jsondir / "test.json"
self.jsonfile.touch()
- self.textfile = self.textdir / 'test.txt'
+ self.textfile = self.textdir / "test.txt"
self.textfile.touch()
def tearDown(self):
@@ -32,66 +32,66 @@ def tearDown(self):
def test_delete_csv_file(self):
# Test deleting a CSV file
- filename = 'test'
+ filename = "test"
with patch(
- 'dsg_lib.common_functions.file_functions.directory_to_files',
+ "dsg_lib.common_functions.file_functions.directory_to_files",
str(self.datadir),
):
- delete_file(filename + '.csv')
+ delete_file(filename + ".csv")
self.assertFalse(self.csvfile.exists())
def test_delete_json_file(self):
# Test deleting a JSON file
- filename = 'test'
+ filename = "test"
with patch(
- 'dsg_lib.common_functions.file_functions.directory_to_files',
+ "dsg_lib.common_functions.file_functions.directory_to_files",
str(self.datadir),
):
- delete_file(filename + '.json')
+ delete_file(filename + ".json")
self.assertFalse(self.jsonfile.exists())
def test_delete_text_file(self):
# Test deleting a text file
- filename = 'test'
+ filename = "test"
with patch(
- 'dsg_lib.common_functions.file_functions.directory_to_files',
+ "dsg_lib.common_functions.file_functions.directory_to_files",
str(self.datadir),
):
- delete_file(filename + '.txt')
+ delete_file(filename + ".txt")
self.assertFalse(self.textfile.exists())
def test_delete_nonexistent_file(self):
# Test deleting a nonexistent file
- filename = 'nonexistent'
+ filename = "nonexistent"
with patch(
- 'dsg_lib.common_functions.file_functions.directory_to_files',
+ "dsg_lib.common_functions.file_functions.directory_to_files",
str(self.datadir),
):
with self.assertRaises(FileNotFoundError):
- delete_file(filename + '.csv')
+ delete_file(filename + ".csv")
def test_delete_invalid_filename(self):
# Test deleting a file with an invalid filename
with patch(
- 'dsg_lib.common_functions.file_functions.directory_to_files',
+ "dsg_lib.common_functions.file_functions.directory_to_files",
str(self.datadir),
):
with self.assertRaises(ValueError):
- delete_file('invalid/filename.csv')
+ delete_file("invalid/filename.csv")
def test_delete_unsupported_filetype(self):
# Test deleting a file with an unsupported filetype
with patch(
- 'dsg_lib.common_functions.file_functions.directory_to_files',
+ "dsg_lib.common_functions.file_functions.directory_to_files",
str(self.datadir),
):
with self.assertRaises(ValueError):
- delete_file('test.jpg')
+ delete_file("test.jpg")
def test_delete_nonstring_filename(self):
# Test deleting a file with a non-string filename
with patch(
- 'dsg_lib.common_functions.file_functions.directory_to_files',
+ "dsg_lib.common_functions.file_functions.directory_to_files",
str(self.datadir),
):
with self.assertRaises(TypeError):
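These tests pin down `delete_file`'s validation rules: it accepts a bare string file name with a supported extension (`.csv`, `.json`, `.txt`), raises `ValueError` for path separators or unsupported extensions, `TypeError` for non-strings, and `FileNotFoundError` when the file is absent. A short sketch using the module path shown in the patch targets:

```python
from dsg_lib.common_functions.file_functions import delete_file

try:
    delete_file("report.csv")  # bare name only; "a/b.csv" would raise ValueError
except FileNotFoundError:
    print("no such file under the data directory")
```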
diff --git a/tests/test_common_functions/test_file_functions/test_open_csv.py b/tests/test_common_functions/test_file_functions/test_open_csv.py
index b3d5ddd4..c49a0dd6 100644
--- a/tests/test_common_functions/test_file_functions/test_open_csv.py
+++ b/tests/test_common_functions/test_file_functions/test_open_csv.py
@@ -9,29 +9,29 @@
class TestOpenCsv(unittest.TestCase):
@classmethod
def setUpClass(cls):
- os.makedirs('data/csv', exist_ok=True)
- with open('data/csv/test_file.csv', 'w', encoding='utf-8') as f:
- f.write('col1,col2,col3\n1,2,3\n4,5,6\n')
+ os.makedirs("data/csv", exist_ok=True)
+ with open("data/csv/test_file.csv", "w", encoding="utf-8") as f:
+ f.write("col1,col2,col3\n1,2,3\n4,5,6\n")
@classmethod
def tearDownClass(cls):
- os.remove('data/csv/test_file.csv')
+ os.remove("data/csv/test_file.csv")
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_open_csv_with_valid_file(self):
- data = open_csv('test_file')
+ data = open_csv("test_file")
self.assertEqual(len(data), 2)
- self.assertEqual(data[0]['col1'], '1')
- self.assertEqual(data[0]['col2'], '2')
- self.assertEqual(data[0]['col3'], '3')
- self.assertEqual(data[1]['col1'], '4')
- self.assertEqual(data[1]['col2'], '5')
- self.assertEqual(data[1]['col3'], '6')
-
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ self.assertEqual(data[0]["col1"], "1")
+ self.assertEqual(data[0]["col2"], "2")
+ self.assertEqual(data[0]["col3"], "3")
+ self.assertEqual(data[1]["col1"], "4")
+ self.assertEqual(data[1]["col2"], "5")
+ self.assertEqual(data[1]["col3"], "6")
+
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_open_csv_with_invalid_file(self):
with self.assertRaises(FileNotFoundError):
- open_csv('non_existent_file')
+ open_csv("non_existent_file")
def test_open_csv_with_invalid_file_name_type(self):
with self.assertRaises(TypeError):
@@ -39,16 +39,16 @@ def test_open_csv_with_invalid_file_name_type(self):
def test_open_csv_with_invalid_quote_level(self):
with self.assertRaises(ValueError):
- open_csv('test_file', quote_level='invalid')
+ open_csv("test_file", quote_level="invalid")
def test_open_csv_with_delimiter_and_quotechar(self):
with self.assertRaises(TypeError):
- open_csv('test_file', delimiter=',', quote_level='minimal', quotechar='"')
+ open_csv("test_file", delimiter=",", quote_level="minimal", quotechar='"')
def test_open_csv_with_invalid_delimiter_type(self):
with self.assertRaises(TypeError):
- open_csv('test_file', delimiter='abc')
+ open_csv("test_file", delimiter="abc")
def test_open_csv_with_quotechar_length_greater_than_one(self):
with self.assertRaises(TypeError):
- open_csv('test_file', quotechar='abc')
+ open_csv("test_file", quotechar="abc")
diff --git a/tests/test_common_functions/test_file_functions/test_open_json.py b/tests/test_common_functions/test_file_functions/test_open_json.py
index 896d5f37..12a4be98 100644
--- a/tests/test_common_functions/test_file_functions/test_open_json.py
+++ b/tests/test_common_functions/test_file_functions/test_open_json.py
@@ -9,21 +9,21 @@
class TestFileFunctions(unittest.TestCase):
@classmethod
def setUpClass(cls):
- with open('data/json/test_file.json', 'w') as f:
+ with open("data/json/test_file.json", "w") as f:
f.write('{"key": "value"}')
@classmethod
def tearDownClass(cls):
- os.remove('data/json/test_file.json')
+ os.remove("data/json/test_file.json")
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_open_json_with_valid_file(self):
- data = open_json('test_file.json')
- self.assertEqual(data, {'key': 'value'})
+ data = open_json("test_file.json")
+ self.assertEqual(data, {"key": "value"})
def test_open_json_with_invalid_file_name(self):
with self.assertRaises(FileNotFoundError):
- open_json('invalid_file.json')
+ open_json("invalid_file.json")
def test_open_json_with_non_string_file_name(self):
with self.assertRaises(TypeError):
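A matching sketch for `open_json`, which returns the parsed object and raises `FileNotFoundError` or `TypeError` on bad input, assuming the same `file_functions` module as the patch target above:

```python
from dsg_lib.common_functions.file_functions import open_json

data = open_json("test_file.json")  # resolved under data/json
print(data["key"])
```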
diff --git a/tests/test_common_functions/test_file_functions/test_open_text.py b/tests/test_common_functions/test_file_functions/test_open_text.py
index 37165632..ec9a90e5 100644
--- a/tests/test_common_functions/test_file_functions/test_open_text.py
+++ b/tests/test_common_functions/test_file_functions/test_open_text.py
@@ -9,11 +9,11 @@
from dsg_lib.common_functions.file_functions import open_text
-random_file_name_for_test = f'test_open_text_{secrets.token_hex(3)}.txt'
+random_file_name_for_test = f"test_open_text_{secrets.token_hex(3)}.txt"
class FileFunctionTests(unittest.TestCase):
- directory_to_files = 'data'
+ directory_to_files = "data"
# def setUp(self):
# os.makedirs(os.path.join(self.directory_to_files, "text"), exist_ok=True)
@@ -26,7 +26,7 @@ class FileFunctionTests(unittest.TestCase):
# ) as f:
# f.write("This is a test ")
def setUp(self):
- text_dir = os.path.join(self.directory_to_files, 'text')
+ text_dir = os.path.join(self.directory_to_files, "text")
if os.path.exists(text_dir):
shutil.rmtree(text_dir)
os.makedirs(text_dir)
@@ -34,29 +34,29 @@ def setUp(self):
# Create test file
with open(
os.path.join(text_dir, random_file_name_for_test),
- 'w',
- encoding='utf-8',
+ "w",
+ encoding="utf-8",
) as f:
- f.write('This is a test ')
+ f.write("This is a test ")
def tearDown(self):
test_dir = Path(self.directory_to_files)
- for file_path in test_dir.glob('**/*'):
+ for file_path in test_dir.glob("**/*"):
if file_path.is_file():
file_path.unlink()
# Verify all files have been removed before attempting to remove parent directory
- while os.listdir(os.path.join(self.directory_to_files, 'text')):
+ while os.listdir(os.path.join(self.directory_to_files, "text")):
time.sleep(1)
- os.rmdir(os.path.join(self.directory_to_files, 'text'))
+ os.rmdir(os.path.join(self.directory_to_files, "text"))
def test_open_text_file(self):
file_name = random_file_name_for_test
- file_contents = 'This is a test '
- file_path = os.path.join(self.directory_to_files, 'text', file_name)
+ file_contents = "This is a test "
+ file_path = os.path.join(self.directory_to_files, "text", file_name)
# Create test file
- with open(file_path, 'w', encoding='utf-8') as file:
+ with open(file_path, "w", encoding="utf-8") as file:
file.write(file_contents)
# Test function
@@ -64,14 +64,14 @@ def test_open_text_file(self):
self.assertEqual(result, file_contents)
def test_open_text_invalid_file_name(self):
- invalid_file_name = 'invalid/file/name.txt'
+ invalid_file_name = "invalid/file/name.txt"
# Test function
with self.assertRaises(TypeError):
open_text(invalid_file_name)
def test_open_text_nonexistent_file(self):
- nonexistent_file_name = 'nonexistent.txt'
+ nonexistent_file_name = "nonexistent.txt"
# Test function
with self.assertRaises(FileNotFoundError):
@@ -85,5 +85,5 @@ def test_open_text_integer_file_name(self):
open_text(integer_file_name)
-if __name__ == '__main__':
+if __name__ == "__main__":
unittest.main()
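The import at the top of this file shows `open_text`'s real location, and the tests establish its contract: it returns the file's contents as a string, raises `TypeError` when the name contains path separators (or is not a string), and `FileNotFoundError` for missing files. A brief sketch with a hypothetical file name:

```python
from dsg_lib.common_functions.file_functions import open_text

content = open_text("notes.txt")  # "a/b.txt" raises TypeError per the tests
print(content)
```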
diff --git a/tests/test_common_functions/test_file_functions/test_save_csv.py b/tests/test_common_functions/test_file_functions/test_save_csv.py
index 92e0cfbd..86d0e959 100644
--- a/tests/test_common_functions/test_file_functions/test_save_csv.py
+++ b/tests/test_common_functions/test_file_functions/test_save_csv.py
@@ -9,10 +9,10 @@
class TestFileFunctions(unittest.TestCase):
def setUp(self):
self.test_data = [
- ['John Doe', '123 Main St', 'jdoe@example.com'],
- ['Jane Smith', '456 Maple Ave', 'jsmith@example.com'],
+ ["John Doe", "123 Main St", "jdoe@example.com"],
+ ["Jane Smith", "456 Maple Ave", "jsmith@example.com"],
]
- self.csv_path = Path('data/csv/test_file.csv')
+ self.csv_path = Path("data/csv/test_file.csv")
if self.csv_path.exists():
self.csv_path.unlink()
@@ -20,51 +20,51 @@ def tearDown(self):
if self.csv_path.exists():
self.csv_path.unlink()
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_save_csv_with_valid_data(self):
- result = save_csv('test_file', self.test_data)
- self.assertEqual(result, 'complete')
+ result = save_csv("test_file", self.test_data)
+ self.assertEqual(result, "complete")
self.assertTrue(self.csv_path.exists())
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_save_csv_with_invalid_data(self):
with self.assertRaises(TypeError):
- save_csv('test_file', 'not a list')
+ save_csv("test_file", "not a list")
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_save_csv_with_invalid_file_name(self):
with self.assertRaises(TypeError):
- save_csv('invalid/name', self.test_data)
+ save_csv("invalid/name", self.test_data)
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_save_csv_with_custom_delimiter(self):
- result = save_csv('test_file', self.test_data, delimiter=';')
- self.assertEqual(result, 'complete')
+ result = save_csv("test_file", self.test_data, delimiter=";")
+ self.assertEqual(result, "complete")
self.assertTrue(self.csv_path.exists())
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_save_csv_with_custom_quotechar(self):
- result = save_csv('test_file', self.test_data, quotechar="'")
- self.assertEqual(result, 'complete')
+ result = save_csv("test_file", self.test_data, quotechar="'")
+ self.assertEqual(result, "complete")
self.assertTrue(self.csv_path.exists())
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_save_csv_with_custom_root_folder(self):
- result = save_csv('test_file', self.test_data, root_folder='data/custom')
- self.assertEqual(result, 'complete')
- custom_path = Path('data/custom/csv/test_file.csv')
+ result = save_csv("test_file", self.test_data, root_folder="data/custom")
+ self.assertEqual(result, "complete")
+ custom_path = Path("data/custom/csv/test_file.csv")
self.assertTrue(custom_path.exists())
- @patch('dsg_lib.common_functions.file_functions.directory_to_files', 'data')
+ @patch("dsg_lib.common_functions.file_functions.directory_to_files", "data")
def test_save_csv_with_valid_data_two(self):
- result = save_csv('test_file', self.test_data)
- self.assertEqual(result, 'complete')
+ result = save_csv("test_file", self.test_data)
+ self.assertEqual(result, "complete")
self.assertTrue(self.csv_path.exists())
# Test invalid delimiter argument
with self.assertRaises(TypeError):
- save_csv('test_file', self.test_data, delimiter='invalid')
+ save_csv("test_file", self.test_data, delimiter="invalid")
# Test invalid quotechar argument
with self.assertRaises(TypeError):
- save_csv('test_file', self.test_data, quotechar='invalid')
+ save_csv("test_file", self.test_data, quotechar="invalid")
diff --git a/tests/test_common_functions/test_file_functions/test_save_json.py b/tests/test_common_functions/test_file_functions/test_save_json.py
index 5b2daceb..0529a7f7 100644
--- a/tests/test_common_functions/test_file_functions/test_save_json.py
+++ b/tests/test_common_functions/test_file_functions/test_save_json.py
@@ -9,15 +9,15 @@
class TestSaveJson(unittest.TestCase):
def setUp(self) -> None:
# Define test data
- self.valid_file_name = 'test.json'
- self.invalid_file_name = 'test/invalid.json'
- self.test_data = {'id': 1, 'name': 'Test'}
+ self.valid_file_name = "test.json"
+ self.invalid_file_name = "test/invalid.json"
+ self.test_data = {"id": 1, "name": "Test"}
def test_success_save_json(self):
# Test successful save of JSON file
result = save_json(self.valid_file_name, self.test_data)
- self.assertEqual(result, 'File saved successfully')
- file_path = Path('data/json') / self.valid_file_name
+ self.assertEqual(result, "File saved successfully")
+ file_path = Path("data/json") / self.valid_file_name
self.assertTrue(file_path.exists())
def test_invalid_file_name(self):
@@ -27,13 +27,13 @@ def test_invalid_file_name(self):
def test_invalid_data_type(self):
# Test raise error when data is not list or dictionary
- invalid_data = 'test'
+ invalid_data = "test"
with self.assertRaises(TypeError):
save_json(self.valid_file_name, invalid_data)
def tearDown(self) -> None:
# Remove created test files
- file_path = Path('data/json') / self.valid_file_name
+ file_path = Path("data/json") / self.valid_file_name
if file_path.exists():
file_path.unlink()
# shutil.rmtree(Path("data/json"))
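And the equivalent sketch for `save_json`, which accepts a list or dict (anything else raises `TypeError`) and reports success as a string, assuming the same `file_functions` home as its siblings:

```python
from dsg_lib.common_functions.file_functions import save_json

status = save_json("config.json", {"id": 1, "name": "Test"})
assert status == "File saved successfully"  # written under data/json/
```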
diff --git a/tests/test_common_functions/test_file_functions/test_save_text.py b/tests/test_common_functions/test_file_functions/test_save_text.py
index 4772de81..c4a9af47 100644
--- a/tests/test_common_functions/test_file_functions/test_save_text.py
+++ b/tests/test_common_functions/test_file_functions/test_save_text.py
@@ -11,43 +11,45 @@ class SaveTextTestCase(unittest.TestCase):
def setUp(self):
"""Create a temporary directory for the test files"""
self.temp_dir = TemporaryDirectory()
- self.data_dir = os.path.join(self.temp_dir.name, 'data')
+ self.data_dir = os.path.join(self.temp_dir.name, "data")
os.makedirs(self.data_dir)
def tearDown(self):
"""Delete the temporary directory and its contents"""
self.temp_dir.cleanup()
- @patch('dsg_lib.common_functions.file_functions.save_text', side_effect=save_text)
+ @patch("dsg_lib.common_functions.file_functions.save_text", side_effect=save_text)
def test_save_text(self, mock_save_text):
"""Test that text is saved to a file"""
# Create text to save to file
- text = 'Hello, world!'
+ text = "Hello, world!"
# Save text to file
- file_name = 'test_file'
+ file_name = "test_file"
mock_save_text(file_name=file_name, data=text, root_folder=self.data_dir)
# Check that file was created and contains the correct text
- file_path = os.path.join(self.data_dir, 'text', f'{file_name}.txt')
+ file_path = os.path.join(self.data_dir, "text", f"{file_name}.txt")
self.assertTrue(os.path.exists(file_path))
- with open(file_path, 'r') as file:
+ with open(file_path, "r") as file:
self.assertEqual(file.read(), text)
- @patch('dsg_lib.common_functions.file_functions.save_text', side_effect=save_text)
+ @patch("dsg_lib.common_functions.file_functions.save_text", side_effect=save_text)
def test_save_text_invalid_data(self, mock_save_text):
"""Test that an exception is raised when the data parameter is not a string"""
# Try to save a non-string value to a file
- file_name = 'test_file'
+ file_name = "test_file"
invalid_data = 123
with self.assertRaises(TypeError):
- mock_save_text(file_name=file_name, data=invalid_data, root_folder=self.data_dir)
+ mock_save_text(
+ file_name=file_name, data=invalid_data, root_folder=self.data_dir
+ )
- @patch('dsg_lib.common_functions.file_functions.save_text', side_effect=save_text)
+ @patch("dsg_lib.common_functions.file_functions.save_text", side_effect=save_text)
def test_save_text_invalid_file_name(self, mock_save_text):
"""Test that an exception is raised when the file name parameter contains a forward slash or backslash"""
# Try to save a file with an invalid file name
- file_name = 'invalid/file/name'
- text = 'This should not be saved to a file'
+ file_name = "invalid/file/name"
+ text = "This should not be saved to a file"
with self.assertRaises(ValueError):
mock_save_text(file_name=file_name, data=text, root_folder=self.data_dir)
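The patch target here gives `save_text`'s real import path, and the three tests cover its contract: it writes `<root_folder>/text/<file_name>.txt`, raises `TypeError` for non-string data, and `ValueError` for slashes in the file name. A direct (unmocked) sketch:

```python
from dsg_lib.common_functions.file_functions import save_text

save_text(file_name="greeting", data="Hello, world!", root_folder="data")
# writes data/text/greeting.txt; data=123 raises TypeError,
# file_name="a/b" raises ValueError
```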
diff --git a/tests/test_common_functions/test_folder_functions/test_folder_functions_directory_list.py b/tests/test_common_functions/test_folder_functions/test_folder_functions_directory_list.py
index c1a05dee..6df3a763 100644
--- a/tests/test_common_functions/test_folder_functions/test_folder_functions_directory_list.py
+++ b/tests/test_common_functions/test_folder_functions/test_folder_functions_directory_list.py
@@ -15,8 +15,8 @@ class TestGetDirectoryList(unittest.TestCase):
def setUp(self):
# Create a temporary directory for the tests
self.test_dir = tempfile.mkdtemp()
- self.subdir1 = os.path.join(self.test_dir, 'subdir1')
- self.subdir2 = os.path.join(self.test_dir, 'subdir2')
+ self.subdir1 = os.path.join(self.test_dir, "subdir1")
+ self.subdir2 = os.path.join(self.test_dir, "subdir2")
os.mkdir(self.subdir1)
os.mkdir(self.subdir2)
@@ -35,7 +35,7 @@ def test_get_directory_list(self):
def test_get_directory_list_nonexistent_dir(self):
# Try to get the list of directories in a nonexistent directory
- dir_list = get_directory_list('nonexistent_dir')
+ dir_list = get_directory_list("nonexistent_dir")
# Check that the function returns None
self.assertIsNone(dir_list)
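A small sketch of `get_directory_list`, which per these tests returns the subdirectories of an existing directory and `None` when the directory does not exist; the import path is assumed from the test module's location:

```python
from dsg_lib.common_functions.folder_functions import get_directory_list  # assumed path

subdirs = get_directory_list("data")
if subdirs is None:
    print("directory does not exist")
```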
diff --git a/tests/test_common_functions/test_folder_functions/test_folder_functions_file_change.py b/tests/test_common_functions/test_folder_functions/test_folder_functions_file_change.py
index 8ad6f14f..d4de7305 100644
--- a/tests/test_common_functions/test_folder_functions/test_folder_functions_file_change.py
+++ b/tests/test_common_functions/test_folder_functions/test_folder_functions_file_change.py
@@ -12,10 +12,10 @@
class TestLastDataFilesChanged(unittest.TestCase):
def setUp(self):
- self.test_dir = Path('test_folder_file_change')
+ self.test_dir = Path("test_folder_file_change")
self.test_dir.mkdir()
- self.file1 = self.test_dir / 'file1.txt'
- self.file2 = self.test_dir / 'file2.txt'
+ self.file1 = self.test_dir / "file1.txt"
+ self.file2 = self.test_dir / "file2.txt"
# Set the modification time of file1 to be 1 hour ago
file1_time_tuple = time.localtime(time.time() - 3600)
@@ -23,12 +23,12 @@ def setUp(self):
self.file1.touch()
# adding sleep time to make sure there is no conflict on last modified file
time.sleep(0.01)
- self.file1.write_text('test')
+ self.file1.write_text("test")
os.utime(
self.file1,
times=(time.mktime(file1_time_tuple), self.file1.stat().st_mtime),
)
- print(f'file1 modification time: {self.file1.stat().st_mtime}')
+ print(f"file1 modification time: {self.file1.stat().st_mtime}")
# Set the modification time of file2 to be 1 day ago
file2_time_tuple = time.localtime(time.time() - 86400)
@@ -36,10 +36,10 @@ def setUp(self):
self.file2,
times=(time.mktime(file2_time_tuple), self.file2.stat().st_mtime),
)
- print(f'file2 modification time: {self.file2.stat().st_mtime}')
+ print(f"file2 modification time: {self.file2.stat().st_mtime}")
def tearDown(self):
- for path in self.test_dir.glob('**/*'):
+ for path in self.test_dir.glob("**/*"):
try:
path.unlink(missing_ok=True)
except OSError:
@@ -52,8 +52,8 @@ def tearDown(self):
def test_get_last_modified_file(self):
# Get the last modified file in the test directory and check that it is file1.txt
last_modified_time, last_modified_file = last_data_files_changed(self.test_dir)
- print(f'last_modified_time: {last_modified_time}')
- print(f'last_modified_file: {last_modified_file}')
+ print(f"last_modified_time: {last_modified_time}")
+ print(f"last_modified_file: {last_modified_file}")
self.assertEqual(
last_modified_time,
datetime.datetime.fromtimestamp(self.file1.stat().st_mtime),
@@ -62,7 +62,7 @@ def test_get_last_modified_file(self):
def test_empty_directory(self):
# Get the last modified file in an empty directory and check that None is returned
- empty_dir = Path('empty_dir')
+ empty_dir = Path("empty_dir")
empty_dir.mkdir()
last_modified_time, last_modified_file = last_data_files_changed(empty_dir)
self.assertIsNone(last_modified_time)
@@ -71,7 +71,9 @@ def test_empty_directory(self):
def test_nonexistent_directory(self):
# Get the last modified file in a nonexistent directory and check that None is returned
- nonexistent_dir = Path('nonexistent_dir')
- last_modified_time, last_modified_file = last_data_files_changed(nonexistent_dir)
+ nonexistent_dir = Path("nonexistent_dir")
+ last_modified_time, last_modified_file = last_data_files_changed(
+ nonexistent_dir
+ )
self.assertIsNone(last_modified_time)
self.assertIsNone(last_modified_file)
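These three tests define `last_data_files_changed`'s return shape: a `(datetime, file)` pair for the most recently modified file, or `(None, None)` for empty or missing directories. A sketch, with the import path assumed as above:

```python
from pathlib import Path

from dsg_lib.common_functions.folder_functions import last_data_files_changed  # assumed path

last_time, last_file = last_data_files_changed(Path("data"))
if last_time is None:
    print("empty or missing directory")
else:
    print(f"{last_file} changed at {last_time}")
```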
diff --git a/tests/test_common_functions/test_folder_functions/test_folder_functions_make_folder.py b/tests/test_common_functions/test_folder_functions/test_folder_functions_make_folder.py
index 86d68f63..e4df90dc 100644
--- a/tests/test_common_functions/test_folder_functions/test_folder_functions_make_folder.py
+++ b/tests/test_common_functions/test_folder_functions/test_folder_functions_make_folder.py
@@ -13,7 +13,7 @@
class TestMakeFolder(unittest.TestCase):
def setUp(self):
- self.test_dir = Path('test_folder_make_folder')
+ self.test_dir = Path("test_folder_make_folder")
self.test_dir.mkdir()
def tearDown(self):
@@ -21,7 +21,7 @@ def tearDown(self):
def test_create_folder_successfully(self):
# Create a new folder and check that it was created successfully
- new_folder = self.test_dir / 'new_folder'
+ new_folder = self.test_dir / "new_folder"
self.assertTrue(make_folder(new_folder))
self.assertTrue(new_folder.is_dir())
@@ -33,7 +33,7 @@ def test_folder_already_exists(self):
def test_folder_name_contains_invalid_characters(self):
# Create a folder with an invalid name and check that an exception is raised
- invalid_folder_name = self.test_dir / 'new'
+ assert result["found"] == ["one", "two", "three"]
+ assert result["matched_found"] == 3
+ assert result["pattern_parameters"]["left_character"] == "<"
+ assert result["pattern_parameters"]["right_character"] == ">"
# assert result["pattern_parameters"]["regex_pattern"] == "<(.+?)\>"
- assert result['pattern_parameters']['regex_pattern'] is not None
- assert result['pattern_parameters']['text_string'] == 'abc123456'
+ assert result["pattern_parameters"]["regex_pattern"] is not None
+ assert (
+ result["pattern_parameters"]["text_string"] == "abc123456"
+ )
def test_pattern_between_two_char_edge_cases(self):
# test with very long input string
- long_input = 'xyz' * 10000
- long_text = f'{long_input}abc123456{long_input}'
+ long_input = "xyz" * 10000
+ long_text = f"{long_input}abc123456{long_input}"
result = pattern_between_two_char(
- text_string=long_text, left_characters='<', right_characters='>'
+ text_string=long_text, left_characters="<", right_characters=">"
)
- assert result['found'] == ['one', 'two', 'three']
- assert result['matched_found'] == 3
- assert len(result['pattern_parameters']['text_string']) > 20000
+ assert result["found"] == ["one", "two", "three"]
+ assert result["matched_found"] == 3
+ assert len(result["pattern_parameters"]["text_string"]) > 20000
# test with special characters in input string
result = pattern_between_two_char(
- text_string='*c]', left_characters='*', right_characters=']'
+ text_string="*c]", left_characters="*", right_characters="]"
)
print(result)
- assert result['found'] == ['c\\']
+ assert result["found"] == ["c\\"]
diff --git a/tests/test_database_functions/test_async_database.py b/tests/test_database_functions/test_async_database.py
index 0b8d8b63..0bc2aa21 100644
--- a/tests/test_database_functions/test_async_database.py
+++ b/tests/test_database_functions/test_async_database.py
@@ -11,10 +11,10 @@
from dsg_lib.async_database_functions.database_operations import DatabaseOperations
config = {
- 'database_uri': 'sqlite+aiosqlite:///:memory:?cache=shared',
- 'echo': True,
- 'future': True,
- 'pool_recycle': 3600,
+ "database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
+ "echo": True,
+ "future": True,
+ "pool_recycle": 3600,
}
db_config = DBConfig(config)
async_db = AsyncDatabase(db_config)
@@ -23,13 +23,13 @@
# Define User class here
class User(async_db.Base):
- __tablename__ = 'users'
+ __tablename__ = "users"
pkid = Column(Integer, primary_key=True)
name = Column(String, unique=True)
class TestDatabaseOperations:
- @pytest.fixture(scope='session')
+ @pytest.fixture(scope="session")
def db_ops(self):
loop = asyncio.get_event_loop()
loop.run_until_complete(async_db.create_tables())
@@ -38,7 +38,7 @@ def db_ops(self):
@pytest.mark.asyncio
async def test_count_query(self, db_ops):
# db_ops is already awaited by pytest, so you can use it directly
- users = [User(name=f'User{i}-{secrets.token_hex(2)}') for i in range(10)]
+ users = [User(name=f"User{i}-{secrets.token_hex(2)}") for i in range(10)]
result = await db_ops.create_many(users)
assert len(result) == 10
@@ -58,31 +58,31 @@ async def test_count_query_sqlalchemy_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an SQLAlchemyError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=SQLAlchemyError('Test error message'),
+ "get_db_session",
+ side_effect=SQLAlchemyError("Test error message"),
)
# Check that count_query returns an error dictionary
result = await db_ops.count_query(select(User))
- assert result == {'error': 'SQLAlchemyError', 'details': 'Test error message'}
+ assert result == {"error": "SQLAlchemyError", "details": "Test error message"}
@pytest.mark.asyncio
async def test_count_query_general_exception(self, db_ops, mocker):
# Mock the get_db_session method to raise an Exception
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=Exception('Test error message'),
+ "get_db_session",
+ side_effect=Exception("Test error message"),
)
# Check that count_query returns an error dictionary
result = await db_ops.count_query(select(User))
- assert result == {'error': 'General Exception', 'details': 'Test error message'}
+ assert result == {"error": "General Exception", "details": "Test error message"}
@pytest.mark.asyncio
async def test_read_query(self, db_ops):
# db_ops is already awaited by pytest, so you can use it directly
- user = User(name='Mike')
+ user = User(name="Mike")
await db_ops.create_one(user)
data = await db_ops.read_query(select(User))
assert isinstance(data, list)
@@ -93,68 +93,68 @@ async def test_read_query_sqlalchemy_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an SQLAlchemyError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=SQLAlchemyError('Test error message'),
+ "get_db_session",
+ side_effect=SQLAlchemyError("Test error message"),
)
# Check that read_query returns an error dictionary
result = await db_ops.read_query(select(User))
- assert result == {'error': 'SQLAlchemyError', 'details': 'Test error message'}
+ assert result == {"error": "SQLAlchemyError", "details": "Test error message"}
@pytest.mark.asyncio
async def test_read_query_general_exception(self, db_ops, mocker):
# Mock the get_db_session method to raise an Exception
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=Exception('Test error message'),
+ "get_db_session",
+ side_effect=Exception("Test error message"),
)
# Check that read_query returns an error dictionary
result = await db_ops.read_query(select(User))
- assert result == {'error': 'General Exception', 'details': 'Test error message'}
+ assert result == {"error": "General Exception", "details": "Test error message"}
@pytest.mark.asyncio
async def test_read_multi_query(self, db_ops):
# db_ops is already awaited by pytest, so you can use it directly
- queries = {'all_users': select(User)}
+ queries = {"all_users": select(User)}
results = await db_ops.read_multi_query(queries)
assert isinstance(results, dict)
- assert 'all_users' in results
- assert isinstance(results['all_users'], list)
+ assert "all_users" in results
+ assert isinstance(results["all_users"], list)
@pytest.mark.asyncio
async def test_read_multi_query_sqlalchemy_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an SQLAlchemyError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=SQLAlchemyError('Test error message'),
+ "get_db_session",
+ side_effect=SQLAlchemyError("Test error message"),
)
# Check that read_multi_query returns an error dictionary
- queries = {'test_query': select(User)}
+ queries = {"test_query": select(User)}
result = await db_ops.read_multi_query(queries)
- assert result == {'error': 'SQLAlchemyError', 'details': 'Test error message'}
+ assert result == {"error": "SQLAlchemyError", "details": "Test error message"}
@pytest.mark.asyncio
async def test_read_multi_query_general_exception(self, db_ops, mocker):
# Mock the get_db_session method to raise an Exception
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=Exception('Test error message'),
+ "get_db_session",
+ side_effect=Exception("Test error message"),
)
# Check that read_multi_query returns an error dictionary
- queries = {'test_query': select(User)}
+ queries = {"test_query": select(User)}
result = await db_ops.read_multi_query(queries)
- assert result == {'error': 'General Exception', 'details': 'Test error message'}
+ assert result == {"error": "General Exception", "details": "Test error message"}
@pytest.mark.asyncio
async def test_create_one(self, db_ops):
# db_ops is already awaited by pytest, so you can use it directly
- user_name = f'Mike{secrets.randbelow(100000)}'
+ user_name = f"Mike{secrets.randbelow(100000)}"
user = User(name=user_name)
result = await db_ops.create_one(user)
assert isinstance(result, User)
@@ -165,47 +165,47 @@ async def test_create_one_sqlalchemy_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an SQLAlchemyError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=SQLAlchemyError('Test error message'),
+ "get_db_session",
+ side_effect=SQLAlchemyError("Test error message"),
)
# Check that create_one returns an error dictionary
- result = await db_ops.create_one(User(name='test'))
- assert result == {'error': 'SQLAlchemyError', 'details': 'Test error message'}
+ result = await db_ops.create_one(User(name="test"))
+ assert result == {"error": "SQLAlchemyError", "details": "Test error message"}
@pytest.mark.asyncio
async def test_create_one_general_exception(self, db_ops, mocker):
# Mock the get_db_session method to raise an Exception
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=Exception('Test error message'),
+ "get_db_session",
+ side_effect=Exception("Test error message"),
)
# Check that create_one returns an error dictionary
- result = await db_ops.create_one(User(name='test'))
- assert result == {'error': 'General Exception', 'details': 'Test error message'}
+ result = await db_ops.create_one(User(name="test"))
+ assert result == {"error": "General Exception", "details": "Test error message"}
@pytest.mark.asyncio
async def test_create_one_integrity_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an IntegrityError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=IntegrityError(None, None, 'Test error message'),
+ "get_db_session",
+ side_effect=IntegrityError(None, None, "Test error message"),
)
# Check that create_one returns an error dictionary
- result = await db_ops.create_one(User(name='test'))
+ result = await db_ops.create_one(User(name="test"))
assert result == {
- 'error': 'IntegrityError',
- 'details': '(builtins.str) Test error message\n(Background on this error at: https://sqlalche.me/e/20/gkpj)',
+ "error": "IntegrityError",
+ "details": "(builtins.str) Test error message\n(Background on this error at: https://sqlalche.me/e/20/gkpj)",
}
@pytest.mark.asyncio
async def test_create_many(self, db_ops):
# db_ops is already awaited by pytest, so you can use it directly
- users = [User(name=f'User{i}') for i in range(10)]
+ users = [User(name=f"User{i}") for i in range(10)]
result = await db_ops.create_many(users)
assert isinstance(result, list)
assert len(result) == 10
@@ -215,62 +215,64 @@ async def test_create_many_sqlalchemy_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an SQLAlchemyError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=SQLAlchemyError('Test error message'),
+ "get_db_session",
+ side_effect=SQLAlchemyError("Test error message"),
)
# Check that create_many returns an error dictionary
- result = await db_ops.create_many([User(name='test1'), User(name='test2')])
- assert result == {'error': 'SQLAlchemyError', 'details': 'Test error message'}
+ result = await db_ops.create_many([User(name="test1"), User(name="test2")])
+ assert result == {"error": "SQLAlchemyError", "details": "Test error message"}
@pytest.mark.asyncio
async def test_create_many_general_exception(self, db_ops, mocker):
# Mock the get_db_session method to raise an Exception
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=Exception('Test error message'),
+ "get_db_session",
+ side_effect=Exception("Test error message"),
)
# Check that create_many returns an error dictionary
- result = await db_ops.create_many([User(name='test1'), User(name='test2')])
- assert result == {'error': 'General Exception', 'details': 'Test error message'}
+ result = await db_ops.create_many([User(name="test1"), User(name="test2")])
+ assert result == {"error": "General Exception", "details": "Test error message"}
@pytest.mark.asyncio
async def test_create_many_integrity_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an IntegrityError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
+ "get_db_session",
side_effect=IntegrityError(None, None, None, None),
)
# Check that create_many returns an error dictionary
- result = await db_ops.create_many([User(name='test1'), User(name='test2')])
- assert result['error'] == 'IntegrityError' # Corrected spelling
- assert isinstance(result['details'], str)
- assert result['details'] != ''
+ result = await db_ops.create_many([User(name="test1"), User(name="test2")])
+ assert result["error"] == "IntegrityError" # Corrected spelling
+ assert isinstance(result["details"], str)
+ assert result["details"] != ""
@pytest.mark.asyncio
async def test_update_one(self, db_ops):
# db_ops is already awaited by pytest, so you can use it directly
- name = f'Mike{secrets.randbelow(1000)}'
+ name = f"Mike{secrets.randbelow(1000)}"
user = User(name=name)
user_record = await db_ops.create_one(user)
- updated_user = {'name': 'John12345'}
+ updated_user = {"name": "John12345"}
result = await db_ops.update_one(
table=User, record_id=user_record.pkid, new_values=updated_user
)
assert isinstance(result, User)
- assert result.name == 'John12345'
+ assert result.name == "John12345"
@pytest.mark.asyncio
async def test_update_one_record_not_found(self, db_ops):
# Check that update_one returns an error dictionary when no record is found
- result = await db_ops.update_one(table=User, record_id=9999, new_values={'name': 'test'})
+ result = await db_ops.update_one(
+ table=User, record_id=9999, new_values={"name": "test"}
+ )
assert result == {
- 'error': 'Record not found',
- 'details': 'No record found with pkid 9999',
+ "error": "Record not found",
+ "details": "No record found with pkid 9999",
}
@pytest.mark.asyncio
@@ -278,16 +280,18 @@ async def test_update_one_integrity_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an IntegrityError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=IntegrityError(None, None, 'Test error message'),
+ "get_db_session",
+ side_effect=IntegrityError(None, None, "Test error message"),
)
# Check that update_one returns an error dictionary
- updated_user = {'name': 'John12345', 'pkid': 'bob'}
- result = await db_ops.update_one(table=User, record_id=1, new_values=updated_user)
+ updated_user = {"name": "John12345", "pkid": "bob"}
+ result = await db_ops.update_one(
+ table=User, record_id=1, new_values=updated_user
+ )
assert result == {
- 'error': 'IntegrityError',
- 'details': '(builtins.str) Test error message\n(Background on this error at: https://sqlalche.me/e/20/gkpj)',
+ "error": "IntegrityError",
+ "details": "(builtins.str) Test error message\n(Background on this error at: https://sqlalche.me/e/20/gkpj)",
}
@pytest.mark.asyncio
@@ -295,15 +299,17 @@ async def test_update_one_sqlalchemy_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an SQLAlchemyError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=SQLAlchemyError('Test error message'),
+ "get_db_session",
+ side_effect=SQLAlchemyError("Test error message"),
)
# Check that update_one returns an error dictionary
- result = await db_ops.update_one(table=User, record_id=1, new_values={'name': 'test'})
+ result = await db_ops.update_one(
+ table=User, record_id=1, new_values={"name": "test"}
+ )
assert result == {
- 'error': 'SQLAlchemyError',
- 'details': 'Test error message',
+ "error": "SQLAlchemyError",
+ "details": "Test error message",
}
@pytest.mark.asyncio
@@ -311,29 +317,31 @@ async def test_update_one_general_exception(self, db_ops, mocker):
# Mock the get_db_session method to raise an Exception
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=Exception('Test error message'),
+ "get_db_session",
+ side_effect=Exception("Test error message"),
)
# Check that update_one returns an error dictionary
- result = await db_ops.update_one(table=User, record_id=1, new_values={'name': 'test'})
- assert result == {'error': 'General Exception', 'details': 'Test error message'}
+ result = await db_ops.update_one(
+ table=User, record_id=1, new_values={"name": "test"}
+ )
+ assert result == {"error": "General Exception", "details": "Test error message"}
@pytest.mark.asyncio
async def test_delete_one(self, db_ops):
# db_ops is already awaited by pytest, so you can use it directly
- user = User(name='Mike12345')
+ user = User(name="Mike12345")
user_record = await db_ops.create_one(user)
result = await db_ops.delete_one(table=User, record_id=user_record.pkid)
- assert result == {'success': 'Record deleted successfully'}
+ assert result == {"success": "Record deleted successfully"}
@pytest.mark.asyncio
async def test_delete_one_record_not_found(self, db_ops):
# Check that delete_one returns an error dictionary when no record is found
result = await db_ops.delete_one(table=User, record_id=9999)
assert result == {
- 'error': 'Record not found',
- 'details': 'No record found with pkid 9999',
+ "error": "Record not found",
+ "details": "No record found with pkid 9999",
}
@pytest.mark.asyncio
@@ -341,26 +349,26 @@ async def test_delete_one_sqlalchemy_error(self, db_ops, mocker):
# Mock the get_db_session method to raise an SQLAlchemyError
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=SQLAlchemyError('Test error message'),
+ "get_db_session",
+ side_effect=SQLAlchemyError("Test error message"),
)
# Check that delete_one returns an error dictionary
result = await db_ops.delete_one(table=User, record_id=1)
- assert result == {'error': 'SQLAlchemyError', 'details': 'Test error message'}
+ assert result == {"error": "SQLAlchemyError", "details": "Test error message"}
@pytest.mark.asyncio
async def test_delete_one_general_exception(self, db_ops, mocker):
# Mock the get_db_session method to raise an Exception
mocker.patch.object(
db_ops.async_db,
- 'get_db_session',
- side_effect=Exception('Test error message'),
+ "get_db_session",
+ side_effect=Exception("Test error message"),
)
# Check that delete_one returns an error dictionary
result = await db_ops.delete_one(table=User, record_id=1)
- assert result == {'error': 'General Exception', 'details': 'Test error message'}
+ assert result == {"error": "General Exception", "details": "Test error message"}
@pytest.mark.asyncio
async def test_get_table_names(self, db_ops):
@@ -379,7 +387,7 @@ async def test_get_primary_keys(self, db_ops):
@pytest.mark.asyncio
async def test_read_one_record(self, db_ops):
- user_name = f'Mike{secrets.randbelow(1000)}'
+ user_name = f"Mike{secrets.randbelow(1000)}"
user = User(name=user_name)
result = await db_ops.create_one(user)
assert isinstance(result, User)
@@ -390,7 +398,7 @@ async def test_read_one_record(self, db_ops):
@pytest.mark.asyncio
async def test_read_one_record_none(self, db_ops):
- user_name = f'Mike{secrets.token_hex(10)}'
+ user_name = f"Mike{secrets.token_hex(10)}"
user = select(User).where(User.name == user_name)
data = await db_ops.read_one_record(user)
@@ -402,14 +410,16 @@ async def test_delete_many(self, db_ops):
import secrets
# db_ops is already awaited by pytest, so you can use it directly
- users = [User(name=f'User{i}-{secrets.token_hex(2)}') for i in range(100)]
+ users = [User(name=f"User{i}-{secrets.token_hex(2)}") for i in range(100)]
result = await db_ops.create_many(users)
assert len(result) == 100
# get all pkids from results
pkids = [user.pkid for user in result]
assert len(pkids) == 100
# delete all users
- result = await db_ops.delete_many(table=User, id_column_name='pkid', id_values=pkids)
+ result = await db_ops.delete_many(
+ table=User, id_column_name="pkid", id_values=pkids
+ )
# assert isinstance(result, int)
assert result == 100
@@ -418,7 +428,7 @@ async def test_delete_many_exception(self, db_ops):
import secrets
# db_ops is already awaited by pytest, so you can use it directly
- users = [User(name=f'User{i}-{secrets.token_hex(2)}') for i in range(10)]
+ users = [User(name=f"User{i}-{secrets.token_hex(2)}") for i in range(10)]
result = await db_ops.create_many(users)
assert len(result) == 10
# get all pkids from results
@@ -428,4 +438,4 @@ async def test_delete_many_exception(self, db_ops):
table=User, id_column_name=secrets.token_hex(4), id_values=pkids
)
# assert result contains "error"
- assert 'error' in result
+ assert "error" in result
diff --git a/tests/test_database_functions/test_base_schema.py b/tests/test_database_functions/test_base_schema.py
index 9bacbc75..10395e75 100644
--- a/tests/test_database_functions/test_base_schema.py
+++ b/tests/test_database_functions/test_base_schema.py
@@ -11,7 +11,8 @@
# Get the database URL from the environment variable
database_url = os.getenv(
- 'DATABASE_URL', 'postgresql://postgres:postgres@postgresdbTest:5432/dsglib_test'
+ "DATABASE_URL",
+ "postgresql://postgres:postgres@postgresdbTest:5432/dsglib_test",
# postgres://postgres:postgres@postgresdb:5432/devsetgo_local
)
@@ -20,19 +21,19 @@
# Define a dictionary with the connection strings for each database
# Replace the placeholders with your actual connection details
DATABASES = {
- 'sqlite': 'sqlite:///:memory:',
- 'postgres': database_url,
+ "sqlite": "sqlite:///:memory:",
+ "postgres": database_url,
}
# Define a dictionary with the schema base classes for each database
SCHEMA_BASES = {
- 'sqlite': SchemaBaseSQLite,
- 'postgres': SchemaBasePostgres,
+ "sqlite": SchemaBaseSQLite,
+ "postgres": SchemaBasePostgres,
}
# Parameterize the test function with the names of the databases
-@pytest.mark.parametrize('db_name', DATABASES.keys())
+@pytest.mark.parametrize("db_name", DATABASES.keys())
def test_schema_base(db_name):
# Get the connection string and schema base class for the current database
connection_string = DATABASES[db_name]
@@ -40,7 +41,7 @@ def test_schema_base(db_name):
# Define the User model for the current database
class User(SchemaBase, Base):
- __tablename__ = f'test_table_{db_name}'
+ __tablename__ = f"test_table_{db_name}"
name_first = Column(String, unique=False, index=True)
# Set up the database engine and session factory
@@ -55,7 +56,7 @@ class User(SchemaBase, Base):
try:
user = User()
- user.name_first = 'Test'
+ user.name_first = "Test"
# Add the instance to the session and commit it to generate id
session.add(user)
diff --git a/tests/test_database_functions/test_db_config.py b/tests/test_database_functions/test_db_config.py
index 2bcae31a..0c6abaa5 100644
--- a/tests/test_database_functions/test_db_config.py
+++ b/tests/test_database_functions/test_db_config.py
@@ -8,10 +8,10 @@
def test_sqlite_supported_parameters():
config = {
- 'database_uri': 'sqlite+aiosqlite:///:memory:',
- 'echo': True,
- 'future': True,
- 'pool_recycle': 3600,
+ "database_uri": "sqlite+aiosqlite:///:memory:",
+ "echo": True,
+ "future": True,
+ "pool_recycle": 3600,
}
db_config = DBConfig(config)
assert isinstance(db_config.engine, AsyncEngine)
@@ -20,10 +20,10 @@ def test_sqlite_supported_parameters():
def test_sqlite_unsupported_parameters():
config = {
- 'database_uri': 'sqlite+aiosqlite:///:memory:?cache=shared',
- 'pool_size': 5,
- 'max_overflow': 10,
- 'pool_timeout': 30,
+ "database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
+ "pool_size": 5,
+ "max_overflow": 10,
+ "pool_timeout": 30,
}
with pytest.raises(Exception):
DBConfig(config)
@@ -31,14 +31,14 @@ def test_sqlite_unsupported_parameters():
def test_postgresql_supported_parameters():
config = {
- 'database_uri': 'postgresql+asyncpg://postgres:postgres@db/postgres',
- 'echo': True,
- 'future': True,
- 'pool_pre_ping': True,
- 'pool_size': 5,
- 'max_overflow': 10,
- 'pool_recycle': 3600,
- 'pool_timeout': 30,
+ "database_uri": "postgresql+asyncpg://postgres:postgres@db/postgres",
+ "echo": True,
+ "future": True,
+ "pool_pre_ping": True,
+ "pool_size": 5,
+ "max_overflow": 10,
+ "pool_recycle": 3600,
+ "pool_timeout": 30,
}
db_config = DBConfig(config)
assert isinstance(db_config.engine, AsyncEngine)
@@ -47,8 +47,8 @@ def test_postgresql_supported_parameters():
def test_postgresql_unsupported_parameters():
config = {
- 'database_uri': 'postgresql+asyncpg://postgres:postgres@db/postgres',
- 'unsupported_option': True,
+ "database_uri": "postgresql+asyncpg://postgres:postgres@db/postgres",
+ "unsupported_option": True,
}
with pytest.raises(Exception):
DBConfig(config)
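In other words, `DBConfig` validates its options per backend: SQLite accepts `echo`, `future`, and `pool_recycle` but rejects pool sizing options, while PostgreSQL accepts the full pool set and rejects unknown keys. A quick sketch of the rejection path, with an assumed import location:

```python
from dsg_lib.async_database_functions.database_config import DBConfig  # assumed path

try:
    DBConfig({"database_uri": "sqlite+aiosqlite:///:memory:", "pool_size": 5})
except Exception as exc:
    print(f"rejected unsupported parameter: {exc}")
```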
diff --git a/tests/test_endpoints/test_http_codes.py b/tests/test_endpoints/test_http_codes.py
index 9be69b5a..8c0554d7 100644
--- a/tests/test_endpoints/test_http_codes.py
+++ b/tests/test_endpoints/test_http_codes.py
@@ -7,7 +7,7 @@
def test_generate_code_dict_description_only():
codes = [200, 404]
result = generate_code_dict(codes, description_only=True)
- assert result == {200: 'OK', 404: 'Not Found'}
+ assert result == {200: "OK", 404: "Not Found"}
def test_generate_code_dict_all_info():
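Based on the assertion above, `generate_code_dict` with `description_only=True` maps HTTP status codes to their reason phrases. A sketch with an assumed import path:

```python
from dsg_lib.fastapi_functions.http_codes import generate_code_dict  # assumed path

responses = generate_code_dict([200, 404], description_only=True)
print(responses)  # {200: "OK", 404: "Not Found"}
```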
diff --git a/tests/test_endpoints/test_system_health_endpoints.py b/tests/test_endpoints/test_system_health_endpoints.py
index 997e2a3b..7649ff6b 100644
--- a/tests/test_endpoints/test_system_health_endpoints.py
+++ b/tests/test_endpoints/test_system_health_endpoints.py
@@ -15,45 +15,45 @@
config = {
# "enable_status_endpoint": False, # on by default
# "enable_uptime_endpoint": False, # on by default
- 'enable_heapdump_endpoint': True, # off by default
+ "enable_heapdump_endpoint": True, # off by default
}
# Health router
health_router = create_health_router(config)
-app.include_router(health_router, prefix='/api/health', tags=['system-health'])
+app.include_router(health_router, prefix="/api/health", tags=["system-health"])
def test_health_status():
- response = client.get('/api/health/status')
+ response = client.get("/api/health/status")
assert response.status_code == 200
- assert response.json() == {'status': 'UP'}
+ assert response.json() == {"status": "UP"}
def test_get_uptime():
- response = client.get('/api/health/uptime')
+ response = client.get("/api/health/uptime")
assert response.status_code == 200
- assert 'uptime' in response.json()
+ assert "uptime" in response.json()
# Check that the uptime dictionary has the expected keys
- assert set(response.json()['uptime'].keys()) == {
- 'Days',
- 'Hours',
- 'Minutes',
- 'Seconds',
+ assert set(response.json()["uptime"].keys()) == {
+ "Days",
+ "Hours",
+ "Minutes",
+ "Seconds",
}
def test_get_heapdump():
tracemalloc.start() # Start tracing memory allocations
- response = client.get('/api/health/heapdump')
+ response = client.get("/api/health/heapdump")
tracemalloc.stop() # Stop tracing memory allocations
assert response.status_code == 200
- assert 'heap_dump' in response.json()
+ assert "heap_dump" in response.json()
# Check that each item in the heap dump has the expected keys
- for item in response.json()['heap_dump']:
- assert set(item.keys()) == {'filename', 'lineno', 'size', 'count'}
+ for item in response.json()["heap_dump"]:
+ assert set(item.keys()) == {"filename", "lineno", "size", "count"}
-@patch('tracemalloc.stop') # , side_effect=TracemallocNotStartedError())
+@patch("tracemalloc.stop") # , side_effect=TracemallocNotStartedError())
def test_get_heapdump_tracemalloc_error(mock_start):
- response = client.get('/api/health/heapdump')
+ response = client.get("/api/health/heapdump")
assert response.status_code == 500
- assert 'detail' in response.json()
+ assert "detail" in response.json()
diff --git a/unreleased/log_example_structlog.py b/unreleased/log_example_structlog.py
new file mode 100644
index 00000000..b84ebda6
--- /dev/null
+++ b/unreleased/log_example_structlog.py
@@ -0,0 +1,104 @@
+# -*- coding: utf-8 -*-
+"""
+Author: Mike Ryan
+Date: 2024/05/16
+License: MIT
+"""
+import logging
+import multiprocessing
+import secrets
+import threading
+
+import structlog
+from tqdm import tqdm
+
+from dsg_lib.common_functions import logging_config_structlog
+
+
+logger = structlog.get_logger()
+
+
+def div_zero(x, y):
+ try:
+ return x / y
+ except ZeroDivisionError as e:
+ logger.error("division by zero", error=str(e))
+ logging.error(f"{e}")
+
+
+def div_zero_two(x, y):
+ try:
+ return x / y
+ except ZeroDivisionError as e:
+ logger.error("division by zero", error=str(e))
+ logging.error(f"{e}")
+
+
+def log_big_string(lqty=100, size=256):
+ # big_string = secrets.token_urlsafe(size)
+ big_string = """
+ Bacon ipsum dolor amet meatball kielbasa chislic, corned beef ham hock frankfurter jowl sirloin meatloaf ribeye boudin. Capicola ham hock pork landjaeger, jerky t-bone strip steak pork chop boudin shankle tri-tip andouille pork belly flank.
+ """
+ for _ in range(lqty):
+ logging.debug(f"Let's make this a big message {big_string}")
+ div_zero(x=1, y=0)
+ div_zero_two(x=1, y=0)
+ logger.debug("This is a debug message")
+ logger.info("This is an info message")
+ logger.error("This is an error message")
+ logger.warning("This is a warning message")
+ logger.critical("This is a critical message")
+
+ logging.debug("This is a debug message")
+ logging.info("This is an info message")
+ logging.error("This is an error message")
+ logging.warning("This is a warning message")
+ logging.critical("This is a critical message")
+
+
+def worker(wqty=1000, lqty=100, size=256):
+ for _ in tqdm(range(wqty), desc="Worker", leave=True, ascii=True): # Adjusted for demonstration
+ log_big_string(lqty=lqty, size=size)
+
+
+def main(wqty: int = 100, lqty: int = 10, size: int = 256, workers: int = 16, thread_test: bool = False, process_test: bool = False):
+ if process_test:
+ processes = []
+ # Create worker processes
+ for _ in tqdm(range(workers), desc="Multi-Processing Start", leave=True):
+ p = multiprocessing.Process(
+ target=worker, args=(wqty, lqty, size,))
+ processes.append(p)
+ p.start()
+
+        for p in tqdm(processes, desc="Multi-Processing Gather", leave=True):
+ p.join(timeout=60) # Timeout after 60 seconds
+ if p.is_alive():
+ logger.error(f"Process {p.name} is hanging. Terminating.")
+ p.terminate()
+ p.join()
+
+ if thread_test:
+ threads = []
+ for _ in tqdm(range(workers), desc="Threading Start", leave=True): # Create worker threads
+ t = threading.Thread(target=worker, args=(wqty, lqty, size,))
+ threads.append(t)
+ t.start()
+
+ for t in tqdm(threads, desc="Threading Gather", leave=True):
+ t.join()
+
+
+if __name__ == "__main__":
+ logging_config_structlog.configure_logging(
+ logging_directory='log',
+ log_name='log',
+ logging_level='INFO',
+ log_rotation=100, # Size in MB
+ log_retention=10
+ )
+ from time import time
+ start = time()
+ main(wqty=100, lqty=100, size=256, workers=16,
+ thread_test=True, process_test=True)
+ print(f"Execution time: {time()-start:.2f} seconds")
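+
+# What to expect when this runs (a sketch; assumes dsg_lib exposes the
+# unreleased logging_config_structlog module imported above): JSON log files
+# named log_<timestamp>.json appear under ./log, rotating at 100 MB with up
+# to 10 backups, while tqdm bars track worker progress.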
diff --git a/unreleased/logging_config_structlog.py b/unreleased/logging_config_structlog.py
new file mode 100644
index 00000000..0ef6f694
--- /dev/null
+++ b/unreleased/logging_config_structlog.py
@@ -0,0 +1,178 @@
+"""
+This module provides a logging configuration setup with structlog, including
+support for safe log rotation and multiprocessing.
+"""
+
+import logging
+import logging.handlers
+import os
+from collections import deque
+from datetime import datetime
+from multiprocessing import Event, Lock, Process, Queue
+from queue import Empty
+
+import structlog
+from pythonjsonlogger import jsonlogger
+
+rotation_lock = Lock()
+
+class SafeRotatingFileHandler(logging.handlers.RotatingFileHandler):
+ """A rotating file handler that safely handles log file rotation."""
+ def __init__(self, *args, **kwargs):
+ super().__init__(*args, **kwargs)
+        self.process_name = os.getpid()  # PID, used to namespace this process's rotated files
+
+ def doRollover(self):
+ """Perform the log file rollover."""
+ with rotation_lock:
+ if self.stream:
+ self.stream.close()
+ self.stream = None
+
+ dfn = self.rotation_filename(f"{self.baseFilename}.{self.process_name}.{self.backupCount}")
+ if os.path.exists(dfn):
+ os.remove(dfn)
+ self.rotate(self.baseFilename, dfn)
+
+ if not self.delay:
+ self.stream = self._open()
+
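+# Naming sketch (PID is illustrative): each process rotates to its own backup,
+# e.g. "log_20240716120000.json.12345.10", so concurrent writers do not
+# clobber one another's rotated files.
+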
+class QueueHandler(logging.Handler):
+ """This is a logging handler which sends events to a queue. It can be used
+ from different processes to send logs to a single log file.
+ """
+ def __init__(self, log_queue):
+ super().__init__()
+ self.log_queue = log_queue
+
+ def emit(self, record):
+ try:
+ self.log_queue.put_nowait(record)
+ except Exception as e:
+ print(f"Error emitting log record: {e}")
+
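+# Illustrative use from a worker (names here are hypothetical):
+#
+#     log_queue = Queue()
+#     logging.getLogger().addHandler(QueueHandler(log_queue))
+#     logging.info("hello")  # record is enqueued, not written directly
+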
+class QueueListener:
+ """This is a listener which receives log events from the queue and processes
+ them. It should be run in a separate process.
+ """
+ def __init__(self, log_queue, handlers):
+ self.log_queue = log_queue
+ self.handlers = handlers
+        # A multiprocessing Event, so stop() also works when the listener
+        # runs in a separate process (see configure_logging below).
+        self.stop_event = Event()
+
+ def start(self):
+ """Start the queue listener."""
+ while not self.stop_event.is_set():
+ try:
+ record = self.log_queue.get(timeout=0.05)
+ self.handle(record)
+ except Empty:
+ continue
+
+ def handle(self, record):
+ """Handle a log record."""
+ for handler in self.handlers:
+ handler.handle(record)
+
+ def stop(self):
+ """Stop the queue listener."""
+ self.stop_event.set()
+
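+# A minimal pairing sketch (assumes a fork start method so the open file
+# handlers survive into the child process; names are illustrative):
+#
+#     listener = QueueListener(log_queue, [file_handler])
+#     Process(target=listener.start).start()  # child drains the queue
+#     ...
+#     listener.stop()  # the multiprocessing Event crosses process boundaries
+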
+class CachingRotatingFileHandler(logging.handlers.RotatingFileHandler):
+ """A rotating file handler with caching capabilities."""
+ def __init__(self, *args, **kwargs):
+ super().__init__(*args, **kwargs)
+ self.cache = deque(maxlen=1000)
+
+    def emit(self, record):
+        """Cache the record, flushing the batch to disk once the cache fills."""
+        try:
+            self.cache.append(record)
+            if len(self.cache) == self.cache.maxlen:
+                self.flush_cache()
+        except Exception as e:
+            print(f"Error caching log record: {e}")
+
+ def flush_cache(self):
+ """Flush the cache."""
+ while self.cache:
+ record = self.cache.popleft()
+ super().emit(record)
+
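+# Note: emit() batches records in the deque (up to its maxlen of 1000) and
+# writes them via flush_cache() when it fills; configure_logging() below also
+# flushes once at setup.
+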
+def configure_logging(
+ logging_directory: str = 'log',
+ log_name: str = 'log',
+ logging_level: str = 'INFO',
+ log_rotation: int = 100, # Size in MB
+ log_retention: int = 10,
+ multiprocess: bool = False
+):
+ """Configure logging with rotating file handlers."""
+ if not os.path.exists(logging_directory):
+ os.makedirs(logging_directory)
+
+ timestamp = datetime.now().strftime("%Y%m%d%H%M%S")
+ log_name = f"{log_name}_{timestamp}.json"
+ log_path = os.path.join(logging_directory, log_name)
+ max_bytes = log_rotation * 1024 * 1024
+
+ cache_rotating_handler = CachingRotatingFileHandler(
+ log_path,
+ maxBytes=max_bytes,
+ backupCount=log_retention
+ )
+ safe_rotating_file_handler = SafeRotatingFileHandler(
+ filename=log_path,
+ maxBytes=max_bytes,
+ backupCount=log_retention
+ )
+ formatter = jsonlogger.JsonFormatter()
+ cache_rotating_handler.setFormatter(formatter)
+ safe_rotating_file_handler.setFormatter(formatter)
+
+ handlers = [cache_rotating_handler, safe_rotating_file_handler]
+
+ listener_process, listener_instance = None, None
+
+ if multiprocess:
+ log_queue = Queue()
+ queue_handler = QueueHandler(log_queue)
+ handlers = [queue_handler]
+ listener_instance = QueueListener(log_queue, [cache_rotating_handler, safe_rotating_file_handler])
+ listener_process = Process(target=listener_instance.start)
+ listener_process.start()
+
+ logging.basicConfig(
+ level=logging_level,
+ handlers=handlers
+ )
+
+ structlog.configure(
+ processors=[
+ structlog.processors.TimeStamper(fmt="iso", utc=True),
+ structlog.processors.StackInfoRenderer(),
+ structlog.processors.format_exc_info,
+ structlog.processors.JSONRenderer()
+ ],
+ context_class=dict,
+ logger_factory=structlog.stdlib.LoggerFactory(),
+ wrapper_class=structlog.stdlib.BoundLogger,
+ cache_logger_on_first_use=True,
+ )
+
+ cache_rotating_handler.flush_cache()
+
+ return listener_instance, listener_process
+
+# Example usage
+if __name__ == "__main__":
+ listener_instance, listener_process = configure_logging(
+ logging_directory='log',
+ log_name='log',
+ logging_level='INFO',
+ log_rotation=100, # Size in MB
+ log_retention=10,
+ multiprocess=True
+ )
+
+ logger = structlog.get_logger()
+ logger.info("Logging configured with SafeRotatingFileHandler")
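+
+    # Hedged sketch of a clean shutdown when multiprocess=True; both objects
+    # come from configure_logging above:
+    # listener_instance.stop()
+    # listener_process.join(timeout=5)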
diff --git a/unreleased/struct_log_config.py b/unreleased/struct_log_config.py
new file mode 100644
index 00000000..446e17ca
--- /dev/null
+++ b/unreleased/struct_log_config.py
@@ -0,0 +1,133 @@
+# -*- coding: utf-8 -*-
+"""
+Simplification and standardization of structlog configuration for Python
+applications. This module provides a basic configuration for structlog, a
+logging library that supports structured logging. The `configure_structlog`
+function sets up stdlib logging with a zipping, time- and size-based rotating
+file sink and wires structlog to it, allowing the logging level and log file
+name to be customized. Use `structlog.get_logger()` to retrieve the
+configured logger.
+
+Author: Mike Ryan
+DateCreated: 2021/07/21
+DateUpdated: 2024/07/21
+
+License: MIT
+"""
+import logging
+import logging.handlers
+import os
+import time
+import zipfile
+from datetime import datetime, timezone
+from logging.handlers import BaseRotatingHandler
+
+import colorama
+import structlog
+
+# Initialize colorama for ANSI color codes
+colorama.init()
+
+class TimedRotatingFileHandlerWithZip(BaseRotatingHandler):
+ def __init__(self, filename, when='midnight', interval=1, backupCount=5, maxBytes=100*1024*1024, encoding=None, delay=False, utc=False):
+ self.when = when.upper()
+ self.backupCount = backupCount
+ self.maxBytes = maxBytes
+ self.utc = utc
+ self.interval = self.compute_interval(when, interval)
+ self.suffix = "%Y-%m-%d"
+ self.extMatch = r"^\d{4}-\d{2}-\d{2}$"
+ BaseRotatingHandler.__init__(self, filename, 'a', encoding, delay)
+ self.rolloverAt = self.compute_rollover(time.time())
+
+ def compute_interval(self, when, interval):
+ when_upper = when.upper()
+ if when_upper == 'MIDNIGHT':
+ return 86400 * interval # 86400 seconds in a day
+ else:
+ supported_intervals = ['MIDNIGHT'] # Extend this list as needed
+ raise ValueError(f"Invalid rollover interval specified: {when}. Supported intervals are: {', '.join(supported_intervals)}")
+
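+    # Possible extension (not implemented here): map 'H' to 3600 * interval to
+    # get hourly rotation, mirroring the stdlib TimedRotatingFileHandler codes.
+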
+ def compute_rollover(self, currentTime):
+ if self.when == 'MIDNIGHT':
+ if self.utc:
+ t = time.gmtime(currentTime)
+ else:
+ t = time.localtime(currentTime)
+ currentDay = time.mktime(t[:3] + (0, 0, 0) + t[6:9])
+ return currentDay + self.interval
+ else:
+ return currentTime + self.interval
+
+    def shouldRollover(self, record):
+        if self.stream is None:  # The stream may not be open yet when delay=True
+            self.stream = self._open()
+        if self.stream.tell() + len(self.format(record)) >= self.maxBytes:
+            return 1
+        currentTime = int(time.time())
+        if currentTime >= self.rolloverAt:
+            return 1
+        return 0
+
+    def doRollover(self):
+        """Zip the live log to a dated archive, prune archives beyond
+        backupCount, and schedule the next rollover. Date-stamped names make
+        the cascading renames of RotatingFileHandler unnecessary."""
+        self.stream.close()
+        if self.backupCount > 0:
+            dfn = self.rotation_filename("%s.%s.zip" % (self.baseFilename, self.timeSuffix(0)))
+            if os.path.exists(dfn):
+                os.remove(dfn)
+            self.rotate(self.baseFilename, dfn)
+            # Drop the oldest zipped archives once backupCount is exceeded.
+            log_dir = os.path.dirname(self.baseFilename) or "."
+            prefix = os.path.basename(self.baseFilename) + "."
+            archives = sorted(
+                f for f in os.listdir(log_dir)
+                if f.startswith(prefix) and f.endswith(".zip")
+            )
+            for old in archives[:-self.backupCount]:
+                os.remove(os.path.join(log_dir, old))
+        if not self.delay:
+            self.stream = self._open()
+        currentTime = int(time.time())
+        newRolloverAt = self.compute_rollover(currentTime)
+        while newRolloverAt <= currentTime:
+            newRolloverAt += self.interval
+        self.rolloverAt = newRolloverAt
+
+ def rotate(self, source, dest):
+ with zipfile.ZipFile(dest, 'w', zipfile.ZIP_DEFLATED) as zf:
+ zf.write(source, os.path.basename(source))
+ os.remove(source)
+
+ def timeSuffix(self, idx):
+ timeTuple = time.localtime(time.time() - (self.interval * idx))
+ return time.strftime(self.suffix, timeTuple)
+
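+# Rollover naming sketch (dates illustrative): with suffix "%Y-%m-%d", app.log
+# is archived to app.log.2024-07-21.zip at midnight or when maxBytes is hit,
+# and only the newest backupCount archives are kept.
+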
+def custom_log_processor(logger, method_name, event_dict):
+    timestamp = datetime.now(timezone.utc).strftime('%Y-%m-%d %H:%M:%S.%f')[:-3]
+ level = event_dict.get('level', '').upper()
+    name = event_dict.get('logger', '')  # add_logger_name stores the name under 'logger'
+ function = event_dict.get('function', '')
+ line = event_dict.get('line', '')
+ message = event_dict.get('event', '')
+ formatted_message = (
+ f"{colorama.Fore.GREEN}{timestamp}{colorama.Style.RESET_ALL} | "
+ f"{level: <8} | "
+ f"{colorama.Fore.CYAN}{name}{colorama.Style.RESET_ALL}:"
+ f"{colorama.Fore.CYAN}{function}{colorama.Style.RESET_ALL}:"
+ f"{colorama.Fore.CYAN}{line}{colorama.Style.RESET_ALL} - "
+ f"{message}"
+ )
+ return formatted_message
+
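+# Example of a rendered line (ANSI colors omitted; values illustrative):
+#
+#     2024-07-21 12:00:00.123 | INFO     | __main__:main:75 - This is a test log message
+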
+def configure_structlog(log_filename: str = '~/log/app.log', level: str = 'INFO'):
+    # Expand '~' and make sure the log directory exists before the handler opens the file.
+    log_filename = os.path.expanduser(log_filename)
+    os.makedirs(os.path.dirname(log_filename) or '.', exist_ok=True)
+    handler = TimedRotatingFileHandlerWithZip(log_filename, when="midnight", interval=1, backupCount=5)
+    logging.basicConfig(handlers=[handler], level=level.upper())
+ structlog.configure(
+ processors=[
+ structlog.stdlib.add_log_level,
+ structlog.stdlib.add_logger_name,
+ structlog.processors.StackInfoRenderer(),
+ structlog.processors.format_exc_info,
+ custom_log_processor, # Use the custom processor
+ ],
+ context_class=dict,
+ logger_factory=structlog.stdlib.LoggerFactory(),
+ wrapper_class=structlog.stdlib.BoundLogger,
+ cache_logger_on_first_use=True,
+ )
+
+# Example usage
+if __name__ == "__main__":
+    configure_structlog(log_filename='/workspaces/devsetgo_lib/log/app.log', level='DEBUG')
+ logger = structlog.get_logger(__name__)
+ logger.info("This is a test log message", function="main", line=75)
+ logger.debug("This is a debug message", function="main", line=76)
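+
+    # With the settings above (paths illustrative), entries land in
+    # /workspaces/devsetgo_lib/log/app.log and rotated days are zipped
+    # alongside it, e.g. app.log.2024-07-21.zip.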