easyfabric.fabric.logging_utils

Module-level imports and references: contextlib, contextvars, json, logging, random, re, sys, time, uuid, datetime, timezone, Path, Optional, notebookutils, Row, StringType, StructField, StructType, TimestampType, config, get_spark, spark

Module constants: MAX_MESSAGE_LEN, LOG_PATTERN, TIMESTAMP_PATTERN, GUID_PATTERN, EXCLUDED_LOGGERS, LOG_SCHEMA

is_top_level_notebook

def is_top_level_notebook() -> bool

Determines whether the current execution is the top-most (entry-point) notebook.

A notebook is top-level when it was not invoked via notebookutils.notebook.run from another notebook. Fabric sets isReferenceRun=True on every child invocation and leaves it False (or absent) on the entry-point notebook.
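
A minimal sketch of such a check, assuming notebookutils.runtime.context exposes the isReferenceRun flag as a dictionary entry (the exact lookup is an assumption, not the library's documented contract):

    import notebookutils

    def is_top_level_notebook() -> bool:
        # Child notebooks started via notebookutils.notebook.run() carry
        # isReferenceRun=True in the runtime context; the entry point does not.
        ctx = notebookutils.runtime.context  # assumed dict-like runtime properties
        return not bool(ctx.get("isReferenceRun", False))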

save_log_file_to_table

def save_log_file_to_table(end_log: bool = False) -> None

Reads a log file from OneLake, parses the structured entries (including multiline ones), and bulk-inserts them into Meta.dbo.logging via a Spark DataFrame.

When running in the top-level notebook or when end_log is True, logs an END entry with the run duration, clears the logging handlers, resets internal state, and persists the log entries to the table. Otherwise returns without emitting anything.
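
Hypothetical call sites at the end of a run (both variants shown for illustration):

    # In the entry-point notebook: flushes because this is the top level.
    save_log_file_to_table()

    # Force the END entry and flush even when not in the top-level notebook.
    save_log_file_to_table(end_log=True)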

save_historical_log_file_to_table

def save_historical_log_file_to_table(abfs_path: str) -> None

Parses a specific log file by ABFS path and inserts missing logs into Meta.dbo.logging.
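
Hypothetical re-ingestion of an older log file; the path below is illustrative and uses placeholders:

    save_historical_log_file_to_table(
        "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>/Files/logs/<run_id>.log"
    )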

FabricLoggerAdapter Objects

class FabricLoggerAdapter(logging.LoggerAdapter)

Adapter that automatically includes log_type and log_category in all log records.

__init__

def __init__(logger: logging.Logger,
             log_type: Optional[str] = "IN PROCESS",
             log_category: Optional[str] = "Technical")

process

def process(msg, kwargs)
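
A minimal sketch of how the adapter might merge its defaults into each record's extra dict; the merge strategy shown here (per-call values win) is an assumption:

    import logging

    class FabricLoggerAdapter(logging.LoggerAdapter):
        def __init__(self, logger, log_type="IN PROCESS", log_category="Technical"):
            super().__init__(logger, {"log_type": log_type, "log_category": log_category})

        def process(self, msg, kwargs):
            extra = kwargs.setdefault("extra", {})
            # Per-call extras take precedence over the adapter defaults.
            for key, value in self.extra.items():
                extra.setdefault(key, value)
            return msg, kwargs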

OneLakeFileHandler Objects

class OneLakeFileHandler(logging.Handler)

MAX_RETRIES

BASE_DELAY

__init__

def __init__(path)

emit

def emit(record)
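
The constant names suggest a retry loop with a growing delay. A sketch under that assumption (the real handler presumably writes via OneLake/notebookutils file APIs rather than plain open(), and the constant values are assumed):

    import logging
    import random
    import time

    class OneLakeFileHandler(logging.Handler):
        MAX_RETRIES = 3   # assumed value
        BASE_DELAY = 0.5  # assumed value, in seconds

        def __init__(self, path):
            super().__init__()
            self.path = path

        def emit(self, record):
            line = self.format(record) + "\n"
            for attempt in range(self.MAX_RETRIES):
                try:
                    with open(self.path, "a", encoding="utf-8") as fh:
                        fh.write(line)
                    return
                except OSError:
                    # Exponential backoff with a little jitter before retrying.
                    time.sleep(self.BASE_DELAY * (2 ** attempt) + random.random() * 0.1)
            self.handleError(record)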

SafeFormatter Objects

class SafeFormatter(logging.Formatter)

Formatter that guarantees the custom fields exist on every record, so formatting records emitted by third-party libraries does not raise KeyError.

format

def format(record)

formatTime

def formatTime(record, datefmt=None)

Include milliseconds in the timestamp for better sorting.
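
A sketch of both methods, assuming the custom fields are the log_type and log_category attributes used elsewhere in this module (the default values are assumptions):

    import logging
    from datetime import datetime

    class SafeFormatter(logging.Formatter):
        DEFAULTS = {"log_type": "IN PROCESS", "log_category": "Technical"}  # assumed fields

        def format(self, record):
            # Backfill missing custom fields so third-party records format cleanly.
            for field, default in self.DEFAULTS.items():
                if not hasattr(record, field):
                    setattr(record, field, default)
            return super().format(record)

        def formatTime(self, record, datefmt=None):
            # Append milliseconds so log lines sort stably within the same second.
            dt = datetime.fromtimestamp(record.created)
            base = dt.strftime(datefmt or "%Y-%m-%d %H:%M:%S")
            return f"{base}.{int(record.msecs):03d}"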

to_snake_case

def to_snake_case(string: str) -> str

Convert a string from camel case to snake case.
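
One common regex-based implementation; whether the module uses exactly this pattern is an assumption:

    import re

    def to_snake_case(string: str) -> str:
        # Insert an underscore before every capital letter (except a leading one), then lowercase.
        return re.sub(r"(?<!^)(?=[A-Z])", "_", string).lower()

    to_snake_case("LogCategory")  # -> "log_category"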

set_verbose_mode

def set_verbose_mode(enabled=True)

Enable or disable verbose logging mode globally.
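
A sketch of what a global verbose mode could amount to; tying it to the root logger level is an assumption:

    import logging

    def set_verbose_mode(enabled=True):
        # Assumption: verbose mode lowers the root logger threshold to DEBUG.
        logging.getLogger().setLevel(logging.DEBUG if enabled else logging.INFO)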

SegmentHandle Objects

class SegmentHandle()

Context manager returned by log_segment(). Accumulates a structured payload during the segment and emits it as |CTX:{...}| JSON on the END line. There are four outcome states: Success (default), Skipped, Unchanged, and Failed (an exception escaped the block).

__init__

def __init__(type: str, name: str)

record

def record(**kwargs) -> None

skip

def skip(reason: str) -> None

unchanged

def unchanged(reason: str) -> None

__enter__

def __enter__()

__exit__

def __exit__(exc_type, exc_val, exc_tb)

log_segment

def log_segment(type: str, name: str) -> SegmentHandle

Context manager to log the start and end of a logic segment.

Usage:

    with log_segment("Data Load", "Bronze Loading") as seg:
        seg.record(files=nr_of_files)
        if skipped_by_config:
            seg.skip(reason="loadskip_configured")
            return
        if nothing_to_do:
            seg.unchanged(reason="files_unchanged")
            return
        ... logic ...

current_segment

def current_segment() -> Optional[SegmentHandle]

Return the innermost active segment, or None if outside a segment.

segment_record

def segment_record(**kwargs) -> None

Record kwargs onto the current segment's payload. No-op if no segment is active.

segment_skip

def segment_skip(reason: str) -> None

Mark the current segment as Skipped. No-op if no segment is active.

segment_unchanged

def segment_unchanged(reason: str) -> None

Mark the current segment as Unchanged. No-op if no segment is active.
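
Hypothetical helper called from inside an active log_segment() block; the function and argument names are illustrative:

    def load_bronze_files(paths):
        if not paths:
            segment_skip(reason="no_input_files")
            return
        segment_record(files=len(paths))
        # ... actual load logic ...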

init_logging

def init_logging(log_source: str = "Sys", log_object: str = None) -> str

Call once at the very top of the entry-point notebook / wheel. Returns the absolute OneLake path of the log file.
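
A hypothetical entry-point flow (the log_object value is illustrative):

    import logging

    log_path = init_logging(log_source="Sys", log_object="bronze_daily_load")
    logger = logging.getLogger(__name__)
    logger.info("Run started")
    # ... notebook logic ...
    save_log_file_to_table()  # persists the buffered entries to Meta.dbo.logging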

get_log_file_path

def get_log_file_path() -> Optional[str]

Returns the path of the current log file. Checks singleton state, global config, and active handlers to ensure reliability even in nested notebooks or after module reloads.

extract_real_error

def extract_real_error(log_text: str) -> str

Extracts the most relevant error message from a Spark stack trace.
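
A minimal sketch of one way to do this, scanning the trace bottom-up for a root-cause line; the actual heuristics used by the module are assumptions:

    def extract_real_error(log_text: str) -> str:
        lines = [ln.strip() for ln in log_text.splitlines() if ln.strip()]
        for line in reversed(lines):
            if line.startswith("Caused by:") or "Error" in line or "Exception" in line:
                return line
        return lines[-1] if lines else log_text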