ADALog: Adaptive Unsupervised Anomaly detection in Logs with Self-attention Masked Language Model

Abstract

Modern software systems generate extensive heterogeneous log data with dynamic formats, fragmented event sequences, and varying temporal patterns, making anomaly detection both crucial and challenging. To address these complexities, we propose ADALog, an adaptive, unsupervised anomaly detection framework designed for practical applicability across diverse real-world environments. Unlike traditional methods reliant on log parsing, strict sequence dependencies, or labeled data, ADALog operates on individual unstructured logs, extracts intra-log contextual relationships, and performs adaptive thresholding on normal data. The proposed approach utilizes a transformer-based, pretrained bidirectional encoder with a masked language modeling task, fine-tuned on normal logs to capture domain-specific syntactic and semantic patterns essential for accurate anomaly detection. Anomalies are identified via token-level reconstruction probabilities, aggregated into log-level scores, with adaptive percentile-based thresholding calibrated only on normal data. This allows the model to dynamically adapt to evolving system behaviors while avoiding rigid, heuristic-based thresholds common in traditional systems. We evaluate ADALog on benchmark datasets BGL, Thunderbird, and Spirit, showing strong generalization and competitive performance compared to state-of-the-art supervised and unsupervised methods. Additional ablation studies examine the effects of masking, fine-tuning, and token positioning on model behavior and interpretability.
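The scoring pipeline described above — aggregating token-level reconstruction probabilities into a log-level anomaly score and calibrating a percentile threshold on normal data only — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the mean negative log-probability aggregation and the specific percentile rule are assumptions for exposition, and obtaining the token probabilities from the fine-tuned masked language model is left out.

```python
import math

def log_anomaly_score(token_probs):
    """Aggregate token-level reconstruction probabilities into a log-level
    score. Here: mean negative log-probability (hypothetical aggregation);
    higher scores indicate lower model confidence, i.e. more anomalous logs."""
    return sum(-math.log(p) for p in token_probs) / len(token_probs)

def fit_threshold(normal_scores, percentile=95.0):
    """Adaptive threshold calibrated only on scores of normal logs,
    using a simple nearest-rank percentile (hypothetical calibration rule)."""
    ordered = sorted(normal_scores)
    rank = math.ceil(percentile / 100.0 * len(ordered)) - 1
    return ordered[max(0, min(len(ordered) - 1, rank))]

def is_anomalous(token_probs, threshold):
    """Flag a log as anomalous if its aggregated score exceeds the threshold."""
    return log_anomaly_score(token_probs) > threshold

# Usage: calibrate on normal logs, then score an unseen log.
normal_logs = [[0.9, 0.8, 0.95], [0.85, 0.9, 0.9]]   # token probabilities per log
threshold = fit_threshold([log_anomaly_score(p) for p in normal_logs])
print(is_anomalous([0.1, 0.05, 0.2], threshold))      # low-probability tokens
```

Because the threshold is a percentile of scores observed on normal data, recalibrating on fresh normal logs lets the detector track evolving system behavior without a fixed heuristic cutoff.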

@article{pospieszny2025_2505.13496,
  title={ADALog: Adaptive Unsupervised Anomaly detection in Logs with Self-attention Masked Language Model},
  author={Przemek Pospieszny and Wojciech Mormul and Karolina Szyndler and Sanjeev Kumar},
  journal={arXiv preprint arXiv:2505.13496},
  year={2025}
}