LLM‑Based Compliance Assessment with LexAlign: Telenor’s Anomaly‑Detection Pipeline

The goal of the thesis is to (1) explore how Large Language Models (LLMs) and LexAlign can be used to assess GDPR and EU AI Act compliance for Telenor’s anomaly‑detection pipeline; and (2) extend LexAlign to support company‑specific or sector‑specific compliance rules for the telecom sector.


Research Topic Focus

Telecommunication operators increasingly rely on machine‑learning pipelines to detect anomalies in network telemetry data. These pipelines typically span multiple stakeholders — such as Telenor Denmark (data and problem owner), Telenor R&I (analytics development and research), and external academic collaborators — creating a complex compliance landscape involving internal governance, cross‑border data handling, contractual constraints, and documentation requirements.

LexAlign, developed in the DataPACT project, is a multi‑agent LLM‑based compliance engine that transforms regulatory text into executable decision flows and performs automated compliance assessments through multiple collaborating agents. It supports compliance assessment for a wide range of data‑ and AI‑enabled systems, including data pipelines, workflows, decision‑support tools, and business or operational processes.

Figure 1. Overview of the LexAlign workflow, showing how regulatory text is transformed into executable decision flows and compliance assessments through a multi‑agent LLM architecture.
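To make the notion of an “executable decision flow” concrete, the sketch below shows one way a yes/no question chain derived from regulatory text could be represented and evaluated against collected facts. This is a minimal illustrative sketch only; the node structure, field names, and the GDPR question fragment are assumptions, not LexAlign’s actual internal format (the repository is private).

```python
from dataclasses import dataclass

@dataclass
class DecisionNode:
    """One yes/no question derived from regulatory text."""
    question: str
    fact_key: str                       # which collected fact answers the question
    on_yes: "DecisionNode | str" = "compliant"
    on_no: "DecisionNode | str" = "non-compliant"

def assess(node, facts):
    """Walk the decision flow until a verdict (a string leaf) is reached."""
    while isinstance(node, DecisionNode):
        node = node.on_yes if facts.get(node.fact_key) else node.on_no
    return node

# Hypothetical fragment of a GDPR lawfulness flow: lawful basis, then purpose limitation.
flow = DecisionNode(
    question="Is there a documented lawful basis for processing the telemetry?",
    fact_key="lawful_basis_documented",
    on_yes=DecisionNode(
        question="Is the telemetry used only for the stated purpose?",
        fact_key="purpose_limited",
    ),
)

facts = {"lawful_basis_documented": True, "purpose_limited": True}
print(assess(flow, facts))  # compliant
```

In a multi‑agent setting, one agent could derive such flows from regulatory text while another gathers the facts and walks the flow; the sketch only illustrates the data structure the agents would share.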

Objective and Research Questions

The Master’s thesis has two primary objectives:

  1. Evaluate LexAlign on Telenor’s anomaly‑detection pipeline, using existing GDPR and AI Act decision flow models from DataPACT.

  2. Prototype sector‑specific or company‑specific compliance rules (e.g., telecommunication‑sector obligations, internal governance requirements, or contractual terms) and extend LexAlign to support these.

Based on these objectives, the thesis will explore research questions such as:

  • How effectively can LexAlign assess GDPR and AI Act compliance for the anomaly‑detection pipeline at Telenor?

  • What types of facts (inputs, documentation, logs, system descriptions) are needed for LLM‑based compliance assessment?

  • How can Telenor‑specific governance rules or sector‑specific obligations be modelled as executable decision flows?

  • How should LexAlign be extended to support telecommunications‑sector regulatory needs (e.g., security obligations, operational documentation requirements)?

  • What are the strengths and limitations of using a multi‑agent LLM architecture for compliance assessments?
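The questions about required facts and about modelling company‑ or sector‑specific obligations can be illustrated with a small self‑contained sketch: a rule registry mapping named obligations to predicates over a facts dictionary describing the pipeline. All rule names, fact fields, and thresholds below are illustrative assumptions, not actual Telenor governance requirements or telecom‑sector obligations.

```python
# Each rule maps a named obligation to a predicate over collected facts.
# All names, fields, and thresholds are illustrative assumptions only.
RULES = {
    "GDPR Art. 32 (security of processing)":
        lambda f: f["telemetry_encrypted_at_rest"],
    "Telecom sector: incident reporting within 24h":
        lambda f: f["incident_reporting_hours"] <= 24,
    "Internal governance: cross-border transfer approved":
        lambda f: f["transfer_approved_by_dpo"],
}

def assess_rules(facts):
    """Return a per-rule verdict for a system described by a facts dict."""
    return {name: ("compliant" if rule(facts) else "non-compliant")
            for name, rule in RULES.items()}

# Hypothetical facts collected about the anomaly-detection pipeline.
pipeline_facts = {
    "telemetry_encrypted_at_rest": True,
    "incident_reporting_hours": 12,
    "transfer_approved_by_dpo": False,
}

for name, verdict in assess_rules(pipeline_facts).items():
    print(f"{name}: {verdict}")
```

The fact keys consulted by the rules are exactly the inputs the assessment needs, which suggests one concrete way to answer the “what types of facts are needed” question: enumerate the fields each rule reads.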

Expected Results and Learning Outcomes

Expected Results

  • Evaluation of LexAlign on a real Telenor case study: A documented assessment of how LexAlign performs GDPR and AI Act compliance checks for Telenor’s anomaly‑detection pipeline using existing decision flow models from DataPACT.

  • Prototype of sector‑specific / company‑specific compliance extensions: A proof‑of‑concept extension of LexAlign that models Telenor‑specific or telecommunication‑sector‑specific compliance obligations and integrates them into executable decision flows.

  • Thesis report and potential publication: A Master’s thesis report suitable for SINTEF and NorwAI dissemination, with the possibility of a joint research paper depending on the results.

Learning Outcomes

By completing this thesis, the student will:

  • Gain hands‑on experience with multi‑agent LLM systems (e.g., agent workflows using LangChain/LangGraph) and understand how they can be applied to regulatory reasoning.

  • Develop insight into GDPR, the EU AI Act, and governance challenges relevant to critical infrastructure operators.

  • Learn how to convert legal and organizational requirements into decision flow logic suitable for automated compliance assessment.

  • Strengthen skills in analysing and modelling real industrial systems involving data processing, AI/ML workflows, or cross‑organizational information exchange.

  • Build competencies in digital compliance, knowledge representation, and AI governance tools.

Qualifications

  • Background in AI/ML and Large Language Models (LLMs).

  • Knowledge of knowledge‑representation or rule‑based reasoning.

  • Interest in AI governance, GDPR, the AI Act, or digital compliance.

  • Proficiency in Python programming.

  • Curiosity about real industrial systems and data/AI workflows.

  • Experience with LangChain, LangGraph, or agent‑based LLM frameworks is helpful but not required.

Supervision

  • SINTEF Digital — main supervision for LexAlign, multi‑agent LLM workflows, and compliance modelling.

  • Telenor — provides case‑study support, such as system descriptions, process information, and access to sample data or documentation when needed.

  • University supervision — the thesis will also include an academic supervisor from the student’s university, with SINTEF collaborating closely throughout the project.

References

  • DataPACT project website: https://datapact.eu/

  • LexAlign repository (private, access provided during thesis): https://github.com/DATAPACT/LexAlign

  • EU Artificial Intelligence Act (AI Act), Regulation (EU) 2024/1689: https://eur-lex.europa.eu/eli/reg/2024/1689/oj/eng 

  • General Data Protection Regulation (GDPR), Regulation (EU) 2016/679: https://eur-lex.europa.eu/eli/reg/2016/679/oj/eng 

  • S. Malacarne, E. Hoel-Høiseth, E. Aune, D. Z. Biró, and M. Ruocco, “Context-Aware Graph Attention for Unsupervised Telco Anomaly Detection,” in Proc. ESANN 2026 (European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning), Bruges, Belgium, Apr. 2026, to appear.

  • S. Malacarne, E. Hoel-Høiseth, E. Aune, D. Z. Biró, and M. Ruocco, “Scalable Context-Aware Graph Attention for Unsupervised Anomaly Detection in Large-Scale Mobile Networks,” under review at IEEE TNSM, 2026.