Tomorrow brings greater knowledge: large language models join dynamic temporal knowledge graphs

Giannini, Francesco;
2025

Abstract

Large Language Models (LLMs) have demonstrated exceptional capabilities in understanding and generating human-like text. In this paper, we leverage these capabilities in the scope of lifelong learning agents. Instead of relying on fine-tuning procedures, we exploit Temporal Knowledge Graphs (TKGs) to continually store and update fresh information. In particular, we introduce a novel in-context learning approach called Continual In-context Knowledge LLM (CIK-LLM), capable of bridging an LLM with a dynamically changing TKG. The graph is updated whenever new knowledge becomes available, while the LLM is instructed to find the relational paths most relevant to the input instruction, with the goal of identifying smaller subgraphs of evidence. We propose to encode these subgraphs in a compressed, prompt-friendly manner, efficiently bridging LLMs and TKGs. The LLM then provides an answer conditioned on the knowledge in the graph, exploiting its skills to support the reasoning process. We evaluate our approach on a TKG Question Answering benchmark that includes questions about events that happened at different times. The same questions are asked of models equipped with obsolete or incomplete information and of models including progressively more up-to-date knowledge. CIK-LLM outperforms pre-trained LLMs, being able to immediately adapt to newly accumulated knowledge, and it reaches performance not far from that of a state-of-the-art model trained not only with LLMs but also on large datasets of questions and answers. Furthermore, our model represents a valuable "forgetting-free" approach to quickly adapt an LLM to novel domains without any fine-tuning, QA datasets, or incremental learning procedures.
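
To make the pipeline described above concrete, the following minimal Python sketch illustrates the general idea: temporal facts are stored as (subject, relation, object, timestamp) quadruples, an evidence subgraph is selected for a question, encoded in a compact prompt-friendly form, and the resulting prompt conditions an LLM's answer. This is not the authors' CIK-LLM implementation: all names (tkg, add_fact, relevant_facts, encode_subgraph, build_prompt) are hypothetical, and the naive entity-matching filter stands in for the paper's LLM-guided relational-path search.

```python
from datetime import date

# Toy temporal knowledge graph: (subject, relation, object, timestamp)
# quadruples. Purely illustrative data.
tkg = [
    ("Alice", "works_for", "AcmeCorp", date(2022, 3, 1)),
    ("Alice", "works_for", "BetaInc", date(2024, 6, 15)),
    ("BetaInc", "headquartered_in", "Pisa", date(2019, 1, 1)),
]

def add_fact(graph, fact):
    """Continually update the graph as new knowledge becomes available,
    instead of fine-tuning the model."""
    graph.append(fact)

def relevant_facts(graph, entities):
    """Naive relevance filter: keep facts touching any mentioned entity.
    The paper instead instructs the LLM to find relevant relational paths."""
    return [f for f in graph if f[0] in entities or f[2] in entities]

def encode_subgraph(facts):
    """Encode the evidence subgraph as one compact, prompt-friendly line
    per fact, most recent first, so the LLM can condition on it."""
    facts = sorted(facts, key=lambda f: f[3], reverse=True)
    return "\n".join(f"[{f[3].isoformat()}] {f[0]} --{f[1]}--> {f[2]}"
                     for f in facts)

def build_prompt(question, context):
    """Condition the answer on the dated graph evidence."""
    return ("Answer using only the dated facts below; prefer the newest.\n"
            f"{context}\n\nQuestion: {question}\nAnswer:")

# New knowledge arrives: update the graph, not the model weights.
add_fact(tkg, ("Alice", "lives_in", "Pisa", date(2024, 7, 1)))
context = encode_subgraph(relevant_facts(tkg, {"Alice"}))
print(build_prompt("Who does Alice work for?", context))
```

Because the evidence is timestamped and sorted, the same question yields different answers before and after new facts are added, which mirrors the benchmark setup in which models hold obsolete versus progressively more up-to-date knowledge.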
Sector IINF-05/A - Information processing systems
Sector INFO-01/A - Computer science
3rd Conference on Lifelong Learning Agents, CoLLAs 2024
Pisa
29 July - 1 August 2024
Conference on Lifelong Learning Agents, 29 July - 1 August 2024, University of Pisa, Pisa, Italy
ML Research Press
Adversarial machine learning; Contrastive Learning
Foundations of Trustworthy AI - Integrating Reasoning, Learning and Optimization
TAILOR
European Commission
Horizon 2020 Framework Programme
952215
Files in this record:
File: CoLLAs - LLM-KG.pdf
Access: open access
Type: Published version
License: Not specified
Size: 3.67 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11384/152463
Citations
  • PMC: ND
  • Scopus: 0
  • Web of Science: ND
  • OpenAlex: ND