Large Language Models for Continual Relation Extraction
Real-world data streams, such as news articles and social media posts, are dynamic and nonstationary, creating challenges for real-time structured representation via knowledge graphs, where relation extraction is a key component. Continual relation extraction (CRE) addresses this setting by incrementally learning new relations while preserving previously acquired knowledge. This work investigates the use of pretrained language models for CRE, focusing on large language models (LLMs) and the effectiveness of memory replay in mitigating forgetting. We evaluate decoder-only models […]
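The memory-replay idea mentioned above can be illustrated with a minimal sketch: an episodic memory stores a few examples per previously seen relation, and each training batch for a new task is mixed with replayed examples so the model keeps seeing old relations. The class and function names here are hypothetical, not from the paper.

```python
import random

class ReplayBuffer:
    """Fixed-size episodic memory keyed by relation label (illustrative sketch)."""

    def __init__(self, capacity_per_relation=5):
        self.capacity = capacity_per_relation
        self.memory = {}  # relation label -> list of stored examples

    def add(self, example, relation):
        # Keep at most `capacity` examples for each relation seen so far.
        bucket = self.memory.setdefault(relation, [])
        if len(bucket) < self.capacity:
            bucket.append(example)

    def sample(self, k):
        # Draw up to k examples uniformly from the whole memory.
        pool = [ex for bucket in self.memory.values() for ex in bucket]
        return random.sample(pool, min(k, len(pool)))

def make_training_batch(new_task_examples, buffer, replay_k=4):
    """Mix current-task examples with replayed memory to mitigate forgetting."""
    return list(new_task_examples) + buffer.sample(replay_k)
```

In a CRE pipeline, the buffer would be populated after each task and the mixed batches fed to the language model's fine-tuning step; the sampling and storage policies (uniform here) are common simple baselines.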