
GLaM: Fine-Tuning Large Language Models for Domain Knowledge Graph Alignment via Neighborhood Partitioning and Generative Subgraph Encoding



By Stefan Dernbach and 4 other authors


Abstract: Integrating large language models (LLMs) with knowledge graphs derived from domain-specific data represents an important advancement towards more powerful and factual reasoning. As these models grow more capable, it is crucial to enable them to perform multi-step inferences over real-world knowledge graphs while minimizing hallucination. While large language models excel at conversation and text generation, their ability to reason over domain-specialized graphs of interconnected entities remains limited. For example, can we query an LLM to identify the optimal contact in a professional network for a specific goal, based on relationships and attributes in a private database? The answer is no; such capabilities lie beyond current methods. However, this question underscores a critical technical gap that must be addressed. Many high-value applications in areas such as science, security, and e-commerce rely on proprietary knowledge graphs encoding unique structures, relationships, and logical constraints. We introduce a fine-tuning framework for developing Graph-aligned LAnguage Models (GLaM) that transforms a knowledge graph into an alternate text representation with labeled question-answer pairs. We demonstrate that grounding the models in specific graph-based knowledge expands the models' capacity for structure-based reasoning. Our methodology leverages the large language model's generative capabilities to create the dataset and proposes an efficient alternative to retrieval-augmented-generation-style methods.
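The abstract describes a pipeline that partitions an entity's graph neighborhood into subgraphs, encodes each subgraph as text, and produces labeled question-answer pairs for fine-tuning. The sketch below illustrates that general idea under stated assumptions; the function names, partitioning strategy, and prompt format are illustrative guesses, not the authors' implementation or the paper's actual encoding scheme.

```python
# Illustrative sketch only: turning knowledge-graph neighborhoods into
# question-answer fine-tuning examples, in the spirit of the abstract.
# Names (partition_neighborhood, encode_subgraph, make_qa_pairs) and the
# prompt format are assumptions, not the GLaM authors' code.

from typing import Dict, List, Tuple

Triple = Tuple[str, str, str]  # (head entity, relation, tail entity)


def partition_neighborhood(triples: List[Triple], center: str,
                           max_size: int = 5) -> List[List[Triple]]:
    """Split the triples incident to `center` into small subgraphs."""
    incident = [t for t in triples if center in (t[0], t[2])]
    return [incident[i:i + max_size] for i in range(0, len(incident), max_size)]


def encode_subgraph(subgraph: List[Triple]) -> str:
    """Render a subgraph as plain text a language model can consume."""
    return " ".join(f"{h} {r.replace('_', ' ')} {t}." for h, r, t in subgraph)


def make_qa_pairs(subgraph: List[Triple]) -> List[Dict[str, str]]:
    """Create simple labeled question-answer pairs from each triple."""
    context = encode_subgraph(subgraph)
    return [
        {
            "prompt": f"{context}\nQuestion: What is the {r.replace('_', ' ')} of {h}?",
            "answer": t,
        }
        for h, r, t in subgraph
    ]


if __name__ == "__main__":
    # Toy knowledge graph standing in for a proprietary professional network.
    kg = [
        ("Alice", "works_at", "AcmeLabs"),
        ("Alice", "collaborates_with", "Bob"),
        ("Bob", "expert_in", "graph_mining"),
    ]
    dataset = []
    for part in partition_neighborhood(kg, "Alice"):
        dataset.extend(make_qa_pairs(part))
    for example in dataset:
        print(example["prompt"], "->", example["answer"])
```

The resulting prompt/answer records could then be fed to any standard supervised fine-tuning loop; in the paper, the generative step that produces the dataset is carried out by the LLM itself rather than by templates like these.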

Submission history

From: Stefan Dernbach
[v1] Fri, 9 Feb 2024 19:53:29 UTC (1,030 KB)
[v2] Fri, 16 Feb 2024 17:23:56 UTC (543 KB)



