DTT: An Example-Driven Tabular Transformer for Joinability by Leveraging Large Language Models



by Arash Dargahi Nobari and Davood Rafiei

Abstract: Many organizations rely on data from government and third-party sources, and those sources rarely follow the same formatting conventions. This introduces challenges when integrating data from multiple sources or aligning external sources with internal databases. Commercial database systems offer little support for integrating data from heterogeneous sources, and manual integration is both time-consuming and inefficient. State-of-the-art data integration approaches that rely on similarity functions and textual transformations often fail on challenging cases where multiple mappings are required or the mappings go beyond simple textual transformations. In this paper, we study the potential of deep neural models for transforming tables for joinability. In particular, we cast the problem as a prediction task and develop a framework that leverages large deep-learning language models to transform tabular data from a source formatting into a desired target representation. Our framework efficiently learns the pattern that maps a source formatting to an expected target from just a few examples, and the learned mapping can then be used for tasks such as table joining, filling in missing values, and error detection. Compared to state-of-the-art mapping and joining approaches, our framework delivers noticeably more accurate and scalable performance on both real-world and synthetic datasets. Our experimental evaluation also shows that, with our fine-tuned model, the proposed framework performs on par with or better than large language models such as GPT-3, despite the significant difference in size, and that using large language models within our framework improves their performance.
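The core idea in the abstract, learning a source-to-target mapping from a handful of example pairs and applying it to the remaining rows, can be sketched as a prompt-construction step for a sequence-to-sequence language model. The function below is an illustrative assumption, not the authors' DTT implementation: the function name, the `->` delimiter, and the `</s>` separator are all hypothetical choices, and the model call itself is omitted.

```python
def build_transformation_prompt(examples, source_value, sep=" </s> "):
    """Serialize few-shot (source, target) pairs plus one query value
    into a single input string for a seq2seq language model.

    `examples` is a list of (source, target) tuples drawn from the two
    tables to be joined; `source_value` is the cell to transform.
    """
    parts = [f"{src} -> {tgt}" for src, tgt in examples]
    parts.append(f"{source_value} -> ")  # the model completes this slot
    return sep.join(parts)

# Example: mapping "First Last" names to "last, f." keys before a join.
examples = [("Jane Doe", "doe, j."), ("John Smith", "smith, j.")]
prompt = build_transformation_prompt(examples, "Alan Turing")
# A model that has learned the pattern would complete the prompt with
# a value in the target table's formatting, yielding a usable join key.
```

In this sketch the same prompt template serves all downstream tasks the abstract mentions: for joining, the query value is a key from the source table; for filling in missing values or detecting errors, it is the cell whose target-side counterpart is absent or suspect.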

Submission history

From: Arash Dargahi Nobari [view email]
[v1] Sun, 12 Mar 2023 20:51:26 UTC (304 KB)
[v2] Mon, 25 Dec 2023 05:31:28 UTC (285 KB)


