
[2308.15197] Where Would I Go Next? Large Language Models as Human Mobility Predictors



Authors: Xinglei Wang and 3 other authors


Abstract: Accurate human mobility prediction underpins many important applications across a variety of domains, including epidemic modelling, transport planning, and emergency responses. Due to the sparsity of mobility data and the stochastic nature of people’s daily activities, achieving precise predictions of people’s locations remains a challenge. While recently developed large language models (LLMs) have demonstrated superior performance across numerous language-related tasks, their applicability to human mobility studies remains unexplored. Addressing this gap, this article delves into the potential of LLMs for human mobility prediction tasks. We introduce a novel method, LLM-Mob, which leverages the language understanding and reasoning capabilities of LLMs for analysing human mobility data. We present concepts of historical stays and context stays to capture both long-term and short-term dependencies in human movement and enable time-aware prediction by using time information of the prediction target. Additionally, we design context-inclusive prompts that enable LLMs to generate more accurate predictions. Comprehensive evaluations of our method reveal that LLM-Mob excels in providing accurate and interpretable predictions, highlighting the untapped potential of LLMs in advancing human mobility prediction techniques. We posit that our research marks a significant paradigm shift in human mobility modelling, transitioning from building complex domain-specific models to harnessing general-purpose LLMs that yield accurate predictions through language instructions. The code for this work is available at this https URL.
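To make the abstract's idea of "historical stays", "context stays", and "context-inclusive prompts" more concrete, the sketch below shows how such a prompt might be assembled before being sent to an LLM. This is a minimal illustration only, not the authors' released implementation (see the linked repository); the data layout, field names, and prompt wording here are assumptions.

```python
# Illustrative sketch: build a context-inclusive prompt from hypothetical
# "historical stays" (long-term patterns) and "context stays" (recent moves),
# plus the time of the prediction target, in the spirit of LLM-Mob.
# The resulting string would be passed to an LLM via a chat-completion API;
# the authors' actual prompt design may differ.

from dataclasses import dataclass
from typing import List


@dataclass
class Stay:
    """One visit to a place: day of week, start time, duration (minutes), place id."""
    day: str
    start: str
    duration_min: int
    place_id: int


def build_prompt(historical: List[Stay], context: List[Stay],
                 target_day: str, target_time: str, top_k: int = 10) -> str:
    """Assemble a prompt capturing long-term and short-term dependencies
    and the time information of the prediction target."""
    def fmt(stays: List[Stay]) -> str:
        return "\n".join(
            f"({s.day}, {s.start}, {s.duration_min} min, place {s.place_id})"
            for s in stays
        )

    return (
        "You are asked to predict a user's next location.\n\n"
        f"Historical stays (long-term patterns):\n{fmt(historical)}\n\n"
        f"Context stays (most recent movements):\n{fmt(context)}\n\n"
        f"The next stay starts on {target_day} at {target_time}.\n"
        f"Return the {top_k} most likely place ids, ranked, "
        "and briefly explain your reasoning."
    )


if __name__ == "__main__":
    history = [Stay("Monday", "08:30", 540, 1), Stay("Monday", "18:45", 90, 7)]
    recent = [Stay("Tuesday", "08:35", 530, 1)]
    print(build_prompt(history, recent, target_day="Tuesday", target_time="18:40"))
```

Asking the model for a ranked list plus a brief rationale mirrors the paper's emphasis on predictions that are both accurate and interpretable.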

Submission history

From: Xinglei Wang
[v1] Tue, 29 Aug 2023 10:24:23 UTC (2,472 KB)
[v2] Tue, 9 Jan 2024 14:08:03 UTC (2,177 KB)



