{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T02:47:56Z","timestamp":1773802076580,"version":"3.50.1"},"reference-count":0,"publisher":"Association for the Advancement of Artificial Intelligence (AAAI)","issue":"14","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["AAAI"],"abstract":"<jats:p>Remote sensing imagery presents vast, inherently unstructured spatial data, necessitating sophisticated reasoning to interpret complex user intents and contextual relationships beyond simple recognition tasks. In this paper, we aim to construct an Earth observation workflow that handles complex queries by reasoning about spatial context and user intent. As a reasoning workflow, it should autonomously explore and construct its own inference paths, rather than being confined to predefined ground\u2011truth sequences.\nIdeally, its architecture ought to be unified yet generalized, capable of performing diverse reasoning tasks with one model and without additional fine-tuning.\nExisting remote sensing approaches rely on supervised fine-tuning paradigms and task\u2011specific heads, limiting both autonomous reasoning and unified generalization.\nTo this end, we propose RemoteReasoner, a unified workflow for geospatial reasoning. The design of RemoteReasoner integrates a multi-modal large language model (MLLM) for interpreting user instructions and localizing targets, together with task transformation strategies that enable tasks at multiple granularities, including the object, region, and pixel levels.\nIn contrast to existing methods, our framework is trained with reinforcement learning (RL) to endow the MLLM with sufficient reasoning autonomy.\nAt the inference stage, our transformation strategies enable diverse task output formats without requiring task-specific decoders or further fine-tuning.\nExperiments demonstrate that RemoteReasoner achieves state-of-the-art performance across multi-granularity reasoning tasks. Furthermore, it retains the MLLM's inherent generalization capability, demonstrating robust performance on unseen tasks and categories.<\/jats:p>","DOI":"10.1609\/aaai.v40i14.38175","type":"journal-article","created":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T00:14:23Z","timestamp":1773792863000},"page":"11883-11891","source":"Crossref","is-referenced-by-count":0,"title":["RemoteReasoner: Towards Unifying Geospatial Reasoning Workflow"],"prefix":"10.1609","volume":"40","author":[{"given":"Liang","family":"Yao","sequence":"first","affiliation":[]},{"given":"Fan","family":"Liu","sequence":"additional","affiliation":[]},{"given":"Hongbo","family":"Lu","sequence":"additional","affiliation":[]},{"given":"Chuanyi","family":"Zhang","sequence":"additional","affiliation":[]},{"given":"Rui","family":"Min","sequence":"additional","affiliation":[]},{"given":"Shengxiang","family":"Xu","sequence":"additional","affiliation":[]},{"given":"Shimin","family":"Di","sequence":"additional","affiliation":[]},{"given":"Pai","family":"Peng","sequence":"additional","affiliation":[]}],"member":"9382","published-online":{"date-parts":[[2026,3,14]]},"container-title":["Proceedings of the AAAI Conference on Artificial Intelligence"],"original-title":[],"link":[{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/38175\/42137","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/38175\/42137","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T00:14:23Z","timestamp":1773792863000},"score":1,"resource":{"primary":{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/view\/38175"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2026,3,14]]},"references-count":0,"journal-issue":{"issue":"14","published-online":{"date-parts":[[2026,3,17]]}},"URL":"https:\/\/doi.org\/10.1609\/aaai.v40i14.38175","relation":{},"ISSN":["2374-3468","2159-5399"],"issn-type":[{"value":"2374-3468","type":"electronic"},{"value":"2159-5399","type":"print"}],"subject":[],"published":{"date-parts":[[2026,3,14]]}}}