{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T02:56:14Z","timestamp":1773802574059,"version":"3.50.1"},"reference-count":0,"publisher":"Association for the Advancement of Artificial Intelligence (AAAI)","issue":"19","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["AAAI"],"abstract":"<jats:p>Attribute-specific fashion retrieval aims to enhance fine-grained image retrieval by emphasizing the similarity of specific attributes. Current methods primarily rely on attention mechanisms to extract attribute-related visual features but face two key challenges: the limitations of coarse-grained localization in achieving fine-grained accuracy, and an imbalance between global and local perception, where excessive focus on local features can undermine overall performance. To address these issues, we propose the fashion microscope ProFashion, which achieves pixel-level attribute awareness through optimal transport and neural semantic aggregation. The framework begins by employing optimal transport to align semantic attributes with visual patterns from a global perspective, generating an attribute-visual value map that highlights distinctive regions while reducing interference. This is followed by simulating the human brain's perception of attribute feature patterns through superpixel generation and aggregation, capturing attribute-related features at the pixel semantic level and forming key semantic clusters that preserve microstructures. Building on this, an attribute graph is constructed to facilitate feature clustering, significantly enhancing the framework's capability to handle overlapping features and cross-scale relationships. Comprehensive experiments on the FashionAI, DeepFashion, and DARN datasets demonstrate the framework's effectiveness, achieving overall MAP improvements of 3.11%, 3.70%, and 3.49%, respectively. Additionally, the framework delivers relative average throughput gains of 26.94%, 22.22%, and 24.78% on the FashionAI, DeepFashion, and DARN datasets, respectively.<\/jats:p>","DOI":"10.1609\/aaai.v40i19.38672","type":"journal-article","created":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T00:46:43Z","timestamp":1773794803000},"page":"16343-16351","source":"Crossref","is-referenced-by-count":0,"title":["Fashion Microscope: Pixel-Level Attribute Perception via Optimal Transport and Neural Semantic Aggregation"],"prefix":"10.1609","volume":"40","author":[{"given":"Shuili","family":"Zhang","sequence":"first","affiliation":[]},{"given":"Hongzhang","family":"Mu","sequence":"additional","affiliation":[]},{"given":"Jiawei","family":"Sheng","sequence":"additional","affiliation":[]},{"given":"Qianqian","family":"Tong","sequence":"additional","affiliation":[]},{"given":"Wenyuan","family":"Zhang","sequence":"additional","affiliation":[]},{"given":"Quangang","family":"Li","sequence":"additional","affiliation":[]},{"given":"Tingwen","family":"Liu","sequence":"additional","affiliation":[]}],"member":"9382","published-online":{"date-parts":[[2026,3,14]]},"container-title":["Proceedings of the AAAI Conference on Artificial Intelligence"],"original-title":[],"link":[{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/38672\/42634","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/38672\/42634","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T00:46:43Z","timestamp":1773794803000},"score":1,"resource":{"primary":{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/view\/38672"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2026,3,14]]},"references-count":0,"journal-issue":{"issue":"19","published-online":{"date-parts":[[2026,3,17]]}},"URL":"https:\/\/doi.org\/10.1609\/aaai.v40i19.38672","relation":{},"ISSN":["2374-3468","2159-5399"],"issn-type":[{"value":"2374-3468","type":"electronic"},{"value":"2159-5399","type":"print"}],"subject":[],"published":{"date-parts":[[2026,3,14]]}}}