Multimodal recommendation aims to model user and item representations comprehensively by incorporating multimedia content for effective recommendation. Existing research has shown that combining (user- and item-) ID embeddings with multimodal salient features benefits recommendation performance, indicating the value of IDs. However, the literature lacks a thorough analysis of ID embeddings in terms of feature semantics. In this paper, we revisit the value of ID embeddings for multimodal recommendation and conduct a thorough study of their semantics, which we characterize as subtle features of \emph{content} and \emph{structure}. Based on our findings, we propose a novel recommendation model that incorporates ID embeddings to enhance the salient features of both content and structure. Specifically, we put forward a hierarchical attention mechanism that incorporates ID embeddings into modality fusion, coupled with contrastive learning, to enhance content representations. Meanwhile, we propose a lightweight graph convolutional network for each modality that amalgamates neighborhood and ID embeddings to improve structural representations. Finally, the content and structure representations are combined to form the ultimate item embedding for recommendation. Extensive experiments on three real-world datasets (Baby, Sports, and Clothing) demonstrate the superiority of our method over state-of-the-art multimodal recommendation methods and the effectiveness of fine-grained ID embeddings. Our code is available at https://anonymous.4open.science/r/IDSF-code/.