Abstract: To address the problem of matching human trajectories with wearable sensor data, this paper proposes a method that matches human trajectories captured by cameras against sensor data from wearable devices. First, the deep learning model SyncScoreDTW evaluates the similarity between trajectory and sensor data per unit time; a likelihood fusion algorithm then progressively updates these similarities. Experiments on a self-built dataset show that the method achieves a high matching accuracy of 77.5%. The method also demonstrates superior performance on the public UEA dataset. This work reveals the potential of cross-modal data matching and provides a novel, efficient solution for the field of trajectory matching.
Keywords: deep learning; time series; human trajectory; sensors; multimodal sensor data analysis; Transformer
CLC Number: TP391
Document Code: A
Research on Trajectory Matching Method Based on the Fusion of Deep Learning and DTW
YAN Jinzhe, WU Xiangyang
(Department of Computer Science, Hangzhou Dianzi University, Hangzhou 310019, China)
jzandyan@gmail.com; wuxy@hdu.edu.cn
Abstract: To address the problem of matching human trajectories with wearable sensor data, this paper proposes a method that matches human trajectories captured by cameras against sensor data from wearable devices. First, the deep learning model SyncScoreDTW evaluates the similarity between trajectories and sensor data per unit time. Next, a likelihood fusion algorithm progressively updates these similarities. Experiments conducted on a self-built dataset demonstrate that the method achieves a high matching accuracy of 77.5%. The method also shows superior performance on the publicly available UEA dataset. This research reveals the potential of cross-modal data matching and provides a novel, efficient solution for the field of trajectory matching.
Keywords: deep learning; time series; human trajectory; sensors; multimodal sensor data analysis; Transformer
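The abstract's two-stage pipeline, per-window similarity scoring followed by progressive likelihood fusion, could be sketched roughly as follows. This is a minimal illustration only: the softmax conversion, uniform prior, and multiplicative update are assumptions for the sketch, not the paper's actual SyncScoreDTW model or fusion rule, and all function names are hypothetical.

```python
import math

def softmax(scores):
    # Turn per-window similarity scores into a likelihood distribution
    # over candidate sensor streams (higher score -> higher likelihood).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_likelihoods(window_scores):
    """Progressively fuse per-window likelihoods for one trajectory.

    window_scores: one entry per time window; each entry is a list of
    similarity scores, one per candidate sensor stream.
    Returns the fused posterior over candidate streams.
    """
    n = len(window_scores[0])
    posterior = [1.0 / n] * n  # uniform prior over candidates
    for scores in window_scores:
        lik = softmax(scores)
        # Multiplicative (Bayes-style) update, then renormalize.
        posterior = [p * l for p, l in zip(posterior, lik)]
        z = sum(posterior)
        posterior = [p / z for p in posterior]
    return posterior

# Example: three windows of scores for two candidate sensor streams;
# candidate 1 is consistently more similar to the trajectory.
scores = [[0.2, 0.9], [0.1, 0.8], [0.3, 0.7]]
post = fuse_likelihoods(scores)
match = max(range(len(post)), key=post.__getitem__)  # index of best match
```

Each new window sharpens the posterior, so evidence accumulates over time rather than the decision resting on any single window's score.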