Accurate Indoor Localization with Multiple Feature Fusion
Document Type
Conference Proceeding
Publication Date
2017
Publication Title
Wireless Algorithms, Systems, and Applications, WASA 2017
Abstract
In recent years, many fingerprint-based localization approaches have been proposed, in which different features (e.g., received signal strength (RSS) and channel state information (CSI)) are used as fingerprints to distinguish different positions. Although CSI-based approaches usually achieve higher accuracy than RSS-based approaches, we find that the localization results of different approaches usually complement each other, and by fusing different features we can obtain more accurate localization results than by using any single feature alone. In this paper, we propose a localization method that fuses different features by combining the results of different localization approaches to achieve higher accuracy. We first select the three most probable candidate positions from all the candidate positions generated by the different approaches, according to a newly defined metric called the confidence degree, and then use their weighted average as the position estimate. When there are more than three candidate positions, we use a minimal-triangle principle to break the tie and select three of them. Our experiments show that the proposed approach achieves median errors of 0.5 m and 1.1 m, respectively, in two typical indoor environments, significantly better than those of approaches using only a single feature.
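The fusion step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the confidence degrees are assumed to be supplied by the individual localization approaches, and the "minimal-triangle principle" is interpreted here as keeping the three candidates whose triangle has the smallest perimeter, which is one plausible reading of the tie-breaking rule.

```python
import math
from itertools import combinations

def fuse_positions(candidates, confidences):
    """Fuse candidate positions from different localization approaches.

    candidates  -- list of (x, y) position estimates
    confidences -- matching confidence-degree weights (hypothetical inputs)
    """
    if len(candidates) > 3:
        # Tie-break: keep the three candidates forming the triangle
        # with the smallest perimeter (assumed reading of the
        # paper's minimal-triangle principle).
        def perimeter(tri):
            a, b, c = tri
            d = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
            return d(a, b) + d(b, c) + d(c, a)

        best = min(combinations(range(len(candidates)), 3),
                   key=lambda idx: perimeter([candidates[i] for i in idx]))
        candidates = [candidates[i] for i in best]
        confidences = [confidences[i] for i in best]

    # Confidence-weighted average of the selected candidates.
    w = sum(confidences)
    x = sum(c * p[0] for p, c in zip(candidates, confidences)) / w
    y = sum(c * p[1] for p, c in zip(candidates, confidences)) / w
    return (x, y)
```

With four equally confident candidates of which one is an outlier, the tie-break keeps the three tightly clustered positions and averages them.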
Repository Citation
Xiao, Yalong; Wang, Jianxin; Zhang, Shigeng; Wang, Haodong; and Cao, Jiannong, "Accurate Indoor Localization with Multiple Feature Fusion" (2017). Electrical and Computer Engineering Faculty Publications. 436.
https://engagedscholarship.csuohio.edu/enece_facpub/436
DOI
10.1007/978-3-319-60033-8_45
Volume
10251
Comments
This work is partially supported by the National Natural Science Foundation of China under Grant Nos. 61402056 and 61402541, the Hunan Provincial Natural Science Foundation of China under Grant No. 2017JJ3413, the NSFC/RGC Joint Research Scheme under Grant No. N PolyU519/12, the ANR/RGC Joint Research Scheme under Grant No. A-PolyU505/12, and the CERNET Innovation Project under Grant No. NGII20160309.