We proposed a new mobile AR framework that provides accurate augmentation through natural feature tracking and that scales with the number of objects to be augmented. We integrated scalable recognition based on a bag of visual words with natural feature tracking on mobile phones over conventional Wi-Fi networks, and showed that its performance is acceptable for real-world applications.
In our research we remove the limitations on mobile AR that stem from inexact tracking and make scalability possible. We expect that higher-quality AR applications will be built on the presented techniques.
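For concreteness, the sketch below illustrates the server-side recognition step of such a pipeline: building a visual vocabulary from database images, quantizing an image into a bag-of-visual-words histogram, and matching a query frame against the database histograms. The ORB features, k-means vocabulary, and nearest-histogram matching are illustrative assumptions, not necessarily the exact components used in our system.

```python
# Minimal bag-of-visual-words recognition sketch (assumed components: ORB + k-means).
import cv2
import numpy as np

def build_vocabulary(images, vocab_size=1000):
    """Cluster local descriptors from the database images into visual words."""
    orb = cv2.ORB_create()
    descriptors = []
    for img in images:
        _, des = orb.detectAndCompute(img, None)
        if des is not None:
            descriptors.append(np.float32(des))
    all_des = np.vstack(descriptors)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, _, vocab = cv2.kmeans(all_des, vocab_size, None, criteria, 3,
                             cv2.KMEANS_PP_CENTERS)
    return vocab

def bow_histogram(img, vocab):
    """Quantize an image's descriptors against the vocabulary."""
    orb = cv2.ORB_create()
    _, des = orb.detectAndCompute(img, None)
    hist = np.zeros(len(vocab), dtype=np.float32)
    if des is None:
        return hist
    # Assign each descriptor to its nearest visual word and count occurrences.
    dists = np.linalg.norm(np.float32(des)[:, None, :] - vocab[None, :, :], axis=2)
    for w in np.argmin(dists, axis=1):
        hist[w] += 1
    return hist / max(hist.sum(), 1.0)

def recognize(query_img, vocab, db_histograms):
    """Return the index of the database image whose histogram is closest to the query."""
    q = bow_histogram(query_img, vocab)
    return int(np.argmin([np.linalg.norm(q - h) for h in db_histograms]))
```

In the full system, only the recognition runs on the server; the identified object's reference features are then used for natural feature tracking on the phone itself.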
Experiments were carried out on an Android Nexus One with a 1 GHz Snapdragon processor. The camera captures 320x240-pixel images at around 20 Hz depending on the illumination conditions, and the network connection is IEEE 802.11b at 100 Mbps. The database contains pictures of 10k music CD covers and 10k corresponding music video scenes; it is used to augment each CD case cover with its matching music video scene.
Paper link: http://mind.kaist.ac.kr/pdf/srtmar201...
(via YouTube)