AR back-end
AR back-end. The AR back-end server processes uploaded frames from the AR front-end and matches them against objects in its database. On startup, it first reads the current database, which is stored in YAML format [18]. Our database is populated with 105 objects emulating a retail store and is partitioned into sections such as food, toys, and so on. Each object is stored in the database as a set of: the object name, an annotated tag, and the SURF [27] keypoints and descriptors extracted from the image of the object. The AR back-end server first decodes an encoded frame from the AR front-end and runs the SURF algorithm to extract SURF keypoints and descriptors from it. Using this information, it starts the matching process against the objects currently stored in the database. Here, the database is pruned based on the user’s location information provided by the LTE-direct localization manager. The matching procedure uses several steps to improve matching accuracy, even though this increases runtime. In each step, it compares the output against a threshold and then decides whether to proceed to the next step or return a “no-match” response. First, it performs the ratio test on the two best matches returned by a brute-force k-nearest-neighbor matcher for the two images. Second, it performs the symmetry test, which checks whether the best matches from two brute-force k-nearest-neighbor matchers (one in each direction) agree; if they do not, the best match is discarded. Last, it validates the matches using RANSAC (random sample consensus) to return the correct estimates (matches) as inliers and the incorrect ones as outliers.
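The first two filtering steps can be sketched in pure Python over toy 2-D “descriptors”. This is a minimal illustration, not the back-end’s implementation: the helper names (`knn2`, `ratio_test`, `symmetry_test`), the toy data, and the 0.8 ratio threshold are all assumptions for the sketch; a real system would run these tests on high-dimensional SURF descriptors.

```python
from math import dist  # Euclidean distance (Python 3.8+)

def knn2(query, train):
    """For each query descriptor, find its two nearest train descriptors.
    Returns (best_index, best_distance, second_best_distance) per query."""
    out = []
    for q in query:
        ranked = sorted(range(len(train)), key=lambda j: dist(q, train[j]))
        best, second = ranked[0], ranked[1]
        out.append((best, dist(q, train[best]), dist(q, train[second])))
    return out

def ratio_test(matches, ratio=0.8):
    """Keep query index i only if its best match is clearly better than
    the second best (the ratio test); 0.8 is an illustrative threshold."""
    return {i: m[0] for i, m in enumerate(matches) if m[1] < ratio * m[2]}

def symmetry_test(fwd, bwd):
    """Keep a match i -> j only if matching in the reverse direction
    maps j back to i; otherwise the match is discarded."""
    return {i: j for i, j in fwd.items() if bwd.get(j) == i}

# Toy data: A and B share two similar features; A[2] has no true match.
A = [(0.0, 0.0), (5.0, 5.0), (9.0, 1.0)]
B = [(0.1, 0.0), (5.0, 4.9)]
fwd = ratio_test(knn2(A, B))   # forward matches A -> B
bwd = ratio_test(knn2(B, A))   # backward matches B -> A
good = symmetry_test(fwd, bwd)
print(good)  # -> {0: 0, 1: 1}; the spurious match for A[2] is discarded
```

Note how the symmetry test catches what the ratio test alone misses: A[2]’s best match in B passes the ratio test, but B never maps back to A[2], so the pair is rejected.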

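The final RANSAC validation step can likewise be sketched with a deliberately simplified motion model. Assumption for the sketch: we fit a pure 2-D translation between matched keypoint locations (a real back-end would typically estimate a full homography), so a minimal sample is a single correspondence.

```python
import random

def ransac_translation(src, dst, iters=200, tol=1.0, seed=0):
    """Toy RANSAC: repeatedly hypothesize a 2-D translation from one
    randomly sampled correspondence src[i] -> dst[i], count how many
    correspondences agree within tol, and keep the largest inlier set."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        i = rng.randrange(len(src))            # minimal sample: one pair
        tx = dst[i][0] - src[i][0]
        ty = dst[i][1] - src[i][1]
        inliers = [j for j, (s, d) in enumerate(zip(src, dst))
                   if abs(s[0] + tx - d[0]) <= tol
                   and abs(s[1] + ty - d[1]) <= tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers

# Three correspondences consistent with a (10, 5) shift, plus one outlier.
src = [(0, 0), (1, 2), (3, 1), (8, 8)]
dst = [(10, 5), (11, 7), (13, 6), (0, 0)]
inliers = ransac_translation(src, dst)
print(inliers)  # -> [0, 1, 2]; the outlier pair (index 3) is rejected
```

The same inlier/outlier split is what the back-end uses as its final match decision: if the inlier set is large enough, the object is reported as a match; otherwise the pipeline returns “no-match”.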