Abstract: In recent years, the importance of location-based services and indoor positioning systems has increased significantly for both research and industry. Visual localization systems have the advantage of not depending on dedicated infrastructure, which makes them attractive for navigation within buildings. While there are already approaches that use pre-recorded databases of reference images to obtain an absolute position for a given query image, suitable applications that estimate the relative movement of pedestrians from a first-person-perspective video are still missing. This paper presents a novel approach for a pedometer as well as an activity detector that uses such a first-person-perspective video stream of a pedestrian as input data. The system counts the number of steps and furthermore detects the current activities of a user. To this end, we analyze all video input data with the SURF algorithm in order to extract robust feature points. In particular, the orientation and scale properties of these feature points are used for an accurate measurement.