
Experiments with commercial visual-inertial odometry systems in partially snow-covered forest environment


Visual-inertial odometry (VIO) is a method of estimating the motion of a device in 3D space by combining information from visual and inertial sensors. It uses one or more cameras that capture images of the environment and an inertial measurement unit (IMU) that measures the acceleration and rotation of the device. Visual-inertial odometry is currently used widely, for example in robotic applications, especially in built environments. (Scaramuzza & Zhang 2019.)
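
As a conceptual illustration only (not the actual algorithm of any product discussed here), the core idea of blending dead-reckoned inertial data with occasional visual position fixes can be sketched in one dimension as a toy complementary filter:

```python
# Toy 1D illustration of the visual-inertial idea: dead-reckon with IMU
# accelerations, then blend in an occasional camera-based position fix.
# The function name and the fixed gain are invented for this sketch.

def vio_step(pos, vel, accel, dt, visual_pos=None, gain=0.5):
    """Propagate position/velocity from IMU data; correct with a visual fix."""
    vel += accel * dt           # integrate acceleration -> velocity
    pos += vel * dt             # integrate velocity -> position (drifts over time)
    if visual_pos is not None:  # a camera-based position estimate is available
        pos += gain * (visual_pos - pos)  # pull the drifting estimate back
    return pos, vel
```

Pure inertial integration drifts quadratically with time; the visual correction is what keeps the estimate bounded between fixes.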

Interest in utilizing visual-inertial odometry in situations where global navigation satellite system (GNSS) based positioning is unavailable has previously been reported both nationally in Finland by the National Land Survey (Kaasalainen, Mäkelä, Saajasto, Kirkko-Jaakkola & Kuusniemi 2021) and internationally, for example by the US Air Force Research Laboratory (Miller, Soloviev, Uijt de Haag, Veth, Raquet, Klausutis & Touma 2010). Today, the availability of low-cost commercial VIO systems has inspired ideas for utilizing these systems in several different GNSS-denied environments and applications.

The objective of the experiments reported here was to test the visual-inertial odometry capabilities of commercial off-the-shelf (COTS) products with free or bundled software in a forest environment, where GNSS-denied positioning may be needed, for example, by the tourism business, rescue services or the military. The experiments were conducted in the DEMOMAMI22 project, funded by the Regional Council of Lapland from the European Regional Development Fund. Besides remarks on the VIO experiments performed, this report includes some observations on other potential methods for GNSS-denied positioning.

Related work

Commercial VIO devices are already utilized widely for visual-inertial odometry in different industries, and there is comprehensive basic and applied research on VIO. However, most studies focus on urban environments and/or vehicles, often use purpose-built systems rather than commercial off-the-shelf products, and rarely aim to evaluate different COTS products for further commercial adoption.

A similar kind of experiment and evaluation has been done in built environments by the Department of Mechanical Systems Engineering of Sookmyung Women’s University, South Korea (Kim, Kim, Song, Lee, Jung & Kim 2022). The Faculty of Information Technology and Communication Sciences of Tampere University has benchmarked and evaluated VIO and visual SLAM methods on forest roads (Ali, Durmush, Suominen, Yli-Hietanen, Peltonen, Collin & Gotchev 2020). The Finnish Geospatial Research Institute has been involved in studies on the tactical usage of VIO systems in built environments, such as the CANDO project (Morrison, Ruotsalainen, Mäkelä, Rantanen & Sokolova 2019) and the INTACT project (Ruotsalainen, Kirkko-Jaakkola, Rantanen & Mäkelä 2017).

Used systems

The following devices and software were chosen to form the systems:

  • Intel RealSense D455 depth camera & Spectacular AI SDK examples
  • Stereolabs ZED2 depth camera & ZED SDK 4.0 examples
  • Apple iPhone 14 Pro smartphone & ARKit iOS Logger app
  • Google Pixel 6 Pro smartphone & ARCore IMURecorder app

Figure 1. 3D-printed test rig
Figure 2. Test rig in terrain

An Intel RealSense depth camera was used with Spectacular AI’s (2023) SDK examples (Seiskari & Rantalankila 2023). A Stereolabs ZED2 depth camera was paired with the manufacturer’s SDK examples (2023), with minor modifications. An Apple iPhone 14 Pro smartphone was used via PyojinKim’s iOS Logger app (Kim 2021), which builds on Apple’s ARKit, and a Google Pixel 6 Pro smartphone was used via rfbr’s IMURecorder app (Taniguchi 2019), which builds on Google’s ARCore. The IMURecorder app also had minor modifications, done by Lapland UAS’s FrostBit lab, to take into account the changed permission policies in recent Android versions. All devices were fastened to a 3D-printed plastic rig manufactured by Lapland UAS’s IoT lab. The depth cameras were connected to a Windows laptop for power and I/O, and the smartphones ran independently. All position estimates were calculated in real time by the software.

Experiments and results

The experiments took place in the Ounasvaara area in Rovaniemi on 4 and 8 May 2023 during daylight. The terrain was typical low-density boreal forest with some height differences. The ground was partially covered by snow. Lighting conditions were flat and did not change noticeably during the experiments.

Figure 3. Ounasvaara terrain 1
Figure 4. Ounasvaara terrain 2

The experiments included six separate routes, approximately 190 m to 700 m long, that were walked back and forth with the test rig. The first three and the last three routes shared the same start and finish point. At the start all recordings were initiated, then the route was walked, and at the finish the recordings were stopped and the software restarted. The test rig was held approximately at chest height and aimed forward, but its pose was varied slightly during the tests both intentionally and unintentionally. No sudden or abrupt movements were made. The experiments were conducted on foot in terrain, and there was no external shock absorption or stabilization for the rig.

On routes 4, 5 and 6 the Google Pixel 6 Pro had both software and user related problems that prevented recording. The iPhone 14 Pro had similar problems on route 2, as did the iPhone 12 mini used to record the reference GPX trail on the last three routes. The stereo cameras had no malfunctions. The accuracy of the reference GPX trail is several meters at best, so it is used mostly to provide context for the relative comparison of these systems. The systems did not know their geospatial location, so all results were relative to each system’s own origin and then converted. The outputs were in different formats and orientations, and were unified to a common origin in ETRS-TM35FIN for processing, using Excel and 3D-Win software.
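
To illustrate the kind of conversion involved (the actual processing was done with Excel and 3D-Win; the function name and axis conventions below are assumptions for this sketch), a sensor-local track can be translated and rotated onto a known ETRS-TM35FIN start coordinate like this:

```python
import math

def to_common_origin(track, start_en, heading_grad):
    """Translate a sensor-local track (x = forward, y = left, z = up) so it
    starts at start_en (ETRS-TM35FIN easting, northing) and rotate it so the
    local forward axis points along the given grid bearing (in grads/gons).
    Illustrative sketch only, not the procedure used in the report."""
    theta = heading_grad * math.pi / 200.0  # grads -> radians
    e0, n0 = start_en
    out = []
    for x, y, z in track:
        # rotate local forward/left axes onto grid east/north
        de = x * math.sin(theta) - y * math.cos(theta)
        dn = x * math.cos(theta) + y * math.sin(theta)
        out.append((e0 + de, n0 + dn, z))
    return out
```

With heading 0 grad the local forward axis maps to grid north; with 100 grads it maps to grid east.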

The results are shown below in tables and maps that include the errors of closure in XY and Z coordinates, the approximate trail lengths, and the bearings from the start to the turning point of each trail.
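
For a back-and-forth trail these quantities can be computed as in the following sketch; this is illustrative rather than the exact procedure behind the tables, and it approximates the turning point as the point farthest from the start:

```python
import math

def closure_and_bearing(track):
    """Compute the XY and Z errors of closure for a back-and-forth track
    and the grid bearing (in grads/gons) from the start to the turning
    point. Track rows are (E, N, Z) in a projected system such as
    ETRS-TM35FIN. Illustrative sketch only."""
    e0, n0, z0 = track[0]
    e1, n1, z1 = track[-1]
    xy_closure = math.hypot(e1 - e0, n1 - n0)  # horizontal misclosure
    z_closure = abs(z1 - z0)                    # vertical misclosure
    # approximate the turning point as the point farthest from the start
    turn = max(track, key=lambda p: math.hypot(p[0] - e0, p[1] - n0))
    # grid bearing: clockwise from north, converted from radians to grads
    bearing = math.atan2(turn[0] - e0, turn[1] - n0) * 200.0 / math.pi
    if bearing < 0:
        bearing += 400.0
    return xy_closure, z_closure, bearing
```

Because the routes start and finish at the same physical point, the closure is a direct measure of the accumulated drift over the whole walk.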

Figure 5. Route tables
Figure 6. Route 1
Figure 7. Route 2
Figure 8. Route 3
Figure 9. Route 4
Figure 10. Route 5
Figure 11. Route 6

Discussion and conclusion

We have conducted visual-inertial odometry experiments for four different COTS systems in a forest environment. The results demonstrate challenges for using these systems and methods for geospatial positioning, but show potential especially for filling the gaps and breaks in GNSS positioning or small-scale relative positioning.

Considering the small sample size and the very preliminary nature of these experiments, all the devices performed quite similarly on the first three routes, which are considered the most reliable and comparable ones. The most noticeable error in all of them is a 20–30 grad declination, which is greater than the approximately 14 grad magnetic declination in the area but appears to be consistent within the systems and between the routes. None of the systems appears to be truer than the others in this sense. Considering the errors of closure, the smartphones performed best and most consistently in XY, and especially in Z. The stereo cameras usually had greater and more inconsistent errors of closure, especially the ZED2 in XY and the RealSense D455 in Z. In this small sample group and in this environment, the iPhone 14 Pro’s ToF LiDAR does not seem to account for much better performance than the Google Pixel 6 Pro with traditional cameras. There are also hints of scaling inconsistencies and accumulation of rotational errors, which are typical for these types of tracking methods, but no further analysis was done to gain more understanding.

Other potential methods

The US Air Force Research Laboratory (Miller et al. 2010) has summarized three potential techniques for filling the “navigation gap” in GNSS-denied environments. One of them is aided inertial navigation, which we have introduced here. The other two are beacon-based navigation and signals of opportunity (SoOP). The brief descriptions below include only examples that do not require a comprehensive a priori model of the environment, such as up-to-date dense point clouds.

Beacon-based navigation is a technique that uses signal-emitting beacons, usually captured with a receiver, to achieve positioning or navigation by trilateration. These methods are currently widely applied in indoor positioning (Sharma, Bidari, Valente & Paredes 2019). The techniques also include pseudolites, a kind of “repeater” for GNSS signals, which have gained interest in military contexts but have their own challenges and limitations (Jones 2017). There are few COTS solutions for outdoor-scale beacon-based navigation, but for example Haglöf Sweden AB (2023) has developed a product called PosTex that uses ultrasound and lasers and has been tested by the Sveriges lantbruksuniversitet (Lämås 2010). The Finnish company Terratec Oy is also developing a pseudolite system for forestry applications (Metsäkeskus 2019).
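
As a minimal illustration of the trilateration principle behind beacon-based positioning, the following toy solves a 2D position from ranges to three known beacons by linearizing the range equations. The coordinates are invented, and real systems use more beacons and proper error modelling:

```python
# Toy 2D trilateration: subtracting the squared range equations pairwise
# removes the quadratic terms, leaving a 2x2 linear system A @ [x, y] = b.
# All numbers here are made up for illustration.

def trilaterate(beacons, ranges):
    """beacons: three (x, y) positions; ranges: measured distances to each."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the beacons are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

The geometry matters: if the beacons are nearly collinear the system becomes ill-conditioned, which is one practical limitation of sparse outdoor beacon networks.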

Signals of opportunity (SoOP) positioning is a technique that leverages existing wireless signals, such as TV, cellular or Wi-Fi, to determine the position of a receiver (Duckworth & Baranoski 2007). There are no readily available COTS solutions, even though smartphones already use “assisted” GNSS methods that exploit the vast amount of geospatial information the big operating system vendors have about, for example, the sources of cellular signals. Other SoOP technologies that are beginning to be commercialized at this scale are magnetic navigation (AstraNav 2023) and muon particle navigation (Muon Solutions 2023).


Ali, I.; Durmush, A.; Suominen, O.; Yli-Hietanen, J.; Peltonen, S.; Collin, J. & Gotchev, A. (2020) FinnForest dataset: A forest landscape for visual SLAM. Robotics and Autonomous Systems, Vol 132. https://urn.fi/URN:NBN:fi:tuni-202009016823

AstraNav (2023) Company website: Astra Navigation Inc., Texas, USA. https://www.astranav.com/

Duckworth, G.L. & Baranoski, E.J. (2007) Navigation in GNSS-Denied Environments: Signals of Opportunity and Beacons. Military Capabilities Enabled by Advances in Navigation Sensors (pp. 3-1 – 3-14). Meeting Proceedings RTO-MP-SET-104, Paper 3. Neuilly-sur-Seine, France: RTO. Available from: http://www.rto.nato.int

Haglöf Sweden AB. (2023) Positioning of individual trees. https://haglofsweden.com/project/positioning-of-individual-trees/

Jones, M. (2017) Army pseudolites: What, why and how? North Coast Media: GPS World. https://www.gpsworld.com/army-pseudolites-what-why-and-how/

Kim, P. (2021) GitHub: iOS Logger. https://github.com/PyojinKim/ios_logger

Kim, P.; Kim, J.; Song, M.; Lee, Y.; Jung, M. & Kim, HG. (2022) Benchmark Comparison of Four Off-the-Shelf Proprietary Visual–Inertial Odometry Systems. Department of Mechanical Systems Engineering, Sookmyung Women’s University, Seoul, South Korea. https://www.mdpi.com/1424-8220/22/24/9873

Lämås, T. (2010) The Haglöf PosTex ultrasound instrument for the positioning of objects on forest sample plots. Umeå: Sveriges lantbruksuniversitet – Institutionen för skoglig resurshushållning. https://pub.epsilon.slu.se/5461/1/Lamas_t_101019.pdf

Metsäkeskus. (2019) Loppuraportti: Koealamittaus 2020 – Suomen metsäkeskuksen projekti 21300/527. Project report.

Miller, M.; Soloviev, A.; Uijt de Haag, M.; Veth, M.; Raquet, J.; Klausutis, T. & Touma, J. (2010) Navigation in GPS Denied Environments: Feature-Aided Inertial Systems. RTO-EN-SET116(2010). Munitions Directorate, Air Force Research Laboratory, Florida, USA. https://apps.dtic.mil/sti/pdfs/ADA581023.pdf

Morrison, J.; Ruotsalainen, L.; Mäkelä, M.; Rantanen, J. & Sokolova, N. (2019) Combining visual, pedestrian, and collaborative navigation techniques for team based infrastructure free indoor navigation (CANDO). SINTEF, FGI, University of Helsinki. https://tuhat.helsinki.fi/ws/files/128942125/ION2019_Paper_Draft_V4.pdf

Muon Solutions (2023) Company website: Muon Solutions Oy, Oulu, Finland. http://muon-solutions.com/

Kaasalainen, S.; Mäkelä, M.; Saajasto, M.; Kirkko-Jaakkola, M. & Kuusniemi, H. (2021) Selvitys GNSS-palvelujen tarjonnasta ja toiminnasta. Report MML 50103/08 05/2021. National Land Survey of Finland. https://www.maanmittauslaitos.fi/sites/maanmittauslaitos.fi/files/GNSS_selvitys_loppuraportti.pdf

Ruotsalainen, L.; Kirkko-Jaakkola, M.; Rantanen, J. & Mäkelä, M. (2017) Summary report: Infrastructure-free tactical situational awareness (INTACT). 2017/2500M-0063. MATINE, FGI. https://www.defmin.fi/files/4160/MATINE_Summary_Report_2500M-0063_INTACT.pdf

Scaramuzza, D. & Zhang, Z. (2019) Visual-Inertial Odometry of Aerial Robots. Springer Encyclopedia of Robotics, Berlin, Germany.

Seiskari, O. & Rantalankila, P. (2023) GitHub: Spectacular AI SDK. https://github.com/SpectacularAI/sdk

Sharma, P.; Bidari, S.; Valente, A. & Paredes, H. (2019). Towards blind user’s indoor navigation: a comparative study of beacons and decawave for indoor accurate location. https://www.researchgate.net/publication/337730120_Towards_blind_user’s_indoor_navigation_a_comparative_study_of_beacons_and_decawave_for_indoor_accurate_location

Spectacular AI. (2023) Company website: Spectacular AI OY, Helsinki. https://www.spectacularai.com/

Stereolabs. (2023) Developers: SDK Downloads. Stereolabs Inc., USA. https://www.stereolabs.com/developers/release/

Taniguchi, I. (2019) GitHub: IMURecorder – Android application used to record IMU data and associated pose via ARCore. https://github.com/rfbr/IMU_and_pose_Android_Recorder

Writers: Janne Matilainen & Teuvo Heimonen

Lecturer Janne W. Matilainen (M.Eng.) and Senior Lecturer Teuvo Heimonen (Lis.Sc.(Tech.)) work at the Lapland University of Applied Sciences in the TEQU and Land Surveying Technologies teams, as part of the Smart Built Environment expertise group. This text is related to the DEMOMAMI22 project.