- ESA project to develop navigation solution for precise lunar landing
- Stereo camera synchronized tightly with IMU and GNSS using the SentiBoard, flown on a miniature helicopter
- Strong results, happy customer
- Reported significant time and cost saved both during integration and during development
The purpose of this article is to emphasize the importance of timing when working with robotics.
Some of the key tasks for a robotic system are:
- Sensing (sensors)
- Processing (computer)
- Actions (motions, e.g. move from A to B, brake, turn, pitch forward, etc.)
But what about time?
Let us use the following example:
Unmanned aerial vehicles (UAV) interacting with moving objects/platforms/targets.
Before delving into the example, we must first consider what is important for robotic systems that interact with the world:
- Sensing
- Motion estimation
- Object localization
To achieve the above-mentioned bullet points with high precision and accuracy, accurate timing is necessary.
A common problem is the latency from sensing, via localization (of objects/platforms/targets), to action. With knowledge of such latencies, the control system can compensate for delayed measurements using motion predictions. Without it, the control system will struggle to make decisions that achieve its given task at the right time, e.g. moving to a position, applying automatic braking, initiating a turn, or tilting the multirotor UAV to change position in a given direction.
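As a minimal sketch of the compensation idea above, the snippet below predicts where a moving target is "now" from a measurement that is a known latency old, assuming a simple constant-velocity model. The function name and the model are illustrative assumptions, not SentiSystems' implementation.

```python
# Sketch: compensating a delayed measurement with a constant-velocity
# motion prediction (illustrative assumption, 1-D for simplicity).

def compensate_delay(measured_pos, velocity, latency):
    """Predict the current position from a measurement that is
    `latency` seconds old, assuming constant velocity."""
    return measured_pos + velocity * latency

# A target measured at 10.0 m, moving at 2.0 m/s, with 0.25 s of
# sensing-to-decision latency, is predicted to be at 10.5 m "now".
predicted = compensate_delay(10.0, 2.0, 0.25)
print(predicted)  # 10.5
```

Without this correction, the controller would act on a position that is already velocity × latency out of date.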
Breaking the challenge into pieces
Let us now break down the latency from the sensor measurement all the way until the robotic system starts to perform an action. This is illustrated in the image below.

Latency accounting. Summing the latencies relative to the measurement's time of validity (TOV):
+ Latency before transport (TOT - TOV)
+ Sensor data transport time over the communication/data link (TOA - TOT)
+ Latency at the computer before processing starts (TOP - TOA)
+ Sensor data processing time (TOD - TOP)
+ Time from decision to action (TOAc - TOD)
= Total latency (TOAc - TOV)
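The accounting above can be sketched in a few lines of Python. The timestamp values are made up for illustration; the stage names follow the list above.

```python
# Sketch: summing per-stage latencies relative to the time of validity.
# Timestamps in seconds; values are illustrative, not measured data.

timestamps = {
    "TOV": 0.000,   # time of validity (measurement taken)
    "TOT": 0.004,   # transport starts
    "TOA": 0.016,   # data arrives at the computer
    "TOP": 0.021,   # processing starts
    "TOD": 0.046,   # decision made
    "TOAc": 0.066,  # action starts
}

stages = [
    ("before transport", "TOT", "TOV"),
    ("transport", "TOA", "TOT"),
    ("queueing at compute", "TOP", "TOA"),
    ("processing", "TOD", "TOP"),
    ("decision to action", "TOAc", "TOD"),
]

for name, end, start in stages:
    print(f"{name}: {timestamps[end] - timestamps[start]:.3f} s")

total = timestamps["TOAc"] - timestamps["TOV"]
print(f"total latency: {total:.3f} s")
```

Note that the per-stage latencies sum to the total by construction, which is what makes this decomposition useful for finding the dominant stage.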
How do these latencies affect my system? It works fine, right?
That depends. The negative effect of unhandled timing errors and latency increases with:
+ Relative dynamics of objects
+ Timing errors
+ Unmitigated latencies
= Increased time-related action errors
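A rough way to see this scaling: if the time error is the sum of unmitigated latency and timing error, the resulting position error grows linearly with the relative speed. The linear model below is an illustrative assumption.

```python
# Sketch: time-induced action error grows with relative dynamics.
# Assumed model: error ~ relative_speed * (latency + timing_error).

def time_induced_error(relative_speed, latency, timing_error):
    """Position error (m) from acting on time-shifted information."""
    return relative_speed * (latency + timing_error)

# Same 60 ms latency and 5 ms timing error, slow vs. fast target:
print(f"{time_induced_error(0.5, 0.060, 0.005):.4f} m")   # slow target
print(f"{time_induced_error(15.0, 0.060, 0.005):.4f} m")  # fast target
```

The same timing budget that is negligible for a slow target becomes nearly a metre of error for a fast one, which is why agile systems are the first to suffer.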
How can the SentiBoard (SB) technology by SentiSystems help with this?

- Sensing, timing, and sensor processing must be tightly integrated to achieve the full potential of your agile cyber-physical system
- Hardware-based timing is superior to software-based timing
- The SB timestamps sensor measurements with: time of validity (TOV), time of arrival (TOA), and TOV in UTC if GNSS time is available
- With the SB and software utilities, time-syncing protocols such as IEEE 1588 can be used to accurately sync the PC time with the SB time
- This enables SentiSystems to estimate sensor data latencies from measurement validity, via transport, to onboard processing and action, and to provide these to the user
- Having access to latencies, vehicle (UAV, vessel, robot, car, etc.) motion states/signals, and (tracks of) identified objects/platforms enables SentiSystems to do real-time localization and motion prediction of objects and platforms
- Relative timing errors among sensors also matter for slow/low-dynamic applications
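For readers unfamiliar with IEEE 1588 (PTP), the core of such protocols is a two-way timestamp exchange from which clock offset and path delay are estimated, assuming a symmetric link. The sketch below shows that standard computation; the timestamps are illustrative, and this is not the SentiBoard's specific implementation.

```python
# Sketch: offset/delay estimation behind IEEE 1588 (PTP)-style syncing.
# t1: master send, t2: slave receive, t3: slave send, t4: master receive.
# Assumes a symmetric network path (standard PTP assumption).

def ptp_offset_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way path delay
    return offset, delay

# Example: slave clock runs 3 ms ahead; true one-way delay is 1 ms.
offset, delay = ptp_offset_delay(100.000, 100.004, 100.010, 100.008)
print(f"offset: {offset*1e3:.1f} ms, delay: {delay*1e3:.1f} ms")
```

Once the PC clock is disciplined to the hardware clock this way, hardware timestamps like TOV and TOA become directly comparable with software timestamps on the PC.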
One of the challenges faced by robotic system designers is to decide which sensors to include in their systems. Several factors come into play, such as weight, power consumption, cost, accuracy, operating range, and robustness. The ultimate goal of the sensor payload is to estimate internal states, such as the position and orientation of the robot, and external states, such as range and bearing to an object of interest. This is typically achieved through fusing measurements from different sensors into a single coherent state estimate.

A key insight is that the overall accuracy of the sensor fusion algorithm is limited by one of two factors.
1. The accuracy of the individual sensor measurements.
2. The accuracy of the time synchronization between the different sensors.
If your system is limited by the accuracy of the time synchronization between sensors, then purchasing a more expensive and more accurate sensor will not improve your overall accuracy.
For a given time synchronization accuracy there will be an optimal sensor accuracy; improvements beyond this optimal point will not result in increased overall accuracy.
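One simple way to model this trade-off (an illustrative assumption, not a derivation from the text) is to treat the timing-induced position error as an independent noise term: the effective measurement error combines the sensor's own noise with speed × sync error in quadrature.

```python
import math

# Sketch: why sensor accuracy beyond a point stops helping.
# Assumed model: sigma_eff = sqrt(sigma_sensor^2 + (speed * sigma_sync)^2)

def effective_sigma(sigma_sensor, speed, sigma_sync):
    """Effective measurement std (m), combining sensor noise with the
    position error induced by time-sync error at a given speed."""
    return math.sqrt(sigma_sensor**2 + (speed * sigma_sync)**2)

# At 10 m/s with 1 ms sync error, the timing floor is 1 cm:
for sigma_sensor in (0.02, 0.01, 0.002):
    eff = effective_sigma(sigma_sensor, 10.0, 0.001)
    print(f"{sigma_sensor*100:.1f} cm sensor -> {eff*100:.2f} cm effective")
```

In this model, upgrading from a 2 cm sensor to a 1 cm sensor helps noticeably, but a 2 mm sensor barely improves the effective accuracy because the 1 cm timing floor dominates.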
We are happy to announce that we have joined the NVIDIA Inception Program. This is an important step in the direction we want to go, and it will help bring our sensor fusion technology, robotics navigation, and perception systems to a broader audience. We are excited for the future! #nvidiainception
Trondheim, Norway – SentiSystems, a Trondheim-based technology company, is proud to announce its role in developing sensor and navigation technology for the world’s first autonomous ferry connection: Lavik–Oppedal in Sognefjorden, Norway.
In collaboration with Norwegian Electric Systems (NES) and Fjord1 AS, this pioneering project marks a new chapter in maritime transportation. By combining zero-emission technology with autonomous navigation systems, the Lavik–Oppedal ferry will set a global benchmark for sustainable and intelligent transport.
“This is an exciting opportunity for us at SentiSystems and an important collaboration where we will contribute with our technology,” says Arne Kjorsvik, CEO at SentiSystems.
The Lavik–Oppedal route will be the first of its kind, where ferries operate autonomously using advanced systems for perception, positioning, and decision-making. SentiSystems plays a key role by delivering sensor fusion and situational awareness technology that enables safe, reliable, and precise autonomous operations in dynamic maritime environments.
“By leveraging sensor fusion in autonomous vessels, we enable enhanced perception, situational awareness, and improved positioning precision and accuracy, all of which are crucial for the safe and efficient operation of these ferries,” says Frederik Leira, CTO at SentiSystems.
Supporting Norway’s Maritime Innovation Ecosystem
SentiSystems is an active member of the Ocean Autonomy Cluster and the FI Ocean Space Incubator, two of Norway’s leading communities for ocean technology innovation. The collaboration with NES—a member of the GCE Blue Maritime Cluster—highlights how Norwegian clusters and innovation environments are coming together to position Norway as a leader in autonomous maritime technology.
“It’s great to see that more and more members of OAC are becoming suppliers of components and new technology in major projects,” says Frode Halvorsen, cluster leader at Ocean Autonomy Cluster.
The partnership aligns with the goals of the MIDAS project, which aims to promote autonomous maritime technology as a future export industry for Norway.