Semantic search for robots
Seek is a training-free semantic navigation layer for robots. Give it a simple phrase – “blue trolley in bay 4”, “visitor kiosk in atrium B” – and it guides your robot through real buildings as if it has worked there for years.
No site-specific training runs. No custom detectors per deployment. Just natural language in, and reliable search behaviour out.
Built for labs, warehouses, hospitals, and campuses where layouts shift, inventory moves, and robots need to adapt in real time.
# semantic-search session
target: "blue trolley in bay 4"
environment: ward corridor
mode: training-free
> starting search...
> narrowing down promising areas...
> suggesting safe approach pose...
Instead of “go to coordinate X”, ask your robot to “go find that thing” – and let Seek handle the rest.
Point the robot at a simple phrase like “blue trolley in bay 4” and let Seek decide where to look first, next, and when it’s close enough to stop.
Main Features //
Seek gives robots a sense of what they’re looking for, where it might be, and when they’re close enough — all from a simple phrase.
Describe targets the way teams already talk: “spill kit in bay 4”, “equipment trolley for theatre 2”, “scanner cart 3”.
Seek keeps track of what the robot has already seen, so it doesn’t keep circling the same empty corner when there are better places to check.
The search behaviour focuses time and motion on areas that matter, instead of blindly sweeping every corridor just to “cover the map”.
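For intuition only (this is not Seek's internals), the two behaviours above, remembering what has already been inspected and spending motion where it is likely to pay off, amount to simple visited-area bookkeeping, roughly like the toy sketch below:

# Toy sketch of visited-area bookkeeping, not Seek's implementation.
visited = set()

def mark_inspected(x, y, cell=1.0):
    """Record that the area around (x, y) has been checked (1 m grid cells)."""
    visited.add((round(x / cell), round(y / cell)))

def pick_next(candidates, cell=1.0):
    """Prefer candidate (x, y) waypoints whose cell has not been inspected yet."""
    fresh = [(x, y) for x, y in candidates
             if (round(x / cell), round(y / cell)) not in visited]
    return (fresh or candidates)[0]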
Developer First //
Seek plugs into the stack you already have. You keep SLAM, planning, and safety. We provide the search behaviour and high-level goals.
Works with ROS 2, simulators, custom frameworks, and real robots. No changes to your low-level control required.
from seek import Seek

seek = Seek(api_key="ss-YOUR_API_KEY")

# Start a natural-language search
session = seek.search_start(
    target="blue trolley in bay 4",
    pose=robot.pose(),
    frame=rgbd.latest_frame(),
)

# Ask Seek where to look next
while not session.done:
    waypoint = seek.search_next(session.id)
    nav.go_to(waypoint)

    # Stream updates as the robot moves; the refreshed session
    # carries the latest search state, including session.done
    session = seek.search_update(
        session_id=session.id,
        pose=robot.pose(),
        frame=rgbd.latest_frame(),
    )

# When Seek confirms the target
result = seek.search_verify(session.id)
if result.found:
    nav.go_to(result.approach_pose)
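The `robot`, `rgbd`, and `nav` objects above are placeholders for your own stack. As one sketch of what a `nav.go_to` shim might look like on a ROS 2 / Nav2 setup (the waypoint fields `x`, `y`, `yaw` are assumptions about the waypoint format, not a confirmed schema):

import math

import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator


class Nav2Adapter:
    """Thin shim that forwards Seek waypoints to Nav2."""

    def __init__(self):
        rclpy.init()
        self.navigator = BasicNavigator()
        self.navigator.waitUntilNav2Active()

    def go_to(self, waypoint):
        # Assumed waypoint shape: x, y in metres and yaw in radians, map frame.
        goal = PoseStamped()
        goal.header.frame_id = "map"
        goal.header.stamp = self.navigator.get_clock().now().to_msg()
        goal.pose.position.x = waypoint.x
        goal.pose.position.y = waypoint.y
        goal.pose.orientation.z = math.sin(waypoint.yaw / 2.0)
        goal.pose.orientation.w = math.cos(waypoint.yaw / 2.0)
        self.navigator.goToPose(goal)
        while not self.navigator.isTaskComplete():
            pass  # a real shim would stream seek.search_update(...) while moving

Because the shim only consumes goal poses, your existing costmaps, controllers, and safety layers stay exactly as they are.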
Integrations //
Seek fits alongside your existing tools instead of replacing them.
Foundations //
Seek is designed around the messy reality of real buildings: things move, people rearrange spaces, and robots arrive after everything has changed.
No per-site retraining just to make “find the trolley” work. Seek aims to generalise across locations from day one.
Any platform that can localise and stream a camera feed can use Seek — AMRs, mobile bases with arms, quadrupeds, and more.
Search runs produce clear traces, so teams can understand how a robot looked for something and why it chose the paths it did.
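The integration surface this implies is deliberately small: a pose estimate plus camera frames. As a sketch only (the `Pose2D` and `Frame` shapes, and the `localiser` / `camera` helpers, are assumptions for illustration, not Seek's required schema), the inputs fed to `search_start` and `search_update` could be bundled like this:

from dataclasses import dataclass

import numpy as np


@dataclass
class Pose2D:
    x: float    # metres, map frame
    y: float
    yaw: float  # radians


@dataclass
class Frame:
    rgb: np.ndarray    # H x W x 3, uint8
    depth: np.ndarray  # H x W, float32 metres (or None if RGB-only)
    timestamp: float


def snapshot(localiser, camera):
    """Bundle the two things Seek consumes: where the robot is, and what it sees."""
    x, y, yaw = localiser.current_pose()  # assumed helper on your stack
    rgb, depth, stamp = camera.capture()  # assumed helper on your stack
    return Pose2D(x, y, yaw), Frame(rgb, depth, stamp)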
Proof //
We focus on measurable outcomes in real environments — so stakeholders can approve rollout based on evidence, not hype.
How quickly a robot can locate the target under normal site conditions.
How often the robot confirms the correct target and reaches a safe approach pose.
How much manual searching and recovery work is removed from the workflow.
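The first two can be read straight off logged runs (the third is compared against your current workflow). A minimal sketch, assuming each run record carries a start time, a found time, and flags for correct confirmation and a reached approach pose (illustrative fields, not Seek's log schema):

from statistics import median


def summarise(runs):
    """runs: list of dicts like
    {"t_start": float, "t_found": float or None, "correct": bool, "approach_ok": bool}
    """
    times = [r["t_found"] - r["t_start"] for r in runs if r["t_found"] is not None]
    confirmed = sum(1 for r in runs if r["correct"] and r["approach_ok"])
    return {
        "median_time_to_find_s": median(times) if times else None,
        "confirmation_rate": confirmed / len(runs) if runs else None,
    }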
Features //
Instead of writing a new script every time you want a robot to “go and find X”, you call Seek and reuse the same behaviour across fleets and sites.
Move from internally-named waypoints to natural phrases that operations teams actually use.
Get predictable, repeatable search behaviour instead of handcrafted routines that differ per robot.
When a target is found, Seek proposes sensible stand-off distances for inspection, handover, or docking.
Save runs as artefacts you can replay, compare over time, or use as evidence for pilots and audits.
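The stand-off behaviour is plain geometry: stop a set distance short of the target, facing it. A toy sketch to show the idea (Seek returns `result.approach_pose` directly; the 1.2 m default here is an arbitrary illustration):

import math


def approach_pose(robot_xy, target_xy, standoff=1.2):
    """Pose `standoff` metres short of the target, on the robot's side, facing it."""
    dx = target_xy[0] - robot_xy[0]
    dy = target_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy) or 1e-6      # avoid division by zero
    ux, uy = dx / dist, dy / dist          # unit vector robot -> target
    x = target_xy[0] - ux * standoff
    y = target_xy[1] - uy * standoff
    yaw = math.atan2(dy, dx)               # face the target
    return x, y, yaw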
Use cases //
A few ways teams use Seek in practice.
When inventory and reality disagree, send a robot to “find pallet 18B” or “locate scanner cart 3” and bring back evidence.
Help visitors and staff with instructions like “walk to the visitor kiosk in atrium B” or “go to the charging bay for unit 12”.
Let robots look for mobile assets – trolleys, carts, spill kits – that don’t stay in fixed bays.
Run language-driven navigation experiments without building a search behaviour from scratch every semester.
Contact //
If you have a robot operating in a real building (or a simulator setup you want to validate), we can discuss fit, integration requirements, and what a measurable proof-of-value looks like.
What to include
Robot platform, sensors (RGB-D / LiDAR), your nav stack (ROS 2 / Nav2, etc.), and example “find-by-name” queries.
Typical outcomes
A repeatable search behaviour + logs/metrics to support internal approval for rollout.
Deployment options
Hosted API for early evaluations. On-prem/VPC discussions for larger deployments.
FAQ //
Everything you need to know about how Seek fits into your robots and your stack.
What is Seek?
Seek is a semantic navigation layer that lets robots find things in real buildings using natural language, not just coordinates.
Does Seek need site-specific training?
No. Seek is designed to work across sites without per-environment training loops. You may tune some parameters for your platform and use case, but you don’t retrain from scratch.
Which robots can use Seek?
Any platform that can estimate its pose and stream a camera feed — AMRs, mobile manipulators, quadrupeds, and indoor drones.
Does Seek replace my navigation stack?
No. You keep navigation, control, and safety. Seek provides high-level search behaviour and goal suggestions that plug into your existing stack.
How is Seek deployed?
Early deployments run as a hosted API. For larger rollouts, we’re open to discussing on-prem or VPC setups based on security and operational requirements.
How do we get started?
Contact us with your robot platform, environment, and a few example targets you want the robot to find. We’ll confirm integration fit, define success metrics, and propose a short proof-of-value plan.