Semantic search for robots
Seek is a training-free semantic navigation layer for robots. Give it a simple phrase – “blue trolley in bay 4”, “visitor kiosk in atrium B” – and it guides your robot through real buildings as if it had worked there for years.
No site-specific training runs. No custom detectors per deployment. Just natural language in, and smart search behaviour out.
Built for labs, warehouses, hospitals, and campuses where layouts shift, inventory moves, and robots need to adapt in real time.
# semantic-search session
target: "blue trolley in bay 4"
environment: ward corridor
mode: training-free
> starting search...
> narrowing down promising areas...
> suggesting safe approach pose...
Instead of “go to coordinate X”, ask your robot to “go find that thing” – and let Seek handle the rest.
Point the robot at a simple phrase like “blue trolley in bay 4” and let Seek decide where to look first, next, and when it’s close enough to stop.
Main Features //
Seek gives robots a sense of what they’re looking for, where it might be, and when they’re close enough — all from a simple phrase.
Describe targets the way teams already talk: “spill kit in bay 4”, “equipment trolley for theatre 2”, “scanner cart 3”.
Seek keeps track of what the robot has already seen, so it doesn’t keep circling the same empty corner when there are better places to check.
The search behaviour focuses time and motion on areas that matter, instead of blindly sweeping every corridor just to “cover the map”.
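As a toy illustration of that idea (not Seek’s actual scoring, which is internal), a searcher can rank candidate regions by semantic relevance, discounted each time a region has already been checked, so it naturally moves on from empty corners:

```python
# Toy sketch: relevance-weighted search ordering with a visit penalty.
# Region names and scores below are made up for illustration.

def next_region(candidates, visit_counts, decay=0.5):
    """Pick the candidate with the best relevance score,
    halving a region's score each time it has been checked."""
    def effective(region):
        name, relevance = region
        return relevance * (decay ** visit_counts.get(name, 0))
    return max(candidates, key=effective)[0]

candidates = [("bay 4", 0.9), ("bay 3", 0.6), ("corridor", 0.2)]
visits = {}

order = []
for _ in range(4):
    region = next_region(candidates, visits)
    order.append(region)
    visits[region] = visits.get(region, 0) + 1

# The searcher alternates between the two promising bays
# instead of re-checking bay 4 forever or sweeping the corridor.
```

The low-relevance corridor is never swept at all in this run, which is the point: effort goes where the target is likely to be.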
Developer First //
Seek plugs into the stack you already have. You keep SLAM, planning, and safety. We provide the search behaviour and high-level goals.
Works with ROS 2, simulators, custom frameworks, and real robots. No changes to your low-level control required.
from seek import Seek

seek = Seek(api_key="ss-YOUR_API_KEY")

# Start a natural-language search
session = seek.search_start(
    target="blue trolley in bay 4",
    pose=robot.pose(),
    frame=rgbd.latest_frame(),
)

# Ask Seek where to look next
while not session.done:
    waypoint = seek.search_next(session.id)
    nav.go_to(waypoint)

    # Stream updates as the robot moves
    seek.search_update(
        session_id=session.id,
        pose=robot.pose(),
        frame=rgbd.latest_frame(),
    )

# When Seek confirms the target
result = seek.search_verify(session.id)
if result.found:
    nav.go_to(result.approach_pose)
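Because Seek only emits high-level goals like the waypoints above, integration is usually one thin adapter from a Seek waypoint to whatever pose type your navigator already accepts. A minimal sketch with stand-in types (the `Waypoint` fields and navigator names here are hypothetical, not Seek’s API):

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    """Stand-in for a waypoint suggested by Seek."""
    x: float
    y: float
    heading: float

class NavAdapter:
    """Adapts Seek waypoints to an existing go-to-pose interface."""
    def __init__(self, navigator):
        self.navigator = navigator

    def go_to(self, wp: Waypoint):
        # Translate into whatever pose call your own planner expects.
        return self.navigator.go_to_pose(wp.x, wp.y, wp.heading)

class FakeNavigator:
    """Your SLAM / planning / safety stack stays in charge of motion."""
    def go_to_pose(self, x, y, heading):
        return f"navigating to ({x}, {y}) facing {heading}"

result = NavAdapter(FakeNavigator()).go_to(Waypoint(2.0, 1.5, 0.0))
```

Swapping `FakeNavigator` for a ROS 2 action client, a simulator API, or a custom framework is the only part that changes per stack.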
Integrations //
Seek fits alongside your existing tools instead of replacing them.
Foundations //
Seek is designed around the messy reality of real buildings: things move, people rearrange spaces, and robots arrive after everything has changed.
No per-site retraining just to make “find the trolley” work. Seek aims to generalise across locations from day one.
Any platform that can localise and stream a camera feed can use Seek — AMRs, mobile bases with arms, quadrupeds, and more.
Search runs produce clear traces, so teams can understand how a robot looked for something and why it chose the paths it did.
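One lightweight way to get that kind of trace (a sketch of the idea, not Seek’s actual trace format) is to log each search step as a JSON line that can later be replayed or diffed between runs:

```python
import json

# Hypothetical trace record: field names are illustrative, not Seek's schema.
def log_step(trace, step, pose, region, note):
    trace.append(json.dumps({
        "step": step,
        "pose": pose,      # (x, y, heading) at the time of the decision
        "region": region,  # where the robot chose to look
        "note": note,      # why it chose that region
    }))

trace = []
log_step(trace, 1, (0.0, 0.0, 0.0), "bay 4",
         "highest relevance for 'blue trolley'")
log_step(trace, 2, (3.2, 1.1, 1.57), "bay 3",
         "bay 4 already checked, empty")

# Replay: parse the lines back into step records.
replayed = [json.loads(line) for line in trace]
```

Because each line is self-contained, traces append cheaply during a run and can be compared across robots or sites with ordinary text tooling.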
Features //
Instead of writing a new script every time you want a robot to “go and find X”, you call Seek and reuse the same behaviour across fleets and sites.
Move from internally named waypoints to the natural phrases operations teams actually use.
Get predictable, repeatable search behaviour instead of handcrafted routines that differ per robot.
When a target is found, Seek proposes sensible stand-off distances for inspection, handover, or docking.
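For intuition, a stand-off pose can be computed by backing off from the target along the robot’s line of approach, so the robot stops short of the object while facing it. A simplified 2-D sketch (Seek’s actual proposal logic is richer and obstacle-aware):

```python
import math

def standoff_pose(target_xy, robot_xy, distance=0.8):
    """Pose `distance` metres short of the target, facing it."""
    tx, ty = target_xy
    rx, ry = robot_xy
    heading = math.atan2(ty - ry, tx - rx)  # robot-to-target direction
    return (tx - distance * math.cos(heading),
            ty - distance * math.sin(heading),
            heading)

# Robot at the origin, trolley 5 m straight ahead:
# stop 1 m short of it, still facing the trolley.
x, y, yaw = standoff_pose((5.0, 0.0), (0.0, 0.0), distance=1.0)
```

Different tasks just mean different distances: close for docking, further back for visual inspection or a human handover.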
Save runs as artefacts you can replay, compare over time, or use as evidence for pilots and audits.
Pricing //
Right now we’re working closely with a handful of teams. No public pricing page, no self-serve yet — just robots, real spaces, and clear outcomes.
Free – per month (limited). For simulation and small robots.
Custom – per pilot. For 1–2 robots in a live site.
TBD – post-alpha. For robotics teams with fleets.
Contact – for details. For large fleets and custom constraints.
Community //
How teams are turning “go find X” into a repeatable behaviour instead of a one-off demo.
“We dropped Seek in front of our existing stack and were running language-driven recovery runs in simulation in a weekend.”
Pilot partner – Warehouse robotics team
“Having a single, shared search behaviour across robots is a huge step up from custom scripts per experiment.”
Research engineer – Robotics lab
Use cases //
A few ways teams use Seek in practice.
When inventory and reality disagree, send a robot to “find pallet 18B” or “locate scanner cart 3” and bring back evidence.
Help visitors and staff with instructions like “walk to the visitor kiosk in atrium B” or “go to the charging bay for unit 12”.
Let robots look for mobile assets – trolleys, carts, spill kits – that don’t stay in fixed bays.
Run language-driven navigation experiments without building a search behaviour from scratch every semester.
Get started //
We’re working closely with a small group of partners to prove Seek in demanding environments. If you already have a robot moving in a real building, we’d love to talk.
FAQ //
Everything you need to know about how Seek fits into your robots and your stack.
What is Seek?
Seek is a semantic navigation layer that lets robots find things in real buildings using natural language, not just coordinates.
Does Seek need per-site training?
No. Seek is designed to work across sites without per-environment training loops. You may tune some parameters for your platform and use case, but not retrain from scratch.
Which robots can use Seek?
Any platform that can estimate its pose and stream a camera feed – AMRs, mobile manipulators, quadrupeds, and indoor drones.
Does Seek replace my navigation stack?
No. You keep navigation, control, and safety. Seek provides high-level search behaviour and goal suggestions that plug into your existing stack.
Where does Seek run?
During the alpha, Seek runs as a hosted API. For larger deployments, we’re open to discussing on-prem or VPC setups with partners.
How is Seek priced?
While in alpha, we work on a case-by-case basis. Longer term, pricing will be based on usage and number of robots, similar to other infrastructure APIs.