Introducing Seek v0.1 – training-free semantic search for robots in real buildings. Learn how we think about navigation.

Semantic search for robots

Intuitive visual search for robots

Seek is a training-free semantic navigation layer for robots. Give it a simple phrase – “blue trolley in bay 4”, “visitor kiosk in atrium B” – and it guides your robot through real buildings as if it has worked there for years.

No site-specific training runs. No custom detectors per deployment. Just natural language in, and smart search behaviour out.

Explore the API

Built for labs, warehouses, hospitals, and campuses where layouts shift, inventory moves, and robots need to adapt in real time.


# semantic-search session

target: "blue trolley in bay 4"
environment: ward corridor
mode: training-free

> starting search...
> narrowing down promising areas...
> suggesting safe approach pose...
Helps robotics teams deliver:
  • Faster recovery runs
  • Fewer manual searches
  • Clearer robot behaviour
  • Reusable search logic

Turn your buildings into language-searchable space

Instead of “go to coordinate X”, ask your robot to “go find that thing” – and let Seek handle the rest.

Search

Point the robot at a simple phrase like “blue trolley in bay 4” and let Seek decide where to look first, next, and when it’s close enough to stop.

Main Features //

Search that feels natural to humans, reliable for robots

Seek gives robots a sense of what they’re looking for, where it might be, and when they’re close enough — all from a simple phrase.

Understands natural descriptions

Describe targets the way teams already talk: “spill kit in bay 4”, “equipment trolley for theatre 2”, “scanner cart 3”.

Remembers where it has looked

Seek keeps track of what the robot has already seen, so it doesn’t keep circling the same empty corner when there are better places to check.
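Seek keeps this memory server-side, but the idea is easy to picture. Below is a toy client-side sketch of "where have I looked" bookkeeping using a coarse grid; the cell size, class, and method names are all illustrative and not part of the Seek API:

```python
import math

CELL = 2.0  # coarse grid resolution in metres (illustrative)

def cell_of(x: float, y: float) -> tuple[int, int]:
    """Map a world position to a coarse grid cell."""
    return (math.floor(x / CELL), math.floor(y / CELL))

class VisitedMemory:
    """Toy 'where have I looked' memory: mark observed cells,
    then filter candidate viewpoints down to unseen ones."""

    def __init__(self) -> None:
        self.seen: set[tuple[int, int]] = set()

    def mark(self, x: float, y: float) -> None:
        self.seen.add(cell_of(x, y))

    def unseen(self, candidates):
        return [p for p in candidates if cell_of(*p) not in self.seen]

memory = VisitedMemory()
memory.mark(1.0, 1.0)          # robot has inspected this corner
remaining = memory.unseen([(1.5, 0.5), (10.0, 4.0)])
# the first candidate shares the visited cell, so only the second remains
```

The same principle — discount what has already been observed — is what keeps a search from circling the same empty corner.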

Moves with intent

The search behaviour focuses time and motion on areas that matter, instead of blindly sweeping every corridor just to “cover the map”.
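To make "areas that matter" concrete, here is a hedged sketch of the kind of ranking involved: score candidate areas by a semantic prior minus a travel penalty, and visit the best first. The area names, weights, and scoring rule are invented for illustration and are not Seek's actual policy:

```python
import heapq, math

def travel_cost(pose, area):
    """Straight-line distance as a stand-in for planner cost."""
    return math.hypot(area[0] - pose[0], area[1] - pose[1])

def rank_areas(pose, areas):
    """Order candidate areas by expected value: a (made-up)
    semantic prior minus a travel-cost penalty."""
    scored = [(-(prior - 0.1 * travel_cost(pose, xy)), name)
              for name, xy, prior in areas]
    heapq.heapify(scored)
    return [heapq.heappop(scored)[1] for _ in range(len(scored))]

areas = [
    ("bay 4 entrance", (10.0, 0.0), 0.9),   # strong prior, nearby
    ("far storeroom",  (60.0, 20.0), 0.7),  # plausible but far
    ("empty corridor", (5.0, 0.0), 0.1),    # close but unlikely
]
order = rank_areas((0.0, 0.0), areas)
# nearby, high-prior areas come first; distant ones last
```

Ranking like this is why the robot heads for the bay entrance first instead of sweeping every corridor in order.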

Developer First //

Add “find-by-name” in a single loop

Seek plugs into the stack you already have. You keep SLAM, planning, and safety. We provide the search behaviour and high-level goals.

Works with ROS 2, simulators, custom frameworks, and real robots. No changes to your low-level control required.

from seek import Seek

seek = Seek(api_key="ss-YOUR_API_KEY")

# Start a natural-language search
session = seek.search_start(
    target="blue trolley in bay 4",
    pose=robot.pose(),
    frame=rgbd.latest_frame(),
)

# Ask Seek where to look next
while not session.done:
    waypoint = seek.search_next(session.id)
    nav.go_to(waypoint)

    # Stream updates as the robot moves and refresh
    # the session state so the loop can terminate
    session = seek.search_update(
        session_id=session.id,
        pose=robot.pose(),
        frame=rgbd.latest_frame(),
    )

# When Seek confirms the target
result = seek.search_verify(session.id)
if result.found:
    nav.go_to(result.approach_pose)

Integrations //

Drop into the stack you already run

Seek fits alongside your existing tools instead of replacing them.

  • Use Seek waypoints as goals for your existing planners.
  • Prototype everything in simulation before touching real hardware.
  • Export search runs as traces for debugging, demos, and reports.
  • Hook logs into your current observability and monitoring stack.

Foundations //

Built for dynamic environments, not static maps

Seek is designed around the messy reality of real buildings: things move, people rearrange spaces, and robots arrive after everything has changed.

Training-free by default

No per-site retraining just to make “find the trolley” work. Seek aims to generalise across locations from day one.

Platform-agnostic

Any platform that can localise and stream a camera feed can use Seek — AMRs, mobile bases with arms, quadrupeds, and more.

Observable behaviour

Search runs produce clear traces, so teams can understand how a robot looked for something and why it chose the paths it did.

Features //

We handle the search behaviour so you don’t have to

Instead of writing a new script every time you want a robot to “go and find X”, you call Seek and reuse the same behaviour across fleets and sites.

Language-driven goals

Move from internally-named waypoints to natural phrases that operations teams actually use.

Consistent search patterns

Get predictable, repeatable search behaviour instead of handcrafted routines that differ per robot.

Safe approach poses

When a target is found, Seek proposes sensible stand-off distances for inspection, handover, or docking.
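As a rough picture of what a stand-off pose means, here is a minimal geometric sketch: stop a fixed distance short of the target, facing it. The real service would also account for obstacles and the robot's footprint; this function is purely illustrative:

```python
import math

def approach_pose(robot_xy, target_xy, standoff=1.2):
    """Place the robot `standoff` metres short of the target,
    on the line from its current position, facing the target.
    Returns (x, y, heading_radians)."""
    dx = target_xy[0] - robot_xy[0]
    dy = target_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    heading = math.atan2(dy, dx)
    x = target_xy[0] - standoff * dx / dist
    y = target_xy[1] - standoff * dy / dist
    return (x, y, heading)

pose = approach_pose((0.0, 0.0), (4.0, 0.0))
# stops 1.2 m short of the target: (2.8, 0.0, 0.0)
```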

Replayable traces

Save runs as artefacts you can replay, compare over time, or use as evidence for pilots and audits.
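A trace can be as simple as an ordered list of decision records serialized to JSON. The record fields below are invented for illustration and are not Seek's actual trace schema:

```python
import json

# A search run as an ordered list of decision records
trace = [
    {"t": 0.0, "pose": [0.0, 0.0], "event": "search_start",
     "target": "blue trolley in bay 4"},
    {"t": 4.2, "pose": [6.1, 0.4], "event": "waypoint",
     "reason": "bay 4 entrance not yet inspected"},
    {"t": 9.8, "pose": [9.7, 1.1], "event": "verified", "found": True},
]

# Save as an artefact ...
blob = json.dumps(trace, indent=2)

# ... and replay it later, e.g. to summarise a run in a report
replayed = json.loads(blob)
summary = [r["event"] for r in replayed]
```

Because each record carries a pose, a timestamp, and a reason, the same artefact serves replay, comparison over time, and audit evidence.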

Pricing //

Flexible access while we’re in alpha

Right now we’re working closely with a handful of teams. No public pricing page, no self-serve yet — just robots, real spaces, and clear outcomes.

Sandbox

For simulation and small robots.

Free

per month (limited)

  • Core API access
  • Simulation-only usage
  • Community support

Pilot

For 1–2 robots in a live site.

Custom

per pilot

  • Up to 2 robots
  • Single building
  • Weekly check-ins

Team

For robotics teams with fleets.

TBD

post-alpha

  • Multiple robots & sites
  • Priority support
  • Rollout planning
Talk to us

Enterprise

For large fleets and custom constraints.

Contact

for details

  • Deep integrations
  • Custom security & SLAs
  • Joint pilots & metrics
Contact sales

Community //

Early partners on Seek

How teams are turning “go find X” into a repeatable behaviour instead of a one-off demo.

“We dropped Seek in front of our existing stack and were running language-driven recovery runs in simulation in a weekend.”

Pilot partner – Warehouse robotics team

“Having a single, shared search behaviour across robots is a huge step up from custom scripts per experiment.”

Research engineer – Robotics lab

Use cases //

From “where is it?” to repeatable robot routines

A few ways teams use Seek in practice.

Recovery runs

When inventory and reality disagree, send a robot to “find pallet 18B” or “locate scanner cart 3” and bring back evidence.

Guided navigation

Help visitors and staff with instructions like “walk to the visitor kiosk in atrium B” or “go to the charging bay for unit 12”.

Asset search & inspection

Let robots look for mobile assets – trolleys, carts, spill kits – that don’t stay in fixed bays.

Research & teaching

Run language-driven navigation experiments without building a search behaviour from scratch every semester.

Get started //

Alpha access for 5–10 teams

We’re working closely with a small group of partners to prove Seek in demanding environments. If you already have a robot moving in a real building, we’d love to talk.

  • A mobile robot already operating in a real environment.
  • Ability to stream camera frames and poses to an API.
  • Working in logistics, healthcare, research, or similar indoor settings.
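For a sense of what "stream camera frames and poses" involves, here is a minimal, illustrative payload builder. The field names are assumptions for the sketch, not the real wire schema:

```python
import base64, json, time

def update_payload(session_id, pose, frame_bytes):
    """Bundle one pose + camera frame into a JSON-friendly
    update ready to POST to an API endpoint."""
    return {
        "session_id": session_id,
        "timestamp": time.time(),
        "pose": {"x": pose[0], "y": pose[1], "theta": pose[2]},
        "frame_jpeg_b64": base64.b64encode(frame_bytes).decode("ascii"),
    }

payload = update_payload("sess-123", (1.0, 2.0, 0.5), b"\xff\xd8fake-jpeg")
wire = json.dumps(payload)  # one update on the wire
```

If your robot can produce this much — an ID, a pose, and an encoded frame — it meets the streaming requirement.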
Email us

FAQ //

Frequently asked questions

Everything you need to know about how Seek fits into your robots and your stack.

What is Seek?

Seek is a semantic navigation layer that lets robots find things in real buildings using natural language, not just coordinates.

Do we need to train it for each site?

No. Seek is designed to work across sites without per-environment training loops. You may tune some parameters for your platform and use case, but not retrain from scratch.

Which robots are compatible?

Any platform that can estimate its pose and stream a camera feed — AMRs, mobile manipulators, quadrupeds, and indoor drones.

Does Seek replace our navigation stack?

No. You keep navigation, control, and safety. Seek provides high-level search behaviour and goal suggestions that plug into your existing stack.

Can Seek run on-prem?

During the alpha, Seek runs as a hosted API. For larger deployments, we’re open to discussing on-prem or VPC setups with partners.

How does pricing work?

While in alpha, we work on a case-by-case basis. Longer term, pricing will be based on usage and number of robots, similar to other infrastructure APIs.