
docs.innate.bot

Frequently Asked Questions

We regularly receive the same questions; here is a compilation of answers:

Can everything really run on the robot without a computer with GPU?

Yes!

We take great care to make sure your robot can run everywhere without the need for a clunky setup. Our VLA (Innate ACT), the SLAM stack, and the BASIC agent all run on the robot (with help from the cloud for BASIC).

A computer is required to develop for it, but once you're happy with your programs, you can close it.

Note that training the model (after collecting data) can be done on your computer OR on the cloud, in our infrastructure.

MARS Example use-cases

As an extensible general-purpose robot, MARS can be used for a wide array of use-cases, which can be further increased with additional sensors and effectors.

All the examples below are possible autonomously — some of them might require more data than we have collected so far.

  • A Desk Personal Productivity companion that hits you whenever you look at your phone or when it sees you are not working on your screen.

  • An Elderly Companion that takes notes during the day of what your grandparents do and can remind them to take their medicine.

  • A floor-decluttering robot that takes Legos and trash off the floor and puts them in a particular bin in another room.

  • A Chess Playing Robot that roasts you while you play (or adopts a different character depending on the user).

  • A Security Robot that can open doors and explore the place, and call 911 if it sees someone.

  • A food-serving robot that can put food on your plate when you hold one in front of it.

  • A concierge robot that gives directions to visitors, hands out flyers, and guides them to their destination.

Autonomous demos

The videos below show real autonomous deployments of MARS in different situations. All were accomplished by training the arm and running the BASIC agent.

Playing Chess

Picking up socks to clean the room

Giving tools for work

Patrolling the house, including opening doors

Home

At Innate, we make affordable & intuitive AI robots that you can teach and program.

Our whole stack, from the hardware to the software, is designed to run the state-of-the-art of AI in embodied agents and manipulation models.

This documentation describes our robots, our open operating system, how to develop with it, and examples of what you can do with them.

Our systems are developed with a few principles in mind:

  • Users should be able to quickly run programs developed for our robots.

  • Developers should have access to a complete powerful stack.

  • The robots should be robust, reliable, and precise enough to execute AI algorithms.


Robots

MARS, Innate's first robot, is a portable mobile manipulator under $2k capable of running embodied agents and VLAs.

Perfect for developers & hobbyists looking to build on the state of the art of AI

Innate OS & BASIC

How to develop and train your robot running the Innate OS and its brain BASIC.

Run your first behavior in minutes, learn how to make complex behaviors with VLAs, and share your behaviors with others.

Any question that this documentation cannot answer? Join our Discord server.

MARS

MARS is a small mobile manipulator packing all the hardware and AI required to run a complete AI robot. It is designed for hobbyists who want to develop and share physical applications.

What do I get with MARS?

MARS is a lot more than a few pieces of hardware strapped together. It is a fully assembled, calibrated robot that we designed to run AI algorithms properly.

MARS's arm and visual stack are designed so that VLAs can run repeatably.

The robot itself is robust enough to resist (reasonable) falls and shocks, and can easily be opened and modded.

MARS is essentially a small research platform, but with software made for engineers and builders.

Open-source

MARS is open-source, so that you can quickly modify it or develop your own stack on top of it.

You can buy an assembled and calibrated MARS from our website.

Navigation

More details coming soon.

In the meantime, check examples on MARS Quick Start

Sensors

Overview

Mars features a comprehensive sensor suite consisting of three primary sensors:

Forward-Facing RGBD Camera

  • High-quality depth perception with 7.5cm stereo baseline

  • 150° diagonal field of view

  • Effective depth range: 40cm - 6m

  • Depth accuracy: <2% error up to 3.5m, <4% to 6.5m, <6% to 9m

Gripper-Mounted RGB Camera

  • 160° diagonal field of view

  • 2MP resolution (1920x1080)

  • 30 FPS at full resolution

  • Ideal for close-range manipulation tasks and visual servoing

2D LiDAR

  • 360° scanning coverage

  • Range: 0.15m - 12m

  • Angular resolution: ≤1°

  • Distance resolution: <0.5mm

  • Scan rate: 5.5Hz

  • Primary sensor for SLAM and navigation

This sensor configuration enables robust environmental perception, precise manipulation, and reliable navigation through complementary sensing modalities. Each sensor's data can be accessed and processed through the Innate SDK.
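As a rough illustration of what these specs imply, here is a back-of-envelope sketch in plain Python (no SDK calls; the accuracy tiers and LiDAR figures are taken directly from the numbers above):

```python
# Back-of-envelope figures derived from the sensor specs above.
# Purely illustrative; these are not Innate SDK calls.

def depth_error_bound(distance_m: float) -> float:
    """Worst-case RGBD depth error (meters) per the accuracy tiers above."""
    if distance_m <= 3.5:
        pct = 0.02   # <2% error up to 3.5m
    elif distance_m <= 6.5:
        pct = 0.04   # <4% error up to 6.5m
    else:
        pct = 0.06   # <6% error up to 9m
    return distance_m * pct

# LiDAR throughput: 360 deg coverage at <=1 deg angular resolution,
# scanning at 5.5 Hz, yields at least this many range readings per second:
lidar_points_per_second = 360 * 5.5

# At 3 m the worst-case depth error is about 6 cm (3 m * 2%).
```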


Custom Sensors

Mars provides two powered USB 3.0 ports for connecting additional sensors and peripherals.

Port Specifications

  • 2x USB 3.0 ports

  • Power output: 1.2 Amps per port

  • Data transfer rate: Up to 5 Gbps

  • Hot-swappable

  • Full compatibility with USB 2.0 devices
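When attaching peripherals, it helps to check their combined draw against the 1.2 A per-port budget above. A minimal sketch (the device names and current draws are made-up examples, not measured values):

```python
PORT_CURRENT_LIMIT_MA = 1200  # 1.2 A per USB 3.0 port, per the specs above

def fits_on_one_port(draws_ma):
    """Check whether devices sharing one port stay within the current budget."""
    return sum(draws_ma) <= PORT_CURRENT_LIMIT_MA

# Hypothetical peripherals with made-up current draws, in mA:
mic, depth_cam = 100, 900
print(fits_on_one_port([mic]))             # 100 mA  <= 1200 mA -> True
print(fits_on_one_port([mic, depth_cam]))  # 1000 mA <= 1200 mA -> True
print(fits_on_one_port([900, 400]))        # 1300 mA >  1200 mA -> False
```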


Behaviors

What is a behavior?

Behaviors steer your robot to perform complex long horizon tasks. A behavior is a combination of robot skills, additional sensory inputs, and language instructions.

When running with BASIC, a behavior brings the robot to life, executing the task like a human would by following the instructions.

Behaviors are like apps!

Behaviors are like apps, but for your robot. Once created, you can easily send your behavior and the associated code and data to anyone else with an Innate robot.

Where can I find my behaviors?

All Innate robots come with a set of behaviors pre-installed, just as your PC comes with a set of small apps pre-installed.

You can directly find them on the Home page in the Innate Controller App.

Inputs

Inputs let you send additional data asynchronously to BASIC while a behavior is running.

They are particularly suited to integrating data from a sensor you added to the robot, for example a microphone.

They are also useful for receiving data asynchronously from another device or API on the internet, such as emails.

Innate Controller App

Note that we are still in beta for both versions; please reach out on Discord to get access.

The app is available here for android and here for ios.

A robot should be easily controllable everywhere, anytime. We developed an app compatible with both iPhone and Android so that you can control your robot and trigger your behaviors whenever you want.

Manipulation

More details coming soon.

In the meantime, check examples on MARS Quick Start

Connecting to BASIC

BASIC is accessible for free to all users of Innate robots for 300 cumulative hours - and probably more if you ask us.

Authentication is TBD, users of the first Innate robots will have a direct, personal endpoint to the system.

MARS Quick Start

Learn how to control MARS with your phone and the controller arm, make it navigate and talk, and trigger autonomous behaviors & skills made by others

This page describes the experience of receiving your MARS robot for the first time and running it.

If you want a more technical introduction to our SDK and the underlying ROS2 system, please go to:

A quick overview

Powering up and connecting to MARS

MARS turns on automatically when plugging the battery in. Put it on the floor, plug the battery, and open the Innate Controller App (works on iPhone and Android).

The app is available here for android and here for ios.

Experience the phone control

Once connected, you can verify that MARS is properly running by controlling the robot straight through the app.

  1. Go to manual control

  2. Make the robot move with the joystick and the head with the slider.

  3. Plug the controller arm in and toggle the arm control to verify that the robot mimics your movement.

Talk to MARS for the first time

MARS can run BASIC, our embodied agent that allows the robot to act in real-time and decide what to do based on what it sees and hears.

Put MARS on a table, in front of you, go to "Agentic" on the app, and ask it what it sees. You should start seeing its thoughts and answers appear in the chat.

Make MARS navigate

When running BASIC, MARS can navigate. On the agentic screen, ask it things such as "move forward 1m", or "go to the fridge" if the fridge is in sight.

You can also try more complex requests such as "explore until you see a human".

Use an autonomous arm AI skill

Innate robots can use "skills" to perform actions in the physical world or the digital world.

You can see which ones are installed by going to Skills in the app (middle icon in the tab bar) and looking at the list of physical and digital skills installed.

To run one, go back to the home screen, select manual control, and open the skills menu to pick one to use. Then press the button and observe.

VIDEO

Use an existing behavior

BASIC can run programs we call "behaviors" that determine the robot's purpose and abilities. On the app, you can see which ones are already installed.

Try out "Roast me", "Find my shoes", or "Pick up the trash" straight from the app.

Before running a behavior, you can observe what it was programmed to do.

VIDEO

Create your first map

Innate robots running BASIC have spatial memory, but they currently require a map for it to be fully functional, so that the robot can remember where it saw things.

To create a map, in the app, go in Settings -> Mapping and press the button. Once the robot is mapping, move around and observe the map being created.

Once you're satisfied with how it looks, you can save the map, which will automatically activate on your robot.

VIDEO

Navigate, this time with memory

Now that MARS has spatial memory, you can drive it around and it will memorize what it sees. Try to drive it around your kitchen then, in agentic mode, ask "go to the fridge" and observe it going there.

VIDEO

Congrats, you can control your robot!

Now you know how to run basic controls of the robot from the app.

Next up: Create your first behavior and train your first manipulation model, to run them autonomously!

You have completed the quick start!

Congrats!

You can now create basic behaviors that allow the robot to interact with the world. There is a lot more to try.

You can create additional skills like "wave" and "navigate" to use the arm for learned actions or have the arm access the internet to use Google, send emails, call other agents...

Go to the following pages to learn more:

PAGE FOR TEACHING THE ARM A SKILL

PAGE TO WRITE SKILLS WITH CODE

PAGE TO TEACH AN ARM SKILL WITH INVERSE KINEMATICS

PAGE FOR SHARING A BEHAVIOR

ADVANCED: USE AND MODIFY THE CORE ROS2 OS

ADVANCED: IMPLEMENTING A DIFFERENT MANIPULATION MODEL

Innate robots introduce a new paradigm in programming robots through code, demonstrations and language.

Our robots run through skills and behaviors.

Skills are atomic capabilities built for MARS. They can be digital, written in code (like sending emails), or physical (like picking up socks or navigating to a room).

Skills can be written as pure code using the SDK, as trained models for manipulation (VLAs) and navigation, or both.

Behaviors are like an app for your robot.

A BASIC behavior is a composition of atomic skills with a system prompt. A robot running a behavior performs complex long-horizon tasks, wrapping its skills together with reasoning, memory, and decision-making capabilities.

Example: Security Robot

This is a quick walkthrough of how MARS can be programmed to patrol a house and send alerts over email.

1. Installation

Install the Innate SDK on your workstation, or download the Innate app from your app store.

pip install innate-sdk 

You can start coding right away by defining files in ~/skills and ~/behaviors
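Assuming the SDK picks up Python files from those two folders, a hypothetical layout could look as follows (the filenames are examples on our part, not required names):

```shell
# Hypothetical layout; filenames are illustrative, not required names
mkdir -p ~/skills ~/behaviors
touch ~/skills/send_email.py
touch ~/behaviors/security_guard.py
ls ~/skills ~/behaviors
```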

2. Train Skill to Open Doors

This is best done through the app: collect a new dataset of demonstrations, choose your parameters, and train on our servers.

3. Email Skill

In code, add a Skill that gives the robot the ability to send emails to different users. Define a guideline for how to use the skill and provide a method to execute it.

"""
Example: Minimal SendEmail skill
----------------------------------
This snippet demonstrates how to build a simple Skill class
that sends an email using SMTP. It is designed for documentation
purposes: clean, minimal, and easy to follow.
"""

import os, smtplib
from email.mime.text import MIMEText
from brain_client.primitives.types import Skill, SkillResult


class SendEmail(Skill):
    # Every skill should define a name property
    @property
    def name(self):
        return "send_email"

    # Provide short usage guidelines
    def guidelines(self):
        return "Send an emergency email. Provide subject, message, and recipients."

    def __init__(self):
        # Configure SMTP server and credentials via environment variables
        self.smtp_server = "smtp.gmail.com"
        self.smtp_port = 587
        self.smtp_user = os.getenv("SMTP_USER", "")
        self.smtp_pass = os.getenv("SMTP_PASS", "")

    # The core method: what happens when the skill is executed
    def execute(self, subject: str, message: str, recipients):
        # Allow either a single recipient string or a list
        if isinstance(recipients, str):
            recipients = [recipients]

        # Build the email
        msg = MIMEText(message)
        msg["From"] = self.smtp_user
        msg["To"] = ", ".join(recipients)
        msg["Subject"] = subject

        # Connect to the SMTP server and send the message
        with smtplib.SMTP(self.smtp_server, self.smtp_port) as server:
            server.starttls()
            server.login(self.smtp_user, self.smtp_pass)
            server.sendmail(self.smtp_user, recipients, msg.as_string())

        # Return a simple success result
        return f"Email sent to {', '.join(recipients)}", SkillResult.SUCCESS

4. Create a Behavior

This is what describes the purpose of the robot during its execution. Define through prompting what the robot should do in different situations, and add the relevant skills.

class SecurityGuard(Behavior):
    def name(self) -> str:
        return "security_guard_behavior"

    def get_skills(self) -> List[str]:
        return ["navigate", "open_door", "send_email"]

    def get_prompt(self) -> str:
        return """You are a security guard robot. Patrol the house, 
        stay alert and professional, watch for intruders, open doors
         when needed, and send an email immediately if you find one."""

5. Run The Behavior

New Behaviors are automatically registered. You can run them from the app.

6. Share

Behaviors and Skills are shareable and work across robots.

You can simply share the files in a GitHub repository like our repository.

Mars is designed for builders who want to push the boundaries of what's possible. Whether you're building a 3D printing assistant or something entirely new, we can't wait to see what you'll create.

Overview

How to create Behavior Apps for Innate Robots using the SDK, or ROS2 directly

Core concepts

A system diagram of the software running on Innate robots

An agentic OS

Innate robots run an agentic OS built on top of ROS2. It is powered by our cloud agent called BASIC.

This abstraction layer lets you create powerful agentic applications quickly without worrying about the usual suspects of classical robotics (unless you want to).

Behaviors

The central concept of the Innate OS is the behavior, which is our name for a physical app for robots. Behaviors are defined by a system prompt and a set of skills they can use.

Behaviors are like physical apps for Innate robots.

The most simple behavior is:

class HelloWorld(Behavior):
    def name(self) -> str:
        return "hello_world"

    def get_skills(self) -> List[str]:
        return []

    def get_prompt(self) -> str:
        return """You are just a robot saying hello_world once you start."""

This will start the robot and make it say hello world on the speakers once.

See more in Behaviors

Skills

Skills are the second core concept of the Innate OS.

A skill can be defined with code, a model checkpoint (such as a VLA) or other specific interfaces we define for you. Adding a skill to a behavior is like giving additional capabilities to your robot.

Similarly to agentic frameworks, skills can be thought of as tool calls, with extra sauce.

Skills can be interrupted by the robot during execution if required, and can send feedback into the context of the running behavior.

See how to create skills in Skills

BASIC

BASIC is the embodied AI agent that controls Mars. BASIC can run behaviors and skills, and gives Mars the ability to reason, memorize, plan and make decisions as it runs.

Understand more how BASIC runs in BASIC

ROS2 core

Our OS runs at the core on ROS2 and can be augmented at that level by roboticists that understand it.

See ROS2 core for more information on nodes, topics, and services available.

MARS Quick Development

Learn to create your first behavior, train your first manipulation model, give MARS the ability to read emails, and put the pieces together

Create your first behavior

Now that you know the basics, you can create your first behavior for Mars using the SDK. On the app, go to Settings -> Wifi and read the IP of the robot.

Now, on your PC, SSH into the robot with

Go to ~/behaviors/ and create a hello_world.py behavior file:

"wave" and "navigate_to_position" are basic skills that come pre-installed on the robot. This behavior makes use of them to act autonomously.

Save the file, then restart the robot (unplug and plug again), open the app and start your behavior. Sit in front of the robot, and observe!

VIDEO

To dive deeper into the details of how to develop behaviors, see Behaviors.

Train your first manipulation model for a skill

Innate robots' arms can be trained using state-of-the-art manipulation AI models, running straight on the onboard computer. For MARS, we developed an improved version of ACT with a reward model - see more details here.

To train it, you can use the app to collect episodes of data for imitation learning, i.e., you will repeatedly perform the task with the robot enough times to make sure it learns it the way you want.

In the app, go to Skills -> Physical, create a new skill, name it, and press "Add Episodes".

Then, activate the arm and press Record to collect an episode. Ideally, all episodes should start in a similar position and end in a similar position, following roughly the same movement. Start with very similar trajectories to accomplish the goal, while making sure that the arm camera keeps the objective of the motion relatively in sight. More guidelines on training can be found here.

Below, an example of training the arm to pick up a cherry.

Once you have collected around 50 episodes, you can consider stopping data collection. We can easily train on your dataset for you: go to the Training tab and press "Train with whole dataset". You can also use the episodes yourself by SSH-ing into the robot and retrieving them there.

Once the model is trained (which takes up to 4 hours), you can get it back on your robot and then trigger it from the Manual Control screen!

VIDEO

Create your first digital skill

Innate robots can also run any kind of code in the embodied agent BASIC, which can be used to query APIs online or run custom routines onboard.

Below is an example of how to create a skill that queries Gmail to read the latest emails.

You can run this skill in a behavior that queries it:

Below is the result of running it:

VIDEO

To learn more about the Skills SDK, see Skills.

Starting a behavior

Behaviors can be triggered on the robot or straight from

Trigger a Behavior

Option 1: Mars App

  1. Connect to your robot in the Mars app

  2. On the Home screen, locate the Behaviors section

  3. Tap a behavior to activate it

Default behavior

You can define a behavior as the default behavior of your robot. In this case, the robot will automatically boot up to this behavior upon starting.

You can do this in the app, on Home, press a behavior, then press the "Set as default" button.

When a behavior is set as default, it is displayed as such on the controller app.

Arm

Specifications

  • Reach: 40cm

  • Repeatability: 2mm

  • Payload: 250g at maximum extension
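These numbers translate directly into a simple workspace check. A minimal sketch, assuming target coordinates are expressed in meters relative to the arm base (that frame choice is an assumption on our part):

```python
import math

REACH_M = 0.40         # 40 cm reach, from the specs above
PAYLOAD_LIMIT_G = 250  # payload at maximum extension, from the specs above

def within_reach(x: float, y: float, z: float) -> bool:
    """True if a target point (meters, relative to the arm base) is reachable."""
    return math.sqrt(x * x + y * y + z * z) <= REACH_M

print(within_reach(0.20, 0.20, 0.10))  # distance 0.30 m  -> True
print(within_reach(0.40, 0.20, 0.0))   # distance ~0.45 m -> False
```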

Actuators

MARS's arm uses 3 types of Dynamixel actuators:

These were picked for their robustness and repeatability over alternatives. This arm can, for example, repeatably play chess (see MARS Example use-cases).

Control via app & leader arm

MARS (like all Innate robots) is controllable via our app, available here for android and here for ios.

Every MARS robot comes with a leader arm attachable to your phone in this way:

The app will automatically recognize your arm. Pressing the red "Arm" button lets you control the robot with the leader arm. For the best experience, we recommend placing yourself behind the robot. If the connection is choppy, you can connect to your robot using your phone hotspot. You can see more details in Innate Controller App.

For more examples linked to MARS, look for the videos in the controller app.

  • Dynamixel XL430 (Robotis XL430-W250-T) × 2 - Product Page

  • Dynamixel XC430 (Robotis XC430-T240BB-T) × 1 - Product Page

  • Dynamixel XL330 (Robotis XL330-M288) × 4 - Product Page

MARS Example use-cases
here for android
here for ios
Innate Controller App
MARS Quick Start
The arm is plugged in your phone with a double USB-C cable

Example: Microphone

Innate Robots are designed with extensibility in mind.

MARS, in particular, can easily be augmented with a $50 microphone to be able to listen to you (see tested components in Extending MARS)

By default, a "micro" input device is available on MARS. If you have a microphone plugged in and trigger a behavior that uses it, you will be able to talk directly to MARS. See below the relevant part of the code in the hello world behavior (same as in Definitions).

def get_inputs(self) -> List[str]:
        # This directive can use an integrated microphone input to hear the user
        return ["micro"] 

The code for this input device is available in our open-source repositories at TBD.


Code listings referenced in MARS Quick Development:
ssh jetson1@<YOUR-ROBOT-IP>
from typing import List
from brain_client.behaviors.types import Behavior

class HelloWorld(Behavior):
    @property
    def name(self) -> str:
        return "hello_world"

    def get_skills(self) -> List[str]:
        return [
            "navigate_to_position",
            "wave",
        ]

    def get_prompt(self) -> str:
        return """
You are a friendly greeting robot whose sole purpose is to say hello world to the user!

Your personality:
- You are a nice and cheerful robot.

Instructions:
- When you see a user in front of you, say "hello world" and wave at the user.
- Don't navigate, just turn around if you don't see the user.
"""
import imaplib
import email
import os
from brain_client.skill_types import Skill, SkillResult


class RetrieveEmails(Skill):
    def __init__(self, logger):
        self.logger = logger
        self.imap_server = "imap.gmail.com"
        self.email = "[email protected]"
        # Read the mailbox password from an environment variable
        # (the variable name here is an example)
        self.password = os.getenv("EMAIL_PASSWORD", "")

    @property
    def name(self):
        return "retrieve_emails"

    def guidelines(self):
        return "Use to retrieve recent emails. Provide count (default 5). Returns subjects and content."

    def execute(self, count: int = 5):
        count = min(max(1, count), 20)
        try:
            mail = imaplib.IMAP4_SSL(self.imap_server, 993)
            mail.login(self.email, self.password)
            # ... fetch and process emails ...
            email_data = "Email 1: Subject, From, Content..."
            self._send_feedback(email_data)
            return f"Retrieved {count} emails with subjects and content", SkillResult.SUCCESS
        except Exception as e:
            return f"Failed to retrieve emails: {str(e)}", SkillResult.FAILURE
from typing import List
from brain_client.behaviors.types import Behavior

class EmailAssistant(Behavior):
    @property
    def name(self) -> str:
        return "email_assistant"

    def get_skills(self) -> List[str]:
        return [
            "retrieve_emails"
        ]

    def get_prompt(self) -> str:
        return """
You are an email assistant.
When the user sits in front of you, you should tell them what their last email is.
"""

Connecting to a robot

The Innate App connects to your robot via WiFi. It can also work with your phone hotspot.

If this is the first time you are connecting, you will first use Bluetooth to tell the robot which network to connect to and to get its IP.

Press on the robot when it appears to connect via Bluetooth.

If it's already on the same network as your phone, you can connect.
Otherwise, select "Change Robot WiFi" to pick another network to connect it to.

Once connected, you can also get your current IP on the network in the Configuration Tab -> WiFi

Extending MARS

MARS was designed for hardware extensibility through several features:

  • Most GPIO pins are available on the onboard computer.

  • Two USB-A extension ports are available at the top of the robot.

  • The hardware is open-sourced in order to be easier to modify, in particular the end-effector.

We describe below how we recommend extending MARS on the hardware side.

Changing the end-effector

The end effector by default is a gripper (opposable thumbs).

By the nature of robot learning, if you change the end effector to any other shape, you will have to recollect data for your manipulation models, but nothing else in the operating system needs to change if you still use the final actuator.

Examples of two different types of end effectors we used without changing the code: the MARS v1 gripper, and a gripper with just one side moving.

Adding sensors and effectors on the USB and GPIOs

We leave it to the user to integrate any additional sensors they want on the available ports. For any added device, users should make sure they install it properly on the robot. You can then feed the data into the BASIC OS by creating a Sensor object in the SDK (to be revealed...)

Some sensors we have already tried and work plug-and-play on MARS:

  • TONOR directional microphone ($29 on Amazon) - Plug it in and use the TONORMicrophoneInput (to be revealed...) to make MARS able to listen to you in real time.

  • Blackiot Polverine Air Quality Sensor.

Onboard computer

MARS comes with a Jetson Orin Nano Super 8GB Development Kit.

Specs:

  • AI Performance: Up to 67 TOPS (sparse INT8)

  • GPU: NVIDIA Ampere architecture with 1024 CUDA cores and 32 Tensor cores

  • CPU: 6-core ARM Cortex-A78AE v8 (64-bit)

  • Memory: 8 GB LPDDR5

  • Storage: 1TB SSD (integrated by Innate) + 32GB microSD (for OS)

  • Developer Kit Includes: Jetson Orin Nano Super module and reference carrier board

Two additional USB ports remain available to users after everything is plugged in by Innate.

More physical skills

You can create additional physical skills with code by using lower-level APIs such as navigating to a specific pose:

from innate.skill import Skill
from innate import navigation, manipulation
import numpy as np
from typing import Tuple

class GoHome(Skill):
    """
    Returns the robot to its home/base pose safely by coordinating
    manipulation and navigation.
    """

    def __init__(self):
        super().__init__()
        navigation.init()
        manipulation.init()

    def guidelines(self) -> str:
        return """
        Use this skill when:
        - Robot needs to return to its home position
        - Robot needs to reset its configuration
        - Starting a new set of tasks

        Do not use when:
        - Robot is carrying objects
        - Path to home is blocked
        """

    def execute(self) -> Tuple[str, bool]:
        try:
            # 1) Move arm to a safe zero/park joint configuration
            home_joints = np.zeros(6)  # Adjust DOF as needed
            manipulation.set_joint_pose(home_joints)

            # 2) Navigate to absolute 'home' pose
            navigation.go_to_pose(0.0, 0.0, 0.0)

            return "Successfully returned to home position", True
        except Exception as e:
            return f"Failed to return home: {e}", False

    def interrupt(self):
        # Stop all motion channels
        navigation.interrupt()
        manipulation.interrupt()

Definition

Every InputDevice has to implement a name, an on_open method, and an on_close method. on_open is called when a behavior is started with this device; on_close is called when it is stopped.

Template

This template, once activated, sends data every 1 second to the agent as if the user was talking.

import threading

from brain_client.input_types import InputDevice

class MyInput(InputDevice):
    def __init__(self):
        self._stop_evt = threading.Event()

    @property
    def name(self) -> str:
        return "my_input"  # used by behaviors/directives

    def on_open(self):
        self._stop_evt.clear()
        
        def timer_loop():
            while not self._stop_evt.is_set():
                self.send_data("tick!", data_type="chat_in")
                self._stop_evt.wait(timeout=1.0)
        
        threading.Thread(target=timer_loop, daemon=True).start()

    def on_close(self):
        self._stop_evt.set()

Data types

The BASIC agent performs better with properly formatted inputs, so we provide the data_type parameter to send data on the right channels.

Currently, data_type can either be chat_in or custom.

If you set data_type="custom" when sending feedback, preferably format your message as a stringified JSON object.
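For instance, a custom payload can be built with the standard json module before handing it to send_data (send_data is the InputDevice method shown in the template above; the field names below are made-up examples, not a required schema):

```python
import json

# Hypothetical sensor reading; field names are illustrative examples.
payload = json.dumps({
    "source": "air_quality_sensor",
    "co2_ppm": 612,
    "timestamp_s": 1700000000,
})

# Inside an InputDevice you would then forward it, e.g.:
# self.send_data(payload, data_type="custom")
print(payload)
```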

Behavior examples

Examples

Clean Socks Behavior

class CleanSocks(Behavior):
    def name(self) -> str:
        return "clean_socks_behavior"


    def get_skills(self) -> List[str]:
        return ["navigate", "pick_up_sock", "drop_socks"]

    def get_prompt(self) -> str:
        return """You are a sock-cleaning robot. Search the room for socks.
        When you see one, go to it, pick it up, and put it in the wooden box.
         Keep doing this until there are no more socks."""

Security Guard Behavior

class SecurityGuard(Behavior):
    def name(self) -> str:
        return "security_guard_behavior"


    def get_skills(self) -> List[str]:
        return ["navigate", "open_door", "send_email"]

    def get_prompt(self) -> str:
        return """You are a security guard robot. Patrol the house,
         stay alert and professional, watch for intruders, 
         open doors when needed, and send an email immediately 
         if you find one."""

ROS2 core

BASIC

BASIC is our embodied agent, named in tribute to the BASIC language that hobbyists used in the early age of PCs. Think of it as the LLM OS running your behaviors and skills. When running, BASIC controls the robot autonomously, following your instructions.

BASIC is a complex assembly of several models, giving the impression you're interacting with one model with the ability to memorize and plan spatially.

Technical Report

TBD

BASIC is evaluated and improved in a 3D simulation in which we test its ability to perform a wide array of tasks. More to be revealed soon.

Definitions

Behaviors are located in ~/behaviors on your robot.

Defining a Behavior requires two components, plus an optional third:

  1. A list of Skills that BASIC has access to, such as picking up an object, navigating, or querying the internet

  2. A natural-language prompt describing how to perform the task

  3. (Optional) A list of additional inputs the agent can take, like microphone audio, air data, or incoming emails

Template: A simple hello world

from innate.behavior import Behavior
from typing import List

class HelloWorld(Behavior):
    def get_skills(self) -> List[str]:
        # List the Skills this behavior can call.
        return ["wave", "navigate"]


    def get_prompt(self) -> str:
        # Define how BASIC should use the Skills.
        return """
You are a robot who can say hello world to the user.

- Speak in lowercase. You can respond to the user.
- Don't navigate, just turn around if you don't see the user.
- Say hello world while waving if you see the user.
"""
    
    def get_inputs(self) -> List[str]:
        # This behavior uses the integrated microphone input to hear the user
        return ["micro"] 
        
        

Skills

Skills are atomic robot capabilities that BASIC combines and chains to enable complex, long-horizon behaviors. Skills can encode physical capabilities (manipulation, navigation, etc.), digital capabilities (emails, data processing, API calls), or both.

Defining a Skill requires three main components, which BASIC interacts with:

  1. Guidelines: language instructions telling BASIC when and how to use the capability encoded in the Skill

  2. Execution: a method BASIC calls to execute the encoded capability

  3. Cancellation: a method BASIC calls to safely and reactively cancel execution when appropriate

Defining Skills

Below is a sample skill.

from innate.skill import Skill
from typing import Tuple

class MySkill(Skill):
    def __init__(self):
        """
        Initialize the skill
        """
        super().__init__()  # Required: initialize parent class

    def guidelines(self) -> str:
        """
        Define usage guidelines for when this skill should be used
        Returns: string describing use cases and restrictions
        """
        return """
        Use this skill when:
        - [Describe when to use this skill]
        - [List relevant conditions]

        Do not use when:
        - [Describe when not to use this skill]
        - [List restrictions or limitations]
        """

    def execute(self) -> Tuple[str, bool]:
        """
        Main execution logic
        Returns: (feedback string, success boolean)
        """
        try:
            # Implement your skill's logic here
            return "Task completed successfully", True
        except Exception:
            return "Task failed", False

    def interrupt(self):
        """
        Define how to safely stop execution
        """
        # Implement safe stopping behavior here
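
Since execute returns a (feedback string, success boolean) tuple, a caller such as BASIC can branch on the result. A minimal self-contained sketch of that contract, using a plain Python stand-in rather than the innate API:

```python
from typing import Tuple

class GreetSkill:
    """Stand-in skill that follows the (feedback, success) contract."""

    def execute(self) -> Tuple[str, bool]:
        try:
            # A real skill would move the robot or call an API here.
            return "Greeted the user", True
        except Exception:
            return "Greeting failed", False

feedback, success = GreetSkill().execute()
if success:
    print(f"OK: {feedback}")       # proceed to the next step of the behavior
else:
    print(f"Recover: {feedback}")  # retry, or report failure to the agent
```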

Examples

Go Home Skill

~/skills/go_home.py

from innate.skill import Skill
from innate import navigation, manipulation
import numpy as np
from typing import Tuple

class GoHome(Skill):
    """
    Returns the robot to its home/base pose safely by coordinating
    manipulation and navigation.
    """

    def __init__(self):
        super().__init__()
        navigation.init()
        manipulation.init()

    def guidelines(self) -> str:
        return """
        Use this skill when:
        - Robot needs to return to its home position
        - Robot needs to reset its configuration
        - Starting a new set of tasks

        Do not use when:
        - Robot is carrying objects
        - Path to home is blocked
        """

    def execute(self) -> Tuple[str, bool]:
        try:
            # 1) Move arm to a safe zero/park joint configuration
            home_joints = np.zeros(6)  # Adjust DOF as needed
            manipulation.set_joint_pose(home_joints)

            # 2) Navigate to absolute 'home' pose
            navigation.go_to_pose(0.0, 0.0, 0.0)

            return "Successfully returned to home position", True
        except Exception as e:
            return f"Failed to return home: {e}", False

    def interrupt(self):
        # Stop all motion channels
        navigation.interrupt()
        manipulation.interrupt()

Pick Trash Skill

~/skills/pick_trash.py

from innate.skill import Skill
from innate import navigation, manipulation
import time
from typing import Tuple

class PickTrash(Skill):
    """
    Vision-guided picking of a described trash item.
    Combines in-sight navigation with a grasping policy.
    """

    def __init__(self):
        super().__init__()
        navigation.init()
        manipulation.init()

    def guidelines(self) -> str:
        return """
        Use this skill when:
        - You need to pick up trash or debris
        - The trash item is visible to the robot
        - The item is within the robot's manipulation range

        Do not use when:
        - The trash is too heavy (>200 g)
        - The trash is hazardous material
        - Multiple items need to be picked at once
        - The item is not clearly visible
        """

    def execute(self, description: str) -> Tuple[str, bool]:
        try:
            # 1) Move to a vantage point where item is in view
            navigation.go_to_in_sight(description, distance=0.2)

            # 2) Pre-grasp pose (example values; tune per robot)
            manipulation.set_ee_pose({"x": 0.30, "y": 0.00, "z": 0.10})
            time.sleep(0.5)

            # 3) Run a policy named "pick_item"
            manipulation.run_policy("pick_item")

            return f"Successfully picked up {description}", True
        except Exception as e:
            return f"Failed to pick up {description}: {e}", False

    def interrupt(self):
        # Halt movement and open gripper for safety
        navigation.interrupt()
        manipulation.interrupt()
        manipulation.set_gripper_pressure(0.0)

Capture and Upload Image Skill

~/skills/capture_and_upload.py

from innate.skill import Skill
from innate import sensors
from typing import Tuple
from datetime import datetime
import io
import cv2
import numpy as np
from PIL import Image

from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseUpload
from google.oauth2 import service_account


class CaptureAndUpload(Skill):
    """
    Capture an RGB image from the robot sensors and upload it
    to Google Drive (or Google Photos with a different API call).
    """

    def __init__(self):
        super().__init__()
        sensors.init()

        # Configure Google API (using a service account JSON)
        SCOPES = ["https://www.googleapis.com/auth/drive.file"]
        SERVICE_ACCOUNT_FILE = "/path/to/credentials.json"
        creds = service_account.Credentials.from_service_account_file(
            SERVICE_ACCOUNT_FILE, scopes=SCOPES
        )
        self.drive_service = build("drive", "v3", credentials=creds)

    def guidelines(self) -> str:
        return """
        Use this skill when:
        - You need to capture a visual snapshot from the robot
        - You want the image archived to cloud storage (Drive/Photos)

        Do not use when:
        - Sensors are not initialized
        - Network connectivity is unavailable
        """

    def execute(self) -> Tuple[str, bool]:
        try:
            # 1) Capture RGB image
            rgb_image, _ = sensors.get_rgbd()

            # 2) Convert to JPEG in memory
            image = Image.fromarray(cv2.cvtColor(rgb_image, cv2.COLOR_BGR2RGB))
            buffer = io.BytesIO()
            image.save(buffer, format="JPEG")
            buffer.seek(0)

            # 3) Upload to Drive
            timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
            file_metadata = {"name": f"robot_capture_{timestamp}.jpg"}
            media = MediaIoBaseUpload(buffer, mimetype="image/jpeg")

            uploaded = (
                self.drive_service.files()
                .create(body=file_metadata, media_body=media, fields="id")
                .execute()
            )

            return f"Image captured and uploaded to Drive (file ID: {uploaded['id']})", True
        except Exception as e:
            return f"Failed to capture/upload image: {e}", False

    def interrupt(self):
        # No long-running process; nothing special needed
        pass
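
One detail worth noting in the skill above: buffer.seek(0) rewinds the in-memory file before it is handed to the upload helper, which reads from the buffer's current position. A quick stdlib-only illustration of why the rewind is needed:

```python
import io

buf = io.BytesIO()
buf.write(b"\xff\xd8\xff")  # stand-in for JPEG bytes
# After writing, the cursor sits at the end, so a read returns nothing:
assert buf.read() == b""
buf.seek(0)                 # rewind, as done before the Drive upload
assert buf.read() == b"\xff\xd8\xff"
```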