Home

We are Innate,

We don’t want robots confined to factories or faceless logistics centers. We want robots that live in the world with us—machines you can teach, shape, and improve over time.

Our first robot, available now, is Mars, the first Personal AI Robot. Powered by its Embodied AI agent BASIC, Mars can reason, memorize, and act in the real world, and it is programmed through code, demonstrations, and prompting.

Follow us on Discord! We'll be posting more frequent updates and organizing events there.

Setup

Coming Soon

Simulation

Coming Soon

Get started

Mars and BASIC introduce a new paradigm in programming robots through code, demonstrations and language.

BASIC functions through Skills and Behaviors. Skills are atomic capabilities built for Mars. They can be digital (send an email) or physical (pick up a sock, navigate to a room). Skills can be written as pure code using the SDK, as trained policies for manipulation and navigation, or as a combination of both.

Behaviors give BASIC the ability to compose atomic Skills into complex, long-horizon tasks, wrapping them together with reasoning, memory, and decision-making capabilities.

Example: Security Robot

This is a quick walk-through of how Mars can be programmed to patrol a house and send alerts over email.

1. Installation

Install the Innate SDK on your workstation, or download the Innate app from your app store:

pip install innate-sdk 

You can start coding right after installation by defining files in ~/skills and ~/behaviors.
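
For example, a minimal skill file placed in ~/skills could look like the sketch below. It uses the Skill interface described in the Skills section; the skill body itself is just a placeholder to verify your setup.

# ~/skills/wave_hello.py -- placeholder skill to check that your setup works
from innate.skill import Skill
from typing import Tuple

class WaveHello(Skill):
    def guidelines(self) -> str:
        return "Use this skill to greet a person the robot can see."

    def execute(self) -> Tuple[str, bool]:
        # Replace with real logic (motion, API calls, etc.)
        return "Waved hello", True

    def interrupt(self):
        # Nothing long-running to stop in this placeholder
        pass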

2. Train Skill to Open Doors

This is best done through the app: collect a new dataset of demonstrations, choose your parameters, and train on our servers.
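
Once training finishes, the trained policy can also be wrapped in a code Skill and invoked with manipulation.run_policy, the same call used in the Pick Trash example later in these docs. Below is a minimal sketch; it assumes the policy was saved under the name "open_door".

# ~/skills/open_door.py -- wraps a trained manipulation policy (sketch)
from innate.skill import Skill
from innate import manipulation
from typing import Tuple

class OpenDoor(Skill):
    def __init__(self):
        super().__init__()
        manipulation.init()

    def guidelines(self) -> str:
        return """
        Use this skill when:
        - The robot is facing a closed door within reach

        Do not use when:
        - The door is locked or blocked
        """

    def execute(self) -> Tuple[str, bool]:
        try:
            # "open_door" is assumed to be the name given to the trained policy
            manipulation.run_policy("open_door")
            return "Door opened", True
        except Exception as e:
            return f"Failed to open door: {e}", False

    def interrupt(self):
        manipulation.interrupt()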

3. Email Skill

In code, add a Skill that gives the robot the ability to send emails to different users. Define guidelines for how to use the skill and provide a method to execute it.

"""
Example: Minimal SendEmail skill
----------------------------------
This snippet demonstrates how to build a simple Skill class
that sends an email using SMTP. It is designed for documentation
purposes: clean, minimal, and easy to follow.
"""

import os, smtplib
from email.mime.text import MIMEText
from brain_client.primitives.types import Skill, SkillResult


class SendEmail(Skill):
    # Every skill should define a name property
    @property
    def name(self):
        return "send_email"

    # Provide short usage guidelines
    def guidelines(self):
        return "Send an emergency email. Provide subject, message, and recipients."

    def __init__(self):
        super().__init__()  # Initialize the parent Skill class
        # Configure SMTP server and credentials via environment variables
        self.smtp_server = "smtp.gmail.com"
        self.smtp_port = 587
        self.smtp_user = os.getenv("SMTP_USER", "")
        self.smtp_pass = os.getenv("SMTP_PASS", "")

    # The core method: what happens when the skill is executed
    def execute(self, subject: str, message: str, recipients):
        # Allow either a single recipient string or a list
        if isinstance(recipients, str):
            recipients = [recipients]

        # Build the email
        msg = MIMEText(message)
        msg["From"] = self.smtp_user
        msg["To"] = ", ".join(recipients)
        msg["Subject"] = subject

        # Connect to the SMTP server and send the message
        with smtplib.SMTP(self.smtp_server, self.smtp_port) as server:
            server.starttls()
            server.login(self.smtp_user, self.smtp_pass)
            server.sendmail(self.smtp_user, recipients, msg.as_string())

        # Return a simple success result
        return f"Email sent to {', '.join(recipients)}", SkillResult.SUCCESS

4. Create a Behavior

A Behavior describes the purpose of the robot during its execution. Define through prompting what the robot should do in different situations, and list the Skills it can use.

from innate.behavior import Behavior
from typing import List

class SecurityGuard(Behavior):
    def name(self) -> str:
        return "security_guard_behavior"

    def get_skills(self) -> List[str]:
        return ["navigate", "open_door", "send_email"]

    def get_prompt(self) -> str:
        return """You are a security guard robot. Patrol the house,
        stay alert and professional, watch for intruders, open doors
        when needed, and send an email immediately if you find one."""

5. Run the Behavior

New Behaviors are automatically registered. You can run them from the app, or activate them from the command line with innate behavior activate <behavior_name> (see the Behaviors section).

6. Share

Behaviors and Skills are shareable and work across robots. Show the world what you've built.

Mars is designed for builders who want to push the boundaries of what's possible. Whether you're building a 3D printing assistant or something entirely new, we can't wait to see what you'll create.

Manipulation

Coming Soon

Skills

Skills are atomic robot capabilities that are combined and chained by BASIC to enable complex, long-horizon behaviors. Skills can encode physical capabilities (manipulation, navigation, etc.), digital capabilities (emails, data processing, API calls), or both.

Defining a skill requires three main components that BASIC can interact with:

  1. Guidelines: Provides BASIC with language instructions for when and how to use the capabilities encoded in the Skill.

  2. Execution: Provides a method that BASIC calls to execute the encoded capability.

  3. Cancellation: Provides a method for BASIC to safely and reactively cancel execution when appropriate.

Defining Skills

Below is a sample skill.

from innate.skill import Skill
from typing import Tuple

class MySkill(Skill):
    def __init__(self):
        """
        Initialize the skill
        """
        super().__init__()  # Required: initialize parent class

    def guidelines(self) -> str:
        """
        Define usage guidelines for when this skill should be used
        Returns: string describing use cases and restrictions
        """
        return """
        Use this skill when:
        - [Describe when to use this skill]
        - [List relevant conditions]

        Do not use when:
        - [Describe when not to use this skill]
        - [List restrictions or limitations]
        """

    def execute(self) -> Tuple[str, bool]:
        """
        Main execution logic
        Returns: (feedback string, success boolean)
        """
        try:
            # Implement your skill’s logic here
            return "Task completed successfully", True
        except Exception:
            return "Task failed", False

    def interrupt(self):
        """
        Define how to safely stop execution
        """
        # Implement safe stopping behavior here

Examples

Go Home Skill

~/skills/go_home.py

from innate.skill import Skill
from innate import navigation, manipulation
import numpy as np
from typing import Tuple

class GoHome(Skill):
    """
    Returns the robot to its home/base pose safely by coordinating
    manipulation and navigation.
    """

    def __init__(self):
        super().__init__()
        navigation.init()
        manipulation.init()

    def guidelines(self) -> str:
        return """
        Use this skill when:
        - Robot needs to return to its home position
        - Robot needs to reset its configuration
        - Starting a new set of tasks

        Do not use when:
        - Robot is carrying objects
        - Path to home is blocked
        """

    def execute(self) -> Tuple[str, bool]:
        try:
            # 1) Move arm to a safe zero/park joint configuration
            home_joints = np.zeros(6)  # Adjust DOF as needed
            manipulation.set_joint_pose(home_joints)

            # 2) Navigate to absolute 'home' pose
            navigation.go_to_pose(0.0, 0.0, 0.0)

            return "Successfully returned to home position", True
        except Exception as e:
            return f"Failed to return home: {e}", False

    def interrupt(self):
        # Stop all motion channels
        navigation.interrupt()
        manipulation.interrupt()

Pick Trash Skill

~/skills/pick_trash.py

from innate.skill import Skill
from innate import navigation, manipulation
import time
from typing import Tuple

class PickTrash(Skill):
    """
    Vision-guided picking of a described trash item.
    Combines in-sight navigation with a grasping policy.
    """

    def __init__(self):
        super().__init__()
        navigation.init()
        manipulation.init()

    def guidelines(self) -> str:
        return """
        Use this skill when:
        - You need to pick up trash or debris
        - The trash item is visible to the robot
        - The item is within the robot's manipulation range

        Do not use when:
        - The trash is too heavy (>200 g)
        - The trash is hazardous material
        - Multiple items need to be picked at once
        - The item is not clearly visible
        """

    def execute(self, description: str) -> Tuple[str, bool]:
        try:
            # 1) Move to a vantage point where item is in view
            navigation.go_to_in_sight(description, distance=0.2)

            # 2) Pre-grasp pose (example values; tune per robot)
            manipulation.set_ee_pose({"x": 0.30, "y": 0.00, "z": 0.10})
            time.sleep(0.5)

            # 3) Run a policy named "pick_item"
            manipulation.run_policy("pick_item")

            return f"Successfully picked up {description}", True
        except Exception as e:
            return f"Failed to pick up {description}: {e}", False

    def interrupt(self):
        # Halt movement and open gripper for safety
        navigation.interrupt()
        manipulation.interrupt()
        manipulation.set_gripper_pressure(0.0)

Capture and Upload Image Skill

~/skills/capture_and_upload.py

from innate.skill import Skill
from innate import sensors
from typing import Tuple
from datetime import datetime
import io
import cv2
import numpy as np
from PIL import Image

from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseUpload
from google.oauth2 import service_account


class CaptureAndUpload(Skill):
    """
    Capture an RGB image from the robot sensors and upload it
    to Google Drive (or Google Photos with a different API call).
    """

    def __init__(self):
        super().__init__()
        sensors.init()

        # Configure Google API (using a service account JSON)
        SCOPES = ["https://www.googleapis.com/auth/drive.file"]
        SERVICE_ACCOUNT_FILE = "/path/to/credentials.json"
        creds = service_account.Credentials.from_service_account_file(
            SERVICE_ACCOUNT_FILE, scopes=SCOPES
        )
        self.drive_service = build("drive", "v3", credentials=creds)

    def guidelines(self) -> str:
        return """
        Use this skill when:
        - You need to capture a visual snapshot from the robot
        - You want the image archived to cloud storage (Drive/Photos)

        Do not use when:
        - Sensors are not initialized
        - Network connectivity is unavailable
        """

    def execute(self) -> Tuple[str, bool]:
        try:
            # 1) Capture RGB image
            rgb_image, _ = sensors.get_rgbd()

            # 2) Convert to JPEG in memory
            image = Image.fromarray(cv2.cvtColor(rgb_image, cv2.COLOR_BGR2RGB))
            buffer = io.BytesIO()
            image.save(buffer, format="JPEG")
            buffer.seek(0)

            # 3) Upload to Drive
            timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
            file_metadata = {"name": f"robot_capture_{timestamp}.jpg"}
            media = MediaIoBaseUpload(buffer, mimetype="image/jpeg")

            uploaded = (
                self.drive_service.files()
                .create(body=file_metadata, media_body=media, fields="id")
                .execute()
            )

            return f"Image captured and uploaded to Drive (file ID: {uploaded['id']})", True
        except Exception as e:
            return f"Failed to capture/upload image: {e}", False

    def interrupt(self):
        # No long-running process; nothing special needed
        pass

Overview

BASIC is the embodied AI agent that controls Mars. BASIC gives Mars the ability to reason, memorize, plan, make decisions, and use physical and digital skills.

Sensors

Overview

Mars features a comprehensive sensor suite consisting of three primary sensors:

Forward-Facing RGBD Camera

  • High-quality depth perception with 7.5cm stereo baseline

  • 150° diagonal field of view

  • Effective depth range: 40cm - 6m

  • Depth accuracy: <2% error up to 3.5m, <4% to 6.5m, <6% to 9m

Gripper-Mounted RGB Camera

  • 160° diagonal field of view

  • 2MP resolution (1920x1080)

  • 30 FPS at full resolution

  • Ideal for close-range manipulation tasks and visual servoing

2D LiDAR

  • 360° scanning coverage

  • Range: 0.15m - 12m

  • Angular resolution: ≤1°

  • Distance resolution: <0.5mm

  • Scan rate: 5.5Hz

  • Primary sensor for SLAM and navigation

This sensor configuration enables robust environmental perception, precise manipulation, and reliable navigation through complementary sensing modalities. Each sensor's data can be accessed and processed through the Innate SDK.


CLI Access

Stream Sensor Data

View live sensor data streams in a visualization window:

# Stream RGBD camera
innate sensor play rgbd
# Shows color and depth streams in separate windows

# Stream gripper camera
innate sensor play gripper
# Shows color stream from gripper camera

# Stream LiDAR data
innate sensor play lidar
# Shows 2D scan visualization

Capture Sensor Data

Save sensor data snapshots to file:

# Capture RGBD data
innate sensor capture rgbd
# Saves color image as {timestamp}_rgb.png and depth as {timestamp}_depth.png

# Capture gripper camera image
innate sensor capture gripper
# Saves image as {timestamp}_gripper.png

# Capture LiDAR scan
innate sensor capture lidar
# Saves scan data as {timestamp}_scan.txt

By default, captures are saved in the current working directory. Use the optional --output flag to specify a different save location:

innate sensor capture rgbd --output /path/to/directory


Python SDK

Setup

from innate import sensors
sensors.init()

RGBD Camera

# Get current RGBD data
rgb_image, depth_image = sensors.get_rgbd()
# Returns:
#   rgb_image: PIL.Image - RGB image (1280x800)
#   depth_image: PIL.Image - 16-bit depth image (1280x800)
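
As a quick sketch of working with the depth stream, the snippet below reads the depth value at the image center. It assumes the 16-bit depth image encodes distance in millimeters; verify the scale against your own captures.

import numpy as np

rgb_image, depth_image = sensors.get_rgbd()
depth = np.asarray(depth_image)          # 16-bit depth values as a 2D array
h, w = depth.shape
center_mm = int(depth[h // 2, w // 2])   # assumed to be millimeters
print(f"Distance at image center: {center_mm / 1000:.2f} m")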

Gripper Camera

# Get current gripper camera image
gripper_image = sensors.get_gripper()
# Returns: PIL.Image - RGB image (1920x1080)

LiDAR

# Get current LiDAR scan
scan = sensors.get_lidar()
# Returns: numpy.ndarray - Array of distances in meters
# Length: 360 elements (one per degree)
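
As a small sketch, the scan can be used to find the nearest obstacle and its bearing. This assumes index 0 corresponds to 0° and that invalid readings come back as 0 or infinity; check against your own scans.

import numpy as np

scan = sensors.get_lidar()
valid = np.where(np.isfinite(scan) & (scan > 0), scan, np.inf)  # mask invalid readings
nearest_deg = int(np.argmin(valid))        # index assumed to map to degrees
nearest_dist = float(valid[nearest_deg])
print(f"Nearest obstacle: {nearest_dist:.2f} m at {nearest_deg} deg")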

Custom Sensors

Mars provides two powered USB 3.0 ports for connecting additional sensors and peripherals.

Port Specifications

  • 2x USB 3.0 ports

  • Power output: 1.2 Amps per port

  • Data transfer rate: Up to 5 Gbps

  • Hot-swappable

  • Full compatibility with USB 2.0 devices
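
As an example of using these ports, the sketch below grabs a single frame from a generic USB (UVC) webcam with OpenCV. The device index 0 is an assumption and depends on what else is plugged in.

import cv2

cap = cv2.VideoCapture(0)   # device index is an assumption
ok, frame = cap.read()
if ok:
    cv2.imwrite("usb_camera_frame.jpg", frame)
else:
    print("Could not read from the USB camera")
cap.release()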

Navigation

Coming Soon

Hardware

Mars is designed from the ground up to be extensible. All sensors and actuators come pre-calibrated. Additional powered USB ports, GPIO, and power terminals are exposed so you can add capabilities and features.

Behaviors

Behaviors steer BASIC to perform complex, long-horizon tasks. A Behavior defines the Skills that BASIC can combine, as well as language instructions for how to use them.

Defining Behaviors

Defining a Behavior requires two components:

  1. List of Skills that BASIC has access to

  2. A natural language prompt that instructs BASIC how to perform the task

from innate.behavior import Behavior
from typing import List


class BasicBehavior(Behavior):
    def name(self) -> str:
        """
        Name used to reference this behavior (e.g. from the CLI or app).
        """
        return "basic_behavior"

    def get_skills(self) -> List[str]:
        """
        List the Skills this behavior can call.
        Return: list of skill names (snake_case)
        """
        return [
            "skill_one",
            "skill_two",
        ]

    def get_prompt(self) -> str:
        """
        Define how BASIC should use the Skills.
        Return: instructions shown to the agent.
        """
        return """You are Mars, a capable robot. You have these skills available:

        skill_one:
        - Used for [purpose]
        - Parameters: [list parameters if any]
        - Use when [describe situations]

        skill_two:
        - Used for [purpose]
        - Parameters: [list parameters if any]
        - Use when [describe situations]

        For each request:
        1. Choose the appropriate skill
        2. Extract any needed parameters
        3. Execute the skill
        4. Provide clear, concise feedback"""

Using Behaviors

Option 1: Command Line (Innate SDK CLI)

# Activate a behavior
innate behavior activate <behavior_name>


# Deactivate all behaviors
innate behavior deactivate

Option 2: Mars App

  1. Connect to your robot in the Mars app

  2. On the Home screen, locate the Behaviors section

  3. Tap a behavior to activate it

Examples

Clean Socks Behavior

class CleanSocks(Behavior):
    def name(self) -> str:
        return "clean_socks_behavior"

    def get_skills(self) -> List[str]:
        return ["navigate", "pick_up_sock", "drop_socks"]

    def get_prompt(self) -> str:
        return """You are a sock-cleaning robot. Search the room for socks.
        When you see one, go to it, pick it up, and put it in the wooden box.
        Keep doing this until there are no more socks."""

Security Guard Behavior

class SecurityGuard(Behavior):
    def name(self) -> str:
        return "security_guard_behavior"

    def get_skills(self) -> List[str]:
        return ["navigate", "open_door", "send_email"]

    def get_prompt(self) -> str:
        return """You are a security guard robot. Patrol the house,
        stay alert and professional, watch for intruders,
        open doors when needed, and send an email immediately
        if you find one."""