claude-skill-registry / ai-assistants

AI-powered development tools: configuration and usage.

Install the full registry, or copy just this skill into `~/.claude/skills`:

```shell
# Clone the whole registry
git clone https://github.com/majiayu000/claude-skill-registry

# Or copy only this skill
T=$(mktemp -d) && git clone --depth=1 https://github.com/majiayu000/claude-skill-registry "$T" \
  && mkdir -p ~/.claude/skills \
  && cp -r "$T/skills/data/ai-assistants" ~/.claude/skills/majiayu000-claude-skill-registry-ai-assistants \
  && rm -rf "$T"
```
# AI Assistants
This environment includes AI-powered development tools to enhance your ROS2 development workflow.
## Access Methods
There are two ways to access AI tools in this environment:
| Method | Commands Available | Requires |
|---|---|---|
| Devshell | `aichat`, `aider` | Entering the devshell |
| Home-Manager | All aliases (`ai-code`, `pair-voice`, etc.) | Module enablement |
Note: The devshell commands are always available when you enter the development environment. The extended aliases require home-manager module configuration.
## aichat - Foundation AI CLI

`aichat` is the default AI assistant: a tiny, provider-agnostic CLI that works with multiple AI providers.
### Supported Providers
| Provider | Model Examples | API Key Env Var |
|---|---|---|
| Anthropic | claude-3-opus, claude-3-sonnet | `ANTHROPIC_API_KEY` |
| OpenAI | gpt-4, gpt-4-turbo, gpt-3.5-turbo | `OPENAI_API_KEY` |
| Google | gemini-pro, gemini-1.5-pro | `GOOGLE_API_KEY` |
| Ollama | llama2, codellama, mistral | (local, no key) |
| Azure OpenAI | gpt-4, gpt-35-turbo | `AZURE_OPENAI_API_KEY` |
### Quick Start
```shell
# Set your API key (choose your provider)
export ANTHROPIC_API_KEY="your-key-here"
# or
export OPENAI_API_KEY="your-key-here"

# Basic usage
ai "explain what ROS2 topics are"

# Code assistance
ai-code "write a ROS2 publisher node in Python"

# Code review
ai-review "review this launch file for best practices"

# Explain code
cat src/my_node.py | ai-explain
```
### Available Aliases
| Alias | Purpose |
|---|---|
| `ai` | General AI chat |
| `ai-code` | Code generation |
| `ai-explain` | Code explanation |
| `ai-review` | Code review |
### Configuration

`aichat` stores its configuration in `~/.config/aichat/config.yaml`:
```yaml
# Example configuration
model: claude        # Short model name (aichat resolves to latest)
save: true
highlight: true
temperature: 0.7

# Custom roles
roles:
  - name: ros2-expert
    prompt: |
      You are a ROS2 expert. Help with:
      - Node development (Python/C++)
      - Launch files
      - Message/Service definitions
      - Best practices for robotics
```
Note: `aichat` uses short model names (e.g., `claude`, `gpt-4`, `gemini-pro`) and automatically resolves them to the latest available version.
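The resolution step can be pictured as a lookup from a short alias to the provider's current default. A toy sketch (the mapping below is illustrative only; aichat's real alias table ships with the tool and changes as providers release new models):

```python
# Illustrative alias table, NOT aichat's actual mapping.
ALIASES = {
    "claude": "claude-3-opus-20240229",
    "gpt-4": "gpt-4-turbo",
    "gemini-pro": "gemini-1.5-pro",
}

def resolve_model(name: str) -> str:
    """Return the full model id for a short alias; unknown names
    (e.g. already-full ids or ollama refs) pass through unchanged."""
    return ALIASES.get(name, name)

print(resolve_model("claude"))            # resolves via the table
print(resolve_model("ollama:codellama"))  # passes through unchanged
```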
### Using with Ollama (Local Models)
For offline/private AI assistance:
```shell
# Install Ollama (if not already)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a coding model
ollama pull codellama

# Use with aichat
aichat --model ollama:codellama "write a ROS2 subscriber"
```
### ROS2-Specific Usage
```shell
# Explain a ROS2 concept
ai "explain ROS2 QoS profiles"

# Generate a launch file
ai-code "create a launch file that starts a camera node and image processor"

# Debug an error
ai "why am I getting 'could not find package' in colcon build"

# Review code
cat src/robot_controller/robot_controller/controller.py | ai-review
```
### Tips

- Pipe code for context: `cat file.py | ai "explain this"`
- Use roles for consistency: `ai-code` for generation, `ai-review` for feedback
- Save sessions: use `aichat -s session-name` to continue conversations
- Local models: use Ollama for private/offline work
## Aider - AI Pair Programming
Aider is a Git-integrated AI pair programmer that edits code in your repo with automatic commits.
### Quick Start
```shell
# Start aider in the current directory
pair

# Work on specific files
pair src/my_package/my_node.py

# Voice-to-code mode (requires portaudio)
pair-voice

# Watch mode - auto-commit on file changes
pair-watch

# Use a specific model
aider --model claude-3-sonnet-20240229
aider --model gpt-4-turbo
```
### Key Features
| Feature | Description |
|---|---|
| Git Integration | Auto-commits changes with descriptive messages |
| Repo Mapping | Understands your entire codebase structure |
| Voice Mode | Speak your coding requests |
| Watch Mode | Monitors files and auto-commits changes |
| 100+ Languages | Python, C++, Rust, TypeScript, etc. |
### Available Aliases
| Alias | Purpose |
|---|---|
| `pair` | Start AI pair programming |
| `pair-voice` | Voice-to-code mode |
| `pair-watch` | Auto-commit on changes |
| `pair-claude` | Use Claude |
| `pair-gpt4` | Use GPT-4 |
### ROS2-Specific Usage
```shell
# Edit a ROS2 node
pair src/my_robot/my_robot/controller.py
> "Add a service server that accepts velocity commands"

# Modify launch files
pair src/my_robot/launch/robot.launch.py
> "Add a parameter for robot_name"

# Update CMakeLists.txt
pair src/my_robot/CMakeLists.txt
> "Add the new action interface dependency"
```
### Configuration

Create `~/.aider.conf.yml`:
```yaml
# Default model
model: claude-3-sonnet-20240229

# Auto-commit settings
auto-commits: true
auto-lint: true

# Editor integration
edit-format: diff

# Voice settings (if using --voice)
voice-language: en

# Dark mode for terminal
dark-mode: true
```
### Environment Variables
```shell
# API keys (set in .envrc or shell profile)
export ANTHROPIC_API_KEY="sk-ant-..."   # For Claude
export OPENAI_API_KEY="sk-..."          # For GPT-4
export DEEPSEEK_API_KEY="..."           # For DeepSeek
```
### Voice Mode Requirements

Voice-to-code requires:

- `portaudio` (included in the devshell)
- Microphone access
- An API key for speech-to-text (uses the provider's audio API)
## Environment Variables

Add to your shell profile or `.envrc`:
```shell
# Choose one provider
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="..."

# Optional: Set the default model
export AICHAT_MODEL="claude-3-sonnet-20240229"
```
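The "choose one provider" pattern can be sketched as a first-match lookup over these variables. The precedence order below is an assumption for illustration, not aichat's or aider's documented behavior:

```python
import os

# Assumed precedence: first key found wins. Real tools apply their own
# config-file and flag precedence on top of environment variables.
PROVIDER_KEYS = [
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("openai", "OPENAI_API_KEY"),
    ("google", "GOOGLE_API_KEY"),
]

def detect_provider(env=None):
    """Return the first provider whose API key is set, else None."""
    env = os.environ if env is None else env
    for provider, var in PROVIDER_KEYS:
        if env.get(var):
            return provider
    return None  # fall back to a local backend such as Ollama

print(detect_provider({"OPENAI_API_KEY": "sk-..."}))  # -> openai
```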
## Home-Manager Configuration
If using home-manager, enable the AI modules to get all aliases:
```nix
{
  # Enable aichat with aliases (ai, ai-code, ai-explain, ai-review)
  programs.aichat = {
    enable = true;
    settings = {
      model = "claude";  # Short model name
      save = true;
      highlight = true;
    };
  };

  # Enable aider with aliases (pair, pair-voice, pair-watch, pair-claude, pair-gpt4)
  programs.aider = {
    enable = true;
    settings = {
      model = "claude-3-sonnet-20240229";
      auto-commits = true;
      dark-mode = true;
    };
  };
}
```
Module Locations:

- `modules/common/ai/aichat.nix` - aichat configuration
- `modules/common/ai/aider.nix` - aider configuration
- `modules/common/ai/default.nix` - AI module aggregator
## LocalAI - Local LLM Inference
LocalAI provides an OpenAI-compatible API server for running LLMs locally. It's the recommended inference backend for this environment.
### Quick Start
```shell
# Start LocalAI server
localai start

# Check status
localai status

# List available models
localai models

# Stop server
localai stop
```
### Features
| Feature | Description |
|---|---|
| OpenAI API | Drop-in replacement for OpenAI API |
| P2P Federation | Distributed inference across multiple machines |
| Model Formats | GGUF, GGML, Safetensors, HuggingFace |
| GPU Support | CUDA, ROCm, Metal acceleration |
| No Internet | Fully offline capable |
### Configuration

LocalAI uses the models directory at `~/.local/share/localai/models`.
```shell
# Set a custom models path
export LOCALAI_MODELS_PATH="/path/to/models"

# Download a model (example)
curl -L "https://huggingface.co/TheBloke/Mistral-7B-v0.1-GGUF/resolve/main/mistral-7b-v0.1.Q4_K_M.gguf" \
  -o ~/.local/share/localai/models/mistral-7b.gguf
```
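After downloading, you can sanity-check that the file really is GGUF by inspecting its 4-byte magic: GGUF files begin with the ASCII bytes `GGUF`. A minimal check:

```python
def is_gguf_magic(first_bytes: bytes) -> bool:
    """GGUF files start with the 4-byte magic b'GGUF'."""
    return first_bytes[:4] == b"GGUF"

def looks_like_gguf(path) -> bool:
    """Return True if the file at `path` starts with the GGUF magic."""
    with open(path, "rb") as f:
        return is_gguf_magic(f.read(4))

# Example (path is illustrative):
# looks_like_gguf("~/.local/share/localai/models/mistral-7b.gguf")
```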
### Integration with Other Tools
```shell
# Use LocalAI with aichat
export OPENAI_API_BASE="http://localhost:8080/v1"
aichat --model local-model "Hello"

# Use LocalAI with aider
OPENAI_API_BASE=http://localhost:8080/v1 aider
```
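Because LocalAI exposes the OpenAI chat-completions wire format, any HTTP client works, not just these tools. A minimal standard-library sketch (the base URL matches the examples above; the model name is an assumption for whatever you loaded):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"  # LocalAI's OpenAI-compatible endpoint

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat.completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str) -> str:
    """POST the request to LocalAI and return the first reply."""
    payload = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# chat("mistral-7b", "Hello")  # requires `localai start` to be running
```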
### Port Configuration
| Port | Service |
|---|---|
| 8080 | LocalAI API |
Documentation: See `docs/adr/adr-006-agixt-integration.md` for architecture decisions.
## AGiXT - AI Agent Platform
AGiXT is a powerful AI Agent Automation Platform that enables building and orchestrating complex AI workflows.
### Quick Start
```shell
# Ensure LocalAI is running first
localai start

# Start AGiXT services
agixt up

# Check service status
agixt status

# View logs
agixt logs

# Stop services
agixt down
```
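The "LocalAI first" rule is just a dependency edge between services. A small sketch computing a safe start order from such edges (the dependency set below is an assumption mirroring the instruction above, not a description of what `agixt up` actually does):

```python
from graphlib import TopologicalSorter

# Assumed edges: each service maps to the services it must wait for.
DEPENDS_ON = {
    "agixt": {"localai", "postgres", "minio"},
    "localai": set(),
    "postgres": set(),
    "minio": set(),
}

def start_order(deps):
    """Return service names ordered so dependencies start first."""
    return list(TopologicalSorter(deps).static_order())

print(start_order(DEPENDS_ON))  # "agixt" comes after its dependencies
```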
### Architecture

```
┌─────────────────────────────────────────────────────────────┐
│                         AGiXT Stack                         │
├─────────────┬─────────────┬─────────────┬───────────────────┤
│  AGiXT API  │  AGiXT UI   │ PostgreSQL  │      MinIO        │
│    :7437    │    :3437    │    :5432    │   :9000/:9001     │
└──────┬──────┴──────┬──────┴──────┬──────┴─────────┬─────────┘
       │             │             │                │
       └─────────────┴─────────────┴────────────────┘
                            │
                    ┌───────┴───────┐
                    │    LocalAI    │
                    │     :8080     │
                    └───────────────┘
```
### Port Configuration
| Port | Service |
|---|---|
| 7437 | AGiXT API |
| 3437 | AGiXT UI |
| 5432 | PostgreSQL |
| 9000 | MinIO API |
| 9001 | MinIO Console |
| 8080 | LocalAI (on host) |
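A quick way to see which of these services are actually up is a TCP connect per port. A minimal checker (host and timeout are assumptions; a successful connect only proves something is listening, not that it is healthy):

```python
import socket

def is_port_open(port: int, host: str = "localhost", timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Port map from the table above
SERVICES = {7437: "AGiXT API", 3437: "AGiXT UI", 5432: "PostgreSQL",
            9000: "MinIO API", 9001: "MinIO Console", 8080: "LocalAI"}

for port, name in SERVICES.items():
    status = "up" if is_port_open(port) else "down"
    print(f"{name:14} :{port}  {status}")
```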
### Environment Variables
```shell
# .env.agixt or exported
export AGIXT_URL="http://localhost:7437"
export AGIXT_API_KEY="agixt-dev-key"
export LOCALAI_URL="http://localhost:8080"
```
### Management Commands
```shell
# Full command reference
agixt up      # Start all services
agixt down    # Stop all services
agixt logs    # Follow logs
agixt status  # Show container status
agixt shell   # Shell into the AGiXT container
```
### ROS2 Integration

The AGiXT Rust SDK bridge (`rust/agixt-bridge/`) enables ROS2 nodes to communicate with AGiXT:
```shell
# Build the bridge
cd rust/agixt-bridge
cargo build

# Run the example
cargo run --example basic_chat
```
Key files:

- `rust/agixt-bridge/` - Rust SDK integration
- `docker-compose.agixt.yml` - Docker Compose configuration
- `.env.agixt.example` - Environment template
- `docs/adr/adr-006-agixt-integration.md` - Architecture decision record
## Related Skills
- Distributed Systems - NATS, Temporal
- Observability - Monitoring AI services
- Rust Tooling - AGiXT Rust SDK development