Claude-skill-registry-data mistral-hello-world

install
source · Clone the upstream repo
git clone https://github.com/majiayu000/claude-skill-registry-data
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/majiayu000/claude-skill-registry-data "$T" && mkdir -p ~/.claude/skills && cp -r "$T/data/mistral-hello-world" ~/.claude/skills/majiayu000-claude-skill-registry-data-mistral-hello-world && rm -rf "$T"
manifest: data/mistral-hello-world/SKILL.md
source content

Mistral AI Hello World

Overview

A minimal working example demonstrating core Mistral AI chat-completion functionality, in both TypeScript and Python.

Prerequisites

  • Completed mistral-install-auth setup
  • Valid API credentials configured
  • Development environment ready
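
Before running the examples, it can help to fail fast when credentials are missing. A minimal Python sketch; the `check_api_key` helper is illustrative, not part of the SDK:

```python
import os
import sys

def check_api_key():
    """Exit with a clear message if MISTRAL_API_KEY is not configured."""
    key = os.environ.get("MISTRAL_API_KEY")
    if not key:
        sys.exit("MISTRAL_API_KEY is not set; complete mistral-install-auth first.")
    return key
```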

Instructions

Step 1: Create Entry File

TypeScript (hello-mistral.ts)

import { Mistral } from '@mistralai/mistralai';

const client = new Mistral({
  apiKey: process.env.MISTRAL_API_KEY,
});

async function main() {
  const response = await client.chat.complete({
    model: 'mistral-small-latest',
    messages: [
      { role: 'user', content: 'Say "Hello, World!" in a creative way.' }
    ],
  });

  console.log(response.choices?.[0]?.message?.content);
}

main().catch(console.error);

Python (hello_mistral.py)

import os
from mistralai import Mistral

client = Mistral(api_key=os.environ.get("MISTRAL_API_KEY"))

def main():
    response = client.chat.complete(
        model="mistral-small-latest",
        messages=[
            {"role": "user", "content": "Say 'Hello, World!' in a creative way."}
        ],
    )

    print(response.choices[0].message.content)

if __name__ == "__main__":
    main()

Step 2: Run the Example

# TypeScript
npx tsx hello-mistral.ts

# Python
python hello_mistral.py

Step 3: Streaming Response (Advanced)

TypeScript Streaming

import { Mistral } from '@mistralai/mistralai';

const client = new Mistral({
  apiKey: process.env.MISTRAL_API_KEY,
});

async function streamChat() {
  const stream = await client.chat.stream({
    model: 'mistral-small-latest',
    messages: [
      { role: 'user', content: 'Tell me a short story about AI.' }
    ],
  });

  for await (const event of stream) {
    const content = event.data?.choices?.[0]?.delta?.content;
    if (content) {
      process.stdout.write(content);
    }
  }
  console.log(); // newline
}

streamChat().catch(console.error);

Python Streaming

import os
from mistralai import Mistral

client = Mistral(api_key=os.environ.get("MISTRAL_API_KEY"))

def stream_chat():
    stream = client.chat.stream(
        model="mistral-small-latest",
        messages=[
            {"role": "user", "content": "Tell me a short story about AI."}
        ],
    )

    for event in stream:
        content = event.data.choices[0].delta.content
        if content:
            print(content, end="", flush=True)
    print()  # newline

if __name__ == "__main__":
    stream_chat()

Output

  • Working code file with Mistral client initialization
  • Successful API response with generated text
  • Console output showing:
Hello, World!

(But spoken by a million synchronized starlings,
spelling it across the twilight sky...)

Error Handling

Error          Cause                 Solution
Import Error   SDK not installed     Verify with npm list @mistralai/mistralai
Auth Error     Invalid credentials   Check that MISTRAL_API_KEY is set
Timeout        Network issues        Increase the timeout or check connectivity
Rate Limit     Too many requests     Wait and retry with exponential backoff
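
The rate-limit row suggests retrying with exponential backoff. A generic Python sketch of that pattern; `retry_with_backoff` is an illustrative helper, not part of the SDK:

```python
import random
import time

def retry_with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry a zero-argument callable, doubling the delay after each failure.

    A small random jitter is added to each delay to avoid synchronized
    retries; the last failure is re-raised once max_retries is exhausted.
    """
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)
```

Usage would wrap the completion call, e.g. `retry_with_backoff(lambda: client.chat.complete(model="mistral-small-latest", messages=messages))`.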

Examples

Multi-turn Conversation

const messages = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is the capital of France?' },
];

const response1 = await client.chat.complete({
  model: 'mistral-small-latest',
  messages,
});

// Add assistant response to conversation
messages.push({
  role: 'assistant',
  content: response1.choices?.[0]?.message?.content || '',
});

// Continue conversation
messages.push({ role: 'user', content: 'What about Germany?' });

const response2 = await client.chat.complete({
  model: 'mistral-small-latest',
  messages,
});

console.log(response2.choices?.[0]?.message?.content);
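
The multi-turn pattern above (append each assistant reply, then the next user message) can be wrapped in a small helper that accumulates history. A Python sketch; the `Conversation` class is illustrative, not part of the SDK:

```python
class Conversation:
    """Accumulates chat history so every request carries full context."""

    def __init__(self, client, model="mistral-small-latest", system=None):
        self.client = client
        self.model = model
        self.messages = []
        if system:
            self.messages.append({"role": "system", "content": system})

    def ask(self, text):
        # Append the user turn, call the API, then record the reply
        # so the next call sees the whole conversation.
        self.messages.append({"role": "user", "content": text})
        response = self.client.chat.complete(
            model=self.model, messages=self.messages
        )
        reply = response.choices[0].message.content
        self.messages.append({"role": "assistant", "content": reply})
        return reply
```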

With Temperature Control

const response = await client.chat.complete({
  model: 'mistral-small-latest',
  messages: [{ role: 'user', content: 'Write a haiku about coding.' }],
  temperature: 0.7, // 0-1, higher = more creative
  maxTokens: 100,
});

Next Steps

Proceed to mistral-local-dev-loop for development workflow setup.