A React wrapper for Mistral AI with hooks and customizable chat components. Features theme customization, streaming support, and TypeScript definitions.

mistral-react

Disclaimer: This is an unofficial community-driven wrapper for Mistral AI. It is not affiliated with or endorsed by Mistral AI.

React hooks and components for integrating Mistral AI into your React applications.

Features

  • 🎣 React Hooks - Easy-to-use hooks for chat, completion, and streaming
  • ⚛️ React Context - Provider pattern for managing Mistral AI configuration
  • 📦 TypeScript - Full TypeScript support with type definitions
  • 🔄 Streaming - Real-time streaming support for chat responses
  • 🎨 Flexible - Customizable options for temperature, tokens, and more
  • 🧪 Tested - Comprehensive test coverage with Jest

Installation

npm install mistral-react

Try the Example

# Install dependencies
npm install

# Install example dependencies
npm run example:install

# Run the example app
npm run example

Open http://localhost:3000 and enter your Mistral API key to start chatting!

Quick Start

1. Wrap your app with MistralProvider

import { MistralProvider } from 'mistral-react';

function App() {
  return (
    <MistralProvider apiKey="your-mistral-api-key">
      <YourComponents />
    </MistralProvider>
  );
}

2. Use the MistralChat Component

The simplest way to add chat to your app:

import { MistralProvider, MistralChat } from 'mistral-react';

function App() {
  return (
    <MistralProvider apiKey="your-api-key">
      <MistralChat />
    </MistralProvider>
  );
}

3. Customize with Themes

The theme prop accepts six customizable style objects:

<MistralChat
  theme={{
    // Customize user messages
    userMessage: {
      bgColor: '#4CAF50',
      textColor: '#fff',
      borderRadius: '18px',
      padding: '12px 16px',
      fontSize: '14px',
      fontFamily: 'Inter, sans-serif',
    },

    // Customize AI messages
    assistantMessage: {
      bgColor: '#2196F3',
      textColor: '#fff',
      borderRadius: '18px',
      padding: '12px 16px',
      fontSize: '14px',
      fontFamily: 'Inter, sans-serif',
    },

    // Customize the chat container
    container: {
      bgColor: '#f9f9f9',
      maxWidth: '800px',
      height: '600px',
      padding: '20px',
      borderRadius: '12px',
      boxShadow: '0 2px 10px rgba(0,0,0,0.1)',
    },

    // Customize the input field
    input: {
      bgColor: '#ffffff',
      textColor: '#212529',
      borderColor: '#ced4da',
      focusBorderColor: '#007bff',
      borderRadius: '24px',
      padding: '12px 16px',
      fontSize: '14px',
    },

    // Customize the clear button
    button: {
      bgColor: '#007bff',
      hoverBgColor: '#0056b3',
      textColor: '#ffffff',
      borderRadius: '24px',
      padding: '12px 24px',
      fontSize: '14px',
      fontWeight: '600',
    },

    // Customize the loading indicator
    loading: {
      bgColor: '#e9ecef',
      textColor: '#6c757d',
      borderRadius: '18px',
      padding: '12px 16px',
      fontSize: '14px',
      fontFamily: 'inherit',
      opacity: '0.7',
    },
  }}
/>

4. Custom Rendering

You can provide your own components to render messages and the loading indicator.

renderMessage

<MistralChat
  renderMessage={({ content, role, index }) => (
    <div style={{ padding: '8px', borderRadius: '8px', margin: '4px' }}>
      <strong>{role === 'user' ? 'You' : 'Mistral'}:</strong> {content}
    </div>
  )}
/>

renderLoading

<MistralChat
  renderLoading={() => (
    <div style={{ padding: '12px', color: '#888' }}>
      Please wait, the AI is thinking...
    </div>
  )}
/>

5. Event Callbacks

<MistralChat
  onMessageSent={(msg) => console.log('Message sent:', msg)}
  onReplyReceived={(reply) => console.log('Reply received:', reply)}
  onError={(error) => console.error('Error:', error)}
/>

Advanced Usage

Using Hooks Directly

Chat Hook

import { useMistralChat } from 'mistral-react';
import { useState } from 'react';

function ChatComponent() {
  const { messages, isLoading, sendMessage, clearMessages } = useMistralChat();
  const [input, setInput] = useState('');

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isLoading) return;

    const message = input;
    setInput('');
    
    await sendMessage(message, {
      model: 'mistral-small-latest',
      temperature: 0.7,
    });
  };

  return (
    <div>
      {messages.map((msg, idx) => (
        <div key={idx}>
          <strong>{msg.role}:</strong> {msg.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          type="text"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
          disabled={isLoading}
        />
      </form>
    </div>
  );
}

Completion Hook

import { useMistralCompletion } from 'mistral-react';
import { useState } from 'react';

function CompletionComponent() {
  const { completion, isLoading, complete } = useMistralCompletion();
  const [input, setInput] = useState('');

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isLoading) return;
    
    await complete(input);
  };

  return (
    <div>
      <form onSubmit={handleSubmit}>
        <input
          type="text"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Enter your prompt..."
          disabled={isLoading}
        />
      </form>
      {completion && <p>{completion}</p>}
    </div>
  );
}

Stream Hook

import { useMistralStream } from 'mistral-react';
import { useState } from 'react';

function StreamComponent() {
  const { content, isStreaming, startStream } = useMistralStream();
  const [input, setInput] = useState('');

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isStreaming) return;

    const message = input;
    setInput('');
    
    await startStream(
      [{ role: 'user', content: message }],
      {
        model: 'mistral-small-latest',
        onChunk: (chunk) => console.log('Chunk:', chunk),
        onComplete: () => console.log('Stream complete'),
      }
    );
  };

  return (
    <div>
      <form onSubmit={handleSubmit}>
        <input
          type="text"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
          disabled={isStreaming}
        />
      </form>
      <p>{content}</p>
    </div>
  );
}

API Reference

MistralChat Component

The main chat component with full customization options.

Props:

| Prop | Type | Default | Description |
|------|------|---------|-------------|
| theme | MistralChatTheme | - | Theme customization object |
| model | string | 'mistral-small-latest' | Mistral AI model to use |
| temperature | number | 0.7 | Controls randomness (0 = focused, 1 = creative) |
| maxTokens | number | - | Maximum tokens to generate |
| placeholder | string | 'Type your message...' | Input placeholder text |
| clearButtonText | string | 'Clear' | Clear button text |
| showClearButton | boolean | true | Show/hide the clear button |
| initialMessages | ChatMessage[] | [] | Initial chat messages |
| className | string | - | CSS class name |
| loadingText | string | 'Thinking...' | Text shown in the default loading indicator |
| renderMessage | (props: RenderMessageProps) => React.ReactNode | - | Custom message renderer |
| renderLoading | () => React.ReactNode | - | Custom loading-indicator renderer |
| onMessageSent | (message: ChatMessage) => void | - | Callback when the user sends a message |
| onReplyReceived | (message: ChatMessage) => void | - | Callback when the AI replies |
| onError | (error: Error) => void | - | Callback for errors |

Temperature Guide

The temperature parameter controls the randomness and creativity of AI responses:

  • 0.0 - 0.3: Deterministic and focused

    • Most consistent and predictable responses
    • Best for: Code generation, factual answers, data extraction
    • Example: temperature={0.1}
  • 0.4 - 0.7: Balanced (recommended)

    • Good mix of accuracy and creativity
    • Best for: General conversations, Q&A, support chatbots
    • Example: temperature={0.7}
  • 0.8 - 1.0: Creative and diverse

    • More varied and creative responses
    • Best for: Brainstorming, creative writing, storytelling
    • Example: temperature={0.9}

// For coding assistance or factual Q&A
<MistralChat temperature={0.1} />

// For general chat (recommended)
<MistralChat temperature={0.7} />

// For creative tasks
<MistralChat temperature={0.9} />

Example with all customization options:

<MistralChat
  theme={{
    userMessage: { bgColor: '#4CAF50', textColor: '#fff' },
    assistantMessage: { bgColor: '#2196F3', textColor: '#fff' },
    container: { bgColor: '#f9f9f9', maxWidth: '800px' },
    loading: { textColor: '#007bff' },
  }}
  model="mistral-small-latest"
  temperature={0.7}
  maxTokens={1000}
  placeholder="Ask me anything..."
  clearButtonText="Clear Chat"
  showClearButton={true}
  loadingText="The AI is pondering..."
  renderMessage={({ content, role, index }) => (
    <div style={{ padding: '8px', margin: '4px' }}>
      <strong>{role === 'user' ? 'You' : 'AI'}:</strong> {content}
    </div>
  )}
  renderLoading={() => <div>Loading...</div>}
  onMessageSent={(msg) => console.log('Sent:', msg)}
  onReplyReceived={(reply) => console.log('Reply:', reply)}
  onError={(error) => console.error('Error:', error)}
/>

MistralProvider

Provider component that wraps your app and provides Mistral AI configuration.

Props:

  • apiKey (string, required): Your Mistral AI API key
  • endpoint (string, optional): Custom API endpoint
  • children (ReactNode): Your app components
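For example, a provider that reads the key from an environment variable and routes requests through the optional endpoint might look like this (a sketch; the env-var name and proxy URL are illustrative, not part of the library):

```typescript
import { MistralProvider } from 'mistral-react';

function App() {
  return (
    <MistralProvider
      // Avoid hardcoding keys; the env-var name here is illustrative
      apiKey={process.env.REACT_APP_MISTRAL_API_KEY!}
      // Optional: route requests through a proxy or gateway you control
      endpoint="https://my-proxy.example.com/v1"
    >
      <YourComponents />
    </MistralProvider>
  );
}
```

Routing through your own endpoint is also a way to keep the real API key on a server rather than in the browser.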

useMistralChat

Hook for managing chat conversations with Mistral AI.

Returns:

  • messages: Array of chat messages
  • isLoading: Whether a request is in progress
  • error: Any error that occurred
  • sendMessage(content, options?): Send a message and get a response
  • clearMessages(): Clear all messages
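A minimal sketch that surfaces the error state alongside the message list (assuming error is null until a request fails):

```typescript
import { useMistralChat } from 'mistral-react';

function ChatWithErrorHandling() {
  const { messages, isLoading, error, sendMessage, clearMessages } = useMistralChat();

  return (
    <div>
      {/* Render the hook's error state so failures are visible to the user */}
      {error && <p role="alert">Something went wrong: {error.message}</p>}
      {messages.map((msg, idx) => (
        <p key={idx}>
          <strong>{msg.role}:</strong> {msg.content}
        </p>
      ))}
      <button onClick={() => sendMessage('Hello!')} disabled={isLoading}>
        Say hello
      </button>
      <button onClick={clearMessages}>Clear</button>
    </div>
  );
}
```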

useMistralCompletion

Hook for single-shot text completions.

Returns:

  • completion: The generated completion text
  • isLoading: Whether a request is in progress
  • error: Any error that occurred
  • complete(prompt, options?): Generate a completion
  • reset(): Clear the completion
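A minimal sketch combining complete and reset (the prompt and option values are illustrative):

```typescript
import { useMistralCompletion } from 'mistral-react';

function Summarizer({ text }: { text: string }) {
  const { completion, isLoading, error, complete, reset } = useMistralCompletion();

  return (
    <div>
      <button
        disabled={isLoading}
        onClick={() => complete(`Summarize in one sentence: ${text}`, { maxTokens: 200 })}
      >
        Summarize
      </button>
      {/* reset() clears the previous completion before a new run */}
      <button onClick={reset}>Reset</button>
      {error ? <p role="alert">{error.message}</p> : <p>{completion}</p>}
    </div>
  );
}
```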

useMistralStream

Hook for streaming responses from Mistral AI.

Returns:

  • content: The accumulated streamed content
  • isStreaming: Whether streaming is in progress
  • error: Any error that occurred
  • startStream(messages, options?): Start streaming
  • stopStream(): Stop the current stream
  • reset(): Clear the content
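A minimal sketch that lets the user cancel a long-running stream with stopStream (the prompt is illustrative):

```typescript
import { useMistralStream } from 'mistral-react';

function StreamWithStop() {
  const { content, isStreaming, startStream, stopStream, reset } = useMistralStream();

  return (
    <div>
      <button
        disabled={isStreaming}
        onClick={() => startStream([{ role: 'user', content: 'Tell me a long story.' }])}
      >
        Start
      </button>
      {/* Cancel mid-stream; content accumulated so far is kept until reset() */}
      <button onClick={stopStream} disabled={!isStreaming}>
        Stop
      </button>
      <button onClick={reset}>Reset</button>
      <p>{content}</p>
    </div>
  );
}
```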

Options

ChatOptions / CompletionOptions

  • model: Model to use (default: 'mistral-small-latest')
  • temperature: Sampling temperature (0-1)
  • maxTokens: Maximum tokens to generate
  • topP: Nucleus sampling parameter
  • safeMode: Enable safe mode (chat only)
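These options are passed per call, for example as the second argument to sendMessage; a sketch with illustrative values:

```typescript
import { useMistralChat } from 'mistral-react';

function ExtractionButton() {
  const { sendMessage, isLoading } = useMistralChat();

  // Low temperature for deterministic extraction; all values are illustrative
  const extract = () =>
    sendMessage('List the dates mentioned in: "Meet on 2024-03-01 and 2024-04-15."', {
      model: 'mistral-small-latest',
      temperature: 0.1,
      maxTokens: 256,
      topP: 0.9,
      safeMode: true, // chat only
    });

  return (
    <button onClick={extract} disabled={isLoading}>
      Extract dates
    </button>
  );
}
```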

StreamOptions

All ChatOptions plus:

  • onChunk(chunk): Callback for each chunk
  • onComplete(): Callback when streaming completes
  • onError(error): Callback for errors

Examples

Check out the examples/ directory for more complete examples:

  • Basic chat application
  • Streaming demo
  • Completion examples

Development

# Install dependencies
npm install

# Run tests
npm test

# Build the library
npm run build

# Run linter
npm run lint

# Format code
npm run format

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Questions & Support

Please use GitHub Discussions for questions, troubleshooting, or general support.

License

MIT © Ivan Lori

Acknowledgments

  • Built with Mistral AI
  • Inspired by the React community
