Disclaimer: This is an unofficial community-driven wrapper for Mistral AI. It is not affiliated with or endorsed by Mistral AI.
React hooks and components for integrating Mistral AI into your React applications.
- 🎣 React Hooks - Easy-to-use hooks for chat, completion, and streaming
- ⚛️ React Context - Provider pattern for managing Mistral AI configuration
- 📦 TypeScript - Full TypeScript support with type definitions
- 🔄 Streaming - Real-time streaming support for chat responses
- 🎨 Flexible - Customizable options for temperature, tokens, and more
- 🧪 Tested - Comprehensive test coverage with Jest
```bash
npm install mistral-react
```

To try the included example app:

```bash
# Install dependencies
npm install

# Install example dependencies
npm run example:install

# Run the example app
npm run example
```

Open http://localhost:3000 and enter your Mistral API key to start chatting!
```tsx
import { MistralProvider } from 'mistral-react';

function App() {
  return (
    <MistralProvider apiKey="your-mistral-api-key">
      <YourComponents />
    </MistralProvider>
  );
}
```

The simplest way to add chat to your app:
```tsx
import { MistralProvider, MistralChat } from 'mistral-react';

function App() {
  return (
    <MistralProvider apiKey="your-api-key">
      <MistralChat />
    </MistralProvider>
  );
}
```

The `theme` prop accepts six customizable objects:
```tsx
<MistralChat
  theme={{
    // Customize user messages
    userMessage: {
      bgColor: '#4CAF50',
      textColor: '#fff',
      borderRadius: '18px',
      padding: '12px 16px',
      fontSize: '14px',
      fontFamily: 'Inter, sans-serif',
    },
    // Customize AI messages
    assistantMessage: {
      bgColor: '#2196F3',
      textColor: '#fff',
      borderRadius: '18px',
      padding: '12px 16px',
      fontSize: '14px',
      fontFamily: 'Inter, sans-serif',
    },
    // Customize the chat container
    container: {
      bgColor: '#f9f9f9',
      maxWidth: '800px',
      height: '600px',
      padding: '20px',
      borderRadius: '12px',
      boxShadow: '0 2px 10px rgba(0,0,0,0.1)',
    },
    // Customize the input field
    input: {
      bgColor: '#ffffff',
      textColor: '#212529',
      borderColor: '#ced4da',
      focusBorderColor: '#007bff',
      borderRadius: '24px',
      padding: '12px 16px',
      fontSize: '14px',
    },
    // Customize the clear button
    button: {
      bgColor: '#007bff',
      hoverBgColor: '#0056b3',
      textColor: '#ffffff',
      borderRadius: '24px',
      padding: '12px 24px',
      fontSize: '14px',
      fontWeight: '600',
    },
    // Customize the loading indicator
    loading: {
      bgColor: '#e9ecef',
      textColor: '#6c757d',
      borderRadius: '18px',
      padding: '12px 16px',
      fontSize: '14px',
      fontFamily: 'inherit',
      opacity: '0.7',
    },
  }}
/>
```

You can provide your own components to render messages and the loading indicator.
```tsx
<MistralChat
  renderMessage={({ content, role, index }) => (
    <div style={{ padding: '8px', borderRadius: '8px', margin: '4px' }}>
      <strong>{role === 'user' ? 'You' : 'Mistral'}:</strong> {content}
    </div>
  )}
/>
```

```tsx
<MistralChat
  renderLoading={() => (
    <div style={{ padding: '12px', color: '#888' }}>
      Please wait, the AI is thinking...
    </div>
  )}
/>
```

```tsx
<MistralChat
  onMessageSent={(msg) => console.log('Message sent:', msg)}
  onReplyReceived={(reply) => console.log('Reply received:', reply)}
  onError={(error) => console.error('Error:', error)}
/>
```

```tsx
import { useMistralChat } from 'mistral-react';
import { useState } from 'react';

function ChatComponent() {
  const { messages, isLoading, sendMessage, clearMessages } = useMistralChat();
  const [input, setInput] = useState('');

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isLoading) return;
    const message = input;
    setInput('');
    await sendMessage(message, {
      model: 'mistral-small-latest',
      temperature: 0.7,
    });
  };

  return (
    <div>
      {messages.map((msg, idx) => (
        <div key={idx}>
          <strong>{msg.role}:</strong> {msg.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          type="text"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
          disabled={isLoading}
        />
      </form>
    </div>
  );
}
```

```tsx
import { useMistralCompletion } from 'mistral-react';
import { useState } from 'react';

function CompletionComponent() {
  const { completion, isLoading, complete } = useMistralCompletion();
  const [input, setInput] = useState('');

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isLoading) return;
    await complete(input);
  };

  return (
    <div>
      <form onSubmit={handleSubmit}>
        <input
          type="text"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Enter your prompt..."
          disabled={isLoading}
        />
      </form>
      {completion && <p>{completion}</p>}
    </div>
  );
}
```

```tsx
import { useMistralStream } from 'mistral-react';
import { useState } from 'react';

function StreamComponent() {
  const { content, isStreaming, startStream } = useMistralStream();
  const [input, setInput] = useState('');

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim() || isStreaming) return;
    const message = input;
    setInput('');
    await startStream(
      [{ role: 'user', content: message }],
      {
        model: 'mistral-small-latest',
        onChunk: (chunk) => console.log('Chunk:', chunk),
        onComplete: () => console.log('Stream complete'),
      }
    );
  };

  return (
    <div>
      <form onSubmit={handleSubmit}>
        <input
          type="text"
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
          disabled={isStreaming}
        />
      </form>
      <p>{content}</p>
    </div>
  );
}
```

The main chat component with full customization options.
Props:
| Prop | Type | Default | Description |
|---|---|---|---|
| `theme` | `MistralChatTheme` | - | Theme customization object |
| `model` | `string` | `'mistral-small-latest'` | Mistral AI model to use |
| `temperature` | `number` | `0.7` | Controls randomness (0 = focused, 1 = creative) |
| `maxTokens` | `number` | - | Maximum tokens to generate |
| `placeholder` | `string` | `'Type your message...'` | Input placeholder text |
| `clearButtonText` | `string` | `'Clear'` | Clear button text |
| `showClearButton` | `boolean` | `true` | Show/hide clear button |
| `initialMessages` | `ChatMessage[]` | `[]` | Initial chat messages |
| `className` | `string` | - | CSS class name |
| `loadingText` | `string` | `'Thinking...'` | Text to show in the default loading indicator |
| `renderMessage` | `(props: RenderMessageProps) => React.ReactNode` | - | Custom message renderer |
| `renderLoading` | `() => React.ReactNode` | - | Custom loading indicator renderer |
| `onMessageSent` | `(message: ChatMessage) => void` | - | Callback when user sends a message |
| `onReplyReceived` | `(message: ChatMessage) => void` | - | Callback when AI replies |
| `onError` | `(error: Error) => void` | - | Callback for errors |
The `temperature` parameter controls the randomness and creativity of AI responses:

- 0.0 - 0.3: Deterministic and focused
  - Most consistent and predictable responses
  - Best for: code generation, factual answers, data extraction
  - Example: `temperature={0.1}`
- 0.4 - 0.7: Balanced (recommended)
  - Good mix of accuracy and creativity
  - Best for: general conversations, Q&A, support chatbots
  - Example: `temperature={0.7}`
- 0.8 - 1.0: Creative and diverse
  - More varied and creative responses
  - Best for: brainstorming, creative writing, storytelling
  - Example: `temperature={0.9}`

```tsx
// For coding assistance or factual Q&A
<MistralChat temperature={0.1} />

// For general chat (recommended)
<MistralChat temperature={0.7} />

// For creative tasks
<MistralChat temperature={0.9} />
```

Example with all customization options:
```tsx
<MistralChat
  theme={{
    userMessage: { bgColor: '#4CAF50', textColor: '#fff' },
    assistantMessage: { bgColor: '#2196F3', textColor: '#fff' },
    container: { bgColor: '#f9f9f9', maxWidth: '800px' },
    loading: { textColor: '#007bff' },
  }}
  model="mistral-small-latest"
  temperature={0.7}
  maxTokens={1000}
  placeholder="Ask me anything..."
  clearButtonText="Clear Chat"
  showClearButton={true}
  loadingText="The AI is pondering..."
  renderMessage={({ content, role, index }) => (
    <div style={{ padding: '8px', margin: '4px' }}>
      <strong>{role === 'user' ? 'You' : 'AI'}:</strong> {content}
    </div>
  )}
  renderLoading={() => <div>Loading...</div>}
  onMessageSent={(msg) => console.log('Sent:', msg)}
  onReplyReceived={(reply) => console.log('Reply:', reply)}
  onError={(error) => console.error('Error:', error)}
/>
```

Provider component that wraps your app and provides Mistral AI configuration.
Props:
- `apiKey` (string, required): Your Mistral AI API key
- `endpoint` (string, optional): Custom API endpoint
- `children` (ReactNode): Your app components
Hook for managing chat conversations with Mistral AI.
Returns:
- `messages`: Array of chat messages
- `isLoading`: Whether a request is in progress
- `error`: Any error that occurred
- `sendMessage(content, options?)`: Send a message and get a response
- `clearMessages()`: Clear all messages
Hook for single-shot text completions.
Returns:
- `completion`: The generated completion text
- `isLoading`: Whether a request is in progress
- `error`: Any error that occurred
- `complete(prompt, options?)`: Generate a completion
- `reset()`: Clear the completion
Hook for streaming responses from Mistral AI.
Returns:
- `content`: The accumulated streamed content
- `isStreaming`: Whether streaming is in progress
- `error`: Any error that occurred
- `startStream(messages, options?)`: Start streaming
- `stopStream()`: Stop the current stream
- `reset()`: Clear the content
- `model`: Model to use (default: `'mistral-small-latest'`)
- `temperature`: Sampling temperature (0-1)
- `maxTokens`: Maximum tokens to generate
- `topP`: Nucleus sampling parameter
- `safeMode`: Enable safe mode (chat only)
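Taken together, the options above suggest a shape along these lines. This is a hypothetical TypeScript sketch based only on the field list; the library's actual exported type may differ:

```typescript
// Hypothetical sketch of the options object passed to sendMessage()/complete().
// Field names mirror the list above; the real exported type may differ.
interface ChatOptions {
  model?: string;       // defaults to 'mistral-small-latest'
  temperature?: number; // sampling temperature, 0-1
  maxTokens?: number;   // cap on generated tokens
  topP?: number;        // nucleus sampling parameter
  safeMode?: boolean;   // chat only
}

const options: ChatOptions = {
  model: 'mistral-small-latest',
  temperature: 0.2,
  maxTokens: 500,
};

console.log(options.temperature); // 0.2
```

All fields are optional, so callers can pass only what they want to override.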
All ChatOptions plus:
- `onChunk(chunk)`: Callback for each chunk
- `onComplete()`: Callback when streaming completes
- `onError(error)`: Callback for errors
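As a sketch of how these callbacks compose, the snippet below accumulates chunks the way a consumer of `onChunk` might. The `StreamOptions` interface here is hypothetical, written to mirror the list above; the library's actual types and call order may differ:

```typescript
// Hypothetical sketch: stream options extend the chat options with callbacks.
interface StreamOptions {
  model?: string;
  temperature?: number;
  onChunk?: (chunk: string) => void;   // fired once per received chunk
  onComplete?: () => void;             // fired after the last chunk
  onError?: (error: Error) => void;    // fired on failure
}

// Simulate accumulating streamed chunks into the final content.
let content = '';
const opts: StreamOptions = {
  onChunk: (chunk) => { content += chunk; },
  onComplete: () => console.log('done:', content),
};

for (const chunk of ['Hel', 'lo', '!']) {
  opts.onChunk?.(chunk);
}
opts.onComplete?.(); // logs "done: Hello!"
```

This mirrors what the `useMistralStream` hook's `content` value does internally: each chunk is appended as it arrives, so the UI can render partial output immediately.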
Check out the `examples/` directory for more complete examples:
- Basic chat application
- Streaming demo
- Completion examples
```bash
# Install dependencies
npm install

# Run tests
npm test

# Build the library
npm run build

# Run linter
npm run lint

# Format code
npm run format
```

Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Please use GitHub Discussions for questions, troubleshooting, or general support:
MIT © Ivan Lori
- Built with Mistral AI
- Inspired by the React community