- Overview
- Key Features
- Quick Start
- Documentation
- Architecture
- Contributing
- License
- iOS 15.0+ device
- Xcode 15+
- OpenAI API key
git clone https://github.com/Gunnarguy/OpenAssistant.git
cd OpenAssistant
open OpenAssistant.xcodeproj

Detailed setup instructions: docs/installation/INSTALLATION.md
All documentation is organized in the docs/ directory:
- Documentation Index - Complete documentation overview
- Installation Guide - Setup instructions
- Contributing - How to contribute
- Privacy Policy - Data handling information
- Architecture Diagram - Visual component interactions
OpenAssistant is a feature-rich, native iOS application built meticulously with SwiftUI and the Combine framework. It serves as a sophisticated client for the OpenAI Assistants API, empowering users to harness the full potential of AI assistants directly from their Apple devices. The application offers comprehensive management of assistants, vector stores for retrieval, and file handling, all wrapped in an intuitive user interface. It is designed to handle the complexities of asynchronous API interactions, thread management, and local data persistence, providing a robust and user-friendly mobile experience.
| Feature | Description |
|---|---|
| Assistant Lifecycle Management | Create, view, meticulously configure (name, instructions, model selection including GPT-4o/4.1/O-series, description, temperature, top P, reasoning effort), and delete OpenAI Assistants. |
| Advanced Tool Configuration | Dynamically enable or disable powerful tools for assistants, such as Code Interpreter and File Search (Retrieval). |
| Vector Store Operations | Full CRUD (Create, Read, Update, Delete) for Vector Stores. Associate Vector Stores with Assistants to enable precise, file-based knowledge retrieval. |
| Comprehensive File Handling | Upload various file types (PDF, TXT, DOCX, etc.) to OpenAI, associate them with specific Vector Stores using configurable chunking strategies (size and overlap). View detailed file metadata and manage files within these stores. |
| Dynamic Chat Interface | Engage in interactive conversations with selected Assistants. Features include Markdown rendering for assistant responses, robust message history management (persisted locally via MessageStore), and OpenAI thread lifecycle control. |
| Reactive UI & Data Sync | Leverages the Combine framework for managing asynchronous operations and NotificationCenter for decoupled, real-time updates across the UI when assistants, stores, or settings change. |
| Secure & Persistent API Key | Stores and manages the OpenAI API key using @AppStorage, ensuring it persists across app sessions. |
| Adaptive Appearance | Supports Light, Dark, and System-defined appearance modes, configurable via in-app settings for a personalized user experience. |
| Native iOS Excellence | Built from the ground up using SwiftUI, ensuring a modern, responsive, and platform-native user experience optimized for iOS. |
| Robust MVVM Architecture | Organizes code using the Model-View-ViewModel (MVVM) pattern, promoting clear separation of concerns, enhanced testability, and superior maintainability. |
| Dedicated API Service Layer | A specialized service layer (APIService) encapsulates all interactions with the OpenAI API, efficiently handling requests, responses, error conditions, and retries. |
The application is architected using the Model-View-ViewModel (MVVM) pattern, a cornerstone for building scalable and maintainable SwiftUI applications.
- Model: Represents the data structures and business logic. These are primarily Codable structs that mirror the OpenAI API entities (e.g., Assistant, Message, Thread, Run, VectorStore, File) and internal application data constructs.
- View: The UI layer, built declaratively with SwiftUI. Views observe ViewModels for state changes and render the UI accordingly. Examples: ChatView, AssistantManagerView, VectorStoreDetailView. They delegate user actions to their respective ViewModels.
- ViewModel: Acts as the bridge between the View and the Model. It prepares and provides data for the View, processes user input, manages UI state (e.g., loading indicators, error messages), and orchestrates operations by interacting with services (primarily APIService). Examples: ChatViewModel, AssistantManagerViewModel, VectorStoreManagerViewModel. A minimal sketch of the three layers follows.
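To make the pattern concrete, here is a minimal sketch of how the layers fit together in SwiftUI. The trimmed Assistant fields and the AssistantListViewModel/AssistantListView names are illustrative stand-ins, not the app's actual definitions:

```swift
import SwiftUI
import Combine

// Model: a Codable struct mirroring an OpenAI API entity (fields trimmed for illustration).
struct Assistant: Codable, Identifiable {
    let id: String
    var name: String
    var instructions: String?
}

// ViewModel: owns UI state as @Published properties and talks to the service layer.
final class AssistantListViewModel: ObservableObject {
    @Published var assistants: [Assistant] = []
    @Published var isLoading = false
}

// View: observes the ViewModel and re-renders whenever its published state changes.
struct AssistantListView: View {
    @StateObject private var viewModel = AssistantListViewModel()

    var body: some View {
        List(viewModel.assistants) { assistant in
            Text(assistant.name)
        }
    }
}
```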
graph TD
subgraph "User Interface (SwiftUI Views)"
direction LR
V_Chat[ChatView]
V_AsstMgr[AssistantManagerView]
V_AsstDetail[AssistantDetailView]
V_VecStoreList[VectorStoreListView]
V_VecStoreDetail[VectorStoreDetailView]
V_Settings[SettingsView]
V_Picker[AssistantPickerView]
V_CreateAsst[CreateAssistantView]
V_MainTab[MainTabView]
V_Content[ContentView]
end
subgraph "ViewModels (State & Business Logic)"
direction LR
VM_Base[BaseViewModel]
VM_BaseAsst[BaseAssistantViewModel] --- VM_Base
VM_Content[ContentViewModel] --- VM_Base
VM_Chat[ChatViewModel] --- VM_BaseAsst
VM_AsstMgr[AssistantManagerViewModel] --- VM_BaseAsst
VM_AsstDetail[AssistantDetailViewModel] --- VM_BaseAsst
VM_AsstPicker[AssistantPickerViewModel] --- VM_BaseAsst
VM_VecStoreMgr[VectorStoreManagerViewModel] --- VM_Base
end
subgraph "Services (API & File Handling)"
direction LR
S_OpenAI_Init[OpenAIInitializer]
S_OpenAI[OpenAIService] -. Uses .-> S_OpenAI_Init
S_OpenAI_AsstExt[OpenAIService-Assistant Ext.] -- Extends --> S_OpenAI
S_OpenAI_ThreadExt[OpenAIService-Threads Ext.] -- Extends --> S_OpenAI
S_OpenAI_VecExt[OpenAIService-Vector Ext.] -- Extends --> S_OpenAI
S_FileUpload[FileUploadService] -. Uses .-> S_OpenAI
end
subgraph "Data Persistence & System Services"
P_AppStorage["@AppStorage (API Key, Settings)"]
P_MessageStore["MessageStore (Chat History)"]
P_NotifCenter[NotificationCenter]
P_Combine["Combine Framework"]
end
subgraph "External Dependencies"
Ext_OpenAI_API[OpenAI API]
end
%% View to ViewModel (User Actions & Data Binding)
V_Content --> VM_Content
V_MainTab --> V_Picker
V_MainTab --> V_AsstMgr
V_MainTab --> V_VecStoreList
V_MainTab --> V_Settings
V_Picker --> VM_AsstPicker
V_AsstMgr --> VM_AsstMgr
V_AsstMgr --- V_CreateAsst
V_AsstMgr --- V_AsstDetail
V_CreateAsst --> VM_AsstMgr
V_AsstDetail --> VM_AsstDetail
V_VecStoreList --> VM_VecStoreMgr
V_VecStoreDetail --> VM_VecStoreMgr
V_Chat --> VM_Chat
V_Settings --> P_AppStorage
V_Settings -. Posts .-> P_NotifCenter
%% ViewModel to Service (Requesting Data/Actions)
VM_Base --> S_OpenAI
VM_Chat -.-> S_OpenAI_ThreadExt
VM_AsstMgr -.-> S_OpenAI_AsstExt
VM_AsstMgr -.-> S_OpenAI_VecExt
VM_AsstDetail -.-> S_OpenAI_AsstExt
VM_AsstDetail -.-> VM_VecStoreMgr
VM_AsstPicker -.-> VM_AsstMgr
VM_VecStoreMgr -.-> S_OpenAI_VecExt
VM_VecStoreMgr -.-> S_FileUpload
%% Service to External API
S_OpenAI --> Ext_OpenAI_API
S_FileUpload --> Ext_OpenAI_API
%% Data Flow & State Management
P_MessageStore <--> VM_Chat
P_AppStorage <--> VM_Base
P_NotifCenter <--> VM_Base
P_NotifCenter <--> VM_Content
P_Combine <--> S_OpenAI
P_Combine <--> VM_Base
P_Combine <--> VM_VecStoreMgr
classDef view fill:#B0E0E6,stroke:#4682B4,stroke-width:2px;
classDef viewModel fill:#98FB98,stroke:#2E8B57,stroke-width:2px;
classDef service fill:#FFA07A,stroke:#CD5C5C,stroke-width:2px;
classDef persistence fill:#DDA0DD,stroke:#8A2BE2,stroke-width:2px;
classDef external fill:#FFD700,stroke:#B8860B,stroke-width:2px;
class V_Chat,V_AsstMgr,V_AsstDetail,V_VecStoreList,V_VecStoreDetail,V_Settings,V_Picker,V_CreateAsst,V_MainTab,V_Content view;
class VM_Base,VM_BaseAsst,VM_Content,VM_Chat,VM_AsstMgr,VM_AsstDetail,VM_AsstPicker,VM_VecStoreMgr viewModel;
class S_OpenAI_Init,S_OpenAI,S_OpenAI_AsstExt,S_OpenAI_ThreadExt,S_OpenAI_VecExt,S_FileUpload service;
class P_AppStorage,P_MessageStore,P_NotifCenter,P_Combine persistence;
class Ext_OpenAI_API external;
The project is organized into several directories, each serving a specific purpose. Here's a detailed breakdown:
APIService (Networking & OpenAI Interaction)
| File | Summary |
|---|---|
| CommonMethods.swift | Defines an extension on OpenAIService with methods for configuring and creating URLRequest objects. |
| FileUploadService.swift | Defines a FileUploadService class for uploading files to OpenAI and managing vector stores. |
| OpenAIInitializer.swift | Manages the initialization of the shared OpenAIService instance with thread safety. |
| OpenAIService-Assistant.swift | Extension for OpenAIService to manage assistants (CRUD operations). |
| OpenAIService-Threads.swift | Extension for OpenAIService to manage threads, runs, and messages. |
| OpenAIService-Vector.swift | Extension for OpenAIService to manage vector stores and files. |
| OpenAIService.swift | The main OpenAIService class for handling API requests, responses, and errors. |
| OpenAIServiceError.swift | Defines custom error types for OpenAIService operations. |
Main Application Logic & Shared Components (Main/)
| File | Summary |
|---|---|
| Additional.swift | Defines various data models used across the application. |
| Appearance.swift | Manages appearance-related settings (e.g., Light, Dark, System modes). |
| Errors.swift | Defines custom error types and error handling utilities. |
| LoadingView.swift | A SwiftUI view for displaying a loading indicator. |
| MainTabView.swift | The main tab view of the application. |
| ModelCapabilities.swift | Helper for checking the capabilities of different AI models. |
| OpenAssistantApp.swift | The main entry point of the SwiftUI application. |
| ResponseFormat.swift | Defines structs and enums for handling JSON response formats. |
| SettingsView.swift | A SwiftUI view for managing user settings. |
| Content/ContentView.swift | The root view of the application. |
| Content/ContentViewModel.swift | The view model for the ContentView. |
MVVM Components (MVVMs/)
| File | Summary |
|---|---|
| Bases | |
| BaseAssistantViewModel.swift | A base class for ViewModels related to Assistants. |
| BaseViewModel.swift | The primary base class for all ViewModels. |
| Assistants Feature | |
| AssistantDetailView.swift | SwiftUI view for managing an assistant's details. |
| AssistantDetailViewModel.swift | ViewModel for AssistantDetailView. |
| AssistantManagerView.swift | SwiftUI view for listing and managing assistants. |
| AssistantManagerViewModel.swift | ViewModel for AssistantManagerView. |
| AssistantPickerView.swift | SwiftUI view for selecting an assistant to start a chat. |
| AssistantPickerViewModel.swift | ViewModel for AssistantPickerView. |
| Chat Feature | |
| ChatView.swift | The main chat interface. |
| ChatViewModel.swift | Core chat logic and state management. |
| MessageStore.swift | Manages the persistence of chat messages. |
| VectorStores Feature | |
| VectorStoreDetailView.swift | Displays the details of a VectorStore. |
| VectorStoreListView.swift | Manages the list of vector stores. |
| VectorStoreManagerViewModel.swift | Manages all API interactions for vector stores. |
| AddFileView.swift | SwiftUI view for uploading files to a vector store. |
The application starts with OpenAssistantApp, which sets up the main ContentView and injects essential environment objects like AssistantManagerViewModel, VectorStoreManagerViewModel, and MessageStore.
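A condensed sketch of what that entry point looks like; the property names and parameterless initializers are assumptions for illustration rather than the app's exact code:

```swift
import SwiftUI

@main
struct OpenAssistantApp: App {
    // Shared state objects injected into the environment for all child views.
    @StateObject private var assistantManagerViewModel = AssistantManagerViewModel()
    @StateObject private var vectorStoreManagerViewModel = VectorStoreManagerViewModel()
    @StateObject private var messageStore = MessageStore()

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environmentObject(assistantManagerViewModel)
                .environmentObject(vectorStoreManagerViewModel)
                .environmentObject(messageStore)
        }
    }
}
```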
The OpenAI API key is securely stored using @AppStorage. The application prompts the user for the key on first launch via the SettingsView. The BaseViewModel ensures that the OpenAIService is re-initialized whenever the key is updated.
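A minimal sketch of this flow, assuming a hypothetical storage key name ("OpenAI_API_Key") and a notification-based signal for re-initialization:

```swift
import SwiftUI

// Illustrative API-key entry backed by @AppStorage; the storage key name and the
// notification name are assumptions for this example.
struct APIKeyField: View {
    @AppStorage("OpenAI_API_Key") private var apiKey: String = ""

    var body: some View {
        SecureField("OpenAI API key", text: $apiKey)
            .onChange(of: apiKey) { newKey in
                // BaseViewModel re-creates its OpenAIService when the key changes;
                // posting a notification is one way to signal that (assumed here).
                NotificationCenter.default.post(name: .init("settingsUpdated"), object: newKey)
            }
    }
}
```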
The MainTabView is the central navigation hub, providing access to the main features, as sketched below:
- Assistants: Select an assistant for a chat (AssistantPickerView).
- Manage: Create, edit, and delete assistants (AssistantManagerView).
- Vector Stores: Manage vector stores and their files (VectorStoreListView).
- Settings: Configure the API key and app appearance (SettingsView).
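A condensed sketch of that tab layout; the tab labels follow the list above, while the SF Symbol icons are illustrative choices:

```swift
import SwiftUI

struct MainTabView: View {
    var body: some View {
        TabView {
            AssistantPickerView()
                .tabItem { Label("Assistants", systemImage: "person.2") }
            AssistantManagerView()
                .tabItem { Label("Manage", systemImage: "slider.horizontal.3") }
            VectorStoreListView()
                .tabItem { Label("Vector Stores", systemImage: "folder") }
            SettingsView()
                .tabItem { Label("Settings", systemImage: "gear") }
        }
    }
}
```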
ViewModels are responsible for fetching data from the APIService. They use @Published properties to expose data to the SwiftUI views, which automatically update when the data changes. The Combine framework is used extensively for handling asynchronous data streams.
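A hedged sketch of this fetch pattern, reusing the Assistant struct from the earlier sketch. The publisher is injected as a parameter here; in the app it would come from APIService, whose real method names may differ:

```swift
import Combine
import Foundation

final class AssistantListFetcher: ObservableObject {
    @Published var assistants: [Assistant] = []
    @Published var errorMessage: String?
    private var cancellables = Set<AnyCancellable>()

    func fetchAssistants(from publisher: AnyPublisher<[Assistant], Error>) {
        publisher
            .receive(on: DispatchQueue.main)   // hop to the main thread before touching UI state
            .sink { [weak self] completion in
                if case .failure(let error) = completion {
                    self?.errorMessage = error.localizedDescription
                }
            } receiveValue: { [weak self] assistants in
                self?.assistants = assistants  // @Published change re-renders observing views
            }
            .store(in: &cancellables)
    }
}
```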
User actions in the views are delegated to their respective ViewModels. The ViewModel processes the action, interacts with the APIService or other services, and updates its state, which in turn updates the UI.
OpenAssistantApp is the entry point, setting up the main window and environment. ContentView acts as the root view, displaying MainTabView or a loading indicator based on the state managed by ContentViewModel.
The APIService and its extensions form a dedicated layer for all OpenAI API communications. It handles request creation, authentication, response decoding, and error handling. The FileUploadService specializes in handling multipart file uploads.
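For illustration, request construction in such a layer might look like the following. The endpoint and header names follow OpenAI's public Assistants API documentation, but this helper is a sketch rather than the app's actual CommonMethods code:

```swift
import Foundation

func makeAssistantsRequest(apiKey: String) -> URLRequest {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/assistants")!)
    request.httpMethod = "GET"
    // Bearer authentication plus the beta header required by the Assistants API v2.
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.setValue("assistants=v2", forHTTPHeaderField: "OpenAI-Beta")
    return request
}
```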
BaseViewModel provides common functionalities like OpenAIService access and error handling. BaseAssistantViewModel extends this for assistant-specific ViewModels.
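A minimal sketch of that hierarchy; the property and method names are assumptions rather than the app's exact code:

```swift
import Combine
import Foundation

// Shared behavior every ViewModel inherits.
class BaseViewModel: ObservableObject {
    @Published var errorMessage: String?

    func handleError(_ error: Error) {
        errorMessage = error.localizedDescription
    }
}

// Assistant-specific ViewModels layer shared assistant state on top of the base.
class BaseAssistantViewModel: BaseViewModel {
    @Published var assistantId: String?
}
```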
This feature allows users to perform full CRUD operations on assistants. AssistantManagerView and its ViewModel handle the list of assistants, while AssistantDetailView and its ViewModel manage the configuration of individual assistants.
ChatView and its ViewModel provide the core chat experience. They manage the creation of threads, sending and receiving messages, and polling for run status updates. MessageStore ensures that chat history is persisted locally.
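Polling for a terminal run state might look roughly like this, shown with async/await for brevity even though the app leans on Combine. RunStatus and the injected fetch closure are assumed stand-ins for the OpenAIService-Threads extension:

```swift
import Foundation

enum RunStatus { case queued, inProgress, completed, failed, cancelled, expired }

func waitForRunCompletion(threadId: String, runId: String,
                          fetchStatus: (String, String) async throws -> RunStatus)
    async throws -> RunStatus {
    while true {
        let status = try await fetchStatus(threadId, runId)
        switch status {
        case .completed, .failed, .cancelled, .expired:
            return status                                    // terminal state: stop polling
        case .queued, .inProgress:
            try await Task.sleep(nanoseconds: 1_000_000_000) // wait ~1s before the next check
        }
    }
}
```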
This feature allows users to manage vector stores and their associated files. VectorStoreListView and its ViewModel handle the list of vector stores, while VectorStoreDetailView provides details and file management options.
The SettingsView allows users to configure the application, including the OpenAI API key and appearance settings.
- MessageStore: Persists chat history using UserDefaults and JSON serialization (see the sketch below).
- @AppStorage: Used for storing the API key and appearance settings.
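A self-contained sketch of that persistence approach; the Message fields and the defaults key are assumptions:

```swift
import Foundation

struct Message: Codable, Identifiable {
    let id: String
    let role: String
    let content: String
}

final class MessageStore: ObservableObject {
    @Published private(set) var messages: [Message] = []
    private let defaultsKey = "chatMessages"  // assumed key name

    init() {
        // Restore previously saved history, if any.
        if let data = UserDefaults.standard.data(forKey: defaultsKey),
           let saved = try? JSONDecoder().decode([Message].self, from: data) {
            messages = saved
        }
    }

    func add(_ message: Message) {
        messages.append(message)
        // Re-encode the full list and write it back to UserDefaults.
        if let data = try? JSONEncoder().encode(messages) {
            UserDefaults.standard.set(data, forKey: defaultsKey)
        }
    }
}
```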
NotificationCenter is used to broadcast significant events (e.g., assistantCreated, settingsUpdated), allowing different parts of the application to stay in sync without being tightly coupled.
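A sketch of this pattern, assuming a Notification.Name defined for the assistantCreated event mentioned above:

```swift
import Foundation
import Combine

extension Notification.Name {
    static let assistantCreated = Notification.Name("assistantCreated")
}

// Publisher side: post after a successful create, with no coupling to subscribers.
func notifyAssistantCreated() {
    NotificationCenter.default.post(name: .assistantCreated, object: nil)
}

// Subscriber side: any component can refresh whenever the event fires.
final class AssistantListSync {
    private var cancellable: AnyCancellable?

    init() {
        cancellable = NotificationCenter.default
            .publisher(for: .assistantCreated)
            .sink { _ in /* e.g., re-fetch the assistant list */ }
    }
}
```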
The interactions.html file provides a visual, interactive diagram of the component interactions within the application, offering a clear overview of the architecture and data flow.
- Error Handling: Enhance error handling with more specific error messages and user-friendly recovery options.
- Unit Testing: Increase unit test coverage for ViewModels and services to ensure robustness.
- Performance Optimization: Profile and optimize data fetching and UI rendering for a smoother experience.
- Accessibility: Improve accessibility by adding labels and hints to all UI elements for better VoiceOver support.
- Xcode 15 or later
- Swift 5.9 or later
- An OpenAI API key
1. Clone the repository:
   git clone https://github.com/Gunnarguy/OpenAssistant.git
   cd OpenAssistant
2. Open the project in Xcode:
   open OpenAssistant.xcodeproj
3. Set your OpenAI API key:
   - Run the application.
   - Navigate to the Settings tab.
   - Enter your OpenAI API key and tap "Save Settings".
4. Build and run the project on your iOS device or simulator.
OpenAssistant follows the MVVM (Model-View-ViewModel) pattern with:
- Models: OpenAI API entities (Assistant, Message, VectorStore)
- Views: SwiftUI components (ChatView, AssistantManagerView)
- ViewModels: Business logic and state management
- Services: API communication layer
OpenAssistant/
├── Main/             # App entry point & core utilities
├── APIService/       # OpenAI API integration layer
├── MVVMs/            # ViewModels and Views by feature
│   ├── Bases/        # Base classes for inheritance
│   ├── Chat/         # Chat interface components
│   ├── Assistants/   # Assistant management
│   └── VectorStores/ # File and vector store management
└── Assets.xcassets/  # App icons and resources
See detailed architecture: docs/interactions.html
We welcome contributions! Please see our Contributing Guide for:
- Development setup
- Code style guidelines
- Pull request process
- Architecture patterns
Quick start for contributors:
- Fork the repository
- Create a feature branch
- Follow our MVVM patterns
- Submit a pull request
This project is licensed under the MIT License. See the LICENSE file for details.
TL;DR: Free to use, modify, and distribute. No warranty provided.