SwiftUI-powered AI assistant for the OpenAI Responses API featuring computer use, code interpreter, file search, image generation, and MCP integrations—all wrapped in a production-ready iOS experience with deep observability and safety rails.
**Status — November 2025:** Phase 1 is complete. OpenResponses ships with local conversation storage, full Responses tool support, and the Minimal Viable App-Store Submission (MVAS) checklist. Phase 2 focuses on Conversations API migration and cross-device sync.
- Overview
- Core Features
- Architecture
- Getting Started
- Initial Configuration
- Toolbox at a Glance
- Privacy, Safety, and Compliance
- Testing & Quality Gates
- Release Workflow
- Documentation Hub
- Roadmap Snapshot
- Contributing
- Support
- License
## Overview

OpenResponses is an end-to-end iOS, iPadOS, and macOS (Catalyst) client for the OpenAI Responses API. It targets developers and advanced users who need:
- Full coverage of the current tool surface (computer use, code interpreter, file/vector search, image generation, MCP connectors).
- Rich observability—streaming analytics, reasoning trace playback, and API inspectors that make debugging and demos effortless.
- Enterprise-ready safeguards—Keychain credential storage, explicit approval flows for automation, and a minimal privacy footprint.
The app follows a productized workflow: everything you need to test, ship, and submit to the App Store—including privacy docs, tracking scripts, and the MVAS tracker—is built into the repository.
## Core Features

- Model Playground: Live model catalogue with compatibility gating, preset management, and advanced request controls (streaming flags, prompt cache IDs, reasoning toggles).
- Observability Surface: Streaming activity feed, live token usage, “Assistant Thinking” trace viewer, analytics events, and structured logging for every tool event.
- Tooling Portfolio: Computer use with safety approvals, code interpreter with artifact viewer, multi-vector file search, direct file and image attachments, Notion/MCP connectors, custom function calls.
- Knowledge Workflows: Vector store management flows, file conversion pipeline, and document picker enhancements built on `FileConverterService`.
- Native Shell: SwiftUI interface with accessibility support, keyboard shortcuts, share sheets, prompt library, onboarding, and settings tuned for fast iteration.
## Architecture

OpenResponses follows MVVM with dependency injection through `AppContainer`.

- Views: SwiftUI views such as `ChatView`, `MessageBubbleView`, and modular settings/onboarding screens.
- View Models: `ChatViewModel` orchestrates conversations, state, and tool execution; extensions such as `ChatViewModel+Streaming` handle 40+ streaming event types.
- Services: `OpenAIService` wraps the Responses API, `ComputerService` automates the computer-use browser, `ConversationStorageService` persists local history, `KeychainService` stores secrets, and compatibility helpers gate tooling per model.
- Data Models: Rich types for streaming events, function calls, computer-use actions, artifacts, and reasoning traces keep decoding resilient and expressive.
Dive deeper in `docs/CASE_STUDY.md` for component diagrams, request flows, and design decisions.
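For a feel of the shape this takes, here is a minimal sketch of the MVVM-plus-container wiring described above. Only the `AppContainer` and `ChatViewModel` names come from the codebase; the protocols and stand-in services are illustrative, not the repo's actual declarations.

```swift
import SwiftUI

// Illustrative protocol stand-ins for the app's concrete services.
protocol ResponsesAPI { func send(_ prompt: String) async throws -> String }
protocol ConversationStore { func save(_ transcript: [String]) }

struct LiveResponsesService: ResponsesAPI {
    func send(_ prompt: String) async throws -> String { "…" } // network call elided
}
struct LocalConversationStore: ConversationStore {
    func save(_ transcript: [String]) {} // disk write elided
}

/// Dependency container in the spirit of AppContainer: services are
/// built once and injected into view models, never reached for globally.
final class AppContainer {
    let api: ResponsesAPI
    let store: ConversationStore
    init(api: ResponsesAPI = LiveResponsesService(),
         store: ConversationStore = LocalConversationStore()) {
        self.api = api
        self.store = store
    }
}

@MainActor
final class ChatViewModel: ObservableObject {
    @Published private(set) var transcript: [String] = []
    private let api: ResponsesAPI
    private let store: ConversationStore

    /// Constructor injection keeps the view model testable with mocks.
    init(container: AppContainer) {
        self.api = container.api
        self.store = container.store
    }

    func send(_ prompt: String) async {
        transcript.append("user: \(prompt)")
        if let reply = try? await api.send(prompt) {
            transcript.append("assistant: \(reply)")
            store.save(transcript)
        }
    }
}
```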
## Getting Started

- Xcode 16.1 (or newer)
- macOS Sonoma
- An OpenAI API key (sk-… project key)
```bash
git clone https://github.com/Gunnarguy/OpenResponses.git
cd OpenResponses
open OpenResponses.xcodeproj
```

- OpenResponses (iOS/iPadOS): Run on simulator or device.
- OpenResponses (macOS Catalyst): Build/run via the “My Mac (Designed for iPad)” scheme.
## Initial Configuration

- Launch the app.
- Complete onboarding (3 screens summarizing capabilities and key requirements).
- When prompted, paste your OpenAI API key. It is stored in the iOS Keychain (`KeychainService`) and never checked into source control.
- Use Settings → General to toggle streaming, published prompts, and prompt cache IDs.
- Enable tools (code interpreter, computer use, file search, MCP) in Settings → Tools. Each capability enforces additional confirmation flows as required.
Secrets are intentionally absent from the repo. Run `python3 scripts/secret_scan.py` anytime to validate.
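The real wrapper is `KeychainService`; as a rough illustration of what Keychain-backed key storage looks like on Apple platforms, here is a minimal sketch using the standard Security framework. The service and account strings are placeholders, not the app's actual identifiers.

```swift
import Foundation
import Security

/// Minimal sketch of a Keychain-backed secret store in the spirit of
/// KeychainService; the shipping implementation differs in detail.
enum APIKeyStore {
    private static let baseQuery: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.OpenResponses", // placeholder
        kSecAttrAccount as String: "openai-api-key",            // placeholder
    ]

    static func save(_ key: String) {
        SecItemDelete(baseQuery as CFDictionary) // replace any existing item
        var attributes = baseQuery
        attributes[kSecValueData as String] = Data(key.utf8)
        SecItemAdd(attributes as CFDictionary, nil)
    }

    static func load() -> String? {
        var query = baseQuery
        query[kSecReturnData as String] = true
        query[kSecMatchLimit as String] = kSecMatchLimitOne
        var result: AnyObject?
        guard SecItemCopyMatching(query as CFDictionary, &result) == errSecSuccess,
              let data = result as? Data else { return nil }
        return String(data: data, encoding: .utf8)
    }
}
```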
## Toolbox at a Glance

| Capability | Details |
|---|---|
| Computer Use | Navigate/click/scroll automation with safety approval sheets, blank-page recovery, screenshot attachments, and status updates. |
| Code Interpreter | Sandboxed Python execution with artifact viewer, status heartbeats, and result summarization. |
| File Search & Vector Stores | Upload files, manage vector stores, toggle file search per prompt, and configure rankers or thresholds. |
| Image Generation | Trigger image creation with optional detail level control and inline previews. |
| MCP Connectors | Register local/remote MCP servers, inspect tools, and gate usage through approval UI with Keychain-backed auth. |
| Prompt Library | Save and reuse prompt presets including reasoning/model settings and safety identifiers. |
| Observability | Activity feed, streaming status chips, token usage counters, API inspector, debug console, and analytics hooks. |
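Tool availability is gated per model (the compatibility gating mentioned under Core Features). A minimal sketch of what such gating can look like follows; the tool names and model identifiers here are illustrative, not the repo's actual tables.

```swift
/// Illustrative per-model tool gating in the spirit of the app's
/// compatibility helpers.
enum Tool: String, CaseIterable {
    case computerUse = "computer_use"
    case codeInterpreter = "code_interpreter"
    case fileSearch = "file_search"
    case imageGeneration = "image_generation"
}

struct ModelCapabilities {
    let supportedTools: Set<Tool>

    /// Hypothetical lookup; a real gating table would be driven by the
    /// live model catalogue rather than hard-coded prefixes.
    static func forModel(_ id: String) -> ModelCapabilities {
        if id.hasPrefix("computer-use") {
            return ModelCapabilities(supportedTools: [.computerUse])
        }
        return ModelCapabilities(
            supportedTools: [.codeInterpreter, .fileSearch, .imageGeneration]
        )
    }
}

// Only surface toggles for tools the selected model actually supports.
let caps = ModelCapabilities.forModel("gpt-4.1")
let available = Tool.allCases.filter { caps.supportedTools.contains($0) }
```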
## Privacy, Safety, and Compliance

- Credentials: API keys and integration tokens live only in the Keychain. No secrets ship with the app or reside on disk.
- Data Residency: Conversations and attachments stay on device until you explicitly send them to OpenAI or an MCP tool.
- Permissions: The app currently requests Photos, Files, Calendars, Contacts, Reminders, and Local Network usage descriptions. Camera, microphone, speech recognition, and location are intentionally excluded in v1.0.0.
- Computer Use Safeguards: Every automation step requires review; declines cancel the chain immediately. Status updates ensure reviewers see what is happening at all times.
- Docs: See `PRIVACY.md` for the privacy summary and `docs/AppReviewNotes.md` for reviewer instructions.
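The per-step review described above maps naturally onto an async approval gate. Below is a hedged sketch of that pattern; every name in it is illustrative, and the app's real flow is UI-driven and considerably richer.

```swift
/// One step of a proposed automation chain, e.g. "click 'Submit'".
struct AutomationStep {
    let description: String
}

enum ApprovalError: Error { case declined }

/// Runs a chain of automation steps, pausing for review before each one.
/// A single decline cancels the remainder of the chain immediately.
func runChain(
    _ steps: [AutomationStep],
    approve: (AutomationStep) async -> Bool,
    execute: (AutomationStep) async throws -> Void
) async throws {
    for step in steps {
        guard await approve(step) else { throw ApprovalError.declined }
        try await execute(step)
    }
}
```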
## Testing & Quality Gates

- Unit & Snapshot Tests: Run inside Xcode (`⌘U`) or via `xcodebuild` on `OpenResponsesTests`, `StreamingEventDecodingTests`, and related targets.
- Secret Scan: `python3 scripts/secret_scan.py`
- Preflight Check: `bash scripts/preflight_check.sh` verifies `Info.plist` usage descriptions and reruns the secret scan.
- Manual QA: Follow `docs/PRODUCTION_CHECKLIST.md` for streaming, tooling, accessibility, and documentation checks.
- API Coverage: Update `docs/api/Full_API_Reference.md` when adding request fields, tool types, or event handling.
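The streaming decode path is the easiest place to regress, which is what `StreamingEventDecodingTests` guards. Here is a sketch of the kind of test it might contain; the `StreamingEvent` shape is an illustrative stand-in rather than the repo's model, though `response.output_text.delta` is a real Responses streaming event type.

```swift
import XCTest

/// Illustrative stand-in for the app's streaming event model.
struct StreamingEvent: Decodable {
    let type: String
    let delta: String?
}

final class StreamingEventDecodingSketch: XCTestCase {
    func testDecodesTextDelta() throws {
        let json = #"{"type": "response.output_text.delta", "delta": "Hel"}"#
        let event = try JSONDecoder().decode(
            StreamingEvent.self,
            from: Data(json.utf8)
        )
        XCTAssertEqual(event.type, "response.output_text.delta")
        XCTAssertEqual(event.delta, "Hel")
    }
}
```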
## Release Workflow

The Minimal Viable App-Store Submission (MVAS) plan captures everything needed to submit OpenResponses to TestFlight/App Store with ~6–12 hours of effort.
- Track progress in `docs/MVAS_SUBMISSION_TRACKER.md` (checklist + decision log).
- Ensure privacy copy is current (`PRIVACY.md`, App Store metadata, `docs/AppReviewNotes.md`).
- Run `bash scripts/preflight_check.sh` to confirm secrets and `Info.plist` values are clean.
- Archive in Xcode → Organizer → Validate/Upload.
- Invite internal TestFlight testers for the sanity pass (onboarding, chat, computer use).
- Submit to App Review with the dossier from `docs/AppReviewNotes.md`.
## Documentation Hub

- `docs/ROADMAP.md` — phased rollout plan with current status.
- `docs/CASE_STUDY.md` — architecture narrative including diagrams and streaming lifecycle.
- `docs/api/Full_API_Reference.md` — field-by-field implementation status for Responses.
- `docs/PRODUCTION_CHECKLIST.md` — manual QA and release verification steps.
- `docs/Advanced.md`, `docs/Tools.md`, `docs/Files.md`, `docs/Images.md` — feature-specific how-tos.
- `docs/AppReviewNotes.md` — one-pager for App Store reviewers.
- `Notion/` — MCP connector setup guides.
## Roadmap Snapshot

- Phase 1 (Complete): Multi-modal inputs, full Responses tool coverage, computer-use hardening, vector workflow, observability overhaul.
- Phase 2 (In Progress): Conversations API adoption, annotation rendering, cross-device sync, enhanced conversation metadata.
- Beyond: Apple Intelligence integration, richer UI polish, offline caching, and advanced prompt caching (see `docs/ROADMAP.md`).
## Contributing

We welcome pull requests aligned with the roadmap.
- Fork the repo and branch from `main` or the active release branch.
- Implement the change with tests where applicable.
- Run unit tests and `bash scripts/preflight_check.sh`.
- Update relevant docs (`docs/`, `PRIVACY.md`, `README.md`, etc.).
- Submit a PR describing the change, test evidence, and any roadmap linkage.
Please open an issue before large architectural work so we can coordinate on Phase 2 priorities.
## Support

- Email: support@gunnarguy.com
- Issues: https://github.com/Gunnarguy/OpenResponses/issues
- Discussions and roadmap queries: see `docs/ROADMAP.md` and `docs/MVAS_SUBMISSION_TRACKER.md`
## License

MIT — see `LICENSE`.