Looks like another Claude App/Cowork-type competitor with slightly different tradeoffs (Cowork just calls Claude Code in a VM, this just calls Codex CLI with OS sandboxing).

Here's the Codex tech stack in case anyone was interested like me.

Framework: Electron 40.0.0

Frontend:

- React 19.2.0

- Jotai (state management)

- TanStack React Form

- Vite (bundler)

- TypeScript

Backend/Main Process:

- Node.js

- better-sqlite3 (local database)

- node-pty (terminal emulation)

- Zod (validation)

- Immer (immutable state)

Build & Dev:

- pnpm (package manager)

- Electron Forge

- Vitest (testing)

- ESLint + Prettier

Native/macOS:

- Sparkle (auto-updates)

- Squirrel (installer)

- electron-liquid-glass (macOS vibrancy effects)

- Sentry (error tracking)
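To make the list concrete, here's a rough sketch of how a couple of those pieces would typically fit together in the main process. None of the app's actual schemas or message shapes are public, so the message type and handler below are entirely hypothetical; the hand-rolled guard stands in for what Zod's `safeParse` would do in the real stack, and the pty spawn is only indicated in a comment.

```typescript
// Hypothetical renderer→main IPC message. In the real app a Zod schema
// would validate this; a dependency-free type guard stands in here.
interface RunCommandMsg {
  type: "run-command";
  cwd: string;
  command: string;
}

function isRunCommandMsg(value: unknown): value is RunCommandMsg {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    v.type === "run-command" &&
    typeof v.cwd === "string" &&
    typeof v.command === "string"
  );
}

// Main-process handler: validate every untrusted payload before it
// reaches node-pty or better-sqlite3.
function handleIpc(payload: unknown): string {
  if (!isRunCommandMsg(payload)) return "rejected";
  // Real code would do something like:
  //   pty.spawn(shell, ["-c", payload.command], { cwd: payload.cwd })
  return `spawn pty in ${payload.cwd}: ${payload.command}`;
}
```

The point is just that the validation layer (Zod) sits between the renderer and the privileged bits (terminal, database), which is the standard Electron pattern.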

They have the same stack as a boot camper, which is quite telling.


The use of the name Codex and the focus on diffs and worktrees suggests this is still more dev-focused than Cowork.

It's a smart move: while Codex has the same aspirations, limiting it to savvy power users will likely lead to better feedback and less catastrophic misuse.

> this just calls Codex CLI with OS sandboxing

The git and terminal views are a big plus for me. I usually have those open and active in addition to my codex CLI sessions.

Excited to try skills, too.

Wow, where did you find the stack? How about the Claude app stack?

Is the integration with Sentry native or via MCP?

What does Sentry via MCP even mean? You want the LLM to call Sentry itself whenever it encounters an error?

Meaning Sentry exposes an MCP layer with a tool-call interface and a tool registry. In this case, the layer is provided by Sentry. Native would mean calling specific Sentry APIs through a dedicated integration path, depending on the context. At least that's how I categorize it.

I'm so confused. Sentry is a native client crash reporting tool. What does this have to do with MCP or the LLM itself? Do you mean when interpreting the crash data?

Sentry provides an MCP server that your LLM can call to answer questions like the number of crashes in the last X days.

The LLM gets the data from Sentry using Sentry MCP.
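For anyone unfamiliar with what that looks like on the wire: MCP is JSON-RPC 2.0, and a tool invocation goes out as a `tools/call` request. The tool name and arguments below are purely illustrative — I'm not claiming this matches Sentry's actual MCP tool registry — but the envelope shape is what the protocol specifies.

```typescript
// Shape of an MCP tool-call request (JSON-RPC 2.0, method "tools/call").
interface McpToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

// Build a request an LLM client might send to a Sentry MCP server to ask
// about recent crashes. "find_issues" and its arguments are hypothetical.
function buildSentryQuery(days: number): McpToolCall {
  return {
    jsonrpc: "2.0",
    id: 1,
    method: "tools/call",
    params: {
      name: "find_issues", // hypothetical tool name
      arguments: { query: "is:unresolved", statsPeriod: `${days}d` },
    },
  };
}
```

So "the LLM gets the data from Sentry" just means the model emits one of these requests, the MCP server runs the real Sentry API query, and the result comes back as the tool response.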