In what sense? Codex CLI is FOSS and works fine with other models as a backend, including those served by llama.cpp.
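
For example, pointing Codex CLI at a local llama.cpp server is a matter of declaring a custom model provider in `~/.codex/config.toml` — a minimal sketch, assuming llama.cpp's `llama-server` is running on its default port 8080 with an OpenAI-compatible endpoint (the provider id `llamacpp` and model name are placeholders):

```toml
# Use the custom provider defined below
model_provider = "llamacpp"
model = "your-local-model"   # placeholder; llama.cpp serves whatever GGUF you loaded

[model_providers.llamacpp]
name = "llama.cpp"
base_url = "http://localhost:8080/v1"  # llama-server's OpenAI-compatible API
wire_api = "chat"
```

With that in place, `codex` talks to the local server instead of OpenAI's hosted models.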