# AI Tools
Scivics Lab provides a suite of lightweight, Java/Quarkus-based tools for working with AI models and the Model Context Protocol (MCP).
## LLM Consoles
Browser-based chat interfaces for different LLM providers. All share the same UX: real-time SSE streaming, a prompt queue, tool executions forwarded to the web UI, and 10 color themes.
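The prompt queue mentioned above can be sketched in plain Java (this is an illustrative model, not the consoles' actual implementation): prompts submitted while the model is busy are held in FIFO order and drained when the model frees up.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Illustrative sketch of a prompt queue: submissions arriving while a
// response is streaming are buffered and later drained in FIFO order.
public class PromptQueue {
    private final BlockingQueue<String> pending = new LinkedBlockingQueue<>();

    public void submit(String prompt) {
        pending.add(prompt);
    }

    /** Drain everything queued so far, in submission order. */
    public List<String> drain() {
        List<String> batch = new ArrayList<>();
        pending.drainTo(batch);
        return batch;
    }

    public static void main(String[] args) {
        PromptQueue q = new PromptQueue();
        q.submit("summarize the logs");
        q.submit("write a unit test");
        System.out.println(q.drain()); // [summarize the logs, write a unit test]
    }
}
```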
| Tool | Backend | GitHub |
|---|---|---|
| LLM Console (Claude) | Claude Code CLI / Anthropic API | quarkus-llm-console-claude |
| LLM Console (Codex) | OpenAI Codex CLI / OpenAI API | quarkus-llm-console-codex |
| LLM Console (Local) | vLLM, Ollama, OpenAI-compatible | quarkus-llm-console |
## MCP Infrastructure
| Tool | Description | GitHub |
|---|---|---|
| MCP Gateway | Name-based reverse proxy with caller identification and session metadata | quarkus-mcp-gateway |
| Emacs MCP Server | MCP server that controls Emacs via emacsclient (Python / TypeScript / Java) | emacs-mcp-server |
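As a rough sketch of what "controls Emacs via emacsclient" means, a tool call ultimately shells out to the `emacsclient` binary with an elisp expression. The elisp form below is hypothetical, and the MCP protocol handling is omitted:

```java
import java.util.List;

// Sketch of shelling out to emacsclient from an MCP tool handler
// (illustrative only; requires a running Emacs server to actually execute).
public class EmacsEval {
    /** Build the emacsclient invocation for evaluating an elisp expression. */
    static List<String> evalCommand(String elisp) {
        return List.of("emacsclient", "--eval", elisp);
    }

    public static void main(String[] args) {
        List<String> cmd = evalCommand("(buffer-name)");
        System.out.println(String.join(" ", cmd)); // emacsclient --eval (buffer-name)
        // To actually run it against a live Emacs server:
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```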
## How They Fit Together
```
Claude Code / LLM Console / Workflow Editor
          │ MCP
          ▼
quarkus-mcp-gateway (:8888)
 ├── /mcp/llm-console-claude → llm-console-claude (:8090)
 ├── /mcp/workflow-editor    → workflow-editor    (:8091)
 └── /mcp/emacs              → emacs-mcp-server   (:8092)
```
The gateway provides name-based routing (no need to remember port numbers), caller identification (each request carries metadata about who sent it), and a session metadata API for on-demand introspection.
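The name-based routing idea can be sketched in a few lines of Java: the first path segment after `/mcp/` selects the backend, so callers never deal with port numbers. The route table below mirrors the ports shown above but is illustrative, not the gateway's actual configuration:

```java
import java.util.Map;

// Minimal sketch of name-based routing: /mcp/<name>/... resolves to a
// backend base URL via a route table keyed by service name.
public class GatewayRouting {
    static final Map<String, Integer> ROUTES = Map.of(
            "llm-console-claude", 8090,
            "workflow-editor", 8091,
            "emacs", 8092);

    /** Resolve an incoming /mcp/<name>/... path to the backend base URL. */
    static String resolve(String path) {
        if (!path.startsWith("/mcp/"))
            throw new IllegalArgumentException("not a gateway path: " + path);
        String name = path.substring(5).split("/", 2)[0];
        Integer port = ROUTES.get(name);
        if (port == null)
            throw new IllegalArgumentException("unknown service: " + name);
        return "http://localhost:" + port;
    }

    public static void main(String[] args) {
        System.out.println(resolve("/mcp/emacs/messages")); // http://localhost:8092
    }
}
```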
Each service is both an MCP server and an MCP client. The Workflow Editor can call the LLM Console to run AI prompts, and the LLM Console can call the Workflow Editor to trigger workflows — all routed through the gateway with full caller traceability.
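One way a gateway can carry caller metadata is by stamping each forwarded request with identifying headers. The header names below are hypothetical, chosen only to illustrate the shape of the mechanism; the gateway's actual metadata format may differ:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Sketch of caller identification: the gateway stamps each forwarded request
// with headers naming the caller and its session (header names are illustrative).
public class CallerMetadata {
    static HttpRequest withCaller(String backendUrl, String callerName, String sessionId) {
        return HttpRequest.newBuilder(URI.create(backendUrl))
                .header("X-MCP-Caller", callerName)   // who sent the request
                .header("X-MCP-Session", sessionId)   // which session it belongs to
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = withCaller("http://localhost:8091/mcp",
                "llm-console-claude", "sess-42");
        System.out.println(req.headers().firstValue("X-MCP-Caller").orElse("?"));
        // prints: llm-console-claude
    }
}
```

A backend receiving such a request can log or expose these headers, which is what makes end-to-end caller traceability possible even when requests hop between services.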