AI Tools

Scivics Lab provides a suite of lightweight, Java/Quarkus-based tools for working with AI models and the Model Context Protocol (MCP).

LLM Consoles

Browser-based chat interfaces for different LLM providers. All share the same UX: real-time SSE streaming, a prompt queue, tool execution forwarded to the Web UI, and 10 color themes.

| Tool | Backend | GitHub |
| --- | --- | --- |
| LLM Console (Claude) | Claude Code CLI / Anthropic API | quarkus-llm-console-claude |
| LLM Console (Codex) | OpenAI Codex CLI / OpenAI API | quarkus-llm-console-codex |
| LLM Console (Local) | vLLM, Ollama, OpenAI-compatible | quarkus-llm-console |
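The real-time SSE streaming that the consoles share can be consumed by any HTTP client that understands the Server-Sent Events wire format. As a minimal sketch of the client side, here is a parser that splits a raw SSE stream into its `data:` payloads (the endpoint paths and payload shapes of the consoles are not documented here, so none are assumed):

```java
import java.util.ArrayList;
import java.util.List;

/** Minimal SSE frame parser: collects the data field of each event.
 *  An event's data lines are joined and emitted when a blank line ends the event. */
public class SseParser {
    public static List<String> parseEvents(String raw) {
        List<String> events = new ArrayList<>();
        StringBuilder data = new StringBuilder();
        for (String line : raw.split("\n", -1)) {
            if (line.startsWith("data:")) {
                if (data.length() > 0) data.append('\n'); // multi-line data field
                data.append(line.substring(5).stripLeading());
            } else if (line.isEmpty() && data.length() > 0) {
                events.add(data.toString()); // blank line terminates the event
                data.setLength(0);
            }
        }
        return events;
    }
}
```

In a real client the same loop would run over the chunked response body as it arrives, appending each event's text to the chat view.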

MCP Infrastructure

| Tool | Description | GitHub |
| --- | --- | --- |
| MCP Gateway | Name-based reverse proxy with caller identification and session metadata | quarkus-mcp-gateway |
| Emacs MCP Server | MCP server that controls Emacs via emacsclient (Python / TypeScript / Java) | emacs-mcp-server |
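Driving Emacs via emacsclient boils down to shelling out with `--eval` and a well-escaped Elisp form. A sketch of that command construction (the `insert` example and the wrapper class are illustrative, not the server's actual code):

```java
import java.util.List;

/** Sketch: how an MCP tool call can drive Emacs through emacsclient --eval.
 *  Escaping the argument into an Elisp string literal prevents injection. */
public class EmacsClientCall {
    public static String elispString(String s) {
        return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }

    /** Command line that inserts text into the current Emacs buffer. */
    public static List<String> insertCommand(String text) {
        return List.of("emacsclient", "--eval", "(insert " + elispString(text) + ")");
    }
}
```

The resulting list can be handed to `ProcessBuilder` to run against a live Emacs server.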

How They Fit Together

Claude Code / LLM Console / Workflow Editor
                    │ MCP
                    ▼
quarkus-mcp-gateway (:8888)
├── /mcp/llm-console-claude → llm-console-claude (:8090)
├── /mcp/workflow-editor    → workflow-editor    (:8091)
└── /mcp/emacs              → emacs-mcp-server   (:8092)

The gateway provides name-based routing (no need to remember port numbers), caller identification (each request carries metadata about who sent it), and a session metadata API for on-demand introspection.
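The name-based routing amounts to mapping the first path segment after `/mcp/` to a backend base URL. A minimal sketch of that lookup, using the names and ports from the topology above (the registry here is hard-coded for illustration; the gateway's actual configuration mechanism is not shown):

```java
import java.util.Map;
import java.util.Optional;

/** Sketch of name-based routing: /mcp/<name>/... -> backend URL.
 *  Names and ports mirror the example topology; a real gateway
 *  would load this registry from configuration. */
public class McpRouter {
    private static final Map<String, String> BACKENDS = Map.of(
            "llm-console-claude", "http://localhost:8090",
            "workflow-editor",    "http://localhost:8091",
            "emacs",              "http://localhost:8092");

    public static Optional<String> resolve(String path) {
        if (!path.startsWith("/mcp/")) return Optional.empty();
        String rest = path.substring(5);
        int slash = rest.indexOf('/');
        String name = slash < 0 ? rest : rest.substring(0, slash);
        String tail = slash < 0 ? "" : rest.substring(slash);
        // Unknown names yield empty -> the gateway would answer 404.
        return Optional.ofNullable(BACKENDS.get(name)).map(base -> base + tail);
    }
}
```

Clients only ever address `/mcp/<name>` on port 8888; backend ports can move without breaking callers.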

Each service is both an MCP server and an MCP client. The Workflow Editor can call the LLM Console to run AI prompts, and the LLM Console can call the Workflow Editor to trigger workflows — all routed through the gateway with full caller traceability.
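Caller identification can be pictured as each client tagging its outbound requests with its own identity before they reach the gateway. A sketch of that tagging (the `X-MCP-Caller` header name is a hypothetical illustration, not the gateway's documented API):

```java
import java.net.URI;
import java.net.http.HttpRequest;

/** Sketch: tagging an outbound MCP request with the caller's identity.
 *  The X-MCP-Caller header name is hypothetical; the gateway could equally
 *  derive identity from the connection or session metadata. */
public class CallerTag {
    public static HttpRequest tagged(String callerName, String url) {
        return HttpRequest.newBuilder(URI.create(url))
                .header("X-MCP-Caller", callerName) // who is making this call
                .GET()
                .build();
    }
}
```

With every hop tagged this way, a Workflow Editor → gateway → LLM Console call chain stays attributable end to end.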