MCP Dev Mode Chat Handover — Apr 10, 2026
Why this exists
This doc captures important context from the current GPT session because MCP/Gitea access in dev mode does not carry memory across chats.
Use this file as a fast resume point for future sessions.
What was reviewed across repos
Repos reviewed
- `zlh-grind`
- `knowledge-base`
- `zlh-agent`
- `zpack-api`
- `zpack-portal`
- `ZpackVelocityBridge`
Trust order established
- `knowledge-base` for canonical architecture
- `zlh-grind` for current execution state / handover
- source repos for implementation truth
- older docs/logs only when needed
Important nuance:
`INFRASTRUCTURE.md` had been overwritten/truncated and was restored during this session.
Infrastructure doc recovery completed
What happened
- `zlh-grind/INFRASTRUCTURE.md` had been overwritten down to a single PBS row.
- History was checked and the earlier full infrastructure version was recovered from Git history.
- The current surviving PBS row data was preserved and merged into the restored file.
Recovery actions completed
- Restored full infra doc structure
- Preserved `9017 zlh-back` current data: `10.60.0.24` / `172.60.0.30`
- Corrected router WAN IPs:
  - `9001 zlh-router` → `66.163.115.221`
  - `9002 zpack-router` → `66.163.115.115`
- Removed obsolete `internal.zlh` inventory section
- Replaced that with a note that `internal.zlh` is not used for current hot-path service discovery
Current status
- `INFRASTRUCTURE.md` is usable again as a working infra reference
- It should still get a quick sanity pass if any other VM/IP rows changed after Apr 2, 2026
Cross-repo architecture understanding established
Confirmed architecture
- Portal → API → Agent
- Hosted IDE access is API-mediated
- Console access is API websocket bridge mediated
- Agent owns runtime/filesystem execution
- Velocity consumes platform state sourced through the API / plugin path
Repo-specific conclusions
zpack-api
Confirmed in source:
- hosted IDE flow is implemented in `src/routes/devProxy.js`
- console proxy is implemented in `src/routes/consoleProxy.js`
- internal Velocity server-list endpoint is implemented in `src/routes/internalVelocity.js`
- game/server lifecycle orchestration is handled through API routes like `servers.js`
zpack-portal
Confirmed in source:
- Open IDE uses the newer API-mediated flow (`POST /api/dev/:vmid/ide-token`)
- Console uses the API websocket bridge path
- Portal still has some migration debt via `src/lib/api/legacy.ts`
- `testdameon`/`testdaemon`-style cleanup debt had been noted earlier in review
zlh-agent
Confirmed in source:
- real Minecraft readiness probing exists in `internal/minecraft/readiness.go`
- agent has Fabric-specific provisioning support
- the likely unresolved issue is sequencing/use of readiness, not a total lack of Fabric support
Important correction on Velocity / registration model
Earlier working assumption was that the API might directly push registration into Velocity.
After code review, the more accurate model is:
- API exposes server inventory/state
- Velocity plugin consumes that information and performs actual server registration inside Velocity
So the plugin is a key part of the registration/readiness path.
ZpackVelocityBridge review summary
Repo hygiene
- Repo was initially pushed with generated Gradle junk (`.gradle/`, `build/`)
- User deleted and recreated the repo
- A cleaner push landed afterward
- Current repo is clean of generated build junk
- Remaining mild annoyance: an extra top-level folder `ZpackVelocityBridge/` still exists
- This nesting is not a blocker, just mildly inconvenient
Key plugin findings
1. The plugin does actual Velocity registration
`ServerRegistry.java` performs:
- `proxy.registerServer(...)`
- `proxy.unregisterServer(...)`
So plugin code is the component actually adding/removing backends in Velocity.
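As a mental model, the registry behavior can be sketched with a plain in-memory map. This is a simplified stand-in, not the actual `ServerRegistry.java`, which calls Velocity's `proxy.registerServer(...)` / `proxy.unregisterServer(...)`; the `registerIfAbsent` name mirrors the method referenced later in this doc, but the data shapes here are illustrative:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Simplified model of the plugin-side registry. The Map stands in for
// Velocity's backend table; the real code delegates to the ProxyServer API.
public class RegistryModel {
    private final Map<String, String> backends = new ConcurrentHashMap<>();

    // Register a backend if absent; returns true when a new entry was added.
    public boolean registerIfAbsent(String name, String address) {
        return backends.putIfAbsent(name, address) == null;
    }

    // Remove a backend; returns true when an entry was actually removed.
    public boolean unregister(String name) {
        return backends.remove(name) != null;
    }

    public boolean isRegistered(String name) {
        return backends.containsKey(name);
    }
}
```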
2. The plugin startup flow pulls from API
In `ZpackVelocityBridge.java`, startup rehydrate pulls backend inventory from the API.
Important detail found during review:
- the default endpoint still points to `zpack-api.internal.zlh`
- that default is stale relative to the current architecture unless overridden externally
3. The plugin likely ignores readiness during startup rehydrate
Most important finding from plugin review:
- plugin startup rehydrate appears to register servers returned by the API
- it does not appear to require `ready == true` before registration
This is likely the core explanation for Fabric being surfaced too early.
Better phrasing of the bug:
- the API/plugin path is exposing/registering a backend while the Minecraft/Fabric server is `running` but not actually `ready`
4. Plugin also exposes webhook-style HTTP endpoints
From `ZpackHttpHandlers.java`:
- `POST /zpack/register`
- `POST /zpack/unregister`
- `GET /zpack/status`
So plugin supports both:
- startup/API rehydrate
- webhook-style live register/unregister
Needs future verification:
- is anything actually calling these webhook endpoints in production now?
- or is plugin state mostly startup snapshot + manual restarts?
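A minimal sketch of the route dispatch these three endpoints imply. The paths come from the review above, but the handler structure and return values are illustrative, not taken from `ZpackHttpHandlers.java`:

```java
// Hypothetical sketch of webhook route dispatch for the plugin's HTTP surface.
// Only the three documented endpoints are modeled; everything else is a 404.
public class WebhookRoutes {
    public static String dispatch(String method, String path) {
        if ("POST".equals(method) && "/zpack/register".equals(path))   return "register";
        if ("POST".equals(method) && "/zpack/unregister".equals(path)) return "unregister";
        if ("GET".equals(method)  && "/zpack/status".equals(path))     return "status";
        return "not_found"; // any other method/path combination
    }
}
```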
5. Routing layer is simple and not the bug
`RoutingHandler.java` does:
- hostname exact match first
- single-server fallback when only one backend exists
- otherwise deny
Routing is consuming registry state, not deciding readiness.
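The policy described above is small enough to sketch as a pure function (data shapes are illustrative; the real `RoutingHandler.java` consumes Velocity registry state rather than a plain map):

```java
import java.util.Map;
import java.util.Optional;

// Sketch of the three-step routing policy: exact hostname match first,
// single-backend fallback when only one backend exists, otherwise deny.
public class RoutingSketch {
    public static Optional<String> route(String hostname, Map<String, String> backendsByHost) {
        // 1. hostname exact match
        String exact = backendsByHost.get(hostname);
        if (exact != null) return Optional.of(exact);
        // 2. single-server fallback when only one backend exists
        if (backendsByHost.size() == 1) {
            return Optional.of(backendsByHost.values().iterator().next());
        }
        // 3. otherwise deny (no route)
        return Optional.empty();
    }
}
```

Note that readiness never appears here, which matches the conclusion that routing consumes registry state rather than deciding it.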
Most likely active technical issue now
Best current diagnosis
The most likely active Minecraft/Fabric issue is:
The API/plugin path is allowing a backend to be registered in Velocity when the server is `running` but not actually `ready`.
This is more precise than the earlier broader theory about generic Fabric artifact mismatch.
Highest-value next code change
Patch ZpackVelocityBridge so startup rehydrate only registers servers where:
`ready == true`
Not merely:
- valid name/address/port
- `status=running`
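A hedged sketch of the proposed gate, assuming the API returns per-server records carrying a `ready` flag. The `BackendEntry` shape is hypothetical; the actual record type in the plugin may differ:

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the proposed rehydrate filter: registration requires
// ready == true, not merely a valid name/address/port and status == "running".
public class RehydrateFilter {
    // Hypothetical stand-in for whatever record the API actually returns.
    public record BackendEntry(String name, String address, String status, boolean ready) {}

    public static boolean shouldRegister(BackendEntry e) {
        return e.ready(); // status == "running" alone is not sufficient
    }

    // Apply the gate to the full rehydrate payload before any registration.
    public static List<BackendEntry> filterForRegistration(List<BackendEntry> entries) {
        return entries.stream()
                .filter(RehydrateFilter::shouldRegister)
                .collect(Collectors.toList());
    }
}
```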
Second plugin follow-up
Replace or remove stale default reliance on:
`zpack-api.internal.zlh`
Use current configured API address / env-configured endpoint instead.
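One way to sketch the endpoint resolution, preferring explicit configuration over the stale default. The `ZPACK_API_BASE_URL` env var name and the `http://` scheme on the default are assumptions, not confirmed from plugin source:

```java
// Sketch of replacing the stale hard-coded default: prefer an explicitly
// configured value, then an env override, and only then fall back.
public class ApiEndpointConfig {
    // Stale host from the current plugin default; scheme is an assumption.
    static final String STALE_DEFAULT = "http://zpack-api.internal.zlh";

    // configuredValue would come from plugin config; envValue from
    // System.getenv("ZPACK_API_BASE_URL") (hypothetical variable name).
    public static String resolve(String configuredValue, String envValue) {
        if (configuredValue != null && !configuredValue.isBlank()) return configuredValue;
        if (envValue != null && !envValue.isBlank()) return envValue;
        // Last resort; arguably this should fail loudly instead of
        // silently using the stale host.
        return STALE_DEFAULT;
    }
}
```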
Current repo / session state at end of this chat
Completed
- multi-repo review completed
- trust order established
- `INFRASTRUCTURE.md` restored and corrected
- plugin repo reviewed
- plugin repo cleaned of generated junk after recreate/re-push
Still open
- patch plugin startup rehydrate to honor `ready`
- confirm whether plugin webhook endpoints are actively used
- optionally flatten the `ZpackVelocityBridge` repo so project files live at repo root
- optionally reduce Portal dependency on `legacy.ts`
- validate remaining infra rows for any post-Apr-2 changes
Recommended next move for future chat
First recommended task
Patch ZpackVelocityBridge:
- inspect startup rehydrate logic in `ZpackVelocityBridge.java`
- require `ready == true` before `registerIfAbsent(...)`
- review stale default API endpoint behavior (`internal.zlh` default)
- decide whether webhook mode, polling, or startup-only rehydrate is the intended long-term model
Secondary follow-up
After patching plugin:
- retest Fabric readiness / Velocity exposure sequence
- verify Fabric backend is not routable until actually accepting connections
Notes for future GPT session
Do not restart broad repo discovery from scratch.
Resume from these assumptions unless newer code disproves them:
- architecture understanding is already established
- infra doc recovery is complete
- plugin is the registration actor inside Velocity
- plugin readiness handling is the most likely near-term bug