- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to the chat model list
- Set MiniMax-M2.7 as the default model (replacing M2.5)
- Keep all previously added models available as alternatives

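The model-list change above can be sketched as follows. This is a hedged illustration, not the actual registry in apis/minimax.py: the list/constant names (MINIMAX_CHAT_MODELS, DEFAULT_CHAT_MODEL) are assumptions, but the model identifiers and the default come from the changelog.

```python
# Hypothetical layout of the chat model registry; only the model names and
# the choice of default are taken from the change description.
MINIMAX_CHAT_MODELS = [
    "MiniMax-M2.7",            # new default
    "MiniMax-M2.7-highspeed",  # newly added
    "MiniMax-M2.5",            # previous default, kept as an alternative
    "MiniMax-M2.5-highspeed",  # kept as an alternative
]

DEFAULT_CHAT_MODEL = "MiniMax-M2.7"  # was "MiniMax-M2.5"
```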
Extend the existing MiniMax integration (currently video-only) with a new
MinimaxChatNode that supports text generation via the MiniMax-M2.5 and
MiniMax-M2.5-highspeed language models. The node follows the same
ComfyExtension pattern used by the OpenAI and Gemini chat nodes.

Changes:
- Add chat API models (request/response) to apis/minimax.py
- Add MinimaxChatNode with system prompt, temperature, and max_tokens support
- Register the new node in MinimaxExtension
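The request/response models and node inputs listed above can be sketched roughly as below. This is a simplified, self-contained illustration: the field names, the build_chat_request helper, and the dataclass layout are assumptions, and the real MinimaxChatNode would subclass the ComfyUI node base used by the OpenAI and Gemini chat nodes rather than stand alone.

```python
# Hedged sketch of the chat API models and the node's input handling.
# Only the inputs named in the changelog (system prompt, temperature,
# max_tokens) and the model names are taken from the description.
from dataclasses import dataclass, field


@dataclass
class MinimaxChatMessage:
    role: str      # "system" or "user"
    content: str


@dataclass
class MinimaxChatRequest:
    model: str
    messages: list
    temperature: float = 1.0
    max_tokens: int = 1024


def build_chat_request(
    prompt: str,
    system_prompt: str = "",
    model: str = "MiniMax-M2.5",
    temperature: float = 1.0,
    max_tokens: int = 1024,
) -> MinimaxChatRequest:
    """Assemble the request body the node would send to the chat endpoint."""
    messages = []
    if system_prompt:
        # System prompt is optional; it is prepended when provided.
        messages.append(MinimaxChatMessage("system", system_prompt))
    messages.append(MinimaxChatMessage("user", prompt))
    return MinimaxChatRequest(model, messages, temperature, max_tokens)
```

For example, `build_chat_request("Summarize this.", system_prompt="Be brief.")` produces a two-message request (system then user) against the M2.5 default, which the node would serialize and POST to the MiniMax chat endpoint.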