Extend the existing MiniMax integration (currently video-only) with a new
MinimaxChatNode that supports text generation via the MiniMax-M2.5 and
MiniMax-M2.5-highspeed language models. The node follows the same
ComfyExtension pattern used by the existing OpenAI and Gemini chat nodes.
Changes:
- Add chat API models (request/response) to apis/minimax.py
- Add MinimaxChatNode with system prompt, temperature, and max_tokens support
- Register the new node in MinimaxExtension
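As a rough illustration of the first two bullets, the request/response models and the mapping from node inputs to a chat request might look like the sketch below. This is a hypothetical outline, not the actual code: the real apis/minimax.py may use Pydantic models and a different field schema, and the names `MinimaxChatRequest`, `MinimaxChatResponse`, and `build_request` are invented for this example (the field names mirror common OpenAI-compatible chat APIs).

```python
from dataclasses import dataclass, field, asdict
from typing import List

# Hypothetical request/response shapes for the MiniMax chat API.
# The actual schema in apis/minimax.py may differ.

@dataclass
class MinimaxChatMessage:
    role: str      # "system", "user", or "assistant"
    content: str

@dataclass
class MinimaxChatRequest:
    model: str     # e.g. "MiniMax-M2.5" or "MiniMax-M2.5-highspeed"
    messages: List[MinimaxChatMessage] = field(default_factory=list)
    temperature: float = 1.0
    max_tokens: int = 1024

    def to_payload(self) -> dict:
        """Serialize to a JSON-ready dict for the chat endpoint."""
        return asdict(self)

@dataclass
class MinimaxChatResponse:
    text: str      # generated completion text

def build_request(prompt: str, system_prompt: str = "",
                  model: str = "MiniMax-M2.5",
                  temperature: float = 1.0,
                  max_tokens: int = 1024) -> MinimaxChatRequest:
    """Assemble the message list the way the node's inputs map onto the API:
    an optional system message first, then the user prompt."""
    messages = []
    if system_prompt:
        messages.append(MinimaxChatMessage(role="system", content=system_prompt))
    messages.append(MinimaxChatMessage(role="user", content=prompt))
    return MinimaxChatRequest(model=model, messages=messages,
                              temperature=temperature, max_tokens=max_tokens)
```

The node itself would call `build_request` with its widget values, POST the payload, and return the response text as its output; registration then only requires appending the node class to the list returned by MinimaxExtension.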