HCL Domino Ideas Portal


Status Needs Review
Workspace Domino
Categories Integration
Created by Guest
Created on Jul 31, 2025

Multiple LLMs on the same Domino server

Different LLMs are good for different things, so it would be useful to be able to enable more than one LLM on a single Domino server, especially remote LLMs. I understand that enabling more than one local model might be constrained by system specs, but external ones shouldn't have much impact, and the Domino setup shouldn't be the limiting factor.
The configuration should be set using the command document.

  • Guest
    Aug 8, 2025

    Actually, you can specify a different LLM model for the same remote server per command.

    The only limitation is that it needs to be the same remote server, for example an Ollama server loading multiple models.
    There is a configuration for it, and each OpenAI chat completion request sends the model name; the model just has to match what the server supports.
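
    To illustrate the point about the model name travelling with each request, here is a minimal sketch (outside Domino, using the OpenAI Python client) of two chat completion requests sent to the same remote server, each naming a different model. The endpoint and model names are assumptions; they must match what your Ollama server actually serves.

    ```python
    # Minimal sketch, not Domino-specific: two OpenAI-style chat completion
    # requests to the same Ollama server, each using a different model.
    # Assumes Ollama's OpenAI-compatible endpoint at http://localhost:11434/v1
    # and that the "llama3.1" and "codellama" models have been pulled.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # same remote server for every request
        api_key="ollama",                      # Ollama ignores the key, but the client requires one
    )

    # One command could use a general-purpose model for summarization...
    summary = client.chat.completions.create(
        model="llama3.1",
        messages=[{"role": "user", "content": "Summarize this meeting note: ..."}],
    )

    # ...while another command uses a code-oriented model on the same server.
    code = client.chat.completions.create(
        model="codellama",
        messages=[{"role": "user", "content": "Write a LotusScript loop over a view's documents."}],
    )

    print(summary.choices[0].message.content)
    print(code.choices[0].message.content)
    ```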