Connect an LLM
Last updated
Within this section, admins can manage the connections to any AI service providers that have been configured for the domain. By default, AI Studio will have a connection set up with the MangoApps Managed LLM.
From here, Admins can configure new connections to supported LLMs, enable/disable connected LLMs, or remove connections entirely.
To configure a new connection to an AI service provider, click the Connect an LLM button located in the top right corner of the screen.
From the resulting pop-up window, select an option from a dropdown list of MangoApps-supported LLMs. Upon selecting one of these options, the name and icon of the LLM will auto-populate and can be edited by the Admin.
Enter the API credentials issued to the admin's account by the AI service provider, then click Save to initiate the connection request.
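The exact credential fields vary by provider. As a purely illustrative sketch (these field names are hypothetical, not the actual MangoApps form), a connection to an Azure OpenAI-style provider typically involves values along these lines:

```json
{
  "endpoint": "https://<resource-name>.openai.azure.com/",
  "api_key": "<key issued in the provider's account dashboard>",
  "deployment": "<model deployment name>"
}
```

Each of these values is obtained from the admin's account with the AI service provider, not from MangoApps.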
If successful, a confirmation message will appear, and any new or existing AI Assistants can then be configured to use the new AI service provider's models as their associated LLM.
MangoApps domains with the AI Studio module enabled come with a preset connection to the MangoApps Managed LLM.
Responses from AI Assistants using the MangoApps Managed LLM are generated solely by an LLM instance hosted within the MangoApps AI infrastructure (e.g. Azure OpenAI), meaning intranet data never leaves the MangoApps network.
The MangoApps Managed LLM is available to all employees and can be used with any new or existing AI Assistants created for the domain.