In this article, we describe how to set up your AI Providers to integrate third-party models within LILT. This article is meant for anyone interested in using and managing their own LLMs within the LILT platform.
An Administrator role or equivalent API Access role permission is required to configure LLMs in LILT.
LILT enables administrators to set up and manage LLMs entirely within the LILT platform. To set up and manage an LLM:
Use the sidebar to navigate to “Manage”, then select “AI Providers”.
On the first tab, you will see the various providers we support, including Google, DeepL, OpenAI, AWS, and Azure. We also support Custom Models – please contact the LILT team to learn more!
From the desired LLM card, select “Configure”.
Adding AWS credentials for the first time
Enter the required configurations for your LLM (e.g., API key, JSON credentials, bucket name, password); a quick local check of a JSON credentials file is sketched after the options below. For more information on configuration details, see the relevant pages:
Use Terms = allows selected terminology entries from the LILT Data Source attached to the model to be used for fine-tuning models built from this LLM.
Allow fine-tuning = allows LILT to share memory entries from selected LILT Data Sources to create more contextualized models.
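The exact fields LILT expects vary by provider, so treat the following as a minimal sketch only: it assumes a Google Cloud–style service-account key file at a hypothetical local path and simply confirms that the JSON you are about to provide parses and contains the fields such key files typically include. It does not reflect LILT's own validation.

```python
import json

# Hypothetical path to a downloaded service-account key file; adjust to your own file.
CREDENTIALS_PATH = "service-account.json"

# Fields typically present in a Google Cloud service-account key file (assumption,
# not a LILT requirement) — adjust for the provider you are configuring.
EXPECTED_FIELDS = {"type", "project_id", "private_key", "client_email", "token_uri"}

with open(CREDENTIALS_PATH, encoding="utf-8") as f:
    creds = json.load(f)  # raises json.JSONDecodeError if the file is not valid JSON

missing = EXPECTED_FIELDS - creds.keys()
if missing:
    print(f"Credentials file is missing fields: {sorted(missing)}")
else:
    print(f"Credentials look complete for service account {creds['client_email']}")
```

Running a check like this before pasting credentials into the configuration dialog can save a round trip if the file was truncated or exported incorrectly.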
Once you click “Verify and Save”, you will see all of the associated services that are enabled for your organization based on the credentials provided.
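LILT performs its own verification when you click “Verify and Save”. If you want to rule out credential problems beforehand, a minimal local pre-check along these lines can help; the access key, secret key, and bucket name are placeholders, and the sketch uses boto3 directly rather than anything LILT-specific.

```python
import boto3
from botocore.exceptions import ClientError

# Placeholder values; substitute the credentials and bucket you plan to enter in LILT.
ACCESS_KEY = "AKIA..."          # hypothetical access key ID
SECRET_KEY = "..."              # hypothetical secret access key
BUCKET_NAME = "my-lilt-bucket"  # hypothetical bucket name

session = boto3.Session(aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)

# Confirm the credentials are valid by asking STS which account they belong to.
identity = session.client("sts").get_caller_identity()
print(f"Credentials belong to account {identity['Account']}")

# Confirm the credentials can reach the bucket you intend to configure.
try:
    session.client("s3").head_bucket(Bucket=BUCKET_NAME)
    print(f"Bucket '{BUCKET_NAME}' is reachable with these credentials")
except ClientError as err:
    print(f"Bucket check failed: {err.response['Error']['Code']}")
```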
Edit AWS credentials
Once you’ve configured your AI Providers, click on the “Services” tab.
Here you will see all of the different services (e.g., Translation, Create) you can access in LILT and the corresponding third-party AI services that power them in your platform.