Reference14r2:Concept App Service myApps Assistant

From innovaphone wiki
FIXME: This product is in the beta phase and is not yet finished

Applies To

  • innovaphone from version 14r2

Overview

The myApps Assistant App offers an interface for other apps to access a remote large language model (LLM). This can be a locally hosted open-source model such as Mixtral or a model hosted by a provider such as OpenAI (e.g. gpt-3.5-turbo or gpt-4). Other apps can integrate the assistant service via the local JavaScript API in the client and thus present the LLM's responses.

The service acts as a proxy and includes a corresponding backend. This means that you need your own account with the external or self-hosted service and, if necessary, can configure your API key for this app service in the PBX Manager.

Currently, only the OpenAI Chat Completions API (v1) is implemented (which is also supported by several open-source projects, e.g. https://ollama.com/), but more API implementations are likely to follow.
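The request shape of the Chat Completions API can be sketched as follows. This is a minimal illustration of the standard OpenAI request body, not code from the App Service itself; the helper function name is made up for the example.

```python
import json

def build_chat_request(model, user_message, system_prompt=None):
    """Build a Chat Completions v1 request body.

    The field names ("model", "messages", "role", "content") are defined
    by the OpenAI Chat Completions API and are also understood by
    OpenAI-compatible servers such as Ollama.
    """
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages}

# The same body can be POSTed to https://api.openai.com/v1/chat/completions
# or to an OpenAI-compatible self-hosted endpoint.
body = build_chat_request("gpt-3.5-turbo", "Translate 'hello' to German.")
print(json.dumps(body, indent=2))
```

Because several open-source servers implement the same API, switching between OpenAI and a self-hosted model is only a matter of changing the endpoint URL and model name.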

Licensing

No license needed

Installation

Go to the PBX Manager and open the "AP app installer" plugin. The App Store is shown on the right panel. Hint: if you access it for the first time, you will need to accept the "Terms of Use of the innovaphone App Store".

  • In the search field in the top right corner of the store, search for "myApps Assistant" and click on it
  • Select the proper firmware version, for example "Version 14r2", and click on Install
  • Tick "I accept the terms of use" and continue by clicking on the yellow Install button
  • Wait until the installation has finished
  • Close and reopen the PBX Manager to refresh the list of available AP plugins
  • Click on the "AP assistant" plugin, then on "+ Add an App" and then on the "Assistant API" button
  • Enter a "Name" that is used as the display name (all characters allowed) and a "SIP" name that is used as the administrative name (no spaces, no capital letters), e.g. Name: Assistant API, SIP: assistant-api
  • Fill in the fields "Remote Service URL", "LLM Model" and, if necessary, "API Key" (for more information about these fields see below)
  • Tick the appropriate template to distribute the App (the app is needed at the user object of every user who wants to use the assistant API)
  • Click OK to save the settings; a green check mark indicates that the configuration is valid

myApps Assistant - Client

There is no view/app that the user can explicitly open. The service runs in the background, has no UI, and can be integrated by other apps. In order for a user to have access to the assistant service, their user object must have access to the app object of the assistant app.

The client side of the app service receives requests from all other apps via the local JavaScript API and sends them to the service. The result of the request is returned to the original app via the local JavaScript API.

myApps Assistant - App Service

The App Service performs tasks in the following areas:

  • Question relay to an LLM
  • Language recognition
  • Caching
  • Translations using the LLM (not yet fully tested)

It can be configured in the PBX Manager App.

Reference14r2 Concept App Service Assistant App-Config.png


In order to make the app connect to your LLM provider, you need to configure a few fields.

  • Remote Service URL - The URL where the remote service interacting with the LLM is hosted. If you are using a model hosted by OpenAI, this would probably be https://api.openai.com/v1/chat/completions
  • LLM Model - The actual LLM model you intend to use. There are a couple of open-source models you could use. If you intend to use OpenAI models, this could be something like gpt-3.5-turbo or gpt-4
  • API Key - If you are not using a self-hosted model, you might need an API key to make successful HTTP requests to your provider. This key is provided by your LLM provider (e.g. OpenAI: https://platform.openai.com/signup )
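How the three configured fields map onto an actual backend request can be sketched as follows. This is a hypothetical illustration using Python's standard library, not the App Service's own code; the function name and defaults are assumptions.

```python
import json
import urllib.request

def make_llm_request(service_url, model, prompt, api_key=None):
    """Create an HTTP request for the configured LLM backend.

    service_url, model and api_key correspond to the "Remote Service URL",
    "LLM Model" and "API Key" fields of the PBX Manager plugin.
    """
    headers = {"Content-Type": "application/json"}
    if api_key:  # self-hosted servers typically need no key
        headers["Authorization"] = "Bearer " + api_key
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(service_url, data=body,
                                  headers=headers, method="POST")

# Sending the request requires a reachable backend, e.g.:
# with urllib.request.urlopen(make_llm_request(
#         "https://api.openai.com/v1/chat/completions",
#         "gpt-3.5-turbo", "Say hello", api_key="sk-...")) as resp:
#     answer = json.load(resp)["choices"][0]["message"]["content"]
```

Note that the Authorization header is only set when an API key is configured, which matches the plugin treating the "API Key" field as optional for self-hosted models.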


Language recognition

An offline API is offered to recognize the language used in a string. This recognition is also used for translations to avoid unnecessary translations (e.g. EN to EN translations are prevented). This payload will not be sent to any backend service, thus avoiding unnecessary costs.
A few safeguards make sure that a recognized language is only considered if there is enough confidence:

  • Strings shorter than 6 words are skipped, since the chance of false positives is relatively high
  • Strings without enough word matches to be confident of any language are skipped, also because of the high chance of false positives
  • Strings in which more than one language has a nearly equal number of word matches are skipped
FIXME: supported languages
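The three safeguards above can be sketched as a simple word-matching heuristic. The word lists, thresholds and function name here are hypothetical; the actual lists and confidence limits used by the App Service are not documented.

```python
# Hypothetical stop-word lists; real detectors use much larger lists.
WORD_LISTS = {
    "en": {"the", "and", "is", "of", "to", "that", "it", "with", "for", "in"},
    "de": {"der", "die", "das", "und", "ist", "von", "zu", "mit", "nicht", "ein"},
}

def recognize_language(text, min_words=6, min_matches=2, min_lead=2):
    """Return the detected language code, or None if confidence is too low."""
    words = text.lower().split()
    if len(words) < min_words:          # short strings: high false-positive risk
        return None
    scores = {lang: sum(w in wl for w in words)
              for lang, wl in WORD_LISTS.items()}
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    best, second = ranked[0], ranked[1]
    if best[1] < min_matches:           # not enough matches for any language
        return None
    if best[1] - second[1] < min_lead:  # two languages score nearly the same
        return None
    return best[0]
```

Since the matching runs entirely offline against local word lists, no payload ever leaves the App Service for recognition, which is what keeps this step free of backend costs.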

Caching

To save costs, all translations are cached in the App Service's database. If a string is translated multiple times, only the first translation is carried out by the backend, and the translated version is saved in the cache. A second translation is therefore free of charge and performs better.

Translations against the backend

Translation requests that cannot be handled by the local cache are forwarded to the configured translation backend. After successful translation, the translated version is kept in the cache for future requests.
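The cache-then-backend flow described above can be sketched as follows. This is a minimal in-memory illustration; the real App Service persists translations in its database, and the class and backend callable here are assumptions for the example.

```python
class TranslationCache:
    """Cache-first translation: ask the backend only on a cache miss."""

    def __init__(self, backend):
        self.backend = backend  # callable: (text, target_lang) -> translation
        self.cache = {}

    def translate(self, text, target_lang):
        key = (text, target_lang)
        if key in self.cache:       # repeat request: free of charge
            return self.cache[key]
        result = self.backend(text, target_lang)  # paid backend call
        self.cache[key] = result    # keep for future requests
        return result

# Stub backend that counts how often it is actually called.
calls = []
def fake_backend(text, lang):
    calls.append(text)
    return "<%s:%s>" % (lang, text)

tc = TranslationCache(fake_backend)
tc.translate("hello", "de")
tc.translate("hello", "de")  # served from the cache
print(len(calls))  # 1
```

The second call never reaches the backend, which is exactly why repeated translations are both free and faster.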

FIXME: supported languages

Troubleshooting

To troubleshoot this App Service, you need the trace flags App, Database and HTTP-Client in your App instance.

Related Articles