The AZN team has integrated OpenAI's ChatGPT chatbot into the Customer Intelligence Suite (CIS), which lets customers collect and consolidate business touchpoints such as reviews, surveys, and support chats in one place.
ChatGPT is an AI-based natural language processing tool designed to provide information and answer questions through a conversational interface.
At the current stage of integration, AZN has added ChatGPT to the CIS feedback section, where it helps operators draft responses based on the information it was given beforehand. AZN does not use the OpenAI service directly; requests go through the Microsoft Azure layer, which makes it possible to configure the chatbot.
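To illustrate what calling the model through that Azure layer can look like, here is a minimal sketch using the official openai Python SDK. The endpoint, deployment name, environment variables, and system message are placeholders, not AZN's actual configuration.

```python
import os
from openai import AzureOpenAI

# Hypothetical configuration: endpoint and deployment name are placeholders.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def draft_reply(customer_message: str) -> str:
    """Ask the Azure-hosted chat model for a draft reply to a customer message."""
    response = client.chat.completions.create(
        model="cis-feedback-gpt35",  # name of the Azure deployment (assumed)
        messages=[
            {"role": "system", "content": "You are a support assistant for the CIS feedback section."},
            {"role": "user", "content": customer_message},
        ],
        temperature=0.3,
    )
    return response.choices[0].message.content
```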
At the moment the chat works only with English text, as the integration is still under active development. Support for additional languages is planned.
How does ChatGPT work for us?
For ChatGPT to work correctly, our team gives it a system message so that it understands who it is, what it can and cannot do, how it should respond, and how to act in a given situation. The team then supplies it with examples taken from real operator–client interactions, edited to improve the data quality and help the bot produce a suitable result.
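A minimal sketch of how such a prompt could be assembled from a system message and hand-edited examples; the wording of the system message and the example exchange are illustrative, not AZN's actual prompt.

```python
# Illustrative only: the system message and examples are placeholders,
# not the actual prompt used in CIS.
SYSTEM_MESSAGE = (
    "You are a feedback assistant for CIS support operators. "
    "Draft polite, factual replies to customer feedback. "
    "Do not make commitments on behalf of the company."
)

# Hand-edited examples taken from (anonymized) operator-client interactions.
FEW_SHOT_EXAMPLES = [
    ("The delivery was two days late.",
     "We are sorry the delivery arrived late. Thank you for letting us know; "
     "we have passed this on to our logistics team."),
]

def build_messages(customer_message: str) -> list[dict]:
    """Combine the system message, few-shot examples, and the new request."""
    messages = [{"role": "system", "content": SYSTEM_MESSAGE}]
    for user_text, assistant_text in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": user_text})
        messages.append({"role": "assistant", "content": assistant_text})
    messages.append({"role": "user", "content": customer_message})
    return messages
```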
When a client sends a request, the bot immediately generates a response that the operator could use in the context of that request. The ChatGPT result appears in the review case in CIS, and the client cannot see it until the operator approves it. To help the bot learn, the operator can edit the proposed text before sending the answer to the user. After the trial period, this should reduce the number of requests operators need to process.
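One way this approve-before-send flow could be modeled, as a sketch rather than CIS's actual data model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DraftSuggestion:
    """A model-generated draft attached to a review case; invisible to the client until approved."""
    case_id: str
    draft_text: str                    # what ChatGPT proposed
    edited_text: Optional[str] = None  # operator's corrections, if any
    approved: bool = False

    def final_text(self) -> Optional[str]:
        """Return the text to send to the client, or None if not yet approved."""
        if not self.approved:
            return None
        return self.edited_text if self.edited_text is not None else self.draft_text
```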
After editing, a note appears in the system that the operator responded to the client's request based on the ChatGPT result. Additional system messages show exactly what changes were made: what was edited and by whom.
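A rough idea of how such an audit entry could be produced, using Python's standard difflib to capture what changed; the record structure is an assumption, not the CIS schema.

```python
import difflib
from datetime import datetime, timezone

def audit_entry(draft_text: str, sent_text: str, operator: str) -> dict:
    """Record who edited a ChatGPT draft and exactly what changed."""
    diff = "\n".join(
        difflib.unified_diff(
            draft_text.splitlines(),
            sent_text.splitlines(),
            fromfile="chatgpt_draft",
            tofile="operator_reply",
            lineterm="",
        )
    )
    return {
        "edited_by": operator,
        "edited_at": datetime.now(timezone.utc).isoformat(),
        "based_on_chatgpt": True,
        "changes": diff,  # empty string means the draft was sent as-is
    }
```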
There are several integration paths and ChatGPT models, GPT-3.5-Turbo and GPT-4, which differ in capability and pricing. These models can be adapted to a specific task, including content creation, summarization, semantic search, and natural-language-to-code conversion.
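The model choice can be kept in configuration rather than code. A sketch with deployment names that are purely illustrative:

```python
# Hypothetical Azure deployment names; the mapping is illustrative only.
MODEL_DEPLOYMENTS = {
    "gpt-3.5-turbo": "cis-feedback-gpt35",  # cheaper, suits routine drafting
    "gpt-4": "cis-feedback-gpt4",           # more capable, higher cost
}

def pick_deployment(prefer_quality: bool) -> str:
    """Choose the Azure deployment to call based on a simple quality/cost switch."""
    return MODEL_DEPLOYMENTS["gpt-4" if prefer_quality else "gpt-3.5-turbo"]
```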
Which bot model to connect depends on the context you have and its volume. If the organization is large and has many departments, subsidiary projects, services, employees, reports, and so on, the chatbot should be able to surface all of this information. For that, there is a way of supplementing the model by adding data sources: creating a resource where this data will be stored and creating resources for cognitive search, as sketched below.
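A minimal sketch of that cognitive-search step, assuming an Azure AI Search index already populated with company documents; the index name, field names, and credentials are placeholders.

```python
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Placeholder index and field names; the real schema depends on the data sources.
search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="company-knowledge",
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)

def retrieve_context(question: str, top: int = 3) -> str:
    """Fetch the most relevant passages to ground the chatbot's answer."""
    results = search_client.search(search_text=question, top=top)
    passages = [doc["content"] for doc in results]
    return "\n\n".join(passages)

# The retrieved passages would then be appended to the system message
# before calling the chat model, so answers stay grounded in company data.
```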
Cognitive search uses a semantic search approach, which costs more than keyword search, but it is suitable for large solutions and the rich contexts that can be built on top of them.
In fact, models can be trained from scratch on carefully collected, high-quality data, but this is more expensive than using already trained models. We use pre-trained models that already know how to interact with users and put them into our context with our instructions.
The integration of the ChatGPT chatbot helps operators, and they in turn help us improve the model by editing its responses. This mutual collaboration lets us analyze the modified responses, build statistics, and understand how best to retrain the model so that it produces even better results and works more efficiently.
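As an example of the kind of statistics meant here, a sketch that measures how heavily operators edit the drafts, using Python's standard difflib; the 0.8 threshold and the data shape are assumptions.

```python
import difflib

def edit_ratio(draft: str, sent: str) -> float:
    """Return how much of the draft survived into the sent reply (1.0 = unchanged)."""
    return difflib.SequenceMatcher(None, draft, sent).ratio()

def summarize(cases: list[tuple[str, str]]) -> dict:
    """Aggregate simple statistics over (draft, sent) pairs; 0.8 is an arbitrary cut-off."""
    ratios = [edit_ratio(draft, sent) for draft, sent in cases]
    return {
        "cases": len(ratios),
        "avg_similarity": sum(ratios) / len(ratios) if ratios else 0.0,
        "heavily_edited": sum(r < 0.8 for r in ratios),
    }
```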