Knowledge Base

AI, and specifically "generative AI" like ChatGPT, is quickly working its way into every area of our world. Flow XO has a variety of powerful tools you can use to harness these capabilities in your conversational agents, but they all require a non-trivial amount of work to learn how to use Large Language Models, craft the right prompts, and process the outputs. What if you just want your bot to answer your users' questions based on content you already have available about your company, product or service?

That's where the Flow XO Knowledge Base feature comes into play. With a Flow XO Knowledge Base, all you need to do is add some content and feed us a question from your user, and we'll take care of the rest: querying your documents for the ones most relevant to the question being asked, formatting those documents in a way the AI can use, then processing the response from the AI so that it can be sent straight back to your customer. We'll even send back links to the original sources, so your users can read more about the topic they are asking about. Here's an example interaction with the Knowledge Base we created from our support website:

If the answer to the user's question is not contained in the source material, the AI won't answer the question:

How it Works

When documents are added to your Flow XO knowledge base, they are encoded and stored in a specialized type of database that can be queried by the concepts found within the query string rather than by keywords, unlike a traditional text search. When you provide a user question that you want answered, that question is encoded using the same natural language processing techniques as the documents and sent to the document database, which returns the documents that share concepts with the query. Flow XO then stitches those documents together into a block of "reference data" called a context, combines that context with a carefully designed prompt, and sends it to a "large language model" or LLM (ChatGPT or GPT-4). Once we receive a response, we detect which of the documents were actually used in the answer, and return those documents along with their source links so you can display the result to your user.
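To make the idea concrete, here is a toy sketch of that retrieval-then-prompt pipeline. It is not Flow XO's actual implementation: it substitutes simple word-count vectors and cosine similarity for the real natural language embeddings, but the shape of the process is the same.

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words frequency vector.
    Real systems use a neural embedding model that captures concepts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

documents = [
    "Flow XO lets you build chatbots without writing code.",
    "Upgrades and downgrades happen immediately with pro-rata credit.",
]

def query_knowledge_base(question, docs, top_k=1):
    """Rank documents by similarity to the question; return the best matches."""
    q = embed(question)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:top_k]

# The top matches are stitched into a "context" block for the LLM prompt.
context = "\n".join(query_knowledge_base("how do upgrades work", documents))
prompt = f"Answer using only this reference data:\n{context}\n\nQuestion: how do upgrades work"
```

In the real pipeline, the prompt assembled at the end is what gets sent to ChatGPT or GPT-4.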

Getting Started

Using the Flow XO knowledge base is extremely simple. There are only three things you need to do:

  1. Create a knowledge base and add your documents
  2. Connect your OpenAI account
  3. Create a flow that captures your user's question and sends it to our "Answer Question" task.

We'll go through these steps in detail next. Note that if you need more control over the process, for example if you want to use your own prompts, or don't want to generate an answer but just return links to your users, you can query the knowledge base without invoking the AI. Our AI-powered "Query Knowledge Base" task matches documents to the meaning of your user's question and returns the results as a collection of documents, or as a block of text you can use in your own templates. You can combine this with our OpenAI tasks to completely tailor the interaction with the language model.

1. Create a Knowledge Base, add some documents

The first step is to create your knowledge base. You can have as many knowledge bases as you need, and choose one or all of them to query when answering questions depending on your needs.

Configuring a new knowledge base is simply a matter of choosing a name and a description, and where your initial documents should come from. If you want to add your own documents manually, choose "Manual". You will have the opportunity to upload documents later as well, after your knowledge base is created.

Note that every user question always searches all of your documents regardless of category or topic, so you don't ever really need more than one knowledge base (KB). The ability to create multiple KBs is provided in case you want separate knowledge bases for different clients, different departments, etc. that you will want to query individually.

Website - This setting will read your website and import all the webpages it finds. It will also automatically schedule a regular job to check for new pages periodically.
Upload - Upload PDFs, Word documents, etc.
Webpage - Reads the text from one or more specific pages.
Manual - Do not import any documents. You will add documents later.

Add some documents

Once your knowledge base is created, click 'View' to navigate to the KB home page. To add a document, click 'New Document'.

NOTE: You can get a lot of value out of even just one document. In fact, if your needs are simple or you want your chatbot to answer questions about many aspects of one topic, you may only have one large document, and it will work just as well as if you had one hundred smaller documents. The main reason to split your content into multiple documents is to map them to individual URLs, because when the AI answers questions based on your data and cites sources, it can only cite an entire document as a source and not individual portions of a document.

Connecting your OpenAI account

Now that you have a document, we are almost ready to ask it questions. First, we need to connect an OpenAI account to your Knowledge Bases. To do that, click on "Knowledge Bases" on the main side navigation to go to your list of knowledge bases.

Then click 'Connect your OpenAI account'

Fill in your OpenAI API key and click 'Create Connection'. If you don't already have an OpenAI account, you can create one here. You can then access your API key

NOTE: We are considering providing direct access to the OpenAI API as a paid add-on to your Flow XO account so that you don't have to establish your own account with OpenAI. If this is something you think you would be interested in, let us know.

Once you click 'Create Connection', you are ready to ask your knowledge base questions.

Getting Answers

Querying your Knowledge Base in the user interface

You can test your Knowledge Base directly from the Flow XO user interface before you even build a flow to expose the knowledge base to your users. To do that, either from the Knowledge Base home page or from an individual Knowledge Base, you can click 'Search' or 'Test':

Search allows you to submit a question to your Knowledge Base and shows you the most relevant matching documents.
Test allows you to ask your knowledge base a question and returns an AI generated answer. Note that 'Test' is only available once you have connected an OpenAI account. 

Querying your Knowledge Base from your flows

Let's create a simple flow to prompt your user to ask a question, and provide an AI generated response. 

Create a blank flow, and for this example, let's use a 'Catch-All' trigger.

Using a catch-all trigger means that anything the user types into your bot that isn't handled by a specific New Keyword trigger will be sent to our knowledge base flow. This setup makes it very convenient to test our knowledge base because we can just type the questions we want to ask directly into the bot without a lot of prompting. You are free to get questions from your user however you want, though, and can include the 'Answer a Question' task in any flow at any time.

Now, we need to add an 'Answer Question' step

Use the 'Message' from the Catch-All trigger as the question to ask the knowledge base. In this example we are treating every message to the bot as a question for the knowledge base. In your application, you may want to provide some additional prompting or structure to guide the user toward asking a question your knowledge base can answer.

For now, you can leave all the other fields at their defaults. Now we need to send the response back to the user. Choose a simple "Send a Message" step and set the message to be the response from the Knowledge Base:

Make sure to enable your flow:

Now we can test it! Click on the 'Talk Bubble' in the bottom right of your screen to open the test console, and type a question. After a few moments, you should get an answer back:

Depending on what documents you put in your knowledge base, you probably got a strange message, or 'I don't know', as the first message the bot sent, before you even typed a question. That is because when a user first connects to your bot, a special start message with the word 'start' (or '/start' on Telegram) is sent to Flow XO. Because we are using a catch-all trigger, that start message was interpreted as a question and the knowledge base tried to answer it.

We can fix this by adding a filter to our Catch-All trigger to ignore the start message:

When you're done, the filter should look like this:

Including Sources

It is already pretty useful that your bot can answer user questions from your knowledge base without any human intervention. But it would be better if the answers included links to the source material. There are a few reasons why this is important in a customer service scenario.

  • The AI may not have told the user everything they need to know to fully answer their question, so they may need to click the link to find out additional details
  • The user may need to send a link to the documentation to a colleague, such as a developer, or save it for later use
  • You likely want to understand where the AI is getting its answers from. If the AI gives an incorrect or inappropriate answer to a user or during your testing, you will need to understand where the answer came from so you can make a correction.

Fortunately, including sources in your responses is very easy. In addition to the raw text response we get from the AI, we also get a list of sources in several formats. Which you choose depends on the platform you are using. On the Web, Messenger, and WhatsApp, you will want to send sources in plain text or Markdown. On Telegram, you might want any of those options depending on how your bot is configured. On SMS, only plain text is appropriate.
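Flow XO hands you these formats ready-made, but if you were assembling links yourself (say, in a webhook step), the three styles map to source titles and URLs roughly like this sketch. The field names and sample URL here are illustrative, not Flow XO's actual output fields:

```python
def format_sources(sources, style="markdown"):
    """Render a list of {title, url} sources for a given channel.
    'markdown' suits Web/Messenger/WhatsApp-style channels; 'plain' suits SMS."""
    if style == "markdown":
        return "\n".join(f"- [{s['title']}]({s['url']})" for s in sources)
    if style == "html":
        return "\n".join(f'<a href="{s["url"]}">{s["title"]}</a>' for s in sources)
    # Plain text: one "Title: url" line per source.
    return "\n".join(f"{s['title']}: {s['url']}" for s in sources)

sources = [{"title": "Pricing", "url": "https://example.com/pricing"}]
links_markdown = format_sources(sources)            # "- [Pricing](https://example.com/pricing)"
links_plain = format_sources(sources, "plain")      # "Pricing: https://example.com/pricing"
```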

To add sources to the response, edit the 'Send a Message' action and choose the format you want to include:

When you are done, your Send a Message template will look like this:

Now when you ask the knowledge base questions, you should get one or more links back. Of course we only have one document in our knowledge base right now, so it will always be the same link:

Reviewing Question & Answer History

Unlike more traditional and mechanical chatbot responses, AI responses are not entirely under your control. Although the AI vendors are constantly working to improve, and we take certain steps to help prevent inaccurate or made-up responses, there is an inherent and unavoidable level of uncertainty when using AI to generate content that will be sent directly to your users. Because of this, it is important to regularly review the questions users are asking and the responses the AI is giving, to make sure the AI is giving correct, helpful answers to users' questions. In cases where it may not be, you can add more specific documentation to your Knowledge Base to address the specific questions users are asking and create better results. Of course, it's a great idea to do this anyway, to keep a pulse on what your users want and need.

In Flow XO, you have access to the full log of questions and answers sent to the AI across all your Knowledge Bases. To access it, navigate to the main Knowledge Base list (click the Knowledge Base link in the main left navigation of the app), and choose 'Q & A History':

This will show you the log of all questions and corresponding responses, as well as how long the language model took to respond. This 'duration', or time to get a response, is useful in determining how often and how quickly to send "status updates" (see the section below).

Letting your user know what to expect

Generative AI is very, very powerful. It can also be pretty slow, especially relative to what users may be used to with more mechanical chatbots. It is always a good idea to let your users know it might take a moment to get a response. We recommend always sending a message immediately before your 'Answer Question' step to let the user know to be prepared for a short delay. Sometimes, especially if you are using GPT-4 with a large context (don't worry if you don't know what that means), the AI can be *very slow*. You probably won't run into this unless you are having it generate huge responses or are providing it a huge amount of context, but if you do, you might want to consider using Subflows to periodically remind the user that the AI is still working and to be patient. We discuss techniques for that here.

Configuring the Answer Question task

We just used the defaults of the Answer Question task for our quick example, but it has some settings you can use to tailor the AI. Let's take a look:

Question - The question that the AI should answer
Additional Instructions - here you can introduce your bot, give it a name if you like, and provide any special instructions the AI should follow when generating a response. Note that ChatGPT may not always follow your additional instructions properly, whereas GPT-4 is usually very good about respecting them.
Unknown Response Message - what message would you like to send to the user if the AI cannot accurately answer the question?
AI Connection - This lets you select the AI connection you created earlier. You can actually create as many of these as you like, using different API keys. Normally, you won't ever have more than one. But if you are an agency building bots for clients and want to keep costs segregated, or you have different accounts for different environments such as development and production, you might want to have more than one AI Connection.
Model - Here you choose the OpenAI model you want to use. By default, ChatGPT (gpt-3.5-turbo) is used for the knowledge base. But if you are familiar with OpenAI and you want to use a different model, for example gpt-3.5-turbo-16k to include more source documents in your context, or gpt-4 for potentially vastly improved answers at the expense of speed and cost, you can select the model you want here.
Knowledge Base - by default, the Answer Question task will query ALL your knowledge bases at once to construct its answer. You can select only a specific knowledge base if you want. If you only have one knowledge base, this setting doesn't do anything.
Secondary Knowledge Base - you can configure a secondary knowledge base to query from. A very useful pattern is to keep one knowledge base for documents automatically scraped from your website, and another knowledge base for manual documents you create to help your bot answer questions properly where your website data isn't sufficient. You can use this secondary knowledge base slot to indicate which KB should be used as the refinement knowledge base.
Maximum response length - this is the maximum number of tokens to return in the response. If your answers are getting cut off, or are too brief, increase this number. The lower the number, the faster and cheaper your AI usage will be, so it may take some experimentation to find just the right response size.
Maximum source documents - here you can limit how many source documents should be considered in the response. This setting, too, will require some experimentation, and impacts your OpenAI bill. You want to use as small a number as you can that produces good results. Usually, the answer will be in the first couple of search results, so you can keep this number rather small. If the AI isn't finding the answers to the questions your users are asking, however, you can increase this to provide it with more knowledge on each query. Keep in mind, though, that how many search results are actually used by the AI will also depend on the model you selected. For example, gpt-3.5-turbo-16k can use up to 16,000 tokens (about 64,000 characters) but gpt-3.5-turbo can only use 4,000 tokens, or about 16,000 characters. So if you are using gpt-3.5-turbo (the default) and set your maximum source documents to 100, most of the search results will get discarded.
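The arithmetic behind that last point can be sketched as follows. This is a rough illustration using the ~4 characters per token rule of thumb, not Flow XO's actual selection logic, and the reserved budget is an assumed number:

```python
# Rough rule of thumb: 1 token is roughly 4 characters of English text.
MODEL_CONTEXT_TOKENS = {
    "gpt-3.5-turbo": 4_000,
    "gpt-3.5-turbo-16k": 16_000,
}

def estimate_tokens(text):
    """Very rough token estimate; a real tokenizer is more accurate."""
    return len(text) // 4

def docs_that_fit(docs, model, reserved=1_000):
    """Keep source documents in order until the model's context window is full.
    `reserved` (an assumed figure) leaves room for the prompt and the answer."""
    budget = MODEL_CONTEXT_TOKENS[model] - reserved
    kept = []
    for doc in docs:
        cost = estimate_tokens(doc)
        if cost > budget:
            break  # remaining documents would be discarded
        kept.append(doc)
        budget -= cost
    return kept

docs = ["x" * 8_000] * 10  # ten documents of roughly 2,000 tokens each
small = docs_that_fit(docs, "gpt-3.5-turbo")      # only a fraction fit in 4k
large = docs_that_fit(docs, "gpt-3.5-turbo-16k")  # many more fit in 16k
```

The takeaway: raising Maximum source documents only helps up to the context window of the model you chose.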

Querying the Knowledge Base without generating an AI response

While the easiest way to use the Knowledge Base is through the 'Answer Question' task we just discussed, we also provide a task that queries the knowledge base and directly returns the matching documents. This is called 'semantic search' because we don't match documents by keywords, but by comparing the meaning of the question with the meaning of the words found in the results. Simply querying for documents is much faster, because it skips the generative AI part of the process, which is the slow part. Here are some reasons you might want to directly query your knowledge base:

  • You don't need to summarize the answer to the user's question, you just want to give them links they can explore on their own
  • You want to feed your documents into your own OpenAI prompts, using our OpenAI integration
  • You want to perform a semantic search of documents to use in your flows in some creative way we haven't thought of

Directly querying the Knowledge Base is as easy as using the Answer Question task:

Question - this is the same as Answer Question - it's the question you want to search for the answer to. Of course, it doesn't really HAVE to be a question. The search will return any document segments related to the input, whether or not the input is actually a question. 
Knowledge Base - which knowledge base to search. Optional. When not specified, ALL knowledge bases are searched at once.
Maximum Documents - the maximum number of document segments to return
Generate Context - whether or not to combine the content of the search results into a single block of text that can easily be fed into a language model prompt.
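If you enable Generate Context and pass the result into your own OpenAI task, your prompt template might be assembled along these lines. The wording and structure here are purely illustrative, not a prescribed format:

```python
def build_prompt(context, question,
                 instructions="Answer only from the reference data below."):
    """Assemble a custom LLM prompt from a Knowledge Base 'context' block.
    In Flow XO you would do the equivalent with a message template."""
    return (
        f"{instructions}\n\n"
        f"Reference data:\n{context}\n\n"
        f"Question: {question}\n"
        f"Answer:"
    )

prompt = build_prompt(
    "What is an interaction?\nA triggered flow is counted as one interaction.",
    "What counts as an interaction?",
)
```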

This task produces a few different kinds of outputs you can use:

  •  The list of sources as Markdown, HTML or plain text
  • A single block of text called 'context' that combines all the document segments together, for use with an LLM prompt
  • A JSON formatted array of the search results
  • Whether or not the AI felt it was able to answer the question accurately
  • The input language the user used to ask the question
  • Whether or not the user seems upset

Here's an example of the output:

Document Links (Plain text)

Document Links (Markdown)
- [Pricing](

Document Links (HTML)
<a href="">Pricing</a>

What is an interaction?
A triggered flow (e.g. receiving a message) is counted as one interaction, regardless of the number of messages or actions in response. Any flow that is filtered out doesn’t count.
How do logs expire?
Log entries expire after a set period, depending on your plan. So if you have 3 months of logs, each log entry will be removed after 3 months.
How can I track my interaction limit?
We’ll send you an email when you reach 75% of your limit, and again when you’ve used up all of your interactions. It’s quick and easy to add more if you need to.
What’s the trigger interval?
Some triggers are instant, for example receiving messages. For others (such as checking for new emails), we need to check for new items periodically. The interval is 5 minutes on the free plan, 1 minute on the standard plan.
Is it easy to cancel if I want to?
Yes, it’s as simple as downgrading to our free plan. The downgrade will happen immediately and you’ll receive a pro-rata refund of any charges made for the current billing month.
What is a bot or flow?
Each separate presence on a platform is called a bot. A flow defines one command that your bots can respond to. Inactive flows aren’t counted.
How do I upgrade or downgrade?
Upgrades and downgrades happen immediately. You’ll be charged a pro-rata amount for upgrades, and receive pro-rata credit for downgrades which will be applied against your next invoice.
Do you send invoices/receipts?
Yes, by email each time we make a charge. Our email invoices/receipts include your business billing address for tax purposes.
Do I need a credit card to sign up?
No you don’t, you only need to tell us your name and email address to get an account. We’ll only ask you for payment details when you move to the paid plan.
What support do you offer?
Source :

Start for free and upgrade when you need to
If you're ready to get started creating your own bots and workflows then sign-up today for free. As you grow, simply upgrade your account to a paid option.
...<remaining context>

[{"documentId":"704141fe-182e-11ee-a2c9-97b24f15fb9b","documentStoreId":"61ccd174-182e-11ee-a2c9-5787c1dba0ba","text":"What is an interaction?\nA triggered flow...<rest of content>","metadata":{"source":null,"sourceId":null,"url":"","createdAt":null,"author":null,"title":"Pricing"}}, ...<remaining documents>]

Is Unknown Response
false

Input Language
en

Is User Upset
false

That's it for the knowledge base feature! It's an extremely powerful tool in your conversational toolbox, and we can't wait to see how you use it. As always, feel free to reach out with any questions or feedback.

Happy flowing!

Still need help? Contact Us