AI Model

AI Model with ChatGPT✨

You can choose the GPT model that is used to answer questions according to your requirements.

A few differences between the GPT-4 and GPT-3.5 models:

  • GPT-4 is more robust and provides more precise answers, but it is slower and more expensive, whereas GPT-3.5 is faster and suits most cases;
  • GPT-4 generally offers higher performance and handles complex tasks better than GPT-3.5;
  • GPT-4 tends to provide more accurate and relevant answers to questions than GPT-3.5;
  • GPT-4 is believed to have been trained on a larger amount of data, which contributes to its ability to understand and generate text.
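For illustration, the model choice ultimately comes down to a single parameter on the request sent to OpenAI. The sketch below uses the OpenAI Python SDK directly; it is an assumption about the underlying integration, not this product’s own code:

```python
# Hedged sketch: the selected model is a single parameter in an OpenAI
# Chat Completions call. Model names below are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(question: str, model: str = "gpt-3.5-turbo") -> str:
    """Send one question to the selected model and return the answer text."""
    response = client.chat.completions.create(
        model=model,  # e.g. "gpt-4" for more precise but slower, costlier answers
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

# GPT-3.5 is faster and suits most cases; switch to GPT-4 for complex tasks.
print(ask("How many people live on Mars now?", model="gpt-4"))
```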

Context proof level:

You can specify the creativity level for the model. The closer the value is to zero, the more creative the model becomes, but it does not stick strictly to the context provided in the uploaded data. The closer the value is to one, the more closely the model sticks to the provided data.

Example question:

  • “How many people live on Mars now?”

Example answer:

  • “At present, humans do not reside on Mars. However, numerous space agencies and private companies are actively working on missions and projects that could lead to future human habitation. Advanced robotic missions have been deployed to assess the planet’s resources, climate, and potential landing sites. Innovative ideas, such as 3D-printed habitats, underground colonies, and terraforming technologies, are being explored to make Mars more habitable. The dream of living on Mars is fueling creative solutions and pushing the boundaries of human exploration.”
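One way to picture how this slider could influence the request is as a trade-off between sampling creativity and a strict “use only the context” instruction. The mapping below is a minimal sketch under that assumption, not the documented implementation:

```python
# Hedged sketch: one plausible way a "context proof" slider could shape the
# request. The exact mechanism used by the product is not documented here;
# this only illustrates trading creativity against strict grounding.

def build_request(question: str, context: str, context_proof: float) -> dict:
    """context_proof: 0.0 = most creative, 1.0 = stick strictly to the data."""
    temperature = round(1.0 - context_proof, 2)  # assumed inverse mapping
    system = (
        "Answer using ONLY the provided context."
        if context_proof >= 0.5
        else "You may extrapolate beyond the provided context when helpful."
    )
    return {
        "model": "gpt-3.5-turbo",
        "temperature": temperature,
        "messages": [
            {"role": "system", "content": f"{system}\n\nContext:\n{context}"},
            {"role": "user", "content": question},
        ],
    }

print(build_request("How many people live on Mars now?", "Mars mission notes...", 0.9))
```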

Semantic balance option:

Neutral +1 is the optimal state.

For a search experience that mirrors natural language understanding, taking synonyms, context, and related concepts into account, adjust the slider closer to “Semantic vector search.” 💡


For a more straightforward search relying solely on the words in the provided data, adjust the slider closer to “Classic text search.”🤖
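Conceptually, the slider blends two ranking signals: a semantic (vector) similarity score and a classic keyword-overlap score. A minimal sketch of that blending, with toy scoring functions rather than the product’s actual ranking code:

```python
# Hedged sketch of the idea behind the slider: blending a semantic (vector)
# similarity score with a classic keyword-overlap score.

def keyword_score(query: str, text: str) -> float:
    """Fraction of query words that literally appear in the text."""
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

def blended_score(semantic_sim: float, keyword_sim: float, balance: float) -> float:
    """balance: 1.0 = pure semantic vector search, 0.0 = pure classic text search."""
    return balance * semantic_sim + (1.0 - balance) * keyword_sim

query = "upcoming Mars missions"
chunk = "NASA and private companies plan several Mars missions this decade."
semantic_sim = 0.82  # would come from comparing embedding vectors
print(blended_score(semantic_sim, keyword_score(query, chunk), balance=0.6))
```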

Knowledge Sources:

Knowledge Sources determine the maximum number of documents or chunks of text used to generate the answer. This lets you control the volume of information from which the model extracts context and details when formulating a response. By adjusting this parameter, you can balance depth of analysis against response speed.

The more Knowledge Sources you specify, the more information the model has to analyse and use when crafting a response. However, this can also increase query processing time. Conversely, if too few Knowledge Sources are specified, the model may not have enough context to generate a comprehensive and accurate answer.

Fine-tuning Knowledge Sources gives you more precise control over which information the model uses to answer queries, which is especially useful when you need to limit the model’s access to information due to resource constraints or specific project requirements.
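In practice this works like a top-k limit on retrieval: the best-matching chunks are kept, up to the configured number, and joined into the context. A minimal sketch (the scores and chunks below are made up for illustration):

```python
# Hedged sketch: "Knowledge Sources" acts like a top-k limit on retrieval.
# The retrieval step itself is simulated here with pre-scored chunks.

def build_context(scored_chunks: list[tuple[float, str]], knowledge_sources: int) -> str:
    """Keep only the best-matching chunks, up to the configured limit."""
    top = sorted(scored_chunks, key=lambda pair: pair[0], reverse=True)
    return "\n\n".join(text for _, text in top[:knowledge_sources])

chunks = [
    (0.91, "Chunk about Mars rovers and landing sites."),
    (0.74, "Chunk about 3D-printed habitats."),
    (0.40, "Chunk about unrelated company news."),
]
# More sources = more context but slower answers; fewer = faster but riskier.
print(build_context(chunks, knowledge_sources=2))
```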



Personality:

Enter the standard “prompt” here to define the assistant’s role and tone.


Advanced settings:


To access additional settings, click the “Advanced Settings” button.

Degree of Originality

Adjust the slider to the desired position; the default position is optimal.

Diversity Range

Adjust the slider to the desired position; the default position is optimal.
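These two sliders most plausibly correspond to the sampling parameters of the underlying model, such as temperature and top_p in the OpenAI API. That mapping is an assumption made here for illustration only:

```python
# Hedged sketch: the two sliders plausibly map onto the model's sampling
# parameters (temperature and top_p in the OpenAI API). This mapping is an
# assumption for illustration, not a documented fact about the product.

def sampling_params(degree_of_originality: float, diversity_range: float) -> dict:
    """Both sliders are assumed to run from 0.0 to 1.0; the defaults are optimal."""
    return {
        "temperature": 2.0 * degree_of_originality,  # OpenAI accepts 0.0-2.0
        "top_p": diversity_range,                     # OpenAI accepts 0.0-1.0
    }

print(sampling_params(degree_of_originality=0.35, diversity_range=1.0))
```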


Number of recent messages to be saved in context
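This setting limits how much of the conversation history is sent with each request. A minimal sketch of keeping only the last N messages:

```python
# Hedged sketch: keeping only the last N messages in the conversation context.

def trim_history(messages: list[dict], keep_last: int) -> list[dict]:
    """Return only the most recent messages so the context stays small."""
    return messages[-keep_last:] if keep_last > 0 else []

history = [
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello! How can I help?"},
    {"role": "user", "content": "Tell me about Mars."},
    {"role": "assistant", "content": "Mars is the fourth planet..."},
    {"role": "user", "content": "When is the next launch window?"},
]
print(trim_history(history, keep_last=3))
```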


Custom Prompt:

Instead of relying on standard queries, you can create your own, more specific ones that better suit your particular task or situation. This approach gives you better control over the interaction with the model and yields more precise and relevant answers. A Custom Prompt combines text and special tags that tell the model the desired context or task.

Examples:

‘Most likely’ before answering. Please take into account the context: {context}

Answer the question: {question}

{currentDateTime}

Always check {currentDateTime} in the background and use this information when providing answers.

When a user asks about upcoming events, check the {currentDateTime}.

Use the {currentDateTime} with the event list to identify the next upcoming event and provide its date and time.
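To see how these tags come together, the sketch below fills {context}, {question}, and {currentDateTime} into a custom prompt template before the request is sent. Only the tag names come from the examples above; the rendering code is an illustrative assumption:

```python
# Hedged sketch: filling the {context}, {question} and {currentDateTime} tags
# of a custom prompt before the request is sent. Only the tag names come from
# the documentation; the rendering code itself is illustrative.
from datetime import datetime

CUSTOM_PROMPT = (
    "Please take into account the context: {context}\n"
    "Answer the question: {question}\n"
    "Current date and time: {currentDateTime}\n"
    "When the user asks about upcoming events, compare them with the current "
    "date and time above and name the next upcoming event with its date."
)

def render_prompt(template: str, context: str, question: str) -> str:
    return template.format(
        context=context,
        question=question,
        currentDateTime=datetime.now().isoformat(timespec="minutes"),
    )

print(render_prompt(CUSTOM_PROMPT, "Event list: launch on 2025-03-01 ...", "What is the next event?"))
```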

A “Custom prompt” is a personalized prompt that replaces the standard one. If this field is filled in, Tone and Role formatting are disabled, ensuring the priority of the custom prompt. Consequently, if a “custom prompt” is present, it overrides the standard “prompt” and is used exclusively.
