NOC AI

Generate text from a prompt

NocAI provides simple APIs to use a large language model to generate text from a prompt. These models have been trained on vast quantities of data to understand multimedia inputs and natural language instructions. From these prompts, models can generate almost any kind of text response, like code, mathematical equations, structured JSON data, or human-like prose.

Quickstart

To generate text, use the chat completions endpoint in the REST API, as shown in the examples below. You can call it from the HTTP client of your choice.

Create a human-like response to a prompt
1curl "https://***/chatCompletions/chat/"
2  -H "Content-Type: application/json"
3  -H "Authorization: Bearer $NOCAI_API_KEY"
4  -d '{
5    "model": "gpt-4",
6    "messages": [
7      {
8        "role": "system",
9        "content": "You are a helpful assistant."
10      },
11      {
12        "role": "user",
13        "content": "Write a haiku about recursion in programming."
14      }
15    ]
16  }'
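
The response format isn't shown in this quickstart. Assuming NocAI follows the common chat-completions convention, a successful call might return JSON along these lines (the field names and the haiku are illustrative only, not the documented schema):

{
  "id": "chatcmpl-abc123",
  "model": "gpt-4",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Function calls itself,\nstack frames settle one by one,\nbase case brings the dawn."
      },
      "finish_reason": "stop"
    }
  ]
}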

Choosing a model

When making a text generation request, the first option to configure is which model you want to use to generate the response. The model you choose can greatly influence the output and impact how much each generation request costs. A large model like gpt-4o offers a very high level of intelligence and strong performance, but has a higher cost per token. A small model like gpt-4o-mini offers intelligence not quite on the level of the larger model, but is faster and less expensive per token.
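
Switching models is just a matter of changing the model parameter in the request body. As a sketch, here's the quickstart request again pointed at the smaller model (assuming gpt-4o-mini is available under that identifier on your account):

curl "https://***/chatCompletions/chat/" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $NOCAI_API_KEY" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [
      {
        "role": "user",
        "content": "Write a haiku about recursion in programming."
      }
    ]
  }'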

Building prompts

The process of crafting prompts to get the right output from a model is called prompt engineering. By giving the model precise instructions, examples, and necessary context information (like private or specialized information that wasn't included in the model's training data), you can improve the quality and accuracy of the model's output.
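
As a sketch of that idea, the request below packs some specialized context into the user message ahead of the question, so the model can answer from information that wasn't in its training data. The policy text and wording here are purely illustrative:

curl "https://***/chatCompletions/chat/" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $NOCAI_API_KEY" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {
        "role": "user",
        "content": "Answer using only the policy below.\n\nPolicy: Support tickets are answered within two business days.\n\nQuestion: How quickly will my ticket be answered?"
      }
    ]
  }'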

User messages

User messages contain instructions that request a particular type of output from the model. You can think of user messages as the messages you might type into ChatGPT as an end user. Here's an example of a user message prompt that asks the gpt-4o model to generate a haiku about programming.

Generate a haiku poem based on a prompt
1curl "https://***/chatCompletions/chat/" 
2    -H "Content-Type: application/json" 
3    -H "Authorization: Bearer $NOCAI_API_KEY" 
4    -d '{
5  model: "gpt-4o",
6  messages: [
7    {
8      "role": "user",
9      "content": [
10        {
11          "type": "text",
12          "text": "Write a haiku about programming."
13        }
14      ]
15    }
16  ]
17}'

System messages

Messages with the system role act as top-level instructions to the model, and typically describe what the model is supposed to do and how it should generally behave and respond. Here's an example of a system message that modifies the behavior of the model when generating a response to a user message:

Modify the model's behavior with a system message
1curl "https://***/chatCompletions/chat/" 
2    -H "Content-Type: application/json" 
3    -H "Authorization: Bearer $NOCAI_API_KEY" 
4    -d '{
5  model: "gpt-4o",
6  messages: [
7    {
8      "role": "system",
9      "content": [
10        {
11          "type": "text",
12          "text": `
13            You are a helpful assistant that answers programming questions 
14            in the style of a southern belle from the southeast United States.
15          `
16        }
17      ]
18    },
19    {
20      "role": "user",
21      "content": [
22        {
23          "type": "text",
24          "text": "Are semicolons optional in JavaScript?"
25        }
26      ]
27    }
28  ]
29}'

This prompt returns a text output in the rhetorical style requested:

Well, sugar, that's a fine question you've got there! Now, in the world of JavaScript, semicolons are indeed a bit like the pearls on a necklace – you might slip by without 'em, but you sure do look more polished with 'em in place. Technically, JavaScript has this little thing called "automatic semicolon insertion" where it kindly adds semicolons for you where it thinks they oughta go. However, it's not always perfect, bless its heart. Sometimes, it might get a tad confused and cause all sorts of unexpected behavior.