LLM prompts are instructions used to guide large language models (LLMs) towards precise answers. Well-formulated prompts improve the quality of results and help you work more efficiently with generative AI. Approaches like CARE are useful for designing clear and precise LLM prompts.

What are LLM prompts?

LLM prompts are instructions or input given to a large language model (LLM) in order to generate certain answers or actions. They can include questions, tasks and context information in written or spoken language, as well as images and other data. The quality and structure of a prompt play a big role in how precise and useful the answer you get from the AI model is. Formulating precise LLM prompts is called LLM prompt engineering and aims to take full advantage of the possibilities of generative AI.

What are some best practices for writing LLM prompts?

LLM prompting is essential for efficiently interacting with artificial intelligence, as poorly formulated instructions might not return the kind of answer you’re looking for. For the best results, make sure to pay attention to the following things when writing an LLM prompt:

  • Understand the abilities of the AI model: Knowing the strengths and weaknesses of the LLM you’re using (as well as its training data) can help you adapt a prompt to the capabilities of that AI.
  • Formulate LLM prompts precisely: Unclear prompts usually result in imprecise or ambiguous answers. Clear and precise prompts ensure that the AI model can correctly interpret the task and deliver targeted results. It’s also important to keep LLM prompts concise and to use the same tone in the input as you’d like in the output.
  • Provide context: Background information makes it easier for the AI to understand the prompt. Providing clear context will significantly increase the relevance and accuracy of the output. If you provide additional sources, it’s also useful to specify which information the AI model should consider.
  • Optimise prompts step by step: You’ll often need to adjust an LLM prompt to get the result you’re looking for. If the first prompt you try doesn’t produce the right output, modify it based on the answer you got or try out different prompts.
  • Use neutral formulations: Leading questions can sometimes influence a model’s answer. Make sure to formulate LLM prompts neutrally to avoid any extra bias in the results.
  • Clearly define the role of the AI model: If you assign the AI a role, you’ll get more relevant results. Giving it a specific role allows you to tailor the context and get targeted answers.
  • Use LLM prompt templates: Use tried-and-tested prompt templates and adapt them to your individual needs. You can find numerous LLM prompt examples for a variety of uses online.
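As an illustration, the practices above (assigning a role, providing context, stating the task precisely and setting the tone) can be collected into a small reusable template. The sketch below is a minimal Python example; the `build_prompt` helper and its field names are hypothetical and not part of any specific SDK.

```python
def build_prompt(role: str, context: str, task: str, tone: str = "neutral") -> str:
    """Assemble a clear LLM prompt from the best practices above."""
    parts = [
        f"You are {role}.",           # clearly define the role of the AI model
        f"Context: {context}",        # provide background information
        f"Task: {task}",              # formulate the request precisely
        f"Answer in a {tone} tone.",  # match the tone you want in the output
    ]
    return "\n".join(parts)

# Hypothetical usage: the resulting string is what you would send to an LLM.
prompt = build_prompt(
    role="an experienced travel editor",
    context="We publish short city guides for weekend trips.",
    task="Write three headline ideas for a guide to Lisbon.",
)
print(prompt)
```

A template like this makes it easy to optimise a prompt step by step: adjust one field, resend the prompt, and compare the outputs.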

How to optimise LLM prompts using the CARE approach

There are several frameworks for using large language models (LLMs) effectively. One popular method is the CARE formula, a simple system designed to help create highly effective LLM prompts. CARE is an acronym that stands for:

  • Content
  • Ask
  • Rules
  • Examples

Content

Define the subject matter or theme you want the AI to focus on. Be as detailed as possible when describing the context or background information. Strong content helps the AI understand exactly what domain or situation it is working within, improving the relevance and quality of its response.

Ask

Frame a clear and specific request. Instead of giving vague commands like ‘Write something about marketing’, be direct, such as ‘Create a social media post promoting a new product launch in the tech industry’. A strong ask gives the AI a defined goal, which leads to more targeted and useful outputs.

Rules

Establish any guidelines the AI should follow when generating the response. This might include setting the tone (e.g., formal or casual), formatting instructions (e.g., use bullet points or write in paragraphs), language restrictions (e.g., UK English), or output length (e.g., 150 words maximum). Rules act like boundaries that keep the AI’s creativity aligned with your expectations.

Examples

Provide sample outputs or model responses that match the style, tone, or structure you want. Examples serve as concrete references that guide the AI more precisely than abstract instructions alone. The more relevant and clear your examples are, the easier it is for the AI to mirror the desired result.
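Putting the four parts together, a CARE prompt can be assembled mechanically. The following is an illustrative Python sketch, not a standard library or API; the `care_prompt` function and its section labels simply mirror the acronym.

```python
def care_prompt(content: str, ask: str, rules: list[str], examples: list[str]) -> str:
    """Combine the four CARE elements into one structured prompt string."""
    rule_lines = "\n".join(f"- {r}" for r in rules)        # Rules as a bullet list
    example_lines = "\n".join(f"- {e}" for e in examples)  # Examples as a bullet list
    return (
        f"Content: {content}\n"
        f"Ask: {ask}\n"
        f"Rules:\n{rule_lines}\n"
        f"Examples:\n{example_lines}"
    )

# Hypothetical usage, echoing the marketing example from the Ask section above.
prompt = care_prompt(
    content="Social media marketing for a small tech company.",
    ask="Create a post promoting a new product launch.",
    rules=["Casual tone", "UK English", "150 words maximum"],
    examples=["Big news! Our latest gadget is here..."],
)
print(prompt)
```

Keeping the four elements as separate fields makes it easy to tighten the Rules or swap in better Examples without rewriting the whole prompt.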
