
The Art of Prompt Engineering: How to Get the Best Responses from AI Models

Consider this: a vague request like "Do a marketing report" leaves you confused. But a specific one like "Prepare a Q3 marketing report highlighting key results, actions, and recommendations for the CMO and CEO" gives you clear direction.

AI works similarly. Vague prompts yield vague results, while clear, detailed prompts with context lead to targeted, valuable responses. That's the essence of prompt engineering: guiding AI to deliver precise, useful outputs.

In the following sections, we'll explore effective prompt crafting techniques to harness AI's potential and achieve your goals efficiently.

1. Understand how to choose an AI model

<aside> <img src="notion://custom_emoji/e47b797c-56a7-459b-93ba-0024def72785/13740873-dc9b-80a8-8b4a-007a44ab465d" alt="notion://custom_emoji/e47b797c-56a7-459b-93ba-0024def72785/13740873-dc9b-80a8-8b4a-007a44ab465d" width="40px" />

Selecting a model is about matching your task, power requirements, and specialization needs.

</aside>

Here are some key factors to consider:

| Factor | What to consider |
| --- | --- |
| 🎯 Your Task | Consider whether you need a generic model or one that’s tailored for a specific task. Well-known models like GPT-4o, Claude Sonnet, Gemini, and Mistral Large are designed for broad, generic use cases, but many more specialized models are also available. |
| 💪🏼 Power | An AI model’s power isn’t just about size; it’s also about how much context it can handle at once. Models like GPT-4 and Claude have larger context windows, meaning they can keep track of more information, which is great for complex, ongoing tasks. Smaller models, like Mistral, have fewer parameters and shorter context windows, making them faster and more efficient for simpler tasks. |
| ⛏️ Specialization Need | The number of parameters plays a significant role in shaping a model’s abilities, whether that’s creativity, logical reasoning, or understanding nuanced language. Larger models with more parameters, like GPT-4 or Claude, can capture complex patterns and often produce responses that are richer in detail and context, which makes them especially useful for creative tasks, complex problem-solving, and human-like answers. Models with fewer parameters, like Mistral, are streamlined for efficiency and handle straightforward tasks with speed and accuracy; they may be less nuanced, but they excel where simplicity, quick responses, or lower computational cost matters most. In short, parameter size affects a model’s “personality”: larger models tend to be more adaptable and creative, while smaller models are typically more direct and task-focused. |
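In practice, switching between candidate models is often just a matter of changing one parameter in an API call, so it’s easy to try the same task on a larger and a smaller model before committing. Here is a minimal sketch, assuming the OpenAI Python SDK (v1+) with an `OPENAI_API_KEY` set in the environment; the model names are only examples.

```python
# Minimal sketch: run the same prompt against two models and compare the answers.
# Assumes the OpenAI Python SDK (v1+) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def ask(model: str, prompt: str) -> str:
    """Send the same prompt to a given model and return its reply."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

prompt = "Summarize the key results of our Q3 marketing campaign in three bullet points."
for model in ("gpt-4o", "gpt-4o-mini"):  # a larger and a smaller model, as an example
    print(f"--- {model} ---")
    print(ask(model, prompt))
```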

<aside> 👉🏻

Find a table summarizing which model to choose based on your task.

</aside>

<aside> ⚠️

Prompts aren't portable across models.

Prompts are optimized for a specific version of a model from a particular provider. Updating a model version or changing providers isn't as simple as flipping a switch.

</aside>
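One lightweight way to respect this in practice is to keep prompts pinned to the provider and model version they were tuned for, so a model upgrade forces an explicit review instead of silently reusing an old prompt. The sketch below is purely illustrative; the structure and model identifiers are assumptions, not a prescribed format.

```python
# Illustrative sketch only: version prompts per (provider, model version) so a model
# change cannot silently reuse a prompt that was tuned for something else.
PROMPTS = {
    ("openai", "gpt-4o-2024-08-06"): (
        "You are a marketing analyst. Summarize the report in three bullet points."
    ),
    ("anthropic", "claude-3-5-sonnet-20241022"): (
        "Act as a marketing analyst. Return exactly three concise bullet points "
        "summarizing the report."
    ),
}

def get_prompt(provider: str, model_version: str) -> str:
    """Fail fast if a prompt has not been validated for this exact model version."""
    try:
        return PROMPTS[(provider, model_version)]
    except KeyError:
        raise KeyError(
            f"No validated prompt for {provider}/{model_version}; "
            "re-test before switching models."
        )
```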

💬 All about text prompts


Why Prompt Engineering Matters

AI models process information based on patterns in the data they've been trained on, but they lack context unless you provide it.

Imagine you’re giving instructions to a brand-new employee who doesn’t yet know the nuances of your workflow or business.

In that same way, an AI model depends on detailed, precise instructions to understand the goal of a task.

Here’s where prompt engineering shines—it provides the context, clarity, and constraints the model needs to perform accurately.
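To make that concrete, a single prompt can bundle all three: context (who the model is and who it’s writing for), clarity (the exact task), and constraints (length and format). The sketch below assumes the OpenAI Python SDK (v1+); the wording of the prompt itself is just an example.

```python
# Minimal sketch of packing context, clarity, and constraints into one request.
# Assumes the OpenAI Python SDK (v1+) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

system = (
    "You are a marketing analyst preparing material for the CMO and CEO. "  # context: role + audience
    "Always answer in clear business English."
)
user = (
    "Prepare a Q3 marketing report summary.\n"                      # clarity: the exact task
    "Include: key results, actions taken, and recommendations.\n"
    "Constraints: at most 200 words, formatted as three short sections."  # constraints
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ],
)
print(response.choices[0].message.content)
```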

Best Practices for Crafting Clear and Effective Prompts

<aside> <img src="https://prod-files-secure.s3.us-west-2.amazonaws.com/e47b797c-56a7-459b-93ba-0024def72785/282eef54-ca37-47fa-82cf-294264e74609/1.svg" alt="https://prod-files-secure.s3.us-west-2.amazonaws.com/e47b797c-56a7-459b-93ba-0024def72785/282eef54-ca37-47fa-82cf-294264e74609/1.svg" width="40px" />

Be Specific, Detailed, and Direct

</aside>

A good prompt specifies exactly what you want the model to do. Vague instructions leave room for interpretation, which can lead to inaccurate or off-target responses.