Prompting Techniques

Zero-Shot vs Few-Shot Prompting

Learn when to use zero-shot prompting versus providing examples, and how many examples you actually need.

Zero-Shot Prompting

Zero-shot prompting means asking the model to perform a task without providing any examples. The model relies entirely on its pre-trained knowledge.

When to use zero-shot:

  • Simple, well-defined tasks
  • When the task format is obvious
  • When you want faster responses
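A zero-shot prompt is just an instruction plus the input, with no demonstrations. As a minimal Python sketch (the function name and labels are illustrative, not from any library), it can be assembled like this:

```python
def zero_shot_prompt(task: str, review: str, label_name: str = "Sentiment") -> str:
    """Build a zero-shot prompt: task instruction plus the input, no examples.

    The model must infer the output format from the instruction alone.
    """
    return f"{task}\n\nReview: '{review}'\n\n{label_name}:"

prompt = zero_shot_prompt(
    "Classify the sentiment of this review as Positive, Negative, or Neutral.",
    "The course was comprehensive and the instructor explained concepts clearly.",
)
```

Ending the prompt with the label ("Sentiment:") nudges the model to complete that field directly rather than produce free-form commentary.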

Few-Shot Prompting

Few-shot prompting provides one or more examples of the desired input-output pattern. The model "learns" the pattern from your examples.

When to use few-shot:

  • Complex classification tasks
  • When you need a very specific output format
  • When the task is ambiguous
  • When quality matters more than speed
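The same prompt structure extends naturally to few-shot: demonstrations go between the instruction and the final query, each in the exact format you want back. A minimal sketch (the function name is illustrative; the sentiment task is the one used in the examples below):

```python
def few_shot_prompt(instruction: str,
                    examples: list[tuple[str, str]],
                    query: str) -> str:
    """Build a few-shot prompt from (input, output) demonstration pairs.

    Each demonstration uses the exact Review/Sentiment format we want the
    model to reproduce; the prompt ends with an unfilled label for the query.
    """
    lines = [instruction, ""]
    for review, label in examples:
        lines.append(f"Review: '{review}'")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: '{query}'")
    lines.append("Sentiment:")
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Classify the sentiment of reviews.",
    [("Terrible service, waited 2 hours.", "Negative")],
    "The product exceeded my expectations!",
)
```

With one pair in `examples` this is a one-shot prompt; passing two to five pairs makes it few-shot, with no change to the code.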

One-Shot vs Few-Shot

  • One-shot: one example provided
  • Few-shot: 2-5 examples (more usually doesn't help significantly)
  • Many-shot: a larger set of examples, worthwhile mainly when you need highly consistent formatting

Example Quality Matters

Bad examples can hurt performance. Ensure your examples:

  • Represent the full diversity of inputs
  • Have correct outputs (wrong examples mislead the model)
  • Are in the exact format you want for output
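The checks above can be partly automated before the examples ever reach a prompt. A small sketch, assuming a classification task with a fixed label set (the function name and return convention are illustrative):

```python
def check_examples(examples: list[tuple[str, str]],
                   allowed_labels: list[str]) -> list[str]:
    """Sanity-check few-shot demonstration pairs; return a list of issues.

    Catches two common problems: labels outside the allowed set (likely
    typos or wrong outputs) and allowed labels with no example at all
    (poor coverage of the input space).
    """
    labels = {label for _, label in examples}
    issues = []
    unknown = labels - set(allowed_labels)
    if unknown:
        issues.append(f"unexpected labels: {sorted(unknown)}")
    missing = set(allowed_labels) - labels
    if missing:
        issues.append(f"no example for: {sorted(missing)}")
    return issues
```

An empty result means the examples cover every label with no strays; it does not verify that each label is actually correct for its input, which still needs a human eye.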

Example

// Zero-shot (no examples)
"Classify the sentiment of this review as Positive,
Negative, or Neutral.

Review: 'The course was comprehensive and the
instructor explained concepts clearly.'

Sentiment:"

// One-shot (one example)
"Classify the sentiment of reviews.

Example:
Review: 'Terrible service, waited 2 hours.'
Sentiment: Negative

Now classify:
Review: 'The product exceeded my expectations!'
Sentiment:"

// Few-shot (multiple examples)
"Convert these informal messages to professional emails.

Informal: 'hey can u send me the report asap'
Professional: 'Could you please send me the report at
your earliest convenience?'

Informal: 'the meeting got moved'
Professional: 'I wanted to inform you that the
meeting time has been rescheduled.'

Informal: 'ur code is broken'
Professional:"