Prompt Master

Master the art of conversing with AI

Code Snippets

Generate code from a comment or instruction

TL;DR

  • This is a minimal code generation test: give the model a natural language instruction (inside a comment) and have it output runnable code.
  • Key risks: the model might omit input/output handling, ignore edge cases, or generate code that doesn't match the target language/runtime.
  • Production tip: write the instruction as a checklist (language/runtime/IO/examples/error handling) and use test cases for evaluation.

Background

This prompt tests an LLM's code generation capabilities: the program requirements are supplied inside a /* <instruction> */ comment, and the model is asked to produce a matching code snippet.

How to Apply

You can treat the "comment instruction" as a stable input protocol:

  • Use /* ... */ (or whatever format your team agrees on) to describe requirements
  • Specify the language and runtime (browser / Node.js / Python)
  • Specify input/output (CLI / function / API handler)
  • Give 1-3 examples (input → output)

This way the model is much more likely to generate executable code rather than just pseudocode.
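As a sketch, the checklist above can be folded into a small helper that assembles the comment instruction. The function name and field layout here are illustrative, not a fixed API:

```python
def build_comment_prompt(language, runtime, interface, examples):
    """Assemble a /* ... */ comment instruction covering the checklist:
    language/runtime, input/output interface, and worked examples."""
    lines = [
        f"Language: {language}, runtime: {runtime}",
        f"Interface: {interface}",
        "Examples:",
    ]
    lines.extend(f"  {inp} -> {out}" for inp, out in examples)
    return "/*\n" + "\n".join(lines) + "\n*/"

prompt = build_comment_prompt(
    language="Python",
    runtime="CPython 3.11, stdlib only",
    interface="function greet(name: str) -> str",
    examples=[("greet('Ada')", "'Hello, Ada!'")],
)
print(prompt)
```

The exact field names matter less than agreeing on one format and keeping it stable across prompts, so results stay comparable.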

How to Iterate

  1. Add constraints: language version, dependency restrictions, banned APIs
  2. Add tests: require the output to include 3-5 test cases
  3. Add self-check: have the model list "assumptions" first, then generate code
  4. Multi-turn iteration: first have the model output a plan/interface, then fill in implementation details
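Steps 3-4 can be sketched as a two-turn message flow: first ask for assumptions and a plan, then carry the plan forward into an implementation request. The message wording below is illustrative:

```python
def plan_then_implement(instruction):
    """Build two chat requests: a planning turn, then an implementation
    turn that includes the model's plan as prior context. Sketch only."""
    plan_request = [
        {
            "role": "user",
            "content": instruction + "\n\nFirst, list your assumptions and a "
            "short plan/interface. Do not write the implementation yet.",
        }
    ]

    def implement_request(plan):
        # Second turn: feed the plan back and ask for the full implementation.
        return plan_request + [
            {"role": "assistant", "content": plan},
            {
                "role": "user",
                "content": "Now implement it, including 3-5 test cases and "
                "error handling. Do not use any banned APIs.",
            },
        ]

    return plan_request, implement_request

plan_req, make_impl_req = plan_then_implement(
    '/*\nAsk the user for their name and say "Hello"\n*/'
)
```

Either request list can be passed as `messages` to a chat-completions API; the planning turn is cheap and catches misread requirements before any code is written.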

Self-check Rubric

  • Does it meet the requirements (functionally correct)?
  • Does it actually run (syntax, dependencies, environment match)?
  • Does it cover edge cases and error handling?
  • Does it follow constraints (no banned libraries/APIs)?
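Most rubric items need a human read, but "does it actually run" can be checked mechanically. A minimal sketch below executes the generated snippet plus assertion strings in a scratch namespace — note this is not a sandbox, so don't run untrusted model output this way in production:

```python
def check_snippet(code, tests):
    """Execute a generated snippet and a list of assertion strings in a
    fresh namespace; return (passed, error_message)."""
    namespace = {}
    try:
        exec(code, namespace)      # syntax and import errors surface here
        for test in tests:
            exec(test, namespace)  # raises AssertionError on a failed check
    except Exception as exc:
        return False, f"{type(exc).__name__}: {exc}"
    return True, None

ok, err = check_snippet(
    "def add(a, b):\n    return a + b",
    ["assert add(1, 2) == 3", "assert add(-1, 1) == 0"],
)
```

Here `ok` is `True` and `err` is `None`; a snippet that fails to parse or fails a test returns `False` with the exception name attached.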

Practice

Exercise: replace the instruction with a real small task from your work, and at minimum include:

  • language/runtime
  • function signature or CLI interface
  • 2-3 examples

Then use test cases to regression-compare quality across different models/prompts.
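One way to structure that regression comparison is to score each candidate output against a shared test suite. In the sketch below, the `outputs` dict stands in for real model responses collected from your different models/prompts:

```python
def score_candidates(outputs, tests):
    """Score each candidate snippet by how many shared tests it passes.
    `outputs` maps a label like 'model-a/prompt-v1' to generated code."""
    results = {}
    for label, code in outputs.items():
        passed = 0
        for test in tests:
            namespace = {}
            try:
                exec(code, namespace)  # not sandboxed; trusted snippets only
                exec(test, namespace)
                passed += 1
            except Exception:
                pass
        results[label] = (passed, len(tests))
    return results

scores = score_candidates(
    {
        "model-a/prompt-v1": "def add(a, b):\n    return a + b",
        "model-b/prompt-v1": "def add(a, b):\n    return a * b",
    },
    ["assert add(2, 3) == 5", "assert add(0, 0) == 0"],
)
```

Keeping the test suite fixed while you vary the model or prompt is what makes the comparison a regression test rather than a one-off impression.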

Prompt

/*
Ask the user for their name and say "Hello"
*/
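One plausible completion for this instruction — assuming the model picks Python, since the comment names no language — is something like:

```python
def greet() -> str:
    """Ask the user for their name and return a greeting."""
    name = input("What is your name? ")
    greeting = f"Hello, {name}!"
    print(greeting)
    return greeting

# greet()  # uncomment to run interactively
```

Because the instruction pins down neither the language nor the exact output format, the model could just as legitimately answer in JavaScript or print "Hello" without the name — which is exactly the under-specification risk the TL;DR warns about.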

Code / API

OpenAI (Python)

from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "user",
            "content": '/*\nAsk the user for their name and say "Hello"\n*/',
        }
    ],
    temperature=1,
    max_tokens=1000,
    top_p=1,
    frequency_penalty=0,
    presence_penalty=0,
)

# The generated snippet is in the first choice's message content.
print(response.choices[0].message.content)

Fireworks (Python)

import fireworks.client

fireworks.client.api_key = "<FIREWORKS_API_KEY>"

completion = fireworks.client.ChatCompletion.create(
    model="accounts/fireworks/models/mixtral-8x7b-instruct",
    messages=[
        {
            "role": "user",
            "content": '/*\nAsk the user for their name and say "Hello"\n*/',
        }
    ],
    stop=["<|im_start|>", "<|im_end|>", "<|endoftext|>"],
    stream=True,
    n=1,
    top_p=1,
    top_k=40,
    presence_penalty=0,
    frequency_penalty=0,
    prompt_truncate_len=1024,
    context_length_exceeded_behavior="truncate",
    temperature=0.9,
    max_tokens=4000,
)

# With stream=True the call returns chunks; print tokens as they arrive.
for chunk in completion:
    print(chunk.choices[0].delta.content or "", end="")