
Master the art of prompt engineering

In today’s AI-driven world, prompt engineering is not just a buzzword; it is an essential skill. This fusion of art and science goes beyond simple queries, allowing you to turn vague ideas into precise, actionable AI output.

Whether you’re using ChatGPT-4o, Google Gemini 2.5 Flash, or Claude Sonnet 4, four fundamentals will unlock the full potential of these powerful models. Master them and you can turn every interaction into a gateway to exceptional results.

These are the essential pillars of effective prompt engineering:

1. Clear and specific instructions

High-quality AI-generated content (including code) starts with clear instructions. Tell the AI exactly what you want it to do and how you want the result presented.

For ChatGPT and Google Gemini:

Use strong action verbs: Start your prompt with direct commands such as “write”, “generate”, “create”, “convert”, or “extract”.

Specify the output format: State the required structure (e.g., “Provide the code as a Python function”, “Output a JSON array”, “List the steps as a numbered list”).

Define the scope and length: Make it clear whether you need “a short script”, “a single function”, or “code for one specific task”.

Example prompt: “Write a Python function called calculate_rectangle_area that takes length and width as parameters and returns the area. Include a comment describing each line.”
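For illustration, here is a minimal sketch of the kind of output such a prompt typically produces (the exact response will vary by model):

```python
def calculate_rectangle_area(length, width):
    # Multiply the length by the width to get the area
    area = length * width
    # Return the computed area to the caller
    return area
```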

For Claude:

Use delimiters for clarity: Enclose your main instructions in distinct delimiters, such as XML-style tags or triple quotes (“””…”””). This segmentation helps Claude isolate and focus on the core task.

Use affirmative language: Focus on what you want the AI to do, not on what you don’t want it to do.

Consider a “system prompt”: Before your main query, establish a role or general rules (e.g., “You are a professional Python developer focusing on clean, readable code.”); see the API sketch after the example prompt below.

Example prompt: “””Generate a JavaScript function that reverses a string. The function should be named reverseString and take one parameter, inputStr.”””
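To make the system-prompt idea concrete, here is a minimal sketch using the Anthropic Python SDK; the model ID and prompt text are illustrative placeholders, not part of the original example:

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # assumed model ID; substitute the one you use
    max_tokens=1024,
    # The system prompt establishes the role before the user's main query
    system="You are a professional Python developer focusing on clean, readable code.",
    messages=[
        {"role": "user", "content": "Write a function that reverses a string."}
    ],
)

print(response.content[0].text)
```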

2. Provide comprehensive context

AI models need relevant background information to grasp the nuances of your request, avoid misunderstandings, and respond to your specific situation.

For ChatGPT and Google Gemini:

Include background details: Describe the scenario or purpose of the code (e.g., “I’m building a simple web page and need JavaScript to handle a button click.”).

Define variables/data structures: If your code must interact with specific data, clearly describe its format (for example, “The input will be a list of dictionaries, each with ‘name’ and ‘age’ keys.”).

Mention dependencies/libraries (if known): “Use the requests library to make API calls.”

Example prompt: “I have a CSV file called products.csv with columns “item”, “price”, and “quantity”. Write a Python script to read this CSV and calculate the total value of all items (price * quantity).”
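A response to a prompt like this might look roughly like the following sketch (the file name and column names come from the prompt above; the function name is illustrative):

```python
import csv

def total_inventory_value(path="products.csv"):
    # Sum price * quantity across every row of the CSV
    total = 0.0
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        for row in reader:
            total += float(row["price"]) * float(row["quantity"])
    return total

if __name__ == "__main__":
    print(f"Total value of all items: {total_inventory_value():.2f}")
```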

For Claude:

Segment the context clearly: Use distinct sections or delimiters (such as XML-style tags) to introduce background information.

Set a role: As mentioned earlier, giving Claude a specific role in the prompt (e.g., “You are a senior front-end developer”) immediately frames its response within that expertise, shaping both tone and depth.

Example prompt: “I’m developing a small React application and need a component that displays a welcome message. Create a functional React component called WelcomeMessage that accepts a name prop and renders “Hello, [name]!””

3. Leverage illustrative examples (few-shot prompting)

Examples are powerful teaching tools for LLMs, especially for conveying desired patterns or complex transformations that are hard to express through description alone.

For all LLMs (ChatGPT, Gemini, Claude):

Show input and expected output: For a function, demonstrate the expected behavior clearly with a specific input and its corresponding correct output.

Provide format examples: If you need a specific output style (e.g., an exact JSON structure), include a sample of that format.

Few-shot prompting: Include one to three pairs of example inputs and their required outputs. This guides the AI toward the underlying logic.

Example prompt (for any LLM): “Write a Python function that converts a temperature from Celsius to Fahrenheit. Here are some examples:

Input: celsius_to_fahrenheit(0)

Output: 32.0

Input: celsius_to_fahrenheit(25)

Output: 77.0”
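Given these few-shot examples, a model would typically return something along these lines (a minimal sketch):

```python
def celsius_to_fahrenheit(celsius):
    # Standard conversion formula: F = C * 9/5 + 32
    return celsius * 9 / 5 + 32

# Matches the examples from the prompt
print(celsius_to_fahrenheit(0))   # 32.0
print(celsius_to_fahrenheit(25))  # 77.0
```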

4. Embrace an iterative, experimental approach

The perfect prompt rarely emerges on the first attempt. Expect to refine and iterate based on the AI’s initial responses to get the best results.

For ChatGPT and Google Gemini:

Provide error messages for debugging: If the generated code does not run, paste the exact error message back into the chat and ask the AI to debug or explain the problem.

Describe unexpected output: If the code runs but produces incorrect or undesirable results, clearly state what you observed and what you expected.

Ask for alternatives: Prompt “Can you show me another way to do this?” or “Can you optimize this code for speed?”

For Claude:

Clarify and add new constraints: If the output is too broad or misses a specific detail, introduce new instructions (for example, “Make sure the code handles negative input gracefully.”); a sketch of such a refinement follows this list.

Refine the role: If the generated content isn’t quite right in tone or style, adjust the initial system prompt or add specific instructions like “Use a cleaner coding style.”

Decompose complex tasks: If Claude struggles with a large, multifaceted request, break it into smaller, manageable steps and ask for code for each step.
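For instance, after adding the “handles negative input gracefully” constraint to the rectangle-area prompt from section 1, a refined response might look like this sketch (raising an exception is one plausible interpretation of “gracefully”):

```python
def calculate_rectangle_area(length, width):
    # Reject negative dimensions, per the follow-up constraint
    if length < 0 or width < 0:
        raise ValueError("length and width must be non-negative")
    # Multiply the length by the width to get the area
    return length * width
```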

By systematically applying these principles and understanding the subtle preferences of different LLMs, you can transform your AI into an incredibly effective coding assistant, simplifying projects and expanding problem-solving capabilities.


Max is an AI analyst at Marktechpost, based in Silicon Valley, who actively shapes the future of technology. He teaches robotics at Brainvyne, uses comma to combat spam, and uses AI every day to transform complex technological advancements into clear, understandable insights.