
All I want for Christmas is a comprehensive GPT prompting guide


With thanks to Eric Vyacheslav on LinkedIn, these are OpenAI’s recommended strategies for getting the best results when prompting GPT.

1 – write clear instructions
  • be specific: clarity in instructions leads to more relevant outcomes.
  • define the desired output length and complexity.
  • demonstrate preferred formats (see the worked example after this list).
  • minimise ambiguity to improve model accuracy.
2 – provide reference text
  • counteract potential fabrications with concrete reference materials.
  • reference texts guide the model towards accurate and reliable answers.
3 – split complex tasks into simpler subtasks
  • break down tasks to reduce errors and improve manageability.
  • treat tasks as workflows of simpler, interconnected steps.
4 – give the model time to ‘think’
  • allow the model to process and reason, much as a person would when solving a complex problem.
  • encourage a “chain of thought” approach for more accurate reasoning.
5 – use external tools
  • supplement the model’s capabilities with specialised tools for specific tasks.
  • leverage resources such as text retrieval systems or code execution engines.
6 – test changes systematically
  • measure improvements with a comprehensive testing approach.
  • ensure that modifications lead to overall performance improvements.
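
By way of illustration, here is a minimal sketch of how a few of these tips (clear instructions, reference text, an explicit output limit and a step-by-step nudge) might be combined in a single call to the OpenAI chat completions API. It assumes the openai Python package (v1.x) and an OPENAI_API_KEY environment variable; the model name is a placeholder and the snippet is our own sketch, not taken from OpenAI’s guide.

# A hedged sketch, not an official OpenAI example: it combines clear
# instructions, reference text delimited by triple quotes, a length limit
# and a step-by-step reasoning nudge in one request.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

reference_text = (
    "Mondatum Labs supports machine learning, GPT and generative AI "
    "projects, processes, pipelines and workflows."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; use whichever model you have access to
    messages=[
        {
            "role": "system",
            "content": (
                "Answer only from the reference text delimited by triple quotes. "
                "If the answer is not in the text, reply 'I don't know'. "
                "Reason step by step, then give a final answer of no more than "
                "two sentences."
            ),
        },
        {
            "role": "user",
            "content": f'"""{reference_text}"""\n\nQuestion: What does Mondatum Labs support?',
        },
    ],
)

print(response.choices[0].message.content)

Delimiters such as the triple quotes above make it unambiguous which part of the prompt is reference material and which is the question, very much in the spirit of tips 1 and 2.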

Click here for the full guide.

Mondatum Labs is here and ready to support all your machine learning, GPT and generative AI art project, process, pipeline and workflow requirements.

Contact us for advice, guidance or consultancy – Colin Birch (colin@mondatum.com), John Rowe (john@mondatum.com).

Main image c/o Search Engine Journal


