How to use prompt parameters in ChatGPT!
Today I learned how to use ChatGPT’s prompt parameters!
Prompt parameters are ChatGPT settings you can tweak to change its output. The
three main parameters are:
Max Tokens
Temperature
Frequency Penalty
Max tokens
controls the length of the output text
tokens are chunks of text: whole words, parts of words, punctuation and spaces
one word is roughly 1.3 tokens
e.g. 500 words equals roughly 650 tokens!
Remember, tokens measure length more precisely than a simple word count
E.g. prompt:
Set max tokens limit to 650
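The word-to-token rule of thumb above can be sketched in a few lines of Python. The 0.75 words-per-token ratio is OpenAI's published approximation for English text; exact counts need a real tokenizer (e.g. the tiktoken library), so treat this as a rough estimate only.

```python
def estimate_tokens(word_count: int) -> int:
    """Approximate token count from a word count.

    Rule of thumb: 1 token is about 0.75 English words,
    so words / 0.75 gives a rough token estimate.
    """
    return round(word_count / 0.75)

# 500 words comes out to roughly 667 tokens
print(estimate_tokens(500))
```

Handy for sanity-checking whether a response will fit under the max tokens limit you set.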
Temperature
temperature controls the randomness of ChatGPT's responses
temperature ranges from 0.00 - 2.00
high temperature (e.g. 1.8) produces very creative output
low temperature (e.g. 0.2) produces safe, most-likely responses
There is no single best temperature! The right temperature depends on your use case!
E.g. prompt:
Set temperature to 1.8 in our conversation
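Under the hood, temperature rescales the model's next-token scores before one is sampled. A minimal sketch (the logit values here are made up for illustration):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Turn raw scores into sampling probabilities, scaled by temperature.

    Low temperature sharpens the distribution (the top choice dominates);
    high temperature flattens it (more random, "creative" picks).
    """
    scaled = [score / temperature for score in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical scores for three candidate tokens
low = softmax_with_temperature(logits, 0.2)   # top token nearly certain
high = softmax_with_temperature(logits, 1.8)  # much flatter distribution
print(low[0], high[0])
```

Running it shows why low temperature feels "safe": at 0.2 the top token gets almost all the probability, while at 1.8 the alternatives stay in play.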
Frequency Penalty
The frequency penalty discourages ChatGPT from repeating itself with common phrases.
(Makes ChatGPT give more unique responses)
E.g. prompt:
Use 'frequency_penalty' value of 1 in our conversation
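Mechanically, the frequency penalty lowers a token's score in proportion to how often it has already appeared, which is what makes repeats less likely. A sketch of that idea (the token scores and counts are invented for illustration):

```python
def apply_frequency_penalty(logits, counts, penalty):
    """Subtract penalty * (times already used) from each token's score.

    Tokens that have appeared often get pushed down, so the model
    is steered toward fresher wording.
    """
    return {tok: score - penalty * counts.get(tok, 0)
            for tok, score in logits.items()}

logits = {"the": 3.0, "cat": 2.5}  # hypothetical next-token scores
counts = {"the": 4}                # "the" has already appeared 4 times
penalized = apply_frequency_penalty(logits, counts, penalty=1.0)
print(penalized)  # "the" drops from 3.0 to -1.0; "cat" is untouched
```

With a penalty of 1, the heavily repeated token falls well below the fresh one.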
Discover more parameters
E.g. prompt:
I will provide you with a list of parameters, and you will come up with 10 more parameters
Use 'frequency_penalty' value of 1 in our conversation
Use 'temperature' value of 1 in our conversation
If you don't know what they mean, then use this prompt:
Explain each of the parameters that you suggested and their values
Come up with 10 new parameters. Prioritise uncommon parameters only
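All three parameters covered above also exist as fields in the ChatGPT (Chat Completions) API. A minimal sketch of a request body using them; the model name and message are placeholders, and actually sending it would need an API key and a client library:

```python
# Sketch of a Chat Completions request body using the parameters above.
# This only builds the payload; it does not make a network call.
request_body = {
    "model": "gpt-3.5-turbo",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Write a haiku about tokens."}
    ],
    "max_tokens": 650,        # cap the response length (~500 words)
    "temperature": 1.8,       # lean toward creative output
    "frequency_penalty": 1.0, # discourage repeated phrases
}
print(request_body["temperature"])
```

In the chat UI you ask for these settings in plain English, as in the prompts above; in the API you set them directly as request fields.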