Long Article
Hi everyone
I have a website for an AI writing assistant. I noticed that I can't create a long article of 1,200 words while using GPT-3.5 Turbo on my website. Some people told me this is a limitation on OpenAI's side; others said I need to spell it out in the prompt (I did, and I added that the result must be as long as the maximum result length, but it doesn't work).
So, is there a special prompt that can help me create an article of 1,200 words or more (my customers especially need that)? I tried to search the course but didn't find anything related.
Hello friend, according to what OpenAI released, GPT-3.5 can generate up to 4,096 tokens at a time, which is roughly 3,000 words of English text.
So, if you aren't exceeding 4,096 tokens, the problem might be in your prompt.
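If you want to check how many tokens a given piece of text actually uses, here is a minimal sketch using the tiktoken package (my assumption, not something from your setup; the sample text is made up):

```python
import tiktoken

# Tokenizer used by gpt-3.5-turbo (cl100k_base)
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

# Roughly a 1,200-word sample text (9 words repeated 134 times)
article = "Renewable energy is transforming how the world generates power. " * 134
tokens = enc.encode(article)

# A ~1,200-word article usually lands around 1,600 tokens,
# well inside the 4,096-token limit
print(f"{len(article.split())} words -> {len(tokens)} tokens")
```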
@husein-aboul-hasan Can you help me with a prompt that allows me to create more than 1,200 words?
@wafa-a share your code. Please use GitHub and share a public link to the code so we can review and help you.
Regards,
Earnie Boyd, CEO
Seasoned Solutions Advisor LLC
Schedule 1-on-1 help
Join me on Slack
GPT-3.5 is limited to 4,096 tokens, so in theory you can generate roughly 3,000 words in a single response.
What is the exact problem you are facing?
Are you generating with the API, or using ChatGPT?
I need to understand how you are generating. What do you mean by "I generated API"?
What is the UI that you are using? Is this your own website? What is the code behind it? Maybe there is a token limit in the API call.
Please clarify more so we can help.
@wafa-a the gpt-3.5-turbo model has a total of 4,096 tokens to be shared between the input and the output. So if you have a large input, your output will be smaller. You can use a model with a larger context window to increase the maximum tokens.
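To make that budget concrete, here is a rough back-of-the-envelope sketch (the prompt size is an assumed example, not measured from your script):

```python
# Illustrative numbers only: gpt-3.5-turbo shares one 4,096-token context
# between the prompt and the completion
CONTEXT_LIMIT = 4096
prompt_tokens = 600                      # e.g. a long instruction plus keywords
room_for_output = CONTEXT_LIMIT - prompt_tokens

# ~3,496 tokens is roughly 2,600 words, so a 1,200-word article still fits
print(room_for_output)
```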
Regards,
Earnie Boyd, CEO
Seasoned Solutions Advisor LLC
Schedule 1-on-1 help
Join me on Slack
@ssadvisor my prompt is as short as this:
@wafa-a a token is roughly 4 characters. Divide the character count by 4 to estimate the tokens used for the input, then do the same for the output.
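A minimal sketch of that rule of thumb in Python (the prompt text is only an illustration):

```python
def estimate_tokens(text: str) -> int:
    # Rule of thumb from the post above: roughly 4 characters per token
    return len(text) // 4

prompt = "Write a comprehensive, well-structured article about electric cars."
print(estimate_tokens(prompt))   # a short prompt like this is only ~16 tokens
```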
Regards,
Earnie Boyd, CEO
Seasoned Solutions Advisor LLC
Schedule 1-on-1 help
Join me on Slack
@wafa-a is there any option in this script where you can set the token limit in the API call? You might also consider asking the script author.
If (prompt + output) is less than 4,096 tokens, then it is probably a problem with the API call in the script.
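For reference, this is roughly what such a call looks like when the output limit is set explicitly. This is a minimal sketch assuming the current openai Python package and an OPENAI_API_KEY environment variable; the topic and max_tokens value are placeholders, not taken from your script:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user",
         "content": "Write a detailed article about renewable energy."},
    ],
    # If the script hard-codes a small value here (e.g. 500), the article
    # gets cut off long before 1,200 words regardless of the prompt.
    max_tokens=3000,
)

print(response.choices[0].message.content)
```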
Another point: I don't see that your prompt sets the content length. Update your prompt to say you want a 1,200-word article and see if it works.
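For example, something along these lines (the wording is only an illustration, not your script's actual prompt):

```python
# Hypothetical prompt that states the target length explicitly
prompt = (
    "Write a comprehensive article about electric cars. "
    "The article must be at least 1200 words long, with an introduction, "
    "several subheaded sections, and a conclusion. "
    "Do not stop before reaching 1200 words."
)
```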
@admin
I tested this one: "Another point: I don't see that your prompt sets the content length. Update your prompt to say you want a 1,200-word article and see if it works."
The result is the same.
I will try the API call suggestion next.
Thank you