note: please remember that this problem does not come from the bot creators, because they cannot control it. it happens because of JLLM itself or the way you chat with the bot
YOU 🙋🏻‍♀️: I WANT A SHORT RESPONSE!
- lower your max tokens to 200-300; 300 tokens is the sweet spot for getting a short response. if it cuts off the bot's message, you just need to press this button … (for a direct-API version of these settings, see the sketch after this section)

- keep in mind that if the bot's initial message is long, it will encourage the bot to give long responses as well
- you won't get responses as polished as c.ai's because JLLM is still in beta (it's still good though). if you really want short responses, i recommend trying CosmosRP instead
- what is CosmosRP? it is a new model created by Pawan, so you can use it FREE / UNLIMITED / 8K CONTEXT SIZE (they may increase the context size for this model in the future) / UNDERSTANDS OOC / SUITABLE FOR ROLEPLAY / GOOD RESPONSES / NO JAILBREAK NEEDED
discord server for this model, for more info ⤷ pawan
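these settings normally live in the site's generation settings menu, but if you happen to be calling an OpenAI-compatible endpoint directly, here is a minimal sketch of what the short-response cap looks like. the URL, API key, and model name below are made-up placeholders (not real JLLM/CosmosRP values), so treat this as a shape to copy, not exact details — check the Discord above for the real ones.

```python
# illustrative only: a ~300-token cap expressed as a direct call to an
# OpenAI-compatible chat endpoint. URL, key, and model name are placeholders.
import requests

API_URL = "https://example-proxy.invalid/v1/chat/completions"  # placeholder
API_KEY = "YOUR_KEY_HERE"                                       # placeholder

payload = {
    "model": "cosmosrp",      # assumed model name, check the Discord for the real one
    "max_tokens": 300,        # the 200-300 "sweet spot" for short replies
    "messages": [
        {"role": "system", "content": "You are {{char}}. Keep your replies brief."},
        {"role": "user", "content": "*waves* hey, got a minute?"},
    ],
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```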
YOU 🙋🏻‍♀️: I WANT A LONG RESPONSE!
- set your max tokens to 0 (unlimited)
- set your temperature to 0.85 (the sweet spot), or 0.95 if you want more creative responses
- alert! a short initial message will encourage the bot to give short responses too, so check the bot's initial message before starting the roleplay
- if you want a descriptive response, you need to do the same thing with your own messages: write at least 2 paragraphs and 2-3 lines of dialogue
HOW? ⤷
explain your actions, the way you speak, your thoughts, the atmosphere, the location, your clothes, the way you look at {{char}}, your body's reactions, your emotions, and what you hear, taste, and feel (textures, shapes)
- or by giving commands using OOC (a combined example of these long-response settings is sketched below)
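for comparison, here is the long-response setup from this section expressed the same way as the earlier sketch: temperature at 0.85, no token cap (the equivalent of "0 = unlimited"), a descriptive multi-paragraph user turn, and an OOC instruction at the end. the model name is still an assumption, and the roleplay text is just filler to show the shape.

```python
# illustrative only: the "long response" settings as an OpenAI-compatible
# payload (same request shape as the earlier sketch).
long_payload = {
    "model": "cosmosrp",   # assumed model name, not confirmed
    "temperature": 0.85,   # sweet spot; bump to 0.95 for more creativity
    # no "max_tokens" key -> equivalent to the site's "0 = unlimited" setting
    "messages": [
        {
            "role": "system",
            "content": "You are {{char}}. Write detailed, multi-paragraph replies.",
        },
        {
            "role": "user",
            "content": (
                "*I push the heavy door open, rain still dripping from my coat, "
                "and scan the dim room until my eyes land on {{char}}.* "
                '"Sorry I\'m late, the storm slowed me down." '
                "*I pull out a chair, heart still racing from the run over.* "
                '"So... where were we?"\n\n'
                "(OOC: please keep your responses descriptive, at least two paragraphs.)"
            ),
        },
    ],
}
```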
THIS FILE EXPLAINS WHAT OOC IS AND HOW TO USE OOC ⤷