The model should support an 8k context window, but it currently accepts only 4k. I believe this is a bug: on other platforms the same model normally accepts 8k of context (counting max_tokens).
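The "counting max_tokens" detail matters: many chat-completion APIs count the context window as prompt tokens plus the requested max_tokens, so a request can be rejected even when the prompt alone fits. A minimal sketch of that check (the token counts and window sizes below are hypothetical, chosen only to illustrate the 4k vs 8k difference):

```python
def fits_context(prompt_tokens: int, max_tokens: int, context_window: int) -> bool:
    """Return True if the prompt plus the requested completion budget
    fits inside the model's context window (prompt + completion counted together)."""
    return prompt_tokens + max_tokens <= context_window

# Hypothetical request: a 3,500-token prompt asking for up to 1,000 completion tokens.
prompt_tokens = 3500
max_tokens = 1000

print(fits_context(prompt_tokens, max_tokens, 4096))  # rejected under a 4k window
print(fits_context(prompt_tokens, max_tokens, 8192))  # accepted under an 8k window
```

The same request that fails against a 4k limit succeeds against 8k, which is consistent with the behavior reported above.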
In Review
🖋️ Nebius AI Studio
Other
Over 1 year ago

Wilame Souza