{"error":"Trying to keep the first 167702 tokens when the context overflows. However, the model is loaded with a context length of only 32768 tokens, which is not enough. Try to load the model with a larger context length, or provide a shorter input"}