Calculate the maximum number of tokens possible to generate for a model.
Example:

.. code-block:: python

    max_tokens = openai.modelname_to_contextsize("gpt-3.5-turbo-instruct")
:param modelname: The model name we want to know the context size for.
:type modelname: str
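To illustrate what a context-size lookup like this does, here is a minimal, self-contained sketch. The ``CONTEXT_SIZES`` table and the ``context_size`` helper are assumptions for demonstration only, not the library's actual data or implementation.

.. code-block:: python

    # Illustrative sketch: map model names to context window sizes.
    # The values here are assumed for demonstration purposes.
    CONTEXT_SIZES = {
        "gpt-3.5-turbo-instruct": 4096,
        "gpt-4": 8192,
    }

    def context_size(modelname: str) -> int:
        """Return the context window size for a known model name."""
        try:
            return CONTEXT_SIZES[modelname]
        except KeyError:
            raise ValueError(f"Unknown model: {modelname!r}")

    max_tokens = context_size("gpt-3.5-turbo-instruct")

The returned value bounds the combined prompt and completion length, so callers typically subtract the prompt's token count from it to size the completion.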