# split_text_on_tokens

> **Function** in `langchain_text_splitters`

📖 [View in docs](https://reference.langchain.com/python/langchain-text-splitters/base/split_text_on_tokens)

Split the incoming text into chunks using the given tokenizer.

## Signature

```python
split_text_on_tokens(
    *,
    text: str,
    tokenizer: Tokenizer,
) -> list[str]
```

## Parameters

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `text` | `str` | Yes | The input text to be split. |
| `tokenizer` | `Tokenizer` | Yes | The tokenizer to use for splitting. |

## Returns

`list[str]`

A list of text chunks.
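
## Example

The `Tokenizer` argument is a small dataclass bundling `chunk_overlap`, `tokens_per_chunk`, and `encode`/`decode` callables. The sketch below reimplements the sliding-window logic locally (it is not the library source) and drives it with a toy whitespace "tokenizer", so the shapes shown here are assumptions rather than guaranteed API details:

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Tokenizer:
    # Assumed shape of langchain_text_splitters' Tokenizer dataclass.
    chunk_overlap: int                      # tokens shared between adjacent chunks
    tokens_per_chunk: int                   # maximum tokens per chunk
    decode: Callable[[List[int]], str]      # token ids -> text
    encode: Callable[[str], List[int]]      # text -> token ids


def split_text_on_tokens(*, text: str, tokenizer: Tokenizer) -> List[str]:
    """Sketch of the sliding-window split: encode once, then step
    through the token ids in windows of `tokens_per_chunk`, advancing
    by `tokens_per_chunk - chunk_overlap` each time."""
    splits: List[str] = []
    input_ids = tokenizer.encode(text)
    start_idx = 0
    cur_idx = min(start_idx + tokenizer.tokens_per_chunk, len(input_ids))
    chunk_ids = input_ids[start_idx:cur_idx]
    while start_idx < len(input_ids):
        splits.append(tokenizer.decode(chunk_ids))
        if cur_idx == len(input_ids):
            break
        start_idx += tokenizer.tokens_per_chunk - tokenizer.chunk_overlap
        cur_idx = min(start_idx + tokenizer.tokens_per_chunk, len(input_ids))
        chunk_ids = input_ids[start_idx:cur_idx]
    return splits


# Toy tokenizer: each whitespace-separated word is one "token".
text = "one two three four five six"
words = text.split()
toy = Tokenizer(
    chunk_overlap=1,
    tokens_per_chunk=3,
    decode=lambda ids: " ".join(words[i] for i in ids),
    encode=lambda t: list(range(len(t.split()))),
)

chunks = split_text_on_tokens(text=text, tokenizer=toy)
print(chunks)
# → ['one two three', 'three four five', 'five six']
```

Note how each chunk repeats the last `chunk_overlap` tokens of the previous one, which is how token-based splitters preserve context across chunk boundaries.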

---

[View source on GitHub](https://github.com/langchain-ai/langchain/blob/b302691ff9ad841804e93e5addbdc53b6974473b/libs/text-splitters/langchain_text_splitters/base.py#L430)