lexers

package
v0.0.0-...-d55b142
Published: Jan 16, 2026 License: Apache-2.0 Imports: 6 Imported by: 0

Documentation

Overview

Package lexers provides helpers for tokenizing YAML strings.

This package wraps github.com/goccy/go-yaml/lexer with additional utilities for document-aware tokenization and iterating over token streams.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Tokenize

func Tokenize(src string) token.Tokens

Tokenize wraps lexer.Tokenize for convenience.

func TokenizeDocuments

func TokenizeDocuments(src string) iter.Seq2[int, token.Tokens]

TokenizeDocuments is like lexer.Tokenize, but splits the YAML string into multiple token streams, one for each YAML document found (documents are separated by '---' tokens).

Types

This section is empty.
