class documentation

Convert natural language type strings to reStructuredText.

Syntax is based on the numpydoc type specification, with additional recognition of PEP 484-like type annotations (using parentheses or square bracket characters).

Examples of valid type strings and their output

Type string -> Output

List[str] or list(bytes), optional
    -> List[str] or list(bytes), optional

{"html", "json", "xml"}, optional
    -> {"html", "json", "xml"}, optional

list of int or float or None, default: None
    -> list of int or float or None, default: None

`complicated string` or `strIO <twisted.python.compat.NativeStringIO>`
    -> complicated string or strIO
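
A minimal usage sketch of the documented interface follows. The import path and class name are placeholders (neither appears on this page); only the constructor signature, __str__, and the warnings attribute are taken from the documentation below.

# Placeholder import: the module path and class name are assumptions,
# not taken from this page.
from napoleon.docstring import TypeDocstring  # hypothetical

# Constructor signature as documented below:
# __init__(annotation: str, warns_on_unknown_tokens: bool = False)
spec = TypeDocstring("list of int or float or None, default: None",
                     warns_on_unknown_tokens=True)

# __str__ returns the parsed type in reStructuredText format.
print(str(spec))

# Problems found while parsing are collected on the instance
# rather than raised.
for message in spec.warnings:
    print("warning:", message)
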
Method __init__ Undocumented
Method __str__ No summary
Instance Variable warnings Undocumented
Class Method _tokenize_type_spec Split the string into tokens for further processing.
Static Method _recombine_set_tokens Merge the special literal choices tokens together.
Method _build_tokens Undocumented
Method _convert_type_spec_to_rst Undocumented
Method _token_type Find the type of a token. Types are defined in the TokenType enum.
Method _trigger_warnings Append some warnings.
Class Variable _ast_like_delimiters_regex Undocumented
Class Variable _ast_like_delimiters_regex_str Undocumented
Class Variable _default_regex Undocumented
Class Variable _natural_language_delimiters_regex Undocumented
Class Variable _natural_language_delimiters_regex_str Undocumented
Class Variable _token_regex Undocumented
Instance Variable _annotation Undocumented
Instance Variable _tokens Undocumented
Instance Variable _warns_on_unknown_tokens Undocumented
def __init__(self, annotation: str, warns_on_unknown_tokens: bool = False): (source)
def __str__(self) -> str: (source)
Returns
str: The parsed type in reStructuredText format.
warnings: list[str] = (source)

Undocumented

@classmethod
def _tokenize_type_spec(cls, spec: str) -> list[str]: (source)

Split the string into tokens for further processing.
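
For illustration only, here is a simplified tokenizer built on a single regular expression. It is not the class's real pattern; the actual one lives in the undocumented class variable _token_regex (combined with the delimiter regexes listed above), whose contents are not shown on this page.

import re

# Illustrative only: split on commas and bracket-like delimiters while
# keeping the delimiters themselves as tokens.
_demo_token_regex = re.compile(r"(\s*,\s*|[\[\]\(\)\{\}])")

def demo_tokenize(spec: str) -> list[str]:
    """Split a type spec string into tokens, dropping empty fragments."""
    return [tok for tok in _demo_token_regex.split(spec) if tok.strip()]

print(demo_tokenize('{"html", "json", "xml"}, optional'))
# ['{', '"html"', ', ', '"json"', ', ', '"xml"', '}', ', ', 'optional']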

@staticmethod
def _recombine_set_tokens(tokens: list[str]) -> list[str]: (source)

Merge the special literal choices tokens together.

Example

>>> tokens = ["{", "1", ", ", "2", "}"]
>>> ann._recombine_set_tokens(tokens)
['{1, 2}']
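
A hedged sketch of what such a merge could look like (not the actual implementation): everything between an opening "{" and its matching "}" is collected back into a single token, matching the doctest above.

def demo_recombine_set_tokens(tokens: list[str]) -> list[str]:
    """Merge the tokens of a literal choice such as {"html", "json"}
    back into a single token. Illustrative sketch, not the real method."""
    result: list[str] = []
    buffer: list[str] = []
    depth = 0
    for token in tokens:
        if token == "{":
            depth += 1
            buffer.append(token)
        elif depth:
            buffer.append(token)
            if token == "}":
                depth -= 1
                if depth == 0:
                    result.append("".join(buffer))
                    buffer.clear()
        else:
            result.append(token)
    # An unclosed "{" leaves the buffered tokens untouched.
    result.extend(buffer)
    return result

print(demo_recombine_set_tokens(["{", "1", ", ", "2", "}"]))
# ['{1, 2}']
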
def _build_tokens(self, _tokens: list[str | Any]) -> list[tuple[str, TokenType]]: (source)

Undocumented

def _convert_type_spec_to_rst(self) -> str: (source)

Undocumented

def _token_type(self, token: str | Any) -> TokenType: (source)

Find the type of a token. Types are defined in the TokenType enum.
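
As an illustration of the idea, the sketch below classifies tokens into a made-up enum; the real TokenType members are not listed on this page, and the heuristics here are assumptions rather than the class's actual rules.

import enum
import re

class DemoTokenType(enum.Enum):
    """Made-up stand-in for the real TokenType enum."""
    OBJ = "obj"              # something that looks like a Python name
    DELIMITER = "delimiter"  # commas and bracket characters
    CONTROL = "control"      # words such as "optional" or "default"
    LITERAL = "literal"      # quoted strings, numbers, {...} choices

def demo_token_type(token: str) -> DemoTokenType:
    """Classify a single token. Sketch only, not the real _token_type."""
    if re.fullmatch(r"\s*[,\[\]\(\)\{\}]\s*", token):
        return DemoTokenType.DELIMITER
    if token in ("optional", "default", "default:"):
        return DemoTokenType.CONTROL
    if token.startswith(("'", '"', "{")) or token.replace(".", "", 1).isdigit():
        return DemoTokenType.LITERAL
    return DemoTokenType.OBJ

for tok in ("list", ", ", "optional", '"html"', "{1, 2}"):
    print(tok, "->", demo_token_type(tok).name)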

def _trigger_warnings(self): (source)

Append some warnings.

_ast_like_delimiters_regex = (source)

Undocumented

_ast_like_delimiters_regex_str: str = (source)

Undocumented

_default_regex = (source)

Undocumented

_natural_language_delimiters_regex = (source)

Undocumented

_natural_language_delimiters_regex_str: str = (source)

Undocumented

_token_regex = (source)

Undocumented

_annotation = (source)

Undocumented

_warns_on_unknown_tokens = (source)

Undocumented