"""SQL Lexer"""

from io import TextIOBase

from sqlparse import tokens
from sqlparse.keywords import SQL_REGEX
from sqlparse.utils import consume


class Lexer:
    """Lexer
    Empty class. Leaving for backwards-compatibility
    """

    @staticmethod
    def get_tokens(text, encoding=None):
        """
        Return an iterable of (tokentype, value) pairs generated from
        `text`. If `unfiltered` is set to `True`, the filtering mechanism
        is bypassed even if filters are defined.

        Also preprocess the text, i.e. expand tabs and strip it
        if wanted and applies registered filters.

        Split ``text`` into (tokentype, text) pairs.

        ``stack`` is the initial stack (default: ``['root']``)
        """
        if isinstance(text, TextIOBase):
            text = text.read()

        if isinstance(text, str):
            pass
        elif isinstance(text, bytes):
            if encoding:
                text = text.decode(encoding)
            else:
                try:
                    text = text.decode('utf-8')
                except UnicodeDecodeError:
                    text = text.decode('unicode-escape')
        else:
            raise TypeError("Expected text or file-like object, "
                            "got {!r}".format(type(text)))

        iterable = enumerate(text)
        for pos, char in iterable:
            for rexmatch, action in SQL_REGEX:
                m = rexmatch(text, pos)

                if not m:
                    continue
                elif isinstance(action, tokens._TokenType):
                    yield action, m.group()
                elif callable(action):
                    yield action(m.group())

                consume(iterable, m.end() - pos - 1)
                break
            else:
                yield tokens.Error, char


def tokenize(sql, encoding=None):
    """Tokenize sql.

    Tokenize *sql* using the :class:`Lexer` and return a 2-tuple stream
    of ``(token type, value)`` items.
    """
    return Lexer().get_tokens(sql, encoding)
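A minimal, self-contained sketch of the same regex-dispatch loop the lexer implements: try each `(matcher, token type)` rule at the current position, emit the first match, advance past it, and emit an error token when nothing matches. The rule table `RULES`, the `T` enum, and `toy_tokenize` are hypothetical stand-ins for `SQL_REGEX`, `sqlparse.tokens`, and `get_tokens`, not the real sqlparse objects.

```python
import re
from enum import Enum


class T(Enum):
    # Hypothetical token types standing in for sqlparse.tokens
    KEYWORD = 1
    NUMBER = 2
    WHITESPACE = 3
    ERROR = 4


# Simplified stand-in for SQL_REGEX: (bound regex .match, token type) pairs
RULES = [
    (re.compile(r'SELECT\b', re.IGNORECASE).match, T.KEYWORD),
    (re.compile(r'\d+').match, T.NUMBER),
    (re.compile(r'\s+').match, T.WHITESPACE),
]


def toy_tokenize(text):
    pos = 0
    while pos < len(text):
        for rexmatch, ttype in RULES:
            m = rexmatch(text, pos)
            if m:
                # First matching rule wins; skip past the matched span
                yield ttype, m.group()
                pos = m.end()
                break
        else:
            # No rule matched this character: emit an error token
            yield T.ERROR, text[pos]
            pos += 1


print(list(toy_tokenize("SELECT 1")))
```

The real `get_tokens` achieves the same "advance past the match" effect with `consume(iterable, m.end() - pos - 1)` over an `enumerate(text)` iterator instead of a manual position counter.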