"""
    pygments.lexers.special
    ~~~~~~~~~~~~~~~~~~~~~~~

    Special lexers.

    :copyright: Copyright 2006-2022 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""

import ast

from pygments.lexer import Lexer, line_re
from pygments.token import Token, Error, Text, Generic
from pygments.util import get_choice_opt

__all__ = ['TextLexer', 'OutputLexer', 'RawTokenLexer']


class TextLexer(Lexer):
    """
    "Null" lexer, doesn't highlight anything.
    """
    name = 'Text only'
    aliases = ['text']
    filenames = ['*.txt']
    mimetypes = ['text/plain']
    priority = 0.01

    def get_tokens_unprocessed(self, text):
        yield 0, Text, text

    def analyse_text(text):
        return TextLexer.priority


class OutputLexer(Lexer):
    """
    Simple lexer that highlights everything as ``Token.Generic.Output``.

    .. versionadded:: 2.10
    """
    name = 'Text output'
    aliases = ['output']

    def get_tokens_unprocessed(self, text):
        yield 0, Generic.Output, text


_ttype_cache = {}


class RawTokenLexer(Lexer):
    """
    Recreate a token stream formatted with the `RawTokenFormatter`.

    Additional options accepted:

    `compress`
        If set to ``"gz"`` or ``"bz2"``, decompress the token stream with
        the given compression algorithm before lexing (default: ``""``).
    """
    name = 'Raw token data'
    aliases = []
    filenames = []
    mimetypes = ['application/x-pygments-tokens']

    def __init__(self, **options):
        self.compress = get_choice_opt(options, 'compress',
                                       ['', 'none', 'gz', 'bz2'], '')
        Lexer.__init__(self, **options)

    def get_tokens(self, text):
        if self.compress:
            if isinstance(text, str):
                text = text.encode('latin1')
            try:
                if self.compress == 'gz':
                    import gzip
                    text = gzip.decompress(text)
                elif self.compress == 'bz2':
                    import bz2
                    text = bz2.decompress(text)
            except OSError:
                yield Error, text.decode('latin1')
        if isinstance(text, bytes):
            text = text.decode('latin1')

        # do not call Lexer.get_tokens() because stripping is not optional here
        text = text.strip('\n') + '\n'
        for i, t, v in self.get_tokens_unprocessed(text):
            yield t, v
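
    # Format note (an added illustrative comment, inferred from the
    # `RawTokenFormatter` output rather than taken verbatim from this
    # module): each line of the raw dump parsed below looks roughly like
    #
    #     Token.Literal.String<TAB>'"hi"'
    #
    # i.e. a dotted token type name, a tab, and a Python string literal
    # holding the token's text.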

    def get_tokens_unprocessed(self, text):
        length = 0
        for match in line_re.finditer(text):
            try:
                ttypestr, val = match.group().rstrip().split('\t', 1)
                ttype = _ttype_cache.get(ttypestr)
                if not ttype:
                    ttype = Token
                    ttypes = ttypestr.split('.')[1:]
                    for ttype_ in ttypes:
                        if not ttype_ or not ttype_[0].isupper():
                            raise ValueError('malformed token name')
                        ttype = getattr(ttype, ttype_)
                    _ttype_cache[ttypestr] = ttype
                val = ast.literal_eval(val)
                if not isinstance(val, str):
                    raise ValueError('expected str')
            except (SyntaxError, ValueError):
                val = match.group()
                ttype = Error
            yield length, ttype, val
            length += len(val)
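

if __name__ == '__main__':
    # Illustrative round trip (an added sketch, not part of the original
    # module): dump some Python source to the raw token format with
    # `RawTokenFormatter`, then feed the result back through `RawTokenLexer`.
    from pygments import highlight
    from pygments.formatters import RawTokenFormatter
    from pygments.lexers import PythonLexer

    # `highlight` returns bytes here because RawTokenFormatter emits bytes;
    # RawTokenLexer.get_tokens() decodes bytes input as latin1 itself.
    raw = highlight('print("hi")\n', PythonLexer(), RawTokenFormatter())
    for ttype, value in RawTokenLexer().get_tokens(raw):
        print(ttype, repr(value))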