PyLogo


reader

Reader (tokenizer) module for PyLogo.
Author: Ian Bicking <ianb@colorstudy.com>

Tokenizer/lexer. Examples:

>>> tokenize('1 2 3')
[1, 2, 3, '\n']
>>> tokenize('fd 100')
['fd', 100, '\n']
>>> tokenize('pr "hello\nfd 100\n')
['pr', '"', 'hello', '\n', 'fd', 100, '\n']
>>> tokenize('while [:a>2] [make :a :a+1]')
['while', '[', ':', 'a', '>', 2, ']', '[', 'make', ':', 'a', ':', 'a', '+', 1, ']', '\n']
>>> tokenize('>>>= <= <><> == =>=<')
['>', '>', '>=', '<=', '<>', '<>', '=', '=', '=>', '=<', '\n']
>>> tokenize('apple? !apple .apple apple._me apple10 10apple')
['apple?', '!apple', '.apple', 'apple._me', 'apple10', 10, 'apple', '\n']

Note that every file fed in is expected to end with a newline ('\n'), even if the file does not actually end with one. The tokenizer returns common.EOF when it is done.
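
The behaviour shown in the doctests can be summarised as: skip whitespace, keep newlines as tokens, turn digit runs into numbers, match words with the word pattern listed under Attributes, and otherwise emit symbols, preferring the two-character extended symbols. The sketch below is not the module's tokenize(); it is a simplified re-statement that reproduces the doctests above, and the number_re/white_re patterns in it are assumptions (only their names appear in this listing).

    import re

    # Patterns restated from the attribute listing below; number_re and
    # white_re are assumptions, since only their names appear on this page.
    word_matcher = r'[a-zA-Z\._\?!][a-zA-Z0-9\._\?!]*'   # documented below
    word_re = re.compile(word_matcher)
    number_re = re.compile(r'[0-9]+(?:\.[0-9]+)?')       # assumed
    white_re = re.compile(r'[ \t\r]+')                   # assumed
    symbols = '()[]+-/*":=><;'                           # documented below
    extended_symbols = ['>=', '=>', '<=', '=<', '<>']    # documented below

    def simple_tokenize(s):
        """Simplified illustration; reproduces the doctests above."""
        tokens = []
        pos = 0
        while pos < len(s):
            ch = s[pos]
            if ch == '\n':                       # newlines are tokens of their own
                tokens.append('\n')
                pos += 1
                continue
            m = white_re.match(s, pos)
            if m:                                # other whitespace is skipped
                pos = m.end()
                continue
            m = number_re.match(s, pos)
            if m:                                # numbers become ints/floats
                text = m.group(0)
                tokens.append(float(text) if '.' in text else int(text))
                pos = m.end()
                continue
            m = word_re.match(s, pos)
            if m:                                # words stay as strings
                tokens.append(m.group(0))
                pos = m.end()
                continue
            if s[pos:pos + 2] in extended_symbols:
                tokens.append(s[pos:pos + 2])    # two-character symbols win over single ones
                pos += 2
                continue
            if ch in symbols:
                tokens.append(ch)
                pos += 1
                continue
            raise ValueError('unexpected character: %r' % ch)
        if not tokens or tokens[-1] != '\n':     # input is treated as if it ends in a newline
            tokens.append('\n')
        return tokens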


Attributes

a EOF

[EOF] (the common.EOF sentinel returned when the tokenizer reaches the end of its input)

a extended_symbols

['>=', '=>', '<=', '=<', '<>']

a generators

_Feature((2, 2, 0, 'alpha', 1), (2, 3, 0, 'final', 0), 4096) (the __future__ generators feature; an artifact of 'from __future__ import generators')

a number_re

Compiled regular expression for matching numeric tokens.

a only_word_re

Compiled regular expression that matches a token consisting entirely of a word (see word_matcher below).

a symbols

'()[]+-/*":=><;'

a white_re

Compiled regular expression for matching whitespace.

a word_matcher

'[a-zA-Z\\._\\?!][a-zA-Z0-9\\._\\?!]*'

a word_re

Compiled regular expression for the word_matcher pattern above (see the short illustration after this list).
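
As a quick illustration of the word pattern (a restatement for this listing, not the module's own code; treating only_word_re as the whole-token anchored form of word_matcher is an assumption based on its name):

    import re

    word_matcher = r'[a-zA-Z\._\?!][a-zA-Z0-9\._\?!]*'   # documented above
    only_word = re.compile('^(?:%s)$' % word_matcher)    # assumed to mirror only_word_re

    for tok in ['apple?', '!apple', '.apple', 'apple._me', 'apple10', '10apple', '>=']:
        print('%-10s %s' % (tok, bool(only_word.match(tok))))
    # apple?, !apple, .apple, apple._me and apple10 are whole words;
    # 10apple and >= are not.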

Functions

f is_word(tok) ...

f tokenize(s) ...

f main() ...
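
tokenize is the one function here whose behaviour the doctests pin down exactly. The sketch below shows one way a caller might consume its output, splitting the flat token list into statements on the newline tokens. The import path pylogo.reader is an assumption based on the package and module names, and the splitting logic is illustrative, not part of the module.

    from pylogo.reader import tokenize   # assumed import path

    toks = tokenize('pr "hello\nfd 100\n')
    # As in the doctests: ['pr', '"', 'hello', '\n', 'fd', 100, '\n']

    statements, current = [], []
    for tok in toks:
        if tok == '\n':                  # newline tokens mark statement boundaries
            if current:
                statements.append(current)
            current = []
        else:
            current.append(tok)
    # statements == [['pr', '"', 'hello'], ['fd', 100]]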

Classes

C FileTokenizer(...) ...

An iterator over the tokens of a file. Will prompt interactively if prompt is given.

This class contains 6 members.

C ListTokenizer(...) ...

A cache of previously tokenized expressions, so that [a block] can be treated like a stream of tokens. The tokens are taken from l.

This class contains 5 members.

C TrackingStream(...) ...

A file-like object that also keeps track of rows and columns, for tracebacks.

This class contains 3 members.
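
The constructor signatures above are elided, so the following is only a rough sketch of how the pieces plausibly fit together: a TrackingStream wraps an open file for row/column tracking, a FileTokenizer iterates over its tokens, and reading stops at the EOF sentinel. The constructor arguments and the identity test against EOF are assumptions, not documented API; only "FileTokenizer is an iterator" and "common.EOF is returned when the tokenizer is done" come from this page.

    from pylogo import reader   # assumed import path

    def read_tokens(path):
        # Rough sketch; see the class listing above and the source for the real API.
        tokens = []
        with open(path) as f:
            stream = reader.TrackingStream(f)         # assumed: wraps a file-like object
            tokenizer = reader.FileTokenizer(stream)  # assumed: takes the stream to read from
            for tok in tokenizer:                     # documented: an iterator over tokens
                if tok is reader.EOF:                 # documented sentinel; identity test is assumed
                    break
                tokens.append(tok)
        return tokens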

See the source for more information.