Is enumerate in python lazy?
I'd like to know what happens when I pass the result of a generator function to Python's enumerate(). Example:
def veryBigHello():
    i = 0
    while i < 10000000:
        i += 1
        yield "hello"
numbered = enumerate(veryBigHello())
for i, word in numbered:
    print i, word
Is the enumeration iterated lazily, or does it slurp everything into the <enumerate object>
first? I'm 99.999% sure it's lazy, so can I treat it exactly the same as the generator function, or do I need to watch out for anything?
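One thing to watch out for (a small sketch beyond what the question shows): like the generator it wraps, an enumerate object is a single-pass iterator, so once you've looped over it, it's exhausted.

```python
def greet():
    yield "hello"
    yield "world"

numbered = enumerate(greet())
first_pass = list(numbered)   # consumes the iterator: [(0, 'hello'), (1, 'world')]
second_pass = list(numbered)  # already exhausted, so this is an empty list
```

If you need to iterate twice, call enumerate on a fresh generator each time.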
It's lazy. It's fairly easy to prove that's the case:
>>> def abc():
...     letters = ['a', 'b', 'c']
...     for letter in letters:
...         print letter
...         yield letter
...
>>> numbered = enumerate(abc())
>>> for i, word in numbered:
...     print i, word
...
...
a
0 a
b
1 b
c
2 c
It's even easier to tell than either of the previous answers suggests:
$ python
Python 2.5.5 (r255:77872, Mar 15 2010, 00:43:13)
[GCC 4.3.4 20090804 (release) 1] on cygwin
Type "help", "copyright", "credits" or "license" for more information.
>>> abc = (letter for letter in 'abc')
>>> abc
<generator object at 0x7ff29d8c>
>>> numbered = enumerate(abc)
>>> numbered
<enumerate object at 0x7ff29e2c>
If enumerate didn't perform lazy evaluation, it would return [(0, 'a'), (1, 'b'), (2, 'c')]
or something (nearly) equivalent.
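As a further check (a small sketch in the same spirit), you can pull a single pair off the enumerate object with next() and see that only one element of the underlying generator has been consumed:

```python
def letters():
    for letter in "abc":
        yield letter

numbered = enumerate(letters())
first = next(numbered)  # advances the underlying generator by exactly one step
# first is (0, 'a'); the remaining pairs have not been produced yet
```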
Of course, enumerate is really just a fancy generator:
def myenumerate(iterable):
    count = 0
    for item in iterable:
        yield (count, item)
        count += 1
for i, val in myenumerate((letter for letter in 'abc')):
    print i, val
Since you can run the following without getting out-of-memory exceptions, it is definitely lazy:
def veryBigHello():
    i = 0
    while i < 1000000000000000000000000000:
        i += 1
        yield "hello"

numbered = enumerate(veryBigHello())
for i, word in numbered:
    print i, word
An old-school alternative, since I was using a generator that someone else (sklearn) wrote that didn't work with the approaches here:

i = -1
for x in some_generator:
    i += 1
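The same manual-counter pattern can also be written with itertools.count, which stays lazy. This is just a sketch; `some_items` below stands in for the `some_generator` above:

```python
import itertools

def some_items():
    yield "a"
    yield "b"
    yield "c"

counter = itertools.count()  # lazy, unbounded counter: 0, 1, 2, ...
pairs = [(next(counter), x) for x in some_items()]
# pairs mirrors what enumerate(some_items()) would produce
```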