Python: Why is functools.partial necessary?
Partial application is cool. What functionality does functools.partial
offer that you can't get through lambdas?
>>> sum = lambda x, y : x + y
>>> sum(1, 2)
3
>>> incr = lambda y : sum(1, y)
>>> incr(2)
3
>>> def sum2(x, y):
...     return x + y
>>> incr2 = functools.partial(sum2, 1)
>>> incr2(4)
5
Is functools somehow more efficient, or readable?
What functionality does
functools.partial
offer that you can't get through lambdas?
Not much in terms of extra functionality (but see later), and readability is in the eye of the beholder.
Most people who are familiar with functional programming languages (those in the Lisp/Scheme families in particular) appear to like lambda just fine -- I say "most", definitely not all, because Guido and I assuredly are among those "familiar with" (etc) yet think of lambda as an eyesore anomaly in Python...
He repented of ever having accepted it into Python and planned to remove it from Python 3, as one of "Python's glitches", and I fully supported him in that. (I love lambda in Scheme... while its limitations in Python, and the weird way it just doesn't fit in with the rest of the language, make my skin crawl.)
Not so, however, for the hordes of lambda lovers -- who staged one of the closest things to a rebellion ever seen in Python's history, until Guido backtracked and decided to leave lambda in.
Several possible additions to functools (to make functions returning constants, identity, etc.) didn't happen (to avoid explicitly duplicating more of lambda's functionality), though partial did of course remain (it's no total duplication, nor is it an eyesore).
Remember that a lambda's body is limited to a single expression, so it has its limitations. For example...:
>>> import functools
>>> f = functools.partial(int, base=2)
>>> f.args
()
>>> f.func
<type 'int'>
>>> f.keywords
{'base': 2}
>>>
functools.partial's returned function is decorated with attributes useful for introspection -- the function it's wrapping, and what positional and named arguments it fixes therein. Further, the named arguments can be overridden right back (the "fixing" is rather, in a sense, the setting of defaults):
>>> f('23', base=10)
23
So, as you see, it's definitely not as simplistic as lambda s: int(s, base=2)!-)
Yes, you could contort your lambda to give you some of this – e.g., for the keyword-overriding,
>>> f = lambda s, **k: int(s, **dict({'base': 2}, **k))
but I dearly hope that even the most ardent lambda-lover doesn't consider this horror more readable than the partial call!-). The "attribute setting" part is even harder, because of the "body's a single expression" limitation of Python's lambda (plus the fact that assignment can never be part of a Python expression)... you end up "faking assignments within an expression" by stretching list comprehension well beyond its design limits...:
>>> f = [f for f in (lambda s: int(s, base=2),)
...          if setattr(f, 'keywords', {'base': 2}) is None][0]
Now combine the named-arguments overridability, plus the setting of three attributes, into a single expression, and tell me just how readable that is going to be...!
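For the morbidly curious, here is one way such a monster could look -- a sketch (the name g and the exact contortion are mine) that crams the keyword overriding and the faked attribute assignments into a single expression, next to the readable partial version:

```python
import functools

# The readable way: introspectable attributes come for free.
f = functools.partial(int, base=2)

# A sketch of the lambda contortion: keyword overriding plus three faked
# attribute assignments, all inside one expression (setattr returns None,
# so each call is chained in as an always-true filter condition).
g = [g for g in (lambda s, **k: int(s, **dict({'base': 2}, **k)),)
     if setattr(g, 'func', int) is None
     and setattr(g, 'args', ()) is None
     and setattr(g, 'keywords', {'base': 2}) is None][0]

print(g('101'))          # 5
print(g('23', base=10))  # 23
print(g.keywords)        # {'base': 2}
```

It works, but compare it to the one-line partial call above and the readability argument rather makes itself.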
Well, here's an example that shows a difference:
In [132]: sum = lambda x, y: x + y
In [133]: n = 5
In [134]: incr = lambda y: sum(n, y)
In [135]: incr2 = partial(sum, n)
In [136]: print incr(3), incr2(3)
8 8
In [137]: n = 9
In [138]: print incr(3), incr2(3)
12 8
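If you do want a lambda that captures the current value of n rather than looking it up at call time, the usual workaround is the default-argument idiom; a quick sketch:

```python
# Freezing the current value of n in a lambda via a default argument.
sum2 = lambda x, y: x + y

n = 5
incr_late = lambda y: sum2(n, y)        # looks up n when called
incr_early = lambda y, n=n: sum2(n, y)  # captures n=5 when defined

n = 9
print(incr_late(3))   # 12 -- late binding sees the new n
print(incr_early(3))  # 8  -- the default argument froze n at 5
```

partial gives you the early-binding behavior without needing that extra default parameter.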
These posts by Ivan Moore expand on the "limitations of lambda" and closures in python:
- Closures in Python (part 2)
- Closures in Python (part 3)
In the latest versions of Python (>= 2.7), you can pickle a partial, but not a lambda:
>>> pickle.dumps(partial(int))
'cfunctools\npartial\np0\n(c__builtin__\nint\np1\ntp2\nRp3\n(g1\n(tNNtp4\nb.'
>>> pickle.dumps(lambda x: int(x))
Traceback (most recent call last):
File "<ipython-input-11-e32d5a050739>", line 1, in <module>
pickle.dumps(lambda x: int(x))
File "/usr/lib/python2.7/pickle.py", line 1374, in dumps
Pickler(file, protocol).dump(obj)
File "/usr/lib/python2.7/pickle.py", line 224, in dump
self.save(obj)
File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
File "/usr/lib/python2.7/pickle.py", line 748, in save_global
(obj, module, name))
PicklingError: Can't pickle <function <lambda> at 0x1729aa0>: it's not found as __main__.<lambda>
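The same holds in Python 3: a partial round-trips through pickle intact (as long as the wrapped function is itself picklable), while a lambda still raises PicklingError. A quick round-trip sketch:

```python
import pickle
from functools import partial

# Serialize and restore a partial; func, args, and keywords all survive.
f = pickle.loads(pickle.dumps(partial(int, base=2)))
print(f('101'))     # 5
print(f.keywords)   # {'base': 2}
```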
Is functools somehow more efficient..?
As a partial answer to this, I decided to test the performance. Here is my example:
from functools import partial
import time, math

def make_lambda():
    x = 1.3
    return lambda: math.sin(x)

def make_partial():
    x = 1.3
    return partial(math.sin, x)

Iter = 10**7

start = time.clock()
for i in range(0, Iter):
    l = make_lambda()
stop = time.clock()
print('lambda creation time {}'.format(stop - start))

start = time.clock()
for i in range(0, Iter):
    l()
stop = time.clock()
print('lambda execution time {}'.format(stop - start))

start = time.clock()
for i in range(0, Iter):
    p = make_partial()
stop = time.clock()
print('partial creation time {}'.format(stop - start))

start = time.clock()
for i in range(0, Iter):
    p()
stop = time.clock()
print('partial execution time {}'.format(stop - start))
on Python 3.3 it gives:
lambda creation time 3.1743163756961392
lambda execution time 3.040552701787919
partial creation time 3.514482823352731
partial execution time 1.7113973411608114
This means that partial needs a bit more time for creation but considerably less time for execution. This may well be an effect of the early vs. late binding discussed in the answer from ars.
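Note that time.clock was later deprecated and removed in Python 3.8; on modern Python the same comparison can be sketched with timeit (absolute numbers will vary by machine and version, so no figures are asserted here):

```python
import math
import timeit
from functools import partial

x = 1.3
lam = lambda: math.sin(x)        # late-bound lookup of x
par = partial(math.sin, x)       # x frozen at creation time

# timeit accepts a callable directly; run each one n times.
n = 10**5
t_lam = timeit.timeit(lam, number=n)
t_par = timeit.timeit(par, number=n)
print('lambda execution time {:.4f}s'.format(t_lam))
print('partial execution time {:.4f}s'.format(t_par))
```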
Besides the extra functionality Alex mentioned, another advantage of functools.partial is speed. With partial you can avoid constructing (and destructing) another stack frame.
Neither the functions generated by partial nor lambdas have docstrings by default (though you can set the docstring on either object via __doc__).
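Both are plain objects with writable attributes, so a docstring can be attached after the fact; a small sketch (the names are mine):

```python
from functools import partial

# partial instances carry an instance dict, so __doc__ can be assigned.
parse_bin = partial(int, base=2)
parse_bin.__doc__ = 'Parse a binary string into an int.'

# Function objects (lambdas included) have a writable __doc__ too.
parse_hex = lambda s: int(s, base=16)
parse_hex.__doc__ = 'Parse a hex string into an int.'

print(parse_bin.__doc__)  # Parse a binary string into an int.
print(parse_bin('101'))   # 5
print(parse_hex('ff'))    # 255
```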
You can find more details in this blog: Partial Function Application in Python
I understand the intent quickest in the third example.
When I parse lambdas, I'm expecting more complexity/oddity than offered by the standard library directly.
Also, you'll notice that the third example is the only one that doesn't depend on the full signature of sum2, making it slightly more loosely coupled.
Partials are also useful for controlling when certain variables are evaluated.
Coming from an outsider's perspective, here's a series of friendlier examples:
from functools import partial
sum = lambda x, y: x + y # sum(x, y) == x + y
n = 2
normalSum = lambda x: sum(x, n) # normalSum(x) == sum(x, y=n)
partialSum = partial(sum, y = n) # partialSum(sum(y=n)) == sum(x, 2)
print(normalSum(2), partialSum(2)) # 4 4
n = 6
print(normalSum(2), partialSum(2)) # 8 4
Notice how the partial holds the value of whatever n was at the time.
...
n = 2
partialSumOrig = partial(sum, y = n) # partialSumOrig(sum(y=n)) == sum(x, 2)
n = 6
partialSumNew = partial(sum, y = n) # partialSumNew(sum(y=n)) == sum(x, 6)
print(partialSumOrig(2), partialSumNew(2)) # 4 8
Extra example showing how arguments are passed into nested lambdas:
...
n = 8
partialSumOrig = partial(sum, y = n) # partialSumOrig(sum(y=n)) == sum(x, 8)
partialSumNew = partial(sum, n) # partialSumNew(sum(n)) == sum(8, y)
print(partialSumOrig(2)) # 10 # partialSumOrig(sum(2, 8)) == sum(2, 8)
print(partialSumNew(2)) # 10 # partialSumNew(sum(8, 2)) == sum(8, 2)
One last example showing how arguments are passed in partials:
...
n = 2
m = 2
partialSumSilly = partial(sum, n, m) # partialSumSilly(sum(n, m)) == sum(2, 2)
print(partialSumSilly()) # 4
The big takeaway is that:
- normalSum() behaves like a late binding, where n is evaluated when run.
- partialSum() behaves like an early binding, where n is evaluated when defined.
Note: in reality nearly everything is a late binding in CPython due to its interpreted nature.