Question on Definition and values of symbols
Definition "knows" how a value for a symbol was defined: using Set or SetDelayed. But how? As I understand it, after a value has been assigned to a symbol there is no difference for the evaluator in how it was assigned, whether by Set or by SetDelayed. This can be illustrated with OwnValues, which always returns definitions with the head RuleDelayed. How does Definition obtain this information?
In[1]:= a=5;b:=5;
Definition[a]
Definition[b]
OwnValues[a]
Out[2]= a=5
Out[3]= b:=5
Out[4]= {HoldPattern[a]:>5}
Evaluating OwnValues[a] = {HoldPattern[a] -> 3}; OwnValues[a] gives {HoldPattern[a] :> 3} instead of {HoldPattern[a] -> 3}, but Definition[a] shows what one would expect. Probably this definition is stored internally in the form of a Rule but is converted to a RuleDelayed by OwnValues in order to suppress evaluation of the right-hand side of the definition. This hypothesis contradicts my original understanding that there is no difference between values assigned by Set and by SetDelayed. Perhaps such definitions are stored in different forms, Rule and RuleDelayed respectively, but are equivalent from the evaluator's point of view.
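One caveat: the two operators are equivalent for the evaluator only once the right-hand side has been reduced to an inert value. With a side-effecting right-hand side they still behave differently, even though OwnValues reports both with RuleDelayed heads (a sketch to make the distinction concrete):

```mathematica
(* Set evaluates the right-hand side once, at definition time *)
x = RandomReal[];
x === x  (* True: both occurrences return the same stored number *)

(* SetDelayed re-evaluates the right-hand side at every lookup *)
y := RandomReal[];
y === y  (* False: each lookup produces a fresh number *)

(* Yet OwnValues shows RuleDelayed in both cases *)
{OwnValues[x], OwnValues[y]}
```

So the question is really about definitions like a = 5 versus a := 5, where the stored right-hand side is identical.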
It is interesting to see how MemoryInUse[] depends on the kind of definition.
In the following experiment I used the Mathematica 5.2 kernel in an interactive session without the front end. With the Mathematica 6 and 7 kernels one gets different results; one reason is that in those versions Set is overloaded by default.
First of all I evaluate $HistoryLength = 0; so that the DownValues stored for the In and Out variables do not affect my results. But it seems that even when $HistoryLength is set to 0, the value of In[$Line] for the current input line is still stored and is removed only after the next input is entered. This is likely why the result of the first evaluation of MemoryInUse[] always differs from the second.
Here is what I have got:
Mathematica 5.2 for Students: Microsoft Windows Version
Copyright 1988-2005 Wolfram Research, Inc.
-- Terminal graphics initialized --
In[1]:= $HistoryLength=0;
In[2]:= MemoryInUse[]
Out[2]= 1986704
In[3]:= MemoryInUse[]
Out[3]= 1986760
In[4]:= MemoryInUse[]
Out[4]= 1986760
In[5]:= a=2;
In[6]:= MemoryInUse[]
Out[6]= 1986848
In[7]:= MemoryInUse[]
Out[7]= 1986824
In[8]:= MemoryInUse[]
Out[8]= 1986824
In[9]:= a:=2;
In[10]:= MemoryInUse[]
Out[10]= 1986976
In[11]:= MemoryInUse[]
Out[11]= 1986952
In[12]:= MemoryInUse[]
Out[12]= 1986952
In[13]:= a=2;
In[14]:= MemoryInUse[]
Out[14]= 1986848
In[15]:= MemoryInUse[]
Out[15]= 1986824
In[16]:= MemoryInUse[]
Out[16]= 1986824
One can see that defining a = 2; increases MemoryInUse[] by 1986824 - 1986760 = 64 bytes. Replacing it with the definition a := 2; increases MemoryInUse[] by a further 1986952 - 1986824 = 128 bytes, and replacing the delayed definition with the immediate one reverts MemoryInUse[] to 1986824 bytes. This suggests that a delayed definition requires 128 bytes more than an immediate one.
Of course, this experiment does not prove my hypothesis.
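The measurement pattern above can be condensed into a small helper. This is only a sketch: memoryDelta is a hypothetical name introduced here, the absolute numbers depend heavily on the kernel version, and reading MemoryInUse[] inside a single input line does not reproduce the line-by-line In[$Line] effect exactly, so results may differ from the interactive transcript:

```mathematica
$HistoryLength = 0;

(* Measure the change in MemoryInUse[] caused by evaluating expr.
   HoldFirst keeps the assignment unevaluated until inside the body;
   MemoryInUse[] is read twice so transient allocations settle. *)
SetAttributes[memoryDelta, HoldFirst];
memoryDelta[expr_] :=
  Module[{before},
    MemoryInUse[]; before = MemoryInUse[];
    expr;
    MemoryInUse[]; MemoryInUse[] - before]

memoryDelta[a = 2]   (* immediate definition *)
memoryDelta[a := 2]  (* delayed definition replacing the immediate one *)
```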
The complete definition of a symbol can be accessed via the undocumented, new-in-8 symbols Language`ExtendedDefinition and Language`ExtendedFullDefinition. Citing Oleksandr Rasputinov:
"If anyone is curious, Language`ExtendedDefinition and Language`ExtendedFullDefinition are analogous to Definition and FullDefinition but capture the definition of a symbol in such a way that it can be reproduced in another kernel. For example, defs = Language`ExtendedFullDefinition[sym] returns a Language`DefinitionList object. The syntax used to restore the definition is highly irregular: Language`ExtendedFullDefinition[] = defs, where defs is a Language`DefinitionList. Note that Language`ExtendedFullDefinition takes the ExcludedContexts option whereas Language`ExtendedDefinition does not."
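The round trip Oleksandr describes can be sketched as follows (hedged: these Language` symbols are undocumented, so the exact behavior may vary between versions):

```mathematica
f[x_] := x^2;

(* capture the complete definition, including dependent symbols *)
defs = Language`ExtendedFullDefinition[f];
Head[defs]  (* Language`DefinitionList *)

(* wipe the symbol, then restore it using the irregular assignment syntax *)
ClearAll[f];
Language`ExtendedFullDefinition[] = defs;
Definition[f]
```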
Information calls Definition, and a Trace on Definition (or FullDefinition) shows nothing. I must assume that this is a low-level function that accesses data outside of the *Values tables. Perhaps it keeps a copy of the original definition expressions as they were parsed at the time.