Why does `a || 1` throw an error in Ruby when `a` is undefined, but `a = a || 1` does not?
When `a` is undefined, `a || 1` will throw an error, but `a = a || 1` will not. Isn't that a little bit inconsistent?
irb(main):001:0> a
NameError: undefined local variable or method `a' for main:Object
from (irb):1
from c:/ruby/bin/irb:12:in '<main>'
irb(main):002:0> a || 1
NameError: undefined local variable or method `a' for main:Object
from (irb):2
from c:/ruby/bin/irb:12:in '<main>'
irb(main):003:0> a = a || 1
=> 1
a
Here, you are evaluating `a`, which isn't defined. Therefore, you get an exception.
a || 1
Here, you still have to evaluate `a` to determine the value of the boolean expression. Just like above, `a` isn't defined, therefore you get an exception.
a = a || 1
Here, `a` is defined. It is defined to be an uninitialized local variable. In Ruby, uninitialized variables evaluate to `nil`, so the right-hand side of the assignment expression evaluates to `nil || 1`, which evaluates to `1`. The return value of the assignment expression is therefore `1`, and the side effect is that `a` is initialized to `1`.
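A minimal sketch of the same effect (the variable name `b` is arbitrary, used here just for illustration): the bare assignment alone is enough to define the variable, even when the right-hand side reads it.
b = b   # the parser defines `b` the moment it sees the assignment
p b     # => nil — `b` is defined but holds only the uninitialized value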
EDIT: It seems that there is some confusion about when variables get defined and when they get initialized in Ruby. They get defined at parse time but initialized at runtime. You can see it here:
foo # => NameError: undefined local variable or method `foo' for main:Object
`foo` is undefined.
if false
foo = 'This will never get executed'
end
At this point, `foo` is defined, even though the line will never get executed. The fact that the line never gets executed is completely irrelevant, because the interpreter has nothing to do with this anyway: local variables are defined by the parser, and the parser obviously sees this line.
foo # => nil
There is no error, because `foo` is defined, and it evaluates to `nil` because it is uninitialized.
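A small runnable sketch of this (the name `baz` is my own, not from the original): definition is positional, so a reference that appears before the assignment in the source still raises, while one after it does not.
begin
  baz                # the parser hasn't seen an assignment to `baz` yet
rescue NameError => e
  puts e.message     # undefined local variable or method `baz' for main:Object
end
baz = 1 if false     # parsed, but never executed
p baz                # => nil — defined from the assignment line onward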
When you do `a || 1`, you're asking Ruby to look up the value of `a`, which is undefined. When you do `a = a || 1`, the assignment itself defines `a`, so looking up its value on the right-hand side no longer gives an error.
So, although weird, I don't believe it to be inconsistent.
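A quick way to check this for yourself, assuming a fresh irb session or script (`defined?` never raises, so it is safe to call on an undefined name):
p defined?(a)   # => nil — no assignment to `a` has been parsed yet
a = a || 1      # the parser defines `a`; at runtime, nil || 1 evaluates to 1
p defined?(a)   # => "local-variable"
p a             # => 1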
Is this what you mean?
if !(defined? a) then
  a = 1
end
It might be simpler to declare the variable with 1 as its default value.
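For what it's worth, the idiomatic shorthand for this pattern is `||=`, which has the same effect as `a = a || 1` when `a` is undefined or `nil` (a minimal sketch):
a ||= 1   # defines `a` and assigns 1, since `a` is nil/undefined here
p a       # => 1
Note that `||=` also reassigns when the existing value is `false`, not only `nil`.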