Improving an 'each' statement
I am using Ruby on Rails 3 and I would like to improve the following code; is there a better way to do the same thing?
query = {}
params.each { |k,v| query[k.singularize] = v }
How can I do that?
If you were actually finding that
params.each { |k,v| query[k.singularize] = v }
was taking too long, then singularize would be taking up most of your time.
If most of the words were the same, I'd consider memoization.
Actually, if you had ten thousand parameters, I'd consider a code review!
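A minimal memoization sketch (the singular_cache name is mine, and it assumes ActiveSupport's singularize is available): cache each singularized form so repeated keys hit the inflector only once.
# Hash with a default block: computes and stores the singular form on first lookup
singular_cache = Hash.new { |cache, word| cache[word] = word.singularize }
query = {}
params.each { |k, v| query[singular_cache[k]] = v }
This only pays off when many keys repeat; for a handful of unique parameters it is just overhead.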
query = Hash[params.map{|k, v| [k.singularize, v]}]
As nzifnab wrote, my other code turned out to be slow. The following may be slightly faster than the originally posted version.
query = params.each_with_object({}){|(k, v), h| h[k.singularize] = v}
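For what it's worth, plain Ruby 2.5+ later added Hash#transform_keys, which expresses the same idea directly (it did not exist when this question was asked):
query = params.transform_keys { |k| k.singularize }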
Well, I had an idea (the same as sawa's) and wanted to know whether it was an improvement. Here are the benchmark results:
params = {'puppies' => 'cute',
          'dinosaurs' => 'angry',
          'kittens' => 'kill them all',
          'wat' => 4}
require 'benchmark' # already loaded in a Rails console

Benchmark.bm do |x|
  x.report(".each"){10000.times{query = {}; params.each{ |k,v| query[k.singularize] = v }}}
  x.report("Hash"){10000.times{query = Hash[params.map{|k, v| [k.singularize, v]}]}}
end
And the result:
            user     system      total        real
.each   3.850000   0.390000   4.240000 (  4.260567)
Hash    3.910000   0.400000   4.310000 (  4.402304)
So there's very little difference, although Hash is, sadly, a slight regression rather than an improvement, if performance is your concern.
I still tend to use the Hash[] format just because I like how .map works... but .map has to loop through every single item too, so it's no different.
EDIT:
I went with the comment suggestion and benchmarked one really large hash instead of a tiny hash 10,000 times. Here are the results:
myhash = {}
20000.times do |i|
  myhash[i.to_s * 2 + 's'] = i  # keys like "00s", "11s", "22s", ...
end
Benchmark.bm do |x|
  x.report(".each"){query = {}; myhash.each{|k,v| query[k.singularize] = v}}
  x.report("Hash"){query = Hash[myhash.map{|k,v| [k.singularize, v]}]}
end
Results:
            user     system      total        real
.each   1.980000   0.110000   2.090000 (  2.100811)
Hash    2.040000   0.140000   2.180000 (  2.176588)
Edit 2: Credit goes to sawa for this third method:
Benchmark.bm do |x|
  x.report(".each"){query = {}; myhash.each{|k,v| query[k.singularize] = v}}
  x.report("Hash"){query = Hash[myhash.map{|k,v| [k.singularize, v]}]}
  x.report("with_object"){query = myhash.each_with_object({}){|(k, v), h| h[k.singularize] = v}}
end
                  user     system      total        real
.each         2.050000   0.110000   2.160000 (  2.174315)
Hash          2.070000   0.110000   2.180000 (  2.187600)
with_object   2.100000   0.110000   2.210000 (  2.207763)
If you (or someone) can find a way to modify each key in-place, I suspect that would be the fastest way to do it:
params.each{|arr| arr[0].singularize!}
But you can't do that, because singularize! is not defined, and when you try this:
params.each{|arr| arr[0].gsub!('s', '')}
You get an error:
TypeError: can't modify frozen string
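That error is expected: Ruby duplicates and freezes string keys when they are inserted into a hash, which you can check directly:
h = { 'dogs' => 1 }
h.keys.first.frozen? # => true
The closest you can get to in-place is deleting and re-inserting each key, mutating params itself (a sketch I haven't benchmarked):
params.keys.each { |k| params[k.singularize] = params.delete(k) }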
I would just stick with the original version :p
I tried the following in Ruby YARV 1.9.1, without ActiveSupport (hence reverse rather than singularize):
require "benchmark"
myhash = {}
2000000.times do |i|
myhash[i.to_s * 2 + 's'] = i
end
Benchmark.bm do |x|
x.report(".each"){query = {}; myhash.each{|k,v| query[k.reverse] = v}}
x.report("Hash"){query = Hash[myhash.map{|k,v| [k.reverse, v]}]}
puts RUBY_VERSION
puts RUBY_ENGINE if defined?(RUBY_ENGINE)
end
gave me
            user     system      total        real
.each   6.350000   0.070000   6.420000 (  6.415588)
Hash    5.710000   0.100000   5.810000 (  5.795611)
1.9.1
ruby
So in my case, Hash was faster.
Considering the difference in speed between my benchmark and nzifnab's, I'd want to check that the bulk of the time wasn't spent on singularize.
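One quick way to check (a sketch, assuming ActiveSupport is loaded so singularize exists) is to time the inflector on its own, with no hash building at all:
require 'benchmark'
Benchmark.bm do |x|
  x.report("singularize only"){ myhash.each_key{|k| k.singularize} }
end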
Update:
Under 1.8.7:
             user     system      total        real
.each   11.640000   0.380000  12.020000 ( 12.019372)
Hash    15.010000   0.540000  15.550000 ( 15.552186)
1.8.7
So it's slower using Hash under 1.8?