Prevent ffmpeg from taking over stdout

When I do system "ffmpeg -i just-do-it.mp4 -ab 96k -ar 22050 -qscale 6 output.flv", ffmpeg takes over the Ruby process until the job is done, which sometimes takes a long time. I've tried using threads and fork in Ruby to no avail, as well as system-equivalent calls like exec and %x[]. I also tried the latest Fibers in Ruby 1.9.2, but I don't think I'm using them properly.

My question is how to run two ffmpeg processes from ruby concurrently?

EDIT:

fork do
  fork do
    system "ffmpeg -i you-know.mp4 -ab 96k -ar 22050 -qscale 6 #{Time.now.sec}.flv"
  end                            

  fork do
    system "ffmpeg -i bangbang.mp4 -ab 96k -ar 22050 -qscale 6 #{Time.now.sec}.flv"
  end
end


fork/exec is the right solution. Since forked processes inherit the parent process's open file handles (stdout, stderr, and so on), you'll have to close (or redirect) the handles you don't want the child processes to use.

For example:

# this will print nothing, but yes is running as a forked process
# you'll want to `killall yes` after running this script.
fork do
  [$stdout, $stderr].each { |fh| fh.reopen File.open("/dev/null", "w") }
  exec "yes"
end
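
One caveat with fire-and-forget children like this: the parent should eventually reap them, or they linger as zombie processes after they exit. A minimal sketch using Ruby's Process.detach (my addition, not part of the original answer):

pid = fork do
  [$stdout, $stderr].each { |fh| fh.reopen File.open("/dev/null", "w") }
  exec "yes"
end

# Start a watcher thread that waits on the child, so its process
# table entry is cleaned up whenever it exits.
Process.detach(pid)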

OK, some comments on the code you posted. The outer fork is pointless; just fork the two ffmpeg processes from the main process. Maybe write a helper function like:

def ffmpeg(mp4)
  fork do
    # silence the child, as above, so it doesn't take over the terminal
    [$stdout, $stderr].each { |fh| fh.reopen File.open("/dev/null", "w") }
    # flags taken from the question; the output name is just illustrative
    exec "ffmpeg -i #{mp4} -ab 96k -ar 22050 -qscale 6 #{File.basename(mp4, '.mp4')}.flv"
  end
end

ffmpeg("you-know.mp4")
ffmpeg("bangbang.mp4")
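
fork returns the child's pid (and so does the helper above), so if the parent needs to know when both encodes finish, it can collect the pids and block on them. A sketch along those lines, assuming the helper above:

pids = [ffmpeg("you-know.mp4"), ffmpeg("bangbang.mp4")]

# Both encodes run concurrently; block until each one exits.
pids.each do |pid|
  _, status = Process.waitpid2(pid)
  puts "pid #{pid} exited with status #{status.exitstatus}"
end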


Try the subprocess gem - it's what I'm using now for dealing with process forking, and I'm finding it much easier to use.

E.g.

work_list.each do |cmd|
  process = Subprocess::Popen.new(cmd)
  process.run
  process.wait
  #puts process.stdout
  #puts process.stderr
  puts process.status
end
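
Note that as written this loop runs the commands one at a time, since it waits on each process before starting the next. To get the concurrency the question asks about, start every process first and wait afterwards; a sketch reusing only the Popen calls shown above (I haven't verified the gem's API beyond them):

# start everything first...
processes = work_list.map do |cmd|
  process = Subprocess::Popen.new(cmd)
  process.run
  process
end

# ...then wait; all commands run concurrently in the meantime
processes.each do |process|
  process.wait
  puts process.status
end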
