
Rails or Node for web parser?

I want to build a parser that parses multiple pages of an XML document. These pages include images and other media, which are served by a separate static asset server. Now my question is:

Should I use Rails to parse the document, or build a dedicated Node server that does it? I thought about Node because of the performance.

The parser creates an HTML document out of the XML file.

Thanks in advance!

Edit: these HTML pages get viewed by multiple users.


I don't think it makes much of a difference whether you can parse 100k pages per second with Ruby or 200k pages per second with Node.js unless you're parsing billions of XML documents (I made those numbers up). You should use the tools you're most familiar with.

However, there are some cool libraries for web scraping/parsing in Node.js:

  • You can use jQuery with node.js! (see the sketch after this list)
  • node.io - a web scraper module; supports jQuery, distributed processing, modules and more
  • another node.js web scraper module - fetches pages in parallel and supports rate limiting
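
Here is a minimal sketch of the XML-to-HTML step in Node using cheerio, one of the jQuery-style libraries for this (the `<page>`/`<title>`/`<body>`/`<image>` element names and the asset-server URL are assumptions about your document, not something from your question):

```js
// A minimal sketch, assuming the XML contains <page> elements with <title>,
// <body> and <image> children -- adjust the selectors to your real structure.
const fs = require('fs');
const cheerio = require('cheerio');

function xmlToHtml(xmlPath, assetHost) {
  // xmlMode makes cheerio treat the input as XML rather than HTML.
  const $ = cheerio.load(fs.readFileSync(xmlPath, 'utf8'), { xmlMode: true });

  // Build one HTML fragment per <page> element.
  return $('page').map((i, page) => {
    const $page = $(page);
    const images = $page.find('image').map((j, img) =>
      // Media lives on the separate static asset server, so only reference
      // it by URL here instead of fetching or copying it.
      `<img src="${assetHost}/${$(img).attr('src')}">`
    ).get().join('\n  ');

    return `<article>
  <h1>${$page.find('title').text()}</h1>
  <div>${$page.find('body').text()}</div>
  ${images}
</article>`;
  }).get();
}

// Example (hypothetical paths):
// console.log(xmlToHtml('document.xml', 'https://assets.example.com').join('\n'));
```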


I'd use Node because Rails seems like overkill for this. However, if you're more familiar with Ruby than with Rails, you could do it with Sinatra or another smaller framework. My point is simply that Rails is overkill for this.
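
Since the same generated pages are viewed by multiple users, you can keep the server small either way and avoid re-parsing on every request. This is a sketch, not a full design; `renderPage()` is a hypothetical helper standing in for the XML-to-HTML step above:

```js
// A minimal sketch of serving the generated HTML without a full framework.
// The result is produced once per page and cached in memory, because the
// same pages are requested by many users.
const http = require('http');

const cache = new Map();

http.createServer((req, res) => {
  const id = req.url.slice(1) || 'index';
  if (!cache.has(id)) {
    cache.set(id, renderPage(id)); // hypothetical: parse the XML page once
  }
  res.writeHead(200, { 'Content-Type': 'text/html; charset=utf-8' });
  res.end(cache.get(id));
}).listen(3000);
```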
