
Is it possible to parse a web page from the client side for a large number of words, and if so, how?

I have a list of keywords, about 25,000 of them. I would like people who add a certain <script> tag on their web page to have these keywords transformed into links. What would be the best way to go about achieving this?

I have tried the simple JavaScript approach (an array with thousands of elements, then regexing and replacing each one), and it obviously slows down the browser.
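Roughly, that approach looks like the sketch below (the `keywords` array and the `linkFor()` URL builder are just placeholders):

```js
// Minimal sketch of the naive approach: one regex pass over the whole
// document per keyword. With ~25,000 keywords this is what locks up the
// browser. `keywords` and `linkFor()` are placeholders.
var keywords = ["apple", "banana" /* ... ~25,000 entries ... */];

function linkFor(word) {
  return '<a href="https://example.com/k/' + encodeURIComponent(word) + '">' + word + '</a>';
}

function naiveLinkify() {
  var html = document.body.innerHTML;
  for (var i = 0; i < keywords.length; i++) {
    // Each iteration rescans the entire page; replacements can also match
    // inside markup inserted by earlier iterations.
    var re = new RegExp('\\b' + keywords[i] + '\\b', 'g');
    html = html.replace(re, linkFor(keywords[i]));
  }
  document.body.innerHTML = html;
}

naiveLinkify();
```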

I could always process the content server-side if there were a way, from the client, to send the page's content to a cross-domain server script (I'm partial to PHP, but it could be anything), but I don't know of any way to do this.
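For the cross-domain part, a minimal sketch of what the client side could look like, assuming a CORS-enabled endpoint at https://example.com/linkify.php (the URL and the JSON response shape are assumptions; `fetch` is used here just for brevity):

```js
// Post the page text to a cross-domain server script. This only works if
// the server replies with CORS headers such as
// Access-Control-Allow-Origin (the endpoint URL is an assumption).
function sendPageToServer() {
  return fetch('https://example.com/linkify.php', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: 'url=' + encodeURIComponent(location.href) +
          '&text=' + encodeURIComponent(document.body.innerText)
  }).then(function (res) { return res.json(); });
}

// The server would scan the text against the 25,000 keywords and return
// only the handful it found, which is cheap to link on the client.
sendPageToServer().then(function (foundTerms) {
  console.log(foundTerms); // e.g. ["apple", "banana"]
});
```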

Any other working solution is also welcome.


I would have the remote site add a JavaScript file and use Ajax to connect to your site and fetch a list of only the relevant terms. Which terms?

  • Categories: If this is for advertising (where this concept has been done a lot), let publishers specify which category their site falls into and group your terms into those categories. Then send only the groups of terms for those categories. It is in their best interest to choose the right categories, because the more links they get, the more income they can generate.

  • Indexing: If that wouldn't work, then the first time someone loads the page you could fetch a copy of it on your server and index its words against your list of terms; on any subsequent load you already know which terms to send, based on what the page contains. Ideally a background process would then re-index their pages, say once a day or every few days, to catch updates. You could also have the script send a hash of the page contents and update your indexed copy whenever the hash changes at all (see the sketch after this list).
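A minimal sketch of the client side of this: the embedded script sends the page URL, its declared category, and a cheap content hash, and the server answers with the term list it has indexed for that page. The /terms.php endpoint, the data-category attribute, and the response shape are all assumptions.

```js
// Read an optional category the publisher declared on the embed tag, e.g.
// <script src=".../linkify.js" data-category="cooking"></script>
// (the data-category attribute is an assumption).
var embed = document.currentScript;
var category = embed ? embed.getAttribute('data-category') || '' : '';

// Cheap 32-bit rolling hash, just enough to tell "the page changed at all"
// so the server knows when to queue a re-index.
function hashText(text) {
  var h = 0;
  for (var i = 0; i < text.length; i++) {
    h = (h * 31 + text.charCodeAt(i)) | 0;
  }
  return h >>> 0;
}

// Ask the server for the terms it has indexed for this page. The server can
// compare the hash with its stored copy and re-index in the background.
function fetchTermsForPage() {
  var hash = hashText(document.body.innerText);
  return fetch('https://example.com/terms.php' +
               '?url=' + encodeURIComponent(location.href) +
               '&category=' + encodeURIComponent(category) +
               '&hash=' + hash)
    .then(function (res) { return res.json(); });
}

fetchTermsForPage().then(function (terms) {
  // With only the terms that actually matter for this page, a single
  // replace pass is fast enough to run on the client.
  console.log(terms); // e.g. ["apple", "banana"]
});
```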

I'm sure there are other methods; which is best is really just preference. Try looking at a few other advertising-link sites/scripts and see how they do it.
