
Is it possible to track or store the visited or clicked URLs in webpages created using servlets and JSP?

I created a webpage containing more than 10 links. Is it possible to store the clicked or visited URLs in my database or a file? I have developed the webpage using a servlet; it is just a page with 10 links.


You may want to consider using a Filter for this instead. You can then use one Filter to track which pages are read.

import java.io.IOException;
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;

public class LogFilter implements Filter {

  public void init(FilterConfig config) throws ServletException {}

  public void doFilter(ServletRequest req, ServletResponse res,
      FilterChain chain) throws IOException, ServletException {

    HttpServletRequest request = (HttpServletRequest) req;

    // use request.getRequestURI() and log it out to a database or file

    // pass the request along so the page is still served
    chain.doFilter(req, res);
  }

  public void destroy() {}
}

HttpServletRequest.getRequestURI() will give you which URL they're trying to view.

You'll need a filter-mapping in your web.xml to activate the filter:

   <filter>
      <filter-name>LogFilter</filter-name>
      <filter-class>LogFilter</filter-class>
   </filter>
   <filter-mapping>
      <filter-name>LogFilter</filter-name>
      <!-- 
        this will match any url, you probably want it to either match your jsp urls
        with *.jsp or your servlets with whatever you're using as your servlet-mapping
        e.g. *.html
      -->
      <url-pattern>/*</url-pattern>
   </filter-mapping>  

Edit: If they're external links, you could consider rerouting them through a single servlet that logs the click and then redirects to the external URL.


Capturing simple hits on your own site/servlet is easy as pie - if you just want to know the URLs visited, you can almost always set your webserver to record this to a logfile. (Then if desired you can parse and insert this into your database).
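To illustrate the "parse the logfile and insert into your database" step, here is a minimal sketch that pulls the requested path out of a line in Apache's Common Log Format. The class name `AccessLogLine` and the sample log line are my own illustrations, not anything from your application:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AccessLogLine {

    // Common Log Format: host ident authuser [date] "METHOD /path HTTP/1.x" status bytes
    private static final Pattern REQUEST =
        Pattern.compile("\"[A-Z]+ (\\S+) HTTP/[\\d.]+\"");

    // Returns the requested path from one log line, or null if it doesn't parse.
    public static String requestedPath(String logLine) {
        Matcher m = REQUEST.matcher(logLine);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String line = "127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "
            + "\"GET /apache_pb.gif HTTP/1.0\" 200 2326";
        System.out.println(requestedPath(line)); // the path the visitor hit
    }
}
```

From there, each extracted path can be written to your database with a plain JDBC insert.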

Capturing outbound links requires a little more work, and there are many ways to go about this, with the best depending on your specific requirements (complexity of URLs/methods, complexity of alternative link attributes such as target, desired robustness, desired performance, desired comprehensiveness, etc.).

One solution, as mentioned above, is changing all outbound links into references to a redirection page on your own site, with the target as a parameter. Then in your typical webserver logs you'll see a hit something like http://example.com/redir.html?target=http%3A%2F%2Fwww.bbc.co.uk%2Fnews, from which you can easily parse the target URL.
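As a sketch of that parsing step, the snippet below decodes the `target` parameter from a logged redirect URI. The class and method names are illustrative only; in a servlet you would of course use `request.getParameter("target")` instead:

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;

public class RedirectLogParser {

    // Extract the decoded target URL from a logged redirect hit,
    // e.g. "/redir.html?target=http%3A%2F%2Fwww.bbc.co.uk%2Fnews"
    public static String extractTarget(String loggedUri) {
        int idx = loggedUri.indexOf("target=");
        if (idx < 0) {
            return null;
        }
        String encoded = loggedUri.substring(idx + "target=".length());
        // stop at any subsequent query parameter
        int amp = encoded.indexOf('&');
        if (amp >= 0) {
            encoded = encoded.substring(0, amp);
        }
        return URLDecoder.decode(encoded, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(extractTarget(
            "/redir.html?target=http%3A%2F%2Fwww.bbc.co.uk%2Fnews"));
        // prints: http://www.bbc.co.uk/news
    }
}
```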

Another would involve firing a manual request to your server when the link is clicked (typically a 1x1 image), to which you can append any query parameters you like and later extract them from your logfile. This may or may not involve interrupting the click with a JavaScript event listener and retriggering the navigation afterwards, depending on exactly how you make the request; if not explicitly handled, this often leads to a race condition between the beacon request and the page change.
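To sketch the server side of that beacon approach: the clicked URL gets encoded into the image request's query string, so it lands in your logfile intact. The `/pixel.gif` path and `link` parameter name below are hypothetical examples:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class Beacon {

    // Build a 1x1-image beacon URL that carries the clicked link as a
    // query parameter; "/pixel.gif" and "link" are illustrative names.
    public static String beaconUrl(String clickedUrl) {
        return "/pixel.gif?link="
            + URLEncoder.encode(clickedUrl, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(beaconUrl("http://www.bbc.co.uk/news"));
    }
}
```

Each hit on the pixel then shows up in your access log with the clicked link recoverable by URL-decoding the parameter.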

All of the options above that make explicit requests for the purpose of creating a line in the logfiles, could instead make a request to a particular page (e.g. a JSP) that writes entries directly into your database, if required (though ideally this would still keep some kind of log to allow for retrying failed inserts).

Having worked in web analytics for several years, I can tell you that it's surprising just how many different situations there are to consider if you want to capture as much accurate information as possible. If this is just for a toy website for your own edification then just about any technique will likely be acceptable, but as soon as you start to put any weight behind the figures you really need to care about the things you never considered. There's a reason why a market exists for several relatively expensive web analytics software packages... :-)

Edit: As prakash's comment implies, my recommendation would be to inspect the range of free web analytics tools already out there and integrate one if you actually care about the results. If you're just doing this out of curiosity, or as a throwaway low-importance project, then by all means get your hands dirty.


If the links are internal to your site you should look at access logs or an analytics package like google analytics.

If they are external and you are looking to track click-through rates, you could have the links actually go to your server, where you would log the hit and then redirect to the true destination.


You could do it the same way Google does: encode each link to be a link through your website which it subsequently does a redirect for.


Do it the simple way: Add Google Analytics to your site.
