
Making network-intensive code more robust

I have code that interacts with Amazon S3, and the files transferred are usually big (a couple of gigabytes). What suggestions can you make to make the transfers more robust in case of failure? Also, is there a general strategy for implementing robustness in network code? Is something like the following acceptable for trying an operation 3 times? Any tips are appreciated.

public void downloadFile(String path, int retries) {
    if (retries == 3) return;               // give up after 3 attempts
    Connection con = new ConnectToAmazon();
    try {
        con.saveFileToDisk(path, LocalDiskPath);
    } catch (Exception e) {
        // note: retries++ would pass the *old* value and recurse forever
        downloadFile(path, retries + 1);
    }
}
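A common general strategy for network robustness is a bounded, iterative retry loop with exponential backoff, rather than recursion: it caps the number of attempts, avoids growing the call stack, and spaces out retries so a briefly overloaded endpoint can recover. A minimal self-contained sketch (the class name, `maxAttempts`, and the simulated flaky operation are illustrative, not from any library):

```java
import java.util.concurrent.Callable;

public class RetryingExecutor {
    /** Runs op up to maxAttempts times, doubling the wait between failures. */
    public static <T> T callWithRetries(Callable<T> op, int maxAttempts, long initialBackoffMs)
            throws Exception {
        Exception last = null;
        long backoff = initialBackoffMs;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return op.call();          // success: return immediately
            } catch (Exception e) {
                last = e;                  // remember the most recent failure
                if (attempt < maxAttempts) {
                    Thread.sleep(backoff); // wait before the next attempt
                    backoff *= 2;          // exponential backoff
                }
            }
        }
        throw last;                        // every attempt failed
    }

    public static void main(String[] args) throws Exception {
        // Simulated flaky download: fails twice, then succeeds.
        final int[] calls = {0};
        String result = callWithRetries(() -> {
            if (++calls[0] < 3) throw new RuntimeException("transient network error");
            return "downloaded";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

In production code you would typically also retry only on exceptions that are actually transient (timeouts, resets) and add jitter to the backoff so many clients don't retry in lockstep.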


Make sure that you upload using the multipart upload API; more info is in the documentation:

http://docs.amazonwebservices.com/AmazonS3/latest/dev/index.html?uploadobjusingmpu.html
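The reason multipart upload helps with multi-gigabyte files is that a network failure only forces re-sending the one part that failed, not the whole transfer. Here is a self-contained sketch of that idea under stated assumptions: the `PartUploader` interface, the part size, and the retry count are all hypothetical stand-ins; real code would call the AWS SDK's multipart operations instead.

```java
import java.util.Arrays;

public class MultipartSketch {
    /** Hypothetical transport for one part; in real code this would be
     *  S3's UploadPart call. */
    interface PartUploader {
        void uploadPart(int partNumber, byte[] data) throws Exception;
    }

    /** Splits data into fixed-size parts and retries each part independently. */
    public static int upload(byte[] data, int partSize, PartUploader uploader, int maxAttempts)
            throws Exception {
        int parts = 0;
        for (int off = 0; off < data.length; off += partSize) {
            byte[] part = Arrays.copyOfRange(data, off, Math.min(off + partSize, data.length));
            int partNumber = ++parts;
            Exception last = null;
            for (int attempt = 0; attempt < maxAttempts; attempt++) {
                try {
                    uploader.uploadPart(partNumber, part); // only this chunk is re-sent
                    last = null;
                    break;
                } catch (Exception e) {
                    last = e;
                }
            }
            if (last != null) throw last;                  // this part never succeeded
        }
        return parts;
    }

    public static void main(String[] args) throws Exception {
        byte[] data = new byte[25];                        // stands in for a large file
        // Flaky uploader: the first attempt at part 2 fails once.
        final boolean[] failedOnce = {false};
        int parts = upload(data, 10, (n, bytes) -> {
            if (n == 2 && !failedOnce[0]) {
                failedOnce[0] = true;
                throw new RuntimeException("timeout");
            }
        }, 3);
        System.out.println("uploaded " + parts + " parts");
    }
}
```

With the AWS SDK for Java, the high-level `TransferManager` class can handle the part splitting and parallel part uploads for you, so you rarely need to write this loop by hand.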

