
Getting an error when uploading a file to Azure Storage

I'm converting a standard ASP.NET website to run on Azure. The website previously took an Excel file uploaded by an administrative user and saved it on the file system. As part of the migration, I'm saving this file to Azure Storage instead. It works fine when running against local storage through the Azure SDK. (I'm using version 1.3 since I didn't want to upgrade during the development process.)

When I point the code at Azure Storage itself, though, the process usually fails. The error I get is: System.IO.IOException occurred

  Message=Unable to read data from the transport connection: The connection was closed.
  Source=Microsoft.WindowsAzure.StorageClient
  StackTrace:
       at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result()
       at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait()
       at Microsoft.WindowsAzure.StorageClient.CloudBlob.UploadFromStream(Stream source, BlobRequestOptions options)
       at Framework.Common.AzureBlobInteraction.UploadToBlob(Stream stream, String BlobContainerName, String fileName, String contentType) in C:\Development\RateSolution2010\Framework.Common\AzureBlobInteraction.cs:line 95
  InnerException: 

The code is as follows:

public void UploadToBlob(Stream stream, string BlobContainerName, string fileName,
        string contentType)
    {
        // Setup the connection to Windows Azure Storage
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnStr());

        DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();
        dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
        dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        DiagnosticMonitor.Start(storageAccount, dmc);      
        CloudBlobClient BlobClient = null;
        CloudBlobContainer BlobContainer = null;
        BlobClient = storageAccount.CreateCloudBlobClient();

        // For large file copies you need to set up a custom timeout period
        // and using parallel settings appears to spread the copy across multiple threads
        // if you have big bandwidth you can increase the thread number below
        // because Azure accepts blobs broken into blocks in any order of arrival.
        BlobClient.Timeout = new System.TimeSpan(1, 0, 0);
        Role serviceRole = RoleEnvironment.Roles.Where(s => s.Value.Name == "OnlineRates.Web").First().Value;
        BlobClient.ParallelOperationThreadCount = serviceRole.Instances.Count;  

        // Get and create the container
        BlobContainer = BlobClient.GetContainerReference(BlobContainerName);
        BlobContainer.CreateIfNotExist();

        //delete prior version if one exists
        BlobRequestOptions options = new BlobRequestOptions();
        options.DeleteSnapshotsOption = DeleteSnapshotsOption.None;
        CloudBlob blobToDelete = BlobContainer.GetBlobReference(fileName);
        Trace.WriteLine("Blob " + fileName + " deleted to be replaced by newer version.");
        blobToDelete.DeleteIfExists(options);

        //set stream to starting position
        stream.Position = 0;
        long totalBytes = 0;
        //Open the stream and read it back.
        using (stream)
        {
            // Create the Blob and upload the file
            CloudBlockBlob blob = BlobContainer.GetBlockBlobReference(fileName);
            try
            {
                BlobClient.ResponseReceived += new EventHandler<ResponseReceivedEventArgs>((obj, responseReceivedEventArgs)
                =>
                {
                    if (responseReceivedEventArgs.RequestUri.ToString().Contains("comp=block&blockid"))
                    {
                        totalBytes += Int64.Parse(responseReceivedEventArgs.RequestHeaders["Content-Length"]);
                    }
                });                 
                blob.UploadFromStream(stream);
                // Set the metadata into the blob
                blob.Metadata["FileName"] = fileName;
                blob.SetMetadata();
                // Set the properties
                blob.Properties.ContentType = contentType;
                blob.SetProperties();
            }
            catch (Exception exc)
            {
                Logging.ExceptionLogger.LogEx(exc);
            }
         }
     }

I've tried a number of different alterations to the code: deleting a blob before replacing it (although the problem exists on new blobs as well), setting container permissions, not setting permissions, etc.


Your code looks like it should work, but it has a lot of extra functionality that is not strictly required. I would cut it down to an absolute minimum and go from there. It's really only a gut feeling, but I think it might be the using statement giving you grief. This entire function could be written (presuming the container already exists) as:

public void UploadToBlob(Stream stream, string BlobContainerName, string fileName,
    string contentType)
{
    // Setup the connection to Windows Azure Storage
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnStr());
    CloudBlobClient BlobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer BlobContainer = BlobClient.GetContainerReference(BlobContainerName);
    CloudBlockBlob blob = BlobContainer.GetBlockBlobReference(fileName);
    stream.Position = 0;
    blob.UploadFromStream(stream);
}

Notes on the stuff that I've removed:

  • You should set up diagnostics just once, when your app starts, not every time a method is called. Usually this is done in RoleEntryPoint.OnStart().
  • I'm not sure why you're setting ParallelOperationThreadCount based on the number of role instances. Those two things seem unrelated.
  • It's not good form to check for the existence of a container/table every time you save something to it. It's more usual to do that check once when your app starts, or to have a process external to the website make sure all the required containers/tables/queues exist. Of course, this doesn't apply if you're creating containers dynamically.
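To illustrate the first point, here is a minimal sketch of moving the diagnostics setup out of the upload method and into the role's entry point, using the same SDK 1.3 era APIs the question already calls. The class name `WebRole` and the `"DiagnosticsConnectionString"` setting name are assumptions based on the usual project template conventions of that SDK version; adjust them to match your project.

```csharp
using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Configure diagnostics once at role startup rather than on every upload.
        DiagnosticMonitorConfiguration dmc =
            DiagnosticMonitor.GetDefaultInitialConfiguration();
        dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
        dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

        // "DiagnosticsConnectionString" is the setting name the SDK 1.3
        // templates use by default; this overload reads the connection
        // string from the service configuration.
        DiagnosticMonitor.Start("DiagnosticsConnectionString", dmc);

        return base.OnStart();
    }
}
```

With this in place, UploadToBlob no longer needs any of the DiagnosticMonitor code, which also removes one more network round trip from the hot path of the upload.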


The problem turned out to be the firewall settings on my laptop. It's my personal laptop, originally set up at home, so the firewall rules weren't configured for a corporate environment, which resulted in slow uploads and downloads.
