I recently put together a small tool that uses the AWS SDK to watch a local directory and automatically sync its files to S3. It uploads asynchronously on multiple threads and splits large files into chunks.

References:

https://www.codeproject.com/Articles/131678/Amazon-S-Sync

https://aws.amazon.com/cn/documentation/s3/
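The opening note mentions uploading large files in chunks across multiple threads. A minimal, stdlib-only Python sketch of that idea is below; `upload_part` is a hypothetical stand-in for the real SDK call (the actual S3 multipart API also requires create- and complete-upload calls around the part uploads):

```python
import concurrent.futures

CHUNK_SIZE = 5 * 1024 * 1024  # S3 multipart parts must be >= 5 MB, except the last

def split_into_parts(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Yield (part_number, chunk) pairs, 1-based as the S3 multipart API expects."""
    for i in range(0, len(data), chunk_size):
        yield (i // chunk_size + 1, data[i:i + chunk_size])

def upload_parts_concurrently(data: bytes, upload_part,
                              chunk_size: int = CHUNK_SIZE, max_workers: int = 4):
    """Upload all parts on a thread pool.  upload_part(part_number, chunk)
    is a placeholder for the real SDK call and returns an ETag-like token."""
    results = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(upload_part, n, chunk): n
                   for n, chunk in split_into_parts(data, chunk_size)}
        for fut in concurrent.futures.as_completed(futures):
            results[futures[fut]] = fut.result()
    # The complete-upload call needs parts listed in ascending part-number order.
    return [results[n] for n in sorted(results)]
```

Because the parts finish in arbitrary order on the pool, the results are collected into a dict keyed by part number and re-sorted at the end.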

Introduction

The SprightlySoft S3 Sync application allows you to take a folder on your computer and upload it to Amazon S3. You can make additions, deletions, and changes to your local files, and the next time you run the application, it will detect these changes and apply them to S3. This program allows you to create a mirror of a local folder on S3 and always keep it up to date.

Background

Amazon Simple Storage Service (Amazon S3) is a service that allows you to store files in Amazon's cloud computing environment. When your files are in Amazon's system, you can retrieve the files from anywhere on the web. You can also use Amazon's CloudFront service in conjunction with S3 to distribute your files to millions of people. Amazon S3 is the same highly scalable, reliable, secure, fast, inexpensive infrastructure that Amazon uses to run its own global network of web sites. Best of all, Amazon S3 is free for the first 5 GB of storage.

SprightlySoft S3 Sync uses the free SprightlySoft AWS component for .NET to interact with Amazon S3. In the code, you will see an example of uploading files to S3, deleting files from S3, listing files, and getting the properties of files.

Using the code

The SprightlySoft S3 Sync program works by listing your files on S3, listing your files locally, and comparing the differences between the two lists. Here is the logic of the program:

    1. The code starts by getting a list of all user settings. These include your Amazon AWS credentials, the S3 bucket you are syncing to, and the local folder you are syncing from.
    2. The next major call is the PopulateS3HashTable function. Here, all the files in your Amazon S3 bucket are listed. The code calls the ListBucket function from the SprightlySoft AWS component. This function returns an ArrayList of objects that represent each item in S3. Properties of each object include S3 key name, size, date modified, and ETag. These objects are added to a HashTable so they can be used later on.
private static Boolean PopulateS3HashTable(String AWSAccessKeyId,
    String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
    String BucketName, String UploadPrefix,
    ref System.Collections.Hashtable S3HashTable)
{
    Boolean RetBool;
    SprightlySoftAWS.S3.ListBucket MyListBucket = new SprightlySoftAWS.S3.ListBucket();

    RetBool = MyListBucket.ListBucket(UseSSL, RequestEndpoint, BucketName,
        "", UploadPrefix, AWSAccessKeyId, AWSSecretAccessKey);

    if (RetBool == true)
    {
        foreach (SprightlySoftAWS.S3.ListBucket.BucketItemObject MyBucketItemObject
            in MyListBucket.BucketItemsArrayList)
        {
            WriteToLog("  Item added to S3 list.  KeyName=" + MyBucketItemObject.KeyName, 2);
            S3HashTable.Add(MyBucketItemObject.KeyName, MyBucketItemObject);
        }
    }
    else
    {
        WriteToLog("  Error listing files on S3.  ErrorDescription=" +
            MyListBucket.ErrorDescription);
        WriteToLog(MyListBucket.LogData, 3);
    }

    return RetBool;
}
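The same indexing step can be sketched in a few lines of Python. Here `bucket_items` stands in for the list the ListBucket call returns, and the dict fields (`KeyName`, `Size`, `ETag`) are an assumed shape, not the component's real type:

```python
def populate_s3_table(bucket_items, upload_prefix=""):
    """Index a bucket listing by key, mirroring PopulateS3HashTable.
    Each item is assumed to be a dict with 'KeyName', 'Size', 'ETag'."""
    table = {}
    for item in bucket_items:
        # ListBucket already filters by prefix server-side; this mirrors it locally.
        if item["KeyName"].startswith(upload_prefix):
            table[item["KeyName"]] = item
    return table
```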
    3. The next call is PopulateLocalArrayList. This function lists all the local files and folders you want to synchronize. Each file and folder is added to an ArrayList so it can be used later on. The function has options to include only certain files or to exclude certain files. It is recursive: it calls itself for each subfolder.
private static void PopulateLocalArrayList(String BaseFolder,
    String CurrentFolder, Boolean IncludeSubFolders,
    System.Collections.ArrayList ExcludeFolderslArrayList,
    String IncludeOnlyFilesRegularExpression,
    String ExcludeFilesRegularExpression,
    ref System.Collections.ArrayList LocalArrayList)
{
    System.IO.DirectoryInfo CurrentDirectoryInfo;
    CurrentDirectoryInfo = new System.IO.DirectoryInfo(CurrentFolder);
    String FolderName;

    foreach (System.IO.FileInfo MyFileInfo in CurrentDirectoryInfo.GetFiles())
    {
        if (ExcludeFilesRegularExpression != "")
        {
            //Check if the file is excluded.
            if (System.Text.RegularExpressions.Regex.IsMatch(
                MyFileInfo.FullName, ExcludeFilesRegularExpression,
                System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
            {
                WriteToLog("  Local file excluded by ExcludeFilesRegularExpression.  Name=" +
                    MyFileInfo.FullName, 2);
            }
            else
            {
                if (IncludeOnlyFilesRegularExpression != "")
                {
                    if (System.Text.RegularExpressions.Regex.IsMatch(
                        MyFileInfo.FullName, IncludeOnlyFilesRegularExpression,
                        System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
                    {
                        WriteToLog("  File added to local list.  Name=" +
                            MyFileInfo.FullName, 2);
                        LocalArrayList.Add(MyFileInfo.FullName);
                    }
                    else
                    {
                        WriteToLog("  Local file not included by IncludeOnlyFilesRegularExpression.  Name=" +
                            MyFileInfo.FullName, 2);
                    }
                }
                else
                {
                    WriteToLog("  File added to local list.  Name=" +
                        MyFileInfo.FullName, 2);
                    LocalArrayList.Add(MyFileInfo.FullName);
                }
            }
        }
        else
        {
            if (IncludeOnlyFilesRegularExpression != "")
            {
                if (System.Text.RegularExpressions.Regex.IsMatch(
                    MyFileInfo.FullName, IncludeOnlyFilesRegularExpression,
                    System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
                {
                    WriteToLog("  File added to list.  Name=" +
                        MyFileInfo.FullName, 2);
                    LocalArrayList.Add(MyFileInfo.FullName);
                }
                else
                {
                    WriteToLog("  Local file not included by IncludeOnlyFilesRegularExpression.  Name=" +
                        MyFileInfo.FullName, 2);
                }
            }
            else
            {
                WriteToLog("  File added to local list.  Name=" +
                    MyFileInfo.FullName, 2);
                LocalArrayList.Add(MyFileInfo.FullName);
            }
        }
    }

    if (IncludeSubFolders == true)
    {
        foreach (System.IO.DirectoryInfo SubDirectoryInfo in CurrentDirectoryInfo.GetDirectories())
        {
            if (ExcludeFolderslArrayList.Contains(SubDirectoryInfo.FullName.ToLower()) == true)
            {
                WriteToLog("  Local folder excluded by ExcludeFolders.  Name=" +
                    SubDirectoryInfo.FullName, 2);
            }
            else
            {
                FolderName = SubDirectoryInfo.FullName;
                if (FolderName.EndsWith("\\") == false)
                {
                    FolderName += "\\";
                }

                WriteToLog("  Folder added to local list.  Name=" + FolderName, 2);
                LocalArrayList.Add(FolderName);

                PopulateLocalArrayList(BaseFolder, SubDirectoryInfo.FullName,
                    IncludeSubFolders,
                    ExcludeFolderslArrayList,
                    IncludeOnlyFilesRegularExpression,
                    ExcludeFilesRegularExpression,
                    ref LocalArrayList);
            }
        }
    }
    else
    {
        WriteToLog("  Sub folders excluded by IncludeSubFolders.", 2);
    }
}
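The recursive walk above can be sketched compactly in Python with `os.walk`. This is an approximate behavioral mirror, not a translation: folders get a trailing separator, excluded folder paths are compared case-insensitively, and the two regular expressions are applied with search semantics, as Regex.IsMatch does:

```python
import os
import re

def populate_local_list(base_folder, include_subfolders=True,
                        exclude_folders=(), include_only_re="", exclude_re=""):
    """List local files and folders the way PopulateLocalArrayList does."""
    results = []
    excluded = {f.lower() for f in exclude_folders}
    for root, dirs, files in os.walk(base_folder):
        if not include_subfolders:
            dirs[:] = []          # do not descend past the base folder
        # Prune excluded folders so os.walk never enters them.
        dirs[:] = [d for d in dirs
                   if os.path.join(root, d).lower() not in excluded]
        for d in dirs:
            results.append(os.path.join(root, d) + os.sep)
        for f in files:
            full = os.path.join(root, f)
            if exclude_re and re.search(exclude_re, full, re.IGNORECASE):
                continue
            if include_only_re and not re.search(include_only_re, full, re.IGNORECASE):
                continue
            results.append(full)
    return results
```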
    4. Next, the program creates a list of files that should be deleted on Amazon S3. This is done through the PopulateDeleteS3ArrayList function. It goes through each item in the list of files on S3 and checks whether it exists in the list of local items. If an S3 item does not exist locally, it is added to DeleteS3ArrayList.
private static void PopulateDeleteS3ArrayList(ref System.Collections.Hashtable S3HashTable,
    ref System.Collections.ArrayList LocalArrayList,
    String UploadPrefix, String S3FolderDelimiter, String SoureFolder,
    ref System.Collections.ArrayList DeleteS3ArrayList)
{
    //For each S3 item, check if it exists locally.
    String KeyName;
    String LocalPath;

    foreach (System.Collections.DictionaryEntry MyDictionaryEntry in S3HashTable)
    {
        KeyName = MyDictionaryEntry.Key.ToString();
        KeyName = KeyName.Substring(UploadPrefix.Length);
        LocalPath = System.IO.Path.Combine(SoureFolder,
            KeyName.Replace(S3FolderDelimiter, "\\"));

        if (LocalArrayList.Contains(LocalPath) == false)
        {
            DeleteS3ArrayList.Add(MyDictionaryEntry.Key);
        }
    }
}
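The core of this step is a set difference after mapping each S3 key back to the local path it would have. A small Python sketch of that mapping (function and parameter names here are illustrative, not from the article):

```python
import os

def compute_delete_list(s3_keys, local_paths, upload_prefix, s3_delimiter, source_folder):
    """Mirror PopulateDeleteS3ArrayList: map each S3 key to the local path
    it corresponds to, and collect keys whose local path no longer exists."""
    local_set = set(local_paths)
    to_delete = []
    for key in s3_keys:
        relative = key[len(upload_prefix):]
        local_path = os.path.join(source_folder, relative.replace(s3_delimiter, os.sep))
        if local_path not in local_set:
            to_delete.append(key)
    return to_delete
```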
    5. Next, the program finds which local files do not exist on S3. This is done through the PopulateUploadDictionary function. This function goes through the local ArrayList and checks if each item exists in the S3 HashTable. The item may exist both on S3 and locally, but the local file may have different content. To determine whether a file is the same on S3 and locally, the program has an option to compare files by ETag. An ETag is an identifier based on the content of a file; if the file changes, the ETag changes. Amazon stores the ETag of each file you upload, and when you list files on S3, the ETag for each file is returned. If you choose to compare by ETag, the program will calculate the ETag of the local file and check if it matches the ETag returned by Amazon. Any file that doesn't match is added to a Dictionary of items that need to be uploaded to S3.
private static void PopulateUploadDictionary(ref System.Collections.Hashtable S3HashTable,
    ref System.Collections.ArrayList LocalArrayList,
    String UploadPrefix, String S3FolderDelimiter,
    String SoureFolder, String CompareFilesBy,
    ref Dictionary<string, string> UploadDictionary)
{
    //Check which local items need to be uploaded to S3.
    String LocalPathAsKey;
    SprightlySoftAWS.S3.CalculateHash MyCalculateHash = new SprightlySoftAWS.S3.CalculateHash();
    String LocalETag;
    System.IO.FileInfo MyFileInfo;

    CompareFilesBy = CompareFilesBy.ToLower();

    foreach (String LocalPath in LocalArrayList)
    {
        LocalPathAsKey = LocalPath;
        LocalPathAsKey = LocalPathAsKey.Substring(SoureFolder.Length);
        LocalPathAsKey = LocalPathAsKey.Replace("\\", S3FolderDelimiter);
        LocalPathAsKey = System.IO.Path.Combine(UploadPrefix, LocalPathAsKey);

        if (S3HashTable.ContainsKey(LocalPathAsKey) == true)
        {
            //Only check files to see if the content is different.
            if (LocalPath.EndsWith("\\") == false)
            {
                //The local file exists on S3.  Check if the files are different.
                SprightlySoftAWS.S3.ListBucket.BucketItemObject MyBucketItemObject;
                MyBucketItemObject = S3HashTable[LocalPathAsKey]
                    as SprightlySoftAWS.S3.ListBucket.BucketItemObject;

                if (CompareFilesBy == "etag")
                {
                    LocalETag = MyCalculateHash.CalculateETagFromFile(LocalPath);

                    if (LocalETag == MyBucketItemObject.ETag.Replace("\"", ""))
                    {
                        //Files are the same.
                    }
                    else
                    {
                        UploadDictionary.Add(LocalPath, LocalETag);
                    }
                }
                else if (CompareFilesBy == "size")
                {
                    MyFileInfo = new System.IO.FileInfo(LocalPath);

                    if (MyFileInfo.Length == MyBucketItemObject.Size)
                    {
                        //Files are the same.
                    }
                    else
                    {
                        UploadDictionary.Add(LocalPath, "");
                    }
                }
                else
                {
                    //When comparing by name only, a matching key means the file
                    //already exists on S3.  No further check is needed here.
                }
            }
        }
        else
        {
            //The local file does not exist on S3; add it to the upload list.
            UploadDictionary.Add(LocalPath, "");
        }
    }
}
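The ETag comparison relies on the fact that, for objects uploaded in a single PUT, S3's ETag is the MD5 hex digest of the content (multipart uploads produce a different ETag form, which is one reason the article also offers a size comparison). A stdlib Python sketch of the comparison, where `s3_item` is an assumed dict with `ETag` and `Size` fields:

```python
import hashlib
import os

def local_etag(path, chunk_size=1 << 20):
    """MD5 hex digest of a file, computed incrementally to bound memory use."""
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    return md5.hexdigest()

def needs_upload(local_path, s3_item, compare_by="etag"):
    """Decide whether a local file differs from its S3 counterpart,
    mirroring the etag/size branches of PopulateUploadDictionary."""
    if compare_by == "etag":
        # S3 returns the ETag wrapped in quotes, so strip them before comparing.
        return local_etag(local_path) != s3_item["ETag"].strip('"')
    if compare_by == "size":
        return os.path.getsize(local_path) != s3_item["Size"]
    return False  # compare by name only: presence in both lists is enough
```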

Now we have a list of files that need to be deleted on S3 and a list of files that need to be uploaded to S3. The program has an option to list these changes, or go ahead and apply these changes to S3.

If we are deleting files on S3, the DeleteExtraOnS3 function is called. This function goes through the DeleteS3ArrayList and deletes each file in it. This is done by calling the MakeS3Request function. This function uses the SprightlySoft AWS component to send the appropriate command to S3. For more information about using the AWS component, see the documentation included with the source code and read the Amazon S3 API Reference documentation from Amazon. The function has the ability to retry the command if it fails.

private static void DeleteExtraOnS3(String AWSAccessKeyId,
    String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
    String BucketName, ref System.Collections.ArrayList DeleteS3ArrayList,
    int S3ErrorRetries)
{
    Boolean RetBool;
    int ErrorNumber = 0;
    String ErrorDescription = "";
    String LogData = "";
    int ResponseStatusCode = 0;
    String ResponseStatusDescription = "";
    Dictionary<string, string> ResponseHeaders = new Dictionary<string, string>();
    String ResponseString = "";
    int DeleteCount = 0;

    foreach (String DeleteItem in DeleteS3ArrayList)
    {
        //Send the correct parameters to the MakeS3Request function to delete
        //a file on S3.  This function will wait and retry if there is a 503 error.
        RetBool = MakeS3Request(AWSAccessKeyId, AWSSecretAccessKey,
            UseSSL, RequestEndpoint, BucketName, DeleteItem,
            "", "DELETE", null, "", S3ErrorRetries,
            ref ErrorNumber, ref ErrorDescription, ref LogData, ref ResponseStatusCode,
            ref ResponseStatusDescription, ref ResponseHeaders, ref ResponseString);

        if (RetBool == true)
        {
            WriteToLog("  Delete S3 file successful.  S3KeyName=" + DeleteItem);
            DeleteCount += 1;
        }
        else
        {
            WriteToLog("  Delete S3 file failed.  S3KeyName=" + DeleteItem +
                "  ErrorNumber=" + ErrorNumber +
                "  ErrorDescription=" + ErrorDescription +
                "  ResponseString=" + ResponseString);
            WriteToLog(LogData, 2);
            WriteToLog("  Canceling deletion of extra files on S3.");
            ExitCode = 1;
            break;
        }
    }

    WriteToLog("  Number of items deleted: " + DeleteCount);
}

private static Boolean MakeS3Request(String AWSAccessKeyId, String AWSSecretAccessKey,
    Boolean UseSSL, String RequestEndpoint, String BucketName,
    String KeyName, String QueryString, String RequestMethod,
    Dictionary<string, string> ExtraHeaders, String SendData, int RetryTimes,
    ref int ErrorNumber, ref String ErrorDescription, ref String LogData,
    ref int ResponseStatusCode, ref String ResponseStatusDescription,
    ref Dictionary<string, string> ResponseHeaders, ref String ResponseString)
{
    SprightlySoftAWS.REST MyREST = new SprightlySoftAWS.REST();
    String RequestURL;
    Dictionary<string, string> ExtraRequestHeaders;
    String AuthorizationValue;
    Boolean RetBool = true;
    LogData = "";

    for (int i = 0; i <= RetryTimes; i++)
    {
        RequestURL = MyREST.BuildS3RequestURL(UseSSL, RequestEndpoint,
            BucketName, KeyName, QueryString);

        ExtraRequestHeaders = new Dictionary<string, string>();
        if (ExtraHeaders != null)
        {
            foreach (KeyValuePair<string, string> MyKeyValuePair in ExtraHeaders)
            {
                ExtraRequestHeaders.Add(MyKeyValuePair.Key, MyKeyValuePair.Value);
            }
        }

        ExtraRequestHeaders.Add("x-amz-date", DateTime.UtcNow.ToString("r"));

        AuthorizationValue = MyREST.GetS3AuthorizationValue(RequestURL,
            RequestMethod, ExtraRequestHeaders, AWSAccessKeyId, AWSSecretAccessKey);
        ExtraRequestHeaders.Add("Authorization", AuthorizationValue);

        RetBool = MyREST.MakeRequest(RequestURL, RequestMethod,
            ExtraRequestHeaders, SendData);

        //Set the return values.
        ErrorNumber = MyREST.ErrorNumber;
        ErrorDescription = MyREST.ErrorDescription;
        LogData += MyREST.LogData;
        ResponseStatusCode = MyREST.ResponseStatusCode;
        ResponseStatusDescription = MyREST.ResponseStatusDescription;
        ResponseHeaders = MyREST.ResponseHeaders;
        ResponseString = MyREST.ResponseString;

        if (RetBool == true)
        {
            break;
        }
        else
        {
            if (MyREST.ResponseStatusCode == 503)
            {
                //A Service Unavailable response was returned.  Wait and retry.
                System.Threading.Thread.Sleep(1000 * i * i);
            }
            else if (MyREST.ErrorNumber == 1003)
            {
                //Getting the response failed.  This may be a network
                //disconnection.  Wait and retry.
                System.Threading.Thread.Sleep(1000 * i * i);
            }
            else
            {
                //An error occurred but retrying would not solve the problem.
                break;
            }
        }
    }

    return RetBool;
}
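The retry policy above can be summarized in a few lines: retry only on 503 (Service Unavailable) or a network failure, sleep quadratically longer before each retry, and give up immediately on errors a retry cannot fix. A Python sketch, where `make_request` is a hypothetical stand-in for the signed HTTP call returning `(ok, status_code, network_error)`:

```python
import time

def request_with_retries(make_request, retry_times, sleep=time.sleep):
    """Mirror MakeS3Request's retry loop: up to retry_times retries,
    waiting i*i seconds before attempt i (the article sleeps 1000*i*i ms)."""
    ok = False
    for i in range(retry_times + 1):
        ok, status_code, network_error = make_request()
        if ok:
            break
        if status_code == 503 or network_error:
            sleep(i * i)   # quadratic backoff before the next attempt
        else:
            break          # e.g. a 403 or 404 will not succeed on retry
    return ok
```

Injecting `sleep` as a parameter keeps the loop testable without real waiting.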

Finally, the program calls the UploadMissingToS3 function to upload files from UploadDictionary to S3. Here, the program sets header information such as Content-MD5, Content-Type, and metadata to store the local file's timestamp in S3. It then calls the UploadFileToS3 function, which is very similar to the MakeS3Request function. The difference is that UploadFileToS3 has parameters that are only relevant to uploading a file. The program uses a variable called MyUpload, which is a SprightlySoft AWS component object. This object raises an event whenever the progress of an upload changes. The program hooks into this progress event and shows the progress of the upload while it is taking place. This is done in the MyUpload_ProgressChangedEvent function.

private static void UploadMissingToS3(String AWSAccessKeyId,
    String AWSSecretAccessKey, Boolean UseSSL,
    String RequestEndpoint, String BucketName,
    String UploadPrefix, String S3FolderDelimiter,
    String SoureFolder, Dictionary<string, string> UserRequestHeaders,
    Dictionary<string, string> UserContentTypes,
    Boolean CalculateMD5ForUpload,
    ref Dictionary<string, string> UploadDictionary,
    Boolean SaveTimestampsInMetadata, Single UploadSpeedLimitKBps,
    Boolean ShowUploadProgress, int S3ErrorRetries)
{
    Boolean RetBool;
    int ErrorNumber = 0;
    String ErrorDescription = "";
    String LogData = "";
    int ResponseStatusCode = 0;
    String ResponseStatusDescription = "";
    Dictionary<string, string> ResponseHeaders = new Dictionary<string, string>();
    String ResponseString = "";

    Dictionary<string, string> ExtraHeaders = new Dictionary<string, string>();
    String DiffName;
    String LocalMD5Hash;
    String MyExtension;
    String KeyName;
    System.IO.FileInfo MyFileInfo;
    System.IO.DirectoryInfo MyDirectoryInfo;
    int UploadCount = 0;

    SprightlySoftAWS.S3.CalculateHash MyCalculateHash = new SprightlySoftAWS.S3.CalculateHash();
    SprightlySoftAWS.S3.Helper MyS3Helper = new SprightlySoftAWS.S3.Helper();

    //Get a dictionary of content types from the SprightlySoftAWS.S3.Helper class.
    Dictionary<string, string> ContentTypesDictionary;
    ContentTypesDictionary = MyS3Helper.GetContentTypesDictionary();

    foreach (KeyValuePair<string, string> UploadKeyValuePair in UploadDictionary)
    {
        DiffName = UploadKeyValuePair.Key.Substring(SoureFolder.Length,
            UploadKeyValuePair.Key.Length - SoureFolder.Length);
        DiffName = DiffName.Replace("\\", "/");
        KeyName = UploadPrefix + DiffName;

        if (UploadKeyValuePair.Key.EndsWith("\\") == true)
        {
            ExtraHeaders = new Dictionary<string, string>();
            if (SaveTimestampsInMetadata == true)
            {
                MyDirectoryInfo = new System.IO.DirectoryInfo(UploadKeyValuePair.Key);
                ExtraHeaders.Add("x-amz-meta-local-date-created",
                    MyDirectoryInfo.CreationTime.ToFileTimeUtc().ToString());
            }

            RetBool = MakeS3Request(AWSAccessKeyId, AWSSecretAccessKey,
                UseSSL, RequestEndpoint, BucketName, KeyName, "",
                "PUT", ExtraHeaders, "", S3ErrorRetries,
                ref ErrorNumber, ref ErrorDescription, ref LogData,
                ref ResponseStatusCode, ref ResponseStatusDescription,
                ref ResponseHeaders, ref ResponseString);

            if (RetBool == true)
            {
                WriteToLog("  Create S3 folder successful.  S3KeyName=" + KeyName);
                UploadCount += 1;
            }
            else
            {
                WriteToLog("  Create S3 folder failed.  S3KeyName=" + KeyName +
                    "  ErrorNumber=" + ErrorNumber +
                    "  ErrorDescription=" + ErrorDescription +
                    "  ResponseString=" + ResponseString);
                WriteToLog(LogData, 2);
                WriteToLog("  Canceling upload of missing files to S3.");
                ExitCode = 1;
                break;
            }
        }
        else
        {
            //Calculate the MD5 for the upload if required, set the content
            //type, and add extra headers.
            ExtraHeaders = new Dictionary<string, string>();

            if (CalculateMD5ForUpload == true)
            {
                if (UploadKeyValuePair.Value == "")
                {
                    //Calculate the MD5.
                    LocalMD5Hash = MyCalculateHash.CalculateMD5FromFile(UploadKeyValuePair.Key);
                }
                else
                {
                    //Convert the ETag to MD5.
                    LocalMD5Hash = MyS3Helper.ConvertETagToMD5(UploadKeyValuePair.Value);
                }

                ExtraHeaders.Add("Content-MD5", LocalMD5Hash);
            }

            MyExtension = System.IO.Path.GetExtension(UploadKeyValuePair.Key).ToLower();

            if (UserContentTypes.ContainsKey(MyExtension) == true)
            {
                ExtraHeaders.Add("Content-Type", UserContentTypes[MyExtension]);
            }
            else if (ContentTypesDictionary.ContainsKey(MyExtension) == true)
            {
                ExtraHeaders.Add("Content-Type", ContentTypesDictionary[MyExtension]);
            }

            if (SaveTimestampsInMetadata == true)
            {
                MyFileInfo = new System.IO.FileInfo(UploadKeyValuePair.Key);
                ExtraHeaders.Add("x-amz-meta-local-date-modified",
                    MyFileInfo.LastWriteTimeUtc.ToFileTimeUtc().ToString());
                ExtraHeaders.Add("x-amz-meta-local-date-created",
                    MyFileInfo.CreationTime.ToFileTimeUtc().ToString());
            }

            //Add the user supplied headers to the other headers that will be sent to S3.
            foreach (KeyValuePair<string, string> MyKeyValuePair in UserRequestHeaders)
            {
                ExtraHeaders.Add(MyKeyValuePair.Key, MyKeyValuePair.Value);
            }

            WriteToLog("  Uploading file.  S3KeyName=" + KeyName);

            RetBool = UploadFileToS3(AWSAccessKeyId, AWSSecretAccessKey, UseSSL,
                RequestEndpoint, BucketName, KeyName, "PUT",
                ExtraHeaders, UploadKeyValuePair.Key, UploadSpeedLimitKBps,
                S3ErrorRetries, ref ErrorNumber, ref ErrorDescription,
                ref LogData, ref ResponseStatusCode,
                ref ResponseStatusDescription, ref ResponseHeaders,
                ref ResponseString);

            if (RetBool == true)
            {
                WriteToLog("  Upload file to S3 was successful.");
                UploadCount += 1;
            }
            else
            {
                WriteToLog("  Upload file to S3 failed.  ErrorNumber=" + ErrorNumber +
                    "  ErrorDescription=" + ErrorDescription +
                    "  ResponseString=" + ResponseString);
                WriteToLog(LogData, 2);
                WriteToLog("  Canceling upload of missing files to S3.");
                ExitCode = 1;
                break;
            }
        }
    }

    WriteToLog("  Number of items uploaded: " + UploadCount);
}

private static Boolean UploadFileToS3(String AWSAccessKeyId,
    String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
    String BucketName, String KeyName, String RequestMethod,
    Dictionary<string, string> ExtraHeaders, String LocalFileName,
    Single UploadSpeedLimitKBps, int RetryTimes, ref int ErrorNumber,
    ref String ErrorDescription, ref String LogData,
    ref int ResponseStatusCode, ref String ResponseStatusDescription,
    ref Dictionary<string, string> ResponseHeaders, ref String ResponseString)
{
    String RequestURL;
    Dictionary<string, string> ExtraRequestHeaders;
    String AuthorizationValue;
    Boolean RetBool = true;
    LogData = "";

    if (UploadSpeedLimitKBps > 0)
    {
        MyUpload.LimitKBpsSpeed = UploadSpeedLimitKBps;
    }

    for (int i = 0; i <= RetryTimes; i++)
    {
        RequestURL = MyUpload.BuildS3RequestURL(UseSSL, RequestEndpoint,
            BucketName, KeyName, "");

        ExtraRequestHeaders = new Dictionary<string, string>();
        if (ExtraHeaders != null)
        {
            foreach (KeyValuePair<string, string> MyKeyValuePair in ExtraHeaders)
            {
                ExtraRequestHeaders.Add(MyKeyValuePair.Key, MyKeyValuePair.Value);
            }
        }

        ExtraRequestHeaders.Add("x-amz-date", DateTime.UtcNow.ToString("r"));

        AuthorizationValue = MyUpload.GetS3AuthorizationValue(RequestURL,
            RequestMethod, ExtraRequestHeaders, AWSAccessKeyId, AWSSecretAccessKey);
        ExtraRequestHeaders.Add("Authorization", AuthorizationValue);

        RetBool = MyUpload.UploadFile(RequestURL, RequestMethod,
            ExtraRequestHeaders, LocalFileName);

        //Set the return values.
        ErrorNumber = MyUpload.ErrorNumber;
        ErrorDescription = MyUpload.ErrorDescription;
        LogData += MyUpload.LogData;
        ResponseStatusCode = MyUpload.ResponseStatusCode;
        ResponseStatusDescription = MyUpload.ResponseStatusDescription;
        ResponseHeaders = MyUpload.ResponseHeaders;
        ResponseString = MyUpload.ResponseString;

        if (RetBool == true)
        {
            break;
        }
        else
        {
            if (MyUpload.ResponseStatusCode == 503)
            {
                //A Service Unavailable response was returned.  Wait and retry.
                System.Threading.Thread.Sleep(1000 * i * i);
            }
            else if (MyUpload.ErrorNumber == 1003)
            {
                //Getting the response failed.  This may be a network
                //disconnection.  Wait and retry.
                System.Threading.Thread.Sleep(1000 * i * i);
            }
            else
            {
                //An error occurred but retrying would not solve the problem.
                break;
            }
        }
    }

    return RetBool;
}
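The progress reporting hooked up in MyUpload_ProgressChangedEvent boils down to tracking bytes sent against the file's total size. A small Python sketch of that bookkeeping (the class name and API are illustrative, not the component's):

```python
class ProgressTracker:
    """Track upload progress: call update() as each chunk of bytes goes
    out and read the running percentage, as a progress event handler would."""

    def __init__(self, total_bytes):
        self.total_bytes = total_bytes
        self.sent_bytes = 0

    def update(self, bytes_sent):
        """Record another chunk and return the new percentage complete."""
        self.sent_bytes += bytes_sent
        return self.percent

    @property
    def percent(self):
        if self.total_bytes == 0:
            return 100.0  # an empty file is trivially complete
        return 100.0 * self.sent_bytes / self.total_bytes
```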

When the program completes, it has an option to send log information through an email. This is useful if you run the program as a scheduled task and you want to be notified if there is a failure.
