Implementing Multithreaded, Asynchronous Automatic Upload of Local Files to Amazon S3
I recently put together a small tool that uses the AWS SDK to watch a local directory and automatically sync its files to S3. The uploads are multithreaded and asynchronous, and large files are split into parts.
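The article below syncs files via the SprightlySoft component; for the multithreaded, asynchronous, multipart upload mentioned above, the AWS SDK for .NET's TransferUtility handles all three. A minimal sketch, assuming the AWSSDK.S3 NuGet package is installed; the bucket name, key, file path, and region are placeholders:

```csharp
using System;
using System.Threading.Tasks;
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

class MultipartUploadSketch
{
    static async Task Main()
    {
        var client = new AmazonS3Client(RegionEndpoint.USEast1); // placeholder region

        // TransferUtility splits large files into parts and uploads
        // the parts on concurrent requests.
        var transfer = new TransferUtility(client, new TransferUtilityConfig
        {
            ConcurrentServiceRequests = 10 // number of parallel part uploads
        });

        await transfer.UploadAsync(new TransferUtilityUploadRequest
        {
            BucketName = "my-bucket",          // placeholder
            Key = "backup/bigfile.bin",        // placeholder
            FilePath = @"C:\Data\bigfile.bin", // placeholder
            PartSize = 16 * 1024 * 1024        // 16 MB parts
        });
    }
}
```

Credentials are resolved from the standard AWS credential chain (environment, profile, or instance role).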
References:
https://www.codeproject.com/Articles/131678/Amazon-S-Sync
https://aws.amazon.com/cn/documentation/s3/
Introduction
The SprightlySoft S3 Sync application allows you to take a folder on your computer and upload it to Amazon S3. You can make additions, deletions, and changes to your local files, and the next time you run the application, it will detect these changes and apply them to S3. This program allows you to create a mirror of a local folder on S3 and always keep it up to date.
Background
Amazon Simple Storage Service (Amazon S3) is a service that allows you to store files in Amazon's cloud computing environment. When your files are in Amazon's system, you can retrieve the files from anywhere on the web. You can also use Amazon's CloudFront service in conjunction with S3 to distribute your files to millions of people. Amazon S3 is the same highly scalable, reliable, secure, fast, inexpensive infrastructure that Amazon uses to run its own global network of web sites. Best of all, Amazon S3 is free for the first 5 GB of storage.
SprightlySoft S3 Sync uses the free SprightlySoft AWS component for .NET to interact with Amazon S3. In the code, you will see an example of uploading files to S3, deleting files from S3, listing files, and getting the properties of files.
Using the code
The SprightlySoft S3 Sync program works by listing your files on S3, listing your files locally, and comparing the differences between the two lists. Here is the logic of the program:
- The code starts by getting a list of all user settings. These include your Amazon AWS credentials, the S3 bucket you are syncing to, and the local folder you are syncing from.
- The next major call is the PopulateS3HashTable function. Here, all the files in your Amazon S3 bucket are listed. The code calls the ListBucket function from the SprightlySoft AWS component. This function returns an ArrayList of objects that represent each item in S3. Properties of each object include the S3 key name, size, date modified, and ETag. These objects are added to a Hashtable so they can be used later on.

private static Boolean PopulateS3HashTable(String AWSAccessKeyId,
String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
String BucketName, String UploadPrefix,
ref System.Collections.Hashtable S3HashTable)
{
Boolean RetBool;
SprightlySoftAWS.S3.ListBucket MyListBucket =
new SprightlySoftAWS.S3.ListBucket();
RetBool = MyListBucket.ListBucket(UseSSL, RequestEndpoint, BucketName,
"", UploadPrefix, AWSAccessKeyId, AWSSecretAccessKey);
if (RetBool == true)
{
foreach (SprightlySoftAWS.S3.ListBucket.BucketItemObject
MyBucketItemObject in MyListBucket.BucketItemsArrayList)
{
WriteToLog(" Item added to S3 list. KeyName=" +
MyBucketItemObject.KeyName, 2);
S3HashTable.Add(MyBucketItemObject.KeyName, MyBucketItemObject);
}
}
else
{
WriteToLog(" Error listing files on S3. ErrorDescription=" +
MyListBucket.ErrorDescription);
WriteToLog(MyListBucket.LogData, 3);
}
return RetBool;
}
- The next call is the PopulateLocalArrayList function. This function lists all local files and folders you want to synchronize. Each file and folder is added to an ArrayList so it can be used later on. The function has options to include only certain files or to exclude certain files. The function is recursive: it calls itself for each subfolder.

private static void PopulateLocalArrayList(String BaseFolder,
String CurrentFolder, Boolean IncludeSubFolders,
System.Collections.ArrayList ExcludeFolderslArrayList,
String IncludeOnlyFilesRegularExpression,
String ExcludeFilesRegularExpression,
ref System.Collections.ArrayList LocalArrayList)
{
System.IO.DirectoryInfo CurrentDirectoryInfo;
CurrentDirectoryInfo = new System.IO.DirectoryInfo(CurrentFolder);
String FolderName;
foreach (System.IO.FileInfo MyFileInfo in CurrentDirectoryInfo.GetFiles())
{
if (ExcludeFilesRegularExpression != "")
{
//check if the file is excluded
if (System.Text.RegularExpressions.Regex.IsMatch(
MyFileInfo.FullName, ExcludeFilesRegularExpression,
System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
{
WriteToLog(" Local file excluded by " +
"ExcludeFilesRegularExpression. Name=" +
MyFileInfo.FullName, 2);
}
else
{
if (IncludeOnlyFilesRegularExpression != "")
{
if (System.Text.RegularExpressions.Regex.IsMatch(
MyFileInfo.FullName, IncludeOnlyFilesRegularExpression,
System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
{
WriteToLog(" File added to local " +
"list. Name=" + MyFileInfo.FullName, 2);
LocalArrayList.Add(MyFileInfo.FullName);
}
else
{
WriteToLog(" Local file not included by " +
"IncludeOnlyFilesRegularExpression. Name=" +
MyFileInfo.FullName, 2);
}
}
else
{
WriteToLog(" File added to local list. Name=" +
MyFileInfo.FullName, 2);
LocalArrayList.Add(MyFileInfo.FullName);
}
}
}
else
{
if (IncludeOnlyFilesRegularExpression != "")
{
if (System.Text.RegularExpressions.Regex.IsMatch(
MyFileInfo.FullName, IncludeOnlyFilesRegularExpression,
System.Text.RegularExpressions.RegexOptions.IgnoreCase) == true)
{
WriteToLog(" File added to list. Name=" +
MyFileInfo.FullName, 2);
LocalArrayList.Add(MyFileInfo.FullName);
}
else
{
WriteToLog(" Local file not included by " +
"IncludeOnlyFilesRegularExpression. Name=" +
MyFileInfo.FullName, 2);
}
}
else
{
WriteToLog(" File added to local list. Name=" +
MyFileInfo.FullName, 2);
LocalArrayList.Add(MyFileInfo.FullName);
}
}
}
if (IncludeSubFolders == true)
{
foreach (System.IO.DirectoryInfo SubDirectoryInfo
in CurrentDirectoryInfo.GetDirectories())
{
if (ExcludeFolderslArrayList.Contains(
SubDirectoryInfo.FullName.ToLower()) == true)
{
WriteToLog(" Local folder excluded by " +
"ExcludeFolders. Name=" +
SubDirectoryInfo.FullName, 2);
}
else
{
FolderName = SubDirectoryInfo.FullName;
if (FolderName.EndsWith("\\") == false)
{
FolderName += "\\";
}
WriteToLog(" Folder added to local list. Name=" +
FolderName, 2);
LocalArrayList.Add(FolderName);
PopulateLocalArrayList(BaseFolder, SubDirectoryInfo.FullName,
IncludeSubFolders,
ExcludeFolderslArrayList,
IncludeOnlyFilesRegularExpression,
ExcludeFilesRegularExpression,
ref LocalArrayList);
}
}
}
else
{
WriteToLog(" Sub folders excluded by IncludeSubFolders.", 2);
}
}
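The IncludeOnlyFilesRegularExpression and ExcludeFilesRegularExpression options above are ordinary .NET regular expressions, matched case-insensitively against the full file path. A small sketch with a hypothetical exclude pattern:

```csharp
using System;
using System.Text.RegularExpressions;

class FilterSketch
{
    static void Main()
    {
        // Hypothetical pattern excluding .tmp and .log files, matched
        // case-insensitively as PopulateLocalArrayList does.
        string exclude = @"\.(tmp|log)$";

        Console.WriteLine(Regex.IsMatch(@"C:\Data\notes.TXT", exclude,
            RegexOptions.IgnoreCase)); // False: kept
        Console.WriteLine(Regex.IsMatch(@"C:\Data\debug.LOG", exclude,
            RegexOptions.IgnoreCase)); // True: excluded
    }
}
```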
- Next, the program creates a list of files that should be deleted on Amazon S3. This is done through the PopulateDeleteS3ArrayList function. This function goes through each item in the list of files on S3 and checks whether it exists in the list of local items. If the S3 item does not exist locally, the item is added to DeleteS3ArrayList.
private static void PopulateDeleteS3ArrayList(ref System.Collections.Hashtable
S3HashTable, ref System.Collections.ArrayList LocalArrayList,
String UploadPrefix, String S3FolderDelimiter, String SoureFolder,
ref System.Collections.ArrayList DeleteS3ArrayList)
{
//For each S3 item, check if it exists locally.
String KeyName;
String LocalPath;
foreach (System.Collections.DictionaryEntry MyDictionaryEntry in S3HashTable)
{
KeyName = MyDictionaryEntry.Key.ToString();
KeyName = KeyName.Substring(UploadPrefix.Length);
LocalPath = System.IO.Path.Combine(SoureFolder,
KeyName.Replace(S3FolderDelimiter, "\\"));
if (LocalArrayList.Contains(LocalPath) == false)
{
DeleteS3ArrayList.Add(MyDictionaryEntry.Key);
}
}
}
- Next, the program finds which local files do not exist on S3. This is done through the PopulateUploadDictionary function. This function goes through the local ArrayList and checks whether each item exists in the S3 Hashtable. An item may exist both on S3 and locally while the local file has different content. To determine whether a file is the same on S3 and locally, the program has an option to compare files by ETag. An ETag is an identifier based on the content of a file: if the file changes, the ETag changes. Amazon stores the ETag of each file you upload, and when you list files on S3, the ETag for each file is returned. If you choose to compare by ETag, the program calculates the ETag of the local file and checks whether it matches the ETag returned by Amazon. Any file that doesn't match is added to a Dictionary of items that need to be uploaded to S3.

private static void PopulateUploadDictionary(ref System.Collections.Hashtable
S3HashTable, ref System.Collections.ArrayList LocalArrayList,
String UploadPrefix, String S3FolderDelimiter,
String SoureFolder, String CompareFilesBy,
ref Dictionary<string, string> UploadDictionary)
{
//Check which local items need to be uploaded to S3.
String LocalPathAsKey;
SprightlySoftAWS.S3.CalculateHash MyCalculateHash =
new SprightlySoftAWS.S3.CalculateHash();
String LocalETag;
System.IO.FileInfo MyFileInfo;
CompareFilesBy = CompareFilesBy.ToLower();
foreach (String LocalPath in LocalArrayList)
{
LocalPathAsKey = LocalPath;
LocalPathAsKey = LocalPathAsKey.Substring(SoureFolder.Length);
LocalPathAsKey = LocalPathAsKey.Replace("\\", S3FolderDelimiter);
LocalPathAsKey = System.IO.Path.Combine(UploadPrefix, LocalPathAsKey);
if (S3HashTable.ContainsKey(LocalPathAsKey) == true)
{
//Only check files to see if the content is different.
if (LocalPath.EndsWith("\\") == false)
{
//The local file exists on S3. Check if the files are different.
SprightlySoftAWS.S3.ListBucket.BucketItemObject MyBucketItemObject;
MyBucketItemObject = S3HashTable[LocalPathAsKey]
as SprightlySoftAWS.S3.ListBucket.BucketItemObject;
if (CompareFilesBy == "etag")
{
LocalETag = MyCalculateHash.CalculateETagFromFile(LocalPath);
if (LocalETag == MyBucketItemObject.ETag.Replace("\"", ""))
{
//Files are the same.
}
else
{
UploadDictionary.Add(LocalPath, LocalETag);
}
}
else if (CompareFilesBy == "size")
{
MyFileInfo = new System.IO.FileInfo(LocalPath);
if (MyFileInfo.Length == MyBucketItemObject.Size)
{
//Files are the same.
}
else
{
UploadDictionary.Add(LocalPath, "");
}
}
else
{
//If the FileName is different the file
//will not exist on S3. No need to do a check here.
}
}
}
else
{
//The local file does not exist on S3, add it to the upload list.
UploadDictionary.Add(LocalPath, "");
}
}
}
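A note on the ETag comparison above: for objects uploaded in a single PUT, the S3 ETag is simply the hex MD5 of the object's bytes (multipart uploads use a different scheme, which is why the component provides CalculateETagFromFile). A minimal sketch of the single-part case:

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

class ETagSketch
{
    // Hex MD5 of a file; this matches the S3 ETag for objects
    // uploaded in a single PUT (not for multipart uploads).
    static string CalculateETag(string path)
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
        {
            byte[] hash = md5.ComputeHash(stream);
            return BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
        }
    }

    static void Main()
    {
        string tmp = Path.GetTempFileName(); // empty file
        // MD5 of zero bytes is a well-known constant.
        Console.WriteLine(CalculateETag(tmp)); // d41d8cd98f00b204e9800998ecf8427e
        File.Delete(tmp);
    }
}
```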
Now we have a list of files that need to be deleted on S3 and a list of files that need to be uploaded to S3. The program has an option to list these changes, or go ahead and apply these changes to S3.
If we are deleting files on S3, the DeleteExtraOnS3 function is called. This function goes through the DeleteS3ArrayList and deletes each file in it. This is done by calling the MakeS3Request function, which uses the SprightlySoft AWS component to send the appropriate command to S3. For more information about using the AWS component, see the documentation included with the source code and read Amazon's S3 API Reference. The function can retry the command if it fails.

private static void DeleteExtraOnS3(String AWSAccessKeyId,
String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
String BucketName, ref System.Collections.ArrayList DeleteS3ArrayList,
int S3ErrorRetries)
{
Boolean RetBool;
int ErrorNumber = 0;
String ErrorDescription = "";
String LogData = "";
int ResponseStatusCode = 0;
String ResponseStatusDescription = "";
Dictionary<string, string> ResponseHeaders = new Dictionary<string, string>();
String ResponseString = "";
int DeleteCount = 0;
foreach (String DeleteItem in DeleteS3ArrayList)
{
//Send the correct parameters to the MakeS3Request function
//to delete a file on S3. This function will wait
//and retry if there is a 503 error.
RetBool = MakeS3Request(AWSAccessKeyId, AWSSecretAccessKey,
UseSSL, RequestEndpoint, BucketName, DeleteItem,
"", "DELETE", null, "", S3ErrorRetries,
ref ErrorNumber, ref ErrorDescription, ref LogData, ref ResponseStatusCode,
ref ResponseStatusDescription, ref ResponseHeaders, ref ResponseString);
if (RetBool == true)
{
WriteToLog(" Delete S3 file successful. S3KeyName=" + DeleteItem);
DeleteCount += 1;
}
else
{
WriteToLog(" Delete S3 file failed. S3KeyName=" +
DeleteItem + " ErrorNumber=" + ErrorNumber +
" ErrorDescription=" + ErrorDescription +
" ResponseString=" + ResponseString);
WriteToLog(LogData, 2);
WriteToLog(" Canceling deletion of extra files on S3.");
ExitCode = 1;
break;
}
}
WriteToLog(" Number of items deleted: " + DeleteCount);
}

private static Boolean MakeS3Request(String AWSAccessKeyId, String AWSSecretAccessKey,
Boolean UseSSL, String RequestEndpoint, String BucketName,
String KeyName, String QueryString, String RequestMethod,
Dictionary<string, string> ExtraHeaders, String SendData, int RetryTimes,
ref int ErrorNumber, ref String ErrorDescription, ref String LogData,
ref int ResponseStatusCode, ref String ResponseStatusDescription,
ref Dictionary<string, string> ResponseHeaders, ref String ResponseString)
{
SprightlySoftAWS.REST MyREST = new SprightlySoftAWS.REST();
String RequestURL;
Dictionary<string, string> ExtraRequestHeaders;
String AuthorizationValue;
Boolean RetBool = true;
LogData = "";
for (int i = 0; i <= RetryTimes; i++)
{
RequestURL = MyREST.BuildS3RequestURL(UseSSL, RequestEndpoint,
BucketName, KeyName, QueryString);
ExtraRequestHeaders = new Dictionary<string, string>();
if (ExtraHeaders != null)
{
foreach (KeyValuePair<string, string> MyKeyValuePair in ExtraHeaders)
{
ExtraRequestHeaders.Add(MyKeyValuePair.Key,
MyKeyValuePair.Value);
}
}
ExtraRequestHeaders.Add("x-amz-date", DateTime.UtcNow.ToString("r"));
AuthorizationValue = MyREST.GetS3AuthorizationValue(RequestURL,
RequestMethod, ExtraRequestHeaders, AWSAccessKeyId, AWSSecretAccessKey);
ExtraRequestHeaders.Add("Authorization", AuthorizationValue);
RetBool = MyREST.MakeRequest(RequestURL, RequestMethod,
ExtraRequestHeaders, SendData);
//Set the return values.
ErrorNumber = MyREST.ErrorNumber;
ErrorDescription = MyREST.ErrorDescription;
LogData += MyREST.LogData;
ResponseStatusCode = MyREST.ResponseStatusCode;
ResponseStatusDescription = MyREST.ResponseStatusDescription;
ResponseHeaders = MyREST.ResponseHeaders;
ResponseString = MyREST.ResponseString;
if (RetBool == true)
{
break;
}
else
{
if (MyREST.ResponseStatusCode == 503)
{
//A Service Unavailable response was returned. Wait and retry.
System.Threading.Thread.Sleep(1000 * i * i);
}
else if (MyREST.ErrorNumber == 1003)
{
//Getting the response failed.
//This may be a network disconnection. Wait and retry.
System.Threading.Thread.Sleep(1000 * i * i);
}
else
{
//An error occurred but retrying would not solve the problem.
break;
}
}
}
return RetBool;
}
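The Thread.Sleep(1000 * i * i) calls above implement a simple quadratic backoff: attempt i waits i² seconds before the next try, so the first retry (i = 0) happens immediately. The delays work out as follows:

```csharp
using System;

class BackoffSketch
{
    static void Main()
    {
        // Delay in milliseconds before each retry, mirroring the
        // 1000 * i * i sleep in MakeS3Request (i is the zero-based
        // attempt index).
        for (int i = 0; i <= 3; i++)
        {
            Console.WriteLine(1000 * i * i); // 0, 1000, 4000, 9000
        }
    }
}
```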
Finally, the program calls the UploadMissingToS3 function to upload files from UploadDictionary to S3. Here, the program sets header information such as Content-MD5, Content-Type, and metadata to store the local file's timestamps in S3. It then calls the UploadFileToS3 function, which is very similar to the MakeS3Request function; the difference is that UploadFileToS3 has parameters that are only relevant to uploading a file. The program uses a variable called MyUpload, which is a SprightlySoft AWS component object. This object raises an event whenever the progress of an upload changes. The program hooks into this progress event and shows the progress of the upload while it is taking place. This is done in the MyUpload_ProgressChangedEvent function.

private static void UploadMissingToS3(String AWSAccessKeyId,
String AWSSecretAccessKey, Boolean UseSSL,
String RequestEndpoint, String BucketName,
String UploadPrefix, String S3FolderDelimiter,
String SoureFolder, Dictionary<string, string> UserRequestHeaders,
Dictionary<string, string> UserContentTypes,
Boolean CalculateMD5ForUpload,
ref Dictionary<string, string> UploadDictionary,
Boolean SaveTimestampsInMetadata, Single UploadSpeedLimitKBps,
Boolean ShowUploadProgress, int S3ErrorRetries)
{
Boolean RetBool;
int ErrorNumber = 0;
String ErrorDescription = "";
String LogData = "";
int ResponseStatusCode = 0;
String ResponseStatusDescription = "";
Dictionary<string, string> ResponseHeaders = new Dictionary<string, string>();
String ResponseString = "";
Dictionary<string, string> ExtraHeaders = new Dictionary<string, string>();
String DiffName;
String LocalMD5Hash;
String MyExtension;
String KeyName;
System.IO.FileInfo MyFileInfo;
System.IO.DirectoryInfo MyDirectoryInfo;
int UploadCount = 0;
SprightlySoftAWS.S3.CalculateHash MyCalculateHash =
new SprightlySoftAWS.S3.CalculateHash();
SprightlySoftAWS.S3.Helper MyS3Helper = new SprightlySoftAWS.S3.Helper();
//Get a dictionary of content types from the SprightlySoftAWS.S3.Helper class.
Dictionary<string, string> ContentTypesDictionary;
ContentTypesDictionary = MyS3Helper.GetContentTypesDictionary();
foreach (KeyValuePair<string, string> UploadKeyValuePair in UploadDictionary)
{
DiffName = UploadKeyValuePair.Key.Substring(SoureFolder.Length,
UploadKeyValuePair.Key.Length - SoureFolder.Length);
DiffName = DiffName.Replace("\\", "/");
KeyName = UploadPrefix + DiffName;
if (UploadKeyValuePair.Key.EndsWith("\\") == true)
{
ExtraHeaders = new Dictionary<string, string>();
if (SaveTimestampsInMetadata == true)
{
MyDirectoryInfo = new System.IO.DirectoryInfo(UploadKeyValuePair.Key);
ExtraHeaders.Add("x-amz-meta-local-date-created",
MyDirectoryInfo.CreationTime.ToFileTimeUtc().ToString());
}
RetBool = MakeS3Request(AWSAccessKeyId, AWSSecretAccessKey,
UseSSL, RequestEndpoint, BucketName, KeyName, "",
"PUT", ExtraHeaders, "", S3ErrorRetries,
ref ErrorNumber, ref ErrorDescription, ref LogData,
ref ResponseStatusCode, ref ResponseStatusDescription,
ref ResponseHeaders, ref ResponseString);
if (RetBool == true)
{
WriteToLog(" Create S3 folder successful. S3KeyName=" + KeyName);
UploadCount += 1;
}
else
{
WriteToLog(" Create S3 folder failed. S3KeyName=" +
KeyName + " ErrorNumber=" + ErrorNumber +
" ErrorDescription=" + ErrorDescription +
" ResponseString=" + ResponseString);
WriteToLog(LogData, 2);
WriteToLog(" Canceling upload of missing files to S3.");
ExitCode = 1;
break;
}
}
else
{
//Calculate the MD5 for upload if required.
//Set the content type.
//Add extra headers.
ExtraHeaders = new Dictionary<string, string>();
if (CalculateMD5ForUpload == true)
{
if (UploadKeyValuePair.Value == "")
{
//Calculate the MD5.
LocalMD5Hash =
MyCalculateHash.CalculateMD5FromFile(UploadKeyValuePair.Key);
}
else
{
//Convert the ETag to MD5.
LocalMD5Hash = MyS3Helper.ConvertETagToMD5(UploadKeyValuePair.Value);
}
ExtraHeaders.Add("Content-MD5", LocalMD5Hash);
}
MyExtension = System.IO.Path.GetExtension(UploadKeyValuePair.Key).ToLower();
if (UserContentTypes.ContainsKey(MyExtension) == true)
{
ExtraHeaders.Add("Content-Type", UserContentTypes[MyExtension]);
}
else if (ContentTypesDictionary.ContainsKey(MyExtension) == true)
{
ExtraHeaders.Add("Content-Type",
ContentTypesDictionary[MyExtension]);
}
if (SaveTimestampsInMetadata == true)
{
MyFileInfo = new System.IO.FileInfo(UploadKeyValuePair.Key);
ExtraHeaders.Add("x-amz-meta-local-date-modified",
MyFileInfo.LastWriteTimeUtc.ToFileTimeUtc().ToString());
ExtraHeaders.Add("x-amz-meta-local-date-created",
MyFileInfo.CreationTime.ToFileTimeUtc().ToString());
}
//Add the user supplied headers to the other headers that will be sent to S3.
foreach (KeyValuePair<string, string> MyKeyValuePair in UserRequestHeaders)
{
ExtraHeaders.Add(MyKeyValuePair.Key, MyKeyValuePair.Value);
}
WriteToLog(" Uploading file. S3KeyName=" + KeyName);
RetBool = UploadFileToS3(AWSAccessKeyId, AWSSecretAccessKey, UseSSL,
RequestEndpoint, BucketName, KeyName, "PUT",
ExtraHeaders, UploadKeyValuePair.Key, UploadSpeedLimitKBps,
S3ErrorRetries, ref ErrorNumber, ref ErrorDescription,
ref LogData, ref ResponseStatusCode,
ref ResponseStatusDescription, ref ResponseHeaders,
ref ResponseString);
if (RetBool == true)
{
WriteToLog(" Upload file to S3 was successful.");
UploadCount += 1;
}
else
{
WriteToLog(" Upload file to S3 failed. ErrorNumber=" +
ErrorNumber + " ErrorDescription=" +
ErrorDescription + " ResponseString=" +
ResponseString);
WriteToLog(LogData, 2);
WriteToLog(" Canceling upload of missing files to S3.");
ExitCode = 1;
break;
}
}
}
WriteToLog(" Number of items uploaded: " + UploadCount);
}

private static Boolean UploadFileToS3(String AWSAccessKeyId,
String AWSSecretAccessKey, Boolean UseSSL, String RequestEndpoint,
String BucketName, String KeyName, String RequestMethod,
Dictionary<string, string> ExtraHeaders, String LocalFileName,
Single UploadSpeedLimitKBps, int RetryTimes, ref int ErrorNumber,
ref String ErrorDescription, ref String LogData,
ref int ResponseStatusCode, ref String ResponseStatusDescription,
ref Dictionary<string, string> ResponseHeaders, ref String ResponseString)
{
String RequestURL;
Dictionary<string, string> ExtraRequestHeaders;
String AuthorizationValue;
Boolean RetBool = true;
LogData = "";
if (UploadSpeedLimitKBps > 0)
{
MyUpload.LimitKBpsSpeed = UploadSpeedLimitKBps;
}
for (int i = 0; i <= RetryTimes; i++)
{
RequestURL = MyUpload.BuildS3RequestURL(UseSSL, RequestEndpoint,
BucketName, KeyName, "");
ExtraRequestHeaders = new Dictionary<string, string>();
if (ExtraHeaders != null)
{
foreach (KeyValuePair<string, string> MyKeyValuePair in ExtraHeaders)
{
ExtraRequestHeaders.Add(MyKeyValuePair.Key,
MyKeyValuePair.Value);
}
}
ExtraRequestHeaders.Add("x-amz-date", DateTime.UtcNow.ToString("r"));
AuthorizationValue = MyUpload.GetS3AuthorizationValue(RequestURL,
RequestMethod, ExtraRequestHeaders,
AWSAccessKeyId, AWSSecretAccessKey);
ExtraRequestHeaders.Add("Authorization", AuthorizationValue);
RetBool = MyUpload.UploadFile(RequestURL, RequestMethod,
ExtraRequestHeaders, LocalFileName);
//Set the return values.
ErrorNumber = MyUpload.ErrorNumber;
ErrorDescription = MyUpload.ErrorDescription;
LogData += MyUpload.LogData;
ResponseStatusCode = MyUpload.ResponseStatusCode;
ResponseStatusDescription = MyUpload.ResponseStatusDescription;
ResponseHeaders = MyUpload.ResponseHeaders;
ResponseString = MyUpload.ResponseString;
if (RetBool == true)
{
break;
}
else
{
if (MyUpload.ResponseStatusCode == 503)
{
//A Service Unavailable response was returned. Wait and retry.
System.Threading.Thread.Sleep(1000 * i * i);
}
else if (MyUpload.ErrorNumber == 1003)
{
//Getting the response failed.
//This may be a network disconnection. Wait and retry.
System.Threading.Thread.Sleep(1000 * i * i);
}
else
{
//An error occurred but retrying would not solve the problem.
break;
}
}
}
return RetBool;
}
When the program completes, it has an option to send log information by email. This is useful if you run the program as a scheduled task and want to be notified of failures.
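The email option is not shown in the code excerpts above; a minimal sketch using System.Net.Mail is below. The addresses, subject text, and SMTP host are placeholders, not the program's actual settings:

```csharp
using System;
using System.Net.Mail;

class LogMailSketch
{
    // Build the notification message; sender, recipient, and subject
    // are hypothetical -- the article's own mail settings are not shown.
    static MailMessage BuildLogMessage(string logText, int exitCode)
    {
        var message = new MailMessage(
            "s3sync@example.com", "admin@example.com");
        message.Subject = exitCode == 0
            ? "S3 Sync completed"
            : "S3 Sync FAILED";
        message.Body = logText;
        return message;
    }

    static void Main()
    {
        MailMessage msg = BuildLogMessage("3 files uploaded.", 1);
        Console.WriteLine(msg.Subject); // S3 Sync FAILED
        // To actually send it (placeholder host):
        // using (var client = new SmtpClient("smtp.example.com"))
        //     client.Send(msg);
    }
}
```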