Sending large data with CSOM

As a follow-up to my recent post showing how to create or overwrite a document using CSOM, I’m now going to show how to upload a file containing a large amount of data.

Previously, when saving the binary data, I demonstrated the functionality with the following code:

public void SaveFile(Microsoft.SharePoint.Client.ClientContext context, string folderRelativeUrl, string relativeItemUrl, byte[] fileData)
{
    var fci = new FileCreationInformation
    {
        Url = relativeItemUrl,
        Content = fileData,
        Overwrite = true
    };

    Microsoft.SharePoint.Client.Folder folder = context.Web.GetFolderByServerRelativeUrl(folderRelativeUrl);
    Microsoft.SharePoint.Client.FileCollection files = folder.Files;
    Microsoft.SharePoint.Client.File file = files.Add(fci);

    context.Load(files);
    context.Load(file);
    context.ExecuteQuery();
}

The above code is, by default, restricted to transferring files of no more than 2,097,152 bytes (2MB) in size.

To get around this, we’ll set the content for the FileCreationInformation object using the ContentStream property instead of the Content property.

As the name suggests, this property accepts a Stream object instead of a byte array.

We’ll also add some logic to send the file data in buffered chunks, as there’s still a limit to how much the FileCreationInformation upload can handle in one request without error. For this, I’ve set a maximum file size of 4MB before the buffer logic kicks in.

The updated code looks like this (note that it also needs a using System.IO; directive for FileInfo, FileStream and MemoryStream):

public void SaveFile(Microsoft.SharePoint.Client.ClientContext context, string folderRelativeUrl, string relativeItemUrl, string filename)
{
	// Get the files object for the parent folder we want to upload a file to
	Microsoft.SharePoint.Client.Folder folder = context.Web.GetFolderByServerRelativeUrl(folderRelativeUrl);
	Microsoft.SharePoint.Client.FileCollection files = folder.Files;
	context.Load(files);
	context.ExecuteQuery();

	// 4MB file length that will trigger a buffered upload
	int bufferFileSize = 4 * 1024 * 1024;
	long fileLength = new FileInfo(filename).Length;

	if (fileLength <= bufferFileSize)
	{
		// No need to buffer the upload
		using (FileStream stream = new FileStream(filename, FileMode.Open))
		{
			var fci = new FileCreationInformation
			{
				Url = relativeItemUrl,
				ContentStream = stream,
				Overwrite = true
			};
			
			Microsoft.SharePoint.Client.File file = files.Add(fci);
			
			context.Load(file);
			context.ExecuteQuery();
		}
	}
	else
	{
		// 2MB buffer chunks
		int blockSize = 2 * 1024 * 1024;
		Microsoft.SharePoint.Client.File file = null;
		Guid uploadId = Guid.NewGuid();

		using (FileStream stream = new FileStream(filename, FileMode.Open))
		{
			byte[] buffer = new byte[blockSize];
			long offset = 0;
			long totalBytesRead = 0;
			int bytesRead;

			// Read data from file system in blocks.
			while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
			{
				ClientResult<long> totalBytesSent;
				totalBytesRead += bytesRead;
				
				// First set of data to upload
				if (file == null)
				{
					using (MemoryStream emptyStream = new MemoryStream())
					{
						var fci = new FileCreationInformation
						{
							Url = relativeItemUrl,
							ContentStream = emptyStream,
							Overwrite = true
						};

						// Create an empty file
						file = files.Add(fci);

						// Start the initial upload, sending only the bytes actually read
						using (MemoryStream bufferStream = new MemoryStream(buffer, 0, bytesRead))
						{
							totalBytesSent = file.StartUpload(uploadId, bufferStream);
							context.ExecuteQuery();
							offset = totalBytesSent.Value;
						}
					}
				}
				// Continue uploading this set of data
				else if (totalBytesRead < fileLength)
				{
					using (MemoryStream bufferStream = new MemoryStream(buffer, 0, bytesRead))
					{
						totalBytesSent = file.ContinueUpload(uploadId, offset, bufferStream);
						context.ExecuteQuery();
						offset = totalBytesSent.Value;
					}
				}
				// We've reached the last buffered chunk of data to upload
				else
				{
					// Create a final buffer trimmed to the number of bytes actually read
					var finalBuffer = new byte[bytesRead];
					Array.Copy(buffer, 0, finalBuffer, 0, bytesRead);

					// Complete the file upload
					using (MemoryStream bufferStream = new MemoryStream(finalBuffer))
					{
						file = file.FinishUpload(uploadId, offset, bufferStream);
						context.Load(file);
						context.ExecuteQuery();
					}
				}
			}
		}
	}
}

Depending on the file size, this will create the file either in a single request or via buffered streaming.
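To round things off, here’s a hypothetical call site for the method above. This is a minimal sketch only: the site URL, folder and file paths are placeholders, authentication is omitted, and Uploader is just an illustrative name for whatever class holds SaveFile.

```csharp
using System;
using Microsoft.SharePoint.Client;

class Program
{
    static void Main()
    {
        // Placeholder site URL - substitute your own. Authentication
        // (network credentials, claims, etc.) is omitted for brevity.
        using (var context = new ClientContext("https://server/sites/demo"))
        {
            var uploader = new Uploader(); // hypothetical class containing SaveFile

            // Uploads a local file into the Shared Documents library;
            // the buffered path kicks in automatically for files over 4MB.
            uploader.SaveFile(
                context,
                "/sites/demo/Shared Documents", // folderRelativeUrl
                "Report.pdf",                   // relativeItemUrl
                @"C:\Temp\Report.pdf");         // local file to upload
        }
    }
}
```

Since this needs a live SharePoint server and valid credentials to run, treat it as a shape-of-the-call illustration rather than something to paste in verbatim.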

Hope this helps!

This entry was posted in CSOM, SharePoint.
8 Comments
Ira

Wow!

This is the tip I have been searching for.

Thanks.

Christoph

Worked perfectly!!!! Thanks very much!

Andy M

ContentStream is new in SharePoint 2013 and does not exist in SharePoint 2010 🙁

Andy M

But still a useful technique, thanks!

nitu bansal

thanks a lot! great solution

Vikram

Perfect !!! Works like a gem !!

Thanks mate.

SYED IMAM

It’s failing above 260MB file size.

Keith

This still errors with “The remote server returned an error: (400) Bad Request.” using ContentStream instead of Content with a 6MB file.