When working with custom code that manipulates lists that contain a large number of fields, the ULS logs will likely be littered with the following entries:
A large block of literal text was sent to sql. This can result in blocking in sql and excessive memory use on the front end. Verify that no binary parameters are being passed as literals, and consider breaking up batches into smaller components. If this request is for a SharePoint list or list item, you may be able to resolve this by reducing the number of fields.
Slow Query Duration: [time in milliseconds]
Slow Query StackTrace-Managed: [complete stack trace]
Note: This one’s important because you can examine the stack trace and identify the exact area of code that initiated the slow query.
SqlCommand: [SQL statement responsible for the log entry]
As a follow-up to my recent post showing how to create or overwrite a document using the CSOM, I’m now going to show how to upload a file containing a large amount of data. To see the original post, click here.
Previously, when saving the binary data, I demonstrated the functionality with the following code:
public void SaveFile(Microsoft.SharePoint.Client.ClientContext context, string folderRelativeUrl, string relativeItemUrl, byte[] fileData)
{
    var fci = new FileCreationInformation
    {
        Url = relativeItemUrl,
        Content = fileData,
        Overwrite = true
    };
    Microsoft.SharePoint.Client.Folder folder = context.Web.GetFolderByServerRelativeUrl(folderRelativeUrl);
    Microsoft.SharePoint.Client.FileCollection files = folder.Files;
    Microsoft.SharePoint.Client.File file = files.Add(fci);
    context.ExecuteQuery();
}
The above code is, by default, restricted to transferring files that are no more than 2097152 bytes (2 MB) in size.
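One common way around this limit is to stream the content with the CSOM’s File.SaveBinaryDirect method, which uploads the bytes directly instead of embedding them as a literal inside the CSOM request body, so the 2 MB request size limit does not apply. The sketch below assumes an already-authenticated ClientContext; the wrapper name SaveLargeFile and the localPath parameter are my own illustration, not part of the CSOM API.

```csharp
using System.IO;
using Microsoft.SharePoint.Client;

public void SaveLargeFile(ClientContext context, string serverRelativeUrl, string localPath)
{
    // Open the local file as a stream so the content is never
    // materialized as a single literal in the request XML.
    using (Stream stream = System.IO.File.OpenRead(localPath))
    {
        // SaveBinaryDirect streams the file to the server-relative URL,
        // overwriting any existing file when the last argument is true.
        Microsoft.SharePoint.Client.File.SaveBinaryDirect(
            context, serverRelativeUrl, stream, true);
    }
}
```

Alternatively, a farm administrator can raise the limit server-side by setting SPWebService.ContentService.ClientRequestServiceSettings.MaxReceivedMessageSize, though streaming avoids the large-literal ULS warnings described above.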