Wednesday, March 21, 2018

Optimising bulk inserts with Entity Framework 6


In Entity Framework 6+ the database context implements the unit of work pattern so that any changes you make to the model are not persisted until you call the SaveChanges() method.
This is the most obvious way of implementing any bulk update. You just push the changes into the context and persist them in a single operation as shown below:
using (var context = new SqlDbContext())
{
    // Queue up every book in the context...
    foreach (var book in bookList)
    {
        context.Books.Add(book);
    }

    // ...then persist them all in a single SaveChanges() call.
    context.SaveChanges();
}
The problem with this technique is that it is slow. Very slow. Record sets of 10k or more are measured in minutes rather than seconds.

Understand what’s happening under the hood

Before you can optimise any data access you should have some insight into the SQL statements the entity framework is attempting to execute against the database. This is easier to do with version 6 of the entity framework as you can get the context to output all the database calls it makes with a single line of code:
context.Database.Log = Console.Write;
No matter what optimisations you make to the entity framework, the SQL that goes to the database server is always the same: each record is added by a separate insert statement, each requiring its own round-trip between client and server.
You can make significant gains by changing entity framework settings, but these changes optimise the internal workings of the entity framework itself rather than the way it calls a remote database. You cannot do anything about the basic bottleneck of inserts going in row by row.

Switching off change detection

For large data volumes the biggest single saving you can make is by turning off the automated change detection on the database context.
This is the entity framework’s internal process for checking the current property values of an entity against the original values. It gets triggered for any method that updates data and gets particularly expensive if the context is tracking a large number of entities.
Note that if you plan to re-use the context for any other operation you should consider re-enabling change detection after the bulk insert. This can be done through a try/finally block as shown below:
try
{
    // Disable change detection for the duration of the bulk insert.
    context.Configuration.AutoDetectChangesEnabled = false;

    foreach (var book in bookList)
    {
        context.Books.Add(book);
    }

    context.SaveChanges();
}
finally
{
    // Restore change detection so the context behaves normally afterwards.
    context.Configuration.AutoDetectChangesEnabled = true;
}
You can also consider switching off model validation so the entity framework won’t validate each entity as it is submitted to the database. This provides a much smaller performance gain that is unlikely to be significant unless you are working with thousands of rows of data.
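If you do want to try it, a minimal sketch (assuming the same SqlDbContext as above) is to switch off both settings together before adding the entities:
context.Configuration.AutoDetectChangesEnabled = false;
context.Configuration.ValidateOnSaveEnabled = false;  // skip per-entity validation during SaveChanges()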

Batching and resetting the context

As you insert records the context graph tends to get larger, more complex and potentially much slower. Some benchmarks have suggested that by disposing and re-creating the context after a certain number of records you can improve overall performance.
// A threshold of around 1,000 records per batch is a reasonable starting point.
const int threshold = 1000;
var counter = 0;

var context = new SqlDbContext();
context.Configuration.AutoDetectChangesEnabled = false;

foreach (var book in bookList)
{
    context.Books.Add(book);
    counter++;

    if (counter >= threshold)
    {
        // Persist the batch, then replace the context to keep the
        // change-tracking graph small.
        context.SaveChanges();
        context.Dispose();
        context = new SqlDbContext();
        context.Configuration.AutoDetectChangesEnabled = false;
        counter = 0;
    }
}

// Persist any remaining records and clean up.
context.SaveChanges();
context.Dispose();
The potential gains here are not guaranteed and they depend very much on the circumstances. The latency between your client and the server is particularly significant as no amount of tweaking is going to overcome an IO-bound batch operation. If you are using a simple object model and connecting to a remote SQL database hosted in Azure then resetting the context regularly is unlikely to have much impact.

The optimal approach? Don’t use the entity framework!

There’s only so much to be gained by optimising the entity framework for bulk inserts. It’s just not designed for this kind of scenario. When you are inserting data in bulk you want to push it to the server in a single operation without any chatty feedback or change tracking.
This is why for large bulk inserts it’s best to step outside the confines of the entity framework. For SQL Server the SqlBulkCopy class was designed specifically for large, bulk insert scenarios. You’ll normally find that most solutions designed to extend the entity framework with dedicated bulk methods are using SqlBulkCopy in the background.
It’s cumbersome to work with and does not produce the most elegant code, but it will upload data in a fraction of the time that can be achieved by optimised entity framework code.
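The sketch below shows the general shape of a SqlBulkCopy insert. The connection string, table name ("dbo.Books") and column names are assumptions for illustration only and would need to match your own schema:
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static void BulkInsertBooks(string connectionString, IEnumerable<Book> bookList)
{
    // Build an in-memory table whose columns match the target table.
    var table = new DataTable();
    table.Columns.Add("Title", typeof(string));
    table.Columns.Add("Isbn", typeof(string));

    foreach (var book in bookList)
    {
        table.Rows.Add(book.Title, book.Isbn);
    }

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();

        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.Books";
            bulkCopy.BatchSize = 5000;  // rows sent to the server per batch
            bulkCopy.ColumnMappings.Add("Title", "Title");
            bulkCopy.ColumnMappings.Add("Isbn", "Isbn");

            // Push the whole set to the server in one bulk operation.
            bulkCopy.WriteToServer(table);
        }
    }
}
Note that this bypasses the model entirely, so none of the entity framework's validation or change tracking applies to rows inserted this way.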

Hope this helps!
Oumaima

Bundling and Minification


Bundling and minification are two techniques you can use in ASP.NET 4.5 to improve request load time. Bundling and minification improves load time by reducing the number of requests to the server and reducing the size of requested assets (such as CSS and JavaScript.)


Most of the current major browsers limit the number of simultaneous connections per hostname to six. That means that while six requests are being processed, additional requests for assets on a host will be queued by the browser. In the image below, the IE F12 developer tools network tab shows the timing for assets required by the About view of a sample application.
The gray bars show the time the request is queued by the browser waiting on the six connection limit. The yellow bar is the request time to first byte, that is, the time taken to send the request and receive the first response from the server. The blue bars show the time taken to receive the response data from the server. You can double-click on an asset to get detailed timing information. For example, the following image shows the timing details for loading the /Scripts/MyScripts/JavaScript6.js file.
The preceding image shows the Start event, which gives the time the request was queued because the browser limits the number of simultaneous connections. In this case, the request was queued for 46 milliseconds waiting for another request to complete.

Bundling

Bundling is a new feature in ASP.NET 4.5 that makes it easy to combine or bundle multiple files into a single file. You can create CSS, JavaScript and other bundles. Fewer files means fewer HTTP requests and that can improve first page load performance.
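As an illustration, bundles are typically registered in a BundleConfig class at application start-up; the bundle names and file paths below are examples only, not those of the sample application:
C#
using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Combine several script files into one bundle served from a single URL.
        bundles.Add(new ScriptBundle("~/bundles/myscripts").Include(
            "~/Scripts/MyScripts/JavaScript1.js",
            "~/Scripts/MyScripts/JavaScript6.js"));

        // CSS files can be bundled in the same way.
        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/site.css"));
    }
}
The bundles are then referenced from a view or layout page with Scripts.Render("~/bundles/myscripts") and Styles.Render("~/Content/css"), which emit a single request per bundle when optimizations are enabled.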
The following image shows the same timing view of the About view shown previously, but this time with bundling and minification enabled.

Minification

Minification performs a variety of different code optimizations to scripts or CSS, such as removing unnecessary white space and comments and shortening variable names to one character. Consider the following JavaScript function.
JavaScript
AddAltToImg = function (imageTagAndImageID, imageContext) {
    ///<signature>
    ///<summary> Adds an alt tab to the image
    // </summary>
    //<param name="imgElement" type="String">The image selector.</param>
    //<param name="ContextForImage" type="String">The image context.</param>
    ///</signature>
    var imageElement = $(imageTagAndImageID, imageContext);
    imageElement.attr('alt', imageElement.attr('id').replace(/ID/, ''));
}
After minification, the function is reduced to the following:
JavaScript
AddAltToImg = function (n, t) { var i = $(n, t); i.attr("alt", i.attr("id").replace(/ID/, "")) }
In addition to removing the comments and unnecessary whitespace, the following parameters and variable names were renamed (shortened) as follows:
Original                Renamed
imageTagAndImageID      n
imageContext            t
imageElement            i

Impact of Bundling and Minification

The following table shows several important differences between listing all the assets individually and using bundling and minification (B/M) in the sample program.

                    Using B/M    Without B/M    Change
File Requests       9            34             256%
KB Sent             3.26         11.92          266%
KB Received         388.51       530            36%
Load Time           510 MS       780 MS         53%
The bytes sent had a significant reduction with bundling as browsers are fairly verbose with the HTTP headers they apply on requests. The received bytes reduction is not as large because the largest files (Scripts\jquery-ui-1.8.11.min.js and Scripts\jquery-1.7.1.min.js) are already minified. Note: The timings on the sample program used the Fiddler tool to simulate a slow network. (From the Fiddler Rules menu, select Performance then Simulate Modem Speeds.)

Debugging Bundled and Minified JavaScript

It's easy to debug your JavaScript in a development environment (where the compilation Element in the Web.config file is set to debug="true") because the JavaScript files are not bundled or minified. You can also debug a release build where your JavaScript files are bundled and minified. Using the IE F12 developer tools, you can debug a JavaScript function included in a minified bundle using the following approach:
  1. Select the Script tab and then select the Start debugging button.
  2. Select the bundle containing the JavaScript function you want to debug using the assets button.
  3. Format the minified JavaScript by selecting the Configuration button, and then selecting Format JavaScript.
  4. In the Search Script input box, select the name of the function you want to debug. In the following image, AddAltToImg was entered in the Search Script input box.
Bundling and minification is enabled or disabled by setting the value of the debug attribute in the compilation Element in the Web.config file. In the following XML, debug is set to true so bundling and minification is disabled.
XML
<system.web>
    <compilation debug="true" />
    <!-- Lines removed for clarity. -->
</system.web>
To enable bundling and minification, set the debug value to "false". You can override the Web.config setting with the EnableOptimizations property on the BundleTable class. The following code enables bundling and minification and overrides any setting in the Web.config file.
C#
public static void RegisterBundles(BundleCollection bundles)
{
    bundles.Add(new ScriptBundle("~/bundles/jquery").Include(
                 "~/Scripts/jquery-{version}.js"));

    // Code removed for clarity.
    BundleTable.EnableOptimizations = true;
}
Note
Unless EnableOptimizations is true or the debug attribute in the compilation Element in the Web.config file is set to false, files will not be bundled or minified. Additionally, the .min version of files will not be used; the full debug versions will be selected instead. EnableOptimizations overrides the debug attribute in the compilation Element in the Web.config file.


Tuesday, May 30, 2017

Sharepoint Designer 2013, XSLT List View Options ribbon option is not showing


I have an ordinary Wiki Page (I also tried making an Article Page).
Once the page is made, I add an XsltListViewWebPart that points to a list. This works fine and is added by editing the page in the SharePoint web site.
When I open the page in SharePoint Designer 2013 and select the XsltListViewWebPart, the ribbon does not contain the List View Options section for me to edit the filters, parameters, etc.
What I do notice is that when I add the XsltListViewWebPart from inside SharePoint Designer 2013, I see the ribbon section fine. However, once I save changes on the page, close it and re-open it, I have the same issue and can't see that ribbon section.

The solution:
Edit the web part, expand the "Miscellaneous" section and tick the Server Render checkbox.
You will then be able to see the List View Options section in SharePoint Designer.
Hope it helps!

Monday, April 17, 2017

List view error: Attempted to use an object that has ceased to exist



Error Scenario:
In SharePoint, when navigating to a list view page (ex: 'All Items' view), or to a site page (like home page) that contains a list view web part, the page crashes, throwing the following error:

Server Error in '/' Application.

Attempted to use an object that has ceased to exist. (Exception from HRESULT: 0x80030102 (STG_E_REVERTED)) 


Root Cause:
The error is thrown because of resource throttling on the list. The GroupBy in the view is counting and folding too many items, so SharePoint prevents SQL Server from processing the view query. Checking the number of items on the list settings page showed the list had 18,670 items, way above the default list view threshold of 5,000.


Solution:
The fix is to raise the resource throttling limit for the target web application. In this case I set it to 20,000.

From Central Administration site > Manage Web Applications > select target web app > General Settings > Resource Throttling > change List View Threshold > OK
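If you prefer script over Central Administration, the same limit can be changed through the server object model. The sketch below is an illustrative example only (the web application URL is a placeholder) and must run on a farm server with appropriate permissions:
using System;
using Microsoft.SharePoint.Administration;

// Raise the list view threshold for a single web application.
var webApp = SPWebApplication.Lookup(new Uri("http://yourwebapp"));
webApp.MaxItemsPerThrottledOperation = 20000;
webApp.Update();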

Note: there is a reason why Microsoft set the default threshold to 5,000: it avoids a performance hit on SQL Server from queries over too many items. When you have very large lists or libraries, think about alternative approaches to divide the data, such as splitting it into multiple lists or, for document libraries, using folders.


SQL Deadlocks and the Project Server Queues


You may see SQL deadlocks and error messages like the following on a busy server.
System.Data.SqlClient.SqlError: Transaction (Process ID 84) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
You will also see error id 7747 in the application event log.
This can be an issue on systems that are under stress, and in all cases I have seen it relates to the process that selects queue jobs for processing. It does not break anything as such and no data is lost, but processing of queue jobs is delayed (though as the system is very busy they probably wouldn't have processed quickly anyway!).
Deadlocks occur when two transactions interact in such a way that one requires a resource that the other has locked, and vice versa. Because neither task can continue until a resource is available and neither resource can be released until a task continues, a deadlock state exists. SQL Server selects one of the transactions as the victim and ends it – and posts the above error.  See the SQL Server Books Online for more details.
In Project Server 2007 you can monitor activity using perfmon, and the counters include SQL retries per minute for both the Project and Timesheet queues. You can also modify the queue settings to reduce the occurrence of the deadlocks. We don't have any prescriptive guidance yet on suggested changes, but reducing the number of threads, increasing the polling interval, or increasing the SQL retry intervals would likely reduce the number of deadlocks you see. However, these changes will also reduce the throughput of your queue, particularly when processing lightweight jobs. If you only see the deadlock behavior at specific times of day and want to change queue settings to suit workload, you could even use the QueueSystem web service to change the settings (using the SetQueueConfiguration method).
I’m not sure if anyone will really want to micro-manage their queue in this way – or what the overall throughput benefits would be – but the option is there.
