
Optimizing EF4 and EDMGen2

I have just been reading (via links from Adrian Florea) about how EF4 can be optimized: http://www.codeproject.com/KB/database/PerfEntityFramework.aspx

The first modification is to pre-generate your views. For this you need the .ssdl, .csdl, and .msl files, so you change the Metadata Artifact Processing property to: Copy to Output Directory. Then you process the .ssdl, .csdl, and .msl files with EdmGen in order to obtain the views.

Up to here, everything is OK.

But the next piece of advice is to keep the Metadata Artifact Processing property set to: Embed in Output Assembly.

One solution is to set the Metadata Artifact Processing property to Copy to Output Directory, compile, set it back to Embed in Output Assembly, and compile again. But if you change the edmx (add fields or tables), you must redo the whole operation – so you have more things to do (if you remember to do them at all).

A better solution is to generate the files in a pre-build step. But how do you generate the .ssdl, .csdl, and .msl files?

EdmGen2, http://code.msdn.microsoft.com/EdmGen2, to the rescue. Download it, put it into a lib folder under your solution folder, and add a pre-build event like this:

"$(SolutionDir)lib\edmgen2\edmgen2" /FromEdmx "$(ProjectDir)prod.edmx"

"%windir%\Microsoft.NET\Framework\v4.0.30319\EdmGen.exe" /mode:ViewGeneration /language:CSharp /nologo "/inssdl:prod.ssdl" "/incsdl:prod.csdl" "/inmsl:prod.msl" "/outviews:$(ProjectDir)prod.Views.cs"

What must you change in your project?

1. If you are on 64-bit Windows, change Framework to Framework64 in the EdmGen.exe path.

2. Replace prod with the name of your edmx.

What is the performance gain?

I tested by loading a table with 301 rows, performing these steps (a sketch of the test code follows the list):

1. open the connection, load the whole table into objects (POCO), close the connection

2. open the connection, find the object with PK = 1, close the connection

3. open the connection, load 1 table with 2 related ones (Include), close the connection
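A minimal sketch of the three timed steps, assuming EF4 POCO entities and a context named ProdEntities with a Products set (these names are assumptions, not the actual benchmark code):

using System;
using System.Diagnostics;
using System.Linq;

static void RunBenchmark()
{
    var watch = Stopwatch.StartNew();
    using (var ctx = new ProdEntities())
    {
        var all = ctx.Products.ToList();   // 1. load the whole table (POCO)
    }
    Console.WriteLine("LoadTable: {0} ms", watch.ElapsedMilliseconds);

    watch.Restart();
    using (var ctx = new ProdEntities())
    {
        var one = ctx.Products.FirstOrDefault(p => p.ProductId == 1);   // 2. find PK = 1
    }
    Console.WriteLine("LoadID: {0} ms", watch.ElapsedMilliseconds);

    watch.Restart();
    using (var ctx = new ProdEntities())
    {
        // 3. one table with two related ones, via Include
        var withRelated = ctx.Products.Include("Orders").Include("OrderDetails").ToList();
    }
    Console.WriteLine("LoadMultiple: {0} ms", watch.ElapsedMilliseconds);
}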

The results are in milliseconds:

Without pre-compiled views

LoadTable LoadID LoadMultiple Total Time
579 800 172 1551
563 755 171 1489
559 754 169 1482
568 762 240 1570

With pre-compiled views:

LoadTable LoadID LoadMultiple Total Time
606 807 183 1596
509 706 177 1392
852 137 192 1181
530 733 221 1484
523 722 183 1428

The average / min / max results:

average max min
without 1523 1570 1482
with 1413.25 1596 1181

In the next picture, the smaller the duration (milliseconds), the better:

[chart: total durations, with vs. without pre-compiled views]

Conclusions:

1. For the average and the min, the difference is 7% and 20% respectively ((1523 − 1413.25) / 1523 ≈ 7%; (1482 − 1181) / 1482 ≈ 20%). Please remember we are using only 3 queries.

For the max, it is very curious: the run with pre-compiled views takes longer than the one without. The penalty is about 1% – I think it is a measurement error, or maybe not. However, the penalty is small compared with the gains elsewhere.

2. Very curiously, finding by ID in a table with 301 rows took longer than loading the whole table. However, the measurement did not take into account finding the object in the loaded list afterwards (it is also in memory).

3. It may be worth adding the pre-build step shown above to pre-compile the views.

Links:

http://www.codeproject.com/KB/database/PerfEntityFramework.aspx

http://msdn.microsoft.com/en-us/library/bb896240.aspx

TT files – generate enum from database

Many times you will program against a table that contains something like an enum, such as Status (open=1, close=2, sent=3, approved=4).

It is awkward to write those statuses as text in the other tables – and you also do not want to update their codes in the C# (VB.NET) source each time you add another one.

Rather, it is convenient to auto-generate them from the database.

But how do you do it in Visual Studio? The answer is .tt files – the same kind of files that also generate the POCO classes.

So here is my own template for generating such an enum from the database.

To use it: unzip, add it to the project that contains the edmx, and do what it says below – and you will see as many .cs files as the tables you configure (a sample of the generated output follows the template header).

<#
//*********************************************************
//
//    NO Copyright. Use at your own risk.
//    Please modify:
//    1) the names of the tables to generate enums for: string nameforenum
//    2) the connection to the database: string connectionstring
//    3) the name of the model: string inputFile
//    Then save the file and you will have an enum…
//*********************************************************
#>
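For illustration, the generated output for the Status table mentioned earlier might look like this (a sketch; the actual members come from your table rows):

// Sample of the kind of .cs file the template emits for a Status
// lookup table (values here are hypothetical).
public enum Status
{
    Open = 1,
    Close = 2,
    Sent = 3,
    Approved = 4
}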

GenerateEnum

EF, automatic history of tables and T4 files (TT files)

Usually the data in the tables should be tracked: who modified it, and when.

Think about inserting/updating/deleting an employee: you must know who did those actions and when. So you create another table, identical in structure, and you add another 3 fields, such as [ModifiedDate] (when), [ModifiedBy] (who), [ModifiedType] (what: insert, update, delete).
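As a sketch, a history entity has the same shape as the original plus the three audit fields (the Employee columns below are assumptions):

using System;

// Sketch: Employee_History mirrors Employee and adds the audit fields
// (Employee's own columns are hypothetical here).
public class Employee_History
{
    public int IDEmployee { get; set; }
    public string Name { get; set; }

    public DateTime ModifiedDate { get; set; }  // when
    public string ModifiedBy { get; set; }      // who
    public string ModifiedType { get; set; }    // what: insert, update, delete
}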

There are several ways to do it:

  1. from the database (e.g. with triggers);
  2. from programming code – every time you modify an object, you must remember to modify the history object with the appropriate data.

The drawback of the database approach is that you cannot retrieve who made the modifications (usually applications connect under a single account and keep their own roles table).

The drawback of the programming approach is that the programmer must REMEMBER to do it… If he does not (and does not write tests for the history), you are stuck…

In the following I propose an automatic history – one that applies convention over configuration in my template, but it is easy for you to modify.

The solution works with Entity Framework 4.0 and, for an easier separation of concerns, with the POCO generators.

Let’s say you have the following tables:

[database diagram: Employee / Employee_History, Department / Department_History]

As you see, we have an Employee and an Employee_History, a Department and a Department_History.

The conventions are:

the history table name = “object” table name + “_History” suffix

the history table fields = “object” table fields + [ModifiedDate], [ModifiedBy], [ModifiedType]

(if you change these conventions, please change the ModelHistory.tt file accordingly; a sketch of the naming convention follows)
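A minimal sketch of that naming convention, as the template applies it (the helper names here are hypothetical; the template's actual variables appear later):

using System;

// Sketch: how an entity name maps to/from its history twin.
static class HistoryConvention
{
    const string Suffix = "_History";

    public static bool IsHistoryEntity(string entityName)
    {
        return entityName.EndsWith(Suffix, StringComparison.OrdinalIgnoreCase);
    }

    public static string OriginalNameFor(string historyEntityName)
    {
        return historyEntityName.Substring(0, historyEntityName.Length - Suffix.Length);
    }
}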

If you want to see it in action, please download the code (history) and do the following:
1. create a database named tests
2. run history.sql
3. run the project
4. if necessary, re-create Model1.edmx with the same name and replace the console application app.config with the new connection string

Once it works, add a field to the Department table and to the Department_History table (same field name/type). Re-compile the application and modify the new field on a Department. You will see the modification recorded in the Department_History table.

OK, now how do we do the magic?

We create two new .tt files that point to the model .edmx file.

The first one, ModelHistory.tt, takes care of creating a constructor for each history entity that takes the original entity as a parameter:

public Department_History(Department original) : this()
{
    this.IDDepartment = original.IDDepartment;
    this.Name = original.Name;
}

How does it do this magic? Simple: ModelHistory.tt recognizes the model and the history tables by name:

string inputFile = @"Model1.edmx";
string History = "_History";

then it generates the code for the constructor:

	#>
		public <#=code.Escape(entity)#>():base()
		{
		}
		public <#=code.Escape(entity)#>(<#=NameEntityOriginal #> original):this()
		{
		<#
	foreach (EdmProperty edmProperty in entityOriginal.Properties.Where(p => p.TypeUsage.EdmType is PrimitiveType && p.DeclaringType == entityOriginal))
	{
		#>
				this.<#= code.Escape(edmProperty.Name) #>=original.<#= code.Escape(edmProperty.Name) #>;
		<#

	}
	#>
		}
	<#

OK, and then how do we create the history entity? I wish the POCO template had a “database saving” event – but the only hook available is SaveChanges on the ObjectContext – so I create a new ObjectContext, derived from the default one that comes with the project, which creates the history objects:


public override int SaveChanges(SaveOptions options)
{
    this.DetectChanges();
    DateTime dtModified = DateTime.Now;
    string UserModified = clsUser.UserName;

    foreach (ObjectStateEntry ose in this.ObjectStateManager.GetObjectStateEntries(
        EntityState.Added | EntityState.Deleted | EntityState.Modified))
    {
        //could do this way too:
        //if (ose.Entity != null && ose.Entity.GetType() == typeof(...))
        //{
        //}
        if (ose.Entity != null)
        {
            string NameType = ose.EntitySet.ElementType.Name;

            switch (NameType)
            {
                case "Department":
                    var itemDepartment_History = new Department_History(ose.Entity as Department);
                    //if you get a compile error here, it means you track
                    //who modified what with different properties -
                    //please modify the tt accordingly
                    itemDepartment_History.ModifiedType = ose.State.ToString();
                    itemDepartment_History.ModifiedDate = dtModified;
                    itemDepartment_History.ModifiedBy = UserModified;
                    base.Department_History.AddObject(itemDepartment_History);
                    break;

                case "Employee":
                    var itemEmployee_History = new Employee_History(ose.Entity as Employee);
                    //if you get a compile error here, it means you track
                    //who modified what with different properties -
                    //please modify the tt accordingly
                    itemEmployee_History.ModifiedType = ose.State.ToString();
                    itemEmployee_History.ModifiedDate = dtModified;
                    itemEmployee_History.ModifiedBy = UserModified;
                    base.Employee_History.AddObject(itemEmployee_History);
                    break;
            }
        }
    }

    return base.SaveChanges(options);
}

Now everything is ready, and I made a console application for manual testing (OK, I should have written NUnit / MSTest / xUnit tests):

            int id;
            using (var ctx = new testsEntitiesHistory())
            {
                var dep = new Department();
                dep.Name = "IT";
                ctx.Departments.AddObject(dep);
                ctx.SaveChanges();
                id = dep.IDDepartment;
            }
            using (var ctx = new testsEntitiesHistory())
            {
                var dep = ctx.Departments.Where(depart => depart.IDDepartment == id).FirstOrDefault();
                dep.Name = "Information technology";
                ctx.SaveChanges();
            }
            using (var ctx = new testsEntitiesHistory())
            {
                var dep = ctx.Departments.Where(depart => depart.IDDepartment == id).FirstOrDefault();
                ctx.Departments.DeleteObject(dep);
                ctx.SaveChanges();
            }
            using (var ctx = new testsEntitiesHistory())
            {
                foreach (var dephist in ctx.Department_History)
                {
                    Console.WriteLine("Found {0} with state {1}", dephist.Name,dephist.ModifiedType);
                }
            }

And the output is:

[screenshot: console output – the department history is saved automatically]

Now you can add more tables to the edmx or change the fields – everything is regenerated automatically when you compile.

If you want to see it in action, please download the code (history).

Update: for another way to do it (generating triggers and tables), please see http://msprogrammer.serviciipeweb.ro/2010/09/27/generating-history-trigger-with-ef-edmx-and-tt-files/

Entity Framework profiler

Many times I have had a problem with the following error when inserting objects with dates using Entity Framework:

System.Data.UpdateException An error occurred while updating the entries. See the inner exception for details.SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM.

OK, it is my fault – but remembering this for every date field is too much for me… (a minimal guard sketch follows)
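As a sketch, one way to avoid the overflow is to clamp dates to the SQL Server range before saving (the helper and the property names below are assumptions):

using System;
using System.Data.SqlTypes;

// Sketch: clamp a DateTime to the SqlDateTime range before saving
// (the usage below assumes a hypothetical HireDate property).
static DateTime ClampToSqlRange(DateTime value)
{
    DateTime min = SqlDateTime.MinValue.Value;  // 1753-01-01
    return value < min ? min : value;
}

// before ctx.SaveChanges():
// employee.HireDate = ClampToSqlRange(employee.HireDate);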

The usual method was to start SQL Profiler, monitor the database, and see how the SQL is constructed. However, since the database is used by all the developers, it was not so simple to tell the SQL statements apart.

The other alternative was to log the SQL generated by Entity Framework. I discovered Ayende Rahien's Entity Framework Profiler. It is simple to use, as written here, in 2 simple steps:

  1. add reference to HibernatingRhinos.Profiler.Appender.dll
  2. put this
HibernatingRhinos.Profiler.Appender.EntityFramework.EntityFrameworkProfiler.Initialize();

and start the exe. That is all (a minimal sketch follows).
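As a sketch, the Initialize call goes at the very start of the application, before any ObjectContext is created (assuming a console application):

using HibernatingRhinos.Profiler.Appender.EntityFramework;

class Program
{
    static void Main(string[] args)
    {
        // must run before any ObjectContext is created,
        // so the profiler can intercept the generated SQL
        EntityFrameworkProfiler.Initialize();

        // ... the rest of the application ...
    }
}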

Do not forget to remove it in the release version!

Pros:

Easy to use, valuable information, good!

Cons:

Not free …
