June 6, 2011 by temebele
Technology teams developing business applications often spend a lot of time analyzing and comparing tools and frameworks. This is mainly due to the fragmented nature of the providers, both commercial vendors and open source contributors, that build these tools and frameworks. The choice gets even harder with Microsoft’s popular culture of redefining and reinventing tools and frameworks that already exist in the open source community and have already been widely adopted by developers. MS always argues that the only time they reinvent the wheel is when they see a bigger vision for how that product fits into the Microsoft ecosystem. At least I know that the data platform’s long-term vision was the reason they gave as the drive for Entity Framework, which was not just a re-invention of other open source ORMs but also of Microsoft’s own LINQ to SQL. (Check out Frans Bouma’s blog, http://weblogs.asp.net/fbouma/archive/2008/05/19/why-use-the-entity-framework-yeah-why-exactly.aspx, if you want to read more on this.) Let us give MS the benefit of the doubt, also in light of their good initiative in adopting, and even contributing to, a few open source frameworks like jQuery.
By the way, the re-inventing issue also exists in the open source community itself. Check out this post where Fluent NHibernate’s creator rants about the new mapping features in NH 3.2: http://lostechies.com/jamesgregory/2011/04/13/me-on-nhibernate-3-2/
The biggest problem is that there is no certified independent vendor whose sole business is to continually compare the various products competing in the same domain space and label them with ratings and assurances for consumers. There you go! Great business idea if you haven’t already thought about this one before. :)
So you might be asking, “How then do technology teams choose these tools?” Honestly, decisions are mostly based on information gathered from the different blogs and articles on the internet hosting these technical debates. Microsoft seems to understand this and has made product evangelism a mainstream part of their marketing strategy. That is also assisted by the MVP program, which is awarded to fellows who are an active part of the Microsoft development community.
At times these technical debates between tools, frameworks, and languages that you find on the internet are more philosophical, where one side is already committed to their decision and just keeps bringing up points to prove their point of view to others. You see the same behavior in development teams, where a developer keeps defending their tool suggestion regardless of the ideas that others put on the table. That makes the argument even more time consuming, and sometimes of zero value to the business. Check out this article by Scott Guthrie if you haven’t already: http://weblogs.asp.net/scottgu/archive/2010/01/24/about-technical-debates-both-in-general-and-regarding-asp-net-web-forms-and-asp-net-mvc-in-particular.aspx
It is worth mentioning that one of the big advantages of Domain Driven Design is how it lets you focus solely on your domain model and treat everything that doesn’t affect the business as an infrastructure concern that can be plugged in or swapped at any time with little or no impact on the core domain. You still need to choose the right tools for your infrastructure responsibly, but DDD gives you the flexibility to change your decisions when you find it necessary.
In a project I recently worked on we were following Domain Driven Design, and NHibernate was our choice of ORM for the persistence infrastructure layer. With Microsoft’s recent improvements to Entity Framework we thought it made sense to take a second look at EF. A feature-to-feature comparison makes sense since we already know the “must have” features our projects currently use from NH.
EF and NH Architecture
NH Architecture (http://knol.google.com/k/-/-/1nr4enxv3dpeq/jd6roh/lite.png)
EF Architecture (http://i.msdn.microsoft.com/ff830362.fig02(en-us,MSDN.10).png)
So let’s begin by defining our generic Repository and Unit of Work interfaces.
Note: Eric Evans envisioned repositories in his DDD book as collection-like interfaces where access is only possible through the aggregate root. But if you look at the implementations of the different ORMs, they delegate the loading of data (lazy and eager) to the property accessor get methods. So until we have an ORM that completely embodies the repository pattern, we will be using a generic repository.
But we are aware that exposing a generic repository implementation to clients might cause problems, since we don’t have control over how clients will use the repositories. Hence our design was to have clients always go through domain services. The domain services in turn depend on the generic repository for persistence concerns.
Personally, I don’t really understand the need for custom repositories (one per aggregate root), because we are already using domain services to define business processes. As part of defining the business process, I believe a domain service could also serve as the aggregate root for that respective domain.
Also, if you use a generic repository interface IRepository&lt;T&gt;, you need to create a new instance of the generic repository implementation for each type. So our choice was to just go with a non-generic repository, IRepository, and only make the methods generic.
Note: Another option would be to go with the generic repository and have a static generic class to access it. (That was Ayende’s implementation in Rhino Commons, an IRepository&lt;T&gt; interface accessible via the static Repository&lt;T&gt;, which I didn’t like that much.)
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

public interface IRepository
{
    TEntity Get<TEntity>(object pk) where TEntity : IDomainEntity;
    IQueryable<TEntity> LoadAll<TEntity>() where TEntity : IDomainEntity;
    void Save<TEntity>(TEntity entity) where TEntity : IDomainEntity;
    TEntity SaveAndRefresh<TEntity>(TEntity entity) where TEntity : IDomainEntity;
    void SaveAndCommit<TEntity>(TEntity entity) where TEntity : IDomainEntity;
    void Delete<TEntity>(TEntity entity) where TEntity : IDomainEntity;
    void DeleteAll<TEntity>() where TEntity : IDomainEntity;
    void DeleteAll<TEntity>(IEnumerable<TEntity> entityList) where TEntity : IDomainEntity;
    IEnumerable<TEntity> Query<TEntity>(Expression<Func<TEntity, bool>> expression)
        where TEntity : class, IDomainEntity;
    IQueryable<TEntity> Query<TEntity>() where TEntity : IDomainEntity;
    IList Query(string sqlText);
}
Defining an interface for the Unit of Work allows us to avoid being coupled to the specific unit of work implementation of either framework.
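As a minimal sketch, such an abstraction could look like the following (the member names here are our own assumptions, not part of either framework):

```csharp
using System;

// Hypothetical unit-of-work abstraction; an NH implementation would wrap
// ISession/ITransaction, while an EF implementation would wrap DbContext
// and SaveChanges.
public interface IUnitOfWork : IDisposable
{
    // Begins a transaction / change-tracking scope.
    void Begin();

    // Flushes pending changes to the database and commits.
    void Commit();

    // Discards pending changes.
    void Rollback();
}
```

The repository implementations then take a dependency on IUnitOfWork rather than on ISession or DbContext directly.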
Now we are going to list all the features we currently use from NHibernate, which we will use for the feature comparison with EF. The comparison is based on the EF 4.1 release version and NH 3.2.
1. POCO Support – With domain centric development, everything else should depend on the domain, and no infrastructure concern should leak into it. That makes POCO support a critical feature.
NH: This is a framework that was originally built with POCO support and DDD concepts in mind. It has full support for POCO objects.
For lazy loading, there is a restriction that classes must not be sealed and lazily loaded properties must be virtual. This allows interception through proxies.
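For example, an NH-friendly POCO entity keeps every persistent member virtual and the class unsealed (the Order and Customer types here are just illustrative):

```csharp
using System;

// Illustrative POCO entity; NHibernate can lazy-load it because the class
// is not sealed and the persistent members are virtual, which lets the
// runtime-generated proxy intercept property access.
public class Order : IDomainEntity
{
    public virtual int Id { get; set; }
    public virtual DateTime OrderDate { get; set; }

    // Lazily loaded association; resolved by the proxy on first access.
    public virtual Customer Customer { get; set; }
}
```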
EF: Entity Framework’s support for POCO has been improving with each release. With EF 4.0, you first need to build the EDMX model and provide the CSDL/SSDL/MSL metadata; then the framework can use your POCO entities. The POCO Entity Generator T4 templates can be used to generate code from the EDMX. (You disable code generation on the EDMX so that it doesn’t generate the non-POCO persistence classes.)
EF 4.1 code-first allows you to create your POCO entities and dynamically generates the EDMX metadata. The normal approach with code-first is to hand code the entity classes first, but it also works if you have an existing database model.
(Similar restrictions apply to the POCO classes to allow interception through Proxies).
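A minimal code-first context is just a DbContext with DbSet properties; EF 4.1 builds the model metadata from the POCO types at runtime, with no designer file required (the context and entity names here are illustrative):

```csharp
using System.Data.Entity;

// EF 4.1 code-first context; the EDMX-equivalent metadata is generated
// at runtime from these POCO sets.
public class StoreContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
    public DbSet<Customer> Customers { get; set; }
}
```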
Note: Currently there is no tool support to map an existing database model to POCO classes. I am hoping that LLBLGen will start supporting that soon. If not, I think you are better off using the visual designer, going data model first, and then using the T4 generators to create the POCO classes.
(I believe MS’s argument is that if you want POCO classes then you’d better like hand coding the entities. But sometimes you may want to generate POCOs from an existing data model and then start evolving the POCO objects afterwards.)
Issue: Every time you change the model, it requires you to drop and recreate the database. Even if you manually apply the changes both on the model and in the database, it complains about the model being different from the database. So any stored data is lost on every schema change.
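This behavior is controlled through EF 4.1 database initializers, so it can at least be tuned (a sketch; StoreContext is a hypothetical code-first context type):

```csharp
using System.Data.Entity;

// Somewhere in application startup:

// Recreate the database only when the code-first model no longer matches
// the schema. Data is still lost whenever the model changes.
Database.SetInitializer(new DropCreateDatabaseIfModelChanges<StoreContext>());

// Alternatively, suppress initialization entirely and manage schema
// changes by hand; EF then skips the compatibility check.
Database.SetInitializer<StoreContext>(null);
```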
2. Support for Data Model-First Development – With tools like LLBLGen and others there is also support to generate entities and mappings for an existing data model with NH.
NH: You can use generators like LLBLGen, Visual NHibernate, or other tools to achieve this with NH.
EF: Great support using the visual designer.
3. Support for Domain Model-First Development – NH supports generating and exporting schema information and queries to create the database based on the domain model and mappings. Thus you can start development from the model.
NH: This is also achievable with tools like LLBLGen that allow model-first development, but EF has better support with its visual designer.
EF: Great Visual Studio designer support
4. All Basic Repository Implementations – We should be able to implement and use all the generic repository operations that we currently use with NH. EF has the querying capabilities to perform all the behaviors defined in the generic repository.
5. Mapping from the Conceptual Model to the Storage Model – With Fluent NH, we were able to use code-based configuration for our entity mappings, with one mapping class per entity. There was also the flexibility to use hbm XML mappings in certain scenarios that were not fully supported by the fluent mapping.
With EF 4.1 you pretty much don’t have to configure anything for table-per-entity mappings, as long as you follow the conventions for naming your POCO classes and properties to match the table names. But if you want to split a table across multiple types, or need to name the properties or classes differently, then you have to override the OnModelCreating method of DbContext.
Note: Instead of lumping all custom configuration into a single DbContext class, I prefer the approach of having separate mapping classes per entity. I believe that is achievable with EF but haven’t looked into it yet.
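From what I can tell, EF 4.1 does allow this through EntityTypeConfiguration&lt;T&gt; classes registered inside OnModelCreating, which keeps one mapping class per entity much like Fluent NH (entity, table, and column names here are illustrative):

```csharp
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;

// One mapping class per entity, in the spirit of Fluent NHibernate.
public class OrderMap : EntityTypeConfiguration<Order>
{
    public OrderMap()
    {
        HasKey(o => o.Id);
        Property(o => o.OrderDate).HasColumnName("ORDER_DATE");
        ToTable("ORDERS");
    }
}

public class StoreContext : DbContext
{
    public DbSet<Order> Orders { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Register each mapping class instead of inlining all configuration here.
        modelBuilder.Configurations.Add(new OrderMap());
    }
}
```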
6. Lazy and Eager Loading – With NH, lazy loading is the default. Eager loading is possible using the Criteria API, but it is not very intuitive. We want to see how this is done with EF.
Also, NH expects entity classes not to be sealed and their properties to be virtual so that it can use proxies for interception during lazy loading. We want to know how lazy loading is done with EF. (TODO: Need to look at EF)
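From my initial look, EF 4.1 also enables lazy loading by default for DbContext-created proxies (with the same unsealed-class, virtual-property requirement), and eager loading is expressed with Include. A fragment to illustrate (context, cutoff, and the entity names are assumed to exist in surrounding code):

```csharp
using System.Data.Entity; // brings in the lambda-based Include extension
using System.Linq;

// Eager-load the Customer association in the same query as the orders.
var orders = context.Orders
                    .Include(o => o.Customer)
                    .Where(o => o.OrderDate > cutoff)
                    .ToList();

// Lazy loading can also be switched off per context instance
// if explicit loading is preferred.
context.Configuration.LazyLoadingEnabled = false;
```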
7. Code Generation – With NH we were relying on LLBLGen’s designer tool to generate code for both the domain and mapping layers. The same support exists with Visual Studio’s designer and T4 templates in EF.
8. Ability to Add Business Logic to Generated Entities – We just use C#’s partial class feature to add custom logic that we don’t want overwritten every time new code is generated. The same applies in EF. Anyway, in a pure POCO implementation we shouldn’t worry about code-gen and overwriting, as the POCO classes represent the evolving model. :)
9. Ability to Intercept or Subscribe to ORM Events to Add More Data (e.g. Audit Information) – With NH we were able to easily add event listeners to intercept inserts and updates and add audit information to the database.
With EF you have to override SaveChanges in DbContext (or handle the SavingChanges event on the underlying ObjectContext). The NH way of using event listeners looks more intuitive.
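A common EF sketch is to override SaveChanges and stamp the tracked entries before they are written (IAuditable and its ModifiedOn property are hypothetical types of ours, not part of EF):

```csharp
using System;
using System.Data;
using System.Data.Entity;
using System.Linq;

// Hypothetical marker interface for auditable entities.
public interface IAuditable
{
    DateTime ModifiedOn { get; set; }
}

public class StoreContext : DbContext
{
    public override int SaveChanges()
    {
        // Stamp audit fields on every added or modified entity before
        // the changes are flushed to the database.
        foreach (var entry in ChangeTracker.Entries<IAuditable>()
                     .Where(e => e.State == EntityState.Added ||
                                 e.State == EntityState.Modified))
        {
            entry.Entity.ModifiedOn = DateTime.UtcNow;
        }
        return base.SaveChanges();
    }
}
```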
10. Ability to Cache Data at the Session and Application Level – NH supports session-level caching by default (first-level cache), and it also has support to plug in any caching implementation for application-level caching (second-level cache). Thus it avoids the need to create a custom caching layer between your data and services.
EF doesn’t have second-level caching support. NH contributor Ayende is working on a commercial product to extend EF with a second-level cache: http://ayende.com/blog/4364/designing-the-entity-framework-2nd-level-cache
11. Ability to Use the Query Cache on Demand – With NH you can enable query caching and then mark certain queries as cacheable.
I haven’t seen this with EF.
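On the NH side this is a one-liner with the NH 3.x LINQ provider, assuming the query cache has been enabled in configuration (the Customer entity is illustrative):

```csharp
using NHibernate.Linq;
using System.Linq;

// Opt this particular query into the second-level query cache;
// requires cache.use_query_cache to be enabled in the NH configuration.
var activeCustomers = session.Query<Customer>()
                             .Where(c => c.IsActive)
                             .Cacheable()
                             .ToList();
```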
12. Ability to Execute Named Queries – For our integration tests we were able to use HBM named query mappings to execute large data setup queries.
The same concept exists in EF as DefiningQuery: http://msdn.microsoft.com/en-us/library/bb738450%28VS.100%29.aspx
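For reference, the NH side of this is a named query declared in the hbm mapping and executed by name (the query name and SQL here are illustrative):

```csharp
// In the hbm.xml mapping, a named SQL query (illustrative):
//   <sql-query name="PurgeOrders">DELETE FROM ORDERS</sql-query>
//
// Executed by name from the integration-test setup code:
session.GetNamedQuery("PurgeOrders").ExecuteUpdate();
```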
13. Logging and Traceability
15. Documentation and Support
NH sucks big time when it comes to documentation. It looks like all the contributors prefer coding to writing documentation. The introduction of profiler tools like NHProfiler is making it easier to learn what the ORM is doing internally.
EF, being from Microsoft, has lots of documentation, tutorials, and books.
NH doesn’t have a commercial foundation or as large a developer base as some other open source projects.
EF is supported by Microsoft.
This was my comparison based just on the features we currently use, but there are many other features that differ between these two. I will put together and share a sample application soon that uses the same domain layer with POCO classes but separate data infrastructure implementations for NH and EF.
Though these two frameworks are getting a lot closer in their feature sets and support for DDD, it looks like NH still has a slight edge over EF in terms of maturity, extensibility, and some features like caching support. Performance and other quality attributes will need more research and prototyping before I can post any comparison.
Refer to http://fabiomaulo.blogspot.com/2011/04/me-on-fluent-nhibernate.html for the new ConfORM-based mapping-by-code in NHibernate 3.2.