Thursday, October 6, 2011

LINQ to Object JOIN

I was reviewing a LINQ to Objects join query written by one of my colleagues. The code looked something like this (an example, not the exact code; the actual query was more complex than this)...

IList<int> result =
    (from i in col1
     join j in col2 on i equals j
     select i).ToList();

I told him to rewrite it like this, or to use two nested foreach loops, as the join seemed too complex...

IList<int> result =
    (from i in col1
     from j in col2
     where i == j
     select i).ToList();

My colleague argued that Microsoft would have implemented the Join operator in a better way than two loops running inside each other…

I ran a few tests on the two queries above, and the query that used the Join operator was way faster…

The reason is that the Join operator (the extension method) is implemented so that one collection's keys are enumerated once and put into a hash-based lookup; the other collection is then iterated, and each of its keys is checked against the lookup. Each such check is an O(1) operation on average, which is way faster than comparing a variable in an outer loop with every element of an inner loop.
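The idea can be sketched like this (a simplified illustration of a hash join, not the actual BCL source; the method and variable names are my own):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    // A simplified hash join: build a hash set from one sequence once,
    // then stream the other sequence and probe the set per element.
    static IEnumerable<int> HashJoin(IEnumerable<int> outer, IEnumerable<int> inner)
    {
        var keys = new HashSet<int>(inner);   // O(n) build, done once

        foreach (var o in outer)              // O(m) probe loop
            if (keys.Contains(o))             // O(1) average per probe
                yield return o;
    }

    static void Main()
    {
        var col1 = Enumerable.Range(0, 10);
        var col2 = new[] { 3, 5, 7, 42 };
        Console.WriteLine(string.Join(",", HashJoin(col1, col2))); // 3,5,7
    }
}
```

Compare that with the nested-loop version, which performs O(n × m) comparisons.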

Thank you Rajiv…

Wednesday, May 18, 2011

.Net Reflector to Telerik JustDecompile

I have been using .NET Reflector for almost 4 years now, and it really is a cool tool for decompiling assemblies when you need to; it was one of the tools I used to decompile some of the assemblies provided by Microsoft to see what's going on in the internals.

It was a bit fishy when Reflector was bought by Red Gate (the makers of ANTS Profiler), and gradually, as you all might know, Reflector is no longer free; what's worse, the free version will cease to function after May 30th this year.

Enter Telerik. Telerik has promised to release their JustDecompile tool to the public for FREE, and this tool gives you almost all the features .NET Reflector provides.
You can download the beta version from here...

Below is a screenshot of JustDecompile.

Tuesday, May 17, 2011

Entity Framework and POCO - No Support for Xml or Enums

I have been keeping up with Entity Framework at some pace, but had not tried much of it myself.
So I thought of doing a sample on POCO support with Entity Framework.

This was the class I used to insert into my database.
public class Task
{
    public long TaskID { get; set; }
    public XElement TaskParameter { get; set; }
    public string TaskType { get; set; }
    public DateTime EnquedTime { get; set; }
    public DateTime CompletedTime { get; set; }
    public Guid TaskIdentifier { get; set; }
    public TaskStatus TaskStatus { get; set; }
    public string ErrorDescription { get; set; }
    public TaskPriority TaskPriority { get; set; }
}

The property TaskParameter maps to an XML column in the database, and the properties TaskStatus and TaskPriority are enums, implemented as smallint columns in the database.

Now, when I try to instantiate a context, I get this error...

"Mapping and metadata information could not be found for EntityType Task"

It turns out that EF does not support mapping an entity property to XElement (you have to map the XML column to a string in the entity), and that EF does not support translating the smallint columns back to enums :(
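One possible workaround (a sketch; the wrapper property names below are my own, not from the original model) is to map the XML column to a string and the smallint column to a short, then expose convenience wrappers that are not part of the entity mapping:

```csharp
using System;
using System.Xml.Linq;

public enum TaskStatus : short { Pending = 0, Completed = 1 }

public class Task
{
    public long TaskID { get; set; }

    // Mapped to the XML column; EF materializes it as a plain string.
    public string TaskParameterXml { get; set; }

    // Convenience wrapper, excluded from the entity model.
    public XElement TaskParameter
    {
        get { return TaskParameterXml == null ? null : XElement.Parse(TaskParameterXml); }
        set { TaskParameterXml = value == null ? null : value.ToString(SaveOptions.DisableFormatting); }
    }

    // Mapped to the smallint column; the enum wrapper sits on top of it.
    public short TaskStatusValue { get; set; }
    public TaskStatus TaskStatus
    {
        get { return (TaskStatus)TaskStatusValue; }
        set { TaskStatusValue = (short)value; }
    }
}

class Program
{
    static void Main()
    {
        var t = new Task { TaskParameter = new XElement("p", 42), TaskStatus = TaskStatus.Completed };
        Console.WriteLine(t.TaskParameterXml + " " + t.TaskStatusValue); // <p>42</p> 1
    }
}
```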

This works perfectly fine in LINQ to SQL, so why is this feature not available in EF? Here is a post on why...

You would also get this error if your POCO does not implement all the properties defined in the entity model; the property names must also match the names given in the model.

Sunday, May 1, 2011

WCF Configuration Nightmare, MSBuild, IIS AppCmd

In this post I am going to discuss some of the deployment nightmares you might face when you design an SOA solution, where you segregate logic into several decoupled services. The post does not really deal with production deployment, but with internal developer releases.

The problem is that when you have multiple services and you want to deploy your code to one of the test environments, before you release it to the testing team, there is a huge amount of configuration you need to change.
This typically includes...

1) Changing service addresses. This might include WCF services that your service references, and sometimes one of these services refers to more than one other service. The services might not all be deployed on one machine, so referring to localhost will not work.

2) Changing database connections. This can be a cumbersome task; you might have to go into each and every configuration file and change the database server, database user or database name.

3) Application settings. These might include paths to shared folders, etc...

4) Any other configuration sections that differ from your local development machine. For example, you might not want to expose metadata after deployment, or you might want to change a few WCF settings such as MaxConcurrentConnections.

After deploying all these WCF services, you would have to go through these numerous configuration files and change each of these settings to point to the correct values for the new environment.

This is cumbersome and error prone; you can waste a few hours troubleshooting just to find out which configuration value was set wrongly.

A very primitive way of deploying WCF services would be to build your application from Visual Studio and then manually add the services to IIS; if you are exposing your services through non-HTTP endpoints in IIS 7, you would then have to configure bindings for the protocols used.
Once this is done, you would configure service addresses and DB connections. You might have to create a .svc file for HTTP activation manually, or you could use the publish option in Visual Studio to create it automatically.

Next I will outline some of the ways to improve this process.


One thing you can do is create an MSBuild script that builds your solution and then copies the output to another location.
MSBuild is powerful enough to do this, and the script would be a very small one; you can simply instruct MSBuild to build your application by pointing it at your solution.
If you are deploying a web application, you just need to write a target that copies the output into a folder called bin and then copies the whole source of the web application without the .cs or .ascx files.
You can learn more about MSBuild here
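Such a script can be as small as this (a sketch; the solution, project and folder names here are hypothetical):

```xml
<!-- deploy.proj : build the solution, then copy the web output to a drop folder. -->
<Project DefaultTargets="Deploy" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <DropFolder>C:\Drops\MyServices</DropFolder>
  </PropertyGroup>

  <Target Name="Build">
    <MSBuild Projects="MyServices.sln" Properties="Configuration=Release" />
  </Target>

  <Target Name="Deploy" DependsOnTargets="Build">
    <!-- Collect the web application content, excluding source files. -->
    <ItemGroup>
      <WebFiles Include="MyWebService\**\*.*"
                Exclude="MyWebService\**\*.cs" />
    </ItemGroup>
    <Copy SourceFiles="@(WebFiles)"
          DestinationFiles="@(WebFiles->'$(DropFolder)\%(RecursiveDir)%(Filename)%(Extension)')" />
  </Target>
</Project>
```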

IIS7 AppCmd

Once this is done, you can use the IIS 7 AppCmd tool that ships with IIS to create the web applications and WCF applications. AppCmd is a command-line tool that gives administrators the same power they get with the IIS 7 GUI, so it's just a matter of creating a .bat file to do the task of deploying to IIS, and calling that .bat file from your MSBuild script as a target.
You can learn more about App Cmd here
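The .bat file could look something like this (the site, pool and path names are hypothetical; appcmd lives in %windir%\system32\inetsrv):

```bat
rem Create an application pool and a WCF application under the default site.
%windir%\system32\inetsrv\appcmd add apppool /name:MyServicesPool
%windir%\system32\inetsrv\appcmd add app /site.name:"Default Web Site" /path:/TaskService /physicalPath:C:\Drops\MyServices\TaskService
%windir%\system32\inetsrv\appcmd set app "Default Web Site/TaskService" /applicationPool:MyServicesPool
rem For non-HTTP endpoints, enable the extra protocols on the application.
%windir%\system32\inetsrv\appcmd set app "Default Web Site/TaskService" /enabledProtocols:http,net.tcp
```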

MSBuild Extension Pack

MSBuild does not come with any built-in tasks to change configuration values; however, there are community task libraries that you can use to do this, one of which is the MSBuild Extension Pack.
This component gives you a set of tasks to manipulate XML files: reading, writing and updating.
This is exactly what we want, to change our configuration files to point to the correct server and change the DB connections.
You can learn more about MSBuild Extension pack here
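For example, pointing a client endpoint at the test server might look like this (a sketch; the file path, endpoint name and address are hypothetical, and the exact task names should be checked against the Extension Pack reference for your version):

```xml
<!-- Assumes the Extension Pack's XmlFile task with the UpdateAttribute action. -->
<Import Project="$(MSBuildExtensionsPath)\ExtensionPack\MSBuild.ExtensionPack.tasks" />

<Target Name="UpdateConfig">
  <!-- Rewrite the endpoint address so it no longer points at localhost. -->
  <MSBuild.ExtensionPack.Xml.XmlFile TaskAction="UpdateAttribute"
      File="C:\Drops\MyServices\TaskService\web.config"
      XPath="//system.serviceModel/client/endpoint[@name='TaskClient']"
      Key="address"
      Value="http://testserver01/TaskService/Task.svc" />
</Target>
```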

Automating deployment nightly

Now all you have to do is instruct CruiseControl or Hudson to execute your build script, and you will have your latest build deployed to your environment nightly or weekly.

Friday, April 29, 2011

Moving Session Data from InProc to State Server or Database and Their Problems

It's been a while since I posted, almost 4 months now; I was busy with office work.
So in this post I would like to talk about ASP.NET sessions. We all know there are a few methods you can use to store session information: the common and default way is to store session state in the memory of the machine on which ASP.NET is running, and the others are to use the ASP.NET state server or the default SQL Server provider to store session data. We are not going back to the basics here; the point is that we should be careful in choosing where the session data will be stored and what sort of data will be stored.
It's particularly important to know that moving session data from one mechanism to another (in-proc to state server) is not just a matter of changing the configuration file.
When using in-proc, data is stored in memory as live objects, but when storing it in the state server or in a database like SQL Server, the session data needs to be serialized.

Not all data is serializable; for example the XElement class is not marked [Serializable] (and while the generic Dictionary is binary-serializable, XmlSerializer cannot handle it), so you have to be careful about what you store in the session if you think that in the future you might go for a state server.

How much this matters depends on what data is stored in the session; in general, the best thing you can do is store as little data as possible in the session, so that you don't use up a lot of server memory holding per-user data.

There are also times when you would want to store view state data in the session, so that the amount of data going back and forth between the user's browser and the server is minimized; here is an MSDN link on how you can do that.

In the above case, that is where you can do more harm than good, because you might be storing data that needs to be persisted across postbacks; this might range from small data sets to a few custom objects.
These are some of the cases where you can go wrong when you start storing data in sessions and then move from in-proc to a database or state server...

1) You create an anonymous type and bind it to a grid. Anonymous types are not serializable, so this will not work if you are using the state server or a database to store session data; you would have to create a wrapper type and store that in the session instead.

2) Any object that has a property of type XElement. Unfortunately this type is not serializable, so you would have to either use XmlDocument to store the XML representation or take the step explained in the 3rd point below.

3) If you are storing one of your custom types, it has to be marked with the [Serializable] attribute. However, there are cases where you inherit from a class that is not marked serializable, and marking your implementation [Serializable] won't make any difference. One way to get around this problem may be to use an XmlSerializer to serialize the object into its XML representation and then store the text in the session (there is an overhead here...).

All three approaches mentioned have their overheads, so choose wisely what data you store in the session.
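The 3rd point can be sketched like this (the type and member names are made up for illustration; only the string produced by ToXml would go into the session):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

public class CartItem   // imagine the real base class can't be marked [Serializable]
{
    public string Name { get; set; }
    public int Quantity { get; set; }
}

public static class SessionXml
{
    // Serialize any public type to an XML string suitable for session storage.
    public static string ToXml<T>(T value)
    {
        var serializer = new XmlSerializer(typeof(T));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, value);
            return writer.ToString();   // store this string in Session[...]
        }
    }

    // Parse the stored string back into the original type.
    public static T FromXml<T>(string xml)
    {
        var serializer = new XmlSerializer(typeof(T));
        using (var reader = new StringReader(xml))
            return (T)serializer.Deserialize(reader);
    }
}

class Program
{
    static void Main()
    {
        var xml = SessionXml.ToXml(new CartItem { Name = "Book", Quantity = 2 });
        var back = SessionXml.FromXml<CartItem>(xml);
        Console.WriteLine(back.Name + "," + back.Quantity); // Book,2
    }
}
```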

P.S. Some of these ideas came from Yohan S.

Tuesday, January 4, 2011

File Transactions (TFX)

There are situations where a transaction spans both a database and the file system; typically this is handled by running the database call inside a transaction scope and doing the file update after the database call returns.
If the database call fails then nothing is written to the file system, and if the file update fails then the database transaction is rolled back.
This approach is necessary because file updates do not participate in the DTC.
But what if we are dealing with multiple files, or some other complex scenario?
This is an article one of my friends (Mihidum) forwarded to me about NTFS file system transactions;
here is the link.

Too bad .NET does not support this natively, and we have to write unmanaged (interop) code.
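For the curious, the unmanaged route looks roughly like this (a rough interop sketch against the Win32 Kernel Transaction Manager APIs, Windows Vista and later only; simplified, with no error handling, so treat it as an outline rather than production code):

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

static class Tfx
{
    [DllImport("ktmw32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern SafeFileHandle CreateTransaction(
        IntPtr securityAttributes, IntPtr uow, int createOptions,
        int isolationLevel, int isolationFlags, int timeout, string description);

    [DllImport("ktmw32.dll", SetLastError = true)]
    static extern bool CommitTransaction(SafeFileHandle transaction);

    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern SafeFileHandle CreateFileTransactedW(
        string fileName, uint desiredAccess, uint shareMode,
        IntPtr securityAttributes, uint creationDisposition,
        uint flagsAndAttributes, IntPtr templateFile,
        SafeFileHandle transaction, IntPtr miniVersion, IntPtr extendedParameter);

    static void Main()
    {
        const uint GENERIC_WRITE = 0x40000000, CREATE_ALWAYS = 2;

        using (var tx = CreateTransaction(IntPtr.Zero, IntPtr.Zero, 0, 0, 0, 0, "demo"))
        {
            using (var file = CreateFileTransactedW(@"C:\temp\a.txt", GENERIC_WRITE, 0,
                       IntPtr.Zero, CREATE_ALWAYS, 0, IntPtr.Zero, tx, IntPtr.Zero, IntPtr.Zero))
            using (var stream = new FileStream(file, FileAccess.Write))
            {
                var bytes = System.Text.Encoding.UTF8.GetBytes("hello");
                stream.Write(bytes, 0, bytes.Length);
            }
            // Nothing is visible outside the transaction until commit; if we
            // never commit, the file create and write are rolled back.
            CommitTransaction(tx);
        }
    }
}
```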