
Posts

Showing posts from 2010

WCF 4 - File Less Activation, Default Endpoints and Default Bindings

I started going through WCF 4 a few months back; however, it is only today that I got a chance to write something down about it. So let me talk about something that was really interesting when I first read about it: file-less activation of services. In WCF 3.5, when you wanted to host a service in IIS, you had to add endpoints, bindings and an .svc file so that IIS could pick the request up. In WCF 4 this has been simplified so much that you can get a service up and running in no time. Let's take an example: I created a service Service1 in a namespace DefaultEndpointSVC, and this is my web.config file.

<configuration>
  <system.serviceModel>
    <serviceHostingEnvironment>
      <serviceActivations>
        <add service="DefaultEndpointSVC.Service1" relativeAddress="myService.svc"/>
      </serviceActivations>
    </serviceHostingEnvironment>
  </system.serviceModel>
</configuration>

Now, if I go and deploy my service in IIS…

Running .NET code on a 64 bit Machine

I needed some information on how to develop .NET applications for 64-bit machines, so I was doing the usual, reading a little about it, and before I forget, I thought of posting it here for future reference. The main advantage of using a 64-bit machine over a 32-bit machine is that memory is not constrained to 4 GB; on a 32-bit machine, the highest address that the CPU understands is around 4 GB, but on a 64-bit machine the address space becomes much larger, on the order of 2^64. Now let's talk about running .NET code. Code that is 100% managed has no problem when it executes on a 64-bit machine, because the CLR takes care of this. Going back to basics, when you compile a piece of C# or VB.NET code, the compiler produces MSIL (Microsoft Intermediate Language) code that contains enough information for the CLR (Common Language Runtime) to start execution, such as metadata and the types used. When you start a program, the Windows loader peeks at the assembly and…
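
As a small aside (my own addition, not from the original post), here is a minimal sketch of how managed code can check at runtime whether it is running as a 32-bit or a 64-bit process; Environment.Is64BitProcess and Environment.Is64BitOperatingSystem are available from .NET 4.0 onwards.

using System;

class BitnessCheck
{
    static void Main()
    {
        // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process.
        Console.WriteLine("Pointer size: {0} bytes", IntPtr.Size);

        // These two properties were added in .NET 4.0.
        Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
        Console.WriteLine("64-bit OS: {0}", Environment.Is64BitOperatingSystem);
    }
}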

Nullable Value vs Casting.

I was talking to a team member half an hour ago, and he pointed out that he needed to change a line of code like this (isLocked is a nullable bool):

bool isEnabled = (bool)isLocked;

to

bool isEnabled = isLocked.Value;

I asked him why it needed to be done that way… and his answer was NO IDEA… So I thought of disassembling the IL to see what was really happening. This is the disassembled code I got from Reflector:

public struct Nullable<T> where T : struct
{
    private bool hasValue;
    internal T value;

    public Nullable(T value)
    {
        this.value = value;
        this.hasValue = true;
    }

    public bool HasValue
    {
        get { return this.hasValue; }
    }

    public T Value
    {
        get
        {
            if (!this.HasValue)
            {
                ThrowHelper.ThrowInvalidOperationException(ExceptionResource.InvalidOperation_NoValue);
            }
            return this.value;
        }
    }

    public T GetValueOrDefault()
    {
        return this.value;
    }
}

Digging into it a little more, doing a cast on a nullable generates IL…
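
As a small illustration (mine, not from the original post): both the explicit cast and the Value property end up going through the same check shown in the decompiled getter above, and both throw an InvalidOperationException when HasValue is false. A minimal sketch:

using System;

class NullableDemo
{
    static void Main()
    {
        bool? isLocked = null;

        try
        {
            bool viaCast = (bool)isLocked;      // the explicit conversion calls Nullable<bool>.Value
        }
        catch (InvalidOperationException ex)
        {
            Console.WriteLine("Cast: " + ex.Message);
        }

        try
        {
            bool viaValue = isLocked.Value;     // same check, same exception
        }
        catch (InvalidOperationException ex)
        {
            Console.WriteLine("Value: " + ex.Message);
        }
    }
}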

Parallel LINQ (PLINQ) - Intro

.NET 4.0 supports Parallel LINQ, or PLINQ; PLINQ is a parallel implementation of LINQ. PLINQ has the same characteristics as LINQ, in that it executes queries in a deferred manner. However, the main difference is that with PLINQ your data source gets partitioned and each chunk is processed by a different worker thread (taking into account the number of processor cores that you have), making your query execute much faster on certain occasions. Running a query in parallel is just a matter of calling the AsParallel() method on the data source; this returns a ParallelQuery<T> and your query will execute in parallel. Let's look at a code sample:

var query = from num in source.AsParallel()
            where num % 3 == 0
            select ProcessNumber(num);

Now, when this query is iterated over in a foreach loop, or when you call ToList() etc., the query will run on different worker threads. Although you have parallelized your query execution…
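
A minimal, self-contained version of the kind of query above (my own sketch; the ProcessNumber body is just a placeholder to give the worker threads something to chew on):

using System;
using System.Linq;

class PlinqIntro
{
    static int ProcessNumber(int n)
    {
        // Placeholder for some CPU-bound work on each element.
        return n * n;
    }

    static void Main()
    {
        int[] source = Enumerable.Range(1, 1000).ToArray();

        // AsParallel() partitions the source and runs the query on worker threads.
        var query = from num in source.AsParallel()
                    where num % 3 == 0
                    select ProcessNumber(num);

        // The query only executes when it is enumerated (deferred execution).
        foreach (int result in query)
        {
            Console.WriteLine(result);
        }
    }
}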

Implementing Asynchronous Callbacks with Task Parallel Library

Bored again... so I thought of posting how you can implement callbacks with the Task Parallel Library (TPL). So what am I talking about here? Basically, I start a task on one thread and I want it to call another method once it completes (an asynchronous callback). Here is a sample:

Task<int> parent = new Task<int>(() =>
{
    Console.WriteLine("In parent");
    return 100;
});

Task<int> child = parent.ContinueWith(a =>
{
    Console.WriteLine("In Child");
    return 19 + a.Result;
});

parent.Start();
Console.WriteLine(child.Result);

The code explains it all: all I have to do is create the task and then call its ContinueWith method to register the callback. It is important to note that the parent task is passed in as an input to the continuation callback, and the result of the parent can be accessed by…

Running Parallel Tasks with The Task Parallel Library

Down at home with conjunctivitis, it was boring, so I was listening to some old classics and then thought of writing a post on how you can run tasks with the Task Parallel Library (TPL). Going forward, Microsoft encourages developers to use the TPL for concurrent programming. In my previous post I talked about data parallelism, where I showed how blocks of work running inside a loop can be scheduled to run on different threads. In previous versions of .NET, if I wanted to execute a task on another thread I had to do this:

Thread thread = new Thread(() =>
{
    //Do some work
    Console.WriteLine("Starting thread");
});
thread.Start();

With the TPL I only do this:

Parallel.Invoke(() =>
{
    //Do some work
    Console.WriteLine("Starting thread");
}…
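
Since the excerpt cuts off mid-snippet, here is a small self-contained sketch (my own, in the spirit of the code above) of Parallel.Invoke running several actions concurrently:

using System;
using System.Threading;
using System.Threading.Tasks;

class ParallelInvokeDemo
{
    static void Main()
    {
        // Parallel.Invoke blocks until all of the supplied actions have completed.
        Parallel.Invoke(
            () => Console.WriteLine("Action 1 on thread {0}", Thread.CurrentThread.ManagedThreadId),
            () => Console.WriteLine("Action 2 on thread {0}", Thread.CurrentThread.ManagedThreadId),
            () => Console.WriteLine("Action 3 on thread {0}", Thread.CurrentThread.ManagedThreadId));

        Console.WriteLine("All actions finished");
    }
}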

ConcurrentBag<T> - Thread Safe Collections

.NET 4.0 introduces a new namespace, System.Collections.Concurrent; this namespace contains a set of collections that are very useful in threaded programming. The Add and Remove methods of List<T> are not thread safe, meaning that if you are adding or removing items from a list which is accessed from multiple threads, you can end up overwriting some of the items. So, in multi-threaded programming, you would need to lock the list before adding or removing items from it. The ConcurrentBag<T> in System.Collections.Concurrent is a thread-safe collection, i.e. all you have to do is call Add (or TryTake to remove) in the usual way and the collection will take care of adding the items without overwriting them. Here is an example using ConcurrentBag<T> with the Task Parallel Library:

ConcurrentBag<int> array = new ConcurrentBag<int>();
try
{
    Parallel.For(0, 100000, i =>
    {
        //Do some…
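
A complete, runnable sketch along the same lines (my own; the loop body is a placeholder):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ConcurrentBagDemo
{
    static void Main()
    {
        ConcurrentBag<int> bag = new ConcurrentBag<int>();

        // The iterations run on different worker threads, all adding to the same bag.
        Parallel.For(0, 100000, i =>
        {
            bag.Add(i);
        });

        // No items are lost, which would not be guaranteed with a plain List<int>.
        Console.WriteLine("Items in bag: {0}", bag.Count);
    }
}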

Task Parallel Library - Refresher - Stop an Iteration

I talked about the Task Parallel Library (TPL) two years back, when it was in CTP. I was taking a class on High Performance Computing yesterday, and I realised that I had forgotten all about this library :). The library has now been officially released with .NET 4.0, and Microsoft recommends that you use it where possible when writing concurrent programs, so that your program can take maximum advantage of the number of processors that you have. I thought of posting some sample code as a refresher. Here is the example: I have a list of Customer objects and I need to get the object that matches a specific criterion, let's say the Name property should be "F". Here is how my Customer object looks:

public class Customer
{
    public int ID { get; set; }
    public string Name { get; set; }
    public int Age { get; set; }
}

Let's assume that the Name property is unique. If I were to write the algorithm for this in .NET 1.1 or…
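
Since the excerpt cuts off before the parallel version, here is a minimal sketch (my own reconstruction, with made-up customer data) of the technique the title refers to: using the ParallelLoopState passed into Parallel.ForEach to stop the remaining iterations once a match is found.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class StopIterationDemo
{
    // Mirrors the Customer class shown above.
    class Customer
    {
        public int ID { get; set; }
        public string Name { get; set; }
        public int Age { get; set; }
    }

    static void Main()
    {
        var customers = new List<Customer>
        {
            new Customer { ID = 1, Name = "A", Age = 30 },
            new Customer { ID = 2, Name = "F", Age = 25 },
            new Customer { ID = 3, Name = "Z", Age = 40 }
        };

        Customer match = null;

        Parallel.ForEach(customers, (customer, loopState) =>
        {
            if (customer.Name == "F")
            {
                match = customer;
                // Tell the loop not to start any further iterations.
                loopState.Stop();
            }
        });

        Console.WriteLine(match != null ? "Found ID " + match.ID : "Not found");
    }
}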

HashSet<T> vs .Net 4.0 SortedSet<T>

HashSet<T> has been around for a while; it stores objects in such a way that the time to add an item to the set, remove it, or search for an item is O(1), constant time. It uses a hash-based implementation to achieve constant time for these operations. However, when you want to iterate the collection in sorted order, the operation is expensive: the values within the HashSet are not sorted, so you need to create a sorted collection first and then iterate that. Because the items are stored based on their hash, the sort is costly and a new collection has to be created as well. This is where SortedSet<T> comes into play. This collection type was introduced in .NET 4.0; when you add an item to the collection, the item is placed according to the sort criteria, so when you need to iterate the collection in sorted order it is much faster than a HashSet. SortedSet has its own cons: now that the items have to be placed in…
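
A small sketch of the difference described above (my own example): iterating a HashSet in sorted order means sorting it first, whereas a SortedSet enumerates in order directly.

using System;
using System.Collections.Generic;
using System.Linq;

class SetComparison
{
    static void Main()
    {
        var hashSet = new HashSet<int> { 5, 1, 9, 3 };
        var sortedSet = new SortedSet<int> { 5, 1, 9, 3 };

        // HashSet: enumeration order is undefined; sorted output needs an extra sort pass.
        foreach (int value in hashSet.OrderBy(v => v))
            Console.Write(value + " ");      // 1 3 5 9, but only after an O(n log n) sort
        Console.WriteLine();

        // SortedSet: items are kept in sort order as they are added.
        foreach (int value in sortedSet)
            Console.Write(value + " ");      // 1 3 5 9 directly
        Console.WriteLine();
    }
}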

ASP.NET Localization - Implicit Resource Assignment

In one of my previous posts I talked about the basics of ASP.NET localization and discussed two ways in which resources can be assigned to ASP.NET pages and controls. In this post I will talk about a powerful ASP.NET feature that can be used to assign resources implicitly. This feature can be used to assign multiple resources to a control in one shot, without assigning each resource individually. Let's take an example to demonstrate this: I am going to place a simple Button control on a page. Now I want to assign three resources: the Text, the colour and the tooltip of the button. This is my resource file, with values assigned for all the fields that I require. Note how the keys are named: a prefix, followed by a period, followed by the property of the control that I need to assign the resource to; for example, "button" is my prefix, followed by a period and then the "Text" property. Now, in my ASPX page all I have to do is this. Note that…

Satellite Assemblies and Strong Names

A good friend of mine from another project was telling me how a satellite assembly he created for a given culture, for an ASP.NET server control project, did not get picked up for that culture, and the default resources were used instead. I was curious as to why this happened, and it was today that it hit me: their project is signed with a strong name key, and he did not sign the satellite assembly with the same key. In other words, if your main assembly is signed, then all your satellite assemblies need to be signed with the same key; if not, loading of that satellite assembly will fail and the runtime will fall back to the default resource bundle contained within the main assembly.

ASP.NET Localization

I have written some blog posts on globalization before, but they all had to do with desktop applications or general best practices. This week I was interested in localizing ASP.NET applications (well, I was forced to :)). So, let me start by giving an introduction to ASP.NET resource assignment features. Basically it's the same .NET concept: you create resource files, compile them into satellite assemblies and link them to the main DLL, and the resource assignment happens through the resource manager. Although you can compile your resources before deploying, you can also take advantage of an ASP.NET folder named App_LocalResources: you can just place all your resource files within this folder and ASP.NET will automatically compile them and create the satellite assemblies. The screenshot on the left shows my solution. Here I have added the resource files for Local.aspx into the App_LocalResources folder; I have added three different resource files, one for the…
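
As a side note (my own addition, not from the excerpt): besides the automatic assignment discussed in the next post, resources compiled from App_LocalResources can also be fetched explicitly in the code-behind, roughly like this (the control and key names are made up):

using System;

public partial class Local : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Looks up the value in the page's local resource file
        // (e.g. Local.aspx.resx / Local.aspx.fr.resx) for the current UI culture.
        Button1.Text = (string)GetLocalResourceObject("Button1.Text");
    }
}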

XQuery - Projecting Attributes With a Where Clause in a Query

I had to write an XQuery yesterday to extract some attribute values from an XML column, so I thought of posting it here for future reference. This is the structure of my XML data stored in the column xmlColumn, and these are the records that I have in my table. What I need is to select the Id column, a normal table column, and then the "value" attributes from the XML column whose "SiteText" attribute is equal to 1. This is the query, written in two different ways:

select id, xmlColumn.query('data(AuditRecord/Records/Record[@SiteText=1]/@value)') from temp

select id, xmlColumn.value('data(AuditRecord/Records/Record[@SiteText=1]/@value)[1]', 'varchar(50)') from temp

and this is the result set I get.

Validating XML with an XSD using LINQ

So how do you validate an XML document against a particular XSD? One way to do it is with LINQ to XML, using these steps: 1) create an XmlSchemaSet, 2) add the schemas that your document needs to conform to, 3) call the Validate method on the XDocument. Here is the example code:

//Load an xml document
XDocument doc = XDocument.Load(@"C:\test.xml");

//Define the schema set
XmlSchemaSet set = new XmlSchemaSet();

//Add the schema to the set
set.Add(string.Empty, @"C:\test.xsd");

//A variable to track whether our document is valid
bool isValid = true;

//Call the Validate method
//Note that I am using a lambda; alternatively you can pass in a delegate method in the traditional way :)
doc.Validate(set, (o, e) => { isValid = false; });

//Print out the result
Console.WriteLine(isValid);

Exposing Enum Types From WCF

There are many cases where you would want to expose enum types from a WCF service, so how do you do this? Simply create a data contract and mark the values with the EnumMember attribute, and that's it:

[DataContract]
public enum MyEnumName
{
    [EnumMember]
    CSV,
    [EnumMember]
    XML = 2
}
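
To put the enum in context, here is a small sketch (my own, with made-up contract names) of how such an enum typically ends up being used from other contracts:

using System.Runtime.Serialization;
using System.ServiceModel;

// Hypothetical contracts showing the enum above in use.
[DataContract]
public class ExportRequest
{
    [DataMember]
    public MyEnumName Format { get; set; }   // MyEnumName is the enum data contract defined above
}

[ServiceContract]
public interface IExportService
{
    [OperationContract]
    string Export(ExportRequest request);
}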

The maximum nametable character count quota (16384) has been exceeded

Some of our services were growing, and the other day one of them hit the quota: I could not update the service references, nor was I able to run the WCF Test Client. An error is displayed saying "The maximum nametable character count quota (16384) has been exceeded". The problem was with the mex endpoint, where the XML that was sent back was too much for the client to handle; this can be fixed by doing the following. Just paste the lines below within the configuration section of the devenv.exe.config and svcutil.exe.config files, found at C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE and C:\Program Files\Microsoft SDKs\Windows\v6.0A\bin respectively. Restart IIS and you are done. The detailed error that you get is the following: Error: Cannot obtain Metadata from net.tcp://localhost:8731/ Services/SecurityManager/mex If this is a Windows (R) Communication Foundation service to which you have access, please check that you have enabled metadata publishing at the specified address…

Configuring mexConnections with net.tcp binding

This is one of those issues that did not explain itself to me, and I could not find any help on MSDN, so I thought of sharing it on the blog for future use. I have a WCF library that is hosted in IIS; after some time we wanted to increase the maxConnections property of the service from the default value of 10 to 100, and also change the listenBacklog value to the same (if you don't know what these attributes do, a bit of googling would help :)). The service was configured with two endpoints, a net.tcp endpoint and a metadata exchange endpoint using mexTcpBinding. I also configured a base address; part of the configuration file looks like this: Host it in IIS 7 and the service does not start at all, throwing up an error like this: There is no compatible TransportManager found for URI 'net.tcp://ct-svr:8731/AuthorisationManager/Services.Security.AuthorisationManage…

Response.Redirect vs PostBackUrl

I normally use Response.Redirect to navigate from page to page. Someone told me the other day that it would be better to use the PostBackUrl of a control to redirect to a page than to use Response.Redirect. So I ran a little test of my own: I created two sample pages, where on a button click I do a Response.Redirect like this:

protected void Button1_Click(object sender, EventArgs e)
{
    Response.Redirect("Advance.aspx");
}

Next I ran Fiddler; this is the result I got on the button click. The response I get back from the server is not the content of the page I want, but this:

HTTP/1.1 302 Found
Server: ASP.NET Development Server/10.0.0.0
Date: Mon, 10 May 2010 16:17:57 GMT
X-AspNet-Version: 4.0.30319
Location: /Advance.aspx
Cache-Control: private
Content-Type: text/html; charset=utf-8
Content-Length: 130
Connection: Close

Object moved to here.

(The "here" link in the response body points to the page you want to navigate to; I have to live with this HTML formatting :)) The server issues a 302 and the browser issues another…
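
For comparison (my own sketch, not from the excerpt): the PostBackUrl approach posts the form directly to the target page, avoiding the extra 302 round trip shown above. The property can be set in markup or in the code-behind, for example:

using System;

public partial class Basic : System.Web.UI.Page   // hypothetical page name
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Cross-page posting: the button posts straight to Advance.aspx,
        // so there is no intermediate 302 redirect response.
        Button1.PostBackUrl = "~/Advance.aspx";
    }
}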

Passing in parameters into OPENQUERY

I was struggling for some time, trying to pass parameters into OPENQUERY. OPENQUERY does not support passing in parameters; all it takes as the second argument is a string literal, like this:

SELECT * FROM OPENQUERY(LINKSERVER_NAME, 'SELECT * FROM COUNTRY WHERE COUNTRYID = 10')

What's worse, it does not even support passing a varchar variable as the second argument. So, if you want to pass in parameters, one of your options is building a dynamic query and executing it, like this:

declare @var int = 10
declare @query varchar(max) = 'select * from openquery(Test_link,' + '''' + 'SELECT * FROM dbo.TEMP where ID > ' + CAST(@var AS VARCHAR(MAX)) + '''' + ')'
execute(@query)

If you want to use the result returned by OPENQUERY, for example to join it to another source table, you would have to create a table variable, populate it with the result and then do the join; an illustration would be like this (hypothetical…

Generate from Usage - creating stubs first in VS 2010

A feature that everyone loved in VS 2005 is that you can just type in your class name, press Alt+Shift+F10, and Visual Studio shows you the options to add the using statements. In VS 2010 this has been extended to create class stubs for you while you code. For example, if I want to use a class called Person in my code and the Person class does not exist yet in my code base, all I have to do is type the following:

Person p = new Person();

Press Alt+Shift+F10 and it shows an option asking whether it should create the stub Person; if you say yes, VS will create the Person class for you in your current project. Next, when you need to add a property to the Person class, all you have to do is:

p.Name = "Jack";

Press Alt+Shift+F10 again and tell VS to create the property; now in the Person class you will see the Name property. You can do the same to generate method stubs as well.

Web Protection Library and XSS attacks

I started using the AntiXSS library that comes with the Microsoft Web Protection Library (WPL); this library gives you several APIs that can be used to protect your application from cross-site scripting (XSS). So, what is an XSS attack? This is where a malicious user injects script into a web page, and the script runs as part of the response from the compromised web site when other users view that page. As a simple example, let's say a malicious user enters a comment on a blog post and appends a script that displays an alert box 1000 times. Now every time a user views the comment section of that particular post, the script executes. An extreme scenario would be one where the injected script passes the session id or a cookie of the logged-in user to the attacker's web site. In both these cases, the attack could have been prevented if the input from the comment text box had been scanned for malicious content before saving it to the data source. WPL supports APIs that…
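
A minimal sketch of the idea (my own example; it assumes the AntiXSS assembly from WPL is referenced, where the encoding class is Microsoft.Security.Application.AntiXss in early releases and Encoder in later ones):

using System;
using Microsoft.Security.Application;

class CommentSanitizer
{
    static string SanitizeComment(string rawComment)
    {
        // HTML-encode the user input so any injected <script> tags are rendered
        // as harmless text instead of being executed in other users' browsers.
        return Encoder.HtmlEncode(rawComment);
    }

    static void Main()
    {
        string malicious = "<script>alert('xss');</script>Nice post!";
        Console.WriteLine(SanitizeComment(malicious));
    }
}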

Linked Servers and OPENQUERY

So how do you access data in one database from another? (I am specifically talking about SQL Server.) One possible way is to create a Linked Server to the database you want to connect to. The easiest way to do this is to go to Management Studio, right-click on "Server Objects" and click "New Linked Server". Now, on the "New Linked Server" dialog, you have two ways to create a Linked Server: 1) create a Linked Server pointing directly to a SQL Server instance, or 2) create a Linked Server pointing to one of a set of supported data sources, which includes OLE DB data sources and also SQL Server. I prefer the second option for creating a linked server to SQL Server; although I can do this easily with the first option, the second gives me the flexibility to change my data source without affecting any objects that use it. For example, if you use option one, you can access an object (e.g. an SP) in the other database by using a four-part name, as…

Still no Workflow support in Visual Studio 2010

I was trying out Visual Studio 2010, but was disappointed to find that they have not included the Workflow templates in the Express edition (they did not in the 2008 Express edition either). However, you can still create workflows, as WF is part of .NET 4.0, but with a lot of pain: you have to manually code whatever the designer does for you in the other editions! Well, this is where I start hating Microsoft :)

Workflows and MS WF

Yesterday I completed my training on Windows Workflow Foundation. In my opinion it is a new technology that provides you with a set of good features. However, a question that kept racing through my mind was: why do you need workflows at all? Almost all programs that we write have activities and business logic within them, so what is the big benefit of using a framework to model workflows? These were some of the pointers the trainer suggested for deciding whether you need to go for workflows:

1) Activities can be clearly identified, with boundaries
2) State machine workflows are a candidate if the logic is push based and not pull based
3) Rules need to be customized externally, without rebuilding the system
4) You have a large amount of human interaction (to determine if you need to use state machines)
5) Long-running processes that can run asynchronously without user intervention

On the other hand, MS Workflow Foundation has its own advantages:

1) Cool designer support
2) Custom…