Category Archives: Technology

Dynamics CRM Integration using Azure Service Bus – Part I – using Service Bus Relays

This is the first part of a series of posts about various options for integrating Dynamics CRM with external applications/services. I will update this space with links to the other posts in the series once I publish them.

Since the release of Dynamics CRM 2011, Dynamics CRM has had first-class support for talking to Azure Service Bus. Let us look at some of the options for making Dynamics CRM talk to external applications through Azure Service Bus. In short, we are going to look at how to make CRM talk to external systems using Azure Service Bus Relays, Queues, and Topics & Subscriptions.

Pre-requisites

You understand Azure Service Bus Namespaces, Relays, Queues, Topics and Subscriptions

Hmmm, this is hard to explain. Distributed applications using messaging patterns is a broad topic on which numerous books have been written, software vendors make a lot of money trying to make it “look” easier, and, if I am not mistaken, even research papers have been published. While it is difficult to explain Azure Service Bus or distributed messaging patterns in 15 minutes, if you are completely new to Azure Service Bus, this short Channel 9 video should help you get started.

The Azure Service Bus namespace that you intend to use should have an associated Access Control Service

Until recently, an Azure Service Bus namespace you created was automatically associated with an Access Control Service. However, recent changes to the Azure portal mean the ACS is no longer created when you create a Service Bus namespace from the web portal. You can instead use PowerShell to create the namespace and ask Azure to create an associated Access Control Service (if I remember correctly, the New-AzureSBNamespace cmdlet has a -CreateACSNamespace switch for exactly this).

Dynamics CRM integration using Service Bus Relays

An integration solution that leverages Azure Service Bus Relays has two main parts. The service you expose to Azure Service Bus through a Relay should implement either the IServiceEndpointPlugin or the ITwoWayServiceEndpointPlugin interface. Both interfaces have a single method, Execute, accepting a RemoteExecutionContext as the sole parameter. As the interface names imply, the IServiceEndpointPlugin Execute method is one-way and does not return anything, while the ITwoWayServiceEndpointPlugin Execute method returns a string value. The other part of the solution lies in CRM's ability to talk to this service through Azure Service Bus. This is where things get easier, as you can leverage the out-of-box capabilities exposed by CRM for talking to Azure Service Bus.

There are two ways you can make Dynamics CRM talk to an external service exposed using an Azure Service Bus Relay. One obvious choice is to make your plugin talk to Azure Service Bus. However, you can also achieve this without writing any code in Dynamics CRM.

Exposing a CRM Listener service using Service Relay

Exposing your CRM listener WCF service to Azure is just the same as exposing any other WCF service to Azure using Service Relay, except that the service contract you expose should be either IServiceEndpointPlugin or ITwoWayServiceEndpointPlugin, as shown below.
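Here is a minimal sketch of such a listener; the CrmListener class name matches the hosting code further below, and the body is only illustrative:

using System;
using Microsoft.Xrm.Sdk;

public class CrmListener : IServiceEndpointPlugin
{
    // One-way contract: Execute receives the execution context posted by CRM
    // and returns nothing.
    public void Execute(RemoteExecutionContext context)
    {
        // Call your ERP system and update/insert records here.

        // Print the organization name from the context for demonstration.
        Console.WriteLine("Received context from organization: {0}",
            context.OrganizationName);
    }
}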

It should be obvious from the above code snippet that the entire logic for this service sits inside the Execute method. For example, as indicated by the comment in the snippet, you may want to write some code to call your ERP system and update or insert some records. For whatever processing you do inside the Execute method, the RemoteExecutionContext parameter provides an implementation of the IPluginExecutionContext interface you would normally use inside a traditional CRM plug-in. This means you have access to the Pre and Post entity images and their attributes. The above snippet simply prints the organization name from the RemoteExecutionContext parameter for demonstration.

Once you have the service implementation defined, it is time to expose the service to Azure using Service Relay. The below code snippet should work for any WCF service; there is nothing CRM-specific here other than the use of the IServiceEndpointPlugin interface. The snippet self-hosts the service in a console application, but you can always use IIS or any other host.

// Service Bus details
ServiceBusEnvironment.SystemConnectivity.Mode = ConnectivityMode.Http;
string servicenamespace = "";
string issuerName = "";
string issuerSecret = "";

Uri address = ServiceBusEnvironment.CreateServiceUri(Uri.UriSchemeHttps, servicenamespace,
    "");

TransportClientEndpointBehavior servicebusbehavior = new TransportClientEndpointBehavior();
servicebusbehavior.TokenProvider = TokenProvider.CreateSharedSecretTokenProvider(issuerName, issuerSecret);

//Configure binding
WS2007HttpRelayBinding binding = new WS2007HttpRelayBinding();
binding.Security.Mode = EndToEndSecurityMode.Transport;

//Configure host
ServiceHost host = new ServiceHost(typeof(CrmListener));
host.AddServiceEndpoint(typeof(IServiceEndpointPlugin), binding, address);

//Configure endpoints
IEndpointBehavior serviceRegistrySettings = new ServiceRegistrySettings(DiscoveryType.Public);
foreach (ServiceEndpoint endpoint in host.Description.Endpoints)
{
    endpoint.EndpointBehaviors.Add(serviceRegistrySettings);
    endpoint.EndpointBehaviors.Add(servicebusbehavior);
}

try
{
    //Open Host
    host.Open();
}
catch (TimeoutException ex)
{
    Console.WriteLine("Opening of service Timed out");
}
Console.ReadLine();

Once you run the host application, you can see the service endpoint appearing in the Azure portal under Service Bus –> Relays, as shown below. Please note the name “IntegrationTest”; this is the name of the endpoint of your service to which Dynamics CRM will talk. You configure this endpoint while creating the Service Bus URI.

Uri address = ServiceBusEnvironment.CreateServiceUri(Uri.UriSchemeHttps, servicenamespace,
                                                    "IntegrationTest");

[screenshot: azurerelay]

Configuring CRM to talk to the service through Azure Service Bus

Once we have the service ready (Service Relay, to be technically correct), we have to let CRM know how to talk to this service through Azure Service Bus. This is done by registering a service endpoint using the Plugin Registration Tool. However, before starting, head over to CRM Online, go to Settings->Customizations->Developer Resources and download the Windows Azure Service Bus Issuer Certificate. You will need this certificate while registering the service endpoint.

[screenshot: crmcertificate]

When you register a service endpoint, the important things to consider are the Path and the Contract. The Path should exactly match the name of the endpoint you configured for the Service Relay; in our example, it is “IntegrationTest”. Since our service implements the one-way IServiceEndpointPlugin interface, you should select the Contract as “OneWay”.

[screenshot: service endpoint registration]

After entering these values, click the Save & Configure ACS button and the tool will prompt you for the certificate and the issuer name. Browse to the certificate you downloaded earlier and give the issuer name as “crm.dynamics.com”. Once you click Configure ACS and save the settings, the Plugin Registration Tool will automatically create the appropriate service identity settings for your Service Bus namespace in Azure. Once this configuration is complete, you can head over to the Azure management portal and verify the settings under the Access Control Service portal for your namespace.

[screenshot: acsserviceidentity]

The next step is to register a step under the service endpoint registration you have just created, using the Plugin Registration Tool. Select the service endpoint and, from the Register menu, click the Register New Step menu item. In the below screen I am registering a step to post the execution context to the registered service endpoint whenever an opportunity is updated.

[screenshot: registernewstep]

Please note that Dynamics CRM will not allow you to select the Execution Mode as Synchronous. You can test the integration by performing an update operation on the opportunity entity; make sure your service is running before you update it. If your service is not hit, or you get any error while updating the opportunity, you can check the status of the integration in Dynamics CRM under Settings –> System Jobs.

Writing a two-way service

From the above example it should be clear that if you use the out-of-box CRM support for communicating with Azure Service Bus, you are following a fire-and-forget style of communication with your service. What if you want to do some processing with the return value from the external service in Dynamics CRM? On the service side, you have to implement the ITwoWayServiceEndpointPlugin interface. The following code shows a service which returns a simple string to the caller.
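Here is a minimal sketch of such a two-way listener; the CrmTwoWayListener class name matches the hosting code below, and the returned string is only illustrative:

using Microsoft.Xrm.Sdk;

public class CrmTwoWayListener : ITwoWayServiceEndpointPlugin
{
    // Two-way contract: whatever string you return here is handed back
    // to the calling CRM plugin.
    public string Execute(RemoteExecutionContext context)
    {
        return string.Format("Processed {0} on {1}",
            context.MessageName, context.PrimaryEntityName);
    }
}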

In order to host the service, I have avoided the lengthy WCF configuration code we wrote in the previous sample and moved all the configuration to the App.config file.

ServiceHost host = null;
try
{
    host = new ServiceHost(typeof(CrmTwoWayListener));
    host.Open();
    Console.WriteLine("Service Started");
    Console.WriteLine("Press Enter to stop the service");
    Console.ReadLine();
}
catch (Exception ex)
{
    Console.WriteLine(ex);
}
finally
{
    if (host != null) host.Close();
}
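For reference, the relay configuration moved into App.config might look roughly like the sketch below. This is an assumption on my part, not the exact file from the sample: the endpoint address, the issuer credentials and the endpoint name "IntegrationTest2" are placeholders, and the ws2007HttpRelayBinding binding and transportClientEndpointBehavior behavior extensions come with the Service Bus SDK.

<system.serviceModel>
  <services>
    <service name="CrmTwoWayListener">
      <endpoint address="https://yournamespace.servicebus.windows.net/IntegrationTest2"
                binding="ws2007HttpRelayBinding"
                contract="Microsoft.Xrm.Sdk.ITwoWayServiceEndpointPlugin"
                behaviorConfiguration="sbTokenProvider" />
    </service>
  </services>
  <behaviors>
    <endpointBehaviors>
      <behavior name="sbTokenProvider">
        <transportClientEndpointBehavior>
          <tokenProvider>
            <sharedSecret issuerName="owner" issuerSecret="yourIssuerKey" />
          </tokenProvider>
        </transportClientEndpointBehavior>
      </behavior>
    </endpointBehaviors>
  </behaviors>
</system.serviceModel>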

As before, when you run your service you can see it appearing under Relays in the Azure portal, under your Service Bus namespace.

Once the service is up and running, you have to create a plugin that talks to it using the SDK. This is relatively straightforward. The IServiceProvider container passed to the Execute method of the IPlugin interface contains an instance of IServiceEndpointNotificationService, which has a single method, Execute. This Execute method expects you to pass an entity reference to the “serviceendpoint” system entity corresponding to the service endpoint you registered in the Plugin Registration Tool. This means you need the unique id of the service endpoint registration record, which you can grab from the Plugin Registration Tool and pass as a configuration string to the plugin’s constructor, as shown in the code snippet below.

public class ContactUpdatePlugin : IPlugin
{
    private Guid serviceendpointid;
    public ContactUpdatePlugin(string config)
    {
        if(string.IsNullOrEmpty(config) || !Guid.TryParse(config,out serviceendpointid)){
            throw new InvalidPluginExecutionException("Invalid Plugin Configuration");
        }
    }
    public void Execute(IServiceProvider serviceProvider)
    {
        IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        ITracingService trace = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
        IServiceEndpointNotificationService azureservice = (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

        Entity entity = (Entity)context.InputParameters["Target"];
        if(entity == null) throw new InvalidPluginExecutionException("Target Entity is null");

        IOrganizationService orgservice = (IOrganizationService)serviceProvider.GetService(typeof(IOrganizationService));

        if (orgservice == null)
        {
            IOrganizationServiceFactory serviceFactory =  (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            if (serviceFactory == null) throw new InvalidPluginExecutionException("org service factory is null");
            orgservice = serviceFactory.CreateOrganizationService(context.UserId);
            if (orgservice == null) throw new InvalidPluginExecutionException("org service is null");
        }

        string response = null;
        try
        {
            response = azureservice.Execute(new EntityReference("serviceendpoint", serviceendpointid), context);

        }
        catch(Exception ex)
        {
            trace.Trace("Exception while calling external service {0}", ex.ToString());
            response = ex.Message;
        }
        try
        {

            var serviceContext = new OrganizationServiceContext(orgservice);
            if (serviceContext == null) throw new InvalidPluginExecutionException("Service Context is null");
            Entity post = new Entity("post");
            post["regardingobjectid"] = new EntityReference(entity.LogicalName, entity.Id);
            post["text"] = response;
            post["source"] = new OptionSetValue(1);
            post["type"] = new OptionSetValue(7);

            serviceContext.AddObject(post);
            serviceContext.SaveChanges();
        }
        catch (Exception ex)
        {
            if (ex.InnerException != null) throw new InvalidPluginExecutionException("Error updating post \n" + ex.InnerException.Message);
            throw new InvalidPluginExecutionException("Error updating post \n" + ex.Message);
        }
    }
}

To make things a little more interesting, I am creating a post entity with the return value from the service and attaching it to the original contact entity, so that we can verify the results side by side. Even if the external service returns an exception, it will appear in the post. Registering this plugin is the same as registering any other plugin, except that you have to remember to update the secure and unsecure configuration strings with the unique id of the service endpoint registration. Please note that Dynamics CRM will allow you to register this as a synchronous plugin, which means that your operation will not complete until the external service returns or the plugin times out.

[screenshot: aftercontactupdate]

I will cover Dynamics CRM integration using Azure Service Bus Queues and Topics in the coming posts in this series.

procmon : fix “Another version of the Process Monitor driver is already loaded. A reboot is required to run this version” error

I unfortunately had multiple copies of procmon on my machine, and after running an older version, any attempt to run a newer version resulted in an error message saying “Another version of the Process Monitor driver is already loaded. A reboot is required to run this version”. I tried running procmon /terminate, to no avail.

The solution is simple. Navigate to the HKCU\Software\Sysinternals\Process Monitor registry key. Delete the Process Monitor node, or delete all the keys and values under it. Before doing this you may want to make sure procmon is not running; running the procmon /terminate command is a good option to consider. Restart the machine and you should be able to run the new version of procmon.

[screenshot: registry]

The reason for the error is that the procmon*.sys driver remains loaded by the OS kernel after you first run procmon.exe. This looks like a bug, because newer versions of procmon.exe don’t seem to do this. You can find out whether the driver is loaded by the kernel by running the procexp tool.

[screenshot: procexp]

From the above image you can see that the PROCMON23.SYS driver is loaded by the System process. If you would like to check this in procexp, make sure you enable the lower pane by pressing Ctrl+L, and set the lower pane view to DLLs using Ctrl+D. Alternatively, you can perform both these operations from the View menu in procexp.

Hope this helps!!!

Installing Windows Live Writer 2012 on Windows 10 Technical Preview

 

Windows Live Writer is my primary blog writing tool. I have tried many alternatives, including Scribe (the Firefox add-on), various online blog editors and various desktop ones. However, I kept going back to Live Writer and kept installing it on every machine I use. But installing Live Writer on the recent Windows 10 Technical Preview build 10049 was nothing less than nightmarish. Once you start the installation, at around 30% the machine hangs. None of the control keys work, and I had to power down the machine by keeping the power button pressed. Surprisingly, no log files or crash dumps were written by the installer. I tried capturing a crash dump using procdump, without any luck.

I captured the log file by running the installer from the command line with the /log:<log file name> switch. From the log file I could see that the machine hangs at the point where it tries to verify the signature of a downloaded file.

[screenshot: writererror]

The Windows Live Writer installation downloads a bunch of files which are Windows updates, and those updates don’t seem to work well with the Windows 10 Technical Preview. I poked around the command-line switches and found that there is an option to disable checking for updates. Below is the command line which finally worked for me.

wlsetup-all.exe /AppSelect:Writer /q /log:C:\temp\Writer.Log /noMU /noHomepage /noSearch

/noMU is the switch that disables checking for Windows updates.

Outsourcing, software development and pimping

The below images represent two of the many situations where your life depends on software.

[screenshots: gs_screen, heart_respiration_o2_monitor]

Image courtesy: Google Images

Now imagine you knew how this software was created, and how many bugs were ‘punted’. Going a step further, imagine you knew the company which developed the software and their outsourcing partner in India. If you are a software engineer or somebody working in India’s software outsourcing industry, your throat might have become a little dry, and you might have gasped for a bit of extra air.

Let us imagine a fictitious scenario. Company XYZ, which is developing the heart rate monitor, decides to outsource the development of the software component to one of the top companies in India. They invite the business development manager (position or title is immaterial; his/her job is to bring in more work, but let us call him BDM for brevity) and brief him about their requirement. He comes out of the meeting understanding the requirements (at least he understands “I am going to bring some business to my company”) and discusses the opportunity with his counterpart in India. Let us call this guy Mr. Sensible (for no apparent reason).

Imagine this fictitious conversation between the business development manager and the sensible guy and his team in India (most of them checking WhatsApp or Facebook on their phones during the call).

“Hey guys, we have this golden opportunity, Company XYZ is looking for outsourcing their product development to us.”

“That’s great news”

“Yeah (thinking: hey, I have met my target :)). They are looking for around 100 people who have C++ knowledge”

“What, C++? We don’t have anybody who knows C++, and it is very hard to find people who know C++”

“That is OK, we will train them in C++”

“That is a good option, but they might not be able to develop such a complex piece of software”

“Come on, what complexity are you talking about? (thinking: this idiot will not let me meet my number and go on vacation) I was also a programmer 20 years back. I developed this software…” (blah… blah… he unleashes the verbiage he has repeated 200 times).

“I feel it is too risky; this is mission-critical software, people’s lives will depend on the code that we write”

“OK, I understand your point (thinking: there is no way I can understand what you are saying). Let me talk to your boss and my boss and let them decide”

Since there was a deadlock, the issue was escalated to the so-called LT (leadership team) and a high-level meeting was called. The bosses made it very clear: we have to get it done, at any cost. Cost? Which could be people dying because of the code you write?

“Come on, we do test our code, don’t we? This company has strict quality procedures. And company XYZ will also have procedures to ensure quality.”

And the sensible guy was given one last piece of advice: “There are two kinds of people, people who find solutions for every problem and people who find problems in every solution”

Eventually the outsourcing company won the contract, they hired 100 C++ developers in India (don’t ask me how) and developed the software for the heart rate monitoring device, the one you are now looking at from your hospital bed. The sales guy met his target and went on vacation, the bosses bought new cars, and Mr. Sensible was given a rating of 4 (5 being “get out of the company now”) with the comment that “he is not flexible”.

Outsourcing and call centers

Outsourced call centers are another place where similar conversations might happen. I am not sure whether you have read about the @N incident; you can read a detailed account here. In summary, a hacker duped a call center agent, stole the credit card information of the owner of the @N Twitter handle (from GoDaddy or PayPal), and asked him to release his Twitter handle (which was worth $50,000 at the time) in exchange for giving back the changed GoDaddy password. The call center agent at GoDaddy (or PayPal), during the verification process, let the hacker guess the last four digits of the credit card number by trying random numbers.

 

The moment I read about the incident, the first thing that came to my mind was: hmm, that could have happened through an outsourced call center, most likely in India. Why? The call center agent has zero accountability to the company (GoDaddy or its customer). He/she might have been working the night shift and just wanted to finish this damn job, go back home, and sleep. Does he/she really worry about what consequences his/her action might have caused? Does he/she have to? Forget it; does he/she even know what happened? Most likely, no.

I suggest we change the name of outsourcing to pimping; that is the closest analogy I can think of. Both involve two common factors: screwing and accountability. You screw customers, you screw developers’ work-life balance and personal life, and there is no accountability.

Architecting Social Applications

 

I still remember this conversation. It was an office pantry conversation with one of my colleagues, who was also my classmate. I was telling him about an Orkut group for our alma mater. Yes, this happened a long time back, when Orkut was still alive. For those who do not know about Orkut, think of it as an earlier, lame-duck version of Google+ (pun intended). The next day I met him again and asked him why he was not in the group yet. He replied that he had tried to join the group, after joining Orkut, but it was too difficult and he gave up. Fast forward a few years, and I was connected with the same friend on Facebook. He was so active on Facebook that I had to unsubscribe from his notifications, because my newsfeed was crowded with status updates on what he did in the morning, afternoon, evening and night, and his thoughts on various current affairs, blah, blah, blah. Since I had unsubscribed from his notifications in my Facebook newsfeed, I did not know he had decided to relocate until one of our common friends told me about it. Well, I wrote about this incident not to substantiate how popular Facebook is while other social networking platforms, including Google+, are dying. Instead I wanted to highlight the simplicity and seamlessness of Facebook, which made my friend “abuse” it (by pouring out status messages and making others’ newsfeeds unreadable).

Well, this post is not about comparing social networks. I wrote about the above incident to highlight some of the things Facebook has done extremely well in architecting their application, which I feel can, and should, be considered while you are architecting your application or product.

Oh, Social !!!

I don’t have to say this, but you might have already realized that “Social” is the new buzzword in the tech industry. “We are going social”, “we are becoming more social”: these are common statements in press releases associated with a new product or a new version. Customers always ask for social integration in their line-of-business applications, for obvious reasons. Social networks have become another channel, and a powerful one, for your customers to interact with your company, generate leads, market your product, etc. While social network integration is one of the powerful influences on application architecture, I feel it is only one of several new ideas social networks have brought in that can influence application design or architecture.

Social graph

The Social Graph is one of the most important components that powers Facebook. In layman’s terms, they have tried to replicate the relationships between you, your friends, the movies you watch, the celebrities you like, the photos you posted, the places you visited, the milestones in your life and many other things in a system where all of these are connected like nodes in a graph. This makes it easy to find interesting correlations, important findings and surprising relationships by traversing this enormous graph. I remember reading a Newsweek article in 2009 in which the author tried to highlight the power of this social graph. He mentioned that Facebook has so much data, the power of which they themselves haven’t realized. For example, Facebook could predict who will win the next US presidential election, how successful Apple’s next iPad release will be, etc.

In other words, Facebook has made it easy to capture and record your relationships with not only your friends and family members, but also with anything you do in your life (be it reading a book, meeting a friend or buying a new phone), without you even realizing it. I believe this is one of the interesting ideas which can be brought to traditional line-of-business applications or products.

Let us dig a bit deeper into it. Think of any line-of-business application that you might have used or use daily; or, if you are a software developer, any application you might have designed and/or developed. Many people use this application differently, doing different things with it, to achieve different business goals. This group of people interact with each other outside your application to do their work, to achieve different business goals, or sometimes even a common business goal. They also interact with different entities in the business (some of which might already be represented in the application). The question is: are these interactions – between people and different entities in the business – important enough to the business that they should be captured, recorded and analyzed? By interaction, I don’t mean people chit-chatting in the office about the birthday party they attended yesterday, which I guess they already do on Facebook or their favorite social network. Instead, think of a call center agent getting help from his or her colleague, a sales executive who wants to keep a close eye on a particular company or client, a teacher’s favorite student, etc. Do any of the applications currently in use in enterprises capture these relationships in the context of the business?

In plain English, when you are working with a line-of-business application, you treat it as a data entry operation; not surprisingly, the application also treats you the same way, meaning it captures the data you key in, prints some reports and provides some analysis. It doesn’t care whether the student record you are looking at belongs to your favorite student, or whether the colleague sitting next to you is helping you solve the problem of the customer on the line. I believe capturing the enterprise social relationships between people and the different entities inside or outside your business can help you find interesting correlations, important findings, surprising relationships, and much more.

Graph databases

I guess it is safe to say that 95% of applications in the enterprise use some kind of relational database for persisting data (the other 5% being COBOL mainframe applications). If you are thinking of enabling your application to capture these unconventional relationships, a relational database might not be the wisest choice. To be more specific, I would consider using a graph database to capture the social relationships in your enterprise. Speaking of graph databases, consider Neo4j, the most popular graph database. Though it is based on Java (you have to install Java on your machine), Neo4j exposes a first-class REST API and has a .NET SDK as well.
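As a taste of what recording such interactions could look like, here is a hedged sketch that posts a Cypher statement to Neo4j's legacy REST endpoint ("agent Alice helped agent Bob on case 123"); the URL, labels and property names are my own illustrative assumptions:

using System;
using System.Net.Http;
using System.Text;

class SocialGraphRecorder
{
    static void Main()
    {
        using (var http = new HttpClient())
        {
            // MERGE creates the Employee nodes if they don't exist yet, then
            // CREATE records the HELPED relationship between them.
            string payload = @"{
  ""query"": ""MERGE (a:Employee {name: {helper}}) MERGE (b:Employee {name: {helped}}) CREATE (a)-[:HELPED {caseId: {caseId}}]->(b)"",
  ""params"": { ""helper"": ""Alice"", ""helped"": ""Bob"", ""caseId"": 123 }
}";
            var response = http.PostAsync(
                "http://localhost:7474/db/data/cypher",
                new StringContent(payload, Encoding.UTF8, "application/json")).Result;
            Console.WriteLine(response.StatusCode);
        }
    }
}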

While using a graph database to model your enterprise social graph gives you the ultimate flexibility and power, you could also consider offloading this representation to an enterprise social networking product like Yammer (if your company already has a subscription). Similar to Facebook, Yammer exposes a graph API which can be consumed by applications.

UI Libraries

If you are a software developer, I am sure you have come across many user interface libraries while developing applications, most of them internally developed on top of some third-party control suite. The usual arguments for developing UI libraries, and for developing applications on top of them, are consistency and productivity: they enable developers to build user interfaces with a consistent look and feel, faster. While this consistency is one of the reasons every screen looks similar, it also contributes to making applications look like a bunch of data entry screens. As a matter of fact, consistency is no longer the primary user interface design guideline; intuitiveness is. Make your UI intuitive for the user, and stop worrying about whether it looks similar to other screens. When you are forced to design your screens within the boundaries of UI libraries, you are certainly not thinking about the person who is going to use them and what he/she wants to achieve, beyond making sure that the data entered is saved.

Summary

To conclude, the users of your application are people with different kinds of social interactions, even within the context of the business your application operates in. They expect your application to be aware of this, instead of treating them as data entry operators. I feel that when you, as an architect, are aware of these expectations and architect applications to capture these social interactions, there is a very high chance you will have architected a truly social application.

Technical Readiness plan

Look at the Google Trends graph below, showing search trends for different .NET Framework versions over the last year and a half. On average, 50% of searches are still about .NET Framework 2.0. While this is not an indication of how many applications are built on each version of .NET, I am sure the percentage of applications built on earlier versions of the .NET Framework (earlier than 3.0) is much higher.

So I was not surprised when one of my friends asked me to help him create a readiness plan to update himself from .NET 2.0 to .NET 4.0/4.5 (could be 5.0 soon). According to him, he gets lost as soon as he starts looking around at what is new in .NET 4.0/4.5. That shouldn’t surprise anybody; a lot has changed in the technology space since 2003. Look at the famous photo below to get an indication of the upsurge in smartphone, and in general mobile device, penetration.

Photo courtesy: Instagram

The technology landscape has changed a lot: the avalanche of tablets starting with the iPad, the smartphone revolution started by the iPhone, different providers fighting for cloud supremacy, and the death of SOAP.

When I started creating a .NET readiness roadmap (an official term), a.k.a. a development plan, for my friend, I realized how open the technology landscape has become. Open standards and open source technologies have become so important that no development plan is complete without them.

On second thought, it is not really about open source or .NET or any other proprietary technology. Instead, it is about how these advancements influence the technologies used to build the different layers of an application. So here is my attempt to derive a developer’s roadmap by looking at the technology advancements or changes that happened at each application layer (user interface, services and persistence), and at some of the cross-cutting concerns as well.

User Interface

Device proliferation has created an obvious need for your web pages to be responsive, so that they adapt not only to different browsers, but also to different screen sizes and resolutions. Most browser vendors and smartphone makers support HTML5 and CSS3, including IE, Chrome, Firefox and Safari.

I would definitely put basic knowledge of HTML5 and CSS3 in my technical knowledge kitty. You can check out the below Pluralsight courses to update yourself on HTML5 and CSS:

 HTML5 Fundamentals

CSS3 From Scratch

As I write this post, I am sure a new JavaScript library has been released; maybe another one will be out by the time you read it. The myriad of JavaScript libraries, starting with jQuery, is what made HTML5-based applications popular. I can’t imagine developing a web page without jQuery and Modernizr. If you haven’t dirtied your hands with jQuery yet, you should do so as soon as possible. Again, Pluralsight has a ton of jQuery courses, but this one for sure will get you started.

If you are an ASP.NET developer and haven’t checked out ASP.NET MVC yet, be sure to do that. Along the way you will also encounter many of the JavaScript libraries, at the very least jQuery.

I am not a CSS expert, hence I rely on Twitter Bootstrap CSS library to create responsive pages.

Mobile development

It is no surprise that most companies have a mobile-first strategy. With mobile device proliferation showing no signs of receding in the near future, I would definitely add mobile app development skills to my armory. My personal favorite is Windows Phone and Windows 8 development; however, be sure to check out other development platforms as well. If you are a .NET developer interested in cross-platform mobile development, be sure to check out Xamarin and PhoneGap.

Beyond touch

If you are interested in advanced natural UI technologies, you may want to consider learning the Microsoft Kinect SDK and Leap Motion. Leap Motion even has a C# SDK.

At the Server’s side

Without any doubt, the most revolutionary thing to happen in server-side computing in recent years is the advent of cloud computing. With Azure and Amazon fighting to serve their customers with reduced prices and new services and offerings, the time to start developing for the cloud is now.

Windows Azure offers a three-month free trial, though most of the services can be run locally for development and testing purposes as part of the SDK.

Amazon does not have a sophisticated client-side emulator for their services, but they do offer a one-year free trial with some restrictions on the available services.

Again, Pluralsight has great resources for Windows Azure. Though their introductory course on AWS is really good, I found a lot of good resources at Amazon’s YouTube channel, especially the below introductory video.

Node.js

One way to look at Node.js is as a framework for developing cross-platform web applications. If you have been working primarily with .NET and know a little bit of JavaScript, Node.js could be your tool for creating web applications that run on both Windows and Linux. This single-threaded, tiny JavaScript framework is making waves, so be sure to check out and understand Node.js. Some good tutorials are here.

Take some REST

I find it hard to believe that Roy Fielding defined REST as early as 2000, and the creators of SOAP either did not find it or ignored it. If Azure and Amazon can expose their cloud services as RESTful services, you really do not have to think twice about adopting a RESTful design for your services. If you are a .NET developer, be sure to check out ASP.NET Web API. Some great tutorials are available here.
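To show how little ceremony this takes in ASP.NET Web API, here is a minimal sketch of a RESTful controller; actions map to HTTP verbs by naming convention, and the Products resource is just an illustration:

using System.Collections.Generic;
using System.Web.Http;

public class ProductsController : ApiController
{
    private static readonly List<string> products =
        new List<string> { "Bolt", "Nut" };

    // GET api/products
    public IEnumerable<string> Get()
    {
        return products;
    }

    // POST api/products
    public void Post([FromBody] string product)
    {
        products.Add(product);
    }
}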

For RESTful API design guidelines check out the apigee site. They have some excellent blogs and even YouTube videos.

For a more serious understanding of the architectural principles behind REST, check out the pluralsight course.

During your RESTful journey, if you ever stumble upon OData, I would suggest you check out why OData is not really REST before getting too excited.

Persisting Data

The famous infographic below shows how much data is generated every minute around the world wide web.

Image courtesy: Mashable

Imagine storing one percent of this data in a relational database without compromising reliability and scalability. NoSQL (“not only SQL”) databases originated from this very use case. NoSQL databases sacrifice consistency (eventual consistency versus transactional consistency) in favor of availability (see the CAP theorem), allowing you to scale out indefinitely for storing non-relational data. Most people now consider polyglot persistence while designing applications, with a mix of relational and non-relational databases. If you are a .NET developer, consider learning RavenDB, which uses LINQ as its querying language. MongoDB and Neo4j are also worth checking out, the latter being a graph database, allowing you to store graph data. One of the interesting things I found about NoSQL databases is that all of them expose a RESTful API.
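To give a taste of RavenDB's LINQ querying, here is a small sketch; the server URL, database name and Customer class are illustrative assumptions:

using System.Linq;
using Raven.Client.Document;

public class Customer
{
    public string Id { get; set; }
    public string Name { get; set; }
    public bool Active { get; set; }
}

class RavenDemo
{
    static void Main()
    {
        using (var store = new DocumentStore
        {
            Url = "http://localhost:8080",
            DefaultDatabase = "Northwind"
        }.Initialize())
        using (var session = store.OpenSession())
        {
            // An ordinary LINQ query, translated by the client into a
            // RavenDB index query over HTTP.
            var activeCustomers = session.Query<Customer>()
                .Where(c => c.Active)
                .ToList();
        }
    }
}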

Great, we have a solution for storing this “Big Data”. How do we make some sense out of it? MapReduce is the technique for analyzing Big Data stored in huge clusters of NoSQL databases. If you have started playing with NoSQL databases like RavenDB or MongoDB, you might have already seen that these databases allow you to write Map and Reduce queries. These days I rarely see a sentence written about Big Data without a mention of Hadoop, and most people start their NoSQL journey with it. Though primarily written in Java, Hadoop has support from both Azure and Amazon in the cloud. Here is a great introduction to NoSQL by none other than Martin Fowler.
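To see the map/reduce idea in code, here is a sketch of a RavenDB map/reduce index that counts orders per customer; the Order class, property names and index name are illustrative assumptions:

using System.Linq;
using Raven.Client.Indexes;

public class Order
{
    public string Customer { get; set; }
}

public class Orders_ByCustomer : AbstractIndexCreationTask<Order, Orders_ByCustomer.Result>
{
    public class Result
    {
        public string Customer { get; set; }
        public int Count { get; set; }
    }

    public Orders_ByCustomer()
    {
        // Map: emit one record per order.
        Map = orders => from order in orders
                        select new Result { Customer = order.Customer, Count = 1 };

        // Reduce: group the mapped records by customer and sum the counts.
        Reduce = results => from result in results
                            group result by result.Customer into g
                            select new Result { Customer = g.Key, Count = g.Sum(x => x.Count) };
    }
}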

ORMs

I haven’t written a SQL query since 2005. Though I was deliberately staying away from SQL and relational databases, so that they don’t influence my NoSQL journey, another reason was ORMs. Microsoft’s Entity Framework has come a long way since its inception, though it lacks certain features compared to its counterpart NHibernate, e.g. level 2 caching. EF’s fluent API is powerful and easy. Check out this Channel 9 video by Entity Framework guru Julie Lerman.
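As a quick taste of EF's fluent API, here is a minimal Code First sketch; the Blog model and its mapping are illustrative assumptions:

using System.Data.Entity;

public class Blog
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class BlogContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Configure the model with the fluent API instead of attributes.
        modelBuilder.Entity<Blog>()
            .Property(b => b.Title)
            .HasMaxLength(200)
            .IsRequired();
    }
}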

Security

OAuth is one of the latest authorization protocols, started by Twitter and popularized by Facebook and other internet majors. Consuming a service which implements OAuth is fairly easy, while implementing OAuth for your own service is not a pleasant experience. If you are a C# developer, check out the DotNetOpenAuth library.

Claims-based authentication was prevalent well before the “cloud burst”. However, I would argue that the popularity of the cloud is what made claims-based authentication imperative for a lot of applications. Windows Identity Foundation (WIF), which makes implementing claims-based authentication for your applications and services easier, is now a part of the .NET Framework itself. Both Azure and Amazon have hosted Access Control Services in the cloud. If you are a .NET developer, you may want to start learning about claims-based authentication here.
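Once authentication is claims-based, your code simply reads claims off the principal. Here is a minimal sketch using the System.Security.Claims types that WIF contributed to .NET 4.5:

using System.Linq;
using System.Security.Claims;

class ClaimsDemo
{
    static string GetEmail()
    {
        // The current principal carries the claims issued by the identity
        // provider (for example via ACS); we just pick out the one we need.
        ClaimsPrincipal principal = ClaimsPrincipal.Current;
        return principal.Claims
            .Where(c => c.Type == ClaimTypes.Email)
            .Select(c => c.Value)
            .FirstOrDefault();
    }
}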

Programming paradigms

Functional Programming

I was listening to one of the dotnetrocks podcasts with Robert C. Martin (@unclebobmartin) about the future of object-oriented programming (if you haven’t checked out dotnetrocks yet, please do so; these guys are amazing, and you will find it difficult to listen to any other podcast once you listen to dotnetrocks). Two of the many pearls of wisdom from @unclebobmartin got me thinking. The first was: why would OOP have a future? It is done; go invent something else. The other was about how variable assignments bring the third dimension of time into your program (a.k.a. statefulness), and how this makes it difficult to write multi-threaded applications. The answer to both these problems is functional programming. Though it is not new, functional programming was always stuck in academia and never made it into commercial software development. @unclebobmartin describes functional programming as programming without assignments; that is, your variables do not change state when your program runs, a.k.a. immutability. Another way to look at functional programming is to think of sending functions to data rather than sending data to functions (the way we were taught). If you look at it closely, this is what MapReduce is all about. If you have been dealing with Big Data and MapReduce but never came across functional programming, you should definitely check out the concepts.
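You do not even need F# to get a first taste of these ideas; LINQ already lets you send functions to data, as in this small sketch where nothing is ever reassigned:

using System;
using System.Linq;

class FunctionalDemo
{
    static void Main()
    {
        int[] numbers = { 1, 2, 3, 4, 5 };

        // Map: square each number. Reduce: sum the squares.
        // No variable is mutated along the way.
        int sumOfSquares = numbers
            .Select(n => n * n)
            .Aggregate(0, (acc, n) => acc + n);

        Console.WriteLine(sumOfSquares); // 55
    }
}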

Channel 9 has a Functional Programming Fundamentals series by Dr. Erik Meijer. Do check it out, though the examples are in the Haskell language.

F# is the answer if you want to try functional programming in Visual Studio. The best way to learn F# is through the tryfsharp.org website, though I prefer typing the examples into Visual Studio rather than the online editor.

Async, Async and await

In my opinion, the Task Parallel Library (TPL) and its natural progression, the async/await pattern, are the two major feature additions to C# and .NET in recent times. If you have already programmed using the ThreadPool and asynchronous I/O, you should be able to understand TPL and async/await quite easily. I will try to find a good tutorial and post it here.
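In the meantime, here is a minimal sketch of the pattern (the URL is only a placeholder); the method yields its thread while the download is in flight instead of blocking it:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class AsyncDemo
{
    static async Task DownloadAsync()
    {
        using (var http = new HttpClient())
        {
            // The await frees the calling thread until the response arrives.
            string html = await http.GetStringAsync("http://example.com");
            Console.WriteLine("Downloaded {0} characters", html.Length);
        }
    }
}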

Notable Omissions

WCF – If you have been programming in .NET, you might have noticed that I did not mention WCF at all. Should you learn WCF? Maybe, if you are planning to write SOAP services. How about RESTful services? I wouldn’t implement a RESTful service in WCF; my choice would be either ASP.NET Web API or Node.js. I have also not mentioned MVVM and WPF. You are sure to encounter MVVM if you are developing apps for the Windows 8 platform.

Happy learning!!!

Some Interesting definitions

• Askhole – A person who seeks a lot of help, but never puts anything into practice
• Insultant – Of course, who else: a consultant, most likely a technology consultant
• Marchitecture – Marketing architecture; best example: Private Cloud
• YAGNI – You Ain’t Gonna Need It. The best way to convince a customer?

I will try to keep this list updated.


64-bit Installer for your VSTO add-in

 

I hope this is my last post on VSTO. I am not an expert in VSTO, and I am really excited about the new app model for Office 2013, which lets you create cleaner, more powerful extensions for Office products like Word, Excel, Outlook and even SharePoint.

However, if you are planning to create an installer for your VSTO add-in using the instructions here and deploy it on a 64-bit machine, it will not work. This is because the registry key is different on 64-bit machines at the step where you create a .prq XML file and add the Visual Studio 2010 Tools for Office Runtime as a prerequisite. Instead of the XML snippet given there, use the following one when you create a .prq file for 64-bit machines.

<?xml version="1.0" encoding="UTF-8"?>
<SetupPrereq>
<conditions>
    <condition Type="32" Comparison="2" Path="HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VSTO Runtime Setup\v4R" FileName="Version" ReturnValue="10.0.40303" Bits="2"></condition>
</conditions>
<files>
    <file LocalFile="&lt;ISProductFolder&gt;\SetupPrerequisites\VSTOR\vstor_redist.exe" URL="http://go.microsoft.com/fwlink/?LinkId=140384" CheckSum="b6639489e159b854b6dc43d5cb539043" FileSize="0,40023024"></file>
</files>
<execute file="vstor_redist.exe" returncodetoreboot="1641,3010" requiresmsiengine="1">
</execute>
<properties Id="Your GUID goes here" Description="This prerequisite installs the most recent version of the Microsoft Visual Studio 2010 Tools for Office Runtime." >
</properties>

</SetupPrereq>

Extract Team Foundation settings from Team Foundation project add-in

 

One of my project manager friends approached me with a request recently. He wanted a “working” Project add-in that would help him sync certain attributes of Project tasks with the corresponding work items in Team Foundation Server. According to him, the built-in TFS add-in doesn’t work half the time, and when it works, it completely screws up the project file, making it unusable.

Storing the configuration settings for this add-in posed some interesting problems. Initially I started out by storing the Team Foundation settings (URL, team project, etc.) in a .config file, though I felt this was not a good design. Since these configuration settings will be different for each project file (.mpp), storing them in a configuration file is definitely not a good idea. So I started exploring a different path: if the Team Foundation add-in for Microsoft Project (and, for that matter, Excel) stores these settings in each project file, why can’t we read those settings instead of designing our own configuration storage?

The below diagram shows the custom properties of a Microsoft Project file.

[screenshot: custom document properties]

In the above diagram you can see some custom properties with names starting with “VS Team System Data DO NOT EDIT”. The Team Foundation add-in stores various configuration settings against these property names. However, the values against each of these property names are Base64 encoded. In fact, the Team Foundation add-in constructs the settings as a single XML file, compresses and encodes it, splits it into thirty-odd segments (as you can see from the number of property names) and stores them as custom property values.

Code to read custom properties

In order to read these settings, you have to combine the segments, decode and decompress them, which will produce a single monolithic XML document containing all the configuration settings. Once you have the XML loaded into memory, you can use LINQ to XML to read individual configuration settings like the Team Foundation Server name, the Team Foundation project name, etc.

Combine all the custom “VS” property values into a StringBuilder:

project = app.ActiveProject;
if (project == null) return;
var properties = (DocumentProperties)app.ActiveProject.CustomDocumentProperties;

string VSTSSystemData = "VS Team System Data DO NOT EDIT";
int propCount = 0;
try
{
    propCount=(int)properties[VSTSSystemData].Value;
}
catch (ArgumentException argex)
{
    //throw new ApplicationException("Project is not connected to TFS");
    MessageBox.Show("Project is not connected to TFS");
    return;
}
StringBuilder sb = new StringBuilder();

for (int i = 0; i < propCount; i++)
{
    sb.Append((string)properties[VSTSSystemData + i].Value);
}

Decode the Base64-encoded string:

 

byte[] encodedDataAsBytes = System.Convert.FromBase64String(sb.ToString());

Before you load the decoded data into an XML document, you have to omit the first 8 bytes of the array. These 8 bytes store some settings specific to the encoding and compression.

byte[] result = new byte[encodedDataAsBytes.Length - 8];
Array.Copy(encodedDataAsBytes,8, result, 0, encodedDataAsBytes.Length - 8);
MemoryStream s = new MemoryStream(result);

Now you have the compressed data in a memory stream. You can decompress this stream with a DeflateStream and load it into an XDocument to access the various Team Foundation configuration settings.

using (var df = new System.IO.Compression.DeflateStream(s, System.IO.Compression.CompressionMode.Decompress, true))
{
    var doc = XDocument.Load(df);
    var tfsservernode = (from x in doc.Descendants("V")
                where x.Attribute("n").Value == "namespaceUrl"
                select x).FirstOrDefault();
    if (tfsservernode == null) { MessageBox.Show("Could not find TFS server name"); return; }
    tfsservername = tfsservernode.Value;
    if (string.IsNullOrEmpty(tfsservername)) { MessageBox.Show("TFS server name is empty"); return; } 
    var projectnode = (from x in doc.Descendants("V")
                       where x.Attribute("n").Value == "project"
                         select x).FirstOrDefault();
    projectname = projectnode.Value;
}

Please note, the above steps apply to Microsoft Project 2010. The format in which the Team Foundation add-in encodes and compresses these settings is undocumented and proprietary, so use this approach at your own risk.