Monthly Archives: January 2015

Implementing a messaging queue with RabbitMQ in 15 minutes

This is the beginning of a series of articles meant to capture the main differences between various message queue implementations, including MSMQ, RabbitMQ, ZeroMQ and ActiveMQ.

RabbitMQ uses the Advanced Message Queuing Protocol (AMQP) and is a brokered messaging system: brokered in the sense that there is a central entity responsible for storing and forwarding messages from producers to consumers.

This article presents the 1-2-3’s of creating a simple producer/consumer example using RabbitMQ.

Mandatory Setup
1. Download and install Erlang. RabbitMQ Server is written in Erlang.
2. Download and install the RabbitMQ server broker, which runs as a Windows service.
3. Ensure the service is running on its default port, 5672. To do this, you can enter the following command at a DOS prompt:

netstat -ano > processes.txt

After running the command, open the text file using Notepad (or, even better, Notepad++) and do a search for port 5672. If the port is found, the service is running on your local machine.
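Alternatively, assuming findstr is available (it ships with Windows), you can filter the output directly and skip the text file:

netstat -ano | findstr "5672"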

Create Producer and Consumer
4. Create a new Console project called RabbitMQProducer.
5. Using NuGet, add a reference to the C# RabbitMQ binding called RabbitMQ.Client.
6. Replace the contents of Program.cs with the following code:

using System;
using System.Diagnostics;
using System.Text;
using System.Threading;
using RabbitMQ.Client;

public class RabbitMqPublisher
{
    public static void Main()
    {
        var factory = new ConnectionFactory() { HostName = "localhost" };

        // The connection abstracts the socket connection and takes care of
        // protocol version negotiation, authentication and so on for us.
        using (var connection = factory.CreateConnection())
        {
            using (var channel = connection.CreateModel())
            {
                // 1. Declare the queue to send messages to. The queue is created
                //    only if it does not already exist.
                channel.QueueDeclare("hello-queue", false, false, false, null);

                while (true)
                {
                    var stopWatch = new Stopwatch();
                    stopWatch.Start();

                    // 2. Create the messages; the payload can be anything since it is a byte array.
                    for (var i = 0; i < 1000; i++)
                    {
                        SendMessage(channel,
                            string.Format("{0}: msg:{1} hello world ", DateTime.UtcNow.Ticks, i));
                        Thread.Sleep(10);
                    }

                    stopWatch.Stop();
                    Console.ReadLine();

                    Console.WriteLine("====================================================");
                    Console.WriteLine("[x] done sending 1000 messages in " + stopWatch.ElapsedMilliseconds);
                    Console.WriteLine("[x] Sending reset counter to consumers.");

                    SendMessage(channel, "reset");
                    Console.ReadLine();
                }
            }
        }
    }

    private static void SendMessage(IModel channel, string message)
    {
        var body = Encoding.UTF8.GetBytes(message);

        // 3. Publish to the default exchange, routed to the queue by its name.
        channel.BasicPublish("", "hello-queue", null, body);

        Console.WriteLine(" [x] Sent {0}", message);
    }
}

7. Compile and run the program; you should see each message being written to the console.
8. Create a new C# Console project called RabbitMQReceiver and replace the contents of Program.cs with the following code:

using System;
using System.Text;
using RabbitMQ.Client;

public class RabbitMqSubscriber
{
    static void Main(string[] args)
    {
        int messagesReceived = 0;

        var factory = new ConnectionFactory() { HostName = "localhost" };
        using (var connection = factory.CreateConnection())
        {
            using (var channel = connection.CreateModel())
            {
                // 1. Declare the queue in case it does not exist yet.
                channel.QueueDeclare("hello-queue", false, false, false, null);

                // 2. Consume messages from the queue with automatic acknowledgement.
                var consumer = new QueueingBasicConsumer(channel);
                channel.BasicConsume("hello-queue", true, consumer);

                Console.WriteLine(" [*] Waiting for messages. To exit press CTRL+C");
                while (true)
                {
                    // Blocks until a message is delivered.
                    var ea = consumer.Queue.Dequeue();

                    var body = ea.Body;
                    var message = Encoding.UTF8.GetString(body);
                    messagesReceived += 1;

                    Console.WriteLine("[x] {0} Received {1}", messagesReceived, message);
                    if (string.CompareOrdinal("reset", message) == 0)
                    {
                        messagesReceived = 0;
                    }
                }
            }
        }
    }
}

9. Using NuGet, add a reference to the C# RabbitMQ client binding, RabbitMQ.Client, to resolve the compilation errors.

10. Compile and run the program. If the producer is already running, you may have to right-click on the consumer project and select Debug | Start new instance.
11. Run an instance of the publisher and the subscriber and notice messages being passed from the former to the latter.

Some notes about the implementation:
1. You can fire up more than one publisher and they will all send messages to the queue.
2. You can fire up one or more subscribers and they will all retrieve messages from the queue.
3. Messages remain in the queue until they have been consumed by subscribers.
4. Messages are shared among subscribers, so if one of them gets a certain message, the others do not.
5. If you have more than one subscriber, messages are distributed among them roughly equally (round-robin).

Use case
This can be used in a scenario where you have field data coming in and a set of processing stations that process the field data in parallel.

The value of quality tech support from a web hosting partner

Technical service is paramount in our business, especially if you, like me, do not like farting around with stuff other people do better, such as hosting web applications. I prefer to think of my specialty as crafting and building top-notch software solutions, and I would rather let those who are best at it provide the infrastructure for doing just that.

This is what http://www.discountasp.net has provided for me over the years. I have nothing but praise about their commitment to quality infrastructure, support and service. Regardless of the time and day you are almost guaranteed to get a response from these guys in very reasonable time, in most cases under three hours.

See, I have an ASP.NET MVC transit application being prototyped. The application is being migrated from Windows Azure because I could not find a way to host it on a relative path. And since discountasp.net already hosts our family charity’s website, I naturally turned to them for help.

I signed up for a new account on Jan 18, 2015, and within 24 hours of updating my DNS settings with my domain registrar, my domain name was up and running. Next, I deployed my ASP.NET MVC application to the discountasp.net servers, and this is when things started to get interesting.

Since the application’s database still resides in Azure, I had to add an IP whitelist entry to Azure’s management portal. The folks at discountasp.net promptly provided the IP address list and were also very quick to correct all the misunderstandings (and I had many) that kept coming up. We went back and forth on this several times and at one point they were prepared to do a remote session via WebEx. Now, this is technical support.

After they confirmed that they were able to connect to my Azure database, it was a simple matter for me to correct some minor glitches in the application and re-deploy it so that it started working correctly. If I had been doing all of this by myself, there is no way the work could have been done in such a short period of time, and it would have further distracted me from the actual task of creating great software.

Nothing but good words for these guys, and this is, in my humble opinion, a distinguishing factor amongst web hosting partners. Having this kind of reliable partner is well worth the money, allowing you to focus on what you do best.

And no, I am not affiliated with them in any way.

Use case for custom Log4Net MemoryAppender

When developing an enterprise-grade desktop or web application, there is often a need to view application logs, including activities, in one of the views. These activities could include user actions and system warnings or errors, triggered from the client or the server.

Server errors or warnings could be polled for, through a WCF or RESTful API, and stored in an in-memory collection that is bound to some data grid in a view. Log messages generated on the client, either through exceptions that are caught and logged somewhere, or via actions that require auditing such as changing sensitive system parameters, can share the same in-memory collection with the data retrieved from the server.

Log4Net is a popular .NET library used for application logging. The general pattern for using Log4Net is as follows:

1. Get an ILog instance.
2. Call any of the logging methods on the ILog instance to log your message, as sketched below.
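For instance, a minimal sketch of this pattern (the class and message names here are arbitrary):

using log4net;

public static class LoggingExample
{
    // Resolve a logger; log4net caches logger instances by name.
    private static readonly ILog Log = LogManager.GetLogger(typeof(LoggingExample));

    public static void DoWork()
    {
        Log.Info("Starting work");       // routed to every configured appender
        Log.Warn("Something looks odd");
        Log.Error("Something failed");
    }
}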

Log4Net log messages are eventually persisted to entities called Appenders; an Appender is the destination of a log message. There are FileAppenders, ConsoleAppenders, a MemoryAppender and many more.

We can leverage a MemoryAppender to capture application error or warning messages into a collection, which can then be bound to a view. The code is as follows:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using log4net.Appender;
using log4net.Core;

// BindableCollection, LogMessage, Severity and IListExtended are application-specific types.
public class NotifyingInMemoryLog4NetAppender : MemoryAppender
{
    private static readonly BindableCollection<LogMessage> LogData = new BindableCollection<LogMessage>();
    private static readonly List<LoggingEvent> CachedEvents = new List<LoggingEvent>();
    private readonly Timer _flushTimer;
    private readonly object _lock = new object();

    public NotifyingInMemoryLog4NetAppender()
    {
        // Periodically drain any events that were cached during a logging burst.
        _flushTimer = new Timer(arg =>
        {
            lock (_lock)
            {
                if (CachedEvents.Count > 0)
                {
                    LogData.AddRange(CachedEvents.Select(Convert));
                    CachedEvents.Clear();
                }
            }
        }, null, 5000, 3000);
    }

    public IListExtended Logs
    {
        get { return LogData; }
    }

    private DateTime _lastEventTime = DateTime.Now;
    private readonly TimeSpan _threshold = new TimeSpan(0, 0, 0, 0, 500);

    protected override void Append(LoggingEvent loggingEvent)
    {
        var now = DateTime.Now;

        // If we are getting flooded, cache the events and let the timer process them later.
        if (now - _lastEventTime < _threshold)
        {
            lock (_lock)
            {
                CachedEvents.Add(loggingEvent);
            }
        }
        else
        {
            LogData.Add(Convert(loggingEvent));
        }

        _lastEventTime = DateTime.Now;
    }

    private Severity ConvertFrom(Level level)
    {
        if (level == Level.Error)
            return Severity.Error;
        if (level == Level.Warn)
            return Severity.Warning;
        if (level == Level.Info)
            return Severity.Information;

        return Severity.Trace;
    }

    private LogMessage Convert(LoggingEvent loggingEvent)
    {
        return new LogMessage
        {
            Timestamp = loggingEvent.TimeStamp,
            Message = loggingEvent.RenderedMessage,
            Severity = ConvertFrom(loggingEvent.Level)
        };
    }
}

This custom appender is configured as follows:

<log4net debug="true">
    <appender name="InMemory" type="Infrastructure.Dictionary.Facades.NotifyingInMemoryLog4NetAppender, Pidac.Infrastructure.Dictionary ">
       <conversionPattern value="%5level [%thread] (%file:%line) - %message%newline" />
    </appender>
    <appender name="Console" type="log4net.Appender.ConsoleAppender">
      <layout type="log4net.Layout.PatternLayout">
        <!-- Pattern to output the caller's file name and line number -->
        <conversionPattern value="%5level [%thread] (%file:%line) - %message%newline" />
      </layout>
    </appender>
    <appender name="RollingFile" type="log4net.Appender.RollingFileAppender">
      <file value="dosewin.log" />
      <appendToFile value="true" />
      <rollingStyle value="Size" />
      <maxSizeRollBackups value="10" />
      <maximumFileSize value="10MB" />
      <staticLogFileName value="true" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
      </layout>
    </appender>
    <root>
      <!--ALL,DEBUG,INFO,WARN,ERROR,FATAL-->
      <level value="DEBUG" />
      <appender-ref ref="Console" />
      <appender-ref ref="RollingFile" />
      <appender-ref ref="InMemory" />
    </root>
  </log4net>
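One more wiring detail: log4net still has to be told to read this configuration at start-up. A common approach (assuming the log4net element lives in the application's config file with a matching configSections entry) is to call XmlConfigurator once when the application starts:

using log4net.Config;

public static class LoggingBootstrapper
{
    public static void Initialize()
    {
        // Reads the log4net section from the application's configuration file.
        XmlConfigurator.Configure();
    }
}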

When you call any of the ILog APIs, the Append(LoggingEvent loggingEvent) method in our custom implementation is called. We add this message to a local queue and subsequently pass it on to the static LogData structure, which can be bound to a view. Note that since we do not really care about preserving the order in which messages appear (most data grids today can sort the data prior to presentation), a simple List will suffice for our queue.
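As a rough sketch of the consuming side, a view model could locate the appender at runtime and expose its Logs collection for binding. The LogViewModel name is hypothetical, and this assumes a log4net version whose repository exposes GetAppenders():

using System.Linq;
using log4net;

public class LogViewModel
{
    public IListExtended Logs { get; private set; }

    public LogViewModel()
    {
        // Find our custom appender among the configured appenders and
        // surface its shared log collection to the view for binding.
        var appender = LogManager.GetRepository()
            .GetAppenders()
            .OfType<NotifyingInMemoryLog4NetAppender>()
            .FirstOrDefault();

        if (appender != null)
        {
            Logs = appender.Logs;
        }
    }
}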

And that is it.

Persisting View State via the ViewModel

Any enterprise-grade application you design will have a requirement to persist state. Say, in your architecture, you have an implementation where all ViewModels subscribe to an ApplicationClosingEvent and then submit their profile to some implementation of an IProfileService. Upon application start-up, each ViewModel is presented with its saved state by the profile service, cataloged by name.

So, what do you serialize and present as state? Well, since you already have a ViewModel, there is a temptation to just serialize it (assuming it is serializable) and pass it on to the profile service.

However, I prefer a different approach. Each ViewModel that has potentially serializable content implements an interface, IPersistentUserSettingsContributor, with one method to extract its state and another to restore it. This interface is defined as follows:

public interface IPersistentUserSettingsContributor
{
    PersistInfo GetSettingstoPersist();
    void LoadedPersistedSettings(PersistInfo persistInfo);
}

Then I implement this interface in the base ViewModel class with abstract members, forcing all ViewModels to provide an implementation. These methods can then be called by the framework to persist and restore state.
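A minimal sketch of that base class might look like the following (ViewModelBase is a hypothetical name; the framework calls these members around shutdown and start-up):

public abstract class ViewModelBase : IPersistentUserSettingsContributor
{
    // Each concrete ViewModel decides which subset of its state is worth persisting.
    public abstract PersistInfo GetSettingstoPersist();

    // Called on start-up with whatever state the profile service saved last time.
    public abstract void LoadedPersistedSettings(PersistInfo persistInfo);
}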

JavaScript runtime error: Unable to get property of undefined or null reference.

I randomly get this error when using TypeScript in an ASP.NET MVC 5 application.

Unhandled exception at line 15, column 5 in http://localhost:2425/theapp/Scripts/app.js?51
0x800a138f - JavaScript runtime error: Unable to get property 'prototype' of undefined or null reference

The error occurs in the following automatically generated JavaScript (the __extends helper that TypeScript emits for class inheritance):

var __extends = this.__extends || function (d, b) {
    for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p];
    function __() { this.constructor = d; }
    __.prototype = b.prototype;
    d.prototype = new __();
};

The application loads the main app.js in _Layout.cshtml as follows:

   <script src="@Url.Content("~/Scripts/app.js?51")"></script>

It turns out that if I manually increment the number after app.js, the problem magically disappears. Why? I cannot say at this time…

Setting MapSuite FeatureLayer projection during runtime

When working with ThinkGeo’s MapSuite, there may be times when you need to set a FeatureLayer’s projection dynamically at runtime. Say, for example, you loaded a Shapefile whose projection is WGS84, or Spatial Reference System Identifier (SRID) 4326, onto a map that currently uses OpenStreetMap (OSM) as the base map. OSM’s SRID is 3857.

If the Shapefile has an associated projection file, you would use code similar to the following to create a projection for the Shapefile’s FeatureSource:


var file = Path.GetFileNameWithoutExtension(filename);
var prjFile = Path.Combine(Path.GetDirectoryName(filename), file + ".prj");

var proj4 = new Proj4Projection();
string prjFileText = File.ReadAllText(prjFile); // get the text of the .prj file
proj4.InternalProjectionParametersString = Proj4Projection.ConvertPrjToProj4(prjFileText); // set the internal projection

Now, if you know the SRID of the current base map, and let us assume in this case it is 3857 for OSM, this is also the best time to set the external projection, as follows:

proj4.ExternalProjectionParametersString = Proj4Projection.GetEpsgParametersString(3857);

before assigning the projection object to the FeatureSource as follows:

shapeFileLayer.FeatureSource.Projection = proj4;

If, on the other hand, you add the layer to the map first, before setting the external projection parameters of its feature source, and do something like this:

overlay.Layers.Add(layerName, layer);
var featureLayer = layer as FeatureLayer;
if (featureLayer != null)
{
    EstablishLayerProjection(featureLayer);
}

where, in EstablishLayerProjection, you are basically doing something like this:


var projection = featureLayer.FeatureSource.Projection as Proj4Projection;
if (projection != null)
{
   projection.Open();
   projection.ExternalProjectionParametersString = CurrentProjection.ExternalProjectionParametersString;
   projection.Close();
}

you will run into problems.  So it is best to establish both the projection’s internal and external parameters before assigning it to a layer.
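Putting the pieces together, a sketch of the working order looks like this (the namespace, the ShapeFileFeatureLayer and LayerOverlay usage, and the file path reflect my understanding of the MapSuite desktop API and should be treated as assumptions; adjust to your own setup):

using System.IO;
using ThinkGeo.MapSuite.Core;   // namespace assumed; it varies between MapSuite editions

// Placeholder path; point this at your own shapefile.
var filename = @"C:\data\roads.shp";
var layer = new ShapeFileFeatureLayer(filename);

// 1. Internal projection: whatever the shapefile itself is stored in, taken from its .prj file.
var prjFile = Path.ChangeExtension(filename, ".prj");
var proj4 = new Proj4Projection();
proj4.InternalProjectionParametersString = Proj4Projection.ConvertPrjToProj4(File.ReadAllText(prjFile));

// 2. External projection: the SRID of the current base map (3857 for OSM).
proj4.ExternalProjectionParametersString = Proj4Projection.GetEpsgParametersString(3857);

// 3. Only now assign the projection and add the layer to an overlay on the map.
layer.FeatureSource.Projection = proj4;

var overlay = new LayerOverlay();   // or an existing overlay already on your map
overlay.Layers.Add("roads", layer);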

WPF resource path resolution is all about context

Say you wanted to define a Menu and Toolbar in a XAML file that is not associated with any code-behind. One reason you would do this is to support a generic data grid view to which you pass a menu, toolbar and status bar along with its associated view model. As you can imagine, such a framework allows you to use a single DatagridView implementation for many different types of view.

Such a menu can be defined as follows:

<Menu
    DockPanel.Dock="Top"
    MinHeight="25"
    Background="Transparent"   
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:utils="clr-namespace:Pidac.Controls.Dictionary.Utils;assembly=Pidac.Controls.Dictionary">
    <Menu.Resources>
        <ResourceDictionary>
            <ResourceDictionary.MergedDictionaries>
                <utils:SharedResourceDictionary Source="../Resources/Styles.xaml"/>
            </ResourceDictionary.MergedDictionaries>
        </ResourceDictionary>
    </Menu.Resources>
    <MenuItem 
        Header="File"                  
        Margin="5,1,5,0" >
        <MenuItem 
            Header="Copy"   
            Command="{Binding Commands[CopyCmd]}" CommandParameter="{Binding ViewID}"  
            Icon="{StaticResource CopyImg16}"/>
        <Separator />
        <MenuItem 
            Header="Export" 
            Command="{Binding Commands[ExportToFileCmd]}" 
            CommandParameter="{Binding ViewID}"
            Icon="{StaticResource DocumentExportImg16}"/>
        <MenuItem
            Header="Forward as Attachment" 
            CommandParameter="{Binding ViewID}"
            Command="{Binding Commands[AttachToEmailCmd]}"
            Icon="{StaticResource DocumentAttachImg16}"/>
        <MenuItem 
            Header="Print" 
            Command="{Binding Path=Commands[PrintCmd]}" 
            CommandParameter="{Binding ViewID}"
            Icon="{StaticResource PrintImg16}"/>
        <Separator/>                       
        <MenuItem 
            Header="Refresh Now" 
            Command="{Binding Path=Commands[RefreshViewCmd]}"
            CommandParameter="{Binding ViewID}" 
            Icon="{StaticResource RefreshImage16}"/>   
        <MenuItem 
            Header="Close" 
            Command="{Binding Path=Commands[CloseCmd]}" 
            CommandParameter="{Binding ViewID}"
            Icon="{StaticResource CloseOrExitImage16}"/>
            
    
    <MenuItem Header="Tools" Margin="5,1,5,0" >           
        <MenuItem 
            Header="Query" 
            Command="{Binding Path=Commands[QueryCmd]}" 
            CommandParameter="{Binding ViewID}"
            Icon="{StaticResource SearchImage16}"/>
        <Separator/>
        <MenuItem 
            Header="Select columns" 
            Command="{Binding Path=Commands[SelColumnsCmd]}" 
            CommandParameter="{Binding ViewID}"
            Icon="{StaticResource TableSelectColumnImage16}"/>
        <MenuItem 
            Header="Set Options" 
            Command="{Binding Path=Commands[PropertiesCmd]}" 
            CommandParameter="{Binding ViewID}"
            Icon="{StaticResource SetOptionsImage16}"/>
    </MenuItem>
</Menu>

and the code that parses this XAML into a menu instance, which is then added to the object tree, is as follows:

public static TObject CreateObjectFromResource<TObject>(string resourceUrl, string szBaseUri) //where TObject : UIElement
{
    var context = new ParserContext();
    if (szBaseUri != null)
        context.BaseUri = new Uri(szBaseUri);

    object result;
    using (Stream resource = ResourceUtils.GetResourceStream(resourceUrl))
        result = XamlReader.Load(resource, context);

    return (TObject)result;
}
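Usage might look roughly like the following sketch; the resource URL, base URI and the view model variable are assumptions for illustration, and how the menu is attached to the view depends on your framework:

// Hypothetical call: load the menu XAML embedded in the Pidac.Controls.Dictionary assembly.
var menu = CreateObjectFromResource<Menu>(
    "Pidac.Controls.Dictionary;component/Resources/GridMenu.xaml",
    "pack://application:,,,/Pidac.Controls.Dictionary;component/Resources/");

// The Commands[...] and ViewID bindings in the menu resolve against this view model.
menu.DataContext = gridViewModel;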

Now here is the thing: you absolutely MUST have your resource paths right, otherwise you will scratch your head for a couple of hours just to get this simple part of the framework in place. The two lines of pain are:

    xmlns:utils="clr-namespace:Pidac.Controls.Dictionary.Utils;assembly=Pidac.Controls.Dictionary"

and

       <utils:SharedResourceDictionary Source="../Resources/Styles.xaml"/>

In the first line, I am using a custom SharedResourceDictionary implementation to re-use styles. Even though this class is implemented in the same assembly that contains the XAML resource, I still need to provide the assembly attribute value as well. Doing this:

    xmlns:utils="clr-namespace:Pidac.Controls.Dictionary.Utils"

throws an exception, possibly due to the fact that the XAML parser code is not contained in the same assembly as the SharedResourceDictionary implementation.

On the second line, however, providing an absolute path is not necessary. The reason is that the XAML resource file is embedded within an assembly, which implies that WPF’s resource resolution process will look for the style relative to the resource’s location in its containing assembly. An absolute path never hurts, but it is not necessary.