
ReSharper with NUnit does not find local files

This is one of those problems we encounter every so often but never take the time to document.

Say you have some test files in your project and have set their properties to Content and Copy Always.  You are using our favorite ReSharper to run your unit tests with the NUnit test runner.  When you attempt to execute a test that reads one of these files, you get an error like the following:

 Could not find a part of the path 'C:\Users\knji\AppData\Local\JetBrains\Installations\ReSharperPlatformVs15_427a36eb\TestFiles\PEs\notification.xml'.
 at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
 at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions

When you dig further, you realize that your files are indeed present in the bin/Debug folder of your application, but ReSharper just will not find them. Possible solutions:
1. Turn off ReSharper shadow-copying. This did not work for us.
2. Instruct your test to explicitly use the test directory from the TestContext. This worked for us. So here’s the fix. Instead of doing this:

var file = File.ReadAllLines(@"my-relative-folder/some-cool-date.xml");

do this:

var file = File.ReadAllLines(TestContext.CurrentContext.TestDirectory + "/my-relative-folder/some-cool-date.xml");
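If you want to see this in context, here is a minimal sketch of a complete NUnit test using this approach; the fixture name, folder and file name are illustrative, and Path.Combine is used instead of string concatenation to sidestep path-separator issues:

using System.IO;
using NUnit.Framework;

[TestFixture]
public class NotificationTests
{
    [Test]
    public void ReadsLocalTestFile()
    {
        // TestDirectory points at the folder containing the test assembly,
        // not at the runner's shadow-copy location.
        var path = Path.Combine(
            TestContext.CurrentContext.TestDirectory,
            "my-relative-folder",
            "some-cool-date.xml");

        var lines = File.ReadAllLines(path);

        Assert.That(lines, Is.Not.Empty);
    }
}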

Happy Coding.


Composing a Client in a Client/Server Architecture Model without Service

I will start by saying this:

If all you have is the functional requirements of the client application and the known table structure of the underlying database, and the service interface has not yet been established, then let the requirements of the client application drive the service contract.  The structure of the underlying database should not drive this contract.

Over the course of my career, I have had to architect several solutions against external systems, mostly web services, which either had not yet been conceptualized or were in the initial stages of development.

Recently at Phreesia, we had to develop against a web service that was in the initial stages of conception.  At Honeywell’s Global Tracking division, when we migrated the legacy OCC 200, a 20-plus-year-old VMS Search and Rescue application, to the OCC 600, a modern client/server architecture, an enormous amount of effort was expended in formulating the functional requirements of the desktop application, since getting the user interface and all its usability concerns right was of utmost importance to facilitate adoption.

While such effort was spent on defining the requirements of the client, the requirements of the server were not being developed at the same pace.  Consequently, development of the client started well in advance of the server, and a decision had to be made on how to compose the Data Access Layer of this desktop application.

We started development of the client-side WPF desktop application by decomposing the application into three main layers: the Presentation Layer, the Business Service Layer, and the Data Access Layer (DAL), as shown below:

[Figure: three-layered application]

We had a solid set of functional requirements against the Presentation Layer, which also drove the Middle Layer as well as the DAL. We also had an understanding of the database, even though it was not documented, and were able to work with the back-end developers to ensure that the contracts we were formulating could be met.  Given the nature of the application’s use case in Search and Rescue (we are talking about a heavy desktop GIS mapping application), a significant amount of functionality was developed before the Web Service came online, by creating fake Web Services which implemented the same service contracts.
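To make the fake-service idea concrete, here is a minimal sketch; the ITargetService contract and TargetDto entity are hypothetical stand-ins, not the actual OCC 600 contracts:

using System.ServiceModel;

// The shared WCF service contract, agreed upon before the real service existed.
[ServiceContract]
public interface ITargetService
{
    [OperationContract]
    TargetDto GetTarget(int id);
}

// External entity as returned by the (real or fake) service.
public class TargetDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// In-memory fake used by the client until the real Web Service came online.
public class FakeTargetService : ITargetService
{
    public TargetDto GetTarget(int id)
    {
        return new TargetDto { Id = id, Name = "Fake target " + id };
    }
}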

The Data Access Layer

As the saying goes, this is where the rubber hits the road.  This layer was responsible for retrieving data from the underlying source of truth, our databases, and making it available to the application.  Its purpose was threefold:

  1. Manage all connectivity with the underlying data source: this was achieved via WCF service constructs.
  2. Encapsulate data retrieval from the application: this was achieved via interfaces and dependency injection (see the sketch below).
  3. Map external entities retrieved from the underlying data source to the business objects required by the application: this was achieved via Service Oriented Architecture and the Chain of Responsibility pattern.

This layer was a dedicated set of DLLs which were injected into the application via IoC, allowing front-end and back-end development to continue in a loosely coupled fashion.  The application’s DAL ended up looking like this:

[Figure: DAL decomposition]
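As a rough illustration of points 2 and 3 above, here is a minimal sketch of how such a repository might be composed, reusing the hypothetical ITargetService and TargetDto from the earlier sketch; ITargetRepository, Target and the IMapper abstraction are likewise hypothetical, not the actual OCC 600 types:

// The DAL contract the application depends on; registered in the IoC container
// so that neither the WCF proxy nor the mapping code leaks into the client.
public interface ITargetRepository
{
    Target GetTarget(int id);
}

// Domain object used by the application.
public class Target
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IMapper<TSource, TDestination>
{
    TDestination Map(TSource source);
}

public class WcfTargetRepository : ITargetRepository
{
    private readonly ITargetService _service;              // WCF contract (real or fake)
    private readonly IMapper<TargetDto, Target> _mapper;   // external entity -> domain object

    public WcfTargetRepository(ITargetService service, IMapper<TargetDto, Target> mapper)
    {
        _service = service;
        _mapper = mapper;
    }

    public Target GetTarget(int id)
    {
        var dto = _service.GetTarget(id);
        return _mapper.Map(dto);
    }
}

Swapping the real service for the fake one is then just a different IoC registration, which is what let client development proceed before the server existed.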

The experiment was a success: we ended up with a highly usable and performant WPF desktop application which the end customer has grown to like and adopt.

In another blog post, I will address the issue of mapping from web service entities to the application domain model, and how we did this in a consistent, loosely coupled, testable and extensible manner.

What have your experiences been?

Configuring Asp.NET Core 2 to return Pretty JSON

When creating a Web API that returns JSON, it is oftentimes very useful to return formatted JSON, especially if your API does not come with any sort of documentation.

ASP.NET Core 2 makes this easy.  All you have to do in your Web API project is configure the JSON serializer before the application starts, and this is done in Startup.cs as follows:


// this method is in the Startup class; Formatting comes from Newtonsoft.Json
public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc()
        .AddJsonOptions(options => options.SerializerSettings.Formatting = Formatting.Indented);
}
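One refinement worth considering, purely a suggestion on my part rather than part of the original setup: indent only in development, since compact JSON is smaller on the wire. This sketch assumes an IHostingEnvironment field (_env) assigned in the Startup constructor:

// Assumes: private readonly IHostingEnvironment _env;
// injected via the Startup constructor (Microsoft.AspNetCore.Hosting).
public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc()
        .AddJsonOptions(options =>
            options.SerializerSettings.Formatting =
                _env.IsDevelopment() ? Formatting.Indented : Formatting.None);
}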

If your ValuesController returned

["value1", "value2"]

after the above change, it will now return

[
   "value1",
   "value2"
]


Once again, happy coding.