Software Development Guidelines to Live By

Not too long ago I posted a tweet that immediately went viral. (OK, it’s all relative – to me 66 retweets and 120 favorites is viral.)  It referred to Microsoft’s Engineering Guidelines for contributing to its open-source repository on GitHub for the next version of its web development platform, ASP.NET 5.

eng-guidelines

You may be familiar with other C# Coding Guidelines.  And generally I’m a huge fan of picking a set of guidelines, making necessary adjustments, and sticking to them as a team.  But what I appreciate about the ASP.NET 5 guidelines is that they cover not only coding conventions but also other vital aspects of software development, such as source code management, product planning and issue tracking.

One of the first things listed is the guidelines for submitting pull requests and a description of how they are reviewed and approved (using an emoticon!).  Their Git branching strategy is also described, as well as the solution and project folder structure and the assembly naming pattern.  It’s notable that xUnit is used for all testing, which indicates that xUnit has now supplanted both MSTest and NUnit as the preeminent testing platform.  (In other words, just use it.)  It’s also worth noting that to use xUnit you no longer need to install a Visual Studio extension – adding the xUnit NuGet package is all you need to do for tests to show up in the Visual Studio Test Explorer.  And ReSharper has its own xUnit plugin, which you can install to use the R# test runner with xUnit tests.  Lastly, you’ll want to pay close attention to the naming guidelines for test assemblies, classes and methods, as well as the recommended unit test structure and how to account for exception messages in your tests.
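To illustrate the kind of naming and structure the guidelines describe, here is a minimal xUnit sketch (the class under test, ProductService, is hypothetical):

using Xunit;

// Hypothetical class under test, included only so the example compiles.
public class ProductService
{
    public object GetProduct(int id)
    {
        return id > 0 ? new object() : null;
    }
}

// Test class named after the class under test; the method name describes
// the scenario and the expected outcome.
public class ProductServiceTests
{
    [Fact]
    public void GetProduct_ReturnsNull_WhenIdDoesNotExist()
    {
        // Arrange
        var service = new ProductService();

        // Act
        var product = service.GetProduct(-1);

        // Assert
        Assert.Null(product);
    }
}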

The actual coding guidelines section is rather lightweight, probably because ReSharper enforces most generally accepted rules.  (You are using ReSharper, aren’t you?!)  If you’re a company, the commercial license is well worth the price, but if you’re an individual developer and price is a concern, see if you can qualify for a free license as the author of an open source project.  One interesting thing to call out is the use of the [NotNull] parameter attribute for argument null checking, which results in null-checking code injected into the method at compile-time.  You’ll need to use it on all public methods, including constructors and property setters.
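Here’s a rough sketch of what that looks like.  The NotNullAttribute below is only a stand-in for the one defined in the ASP.NET repositories, and IProductStore is a hypothetical dependency:

using System;

// Stand-in for the [NotNull] attribute from the ASP.NET 5 shared sources,
// which triggers compile-time injection of null checks.
[AttributeUsage(AttributeTargets.Parameter)]
public sealed class NotNullAttribute : Attribute { }

public interface IProductStore { } // hypothetical dependency

public class ProductRepository
{
    private readonly IProductStore _store;

    public ProductRepository([NotNull] IProductStore store)
    {
        // The [NotNull] attribute causes the equivalent of this guard clause
        // to be injected at compile time; shown here only for illustration.
        if (store == null) throw new ArgumentNullException(nameof(store));
        _store = store;
    }
}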

Another thing to note is the restricted use of internal types and members.  You’ll almost always want to shy away from the use of internal and instead place those types in a namespace which indicates they are primarily for internal use.  It’s not a hard and fast rule, but it’s how most modern frameworks are built.  This makes the use of the [InternalsVisibleTo] attribute unnecessary, which is good because it prevents you from having to modify a core assembly when all you want to do is add a test assembly for it.
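For example, a helper type might stay public but live in an *.Internal namespace (the names below are hypothetical):

// Public, but the namespace signals "not part of the supported surface area".
namespace MyCompany.Widgets.Internal
{
    public static class WidgetHelper
    {
        public static string NormalizeName(string name)
        {
            return name == null ? null : name.Trim();
        }
    }
}

Test projects can then reference WidgetHelper directly, with no [InternalsVisibleTo] required.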

The most controversial part of the coding guidelines is the recommendation to use the var keyword whenever possible for local type inference.  I have mixed feelings on this one, because nowadays you’re often looking at code in GitHub, or with a Git client, where you can’t just mouse over var for the return of a method call to see what the type really is.  On the other hand, there are arguments for using var whenever allowed by the compiler.  For this one, it’s really a matter of what your team prefers.
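Here’s the trade-off in miniature (GetOrderNames is a hypothetical helper used only for illustration):

using System.Collections.Generic;
using System.Linq;

public static class VarExample
{
    // Hypothetical helper used only to illustrate the readability trade-off.
    private static IEnumerable<string> GetOrderNames()
    {
        return new[] { "Alpha", "Beta" };
    }

    public static void Demo()
    {
        // The type is obvious from the right-hand side, so var costs nothing:
        var counts = new Dictionary<string, int>();

        // Here the type is not visible at the call site, which is fine in the IDE
        // but less so in a GitHub diff where you can't hover over 'names':
        var names = GetOrderNames().ToList();
    }
}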

Note that guidelines are only as effective as their enforcement.  For this reason, you’ll need something like StyleCop, which has been implemented as a ReSharper extension.  First install it by selecting Extensions Manager from the ReSharper menu in Visual Studio.

resharper-stylecop-ext

Then go to ReSharper, Options, Tools, and select StyleCop.  If you see a compatibility warning message, click the button that says, “Reset C# Code Style Options.”

resharper-stylecop-opt

After this, you’ll see StyleCop rules pop up when you’ve violated standard conventions, such as the ordering of using directives. In the example below, selecting the first option will reorder the usings appropriately.

resharper-code-cleanup

There are also facilities which allow you to select a Settings.StyleCop file, and to enable the insertion of file headers during cleanup.

The last thing I’ll mention is that there is an unwritten rule regarding line length: lines of code should not be so long as to cause the scroll bar to appear in GitHub.  I found this to happen when exceeding 118 characters, but you may wish to limit your code to a shorter width.  Whatever length you decide on, you need a visual cue for when your code has exceeded it.  The best way I know of is to install the Visual Studio Productivity Power Tools, and insert a registry entry so that a vertical guideline will appear showing where to insert a line break.

The ASP.NET guidelines are a step in the right direction, but there are a bunch of other things developers should keep in mind when writing code.  In particular, when refactoring one should avoid code smells as much as possible.  These coding standards provide a nice summary, and they mention the principles of SOLID, KISS, YAGNI and DRY, as well as common design patterns.

The important thing is not to take these guidelines as written in stone, but as a starting point for formulating your own set of guidelines, which work for you, your project and your team.


Cmder: Making the Command Line Your Best Friend

Now that I’ve jumped fully on board the Git and GitHub bandwagon, I’m spending a lot more time at the command line.  In fact, I find myself working with both Visual Studio and the command prompt simultaneously, constantly switching back and forth between the two.  The main reason is that, while Visual Studio 2013 makes a great Git client and tools such as TortoiseGit are a great help, there are some Git commands, like stash and rebase, that either aren’t supported well by the tools or are just easier to perform at the command line.  Besides, most of the online Git tutorials list Git commands.

cmder-main

DOWNLOAD Cmder here: http://bliker.github.io/cmder.

Another reason why I find myself more on the command line is that I’m working with ASP.NET 5, which requires the use of a command prompt for managing versions of the runtime and generally treats command-line tools as first-class citizens.  Finally, there’s Chocolatey, for easily installing tools at the command line, and, of course, PowerShell for automating actions with scripts.

As my need to work at the command line has increased, so has my frustration with the standard Windows command prompt. Thankfully, I’ve run across a very nice command prompt replacement called Cmder, which combines the console emulator ConEmu with cmd enhancements from Clink and Git support from msysgit.  In short, it’s the only command prompt you’ll ever want to use.

If for no other reason, Cmder will win you over with its support for ^C and ^V for Copy and Paste. A list of basic commands can be found here, and you can execute a command to add file explorer integration, so that you can right-click on a directory to see “Cmder Here” in the context menu (open an administrator prompt at the Cmder installation directory and execute .\cmder.exe /REGISTER ALL).

I much prefer using Cmder for all my git command-line tasks rather than opening a Git Bash prompt, because all the standard cmd.exe commands are there, and it shows both the repo location and current branch.  Here you can see I’m at a local Git repo on the master branch; then I list all the branches and check out the develop branch.

cmder-git

And here you can see an example of executing ASP.NET 5 commands, such as dotnetsdk (aka kvm).

cmder-asp

There are some fancy things you can do with Cmder, including basic unix commands (such as ls, mv, cp, grep, cat) and aliases, which are shortcuts you can define in a text file (located in config\aliases) for common tasks. For example, there’s a built-in alias for the Windows file explorer. Simply type .e, and an explorer window opens at your current location.

Besides its versatility, what I especially like about Cmder is the ConEmu goodness that comes with it.  It’s easy to set up predefined tasks that open additional tabs in the same Cmder window so you can switch between them by pressing Ctrl+Tab and Ctrl+Shift+Tab.  You can also assign each task a hot key. To set up tasks, just click on the dropdown arrow to the right of the green plus icon in the lower right-hand corner.

cmder-setup-tasks

Selecting “Setup tasks…” from the menu will display a dialog for configuring ConEmu settings.  From here you can click “Add default tasks,” then tweak the commands that were inserted.  This will allow you to open new tabs for an administrator prompt, PowerShell, Chocolatey and Git Bash, among others.

cmder-tasks

Another thing you can do at this dialog is to assign a different palette (color scheme) to different commands.  For example, it would be nice for the Admin prompt to show up with a slightly different color scheme to give you a visual cue as to which console you’re on, and it would be just as nice for the PowerShell console to light up with the standard PS colors.  To do that, you can select Features / App distinct, then add application distinct settings which override the default color palette.

cmder-app-distinct

Executing the PowerShell Admin task will, for example, open a new elevated Cmder tab colored in the same way as the standard PS prompt.

cmder-ps

Lastly, after making these customizations you’ll want to export the settings file.  If you want to use the settings file I created for my own use, you can get it here.  Better yet, you can copy the entire cmder folder to a USB stick or a cloud service, such as Dropbox, where you can use it from any connected machine.  Enjoy!


Handle Cyclical References with ASP.NET Web API 2 and MVC 6

Many ORM (object-relational mapping) tools, such as Entity Framework 6 Tools for Visual Studio 2012 & 2013, Entity Framework Power Tools, or Entity Framework Reverse POCO Generator, generate entity classes that contain cyclical references.


Download the source code, samples, and NuGet packages for this blog post:
1. AspNet WebApi 2 Helpers (Asp.Net 4 / vCurrent)

json-xml-logo-webapi2  protobuf-logo-webapi2

2. AspNet Mvc 6 Helpers (Asp.Net 5 / vNext)

jaon-xml-logo-mvc6  protobuf-logo-mvc6


Say, for example, you have two classes: Product and Category.

public partial class Product
{
    public int ProductId { get; set; }

    public string ProductName { get; set; }

    public int? CategoryId { get; set; }

    public Category Category { get; set; } // References Category
}

public partial class Category
{
    public Category()
    {
        Products = new HashSet<Product>();
    }

    public int CategoryId { get; set; }

    public string CategoryName { get; set; }

    public ICollection<Product> Products { get; set; } // References Product
}

Product has a Category property, and Category has a Products property.  This is all fine, except when you attempt to return those entities from a web service, in which case the serializer will complain that the object graph for the type contains cycles and cannot be serialized.  Here you see the exception generated by the XML (Data Contract) serializer.

ser-error-xml

Here is the exception generated by Json.Net.

ser-error-json

There are a number of ways to solve this problem. First, you could eliminate the cyclical reference in entities returned by the service.  For example, just remove the Products property from Category, and the error disappears. This works if you use the DTO (data transfer object) pattern, perhaps utilizing a class mapping tool.
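For instance, a hypothetical ProductDto might flatten the relationship so that nothing points back at Product:

// DTO returned by the service: there is no Category.Products collection, so no cycle.
public class ProductDto
{
    public int ProductId { get; set; }
    public string ProductName { get; set; }
    public int? CategoryId { get; set; }
    public string CategoryName { get; set; } // flattened from Category
}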

If you don’t want to maintain a separate set of classes, your other alternative would be to give the serializer a hint to serialize object references instead of values.  This can be done with attributes placed on classes and/or properties.  Json.Net has a JsonObject attribute with an IsReference parameter which you can set to true.  Similarly, the DataContract attribute has an IsReference property, but using it means you have to decorate every property you want included with a DataMember attribute. Not only is the resulting code messy, but polluting classes with serialization attributes violates the POCO (plain old CLR object) principle by coupling entities to specific infrastructure concerns. And to include them in your entities, you’d need to modify the T4 template used for the code generation, which isn’t all that easy.  Otherwise, your changes will be wiped out the next time you re-generate your entity classes.  Your task is further complicated should you decide to support another serialization format, such as Protobuf.

// Should entities be aware of serialization concerns?
[JsonObject(IsReference = true)]
[DataContract(IsReference = true)]
public partial class Product
{
    [DataMember]
    public int ProductId { get; set; }
    [DataMember]
    public string ProductName { get; set; }
    [DataMember]
    public int? CategoryId { get; set; }
    [DataMember]
    public Category Category { get; set; }
}

If you’re not using DTOs and you’d rather not litter your POCOs with attributes, a better approach would be to configure the serializers programmatically. And that’s what my serialization helpers are all about.

It turns out ASP.NET Web API has good support for configuring serializers on the default Json and Xml formatters, but the way you go about it varies depending on which formatter you are using.  For example, Json.Net supports setting the serializer to preserve references globally, but the DataContractSerializer requires that you configure it on a per-type basis by initializing a new serializer for each type.  Protobuf-net, on the other hand, requires that you configure metadata for each type property and add it to a static RuntimeTypeModel.  What is needed is a simple, consistent API for configuring formatters to handle cyclical references, both on the server and on the client.
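For reference, preserving references with the stock Json.Net formatter alone comes down to a setting like the following – a minimal sketch of the kind of configuration my helpers wrap behind one consistent API:

using System.Web.Http;
using Newtonsoft.Json;

public static class JsonCycleConfig
{
    public static void Configure(HttpConfiguration config)
    {
        // Emit $id/$ref pairs instead of serializing the graph (and its cycles) by value.
        config.Formatters.JsonFormatter.SerializerSettings.PreserveReferencesHandling =
            PreserveReferencesHandling.Objects;
    }
}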

To address this need, I created a set of NuGet packages which you can add to a project hosting ASP.NET Web API services: AspnetWebApi2Helpers.Serialization and AspnetWebApi2Helpers.Serialization.Protobuf. Then simply add a few lines of code to the WebApiConfig.Register method in the App_Start folder of your web project.

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Configure formatters to handle cyclical references
        config.Formatters.JsonPreserveReferences();
        config.Formatters.XmlPreserveReferences();
        config.Formatters.ProtobufPreserveReferences(typeof(Category).Assembly);

        // Other code elided for clarity ...
    }
}

Neither the Json nor Xml formatters require passing types to the XxxPreserveReferences method (the API for the Xml serializer is simplified through the use of a custom media type formatter), but the Protobuf formatter does need the types to be passed, because of a limitation with the current version of the Protobuf-net formatter.  The client-side code is similarly straightforward.

// Configure formatter to handle cyclical references
var jsonFormatter = new JsonMediaTypeFormatter();
jsonFormatter.JsonPreserveReferences();
var xmlFormatter = new CustomXmlMediaTypeFormatter();
var protoFormatter = new ProtoBufFormatter();
protoFormatter.ProtobufPreserveReferences();

// Select a formatter
MediaTypeFormatter formatter = jsonFormatter;
var products = productsResponse.Content.ReadAsAsync<List<Product>>(new[] { formatter }).Result;

Lastly, I’ve created NuGet packages which target ASP.NET MVC 6 for use with ASP.NET 5 (vNext) and Visual Studio 2015.  The Json and Xml serialization helpers target both the full CLR and the core version of the CLR, which is package-deployable and cloud-friendly. (The MVC 6 Protobuf serialization helper depends on Protobuf-net, which does not currently support CLR Core.)  Here is a Startup class which includes configuration of MvcOptions by using the XxxPreserveReferences extension methods.

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Configure formatters to handle cyclical references
        services.AddMvc()
            .Configure<MvcOptions>(options =>
            {
                options.JsonPreserveReferences();
                options.XmlPreserveReferences();
#if ASPNET50
                options.ProtobufPreserveReferences(typeof(Product).Assembly);
#endif
            });
        services.AddWebApiConventions();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseMvc(routes =>
            routes.MapWebApiRoute("DefaultApi", "api/{controller}/{id?}"));
        app.UseWelcomePage();
    }
}

I’ve uploaded all the serialization helper packages to the NuGet Gallery (at this time the versions are pre-release).  To use them, simply add them to your projects using the NuGet Package Manager – just remember to select “Include Prerelease” and search for AspNetWebApi2Helpers.

AspNetWebApi2Helpers

For ASP.NET MVC 6 (vNext), check “Include Prerelease” and search for AspNetMvc6Helpers.

AspNetMvc6Helpers

To download the source code and samples, please visit the GitHub repositories for AspNet WebApi 2 Helpers (Asp.Net 4 / vCurrent) and AspNet Mvc 6 Helpers (Asp.Net 5 / vNext).

Enjoy!


MyGet: Continuous Integration for the Rest of Us

By this time, most .NET developers are familiar with NuGet.  It used to be that if you wanted to use some part of .NET, such as Entity Framework, you would add a reference to your project pointing to some DLL in the .NET Framework Class Library (FCL).  Those days are loooong gone.  Nowadays, if you want to use Entity Framework you must get it from NuGet, because it’s not even included in the FCL anymore.  The reason is simple: product teams at Microsoft can push out updates in much shorter cycles using NuGet than they ever could do with the .NET Framework itself.

So why not use NuGet to distribute libraries to other parts of your organization?  This process is straightforward and well-documented.  But what is not so well documented is how to generate NuGet packages and push them up to a feed whenever you check code into your repository and kick off a Continuous Integration build.  Fundamentally it’s not that difficult.  The process entails writing a build script that executes pack and push commands using the NuGet command line tool, which you can install using either a bootstrapper or Chocolatey.  If you are using Team Foundation Build Server, you can create a build definition to execute these NuGet commands, or for smaller projects you can use a cloud-hosted build controller with Visual Studio Online.

But what if you’re authoring an open-source project on GitHub or BitBucket and want the benefits of a build server with NuGet integration?  That’s where MyGet comes in!  For example, my Trackable Entities project is hosted on GitHub and whenever I push to the repo, MyGet kicks off a build and publishes all the packages to a CI NuGet feed. To get those packages, you need to add a NuGet package source pointing to the feed (in Visual Studio select Tools, Options, NuGet Package Manager, Package Sources).

trackable-ci-feed

After adding the CI package feed from MyGet, you’ll see the latest packages in the dialog that appears when you select Manage NuGet Packages after right-clicking on the project in the Visual Studio solution explorer.  These will show up when you select “Include Prerelease” from the drop down.

trackable-ci-packages

MyGet provides a hosted NuGet server with continuous integration build services that hook into several popular online Git repositories, including GitHub, BitBucket, CodePlex, and Visual Studio Online.  You can even configure builds to run unit tests and push releases to the public NuGet package gallery.  And for public feeds using less than 500 MB, the service is free.

I implemented continuous integration for my Trackable Entities open-source project using MyGet and thought it might be useful to share my experience here.  MyGet’s build services, while still in beta, are well documented, and excellent tech support is also provided.  The most reliable path for me was to write a custom build script containing NuGet commands for generating NuGet packages.  Along the way, I discovered some things about NuGet I didn’t know before.  In particular, I found out how to use tokenized nuspec files in conjunction with csproj files, which allows setting the NuGet package version based on the build version.  What you want to do is create a nuspec file with the same name and in the same directory as a csproj file.  The path to the csproj file is passed to the NuGet pack command, but the $PackageVersion$ token is used in the nuspec file to pull in the NuGet PackageVersion parameter.  For example, here is the nuspec file for my TrackableEntities.Client NuGet package. Notice how the $PackageVersion$ token is also used for the dependency on TrackableEntities.Common.  This way, all the package versions for a single build stay in sync, which is nice.

<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd">
    <metadata>
        <id>TrackableEntities.Client</id>
        <version>$PackageVersion$</version>
        <title>Trackable Entities Client</title>
        <authors>Tony Sneed</authors>
        <owners>Tony Sneed</owners>
        <licenseUrl>http://trackable.codeplex.com/license</licenseUrl>
        <projectUrl>http://trackable.codeplex.com</projectUrl>
        <iconUrl>http://tonysneed.com/images/trackable/tracking-small.png</iconUrl>
        <requireLicenseAcceptance>false</requireLicenseAcceptance>
        <description>Change-tracking utility for client applications that wish to transmit entities to a web service for batch updates.</description>
        <language>en-US</language>
        <tags>change-tracking entity-framework n-tier wcf web-api</tags>
        <dependencies>
            <group targetFramework=".NETPortable0.0-net45+sl5+win8+windowsphone8">
                <dependency id="Newtonsoft.Json" version="6.0.3" />
                <dependency id="TrackableEntities.Common" version="$PackageVersion$" />
            </group>
        </dependencies>
    </metadata>
</package>

Here is the build script with an install command for restoring NuGet packages and compiling the project, followed by the pack command for generating the NuGet package and setting the PackageVersion.

REM Set Variables:
set config="%Configuration%"
if %config% == "" (set config=Release)
set version="%PackageVersion%"

REM Restore:
Source\.nuget\nuget.exe install Source\TrackableEntities.Client\packages.config -OutputDirectory Source\packages -NonInteractive

REM Build:
mkdir Build\Source\Output\TrackableEntities.Client\portable-net45+sl5+win8+windowsphone8
%WINDIR%\Microsoft.NET\Framework\v4.0.30319\msbuild Source\TrackableEntities.Client\TrackableEntities.Client.csproj /p:Configuration="%config%" /m /v:M /fl /flp:LogFile=Build\Source\Output\TrackableEntities.Client\portable-net45+sl5+win8+windowsphone8\msbuild.log;Verbosity=Normal /nr:false

REM Package:
Source\.nuget\nuget.exe pack "Source\TrackableEntities.Client\TrackableEntities.Client.csproj" -symbols -o Build\Source\Output\TrackableEntities.Client\portable-net45+sl5+win8+windowsphone8 -p Configuration=%config%;PackageVersion=%version%

Notice the presence of the -symbols parameter, which produces a NuGet package containing just the source and symbols (pdb files) for the project, which are then pushed by MyGet to the symbols server at SymbolSource.org, provided you set up MyGet to push packages to SymbolSource.  This allows developers using the CI NuGet packages to debug the library and step into code.  To get this to work, you’ll need to add the correct symbol file location (http://srv.symbolsource.org/pdb/MyGet/tonysneed/f190ef34-3196-4bc2-b3a1-61b2172064e3) and specify a local symbols cache on your machine.  In addition you’ll need to go to Tools, Options, Debugging, General, then check “Enable source server support” and uncheck “Require source files to exactly match the original version” (which is needed because the checksums will differ).

trackable-ci-symbols

What’s nice about this story is that it enables developers using a NuGet package to easily step through the source code while debugging, while matching it to a specific NuGet package version and a specific commit in the Git repository.  You’ll want to pay close attention to the Modules window, which you can view while debugging by selecting Debug, Windows, Modules.  Make sure that the debug symbols are loaded from the correct location set for the symbols cache.  If not, you can disable automatic symbol loading, then load the symbols manually.

Thanks to MyGet, connecting CI builds to a NuGet feed and a symbol server is now attainable for the ordinary developer. Enjoy!


EF 6.x Code-First and Model-First with Trackable Entities 2.1

Until now Trackable Entities has required the Entity Framework Power Tools to reverse engineer Code-First model classes from an existing database.  But not long ago the Entity Framework team released the EF 6.x Tools for Visual Studio 2012 and 2013, so that you can use the same Entity Data Model wizard to generate context and entity classes using either the Code First or Model First approach.  The advantage of the consolidated designer is that you can pick and choose which tables you want, versus generating classes for all the tables in your database – which could take an inordinate amount of time if your database has a lot of tables.

trackable-tools

When the tools were first released, the ability to customize the T4 templates used for generating Code First classes was undocumented.  However, the team eventually published an article on how to customize the templates, and I was able to include customized T4 templates in the Visual Studio samples and templates for Trackable Entities v2.1.  You can install Trackable Entities right from within Visual Studio, by selecting Tools, Extensions and Updates, and then searching for “trackable” in the Online Visual Studio Gallery.

trackable-online-gallery

Once you’ve installed Trackable Entities, simply create a new project and select either “Trackable Web API Application” or “Trackable WCF Service Application” from the Trackable category under Visual C#. (You can also select “Trackable Web API Application with Repository and Unit of Work”.)  Then right-click the Service.Entities project and select “Add New Item” from the context menu.  From the Data category select “ADO.NET Entity Data Model,” type a name for the model and click Add.  You will be presented with the Entity Data Model Wizard, where you can select an option to either create Code First classes or add an EF Designer model.

trackable-edm-choose

From there you can select which database objects to include.  If you chose the “Code First from Database” option, you’ll get context and model classes for the tables you selected.  If, on the other hand, you selected “EF Designer from Database,” you’ll get an Entity Data Model designer (backed by an EDMX file), with entities and associations representing a conceptual model of your database.  This is often referred to as the Model-First (or Database First) approach.

Trackable Entities v2.1 supports this strategy by providing a set of custom T4 templates for generating client and service entities based on the Entity Data Model depicted in the diagram.  Many developers prefer Model-First over Code-First because it is relatively easy to keep the model in sync with the database by right-clicking on the diagram and selecting “Update Model from Database.”  To generate trackable service entities from the EDM diagram, simply right-click on the design surface of the model and select “Add Code Generation Item”.

trackable-model-first-code-gen

In the dialog that appears, expand the Trackable, Data category and select “Service Trackable Entities EF 6.x Model First Generator”. For the name, enter the same model name you specified when adding the EDM to your project.

trackable-model-first-generator

You will then be prompted to overwrite the existing .tt files in your project.  Respond “Yes” to overwrite the files, and you’ll get model classes which are customized to work with trackable entities.  These are identical to those generated for Code First using either the EF Power Tools or the EF 6.x Tools for VS.  For example, the Product entity appears as follows:

[JsonObject(IsReference = true)]
[DataContract(IsReference = true, Namespace = "http://schemas.datacontract.org/2004/07/TrackableEntities.Models")]
public partial class Product : ITrackable
{
    [DataMember]
    public int ProductId { get; set; }

    [DataMember]
    public string ProductName { get; set; }

    [DataMember]
    public Nullable<int> CategoryId { get; set; }

    [DataMember]
    public Nullable<decimal> UnitPrice { get; set; }

    [DataMember]
    public bool Discontinued { get; set; }

    [DataMember]
    public byte[] RowVersion { get; set; }

    [DataMember]
    public Category Category { get; set; }

    [DataMember]
    public TrackingState TrackingState { get; set; }

    [DataMember]
    public ICollection<string> ModifiedProperties { get; set; }

    [JsonProperty, DataMember]
    private Guid EntityIdentifier { get; set; }
}

Because the ADO.NET Entity Data Model designer is not compatible with the Client.Entities project, which is a Portable Class Library, you’ll need to add a .NET 4.5 Class Library project to the solution to generate client-side entities.  Simply repeat the process of adding a code generation item, but instead select “Client Trackable Entities EF 6.x Model First Generator”.  From the Client.Entities project you can then link to the model classes by right-clicking the project, then selecting Add Existing Item and choosing “Add As Link”.

Trackable entities contain three additional properties which enable tracking entity change-state across service boundaries: TrackingState (Unchanged, Modified, Added, Deleted), ModifiedProperties (a list of properties with changed values – used for partial table updates), and EntityIdentifier (a Guid used for correlating updated entities with original entities, so that changes can be merged back).  On the client, a ChangeTrackingCollection<T> automatically sets TrackingState when entities are added, modified or deleted, and caches deleted entities so they can be retrieved using the GetChanges method, along with other changed entities in the object graph – including child and reference entities.
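Put together, typical client-side usage looks something like the following sketch.  The Order properties and service agent here are placeholders; GetChanges and MergeChanges are the same calls used in the MVVM sample below.

using System.Linq;
using System.Threading.Tasks;
using TrackableEntities.Client;

public class OrderUpdater
{
    // Hypothetical service agent wrapping the Web API calls.
    public async Task SaveOrderChangesAsync(Order order, IOrderServiceAgent orderServiceAgent)
    {
        // Wrap the root entity in a change tracker.
        var changeTracker = new ChangeTrackingCollection<Order>(order);

        order.ShipCity = "Dallas";        // TrackingState becomes Modified
        order.OrderDetails.RemoveAt(0);   // removed detail is cached as Deleted

        // Send only what changed, then merge database-generated values back in.
        var changedOrder = changeTracker.GetChanges().SingleOrDefault();
        if (changedOrder == null) return;

        var updatedOrder = await orderServiceAgent.UpdateOrder(changedOrder);
        changeTracker.MergeChanges(updatedOrder);
    }
}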

A Portable Class Library is used for client entities, and for the client NuGet package, so that they can be used with any client, including WPF, Silverlight, Windows Store (tablet), Windows Phone, iOS and Android (via Xamarin).  And the libraries are designed in such a way that the client remains completely ignorant of how entities are persisted by the service.  Both XML and JSON serialization formats are supported, and there are multi-project templates for both WCF and ASP.NET Web API, which allow you to implement an n-tier solution in a fraction of the time you would otherwise spend.  In addition, the templates implement best practices for stateless services which persist changes using the async capabilities of Entity Framework and WCF or Web API.  There is an ApplyChanges extension method on DbContext which walks the entire object graph in all directions to inform the EF context of changes, which are all persisted in a single transaction when SaveChanges is called.
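On the service side the whole round trip boils down to a couple of calls.  Here is a rough sketch of a Web API controller action (NorthwindContext is a hypothetical context name, and the ApplyChanges extension method comes from the Trackable Entities EF package):

using System.Threading.Tasks;
using System.Web.Http;

public class OrderController : ApiController
{
    // Hypothetical trackable DbContext generated from a Northwind-style database.
    private readonly NorthwindContext _dbContext = new NorthwindContext();

    // PUT api/Order
    public async Task<IHttpActionResult> PutOrder(Order order)
    {
        // ApplyChanges walks the object graph in all directions and sets
        // EF entity states based on each entity's TrackingState.
        _dbContext.ApplyChanges(order);

        // All inserts, updates and deletes are persisted in a single transaction.
        await _dbContext.SaveChangesAsync();

        return Ok(order);
    }
}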


Real-World MVVM with Entity Framework and ASP.NET Web API

I just completed a sample application using Simple MVVM Toolkit together with Trackable Entities to build a real-world N-Tier solution: a WPF client with portable POCO entities that are automatically change-tracked and sent to an ASP.NET Web API service, which uses Entity Framework to perform asynchronous CRUD operations (Create, Retrieve, Update, Delete). The sample includes a Windows Presentation Foundation client, but the toolkit has a Visual Studio template for building a multi-platform client with portable view models that are shared across WPF, Silverlight, Windows Phone, Windows Store, iOS and Android.

Download the Simple MVVM Trackable Entities sample application here.

The nice thing about this sample is that it demonstrates how to build a complete end-to-end solution.  Client-side entities don’t care if they are sent to a WCF or Web API service and are marked up for serialization using both [DataContract] and [JsonObject] attributes.  Both WCF and Json.NET serializers accept attribute-free classes, but the attributes are included in order to handle cyclic references. The WPF client binds views to view models which expose entities as properties, and because ChangeTrackingCollection<T> extends ObservableCollection<T>, it is data-binding friendly.

View models have methods which call GetChanges on the change tracker, so that only changed entities are sent to the service.  GetChanges traverses the object graph in all directions, including 1-1, M-1, 1-M and M-M relations, and returns only entities which have been added, modified or deleted, saving precious bandwidth and improving performance.  Service operations return inserted and updated entities back to the client, which include database-generated values, such as identity and concurrency tokens.  View models then invoke MergeChanges on the change tracker to update existing entities with current values.

public async void ConfirmSave()
{
    if (Model == null) return;
    try
    {
        if (IsNew)
        {
            // Save new entity
            var createdOrder = await _orderServiceAgent.CreateOrder(Model);
            Model = createdOrder;
        }
        else
        {
            // Get changes, exit if none
            var changedOrder = ChangeTracker.GetChanges().SingleOrDefault();
            if (changedOrder == null) return;

            // Save changes
            var updatedOrder = await _orderServiceAgent.UpdateOrder(changedOrder);
            ChangeTracker.MergeChanges(updatedOrder);

            // Unsubscribe from collection changed on order details
            Model.OrderDetails.CollectionChanged -= OnOrderDetailsChanged;

            // End editing
            EndEdit();
        }

        // Notify view of confirmation
        Notify(ResultNotice, new NotificationEventArgs<bool>(null, true));
    }
    catch (Exception ex)
    {
        NotifyError(null, ex);
    }
}

The ConfirmSave method is from the OrderViewModelDetail class, which exposes a ResultNotice event to facilitate communication with OrderDetailView.xaml.  The code-behind for OrderDetailView handles ResultNotice by setting the view’s DialogResult, which closes the dialog and sets the result to true for confirmation or false for cancellation.

public partial class OrderDetailView : Window
{
    public OrderDetailView(Order order)
    {
        // Load the XAML so DataContext is available
        InitializeComponent();
        ViewModel = (OrderViewModelDetail)DataContext;
        ViewModel.Initialize(order);
        ViewModel.ErrorNotice += OnErrorNotice;
        ViewModel.ResultNotice += OnResultNotice;
    }

    public OrderViewModelDetail ViewModel { get; private set; }

    private void OnResultNotice(object sender, NotificationEventArgs<bool> eventArgs)
    {
        DialogResult = eventArgs.Data;
    }

    private void OnErrorNotice(object sender, NotificationEventArgs<Exception> eventArgs)
    {
        MessageBox.Show(eventArgs.Data.Message, "Error");
    }

    private void OnUnloaded(object sender, RoutedEventArgs e)
    {
        ViewModel.ErrorNotice -= OnErrorNotice;
        ViewModel.ResultNotice -= OnResultNotice;
    }
}

I enjoyed putting the sample together because it gave me the opportunity to revisit my MVVM toolkit and soak up some of the goodness I put into it.  For example, the ViewModelDetail base class implements IEditableObject by cloning and caching the entity when BeginEdit is called, and pointing the Model property of the view model to the cached entity.  Because the user is working off a separate entity, the UI showing the original entity does not reflect changes the user is making until EndEdit is called, when values are copied from the working copy back to the original.  CancelEdit simply points Model to the original and discards the edited version.  The view model base class also includes IsEditing and IsDirty properties, which are updated appropriately.
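The pattern roughly follows this shape.  This is a simplified sketch, not the toolkit’s actual base class; the real implementation copies values from the working copy back to the original on EndEdit:

using System;
using System.ComponentModel;

// Simplified sketch of the edit lifecycle described above.
public class EditableDetailViewModel<TModel> : IEditableObject
    where TModel : class, ICloneable
{
    private TModel _original;

    public TModel Model { get; private set; }
    public bool IsEditing { get; private set; }

    public EditableDetailViewModel(TModel model) { Model = model; }

    public void BeginEdit()
    {
        if (IsEditing) return;
        _original = Model;
        Model = (TModel)Model.Clone();  // the UI edits a working copy, not the original
        IsEditing = true;
    }

    public void EndEdit()
    {
        if (!IsEditing) return;
        // The toolkit copies values from the working copy back to _original here;
        // keeping the clone is a simplification for this sketch.
        _original = Model;
        IsEditing = false;
    }

    public void CancelEdit()
    {
        if (!IsEditing) return;
        Model = _original;              // discard the working copy
        IsEditing = false;
    }
}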

I also took advantage of support for async and await in .NET 4.5. For example, CustomerServiceAgent provides an async GetCustomers method, which is called by the view model to bind a list of customers to a combo box.  This transparently marshals code following await onto the UI thread to update the contents of the combo box.

public class CustomerServiceAgent : ICustomerServiceAgent
{
    public async Task<IEnumerable<Customer>> GetCustomers()
    {
        const string request = "api/Customer";
        var response = await ServiceProxy.Instance.GetAsync(request);
        response.EnsureSuccessStatusCode();
        var result = await response.Content.ReadAsAsync<IEnumerable<Customer>>();
        return result;
    }
}

Tinkering with XAML for the views allowed me the opportunity to solve some common challenges.  For example, the customer orders view has a pair of data grids that need to function in concert as master-detail, with the first grid showing orders for a selected customer, and the second grid showing details for the selected order.  I had to bind SelectedIndex on the orders grid to the SelectedOrderIndex property on the view model, and bind SelectedItem to the SelectedOrder property.  I got the details grid to synchronize by binding ItemsSource to SelectedOrder.OrderDetails.

Another interesting problem was how to populate a Products data grid combo box column in the details grid on OrderDetailView.xaml.  That required placing a Products property on the view model and using a RelativeSource binding on the ElementStyle and EditingElementStyle properties of the combo box column.

<DataGrid Grid.Row="2" Grid.Column="0" Height="140" VerticalAlignment="Top"
          ItemsSource="{Binding Model.OrderDetails}" AutoGenerateColumns="False" Margin="0,10,0,0" IsTabStop="True" TabIndex="3" >
    <DataGrid.Columns>
        <DataGridTextColumn Binding="{Binding OrderDetailId}" ClipboardContentBinding="{x:Null}" Header="OrderDetail Id"/>
        <DataGridComboBoxColumn SelectedValueBinding="{Binding ProductId}"
                            SelectedValuePath="ProductId"
                            DisplayMemberPath="ProductName"
                            Header="Product" Width="150">
            <DataGridComboBoxColumn.ElementStyle>
                <Style TargetType="ComboBox">
                    <Setter Property="ItemsSource" Value="{Binding RelativeSource={RelativeSource FindAncestor, AncestorType={x:Type Window}}, Path=DataContext.Products}"/>
                    <Setter Property="IsReadOnly" Value="True"/>
                </Style>
            </DataGridComboBoxColumn.ElementStyle>
            <DataGridComboBoxColumn.EditingElementStyle>
                <Style TargetType="ComboBox">
                    <Setter Property="ItemsSource" Value="{Binding RelativeSource={RelativeSource FindAncestor, AncestorType={x:Type Window}}, Path=DataContext.Products}"/>
                </Style>
            </DataGridComboBoxColumn.EditingElementStyle>
        </DataGridComboBoxColumn>
        <DataGridTextColumn Binding="{Binding UnitPrice, StringFormat=\{0:C\}}" ClipboardContentBinding="{x:Null}" Header="Unit Price"/>
        <DataGridTextColumn Binding="{Binding Quantity}" ClipboardContentBinding="{x:Null}" Header="Quantity"/>
        <DataGridTextColumn Binding="{Binding Discount, StringFormat=\{0:F\}}" ClipboardContentBinding="{x:Null}" Header="Discount"/>
    </DataGrid.Columns>
</DataGrid>

Here is a screen shot of the main view, which has a “Load” button for retrieving customers.  Selecting a customer from the combo box will retrieve the customer’s orders with details.

mvvm-trackable-main

Clicking “Create Order” will bring up the order detail view with a new Order. Clicking “Modify Order” will open the order detail view with the selected Order.  Clicking “Delete Order” will prompt the user to confirm the delete, then pass the id for the selected order to the delete operation on the Orders controller of the Web API service.

Here is a screen shot of the Add Order dialog.  The user interacts with the order details grid to add, modify or remove details from the order.  Clicking OK will pass a new or existing order to the Orders controller, together with new or changed details.  Because orders and details are change-tracked, they can be sent to a service for persistence in one round trip, so that Entity Framework can perform multiple inserts, updates and deletes within a single transaction.

mvvm-trackable-add

On the client side, Trackable Entities marks entities as Added, Modified or Deleted as individual properties are modified and as they are added or removed from a change tracking collection.  Change state is carried with the entity across service boundaries as a simple, lightweight TrackingState enumeration.  Then on the service side, ApplyChanges is called on the order to traverse the object graph, informing the DbContext of each entity’s change state.  That’s how the magic works.

Enjoy!


Become an N-Tier Ninja with Trackable Entities 2.0

Taking a cue from Julie Lerman and Scott Hanselman, I’ve decided to dub version 2.0 of my Trackable Entities framework the “Ninja Edition.” After releasing v 1.0 six months ago, I received several constructive suggestions on the project discussion and issues forums, which highlighted features I needed to add in order to make Trackable Entities a viable alternative to the now defunct Entity Framework Self-Tracking Entities. Now that I’ve added these items, Trackable Entities lets you become an N-Tier Ninja.

tracking-logo

Let me put it this way. With just a few clicks, Trackable Entities gives you a fully functioning, real-world N-Tier application, with client-side change tracking that is portable and platform agnostic (it can work on any client – from WPF and Windows Phone to iOS and Android), and server-side persistence of batch updates in a disconnected and stateless manner, within a single transaction and a single round trip. In other words, this thing is cool.


Can’t wait to get started? Check out the Online Tutorials, or have a look at the Getting Started Video to build a complete n-tier application with EF 6.1 and ASP.NET Web API 2.1 in less than 10 minutes.

 


Aside from the coolness factor, what makes Trackable Entities a killer framework is that it comes as a Visual Studio extension.  To get it from within Visual Studio, all you have to do is install it from the Tools, Extensions and Updates menu.  This will give you a number of Trackable Entities project templates.

CreateWebApiProject

Once you click on one of these babies, you get a multi-project Visual Studio solution. This in itself is an incredible time-saver.  All of the NuGet packages are there. All of the project references are correctly set. Customized T4 templates are inserted and ready to go.

WebApiSolutionExplorer

The next step is to reverse engineer an existing database into entities and mapping classes. For this aspect, we leverage the Entity Framework Power Tools. (I’m planning to replace this in the near future with the EF 6.1 Tools for Visual Studio, which also come with VS 2013 Update 2.)  The reason for this is that it’s easy to customize entity code generation by modifying a set of T4 templates.  On the client side, the entities are data-binding friendly and they play nicely with the change tracker. On the server side, the templates generate classes that are much simpler, without concern for change tracking or data binding. Because the classes are persistence-ignorant POCOs (Plain Old CLR Objects), they’re ideal for use with Domain Driven Design patterns (such as Repository and Unit of Work).

What this separation between client and server entities means is that the service does not have to worry about how change tracking takes place on the client.  All it cares about are two little properties: TrackingState (Unchanged, Modified, Added or Deleted) and ModifiedProperties (which properties have been changed). This is essentially what sets Trackable Entities apart from its evil twin, Self-Tracking Entities.  In order to apply changes to an Entity Framework DbContext, we don’t need to know anything about original property values, and all the change-tracking logic is encapsulated on the client, by adding the Trackable Entities Client NuGet package.
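In code, that service-facing contract is roughly this (a sketch; the actual definitions ship in the TrackableEntities.Common package):

using System.Collections.Generic;

// Change state carried across the wire with each entity.
public enum TrackingState
{
    Unchanged,
    Added,
    Modified,
    Deleted
}

// Implemented by the generated entities (see the Product class shown earlier).
public interface ITrackable
{
    TrackingState TrackingState { get; set; }
    ICollection<string> ModifiedProperties { get; set; }
}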

As a consequence there’s nothing to stop a JavaScript client, such as a SPA, or Single Page Application, from setting tracking state on entities and sending them off to a RESTful web service that uses Trackable Entities to perform batch updates.  Nice.

Then there’s hassle-free serialization. Trackable Entities sets everything up so you don’t have to worry about tweaking configuration settings or attaching the correct attributes.  This is worth mentioning because by default Entity Framework favors the least common use case, which is a client communicating directly with a database via EF.  DbContext is configured to use dynamic proxies for lazy loading, and these are not serializable.  Trackable Entities, on the other hand, provides a DbContext class with proxy generation turned off, and entities are decorated with both JsonObject and DataContract attributes, so that cyclical references do not break the serializer.

The other thing that makes n-tier just plain hard is the Entity Framework API for persisting changes in a disconnected fashion.  Julie Lerman and Rowan Miller discuss this in their excellent DbContext book, but it remains a complex problem to solve, especially when it comes to dealing with many-to-many relationships.  But this is where Trackable Entities shines.  All you have to do is write one line of code:  DbContext.ApplyChanges.  Pass in one or more entities, and Trackable Entities will walk the entire object graph in all directions, correctly setting entity state.  It uses recursion to traverse the graph and can apply a change several layers deep.  And here’s the kicker: it knows how to deal with many-to-many relationships.  This is a feature I added to version 2.0, and it uses the metadata workspace to reflect over the conceptual model and add or remove entities from the M-M relationship without inserting or deleting the entities themselves.

Another cool feature I added to v 2.0 is an extension method to DbContext called LoadRelatedEntities.  This is one I’m especially proud of.  It obviates the need to call LoadProperty on ObjectContext, which can be quite inefficient and can only be executed synchronously.  The reason you may need to load properties is to set reference properties on added entities.  For example, you create a new order with details, and you want to return the inserted order with Customer set on each order and Product set on each order detail.  What makes LoadRelatedEntities so gnarly (to use surfer-speak) is (a) it traverses the object graph, so you only have to call it once, (b) it fetches batches of entities by dynamically constructing Entity SQL statements, and (c) it can be executed asynchronously, utilizing EF 6’s async capabilities.

After performing updates, it is important to return updated entities to the client, so that entity properties are populated with database generated values, for example, identity keys and concurrency tokens.  The trick is knowing how to merge changes back into the original objects, when those objects do not contain any metadata concerning primary keys.  The solution is to use a unique identifier that allows you to correlate each updated entity to the corresponding original entity.  In v 2.0 I added a new MergeChanges method that does precisely this.  Now you can update a collection of entities bound to a data grid without affecting the order or messing up the data bindings.

There are separate installers for Visual Studio 2012 and 2013, with limited support for .NET 4.0.  The VS extension includes multi-project templates for both WCF and ASP.NET Web API, with item templates that provide both WCF service types and Web API controllers. And wherever possible operations and actions are asynchronous.

Lastly, the icing on the cake is full support for domain driven design with repository and unit of work patterns.  A separate template gives you a multi-project solution, with interfaces separated cleanly from implementation, so that you have complete independence from the underlying persistence framework, enabling you to future-proof your app against potential obsolescence and, should it become necessary, swap out EF for something else.

The beauty of Trackable Entities is that you get all this goodness for free, and you can accomplish in minutes what it would otherwise take hours or days to pull off.  To get started you can go through one of the Online Tutorials (Web API, WCF, or Repo / Unit of Work), or take a look at the Getting Started Video, to build an end-to-end n-tier application in no time with Trackable Entities, Entity Framework, and ASP.NET Web API or WCF.  Enjoy!
