Saturday 19 October 2013

Robust logging in applications using functional parameters

Just as logging in an application should have no side effects on the components being logged, the act of logging should never throw exceptions and cause an application to crash or a process to fail. Many times I have seen some or all of the following used as log statements:

log.Debug(string.Format("Value x: '{0}' - Value y: '{1}'", obj.Property, SomeFunction()));

My issue with this is twofold:

  • There is a performance issue: if the debug level is disabled the logging framework will not write to the log, but the string.Format call will still be executed.
  • If obj is null, or if retrieving the property value or calling SomeFunction throws an exception that is not caught correctly, the current process will fail.

The first issue is dealt with in one of two ways. Log4net (other logging frameworks are available) provides, for each of the common log levels, a method that takes a format string and a list of parameters. It will not format the string if the log level being written to is disabled.

log.DebugFormat("Value x: '{0}' - Value y: '{1}'", "a", 1);

If a format method is not available in your version of the logging framework, it should still provide a way of testing whether a specific logging level is enabled. This can also be quite useful if you want to build up a string from an arbitrary number of parameters or apply some logic when building the message.

if (log.IsDebugEnabled)
{
    var message = "foo" + (x == y ? "bar" : "wibble");

    log.Debug(message);
}

Whilst the format method avoids unnecessarily formatting the string, it still has to evaluate every parameter. With the if statement the whole block is skipped if the specific logging level is disabled. However, both approaches still need some additional logic to deal with null references or other exceptions being thrown.

To avoid having to constantly check whether logging a specific value will tear down your application, you can use a selection of extension methods that take functional parameters, allowing you to delay the execution of retrieving the data and ensure that, if the delegates are called, they are called safely. An example syntax might look something like:

log.DebugFormat("Value x: '{0}' - Value y: '{1}'", () => obj.Property, () => SomeFunction());

log.Debug(() => SomeComplexMethodToGetALogMessage());

In my implementation I first check that the logging level is enabled and then proceed in one of two ways. If logging with a single delegate that generates the message, I execute the delegate safely: if it succeeds I log its output, otherwise I log that there was an error producing the message, along with the specific exception. If logging with a string format and a list of delegates, I evaluate each of the parameters safely; if any of them fail I substitute a placeholder for the failed value(s), log what I can of the expected message, and log the failure(s) separately. An example implementation might look like:

public static void Debug(this ILog log, Func<string> message)
{
    if (log.IsDebugEnabled)
    {
        log.Debug(TryAndResolveArgument(message));
    }
}

public static void DebugFormat(this ILog log, string format, params Func<object>[] argFunctions)
{
    if (log.IsDebugEnabled)
    {
        log.Debug(string.Format(format, TryAndResolveArguments(argFunctions).ToArray()));
    }
}

private static IEnumerable<object> TryAndResolveArguments(IEnumerable<Func<object>> argFunctions)
{
    return argFunctions.Select(TryAndResolveArgument);
}

private static object TryAndResolveArgument(Func<object> argFunction)
{
    try
    {
        return argFunction.Invoke();
    }
    catch (Exception ex)
    {
        Trace.WriteLine("Failed to resolve logging parameter");
        Trace.WriteLine(ex);
    }

    return "{error logging value}";
}

Tuesday 26 February 2013

Auto mocking with NSubstitute and Windsor

I have now fully embraced the world of auto mocking and can now see past the illusion of magic to its wire-up cleanliness. I did not want to have to introduce an additional framework; I am a lover of NSubstitute and generally use Windsor as my IoC container of choice. The following explains how to create an auto mocking test framework using the two, although I am sure you can adapt it to other frameworks.

Auto mocking base class

I have taken the approach of using a base class because it is a better fit for how I do my testing, but this could easily be done in a self-contained component created in a test fixture. Whilst my real implementation does more than shown here, the main functionality is to create a WindsorContainer and register the LazyComponentAutoMocker (covered later) and the actual class under test. The derived class then has an opportunity to do some custom wire-up by overriding the SetUp method, after which an instance of the class under test is created via Windsor. Derived classes can get hold of registered components, or register their own, by calling GetDependency or RegisterDependency respectively.

[TestFixture]
abstract class AutoMockedTestFixtureBaseClass<TClassUnderTest>
{
    private WindsorContainer mockContainer;

    [SetUp]
    public virtual void AutoMockedTestFixtureBaseClassSetup()
    {
        SetUpContainer();

        SetUp();

        ClassUnderTest = mockContainer.Resolve<TClassUnderTest>();
    }

    private void SetUpContainer()
    {
        mockContainer = new WindsorContainer();
        mockContainer.Register(Component.For<ILazyComponentLoader>().ImplementedBy<LazyComponentAutoMocker>());
        mockContainer.Register(Component.For<TClassUnderTest>());
    }

    protected virtual void SetUp() { }

    protected TClassUnderTest ClassUnderTest { get; private set; }

    protected TDependency GetDependency<TDependency>()
    {
        return mockContainer.Resolve<TDependency>();
    }

    protected TDependency RegisterDependency<TDependency>(TDependency dependency)
    {
        mockContainer.Register(Component.For<TDependency>().Instance(dependency));
        return dependency;
    }

    protected void RegisterDependency<TService, TDependency>() where TDependency : TService
    {
        mockContainer.Register(Component.For<TService>().ImplementedBy<TDependency>());
    }
}

Lazy component auto mocker

The lazy component auto mocker uses Windsor's ILazyComponentLoader interface to create a dependency on demand via NSubstitute.

public class LazyComponentAutoMocker : ILazyComponentLoader
{
    public IRegistration Load(string key, Type service, IDictionary arguments)
    {
        return Component.For(service).Instance(Substitute.For(new[] { service }, null));
    }
}

Putting it together

The following is an example of how we might test a single class with a single dependency.

public interface ISomeDependency
{
    int SomeValue { get; }
}

public class SomeClass
{
    public SomeClass(ISomeDependency someDependency)
    {
        DependentValue = someDependency.SomeValue;
    }

    public int DependentValue { get; private set; }
}

We have defined SomeClass with a dependency of ISomeDependency. The class uses a single property from the dependency to wire up its own internal property.

class When_creating_SomeClass_with_a_given_dependency_value : AutoMockedTestFixtureBaseClass<SomeClass>
{
    protected override void SetUp()
    {
        GetDependency<ISomeDependency>().SomeValue.Returns(5);
    }

    [Test]
    public void Should_return_value_from_dependency()
    {
        ClassUnderTest.DependentValue.Should().Be(5);
    }
}

As you can see the test wire-up is very simple: a simple setup of the dependency using NSubstitute's extensions and a simple assertion on the class under test. Here I am using FluentAssertions to assert the value.

Saturday 27 October 2012

Loading a Certificate from the Certificate Store via a Custom Configuration Section

I have recently been doing a fair amount of work with Windows Identity Foundation (WIF). In doing so I have had to load up certificates, and in order to make the application flexible enough to deploy to different environments, use different certificates and follow certain standards, I wanted to load the certificates from the Windows Certificate Store.

I knew that some of the other frameworks in the core libraries, such as WCF, load certificates out of the certificate stores via configuration, so I wanted to emulate how they did that. After looking through some of the classes in the libraries, reflecting over the code and borrowing code generated when creating a WIF STS reference website, I came up with the following, which uses the CertificateReferenceElement.

interface IMySecurityConfiguration
{
    X509Certificate2 RequiredCertificate { get; } 
    X509Certificate2 OptionalCertificate { get; }
}

class MySecurityConfigurationSection : ConfigurationSection, IMySecurityConfiguration
{
    private const string ELEMENT_OPTIONALCERTIFICATE = "optionalCertificate";
    private const string ELEMENT_REQUIREDCERTIFICATE = "requiredCertificate";

    private ConfigurationPropertyCollection properties;

    protected override ConfigurationPropertyCollection Properties
    {
        get
        {
            return properties ?? (properties = new ConfigurationPropertyCollection
                                {
                                    new ConfigurationProperty(ELEMENT_REQUIREDCERTIFICATE,
                                                  typeof (CertificateReferenceElement), null,
                                                  ConfigurationPropertyOptions.IsRequired),
                                    new ConfigurationProperty(ELEMENT_OPTIONALCERTIFICATE,
                                                  typeof (CertificateReferenceElement), null,
                                                  ConfigurationPropertyOptions.None)
                                });
        }
    }

    private CertificateReferenceElement RequiredCertificateReference
    {
        get { return (CertificateReferenceElement) this[ELEMENT_REQUIREDCERTIFICATE]; }
    }

    private CertificateReferenceElement OptionalCertificateReference
    {
        get { return (CertificateReferenceElement) this[ELEMENT_OPTIONALCERTIFICATE]; }
    }

    public X509Certificate2 RequiredCertificate
    {
        get { return RequiredCertificateReference.LocateCertificate(); }
    }

    public X509Certificate2 OptionalCertificate
    {
        get
        {
            return OptionalCertificateReference.ElementInformation.IsPresent ?
                   OptionalCertificateReference.LocateCertificate() : null;
        }
    }
}

static class CertificateConfigurationExtensions
{
    public static X509Certificate2 LocateCertificate(this CertificateReferenceElement element)
    {
        return CertificateUtil.GetCertificate(element.StoreName, element.StoreLocation, element.X509FindType, element.FindValue, true);
    }
}

static class CertificateUtil
{
    public static X509Certificate2 GetCertificate(
        StoreName storeName, StoreLocation storeLocation, X509FindType findType, object findValue, bool throwIfMultipleOrNoMatch)
    {
        var certificateStore = new X509Store(storeName, storeLocation);
        X509Certificate2Collection certificates = null;
        try
        {
            certificateStore.Open(OpenFlags.ReadOnly);
            certificates = certificateStore.Certificates.Find(findType, findValue, false);
            if (certificates.Count == 1)
            {
                return new X509Certificate2(certificates[0]);
            }
            if (throwIfMultipleOrNoMatch)
            {
                if (certificates.Count == 0)
                {
                    throw new InvalidOperationException(
                        string.Format(
                            "Cannot find certificate: StoreName = '{0}', StoreLocation = '{1}', FindType = '{2}', FindValue - '{3}'",
                            storeName, storeLocation, findType, findValue));
                }
                else
                {
                    throw new InvalidOperationException(
                        string.Format(
                            "Found multiple certificates for:  StoreName = '{0}', StoreLocation = '{1}', FindType = '{2}', FindValue - '{3}'",
                            storeName, storeLocation, findType, findValue));
                }
            }
            else return null;
        }
        finally
        {
            if (certificates != null)
            {
                foreach (X509Certificate2 certificate in certificates)
                {
                    certificate.Reset();
                }
            }
            certificateStore.Close();
        }
    }
}

Using the configuration section

You would then configure the section as follows:

<mySecurityConfiguration>
    <requiredCertificate findValue="CN=foo" />
</mySecurityConfiguration>

For a full list of the properties have a look at the certificateReference element.

A note about loading a Private Key from the Certificate Store

In order to load a certificate with a private key, you need to give the user that the application will run as permission to read the private key. For more details see this blog post: How to give IIS access to private keys.
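As a quick sanity check you can probe at startup whether the process identity can actually read the private key. This is a sketch of my own (the method name is hypothetical, not part of the configuration code above); touching the PrivateKey property throws a CryptographicException when the permission has not been granted:

```csharp
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

// Hypothetical fail-fast startup check: surface a clear error if the
// private key is missing or the process identity cannot read it.
public static void EnsurePrivateKeyIsReadable(X509Certificate2 certificate)
{
    if (!certificate.HasPrivateKey)
    {
        throw new InvalidOperationException(
            string.Format("Certificate '{0}' has no private key.", certificate.Subject));
    }

    try
    {
        // Accessing PrivateKey forces the underlying key container to be opened.
        var unused = certificate.PrivateKey;
    }
    catch (CryptographicException ex)
    {
        throw new InvalidOperationException(
            string.Format("The current identity cannot read the private key of '{0}'.", certificate.Subject), ex);
    }
}
```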

Monday 15 October 2012

ASP.NET Desktop Membership Manager Application

Last month I started up an open source project on CodePlex with a couple of developer friends. The application helps developers and system administrators who release ASP.Net web projects to a given environment but have no way to manage the ASP.Net Membership accounts, either because the web application does not have user management yet or because there is no need for membership management within the application itself.

The project is still in beta, but the most functional version so far has been released as a ClickOnce application and allows you to administer users and roles. It needs work to fully support all the membership options, such as forgotten password questions, and to support more complex profile properties, but it has already been useful to me for creating roles and users even in its limited capacity.

There are other projects out there that try to address the pain of ASP.Net membership, but they are either incomplete or web based. Additionally, all the desktop solutions require you to copy your web application's Membership, Role and Profile configuration into the application's configuration file before you start it up. This application addresses that by allowing you to load the configuration at runtime.

If this is something that you would find useful I would love to get some feedback, either here or on the project website. Mostly I would be interested in how it works (or doesn't) with your configuration and what additional features would be useful.

ASP.Net Membership Manager on CodePlex

Thursday 11 October 2012

Migrating Authentication out of your ASP.Net applications into a Single Sign-on Secure Token Service

It happens every time: we create a new ASP.Net application, we talk about doing something better than ASP.Net membership, it never happens, and before you know it you are live using the SqlMembershipProvider database. Now you are trapped, because removing it would be a big change and the business requires you to focus on more important features.

BUT that need not be true. This post will go through the steps required to separate authentication from your application whilst maintaining your original membership database, giving you a Single Sign-On (SSO) Security Token Service (STS). An additional benefit is that you can use the same STS for all your applications and focus your efforts on writing websites, not on rewriting authentication and user management code.

In a later post I plan to show how you can use Azure's Access Control Service (ACS) to separate your applications from a single STS, giving you the flexibility of introducing additional STSs without having to change your applications.

Installing Windows Identity Foundation (WIF)

The first step is to pull the authentication process out of your application and move it to its own web application. There are a number of ways to do this, but the easiest is to install the Windows Identity Foundation (WIF) Framework and SDK, which plug into Visual Studio. This provides some UI tools that make things easier, as well as a number of assemblies that will be used to create and consume the security token. See the download links at the end of the article.

Creating your ASP.Net STS

To start off we need to create the website that will be our STS. The STS takes over the role of authentication so that your website can concentrate on doing what it should do. In this example I am going to create an MVC4 website, but if you are not bothered you can get Visual Studio to create a dummy ASP.Net project for you, which we will do later anyway, and just use that.

Create a new ASP.Net MVC4 Internet Web Site project

Create a new ASP.Net MVC4 Internet Website project and add it to your solution. If you select the blank website template it will not create the AccountController for you and you will have to do this by hand. My examples below assume you are using the Razor view engine, but feel free to use the view engine of your choice and adapt the code appropriately.

Change the login post action in the AccountController to do the following:
if (Membership.Provider.ValidateUser(model.UserName, model.Password))
{
    FormsAuthentication.SetAuthCookie(model.UserName, model.RememberMe);
    return Redirect(returnUrl);
}
Create a Custom Security Token service for your domain

The easiest way to do this is to get Visual Studio to generate the boilerplate code for you and then change it as you see fit. Most of my changes involved separating out the components so that I could do DI via Windsor.

  1. To generate the code, first make sure you comment out the “system.serviceModel” configuration section from your web.config, otherwise Visual Studio will think it is a WCF project.
  2. Right click on your website and select “Add STS reference…”. Enter the Uri to the root of your website, including the port number.
  3. Click “Next” and then click “Yes” if prompted.
  4. Select “Create a new STS project in the current solution”.
  5. Click “Next” and then “Finish”.
  6. Go to the newly created project and copy everything out of the project's App_Code folder into your STS website in an appropriate place.
  7. Delete the generated STS website.
  8. Edit the copied CustomSecurityTokenService and change the GetOutputClaimsIdentity method to the following, which will add your user's roles as claims to the generated token:
protected override IClaimsIdentity GetOutputClaimsIdentity(IClaimsPrincipal principal, RequestSecurityToken request, Scope scope)
{
    var outputIdentity = new ClaimsIdentity();
    var user = Membership.Provider.GetUser(principal.Identity.Name, true);
    outputIdentity.Claims.Add(new Claim(Microsoft.IdentityModel.Claims.ClaimTypes.Name, user.UserName));

    foreach (var role in Roles.Provider.GetRolesForUser(user.UserName))
    {
        outputIdentity.Claims.Add(new Claim(Microsoft.IdentityModel.Claims.ClaimTypes.Role, role));
    }

    return outputIdentity;
}
Wire up the CustomSecurityTokenService in the Web.config

Add the following to your STS Website’s web.Config

<appSettings>
    <add key="IssuerName" value="PassiveSigninSTS"/>
    <add key="SigningCertificateName" value="CN=STSTestCert"/>
    <add key="EncryptingCertificateName" value=""/>
</appSettings>

<system.web>
    <compilation debug="true" targetFramework="4.0">
        <assemblies>
            <add assembly="Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
        </assemblies>
    </compilation>
</system.web>
Use the CustomSecurityTokenService to generate a Token that can be posted back to the Relying website

Create a Token action on the AccountController, with a TokenModel and a view, that uses the CustomSecurityTokenService to present a token to the view, which is then posted back to the calling website via JavaScript.

public class TokenModel
{
    public string Wa { get; set; }
    public SignInResponseMessage Response { get; set; }
}

public ActionResult Token(TokenModel tokenModel)
{
    if (tokenModel.Wa == WSFederationConstants.Actions.SignIn)
    {
        var securityTokenService = new CustomSecurityTokenService(new CustomSecurityTokenServiceConfiguration());

        var requestMessage = (SignInRequestMessage)WSFederationMessage.CreateFromUri( Request.Url );

        SignInResponseMessage responseMessage = FederatedPassiveSecurityTokenServiceOperations
            .ProcessSignInRequest(requestMessage, User, securityTokenService);
        tokenModel.Response = responseMessage;

        return View(tokenModel);
    }
    if (tokenModel.Wa == WSFederationConstants.Actions.SignOut)
    {
        var signoutRequest = (SignOutRequestMessage)WSFederationMessage.CreateFromUri(Request.Url);

        try
        {
            FormsAuthentication.SignOut();
        }
        finally
        {
            SessionAuthenticationModule authenticationModule = FederatedAuthentication.SessionAuthenticationModule;
            if (authenticationModule != null)
            {
                authenticationModule.DeleteSessionTokenCookie();
            }
        }

        if (!string.IsNullOrWhiteSpace(signoutRequest.Reply))
        {
            return Redirect(signoutRequest.Reply);
        }
    }
    return null;
}
@model <Your namespace>.TokenModel
@{
    ViewBag.Title = "Token";
}
@using (Html.BeginForm(null, null, FormMethod.Post, new { @action = Model.Response.BaseUri }))
{
    <p>Token.</p>
    <input type="hidden" name="wa" value="@Model.Response.Action" />
    <input type="hidden" name="wresult" value="@Model.Response.Result" />
    <input type="hidden" name="wctx" value="@Model.Response.Context" />
    
    <noscript>
        <p>JavaScript is disabled please click Submit to continue.</p>
        <input type="submit" value="Submit" />
    </noscript>
}
<script type="text/javascript">window.setTimeout('document.forms[0].submit()', 0);</script>

Linking the STS to your existing membership database

Replace the Membership and Role provider configuration in the STS web.config, which will have been auto generated, with the configuration from your website's web.config. Remember to copy over the membership database connection string.
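The end result might look something like the following sketch. The provider settings, connection string name and application name here are placeholders; use whatever your existing website's web.config actually contains:

```xml
<connectionStrings>
    <!-- Hypothetical connection string: copy the real one from your website's web.config -->
    <add name="MembershipDb"
         connectionString="Data Source=.;Initial Catalog=aspnetdb;Integrated Security=True"
         providerName="System.Data.SqlClient" />
</connectionStrings>

<system.web>
    <membership defaultProvider="SqlMembershipProvider">
        <providers>
            <clear />
            <add name="SqlMembershipProvider" type="System.Web.Security.SqlMembershipProvider"
                 connectionStringName="MembershipDb" applicationName="/YourApplication" />
        </providers>
    </membership>
    <roleManager enabled="true" defaultProvider="SqlRoleProvider">
        <providers>
            <clear />
            <add name="SqlRoleProvider" type="System.Web.Security.SqlRoleProvider"
                 connectionStringName="MembershipDb" applicationName="/YourApplication" />
        </providers>
    </roleManager>
</system.web>
```

The applicationName must match your website's value exactly, otherwise the STS will be looking at a different set of users and roles in the same database.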

Get your website to use the STS as an authentication service

We should now be at a point where we can authenticate against the STS.

  1. First make your website and the STS use static ports; this will make things easier when working in a team environment.
  2. Add a reference to the Microsoft.IdentityModel assembly.
  3. Edit your website's config as per the following. You may find that most of this was already done for you when we added the STS reference, but you will need to update the address to point at the STS. There may also be some small differences depending on how complicated your web.config is.
<configSections>
    <section name="microsoft.identityModel" type="Microsoft.IdentityModel.Configuration.MicrosoftIdentityModelSection, Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
</configSections>

<system.web>
    <httpRuntime requestValidationMode="2.0" />
    <authorization>
        <deny users="?" />
    </authorization>
    <authentication mode="None" />
    <compilation debug="true" targetFramework="4.0">
        <assemblies>
            <add assembly="Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
        </assemblies>
    </compilation>
    <!-- The httpModules will need to be removed when deploying to IIS7 -->
    <httpModules>
        <add name="WSFederationAuthenticationModule" type="Microsoft.IdentityModel.Web.WSFederationAuthenticationModule, Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        <add name="SessionAuthenticationModule" type="Microsoft.IdentityModel.Web.SessionAuthenticationModule, Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </httpModules>
</system.web>

<system.webServer>
    <modules>
        <add name="WSFederationAuthenticationModule" type="Microsoft.IdentityModel.Web.WSFederationAuthenticationModule, Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" preCondition="managedHandler" />
        <add name="SessionAuthenticationModule" type="Microsoft.IdentityModel.Web.SessionAuthenticationModule, Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" preCondition="managedHandler" />
    </modules>
</system.webServer>

<microsoft.identityModel>
    <service>
        <audienceUris>
            <add value="http://<website address and port number>/" />
        </audienceUris>
        <federatedAuthentication>
            <wsFederation passiveRedirectEnabled="true" issuer="http://<STS address and port number>/Account/Token" realm="http://<website address and port number>/" requireHttps="false" />
            <cookieHandler requireSsl="false" />
        </federatedAuthentication>
        <applicationService>
            <claimTypeRequired>
                <!-- If you do not need roles or a name in the claim then remove the following -->
                <claimType type="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name" />
                <claimType type="http://schemas.microsoft.com/ws/2008/06/identity/claims/role" />
            </claimTypeRequired>
        </applicationService>
        <issuerNameRegistry type="Microsoft.IdentityModel.Tokens.ConfigurationBasedIssuerNameRegistry, Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35">
            <trustedIssuers>
                <!--
                    You will need the thumbprint of the certificate that was created and put in your user store
                    when Visual Studio generated the STS earlier on.
                    Be careful when copying the thumbprint, as you may end up with an invisible invalid character
                    at the front of the thumbprint that you will have to remove.
                -->
                <add thumbprint="9A7465C17A42099FFB0C061F9D9A7BCBCD440908" name="http://<STS address and port number>/" />
            </trustedIssuers>
        </issuerNameRegistry>
    </service>
</microsoft.identityModel>
Change your website's logout method to do the following
FederatedAuthentication.SessionAuthenticationModule.SignOut();

WSFederationAuthenticationModule authModule = FederatedAuthentication.WSFederationAuthenticationModule;
string signoutUrl = WSFederationAuthenticationModule.GetFederationPassiveSignOutUrl(authModule.Issuer, authModule.Realm, null);
return Redirect(signoutUrl);

Fire up both websites and try navigating to your website. If everything is configured correctly you should be redirected to your STS, where you can authenticate against the existing membership database, and you should then be redirected back to your website with a token containing claims for your username and roles.

Additional tasks

If you have got here then hopefully everything above worked. Here are some additional tasks that I had to do which are specific to my domain, but they are worth considering.

General clean up: The above code works, but it is not the prettiest and was simplified for brevity. Consider moving things around to make everything more testable and readable. I created wrappers for certain things so that I could use dependency injection.

Styling and user management: All the above does is move authentication over to the STS; there is no user management. There is no reason why that cannot stay in the main website, but this is a decision you will need to make.

Consider how you will deploy: You will need to consider how you are going to deploy this to other development machines and environments. How are you going to manage certificates and so on?

Development and testing: To facilitate testing and remove the need to have both websites running in a development environment, I created a Test STS that allowed me to log in as a variety of different users with different claims, which I then hosted on an internal server that the whole team could access. Instead of having to enter a username and password, all you did was change the name of the user if you wanted and tick the roles you wanted in the claim.

Abstracting from a single STS using Azure’s ACS

To further separate your website you can also consider using the Access Control Service (ACS) which Microsoft hosts on Azure. The service allows you to sign in from a variety of different Security Token Services and maps the claims from each of these STS domains to claims your own domain can recognise. You could of course roll your own, but either of these two topics is a post in its own right.

Further Reading

MSDN: A Guide to Claims-Based Identity and Access Control (2nd Edition)

Windows Identity Foundation

Windows Identity Foundation SDK

Federated Identity with Multiple Partners


Federated Identity with Multiple Partners and Windows Azure Access Control Service

Microsoft.IdentityModel Namespace

Thursday 16 August 2012

Public WCF contract legacy testing

I work on a project that contains a number of public facing WCF interfaces. Recently we were doing a pre-release to our test environment when we found that our public contracts were not working: the interfaces were fine, but some of the data members of the contracts were not being serialized. On inspecting the WSDL we realised that some of the namespaces had changed; it turned out a rename had been done on some of the namespaces and some components had been shuffled around into more logical namespaces.

The main issue is that we were using the default namespaces provided by the WCF API rather than hand crafting our public facing namespaces to be more specific to the business, but irrespective of that we had no way of knowing that we had potentially broken a public facing contract.

Solution

Breaking a public interface is not necessarily a bad thing, but it is if you don't know that you have done it. What we needed was a series of unit tests that could be run by our CI server and flag up errors when a contract is broken. This would allow the team to discuss the breaking change and determine whether the change is required or whether there is another way of implementing it.

Given the following service and data contracts, I will show how to create a series of tests that will break if the contract changes. No custom namespace has been applied, so the default .Net namespaces will be used, but the same process can be used for service contracts with custom namespaces.

namespace Foo
{
    [ServiceContract]
    public interface IPublicInterface
    {
        [OperationContract]
        GetInfoResponse GetInfo(GetInfoRequest request);
    }

    [DataContract]
    public class GetInfoRequest
    {
        [DataMember]
        public string RequestData { get; set; }
    }

    [DataContract]
    public class GetInfoResponse
    {
        public GetInfoResponse(string result)
        {
            Result = result;
        }

        [DataMember]
        public string Result { get; private set; }
    }
}

As you can see it is a very simple interface using simple request and response objects, each with a single DataMember. Now, assuming this interface has been made public, we need to apply a safeguard that will warn us if the contract is broken. To do this we create a suite of tests that run a legacy copy of the service contract against our real service contract. .Net makes this easy by allowing you to customise the namespace and name via the ServiceContract and DataContract attributes. Ultimately we are trying to create a client side contract that matches the currently live WSDL of our public service. The following tests make use of my WCF MockServiceHostFactory, the NSubstitute mocking framework and the FluentAssertions assertion framework.

namespace Foo.Tests
{
    [TestFixture]
    class When_calling_service_using_the_version_1_service_contract
    {
        private ServiceHost publicInterfaceServiceHost;
        private ChannelFactory<ILegacyPublicInterface_v1> legacyPublicInterfaceChannelFactory;
        private IPublicInterface publicInterfaceMock;
        private ILegacyPublicInterface_v1 publicInterfaceChannel;

        [SetUp]
        public void SetUp()
        {
            publicInterfaceMock = Substitute.For<IPublicInterface>();
            publicInterfaceMock.GetInfo(Arg.Any<GetInfoRequest>()).Returns(new GetInfoResponse("Success"));

            publicInterfaceServiceHost = MockServiceHostFactory.GenerateMockServiceHost(publicInterfaceMock, new Uri("http://localhost:8123/PublicInterface"));
            publicInterfaceServiceHost.Open();

            legacyPublicInterfaceChannelFactory = new ChannelFactory<ILegacyPublicInterface_v1>(new BasicHttpBinding(), new EndpointAddress("http://localhost:8123/PublicInterface"));
            publicInterfaceChannel = legacyPublicInterfaceChannelFactory.CreateChannel();
        }

        [TearDown]
        public void TearDown()
        {
            legacyPublicInterfaceChannelFactory.Close();
            publicInterfaceServiceHost.Close();
        }

        [Test]
        public void Should_deserialize_request_object_at_the_service()
        {
            publicInterfaceChannel.GetInfo(new GetInfoRequest_v1 { RequestData = "Foo" });

            publicInterfaceMock.Received().GetInfo(Arg.Is<GetInfoRequest>(x => x.RequestData == "Foo"));
        }

        [Test]
        public void Should_deserialize_the_response_object_at_the_client()
        {
            GetInfoResponse_v1 response = publicInterfaceChannel.GetInfo(new GetInfoRequest_v1());

            response.ShouldHave().AllProperties().EqualTo(new GetInfoResponse_v1 { Result = "Success" });
        }
    }

    [ServiceContract(Namespace = "http://tempuri.org/", Name = "IPublicInterface")]
    interface ILegacyPublicInterface_v1
    {
        [OperationContract]
        GetInfoResponse_v1 GetInfo(GetInfoRequest_v1 request);
    }

    [DataContract(Namespace = "http://schemas.datacontract.org/2004/07/Foo", Name = "GetInfoRequest")]
    class GetInfoRequest_v1
    {
        [DataMember]
        public string RequestData { get; set; }
    }

    [DataContract(Namespace = "http://schemas.datacontract.org/2004/07/Foo", Name = "GetInfoResponse")]
    class GetInfoResponse_v1
    {
        [DataMember]
        public string Result { get; set; }
    }
}

If we were now to move any of the components involved in the public contract to a different namespace, rename the interface or do anything else that would break the contract, these tests would fail, indicating that we need to either revert the code, fix the contract by providing custom names and namespaces, or inform our clients that there will be a breaking change in the next release.
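As a sketch of the "fix the contract" option: if the GetInfoRequest class were later moved to a different namespace (the Foo.Messages namespace below is hypothetical), the original wire-level contract could be preserved by pinning the name and namespace explicitly, using the same attribute technique as the legacy test contracts.

```csharp
// Hypothetical refactor: the class now lives in Foo.Messages, but the
// attributes pin the original wire-level name and namespace so the
// legacy contract tests continue to pass.
namespace Foo.Messages
{
    [DataContract(Namespace = "http://schemas.datacontract.org/2004/07/Foo", Name = "GetInfoRequest")]
    public class GetInfoRequest
    {
        [DataMember]
        public string RequestData { get; set; }
    }
}
```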

Testing different contract versions

Not all contract changes have to break a contract. In some cases we might want to add an additional field to a request or response object or add a new operation to the contract. Given we need to add a new field called Message to the GetInfoResponse object the following test provides coverage for the new contract whilst still maintaining the test for the previous version of the contract.

[DataContract]
public class GetInfoResponse
{
    public GetInfoResponse(string result, string message)
    {
        //...
        Message = message;
    }

    //...

    [DataMember]
    public string Message { get; private set; }
}

[TestFixture]
class When_calling_service_using_the_version_2_service_contract
{
    //...

    private ChannelFactory<ILegacyPublicInterface_v2> legacyPublicInterfaceChannelFactory;
    private ILegacyPublicInterface_v2 publicInterfaceChannel;

    [SetUp]
    public void SetUp()
    {
        //...

        publicInterfaceMock.GetInfo(Arg.Any<GetInfoRequest>()).Returns(new GetInfoResponse("Success", "Bar"));

        legacyPublicInterfaceChannelFactory = new ChannelFactory<ILegacyPublicInterface_v2>(new BasicHttpBinding(), new EndpointAddress("http://localhost:8123/PublicInterface"));
        publicInterfaceChannel = legacyPublicInterfaceChannelFactory.CreateChannel();
    }

    [Test]
    public void Should_deserialize_the_response_object_at_the_client()
    {
        GetInfoResponse_v2 response = publicInterfaceChannel.GetInfo(new GetInfoRequest_v1());

        response.ShouldHave().AllProperties().EqualTo(new GetInfoResponse_v2 { Result = "Success", Message = "Bar" });
    }
}

[ServiceContract(Namespace = "http://tempuri.org/", Name = "IPublicInterface")]
interface ILegacyPublicInterface_v2
{
    [OperationContract]
    GetInfoResponse_v2 GetInfo(GetInfoRequest_v1 request);
}

[DataContract(Namespace = "http://schemas.datacontract.org/2004/07/Foo", Name = "GetInfoResponse")]
class GetInfoResponse_v2
{
    [DataMember]
    public string Result { get; set; }

    [DataMember]
    public string Message { get; set; }
}

We now have a version two of the public interface and response object, which can be used to test the new contract as in the previous test, while leaving the original legacy contract untouched; the first version of the contract is therefore still tested for compatibility.

Summary

It is very important that you are happy with your public interface before you give it to your clients, as once it is in use it is very difficult to change, and the more clients there are the bigger the impact. To minimise the chance of accidentally changing the contract via tools such as ReSharper, you should provide custom names and namespaces for all of your contract components, and in my opinion it looks much more professional to a consuming third party to see your company's name as the root URL in the WSDL.
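As a sketch of what that might look like (the URL and names below are placeholders, not a real scheme), the custom names and namespaces can be applied directly on the contract attributes:

```csharp
// Hypothetical example: pin every public-facing name and namespace so
// refactoring tools cannot silently change the WSDL.
[ServiceContract(Namespace = "http://schemas.example.com/publicservice/2013/10", Name = "PublicInterface")]
public interface IPublicInterface
{
    [OperationContract]
    GetInfoResponse GetInfo(GetInfoRequest request);
}

[DataContract(Namespace = "http://schemas.example.com/publicservice/2013/10", Name = "GetInfoRequest")]
public class GetInfoRequest
{
    [DataMember]
    public string RequestData { get; set; }
}
```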

And finally, these tests will not protect you from everything, but they do provide a safety net.

What these tests do:

  • Check for breaking changes in your public contracts at the WCF layer
  • Allow you to test legacy versions of the contracts

What these tests don’t do:

  • Force your real services to support legacy objects
  • Force you to have clear easy to use contracts

Wednesday 4 April 2012

Passing mocked parameters between AppDomains

Recently I have been doing a lot of acceptance testing, which has become more and more complex of late. One of the issues was separating out our systems, which all run in the same process in the context of our acceptance tests. These are all NUnit tests which use StoryQ and are run by the ReSharper and TeamCity NUnit test runners. To achieve the separation, each of the systems is run in its own AppDomain, but where previously certain parts of the systems were mocked out, we are now unable to pass the mocks, or any other object, across the AppDomain boundaries unless it is either serializable or inherits from MarshalByRefObject.

Fortunately there is a way. The following shows how it is possible to pass mocks across the AppDomain boundary with no change to the implementation, but with certain restrictions. I use the NSubstitute mocking framework, which uses Castle's DynamicProxy framework under the covers, but the technique should also apply to other mocking frameworks.

Demonstration

I have created a test fixture that gives an example of what can and cannot be achieved with mocks. The following shows:

  • Test setup which creates an AppDomain and creates a MarshalByRefObject that will consume our mock object
  • Test teardown which unloads the AppDomain so everything is nice and clean for the next run.
  • Introduces the classes and interfaces that will be used in the tests to demonstrate passing the mocks.
NB. All the assertions are done using the FluentAssertions framework.

[TestFixture]
class When_passing_mocks_to_an_app_domain
{
    private MarshalByRefType marshalByRefObject;
    private AppDomain testAppDomain;
 
    [SetUp]
    public void SetUp()
    {
        string exeAssembly = GetType().Assembly.FullName;
 
        var ads = new AppDomainSetup
        {
            ApplicationBase = Environment.CurrentDirectory,
            DisallowBindingRedirects = false,
            DisallowCodeDownload = true,
            ConfigurationFile = AppDomain.CurrentDomain.SetupInformation.ConfigurationFile
        };
 
        testAppDomain = AppDomain.CreateDomain("AD #2", null, ads);
 
        marshalByRefObject = (MarshalByRefType)testAppDomain.CreateInstanceAndUnwrap(exeAssembly, typeof(MarshalByRefType).FullName);
 
    }
 
    [TearDown]
    public void TearDown()
    {
        AppDomain.Unload(testAppDomain);
    }
}
 
public interface IThing
{
    string ValueProperty { get; }
    string ValueMethod(int i);
    Foo ComplexType();
    IBar ComplexInterface();
}
 
public class Foo
{
    public string Value()
    {
        return "Foo";
    }
}
 
public interface IBar
{
    string Value();
}
 
public class MarshalByRefType : MarshalByRefObject
{
    public string GettingASimpleString(IThing thingMock)
    {
        return thingMock.ValueMethod(1);
    }
 
    public string GettingAStringFromAComplexType(IThing thingMock)
    {
        return thingMock.ComplexType().Value();
    }
 
    public string GettingAStringFromAComplexInterface(IThing thingMock)
    {
        return thingMock.ComplexInterface().Value();
    }
}

Proving our regular mocks don’t work

Here we have an example of how we would normally create a mock, which is then passed to our MarshalByRefObject.

[Test]
public void Should_pass_mock()
{
    var stringThing = Substitute.For<IThing>();
 
    stringThing.ValueMethod(1).Returns("wibble");
 
    //This is going to throw an exception
    var result = marshalByRefObject.GettingASimpleString(stringThing);
 
    result.Should().Be("wibble");
}

Running the above generates a System.Runtime.Serialization.SerializationException

System.Runtime.Serialization.SerializationException : Type 'NSubstitute.Core.CallRouter' in Assembly 'NSubstitute, Version=0.1.3.0, Culture=neutral, PublicKeyToken=92dd2e9066daa5ca' is not marked as serializable.

This is because our mock is neither serializable nor does it inherit from MarshalByRefObject. With NSubstitute we have a nice little workaround: we can tell NSubstitute to create a mock that implements our interface and inherits from MarshalByRefObject.

var stringThing = Substitute.For<IThing, MarshalByRefObject>();

Passing our mocked proxy

The following test is the same as our previous test, except that it generates the mock using the above technique; this time it passes.

[Test]
public void Should_pass_a_mock_that_has_a_mocked_string_method()
{
    var stringThing = Substitute.For<IThing, MarshalByRefObject>();
 
    stringThing.ValueMethod(1).Returns("wobble");
 
    var result = marshalByRefObject.GettingASimpleString(stringThing);
 
    result.Should().Be("wobble");
}

Lets try something more adventurous

Dealing with simple types is relatively straightforward, but what about mocking more complex objects? The following tries to use a method that returns a more complex concrete type.

[Test]
public void Should_pass_a_mock_that_has_a_mocked_complex_method()
{
   var stringThing = Substitute.For<IThing, MarshalByRefObject>();
   stringThing.ComplexType().Returns(new Foo());
 
   //This is going to fail because Foo is not Serializable
   var result = marshalByRefObject.GettingAStringFromAComplexType(stringThing);
 
   result.Should().Be("Foo");
}

This fails for the same reason as our first test did, except this time it fails inside our new AppDomain. It is worth noting when it fails: not when passing the object in, but when trying to use Foo. This is because our MarshalByRefObject instance in the test is a proxy and only passes things across to the other AppDomain when it needs them. Unfortunately there is not much we can do in this situation without changing the structure of our code, as we will see.
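For completeness, the structural change that would let the concrete type cross the boundary is marking it as serializable, so instances are copied by value into the other AppDomain. This is a sketch, not part of the original example:

```csharp
// Hypothetical fix: with [Serializable], instances of Foo are serialized
// and copied into the other AppDomain rather than proxied, so the
// GettingAStringFromAComplexType call would no longer throw.
[Serializable]
public class Foo
{
    public string Value()
    {
        return "Foo";
    }
}
```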

Here we have an example similar to the above, except this time we use a method that returns an interface. By default NSubstitute will create a mock for any method on a mock that returns an interface; unfortunately we cannot use these auto-mocks, as we would be in the same situation as the previous example, so we have to mock the method's return value by hand.

[Test]
public void Should_pass_a_mock_that_has_a_mocked_complex_method_with_interface_using_marshalbyrefobject()
{
    var stringThing = Substitute.For<IThing, MarshalByRefObject>();
    var complexInterface = Substitute.For<IBar, MarshalByRefObject>();
    complexInterface.Value().Returns("Bar");
    stringThing.ComplexInterface().Returns(complexInterface);
 
    var result = marshalByRefObject.GettingAStringFromAComplexInterface(stringThing);
 
    result.Should().Be("Bar");
}

Limitations and things to watch for

As you have seen, it is possible to create mocks that can be passed across the AppDomain boundary, but there are some caveats:

  1. You need your mock to inherit from the MarshalByRefObject base class, which rules out mocking concrete types that are not already serializable.
  2. Extra legwork is needed to mock more complex types.
  3. If you have not mocked something correctly, you will only find out at the point where you come to use it, which can make debugging confusing.
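The third caveat can be illustrated with a hypothetical test reusing the fixture above: if a method is never explicitly stubbed, NSubstitute auto-returns a plain substitute that cannot cross the boundary, and the failure only surfaces when the value is first used inside the other AppDomain, far from the point where the mock was set up.

```csharp
[Test]
public void Unmocked_members_fail_only_at_the_point_of_use()
{
    var thing = Substitute.For<IThing, MarshalByRefObject>();
    // Note: ComplexInterface() is never stubbed, so NSubstitute auto-returns
    // a substitute that is neither serializable nor a MarshalByRefObject.

    // Passing the mock itself succeeds; the SerializationException is only
    // thrown when the result of ComplexInterface() is marshalled back.
    Assert.Throws<SerializationException>(
        () => marshalByRefObject.GettingAStringFromAComplexInterface(thing));
}
```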