• Speaking at SharePoint Saturday Montréal 2016

    This year again I have the opportunity to speak at SPS Montréal.

    I also had the chance to help organize the event this year. It’s been an interesting experience so far, and I’d like to thank Serge, Nico, Luc, Seb and Fabrice for their involvement in the project; it’s much more work than you’d think!

    This time I’ll talk about setting up your release pipeline for SharePoint/Office 365 development in Visual Studio Team Services (formerly Visual Studio Online). The session will be in French.

    We’ll see how you can automate all the things to make your development team more productive and more efficient. For that we’ll see how to set up build and release definitions, and how you can leverage both Azure and on-premises environments.

    Note: you’ll find a lot of very good sessions during this free event happening on April 2nd. Come join us!

    See you on Saturday!

    • 30/3/2016
  • Controlled update of SQL database schemas with Entity Framework Code First in a continuous delivery process


    Entity Framework is a well-known technology in the ORM domain for the .NET world.

    Note: ORM stands for Object-Relational Mapping; it’s the layer that sits between the data persistence store and the more “high level” code.

    When you start a new Entity Framework project you have a few different ways to do it, one of which is “code first”. It means your code will drive the SQL schema evolution (which simplifies a lot of things and lets you focus on business logic rather than technical implementation).

    When you choose this approach you have a few methods to run the update of your schema:

    - NuGet or DNX console: a developer has to run the update manually during deployment

    - Auto migration: Entity Framework will update the database automatically at runtime, when the application starts, if it finds differences. This is really useful for small projects but it brings some downsides: performance is degraded during application startup, and you’re no longer in control of “when” the update is going to happen.
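    For the manual route, the command could look like this (the project path is a placeholder, and this assumes the “ef” command is registered in your project.json; in a classic .NET project you’d run Update-Database from the NuGet Package Manager Console instead):

```powershell
# DNX project: run the Entity Framework update against a given project
dnx -p ".\src\MyProject\" ef database update
```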

    Because we’re developers and really lazy people, and because we want to avoid errors and oversights and save everybody time, we’re going to automate the process while keeping it under control.

    The following examples will be for DNX projects, but classic .NET projects are really similar; just the commands change a little bit.

    Note: this post assumes you’ve set up your DbContext following something similar to that article.

    Build or release process?

    In our case we’re using Visual Studio Team Services for our software development. Among the provided services are build services and release management services.

    In our case we have a real decoupling between our build and our release processes.

    Build process is responsible for:

    - Transforming source code to deliverables (compilation, transpilation…)

    - Transmitting additional release artifacts (scripts)

    - Running unit tests

    Release management process is responsible for:

    - Deploying deliverables to target environments

    - Data migration

    - Environments setup and configuration

    - …

    That explains why it makes sense in our case to put the database update process in the release process. Depending on how you set up your continuous delivery process it may differ a little bit, and you may have to adapt the examples.

    Configuring definitions in VSTS

    Script that will run the update

    The first thing we’re going to want to add to our “code” is a script which will tell DNX to run the database update, and a configuration source file which allows configuring the DbContext.

    It’s your lucky day because I’ve already done that for you, here are my gists
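    If you can’t access the gists, here is a minimal sketch of what the update script could look like (the script name, parameter name, environment variable, and project path are all assumptions for illustration, not the actual gist content):

```powershell
# Update-Database.ps1 - minimal sketch with hypothetical names
param(
    [Parameter(Mandatory=$true)]
    [string]$ConnectionString
)

# Expose the connection string through an environment variable so the
# configuration source file can feed it to the DbContext
$env:EF_CONNECTION_STRING = $ConnectionString

# Ask DNX to build the sources on the fly and run the Entity Framework update
dnx -p ".\src\MyProject\" ef database update
```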

    Getting the sources and the scripts

    What must be understood is that DNX will use the source code and build it live in order to run the database update; I’m not aware of any way to do that from an already compiled DNX library (really just a NuGet package) available now.

    What it means is we’ll need the sources to do the Entity Framework database update. Sources that are already at hand if you’re doing that in the build process. In our case though, we’re updating the database schema from the release process, so you’ll need to bring your sources along with your build artifacts.

    To do so edit your build process and in “copy files” add two lines:

    - src\** for sources (our sources are in the folder src, which is a standard for DNX projects)

    - *.ps1 to bring the script we just added

    Updating the schema during deployment

    The last step is to update your deployment process to tell it to run the Entity Framework database update. To do so just add a PowerShell step and configure it to run the script we just added.

    As an argument, give it the connection string you want to use for the database to be updated.
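    For instance, the PowerShell step configuration could look like this (the script path, build definition name, and connection string are placeholders):

```
Script filename: $(System.DefaultWorkingDirectory)/MyBuildDefinition/drop/Update-Database.ps1
Arguments:       -ConnectionString "Server=myserver;Database=MyDb;Integrated Security=True"
```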


    As you can see it’s pretty easy to automate and master the database update of an Entity Framework Code First project with a release pipeline. And that works with or without DNX involved.

    I hope this post will save you time and deployment issues.

    • 3/3/2016
  • Unit tests using xUnit and Visual Studio Team Services for DNX projects



    DNX will soon be released, and there’ll be a growing number of developers wanting to write unit tests to improve the quality of their developments.

    Ideally you’d like to run these unit tests in a continuous integration process in order to make sure you didn’t introduce any regressions, bugs, or degraded performance.

    In .NET world you have many tools to write unit tests:

    - MSTest/VSTest

    - NUnit

    - xUnit


    (a non-exhaustive list)

    In our case we chose to use xUnit because it is widely used and supported by the community, and it’s one of the few to support DNX at the moment.

    Another thing you must have noticed from my previous blog posts: we’ve been using the tools provided by VSTS (VSO) a lot, in particular Build 2015 and release management.

    And as a reminder, DNX = “new framework”, which means the tooling has to be adapted and compiled for it in order to support it properly.

    Local tests and tests projects

    After following this tutorial, you’ll be able to write and run tests in a DNX library project (which produces a NuGet package at the end) and to execute these tests locally via the console (dnx test) or via the test explorer.

    Note that you can also use Yeoman to scaffold the test project by typing yo aspnet and selecting the choice corresponding to tests.

    Now that you can run tests locally the question is: how can I make VSO/VSTS run the tests and get the results with my build?

    Executing tests with the agent

    The first step is to make the agent run the tests; it is as easy as running a PowerShell script.

    dnx -p ".\src\BizDesk365ContentService.Tests\" test -xml TestResults.xml

    The first parameter is the relative path to the test project. The second one tells xUnit’s DNX runner to output the results to an XML file that will be needed later.

    In my example I added some lines to ensure that dnvm, my runtime, and dnx are there, but depending on your configuration it might not be required.
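    Those extra lines could look like the following (the runtime version and flavor are assumptions; adapt them to your setup):

```powershell
# Make sure the runtime is installed and active so that dnx is on the path
dnvm install 1.0.0-rc1-update1 -r clr
dnvm use 1.0.0-rc1-update1 -r clr

# Restore packages for the test project before running the tests
dnu restore ".\src\BizDesk365ContentService.Tests\"
```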

    Getting the tests results

    Now you need to tell VSTS to get and process the unit test results; to do so just add a Publish Test Results step.

    In type, select XUnit and give it the location of the test results file.
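    The step configuration would then look like this (the files pattern is an assumption matching the -xml output of the command above):

```
Test Result Format: XUnit
Test Results Files: **/TestResults.xml
```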


    Test results now show up with our build in VSTS, and we can see that with a little bit of scripting and configuration it is easy to run unit tests for DNX projects.

    Have fun testing!

    • 1/3/2016