• 8/8/2018

    Speaking at SharePoint Fest Seattle 2018

    SharePoint Fest Seattle 2018 is happening from August 20th to 24th. This event will feature two days of pre-conference workshops and three days of conference sessions. You can find more information about it on the website. I’ve been selected amongst 74 other speakers (including people from Microsoft product teams!) to present two sessions this year:

    AZR203 - Azure Functions In The Real World: Lessons Learned & Best Practices

    “Azure Functions is one of the most powerful new solutions provided by Microsoft. Customers are leveraging it, and it has generally been available for a year now quietly delivering value across hundreds of projects. Many of you are probably asking yourself questions like “are functions mature enough?” or “are they production ready?” or even “is the right tooling here yet?” when considering this option for your projects.”

    AZR302 – Microsoft Graph and SharePoint Framework under steroids with Azure Functions

    “Modern development means client side first, backend second. However, there are still cases where you might need some backend processing, for long running operations, heavy computing consuming tasks or security concerns.

    During that session we will learn how you can build server-less solutions to support modern development. We will determine together when it makes sense to offload things to the backend and when it does not. We will have a lot of examples working with the Microsoft Graph as well as the SharePoint Framework.

    Finally, we will see that server-less does not mean hacky solutions and that proper continuous integration and deployment processes can be implemented.”

    I’m truly honored to be part of this prestigious event with so many other great SharePoint/Office 365 speakers. If you haven’t booked your ticket to the event yet, go ahead!

    See you there.

    • 6/8/2018

    Speaking at SharePoint Saturday Charlotte 2018

    I have the pleasure to announce that I have been selected to speak at SPS Charlotte 2018.

    I’ll be giving the session “How to do Dev-Ops with the SharePoint Framework and why it matters”.

    Together we’ll see why it’s important to follow the DevOps processes, methodologies, and philosophy and how to implement it for the SharePoint Framework with Visual Studio Team Services (or TFS). Go from the original idea to production automating as much as possible!

    If you’re in the area on Saturday, August 18th, 2018, don’t hesitate to register for the event.

    Just as a reminder, SPS events are free, organized by the community, with lots of great sessions.


    This is a good occasion to expand your network, learn a lot and spend a good day. The event takes place at the UNC Charlotte Center City.

    See you there!

    • 4/8/2018

    Guest on the Microsoft Graph Community call (June): The Online Meeting Solution

    I must admit I haven’t kept up with the blog posts lately. However, I had the privilege (among other great speakers) to present during the Microsoft Graph Community call for the month of June.

    Microsoft Graph Community calls are online webinars organized by the Microsoft Graph team. The team usually presents announcements and then allows time for community speakers to present discoveries, real-life solutions and lessons learnt with the Microsoft Graph. These calls are free to attend and if you’re not already registered, here is all the information you need.

    During this edition I presented a real-life solution that we (2toLead) built and deployed to more than 100,000 users. This solution helps the customer be more efficient when organizing online meetings by creating the invitations in Exchange, creating a structured space in SharePoint to capture key elements (agenda, tasks, comments…) and creating a Skype online meeting so people can attend virtually.

    This solution relies heavily on the Microsoft Graph, and during the project we learnt a few tricks that I share during the demonstration. They could be useful to any of you building solutions that take advantage of the Microsoft Graph.

    Here is the link to watch the recording (which also contains awesome demonstrations with Flow and PowerApps).

    • 2/8/2018

    Guest on the Microsoft 365 Dev Podcast: Microsoft Graph Open Extensions and Calendaring

    Last week I had the honor to be a guest on the Microsoft 365 Dev Podcast hosted by Jeremy Thake and Paul Schaeflein.

    It was the second time I’d been invited on a podcast (the first one being in French, about DevOps). It’s a fun exercise where the content is delivered as a discussion between the hosts and the guests.

    During this episode we discussed two main topics:

    • The Microsoft Graph open extensions (we’re currently starting a project using those at 2toLead)
    • The calendar capabilities in the Microsoft Graph (and I shared a few tricks that might interest you if you plan on using those APIs)

    I’d like to thank Jeremy and Paul for giving me this opportunity and for the good conversations.

    Click here to listen to the podcast (multiple platforms supported) and don’t forget to subscribe to it!

    • 4/7/2018

    Not Office Servers and Services MVP anymore, becoming Office development MVP

    Renewed as MVP and changed category

    I have the pleasure to announce I’ve been renewed as a Microsoft MVP, for the fifth year in a row (time flies, I’m getting old). A slight change this time: my award category is Office development.

    Over the years I’ve been a SharePoint MVP, an Office Servers and Services MVP and now an Office development MVP.

    Originally, Microsoft had its MVPs organized by product, which made a lot of sense in an on-premises world that required deep knowledge of something very specific. It also allowed MVPs to be connected directly with the teams at Microsoft working on their product. However, over the last decade the industry transitioned to a cloud-first model, and somebody who was working with SharePoint (or Exchange, Skype…) is most likely working with Office 365 at large now (and some Azure as well). A couple of years ago (three?), Microsoft decided to reorganize the MVP program into award categories (Azure, Office Servers and Services, Visual Studio and Team Services…) which regroup contribution areas (e.g. OSS: SharePoint, Exchange, Office 365…).

    The only downside of that reorganization is that, in the case of Office Servers and Services, the focus was now much more on the “IT pro” and “power user” side of things, with little attention given to developers (aren’t those guys in Azure now?), whereas the former SharePoint category included devs as well.

    After a year, Microsoft realized that and decided to create the Office Dev award category (which regroups SharePoint dev, Office 365 dev, Excel “dev”, Office add-ins…) and somewhat randomly dispatched the MVPs into that award.

    I’m not going to dive into the intricacies of the program (and I don’t think I’m allowed to anyway), but basically the category you’re in dictates the content you have access to, the product teams that will listen to you and so on. During the last MVP summit, the content was way off my centers of interest and I kept asking my friends in the “correct” category “hey, which room are we going to next?”.

    This new category should map much better to my interests and contributions and allow me to have better interaction with the many different product teams at Microsoft.

    About this blog

    I started this blog eight years ago now (time flies, I’m really getting old), back when I was a student (I’d soon become a Microsoft Student Partner) and the primary focus was to fill the gaps of the Microsoft documentation and of my memory.

    The idea was simple: whenever I had a problem the documentation didn’t give a clear answer to, and I didn’t want to forget the solution, I’d document it. And instead of keeping that to myself, why not put it on a blog so it might help other people.

    Over the years a lot of things changed: Microsoft is much more active on their blogging platform (the focus changed a bit though), they transitioned to open source, even the documentation is now open source, allowing us to fill the gaps with a much simpler process, people read less and watch more, and cross-technology help platforms emerged (think Stack Overflow).

    With those considerations in mind, I think it’s much more valuable to contribute to the global effort on those public platforms (Stack Overflow, GitHub…) that will benefit everybody (even those who have no clue about your blog) and allow for peer reviews, updates, etc.

    Besides investing more time in those ways of contributing, I also changed a lot: my focus back then was both IT pro and dev, and I’ve matured to understand my core passion is development (with a DevOps philosophy).

    Lastly, I spend much more time giving in-person sessions at events than a couple of years ago. (I actually gave my first session in English almost five years ago now, at SharePoint Saturday Ottawa; hopefully my English has improved since.)

    All those reasons hopefully explain why there’s a bit less content over here, and this is for the greater good :)


    To conclude this post, I’d like to thank all my peers (MVPs or not), the people leading/following/challenging me and finally Simran Chaudhry who was until recently an amazing Canadian MVP lead during my four first years as an MVP!

    • 27/6/2018

    Speaking at SharePoint Saturday New York City 2018

    This year again I have the opportunity to speak at SPS NYC.

    I’ll be giving the session “Migrate your custom components to the SharePoint Framework”.

    Together we will see how you can transition from either the server-side object model or the add-in development models to the modern SharePoint Framework approach. This session is mostly meant for developers and technical architects who are curious about getting started with the new model and might have legacy applications.

    Together we will see a lot of demonstrations and examples, and you’ll leave the session with a clear understanding of how to transition your existing custom solutions to the latest and greatest.

    If you’re in the area on Saturday, July 28th, 2018, don’t hesitate to register for the event.

    Just as a reminder, SPS events are free, organized by the community, with lots of great sessions.

    This is a good occasion to expand your network, learn a lot and spend a good day.

    See you there!

    • 16/5/2018

    Less than three weeks before the #SPSMontreal 2018!

    This year again I have the privilege to be part of the organizing committee for SharePoint Saturday Montréal 2018 taking place June the 2nd at Cégep du vieux Montréal.

    It’s free, it’s the occasion to learn a lot, not only about SharePoint but also about Office 365 and Azure.

    This year in a few numbers:

    • 25 sessions (in English and French)
    • 1 keynote by the Great Serge Tremblay
    • 16 Microsoft MVPs (Most Valuable Professionals)
    • 1 MSP (Microsoft Student Partner)
    • 5 MCTs (Microsoft Certified Trainers)
    • 9 partners
    • 200 attendees (planned)

    Besides the content, it’s also the occasion to develop your network, eat some Schwartz’s (smoked meat) and “SharePint” (share a pint)!

    I hope I’ll see you there in a couple of weeks!


    • 10/5/2018

    Speaking at the #Techorama Belgium 2018 about the #MicrosoftGraph

    This year again I have the honour to be selected to speak at the Techorama Belgium 2018.

    It is a paid event taking place at the Kinepolis Antwerp (attending/speaking in a cinema theater is really cool!) from May 23rd to 24th. There is great content and great speakers (many folks from Microsoft and other MVPs), and if you haven’t booked your ticket yet, I suggest you do!

    I’ll be giving two sessions that are related:

    • What’s new with the Microsoft Graph? (we’ll cover together what came out during the last year)
    • Deep dive into Microsoft Graph (we’ll cover advanced scenarios around capabilities, authentication and authorization…)


    Hopefully see you there! (I know that a lot of the Office 365/SharePoint people will be at SharePoint Conference North America during the same time period)

    • 16/4/2018

    Internet Explorer compatibility mode is changing on SharePoint Online

    The history behind this situation…

    Internet Explorer has been a corporate browser for two decades now. And many of us remember the dark ages of web development when we needed to have “IE compatible code” and “web compatible code”.

    As many companies invested deeply in the browser, building portals that worked with specific versions, Microsoft provided a compatibility mode a decade ago, allowing the browser to “behave” like a former version of itself and stay compatible with websites that had not been updated.

    From your website, you can set this compatibility mode to instruct Internet Explorer which version it should run under; since SharePoint 2013, it has been set to version 10.

    This made a lot of sense originally as SharePoint had a lot of legacy code that needed to be migrated by the product team before it could run properly under IE 11.

    However, as years passed, it became more and more painful, degrading performance and compatibility with modern frameworks, and bringing strange rendering behaviors.

    …And what changed recently

    At 2toLead we started noticing a couple of tenants changing from the version 10 compatibility mode to “edge” (how IE11 was referred to at the time, during the transition period), depending on the tenant version.

    You can check the version of your tenant by using your browser’s development tools: look at any request to SharePoint and check the MicrosoftSharePointTeamServices response header.

    To check which compatibility mode SharePoint is currently sending to your users, look at the source of the page and check the X-UA-Compatible metadata.
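    For example, with sample text standing in for the live responses (checking a real tenant requires an authenticated browser session, and the values below are made up), the checks look like this:

```shell
# Stand-ins for a response header dump and a page source.
headers='MicrosoftSharePointTeamServices: 16.0.0.8008'
page='<meta http-equiv="X-UA-Compatible" content="IE=10" />'

# Tenant version from the response headers:
echo "$headers" | grep -o 'MicrosoftSharePointTeamServices: .*'
# Compatibility mode from the page source:
echo "$page" | grep -o 'X-UA-Compatible" content="[^"]*'
```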

    You can mostly expect performance improvements and better compatibility with modern web standards and frameworks. There might be cases where, because you had to work around issues with older versions of IE, some things start behaving or looking differently.

    Look out for this change coming to your tenants if you have any customization in place!


    Alternatively, your admins should be able to add the SharePoint site to a compatibility list and force the mode from an admin perspective. That might come at the price of some native SharePoint functionality no longer working, and should only be used temporarily, to give you time to fix the situation.

    As a reminder, Internet Explorer 11 is the only supported version of Internet Explorer on client OSes now.

    Related resources

    • 21/3/2018

    Git: fork/import/manual copy, keep it clean

    Part of my role at 2toLead is to help set guidance around best source management practices either internally or for our customers. One of the questions I get often is: Should I fork this repository or do something else with it?

    It’s hard to get clear and simple guidance on the web so I thought I’d take a stab at it.

    As we’re using Visual Studio Team Services, my examples and screenshots will be based on it, but this really applies to any Git service like GitHub or internal Git servers.

    The options you get

    From my point of view, when you have an existing repository somewhere, you have the following options to move/copy it depending on your scenario:

    • Making a manual copy: manually copying files over, running a git init or something similar and starting anew. This should almost always be avoided: you’ll lose the history, the ability to merge with the source and so on. Even if it looks super easy, you’ll regret it later.
    • Using the import option: in VSTS you have the option, after creating a new repository, to import the content (and history, and branches…) from another repository. This is great for simple migration scenarios as it’ll keep the history and other artifacts that will be useful later. Use it if the source is not in the same service (i.e. GitHub => VSTS) or if you’re planning to delete the source right after.
    • Forking the repo (from the source): this is probably the best option if you’re planning to have both the source and the new repository live on. It will allow you to easily port your commits from one repository to the other (via pull requests). This choice should probably be your default go-to.
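    To make the history point concrete, here is a small self-contained demo (throwaway local repositories only, all names made up) showing that a manual copy starts history over, while a clone, which is what import and fork do under the hood, keeps it:

```shell
# Throwaway sandbox; nothing outside $tmp is touched.
set -e
tmp=$(mktemp -d); cd "$tmp"
export HOME="$tmp"                         # keep demo git config out of ~/.gitconfig
git config --global init.defaultBranch master
git config --global user.email demo@example.com
git config --global user.name demo

git init -q original
(cd original \
  && echo "hello" > readme.md \
  && git add readme.md && git commit -qm "first" \
  && echo "world" >> readme.md \
  && git commit -qam "second")

# Manual copy: only the files travel, history starts over at one commit.
mkdir copy && cp original/readme.md copy/
(cd copy && git init -q && git add . && git commit -qm "initial import")
git -C copy rev-list --count HEAD          # 1

# Clone (what import/fork do under the hood): the full history travels.
git clone -q original cloned
git -C cloned rev-list --count HEAD        # 2
```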

    Example of authoring a PR across repos.

    Example of importing a repo.

    Getting out from the mess

    Now let’s say you landed on this article because the situation already got out of control: you have the “same” code base spread over multiple repositories, and not necessarily in the recommended way. How do you fix that?

    Before we begin, let me say that this operation can be error prone, can make you lose work, will induce a “service interruption” (even if short) for your developers, and this solution is provided with no warranty whatsoever. Also make sure all changes are committed and pushed before starting anything, for every developer accessing the repo.

    You are facing two main cases:

    • Your repositories share a common commit tree at some point (they have been forked or imported): Git is going to “understand” some of what happened and will be able to help us.
    • Your repositories don’t share a common tree (manual copy of files): you are going to have to “replay” the changes on the new fork manually, which is super error prone.


    Common tree scenario

    Let’s say I have the current structure.



    The second one is an import of the source one. The source didn’t get any updates since the import, but the imported repository did. Now I want to be able to propagate changes from ImportedRepo to the source one, without having to handle merges and multiple remotes locally.

    First, fork the RepoSource repo into ProjectB/ForkedRepo

    Then clone the ForkedRepo locally. After that run the following commands.
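    The gist of those commands is to add the imported repository as a second remote on your clone of the fork, fetch it, merge (a fast-forward here, since the histories are shared) and push. Here is a self-contained sketch of the whole flow, using throwaway local bare repositories as stand-ins for the VSTS clone URLs (all names are made up):

```shell
# Throwaway sandbox; local bare repos stand in for the source/imported/forked repos.
set -e
tmp=$(mktemp -d); cd "$tmp"
export HOME="$tmp"                         # keep demo git config out of ~/.gitconfig
git config --global init.defaultBranch master
git config --global user.email demo@example.com
git config --global user.name demo

git init -q --bare source.git
git clone -q source.git seed
(cd seed && git commit -q --allow-empty -m "root" && git push -q origin master)

# "ImportedRepo" and "ForkedRepo" both start as copies of the source.
git clone -q --bare source.git imported.git
git clone -q --bare source.git forked.git

# Extra commits landed in the imported repo after the split.
git clone -q imported.git extra
(cd extra && git commit -q --allow-empty -m "work after import" && git push -q origin master)

# The actual workflow: clone the fork, pull the imported history in, push it.
git clone -q forked.git work
cd work
git remote add imported ../imported.git    # on VSTS this would be ImportedRepo's clone URL
git fetch -q imported
git merge --no-edit imported/master        # fast-forwards, the histories are shared
git push -q origin master
```

    Once the forked repo carries everything, the imported repo can be deleted and future changes flow through pull requests between the fork and the source.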

    Make sure the branch policies, build definitions and release definitions are up to date. Even run a diff tool on your local machine, branch per branch, between the two repository folders, and you’re good to go!

    For the other developers on the team, simply run this set of commands to re-map to the new forked repository.
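    Those commands boil down to re-pointing origin at the new repository and refreshing the remote-tracking branches. A minimal sketch (throwaway local repositories stand in for the old and new VSTS URLs):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q --bare old.git
git init -q --bare new.git
git clone -q old.git dev && cd dev

# Re-map the existing clone to the new forked repository
# (on VSTS, ../new.git would be the forked repo's clone URL):
git remote set-url origin ../new.git
git fetch -q origin --prune                # refresh remote-tracking branches
git remote get-url origin                  # prints ../new.git
```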

    My ask for the VSTS product team

    Please make it easier to move Git repositories between team projects keeping the fork link and everything.



    I hope this post brought a bit of clarity on the best practices, and helped some of you fix the situation.

    • 8/1/2018

    New SharePoint Framework PnP Samples available: using the Skype UCWA Web SDK to subscribe to people’s status


    I added two new SharePoint Framework web part samples to PnP to demonstrate how to use the Unified Communications Web API (UCWA) JavaScript SDK from Skype for Business. This SDK allows you to do things like subscribing to a person’s status, starting instant messaging conversations, calls…

    To get a look:

    Long version

    I recently had the occasion to make my first contribution to PnP (besides creating issues and helping investigate those). In these two new samples I show you how to leverage the Skype UCWA SDK to subscribe to and display people’s Skype status.

    That Skype status will update itself when it changes (i.e. the target user sets his/her status to something different in Skype for Business, or goes offline).

    This approach is better than the previous ones leveraging Office integration, ActiveX or the SharePoint file user status because:

    • It is cross-platform (doesn’t need ActiveX, Internet Explorer, or security zone configuration)
    • It does not require Office to be installed on the machine, or the user to be signed in to Skype for Business
    • It updates with status changes
    • It does not need a file to be created and can be rendered anywhere on the page
      • (the file presence indicator is SharePoint talking to the Skype server and rendering some HTML, but it only does so in libraries)

    The UCWA

    The API itself is a bit peculiar and doesn’t necessarily work like a “standard API” (OAuth + REST). The key differences are:

    • You can subscribe to some of the object model (status, conversations, audio/video feeds…)
    • Authentication is a bit (too) complex
      • It’s not standard and expects you to get a 403 in order to get the identity provider information
      • The discovery server also acts like a relay and the “API” is segmented in multiple resources (besides the permissions) which necessitates multiple rounds of authorization.
    • Not OData capable (we got used to that in the Microsoft world)

    All those differences impact the few SDKs. The JS SDK (the one used in the samples) is a bit “old fashioned”. I really wish they had spent some time:

    • Moving to UMD and not a global var
    • Providing npm packages and not a single bundle on the CDN
    • Providing types definitions
    • Providing better version information (+ semver)
    • Allowing developers to leverage their own auth libraries (like hellojs or msal) instead of the implementation they have bundled.

    Those are the main reasons why I’m side loading the script and not including it as an external dependency in the samples.

    Lastly, the documentation is a little bit all over the place; I’d suggest you corroborate the information, even from official sources, because it looks like some of the documentation hasn’t been updated. Here are the entry points:


    Microsoft Teams vs Skype

    The Skype for Business APIs are still not available through the Microsoft Graph. And Microsoft announced that Microsoft Teams is going to be the future for unified communications, instant messaging and so much more. However, Skype for Business Server is going to remain the backend (and at least provide some of the APIs) for at least a few years.

    • 5/1/2018

    Speaking at SharePoint Fest DC (Washington) 2018

    SharePoint Fest DC (Washington) 2018 is happening from March 26th to March 30th. This event will feature two days of pre-conference workshops and three days of conference sessions. You can find more information about it on the website. I’ve been selected amongst 44 other speakers to present two sessions this year:

    AZR204 – Microsoft Graph and SharePoint framework under steroids with Azure Functions

    Modern development means client side first, backend second. However, there are still cases where you might need some backend processing, for long running operations, heavy computing consuming tasks or security concerns.

    During that session we will learn how you can build server-less solutions to support modern development. We will determine together when it makes sense to offload things to the backend and when it does not. We will have a lot of examples working with the Microsoft Graph as well as the SharePoint Framework.

    Finally, we will see that server-less does not mean hacky solutions and that proper continuous integration and deployment processes can be implemented.

    More information here.

    DEV 302 - Is it possible to do DevOps with the SharePoint framework?

    You had it all right with solutions and add-ins. Your release pipeline was set up. Do new technologies and methodologies mean starting over?


    Don’t panic, I’m here to help! Together we’ll see how to set up a DevOps pipeline for SPFx developments with:

    • Automated builds
    • Automated deployments
    • Automated tests
    • Code quality checks

    There will be a lot of demonstrations during this session and we’ll be mostly using Visual Studio Team Services/Team Foundation Server.

    This session is mostly meant for developers, architects, quality assurance people…

    More information here.

    I’m truly honored to be part of this prestigious event with so many other great SharePoint/Office 365 speakers. If you haven’t booked your ticket to the event yet, go ahead!

    See you there.

    • 3/1/2018

    Full version of lodash now available in the SharePoint Framework

    TL; DR

    Microsoft replaced @types/es6-collections with the es2015.collection library in version 1.4.0 of the packages. The former had a conflicting definition of WeakMap which caused issues with packages like lodash.

    Long version

    Microsoft recently released v1.4.0 of the SharePoint Framework and its packages. It contains a lot of improvements, and one of those probably went unnoticed by many of us.

    @types/es6-collections has been replaced by the es2015.collection library (native, comes with the compiler/JS engines). That package had a “special” definition of WeakMap (among other things) which was causing numerous packages not to work properly, including one of my favorites: lodash.

    To work around that issue, Microsoft had to provide @microsoft/sp-lodash-subset which, as its name indicates, is only a subset of lodash and doesn’t provide useful methods like map.

    Which one should I use?

    This is something that’s hard to say at the moment, so I’ll give you the pros and cons of using the subset instead.


    Pros:

    • The subset is lighter, which means faster load times for your users
    • The subset is maintained/checked by Microsoft, which means it is less likely to break SPFx


    Cons:

    • The subset doesn’t have all the lodash features, which means you might have to do more things manually
    • The subset is maintained by Microsoft, which means improvements in lodash won’t get to us as fast
    • The subset is not on a CDN, which can impact your load times depending on a lot of other considerations (HTTP/2, quality of your CDN…)
    • The subset might not be around for long: considering it represents extra work for Microsoft, they might deprecate it to focus on other things


    Now, all that being said, nothing prevents you from using both in parallel, and the upgrade path (from subset to full version) is fairly easy once you’re on SPFx >= 1.4.0: it’s about a couple of text replaces.
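    As a sketch of those text replaces (the web part file name and import are made up for the demo; you would also swap the dependency itself, with something like npm uninstall @microsoft/sp-lodash-subset followed by npm install lodash @types/lodash):

```shell
set -e
tmp=$(mktemp -d)
# A stand-in web part file using the subset, as scaffolded projects typically do.
cat > "$tmp/MyWebPart.ts" <<'EOF'
import { escape } from '@microsoft/sp-lodash-subset';
EOF
# The text replace: point the import at the full lodash package.
sed -i.bak 's/@microsoft\/sp-lodash-subset/lodash/' "$tmp/MyWebPart.ts"
cat "$tmp/MyWebPart.ts"                    # import { escape } from 'lodash';
```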

    • 17/11/2017

    Speaking at SharePoint Saturday Detroit 2017

    This year again I have the opportunity to speak at the SPS Detroit.
    I’ll be giving the session “Improving DevOps using Microsoft's Business Productivity Tools and more”, co-presenting with my friend Haniel Croitoru.
    We'll explore together how DevOps practices impact and improve solution delivery for your customers, for the better. With real-life scenarios and experience from the field, we'll show you how you can get started and what to expect out of it.
    If you’re in the area on Saturday, December 2nd, 2017, don’t hesitate to register for the event.

    Just as a reminder, SPS events are free, organized by the community, with lots of great sessions.
    This is a good occasion to expand your network, learn a lot and spend a good day. The event takes place at Algonquin College.
    See you there!

    • 9/10/2017

    Using PnP PowerShell on Visual Studio Team Services (VSTS) Hosted Agent

    Visual Studio Team Services (VSTS) provides great Continuous Integration (CI) and Continuous Deployment (CD) functionalities you can leverage to implement DevOps pipelines and automation with your custom developments.

    If your custom solutions rely on PnP PowerShell during their build and/or deployment processes, you will need PnP PowerShell to be installed on the agent.
    Unfortunately the Hosted Agents do not have PnP PowerShell installed by default.
    Note: this documentation only applies to the Hosted and Hosted 2017 agents; the Linux hosted agent is not supported at the moment.

    Install PnP PowerShell

    Add a first task to your build/release definition (of type PowerShell). In the Type field, select Inline Script.
    In the Inline Script section, copy and paste this script:

    Install-PackageProvider -Name NuGet -Force -Scope "CurrentUser"
    Install-Module SharePointPnPPowerShellOnline -Scope "CurrentUser" -Verbose -AllowClobber -Force

    Note: you can also install a specific version using the -RequiredVersion parameter on line 2.
    Note: you can also improve that script according to your needs, as well as save it in a file you include in your repository, to best fit your pipeline.
    Note: the module installation task must be included once per agent phase.

    Using PnP PowerShell

    In your scripts leveraging PnP PowerShell, before calling any command related to that module, make sure you include the following line:

    Import-Module SharePointPnPPowerShellOnline -Scope "Local"

    Uninstalling PnP PowerShell

    Note: this step is optional if you are using the VSTS Hosted Agent, and is only provided for people using custom agents on which they do not want to or cannot install PnP PowerShell globally.
    To avoid conflicts when your scripts require a specific version of PnP PowerShell, it is a good practice to clean up after your build/release is done.
    To do so, simply add another PowerShell task and, in the inline script section, copy the script below.

    Get-InstalledModule SharePointPnPPowerShellOnline | Where-Object { $_.InstalledLocation -like "*$home*" } | Uninstall-Module -Force

    Note: this is a repost of a wiki page I created on the PnP PowerShell repo.

    • 6/10/2017

    Determine your technical debt using SonarQube - Conclusion

    Installing and setting up SonarQube may seem quite complex and tedious.

    I hope this series has helped you implement it faster.

    Now, you can clearly identify your technical debt and take actions to improve the quality of your developments.

    It is obvious that when a thousand problems appear in the code at once, it can be discouraging. Just keep this in mind:

    • There are false positives; make a first pass to ignore/exclude those
    • Try to have a policy like "no commit should make the situation worse", or even better "each commit must correct all the problems on the edited files", which will allow you to improve the situation little by little
    • Some organizations prefer to do one or two sprints of technical debt solving to get a fresh start


    How about you? Did you find this useful? Feel free to comment.

    • 4/10/2017

    Determine your technical debt using SonarQube - Bonus SonarLint extension configuration

    TL; DR

    You can display SonarQube static analysis results live in the Visual Studio error and information console, using the same rule set as the SonarQube project.

    Installing the extension

    Just go to and proceed with the installation.

    Binding the Visual Studio solution to the SonarQube analysis

    From the Team Explorer click SonarQube

    Click on connect.

    (if you obtain a certificate error, you must install the self-signed certificate of the SonarQube server on your machine)

    To generate a personal access token, refer to the following documentation

    Enter the token in the SonarQube login prompt, as well as the server URL.

    Double click on the SonarQube project that you want to bind to the Visual Studio solution

    The errors detected by SonarQube static analysis now show up as warnings in the error console as well as in IntelliSense.

    JavaScript Projects

    There are cases where you'll only be working on JavaScript/TypeScript using an editor lighter than the full version of Visual Studio, for example Visual Studio Code. With SonarQube, the static analysis for JavaScript projects primarily relies on ESLint and TSLint. To have static analysis work from within your code editor, you only need to install the corresponding extensions and add a few configuration files to your source base.

    If you want to analyze TypeScript, you'll also need to install the TS Plugin, you'll find all the details here

    • 2/10/2017

    Determine your technical debt using SonarQube - Monitoring the results

    TL; DR

    Static analysis errors will appear as warnings in the compilation section. A static analysis badge will also appear on the build report, and you'll be able to get detailed and comprehensive information from SonarQube.

    Information incorporated with the build

    When displaying the details of a build, you'll now find a new section dedicated to SonarQube. Within that section, besides the quality badge, you'll also find a link to the detailed static analysis results. Also, under the build section, all critical static analysis issues will show up as warnings.

    Note: this only shows up for MSBuild-based projects.

    Details available in SonarQube

    From your SonarQube web portal, you'll find detailed static analysis results indicating how the code got better or worse. Using SonarQube, you can build new dashboards that will give you, at a glance, a clear view of your code quality and how to improve it.

    • 29/9/2017

    Determine your technical debt using SonarQube - Updating your build definitions

    TL; DR

    Static analysis will be executed when building your source base using the central build machine. You have two options to set this up with VSTS:

    • Your project is "Visual Studio" based and leverages sln and cs/vb proj files: in that case, you can leverage the integrated pre- and post-build tasks provided by the SonarQube VSTS extension.
    • Your project is not built using MSBuild: in that case, you must leverage the SonarQube CLI task. It's a little bit more complicated, so I'll demonstrate only the first case for now.

    Adding tasks to the build definition

    We'll go under the assumption that you're already using build 2015/vNext and that you have been working with build definitions for at least one project.

    Edit your build definition and add the two SonarQube tasks, placing "begin analysis" before the Visual Studio build task and "end analysis" after the execution of unit tests (that way, if you have code coverage results, they will surface in SonarQube).

    Under the "begin analysis" task, set the endpoint, the project name and the project key, as well as optional parameters if needed.

    In my case, I also added /d:sonar.exclusions=**/*.min.js,**/*.min.css in the optional parameters to exclude minified JavaScript files from the analysis.

    Note: these settings can also be specified in the global SonarQube settings or in SQ project settings.

    Note: Java must be installed on the build machine if you have your own build machines.

    Note: I recommend adding a 'SonarQubeAnalysisTimeoutInSeconds' variable to the build definition with the value "600". This extends the time-out for the static analysis; sometimes the server has several results to import at once and stalls a little, which can cause builds to take longer and/or time out.

    If you're working on a non-MSBuild project, just use the CLI task somewhere after you have built and unit tested your code.
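For reference, when using the CLI task the analysis settings typically live in a sonar-project.properties file at the root of the repository. A minimal sketch, where all values are placeholders to adapt to your own server and project:

```properties
# Analysis settings for the SonarQube CLI (values are illustrative)
sonar.host.url=https://sonarqube.example.com
sonar.projectKey=my-project
sonar.projectName=My Project
sonar.projectVersion=1.0
sonar.sources=src
# Same minified-file exclusion as in the MSBuild setup above
sonar.exclusions=**/*.min.js,**/*.min.css
```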

    • 29/9/2017

    Determine your technical debt using SonarQube - Creating the SonarQube project

    TL; DR

    SonarQube allows you to create projects. These projects will hold your code analysis results. You can configure a SQ project for each code repository, or even for each branch, to get different deltas (e.g., my master branch builds every month, so I want to see month-over-month changes, while my dev branch builds daily, so I want to see its evolution on a day-by-day basis).

    Creating the project

    Go to "Configuration" -> "Projects" -> "Management", then "Create project".

    Keep the project key in mind, we will need this parameter later when setting up the builds.

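If you prefer to script this step, the project can also be created through SonarQube's Web API. A sketch with placeholder URL, token and key; note that the parameter holding the project key is named "key" in older releases and "project" in newer ones, so check the Web API documentation of your server version:

```shell
# Create a SonarQube project via the Web API (URL, token and key are placeholders)
curl -u "$SONAR_TOKEN:" -X POST \
  "https://sonarqube.example.com/api/projects/create?name=My%20Project&key=my-project"
```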

    • 27/9/2017

    Determine your technical debt using SonarQube - Creating and configuring a service account for VSTS in SonarQube

    TL; DR

    To prevent just anyone from sending analysis results to our SonarQube installation, we need to secure access to its services. To do so, we'll configure a service account.

    Creating the service account

    From SonarQube, go to Administration, Security, Users, and add an account.

    Next, click on the "Tokens" cell for the account we just created and generate a new personal access token.

    You can also refer to that documentation if you're not sure how to generate a PAT
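For completeness, the token can also be generated through SonarQube's Web API instead of the UI. A sketch with placeholder credentials, server URL, token name and account login:

```shell
# Generate a personal access token for the service account (placeholders throughout)
curl -u admin:admin -X POST \
  "https://sonarqube.example.com/api/user_tokens/generate?name=vsts-build&login=vsts-service"
```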

    Provisioning the service account

    To leverage this service account in VSTS, go to your team project, click Settings, then Services, click "New service endpoint" and choose "SonarQube". Then enter the URL of your SonarQube installation, a display name for the connection, and the personal access token.

    • 25/9/2017

    Determine your technical debt using SonarQube - Setting up Azure Active directory for authentication

    TL; DR

    We will install and configure an add-on to delegate authentication to Azure Active Directory. This will allow our developers to use the same account between Visual Studio Team Services and SonarQube.

    Configuration of the authentication module

    Since version 5.4, SonarQube provides an additional plugin relying on the OAuth protocol to communicate with AAD. This allows users to leverage their corporate account to access SonarQube, providing SSO and simplifying the administrators' job by keeping a central identity repository.

    The setup procedure is already well documented, rather than duplicating it, here is a link to the resources.

    Installing the SonarQube extension to VSTS

    Visual Studio Team Services provides a highly extensible model to third parties so they can integrate their solution with VSTS.

    SonarQube has implemented build tasks and service definitions for VSTS. Before being able to leverage SonarQube from VSTS, you first need to install the corresponding extension.

    To do so, just click on the link provided below and click Install; you need to be a team project collection administrator to install extensions.

    Note: for on premises TFS installations, it will require a few more steps, see this link:

    • 22/9/2017

    Determine your technical debt using SonarQube - Adding modules

    TL; DR

    Static analysis works by leveraging rules. These rules are grouped by language or language category in modules that you can install. In addition to providing support for the corresponding languages, these modules can extend the native capabilities of SonarQube.

    Most of them are free, some are subject to commercial licenses.

    Installing Add-ons

    Open SonarQube, go to Configuration, System, then search for and install the modules that you're interested in.

    Once all the modules are installed, you need to restart the server using the button available in the SonarQube UI.

    • 20/9/2017

    Determine your technical debt using SonarQube - Opening SonarQube’s ports

    TL; DR

    Open ports 22, 9000, 80 and 443 inbound on the VM.

    Details of the opening of ports

    Rather than repeating what is already documented, I will provide you with the link.

    It is necessary to open ports 22, 80, 443 and 9000, which respectively allow remote shell access to the machine, serving HTTP and HTTPS content, and access to the management console.
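One way to script this, since the VM runs in Azure, is the Azure CLI. A sketch where the resource group and VM names are placeholders for your own:

```shell
# Open the four inbound ports on the VM (resource group and VM name are placeholders)
az vm open-port --resource-group sonar-rg --name sonar-vm --port 22 --priority 1001
az vm open-port --resource-group sonar-rg --name sonar-vm --port 80 --priority 1002
az vm open-port --resource-group sonar-rg --name sonar-vm --port 443 --priority 1003
az vm open-port --resource-group sonar-rg --name sonar-vm --port 9000 --priority 1004
```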

    • 18/9/2017

    Determine your technical debt using SonarQube - Installing the machine

    TL; DR

    We will update the machine, install docker, and provision the containers we need.

    Installation of docker and updating the machine

    Connect to the machine using SSH (PuTTY is a very good client for Windows) and run the following commands:
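A minimal sketch of those commands, assuming an Ubuntu image; Docker's convenience script is one of several installation options:

```shell
# Update the package index and upgrade the machine
sudo apt-get update && sudo apt-get -y upgrade

# Install Docker using Docker's convenience script
curl -fsSL https://get.docker.com | sudo sh

# Optional: allow the current user to run docker without sudo (requires re-login)
sudo usermod -aG docker $USER
```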

    Setting up containers, creating the certificates

    The containers are the components of our system managing the web traffic and providing the SonarQube service.
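As a sketch, the SonarQube service itself can be started from the official Docker Hub image; the image tag and port mapping below are illustrative, and the web-traffic container (e.g., an nginx reverse proxy handling 80/443) would sit in front of it:

```shell
# Start SonarQube from the official image, exposing the management console on 9000
docker run -d --name sonarqube -p 9000:9000 sonarqube:lts
```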

    To secure connections, we will also generate self-signed SSL certificates, which is not the easiest thing to do when you are not used to working with Linux environments. That's likely to be the case for developers using Visual Studio Team Services (or TFS), who come mostly from the Windows world.

    I shared configuration scripts on GitHub to help you. Obviously, if you have your own certificates, or if your environment already has some pre-existing configuration, you can edit this script.

    (see the SSL part of the script)