Vince365

    • 16/5/2018

    Less than three weeks to go before #SPSMontreal 2018!

    Once again this year, I have the privilege of being part of the organizing committee for SharePoint Saturday Montréal 2018, taking place June 2nd at Cégep du Vieux Montréal.

    It’s free, and it’s a great occasion to learn a lot, not only about SharePoint but also about Office 365 and Azure.

    This year in a few numbers:

    • 25 sessions (in English and French)
    • 1 keynote by the great Serge Tremblay
    • 16 Microsoft MVPs (Most Valuable Professionals)
    • 1 MSP (Microsoft Student Partner)
    • 5 MCTs (Microsoft Certified Trainers)
    • 9 partners
    • 200 attendees (planned)

    Besides the content, it’s also the occasion to develop your network, eat some Schwartz’s (smoked meat) and “SharePint” (share a pint)!

    I hope I’ll see you there in a couple of weeks!

    Registrations:

    http://www.spsevents.org/city/montreal/montreal2018/_layouts/15/spsevents/registrations/registrationform.aspx

    • 10/5/2018

    Speaking at the #Techorama Belgium 2018 about the #MicrosoftGraph

    Once again this year, I have the honour of being selected to speak at Techorama Belgium 2018.

    It is a paid event taking place at the Kinepolis Antwerp (attending/speaking in a cinema theater is really cool!) from May 23rd to 24th. They have great content and great speakers (many folks from Microsoft as well as other MVPs), and if you haven’t booked your ticket yet, I suggest you do!

    I’ll be giving two related sessions:

    • What’s new with the Microsoft Graph? (we’ll cover together what came out during the last year)
    • Deep dive into Microsoft Graph (we’ll cover advanced scenarios around capabilities, authentication and authorization…)

     

    https://techoramabelgium2018.sched.com/event/DOxx/whats-new-with-the-microsoft-graph

    https://techoramabelgium2018.sched.com/event/DOvi/deep-dive-into-microsoft-graph

    Hopefully see you there! (I know that a lot of the Office 365/SharePoint people will be at SharePoint Conference North America during the same time period)

    • 16/4/2018

    Internet Explorer compatibility mode is changing on SharePoint Online

    The history behind this situation…

    Internet Explorer has been a corporate browser for two decades now. And many of us remember the dark ages of web development when we needed to have “IE compatible code” and “web compatible code”.

    As many companies invested deeply in the browser, building portals that worked with specific versions, Microsoft introduced a compatibility mode a decade ago, allowing the browser to “behave” like an earlier version of itself and stay compatible with websites that had not been updated.

    From your website, you can set this compatibility mode to instruct Internet Explorer which version it should emulate; since SharePoint 2013, it had been set to version 10.

    This made a lot of sense originally as SharePoint had a lot of legacy code that needed to be migrated by the product team before it could run properly under IE 11.

    However, as the years passed, it became more and more painful: degraded performance, poor compatibility with modern frameworks, and strange rendering behaviors.

    …And what changed recently

    At 2toLead we started noticing a couple of tenants switching from the version 10 compatibility mode to “edge” (the mode IE11 runs under; it was named that way during the transition period), starting with tenant version 16.0.0.7604.

    You can check your tenant’s version by using your browser’s developer tools: look at any request to SharePoint and inspect the MicrosoftSharePointTeamServices response header.

    To check which compatibility mode SharePoint is currently sending to your users, look at the source of the page and check the X-UA-Compatible meta tag.
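
    If you prefer to script the check, here is a minimal sketch in PowerShell (the tenant URL is a placeholder and authentication is left out for brevity; the browser developer tools remain the simplest option):

    # Hedged sketch: inspect the response headers and page source of a SharePoint Online site.
    # Replace the URL with your own tenant and make sure the request is authenticated.
    $response = Invoke-WebRequest -Uri "https://yourtenant.sharepoint.com/sites/yoursite"
    # Tenant build version
    $response.Headers["MicrosoftSharePointTeamServices"]
    # Compatibility mode sent to the browser
    ($response.Content | Select-String -Pattern 'X-UA-Compatible[^>]*>').Matches.Value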

    You can mostly expect performance improvements and better compatibility with modern web standards and frameworks. There might be cases where, because you had to work around issues with older versions of IE, some things start behaving or looking differently.

    Look out for this change coming to your tenants if you have any customizations in place!

     

    Alternatively, your admins should be able to add the SharePoint site to an Enterprise Mode site list and force the compatibility centrally. That might come at the price of some native SharePoint functionality no longer working, and it should only be used temporarily to give you time to fix the situation. https://docs.microsoft.com/en-us/internet-explorer/ie11-deploy-guide/turn-on-enterprise-mode-and-use-a-site-list
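
    For reference, the Enterprise Mode site list is normally distributed through Group Policy; under the hood it boils down to a policy registry key pointing at the XML list. A rough, hedged sketch (the site list URL is a placeholder):

    # Hedged sketch: point Internet Explorer at an Enterprise Mode site list via the policy registry key.
    # This is normally pushed through Group Policy; the URL below is a placeholder.
    $key = "HKLM:\SOFTWARE\Policies\Microsoft\Internet Explorer\Main\EnterpriseMode"
    New-Item -Path $key -Force | Out-Null
    Set-ItemProperty -Path $key -Name "SiteList" -Value "https://intranet.contoso.com/EMIE/sites.xml"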

    As a reminder, Internet Explorer 11 is the only version of Internet Explorer still supported on client operating systems. https://support.microsoft.com/en-us/help/17454/lifecycle-faq-internet-explorer

    Related resources

    https://techcommunity.microsoft.com/t5/SharePoint/IE9-IE10-users-in-SPO-Time-to-move-to-modern-browsers/td-p/36692

    https://docs.microsoft.com/en-us/internet-explorer/ie11-deploy-guide/fix-compat-issues-with-doc-modes-and-enterprise-mode-site-list

    • 21/3/2018

    Git: fork/import/manual copy, keep it clean

    Part of my role at 2toLead is to help set guidance around source management best practices, both internally and for our customers. One of the questions I often get is: should I fork this repository or do something else with it?

    It’s hard to get clear and simple guidance on the web so I thought I’d take a stab at it.

    As we’re using Visual Studio Team Services, my examples and screenshots will be based on it, but this really applies to any Git service, such as GitHub or internal Git servers.

    The options you get

    From my point of view, when you have an existing repository somewhere, you have the following options to move/copy it depending on your scenario:

    • Making a manual copy: manually copying the files over, running a git init or something similar, and starting anew. This should almost always be avoided: you’ll lose the history, the ability to merge with the source, and so on. Even if it looks super easy, you’ll regret it later.
    • Using the import option: in VSTS you have the option, after creating a new repository, to import the content (and history, and branches…) from another repository. This is great for simple migration scenarios as it’ll keep the history and other artifacts that will be useful later. Use it if the source is not in the same service (e.g. GitHub => VSTS) or if you’re planning to delete the source right after.
    • Forking a repo (from the source): this is probably the best option if you’re planning to keep both the source and the new repository alive. It allows you to easily port your commits from one repository to the other (via pull requests). This should probably be your default go-to.

     Example of authoring a PR across repos.

    Example of importing a repo

    Getting out of the mess

    Now let’s say you landed on this article because the situation has already gotten out of control. You have the “same” code base spread over multiple repositories, and not necessarily in the recommended way. How do you fix that?

    Before we begin, let me say that this operation can be error prone, can make you lose work, and will induce a (short) “service interruption” for your developers; this solution is provided with no warranty whatsoever. Also make sure all changes are committed and pushed before starting anything, for every developer accessing the repositories.

    You are facing two main cases:

    • Your repositories share a common commit tree at some point (they have been forked or imported). Git is going to “understand” some of what happened and will be able to help us.
    • Your repositories don’t share a common tree (manual copy of files); you are going to have to “replay” the changes on the new fork manually, which is extremely error prone.

     

    Common tree scenario

    Let’s say I have the current structure.

    ProjectA/RepoSource

    ProjectB/ImportedRepo

    The second one is an import of the first. The source hasn’t received any updates since the import, but the imported repository has. Now I want to be able to propagate the changes from ImportedRepo back to the source repository, without having to handle merges and multiple remotes locally.

    First, fork the RepoSource repo into ProjectB/ForkedRepo.

    Then clone ForkedRepo locally. After that, run the commands from the following gist.

    https://gist.github.com/baywet/0373d9a298bbb5f4cbd0ae6df6326872#file-bringimportedrepobranchesbacktoforked-sh
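
    Conceptually, the operation boils down to something like this (a rough sketch with placeholder URLs; the gist above remains the exact, tested version):

    # Run inside the local clone of ProjectB/ForkedRepo.
    # Add the imported repository as a second remote and fetch its branches.
    git remote add imported https://yourorg.visualstudio.com/ProjectB/_git/ImportedRepo
    git fetch imported
    # For each branch that moved ahead in ImportedRepo, recreate it locally and push it to the fork.
    git checkout -b dev imported/dev
    git push -u origin dev
    # Once every branch has been brought over, the extra remote can be removed.
    git remote remove imported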

    Make sure the branch policies, build definitions and release definitions are up to date. You can even run a diff tool on your local machine, branch by branch, between the two repository folders, and you’re good to go!

    For the other developers on the team, simply run this set of commands to re-map their local clones to the new forked repository.

    https://gist.github.com/baywet/0373d9a298bbb5f4cbd0ae6df6326872#file-updatedevelopersrepoaftermigration-sh
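
    Again, roughly speaking (placeholder URL; see the gist for the exact commands):

    # Run inside each developer's existing local clone.
    # Point the existing remote at the new forked repository and refresh the tracking branches.
    git remote set-url origin https://yourorg.visualstudio.com/ProjectB/_git/ForkedRepo
    git fetch origin --prune
    git branch --set-upstream-to=origin/master master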

    My ask for the VSTS product team

    Please make it easier to move Git repositories between team projects while keeping the fork relationship and everything else.

    https://visualstudio.uservoice.com/forums/330519-visual-studio-team-services/suggestions/17189462-make-it-easier-to-move-a-git-repo-from-one-team-pr

     

    Conclusion

    I hope this post brought a bit of clarity on the best practices and helped some of you fix your situation.

    • 8/1/2018

    New SharePoint Framework PnP Samples available: using the Skype UCWA Web SDK to subscribe to people’s status

    TL;DR;

    I added two new SharePoint Framework web part samples to PnP to demonstrate how to use the Unified Communications Web API (UCWA) JavaScript SDK from Skype for Business. This SDK allows you to do things like subscribe to a person’s status, start instant messaging conversations, make calls…

    To get a look:

    Long version

    I recently had the occasion to make my first contribution to PnP (besides creating issues and helping investigate those). In these two new samples I show you how to leverage the Skype UCWA SDK to subscribe to and display people’s Skype status.

    That Skype status will update itself when it changes (e.g. the target user sets their status to something different in Skype for Business or goes offline).

    This approach is better than the previous ones leveraging Office integration, ActiveX or the SharePoint file user status because:

    • It is cross platform (doesn’t need ActiveX, Internet Explorer, or security zone configuration)
    • It does not require Office to be installed on the machine or the user to be signed in to Skype for Business
    • It updates with status changes
    • It does not need a file to be created and can be rendered anywhere on the page
      • (the file presence indicator is SharePoint talking to the Skype server and rendering some HTML, but it only does so in libraries)

    The UCWA

    The API itself is a bit peculiar and doesn’t necessarily work like a “standard API” (OAuth + REST). The key differences are:

    • You can subscribe to some of the object model (status, conversations, audio/video feeds…)
    • Authentication is a bit (too) complex
      • It’s not standard and expects you to get a 403 in order to get the identity provider information
      • The discovery server also acts like a relay and the “API” is segmented in multiple resources (besides the permissions) which necessitates multiple rounds of authorization.
    • It is not OData capable (something we have gotten used to in the Microsoft world)

    All those differences impact the few SDKs available. The JS SDK (the one used in the samples) is a bit “old fashioned”. I really wish they would spend some time on:

    • Moving to UMD and not a global var
    • Providing npm packages and not a single bundle on the CDN
    • Providing types definitions
    • Providing better version information (+ semver)
    • Allowing developers to leverage their own auth libraries (like hello.js or MSAL) instead of the implementation they have bundled.

    Those are the main reasons why I’m side loading the script and not including it as an external dependency in the samples.

    Lastly, the documentation is a little bit all over the place, I’d suggest you corroborate the information even from official sources because it looks like some of the documentation hasn’t been updated. Here are the entry points:

     

    Microsoft Teams vs Skype

    The Skype for Business APIs are still not available through the Microsoft Graph, and Microsoft announced that Microsoft Teams is going to be the future for unified communications, instant messaging and so much more. However, Skype for Business Server is going to remain the backend (and at least provide some of the APIs) for at least a few years.

    • 5/1/2018

    Speaking at SharePoint Fest DC (Washington) 2018

    SharePoint Fest DC (Washington) 2018 is happening from March 26th to March 30th. This event will feature 2 days of pre-conference workshops and 3 days of conference sessions. You can find more information about it on the website. I’ve been selected, among 44 other speakers, to present two sessions this year:

    AZR204 – Microsoft Graph and SharePoint framework under steroids with Azure Functions

    Modern development means client side first, backend second. However, there are still cases where you might need some backend processing: long running operations, compute-heavy tasks or security concerns.

    During that session we will learn how you can build serverless solutions to support modern development. We will determine together when it makes sense to offload things to the backend and when it does not. We will go through a lot of examples working with the Microsoft Graph as well as the SharePoint Framework.

    Finally, we will see that serverless does not mean hacky solutions and that proper continuous integration and deployment processes can be implemented.

    More information here.

    DEV 302 - Is it possible to do DevOps with the SharePoint framework?

    You had it all right with solutions and add-ins. Your release pipeline was set up. Do new technologies and methodologies mean starting over?

     

    Don’t panic, I’m here to help! Together we’ll see how to set up a DevOps pipeline for SPFx developments with:

    • Automated builds
    • Automated deployments
    • Automated tests
    • Code quality check

    There will be a lot of demonstrations during this session and we’ll be mostly using Visual Studio Team Services/Team Foundation Server.

    This session is mostly meant for developers, architects, quality assurance people…

    More information here.

    I’m truly honored to be part of this prestigious event with so many other great SharePoint/Office 365 speakers. If you haven’t booked your ticket to the event yet, go ahead!

    See you there.

    • 3/1/2018

    Full version of lodash now available in the SharePoint Framework

    TL; DR;

    Microsoft replaced @types/es6-collections with the es2015.collection library in version 1.4.0 of the SharePoint Framework packages. The former had a conflicting definition of WeakMap which caused issues with packages like lodash.

    Long version

    Microsoft recently released v1.4.0 of the SharePoint Framework and its packages. It contains a lot of improvements, and one of them probably went unnoticed by many of us.

    @types/es6-collections has been replaced by the es2015.collection library (native, comes with the compiler/JS engines). The former package had a “special” definition of WeakMap (among other things) which was causing numerous packages not to work properly, including one of my favorites: lodash.

    To work around that issue, Microsoft had to provide @microsoft/sp-lodash-subset which, as its name indicates, is only a subset of lodash and doesn’t provide useful methods like map.

    Which one should I use?

    That’s hard to say at the moment, so instead I’ll give you the pros and cons of using the subset.

    Pros:

    • The subset is lighter, which means faster load times for your users
    • The subset is maintained/checked by Microsoft, which means it is less likely to break SPFx

    Cons:

    • The subset doesn’t have all the lodash features, which means you might have to do more things manually
    • The subset is maintained by Microsoft, which means improvements in lodash won’t get to us as fast
    • The subset is not on a CDN, which can impact your load times depending on a lot of other considerations (HTTP/2, quality of your CDN…)
    • The subset might not be around for long; considering it represents extra work for Microsoft, they might deprecate it to focus on other things

     

    Now, all that being said, nothing prevents you from using both in parallel, and the upgrade path (from subset to full version) is fairly easy once you’re on SPFx >= 1.4.0: it’s just a couple of text replacements.
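
    As a rough illustration (a hedged sketch, assuming your web parts live under src and that you’ve already run npm install lodash @types/lodash), the switch can be scripted like this:

    # Hedged sketch: swap the subset import for the full lodash package across the project.
    Get-ChildItem -Path .\src -Recurse -Include *.ts,*.tsx |
        ForEach-Object {
            (Get-Content $_.FullName -Raw) -replace '@microsoft/sp-lodash-subset', 'lodash' |
                Set-Content $_.FullName
        }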

    • 17/11/2017

    Speaking at SharePoint Saturday Detroit 2017

    Once again this year I have the opportunity to speak at SPS Detroit.
    I’ll be giving a session titled “Improving DevOps using Microsoft's Business Productivity Tools and more”, co-presenting with my friend Haniel Croitoru.
    We'll explore together how DevOps practices impact and improve solution delivery for your customers, for the better. With real life scenarios and experience from the field, we'll show you how you can get started and what to expect out of it.
    If you’re in the area on Saturday, December 2nd, 2017, don’t hesitate to register for the event.

    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.
    This is a good occasion to expand your network, learn a lot of things and spend a good day. The event takes place at Algonquin College.
    See you there!
    • 9/10/2017

    Using PnP PowerShell on Visual Studio Team Services (VSTS) Hosted Agent

    Visual Studio Team Services (VSTS) provides great Continuous Integration (CI) and Continuous Deployment (CD) functionalities you can leverage to implement DevOps pipelines and automation with your custom developments.

    If your custom solutions rely on PnP PowerShell during their build and/or deployment processes, you will need PnP PowerShell to be installed on the agent.
    Unfortunately the Hosted Agents do not have PnP PowerShell installed by default.
    Note: this documentation only applies to the Hosted and Hosted 2017 agents; the Linux Hosted Agent is not supported at the moment.

    Install PnP PowerShell

    Add a first task to your build/release definition (type: PowerShell). In the Type field, select Inline Script.
    In the Inline Script section, copy and paste the following script:

    Install-PackageProvider -Name NuGet -Force -Scope "CurrentUser"
    Install-Module SharePointPnPPowerShellOnline -Scope "CurrentUser" -Verbose -AllowClobber -Force
    

    Note: you can also install a specific version using the -RequiredVersion parameter at line 2.
    Note: you can also improve that script according to your needs as well as save it in a file you include in your repository to best fit your pipeline.
    Note: this module installation task must be included once per agent phase.

    Using PnP PowerShell

    In your scripts leveraging PnP PowerShell, before calling any command related to that module, make sure you include the following line.

    Import-Module SharePointPnPPowerShellOnline -Scope "Local"
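
    For instance, a deployment step could look something like this (a minimal, hedged sketch; the site URL, variable names and uploaded file are placeholders, and the Connect-PnPOnline parameters will depend on your authentication setup):

    # Minimal sketch of a release-task script; URL and variable names are placeholders.
    Import-Module SharePointPnPPowerShellOnline -Scope "Local"
    # Build a credential object from secret variables defined on the release definition.
    $securePassword = ConvertTo-SecureString $env:DeploymentPassword -AsPlainText -Force
    $credentials = New-Object System.Management.Automation.PSCredential ($env:DeploymentUserName, $securePassword)
    # Connect to the target site and upload an artifact to a library.
    Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/yoursite" -Credentials $credentials
    Add-PnPFile -Path ".\dist\bundle.js" -Folder "SiteAssets"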
    

    Uninstalling PnP PowerShell

    Note: this step is optional if you are using the VSTS Hosted Agent and is only provided for people using custom agents on which they do not want to / cannot install PnP PowerShell globally.
    To avoid conflicts if your scripts require a specific version of PnP PowerShell, it is a good practice to clean up after your build/release is done.
    In order to do so, simply add another PowerShell task and copy the script below into the inline script section.

    Get-InstalledModule SharePointPnPPowerShellOnline | Where-Object {$_.InstalledLocation -like "*"+$home+"*"} | Uninstall-Module -Force

    Note: this is a repost of a wiki page I created on the PnP PowerShell repo
    • 6/10/2017

    Determine your technical debt using SonarQube - Conclusion

    Installing and setting up SonarQube may seem quite complex and tedious.

    I hope this series has helped you implement it faster.

    Now, you can clearly identify your technical debt and take actions to improve the quality of your developments.

    Obviously, when a thousand problems appear in the code at once, it can be discouraging; just keep this in mind:

    • There are false positives; make a first pass to ignore/exclude those.
    • Try to have a policy like "no commit should make the situation worse", or even better "each commit must correct all the problems in the edited files", which will allow you to improve the situation little by little.
    • Some organizations prefer to do one or two sprints of technical debt fixing to get a fresh start.

     

    How about you? Did you find this useful? Feel free to comment.

    • 4/10/2017

    Determine your technical debt using SonarQube - Bonus SonarLint extension configuration

    TL; DR

    You can display SonarQube static analysis results live in the Visual Studio error and information console, using the same rule set as the SonarQube project.

    Installing the extension

    Just go to http://www.sonarlint.org/VisualStudio/index.html and proceed with the installation.

    Binding the Visual Studio solution to the SonarQube analysis

    From the Team Explorer click SonarQube

    Click on connect.

    (if you obtain a certificate error, you must install the self-signed certificate of the SonarQube server on your machine)

    To generate a personal access token, refer to the following documentation https://docs.SonarQube.org/display/SONAR/User+Token

    Enter the token in the SonarQube login prompt, as well as the server URL.

    Double click on the SonarQube project that you want to bind to the Visual Studio solution

    The errors detected by SonarQube static analysis now show up as warnings in the error console as well as in IntelliSense.

    JavaScript Projects

    There are cases where you'll only be working on JavaScript/TypeScript using an editor lighter than the full version of Visual Studio, for example Visual Studio Code. With SonarQube, the static analysis for JavaScript projects primarily relies on ESLint and TSLint. To have static analysis work from within your code editor, you only need to install the corresponding extensions and add a few configuration files to your source base.

    If you want to analyze TypeScript, you'll also need to install the TS plugin; you'll find all the details here: https://github.com/Pablissimo/SonarTsPlugin

    • 2/10/2017

    Determine your technical debt using SonarQube - Monitoring the results

    TL; DR

    Static analysis errors will appear as warnings in the compilation section. A static analysis badge will also appear on the build report, and you'll be able to get detailed and comprehensive information from SonarQube.

    Information incorporated with the build

    When displaying the details of a build, you'll now find a new section dedicated to SonarQube. Within that section, besides the quality badge, you'll also find a link to the static analysis results details. Also under the build section, all static analysis critical issues will show up as warnings.

    Note: this only shows up for MSBuild-based projects.

    Details available in SonarQube

    From your SonarQube web portal, you'll find detailed static analysis results indicating how the code got better or worse. Using SonarQube you can build new dashboards that will help you get a clear vision, at a glance, of your code quality and how to improve it.

    • 29/9/2017

    Determine your technical debt using SonarQube - Creating the SonarQube project

    TL; DR

    SonarQube allows you to create projects. These projects will hold your code analysis results. You can configure a SonarQube project for each code repository, or even for each branch, to have different deltas (e.g. my master branch builds every month so I want to see month-over-month changes, while my dev branch builds daily so I want to see the evolution on a day-by-day basis).

    Creating the project

    Go to "Configuration" -> "Projects" -> "Management", then "Create Project".

    Keep the project key in mind, we will need this parameter later when setting up the builds.

    • 29/9/2017

    Determine your technical debt using SonarQube - Updating your build definitions

    TL; DR

    Static analysis will be executed when building your source base using the central build machine. You have two options to set this up with VSTS:

    • Your project is "Visual Studio" related and leverages sln and csproj/vbproj files: in that case you can leverage the integrated pre- and post-build tasks provided by the SonarQube VSTS extension.
    • Your project is not built using MSBuild: in that case you must leverage the SonarQube CLI task. It's a little bit more complicated, so I'll only demonstrate the first case for now.

    Adding tasks to the build definition

    We'll go under the assumption that you're already using Build 2015/vNext and that you already have working build definitions for at least one project.

    Edit your build definition and add two SonarQube tasks: place the "begin analysis" task before the Visual Studio Build task, and the "end analysis" task after the execution of unit tests (that way, if you have code coverage results, they will surface in SonarQube).

    Under the "begin analysis" task, set the endpoint, the project name and the project key, as well as optional parameters if needed.

    In my case I also added /d:sonar.exclusions=**/*.min.js,**/*.min.css to the optional parameters to exclude minified JavaScript and CSS files from the analysis.
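
    If you ever need to reproduce what these two tasks do from your own machine, the underlying scanner invocation looks roughly like this (a hedged sketch; project key, name, URL and exclusions are placeholders):

    # Rough sketch of the commands the VSTS tasks wrap (SonarQube Scanner for MSBuild).
    MSBuild.SonarQube.Runner.exe begin /k:"my-project-key" /n:"My Project" /v:"1.0" /d:sonar.host.url="https://sonar.example.com" /d:sonar.exclusions="**/*.min.js,**/*.min.css"
    msbuild MySolution.sln /t:Rebuild
    MSBuild.SonarQube.Runner.exe end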

    Note: these settings can also be specified in the global SonarQube settings or in SQ project settings.

    Note: Java must be installed on the build machine if you are using your own build machines.

    Note: I recommend adding a 'SonarQubeAnalysisTimeoutInSeconds' variable to the build definition with a value of "600". This extends the time-out for the static analysis: sometimes your SonarQube machine has several results to import at once and stalls a little, which can cause builds to take longer and/or time out.

    If you're working on a non-MSBuild project, just use the CLI task somewhere after you've built and unit tested your code.

    • 27/9/2017

    Determine your technical debt using SonarQube - Creating and configuring a service account for VSTS in SonarQube

    TL; DR

    To prevent just anyone from sending analysis results to our SonarQube installation, we need to secure access to its services. To do so, we'll configure a service account.

    Creating the service account

    From SonarQube, go to administration, security, users, and add an account.

    Next, click on the "tokens" cell for the account we just created and generate a new personal access token.

    You can also refer to that documentation if you're not sure how to generate a PAT https://docs.SonarQube.org/display/SONAR/User+Token

    Provisioning the service account

    To leverage this service account in VSTS, go to your team project, click Settings, then Services, and click on "New service endpoint" and "SonarQube". Then enter the URL of your SonarQube installation, a display name for the connection and the personal access token.

    • 25/9/2017

    Determine your technical debt using SonarQube - Setting up Azure Active directory for authentication

    TL; DR

    We will install and configure an add-on to delegate authentication to Azure Active Directory. This will allow our developers to use the same account between Visual Studio Team Services and SonarQube.

    Configuration of the authentication module

    Since version 5.4, SonarQube provides an additional plugin relying on the OAuth protocol to communicate with AAD. This allows users to leverage their corporate account to access SonarQube, providing SSO and simplifying the administrators' job by keeping a central identity repository.

    The setup procedure is already well documented, rather than duplicating it, here is a link to the resources.

    https://Github.com/baywet/azure-docker-SonarQube#step-6-configure-authentication

    Installing the SonarQube extension to VSTS

    Visual Studio Team Services provides a highly extensible model to third parties so they can integrate their solution with VSTS.

    SonarQube has implemented build tasks and service definitions for VSTS. Before being able to leverage SonarQube from VSTS, you first need to install the corresponding extension.

    To do so, just click on the link provided below and click Install; you need to be a team project collection administrator to install extensions.

    https://marketplace.visualstudio.com/items?itemName=SonarSource.SonarQube

    Note: on-premises TFS installations require a few more steps; see this link:

    https://blogs.msdn.microsoft.com/visualstudioalm/2016/03/31/team-foundation-server-extensions-2/

    • 22/9/2017

    Determine your technical debt using SonarQube - Adding modules

    TL; DR

    Static analysis works by leveraging rules. These rules are grouped by language or language category into modules that you can install. In addition to providing support for the corresponding languages, these modules can extend the native capabilities of SonarQube.

    Most of them are free, some are subject to commercial licenses.

    Installing Add-ons

    Open SonarQube, go to Configuration, then System, and search for and install the modules that you're interested in.

    Once all the modules are installed, you need to restart the server using the button available in SonarQube's UI.

    • 20/9/2017

    Determine your technical debt using SonarQube - Opening SonarQube’s ports

    TL; DR

    Open ports 22, 9000, 80 and 443 inbound on the VM.

    Details of the opening of ports

    Rather than repeating what is already documented, I will provide you with the link

    https://Github.com/baywet/azure-docker-SonarQube#step-2-opening-firewall-ports 

    It is necessary to open ports 22, 80, 443 and 9000, which respectively allow remote shell access to the machine, loading HTTP and HTTPS content, and access to the management console.
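
    If you prefer to script it, a rough sketch with the AzureRM PowerShell module could look like this (resource names are placeholders and it assumes a network security group is already attached to the VM's NIC or subnet):

    # Hedged sketch: open the SonarQube ports on an existing network security group.
    $nsg = Get-AzureRmNetworkSecurityGroup -Name "sonar-nsg" -ResourceGroupName "sonar-rg"
    $priority = 100
    foreach ($port in 22, 80, 443, 9000) {
        $nsg | Add-AzureRmNetworkSecurityRuleConfig -Name "Allow-$port" -Direction Inbound -Access Allow `
            -Protocol Tcp -SourceAddressPrefix "*" -SourcePortRange "*" `
            -DestinationAddressPrefix "*" -DestinationPortRange $port -Priority $priority | Out-Null
        $priority += 10
    }
    $nsg | Set-AzureRmNetworkSecurityGroup | Out-Null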

    • 18/9/2017

    Determine your technical debt using SonarQube - Installing the machine

    TL; DR

    We will update the machine, install docker, and provision the containers we need.

    Installing Docker and updating the machine

    Connect to the machine using SSH (PuTTY is a very good client for Windows) and run the following commands:

    https://Github.com/baywet/azure-docker-SonarQube#step-4-Setup-docker

    Setting up containers, creating the certificates

    The containers are the components of our system managing the web traffic and providing the SonarQube service.

    To secure connections, we will also generate self-signed SSL certificates, which is not the easiest thing to do when you are not used to working with Linux environments. That is likely to be the case for developers using Visual Studio Team Services (or TFS), since they mostly come from the Windows world.

    I shared configuration scripts on GitHub to help you. Obviously, if you have your own certificates, or if your environment already has some pre-existing configuration, you can edit these scripts.

    (see the SSL part of the script)

    https://github.com/baywet/azure-docker-sonarqube#step-5-configure-all-containers 
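
    To give you a rough idea of what ends up running on the machine (a hedged sketch; the scripts linked above are the authoritative, complete version, and the connection values are placeholders):

    # Hedged sketch: the SonarQube container pointed at the Azure SQL database,
    # fronted by an nginx container that terminates SSL. All values are placeholders.
    sudo docker run -d --name sonarqube -p 9000:9000 \
        -e SONARQUBE_JDBC_URL="jdbc:sqlserver://yourserver.database.windows.net:1433;database=sonarqube" \
        -e SONARQUBE_JDBC_USERNAME="sonaradmin" \
        -e SONARQUBE_JDBC_PASSWORD="yourpassword" \
        sonarqube
    sudo docker run -d --name nginx -p 80:80 -p 443:443 --link sonarqube \
        -v /etc/nginx/conf.d:/etc/nginx/conf.d -v /etc/nginx/certs:/etc/nginx/certs nginx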

    • 15/9/2017

    Determine your technical debt using SonarQube - Creating the database

    TL; DR

    Create an Azure SQL database with the collation set to SQL_Latin1_General_CP1_CS_AS.

    Details of the database creation

    The Azure SQL database creation steps are already well documented; the crucial detail is to use the following collation: SQL_Latin1_General_CP1_CS_AS (and use a blank template).

    https://Github.com/baywet/azure-docker-SonarQube#step-3-create-the-azure-SQL-database
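
    If you'd rather script the database creation, here is a hedged sketch with the AzureRM PowerShell module (resource group, server and database names are placeholders; the service objective is just an example):

    # Hedged sketch: create the database with the case-sensitive collation SonarQube requires.
    New-AzureRmSqlDatabase -ResourceGroupName "sonar-rg" `
        -ServerName "sonar-sql-server" `
        -DatabaseName "sonarqube" `
        -CollationName "SQL_Latin1_General_CP1_CS_AS" `
        -RequestedServiceObjectiveName "S0"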

    Keep the database access settings (FQDN of the server, username, password, database name) somewhere; we will need them later.

    Don't forget to open the firewall of the SQL Server for connections from Azure.

    • 13/9/2017

    Determine your technical debt using SonarQube - Provisioning the SonarQube VM

    TL; DR

    We'll provision an Ubuntu server in Azure, and install PuTTY and WinSCP on your local machine.

    Details of provisioning

    Here is a link to documentation explaining how to do it:

    https://Github.com/baywet/azure-docker-SonarQube#step-1-create-the-virtual-machine-in-Azure

    That's it! The machine is being provisioned!

    In the meantime, take the opportunity to download an SSH terminal if you don't have one; I recommend http://www.PuTTY.org/ (you can also install WinSCP, which provides a GUI to transfer files).

    • 11/9/2017

    Determine your technical debt using SonarQube - What parts to use?

    TL; DR

    Planning, Sources, Build, deployment, testing: VSTS. Analysis: Azure VM (SonarQube), Azure SQL.

    Parts of our software factory

    Because we use as many cloud services as possible at 2toLead, I put together the following setup:

    • Source control: VSTS (Git or TFVC, it doesn't matter)
    • Build system: VSTS Build 2015
    • Build machine: provided by VSTS as a service
    • SonarQube machine: Ubuntu Server hosted in Azure
    • SonarQube data: Azure SQL Database, 10 DTUs

    Note that to facilitate the management of the SonarQube "box", we are going to install Docker on the Ubuntu machine. Once Docker is installed, we'll spin up two containers: nginx and SonarQube.

    Why Docker? The philosophy of this article is that processing components (such as SonarQube) are disposable. We can replace them quickly if they stop working or if a new version of SonarQube is available. Our data will reside in Azure SQL.

    Estimated costs

    • VSTS: free because all our developers have MSDN accounts
    • Build machine: charged by the minute (4 hours available for free per month)
    • SonarQube machine: 60 CAD per month
    • Database: 5 CAD per month

    For 65 CAD per month (at public prices), you can have a complete software delivery suite with work management, source control, continuous integration, automated tests, automated deployments, and automated static analysis.

    It took me about an hour to install and configure everything from start to finish, without this series of articles to guide me; it's quite fast to set up.

    • 8/9/2017

    Determine your technical debt using SonarQube - Static analysis

    TL; DR

    Static analysis allows you to understand weaknesses of your code based on a set of rules. You can have it run automatically on a server or from the IDE.

    Introduction to static analysis

    The principle of static analysis is to take advantage of more or less complex rule sets that detect problematic patterns in the code, categorize their importance and suggest a resolution.

    A few examples:

    • Non-static methods working with static fields can cause concurrency problems (thread safety).
    • Overly complex Boolean expressions that have been modified several times may not be meaningful anymore and may cause erratic behavior in the program.

    There are two main families of static analysis tools. The first one is called "centralized" or "automated"; it is generally executed once the code has been pushed to source control. These analyses usually run during your CI (Continuous Integration) on your build servers, so that developers are free to do something else during that time.

    The other family is called "integrated", which means the analysis happens in (almost) real time while the developer is writing code: for example ReSharper, the analyzers available with Roslyn, etc. This avoids pushing bad code to source control and having to fix it afterwards.

    Note: in some scenarios, we could perfectly well set up 'gated check-ins', which means that the code won't be accepted by source control until the static analysis has run on the new source base and given positive feedback.

    Ideally you will have both kinds of static analysis, based on a common set of rules so that they give the same results. We'll see that this is perfectly possible with SonarQube and SonarLint for .NET, or TSLint for TypeScript/JavaScript developers.

    • 7/9/2017

    Speaking at SharePoint Saturday Ottawa 2017

    Once again this year I have the opportunity to speak at SPS Ottawa.


    I’ll be giving a session titled “Migrate your custom components to the SharePoint Framework”. We'll see how you can migrate your existing investments in SharePoint development (either full trust solutions or add-ins) to the new SharePoint Framework. Migrating these components will not only help you stay ahead of the technology, but will also improve the user experience and help you migrate to Office 365.


    If you’re in the area on Saturday, October 28th, 2017, don’t hesitate to register for the event.
    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.


    This is a good occasion to expand your network, learn a lot of things and spend a good day. The event takes place at Algonquin College.
    See you there!

    • 6/9/2017

    Determine your technical debt using SonarQube - Introduction

    TL; DR

    This series will explain how to set up automated code quality analysis, almost free of charge, with Visual Studio Team Services, Docker, Azure, and SonarQube.

    Preamble

    There is bad quality code in every development project. This goes from the quick and dirty hack we are not proud of, to the long-forgotten code written by a developer who quit the company.

    The problem with this code is that it eventually accumulates and slows down the pace of new feature delivery. The reasons are various: a lot of time spent on bug fixes, refactoring, support...

    To get out of this vicious circle, we must know what the current baseline is and then update the picture periodically. That'll allow us to see the amount of work required to correct the situation, and whether we are improving it or making it worse.

    I had the chance to attend a presentation of SonarQube, which is the tool / platform we are going to study during this series of articles. You should know that SonarQube is not the only option out there. Here are the key advantages of this solution:

    • it is open source
    • it is relatively well known among Java developers
    • it is supported by a strong community

    During the Microsoft Build 2014 keynote, Microsoft briefly presented it, and they are working with the community to integrate SonarQube with the Microsoft tools and languages.

     

    Writing this series had been delayed by several conferences over the last two years, but I finally got the time to publish it after first discovering and digging into the subject.

    In the meantime, Microsoft has published two articles.

    One is about installing everything locally; I'll explain how to do it online.

    http://blogs.msdn.com/b/visualstudioalm/archive/2015/09/29/QuickStart-analyzing-NET-projects-with-SonarQube-MSBuild-or-Visual-Studio-online-and-third-party-analyzers-stylecop-ReSharper.aspx

    Another one is about installing SonarQube on a VM hosted in Azure (on Windows).

    https://blogs.msdn.Microsoft.com/visualstudioalmrangers/2016/10/06/easily-deploy-SonarQube-server-in-Azure/

    By the time you read this article, all the posts in the series are already scheduled for publication; if you ever want to revisit one post or another, you can leverage my blog's search feature.

    Objectives

    While progressing on my writing, I realized that there is a tremendous number of things to explain. Hence these suggested objectives for this series:

    • Introduction to the notion of technical debt (already done) and automated static analysis 
    • Introduction to SonarQube
    • Getting you up and running in a few hours on a first project and this for free (apart from the cost of running the machines)
    • Giving you some tips to finalize the configuration on your projects

    Your installation

    To perform your installation, you have two options:

    • Either you are in a hurry and already know what I'm about to explain in detail; in that case you can jump directly to the installation instructions on GitHub: https://Github.com/baywet/azure-docker-SonarQube
    • Or you're discovering Azure, Docker and SonarQube and you want more than just the step-by-step instructions; in that case, follow this series of blog posts.