Vince365

    • 27/9/2017

    Determine your technical debt using SonarQube - Creating and configuring a service account for VSTS in SonarQube

    TL; DR

    To prevent just anyone from sending analysis results to our SonarQube installation, we need to secure access to its services. To do so, we'll configure a service account.

    Creating the service account

    From SonarQube, go to Administration, Security, Users, and add an account.

    Next, click on the "tokens" cell for the account we just created and generate a new personal access token.

    You can also refer to the documentation if you're not sure how to generate a PAT: https://docs.sonarqube.org/display/SONAR/User+Token
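
    If you'd rather script these two steps, SonarQube also exposes them through its web API. Here is a minimal PowerShell sketch, assuming the api/users/create and api/user_tokens/generate services of recent SonarQube versions; the URL, credentials, and account names are placeholders to adapt:

        $sonarUrl = "https://sonarqube.contoso.com" # placeholder URL
        $headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("admin:admin")) }

        # Create the service account
        Invoke-RestMethod -Method Post -Headers $headers -Uri "$sonarUrl/api/users/create?login=vsts-service&name=VSTS%20Service&password=ChangeMe123!"

        # Generate a token for it; the token value is returned only once, store it safely
        $response = Invoke-RestMethod -Method Post -Headers $headers -Uri "$sonarUrl/api/user_tokens/generate?login=vsts-service&name=vsts-analysis"
        $response.token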

    Provisioning the service account

    To leverage this service account in VSTS, go to your team project, click Settings, then Services, click "New service endpoint", and pick "SonarQube". Then enter the URL of your SonarQube installation, a display name for the connection, and the personal access token.
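
    For the record, endpoint creation can also be scripted against the VSTS REST API. Treat the sketch below as a rough outline only: the "sonarqube" endpoint type, the authorization scheme, and the api-version are assumptions that may vary with the extension and VSTS versions.

        $vstsAuth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$env:VSTS_PAT")) # a VSTS personal access token
        $sonarToken = "<token from the previous step>"
        $body = @{
            name          = "SonarQube"
            type          = "sonarqube"
            url           = "https://sonarqube.contoso.com"
            authorization = @{ scheme = "UsernamePassword"; parameters = @{ username = $sonarToken; password = "" } }
        } | ConvertTo-Json -Depth 5
        Invoke-RestMethod -Method Post -ContentType "application/json" -Body $body -Headers @{ Authorization = "Basic $vstsAuth" } -Uri "https://myaccount.visualstudio.com/DefaultCollection/MyProject/_apis/distributedtask/serviceendpoints?api-version=3.0-preview.1"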

    • 25/9/2017

    Determine your technical debt using SonarQube - Setting up Azure Active Directory for authentication

    TL; DR

    We will install and configure an add-on to delegate authentication to Azure Active Directory. This will allow our developers to use the same account between Visual Studio Team Services and SonarQube.

    Configuration of the authentication module

    Since version 5.4, SonarQube supports an additional plugin relying on the OAuth protocol to communicate with AAD. This allows users to leverage their corporate accounts to access SonarQube, providing SSO and simplifying the administrator's job by keeping identities in a central repository.

    The setup procedure is already well documented; rather than duplicating it, here is a link to the resources:

    https://github.com/baywet/azure-docker-sonarqube#step-6-configure-authentication
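
    Once the plugin is installed, its settings can be entered through the UI or scripted. Here is a hedged sketch, assuming SonarQube 6.1+ (which exposes api/settings/set) and the sonar.auth.aad.* property keys documented by the plugin; the values are placeholders for your AAD application's details:

        $sonarUrl = "https://sonarqube.contoso.com" # placeholder URL
        $headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("admin:admin")) }
        $settings = @{
            "sonar.auth.aad.enabled"              = "true"
            "sonar.auth.aad.clientId.secured"     = "<AAD application id>"
            "sonar.auth.aad.clientSecret.secured" = "<AAD application key>"
            "sonar.auth.aad.tenantId"             = "<AAD tenant id>"
        }
        foreach ($setting in $settings.GetEnumerator()) {
            Invoke-RestMethod -Method Post -Headers $headers -Uri "$sonarUrl/api/settings/set?key=$($setting.Key)&value=$([uri]::EscapeDataString($setting.Value))"
        }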

    Installing the SonarQube extension to VSTS

    Visual Studio Team Services provides a highly extensible model to third parties so they can integrate their solutions with VSTS.

    The SonarQube team has implemented build tasks and service endpoint definitions for VSTS. Before being able to leverage SonarQube from VSTS, you first need to install the corresponding extension.

    To do so, just follow the link provided below and click Install. Note that you need to be a team project collection administrator to install extensions.

    https://marketplace.visualstudio.com/items?itemName=SonarSource.SonarQube

    Note: on-premises TFS installations require a few more steps; see this link:

    https://blogs.msdn.microsoft.com/visualstudioalm/2016/03/31/team-foundation-server-extensions-2/

    • 22/9/2017

    Determine your technical debt using SonarQube - Adding modules

    TL; DR

    Static analysis works by leveraging rules. These rules are grouped by language or language category in modules that you can install. In addition to providing support for the corresponding languages, these modules can extend the native capabilities of SonarQube.

    Most of them are free; some are subject to commercial licenses.

    Installing Add-ons

    Open SonarQube, go to configuration, then System, and search for and install the modules you're interested in.

    Once all the modules are installed, you need to restart the server using the button available in SonarQube's UI.
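
    If you prefer scripting plugin management, the update center is also exposed through the web API on recent versions. A small sketch, assuming the api/plugins/install and api/system/restart services; the plugin key ("csharp") and the credentials are examples:

        $sonarUrl = "https://sonarqube.contoso.com" # placeholder URL
        $headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("admin:admin")) }
        # Queue the plugin installation, then restart the server for it to take effect
        Invoke-RestMethod -Method Post -Headers $headers -Uri "$sonarUrl/api/plugins/install?key=csharp"
        Invoke-RestMethod -Method Post -Headers $headers -Uri "$sonarUrl/api/system/restart"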

    • 20/9/2017

    Determine your technical debt using SonarQube - Opening SonarQube’s ports

    TL; DR

    Open ports 22, 9000, 80 and 443 inbound on the VM.

    Details of the opening of ports

    Rather than repeating what is already documented, here is the link:

    https://github.com/baywet/azure-docker-sonarqube#step-2-opening-firewall-ports

    It is necessary to open ports 22, 80, 443, and 9000, which respectively let you access the machine's remote shell, load HTTP and HTTPS content, and reach the management console.
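
    If you prefer scripting over clicking through the portal, here is a minimal sketch using the AzureRM PowerShell module; the resource group and NSG names are assumptions to replace with yours:

        $nsg = Get-AzureRmNetworkSecurityGroup -Name "sonarqube-nsg" -ResourceGroupName "sonarqube-rg"
        $ports = @{ "allow-ssh" = 22; "allow-http" = 80; "allow-https" = 443; "allow-sonar" = 9000 }
        $priority = 100
        foreach ($rule in $ports.GetEnumerator()) {
            # One inbound allow rule per port, each with a unique priority
            $nsg = $nsg | Add-AzureRmNetworkSecurityRuleConfig -Name $rule.Key -Protocol Tcp -Direction Inbound -Access Allow -Priority $priority -SourceAddressPrefix * -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange $rule.Value
            $priority += 10
        }
        $nsg | Set-AzureRmNetworkSecurityGroup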

    • 18/9/2017

    Determine your technical debt using SonarQube - Installing the machine

    TL; DR

    We will update the machine, install Docker, and provision the containers we need.

    Installing Docker and updating the machine

    Connect to the machine using SSH (PuTTY is a very good client for Windows) and run the commands documented here:

    https://github.com/baywet/azure-docker-sonarqube#step-4-setup-docker

    Setting up containers, creating the certificates

    The containers are the components of our system that manage the web traffic (nginx) and provide the SonarQube service.

    To secure connections, we will also generate self-signed SSL certificates, which is not the easiest thing to do when you're not used to working with Linux environments. That's likely to be the case for developers using Visual Studio Team Services (or TFS), since they mostly come from the Windows world.

    I shared configuration scripts on GitHub to help you. Obviously, if you have your own certificates, or if your environment already has some pre-existing configuration, you can edit these scripts.

    (see the SSL part of the script)

    https://github.com/baywet/azure-docker-sonarqube#step-5-configure-all-containers 

    • 15/9/2017

    Determine your technical debt using SonarQube - Creating the database

    TL; DR

    Create an Azure SQL database with the collation set to SQL_Latin1_General_CP1_CS_AS.

    Details of the database creation

    The Azure SQL database creation steps are already well described; one crucial detail: use the following collation: SQL_Latin1_General_CP1_CS_AS (and start from a blank template).

    https://github.com/baywet/azure-docker-sonarqube#step-3-create-the-azure-sql-database

    Keep the database access settings (FQDN of the server, username, password, database name) somewhere; we will need them later.

    Don't forget to open the SQL server's firewall to connections coming from Azure.
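
    For reference, here is roughly what that creation could look like with the AzureRM PowerShell module (a sketch; the names and location are examples, and "S0" corresponds to the 10 DTU tier):

        $cred = Get-Credential   # the SQL administrator account
        New-AzureRmSqlServer -ResourceGroupName "sonarqube-rg" -ServerName "sonarqube-sql" -Location "canadacentral" -SqlAdministratorCredentials $cred
        New-AzureRmSqlDatabase -ResourceGroupName "sonarqube-rg" -ServerName "sonarqube-sql" -DatabaseName "sonar" -CollationName "SQL_Latin1_General_CP1_CS_AS" -RequestedServiceObjectiveName "S0"
        # Allow connections originating from Azure (which includes our VM)
        New-AzureRmSqlServerFirewallRule -ResourceGroupName "sonarqube-rg" -ServerName "sonarqube-sql" -AllowAllAzureIPs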

    • 13/9/2017

    Determine your technical debt using SonarQube - Provisioning the SonarQube VM

    TL; DR

    We'll provision an Ubuntu server in Azure and install PuTTY and WinSCP on the local machine.

    Details of provisioning

    Here is a link to documentation explaining how to do it:

    https://github.com/baywet/azure-docker-sonarqube#step-1-create-the-virtual-machine-in-azure
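
    If you'd rather script the provisioning than click through the portal, something along these lines should work (a sketch assuming a 2017-era AzureRM module with the simplified New-AzureRmVM parameter set; names and location are examples):

        Login-AzureRmAccount
        New-AzureRmResourceGroup -Name "sonarqube-rg" -Location "canadacentral"
        $cred = Get-Credential   # the VM administrator account
        New-AzureRmVM -ResourceGroupName "sonarqube-rg" -Name "sonarqube-vm" -Location "canadacentral" -ImageName "UbuntuLTS" -Credential $cred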

    That's it! The machine is being provisioned!

    Meanwhile, take the opportunity to download an SSH terminal if you don't have one; I recommend http://www.putty.org/ (you can also install WinSCP, which provides a GUI to transfer files).

    • 11/9/2017

    Determine your technical debt using SonarQube - What parts to use?

    TL; DR

    Planning, sources, builds, deployments, testing: VSTS. Analysis: an Azure VM (SonarQube) and Azure SQL.

    Parts of our software factory

    Because we use as many cloud services as possible at 2toLead, I put together the following installation:

    • Source control: VSTS (Git or TFVC, it doesn't matter)
    • Build system: VSTS Build (the 2015 build system)
    • Build machine: provided by VSTS as a service
    • SonarQube machine: Ubuntu Server hosted in Azure
    • SonarQube data: Azure SQL Database, 10 DTU

    Note that to facilitate the management of the SonarQube "box", we are going to install Docker on the Ubuntu machine. Once Docker is installed, we'll hydrate two containers: nginx and SonarQube.

    Why Docker? The philosophy of this series is that processing components (SonarQube here) are disposable. We can replace them quickly if they stop working or when a new version of SonarQube is available. Our data will reside in Azure SQL.

    Estimated costs

    • VSTS: free because all our developers have MSDN accounts
    • Build machine: charged by the minute (4 hours available for free per month)
    • SonarQube machine: 60 CAD per month
    • Database: 5 CAD per month

    For 65 CAD per month (at public prices), you can have a complete software delivery suite with work management, source control, continuous integration, automated tests, automated deployments, and automated static analysis.

    It took me about an hour to install and configure everything from start to finish, without this series of articles to guide me; it's quite fast to set up.

    • 8/9/2017

    Determine your technical debt using SonarQube - Static analysis

    TL; DR

    Static analysis helps you understand the weaknesses of your code based on a set of rules. You can have it run automatically on a server or from the IDE.

    Introduction to static analysis

    The principle of static analysis is to take advantage of more or less complex rule sets that detect problematic patterns in the code, categorize their importance, and suggest a resolution.

    A few examples:

    • Non-static methods working with static fields can cause concurrency problems (thread safety).
    • Overly complex Boolean expressions that have been modified several times may not be meaningful anymore and may cause erratic behavior in the program.

    There are two main families of static analysis tools. The first is called "centralized" or "automated"; it generally executes once the code has been pushed to source control. These analyses usually run during your CI (continuous integration) builds on your build servers, so developers are free to do something else during that time.

    The other family is called "integrated", which means the analysis happens in (almost) real time while the developer writes code: for example, ReSharper, the analyzers available with Roslyn, etc. This avoids pushing bad code to source control and having to fix it afterwards.

    Note: in some scenarios, we could perfectly well set up 'gated check-ins', which means the code won't be accepted by source control until the static analysis has run against the new code base and given positive feedback.

    Ideally you will have both kinds of static analysis, based on a common set of rules so they give the same results. We'll see that this is perfectly possible with SonarQube and SonarLint for .NET, or TSLint for TypeScript/JavaScript developers.

    • 7/9/2017

    Speaking at SharePoint Saturday Ottawa 2017

    This year again, I have the opportunity to speak at SPS Ottawa.


    I'll give a talk: "Migrate your custom components to the SharePoint Framework". We'll see how you can migrate your existing investments in SharePoint development (either full-trust solutions or add-ins) to the new SharePoint Framework. Migrating these components will not only help you stay ahead of the technology but will also improve the user experience and help with migrating to Office 365.


    If you're in the area on Saturday, October 28th, 2017, don't hesitate to register for the event.
    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.


    This is a good occasion to expand your network, learn a lot of things, and spend a good day. The event takes place at Algonquin College.
    See you there!

    • 6/9/2017

    Determine your technical debt using SonarQube - Introduction

    TL; DR

    This series will explain *how to set up automated code quality analysis* almost free of charge, with Visual Studio Team Services, Docker, Azure, and SonarQube.

    Preamble

    There is poor-quality code in every development project. It ranges from the quick-and-dirty hack we are not proud of to the long-forgotten code written by a developer who has since left the company.

    The problem with this code is that it eventually accumulates and slows down the delivery of new features. The reasons are various: a lot of time spent on bug fixes, refactoring, support...

    To get out of this vicious circle, we must know the current baseline and then update the picture periodically. That'll allow us to see how much work is required to correct the situation, and whether we are improving it or making it worse.

    I had the chance to attend a presentation of SonarQube, which is the tool/platform we are going to study during this series of articles. You should know that SonarQube is not the only option out there. Here are the key advantages of this solution:

    • it is open source,
    • it is relatively well known among Java developers,
    • it is supported by a strong community.

    During the Microsoft Build 2014 keynote, Microsoft quickly presented it, and they have been working with the community to integrate SonarQube with Microsoft tools and languages.

     

    Writing this series had been delayed by several conferences over the last two years, but I finally got the time to publish it after discovering and digging into the subject.

    In the meantime, Microsoft has published two articles.

    One about installing everything locally, whereas I'll explain how to do it online:

    http://blogs.msdn.com/b/visualstudioalm/archive/2015/09/29/QuickStart-analyzing-NET-projects-with-SonarQube-MSBuild-or-Visual-Studio-online-and-third-party-analyzers-stylecop-ReSharper.aspx

    Another one about installing SonarQube on a VM hosted in Azure (on Windows):

    https://blogs.msdn.microsoft.com/visualstudioalmrangers/2016/10/06/easily-deploy-sonarqube-server-in-azure/

    By the time you read this article, all posts in the series will already be scheduled for publication; if you ever want to revisit one post or another, you can leverage my blog's search feature.

    Objectives

    While writing, I realized that there is a tremendous number of things to explain. Hence these suggested objectives for the series:

    • Introduction to the notion of technical debt (already done) and automated static analysis 
    • Introduction to SonarQube
    • Getting you up and running on a first project in a few hours, for free (apart from the cost of running the machines)
    • Giving you some tips to finalize the configuration on your projects

    Your installation

    To perform your installation, you have two options:

    • Either you are in a hurry and already know what I'm about to explain in detail; in that case, you can jump directly to the installation from GitHub: https://github.com/baywet/azure-docker-sonarqube
    • Or you're discovering Azure, Docker, and SonarQube and you want more than just the step-by-step; in that case, follow this series of posts.
    • 13/7/2017

    Speaking at SharePoint Saturday Brussels 2017

    This year again, I have the opportunity to speak at SPS Brussels.


    I'll give a talk about Azure Functions and Microsoft Flow: "Introduction to Azure Functions and Flow".
    Flow and Azure Functions are two new tools you now have for rapid application development. It's a revolution that changes the way we build and deliver modern applications. Instead of shipping a monolithic block, which can take up to a few months, we'll now deliver each feature as a part of the solution.


    If you're in the area on Saturday, October 21st, 2017, don't hesitate to register for the event.


    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.
    This is a good occasion to expand your network, learn a lot of things, and spend a good day.
    See you there!

    • 11/7/2017

    Re-awarded Microsoft MVP for the year 2017-2018

    You may not know it, but Microsoft has changed the organization of the MVP program over the last few years.
    They used to nominate new MVPs every three months and renew people every year on their anniversary date.
    One of the changes they brought to the program is around the renewal and nomination cycles: new MVPs are now awarded every month, and existing ones are all renewed together every year in July.
    I used to be an "April MVP" and was used to blogging about my renewal in April.
    I'm pleased to announce I've been renewed as an MVP for the year 2017-2018 in the Office Servers and Services category.
    Let’s go for another year :)

    • 6/7/2017

    Speaking at SharePoint Saturday New York City 2017

    This year again, I have the opportunity to speak at SPS NYC.
    I'll give a talk about the SharePoint Framework and DevOps methodologies: "Is it possible to do DevOps with the SharePoint Framework?".
    You had it all figured out with solutions and add-ins. Your release pipeline was set up. Do new technologies and methodologies mean starting over?
    Don't panic, I'm here to help! Together we'll see how to set up a DevOps pipeline for SPFx development with:
    -    Automated builds
    -    Automated deployments
    -    Automated tests
    -    Code quality checks
    There will be a lot of demonstrations during this session, and we'll mostly be using Visual Studio Team Services/Team Foundation Server.
    This session is mostly meant for developers, architects, quality assurance people…
    If you're in the area on Saturday, July 29th, 2017, don't hesitate to register for the event.
    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.
    This is a good occasion to expand your network, learn a lot of things, and spend a good day.
    See you there!

    • 20/6/2017

    SharePoint Framework in my own words

    If you're following the SharePoint community, you've probably heard of the "SharePoint framework in his own words" video series; https://www.voitanos.io/ is behind that initiative.

    (@andrewconnell founded Voitanos recently)

    The idea is to gather thoughts from the community and from people who have been around SharePoint development for a few years. If you don't know that series, I'd encourage you to check it out, and I also hope that the product team behind the framework is listening to that super valuable feedback.

    When I was at Techorama in Belgium a few weeks ago, I had the chance to connect with Andrew, and aside from speaking and beer tasting, I was honoured to be interviewed for that series.

    You can go check it out on his blog: 

    http://vtns.io/spfxinownwords-vincentbiret

    • 16/6/2017

    Update to _spPageContextInfo type definitions - new properties available

    If you're building modern SharePoint components (Framework or not), there's a high chance you're using TypeScript.

    In that case you're probably using @types/sharepoint to provide auto-completion, as well as some level of understanding for the compiler of what's going on.

    There's one object in particular that SharePoint hydrates for us to give some understanding of where the user is and what he/she is doing: _spPageContextInfo.

    A lot of properties were missing from those type definitions: old stuff like the web ID, but also new things coming from SharePoint Online like canUserCreateMicrosoftForm.

    A few days ago I created a pull request to add these properties, and it recently got accepted, so you no longer need to do things like (_spPageContextInfo as any).webId for compilation to go through.

    The entire list of new properties is here: https://github.com/DefinitelyTyped/DefinitelyTyped/pull/17089/commits/4f8942e318d76352cf0e2e9a68e72ba4b21cee55 and you can get them just by updating @types/sharepoint.

    Happy coding!

    • 23/5/2017

    Customizing page layouts, master pages and image renditions in SharePoint Online

    Microsoft recently made changes around look and feel customizations for SharePoint Online.

    If you create a new site collection and try to edit page layouts, master pages, or even image renditions, there's a high chance you'll get an "access denied" error message, even if you're a site collection administrator.

    Investigating further and checking your permissions on the master page gallery library, you will notice that each and every user has a "deny" permission for "Add and Customize Pages". That permission level is not one of the originals we're used to in SharePoint.

    It comes from the tenant settings, which now provide an option for the tenant administrator to lock down "traditional" branding in SharePoint.

    Two ways to unlock this: PowerShell or the Tenant administration interface.

    You’ll need to have the SharePoint Online Management Shell installed on your machine:

    Connect-SPOService -Url https://tenant-admin.sharepoint.com

    Set-SPOSite https://tenant.sharepoint.com -DenyAddAndCustomizePages $false

    (note that this setting unlocks only one site collection, but it takes effect right away)
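
    To check the flag's current value, you can read it back (a small sketch; the -Detailed switch is needed, as far as I recall, for this property to be populated):

        Get-SPOSite https://tenant.sharepoint.com -Detailed | Select-Object Url, DenyAddAndCustomizePages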

    If you don't want to write PowerShell, go to the tenant administration; in the menu, select the SharePoint administration. Go to settings, scroll a little bit, and check "Allow users to create Site Pages".

    If you're doing advanced customization, you might also want to check "Allow users to run custom script on personal sites" and "Allow users to run custom script on self-service created sites", which will allow you to execute PowerShell against these sites.

     

    Thanks Mike for pointing this setting out when I was pulling my hair out.

    Happy SharePoint customizing!

    • 26/4/2017

    Speaking at Techorama Belgium 2017

    Techorama is a 2-day event that takes place in Antwerp, Belgium, from May 22nd to May 24th.

    They lined up great speakers from Microsoft and other big industry players; check out the agenda here: http://techorama.be/agenda/

    I have the pleasure of announcing that I've been selected and I'll be giving two sessions:

    -          Set up your DevOps process for SharePoint/Office365

    -          Migrate your custom components to the SharePoint framework

     

    Tickets are still available, so if you are in the area or can travel there, I would strongly suggest you check it out! http://techorama.be/tickets/

    See you there.

    • 20/4/2017

    Publishing SharePoint Image renditions with PnP Provisioning

    A couple of days ago, I had to work with image renditions in SharePoint again. We're developing an intranet for one of our customers, and that solution partially relies on publishing features.

    Just as a reminder, image renditions were introduced back in 2013 to help you serve the most optimized version of an image, without information workers having to care about sizes and whatnot, and without you having to resize images in code.

    If you haven't come across those, here are some very good resources on the subject:

    http://www.eliostruyf.com/provision-image-renditions-to-your-sharepoint-2013-site/

    https://www.eliostruyf.com/image-renditions-december-cumulative-update-sharepoint-2013/

    Now, in our case we're heavily leveraging PnP provisioning to take care of creating files, site columns, content types, and so on for us. Hence the question: is it possible to configure image renditions using PnP provisioning?

    As a matter of fact it is: you can go ahead and configure all your renditions from the site settings, and then grab the file located at ~/SiteCollection/_catalogs/masterpage/PublishingImageRenditions.xml

    The last thing you need to do is update your PnP template to deploy that new file for you; here is the snippet:

        <pnp:Files>
          <pnp:File Src="PublishingImageRenditions.xml" Folder="{SiteCollection}/_catalogs/masterpage" Overwrite="true" Level="Published">
            <pnp:Properties>
              <pnp:Property Key="ContentTypeId" Value="0x01010012BCF119622FF14793A8A38D5831F25C" />
              <pnp:Property Key="ContentType" Value="Document" />
            </pnp:Properties>
          </pnp:File>
        </pnp:Files>

     

    As a bonus, here is a snippet to make sure the publishing features are turned on, which is required for image renditions:

        <pnp:Features>
          <pnp:SiteFeatures>
            <pnp:Feature ID="f6924d36-2fa8-4f0b-b16d-06b7250180fa" Description="SharePoint Server Publishing Infrastructure" />
          </pnp:SiteFeatures>
          <pnp:WebFeatures>
            <pnp:Feature ID="94c94ca6-b32f-4da9-a9e3-1f3d343d7ecb" Description="SharePoint Server Publishing" />
          </pnp:WebFeatures>
        </pnp:Features>


    • 9/3/2017

    Speaking at SharePoint Saturday Vancouver 2017

    This year again, I have the opportunity to speak at SPS Vancouver.

    I'll give a talk about the SharePoint Framework and DevOps methodologies: "Is it possible to do DevOps with the SharePoint Framework?".

    You had it all figured out with solutions and add-ins. Your release pipeline was set up. Do new technologies and methodologies mean starting over?

    Don't panic, I'm here to help! Together we'll see how to set up a DevOps pipeline for SPFx development with:

    -    Automated builds

    -    Automated deployments

    -    Automated tests

    -    Code quality checks

    There will be a lot of demonstrations during this session, and we'll mostly be using Visual Studio Team Services/Team Foundation Server.

    This session is mostly meant for developers, architects, quality assurance people…

    If you're in the area on Saturday, April 8th, 2017, don't hesitate to register for the event.

    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.

    This is a good occasion to expand your network, learn a lot of things, and spend a good day.

    See you there! 

    • 7/3/2017

    Speaking at SharePoint Saturday Calgary 2017

    This year again, I have the opportunity to speak at SPS Calgary.

    I'll give a talk about Azure Functions and Microsoft Flow: "Introduction to Azure Functions and Flow".

    Flow and Azure Functions are two new tools you now have for rapid application development. It's a revolution that changes the way we build and deliver modern applications. Instead of shipping a monolithic block, which can take up to a few months, we'll now deliver each feature as a part of the solution.

    If you're in the area on Saturday, April 1st, 2017, don't hesitate to register for the event.

    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.

    This is a good occasion to expand your network, learn a lot of things, and spend a good day.

    See you there!

    • 6/1/2017

    Speaking at AOS Tours Ottawa and Toronto

    I'll have the privilege to speak at the next AOS (Azure, Office 365, SharePoint) tour stops in Ottawa (the 8th) and Toronto (the 10th).

    I'll present, in English: Introduction to Azure Functions and Flow

    Flow and Azure Functions are two new tools you now have for rapid application development. It's a revolution that changes the way we build and deliver modern applications. Instead of shipping a monolithic block, which can take up to a few months, we'll now deliver each feature as a part of the solution.

    During this session, we'll introduce those two new services and see, through demos and case studies, how to use them.

    To register for the event, here is the address: http://canada.aos.community/

    See you there!

    • 23/12/2016

    Joining 2ToLead

    Big step for me during this 2016 holiday season, as I decided to leave Negotium for 2ToLead.

    Goodbye Negotium

    Since I joined AlphaMosaik more than four years ago (I originally joined for only 18 months, huh…), through the acquisition by Negotium and up to this day, my working environment and the company I was part of have changed a lot.

    This company allowed me to mature and learn a lot culturally, personally, and professionally. I'd like to take this opportunity to thank everybody I had the occasion to work with during this period.

    Hello 2ToLead

    I'm joining a young, innovative, fast-growing company as an "Office 365 and Azure developer". I'll be lucky enough to work with brilliant people I met within the communities.

    These guys are just impressive: whitepapers with tens of thousands of downloads, sessions at Ignite, fast growth… I hope I'll be able to keep up.

    I'll be the third MVP to join the company, which also counts former Microsoft employees among its ranks.

    Beyond my direct technical expertise, I'd like my new family to benefit from my knowledge of DevOps processes.

    See you soon, and thank you to the many of you who read me.

    • 13/12/2016

    Troubleshooting load balancing issues with SharePoint

    A few weeks ago, I had to get my hands on a customer's SharePoint farm that was showing performance problems. They also had inconsistencies in the data displayed between multiple calls.

    I quickly suspected a configuration issue with the load balancer dispatching calls between the front-end servers. However, I only had access to the SharePoint farm, and I had to provide data to the network team so they could start investigating.

    Hence the question: how can we determine which server served each request across multiple calls?

    Most of the content out there suggests adding a text file containing the server's name at the root of the web application. I'm not a big fan of this solution, for multiple reasons:

    -          Adding/deleting files in SharePoint's content is not something to do without caution, especially if you don't know SharePoint very well.

    -          Depending on the load balancer's configuration (and especially if it's not configured properly), you can have cases where one server answers one request and another server answers the next one; you therefore have to determine which server served each request.

    Obviously I didn’t try solutions like “deploy this custom solution and add that custom web part which is going to tell you which server is doing what”.

    I decided to add an HTTP response header that the server sends along with each response.

    The only downside of that solution is that setting up the header will recycle the application pool, causing a service interruption.

    To set up the header, go to the IIS management console and select the web application you're working with.

    Click on "HTTP Response Headers".

    Add a new entry, giving it the name you want and the server's name as the value.

    (repeat those steps for each server of the farm)
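
    If your farm has many servers, a scripted equivalent might save you some clicks; here is a minimal sketch using the WebAdministration module, to run on each server (the site name and header name are examples):

        Import-Module WebAdministration
        Add-WebConfigurationProperty -PSPath "IIS:\Sites\SharePoint - 80" -Filter "system.webServer/httpProtocol/customHeaders" -Name "." -Value @{ name = "X-ServedBy"; value = $env:COMPUTERNAME }

        # Later, from any machine, check which server answered:
        (Invoke-WebRequest -Uri "http://intranet.contoso.com/" -UseDefaultCredentials).Headers["X-ServedBy"]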

    Using any browser, open the developer tools (most likely by pressing F12), enable request tracing, and look at the response headers of any request. You'll see your header with the name of the server as the value.

    Once you’re done troubleshooting, don’t forget to remove the header for two reasons:

    -          HTTP headers, before HTTP/2 (which as of today is far from widely implemented), are not compressed, which will increase network traffic.

    -          For security reasons, it's never good to expose your servers' names.

     

    Have fun troubleshooting your load balancers.

    • 29/11/2016

    Speaking at SharePoint Saturday Ottawa 2016

    This year again, I have the opportunity to speak at SPS Ottawa.

    I'll give a talk about the Graph: "Making Graph data useful to your company". We'll talk about the Graph, Delve, Yammer, machine learning, and a lot of other interesting concepts; the conversation will be aimed at developers.

    If you're in the area on Saturday, December 3rd, 2016, don't hesitate to register for the event.

    Just as a reminder, SPS events are free events organized by the community, with lots of great sessions.

    This is a good occasion to expand your network, learn a lot of things, and spend a good day. The event takes place at Algonquin College.

    See you there!