After struggling with this requirement for more than a day, and reading far too much about the OAuth2 protocol, I was finally able to accomplish it, and thought it would save some time to document the process for future use.
So here are the required steps:

Create a new project with Visual Studio 2015

  1. Create a web application using one of the ASP.NET 5 templates. If you choose the “Web Application” template, set the authentication option to “No Authentication”.
  2. On the project properties -> Debug, set the “Enable SSL” checkbox and change the “App URL” to use the https protocol with the SSL port. e.g.:
     Project Properties

Configure the ADFS 3.0 server

  1. On your ADFS server, open the “AD FS Management” console.
  2. Select the “Relying Party Trusts” node and click “Add Relying Party Trust…”.
  3. Select “Enter data about the relying party manually” and click “Next”.
  4. Choose a display name for the relying party trust (usually the same name as your Visual Studio solution).
  5. Keep the default selection of “AD FS profile” and keep clicking “Next” until the “Configure Identifiers” step.
  6. Under “Relying party trust identifier”, type your application URL (e.g. https://localhost:44356) and click “Add”.
  7. Keep clicking “Next” until you finish the wizard.
  8. When you click the “Close” button, the “Edit Claim Rules” wizard will open. There are many options there, but a standard configuration will include sending the username, display name and some roles as claims. Click “Add Rule”, leave the default selected template as “Send LDAP Attributes as Claims” and click “Next”. On the “Configure Claim Rule” tab, configure the rule like this:
     Add Rule
  9. Click Finish and close the wizard.
  10. Run a PowerShell console as administrator and execute the following code, replacing the values of the “relyingPartyName” and the “appUri” variables with the relevant values:
    Import-Module ADFS
    $relyingPartyName = "ADFSExample"
    $appUri = "https://localhost:44356"
    $clientId = [guid]::NewGuid()
    $redirectUri = "$appUri/oauth2"
    Add-AdfsClient -Name $relyingPartyName -ClientId $clientId -RedirectUri $redirectUri
    Write-Host "Client Id: $clientId`nClient Uri: $appUri`nCallback Path: /oauth2"
  11. Take a note of the output of the script as you will need it later in the process.
  12. Export the ADFS token-signing certificate by selecting “Service” -> “Certificates” in the “AD FS Management” console. Select the “Token-signing” certificate and click “View Certificate…”. On the Details tab, click “Copy to File…”, keep all the defaults and save the file. Copy the resulting file to the main folder of your application (the same folder that contains the “wwwroot” folder).

Configure the project to use ADFS

  1. Back in Visual Studio, extract the 2 attached files (link at the bottom) and include them in your project.
  2. You will need to add some reference packages to make the code compile. The easiest way to do it is to open the “OAuthAdfsAppBuilderExtensions.cs” file and use the quick action (Ctrl-dot) to add the references. The required packages are:
    • “Microsoft.AspNet.Authentication.Cookies”: “1.0.0-rc1-final”
    • “Microsoft.AspNet.Authentication.OAuth”: “1.0.0-rc1-final”
    • “System.IdentityModel.Tokens.Jwt”: “5.0.0-rc1-211161024”
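    If you prefer to add the packages manually instead of using the quick action, the corresponding entries in the RC1-era project.json “dependencies” section would look roughly like this (a sketch based on the package list above):

    ```json
    "dependencies": {
      "Microsoft.AspNet.Authentication.Cookies": "1.0.0-rc1-final",
      "Microsoft.AspNet.Authentication.OAuth": "1.0.0-rc1-final",
      "System.IdentityModel.Tokens.Jwt": "5.0.0-rc1-211161024"
    }
    ```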
  3. Open the Startup.cs file.
    • Add the following using statements:
      using System.IO;
      using C60.OAuthAdfs;
      using Microsoft.Extensions.PlatformAbstractions;
    • Locate the “Configure” method and add another parameter to it: IApplicationEnvironment appEnv. (DI will inject the value of this parameter automatically when the method is called.)
    • Add the following code at the beginning of the “Configure” method:
      var oauthConfig = Configuration.GetSection("OAuth");
      app.UseOAuthAdfsAuthentication(option =>
      {
          option.FederationServiceIdentifier = oauthConfig.Get<string>("Issuer:FederationServiceIdentifier");
          option.AuthorizationEndpoint = oauthConfig.Get<string>("Issuer:AuthorizationEndpoint");
          option.TokenEndpoint = oauthConfig.Get<string>("Issuer:TokenEndpoint");
          option.TokenSigningCertificateFile = Path.Combine(appEnv.ApplicationBasePath, oauthConfig.Get<string>("Issuer:TokenSigningCertificateFile"));
          option.ClientUri = oauthConfig.Get<string>("Client:Uri");
          option.ClientId = oauthConfig.Get<string>("Client:ClientId");
          option.CallbackPath = oauthConfig.Get<string>("Client:CallbackPath");
          option.UsernameClaimType = oauthConfig.Get<string>("ClaimsType:Username");
          option.RoleClaimType = oauthConfig.Get<string>("ClaimsType:Role");
      });
  4. Open the “appsettings.json” file and add the “OAuth” section as follows:
       "OAuth": {
          "Issuer": {
             "FederationServiceIdentifier": "http://adfs.dev.local/adfs/services/trust",
             "AuthorizationEndpoint": "https://adfs.dev.local/adfs/oauth2/authorize",
             "TokenEndpoint": "https://adfs.dev.local/adfs/oauth2/token",
             "TokenSigningCertificateFile": "adfs.cer"
          },
          "Client": {
             "Uri": "https://localhost:44356",
             "ClientId": "dcd1c090-b7e0-42a7-af49-a18d6f3f944c",
             "CallbackPath": "/oauth2"
          },
          "ClaimsType": {
             "Username": "winaccountname",
             "Role": "role"
          }
       }

    You will need to replace the following configuration values:

    • FederationServiceIdentifier – the identifier of the ADFS server. If you don’t know this value, you can leave it as-is; the first time you execute the application, the token validator will throw an exception containing the expected value. (Setting this parameter to that value will eliminate the exception.)
    • AuthorizationEndpoint – The ADFS OAuth endpoint with the “/authorize” suffix.
    • TokenEndpoint – The ADFS OAuth endpoint with the “/token” suffix.
    • TokenSigningCertificateFile – The name of the certificate file that you exported in step 12 of the previous section.
    • Client section – Provide the values from the PowerShell output you noted in step 11 of the previous section.
    • ClaimsType – Depends on the rule configuration you did in step 8 of the previous section. (If you kept the standard configuration, you don’t need to change anything.)
  5. Decorate the controllers that require authentication with the [Authorize] attribute (you will need to add a using Microsoft.AspNet.Authorization; at the top of the file), or add a policy (using the “AuthorizationPolicyBuilder”) that will be applied globally, e.g.:
     Attribute
  6. Execute the application, and browse to a controller that requires authentication. You will be redirected to the ADFS server and after successfully authenticating you will be redirected back to the application.


I know that Windows 2016 is coming and will support OpenID Connect, which is supposed to be simpler to configure, but until then I would love to see Microsoft improve its support for this configuration. Hopefully, it will be integrated into Visual Studio’s “Create New Project” wizard as it was for MVC 5.

Download the code to include (2 KB)


A while ago I wrote a post about integration between SharePoint 2010 and System Center Orchestrator, and the solution has been used successfully in my company ever since. Since then, we moved to SharePoint 2013, and Microsoft released its new runbook engine – “Service Management Automation” (SMA) – which integrates with the Windows Azure Pack (WAP). So, after we deployed WAP, I was looking for a way to integrate our SharePoint with the new runbook engine, but found that the existing solutions are the same as they were for Orchestrator – implemented by an infinite loop that queries a SharePoint list at a fixed interval. As I explained in my previous post, this implementation is not efficient: it executes a lot of unnecessary queries and adds a delay of up to the interval length before the runbook kicks off. So I decided to adapt my previous solution to the new SMA engine.

The attached solution uses SharePoint BDC to expose the SMA runbooks as an external list. The list contains a column called “InitValue”, and by updating its value, the selected runbook is triggered and the new value is provided as the runbook’s first parameter. This external list allows users to create a SharePoint workflow (using SharePoint Designer) that executes an SMA runbook by using the “Update List Item” action, picking the desired runbook from the list and setting the parameter’s value. To send more than one parameter, you can pass the current record Id as the parameter and then query the record’s values from the runbook. The solution also allows the administrator to filter the runbooks that are exposed by the external list, to include only runbooks that are tagged with a specific name.

Required Ingredients:

  1. SMA server with at least one runbook that accepts zero or one parameters
  2. SharePoint 2013 with the BDC service and the Secure Store Service enabled and attached to your web application.
  3. The “SMA BDC connector” (download link at the bottom of this post)


  1. Deploy the provided WSP File. You can accomplish that by executing the following cmdlets on the SharePoint server:
    Add-SPSolution {extract path}\SmaBdcConnector.wsp
    Install-SPSolution -Identity SmaBdcConnector.wsp -GACDeployment
  2. On the Business Data Connectivity Service set the object permissions of the new “Runbook” type and give the end users the Execute permission.
  3. Create a new Secure Storage Target Application with the following configurations:
    1. Name: “C60.SmaBdcConnector”
    2. Type: “Group Restricted”
       Secure Storage Target Application
    3. On the fields definition page, keep the first 2 default fields and add the last 2:
       Name                 Type                 Mask
       Windows User Name    Windows User Name    False
       Windows Password     Windows Password     True
       URL                  Generic              False
       Tag Filter           Key                  False

       BDC Fields

    4. On the permission page, set the Members to the end users group. I used “Everyone”.
       BDC Permissions
  4. Choose the new target application and click “Set credentials”:
    1. Username/Password – a user that has permission to execute the SMA runbooks. The username should include the domain (Domain\username). You can grant a user the required permission by adding it to the local group “SMA Administration Group” on the SMA server.
    2. URL of the SMA services. The format is (if deployed with the default port): https://{SMA Server}:9090/00000000-0000-0000-0000-000000000000
    3. Optional – to filter the runbooks list to a specific tag, type the required tag name in the “Tag Filter” field.
       BDC Set Credentials Fields
  5. Create an External List based on the “SmaBdcModel” External content type.
  6. Edit one of the workflows on the list and set the value of the “InitValue” field to the value of the first parameter.
  7. Check that the runbook was executed.

Download the WSP file and source (15.2 MB)

Mail apps for Outlook make developing Outlook customizations simple and straightforward. A mail app is just a webpage that is hosted inside Outlook. Outlook activates it and makes it available to the user contextually, with respect to the item the user is currently viewing. The user controls starting any available mail app, and the app runs seamlessly across the Outlook rich clients, Outlook Web App and OWA for Devices, so you need to install a mail app only once per mailbox and it will work on all the devices and Outlook clients it is designed for.
When a user starts the app, Outlook provides a context object that contains information about the current item and also enables access to the Exchange Web Services (EWS) of the current mailbox (using the mailbox’s “makeEwsRequestAsync” method).
I have been asked to provide, inside Outlook, some statistical information about the current email’s sender like the number of messages, number of unread messages etc. After considering the available options, I have found that implementing such a requirement with a mail app is the easiest method.  The main disadvantage relative to a classic Outlook Add-in is that the app can work only on Exchange Online or Exchange Server 2013 or a later version, but this was not an issue in my environment.
In my app (source available at the bottom) I used the EWS services to query the sender information. The “makeEwsRequestAsync” method expects a string parameter with a valid SOAP request message, but creating this SOAP message manually can be burdensome. I found it easier to create a .NET console application that queries the relevant information using the EWS proxy, and then use an application like Fiddler to capture the generated SOAP message. The rest of the application is very straightforward: just HTML with light JavaScript to bind the information to the page.
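To give a feel for what such a captured request looks like, here is a hedged sketch of a helper that builds a FindItem SOAP message for a given sender. The element names follow the EWS schema, but the function name and the exact restriction are illustrative, not taken from the app’s source:

```javascript
// Illustrative sketch only: builds a FindItem SOAP envelope like the one
// you would capture with Fiddler from an EWS proxy call. The restriction
// below (Contains on message:From) is an assumption for the example, not
// necessarily the exact query the Sender Info app issues.
function buildFindItemRequest(senderEmail) {
  return '<?xml version="1.0" encoding="utf-8"?>' +
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"' +
    ' xmlns:t="http://schemas.microsoft.com/exchange/services/2006/types"' +
    ' xmlns:m="http://schemas.microsoft.com/exchange/services/2006/messages">' +
    '<soap:Header><t:RequestServerVersion Version="Exchange2013"/></soap:Header>' +
    '<soap:Body>' +
    '<m:FindItem Traversal="Shallow">' +
    '<m:ItemShape><t:BaseShape>IdOnly</t:BaseShape></m:ItemShape>' +
    '<m:Restriction>' +
    '<t:Contains ContainmentMode="Substring" ContainmentComparison="IgnoreCase">' +
    '<t:FieldURI FieldURI="message:From"/>' +
    '<t:Constant Value="' + senderEmail + '"/>' +
    '</t:Contains>' +
    '</m:Restriction>' +
    '<m:ParentFolderIds><t:DistinguishedFolderId Id="inbox"/></m:ParentFolderIds>' +
    '</m:FindItem>' +
    '</soap:Body></soap:Envelope>';
}

// Inside the mail app, the request would then be sent like:
// Office.context.mailbox.makeEwsRequestAsync(
//   buildFindItemRequest(sender), function (result) { /* parse result.value */ });
```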
After starting the app, it will look something like this:
Sender Info App
In the Configuration.js file, you can configure the “More info about …” link to navigate to a report about the current sender. For example, in my environment, I used it to open a webpage with information from our CRM about the specific contact, if one exists. The configuration file also allows modifications to any of the displayed labels.
The “Sender Info” app deployment process includes two main steps:

  1. Deploy the web site: Copy the content of the Website folder to any Web server that can serve content over HTTPS and is accessible to your users. Optionally, update the file Configurations.js (under the “AppRead” folder) to set a specific “More info about …” link.
  2. Deploy the app manifest file:
    1. Edit the SenderInfoOutlookApp.xml file, find each occurrence of “~remoteAppUrl” and replace it with the URL of the web site from the previous step.
    2. Using the Exchange Admin Center (EAC), install the updated manifest file as described here.

Download the app and the source

So I needed to automate some configuration tasks on a Cisco ASA firewall, and thought it would be an easy task since the device has an SSH interface. But after a couple of failed tries and some searching on the web, I realized that I could not use the standard SSH command mode to access the ASA, and that the only working and reliable solution out there (that I found) was in this post: “How to automate scripted commands to a Cisco ASA via ssh“. However, it relies on the Linux “Expect” command, and in my case, I preferred to execute the script directly from the System Center Orchestrator machine, which is Windows-based. Some blogs mentioned the Windows Plink.exe command as an option too; this solution worked, but it did not allow validations or extra logic during the script execution, as the script is sent to the device in one block. I also found the PowerShell module “SSH from PowerShell using the SSH.NET library”, which sounded promising at first, but it works with the standard SSH command mode, and when trying to use it, I was not able to connect to my ASA firewall.
Finally, I decided to develop my own PowerShell module based on the SSH.NET library, but unlike the above module, it uses only the SSH shell stream to interact with the device. The tricky part of working with a shell stream is that there is no notification when a command execution is completed. One way to overcome this is by checking for available data on the output stream. Most commands are easy to handle, because it is reasonable to assume that the command execution is completed as soon as there is something in the output stream. The problem is that this assumption does not hold for long-running commands that report their progress during execution. To support these commands, I added support for specifying a timeout before assuming the command has completed, along with the option to specify a regular expression that ignores progress messages while waiting for the command output. The module also handles cleaning extra BS (\u0008) characters from the output stream; these noise characters usually appear when executing a long command.
Proof of concept – script to create a new network object:

Import-Module SshShell

$elevatedPrompt = "#.$"
$configPrompt = "\(config\)#.$"
$objectPrompt = "object\)#.$"

$s = New-SshSession -SshHost $asaIP -User $user -Password $password
Send-SshCommand $s "enable" -Expect "Password:"
Send-SshCommand $s "$elevatedPassword" -Expect $elevatedPrompt

Send-SshCommand $s "show run object id $objectId" -Expect $elevatedPrompt

if ($s.LastResult -match "does not exist") {
	Send-SshCommand $s "conf t" -Expect $configPrompt
	Send-SshCommand $s "object network $objectId" -Expect $objectPrompt
	Send-SshCommand $s "description $description" -Expect $objectPrompt
	Send-SshCommand $s "host $hostIP" -Expect $objectPrompt
	Send-SshCommand $s "end" -Expect $elevatedPrompt
	Send-SshCommand $s "write mem" -Expect "[OK]" -WaitUnlimitedOn "configuration\.\.\.|Cryptochecksum|copied"
}

Close-SshSession $s


  • These PowerShell variables are prepopulated with values and have self-explanatory names: $asaIP, $user, $password, $elevatedPassword, $objectId, $description, $hostIP.
  • The value of the “Expect” parameter is a regular expression. If the result of the command doesn’t match that expression an exception will be thrown.
  • To access the result of the Send-SshCommand cmdlet you can either use the cmdlet output or use one of the session variable properties: LastResult, LastResultLine or AllResult.
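The completion-detection idea described above – poll the shell stream, treat the command as done when the expected prompt pattern matches, and let progress messages reset the timeout for long-running commands – can be sketched as follows. This is an illustrative sketch only, not the module’s actual implementation, and the function and parameter names are hypothetical:

```javascript
// Illustrative sketch of the SshShell completion-detection loop.
// readChunk() stands in for reading the SSH shell stream and returns ""
// when no data is available yet; maxPolls plays the role of the timeout.
function waitForPrompt(readChunk, promptRegex, waitUnlimitedOn, maxPolls) {
  var output = "";
  var polls = 0;
  while (polls < maxPolls) {
    var chunk = readChunk();
    if (chunk !== "") {
      output += chunk.replace(/\u0008/g, ""); // strip BS noise characters
      if (promptRegex.test(output)) {
        return output; // prompt matched - the command has completed
      }
      if (waitUnlimitedOn && waitUnlimitedOn.test(chunk)) {
        polls = 0; // progress message - keep waiting without timing out
      }
    }
    polls++;
  }
  throw new Error("Timed out waiting for prompt. Output so far: " + output);
}
```

For example, feeding the loop the chunks of a “write mem” run would accumulate the output, drop the \u0008 noise, reset the timeout on each “Building configuration…” style message, and return as soon as the elevated prompt appears.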

To deploy the module, just copy the SshShell folder to one of the PSModulePath locations (for an Orchestrator server, copy it to “%SystemRoot%\SysWOW64\WindowsPowerShell\v1.0\Modules”) and make sure the dll files are not blocked. The module works with PowerShell 2.0 and requires the .NET Framework 3.5.

Download the module and the source

Sending email notifications about a work item in Service Manager is a common requirement that doesn’t really exist out of the box. A while ago, Travis published a plug-in called “SCSM SendEmail”. Though this plug-in fills that gap, it was still missing some common requirements:

  • It supports only incidents – you cannot send email notifications about a service request.
  • Adding a new email template requires a new workflow and manual updates to the management pack XML, which means the end user cannot do it alone.
  • There is no way to send notifications to email addresses that do not exist in the SCSM user list. In my environment, for example, I needed to send the notifications to all the email addresses in a specific field of the work item.
  • It does not provide a way to “Set First Response” or change the work item status to “Pending”.
  • It has some multi-user/multi-message reliability issues:
    • If you try to send the same message content again, even with a different template, it will not be sent and there will be no indication that the message was not sent.
    • If you try to send a second message while the work item is open in the console, the first message will not be sent, again with no indication.
    • If you try to send a second message while the work item is not open in the console, before the previous sending workflow has started, it will send two emails, but both with the content of the second message.

All these issues forced me to develop a custom solution (When I started to work on this, Travis’s project was part of the exchange connector and was not published as an open source).
I built the console task “Send notification” and a supporting workflow, using the “SendEmail Activity” that I showed in my previous post, which solved all of the above problems and also allowed the console user to choose between the affected user and the assigned user as the email recipient.

Send Notification Task


Send Notification Dialog Box



When you select “Affected Contacts” or “Both” as the recipient, the email will be sent to the affected user and to any email address in the “alternate contact” field.
The templates drop-down list shows all the email templates that relate to an incident/service request, depending on the current item type. This way, any end user with the right permissions can simply add more templates and start using them immediately.
If the user tries to send a second message before the first one has been sent, they will get a popup message asking them to try again in a few seconds.

The solution contains two management packs:
C60.Incident.Notification.mpb –

  • Contains the type extensions for incident and service request. I’m using the same extension field names as in Travis’s solution, adding one extra field called “MessageToAssignedUser” that sets the notification recipient.
  • Contains the “Send Notification” console task and a supporting assembly. The package also hides the out-of-the-box “Mark as first response” and “Request User Input” console tasks.
  • Contains empty email templates (the email body will include just the message from the dialog box) for incidents and service requests, which will be used when the user chooses “No Template” in the templates drop-down.

C60.Incident.Notification.WF.xml –

  • Contains two workflows (one for incident and one for service request) that do the actual sending. Here is the workflow as shown in the authoring tool:
    Send Notification Workflow


    The workflow’s trigger condition is when a work item’s “MessageTemplateId” field is updated from null to a different value. When executed, the workflow checks the “MessageToAssignedUser” field and based on its value sets the notification recipients:

      • True – will send the notification to the assigned user.
      • False – will send the notification to the affected user / contacts.
      • Null – will send the notification to the assigned user and the affected user / contacts.

    After the send email activity, the workflow will clear the “MessageTemplateId” field.
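The three-way recipient check can be sketched as follows. This is illustrative pseudologic only; the actual selection happens inside the workflow activity, and the names here are hypothetical:

```javascript
// Hypothetical sketch of the recipient selection driven by the
// "MessageToAssignedUser" field, which can be true, false or null.
function selectRecipients(messageToAssignedUser) {
  if (messageToAssignedUser === true) {
    return ["assigned user"];
  }
  if (messageToAssignedUser === false) {
    return ["affected user", "affected contacts"];
  }
  // null - send to both the assigned user and the affected user / contacts
  return ["assigned user", "affected user", "affected contacts"];
}
```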

To deploy the solution, just import both packages using the SCSM console and copy the content of the “SCSMExtension” folder to the SCSM folder (%ProgramFiles%\Microsoft System Center 2012\Service Manager). The “SCSMExtension” folder contains an assembly for each workflow and my custom workflow activity assembly.

To open the workflows in the authoring tool you will need to deploy the custom workflow activities as described in my previous post.

The source code and binaries for this article are available here

The Problem:
There are many blog posts and discussions about sending email notifications from Service Manager. Out of the box, Service Manager already has strong notification capabilities, such as:

  • A template editor that allows the end user to insert relevant entity fields into the mail body
  • The ability to let the end user specify event conditions that will trigger a specific notification, using a simple wizard.

What is missing is the capability to use this functionality from inside an SCSM workflow.
There are some solutions provided by Travis (SendEmail) and German (http://scsmsource.blogspot.co.uk/2011/04/work-item-assignmentreassignment.html); however, neither solution allows customizing the workflow in the Authoring Tool.
In my SCSM environment, I needed the capability to send notifications as part of more complex workflows, and I also needed to send notifications to external email addresses not defined in the SCSM users DB.

The Solution:
After studying German’s solution, I noticed that he is using an out-of-the-box workflow (from the ‘WorkflowFoundation’ assembly) to send the notifications. I dug deeper (using Reflector) and found that the workflow uses an activity called ‘SendNotificationsActivity’ – exactly what I was looking for. Unfortunately, this activity is not compatible with the Authoring Tool, so to use it, I needed to wrap it with my own custom activity. In doing so, I also added a property that can contain email addresses, separated by semicolons, and these addresses will be added to the email recipients. (A link to the final result + source code can be found at the bottom of this post.)

The attached assembly contains the “Send Email Notification” activity and some other useful activities that can help build more advanced workflows. For deployment instructions, see: How to Install a Custom Activity Assembly


To send an email, drag the “Send Email Notification” activity to the required section on the workflow designer and configure the following properties:

Property Name             Description
EmailAddress              List of email addresses, separated by semicolons, that will be added to the email recipients.
InstanceId                Related object Guid. This object will be used as the template source object.
PrimaryUser 1             Primary user’s internal Id (Guid) that will get the email.
PrimaryUserRelationship 1 Full name of the relationship to the instance class which contains the primary user.
TemplateDispalyName 2     Name of the email template that will be used.
TemplateId 2              Email template internal Id (Guid).

1 – Either PrimaryUser or PrimaryUserRelationship has to be configured.
2 – Either TemplateDispalyName or TemplateId has to be configured.


SendEmailNotification Configuration

Similar to the other solutions, this solution is not supported by Microsoft, which means you might need to make some adjustments in future versions of SCSM. Hopefully, these kinds of basic capabilities will be included in the product itself in the next version, and we will not need such custom solutions.

In the next post I will show you how I used these activities to create an improved “SendEmail” solution.
The source code and binaries for this article are available here

The problem:
SharePoint workflows, which are based on Microsoft WF, are a great way to automate processes that require human interaction, and can be easily managed through SharePoint Designer without any custom development. But when it comes to automating IT processes, Microsoft provides another workflow engine called Orchestrator (part of System Center). Orchestrator has a variety of activities and integration packs that make it a powerful tool for implementing IT runbooks.
I needed a solution that would let users who design SharePoint workflows using SharePoint Designer easily create workflows that include the execution of an Orchestrator runbook. Using the SharePoint OIP (Orchestrator Integration Pack) you can monitor a SharePoint list for a change and execute a runbook as a result (as described here). However, this monitoring approach is based on polling, so, for example, if I use the default 30-second polling interval with the above approach for a task that is usually required once a week, it will generate more than 20,000 unnecessary queries against SharePoint. Then, when the user updates the list item, it can take up to 30 seconds before the workflow starts.
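The 20,000 figure follows directly from the arithmetic of a fixed 30-second polling interval running around the clock:

```javascript
// One poll every 30 seconds, 24/7, regardless of whether the list changed.
var secondsPerWeek = 7 * 24 * 60 * 60;              // 604800 seconds in a week
var pollIntervalSeconds = 30;                       // default polling interval
var queriesPerWeek = secondsPerWeek / pollIntervalSeconds;
console.log(queriesPerWeek);                        // 20160 queries for a task needed once a week
```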
And so, it would seem that triggering the runbook from SharePoint would be a much better solution.
