Thursday, December 6, 2018

Problems calling web services from Nintex Forms for O365

We have tried to call web services from a Nintex form for O365. We have a single button that calls a weather web service and tries to get the result back so we can process it. Here are a few configurations we have tried, but none of them are working. If you have any idea how to make this work, Nintex forms could be used extensively in SharePoint Online and O365.

1. The Nintex web request control seems the best way. We tried to use the enterprise web request control, but it does not seem to be available for Nintex Forms for O365. It might only be available for the on-premises Nintex version; can you please confirm?


The Call HTTP Web Service action is only available in Nintex workflow, not forms. Is this correct?

2. We tried to use the JavaScript action button to call custom JavaScript, as in the following screenshots. Here are two questions.






a. The JavaScript button is only available in classic forms. When will this be available in modern forms?

b. We tried to configure the JavaScript as posted in this blog, along with similar scripts, but the web service call always fails with an error. Do you have suggestions on how to configure the web service call and retrieve the return values?

The script is shown below.

function getHarryListTitle() {

    alert("Before Ajax call");

    NWF$.ajax({
        url: "https://api.weather.gov/points/39.7456,-97.0892",
        type: "GET",
        dataType: "json",
        // Note: the original call set "contentType: application/json" on this GET,
        // which forces a CORS preflight request and may be why it always failed;
        // a GET request does not need a content type.
        success: function (data) {
            alert("After Ajax call");
            alert(JSON.stringify(data));
        },
        error: function (jqXHR, textStatus, errorThrown) {
            alert("Error Ajax call");   // Always fails here!
            alert(jqXHR.responseText + "  " + jqXHR.status);
        }
    });
}

Please let us know if you have any suggestions.

Wednesday, December 5, 2018

How to set up Nintex workflow to read configuration values

When we use O365 Nintex workflow variables, it is a good practice to configure them in an external store such as a SharePoint list to keep them flexible and configurable. Here are two different ways to configure a SharePoint list as the lookup.

We have a configuration list called "HarryConfigDev" with two columns, "Title" and "Value", and two configuration entries with the titles "EndPoint1" and "EndPoint2". They point to two different web sites. This will be used in method #1 to retrieve the configuration in the "Value" column.


We also have a SharePoint list named "HarryDev" with one column named "EndPoint". This will be used in method #2 to retrieve the "HarryConfigDev" configuration in the "Value" column when the list column "EndPoint" value equals the "Title" of the configuration entry.



Here is the configuration in the Nintex workflow set variables for method #1. Please note the value to look up is hard-coded as "EndPoint1".



Here is the configuration in the Nintex workflow set variables for method #2. Please note the value is dynamically retrieved from the list field "EndPoint".


The logs show the results from both methods.


Depending on your situation, you might need to use one of these methods.
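Outside of the Nintex designer, the difference between the two methods can be sketched in plain JavaScript. The entries and URLs below are hypothetical stand-ins for the lists in the screenshots above:

```javascript
// Hypothetical "HarryConfigDev" entries (Title/Value pairs).
const harryConfigDev = [
    { Title: "EndPoint1", Value: "https://site1.example.com" },
    { Title: "EndPoint2", Value: "https://site2.example.com" },
];

// Look up a configuration Value by its Title.
function lookupConfig(configList, title) {
    const entry = configList.find((e) => e.Title === title);
    return entry ? entry.Value : null;
}

// Method #1: the lookup key is hard-coded, like the workflow variable set to "EndPoint1".
const method1 = lookupConfig(harryConfigDev, "EndPoint1");

// Method #2: the lookup key comes from the current item's "EndPoint" column.
const listItem = { EndPoint: "EndPoint2" }; // hypothetical "HarryDev" item
const method2 = lookupConfig(harryConfigDev, listItem.EndPoint);
```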


Monday, December 3, 2018

Detailed procedure: integrating a Nintex workflow with an Azure function


Nintex and Azure Functions are heavily used in my new company for O365, and there is a need to integrate Nintex workflows with Azure functions. The first task for me was to configure a Nintex workflow to call an Azure function and retrieve the returns from the function. Here are the details I would like to keep as my reference.

1. The first step is to set up a SharePoint Online list that can test the Nintex workflow. The list is an out-of-the-box simple list with the default "Title" field.

2. The second step is to set up a simple Azure function as indicated in this blog. You could use Visual Studio to create a default Azure function and deploy it to Azure as described here. Then you need to identify the endpoint that your Nintex workflow can call.

In your new function, click </> Get function URL at the top right, select default (Function key), and then click Copy. 



The endpoint will look like this:

Since this is a simple function that only takes one input query key/value (a name such as "Bob") to display "Hello Bob", the key and value will be passed in the URL.
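As a sketch of how the pieces of the URL fit together (the host, function name, and key below are placeholders, not the real endpoint):

```javascript
// Compose an Azure function call URL from the base endpoint, the function
// key, and the query parameter the function expects. All values here are
// hypothetical placeholders.
function buildFunctionUrl(baseUrl, functionKey, name) {
    const url = new URL(baseUrl);
    url.searchParams.set("code", functionKey); // default (function) key
    url.searchParams.set("name", name);        // input the function echoes back
    return url.toString();
}

const callUrl = buildFunctionUrl(
    "https://myfuncapp.azurewebsites.net/api/HttpTriggerCSharp1",
    "placeholder-function-key",
    "Bob"
);
```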

3. The third step is to set up the Nintex workflow. Here are the details.
a. Set up the Nintex workflow variables as in the screenshot.


b. Set the variables through Build Dictionary action as in the screenshot.

c. Add the workflow action Call HTTP Web Service to call the Azure function, using the function endpoint as described before. Please note the URL is hard-coded here; we will discuss how to get it from a configuration list.

d. Add Log to History List action to debug the function.


After save and publish, the overall workflow will look like this:

4. Run the workflow from the list item and verify the workflow history.

You can see the response status is "OK" and the response result is "Hello Bob".

Now you can integrate an Azure function with a Nintex workflow. The next step will be to create Azure functions that return other formats like XML or JSON, then use the Nintex workflow to retrieve the returned result and process the business logic. This should be similar to text returns.
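For example, if a future version of the function returned JSON instead of plain text, the caller would parse the body before mapping values into workflow variables, roughly like this (the response shape is hypothetical):

```javascript
// Hypothetical JSON body returned by a future version of the function.
const responseText = '{"greeting": "Hello Bob", "status": "OK"}';

// Parse the body and pull out individual values, similar to what a Nintex
// "Get an Item from a Dictionary" action does with the response dictionary.
const result = JSON.parse(responseText);
const greeting = result.greeting; // "Hello Bob"
const status = result.status;     // "OK"
```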



Monday, September 17, 2018

Ally the two best Azure features (Azure Functions and Azure Automation) to support long-running processes with event triggers

There are two major features Azure provides that will help SharePoint Online development.

  1. Event triggers like Azure Functions, Flow, and Logic Apps. 
  2. Back-end long-running processes like Azure Automation that can run inside Azure or on a Hybrid Worker. 

The issue with the Azure event triggers listed above is that they normally cannot run for a long time, may not support PowerShell, and may be difficult to run against a Hybrid Worker.

The issue with Azure Automation is that it cannot be triggered by events like a queue message or SharePoint list item creation. However, Azure Automation can utilize a Hybrid Worker to work with SharePoint on-premises, as we explained in a previous blog.

The hope is that if we can combine the event triggers with Azure Automation, the sky is the limit! Here are the details.

One of the most interesting cases is using Azure to handle events like SharePoint events. If we need to handle events triggered from SharePoint, we need to implement an Azure feature like Azure Functions. Here are the triggers supported at this time for Azure Functions.


·        HTTPTrigger - Trigger the execution of your code by using an HTTP request. For an example, see Create your first function.
·        TimerTrigger - Execute cleanup or other batch tasks on a predefined schedule. For an example, see Create a function triggered by a timer.
·        GitHub webhook - Respond to events that occur in your GitHub repositories. For an example, see Create a function triggered by a GitHub webhook.
·        Generic webhook - Process webhook HTTP requests from any service that supports webhooks. For an example, see Create a function triggered by a generic webhook.
·        CosmosDBTrigger - Process Azure Cosmos DB documents when they are added or updated in collections in a NoSQL database. For an example, see Create a function triggered by Azure Cosmos DB.
·        BlobTrigger - Process Azure Storage blobs when they are added to containers. You might use this function for image resizing. For more information, see Blob storage bindings.
·        QueueTrigger - Respond to messages as they arrive in an Azure Storage queue. For an example, see Create a function triggered by Azure Queue storage.
·        EventHubTrigger - Respond to events delivered to an Azure Event Hub. Particularly useful in application instrumentation, user experience or workflow processing, and Internet of Things (IoT) scenarios. For more information, see Event Hubs bindings.
·        ServiceBusQueueTrigger - Connect your code to other Azure services or on-premises services by listening to message queues. For more information, see Service Bus bindings.
·        ServiceBusTopicTrigger - Connect your code to other Azure services or on-premises services by subscribing to topics. For more information, see Service Bus bindings.


At this time, the most cost-effective plan for Azure Functions is the consumption plan. However, the max timeout for the plan is 10 minutes. If you have a long-running process, you may have to switch to an App Service plan. We found there are many cases where we do need to run a process longer than 10 minutes but would still like to leverage the cheaper consumption plan. Here is the solution: utilize Azure Automation.

The reason most of the time we could not use Azure Automation directly is that there is no trigger like a queue trigger to invoke it, which limits Azure Automation usage. However, there is a way to create a webhook on top of an Azure Automation runbook, and then the runbook can be called from an Azure function. The details are described in this blog. As a result, the implementation looks like this.
  1. Create an Azure function triggered by the desired event
  2. Create an Azure Automation runbook to do the real work, such as PowerShell to process a report
  3. Create a webhook on top of the Azure Automation runbook
  4. Invoke the Azure Automation webhook from the Azure function

The architecture is as described below.



The code in the Azure function to invoke the Azure Automation webhook looks like this.

    ....
    $webhookurl = 'https://s2events.azure-automation.net/webhooks?token=[secrettoken]'
    $body = @{"SITETITLE" = $siteTitle; "SITEURL" = $siteURL}
    $params = @{
        ContentType = 'application/json'
        Headers = @{'from' = 'Harry Chen'; 'Date' = "$(Get-Date)"}
        Body = ($body | convertto-json)
        Method = 'Post'
        URI = $webhookurl
    }
    #Invoking call
    Invoke-RestMethod @params -Verbose
    ....

As a result, we can ally the two best Azure features, Azure Functions and Azure Automation, to support long-running processes at a cheaper price while still being triggered by events!

Please note that other event triggers like Flow and Azure Logic Apps can also replace the Azure function for your own purposes.

Unified SharePoint 2013 on-premises and SPO site provisioning with SharePoint Framework (SPFx), Azure Logic Apps, Queue, Azure Function, and Azure Automation using Hybrid Runbook Worker


In a previous post we discussed Office 365 site self-provisioning with SharePoint Framework (SPFx), Azure Logic Apps, Queue, and Azure Function. In our company we are utilizing the same architecture and framework to provision BOTH SharePoint Online sites and SharePoint on-premises sites. The architecture also unifies the site provisioning for SharePoint on-premises 2013, 2016, and 2019. Here are the architecture and detailed design for the implementation.

The architecture is almost identical to what we implemented previously for SharePoint Online. The only difference is we have added an Azure Automation runbook using a Hybrid Runbook Worker that can run the SharePoint on-premises site creation. The previous Azure function will call the Azure Automation webhook. 

The key part here is to configure the Azure Automation runbook, such as PowerShell, to run through a Hybrid Runbook Worker. This way the PowerShell from Azure Automation can provision the on-premises site through the on-premises site creation method. Here is the architecture of the Azure Automation Hybrid Runbook Worker.


If you have set up the Azure Automation Hybrid Runbook Worker, everything is almost the same as what we configured previously for the SharePoint Online site provisioning process. The overall architecture diagram is listed below.



  1. User enters site provisioning information from the SharePoint Framework (SPFx) form on the SPO site.
  2. The new list item created triggers the Azure Logic App to send the request for approval.
  3. If the request is approved, the Azure Logic App sends the request to an Azure queue with the request ID, which is the list item ID. We have separated the two queues; if the request is for on-premises site creation, the request is sent to the on-premises queue.
  4. The Azure queue triggers the Azure function.
  5. The Azure function is triggered by the new message from the queue.
  6. The Azure function calls the Azure Automation runbook through a webhook. Azure Automation uses the Hybrid Runbook Worker to provision the site collection using on-premises web services, then performs post-provisioning steps.
  7. Azure Automation updates the SPO site request list with the correct status.
  8. The Logic App reads the SPO site request list status.
  9. The Logic App sends the email notification on the site creation.
This process is almost identical to the SharePoint Online site provisioning process except for steps 6 & 7, where the Azure function calls Azure Automation. You can refer to the previous blog for the other steps.

Step 6 is the Azure function calling the Azure Automation runbook through a webhook. You can set up an Azure Automation runbook webhook using the procedure described here. The Azure Automation runbook will look like the screenshot below. Please remember to select the Hybrid Worker for "Run On". You need to record the webhook URL after it is created; you may not be able to get it later!


Then you can call this Azure Automation runbook with code like the following.

#Import dll from library
Import-Module "D:\home\site\wwwroot\qcsbxssspqueseprocessPS\Modules\SharePointPnPPowerShellOnline\3.0.1808.1\SharePointPnPPowerShellOnline.psd1" -Global;
#Get trigger content
$requestBody = Get-Content $triggerInput -Raw | ConvertFrom-Json
#$itemID = $requestBody.ItemId
$siteTitle = $requestBody.SiteTitle
$siteURL = $requestBody.SiteUrl
$output_SiteTitle = "SITETITLE: " + $siteTitle
Write-Output $output_SiteTitle
$output_SiteUrl = "SITEURL: " + $siteURL
Write-Output $output_SiteUrl

try
{
    #region Constructing webhook call body
    $webhookurl = 'https://s2events.azure-automation.net/webhooks?token=<webhookToken>'
    $body = @{"SITETITLE" = $siteTitle; "SITEURL" = $siteURL}
    $params = @{
        ContentType = 'application/json'
        Headers = @{'from' = 'Harry Chen'; 'Date' = "$(Get-Date)"}
        Body = ($body | ConvertTo-Json)
        Method = 'Post'
        URI = $webhookurl
    }
    #Invoking call
    Invoke-RestMethod @params -Verbose
    Write-Output "Call Invoked - awaiting updating list item"
    # Todo: Wait and query the SharePoint request list
    #Connect-PnPOnline -url $requestListSiteUrl -Credentials $creds
    #Set-PnPListItem -List $requestList -Identity $itemID -Values @{"Provisioning_x0020_Status" = "Provisioned";}
    Write-Output "Item Updated"
    #endregion
}
catch [System.Exception]
{
    $output = "Error Details: " + $_.Exception.Message
    Write-Output $output
}


Step 7 is to use a web service to create the on-premises site. We tried the SP app model on-premises with just CSOM, as mentioned by Vesa Juvonen. However, we ran into some issues. As a result, we are using the on-premises admin web service "_vti_adm/Admin.asmx?WSDL" to create the on-premises site.
The Azure Automation code looks like the following.

......

[CmdletBinding()]
Param(
    [object]$WebhookData,
    [string]$siteTitle,
    [string]$siteUrl)

if ($WebhookData)
{
    Write-Output ("Starting runbook from webhook")
    # Collect properties of WebhookData
    $WebhookName = $WebHookData.WebhookName
    $WebhookHeaders = $WebHookData.RequestHeader
    $WebhookBody = $WebHookData.RequestBody

    # Collect individual headers. Input converted from JSON.
    $From = $WebhookHeaders.From
    $InputBody = (ConvertFrom-Json -InputObject $WebhookBody)
    Write-Verbose "WebhookBody: $InputBody"

    $url = $InputBody.url
    $fileName = $InputBody.fileName
    Write-Output -InputObject ('Runbook started from webhook {0} by {1}.' -f $WebhookName, $From)
    $siteTitle = $InputBody.siteTitle
    $output = "Site Title is: " + $siteTitle
    Write-Output $output
    $siteUrl = $InputBody.siteUrl
    $output = "Site Url is: " + $siteUrl
    Write-Output $output
}
else
{
    Write-Output ("Starting runbook manually")
    Write-Output ("Input Parameters following:")
    $output = "Site Title is: " + $siteTitle
    Write-Output $output
    $output = "Site Url is: " + $siteUrl
    Write-Output $output
}

try
{
    $user = "owner"        # Owner login
    $userName = "owner"    # Owner display name
    # Use a custom variable name here; $pwd is a read-only automatic variable in PowerShell.
    $plainPwd = 'password'
    $adminSiteUrl = "http://SPadmin:port#"   # no trailing slash, to avoid "//" in the WSDL URL
    $securePwd = ConvertTo-SecureString $plainPwd -AsPlainText -Force
    $cred = New-Object PSCredential($user, $securePwd)
    $wsdlUrl = $adminSiteUrl + "/_vti_adm/Admin.asmx?WSDL"
    $svc = New-WebServiceProxy -Uri $wsdlUrl -Credential $cred
    $output = "Before Running Web Service - Site URL: " + $siteUrl + " Site Title: " + $siteTitle
    Write-Output $output
    $svc.CreateSite(
        $siteUrl,   # URL
        $siteTitle, # Title
        "",         # Description
        1033,       # LCID
        "STS#0",    # WebTemplate
        $user,      # Owner Login
        $userName,  # Owner Name
        "",         # Owner Email
        "",         # PortalUrl
        "")         # PortalName

    # Todo: Update the SPO list status with Completed status
    # Post provisioning steps
}
catch
{
    Write-Output "`n[Main] Errors found:`n$_"
    # Todo: Update the SPO list status with error status
}


You can see this is a remote site provisioning process: the Azure Hybrid Runbook Worker can be any on-premises server and does not need to be a SharePoint server. We have verified that the admin web service is supported for SharePoint 2013, 2016, and 2019, so the code is the same for all SharePoint versions.


At this point, we are using the SharePoint farm account to provision the on-premises site. We are testing whether we could use another account to create sites.

Now we have combined both SharePoint Online and SharePoint on-premises site creation into the same Azure process!

Friday, September 14, 2018

Office 365 site self-provisioning with SharePoint Framework (SPFx), Azure Logic Apps, Queue, and Azure Function


About three years ago, when we first looked at Office 365 site provisioning for the company, there were limited options. The best option at that time was using a provider-hosted SharePoint add-in architecture similar to the one posted here. Since then the PnP provisioning package came out and has gone through multiple iterations. With many Azure services available and integrated with O365 now, there are better options like the one published here. We have put together the Office 365 modern provisioning architecture and implemented it using SharePoint Framework (SPFx), Azure Logic Apps, Queue, and Azure Function for the company. We will compare some alternatives and options in this blog. We will cover SharePoint on-premises site self-provisioning with SharePoint Framework (SPFx), Azure Logic Apps, Queue, and Azure Function in another blog.

The Office 365 site self-provisioning architecture is illustrated in the below diagram. Here are the components.
  
  • User enters site provisioning information from SharePoint Framework (SPFx) form on SPO site.
  • The new list item created triggers the Azure Logic App to send the request for approval.
  • If the request is approved, the Azure Logic App sends the request to an Azure queue with the request ID, which is the list item ID.
  • The Azure queue triggers the Azure function to provision the site collection and perform post-provisioning steps.


Here are the details and alternatives for each component.

1. The first component is the SharePoint Online list that stores all the user site requests. The reason we use this is to capture site creation requests for history and analysis. The list should have at minimum the following fields. You could adjust the fields based on your business requirements.
  • SPSiteTitle
  • SPSiteURL
  • SPSiteType
  • SPSiteDescriptions
  • SPSiteOwners
  • SPSiteMembers
  • SPSiteVisitors
  • ApprovalStatus – Capture the approval status
  • ProvisioningStatus – This will be used to indicate if site create successfully

2. The second component is the SharePoint Framework (SPFx) form. You could use PowerApps, a bot, or a SharePoint add-in as alternatives. The reason we selected an SPFx form is that we need extensive field validation. One example is when the user enters the site URL, we need to verify whether the URL already exists in SharePoint or in the request list. Our current SPFx form looks like the screen below.
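The validation logic itself is simple once the existing site URLs and pending request URLs have been fetched (in the real SPFx form these come from SharePoint REST calls; the function and data below are a hypothetical sketch):

```javascript
// Hypothetical sketch of the SPFx form's URL validation: reject a requested
// site URL if it already exists in SharePoint or in the pending request list.
function validateSiteUrl(requestedUrl, existingSiteUrls, pendingRequestUrls) {
    const normalized = requestedUrl.trim().toLowerCase();
    if (existingSiteUrls.some((u) => u.toLowerCase() === normalized)) {
        return { valid: false, message: "Site URL already exists in SharePoint." };
    }
    if (pendingRequestUrls.some((u) => u.toLowerCase() === normalized)) {
        return { valid: false, message: "Site URL is already in the request list." };
    }
    return { valid: true, message: "" };
}

const check = validateSiteUrl("/sites/hr", ["/sites/HR"], []);
// check.valid is false: the URL already exists in SharePoint
```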


3. The third component is the Azure Logic App. The Logic App is triggered when a new site request list item is created in the SharePoint Online list. You could use Flow or a WebJob as alternatives. The reason we selected Azure Logic Apps is that we have not opened Flow to users, and we would like to control all components in the same place, in this case inside an Azure resource group.


4. The fourth component is the Azure storage queue. The queue holds the site provisioning request information that triggers the Azure function. The reason we chose the queue is to decouple the services, and we will extend it to support both SharePoint Online and SharePoint on-premises site provisioning. We decided to only include the "RequestId" in the queue message to indicate the request list item ID. This makes the design flexible, since any fields we need to add to the site creation will not impact the queue. The queue message looks like this:

{
"RequestId": "86"
}
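The enqueue step in the Logic App only needs to serialize the list item ID, which can be sketched as (the function name is illustrative):

```javascript
// Build the minimal queue message: only the request list item ID goes in,
// so adding new site-request fields never changes the queue contract.
function buildQueueMessage(listItemId) {
    return JSON.stringify({ RequestId: String(listItemId) });
}

const message = buildQueueMessage(86);
// message: '{"RequestId":"86"}'
```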

5. The fifth component is the Azure function. We use PnP PowerShell as the implementation. You can use any language, but PnP PowerShell is much simpler. The only concern is that PowerShell support is experimental in the current v1.x Azure Functions runtime. We hope PowerShell will still be supported in the future v2.x runtime. Here is a snippet of the Azure function code.

#Import dll from library
Import-Module "D:\home\site\wwwroot\yourmodulefolder\modules\SharePointPnPPowerShellOnline.psd1" -Global;
Import-Module "D:\home\site\wwwroot\yourmodulefolder\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell" -Global;

#Get trigger content
$requestBody = Get-Content $triggerInput -Raw | ConvertFrom-Json

#Initial variables
$username = $env:sssp_username
$password = $env:sssp_password
$requestListSiteUrl = $env:sssp_requestsiteurl
$requestList = $env:sssp_requestlistname
$siteCollectionRootURL = $env:sssp_rooturl
$SiteDesign ="Topic"    #Right now hard coded to Topic only
$secpasswd = ConvertTo-SecureString $password -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ($username, $secpasswd)

#Get request ID from queue
$requestID = $requestBody.RequestId

#Use the request ID from the queue to retrieve request information from the SPO site request list

    # Connect to request site and get request information based on request ID
    Connect-PnPOnline -url $requestListSiteUrl -Credentials $creds

    #Get request item
    $requestItem = Get-PnPListItem -List $requestList -Id $requestID
    $requestItem

    #Get all required site request information
    $requestSiteTitle = $requestItem["SPSiteName"]
    $requestSiteUrl = $requestItem["SPSiteURL"]
    $requestDescription = $requestItem["SPSiteDescription"]
    $requestSiteOwners = $requestItem["SPSiteOwners"]
    $requestSiteMembers = $requestItem["SPSiteMembers"]
    $requestSiteVisitors = $requestItem["SPSiteVisitors"]
    $requestSiteType = $requestItem["SPSiteType"]

    #Create the site collection
    $siteCollectionURL = $siteCollectionRootURL + $requestSiteUrl

    $newSiteUrl = New-PnPSite -Type CommunicationSite -Title $requestSiteTitle -Url $siteCollectionURL -Description $requestDescription -SiteDesign $SiteDesign

    #Connect to the new created site collection
    Connect-PnPOnline -url $siteCollectionURL -Credentials $creds

    #Add additional site collection admin
    Add-PnPSiteCollectionAdmin -Owners additionalowner@mycompany.com

    foreach ($owner in $requestSiteOwners)
    {
        #Add-PnPUserToGroup -LoginName $ownerEmail -Identity $siteOwnersGroup
        if($owner.Email -ne $null -and $owner.Email -ne "")
        {
            Add-PnPUserToGroup -LoginName $owner.Email -Identity 3
        }
        else
        {
            Add-PnPUserToGroup -LoginName $owner.LookupValue -Identity 3
        }
    }
    # Other post provisioning steps …
    # Connect to request site and set ProvisioningStatus to "Provisioned"

There are a few enhancements you could make. One example is you can set up an app and use the AppId and AppSecret in the PowerShell as below.

Connect-PnPOnline -Url <String> -AppId <String> -AppSecret <String>

The other two enhancements we are considering are using site designs and utilizing PnP templates. At this time, we are enhancing the Office 365 site self-provisioning process to support both SharePoint Online and SharePoint on-premises site creation. We will share the architecture and implementation in the future.