Responding to changes in MS Dynamics Portal Yes / No Options (business rules in JavaScript)

As business rules in Dynamics don’t work in the portal, we have to code up the business rules in JavaScript. I’ve done this before, but today the rules were dependent upon a ‘Two Options’ field, i.e. a Yes / No field. The portal draws these out as two radio buttons, so I needed to have code run when the value changes and read the value when the page loads.

Some digging was required, and thanks to this blog and this Stack Overflow answer I found a decent way to get the value and respond. Lots of code…

First up responding to a change in the radio button values

(function ($) {
    $(document).ready(function () {
        // Run once on load, then again whenever the Yes / No radio buttons change
        onLoadAndOnChange();
        $("#fieldname").change(onLoadAndOnChange);
    });
})(jQuery);

So straightforward – the change of radio button fires your change event nicely, so you don’t have to worry about wiring up a change handler to each control that makes up the Yes / No pair in the portal page.

Then the function you can call to respond…

function onLoadAndOnChange() {
    // The checked radio button holds the underlying Dynamics value: "0" for No, "1" for Yes
    var doValidation = $("input[name*=fieldname]:checked").val();
    if (doValidation == "0") {
        removeRequiredFlag('related field name');
        return;
    }
    // .... validators go next
}

The variable ‘doValidation’ gets the underlying value in Dynamics, so a “0” for No and a “1” for Yes. I can then remove any red stars I’ve added to indicate that the related field(s) is/are required (or not). After this piece of code you can add any validators you need (check out this MS article). Remember that you need one validator already in place so there’s a JavaScript array to add your new validators to.
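As a sketch of the “validators go next” part: portal pages keep their client-side validators in a Page_Validators array, and you add your own by pushing an object with an evaluationfunction onto it. Everything below is illustrative – the field name ws_relatedfield and the messages are made up, and on a real page the validator would be a hidden span created with document.createElement("span") rather than a plain object:

```javascript
// Sketch: add a custom required-field validator, assuming the portal's
// standard Page_Validators array. ws_relatedfield is a hypothetical field.
var Page_Validators = Page_Validators || []; // already exists on a real portal page

var validator = {
    id: "RequiredFieldValidator_ws_relatedfield",
    controltovalidate: "ws_relatedfield",
    errormessage: "<a href='#ws_relatedfield_label'>Related field is a required field.</a>",
    validationGroup: "",
    initialvalue: "",
    // The page framework calls this and treats a false return as invalid
    evaluationfunction: function () {
        // On a real page this would read the control, e.g. $("#ws_relatedfield").val()
        var value = (typeof $ !== "undefined") ? $("#ws_relatedfield").val() : "";
        return value !== null && value !== "";
    }
};

Page_Validators.push(validator);
```

Pair this with setRequiredFlag / removeRequiredFlag below so the red star and the validator appear and disappear together.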

One last handy bit of code to add and remove the red star that shows a field is required…

function setRequiredFlag(fieldName)
{
    // Build the span id from the field name so flags on different fields don't collide
    $('#' + fieldName + '_label').after('<span id="' + fieldName + '_requiredFlag" style="color: red;"> *</span>');
}

function removeRequiredFlag(fieldName)
{
    $('#' + fieldName + '_label').nextAll('#' + fieldName + '_requiredFlag').remove();
}

MS Dynamics Portal – Error / Status 204

So back in the world of portals for a while, nice to be back! Though I was initially confused that the portal was showing without any of its styling intact! I hit F12 to get the browser dev tools up, went to ‘Network’ and checked what the issue was with the files coming back from the web server.

(screenshot: error204)

Status 204 – ‘The server has successfully fulfilled the request and there is no additional content to send in the response payload body’. Thanks, Google!

I checked that the web file’s annotation had the appropriate CSS content…

(screenshot: webfileannotation)

OK, all good there; I could open the file, no problem. Then it occurred to me that if the portals team were doing anything slightly odd (i.e. outside of the SDK messages), the 204 makes sense. I hadn’t set up the Document Management solution that was pushing our annotations into Blob storage (see Microsoft Labs Attachment Management – Logic App), but I know how it’s configured. Off to the Document Management settings, where the doc management tool can be switched off for particular entities.

(screenshot: webfilesetting)

One save and a cache clear later and the site was looking good again! Dynamics isn’t difficult, it just has a lot of moving parts….

Microsoft Labs Attachment Management – Logic App

So, in order to save your client a bundle of cash on their Dynamics storage, you’re going to move your attachments into Azure Blob storage. Cool. The Attachment Management solution (available from AppSource) is a tried and tested way of doing this in your online tenant (it doesn’t work for on-prem).

However, it can time out when the initial move is run from the UI if you have a lot of files!

(screenshot: attachment1)

It doesn’t give many clues as to what’s working, but if you have a look in the code you can see a link to this GitHub page, which can create a Logic App from a template for you.

(screenshot: Attachment2)

If you follow the instructions you get a new Logic App in your Azure subscription that copies the existing files from the CRM database to Azure Blob storage. The problem is that the existing script sets the document body of the annotation to the string ‘null’, so you need to change the script to set it to the expression null.

(screenshot: attachment3)
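To illustrate the difference in the Logic App’s code view (a sketch from memory, not the template’s exact JSON – only the documentbody attribute is shown, with the surrounding action omitted): the broken version effectively writes "documentbody": "null", which is just a four-character string. Switching to the expression form below makes the Workflow Definition Language evaluate @null and write a real null instead:

```json
{
  "documentbody": "@null"
}
```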

Now when you run it, your file is accessible from Dynamics and served from Blob storage (you can use Azure Storage Explorer to confirm it’s there).

Azure Chat Bot v4 – QnA Maker

So I’ve had a bit of time to look at some of the cognitive services that Azure hosts – some really cool stuff. Computer Vision worked great from a quickstart (apart from image sizes tripping me up!) and most of the QnA Maker quickstart worked great too, though with a couple of issues. Bear in mind I went with the v4 SDK, not v3, so the use of templates from the Azure page is less point-and-click.

The Q and A bot is made up of two distinct pieces:

  1. The Bot Site itself (a web app), which you can run/test locally before releasing to Azure
  2. The Q and A Knowledge Base, linked to your bot site by the config in the .bot file in your web app (see below)
  • The https://www.qnamaker.ai/ site takes you through creating a Q and A Knowledge Base, allowing you to train and test your work in an easy way. When it comes time to publish, it can guide you through creating the Azure Cognitive Service (for the KB) to publish to as well.
  • I wanted to do a local test of my bot site before pushing the code out to Azure; however, I struggled with the Bot Framework Emulator. From GitHub I expected to be able to build a C# project in VS but no dice, so I ended up going with the installer from GitHub instead.
  • Getting the bot’s code running in VS was straightforward. It’s a local web app that, when you start it, gives you a URL to use in the Emulator. However, the instructions casually say that you should update the .bot file in the project. I assumed the .bot file would have these entries in it for an easy cut and paste. So OOB your .bot file looks like this (from the GitHub samples):
    {
      "name": "QnABot",
      "description": "",
      "secretKey": "",
      "services": [
        {
          "id": "http://localhost:3978/api/messages",
          "name": "QnABot",
          "type": "endpoint",
          "appId": "",
          "appPassword": "",
          "endpoint": "http://localhost:3978/api/messages"
        }
      ]
    }

    Turns out that to make it work it should look like this:

    {
      "name": "QnABot",
      "description": "",
      "secretKey": "",
      "services": [
        {
          "id": "http://localhost:1912/api/messages",
          "name": "QnABot",
          "type": "qna",
          "appId": "",
          "appPassword": "",
          "kbId": "guid here from your https://www.qnamaker.ai settings",
          "endpointKey": "guid here from your https://www.qnamaker.ai settings",
          "endpoint": "http://localhost:1912/api/messages",
          "hostname": "https://yourbotservice.azurewebsites.net/"
        },
        {
          "id": "http://localhost:3978/api/messages",
          "name": "QnABot",
          "type": "endpoint",
          "appId": "",
          "appPassword": "",
          "endpoint": "http://localhost:3978/api/messages"
        }
      ]
    }
    

    You can get all the settings from your QnAMaker site:

    (screenshot: QnAMakerSettings)
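As a rough sketch of how those settings get consumed – this is not the Bot Framework SDK’s actual loader, just plain JSON handling against the .bot shape shown above, with made-up placeholder values:

```javascript
// Sketch: find the QnA Maker settings in a .bot-style config object.
// The object shape matches the .bot file shown above; the guid values
// are placeholders, not real settings.
var botConfig = {
    "name": "QnABot",
    "services": [
        { "type": "endpoint", "endpoint": "http://localhost:3978/api/messages" },
        {
            "type": "qna",
            "kbId": "your-kb-guid",
            "endpointKey": "your-endpoint-key-guid",
            "hostname": "https://yourbotservice.azurewebsites.net/"
        }
    ]
};

function getQnAService(config) {
    // Services are looked up by type; "qna" is the QnA Maker entry
    return config.services.filter(function (s) { return s.type === "qna"; })[0];
}

var qna = getQnAService(botConfig);
// qna.kbId and qna.endpointKey are the values the bot needs to call the KB
```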

Two Way Azure Plugin Walkthrough

There’s plenty of information on the web about how to write two-way plugins, but the problem I found is that none of them are 100% complete in terms of explaining the components (though that may be more down to my lack of understanding…). They make throwaway comments about using Azure Service Bus without describing what needs to be set up in Azure, for example.

So, from the top here’s what you need to make it all work…

  • A Relay in Azure
  • A Listener on the relay (I’m running it locally for this demo)
  • An Azure-aware CRM plugin
  • A registered service endpoint in CRM
  • A step to trigger the plugin in CRM

Azure Relay

Not a queue or a topic, or an event hub (tbh I’ve not tried it with that), but a relay. It doesn’t work like a queue or a topic: when the plugin runs (and the RemoteExecutionContext is pushed out to Azure), the message doesn’t land on a queue and stay there ready for you to pick up with your function app. A relay requires that a listener is already there. So when you use portal.azure.com and create your relay namespace, you don’t need to create a WCF Relay entry in the namespace – when your listener is spun up (locally in my example) you’ll see a dynamic entry appear in your WCF Relays.

(screenshot: wcfrelaynotthere)

No WCF Relay will show if the listener isn’t running

In order to get a WCF listener to run you’ll need to get a SAS key from Azure and use it in your listener code: Settings… Shared Access Policies… RootManageSharedAccessKey… then grab the primary key.

You can spin up a console app with the following code. It needs the NuGet packages for the CRM SDK and Azure Service Bus.

using Microsoft.ServiceBus;
using Microsoft.Xrm.Sdk;
using System;
using System.Diagnostics;
using System.ServiceModel;

namespace ConsoleApp1
{
    // The service Azure relays CRM's execution context to
    class ProblemSolver : ITwoWayServiceEndpointPlugin
    {
        public string Execute(RemoteExecutionContext executionContext)
        {
            Console.WriteLine(executionContext.MessageName);
            return "from azure"; // snappy eh?
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            ServiceBusEnvironment.SystemConnectivity.Mode = ConnectivityMode.Http;

            string serviceNamespace = "yourRelayNamespace"; // your relay namespace goes here
            string issuerName = "RootManageSharedAccessKey";
            string issuerKey = "your SAS primary key from the Azure portal goes here";
            string servicePath = "http-relay"; // you'll need this to register your service endpoint in CRM

            Uri address = ServiceBusEnvironment.CreateServiceUri(
                Uri.UriSchemeHttps,
                serviceNamespace,
                servicePath);

            var sharedSecretServiceBusCredential = new TransportClientEndpointBehavior()
            {
                TokenProvider = TokenProvider.CreateSharedAccessSignatureTokenProvider(issuerName, issuerKey)
            };

            // Using an HTTP binding instead of a SOAP binding for this endpoint
            WS2007HttpRelayBinding binding = new WS2007HttpRelayBinding();
            binding.Security.Mode = EndToEndSecurityMode.Transport;

            // Create the service host for Azure to post messages to
            ServiceHost host = new ServiceHost(typeof(ProblemSolver));
            ((ServiceBehaviorAttribute)host.Description.Behaviors[typeof(ServiceBehaviorAttribute)]).InstanceContextMode = InstanceContextMode.Single;
            host.AddServiceEndpoint(typeof(ITwoWayServiceEndpointPlugin), binding, address);

            // Create the ServiceRegistrySettings behavior for the endpoint
            var serviceRegistrySettings = new ServiceRegistrySettings(DiscoveryType.Public);
            foreach (var endpoint in host.Description.Endpoints)
            {
                endpoint.Behaviors.Add(serviceRegistrySettings);
                endpoint.Behaviors.Add(sharedSecretServiceBusCredential);
            }

            // Begin listening for messages posted to Azure
            try
            {
                host.Open();
            }
            catch (Exception e)
            {
                Trace.TraceInformation($"error {e.Message}");
            }

            Console.ReadLine();
        }
    }
}

When you spin up your Listener you can refresh your list of WCF Relays and you’ll see your dynamic listener!

Create your service endpoint

This bit is a bit trickier. Go back to Azure and pick up the connection string from the shared access policy. Fire up the Plugin Registration tool (you’re getting that from NuGet now, right?) and use Register… Register New Service Endpoint… Enter the connection string you just got from Azure and click Next.

(screenshot: serviceendpointreg)

Now change the message format to JSON – I haven’t confirmed whether leaving it would prevent the code from working; call it homework 😉 Then select the Designation Type of Two-way. Update the namespace address to match the relay, i.e. it starts https://namespace…, adding the name of the relay that is going to be listening, giving you a full URL. This is the ‘servicePath’ value from the listener code above. Update the path with this value too, though if the namespace address already has the path I don’t see why it doesn’t just append this value itself… Anyway, your screen should look like this.

(screenshot: serviceendpointreg2)

Now you just need the service endpoint GUID. You can get it from the properties of the service endpoint in the Plugin Registration tool.

Create your plugin

Your plugin is very straightforward. It’s going to pick up the service endpoint GUID from the unsecure configuration, call the service endpoint, and just trace the return value. I make no apologies for the lack of comments here!!

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using System;

namespace CSharpPluginProject1
{
    public class PluginClass1 : IPlugin
    {
        #region Secure/Unsecure Configuration Setup
        private string _secureConfig = null;
        private string _unsecureConfig = null;

        public PluginClass1(string unsecureConfig, string secureConfig)
        {
            _secureConfig = secureConfig;
            _unsecureConfig = unsecureConfig;
        }
        #endregion

        public void Execute(IServiceProvider serviceProvider)
        {
            ITracingService tracer = (ITracingService)serviceProvider.GetService(typeof(ITracingService));
            IPluginExecutionContext context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = factory.CreateOrganizationService(context.UserId);

            try
            {
                Entity entity = (Entity)context.InputParameters["Target"];

                tracer.Trace("Posting the execution context.");
                var NotificationService = (IServiceEndpointNotificationService)serviceProvider.GetService(typeof(IServiceEndpointNotificationService));

                // The unsecure configuration holds the service endpoint GUID
                string response = NotificationService.Execute(
                    new EntityReference("serviceendpoint",
                        new Guid(_unsecureConfig)), context);

                if (!String.IsNullOrEmpty(response))
                {
                    tracer.Trace($"Response = {response}");
                }
                tracer.Trace("Done.");
            }
            catch (Exception e)
            {
                throw new InvalidPluginExecutionException(e.Message);
            }
        }
    }
}

All that’s left is to register your new plugin assembly and register a step against an appropriate message – I’ve used Create of Contact. You can see where the endpoint GUID goes, and that it’s a synchronous step – the whole point here is to get something back from our listener so that, if necessary, we can throw an exception and roll back in CRM.

(screenshot: stepConfig)

The test

On creating a new contact, our listener code will now return a simple value, ‘from azure’ – you can check this by looking in Settings… Plugin Trace Log in CRM.

(screenshot: tracelog)


Enjoy!!