
Updating Dataverse Records and Business Process Flows (BPF) with Copilot Studio

In a recent project, I was challenged to build a copilot in Copilot Studio that would update a Dataverse table AND update a Business Process Flow (BPF). On its face, it's not too crazy; these are classic components of the Power Platform. However, after a few days, it turned into quite the endeavor. One thing that kept sticking for me was that I wanted any flows/topics/actions connected with the copilot to be dynamic in nature so they could be used over and over. This will be a LONG post, as it requires quite a few setup and configuration items, but it is well worth running through.

 

The example is that of somebody who manages a 3D printing business (I love 3D printing) via Dataverse and model-driven apps. The copilot will allow the user to get the status of a print job from the database and update its associated BPF.

 

Initial Configuration

Dataverse Table

---
title: Project ERD
---
erDiagram
    Prints }|--|| Contacts : contains
    Prints }|--|{ PrintJobsBPF : contains
    PrintJobsBPF {
        int EntraID PK
    }
    Contacts {
        int Person PK
    }
    Prints {
        string PrintJob PK
        lookup Requestor FK
        lookup BPF FK
        string PrintItem
        int Quanity
        boolean PrintCompleted
        date DateRequested
        date DateStarted
        date DateCompleted
    }

 

BPF

flowchart LR
    A(Request) --> B(Printing)
    B --> C(Completed)

 

 

Keys

Create a key on the Print Job column to ensure job names are unique

Views

Form

App

Classic Workflows

I use classic workflows to automatically move a BPF between stages. It can be done with Power Automate, but I find this much easier.

 

Move BPF from Request to Printing

  1. Create a new Classic workflow that is attached to the BPF table
  2. Set the conditions based on the related table (Prints): Date Requested and Created On contain data. Also, set the Active Stage to equal Request
  3. Add an Update step for the BPF table, then select “Set Properties”
  4. Set the Active Stage to Printing
  5. Then convert the workflow to a real-time workflow
  6. Save and Activate the Workflow

With the workflow in place, the BPF will move forward shortly after the record is created.

Move BPF from Printing to Completed

  1. As with the workflow before, create a new background workflow
  2. Set the condition on the BPF record itself, with the Active Stage equal to Printing
  3. Add an Update step that moves the BPF to the Completed stage and sets its Status to Inactive
  4. Then, add another Update step that sets the Prints table's Status column to Inactive
  5. Update the 'Starts when' setting to trigger when the process record changes
  6. Using the Prints table, set the workflow to update only if Date Completed contains data
  7. Save and activate the workflow

Copilot Studio

Setup

When building out this copilot, I tried using straight generative AI with knowledge, topics connected to flows, and actions connected to flows. I wanted the ability for a user to query a job, get the BPF stage and its steps, then update it. I landed on using topics with Power Automate flows and generative AI enabled.

Knowledge

I built out two knowledge sources, one for just the prints table and one for the BPF table and its associated tables.

Topic 1 - Get BPF info

The first topic reaches out to the Prints table, gets the associated BPF record, and gives the information back to the user.

  1. The trigger will be from copilot
  2. The user is then asked for the name of the print job. The output is saved in variable “varResponsePrintjobname”
  3. A message is sent to the user letting them know the action is running
  4. A Power Automate flow is then executed. I will show how to build this next.
  5. Three global variables are set from the outputs of the flow. I chose to do this instead of saving the topic variables as global because I find responses in the chat are “better” if they are topic-only.

 

Power Automate flow - Get BPF Information for Copilot

  1. Create a Power Automate workflow called Get BPF Information for Copilot with a Copilot trigger.
  2. Add a text input called Print Job Name
  3. Add four variables, two string variables, and two array variables
    1. varEntityLogicalName - String
    2. varCurrentStageName - String
    3. varStagesArray - Array
    4. varStepArray - Array
  4. Using a List rows action, filter the table using FetchXML (I find it easier). The condition value comes from the Print Job Name input on the trigger; a filled-in example appears at the end of this section.
    <fetch>
      <entity name="andy_prints">
        <filter>
          <condition attribute="andy_printjob" operator="eq" value="" />
        </filter>
      </entity>
    </fetch>

     

  5. Using another List rows action, use FetchXML to find the BPF row associated with the Print row returned above
    outputs('List_rows_-_Prints')?['body/value']?[0]?['andy_printsid']

     

    💡 The next part should be built out as a child flow for production purposes so it can be reused, but for this demo I put it all into one flow

  6. Create vars
  7. Create another Scope called “Scope - Process Stage”
  8. Within the Scope there will be two actions
  9. The first action, “Get a row by ID - Process Stages”, will use the outputs of List rows - BPF to get the active process stage of the BPF
    outputs('List_rows_-_BPF')?['body']?['value'][0]?['_activestageid_value']

     

  10. The second will compose the active stage display name

    💡 This is where the complications of BPF stages and steps come into play. To get the steps of a BPF, we need to parse the workflows table and get the clientdata of the row. This is JSON stored as a string, comprised of StageSteps (the BPF stages) and StepSteps (the BPF data steps). We need this information so that we can send it back to Copilot and inform the user what stage it's at and the steps that need to be taken to update it.

  11. Next, add a parallel Dataverse action to get the workflow information
  12. Add a Dataverse Get a row by ID action and use the process ID gathered above
    outputs('List_rows_-_BPF')?['body']?['value'][0]?['_processid_value']

     

  13. Next will be a Parse JSON of the output's clientdata
    {
        "type": "object",
        "properties": {
            "__class": {
                "type": "string"
            },
            "id": {
                "type": "string"
            },
            "description": {
                "type": "string"
            },
            "name": {
                "type": "string"
            },
            "stepLabels": {
                "type": "object",
                "properties": {
                    "list": {
                        "type": "array"
                    }
                }
            },
            "steps": {
                "type": "object",
                "properties": {
                    "list": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "__class": {
                                    "type": "string"
                                },
                                "id": {
                                    "type": "string"
                                },
                                "description": {
                                    "type": "string"
                                },
                                "name": {
                                    "type": "string"
                                },
                                "stepLabels": {
                                    "type": "object",
                                    "properties": {
                                        "list": {
                                            "type": "array"
                                        }
                                    }
                                },
                                "steps": {
                                    "type": "object",
                                    "properties": {
                                        "list": {
                                            "type": "array"
                                        }
                                    }
                                },
                                "relationshipName": {},
                                "attributeName": {},
                                "isClosedLoop": {
                                    "type": "boolean"
                                }
                            },
                            "required": [
                                "__class",
                                "id",
                                "description",
                                "name",
                                "stepLabels",
                                "steps"
                            ]
                        }
                    }
                }
            },
            "primaryEntityName": {
                "type": "string"
            },
            "nextStepIndex": {
                "type": "string"
            },
            "isCrmUIWorkflow": {
                "type": "boolean"
            },
            "category": {
                "type": "string"
            },
            "businessProcessType": {
                "type": "string"
            },
            "mode": {
                "type": "string"
            },
            "title": {
                "type": "string"
            },
            "workflowEntityId": {
                "type": "string"
            },
            "formId": {},
            "argumentsArray": {
                "type": "array"
            },
            "variables": {
                "type": "array"
            },
            "inputs": {
                "type": "array"
            }
        }
    }

     

  14. A Filter array action is used to filter the clientdata so only the steps remain and nothing else. To do this, we need to convert the list data into an array and then filter it
    array(body('Parse_JSON_-_Workflow_Client_data')?['steps']?['list'])

     

    equals(item()?['__class'], 'EntityStep:#Microsoft.Crm.Workflow.ObjectModel')

     

  15. Then, we can set the Entity logical name
    outputs('Filter_array_-_For_Steps')['body']?[0]?['description']

     

  16. To get the steps within the array, we now add an Apply to each and use the body of Filter array - For Steps
  17. Configure the Apply to each to run after setting the variable and composing the stage name. The Apply to each will loop over each item from Filter array - For Steps
  18. Add an Append to array variable action that adds the stage and its steps to the varStagesArray variable
    item()?['steps']?['list']?[0]?['description']

     

    item()?['steps']?['list']?[0]?['steps']?['list']

     

  19. With an array of the stages and steps, we now clean the steps so we can send a “clean” list to Copilot Studio showing the user the exact stage and steps. To do this, we will do another For each loop within the true leg of the condition.
  20. Before we can do the loop, add a Parse JSON to make the array easier to access, then add the loop and another Append to array variable action.
  21. Set the variable varCurrentStageName with the current stage name
  22. The last step of the flow is to respond to the Copilot with the current Stage/Step/Current table/RecordGUID

💡 It is important to give the responses to Copilot good names along with good descriptions

{
  "type": "Response",
  "kind": "Skills",
  "inputs": {
    "schema": {
      "type": "object",
      "properties": {
        "current_status": {
          "title": "Current Status",
          "description": "This is the current stage being used in the business process flow",
          "type": "string",
          "x-ms-content-hint": "TEXT",
          "x-ms-dynamically-added": true
        },
        "list_of steps to complete": {
          "title": "List of Steps to Complete",
          "description": "These are the current steps within the current stage of the business process flow",
          "type": "string",
          "x-ms-content-hint": "TEXT",
          "x-ms-dynamically-added": true
        },
        "prints_table logical name": {
          "title": "Prints Table Logical Name",
          "description": "",
          "type": "string",
          "x-ms-content-hint": "TEXT",
          "x-ms-dynamically-added": true
        },
        "print_record guid": {
          "title": "Print Record GUID",
          "description": "",
          "type": "string",
          "x-ms-content-hint": "TEXT",
          "x-ms-dynamically-added": true
        }
      },
      "additionalProperties": {}
    },
    "statusCode": 200,
    "body": {
      "current_status": "@{variables('varCurrentStageName')}",
      "list_of steps to complete": "@{variables('varStepArray')}",
      "prints_table logical name": "@{variables('varEntityLogicalName')}",
      "print_record guid": "@{outputs('List_rows_-_Prints')?['body/value']?[0]?['andy_printsid']}"
    }
  },
  "runAfter": {
    "Scope_-_Process_Stage_and_Workflows": [
      "Succeeded"
    ]
  }
}
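For reference, here is roughly what the two FetchXML queries from steps 4 and 5 look like once the dynamic values are spliced in. This is a sketch rather than a copy of the demo flow: the Print Job Name trigger input lands in triggerBody()?['text'], and the entity and lookup attribute names in the BPF query (andy_printjobsbpf, bpf_andy_printsid) are assumptions based on default Dataverse naming, so verify them against your own BPF table.

    <!-- List rows - Prints: filter on the job name passed in from Copilot -->
    <fetch>
      <entity name="andy_prints">
        <filter>
          <condition attribute="andy_printjob" operator="eq" value="@{triggerBody()?['text']}" />
        </filter>
      </entity>
    </fetch>

    <!-- List rows - BPF: filter on the BPF's lookup back to the print record -->
    <fetch>
      <entity name="andy_printjobsbpf">
        <filter>
          <condition attribute="bpf_andy_printsid" operator="eq" value="@{outputs('List_rows_-_Prints')?['body/value']?[0]?['andy_printsid']}" />
        </filter>
      </entity>
    </fetch>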

 

Testing

To test the copilot, I added 30 records:

Print Job | Requestor | Print Item | Quantity
Print Job 1 | Jim Henson | Kermit the Frog | 3
Print Job 10 | Jim Henson | Dr. Bunsen Honeydew | 25
Print Job 11 | Jim Henson | Beaker | 60
Print Job 12 | Jim Henson | Scooter | 13
Print Job 13 | Jim Henson | Rizzo the Rat | 70
Print Job 14 | Jim Henson | Sam Eagle | 2
Print Job 15 | Jim Henson | Sweetums | 45
Print Job 16 | Jim Henson | Dr. Teeth | 27
Print Job 17 | Jim Henson | Floyd Pepper | 39
Print Job 18 | Jim Henson | Janice | 9
Print Job 19 | Jim Henson | Zoot | 50
Print Job 2 | Jim Henson | Miss Piggy | 15
Print Job 20 | Jim Henson | Camilla the Chicken | 6
Print Job 21 | Jim Henson | Robin the Frog | 73
Print Job 22 | Jim Henson | Crazy Harry | 30
Print Job 23 | Jim Henson | Lew Zealand | 1
Print Job 24 | Jim Henson | Mahna Mahna | 12
Print Job 25 | Jim Henson | Pepe the King Prawn | 42
Print Job 26 | Jim Henson | Beauregard | 18
Print Job 27 | Jim Henson | Clifford | 58
Print Job 28 | Jim Henson | Link Hogthrob | 14
Print Job 29 | Jim Henson | Marvin Suggs | 67
Print Job 3 | Jim Henson | Fozzie Bear | 22
Print Job 30 | Jim Henson | Uncle Deadly | 31
Print Job 4 | Jim Henson | Gonzo the Great | 5
Print Job 5 | Jim Henson | Animal | 48
Print Job 6 | Jim Henson | Rowlf the Dog | 7
Print Job 7 | Jim Henson | Statler | 34
Print Job 8 | Jim Henson | Waldorf | 69
Print Job 9 | Jim Henson | The Swedish Chef | 11

With just this data and the current topic in place, here is my test: ask how many jobs there are, get some basic info on a job, then get the BPF status.

 

We now have a good extraction of the stage and steps. The next step will be to allow the user to update the BPF via copilot.

Topic 2 - Get Column Info

Topic flow - Part 1

With the columns returned, we now need to get their types when the user asks to update the record. This could have been done during the last flow, but for the sake of speed, I decided to run it after the user asks for it.

  1. Create a trigger and a message for the user. The trigger will be called when the user asks to update the record. The message will let the user know it's working.
  2. Run a Power Automate flow that will give us the column types output (the guide for the flow is next)

 

Power Automate flow - Get Column info from Copilot

This next flow will get the column information for the columns gathered by the previous flow. This is important because we don't just want to get information into the bot; we want to write updates back to the BPF.

  1. The trigger will be a run-from-Copilot trigger with two text inputs, Table and Steps
  2. The next step is to do a Parse JSON of the steps and then initialize a new array variable called varTypes
    {
        "type": "array",
        "items": {
            "type": "object",
            "properties": {
                "StepName": {
                    "type": "string"
                },
                "StepLogicalName": {
                    "type": "string"
                }
            },
            "required": [
                "StepName",
                "StepLogicalName"
            ]
        }
    }

     

  3. To get the column type, we need to run an HTTP request to Dataverse that sits inside a for-each loop over the steps. I like to use the HTTP with Microsoft Entra ID (preauthorized) connector. An illustrative response and the resulting array entry appear after this list.
    body('Parse_JSON_-_Steps')?[0]?['stepList']

     

    /api/data/v9.2/EntityDefinitions(LogicalName='@{triggerBody()?['text']}')/Attributes(LogicalName='@{items('Apply_to_each_-_each_step')['StepLogicalName']}')

     

  4. Add an Append to array variable action that adds the following
    {
      "columnname": @{items('Apply_to_each_-_each_step')?['StepLogicalName']},
      "value": @{body('Invoke_an_HTTP_request')?['AttributeTypeName']?['Value']},
      "output": "",
      "DisplayName": @{items('Apply_to_each_-_each_step')?['StepName']}
    }

     

  5. Within the Respond to Copilot action, add varTypes to the columntypes output
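To make the mapping concrete, here is a heavily trimmed, illustrative response from the EntityDefinitions/Attributes request for a hypothetical date column (real responses contain many more properties). The column and display names below are examples, not from the demo solution.

    {
      "LogicalName": "andy_datecompleted",
      "AttributeTypeName": {
        "Value": "DateTimeType"
      }
    }

The corresponding entry the Append to array variable action builds in varTypes would then look like:

    {
      "columnname": "andy_datecompleted",
      "value": "DateTimeType",
      "output": "",
      "DisplayName": "Date Completed"
    }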

Topic flow - Part 2

  1. Parse the variable to save it as a table
    kind: Table
    properties:
      columnname: String
      DisplayName: String
      output: String
      value: String

     

  2. Then post a question to the user asking which column they want to update; a small Power Fx sketch for listing the options follows below
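One way to present those options is to build the question (or a preceding message) from the parsed table with Power Fx. A minimal sketch, assuming the parsed output from step 1 was saved to a topic variable named Topic.varColumnTypes (the variable name is illustrative):

    Concat(Topic.varColumnTypes, ThisRecord.DisplayName & " (" & ThisRecord.value & ")", ", ")

This simply concatenates each column's display name and data type into one comma-separated string that can be dropped into the question text.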

Testing

With that we can now get a list of options to update in the chat

Topic flow - Part 3 Update record

With all this information, we can now run a flow to update the record. However, since we do not know the table, the column name, or the column type until the user requests it in the chat, we need to make a dynamic update process.

There are a total of 28 Dataverse column types (see Column data types in Microsoft Dataverse - Power Apps | Microsoft Learn):

 

Name | Type | Data Type
Big | Time Stamp | BigIntType
Choice | Option Set | PicklistType
Choices | MultiSelect Field | MultiSelectPicklistType
Currency | Currency | MoneyType
Customer | Customer | CustomerType
Date and Time | Date and Time - Date and Time Format | DateTimeType
Date Only | Date and Time - Date Only Format | DateTimeType
Decimal Number | Decimal Number | DecimalType
Duration | Whole Number - Duration Format | IntegerType
Email | Single Line of Text - Email Format | StringType
File | File | FileType
Floating Point Number | Floating Point Number | DoubleType
Image | Image | ImageType
Language | Whole Number - Language Format | IntegerType
Lookup | Lookup | LookupType
Multiline Text | Multiple Lines of Text | MemoType
Owner | Owner | OwnerType
Phone | Single Line of Text - Phone Format | StringType
Status | Status | StateType
Status Reason | Status Reason | StatusType
Text | Single Line of Text - Text Format | StringType
Text Area | Single Line of Text - Text Area Format | StringType
Ticker Symbol | Single Line of Text - Ticker Symbol Format | StringType
Timezone | Whole Number - Time Zone Format | IntegerType
Unique Identifier | Unique Identifier or Primary Key | UniqueidentifierType
URL | Single Line of Text - URL Format | StringType
Whole Number | Whole Number - None Format | IntegerType
Yes/No | Two Options | BooleanType

With this information, we can build a set of flows that will update the appropriate data type. For this example, I will build out three types: Date/String/Boolean.
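Each condition can key off the value column that the column-info flow returned. A minimal Power Fx sketch of the three checks, assuming the row the user picked is stored in Topic.varRequestedUpdate (the same variable the adaptive cards below read from):

    Topic.varRequestedUpdate.value = "DateTimeType"   // Date branch
    Topic.varRequestedUpdate.value = "StringType"     // String branch
    Topic.varRequestedUpdate.value = "BooleanType"    // Boolean branch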

  1. Add three conditions to handle the data types
  2. Under the Boolean condition, add an Ask with adaptive card action
    {
        "type": "AdaptiveCard",
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.5",
        "body": [
            {
                "type": "Container",
                "items": [
                    {
                        "type": "Input.Toggle",
                        "title": "TITLE",
                        "id": "varRequestupdateboolean"
                    }
                ]
            },
            {
                "type": "ActionSet",
                "actions": [
                    {
                        "type": "Action.Submit",
                        "title": "Submit",
                        "style": "positive",
                        "id": "SubmitOutput"
                    }
                ]
            }
        ]
    }

     

  3. Since this is Copilot Studio, we can update the card to be dynamic with Power Fx so that the title is the display name of the column
    {
      type: "AdaptiveCard",
      '$schema': "http://adaptivecards.io/schemas/adaptive-card.json",
      version: "1.5",
      body: [
        {
          type: "Container",
          items: [
            {
              type: "Input.Toggle",
              title: Topic.varRequestedUpdate.DisplayName,
              id: "varRequestupdateboolean"
            }
          ]
        },
        {
          type: "ActionSet",
          actions: [
            {
              type: "Action.Submit",
              title: "Submit",
              style: "positive",
              id: "SubmitOutput"
            }
          ]
        }
      ]
    }

     

  4. For the Date add this adaptive card
    {
        "type": "AdaptiveCard",
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.5",
        "body": [
            {
                "type": "Container",
                "items": [
                    {
                        "type": "Input.Date",
                        "id": "varRequestupdatedate",
                        "label": "Title"
                    }
                ]
            },
            {
                "type": "ActionSet",
                "actions": [
                    {
                        "type": "Action.Submit",
                        "title": "Submit",
                        "style": "positive",
                        "id": "SubmitOutput"
                    }
                ]
            }
        ]
    }

     

  5. Then update the card to use the following Power FX
    {
      type: "AdaptiveCard",
      '$schema': "http://adaptivecards.io/schemas/adaptive-card.json",
      version: "1.5",
      body: [
        {
          type: "Container",
          items: [
            {
              type: "Input.Date",
              id: "varRequestupdatedate",
              label: Topic.varRequestedUpdate.DisplayName
            }
          ]
        },
        {
          type: "ActionSet",
          actions: [
            {
              type: "Action.Submit",
              title: "Submit",
              style: "positive",
              id: "SubmitOutput"
            }
          ]
        }
      ]
    }

     

  6. For the string type use this card
    {
        "type": "AdaptiveCard",
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.5",
        "body": [
            {
                "type": "Container",
                "items": [
                    {
                        "type": "Input.Text",
                        "placeholder": "Placeholder text",
                        "id": "varRequestupdatestring",
                        "label": "TITLE"
                    }
                ]
            },
            {
                "type": "ActionSet",
                "actions": [
                    {
                        "type": "Action.Submit",
                        "title": "Submit",
                        "style": "positive",
                        "id": "SubmitOutput"
                    }
                ]
            }
        ]
    }

     

  7. Then update it with the following PowerFX
    {
      type: "AdaptiveCard",
      '$schema': "http://adaptivecards.io/schemas/adaptive-card.json",
      version: "1.5",
      body: [
        {
          type: "Container",
          items: [
            {
              type: "Input.Text",
              placeholder: "",
              id: "varRequestupdatestring",
              label: Topic.varRequestedUpdate.DisplayName
            }
          ]
        },
        {
          type: "ActionSet",
          actions: [
            {
              type: "Action.Submit",
              title: "Submit",
              style: "positive",
              id: "SubmitOutput"
            }
          ]
        }
      ]
    }

     

  8. With the cards created, we can now run the flows. There will be a flow for each condition. For the example given here, I will show how to build the date update flow; I am going to do another post on how to handle each column type later.

Power Automate - Update DateTime Flow

  1. Create a flow with four text inputs and one date input
  2. Two HTTP requests will be added. The first will be used to get the entity set name and the second will be used to perform the update
    /api/data/v9.0/EntityDefinitions?$filter=LogicalName eq '@{triggerBody()?['text_2']}'&$select=EntitySetName

     

    /api/data/v9.2/@{body('Invoke_an_HTTP_request_-_Get_entityset_name')?['value']?[0]?['EntitySetName']}(@{triggerBody()?['text_3']})

     

    {
      @{triggerBody()?['text']}:"@{triggerBody()?['date']}"
    }

     

  3. This will be closed out by an empty Respond to Copilot action

Within the studio, add the call flow action. For the date, be sure to wrap it in the DateValue Power Fx formula

Text(DateValue(Topic.varRequestupdatedate), "yyyy-mm-dd")
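Put together, the PATCH that the second HTTP action sends ends up looking roughly like the sketch below. The entity set name (andy_printses), the record GUID, and the column (andy_datecompleted) are placeholders for this demo schema; your values will come from the trigger inputs and the first HTTP request.

    PATCH /api/data/v9.2/andy_printses(00000000-0000-0000-0000-000000000000)

    {
      "andy_datecompleted": "2025-01-15"
    }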

 

 

Power Automate - Update Boolean

The flow will be very similar to the one above with the biggest difference being that the trigger will use a Boolean vs a date

  1. Add the trigger with the following inputs
  2. As before, add two HTTP actions, one to get the entity set name and one to perform the patch
    /api/data/v9.2/@{body('Invoke_an_HTTP_request_-_Get_entityset_name')?['value']?[0]?['EntitySetName']}(@{triggerBody()?['text_3']})

     

    {
      @{triggerBody()?['text']}: "@{triggerBody()?['boolean']}"
    }

     

  3. This will be closed out by an empty Respond to Copilot action

Within the studio, add the call flow action. For the Boolean, convert the toggle output to a true/false value with this Power Fx formula

If(Topic.varRequestupdateboolean= "True", true, false)
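For reference, the body this flow patches would look like the sketch below, where andy_printcompleted is a hypothetical Boolean column. Dataverse expects a JSON boolean for this kind of column, so if the update fails with a type conversion error, try removing the quotes around the trigger value in the HTTP action's body.

    {
      "andy_printcompleted": true
    }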

 

 

 

Power Automate - Update String

  1. Add the trigger with the following inputs
  2. As before, add two HTTP actions, one to get the entity set name and one to perform the patch
    /api/data/v9.2/@{body('Invoke_an_HTTP_request_-_Get_entityset_name')?['value']?[0]?['EntitySetName']}(@{triggerBody()?['text_3']})

     

    {
      @{triggerBody()?['text']}: "@{triggerBody()?['text_4']}"
    }

     

  3. This will be closed out by an empty Respond to Copilot action

Topic flow - Part 4 Close it out and loopback

The last step will be to ask the user if there are any other fields to update

  1. After all the conditions, add a question that asks the user if there are any other fields to update
  2. If the output is true loop back to the Question “What would you like to update”, else end the conversation.

 

 

 

 

 

 

I think there are quite a few things that can be done to improve this. I really want to move toward generative flows, along with less reliance on Power Automate.
