RPA PowerBI Monitoring Dashboard

In our previous blog post we showed how to connect PowerBI to UiPath Orchestrator. Today we will demonstrate how to create useful monitoring reports.

Why is a monitoring dashboard important?

After publishing robots to production, it is very important to monitor the real-time results of automated business processes in one centralized place. Using Power BI, you can present everything that matters for the goals defined at the start of the project, along with any other key statistics about bot execution results. Power BI offers great flexibility in terms of which data we show and in which format. This solution is applicable to any RPA environment that logs its transaction results (UiPath, Power Automate, Blue Prism…).

We chose PowerBI to present all the data because this way we have everything in one place: we can create the metrics we want, define specific KPIs to follow, and build powerful visualizations.

What to present in Power BI reports: 

  1. Different KPIs, such as time saved, cost efficiency, quality of execution, and other measures of the goals defined by top management and end business users
  2. Performance success rate for revisions and bot optimizations in the maintenance phase
  3. Number of bots running in the organization and the machines on which they are executed
  4. An overview of transactions from many angles

All the items mentioned above, and many others, are important for determining and measuring the success of an RPA implementation in the company.

PowerBI example report 

Data source 

Data such as the process name, organization unit, robot type, and queue name presented in the report are stored in a SharePoint list, which lets the report refresh in near real time. You can store this data in other sources of your choice; if you store it in Excel, you will need to set up a gateway for real-time data refreshing.

Data about transactions is retrieved from UiPath Orchestrator through the connection described step by step in our previous blog post. Data from SharePoint and Orchestrator are joined on the queue name. When you connect PowerBI directly to Orchestrator, there is a predefined time limitation, so you can retrieve data for approximately the past 12 months. If you need all transactions without that limitation, our advice is to fetch them with an API call outside PowerBI and store them in a database.
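The "fetch outside PowerBI" approach can be sketched as paging through the `/odata/QueueItems` endpoint with standard OData parameters. This is a minimal, hypothetical helper, not the article's implementation: the base URL is a placeholder, and the page size and `CreationTime` filter are assumptions you would tune for your tenant.

```python
def queue_items_url(base_url: str, top: int, skip: int, from_iso: str) -> str:
    """Build an OData URL requesting one page of queue items created after from_iso.

    Hypothetical sketch: a caller would loop, incrementing skip by top until a
    page comes back short, and append each page to a database table so that
    PowerBI's ~12-month window no longer limits the history.
    """
    return (
        f"{base_url}/odata/QueueItems"
        f"?$top={top}&$skip={skip}"
        f"&$filter=CreationTime ge {from_iso}"
        f"&$orderby=CreationTime asc"
    )
```

Any HTTP client (with the bearer token and organization unit headers shown later in this series) can then fetch each page and persist it.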

Dashboard 

The dashboard presents all the basic information about the RPA processes in the company. From the dashboard you can drill through to the other pages.

The table visual lists every process together with the organizational unit it belongs to, the date when the process was published to the production environment, and, in the last column, the savings in minutes.

Beside the main table there are card visuals showing the number of processes and the number of attended and unattended processes for the selected organizational unit. There is also a decomposition tree that visualizes the data across multiple dimensions.

Through the dashboard report you can easily present the basics of robotic process automation across the company. From the main page, we can drill through to the Process details page.

Process details page with drill-through for Process D

Process schedule page

The Process schedule page shows a calendar view of when each process should execute.

Transactions overview

The Transactions overview presents successfully finished transactions per month, per process, and per organizational unit. A table compares the successfully finished transactions of the current month with previous months, and a KPI visual shows this month's successfully finished transactions, with the previous month's count as the target.
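The month-over-month KPI logic can be sketched outside Power BI as well. This is a hypothetical illustration of the calculation (in the report itself it would be a DAX measure); the field names `month` and `status` and the status value `Successful` are assumptions about your transaction data.

```python
from collections import Counter

def monthly_success_counts(transactions):
    """Count transactions with status 'Successful' per 'YYYY-MM' month key."""
    return Counter(t["month"] for t in transactions if t["status"] == "Successful")

def kpi_vs_previous_month(counts, current_month, previous_month):
    """Current month's successes, with the previous month's count as the target."""
    value = counts.get(current_month, 0)
    target = counts.get(previous_month, 0)
    return {"value": value, "target": target, "delta": value - target}
```

The KPI visual then renders `value` against `target`, flagging whether this month's robot throughput kept pace with the last.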

Conclusion

Monitoring RPA bots in Power BI gives a powerful overview of the automated processes in the company. This way you can follow different KPIs, ROI, FTE savings, accuracy, quality, and other RPA metrics. The reports present significant information about bot executions, based on which you can optimize and improve the quality of execution.

Working with the same browser instance across multiple desktop flows in Power Automate

Introduction

In this article we will show how to work with the same browser instance across workflows in Power Automate Desktop, and how to focus a specific tab when multiple tabs are open.

Situation

Input and output workflow variables are essential building blocks for a decoupled RPA process architecture. Currently, Microsoft Power Automate Desktop does not support passing a browser instance as an input or output variable.

The problem occurs when trying to use the same browser instance across multiple desktop flows: when another desktop flow starts, it needs a browser instance of its own to work with UI elements.

If you don't create a new one, you can't continue working with the page.

And if you do create a new browser instance attached to the one already running, a problem appears with multiple tabs: attaching to a running browser instance with several tabs open may not switch focus to the tab you need.

Why is this a problem?

Let’s assume you have automated 10, 50, or even 100+ processes in your company, and that your ERP client is a web application, as many ERPs are nowadays (e.g., T24, JD Edwards, Dynamics 365…).

Many of these processes will require logging in to this ERP to read data or process transactional information.

Without the possibility of passing browser instances between workflows, you are forced to create a login section in each individual process workflow. Now imagine that, with your 100+ automated processes, the ERP login procedure changes: you will need to make changes in each of the 100+ workflows that use the ERP.

What to do

A simple solution to this problem has three steps:

First step – deconstruct the browser instance in the parent workflow into a custom object. Be sure to include the current URL, page title, and any other attributes that can help you reference this browser later.

Second step – pass this custom object to the child workflow.

Third step – in the child flow, attach to the running browser instance using the title, URL, or other parameters. If that does not work, use the Handle property from the deconstructed browser instance.
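The deconstruct/reconstruct idea can be sketched as a plain JSON round-trip. This is only an illustration of the data shape, not Power Automate Desktop code; the attribute names `Url`, `Title`, `Handle`, and `TabIndex` are hypothetical choices, since the platform does not prescribe them.

```python
import json

def deconstruct_browser(url, title, handle, tab_index):
    """Parent-flow side: serialize the browser's identifying attributes to JSON.
    The attribute names here are illustrative, not a fixed schema."""
    return json.dumps({"Url": url, "Title": title, "Handle": handle, "TabIndex": tab_index})

def reconstruct_attributes(payload):
    """Child-flow side: parse the JSON back into a plain object (the 'custom object')
    and use its fields to re-attach to the running browser."""
    return json.loads(payload)
```

In Power Automate Desktop the equivalent is a "Set variable" action holding the JSON text, followed by "Convert JSON to custom object" in the receiving flow.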

We will create a desktop flow “Passing BI” in which we initialize a browser instance and all required attributes, and after that a second desktop flow “Receiving BI” that receives this information.

First step: Parent flow

Desktop flow – “Passing BI” setup:

First, we launch a new browser, which produces the new browser instance we will work with.

The required variables are the URLs of the pages you want to work with. So we use:

  • “Open new tab” for example purposes, using the URL variable. The variable it produces is set to the same “Browser” instance, which is thereby overwritten.

Right after the tab is opened, we use:

  • “Get details of web page” action to get the “Web page title”,
  • “Get details of web page” action to get the “Web browser’s current URL address”,

just to be sure the initial URL has not changed after opening the page (in case it has, we overwrite our initial URL variable with the correct value).

When we add those activities, the workflow looks like this.

A web browser instance is not accepted as an input argument, so we cannot simply pass it as it is.
On that account, we will create logic for passing the necessary attributes by

  • initializing a new variable as JSON

Next, we need to

  • convert it to a custom object

and after that the variable is ready to be sent.

Second step: Passing custom object to child workflow

At the end “Passing BI” desktop flow should look like this:


Before calling the “Receiving BI” desktop flow and passing our custom object as an input value, we need to set it up:

Third step: Child flow

Desktop flow – “Receiving BI” setup:

The idea is that “Receiving BI”, called from “Passing BI”, does not open or launch a new browser, but only focuses the already opened browser tabs.

Although we can pass an object with the browser attributes, passing it does not let us assign it the type “Web browser instance”; it stays a custom object, and is therefore not usable for working with UI elements, which require the type “Web browser instance”.

Let’s set the input variable “inCustomObject” to the type “Custom object”,

so that when “Receiving BI” is called, we have a defined input variable and can pass the JSON as the custom object value.

Next,

  • “Launch new browser” with “Launch mode” set to “Attach to running instance”, attaching by title using the passed tab title value (both title and URL are passed, so it’s your choice).


Unfortunately, this option alone will not change tab focus, although it will attach our new browser instance to the existing one.

For this reason, we are using

  • “Send keys” with the keyboard shortcut for switching tabs by their order:

“CTRL + 2” for the second tab, in line with the passed tab index value, sent to the window instance/handle of the “Browser” instance, as in the image below.
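Deriving the shortcut from the passed tab index can be sketched as below. This is a hypothetical helper, not the “Send keys” action itself (which has its own key-sequence syntax); note that Chromium-based browsers only map Ctrl+1 through Ctrl+8 to fixed tab positions, with Ctrl+9 jumping to the last tab.

```python
def tab_switch_hotkey(tab_index: int) -> str:
    """Return a Ctrl+N label that focuses the tab at the given 1-based index.

    Only tabs 1-8 have fixed Ctrl+N shortcuts in Chromium browsers, so
    higher indexes are rejected rather than silently mismapped."""
    if not 1 <= tab_index <= 8:
        raise ValueError("direct tab shortcuts exist only for tabs 1-8")
    return f"Ctrl+{tab_index}"
```

If your flows may open more than eight tabs, consider passing the tab title or URL instead of the index and re-attaching by those.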

The last step is up to you: choose the actions for working with UI elements (e.g., get details of the UI element). Now we can work without any further delays!

The whole “Receiving BI” flow is shown below:

For situations where attaching to a running instance by URL or tab name is not possible, you can focus the desired tab by sending hotkeys via the process handle from the browser object, and afterwards attach to the currently active browser instance, pre-focused through that process handle.

That way, we have the possibility to share browser instances between different workflows.

How does this solution work?

This solution lets us avoid repeating actions (e.g., opening and closing tabs) and reuse the same browser instance within different desktop flows. It makes the process more compact and easier to work with. You still manage the browser instance from the flow that calls it, but all the code for the login routine lives in its own flow, so in the future you don't need to waste time editing your flows one by one when, for example, a CSS change happens on the website. You just need to edit the login flow routine once.

Connect PowerBI to Orchestrator for Robot Reporting (1/2)

This article covers how to connect PowerBI to an Orchestrator instance, get data using the Orchestrator API, and build useful monitoring reports. The first part of the post deals with connecting your report to the Orchestrator API.

Why connect PowerBI to Orchestrator

Many of us developing robotic process automation solutions used to design so-called Robot Reports or Robot Activity Logs. These were usually Excel reports where all transactions were tracked from the business perspective (status, timestamp, relevant attributes, etc.) and business users could monitor the robot's success rate. Back in the days when we used Studio plus an attended robot, this was a fair solution.

Today, when most companies have sufficiently developed infrastructure and have their Orchestrators in place, Orchestrator data and PowerBI together can yield powerful and reusable reporting and monitoring mechanisms.

Orchestrator itself comes with a variety of prebuilt reports; however, often this is not enough. It is true that with the combination of Orchestrator and Kibana you may achieve similar results. On the other hand, if your organization has adopted PowerBI as its preferred solution, or your employees are more skilled in PowerBI than Kibana, you should continue reading this blog post.

While it is possible to connect PowerBI directly to the Orchestrator SQL database, in production environments your database admins will usually give you a dirty look if you mention the idea of direct database access. For this reason, we will show you how to connect PowerBI to the Orchestrator API, which requires access only to the Orchestrator web interface.

Prerequisites

To make useful reports with the combination of Orchestrator and PowerBI, it is advisable that you adhere to the following principles:

  • Apply dispatcher-performer pattern and utilize Orchestrator Queues
  • Have access to your Orchestrator (proper credentials)
  • Have PowerBI Pro licenses for users who monitor robot reports

Although there are many examples of how you can use Orchestrator data without the three requirements above, to exploit the full potential you should have them in place. Let's look at what a basic Orchestrator queue looks like.

All of this data, as well as the fully customizable additional data you can add to queue items, allows for almost unlimited ways to design your reports. Most importantly, it helps standardize your robot transaction monitoring. A separate post will cover the importance of Orchestrator queues and the dispatcher-performer pattern; for now, let's do some work.

Connect your PowerBI to Orchestrator

Getting data from Orchestrator takes two Orchestrator API calls:

  1. Authentication call
  2. Data Request Call

To perform authentication, you need credentials. It is advisable to dedicate a read-only account to the API calls, since the password is hard-coded in the Power Query.

Open Power Query

To perform authentication you will need credentials, so first create a Blank Query and call it Credentials:

let
    Source = [
        tenancyName = "YOURTENNANTNAME",
        usernameOrEmailAddress = "READONLYUSERNAME",
        password = "READ-ONLYUSER PASSWORD"
    ]
in
    Source

Then you should make your API request. The API request differs depending on whether you have an on-premises Orchestrator, the cloud version, or the community version. Below is a version for an on-premises Orchestrator with a separate tenant.

To navigate the API easily, you can refer to the Swagger documentation site. It is usually found at YOURORCHESTRATORURL.com/swagger/index or YOURORCHESTRATORURL.com/swagger/ui/index.

let
    BaseUrl = "YOURORCHESTRATORURL",
    Path = "/odata/QueueItems",

    // Authenticate and extract the bearer token from the response
    Auth = Json.Document(Web.Contents(BaseUrl, [Headers=[#"Content-Type"="application/json"], Content=Json.FromValue(Credentials), RelativePath="/api/account/authenticate"])),
    Token = Auth[result],

    // Request queue items; header values must be text, so the organization unit id is quoted
    Source = Json.Document(Web.Contents(BaseUrl, [Headers=[Accept="application/json", #"Authorization"="Bearer " & Token, #"X-UIPATH-OrganizationUnitId"="YOURORGANIZATIONUNITID"], RelativePath=Path]))
in
    Source

But wait, where can I find my OrganizationUnitId? Let’s see…

Identify OrganizationUnitId using Orchestrator API UI

Of course, by calling another API. Alternatively, you can go directly to the Swagger page (as I did) and retrieve your organization unit manually by calling GET on Folders to receive the folder Id.

Once you hit get on the Folders you should receive something like this:

The Id of the folder is the OrganizationUnitId you need to use when calling the Orchestrator API to get Queue Items.
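Extracting that Id from the Folders response can be sketched as below. This is a hypothetical helper assuming the standard OData response shape (an object with a `value` array of folders, each with `Id` and `DisplayName`); check your own Swagger output to confirm the field names.

```python
import json

def folder_id(folders_response: str, display_name: str) -> int:
    """Pick the Id of the folder whose DisplayName matches.

    This Id is the value to send in the X-UIPATH-OrganizationUnitId header
    when requesting queue items."""
    body = json.loads(folders_response)
    for folder in body.get("value", []):
        if folder.get("DisplayName") == display_name:
            return folder["Id"]
    raise KeyError(display_name)
```

For a one-off setup, reading the Id straight off the Swagger page (as the article does) is just as good.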

So, in our case the Default folder was 31, so our final code (apart from your base URL) should look like this:

let
    BaseUrl = "YOURORCHESTRATORURL",
    Path = "/odata/QueueItems",

    // Authenticate and extract the bearer token from the response
    Auth = Json.Document(Web.Contents(BaseUrl, [Headers=[#"Content-Type"="application/json"], Content=Json.FromValue(Credentials), RelativePath="/api/account/authenticate"])),
    Token = Auth[result],

    // Request queue items for folder 31; header values must be text, so the id is quoted
    Source = Json.Document(Web.Contents(BaseUrl, [Headers=[Accept="application/json", #"Authorization"="Bearer " & Token, #"X-UIPATH-OrganizationUnitId"="31"], RelativePath=Path]))
in
    Source

If Power Query complains, go to “Edit Credentials” and choose Anonymous access. Only anonymous access is allowed when using token authentication.

And voilà! You are connected to the Orchestrator API. The next post will explain how to build robot reports to monitor transaction statuses and robot performance.
