Using an #Azure Logic App to create @AzureDevOps Work Items from a SQL Server dataset

In a previous post, I described how to use an Azure Logic App to update an Azure DevOps work item; in an effort to add more automation to our processes, I’m moving more and more notifications and work items directly into Azure DevOps using Logic Apps. For example, for one of our websites, we have an SSRS report that queries the database to determine whether employees are compliant with password policies. We get a weekly email that we then have to copy and paste into Azure DevOps (to track the work), and then we do the work. I want to eliminate the email step.

The workflow is very straightforward compared to updating an existing work item; there’s no need to call the REST API directly for this, because the connectors are all built in.

  1. Set up a schedule using the Recurrence item.
  2. Execute a stored procedure. This is the item that can be difficult to set up, especially if you have an on-premises SQL Server. In our case, I had to download and set up an on-premises data gateway, and then configure a connection to my SQL Server. Once that was done, I had to identify the database and stored procedure that contains the result set I wanted to add to the work item.
  3. The result set from the stored procedure comes back as JSON; a Parse JSON action breaks it into individual elements that later actions can reference (a sample schema follows this list).
  4. I then take those elements and focus on the items of interest to construct an HTML table.
  5. I then create an Azure DevOps work item by adding the HTML table to the description field.
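
For illustration, here is roughly what the Parse JSON schema might look like. This is a minimal sketch; the column names (EmployeeName, PasswordLastSet, IsCompliant) are hypothetical stand-ins for whatever the stored procedure actually returns:

{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "EmployeeName": { "type": "string" },
      "PasswordLastSet": { "type": "string" },
      "IsCompliant": { "type": "boolean" }
    }
  }
}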

BONUS: I added a timestamp to the work item title by using the formatDateTime() function.

formatDateTime(utcNow(),'D')
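
In the work item title field, this can be combined with the rest of the title using concat(); something like this (the title text itself is just a made-up example):

concat('Password compliance review - ', formatDateTime(utcNow(),'D'))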

Using an #Azure Logic App to move @AzureDevOps Work Items

A big part of my job these days is looking for opportunities to improve workflow. Automation of software is great, but identifying areas to speed up human processes can be incredibly beneficial to value delivery to customers. Here’s a scenario I recently figured out how to automate:

  1. My SRE team uses a different Azure DevOps project than our development team. This protects the “separation of duties” concept that auditors love, while still letting us transfer items back and forth.
  2. The two projects are in the same organization.
  3. The two projects use different templates, with different required fields.
  4. Our workflow process requires two phases of triage for bugs in the wild: a technical phase (provided by my team), and a business prioritization (provided by our Business Analyst).
  5. Moving a card between projects is simple, but there were several manual changes that had to be made:
    1. Assigning to a Business Analyst (BA)
    2. Changing the status to Proposed from Active
    3. Changing the Iteration and Area
    4. Moving the card.

To automate this, I decided to use Azure Logic Apps. There are probably other ways to approach this problem (like PowerShell), but one of the benefits of the Logic Apps model is that it uses the same security settings as our Azure DevOps installation, which simplifies some of the steps I’d otherwise have to go through. The simplest solution I could implement was to move the work item when the Assigned To field changes to a Business Analyst. This lets us work the card and add comments and notes, but when the time comes to hand it over to our development team for prioritization, it’s a simple change to a key attribute and a save.

Here’s the Logic Apps workflow overview:

The initial trigger is a timer; every 3 minutes, the app runs and looks for work items returned by a custom Azure DevOps query. This functionality is built into the Logic Apps designer as an action on the Azure DevOps connector. The query exists in our SRE project, and simply identifies work items that have been assigned to our Business Analyst group. Note that the BA group is a team in the SRE project.

SELECT
    [System.Id]
FROM workitems
WHERE
    [System.TeamProject] = @project
    AND [System.WorkItemType] <> ''
    AND [System.State] <> ''
    AND [System.AssignedTo] IN GROUP '[SRE]\BA <id:56e7c8c7-b8ef-4db9-ad9c-055227a30a26>'

Once this query returns a list of work items to the Logic App, I use a For Each step in the designer and embed a REST API action.

The REST API action offers maximum flexibility for updating values on a work item; there is also an Update action, but its options are limited. There was one gotcha: you have to set the Content-Type header to application/json-patch+json, or the call throws an error.
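
Under the covers, this is just the standard work item update endpoint; the request looks roughly like this (the organization, work item ID, and API version here are placeholders):

PATCH https://dev.azure.com/{organization}/_apis/wit/workitems/{id}?api-version=6.0
Content-Type: application/json-patch+json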

The code is below; it’s JSON, and the syntax is that you specify an operation (“add” for both updates and creates), the path to the field you want to change, and the value you want to set it to. In this case, I’m changing the Project, the Area Path, the Iteration Path, and the State of the work item, and adding a comment to the Symptom field.

[
  {
    "op": "add",
    "path": "/fields/System.TeamProject",
    "value": "Dev"
  },
  {
    "op": "add",
    "path": "/fields/System.AreaPath",
    "value": "Dev"
  },
  {
    "op": "add",
    "path": "/fields/System.IterationPath",
    "value": "Dev"
  },
  {
    "op": "add",
    "path": "/fields/System.State",
    "value": "Proposed"
  },
  {
    "op": "add",
    "path": "/fields/Symptom",
    "value": "Triaged by SRE Team.  See Repro Steps tab for information."
  }
]

Giving OpenLiveWriter a spin

I used to love the Windows version of LiveWriter, and saw a tweet today letting me know that an open-source fork of it is available. This may restore my love of blogging; the interface is simple to use, and doesn’t feel as unfamiliar as the “blocks” used in WordPress these days.

[screenshot: OpenLiveWriter]

The tool is a bit rough in places; I couldn’t get plugins to work, and pasting the above image from the clipboard was also challenging, but overall, it’s a welcome advance. Just feels natural to an old-school blogger like myself.

Sending monthly scheduled email from an Azure DevOps query

One of my tasks over the last few years has been to keep management and senior management aware of software deployments for our hosted services. This started out as a CAB (Change Advisory Board) meeting, but all of our deployments quickly became standard changes, and it basically became a monthly review of what had already happened (which is not what a CAB meeting is supposed to be). I figured a meeting wasn’t necessary, so I was looking for a way to show what we’ve done in an easy-to-digest format.

The problem is that Azure DevOps doesn’t offer scheduled email functionality out of the box. There is a Marketplace scheduler that you can use as part of a build, but unfortunately, it didn’t work in our environment for some reason. I stumbled on the concept of Power Automate, but Azure DevOps is a premium connector there. However, we do have an Azure subscription, so Logic Apps it is.

Below is the flow that I came up with. At first it seemed relatively straightforward to pull together, but the stumbling block was the fact that the tables generated by the Create HTML Table action are VERY rudimentary. No styling, no hyperlinks, nothing. That’s the reason for the additional variable steps.

The Initialize Variable step is where I define a string variable to hold the output of the Create HTML Table step. It’s empty until I set it later (in the Set Variable step). The Create HTML Table step was mostly easy, except that I wanted a defined border, and a hyperlink that would let recipients click through to the specific work item.

[ba]https://your_org_here/your_project_here/_queries/edit/{{ID}}[ea]{{ID}}[ca]

The Set Variable step then takes the output of the Create HTML Table step and replaces the placeholders with the appropriate HTML tags. In this case, I added a table border, and made a hyperlink out of the ID column.

replace(replace(replace(replace(body('Create_HTML_table'), '<table>', '<table border=10>'), '[ba]', '<a href="'), '[ea]', '">'),'[ca]','</a>')
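
After those replacements run, a cell in the ID column ends up rendering as something like this (work item 1234 is just an example):

<td><a href="https://your_org_here/your_project_here/_queries/edit/1234">1234</a></td>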

The email step then uses this variable in the body, and the final product looks something like this:

Struggling to blog lately…

Lots of good ideas, but can’t seem to find the time to actually sit down and write about them. At the end of the year (2020), I had a major health scare; turned out to be ok, but it was significant enough to wake me up. I’d like to write about that someday.

I’d also like to describe my network setup. I’ve spent a lot of time working from home, and trying different approaches to staying connected. I’ve finally landed on a solution I like, and I need to spend some time diagramming it.

I should also blog about my job, and some of the challenges and fun stuff we do. I’ve recently started card games at lunch with my team, and I’m hoping that’s going to spur some new relationships. I’ve also started using a PIP tracking tool to help guide conversations with an employee who needs specific guidance (which is a weakness of mine).

So here’s the deal; consider this a blog post about blogging. My goal is to write one of these by the end of the week.

Been a weird year….

And you’re probably going to see a ton of retrospective posts going live soon from a variety of authors. I’m struggling to write… well, anything…. That being said, I’ve had a few key successes over the last year.

  1. Presented several times virtually, particularly as a Friend of Red Gate. DevOps Enterprise Summit, PASS Summit, and DPS. I also presented for Georgia DAMA and for the Nashville SQL Saturday (my last in-person presentation).
  2. Job is good; learning lots of new stuff with PowerShell and Octopus Deploy, as well as Azure DevOps.
  3. We got an awesome dog. Meet Conway.

Of course, lots of other stuff happened too. COVID decimated travel plans, and as most of you are aware, it killed an organization that I’ve been a long-time member of (PASS). It also cancelled the SQLSaturday Atlanta for 2020, perhaps indefinitely.

Top it off with some health stuff, and frankly, I’m exhausted. However, I do have this urge to make the most out of the next year, and the only way I know how to do that, is to get back in the habit of writing.

More to come.

#DOES20 Reflections

I usually try to write these Three Things I’ve Learned posts at the end of each day of a conference, but this is a little different. This is the first multi-day virtual training event I’ve been to, and because it’s virtual, there’s no travel. Which means my day begins on EDT, and the conference starts (and ends) on PDT. Makes for a very long day.

That being said, I love this conference: DevOps Enterprise Summit 2020 continues to push me to think abstractly about technological problems, and grounds me again in looking at cultural issues within my organization. These are the three things that have resonated with me (so far):

  1. I’ve got to do more to make the pain visible across the organization. Lots of folks have stressed the necessity of communication across teams, disciplines, and to the upper management, and that’s really sticking out to me. I think we’ve done a decent job of complaining, but we haven’t done the best job of proposing better ways of solving problems.
  2. I need to celebrate my team’s victories more. My team works hard, and there are moments when they feel forgotten in the grand scheme of things, particularly when other teams are holding them back. I need to make sure that they realize how much change they’ve promoted, and how far we’ve come as an organization.
  3. Set the Vision, but focus on the first step. A few years ago when I started us on this journey, I laid out a vision, and I need to revisit that. A lot has changed, including both targets we hit, and goals that are no longer on the roadmap. I need to make sure that I frame each step along the way in terms of the value it brings to the service we’re offering.

Virtual conferences are a different kind of fatigue; it’s been rough staying focused. I think I’m going to write another blog post to describe what’s worked for me, and what I’ll do differently in the future.

Back to learning; 3 more hours to go today 🙂

#DOES2020 starts tomorrow

This is one of my favorite conferences, and obviously COVID-19 changes everything. It’ll be interesting to see how going online changes the tone of the conversation. One big benefit is that the cost is definitely lower while the quality of the content is expected to be the same; I miss traveling, though.

I know most folks have done a virtual conference this year and have already mastered the Zoom burnout, but I’m a little concerned about it. I’m going to try to beat it by staying engaged with a co-worker who is also attending. I also plan on live-tweeting and/or blogging as it goes.

Getting my learn on.

Guest of @RedGate at #DOES20

Haven’t blogged in a bit, and I definitely need to get back to writing. However, just wanted to post a quick note that I’m super excited to be presenting a guest session with RedGate Software‘s Grant Fritchey at the DevOps Enterprise Summit. I’m very excited about this for multiple reasons:

  1. I love this conference; DOES is very inspirational, and there are lots of great speakers and content. It’s focused more on the technical goals than the actual tools, so it’s a good fit for where I am career-wise.
  2. I love RedGate Software. Their company is simply an amazing producer of tools for the database community, and they’ve been very supportive of #sqlfamily and #sqlSaturdays for a long time. I’m stoked that they’re expanding their reach.
  3. Grant‘s ok. (Really, he’s a great guy and a lot of fun to talk to).

See y’all in virtual Vegas. Registration for the conference is half-price through August 31; it’s a great deal at $325.

Uploading Multiple RDL files to SSRS

My QA folks have a problem; development has been working on migrating a bunch of reports from a legacy 3rd-party application to SQL Server Reporting Services. Development has finished their “sprint” (*cough* waterfall), handed over 52 reports to QA, and told them to load them onto their QA server. The problem? The web interface only loads one report at a time.

Me, being the DevOps guy that I am, thought “well, that’s just silly; there has to be a way to upload all of the reports at once, rather than clicking on some silly buttons over and over again”. I googled, and the best explanation I found was this blog post: https://basic-ssrs.blogspot.com/2015/12/upload-multiple-rdl-files-from-local.html (Kudos; this is a VERY well-written walk-through).

Unfortunately, I did exactly what the blog suggested, and ran into two problems. The first? Running the .bat file opened and closed a command window in seconds. No feedback. Nothing. I went back and made a minor change to the .bat file; namely, I added the PAUSE command to the end:

"E:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\rs.exe" -i C:\report_upload\Upload_Multiple_RDL_files.rss -s "http://localhost/reportserver" -v FILE_NAME="C:\report_upload\ConfigurationFile.txt"
pause

This allowed me to discover the second (more important) issue; I ran the .bat file, and now I get the following error:

Unhandled exception:
The definition of this report is not valid or supported by this version of Reporting Services. The report definition may have been created with a later version of Reporting Services, or contain content that is not well-formed or not valid based on Reporting Services schemas. Details: '.', hexadecimal value 0x00, is an invalid character. Line 2563, position 10.

Ummmmm, what?

Back to the Bing. I found another article that describes basically the same process, but at the bottom of the comment section, there was this little gem:

Hello, I was getting the invalid content error:

“….or contain content that is not well-formed or not valid based on Reporting Services schemas. Details: ‘.’, hexadecimal value 0x00, is an invalid character….”

Made some research and found that arrays in vb are created as N+1 long size, and filled with 0x00, so you need to change the script to:

definition = New [Byte](stream.Length - 1) {}

 to get it working.

https://www.mssqltips.com/sqlservertip/3255/sql-server-reporting-services-ssrs-rsexe-utility/

BINGO! Went back to the .rss file and changed the deployment code block to the following:

                    ' Deploying the Reports
                    Try
                        Dim stream As FileStream = File.OpenRead(fileFullPath)
                        Dim NameWithExt As String = fileFullPath.Replace(path, "")
                        Dim NameOnly As String = NameWithExt.Replace(".rdl", "")
                        definition = New [Byte](stream.Length - 1) {}
                        stream.Read(definition, 0, CInt(stream.Length))

Ran the .bat, and in just a few minutes, I had a ton of reports loaded to my directory. Now I just gotta fix the data sources.

**Edit – based on a comment below, I’ve embedded the entire revised script.

' Script Starting Point
' Script to deploy report to report server
' EXECUTE VIA COMMAND LINE
' Original script: https://basic-ssrs.blogspot.com/2015/12/upload-multiple-rdl-files-from-local.html
' Upload_Multiple_RDL_files.rss

Dim definition As [Byte]() = Nothing
Dim warnings As Warning() = Nothing

Public Sub Main()

' Variable Declaration
        Dim TextLine As String = ""
        Dim LocalDir As String = ""
        Dim ServerDir As String = ""
        Dim definition As [Byte]() = Nothing
        'Dim warnings As Warning() = Nothing

' Reading Configuration Text file and assigning the values to variables
        If System.IO.File.Exists(FILE_NAME) = True Then
            Dim objReader As New System.IO.StreamReader(FILE_NAME)
            Do While objReader.Peek() <> -1
                TextLine = objReader.ReadLine()
                Dim parts As String() = TextLine.Split(New Char() {","c})
                'TextLine & objReader.ReadLine() '& vbNewLine
                LocalDir = parts(0)
                ServerDir = parts(1)

                Dim path As String = LocalDir
                Dim fileEntries As String() = Directory.GetFiles(path)
                Dim fileFullPath As String = ""
                For Each fileFullPath In fileEntries


 ' Deploying the Reports
                    Try
                        Dim stream As FileStream = File.OpenRead(fileFullPath)
                        Dim NameWithExt As String = fileFullPath.Replace(path, "")
                        Dim NameOnly As String = NameWithExt.Replace(".rdl", "")
                        definition = New [Byte](stream.Length-1) {}
                        stream.Read(definition, 0, CInt(stream.Length))

                        warnings = rs.CreateReport(NameOnly, ServerDir, True, definition, Nothing)

                        If Not (warnings Is Nothing) Then
                            Dim warning As Warning
                            For Each warning In warnings
                                Console.WriteLine(warning.Message)
                            Next warning
                        Else
                            Console.WriteLine("Report: {0} PUBLISHED!", NameOnly)
                        End If

                    Catch e As IOException
                        Console.WriteLine(e.Message)
                    End Try
                Next fileFullPath
            Loop
        Else
            Console.WriteLine("File Does Not Exist")
        End If
End Sub

'End of the Script
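
For reference, the ConfigurationFile.txt that the script reads is just comma-separated pairs of local folder and target report server folder, one pair per line; the paths below are made-up examples:

C:\report_upload\rdl_files,/QA Reports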