Development

Practical #kanban – map your flow to your work

When I started my new gig a little over a month ago, one of the first problems I wanted to tackle was the flow of work. My team does a lot, and some of the work is very structured and repetitive, whereas other work is much more fluid. It’s really two teams with different foci; they ran different sync meetings and had different work intake processes, but they were sharing a single board.

The board was a typical software-development-style kanban board; the columns included things like:

  • Ready – User stories pulled out of backlog based on business priorities
  • In Progress – work that was ongoing
  • On Hold – work that was waiting for other team input
  • Validate – Work that needed sign-off from someone
  • Closed – recently “done” items

For a fluid SRE team where project work can be ambiguous, a flow like this is OK (with some caveats that I’ll explain below). In Progress is a “squishy” state; at a glance, you don’t know what the steps are or how long the work is going to take. But SRE work itself can be “squishy”: you don’t know if you’re going to be writing a script to automate a technical process or helping coordinate a large-scale project to implement SLOs for a new service. A card sitting in In Progress is just a visual reminder of a task list.

For structured, repeatable work, though, this style of board has limitations; if the steps that make up In Progress are known, you can highlight obstacles in your process by teasing them out into their own states. The regulated work team doesn’t know how often projects come in, but when they do, the team follows the same series of steps for every project (no matter how much data is involved). There was also a lot of regular back-and-forth between this team and client teams early in the process; on the old board, cards were constantly moving between In Progress and On Hold, which made WIP limits very difficult to track.

We spun up a new board to better represent the flow, focusing on replacing the concept of In Progress with more specific states tailored to the flow. We also identified the primary constraint(s) for each state, be it Client, Team Member, or Systems:

  • Ready: Projects (not single stories) pulled from the backlog in priority order (Client)
  • Requirements: A discussion phase with clients to finalize expectations (Client/Team Member)
  • Prep: Project scripts are developed; data is gathered (Team Member)
  • Produce: The actual work is in flight (Systems)
  • Analysis: Write-up from the project (Team Member)
  • Validate: Work that needs sign-off from the client (Client)
  • Closed: Recently “done” items

Note that WIP limits can now be defined for multiple states (Prep, Produce, and Analysis); each of these states is governed by a limited number of resources. Team members can only have one project in the Analysis state at a time, for example. Note also that the cards represent the entire project, not just a task inside a project. It’s an assembly-line mentality, and it allows us to quickly see where constraints are acting as a bottleneck.

With the regulated workflow removed from the original board, we could focus on some rules for the fluid SRE work (the caveats referenced above). One of the basic decisions for any kanban flow is: what does a card represent? For an assembly line, a unit of work can be the whole project, but for fluid software development efforts, the definition gets really soft.

We decided to go with a timebox method: a unit of work should take no more than a day of focused effort. A card can be really small or reasonably large, but if you think a piece of work will take more than a day, split it into multiple cards. This has psychological advantages; it’s far better to see things getting marked as “done” than to watch a card sit in the In Progress column for weeks.

Another rule we implemented was transferring ownership of the card in the Validate phase back to the original requester. We then started a clock; if a card sat in Validate for more than 7 days, we flipped the ownership back to the team member and closed the card.

So far, the team seems to be working well with the two new improvements; after 30 days, we should have enough data to see where we can continue to improve the flow.

Kanbanly & Google Tasks

As I’ve previously posted, I recently made a job switch. Frankly, it feels like I’m drowning most days. There are a lot of processes and procedures to learn, as well as a new technology stack to get intimate with, and it’s been tough to maintain good workflow hygiene both personally and professionally. Part of that stems from the shift from Microsoft Office to Google Workspace; I lost touch with many of the tools I was using to capture to-do items.

I started using Google Tasks again, but the visualization is limited; it’s a checklist, and that’s OK for most tasks, but I love Kanban and the way it organizes workflow to help structure future items as well as current ones. However, I haven’t had much success with mobile Kanban apps; they’re usually hard to work with on a smaller screen.

Enter Kanbanly. It’s a Chrome web app that transforms Google Tasks into a Kanban board by adding metadata to the tasks. This lets me use a Kanban interface on the desktop and the Google Tasks client on my iPhone. It’s clunky at times (the metadata tags in the iPhone app can be distracting), but so far it seems to be working; it even has WIP limits per column as a visual reminder when I’m overcommitted.

Using an #Azure Logic App to create @AzureDevOps Work Items from a SQL Server dataset

In a previous post, I described how to use an Azure Logic App to update an Azure DevOps work item; in an effort to add more automation to our processes, I’m starting to move more and more notifications and work items directly into Azure DevOps using Logic Apps. For example, for one of our websites, we have an SSRS report that queries the database to determine whether employees are compliant with password policies. We get a weekly email that we then have to copy and paste into Azure DevOps (to track the work), and then we do the work. I want to eliminate the email step.

The workflow is very straightforward compared to updating an existing work item; there’s no need to manipulate a REST API for this. Connectors are all built in.

  1. Set up a schedule using the Recurrence item.
  2. Execute a stored procedure. This is the item that can be difficult to set up, especially if you have an on-premises SQL Server. In our case, I had to download and set up an on-premises data gateway, and then configure a connection to my SQL Server. Once that was done, I had to identify the database and stored procedure that contains the result set I wanted to add to the work item.
  3. The result set from the stored procedure comes back as JSON; parsing it defines the fields as individual elements that later actions can reference.
  4. I then take the elements of interest and construct an HTML table from them.
  5. Finally, I create an Azure DevOps work item, adding the HTML table to the description field (a rough PowerShell equivalent of the whole flow is sketched after this list).
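
For anyone who prefers scripting, here’s a rough PowerShell sketch of the same flow; this is not what the Logic App actually runs, and the server, database, stored procedure, column names, and organization/project URLs are placeholders I made up. You’d also run this from a scheduled task rather than a Recurrence trigger:

# Rough equivalent of the Logic App flow, for illustration only.
# Assumes the SqlServer module is installed and a PAT with work item write permissions.
$pat     = '<personal-access-token>'
$headers = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# 1. Run the stored procedure that returns the compliance result set (names are placeholders)
$rows = Invoke-Sqlcmd -ServerInstance 'MySqlServer' -Database 'MyDatabase' -Query 'EXEC dbo.usp_PasswordComplianceReport'

# 2. Build an HTML table from just the columns of interest (assumed column names)
$table = $rows | Select-Object EmployeeName, LastPasswordChange | ConvertTo-Html -Fragment | Out-String

# 3. Create the work item with the table in the description; the title gets a date stamp,
#    mirroring the formatDateTime() bonus below
$title = 'Password compliance review - ' + (Get-Date).ToString('D')
$body  = ConvertTo-Json @(
    @{ op = 'add'; path = '/fields/System.Title';       value = $title },
    @{ op = 'add'; path = '/fields/System.Description'; value = $table }
)
$uri = 'https://dev.azure.com/your_org_here/your_project_here/_apis/wit/workitems/$Task?api-version=6.0'
Invoke-RestMethod -Method Post -Uri $uri -Headers $headers -Body $body -ContentType 'application/json-patch+json'
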

BONUS: I added a timestamp to the work item title by using the formatDateTime() function.

formatDateTime(utcNow(),'D')
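
(The 'D' format string is the .NET long date pattern, so the title gets a stamp along the lines of “Tuesday, July 9, 2019”.)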

Using an #Azure Logic App to move @AzureDevOps Work Items

A big part of my job these days is looking for opportunities to improve workflow. Automation of software is great, but identifying areas to speed up human processes can be incredibly beneficial to delivering value to customers. Here’s the situation I recently figured out how to automate:

  1. My SRE team uses a different Azure DevOps project than our development team. This protects the “separation of duties” concept that auditors love, while still letting us transfer items back and forth.
  2. The two projects are in the same organization.
  3. The two projects use different templates, with different required fields.
  4. Our workflow process requires two phases of triage for bugs in the wild: a technical phase (provided by my team), and a business prioritization (provided by our Business Analyst).
  5. Moving a card between projects is simple, but there were several manual changes that had to be made:
    1. Assigning to a Business Analyst (BA)
    2. Changing the status to Proposed from Active
    3. Changing the Iteration and Area
    4. Moving the card.

To automate this, I decided to use Azure Logic Apps. There are probably other ways to approach this problem (like PowerShell), but one of the benefits of the Logic Apps model is that it uses the same security settings as our Azure DevOps installation, which simplifies some of the steps I have to go through. The simplest solution I could implement was to move the work item when the Assigned To field changes to a Business Analyst. This lets us work the card and add comments and notes, but when the time comes to hand it over to our development team for prioritization, it’s a simple change to a key attribute and a save.

Here’s the Logic Apps workflow overview:

The initial trigger is a timer; every 3 minutes, the app runs and looks for work items returned by a custom Azure DevOps query. This functionality is built into the Logic Apps designer as an action on the Azure DevOps connector. The query lives in our SRE project and simply identifies work items that have been assigned to our Business Analyst group. Note that the BA group is a team in the SRE project.

SELECT
    [System.Id]
FROM workitems
WHERE
    [System.TeamProject] = @project
    AND [System.WorkItemType] <> ''
    AND [System.State] <> ''
    AND [System.AssignedTo] IN GROUP '[SRE]\BA <id:56e7c8c7-b8ef-4db9-ad9c-055227a30a26>'

Once this query returns a list of work items to the Logic App, I use a For Each step in the designer and embed a REST API action inside it.

The REST API action offers maximum flexibility for updating values on a work item; there is also an Update action, but its options were limited. There was one gotcha: you have to set the content type to application/json-patch+json, or the call throws an error.

The code is below; it’s JSON, and the syntax is that you specify an operation (“add” for both updates and creates), a path to the field you want to change, and the value you want to set it to. In this case, I’m changing the Project, the Area Path, the Iteration Path, and the State of the work item, and adding a comment to the Symptom field.

[
  {
    "op": "add",
    "path": "/fields/System.TeamProject",
    "value": "Dev"
  },
  {
    "op": "add",
    "path": "/fields/System.AreaPath",
    "value": "Dev"
  },
  {
    "op": "add",
    "path": "/fields/System.IterationPath",
    "value": "Dev"
  },
  {
    "op": "add",
    "path": "/fields/System.State",
    "value": "Proposed"
  },
  {
    "op": "add",
    "path": "/fields/Symptom",
    "value": "Triaged by SRE Team.  See Repro Steps tab for information."
  }
]
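
If you want to test that patch document outside of the Logic App, a rough PowerShell sketch of the REST call looks like this (the organization URL, work item ID, file name, and PAT are placeholders); note the content type, which is the gotcha mentioned above:

# Illustration only: send the JSON patch document above to the work item update endpoint
$pat     = '<personal-access-token>'
$headers = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$uri     = 'https://dev.azure.com/your_org_here/_apis/wit/workitems/12345?api-version=6.0'
$patch   = Get-Content -Raw -Path '.\move-workitem.json'   # the JSON patch shown above

# Without the json-patch content type, the API rejects the request
Invoke-RestMethod -Method Patch -Uri $uri -Headers $headers -Body $patch -ContentType 'application/json-patch+json'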

Sending monthly scheduled email from an Azure DevOps query

One of my tasks over the last few years has been to keep management and senior management aware of software deployments for our hosted services. This started out as a CAB (Change Advisory Board), but all of our deployments quickly became standard changes, and it basically turned into a monthly review of what had already happened (which is not what a CAB meeting is supposed to be). I figured a meeting wasn’t necessary, so I went looking for a way to show what we’ve done in an easy-to-digest format.

The problem is that Azure DevOps doesn’t offer scheduled email functionality out of the box. There is a Marketplace scheduler that you can use as part of a build, but unfortunately, it didn’t work in our environment for some reason. I stumbled on the concept of Power Automate, but Azure DevOps is a premium connector there. However, we do have an Azure subscription, so Logic Apps it is.

Below is the flow I came up with. At first it seemed relatively straightforward to pull together, but the stumbling block was the fact that the HTML tables the Create HTML Table action generates are VERY rudimentary: no styling, no hyperlinks, nothing. That’s the reason for the additional variable steps.

The Initialize Variable step is where I define a string variable to hold the output of the Create HTML Table step. It stays empty until I set it later, in the Set Variable step. The Create HTML Table step was mostly easy, except that I wanted a defined border and a hyperlink that would let recipients click through to the specific work item, so I built placeholders into the ID column:

[ba]https://your_org_here/your_project_here/_workitems/edit/{{ID}}[ea]{{ID}}[ca]

The Set Variable step then takes the output of the Create HTML Table step and replaces the placeholders with the appropriate HTML tags. In this case, I added a table border and made a hyperlink out of the ID column.

replace(replace(replace(replace(body('Create_HTML_table'), '<table>', '<table border=10>'), '[ba]', '<a href="'), '[ea]', '">'),'[ca]','</a>')
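
To see what that expression actually does, here’s the same substitution sketched in PowerShell against a single made-up cell of the Create HTML Table output (work item 101 is just an example):

# Sample cell as emitted by the Create HTML Table step, with the placeholder markers in place
$cell = '<td>[ba]https://your_org_here/your_project_here/_workitems/edit/101[ea]101[ca]</td>'

# The same chain of replaces the Logic App expression performs
$cell.Replace('[ba]', '<a href="').Replace('[ea]', '">').Replace('[ca]', '</a>')
# Output: <td><a href="https://your_org_here/your_project_here/_workitems/edit/101">101</a></td>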

The email step then uses this variable in the body, and the final product is a bordered HTML table with a clickable link to each work item.

Uploading Multiple RDL files to SSRS

My QA folks have a problem; development has been migrating a bunch of reports from a legacy third-party application to SQL Server Reporting Services. Development finished their “sprint” (*cough* waterfall), handed 52 reports over to QA, and told them to load the reports onto their QA server. The problem? The web interface only loads one report at a time.

Me, being the DevOps guy that I am, thought “well, that’s just silly; there has to be a way to upload all of the reports at once, rather than clicking the same silly buttons over and over again.” I googled, and the best explanation I found was this blog post: https://basic-ssrs.blogspot.com/2015/12/upload-multiple-rdl-files-from-local.html (kudos; it’s a VERY well-written walk-through).

Unfortunately, I did exactly what the blog suggested, and ran into two problems. The first? Running the .bat file opened and closed a command window in seconds. No feedback. Nothing. I went back and made a minor change to the .bat file; namely, I added the PAUSE command to the end:

"E:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\rs.exe" -i C:\report_upload\Upload_Multiple_RDL_files.rss -s "http://localhost/reportserver" -v FILE_NAME="C:\report_upload\ConfigurationFile.txt"
pause

This allowed me to discover the second (more important) issue; I ran the .bat file, and now I get the following error:

Unhandled exception:
The definition of this report is not valid or supported by this version of Reporting Services. The report definition may have been created with a later version of Reporting Services, or contain content that is not well-formed or not valid based on Reporting Services schemas. Details: ‘.’, hexadecimal value 0x00, is an invalid character. Line 2563, position 10.

Ummmmm, what?

Back to the Bing. I found another article that describes basically the same process, but at the bottom of the comment section, there was this little gem:

Hello, I was getting the invalid content error:

“….or contain content that is not well-formed or not valid based on Reporting Services schemas. Details: ‘.’, hexadecimal value 0x00, is an invalid character….”

Made some research and found that arrays in vb are created as N+1 long size, and filled with 0x00, so you need to change the script to:

definition = New [Byte](stream.Length - 1) {}

 to get it working.

https://www.mssqltips.com/sqlservertip/3255/sql-server-reporting-services-ssrs-rsexe-utility/

BINGO! I went back to the .rss file and changed the relevant code block to this:

' Deploying the Reports
Try
    Dim stream As FileStream = File.OpenRead(fileFullPath)
    Dim NameWithExt As String = fileFullPath.Replace(path, "")
    Dim NameOnly As String = NameWithExt.Replace(".rdl", "")
    definition = New [Byte](stream.Length - 1) {}
    stream.Read(definition, 0, CInt(stream.Length))

Ran the .bat, and in just a few minutes, I had a ton of reports loaded into my report folder. Now I just gotta fix the data sources.

**Edit – based on a comment below, I’ve embedded the entire revised script.

' Script Starting Point
' Script to deploy report to report server
' EXECUTE VIA COMMAND LINE
' Original script: https://basic-ssrs.blogspot.com/2015/12/upload-multiple-rdl-files-from-local.html
' Upload_Multiple_RDL_files.rss

Dim definition As [Byte]() = Nothing
Dim warnings As Warning() = Nothing

Public Sub Main()

' Variable Declaration
        Dim TextLine As String = ""
        Dim LocalDir As String = ""
        Dim ServerDir As String = ""
        Dim definition As [Byte]() = Nothing
        'Dim warnings As Warning() = Nothing

' Reading Configuration Text file and assigning the values to variables
        If System.IO.File.Exists(FILE_NAME) = True Then
            Dim objReader As New System.IO.StreamReader(FILE_NAME)
            Do While objReader.Peek() <> -1
                TextLine = objReader.ReadLine()
                Dim parts As String() = TextLine.Split(New Char() {","c})
                'TextLine & objReader.ReadLine() '& vbNewLine
                LocalDir = parts(0)
                ServerDir = parts(1)

                Dim path As String = LocalDir
                Dim fileEntries As String() = Directory.GetFiles(path)
                Dim fileFullPath As String = ""
                For Each fileFullPath In fileEntries


 ' Deploying the Reports
                    Try
                        Dim stream As FileStream = File.OpenRead(fileFullPath)
                        Dim NameWithExt As String = fileFullPath.Replace(path, "")
                        Dim NameOnly As String = NameWithExt.Replace(".rdl", "")
                        definition = New [Byte](stream.Length-1) {}
                        stream.Read(definition, 0, CInt(stream.Length))

                        warnings = rs.CreateReport(NameOnly, ServerDir, True, definition, Nothing)

                        If Not (warnings Is Nothing) Then
                            Dim warning As Warning
                            For Each warning In warnings
                                Console.WriteLine(warning.Message)
                            Next warning
                        Else
                            Console.WriteLine("Report: {0} PUBLISHED!", NameOnly)
                        End If

                    Catch e As IOException
                        Console.WriteLine(e.Message)
                    End Try
                Next fileFullPath
            Loop
        Else
            ' The configuration file was not found
            Console.WriteLine("File Does Not Exist")
        End If
End Sub

'End of the Script
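
One more note for anyone following along: the ConfigurationFile.txt that the script reads is just one comma-separated pair per line, the local folder holding the .rdl files followed by the target folder on the report server. A sample file might look like this (the paths are made up):

C:\report_upload\QA_Reports,/QA Reports
C:\report_upload\Finance_Reports,/Finance Reports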

Checking for #RedGate SCA databases

We’re starting a large-scale conversion of our development processes on some very old database servers. Some of these databases were developed over 10 years ago and were never checked into source control. As our team develops new features, they’ve been using Red Gate’s SQL Change Automation to develop database projects alongside their applications; it’s an impressive tool, and we’re getting ready to tackle the older databases in an effort to improve our code base.

But first, we need to determine which databases are already in source control and which ones are not. Most of these databases and applications are minimally documented, so most of our development team doesn’t know what’s actually in production. To discover which databases are already part of SCA, I created a simple SQL script. It looks for the existence of the [dbo].[__MigrationLog] table, an artifact of the SCA process; from that table I can also tell when a snapshot was last deployed and what version it was given.

-- Collect one row per database that contains the SCA migration log table
CREATE TABLE #tmp (dbname varchar(1000), completedate datetime, version varchar(200))

DECLARE @sql varchar(MAX)

-- sp_msforeachdb substitutes ? with each database name; the brackets protect unusual names
SET @sql = 'USE [?];
IF EXISTS (SELECT * FROM sys.tables t WHERE t.name = ''__MigrationLog'')
SELECT db_name() dbName, MAX(complete_dt) completeDate, MAX(version) version
FROM dbo.__MigrationLog
'

INSERT INTO #tmp
EXEC sp_msforeachdb @sql

SELECT *
FROM #tmp

DROP TABLE #tmp

#NeverStopLearning 10 Minutes of Code: PS Array .Add

One of the scripts I’ve been working on in my spare time identifies which servers I can’t reliably run PowerShell on remotely. We manage a large domain of legacy servers, and it’s been sliced into smaller segments with some complex routing; it’s a bit of a nightmare, but I figured I could probe each computer and run some basic PowerShell to see if it responds.

However, we have a LOT of servers, and some of them fail even a basic ping test. Why waste resources testing PowerShell commands against servers I can’t even reach? My solution? Run a ping test first and build two arrays: one for servers that respond to ping, and one for those that don’t.

# $computers is assumed to be a collection of objects with a .Name property (e.g., from Get-ADComputer)
$aNoPing    = New-Object System.Collections.ArrayList
$aComputers = New-Object System.Collections.ArrayList

foreach ($computer in $computers)
{
    # ping test first; if it fails, add it to the $aNoPing array with the .Add method
    if (-not (Test-Connection -ComputerName $computer.Name -Count 1 -Quiet))
    {
        [void]$aNoPing.Add($computer)
    }
    else # it pinged; add it to the $aComputers array for the rest of the tests
    {
        [void]$aComputers.Add($computer)
    }
}

# generate some output; these boxes need to be looked at.
Write-Output ""
Write-Output "The following servers cannot be reached by ping."
Write-Output "------------------------------------------------"
$aNoPing

#SQLSATBR: Database People and #DevOps

Excited to announce that I was chosen to present my session “Database People and DevOps: The Fundamentals” at SQLSaturday Baton Rouge 2019 this August. Very excited to head back close to home this year; I actually attended LSU for a year of graduate school before transferring to UGA, so the campus holds a dear place in my heart. SQLSaturday Baton Rouge appears to have grown a lot since the last time I was there, so I’m hoping I can pick up some ideas for our event in 2020.

This is a fun session for me, and I’ve got some revisions to make after delivering it in Atlanta. I hope folks find it informative, and I give lots of references for future study. This is a summary class, which means I cover a lot of topics at a high level, but I like to build a framework for future study.

Y’all come.

#NeverStopLearning 10 Minutes of Code: PS Select-Object

For a database person, this particular cmdlet has a lot of potential. I started monkeying around with Get-ComputerInfo and was overwhelmed by the number of properties it returns. I only wanted a few, so I started looking at Select-Object. By piping the output of one command to Select-Object, I can return only the properties I’m interested in (and in a tabular format).

Get-ComputerInfo | Select-Object CsCaption, CsModel

I’m trying to develop a script that will connect to every server in a given list and validate whether I can execute PowerShell remotely on it. I’m getting there. Get-ComputerInfo is too heavy (it gathers a lot more information than I need), but being able to pipe the output of a command to Select-Object will help me immensely; a rough sketch of the direction is below.
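
Just an illustration of where this is headed (Test-WSMan is one way to check whether remoting answers, and $computers is assumed to be a list of server names):

# Probe each server's WinRM endpoint instead of pulling the full Get-ComputerInfo payload
$results = foreach ($computer in $computers)
{
    # Test-WSMan returns WinRM identity info (ProductVendor, ProductVersion, etc.) when remoting
    # answers; with -ErrorAction SilentlyContinue it returns nothing when it doesn't
    $wsman = Test-WSMan -ComputerName $computer -ErrorAction SilentlyContinue

    [pscustomobject]@{
        Computer       = $computer
        RemotingOK     = [bool]$wsman
        ProductVersion = $wsman.ProductVersion
    }
}

# Same trick as above: pipe the results to Select-Object to keep just the columns I care about
$results | Select-Object Computer, RemotingOK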

More to come.