Azure

Using an #Azure Logic App to create @AzureDevOps Work Items from a SQL Server dataset

In a previous post, I described how to use an Azure Logic App to update an Azure DevOps work item; in an effort to add more automation to our processes, I’m moving more and more notifications and work items directly into Azure DevOps using Logic Apps. For example, for one of our websites, we have an SSRS report that queries the database to determine whether employees are compliant with password policies. We get a weekly email that we then have to copy and paste into Azure DevOps (to track the work), and then we do the work. I want to eliminate the email step.

The workflow is very straightforward compared to updating an existing work item; there’s no need to manipulate a REST API for this, since the connectors are all built in.

  1. Set up a schedule using the Recurrence trigger.
  2. Execute a stored procedure. This is the step that can be difficult to set up, especially if you have an on-premises SQL Server. In our case, I had to download and configure an on-premises data gateway, and then set up a connection to my SQL Server. Once that was done, I had to identify the database and the stored procedure that returns the result set I wanted to add to the work item.
  3. The result set from the stored procedure comes back as JSON; parsing the JSON exposes it as individual elements that can be referenced by subsequent actions.
  4. I then take those elements, pick out the columns of interest, and construct an HTML table.
  5. Finally, I create the Azure DevOps work item, adding the HTML table to the description field (a sketch of these last three actions follows the list).
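
For reference, here’s a minimal sketch of what steps 3 and 4 might look like in the Logic App’s code view. Treat it as an assumption-laden illustration: the column names (EmployeeName, LastPasswordChange) are made up, and the ResultSets/Table1 path is the shape the SQL connector typically returns, so check your own run history for the actual output (the code view is plain JSON, so the leading comment is annotation only).

// Sketch only: the names and output paths below are assumptions, not copied from my app.
"Parse_JSON": {
    "type": "ParseJson",
    "inputs": {
        "content": "@body('Execute_stored_procedure')?['ResultSets']?['Table1']",
        "schema": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "EmployeeName": { "type": "string" },
                    "LastPasswordChange": { "type": "string" }
                }
            }
        }
    },
    "runAfter": { "Execute_stored_procedure": [ "Succeeded" ] }
},
"Create_HTML_table": {
    "type": "Table",
    "inputs": {
        "format": "HTML",
        "from": "@body('Parse_JSON')",
        "columns": [
            { "header": "Employee", "value": "@item()?['EmployeeName']" },
            { "header": "Last Password Change", "value": "@item()?['LastPasswordChange']" }
        ]
    },
    "runAfter": { "Parse_JSON": [ "Succeeded" ] }
}

Step 5 is then just the Azure DevOps “Create a work item” action with @{body('Create_HTML_table')} dropped into the Description field.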

BONUS: I added a timestamp to the work item title by using the formatDateTime() function.

formatDateTime(utcNow(),'D')
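
In the title field, that expression just gets concatenated with the static title text; a minimal sketch (the title text here is illustrative, not my actual report name):

concat('Password compliance report - ', formatDateTime(utcNow(),'D'))

The 'D' specifier is the .NET long date pattern, so the title ends with a date like “Thursday, August 16, 2018”.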

Sending monthly scheduled email from an Azure DevOps query

One of my tasks over the last few years has been to keep management and senior management aware of software deployments for our hosted services. This started out as a CAB (Change Advisory Board), but all of our deployments quickly became standard changes, and it basically turned into a monthly review of what had already happened (which is not what a CAB meeting is supposed to be). I figured a meeting wasn’t necessary, so I was looking for a way to show what we’ve done in an easy-to-digest format.

The problem is that Azure DevOps doesn’t offer scheduled email functionality out of the box. There is a Marketplace scheduler that you can use as part of a build, but unfortunately, it didn’t work in our environment for some reason. I stumbled onto the concept of Power Automate, but Azure DevOps is a premium connector there. However, we do have an Azure subscription, so Logic Apps it is.

Below is the flow that I came up with. At first it seemed relatively straightforward to pull together, but the stumbling block was that the HTML tables Logic Apps generates are VERY rudimentary: no styling, no hyperlinks, nothing. That’s the reason for the additional variable steps.

The Initialize Variable step is where I define a string variable to handle the output of the Create HTML table step. It’s empty until I set it later in the Set Variable step. The Create HTML table step was mostly easy, except that I wanted a defined border and a hyperlink that would let recipients click through to the specific work item, so I embedded placeholder tokens around the ID value (the {{ID}} tokens below represent the work item ID from the query results):

[ba]https://your_org_here/your_project_here/_queries/edit/{{ID}}[ea]{{ID}}[ca]

The Set Variable step then takes the output of the Create HTML table step and replaces the placeholders with the appropriate HTML tags. In this case, I added a table border and made a hyperlink out of the ID column.

replace(replace(replace(replace(body('Create_HTML_table'), '<table>', '<table border=10>'), '[ba]', '<a href="'), '[ea]', '">'),'[ca]','</a>')
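
To make that concrete, here’s what a single ID cell looks like before and after the Set Variable step runs, using a hypothetical work item ID of 1234:

Before: [ba]https://your_org_here/your_project_here/_queries/edit/1234[ea]1234[ca]

After: <a href="https://your_org_here/your_project_here/_queries/edit/1234">1234</a>

The same pass also swaps the opening <table> tag for <table border=10>, which is what gives the emailed table its visible border.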

The email step then uses this variable in the body, and the final product is a bordered HTML table in the email, with each work item ID linking back to Azure DevOps.

TIL from Atlanta #AzureDataFest


I’ve been meaning to write this post since we wrapped up the event, but life, as usual, gets in the way.  Overall, I was very pleased with the whole event; things (for the most part) ran very smoothly.  However, in the spirit of continuous learning, here are a few lessons (in no particular order) for anyone considering hosting an Azure DataFest in the future.

Event Management

We used Sessionize and Eventbrite to handle speaker submissions, schedule building, and attendee management.  Both tools worked great, but both are a little pricey (Sessionize charges $250 for the event, and Eventbrite added a $3.54 fee to every ticket sold).  The benefit is that it was very easy to generate a professional-looking schedule, review abstracts, and manage attendees (from fee collection to attendance rosters).  The one downside is that the tools don’t integrate (there’s no way to easily export speakers into Eventbrite), and we really need a central website for people to hit rather than each individual tool.  I also had a small issue where some attendee badges didn’t print; that was probably user error.

Sponsor Expectations

  • I should have added company names to each attendee badge to make it easier for sponsors to see which companies attendees were from when talking with them.
  • I need to explain the email/contact information privacy policies better.  Some sponsors wanted more contacts to add to their mailing lists.  May need to borrow a page from the SQLSaturday playbook and encourage raffles to get information directly from the attendees.
  • Microsoft SSPs were on site, and that was a very valuable contribution.  I saw lots of hallway conversations between clients and Microsoft; that’s rare for SQLSaturdays.
  • Need to charge more for sponsorships in general; we had a flat rate of $500, which doesn’t go very far toward building a community.  Also, I need to provide more structure around what’s included in a sponsorship; we had a couple of sponsors who had 5 or 6 team members show up.  Since food was included in their sponsorship, that literally ate up most of the profit from it.
  • Need to find ways to encourage relationships between speakers and sponsors; speaker dinners or vendor parties?

Attendee Management

  • Generally, things went well.  Food portions were about right, the fee was right for a two-day affair, and we had very few snacks and/or drinks left over.
  • Would love to go as paperless as possible; however, I think people like having SWAG bags.  Maybe provide them with an empty bag, and tell them SWAG is available at sponsor tables?
  • Stickers were a HIT!
  • Very different crowd than a SQLSaturday.  In fact, during the opening session, only a few people had heard of SQLSaturday or AtlantaMDF.  Need to do a better job of evangelizing both of those, while recognizing that this is a crowd that may not want to give up their weekend.
  • Pretty sizable fall-off on Friday (the second day).  May need to try a Monday-Tuesday event to see if we do a better job of retaining folks.

Speaker Management

  • As noted above, Sessionize worked great for speaker management.  Abstracts were easy to receive and review, and building a schedule was a snap.
  • Need to be more up-front about the volunteer nature of this conference.  We had a few people who misunderstood, submitted from abroad, and then inquired about travel reimbursement.  It was cleared up over a few emails, but I should have headed that conversation off earlier.
  • I had a speaker withdraw because we charged $50 to attend the two-day conference;  they felt that didn’t fit as a “community” event, since most community events should be free to the consumer (or offer an optional lunch, like SQLSaturday does).  I get the point, but in practical terms, that’s tough to do with a new event.  No event is free; just different people (sponsors) pick up the tab.  We’ll continue to work on this, but ADF may always have a small fee associated with it.
  • Most sessions had speakers sitting in the audience.  I haven’t seen that happen at SQLSaturdays in a long time, so I’m hoping that people learned as much as they gave.

Logistics

  • Facility was great, but room capacity != seating arrangement.  I had to steal chairs from sponsors, and actually order more chairs on the first day to eliminate standing room only.
  • I loved having the plenary (everybody in one room) sessions at the start; we really need to do one at the end too, and then do a wrap-up.
  • I could have saved some funds on table linens.  The caterer brought their own, and they weren’t really necessary for the check in tables.
  • We had a few technical glitches, so we need to make sure we keep the facility staff around next year.  They went to lunch and weren’t back in time for the afternoon sessions, so those were a little rough (maybe promise them free lunch next year?).

#Azure DataFest Sessions I want to see: @sqlgator #Cortana and #PowerBI

There are still plenty of seats left for the inaugural #AtlantaAzureDataFest2018, so I thought I’d try to drum up some interest by posting about a few of the sessions I really want to see.  First up, Ed Watson’s session: “With Power BI and Cortana, You Can Take Over the World”.

I love the thought of integrating voice control with reporting; I have no clue what that means, but it definitely satisfies the whimsical nature of this conference. Let’s build something together just because we can, not necessarily because it satisfies a need.  Ed is a crazy fun presenter to watch (and a good friend).  I’m excited to see him push the envelope a bit.

Join us!  Seats start at $50 for two days of jam-packed training on August 16-17, 2018.  Tell your boss you’re being forward-thinking; they love that.


#Azure DataFest Atlanta #ADFATL – Call for Speakers Now Open!

As I’ve mentioned on Twitter (what, you don’t follow me?), I’ve been involved with a new conference that’s focusing on the Microsoft Data Platform – Azure DataFest. It’s still very much in the works, but there have been a few events around the country so far, and we’re bringing one to Atlanta in August (as well as working on a standardized national presence). If you want to help build a community of data professionals who are passionate about the next generation of analytics and data science, please feel free to submit a topic. Text for the CFP is below, but the actual call for speakers is here: https://sessionize.com/atlanta-2018-azure-datafest-microsoft.

More details to come (after I get through Atlanta SQLSaturday).

Atlanta 2018 Azure DataFest: Microsoft Azure Advanced Analytics and Big Data Conference

This is a call for speakers for the inaugural Atlanta Azure DataFest: Microsoft Azure Advanced Analytics and Big Data Conference, a 2-day event to be held on August 16-17, 2018, 9:00AM to 5:00PM at the Microsoft Technology Center, 8000 Avalon Boulevard, Suite 900, Alpharetta, GA 30009.

We are looking for 10-12 speakers to present on the following Azure Advanced Analytics and Big Data topics:

  • Azure Data Services
  • Azure Data Warehouse
  • Power BI
  • Cosmos DB
  • Azure Analysis Services
  • HDInsight
  • Machine Learning
  • Stream Analytics
  • Cognitive Services
  • Azure Bot Services
  • Data Lake Analytics
  • Data Lake Store
  • Data Factory
  • Power BI Embedded
  • Data Catalog
  • Log Analytics
  • Apache Spark for Azure
  • Dynamics 365 for Customer Insights
  • Custom Speech Service APIs
  • Spark

Planned Schedule (Thursday, August 16)

We plan on delivering a keynote and three sessions to the at-large audience, then breaking into tracks after lunch.

8:00AM – 9:00AM – Check-in/Breakfast/Networking

9:00AM – 9:50AM – Keynote – Room: All/Combined

10:00AM – 10:50AM – Session 1 – Room: All/Combined

11:00AM – 11:50AM – Session 2 – Room: All/Combined

12:00PM – 12:50PM – Partner/Sponsor Lunch and Learn – Room: All/Combined

1:15PM – 2:15PM – Breakout sessions

2:30PM – 3:30PM – Breakout sessions

3:45PM – 4:45PM – Breakout sessions

Sessions should be 1 hour in duration, level 300 or higher. You can use best practices, case studies, demos, chalk talks, etc.

Planned Schedule (Friday, August 17)

The second day is intended to build on the first with workshops, allowing attendees to get hands-on experience with the applications.

8:00AM – 9:00AM – Check-in/Breakfast/Networking

9:00AM – 11:50AM – Workshops

12:00PM – 12:50PM – Lunch/Networking

1:00PM – 3:50PM – Workshops

Workshop sessions should be 3 hours in length and relate to material covered in the sessions on day one.  If you would like to submit a workshop session, please ALSO submit a single-hour session for the first day.

The session submission deadline is Friday, July 13, 2018.  We will announce the speaker list and alternates on Monday, July 16, 2018.

If you have questions, please contact stuart.ainsworth@azuredatafest.com