Uploading Multiple RDL files to SSRS

My QA folks have a problem; development has been migrating a bunch of reports from a legacy third-party application to SQL Server Reporting Services. Development finished their “sprint” (*cough* waterfall), handed 52 reports over to QA, and told them to load them into their QA server. The problem? The web interface only loads one report at a time.

Me, being the DevOps guy that I am, thought “well, that’s just silly; there has to be a way to upload all of the reports at once, rather than clicking on some silly buttons over and over again”. I googled, and the best explanation I found was at this blogpost: https://basic-ssrs.blogspot.com/2015/12/upload-multiple-rdl-files-from-local.html (Kudos; this is a VERY well written walk-through).

Unfortunately, I did exactly what the blog suggested, and ran into two problems. The first? Running the .bat file opened and closed a command window in seconds. No feedback. Nothing. I went back and made a minor change to the .bat file; namely, I added the PAUSE command to the end:

"E:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\rs.exe" -i C:\report_upload\Upload_Multiple_RDL_files.rss -s "http://localhost/reportserver" -v FILE_NAME="C:\report_upload\ConfigurationFile.txt"
pause
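As an aside, another way to keep the output on screen (instead of PAUSE) is to skip the .bat entirely and call rs.exe from a PowerShell window that's already open. This is just my own variation on the same command; the paths are the ones from the .bat above:

& 'E:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\rs.exe' `
    -i C:\report_upload\Upload_Multiple_RDL_files.rss `
    -s 'http://localhost/reportserver' `
    -v "FILE_NAME=C:\report_upload\ConfigurationFile.txt"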

Adding the PAUSE is what let me discover the second (more important) issue; I ran the .bat file again, and got the following error:

Unhandled exception:
The definition of this report is not valid or supported by this version of Reporting Services. The report definition may have been created with a later version of Reporting Services, or contain content that is not well-formed or not valid based on Reporting Services schemas. Details: '.', hexadecimal value 0x00, is an invalid character. Line 2563, position 10.

Ummmmm, what?

Back to the Bing. I found another article that describes basically the same process, but at the bottom of the comment section, there was this little gem:

Hello, I was getting the invalid content error:

“...or contain content that is not well-formed or not valid based on Reporting Services schemas. Details: '.', hexadecimal value 0x00, is an invalid character...”

Made some research and found that arrays in vb are created as N+1 long size, and filled with 0x00, so you need to change the script to:

definition = New [Byte](stream.Length - 1) {}

 to get it working.

https://www.mssqltips.com/sqlservertip/3255/sql-server-reporting-services-ssrs-rsexe-utility/

BINGO! VB array declarations take an upper bound rather than a length, so New [Byte](stream.Length) allocates one extra element that stays 0x00, which is exactly the invalid character the parser was choking on. I went back to the .rss file and changed that section of code to:

' Deploying the Reports
Try
    Dim stream As FileStream = File.OpenRead(fileFullPath)
    Dim NameWithExt As String = fileFullPath.Replace(path, "")
    Dim NameOnly As String = NameWithExt.Replace(".rdl", "")
    definition = New [Byte](stream.Length - 1) {}
    stream.Read(definition, 0, CInt(stream.Length))

Ran the .bat, and in just a few minutes, I had a ton of reports loaded into my target folder on the report server. Now I just gotta fix the data sources.

Edit: based on a comment below, I've embedded the entire revised script.

' Script Starting Point
' Script to deploy reports to a report server
' EXECUTE VIA COMMAND LINE
' Original script: https://basic-ssrs.blogspot.com/2015/12/upload-multiple-rdl-files-from-local.html
' Upload_Multiple_RDL_files.rss

Dim definition As [Byte]() = Nothing
Dim warnings As Warning() = Nothing

Public Sub Main()

    ' Variable declaration
    Dim TextLine As String = ""
    Dim LocalDir As String = ""
    Dim ServerDir As String = ""
    Dim definition As [Byte]() = Nothing
    'Dim warnings As Warning() = Nothing

    ' Read the configuration text file and assign the values to variables
    If System.IO.File.Exists(FILE_NAME) = True Then
        Dim objReader As New System.IO.StreamReader(FILE_NAME)
        Do While objReader.Peek() <> -1
            TextLine = objReader.ReadLine()
            Dim parts As String() = TextLine.Split(New Char() {","c})
            LocalDir = parts(0)
            ServerDir = parts(1)

            Dim path As String = LocalDir
            Dim fileEntries As String() = Directory.GetFiles(path)
            Dim fileFullPath As String = ""
            For Each fileFullPath In fileEntries

                ' Deploying the Reports
                Try
                    Dim stream As FileStream = File.OpenRead(fileFullPath)
                    Dim NameWithExt As String = fileFullPath.Replace(path, "")
                    Dim NameOnly As String = NameWithExt.Replace(".rdl", "")
                    ' VB array bounds are the upper index, so Length - 1 gives a buffer of exactly Length bytes
                    definition = New [Byte](stream.Length - 1) {}
                    stream.Read(definition, 0, CInt(stream.Length))
                    stream.Close()

                    warnings = rs.CreateReport(NameOnly, ServerDir, True, definition, Nothing)

                    If Not (warnings Is Nothing) Then
                        Dim warning As Warning
                        For Each warning In warnings
                            Console.WriteLine(warning.Message)
                        Next warning
                    Else
                        Console.WriteLine("Report: {0} PUBLISHED!", NameOnly)
                    End If

                Catch e As IOException
                    Console.WriteLine(e.Message)
                End Try
            Next fileFullPath
        Loop
    Else
        Console.WriteLine("File Does Not Exist")
    End If

End Sub

'End of the Script
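For reference, the ConfigurationFile.txt that gets passed to the script is just comma-separated pairs, one per line: the local folder holding the .rdl files, then the target folder on the report server. These paths are made-up examples, not my actual folders:

C:\report_upload\SalesReports,/QA/Sales
C:\report_upload\FinanceReports,/QA/Finance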

Checking for #RedGate SCA databases

We're starting a large-scale conversion of our development processes on some very old database servers. Some of these databases were developed over 10 years ago and were never checked into source control. As our team develops new features, they've been using Red Gate's SQL Change Automation to develop database projects alongside their applications; it's an impressive tool, and we're getting ready to tackle the older databases in an effort to improve our code base.

But first, we need to determine which databases are already in source control and which ones are not. Most of these databases and applications are minimally documented, so most of our development team doesn't know what's actually in production. To discover which databases are already part of SCA, I created a simple SQL script. It looks for the existence of the [dbo].[__MigrationLog] table, an artifact of the SCA deployment process. From that table I can also tell when a deployment last completed and which version was applied.

CREATE TABLE #tmp (dbname varchar(1000), completedate datetime, version varchar(200))

DECLARE @sql varchar(MAX)

SET @sql = 'USE [?];
IF EXISTS (SELECT * FROM sys.tables t WHERE t.name = ''__MigrationLog'')
    SELECT DB_NAME() AS dbName, MAX(complete_dt) AS completeDate, MAX(version) AS version
    FROM dbo.__MigrationLog
'

INSERT INTO #tmp
EXEC sp_msforeachdb @sql

SELECT *
FROM #tmp

DROP TABLE #tmp
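Since we have more than one server to check, the same test can be swept across a list of instances from PowerShell. This is a rough sketch, assuming the SqlServer module is installed; the server names are placeholders for your own list:

# Check every database on every instance for the SCA __MigrationLog table
$servers = @('SQLPROD01', 'SQLPROD02')   # placeholder instance names

$query = @"
IF EXISTS (SELECT * FROM sys.tables t WHERE t.name = '__MigrationLog')
    SELECT DB_NAME() AS dbName, MAX(complete_dt) AS completeDate, MAX(version) AS version
    FROM dbo.__MigrationLog;
"@

foreach ($server in $servers)
{
    $databases = Invoke-Sqlcmd -ServerInstance $server -Database master -Query 'SELECT name FROM sys.databases'
    foreach ($db in $databases)
    {
        Invoke-Sqlcmd -ServerInstance $server -Database $db.name -Query $query |
            Select-Object @{ Name = 'server'; Expression = { $server } }, dbName, completeDate, version
    }
}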

Adventures in #CoWorking

So, my experiment in coworking has been fun, and I think I’m going to continue it. The biggest benefit has been the fact that when I’m done in the afternoons, I pack up my laptop and go home. Most nights, I don’t even unpack my bag; my day ends at a reasonable hour, and I’m more involved with family when I need to be.

As I posted previously, I had originally started working out of a place in Suwanee, Georgia (about 20 minutes from the house; 30-45 with traffic). CEO Centers Flex space is a nice facility, and the day-to-day staff is fine. There are very few amenities, though, and it's one wide-open space. The problem is that I take a lot of calls... and I mean, a LOT of calls. Wide-open spaces are apparently not conducive to lots of phone calls, so about two months into my stay, I came into the office to find the following note (prominently placed in front of my seat... and only my seat).

Whoops. Time to find a new home.

PARADIGM WORKHUB

I started looking, and luckily, a new workspace had opened up that was actually closer to home and was 10 bucks cheaper a month. It also had corners, and a door that I could close. Because it’s newly open, not a lot of people are there yet. Score.

#NeverStopLearning 10 Minutes of Code: PS Array .Add

One of the scripts I've been working through in my spare time is a script to identify which servers I can't reliably run PowerShell on remotely. We manage a large domain of legacy servers, and it's been sliced into smaller segments with some complex routing; it's a bit of a nightmare, but I figured I could probe each computer and run some basic PS to see if it responds.

However, we have a LOT of servers, and some of them fail even a basic ping test. Why waste resources testing PS commands against boxes I can't even reach? My solution? Run a ping test first and build two arrays: one for servers that answer a ping, and one for servers that don't.
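The loop below assumes that $computers already holds the server list and that both arrays exist as ArrayLists (the .Add method fails on a plain fixed-size array). A minimal setup sketch; the Get-ADComputer call is just an illustration, so substitute however you build your own list:

# Build the server list (any collection of objects with a .Name property works)
$computers = Get-ADComputer -Filter *

# ArrayLists support .Add without rebuilding the array on every insert
$aNoPing    = New-Object System.Collections.ArrayList
$aComputers = New-Object System.Collections.ArrayList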

foreach ($computer in $computers)
{
    # Ping test first; if it fails, add the computer to the $aNoPing array with the .Add method
    if (-not (Test-Connection -ComputerName $computer.Name -Count 1 -Quiet))
    {
        [void]$aNoPing.Add($computer)
    }
    else # It pinged; add it to the $aComputers array for the rest of the tests
    {
        [void]$aComputers.Add($computer)
    }
}

# Generate some output; these boxes need to be looked at.
Write-Output ""
Write-Output "The following servers cannot be reached by ping."
Write-Output "------------------------------------------------"
$aNoPing
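The step this is all building toward, actually testing remoting on the boxes that did ping, isn't shown above. Here's a minimal sketch of how that check might look; Test-WSMan is one way to probe the WinRM listener, and the error handling is bare-bones:

# Check WinRM on each pingable server; collect the ones that don't answer
$aNoRemoting = New-Object System.Collections.ArrayList
foreach ($computer in $aComputers)
{
    try
    {
        # Test-WSMan throws if the WinRM listener doesn't respond
        Test-WSMan -ComputerName $computer.Name -ErrorAction Stop | Out-Null
    }
    catch
    {
        [void]$aNoRemoting.Add($computer)
    }
}
$aNoRemoting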

Moving from #WFH to #CoWorking

I've been a remote worker for the last 15 years of my career, and I've often bragged about how efficient it was to work from home. I could do all sorts of household chores as well as manage my business needs, I didn't need to spend three hours commuting round trip each day, and I was always available when needed. It's a great time to work remote, and my company has a good culture for it.

Unfortunately, working from home also led to two major issues for me: distractions during the day, and the never-ending workday. The distractions were subtle; sometimes it was easier to just zone out rather than deal with the mountain of paperwork and conference calls awaiting me. The never-ending day often resulted from those distractions; because my work was at my house, it was easy to put off family time and just keep working.

My new office space https://ceocentersflex.com/coworking-space/

I’m trying something radical; I’ve started coworking. I didn’t want to drive three hours to the corporate office, so I’ve found a space that was 20 minutes from my house. I rent a table, have faster internet, and I pack in and out each day. It gives me the opportunity to be around people (so I feel guilty hauling a TV screen in) as well as a definitive end to my day.

It's been an interesting experiment so far; listening to other people's conference calls and meetings has been the biggest change (and challenge), but in general, I'm getting used to it. I'll talk more about my gear in a future post, but for the most part, I've minimized my work footprint.

#SQLSATBR: Database People and #DevOps

Excited to announce that I was chosen to present my session “Database People and DevOps: The Fundamentals” at SQLSaturday Baton Rouge 2019 this August. Very excited to head back close to home this year; I actually attended graduate school at LSU for a year before transferring to UGA, so the campus holds a special place in my heart. SQLSaturday Baton Rouge appears to have grown a lot since the last time I was there, so I'm hoping I can pick up some ideas for our event in 2020.

This is a fun session for me, and I've got some revisions to make after delivering it in Atlanta. I hope folks find it informative, and I give lots of references for further reading. It's a survey session, which means I cover a lot of topics at a high level, but the goal is to build a framework for future study.

Y’all come.

#NeverStopLearning 10 Minutes of Code : PS Select-Object

For a database person, this particular cmdlet has a lot of potential. I started monkeying around with Get-ComputerInfo and was overwhelmed by the number of properties it returns. I only wanted a few of them, so I started looking at Select-Object. By piping the output of one command into Select-Object, I can return only the properties I'm interested in (and in a tabular format).

Get-ComputerInfo | Select-Object CsCaption, CsModel

I’m trying to develop a script that will connect to every server in a given list, and validate if I can execute PowerShell remotely on it. I’m getting there. Get-ComputerInfo is too heavy (it reads a lot more information than I need), but being able to pipe the output of a command will help me immensely.
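As an example of a lighter-weight probe (my own sketch, and not necessarily what I'll end up using), Get-CimInstance against a standard CIM class returns far less data and still pipes cleanly into Select-Object:

# Grab just the computer name, OS caption, and last boot time
Get-CimInstance -ClassName Win32_OperatingSystem |
    Select-Object CSName, Caption, LastBootUpTime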

More to come.

#NeverStopLearning: PS Comments, Variables, IF, Remove-Item, Write to a file

Bunch of stuff today in 10 minutes; again, this is my collection of notes, so the actual blog post may be less glamorous than the excitement I felt. PowerShell appears to be a very powerful scripting language, and the more I use it, the more I like it. However, like most coding languages, it has its own syntax quirks. For example, I finally had to give up and Google how to write a comment.

#This is a single line comment in PowerShell
<#This is a 
multi-line comment in Powershell #>

Comments are useful as scripts get longer. For my sample script, I wanted to try a few things today:
1. Write something to a text file
2. Do some conditional logic
3. Delete a file
4. Declare and use a variable

Script below, and explanations later:

#Define a variable
$File = "C:\list.txt"

#the following code will drop the file if it exists
if (Test-Path $File) {
    Remove-Item $File
}

Get-ChildItem -Path C:\ | Set-Content -Path $File

The code is straightforward, but one thing may trip up experienced programmers: there's no type declaration for the variable. Everything in PowerShell is an object, which is why the variable is so simple to declare and re-use. Furthermore, when piping the output of Get-ChildItem to a file, it knows to write it as text. This may confuse me later, but for now, I grok it.
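A quick way to see the "everything is an object" idea in action; this is my own aside, not part of the original ten minutes:

$File = "C:\list.txt"
$File.GetType().Name       # String

$items = Get-ChildItem -Path C:\
$items.GetType().Name      # Object[] -- an array of file and folder objects
$items[0].GetType().Name   # DirectoryInfo or FileInfo, depending on what's first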

#NeverStopLearning: 10 Minutes of Code – PS Get-ChildItem

In my last post, I started working on a very simple script to explore files in a directory using PowerShell. Get-ChildItem lets me list the files and folders in my user directory, but what if I want to start with a different directory? That's where the -Path parameter comes in.

Get-ChildItem -Path C:\

gives me a listing of all of the files and folders at the root of my C drive. Lots more to come; this is just the beginning of my journey.
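For instance (just scratching the surface here), the standard -Filter and -Recurse parameters narrow the search and walk subdirectories; this ties back nicely to the pile of report files from earlier:

# Find every .rdl file anywhere under C:\report_upload
Get-ChildItem -Path C:\report_upload -Filter *.rdl -Recurse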