Checking for #RedGate SCA databases

We’re starting a large-scale conversion of our development processes with some very old database servers. Some of these databases were developed over 10 years ago, and they were never checked into source control. As our team develops new features, they’ve been using Red Gate’s SQL Change Automation to develop database projects alongside their applications; it’s an impressive tool, and we’re getting ready to tackle older databases in an effort to improve our code base.

But first, we need to determine which databases are already in source control and which ones are not. Most of these databases and applications are minimally documented, so most of our development team doesn’t know what’s actually in production. To discover which databases are already part of SCA, I created a simple SQL script. It looks for the existence of the [dbo].[__MigrationLog] table, an artifact of the SCA process. I can then tell when a snapshot was last deployed, and which version was deployed.

CREATE TABLE #tmp (dbname varchar(1000), completedate datetime, version varchar(200))

DECLARE @sql varchar(MAX)

SET @sql = 'USE ?;
IF EXISTS (SELECT * FROM sys.tables t WHERE t.name = ''__MigrationLog'')
SELECT db_name() dbName, MAX(complete_dt) completeDate, MAX(version) version
FROM dbo.__MigrationLog'

INSERT INTO #tmp
EXEC sp_msforeachdb @sql

SELECT *
FROM #tmp

DROP TABLE #tmp
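Since we have a lot of servers to check, the same script can be run against each instance from PowerShell. This is just a sketch, assuming the SqlServer module (for Invoke-Sqlcmd) is installed; the instance names in $servers are hypothetical placeholders.

```powershell
#a sketch: run the __MigrationLog check across several instances
#assumes the SqlServer module is installed; $servers holds hypothetical instance names
$servers = @("SQLPROD01", "SQLPROD02")

$checkScript = @'
CREATE TABLE #tmp (dbname varchar(1000), completedate datetime, version varchar(200))
DECLARE @sql varchar(MAX)
SET @sql = 'USE ?;
IF EXISTS (SELECT * FROM sys.tables t WHERE t.name = ''__MigrationLog'')
SELECT db_name() dbName, MAX(complete_dt) completeDate, MAX(version) version
FROM dbo.__MigrationLog'
INSERT INTO #tmp EXEC sp_msforeachdb @sql
SELECT @@SERVERNAME serverName, * FROM #tmp
DROP TABLE #tmp
'@

foreach ($server in $servers)
{
    #one result row per SCA-managed database on each instance
    Invoke-Sqlcmd -ServerInstance $server -Query $checkScript
}
```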

Adventures in #CoWorking

So, my experiment in coworking has been fun, and I think I’m going to continue it. The biggest benefit has been the fact that when I’m done in the afternoons, I pack up my laptop and go home. Most nights, I don’t even unpack my bag; my day ends at a reasonable hour, and I’m more involved with family when I need to be.

As I posted previously, I had originally started working out of a place in Suwanee, Georgia (about 20 minutes from my house; 30-45 with traffic). CEO Centers Flex space is a nice facility, and the day-to-day staff is fine. There are very few amenities, though, and it’s a wide open space. The problem is that I take a lot of calls…. and I mean, a LOT of calls. Wide open spaces are apparently not conducive to lots of phone calls, so about two months into my stay, I came into the office to find the following note (prominently placed in front of my seat… and only my seat).

Whoops. Time to find a new home.

PARADIGM WORKHUB

I started looking, and luckily, a new workspace had opened up that was actually closer to home and 10 bucks cheaper a month. It also had corners, and a door that I could close. Because it’s newly opened, not a lot of people are there yet. Score.

#NeverStopLearning 10 Minutes of Code: PS Array .Add

One of the scripts I’ve been working through in my spare time is a script to identify which servers I can’t reliably run PowerShell on remotely. We manage a large domain of legacy servers, and it’s been sliced into smaller segments with some complex routing; it’s a bit of a nightmare, but I figured I could probe each computer and run a basic PS script to see if it responds.

However, we have a LOT of servers, and some of them fail even a basic ping test. Why waste resources testing PS commands when I can’t even reach those? My solution? Run a ping test first, and build two arrays: one for servers that respond to ping, and one for those that don’t.

#initialize ArrayLists so the .Add method is available
$aNoPing = New-Object System.Collections.ArrayList
$aComputers = New-Object System.Collections.ArrayList

Foreach ($computer in $computers)
{
    #ping test first; if it fails, add it to $aNoPing with the .Add method
    if (-not (Test-Connection -ComputerName $computer.Name -Count 1 -Quiet))
    {
        [void]$aNoPing.Add($computer)
    }
    else #it pinged; add it to $aComputers for the rest of the test
    {
        [void]$aComputers.Add($computer)
    }
}
#generate some output; these boxes need to be looked at.
Write-Output ""
Write-Output "The following servers cannot be reached by ping."
Write-Output "------------------------------------------------"
$aNoPing
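With the pingable servers collected in $aComputers, the rest of the test can probe each one for remoting. This is just a sketch of where I’m headed: Invoke-Command with a trivial script block is one way to prove remoting works, and the try/catch structure here is my own, not part of the original script.

```powershell
#a sketch: for each pingable server, try a trivial remote command
foreach ($computer in $aComputers)
{
    try
    {
        #if remoting works, this returns the remote computer's name
        Invoke-Command -ComputerName $computer.Name -ScriptBlock { $env:COMPUTERNAME } -ErrorAction Stop
    }
    catch
    {
        #remoting failed even though ping succeeded; flag for follow-up
        Write-Output "PowerShell remoting failed on $($computer.Name)"
    }
}
```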

Moving from #WFH to #CoWorking

I’ve been a remote worker for the last 15 years of my career, and I’ve often bragged about how efficient it was to work from home. I could do all sorts of household chores, as well as manage my business needs. I didn’t need to spend three hours commuting round trip each day, and I was always available when needed. It’s a great time to work remotely, and my company has a good culture for us.

Unfortunately, working from home also led to two major issues for me: distractions during the day, and the never-ending day. Distractions were subtle; sometimes it was easier to just zone out rather than deal with the mountain of paperwork and conference calls awaiting me. The never-ending day often resulted from those distractions; because my work is at my house, it was easy to put off family time and just keep working.

My new office space: https://ceocentersflex.com/coworking-space/

I’m trying something radical; I’ve started coworking. I didn’t want to drive three hours to the corporate office, so I’ve found a space that was 20 minutes from my house. I rent a table, have faster internet, and I pack in and out each day. It gives me the opportunity to be around people (so I feel guilty hauling a TV screen in) as well as a definitive end to my day.

It’s been an interesting experiment so far; listening to other people’s conference calls and meetings has been the biggest change/challenge, but in general, I’m getting used to it. I’ll talk more about my gear in a future post, but for the most part, I’ve minimized my work footprint.

#NeverStopLearning 10 Minutes of Code : PS Select-Object

For a database person, this particular cmdlet has a lot of potential. I started monkeying around with Get-ComputerInfo, and was overwhelmed with the number of properties returned by this command. I only wanted a few, so I started looking at Select-Object. By piping the output of one command to Select-Object, I can return only the properties I’m interested in (and in a tabular format).

Get-ComputerInfo | Select-Object CsCaption, CsModel

I’m trying to develop a script that will connect to every server in a given list, and validate if I can execute PowerShell remotely on it. I’m getting there. Get-ComputerInfo is too heavy (it reads a lot more information than I need), but being able to pipe the output of a command will help me immensely.
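One lighter-weight alternative worth noting (a sketch, not part of my original experiment): Get-CimInstance reads a single WMI/CIM class instead of everything Get-ComputerInfo gathers, and it pipes to Select-Object the same way.

```powershell
#query just the Win32_ComputerSystem class instead of the full Get-ComputerInfo sweep
Get-CimInstance Win32_ComputerSystem | Select-Object Name, Model, Manufacturer
```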

More to come.

#NeverStopLearning: PS Comments, Variables, IF, Remove-Item, Write to a file

Bunch of stuff today in 10 minutes; again, this is my collection of notes, so the actual blog post may be less glamorous than the excitement I felt. PowerShell appears to be a very powerful scripting language, and the more I use it, the more I like it. However, as with most coding languages, the syntax has its quirks. For example, I finally had to give up and Google how to write a comment.

#This is a single line comment in PowerShell
<#This is a 
multi-line comment in Powershell #>

Comments are useful as scripts get longer. For my sample script, I wanted to try a few things today:
1. Write something to a text file
2. Do some conditional logic
3. Delete a file
4. Declare and use a variable

Script below, and explanations later:

#Define a variable
$File = "C:\list.txt"

#the following code will drop the file if it exists
IF (Test-Path $File) {
    Remove-Item $File
}

Get-ChildItem -Path C:\ | Set-Content -Path $File

The code is straightforward, but one thing may trip up experienced programmers: there’s no type declaration for the variable. Everything in PowerShell is an object, which is why the variable is so simple to declare and reuse. Furthermore, when piping the contents of Get-ChildItem to a file, PowerShell knows to write them as text. This may confuse me later, but for now, I grok it.
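A quick illustration of that point (my own example, not from the original script): the same variable can hold values of different types, and you can always ask an object what it is.

```powershell
#PowerShell variables are untyped; the same name can hold different types
$x = 5          # $x currently holds an Int32
$x = "five"     # ...and can be freely reassigned to a String
$x.GetType().Name
```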

#NeverStopLearning: 10 Minutes of Code – PS Get-ChildItem

In my last post, I started working on a very simple script to explore files in a directory using PowerShell: Get-ChildItem allows me to find the files and folders in the user directory, but what if I wanted to start with a different directory? That’s where the -Path switch comes in.

Get-ChildItem -Path C:\

gives me a listing of all of the files and folders at the root of my C: drive. Lots more to come; this is just the beginning of my journey.

#NeverStopLearning: 10 Minutes of code

Today, I spent 10 minutes working on PowerShell. I’ve never really spent much time in PowerShell before; I took a few classes a few years ago, but like most things, if you don’t use it, you lose it. I was going to do a Pluralsight course, but frankly, I haven’t been great about following up on those sorts of things, so I figured I’d start with the completely free (no trial expiration) knowledge at the MS Docs site.

I knew most of this stuff, but the point of this exercise is to force myself to start using it again. I spent a few minutes running the ISE, using Get-Help (and updating the help files), and then thought I’d try a quick script. Like most coders, I stole this from the web:

Get-ChildItem | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-180)}

This simple script helped me find the files and folders in my user directory (the current context) that were modified within the last 180 days.  To limit the results to files only, I added an additional clause to the filter:

Get-ChildItem | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-180) -and ! $_.PsIsContainer }
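As a side note, PowerShell 3.0 and later also offer a -File switch on Get-ChildItem that does the same filtering without the PsIsContainer check:

```powershell
#-File limits results to files, so the container check is unnecessary
Get-ChildItem -File | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-180)}
```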

I think I can do this.  More to come.

Fix More, Whine Less

I get paralyzed by the thought of NOT doing something correctly, so I end up doing nothing. I’ve got to work on that. A three-sentence blog post is better than no blog post. Fixing one tiny recurring error is better than just watching the logs fill up. It may be a lost cause, or it may be a moment of hope; whatever.

Just fix something. Something small. A lot of small somethings can make a big something.