Ah, summertime; time for the annual “community crisis” for the Professional Association for SQL Server. I’ve tried to steer clear of controversies for the last couple of years, but it’s very hard to be a member of such a passionate group of professionals and not have an opinion on the latest subject du jour. The short form of the crisis is that there are questions about how and why sessions got selected to present at the highly competitive Summit this year (disclaimer: I got selected to present this year). For more details, here are a few blog posts on the subject:
The point of my post is not to rehash the issue or sway your opinion, dear reader, but rather to focus on a single tiny step in the right direction that I’ve decided to make. One of the big issues that struck me about the whole controversy is the lack of a repeatable, objective tool for speaker evaluations. As a presenter, I don’t always get feedback, and when I do, the feedback form varies from event to event, meeting to meeting. Selection committees are forced to rely on my abstract-writing skills and/or my reputation as a presenter; you can obfuscate my identity on the abstract, but it’s tough to factor in reputation if you do that.
While I agree that there are questions about the process that should be asked and ultimately answered, there’s very little that I can do to make a difference in the way sessions get selected. However, as a presenter, and a chapter leader for one of the largest chapters in the US, I can do a little something.
- I am personally committing to listing every presentation I make on SpeakerRate.com, and soliciting feedback on every presentation. To quote Bleachers, “I wanna get better”.
- I will personally encourage every presenter at AtlantaMDF to set up a profile on SpeakerRate and solicit evaluations for all presentations going forward.
- We will find ways to make feedback electronic and immediate at the upcoming Atlanta SQLSaturday so that presenters can use that information going forward.
- I will champion the evaluation process with my chapter members and speakers, and continue to seek out methods to improve and standardize the feedback process.
Do I have all of the right answers? No. For example, SpeakerRate.com seems to be barely holding on to life; it has no mobile interface, and the lack of commitment from its members suggests that the site is dying a slow death. However, I haven’t found an alternative that provides a standard, uniform measure of presentation performance.
Do I think this will bring a major change to the PASS Summit selection process? Nope. But I do think that a sea change has to start somewhere, and if enough local chapters get interested in building a culture of feedback and evaluation, that could begin to flow up to the national level.
Hi Stuart-
You may want to check out Lanyrd (http://lanyrd.com). They are growing as a conference/event exchange and have speaker profiles available.
Interestingly, PASS Summit is not listed under their SQL Server events, but SQLBits, a few SQL Saturdays, and Live360 are. Maybe this is an example to point PASS HQ towards.
Regarding feedback, I have listed where I have spoken this year on SpeakerRate before each event, right here: http://bit.ly/1mHnmqO. You will notice I have yet to get any feedback, although I have been to several SQL Saturdays, user groups, and other events. I started posting on Lanyrd and I have yet to get any feedback there either. I am trying to get better, so I actively solicit feedback during my presentations the old-school way, with paper; that works. Because I want to get better, I read all of those feedback sheets and address anything that is listed. I don’t know whether any feedback for my talks has reached PASS at all.