This one has been on my to-do list for a while, so it’s far from breaking news. I use these posts as a resource to gauge how we’re doing with our presentation content. Typically these posts don’t generate a lot of feedback, but it is always welcome.
At TechReady 9, I presented the following sessions:
The MCS Application Compatibility Strategy Engagement: Predicting the Unpredictable
Application Compatibility Remediation: The Dark Magic of Fixing Broken Applications
Application Compatibility – Tools discussion (Part One – AppDNA and ACT)
Application Compatibility – Tools discussion (Part Two – ACT and ChangeBase)
What EVERYONE should know about Application Compatibility (with Aaron Margosis)
Not for the Faint of Heart: Hard Core App Compat Debugging (with Gov Maharaj)
How to Navigate the Microsoft Services Application Compatibility Offerings and Deploy Windows 7 at Your Customers
The rating criteria were:
Q1 – The speaker was knowledgeable about the subject matter.
Q2 – The speaker’s presentation skills helped me better understand the material.
Q3 – The content was effective in achieving the stated learning objectives.
Q4 – The demonstration(s) were effective.
Q5 – This session builds skills improving how I sell, market, and/or provide services to our customers and partners.
Q6 – The knowledge/skills I gained are relevant to my role.
Q7 – This readiness solution was worth the investment of my time.
Q8 – I will recommend this readiness solution to others.
Q9 – Overall, I was satisfied with this Session/Chalk Talk.
Q10 – Content Level: Please rank the level of technical information actually delivered.
Q11 – Please provide additional constructive comments, suggestions, feedback.
QAvg – Average for all questions combined
Here are the rankings. I tend to vacillate between displaying rankings as raw numbers or as percentages. In this case, I have enough of a distribution that the raw numbers could be misleading, so I’m using percentages:
People are interested in both the technology and the business approach. My two top sessions? A technology session, and a strategy session that didn’t include a single demo (which is incredibly odd for me). This was one of my areas targeted for improvement, and it seems as if we’re finally getting that story put together.
Hard core debugging is still a polarizing session. It continues to generate one of two pieces of feedback: either someone thinks it’s the best thing ever, or they think it’s impossibly obtuse and not useful. There is nothing in between. I threw in a demo this time of “debugging without a debugger,” which has traditionally been really powerful with the folks who actually do the work, but for a general audience it fell flat. They wanted to see hex, apparently. 🙂
Don’t be depressing. Our “what everyone should know” session was still in the top half, but boy, those scores aren’t at all what I like to see. We raced through topic after topic, painting a horribly depressing picture in an effort to be “real.” But application compatibility isn’t an impossibly grim problem – you actually can do it. We left no hope, only despair, and it turns out people don’t really like that much. Do-over.
Don’t leave your technical guys behind. I did a pair of sessions – one with AppDNA and the other with ChangeBase. AppDNA sent their CEO and their CTO, and their session is lit up in green. ChangeBase’s technical director didn’t make it, leaving only their director of sales, and as a result they are awash in a sea of red and orange. Technical audiences like hearing it straight from the geeks’ mouths.
We need to keep our remediation sessions current. I did a minor revision of this content for TechReady 9, and have a major overhaul in store for the 2010 conference season. It’s still getting consistently high scores, which tells me to keep it current and continue to include it.