It’s T-SQL Tuesday! I’m trying to get back on the blogging bandwagon, and for me, that’s about as fun as pulling teeth. I have the utmost respect for the people who can blog all day and at the same time make it look easy (I know it isn’t), but me, I just have to slog through. T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter); the first T-SQL Tuesday invitation went out from Adam in December 2009. It is a monthly blog party on the second Tuesday of each month. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts.
I’ll get straight to the point: I think live demos in technical sessions are a waste of time. No, no, hear me out, I’ll explain what I mean. Even more importantly, I’m curious to hear dissenting views. I’ll start with a little bit of background so you’ll understand where I’m coming from. I’m a Microsoft Certified Trainer, and I’ve been training people professionally for over 20 years. For me, it’s all about the penny dropping for the learner. To put it simply: if you don’t get what I’m trying to teach you, that’s on me, and I need to do better to help you understand.
It’s T-SQL Tuesday! It has been a long time since I last participated, but this month really struck a chord. T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter); the first T-SQL Tuesday invitation went out from Adam in December 2009. It is a monthly blog party on the second Tuesday of each month. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts. Everyone is welcome to participate in this monthly blog post.

The Ask

This month’s T-SQL Tuesday is hosted by James McGillivray (Blog | Twitter).
I just had an absolute blast presenting at the Data Platform Discovery Days - both the European and the US edition! For the US edition I presented “Azure Machine Learning for the Absolute Beginner”, a session looking at machine learning in general, walking through Azure Machine Learning and giving several examples of machine learning in action - in both expected and unexpected places! The European edition asked for “Building an Empire - Implementing Power BI Step by Step”, a session on Power BI, datasets and dataflows.
In part 1 of this blog series I outlined the gear we use to record the podcast. The second part was all about actually recording content. The third was a wall of text on post-processing. After getting back from Oslo and the Nordic Infrastructure Conference, it is now time to finish off the series with an outline of how I publish and push the podcast on the different social media platforms.

Gear | Recording | Post Processing | Publication

Base camp

After working off the blog for a bit, we started using the podcasting platform Pippa in January of 2018. (Pippa was subsequently absorbed into Acast).
In part 1 of this blog series I outlined the gear we use to record the podcast. The second part was all about actually recording content. It’s now time to dive into the third and most technical part – post-processing.

Gear | Recording | Post Processing | Publication

select * from foo into bar

As a quick recap you might remember that the starting point for this step is the raw audio files. I will typically have one file per host plus the recording of the Teams meeting. Let’s start with Teams, as we need a way to extract the audio feed from the video so I can use that for lining up my other audio files.
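One way to pull the audio feed out of a Teams meeting recording (Teams saves meetings as MP4) is with ffmpeg. This is just a sketch of the idea, not necessarily the tool or settings used for the podcast, and the file names are placeholders:

```shell
# Hypothetical file names; assumes ffmpeg is installed.
# -vn drops the video stream; pcm_s16le at 44.1 kHz writes an
# uncompressed WAV that an audio editor can line up against the
# per-host recordings.
ffmpeg -i teams-recording.mp4 -vn -acodec pcm_s16le -ar 44100 teams-audio.wav
```

From there the WAV can be imported alongside the individual host tracks for alignment.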
In part 1 of this blog series I outlined the gear we use to record the podcast. This part will focus on the techniques and settings we use to record an episode.

Gear | Recording | Post Processing | Publication

We share a OneNote file with ideas and information for every episode. We kind of screwed up the episode numbering early on. While we’re technically on episode 104 at the time of this writing, we have recorded several more specials and weirdly numbered episodes. This goes to show that not all ideas are good ideas in the long run and that starting a podcast isn’t easy… We generally try to aim for 30 minutes.
Matthew Roche pinged me on Twitter the other day to ask about the workflow and gear I use to record Knee-deep in Tech. After responding on Twitter I decided it was time to do a new round of “The Tech of Knee-Deep in Tech”. I’ll divide this into several blog posts due to the sheer amount of information.

Gear | Recording | Post Processing | Publication

Let’s kick off the first part of the series – the gear.

Recording on the move

When we’re at the same place to record (which is rare these days) or when we’re out and about and do interviews with people, we use a Zoom H6 recorder and AKG C520/C520L microphones.