Credential errors in Synapse On-demand pools

A few weeks ago I created a data lake in Azure and filled it with some CSV files. Then I spun up a Synapse Analytics workspace and queried the CSV files with an Azure Synapse Analytics on-demand pool via Synapse Analytics Studio. This works great - if you haven’t tried running SQL on text files in Azure Data Lake, stop reading and go check it out. Next, I created a database in the on-demand pool and added a view to it, wrapping the OPENROWSET SELECT statement. That view can now be used in, say, Power BI or other tools that can connect to the on-demand pool endpoint.
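To make the pattern concrete, here is a minimal sketch of the two steps above. The storage account, container, and file path are hypothetical placeholders - substitute your own data lake URL:

```sql
-- Query CSV files in the data lake straight from the on-demand (serverless) pool.
-- Account name, container and path below are made-up examples.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://mydatalake.dfs.core.windows.net/files/sales/*.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0',
    HEADER_ROW = TRUE
) AS rows;

-- Wrap the same query in a view inside a database created in the on-demand pool,
-- so tools connecting to the endpoint can treat it like an ordinary table:
CREATE VIEW dbo.Sales AS
SELECT *
FROM OPENROWSET(
    BULK 'https://mydatalake.dfs.core.windows.net/files/sales/*.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0',
    HEADER_ROW = TRUE
) AS rows;
```

Power BI can then point at the on-demand pool's SQL endpoint and query dbo.Sales directly, while the data itself stays as plain CSV in the lake.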

Power BI announcements from MBAS

Microsoft Business Applications Summit (MBAS) turned out to be a veritable goldmine for Power BI. The announcements are out in force, and Marc Lelijveld (Twitter|Blog) has penned an excellent summary of the features. I’d like to give my two cents on two of the features I personally find the most exciting: hybrid tables and streaming datasets.

Hybrid Tables

Let’s start with hybrid tables - they’re what we’ve been wishing for ever since Direct Query and Composite Models came out. This will give us the ability to combine imported data with Direct Query data in a seamless fashion. I have a use case for it right now: I have an application that logs a lot of data from an integration platform.

New Hosting

Thanks to Kendra Little’s blog post Moving from Wordpress to an Azure Static Site with Hugo I was inspired to try the same. Since I’ve already experimented with Hugo for some time, the move to Azure Static Sites was dead simple - and I love the GitHub integration. I save my markdown file, I push to GitHub, and a few minutes later my changes are up there. Fantastic!

T-SQL Tuesday #135 - tools of the trade

It’s T-SQL Tuesday! I’m trying to get back on the blogging bandwagon, and for me, that’s about as fun as pulling teeth. I have the utmost respect for the people who can blog all day and at the same time make it look easy (I know it isn’t), but me, I just have to slog through. T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter). The first T-SQL Tuesday invitation went out from Adam in December 2009, and it has been a monthly blog party on the second Tuesday of each month ever since. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts.

The Curious Case of Live Demos

I’ll go straight to the point: I think live demos in technical sessions are a waste of time. No, no, hear me out, I’ll explain what I mean. Even more importantly, I’m curious to hear dissenting views. I’ll start with a little bit of background so you’ll understand where I’m coming from. I’m a Microsoft Certified Trainer, and I’ve been training people professionally for over 20 years. For me, it’s all about the penny dropping for the learner. To put it simply: if you don’t get what I’m trying to teach you, that’s on me. That’s on me, and I need to do better to help you understand.

T-SQL Tuesday #134 - give me a break!

It’s T-SQL Tuesday! It has been a long time since I last participated, but this month really struck a chord. T-SQL Tuesday is the brainchild of Adam Machanic (Blog | Twitter). The first T-SQL Tuesday invitation went out from Adam in December 2009, and it has been a monthly blog party on the second Tuesday of each month ever since. Currently, Steve Jones (Blog | Twitter) organises the event and maintains a website with all previous posts. Everyone is welcome to participate.

The Ask

This month’s T-SQL Tuesday is hosted by James McGillivray (Blog | Twitter).

Speaking at Data Platform Discovery Day

I just had an absolute blast presenting at the Data Platform Discovery Days - both the European and the US editions! For the US edition I presented “Azure Machine Learning for the Absolute Beginner”, a session looking at machine learning in general, walking through Azure Machine Learning, and giving several examples of machine learning in action - in both expected and unexpected places! The European edition asked for “Building an Empire - Implementing Power BI Step by Step”, a session on Power BI, datasets and dataflows.

The Tech of Knee-Deep in Tech, 2020s edition - part 4

In part 1 of this blog series I outlined the gear we use to record the podcast. The second part was all about actually recording content. The third was a wall of text on post-processing. After getting back from Oslo and the Nordic Infrastructure Conference, it is now time to finish off the series with an outline of how I publish and push the podcast on the different social media platforms.

Gear | Recording | Post Processing | Publication

Base camp

After working off the blog for a bit, we started using the podcasting platform Pippa in January of 2018. (Pippa was subsequently absorbed into Acast.)