It’s an early start today here in Belgium as I prepare for my first public sessions as a speaker.
As I mentioned previously, I will be presenting a session on Saturday at Power User Days in Belgium on optimising large data sets in Power BI with aggregations (https://sessionize.com/app/speaker/event/details/1667). The crazy news is that I will now also be presenting a second session, this one focused on handling the most awkward of files: the multiformat flat file in Mapping Data Flows! Join me on Saturday to find out how to handle this really awkward ETL pattern. Sorry that you have to hear me droning on for multiple sessions, but I promise it will be worth it!
I am really happy to announce that I will be presenting a session at Power User Days in Belgium next month, on Saturday September 14th. This will be the first conference session I have ever presented, and I am diving in at the deep end by delivering a hands-on session on how to optimise reporting in Power BI against large data sets using aggregations. I will be demonstrating this with a few billion rows in DirectQuery to highlight how aggregations are a real game changer.
For more details and to register for this event for free, please visit https://poweruserdays.com/?fbclid=IwAR2ORE5uwXUYu31dXZ-N4i7e_BnlPWqBu5dUCwXfAGdeTL5vKIj587Tf4pA. Don’t forget to bring your laptop, as I will be getting you to build along with me! There might even be a competition for who can build the best dashboard using the data set provided!
As development continues on ADLS Gen2, Power BI now has a beta connector available. It can be found in the Azure folder of the Get Data experience.
To connect to your storage account, you will need to provide the DFS URL of your account.
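As a rough illustration, the DFS endpoint follows a predictable pattern based on the storage account name (the account name below is a hypothetical placeholder):

```python
# Build the DFS endpoint URL for an ADLS Gen2 storage account.
# "mystorageaccount" is a hypothetical account name for illustration.
account_name = "mystorageaccount"
dfs_url = f"https://{account_name}.dfs.core.windows.net"
print(dfs_url)  # https://mystorageaccount.dfs.core.windows.net
```

This is the URL you paste into the connector dialog when prompted.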
If the right permissions are not in place, you may see some odd errors once you have signed in with your organisational account. For example, I saw this error:
To rectify this, I went to the Access Control tab in the Azure portal for my storage account and assigned one of the three Storage Blob data roles (Storage Blob Data Reader, Storage Blob Data Contributor or Storage Blob Data Owner) to the account I used to authenticate.
Once this has been done (it can take up to five minutes for the roles to propagate in larger organisations), you will be able to see the contents of your storage account and create reports based on data stored in ADLS Gen2.
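If you prefer scripting to clicking through the portal, the same role assignment can be done with the Azure CLI. This is only a sketch: the subscription ID, resource group, account name and user principal below are all placeholders you would replace with your own values.

```shell
# Assign the Storage Blob Data Reader role at the storage-account scope.
# All identifiers here are placeholders for illustration.
az role assignment create \
  --role "Storage Blob Data Reader" \
  --assignee "user@contoso.com" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<account-name>"
```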
Thanks to Adam Pearce, a super helpful Senior Consultant here at Altius who found the fix for the connection error.
Although still in its infancy, there are already a lot of good resources out there for Mapping Data Flows in Azure Data Factory. I will try to keep this post updated as more blogs appear.
Mark Kromer: https://kromerbigdata.com/tag/mapping-data-flows/
Cathrine Wilhelmsen: https://www.cathrinewilhelmsen.net/tag/data-flows/
Azure Documentation: https://docs.microsoft.com/en-us/azure/data-factory/concepts-data-flow-overview
SQL Player: https://sqlplayer.net/tag/adfdf/
List of videos: https://github.com/kromerm/adfdataflowdocs/tree/master/videos
Hopefully this is a useful reference. Let me know of any more in the comments.