
How to Automate Processing of Azure Analysis Services Models

I’ve been working on a proof of concept for a customer that involved using Azure Analysis Services as a cache for some data in an Azure SQL Data Warehouse instance. One of the things I’ve been working on is scheduling automatic processing of the Azure AS database. I found the following documentation on the process, but the screenshots of the Azure portal are out of date and the instructions contain a few errors. I also found this very extensive project for partition management in Azure AS, but that was overkill for my purposes; I was only interested in the basics.


Those resources led me to write this post. Here I’ll walk through creating an Azure Function App to automatically refresh my Azure Analysis Services model, correcting a few errors from the original article and updating the screenshots along the way.

If you’re new to Azure Analysis Services, take a look at this documentation. For the purposes of this post, I’m going to assume you have a basic understanding of Analysis Services.

I created a Tabular Model with a connection to an Azure SQL Database with one table that had a couple columns. I visualized the data in Power BI so I could verify the data was being refreshed after I processed the Azure AS database.


The Steps for Automating Processing of an Azure Analysis Services Model

1. Create an Azure Function App

The first step is to create an Azure Function App. Navigate to the Azure portal and create a Function App.


2. Create a new Function

After you’ve created the Function App, we need to add a new Timer function. Click the + button next to Functions, select Timer, and click Create this function.


3. Configure the Timer

Give your Timer a name by filling in the textbox for Timestamp parameter name. The default name is myTimer, but you can change this. Just make sure you remember what the name of your timer is because we will need it later.


The Schedule text box expects a CRON expression to define the days and times the function should execute. Click the little Documentation button on the screen above to read about CRON expressions. The expression I’m using executes the script every 4 hours, every day. Click Save when you’re done.
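For reference, Azure Functions timer triggers use six-field NCRONTAB expressions that include a seconds field ({second} {minute} {hour} {day} {month} {day-of-week}), not the five-field Linux CRON format. A few example schedules (the first matches the every-4-hours schedule described above):

```
0 0 */4 * * *    every 4 hours, on the hour
0 30 9 * * 1-5   9:30 AM, Monday through Friday
0 */5 * * * *    every 5 minutes
```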

4. Configure the Function App

Download the latest client libraries for Analysis Services. This needs to be done on your local machine so you can then copy these files to your Azure Function App.

After you’ve downloaded the client libraries, the DLLs can be found in C:\Program Files (x86)\Microsoft SQL Server\140\SDK\Assemblies. The two files you need are:

C:\Program Files (x86)\Microsoft SQL Server\140\SDK\Assemblies\Microsoft.AnalysisServices.Core.DLL
C:\Program Files (x86)\Microsoft SQL Server\140\SDK\Assemblies\Microsoft.AnalysisServices.Tabular.DLL

This step is important because the documentation in Azure references the 130 assemblies, which will not work. You need the assemblies from the 140 folder, otherwise you’ll get errors.


Select your function, TimerTriggerCSharp1, and expand the View files window on the far right of your screen.

Here you need to add a folder called “bin”. Click Add to do this.

Then click the Upload button to add the two previously mentioned DLLs to the bin folder.


You should see the two DLLs in your bin folder now.


5. Add the connection string for your Azure Analysis Services database

Click the name of your Function App, then select Platform features. Select Application settings under General Settings.


Now we need to add our connection string under the Connection strings section. You’ll need your AAS server name and a user ID and password that has access to the AAS database.

You can find your AAS server name by navigating to your AAS database in the Azure portal and copying the value found under Server name:
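For reference, Azure Analysis Services server names follow this pattern, where the region and server name vary by deployment:

```
asazure://<region>.asazure.windows.net/<servername>
```

For example, a server named myaasserver in the West US region would be asazure://westus.asazure.windows.net/myaasserver.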


Your connection string should look like this:

Provider=MSOLAP;Data Source=<your aas server>;Initial Catalog=<aas database name>;User ID=<your username>;Password=<your password>

Fill in the Name textbox with a name for your connection string (remember this, we’ll need it later) and paste your connection string in the Value text box:


Click Save near the top.

6. Add your code

Select TimerTriggerCSharp1.


To programmatically process the tabular model, we’ll leverage Analysis Services Management Objects (AMO). If you’re new to AMO, I’d suggest starting here.

Paste in the following code (you can also download the .cs file I used with the script here). The highlighted sections of the code represent the pieces you need to change for your own function app. The green text represents the commands you should change based on the type of processing operation you wish to execute. I’m processing the whole model with a ProcessFull command, so I’ve left the first green line uncommented. A note of caution: if you’re copying and pasting my code as an example, check that all the characters pasted correctly so you don’t bang your head against a wall for a couple of hours like Mike did (see the comments).

#r "Microsoft.AnalysisServices.Tabular.DLL"
#r "Microsoft.AnalysisServices.Core.DLL"
#r "System.Configuration"

using System;
using System.Configuration;
using Microsoft.AnalysisServices.Tabular;

public static void Run(TimerInfo <TheNameOfYourTimer>, TraceWriter log)
{
    log.Info($"C# Timer trigger function started at: {DateTime.Now}");
    try
    {
        Microsoft.AnalysisServices.Tabular.Server asSrv = new Microsoft.AnalysisServices.Tabular.Server();
        var connStr = ConfigurationManager.ConnectionStrings["<YourConnectionStringName>"].ConnectionString;
        asSrv.Connect(connStr);     // Connect to the AAS server
        Database db = asSrv.Databases["<YourDatabaseName>"];
        Model m = db.Model;
        db.Model.RequestRefresh(RefreshType.Full);     // Mark the whole model for refresh
        //m.RequestRefresh(RefreshType.Full);     // Mark the model for refresh
        //m.Tables["Date"].RequestRefresh(RefreshType.Full);     // Mark only one table for refresh
        db.Model.SaveChanges();     // Commit, which executes the refresh
        asSrv.Disconnect();
    }
    catch (Exception e)
    {
        log.Info($"C# Timer trigger function exception: {e.ToString()}");
    }
    log.Info($"C# Timer trigger function finished at: {DateTime.Now}");
}



Click the Save button at the top.

7. Time to test your function app!

Click the Run button at the top to test the function app.


After clicking Run, you should see the following results:


And now my Power BI report is updated!



Here are some things to check if you run into trouble.

  1. Verify your connection string is correct.
  2. Verify that you’ve correctly referenced your timer, connection string, and database name in your script.
  3. Make sure you used the latest version of the DLLs.
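If you want to confirm that a refresh actually took effect, one option is to log the database’s last processed time at the end of the try block, right after SaveChanges(). This is just a sketch using the same AMO Tabular objects as the main script; it assumes the db and log variables from that script are in scope, and that db.Refresh() re-reads the metadata from the server before LastProcessed is checked:

```csharp
// Sketch: confirm the refresh by logging when the database was last processed.
// Assumes the db and log objects from the main script, with asSrv still connected.
db.Refresh();   // re-read database metadata from the server
log.Info($"Database {db.Name} last processed at: {db.LastProcessed}");
```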

I had to do a little bit of troubleshooting to get this to work and some of this was trial and error. But keep poking at it and leave a comment below if you have any questions.


Processing Azure Analysis Services with Function Apps:

Automated partition management with Azure Analysis Services whitepaper and sample codes:


9 thoughts on “How to Automate Processing of Azure Analysis Services Models”

    1. That’s a good question. I haven’t tried that yet but I know from experience that you can’t use a SQL Agent Job to process an AAS database using a stored credential, proxy account, and an Analysis Services command. But the SSIS Analysis Services command would be something to try.

  1. Thanks for this write up. It was extremely helpful. There are two areas where I encountered issues implementing based on your logic and sample code. I thought I would share my experience to save someone else some headaches.

    1. Adding the bin folder for the SSAS assemblies – Pressing “Add” through the file maintenance in Azure only created a bin file and did not provide an option to create a bin folder. Attempting to upload the SSAS assemblies then dropped them into the function directory at the same level as run.csx. Opening the site through the diagnostic dashboard in Kudu allowed me to create an actual bin folder. I then went back to the file maintenance pane and was able to upload the SSAS assemblies into the newly created bin folder.

    2. Copying your sample code for run.csx from your blog was a quick shortcut, but I started getting lots of errors, such as “Quoted file name expected”, when I included the assembly references. It turns out that copying the code from the blog changed the double quotes to something the C# editor did not recognize. In the code snippet below, the first entry works, but the quotes are subtly different on the second, pasted entry.

    #r “Microsoft.AnalysisServices.Tabular.DLL”
    #r “Microsoft.AnalysisServices.Tabular.DLL”

    I’ve hit this before but it was about 2 hours of swearing before I remembered the problem with pasting code and identified the culprit. Changing the double quotes cleared up almost all of the compile errors.

    Thx again, hope this helps someone else avoid a couple of wasted hours.


      1. Thanks for the feedback, Mike. This is a challenge with pasting code directly into the blog. I had a LiveWriter plugin that I was using for code snippets but I’ve lost it and can’t seem to find it anymore.
