Azure Cost Calculations
Example of Microsoft Azure Estimate
Components of Azure Cost
The components of Analytics for LS Central that are set up in Azure, and that the customer pays for, differ slightly between LS Central on-premises and LS Central SaaS. We have therefore set up two cost calculation examples.
Before we move on to the examples, it is worth mentioning that the cost of the Azure SQL database is fixed per month rather than usage based. The charge is, however, prorated daily from the monthly price of the tier you have selected, so scaling the database up or down for a few days raises or lowers the monthly cost accordingly. It is therefore recommended to scale the database up while you run the Initial load, especially if you have a large underlying data set of transactions.
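As a worked example of this daily proration, here is a minimal sketch. The S2 price is taken from the estimate tables below, while the S3 figure is a hypothetical placeholder; check the Azure pricing calculator for real tier prices.

```python
# A minimal sketch of daily prorated billing for DTU tiers.
# S2 is from the estimate tables in this article; S3 is a hypothetical
# placeholder (~2x S2) used only for illustration.
S2_MONTHLY = 73.61   # USD per month
S3_MONTHLY = 147.22  # USD per month, hypothetical

def prorated_cost(days_on_s3: int, days_in_month: int = 30) -> float:
    """Cost for a month where the DB runs S3 for a few days, S2 otherwise."""
    daily_s2 = S2_MONTHLY / days_in_month
    daily_s3 = S3_MONTHLY / days_in_month
    return days_on_s3 * daily_s3 + (days_in_month - days_on_s3) * daily_s2

# Scaling up to S3 for the 3 days of an Initial load:
print(f"${prorated_cost(3):.2f}")  # ~$80.97 for the month instead of $73.61
```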
The cost of the Azure Data Factory is usage based and primarily relates to the integration runtime actions that run when data is moved between tables within the database (LS Central SaaS) or between two databases (LS Central on-premises). Three things affect the number of actions run by the integration runtimes, as the sketch after the list below illustrates:
- Number of source tables
- Number of companies
- Number of times the Scheduled run pipeline is run
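As a rough illustration of how these three factors multiply, consider the sketch below; the table count and per-table action count are hypothetical placeholders, not figures from the Analytics setup.

```python
# A rough sketch of how the three factors multiply into monthly
# integration-runtime actions. All counts are hypothetical placeholders.
SOURCE_TABLES = 40       # hypothetical size of the standard source table list
COMPANIES = 1
RUNS_PER_DAY = 1         # Scheduled run pipeline once every 24 hours
ACTIONS_PER_TABLE = 3    # hypothetical actions per table per run
DAYS_PER_MONTH = 30

monthly_actions = (SOURCE_TABLES * COMPANIES * RUNS_PER_DAY
                   * ACTIONS_PER_TABLE * DAYS_PER_MONTH)
print(monthly_actions)  # doubling companies or runs per day doubles this
```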
The examples below are set up to reflect the scenario of running an update of Analytics data with the Scheduled run pipeline once every 24 hours, for one company, and with the source table list included in the standard setup. All numbers are estimated costs per month.
Example of Azure Cost - Analytics with LS Central on-premises
| Service type | Description | Estimated monthly cost | Estimated upfront cost |
|---|---|---|---|
| Data Factory | Azure Data Factory V2 Type, Data Pipeline Service Type, Azure Integration Runtime: 0 Activity Run(s), 0 Data movement unit(s), 0 Pipeline activities, 0 Pipeline activities – External; Self-hosted Integration Runtime: 15 Activity Run(s), 60 Data movement unit(s), 60 Pipeline activities, 60 Pipeline activities – External, 0 x 8 Compute Optimized vCores, 0 x 8 General Purpose vCores, 0 x 8 Memory Optimized vCores, 1 Read/Write operation(s), 1 Monitoring operation(s) | $29.38 | $0.00 |
| Azure SQL Database | Single Database, fixed cost per month, DTU Purchase Model, Standard Tier, S2: 50 DTUs, 250 GB included storage per DB, 1 Database(s) x 1 Month(s), 5 GB Retention | $73.61 | $0.00 |
| Total | | $102.98 | $0.00 |
Example of Azure Cost - Analytics with LS Central SaaS
| Service type | Description | Estimated monthly cost | Estimated upfront cost |
|---|---|---|---|
| Data Factory | Azure Data Factory V2 Type, Data Pipeline Service Type, Azure Integration Runtime: 9 Activity Run(s), 87 Data integration unit hour(s), 69 Pipeline activities, 20 Pipeline activities – External; Self-hosted Integration Runtime: 0 Activity Run(s), 0 Data movement unit(s), 0 Pipeline activities, 0 Pipeline activities – External, 0 x 8 Compute Optimized vCores, 0 x 8 General Purpose vCores, 0 x 8 Memory Optimized vCores, 0 Read/Write operation(s), 0 Monitoring operation(s) | $31.10 | $0.00 |
| Azure SQL Database | Single Database, fixed cost per month, DTU Purchase Model, Standard Tier, S2: 50 DTUs, 250 GB included storage per DB, 1 Database(s) x 1 Month(s), 5 GB Retention | $73.61 | $0.00 |
| Total | | $104.71 | $0.00 |
The main components in the price calculations
Data Factory
Azure Integration Runtime: 9 Activity Run(s) and 87 Data integration unit (DIU) hour(s) - Each pipeline component that runs is counted as an activity run, and activity runs are priced per thousand, so the 9 stands for 9 thousand runs, which should be ample for Analytics if the pipelines are run once every 24 hours. For the Azure integration runtime, the DIU hours are the largest cost component. They are primarily driven by the number of actions, so the price of each run increases by the factor of companies, and the Data Factory cost increases if the Scheduled run pipeline is run more than once every 24 hours.
Self-hosted Integration Runtime: 15 Activity Run(s) - Each pipeline component that runs is counted, and the 15 stands for 15 thousand runs, which should be ample for Analytics if the pipelines are run once every 24 hours. More frequent updates of the data warehouse will result in a higher number of runs and an increased Data Factory cost.
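To make the arithmetic behind these two line items concrete, here is a minimal sketch. The per-run and per-hour rates are assumptions based on public pay-as-you-go Azure pricing at the time of writing, not figures from this estimate; regional prices differ, so the results only approximate the totals in the tables above.

```python
# A sketch of the Data Factory line items, using assumed list prices (USD).
# Rates are assumptions; negligible items (external pipeline activities,
# read/write and monitoring operations) are omitted for brevity.
def adf_cost(runs_thousands: float, run_rate: float,
             hours: float, hour_rate: float,
             activity_hours: float, activity_rate: float) -> float:
    """Activity runs are billed per 1,000 runs, matching the estimate units."""
    return (runs_thousands * run_rate
            + hours * hour_rate
            + activity_hours * activity_rate)

# Azure integration runtime (SaaS estimate): assumed $1.00 per 1,000 runs,
# $0.25 per DIU-hour, $0.005 per pipeline-activity hour.
azure_ir = adf_cost(9, 1.00, 87, 0.25, 69, 0.005)       # ~ $31.10

# Self-hosted integration runtime (on-premises estimate): assumed $1.50 per
# 1,000 runs, $0.10 per data-movement hour, $0.002 per activity hour.
self_hosted = adf_cost(15, 1.50, 60, 0.10, 60, 0.002)   # ~ $28.62

print(f"Azure IR: ${azure_ir:.2f}, self-hosted: ${self_hosted:.2f}")
```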
Azure SQL Database
S2 - This is the database service tier and performance level, which controls how fast data is uploaded from the data warehouse to the report templates. S2 is what we recommend for a standard Analytics installation with scheduled Power BI report dataset updates every 24 hours. You can adjust this to decrease cost or, if you have more frequent updates of the DW or high throughput, you may want a higher performance level for the Azure database. You can easily scale the database up to S3 or S4 and test the effect this has on performance. You only pay for the more expensive price tiers on the days the database is scaled up.
Note: If you are testing the Analytics update and you create Azure resources that you do not want to immediately delete but which are not in frequent use, then we recommend changing the performance level of the database to S1 to decrease the cost. Going lower, to S0, is only recommended if you want to keep the database without running any updates to Power BI reports. You can always change this for increased performance later, if needed.
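If you prefer to script the tier changes described above, here is a minimal sketch using pyodbc and T-SQL. The server, database, and login names are placeholders, and the connection assumes an admin login with permission to alter the database.

```python
# A minimal sketch of scaling an Azure SQL database between DTU tiers with
# T-SQL. Server, database name, and credentials below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=master;"
    "UID=admin_user;PWD=<password>",
    autocommit=True,  # ALTER DATABASE cannot run inside a user transaction
)

# Scale up before the Initial load (or down to S1 while the setup is idle).
conn.execute("ALTER DATABASE [AnalyticsDW] MODIFY (SERVICE_OBJECTIVE = 'S3');")
# The change is asynchronous; the database stays online while it completes.
```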
Disclaimer
All prices shown are in US dollars ($). This is a summary estimate, not a quote. Prices may differ based on the size of the tables being loaded from LS Central.
For up-to-date pricing information, visit the Azure pricing calculator.
This estimate was created at 6/3/2020 1:09:55 PM UTC.