4 Deploy Analytics for LS Central

Now it is time to run the deployment script you downloaded with the product package earlier. First, however, you need to make sure that PowerShell 7 is installed on the machine where you will be running the deployment script. Follow the steps below to deploy Analytics.

Configure PowerShell 7

To run the script, the machine must have PowerShell installed. We recommend using PowerShell 7, configured as described below.

You must have the new Azure PowerShell Az module installed. For further instructions, see Introducing the new Azure PowerShell Az module.

Note: You must open PowerShell as administrator to be able to set up the module.

If you do not have the Az module installed, run the following command as administrator in PowerShell 7:

Install-Module -Name Az -AllowClobber
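
If you are not sure whether the Az module is already present, you can guard the installation with a quick check. This is an optional sketch using standard PowerShell cmdlets; it tests for the Az.Accounts sub-module, which ships with the Az module:

```powershell
# Install the Az module only if it is not already available
if (-not (Get-Module -ListAvailable -Name Az.Accounts)) {
    Install-Module -Name Az -AllowClobber
}
```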

You must also register the Microsoft.DataFactory resource provider by running these lines in PowerShell 7:

Connect-AzAccount -Tenant 'TenantID/ParentManagementGroupIDfromAzure' -Subscription 'SubscriptionIDfromAzure'

Example:

Connect-AzAccount -Tenant 'baef60cc-ecf7-46d0-826d-f00b389e8054' -Subscription 'a45e9eef-be97-49d2-9ebf-28bb188ddb4d'

Note: You should already have this information in the Parameter.txt file. Be careful to put the correct parameters in the correct place.

This opens an Azure login window, possibly in the background, where you must log in before you continue with the next line:

Register-AzResourceProvider -ProviderName Microsoft.DataFactory
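
Once logged in, you can verify that the provider registration has completed (registration can take a few minutes). This optional check uses the standard Az cmdlet for listing resource providers:

```powershell
# RegistrationState should show 'Registered' once the provider is ready
Get-AzResourceProvider -ProviderNamespace Microsoft.DataFactory |
    Select-Object ProviderNamespace, RegistrationState
```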

To install Bicep, copy and run this script from Microsoft:

# Create the install folder
$installPath = "$env:USERPROFILE\.bicep"
$installDir = New-Item -ItemType Directory -Path $installPath -Force
$installDir.Attributes += 'Hidden'

# Fetch the latest Bicep CLI binary
(New-Object Net.WebClient).DownloadFile("https://github.com/Azure/bicep/releases/latest/download/bicep-win-x64.exe", "$installPath\bicep.exe")

# Add bicep to your PATH
$currentPath = (Get-Item -Path "HKCU:\Environment").GetValue('Path', '', 'DoNotExpandEnvironmentNames')
if (-not $currentPath.Contains("%USERPROFILE%\.bicep")) { setx PATH ($currentPath + ";%USERPROFILE%\.bicep") }
if (-not $env:path.Contains($installPath)) { $env:path += ";$installPath" }

# Verify you can now access the 'bicep' command.
bicep --help

# Done!

Note: If you have already downloaded the product package, you can also navigate to the base folder and run InstallBicep.ps1 from there.

When all of this has run successfully, you are ready to run the deployment script.

Run the deployment script

  1. Open the PowerShell 7 client.
  2. In the PowerShell client, navigate to the folder where you stored the product package.
    • You can use cd .. to navigate up a folder.
  3. Once you are in the base folder, type depl and press Tab; this auto-completes the deployment script line.
  4. Press Enter to run the deployment script.
    • The script checks for the Az module setup mentioned earlier. If the module is missing, the script terminates, and you can set up the module at this point by following the steps described above.
    • The script checks for the SqlServer module, and installs and imports it if needed.
    • In some cases, PowerShell will not trust the files from the ZIP, and will produce an error similar to the following:
      Run only scripts that you trust. While scripts from the internet can be useful, this script can potentially harm your computer.

      If you trust this script, use the Unblock-File cmdlet to allow the script to run without this warning message. Do you want to run ...

      [D] Do not run [R] Run once [S] Suspend [?] Help (default is "D"):

      • To fix the error, run the following command in the base folder of the Analytics product package:

        Get-ChildItem -Path . -Recurse | Unblock-File
    • Then run the deployment script again.

General information

First, the script will collect some general information about the Analytics setup.

  1. First, the script prompts for the Azure subscription ID:
    • Enter the ID you saved earlier and press Enter.
  2. Next, you will be prompted to enter your Azure account login information.

    The Azure login window pops up in a separate window but should get focus. If you have already logged in to Azure in a browser, your account should be visible and you can just press Enter. If not, you need to enter your Azure login credentials.

  3. The script then asks you to select the resource group you want to use from a list of resource groups collected from Azure:
    • Enter the number for the resource group you choose, and press Enter.

LS Central information

Next, the script will prompt for LS Central source information. For this setup, this is only the company name, since all connections to the LS Central database are handled by the Scheduler server.

  1. The script will prompt for which type of LS Central you have.
    • Here you should always enter 2 for bc2adls with LS Central SaaS.
  2. The script prompts for the name of the company you want to use in the Analytics setup.
    • An empty Companies.txt file is opened.
    • Enter the company names exactly as they are displayed in the Name field of the Companies table in LS Central. If you want to include more than one company in your Analytics setup, enter each name on a new line.
    • Save the file.
    • Press Enter in the script, and the script will prompt for verification of the companies you entered into the file.
    • Enter y if they are correct, or n if you want to edit the file.
      • If you select n, edit the file, save it, and then press Enter in the script.
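
As an illustration, a Companies.txt file listing two companies could look like the following. The names below are hypothetical; use the exact names from the Companies table in your own LS Central:

```
CRONUS International Ltd.
LS Demo Company
```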

Analytics information

The script now prompts for information relating directly to the setup of Analytics in Azure:

  1. First, you will be asked whether you want to set up the Analytics database in Azure or on-premises.
    • Here the only option is Cloud, so you just press Enter.
  2. Next, the script asks whether you want to create a new SQL server in Azure or use an existing one. The number of existing SQL servers in the resource group you selected is displayed in parentheses after this option. We recommend setting up a new server, if possible.
    • Enter 0 if you want to create a new server.
    • Enter 1 if you want to use an existing server.
  3. If you choose to create a new server, you will next be prompted for a name for the new server. Otherwise, go to step 4 below.
    • Enter a name for the new server using lowercase letters, numbers, and hyphens only, and press Enter.

    Then you will be asked to create a new server admin login:

    • Enter a user name for the new admin user, and press Enter.
      • If the user name does not meet the rules provided in the script, you need to choose a new user name.
    • Enter a password for the new admin user, and press Enter.
      • If the password does not meet the rules provided in the script, you need to enter the user name again and select a new password.
    • Confirm the password by entering it again, and press Enter.
      • Make sure you save this password somewhere, or select something you will definitely remember. The password is not saved anywhere, and you will need to provide it when connecting the Power BI reports to Analytics later in the process.

    You can now move directly to step 5 below, since step 4 only applies if you selected an existing Azure SQL server.

  4. If you choose to use an existing server, the script displays a list of existing server names for you to choose from.
    • Note: If you are using an existing SQL server in Azure, make sure that an Entra ID admin is set on the server so that user-assigned managed identity authentication works.
    • Enter the number of the server you want to use, and press Enter.

    You will then be asked to provide the admin login credentials for the server you selected:

    • Enter the user name for the admin user of the server you selected, and press Enter.
    • Enter the password for the admin user of the server you selected, and press Enter.
  5. The script now prompts for a database name for the new Analytics database:
    • Enter a name for the database, and press Enter.
  6. Then select which Azure pricing tier you want to use for the database:
    • Enter the number for the pricing tier you select. We recommend tier S2.
    • You can read more about the pricing tiers on the Azure cost calculation page.
  7. The script now prompts for a name for the Azure Data Factory that will be created in Azure and will contain all the pipelines needed to move data between databases and tables.
    • Enter the name for the Azure Data Factory (ADF), and press Enter.
    • Note: The ADF name must be globally unique in Azure, so if you want to call it Analytics, we recommend adding some additional letters or numbers to the ADF name.
  8. Next, the script offers to use the location that is set on the resource group you selected earlier for your Azure resources, and displays that location.
    • This is most likely what you want to do; enter y and press Enter.
    • If for some reason you want to select a different location, enter n at this point and press Enter.
    • The script will then look up all allowed locations in your subscription, which takes a few minutes, and then ask you to select the location you want.
      • Enter the number of the location you select, and press Enter.
  9. The script now displays a summary of the Analytics parameters you have selected and entered. If everything is correct, enter y and the script continues. If you do nothing or enter any other letter, the script starts collecting the Analytics setup parameters again.
  10. In the next step, the script allows you to select an existing storage account or create a new one.
    • If you select an existing storage account, you will only be able to choose between storage accounts that are compatible with the bc2adls export. Once you have selected the one you want, you are done.
    • If you select to create a new one, you need to enter a name for the storage account.
  11. Whether you create a new storage account or not, a container called bc2adls is created on the selected storage account.
  12. You will then be asked to verify the location and name of the storage account; enter y and press Enter to do so.

Install Analytics

  1. The script will now create the resources in Azure and create the pre-staging tables in the Analytics database.

    Tip: This usually takes about 5-10 minutes. Sometimes the import of the SQL database takes longer, caused by load on the Azure service. In that case, the script continues with creating the Azure Data Factory, but you cannot continue the onboarding process until the database resource has been created. How to check this is explained later in the onboarding process.

    The script will print out the following lines as the resources are created:

    Installing Analytics for LS Central...
    Creating a new Azure SQL server. This might take a few minutes... (This line will not appear if you selected an existing server)
    Adding firewall rules...
    Creating a new Azure SQL database. This might take a few minutes...
    Creating an Azure Storage Account...
    Creating a new Azure Data Factory v2. This might take a few minutes... (A warning will appear here, but it can be ignored)
    Adding companies into the Analytics database...
    Adding the new database tables and procedures for BC2ADLS...
    Assigned managed identity to the storage account.

  2. If at any point there is a validation error or something else goes wrong, the script terminates and prints a red error message. This most often explains the issue; the error is also written to the error log in the base folder. Some errors are clear, but others might be more cryptic, in which case it is good to check the troubleshooting section of the help.

  3. Once the script is done, it will:

    • Print out the parameters needed to connect the Power BI reports to the Analytics database, except for the Analytics admin user password.
      • Copy this information and paste it at the bottom of the parameters.txt file; this saves you from looking up the server path in Azure later on.
    • Notify you by printing out a message (Done!).
    • Add a new folder (YYYYMMDDHHMM-MyProject) in the Base Folder.

Verify creation of resources in Azure

  1. Open and log in to the Azure portal.
  2. You will find four resources created by the deployment script:
    • Azure Data Factory
    • SQL Database
    • SQL Server
    • Azure data lake storage account
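
As an alternative to browsing the portal, you can list the resources from PowerShell. This is a minimal sketch, assuming the Az module set up earlier and an active Connect-AzAccount session; 'MyResourceGroup' is a placeholder:

```powershell
# List the resources in the resource group you selected during deployment
# ('MyResourceGroup' is a placeholder - replace with your resource group name)
Get-AzResource -ResourceGroupName 'MyResourceGroup' |
    Select-Object Name, ResourceType |
    Format-Table -AutoSize
```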

Test connections to storage account and Analytics data warehouse

At this point, it is also good to test whether the linked service connections in the Azure Data Factory are working.

  1. Open the Azure Data Factory.
  2. Navigate to the Manage tab, and under the Connections section in the menu, find Linked services.
  3. There should be three linked services in the list:
    • AnalyticsDW
    • AzureDataLakeStorage
    • LSCentralSource (still there because of other scenarios, but not used in this setup)
  4. Open a linked service by clicking its name; a panel opens up.
  5. In the bottom right corner of the panel, there is an option to test the connection. Do this for both AnalyticsDW and AzureDataLakeStorage.
  6. If the deployment was successful, both should display Connection successful.
  7. If there are connection errors, try waiting a few minutes, since it can take some time for permissions to be set in Azure. If the errors persist, check whether you had any errors during deployment. You can then resolve them, delete the resources, and run the deployment again, or report them to LS Retail technical support to get help resolving the issue.

Once all connections are tested successfully you can move to the next step.
