5 Deploy Analytics for LS Central
This step is where the magic happens. The deployment script sets up the Azure resources and pipelines that together populate and store the Analytics data warehouse. The script runs in PowerShell and will prompt for all the needed information, create connections and resources in the correct places, and in the end create a summary file with all the information and store it on your server.
The sections below explain some of the steps in more detail, but the script should be self-explanatory. It will verify the data you enter, display summaries for you to confirm along the way, and alert you if something is not right.
The progress of the script is shown at the top for each step. The completed steps are colored green, while the current step is yellow, and the remaining steps are gray.
Run Deployment script
- Navigate to your Base folder, that is, the folder where you downloaded the Analytics Product package earlier.
- In the Base folder, right-click an empty space in the folder and select Open with VS Code.
- When you open the folder like this, you have access to all the files in the folder from VS Code.
- If you have collected your parameters into a file, you can add that file to the Base folder and you can view it from VS Code as well.
- If you do not want to use VS Code, or you run into any issues with Windows PowerShell or PowerShell ISE, you can open DeploymentScript.ps1 using PowerShell 7. We have run into the fewest difficulties with module dependencies in that app, and it is the PowerShell app used in the Analytics Academy courses. Microsoft has instructions on how to install PowerShell 7 in their documentation.
- Run the script (F5).
- The script will check for the Az module setup mentioned earlier. If the module is missing, the script will terminate, and you can set it up at this time by following the steps in step 3 of the wizard.
- The script will check for the SqlServer module and install and import it if needed (a sketch for checking both modules yourself follows this list).
- If the script detects old Analytics modules, it will remove them and import the modules from the current Base folder.
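Tip: If you want to check the module prerequisites yourself before running the script, a minimal sketch like the following can be run in the same PowerShell session. The script performs equivalent checks on its own; installing to the CurrentUser scope is just one reasonable choice.

```powershell
# Check for the Az and SqlServer modules and install them for the current user if missing.
foreach ($module in @('Az', 'SqlServer')) {
    if (-not (Get-Module -ListAvailable -Name $module)) {
        Install-Module -Name $module -Scope CurrentUser -Force
    }
    Import-Module -Name $module
}
```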
General information
First, the script will collect some general information about the Analytics setup.
- The script prompts for Azure subscription ID:
- Enter the ID you saved earlier, and click Enter.
- Next, you will be prompted to enter your Azure account login information.
Note: The Azure login window will pop up in a different window and might be hidden by VS Code.
- Then, the script asks you to select the resource group you want to use from a list of resource groups collected from Azure (a sketch for listing these yourself in advance follows this list).
- Enter the number for the resource group you choose, and click Enter.
- The script will prompt for the type of LS Central installation you have. Here you should always enter 0 for On-premises.
- If you have LS Central in the cloud, you should go to the LS Central in Cloud onboarding wizard instead. The in-cloud setup is somewhat different from this one.
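Tip: If you want to confirm the subscription ID and see the resource groups the script will offer before you start, you can use the standard Az cmdlets. The sketch below is illustrative, and the subscription ID is a placeholder.

```powershell
# Sign in and select the subscription you saved earlier (placeholder ID below).
Connect-AzAccount
Set-AzContext -Subscription '00000000-0000-0000-0000-000000000000'

# List the resource groups the script will collect from Azure.
Get-AzResourceGroup | Select-Object ResourceGroupName, Location
```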
LS Central information
Next, the script will prompt for LS Central source database information and credentials:
- It prompts for the LS Central source database server name.
- Enter the name of the server, and click Enter.
- Next are the credentials for the database user with read privileges that you created in step 1 of the wizard.
- Enter the user name, click Enter.
- Enter the password, click Enter.
- Next is the LS Central source database name.
- Enter the name of the database, and click Enter.
- Then the names of all the companies found in the LS Central database are written to a Companies.txt file, which is opened.
- If you want to exclude some of the companies found in the database from the Analytics setup, you can delete them from the file at this point.
- Save the file if you edited it.
- Click Enter in the script and the script will prompt for verification of the companies in the file.
- Enter y if they are correct or n if you want to edit the file again.
- If you select n, edit the file, save, and then click Enter in the script when you are done.
If the connection to LS Central is unsuccessful, the script will let you either try entering the connection parameters again, or, if you know you cannot connect to LS Central from the server you are running the script on but will be able to connect from Azure, continue with the parameters you entered without verifying them. In that case you will need to enter one or more company names into the file (a connectivity-check sketch follows below).
Note: If you are connecting to a SQL Server with a named instance, for example <computer name>\SQLEXPRESS, and you are unable to connect to the LS Central source, read the guide on how to configure SQL Server to allow remote connections to named instances.
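One common cause is that the SQL Server Browser service, which resolves named instances for remote clients, is not running on the database server. A sketch, assuming the default service name, to be run on the machine hosting the instance:

```powershell
# Run on the machine hosting the named instance.
# TCP/IP must also be enabled for the instance in SQL Server Configuration Manager.
Set-Service -Name 'SQLBrowser' -StartupType Automatic
Start-Service -Name 'SQLBrowser'
```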
If you know that you cannot connect to the LS Central source from the computer where you are running the script, but want to continue anyway:
- Enter the company name(s) into the file exactly as they are displayed in the Companies table in LS Central, and click Enter.
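If you are unsure whether the machine you are running the script on can reach the LS Central source at all, a quick connectivity check with the SqlServer module looks like the sketch below. The server and database names are illustrative, and the query assumes the standard Business Central/LS Central Company table.

```powershell
# Quick connectivity check using the read-only user created in step 1 of the wizard.
$cred = Get-Credential
Invoke-Sqlcmd -ServerInstance 'myserver\SQLEXPRESS' `
              -Database 'LSCentral' `
              -Credential $cred `
              -Query 'SELECT [Name] FROM [dbo].[Company]'
```

If this returns the company names, the script should be able to verify the connection as well.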
The script will now display a summary of the LS Central source database information you have selected and entered. If everything is correct, you can just enter a y and the script will continue, but if you do nothing or enter any other letter, the script will start collecting the LS Central source information again.
Analytics information
The script now prompts for information relating directly to the setup of Analytics in Azure.
- Next you will be asked whether you want to host the Analytics database in the Azure cloud or on-premises.
- Here you should always select 1 for Cloud.
- If you want to set up the Analytics database on-premises on an existing SQL server, you should use the Analytics on-premises onboarding wizard instead.
- Next, the script asks whether you want to create a new SQL server in Azure or use an existing one. The number of existing SQL servers in the resource group you selected is displayed in parentheses after this option. We recommend setting up a new server, if possible.
- Enter 0 if you want to create a new server.
- Enter 1 if you want to use an existing server.
- If you choose to create a new server, you will next be prompted for a name for the new server. If not, go to step 4 below.
- Enter a name for the new server, click Enter.
Then you will be asked to create a new server admin login:
- Enter a user name for the new admin user, click Enter.
- If the user name does not meet the rules provided in the script, you need to choose a new user name.
- Enter a password for the new admin user, click Enter.
- If the password you enter does not meet the rules provided in the script, you need to enter the user name again and select a new password.
- Confirm the password by entering it again, and click Enter.
- Make sure you save this password somewhere, or select something you will definitely remember, since this password is not saved anywhere and you will need to provide it when connecting the Power BI reports to the Analytics database later in the process.
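The admin password cannot be read back from Azure later; only the login name is stored on the server resource. As an illustration (the resource group and server names are placeholders):

```powershell
# The admin login name can be looked up later, but the password cannot,
# so store the password safely now.
(Get-AzSqlServer -ResourceGroupName 'MyResourceGroup' -ServerName 'my-analytics-server').SqlAdministratorLogin
```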
You can now move directly to step 6 below, since step 4 only applies if you selected to use an existing Azure SQL server.
- If you choose to use an existing server, the script will display a list of existing server names for you to choose from.
- Enter the number of the server you want to use, and click Enter.
Then you will be asked to provide the admin login credentials for the server you selected:
- Enter the user name for the admin user of the server you selected, click Enter.
- Enter the password for the admin user of the server you selected, click Enter.
- Now the script prompts for database name for the new Analytics database.
- Enter a name for the database, click Enter.
- Then select which Azure pricing tier you want to use for the database.
- Enter the number for the pricing tier you select. We recommend selecting tier S2.
- You can read more about the pricing tiers on the Azure cost calculation page.
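The tier you choose is not permanent; if the Analytics database later turns out to be under- or over-sized, it can be scaled with the standard Az.Sql cmdlet. A sketch with placeholder names:

```powershell
# Scale an existing Azure SQL database to the S2 service objective.
Set-AzSqlDatabase -ResourceGroupName 'MyResourceGroup' `
                  -ServerName 'my-analytics-server' `
                  -DatabaseName 'AnalyticsDB' `
                  -RequestedServiceObjectiveName 'S2'
```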
- Now the script prompts for a name for the Azure Data Factory that will be created in Azure and will contain all the pipelines needed to move data between databases and tables.
- Enter the name for the Azure Data Factory (ADF), click Enter.
- Note: The ADF name must be globally unique in Azure, so if you want to call it LSInsight, we recommend adding some additional letters or numbers to the ADF name (see the sketch below).
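A simple way to get a name that is likely to be unique is to append a random suffix before you run the script, for example:

```powershell
# Append a random four-digit suffix to reduce the chance of a name collision.
$adfName = 'LSInsight{0:D4}' -f (Get-Random -Maximum 10000)
$adfName
```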
- Next, the script offers to use the same location for your Azure resources as is set on the resource group you selected before, and displays that location.
- This is most likely what you want to do; enter y and click Enter.
- If you want to select a different location for some reason, enter n at this point and click Enter.
- The script will then look up all allowed locations in your subscription, which takes a few minutes, and then ask you to select the location you want (a sketch for listing locations in advance follows below).
- Enter the number of the location you select and click Enter.
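If you want to see the available locations before running the script, the standard Az cmdlet below lists the locations visible to your subscription:

```powershell
# List the Azure locations visible to the current subscription.
Get-AzLocation | Select-Object Location, DisplayName | Sort-Object Location
```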
- The script will now display a summary of the Analytics parameters you have selected and entered. If everything is correct, you can just enter a y and the script will continue, but if you do nothing or enter any other letter, the script will start collecting the Analytics setup parameters again.
Install Analytics
- The script will now create the resources in Azure and create the pre-staging tables in the Analytics database.
Tip: This will usually take about 5-10 minutes. Sometimes the import of the SQL database takes longer, caused by load on the Azure service. In that case the script continues with creating the Azure Data Factory, but the step where companies are added to the database is not completed since the database is not ready. A message will be printed to guide you to a file you can run manually to insert the company or companies. This is explained in more detail in the next step of the onboarding process.
The script will print out the following lines as the resources are created:
Installing Analytics for LS Central...
Creating a new Azure SQL server. This might take a few minutes... (This will not appear if you selected existing server)
Adding firewall rules...
Creating a new Azure SQL database. This might take a while...
Creating a new Azure Data Factory v2. This might take a while...
Adding companies into the Analytics database...
If at any point there is a validation error or something goes wrong, the script will terminate and print out a red error message. This most often explains the issue and the error is also written to the error log in the base folder. Some errors are clear, but others might be more cryptic and then it can be good to check the error log and the troubleshooting section of the help.
When you run the script again after an error occurs, the script tries to reuse the parameters you selected before, but asks for verification, so you must be careful to answer n if you want to change something.
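Before rerunning, it can also help to see which resources the script already created in the resource group, so you know which answers to keep and which to change (the resource group name below is a placeholder):

```powershell
# List what already exists in the resource group, for example the SQL server,
# database, and data factory.
Get-AzResource -ResourceGroupName 'MyResourceGroup' |
    Select-Object Name, ResourceType |
    Sort-Object ResourceType
```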
- Once the script is done, it will:
- Print out the parameters needed to connect the Power BI reports to the Analytics database, except for the Analytics admin user password.
- This saves you the lookup in Azure for the server path. If you want to close VS Code, it is a good idea to copy this information and save it somewhere for safekeeping.
- Notify you by printing out a message (Done!).
- Add a new folder (YYYYMMDDHHMM-MyProject) in the Base Folder.
- Save the deployment information to the Parameters.json file in the MyProject folder.
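If you need the connection parameters again later, they can be read back from the summary file. A sketch, assuming the folder naming described above (the folder name is illustrative):

```powershell
# Read the saved deployment parameters back from the project folder in the Base folder.
$params = Get-Content -Path '.\202401011200-MyProject\Parameters.json' -Raw | ConvertFrom-Json
$params
```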