4 Deploy Analytics
This step is where the magic happens. The deployment script sets up the Azure resources and pipelines that together populate and store the Analytics data warehouse. The script runs in PowerShell and will prompt for all the needed information, create connections and resources in the correct places, and in the end create a summary file with all the information and store it on your server.
The sections below explain some of the steps in detail, but the script should be self-explanatory. It will verify the data you enter, display summaries for you to confirm along the way, and alert you if something is not right.
The progress of the script is shown at the top for each step. The completed steps are colored green, while the current step is yellow, and the remaining steps are gray.
Run the deployment script
- Navigate to your Base folder, that is, the folder where you downloaded the Analytics product package earlier.
- In the Base folder, right-click an empty space in the folder and select Open with VS Code.
- When you open the folder like this, you have access to all the files in the folder from VS Code.
- If you have collected your parameters into a file, you can add that file to the Base folder and you can view it from VS Code as well.
- If you do not want to use VS Code, or you run into any issues with Windows PowerShell or PowerShell ISE, you can open DeploymentScript.ps1 using PowerShell 7. We have run into the fewest difficulties with module dependencies in that app, and it is the PowerShell app used in the Analytics Academy courses. Microsoft has instructions on how to install PowerShell 7 in their documentation.
- Run the script (F5).
- The script will check for the Az module setup mentioned earlier. If the module is missing, the script will terminate, and you can set the module up at this time by following the steps in step 3 of the wizard.
- The script will check for the SqlServer module and install and import it if needed.
- If the script detects old Analytics modules, it will remove them and import the modules from the current Base folder.
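If you want to verify the module prerequisites yourself before running the script, a check along these lines works in a PowerShell 7 session (a minimal sketch; the deployment script's own checks may differ):

```powershell
# Check that the Az module is available; the deployment script stops if it is missing.
if (-not (Get-Module -ListAvailable -Name Az)) {
    Write-Warning "Az module not found - set it up as described in step 3 of the wizard."
}

# Install and import the SqlServer module if it is not already present.
if (-not (Get-Module -ListAvailable -Name SqlServer)) {
    Install-Module -Name SqlServer -Scope CurrentUser -Force
}
Import-Module SqlServer
```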
General information
First, the script will collect some general information about the Analytics setup.
- The script prompts for the Azure subscription ID:
- Enter the ID you saved earlier, and press Enter.
- Next, you will be prompted to enter your Azure account login information.
Note: The Azure login prompt opens in a separate window and might be hidden behind VS Code.
- Then, the script asks you to select the resource group you want to use from a list of resource groups collected from Azure.
- Enter the number for the resource group you choose, and press Enter.
- The script will prompt for the type of LS Central you have. Here you should always enter 0 for On-premises.
- If you have LS Central in the cloud, you should use the LS Central in Cloud onboarding wizard instead; the in-cloud setup is somewhat different from this one.
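For reference, the Azure sign-in and resource group lookup that the script performs correspond roughly to the following Az cmdlets (a sketch only; the subscription ID shown is a placeholder and the script's actual calls may differ):

```powershell
# Sign in to Azure; the login window may open behind VS Code.
Connect-AzAccount

# Select the subscription you saved earlier (placeholder ID).
Set-AzContext -Subscription "00000000-0000-0000-0000-000000000000"

# List the resource groups the script presents for selection.
Get-AzResourceGroup | Select-Object ResourceGroupName, Location
```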
LS Central information
Next, the script will prompt for LS Central source database information and credentials:
- It prompts for the LS Central source database server name.
- Enter the name of the server, and press Enter.
- Next are the credentials for the database user with read privileges that you created in step 1 of the wizard.
- Enter the user name, and press Enter.
- Enter the password, and press Enter.
- Next is the LS Central source database name.
- Enter the name of the database, and press Enter.
- Then the names of all the companies found in the LS Central database are added to a Companies.txt file, which is then opened (see the sketch at the end of this section).
- If you want to exclude some of the companies found in the database from the Analytics setup, you can delete them from the file at this point.
- Save the file if you edited it.
- Press Enter in the script, and the script will prompt for verification of the companies in the file.
- Enter y if they are correct or n if you want to edit the file again.
- If you select n, edit the file, save, and then press Enter in the script when you are done.
If the connection to LS Central is unsuccessful, the script lets you either try entering the connection parameters again, or continue without verifying the parameters you entered. The latter is useful if you know you cannot connect to LS Central from the server where you are running the script but will be able to connect from Azure; in that case you will need to enter one or more company names into the file yourself.
Note: If you are connecting to a SQL Server named instance, for example <computer name>\SQLEXPRESS, and you are unable to connect to the LS Central source, read the guide on how to configure SQL Server to allow remote connections to named instances.
If you know that you cannot connect to the LS Central source from the computer where you are running the script, but want to continue anyway:
- Enter the company name(s) into the file exactly as they are displayed in the Companies table in LS Central, and press Enter.
The script will now display a summary of the LS Central source database information you have selected and entered. If everything is correct, enter y and the script will continue; if you do nothing or enter any other letter, the script will start collecting the LS Central source information again.
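If you want to verify the read-only connection and see the company names yourself, a query along these lines can be run from the same machine (a sketch only; the server, database, and login values are placeholders, and the Company table is the standard Business Central system table, which is an assumption here rather than the script's actual query):

```powershell
# Placeholder values - use the server, database, and read-only login from step 1 of the wizard.
$server   = "MYSERVER\SQLEXPRESS"
$database = "LSCentral"

# List the company names; assumes the standard Business Central 'Company' system table.
Invoke-Sqlcmd -ServerInstance $server -Database $database `
    -Username "analytics_reader" -Password "<password>" `
    -Query "SELECT [Name] FROM [Company]" |
    Select-Object -ExpandProperty Name
```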
Analytics information
The script now prompts for information relating directly to the setup of Analytics for LS Central in Azure.
- First, you will be asked whether you want to set up the Analytics database on-premises or in the cloud (Azure):
- Here you should always select 0 for your own on-premises SQL server.
- If you want to set up the Analytics database in Azure, you should use the Analytics in Azure onboarding wizard instead. If you choose to set up in Azure, you will also be able to use an existing Azure SQL server.
- Next, the script displays information about setting up the Analytics database template on the on-premises SQL server. Follow the instructions in the script to:
- Restore the database using 'Import Data-tier Application' in SQL Server Management Studio.
- When you restore the database, select a descriptive name for the database, for example LSInsight.
- Create a SQL login on the server with read and write privileges on the Analytics database, as well as execute privileges on the stored procedure (see the sketch after this list):
- Under the database, go to Programmability - Stored Procedures.
- Right-click dbo.LSInsight$InsertCompany.
- Select Properties.
- On the Permissions page, search for the SQL login you created, and click OK.
- In the lower panel, in the Grant column, check Execute, and click OK.
- Enter y in the script to continue.
- You will next be prompted for the name of the on-premises SQL server:
- Enter the name of the SQL server, and press Enter.
- You will then be prompted for the name of the Analytics database:
- Enter the name of the database, and press Enter.
- Then you will be asked to provide the login credentials you just created for the database:
- Enter the user name of the SQL login, and press Enter.
- Enter the password of the SQL login, and press Enter.
- Now the script prompts for a name for the Azure Data Factory that will be created in Azure and will contain all the pipelines needed to move data between the databases and tables.
- Enter the name for the Azure Data Factory (ADF), and press Enter.
- Note: The ADF name must be globally unique in Azure, so if you want to call it LSInsight, we recommend adding some additional letters or numbers to the name.
- Next, the script offers to use the same location for your Azure resources as is set on the resource group you selected earlier, and displays that location.
- This is most likely what you want to do; enter y, and press Enter.
- If for some reason you want to select a different location, enter n at this point, and press Enter.
- The script will then look up all allowed locations in your subscription, which takes a few minutes, and then ask you to select the location you want.
- Enter the number of the location you select, and press Enter.
The script will now display a summary of the Analytics parameters you have selected and entered. If everything is correct, enter y and the script will continue; if you do nothing or enter any other letter, the script will start collecting the Analytics setup parameters again.
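If you prefer, the execute grant from the stored procedure steps above can be done with a short query instead of the SSMS dialogs (a sketch, run as a user with admin rights on the Analytics database; the login name analytics_user and the server and database names are placeholders, while dbo.LSInsight$InsertCompany is the procedure named above):

```powershell
# Placeholder values - your Analytics SQL server and the database name you chose when restoring.
$server   = "MYSERVER"
$database = "LSInsight"

# Grant the Analytics SQL login execute rights on the company-insert procedure.
# Single quotes keep PowerShell from treating $InsertCompany as a variable.
Invoke-Sqlcmd -ServerInstance $server -Database $database `
    -Query 'GRANT EXECUTE ON OBJECT::dbo.[LSInsight$InsertCompany] TO [analytics_user];'
```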
Install Analytics
- The script will now create the Azure Data Factory in Azure.
Tip: This will take about one minute.
The script will print out the following lines as the resources are created:
Installing Analytics for LS Central...
Creating a new Azure Data Factory v2. This might take a while...
Adding companies into the Analytics database...
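For reference, creating the data factory resource roughly corresponds to the following Az cmdlet (a sketch; the resource group, name, and location are placeholders standing in for the values you entered, and the pipelines the script deploys into the factory are not shown):

```powershell
# Placeholder values - the resource group, globally unique ADF name, and location you selected.
New-AzDataFactoryV2 -ResourceGroupName "analytics-rg" `
    -Name "LSInsight-adf-001" `
    -Location "westeurope"
```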
If at any point there is a validation error or something goes wrong, the script will terminate and print a red error message. This most often explains the issue, and the error is also written to the error log in the Base folder. Some errors are clear, but others may be more cryptic; in those cases it is a good idea to check the error log and the troubleshooting section of the help.
When you run the script again after an error occurs, the script tries to reuse the parameters you selected before but asks for verification, so you must be careful to answer n if you want to change something.
- Once the script is done, it will:
- Print out the parameters needed to connect the Power BI reports to the Analytics database, except for the Analytics user password.
- If you want to close VS Code, it is a good idea to copy this information and save it somewhere for safekeeping.
- Notify you by printing out a message (Done!).
- Add a new folder (YYYYMMDDHHMM-MyProject) in the Base folder.
- Save the deployment information to the Parameters.json file in the MyProject folder.
Note: If the Analytics SQL server is a named instance and the name contains a \ (backslash), it will be displayed with a double backslash (\\) in Parameters.json. Change this to a single backslash (\) so the value is correct when you use it to connect the reports to the Analytics database.
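If you read Parameters.json programmatically instead of by hand, the JSON parser resolves the escaping for you, so the double backslash only appears in the raw file (a sketch; the property name AnalyticsSqlServer is hypothetical and may differ from the actual file):

```powershell
# Read the deployment summary from the project folder created above.
$params = Get-Content -Raw -Path ".\Parameters.json" | ConvertFrom-Json

# The \\ in the file is JSON escaping; the parsed value contains a single backslash.
$params.AnalyticsSqlServer    # hypothetical property name, e.g. MYSERVER\SQLEXPRESS
```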