Performance is an important part of any system's development and implementation, and as part of my role as Chief Architect at TSG I often get asked about it...
A few months ago the Microsoft Premier Field Engineer (PFE) team released an update to their toolset for analysing the performance of the Microsoft Dynamics range of products, which is available on CodePlex here and, at the time of writing, is at version 1.20. The toolset is a set of SQL scripts that collect DMV data and Dynamics-specific product data and persist it into a single database called DynamicsPerf. This allows us to analyse what is going on in a particular Dynamics installation and helps to resolve performance issues quickly.
As the PFE team say, the real benefit of the performance analyser toolset is of course not finding issues in a production environment, but preventing bad code or environmental misconfiguration from reaching production in the first place.
The DynamicsPerf database is the central repository for the data that is collected for performance analysis.
How do we use the performance analyser?
Once we have downloaded the performance analyser from CodePlex, we can extract the zip file and copy it over to the SQL Server for the system that we want to analyse.
We need to ensure that we are logged into the SQL Server with an account that has permissions to create databases, tables, jobs, etc.
We also need to create a folder on the server, say on the data drive of the SQL Server, to store the SQL Trace files that are created; we can call this folder D:\SQLTrace. We also need a folder to store the performance counter data that is logged, say D:\PerfLogs.
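If we prefer to create these folders from within SSMS rather than Windows Explorer, one convenient option (a convenience only, using the undocumented but widely used xp_create_subdir extended procedure) is:

-- Create the trace and performance counter folders chosen above
EXEC master.dbo.xp_create_subdir N'D:\SQLTrace';
EXEC master.dbo.xp_create_subdir N'D:\PerfLogs';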
The performance analyser is delivered as a SQL Server Management Studio solution that includes the SQL scripts, SQL Agent jobs and performance counter templates used to start the collection process.
To be able to use the performance analyser we must first create the DynamicsPerf database, its objects, and the SQL agent jobs, which can be done via the following:
1. Open the SQL Server Management Studio on the database server in the system we want to analyse.
2. Click File > Open > Project / Solution.
3. Browse to the location on the SQL Server where we extracted the DynamicsPerf1.20.zip file, in our case this is D:\DynamicsPerf1.20 RC0.
4. Select the Performance Analyser 1.20 for Microsoft Dynamics.ssmssln file.
5. In the Solution Explorer, we need to open the DynamicsPerf\Queries\1-Create_Core_Objects.sql file.
6. Select the SQL Server from the Connect to Database Engine dialog and click the Connect button.
7. Then we click the Execute button to run the SQL script.

Note: Some jobs have AX in the name and are disabled. For our purposes, as we are looking at Dynamics CRM 2013, we can ignore these jobs.
There is another script at DynamicsPerf\Queries\2-Create_CRM_Objects.sql; however, at the time of writing it contains no Dynamics CRM specific queries, so we can ignore this file.
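Once the script has finished, a quick sanity check confirms that the DynamicsPerf database and its SQL Agent jobs now exist (a simple sketch; the job names are assumed to start with DYN, in line with the job names used below):

-- Confirm the DynamicsPerf database was created
SELECT name FROM sys.databases WHERE name = 'DynamicsPerf';

-- List the SQL Agent jobs created by the script and whether they are enabled
SELECT name, enabled
FROM msdb.dbo.sysjobs
WHERE name LIKE 'DYN%'
ORDER BY name;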
Configure and Schedule Performance Data Capture
Now that the SQL Agent jobs have been created, we need to configure a few of them with our specific system settings, i.e. we need to specify the CRM database that we want to analyse. By default the jobs are set up to analyse the MSCRM_CONFIG database, but that's not really much use, as we will generally want to analyse the organisation-specific database.
So to configure performance data capture, we now need to modify the DYNPERF_Capture_Stats job:
1. We open the DYNPERF_Capture_Stats job from SQL Server Agent > Jobs in SQL Server Management Studio.
2. We select the Steps page, select step 1, sp_capturestats, and click the Edit button.
3. In the command box we change @Database = 'MSCRM_CONFIG' to @Database = 'CRM001_MSCRM' (the organisation database name), as sketched after this list, and click OK.
4. Then we select the Schedules page, select the first schedule “Daily” and click the Edit button.
5. Then we choose when we want the job to run. Note: the default schedule runs every day at 17:00, but we can change the time here, along with how often the schedule occurs.
6. Then we click OK and OK again to close the DYNPERF_Capture_Stats job window.
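For reference, the edited command in step 3 ends up looking something like the sketch below. This assumes the step calls a stored procedure named sp_capturestats (the step name shown above) held in the DynamicsPerf database; only the @Database value is actually changed, and the exact procedure name and parameter may differ slightly between toolset versions:

-- Sketch of the edited job step command: capture stats for the organisation
-- database rather than MSCRM_CONFIG (procedure location is an assumption)
EXEC DynamicsPerf.dbo.sp_capturestats @Database = 'CRM001_MSCRM';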
Configure and Schedule Database Blocking Capture
We want to collect information about blocking events, so we need to configure the DYNPERF_Default_Trace_Stats job as follows:
1. We open the DYNPERF_Default_Trace_Stats job from SQL Server Agent > Jobs in SQL Server Management Studio.
2. We select the Steps page, select step 1 “Start Tracing” and click the Edit button.
3. In the command box we change the @FILE_PATH to point to our trace folder, which is D:\SQLTrace, and click OK.
4. From the General page, we need to tick the Enabled option so that the job will run.
5. From the Schedule page we can edit when the job will run; by default this job runs daily at 00:00.
6. Then we click OK and OK again to close the DYNPERF_Default_Trace_Stats job window.
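After the job has started the trace, we can check that it really is writing to D:\SQLTrace by querying the server-side traces (a standard SQL Server catalog view, not part of the toolset):

-- List active server-side traces and where they are writing their .trc files;
-- the trace started by the job should show a path under D:\SQLTrace
SELECT id, path, start_time, max_size
FROM sys.traces;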
Configure and Schedule Hourly Performance Data Capture
We can optionally capture performance data on an hourly basis, which is quite handy if we are running a number of regression tests over a period of time. If we do want to do this, then we need to make changes so that the job captures performance data for the organisation database we want to analyse, as follows:
1. We open the DYNPERF_Perfstats_Hourly job from SQL Server Agent > Jobs in SQL Server Management Studio.
2. We select the Steps page, select step 1 “CaptureStats” and click the Edit button.
3. In the command box we change @Database = 'MSCRM_CONFIG' to @Database = 'CRM001_MSCRM' (the organisation database name) and click OK.
4. Then we click OK and OK again to close the DYNPERF_Perfstats_Hourly job window.
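If we are using the hourly capture, it is worth checking later that it really is succeeding every hour, which we can do from the standard SQL Agent history tables (nothing toolset-specific here):

-- Recent outcomes for the hourly capture job (run_status: 1 = succeeded, 0 = failed)
SELECT TOP (24) h.run_date, h.run_time, h.run_status, h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE j.name = 'DYNPERF_Perfstats_Hourly'
  AND h.step_id = 0 -- job outcome rows rather than individual steps
ORDER BY h.instance_id DESC;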
Configure and Schedule Performance Counter Logging
Now we can log information about disk, CPU, memory and so on, but to do this we need to configure and schedule the performance counter logging. In our scenario we are using the default SQL instance, so we do this as follows:
1. Click Start > Run and type perfmon to start the standard Windows Performance Monitor.
2. Then we expand the Data Collector Sets, right click on User Defined and select New > Data Collector Set.
3. Then we name the collector set, say “CRM SQL Server Performance”, select “Create from a template” and click Next.
4. We then select “System Performance” from the options, click Browse, browse to the performance analyser template file, which is D:\DynamicsPerf1.20 RC0\DynamicsPerf\Windows Perfmon Scripts\Server2008_SQL_Default_Instance.xml, and click Finish.
5. Now we want to change where the performance logs are stored, so we right click on the “CRM SQL Server Performance” data collector set and click Properties.
6. Then we select the Directory tab and click Browse to select our D:\PerfLogs folder.
7. We can now schedule the performance counters to be collected by clicking on the Schedule tab and then clicking the Add button.
8. We select today's date as the beginning date and leave the other options at their defaults, so that the performance collection will run continuously without an end date, and click OK to close the dialog.
Performance Analyser Maintenance
From time to time we will also want to capture statistics manually, outside the scheduled jobs, which we can do via the Manual-CaptureStats script:
1. Open the SQL Server Management Studio on the database server in the system we want to analyse.
2. Click File > Open > Project / Solution.
3. Browse to the location on the SQL Server where we extracted the DynamicsPerf1.20.zip file, in our case this is D:\DynamicsPerf1.20 RC0.
4. Select the Performance Analyser 1.20 for Microsoft Dynamics.ssmssln file, and in the Solution Explorer we open the DynamicsPerf - Analysis Scripts\Manual-CaptureStats.sql script.
5. Then we change @DATABASE_NAME = 'DynamicsPerf' to @DATABASE_NAME = 'CRM001_MSCRM' (the organisation database name) and execute the script against the DynamicsPerf database.
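With a capture having run, we can get a feel for what has been collected by seeing which DynamicsPerf tables now contain rows (a generic check using the system catalog, since the exact table list varies between toolset versions):

-- Row counts per table in the DynamicsPerf database, largest first
SELECT t.name AS table_name, SUM(p.rows) AS row_count
FROM DynamicsPerf.sys.tables AS t
JOIN DynamicsPerf.sys.partitions AS p
    ON p.object_id = t.object_id AND p.index_id IN (0, 1)
GROUP BY t.name
ORDER BY row_count DESC;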
Running a Baseline Performance Analysis
Note: As we generally want to keep the baseline data for longer than 14 days, the purge jobs exclude anything prefixed with purge from the deletion.
As noted earlier, the real benefit of the toolset is catching bad code or environmental misconfiguration before it reaches production, so we can run a baseline performance analysis against our test/QA environment as follows:
1. Open the SQL Server Management Studio on the database server in the system we want to analyse.
2. Click File > Open > Project / Solution.
3. Browse to the location on the SQL Server where we extracted the DynamicsPerf1.20.zip file, in our case this is D:\DynamicsPerf1.20 RC0.
4. Select the Performance Analyser 1.20 for Microsoft Dynamics.ssmssln file, and in the Solution Explorer we open the DynamicsPerf - Analysis Scripts\CaptureStats in TEST.sql script.
5. Then we change @DATABASE_NAME = 'XXXXXXXXX' to @DATABASE_NAME = 'CRM001_MSCRM' (the organisation database name) and execute the script against the DynamicsPerf database.
Although not a complete step-by-step guide, this should give enough information to configure the performance analyser for Dynamics CRM 2013. Obviously the hardest part of any performance analysis and tuning exercise is actually analysing the data that is recorded and correlating any blocking events back to the code / plugins that generated them.
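As a starting point for that analysis, the SQL Trace files collected in D:\SQLTrace can be loaded straight into SQL Server with fn_trace_gettable; the file name below is just a placeholder for whichever .trc file the trace actually produced:

-- Load a captured trace file and look at the longest-running statements first,
-- which is one way to begin tying blocking/slow queries back to the CRM code
SELECT StartTime, Duration, TextData, ApplicationName, LoginName
FROM sys.fn_trace_gettable(N'D:\SQLTrace\YourTraceFile.trc', DEFAULT)
WHERE TextData IS NOT NULL
ORDER BY Duration DESC;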