As part of any application development it is always good to load test various parts to ensure that the application will perform under certain load scenarios, and this is something I have recently been working on with Luke Barfield (Head of QA) at TSG. It’s not always possible to test every part of an application, e.g. the user interface, but generally you should be able to load test the server side code, which in CRM 2013 means the Plugins and Workflow assemblies.
So what options do we have?
Well, for Microsoft Dynamics CRM 2011 we had the CRM Performance Toolkit, available from Pinpoint here. Unfortunately, at the time of writing this blog a CRM 2013 version doesn’t exist, but the 2011 version can still be used. The performance toolkit is quite in depth, but it does require Visual Studio Ultimate to run some of the load tests.
As an alternative there are a couple of other options we can use. We can do a bulk data import via the GUI, which gives us a nice and easy way to trigger things off. Another option is to use the web services with a normal Create or an ExecuteMultipleRequest, and to add some load we can use a Thread Pool, as shown in the examples below:
public static void Run(LoadTesterParameters parameters, string dataFilename)
{
List<Entity> entities = new List<Entity>();
// Obtain an organization service proxy.
// The using statement assures that the service proxy will be properly disposed.
using (OrganizationService service = CrmHelper.GetOrganizationService(parameters.CrmConnectionString))
{
// Try to find the Import Entity Name, based on the filename and our prefix
var entityName = Path.GetFileNameWithoutExtension(dataFilename);
if (entityName.IndexOf(" ") >= 0)
entityName = entityName.Substring(entityName.IndexOf(" ") + 1);
string csvContents = File.ReadAllText(dataFilename);
CSVHelper csv = new CSVHelper(csvContents, ",", "", true);
if (csv.Header != null)
{
RetrieveEntityRequest entityRequest = new RetrieveEntityRequest();
entityRequest.LogicalName = entityName;
entityRequest.EntityFilters = EntityFilters.Attributes;
RetrieveEntityResponse entityResponse = (RetrieveEntityResponse)service.Execute(entityRequest);
AttributeMetadata[] attributes = entityResponse.EntityMetadata.Attributes;
foreach (KeyValuePair<string, string>[] line in csv)
{
Entity entity = new Entity(entityName);
foreach (KeyValuePair<string, string> createAttribute in line)
{
var attribute = attributes.Where(a => a.LogicalName == createAttribute.Key).FirstOrDefault();
if (createAttribute.Value != String.Empty && attribute != null)
{
if (attribute.AttributeType == AttributeTypeCode.String)
entity[createAttribute.Key] = createAttribute.Value;
if (attribute.AttributeType == AttributeTypeCode.Picklist)
entity[createAttribute.Key] = new OptionSetValue(Int32.Parse(createAttribute.Value));
if (attribute.AttributeType == AttributeTypeCode.Money)
entity[createAttribute.Key] = new Money(Decimal.Parse(createAttribute.Value));
if (attribute.AttributeType == AttributeTypeCode.DateTime)
entity[createAttribute.Key] = DateTime.Parse(createAttribute.Value);
}
}
entities.Add(entity);
}
CreateEntities(service, entities);
}
}
}
static void CreateEntity(OrganizationService service, Entity entity)
{
Guid entityId = service.Create(entity);
}
static void CreateEntities(OrganizationService service, List<Entity> entities)
{
// Create the threads
for (int i = 0; i < entities.Count; i++)
{
var e = entities[i];
ThreadPool.QueueUserWorkItem(new WaitCallback(delegate(object state) { CreateEntity(service, e); }), null);
}
int numberOfThreads = 0;
int availableThreads = 0;
int completionPortThreads = 0;
ThreadPool.GetMaxThreads(out numberOfThreads, out completionPortThreads);
ThreadPool.GetAvailableThreads(out availableThreads, out completionPortThreads);
while (availableThreads != numberOfThreads)
{
Thread.Sleep(100);
ThreadPool.GetMaxThreads(out numberOfThreads, out completionPortThreads);
ThreadPool.GetAvailableThreads(out availableThreads, out completionPortThreads);
}
}
To explain the code: we have three methods, with the application starting as a command line app which calls the Run method.
We pass in a number of parameters and a data file containing the data we want to load our system with. We parse the simple CSV data into a list of entities we want to create. We then queue each Create onto the Thread Pool, which runs it on an available thread or spins up a new one if none is free. Note: we could use a foreground thread via Thread.Start(), but in our case a background thread should be fine.
We then wait for all of the threads to complete, and we are done.
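For reference, the calling code is just a thin console wrapper around this. The sketch below is illustrative only; LoadTesterParameters comes from the examples above, but the assumption that it exposes a settable CrmConnectionString property and is populated straight from the command line arguments is mine:
static void Main(string[] args)
{
// Assumed usage: LoadTester.exe "<crm connection string>" "<data file>.csv"
var parameters = new LoadTesterParameters
{
CrmConnectionString = args[0]
};
// The CSV filename doubles as the entity logical name, as parsed in the Run method above
Run(parameters, args[1]);
}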
An example of using the ExecuteMultipleRequest can be seen below:
public static string[] Run(LoadTesterParameters parameters, string dataFilename, int batchSize)
{
List<string> errors = new List<string>();
// Obtain an organization service proxy.
// The using statement assures that the service proxy will be properly disposed.
using (OrganizationService service = CrmHelper.GetOrganizationService(parameters.CrmConnectionString))
{
// Try to find the Import Entity Name, based on the filename and our prefix
var entityName = Path.GetFileNameWithoutExtension(dataFilename);
if (entityName.IndexOf(" ") >= 0)
entityName = entityName.Substring(entityName.IndexOf(" ") + 1);
string csvContents = File.ReadAllText(dataFilename);
CSVHelper csv = new CSVHelper(csvContents, ",", "", true);
List<List<Entity>> batches = new List<List<Entity>>();
if (csv.Header != null)
{
RetrieveEntityRequest entityRequest = new RetrieveEntityRequest();
entityRequest.LogicalName = entityName;
entityRequest.EntityFilters = EntityFilters.Attributes;
RetrieveEntityResponse entityResponse = (RetrieveEntityResponse)service.Execute(entityRequest);
AttributeMetadata[] attributes = entityResponse.EntityMetadata.Attributes;
List<Entity> entities = new List<Entity>();
foreach (KeyValuePair<string, string>[] line in csv)
{
Entity entity = new Entity(entityName);
foreach (KeyValuePair<string, string> createAttribute in line)
{
var attribute = attributes.Where(a => a.LogicalName == createAttribute.Key).FirstOrDefault();
if (createAttribute.Value != String.Empty && attribute != null)
{
if (attribute.AttributeType == AttributeTypeCode.String)
entity[createAttribute.Key] = createAttribute.Value;
if (attribute.AttributeType == AttributeTypeCode.Picklist)
entity[createAttribute.Key] = new OptionSetValue(Int32.Parse(createAttribute.Value));
if (attribute.AttributeType == AttributeTypeCode.Money)
entity[createAttribute.Key] = new Money(Decimal.Parse(createAttribute.Value));
if (attribute.AttributeType == AttributeTypeCode.DateTime)
entity[createAttribute.Key] = DateTime.Parse(createAttribute.Value);
}
}
entities.Add(entity);
if (entities.Count > (batchSize - 1))
{
batches.Add(entities);
entities = new List<Entity>();
}
}
// Add any remaining entities as a final, partial batch
if (entities.Count > 0)
batches.Add(entities);
}
foreach (var batch in batches)
{
// Create an ExecuteMultipleRequest object.
var requestWithResults = new ExecuteMultipleRequest()
{
// Assign settings that define execution behavior: continue on error, return responses.
Settings = new ExecuteMultipleSettings()
{
ContinueOnError = false,
ReturnResponses = true
},
// Create an empty organization request collection.
Requests = new OrganizationRequestCollection()
};
foreach (var entity in batch)
{
CreateRequest createRequest = new CreateRequest { Target = entity };
requestWithResults.Requests.Add(createRequest);
}
// Execute all the requests in the request collection using a single web method call.
ExecuteMultipleResponse responseWithResults = (ExecuteMultipleResponse)service.Execute(requestWithResults);
// Display the results returned in the responses.
foreach (var responseItem in responseWithResults.Responses)
{
// An error has occurred.
if (responseItem.Fault != null)
{
var organizationRequest = requestWithResults.Requests[responseItem.RequestIndex];
errors.Add(String.Format("A fault occurred when processing {1} request, at index {0} in the request collection with a fault message: {2}", responseItem.RequestIndex + 1,
organizationRequest.RequestName, responseItem.Fault.Message));
}
}
}
}
return errors.ToArray();
}
You could obviously use this in combination with the thread pool example to get a finer grained load test application to meet your requirements.
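For example, one way of combining the two is to queue each ExecuteMultipleRequest batch onto the Thread Pool instead of executing the batches sequentially. This is only a rough sketch based on the methods above; ExecuteBatch is a hypothetical helper that wraps the ExecuteMultipleRequest code from the second example for a single batch:
static void CreateBatches(OrganizationService service, List<List<Entity>> batches)
{
foreach (var batch in batches)
{
var b = batch;
// Each batch becomes one ExecuteMultipleRequest, executed on a pool thread
ThreadPool.QueueUserWorkItem(delegate(object state) { ExecuteBatch(service, b); }, null);
}
// Then wait for the pool to drain, as in the CreateEntities method above
}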
By adding some Stopwatches to the example you can get an indication of how long the Creates are taking, and tune Plugins and Workflow assemblies accordingly.
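For instance, wrapping the Create call from the first example in a Stopwatch (from System.Diagnostics) gives a rough per-record timing; writing the result to the console is just for illustration, you could log it wherever suits:
static void CreateEntity(OrganizationService service, Entity entity)
{
var stopwatch = Stopwatch.StartNew();
Guid entityId = service.Create(entity);
stopwatch.Stop();
// The elapsed time includes any synchronous Plugins and Workflow activities registered on Create
Console.WriteLine("Created {0} {1} in {2}ms", entity.LogicalName, entityId, stopwatch.ElapsedMilliseconds);
}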
I hope you find this useful!
@simonjen1