Channel: Microsoft Dynamics 365 Community

Microsoft Dynamics CRM 2013 Application Lifecycle Management - Part 3

 In this part of the Microsoft Dynamics CRM 2013 Application Lifecycle Management blog series, we will look at the Data Manager. This application provides the functionality to export and import Dynamics CRM data from the command line.
The application is split into two Visual Studio projects: a simple class library containing the classes that implement the export and import operations, and a console application that exposes them.
It is always good practice to decouple the application logic from the user interface; that way, if future work requires us to change the UI to, say, a graphical version, we can do so without heavy modification to the core logic.
One of the great things that Microsoft has done with the CRM SDK libraries is to provide them as NuGet packages. This makes it a simple process to add them to a Visual Studio solution, and we can set the solution to automatically download any missing packages. It also means that if another developer picks up the Visual Studio solution, they will automatically get any dependent referenced libraries. If you have ever used TFS Online integrated with the Azure hosting platform, you will have used this and know how easy it makes checking in and publishing a project.
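For context, package restore is driven by a packages.config checked in next to each project. The package id below is the CRM SDK core assemblies NuGet package; the version number is illustrative rather than prescriptive:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Example packages.config; the version shown is illustrative only -->
<packages>
  <!-- Core CRM SDK assemblies (Microsoft.Xrm.Sdk, Microsoft.Crm.Sdk.Proxy) -->
  <package id="Microsoft.CrmSdk.CoreAssemblies" version="6.0.4" targetFramework="net45" />
</packages>
```

With package restore enabled on the solution (or a nuget restore step in the build), any developer or build agent opening the solution pulls these dependencies down automatically.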
 
Exporting the Data
 
As you will probably know, Dynamics CRM has a built-in method of importing and exporting data, so we use that as a basis. There were a number of examples in the Dynamics CRM SDK that I could follow, but none that did exactly what we need, so I made it up! In the Export method of the Data Exporter class within the Data Manager class library, we get a CRM Organisation Service based on a connection string the user has passed in, retrieve the attribute metadata for the entity (only including the attributes that are valid for create, and for update when exporting XML), and then retrieve the data. As there is a limit of 5,000 records per retrieve, we use the PagingCookie and increment the page number until all of the records have been written out to the path the user specified.
 
public static ExportStatus ExportData(DataParameters parameters, string logicalName, string filePrefix)
{
    ExportStatus result = ExportStatus.Error;
 
    // Obtain an organization service proxy.
    // The using statement assures that the service proxy will be properly disposed.
    using (OrganizationService service = CrmHelper.GetOrganizationService(parameters.CrmConnectionString))
    {
        // Get all of the entity records
        bool moreRecords = false;
        int pageNumber = 1;
        string pagingCookie = null;
 
        RetrieveEntityRequest entityRequest = new RetrieveEntityRequest();
        entityRequest.LogicalName = logicalName;
        entityRequest.EntityFilters = EntityFilters.Attributes;
        RetrieveEntityResponse entityResponse = (RetrieveEntityResponse)service.Execute(entityRequest);
 
        // Get a list of all of the column names, and the column name prefixed by src_
        // The reason we use the prefixed version in the file and code is that it makes it more
        // readable to determine whether we are referring to a source column or a target column.
        // Note: We are excluding Guids, e.g. Record Id and OwnerId, however EntityReference
        // Lookups are included
        List<string> columns = entityResponse.EntityMetadata.Attributes.Where(a => a.AttributeType != AttributeTypeCode.Virtual &&
                                                                                    a.AttributeType != AttributeTypeCode.EntityName &&
                                                                                    (a.IsValidForCreate == true ||
                                                                                    (parameters.DataFileType == DataFileTypes.XML &&
                                                                                    a.IsValidForUpdate == true)))
                                                                                    .Select(a => a.LogicalName).ToList();
 
        if (parameters.AttributeList != null)
            columns = columns.Where(c => parameters.AttributeList.Split(',').Contains(c)).ToList();
 
        do
        {
            QueryExpression query = new QueryExpression { EntityName = logicalName, ColumnSet = new ColumnSet(true)};
            query.PageInfo = new PagingInfo { PageNumber = pageNumber, PagingCookie = pagingCookie };
            query.Criteria = new FilterExpression();
            query.Criteria.AddCondition(new ConditionExpression("statecode", ConditionOperator.Equal, 0));
 
            EntityCollection entityCollection = service.RetrieveMultiple(query);
            if (entityCollection.Entities.Count > 0)
            {
                // Export the data in the appropriate format as chosen by the user
                var append = pageNumber != 1;
                if (parameters.DataFileType == DataFileTypes.CSV)
                {
                    var dataPath = "DataFiles";
                    if (parameters.Mode == Modes.Update)
                        dataPath = "DataFileUpdates";
                    WriteDataCSV(service, logicalName, entityCollection,
                                    parameters.DataFilePath + Path.DirectorySeparatorChar + dataPath +
                                    Path.DirectorySeparatorChar + filePrefix + logicalName + ".csv", columns, append);
                }
                else if (parameters.DataFileType == DataFileTypes.XML)
                {
                    WriteDataXMLSpreadsheet2003(service, logicalName, entityCollection,
                                                parameters.DataFilePath + Path.DirectorySeparatorChar + "DataFiles" +
                                                Path.DirectorySeparatorChar + filePrefix + logicalName + ".xml", columns, append);
                }
 
                // If we have more records, then increment the Page we need to get
                // and cache the paging cookie
                moreRecords = entityCollection.MoreRecords;
                if (moreRecords)
                {
                    pageNumber++;
                    pagingCookie = entityCollection.PagingCookie;
                }
 
                result = ExportStatus.Success;
            }
            else
            {
                result = ExportStatus.Success;
            }
 
        } while (moreRecords);
    }
 
    return result;
}
 
Now exporting the data is fine, but of course the reason we are doing all of this is to automate as much as possible, so that we can use Continuous Integration and automated deployment with TFS.
So how about creating the data import map automatically?
 
Well, the Dynamics CRM SDK does provide an example of doing this with complex mappings, so I simply followed the example and then extended it to be more generic. Here we get the attribute metadata for the entity we need to export, and create column mappings for the attributes that are valid for create and update.
The key thing I found with the example was that it was based on the Account entity, and mapped the Parent Account lookup back to an existing Account. The example creates two lookup map dependencies: one to the parent entity that we are looking up to, and one to the current entity. If you try the example on an entity and lookup that isn't self-referencing, it will fail. To get around this, we create the second lookup map only if the current attribute is a lookup to the same entity.
 
The other tricky thing with lookups is that the example bases the mapping on the primary name attribute. The problem you may find here is that the data in this attribute may not be unique, so to get around this we can create our own import id attribute, with an auto-number plugin on it to make sure that any record that is created gets a unique value.
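As a rough sketch of that idea (the attribute name new_importid and the use of a Guid rather than a true auto-number are illustrative assumptions, not the article's actual plugin), a plugin registered on the pre-operation Create step could stamp the import id like this:

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Illustrative sketch only: registered on the pre-operation Create step,
// this stamps a hypothetical "new_importid" attribute with a unique value,
// so that lookup mappings have a guaranteed-unique attribute to match on.
public class ImportIdPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        IPluginExecutionContext context =
            (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (context.InputParameters.Contains("Target") &&
            context.InputParameters["Target"] is Entity)
        {
            Entity entity = (Entity)context.InputParameters["Target"];

            // A Guid is the simplest way to guarantee uniqueness; a true
            // auto-number would need its own sequence store.
            if (!entity.Attributes.Contains("new_importid"))
                entity["new_importid"] = Guid.NewGuid().ToString("N");
        }
    }
}
```

This ties in with CreateDataMap below, which prefers any attribute whose logical name ends in "importid" over the primary name attribute when building lookup mappings.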
 
public static void CreateDataMap(DataParameters parameters, string logicalName)
{
    using (OrganizationService service = CrmHelper.GetOrganizationService(parameters.CrmConnectionString))
    {
        // Delete the Data Map if it already exists
        var entityId = CrmHelper.GetIdByAttribute(service, "importmap", "name", dataMapNamePrefix + logicalName);
        if (entityId != Guid.Empty)
            service.Delete("importmap", entityId);
 
        // Retrieve the metadata for the specified entity.
        RetrieveEntityRequest entityRequest = new RetrieveEntityRequest();
        entityRequest.LogicalName = logicalName;
        entityRequest.EntityFilters = EntityFilters.Attributes;
        RetrieveEntityResponse entityResponse = (RetrieveEntityResponse)service.Execute(entityRequest);
 
        // Create an import map.
        Entity importMap = new Entity("importmap");
        importMap["name"] = dataMapNamePrefix + logicalName;
        importMap["source"] = logicalName + ".csv";
        importMap["description"] = String.Format("Data Map for the {0} Entity", logicalName);
        importMap["entitiesperfile"] = new OptionSetValue((int)CrmHelper.ImportMapEntitiesPerFile.SingleEntityPerFile);
        Guid importMapId = service.Create(importMap);
 
        // Create column mappings.
        foreach (AttributeMetadata am in entityResponse.EntityMetadata.Attributes)
        {
            if (am.AttributeType != AttributeTypeCode.Virtual &&
                am.AttributeType != AttributeTypeCode.EntityName &&
                (am.IsValidForCreate == true ||
                (parameters.DataFileType == DataFileTypes.XML && am.IsValidForUpdate == true)))
            {
                // Create a column mapping
                Entity columnMapping = new Entity("columnmapping");
                columnMapping["sourceattributename"] = "src_" + am.LogicalName;
                columnMapping["sourceentityname"] = logicalName + "_1";
                columnMapping["targetattributename"] = am.LogicalName;
                columnMapping["targetentityname"] = logicalName;
                columnMapping["importmapid"] = new EntityReference(importMap.LogicalName, importMapId);
                columnMapping["processcode"] = new OptionSetValue((int)CrmHelper.ColumnMappingProcessCode.Process);
                Guid columnMappingId = service.Create(columnMapping);
 
                if (am.AttributeType == AttributeTypeCode.Lookup)
                {
                    // Because we created a column mapping of type lookup, we need to specify lookup details in a lookupmapping.                           
                    // This lookupmapping is important because without it the current record
                    // cannot be used as the parent of another record.
 
                    RetrieveEntityRequest parentRequest = new RetrieveEntityRequest();
                    parentRequest.LogicalName = ((LookupAttributeMetadata)am).Targets.FirstOrDefault();
                    parentRequest.EntityFilters = EntityFilters.Attributes;
                    RetrieveEntityResponse parentResponse = (RetrieveEntityResponse)service.Execute(parentRequest);
 
                    // If the parent has an import id attribute, then use it
                    // otherwise use the parent name
                    var lookupAttribute = parentResponse.EntityMetadata.PrimaryNameAttribute;
                    var importId = parentResponse.EntityMetadata.Attributes.Where(a => a.LogicalName.EndsWith("importid")).FirstOrDefault();
                    if (importId != null)
                        lookupAttribute = importId.LogicalName;
 
                    // Create a lookup mapping to the parent record. 
                    Entity parentLookupMapping = new Entity("lookupmapping");
                    parentLookupMapping["columnmappingid"] = new EntityReference("columnmapping", columnMappingId);
                    parentLookupMapping["processcode"] = new OptionSetValue((int)CrmHelper.LookUpMappingProcessCode.Process);
                    parentLookupMapping["lookupentityname"] = parentResponse.EntityMetadata.LogicalName;
                    parentLookupMapping["lookupattributename"] = lookupAttribute;
                    parentLookupMapping["lookupsourcecode"] = new OptionSetValue((int)CrmHelper.LookUpMappingLookUpSourceCode.System);
                    Guid parentLookupMappingId = service.Create(parentLookupMapping);
 
                    // If the lookup is a self referencing lookup, create the lookup map for the current record
                    if (parentResponse.EntityMetadata.LogicalName == entityResponse.EntityMetadata.LogicalName)
                    {
                        lookupAttribute = entityResponse.EntityMetadata.PrimaryNameAttribute;
                        importId = entityResponse.EntityMetadata.Attributes.Where(a => a.LogicalName.EndsWith("importid")).FirstOrDefault();
                        if (importId != null)
                            lookupAttribute = importId.LogicalName;
 
                        // Create a lookup mapping to the current record. 
                        Entity currentLookupMapping = new Entity("lookupmapping");
                        currentLookupMapping["columnmappingid"] = new EntityReference("columnmapping", columnMappingId);
                        currentLookupMapping["processcode"] = new OptionSetValue((int)CrmHelper.LookUpMappingProcessCode.Process);
                        currentLookupMapping["lookupattributename"] = "src_" + lookupAttribute;
                        currentLookupMapping["lookupentityname"] = logicalName + "_1";
                        currentLookupMapping["lookupsourcecode"] = new OptionSetValue((int)CrmHelper.LookUpMappingLookUpSourceCode.Source);
                        Guid currentLookupMappingId = service.Create(currentLookupMapping);
                    }
                }
                else if (am.AttributeType == AttributeTypeCode.Picklist ||
                            am.AttributeType == AttributeTypeCode.State ||
                            am.AttributeType == AttributeTypeCode.Status)
                {
                    // Use the RetrieveAttributeRequest message to retrieve
                    // an attribute by its logical name.
                    RetrieveAttributeRequest retrieveAttributeRequest = new RetrieveAttributeRequest
                    {
                        EntityLogicalName = logicalName,
                        LogicalName = am.LogicalName,
                        RetrieveAsIfPublished = true
                    };
                    RetrieveAttributeResponse retrieveAttributeResponse = (RetrieveAttributeResponse)service.Execute(retrieveAttributeRequest);
 
                    OptionSetMetadata optionSet = null;
                    if (am.AttributeType == AttributeTypeCode.Picklist)
                    {
                        PicklistAttributeMetadata retrievedAttributeMetadata = (PicklistAttributeMetadata)retrieveAttributeResponse.AttributeMetadata;
                        optionSet = retrievedAttributeMetadata.OptionSet;
                    }                           
 
                    if (am.AttributeType == AttributeTypeCode.State)
                    {
                        StateAttributeMetadata retrievedAttributeMetadata = (StateAttributeMetadata)retrieveAttributeResponse.AttributeMetadata;
                        optionSet = retrievedAttributeMetadata.OptionSet;
                    }
 
                    if (am.AttributeType == AttributeTypeCode.Status)
                    {
                        StatusAttributeMetadata retrievedAttributeMetadata = (StatusAttributeMetadata)retrieveAttributeResponse.AttributeMetadata;
                        optionSet = retrievedAttributeMetadata.OptionSet;
                    }
 
                    if (optionSet != null && optionSet.Options != null)
                    {
                        foreach (var option in optionSet.Options)
                        {
                            // Because we created a column mapping of type picklist, we need to specify picklist details in a picklistMapping
                            Entity pickListMapping = new Entity("picklistmapping");
                            pickListMapping["sourcevalue"] = option.Label.LocalizedLabels.FirstOrDefault().Label;
                            pickListMapping["targetvalue"] = option.Value;
                            pickListMapping["columnmappingid"] = new EntityReference("columnmapping", columnMappingId);
                            pickListMapping["processcode"] = new OptionSetValue((int)CrmHelper.PickListMappingProcessCode.Process);
                            Guid picklistMappingId = service.Create(pickListMapping);
                        }
                    }
                }                                            
            }
        }
    }
}
 
So now we have created our data import map within CRM, we need to export it so that it can be stored with the rest of our source and imported into other environments.
Here we look up the import map by name, create an ExportMappingsImportMapRequest, execute it, and write the returned mapping XML out to a file in the path the user specified.
 
public static ExportStatus ExportDataMap(DataParameters parameters, string logicalName, string filePrefix)
{
    ExportStatus result = ExportStatus.Error;
 
    // Obtain an organization service proxy.
    // The using statement assures that the service proxy will be properly disposed.
    using (OrganizationService service = CrmHelper.GetOrganizationService(parameters.CrmConnectionString))
    {
        var entityId = CrmHelper.GetIdByAttribute(service, "importmap", "name", dataMapNamePrefix + logicalName);
        if (entityId != Guid.Empty)
        {
            // Retrieve the xml for the mapping
            var exportRequest = new ExportMappingsImportMapRequest
            {
                ImportMapId = entityId,
                ExportIds = true,
            };
 
            // Save the mapping to a file
            var exportResponse = (ExportMappingsImportMapResponse)service.Execute(exportRequest);
            var mappingXml = exportResponse.MappingsXml;
            File.WriteAllText(parameters.DataFilePath + Path.DirectorySeparatorChar + "DataMapFiles" +
                                Path.DirectorySeparatorChar + filePrefix + logicalName + ".xml", mappingXml);
 
            result = ExportStatus.Success;
        }
    }               
            
    return result;
}      
 
Now that we have our data, we can call the appropriate method to write the data out as a CSV or XML file, depending on what the user requested. The key thing to note here is that we can use the formatted values for certain attributes, e.g. Picklists, but for, say, Money values this would cause an import failure because of the currency symbol and the number of decimal places that the import expects.
 
public static void WriteDataCSV(IOrganizationService service, string logicalName, EntityCollection items, string filenamePath, List<string> columns, bool append)
{
    if (!append && File.Exists(filenamePath))
        File.Delete(filenamePath);
 
    StreamWriter writer = new StreamWriter(filenamePath, true, Encoding.UTF8);
    string fieldSeparator = ",";
    string dataSeparator = "\"";
 
    if (!append)
        writer.WriteLine(String.Join(fieldSeparator, columns.Select(c => "src_" + c)));
 
    foreach (var entity in items.Entities)
    {
        List<string> strs = new List<string>();
        foreach (string column in columns)
        {
            string value = String.Empty;
 
            // Use the formatted value where we can, except if it is a Boolean
            if (entity.FormattedValues.Keys.Contains(column) &&
                entity[column].GetType() != typeof(Boolean) &&
                entity[column].GetType() != typeof(Money))
            {
                value = entity.GetFormattedAttributeValue(column);                       
            }
            else if (entity.Attributes.Keys.Contains(column))
            {
                // Determine the type and choose the appropriate Name or Value.
                // Note: for EntityReference, the import map contains a mapping
                // to find the correct entity based on its PrimaryNameAttribute.
                // Obviously the record that we are looking up to has to exist
                var kvp = entity.Attributes.First<KeyValuePair<string, object>>((KeyValuePair<string, object> k) => k.Key == column);
                Type type = kvp.Value.GetType();
                if (type == typeof(EntityReference))
                    value = GetEntityReferenceImportId(service, logicalName, (EntityReference)kvp.Value, ((EntityReference)kvp.Value).Name);
                else if (type == typeof(Money))
                    value = ((Money)kvp.Value).Value.ToString();
                else if (type != typeof(AliasedValue))
                    value = (type != typeof(OptionSetValue) ? kvp.Value.ToString() : ((OptionSetValue)kvp.Value).Value.ToString());
                else
                    value = ((AliasedValue)kvp.Value).Value.ToString();
            }
 
            if (String.IsNullOrEmpty(value))
            {
                strs.Add(String.Concat(dataSeparator, "", dataSeparator));
            }
            else
            {
                if (value.Contains(dataSeparator))
                    value = value.Replace("\"", "\"\"");
 
                strs.Add(String.Concat(dataSeparator, value, dataSeparator));
            }
        }
        writer.WriteLine(String.Join(fieldSeparator, strs));
    }
    writer.Flush();
    writer.Close();
}
 
Importing the Data
 
The Dynamics CRM SDK provides a great example of how to import data, and this was followed for the Import method of the Data Importer class within the Data Manager class library. First we need to import the data import map: we read in all of the text from the file, get a CRM Organisation Service based on a connection string the user has passed in, create and execute an ImportMappingsImportMapRequest, and then, as a simple validation, retrieve the data import map and check that the mapping data was uploaded OK.
 
public static ImportResult ImportDataMap(DataParameters parameters, string dataFilename)
{
    ImportResult result = new ImportResult { Status = ImportStatus.Error, Errors = "" };
 
    var mappingXml = File.ReadAllText(dataFilename);
 
    // Obtain an organization service proxy.
    // The using statement assures that the service proxy will be properly disposed.
    using (var service = CrmHelper.GetOrganizationService(parameters.CrmConnectionString, new TimeSpan(1, 0, 0)))
    {
        // Delete the Data Map if it already exists
        XDocument document = XDocument.Parse(mappingXml);
        var name = document.Descendants("Map").Select(map => map.Attribute("Name").Value).ToList().FirstOrDefault();
        var entityId = CrmHelper.GetIdByAttribute(service, "importmap", "name", name);
        if (entityId != Guid.Empty)
            service.Delete("importmap", entityId);
 
        // Create the import mapping from the XML
        var request = new ImportMappingsImportMapRequest
        {
            MappingsXml = mappingXml,
            ReplaceIds = true,
        };
 
        var response = (ImportMappingsImportMapResponse) service.Execute(request);
        var importedMapId = response.ImportMapId;
 
        // Retrieve the value for validation
        var exportRequest = new ExportMappingsImportMapRequest
        {
            ImportMapId = importedMapId,
            ExportIds = true,
        };
 
        var exportResponse = (ExportMappingsImportMapResponse) service.Execute(exportRequest);
        var importedMappingXml = exportResponse.MappingsXml;
 
        // Compare the result to ensure the import was successful
        if (importedMappingXml != String.Empty)
            result.Status = ImportStatus.Success;
    }
 
    return result;
}
 
Now to import the actual data, we use the filename of the data being imported to find the data import map we need, create an import file based on the file type and data import map id, read in all of the data, set up the delimiters, and create the import record within CRM. After we have created the record, we need to trigger the various stages that importing through the GUI does for us. So we execute a ParseImportRequest and wait for it to complete. Then we execute a TransformImportRequest, which uses the column mappings in the data import map to substitute in our lookup entity reference ids based on the primary name attribute or our own import id attribute, and again wait for the process to complete.
 
Finally we execute the ImportRecordsImportRequest to actually import the records into CRM. After we have waited for the import to complete, we can retrieve any errors that occurred during the process and present them back to the user so that they know which records failed and why.
 
public static ImportResult ImportData(DataParameters parameters, string dataFilename)
{
    ImportResult result = new ImportResult { Status = ImportStatus.Error, Errors = "" };           
 
    // Obtain an organization service proxy.
    // The using statement assures that the service proxy will be properly disposed.
    using (OrganizationService service = CrmHelper.GetOrganizationService(parameters.CrmConnectionString, new TimeSpan(1, 0, 0)))
    {
        // Try to find the Import Map, based on the filename and our prefix
        var importName = Path.GetFileNameWithoutExtension(dataFilename);
        if (importName.IndexOf(" ") >= 0)
            importName = importName.Substring(importName.IndexOf(" ") + 1);
        var importMapId = CrmHelper.GetIdByAttribute(service, "importmap", "name", dataMapNamePrefix + importName);
                
        // Determine the file type from the parameters and read in the file contents
        CrmHelper.ImportFileFileTypeCode fileType = CrmHelper.ImportFileFileTypeCode.CSV;
        string importContents = File.ReadAllText(dataFilename);
        if (parameters.DataFileType == DataFileTypes.CSV)
            fileType = CrmHelper.ImportFileFileTypeCode.CSV;
        else if (parameters.DataFileType == DataFileTypes.XML)
            fileType = CrmHelper.ImportFileFileTypeCode.XMLSpreadsheet2003;
 
        if (importContents != String.Empty)
        {
            // Create Import
            Entity import = new Entity("import");
 
            if (parameters.Mode == Modes.Create)
                import["modecode"] = new OptionSetValue(0);
            else if (parameters.Mode == Modes.Update)
                import["modecode"] = new OptionSetValue(1);
 
            import["name"] = "Importing Data";
 
            Guid importId = service.Create(import);
 
            // Create Import File
            Entity importFile = new Entity("importfile");
            importFile["content"] = importContents;
            importFile["name"] = String.Format("{0} Import", importName);
            importFile["isfirstrowheader"] = true;
 
            if (importMapId != Guid.Empty)
            {
                // Retrieve the xml for the mapping
                var exportRequest = new ExportMappingsImportMapRequest { ImportMapId = importMapId, ExportIds = true };
                var exportResponse = (ExportMappingsImportMapResponse)service.Execute(exportRequest);
                var mappingXml = exportResponse.MappingsXml;
 
                // Parse the mapping file to find the source and target entities
                XDocument document = XDocument.Parse(mappingXml);
                var sourceEntityName = document.Descendants("EntityMap").Select(entityMap => entityMap.Attribute("SourceEntityName").Value).ToList().FirstOrDefault();
                var targetEntityName = document.Descendants("EntityMap").Select(entityMap => entityMap.Attribute("TargetEntityName").Value).ToList().FirstOrDefault();
 
                importFile["importmapid"] = new EntityReference("importmap", importMapId);
                importFile["usesystemmap"] = false;
 
                importFile["sourceentityname"] = sourceEntityName;
                importFile["targetentityname"] = targetEntityName;
            }
 
            importFile["source"] = Path.GetFileName(dataFilename);
            importFile["size"] = importContents.Length.ToString();                  
 
            importFile["importid"] = new EntityReference("import", importId);
            importFile["enableduplicatedetection"] = true;
 
            importFile["filetypecode"] = new OptionSetValue((int)fileType);
            importFile["fielddelimitercode"] = new OptionSetValue((int)CrmHelper.ImportFileFieldDelimiterCode.Comma);
            importFile["datadelimitercode"] = new OptionSetValue((int)CrmHelper.ImportFileDataDelimiterCode.DoubleQuote);
            importFile["processcode"] = new OptionSetValue((int)CrmHelper.ImportFileProcessCode.Process);
 
            // Get the current user to set as record owner
            WhoAmIRequest systemUserRequest = new WhoAmIRequest();
            WhoAmIResponse systemUserResponse = (WhoAmIResponse)service.Execute(systemUserRequest);
            importFile["recordsownerid"] = new EntityReference("systemuser", systemUserResponse.UserId);
 
            Guid importFileId = service.Create(importFile);
 
            // Parse the import file
            ParseImportRequest parseImportRequest = new ParseImportRequest()
            {
                ImportId = importId
            };
            ParseImportResponse parseImportResponse = (ParseImportResponse)service.Execute(parseImportRequest);
            ImportStep = "Waiting for Parse Async Job to complete";
            CrmHelper.WaitForAsyncJobCompletion(service, parseImportResponse.AsyncOperationId);
 
            // Transform the import
            TransformImportRequest transformImportRequest = new TransformImportRequest()
            {
                ImportId = importId
            };
            TransformImportResponse transformImportResponse = (TransformImportResponse)service.Execute(transformImportRequest);
            ImportStep = "Waiting for Transform Async Job to complete";
            CrmHelper.WaitForAsyncJobCompletion(service, transformImportResponse.AsyncOperationId);
 
            // Upload the records
            ImportRecordsImportRequest importRequest = new ImportRecordsImportRequest()
            {
                ImportId = importId
            };
            ImportRecordsImportResponse importResponse = (ImportRecordsImportResponse)service.Execute(importRequest);
            ImportStep = "Waiting for ImportRecords Async Job to complete";
            CrmHelper.WaitForAsyncJobCompletion(service, importResponse.AsyncOperationId);
 
            result.Status = ImportStatus.Success;
            result.Errors = GetImportErrors(service, importFileId);
 
            ImportStep = null;
        }
    }
 
    return result;
}
 
Importing data files is fine, but some records, e.g. Mail Merge Templates, are created as part of a solution, so we may need a way to update those solution-created records with a number of attribute values, or we may just need to update a number of records after all of the other data files have been created.
To do this, we could use the XML format and chain the import after the initial create import has completed, or alternatively we may want the whole batch to be completed in one go. To achieve this I created another import method called ImportDataUpdates, which takes the same CSV format file that the Export method can create, but calls a standard Update on the Organisation Service.
 
public static ImportResult ImportDataUpdates(DataParameters parameters, string dataFilename)
{
    ImportResult result = new ImportResult { Status = ImportStatus.Success, Errors = "" };
 
    // Obtain an organization service proxy.
    // The using statement assures that the service proxy will be properly disposed.
    using (OrganizationService service = CrmHelper.GetOrganizationService(parameters.CrmConnectionString, new TimeSpan(1, 0, 0)))
    {
        // Try to find the Import Entity Name, based on the filename and our prefix
        var entityName = Path.GetFileNameWithoutExtension(dataFilename);
        if (entityName.IndexOf(" ") >= 0)
            entityName = entityName.Substring(entityName.IndexOf(" ") + 1);
 
        string csvContents = File.ReadAllText(dataFilename);
        CSVHelper csv = new CSVHelper(csvContents, ",", "\"", true);
        if (csv.Header != null)
        {
            RetrieveEntityRequest entityRequest = new RetrieveEntityRequest();
            entityRequest.LogicalName = entityName;
            entityRequest.EntityFilters = EntityFilters.Attributes;
            RetrieveEntityResponse entityResponse = (RetrieveEntityResponse)service.Execute(entityRequest);
            AttributeMetadata[] attributes = entityResponse.EntityMetadata.Attributes;
 
            List<Entity> entities = new List<Entity>();
 
            foreach (KeyValuePair<string, string>[] line in csv)
            {
                Entity entity = new Entity(entityName);
                foreach (KeyValuePair<string, string> createAttribute in line)
                {
                    var attribute = attributes.Where(a => "src_" + a.LogicalName == createAttribute.Key).FirstOrDefault();
                    if (createAttribute.Value != String.Empty && attribute != null)
                    {
                        if (attribute.AttributeType == AttributeTypeCode.Lookup)
                        {
                            var parentEntityRef = CrmHelper.GetLookupEntityReference(service, attribute, createAttribute.Value);
                            if (parentEntityRef != null)
                                entity[attribute.LogicalName] = parentEntityRef;
                        }
                        else if (attribute.AttributeType == AttributeTypeCode.Boolean)
                            entity[attribute.LogicalName] = createAttribute.Value == Boolean.TrueString;
                        else if (attribute.AttributeType == AttributeTypeCode.String ||
                                 attribute.AttributeType == AttributeTypeCode.Memo)
                            entity[attribute.LogicalName] = createAttribute.Value;
                        else if (attribute.AttributeType == AttributeTypeCode.Picklist)
                        {
                            var optionSetValue = CrmHelper.GetOptionSetValue(service, entityName, attribute, createAttribute.Value);
                            if (optionSetValue != null)
                                entity[attribute.LogicalName] = new OptionSetValue((int)optionSetValue);
                        }
                        else if (attribute.AttributeType == AttributeTypeCode.Money)
                            entity[attribute.LogicalName] = new Money(Decimal.Parse(createAttribute.Value));
                        else if (attribute.AttributeType == AttributeTypeCode.DateTime)
                            entity[attribute.LogicalName] = DateTime.Parse(createAttribute.Value);
                    }
                }
 
                entities.Add(entity);
            }
 
            foreach (var entity in entities)
            {
                var importAttribute = attributes.Where(a => a.LogicalName.EndsWith("importid")).FirstOrDefault();
                if (importAttribute != null && entity.Contains(importAttribute.LogicalName))
                    entity.Id = CrmHelper.GetIdByAttribute(service, entity.LogicalName, importAttribute.LogicalName, (string)entity[importAttribute.LogicalName]);
                else
                    entity.Id = CrmHelper.GetIdByAttribute(service, entity.LogicalName, entityResponse.EntityMetadata.PrimaryNameAttribute,
                                                            (string)entity[entityResponse.EntityMetadata.PrimaryNameAttribute]);
 
                service.Update(entity);
            }
        }
    }
 
    return result;
}
 
 
Bringing it all together
 
Now that we have our core classes and library which encapsulate the CRM-specific functionality, we can move on to a user interface. I'm from the days when everything was command-line driven (in fact, it still is if you use *nix), and a command-line tool is exactly what we need with TFS 2013!
In our command-line application, we define a number of arguments that we can accept, and display a helpful usage message if they haven't been provided correctly. We then switch on the main operation type, e.g. Export, Import etc. Generally, operations will be performed independently of each other. For example, as part of a developer's post-build step in Visual Studio we will probably want to run the export operation, to ensure that we extract all of the Dynamics CRM customisations so that they can be included in the developer's TFS check-in; whereas the import operation will form part of the TFS automated build / deploy script, which we will probably schedule to run at 2am, 11am and 3pm.
For exporting, we simply call the Data Exporter's Export method with parameters such as where the file should be exported to, along with the list of entities that we want to export; a trailing * wildcard matches an entity-name prefix, and all exports everything.
 
case OperationTypes.Export:
    ConsoleHelper.WriteLine("Exporting data from CRM...", ConsoleColor.White);
 
    DataExporter.ExportStatus exportStatus = DataExporter.ExportStatus.Error;
    if (parameters.Type == Types.Data)
    {
        if (parameters.Mode == Modes.Update)
        {
            // Ensure that the Data File Updates folder is clean
            string dir = parameters.DataFilePath + "DataFileUpdates" + Path.DirectorySeparatorChar;
            Utilities.Directory.DeleteAllFiles(dir, validFileExtensions);
        }
        else
        {
            // Ensure that the Data Map Files folder is clean
            string dir = parameters.DataFilePath + "DataMapFiles" + Path.DirectorySeparatorChar;
            Utilities.Directory.DeleteAllFiles(dir, validFileExtensions);
 
            // Ensure that the Data Files folder is clean
            dir = parameters.DataFilePath + "DataFiles" + Path.DirectorySeparatorChar;
            Utilities.Directory.DeleteAllFiles(dir, validFileExtensions);
        }
 
        // The entity list is a comma separated list
        var entities = parameters.EntityList.Split(',');
        var counter = 1;
 
        // If we have been asked to export all, then get all of the entity
        // logical names
        if (entities.Length > 0 && entities[0] == "all")
            entities = DataExporter.GetAllEntitiesLogicalName(parameters, String.Empty);
        else
        {
            List<string> newEntities = new List<string>();
            foreach (var entity in entities)
            {
                if (entity.IndexOf("*") > 0)
                    newEntities.AddRange(DataExporter.GetAllEntitiesLogicalName(parameters, entity.Replace("*", "")));
                else
                    newEntities.Add(entity);
            }
            entities = newEntities.ToArray();
        }
 
        // For each entity, export the data map which contains all of the relevant
        // lookup mappings etc.
        // prefix the filename with the counter, 1 account.csv so that we can import
        // the files back in, in the order the user specified in the entity list
        foreach (var entity in entities)
        {
            bool error = false;
            if (parameters.Mode == Modes.Update)
            {
                ConsoleHelper.WriteLine(String.Format("Exporting {0} Data Updates from CRM...", entity), ConsoleColor.White);
 
                var entityExportStatus = DataExporter.ExportData(parameters, entity, String.Format("{0} ", counter.ToString("000")));
                if (entityExportStatus != DataExporter.ExportStatus.Success)
                    error = true;
            }
            else
            {
                ConsoleHelper.WriteLine(String.Format("Exporting {0} Data Map from CRM...", entity), ConsoleColor.White);
 
                DataExporter.CreateDataMap(parameters, entity);
                var mapExportStatus = DataExporter.ExportDataMap(parameters, entity, String.Format("{0} ", counter.ToString("000")));
 
                ConsoleHelper.WriteLine(String.Format("Exporting {0} Data from CRM...", entity), ConsoleColor.White);
 
                var entityExportStatus = DataExporter.ExportData(parameters, entity, String.Format("{0} ", counter.ToString("000")));
                if (mapExportStatus != DataExporter.ExportStatus.Success || entityExportStatus != DataExporter.ExportStatus.Success)
                    error = true;
            }
 
            if (!error)
            {
                ConsoleHelper.WriteLine("[DONE]", ConsoleColor.Green);
            }
            else
            {
                ConsoleHelper.WriteLine();
                ConsoleHelper.WriteError("[ERROR]");
                ConsoleHelper.WriteError(String.Format("Entity {0} failed to export", entity));
            }
 
            counter++;
        }
    }                           
    else if (parameters.Type == Types.DataMap)
    {
        // Ensure that the Data Map Files folder is clean
        string dir = parameters.DataFilePath + "DataMapFiles" + Path.DirectorySeparatorChar;
        Utilities.Directory.DeleteAllFiles(dir, validFileExtensions);
 
        // Export the Data Map specified
        exportStatus = DataExporter.ExportDataMap(parameters, parameters.DataMapName, "");
        if (exportStatus == DataExporter.ExportStatus.Success)
        {
            ConsoleHelper.WriteLine("[DONE]", ConsoleColor.Green);
        }
        else
        {
            ConsoleHelper.WriteLine();
            ConsoleHelper.WriteError("[ERROR]");
            ConsoleHelper.WriteError(String.Format("Data Map {0} not found", parameters.DataMapName));
        }
    }
    break;
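The wildcard expansion used above can be illustrated in isolation. In this sketch, a hard-coded array stands in for DataExporter.GetAllEntitiesLogicalName (the entity names below are made up purely for demonstration):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class WildcardExpansionDemo
{
    // Stand-in for DataExporter.GetAllEntitiesLogicalName: all known logical
    // names, filtered by prefix below (sample names only).
    static readonly string[] AllEntities = { "account", "contact", "new_setting", "new_template" };

    public static string[] Expand(string entityList)
    {
        var entities = entityList.Split(',');

        // "all" exports everything
        if (entities.Length > 0 && entities[0] == "all")
            return AllEntities;

        var expanded = new List<string>();
        foreach (var entity in entities)
        {
            if (entity.IndexOf("*") > 0)
            {
                // "new_*" expands to every logical name starting with "new_"
                var prefix = entity.Replace("*", "");
                expanded.AddRange(AllEntities.Where(e => e.StartsWith(prefix, StringComparison.Ordinal)));
            }
            else
                expanded.Add(entity);
        }
        return expanded.ToArray();
    }

    public static void Main()
    {
        Console.WriteLine(string.Join(",", Expand("account,new_*")));
        // -> account,new_setting,new_template
        Console.WriteLine(string.Join(",", Expand("all")));
        // -> account,contact,new_setting,new_template
    }
}
```

Because expansion happens before the export loop, wildcard entities are still numbered with the same sequential prefix as explicitly listed ones, so they import back in a deterministic order.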
 
 
For importing, we will have a number of settings data files as part of our overall enterprise solution, so we simply use the GetFiles method of the standard .NET Directory class to give us an enumerable list of all of the data filenames that we need to import. We can take advantage of a fundamental of computing: if we prefix each filename with a zero-padded numeric value, the files will sort in ascending order, meaning that we can easily control which data files are imported first.
We import each file, creating a background worker thread which keeps the user interface responsive, displays a basic progress bar showing how the import is getting on, and reports any errors back to the user.
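Note that this ordering only holds reliably when the numeric prefixes are zero-padded, which is why the export code formats its counter with ToString("000"). A minimal standalone sketch:

```csharp
using System;
using System.Linq;

class PrefixOrderingDemo
{
    static void Main()
    {
        // Zero-padded prefixes (as produced by counter.ToString("000")) sort
        // lexicographically in the same order as their numeric values.
        var padded = new[] { "010 contact.csv", "001 account.csv", "002 lead.csv" };
        Console.WriteLine(string.Join("|", padded.OrderBy(f => f, StringComparer.Ordinal)));
        // -> 001 account.csv|002 lead.csv|010 contact.csv

        // Without padding, lexicographic order diverges from numeric order:
        // "10" sorts before "2".
        var unpadded = new[] { "10 contact.csv", "1 account.csv", "2 lead.csv" };
        Console.WriteLine(string.Join("|", unpadded.OrderBy(f => f, StringComparer.Ordinal)));
        // -> 1 account.csv|10 contact.csv|2 lead.csv
    }
}
```

Also bear in mind that Directory.GetFiles does not guarantee any particular order, so sorting the returned filenames explicitly before processing is a sensible safeguard.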
 
case OperationTypes.Import:
 
    string[] types;
    if (parameters.Type == Types.Data && parameters.Mode == Modes.Create)
        types = new string[] { Types.DataMap, Types.Data };
    else if (parameters.Type == Types.DataMap)
        types = new string[] { Types.DataMap };
    else
        types = new string[] { Types.Data };
 
    foreach (var type in types)
    {
        // Set up the correct text for the type
        var typeText = type == Types.DataMap ? "Data Map" : "Data";
 
        // Determine which Type we need to import, and choose the correct
        // directory
        string dir = parameters.DataFilePath;
        if (type == Types.Data && parameters.Mode == Modes.Create)
            dir += "DataFiles" + Path.DirectorySeparatorChar;
        else if (type == Types.Data && parameters.Mode == Modes.Update)
            dir += "DataFileUpdates" + Path.DirectorySeparatorChar;
        else if (type == Types.DataMap)
            dir += "DataMapFiles" + Path.DirectorySeparatorChar;
 
        // Get a list of all of the CSV and XML files in the directory.
        // if the files are prefixed by 1,2,3 etc. we will get them back
        // in the order we need to process them in
        var fileEntries = Utilities.Directory.GetFiles(dir, validFileExtensions);
        if (fileEntries.Length == 0)
        {
            ConsoleHelper.WriteWarning(String.Format("No valid {0} files found in {1}", typeText, parameters.DataFilePath));
            ConsoleHelper.WriteLine("[DONE]", ConsoleColor.Green);
            break;
        }
 
        // Import each of the files
        foreach (string dataFilename in fileEntries)
        {
            ConsoleHelper.WriteLine(String.Format("Importing {0} file {1} to CRM...", typeText, dataFilename), ConsoleColor.White);
 
            DataImporter.ImportResult importResult = new DataImporter.ImportResult();
            if (type == Types.Data && parameters.Mode == Modes.Create)
            {
                previousImportStep = String.Empty;
                progressChecker = new Thread(() => GetProgress());
                progressChecker.Start();
 
                importResult = DataImporter.ImportData(parameters, dataFilename);
 
                progressChecker.Abort();
            }
            else if (type == Types.Data && parameters.Mode == Modes.Update)
            {
                importResult = DataImporter.ImportDataUpdates(parameters, dataFilename);                                       
            }
            else if (type == Types.DataMap)
                importResult = DataImporter.ImportDataMap(parameters, dataFilename);
 
            if (importResult.Status == DataImporter.ImportStatus.Success)
            {
                ConsoleHelper.WriteLine();
                ConsoleHelper.WriteLine("[DONE]", ConsoleColor.Green);
 
                if (importResult.Errors != String.Empty)
                    throw new Exception(importResult.Errors);
            }
            else
                throw new Exception(String.Format("Import of file {0} failed", dataFilename));
        }
    }
    break;
 
So that just about finishes off part 3 of the Microsoft Dynamics CRM 2013 Application Lifetime Management blog series; I hope you have found it useful!
Next time… We enter the grid… TFS 2013 is the Tron MCP and we see how it controls things.
 
@simonjen1
