
Resolved | NAM | CRM Online | A small number of customers in North America were unable to use CRM Online on August 20, 2014


On Wednesday, August 20th 2014, at approximately 5:38 PM PDT, some customers in the North American region may have been unable to sign in to their CRM Online organization(s). The incident was discovered through internal monitoring and mitigated by Dynamics Service Engineering using documented procedures. The service was fully restored by 5:49 PM PDT. This issue affected <10% of CRM Online customers hosted in the region.

We sincerely apologize if this issue affected you in any way. A post incident report will be published within 5 business days.


Creating a New Organization in CRM 2013 SP1 without Service enhancements already enabled

Introduction: After installing SP1 on an existing CRM 2013 on-premises server, we need to enable the customer service enhancement features for each Org from Settings –> Administration –>...(read more)

Post Incident Report | Resolved | North America | CRM Online | Some CRM Online organizations were disabled unexpectedly


Summary

 

On 15th August 2014 at 6:55 AM Pacific time, a number of customers hosted in our North American data centers began to experience problems accessing their CRM Online organization. Microsoft engineers began investigating and determined that a routine maintenance task had affected customers' ability to log in to their organizations. A large number of customers were brought back online within a few hours of the issue beginning; however, it took until August 17th at 3:00 AM to restore access to every affected organization.

 

 

Customer Impact

Impacted customers could not sign in to their CRM Online Organization. The majority saw an error message stating their Organization was disabled, and a small number saw a "Page Cannot Be Found" or a similar error message.  

 

Incident Start Date and Time

August 15th, 2014 6:55 AM Pacific time

 

Date and Time Service was Restored

August 17th, 2014 3:00 AM Pacific time

 

Root Cause

The problem was caused by an operator error and a failure to capture an exception in a routine maintenance task. An erroneous execution of an administrative tool affected multiple organizations, causing them to become disabled.

While we were restoring service to the disabled organizations, some customers experienced performance degradation and/or the inability to execute workflows. A large number of customers were restored within a few hours of the issue beginning; however, it took until August 17th at 3:00 AM to restore access to every affected organization.

 

Next Step(s)

Issue: Operator error
Next Step: Dynamics Service Engineering is reviewing the internal procedures to reduce the complexities and likelihood of human errors while executing maintenance tasks.
Team Owner: CRM Online Service Engineering
Timeline: In process

Issue: Tool safeguards
Next Step: Modifications to the existing administrative tool to ensure additional safeguards are implemented.
Team Owner: CRM Online Development team
Timeline: In process

Issue: Process change
Next Step: An additional process has been put in place to evaluate the risk, review and approve these types of changes to the CRM Online service.
Team Owner: CRM Online Service Engineering
Timeline: Implemented

Microsoft’s Fred Studer, Duke Chung Talk Top Trends in Customer Service


 In an age where globalization and online access have practically erased price, product characteristics and availability as competitive differentiators, it’s customer service that now stands out as a key reason why consumers will favor one brand, product or service over another. According to a recent Customers 2020 Report, by 2020, customer experience will overtake price and product as the key brand differentiator. Many say it already has.

Talking the Customer Experience Talk

While it’s easy for many brands today to talk the talk when it comes to being customer-centric or offering an exceptional customer experience, walking the walk is a story that currently can only be truly told by the few. In fact, a recent study by Bain & Company reveals that while 80% of companies believe they are delivering a superior experience to their customers, only 8% of customers agree.  

According to Forrester Principal Analyst Kate Leggett in a new report, Navigate the Future of Customer Service in 2014, customers are becoming increasingly dissatisfied because organizations are often delivering (1) inconsistent cross-channel experiences, (2) reactive, not proactive customer service, (3) one-size-fits-all customer engagement processes, and (4) inefficient agent interactions.

So to walk the walk when it comes to consistently improving and delivering satisfying customer experiences, brands should focus on key aligning trends that include:

  • an optimized mobile customer experience

  • social media as a platform for customer engagement and service

  • using knowledge as a foundation for consistent multi-channel support

  • using data and analytics to improve and personalize the customer experience.

Why the focus on the above trends? Consider these statistics:

  • By 2015, there will be more people accessing the web using a mobile device than a wired computer. (IDC Worldwide New Media Market Model Forecast)

  • 71% of consumers who experience a quick and effective brand response on social media are likely to recommend that brand to others, compared to just 19% of customers who do not receive a response. (NM Incite)

  • 40% of approximately 3,000 consumers in a global survey said they prefer self-service to human contact for their future contact with companies; 70% expect a company website to include a self-service application. (The Real Self-Service Economy)

  • The customer of 2020 will be more informed and in charge of the experience they receive. They will expect companies to know their individual needs and personalize the experience. Immediate resolution will not be fast enough as customers will expect companies to proactively address their current and future needs. (Customers 2020 Report)


Getting a head start on the mobile, social media, big data and knowledge management trends above will help brands and organizations when it comes to tackling the even bigger trends on the horizon, such as the Internet of Things (IoT).

August 28 Webinar Discusses Top Trends

If your brand or organization is not already on the path to an improved customer experience across current and new channels - or if it is, and you’d like to listen in on the latest discussion - join Fred Studer, GM of Microsoft Dynamics CRM, on August 28 for the Meet the Experts Live! Webinar with recent Harvard Business Review author and nationally recognized customer service thought leader, Duke Chung, founder of Parature, from Microsoft.

This is a one-hour free webinar that will take place at 3pm Eastern / noon Pacific. Click here to register.


Microsoft Dynamics CRM 2013 Toolkit with Visual Studio 2013


I recently needed to install the CRM 2013 Toolkit for Visual Studio 2013, and the Virtual Machine I was installing on didn’t have Visual Studio 2012.

As of writing this, the CRM 2013 SDK still doesn’t provide an installer for Visual Studio 2013, but there are a number of blogs that do provide each of the parts needed, and so I thought I would pull them all together in a single blog.

Note: Content used in this blog is from Matt (sadly not sure of the second name to provide full credit) via Hashtagcrm.com and a link provided by Petr Abdulin here on updating the registry with dummy settings from Visual Studio 2012.

The below steps assume that you have extracted the CRM SDK to C:\CRM-SDK on the machine you want to install on.

The first step is to extract the MSI installer file and modify the VSIX manifest file.

1.       Extract the contents of CrmDeveloperToolsVS12_Installer.msi

2.       Open a Command Prompt

3.       Navigate to the extracted SDK folder e.g. C:\CRM-SDK\sdk\Tools\DeveloperToolkit\

4.       Execute the command msiexec /a C:\CRM-SDK\sdk\tools\DeveloperToolkit\CrmDeveloperToolsVS12_Installer.msi /qb TARGETDIR=C:\CRM-SDK\Toolkit

5.       Navigate to the folder which you extracted the files

6.       Open the Visual Studio folder

7.       Open the archive file Microsoft.CrmDeveloperTools.vsix, or extract it to a sensible location

8.       Then we need to edit the extension.vsixmanifest, and replace InstalledByMsi="true" with InstalledByMsi="false"

9.       We also need to replace all instances of Version="[11.0,12.0)" with Version="[11.0,12.0]" (notice the closing bracket has been changed from ) to ] )

10.   Update the manifest file in the archive or repackage the extracted directory.


The second step is to install the VSIX package

1.       Double click to install the VSIX package Microsoft.CrmDeveloperTools.vsix

 

 

2.       Click install

 

 

3.       Then click close.

The third step is to copy the CRM MSBUILD folder from C:\CRM-SDK\Toolkit\CRM MSBUILD folder to C:\Program Files (x86)\MSBUILD\Microsoft\CRM

The final step is to create a registry file with the following details in :
Windows Registry Editor Version 5.00
 
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\11.0\Setup\VS]
"ProductDir"="C:\\Program Files (x86)\\Microsoft Visual Studio 12.0\\"
"MSMDir"="C:\\Program Files (x86)\\Common Files\\Merge Modules\\"
"VS7EnvironmentLocation"="C:\\Program Files (x86)\\Microsoft Visual Studio 12.0\\Common7\\IDE\\devenv.exe"
"EnvironmentPath"="C:\\Program Files (x86)\\Microsoft Visual Studio 12.0\\Common7\\IDE\\devenv.exe"
"EnvironmentDirectory"="C:\\Program Files (x86)\\Microsoft Visual Studio 12.0\\Common7\\IDE\\"
"VS7CommonDir"="C:\\Program Files (x86)\\Microsoft Visual Studio 12.0\\Common7\\"
"VS7CommonBinDir"=""
 
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\11.0\Setup\VS\BuildNumber]
"1033"="12.0"
 
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\11.0\Setup\VS\Pro]
"ProductDir"="C:\\Program Files (x86)\\Microsoft Visual Studio 12.0\\"
 
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\11.0\Setup\VS\VSTD]
"ProductDir"="C:\\Program Files (x86)\\Microsoft Visual Studio 12.0\\"
 
Once the registry file has been created, simply double click to install it, and the CRM 2013 Toolkit for Visual Studio should now work.

Note: The registry file only needs to be created if Visual Studio 2012 is not already installed.

Note: As previously mentioned, content used in this blog is from Matt (sadly no second name to provide full credit) via Hashtagcrm.com and a link provided by Petr Abdulin here on updating the registry with dummy settings from Visual Studio 2012.

May the force be with you…

@simonjen1

Microsoft Dynamics 2013 SP1 Package Deployer Tool


As part of the SP1 SDK update for Microsoft Dynamics CRM 2013, we got the new Package Deployer Tool. I was recently doing some research on this at TSG and thought I would write about what it is and how to use it.


What is the package deployer tool?

Well, the Package Deployer tool allows us to deploy packages we create from a Visual Studio project that comes as part of the CRM SDK Templates (which are part of the CRM 2013 SP1 SDK). A package can contain a number of managed solutions and migrated data that was exported using the Configuration Migration tool (which is also part of the CRM 2013 SP1 SDK), can perform a number of custom actions via .NET code (with various Pre/Post/Before/After hooks), say for assigning all users to a specific role, and can display HTML information to the user before or after the package is deployed.
One of the key benefits of using a Package is that you can run an upgrade migration to upgrade from one version of a solution to another.

How do we create a package for the deployer tool in Visual Studio 2013?

First we need to download the CRM 2013 SP1 SDK, and extract to say C:\CRM-SDK, then we need to run C:\CRM-SDK\SDK\Templates\CRMSDKTemplates.vsix

The following error may appear in the install log with Visual Studio 2013:

Install Error : Microsoft.VisualStudio.ExtensionManager.MissingReferencesException: This extension cannot be installed because the following references are missing:
- NuGet Package Manager

To resolve this we simply install the NuGet Package Manager from the Visual Studio Gallery here, and ensure the latest update is applied to Visual Studio; at the time of writing this is Update 3, available here.

We can also resolve the error by extracting the CRMSDKTemplates.vsix file, and edit the extension.vsixmanifest file and remove the following line:

<Dependency Id="NuPackToolsVsix.Microsoft.67e54e40-0ae3-42c5-a949-fddf5739e7a5" DisplayName="NuGet Package Manager" Version="[2.8.50126.400,3.0)" />

Then we need to zip the extracted files back up, rename to .vsix and re-run the VSIX file.

Now we can create a CRM Package Project, which is under CRM SDK Templates, in Visual Studio.


What does the project consist of?

As part of the Visual Studio project that is created we get a number of things. Within the bin\Debug\tools folder we get the standard CRM SDK files that we would expect when we need to connect to CRM, but we also get the Solution Packager. This isn’t proven, but I suspect it is used for the holding solution when migrating a solution from one version to another; an update on solution migration will follow.

We also get the PkgFolder which contains, most importantly, the ImportConfig.xml along with the HTML files for the Welcome, and End Pages of the deployment wizard.
Any custom action code that we want to create can be put into the PackageTemplate.cs file.
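
To make the custom-action side more concrete, below is a minimal sketch of what a class in PackageTemplate.cs can look like. It is not the template's exact contents: the base class, attribute and member names (ImportExtension, IImportExtensions, PackageLog and the overridden members) are written from memory of the SDK template, so treat the precise signatures as assumptions and compare them with the generated PackageTemplate.cs.

using System.ComponentModel.Composition;
using Microsoft.Xrm.Tooling.PackageDeployment.CrmPackageExtentionBase;
 
// Minimal sketch of a Package Deployer custom action class (member names assumed from the SDK template).
[Export(typeof(IImportExtensions))]
public class PackageTemplate : ImportExtension
{
    // Called to initialise anything the custom extension needs before the import starts.
    public override void InitializeCustomExtension()
    {
    }
 
    // Called before the main import of solutions and data begins; return true to continue.
    public override bool BeforeImportStage()
    {
        return true;
    }
 
    // Called after all import steps are complete - a good place for final tweaks,
    // for example assigning all users to a specific role using the CRM connection.
    public override bool AfterPrimaryImport()
    {
        PackageLog.Log("Custom post-import actions completed.");
        return true;
    }
 
    // Folder under the output directory that holds ImportConfig.xml, solutions and data (PkgFolder by default).
    public override string GetImportPackageDataFolderName
    {
        get { return "PkgFolder"; }
    }
 
    public override string GetImportPackageDescriptionText
    {
        get { return "Sample package"; }
    }
 
    public override string GetLongNameOfImport
    {
        get { return "Sample package"; }
    }
 
    public override string GetNameOfImport(bool plural)
    {
        return "Sample package";
    }
}

The Export attribute comes from System.ComponentModel.Composition (MEF), which is how the Package Deployer discovers the extension class at run time.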

To add files to the Package, we simply add them to the root of the PkgFolder and we must remember to set the Copy to Output Directory to Copy Always as part of the properties of the files we add. Setting the Copy to Output Directory property means that when we build the class library project we get the files we need to deploy in the relevant debug or release folder.

The great thing about the content folder is that we can create multi-lingual installation files, by simply creating a locale folder say de-DE under the main content folder.

 
So what is in the ImportConfig.xml file?

An example of the ImportConfig.xml file is below:

<?xml version="1.0" encoding="utf-16"?>
<configdatastorage xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                   installsampledata="false"
                   waitforsampledatatoinstall="true"
                   agentdesktopzipfile=""
                   agentdesktopexename=""
                   crmmigdataimportfile="">
  <solutions>
    <configsolutionfile solutionpackagefilename="solutionFile.zip" />
  </solutions>
  <filestoimport>
    <configimportfile filename="File.csv" filetype="CSV" associatedmap="" importtoentity="FileEntity" datadelimiter="" fielddelimiter="comma" enableduplicatedetection="true" isfirstrowheader="true" isrecordownerateam="false" owneruser="" waitforimporttocomplete="true" />
    <configimportfile filename="File.zip" filetype="ZIP" associatedmap="ZipFileMap" importtoentity="FileEntity" datadelimiter="" fielddelimiter="comma" enableduplicatedetection="true" isfirstrowheader="true" isrecordownerateam="false" owneruser="" waitforimporttocomplete="true" />
    <zipimportdetails>
      <zipimportdetail filename="subfile1.csv" filetype="csv" importtoentity="account" />
      <zipimportdetail filename="subfile2.txt" filetype="csv" importtoentity="contact" />
    </zipimportdetails>
  </filestoimport>
  <filesmapstoimport>
    <configimportmapfile filename="FileMap.xml" />
    <configimportmapfile filename="ZipFileMap.xml" />
  </filesmapstoimport>
</configdatastorage>

We can simply change the Solution Package filename to the correct filename of our solution, and add more elements if we have multiple solutions to import. The key thing for any dependencies is to list the solutions in the order we wish to import them.

If we need to import some configuration settings before we import the solutions then we would have to use a custom action and add a custom setting in the config file.

We can export configuration settings using the Configuration Migration tool, manually export settings from CRM, or use a custom data manager to export the data (an example can be found here).

With the configuration files we also have a neat option of saying whether we want to wait for an import of a data file to complete before continuing or if we know we have no dependant relationships in the files then we can run asynchronously.

The following table is an extract from the MSDN page Create packages for the CRM Package Deployer, which can be seen here.

crmmigdataimportfile: File name of the data file (.zip) exported using the Configuration Migration tool.
Important: If your data file contains user information, the user information won’t be imported. To import user information from the source CRM instance to the target instance, you must use the Configuration Migration tool. For more information, see Manage your configuration data in the CRM Implementation Guide.

<solutions> node: Contains an array of <configsolutionfile> nodes that describe the solutions to import. The order of the solutions under this node indicates the order in which the solutions will be imported on the target server.

<configsolutionfile> node: Use this node under the <solutions> node to specify an individual solution file name to be imported. You use the solutionpackagefilename attribute under this node to specify the .zip file name of your solution. You can add multiple solution file names in a package by adding as many <configsolutionfile> nodes as you need.

<filestoimport> node: Contains an array of <configimportfile> and <zipimportdetails> nodes that are used to describe individual files and zip files respectively to be imported.

<configimportfile> node: Use this node under the <filestoimport> node to describe a file to be imported to CRM. You can add multiple files in a package by adding as many <configimportfile> nodes. This node has the following attributes:

  • filename: Name of the file that contains the import data. If the file is a .zip file, a <zipimportdetails> node must be present with a <zipimportdetail> node for each file in the .zip file.
  • filetype: This can be csv, xml, or zip.
  • associatedmap: Name of the CRM import data map to use with this file. If blank, attempts to use the system-determined import data map name for this file.
  • importtoentity: Name of the CRM entity that the file’s data will be imported into (for example, account or contact in the sample above).
  • datadelimiter: Name of the data delimiter used in the import file. Valid values are singlequote or doublequotes.
  • fielddelimiter: Name of the field delimiter used in the import file. Valid values are comma, colon, or singlequote.
  • enableduplicatedetection: Indicates whether to enable duplicate detection rules on data import. Valid values are true or false.
  • isfirstrowheader: Used to denote that the first row of the import file contains the field names. Valid values are true or false.
  • isrecordownerateam: Indicates whether the owner of the record on import should be a team. Valid values are true or false.
  • owneruser: Indicates the user ID that should own the records. The default value is the currently logged-in user.
  • waitforimporttocomplete: If true, the system waits for the import to complete before proceeding. If false, it queues the jobs and moves on.

<zipimportdetails> node: This node contains an array of <zipimportdetail> nodes that describe the files included in a zip file that is used to import to CRM.

<zipimportdetail> node: Use this node under the <zipimportdetails> node to provide information about an individual file in a .zip file that is specified in the <configimportfile> node. This node has the following attributes:

  • filename: Name of the file that contains the import data.
  • filetype: This can be csv or xml.
  • importtoentity: Name of the CRM entity that the file’s data will be imported into.

<filesmapstoimport> node: This node contains an array of <configimportmapfile> nodes to import. The order of the map files in this node indicates the order in which they are imported. For information about data maps, see Create data maps for import.

<configimportmapfile> node: Use this node under the <filesmapstoimport> node to provide information about an individual map file to import in CRM.
 

What is available as part of the Import Extension Class?

The following properties are exposed via the Import Extension class:

  • Returns a pointer to the CRM instance.
  • Returns the Title Link Text that is shown on the completed page.
  • Returns the long name of the import process.
  • Returns the name of the Import package data folder.
  • Gets the package description for the import.
  • Description of the import package.
  • Returns the folder name of the import.
  • Returns the long name of the import process.
  • Parent dispatcher for displaying UI elements.

The following methods are available for us to write some custom actions:

  • Called after all Import steps are complete, allowing for final customizations or tweaking of the CRM instance.
  • Called before each application record is imported.
  • Called before the Main Import process begins, after solutions and data.
  • Returns the name of the import project.
  • Returns the name of the Import project.
  • Called to initialize any functions in the Custom Extension.
  • Initialize extension functionality.
  • Called during a solution upgrade when both solutions, old and holding, are present in the system.

We can also use the PackageLog logging interface within the Import Extension class to write out any trace log information to the package trace log.
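
As a small, hedged illustration of that logging, the snippet below writes progress and error messages from inside one of the overridden methods of the sketch shown earlier; the Log overloads used here (a plain message, and a message with a TraceEventType) are assumptions based on typical trace APIs, so verify them against the class reference linked below.

// Hedged sketch: writing to the package trace log from inside a custom action
// (the exact Log overloads are assumptions - check the Import Extension class reference).
public override bool AfterPrimaryImport()
{
    PackageLog.Log("Starting post-import fix-ups");
    try
    {
        // ... custom work against the CRM connection goes here ...
    }
    catch (System.Exception ex)
    {
        PackageLog.Log("Post-import fix-ups failed: " + ex.Message, System.Diagnostics.TraceEventType.Error);
        return false;
    }
    return true;
}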

Full details of the Import Extension class along with other helper classes on MSDN can be found here.

How do we deploy a Package after we have created it?

Once we have created a package there are a few steps that we need to follow to deploy the package, these are as follows:

1.       We need to create a Deploy folder say in C:\Deploy

2.       Then we copy the contents of the C:\CRM-SDK\SDK\Tools\PackageDeployer folder to the C:\Deploy folder.

3.       Next we copy the contents of our package from say the projects bin\Debug folder into the C:\Deploy folder, you may be prompted to replace some files, so accept to replace the files.

4.       Now we can run PackageDeployer.exe from C:\Deploy.


Note:
We could include the files from C:\CRM-SDK\SDK\Tools\PackageDeployer as part of the Visual Studio solution, setting their Copy to Output Directory properties to Copy Always. This would then allow us to simply copy the projects bin\Debug folder, and would also mean that if we were using TFS (Team Foundation Server) we could have an automated build run on check-in of the solution that rebuilds the solution and places the built files in our standard builds folder meaning we have a single place to look for our deployment package and required executable files.

Although not a full step-by-step guide hopefully this has shown how powerful the Package Deployer Tool is and how it can be used as part of the CRM 2013 Application Lifetime Management.

@simonjen1
 

Microsoft Dynamics CRM 2013 SP1 WPF Application for CRM


As part of the SP1 SDK update for Microsoft Dynamics CRM 2013, we got the new WPF Application for CRM Visual Studio Template. I was recently doing some research on this at TSG and thought I would write about what it is and how to use it.

What is the WPF Application for CRM?

Well the WPF Application for CRM is a Visual Studio project which comes as part of the CRM SDK Templates (which are part of the CRM 2013 SP1 SDK). It is a neat little bootstrap project that enables us to quickly create an application that connects to CRM.

How do we create a WPF Application for CRM in Visual Studio 2013?

First we need to download the CRM 2013 SP1 SDK, and extract to say C:\CRM-SDK, then we need to run C:\CRM-SDK\SDK\Templates\CRMSDKTemplates.vsix

The following error may appear in the install log with Visual Studio 2013:

Install Error : Microsoft.VisualStudio.ExtensionManager.MissingReferencesException: This extension cannot be installed because the following references are missing:

- NuGet Package Manager

To resolve this we simply install the NuGet Package Manager from the Visual Studio Gallery here, and ensure the latest update is applied to Visual Studio; at the time of writing this is Update 3, available here.

We can also resolve the error by extracting the CRMSDKTemplates.vsix file, and edit the extension.vsixmanifest file and remove the following line:

<Dependency Id="NuPackToolsVsix.Microsoft.67e54e40-0ae3-42c5-a949-fddf5739e7a5" DisplayName="NuGet Package Manager" Version="[2.8.50126.400,3.0)" />

Then we need to zip the extracted files back up, rename to .vsix and re-run the VSIX file.
Now we can create a WPF Application for CRM Project, which is under CRM SDK Templates, in Visual Studio.

 

What does the project consist of?

 As part of the Visual Studio project that is created we get a number of things. Within the bin\Debug\tools folder we get the standard CRM SDK files that we would expect when we need to connect to CRM.

We also get a LoginWindow folder, along with a Xaml CRM Login Form. The login form can also be added to an existing WPF application by right-clicking, then clicking New > Add Item and selecting it from CRM SDK Templates in the Add New Item dialog.

We get the standard App.config and App.xaml files that we would expect to see from a WPF application, along with a MainWindow.xaml form which would be the starting point of our application.

Note: The packages.config file is just a standard NuGet package file.

When trying to run the application we may receive an error in the CrmLogin.xaml that it can’t find the following resource:

<ResourceDictionary Source="pack://application:,,,/Microsoft.Xrm.Tooling.Ui.Resources;component/Resources/Button/Styles.xaml"/>

This is fixed by adding the Microsoft.Xrm.Tooling.Ui.Resources.dll reference.

A look around the CRM Login Form

The CRM login form contains a user control from the Tooling library, which contains all of the standard fields we would normally have to put on to a form and maintain to provide the user with a login form.


We get a few events within the forms code behind to check the status of connection, to close the form if the user clicks cancel etc. but one of the key events is the window loaded event.

///<summary>
/// Raised when the window loads for the first time.
///</summary>
///<param name="sender"></param>
///<param name="e"></param>
private void Window_Loaded(object sender, RoutedEventArgs e)
{
    /*
       This is the setup process for the login control.
       The login control uses a class called CrmConnectionManager to manage the interaction with CRM; this class can also be queried at later points for information about the current connection.
       In this case, the login control is referred to as CrmLoginCtrl.
       */
 
    // Set off flag.
    bIsConnectedComplete = false;
 
    // Init the CRM Connection manager..
    mgr = new CrmConnectionManager();
    // Pass a reference to the current UI or container control, this is used to synchronize UI threads in the login control
    mgr.ParentControl = CrmLoginCtrl;
    // if you are using an unmanaged client, excel for example, and need to store the config in the users local directory
    // set this option to true.
    mgr.UseUserLocalDirectoryForConfigStore = true;
    // if you are using an unmanaged client, you need to provide the name of an exe to use to create app config keys for.
    //mgr.HostApplicatioNameOveride = "MyExecName.exe";
    // CrmLoginCtrl is the Login control, this sets the CrmConnection Manager into it.
    CrmLoginCtrl.SetGlobalStoreAccess(mgr);
    // There are several modes to the login control UI
    CrmLoginCtrl.SetControlMode(ServerLoginConfigCtrlMode.FullLoginPanel);
    // this wires an event that is raised when the login button is pressed.
    CrmLoginCtrl.ConnectionCheckBegining += new EventHandler(CrmLoginCtrl_ConnectionCheckBegining);
    // this wires an event that is raised when an error in the connect process occurs.
    CrmLoginCtrl.ConnectErrorEvent += new EventHandler<ConnectErrorEventArgs>(CrmLoginCtrl_ConnectErrorEvent);
    // this wires an event that is raised when a status event is returned.
    CrmLoginCtrl.ConnectionStatusEvent += new EventHandler<ConnectStatusEventArgs>(CrmLoginCtrl_ConnectionStatusEvent);
    // this wires an event that is raised when the user clicks the cancel button.
    CrmLoginCtrl.UserCancelClicked += new EventHandler(CrmLoginCtrl_UserCancelClicked);
    // Check to see if its possible to do an Auto Login
    if (!mgr.RequireUserLogin())
    {
        if (MessageBox.Show("Credentials already saved in configuration\nChoose Yes to Auto Login or No to Reset Credentials", "Auto Login", MessageBoxButton.YesNo, MessageBoxImage.Question) == MessageBoxResult.Yes)
        {
            // If RequireUserLogin is false, it means that there has been a successful login here before and the credentials are cached.
            CrmLoginCtrl.IsEnabled = false;
            // When running an auto login, you need to wire and listen to the events from the connection manager.
            // Run Auto User Login process, Wire events.
            mgr.ServerConnectionStatusUpdate += new EventHandler<ServerConnectStatusEventArgs>(mgr_ServerConnectionStatusUpdate);
            mgr.ConnectionCheckComplete += new EventHandler<ServerConnectStatusEventArgs>(mgr_ConnectionCheckComplete);
            // Start the connection process.
            mgr.ConnectToServerCheck();
 
            // Show the message grid.
            CrmLoginCtrl.ShowMessageGrid();
        }
    }
}

Here we can see that we are using the new CrmConnectionManager, which is part of the Microsoft.Xrm.Tooling.CrmConnectControl namespace. We set the parent control to be the Login control and set whether we want to store the config in the user's local directory, which can be used for persisting connection details. We can set whether we want to display the Full Login Panel or just the Config Panel, and then we set up a number of event handlers to respond to connection status changes and the user cancellation event.

We can also see that we can auto-login if we have saved the configuration, which is quite useful for a background-task console application or for saving the user from having to type in their credentials every time!
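
Since background or console use is mentioned, here is a rough, hedged sketch of what that could look like; it simply reuses the CrmConnectionManager members demonstrated in Window_Loaded below (RequireUserLogin, ConnectionCheckComplete, ConnectToServerCheck, CrmSvc). Whether the manager is entirely happy without a ParentControl is not shown in the template, so treat this as an outline to experiment with rather than a recipe.

// Hedged console sketch: reusing cached credentials saved by the login control.
// Assumes a previous successful interactive login has cached the configuration.
static void Main()
{
    var mgr = new Microsoft.Xrm.Tooling.CrmConnectControl.CrmConnectionManager();
    mgr.UseUserLocalDirectoryForConfigStore = true;
 
    if (mgr.RequireUserLogin())
    {
        System.Console.WriteLine("No cached credentials; run the interactive login first.");
        return;
    }
 
    // Wait for the asynchronous connection check to finish before using the connection.
    var done = new System.Threading.ManualResetEvent(false);
    mgr.ConnectionCheckComplete += (s, e) => done.Set();
    mgr.ConnectToServerCheck();
    done.WaitOne();
 
    if (mgr.CrmSvc != null && mgr.CrmSvc.IsReady)
        System.Console.WriteLine("Connected to CRM as a background task.");
}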

///<summary>
/// Complete Event from the Auto Login process
///</summary>
///<param name="sender"></param>
///<param name="e"></param>
private void mgr_ConnectionCheckComplete(object sender, ServerConnectStatusEventArgs e)
{
    // The Status event will contain information about the current login process,  if Connected is false, then there is not yet a connection.
    // Unwire events that we are not using anymore, this prevents issues if the user uses the control after a failed login.
    ((CrmConnectionManager)sender).ConnectionCheckComplete -= mgr_ConnectionCheckComplete;
    ((CrmConnectionManager)sender).ServerConnectionStatusUpdate -= mgr_ServerConnectionStatusUpdate;
 
    if (!e.Connected)
    {
        // if its not connected pop the login screen here.
        if (e.MultiOrgsFound)
            MessageBox.Show("Unable to Login to CRM using cached credentials. Org Not found",
                            "Login Failure");
        else
            MessageBox.Show("Unable to Login to CRM using cached credentials", "Login Failure");
 
        resetUiFlag = true;
        CrmLoginCtrl.GoBackToLogin();
        // Bad Login Get back on the UI.
        Dispatcher.Invoke(DispatcherPriority.Normal,
                new System.Action(() =>
                {
                    this.Title = "Failed to Login with cached credentials.";
                    MessageBox.Show(this.Title, "Notification from ConnectionManager",
                    MessageBoxButton.OK, MessageBoxImage.Error);
                    CrmLoginCtrl.IsEnabled = true;
                }
                ));
        resetUiFlag = false;
    }
    else
    {
        // Good Login Get back on the UI
        if (e.Connected && !bIsConnectedComplete)
            ProcessSuccess();
    }
 
}

Another event we get is the Connection Complete Check event, which is called from the auto login process, and allows us to validate that CRM Organisation was found etc.

///<summary>
/// Login control connect check status event.
///</summary>
///<param name="sender"></param>
///<param name="e"></param>
private void CrmLoginCtrl_ConnectionStatusEvent(object sender, ConnectStatusEventArgs e)
{
    //Here we are using the bIsConnectedComplete bool to check to make sure we only process this call once.
    if (e.ConnectSucceeded && !bIsConnectedComplete)
        ProcessSuccess();
 
}
 
///<summary>
/// This raises and processes Success
///</summary>
private void ProcessSuccess()
{
    resetUiFlag = true;
    bIsConnectedComplete = true;
    CrmSvc = mgr.CrmSvc;
    CrmLoginCtrl.GoBackToLogin();
    Dispatcher.Invoke(DispatcherPriority.Normal,
        new System.Action(() =>
        {
            this.Title = "Notification from Parent";
            CrmLoginCtrl.IsEnabled = true;
        }
        ));
 
    // Notify Caller that we are done with success.
    if (ConnectionToCrmCompleted != null)
        ConnectionToCrmCompleted(this, null);
 
    resetUiFlag = false;
}

If we did log in OK via the user entering the credentials, then we get the connection status event, and call the Process Success method to clean up and notify the caller.

A look around the Main Window

The main window is fairly simple, it just contains a login button and you can’t get much simpler than that!

The code behind is also fairly simple and contains two events

///<summary>
/// Button to login to CRM and create a CrmService Client
///</summary>
///<param name="sender"></param>
///<param name="e"></param>
private void LoginButton_Click(object sender, RoutedEventArgs e)
{
    #region Login Control
    // Establish the Login control
    CrmLogin ctrl = new CrmLogin();
    // Wire Event to login response.
    ctrl.ConnectionToCrmCompleted += ctrl_ConnectionToCrmCompleted;
    // Show the dialog.
    ctrl.ShowDialog();
 
    // Handle return.
    if (ctrl.CrmConnectionMgr != null && ctrl.CrmConnectionMgr.CrmSvc != null && ctrl.CrmConnectionMgr.CrmSvc.IsReady)
        MessageBox.Show("Good Connect");
    else
        MessageBox.Show("BadConnect");
 
    #endregion
 
}

///<summary>
/// Raised when the login form process is completed.
///</summary>
///<param name="sender"></param>
///<param name="e"></param>
private void ctrl_ConnectionToCrmCompleted(object sender, EventArgs e)
{
    if (sender is CrmLogin)
    {
        this.Dispatcher.Invoke(() =>
        {
            ((CrmLogin)sender).Close();
        });
    }
}

Here we can see that the On Click event of the button is wired up with some code to instantiate the Login form and handle the return event. The thing to note is that the Login form is shown modally.
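
Once the dialog has returned with a ready connection, the CrmServiceClient held by the connection manager can be used like any other organization service endpoint. The following is a small, hedged example (not part of the generated template) that runs a WhoAmI request through the OrganizationServiceProxy exposed by the client; the ShowCurrentUser helper is hypothetical, and it reuses the CrmSvc/IsReady members already shown in the click handler above.

// Hedged example (assumes Microsoft.Crm.Sdk.Messages and Microsoft.Xrm.Tooling.Connector are referenced).
private void ShowCurrentUser(CrmServiceClient crmSvc)
{
    if (crmSvc == null || !crmSvc.IsReady)
    {
        MessageBox.Show("Not connected to CRM.");
        return;
    }
 
    // WhoAmIRequest is a standard organization request from the core SDK;
    // OrganizationServiceProxy exposes the underlying IOrganizationService.
    var response = (WhoAmIResponse)crmSvc.OrganizationServiceProxy.Execute(new WhoAmIRequest());
    MessageBox.Show("Connected as user id: " + response.UserId);
}

Calling something like this after ctrl_ConnectionToCrmCompleted fires confirms the connection is usable before the main window starts issuing its own requests.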

How do we Enable Tracing?
 
If we want to enable tracing then we simply modify the App.config file and add or edit the following lines:

    <switches>
      <!--
            Possible values for switches: Off, Error, Warning, Info, Verbose
                Verbose:    includes Error, Warning, Info, Trace levels
                Info:       includes Error, Warning, Info levels
                Warning:    includes Error, Warning levels
                Error:      includes Error level
        -->
      <add name="Microsoft.Xrm.Tooling.Connector.CrmServiceClient" value="Verbose" />
      <add name="Microsoft.Xrm.Tooling.CrmConnectControl" value="Verbose"/>
      <add name="Microsoft.Xrm.Tooling.WebResourceUtility" value="Verbose" />
    </switches>

This is great if you have a problem in the field and need a trace of what is going on with the application to diagnose an environmental problem!

What else is available as part of the new Tooling namespace?

As part of the CRM 2013 SP1 SDK Spring ’14 Update we now also get the Microsoft.Xrm.Tooling namespace, which contains a few useful classes for connecting to CRM, but also for getting resources from CRM.

The following shows the classes that are available in the new Microsoft.Xrm.Tooling.WebResourceUtility namespace:

  • Web Resource actions for dealing with Image Resources.
  • This class provides an override for the default trace settings. These settings must be set before the components in the control are used for them to be effective.
  • This class is used to access and retrieve web resources from CRM.

The following shows the notable methods from the ImageResources class:

  • Returns BitMap Image Resource from CRM.

The following shows the notable methods from the XmlResources class:

  • Returns Xml Resource from CRM.

Full details of the Web Resource Utility classes along with other helper classes on MSDN can be found here and full details of Building Windows client applications using the XRM tools can be found here.

This should give a good idea of the new features of the WPF Application for CRM and how to use it.

Until next time…

@simonjen1

10 Top Customer Experience Takeaways from CRM Evolution


 While there were more than 100, perhaps more than 1000, top takeaways from the recent 2014 CRM Evolution conference (and adjoining Customer Service Experience conference), here are 10 terrific customer service and customer experience insights shared by key speakers and analysts at the event including Kate Leggett of Forrester Research, Ray Wang of Constellation Research, Brian Vellmure, Denis Pombriant and more:

  1. Getting Customer Engagement Right: “True customer engagement is making engagement personal, by knowing who you’re speaking to, and also interacting with each customer in the way they want.” – Bob Stutz, Corporate Vice President, Microsoft Dynamics CRM (@rlstutz)

  2. Experience vs. Engagement: “The customer experience would have been something that the business designed, but now that you have social media and interactive experiences driven by the customer who is providing feedback and direction, you now need customer engagement.” - Banafsheh Ghashemi, VP, CRM and Customer Experience, Marketing, American Red Cross (@banafshehgh)

  3. Five Guiding Principles for Creating a Consistently Better Customer Experience: (1) Know me. Get better connected to your customer. (2) Remember me. Stay connected with more relevant touches. (3) Make me feel special. Proactively optimize the customer’s account and make product suggestions tailored to them. (4) Help me succeed. Continue to improve online tools and technology. (5) Always be there for me. Be there for your customers – whenever, wherever, however.– Jill Hewett, Customer Experience Designer, Catalyst, Inc. (@jhewitt98)

  4. What the Customer Really Wants: Customers want timely, competent, well-executed processes in moments of truth: what your company provides, stakes its reputation on, and what customers say they want or expect from you. But while customers want connected processes, most brands are currently delivering unconnected individual transactions. Brands need to empower their people with the technology and ability to successfully manage processes that will satisfy customer expectations and needs during a brand’s moments of truth. – Denis Pombriant, Founder and Managing Principal, Beagle Research (@denispombriant)

  5. It’s About Time:“You are not competing with your best competitor, you are competing with and for time. Whoever can deliver the fastest is going to win. If service is not convenient, you will lose in a digital world. Customers don’t care what department you’re in; they want you to solve their problem on the 1st interaction.” - R “Ray” Wang, Founder and Principal Analyst - Constellation Research, Inc. (@rwang0)

  6. The Four Ps of Customer Service: (1) Painless. Consumers want effortless service from the touchpoint of their choice. (2) Personalized. Customers don’t want a one-size-fits-all service experience. (3) Productive. A customer service experience has to be reliable, efficient, satisfying, but also delivered at a cost that makes sense to the business. (4) Proactive: Customers want to be notified of a problem and told it’s being fixed. Better yet, customers want this proactive service to happen, whenever possible, behind the scenes so that problems are addressed before they happen or before the customer is even aware there is a problem. – Kate Leggett, VP and Principal Analyst, Forrester Research (@kateleggett)

  7. Creating an Emotional Investment: “Take your customers from making a financial investment in your brand to making an emotional investment. The value and return is even greater.” – Eric McKirdy, Global Customer Care Manager, Ask.com (@AskDotCom_Eric)

  8. Breaking Away from the Pack: “Digital transformation is creating new competitors, but the emotional connection will create breakaway companies.” - R “Ray” Wang, Founder and Principal Analyst - Constellation Research, Inc. (@rwang0)

  9. Nine Core Fundamentals for Digital Transformation: (1) Think networks. (2) Apply context and relevance - attention is scarce. (3) Know your customers. (4) Make everything intelligent. (5) Put speed at the center. (6) Every company is a tech company. Think Amazon, Starbucks, Nike. (7) Focus on the experience as a differentiator. (8) Build knowledge flows. (9) Develop talent. - Brian Vellmure, Principal and Founder, Innovantage / Initium (@BrianVellmure)

  10. Top Trends Shaping the Future of Customer Service: (1) Digital transformation; a massive push towards automation; decision support tools so that any person at any point in any interaction has the tools to make the right and best decision; mass personalization at scale. - R “Ray” Wang, Founder and Principal Analyst - Constellation Research, Inc. (@rwang0)

    (2) Social media has given corporations a consciousness that they did not have before; people look at that engagement and feedback and make decisions; giving customer service teams enough clout in an organization to change or get things changed. - Dr. Natalie Petouhoff, VP and Principal Analyst, Constellation Research, Inc. (@drnatalie)

    (3) The future of customer service will become much more amazing because it will become more meaningful. Whereas the web was once just a bunch of pages, now it’s become meaningful and connected. The same will happen with customer service – an emergence of meaning. - Martha Brooke, Founder & Chief Customer Experience Analyst, Interaction Metrics (@MarthaBrooke100)


Microsoft Dynamics Marketing Spring '14 Administration in Office 365

This quick video shows you step-by-step how to create new Microsoft Dynamics Marketing subscriptions in Office 365, issue licenses, add users, and define security permissions. Learn 3 scenarios for adding different types of new users.

Introducing the CRMUG Summit 2014 Keynote Speaker


 We’re thrilled to introduce the CRMUG Summit 2014 keynote speaker….Steve Rizzo! We’ll hear from Steve on Wednesday morning, October 15, as he helps us set the stage for a productive, successful week in St. Louis.

Steve Rizzo is a national headline comedian turned author and motivational business speaker.  As a comedian, he’s had opening acts such as Drew Carey, Rosie O’Donnell, and Dennis Miller, and has shared the marquee with comedic icons like Ellen DeGeneres, Rodney Dangerfield, Eddie Murphy, and Jerry Seinfeld.  He’s the author of the best-selling book, “Get Your SHIFT Together” and is a member of the elite Speakers Hall of Fame.

He’s a seriously funny guy who will challenge us to shift our focus and way of thinking to discover greater enthusiasm, increased productivity, and new levels of success in our lives and careers.

Click to learn more about Steve, watch a short sneak-peek video, and learn how you can start to ‘get your shift together’ today!

Meet us in St. Louis - register for CRMUG Summit 2014 today (registration rates go up after September 7)!

Performance Analyser for Microsoft Dynamics CRM 2013


Performance of any system is an important part of its development and implementation, and as part of my role as Chief Architect at TSG I often get asked about it...
A few months ago the Microsoft Premier Field Engineer guys released an update to the toolset for analysing performance for the Microsoft Dynamics range of products, which is available on CodePlex here and as of writing is version 1.20.

The toolset is a set of SQL scripts that collect DMV data and Microsoft-specific product data, which is persisted into a single database called DynamicsPerf. This allows us to analyse what is going on with a particular Dynamics installation and helps to resolve performance issues quickly.

As the PFE guys say, the real benefit of the performance analyser toolset is of course not finding issues in a production environment, but preventing any bad code or environmental misconfiguration from reaching production.

The DynamicsPerf database is the central repository for the data that is collected for performance analysis.

How do we use the performance analyser?

Once we have downloaded the performance analyser from CodePlex, we can extract the zip file and copy it over to the SQL Server for the system that we want to analyse.

We need to ensure that we are logged into the SQL Server with an account that has permissions to create databases, tables, etc.

We also need to create a folder on the server, say on the data drive of the SQL Server, to store the SQL Trace files that are created; we can call this folder D:\SQLTrace. For the performance counter logs we also need to create a folder to store the performance counter data that is logged, say D:\PerfLogs.

 The performance analyser is a SQL Server Management Studio solution file that includes a number of SQL Server jobs and performance counters that are used to start the collection process.

To be able to use the performance analyser we must first create the DynamicsPerf database, its objects, and the SQL agent jobs, which can be done via the following:

1.       Open the SQL Server Management Studio on the database server in the system we want to analyse.

2.       Click File > Open > Project / Solution.

3.       Browse to the location on the SQL Server of the extracted DynamicsPerf1.20.zip file, in our case this is D:\DynamicsPerf1.20 RC0

4.       Select the Performance Analyser 1.20 for Microsoft Dynamics.ssmssln file.

5.       In the Solution Explorer, we need to open the DynamicsPerf\Queries\1-Create_Core_Objects.sql file.

6.       Select the SQL Server from the Connect to Database Engine dialog and click the connect button.

7.       Then we click on the Execute button to run the SQL Script.

 
 Once the execution has completed the DynamicsPerf database will have been created, with a number of tables with various prefixes e.g. BLOCKED, COLLECTION, DYN, INDEX, PERF, QUERY, SERVER, SQL etc. A number of SQL Agent Jobs will have also been created with the prefix DYNPERF.

Note: Some jobs will have AX within the name and are disabled. For our purpose as we are looking at Dynamics CRM 2013 we can ignore these Jobs.

There is another script located as DynamicsPerf\Queries\2-Create_CRM_Objects.sql, however at the time of writing this, it currently has no Dynamics CRM specific queries so we can ignore this file.

Configure and schedule Performance Data Capture

Now the SQL Jobs have been created we need to configure a few of them with our specific system settings, i.e. we need to specify the CRM database that we want to analyse. By default the system is set up to analyse the MSCRM_CONFIG database, but that’s not really much use as generally we will want to analyse the Organisations specific database.

So to configure performance data capture, we now need to modify the DYNPERF_Capture_Stats job:

1.       We open the DYNPERF_Capture_Stats job from SQL Server Agent > Jobs in SQL Server Management Studio.

2.       We select the Steps page, select step 1 sp_capturestats and click the Edit button

3.       We then modify the @Database = ‘MSCRM_CONFIG’, changing it to say @Database = ‘CRM001_MSCRM’ (the organisation database name), in the command box and click OK.

4.       Then we select the schedules page, select the first schedule “Daily” and click the Edit button.

5.       Then choose when we want the Job to run. Note: The default schedule is to run every day at 17:00, but we can change the time here along with how often the schedule occurs.

6.       Then we click OK and OK again to close the DYNPERF_Capture_Stats job window.


Configure and Schedule Database Blocking Capture

We want to collect information about blocking events, and so we need to configure the DYNPERF_Default_Trace_Stats job as per the following:

1.       We open the DYNPERF_Default_Trace_Stats job from SQL Server Agent > Jobs in SQL Server Management Studio.

2.       We select the Steps page, select step 1 “Start Tracing” and click the Edit button

3.       In the command box we can change the @FILE_PATH to point to our trace folder, which is D:\SQLTrace, and click OK.

4.       From the General page, we need to tick the Enabled option so that the job will run.

5.       From the Schedule page we can edit the schedule of when the job will run, by default this job will run Daily at 00:00.

6.       Then we click OK and OK again to close the DYNPERF_Default_Trace_Stats job window.


Configure and Schedule Hourly Performance Data Capture

We can optionally specify whether we want to capture performance data on an hourly basis, which would be quite handy if we are running a number of regression tests over a period of time. If we do want to do this, then we need to make changes so that the job captures performance data for the Organisation database we want to analyse performance for, by the following:

1.       We open the DYNPERF_Perfstats_Hourly job from SQL Server Agent > Jobs in SQL Server Management Studio.

2.       We select the Steps page, select step 1 “CaptureStats” and click the Edit button

3.      We then modify the @Database = ‘MSCRM_CONFIG’, changing it to say @Database = ‘CRM001_MSCRM’ (the organisation database name), in the command box and click OK.

4.      Then we click OK and OK again to close the DYNPERF_Perfstats_Hourly job window.


Configure and Schedule Performance Counter Logging

Now we can log information about disk, CPU, memory etc, but to do this we need to configure and schedule the performance counter logging job. In our scenario we are using the Default SQL Instance, so we do this as follows:

1.      Click Start > Run and type perfmon to start the standard Windows Performance Monitor.

2.      Then we expand the Data Collector Sets, right click on User Defined and select New > Data Collector Set.

3.       Then we name the collector set, say “CRM SQL Server Performance”, select “Create from a Template”, and click Next.

4.       We then select “System Performance” from the options and click browse, and browse to the performance analyser template file which is D:\DynamicsPerf1.20 RC0\DynamicsPerf\Windows Perfmon Scripts\Server2008_SQL_Default_Instance.xml, and click Finish.

5.       Now we want to change where the performance logs are stored, so we right click on the “CRM SQL Server Performance” data collector set and click properties.

6.       Then we select the Directory tab and click browse to browse to our D:\PerfLogs folder.

7.       We can now schedule the performance counters to be collected, by clicking on the schedule tab, and clicking on the Add button.

8.       We select the beginning date as today's date and leave the other options at their defaults, so that the performance collection will run continuously without an end date, and click OK to close the folder action window.


Performance Analyser Maintenance

There are two jobs that are scheduled by default to purge data collected to ensure that any unneeded data is removed and thus we don’t exponentially take up disk space. These two jobs are DYNPERF_Capture_Stats_Purge (which is scheduled to remove data older than 14 days and run daily), and DYNPERF_Purge_Blocks (which is scheduled to run daily at 04:00, and removes any data recorded via the DYNPERF_Optional_Polling_for_Blocks Job).

Capturing Performance Data Manually

Whilst we can schedule the various jobs, there might be an occasion where we want to capture data at a specific point in time. We can do this by the following:

1.       Open the SQL Server Management Studio on the database server in the system we want to analyse.

2.       Click File > Open > Project / Solution.

3.       Browse to the location on the SQL Server of the extracted DynamicsPerf1.20.zip file, in our case this is D:\DynamicsPerf1.20 RC0

4.       Select the Performance Analyser 1.20 for Microsoft Dynamics.ssmssln file, and in the Solution Explorer we open the DynamicsPerf - Analysis Scripts\Manual-CaptureStats.sql script.

 

-- --------------------------------------------------------------
-- Script to execute SP_CAPTURESTATS
--
-- The run name is output at completion:
-- RUN NAME = TEST1
-- This date/time is then used to predicate
-- subsequent queries to QUERY_STATS_CURR_VW or INDEX_STATS_CURR_VW
-- --------------------------------------------------------------
 
USE DynamicsPerf
EXEC SP_CAPTURESTATS    @DATABASE_NAME = 'DynamicsPerf',
                        @DEBUG = 'Y' --, @SKIP_STATS = 'Y'
 
-- --------------------------------------------------------------
-- Alternatively, a run name can also be passed to SP_CAPTURESTATS if desired.
-- Here is an example to create a baseline capture:
 
EXEC SP_CAPTURESTATS    @DATABASE_NAME = 'XXXXXXXXXX',
                        @RUN_NAME = 'BASE Before sp1'
 

5.       Then we change the @DATABASE_NAME = ‘DynamicsPerf’ to @DATABASE_NAME = ‘CRM001_MSCRM’ (the organisation database name) and Execute the script against the DynamicsPerf database.

Running a Baseline Performance Analysis

When the system is warmed up, so that at least most of the normal day-to-day operations have been running, we can run the DYNPERF_Capture_Stats_Baseline job. This allows us to capture a known performing set of data for our system, which we can then compare against after we have made significant code or environmental changes.

Note:
As we generally want to keep the baseline data for longer than 14 days, the purge jobs exclude anything prefixed with purge from the deletion.
 
So how do we run the Performance Analysis in a Test Environment?

As the PFE guys say, the real benefit of the performance analyser toolset is of course not finding issues in a production environment, but preventing any bad code or environmental misconfiguration from reaching production.

So we can run the performance analysis on our test / QA environment by the following:

1.       Open the SQL Server Management Studio on the database server in the system we want to analyse.

2.       Click File > Open > Project / Solution.

3.       Browse to the location on the SQL Server of the extracted DynamicsPerf1.20.zip file, in our case this is D:\DynamicsPerf1.20 RC0

4.       Select the Performance Analyser 1.20 for Microsoft Dynamics.ssmssln file, and in the Solution Explorer we open the DynamicsPerf - Analysis Scripts\CaptureStats in TEST.sql script.

--------------------------------------------------------------
-- For testing in a non-production system do the following:
--
 
--STEP1
          USE DynamicsPerf
          GO
 
          EXEC SP_PURGESTATS
            @PURGE_DAYS = -1,
            @DATABASE_NAME = 'XXXXXXXX' -- Use this option to delete all data for 1 database
          GO
    
--        --     Be sure all users are out of the test database
--        --     Get yourself to the point in the application where you are ready
--        --     to push the button for the code you want to review.
--
--STEP2
 
          DBCC FREEPROCCACHE
 
-- OR the following for a specific database
--
          DECLARE @intDBID INTEGER
       
          SET @intDBID = (SELECT dbid
                          FROM   master.dbo.sysdatabases
                          WHERE  name = 'database_name')
       
          DBCC FLUSHPROCINDB(@intDBID)
       
--
--
-- Capture data enabling deltas to be calculated
--
-- STEP3  (optional)
--  USE DynamicsPerf
--  EXEC SP_CAPTURESTATS       @DATABASE_NAME = 'XXXXXXXXX'
--
--
 
 
-- STEP4
--        -- NOW RUN YOUR TESTS
 
 
-- STEP5 capture data again
 
EXEC SP_CAPTURESTATS    @DATABASE_NAME = 'XXXXXXXXXX'
 
 
-- STEP 6  Review your data
 

5.       Then we change the @DATABASE_NAME = ‘XXXXXXXXX’ to @DATABASE_NAME = ‘CRM001_MSCRM’ (the organisation database name) and Execute the script against the DynamicsPerf database.
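For step 6, reviewing the captured data, we can query the views mentioned in the script comments; a minimal sketch (narrowing the column list and ordering is left to whatever you are investigating):

USE DynamicsPerf
SELECT TOP (50) * FROM QUERY_STATS_CURR_VW
SELECT TOP (50) * FROM INDEX_STATS_CURR_VW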


Although this is not a complete step-by-step guide, it should give enough information on how to configure the performance analyser for Dynamics CRM 2013. Obviously the hardest part of any performance analysis and tuning is actually analysing the data that is recorded and correlating any blocking events back to the code / plugins that generated them.

That’s all for now…

@simonjen1
 

Clone Records in Dynamics CRM

Very often we are troubled with writing large amounts of code and end up spending valuable time cloning an entity record. Just imagine the time needed if you are asked to create a cloned record with more...(read more)

Location, Location, Location: Bing Maps, Parature Come Together on Spatially-enabled Customer Care


NOTE: The following is a guest post by Parature, from Microsoft Technical Specialist, Geoff Innis. Geoff came to Microsoft with the acquisition of Multimap in 2007, and enjoys helping organizations optimize customer engagement and make smarter business decisions with cloud solutions. He is based in Bellevue, WA.

When interacting with a customer, knowing where they are located can enable a customer service representative to provide them with personalized, real-time, location-specific insight and information. Further, by visualizing larger groupings of customers, interactions, or social media events on a map, customer service and related teams can identify trends by location, and use these location-based trends to optimize customer care strategies and behaviors, as well as proactively address issues and provide information.

The Bing Maps Blog recently published a developer-oriented post that shows how organizations can spatially enable their approach to customer care by integrating Bing Maps for Enterprise, Microsoft’s enterprise-grade web-based mapping platform, into Parature.

The blog post provides step-by-step guidance on how to:

  • Capture information about the customer’s location as they submit trouble tickets through a Parature Customer Portal
  • Display map views of individual ticket locations in the Parature Service Desk, to empower CSRs with location context they can use to quickly and accurately resolve customer issues
  • Visualize ‘Heat Maps’ of customer issues within the Service Desk, to enable CSRs and supervisors to identify geographic trends in customer issues, and use this location intelligence to make smarter business decisions. 


By incorporating a map view of a customer location or ticket locations directly into the customer service representative’s ticket view, ticket resolution time can be reduced by avoiding agent context switching when:

  • Handling issues related to finding nearby locations or service points in proximity to the customer
  • Providing information on products or services offered in the customer’s region or specific location
  • Pinpointing precisely where customer-reported incidents or other problem-reports are located at first-contact.


Scenarios in which heat map visualizations of tickets can be of benefit include:

  • Understanding where, geographically, we may have infrastructural issues that need addressing, based on the concentration of service issue reports
  • Understanding which jurisdictions we may need to adapt our constituent outreach programs in
  • Understanding which geographies or demographic groups a brand's Parature Knowledgebase content may not be serving effectively, and adapting it.


With the continued increase of customer and constituent engagement through mobile channels and the proliferation of connected devices, along with growing expectations from the customer for real-time, proactive and personalized service, the opportunities to optimize and streamline customer care using location intelligence will only grow over time. This is a great example of where customer service is heading.

You can find the full developer-centric post on the Bing Maps Blog here.

To request additional information on Parature Customer Service Software, click here.

 

Microsoft Dynamics CRM 2013 Server-Side Exchange Synchronisation


As part of my role at TSG I often get asked about email setup with Dynamics CRM. With Dynamics CRM 4 and 2011, email messages were sent and received either via the Email Router or via Outlook, which was great as you could track email messages against a case, contact, account etc. in CRM. In Dynamics CRM 2013 the Email Router and Outlook are still available; however, as with the previous versions of CRM, you have to have Outlook open in order to synchronise contacts, appointments and tasks, something which doesn't really lend itself to mobile working with the cloud.

One of the new features that came in with Dynamics CRM 2013 was the long awaited Server-Side Exchange Synchronisation.

Here we can see that Server-Side Exchange Synchronisation now runs as part of the CRM Asynchronous Service and connects to the Exchange Web Services. This means we no longer need a dedicated service, and if we're using CRM Online with Exchange on Office 365, we can have them integrate in the cloud and access our Contacts, Appointments, Tasks, and tracked Emails via our PC, Windows Phone, and Surface Pro 3, all using Outlook!

Features of the Server-Side Exchange Synchronisation include the following:

  • Efficient Resource Utilization: Server-Side Synchronization provides management of mailboxes and allows you to disable inactive mailboxes with the Microsoft Dynamics CRM web application. You can prevent resource hogging by applying upper limits to polling intervals and concurrent connections to external email systems.
  • Migrating Email Router Profiles: Switching from the Email Router is simplified by providing the capability to migrate Email Router incoming and outgoing profiles to Server Profiles for Server-Side Synchronization.
  • Service Isolation: Server-Side Synchronization has separate queue-management and configuration settings for asynchronous operations.
  • Error Reporting: Server-Side Synchronization provides error logging and reporting within the Microsoft Dynamics CRM web application.
  • Performance Counters: New counters have been added to the asynchronous service for measuring queue performance and email processing.
 

 

So what are the supported configurations?

The following shows the supported configurations of CRM and Exchange, and which items will be synchronised:

  • CRM Online with Exchange Online: email synchronisation supported; synchronisation of contacts, tasks, and appointments was not available at RTM, but is now available; protocol used: Exchange Web Services.
  • CRM on-premise with Exchange 2010 or Exchange 2013: email synchronisation supported; synchronisation of contacts, tasks, and appointments supported; protocol used: Exchange Web Services.
  • CRM Online or on-premise with POP3 (incoming) / SMTP (outgoing): email synchronisation supported; synchronisation of contacts, tasks, and appointments not supported; protocols used: POP3 and SMTP.
 

Note:
Server-Side Exchange Synchronisation does not support Exchange 2007; the Email Router, however, does.

Exchange Impersonation with Exchange 2010 / 2013 On-Premise

In order to synchronise Appointments, Contacts, and Tasks with an Exchange Server that is running on-premise, we must grant the Exchange impersonation role to the user account that we are going to use to connect to Exchange.

We can do this by going to the Exchange Server, and opening an Exchange Management Shell. Then we need to run the following command:

New-ManagementRoleAssignment -Name "CRM Exchange Service" -Role:ApplicationImpersonation -User crm-exchange.service@test.local

Where crm-exchange.service@test.local is the account we want to use to connect to exchange.
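As an optional check, not part of the original steps, we can read the assignment back from the same Exchange Management Shell to confirm it was created:

Get-ManagementRoleAssignment -Role ApplicationImpersonation | Format-List Name,RoleAssigneeName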

Note:
If we don’t grant the exchange impersonation role to this user, when we test the mailbox later we will receive the following error in the alerts:


Testing without SSL with Dynamics CRM 2013 On-Premise

Dynamics CRM 2013 by default doesn't allow us to save the email credentials in email server profiles and mailbox records if we don't use SSL.
In an on-premise deployment, however, we can disable this for testing purposes. The settings are stored in the DeploymentProperties table within the MSCRM_CONFIG database, but Microsoft recommends changing them via PowerShell. First we load the CRM PowerShell snap-in with the following command:

add-pssnapin Microsoft.Crm.Powershell

To allow the saving of credentials when SSL is not used, we can run the following commands:

$itemSetting = new-object 'System.Collections.Generic.KeyValuePair[String,Object]'("AllowCredentialsEntryViaInsecureChannels",1)
$configEntity = New-Object "Microsoft.Xrm.Sdk.Deployment.ConfigurationEntity"
$configEntity.LogicalName="Deployment"
$configEntity.Attributes = New-Object "Microsoft.Xrm.Sdk.Deployment.AttributeCollection"
$configEntity.Attributes.Add($itemSetting)
Set-CrmAdvancedSetting -Entity $configEntity

To allow the use of connections to servers that do not use SSL, we can run the following commands:

$itemSetting = new-object 'System.Collections.Generic.KeyValuePair[String,Object]'("ECAllowNonSSLEmail",1)
$configEntity = New-Object "Microsoft.Xrm.Sdk.Deployment.ConfigurationEntity"
$configEntity.LogicalName="Deployment"
$configEntity.Attributes = New-Object "Microsoft.Xrm.Sdk.Deployment.AttributeCollection"
$configEntity.Attributes.Add($itemSetting)
Set-CrmAdvancedSetting -Entity $configEntity
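To verify the current values before or after making a change, the same snap-in can read the settings back; a small sketch using Get-CrmAdvancedSetting:

Get-CrmAdvancedSetting -ConfigurationEntityName Deployment -Setting AllowCredentialsEntryViaInsecureChannels
Get-CrmAdvancedSetting -ConfigurationEntityName Deployment -Setting ECAllowNonSSLEmail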

Note:
A guide to using PowerShell with Dynamics CRM can be found on MSDN here, which includes a number of useful cmdlets for administering a Dynamics CRM deployment.

Testing with SSL with Dynamics CRM On-Premise

In order to test with SSL, we must add an SSL certificate to our Dynamics CRM 2013 application server, via IIS Manager.


Then we click on Create Self Signed Certificate from the right hand menu.


We enter a friendly name for the certificate and click OK.

Then we need to change the bindings on the Dynamics CRM website to allow it to use HTTPS / SSL, so we right click on the website and the click Edit Bindings.


We click on the Add button and select HTTPS from the Type drop down, we leave the port as 443 and select our newly created self-signed certificate from the SSL Certificate drop down list, and click OK.

Now we can access the server via https://test-apps-01.test.local; however, it is good practice to tell CRM that we are now using HTTPS via the Deployment Manager.


So on the Web Address tab, we select HTTPS as the Binding Type and ensure the server addresses don't have a port number associated with them; this is because 443 is the default HTTPS port, so we don't need to specify it.
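If we prefer to check this from PowerShell rather than Deployment Manager, the deployment web address values can also be read back with the snap-in we loaded earlier; a minimal sketch (the exact properties returned may vary by version):

Get-CrmSetting WebAddressSettings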

Note: We need to install the Root Certificate Authority certificate on the Dynamics CRM server, and use the fully qualified domain name of the server in Internet Explorer; otherwise we are prompted that the certificate may not be valid because the address doesn't match the server name on the certificate, and when we come to test the mailbox access, this will fail telling us that the certificate is not valid.

How do we set up the Server-Side Exchange Synchronisation with CRM 2013 and Exchange 2013 On-Premise?

We can set up server-side synchronisation by completing the following tasks:

1.       Create an Email Server Profile.

2.       Configure the Mailbox record for the users to use the newly created Email Server Profile.

3.       Configure Microsoft Dynamics CRM 2013 to use Server-Side Email Synchronisation.


From Dynamics CRM we go to Settings > Email Configuration, and select Email Server Profiles.

 

1.       We click on New > Exchange Profile from the ribbon menu

2.       In Name, we type “Exchange Test”.

3.       In Auto Discover Server Location, we select No.

4.       In Incoming Server Location, we type “https://TEST-CORE-02/EWS/Exchange.asmx”. (Where TEST-CORE-02 is the name of our exchange server)

5.       In Outgoing Server Location, type “https://TEST-CORE-02/EWS/Exchange.asmx”. (Where TEST-CORE-02 is the name of our exchange server)

6.       In Authenticate Using, we select Credentials Specified in Email Server Profile. Note: if the User Name and Password fields are disabled, this is because the web application is not accessed using HTTPS and we haven't relaxed the SSL requirement via PowerShell as described above.

7.       We then enter our username (including the domain) and password.

8.       In Use Impersonation, we select Yes.

9.       In Use same settings for Outgoing, we select Yes.

10.   If necessary, we click Advanced to expand the Advanced tab and review the settings.

11.   On the ribbon menu, we click Save & Close.


Now we need to configure a user mailbox record, so from Dynamics CRM we go to Settings > Email Configuration, and select Mailboxes.

1.       We double click on the User record

2.       Then we can select the Email Server Profile from the list, i.e. “Exchange Test”

3.       We can specify the Incoming, Outgoing and what method we wish to use to Synchronise Appointments, Contacts and Tasks.


Once we have saved the Mailbox record, we can then click on the Test & Enable Mailbox from the ribbon menu.

After a short while, once the Async process has processed the request, if everything has worked correctly and we have successfully connected to the Exchange server, we will receive the following messages under Alerts:


Finally we need to configure the global email settings. From Dynamics CRM we go to Settings > Email Configuration, and select Email Configuration Settings.


We can then set Process Email Using to Server-Side Synchronisation, along with specifying the default synchronisation method for our users.

Now that we have successfully set up the Server-Side Exchange Synchronisation, we can log in to our Outlook Web App server via https://test-core-02.test.local/owa and see the test email that CRM created in our inbox, along with some Appointments and Tasks. Clicking on the People tab shows our contacts from Dynamics CRM 2013.


Note:
The reason we use Outlook Web App rather than Outlook is that, if the CRM Outlook plug-in were installed, it could perform the synchronisation itself, which would not prove that the synchronisation actually took place at the server level.

That’s pretty much all there is to it, hopefully you will have found this post useful!

@simonjen1

Microsoft Dynamics CRM 2013 Change Deployment Settings via PowerShell

There are a number of settings within the DeploymentProperties table in the MSCRM_CONFIG database which we sometimes need to change. Microsoft recommends changing these via PowerShell. First we load the CRM PowerShell snap-in with the following command:

add-pssnapin Microsoft.Crm.Powershell

To allow the saving of credentials when SSL is not used, we can run the following commands:

$itemSetting = new-object 'System.Collections.Generic.KeyValuePair[String,Object]'("AllowCredentialsEntryViaInsecureChannels",1)
$configEntity = New-Object "Microsoft.Xrm.Sdk.Deployment.ConfigurationEntity"
$configEntity.LogicalName="Deployment"
$configEntity.Attributes = New-Object "Microsoft.Xrm.Sdk.Deployment.AttributeCollection"
$configEntity.Attributes.Add($itemSetting)
Set-CrmAdvancedSetting -Entity $configEntity

To allow the use of connections to servers that do not use SSL, we can run the following commands:

$itemSetting = new-object 'System.Collections.Generic.KeyValuePair[String,Object]'("ECAllowNonSSLEmail",1)
$configEntity = New-Object "Microsoft.Xrm.Sdk.Deployment.ConfigurationEntity"
$configEntity.LogicalName="Deployment"
$configEntity.Attributes = New-Object "Microsoft.Xrm.Sdk.Deployment.AttributeCollection"
$configEntity.Attributes.Add($itemSetting)
Set-CrmAdvancedSetting -Entity $configEntity
 
There are also a number of other settings across a CRM Deployment which we can change via PowerShell:

To Enable Tracing, we can run the following commands:

$setting = Get-CrmSetting TraceSettings
$setting.Enabled=$True
Set-CrmSetting $setting
 
Get-CrmSetting TraceSettings
CallStack     : True
Categories    : *:Error
Directory     : c:\crmdrop\logs
Enabled       : True
FileSize      : 10
ExtensionData : System.Runtime.Serialization.ExtensionDataObject
 
 
To Disable Tracing, we can run the following commands:

$setting = Get-CrmSetting TraceSettings
$setting.Enabled=$False
Set-CrmSetting $setting
 
Get-CrmSetting TraceSettings
CallStack     : True
Categories    : *:Error
Directory     : c:\crmdrop\logs
Enabled       : False
FileSize      : 10
ExtensionData : System.Runtime.Serialization.ExtensionDataObject

The following list describes some of the kinds of settings, each a separate configuration settings class, that can be changed via PowerShell:

  • Settings values regarding asynchronous jobs that can be used to tune the asynchronous processing service.
  • Settings for claims authentication.
  • Settings that pertain to running custom code.
  • Settings for dashboards.
  • Settings that tune the operation of duplicate detection.
  • Settings for enterprise transaction management that can be used to throttle CRM for Outlook.
  • Settings for Internet facing deployments.
  • Settings that tune a data import operation.
  • Settings for the Marketplace.
  • Monitoring settings.
  • Settings for multi-entity quick find (finds records of different types).
  • Settings for quick find (finds records of one type).
  • Settings for controlling server side synchronizations for accounts, contacts and tasks.
  • Settings for controlling server side synchronizations for emails.
  • Settings for controlling server side synchronizations for queues.
  • SQM settings.
  • Settings for auto created (system-managed) access teams.
  • Settings which can be used to throttle CRM for Outlook.
  • Settings that tune the amount of trace information generated.
  • Deployment root domain address values.
  • Deployment wide workflow settings.
  • Settings for Yammer.
 

Note:
A guide to using PowerShell with Dynamics CRM can be found on MSDN here, which includes a number of useful cmdlets for administering a Dynamics CRM deployment; you can also see a list of all of the Advanced Settings that can be changed via PowerShell here.

This is a short tip but can be useful if you want to quickly change a setting within a CRM 2013 deployment.

@simonjen1

Update HTML on Form Save Button

As we know that, we can embed a HTML page in the Form using IFRAME/Web Resource control. Recently, we had a requirement where for some purpose we need Record Id of the form and some Form field values in...(read more)

Dynamics CRM 2013 & CRM 2013 SP1 Version Numbers


How do we tell what version of the platform we are on?

Well, from the Dynamics CRM 2013 Deployment Manager we can see the list of Organisations along with their version numbers.
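Alternatively, the same information can be pulled with the deployment PowerShell snap-in; a hedged sketch (the organisation property names may differ slightly between versions):

add-pssnapin Microsoft.Crm.Powershell
Get-CrmOrganization | Select-Object UniqueName, Version, State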

 
So what are the version numbers for Dynamics CRM 2013 and Dynamics CRM 2013 SP1?

Dynamics CRM 2013 SP1

Initial Release: 6.01.00.0581

Cumulative Updates / Rollups:
  • Update Rollup 1: version 6.1.xx.xxxx, released TBA

Dynamics CRM 2013

Initial Release: 6.00.00.0809

Cumulative Updates / Rollups:
  • 6.00.01.0061, released December 2013
  • 6.00.02.0051, released April 2014
  • 6.00.03.0106, released July 2014
 
Until next time

@simonjen1

Announcing Spring ’14 Release Videos for Microsoft Dynamics CRM Now Available


Did you know that close to 50,000 YouTube videos are viewed around the world every second? In fact, educational research supports what most of us have known intuitively all along: not only does the use of video help people understand concepts and retain information, it also increases their enthusiasm about the information being presented. We couldn't be more excited about what we are about to share with you. To give you a comprehensive overview of the solutions we have delivered in the Spring ’14 release, we've created some very informative videos for your viewing pleasure. Please check them out!

Microsoft Dynamics CRM

Microsoft Dynamics CRM Spring ’14 Product Updates

Microsoft Dynamics CRM Spring ’14 Service Management Overview

Microsoft Dynamics CRM Spring ’14 Sandbox Instance

Microsoft Dynamics CRM Online License Management

Microsoft Dynamics CRM Spring '14 Case Management Enhancements

Microsoft Dynamics CRM Spring '14 Queue Enhancements

Microsoft Dynamics CRM Spring '14 Routing Rules 

Microsoft Dynamics CRM Spring ’14 Overview of Unified Service Desk

Microsoft Social Listening

Microsoft Social Listening Spring ’14 Overview

Microsoft Social Listening Spring ’14 Sales Scenarios

Microsoft Social Listening Spring ’14 Marketing Scenarios

Microsoft Social Listening Spring ’14 Customer Care Scenarios

Microsoft Social Listening Spring ’14 Trial Overview

Microsoft Social Listening Spring ’14 Trial Walkthrough

Microsoft Social Listening Spring ’14 Understanding Sentiment

Microsoft Social Listening Spring ’14 Tuning search topics

Microsoft Social Listening Spring ’14 Setting Up Search Topic

Microsoft Social Listening Spring '14 Managing Alerts 

Microsoft Social Listening Spring '14 Working with Facebook Pages

Microsoft Social Listening Spring '14 Understanding Quotas

Microsoft Social Listening Spring '14 Analytics Deep Dive

Microsoft Social Listening Spring '14 Identifying Influencers 

Microsoft Social Listening Spring '14 Managing Users and Settings 

Microsoft Social Listening Spring '14 Working With Posts 

Microsoft Dynamics Marketing

Microsoft Dynamics Marketing Spring ’14 Overview

Microsoft Dynamics Marketing Spring '14 CRM Connector Setup

Microsoft Dynamics Marketing Spring '14 Administration in O365

 

Do let us know if they are helpful—we’d love your comments!

Happy viewing,

Post Incident Report | EMEA | CRM Online | A small subset of customer organizations in Europe were inaccessible on August 27, 2014


Summary

 

On August 27, 2014 at approximately 17:45 (UTC+01:00), a small subset of CRM Online organizations served from a European datacenter became temporarily inaccessible. The issue was identified through an internal health check and then quickly resolved by Microsoft Service Engineering using documented mitigation procedures. This issue affected less than 2% of the organizations in the region.

 

Customer Impact

Impacted customers could not access their CRM Online organizations.

 

Incident Start Date and Time

August 27, 2014 17:45 (UTC+01:00)

 

Date and Time Service was Restored

August 27, 2014 18:25 (UTC+01:00)

 

Root Cause

The investigation determined that a SQL server became unresponsive, which put the Availability Group (AG) into an unhealthy state. Further investigation into why the SQL server became unresponsive is in progress.

 

Next Step(s)

Issue: SQL Server Investigation

Next Step: Service Engineering is investigating the root cause for the SQL server becoming unresponsive and will work with the Microsoft SQL team to determine how we prevent it in the future.

Team Owner: Dynamics Service Engineering

Timeline: In Progress

 

Update Fetch Query of System/User View Dynamically

Based on client requirements, it is sometimes necessary to update a system/user view to filter records as needed. In your query you need to apply an outer join, which won't be possible through Advanced...(read more)