Tag Archives: SQL Server

Using an ERD tool to create CRM entities

I get asked this now and then:

“I want to use Erwin [or Visio or Visual Studio or…] to graphically design the CRM entity model, then have the actual CRM entities generated off of the tool I used to design the entities.”

While I am not aware of a direct way to do this with any ERD tools, there is a multi-step process you can use as long as your ERD designer can generate a SQL or Access database:

  • Use your ERD designer to model your entities, fields and relationships
  • Generate a SQL or Access database from the designer
  • Use https://xrmspeedy.codeplex.com/ to generate the CRM entities from the database

Of course, you’ll need to make sure you don’t pick data types in your diagram that the xrmspeedy tool can’t translate to CRM types, but this approach can be a productivity enhancement for those who prefer to diagram their entities and then generate them.  Since this is an open source tool, you might consider contributing feedback if you have ideas for improvement.  After a little trial and error, I’ve found I get value out of using Visual Studio’s Entity Framework Designer plus this tool when brainstorming data models and building POCs.  Your Mileage May Vary (YMMV).
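
As an example, here’s a minimal sketch of the kind of DDL an ERD designer might generate, sticking to SQL types that have natural CRM counterparts (the table and columns are made up, and you should verify the exact type mappings xrmspeedy supports):

    -- Hypothetical table emitted by an ERD designer. These SQL types map
    -- naturally to CRM attribute types: nvarchar -> Single Line of Text,
    -- int -> Whole Number, datetime -> Date and Time, bit -> Two Options,
    -- money -> Currency.
    CREATE TABLE Facility
    (
        FacilityId INT IDENTITY(1,1) PRIMARY KEY,
        Name NVARCHAR(100) NOT NULL,
        OpenedOn DATETIME NULL,
        IsActive BIT NOT NULL DEFAULT 1,
        AnnualBudget MONEY NULL
    );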

@devkeydet

CRM and SQL 2012 AlwaysOn Availability Groups

I’ve spent a fair amount of time lately talking to people about the benefits of using SQL Server 2012 AlwaysOn Availability Groups in your CRM infrastructure.  I just came across a video that’s a great primer on AlwaysOn Availability Groups:

https://channel9.msdn.com/posts/SQL-2012-AlwaysOn-Availability-Groups/player?w=512&h=288

Here is a link to the SQL documentation as well:

http://msdn.microsoft.com/en-us/library/hh510230

AlwaysOn is already used in CRM Online for both local failover/redundancy and geographically distributed failover/redundancy for disaster recovery.  As of Update Rollup 12 for Microsoft Dynamics CRM 2011, the implementation guide has been updated with AlwaysOn guidance:

http://msdn.microsoft.com/en-us/library/jj822357.aspx

One of the nice features of AlwaysOn is Active Secondaries: Readable Secondary Replicas (AlwaysOn Availability Groups).   You can configure custom reports to use the readable replica.  Here’s a good blog post on the topic:

http://blogs.msdn.com/b/thecrmguys/archive/2012/08/10/crm-reporting-and-sql-2012-alwayson-availability-groups-better-together.aspx

Make sure you read the limitations of this approach stated in the blog post, and evaluate whether the benefits apply to your deployment.
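
If you want a feel for how the readable replica gets used, here’s a minimal sketch, assuming read-only routing is configured and using made-up replica, listener, and database names:

    -- Hypothetical names throughout. Allow read-only connections on a
    -- secondary replica as part of configuring the availability group:
    ALTER AVAILABILITY GROUP [CrmAG]
    MODIFY REPLICA ON N'SQLNODE2'
    WITH (SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));

    -- A report Data Source can then opt in to the readable secondary with a
    -- connection string along these lines:
    -- Server=tcp:CrmAgListener,1433;Database=Contoso_MSCRM;
    --   Integrated Security=SSPI;ApplicationIntent=ReadOnly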

@devkeydet

The new bulk API in Polaris/UR12

If you haven’t heard, CRM Online has completed the Polaris rollout and UR12 is available.  With it comes a new bulk API which makes programmatic data import into Dynamics CRM 2011 faster.  Check out Sonoma Partners’ blog post for a simple benchmark of the improvements.  As you can see, it’s really worth moving to this new API.  What?  You don’t do data migration in code? :)  You use a third party tool like the CozyRoc or KingswaySoft SSIS adapters or Scribe?  The good news is that both CozyRoc and KingswaySoft have already updated their adapters to support this new API.  I’ve also updated my Improving the performance of multiple Inserts, Updates, and Deletes via the CRM 2011 Organization Service post to suggest how you can combine the bulk API with parallelism to achieve even better performance.  In fact, if you review my Data Migration with CRM 2011 post, you’ll recognize that the Balanced Data Distributor (BDD) is an add-on to SQL Server Integration Services (SSIS) that enables parallelism within your SSIS package.
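
For those writing code against the Organization Service, the new bulk API is the ExecuteMultiple message.  Here’s a minimal C# sketch of batching creates with it (it assumes the CRM 2011 SDK assemblies, a working IOrganizationService named service, and a collection named entitiesToCreate):

    // Minimal sketch of batching creates with the UR12 ExecuteMultiple message.
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Messages;

    var batch = new ExecuteMultipleRequest
    {
        // Keep going past individual failures; skip per-item responses for speed.
        Settings = new ExecuteMultipleSettings
        {
            ContinueOnError = true,
            ReturnResponses = false
        },
        Requests = new OrganizationRequestCollection()
    };

    foreach (Entity entity in entitiesToCreate) // up to 1,000 requests per batch
    {
        batch.Requests.Add(new CreateRequest { Target = entity });
    }

    var response = (ExecuteMultipleResponse)service.Execute(batch);

Combine batches like this with the parallelism techniques from the posts above and throughput improves further.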

@devkeydet

Converting a SQL report to FetchXML

Scenario:

“I want to move to CRM Online, but I wrote all my SQL Server Reporting Services (SSRS) using the SQL Data Source.  How do I convert them to use FetchXML?”

Below I have three reports:

[screenshot: the three .rdl reports]

All three reports will look identical to the end user once we’re done.

facilityFetch.rdl – Written using the “Microsoft Dynamics CRM Fetch” Data Source available after you install the Microsoft Dynamics CRM 2011 Report Authoring Extension.

facilitySql.rdl – Written using the “Microsoft SQL Server” Data Source.

facilityConvertedSqlToFetch.rdl – Identical to facilitySql.rdl to start, but will be identical to facilityFetch.rdl once converted.

The key to a quick conversion is getting the results of the fetch query to look IDENTICAL to the results of the SQL query.  Basically, you need to go through all of your Datasets and update each one with a matching fetch query.  In my example, DataSet1 in facilityConvertedSqlToFetch.rdl uses a local Data Source called sqlds:

[screenshots: DataSet1 and the sqlds Data Source configuration]

The first step is to create a new fetch Data Source:

[screenshots: creating the new fetch Data Source]

I tend to develop locally for developer productivity, so I am pointing at a local CRM 2011 deployment in my VM.  The format for the connection string is [root];[orgname].  Now that we have the fetch Data Source created, we need to create a matching fetch query.  My favorite way of going about this is here.  So let’s update the Dataset:

[screenshot: updating the Dataset]

Make sure you update the Data Source to use the new fetch data source and replace the query with the right FetchXML query:

[screenshot: the Dataset pointing at the fetch Data Source with the FetchXML query]

Again, the KEY here is that the shape of the Dataset is the same AND the column names are the same.  Save & Preview your new report.  It should “just work.”  For more advanced FetchXML scenarios, see Build Queries with FetchXML.
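
To make the column-name point concrete, here’s a hedged example of a SQL Dataset query and a matching FetchXML query (the entity and columns are illustrative).  The alias attribute is what keeps the FetchXML column names identical to the SQL ones:

    -- Original SQL Dataset query (illustrative):
    SELECT name, address1_city AS city
    FROM FilteredAccount

    <!-- Matching FetchXML: same shape, same column names via alias -->
    <fetch>
      <entity name="account">
        <attribute name="name" />
        <attribute name="address1_city" alias="city" />
      </entity>
    </fetch>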

@devkeydet

Data Migration with CRM 2011

UPDATE (18APR2013): If you find this post interesting, then you will want to watch the Enterprise Data Integration for CRM, featuring Daniel Cai webcast.

“I am building a solution using CRM 2011, but I need to migrate a significant amount of data from an existing system.  The Import Data wizard in the CRM 2011 UI doesn’t meet my needs due to the amount of data and the logic necessary to transform the data from the source system into the destination system.”

This scenario is often referred to as Extract, Transform, and Load (ETL).  The first thing most people look at is SQL Server Integration Services (SSIS), because it is the ETL tool that comes with SQL Server.  The only supported way to perform Create, Update, and Delete operations against CRM 2011 is to go through the CRM Web Services.  While it is possible to call a web service from SSIS out of the box, the reality is that the level of effort is arguably higher than most expect when interacting with the CRM 2011 Web Services.  Because of this, there are two companies (that I am aware of) who provide add-ons to SSIS:

http://cozyroc.com/

http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-crm 

These add-ons greatly improve productivity when performing CRUD operations through the CRM Web Services.  I’ve personally never used either, but I’ve spoken with folks who have, and they claim these add-ons are well worth the license fee compared to creating the SSIS packages without one.  Once people have built SSIS packages to perform data migration, they inevitably want to know how they can improve the performance of the data import.  One of the first things you should consider is executing parts of your package in parallel where possible.  There’s another add-on for SSIS called the Balanced Data Distributor that helps you parallelize:

http://blogs.msdn.com/b/sqlperf/archive/2011/05/25/the-balanced-data-distributor-for-ssis.aspx

This one is free.  Make sure you check out the link to the video in the comments of the post.  Of course, performance will improve if you run this on hardware with more CPUs/cores.  There are also some other things to take into consideration when it comes to the performance of data imports for CRM 2011.

  • First, you want to make sure you’ve optimized your CRM server infrastructure (see here).  Following the optimization guidelines makes a difference.
  • Where possible, deactivate workflows and plugin messages before importing large amounts of data.  Reactivate after import is complete.
  • It’s also possible to improve performance by dropping indexes and rebuilding them after the import is complete.  However, index rebuild time needs to be considered.
  • Disable statistics, then re-enable them after the import is completed (see here); the sketch below illustrates these last two items
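
Here’s a minimal T-SQL sketch of the index and statistics items (the index and database names are made up; weigh this against the linked guidance before touching a production CRM database):

    -- Hypothetical index/database names. Before a large import, disable a
    -- nonclustered index (don't disable clustered indexes; that takes the
    -- table offline), then rebuild it afterwards:
    ALTER INDEX ndx_Account_EmailAddress ON dbo.AccountBase DISABLE;
    -- ... run the import ...
    ALTER INDEX ndx_Account_EmailAddress ON dbo.AccountBase REBUILD;

    -- Turn automatic statistics off for the duration of the import:
    ALTER DATABASE Contoso_MSCRM SET AUTO_CREATE_STATISTICS OFF;
    ALTER DATABASE Contoso_MSCRM SET AUTO_UPDATE_STATISTICS OFF;
    -- ... run the import, then turn them back on:
    ALTER DATABASE Contoso_MSCRM SET AUTO_CREATE_STATISTICS ON;
    ALTER DATABASE Contoso_MSCRM SET AUTO_UPDATE_STATISTICS ON;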

Whether you can apply some of these depends on your data import scenarios.  For example, if you have to import data during normal usage hours, some of these may not be feasible.  But then again, you probably shouldn’t be “migrating a significant amount of data” during normal usage hours.

You may be asking yourself: “What about Scribe?”  Scribe makes wonderful tools that do much more than what’s needed for basic ETL scenarios.  The focus of this post is the scenarios for which ETL with SSIS makes the most sense.

UPDATE: Last, but definitely not least, as a few folks have pointed out, it’s important to understand the fundamentals of a well-built, optimized SSIS package.  This post was intended to cover the CRM nuances.  Here are a few links to help with the fundamentals:

Top 10 SQL Server Integration Services Best Practices

SQL Server Integration Services SSIS Best Practices

 

@devkeydet

My Gov2.0 Expo presentation on Bing Maps / OData

You can find my deck here:

http://cid-1f72da7294089597.skydrive.live.com/browse.aspx/Public/Gov2.0Expo

I plan to have my OData + Entity Framework + SQL 2008 spatial demos packaged/cleaned up some time this week.  I will also record a deeper walkthrough of the demo/sample and publish it to Channel 9 some time soon.

US Census Data for SQL Server 2008

I’m presenting at Gov 2.0 Expo on May 25th:

http://www.gov2expo.com/gov2expo2010/public/schedule/detail/14877

Since SQL Azure will support the Geospatial types soon, I’m planning on showing a demo that shows what will be possible once it does.  For details on the announcement of the upcoming SQL Azure features, see:

http://blogs.msdn.com/sqlazure/archive/2010/03/19/9981936.aspx

As part of my demo preparation, I was looking for some data to use and found this little gem:

http://uscdsql.codeplex.com

“This is a repository for US Census Data for SQL Server 2008 with spatial information. Some of the data that is and will be included will be for states, counties, cities, and zip codes. The project will be updated as the data is available in SQL Server 2008 format.”

I plan on using this project in my demos.  I’d love to see you at my workshop at the Gov 2.0 conference!

SCREENCAST: Running Reporting Services Reports in Windows Azure


In this screencast, I show you how to run a SQL Server Reporting Services 2008 report in Windows Azure using the ReportViewer control that ships with Visual Studio 2010.  As an added bonus, I demonstrate using ReportViewer against an OData service through the use of WCF Data Services client libraries and the ObjectDataSource.
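
If you just want the gist without watching, here’s a hedged C# sketch (the class names, URLs, and report names are all made up).  The screencast binds through an ObjectDataSource; for brevity this sketch adds the data to the local report in code instead:

    // Hedged sketch: pull data from an OData service with the WCF Data
    // Services client, then bind it to a local ReportViewer report.
    using System;
    using System.Linq;
    using Microsoft.Reporting.WebForms;

    // NorthwindEntities is a hypothetical generated OData client class.
    var ctx = new NorthwindEntities(new Uri("https://example.com/Northwind.svc"));
    var customers = ctx.Customers.ToList();

    ReportViewer1.ProcessingMode = ProcessingMode.Local;
    ReportViewer1.LocalReport.ReportPath = "CustomerReport.rdlc";
    ReportViewer1.LocalReport.DataSources.Clear();
    ReportViewer1.LocalReport.DataSources.Add(
        new ReportDataSource("DataSet1", customers));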


Direct link to Ch. 9 post:


https://channel9.msdn.com/posts/keydet/Running-Reporting-Services-Reports-in-Windows-Azure/

Virtual Earth, SQL 2008, & GeoJSON

I’ve been chatting with Vish over email.  He made me aware of his approach to getting SQL 2008 & Virtual Earth to play well together using GeoJSON.  I actually didn’t know about GeoJSON.  Thanks Vish!  Check out his post explaining the approach:

http://viswaug.wordpress.com/2008/06/02/sql-server-2008-virtualearth-made-easier-and-building-a-rest-api/