Wednesday 31 December 2008

Deploying multiple SSIS Packages from Development to Live Environments

This is as much a reminder for myself as a reference for others ;)

There are several ways of deploying SSIS dtsx packages under SQL 2005, but I like this one the best. It allows almost "one touch" style deployment of an entire project of packages. It makes use of package configurations. You may find you need to make several changes to every package in your solution, but you'll only need to do it once.

Once it's done you can copy your packages from DEV to LIVE without having to worry about different locations and connection strings in your live environment.


Setting up your DEV and LIVE environments

  1. Create a new database on your database server; I've called mine SSISConfig. Don't add anything to it - BI Studio will do that for us (there's a one-statement T-SQL sketch at the end of this list).


  2. Create a new environment variable on each machine. I've called mine DW_CONFIG.

    Start -> (right click) My Computer -> Properties -> Advanced -> Environment variables -> New

    Give this variable a value that points to what will become your configuration database (where all other CSV file locations, Excel connection strings etc. will be stored). For example, my database is called SSISConfig:

    Provider=SQLNCLI.1;Data Source=dur-vmdevolap-01;Integrated Security=SSPI;Initial Catalog=SSISConfig

    Note that your changes won't take effect until you restart. You can see what is in effect by opening a cmd window and typing

    SET

    You can make a temporary change that takes effect immediately (in that cmd session only) by using

    SET DW_CONFIG=Provider=SQLNCLI.1;Data Source=dur-vmdevolap-01;Integrated Security=SSPI;Initial Catalog=SSISConfig
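
If you prefer script to the GUI for step 1, the configuration database really is just an empty database - a minimal sketch, assuming default file locations:

-- Step 1: create the empty configuration database.
-- BI Studio will create the [SSIS Configurations] table inside it later.
CREATE DATABASE SSISConfig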

Setting up your SSIS Packages

  1. Open your DTSX project using BI Studio / VS 2005.
    Go to Project Properties -> Deployment Utility -> CreateDeploymentUtility

    Set the value to True


  2. In the "Data Sources" folder of your project, create a datasource pointing at your SSISConfig database as it is found on DEV. It's also a good opportunity to create a Data Source object for each database system your packages might connect to.

  3. Open each package in turn and work through steps 4-7:

  4. Create a new connection under Connection Managers. Select the "New connection from data source" option and choose your SSISConfig data source.

  5. Right click on the design surface and select "Package Configurations" (you may have to enable them first)

  6. Add a new package configuration, select "Environment variable" as the type, and provide the value "DW_CONFIG"
  7. Now add another package configuration, select "SQL Server" as the type and choose the "SSISConfig" connection. Click "New" beside "Configuration table" and accept the defaults. For "Configuration filter" use a name that uniquely identifies this configuration entry (e.g. "DataWarehouse Connection String"). Click Next, then tick the property you wish to add to the configuration (normally "ConnectionString"). The table the defaults create is sketched below.
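
For reference, the table that the wizard creates when you accept the defaults in step 7 looks roughly like this (a sketch from memory - column sizes may differ slightly):

CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter NVARCHAR(255) NOT NULL,  -- e.g. "DataWarehouse Connection String"
    ConfiguredValue NVARCHAR(255) NULL,          -- the value pushed into the package property
    PackagePath NVARCHAR(255) NOT NULL,          -- path to the property inside the package
    ConfiguredValueType NVARCHAR(20) NOT NULL    -- e.g. "String"
)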


When making these changes to further packages, be sure to reuse the same "Configuration Filter" value when you conceptually mean the same thing. It will ask if you wish to "reuse" or "overwrite". Choose reuse.

Once you are finished, inspect the results of your work by running

USE SSISConfig
SELECT * FROM [SSIS Configurations]


You should see your configuration entries. Back up and restore this database to your live server, then change the data in the "ConfiguredValue" column to represent the locations/connections that exist on the live system.
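
You can make that change directly in Management Studio, or with something along these lines (the filter name is the one from step 7; the Data Source and Initial Catalog here are placeholders for your own live values):

USE SSISConfig

-- Placeholder live connection string; the filter name is the one chosen in step 7
UPDATE [SSIS Configurations]
SET ConfiguredValue = 'Provider=SQLNCLI.1;Data Source=MyLiveServer;Integrated Security=SSPI;Initial Catalog=MyLiveDatabase'
WHERE ConfigurationFilter = 'DataWarehouse Connection String'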

Deploying Changes to LIVE

Build your project, then copy the resulting bin\Deployment folder to your live SQL Server (with SSIS installed). Double-click the file with the ".SSISDeploymentManifest" extension. The Package Installation Wizard will load the packages and validate them for you, and will warn you if it can't find the files/environment variables listed in the package configurations. Once this is complete you can execute your new SSIS packages.
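
If you pointed the wizard at SQL Server storage (rather than the file system), you can sanity-check what arrived with a query along these lines - sysdtspackages90 is where SQL 2005 keeps packages deployed to msdb:

-- List packages held in the SQL Server 2005 SSIS package store
-- (only relevant if the deployment wizard targeted SQL Server rather than the file system)
SELECT name, description, createdate
FROM msdb.dbo.sysdtspackages90
ORDER BY createdate DESC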

Friday 16 May 2008

BDC - I just wanna connect!

Using the Business Data Catalogue (BDC to the cool) is still quite painful, and all the articles I've seen cut straight to clever "best practice" methods such as Single Sign-On and pass-through authentication. Sometimes, though, you just want to connect to the BDC using standard SQL Server credentials. The key is the ridiculously named "RevertToSelf" authentication mode. Below is a snippet you can use - simply replace the connection details with valid ones. This should allow you to test everything out much more simply, and to diagnose your problems with SSO etc. in good time.


<LobSystemInstance Name="TestInstance">
  <Properties>
    <Property Name="DatabaseAccessProvider" Type="Microsoft.Office.Server.ApplicationRegistry.SystemSpecific.Db.DbAccessProvider">SqlServer</Property>
    <Property Name="AuthenticationMode" Type="Microsoft.Office.Server.ApplicationRegistry.SystemSpecific.Db.DbAuthenticationMode">RevertToSelf</Property>
    <Property Name="RdbConnection Data Source" Type="System.String">w2k3moss</Property>
    <Property Name="RdbConnection Initial Catalog" Type="System.String">QTest</Property>
    <Property Name="RdbConnection Integrated Security" Type="System.String">false</Property>
    <Property Name="RdbConnection Pooling" Type="System.String">false</Property>
    <Property Name="RdbConnection User ID" Type="System.String">myaccount</Property>
    <Property Name="RdbConnection Password" Type="System.String">mypassword</Property>
  </Properties>
</LobSystemInstance>

Monday 31 March 2008

Wrong content type when selecting "New Item" using content types: Continued

Aha! More progress on this one. I've narrowed it down - this issue only occurs when you use Office 2007 documents that have already been stored in SharePoint 2007 as the template for a content type. That includes documents that were in SharePoint and now are not, as well as documents currently stored in a library.

If you create your templates from files using good ol' fashioned pre-SharePoint storage (hard disk ;) ) and upload them using the "upload a new document template" option along with the content type, then you will get the desired results.



Hmm, I'm waiting for Microsoft to try and convince me now that this is a "known issue by design". It's a clear bug. Once SharePoint has added a customXml folder and associated files into the docx package you're pretty much screwed. This also puts an end to any ideas of using a standards library as the basis for all the content types used within the business. Also - try getting your Quick Parts spot on if you can't ever put the template in the library to get the Quick Parts list populated...

The workaround suggested elsewhere - upload documents only, don't create from new - is pretty lame I think. Let's see what Microsoft have to say...

Code Signing for Click Once - The Free Way

When writing ClickOnce applications deployed over the web you are going to need to sign your manifest files with a certificate. In development you can quite happily test this by using the "Create Test Certificate" option to generate a certificate signed by yourself.


It's not ideal though - particularly if more than one person is developing and deploying the product - as each person ends up with their own certificate. Deploying to test becomes a chore as you have to uninstall from the test device each time to avoid manifest errors.


This issue is compounded when you have components in the "partially trusted caller" category, such as InfoPath forms running inside the Windows InfoPath form control. Each of these forms can also be signed with a test certificate - the end result being that anyone who makes a change to a form will be told that the certificate is invalid, re-sign it with their own, and have further fun deploying to test.

The answer is to make a common certificate for code signing - using the makecert.exe tool. For example:

C:\Program Files\Microsoft Visual Studio 8\VC>makecert -r -pe -n "CN=Temp Code Signing" -b 01/01/2005 -e 01/01/2100 -sky exchange -ss my

creates a self-signed certificate with a public/private key pair in the "my" (Personal) section of your certificate store. You can export it using the certificate store MMC snap-in, and use the "Select from File" option to add the certificate to your project.



Now it's a project file, the certificate isn't called "P_Devenney" or some rubbish, and you can sign your InfoPath forms with it too. Each developer can sign with the same certificate, saving a load of test deployment hassle and simulating a live scenario far better.



You can actually use this in a live environment too - if you accept its disadvantage of being highlighted to the user as an unverifiable certificate. It does have an advantage over commercial code signing certificates too - you'll see my cert was set to expire in 2100! Unfortunately most providers re-sign each renewed code signing certificate with a different private key - meaning that the end user has no choice but to uninstall and reinstall, having received scary warnings that the manifest is not from the same publisher! We actually use the makecert certificate in some intranet environments, as trusting the certificate once on each device is no real hassle.



Saturday 22 March 2008

Selecting Content Types from the "New" menu gets the wrong content type

The scenario

  • Create 2 brand new content types. Provide each one with a Word 2007 document as the template (each with some different text to test success).
  • Attach both content types to a document library.
  • From the "new" dropdown on the doc library select the last content type on the list.
  • Add some text and save back. The content type attributed to the document will be the first one in the list (the default) *even though* we selected the correct type.
  • Saving to disk and inspecting "item4.xml" inside the docx shows the wrong (default) content type. Correct the document's content type in SharePoint, download it to disk and inspect item4.xml again and you will see the correct content type.

This is a bit of a weird one. I'm going to do a bit of investigation and report back through my blog. It's rather frustrating when you have a different workflow that kicks off on the creation of a document of each content type.

Friday 21 March 2008

The B2B upgrader timer job failed

I found a great article on getting WSS 3.0 SP1 working after receiving the worrying "The B2B upgrader timer job failed" error message. I'm only linking it here in order to try and bump it up good so others can benefit.

Thursday 21 February 2008

SQL Server 2005 Partition Problems

We have a replication scenario with SQL Server 2005 merge replication publications on development, test and live servers. Recently we had an issue where the test server publication, having not been touched for about 3 months (between releases), suddenly stopped synchronising correctly with the following message:

The merge process failed because it detected a mismatch between the replication metadata of the two replicas, such that some changes could be lost leading to non-convergence. This could be due to the subscriber not having synchronized within the retention period, or because of one of the replicas being restored to a backup older than retention period, or because of the publisher performing more aggressive cleanup on articles of type download-only and articles with partition_options = 3


The effect was that all the data seemed to come down, but the result of the synchronisation attempt was failure. A bit of googling turned up an incredibly similar issue where partition_options=3. There is a hotfix available on request for this exact issue. Helpfully Microsoft say "change this", but leave you with the usual link-to-link hell of trying to find a posting on MSDN that actually tells you how to make the change.

Now in our case we were actually using partition_options=0, but I found that following the process below fixed it for me:

  1. Set partition_options=1
    To do this you will need to use partition groups. Execute the following SQL against your publication database:

    sp_changemergepublication @publication='MyPublication', @force_invalidate_snapshot=1, @property='use_partition_groups', @value='true'

  2. Then, either run the equivalent stored procedure statements (there's a sketch after this list), or as I did, go through Management Studio -> Replication -> Local Publications -> "My Publication" -> Properties -> Articles -> Article Properties -> Set Properties of all article tables
    Change the "Partition Options" from "Overlapping" to "Overlapping, disallow out-of-partition data changes"

  3. Reinitialise the snapshot

  4. Change the partition options back to overlapping

  5. Run sp_changemergepublication @publication='MyPublication', @force_invalidate_snapshot=1, @property='use_partition_groups', @value='false'

  6. Reinitialise the snapshot.
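
For steps 2 and 4, the stored procedure route is sp_changemergearticle run against each article in the publication. A sketch (the article name is a placeholder - repeat per article):

-- Step 2: switch to "Overlapping, disallow out-of-partition data changes" (partition_options = 1)
EXEC sp_changemergearticle @publication='MyPublication', @article='MyArticle', @property='partition_options', @value='1', @force_invalidate_snapshot=1, @force_reinit_subscription=1

-- Step 4: switch back to "Overlapping" (partition_options = 0)
EXEC sp_changemergearticle @publication='MyPublication', @article='MyArticle', @property='partition_options', @value='0', @force_invalidate_snapshot=1, @force_reinit_subscription=1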


Synchronisation seems to have worked fine from this point. As this was our UAT environment there was little actual harm done, but you'd probably want to take precautions in a live environment - as unmerged data on local devices would be lost in the above process.

Tuesday 8 January 2008

OWSTimer Hogging Processor Part 2

It seems there was more...

The fundamental issue in this VM environment turned out, in all probability, not to be the time sync issue, but a lack of resources in the environment.

Three server pools had been set up: dev, test and live. VMWare allocated the resources 33% to each. There were three live servers, and one of them had been given an exact memory allocation equating to most of the resources available to the server pool.

Of course, if you only ever log into your VMs through RDP then your machine will tell you how much RAM the VM image believes to be in its "hardware". The effect: the VMWare Infrastructure client reports that resources are not fully utilised (so no problems), but your VM guest reports maximum processor usage (normally OWSTimer is the offender when you notice, or sometimes the mssearch service).

The answer in the case of this environment was simply to remove the server pools and allow all the servers to contend normally for the resources they required. Server pools are a double-edged sword....