Tag Archives: Solution Design

Delete and Constraint Handling Stored Procedure Using SQL Server 2000

This is an older article I wrote on Code Project. The solution stems from a unique problem I encountered when assigned to fix a WinForms application created by a nightmarish code generator. It deals with SQL Server 2000 and is probably no longer applicable, but it is definitely an interesting read. If anything, it may inspire a WTF.

Here is an excerpt:

I was providing consulting services, developing an application in C# and .NET 2.0 with a SQL Server 2000 database. The application framework was based on a custom code generator, which created SQL scripts, business objects, and UI code. Because the previous consultant who designed the code generator was no longer available, and time constraints prevented us from reverse-engineering and modifying it, I had to take a different approach to my solutions.

In this article, I describe how I created the ability to perform deletes. The framework did not include proper scripts for cleaning up and deleting records, and for reasons beyond the discussion of this article, I chose to do all of the constraint checks and deletes from a stored procedure, and return an error message if any constraint violations would occur from a delete.

The solution actually involves permanently modifying the master database and system-level objects. When I wrote the article, I may not have clearly stated the motivation for the solution in the first place: SQL Server 2000 does not do well when cascade-deleting a record that has a large number of constraints across very large tables. If you look at the master-level stored procedure, it checks the constraints but then lets the delete throw the error anyway. This results in very slow deletes; in this case it could take up to a minute just to return an error. The solution is an attempt to rewrite those stored procedures to behave correctly. I wouldn't recommend it, and I would never do it again, but in this case I had no choice. Here is the article for the full implementation:



Issues integrating AXIS Web Services with .NET

I have been integrating with ITIM 5.0 using the unsupported Web Services, which are built on AXIS 1.3. All of the issues I have experienced were fixed in AXIS 1.4, and seeing as the product is over five years old, I don't understand why they chose to stick with the older version. Beyond the well-publicized problem of AXIS representing arrays by pointing to a centralized value store, I have found several other issues.

The first is namespacing. AXIS generates WSDL with a namespace, but the return values come back without one, so .NET simply fails to find the value during deserialization. It took me forever to figure out why; I solved it by manually removing the namespace requirement from the proxy code.

The second issue is how the WSDL represents collections of items. The collection is named SomeCollection, with the items underneath named Item. When the service responds, however, the items are named the same as the collection (SomeCollection contains SomeCollection elements). I had to manually rename all of these references in the proxy to match the collection name. These two issues cost me countless hours; if I find any more I will post them here. Obviously the real solution is to upgrade to AXIS 1.4, but IBM won't support that or turn over the source code to do so.
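To make the two proxy fixes concrete, here is a sketch of what the hand edits to the generated proxy code look like. The type, member, and namespace names below are hypothetical stand-ins, not the real ITIM WSDL; the real generated members would differ.

```csharp
// Hypothetical fragment of a Visual Studio-generated proxy class.
// Issue 1: the generator qualifies the element with the service namespace,
// but the response element comes back unqualified. Removing the Namespace
// argument (or marking the form unqualified) lets deserialization find it.
//
// Before:
//   [System.Xml.Serialization.XmlElementAttribute(Namespace = "http://ws.itim.example")]
//   public string sessionID;
// After:
[System.Xml.Serialization.XmlElementAttribute(
    Form = System.Xml.Schema.XmlSchemaForm.Unqualified)]
public string sessionID;

// Issue 2: the response names each item after the collection itself, so the
// element name on the array member has to be renamed to the collection name.
//
// Before:
//   [System.Xml.Serialization.XmlElementAttribute("Item")]
// After:
[System.Xml.Serialization.XmlElementAttribute("SomeCollection")]
public Item[] SomeCollection;
```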

Clone Wars: Object copy using reflection

Ever needed to copy the properties of one object to another where the objects did not inherit from a common interface or abstract class? In defense of the pragmatic programmer, this situation should never happen. However, I have run into it twice when dealing with web services from an external Java source. You first must call a Session service to get a "ticket" back that must be included at the top of each subsequent web service request. The session object is just an XML-serialized object that must be deserialized into a SessionObject. The problem is that when Visual Studio creates the proxy against the WSDL for the other services, each proxy encapsulates its own version of SessionObject. Even though they have the same schema, they are not the same object to .NET. This particular web service suite requires this type of object sharing a lot, so I created a helper function that simplifies the object copy using reflection. The thing I like is that you could theoretically fill a simple object based on a more complex object that contains more properties.
// Requires the System and System.Reflection namespaces.
private object CopyObject(object objSource, object objTarget)
{
   Type typeSource = objSource.GetType();

   foreach (PropertyInfo propTarget in objTarget.GetType().GetProperties())
   {
      // Match by name; skip properties the source lacks or the target cannot write.
      PropertyInfo propSource = typeSource.GetProperty(propTarget.Name);
      if (propSource == null || !propTarget.CanWrite)
         continue;

      object valueSource = propSource.GetValue(objSource, null);

      if (propSource.PropertyType.Equals(propTarget.PropertyType))
         propTarget.SetValue(objTarget, valueSource, null);  // same type: copy directly
      else if (valueSource == null)
         propTarget.SetValue(objTarget, null, null);
      else
      {
         // Different but structurally similar types: recurse into a new instance.
         object newTarget = Activator.CreateInstance(propTarget.PropertyType);
         newTarget = CopyObject(valueSource, newTarget);
         propTarget.SetValue(objTarget, newTarget, null);
      }
   }

   return objTarget;
}
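As a hypothetical usage sketch, here is the SessionObject scenario described above. The two types below are stand-ins for the per-service proxy classes Visual Studio would generate; they are not real generated code.

```csharp
// Stand-ins for two generated proxy types that share a schema
// but are distinct types to .NET.
public class SessionObjectA { public string Ticket { get; set; } }
public class SessionObjectB { public string Ticket { get; set; } }

// Copy the ticket returned by the Session service into the
// SessionObject type another service's proxy expects.
SessionObjectA source = new SessionObjectA();
source.Ticket = "abc123";
SessionObjectB target = (SessionObjectB)CopyObject(source, new SessionObjectB());
// target.Ticket is now "abc123"
```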

Die, Die My Darling: a GAC alternative

The Global Assembly Cache is an excellent feature and does provide a solution to DLL Hell (if you are a software company with a release strategy, that is). But what about the rest of us who live in the real world, with aggressive timelines and budgets? We have one release, called "Production". The GAC is a decent solution for globalizing custom framework and helper assemblies across a suite of applications, but it is not the right solution for most of us, especially those in the corporate sector.

The Pros: globalized assemblies, supports multiple concurrent versions, assemblies usually run with full trust, can be installed/uninstalled with a setup project.

The Cons: cannot be xcopy deployed, gacutil is unreliable, must physically be on the machine as an administrator to install, IIS applications do not automatically restart after a DLL is published, complicates the development cycle.

What are the alternatives? If the GAC were all-that-and-a-bag-o-chips, you wouldn't see forum post after forum post asking for ways to get around it.

All of my applications (ranging from ASP.NET to Windows Services) reference a fairly large central framework that handles database actions, caching, encryption, etc. Trying to copy this DLL around could mean deploying it to 25 or more different bin directories (and the list continues to grow). That would have forced me to create a fairly robust automated deployment mechanism made up of manifests and such. Yuck. My previous solution was the GAC, which was a very hard sell to someone like me in the first place. After months of compounding small frustrations, I decided to look for alternatives. The answer is surprisingly simple, yet nearly impossible to find searching the Internet.

Information on assembly binding can be found here: http://msdn.microsoft.com/en-us/library/0ash1ksb.aspx

Basically, this redirects the runtime's assembly binder to an alternate assembly file when it resolves the reference. There was one trick that threw me off for a day or two: codeBase only accepts an href, not a plain file path. You must use the file-URL syntax you might see when browsing a directory through Internet Explorer. The benefits of this method are the same as placing assemblies directly in the bin folder, but you only have to deploy to a single location. You can even point different applications at different versions if necessary.
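If you ever need to produce that file-URL form programmatically, the System.Uri class will do the conversion for you. The path below is just an example:

```csharp
using System;

class FileUrlExample
{
    static void Main()
    {
        // Convert a plain Windows path into the file URL codeBase expects.
        Uri href = new Uri(@"C:\nBaked\bin\nBaked.dll");
        Console.WriteLine(href.AbsoluteUri);  // file:///C:/nBaked/bin/nBaked.dll
    }
}
```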

<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
  <dependentAssembly>
    <assemblyIdentity name="nBaked" publicKeyToken="2a6f8gc6611eh9s1" culture="neutral" />
    <codeBase version="" href="file:///C:/nBaked/bin/nBaked.dll" />
  </dependentAssembly>
</assemblyBinding>