Monday, December 20, 2004

C# DateTime and SQL Server DateTime Field Woes

Well, it appears all is not well with conversions between datetime types in C# and SQL Server. I was getting an error when executing a stored procedure and passing a datetime parameter. Primarily, I was trying to pass an empty or null value to the database and haven't yet found a solid way to do it. Instead, I thought I would trap the minimum DateTime value on the C# side and insert null at the stored procedure side. Well, as it turns out, DateTime.MinValue and SqlDateTime.MinValue are not equal.


So my next attempt was to set the value to System.Data.SqlTypes.SqlDateTime.MinValue. However, my field is a smalldatetime, and its minimum is different again. I ended up coding in DateTime.Parse("1/1/1900"), as that is the minimum smalldatetime value.


For future reference, here are the minimums:

  • DateTime.MinValue = 1/1/0001

  • SqlDateTime.MinValue = 1/1/1753

  • smalldatetime minimum = 1/1/1900


I wish they had a SqlSmallDateTime.MinValue (it would make sense).
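
For what it's worth, here is a rough sketch of the check I ended up building on the C# side. The stored procedure and parameter names below are made up for illustration; the pattern is simply to swap DateTime.MinValue for DBNull.Value (or the smalldatetime floor) before making the call.

using System;
using System.Data;
using System.Data.SqlClient;

public class DateParamSample
{
    // Minimum value a smalldatetime column will accept.
    private static readonly DateTime SmallDateTimeMin = DateTime.Parse("1/1/1900");

    public static void CallProc(SqlConnection conn, DateTime someDate)
    {
        SqlCommand cmd = new SqlCommand("mysp_SaveRecord", conn);  // hypothetical proc name
        cmd.CommandType = CommandType.StoredProcedure;

        SqlParameter dateParam = cmd.Parameters.Add("@SomeDate", SqlDbType.SmallDateTime);

        // Treat an "empty" C# date as NULL rather than letting it overflow the column.
        if (someDate == DateTime.MinValue)
            dateParam.Value = DBNull.Value;   // or SmallDateTimeMin if the column doesn't allow NULL
        else
            dateParam.Value = someDate;

        cmd.ExecuteNonQuery();
    }
}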

Wednesday, November 03, 2004

XML Transformations with C# Part 2

Well, I found the error in the function from my previous post about XML Transformations with C#. The line with:


XmlWriter tempXmlWriter = new XmlTextWriter(tempStringBuilder, xSet);


needs to be:


XmlWriter tempXmlWriter = XmlTextWriter.Create(tempStringBuilder, xSet);


Just goes to show you how transferring code from one test system to another can create pain points.

Monday, November 01, 2004

XSL-Transformations with C#

OK, I think I have a functional method for applying XSLTs to an XML document. If anyone sees things that don't work, let me know and I'll fix it and repost.


public XmlDocument myXmlTransformation(XmlDocument xmlDoc, XmlDocument xsltDoc)

{

    XmlReader xsltReader = new XmlTextReader(xsltDoc.InnerXml, XmlNodeType.Document, null);

    System.Text.StringBuilder tempStringBuilder = new System.Text.StringBuilder();

    XmlWriterSettings xSet = new XmlWriterSettings();

    xSet.Encoding = System.Text.Encoding.UTF32;

    XmlWriter tempXmlWriter = new XmlTextWriter(tempStringBuilder, xSet);

    XsltCommand transCommand = new XsltCommand();



    //Compile the stylesheet and execute it against the xml document

    transCommand.Compile(xsltReader);

    transCommand.Execute(xmlDoc, tempXmlWriter);


    //Reset to an XmlDocument object and return

    XmlDocument returnDoc = new XmlDocument();

    returnDoc.LoadXml(tempStringBuilder.ToString());

    return returnDoc;

}


Go figure, I test compile this right before I hit "publish" and something is broken. I'll update when I have a chance.

Saturday, October 09, 2004

Visual Studio 2005 and MS Word Library

Lately I've been attempting to instantiate an MS Word document in order to convert it to XML. I'll keep logging my progress toward the end result.



I am aware of a couple of ways to open a Word document from ASP.NET 2.0. The first is to use the traditional COM objects (along with the Microsoft.Office.Tools.Word .NET library). I'm going to attempt this one first. The second is to use an MS Word project directly from Visual Studio. Since I know next to nothing about that method, I will only go further down that road if I can find more documentation. The help docs are still fairly loose and incomplete (I have found several links that don't work).



At any rate, I'll continue to note what happens and my progress.

Friday, October 01, 2004

Firefox, Can IE Ties be Broken?

I've been messing with Firefox now for about two months. I think it is a great product and I like not being tied to Internet Explorer...or am I still tied to IE? I've installed Firefox on two systems that I use quite often. As a matter of fact, I bounce back and forth regularly. I love many of its features (such as native tabbed browsing), but I continually find myself using both IE and Firefox.

So when I combine two browsers with two systems, I get four places to search if I'm trying to do a history lookup for a site I was at a while ago (and various other things). You might say the answer is to only use Firefox. Well, both AOL IM and MSN IM use Internet Explorer exclusively. So you want to click on some news story from AIM...tough, you are using IE.

The latest release of Firefox (1.0 Preview Release) did fix one of the most annoying issues I've had. When clicking on a hyperlink in an email from MS Outlook, I would get an Outlook error saying the URL could not be found, even though Firefox opened and went to the right page. Not that it was a big deal, just annoying (thank you for fixing this, Mozilla).

I guess it boils down to the fact that for now, not even MS partners (let alone MS) respect your right to choose a browser. I won't stop using Firefox, but I don't see any option but to use multiple browsers.

Tuesday, August 24, 2004

XML from MS SQL Server 2005 Using FOR XML PATH

To begin with...wow. A quick Google search and ten minutes' worth of reading can gain you so much. It sure doesn't take long to appreciate the changes Microsoft has made with FOR XML PATH. My Google search on "FOR XML PATH" returned a winner on the first hit: Michael Rys's blog had exactly what I was looking for. At any rate, here is yesterday's work wrapped up into one SELECT statement.



SELECT BaseTable.BaseTableId AS "baseId",

    BaseTable.BaseName AS "baseName",

    BaseTable.BaseTypeId AS "baseType/@baseTypeId",

    BaseTableType.BaseTableTypeName AS "baseType/text()"

FROM BaseTable

    LEFT OUTER JOIN BaseTableType ON BaseTable.BaseTableTypeId = BaseTableType.BaseTableTypeId

WHERE BaseTableId = 5

FOR XML PATH('baseTable')


The result is exactly what I wanted (and in much less time).


<baseTable>

 <baseId>5</baseId>

 <baseName>Name of Base</baseName>

 <baseType baseTypeId="1">A Type of Base</baseType>

</baseTable>

Monday, August 23, 2004

XML from MS SQL Server 2005 Using FOR XML EXPLICIT

Lately I have been working on extracting content from an MSSQL 2005 Server in an XML format. Starting with Query Analyzer, I have been building up my FOR XML EXPLICIT statement. Here is the path I have taken:



SELECT 1 AS Tag, NULL AS Parent, BaseTable.BaseTableId AS [baseRoot!1!baseId]

FROM BaseTable WHERE BaseTableId = @BaseTableId

FOR XML EXPLICIT


The result of this statement is the following xml:


<baseRoot baseId="5" />


Next, I wanted to make the ID an element instead of an attribute. This is easily accomplished by exchanging [baseRoot!1!baseId] with [baseRoot!1!baseId!ELEMENT]. Now I have the following xml document:



<baseRoot>

 <baseId>5</baseId>

</baseRoot>


My next step is to add additional fields to my base table configuration. So in order to add a baseName field, here is my new select statement:



SELECT 1 AS Tag,

    NULL AS Parent,

    BaseTable.BaseTableId AS [baseRoot!1!baseId!ELEMENT],

    BaseTable.BaseName AS [baseRoot!1!baseName!ELEMENT]

FROM BaseTable WHERE BaseTableId = @BaseTableId

FOR XML EXPLICIT


And my result is as expected:



<baseRoot>

 <baseId>5</baseId>

 <baseName>NewName</baseName>

</baseRoot>


Next we need to dive a bit deeper and add a foreign key relationship for BaseType. I updated my statement as follows:



SELECT 1 AS Tag,

    NULL AS Parent,

    BaseTable.BaseTableId AS [baseRoot!1!baseId!ELEMENT],

    BaseTable.BaseName AS [baseRoot!1!baseName!ELEMENT],

    BaseTableType.BaseTableTypeName AS [baseRoot!1!baseType!ELEMENT]

FROM BaseTable LEFT OUTER JOIN BaseTableType ON

    BaseTable.BaseTableTypeId = BaseTableType.BaseTableTypeId

WHERE BaseTableId = @BaseTableId

FOR XML EXPLICIT


Again, my result is as follows:



<baseRoot>

 <baseId>5</baseId>

 <baseName>NewName</baseName>

 <baseType>Type A</baseType>

</baseRoot>


Now is where I started to get into trouble. Instead of having a separate level to model the type, I want to add an attribute so the item looks like this: <baseType baseTypeId="1">Type A</baseType>. Well, from my research, with EXPLICIT this can only be done with a UNION and a second-level node. So the code would look something like this:



SELECT 1 AS Tag,

    NULL AS Parent,

    BaseTable.BaseTableId AS [baseRoot!1!baseId!ELEMENT],

    BaseTable.BaseName AS [baseRoot!1!baseName!ELEMENT],

    NULL AS [baseType!2!baseTypeId],

    NULL AS [baseType!2!baseTypeName!ELEMENT]

FROM BaseTable LEFT OUTER JOIN BaseTableType ON

    BaseTable.BaseTableTypeId = BaseTableType.BaseTableTypeId

WHERE BaseTableId = @BaseTableId

UNION ALL

SELECT 2,

    1,

    BaseTable.BaseTableId,

    BaseTable.BaseName,

    BaseTable.BaseTableTypeId,

    BaseTableType.BaseTableTypeName

FROM BaseTable LEFT OUTER JOIN BaseTableType ON

    BaseTable.BaseTableTypeId = BaseTableType.BaseTableTypeId

WHERE BaseTableId = @BaseTableId

FOR XML EXPLICIT


That doesn't give me exactly what I want, but it is close enough for now. I'm sure it would only take a little more education and tweaking on my part to get what I want from EXPLICIT. However, I can see that as I add complexity to my fetch, the EXPLICIT statement becomes more and more painful. Instead, I'm going to investigate nested FOR XML statements and finally FOR XML PATH queries. I will post on those topics soon.

Sunday, July 18, 2004

Yukon - Quick Rant

Overall, I have very positive things to say about the most recent moves by Microsoft with SQL Server 2005. However, after installing the Express version, I was having trouble locating the Books Online. It appears my trouble comes from the fact that they aren't included in the SQL Server Express install. I could be wrong, but I was unable to locate the appropriate .chm file. At any rate, here is a link to the Books Online, additional help, and sample files. I know the link is clearly displayed on the Yukon Express download page, but I was sure they would include the Books Online in the installation files (especially since the installer actually says it installs them).



Thursday, July 15, 2004

Setting up SQL Server 2005 "Yukon" and Visual Studio 2005 "Whidbey"

I'm sure this is academic for some, but I'll run through how I set up these two beta applications and got them talking.


I should probably start off with some background on the setup. I have a Windows Server 2003 system (referred to as "the server" below) running MS SQL Server 2005 Express (essentially Beta 2) and MS Visual SourceSafe 6.0d. Another system on the LAN runs Windows XP with Visual Studio 2005 Beta 1 (referred to as "the LAN system" below). And finally, I have another Windows XP system running Visual Studio 2005 Beta 1, but that system is located outside the firewall with VPN access (referred to as "the remote system" below). I won't go into the specifics of the VPN solution, but it may play an important part in remote access to the SQL Server for different people in different environments.


One other noteworthy thing about the environment is that these systems sit apart from the domain (the Windows 2003 server is set up as a standalone system).


After installing Yukon and Visual SourceSafe on the server and VS 2005 on the LAN and remote systems, we were able to open the VSS project by selecting File - Source Control - MS Visual SourceSafe and then entering the properties of the SourceSafe database. Once the connection was set up, we could choose File - Open - Project/Solution, select MS Visual SourceSafe, pick the database we had already attached to, and finally select a solution. I only mention this since MS moved where you open a VSS project from within Visual Studio.


This is the point at which the LAN system and remote system split paths. For the LAN system, we had some difficulty adding a connection to the server. When attempts were made, we kept getting the error: No connection could be made because the target machine actively refused it. As it turns out, we had forgotten to install the Visual Studio Remote Debugger. Another thing that needs to happen is to enable TCP traffic for SQL Server and start the SQL Browser service. To enable TCP traffic, open the SQL Computer Manager and drill into Server Network Configuration - Protocols for your instance. Once you click on Protocols, you should see SM (shared memory), NP (named pipes), TCP, and VIA (Virtual Interface Architecture). Right-click TCP and select Enable. Once TCP is enabled, open the Services window, locate SQL Browser and double-click it, select a Startup Type of Automatic, and (finally) click Start. Once that was up and going, the LAN system worked like a champ.
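
For anyone who wants a quick sanity check outside of Visual Studio once TCP and SQL Browser are going, a tiny console snippet like this will tell you whether the named instance is reachable. The server name, instance name, and authentication style here are placeholders; adjust them to match your own setup.

using System;
using System.Data.SqlClient;

class YukonConnectionCheck
{
    static void Main()
    {
        // "TheServer" and "SQLEXPRESS" are placeholders. SQL Browser is what
        // resolves the named instance to its TCP port once TCP is enabled.
        string connStr = @"Data Source=TheServer\SQLEXPRESS;Initial Catalog=master;Integrated Security=True";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            Console.WriteLine("Connected, server version: " + conn.ServerVersion);
        }
    }
}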


I had trouble with the remote system, but it turned out to be a performance issue. As I was trying to connect, I also had a fairly large file copying to another destination. I continually received the following error: Unknown Provider - Error Locating Server/Instance Specified. Once the file finished copying, I was able to connect without any problems. If anyone is getting the error above, I suggest looking at bandwidth (although I'm sure that isn't the only way to receive that generic error).

Wednesday, July 14, 2004

MS Windows 2003 Server Reboot Logging

While I haven't had extensive experience with Windows 2003 Server, I noticed the logging function that asks for a reason for a reboot. Let me just say that the ability to record what category a reboot falls into, along with a description for details, is a definite plus. Don't get me wrong: best practices say to log what changes you make to a server and when you make them, but this simplifies some of the process (especially if you have many people doing the work).

At any rate, I'm curious how fully functional the logging is. For example, if I run Windows Update to apply a security patch, does the prompt for reboot automatically log this data? If so, VERY cool. If not, you could still use it by saying no at the prompt and then rebooting through normal means. Not as cool, but still useful.

Thursday, July 08, 2004

MS Virtual PC 2004 First Impressions

Well, I must say that my first impressions of MS Virtual PC 2004 are mixed. My basis for comparison is an old version of VMware Workstation, and it has been some time since I last ran it. In short, my memory of VMware is weak at best, so a direct comparison will not take place.


The first notable thing in my experience is that I was able to reuse and activate the same Windows XP Pro key that my system is using for the base OS. I was a little surprised, but then I've never taken the time to fully understand the activation rules from MS. After setting up and updating the OS, I installed the Virtual Machine Additions for Virtual PC. This loaded an ISO image into the Virtual PC's CD-ROM drive. Loading an ISO as an available disk is nothing new, but I was happy to see the functionality. Terminal Server seems to be the closest comparison to the experience of Virtual PC after installing the VM Additions. Since I consume Terminal Services quite a bit, I felt comfortable with the relation.


Once I had the whole environment set up, I wanted to test the extreme case of copying it to another system (actually, I went from a laptop to a desktop). Since I had allocated 6 GB for the Virtual PC image and I'm limited to 100 megabits, the copy took some time. However, to my surprise, the copy was successful and the other installation of Virtual PC was able to open and work in the copied environment. In opening the existing instance, the only thing I had to do was make sure the memory was configured the same (and I might have even been able to change that).


Overall, I think this is a pretty decent solution for my testing purposes. At times the system is pretty slow, but I don't have any basis for comparison, and I am quite positive I could tweak my base OS to better accommodate the virtual environment (allow for more memory, cut back on background services, etc.).

Friday, July 02, 2004

How much can you learn at once?

Well, I guess when a person jumps off a diving board, they had better make sure it's into the deep end. This week's initiative is to install MS Virtual PC 2004 (which is very simple), .NET Framework 2.0 Beta 1, MS SQL Server 2005 Express, and MS Visual Studio 2005 Beta 1 on my laptop. I should probably sum up my experience with all the products mentioned: zero. Granted, I use VS 2003, .NET Framework 1.1, and SQL Server 2000, but I understand the newer versions to be significant updates. Over the next few days, I will post my findings and comment on likes and dislikes.

If anyone has any specific examples of things to check out with these products, please comment here.

Wednesday, June 16, 2004

Cloning and Inserting an XML Node with C#

I've recently had some trouble locating some code to append data to an XML file. Just in case someone else out there is having the same problem, here is some code to assist.


//Instantiate an XML Document and load the file

XmlDocument objDoc = new XmlDocument();

objDoc.Load(Server.MapPath("/") + "/files/myxmlfile.xml");


//Set a root node pointer

XmlNode objRoot = objDoc.DocumentElement;


//locate the node you want to clone and copy it to a new node

string strXmlQuery = "/myNodePath/myNode";

XmlNode objToBeCloned = objRoot.SelectSingleNode(strXmlQuery);

XmlNode objNewNode = objToBeCloned.CloneNode(true);

//Passing true clones the whole tree of child nodes; false only copies the node itself


//Alter the node and child nodes as relevant

objNewNode.InnerText = "alter node text or child node text and/or attributes as needed";


//Insert the node after the last child of a common parent (see note below)

objRoot.InsertAfter(objNewNode, objRoot.LastChild);


//Resave the xml file

objDoc.Save(Server.MapPath("/") + "/files/myxmlfile.xml");


NOTE: it is critical that the InsertAfter call is made from the common parent for the nodes. I initially tried using objDoc and it failed with "The referenced node is not a child of this node." One last thing, the SelectSingleNode call uses XPath to locate the node.


Well, I hope this helps someone else. There are some areas that may be improved, but this works and made sense in the scheme of the application I was writing.

Monday, June 14, 2004

Skype

Recently, a few business partners and I started using the Skype beta (v0.98). I tried the beta a year or so ago, but it was a failed attempt. Since then, the application itself has seen some improvement (if my memory serves me correctly).


Once SkypeOut is launched (PC to POTS), I hope the company is able to find a way to make money. I mention that fact since I actually like the application. Skype has done well in creating a reliable and quality VoIP solution and I would hate to see an application I like go by the wayside due to lack of income.


The other interesting service will be SkypeIn. This service will allow POTS callers to connect to Skype users. This process is much more complicated, but I believe they will eventually find a way to accomplish it. Once done, I could see dumping my phone line. I would be very interested in seeing how this service would work for conference calls. Combining SkypeIn and SkypeOut, you could conference people all over the world at a low rate (or no rate if they are on their PC). It never hurts to mention that there will probably be a country limit with Skype.


The only trouble we experienced was due to one person's flaky internet connection. Granted, we all have broadband in one form or another, but I am interested to see how this service works when one of us is in Europe (more than likely a trip in August).


As always, I have one criticism of the application...lack of Instant Messaging (IM) features. It does have a small IM-capable portion, but it doesn't work well when you are in a conference environment. Simply put, I can't get by without the ability to share text and files (screen sharing is nice too, but that's another bandwidth constraint). Granted, I can solve this problem by using other applications (MSN IM, AIM, or Yahoo IM), but I would prefer a single integrated (or partnered) environment. So if Skype added a premium feature set for a monthly or yearly charge, I would buy it in a second (as long as they kept a well-positioned price point).


In summary, nice application and you can't beat the price. Whether you are trying to contact family, have a small conference call (10 person limit with Skype), or keep in contact with friends, this is a great solution. So, grab a headset and start gabbing.

Wednesday, June 09, 2004

GMail Beta

Well, I finally opened a GMail account so I could play around with the features and see how it compares to the other email providers out there. As with anything, I have found good, bad, and ugly.


The feature set is decent, the interface is clean, and I really do respect the effort of reorganizing email in order to make it more consumable. I'm not sure if it will succeed, but only time will tell on that part. After my initial investigation, I think the new approach to conversations (like a message board) is handy.


However, I have found one missing feature that will keep me from migrating my Yahoo or Hotmail accounts to Gmail. The management of contacts has regressed considerably when compared to tools that already exist. What do I mean by that? Just try to create a list of contacts. That's pretty basic functionality, but I didn't see a way to do it. Also, the only information you can store about a contact is a first name, last name, and email address. What does this mean? You end up with two places to keep contact information (Gmail for email and somewhere else for address/birthday/etc.). I have enough problems keeping my contacts together without this limitation.


Another area where Gmail needs to improve is integration with IM and other types of collaboration. To think that email stands alone is a poor assumption. Granted, I haven't read about Gmail's intended growth path and don't claim to know what version one will contain, but they need to think larger than email. Since Google doesn't have an IM client, what about expanding Gmail to become an RSS reader?


One last area where I would like to see improvement is the interface. Granted, Google has gotten by for years with its clean search interface, but I don't want to spend time in email looking at a plain screen. Give me a skin or two to choose from. If they are ugly, at least I'll be choosing the plain one rather than having it be the only option.


Although the 1,000 MB of disk space is extremely attractive, it still isn't enough to overcome the lack of contact management. If Gmail added contact management and the ability to integrate with other collaboration tools (an RSS reader, etc.), it would be a no-brainer for me to switch. Also, if they were to throw in some sugar coating and make the interface more consumable, I would turn my back on Yahoo and Hotmail and tout the greatness of Gmail.

Wednesday, June 02, 2004

LAMP or No LAMP

Lately I have been pondering the likelihood of moving to a LAMP environment. The problem stems from this: my background and infrastructure are nearly all Microsoft technologies. I am quite interested in (and have even dabbled in) PHP and MySQL. The problem is that I much prefer the .NET environment over PHP. Assuming that my preference holds out, I will more than likely end up with a mixed-mode environment. Over the next year, I will be moving all web access components (currently in classic ASP, which I like even less than PHP) to web services. These services will start in .NET and quite possibly stay that way (unless a compelling reason to change comes along). My biggest win is that I can keep my data in its original holding place (MS SQL, Exchange, etc.) and only expose that data to specific web servers. Those web servers can be running anything, since their responsibility has now shrunk to making a web service call and displaying the data. OK, there needs to be some mention of client-side scripting (which will continue to be JavaScript), but that will be fairly minimal.
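
To make that a little more concrete, here is a rough sketch of the kind of thin .NET service layer I have in mind. The class, method, and namespace are made up for illustration; the point is that any front end (PHP, ASP.NET, whatever) only ever sees the service contract.

using System.Web.Services;

// Hypothetical ASMX service exposing data from the back-end stores.
[WebService(Namespace = "http://tempuri.org/")]
public class ContentService : WebService
{
    [WebMethod]
    public string GetContent(int contentId)
    {
        // In the real version this would pull from MS SQL, Exchange, etc.
        // and return it to whatever web server called the service.
        return "Content for item " + contentId;
    }
}

Whether the front end stays ASP.NET or ends up on Apache/PHP, only this service layer would need to know where the data actually lives.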



Hopefully this will give me the most options, my preferred development environment, and keep me from being completely locked into Microsoft. Granted, most of my infrastructure applications will still be Microsoft, but even those can now be replaced as necessary, with only the web service needing to be updated.

Thursday, May 06, 2004

Calling a Web Service from MS SQL

Well, with input from a few people in a forum and my own investigation, I've found only three ways to call a web service from an MS SQL Server 2000 system. Here are the three ways:



  • Write an exe and call it from xp_cmdshell (ugh)

  • Write an extended stored procedure and do it in there (better)

  • Do a vbscript call from a Scheduled Job (ok, but not ideal)


I ended up going with #3. The script is the same as it would be for an ASP page (not .NET). The downside is that I'm not passing a parameter to the web service and therefore I had to add a "ProcessedDate" field to account for the records I wanted to review. This is ok if you don't need real-time interaction. If you need it to be immediate, I would look to the extended stored procedure.



I really hope that with the upcoming releases of Whidbey and Yukon, Microsoft will make advances in linking to web services/web references. MS makes it fairly easy to link to an ODBC data source as a linked server; this would just seem like the next logical step.

Monday, May 03, 2004

MS SQL Server and MySQL Connectivity

I know, no post for a while. As I was wandering the country, I let this one slip. However, last week was not without an interesting challenge. One of my attempts last week was to connect an MS SQL Server 2000 (source) database to a MySQL database (target). More specifically, I wanted to insert values into my target based on a trigger in my source. Let me tell you, I would not have gotten far without help from this post by Ben Margolin. I won't repeat it all, but here is a summary of what worked for me.



The following I took from Ben's recommendation regarding adding a linked server to your MSSQL database server.



  • Install Connector/ODBC for MySQL on your MSSQL server.


  • Depending on your configuration, you may need to open ports between your servers (in my case the MySQL server was in a completely different domain controlled by another company).


  • Create a system DSN on the MSSQL server then configure and test for connectivity to the MySQL server.


  • Open Query Analyzer for MSSQL and execute the following stored procedure (see NOTE 1 for details). This is possible via the Enterprise Manager, but I prefer the SQL directly.



    EXEC sp_addlinkedserver @server='LinkedAlias', @srvproduct='MySQL', @provider='MSDASQL', @datasrc='DsnName'



  • Map target username and password using the following stored procedure. Please note that this may change if you prefer a different connection method (again, you can configure through Enterprise Manager if you prefer).



    EXEC sp_addlinkedsrvlogin @rmtsrvname='LinkedAlias', @useself='false', @locallogin=NULL, @rmtuser='MySqlUsername', @rmtpassword='MySqlPassword'



  • Confirm your connection by either refreshing Enterprise Manager (looking in Security - Linked Servers) or executing this stored procedure:



    EXEC sp_helplinkedsrvlogin @rmtsrvname='LinkedAlias'


For the trigger/insertion part, I chose to create a stored procedure and call it from the trigger instead of putting all the code in the trigger. Either way is acceptable, but I wanted the option of calling the stored procedure from other locations (rather than only being able to use the trigger).



Here is the code for the stored procedure:



CREATE PROCEDURE [mysp_MySqlPush]

@id varchar(50)

AS

DECLARE @param1 varchar(100), @param2 varchar(100), @creationDate smalldatetime

DECLARE myCursor CURSOR FOR SELECT dbParam1, dbParam2 FROM myTable WHERE id = @id

OPEN myCursor

FETCH NEXT FROM myCursor INTO @param1, @param2

SELECT @creationDate = GetDate()

IF @@FETCH_STATUS = 0

BEGIN

INSERT INTO OPENQUERY(LinkedAlias, 'SELECT param1, param2, creationDate FROM MySqlTable WHERE 0=1') (param1, param2, creationDate) VALUES (@param1, @param2, @creationDate)

END

CLOSE myCursor

DEALLOCATE myCursor


Next, I created the following trigger (see NOTE 2):



CREATE TRIGGER [trigMySqlPush] ON [dbo].[MyTable]

FOR INSERT

AS

SET XACT_ABORT ON

DECLARE @id varchar(50)

DECLARE trigCursor CURSOR FOR SELECT inserted.id FROM inserted

OPEN trigCursor

FETCH NEXT FROM trigCursor INTO @id

IF @@FETCH_STATUS = 0

BEGIN

exec mysp_MySqlPush @id

END

CLOSE trigCursor

DEALLOCATE trigCursor


NOTE 1: More information about the stored procedure can be found via books online or on MSDN. This article will discuss things such as the provider being MSDASQL for all ODBC connections.



NOTE 2: There are several noteworthy things here. First of all, the call to XACT_ABORT was a necessity for me. You will want to review what XACT_ABORT does to see if you want to set it to ON. You can do this server-wide instead of in the trigger, but I opted to only do it in the trigger. The other noteworthy thing, for those that don't know: inserted is a system-controlled table that holds the records from the most recent insert. Another thing is the OPENQUERY statement. Without going into much detail, the 0=1 makes sure the remote server isn't queried for data (which would add processing time), but the SELECT statement is required to map the columns. Last, there are several places here to improve error trapping, etc., but this is a starting point.



Well, that's all. Enjoy this (and I hope it helps someone). Once again, proper thanks should go to Ben.

Friday, April 16, 2004

SQL Junkies and Data Modeling?

Lo and behold, someone else talking about data modeling. SQL Junkies posted a question to their blog titled "Build the Database First, or the Application First?" I added my two cents on the whole thing. Basically, my comment on the entry (by Chad) states that neither should be done first. I am a strong believer in modeling first. Now comes the hard part: how to be vigilant when designing and coding. Just remember, quick fixes are OK but will always come back to bite you if you don't take the time to do it right.

Thursday, April 15, 2004

Logical Data Structures (LDS) for Data Modeling

I've been recently introduced to the LDS methodology for data modeling by Peter O'Kelly, an analyst at Burton Group. So far, I am intrigued by the approaches of John Carlis and Joseph Maguire in their book, Mastering Data Modeling: A User-Driven Approach. Although you can gain some information from their website logicaldatastructures.com, if you are interested in this methodology the book is a must.

To date, I have been most impressed by the goal of separating the physical placement of data from its logical equivalent. It is important to note that LDS is valuable in the early stages of software development, but it does not replace the need for ERD, DFD, or other such diagramming. That being said, I am still very early in my evaluation of this approach, so I will post more as I gain understanding.

Friday, April 09, 2004

Collaboration Products

Today I feel compelled to write about a fairly new product I've been working with lately. Macromedia released a product called Breeze a little over a month ago. I took advantage of their trial and really liked what I saw. In the past, I have used other products (such as WebEx), and Breeze totally blew away the competition. Perhaps the most amazing thing was the ability to do remote desktop sharing without having to install a client on either machine. I was pretty blown away that once I gave permission to share a document, the other person in my mini conference was able to start editing away. I haven't fully tested the capability, but it was exciting to open a session on demand, invite another attendee, and remotely control content without an install, without firewall issues (since everything is Flash), and without a difficult interface for a non-savvy user trying to share content.

The other attractive piece is that third parties can repackage Breeze with other functions. This is the case with Convoq, though I found their product less useful than Breeze. The one really nice feature in Convoq that is not available [yet] in Breeze is the instant messenger integration. I did find it buggy at times, but it lowered the time it took me to begin a session. I would guess that either Breeze will add similar functionality or another third party will create a product more tailored to my expectations.

Overall, you must check out the product if you do any remote discussions (sales, internal, or broadcast style). Nice job Macromedia!

Wednesday, April 07, 2004

Rendering Images from an XML Source

It appears that there isn't a good way to treat an image as data/content instead of a link. I guessed this would be the case, but maybe someone out there knows a way and just isn't sharing.

Here is a bit of feedback for the W3C: how about adding a tag to the XHTML spec that would allow the source to be a base64-encoded version of the image (or other binary data) instead of a link? I understand that this would not be optimal for large images or for a large number of images. However, in the case where I want to protect access to an image, I could ship only that image definition to the client for rendering. That would let XML host all of the data instead of dividing it.

Here is what I am going to do (unless someone knows better and I find out about it). In the service that returns the XML version of the content, I'll embed each image's ID, type, size, and the image data itself (for PDF distribution or third parties). For the HTML representation, I'll ignore the embedded data and create a service page that re-fetches the image using the details sent via the XML request. The last part will be linking to that service page in order to render the image. Definitely not ideal, but it will work.
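
For anyone curious what the embedding side might look like, here is a rough C# sketch. The element and attribute names are made up for illustration; the encoding itself is just Convert.ToBase64String over the raw bytes, and the consumer reverses it with Convert.FromBase64String.

using System;
using System.IO;
using System.Xml;

public class ImageEmbedder
{
    // Builds an element like:
    //   <image id="42" type="image/png" size="12345">base64data...</image>
    // The element and attribute names are hypothetical, not part of any spec.
    public static XmlElement EmbedImage(XmlDocument doc, string imagePath, string imageId, string mimeType)
    {
        byte[] rawBytes = File.ReadAllBytes(imagePath);

        XmlElement imageElement = doc.CreateElement("image");
        imageElement.SetAttribute("id", imageId);
        imageElement.SetAttribute("type", mimeType);
        imageElement.SetAttribute("size", rawBytes.Length.ToString());
        imageElement.InnerText = Convert.ToBase64String(rawBytes);
        return imageElement;
    }

    // The consuming side (the service page, a PDF renderer, etc.) just
    // decodes the text content back into the original bytes.
    public static byte[] ExtractImage(XmlElement imageElement)
    {
        return Convert.FromBase64String(imageElement.InnerText);
    }
}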

Using XML to Transport Images

Today's investigation is using XML as a data transport for all types of data (binary and image data included). I decided to start by looking at how image data is stored in XML. Although there are many different encoding schemes and algorithms out there, it seems my best bet will be to store the images with base64 encoding. This seems to be the default method discussed and used (although not the most efficient for large images).

Now that I have generated my XML file to transport data to a web server, I need to transform the base64 data back into a real image. This is where I have been hung up. Most of my research has pointed to storing a pointer to an image in the XML file instead of the image itself. Some, though not all, of my images are data, not presentation. In essence, I want to be able to send the XML to different rendering engines (web, PDF, third party) and have all the data intact.

Next is the question of rights to view the image. Authorization to view the document as a whole should carry through to the image level. If I just store the image as a link, I can't control access to the image on its own. In addition, this complicates matters if I wish to use an Apache/Linux box versus an IIS/Windows box.

I'll add another post to discuss my findings.

Tuesday, April 06, 2004

Evolution of XML and Security

I've been starting to dive into the XML world quite a bit lately. It started with a random interest and has led to working with Web services, XSLT, XSD, and several other acronyms. I'm sure this will continue on to several other levels of acronyms (WSDL, etc.). Perhaps the most interesting thing I have to say to date is this: same but different. Basically, no matter what you are programming in/to/on, it is still programming: logical statements that affect the data around you. The different part is the level of interoperability and extensibility. I know this is quite obvious to most of the world out there, but the nature of the development work is altered by the fact that this data is open. No longer can the developer sit in a little isolated world and let the network admins worry about security.

Don't get me wrong: there will always be security at the network level. The problem is that security is not only about who you are, but also about what you are using to get to the data. Tie that to the ability to trust a partner to vouch for a third party, and you end up with varying degrees of trust in identifying who you are, different access methods, and, all the while, the same functions needing to be locked down depending on the combinations above. It's no wonder that security analysts and developers are in hot demand these days. It seems like there should be (and there probably is, I just haven't gotten to it yet) some X-acronym to describe what kind of access a type of person should have given different access methods. You could call it Extensible Security Description Language (XSDL). Oh wait, those four letters are already taken. I guess the world of acronyms is just too crowded. How about XSec, and deviate from the norm?

Alright, enough of that topic for now.