Tuesday, December 20, 2005

All-Digital by 2009 (TV)

Not that anyone who reads this is likely to be without a dish or digital cable, but over-the-air TV is going to be phased out. Well, maybe not phased out, but discontinued. Evidently there is money to be made on the radio frequencies being used. Check out the article by Yahoo! News.

Tuesday, November 08, 2005

SQL Server Parameter Default Value Problem

RANT: Ok, so I understand that sometimes there are reasons why some things are allowed and others are not. However, I haven't been able to justify this one.


If I want to default a stored procedure parameter to today's date when it's left empty, I can't do it. The reason: only a constant or NULL can be the default value of a parameter. Ok, fine. I'll set the default to NULL, then check for NULL in the SP and set it to GetDate(). It's a workaround, and another couple lines of code that shouldn't need to be there.


My question is this, though: why can I set a column in a table to a default value of GetDate() but not a stored procedure parameter? Even better, if you try to create a stored procedure with this default assignment, it gives you a strange error about ")" or whatever token precedes the assignment, and never explains that a constant wasn't found.
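For what it's worth, the workaround looks something like this (a sketch only; the procedure, table, and column names are made up):

```sql
-- Sketch of the workaround: default the parameter to NULL,
-- then substitute GETDATE() inside the procedure body.
CREATE PROCEDURE GetEventsSince
    @StartDate datetime = NULL
AS
BEGIN
    IF @StartDate IS NULL
        SET @StartDate = GETDATE()

    SELECT * FROM Events WHERE Event_Date >= @StartDate
END
```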

Tuesday, October 11, 2005

More SQL Date Time Comparisons

I found an article by Gregory Larsen called Working with SQL Server Date/Time Variables: Part Three - Searching for Particular Date Values and Ranges that showed me a few different ways of verifying a date range. I was trying to make sure the range included events scheduled for after today's date.


I'm sure there are more elegant ways of doing this, but I wanted to list all "upcoming" events, including events that may have already occurred today (but have them fall off the list tomorrow). I went for the FLOOR approach with this code:


SELECT * FROM Events WHERE Event_Date >= CAST(FLOOR(CAST(GETDATE() AS FLOAT)) AS DATETIME)


I've always wished they had a slick date function that would just drop the time portion from a date for ease of comparison. Oh well.
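For the record, another common idiom truncates the time portion with integer date arithmetic instead of a FLOAT cast; it should behave the same as the FLOOR approach above:

```sql
-- DATEDIFF(dd, 0, GETDATE()) counts whole days since the
-- datetime epoch (1900-01-01); adding that count back to 0
-- yields today's date at midnight, with no time portion.
SELECT * FROM Events
WHERE Event_Date >= DATEADD(dd, DATEDIFF(dd, 0, GETDATE()), 0)
```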

Wednesday, October 05, 2005

Architectural Roles and Responsibilities

I found a post on Dinesh's Blog regarding architectural roles and responsibilities. I very much like the division and definition of these roles. However, I think the majority of companies combine these architect roles into much more traditional buckets: DBA, chief architect, and application development manager. In fact, one of the more important roles today, security, is just starting to gain more visible and political influence in organizations. It isn't that I disagree that these roles should be identified and recognized; I just think it would be very common to see one name listed on several role definitions.

Wednesday, September 28, 2005

Alter Table Timeout Exceeded

I recently had some problems altering a column in a table in the SQL Server 2005 June CTP. I continually ran into a "Timeout Exceeded" error. After much deliberation about what the problem was, I returned to good old T-SQL to alter the column, and it worked like a champ. Here is my two cents on why the timeout occurred.


The table I was adjusting is very large. In fact, it contains an image data field that can hold PDFs, PPTs, Word documents, MP3 files, etc. I was trying to alter an nvarchar field to accept 75 characters instead of 50. When you alter a table through the SQL Server GUI, it re-saves everything related to the table. In other words, the foreign keys, indexes, altered columns, everything. That process took longer than 30 seconds and therefore timed out. When I instead executed a single ALTER TABLE ALTER COLUMN statement, it was incredibly quick and painless (< 1 second). In fact, these types of changes make me wonder if SQL Server Management Studio should be used at all to make changes. I would have thought that by now the interface would be smart enough to notice (via a comparison) that my change didn't require re-saving everything. Oh well, maybe this will be true in SQL Server 3005 ;)
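The statement that side-stepped the GUI was a one-liner along these lines (table and column names changed to hypothetical ones):

```sql
-- Widen the column directly instead of letting the GUI
-- rebuild and re-save the whole table.
-- Note: ALTER COLUMN should restate the nullability, or it
-- falls back to the session/database default.
ALTER TABLE Documents
ALTER COLUMN Title nvarchar(75) NOT NULL
```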

Thursday, September 08, 2005

Skype to be bought by...eBay?

Ok, maybe I haven't been watching that closely, but I really didn't think eBay was in the VoIP industry. Evidently it is, since they are talking about buying Skype for $2-$3 billion! Here's the article: WSJ: eBay in Talks to Buy Skype

Tuesday, August 23, 2005

Sharepoint and Visual Studio 2005

I found this interesting post from Scott Guthrie. Specifically, this section I found particularly interesting as I'm currently looking at Sharepoint and future integration with other projects.


"Sharepoint Server Support


We've done a lot of work to integrate ASP.NET 2.0 with the next release of Sharepoint that ships with Office 12 next year. The new version is built entirely using the new feature of ASP.NET 2.0: WebParts, Master Pages, Themes, Site Navigation, Membership, Roles, Personalization, Localization, Data Controls, etc.


One of our goals with Visual Studio is to enable great tool support for building Sharepoint solutions. Today with VS 2003 this is fairly limited (you can pretty much only create a web-part control as a class library). One of the design goals of the new VS 2005 web project system was to enable great Sharepoint support, and enable directly opening and debugging Sharepoint solutions. Going forward we will support creating and editing new pages in running Sharepoint servers, support adding code-behind logic to pages, creating application logic, customizing workflow, building and adding new webpart controls, and debugging the finished solution."

Thursday, August 04, 2005

SharePoint Applications by Microsoft

It looks like Microsoft is trying to make SharePoint more attractive to enterprises with some additional (30) templates. An EWeek article can be found here. The templates themselves can be found on Microsoft's Technet SharePoint Services site. Of interest (maybe only to me) is that a colleague of mine, Peter O'Kelly from Burton Group, is quoted in the EWeek article.

Tuesday, July 26, 2005

Roboraptor for Your Inner Geek

Ok, I don't typically blog about products, but this one is pretty cool. A company called Wow Wee has produced a second generation robot called Roboraptor (Robosapien being the first). This one appears to be pretty impressive even though the commercial for it is pretty sad. Too bad I can't get my hands on one. If I didn't have kids that would thrash it, I might even shell out the $120 for it. At any rate, a review for Roboraptor is currently up on PCMagazine's website.

Thursday, July 21, 2005

How to Update SQL Server Beta 2 with the June CTP

I thought I might as well jot down my experience with this update, as I needed to do it three times, and others might find some assurance here as well.



  1. The very first thing I did was to run a backup job on my user databases. It turns out this was just a precautionary measure as I didn't need to restore from backup.

  2. Next, I tried to run the Upgrade Advisor from the Redist/Upgrade Advisor directory on the DVD. It promptly told me that it needed the .Net Framework 2.0 before it could do anything. Since I had an older version of the framework installed, I guess it wants a specific version of the .Net Framework 2.0. In short, I didn't run this tool again.

  3. sqlbuw.exe, the Build Uninstall Wizard, was next on my list and I ran it from the \Setup Tools\Build Uninstall Wizard directory. This rarely happens, but the tool worked like a champ. I included the uninstall of the .Net Framework 2.0 beta and the Client Network Utilities.

  4. Reboot.

  5. Parts of this next step are not required. The uninstall utility deletes neither the directories nor the .mdf files for the system databases. I elected to delete only the master, temp, model, etc. .mdf and .ldf files instead of the entire directory tree. Hypothetically, if you move the .mdf and .ldf files for your user databases elsewhere, the entire tree could be deleted before running setup; the moved files would then need to be placed back in your data and log directories after setup. Again, I didn't use that method, so I'm not sure if it works (it should in theory).

  6. Next, run setup from the root of the DVD. This step is pretty much what you would expect as it runs through the options for authentication type, collation, etc.

  7. After installation completed, I rebooted even though "you don't need to."

  8. Once my system was back up, I attached the user databases.

  9. Since I didn't do any sort of archiving/restoring of my master db, I had to recreate any security accounts I had setup. If you have a ton of security settings in place, look to do some backup and restoring of master using the documented methods. Oh, and good luck finding those documented methods.
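Steps 8 and 9 boil down to something like the following (all names and paths are placeholders, and this is a sketch of what I did rather than a recipe):

```sql
-- Reattach a user database by pointing at its data/log files.
EXEC sp_attach_db @dbname = 'MyAppDB',
    @filename1 = 'C:\Data\MyAppDB.mdf',
    @filename2 = 'C:\Data\MyAppDB_log.ldf'
GO

-- Recreate a SQL login that was lost with the old master db,
-- then re-map it to the now-orphaned database user.
CREATE LOGIN AppUser WITH PASSWORD = 'placeholder'
GO
USE MyAppDB
EXEC sp_change_users_login 'Update_One', 'AppUser', 'AppUser'
```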

Visual Studio and SQL Server Beta Madness

As most are aware, the new releases of Visual Studio Beta 2 (and CTPs) do not work with the original SQL Server Beta 2. In fact, things get even more confusing when you try to mix and match SQL Server CTPs with Visual Studio CTPs. I asked a colleague of mine, Peter O'Kelly, to make some inquiries to his contacts at Microsoft to answer the question, "what works with what?" As it turns out, my official (or unofficial, depending on how you look at it) answer is yet to be returned.


However, Peter was able to dig up an answer by Eric Nelson (who I believe works at Microsoft). Basically, it looks like I can safely install the June CTP of SQL Server with Visual Studio 2005 Beta 2. Needless to say, this is exactly what I wanted to hear, since I wanted to avoid the SQL Server Express and Developer editions, which were the only April CTP options.

Wednesday, July 20, 2005

Say Goodbye to Sql Server 2005 Beta 3

According to Microsoft in this press release, SQL Server 2005 (or Yukon) will not undergo a Beta 3 release. In its place, Community Technology Previews (CTPs) will continue. To quote them exactly:



With the availability of the SQL Server 2005 April CTP, Microsoft also announced that it would adopt CTPs for the remainder of the SQL Server 2005 development cycle.


Not that I am completely excited about this change of events, but I will be ok with it as long as updates to SQL Server manage not to break interoperability with Visual Studio 2005 Beta 2. Hmmm, in that statement I might as well be saying "I'll be happy as long as the sun doesn't set tonight."

Tuesday, July 19, 2005

XML Serializer Fix for Visual Studio 2005 Beta 2

For those interested, Microsoft introduced a bug with beta 2 which results in the following error:



"If one class in the class hierarchy uses explicit sequencing feature (Order), then its base class and all derived classes have to do the same."



As it turns out, the bug is fixed in the July CTP, but there is no "go live" license support for that CTP. The interim fix was described at: http://forums.microsoft.com/msdn/ShowPost.aspx?PostID=55034. Using that post as a guideline, I added System.Xml.Serialization to the using directives, then added an [XmlElement] attribute to every property in my object model.



/// <summary>
/// Sets/Gets Test
/// </summary>
/// <value></value>
[XmlElement(ElementName = "Test", Order = 1)]
public int Test
{
    set { _test = value; }
    get { return _test; }
}



This fix hasn't solved all my beta 2 problems, but it is a very major step in the right direction. Good luck for those others out there doing the same.

Wednesday, July 13, 2005

XML Data Management Tutorial

Yesterday I attended a tutorial titled "XML Data Management" by Peter Lacey. Mr. Lacey did a great job filling in the gaps in my knowledge of XML Schema (XSD) and XQuery. The course focused on XML data as opposed to XML text. Although XML text is more relevant to my uses, the course definitely improved my understanding and will improve my future implementations.



Even though it may be considered a side-issue related to the tutorial, I learned that my use of Visual Studio and web service creation was not only wrong, but dreadfully wrong. My implementations worked, but lacked foresight and planning.



I started with a logical data model to outline and map my data. My next step should have been to work on taxonomy and namespace generation, follow that up with XSD creation and WSDL definitions, and only then write my web services. Instead, I jumped directly from my logical data structure (LDS) model to web service creation.



In short, my shortcomings were due to ignorance about the tools I am using. It will be an interesting adjustment and will require some additional education, but I'm up for the challenge.

Thursday, June 09, 2005

Internet on Planes?

I ran into this little gem and thought I should share it:


FAA Approves Wi-Fi for United

Business travelers will be happy to learn the sky's not the limit for internet access. Teaming with Verizon Airfone®, United Airlines has become the first U.S. carrier to receive Federal Aviation Administration (FAA) approval to install the necessary cabin equipment to enable wireless technology (wi-fi) devices on board a U.S. domestic commercial aircraft.


"Our research shows that connecting to the Internet is customers' most preferred form of communication to the ground, and this certification is a crucial step to bring this in-flight wireless access to our customers," said Dennis Cary, United's SVP Marketing. The date that customers may begin using wi-fi devices on United ultimately will be determined by the Federal Communications Commission (FCC) in the coming months.


More info here

Monday, June 06, 2005

Replication between SQL Server 2000 and Yukon (2005)

Lately I've been working on a solution to replicate data between two servers in remote locations. Server A is the master server running Windows Server 2000 and MS SQL Server 2000 while Server B is the subscribing server using Windows Server 2003 and MS SQL Server 2005 beta 2. I ran into a few snags and ended up investigating several methods to accomplish my goal. Here are the methods I looked into:


  • Snapshot replication (pull and push models)

  • Transactional replication (pull and push models)

  • DTS Import/Export package building and MS Agents to run the package

  • Yukon's Business Intelligence Development Studio to create or modify an existing DTSX package

I ended up getting all of them working (to one degree or another) except the SQL Server Agent launching a DTSX package. For some reason, I could run the package with DTEXEC and it would work; launch the same package with a SQL Server Agent job and it would fail with a security error. My best guess is that SQL Server (in a dev environment only) was running under the Local System account instead of a named account, and that mucked it up.


The sad part of all this research and investigation is that I ended up going with the first solution I looked into (snapshot push model replication). Here was the biggest problem I ran into with the snapshot push (from SQL Server 2000 to Yukon).


If you add all the servers in and request the subscription from Server B (Yukon), the server is added as a Remote Server on Server A. Every time a connection attempt occurred, I would receive an error that I had to use Management Studio or SMO to manage Server B (Yukon). I ended up killing the remote server entry and adding a Linked Server entry for Server B using the Microsoft OLE DB Provider for SQL Server. I also had to alter a setting under the publication's subscribers so the connection used a named SQL Server login (due to the Local System account used by the SQL Server Agent process). This information from Cryer.com helped.
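The remote-server-to-linked-server swap was roughly this (the server name is a placeholder):

```sql
-- Remove the auto-created remote server entry, then add
-- Server B back as a linked server via the Microsoft
-- OLE DB Provider for SQL Server (SQLOLEDB).
EXEC sp_dropserver 'SERVERB'
EXEC sp_addlinkedserver
    @server = 'SERVERB',
    @srvproduct = '',
    @provider = 'SQLOLEDB',
    @datasrc = 'SERVERB'
```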


Anyway, I hope this can help someone else out there.

Wednesday, May 25, 2005

Business Programmers and Core Developers

I've been thinking lately about the different types of programmers and different attitudes toward application design and development. An interesting concept occurred to me. In my industry, there seem to be two types of programmers: business programmers and core developers. Why do I make a distinction? I've been analyzing some code and reviewing articles (for example, Scott Mitchell's "More On Why I Don't Use DataSets in My ASP.NET Applications") and it occurred to me that I don't dive into the constructs as much as many people do. I typically take a given tool, a dataset for example, and implement it. If I have issues with performance, I will investigate alternatives and update my knowledge so next time I use a datareader instead (or whatever tool fits the situation).


How is this different from a core developer? I think a core developer will research all the options and dive deep into what fits before starting. The good news is the application will run well and correctly the first time. The downside is that the investigation takes time. I would actually say it takes longer to investigate than it does to actually code. This causes the business to be slow in adoption and slow to build momentum behind new projects (which in turn causes poor morale due to additional red tape...but that is another topic).


That being said, I believe that the correct balance lies in the middle. Businesses need to be agile and small applications need to be created quickly. That is the role of the business programmer. However, there is a point in the application design where the architect needs to decide on how much developer time to spend. If too little time is spent, the application becomes a maintenance nightmare with many points of failure. It may work when you roll it out, but will it hold up?


Okay, so I don't have any grand insight as to the answer, as this was just a passing thought I wanted to jot down. I guess the real answer might just go back to the saying, "Everything in moderation."

Wednesday, May 18, 2005

Visual Studio 2005 Beta 1 to Beta 2 Upgrade

I thought I would document my upgrade issues and post them for anyone they might help. Some noteworthy things: I did a complete reinstall on a system (fresh OS), then installed Office, Visual Studio 2003, and a few other tools. I then copied my VS beta 1 code to the new system and did the following:




  1. Find and Replace on all files in solution CompileWith to CodeFile

  2. Find and Replace on all files in solution - ClassName to Inherits

  3. Recopy any .js files as some had "ClassName" text in them.

  4. I had used the namespace System.Xml.Query which was removed. Specifically, I had made an XmlCommand to do a transformation. I had to rework it using the System.Xml.Xsl.XslTransformation class. I found the article "Transforming an XML string with an XSLT string" by Robert Levy helpful. I will post my completed method later.

  5. System.Configuration.ConfigurationSettings.ConnectionStrings seems to have disappeared. Not only that, but I get a bunch of warnings that I should be using ConfigurationManager instead. Only downside? Where IS the ConfigurationManager? (As it turns out, it lives in the System.Configuration namespace but requires a reference to the separate System.Configuration.dll assembly.)

  6. I had a case where one of my web services wouldn't compile with the error: Could not create type 'service name'. As it turns out, I had to move the code behind file into a new directory named App_Code. The errors showed one at a time, but I eventually had to move all code behind files to the App_Code directory (and update the asmx files to point to the App_Code directory).

  7. I also had to change an attribute of the WebServiceBinding attribute from ConformanceClaims to ConformsTo and the attribute value to WsiProfiles.BasicProfile1_1.




That's it.

Wednesday, May 11, 2005

Digital ID World 2005 Part 2

This post is more a note of interest than debate. About four years ago I heard the term "Reference Architecture" from a company called NetReference (now part of Burton Group). During the course of this conference, I have heard that same term used by several vendors, but none more than OATH. Interestingly enough, Burton Group's product is termed "Burton Group's Reference Architecture" and OATH's initiative is called "OATH's Reference Architecture" version 1. Evidently someone was watching/listening and liked the term.

Digital ID World 2005 Part 1

To preface, these posts will not be in chronological order, as my notes are not completely gathered yet. However, I've been listening to several vendor presentations, and I'm beginning to think that VPNs will combine strong authentication mechanisms with malware detection software to protect employee access to network systems. How would this work? Here is my take.


First off, a USB device would require a token in order to access it (so if plugged in, no access would occur until the device was unlocked). Once unlocked, the device would scan the system for any malware or other intrusion methods to ensure nothing can be captured. The device would also contain the VPN software required to access your private network. Once this software was launched and the user authenticated, the device would be required to be removed from the system (to prevent users from leaving it in any one system, trusted or random).


Other additions would relate to work systems that already have the VPN software and scanning software, which would speed up the authentication process. That way the typical login time would not be as intense as it would be on a random system that an employee attempts to use. After all, do you know what exists on your neighbor's computer?

Tuesday, May 10, 2005

Digital ID World 2005 Opening

I'm currently waiting for the Digital ID World conference to begin in San Francisco. I've been lucky enough to sit and chat with Jamie Lewis, the CEO of Burton Group. The conference appears to be small enough to keep a close feel with the speakers. I'll continue to do more "Live Blogging" as the day continues and I have more opinions on what is said.

Thursday, April 28, 2005

Computer Woes

Last weekend was a bad computer weekend for me. My laptop locked up 30 times (no joke), so I started rebuilding it on Sunday. Now it's Thursday and I'm almost done. Good thing I can work on another system. The reason it is taking so long? I applied an update to my Intel chipset from Dell and it locked my system up again yesterday (to the point where it would no longer boot). So I started over yesterday. I am back to a basic installation with all my system drivers installed and MS Office (plus a few basic apps). I'm afraid to install Visual Studio again. Previously I had VS 2002, 2003, and 2005 beta 1 installed. For some reason, beta 1 worked better than 2003 (which is why I had to rebuild in the first place). Oh well.

Tuesday, April 19, 2005

Adobe buying Macromedia?

I have to say I was a bit surprised to read this. I will be very interested to see the outcome of this acquisition, both for the other players in the market and for the pricing structure of Macromedia products (such as Breeze). Ah well, time will tell.

Friday, April 01, 2005

Delete Child Table Rows Using Parent Table Rows (SQL)

I ran into this article while working on a deletion of child records using a parent record's fields for evaluation. Hopefully someone else will find it useful too.



You can find the article by Thom Pantazi here.



I did alter his code a bit, as I was about to delete all rows from my child table instead of just the ones that matched my criteria. Here are my final statements:




DELETE ChildTable
WHERE ParentId IN (SELECT ParentId FROM ParentTable WHERE Criteria = 1)

DELETE ParentTable WHERE Criteria = 1

Wednesday, March 02, 2005

All About Humor

I found this particularly entertaining. The comments left me laughing. Isn't it great when you can get MS secrets about highly debated technology?!

Tuesday, March 01, 2005

SQL Server 2005 Copy Database Script

Well, although I've had some trouble actually getting this out the door, I think I have a script that will copy a database to a new database and leave it completely intact (minus uncommitted transactions in flight at the time of backup). There are additional options that use the NORECOVERY or STANDBY states and then apply additional transaction log entries (but that is not the problem I am trying to solve).



I found the basics for this script from Michael Schwarz. I believe his script worked under a different version of SQL Server, but I had to make some adjustments to make it work for SQL Server 2005 (beta 2). The primary changes related to the columns returned from RESTORE FILELISTONLY, selecting the FILE value for the RESTORE DATABASE command, and adding the RECOVERY option to the RESTORE DATABASE command (otherwise the error "Database 'name' cannot be opened. It is in the middle of a restore." reared its ugly head).



In a nutshell, here is the script:



USE master
GO

DECLARE @DB varchar(200)
SET @DB = 'SourceDB'
-- the backup filename
DECLARE @BackupFile varchar(2000)
SET @BackupFile = 'c:\testing\SourceDBbackup.dat'
-- the new database name
DECLARE @TestDB varchar(200)
SET @TestDB = 'DestinationDB'
-- the new database files without .mdf/.ldf
DECLARE @RestoreFile varchar(2000)
SET @RestoreFile = 'c:\testing\destination'

-- ********************************************
-- no change below this line
-- ********************************************

DECLARE @query varchar(2000)
DECLARE @DataFile varchar(2000)
SET @DataFile = @RestoreFile + '.mdf'
DECLARE @LogFile varchar(2000)
SET @LogFile = @RestoreFile + '.ldf'

IF @DB IS NOT NULL
BEGIN
    SET @query = 'BACKUP DATABASE ' + @DB + ' TO DISK = ' + QUOTENAME(@BackupFile, '''')
    EXEC (@query)
END

IF EXISTS(SELECT * FROM sysdatabases WHERE name = @TestDB)
BEGIN
    SET @query = 'DROP DATABASE ' + @TestDB
    EXEC (@query)
END

RESTORE HEADERONLY FROM DISK = @BackupFile
DECLARE @File int
SET @File = @@ROWCOUNT
    -- This always returned 0 for me but the
    -- RESTORE call returned the number
    -- of rows associated with the backup. Strange...

DECLARE @Data varchar(500)
DECLARE @Log varchar(500)
SET @query = 'RESTORE FILELISTONLY FROM DISK = ' + QUOTENAME(@BackupFile, '''')
CREATE TABLE #restoretemp
(
    LogicalName varchar(500),
    PhysicalName varchar(500),
    Type varchar(10),
    FilegroupName varchar(200),
    Size int,
    MaxSize bigint,
    FileID bigint,
    CreateLSN numeric(25,0),
    DropLSN numeric(25,0),
    UniqueId uniqueidentifier,
    ReadOnlyLSN numeric(25,0),
    ReadWriteLSN numeric(25,0),
    BackupSizeInBytes bigint,
    SourceBlockSize int,
    FileGroupId int,
    LogGroupGUID uniqueidentifier,
    DifferentialBaseLSN numeric(25,0),
    DifferentialBaseGUID uniqueidentifier,
    IsReadOnly bit,
    IsPresent bit
)
INSERT #restoretemp EXEC (@query)
SELECT @Data = LogicalName FROM #restoretemp WHERE type = 'D'
SELECT @Log = LogicalName FROM #restoretemp WHERE type = 'L'
PRINT @Data
PRINT @Log
TRUNCATE TABLE #restoretemp
DROP TABLE #restoretemp

SET @query = 'RESTORE DATABASE ' + @TestDB + ' FROM DISK = ' + QUOTENAME(@BackupFile, '''') +
    ' WITH MOVE ' + QUOTENAME(@Data, '''') + ' TO ' + QUOTENAME(@DataFile, '''') + ', MOVE ' +
    QUOTENAME(@Log, '''') + ' TO ' + QUOTENAME(@LogFile, '''') + ', FILE = 1, RECOVERY'

EXEC (@query)
GO

Tuesday, February 01, 2005

To DNN or Not to DNN?

Ok, I've been pretty happy with DotNetNuke, and I'm very excited about their updated version (3.0.9 was my latest install test). The problem? I've grown attached to Visual Studio 2005 and Yukon. Don't get me wrong, ASP.NET is good and I still like it, but ASP.NET 2.0 becomes addicting. Actually, it isn't the ASP side of VS 2005 that I like so much; it is the class development and usage that has been fun. Whenever I go back to VS 2003, I try to do things the same way and it just isn't quite the same ;)


That brings us to my dilemma. I want to program my modules in VS 2005 to run on SQL Server 2005, which the beta of DNN 3 doesn't currently support. I'm sure there are ways to hack it so things will work, but I don't have that kind of time. So the question really becomes: should I use DNN and VS 2003, or use VS 2005 without a stock portal app?


Friday, January 07, 2005

Null This, Don't Null that!

I have effectively added to my confusion level when it comes to C# and SQL Server. My most recent frustration appeared when creating a loop in a stored procedure. In a nutshell, I was trying to order a funky sort of tree (First Child, Next Sibling type relationships).


I had a clean recursive query that worked like a champ until I ran into the limit of 32 nested levels. If you hit that like I did, you need a loop. Don't get me wrong, 32 is plenty deep. However, when I had a malformed tree (something got messed up importing data), I couldn't even pull the sections into my app since it would just error out.


Well, no big deal, just write a loop, right? If I had fully understood the issues with NULL and variables, this would have been a knockout. As it was, I had some serious issues with my WHILE loop.


WHILE (@Id <> NULL)


That won't do it, folks. You need:


WHILE (IsNull(@Id, 0) > 0)


Interesting how in SQL Server you can assign NULL to a variable but can't compare it to NULL with <>, while in C# you can't assign null to a value type like int, but can compare one to null. Hopefully this will help someone else out there.
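A quick demonstration of the behavior (under the default ANSI_NULLS ON setting):

```sql
DECLARE @Id int      -- declared but not set, so it holds NULL

-- Both comparisons evaluate to UNKNOWN rather than TRUE,
-- so neither message prints:
IF @Id = NULL  PRINT 'never printed'
IF @Id <> NULL PRINT 'never printed either'

-- The ANSI-standard test works as expected:
IF @Id IS NULL PRINT 'this one prints'
```

WHILE (@Id IS NOT NULL) would be the more direct loop condition; the IsNull(@Id, 0) > 0 version I used also works, but only because 0 is never a valid Id in my tree.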