Friday, April 16, 2004

SQL Junkies and Data Modeling?

Lo and behold, someone else is talking about data modeling. SQL Junkies posted a question to their blog titled "Build the Database First, or the Application First?" I added my two cents on the whole thing. Basically, my entry (by Chad) states that neither should be done first. I am a strong believer in modeling first. Now comes the hard part: staying vigilant while designing and coding. Just remember, quick fixes are okay but will always come back to bite you if you don't take the time to do it right.

Thursday, April 15, 2004

Logical Data Structures (LDS) for Data Modeling

I was recently introduced to the LDS methodology for data modeling by Peter O'Kelly, an analyst at Burton Group. So far, I am intrigued by the approaches of John Carlis and Joseph Maguire in their book, Mastering Data Modeling: A User-Driven Approach. Although you can glean some information from their website, logicaldatastructures.com, the book is a must if you are interested in this methodology.

To date, I have been most impressed by the goal of separating the physical placement of data from its logical equivalent. It is important to note that LDS is valuable in the early stages of software development, but it does not replace the need for ERD, DFD, or other such diagramming. That being said, I am still very early in my evaluation of this approach, so I will post more as I gain understanding.

Friday, April 09, 2004

Collaboration Products

Today I feel compelled to write about a fairly new product I've been working with lately. Macromedia released a product called Breeze a little over a month ago. I took advantage of their trial and really liked what I saw. In the past, I have used other products (such as WebEx), and Breeze totally blew away the competition. Perhaps the most amazing thing was the ability to do remote desktop sharing without having to install a client on either machine. I was impressed that once I gave permission to share a document, the other person in my mini conference was able to start editing away. I haven't fully tested the capability, but it was exciting to open a session on demand, invite another attendee, and remotely control content with no install, no firewall issues (since everything is Flash), and no difficult interface standing between a non-savvy user and sharing content.

The other attractive piece is that third parties can repackage Breeze with other functions. This is the case with Convoq, but I found their product not as useful as Breeze. The one really nice feature in Convoq that is not available [yet] in Breeze is the instant messenger integration. I did find it buggy at times, but it lowered the time it took me to begin a session. I would guess that either Breeze will add similar functionality or another third party will create a product more tailored to my expectations.

Overall, you must check out the product if you do any remote discussions (sales, internal, or broadcast style). Nice job Macromedia!

Wednesday, April 07, 2004

Rendering Images from an XML Source

It appears that there isn't a good way to treat an image as data/content instead of a link. I was guessing this would be the case, but maybe someone out there knows a way and just isn't sharing.

Here is a bit for the W3C.... How about adding a tag to the XHTML spec that would allow the source to be a base64-encoded version of the image (or other binary data) instead of a link? I understand that this would not be optimal for large images or for a large number of images. However, in the case that I want to protect access to an image, I could ship only that image definition to the client for rendering. That would enable the XML to host all of the data instead of dividing it.

Here is what I am going to do (unless someone knows better and I find out about it). In the service that returns the XML version of the content, I'll embed each image's ID, type, size, and the image data itself (in case of PDF distribution or third-party rendering). For the HTML representation, I'll ignore the embedded image data and create a service page that will re-fetch the image using the details I sent via the XML request. The last part will be linking to the service page in order to render the image. Definitely not ideal, but it will work.
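To make that concrete, here is a rough sketch of what the service page could look like. The names, the URL scheme, the in-memory store, and the WSGI plumbing are placeholders for illustration, not my actual implementation:

```python
# Minimal sketch of the image "service page": the HTML version of the content
# links here with the image ID it got from the XML, and this handler
# re-fetches and streams the image bytes. The in-memory store is a stand-in
# for whatever actually holds the images (database BLOBs, files, etc.).
from urllib.parse import parse_qs
from wsgiref.simple_server import make_server

IMAGE_STORE = {
    # hypothetical stand-in; real bytes would come from the content store
    "img-001": ("image/png", b"...raw PNG bytes would live here..."),
}

def image_service(environ, start_response):
    image_id = parse_qs(environ.get("QUERY_STRING", "")).get("id", [""])[0]
    if image_id not in IMAGE_STORE:
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"unknown image"]
    mime_type, data = IMAGE_STORE[image_id]
    start_response("200 OK", [("Content-Type", mime_type),
                              ("Content-Length", str(len(data)))])
    return [data]

if __name__ == "__main__":
    # The HTML rendering would then emit something like:
    #   <img src="http://localhost:8000/?id=img-001" />
    make_server("", 8000, image_service).serve_forever()
```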

Using XML to Transport Images

Today's investigation is into using XML as a data transport for all types of data (binary and image data included). I decided to start by looking at how image data is stored in XML. Although there are many different encoding schemes and algorithms out there, it seems my best bet will be to store the images with base64 encoding. This seems to be the default method being discussed and used (although not the most efficient for large images).
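For reference, encoding an image into the XML looks roughly like this. The element and attribute names (image, id, type, size) and the file names are placeholders I made up for the sketch, not part of any schema:

```python
# Sketch of base64-encoding an image into an XML payload.
import base64
import xml.etree.ElementTree as ET

def image_to_element(path, image_id, mime_type):
    with open(path, "rb") as f:
        raw = f.read()
    elem = ET.Element("image", {"id": image_id,
                                "type": mime_type,
                                "size": str(len(raw))})
    elem.text = base64.b64encode(raw).decode("ascii")  # text-safe for XML
    return elem

root = ET.Element("document")
root.append(image_to_element("chart.png", "img-001", "image/png"))
ET.ElementTree(root).write("payload.xml", encoding="utf-8")
```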

Now that I have generated my XML file to transport data to a web server, I need to transform the base64 data back into a real image. This is where I have been hung up. Most of my research has pointed to storing a pointer to an image in the XML file instead of the image itself. Some, not all, of my images are data, not presentation. In essence, I want to be able to send XML to different rendering engines (web, PDF, third party) and have all the data intact.
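The reverse step on the receiving end is straightforward; this sketch assumes the same made-up element and attribute names as the encoding sketch above:

```python
# Sketch of pulling the base64 payloads back out of the XML and writing
# them to disk as real image files.
import base64
import xml.etree.ElementTree as ET

def extract_images(xml_path):
    tree = ET.parse(xml_path)
    for elem in tree.iter("image"):
        raw = base64.b64decode(elem.text)
        if len(raw) != int(elem.get("size")):
            raise ValueError("size mismatch for image " + elem.get("id"))
        filename = elem.get("id") + "." + elem.get("type").split("/")[-1]
        with open(filename, "wb") as out:
            out.write(raw)
        yield filename

for name in extract_images("payload.xml"):
    print("wrote", name)
```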

Next is the question of rights to view the image. Authorization to view the document as a whole should carry over to the image level. If I just store the image as a link, I can't control access to the image on its own. In addition, this complicates matters if I wish to use an Apache/Linux box vs. an IIS/Windows box.
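In other words, if the image is served through the same data layer as the document, the document's rule can simply be reused. A toy sketch (the permission table and names are made up for illustration):

```python
# Toy sketch: the image inherits the authorization rule of the document
# it belongs to, instead of being a freely linkable static file.
DOCUMENT_READERS = {"doc-42": {"alice", "bob"}}   # hypothetical permissions
IMAGE_PARENTS = {"img-001": "doc-42"}             # image -> owning document

def can_view_document(user, doc_id):
    return user in DOCUMENT_READERS.get(doc_id, set())

def can_view_image(user, image_id):
    return can_view_document(user, IMAGE_PARENTS.get(image_id))

print(can_view_image("alice", "img-001"))  # True
print(can_view_image("eve", "img-001"))    # False
```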

I'll add another post to discuss my findings.

Tuesday, April 06, 2004

Evolution of XML and Security

I've been starting to dive into the XML world quite a bit lately. It started with a random interest and has led to working with Web services, XSL-T, XSD, and several other acronyms. I'm sure this will continue to several other levels of acronyms (WSDL, etc.). Perhaps the most interesting thing I have to say to date is this: same but different. Basically, no matter what you are programming in/to/on, it is still programming: logical statements that affect the data around you. The different part is the level of interoperability and extensibility. I know this is quite obvious to most of the world out there, but the nature of the development work is altered by the fact that this data is open. No longer can the developer sit in a little isolated world and let the network admins worry about security.

Don't get me wrong, there will always be security at the network level. The problem is that security is not only about who you are, but also about what you are using to get to the data. Tie that to the ability to trust a partner to vouch for a third party, and you end up with varying degrees of trust in proving who you are, different access methods, and all the while locking down the same functions depending on the combinations above. It's no wonder that security analysts and developers are in hot demand these days. It seems like there should be some X-acronym (and there probably is one I just haven't gotten to yet) to describe what kind of access a type of person should have given different access methods. You could call it Extensible Security Description Language (XSDL). Oh wait, those four letters are already taken. I guess the world of acronyms is just too crowded. How about XSec, and deviate from the norm?

Alright, enough of that topic for now.