Published On: 2006-06-07 14:13:13
Have you ever wanted to test a long SQL DDL script for syntax errors without actually creating your database structure yet? I've found the easiest way to do this is with transactions: simply begin a transaction at the start of the script and roll it back at the end. For example:
-- PostgreSQL DDL script
BEGIN; -- begin our transaction block

CREATE TABLE test_tbl (
    pk numeric NOT NULL,
    data varchar(128)
);

ROLLBACK; -- roll back everything this script just did
-- (use COMMIT; here instead of ROLLBACK; when you actually want to keep the changes)
This has the benefit of letting us test the script for errors without actually running it against the database. The EXPLAIN command can do something similar on some databases, but you would need it on every statement in the script, and some statements will error out if you EXPLAIN them. I've found the transaction method works best for what I want to do.
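The same begin-then-roll-back trick can be automated from a script. Here's a minimal sketch using Python's built-in sqlite3 module, which stands in for the PostgreSQL client in the post; the `dry_run` helper name is mine, not from any library:

```python
import sqlite3

def dry_run(conn, ddl_statements):
    """Syntax-check DDL by running it inside a transaction, then rolling back.

    Returns None if every statement ran cleanly, or the error message if not.
    This is the BEGIN/ROLLBACK trick from the post, done programmatically.
    """
    try:
        conn.execute("BEGIN")             # open the transaction block
        for stmt in ddl_statements:
            conn.execute(stmt)            # any syntax error surfaces here
        return None
    except sqlite3.Error as exc:
        return str(exc)
    finally:
        if conn.in_transaction:
            conn.execute("ROLLBACK")      # discard everything the script did

conn = sqlite3.connect(":memory:")
conn.isolation_level = None               # autocommit; we manage BEGIN/ROLLBACK ourselves
print(dry_run(conn, ["CREATE TABLE test_tbl (pk NUMERIC NOT NULL, data VARCHAR(128))"]))  # prints None
```

Because everything is rolled back, you can run the same script repeatedly against the same connection without "table already exists" errors getting in the way.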
Published On: 2006-05-18 16:26:57
Ever wanted to index an XML document and preserve the XML information in the index? May I present "the XML Indexer". My brother, whose very popular AJAX Bible app
has been getting attention, needed an XML index of the KJV Bible. He asked if I could help him get it. We would be parsing the KJV in XML format, and I needed to pull out the reference information for every occurrence of every word. Well, I thought an XML indexer might be useful in more than one capacity, and there wasn't much on the net or CPAN with the capability to do it. It needed to be light and fast because it was going to be parsing the entire Bible, so a DOM parser was out of the question. So I wrote my own. xml_indexer.pm is a module to index the words in an XML document and preserve the XML information about each occurrence of a word. It's a little rough around the edges right now, but it works. It uses the expat parser, so it's light and fast. Look at the bible_index.pl script for an example of how it works. I'll do a tutorial on it later. Update:
This baby has been confirmed to parse the entire Bible in Zefania XML format in under 3 minutes. That is a 16 MB file, and it spits out a 23 MB index in that space of time. Quite honestly, it surprised me.
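To illustrate the approach, here is a small Python sketch of the same idea as xml_indexer.pm, built on the expat parser (the Perl module's actual interface may differ, and the element and attribute names in the sample document are hypothetical, not real Zefania markup): a streaming parse that records, for each word, the element path and attributes in scope where it appeared.

```python
from collections import defaultdict
from xml.parsers.expat import ParserCreate

class XmlIndexer:
    """Index every word in an XML document, keeping the XML context.

    A sketch of the xml_indexer.pm idea: for each word, record the path of
    open elements and the enclosing element's attributes at that point.
    """

    def __init__(self):
        self.index = defaultdict(list)    # word -> [(element_path, attrs), ...]
        self._stack = []                  # (tag, attrs) for each open element
        self._parser = ParserCreate()
        self._parser.StartElementHandler = self._start
        self._parser.EndElementHandler = self._end
        self._parser.CharacterDataHandler = self._chars

    def _start(self, name, attrs):
        self._stack.append((name, attrs))

    def _end(self, name):
        self._stack.pop()

    def _chars(self, data):
        # Note: expat may deliver character data in several chunks.
        if not self._stack:
            return
        path = "/".join(tag for tag, _ in self._stack)
        attrs = self._stack[-1][1]        # attributes of the enclosing element
        for raw in data.split():
            word = raw.strip('.,;:!?"').lower()
            if word:
                self.index[word].append((path, attrs))

    def parse(self, xml_text):
        self._parser.Parse(xml_text, True)
        return dict(self.index)

idx = XmlIndexer().parse('<bible><v b="Gen" c="1" n="1">In the beginning</v></bible>')
```

Looking up `idx["beginning"]` then gives you every place the word occurred, each with its element path and reference attributes.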
Published On: 2006-01-17 16:29:22
Not too long ago, before the bubble burst as they say, one of the HOT new things was B2B technology: hooking businesses together for their mutual profit. You don't hear a whole lot about that anymore, probably because those companies lost their focus and consequently never made any money. You see, the real power of the "network" is in sharing data, and B2B really was all about sharing that data. If you could have emphasized that feature, you could have made money. Becoming the middle guy in the selling and purchasing of data could become a very powerful and lucrative business, especially since the new "emphasis" on standards is helping the process along. If you look at a lot of the hottest things on the web right now, they all talk about sharing data of some sort: Flickr, Techdirt, blog aggregators. All of them provide ways to access and share their data easily, and the investors are salivating. Notice I said "their" data. If you never have data to share, no one uses your API/standard. The reality is that standards only work if someone shows you how to use them and uses them themselves. The Data Middle Men are the ones who will define these standards of exchange. They will be brokering the transfers and, more importantly, providing the infrastructure for those transfers. I've been thinking about this a lot lately because one of my customers has an opportunity to become one of the first of the Homecare Data Middle Men. It's gonna be a fun and wild ride :-)
Published On: 2005-10-07 00:35:57
An article about metadata and database design. Mostly just a look inside my recent thoughts on database design and on-the-fly extensibility.
Published On: 2005-09-27 00:36:36
Recognizing the difference between your data and your metadata can go a long way toward keeping your data formats extensible. Keeping your data generic and using metadata to describe it allows for much greater flexibility in your application's design, and planning for the ability to add more metadata on the fly gives you much greater capacity for growth in the types of data your application can handle. I'm experiencing this in a current project, in fact. The company in question is growing and is faced with a need to change their current application to allow for that growth. A massive refactoring of the application is going to be needed. They will have to be able to add new "products" (otherwise known as data) to the application and present more ways for customers to get access to said product. A metadata-based design in their data format will give them that kind of flexibility.
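A minimal Python sketch of what this looks like in practice; all names here (the product types, the fields, the helper functions) are hypothetical illustrations, not taken from the customer's application. The data rows stay generic, and a metadata registry describes what each "product" carries:

```python
# data stays generic; metadata describes it
metadata = {}    # product_type -> {field_name: expected_type}
records = []     # generic rows: (product_type, {field: value})

def register_product_type(name, fields):
    """Introducing a new product is a metadata change, not a schema change."""
    metadata[name] = fields

def add_record(product_type, values):
    """Validate a generic record against the metadata that describes it."""
    fields = metadata[product_type]
    for field, value in values.items():
        if field not in fields:
            raise KeyError(f"unknown field {field!r} for {product_type!r}")
        if not isinstance(value, fields[field]):
            raise TypeError(f"{field!r} should be {fields[field].__name__}")
    records.append((product_type, dict(values)))

# A brand-new product type can be added at runtime, with no code change:
register_product_type("nursing_visit", {"patient": str, "minutes": int})
add_record("nursing_visit", {"patient": "J. Doe", "minutes": 45})
```

The same idea translates to a database as a metadata table describing the fields that a generic data table holds, which is exactly the kind of flexibility the refactoring above is after.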
Published On: 2005-09-07 21:31:35
I am working on a legacy web application right now that is giving me fits. About 90% of the application is done in SQL. Yes, you got that right: the business logic is almost completely written in a huge number of stored procedures, SQL functions, and scheduled database tasks. This makes tracking down the parts of the app you are trying to work on very difficult. Every time I turn around there is another stored procedure, function, or scheduled database task that needs tweaking, and I'm starting to go a little crazy. The problem is that it obfuscates what your application is really doing. You think a Perl obfuscation contest produces difficult-to-follow code? They've got nothing on this. I realize stored procedures were the cat's meow at the time, but this is beyond all human decency. I have got to start refactoring this thing before it gets out of control.
Published On: 2005-08-25 04:03:39
Let me let you in on a secret: SQL is a great language for getting information out of a database. But if you're writing a long string of stored procedures that call functions which use a view that ties several tables together, just to find out what particular piece of data is linked to your piece of data, then you need a good talking-to. Not only does all this unnecessary complexity make debugging hard for you, it makes folks like me want to beat you up with a baseball bat. So do us all a favor: see if you can reduce the number of steps and keep the squirming pathway a little straighter. Then you can scream and rant and pummel the brains out of your fellow coders (who didn't heed this warning) with the rest of us.
Published On: 2005-05-17 04:13:43
Over at A List Apart
they're talking about custom DTDs and Modular XHTML, so I thought I'd dig up an old article I wrote on the subject and share it with you yet again: The Power of Modular XHTML
Published On: 2005-05-04 01:30:17
I am slowly getting some of my old articles back up and online. You will start to see them in the links under Pages on the sidebar. As I determine they are useful or helpful, I will put them back up. The first is Perl and CGI part I
I really should do part II of that one I suppose :-)