Marzhill Musings


Transactions as a debugging tool

Published On: 2006-06-07 14:13:13
Have you ever wanted to test a long SQL DDL script for syntax errors without actually creating your DB structure yet? The easiest way I've found to do this is through the use of transactions: simply begin a transaction at the start of the script and roll it back at the end. For example:

    -- PostgreSQL DDL script
    BEGIN; -- begin our transaction block
    CREATE TABLE test_tbl (
        pk numeric NOT NULL,
        data varchar(128)
    );
    ROLLBACK; -- roll back everything this script just did
    -- use COMMIT; instead of ROLLBACK; to keep the changes

This has the benefit of letting us test the script for errors without actually running it against the DB. The EXPLAIN command can do something similar on some DBs, but you would need it on every statement in the script, and some statements will error out if you use EXPLAIN on them. I've found the transaction method to work best for what I want to do.
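The same trick works in any database with transactional DDL. Here's a minimal sketch using Python's built-in sqlite3 module (SQLite also rolls back DDL inside a transaction); the table and column names are just illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # autocommit mode: we manage transactions manually
cur = conn.cursor()

cur.execute("BEGIN")  # start our transaction block
cur.execute("CREATE TABLE test_tbl (pk NUMERIC NOT NULL, data VARCHAR(128))")
# any syntax error above would raise here, before touching the schema for real
cur.execute("ROLLBACK")  # undo everything the script just did

# after the rollback, the table no longer exists
tables = cur.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
).fetchall()
print(tables)  # []
```

If the script has a syntax error, execute() raises an exception before anything is committed; if it's clean, the rollback leaves the database untouched either way.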


Did you ever need to index an xml doc

Published On: 2006-05-18 16:26:57
and preserve the xml information in the index? May I present "the XML Indexer". My brother, whose very popular AJAX Bible app has been getting attention, needed an XML index of the KJV Bible, and he asked if I could help him get it. We would be parsing the KJV in XML format, and I needed to pull out the reference information for every occurrence of every word. I thought an XML indexer might be useful in more than one capacity, and there wasn't much on the net or CPAN with the capability to do it. It needed to be light and fast, because it was going to parse the entire Bible, so a DOM parser was out of the question. So I wrote my own. xml_indexer.pm is a module to index the words in an XML document and preserve the XML information about each occurrence of each word. It's a little rough around the edges right now, but it works. It uses the expat parser, so it's light and fast. Look at the bible_index.pl script for an example of how it works. I'll do a tutorial on it later.

Update: This baby has been confirmed to parse the entire Bible in Zefania XML format in under 3 minutes. That's a 16 MB file, and it spits out a 23 MB index in that space of time. Quite honestly, it surprised me.
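The general idea behind an expat-based word index can be sketched like this. xml_indexer.pm itself is Perl; this is a hypothetical Python equivalent using the stdlib expat bindings, and the element names and path format are just illustrative, not the module's actual output:

```python
import re
import xml.parsers.expat
from collections import defaultdict


def build_index(xml_text):
    """Map each word to the element paths where it occurs,
    preserving the XML context of every occurrence."""
    index = defaultdict(list)
    path = []  # stack of currently open element names

    def start(name, attrs):
        path.append(name)

    def end(name):
        path.pop()

    def chars(data):
        where = "/".join(path)  # e.g. "bible/verse"
        for word in re.findall(r"[A-Za-z']+", data.lower()):
            index[word].append(where)

    parser = xml.parsers.expat.ParserCreate()
    parser.StartElementHandler = start
    parser.EndElementHandler = end
    parser.CharacterDataHandler = chars
    parser.Parse(xml_text, True)
    return index


doc = "<bible><verse>In the beginning</verse><verse>the Word</verse></bible>"
idx = build_index(doc)
print(idx["the"])  # ['bible/verse', 'bible/verse']
```

Because expat is a streaming parser, the only state held is the open-element stack and the index itself, which is why this approach stays fast on a 16 MB document where a DOM parser would choke.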


Are You a Data Middle Man?

Published On: 2006-01-17 16:29:22
Not too long ago, before the bubble burst as they say, one of the HOT new things was B2B technology: hooking businesses together for their mutual profit. You don't hear a whole lot about that anymore, probably because those companies lost their focus and consequently never made any money. You see, the real power of the "network" is in sharing data, and B2B really was all about sharing that data. If you could have emphasized that feature, you could have made money. Becoming the middle guy in the selling and purchasing of data could be a very powerful and lucrative business, especially since the new "emphasis" on standards is helping the process along. If you look at a lot of the hottest things on the web right now, they all talk about sharing data of some sort. Flickr, Techdirt, blog aggregators: all of them provide ways to access and share their data easily. And the investors are salivating. Notice I said "their" data. If you never have data to share, no one uses your API/standard. The reality is that standards only work if someone shows you how to use them and uses them themselves. The Data Middle Men are the ones who will define these standards of exchange. They will be brokering the transfers and, more importantly, providing the infrastructure for those transfers. I've been thinking about this a lot lately because one of my customers has an opportunity to become one of the first of the Homecare Data Middle Men. It's gonna be a fun and wild ride :-)
