I talked about this a long time ago: writing a Java spider that could walk the sites and grab all the item data. It wouldn't be that horribly difficult. The hardest part, I would think, would be getting it into the blob format to reinsert. Also, a lot of characteristics aren't going to be accounted for in the websuck, so you'd probably still have to tweak a lot of them by hand, changing default cases.
I've worked with the blobs a bit through PHP, but that was just reading the data, not writing it back. WC might have a better opinion. If we gave you a text file with the following template format,
Name:
Item Type: 1 hand slash
Weight: 6.0
Lore: Y
etc...
could someone come up with a parser that could turn it into an SQL statement in the binary blob format? If someone says yes to that, then writing the spider would be a piece of cake. Just give it the base URL, have it search each page for those characteristics, then insert it into a .db file that could be run through an encoder to form an itemupdate.sql.
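The parsing half, at least, is straightforward. Here's a rough Java sketch of the idea, assuming the "Key: value" template above: it splits each line on the first colon, collects the fields, and spits out an INSERT statement. The table name (`items`), the column-name mangling, and the string-only values are all placeholders I made up; the real version would need the actual blob layout and should bind values through a PreparedStatement instead of building the string by hand.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ItemTemplateParser {

    // Parse "Key: value" lines into an ordered map of fields.
    public static Map<String, String> parse(String template) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String line : template.split("\\r?\\n")) {
            int colon = line.indexOf(':');
            if (colon < 0) continue;               // skip malformed lines
            String key = line.substring(0, colon).trim();
            String value = line.substring(colon + 1).trim();
            fields.put(key, value);
        }
        return fields;
    }

    // Build an INSERT statement from the parsed fields. Column names are
    // guessed by lowercasing the template keys ("Item Type" -> item_type);
    // the encoding into the actual binary blob would replace this step.
    public static String toSql(Map<String, String> fields) {
        StringBuilder cols = new StringBuilder();
        StringBuilder vals = new StringBuilder();
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (cols.length() > 0) { cols.append(", "); vals.append(", "); }
            cols.append(e.getKey().toLowerCase().replace(' ', '_'));
            vals.append('\'').append(e.getValue().replace("'", "''")).append('\'');
        }
        return "INSERT INTO items (" + cols + ") VALUES (" + vals + ");";
    }

    public static void main(String[] args) {
        String template = "Name: Rusty Sword\n"
                        + "Item Type: 1 hand slash\n"
                        + "Weight: 6.0\n"
                        + "Lore: Y";
        System.out.println(toSql(parse(template)));
    }
}
```

The spider would just feed each scraped page's characteristics into `parse()` and append the resulting statements to the .db file.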
Just my 2 cents.