Support::Packetcollector - Any PacketCollector related problems or questions should be posted here.

05-30-2010, 04:21 PM
Derision
Developer
Join Date: Feb 2004
Location: UK
Posts: 1,540
Quote:
Originally Posted by cavedude
our IDs are going to change QUICK now. SQLs won't help much in this case because the IDs will almost certainly conflict.

I did wonder if I could set variables for the starting Insert IDs at the start of the generated SQL and reference them with increments afterwards, so you would only need to change a few variables at the top to the next free IDs. I didn't know if that was possible in SQL, but I just tested it with select statements, and it seems to work:
Code:
set @myinsertid = 1001;
select id, name from npc_types where id = @myinsertid;
select id, name from npc_types where id = @myinsertid + 1;
Output:
Code:
mysql> source test.sql
Query OK, 0 rows affected (0.00 sec)
+------+-------------+
| id | name |
+------+-------------+
| 1001 | Guard_Mezzt |
+------+-------------+
1 row in set (0.00 sec)
+------+--------------+
| id | name |
+------+--------------+
| 1002 | Guard_Jerith |
+------+--------------+
1 row in set (0.00 sec)
So maybe that is the way to go.
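For illustration, the generated INSERT statements could then reference that variable with fixed offsets, so renumbering a whole batch only means editing the variable at the top. This is just a sketch; the npc_types column list is cut down to id and name here, and a real insert would include the remaining columns:
Code:
-- Set this to the next free npc_types ID for your database
set @NPCTypesStartingInsertID = 1001;

-- Each generated row adds a fixed offset to the starting ID
INSERT INTO npc_types (id, name) VALUES (@NPCTypesStartingInsertID + 0, 'Guard_Mezzt');
INSERT INTO npc_types (id, name) VALUES (@NPCTypesStartingInsertID + 1, 'Guard_Jerith');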

05-30-2010, 04:37 PM
cavedude
The PEQ Dude
Join Date: Apr 2003
Location: -
Posts: 1,988
Quote:
Originally Posted by Derision
So maybe that is the way to go.

Yes! I think you got it! It would mean everybody would need to re-dump their SQLs to share, but it would also mean that with a couple of changes all those files would be universal across any database.

05-30-2010, 05:24 PM
Derision
Developer
Join Date: Feb 2004
Location: UK
Posts: 1,540
Quote:
Originally Posted by cavedude
Yes! I think you got it! It would mean everybody would need to re-dump their SQLs to share, but it would also mean that with a couple of changes all those files would be universal across any database.

How would it be if I removed all the INSERT ID fields from the Extractor UI, so you didn't have to worry about them when generating the SQL and could just go in afterwards and set them as required? I.e., I would put a template at the start of the generated SQL like:
Code:
set @NPCTypesStartingInsertID = XXXXXXX
set @SpawnEntryStartingInsertID = XXXXXXX
...
...
-- Set the starting Insert IDs above and remove the exit statement below before executing this SQL
exit
<Generated Insert statements follow, referencing the variables defined above>
I.e., you would just press the Load .pcap button, select the .pcap file, alter the check boxes if you wanted, not have to worry about IDs at that point, and then click 'Generate SQL'.
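As a rough illustration of how the body of such a file could keep related tables linked, here is a hypothetical sketch; the @SpawnGroupStartingInsertID variable and the spawnentry/npc_types column lists are assumptions for the example, not the tool's actual output:
Code:
set @NPCTypesStartingInsertID = 500000;
set @SpawnGroupStartingInsertID = 300000;

-- Rows in different tables reference the same offsets, so changing the
-- starting variables renumbers the whole batch without breaking the links
INSERT INTO npc_types (id, name)
    VALUES (@NPCTypesStartingInsertID + 0, 'Guard_Mezzt');
INSERT INTO spawnentry (spawngroupID, npcID, chance)
    VALUES (@SpawnGroupStartingInsertID + 0, @NPCTypesStartingInsertID + 0, 100);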

05-30-2010, 05:27 PM
Demi-God
Join Date: May 2007
Location: b
Posts: 1,449
Quote:
Originally Posted by Derision
How would it be if I removed all the INSERT ID fields from the Extractor UI, so you didn't have to worry about them when generating the SQL and could just go in afterwards and set them as required? I.e., I would put a template at the start of the generated SQL like:
Code:
set @NPCTypesStartingInsertID = XXXXXXX
set @SpawnEntryStartingInsertID = XXXXXXX
...
...
-- Set the starting Insert IDs above and remove the exit statement below before executing this SQL
exit
<Generated Insert statements follow, referencing the variables defined above>
I.e., you would just press the Load .pcap button, select the .pcap file, alter the check boxes if you wanted, not have to worry about IDs at that point, and then click 'Generate SQL'.

That seems ideal to me. That way it's compatible with any database.

05-30-2010, 05:31 PM
Demi-God
Join Date: May 2007
Location: b
Posts: 1,449
As for loot drops and everything, I found that Magelo is a better resource than Lucy. I don't know why or how, but it is. I think Magelo uses the same IDs for items too.
edit: yeah it does. http://eq.magelo.com/item/1001

05-30-2010, 05:45 PM
trevius
Developer
Join Date: Aug 2006
Location: USA
Posts: 5,946
Quote:
Originally Posted by Derision
How would it be if I removed all the INSERT ID fields from the Extractor UI, so you didn't have to worry about them when generating the SQL and could just go in afterwards and set them as required? I.e., I would put a template at the start of the generated SQL like:
Code:
set @NPCTypesStartingInsertID = XXXXXXX
set @SpawnEntryStartingInsertID = XXXXXXX
...
...
-- Set the starting Insert IDs above and remove the exit statement below before executing this SQL
exit
<Generated Insert statements follow, referencing the variables defined above>
I.e., you would just press the Load .pcap button, select the .pcap file, alter the check boxes if you wanted, not have to worry about IDs at that point, and then click 'Generate SQL'.

I prefer having the fields there, but a simple toggle box to enable/disable them would work well. That way, people could toggle them off when collecting for PEQ, or leave them on if collecting for their own DB. A simple toggle would make it nice and quick to generate one of each type if you wanted. I think it is nice to be able to see the actual range of numbers the SQL is planning to use, so it can easily be compared against the DB to make sure it will be OK.

05-30-2010, 06:07 PM
cavedude
The PEQ Dude
Join Date: Apr 2003
Location: -
Posts: 1,988
I agree with Trevius. Although I would personally disable the IDs in the program, I think it should still be an option for others.

05-30-2010, 07:33 PM
trevius
Developer
Join Date: Aug 2006
Location: USA
Posts: 5,946
I don't know the details of how the tool works exactly, but it seems like it parses the .pcap when you first load it. Perhaps you could just add an option to save that parse to a file, which could then be loaded later and used to generate SQL. That could be another way of creating files anyone could use and share without worrying about private info. Another nice bonus to using a parse instead of generated SQL is that a parse should be able to hold data that isn't yet set to be generated into SQL, so as the extractor tool is refined and expanded, those same parses could be run through the tool again for an instant update with the latest features. That should save people from having to run their own .pcaps through the tool and send them again; instead, Cavedude could just run the parses through the tool and have the latest with minimal work.
Last edited by trevius; 05-30-2010 at 08:03 PM.

05-31-2010, 01:56 AM
Developer
Join Date: Apr 2009
Location: USA
Posts: 478
Quote:
Originally Posted by trevius
I don't know the details of how the tool works exactly, but it seems like it parses the .pcap when you first load it. Perhaps you could just add an option to save that parse to a file, which could then be loaded later and used to generate SQL. That could be another way of creating files anyone could use and share without worrying about private info. Another nice bonus to using a parse instead of generated SQL is that a parse should be able to hold data that isn't yet set to be generated into SQL, so as the extractor tool is refined and expanded, those same parses could be run through the tool again for an instant update with the latest features. That should save people from having to run their own .pcaps through the tool and send them again; instead, Cavedude could just run the parses through the tool and have the latest with minimal work.

Why not just have a tool that zeros out any personal info from the .pcap files, so they could be sent in as-is? Then the full communication between client and server is preserved in case we figure out something additional to extract from it later, but it won't contain any personal information that a player may wish to keep from the general populace.

05-30-2010, 08:55 PM
Administrator
Join Date: Feb 2009
Location: MN
Posts: 2,072
Quote:
Originally Posted by cavedude
I agree with Trevius. Although I would personally disable the IDs in the program, I think it should still be an option for others.

I just got back from ball. I agree completely that this should be a universal application for dumping collects, but it should also be customizable if need be.

05-31-2010, 01:52 AM
Developer
Join Date: Apr 2009
Location: USA
Posts: 478
Quote:
Originally Posted by Derision
I did wonder if I could set variables for the starting Insert IDs at the start of the generated SQL and reference them with increments afterwards, so you would only need to change a few variables at the top to the next free IDs. I didn't know if that was possible in SQL, but I just tested it with select statements, and it seems to work:
Code:
set @myinsertid = 1001;
select id, name from npc_types where id = @myinsertid;
select id, name from npc_types where id = @myinsertid + 1;
Output:
Code:
mysql> source test.sql
Query OK, 0 rows affected (0.00 sec)
+------+-------------+
| id | name |
+------+-------------+
| 1001 | Guard_Mezzt |
+------+-------------+
1 row in set (0.00 sec)
+------+--------------+
| id | name |
+------+--------------+
| 1002 | Guard_Jerith |
+------+--------------+
1 row in set (0.00 sec)
So maybe that is the way to go.

Why not just fill in the ID with a sub-select? You could simply use (select max(id) from table) + 1 wherever you would otherwise put the ID. Or you could at least use a select to set the variable automatically at the start and otherwise do it as you have above.
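Both variants as rough MySQL sketches, with the npc_types column list cut down for the example. Note that MySQL normally rejects a subquery that reads the same table an INSERT is writing to, so the inline version wraps it in a derived table; setting the variable first is the simpler route:
Code:
-- Variant 1: set the starting variable automatically from the current maximum
set @NPCTypesStartingInsertID = (select max(id) from npc_types) + 1;
INSERT INTO npc_types (id, name) VALUES (@NPCTypesStartingInsertID + 0, 'Guard_Mezzt');

-- Variant 2: inline sub-select; the derived table works around MySQL's
-- restriction on selecting from the table being inserted into
INSERT INTO npc_types (id, name)
    VALUES ((select max(id) + 1 from (select id from npc_types) as t), 'Guard_Mezzt');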