
0 votes
324 views
I've been using a cumbersome process of creating several pocket queries to get all caches in a large area (say a county or two). I'm hoping there is a much easier way to load them into my Virtual GPS to get a GPX file. I'm aware of the PQ Splitter, but that also seems like a lot of work and covers a larger area than I want.
in Support and help by Geo-Knot (550 points)

2 Answers

+2 votes
There is no easy way to get all caches in a large area. You will always hit the per-query and per-day limits: 10 PQs of 1000 caches each per day, and 10000 caches total per day for the VGPS. There are also 6000 full API refreshes and 10000 light refreshes per day, but to use those you need a desktop or phone program. The fastest way is to use the PQ Splitter and run 10 queries per day as PQs, plus 10x1000 via the VGPS (which can be added with a single "from/to hidden date" filter).
 
There is no limit to the number of PQs you can define. Set one up for the area and number of caches you are interested in, then make a lot of copies of it. Just rename each PQ and set the correct dates, and you can reuse them the next time you need a new GPX. Don't use the "Run this query once then delete it" option.
 
That will get you 20000 caches per day (a bit less in practice, because the splitter splits at around 990 caches per query, and less again depending on how many caches were hidden in each date range). This is of course per account; with multiple accounts it speeds up accordingly.
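To make the arithmetic concrete, here is a minimal sketch using the limits above (the 990 figure is the splitter's per-query target mentioned above):

```python
# Daily throughput from the limits described above.
PQ_RUNS_PER_DAY = 10        # pocket queries runnable per day
CACHES_PER_PQ = 1000        # hard cap per pocket query
VGPS_DAILY_CAP = 10_000     # caches addable to the virtual GPS per day

theoretical = PQ_RUNS_PER_DAY * CACHES_PER_PQ + VGPS_DAILY_CAP
realistic = PQ_RUNS_PER_DAY * 990 + VGPS_DAILY_CAP  # splitter aims below the cap

print(theoretical)  # 20000 on paper
print(realistic)    # 19900, minus whatever each date range underfills
```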
 
But if you want access to all caches in an area, I would suggest running a program like GSAK locally on your computer. Then you only have to add new caches and can keep the old ones, refreshing just what you need, and use the API to fetch more caches.
Updating status (archived/disabled/active) and downloading logs has no limitation other than the 30 API requests per minute.
There is a nice "fetch all caches published in the last x days" (x < 30) for one or multiple countries or states. Doing that with light data and then running a full refresh on the ones that are new to the database is an easy way to add new caches.
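As a rough sketch of that workflow, pacing the calls to respect the 30-per-minute limit; the `fetch_published_since` and `full_refresh` functions are hypothetical placeholders for whatever your program or macro actually exposes:

```python
import time

PAUSE = 60.0 / 30  # ~2 s between calls keeps us at 30 API requests/minute

def add_new_caches(local_codes, fetch_published_since, full_refresh):
    """Light-fetch recently published caches, then fully refresh only
    the GC codes not yet in the local database."""
    published = fetch_published_since(days=7, data="light")  # any x < 30
    new_codes = [gc for gc in published if gc not in local_codes]
    for gc in new_codes:
        full_refresh(gc)   # hypothetical wrapper around one full API refresh
        time.sleep(PAUSE)  # stay under the per-minute limit
    return new_codes
```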
There are scripts that automatically refresh the caches with the oldest data (e.g. 5000 of them) each night, and if you do that the database stays quite up to date.
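A minimal sketch of such a nightly job, assuming a local SQLite table with gc_code and last_refreshed columns (illustrative names, not GSAK's actual schema), with full_refresh again a hypothetical API wrapper:

```python
import sqlite3
import time

BATCH = 5000       # refresh the stalest 5000 (under the 6000/day full limit)
PAUSE = 60.0 / 30  # respect the 30 requests/minute limit

def nightly_refresh(db_path, full_refresh):
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT gc_code FROM caches ORDER BY last_refreshed LIMIT ?",
        (BATCH,),
    ).fetchall()
    for (gc,) in rows:
        full_refresh(gc)  # one full API refresh for this cache
        con.execute(
            "UPDATE caches SET last_refreshed = ? WHERE gc_code = ?",
            (time.time(), gc),
        )
        time.sleep(PAUSE)
    con.commit()
    con.close()
```

At roughly 2 seconds per call, 5000 refreshes take just under three hours, so the batch fits comfortably in an overnight run.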
To know that you have all caches you only need their GC codes, and those can be extracted from the VGPS CSV export quite easily.
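For example (a minimal sketch; the exact column header in the VGPS export may differ, so adjust the name to match your file's header row):

```python
import csv

def gc_codes_from_vgps_export(path, column="GC Code"):
    """Collect GC codes from a VGPS CSV export.
    The column name is an assumption; check the export's header row."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column] for row in csv.DictReader(f)}

have = gc_codes_from_vgps_export("vgps_export.csv")  # hypothetical filename
print(len(have), "caches exported from the virtual GPS")
```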
 
Another nice feature is that if you copy your database to an Android phone and use GDAK on it, you have access to all caches without a slow import.
 
by Target. (Expert) (104k points)
0 votes
I have my PQs set up by state and by date. That works great for CH and D (Switzerland and Germany) and should also work for other places. Alternatively, I can suggest creating routes (assuming you know where you are about to go) and setting up by-route PQs. Still, the amount of data to be handled may be an issue.
by Domino_67 (6.8k points)
...