There is no easy way to get all caches in a large area. You will always run into a maximum-caches-per-query and per-day limit.
The limits are 10x1000 for PQs and 10,000 total for the VGPS. There are also 6,000 full API refreshes and 10,000 light refreshes per day, but to use those a desktop or phone program is needed. The fastest way is to use the PQ splitter and run 10 PQs each day, plus 10x1000 on the VGPS (which can be filled with a single "from/to hidden date" filter).
There is no limit on the number of PQs you can define. Set one up for the area and number of caches you are interested in and make many copies of it. Just rename each PQ and set the correct dates, and you can reuse them the next time you need a new GPX file; don't use the "Run this query once then delete it" option.
That will get you 20,000 caches per day (a bit less in practice, because the splitter splits at 990 caches and the number hidden each day varies). This is per account, and with multiple accounts it can be sped up.
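The date-splitting idea above can be sketched in code. This is a hypothetical illustration of how a PQ splitter might group hidden dates into contiguous ranges of at most 990 caches each, not the actual tool's algorithm:

```python
from collections import Counter
from datetime import date

def split_by_hidden_date(hidden_dates, max_per_query=990):
    """Split a list of cache hidden dates into contiguous date ranges,
    each covering at most max_per_query caches.
    Note: a single day with more than max_per_query caches cannot be
    split further, since the filter works on whole dates."""
    per_day = Counter(hidden_dates)
    ranges, start, count, prev = [], None, 0, None
    for day in sorted(per_day):
        n = per_day[day]
        if start is None:
            start, count = day, 0
        if count + n > max_per_query and count > 0:
            # Close the current range and start a new one at this day.
            ranges.append((start, prev))
            start, count = day, 0
        count += n
        prev = day
    if start is not None:
        ranges.append((start, prev))
    return ranges
```

Each returned `(from, to)` pair would become one PQ's "from/to hidden date" filter.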
But if you want access to all caches in an area, I would suggest running a program like GSAK locally on your computer. Then you just add new caches and keep the old ones, refresh only what you need, and use the API to get more caches.
Updating status (archived/disabled/active) and downloading logs has no limit other than the 30 API requests per minute.
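If you script against that 30-requests-per-minute cap yourself, a sliding-window limiter is a simple way to stay under it. This is a generic sketch, not part of GSAK or the official API:

```python
import time
from collections import deque

class RateLimiter:
    """Allow at most `limit` calls inside any sliding `window` seconds."""

    def __init__(self, limit=30, window=60.0):
        self.limit, self.window = limit, window
        self.calls = deque()  # scheduled call times

    def wait(self, now=None):
        """Record a call and return how many seconds to sleep first."""
        now = time.monotonic() if now is None else now
        # Drop calls that have fallen out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        delay = 0.0
        if len(self.calls) >= self.limit:
            delay = self.window - (now - self.calls[0])
        self.calls.append(now + delay)
        return delay
```

Before each API request you would call `time.sleep(limiter.wait())`.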
There is a nice "fetch all caches published in the last x days" (x < 30) for one or multiple countries or states. Running that with light data, then doing a full refresh on the new entries in the database, is an easy way to add new caches.
There are scripts that automatically refresh the caches with the oldest data (e.g. 5,000 of them) each night, and if you do that the database stays quite up to date.
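The core of such a refresh script is just "select the N least recently updated caches". GSAK keeps its data in a SQLite database, so a sketch could look like this; the table and column names here are assumptions for illustration, not GSAK's actual schema:

```python
import sqlite3

def oldest_caches(db_path, batch=5000):
    """Return the GC codes of the `batch` caches whose data was
    refreshed longest ago. Assumed schema: caches(code, last_updated)."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT code FROM caches ORDER BY last_updated ASC LIMIT ?",
            (batch,),
        ).fetchall()
    finally:
        con.close()
    return [code for (code,) in rows]
```

A nightly job would feed this list to the API refresh and then update each row's timestamp.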
To know that you have all caches, you only need their GC codes, and those can be extracted quite easily from the VGPS CSV export.
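Extracting the GC codes from that CSV export is a few lines of scripting. The column name `"Code"` below is an assumption; check the header of your actual export:

```python
import csv
import io

def gc_codes(csv_text, code_column="Code"):
    """Collect the GC codes from a CSV export (header row assumed)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {
        row[code_column]
        for row in reader
        if row.get(code_column, "").startswith("GC")
    }
```

Comparing that set against the codes already in your database tells you which caches you are still missing.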
Another nice feature is that if you copy your database to an Android phone and use GDAK on it, you have access to all caches without a slow import.