I have a small issue, which has led to a thought.
I would like to create a challenge requiring a full D/T matrix with multis (or maybe 80 unique D/Ts). My issue is that in Sweden, there are 153 Swedes who have a full D/T matrix with multis. However, almost every single one of them has it from a stupid cache series where I know 90% of the canisters have quite fake ratings; I have logged the series myself. If I exclude the county where this series is located I am down to 7 Swedes. So 146 of the 153 Swedes don't really deserve it, in my opinion.
As a Challenge cache owner I would like to counter this, and I have the following ideas (one of them suggested to me by someone else):
* Exclude geocaches in the specific county - Not sure this would be allowed by the reviewers/HQ. There is no official rule against it, but I can easily see how it would be seen as discriminating, and a dangerous road for the future of challenges. For example, the next step would be someone saying "no finds in Germany count" just because they don't like Germans (Germany is just an example, please don't feel offended).
* Don't allow multiple geocaches whose titles begin with the same text. For example, "Pole fishing multis #1" and "Pole fishing multis #34" would be too similar. - I doubt this is allowed, since challenges aren't allowed to be based on the cache name.
* Allow at most 9 (or some other number) finds per county. - This would really prevent using the same series for the whole D/T matrix; a creative way to get around the issue, in my opinion (the idea was suggested to me). My 81 D/Ts would then have to be spread across at least 9 (81/9 = 9) counties. The issue here is that such a challenge checker would be fairly complex to write, and also potentially very slow.
Anyone have thoughts on the whole concept/problem? Do you think it would be feasible to create a checker that:
* Checks for X unique D/Ts with the supplied GetFinds-compatible filter.
* Checks that there are at most Y finds per country/region/county combination.
* Returns a list of gccode, visitdate, difficulty, terrain, (country), (region) and county, probably sorted on D/T.
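One hopeful observation: the per-county cap turns this into a small assignment problem rather than a brute-force search. Each D/T combo has to be "charged" to some county, and each county can be charged at most Y times, which is classic bipartite matching with capacities. A rough sketch of the idea, in Python rather than the checker's Lua, with a made-up input tuple format (not the real GetFinds output):

```python
from collections import defaultdict

# Hypothetical input format, one tuple per logged multi:
# (gccode, visitdate, difficulty, terrain, country, region, county)

def matrix_possible(finds, max_per_county=9):
    """Can a full 81 D/T matrix be built using at most max_per_county
    finds from any single county? Modeled as bipartite matching with
    county capacities (Kuhn's augmenting-path algorithm)."""
    # Only distinct (D/T, county) pairs matter, so even ~10,000 finds
    # collapse to at most 81 * number-of-counties candidate edges.
    options = defaultdict(set)  # (d, t) -> counties it was found in
    for _gc, _date, d, t, country, region, county in finds:
        options[(d, t)].add((country, region, county))

    assigned = defaultdict(list)  # county -> D/T combos charged to it

    def try_assign(combo, seen):
        # Place combo in a county with spare capacity, or evict and
        # relocate one of that county's current occupants.
        for county in options[combo]:
            if county in seen:
                continue
            seen.add(county)
            if len(assigned[county]) < max_per_county:
                assigned[county].append(combo)
                return True
            for other in assigned[county]:
                if try_assign(other, seen):
                    assigned[county].remove(other)
                    assigned[county].append(combo)
                    return True
        return False

    steps = [x / 2 for x in range(2, 11)]  # 1, 1.5, ..., 5
    combos = [(d, t) for d in steps for t in steps]
    if any(not options[c] for c in combos):
        return False  # some D/T combo was never found at all
    return all(try_assign(c, set()) for c in combos)
```

With at most 81 D/T combos on one side and Sweden's 21 counties on the other, the search space is tiny no matter how many finds go in. For an "X unique D/Ts" variant (say 80 of 81) you would count how many `try_assign` calls succeed instead of requiring all of them.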
Do you think it would be fast enough even for someone with almost 10,000 multis logged? Geocacher "qrang" has almost 9,000 multis, for example. To me it feels like the problem would require a lot of iteration, and that it likely won't be possible in less than 30 seconds.
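On speed: I suspect the number of finds is not the expensive part. When the goal is unique D/Ts with a per-county cap, a second find of an already-seen D/T combo in the same county can never help, so the raw find list can be collapsed up front before any real searching starts. A hypothetical pre-pass (Python, and the tuple format is made up for illustration):

```python
def distinct_pairs(finds):
    """Collapse raw finds to distinct (D/T, county) pairs.

    Duplicates of the same D/T combo in the same county never add a
    new unique D/T, so ~10,000 logged multis shrink to at most
    81 * number-of-counties rows before any heavier checking runs.
    """
    # Hypothetical tuple format:
    # (gccode, visitdate, difficulty, terrain, country, region, county)
    return {((d, t), (country, region, county))
            for _gc, _date, d, t, country, region, county in finds}
```

After this one linear pass, whatever the checker does next only operates on a few hundred rows at most, which should keep it well under any 30-second limit.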
If anyone would be interested in throwing a test together I would be happy; I feel that my Lua skills are too weak to get something like this flowing. I also believe many others here are a lot smarter than me when it comes to clever algorithms.