
0 votes
1.3k views
When I tag a checker, I use "verify tag" to check whether the tag I have created is working correctly.

If it is tagged correctly, the outcome shows "OK", or "Not ok" if the challenge is not completed.

If it is tagged incorrectly, the outcome shows "Error".

My question is: what does it mean if there is no outcome at all? It looks like the verify tag just stops.
in Support and help by vogelbird (Expert) (56.5k points)

1 Answer

0 votes

That shouldn't happen. Cases I can imagine:

  • 0 find logs on the challenge, perhaps. Note that the number of logs in PGC is what counts, and that lags by 24-36 hours.
  • An interruption in the AJAX call, but then it should work if you try again.
  • It's broken for all tags due to some new bug (hard to imagine, though).

If the problem persists and none of the above seems to match, publish the tag so that I can test it as well.

by magma1447 (Admin) (241k points)
You are not really giving me enough information to answer anything. I assume he does not fulfil it then?
We have spent enough time on this subject for now. Thank you, Ganja.
I think Alamogul will be a problem on most checkers. I tried with the template script and uncommented the local finds = PGC_GetFinds line.
With the standard fields {'gccode', 'cache_name', 'visitdate', 'country'} the script runs out of memory.
If 'country' is removed, the memory usage is 249,344 kB.
But a user with 109,970 finds is an exception.
The global #10, racer2814, with "only" 57,804 finds uses 133,376 kB with the same script.
There is probably no way to write a checker that looks at all caches and still works for the top 10.
Racer2814 works fine on the script used in the GC codes above ("Generic streak checker"), with a memory usage of 185,856 kB.
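The memory experiment above essentially comes down to how many fields are requested per find. A minimal Lua sketch of that idea, assuming a call shape like PGC_GetFinds(profileId, {fields = ...}) similar to the template script (the exact signature and field names are assumptions, not taken from the API documentation):

```lua
-- Sketch only: the PGC_GetFinds call shape is assumed from the template
-- script mentioned above, not from official API documentation.
local function countFindsPerDay(profileId)
    -- Request only the fields the check actually needs; each extra field
    -- (e.g. 'country' or 'cache_name') adds memory for every find.
    local finds = PGC_GetFinds(profileId, {fields = {'gccode', 'visitdate'}})

    local perDay = {}
    for _, find in ipairs(finds) do
        perDay[find.visitdate] = (perDay[find.visitdate] or 0) + 1
    end
    return perDay
end
```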
Thank you for the additional information. Maybe we will raise the max memory for some users. Maybe it will end up with something like 128 MB + numFinds*0.01 MB, or something similar.

A more scientific approach than guessing would be needed though.
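As a rough illustration of what such a formula would allow for the find counts mentioned above (the formula is only the tentative suggestion in this comment, not a decided policy):

```lua
-- Illustration only: the 128 MB + numFinds * 0.01 MB formula is the
-- tentative suggestion above, not an implemented limit.
local function maxMemoryMB(numFinds)
    return 128 + numFinds * 0.01
end

print(maxMemoryMB(57804))   -- about 706 MB for 57,804 finds
print(maxMemoryMB(109970))  -- about 1228 MB for 109,970 finds
```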
Thanks for the comment
...