Why aren't appeals being used as a way to identify, evaluate, and educate poor reviewers?
One of the most desired aspects of appeals was that they would help curb bad reviewing and decrease the frequency at which nominations were improperly rejected by the community. Now that the appeals system is up and running, it appears that when Niantic processes an appeal, they essentially just re-review it. It does not appear that Niantic looks at who wrongly rejected a nomination, nor that Niantic is doing anything to identify reviewers who review improperly and take action to educate them.
Why aren't appeals being used as a way to identify, evaluate and educate poor reviewers? Does Niantic have any plans to actually address bad reviewers in a way that will make a meaningful impact on the Wayfarer ecosystem?
Yes, similar to the question I posted about appeals: https://community.wayfarer.nianticlabs.com/discussion/30250/appeals-statistics#latest
Statistics can provide valuable insights to improve the whole process.
I had a similar question but I believe this discussion meets the intent of mine.
Many "wrong" rejections were made due to insufficient or unclear guidelines. What plans are in place both to update the guidelines and to educate existing reviewers? This would decrease dependency on appeals and waste less of everyone's time.
Also, it needs to be done properly and take into consideration that people with a lot of agreements may be part of organised groups voting tactically, either for agreements or out of team bias.
As well as prolific voters who have huge stats but also regularly reject some things wrongly.
Every time an appeal gets accepted, every reviewer who rejected that nomination should get a message showing them the nomination. This message should be displayed on login to Wayfarer.
Appeals are so new. We can't expect Niantic to be using them as a way to deal with poor reviewers yet, but it would be nice to know if that's a plan for the future. Perhaps we first need to ask: what has Niantic learned from the initial round of appeals? Are they seeing a high number of quality submissions that have been initially rejected by the community? If yes, what do they plan to do to right that ship? Is the next step to communicate with, redirect, penalize, and/or re-educate those reviewers who are voting down submissions that then get approved on appeal?
While it would be very nice if Niantic could use appeals data to clarify their criteria, automatically punishing reviewers sounds like it would be a terrible idea, as there can be many valid, or at least understandable, rejection reasons that might still get overturned on appeal.
I completely agree that automatic punishments would have issues, however there are a lot of cases where people just blatantly did not review correctly. The frequency of this I'm sure varies greatly between different rejection reasons.
For example, I recently had a nomination rejected for "URL" when the only URL in the submission was in my supplemental info. The people that rejected me for URL blatantly and objectively did not review correctly. Those people should be reeducated at the minimum.
Some other examples include a submission in a community garden that was rejected for PRP, a trail marker that was rejected for pedestrian access, a nature sign that was rejected for natural feature, etc. These types of things are in absolutely no way subjective, and when people are found to be repeatedly reviewing incorrectly, they should be punished.
There are times when I can understand why a reviewer might reject one of my submissions for location mismatch (or give it a low location score), even when I know they are wrong. Every reviewer is different, and some will be more trusting or put in more effort than others. But sometimes rogue reviewers will swoop in with unjustified rejections and tip the scales against you with a bad rejection reason. Combating those reviewers is what is most important: it's objective, and in many cases it could make all the difference in whether a nomination gets approved or not.
I agree with both of you. Penalties need to be handled with care, and as AisforAndis-ING says, there are poor reviewers. I have had URLs in my supplemental information and been rejected as "URL" too; I do not put the URL in my nomination's description. I have also been rejected for "mismatched location" for something that clearly exists, and had a university rejected as "K-12". And many more. These inappropriate rejections frustrate nominators.
I don't usually complain about this stuff on the forums, this actually being my first post. But I have to jump on board with y'all and agree: something needs to be done soon about bad reviewers. It is extremely frustrating when you take time out of your gameplay, or even put other things aside, to travel a few miles out of your way to explore an area with potential Waypoint candidates, an area that actually needs some Waypoints, and they get rejected for absurd reasons.
I've not yet made an appeal mainly because I don't see the option. Not sure why but that's another topic. But most certainly I don't see this feature as a means to "punish" reviewers. In my humble opinion, this feature should be about giving a chance to submitters to have their submissions re-reviewed faster than having to re-submit and go again through all the long process of queue-voting-decision. But also should give Niantic a chance to see what went wrong in the initial process. Like fellow Wayfarers already mentioned above, improvements can still be made on Niantic's side regarding criteria and communication. But also reviewers might need help in some of those grey areas. And let's not forget, submitters aren't perfect either. The appeal feature should also be a means to help and educate submitters too, if it's not too much to ask from a single feature hihihi.
And I believe it's more important to find those "poor" reviewers that are willing to take their time to review, and educate them rather than "automatically punish" someone for trying.
All that being said, I fully support the "identify, evaluate, educate" idea presented in the topic title.
Yes to punishment, if done manually.
No to any automatic punishments.
I definitely understand the concerns about automatic punishments. If they go that route, and I do believe they should, it should be for things where the rejection was objectively incorrect, like an unwarranted URL, license plate, or face reject. And even then, it's probably best if an automated punishment only comes after multiple offenses.
I don't think there's any way to properly automate punishment. At some point, an actual thinking person is going to have to look at and evaluate a reviewer's "bad" decisions and determine what the bad pattern actually is, and then approve punishment. Even then, the bad decisions would have to be so obviously ill-intentioned that there could be no disputing that, and that's incredibly hard to determine. Someone rejecting URLs, for example, might just have an improper understanding of the rules. And, since there's no situation where any given type of POI deserves an automatic approval, you can't have automatic punishments, either.
I like the concept of showing someone a successfully appealed submission that was rejected by them. That contributes to the continuous learning. Automatic punishments (which will inevitably not be done properly) will just give Wayfarer a bad image. No one will want to bother with a punishment-happy platform. Remember, this is freely given labor Niantic is exploiting. They'd be looking a gift horse in the mouth if they automated punishment.
This needs to happen. Or make approved appeals viewable by all reviewers in a thread somewhere on these forums. Or both. It would help speed up the nomination process and also decrease the number of appeals filed.
It would also be so simple for some of the guidelines to be "built in" / included in the reviewing page. One quick example: the supplemental information section could include text saying "URLs are allowed in this section".
I think automation could happen after there are 'x' number of rejections over 'y' amount of time that end up successfully appealed.
Example: in a 12-month period, 1-9 successfully appealed rejections notify the reviewer; 10th suspends reviewing for a month, or delays their own nominations from being approved, or suspends their ability to nominate, or whatever.
That's just an example, obviously numbers would be modified to fit the overall situation based on much better analysis than I've done.
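To make the idea above concrete, here is a minimal sketch of that kind of rolling-window escalation policy. Everything here is hypothetical: the thresholds, the window length, the action names, and the `record_overturn` helper are all made up for illustration, not anything Niantic has announced.

```python
from dataclasses import dataclass, field

# Hypothetical policy: within a rolling 12-month window, overturned
# rejections 1-9 only notify the reviewer; the 10th triggers a suspension.
SUSPEND_AT = 10       # 10th overturned rejection -> suspension
WINDOW_DAYS = 365     # rolling 12-month window

@dataclass
class ReviewerRecord:
    # Day numbers (relative to some epoch) of successfully appealed rejections.
    overturned_days: list = field(default_factory=list)

def record_overturn(rec: ReviewerRecord, today: int) -> str:
    """Log an overturned rejection and return the action to take."""
    # Drop overturns that have aged out of the rolling window.
    rec.overturned_days = [d for d in rec.overturned_days
                           if today - d < WINDOW_DAYS]
    rec.overturned_days.append(today)
    if len(rec.overturned_days) >= SUSPEND_AT:
        return "suspend_reviewing_30_days"
    return "notify"

rec = ReviewerRecord()
actions = [record_overturn(rec, day) for day in range(10)]
print(actions[0])   # first overturn: notification only
print(actions[-1])  # tenth overturn inside the window: suspension
```

The point of the rolling window is that an occasional mistake ages out, so only a sustained pattern of overturned rejections ever reaches the suspension threshold.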
I strongly agree with @Diskrepansi-ING above. While I agree it's hard to see intention, setting a numeric bar of a certain number of mistakes in a certain time frame seems fair. A reviewer isn't making a genuine attempt to understand and get better if that type of pattern is being seen.
I think the suspension is a good moderate first step, but I would add that it should also progress to a permanent ban if there are a certain number of suspensions within a time frame (e.g., a year).
Since suspensions limit the amount of reviewing an individual can do in the time frame, exceeding a set number of suspensions indicates at the very least that the reviewer isn't willing to try to understand what they're doing wrong, and at worst reveals a deliberate attempt to act in bad faith, i.e., targeting agents they have a personal dislike of.