Researchers Build Public Leaderboard for Language Processing Tools | HackerNoon
Briefly

In this study, we introduce a novel adaptation of the benchmarking approach to NLPre, aimed at establishing an automated and credible method for evaluating NLPre systems.
The primary objective is to continuously update the performance ranking of NLPre systems via a publicly accessible leaderboard, promoting a fair and transparent evaluation process.
The leaderboard serves as a reliable point of reference for developers, ensuring the ongoing evaluation of new or upgraded NLPre systems and preventing result manipulation.
Read at Hackernoon