I usually forget to check the Evaluate metrics (log loss & RMSE) with the Evaluate button before optimizing a preset, so I can't tell in what way the new FSRS parameters are better than the old ones.
A way to fix this could be to show a pop-up window with the change in these metrics after the FSRS parameters are optimized.
For example, you press Optimize Current Preset, the FSRS parameters are optimized, and a pop-up window appears with the following information:
“Optimizing the FSRS parameters resulted in
· Log loss: 0.4195 → 0.4170
· RMSE(bins): 2.26% → 2.20%
(Smaller numbers are better)”
Anki exists for one main reason: to help users learn effectively. That’s the whole point. In my opinion, developers shouldn’t waste time on requests that don’t move the needle in that direction. Who really cares if some internal value changes by 1% or 56%? It’s irrelevant to the learning experience. Effort should go into features that actually improve retention and usability—not cosmetic stats that don’t matter.
It's not irrelevant, and it does matter. RMSE and log loss are metrics that show how well the FSRS algorithm predicts your reviews. That fit should be as good as possible.
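For context, here is a minimal sketch of what these two metrics measure, given predicted recall probabilities and binary review outcomes. This is an illustrative simplification, not Anki's or FSRS's actual implementation (FSRS's real RMSE(bins) uses a more elaborate binning scheme); the function names are my own.

```python
import math

def log_loss(preds, outcomes):
    """Mean negative log-likelihood of binary review outcomes
    (1 = recalled, 0 = forgotten) under predicted recall probabilities.
    Lower is better; confident wrong predictions are penalized heavily."""
    eps = 1e-15  # clamp to avoid log(0)
    total = 0.0
    for p, y in zip(preds, outcomes):
        p = min(max(p, eps), 1 - eps)
        total -= y * math.log(p) + (1 - y) * math.log(1 - p)
    return total / len(preds)

def rmse_bins(preds, outcomes, n_bins=10):
    """Calibration metric: group reviews into bins by predicted
    probability, then take the size-weighted root-mean-square gap
    between each bin's mean prediction and its observed recall rate."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(preds, outcomes):
        idx = min(int(p * n_bins), n_bins - 1)
        bins[idx].append((p, y))
    total = 0.0
    for b in bins:
        if not b:
            continue
        mean_p = sum(p for p, _ in b) / len(b)
        mean_y = sum(y for _, y in b) / len(b)
        total += len(b) * (mean_p - mean_y) ** 2
    return math.sqrt(total / len(preds))
```

A perfectly calibrated model scores an RMSE(bins) of 0 even if individual predictions are uncertain, which is why the two metrics are reported together: log loss rewards sharp, correct predictions, while RMSE(bins) checks that predicted probabilities match observed recall rates.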
How prominently they should be shown comes down to preference. This user would clearly like them to be more detailed than they currently are. Others want them removed as long as the model is as accurate as possible (meaning they don't really care about the values as long as it works).
Also: developers / contributors can work on whatever they deem worthwhile. And since opinions differ, opinions on which features are worthwhile differ too.
I understood that, my question was what they’re trying to do by knowing the numbers before optimization. These numbers don’t directly correlate with your understanding of the cards that you have (AFAIK), so I was just trying to clarify what they’re aiming to do.
I just want to know how much the optimization improves the parameters. There have been several cases in which the parameters were worse after optimization than before. If I can't know the previous values, how can I tell whether the optimization has improved or worsened the algorithm?
Also, what’s the point of the Evaluate button if the metrics are irrelevant? Either they are important, in which case being able to see the change in the metrics is relevant, or they are not important, in which case then the Evaluate button should be removed.
I don’t think the metrics are cosmetic stats, they exist and were created for a reason.
I'm not talking about my personal experience here; I usually don't write down the values before optimizing, so I don't know whether they improved or not, hence my request.
It might be helpful for you to know – unless you’ve updated your Anki/FSRS version between optimizations, that shouldn’t be happening. Optimization is testing itself, so if the new parameters it generates are worse, you’ll get a parameters-already-optimal message instead of “worse” parameters.
I am someone who consistently uses Evaluate before and after re-optimization and keeps a log of the results. But for most users, I don’t think it’s necessary to show them an Evaluate result automatically.
You’ve stumbled into a hot-button issue right now – with certain factions trying to get rid of the Evaluate button entirely. That might help explain some of the resistance you’re getting to this idea.