Sort using R × D
Make a new suggestions post? FSRS is the future, as dae said, so we should move toward greater consistency with this algorithm.
If it’s not using your FSRS parameters and other values estimated from your review history, it’s not as accurate as you may think.
Suggest what?
The ratio? Why the eff did you calculate it, then?
I calculated it to update the “10x new cards” rule of thumb
It’s not meant to be used for choosing desired retention, that’s what CMRR is for
Bruh, I’m talking about the warning message that people are shown.
It’s not parameter-independent. The ratio will change if you change parameters. The “X day interval will become Y days” is parameter-independent.
Who just said it’s better than the current rule of thumb?
When newcomers ask “How many reviews will I have to do in the future?”, people say “Take your number of new cards that you learn per day and multiply it by 10”. This table is a more exact version of that.
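As a concrete instance of that heuristic (the numbers here are an assumed example workload, and the 10x multiplier is the rule of thumb under discussion, not a derived value):

```python
new_cards_per_day = 20  # example: a user learning 20 new cards per day

# The "10x new cards" rule of thumb: expected future daily review count
estimated_reviews_per_day = new_cards_per_day * 10

print(estimated_reviews_per_day)  # 200
```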
Yeah, it’s the same rule of thumb the warning message uses. Although I’m not sure this can be done if the rule changes when another option changes.
… I mean, not a bad idea. It’d probably have to be (1 − R) × D, but that might be something to try. I doubt it’s going to be so good that it needs to be implemented, but I’m curious now.
It’ll produce better results even if a user’s due cards all have very similar R values or very similar D values. Currently we just randomise when cards with the same values need to be sorted.
Actually, I was thinking of modifying the sort orders to add a second sort condition, because right now it’s just random().
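Something like this is what I mean by a second sort condition — a rough sketch, not Anki’s actual implementation. The card values here are made up, and (1 − R) × D is the hypothetical composite key, with random() kept only as the final tie-breaker:

```python
import random

# Toy card representation: retrievability R in [0, 1], difficulty D in [1, 10]
cards = [
    {"id": 1, "R": 0.90, "D": 4.2},
    {"id": 2, "R": 0.90, "D": 7.5},  # same R as card 1 -> D breaks the tie
    {"id": 3, "R": 0.70, "D": 7.5},  # same D as card 2 -> R breaks the tie
]

def sort_key(card):
    # Primary key: (1 - R) * D, negated so the hardest, most-forgotten
    # cards come first. Secondary key: random(), used only when the
    # composite value ties exactly.
    return (-(1 - card["R"]) * card["D"], random.random())

queue = sorted(cards, key=sort_key)
print([c["id"] for c in queue])  # card 3 first: lowest R and high D
```

With this key, cards that currently sort identically (same R, or same D) would still get a meaningful order instead of a purely random one.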
I don’t understand. I thought you were talking about this?
Only now did I realize that perhaps you were talking about this.
It would be cool, yeah, but again, the ratios depend on parameters, so eh. And simulations take forever to run
If you’re using R or D as the first sort, my guess is there will almost never be ties. I think those are calculated to several sig figs.
Both are calculated with 2 digits of precision, unless Card Info is lying
I figured they’re just displayed to 2 digits, and the actual calculation uses whatever precision the system defaults to, which is usually around 16 digits.
I’ll probably go crazy trying to follow this. See this post: Ordering Request: Reverse Relative Overdueness - #267 by Expertium
Isn’t random performing better than retrievability_desc and retrievability_asc? Let’s first decide which data we’re using before we go on further from here.
What happens to the results of the simulation if new cards are constantly added, like they would be in a real-life situation? Say, 100, 200, 300… 500 new cards every day.