This was my thinking as well. I think what’s changed my mind is realizing that Retrievability is often wrong. That’s why FSRS works so well: it’s very adaptable to specific cards, and the actual underlying difficulty of individual cards varies wildly.
So sorting by Ascending Difficulty (or, equivalently, your PSG Descending) would be like mining for the cards you’re most likely to retain easily and pushing them up to higher Stability/longer intervals faster.
If cards with low Difficulty also have a low Retrievability, that probably means you haven’t studied in a long time. For a card to even get to a low D, you have to answer it correctly a bunch of times in a row, so it’s going to have a high Stability. In that situation, most of your deck is probably low R, and the cards with the lowest D are the ones most likely to still be remembered at a low R, imo.
Again, even FSRS is wrong about the actual underlying Retrievability a lot of the time, and it’s working to fix that with every review. I think Ascending Difficulty is the fastest route to FSRS getting there.
No, my idea is that you find what the next review interval (or stability) is going to be for a card. Then you calculate the percentage change, and then you multiply it by retrievability.
The multiplication is there because the potential gain depends on how many of those cards you’re actually going to remember. Just run a mental simulation with a few sets of cards.
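Concretely, something like this (a rough Python sketch of the idea; the function name and the numbers are mine, and computing next_stability itself is the hard part):

```python
def psg(stability: float, next_stability: float, retrievability: float) -> float:
    """Potential Stability Gain as described above: the relative
    stability increase if the review succeeds, weighted by the
    probability (R) that you actually succeed."""
    pct_change = (next_stability - stability) / stability
    return pct_change * retrievability

# A card at S = 10 days that would jump to S = 30 days on a pass,
# with a 70% chance of recall right now:
print(psg(10.0, 30.0, 0.70))  # (30 - 10) / 10 * 0.7 = 1.4
```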
No and yes. Over a large number of cards, R values are pretty accurate. Yes, there is error (as you can see in the RMSE), but it’s pretty small IMHO.
I don’t think so. Try multiplying the number of cards by the average R of those easy cards, and I think you’ll find the R values make sense.
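For example (made-up numbers):

```python
# Made-up R values for a handful of low-D cards. The expected number
# recalled is the sum of the R values, i.e. count * average R.
rs = [0.92, 0.88, 0.95, 0.90, 0.85]
print(f"expect ~{sum(rs):.1f} of {len(rs)} correct")  # ~4.5 of 5
```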
When we’re looking at our parameter weights to see how they affect the formula, the index starts at zero, right? There are 17 weights, w_0 through w_16?
No wonder I have so many cards at D=100%. With my parameters, I have to hit the Easy button for the next D to go any lower. Hitting Good actually raises the D score (well, keeps it at the 100% max). That formula probably needs a fix; the mean reversion isn’t enough to pull them back out.
Edit: Actually, that may not be something that needs to be fixed. It’s probably a product of me going such long periods without studying and repeatedly building up big backlogs. I’m guessing if I stay on a better study schedule for long enough, those parameters will change.
Edit 2: I was wrong! I must have fat-fingered a number. It stays close to 10, but dips just under.
Check it yourself. This preset has over 150k reviews in its history, so this is not due to lack of data.
I’m still on FSRS 4.5
Here’s the formula. In FSRS 4.5 the difficulty update is D' = w7 * D_0(3) + (1 - w7) * (D - w6 * (G - 3)). Looks like in FSRS 5 they changed the mean-reversion target to D_0(4) instead of D_0(3), so they might have been trying to address this already.
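In code, that update looks something like this (my own paraphrase of the formula above; the weights below are illustrative, not anyone’s fitted parameters):

```python
def d0(g: int, w: list[float]) -> float:
    """FSRS-4.5 initial difficulty for a first rating g (1=Again .. 4=Easy)."""
    return min(max(w[4] - (g - 3) * w[5], 1.0), 10.0)

def next_d(d: float, g: int, w: list[float]) -> float:
    """FSRS-4.5 difficulty update: linear step in g, then mean
    reversion toward D_0(3), then clamp to [1, 10]."""
    d = d - w[6] * (g - 3)
    d = w[7] * d0(3, w) + (1 - w[7]) * d
    return min(max(d, 1.0), 10.0)

# With a near-zero w7, a card pinned at D=10 barely moves on Good (g=3):
w = [0.0] * 17
w[4], w[5], w[6], w[7] = 4.93, 0.94, 0.86, 0.01  # illustrative values
print(next_d(10.0, 3, w))  # ~9.95: dips just under 10, as noted above
```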
Yeah, my w7 basically makes it take forever for my cards to lose any of their difficulty unless I use the Easy button. I guess the Ascending Difficulty sort will be pretty useless for me.
Difficulty increases when you use the Again button (by twice as much as with Hard) or the Hard button; when you use the Good button, it should not increase.
In your case (w7 ≈ 0), everything can be simplified to D' = D - w6 * (G - 3).
Yeah, it’s common for difficulty to decrease by a tiiiiny amount when you press Good. Trust me, we experimented with making it change D a lot; it didn’t help.
What precision are the parameter values limited to?
For some of my presets, w7=0.0000
If the minimum value were limited to 0.0001, it would make sorting by difficulty more useful.
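Quick illustration of why that floor matters (a simplified version of the FSRS-4.5 update upthread, with illustrative numbers):

```python
def next_d(d, g, w6=0.86, w7=0.0, d0_good=4.93):  # illustrative values
    """Minimal FSRS-4.5 difficulty update (see the sketch upthread)."""
    d = d - w6 * (g - 3)
    d = w7 * d0_good + (1 - w7) * d  # mean reversion toward D_0(3)
    return min(max(d, 1.0), 10.0)

# With w7 = 0 exactly, Good (g=3) never changes D, so cards pinned at
# D=10 stay tied forever. A floor of 0.0001 lets them creep down, giving
# an Ascending Difficulty sort something to work with.
for w7 in (0.0, 0.0001):
    d = 10.0
    for _ in range(100):  # 100 straight Good reviews
        d = next_d(d, 3, w7=w7)
    print(w7, round(d, 4))  # 0.0 -> 10.0, 0.0001 -> ~9.95
```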
I either need to change how strict I am with the Easy button, or run the sim and make the Easy button much rarer, because right now I pretty much only use it when I know a card so well I could basically just delete it.
I’ll try it later
EDIT: nevermind, calculating the stability increase using Jarrett’s code turns out to be more difficult than I thought. I’ll see what I can do, but no promises.
EDIT 2: nope, this is too much for me. You’ll have to ask Jarrett to implement PSG.
This is a total hypothesis, but I bet that if you create two custom decks, one with prop:d>0.99 prop:r>0.8 prop:r<0.9 and one with prop:d<0.25 prop:r<0.4, you’ll get a higher proportion of the latter correct. Or if you checked the historical data on a massive dataset of flashcard reviews, I bet this would hold true.
What is the theoretical increase in retention after all this for a person with a permanent backlog: FSRS 5 + Reverse Relative Overdueness + <1d scheduling replacing (re)learning steps?