Woz already thought of something like this, btw. Go to “Expected value of the increase in memory stability depending on retrievability”.
Yep. And I tested this idea 3 years ago. But it doesn’t maximize the knowledge acquisition rate.
Well, humans think in very similar ways, so I'm not that surprised. If you’ve ever delved into ancient philosophical traditions, there are a lot of similarities in people's ideas despite communication systems being very bad back then.
I hope this one works better, though. Imo, this will produce slightly worse results for maintaining constant retention, but will be better in terms of knowledge acquisition rate.
This is a good read. I am sold on this point. At the moment I am using Difficulty Asc. on a permanent backlog I have had for 2 years now. I don’t know how effective it is in my case compared with R desc or PSG desc in reality. Eagerly waiting for one of the two to come along.
Now, one of the issues mentioned with `difficulty_asc` is that the sorting doesn’t change with time, as difficulty only changes when grading cards. So even for very old backlogs, the sorting stays the same as before.
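A quick way to see this (toy numbers and an FSRS-4.5-style power forgetting curve; this is not the simulator’s actual code):

```python
# Sketch: a difficulty_asc queue never reorders between reviews, because D
# only changes when a card is graded, while an R-based queue reshuffles as
# retrievability decays at different rates. Card data is made up.

FACTOR = 19 / 81  # FSRS-4.5 forgetting-curve constants
DECAY = -0.5

def retrievability(elapsed_days, stability):
    return (1 + FACTOR * elapsed_days / stability) ** DECAY

# (name, difficulty, stability, days already elapsed) for three backlogged cards
cards = [("A", 9.1, 3.0, 2.0), ("B", 4.2, 60.0, 100.0), ("C", 6.8, 15.0, 5.0)]

for extra_days in (0, 30):  # the queue today vs. 30 days from now
    by_d = [c[0] for c in sorted(cards, key=lambda c: c[1])]
    by_r = [c[0] for c in sorted(cards, key=lambda c: retrievability(c[3] + extra_days, c[2]))]
    print(f"+{extra_days}d  difficulty_asc: {by_d}  retrievability_asc: {by_r}")

# difficulty_asc prints the same order both times; the R-based order shifts.
```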
When checking Difficulty Asc, did the cards all have the same preset?
Thanks. Took a lot of time to draft it first in Obsidian.
@rich70521 As Keks said, the formula becomes like this: [formula image]
Re: absolute vs relative change
It will make high-stability cards appear much earlier than low-stability cards. The current strength of PSG is also that it is going to be better at interleaving. Besides, absolute change is going to make it almost the same as the existing stability sort, which doesn’t perform too well.
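(For intuition on the scaling, assuming the standard FSRS-4.5/5 stability-after-recall formula, S_Recall = S * (1 + e^w8 * (11 - D) * S^-w9 * (e^(w10 * (1 - R)) - 1)): the absolute change S_Recall - S scales like S^(1 - w9) and grows with S, since w9 < 1, so sorting on it largely reproduces a stability sort, while the relative change S_Recall / S scales like S^-w9 and actually favors low-stability cards.)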
Maybe absolute change should be counted, idk. I’m not doing the simulations myself, so I can’t check it.
Is Difficulty Asc equally good when the cards are in one preset or spread across several?
I’m not sure, maybe you’re right in your suspicions. Even then, do we have a remedy?
FYI, the simulations were done for cards in the same preset.
I get your point here, but I’m not sure how this relates to the metrics being bad. If R goes down, it pulls the average down. What’s the issue with that?
Also, can you clarify efficiency?
Stability sort doesn’t change over time at all. PSG does, and still would with absolute change instead of percent change.
Also, maybe the magic of PSG is multiplying by R and not the calculation of the change in stability. Maybe an SxR sort gets you most of what you want in this idea. No idea, but possible.
What I’m saying is that the sorts that we know perform the best (basically what I meant by efficient) keep cards at the back of the backlog for longer while they prioritize keeping memory retained on higher-R cards; even `difficulty_asc` does this. They all have a valley in the distribution.

The backlogged cards at the bottom of the distribution are being intentionally left alone because they are the most forgotten. That is something you want and is a good thing, but those cards’ low R is dragging down the `total_remembered` metric, which makes those sorts look worse than they are. Then we end the sim before the backlog has been worked through, so those better sorts have a ton of cards down below R=0.25. The metric just isn’t useful; it isn’t telling us which sort is better, just which sort has a better average R when there’s still a huge backlog to work through.
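A toy example of how the averaging misleads (made-up numbers): sort X ends the sim with 800 cards maintained at R ≈ 0.95 and 200 untouched backlog cards at R ≈ 0.10, for an average of (800 × 0.95 + 200 × 0.10) / 1000 = 0.78; sort Y holds all 1000 cards at R ≈ 0.80, for an average of 0.80. The metric says Y wins, even though X has built far more durable knowledge and is just deliberately saving the already-forgotten cards for last.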
Look at `retrievability_asc`, which is one of the worst-performing sorts. It doesn’t have any cards below R=0.5, because it prioritizes not having low-R cards. But it doesn’t allow good retention (R>0.9) to build up at all. It spends a ton of time getting those low-R cards wrong over and over (inefficient) and letting the high-R cards get worse in the meantime.
Maybe I should have said S_Recall x R
I have no problem with the PSG idea, I like it. I just think we should try
PSG = (S_Recall - S) x R
as well as
PSG = (S_Recall / S) x R
I could see it maybe working.
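For anyone who wants to try these, here is a minimal sketch of the three candidate keys floated in this thread; the function names and the s_recall argument (the scheduler’s predicted post-review stability) are hypothetical placeholders, not the simulator’s real API:

```python
# Candidate sort keys; higher key = reviewed earlier (sort descending).

def key_psg_relative(s, s_recall, r):
    """Current PSG proposal: relative stability gain, weighted by R."""
    return (s_recall / s) * r

def key_psg_absolute(s, s_recall, r):
    """Variant suggested above: absolute stability gain, weighted by R."""
    return (s_recall - s) * r

def key_s_times_r(s, s_recall, r):
    """Control from earlier in the thread (s_recall unused):
    is multiplying by R the whole magic?"""
    return s * r

# e.g.: queue = sorted(cards, key=lambda c: key_psg_relative(c.s, c.s_recall, c.r), reverse=True)
```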
It is rather (11-D), which is refined to some extent by R.
Ascending difficulty + opposite Relative overdueness
Don’t you have a simulator?
I do, but I’m not at home right now
To simplify it a lot, it is possible that it reduces to (11-D)*R.
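(Roughly: with the FSRS stability-after-recall formula, S_Recall / S - 1 is proportional to (11 - D) * S^-w9 * (e^(w10 * (1 - R)) - 1). The S^-w9 factor is weak for typical small w9, so after multiplying by R, the varying parts of the key are essentially (11 - D) and R; hence the (11-D)*R shorthand. Treat this as hand-waving until it’s checked in the simulator.)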
Then I’m into it. A big problem with Ascending Difficulty is that it doesn’t take that many Again presses (at least with my personal parameters) to get a card to exactly D=10. Go do a search of your decks for `prop:d=1` and see how many qualify. I’d bet money that as you’re working through a backlog, the majority of your cards each day are cards that have a Difficulty of 10, which means Anki will default to the secondary sort, which is Order Added, I believe. That’s one of the worst sorts.
If this basically makes Reverse Relative Overdueness, or Retrievability Descending, the secondary sort, or a secondary consideration, then I think it’s a big improvement.
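(Concretely: once two cards are both pinned at D = 10, the (11 - D) factor is identical for them, so the comparison reduces to R alone, and ties effectively break by Retrievability Descending instead of Order Added.)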