Suggestion to Damien: Migrating to an NN (Neural Net)

  1. I meant “99% of what LSTM can do”, just in case you misunderstood. Like, relative to LSTM, not as some sort of absolute metric.
  2. There will be a “Better optimization (uses GPU)” toggle or something like that in deck options. At the moment I don’t know if it will require an Nvidia GPU or if AMD will work too.
    Later I’m going to show a graph like these: Calibration of different FSRS versions - Album on Imgur. I’ll add FSRS-7 there. FSRS-7 should be very accurate on average. That is, I can’t guarantee that it will be great for every single user, but on average, across thousands of users, it will be accurate. Even FSRS-6 is pretty good. Just keep in mind that this graph doesn’t tell you about worst-case scenarios for the most unlucky users.
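To illustrate what such a calibration graph measures (a toy sketch, not the actual FSRS evaluation code; the function name is hypothetical): predictions are grouped into bins, and the mean predicted retention in each bin is compared against the actual recall rate in that bin.

```python
# Sketch of how a calibration graph is computed (illustrative names,
# not FSRS's code): bin predicted retention and compare each bin's
# mean prediction with the fraction of reviews actually recalled.

def calibration_bins(predicted, recalled, n_bins=10):
    """predicted: probabilities in [0, 1]; recalled: 0/1 outcomes."""
    bins = [[] for _ in range(n_bins)]
    for p, r in zip(predicted, recalled):
        i = min(int(p * n_bins), n_bins - 1)  # clamp p == 1.0 into last bin
        bins[i].append((p, r))
    points = []
    for b in bins:
        if b:
            mean_pred = sum(p for p, _ in b) / len(b)
            actual = sum(r for _, r in b) / len(b)
            points.append((mean_pred, actual))
    return points  # a perfectly calibrated model has mean_pred ≈ actual everywhere
```

A model is well calibrated when the resulting points lie close to the diagonal; averaging over thousands of users smooths out the noisy bins you’d see for a single collection.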

Ah yes, that makes sense. I did misunderstand this.

Could this not be GPU independent? E.g. my computer uses an Intel graphics card, and I’d love to have the full optimization available (if you need me for testing purposes, we can probably arrange that).

Regarding the graph: If I understand this correctly, your graph should essentially be like the FSRS Calibration graph from Search Stats Extended, except that yours is for the average of your data sample (multiple people) and mine is from my own data. Aside from the lower part of the predictions (values that aren’t around 95%), the current FSRS-6 is already doing a great job for me:

But does the LSTM as a first step optimization potentially improve the lower predicted retention?

Could this not be GPU independent?

It will probably display a warning like “WARNING: GPU not detected” or “GPU isn’t suitable for optimization” or something. It can run on the CPU, but it will take dozens of minutes to several hours, depending on the number of reviews.
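A rough sketch of how such a warning could be wired up (this is an assumption about the behavior described above, not Anki’s actual detection logic; probing for `nvidia-smi` is just one illustrative way to detect an NVIDIA GPU):

```python
# Hypothetical device-selection sketch: probe for the NVIDIA driver CLI
# and fall back to the slow CPU path with a warning if it's missing.
import shutil

def pick_device() -> str:
    if shutil.which("nvidia-smi") is not None:
        return "cuda"
    print("WARNING: GPU not detected; optimization will run on the CPU "
          "and may take dozens of minutes to several hours.")
    return "cpu"
```

The key point is that the CPU path stays available as a fallback; the toggle just warns you about the time cost instead of refusing to run.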

But does the LSTM as a first step optimization potentially improve the lower predicted retention?

We’ll see once I finish some more runs.


Not ideal. But if it’s just a few minutes and the results are worth it, I’d be willing to do it.

That being said: I don’t know why it needs a specific GPU (or GPU manufacturer), but if there is something I can do to add support for Intel-based GPUs, feel free to ping me!

Because of Nvidia’s CUDA. Then again, we’ll see, maybe Jarrett can implement LSTM in Rust in a way that doesn’t make CUDA a strict requirement.
Also, remember, this whole toggle will be optional. It’s a “hey, do you want to throw more compute at the problem to get better parameters?” toggle.


Say, would Intel Arc Graphics make the cut :sweat_smile: :red_question_mark:

I have hundreds of thousands of reviews; that is, assuming you have raised the number of reviews recognized per card from 64 to 1000+.

Uh…no idea, honestly. Again, we’ll see whether Jarrett can implement LSTM in Rust in a way that doesn’t rely on Nvidia’s CUDA. If it works, well, I guess you will be the guinea pig, lol.
Oh, and it will likely be slower and more buggy than using an Nvidia GPU or an AMD GPU.

One more thing. On top of using LSTM as a teacher, there will be another way of improving FSRS’s optimization: just running it for longer. More precisely, enabling the “Better optimization” feature will increase the number of epochs used to optimize FSRS. The number of epochs roughly means “how many times does FSRS see every review?”. 8 will be the standard, and “Better optimization” will use 40. This will further improve the resulting parameters at the cost of time. “Better optimization” will take advantage of both ways of improving optimization.
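As a toy illustration of the epoch trade-off (hypothetical code, not FSRS’s optimizer): each epoch is one full pass over every review, and more epochs usually means a better fit at the cost of proportionally more time.

```python
# Toy illustration only: fit a single "retention" parameter to 0/1
# recall outcomes with a simple online update. Each epoch is one pass
# over the review history ("how many times FSRS sees every review").

def train(reviews, epochs, lr=0.05):
    p = 0.5                            # initial guess
    for _ in range(epochs):            # more epochs -> closer fit, more time
        for outcome in reviews:        # one pass over all reviews
            p += lr * (outcome - p)    # nudge the estimate toward the data
    return p

reviews = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]  # made-up outcomes, 80% recall
```

With this setup, 40 epochs lands noticeably closer to the data’s 80% recall rate than 8 epochs does, which is the same idea behind the standard-vs-“Better optimization” epoch counts, just on a vastly smaller scale.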
