Local LLM support for AI features

First off, the new AI features (interactive tutor, flashcard generation) are great, and I can see them stepping up my learning game hugely!

I was wondering if support for locally hosted LLMs (through Ollama, etc.) with RAG is also in the pipeline. I'd love to use the AI+ features for tasks like generating flashcards, quizzes, and summaries from a given source with a locally hosted model, and reserve the monthly AI tokens solely for the interactive tutor (which requires internet access and is therefore expensive), instead of burning the precious tokens on tasks that don't strictly need it. A rough sketch of what I have in mind is below.
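
To illustrate, here's a minimal sketch of how such an integration could talk to Ollama's local REST API (`/api/generate` on the default port). The model name, prompt wording, and the `generate_flashcards` helper are all just placeholders I made up, not anything from the app:

```python
import requests

# Assumes an Ollama server running locally on its default port (11434)
# with a model already pulled, e.g. `ollama pull llama3`.
OLLAMA_URL = "http://localhost:11434/api/generate"


def generate_flashcards(source_text: str, model: str = "llama3") -> str:
    """Ask a locally hosted model to turn a source passage into flashcards."""
    prompt = (
        "Create question/answer flashcards from the following text. "
        "Return one 'Q: ... / A: ...' pair per line.\n\n" + source_text
    )
    resp = requests.post(
        OLLAMA_URL,
        # stream=False returns a single JSON object instead of chunks
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    passage = "The mitochondrion is the powerhouse of the cell..."
    print(generate_flashcards(passage))
```

The same pattern would cover quizzes and summaries: everything runs offline against the local model, so no AI tokens are spent.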