With local models getting better (yesterday’s Gemma 4 by Google runs quite well on my Mac), Panda in particular seems well suited to users who want private, local AI, simply because its notes are plain markdown files fully accessible to and from the file system. Nothing limits it to local models — users could still opt for cloud ones — and no extra work would necessarily be required, but the current LLM landscape, especially the rise of capable local models, increases Panda’s value.
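To make the point concrete, here is a minimal sketch of why filesystem-accessible markdown matters: any script can gather the notes and hand them to a local model as context. The `build_context` helper and the notes directory are hypothetical — this is not Panda's API, just an illustration of the workflow plain files enable.

```python
from pathlib import Path

def build_context(notes_dir: str, question: str, max_chars: int = 4000) -> str:
    """Concatenate markdown notes from a directory into a prompt
    suitable for a local LLM (hypothetical pipeline, not Panda's API)."""
    parts = []
    total = 0
    for path in sorted(Path(notes_dir).glob("*.md")):
        text = path.read_text(encoding="utf-8")
        if total + len(text) > max_chars:
            break  # keep the prompt within a small local-model context window
        parts.append(f"## {path.name}\n{text}")
        total += len(text)
    notes = "\n\n".join(parts)
    return f"Use these notes to answer.\n\n{notes}\n\nQuestion: {question}"
```

The resulting string could be piped to any local runtime (llama.cpp, Ollama, etc.); the point is that no export step or vendor API is needed when the notes are already plain files on disk.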
Here, Karpathy describes a compelling use case for LLMs in knowledge apps: