Search across all your Logseq blocks using text embeddings. Instead of matching exact keywords, semantic search finds blocks that are conceptually similar to your query.
In Logseq, go to Plugins → Marketplace and search for “Semantic Search”.
By default it uses Ollama running locally, but it can also connect to a remote Ollama instance or any OpenAI-compatible server. A GPU is not required, though one speeds up indexing; search itself stays fast even on CPU.
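Under the hood, semantic search of this kind typically embeds each block once at index time, embeds the query at search time, and ranks blocks by cosine similarity. The sketch below illustrates that ranking step with hand-made toy vectors; the block texts, vector values, and 4-dimensional size are invented for illustration, and a real setup would obtain embeddings from a model (e.g. via Ollama or an OpenAI-compatible embeddings endpoint) rather than hard-coding them.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" for three hypothetical Logseq blocks.
# In practice these vectors would come from an embedding model.
blocks = {
    "Plan the quarterly budget": [0.9, 0.1, 0.0, 0.2],
    "Notes on gradient descent": [0.1, 0.8, 0.6, 0.0],
    "Grocery list for the week": [0.7, 0.0, 0.1, 0.5],
}

# Pretend embedding of the query "machine learning optimizers".
query = [0.2, 0.9, 0.5, 0.1]

# Rank blocks by similarity to the query, most similar first.
ranked = sorted(blocks, key=lambda k: cosine_similarity(query, blocks[k]),
                reverse=True)
print(ranked[0])  # the conceptually closest block, no keyword overlap needed
```

Note that the top result shares no keywords with the query; that conceptual matching, rather than exact text matching, is what distinguishes semantic search from keyword search.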
View on GitHub.