• CharlesW 5 minutes ago

    How would you compare and contrast this to Steve Yegge's Beads (https://github.com/steveyegge/beads/), or to ordinary file-based memory following vendors' guidelines (https://code.claude.com/docs/en/memory)?

    • bilekas 2 hours ago

This looks interesting, and I'll try it out to see what it can do. I like the idea of using temporal values as a significant weight, but one thing isn't really clear to me.

      > Traditional Vector DBs require extensive setup, cloud dependencies, and vendor lock-in:

Is this really true? What's wrong with running your own local Redis vector DB? They have an open-source version that's separate from their hosted offering...

      > https://redis.io/docs/latest/operate/oss_and_stack/

      Am I missing something ?
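A minimal sketch of the temporal-weighting idea mentioned above, in pure Python. The half-life decay and the alpha mixing knob are illustrative assumptions, not OpenMemory's actual scoring formula:

```python
import math
import time

def recency_weight(created_at: float, now: float, half_life_s: float = 86_400.0) -> float:
    """Exponential decay: a memory loses half its weight every half_life_s seconds.
    (Assumed decay shape, for illustration only.)"""
    age = max(0.0, now - created_at)
    return 0.5 ** (age / half_life_s)

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def score(query_vec, mem_vec, created_at, now, alpha=0.7):
    """Blend semantic similarity with recency; alpha is an assumed mixing knob."""
    return alpha * cosine(query_vec, mem_vec) + (1 - alpha) * recency_weight(created_at, now)

now = time.time()
fresh = score([1.0, 0.0], [1.0, 0.0], created_at=now, now=now)
stale = score([1.0, 0.0], [1.0, 0.0], created_at=now - 7 * 86_400, now=now)
assert fresh > stale  # identical vectors, but the fresh memory ranks higher
```

The point is just that "how similar" and "how recent" can be combined into one ranking score, so old-but-relevant memories slowly lose ground to fresh ones.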

      • zffr 25 minutes ago

Yeah, it’s strange that the project doesn’t mention using Redis, or even SQLite with a vector DB extension.
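For what it's worth, even without an extension like sqlite-vec, a brute-force local vector store fits in a few lines of stdlib Python. The schema and helper names here are made up for illustration:

```python
import sqlite3
import struct
import math

def pack(vec):
    """Store a float vector as a BLOB of 32-bit floats."""
    return struct.pack(f"{len(vec)}f", *vec)

def unpack(blob):
    return list(struct.unpack(f"{len(blob) // 4}f", blob))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE memories (id INTEGER PRIMARY KEY, text TEXT, embedding BLOB)")
rows = [("likes redis", [0.9, 0.1]), ("likes sqlite", [0.1, 0.9])]
db.executemany("INSERT INTO memories (text, embedding) VALUES (?, ?)",
               [(t, pack(v)) for t, v in rows])

def search(query_vec, k=1):
    """Linear scan over all rows; fine for small memory stores."""
    scored = [(cosine(query_vec, unpack(e)), t)
              for t, e in db.execute("SELECT text, embedding FROM memories")]
    return sorted(scored, reverse=True)[:k]

print(search([0.0, 1.0]))  # "likes sqlite" should rank first
```

A linear scan obviously won't scale to millions of vectors, but for a per-agent memory file it sidesteps any server or cloud dependency entirely.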

      • davidarenas an hour ago

It would be awesome if this could be part of AgentFS, which also runs on SQLite.

You would be able to easily offer agents that have all of a tenant's data and agent state in a single file, which can be synced to S3.

        https://github.com/tursodatabase/agentfs

        • A4ET8a8uTh0_v2 2 hours ago

Part of this weekend is allotted for a local inference build. It genuinely looks interesting. This is kinda what I hoped the local LLM scene would become: everything becomes modular and you just swap in the pieces you want or think would work well together.

          • koakuma-chan 2 hours ago

            This does not look interesting. This is AI slop.

            • A4ET8a8uTh0_v2 2 hours ago

Ok. Why does it not look interesting? It does seem to solve a problem. Have you actually looked into what it takes to build your own equivalent of ollama? It gets into fascinating trade-offs real fast.

              • koakuma-chan an hour ago

                Because this is the output of "Hey cursor, write a memory store for AI agents." This is by no means an equivalent of ollama. I don't know where you got this from.

                Check this out: https://github.com/CaviraOSS/OpenMemory/blob/17eb803c33db88a...

                • A4ET8a8uTh0_v2 an hour ago

Admittedly, I don't have much exposure to cursor, so I am taking your statement at face value (as in, I don't see obvious relevant artifacts). I am playing with stuff this weekend anyway, so it just means I will be digging a little deeper now :D

                  • ctxc an hour ago

                    How did you figure that out though, did you skim through the source code or was there some other tell?

                    • koakuma-chan an hour ago

I was pretty sure after reading that README, and skimming through the source code confirmed it; like you said, it literally has agent comments in there lol.

                    • ctxc an hour ago

                      This is insane.

                      The comment in code literally says "# Wait, `get_vecs_by_sector` returns all vectors." :|

                      • A4ET8a8uTh0_v2 an hour ago

                        Adversarial review as a service incoming. Brave new world.

                        • A4ET8a8uTh0_v2 an hour ago

                          edit:

                          from gpt5.2 with prompt:

                          << 'adversarial review request. please look at the github link for signs of being written by llm ( extra points if you can point to the llm that generated it ) https://github.com/CaviraOSS/OpenMemory'

                          >> I can’t prove it’s LLM-written from the outside, but the README (at least) has a lot of “LLM smell.” I’d put it at high likelihood of AI-assisted marketing/docs copy, with some sections bordering on “generated then lightly edited.”

but then it adds a list of style reasons why it could be generated by an llm

                          << “Extra points”: which LLM wrote it?

                          Most likely: Claude 3.5 Sonnet–style output

                          << if i were to point to comments in readme and code, what would you say upon re-review

>> Comments that narrate the obvious (especially line-by-line)

>> Tutorial voice inside production code
