

I’m actually also working on a project using LLMs to talk to NPCs. This one doesn’t use local models, though, but online models called through a proxy with API keys, which lets you use much larger and more capable models.
But yeah, it’s been interesting digging deep into the exact construction of the prompts to get the NPCs talking and behaving exactly the way you want, and to be as lifelike as possible.
Well, what I’m working on is a mod for STALKER Anomaly, and most large models already seem to have good enough awareness of the STALKER games’ setting. I can imagine it’s a much bigger challenge if you’re making your own game set in your own unique world. I still need to insert some minor game information into the prompt, but only about a paragraph detailing some important game mechanics.
Getting longer-term interactions to work right is actually what I’ve been working on the last few weeks: implementing long-term memory for game characters, using LLM calls to condense raw events into summaries that can be fed back into future prompts to retain context. The basics of this system were actually already in place, created by the original mod author; I just expanded it into a full-on hierarchical memory system with long- and mid-term memories.
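For anyone curious, here’s the rough shape of what I mean. This is just a sketch, not the actual mod code (which lives in the game’s own scripting layer), and the names are made up; `summarize()` is a stub standing in for the real LLM call that condenses events into a summary:

```python
# Hierarchical NPC memory sketch: raw events get condensed into mid-term
# summaries, and mid-term summaries get condensed again into long-term ones.

def summarize(items):
    # Placeholder for an LLM call that condenses several items into one line.
    return "Summary: " + "; ".join(items)

class NPCMemory:
    def __init__(self, mid_capacity=3, long_capacity=3):
        self.recent = []        # raw events from the current session
        self.mid_term = []      # condensed summaries of recent event batches
        self.long_term = []     # summaries of mid-term summaries
        self.mid_capacity = mid_capacity
        self.long_capacity = long_capacity

    def record(self, event):
        self.recent.append(event)
        if len(self.recent) >= self.mid_capacity:
            # A full batch of raw events becomes one mid-term memory.
            self.mid_term.append(summarize(self.recent))
            self.recent.clear()
        if len(self.mid_term) >= self.long_capacity:
            # Mid-term memories get condensed again into a long-term memory.
            self.long_term.append(summarize(self.mid_term))
            self.mid_term.clear()

    def context(self):
        # Text block to prepend to the NPC's prompt, oldest tier first.
        return "\n".join(self.long_term + self.mid_term + self.recent)

mem = NPCMemory()
for e in ["met stranger", "traded ammo", "heard gunfire",
          "fled anomaly", "found stash", "reached camp"]:
    mem.record(e)
print(mem.context())
```

The capacities are arbitrary here; in practice you’d tune when condensation triggers (per in-game day, per conversation, etc.) and the summarization prompt itself does most of the heavy lifting.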
But it turns out that creating and refining the LLM prompts for memory management is harder than implementing the memory system itself!