A downloadable game for Windows, macOS, and Linux

AI Content Disclosure

This game includes AI-generated content. The bundled local LLM enables unscripted dialogue with all characters. Most of the music on the background radio feed is also AI-generated via SunoAI.

About

You are thrown into the midst of a murder investigation in a local grocery store. It's your task to interview 5 odd customers, one of whom is the murderer. Gather clues through your conversations and unravel the mystery to bring the killer to justice. All murder elements and roles are randomly distributed on each run.
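
To illustrate the randomized setup, here is a minimal sketch of how roles and murder elements could be shuffled each run; the customer names, weapons, and locations below are made-up placeholders, not the game's actual data or code.

```python
import random

CUSTOMERS = ["customer_1", "customer_2", "customer_3", "customer_4", "customer_5"]
WEAPONS = ["box cutter", "frozen leg of lamb", "price gun"]   # placeholder items
LOCATIONS = ["aisle 4", "loading dock", "deli counter"]       # placeholder places

def new_case(seed=None):
    rng = random.Random(seed)
    murderer = rng.choice(CUSTOMERS)
    return {
        "murderer": murderer,
        "weapon": rng.choice(WEAPONS),
        "location": rng.choice(LOCATIONS),
        # everyone else is an innocent witness holding partial clues
        "witnesses": [c for c in CUSTOMERS if c != murderer],
    }

print(new_case())
```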

Controls

  • WASD - Move
  • LSHIFT - Sprint
  • SPACE - Jump
  • E - Notebook
  • F - Interact
  • MOUSEWHEEL - Scroll chat
  • ESC - Close

Tips

  • Before posing questions to characters, experiment with shifting their mood: adjust your tone (sentiment) and bring up topics or words they may favor or dislike
  • Their mood influences how much detail they reveal in their responses (see the sketch after this list)
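
As a rough sketch of how such a mood system might work (the word lists, thresholds, and function names here are assumptions for illustration, not the game's actual implementation): the player's tone nudges a mood score, and that score decides how forthcoming the character's system prompt tells it to be.

```python
FRIENDLY_WORDS = {"please", "thanks", "sorry", "lovely"}   # assumed trigger words
HOSTILE_WORDS = {"liar", "stupid", "murderer"}

def update_mood(mood: float, player_line: str) -> float:
    words = set(player_line.lower().split())
    mood += 0.1 * len(words & FRIENDLY_WORDS)
    mood -= 0.2 * len(words & HOSTILE_WORDS)
    return max(-1.0, min(1.0, mood))

def detail_instruction(mood: float) -> str:
    # folded into the character's system prompt before the LLM is queried
    if mood > 0.3:
        return "Answer openly and volunteer one extra detail you noticed."
    if mood < -0.3:
        return "Answer curtly and withhold details."
    return "Answer normally."

mood = update_mood(0.0, "Please, I'm sorry to bother you")
print(mood, detail_instruction(mood))
```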

Development

Made in 3 weeks in Game Design at HTW Berlin by ROADEDLICH

Honourable mentions:

Third-party tools & content

  • LLMUnity by undreamai (Local LLM integration server)
  • ChatML by OpenAI (LLM chat prompt format; see the example after this list)
  • OpenHermes 2.5 Mistral 7B by Teknium, GGUF by TheBloke
  • Includes AI music from SunoAI
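
For reference, ChatML wraps each message in role-tagged markers, and OpenHermes 2.5 Mistral 7B expects its prompts in this layout. The sketch below only shows the wire format with placeholder messages; in the game, the prompt assembly is presumably handled by LLMUnity.

```python
def to_chatml(system: str, turns: list[tuple[str, str]]) -> str:
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for role, content in turns:                 # role is "user" or "assistant"
        parts.append(f"<|im_start|>{role}\n{content}<|im_end|>")
    parts.append("<|im_start|>assistant\n")     # the model continues from here
    return "\n".join(parts)

print(to_chatml(
    "You are a nervous shopper being questioned about a murder.",
    [("user", "Where were you when it happened?")],
))
```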

Known bugs/oddities

  • Low-end "potato" PCs can cause unpredictable behavior and bugs in the LLM, as well as generally slow output speed (see our system requirements)
  • Sometimes the server/LLM has hiccups that make it respond with the same output repeatedly (it's not game-breaking: you can simply talk to another character, and sometimes it resolves itself after a few more player inputs)
  • Sometimes characters may seem stuck on "Mmh" (a buffering placeholder) at the start or middle of their output. This resolves itself if you wait a moment or simply start a conversation with another character. It is actually intentional: it covers "LLM hallucinations" that get detected by a filter I developed (see the sketch after this list). This filtering process will be made unnoticeable at some point
  • Very rarely the server can crash, causing all characters to be stuck permanently on "Mmh". Please let us know if that happens to you!
  • Let us know of any more bugs you might have encountered!
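
Purely as a speculative sketch of the buffering/filter behavior described above (the heuristics and names here are guesses for illustration, not the game's actual filter): stream the model's output in chunks, and whenever a chunk looks like a hallucination, cover it with the "Mmh" filler instead of showing it.

```python
FILLER = "Mmh "

def looks_like_hallucination(chunk: str) -> bool:
    # example heuristics: leaked ChatML tags or the model breaking character
    return "<|im_start|>" in chunk or "as an ai" in chunk.lower()

def filter_stream(chunks):
    for chunk in chunks:
        yield FILLER if looks_like_hallucination(chunk) else chunk

for piece in filter_stream(["I was ", "<|im_start|>system nonsense", "near aisle 4."]):
    print(piece, end="")
print()
```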

System Requirements

  • Because the game runs a local LLM in the background, a modern setup with a dedicated GPU is required
  • Tested extensively on a high-end desktop with an RTX 4090 and i9-13900K (highest output speed)
  • Tested extensively on a low-end laptop with a GTX 1050 Ti Mobile and i7-7700HQ (low but tolerable output speed)
  • Any setup between those two specifications will most likely work

Download

  • murder-in-aisle-4-windows-x64.zip (4.1 GB) - Version 0.4.12, 35 days ago
  • murder-in-aisle-4-macos-universal.zip (4.1 GB) - Version 0.4.12, 34 days ago
  • murder-in-aisle-4-linux-x86_64.zip (4.1 GB) - Version 0.4.12, 34 days ago

Install instructions

Check the system requirements before downloading!

The uncompressed folder is about 4.6 GB; most of that is the local LLM, which is about 4.1 GB. Starting the game can take a short moment, because the background server is set to boot synchronously rather than asynchronously.
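
Here is a generic sketch of what a blocking ("synchronous") boot like that looks like; this is not LLMUnity's actual startup code, and the server binary, port, and /health endpoint are assumptions about a typical llama.cpp-style setup.

```python
import subprocess, time, urllib.error, urllib.request

def start_server_blocking(cmd=("llama-server", "-m", "model.gguf", "--port", "8080"),
                          health_url="http://127.0.0.1:8080/health",
                          timeout_s=120):
    proc = subprocess.Popen(cmd)
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            urllib.request.urlopen(health_url, timeout=1)
            return proc               # server is up, the game can finish loading
        except (urllib.error.URLError, OSError):
            time.sleep(0.5)           # this polling wait is the startup delay you notice
    proc.kill()
    raise RuntimeError("local LLM server did not come up in time")
```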

Comments



The AI doesn't respond to me.

It's stuck on "mmh".

(+1)

Can you provide some more details regarding your system? Which build did you download, and what are your PC specs? Has it worked at least once, or never?

(+2)

Of course. I downloaded the Windows version yesterday. My specs are unfortunately poor, and I think that's why; the game itself ran smoothly for me, I just struggled to get it to open.

My specs are: MX130, Intel Core i5-10210U, 8 GB RAM.
It didn't work with any of the characters at all.
But thank you anyway, the game is very nice.

(+2)

Okay, the latest patch, 0.4.4, might be a fix for your problem (being stuck on "Mmh").

Thank you for your efforts, but unfortunately it is still stuck

(+4)

Short but fun! Thanks again for sharing it on r/LocalLLaMA. Would love to see more games make use of local models

(+1)

Thank you for playing it! I think efforts by people like undreamai will make implementing these LLMs easier and more common.