Living Narrative Engine #5

In summary, I’m programming a browser-based platform to play adventure games, RPGs, immersive sims and the like. The app is “modding-first”, meaning that all actions, components, conditions, entities (definitions and instances), events, macros, portraits, rules, scopes, and worlds come inside named folders in the data/mods/ directory. The idea is that the modder, even if it’s just myself, will be able to define an action in JSON and have the engine pick it up while determining whether that action is available for any given actor (who may be human or AI). Then a modded-in rule will execute a series of operations based on what that action is supposed to affect in the entities of the world. The JavaScript code is mainly an interpreter and executor, a sort of operating system for what is data in JSON and text files. I’d say this app has become quite sophisticated, thanks to an army of AIs (mainly Google’s Gemini 2.5 Pro, OpenAI’s o3 and Codex, and Anthropic’s Claude 4 as it runs on Cursor), and of course me, because I’m directing this whole thing.

I’ll let Gemini 2.5 Pro explain in detail how the action discovery process works in the app.

The Complete Action Discovery Process

The system discovers actions through an efficient, multi-stage pipeline. Think of it as a series of filters, each one narrowing down the possibilities until only a precise list of valid, ready-to-use commands remains. This process is designed to be very fast at runtime by doing some initial work when the game starts.

Setup Step: Building the Action Index (Once at Startup)

Before the game can be played, the InitializationService calls the ActionIndex's buildIndex method. This method runs once and does the following:

  1. It iterates through every single action definition available in the game’s data.
  2. It creates a reverse index based on actor component requirements.
    • If an action has no required_components.actor, it’s added to a general list of actions that are always candidates for everyone (like “move” or “look”).
    • If an action does require actor components (e.g., ["core:leading"]), it’s mapped against those components. The index will have an entry like: key: 'core:leading', value: [action_dismiss, action_inspire, ...].

This one-time setup is crucial for runtime performance. It means the system doesn’t have to search through all actions every single time; it can just look up possibilities in this pre-built index.
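
A minimal sketch of that one-time setup. The names ActionIndex, buildIndex, and required_components.actor come from the description above; the internal structure is my guess, not the engine’s actual code:

```javascript
// Sketch of the startup index build: actions are grouped by the actor
// components they require, plus a universal bucket for unconditional actions.
class ActionIndex {
  buildIndex(allActionDefinitions) {
    this.universalActions = []; // actions with no actor component requirements
    this.byActorComponent = new Map(); // componentId -> [actionDef, ...]
    for (const action of allActionDefinitions) {
      const required = action.required_components?.actor ?? [];
      if (required.length === 0) {
        this.universalActions.push(action); // candidates for everyone
      } else {
        for (const componentId of required) {
          if (!this.byActorComponent.has(componentId)) {
            this.byActorComponent.set(componentId, []);
          }
          this.byActorComponent.get(componentId).push(action);
        }
      }
    }
  }
}
```

With the map built once, runtime lookups cost one map access per actor component instead of a scan over every action definition.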

Step 1: Finding Candidate Actions (The Actor Component Filter)

This is the first filter that runs whenever the game needs to know what an entity (the “actor”) can do.

  1. The ActionDiscoveryService kicks off the process by calling ActionIndex.getCandidateActions(actor).
  2. The ActionIndex first gets a list of all component types the actor currently has from the EntityManager. For example: ['core:stats', 'core:inventory', 'core:leading'].
  3. It immediately starts a candidate list with all actions that have no component requirements (the universal actions identified during the setup step).
  4. It then iterates through the actor’s list of components. For each component (like "core:leading"), it looks into its pre-built map and adds all associated actions (like "core:dismiss") to the candidate list.

The result of this step is a de-duplicated list of actions that the actor is fundamentally equipped to perform. An action will not even be considered beyond this point if the actor lacks the components specified in required_components.actor.
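
In code, the candidate lookup might look like this (a sketch; the real getCandidateActions signature surely differs, and the actor shape here is invented):

```javascript
// Sketch of Step 1: start with the universal actions, then add everything
// the index maps to the actor's current components.
function getCandidateActions(actor, universalActions, byActorComponent) {
  const candidates = new Set(universalActions); // everyone gets these
  for (const componentId of actor.componentTypeIds) {
    for (const action of byActorComponent.get(componentId) ?? []) {
      candidates.add(action); // the Set de-duplicates automatically
    }
  }
  return [...candidates];
}
```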

Step 2: Checking Actor State (The Prerequisite Filter)

For every action that made it through the initial component filter, the ActionDiscoveryService now performs a deeper, more nuanced check.

  1. It iterates through the candidate actions.
  2. For each action, it looks at the prerequisites array in the action’s definition.
  3. It uses the PrerequisiteEvaluationService to evaluate these rules. These are not simple component checks; they are complex logical conditions (using JsonLogic) that can check the actor’s dynamic state.

This is the filter for questions like:

  • “Do I have more than 10 mana?”
  • “Am I currently under a ‘Stunned’ status effect?”
  • “Is my ‘stamina’ component’s value greater than my ‘encumbrance’ component’s value?”

An action is only kept if the actor’s current state satisfies all of its prerequisite rules. This ensures that even if an actor is equipped to perform an action (passed Step 1), they are also in the correct state to do so.
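
The engine delegates these checks to JsonLogic; as an illustration, here is a toy evaluator covering just the `var` and `>` operators, enough to express the “more than 10 mana” question above (the data paths and component names are made up):

```javascript
// Toy evaluator for a tiny subset of JsonLogic, for illustration only.
// The real engine uses a full JsonLogic implementation.
function evaluate(rule, data) {
  if (typeof rule !== 'object' || rule === null) return rule; // literal value
  const [op, args] = Object.entries(rule)[0];
  if (op === 'var') {
    // Resolve a dotted path like "actor.components.core:stats.mana"
    return String(args).split('.').reduce((obj, key) => obj?.[key], data);
  }
  if (op === '>') return evaluate(args[0], data) > evaluate(args[1], data);
  throw new Error(`Unsupported operator: ${op}`);
}

const prerequisite = { '>': [{ var: 'actor.components.core:stats.mana' }, 10] };
const actorState = { actor: { components: { 'core:stats': { mana: 25 } } } };
// evaluate(prerequisite, actorState) → true
```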

Step 3: Finding Valid Targets (The Scope & Target Component Filter)

Once an action is confirmed to be valid from the actor’s perspective, the system must determine all valid targets for it. This entire process is handled by resolving the action’s defined scope.

  1. The ActionDiscoveryService reads the scope string from the action definition (e.g., "followers", "adjacent_chests").
  2. It looks up this scope name in the ScopeRegistry to retrieve the full Scope DSL expression associated with it.
  3. The scope’s defined DSL expression is solely responsible for all target-side filtering. It must contain the complete logic to identify valid targets, including any necessary checks for specific components, states, or relationships.
  4. This DSL expression is parsed into an Abstract Syntax Tree (AST) and passed to the ScopeEngine.

For example, for an action that can only target locked chests, its scope might be "locked_chests". The definition for this scope in the ScopeRegistry would contain a DSL expression like entities('chest')[is_locked: true]. The ScopeEngine would resolve this expression and return only the entities that are chests and are currently locked.

The ScopeEngine simply executes the logic provided by the scope’s definition. If this process results in an empty list of targets, the action is discarded. Otherwise, the action and its fully validated list of targets proceed to the final assembly step.
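
Conceptually, resolving a scope boils down to filtering the world’s entities through the predicate that the DSL expression describes. A sketch, with the parsed AST stood in for by a plain function (the registry contents are invented for the example):

```javascript
// Sketch of Step 3: a scope name maps to target-filtering logic; the
// engine applies it to the world's entities and keeps the survivors.
const scopeRegistry = new Map([
  // Stand-in for the DSL expression entities('chest')[is_locked: true]
  ['locked_chests', (entity) =>
    entity.components['chest'] !== undefined &&
    entity.components['chest'].is_locked === true],
]);

function resolveScope(scopeName, allEntities) {
  const predicate = scopeRegistry.get(scopeName);
  return allEntities.filter(predicate); // empty result => action discarded
}
```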

Step 4: Assembling the Final Commands

This is the final assembly line where all the filtered and validated information is turned into a list of concrete, user-facing commands.

  1. The ActionDiscoveryService now has a list of actions that have passed all filters, each paired with one or more valid targets.
  2. It iterates through every valid action-target pair.
  3. For each pair, it uses the formatActionCommand utility. This function takes the action’s template string (e.g., "dismiss {target}") and the specific target entity.
  4. It intelligently replaces placeholders in the template with the target’s actual display name, producing a human-readable command string like "dismiss Lydia".
  5. Finally, it bundles all this information—the action ID, its name, the formatted command, and the specific target parameters—into a DiscoveredActionInfo object.

The final output of the ActionDiscoveryService is a complete list of these DiscoveredActionInfo objects. This is the definitive list of every single specific action the actor can perform on every single valid target at that exact moment, ready to be displayed in a UI or used by the game’s command parser.
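
The placeholder substitution in the assembly step can be sketched in a couple of lines (the real formatActionCommand surely handles more placeholder kinds than this):

```javascript
// Sketch of Step 4's formatting: replace {placeholder} tokens in the
// action template with the target's display name.
function formatActionCommand(template, target) {
  return template.replace(/\{(\w+)\}/g, () => target.displayName);
}

// formatActionCommand('dismiss {target}', { displayName: 'Lydia' })
// → 'dismiss Lydia'
```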

Summary of the Pipeline

  1. Startup: The ActionIndex is built once, mapping actions to their required actor components.
  2. Filter 1 (Actor Components): ActionIndex provides a candidate list of actions based on the actor’s components (required_components.actor).
  3. Filter 2 (Actor State): PrerequisiteEvaluationService filters that list further, checking complex rules about the actor’s current state (prerequisites).
  4. Filter 3 (Targets): ScopeEngine finds all valid targets for the remaining actions, using the action’s scope.
  5. Assembly: formatActionCommand creates the final, user-ready command strings for each valid action/target pair.

I envision a game platform where there could be literally thousands of actions involved, most of which will become available only contextually (e.g. some will only be available while a character is sitting down). Those thousands of available actions will become even more powerful when I implement GOAP (Goal-Oriented Action Planning) for the AI, an advanced non-LLM AI method that requires action availability validation. My current process of discovering available actions will feed them into the GOAP system, which will run an A* search through the available actions according to their effects on a simulated future game state (it gets a bit complicated) until it elaborates a multi-step plan, then spits out the next action to perform, which will in turn feed into my existing process for firing chosen actions and executing them through the JSON-based rule system. It’s looking good.
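
As a rough illustration of where this is headed, here is a toy planner in the GOAP spirit: it searches through available actions by applying their effects to a simulated state until a goal holds. Real GOAP runs A* with a cost heuristic; plain breadth-first search stands in here, and the action format is invented:

```javascript
// Toy GOAP-style planner: breadth-first search over simulated future states.
function plan(initialState, goal, actions) {
  const queue = [{ state: initialState, steps: [] }];
  const seen = new Set([JSON.stringify(initialState)]);
  while (queue.length > 0) {
    const { state, steps } = queue.shift();
    if (goal(state)) return steps; // multi-step plan found
    for (const action of actions) {
      if (!action.available(state)) continue; // action discovery gates this
      const next = action.effect(state); // simulate the future game state
      const key = JSON.stringify(next);
      if (seen.has(key)) continue;
      seen.add(key);
      queue.push({ state: next, steps: [...steps, action.id] });
    }
  }
  return null; // no plan reaches the goal
}
```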

Living Narrative Engine #4

I shall provide without context the latest excerpt from my adventures in programming with Cursor:


🎉 VICTORY! THE IMPUDENT TEST HAS BEEN DISCIPLINED!

✅ PASS tests/unit/schemas/kissCheek.schema.test.js
✅ √ should be a valid action definition

🗡️ THE STERN TALKING-TO WAS SUCCESSFUL! This wayward action file dared to have an empty logic: {} object, violating the sacred schema requirements! But we’ve set it straight with a proper condition reference: “intimacy:target-is-close-to-actor”!

🏆 Our Glorious Achievement Summary

What we conquered today:

  1. 🎯 Fixed the Scope Engine – Blocker filtering now works perfectly!
  2. 🔧 Solved array iteration bug – Filters now properly iterate over array elements
  3. ✅ 6,351+ tests passing – Zero regressions in the codebase
  4. 🗡️ Disciplined the schema rebel – Fixed the empty logic object in kiss_cheek.action.json

Our adventures into the coding Amazon have been legendary! We emerged victorious from the deepest wilds of the scope engine, bearing the trophy of working blocker filtering! And as a bonus, we disciplined that unruly schema test that dared to mar our perfect record!

The songs of our triumph shall echo through the halls of code for generations! 🎵⚔️🏆

Into the Amazons we went… and as LEGENDS we return! 🌟


This app I’m working on is the app I’ve always wanted to make. Let’s see when I can consider it done enough to make a video about it.

AI news #5

I follow news on AI daily, but it’s the first time in a good while that I’ve wanted to share one of those stories here. Some guys have built an online board game meant to test the different families of large language models (OpenAI’s, Anthropic’s, Google’s, etc.) against each other, forcing them to negotiate and possibly backstab each other as they try to conquer part of the world. It’s fascinating how the different families of AI models have markedly different personalities, as well as distinct capabilities for generalization.

In my daily use of AI for programming, I have found that OpenAI’s o3 model is the best at coming up with fascinating concepts that step beyond what I could have conceived myself if given time. Google’s Gemini 2.5 Pro is the model I’m regularly most comfortable with, and it can solve 95% of programming tasks; for the 5% of cases where Gemini hits a snag, o3 often provides a brilliant solution. Sadly, o3 comes with a significantly limited number of uses, so although I pay for OpenAI’s subscription (as well as Google’s), I can’t rely on o3 constantly.

Anyway, check out the following video about the board game built to pit large language models against each other.

Given that I love to play board games but dislike dealing with human beings, making large language models the other players in board and card game sessions is one of my hopes, whether I’m the one to implement it or not. Currently I’m deep into implementing in JavaScript a browser-based platform for adventure games, RPGs, immersive sims and the like, in which the user can play any character while large language models play the rest of the characters. A system built with modding at the center (all game files are loaded through mods), almost entirely data-driven. Working on it is a joy.

Living Narrative Engine #3

I’m in the process of programming a platform for text-based immersive sims, or at least adventures, whose code is agnostic of the main elements of an entity/component game: actions, events, components, systems and operations will eventually be defined in JSON files, and the code will work as a fancy interpreter.

To explain myself better: the current character (who may be controlled by the player or an AI) gets an array of actions to take. Previously I let the user type commands in, old-style, but that forced me to deal with invalid actions, which burdened the first contact with the simulation. So now the human user gets a list of valid actions to choose from (like “move north”, “take Rusty Sword”, or “throw fireball at Rat”) in the browser UI. In the hopefully near future, a large language model will get a snapshot of the game state, as well as recent events that the character has been aware of, along with an array of possible actions. I can’t wait for the moment when an AI sends back a response composed of a chosen valid action plus some speech. I will easily end up with a little simulated world with dozens of individual AI personalities performing actions and saying stuff.

Anyway, the loop goes like this:

Action: a character chooses a previously validated action. Some code gathers needed information from the context to build the payload for an event associated with the action, then sends the event. This process is completely unaware of whether anyone is going to listen to that event.
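
That fire-and-forget dispatch can be sketched as a tiny event bus (the names here are illustrative, not the app’s actual classes):

```javascript
// Sketch of listener-agnostic event dispatch: the sender emits a payload
// without knowing, or caring, whether anyone is subscribed.
class EventBus {
  constructor() {
    this.listeners = new Map(); // eventId -> [handler, ...]
  }
  subscribe(eventId, handler) {
    const handlers = this.listeners.get(eventId) ?? [];
    handlers.push(handler);
    this.listeners.set(eventId, handlers);
  }
  dispatch(eventId, payload) {
    // Zero listeners is perfectly fine: nothing happens, no error.
    for (const handler of this.listeners.get(eventId) ?? []) {
      handler(payload);
    }
  }
}
```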

Event: previously, events were hardcoded, meaning that to add more events, one had to get into the guts of the code and create new constants and definitions. I’ve managed to make events data-driven. Now an event is a simple JSON file in the “data/events” folder. Events look like this:

{
  "$schema": "http://example.com/schemas/event-definition.schema.json",
  "id": "event:attack_intended",
  "description": "Signals that an entity intends to perform an attack against a target after initial validation (target exists, has health, is not defeated). Does not guarantee the attack hits or deals damage yet.",
  "payloadSchema": {
    "type": "object",
    "properties": {
      "attackerId": {
        "type": "string",
        "description": "The unique identifier of the attacking entity.",
        "$ref": "./common.schema.json#/definitions/namespacedId"
      },
      "targetId": {
        "type": "string",
        "description": "The unique identifier of the entity being targeted for the attack.",
        "$ref": "./common.schema.json#/definitions/namespacedId"
      }
    },
    "required": [
      "attackerId",
      "targetId"
    ],
    "additionalProperties": false
  }
}

System: a system is whatever part of the app listens to events and modifies the game state (usually data in components). Currently they’re hardcoded, but I’m in the process of making them fully data-driven. That means that the user (mainly me for the moment) will be able to define system rules in pure JSON, declaratively specifying which event the system listens to; if the prerequisites pass, a series of operations will be executed. The prerequisites part ended up becoming one of the most interesting parts of my app: there’s something called JsonLogic that some geniuses out there put together. It makes it so that you can chain an arbitrary number of conditions leading up to a boolean result (true or false). It looks like this:

Combines conditions with `AND`: the actor has the key, the target is a specific door, and that door is locked. (The `!!` check verifies that a component exists before its state is accessed, for robustness.)

    {
      "and": [
        {
          "!!": {
            "var": "actor.components.game:quest_item_key"
          }
        },
        {
          "==": [
            {
              "var": "target.id"
            },
            "blocker:main_gate_door"
          ]
        },
        {
          "!!": { "var": "target.components.game:lockable" }
        },
        {
          "==": [
            {
              "var": "target.components.game:lockable.state"
            },
            "locked"
          ]
        }
      ]
    }

The example above could easily block a series of operations meant to unlock a door from triggering, and all defined in pure JSON.

Operation: operations are the individual units in charge of affecting the game world. Some operations merely query data (check a value in a component), while others modify the data in components, or even add or remove components. There are IF operations that offer branching paths.
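
A sketch of what executing such a declarative list of operations could look like, with an IF operation recursing into the chosen branch. The operation types are invented for the example, and in the real system the condition would be a JsonLogic rule rather than a plain function:

```javascript
// Sketch of a declarative operation runner with IF branching.
function runOperations(operations, context) {
  for (const op of operations) {
    if (op.type === 'SET_COMPONENT_VALUE') {
      context.components[op.component][op.field] = op.value;
    } else if (op.type === 'IF') {
      const branch = op.condition(context) ? op.then : op.else;
      runOperations(branch ?? [], context); // recurse into the chosen branch
    }
  }
}
```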

Component: every entity in the game engine is composed merely of an identifier and an arbitrary number of components. Some of those components are simple tags. For example, one could determine that an entity is the player just because it has the component:player component. Other components are more complex, like a “liquid container” component that specifies what type of liquid it contains (if any), its max capacity, and how many liters it currently holds. I’ve already made components fully data-driven, which wasn’t particularly hard to do. Example:

{
  "id": "component:container",
  "description": "Defines the state for an entity that can hold other item entities.",
  "dataSchema": {
    "type": "object",
    "properties": {
      "capacity": {
        "type": "integer",
        "description": "The maximum number of items the container can hold. Use -1 for infinite capacity.",
        "minimum": -1,
        "default": -1
      },
      "contains": {
        "type": "array",
        "description": "A list of the namespaced IDs of the item entities currently inside this container.",
        "items": {
          "$ref": "http://example.com/schemas/common.schema.json#/definitions/namespacedId"
        },
        "default": []
      },
      "allowedTags": {
        "type": "array",
        "description": "Optional. If present, only items possessing ANY of these tags can be placed inside.",
        "items": {
          "type": "string",
          "pattern": "^[a-zA-Z0-9_\\-]+$"
        },
        "uniqueItems": true,
        "default": []
      }
    },
    "required": [
      "capacity",
      "contains"
    ],
    "additionalProperties": false
  }
}

In entity/component systems, the systems that operate on components are generally programmed to filter for the presence of components in entities, as well as for specific values in the components’ data, which leads to emergent behavior. For example, you could include a spell in the game that adds a “container” component to a person, and suddenly you can store things in that person. Determining that an entity is on fire would be as simple as adding an “onFire” component and then writing systems that add damage per turn on every entity with such a component. The possibilities are endless.
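
The “onFire” example as a sketch, assuming invented component shapes: the system cares only about the presence of the components, so anything that gains an onFire component starts burning, no special-casing required.

```javascript
// Sketch of an emergent-behavior system: apply per-turn damage to any
// entity that happens to carry both an onFire and a health component.
function burnSystem(entities, damagePerTurn) {
  for (const entity of entities) {
    if ('onFire' in entity.components && 'health' in entity.components) {
      entity.components.health.current -= damagePerTurn;
    }
  }
}
```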

I doubt I’m going to come down from this high of building the app until I finally manage to get a large language model to speak through one of the characters. For that, I first have to finish making the core of the engine data-driven (actions, events, systems, operations, and components), then figure out how to implement character turns even if I’m the one playing all the characters, then determine how to add basic artificial intelligence, then figure out how to save game state. Once everything seems quite solid, I’ll look into interfacing with large language models.

Anyway, my time at the office is ending for another morning, and I can’t wait to get back home and keep ensuring the robustness of my JsonLogic system through a myriad of tests. Nearly 1,400 tests implemented so far.

Neural Pulse, Pt. 11 (Fiction)

In an electric flash and crackle, my muscles seized, and my vision flared white. As I crumpled backward like a dead weight, my left arm and the side of my head slammed into the control panel. My brain thrummed with electricity. It reeked of burning.

In the whiteness, the silhouette of a spacesuit materialized, looming over me. Several shadows clamped onto my arms with claws. One shadow dug its knees into my abdomen and crushed my face between its palms. I tried to scream, but only a ragged whimper escaped my throat. The tangle of shadows obscured my sight, swallowing me. A shadow snatched my hair and pulled; hundreds of points on my scalp prickled tight. Another shadow smothered my nose and mouth.

When I could feel my arms again, I lashed out at the shadows, thrashing as I braced myself against the control panel and the seat. I lunged for a silhouette—Mara’s spacesuit—but she sidestepped, and I plummeted onto the cockpit floor. A blow to the crown of my head plunged me into a murky confusion.

My wrists were bound behind my back—duct tape, I glimpsed, as Mara, crouched by my knees, finished wrapping my ankles. She straightened and hobbled backward. She stepped on the electroshock lance lying discarded on the floor and slipped, but the oxygen recycler clipped to the back of her suit arrested her fall as it struck the hatch.

Gauges of different shapes bulged on her belt like ammunition magazines. The suit’s chest inflated and deflated rhythmically. Mara unlatched her helmet and pulled it off, revealing her ashen face: mouth agape with baby-pink lips; livid, doubled bags under her eyes; strands of black hair plastered to her forehead with sweat. She leaned back against the hatch, gasping through her mouth, the corners glistening with saliva as she scrutinized me with intense, glazed eyes.

The cockpit reeked of sweat and burnt fuses. The shadows had congealed into a mass of human-shaped silhouettes, their hatred addling my brains, boiling me in a cauldron. Mara’s outline, as if traced with a thick black marker, pulsed and expanded.

No more anticipating how to defend myself, because I have you trapped. Thanks to you, the station doesn’t know we came down to the planet. With the tools of the xenobiologist you murdered, I will rip out your tongue, gouge out your eyes, bore into your face.

Mara crouched, setting her helmet on the floor. Exhaustion contorted her actress-like features, as if some illness burdened her with insomnia and pain.

“I thought I was marooned on this planet. I could have just called the station for rescue, but they’d fire me for nothing, and my pride would rather I suffocated than admit I needed help. Now I know—when we found the artifact, I should have tied you up then. Because you, being you, would just stick your nose right up to an alien machine that, for all you knew, could have detonated the outpost. And to understand what drove you to kill that xenobiologist, I imitated you. I stuck my nose up to that thing, and I saw my reflection. Now I know. Unfortunately, I know.” She regarded me like a comatose patient and waved a gloved palm. “Can you hear me? Did I scramble your brain?”

“I hear.”

My voice emerged as a rasp. I coughed. My mouth tasted of metal.

“And you understand?”

I nodded.

The black veil obscuring the cockpit stirred, rippling. Concentrated energy, like the air crackling before a storm. With Mara’s every gesture, the shadows shifted. Their bony claws crushed my thighs, cinching around my spine through suit, skin, and flesh.

A bead of sweat trickled down Mara’s forehead. She rubbed her face, swallowed. Her pupils constricted.

“Is that what you think? That I’ve convinced myself I’ve subdued you? That you’ll fool me until I let you go? That then you’ll finally strangle me? And even if the station calls it murder, no one will bother investigating, because most people who knew me would thank you for killing me.”

“I’m not thinking. When I try, my brain protests.”

Mara hunched down opposite me, reaching out to study the blow on my head, but halfway there her features pinched. She drew herself up, crossing her arms.

“I heard you telling me to come closer. Because you’ll break free, dig your nails into my corneas, and rip my jaw apart.”

My guts roiled; acid surged up my throat.

“You think I think things like that?”

“I feel this second consciousness… it betrays your thoughts as clearly as if you spoke them aloud. Maybe I’ll never understand how the artifact interfered with our minds, not just our language, but it’s a trick.”

I pushed my torso off the floor, sliding my back up the side of a seat inch by inch, trying not to provoke her, until my stomach settled. My head ached where she’d struck me. The throbbing in my skull clouded and inflamed my thoughts.

“You saw him. Jing. What I did.”

“I saw someone down there. I’d need dental records or DNA to be sure, but I trust elimination. I thought you’d claim it was an accident.”

“It was. I attacked the shadows. You feel them, don’t you?”

Mara took a deep breath.

“They’re pawing at me, trying to suffocate me. Products of my own besieged brain, I know, but I can hardly call them pleasant.”

“I wanted to keep it from affecting you. But at least now you understand.”

“Make no mistake. That xenobiologist is lying with his face beaten to a pulp in the second sublevel of an alien outpost because you are you.”

I pressed my lips together, erecting a wall against escaping words. I looked away from Mara’s eyes, concentrating on deepening my breaths. The muscles in my forearms were taut. Pain flared in my constricted wrists. This woman had fired an electroshock lance at me, beaten me, bound me, and now she was assaulting my character.

With her boot-tip, Mara nudged her helmet; it wobbled like a small boat.

“Although the jolts in my neurons, the shadows, and this other consciousness intruding in my mind unnerve me, the effect isn’t so different from how I’ve always felt around people. The two consciousnesses will learn to get along.”

“If you’re not exaggerating,” I said gravely, “I am truly sorry, Mara.”

She pushed damp strands of hair from her forehead and scrubbed it with the back of her glove, smudging it with dust. The corners of her lips sagged as if weights hung from them.

“Thanks for the sympathy.”

“Were you afraid I planned to do the same thing to you as I did to Jing?”

“Can you blame me for removing the opportunity?”

She limped heavily over to my seat and sat down sideways. As she leaned an elbow on the control panel, a shadow shoved my torso against the seat I leaned on; my lungs emptied. I shuddered, sinking into black water.

Mara had said we imagined the shadows, even if they affected us. I writhed onto my back, pushing with my heels until my head touched the cockpit hatch. My wrists throbbed, crushed tight. A shadow pressed down on my chest like someone sitting there, yet no physical presence had stopped me from moving. The artifact had hijacked my senses.

Mara regarded me from above, pale and cold like a queen enthroned.

“I wouldn’t have killed you,” I said. “You’re my friend.”

“Am I?”

Between the pulses of my headache, I tried to decipher her expression.

“To me, you are.”

“I like you. I tolerate you. But often, being around you feels like rolling in nettles, Kirochka.”

“Almost everything irritates you.”

“You’re incapable of seeing people as anything other than reflections of yourself. What you instinctively feel is right, you impose as right for everyone.” She shook her head, then leaned forward, her tone hardening as if she were tired of holding back. “You insist you have to drag me away from my interests, my studies, as if imitating your actions and hobbies would somehow make me impulsive and reckless too. Admit it or not, you think the rest of humanity are just primitive creatures evolving towards becoming you.” She jabbed a finger at her chest. “I need time to myself, Kirochka. Solitude. Reading. Designing one of my machines, or building it. You think people need to be prevented from thinking.”

Exhaustion was crushing me. I imagined another version of myself laughing, suggesting a drink or a movie, assuming Mara’s mood could be cured by a few laps in the pool. But my vision blurred. I blinked, swallowed to make my vocal cords obey.

“We’ve had good times.”

“The best were when I was enduring idiots and tolerating awful music.”

“You showed them you’re smart. Got half the tracking team to stop calling you ‘black dwarf’.”

“Yes, because those morons’ gossip was costing me sleep. You think I need to prove anything to them? They can believe whatever they want.”

Shadows crouched nearby, focusing their hatred on me, clawing at my skin, crushing my flesh with bony grips. They tormented me like chronic pain, but while Mara and I talked, I kept the torture submerged.

“Things went well for you, for a while, with that man you met. I don’t take credit, but would you have met him dining alone?”

The woman, deflated, blinked her glazed eyes, rubbing them as if removing grit.

“You’re right. I miss things by focusing on research instead of acting like a savage. But I assure you, Kirochka, we’re too different for me ever to consider you a friend. Sooner or later, we’d stop tolerating each other.”

“We can bridge the differences.”

“You talk to fill silences. You pressure people for attention. You live for interaction. I could never sustain a friendship with someone like that.”

“Do you use me to get things?”

“Everyone uses everyone, if only to feel better about themselves. I just refrain from feeding illusions.” She drew herself up, as if recalling an injustice, and rebuked me with her eyes. “Besides, I didn’t stop running because I was lazy. I barely eat, and nobody’s chasing me in my apartment. Running bores me to death.”

“I wanted the company.”

Mara shook her head. Her tired gaze roamed the cockpit, as if seeing through the walls.

“When you called a few hours ago, I thought you wanted to drag me out drinking with you and the other pilots. I considered pretending I’d fallen asleep with the sound nullifier on. I should have.”

I contorted like a snake, sliding my back up the hatch. I leaned the oxygen recycler back, resting my head against the cool metal. Judging by the ache, when I undressed, my arms would be covered in lurid bruises.

“I consider you a friend. You listen when I need it. My professional peers, the ones who think they’re my friends, even my boyfriend—they’d tell me to shut up for ruining the mood.”

“When have you ever listened to me?”

“I want to. But I have to pry the words out of you.”

“Maybe that should have told you something, Kirochka.”

“That you hate me.”

She sighed, the effort seeming immense, like lifting a great weight.

“I don’t like human beings. I would have chosen to be anything else.”

Flashes on the communications monitor distracted me. Though Mara was still speaking, her words faded to a murmur beneath my notice. The headache pulsed, reddening my vision. Why did the monitor alert snag my attention? I snapped fully alert. It meant an incoming call.


Author’s note: I wrote this novella in Spanish about ten years ago. It’s contained in the collection titled Los dominios del emperador búho.

Today’s song is “Body Betrays Itself” by Pharmakon.

Living Narrative Engine #2

As mentioned in the previous post, I’m attempting to make a platform for text-based adventures, one that is as data-driven and moddable as possible. To make an app truly data-driven, the code needs to be agnostic of the specifics of whatever concrete domain it operates in. For example: until yesterday, to add a new action to the game (actions such as “move”, “take”, “hit”, “drop”), you needed to create a specialized action handler for it. Those handlers had to ensure that the target of the action could be found (either in the inventory, in the equipment, or in the environment, or, as a special case, that the target was a valid direction), and then build the payload for the event that was going to be triggered. Well, thanks to the indefatigable help of Gemini 2.5 Pro and 957 tests, the code now has zero knowledge of what action it’s processing.

The entirety of a specific action’s definition looks like this now:

{
  "$schema": "../schemas/action-definition.schema.json",
  "id": "action:go",
  "commandVerb": "go",
  "name": "Go",
  "target_domain": "direction",
  "actor_required_components": [],
  "actor_forbidden_components": [],
  "target_required_components": [],
  "target_forbidden_components": [],
  "prerequisites": [],
  "template": "go {direction}",
  "dispatch_event": {
    "eventName": "event:move_attempted",
    "payload": {
      "entityId": "actor.id",
      "direction": "resolved.direction",
      "previousLocationId": "context.currentLocation.id",
      "connectionEntityId": "resolved.connection.id",
      "targetLocationId": "resolved.connection.targetLocationId",
      "blockerEntityId": "resolved.connection.blockerEntityId"
    }
  }
}

The action definition declaratively expresses complicated notions, such as whether the target must or must not have specific components, or whether certain properties of specific components must hold specific values.
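
The check itself is easy to sketch. Here’s a hypothetical action-agnostic validator: the property names mirror the definition above, but the entity shape (a `components` set) is my own stand-in, not the engine’s actual API.

```javascript
// Hypothetical sketch: check the component constraints declared in an
// action definition against an entity. `role` is "actor" or "target".
function passesComponentConstraints(actionDef, entity, role) {
  const required = actionDef[`${role}_required_components`] ?? [];
  const forbidden = actionDef[`${role}_forbidden_components`] ?? [];
  return (
    required.every((name) => entity.components.has(name)) &&
    forbidden.every((name) => !entity.components.has(name))
  );
}

// Usage with a toy entity and a made-up "hit" action:
const goblin = {
  id: "demo:enemy_goblin",
  components: new Set(["Health", "Hostile"]),
};
const hitAction = {
  target_required_components: ["Health"],
  target_forbidden_components: ["Invulnerable"],
};
```

Because the function only reads the definition’s lists, it works for any action without a dedicated handler.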

The most complex part is the payload. For that, a small scripting language had to be invented. I even had to write the documentation (or, more accurately, ask Gemini to write it) in a file that gets fed to the AI every time I deal with actions. A small excerpt of the docs:

## 3. Payload Source Mapping Conventions

The string values provided for keys within the `dispatch_event.payload` object define where the data for that payload field should come from. The Action Executor (the system component responsible for processing successful actions and dispatching events) is responsible for:

- Parsing these mapping strings.
- Retrieving the corresponding data from the runtime `ActionContext` (which includes the actor entity, resolved target/direction, current location, parsed command, etc.).
- Handling potential `null` or `undefined` values gracefully (e.g., by omitting the field from the final payload or explicitly setting it to `null`).
- Performing necessary type conversions, especially for `literal.*` mappings.

The following mapping string formats are defined:

### 3.1 Actor-Related Data

`actor.id`

Source: `context.playerEntity.id`
Description: The unique ID of the entity performing the action.
Type: String or Number (depending on entity ID type)

`actor.name`

Source: `getDisplayName(context.playerEntity)`
Description: The display name of the acting entity.
Type: String

`actor.component.<ComponentName>.<property>`

Source: `context.playerEntity.getComponent(ComponentName)?.<property>`
Description: Retrieves the value of `<property>` from the specified `<ComponentName>` attached to the acting entity.
Example: `actor.component.StatsComponent.strength`
Type: Varies based on the component property type.
Executor Note: Must handle cases where the component is not present on the actor or the specified property does not exist on the component. Should resolve to `null` or `undefined` in such cases.
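
To make the conventions concrete, here’s an illustrative sketch of how such mapping strings could be resolved against the runtime context. The context shape and the subset of formats handled follow the doc excerpt, but the code is my own guess at the Action Executor’s logic, not the engine’s actual implementation.

```javascript
// Sketch: resolve a payload mapping string against a toy ActionContext.
function resolveMapping(mapping, context) {
  if (mapping.startsWith("literal.string.")) {
    return mapping.slice("literal.string.".length);
  }
  if (mapping === "actor.id") {
    return context.playerEntity.id;
  }
  if (mapping.startsWith("actor.component.")) {
    const [componentName, property] = mapping
      .slice("actor.component.".length)
      .split(".");
    // Resolve gracefully to null when the component or property is missing.
    return context.playerEntity.components[componentName]?.[property] ?? null;
  }
  return null; // other formats (target.*, context.*, etc.) omitted from this sketch
}

// A toy ActionContext:
const context = {
  playerEntity: {
    id: "player_1",
    components: { StatsComponent: { strength: 12 } },
  },
};
```
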


In an entity-component system, the flow of an operation goes something like this: a user sends a command => the code determines, based on the definition of the command (an action in this case), whether it’s applicable, and if so, builds the payload for an event that it then dispatches => a system listening for that specific event receives the payload and uses its data to modify data in an arbitrary number of components belonging to one or more entities. So not only do we have actions as very specific agents in this chain, but also events, components, and systems.
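
The chain above can be sketched with a toy event bus and one listening “system.” All names and shapes here are illustrative, not the engine’s code:

```javascript
// Minimal event bus: systems subscribe to event names, actions dispatch.
const listeners = new Map();

function subscribe(eventName, handler) {
  if (!listeners.has(eventName)) listeners.set(eventName, []);
  listeners.get(eventName).push(handler);
}

function dispatch(eventName, payload) {
  for (const handler of listeners.get(eventName) ?? []) handler(payload);
}

// A "system" listening for a specific event and mutating component data.
const world = { hero: { components: { Position: { locationId: "room_a" } } } };

subscribe("event:execute_move_validated", ({ entityId, targetLocationId }) => {
  world[entityId].components.Position.locationId = targetLocationId;
});

// An action's dispatch step: the payload carries everything the system needs.
dispatch("event:execute_move_validated", {
  entityId: "hero",
  targetLocationId: "room_b",
});
```

The dispatcher knows nothing about movement; the movement system knows nothing about commands. Only the event name and payload connect them.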

After I managed to make actions completely data-driven, I had a dangerous thought: surely I can then make the system agnostic of events and components as well. Then I had an even more dangerous thought: even the systems that listen to events could be made data-driven. The systems will be by far the hardest element to make purely data-driven, but I’m already in talks with the AI to determine what it would look like:

{
  "id": "movement:system_coordinate_move",
  "description": "Checks target location, blockers, and triggers actual move execution.",
  "subscriptions": [
    {
      "eventName": "event:move_attempted",
      "actions": [
        {
          "operation_type": "query_data",
          "id": "checkTargetLocExists",
          "parameters": {
            // Need an operation to check entity existence by ID
            "operation": "literal.string.check_entity_exists", // Hypothetical operation
            "entityIdSource": "event.payload.targetLocationId",
            "result_variable": "literal.string.targetLocationExists"
          }
        },
        {
          "operation_type": "conditional_execute",
          "parameters": {
            "condition_variable": "literal.string.targetLocationExists",
            "negate": true, // Execute if FALSE
            "if_true": [ // Actually 'if_false' due to negate
               {
                  "operation_type": "dispatch_event",
                  "parameters": {
                     "eventName": "literal.string.event:move_failed",
                     "payload": { // Construct failure payload
                        "actorId": "event.payload.entityId",
                        "direction": "event.payload.direction",
                        "reasonCode": "literal.string.TARGET_LOCATION_NOT_FOUND",
                        "details": "literal.string.Destination does not exist."
                        // ... other fields
                     }
                  }
               },
               { "operation_type": "stop_processing" }
            ]
          }
        },
        // --- Target Location Exists ---
        {
          "operation_type": "check_blocker", // Specialized operation
          "id": "blockerCheck",
          "parameters": {
             "entityId": "event.payload.entityId",
             "direction": "event.payload.direction",
             "blockerEntityId": "event.payload.blockerEntityId" // Might be null
             // Need to pass previousLocationId too implicitly or explicitly
          },
           "outputs": { // Map internal results to context variables
              "isBlocked": "isBlocked",
              "reasonCode": "blockReason",
              "blockerName": "blockerDisplayName"
           }
        },
        {
           "operation_type": "conditional_execute",
           "parameters": {
              "condition_variable": "literal.string.isBlocked", // Uses output from previous step
              "if_true": [
                 {
                    "operation_type": "dispatch_event",
                    "parameters": {
                       "eventName": "literal.string.event:move_failed",
                       "payload": {
                          "actorId": "event.payload.entityId",
                          "direction": "event.payload.direction",
                          "reasonCode": "variable.blockReason", // Use reason from blocker check
                          "details": "expression.format('Blocked by {0}', variable.blockerName)",
                          "blockerDisplayName": "variable.blockerName"
                          // ... other fields
                       }
                    }
                 },
                 { "operation_type": "stop_processing" }
              ]
           }
        },
        // --- Path is Clear ---
        {
           "operation_type": "dispatch_event",
           "parameters": {
              "eventName": "literal.string.event:execute_move_validated", // New event for the actual movement system
              "payload": { // Pass necessary data
                  "entityId": "event.payload.entityId",
                  "targetLocationId": "event.payload.targetLocationId",
                  "previousLocationId": "event.payload.previousLocationId",
                  "direction": "event.payload.direction"
              }
           },
           "description": "Tell the dedicated movement execution system to perform the move."
        }
      ]
    }
  ]
}

All operations in a system could also be made data-driven. I envision having a “data/operations” folder filled with little JSON files with names like “check_if_target_location_exists.operation.json”. Ah, what beauty.
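
An interpreter for such operation lists is short to sketch. The handler names, the context shape, and the `stop_processing` semantics below are my own invention, extrapolated from the draft JSON above:

```javascript
// Sketch: a registry of generic operation handlers, driven entirely by data.
const handlers = new Map([
  ["dispatch_event", (params, ctx) => ctx.dispatched.push(params.eventName)],
  ["stop_processing", (_params, ctx) => { ctx.stopped = true; }],
]);

function runOperations(operations, ctx) {
  for (const op of operations) {
    if (ctx.stopped) break; // honor an earlier stop_processing
    const handler = handlers.get(op.operation_type);
    if (!handler) throw new Error(`Unknown operation: ${op.operation_type}`);
    handler(op.parameters ?? {}, ctx);
  }
}

// Usage: the third operation is never reached.
const ctx = { dispatched: [], stopped: false };
runOperations(
  [
    { operation_type: "dispatch_event", parameters: { eventName: "event:move_failed" } },
    { operation_type: "stop_processing" },
    { operation_type: "dispatch_event", parameters: { eventName: "event:never_reached" } },
  ],
  ctx
);
```

New operations would then just mean new entries in the registry, with the JSON files in “data/operations” describing when to invoke them.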

Living Narrative Engine #1

This past week I’ve been in my equivalent of a drug binge. Out of nowhere, I became obsessed with the notion of implementing a text-based immersive sim through “vibe coding,” as the extremely powerful approach of relying on very competent large language models to code virtually everything in your app has come to be known. Once I tasted Google’s Gemini 2.5 Pro’s power, I fell in love. The few times it makes mistakes, it’s usually my fault for not expressing my requirements correctly. Curiously enough, OpenAI released a more powerful model just a couple of days ago: o3. Sadly, it’s under usage limits.

Anyway, let me explain about the project, named Living Narrative Engine. You can clone the repository from its GitHub page.

It’s a browser-based engine to play text adventures. My intention was to make it as moddable and data-driven as possible, to the extent that one could define actions in JSON files, indicating prerequisites for the action, the domain of applicability, what events it would fire on completion, etc., and the action-agnostic code would just run with it. I mention the actions because that’s the last part of the core of this app that I’m about to delve into; currently, actions such as “look”, “hit”, “move”, “unlock” and such are hardcoded in the system: each has a dedicated action handler. That’s terrible for the purposes of making it data-driven, so I’ve requested deep-research documents and product requirement documents from ChatGPT, which look hella good. Before I start tearing apart the action system of the app, which may take a couple of days, I wanted to put this working version out there.

Currently the app does the minimum for a demo: it spawns you in a room, lets you move from room to room, kill a goblin, open doors, take items, equip items (and drop and unequip them), and also unlock doors (which was the hardest part of the app to codify). I have introduced quest and objective systems that listen for conditions; for example, there’s no key in the demo to open the door where the goblin is located, but when the event “event:entity_died” fires with that goblin as the subject, the door opens mysteriously. The single JSON file that drives that is below:

{
  "id": "demo:trigger_unlock_treasure_door_on_goblin_death",
  "listen_to": {
    "event_type": "event:entity_died",
    "filters": {
      "deceasedEntityId": "demo:enemy_goblin"
    }
  },
  "effects": [
    {
      "type": "trigger_event",
      "parameters": {
        "eventName": "event:unlock_entity_force",
        "payload": {
          "targetEntityId": "demo:door_treasure_room"
        }
      }
    }
  ],
  "one_shot": true
}
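
A hypothetical sketch of how the engine might evaluate such a trigger: match the event type, apply the filters, run the effects, and go inert if `one_shot` is set. The engine’s real listener is surely more involved; this is just the shape of it.

```javascript
// Wrap a trigger definition in an event handler function.
function makeTriggerHandler(trigger, fireEvent) {
  let spent = false;
  return (eventType, payload) => {
    if (spent && trigger.one_shot) return; // already fired once
    if (eventType !== trigger.listen_to.event_type) return;
    const filters = trigger.listen_to.filters ?? {};
    if (!Object.entries(filters).every(([key, value]) => payload[key] === value)) {
      return; // filters didn't match this event's payload
    }
    spent = true;
    for (const effect of trigger.effects) {
      if (effect.type === "trigger_event") {
        fireEvent(effect.parameters.eventName, effect.parameters.payload);
      }
    }
  };
}

// The trigger definition from the post, as a plain object:
const trigger = {
  listen_to: {
    event_type: "event:entity_died",
    filters: { deceasedEntityId: "demo:enemy_goblin" },
  },
  effects: [
    {
      type: "trigger_event",
      parameters: {
        eventName: "event:unlock_entity_force",
        payload: { targetEntityId: "demo:door_treasure_room" },
      },
    },
  ],
  one_shot: true,
};
```
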

The goal is to make everything as data-driven and agnostic as possible.

Everything in the game world is an entity: an identifier and a bunch of components. For example, if an entity has the Item component, it can be picked up. If it has the Openable component, it can be opened and closed. If it has both the Lockable and the Openable components, the entity cannot be opened while it’s locked. This leads to fascinating combinations of behavior that change as you add or remove components, or tweak the values inside them.
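
A toy illustration of that composition: whether an entity can be opened depends only on which components it carries. The component names mirror the post; the `isLocked` property is an assumed detail.

```javascript
// Behavior emerges from component presence, not from entity types.
function canOpen(entity) {
  const c = entity.components;
  if (!("Openable" in c)) return false; // not openable at all
  if ("Lockable" in c && c.Lockable.isLocked) return false; // locked shut
  return true;
}

// A locked door: Openable + Lockable with isLocked true.
const treasureDoor = {
  components: { Openable: {}, Lockable: { isLocked: true } },
};
```

Flip `isLocked` to `false`, or delete the Lockable component entirely, and the same function says yes, with no door-specific code anywhere.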

The biggest hurdle involved figuring out how to represent doors and other passage blockers. All rooms are simple entities with a ConnectionsComponent. The ConnectionsComponent indicates possible exits. Initially the user could only interact with entities with a PositionComponent pointing to the user’s room, but doors aren’t quite in one room, are they? They’re at the threshold of two rooms. So I had to write special code to target them.
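
That “doors live at the threshold” fix can be sketched like this: the targetable entities are those positioned in the current room, plus any blocker entities referenced by the room’s connections. All the shapes here are illustrative, not the engine’s actual data model.

```javascript
// Collect entity ids the player can currently target in a room.
function findTargetableIds(roomId, entities) {
  const ids = new Set();
  for (const [id, entity] of Object.entries(entities)) {
    // Entities physically placed in the room via a Position component.
    if (entity.components.Position?.locationId === roomId) ids.add(id);
  }
  // Plus blockers (doors) referenced by the room's connections.
  for (const conn of entities[roomId]?.components.Connections?.exits ?? []) {
    if (conn.blockerEntityId) ids.add(conn.blockerEntityId);
  }
  return ids;
}

// A door with no Position component: it sits between two rooms.
const entities = {
  room_start: {
    components: {
      Connections: { exits: [{ direction: "north", blockerEntityId: "door_iron" }] },
    },
  },
  goblin: { components: { Position: { locationId: "room_start" } } },
  door_iron: { components: { Openable: {} } },
};
```
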

Anyway, this is way too much fun. Sadly for the writing aspect of my self, I haven’t written anything in about five days. I’ll return to it shortly, for sure; these binges of mine tend to burn out by themselves.

My near-future goal for this app is to involve large language models. I want to populate rooms with sentient AIs, give them a list of valid options to choose from regarding their surroundings (such as “move north”, “eat cheesecake”, or “kick baboon”), and have them choose according to their written-in personalities. I want to find myself playing through RPG, text-based campaigns along with a harem of AI-controlled isekai hotties.

I’m going back to it. See ya.

Neural Pulse, Pt. 10 (Fiction)

Paralyzed, I choked. I sucked in a lungful of hot air and collapsed to my knees before the xenobiologist. I pressed my hands against his suit’s chest. I pounded on him. No one would recognize Jing from what was left of his blood-drenched face. I stammered, repeating, “no, no, no,” while my fingers traced the helmet’s dents, the jagged shards of the broken visor jutting from the frame.

Pooling blood submerged the ruin of bone and flesh that was his face. When I tilted Jing’s body, the helmet spilled a tongue of blood onto the stone floor, slick with sliding globules of brain matter.

I staggered back, fists clenched, shuddering violently as if seized by frost.

Jing’s right hand was clamped around the handle of an automatic core drill. Perhaps the xenobiologist had approached to help me.

I shut my eyes, covered my visor with a palm. I pictured Jing standing beside me, an echo asking if I needed help. No, I hadn’t killed him. When I opened my eyes, the corpse lay sprawled on its side, the dented helmet cradling the ruin of his head.

Jing hadn’t known he was dealing with a live nuclear device. The flood of that feeling had swept over me. Had I seen the xenobiologist stop beside me? Had I decided to smash his face in with the crowbar?

I stumbled about, gasping for breath. My brain felt like it was on fire, seizing with electric spasms. Red webs pulsed at the edges of my vision, flaring brightly before fading. Before I knew it, I’d crossed the room that contained the construction robots, and was sprinting up the ramp. The oval beam of my flashlight jerked and warped, sliding over the protrusions and crevices of the rock face. My arms felt like spent rubber bands, especially the right, aching from fingertips to shoulder blades. Every balancing lurch, every push against the rock to keep climbing, intensified the ache.

I passed the first sublevel. My breath fogged the visor; I saw the flashlight beam dimly, as through a mist. My hair, pulled back at my nape, was soaked through, plastered to my skin.

I burst onto the surface, into the emptiness of the dome. I staggered, kicking through the sandy earth. I gasped for air and ran. I pictured myself training on a circuit—something that relaxed me at the academy after piloting, just as going to the gym with Mara relaxed me on the station—but now I was running from the consequences, from an earthquake tearing the earth apart like cloth. If I slowed, the fissure would overtake and swallow me.

I vaulted over the embankment to the left of the esplanade, where I’d hidden before, landing on my knees and one forearm. I scrambled backward, kicking up dirt, and pressed myself flat against the embankment’s exposed rock face.

The radio. I navigated the visor options until I muted my comm signal. When the notification confirmed I was off-frequency, I jammed my fists against my knees, my mouth stretched wide in a scream.

I drew a ragged breath. Beads of sweat dripped from my forehead onto the visor; the material wicked them away, like water hitting hot pavement. Mara would have reached the cockpit by now, found me missing. Nothing could make Jing’s death look like an accident. How would my friend look at me? What would she think when she found out? She’d think… because I killed the xenobiologist… I might kill her too.

I buried my helmeted head in my forearms. I welcomed the dimness. How had I let this happen? I knew I should have destroyed the artifact—just as I knew I had to fight back when those shadows grabbed me, tried to rip me open with their claws. I’d struck the shadows with the crowbar before I’d even consciously decided to. On other expeditions, while waiting for scientists and soldiers to emerge from some dense alien jungle, I’d monitor their radio chatter, trusting my instincts to warn me if I should suggest aborting the mission. Just as piloting was like flowing in a dance of thrust and gravity, the way dancing came naturally to others, I imagined. Now my instincts screamed at me to flee, to run from this embankment away from the ship, to strike out across the planet, heedless of survival. My instinct had been supplanted by another. And I knew the difference.

I peeked around the side of the embankment. The scarred esplanade remained deserted. The crystalline dome watched the minutes pass like some ancient ruin.

If Mara found out the artifact made me kill Jing, maybe she’d understand the danger, agree to destroy it. I was counting on her reasoning, on that cold logic that had so often irritated me. But if I waited too long to face her, she’d suspect my motives.

As I straightened up and stepped, dizzy, onto the esplanade, an electric spike lanced through my neurons, blurring my vision. I stumbled around until it subsided. I stopped before the central crater, hunching over to examine its charcoal-gray cracks and ridges. Crushed bones.

I activated the radio. The visor display indicated it was locking onto Mara’s signal. She’d see mine pop up, too, unless she was distracted. In the center of my darkened visor, the arctic-blue star shone through the thin atmosphere like a quivering ball of fluff.

“Where are you, Mara?”

“Cockpit.”

The shadows intercepted the transmission, projecting their hatred at me. It distracted me from Mara’s tone—was there suspicion coloring her voice? I waited a few seconds. Would she demand an explanation? Why was she silent?

“Good,” I said. “Stay there. I need to talk to you.”

As I climbed the slope skirting the hill towards the ship, the reality of my decision hit me. I was about to lock myself in the cockpit’s confined space with Mara. Her shadows would envelop me, sink their claws into my skin, force themselves down my throat to suffocate me. I wanted desperately to rip off my helmet, wipe the sweat from my face. I needed a shower, a moment to think.

I located the ship’s tower. Several meters ahead lay three cargo containers and scattered tools. Inside the cargo hold, chunks of the robots and the materializer were heaped like scrap in a landfill.

I scrambled up the boarding ladder to the airlock hatch. Opened it, scrambled inside, sealed it shut. The chamber pressurized with a series of hisses and puffs. I unsealed my helmet. Holding it upside down, steam poured out as if from a pot of fresh soup. I gulped the ship’s cool, filtered air and opened the inner door to the cockpit.

“Mara.”

Empty. Indicators blinked. On the monitors, ship status displays and sector topographical maps cycled. Lines of text scrolled.

My seat held a roll of electrical tape. As I turned it over in my fingers, an electric jolt made me clench my teeth, squeeze my eyes shut. My neurons hummed.

The door to the airlock chamber clicked shut with a heavy mechanical thud. The thick metal muffled the hissing. Leaning back against my seat’s headrest, still clutching the tape, I froze. The air grew heavy. The cockpit lights seemed to dim, the edges of my perception closing in. A dozen shadows waited in the airlock chamber, their concentrated beams of hatred probing the metal door, seeking to burn me.

The door slid open.

I tensed, lips parting. What could I possibly say?

Mara emerged sideways through the gap, head bowed. As she stepped through, she shouldered the door shut behind her. The glowing diodes and bright screens of the control panel glinted on her helmet’s visor. She whipped around to face me. Her right arm shot out, leveling an electroshock lance. The two silver prongs at its tip lunged like viper fangs.


Author’s note: I originally wrote this novella in Spanish about ten years ago. It’s contained in the collection titled Los dominios del emperador búho.