•  Gork   ( @Gork@lemm.ee ) 

    Instead of generative AI for game assets, I'd much rather see something like an LLM in game that dynamically controls NPC behavior. That would be cool as hell.

    Like an RPG where you can type what you want to say to an NPC instead of choosing a fixed dialogue tree.
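
    Purely as a sketch of what that could look like (every name and function below is made up for illustration, not any real engine or model API): the NPC's persona and the recent conversation get packed into a prompt, and whatever the player types goes in free-form instead of being picked from a tree.

    ```python
    # Toy sketch of LLM-driven NPC dialogue instead of a fixed dialogue tree.
    # "llm_complete" is a stand-in for whatever local or hosted model a game might call.

    def llm_complete(prompt: str) -> str:
        """Placeholder: plug a real model call in here."""
        raise NotImplementedError

    NPC_PERSONA = (
        "You are Maren, a blacksmith in the village of Hollowbrook. "
        "You only know about smithing, local gossip, and the recent bandit raids. "
        "Stay in character and never mention game mechanics."
    )

    def npc_reply(player_input: str, history: list[str]) -> str:
        # Build the prompt from the persona, the running conversation,
        # and the player's free-form input.
        prompt = NPC_PERSONA + "\n" + "\n".join(history) + f"\nPlayer: {player_input}\nMaren:"
        reply = llm_complete(prompt)
        history.append(f"Player: {player_input}")
        history.append(f"Maren: {reply}")
        return reply

    # Usage (once llm_complete is wired to a real model):
    # history: list[str] = []
    # print(npc_reply("Have you seen anything strange near the old mine?", history))
    ```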

    • Edit: ugh, I lost myself in this reply. It's just geeking out about what could be possible in the future; mostly not worth reading if you value your time.

      This is one of the most exciting developments to me, the actual AI of bots and NPCs. It's not only for RPGs; I can also envision multiplayer games being more fun to play offline with bots. Imagine they act like humans, using their hearing and trying to trick you in Mario Kart, Street Fighter and Counter Strike. Obviously we are a long way from this, but it's very exciting to me.

      A GTA where people act normal and do the stuff humans would probably try is exciting as well. In RPGs, imagine you hear about a hero in a village who defends his town, and you recruit him, only to find out he's just a normal NPC for other players who got strong because he found a holy weapon you dropped near him at the beginning of the game. Totally wild idea, I know, but what if the future of games (probably 50 years from now… sheesh) is that extremely rich and dynamic? I have no idea how this vision could be accomplished without AI and an always-on connection to powerful servers…

    • Not gonna happen. Not really.

      So far research suggests the guardrail and hallucination problems are unsolvable, and we are seeing diminishing returns from increasing the complexity of these systems.

      Hence devs will never have the necessary control required to author an actual narrative. NPCs will end up talking about mechanics that don't exist, or saying things that contradict the overall narrative.
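
      To be concrete about why that control matters, here's a toy illustration (all names invented, not from any real game) of the kind of post-hoc filter a dev might bolt onto generated dialogue. Keyword checks like this can reject crude contradictions, but they can't make the output fit an authored narrative.

      ```python
      # Toy guardrail: reject generated replies that reference quests or
      # mechanics that don't exist in the game's data. Purely illustrative.

      KNOWN_QUESTS = {"clear the old mine", "deliver the ore shipment"}
      KNOWN_MECHANICS = {"smithing", "haggling", "lockpicking"}

      def violates_canon(reply: str) -> bool:
          text = reply.lower()
          # Naive check: flag anything that sounds like a quest offer or a
          # mechanic we can't match to real game content.
          if "quest" in text and not any(q in text for q in KNOWN_QUESTS):
              return True
          if "skill" in text and not any(m in text for m in KNOWN_MECHANICS):
              return True
          return False

      # print(violates_canon("Take this quest to slay the dragon of Emberfall!"))  # True
      ```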

      Even with actual people, if you just throw them in a room and have them improv a world into existence, it never ends up quite as good as a properly authored narrative.

      And LLMs are nowhere near achieving the level of internal consistency required for something like the worlds of Elden Ring or Mass Effect.

      Baldur’s Gate 3 contains a truly staggering amount of writing, several times the length of classic literary works. The hallucination problem means that if all of that were AI generated, small parts might pass inspection, but trying to immerse yourself in it as a fictional world would have you noticing immersion-breaking continuity errors left and right.