Choose Your Own Adventure Workflow (Human In The Loop)
For some workflow applications, it may be desirable and/or required to have humans involved in their execution. For example, a step of a workflow may need human expertise or input in order to run, or a human may need to validate the initial output of a workflow.
In this notebook, we show how to implement a human-in-the-loop pattern with workflows. We build a workflow that creates stories in the style of Choose Your Own Adventure, where the LLM produces a segment of the story along with potential actions, and a human is required to choose one of those actions.
Generating Story Segments With An LLM
Here, we make use of the ability of LLMs to produce structured outputs. We task the LLM with creating a segment of the story that continues from the previously generated segments and action choices.
In [ ]
from typing import Any, List
from llama_index.llms.openai import OpenAI
from llama_index.core.bridge.pydantic import BaseModel, Field
from llama_index.core.prompts import PromptTemplate
In [ ]
class Segment(BaseModel):
    """Data model for generating segments of a story."""

    plot: str = Field(
        description="The plot of the adventure for the current segment. The plot should be no longer than 3 sentences."
    )
    actions: List[str] = Field(
        default=[],
        description="The list of actions the protagonist can take that will shape the plot and actions of the next segment.",
    )
In [ ]
SEGMENT_GENERATION_TEMPLATE = """
You are working with a human to create a story in the style of choose your own adventure.

The human is playing the role of the protagonist in the story which you are tasked to
help write. To create the story, we do it in steps, where each step produces a BLOCK.
Each BLOCK consists of a PLOT, a set of ACTIONS that the protagonist can take, and the
chosen ACTION.

Below we attach the history of the adventure so far.

PREVIOUS BLOCKS:
---
{running_story}

Continue the story by generating the next block's PLOT and set of ACTIONS. If there are
no previous BLOCKS, start an interesting brand new story. Give the protagonist a name and an
interesting challenge to solve.

Use the provided data model to structure your output.
"""
In [ ]
FINAL_SEGMENT_GENERATION_TEMPLATE = """
You are working with a human to create a story in the style of choose your own adventure.

The human is playing the role of the protagonist in the story which you are tasked to
help write. To create the story, we do it in steps, where each step produces a BLOCK.
Each BLOCK consists of a PLOT, a set of ACTIONS that the protagonist can take, and the
chosen ACTION.

Below we attach the history of the adventure so far.

PREVIOUS BLOCKS:
---
{running_story}

The story is now coming to an end. With the previous blocks, wrap up the story with a
closing PLOT. Since it is a closing plot, DO NOT GENERATE a new set of actions.

Use the provided data model to structure your output.
"""
In [ ]
# Let's see an example segment
llm = OpenAI("gpt-4o")
segment = llm.structured_predict(
Segment,
PromptTemplate(SEGMENT_GENERATION_TEMPLATE),
running_story="",
)
In [ ]
segment
Out[ ]
Segment(plot="In the bustling city of Eldoria, a young adventurer named Aric discovered a mysterious map hidden inside an old bookshop. The map hinted at a hidden treasure buried deep within the enchanted Whispering Woods. Intrigued and eager for adventure, Aric decided to follow the map's clues.", actions=['Follow the map to the Whispering Woods', 'Seek advice from the old bookshop owner', 'Gather supplies for the journey', 'Ignore the map and continue with daily life'])
Stitching Together Previous Segments
We need to stitch the story segments together and pass them into the prompt as the value of running_story. We define a Block data class that holds a Segment as well as the chosen action (choice).
In [ ]
import uuid
from typing import Optional
BLOCK_TEMPLATE = """
BLOCK
===
PLOT: {plot}
ACTIONS: {actions}
CHOICE: {choice}
"""
class Block(BaseModel):
id_: str = Field(default_factory=lambda: str(uuid.uuid4()))
segment: Segment
choice: Optional[str] = None
block_template: str = BLOCK_TEMPLATE
def __str__(self):
return self.block_template.format(
plot=self.segment.plot,
actions=", ".join(self.segment.actions),
choice=self.choice or "",
)
In [ ]
block = Block(segment=segment)
print(block)
BLOCK
===
PLOT: In the bustling city of Eldoria, a young adventurer named Aric discovered a mysterious map hidden inside an old bookshop. The map hinted at a hidden treasure buried deep within the enchanted Whispering Woods. Intrigued and eager for adventure, Aric decided to follow the map's clues.
ACTIONS: Follow the map to the Whispering Woods, Seek advice from the old bookshop owner, Gather supplies for the journey, Ignore the map and continue with daily life
CHOICE:
Create The Choose-Your-Own-Adventure Workflow
This workflow consists of two steps that loop until a maximum number of steps (i.e., segments) has been produced. The first step has the LLM create a new Segment, which is used to build a new story Block. The second step prompts the human to choose their adventure from the list of actions specified in the newly created Segment.
In [ ]
from llama_index.core.workflow import (
Context,
Event,
StartEvent,
StopEvent,
Workflow,
step,
)
In [ ]
class NewBlockEvent(Event):
block: Block
class HumanChoiceEvent(Event):
block_id: str
In [ ]
class ChooseYourOwnAdventureWorkflow(Workflow):
def __init__(self, max_steps: int = 3, **kwargs):
super().__init__(**kwargs)
self.llm = OpenAI("gpt-4o")
self.max_steps = max_steps
@step
async def create_segment(
self, ctx: Context, ev: StartEvent | HumanChoiceEvent
) -> NewBlockEvent | StopEvent:
blocks = await ctx.get("blocks", [])
running_story = "\n".join(str(b) for b in blocks)
if len(blocks) < self.max_steps:
new_segment = self.llm.structured_predict(
Segment,
PromptTemplate(SEGMENT_GENERATION_TEMPLATE),
running_story=running_story,
)
new_block = Block(segment=new_segment)
blocks.append(new_block)
await ctx.set("blocks", blocks)
return NewBlockEvent(block=new_block)
else:
final_segment = self.llm.structured_predict(
Segment,
PromptTemplate(FINAL_SEGMENT_GENERATION_TEMPLATE),
running_story=running_story,
)
final_block = Block(segment=final_segment)
blocks.append(final_block)
return StopEvent(result=blocks)
@step
async def prompt_human(
self, ctx: Context, ev: NewBlockEvent
) -> HumanChoiceEvent:
block = ev.block
# get human input
human_prompt = f"\n===\n{ev.block.segment.plot}\n\n"
human_prompt += "Choose your adventure:\n\n"
human_prompt += "\n".join(ev.block.segment.actions)
human_prompt += "\n\n"
human_input = input(human_prompt)
blocks = await ctx.get("blocks")
block.choice = human_input
blocks[-1] = block
await ctx.set("block", blocks)
return HumanChoiceEvent(block_id=ev.block.id_)
Running The Workflow
Since workflows are async-first, this all runs fine in a notebook. If you are running your own code, you will need to use asyncio.run() to start an async event loop if one isn't already running.
async def main():
    <async code>

if __name__ == "__main__":
    import asyncio
    asyncio.run(main())
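For instance, a minimal standalone script for this notebook's workflow might look like the sketch below (it assumes the workflow class, data models, and prompt templates defined above live in the same module):

import asyncio

async def main():
    # Same construction and run call used later in this notebook
    w = ChooseYourOwnAdventureWorkflow(timeout=None)
    blocks = await w.run()
    # Join the plots of all blocks into the final story
    print("\n\n".join(b.segment.plot for b in blocks))

if __name__ == "__main__":
    asyncio.run(main())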
In [ ]
import nest_asyncio
nest_asyncio.apply()
In [ ]
w = ChooseYourOwnAdventureWorkflow(timeout=None)
In [ ]
result = await w.run()
Print The Final Story
In [ ]
final_story = "\n\n".join(b.segment.plot for b in result)
print(final_story)
In the bustling city of Eldoria, a young adventurer named Aric discovered a mysterious map hidden in an old bookshop. The map hinted at a hidden treasure buried deep within the Whispering Woods, a place known for its eerie silence and ancient secrets. Determined to uncover the treasure, Aric set off on his journey, leaving the city behind.

Aric found a seasoned guide named Elara, who knew the Whispering Woods like the back of her hand. Elara agreed to help Aric, intrigued by the promise of hidden treasure. Together, they ventured into the forest, the map leading them to a fork in the path where the trees seemed to whisper secrets.

Elara examined the map closely and listened to the whispers of the trees. She suggested taking the left path, as it seemed to align with the ancient markings on the map. Trusting her expertise, Aric and Elara proceeded down the left path, where the forest grew denser and the air filled with an eerie stillness.

As Aric and Elara searched for hidden clues along the dense path, they stumbled upon an ancient stone altar covered in moss and vines. Upon closer inspection, they discovered a hidden compartment within the altar containing the long-lost treasure—a chest filled with gold, jewels, and ancient artifacts. With their mission complete, Aric and Elara returned to Eldoria, their bond strengthened by the adventure and their hearts filled with the thrill of discovery.
Other Ways Of Implementing Human In The Loop
Human in the loop can also be implemented by creating a separate workflow dedicated to gathering human input and using nested workflows. This design could be useful in scenarios where you want the human input collection to be a service separate from the rest of the workflow, for example when deploying the nested workflow with llama-deploy. A rough sketch of this design is shown below.
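The sketch below is not part of the original notebook: the class and step names are illustrative, and it assumes the nested-workflows pattern from the llama_index workflow docs (a step declares an extra parameter annotated as Workflow, and the nested workflow is registered with add_workflows before running). Only the human step changes; create_segment is inherited as-is.

class HumanInputWorkflow(Workflow):
    """Stand-alone workflow whose only job is to collect human input."""

    @step
    async def get_input(self, ev: StartEvent) -> StopEvent:
        # `prompt` is an illustrative kwarg supplied by the caller via run(prompt=...)
        human_input = input(ev.prompt)
        return StopEvent(result=human_input)


class NestedHumanInTheLoopWorkflow(ChooseYourOwnAdventureWorkflow):
    @step
    async def prompt_human(
        self, ctx: Context, ev: NewBlockEvent, human_input_workflow: Workflow
    ) -> HumanChoiceEvent:
        human_prompt = f"\n===\n{ev.block.segment.plot}\n\n"
        human_prompt += "Choose your adventure:\n\n"
        human_prompt += "\n".join(ev.block.segment.actions)
        human_prompt += "\n\n"

        # Delegate input collection to the nested workflow instead of calling input() here
        choice = await human_input_workflow.run(prompt=human_prompt)

        blocks = await ctx.get("blocks")
        ev.block.choice = choice
        blocks[-1] = ev.block
        await ctx.set("blocks", blocks)
        return HumanChoiceEvent(block_id=ev.block.id_)


# Attach the nested workflow before running
w = NestedHumanInTheLoopWorkflow(timeout=None)
w.add_workflows(human_input_workflow=HumanInputWorkflow())
result = await w.run()

Because the input collection now lives behind its own workflow interface, the same main workflow can later swap in a version of HumanInputWorkflow that calls out to a remote service instead of the local input() built-in.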