We introduce Knowledgeable Network of Thoughts (kNoT): a prompt scheme that advances the capabilities of large language models (LLMs) beyond existing paradigms like Chain-of-Thought (CoT), Tree of Thoughts (ToT), and Graph of Thoughts (GoT). The key innovation of kNoT is the LLM Workflow Template (LWT), which allows an executable plan to be specified by LLMs for LLMs. LWT allows these plans to be arbitrary networks, where single-step LLM operations are nodes and edges correspond to message passing between these steps. Furthermore, LWT supports selection of individual elements through indexing, enabling kNoT to produce intricate plans in which each LLM operation can be limited to elementary operations, greatly enhancing reliability over extended task sequences. We demonstrate that kNoT significantly outperforms the state of the art on six use cases, while reducing the need for extensive prompt engineering. For instance, kNoT achieves 92% accuracy on sorting 32 numbers, versus 12% and 31% for ToT and GoT, while using up to 84.4% and 87.3% fewer task-specific prompts, respectively.
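To make the LWT idea concrete, the sketch below models a workflow as an arbitrary network of nodes (single-step operations) with edges that pass messages, plus per-edge indexing to select individual elements of an upstream output. This is a minimal illustration under assumptions: the `Node`/`Workflow` classes, the `(source, index)` edge encoding, and the use of plain Python functions in place of LLM calls are all hypothetical, not the paper's actual LWT format.

```python
class Node:
    """One step of the plan; `op` stands in for a single elementary LLM operation."""
    def __init__(self, name, op, inputs=()):
        self.name = name        # unique node id
        self.op = op            # callable applied to the incoming messages
        self.inputs = inputs    # edges: (upstream_name, index) pairs; index=None passes the whole output

class Workflow:
    """Executes nodes in the given order (assumed topological), passing messages along edges."""
    def __init__(self, nodes):
        self.nodes = {n.name: n for n in nodes}   # dict preserves insertion order

    def run(self, source_value):
        results = {"source": source_value}
        for name, node in self.nodes.items():
            args = []
            for src, idx in node.inputs:
                out = results[src]
                # indexing selects one element of an upstream result,
                # so each step only sees the small input it needs
                args.append(out if idx is None else out[idx])
            results[name] = node.op(*args)
        return results

# Toy plan in the spirit of the sorting use case: split the list,
# sort each half as an "elementary" step, then merge.
wf = Workflow([
    Node("split", lambda xs: (xs[: len(xs) // 2], xs[len(xs) // 2 :]),
         inputs=[("source", None)]),
    Node("sort_left", sorted, inputs=[("split", 0)]),
    Node("sort_right", sorted, inputs=[("split", 1)]),
    Node("merge", lambda a, b: sorted(a + b),
         inputs=[("sort_left", None), ("sort_right", None)]),
])
print(wf.run([5, 2, 9, 1])["merge"])  # → [1, 2, 5, 9]
```

In kNoT, a planner LLM would emit such a network and each node would be a constrained single prompt; here deterministic functions stand in for those calls so the control flow is easy to follow.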