Advanced prompting techniques like chain of thought [8] and tree of thought [9] prompting have drastically improved the ability of large language models (LLMs) to solve complex, reasoning-based tasks. At a high level, forcing the LLM to construct a step-by-step response to a problem drastically improves its problem-solving capabilities. However, all of these techniques assume that the reasoning process should follow a linear pattern that progresses from one thought to the next. Notably, the reasoning process followed by humans tends to be quite different, exploring several different chains of thought and even combining insights from different thoughts to arrive at a final solution. Within this overview, we will study several prompting techniques that model the reasoning process as a graph structure, rather than a chain or tree, which better captures the various kinds of non-linear patterns that may occur when reasoning over a problem.
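To make the contrast concrete, here is a minimal sketch (the `Thought` class and the example reasoning steps are illustrative, not from any particular framework) of the difference between a linear chain of thoughts and a graph of thoughts, where a single node may aggregate insights from multiple parent thoughts:

```python
from dataclasses import dataclass, field


@dataclass
class Thought:
    """A single reasoning step; `parents` lists the thoughts it builds on."""
    text: str
    parents: list = field(default_factory=list)


# Linear chain (CoT/ToT style): each thought extends exactly one predecessor.
t1 = Thought("Try factoring 24 as 4 * 6")
t2 = Thought("Build 6 from the remaining numbers: 10 - 4", parents=[t1])

# Graph of thoughts: a node may merge several independent branches.
a = Thought("Branch A: factor 24 as 4 * 6")
b = Thought("Branch B: obtain 6 via 10 - 4")
merged = Thought("Combine branches: (10 - 4) * 4 = 24", parents=[a, b])

# The merge node has in-degree 2, something a chain or tree cannot express.
assert len(t2.parents) == 1
assert len(merged.parents) == 2
```

In a chain or tree, every node has at most one parent; allowing multiple parents is precisely what lets graph-based techniques model the "combining insights" behavior described above.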
“Human thinking is often characterized by its ability to make sudden leaps and connections between seemingly unrelated ideas, which can lead to novel insights and solutions. This non-linear, jumping thought process is a hallmark of human creativity, reasoning, and problem-solving abilities.” — from [1]
Within this overview, we will explore several advanced prompting techniques for LLMs that can be used to solve difficult multi-step reasoning problems. Luckily, we have recently overviewed the basic ideas behind prompting, including:
- Prompting fundamentals (i.e., prompt engineering, context windows, structure of a prompt, etc.) [link]
- Advanced prompting techniques (e.g., chain of thought, self-consistency, and least-to-most prompting) [link]
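As a quick refresher on one of these background ideas, self-consistency samples several independent reasoning chains for the same problem and takes a majority vote over their final answers. The sketch below uses a stubbed sampler in place of real LLM calls; the function name and stubbed answers are illustrative assumptions:

```python
from collections import Counter


def self_consistency(sample_chain, n=5):
    """Sample n reasoning chains and return the most common final answer."""
    answers = [sample_chain() for _ in range(n)]
    return Counter(answers).most_common(1)[0][0]


# Stub standing in for a stochastic LLM call that returns a final answer.
fake_answers = iter(["18", "18", "24", "18", "24"])
majority = self_consistency(lambda: next(fake_answers), n=5)
assert majority == "18"  # 3 of the 5 sampled chains agree on "18"
```

Even this simple voting scheme is linear per chain: each sampled chain is still a single sequence of thoughts, which is exactly the limitation the graph-based techniques in this overview aim to relax.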
We have covered both practical and advanced prompting techniques so far. All of these ideas, especially chain of thought (CoT) prompting [8], self-consistency [10], and tree of thought (ToT) prompting [9], will be relevant for gaining an understanding of this overview. Beyond these ideas, we need to understand the transformer architecture and the graph…