Self-consistency prompting is a strategy that asks a model the same prompt multiple times and takes the majority result as the final answer.
It is a follow-up to CoT prompting.
Self-consistency prompting works by generating a collection of reasoning paths in the AI's decision-making process.
By analyzing the consistency across these paths, it can discard any path that doesn't align with the initial premises.
Self-consistency aims "to replace the naive greedy decoding used in chain-of-thought prompting" (Wang et al.).
The idea is to sample multiple, diverse reasoning paths via few-shot CoT and use the generations to select the most consistent answer.
This helps boost the performance of CoT prompting on tasks involving arithmetic and commonsense reasoning.
Since you can often take numerous paths to arrive at the answer to a math problem, why not ask an LLM to create multiple reasoning paths?
CoT with self-consistency involves launching multiple diverse CoT threads, noting their outputs, and then taking the majority vote of those outputs as the final answer.
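As a minimal sketch of this voting step: the snippet below assumes a hypothetical `sample_cot_answer(prompt)` helper (not part of any real library) that runs one CoT completion at temperature > 0 and returns the parsed final answer; the self-consistency step itself is just a majority vote over the sampled answers.

```python
from collections import Counter

def majority_vote(answers):
    """Return the most common final answer among sampled reasoning paths."""
    counts = Counter(answers)
    answer, _ = counts.most_common(1)[0]
    return answer

# In practice, each answer would come from a separate CoT completion
# sampled at temperature > 0, e.g.:
#   sampled = [sample_cot_answer(prompt) for _ in range(5)]
# Here we use stand-in values: final answers parsed from 5 reasoning paths.
sampled = ["18", "18", "26", "18", "26"]
print(majority_vote(sampled))  # → "18"
```

Note that the vote is taken over the extracted final answers, not the full reasoning chains, so paths that reach the same answer by different routes still count together.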