
Bootstrapped Structural Prompting for Analogical Reasoning in Pretrained Language Models

Abstract

This paper proposes an analogical reasoning method based on structured prompt graphs and bootstrapped path optimization, aiming to improve the reasoning ability of large language models on complex structural transfer tasks. The method first constructs an analogical prompt graph that converts input analogy pairs into explicit structural representations, capturing the relational paths and topological dependencies between concepts. On this basis, a bootstrapped path optimization mechanism scores initial reasoning paths by semantic quality, filters them for structural consistency, and dynamically updates the prompt graph to strengthen the transferable expression of analogical structure. Through multiple rounds of path reconstruction, the model achieves stable modeling and iterative refinement of analogical structures, which improves the robustness and consistency of the language model under multi-hop reasoning, few-shot analogy, and path uncertainty. Experiments systematically evaluate the method across different path depths, task types, and sample-sparsity levels, with a focus on accuracy, path consistency, and structural alignment. The results show that the proposed method outperforms existing mainstream models on all metrics and offers clear advantages in structural reasoning, stability, and cross-task transfer, validating the value of structured prompting and bootstrapped mechanisms for analogical reasoning tasks.
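To make the loop the abstract describes concrete, the following is a minimal, runnable Python sketch of one plausible reading: build an explicit graph from analogy triples, enumerate candidate relational paths, score them semantically, filter them for structural consistency, reinforce the surviving edges, and repeat. All function names, the edge-weight update rule, and the toy scoring function are illustrative assumptions, not the paper's implementation; in the actual method the semantic score would come from the language model rather than from edge weights.

```python
# Hypothetical sketch of bootstrapped path optimization over a prompt graph.
from collections import defaultdict

def build_prompt_graph(triples):
    """(source, relation, target) triples -> weighted adjacency structure."""
    graph = defaultdict(dict)
    for src, rel, tgt in triples:
        graph[src][tgt] = {"relation": rel, "weight": 1.0}
    return graph

def enumerate_paths(graph, start, max_depth=3):
    """All simple paths from `start` up to `max_depth` hops."""
    paths, frontier = [], [[start]]
    for _ in range(max_depth):
        next_frontier = []
        for path in frontier:
            for nxt in graph.get(path[-1], {}):
                if nxt not in path:  # keep paths simple (no revisits)
                    extended = path + [nxt]
                    paths.append(extended)
                    next_frontier.append(extended)
        frontier = next_frontier
    return paths

def semantic_score(graph, path):
    """Stand-in for an LM-based semantic score: here, mean edge weight."""
    weights = [graph[a][b]["weight"] for a, b in zip(path, path[1:])]
    return sum(weights) / len(weights)

def structurally_consistent(graph, path, template):
    """Keep only paths whose relation sequence matches the source analogy."""
    rels = tuple(graph[a][b]["relation"] for a, b in zip(path, path[1:]))
    return rels == template

def bootstrap(graph, start, template, rounds=3, keep_top=2, lr=0.5):
    """Score, filter, reinforce, and re-enumerate paths for several rounds."""
    survivors = []
    for _ in range(rounds):
        candidates = enumerate_paths(graph, start, max_depth=len(template))
        candidates = [p for p in candidates
                      if structurally_consistent(graph, p, template)]
        candidates.sort(key=lambda p: semantic_score(graph, p), reverse=True)
        survivors = candidates[:keep_top]
        for path in survivors:               # dynamic prompt-graph update:
            for a, b in zip(path, path[1:]):  # strengthen surviving edges
                graph[a][b]["weight"] += lr
    return survivors

# Toy analogy: electron is to nucleus as planet is to sun.
triples = [
    ("planet", "orbits", "sun"),
    ("sun", "center_of", "solar_system"),
    ("electron", "orbits", "nucleus"),
    ("nucleus", "center_of", "atom"),
]
g = build_prompt_graph(triples)
print(bootstrap(g, "electron", template=("orbits",)))
```

In this sketch the structural filter plays the role of topological consistency checking, while the weight increment stands in for the dynamic graph update; swapping `semantic_score` for a language-model scorer would recover the division of labor the abstract attributes to the method.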
