Currently, the physiological signals used for classifying acrophobia emotions mainly include the electroencephalogram (EEG), electrocardiogram (ECG), and electromyogram (EMG). However, given the difficulty of EEG acquisition and processing and the challenge of fusing multimodal signals, a dynamic weighted decision fusion algorithm based on six peripheral physiological signals is proposed. First, different levels of acrophobia are induced in subjects through virtual reality, while six peripheral physiological signals are recorded: electrocardiogram (ECG), blood volume pulse (BVP), electromyogram (EMG), electrodermal activity (EDA), skin temperature (SKT), and respiration (RESP). Second, statistical and event-related features are extracted from the signals to construct an acrophobia emotion dataset. Third, a dynamic weighted decision fusion algorithm is proposed that weights each modality according to its classification performance together with intra-modal and cross-modal information, so as to effectively integrate the multimodal signals and improve recognition accuracy. Finally, the experimental results are compared with previous related work and further verified on the open-source WESAD emotion dataset. The results show that multimodal peripheral physiological signals enhance the classification of acrophobia emotions, and that the proposed dynamic weighted decision fusion algorithm significantly improves both classification performance and model robustness.
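To illustrate the general idea of performance-weighted decision-level fusion, the following is a minimal sketch. It is an illustrative simplification, not the paper's actual algorithm: it weights each modality only by a fixed validation accuracy, whereas the proposed method derives dynamic weights that also incorporate intra-modal and cross-modal information. All names, the toy data, and the weighting scheme are assumptions.

```python
import numpy as np

def fuse_decisions(probs_per_modality, val_accuracies):
    """Performance-weighted decision-level fusion (illustrative sketch).

    probs_per_modality: list of (n_samples, n_classes) arrays, one per
        peripheral signal (e.g. ECG, BVP, EMG, EDA, SKT, RESP), giving
        each modality's class-probability predictions.
    val_accuracies: per-modality validation accuracies used as static
        weights (the paper's weights are dynamic; this is a simplification).
    """
    acc = np.asarray(val_accuracies, dtype=float)
    weights = acc / acc.sum()            # normalize so weights sum to 1
    fused = sum(w * p for w, p in zip(weights, probs_per_modality))
    return fused.argmax(axis=1)          # predicted class per sample

# Toy example: two modalities, three samples, two classes.
p_ecg = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]])
p_eda = np.array([[0.6, 0.4], [0.7, 0.3], [0.1, 0.9]])
labels = fuse_decisions([p_ecg, p_eda], val_accuracies=[0.8, 0.6])
print(labels)  # the more accurate ECG classifier dominates on sample 2
```

Weighting the probability vectors rather than hard labels lets a confident, historically accurate modality outvote a weaker one, which is the basic motivation for decision-level fusion over simple majority voting.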