In autonomous driving, a prior LiDAR map (PLM) is a powerful tool for correcting SLAM drift, but finding robust and accurate correspondences across modalities is challenging. To address this cross-modality issue, this paper proposes a real-time plane-based stereo localization system with a PLM. In the proposed system, drift in visual pose estimation is eliminated through plane-based joint optimization and a registration module. Two types of planes are employed: surfels, which ensure accuracy in narrow regions, and global planes, which provide robustness over wide areas. For accurate and robust matching between visual map points and the PLM, the surfels and global planes in the PLM are utilized collaboratively based on the point-to-PLM distance. To reduce the computational cost of the registration module, a plane-to-plane drift estimation module is proposed. The performance of the proposed system is extensively validated on synthetic simulations and real-world indoor and outdoor datasets. We validate the effectiveness of each module through ablation studies and also assess robustness against errors that may exist in the PLM and the initial pose. In most evaluations, the proposed system is more accurate and robust than state-of-the-art methods.