With the increasing deployment of nanosatellites, free-space optical (FSO) communication is being actively adopted because its narrow beams minimize interference between communication links. However, if beam optimization against pointing errors is not considered during the design phase, the accuracy of pointing, acquisition, and tracking (PAT) degrades and transmission efficiency is reduced. This study investigates the factors contributing to elliptical pointing errors in nanosatellites and proposes an optimization method to mitigate them. The optimization performance was evaluated with a machine-learning approach based on a Random Forest model. By applying a novel optimization algorithm built on the modeled pointing errors, we achieved a significant improvement in communication performance over conventional methods.