In this paper, the optimization of the update period is investigated in a digital twin (DT) environment. In particular, we analyze the relationship between the synchronization error caused by periodic update tasks and the total number of processed bits, and, based on this analysis, we propose a method to determine the optimal update period. In DT systems, accurate synchronization between the physical twin (PT) and the cyber twin (CT) is critical, as synchronization errors degrade prediction accuracy, which in turn reduces overall system performance. Optimizing the update period is therefore a key factor in maintaining DT system performance. To address this challenge, we mathematically model the effect of the update period on data processing capacity and synchronization accuracy, accounting for the exponential growth of the synchronization error over time. From this model, we derive the optimal update period that minimizes the ratio of the total synchronization error to the number of processed bits. Our approach reformulates the update period optimization problem in terms of a monotonically increasing function, demonstrating that the optimal update period can be determined efficiently while satisfying a given data processing capacity requirement.
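The trade-off described above can be sketched numerically under simple assumed models (not the paper's exact formulation): here the synchronization error is taken to grow exponentially within each period as e(t) = e0·exp(λt), and a period of length T is assumed to yield b(T) = c·(T − τ) processed bits, where τ is a hypothetical per-update overhead; the error model, bit model, and all parameter values are illustrative assumptions. A grid search then locates the period minimizing the error-per-bit ratio.

```python
import math

def total_error(T, e0=1e-3, lam=2.0):
    # Accumulated error over one period: integral of e0*exp(lam*t) over [0, T].
    # (Assumed exponential error-growth model.)
    return e0 * (math.exp(lam * T) - 1.0) / lam

def processed_bits(T, c=1e6, tau=0.01):
    # Bits processed in one period, net of a hypothetical update overhead tau;
    # requires T > tau.
    return c * (T - tau)

def optimal_period(T_min=0.02, T_max=2.0, steps=10000):
    # Grid search for the period minimizing the error-per-bit ratio.
    best_T, best_ratio = None, float("inf")
    for i in range(steps + 1):
        T = T_min + (T_max - T_min) * i / steps
        ratio = total_error(T) / processed_bits(T)
        if ratio < best_ratio:
            best_T, best_ratio = T, ratio
    return best_T, best_ratio
```

Under these assumptions the ratio diverges as T approaches the overhead τ (few bits per update) and grows again for large T (exponential error dominates the linear bit count), so an interior optimum exists between the two extremes, mirroring the trade-off the abstract describes.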