Batch normalization (BN) is a technique that improves training speed and generalization performance by mitigating internal covariate shift. However, implementing BN in hardware is challenging because it requires an additional complex circuit to normalize, scale, and shift activations. We propose a hardware binary neural network (BNN) system capable of performing BN in hardware, which consists of an AND-type flash memory array as the synapse array and a voltage sense amplifier (VSA) as the neuron. In this system, hardware BN is implemented with a voltage shifter that adjusts the threshold of the binary neuron. To validate the effectiveness of the proposed hardware-based BNN system, we fabricated a charge trap flash with a gate stack of SiO<sub>2</sub>/Si<sub>3</sub>N<sub>4</sub>/SiO<sub>2</sub>. Its electrical characteristics were modeled using BSIM3 model parameters, and the proposed circuit was successfully demonstrated by SPICE simulation. Moreover, variation effects of the voltage shifter were analyzed using Monte Carlo simulation. Finally, the performance of the proposed system was verified by incorporating the SPICE results into a high-level simulation of a binary <italic>LeNet-5</italic> for MNIST pattern recognition, showing that the proposed system improves on previous studies in terms of power and area.
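The key idea of implementing BN by shifting the binary neuron's threshold can be sketched in a few lines. This is a hedged illustration, not the paper's circuit: the function names and parameter values below are hypothetical, and it only shows the well-known algebraic folding of BN's scale and shift into a single comparison threshold for a sign-activation neuron.

```python
import numpy as np

def bn_then_sign(x, gamma, beta, mu, sigma):
    """Reference path: binarize the batch-normalized pre-activation,
    i.e. sign(gamma * (x - mu) / sigma + beta)."""
    return np.sign(gamma * (x - mu) / sigma + beta)

def shifted_threshold_sign(x, gamma, beta, mu, sigma):
    """Equivalent binary neuron: BN is folded into a single threshold
    shift tau = mu - beta * sigma / gamma, analogous to adjusting the
    neuron's switching threshold in hardware (assumes gamma != 0)."""
    tau = mu - beta * sigma / gamma
    return np.sign(x - tau) if gamma > 0 else np.sign(tau - x)

# The two paths produce identical binary outputs.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
out_bn = bn_then_sign(x, gamma=1.5, beta=0.3, mu=0.1, sigma=0.8)
out_fold = shifted_threshold_sign(x, gamma=1.5, beta=0.3, mu=0.1, sigma=0.8)
assert np.array_equal(out_bn, out_fold)
```

Because only the threshold moves, no multiplier or divider is needed at inference time, which is what makes a simple voltage shifter sufficient for hardware BN.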