Commercial tracking cameras, readily available for aerospace applications, usually provide reliable navigation solutions. However, the navigation output of a single tracking camera can fail or drift in certain environments, while another tracking camera facing a different direction in the same environment may still remain usable. To produce consistent navigation from multiple tracking cameras facing distinct directions, we propose a real-time fusion approach based on an interacting multiple-model (IMM) filtering framework. Within this framework, the estimation output of each tracking camera is weighted according to its accuracy and uncertainty, yielding a robust navigation system. Real-world experiments show that the proposed navigation system overcomes sensor failure and loss of visual texture in one camera.
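
The snippet below is a minimal sketch, not the authors' implementation, of how per-camera estimates might be weighted and fused in the spirit of an IMM filter: each camera is treated as one "model", its mode probability is updated from its measurement likelihood, and the estimates are combined by those probabilities. Only the probability-update and combination stages are shown; the 2-D state, the transition matrix `PI`, and all names and numbers are illustrative assumptions.

```python
import numpy as np

def imm_combine(estimates, covariances, likelihoods, mu_prev, PI):
    """Fuse per-camera estimates with IMM-style mode probabilities.

    estimates   : list of per-camera state vectors x_i (hypothetical)
    covariances : list of per-camera covariance matrices P_i
    likelihoods : measurement likelihood L_i of each camera's last update
    mu_prev     : previous mode probabilities
    PI          : assumed Markov mode-transition matrix
    """
    # Predicted mode probabilities (mixing stage of the IMM cycle).
    c = PI.T @ mu_prev
    # Update with each camera's likelihood; a failing or texture-poor
    # camera yields a low likelihood and is automatically down-weighted.
    mu = likelihoods * c
    mu = mu / mu.sum()

    # Probability-weighted combined state.
    x = sum(m * xi for m, xi in zip(mu, estimates))
    # Combined covariance includes the spread-of-means term.
    P = sum(m * (Pi + np.outer(xi - x, xi - x))
            for m, xi, Pi in zip(mu, estimates, covariances))
    return x, P, mu

# Illustrative usage: camera 0 healthy, camera 1 drifting (low likelihood).
x0, P0 = np.array([1.00, 0.0]), np.diag([0.01, 0.01])
x1, P1 = np.array([1.35, 0.1]), np.diag([0.20, 0.20])
PI = np.array([[0.95, 0.05],
               [0.05, 0.95]])
x, P, mu = imm_combine([x0, x1], [P0, P1],
                       likelihoods=np.array([0.90, 0.05]),
                       mu_prev=np.array([0.5, 0.5]),
                       PI=PI)
print("fused state:", x, "mode probabilities:", mu)
```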