This paper introduces CodeDive, a web-based programming environment with real-time behavioral tracking designed to enhance student progress assessment, provide timely support for learners, and address the academic-integrity challenges posed by Large Language Models (LLMs). Visibility into the student's learning process has become essential for effective pedagogical analysis and personalized feedback, especially in an era when LLMs can generate complete solutions, making it difficult to assess learning or ensure academic integrity from the final outcome alone. CodeDive provides this process-level transparency by capturing fine-grained events, such as code edits, executions, and pauses, enabling instructors to analyze learning trajectories, intervene in a timely manner, and uphold academic integrity. It runs on a scalable Kubernetes-based cloud architecture that ensures security and user isolation through containerization and SSO authentication, and, as a browser-accessible platform, it requires no local installation, simplifying deployment. The system produces a rich stream of interaction events for pedagogical analysis. In a Spring 2025 deployment in an Operating Systems course with approximately 100 students, CodeDive captured nearly 25,000 code snapshots and over 4,000 execution events with low overhead. The collected data powered an interactive dashboard that visualizes each learner's coding timeline, offering actionable insights for timely student support and a deeper understanding of problem-solving strategies. By shifting evaluation from the final artifact to the developmental process, CodeDive offers a practical solution for comprehensively assessing student progress and verifying authentic learning in the LLM era. The successful deployment confirms that CodeDive is a stable and valuable tool for maintaining pedagogical transparency and integrity in modern classrooms.
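The abstract does not specify how the captured interaction events are represented. As a minimal illustrative sketch only, assuming a simple per-event record (the field names and event types here are hypothetical, not CodeDive's actual schema), the edit/execute/pause stream and a per-learner timeline could be modeled as:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class EventType(Enum):
    EDIT = "edit"        # a code-edit snapshot was captured
    EXECUTE = "execute"  # the learner ran their program
    PAUSE = "pause"      # an inactivity gap was detected

@dataclass
class InteractionEvent:
    """One fine-grained event in a learner's coding session (hypothetical schema)."""
    student_id: str
    timestamp: float                 # seconds since session start
    kind: EventType
    snapshot: Optional[str] = None   # code contents, for EDIT events
    exit_code: Optional[int] = None  # run result, for EXECUTE events

def coding_timeline(events: list[InteractionEvent]) -> list[tuple[float, str]]:
    """Order a learner's events chronologically, as a dashboard timeline might."""
    return [(e.timestamp, e.kind.value)
            for e in sorted(events, key=lambda e: e.timestamp)]

events = [
    InteractionEvent("s42", 12.0, EventType.EDIT, snapshot="print('hi')"),
    InteractionEvent("s42", 30.5, EventType.EXECUTE, exit_code=0),
    InteractionEvent("s42", 5.0, EventType.EDIT, snapshot="print("),
]
print(coding_timeline(events))
# -> [(5.0, 'edit'), (12.0, 'edit'), (30.5, 'execute')]
```

A flat record like this is enough to reconstruct the coding timeline the dashboard visualizes; richer analyses (e.g. pause-length distributions) would aggregate over the same stream.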