This study aims to develop an algorithm that contributes to participation evaluation by analyzing the activeness and consistency of user responses in an interactive chatbot system. DST-analysis-based AI prompt engineering was proposed to evaluate the activeness of responses through the user's response speed and response length, and to determine whether each response is consistent with past utterances. By integrating these factors, the system increases the reliability of the evaluation and strengthens interaction with the user by generating personalized chatbot utterances. This approach is applicable to psychological counseling and is expected to improve user participation in various fields, including education and healthcare.
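The scoring idea described above, combining response speed, response length, and consistency with past utterances into a single participation score, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the function name, weights, and normalization thresholds (`max_time_s`, `max_len`) are all assumptions chosen for clarity.

```python
def participation_score(response_time_s: float, response_len: int,
                        consistent: bool,
                        max_time_s: float = 30.0, max_len: int = 200) -> float:
    """Combine activeness (speed + length) and consistency into one score in [0, 1].

    All weights and thresholds are illustrative assumptions, not values
    from the study.
    """
    # Faster responses score higher; clamp at max_time_s.
    speed = max(0.0, 1.0 - min(response_time_s, max_time_s) / max_time_s)
    # Longer responses score higher; clamp at max_len.
    length = min(response_len, max_len) / max_len
    # Activeness averages the two normalized signals.
    activeness = 0.5 * speed + 0.5 * length
    # Weight activeness against a binary consistency check
    # (e.g., agreement with past utterances from DST analysis).
    return 0.7 * activeness + 0.3 * (1.0 if consistent else 0.0)
```

In practice the consistency signal would come from comparing the current response against tracked dialogue states rather than a precomputed boolean, but the weighted combination shown here captures how the factors could be integrated.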