Math Formula Embedding and Korean High School Math Problem Classification Using LSTM and BERT
Tae-Ho Lee, Bong-Kee Sin
Journal of Korea Multimedia Society
Abstract

Despite recent advances in deep learning, many existing approaches primarily emphasize textual descriptions and fail to fully capture the mathematical content of equations. To address this limitation, this study proposes a hybrid model that integrates a Long Short-Term Memory (LSTM) network with Bidirectional Encoder Representations from Transformers (BERT) for classifying equations into topic domains. In the proposed method, LaTeX-formatted equations are transformed into bigram token sequences for structural analysis by the LSTM encoder, while the surrounding textual context is processed by the BERT encoder to capture semantic meaning. The two encoded representations are concatenated and passed through a feed-forward neural network for final classification. Experiments on a high school mathematics dataset demonstrate that this approach outperforms standalone LSTM and BERT models, achieving 91.96% accuracy, 92.11% precision, 91.97% recall, and an F1-score of 0.92, confirming the effectiveness of jointly leveraging structural and contextual information for mathematical problem classification.
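
The abstract describes the architecture only at a high level. The sketch below, written in PyTorch with the Hugging Face transformers library, shows one plausible way to wire the two branches and the feed-forward head together. The hidden sizes, bigram vocabulary size, class count, and the bert-base-multilingual-cased checkpoint are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of the hybrid LSTM+BERT classifier described in the
    # abstract. All hyperparameters here are assumptions for illustration.
    import torch
    import torch.nn as nn
    from transformers import BertModel

    class HybridFormulaClassifier(nn.Module):
        def __init__(self, bigram_vocab_size=5000, lstm_hidden=256,
                     num_classes=10, bert_name="bert-base-multilingual-cased"):
            super().__init__()
            # LSTM branch: encodes LaTeX equations given as bigram token ids.
            self.bigram_embed = nn.Embedding(bigram_vocab_size, 128)
            self.lstm = nn.LSTM(128, lstm_hidden, batch_first=True)
            # BERT branch: encodes the surrounding problem text.
            self.bert = BertModel.from_pretrained(bert_name)
            # Feed-forward head over the concatenated representations.
            self.classifier = nn.Sequential(
                nn.Linear(lstm_hidden + self.bert.config.hidden_size, 256),
                nn.ReLU(),
                nn.Linear(256, num_classes),
            )

        def forward(self, bigram_ids, input_ids, attention_mask):
            # Structural representation: final LSTM hidden state.
            _, (h_n, _) = self.lstm(self.bigram_embed(bigram_ids))
            formula_vec = h_n[-1]                      # (batch, lstm_hidden)
            # Contextual representation: BERT's pooled [CLS] output.
            text_vec = self.bert(input_ids=input_ids,
                                 attention_mask=attention_mask).pooler_output
            # Concatenate both views and classify into topic domains.
            return self.classifier(torch.cat([formula_vec, text_vec], dim=-1))

Taking the final LSTM hidden state and BERT's pooled output gives each branch a fixed-size summary vector, so the concatenation step works regardless of equation or text length.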

Keywords
Encoder, Embedding, Bigram, Deep learning, Artificial neural network, Token, Context, Transformer
Type
article
IF / Citations
- / 0
Publication Year
2025
