TEF-PLM: A Tabular and Embeddings Fusion Framework using Pretrained Language Model for enhanced electric vehicle energy forecasting
Muhammad Waqar, Yong-Woon Kim, Yung-Cheol Byun
Energy Reports
Abstract

Electric Vehicle Energy Prediction (EVEP) is vital for optimizing smart grid operations and charging infrastructure management. Traditional forecasting approaches primarily rely on historical tabular data, often overlooking the semantic and contextual richness embedded in categorical features. This study introduces TEF-PLM (Tabular Embeddings Fusion using Pretrained Language Model), a novel hybrid framework that combines structured numerical data with semantic embeddings derived from the e5-base-v2 language model to enhance multi-horizon forecasting performance. The framework is evaluated across both Non-Sequential Models (NSM), including MLP, XGB, LGBM, TabNet, and TabTransformer, and Sequential Models (SM), including BiLSTM, BiGRU, Transformer Encoder, and Temporal Fusion Transformer (TFT), under diverse data configurations: embeddings-only, dimensionally reduced representations (Principal Component Analysis (PCA) and AutoEncoder (AE)), and fused feature settings. In the NSM setup, the MLP model using PCA-reduced fused features attained the best results, achieving a 64% reduction in MAE and a 52% reduction in RMSE over the baseline. In the SM setup, the TFT model with AE-reduced fused features delivered the most consistent performance for longer horizons, achieving up to 27% improvement in RMSE, 26% in MAE, and a 14% gain in R² compared to traditional tabular models. These outcomes validate the hypothesis that integrating pretrained semantic embeddings with observed variables significantly improves prediction accuracy, offering a robust and generalizable approach for EVEP tasks.

• Proposes TEF-PLM, a hybrid framework combining PLM-based semantic embeddings with tabular features for improved EV energy prediction.
• Evaluates sequential and non-sequential models with PCA- and AE-reduced embeddings for efficient forecasting.
• Achieves up to 64% MAE and 27% RMSE improvement, validated through statistical significance tests.
• Analyzes training overhead and latency, confirming scalability and robustness for real-world EV systems.
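The reduce-then-fuse step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: random vectors stand in for the e5-base-v2 embeddings of the categorical features, PCA is computed directly via SVD, and all dimensions and variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins (assumed): 768-dim vectors mimic e5-base-v2 sentence
# embeddings of categorical features; 6 numeric columns mimic the
# tabular EV features. A real pipeline would encode actual text.
n_samples, emb_dim, tab_dim, k = 200, 768, 6, 16

embeddings = rng.normal(size=(n_samples, emb_dim))
tabular = rng.normal(size=(n_samples, tab_dim))

# PCA via SVD on the centered embedding matrix, keeping k components.
centered = embeddings - embeddings.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:k].T          # shape (n_samples, k)

# Fusion: concatenate the reduced embeddings with the tabular
# features, yielding the input an NSM such as an MLP would consume.
fused = np.concatenate([reduced, tabular], axis=1)
print(fused.shape)  # (200, 22)
```

The AE-reduced variant would replace the SVD step with an autoencoder's bottleneck activations; the fusion by concatenation is unchanged.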

Keywords
Transformer, Categorical variable, Autoencoder, Language model, Encoder, Fusion, Grid, Feature (linguistics), Mean squared error
Type
article
IF / Citations
- / 1
Publication year
2025
