Xu Qing
Senior Research Engineer, Institute for Infocomm Research (I2R) - A*STAR
Verified email at i2r.a-star.edu.sg
Title
Cited by
Year
KDnet-RUL: A knowledge distillation framework to compress deep neural networks for machine remaining useful life prediction
Q Xu, Z Chen, K Wu, C Wang, M Wu, X Li
IEEE Transactions on Industrial Electronics 69 (2), 2022-2032, 2021
59
2021
Contrastive adversarial knowledge distillation for deep model compression in time-series regression tasks
Q Xu, Z Chen, M Ragab, C Wang, M Wu, X Li
Neurocomputing 485, 242-251, 2022
26
2022
A hybrid ensemble deep learning approach for early prediction of battery remaining useful life
Q Xu, M Wu, E Khoo, Z Chen, X Li
IEEE/CAA Journal of Automatica Sinica 10 (1), 177-187, 2023
15
2023
Automatic detection of retinopathy with optical coherence tomography images via a semi-supervised deep learning method
Y Luo, Q Xu, R Jin, M Wu, L Liu
Biomedical Optics Express 12 (5), 2684-2702, 2021
11
2021
Cross-domain retinopathy classification with optical coherence tomography images via a novel deep domain adaptation method
Y Luo, Q Xu, Y Hou, L Liu, M Wu
Journal of Biophotonics 14 (8), e202100096, 2021
3
2021
Reinforced Knowledge Distillation for Time Series Regression
Q Xu, K Wu, M Wu, K Mao, X Li, Z Chen
IEEE Transactions on Artificial Intelligence, 2023
1
2023
Contrastive Distillation with Regularized Knowledge for Deep Model Compression on Sensor-based Human Activity Recognition
Q Xu, M Wu, X Li, K Mao, Z Chen
IEEE Transactions on Industrial Cyber-Physical Systems, 2023
1
2023
Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data
Q Xu, M Wu, X Li, K Mao, Z Chen
arXiv preprint arXiv:2307.03347, 2023
1
2023
Improve Knowledge Distillation via Label Revision and Data Selection
W Lan, Y Cheung, Q Xu, B Liu, Z Hu, M Li, Z Chen
arXiv preprint arXiv:2404.03693, 2024
2024
Articles 1–9