Yichun Yin
Huawei Noah's Ark Lab
Verified email at huawei.com
Title · Cited by · Year
TinyBERT: Distilling BERT for Natural Language Understanding
X Jiao, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
Conference on Empirical Methods in Natural Language Processing (EMNLP-2020 …, 2019
313 · 2019
Unsupervised word and dependency path embeddings for aspect term extraction
Y Yin, F Wei, L Dong, K Xu, M Zhang, M Zhou
IJCAI 2016, 2016
130 · 2016
Document-level multi-aspect sentiment classification as machine comprehension
Y Yin, Y Song, M Zhang
EMNLP 2017, 2044-2054, 2017
47 · 2017
TernaryBERT: Distillation-aware Ultra-low Bit BERT
W Zhang, L Hou, Y Yin, L Shang, X Chen, X Jiang, Q Liu
EMNLP 2020, 2020
18 · 2020
NNEMBs at SemEval-2017 Task 4: Neural Twitter sentiment classification: a simple ensemble method with different embeddings
Y Yin, Y Song, M Zhang
Proceedings of the 11th International Workshop on Semantic Evaluation …, 2017
15 · 2017
Splusplus: a feature-rich two-stage classifier for sentiment analysis of tweets
L Dong, F Wei, Y Yin, M Zhou, K Xu
Proceedings of the 9th International Workshop on Semantic Evaluation …, 2015
12 · 2015
Socialized Word Embeddings
Z Zeng, Y Yin, Y Song, M Zhang
IJCAI, 3915-3921, 2017
11 · 2017
Dialog State Tracking with Reinforced Data Augmentation
Y Yin, L Shang, X Jiang, X Chen, Q Liu
AAAI 2020, 2019
8 · 2019
PoD: Positional Dependency-Based Word Embedding for Aspect Term Extraction
Y Yin, C Wang, M Zhang
COLING 2020, 2019
3 · 2019
Improving Task-Agnostic BERT Distillation with Layer Mapping Search
X Jiao, H Chang, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
Neurocomputing 2021, 2020
2 · 2020
Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation
C Chen, Y Yin, L Shang, Z Wang, X Jiang, X Chen, Q Liu
ICANN 2021, 2021
1 · 2021
LightMBERT: A Simple Yet Effective Method for Multilingual BERT Distillation
X Jiao, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
arXiv preprint arXiv:2103.06418, 2021
2021
AutoTinyBERT: Automatic Hyper-parameter Optimization for Efficient Pre-trained Language Models
Y Yin, C Chen, L Shang, X Jiang, X Chen, Q Liu
ACL 2021, 2021
2021
The Solution of Huawei Cloud & Noah’s Ark Lab to the NLPCC-2020 Challenge: Light Pre-Training Chinese Language Model for NLP Task
Y Zhang, J Yu, K Wang, Y Yin, C Chen, Q Liu
CCF International Conference on Natural Language Processing and Chinese …, 2020
2020
More Chinese women needed to hold up half the computing sky
M Zhang, Y Yin
Proceedings of the ACM Turing Celebration Conference-China, 1-4, 2019
2019
A Neural Network Machine Reading Comprehension Model Based on Data Reconstruction and Rich Features
Y Yin, M Zhang
Journal of Chinese Information Processing (中文信息学报) 32 (11), 112-116, 2018
2018