Takashi Ishida
Lecturer, The University of Tokyo
Verified email at ms.k.u-tokyo.ac.jp
Title · Cited by · Year
Learning from complementary labels
T Ishida, G Niu, W Hu, M Sugiyama
Advances in Neural Information Processing Systems, 5639-5649, 2017
Cited by 97
Do We Need Zero Training Loss After Achieving Zero Training Error?
T Ishida, I Yamane, T Sakai, G Niu, M Sugiyama
International Conference on Machine Learning, 2020
Cited by 57
Complementary-label learning for arbitrary losses and models
T Ishida, G Niu, AK Menon, M Sugiyama
International Conference on Machine Learning, 2971-2980, 2019
Cited by 51
Binary classification from positive-confidence data
T Ishida, G Niu, M Sugiyama
Advances in Neural Information Processing Systems, 5917-5928, 2018
Cited by 46
LocalDrop: A Hybrid Regularization for Deep Neural Networks
Z Lu, C Xu, B Du, T Ishida, L Zhang, M Sugiyama
IEEE Transactions on Pattern Analysis & Machine Intelligence, 1-1, 2021
Cited by 7
Machine Learning from Weak Supervision: An Empirical Risk Minimization Approach
M Sugiyama, H Bao, T Ishida, N Lu, T Sakai, G Niu
MIT Press, 2022
Cited by 3
Learning from Noisy Complementary Labels with Robust Loss Functions
H Ishiguro, T Ishida, M Sugiyama
IEICE Transactions on Information and Systems 105 (2), 364-376, 2022
Is the Performance of My Deep Network Too Good to Be True? A Direct Approach to Estimating the Bayes Error in Binary Classification
T Ishida, I Yamane, N Charoenphakdee, G Niu, M Sugiyama
arXiv preprint arXiv:2202.00395, 2022