Mohammad Pezeshki
Mila
Verified email at umontreal.ca
Title · Cited by · Year
Theano: A Python framework for fast computation of mathematical expressions
R Al-Rfou, G Alain, A Almahairi, C Angermueller, D Bahdanau, N Ballas, ...
arXiv preprint arXiv:1605.02688, 2016
989 · 2016
Towards end-to-end speech recognition with deep convolutional neural networks
Y Zhang, M Pezeshki, P Brakel, S Zhang, C Laurent, Y Bengio, A Courville
arXiv preprint arXiv:1701.02720, 2017
498 · 2017
Zoneout: Regularizing RNNs by randomly preserving hidden activations
D Krueger, T Maharaj, J Kramár, M Pezeshki, N Ballas, NR Ke, A Goyal, ...
arXiv preprint arXiv:1606.01305, 2016
387 · 2016
Gradient starvation: A learning proclivity in neural networks
M Pezeshki, O Kaba, Y Bengio, AC Courville, D Precup, G Lajoie
Advances in Neural Information Processing Systems 34, 1256-1272, 2021
292 · 2021
Theano: A Python framework for fast computation of mathematical expressions
TTD Team, R Al-Rfou, G Alain, A Almahairi, C Angermueller, D Bahdanau, ...
arXiv preprint arXiv:1605.02688, 2016
219 · 2016
Negative momentum for improved game dynamics
G Gidel, RA Hemmat, M Pezeshki, R Le Priol, G Huang, S Lacoste-Julien, ...
The 22nd International Conference on Artificial Intelligence and Statistics …, 2019
207 · 2019
Simple data balancing achieves competitive worst-group-accuracy
BY Idrissi, M Arjovsky, M Pezeshki, D Lopez-Paz
Conference on Causal Learning and Reasoning, 336-351, 2022
168 · 2022
Deconstructing the Ladder Network Architecture
M Pezeshki, L Fan, P Brakel, A Courville, Y Bengio
arXiv preprint arXiv:1511.06430, 2015
138 · 2015
Theano: A Python framework for fast computation of mathematical expressions. arXiv
R Al-Rfou, G Alain, A Almahairi, C Angermueller, D Bahdanau, N Ballas, ...
arXiv preprint arXiv:1605.02688, 2016
51 · 2016
On the learning dynamics of deep neural networks
R Tachet, M Pezeshki, S Shabanian, A Courville, Y Bengio
arXiv preprint arXiv:1809.06848, 2018
43* · 2018
Multi-scale Feature Learning Dynamics: Insights for Double Descent
M Pezeshki, A Mitra, Y Bengio, G Lajoie
https://arxiv.org/pdf/2112.03215.pdf, 2021
24 · 2021
Comparison three methods of clustering: K-means, spectral clustering and hierarchical clustering
K Kowsari, T Borsche, A Ulbig, G Andersson, AM Saxe, JL McClelland, ...
arXiv Preprint, 2013
16* · 2013
Sequence modeling using gated recurrent neural networks
M Pezeshki
arXiv preprint arXiv:1501.00299, 2015
15 · 2015
Discovering environments with XRM
M Pezeshki, D Bouchacourt, M Ibrahim, N Ballas, P Vincent, D Lopez-Paz
arXiv preprint arXiv:2309.16748, 2023
12 · 2023
Predicting grokking long before it happens: A look into the loss landscape of models which grok
P Notsawo Jr, H Zhou, M Pezeshki, I Rish, G Dumas
arXiv preprint arXiv:2306.13253, 2023
12 · 2023
Deep belief networks for image denoising
MA Keyvanrad, M Pezeshki, MA Homayounpour
arXiv preprint arXiv:1312.6158, 2013
12 · 2013
Feedback-guided Data Synthesis for Imbalanced Classification
R Askari Hemmat, M Pezeshki, F Bordes, M Drozdzal, A Romero-Soriano
arXiv preprint arXiv:2310.00158, 2023
8* · 2023
Compositional Risk Minimization
D Mahajan, M Pezeshki, I Mitliagkas, K Ahuja, P Vincent
arXiv preprint arXiv:2410.06303, 2024
2024
Dynamics of learning and generalization in neural networks
M Pezeshki
2022
Deliberate Practice with Synthetic Data
R Askari-Hemmat, M Pezeshki, P Astolfi, M Hall, F Bordes, J Verbeek, ...
Adaptive Foundation Models: Evolving AI for Personalized and Efficient Learning