Daniel Kunin
Verified email at stanford.edu - Homepage
Title | Cited by | Year
Pruning neural networks without any data by iteratively conserving synaptic flow
H Tanaka, D Kunin, DL Yamins, S Ganguli
Advances in Neural Information Processing Systems 33, 6377-6389, 2020
Cited by 537 | 2020
Loss landscapes of regularized linear autoencoders
D Kunin, J Bloom, A Goeva, C Seed
International Conference on Machine Learning, 3560-3569, 2019
Cited by 93 | 2019
Neural mechanics: Symmetry and broken conservation laws in deep learning dynamics
D Kunin, J Sagastuy-Brena, S Ganguli, DLK Yamins, H Tanaka
arXiv preprint arXiv:2012.04728, 2020
Cited by 63 | 2020
Beyond the Quadratic Approximation: the Multiscale Structure of Neural Network Loss Landscapes
C Ma, D Kunin, L Wu, L Ying
arXiv preprint arXiv:2204.11326, 2022
Cited by 45* | 2022
Two routes to scalable credit assignment without weight symmetry
D Kunin, A Nayebi, J Sagastuy-Brena, S Ganguli, J Bloom, D Yamins
International Conference on Machine Learning, 5511-5521, 2020
Cited by 34 | 2020
Noether’s Learning Dynamics: Role of Symmetry Breaking in Neural Networks
H Tanaka, D Kunin
Cited by 24* | 2021
The Limiting Dynamics of SGD: Modified Loss, Phase-Space Oscillations, and Anomalous Diffusion
D Kunin, J Sagastuy-Brena, L Gillespie, E Margalit, H Tanaka, S Ganguli, ...
Neural Computation 36 (1), 151-174, 2023
Cited by 16* | 2023
The asymmetric maximum margin bias of quasi-homogeneous neural networks
D Kunin, A Yamamura, C Ma, S Ganguli
arXiv preprint arXiv:2210.03820, 2022
Cited by 15 | 2022
Stochastic collapse: How gradient noise attracts SGD dynamics towards simpler subnetworks
F Chen, D Kunin, A Yamamura, S Ganguli
Advances in Neural Information Processing Systems 36, 2024
Cited by 9 | 2024
A Quasistatic Derivation of Optimization Algorithms' Exploration on Minima Manifolds
C Ma, D Kunin, L Ying
2022