Renbo Tu
Verified email at mail.utoronto.ca
Title
Cited by
Year
Federated hyperparameter tuning: Challenges, baselines, and connections to weight-sharing
M Khodak, R Tu, T Li, L Li, MFF Balcan, V Smith, A Talwalkar
Advances in Neural Information Processing Systems 34, 19184-19197, 2021
70 · 2021
NAS-Bench-Suite-Zero: Accelerating research on zero cost proxies
A Krishnakumar, C White, A Zela, R Tu, M Safari, F Hutter
Advances in Neural Information Processing Systems 35, 28037-28051, 2022
27 · 2022
A deeper look at zero-cost proxies for lightweight NAS
C White, M Khodak, R Tu, S Shah, S Bubeck, D Dey
ICLR Blog Track, 2022
23 · 2022
NAS-bench-360: Benchmarking neural architecture search on diverse tasks
R Tu, N Roberts, M Khodak, J Shen, F Sala, A Talwalkar
Advances in Neural Information Processing Systems 35, 12380-12394, 2022
21 · 2022
NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search
R Tu, M Khodak, N Roberts, A Talwalkar
arXiv preprint arXiv:2110.05668, 2021
10 · 2021
Speeding up Fourier Neural Operators via Mixed Precision
C White, R Tu, J Kossaifi, G Pekhimenko, K Azizzadenesheli, ...
arXiv preprint arXiv:2307.15034, 2023
6 · 2023
AutoML for Climate Change: A Call to Action
R Tu, N Roberts, V Prasad, S Nayak, P Jain, F Sala, G Ramakrishnan, ...
arXiv preprint arXiv:2210.03324, 2022
6 · 2022
Towards deeper generative architectures for GANs using dense connections
S Tripathi, R Tu
arXiv preprint arXiv:1804.11031, 2018
1 · 2018
Proteus: Preserving Model Confidentiality during Graph Optimizations
Y Gao, M Haghifam, C Giannoula, R Tu, G Pekhimenko, N Vijaykumar
arXiv preprint arXiv:2404.12512, 2024
2024
Guaranteed Approximation Bounds for Mixed-Precision Neural Operators
R Tu, C White, J Kossaifi, B Bonev, G Pekhimenko, K Azizzadenesheli, ...
The Twelfth International Conference on Learning Representations (ICLR), 2024
2024
Articles 1–10