Keivan Alizadeh-Vahid
Verified email at uw.edu - Homepage
Title | Cited by | Year
Recurrent Poisson factorization for temporal recommendation
SA Hosseini, K Alizadeh, A Khodadadi, A Arabzadeh, M Farajtabar, H Zha, ...
Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge …, 2017
Cited by 67 | 2017
Butterfly Transform: An Efficient FFT Based Neural Architecture Design
K Alizadeh-Vahid, A Prabhu, A Farhadi, M Rastegari
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
Cited by 44* | 2020
DKM: Differentiable k-means clustering layer for neural network compression
M Cho, KA Vahid, S Adya, M Rastegari
arXiv preprint arXiv:2108.12659, 2021
Cited by 27 | 2021
LLM in a flash: Efficient large language model inference with limited memory
K Alizadeh, I Mirzadeh, D Belenko, K Khatamifard, M Cho, CC Del Mundo, ...
arXiv preprint arXiv:2312.11514, 2023
Cited by 15 | 2023
ReLU strikes back: Exploiting activation sparsity in large language models
I Mirzadeh, K Alizadeh, S Mehta, CC Del Mundo, O Tuzel, G Samei, ...
arXiv preprint arXiv:2310.04564, 2023
Cited by 11 | 2023
Are We Overfitting to Experimental Setups in Recognition?
M Wallingford, A Kusupati, K Alizadeh-Vahid, A Walsman, A Kembhavi, ...
arXiv preprint arXiv:2007.02519, 2020
Cited by 11* | 2020
FLUID: A unified evaluation framework for flexible sequential data
M Wallingford, A Kusupati, K Alizadeh-Vahid, A Walsman, A Kembhavi, ...
arXiv preprint arXiv:2007.02519, 2020
Cited by 2 | 2020
eDKM: An Efficient and Accurate Train-time Weight Clustering for Large Language Models
M Cho, KA Vahid, Q Fu, S Adya, CC Del Mundo, M Rastegari, D Naik, ...
IEEE Computer Architecture Letters, 2024
Cited by 1 | 2024