Mahdi Nikdan


Hi! I’m a PhD student in Computer Science at the Institute of Science and Technology Austria (ISTA), where I’m fortunate to be supervised by Prof. Dan Alistarh. Before joining ISTA, I earned my B.Sc. in Computer Engineering from Sharif University of Technology in Iran.

My research focuses on making neural network training more efficient, particularly for large language models (LLMs). My current work centers on two main areas:

  • Training in compressed formats — developing methods for directly training sparse or quantized models to achieve end-to-end speedups.
  • Data selection — designing algorithms to select the most “relevant” training samples to improve efficiency and performance.

I’m especially interested in algorithmic approaches to these problems and occasionally explore systems-level solutions to unlock real-world training speedups.

Publications

Below is a list of my most recent publications; an asterisk (*) denotes equal contribution.

  • Nikdan, M., Cohen-Addad, V., Alistarh, D., & Mirrokni, V. (2025). Efficient Data Selection at Scale via Influence Distillation. Preprint. link
  • Ashkboos, S.*, Nikdan, M.*, Tabesh, S., Castro, R. L., Hoefler, T., & Alistarh, D. (2025). HALO: Hadamard-Assisted Lossless Optimization for Efficient Low-Precision LLM Training and Fine-Tuning. arXiv preprint arXiv:2501.02625. link
  • Panferov, A., Chen, J., Tabesh, S., Castro, R. L., Nikdan, M., & Alistarh, D. (2025). QuEST: Stable Training of LLMs with 1-Bit Weights and Activations. arXiv preprint arXiv:2502.05003. link
  • Nicolicioiu, A.*, Iofinova, E.*, Jovanovic, A.*, Kurtic, E.*, Nikdan, M.*, Panferov, A., … & Alistarh, D. (2024). Panza: Design and Analysis of a Fully-Local Personalized Text Writing Assistant. arXiv preprint arXiv:2407.10994. link
  • Nikdan, M.*, Tabesh, S.*, Crnčević, E., & Alistarh, D. (2024). RoSA: Accurate Parameter-Efficient Fine-Tuning via Robust Adaptation. In International Conference on Machine Learning. link
  • Nikdan, M.*, Pegolotti, T.*, Iofinova, E., Kurtic, E., & Alistarh, D. (2023). SparseProp: Efficient Sparse Backpropagation for Faster Training of Neural Networks at the Edge. In International Conference on Machine Learning (pp. 26215-26227). PMLR. (Oral) link
  • Bitarafan, A., Nikdan, M., & Baghshah, M. S. (2020). 3D Image Segmentation With Sparse Annotation by Self-Training and Internal Registration. IEEE Journal of Biomedical and Health Informatics, 25(7), 2665-2672. link