
Z. Atashgahi MSc (Zahra)

PhD Candidate

About Me

Zahra is a second-year Ph.D. candidate in the Data Management & Biometrics (DMB) group at the University of Twente. Her research focuses on deep learning and, in particular, sparse neural networks. She develops algorithms that solve a variety of tasks efficiently in terms of both computational cost and data requirements.

Expertise

Engineering & Materials Science
Costs
Data Storage Equipment
Energy Resources
Feature Extraction
Neural Networks
Neurons
Plasticity
Time Series Analysis

Research

Artificial neural networks (ANNs) have attracted enormous attention in recent years due to their promising results on a wide variety of tasks. However, deep neural networks (DNNs) require large amounts of annotated data and are computationally demanding, which makes them poorly suited to applications with limited computational resources, battery life, or labeled instances. Current approaches to reducing computation and annotation costs focus mostly on inference efficiency while remaining resource-intensive during training. Zahra aims to address these challenges by developing cost-effective neural networks that achieve good performance on a range of complex tasks using minimal computational resources and labeled samples, during both training and inference.
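Much of the work cited below belongs to the dynamic sparse training family, where a network starts sparse and its connectivity is periodically rewired during training rather than pruned after it. As an illustrative sketch only (not a reproduction of any specific method from these papers), a single magnitude-prune-and-random-regrow step on one layer's weight matrix might look like this; the function name and hyperparameter choices here are hypothetical:

```python
import numpy as np

def prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """One dynamic-sparse-training rewiring step: drop the zeta fraction of
    active weights with the smallest magnitude, then regrow the same number
    of connections at random inactive positions, so the total number of
    connections (the sparsity level) stays fixed."""
    rng = np.random.default_rng() if rng is None else rng
    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)
    # Prune: remove the active weights closest to zero.
    drop = active[np.argsort(np.abs(weights.flat[active]))[:n_drop]]
    mask.flat[drop] = False
    weights.flat[drop] = 0.0
    # Regrow: activate the same number of currently inactive connections,
    # initialized with small random values.
    inactive = np.flatnonzero(~mask)
    grow = rng.choice(inactive, size=n_drop, replace=False)
    mask.flat[grow] = True
    weights.flat[grow] = rng.normal(scale=0.01, size=n_drop)
    return weights, mask

# Usage: a roughly 10%-dense 100x100 layer keeps its connection count
# unchanged across one rewiring step.
rng = np.random.default_rng(42)
mask = rng.random((100, 100)) < 0.10
weights = np.where(mask, rng.normal(size=(100, 100)), 0.0)
n_active = int(mask.sum())
weights, mask = prune_and_regrow(weights, mask, zeta=0.3, rng=rng)
assert int(mask.sum()) == n_active  # sparsity level preserved
```

Because only a fixed fraction of weights ever exists, both the forward pass and the gradient updates can in principle be computed over the active connections alone, which is the source of the training-time savings the paragraph above refers to.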

Publications

Recent
Liu, S., Chen, T., Atashgahi, Z., Chen, X., Sokar, G., Mocanu, E., Pechenizkiy, M., Wang, Z., & Mocanu, D. C. (2022). Deep Ensembling with No Overhead for either Training or Testing: The All-Round Blessings of Dynamic Sparsity. In The Tenth International Conference on Learning Representations (ICLR 2022). OpenReview. https://openreview.net/forum?id=RLtqs6pzj1-&noteId=d7CKVDyMGZi
Liu, S., Chen, T., Atashgahi, Z., Chen, X., Sokar, G. A. Z. N., Mocanu, E., Pechenizkiy, M., Wang, Z., & Mocanu, D. C. (2021). FreeTickets: Accurate, Robust and Efficient Deep Ensemble by Training with Dynamic Sparsity. Poster session presented at Sparsity in Neural Networks: Advancing Understanding and Practice 2021, Online.
Liu, S., Chen, T., Chen, X., Atashgahi, Z., Yin, L., Kou, H., Shen, L., Pechenizkiy, M., Wang, Z., & Mocanu, D. C. (2021). Sparse Training via Boosting Pruning Plasticity with Neuroregeneration (Poster). Poster session presented at Sparsity in Neural Networks: Advancing Understanding and Practice 2021, Online.
Kichler, N., Atashgahi, Z., & Mocanu, D. C. (2021). Robustness of sparse MLPs for supervised feature selection (Poster). Poster session presented at Sparsity in Neural Networks: Advancing Understanding and Practice 2021, Online.
Atashgahi, Z., Sokar, G. A. Z. N., van der Lee, T., Mocanu, E., Mocanu, D. C., Veldhuis, R. N. J., & Pechenizkiy, M. (2021). Quick and robust feature selection: the strength of energy-efficient sparse training for autoencoders (Extended Abstract). In BNAIC/BENELEARN 2021: The 33rd Benelux Conference on Artificial Intelligence and the 30th Belgian Dutch Conference on Machine Learning.
Atashgahi, Z., Mocanu, D. C., Veldhuis, R. N. J., & Pechenizkiy, M. (2021). Unsupervised Online Memory-free Change-point Detection using an Ensemble of LSTM-Autoencoder-based Neural Networks (Extended Abstract). Paper presented at 8th ACM Celebration of Women in Computing womENcourage, Prague, Czech Republic.
Atashgahi, Z., Sokar, G. A. Z. N., van der Lee, T., Mocanu, E., Mocanu, D. C., Veldhuis, R. N. J., & Pechenizkiy, M. (Accepted/In press). Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders (Poster). Poster session presented at Sparsity in Neural Networks: Advancing Understanding and Practice 2021, Online.
Liu, S., Chen, T., Chen, X., Atashgahi, Z., Yin, L., Kou, H., Shen, L., Pechenizkiy, M., Wang, Z., & Mocanu, D. C. (2021). Sparse Training via Boosting Pruning Plasticity with Neuroregeneration. In Advances in Neural Information Processing Systems.
Liu, S., van der Lee, T., Yaman, A., Atashgahi, Z., Ferraro, D., Sokar, G. A. Z. N., Pechenizkiy, M., & Mocanu, D. C. (2020). Topological Insights into Sparse Neural Networks. Paper presented at European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML-PKDD 2020. https://arxiv.org/abs/2006.14085

UT Research Information System

Google Scholar

Contact Details

Visiting Address

University of Twente
Faculty of Electrical Engineering, Mathematics and Computer Science
Zilverling (building no. 11)
Hallenweg 19
7522 NH Enschede
The Netherlands


Mailing Address

University of Twente
Faculty of Electrical Engineering, Mathematics and Computer Science
Zilverling
P.O. Box 217
7500 AE Enschede
The Netherlands
