Welcome...

dr.ir. D.C. Mocanu (Decebal)

Assistant Professor

About Me

Vacancies (2022):

  • We have an open postdoctoral researcher position on reinforcement learning, evolutionary algorithms, and sparse training, with humans and robots in the loop :). Does it sound challenging enough? If so, please read more and apply by 14 July here: link

News (2022):

  • 13 July - We are organising the second edition of the "Sparsity in Neural Networks: Advancing Understanding and Practice" workshop - SNN Workshop 2022 (https://www.sparseneural.net/)
  • 14 June - One paper on sparse training and continual learning accepted at ECMLPKDD 2022 (link)
  • 21 May - I am doing a research visit to the group of Dr. Matthew Taylor at the University of Alberta
  • 16 May - One sparse training paper accepted at UAI 2022 (link)
  • 10 May - Our paper "Dynamic Sparse Training for Deep Reinforcement Learning" received best paper award at ALA 2022, collocated with AAMAS 2022 (link)
  • 25 April - We had the pleasure of hosting Utku Evci, Research Engineer at Google Brain Montreal, to give a very engaging in-person talk (link)
  • 20 April - One paper on sparse training and deep reinforcement learning accepted at IJCAI-ECAI 2022 (link)
  • 15 April - Our tutorial "Sparse Neural Networks Training" has been accepted at ECMLPKDD 2022 (link)
  • 6 April - Shiwei Liu defended his outstanding PhD thesis (link) and graduated cum laude
  • 28 Jan - Two sparse training papers accepted at ICLR 2022 (Links: 1, 2)

 

Narrative CV:

Decebal Mocanu is Assistant Professor in Artificial Intelligence and Machine Learning within the DMB group, Faculty of Electrical Engineering, Mathematics, and Computer Science at the University of Twente; and Guest Assistant Professor within the Data Mining group, Department of Mathematics and Computer Science at the Eindhoven University of Technology (TU/e). 

From September 2017 until February 2020, Decebal was Assistant Professor in Artificial Intelligence and Machine Learning within the Data Mining group, Department of Mathematics and Computer Science, TU/e and a member of TU/e Young Academy of Engineering. In 2017, he received his PhD in Artificial Intelligence and Network Science from TU/e. During his doctoral studies, Decebal undertook three research visits at the University of Pennsylvania (2014), Julius Maximilians University of Wurzburg (2015), and the University of Texas at Austin (2016).

Prior to this, in 2013, he obtained his MSc in Artificial Intelligence from Maastricht University. During his master studies, Decebal also worked as a part-time software developer at We Focus BV in Maastricht. In the last year of his master studies, he worked as an intern at Philips Research in Eindhoven, where he prepared his internship and master thesis projects. Decebal obtained his Licensed Engineer degree from University Politehnica of Bucharest. While in Bucharest, between 2001 and 2010, Decebal started MDC Artdesign SRL (a software house specialised in web development), worked as a computer laboratory assistant at the University Nicolae Titulescu, and as a software engineer at Namedrive LLC.

 

Research

Decebal and his co-authors laid the groundwork (connected papers) for sparse training in deep learning (training sparse artificial neural networks from scratch), introducing both static and dynamic sparsity. Besides the expected computational benefits, sparse training in many cases achieves better generalisation than dense training.

  • Static sparsity in A topological insight into restricted Boltzmann machines, Machine Learning (2016), preprint https://arxiv.org/abs/1604.05978
  • Dynamic sparsity in Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science, Nature Communications (2018), preprint https://arxiv.org/abs/1707.04780 (2017).
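The dynamic-sparsity idea above (from the Nature Communications paper) alternates normal training with a topology update: prune the weakest active connections, then regrow the same number elsewhere, so the network stays equally sparse while its connectivity evolves. The following is a minimal NumPy sketch of one such prune-and-regrow step, not the authors' exact implementation; the function name, the `zeta` drop fraction, and the zero-initialisation of regrown weights are illustrative choices.

```python
import numpy as np

def prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """One dynamic-sparse-training topology update on a weight matrix.

    Drops the fraction `zeta` of the smallest-magnitude active weights,
    then regrows the same number of connections at randomly chosen
    inactive positions, so the overall sparsity level is preserved.
    """
    rng = np.random.default_rng() if rng is None else rng
    active = np.flatnonzero(mask)          # flat indices of live weights
    n_drop = int(zeta * active.size)
    if n_drop == 0:
        return weights, mask
    # Drop the weakest active connections (smallest absolute value).
    magnitudes = np.abs(weights.ravel()[active])
    drop = active[np.argsort(magnitudes)[:n_drop]]
    mask.ravel()[drop] = False
    weights.ravel()[drop] = 0.0
    # Regrow the same number of connections at random empty positions
    # (initialised to zero here; a small random init is also common).
    empty = np.flatnonzero(~mask.ravel())
    grow = rng.choice(empty, size=n_drop, replace=False)
    mask.ravel()[grow] = True
    return weights, mask

# Illustrative usage: a small layer kept at ~20% density.
rng = np.random.default_rng(0)
mask = rng.random((8, 8)) < 0.2
weights = rng.standard_normal((8, 8)) * mask
weights, mask = prune_and_regrow(weights, mask, zeta=0.3, rng=rng)
```

In a full training loop this update would run every few epochs between gradient steps; because the drop count equals the grow count, the parameter budget stays fixed throughout training.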

 

Decebal's short-term research interest is to conceive scalable deep artificial neural network models, and their corresponding learning algorithms, using principles from network science, evolutionary computing, optimisation, and neuroscience. Such models should have sparse, evolving connectivity, make use of previous knowledge, and generalise strongly enough to learn and reason from few examples in a continuous, adaptive manner.

Most science carried out throughout human history follows the traditional reductionist paradigm, which, although very successful, has its limitations. Aristotle wrote in Metaphysics that "the whole is more than the sum of its parts". Inspired by this, in the long term Decebal would like to follow the complementary complex-systems paradigm and study the synergy between artificial intelligence, neuroscience, and network science for the benefit of science and society.

Github: http://github.com/dcmocanu/

UT Research Information System

Google Scholar Link

Education

Current PhD students:

Graduated PhD students:

  • Shiwei Liu (cum laude), Sparse Neural Network Training with In-Time Over-Parameterization (graduated 2022, Postdoctoral Researcher - TU Eindhoven)
  • Anil Yaman, Evolution of biologically inspired learning in artificial neural networks (graduated 2019, Assistant Professor - VU Amsterdam)

PDEng students supervised (graduated)

  • Pranav Bhatnagar, Automatic Microscope Alignment via Machine Learning (at Thermo Fisher Scientific, Eindhoven), September 2019
  • Eleftherios Koulierakis, Detection of outbreak of infectious diseases: a data science perspective (at GDD, Eindhoven), July 2018

MSc students supervised (graduated)

  • Mickey Beurskens, Pass the Ball! - Learning Strategic Behavioural Patterns for Distributed Multi Agent Robotic Systems, November 2020
  • Sonali Fotedar (cum laude), Information Extraction on Free-Text Sleep Narratives using Natural Language Processing (at Philips Research, Eindhoven), November 2020
  • Selima Curci (cum laude), Large Scale Sparse Neural Networks, October 2020
  • Manuel Muñoz Sánchez (cum laude), Domain Knowledge-based Drivable Space Estimation (at TNO, Helmond), September 2020
  • Chuan-Bin Huang, Novel Evolutionary Algorithms for Robust Training of Very Small Multilayer Perceptron Models, August 2020
  • Jeroen Brouns, Bridging the Domain-Gap in Computer Vision Tasks (at Philips Research, Eindhoven), December 2019
  • Daniel Ballesteros Castilla, Deep Reinforcement Learning for Intraday Power Trading (at ENGIE Global Markets, Brussels), December 2019
  • Mauro Comi (cum laude), Deep Reinforcement Learning for Light Transport Path Guiding, November 2019
  • Saurabh Bahulikar, Unsupervised Learning for Early Detection of Merchant Risk in Payments (at Payvision, Amsterdam), November 2019
  • Thomas Hagebols, Block-sparse evolutionary training using weight momentum evolution: training methods for hardware efficient sparse neural networks (at Philips Research, Eindhoven), March 2019
  • Bram Linders, Prediction and reduction of MRP nervousness by parameterization from a cost perspective (2nd supervisor, at Prodrive Technologies), February 2019
  • Joost Pieterse (cum laude), Evolving sparse neural networks using cosine similarity, July 2018

 

Conference tutorials

  • D.C. Mocanu, E. Mocanu, T. Pinto, Z. Vale, Scalable Deep Learning: How far is one billion neurons?, European Conference on Artificial Intelligence (ECAI 2020), Materials
  • D.C. Mocanu, E. Mocanu, T. Pinto, Z. Vale, Scalable Deep Learning: How far is one billion neurons?, International Joint Conference on Artificial Intelligence (IJCAI 2020), Materials
  • D.C. Mocanu, E. Mocanu, P.H. Nguyen, M. Gibescu, Z. Vale, D. Ernst, Scalable Deep Learning: from theory to practice (T11), International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2019), Link, Materials
  • D.C. Mocanu, E. Mocanu, Z. Vale, D. Ernst, Scalable Deep Learning: from theory to practice (T24), International Joint Conference on Artificial Intelligence (IJCAI 2019), Link, Materials
  • E. Mocanu, D.C. Mocanu, Scalable Deep Learning: from theory to practice, European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECMLPKDD 2019), Link, Materials

Affiliated Study Programmes

Bachelor

Master

Courses Academic Year 2021/2022

Courses for the current academic year are added once they are finalised in the Osiris system, so the list may not yet be complete for the whole academic year.
 

Courses Academic Year 2020/2021

Contact Details

Visiting Address

University of Twente
Faculty of Electrical Engineering, Mathematics and Computer Science
Zilverling (building no. 11)
Hallenweg 19
7522 NH Enschede
The Netherlands

Navigate to location

Mailing Address

University of Twente
Faculty of Electrical Engineering, Mathematics and Computer Science
Zilverling
P.O. Box 217
7500 AE Enschede
The Netherlands