Hi there! I’m Emanuele, a first-year MSc student in Data Science and Scientific Computing (Artificial Intelligence and Machine Learning track), and a volunteer research assistant in Artificial Intelligence and Cyber-Physical Systems with the Bortolussi Group at the University of Trieste (Italy).
During my studies (previously a BSc in Physics) I focused mainly on statistical and computational methods for data analysis in experimental physics and signal processing. More recently, I have moved toward artificial intelligence more broadly: (statistical) machine learning and the analysis of massive datasets, centred on decision-making problems, planning and control, and the robustness of deep learning systems to adversarial attacks, from both a theoretical and an applied point of view.
My research interests include, to varying extents:
- Core methods for optimization on high-dimensional manifolds under minimal assumptions (e.g. SGD, Monte Carlo-based methods, tempering, genetic-programming-based optimization);
- Artificial neural computation and deep learning (e.g. neuron models, CNNs, RNNs, autoencoders/VAEs, GANs, neural ODEs, novel neural architectures);
- Unsupervised learning systems and approaches, particularly those robust to adversarial input and/or endowed with common sense;
- Neural information processing systems strongly inspired by brain anatomy, physiology and/or neuroscience (e.g. Hebbian approaches, synaptic plasticity, neuromodulation, free-energy principle);
- Open-ended artificial systems and algorithms, evolutionary methods and neuroevolution (e.g. NEAT-based approaches, novelty search, coevolution);
- Computational methods to reverse-engineer, model and artificially re-implement the foundations of human cognition, action and behaviour;
- Statistical and information-theoretical foundations, and related methods, of human (and, more generally, animal) intelligence;
- Statistical network models for inference (e.g. probabilistic graphical models, Bayesian networks, energy-based NNs, Boltzmann machines, belief networks) or simulation (e.g. diffusive processes on networks, computational epidemiology);
- Bayesian statistics and Bayesian ML methods - shallow and deep - including (approximate) variational inference;
- Prediction, decision-making and control - through reinforcement (including Deep RL), logic and hybrid approaches - under uncertainty and/or in complex environments;
- Scientific high-performance computing and, more generally, ways to harness computational power and efficiency from dedicated hardware architectures (e.g. GPGPU computing, neuro-/physio-morphic hardware);
- Bidirectional interplay between artificial intelligence and physics (e.g. quantum machine learning, machine learning in high-energy physics, statistical mechanics and/or complex systems analysis of deep learning).
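To give a concrete flavour of the first item above (SGD as a core optimization method), here is a minimal NumPy sketch: mini-batch stochastic gradient descent on a synthetic least-squares problem. The data, step size and batch size are invented purely for illustration.

```python
import numpy as np

# Synthetic linear-regression problem: recover w_true from noisy data.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

# Mini-batch SGD on the mean-squared-error loss ||Xw - y||^2 / n.
w = np.zeros(d)
lr, batch, epochs = 0.05, 20, 200
for _ in range(epochs):
    idx = rng.permutation(n)          # reshuffle each epoch
    for start in range(0, n, batch):
        b = idx[start:start + batch]
        grad = 2.0 * X[b].T @ (X[b] @ w - y[b]) / len(b)
        w -= lr * grad

# After training, w should sit close to w_true.
print(np.allclose(w, w_true, atol=0.05))
```

The same loop structure carries over to deep learning, where the closed-form gradient above is replaced by automatic differentiation (e.g. in PyTorch or JAX).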
When not busy with any of the above, I usually rant about optimality in policy-making, electoral systems and politics; energy policy; and open science, software and knowledge.
My (computational) toolbox
In trying to solve whatever problem I have at hand, I will happily use any combination of the following, provided they run on Linux.
- Python & its rich ecosystem (NumPy, SciPy, pandas, scikit-learn, PyTorch, JAX, Keras, modern TensorFlow, Pyro/NumPyro, …);
- Modern C++ (C++11 and later) and its libraries (the STL, TorchScript, Armadillo, Boost, ROOT, …);
- The Julia programming language and its ecosystem;
- The R programming language, tidyverse included (but not limited to);
- Integrated computational languages/environments, e.g. MATLAB/Simulink and Wolfram Mathematica/SystemModeler;
- Bash/Zsh as a scripting language for system automation and orchestration;
- Rust, Fortran or F# if I feel brave enough.