About me

I am completing a two-year Master’s Degree in Artificial Intelligence and Data Science at the National School of Computer Science and Applied Mathematics of Grenoble, France, and expect to graduate in August 2022. As part of my Master’s thesis, I am a Research Intern at the Mila-Quebec AI Institute under Prof. Yoshua Bengio, where I spend most of my time learning how to generate latent hypergraphs using Generative Flow Networks (GFlowNets). Having spent only a short time (roughly three research internships) in a research environment, I find GFlowNets one of the most exciting subjects I have ever encountered (probably because I am still a beginner in AI research). I believe GFlowNets, still in their infancy, will be the next breakthrough in AI, akin to what Transformers achieved five years ago.

I finished the first year of my Master’s Degree working as a Research Intern at IBM Research Europe, Zurich, under Dr. Matteo Manica and Teodoro Laino. I worked on exploiting the expressivity of context-based language representation models to discover active sites on proteins in an unsupervised manner (ACS Spring 2022, paper). More precisely, we trained Transformer-based models with the self-supervised masked language modelling task on string representations of bio-catalyzed enzymatic reactions, then probed the attention matrix of an embedded enzymatic reaction to extract the active sites of the enzyme involved in the reaction. The results were validated using an overlap-based metric and an energy-based comparison provided by docking simulations.

I hold the equivalent of a Bachelor’s Degree in Computer Engineering from the National Advanced School of Engineering of Yaounde, Cameroon.

Research Interests

While I remain broadly open to tackling any problem with deep learning-based approaches and reasoning, I am most excited about pursuing research in the following areas:

  • Boosting machine-reading models using context-based (attention-based models) and context-free (commonsense knowledge) language representations.
  • Tailoring language generation models to produce text satisfying a metric specified in advance, using Reinforcement Learning.

Beyond these topics, I am interested in Computer Vision, Reinforcement Learning, and Deep Learning more broadly.

Extra Research

Beyond my main research, I am particularly interested in broader questions: life as a whole, existential questions, consciousness, the origins of living things, and metaphysics.