Home Page of Kazuki Irie
Kazuki Irie
Kazuki Irie is a postdoctoral researcher supervised by Prof. Jürgen Schmidhuber at the Swiss AI Lab IDSIA (Università della Svizzera italiana & SUPSI) in Lugano, Switzerland.
Before joining IDSIA, he completed his PhD in Computer Science at RWTH Aachen University, Germany, in May 2020 under the supervision of Prof. Hermann Ney.
Before moving to Aachen, he studied Applied Mathematics at École Centrale Paris and ENS Cachan in France, obtaining a Diplôme d'Ingénieur and a Master of Science.
Google Scholar / GitHub / dblp / OpenReview
E-mail: kazuki at idsia dot ch
Recent Publications
2022
Conference papers
- Kazuki Irie*, Róbert Csordás*, Jürgen Schmidhuber.
The Dual Form of Neural Networks Revisited: Connecting Test Time Predictions to Training Patterns via Spotlights of Attention.
International Conference on Machine Learning (ICML) 2022, Baltimore, MD, USA.
- Kazuki Irie, Imanol Schlag, Róbert Csordás, Jürgen Schmidhuber.
A Modern Self-Referential Weight Matrix That Learns to Modify Itself.
International Conference on Machine Learning (ICML) 2022, Baltimore, MD, USA.
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber.
The Neural Data Router: Adaptive Control Flow in Transformers Improves Systematic Generalization.
International Conference on Learning Representations (ICLR) 2022, Virtual Only.
2021
Conference papers
- Kazuki Irie*, Imanol Schlag*, Róbert Csordás, Jürgen Schmidhuber.
Going Beyond Linear Transformers with Recurrent Fast Weight Programmers.
Conference on Neural Information Processing Systems (NeurIPS) 2021, Virtual Only.
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber.
The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers.
Conference on Empirical Methods in Natural Language Processing (EMNLP) 2021, Punta Cana, Dominican Republic.
- Imanol Schlag*, Kazuki Irie*, Jürgen Schmidhuber.
Linear Transformers Are Secretly Fast Weight Programmers.
International Conference on Machine Learning (ICML) 2021, Virtual Only.
Workshop presentations
- Kazuki Irie, Imanol Schlag, Róbert Csordás, Jürgen Schmidhuber.
A Modern Self-Referential Weight Matrix That Learns to Modify Itself.
NeurIPS 2021 Workshop on Deep Reinforcement Learning (DeepRL), Virtual Only.
- Kazuki Irie, Imanol Schlag, Róbert Csordás, Jürgen Schmidhuber.
Improving Baselines in the Wild.
NeurIPS 2021 Workshop on Distribution Shifts (DistShift), Virtual Only.
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber.
Learning Adaptive Control Flow in Transformers for Improved Systematic Generalization.
NeurIPS 2021 Workshop on Advances in Programming Languages and Neurosymbolic Systems (AIPLANS), Virtual Only.
- Anand Gopalakrishnan, Kazuki Irie, Jürgen Schmidhuber, Sjoerd van Steenkiste.
Unsupervised Learning of Temporal Abstractions using Slot-based Transformers.
NeurIPS 2021 Workshop on Deep Reinforcement Learning (DeepRL) & Workshop on Offline Reinforcement Learning (OfflineRL), Virtual Only.
- Kazuki Irie, Jürgen Schmidhuber.
Training and Generating Neural Networks in Compressed Weight Space.
ICLR 2021 Workshop on Neural Compression, Virtual Only.
Publications in or before 2020 (on Language Modelling & Speech Recognition)