Kazuki Irie
Note (October 2023): I have moved to the Center for Brain Science, Harvard University.
You can reach me at kirie AT g.harvard.edu.
This website is no longer updated. For the most recent information, please see my profiles elsewhere on the web. Thank you!
Kazuki Irie was a postdoc supervised by Prof. Jürgen Schmidhuber at the Swiss AI Lab IDSIA (University of Lugano), Switzerland, from May 2020 to September 2023.
Before joining IDSIA, he completed his PhD in Computer Science (thesis: "Advancing neural language modeling in automatic speech recognition") at RWTH Aachen University, Germany, in May 2020, under the supervision of Prof. Hermann Ney.
Prior to Aachen, he studied Applied Mathematics at École Centrale Paris and ENS Cachan, France, and obtained his Diplôme d'Ingénieur and Master of Science degrees.
Google Scholar / GitHub / dblp / OpenReview / arXiv
E-mail: kazuki AT idsia DOT ch
Recent Publications
2023
Preprints
- Róbert Csordás, Piotr Piękos, Kazuki Irie, Jürgen Schmidhuber.
SwitchHead: Accelerating Transformers with Mixture-of-Experts Attention.
Preprint, December 2023.
- Kazuki Irie, Róbert Csordás, Jürgen Schmidhuber.
Automating Continual Learning.
Preprint, November 2023.
- Kazuki Irie, Anand Gopalakrishnan, Jürgen Schmidhuber.
Exploring the Promise and Limits of Real-Time Recurrent Learning.
Preprint, May 2023.
- Mingchen Zhuge*, Haozhe Liu*, Francesco Faccio*, Dylan R. Ashley*, et al.
Mindstorms in Natural Language-Based Societies of Mind.
Preprint, May 2023.
- Kazuki Irie*, Róbert Csordás*, Jürgen Schmidhuber.
Topological Neural Discrete Representation Learning à la Kohonen.
Preprint, February 2023.
Conference papers
- Kazuki Irie, Róbert Csordás, Jürgen Schmidhuber.
Practical Computational Power of Linear Transformers and Their Recurrent and Self-Referential Extensions.
Short paper, Conference on Empirical Methods in Natural Language Processing (EMNLP), Sentosa, Singapore, December 2023.
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber.
Approximating Two-Layer Feedforward Networks for Efficient Transformers.
Findings of the Conference on Empirical Methods in Natural Language Processing (EMNLP Findings), Sentosa, Singapore, December 2023.
- Aleksandar Stanić*, Anand Gopalakrishnan*, Kazuki Irie, Jürgen Schmidhuber.
Contrastive Training of Complex-Valued Autoencoders for Object Discovery.
Conference on Neural Information Processing Systems (NeurIPS), New Orleans, LA, USA, December 2023.
- Kazuki Irie, Jürgen Schmidhuber.
Images as Weight Matrices: Sequential Image Generation Through Synaptic Learning Rules.
International Conference on Learning Representations (ICLR), Kigali, Rwanda, May 2023.
Workshop papers
- Kazuki Irie*, Róbert Csordás*, Jürgen Schmidhuber.
Topological Neural Discrete Representation Learning à la Kohonen.
ICML 2023 Workshop on Sampling and Optimization in Discrete Space, Honolulu, HI, USA, July 2023.
- Kazuki Irie, Jürgen Schmidhuber.
Accelerating Neural Self-Improvement via Bootstrapping.
ICLR 2023 Workshop on Mathematical and Empirical Understanding of Foundation Models (ME-FoMo), Kigali, Rwanda, May 2023.
2022
Conference papers
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber.
CTL++: Evaluating Generalization on Never-Seen Compositional Patterns of Known Functions, and Compatibility of Neural Representations.
Short paper, Conference on Empirical Methods in Natural Language Processing (EMNLP), Abu Dhabi, UAE, December 2022.
- Kazuki Irie, Francesco Faccio, Jürgen Schmidhuber.
Neural Differential Equations for Learning to Program Neural Nets Through Continuous Learning Rules.
Conference on Neural Information Processing Systems (NeurIPS), New Orleans, LA, USA, November 2022.
- Kazuki Irie*, Róbert Csordás*, Jürgen Schmidhuber.
The Dual Form of Neural Networks Revisited: Connecting Test Time Predictions to Training Patterns via Spotlights of Attention.
International Conference on Machine Learning (ICML), Baltimore, MD, USA, July 2022.
- Kazuki Irie, Imanol Schlag, Róbert Csordás, Jürgen Schmidhuber.
A Modern Self-Referential Weight Matrix That Learns to Modify Itself.
International Conference on Machine Learning (ICML), Baltimore, MD, USA, July 2022.
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber.
The Neural Data Router: Adaptive Control Flow in Transformers Improves Systematic Generalization.
International Conference on Learning Representations (ICLR), Virtual Only, April 2022.
2021
Conference papers
- Kazuki Irie*, Imanol Schlag*, Róbert Csordás, Jürgen Schmidhuber.
Going Beyond Linear Transformers with Recurrent Fast Weight Programmers.
Conference on Neural Information Processing Systems (NeurIPS), Virtual Only, December 2021.
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber.
The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers.
Conference on Empirical Methods in Natural Language Processing (EMNLP), Punta Cana, Dominican Republic, November 2021.
- Imanol Schlag*, Kazuki Irie*, Jürgen Schmidhuber.
Linear Transformers Are Secretly Fast Weight Programmers.
International Conference on Machine Learning (ICML), Virtual Only, July 2021.
Workshop papers
- Kazuki Irie, Imanol Schlag, Róbert Csordás, Jürgen Schmidhuber.
A Modern Self-Referential Weight Matrix That Learns to Modify Itself.
NeurIPS 2021 Workshop on Deep Reinforcement Learning (DeepRL), Virtual Only, December 2021.
- Kazuki Irie, Imanol Schlag, Róbert Csordás, Jürgen Schmidhuber.
Improving Baselines in the Wild.
NeurIPS 2021 Workshop on Distribution Shifts (DistShift), Virtual Only, December 2021.
- Róbert Csordás, Kazuki Irie, Jürgen Schmidhuber.
Learning Adaptive Control Flow in Transformers for Improved Systematic Generalization.
NeurIPS 2021 Workshop on Advances in Programming Languages and Neurosymbolic Systems (AIPLANS), Virtual Only, December 2021.
- Anand Gopalakrishnan, Kazuki Irie, Jürgen Schmidhuber, Sjoerd van Steenkiste.
Unsupervised Learning of Temporal Abstractions using Slot-based Transformers.
NeurIPS 2021 Workshop on Deep Reinforcement Learning (DeepRL) & Workshop on Offline Reinforcement Learning (OfflineRL), Virtual Only, December 2021.
- Kazuki Irie, Jürgen Schmidhuber.
Training and Generating Neural Networks in Compressed Weight Space.
ICLR 2021 Workshop on Neural Compression, Virtual Only, May 2021.