At the age of 15, German-born computer scientist Jürgen Schmidhuber (born 1963) had already set out his life's ambition: to build a self-improving artificial intelligence smarter than himself, and then retire. Although he hasn't yet achieved this, he has driven huge advances in AI and deep learning along the way.
Deep research
Inspired by this early ambition, he studied and then taught at Technische Universität München before becoming Professor of Artificial Intelligence at the Università della Svizzera Italiana in Switzerland. Now scientific director of the Swiss AI Lab IDSIA, he and his research teams began exploring self-improving, meta-learning general problem solvers back in 1987, and built their first deep-learning neural networks just four years later.
One of his lab's most famous contributions is the Long Short-Term Memory (LSTM) architecture. Equipped with feedback connections, LSTM is used by the likes of Google and Microsoft in tasks ranging from speech recognition to machine translation. In 2017, Facebook used it to perform around 4.5 billion automatic translations every day!
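To give a feel for what sits behind those feedback connections: an LSTM cell keeps a memory (the cell state) and uses learned gates to decide, at each time step, what to forget, what to write, and what to expose. The snippet below is a minimal, illustrative sketch of a single LSTM step in plain NumPy; the function name, shapes, and toy usage are assumptions made for illustration, not code from Schmidhuber's lab or any particular library.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One step of a single (illustrative) LSTM cell.

    x      : input vector at this time step
    h_prev : hidden state from the previous step (the feedback connection)
    c_prev : cell state carrying long-term memory
    W, b   : stacked weights/biases for the four gates (illustrative shapes)
    """
    z = W @ np.concatenate([x, h_prev]) + b
    i, f, o, g = np.split(z, 4)      # input, forget, output gates + candidate
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c = f * c_prev + i * g           # forget part of the old memory, write new memory
    h = o * np.tanh(c)               # expose part of the memory as the new hidden state
    return h, c

# Toy usage: 8-dimensional inputs, 16-dimensional hidden state, random weights.
rng = np.random.default_rng(0)
n_in, n_hid = 8, 16
W = rng.standard_normal((4 * n_hid, n_in + n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):                   # unroll over a short input sequence
    h, c = lstm_step(rng.standard_normal(n_in), h, c, W, b)
print(h.shape, c.shape)
```

The hidden state h is fed back in at the next step; that recurrence is what lets the network carry information across long sequences.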
“Humans should do zero percent of the hard and boring work, computers the rest.”
Addicted to AI
He didn't stop there. IDSIA's deep learners were the first to win object detection and image segmentation contests, including one for medical diagnosis (in this case, cancer detection), whose partial automation Jürgen believes could save billions of dollars and make healthcare more accessible. They also won nine international machine learning and pattern recognition competitions. Let's not forget that he helped develop the "Formal Theory of Creativity, Fun and Intrinsic Motivation" to explain many essential aspects of intelligence, including autonomous development, science, art, music, and humor, and that he advanced low-complexity art, an extreme form of minimal art that can be described by a very short computer program (a toy example follows below).
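To make the low-complexity art idea concrete, here is a deliberately tiny, hypothetical example (not one of Schmidhuber's own artworks): a complete grayscale image whose entire description is a few lines of Python writing an interference pattern to a PGM file.

```python
# A complete image described by a few lines of code: an interference
# pattern of two circular waves, written out as a plain-text PGM file.
import math

SIZE = 256
with open("low_complexity.pgm", "w") as f:
    f.write(f"P2\n{SIZE} {SIZE}\n255\n")
    for y in range(SIZE):
        for x in range(SIZE):
            d1 = math.hypot(x - 64, y - 64)      # distance to first wave source
            d2 = math.hypot(x - 192, y - 192)    # distance to second wave source
            v = int(127.5 * (1 + math.sin(d1 / 6) * math.sin(d2 / 6)))
            f.write(f"{min(v, 255)}\n")          # one grayscale value per line
```

The point is not the particular picture but the ratio: a rich-looking image whose full description (the program) is only a dozen lines long.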
Tomorrow’s intelligence
Many steps closer to his dream, in 2014 Jürgen co-founded NNAISENSE, an AI startup aiming to build the first practical general-purpose AI while "delivering advanced neural network solutions that improve how products are made and how they work." With computing power getting ever cheaper, he predicts that machines will be up to 10,000 times faster by 2036. His work has paved the way toward AI that can learn incrementally, plan, reason, and decompose problems. Human-level AI is no longer such a distant dream.
Key Dates
- 1990: Formal Theory of Creativity, Fun and Intrinsic Motivation
Jürgen Schmidhuber publishes the Formal Theory of Creativity, Fun and Intrinsic Motivation, which explores the concept of "maximizing intrinsic reward for the active creation or discovery of novel, surprising patterns allowing for improved prediction or data compression."
- 1997: Long Short-Term Memory
Sepp Hochreiter and Jürgen Schmidhuber publish their first peer-reviewed paper on Long Short-Term Memory (LSTM), which proves highly effective at tasks such as unsegmented, connected handwriting recognition, speech recognition, machine translation, anomaly detection, playing video games, and improving healthcare.
- 2014: Leveraging Machine Learning
NNAISENSE starts leveraging machine learning methods such as deep and reinforcement learning for an array of practical applications, from anomaly detection to waste reduction.