How the ‘bigger is better’ mentality is damaging AI research
For the full article visit: https://thenextweb.com/syndication/2019/12/05/how-the-bigger-is-better-mentality-is-damaging-ai-research/
Something you'll hear a lot is that the increasing availability of computing resources has paved the way for important advances in artificial intelligence. With access to powerful cloud computing platforms, AI researchers have been able to train larger neural networks in shorter timespans. This has enabled AI to make inroads in many fields such as computer vision, speech recognition, and natural language processing.
But what you'll hear less about are the darker implications of the current direction of AI research. At present, advances in AI are mostly tied to scaling up deep learning models and creating neural networks with more layers and parameters. According to artificial intelligence research lab OpenAI, "since 2012, the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.4-month doubling time." This means that in the roughly seven years since, the metric has grown by a factor of more than 300,000.
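To make that arithmetic concrete, here is a minimal Python sketch (my illustration, not from the article or OpenAI's post) of how a 3.4-month doubling time compounds; the function name is hypothetical.

```python
import math

DOUBLING_TIME_MONTHS = 3.4  # doubling time reported by OpenAI

def growth_factor(elapsed_months: float) -> float:
    """Total growth in training compute after `elapsed_months`,
    assuming a constant exponential trend."""
    return 2 ** (elapsed_months / DOUBLING_TIME_MONTHS)

# A 300,000x increase corresponds to log2(300_000) ~= 18.2 doublings,
# i.e. roughly 62 months of growth at this rate.
months_to_300k = math.log2(300_000) * DOUBLING_TIME_MONTHS

print(f"growth over 5 years: {growth_factor(60):,.0f}x")
print(f"months to reach a 300,000x increase: {months_to_300k:.0f}")
```

For comparison, a Moore's-Law-style two-year doubling period over the same span would yield only about a 7x increase, which is why the trend is so striking.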
This growing demand for compute imposes severe limits on AI research and can also have other, less savory repercussions.
For the moment, bigger is better
"Within m