Ten years ago, if you had mentioned the term “artificial intelligence” in a boardroom, there’s a good chance you would have been laughed at. For most people it would have brought to mind sentient, sci-fi machines such as 2001: A Space Odyssey’s HAL or Star Trek’s Data.
Today it is one of the hottest buzzwords in business and industry. AI technology is a crucial linchpin of much of the digital transformation taking place today, as organizations position themselves to capitalize on the ever-growing amount of data being generated and collected.
Christopher S. Penn looks at how to plan and build your first machine learning/AI project with the AI/Machine Learning Lifecycle. Complicated post, but could be useful.
Artificial intelligence and machine learning will create computers so sophisticated and godlike that humans will need to implant “neural laces” in their brains to keep up, Tesla Motors and SpaceX CEO Elon Musk told a crowd of tech leaders this week (June 2016).
This essay, originally published in eight short parts, aims to condense current knowledge about artificial intelligence. It explores the state of AI development, surveys its challenges and dangers, features work by the field’s most significant scientists, and describes the main predictions of possible AI outcomes.
How is AI affecting you? From autonomous cars to digital advertising, columnist James Green explores how artificial intelligence is impacting our jobs and our lives.
Andreessen Horowitz partner Frank Chen is here to tell you that anyone can use the technology, not just members of the “priesthood.”
While Hollywood gets some of it right, there’s plenty of artistic license at work. Let’s take a look at some of the things Hollywood gets wrong about AI, and why.
Should AI Be Renamed?
Since the scope of artificial intelligence goes far beyond its popular definition, and we may be past the point of instilling new meanings into old words, why not use new words instead?
Kevin Kelly’s word of choice is “cognification,” and he uses it to describe ‘smart’ things.
How AI Evolved
“Two years after the first ImageNet competition, in 2012, something even bigger happened. Indeed, if the artificial intelligence boom we see today could be attributed to a single event, it would be the announcement of the 2012 ImageNet challenge results.”
Fascinating article if you are interested in the history of AI.
In the grandest irony of all, the greatest benefit of an everyday, utilitarian AI will not be increased productivity or an economics of abundance or a new way of doing science—although all those will happen. The greatest benefit of the arrival of artificial intelligence is that AIs will help define humanity. We need AIs to tell us who we are.
Can some of the impressive achievements of the last few years be sustained and continue to follow a fully distributed innovation pattern? Or will things go in the other direction and end up in another AI winter? It’s hard to predict, but there are a variety of factors that can impede the current rate of innovation, which Robbie Allen reviews. Fascinating analysis.
Artificial Intelligence is colossally hyped these days, but the dirty little secret is that it still has a long, long way to go. Sure, A.I. systems have mastered an array of games, from chess and Go to “Jeopardy” and poker, but the technology continues to struggle in the real world.
To get computers to think like humans, we need a new A.I. paradigm, one that places “top down” and “bottom up” knowledge on equal footing.
This post will really make you think.
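To make the post’s “top down meets bottom up” argument concrete, here is a minimal sketch in Python. Everything in it is an invented illustration, not anything from the article: the labels, the confidence numbers, and the road-closure rule are all assumptions. A stand-in for a learned model supplies bottom-up confidence scores, while hand-written symbolic rules supply top-down knowledge that can veto an implausible answer.

```python
# Minimal sketch of placing "top down" and "bottom up" knowledge on equal
# footing. All names and numbers below are illustrative assumptions, not
# anything taken from the post.

def bottom_up_score(label: str) -> float:
    """Stand-in for a learned model's confidence (bottom-up, data-driven)."""
    learned_confidence = {"school_bus": 0.92, "snowplow": 0.45}  # hypothetical
    return learned_confidence.get(label, 0.0)

def top_down_veto(label: str, context: dict) -> bool:
    """Stand-in for symbolic background knowledge (top-down, rule-based)."""
    # Hypothetical rule: school buses don't run when the roads are closed.
    return label == "school_bus" and context.get("roads_closed", False)

def classify(candidates: list, context: dict) -> tuple:
    """Pick the best label, letting rules veto statistically likely answers."""
    plausible = [(label, bottom_up_score(label))
                 for label in candidates
                 if not top_down_veto(label, context)]
    # Fall back to raw learned scores if the rules veto every candidate.
    plausible = plausible or [(l, bottom_up_score(l)) for l in candidates]
    return max(plausible, key=lambda pair: pair[1])

if __name__ == "__main__":
    print(classify(["school_bus", "snowplow"], {"roads_closed": True}))
    # -> ('snowplow', 0.45): the rule overrides the higher learned score.
```

In a real system the lookup table would be a trained network and the veto a knowledge base, but the interplay is the point: neither source of knowledge automatically outranks the other.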
Each group is certain its discipline holds the key to unlocking the mystery. But it’s likely a combination of these technologies that will solve the puzzle. Not all human brains work the same, so why should every AI?
The resurgence of artificial intelligence in recent years has been fueled by both the advent of cheap, widely available mass processing capacity and breakthroughs in AI algorithms that allow them to scale and tackle more complex problems. Interestingly, this recent trend is reminiscent of the personal computing revolution of the ’80s, when cheaper and more available computing became a catalyst for the mass “computerization” of numerous industries.