Altucher's Crypto Trader VIP
Conversational AI Comes of Age
In 1950, computer scientist Alan Turing proposed a test to evaluate the sophistication of artificial intelligence (AI).
In his paper, “Computing Machinery and Intelligence,” Turing proposed evaluating a computer’s ability to think with what became known as the Turing Test. If a computer could hold a conversation indistinguishable from one you might have with a human... it would pass the test.
Since then, the Turing Test has remained a benchmark for AI developers around the world.
In recent years, many of us have had the chance to try the Turing Test ourselves with smart devices like Amazon Echo or Google Assistant.
However, when you start to ask more challenging or strange questions like “Alexa, how many siblings do you have?” or “Hey Google, what’s your birthday?” these devices will predictably admit that they don’t understand or don’t have an answer.
Soon, this could be set to change.
Earlier this summer, AI research firm OpenAI released a language model called GPT-3, which is capable of engaging in remarkably dynamic conversations.
When asked questions like, “Why don't animals have three legs?” GPT-3 is capable of producing logical responses like, “Animals don't have three legs because they would fall over.”
It can even handle more complex questions that require knowledge about the world, like, “Which is heavier, a mouse or a toaster?”
If you’re an iPhone user and ask Siri these same questions, the AI will pull up internet search results… but it can’t provide an answer itself.
That said, GPT-3 still has its limits. Unlike Alexa or Google Assistant, the program doesn’t seem willing to admit when it doesn’t know the answer. It also stumbles when posed with trick questions.
For example, a blogger recently posted his experience using GPT-3. He asked GPT-3, “Who was the president of the United States in 1600?”... and received the response: “Queen Elizabeth I.”
The answer is plausible in one sense (Elizabeth I did rule England in 1600, and the United States didn’t yet exist), but it’s still wrong, demonstrating the very real challenges faced by scientists hoping to pass the Turing Test.
However, the advances of GPT-3 also show the incremental progress in conversational AI.
Beyond the party tricks currently offered by the Amazon Echo and others, conversational AI could ultimately become a multibillion-dollar industry, where AI is used to replace all sorts of human interactions. With advances like GPT-3, that future might be closer than we think.
Sincerely,
James Altucher
Editor, Altucher's Investment Network
#Updates #ALN