
5 ways future A.I. assistants will take speech technology to the next level




Since Siri debuted on the iPhone 4S in 2011, voice assistants have evolved from a barely functional gimmick into the foundation of the smart speaker technology now found in roughly one in six American households.

"Before Siri, when I talked about [what I do], there were blank stares," Tom Hebner, head of innovation at Nuance Communications, which develops cutting-edge A.I. voice technology, told Digital Trends. "People would say, 'Do you build those terrible phone systems? I hate you.' That was the only interaction most people had with speech technology."

That is no longer the case today. According to eMarketer forecasts, nearly 100 million smartphone users will be using voice assistants by 2020. But while A.I. assistants are no longer a novelty, we are still in the early days of their development. There is a long way to go before they fully deliver on their promise as a product category.

Here are five ways the technology can improve to become smarter and more useful – and, as a result, help us lead more productive lives. Call them predictions or a wish list; either way, these are the challenges that need to be solved.

More knowledge, fewer problems

Alexa can tell you the weather in Kuala Lumpur, Malaysia, how many dollars you would get for 720 South African rand, and how to spell "disestablishmentarianism." But consumer A.I. assistants are essentially the digital equivalent of a person armed with a full set of up-to-date encyclopedias: you get (hopefully) the right information, but no professional expertise.

"The challenge that the systems in your house have is that there is so much of a range of things that they are trying to do. "Hebner told Digital Trends.


That is a difficult balance to strike, but it could change. Nuance builds many specialized systems tailored to a specific use case, such as answering airline customers' questions or taking notes for doctors. Narrowing the focus not only lets these systems drill down into more detailed information, it also lets more domain knowledge be built in. "People were very excited about computers that can understand words, but that's not necessarily valuable if the system doesn't know what to do with those words," Hebner said.

One example is a Nuance system that not only understands physicians as they read out a list of a patient's medications, but also flags potential conflicts between those drugs. That goes far beyond the capabilities of most consumer A.I. assistants.

A more detailed understanding of specific domains – something hinted at by Alexa Skills – could be transformative, however. Asking your smart speaker for legal or medical advice may sound crazy at first. But there has been remarkable progress in areas such as legal chatbots, and a recent report suggests Apple wants Siri to be capable of health-oriented conversations with users by 2021.

Genuinely expert A.I. assistants are still the stuff of science-fiction dreams right now, although a recent Voicebot.ai report shows how quickly virtual assistant skills are multiplying. When those skills push into specialist territory, though, we will really have something special.

More (and better) personalization

Personalization on today's smart speakers is still in its infancy. You can change a voice assistant's accent and gender, add or remove skills, and enter information such as your name and workplace. In some cases you can set up multiple voice profiles, so a Google Home can recognize each member of your household.

  Amazon Echo Show

But there is still a long way to go – and the juice should be worth the squeeze. Mattersight Corporation offers A.I.-driven call center technology called Predictive Behavioral Routing, which analyzes callers' speech patterns and matches them with human agents who have compatible personality types. According to the company, pairing a caller with a compatible personality produces a successful call in roughly half the time it takes with a conflicting personality type.

A similar approach could give us A.I. assistants that talk to you the way you want to be talked to. That might be as simple as matching the accent and volume of the person speaking. Or it could change how ideas are presented – perhaps using more emotive language for some users and more detailed information for others. Some people may want a voice assistant that chats at length, while others simply want one that delivers the required information as concisely as possible. A.I. assistants should be able to do both.

Technologies such as Google Duplex show just how convincing A.I.-synthesized voices and conversations have become. Expect that technology to play an important role as A.I. assistants move into areas more complex than handling song requests and kitchen timers.

This could be helped along by breakthroughs in identifying users by voice. Hebner notes that Nuance's technology can now identify a user from just a single second of audio. "It used to take 10 seconds of speech to get an accurate signal of who you are," he said. "That's critical." Identifying users from a brief snippet of speech sidesteps the password problem and opens the door to using voice assistants for more sensitive information.

Be proactive

A good assistant will do something when you ask. A great assistant doesn't need to be asked. Right now, A.I. assistants are still in that first phase. Users can get the song or reminder they want, but usually only when they explicitly request it. As people grow more comfortable with voice assistants, there is a big opportunity for these devices to become not just reactive but proactive.

There are big questions about whether certain jobs should be handed over to machines or not.

How would you feel about an A.I. making decisions on your behalf? That might mean turning up the heat when you say you're cold, rebooking a lunch because you're running late, or nudging you to exercise more or budget your paycheck better. With more and more smart devices entering the home, the range of actions a voice assistant could take will increase significantly.

Part of this is the social question of how people feel about machines that make decisions on their own. There are big questions about whether people want to hand certain jobs over to machines at all. Think of it like giving a flesh-and-blood assistant your credit card and house keys – but with a much stronger whiff of Skynet. The downside is giving up some control. The potential upside is more free time. And then, of course, there is a big technical challenge…

It's all about feedback

Tom Hebner pointed to a big challenge with proactivity: how do our machines know when they got it right? To return to the good-versus-great assistant idea: a great assistant might pull up all your files before a big meeting without being asked. But what if they are the wrong files? A major obstacle to making A.I. assistants more proactive is that there are currently only limited ways for them to learn whether they gave us the right thing or not.


"When I ask for the same song every day, when I go into my house, and then I go in and it just starts to play, how do they know the? did you understand it correctly? "Hebner said. "If I do not stop playing, does that mean it's right?" When I say "Stop" does that mean that it has been misunderstood and it should never do it again? The feedback mechanism is one of the reasons why you do not get more proactive systems. "

That is a challenge for engineers. Anyone who has had an intern ask for instructions and feedback on every single task knows that sometimes it's easier to do a job yourself than to delegate it. An A.I. assistant is supposed to make your life smoother; you shouldn't have to answer dozens of mini-surveys every day to confirm it did its tasks properly. This needs to be solved in a way that doesn't hurt the usability of these devices and doesn't demand lengthy training before a system learns your preferences.

What is the answer? I'm not sure. But, as Steve Jobs once said, it's not the customer's job to figure it out.

New interaction methods

There is a scene in 2001: A Space Odyssey in which the murderous HAL 9000 – disturbingly, still the most famous fictional A.I. assistant in history – shows that it doesn't rely on microphones alone to work out what is being said. When two crew members retreat to a spot where they know HAL can't hear them, HAL reveals that it can still understand them by reading their lips.

2001: A Space Odyssey

A scary moment in the movie? For sure. An example of how A.I. assistants could work in the future? Er, also yes.

The idea that voice assistants should limit themselves to voice reduces the number of ways they can meaningfully interact with us. With the rise of facial recognition and emotion-tracking technologies, a growing amount of biometric data collected from users, and even the possibility of mind-reading interfaces on the horizon, there are many different signals A.I. assistants could draw on to reach their conclusions.

The idea that in 10 years we will still be using only our voices to interact with A.I. assistants is like looking at PCs in the early '80s and assuming we would never have more than a keyboard to work with.
