The future of Cortana is intelligent, emotional, and potentially dangerous?

Fahad Al-Riyami


Cortana’s potential is undoubtedly high. Microsoft’s virtual personal assistant can already set location-based reminders, manage your calendar and appointments, scour the web for answers to your queries, and even predict the winners of sporting events. Microsoft has also hinted at giving her the ability to strike up and maintain intelligent conversations at relevant moments, not unlike Samantha in Spike Jonze’s movie ‘Her’.

With bi-weekly updates, Microsoft is continually improving Cortana by adding new features and capabilities. Although these improvements are admittedly minor for now, mostly consisting of slight UI changes and better voice recognition, they will only grow larger as the years pass and technology advances.

If Cortana is this capable when she can only hear your voice, what happens when you give her the ability to see? With Microsoft’s Kinect technology already rumored to be making its way to laptops and tablets, it’s only a matter of time until it shrinks its way into smartphones. Imagine the possibilities. Will Cortana then be able to recognize your facial expressions and crack jokes in an attempt to cheer you up? Will she watch your step when you text while walking to make sure you don’t trip over anything? Could she even dial emergency services and warn you of an impending stroke when one half of your face goes numb and your speech gets slurred, potentially saving your life? Given the ability to see and hear your immediate surroundings, the potential is endless.

But let’s look ahead 5, 10, 15 years into the future, when artificial intelligence becomes capable enough to improve by reprogramming itself. Similar to how humans learn through experience, machines could potentially write and rewrite their own code to acquire new features and capabilities, without any developer involvement. Cortana may well become an operating system in and of herself. Bye-bye Windows, and hello Cortana.

The idea isn’t that far-fetched considering machine learning is already a reality today, although unsupervised machine learning is still primitive. Who knows how much more advanced it could get in the next decade? That isn’t necessarily a good thing.
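For the curious, here is a minimal, hypothetical sketch of what today’s “primitive” unsupervised learning looks like in practice. It uses Python with the scikit-learn library and made-up data, neither of which comes from this article; the point is simply that the algorithm groups unlabeled data on its own, while interpreting those groups remains entirely up to humans.

```python
# Illustrative only: k-means clustering, a basic form of unsupervised learning.
# The data and cluster count are invented for this sketch.
import numpy as np
from sklearn.cluster import KMeans

# Unlabeled, made-up "usage" data: hour of day vs. minutes of phone use.
rng = np.random.default_rng(0)
morning = rng.normal(loc=[8, 15], scale=1.5, size=(50, 2))
evening = rng.normal(loc=[21, 45], scale=1.5, size=(50, 2))
data = np.vstack([morning, evening])

# k-means groups the points purely by proximity; it has no labels and no
# notion of what the groups mean. Making sense of them is still a human job.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
print(model.cluster_centers_)  # roughly [8, 15] and [21, 45]
```

No part of this decides anything for itself; it finds structure in data it is handed, which is a long way from an assistant that rewrites her own code.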

In a recent interview with Public Radio International, UC Berkeley professor of computer science and engineering Stuart Russell says that scientists in the AI field are coming to the “realization that ‘more intelligent’ is not necessarily better,” warning that “if we make machines that are much smarter than human beings, then it’s very difficult to predict what such a system would do and how to control it.” He explains that, just as in nuclear physics, too much of a chain reaction results in a nuclear explosion: “So we need controlled chain reactions, just like we need controlled artificial intelligence.”

Emotion is another aspect of intelligence that Microsoft is trying to infuse into Cortana, because having emotions makes a being more human, more alive, more relatable. The company pairs emotional responses with Jen Taylor’s pre-recorded voice, and together they make for a striking sense of realism, almost as if a fellow human were trapped in your phone. The more types of emotion that are implemented, including negative ones like anger and frustration, the more realistic the AI will become.

If you were to pull out your Windows Phone and ask Cortana whether she loved you, she’d respond either by telling you that there is “definitely a spark” or with the more sincere response below:

[Embedded audio: “Cortana: How intelligent is too intelligent”]

However, Christof Koch, Chief Scientific Officer at the Allen Institute for Brain Science, argues that, “We can surely in the future … program entities so that they behave as if they have conscious feelings, and they say they have conscious feelings – of love or trust or anger – but how do we really know? For that we need a theory of consciousness, and there is no agreed-upon theory of consciousness right now.”

So until a theory of consciousness is established – which it may never be – virtual assistants will remain emotionally limited to pre-recorded and pre-programmed responses. And Cortana will never truly love you back as much as you love her.

On a lighter note, Cortana looks to have a bright future ahead of her. Microsoft seems fully committed to her improvement, reportedly working on teaching her more languages and expanding her availability to Windows. Where all these advancements will lead, only time will tell.



(Audio courtesy of Engadget)