
Enter the ‘Whisperverse’: How AI voice agents will guide us through our days



A common criticism of big tech is that their platforms treat users as little more than glassy eyeballs to be monetized with targeted ads. This will soon change, but not because tech platforms are moving away from aggressively targeting users. Instead, our ears are about to become the most efficient channel for hammering us with AI-powered influence that is responsive to the world around us. Welcome to the Whisperverse.   

Within the next few years, an AI-powered voice will burrow into your ears and take up residence inside your head. It will do this by whispering guidance to you throughout your day, reminding you to pick up your dry cleaning as you walk down the street, helping you find your parked car in a stadium lot and prompting you with the name of a coworker you pass in the hall. It may even coach you as you hold conversations with friends and coworkers or, when you're out on a date, feed you interesting things to say that make you seem smarter, funnier and more charming than you really are. These will feel like superpowers.

The ‘Whisperverse’ will require highly advanced technology

Of course, you won’t be the only one “augmented” with context-aware AI guidance. Everyone else will have similar abilities. This will create an arms race among the public to embrace the latest AI-powered enhancements. It will not feel like a choice, because not having these capabilities will put you at a cognitive disadvantage. This is the future of mobile computing. It will transform the bricks we carry around into body-worn devices that see and hear our surroundings and covertly offer useful information and friendly reminders at every turn.

Most of these devices will be deployed as AI-powered glasses because that form factor gives cameras the best vantage point to monitor our field of view, although camera-enabled earbuds will be available too. The other benefit of glasses is that they can be enhanced to display visual content, enabling the AI to provide silent assistance as text, images and realistic immersive elements that are integrated spatially into our world. Also, sensor-equipped glasses and earbuds will let us respond silently to our AI assistants with simple head nods of agreement or rejection, just as we naturally do with other people.

This future is the result of two technologies maturing and merging into one — AI and augmented reality. Their combination will enable AI assistants to ride shotgun in our lives, observing our world and giving us advice that is so useful, we will quickly feel like we can’t live without it. Of course there are serious privacy concerns, not to mention the risk of AI-powered persuasion and manipulation, but what choice will we have? When big tech starts selling superpowers, to not have these abilities will mean being at a disadvantage socially, professionally, intellectually and economically.

‘Augmented mentality’ changing our lives

I’ve been writing about our augmented future for more than 30 years, first as a researcher at Stanford, NASA and the U.S. Air Force, and then as a professor and entrepreneur. When I first started working in the field we now call “augmented reality,” that phrase didn’t exist, so I described the concept as “perceptual overlays” and showed for the first time that AR could significantly enhance human abilities. These days, there is a similar lack of words to describe the AI-powered entities that will sit on our shoulders and coach us through our day. I often refer to this emerging branch of computing as “augmented mentality” because it will change how we think, feel and act.   

Whatever we end up calling this new field, it is coming soon and it will mediate our lives, assisting us at work, at school or even when grabbing a late-night snack in the privacy of our own kitchen. If you are skeptical, you’ve not been tracking the massive investment and rapid progress made by Meta on this front and the arms race they are stoking with Apple, Google, Samsung and other major players in the mobile market. It is increasingly clear that by 2027, this will become a major battleground in the mobile device industry.

The first of these devices is already on the market: the AI-powered Ray-Bans from Meta. Although it is currently a niche product, I believe it is the single most important mobile device being sold today. That's because it follows the new paradigm that will soon define mobile computing: context-aware guidance. To enable this, the Meta Ray-Bans have onboard cameras and mics that feed a powerful AI engine and pump verbal guidance into your ears. At Meta Connect in September, the company showcased new consumer-focused features for these glasses, such as helping users find their parked cars, translating languages in real time and naturally answering questions about things you see in front of you.
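To make the idea of context-aware guidance a bit more concrete, here is a minimal sketch of the kind of sense-and-respond loop such a device might run: the camera and microphone supply context, a model decides whether there is anything worth saying, and the earbuds whisper the result. Every name in it (capture_frame, transcribe_audio, generate_guidance, speak) is a hypothetical placeholder standing in for real hardware and model integrations, not Meta's actual API.

```python
# Sketch of a context-aware guidance loop (hypothetical, not any vendor's API).
# Each helper is a stand-in for camera capture, speech-to-text, a multimodal
# model call and text-to-speech on a real device.

import time

def capture_frame():
    """Stand-in for grabbing a still image from the glasses' camera."""
    return b"<jpeg bytes>"

def transcribe_audio():
    """Stand-in for transcribing the last few seconds of microphone audio."""
    return "where did I park my car?"

def generate_guidance(frame, transcript, memory):
    """Stand-in for a multimodal model that turns context into a short tip."""
    if "park" in transcript and "car_location" in memory:
        return f"Your car is in {memory['car_location']}."
    return None  # stay silent unless there is something worth saying

def speak(text):
    """Stand-in for whispering the tip through the earbuds (text-to-speech)."""
    print(f"(whispered) {text}")

def guidance_loop(memory, interval_seconds=2.0, max_iterations=3):
    """Periodically fuse what the device sees and hears into spoken guidance."""
    for _ in range(max_iterations):
        frame = capture_frame()
        transcript = transcribe_audio()
        tip = generate_guidance(frame, transcript, memory)
        if tip:
            speak(tip)
        time.sleep(interval_seconds)

if __name__ == "__main__":
    guidance_loop({"car_location": "lot C, row 14"})
```

The key design choice in any such loop is the silent default: the assistant says nothing unless the fused context produces a tip worth interrupting you for.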

‘Cute’ creatures rather than ‘creepy’ ones

Of course, the Meta Ray-Bans are just a first step. The next step is to visually enhance your experience as you navigate your world. Also in September, Meta unveiled its prototype Orion glasses, which deliver high-quality visual content in a form factor that is finally reasonable to wear in public. The Orion device is not planned for commercial deployment, but it paves the way for consumer versions to follow.

So, where is this all headed? By the early 2030s, I predict the convergence of AI and augmented reality will be sufficiently refined that AI assistants will appear as photorealistic avatars embodied within our field of view. No, I don't believe they will be displayed as human-sized virtual assistants who follow us around all day. That would be creepy. Instead, I predict they will be rendered as cute little creatures that fly out in front of us, guiding us and informing us within our surroundings.

Back in 2020, I wrote a short story (Carbon Dating) for a sci-fi anthology in which I refer to these AI assistants as Electronic Life Facilitators, or ELFs for short. I like thinking of these AI-powered entities as elves because that is what they will become in our lives: helpful little creatures that prompt you with the exact cargo capacity of a railcar when you just can't remember it in an important meeting, or take the shape of a flying fairy that guides you through Costco to find the items on your shopping list as efficiently as possible. These features will not just be helpful; they will make our lives seem magical.

Computer scientist Louis Rosenberg with ELF concept (Carbon Dating, 2021)

On the other hand, deploying intelligent systems that whisper in your ears as you go about your life could easily be abused as a dangerous form of targeted influence. And when this is coupled with the ability to visually modify the world around you, AI/AR-powered glasses could enable the most powerful tools of persuasion and manipulation ever created. For these reasons, I sincerely hope that industry leaders do not adopt an advertising business model when commercializing these AI-powered glasses. I also hope they consider how these products will shake up social dynamics, as they can change how people interact face-to-face in damaging ways (the short film Privacy Lost shows examples).

For three decades I’ve researched how AR and AI can enhance human abilities in positive ways. That said, the last thing I want is for giant corporations to battle for marketing dollars based on how efficiently their AI assistants can talk me into buying things I don’t need or believing things that aren’t true. To enable the magical benefits while protecting our privacy and agency, I recommend that regulators quickly focus on this emerging market. Their goal should be to define the playing field so that big tech can compete aggressively on how magical they make your life, not how effectively they can influence it.

Louis Rosenberg is a computer scientist in the fields of AR and AI. He is known for founding Immersion Corp, Outland Research and Unanimous AI.
