What does the metaverse look like in 5 years? Maybe this.

What is the metaverse? Ask 10 different experts, and you’ll get a slew of different answers. Is it virtual reality? Mixed reality? Will it be a virtual city where we buy digital apartments? Will it just be a bunch of NFT JPEGs and other get-rich-quick schemes before crypto creates so much carbon that our world burns? Or is it just Roblox and Snapchat in a couple of years?

While consensus is hard to find, Argodesign — a design firm that’s spent years consulting for Magic Leap — has put its cards on the table. It has developed a five-year vision for mixed reality, offering a convincing argument for how products like HoloLens and Magic Leap could work in half a decade. This metaverse would function as well for Starbucks as for Apple, with enough value, and enough user control, that people would actually use it.

The secret sauce of it all? Argodesign’s metaverse is basically the internet. But instead of going to websites, you go to real places — where mixed reality glasses reveal otherwise invisible digital layers.

[Image: courtesy Argodesign]

That means your living room could have a layer for watching a giant virtual TV. And it could have a layer for playing Pokémon, in which virtual monsters run around your couch. And it could have a layer for cleaning, with your family’s garments sorted automatically as you fold clothes while Spotify plays in the background.

[Image: courtesy Argodesign]

“Our goal is to untangle some of the misinformed visions of the metaverse,” says Mark Rolston, founder of Argodesign. “To describe it as an advancing art form of computing . . . and offer a picture of what that might look like.”

[Image: courtesy Argodesign]

Dazzling hardware that wants nothing to do with goggles

That picture begins with Argodesign’s concept for mixed reality glasses — lightweight frames that glow, telegraphing that you’re in this other layer of reality (rather than hiding it from others and creating social confusion). Aside from having a distinct look, the glasses also contain a stylus, which can stick to the arm bar of the glasses like a pencil over your ear.

[Image: courtesy Argodesign]

But why do we need this controller at all, when products like HoloLens and Oculus Quest can already read the motions of your hands? Because the way we work with virtual screens and objects will often be at a distance. “If you want to do stuff at 2 feet . . . laptops are better,” Rolston says. “This is a 10- to 20-foot world. Imagine sitting on your sofa [looking at your TV].”

[Image: courtesy Argodesign]

Plus, naturally, you’ll want to point at things that are far away, but pointing with your finger will lead to (more) confusion among the people around you.

“The point gesture won’t end up in mixed reality; it’s a human interaction,” argues Jared Ficklin, partner and lead creative technologist at Argodesign. “It doesn’t matter how natural it feels when I think I’m computing [and someone nearby] thinks I’m chewing them out.” You might say that twirling a pencil through the air looks odd, too. But at least it’s less confusing.

[Image: courtesy Argodesign]

The software is built like the web

The hardware is fun to consider, but Argodesign admits that the glasses could take all sorts of forms. What the designers are more confident about is their overall vision of metaverse software — specifically, how it’s organized and how you access it.

Today, we work largely in the desktop metaphor of the early Macintosh. Your computer desktop is modeled after a literal desktop. That’s why we have files and folders. These affordances, as they are called, helped people who were new to computers understand how to work these machines. By basing computers on the same organizing principles as someone’s office, the technology became more familiar and less intimidating. Then, after desktops, the iPhone gave us apps. Apps became the containers that held the information we wanted.

[Image: courtesy Argodesign]

With the metaverse, Argodesign imagines that the spaces and things around you will be the containers. That means your office desk can become a special metaverse zone. It can pull up a virtual computer for you. It could be set to hide private files, and appear differently to a colleague who comes by. Meanwhile, a teleworking colleague who wants to hop into your presentation doesn’t need to be there to take part. They could join your desk via their iPhone (or perhaps even a VR headset), and your meeting becomes centered on a real gathering place rather than an app like Zoom.

To Argodesign, the metaverse is all about using the world as our foundational interface for computing — real locations replace folders — but in a way that offers the convenience of accessing it remotely when you don’t want to show up physically.

This vision isn’t simply for desks, however. Any one physical space could house all sorts of different digital “layers.” A Starbucks could have a corporate layer with menus and deals for ordering your latte, then you might switch to a layer run by Meta that displays conversations left by people in your community. Then you might switch to a Target pop-up store in the Starbucks, and then you might even switch to your own private work layer that you’ve set for the times you want to answer emails at a coffee shop.
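The layered-space model the designers describe — one physical place, several independent digital overlays, one active at a time — can be made concrete with a tiny sketch. Everything here (the class names, the fields, the example layers) is hypothetical, invented purely to illustrate the idea:

```python
# Hypothetical sketch of Argodesign's "layers" idea: a single physical
# place hosts several digital layers run by different owners, and the
# wearer switches between them.

class Layer:
    def __init__(self, name, owner, content):
        self.name = name        # e.g. "menu", "community", "work"
        self.owner = owner      # who runs the layer (Starbucks, Meta, you)
        self.content = content  # what the glasses would render

class Place:
    def __init__(self, name):
        self.name = name
        self.layers = {}   # all layers anchored to this place
        self.active = None # the layer the wearer currently sees

    def add_layer(self, layer):
        self.layers[layer.name] = layer

    def switch_to(self, name):
        # Move the wearer into a different digital layer of the same room.
        self.active = self.layers[name]
        return self.active

starbucks = Place("Starbucks on 5th")
starbucks.add_layer(Layer("menu", "Starbucks", "Latte deals and ordering"))
starbucks.add_layer(Layer("community", "Meta", "Notes left by locals"))
starbucks.add_layer(Layer("work", "you", "Private email overlay"))

print(starbucks.switch_to("work").content)
```

The point of the sketch is that the place, not an app, is the container: the same coffee shop holds the corporate layer, the social layer, and your private one.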

This vision sounds fantastical, but technically it’s feasible. Our glasses would be looking for an “anchor,” a space or object that’s mapped to a specific URL, just like a website is today.

“What’s missing is . . . where the anchors are coming from,” Ficklin says. Microsoft has its own Azure Spatial Anchors standard, for instance, and Niantic, the maker of Pokémon Go, acquired the world-mapping company 6D.ai in 2020 to build a global, anchored map. But Ficklin points out that there is no universal service comparable to the nonprofit ICANN, which acts as a middleman on today’s web, coordinating the system that points the URLs you type to actual servers full of information. Argodesign, which is shamelessly optimistic about the promise of technology, believes this standard must be universal and owned by no single company.
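The missing piece the article describes is essentially a name-resolution service: given a recognized physical anchor, return the URL of its digital content, much as DNS maps a hostname to a server address. A minimal, purely hypothetical sketch (the registry, anchor IDs, and URLs are all invented for illustration; no such universal service exists today):

```python
# Hypothetical universal anchor registry: maps recognized spatial anchors
# to content URLs, the way DNS maps hostnames to server addresses.
# Azure Spatial Anchors and Niantic's world map are competing,
# proprietary versions of this idea.

ANCHOR_REGISTRY = {
    "anchor://starbucks-5th-ave/counter": "https://starbucks.example/menu",
    "anchor://home/living-room/wall": "https://tv.example/virtual-screen",
}

def resolve_anchor(anchor_id):
    """Return the content URL for a recognized anchor, or None if the
    glasses see a space that no one has registered."""
    return ANCHOR_REGISTRY.get(anchor_id)

print(resolve_anchor("anchor://home/living-room/wall"))
```

The design question the article raises is who operates this table: one vendor’s proprietary lookup, or a neutral registry shared by all, the way ICANN-coordinated DNS is shared by every browser.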

It’s easy to imagine what could happen if it isn’t. Consider how poorly Apple’s iMessage works with Google Messages today. Apple could have a metaverse built on iGlasses that is only partly compatible, or not compatible at all, with Google’s metaverse built on Android glasses. (Then, of course, Meta and Niantic and all sorts of other players would also fight in this land grab through a mix of hardware and software.)

While these battles are already well underway, the web offers an elegant proof of concept demonstrating how these companies can work together to agree upon a universal standard. (And there’s all sorts of precedent beyond the web in companies sharing core technologies, ranging from USB to emoji.)

[Image: courtesy Argodesign]

It’s compelling, but it’s still a metaverse

Yet is the metaverse something that anyone needs? Ultimately, Argodesign believes the answer is yes; this metaverse is the next, natural form of computing. As Rolston explains, the history of computing has been about aiming a screen at you. Augmented glasses, by contrast, point a camera out from your eyes. It’s the first time in human history that a machine will share our viewpoint and our specific context from moment to moment.

“Therefore, when you write software for it to be more useful, it becomes more human,” Ficklin says. “The metaphor itself is a more human computer.”

I get that argument, and it’s tempting to buy into this vision of technological progress. Argodesign’s entire metaverse vision is both convincing and feasible, with a grokable information design that, in many ways, makes more sense than files and folders do in 2022. This world may very well come to be! And yet, I must push back on the value and necessity of building such a metaverse at all. We’ve already learned that screen time makes us unhappy, and Argodesign’s vision could turn the world into one big sports bar, with your waiter Meta tracking your eyeballs all along the way.

Indeed, there is a reason that many futurists have imagined a future with fewer screens, real or virtual. If we know a simple walk through a garden will measurably lower our cortisol levels, I simply don’t understand: Why are we still building more screens instead of more gardens?
