The model pictured above has over 151,000 followers on Instagram. She’s been featured in high-fashion publications like Vogue and Cosmopolitan, and she’s modeled for upscale brands like Balmain and Tiffany as well as high-end make-up lines like Pat McGrath. She’s a girl who seems to have it all. Despite her entrancing beauty, the lovely woman you see above is digital. In other words, she’s not real. Her name is Shudu, and she has been called “the world’s first digital supermodel.” The obsessive fawning over her life-like features suggests she won’t be the last.
She arrives at a time when Instagram, Snapchat filters, and photo-editing apps that rely on artificial intelligence have blurred the lines between reality and fantasy, turning ordinary people into paintings. Our carefully edited digital avatars are preening for “likes” among the rest of the social media masses. Facetune and other photo-altering apps are now the norm across social platforms. So much so that it is sometimes impossible to tell what is real and what has been altered.

Realistically speaking, our social media is already full of augmented reality. Every day we scroll through countless perfected pictures of perfectly posed people. (Say that 5 times fast!) Perfect is the new normal. This perfection trend that has steamrolled its way into our lives has left the door wide open for the only people who could ever realistically be perfect… digital ones.
CGI-created models and influencers such as Shudu (@Shudu.Gram), Miquela Sousa, and Bermuda are changing the game. These cyber beauties are ushering in a new era by serving as the culture’s first Digital Influencers. These CGI lifeforms are giving us a new perspective on artificial intelligence through the way we interact with them as well as the ways they interact with each other.
Reality is becoming fantasy…
Before we dive into how AI and humans interact in the real world, we must first take a moment to appreciate the comedy and Orwellian antics that unfolded earlier this year when computer-generated avatars with anonymous creators had an Instagram beef with each other.
A 19-year-old Brazilian-American model, singer, and Instagram personality, Miquela (@LilMiquela), was hacked by a blonde, pro-Trump troll named Bermuda (@BermudaIsBae). Over the course of about eight hours, Bermuda wiped Lil Miquela’s account clean, posting photos of herself instead with threatening captions like: “You can’t have your account back until you promise to tell people the truth.” Did I mention they’re both robots? Yes, this really happened. It doesn’t get more 2018 than this, folks. And when it comes to confusing encounters with hyper-realistic CGI humans, things like this are just the tip of the iceberg.

How close are we to seeing digital humans who look and act like the real thing?
Surprisingly, we’re pretty damn close.
There are already a number of startups working on commercial applications for what they call “digital” or “virtual” humans. Some are focusing on customer service applications like personal assistants and hotel concierges, while others are specifically being developed to push the boundaries of merging fantasy with reality. This is going to redefine what customer service and human resources mean. It’s going to shift our culture in a very real way, and it’s time for all of us to start thinking about what that means.
This year, Epic Games, CubicMotion, 3Lateral, Tencent, and Vicon took a major step toward disrupting the film and gaming industries by creating believable digital humans. In March, the five tech companies joined forces to debut Siren, a demo of a woman rendered in real time using Epic’s Unreal Engine 4 technology.
This same Unreal Engine 4 technology has already driven dramatic advances in the gaming community. Through its use, game makers can instantly create digital facial animations rather than having to animate those characters by hand, which is a considerable time saver. Improvements to eyes, skin, and hair help bridge the uncanny valley, a characteristic dip in emotional response that happens when we encounter an entity that is almost, but not quite, human. The theory was first identified by Japanese roboticist Masahiro Mori in 1970. Anything with a highly human-like appearance can be subject to the uncanny valley effect, but the most common examples are androids, computer game characters, and life-like dolls. However, not all near-human robots are eerie, and the perception of eeriness varies from person to person.
Quantum Capture has chosen to focus its efforts on creating digital humans for virtual, augmented, and mixed reality applications. This forward-thinking startup is leveraging its 3D-scanning and motion-capture technologies for real-world purposes today. Currently, it is piloting AI for a luxury hotel, where a “virtual human” concierge greets guests in the lobby and helps them check in.
Artificial intelligence technology is also becoming more common in news organizations. This month, China’s Xinhua News Agency unveiled “the world’s first artificial intelligence (AI) news anchor” at the World Internet Conference in China’s Zhejiang province. The anchor is advanced enough to learn from live broadcast videos on its own and can read text as naturally as a professional news anchor.
There is no denying that society has benefited and will continue to benefit from AI across all industries and applications. But we’ll need to keep our own reality in check. There is still much evolution to come, and the risks associated with AI are real, especially if we don’t understand the quality of the incoming data or set rules for how AI interprets it. The use of AI in everyday life can be awesome… but there’s a lot of room for it to become problematic.
CGI characters (typically created using video game engines like Unity or Unreal) never age, never get pregnant, and never gain weight… That leaves a lot of room for jobs like modeling, acting, or even being an influencer to become obsolete. And as our favorite personalities, friendly customer service reps, and accommodating hotel concierges are replaced by well-programmed digital humans and AI, what happens to us? What happens to the progress toward inclusion that has been made in beauty and fashion if digital humans become the new faces of those industries? What happens to human connection when everything from our ads to purchased services to our entertainment is provided by programmed people? This could be a huge problem. So what’s the solution? Participate in the conversation.
Instead of waiting for this new turn in technology to decide for us what the future looks like, it is important that we develop skills in this field. The people coding and building the framework for digital humans should be an inclusive mix of folks from all walks of life, working together to help define the way AI and the real world interact. When advancements in this tech are showcased at conventions, we need to be there witnessing them and providing feedback. We need to know who these developers are… and we need to become part of the development. This is how we solve problems creatively. This is how we stop the advances in CGI and AI from happening to us, and start making them work for us.