I tried the Sam Altman-backed orb face scanner. It couldn’t verify my humanity

TruthLens AI Suggested Headline:

"Sam Altman's Orb Technology Aims to Distinguish Humans from AI Bots"

AI Analysis Average Score: 8.8
These scores (0-10 scale) are generated by TruthLens AI's analysis, assessing the article's objectivity, accuracy, and transparency. Higher scores indicate better alignment with journalistic standards.

TruthLens AI Summary

A new technology developed by Tools for Humanity, co-founded by Sam Altman, aims to address the increasing challenge of distinguishing humans from artificial intelligence bots online. The technology, known as the Orb, is designed to verify a user's identity by analyzing their facial features and reactions to light. During a personal test, however, the Orb failed to recognize the author as a 'unique human' because of the blue-light-blocking contacts they were wearing, which the device interpreted as an attempt to disguise their identity. Tiago Sada, the chief product officer, expressed relief that the Orb rejected them, highlighting the importance of accurate verification in an era when bots proliferate across online platforms, from ticket sales to dating apps. The need for effective identity verification has become increasingly urgent as bots are used for harassment and disinformation on social media and to inflate ticket prices. Sada noted that every time tickets go live, bots are quick to buy them up, driving prices higher, a concern echoed by many users online.

The Orb captures images of a user's face and eyes to create a unique identifier called 'WorldID,' which functions like a digital passport for online services. More than 12 million people have already used the technology in more than 20 countries, although its availability in the United States is temporarily halted for updates. Tools for Humanity plans to expand its services by delivering Orbs directly to homes, beginning in Latin America. The project, which operates on an open-source platform, has raised questions about privacy and data protection, since it collects sensitive biometric information. Critics worry that offering cryptocurrency incentives for participation could lead users to overlook potential privacy risks. Despite these concerns, Tools for Humanity asserts that it prioritizes user privacy, saying that biometric data is deleted after being sent to users' devices. As the technology continues to evolve, it faces scrutiny from regulators and privacy advocates, all while aiming to provide a reliable means of verifying human identity in a digital age increasingly populated by AI.

TruthLens AI Analysis

The article presents a compelling examination of a new technology designed to differentiate humans from artificial bots, specifically through the use of a device called the Orb, backed by Sam Altman. The author shares a personal experience with the Orb, highlighting both the potential and the challenges of this innovative identity verification tool. By exploring the implications of this technology, the article raises questions about privacy, efficacy, and societal acceptance.

Implications of Technology on Identity Verification

The Orb is intended to address a significant problem in the digital age: the proliferation of bots that can impersonate humans online. As the article notes, these bots are increasingly used in various domains, from ticket sales to social media, often leading to negative consequences such as inflated prices and misinformation. The technology aims to create bot-free environments, which could enhance user experiences across online platforms.

Public Reception and Privacy Concerns

Despite the potential benefits, the article reveals that public acceptance may be a significant hurdle. The Orb's design and operation raise privacy issues, prompting some regions, like the UK and Hong Kong, to ban its use. This skepticism reflects broader societal concerns about surveillance and data protection, suggesting that any new technology must navigate a complex landscape of public sentiment and regulatory scrutiny.

Critique of AI and Technological Overreach

The mention of critics who view the Orb as part of a broader push by tech leaders to shape an AI-driven future highlights an ongoing debate about the ethical implications of such technologies. This skepticism is not merely about the Orb itself but also about the motivations behind its development and the potential for misuse. The article touches on fears that AI technologies may outpace societal readiness, leading to unintended consequences.

Potential Economic and Social Impact

The ability to effectively filter out bots could have wide-ranging implications for various sectors, including entertainment, dating, and social media. It could lead to a more secure online environment, potentially benefiting companies that rely on authentic user interaction. However, if consumers remain hesitant to adopt such identification methods, the anticipated advantages may not materialize, leading to an ongoing struggle between innovation and public trust.

Impacts on Investment and Market Trends

This news may influence investor sentiment towards companies involved in AI and identity verification technologies. As public awareness of digital security grows, stocks related to cybersecurity and verification tools might see increased interest. Conversely, companies that do not adapt to these technological advancements could face challenges in maintaining user engagement amid rising concerns about bot interference.

Broader Context and Relevance

In the context of today's discussions about digital identity and privacy, this article connects to larger themes of trust in technology. As society grapples with rapid advancements in AI, understanding and addressing these challenges becomes paramount. The Orb's journey reflects the friction between innovation and the ethical considerations that accompany it.

The article does not appear to contain overt manipulative elements, but it does emphasize the need for caution in embracing new technologies. The language used suggests a balance between optimism for technological advancement and concern for privacy and identity integrity.

The overall reliability of the article is high, given its detailed examination of both the technology and the societal implications. However, the skepticism surrounding the Orb reflects broader anxieties about privacy and the rapid evolution of AI, which are essential to consider in any discussion of emerging technologies.

Unanalyzed Article Content

A Sam Altman-backed company has an ambitious plan to solve one of the internet’s trickiest problems — distinguishing humans from artificial bots. It involves an Orb.

The Orb couldn’t verify me as a “unique human” when I tried it — but it turns out, that’s partially by design. I wear blue-light-blocking contacts with a slight yellow tint, so the Orb evidently thought I was trying to fool it by disguising my identity. “Well, that’s sad, but I’m grateful that it rejected you,” Tiago Sada, chief product officer at the company behind the Orb, told me after the tool returned a message saying, “something is blocking your face.”

The Orb, built by Tools for Humanity, which Altman co-founded, wants to provide what it says is a more effective identity verification method in the age of artificial intelligence. Now that AI systems can understand images, for example, it’s no longer just humans who can pick out the photos of school buses on traditional CAPTCHA systems. And cracking down on bots is crucial as fake accounts have become more prominent everywhere from ticket sales websites to dating apps. What’s more, armies of bots can be used to spread harassment or disinformation campaigns on social media, and AI scams are already proving costly for victimized families and companies.

“Every single time, for any artist around the world, tickets go live, bots go and buy tickets and they drive up the prices,” Sada said. “I’m on dating apps all the time, and probably every week I come across an account that I’m like, ‘Oh, why does this person have six fingers?’ Well, turns out it was an AI.”

Still, convincing large numbers of people to let a gleaming, sci-fi-looking Orb photograph their faces to confirm their humanity and enable bot-free online spaces may be a tall order. Already, several places have banned the project over privacy and other concerns, including the United Kingdom and Hong Kong.
And some critics say the Orb is part of an AI future being pushed by tech leaders like Altman that not everyone wants. But Sada said he believes the need for the service is already apparent. “Every single website, whether it’s finance, social media, commerce, whatever, we think of proof of human is super relevant.”

How the Orb works

When a user scans their face with the Orb, the technology photographs their eyes and face to analyze things like dimension and how they react to light, Sada said. When the Orb has verified a user’s “humanness,” it sends a unique code, or “WorldID,” to an app on their phone. The idea is that WorldID would serve as a sort of digital passport to verify users’ identities across a range of online services. It’s sort of like Face ID, but without requiring an iPhone and potentially applicable across more services. Sada envisions, for example, that if a bank wanted to confirm a customer’s identity before divulging financial information, it could use your WorldID rather than security questions, which may be easier for a scammer to guess.

Currently, there are few practical use cases for WorldID. But it can be used in a limited fashion on some platforms, including Reddit and Shopify, where users can access “human-only” subreddits or retail deals, Sada said.

More than 12 million people have already verified their identity using Orbs, which are available and free to use in more than 20 countries at malls and other central locations or at Tools for Humanity events. Later this year, the company plans to roll out a service to deliver Orbs directly to people’s homes for easier verification, starting in Latin America. The Orb is temporarily unavailable in the United States while the technology is being updated, but the company says it will re-launch “soon.” The World App is still available in the meantime.
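The flow described above — the Orb checks "humanness," issues a unique WorldID to the user's phone, and online services later accept that ID in place of CAPTCHAs or security questions — can be sketched in a few lines of Python. This is purely an illustrative model of the article's description; the class names, the boolean stand-in for the biometric check, and the registry lookup are all my assumptions, not Tools for Humanity's actual software or API.

```python
# Hypothetical sketch of the WorldID flow described in the article.
# None of these names or mechanisms come from the real World software.
import hashlib
import secrets


class Orb:
    """Stands in for the Orb: verifies 'humanness' and issues a WorldID."""

    def __init__(self):
        # Registry of verified IDs (loosely, the "World" network).
        self._issued = set()

    def verify_and_issue(self, face_scan_ok: bool):
        # The real device analyzes eye/face dimensions and reactions to
        # light; here that whole check is reduced to a boolean.
        if not face_scan_ok:
            return None  # e.g. "something is blocking your face"
        world_id = hashlib.sha256(secrets.token_bytes(32)).hexdigest()
        self._issued.add(world_id)
        return world_id  # sent to the app on the user's phone


class OnlineService:
    """A site (bank, subreddit, shop) accepting WorldID instead of CAPTCHAs."""

    def __init__(self, orb: Orb):
        self._orb = orb

    def is_verified_human(self, world_id: str) -> bool:
        return world_id in self._orb._issued


orb = Orb()
wid = orb.verify_and_issue(face_scan_ok=True)
service = OnlineService(orb)
assert wid is not None and service.is_verified_human(wid)
# The tinted-contacts case: a blocked scan yields no WorldID at all.
assert orb.verify_and_issue(face_scan_ok=False) is None
```

The key property the sketch captures is that the service never sees biometric data, only the opaque identifier, which matches the "digital passport" framing in the article.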
Safety questions

Tools for Humanity built the Orb for the “World” network, an open-source platform that serves as a sort of repository for “verified” humans. World users in some countries can buy and sell the “Worldcoin” cryptocurrency, a token only accessible to people on the network. The company has gifted some Orb users a bit of Worldcoin in exchange for photographing their faces. (CNN was not offered the cryptocurrency, which is not available in the United States, when testing the device.)

The network project launched as Worldcoin in 2023, before rebranding to just “World” last year, leaving the name of the token unchanged. Leaders said the effort could help manage the “economic implications” of AI; for example, by providing a means of delivering universal basic income, an idea of which Altman is a proponent. For now, Tools for Humanity’s main source of revenue is selling or renting Orbs to World “community operators,” who earn extra Worldcoin in exchange for onboarding new users to the network, although the company said it’s exploring additional revenue streams.

As with many other cryptocurrency projects, Worldcoin has drawn extra scrutiny. Some skeptics raised concerns that offering users cryptocurrency in exchange for using the technology could encourage them to overlook potential privacy risks. Others worried the project could centralize too much power under Altman, who already has an outsized role in what’s expected to be a revolutionary tech transformation. Last fall’s rebrand was viewed as a likely effort to expand the project’s identity beyond the original cryptocurrency effort. Sada says Worldcoin gives users ownership over the World network. Tools for Humanity did not respond to a request for comment on behalf of Altman, its chairman.

The Orb also faces ongoing scrutiny from privacy regulators in Europe and elsewhere, given that it works by collecting sensitive biometric data.
According to Tools for Humanity, photos taken by the Orb are sent to users’ phones along with their WorldID and then are automatically deleted from the device, so the company retains no biometric data. The data is encrypted en route to users’ devices, where users can set up the app to require authentication before opening it. The company has also open-sourced much of the technology behind the Orb, an effort, it says, to allow third parties to verify its safety claims. “It’s not very intuitive that something like this is … private,” Sada said. “We think that privacy is freedom, and something like World exists first and foremost to defend people’s freedom.”

Source: CNN