If you ask anyone in and around the Web3 gaming space, they’ll tell you that bots are present and problematic. However, most people accept this as an industry given, without a true accounting of the full set of reasons behind it, or a true desire to solve the problem.
There’s a pernicious double-edged sword to bots and sybils in Web3 gaming: a short-term temptation to turn a blind eye, because inflated top-line metrics satisfy VC investors and meet the user-count qualifications exchanges place on getting listed. Moreover, the increasing sophistication of bot farms and the power of LLMs to help hobbyists build and scale bots faster means the challenge is getting harder.
Hope isn’t lost, though. We’ll cover what we’re seeing across the industry and share actionable tactics, open source technologies, and strategies for stopping bots and sybil attacks - as well as frameworks for generating win-win-wins: maintaining metric integrity without taking the economic losses.
Everyone in the Web3 gaming space knows about bots. They're like that uninvited guest at a party who won't leave and keeps drinking the good booze. But the scale of the problem? That's where things get interesting – and alarming. Let's cut to the chase with numbers:
The bot problem in Web3 gaming isn't just about numbers on a screen. It's about trust, player experience, and the sustainability of these digital economies. As we dive deeper into this article, we'll explore why bots are so prevalent in Web3 games, the real impact they're having, and most importantly, how - and when - to fight back.
Now that we've established the scale of the bot problem, let's dive into why Web3 games are such attractive targets. It's a perfect storm of advancing technologies, high profitability, inadequate defenses, and conflicting incentives.
Remember when creating bots required extensive coding knowledge? Those days are long gone. With the advent of Large Language Models (LLMs), creating sophisticated bots is now as easy as having a conversation.
In a recent demonstration we built, an LLM-powered bot was able to bypass Twitter's advanced security measures - even solving a CAPTCHA in a few seconds. If it can fool Twitter, imagine what it can do to a fledgling Web3 game.
Free, infinite, and can be created programmatically at scale
Some games rely on OAuth with services like Telegram, Discord, Twitter, and Google to inherit the perceived trust layer those platforms provide. Others use manual email or phone number 2FA. Easy as it is to let someone else solve the problem for you, it doesn’t work: hundreds of temporary email and phone number sites let you access disposable email inboxes for free and real US SMS numbers for $0.75-$1.00.
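As a concrete illustration, here is a minimal sketch of how a game backend might screen signups against a disposable-email blocklist. The handful of domains below is a tiny illustrative sample, not a real blocklist; a production deployment would sync a maintained list of tens of thousands of domains on a schedule.

```python
# Minimal sketch: screen signup emails against a disposable-domain blocklist.
# The domains below are a tiny illustrative sample; a real deployment would
# sync a maintained, regularly-updated list.
DISPOSABLE_DOMAINS = {
    "mailinator.com",
    "guerrillamail.com",
    "10minutemail.com",
    "temp-mail.org",
}

def is_disposable_email(email: str) -> bool:
    """Return True if the email's domain is on the disposable blocklist."""
    domain = email.rsplit("@", 1)[-1].strip().lower()
    return domain in DISPOSABLE_DOMAINS

print(is_disposable_email("bot4721@mailinator.com"))  # True
print(is_disposable_email("player@gmail.com"))        # False
```

The same pattern applies to phone numbers: check the number's carrier type against known VoIP and SMS-rental ranges before accepting it as 2FA proof.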
Lastly, the rise of professional device rigs has enabled fraudsters to scale operations and reduce costs. Douglas Mun shares a powerful thread showing the evolution of click fraud farm technology over the last several years.
To effectively combat bots, we must first understand the mindset of those creating them. Bot makers approach their activities like any business venture, with a clear focus on ROI.
To do this, we chatted with an actual fraudster on Upwork - “Robert” (name changed for privacy purposes). Robert builds bots and scripts for people who want to automate account creation on games and dating apps (like Bumble), and to complete surveys at scale.
The focus on ROI was abundantly clear when we spoke with Robert. He views each prevention tactic a company employs through a cost-to-beat framework. For example, when talking about Bumble’s face verification and phone number checks, he mentioned that these just meant a few more hours of development work and $1 temporary phone numbers.
Most sybil farms operate like businesses - with a clear start and end to the day. Just like we head to the office every day, so do they.
In one game Verisoul partners with, we observed a curious phenomenon: every night, like clockwork, there was a surge in bot activity starting at exactly 7:14 PM CT.
This isn't random; this is orchestrated, automated exploitation at scale, by operators who treat this seriously, like a profit-maximizing enterprise. Additionally, we see bot surges mostly on weekdays, between 4am and 5pm GMT, which covers the weekday working hours in fraud hotspots like:
Sundays are always the “safest” day for avoiding bots, according to our internal data at Verisoul. Robert anecdotally confirms this - he says he and others mostly work during the weekdays, and sometimes weekends if clients need quick turnarounds.
Compared with regular gaming, Web3 games are uniquely profitable because of their:
"If bots are more common in Web3 games, it's likely due to the types of games available in this space."
He further explains the motivations behind botting in Web3 specifically:
"Organized groups seeking financial gain need a way to cash out. Web3 games in general are likely to include features that facilitate this."
Web3 games, with their focus on player-owned economies and real-world value extraction, inadvertently create the perfect environment for both these motivations to thrive.
The unique ecosystem of Web3 gaming creates a paradoxical environment where the incentives of various stakeholders often conflict, inadvertently fostering an ideal breeding ground for bots.
1. The Token Price Growth Imperative: There’s a pernicious feedback loop that disincentivizes blocking bots:
a. Venture Capital firms, a primary source of funding for many Web3 game projects, prioritize rapid growth and impressive metrics to drive token prices, so they can show growth to LPs in hopes of raising future funds
b. To do this, games must get listed on the biggest/best exchanges, which care about user count (and not user integrity)
c. This creates pressure on game developers to show substantial user acquisition and engagement numbers, often at the expense of long-term sustainability
2. Short-Term vs. Long-Term Success Metrics: The focus on short-term growth often conflicts with the need for building a sustainable, bot-resistant ecosystem. Game developers find themselves in a Catch-22 situation:
3. User acquisition platform incentive misalignment: There’s a new breed of questing platforms - or more plainly, user acquisition platforms - which aim to connect gamers with games or quests to discover games and Web3 apps. The incentives are particularly skewed here, because the platforms themselves get paid per user or per “acquisition.” They therefore have a disincentive to stop bots, because doing so reduces their revenue in the short term.
In our first of a series of calls with Robert (Upwork fraudster mentioned above), he let us know that he:
"built a bot that could earn tokens in Crabada. I sold that bot to dozens of players. Days later, the entire game economy collapsed."
Crabada's story is a stark warning to the industry. In just 30 days, the game went from processing 500,000 daily transactions to virtually zero. Why? Bots. Specifically, Robert’s bots. Despite early warnings and attempts at patches, the problem spiraled out of control. By the time decisive action was taken, it was too late – the economy had already crashed.
While the prevailing narrative paints bots as universally harmful, the reality is more nuanced.
Hilmar of CCP Games offers an intriguing perspective:
"Automating repetitive tedious tasks in a game could reduce player frustration and increase enjoyment."
This suggests that some forms of "botting" could actually improve player satisfaction if implemented correctly.
Hilmar also notes:
"Bots could have a positive effect in some game economies by making basic resources more abundant."
In complex game economies, bots could potentially play a role similar to market makers in financial markets, providing liquidity and stability.
As Luke from Pixels points out:
"If people aren't trying to bot your game - it's not because they can't - it's because they don't care enough to do it. It's not always the flex you think to say you don't have any bots in an ecosystem."
And as we noted above, if they’re helpful in getting listed on exchanges, they do have some value to the ecosystem.
The fight against bots in Web3 gaming isn't just about implementing a few security measures. It's a comprehensive strategy that requires understanding the economics of bot creation, implementing smart defenses, and constantly evolving your approach. Let's dive deeper into each aspect of this strategy.
While bots can theoretically play any game - even complex, team-based games like CSGO - increasing complexity significantly impacts their profitability by increasing development time and the probability that the bot fails. Here's a deeper look at how to do this:
Botters want to run their automation on servers without GPUs, which are far cheaper than those with actual GPU hardware. By making games more computationally intensive, the resources required go up, and bot profits go down. What may seem negligible for one real user or device can become prohibitively expensive at the scale of 10K+ accounts. Here are a few suggestions on how to do this:
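One cheap way to impose per-account compute cost is a hashcash-style client puzzle. The sketch below is illustrative, not a specific game's implementation: each client must burn CPU to find a nonce, while the server verifies it with a single hash. One solve is negligible for a real player's device, but multiplied across 10K+ accounts it eats into a farm's margin. The challenge string and difficulty are placeholder values.

```python
import hashlib
import itertools

# Hashcash-style client puzzle (illustrative sketch). The client must find a
# nonce such that SHA-256(challenge + nonce) starts with `difficulty` zero hex
# digits. Solving is CPU-bound for the client; verifying is one hash for the
# server. Difficulty and challenge format are placeholder choices.

def solve_challenge(challenge: str, difficulty: int = 4) -> int:
    """Client side: brute-force a nonce meeting the difficulty target."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int = 4) -> bool:
    """Server side: a single hash, essentially free to check."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve_challenge("session-abc123")
assert verify("session-abc123", nonce)
```

Tuning difficulty per risk score (harder puzzles for suspicious sessions) lets you tax likely bots without adding friction for everyone.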
Economic barriers can significantly deter bot operators by increasing their upfront costs and reducing potential profits.
Pixels' implementation of a $35 upfront cost for withdrawals is a prime example of an effective economic barrier. According to their post-implementation report, this measure reduced bot activity by 40% within the first month. However, they also noted an unintended consequence: some bot operators started pooling resources, creating "super bots" that could still turn a profit. This led Pixels to implement additional measures, highlighting the need for a multi-faceted approach.
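Some back-of-the-envelope math shows why an upfront fee like Pixels' bites. The $35 figure comes from the example above; the per-bot earnings and farm size below are hypothetical assumptions, there only to illustrate the break-even calculation.

```python
# Break-even sketch: how long until a bot account recoups an upfront fee?
# The $35 fee is from the Pixels example; earnings and farm size are
# hypothetical assumptions for illustration.
upfront_fee = 35.00       # per-account cost to unlock withdrawals (USD)
daily_earnings = 0.50     # assumed value extracted per bot per day (USD)
num_accounts = 10_000     # assumed farm size

breakeven_days = upfront_fee / daily_earnings
total_capital = upfront_fee * num_accounts

print(f"Days to break even per account: {breakeven_days:.0f}")   # 70
print(f"Capital required for the farm:  ${total_capital:,.0f}")  # $350,000
```

At these assumed numbers, a farm must front six figures and survive ten weeks before turning a profit - which also explains the "super bot" pooling response Pixels observed: consolidating earnings into fewer, higher-yield accounts shortens the payback period.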
In 2021, Twitter introduced Twitter Blue, a paid subscription service that offers exclusive features and increased visibility for verified accounts. By tying verification to a monthly fee, Twitter created an economic barrier that significantly reduced the prevalence of bot accounts. Verified users receive priority ranking in conversations, making it harder for bots to drown out legitimate users.
Sharing strategies on what is working is one way to improve our collective deterrence of fraudulent accounts. Leaders like @whatslukedoing, @HilmarVeigar, and Games on the Block openly share the tactics that have worked well.
While economic deterrents are crucial, robust security measures form the backbone of any effective anti-bot strategy.
Five Critical Questions to Guide Your Strategy
When developing your strategy to prevent bots and promote a healthy game ecosystem, ask yourselves these 5 questions:
As Sasa, the CTO of Community Gaming, said about their initial third-party solution:
"Our onboarding process was riddled with CAPTCHAs. We thought we were stopping bots, but in reality, we were just frustrating our real users and stunting our growth."
Here are the key categories of solutions that every modern Web3 gaming security stack should include. Additionally, for those looking to build this stack in-house, we’ve included some of the best open source Github Repos:
| Technology | Stack Layer | Description & Importance | Open Source | Vendors |
|---|---|---|---|---|
| Web Application Firewall (WAF) | Network & Server Access | Acts as a gatekeeper, monitoring and filtering traffic between the game server and users. Detects and blocks common web-based attacks like SQL injection, XSS, and DDoS attempts. | | Most CDN providers |
| 🤖 Bot Detection | Application Access, Gameplay | Identifies and prevents automated scripts from interacting with the game, preserving fair play and preventing exploitation. | | |
| 🗺️ Proxy & VPN Detection | Application Access | Provides IP reputation and geolocation data to identify and block connections from known bot networks, VPNs, or suspicious regions. | | |
| 📱 Virtual Machine & Emulator Detection | Application Access | Identifies bots running on virtual machines or emulators to hide their true hardware and evade detection. | | |
| 💬 Fake Email & Phone Number Detection | Account Creation | Validates the authenticity of user contact information during registration and account recovery to prevent bot accounts. | | |
| 🔗 Account Linking & Device Fingerprinting | Account Creation, Gameplay | Captures unique characteristics of a user's device to identify and link multiple accounts from the same device, detecting multi-accounting and bot farms. | | |
| 💳 Wallet Linking | Account Creation, Gameplay | Clusters wallets by analyzing on-chain transactions and behaviors. | | |
Once we have our signals, the next step is to transform this raw data into actionable insights. This is where simple scoring, machine learning, and advanced analytics come into play. The decisioning phase involves:
The output of this phase is a set of decisions about how to handle different risk profiles under different scenarios.
A simple but effective decision model is classifying users as: Real, Suspicious, Fake, or more granularly:
The final step is implementing real-time workflows that turn our decisions into concrete actions. Where actions might be:
These workflows can range from simple to highly complex:
Simple:
If user_risk_score > threshold, then block_account()
Moderate:
If (user_risk_score > moderate_threshold) && (action == "enter_tournament"), then require_additional_verification()
Complex:
If (user_IP_is_proxy == True) && (transaction_amount > 50) && (user_risk_score > low_threshold), then perform_face_match() || request_phone_verification()
These workflows should be flexible and easily adjustable as new fraud patterns emerge.
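The three tiers above can be folded into one small rules table evaluated per event, which keeps rules easy to reorder and extend as new fraud patterns emerge. Thresholds and action names below mirror the pseudocode and are illustrative only.

```python
# The tiered pseudocode above as a small rules table, evaluated in order
# against each user event; the first matching rule wins. Thresholds and
# action names are illustrative.
RULES = [
    (lambda e: e["risk_score"] > 0.9,
     "block_account"),
    (lambda e: e["risk_score"] > 0.6 and e["action"] == "enter_tournament",
     "require_additional_verification"),
    (lambda e: e["ip_is_proxy"] and e["amount"] > 50 and e["risk_score"] > 0.3,
     "face_match_or_phone_verification"),
]

def decide(event: dict) -> str:
    """Return the action for the first rule the event matches, else allow."""
    for condition, action in RULES:
        if condition(event):
            return action
    return "allow"

event = {"risk_score": 0.7, "action": "enter_tournament",
         "ip_is_proxy": False, "amount": 10}
print(decide(event))  # require_additional_verification
```

Keeping the rules as data (rather than hard-coded branches) is what makes the workflows "easily adjustable": an operator can add, drop, or reorder rules without touching the evaluation loop.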
We’re already seeing the next wave of top-tier studios learning from the mistakes of others. Rather than repeat history, several of the latest releases have focused on bot and fake account prevention from the start. For example, the Guild of Guardians team from Immutable Games - which had the benefit of experience after launching Gods Unchained a few years ago - has been laser-focused on user integrity since its launch in May. Some games are even proactively testing fake account detection platforms before going live: Sonic Games, Burnghost, and Avalon are all leaning into bot prevention even during alpha and beta stages.
The fight against bots in Web3 gaming is ongoing and ever-evolving. The most successful strategies employ multiple layers of defense, constantly adapting to new threats. As we've seen from the various case studies and examples, there's no one-size-fits-all solution. Each game must tailor its approach based on its unique ecosystem, player base, and resources.
Remember, the goal isn't just to stop bots, but to create an environment where real players can thrive. By understanding the economics of botting, implementing smart defenses, and prioritizing user experience, Web3 games can create robust, enjoyable, and fair ecosystems for all players.