Bots have gained massive media attention: Elon Musk made it his personal mission to rid Twitter of bots, bots were accused of buying out Taylor Swift tickets, and bots were used to steal and resell pharmacy prescriptions.

In 2023, we predict bots will continue to drive headlines, revealing innovative ways to commit fraud, influence public opinion, and shape the economy.

See our cybersecurity predictions below. 👇

Prediction 1:
Online organizations will experience a record number of bots performing account takeover (ATO) and fake account generation for creative fraud schemes

Just a few weeks into the new year, automated attacks are already behind many of the major security breaches making headlines. Recent examples include a national fast-food chain investigating large-scale fraud, a security company's vaulted passwords being breached, and, just today, an electronic payments platform disclosing that customer accounts and data were exposed.

In each of these incidents, the attacker relied on automation to execute credential stuffing attacks, replaying lists of stolen username/password pairs against login endpoints, with the goal of monetizing their efforts. We commonly see stolen and fake accounts monetized through traditional forms of fraud such as payment fraud, loyalty program fraud, and promotion abuse.
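One reason credential stuffing is detectable is its velocity: a single source produces far more failed logins across distinct accounts than any human would. The sketch below is a deliberately naive, illustrative sliding-window check (thresholds, function names, and the idea of keying on source IP alone are all assumptions for the example, not any vendor's detection method; real bot defenses use many more signals):

```python
import time
from collections import defaultdict, deque

# Illustrative sketch only: flag a source that racks up too many failed
# logins inside a sliding time window. Thresholds are arbitrary examples.
WINDOW_SECONDS = 60
MAX_FAILURES_PER_SOURCE = 10

# source IP -> timestamps of recent failed login attempts
_failures = defaultdict(deque)

def record_failed_login(source_ip, now=None):
    """Record one failed login; return True if the source looks automated."""
    now = time.time() if now is None else now
    q = _failures[source_ip]
    q.append(now)
    # Evict events that have aged out of the sliding window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    return len(q) > MAX_FAILURES_PER_SOURCE
```

A real deployment would combine this kind of velocity signal with device fingerprinting, proxy reputation, and behavioral telemetry, since stuffing tools routinely rotate IPs to stay under per-source thresholds.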

In 2023, bots will be leveraged to carry out even more inventive fraud schemes that impact eCommerce organizations’ operations and revenue.

Emerging Fraud Schemes 2023

Figure 1: Bot operators have been engaging in emerging fraud schemes such as retail refund fraud, streaming fraud, and BNPL fraud.

Notably, since fake and stolen accounts make up a large portion of the underground bot economy, their monetization isn't limited to fraud schemes. Attackers also make money by selling their methods as tools. Much as proxies are sold as tools for hiding the source of cyberattacks, botting tools are sold for tasks such as the mass creation of accounts. The harder it is to create fake accounts at a given eCommerce organization, the higher the resale value of the account or tool. Just like fraud, this hurts the company's revenue by incurring costs for resource consumption and verification methods such as one-time passwords (OTPs).

Prediction 2:
Bots will be leveraged to alter public opinion

If you’re on any social media platform, chances are you’ve seen bot activity. What you may not have noticed is how bots shape public opinion, influence purchasing behavior, and spread misinformation and disinformation. Because bots are a relatively cheap and effective way to amplify messages at scale, reaching hundreds of thousands of people almost instantly, they can be extremely dangerous: the information they spread can be blatantly incorrect or fabricated, yet still alter public perception. With the upcoming U.S. election, we believe bots will continue to play a major role in spreading political news and information this year, despite best efforts to crack down on fake accounts.

Prediction 3:
Bot operators will influence and shape the economy

The trend of bots manipulating stock prices, controlling supply and demand, and influencing online gambling will continue to have an impact in 2023. This may result in a more volatile market for companies and consumers alike.

Hype sales are just one example of bots driving up the price of a product by controlling inventory. Kasada’s Threat Intelligence team observed people becoming so successful at botting certain products that they were actually driving the products’ resale prices down and diminishing their return on investment (ROI). The botting community then collectively agreed to limit their success in order to drive the resale price back up so they could all maximize their profits.

Data-backed Observations from Kasada’s Threat Intelligence Team

Our predictions stem from recent observations of the bot ecosystem. The data suggests that attackers are more numerous and more motivated than ever to sustain their profits. Specifically:

  1. Bot attacks are easier to perform: Advanced services and tools for bypassing security detection are becoming commonplace in the underground. For a low fee, they let people without technical knowledge perform large-scale attacks on major organizations.
  2. More bad actors are using automation and AI: With the barrier to entry lowered, we’ve observed younger individuals entering the space. We’ve also seen more bot operators cross the line from using bots to purchase and resell in-demand goods such as sneakers, a legal gray area, to committing outright illegal acts such as online fraud.
  3. Bot operators are more financially motivated than ever: In times of economic uncertainty, individuals turn to fraud and cybercrime for extra cash, and organized crime takes advantage of market conditions to maximize profits.

Conclusion:

Taking into account our observations of the threat and fraud landscape, coupled with the state of the economy, all signs point to bots wreaking havoc on companies and individuals alike this year. Automation will be the driving force behind sophisticated cyberattacks and online fraud. Businesses and individuals urgently need to remain vigilant to ensure their accounts, endpoints, and APIs are safe from malicious automation and bad bots.

To learn more about the 2023 trends and predictions, and how to defend against automated attacks, schedule a time to meet with our experts.

Want to learn more?

  • The New Mandate for Bot Detection – Ensuring Data Authenticity

    Can the data collected by an anti-bot system be trusted? Kasada's latest platform enhancements include securing the authenticity of web traffic data.

  • The Future of Web Scraping

    If data is the new oil, then web scraping is the new oil rig. The potential impact of web scraping is escalating as the twin forces of alternative data and AI training both rapidly increase in size and complexity.

Beat the bots without bothering your customers — see how.