Now That Most Web Traffic is Non-Human

In 2024, more than half of all web traffic came from bots — a shift most people didn’t notice until this year.

The word ‘bot’ is short for ‘robot,’ which originates from the Czech term ‘robota,’ meaning ‘forced labour’ or ‘drudgery.’ It was popularized by Czech writer Karel Čapek in his play ‘Rossum’s Universal Robots,’ which premiered in 1921.

Most bots are helpful, though even well-intentioned ones can overwhelm a small website with sudden bursts of requests. When you watch videos online on platforms like YouTube or Netflix, recommendation bots suggest content based on your preferences. When you shop online, bots help with customer support, suggest products, and even process your orders. These systems make experiences smoother, faster, and more personalized. Every time you interact with a website, app, or service, there’s a good chance a ‘helpful bot’ is working behind the scenes.

Over the past 24 hours, from 4:00 AM on November 27 to 3:59 AM on November 28, the top five bots accessing Cortes Currents were:

  • Bingbot — Microsoft’s web crawler, which discovers and indexes pages for search engines.
  • Googlebot — Google’s web crawler, with the same purpose.
  • Petalbot — a Chinese web crawler, with the same purpose.
  • Amazonbot — another web crawler used for indexing.
  • Claudebot — a web crawler that fetches publicly available pages to help improve and evaluate Claude’s capabilities.

These are not bots Cortes Currents intends to block.
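The conventional way for a site owner to signal which crawlers are welcome is a robots.txt file at the site root. A minimal sketch is below — the user-agent tokens shown are the ones these crawlers publish, but the directives are voluntary, and not every crawler honours them (Crawl-delay, in particular, is ignored by some):

```
# robots.txt — served from the site root, e.g. https://example.com/robots.txt
# Well-behaved crawlers fetch this file and follow its directives.

User-agent: Bingbot
Allow: /

User-agent: Googlebot
Allow: /

# Ask all other crawlers to pause between requests (not universally honoured)
User-agent: *
Crawl-delay: 10
```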

In the same 24-hour window, web-crawling bots made 38,440 requests. Four were flagged as ‘attacks,’ and the website’s new security software blocked them. More than 1,000 requests were ‘unsuccessful,’ which likely means the website did not respond quickly enough and the connections timed out.
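Counts like these typically come from the server’s access log, where each request records a user-agent string that identifies the crawler. A minimal sketch of the tallying idea, using hypothetical log lines (real log formats and file locations vary by host):

```python
from collections import Counter

# Hypothetical access-log lines in the common "combined" format;
# the last quoted field is the user-agent string.
sample_lines = [
    '1.2.3.4 - - [27/Nov/2025:04:12:01 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '5.6.7.8 - - [27/Nov/2025:04:12:05 +0000] "GET /feed HTTP/1.1" 200 2048 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '9.9.9.9 - - [27/Nov/2025:04:13:44 +0000] "GET /about HTTP/1.1" 499 0 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

# Tokens the crawlers named above use to identify themselves.
KNOWN_BOTS = ("bingbot", "googlebot", "petalbot", "amazonbot", "claudebot")

def count_bot_requests(lines):
    """Tally requests per known crawler by matching the user-agent field."""
    counts = Counter()
    for line in lines:
        ua = line.rsplit('"', 2)[-2].lower()  # extract the last quoted field
        for bot in KNOWN_BOTS:
            if bot in ua:
                counts[bot] += 1
                break
    return counts

print(count_bot_requests(sample_lines))
# Counter({'googlebot': 2, 'bingbot': 1})
```

Note that user-agent strings can be spoofed, which is why dedicated bot-management tools also look at request patterns and source addresses rather than trusting the label alone.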

This is the slack season; in past years Cortes Currents has averaged a little over 5,000 visitors per month. As a result of the sudden influx of bots, that number has jumped to almost 19,000 since October 26.

Jan David Nose, an engineer on the Rust project’s infrastructure team, described what a similar influx meant for its websites: “We’ve seen crawlers hit us hard from time to time, and that has caused noticeable service degradation. We’re painfully aware of the increase in traffic that comes in short but very intense bursts when crawlers go wild.”

It should come as no surprise that this kind of pressure follows popularity. The same logic applies to Chrome, the world’s most widely used browser and the one on most people’s personal computers.

As the Anomali Cyber Watch puts it: “Chrome hitting its seventh exploited zero-day this year should be a wake-up call. It isn’t bad luck; it’s the natural outcome of being the most widely used browser on the planet. High usage creates high incentive, and attackers follow that incentive relentlessly. This is why patch management matters just as much at home as it does in the office. Your personal laptop, your phone, your family’s devices and your workstations all sit on the same internet and face the same exploitation chains. Treat browser updates as non-negotiable, verify that auto-update is actually enabled and remind the people around you to do the same. Expect more Chrome zero-days and assume attackers will move quickly.”

Nevertheless, bots are useful tools, and their web footprint will undoubtedly grow in the years ahead. The takeaway: helpful bots keep the internet running — but when they spike, they can strain smaller websites. One of the most effective steps for individuals and small site owners is to keep software up to date and use modern security tools. At Cortes Currents, we’ve switched to a security program that specializes in bot management.

Links of Interest:

Top image credit: Wizard holding up a Palantír – AI art prompted by Flux Schnell via Wikimedia (Public Domain)
