Bots can skew your website analytics

When companies try to quantify the damage caused by malicious bot activity, they can calculate the ad revenue lost to click and ad fraud, or account for the prevention and remediation costs of a data harvesting or credential stuffing attack.

But a new survey-based report from Netacea suggests a hidden cost of bots is sometimes overlooked: biased analysis of website traffic that leads companies to make ill-informed marketing and merchandising decisions.

“It’s a side effect of bot activity,” Andy Still, CTO at Netacea, said in an interview with SC Media. “Bots steal data, they stuff credentials… and I think companies… [are] trying to stop the negative goal the bots are trying to achieve. So what’s overlooked is that while the bot is doing this, it’s creating skewed data… which companies then make decisions on.

“It’s only when you start looking at your underlying data and see unusual usage patterns that you realize that this data might not be as reliable as you thought. And then you look back and see that a lot of that data was generated as a result of unwanted activity.”

That’s why marketing and security teams need to work together, so that when the former discovers a bot problem, the latter takes responsibility for fixing it, the report concludes, adding: “If marketing teams are basing their strategies on flawed data, do they have any chance of success?”

Of 440 companies surveyed in the US and UK, 56% told Netacea that bots had negatively affected their data analytics with minor financial impact, while 12% said the impact was moderately severe.

For example, around 55% said they incorrectly ordered new stock because bots had artificially inflated their sales figures, and just over 50% said they ran special promotions based on what turned out to be data tainted by bots. Additionally, around 55% said that bots had “stretched our online budget, leading to unnecessary investment in marketing activity.”

According to Still, many companies perform A/B testing on their websites, trying different variations of experiences or user journeys to see which ones generate the best customer response, “and then making marketing decisions based on the popularity of particular products or particular types of products.” But if bots interfere significantly, companies could “make marketing decisions based on what the bots want to do, which is clearly not the same as what humans want to do.”
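To make that failure mode concrete, here is a minimal Python sketch, not from the report, with entirely hypothetical variant names and traffic numbers, showing how unfiltered bot sessions can flip the apparent winner of an A/B test:

```python
from dataclasses import dataclass

@dataclass
class Session:
    variant: str      # hypothetical A/B variants: "single_form" or "wizard"
    converted: bool   # whether the session completed the goal action
    is_bot: bool      # flag from whatever bot-detection signal you trust

def conversion_rates(sessions, include_bots=True):
    """Conversion rate per variant, optionally excluding flagged bot sessions."""
    totals, wins = {}, {}
    for s in sessions:
        if not include_bots and s.is_bot:
            continue
        totals[s.variant] = totals.get(s.variant, 0) + 1
        wins[s.variant] = wins.get(s.variant, 0) + int(s.converted)
    return {v: round(wins[v] / totals[v], 2) for v in totals}

# Hypothetical traffic: bots hammer the single form and "convert" heavily.
sessions = (
    [Session("single_form", True, True)] * 400     # bot sessions
    + [Session("single_form", True, False)] * 90   # human conversions
    + [Session("single_form", False, False)] * 210
    + [Session("wizard", True, False)] * 160
    + [Session("wizard", False, False)] * 240
)

print(conversion_rates(sessions))                      # single_form: 0.70, wizard: 0.40
print(conversion_rates(sessions, include_bots=False))  # single_form: 0.30, wizard: 0.40
```

With the flagged sessions counted, the single form looks like the clear winner; with them excluded, the wizard wins. That is essentially the trap in the insurance-quote example Still describes below.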

When that happens, “they’ll make the website harder for customers to use, and as a result, they’ll lose business that way,” he continued.

One example Still encountered at Netacea involved a company that offered its customers price comparisons for insurance quotes. The company tested two different experiences for obtaining these quotes: in the first user journey, customers entered all their data into a single form; the second was more of a wizard-style experience that guided users through a sequence of steps. “And they were using analytics to figure out which one was the most successful,” he said.

It turns out the majority of humans preferred the wizard approach, but because the bots were using the single form, the company mistakenly thought the single form was more popular and made it the primary method for getting a quote. “Once we started engaging with them and removing bot traffic from their site, it became clear that the humans actually preferred the wizard-like approach,” and the company reversed course, he said.

For businesses that want to be more aware of the cost of skewed web traffic, Still offered a key recommendation: “The first step, I think, is to not necessarily trust the analytics. And if something doesn’t seem right, intuitively, then examine it further. Start analyzing it… maybe validate it with real users. But dig into those stats a bit more.”

Explore that next level of data, Still continued. You might find, for example, that certain traffic patterns all come from a particular geographic region that doesn’t fit the rest of your audience. That could be a hint of malicious bot activity.
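As a hypothetical illustration of that kind of check (the event format, the threshold, and the region codes are assumptions, not anything Netacea prescribes), here is a short Python sketch that flags regions sending far more traffic than purchases:

```python
from collections import Counter

def flag_suspicious_regions(events, ratio_threshold=3.0):
    """events: iterable of (region, event_type) pairs, where event_type is
    "visit" or "purchase". Flags regions whose share of visits greatly
    exceeds their share of purchases: lots of traffic, little real buying."""
    visits, purchases = Counter(), Counter()
    for region, event_type in events:
        if event_type == "visit":
            visits[region] += 1
        elif event_type == "purchase":
            purchases[region] += 1

    total_visits = sum(visits.values())
    total_purchases = sum(purchases.values()) or 1  # avoid divide-by-zero
    flagged = {}
    for region, n in visits.items():
        visit_share = n / total_visits
        purchase_share = purchases[region] / total_purchases
        if purchase_share == 0:
            flagged[region] = float("inf")  # traffic but zero purchases
        elif visit_share / purchase_share >= ratio_threshold:
            flagged[region] = round(visit_share / purchase_share, 1)
    return flagged

# Hypothetical data: an unexplained surge of visits from one region ("XX")
# that produces no purchases at all.
events = (
    [("US", "visit")] * 500 + [("US", "purchase")] * 50
    + [("UK", "visit")] * 300 + [("UK", "purchase")] * 30
    + [("XX", "visit")] * 900
)

print(flag_suspicious_regions(events))  # {'XX': inf}
```

A region that dominates your visits but never buys anything is exactly the kind of “unusual usage pattern” Still describes, and a natural place to start validating the data.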

“Often with bots… they’re easy to identify,” he said. “If something seems unusual, examine the tools available to get more information. Validate before making big decisions based on data you’re not comfortable with.”

Charles J. Kaplan