Akamai recently introduced its Content Protector tool, which protects websites from scraper bots that are not only a nuisance, but also a security threat.

Akamai acknowledges that scraper bots are an important part of the e-commerce ecosystem. They search for new content, highlight products on comparison sites and gather updated product information to share with customers.

Often, however, they are also used for more damaging purposes, such as undercutting competitors, gathering intelligence ahead of attacks that hoard inventory, and counterfeiting goods and website content.

Additionally, as long as scraper bots are not stopped, they hit websites around the clock, degrading web performance and leaving consumers with a spotty connection to various services. Furthermore, scraper bots have become increasingly sophisticated in recent years and continue to evade other security systems.

Introducing Content Protector

Akamai wants to put an end to this practice. The Content Protector tool helps website administrators detect and stop scraper bots that steal content with malicious intent.

The tool improves website performance, provides a better end-user experience and protects intellectual property. In addition, it is designed to keep false positives to a minimum when detecting malicious bots.

Akamai sees Content Protector not only as a protection tool, but also as a true “business enabler”. By protecting websites from malicious scraper bots, the tool should also help those sites generate more value.

Functionality

More specifically, the tool offers protection at several different levels. At the protocol level, it checks whether the way a visitor connects to a website is legitimate.
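To make this concrete, the sketch below shows a generic protocol-level consistency check of the kind described, not Akamai's implementation: a server compares the browser claimed in the User-Agent header with the fingerprint of the TLS handshake it actually observed. The fingerprint values and lookup table are placeholders.

```typescript
// Generic sketch of a protocol-level consistency check; not Akamai's code.
// Idea: the fingerprint of the TLS handshake should match what the browser
// named in the User-Agent header normally produces.

interface ConnectionSignals {
  userAgent: string;      // claimed browser, taken from the HTTP User-Agent header
  tlsFingerprint: string; // hash of the observed TLS ClientHello (JA3-style)
}

// Placeholder lookup table; a real deployment would maintain curated fingerprints.
const KNOWN_FINGERPRINTS: Record<string, string[]> = {
  chrome: ["placeholder-chrome-fingerprint"],
  firefox: ["placeholder-firefox-fingerprint"],
};

function browserFamily(userAgent: string): string | null {
  if (/Firefox\//.test(userAgent)) return "firefox";
  if (/Chrome\//.test(userAgent)) return "chrome";
  return null;
}

// True when the handshake looks consistent with the claimed browser;
// unknown or mismatching clients are handed off to the later checks.
function protocolCheck(signals: ConnectionSignals): boolean {
  const family = browserFamily(signals.userAgent);
  if (family === null) return false;
  return KNOWN_FINGERPRINTS[family]?.includes(signals.tlsFingerprint) ?? false;
}

// Example: a client claiming to be Chrome but presenting an unknown handshake.
console.log(protocolCheck({
  userAgent: "Mozilla/5.0 ... Chrome/120.0",
  tlsFingerprint: "unknown-fingerprint",
})); // false
```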

At the application level, it evaluates whether the client can run certain “business logic” in JavaScript. If it can, the Content Protector client collects device and browser characteristics and user preferences. These are cross-checked against the protocol-level data to verify consistency.
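A minimal browser-side sketch of such an application-level check could look like the following; the /bot-check endpoint and the particular set of traits collected are assumptions for illustration, not Akamai's actual client.

```typescript
// Browser-side sketch of an application-level check (illustrative only).
// Merely running this code proves the client executes JavaScript; the collected
// traits can then be compared server-side with the protocol-level data.
// The /bot-check endpoint is hypothetical.

interface ClientTraits {
  userAgent: string;
  language: string;
  timezone: string;
  screen: { width: number; height: number };
  hardwareConcurrency: number;
  touchSupport: boolean;
}

function collectTraits(): ClientTraits {
  return {
    userAgent: navigator.userAgent,
    language: navigator.language,
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    screen: { width: window.screen.width, height: window.screen.height },
    hardwareConcurrency: navigator.hardwareConcurrency ?? 1,
    touchSupport: "ontouchstart" in window || navigator.maxTouchPoints > 0,
  };
}

// Send the traits to the server, which cross-checks them against what it saw
// at the protocol level (e.g. does the claimed browser match the TLS handshake?).
async function runChallenge(): Promise<void> {
  await fetch("/bot-check", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(collectTraits()),
  });
}

runChallenge();
```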

In terms of user interaction, the tool distinguishes human from bot traffic by examining how users interact with touchscreens, keyboards and mice. Bots give themselves away through a lack of interaction and abnormal usage patterns.
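The sketch below illustrates the general idea of interaction telemetry under similar assumptions; the /interaction-report endpoint and the ten-second observation window are made up for the example.

```typescript
// Sketch of interaction telemetry (illustrative only, not Akamai's code).
// Humans generate mouse, keyboard and touch events with irregular timing;
// a headless bot typically generates none at all, or perfectly regular ones.

const counts = { mouse: 0, key: 0, touch: 0 };

window.addEventListener("mousemove", () => { counts.mouse++; });
window.addEventListener("keydown", () => { counts.key++; });
window.addEventListener("touchstart", () => { counts.touch++; });

// After a short observation window, report whether any interaction was seen.
// The window length and reporting endpoint are assumptions for the sketch.
setTimeout(() => {
  const interacted = counts.mouse + counts.key + counts.touch > 0;
  navigator.sendBeacon(
    "/interaction-report",
    JSON.stringify({ ...counts, interacted }),
  );
}, 10_000);
```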

Finally, user behavior is analyzed and website traffic is assigned a risk classification.
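Conceptually, such a classification combines the signals from the earlier checks into a single risk score. The weights, thresholds and risk categories in the sketch below are illustrative assumptions, not Akamai's model.

```typescript
// Sketch of combining bot-detection signals into a risk classification
// (illustrative only; weights and categories are assumptions).

interface Signals {
  protocolConsistent: boolean; // from the protocol-level check
  ranJavaScript: boolean;      // from the application-level challenge
  traitsConsistent: boolean;   // client traits match the protocol-level data
  humanInteraction: boolean;   // from the interaction telemetry
}

type RiskClass = "low" | "medium" | "high";

function classify(s: Signals): RiskClass {
  let score = 0;
  if (!s.protocolConsistent) score += 2;
  if (!s.ranJavaScript) score += 3; // failing to run JavaScript is a strong bot signal
  if (!s.traitsConsistent) score += 2;
  if (!s.humanInteraction) score += 1;
  if (score >= 5) return "high";
  if (score >= 2) return "medium";
  return "low";
}

// Example: a client that passed the protocol check but never ran the challenge.
console.log(classify({
  protocolConsistent: true,
  ranJavaScript: false,
  traitsConsistent: false,
  humanInteraction: false,
})); // "high"
```

A production system would rely on far richer behavioral features and continuously tuned models, but the shape of the decision is the same: the more checks a client fails, the higher its risk class.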

Also read about some of Akamai’s research findings: Linux IoT devices vulnerable to self-spreading botnet