Internet bot

Internet bots, also known as web robots, WWW robots or simply bots, are software applications that run automated tasks over the Internet. Typically, bots perform tasks that are both simple and structurally repetitive, at a much higher rate than would be possible for a human alone. The largest use of bots is in web spidering, in which an automated script fetches, analyzes and files information from web servers. Bots may also be implemented where a response speed faster than that of humans is required (e.g., video gaming bots and auction-site robots) or, less commonly, where the emulation of human activity is required, as with chat bots. More recently, bots have also been used in search advertising, such as Google AdSense.
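The fetch-analyze-file loop of a web spider can be sketched in a few lines. The seed URL, the link-extraction regex and the `fetch` callback below are illustrative assumptions, not details from the article; a production crawler would use a real HTML parser and an HTTP client.

```python
# Minimal web-spider sketch: fetch a page, extract its links, and
# breadth-first crawl until a page limit is reached.
import re
from urllib.parse import urljoin

LINK_RE = re.compile(r'href="([^"]+)"')  # crude link extraction for the sketch

def extract_links(base_url, html):
    """Return absolute URLs for every href found in an HTML page."""
    return [urljoin(base_url, href) for href in LINK_RE.findall(html)]

def crawl(fetch, seed, limit=10):
    """Breadth-first crawl. `fetch(url)` returns the page's HTML as a string;
    crawling stops once `limit` pages have been filed."""
    seen, queue, pages = set(), [seed], {}
    while queue and len(pages) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        html = fetch(url)
        pages[url] = html                     # "file" the fetched information
        queue.extend(extract_links(url, html))  # discover new pages to visit
    return pages
```

Passing `fetch` as a parameter keeps the sketch testable without network access.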

Commercial purposes

Chatterbots are used in automated online assistants by organizations as a way of interacting with consumers and users of services. This allows enterprises to reduce their operating and training costs. A major underlying technology in such systems is natural language processing.
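At its simplest, an automated assistant maps incoming messages to canned answers. The rules and replies below are invented for illustration; real systems use natural language processing rather than keyword matching like this.

```python
# Toy rule-based chatterbot: the first keyword found in the user's
# message selects the reply; otherwise a fallback is returned.
RULES = [
    ("refund", "I can help with refunds. Could you share your order number?"),
    ("hours", "Our support desk is open 9am-5pm, Monday to Friday."),
    ("hello", "Hello! How can I help you today?"),
]
FALLBACK = "I'm not sure I understand; let me connect you to a human agent."

def reply(message):
    """Return the canned answer for the first matching keyword rule."""
    text = message.lower()
    for keyword, answer in RULES:
        if keyword in text:
            return answer
    return FALLBACK
```

Even this trivial dispatcher shows why such systems cut training costs: the organization encodes its answers once instead of teaching them to each agent.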

There has been a great deal of controversy about the use of bots in automated trading. The auction website eBay went to court in an attempt to stop a third-party company from using bots to traverse its site looking for bargains; this approach backfired on eBay and attracted the attention of further bots.

[Figure: How a botnet works]

A Twitterbot is a program used to produce automated posts via the Twitter microblogging service. Twitterbots come in various forms. For example, many serve as spam, enticing clicks on promotional links. Others post at-reply messages in response to tweets that include a certain word or phrase. These auto-tweets can either be silly or used to spread some themed message. Some Twitter users even program Twitterbots to assist themselves with scheduling or reminders.
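The at-reply pattern described above amounts to scanning incoming tweets for a trigger phrase and composing a mention in response. The `Tweet` type and the trigger/template strings below are hypothetical stand-ins; an actual bot would read and post through the Twitter API.

```python
# Sketch of a keyword-triggered at-reply Twitterbot.
from dataclasses import dataclass

@dataclass
class Tweet:
    author: str  # screen name without the leading "@"
    text: str

def auto_replies(tweets, trigger, template):
    """Yield an at-reply for every tweet whose text contains the trigger
    phrase (case-insensitive)."""
    for tweet in tweets:
        if trigger.lower() in tweet.text.lower():
            yield f"@{tweet.author} {template}"
```

The same loop, pointed at a stream of scheduled timestamps instead of tweets, covers the reminder-bot use mentioned above.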

A recent study of fake accounts on Twitter reveals a bustling business. Twitter bots tend to follow large numbers of accounts, not just those of people who pay for the privilege. This is likely because their creators do not want to make it obvious that the accounts are fake, and want to make it harder to detect who is paying for followers and who is not.

Malicious purposes

Internet bots are frequently used for malicious purposes. The most widely used anti-bot technique is CAPTCHA, which is designed to distinguish a human user from a less-sophisticated bot by means of a character recognition task that, ideally, only humans can perform successfully. This test can stop spambots from adding large amounts of spam to a webpage.
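The server-side half of a text CAPTCHA can be sketched as: issue a random challenge, remember the expected answer, and accept a submission only if it matches. This is a simplified assumption of the flow; real CAPTCHAs render the text as a distorted image, which is the step that actually defeats character-recognition bots and is omitted here.

```python
# Illustrative server-side CAPTCHA flow (image distortion omitted).
import secrets
import string

def new_challenge(length=6):
    """Return (challenge_text, expected_answer). They are identical here
    because the image-rendering/distortion step is left out of this sketch."""
    text = "".join(secrets.choice(string.ascii_uppercase) for _ in range(length))
    return text, text

def verify(expected, submitted):
    """Case-insensitive comparison using a constant-time check."""
    return secrets.compare_digest(expected.upper(), submitted.strip().upper())
```

Using `secrets` rather than `random` matters even in a sketch: a predictable challenge generator would let a bot precompute answers.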

Web spiders can also be used with malicious intent, although each server spidered may have a file called robots.txt containing rules for bots to follow. The usual purpose of this file is to stop harmless bots from accidentally doing something unwanted; however, bots designed specifically to be malevolent can easily ignore the file entirely.
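Python's standard library can parse robots.txt rules directly, which makes the voluntary nature of the file easy to see: a well-behaved spider runs this check before every fetch, while a malicious bot simply never does. The example rules and URLs below are illustrative.

```python
# Checking robots.txt rules with the standard library's robotparser.
from urllib.robotparser import RobotFileParser

rules = RobotFileParser()
# In practice: rules.set_url("http://example.com/robots.txt"); rules.read()
rules.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A compliant spider consults can_fetch() before requesting each URL.
print(rules.can_fetch("MyBot", "http://example.com/index.html"))  # allowed
print(rules.can_fetch("MyBot", "http://example.com/private/x"))   # disallowed
```

Nothing enforces the result: the server merely publishes its wishes, and honoring them is up to the bot's author.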

Some malicious purposes for bots include spambots that harvest email addresses, distributed denial-of-service attacks, and the automated creation of fake accounts.
