The User-Agent request header contains a characteristic string that lets network protocol peers identify the application type, operating system, software vendor, or software version of the requesting user agent.
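For concreteness, here's what a typical User-Agent string looks like, with a small Python sketch that pulls out its product/version tokens. The string is a representative Chrome-on-Windows example, not tied to any particular browser build:

```python
import re

# A typical desktop-browser User-Agent string (Chrome on Windows).
ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
      "AppleWebKit/537.36 (KHTML, like Gecko) "
      "Chrome/124.0.0.0 Safari/537.36")

# Product tokens take the form "name/version"; the parenthesised
# comment carries platform details (OS, architecture).
tokens = re.findall(r"([\w.]+)/([\w.]+)", ua)
print(tokens)
# [('Mozilla', '5.0'), ('AppleWebKit', '537.36'),
#  ('Chrome', '124.0.0.0'), ('Safari', '537.36')]
```

Note the historical quirk: for compatibility reasons nearly every modern browser leads with `Mozilla/5.0`, so the later tokens are the ones that actually distinguish browsers.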
Everyone browsing the web right now has a user agent. It's the software that acts as the bridge between you, the user, and the internet. User agents are easiest to understand if we backtrack and look at the evolution of the web.
In the early days of the internet, when it was a text-based system, users had to type commands to navigate and send messages. Now browsers do that for us: we simply point and click, and the browser acts as our "agent," turning our actions into commands.
When your browser (or a similar client) loads a website, it identifies itself in the User-Agent header as it retrieves the content you've requested. Along with that identification, the browser sends a host of information about the device and network it's on. This is a really useful set of data for web developers, since it allows them to customize the experience depending on the user agent that loaded the page.
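As a sketch of how a developer might act on that data, here's a minimal, deliberately naive Python function that picks a page variant from the User-Agent string. Real sites typically use a dedicated parsing library; the substring checks here are illustrative assumptions, not a robust detection method:

```python
def choose_variant(user_agent: str) -> str:
    """Pick a page variant from the User-Agent header value.

    A simple sketch: "Mobile" as a token is a common but imperfect
    signal for handheld browsers, and "bot"/"crawler" substrings
    catch many (not all) automated agents.
    """
    if "Mobile" in user_agent:
        return "mobile"
    if "bot" in user_agent.lower() or "crawler" in user_agent.lower():
        return "crawler"
    return "desktop"
```

A server would call this with the header value from each incoming request, e.g. `choose_variant(request.headers.get("User-Agent", ""))`, and serve the matching template.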
Browsers are a straightforward example of a user agent, but other tools can act as agents. Crucially, not all user agents are controlled or instructed by humans, in real time. Search engine crawlers are a good example of a user agent that is (largely) automated — a robot that trawls the web without a user at the helm.
Here's a list of some of the user agents you'll encounter:

- Web browsers, such as Chrome, Firefox, Safari, and Edge
- Search engine crawlers, such as Googlebot and Bingbot
- Command-line tools and HTTP libraries, such as curl, wget, and Python's urllib
- Link checkers, feed readers, and uptime-monitoring tools
Using user agent strings to control how a site behaves is not a completely fool-proof method, because some user agents are not what they seem.
It's possible to send a fake user agent, a process known as "spoofing." This can be used for innocent purposes — like usability, or testing. It can also be used to manipulate content or impersonate another device maliciously.
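For example, a script can spoof a browser simply by setting the header itself. This sketch uses Python's standard urllib; the URL is a placeholder, and without this header urllib would send its own default identifier (something like `Python-urllib/3.x`):

```python
from urllib.request import Request

# Spoofing: a script presenting a browser-like User-Agent string.
spoofed = Request(
    "https://example.com",  # placeholder URL
    headers={"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                           "AppleWebKit/537.36 (KHTML, like Gecko) "
                           "Chrome/124.0.0.0 Safari/537.36"},
)
# urllib.request.urlopen(spoofed) would now send the browser-like
# string, so the server sees what looks like a Chrome visitor.
```

This is exactly why user-agent checks are advisory: the header is entirely under the client's control.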
If you see strange visitor behavior and suspect a spoofed user agent or malicious crawler, a sensible first step is to block the offending IP address. Bear in mind that even this isn't absolute: a determined operator can rotate through addresses, so treat IP blocking as one layer of defense rather than a guarantee.
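Blocking is usually done at the web server or firewall, but the underlying check is simple. Here's a minimal sketch using Python's standard `ipaddress` module; the addresses are reserved documentation ranges, standing in for whatever you'd actually block:

```python
import ipaddress

# Hypothetical blocklist: single addresses and CIDR ranges.
BLOCKED = [ipaddress.ip_network(n)
           for n in ("203.0.113.7/32", "198.51.100.0/24")]

def is_blocked(client_ip: str) -> bool:
    """Return True if the client address falls in any blocked network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BLOCKED)
```

Supporting CIDR ranges (rather than only exact addresses) matters in practice, since abusive crawlers often operate from a whole subnet rather than a single machine.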