- How fingerprint inconsistencies become detection signals, and the most common patterns
- How to think about consistency checks across HTTP, JavaScript, rendering, and environment signals
- Common scraping pitfalls that trigger inconsistency flags, and practical ways to reduce them
Why fingerprint inconsistencies are easy to use for detection
The core reason is simple: inconsistencies tend to be a low-false-positive signal for defenders. Spoofing a single field (for example, User-Agent) is relatively easy. But producing a fully coherent "story" across HTTP headers, JavaScript APIs, rendering outputs, and OS-derived characteristics is harder, and more expensive.
Inconsistencies also hint at what you tampered with. If you only change the UA, you'll often introduce a mismatch between UA and UA-CH (Client Hints) and/or JavaScript-exposed values. Detection systems can reliably pick up these "misalignments that normal browsers rarely produce."
Key takeaway
Bot detection isn't only about guessing "who you are."
In practice, a very effective strategy is to check whether your client still looks like a valid, internally consistent browser implementation, and block anything that doesn't.
The big picture: what consistency checks cover
Consistency checks typically correlate signals across multiple layers:
| Layer | Typical signals | Example inconsistency |
|---|---|---|
| HTTP | User-Agent / UA-CH / Accept-Language | UA claims iPhone, but Sec-CH-UA-Platform says “Windows” |
| JavaScript | navigator / Intl / screen / WebRTC | Timezone points to the US, but language and OS-related signals look implausible |
| Rendering / performance | Canvas / WebGL / Audio / FPS | GPU looks like a virtualized/software renderer, but the rest of the profile claims a high-end device |
| Behavior | Input events / scrolling / focus | Human-like events are missing, or are too perfectly regular |
This is where things get real. In recent years, UA-CH (User-Agent Client Hints) has become more important than the classic UA string, and it's increasingly common to check whether UA, UA-CH, and JS (via navigator.userAgentData) all agree. Low-entropy client hints like Sec-CH-UA, Sec-CH-UA-Mobile, and Sec-CH-UA-Platform may be sent by default (unless restricted), which makes mismatches even easier to detect.
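As a sketch of the kind of cross-surface agreement check described above: parse a rough platform out of the classic UA string and compare it against the Sec-CH-UA-Platform header and the JS-visible platform. The function names and header values here are illustrative, not from any specific detection product.

```python
def platform_from_ua(ua: str) -> str:
    """Very rough platform guess from a classic User-Agent string."""
    if "Windows NT" in ua:
        return "Windows"
    if "iPhone" in ua or "iPad" in ua:
        return "iOS"
    if "Macintosh" in ua:
        return "macOS"
    if "Android" in ua:
        return "Android"
    return "Unknown"

def is_consistent(ua: str, sec_ch_ua_platform: str, js_platform: str) -> bool:
    # Sec-CH-UA-Platform values are quoted structured-header strings.
    header_platform = sec_ch_ua_platform.strip('"')
    return platform_from_ua(ua) == header_platform == js_platform

# A spoofed iPhone UA paired with untouched Chrome-on-Windows signals:
ua = "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) ..."
print(is_consistent(ua, '"Windows"', "Windows"))  # False: mismatch
```

Real systems check far more fields, but the shape is the same: the same underlying fact (here, the OS) must agree everywhere it is exposed.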
UA vs UA-CH mismatches
Sec-CH-UA
A representative UA-CH header is Sec-CH-UA.
MDN describes it as a low-entropy hint that provides user agent information such as browser brand (for example, Chrome) and major version. Another practical point: unless it's blocked by permissions policy, it can be sent by default, which means sites often get it even when they didn't explicitly opt in.
Example: request headers sent by Chrome
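The brand and version values below are representative of what a current desktop Chrome sends by default, not captured from a specific build:

```python
# Illustrative low-entropy UA-CH headers sent alongside the classic UA.
chrome_headers = {
    "User-Agent": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                   "AppleWebKit/537.36 (KHTML, like Gecko) "
                   "Chrome/124.0.0.0 Safari/537.36"),
    "Sec-CH-UA": '"Chromium";v="124", "Google Chrome";v="124", '
                 '"Not-A.Brand";v="99"',
    "Sec-CH-UA-Mobile": "?0",
    "Sec-CH-UA-Platform": '"Windows"',
}
for name, value in chrome_headers.items():
    print(f"{name}: {value}")
```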

If you only swap the User-Agent header, you'll often break consistency with UA-CH headers (Sec-CH-UA, Sec-CH-UA-Platform, Sec-CH-UA-Mobile, etc.). That kind of mismatch can directly trigger bot detection.
A classic red flag is a "Safari-like UA string" paired with Sec-CH-UA still identifying as "Chromium": an implausible combination for real user traffic.
navigator.userAgentData
In Chromium-based browsers, UA-CH isnât only visible via HTTP headers. Itâs also exposed to JavaScript via navigator.userAgentData.
Detection systems correlate what they see in HTTP with what they see in JS to decide whether it looks like a single, coherent client. In other words: even if you spoof one surface, any "honest" surface you didn't fully control can become the inconsistency that gives you away.
Common inconsistency patterns
Below are inconsistency patterns that show up frequently in production. This isn't a promise that "matching these will always pass." It's a pragmatic list of signals that tend to fail hard when they don't line up.
OS and input devices
- Mobile UA but Sec-CH-UA-Mobile is ?0 (or the reverse)
- Profile claims iOS, but touch/pointer characteristics look like a desktop
Language and locale
- Accept-Language diverges from navigator.languages
- Intl.DateTimeFormat().resolvedOptions().timeZone is wildly implausible compared to IP-derived location
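A minimal sketch of the first of these checks, assuming the defender sees both the Accept-Language header and JS-visible navigator.languages (function names are mine):

```python
def primary_tag(lang: str) -> str:
    """Reduce 'en-US,en;q=0.9' or 'en-US' to its primary subtag, 'en'."""
    return lang.split(";")[0].split(",")[0].strip().split("-")[0].lower()

def locale_mismatch(accept_language: str, navigator_languages: list[str]) -> bool:
    header_primary = primary_tag(accept_language)
    js_primary = primary_tag(navigator_languages[0]) if navigator_languages else ""
    return header_primary != js_primary

print(locale_mismatch("en-US,en;q=0.9", ["en-US", "en"]))  # False: consistent
print(locale_mismatch("en-US,en;q=0.9", ["fr-FR", "fr"]))  # True: suspicious
```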
Screen and window sizing
- The relationship between screen.width/height and window.innerWidth/innerHeight looks unnatural (UI chrome, zoom, and device pixel ratio often break naive spoofs)
- You hardcode viewport sizes in the tab while OS scaling and DPR signals point elsewhere
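One of the cheapest relationship checks here: the inner window must fit inside the reported screen. A hardcoded viewport larger than the claimed screen is a combination real browsers essentially never produce. The function below is an illustrative simplification:

```python
def viewport_plausible(screen_w: int, screen_h: int,
                       inner_w: int, inner_h: int) -> bool:
    """The viewport should never exceed the screen it claims to run on."""
    return inner_w <= screen_w and inner_h <= screen_h

print(viewport_plausible(1920, 1080, 1920, 947))   # True: normal desktop
print(viewport_plausible(1366, 768, 1920, 1080))   # False: hardcoded viewport
```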
WebGL and GPU
- WebGL vendor/renderer looks like a virtualized or software GPU, while UA and performance imply a premium device
- Rendered outputs (Canvas/WebGL) conflict with the font environment
What matters in the field
Individual values matter less than whether the relationships between values look normal. For example, a slightly unusual resolution can still pass if DPR, zoom behavior, UA-CH, and OS signals tell a consistent story. But one standout contradiction can make the whole session look suspicious.
How defenders evaluate consistency
In practice, there are two common approaches.
Rule-based checks
This approach blocks combinations that "shouldn't exist." For example: UA says Android but Sec-CH-UA-Platform says macOS. These checks are cheap to run and tend to have relatively low false positives, so they're often deployed first.
Scoring models
Many inconsistencies are weak on their own. Scoring models treat them as additive risk. For example, small mismatches across language, timezone, fonts, and WebGL might be summed into a risk score; once it crosses a threshold, the site routes the session to CAPTCHA or step-up verification.
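The scoring idea can be sketched in a few lines. The weights, flag names, and threshold below are illustrative, not taken from any real system:

```python
# Each weak signal contributes a small weight; only the combined score
# triggers step-up verification.
WEIGHTS = {
    "locale_mismatch": 0.2,
    "timezone_implausible": 0.3,
    "font_set_unusual": 0.15,
    "webgl_software_renderer": 0.35,
}
THRESHOLD = 0.6

def risk_score(flags: set[str]) -> float:
    return sum(WEIGHTS[f] for f in flags)

def action(flags: set[str]) -> str:
    return "captcha" if risk_score(flags) >= THRESHOLD else "allow"

print(action({"locale_mismatch"}))                       # "allow" (0.2)
print(action({"locale_mismatch", "timezone_implausible",
              "webgl_software_renderer"}))               # "captcha" (0.85)
```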
W3C guidance on fingerprinting mitigations makes a closely related point:
“Fingerprints can be formed by combining multiple observable characteristics.
When designing specifications, it's important to evaluate which information may contribute to identification (fingerprinting surface) and assess its impact.” (Paraphrased from the original)
Note: W3C (the World Wide Web Consortium) is the international nonprofit that develops and maintains key web standards (HTML, CSS, DOM, HTTP, and more).
Mistakes scrapers commonly make
Only changing the UA string
This is the most common failure mode. You might successfully rewrite the UA string, but if UA-CH headers and/or navigator.userAgentData remain unchanged (or can't be changed in your setup), you're likely to fail consistency checks.
Too many (or too few) headers
Some UA-CH values are only sent when the server requests them via Accept-CH (high-entropy hints). If you force-send a fixed set on every request, you can create new "this doesn't look like real traffic for this site" anomalies: missing headers, unusual ordering, or unexpected presence.
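The opt-in dance can be sketched as follows: low-entropy hints go out by default, while high-entropy hints appear only after the server has advertised them via Accept-CH. The function and variable names are mine, for illustration:

```python
# Hints a browser may send without any server opt-in.
LOW_ENTROPY = {"Sec-CH-UA", "Sec-CH-UA-Mobile", "Sec-CH-UA-Platform"}

def hints_to_send(available: dict, server_accept_ch: set) -> dict:
    """Return only the hints a real browser would send at this point."""
    allowed = LOW_ENTROPY | server_accept_ch
    return {k: v for k, v in available.items() if k in allowed}

available = {
    "Sec-CH-UA": '"Chromium";v="124"',
    "Sec-CH-UA-Mobile": "?0",
    "Sec-CH-UA-Platform": '"Windows"',
    "Sec-CH-UA-Platform-Version": '"15.0.0"',  # high-entropy
}
# First request: the server has not sent Accept-CH yet.
print(sorted(hints_to_send(available, set())))
# After the server responds with: Accept-CH: Sec-CH-UA-Platform-Version
print(sorted(hints_to_send(available, {"Sec-CH-UA-Platform-Version"})))
```

Sending the high-entropy hint on the very first request, before any Accept-CH response, is exactly the kind of "unexpected presence" anomaly described above.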
WAF compatibility issues
Chromium's UA-CH documentation notes a real-world compatibility issue: certain characters included in headers like sec-ch-ua (such as double quotes) may be treated as suspicious by some WAFs or legacy intermediaries. That means you can get blocked before bot detection even runs.
A practical process for building consistency in scraping
This section doesn't provide a guaranteed "bypass recipe." Instead, it lays out a defensible process for consistency design, useful for both operational hardening and testing.
Inventory every observable surface
- HTTP: UA / UA-CH / Accept-Language, etc.
- JS: navigator / Intl / screen / permissions, etc.
- Rendering: Canvas / WebGL / fonts
- Behavior: input events, navigation flows, waits
Pick a baseline profile
Start with one "positive profile," such as "Chrome on Windows 11 (x64, desktop)." The goal isn't "values that feel plausible." It's the full set of values you'd naturally observe on that real environment.
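One way to make this concrete is to keep the baseline as a single source of truth and derive every surface from it, rather than setting headers and JS values independently. The structure and values below are illustrative:

```python
# A single baseline profile; all observable surfaces derive from it.
BASELINE = {
    "os": "Windows",
    "mobile": False,
    "languages": ["en-US", "en"],
    "timezone": "America/New_York",
}

def derive_headers(profile: dict) -> dict:
    """Derive UA-CH and language headers from the one baseline."""
    return {
        "Sec-CH-UA-Platform": f'"{profile["os"]}"',
        "Sec-CH-UA-Mobile": "?1" if profile["mobile"] else "?0",
        "Accept-Language": ",".join(profile["languages"]),
    }

print(derive_headers(BASELINE))
```

Because everything is derived from one place, changing the profile cannot introduce a header/JS contradiction by accident.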
Minimize diffs
The more you customize, the more contradictions you introduce. Counterintuitively, fewer changes often produce a more consistent profile. Prioritize alignment across UA / UA-CH / JS first, because they expose the same underlying facts (OS, mobile/desktop classification) via different routes.
Confirm what your requests actually send
To validate consistency, log what is truly observable in both HTTP and JS. For example:
# Example: consolidate "what the site can see" during Playwright-based validation.
# Use this for test-oriented consistency checks, not as a bypass recipe.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    response = page.goto("https://example.com")
    result = {
        "http_headers": response.request.all_headers(),
        "js": {
            "userAgent": page.evaluate("() => navigator.userAgent"),
            # NavigatorUAData is not a plain object; extract its fields.
            "uaData": page.evaluate(
                "() => navigator.userAgentData && {"
                "brands: navigator.userAgentData.brands, "
                "mobile: navigator.userAgentData.mobile, "
                "platform: navigator.userAgentData.platform}"),
            "languages": page.evaluate("() => navigator.languages"),
            "timeZone": page.evaluate(
                "() => Intl.DateTimeFormat().resolvedOptions().timeZone"),
            "screen": page.evaluate(
                "() => ({w: screen.width, h: screen.height, dpr: devicePixelRatio})"),
        },
    }
    browser.close()
print(result)
Warning
Trying to "spoof values to get through" can violate website terms, contracts, internal security policies, and/or applicable law. If you do this for work, confirm you have permission from the target site and that your approach is compliant.
Getting blocked after UA spoofing?
If your scraper started failing after changing headers, the real issue is often cross-signal inconsistencies (UA vs UA-CH vs JS vs rendering). If you want help auditing what a site can observe and stabilizing a compliant scraping setup, reach out.
Summary
- Fingerprint "inconsistencies" are harder to fake than single values, making them strong bot-detection signals
- Mismatches between UA / UA-CH / navigator.userAgentData are especially visible
- Real-world detection commonly combines rule-based blocks with score-based risk models
- A practical workflow is: inventory surfaces → choose a baseline profile → minimize diffs