WebGL Fingerprint Test

WebGL reveals detailed information about your GPU and graphics capabilities. This data is highly identifying and used for browser fingerprinting.

Understanding WebGL Fingerprinting

What Is WebGL Fingerprinting?

Think of WebGL fingerprinting as asking your computer to draw a specific picture, then examining every tiny detail about how it drew that picture—not what the picture looks like, but the microscopic quirks in the brush strokes. WebGL (Web Graphics Library) is the technology that lets websites display 3D graphics in your browser without plugins. It directly accesses your Graphics Processing Unit (GPU), and here's the wild part: every GPU draws things slightly differently.

When a website asks your browser to render a simple 3D scene through WebGL, your GPU performs billions of floating-point calculations. Different GPU models from NVIDIA, AMD, Intel, and Apple handle these calculations with microscopic variations. Your driver version (the software that talks to your GPU) adds another layer of uniqueness. Even the way your GPU rounds numbers or handles anti-aliasing creates a distinctive signature. Combine all these factors, and you get a fingerprint that's often as unique as your actual fingerprint.

The foundational research came from Keaton Mowery and Hovav Shacham at UC San Diego in 2012 with their paper "Pixel Perfect: Fingerprinting Canvas in HTML5." They discovered that rendering text and WebGL scenes to a canvas element, then examining the pixels produced, created a fingerprint that was "consistent, high-entropy, orthogonal to other fingerprints, transparent to the user, and readily obtainable." Testing across 294 browser instances yielded 116 distinct fingerprints with approximately 5.73 bits of entropy from text rendering alone—and significantly more when combined with WebGL.

How WebGL Fingerprinting Actually Works

WebGL fingerprinting collects data at multiple levels. First, there's the renderer string—your GPU's name and sometimes the driver version. This alone is highly identifying. If you're running an NVIDIA RTX 4090, that shows up in your fingerprint. If your driver is version 531.18, that might show too. The combination of GPU model and driver version immediately narrows you down to a small subset of internet users.

But tracker scripts don't stop there. They query your GPU's capabilities: maximum texture size (often 16384 for modern GPUs, but varies), supported extensions (things like "WEBGL_compressed_texture_s3tc" or "EXT_texture_filter_anisotropic"), shader precision (how accurately your GPU handles decimal numbers), and viewport dimensions. Each of these parameters adds entropy to your fingerprint.
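The parameter-collection step described above can be sketched as follows. This is a minimal illustration assuming only the standard WebGL API; `collectWebGLParams` is an invented name, and the mock context at the bottom stands in for a real `canvas.getContext("webgl")` so the sketch runs outside a browser.

```javascript
// Collect the WebGL parameters a fingerprinting script typically reads.
function collectWebGLParams(gl) {
  const fp = {};
  // The unmasked vendor/renderer strings come from a debug extension.
  const dbg = gl.getExtension("WEBGL_debug_renderer_info");
  if (dbg) {
    fp.vendor = gl.getParameter(dbg.UNMASKED_VENDOR_WEBGL);
    fp.renderer = gl.getParameter(dbg.UNMASKED_RENDERER_WEBGL);
  }
  // Capability limits vary by GPU model and driver.
  fp.maxTextureSize = gl.getParameter(gl.MAX_TEXTURE_SIZE);
  fp.maxViewportDims = gl.getParameter(gl.MAX_VIEWPORT_DIMS);
  // The extension list alone is a strong signal; sort it for a stable order.
  fp.extensions = gl.getSupportedExtensions().sort();
  return fp;
}

// Mock context (standard WebGL enum values) for demonstration without a GPU.
const mockGl = {
  MAX_TEXTURE_SIZE: 0x0d33,
  MAX_VIEWPORT_DIMS: 0x0d3a,
  getExtension: (name) =>
    name === "WEBGL_debug_renderer_info"
      ? { UNMASKED_VENDOR_WEBGL: 0x9245, UNMASKED_RENDERER_WEBGL: 0x9246 }
      : null,
  getParameter: (p) =>
    ({
      0x9245: "Google Inc. (NVIDIA)",
      0x9246: "ANGLE (NVIDIA, NVIDIA GeForce RTX 4090 Direct3D11 vs_5_0 ps_5_0)",
      0x0d33: 16384,
      0x0d3a: [32767, 32767],
    }[p]),
  getSupportedExtensions: () => [
    "WEBGL_compressed_texture_s3tc",
    "EXT_texture_filter_anisotropic",
  ],
};

console.log(collectWebGLParams(mockGl));
```

Every field in the returned object adds entropy; serialized and hashed together, they already narrow a visitor down considerably before any rendering happens.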

Then comes the rendering test. According to Mowery and Shacham's research, websites render specific 3D scenes using carefully selected parameters—particular textures, anti-aliasing settings, lighting, and transparency effects. They capture the output image and hash it. Because GPUs and drivers differ in how they implement the WebGL specification (there's wiggle room in the standards), the output varies between systems. Your Intel Iris Xe Graphics will produce a different hash than an AMD Radeon RX 7900, even when rendering the identical scene.
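The render-and-hash step can be sketched like this. The 32-bit FNV-1a hash is an illustrative choice, not the hash any particular tracker uses, and the pixel buffer is faked; in a browser it would come from `gl.readPixels` after drawing the fixed scene.

```javascript
// Reduce a rendered frame's pixel bytes to a short hex digest (FNV-1a, 32-bit).
function hashPixels(pixels) {
  let h = 0x811c9dc5; // FNV offset basis
  for (let i = 0; i < pixels.length; i++) {
    h ^= pixels[i];
    h = Math.imul(h, 0x01000193); // FNV prime
  }
  return (h >>> 0).toString(16).padStart(8, "0");
}

// Stand-in for a readPixels buffer (RGBA bytes). On real hardware, driver-level
// differences shift individual byte values, which changes the whole digest.
const fakeFrame = new Uint8Array([12, 200, 37, 255, 11, 201, 36, 255]);
console.log(hashPixels(fakeFrame));
```

The point of hashing is sensitivity: two GPUs that differ by even one pixel byte in the rendered scene produce entirely different digests, so the digest cleanly separates hardware/driver combinations.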

The 2014 study "The Web Never Forgets" by Acar et al. from Princeton and KU Leuven found that 5.5% of the top 100,000 websites actively employed canvas fingerprinting (closely related to WebGL fingerprinting). They used automated crawlers to scan for fingerprinting scripts, finding them on 5,542 websites. That was 2014—the number has only grown since.

The DrawnApart Breakthrough: GPU Fingerprinting in 150 Milliseconds

In recent years, researchers discovered they could fingerprint GPUs not just by what they support, but by how fast they perform specific operations. The "DrawnApart" technique uses carefully constructed WebGL shader workloads to measure subtle performance characteristics of your GPU. Think of it like identifying a car not just by its make and model, but by precisely timing how quickly it accelerates from 0 to 60.

Here's what makes DrawnApart scarily effective: testing across 2,550 devices with 1,605 unique GPU configurations, researchers achieved 98% classification accuracy in just 150 milliseconds. The technique boosts the median tracking duration by 67% compared to using current tracking methods alone. It works by running specific computation tasks and measuring execution time with microsecond precision. Since every GPU has a unique performance profile (affected by clock speed, memory bandwidth, thermal throttling, and driver optimizations), the timing measurements create a distinctive fingerprint.
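A heavily simplified sketch of the timing idea (not DrawnApart's actual shader workloads): run the same task many times, record durations, and keep a compact profile. The workload here is a stand-in CPU loop so the sketch runs anywhere; a real fingerprinter would dispatch GPU work through WebGL and time it with `performance.now()`.

```javascript
// Build a compact timing profile for a workload: run it `runs` times and
// summarize the sorted durations. The clock is injectable for testing.
function timingProfile(workload, runs, now = Date.now) {
  const samples = [];
  for (let i = 0; i < runs; i++) {
    const t0 = now();
    workload();
    samples.push(now() - t0);
  }
  samples.sort((a, b) => a - b);
  return {
    min: samples[0],
    median: samples[Math.floor(samples.length / 2)],
    max: samples[samples.length - 1],
  };
}

// Stand-in workload: a tight arithmetic loop. `sink` keeps the loop from being
// optimized away; a real fingerprinter would use a chosen GPU shader instead.
let sink = 0;
const profile = timingProfile(() => {
  for (let i = 0; i < 1e5; i++) sink += Math.sin(i);
}, 20);
console.log(profile);
```

Two machines with the same GPU model string can still produce distinguishable profiles, because clock speed, memory bandwidth, and thermal behavior all leak into the timings.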

What's particularly concerning is that DrawnApart works even when privacy tools mask your renderer string. You can spoof your GPU name to say "Generic GPU" all day long, but you can't fake how fast your actual hardware completes mathematical operations. The physics of your silicon gives you away. And because it runs in 150 milliseconds, you won't even notice it happening—faster than an eye blink.

Cross-Browser Tracking: Following You Everywhere

One of the most disturbing applications of WebGL fingerprinting is cross-browser tracking. Most people think using different browsers keeps their activities separate—checking personal email in Chrome, browsing social media in Firefox, doing work in Edge. That separation is an illusion.

Research from the 2017 NDSS Symposium by Yinzhi Cao, Song Li, and Erik Wijmans demonstrated a technique that tracks users across different browsers on the same machine with 99.24% accuracy. They used novel OS and hardware-level features, particularly WebGL rendering characteristics. The key insight: your GPU is the same regardless of which browser you use. When Chrome, Firefox, and Safari all query the same GPU, they get similar responses.

The researchers performed 20 unique WebGL rendering tasks with carefully selected parameters: specific textures, anti-aliasing modes, lighting effects, and transparency settings. The combined fingerprint was unique enough to identify the machine across browsers, defeating the privacy assumption that different browsers provide separate identities. This means ad networks can connect your "anonymous" browsing in one browser to your logged-in session in another, building a complete profile of your online activity.
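The combination step can be sketched as follows. The join-and-hash scheme here is illustrative, not the paper's exact method; the point it demonstrates is that two browsers driving the same GPU through the same task list converge on the same combined identifier.

```javascript
// Combine per-task rendering hashes into one machine fingerprint. Order
// matters: task N always occupies slot N, so two browsers on the same GPU
// produce the same combined string.
function combineFingerprint(taskHashes) {
  const joined = taskHashes.join("|");
  let h = 0;
  for (let i = 0; i < joined.length; i++) {
    h = (Math.imul(h, 31) + joined.charCodeAt(i)) >>> 0; // simple rolling hash
  }
  return h.toString(16);
}

// Same GPU means the per-task hashes match, regardless of browser.
const fromChrome = combineFingerprint(["a1f3", "09bc", "77de"]);
const fromFirefox = combineFingerprint(["a1f3", "09bc", "77de"]);
console.log(fromChrome === fromFirefox); // true: same machine, linked across browsers
```

This is exactly why clearing cookies or switching browsers doesn't help: the identifier is derived from the hardware's behavior, not from anything stored in the browser.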

WebGL Fingerprinting Research Findings

| Finding | Result | Source |
|---|---|---|
| DrawnApart GPU classification accuracy | 98% in 150 ms | BleepingComputer, 2022 |
| Tracking duration improvement | +67% median boost | Bitdefender report |
| Pixel Perfect study: distinct fingerprints | 116 from 294 browsers (5.73 bits) | Mowery & Shacham, 2012 |
| Canvas fingerprinting prevalence | 5.5% of top 100k websites | Acar et al., 2014 |
| Cross-browser tracking accuracy | 99.24% | Cao et al., NDSS 2017 |
| State-of-the-art vs hardware-based tracking | 90.84% vs 99.24% | NDSS 2017 paper |

The Automation Detection Problem

If you're using Selenium, Puppeteer, or Playwright for web automation, WebGL fingerprinting is one of your biggest obstacles. Headless browsers often use software renderers like SwiftShader (Chrome's fallback renderer) or LLVMpipe instead of real GPUs. These software renderers have distinctive fingerprints that immediately signal "automation tool" to detection systems.
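A detection system's first pass can be as simple as substring-matching the renderer string against known software renderers. A sketch, with an intentionally short catalogue (real systems maintain much larger ones):

```javascript
// Known software-renderer markers that signal "no real GPU behind this browser".
const SOFTWARE_RENDERERS = [
  "swiftshader",
  "llvmpipe",
  "softpipe",
  "software rasterizer",
];

// Flag renderer strings that indicate a software fallback instead of real hardware.
function looksLikeSoftwareRenderer(rendererString) {
  const r = rendererString.toLowerCase();
  return SOFTWARE_RENDERERS.some((marker) => r.includes(marker));
}

console.log(looksLikeSoftwareRenderer("Google SwiftShader")); // true
console.log(looksLikeSoftwareRenderer("ANGLE (NVIDIA GeForce RTX 4090)")); // false
```

This is why simply launching headless Chrome without a GPU fails immediately: SwiftShader shows up in the renderer string before any behavioral checks even run.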

Even when automation tools run in headed mode with GPU access, the fingerprint might not match a legitimate user. For example, if your script runs on a server with an NVIDIA Tesla T4 (a data center GPU), websites will notice, because real consumers don't typically browse from data center hardware. Or if you're running headless Chrome on Linux with Mesa drivers while your User-Agent claims to be Chrome on Windows 10, that inconsistency gets flagged instantly.

The detection systems are sophisticated. They don't just check if WebGL works—they verify that your entire fingerprint is internally consistent. Your GPU vendor should match your platform (ANGLE on Windows, Metal on macOS). Your maximum texture size should align with your claimed GPU model. Your supported extensions should match the browser version you're claiming to be. A single mismatch can trigger a block.
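The cross-checks described above might look something like this sketch. The rules are simplified assumptions for illustration (for example, Chrome on Windows normally reports an ANGLE-wrapped Direct3D renderer), not any actual detection vendor's logic.

```javascript
// Check that the claimed OS (from the User-Agent) is plausible given the
// WebGL renderer string. Simplified rules; real systems check many more
// parameters (texture limits, extensions, shader precision) the same way.
function isConsistent(claimedOS, rendererString) {
  const r = rendererString.toLowerCase();
  if (claimedOS === "windows") {
    // Chrome/Edge on Windows typically report via ANGLE over Direct3D.
    return r.includes("angle") || r.includes("direct3d");
  }
  if (claimedOS === "macos") {
    return r.includes("apple") || r.includes("metal") || r.includes("angle");
  }
  if (claimedOS === "linux") {
    // Direct3D never appears in a genuine Linux renderer string.
    return !r.includes("direct3d");
  }
  return true; // unknown platform: no rule to apply
}

console.log(
  isConsistent("windows", "ANGLE (NVIDIA, NVIDIA GeForce RTX 4090 Direct3D11 vs_5_0 ps_5_0)")
); // true
console.log(isConsistent("windows", "Mesa Intel(R) UHD Graphics (CML GT2)")); // false: flag
```

A single failed rule like the second call above (a Mesa/Linux renderer under a Windows User-Agent) is enough for many systems to block or challenge the session.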

Why WebGL Fingerprinting Is So Persistent

Unlike cookies or localStorage that you can clear, or IP addresses that change, your WebGL fingerprint is determined by your hardware. Unless you physically swap your GPU or update your drivers, your fingerprint remains constant. This persistence makes WebGL fingerprinting incredibly valuable for long-term tracking.

Even privacy-focused techniques like browser profiles or virtual machines don't fully solve this. If you run multiple virtual machines on the same host and give them GPU access, they'll all report the same or similar GPU fingerprints because they're ultimately using the same physical hardware. The only way to get a different WebGL fingerprint is to actually use different hardware.

This is why professional multi-accounting and anti-detect browsing requires either real device farms (physically separate machines with different GPUs) or sophisticated software spoofing that intercepts WebGL API calls and returns fake but consistent values. But spoofing has its own challenges—you need to return values that make sense together, that match your other fingerprints (like canvas, which is closely related—see our Canvas Fingerprint Test), and that don't include known "fake" patterns that detection systems have catalogued.

Protecting Against WebGL Fingerprinting

For general privacy, Firefox offers the most robust protection through its privacy.resistFingerprinting setting. When enabled, Firefox reports generic WebGL values instead of your real GPU information. The trade-off is that this makes you identifiable as "someone using Firefox anti-fingerprinting," and it can break some websites that require WebGL for legitimate functionality (like Google Maps, WebGL games, or 3D modeling tools).

Tor Browser takes a similar approach, spoofing WebGL vendor and renderer to standard values. But Tor's threat model is different—they assume all Tor users have the same fingerprint, so you're anonymous within the crowd of millions of Tor users. If you're trying to appear as a unique organic user (for web scraping, multi-accounting, or testing), the "everyone looks the same" approach doesn't work.

For automation and multi-accounting, anti-detect browsers are the industry standard. Tools like Multilogin, GoLogin, and AdsPower let you create browser profiles with custom WebGL parameters. The key is consistency: if your profile claims to have an Intel HD Graphics 630, it must report all the parameters that a real Intel HD 630 would report—the correct maximum texture size, the right extensions, appropriate shader precision values, and a canvas rendering hash that matches that hardware.
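The interception approach can be sketched as wrapping `getParameter` so that every overridden value comes from one coherent profile object. The profile values below are illustrative and not guaranteed to match real Intel HD Graphics 630 output; in a browser the wrapper would be installed on `WebGLRenderingContext.prototype.getParameter`, while here a stand-in function takes its place so the sketch runs anywhere.

```javascript
// Standard WebGL enum values used as profile keys.
const UNMASKED_VENDOR_WEBGL = 0x9245;
const UNMASKED_RENDERER_WEBGL = 0x9246;
const MAX_TEXTURE_SIZE = 0x0d33;

// Wrap a getParameter implementation: overridden pnames come from the profile,
// everything else falls through. Because all overrides live in one profile
// object, the reported values stay mutually consistent.
function makeSpoofedGetParameter(realGetParameter, profile) {
  return function (pname) {
    if (pname in profile) return profile[pname];
    return realGetParameter.call(this, pname);
  };
}

// Hypothetical profile for an Intel HD Graphics 630 persona.
const profile = {
  [UNMASKED_VENDOR_WEBGL]: "Google Inc. (Intel)",
  [UNMASKED_RENDERER_WEBGL]:
    "ANGLE (Intel, Intel(R) HD Graphics 630 Direct3D11 vs_5_0 ps_5_0)",
  [MAX_TEXTURE_SIZE]: 16384,
};

// Stand-in for the browser's real getParameter.
const original = (pname) => `real-value-for-${pname}`;
const spoofed = makeSpoofedGetParameter(original, profile);

console.log(spoofed(UNMASKED_RENDERER_WEBGL)); // the spoofed renderer string
console.log(spoofed(0x1f02)); // unknown pname falls through to the real value
```

The hard part is not the wrapper but the profile: every value must agree with every other value, and with the canvas hash, or the spoof creates exactly the internal inconsistency detection systems look for.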

Some advanced users disable WebGL entirely through browser settings or extensions. This works for privacy but breaks a surprising amount of the modern web. Many sites use WebGL not for 3D graphics but for faster 2D rendering, image processing, or even as a bot detection method (if WebGL doesn't work at all, they flag you as suspicious).

The nuclear option is using real browsers on real devices with real, diverse hardware. Device farms where each browser instance runs on physically separate hardware with different GPU models are nearly impossible to detect through WebGL fingerprinting. But this approach is expensive and doesn't scale well.

Want to understand how WebGL fingerprinting fits into the broader fingerprinting landscape? Check our complete guide on WebGL Fingerprinting Defense Strategies and explore related techniques like Font Fingerprinting and Audio Fingerprinting that use similar hardware-based tracking approaches.