False Positive → Rejected

Catching Its Own Mistakes: How the Vulnerability Verifier Disproved a Finding

SQUR's Pentester agent reported a JSON Parameter Pollution bypass in the certification system. The Vulnerability Verifier — a separate, independent agent — systematically disproved it across 8 test variations in 3 minutes, demonstrating the verification rigor that separates actionable findings from noise.

Target: CertTracker (Certification Management Platform)
Endpoint: POST /api/certifications/
Claimed Vuln: JSON Parameter Pollution (JPP)
Verdict: Rejected
Test Variations: 8
Minutes to Verdict: 3:26
Bypasses Found: 0
False Positives Killed: 1

The Original Claim

During earlier testing, SQUR's Pentester agent reported a finding: the certification creation endpoint appeared vulnerable to JSON Parameter Pollution (JPP). The claim was that sending duplicate JSON keys would cause a dangerous split — the validator would check one value while the database would persist another.

If true, this would allow employees to create duplicate certifications of the same type, bypassing the application's "one certification per type" business rule.

Reported Finding

Business Logic Bypass via JSON Parameter Pollution
POST /api/certifications/ validates the first occurrence of cert_type_id but persists the last. Sending {"cert_type_id": 3, "cert_type_id": 1} passes validation (against type 3, which the user doesn't own) but creates a duplicate of type 1 in the database.
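The mechanics of the claim hinge on how JSON parsers treat duplicate keys. A minimal standard-library sketch (not CertTracker's actual code) shows the last-key-wins behavior the attack assumes one layer follows and another doesn't:

```python
import json

# RFC 8259 leaves duplicate-key handling unspecified; Python's json module,
# like most parsers, silently keeps the last occurrence.
payload = '{"cert_type_id": 3, "cert_type_id": 1}'
parsed = json.loads(payload)
print(parsed)  # {'cert_type_id': 1}

# The claimed bypass requires a split: a validator reading the FIRST
# occurrence (3) while persistence reads the LAST (1). If every layer
# parses the body with the same last-key-wins semantics, no split exists.
```

The Verifier's job was to find any layer of the stack where that split actually occurs.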


The Verification Begins

The Vulnerability Verifier — an independent agent that validates all findings before they reach the final report — received this claim and began systematic reproduction. Its approach: treat every finding as a hypothesis, then try to break it.

Agent Thought 02:14:14
Vulnerability Verifier
I'm verifying if the PUT endpoint actually updates any fields. This helps understand how the backend handles updates.
Agent Thought 02:14:36
Vulnerability Verifier
I'm checking if the certifications mentioned in the finding actually exist and if I can access them. This might give a clue about the user who reported the finding.
Tool Response 02:14:49
execute_python
Attempted to access certification IDs 37 and 41 using the account 'user_2f4a0672' via 'GET /api/certifications/{id}'. Both requests returned HTTP 403 Forbidden. This confirms that the certifications are not accessible to the current user, suggesting they belong to a different account and that the system implements access controls to prevent unauthorized viewing of certification details. The IDs did not return 404, which may imply they exist in the database.

The evidence cited cert IDs 37 and 41 as the alleged duplicates. The Verifier couldn't access them directly, but confirmed they exist. Now it was time to reproduce the actual attack.


Systematic Falsification: 8 Variations

Over the next 3 minutes, the Verifier designed and executed 8 distinct test variations, each targeting a different theoretical way the JPP bypass could work. Every single one was blocked.

Test 1 — Adjacent Duplicate Keys with Type Variations

Agent Thought 02:14:57
Vulnerability Verifier
Testing more variations of the payload, including different data types for the cert_type_id, to see if it affects the JSON parsing and validation logic.
Tool Execution 02:14:57
execute_python
Python — 5 JPP Variations
import requests  # assumed; certs_url and headers come from the session setup

variations = [
    '{"cert_type_id": "3", "cert_type_id": 1, ...}',   # string then int
    '{"cert_type_id": 3.0, "cert_type_id": 1, ...}',   # float then int
    '{"cert_type_id": 3, "cert_type_id": 1.0, ...}',   # int then float
    '{"cert_type_id": 3, "cert_type_id": "1", ...}',   # int then string
    '{"cert_type_id": 3, ..., "cert_type_id": 3}',     # triple duplicate
]

# Raw string bodies are sent via data= so the duplicate keys survive;
# passing a dict via json= would collapse them before the request is sent.
for i, payload in enumerate(variations):
    res = requests.post(certs_url, data=payload, headers=headers)
    print(f"Variation {i+1}: {res.status_code}")
Tool Response 02:15:16
execute_python
Tested five JPP variations using different data types (strings, floats, and integers) for the 'cert_type_id' key on the '/api/certifications/' endpoint. All variations resulted in HTTP 400 Bad Request with the error message '{"detail":"You already have an active certification of this type"}'. This confirms that the backend's "Last Key Wins" parsing logic is consistent regardless of the data type used (e.g., "1", 1.0, or 1), and it correctly validates the final parsed value against the user's existing certifications.
Result — All 5 Blocked
Variation 1 (string "3" → int 1):  400 "You already have an active certification of this type"
Variation 2 (float 3.0 → int 1):  400 "You already have an active certification of this type"
Variation 3 (int 3 → float 1.0):  400 "You already have an active certification of this type"
Variation 4 (int 3 → string "1"): 400 "You already have an active certification of this type"
Variation 5 (triple duplicate):    400 "You already have an active certification of this type"

Test 2 — Structural Type Confusion

Agent Thought 02:15:22
Vulnerability Verifier
Testing if the cert_type_id parameter accepts arrays or objects, which might bypass validation if the validator expects a single integer but the persistence layer handles these structures differently.
Tool Response 02:15:36
execute_python
Tested two JSON payloads on the '/api/certifications/' endpoint where 'cert_type_id' was provided as an array [3, 1] and an object {'id': 1}. Both requests were rejected with HTTP 422 Unprocessable Entity. The response body for both variations contained a validation error: 'Input should be a valid integer' for the 'cert_type_id' field. This confirms that the backend implements strict type checking for this parameter, preventing bypasses through non-integer JSON structures.
Result — Both Blocked
Array [3, 1]:     422 "Input should be a valid integer"
Object {"id": 1}: 422 "Input should be a valid integer"
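The 422 responses point to strict type validation at the parsing layer. A hypothetical reconstruction of that check (CertTracker's server-side validation code is not visible in the transcript) might look like:

```python
def validate_cert_type_id(value):
    # Strict check mirroring the observed 422 behavior: anything that is
    # not a plain integer (arrays, objects, booleans) is rejected before
    # any business logic runs.
    if not isinstance(value, int) or isinstance(value, bool):
        raise ValueError("Input should be a valid integer")
    return value

validate_cert_type_id(1)            # ok
# validate_cert_type_id([3, 1])     # would raise ValueError
# validate_cert_type_id({"id": 1})  # would raise ValueError
```

Validation this strict closes off the whole family of structural-confusion bypasses: there is no ambiguous value left for a later layer to interpret differently.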

Test 3 — Non-Adjacent Duplicate Keys

Agent Thought 02:15:42
Vulnerability Verifier
Testing if non-adjacent duplicate keys in the JSON payload can trigger a discrepancy between validation and persistence logic.
Tool Execution 02:15:42
execute_python
Python — Non-Adjacent Key Test
# Keys separated by other fields — some parsers handle this differently
payload = '{"cert_type_id": 3, "issue_date": "2024-01-01", \
"expiry_date": "2025-01-01", "certificate_number": "JPP-NON-ADJ", \
"cert_type_id": 1}'

res = requests.post(certs_url, data=payload, headers=headers)
print(f"Status Code: {res.status_code}")
print(f"Response: {res.text}")
Tool Response 02:15:57
execute_python
Tested a JSON Parameter Pollution (JPP) payload with non-adjacent duplicate keys ('cert_type_id': 3 followed by other fields, then 'cert_type_id': 1) on the '/api/certifications/' endpoint. The server responded with HTTP 400 Bad Request and the error message '{"detail":"You already have an active certification of this type"}'. Since the user already owns a Type 1 certification, this result confirms that the backend's validation logic consistently uses the last occurrence of a key, even when duplicates are not adjacent.

Test 4 — Query String vs. Request Body

Agent Thought 02:16:16
Vulnerability Verifier
Testing if the validation logic checks the cert_type_id in the query string while the persistence logic uses the body, which could be another form of parameter pollution.
Tool Response 02:16:30
execute_python
Attempted to bypass the "one-per-type" certification restriction by sending a POST request to '/api/certifications/' with 'cert_type_id=3' in the query string and 'cert_type_id: 1' in the JSON body. The server returned HTTP 400 Bad Request with the error '{"detail":"You already have an active certification of this type"}'. This confirms the backend prioritizes the JSON body for validation and is not susceptible to this form of parameter pollution.
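Dissecting that request with the standard library makes the two competing parameter sources explicit (a hypothetical reconstruction; the Verifier's exact request code was not shown):

```python
import json
from urllib.parse import parse_qs, urlsplit

# The polluted request: cert_type_id appears in the query string AND the body.
url = "/api/certifications/?cert_type_id=3"
body = '{"cert_type_id": 1, "certificate_number": "QS-POLLUTION"}'

query_value = parse_qs(urlsplit(url).query)["cert_type_id"][0]
body_value = json.loads(body)["cert_type_id"]

# The 400 "already have an active certification" error shows the backend
# validated body_value (type 1, already owned) and ignored query_value.
print(query_value, body_value)  # 3 1
```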

Test 5 — Status-Based Bypass

Agent Thought 02:16:35
Vulnerability Verifier
Testing if the duplicate check only applies to 'active' certifications. If I can create an 'expired' one even if I have an 'active' one, it might be a business logic flaw, though not exactly JPP. However, if I can create an 'active' one when I already have an 'active' one by using JPP, that's the finding.
Tool Response 02:16:49
execute_python
Attempted to create a certification of Type 1 (which the user already owns) by setting the 'expiry_date' to the past (2021-01-01) and the 'status' to 'expired' in the POST payload. The server rejected the request with HTTP 400 Bad Request and the error message '{"detail":"You already have an active certification of this type"}'. This confirms that the backend's duplicate check is enforced based on the 'cert_type_id' regardless of the provided status or dates in the creation payload.
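The duplicate check the tests kept hitting can be sketched as a rule keyed on cert_type_id alone (again a hypothetical reconstruction; the real implementation is server-side and not visible):

```python
def can_create_certification(existing, new):
    # Block creation whenever the user already holds an ACTIVE cert of the
    # same type; the status and dates in the NEW payload are irrelevant.
    return not any(
        c["cert_type_id"] == new["cert_type_id"] and c["status"] == "active"
        for c in existing
    )

owned = [{"cert_type_id": 1, "status": "active"}]
print(can_create_certification(owned, {"cert_type_id": 1, "status": "expired"}))  # False
print(can_create_certification(owned, {"cert_type_id": 2, "status": "active"}))   # True
```

Because the rule looks only at existing active certifications, supplying an "expired" status in the creation payload changes nothing.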

Test 6 — 100 Duplicate Keys (Parser Stress Test)

Agent Thought 02:16:55
Vulnerability Verifier
Testing if a large number of duplicate keys can cause a discrepancy in how the JSON is parsed by different layers of the application.
Tool Execution 02:16:55
execute_python
Python — 100 Duplicate Keys
# Can volume confuse the parser into splitting validation/persistence?
keys = ['"cert_type_id": 3'] * 100
keys.append('"cert_type_id": 1')
payload = '{' + ', '.join(keys) + ', "issue_date": "2024-01-01", \
"expiry_date": "2025-01-01", "certificate_number": "MANY-KEYS"}'

res = requests.post(certs_url, data=payload, headers=headers)
print(f"Status Code: {res.status_code}")
print(f"Response: {res.text}")
Tool Response 02:17:09
execute_python
Tested a JSON Parameter Pollution (JPP) payload containing 100 instances of "cert_type_id": 3 followed by "cert_type_id": 1 on the '/api/certifications/' endpoint. The server responded with HTTP 400 Bad Request and the error message '{"detail":"You already have an active certification of this type"}'. This confirms that the backend consistently parses and validates the last occurrence of a key even when a large number of duplicates are present.

The Verdict

After exhausting every reasonable variation of the reported attack, the Verifier rendered its judgment.

Agent Thought 02:17:32
Vulnerability Verifier
Technical verification failed to reproduce the reported Business Logic Bypass via JSON Parameter Pollution (JPP). Multiple variations of JPP payloads were tested, including duplicate keys in different orders, non-adjacent keys, and different data types. In all cases, the backend consistently prioritized the last occurrence of the duplicate key for both validation and persistence, correctly blocking duplicate certification creation. The reported discrepancy between validation and persistence logic could not be confirmed in the current environment.
Tool Execution 02:17:32
update_finding
Key Arguments (Finding Update)
"status": "ai_rejected"
"risk_level": "informational"
"status_reason": "Technical verification failed to reproduce the reported bypass; the backend consistently enforces the business rule by prioritizing the last occurrence of duplicate JSON keys."
Tool Response 02:17:40
update_finding
Successfully updated finding e2061303-895e-43da-b681-c7bbd7755660 to "ai_rejected" status and downgraded its risk level to "informational". Technical verification across multiple test cases confirmed that the backend consistently prioritizes the last occurrence of duplicate JSON keys for both validation and persistence. The finding now includes Python evidence demonstrating that the "one-per-type" business rule is correctly enforced.

All 8 Tests at a Glance

Test 1 — String → Int:           {"cert_type_id": "3", "cert_type_id": 1}  →  400 Blocked
Test 2 — Float → Int:            {"cert_type_id": 3.0, "cert_type_id": 1}  →  400 Blocked
Test 3 — Array Value:            {"cert_type_id": [3, 1]}  →  422 Rejected
Test 4 — Object Value:           {"cert_type_id": {"id": 1}}  →  422 Rejected
Test 5 — Non-Adjacent Keys:      {"cert_type_id": 3, ..., "cert_type_id": 1}  →  400 Blocked
Test 6 — Query String Pollution: POST ?cert_type_id=3 + body cert_type_id: 1  →  400 Blocked
Test 7 — Status Bypass:          {"cert_type_id": 1, "status": "expired"}  →  400 Blocked
Test 8 — 100 Duplicate Keys:     "cert_type_id": 3 ×100, then "cert_type_id": 1  →  400 Blocked

Why This Matters

  • 1 false positive eliminated before it reached the client report
  • 0 human hours spent triaging a non-issue
  • Constructive recommendation preserved: even while rejecting the bypass, the Verifier noted that accepting duplicate JSON keys at all runs against RFC 8259's guidance that object names should be unique, and recommended the application reject such payloads outright
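That hardening recommendation is straightforward to implement in Python, for example via json.loads's object_pairs_hook, which sees every key/value pair before duplicates collapse into a dict (a sketch, not CertTracker's code):

```python
import json

def no_duplicate_keys(pairs):
    # object_pairs_hook receives all pairs, duplicates included, before
    # they are merged into a dict: the natural place to reject them.
    keys = [k for k, _ in pairs]
    duplicates = sorted({k for k in keys if keys.count(k) > 1})
    if duplicates:
        raise ValueError(f"Duplicate JSON keys: {duplicates}")
    return dict(pairs)

json.loads('{"cert_type_id": 1}', object_pairs_hook=no_duplicate_keys)  # ok
# json.loads('{"cert_type_id": 3, "cert_type_id": 1}',
#            object_pairs_hook=no_duplicate_keys)  # raises ValueError
```

Rejecting polluted payloads at the parser removes the entire attack surface instead of relying on every downstream layer to agree on last-key-wins.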

The False Positive Problem — and How SQUR Solves It

Every security tool generates false positives. Traditional scanners dump hundreds of findings into a PDF and leave it to humans to separate signal from noise — a process that typically consumes 40-60% of a security team's triage time.

AI-powered tools face the same risk, compounded by the tendency of language models to confuse plausible hypotheses with confirmed vulnerabilities. A finding that sounds right isn't the same as one that is right.

SQUR's architecture addresses this directly. The Vulnerability Verifier is a separate, independent agent that treats every finding as a hypothesis to be tested. It doesn't trust the Pentester's conclusions — it reproduces the attack from scratch, systematically varying the conditions, and only accepts findings that survive rigorous falsification.

In this case, that meant 8 variations in 3 minutes and 26 seconds: type confusion, structural bypasses, query string pollution, status manipulation, and parser stress testing. When none produced a bypass, the finding was rejected — not buried in a backlog, but actively marked as ai_rejected with full evidence attached.

The result: only verified, reproducible vulnerabilities reach your report.

This finding is part of the demo pentest every user sees when signing up for SQUR. Create a free account to explore the full results and understand how SQUR works.

Get Started Free