Our external review method.
Clear, bounded, reproducible.
An external review is neither an automated scanner nor an intrusive pentest. It is a disciplined technical reading of your public surface, with explicit limits: no brute force, no mass downloads, no data modification, no production disruption.
Framework
What passive means here
- Observe what any outsider can already see from the internet: DNS, subdomains, headers, robots.txt, sitemap.xml, bundles, endpoints, technical docs, exposed IAM, buckets, panels, and automation flows.
- Verify findings without exfiltrating personal data or creating unnecessary side effects.
- Build proof from configuration, client code, response structure, access models, and externally observable behavior.
- Draw the line honestly between confirmed finding, plausible risk, and unverified hypothesis.
The 6 phases of our process
Fast scoping and external mapping
Map the visible perimeter: domains, subdomains, likely stack, and the dependencies most exposed from the outside.
External enumeration
DNS, certificates, headers, common routes, robots, sitemap, admin panels, analytics, IAM, staging, and forgotten environments.
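The header portion of this phase can be sketched as a purely offline evaluation of response headers already captured from a public endpoint. This is a minimal illustration, not our actual tooling; the header list and function name are assumptions for the example.

```python
# Sketch: flag common hardening gaps in HTTP response headers
# captured passively from a public endpoint.

EXPECTED = [
    "strict-transport-security",
    "content-security-policy",
    "x-content-type-options",
    "x-frame-options",
]

def missing_security_headers(headers: dict[str, str]) -> list[str]:
    """Return the expected security headers absent from a response."""
    present = {name.lower() for name in headers}
    return [h for h in EXPECTED if h not in present]
```

For example, a response carrying only `Strict-Transport-Security` would be flagged as missing the other three headers.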
Application analysis
Pull and inspect bundles, hunt for source maps, endpoints, roles, webhooks, data models, and Swagger or GraphQL specs.
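The source-map and endpoint hunt described above amounts to scanning fetched JavaScript text for known markers. A minimal sketch, with illustrative regex patterns (real bundles warrant broader ones):

```python
import re

# Sketch: scan bundle text for source-map pointers and absolute API URLs.
SOURCEMAP_RE = re.compile(r"//[#@]\s*sourceMappingURL=(\S+)")
ENDPOINT_RE = re.compile(r"https?://[\w.-]+/api/[\w/.-]*")

def inspect_bundle(js_text: str) -> dict[str, list[str]]:
    """Extract source-map references and API endpoints from bundle text."""
    return {
        "source_maps": SOURCEMAP_RE.findall(js_text),
        "endpoints": sorted(set(ENDPOINT_RE.findall(js_text))),
    }
```

A published `.map` file referenced this way often restores the full client source tree, which is why it is one of the first things we look for.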
Finding validation
Manually qualify each signal: what is actually accessible, what is only exposed as a blueprint, and what is already blocked.
Impact qualification
Prioritize by feasibility, severity, business value, and compliance relevance. A flaw matters for what it really opens, not for its score.
Reporting and remediation
A full report: executive summary, reproducible evidence, fix order, and a clear handoff for both engineers and decision-makers.
What we verify in practice
External attack surface
Subdomains, staging environments, forgotten services, technical routes, panels, docs, and public metadata.
Bundles and source maps
Auth flows, roles, data models, endpoints, webhooks, internal URLs, public variables, and sensitive client-side logic.
API and technical exposure
REST, GraphQL, Swagger/OpenAPI, schemas, back-office routes, response differences, and apparent access-control boundaries.
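One concrete check in this area: walking a published OpenAPI document and listing operations that declare no security requirement. The field names follow the OpenAPI 3 spec; the helper and sample structure are illustrative, not our production checks.

```python
# Sketch: list OpenAPI operations with no security requirement,
# honoring the spec-level default when an operation omits "security".

HTTP_METHODS = {"get", "post", "put", "patch", "delete"}

def unsecured_operations(spec: dict) -> list[str]:
    """Return 'METHOD /path' for operations lacking any security requirement."""
    global_security = spec.get("security", [])
    findings = []
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            if method.lower() not in HTTP_METHODS:
                continue
            if not op.get("security", global_security):
                findings.append(f"{method.upper()} {path}")
    return findings
```

An exposed admin route with an empty `security` array is exactly the kind of blueprint-level signal we then validate manually.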
Auth, roles, and data isolation
RLS, tenant separation, permissions, admin flows, reset logic, magic links, and incomplete policies.
Storage and automation
Buckets, signed URLs, webhooks, n8n/Make, HMAC signatures, admin flows delegated to automation, and direct file access by URL.
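The HMAC-signature check above can be sketched as recomputing the digest over the raw payload and comparing it in constant time. Header names, secrets, and encodings vary across platforms (n8n, Make, and others each have their own format), so treat this as an assumption-laden illustration:

```python
import hashlib
import hmac

# Sketch: verify a webhook's HMAC-SHA256 signature over the raw payload.
def verify_webhook(secret: bytes, payload: bytes, signature_hex: str) -> bool:
    """True if signature_hex is a valid HMAC-SHA256 of payload under secret."""
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences.
    return hmac.compare_digest(expected, signature_hex)
```

Automation flows that skip this verification, or compare signatures with plain `==`, are a recurring finding in the storage-and-automation review.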
Business and compliance context
Personal data, health, finance, HR, legal data, enterprise questionnaires, CNIL, HDS, DORA, and PCI-DSS implications.
What we do not do without explicit authorization
- No brute force, credential stuffing, stress testing, or aggressive fuzzing.
- No account creation, no sensitive-flow execution, no data deletion.
- No large-scale download or extraction of personal data.
- No internal pivoting, no active intrusion, nothing that should be treated as a formal pentest.
What you receive
External Review
The single most critical flaw, surfaced with a clear attack scenario and evidence you can understand immediately.
Detailed report
Findings ranked by severity, real impact, confidence level, evidence, recommendations, and fix priority.
Owner action summary
An executive-facing summary: what is serious, what must be fixed now, and what should be planned next.
Remediation framing
A fix-oriented view for engineers, with the logic behind each fix and the areas to revisit first.
Read next
Full Audit
The right fit when you need to fix several issues and keep due-diligence evidence.
External Review
The right format when you want the single most critical flaw nailed down first.
External review vs pentest
A clear comparison of scope, cost, and use cases for both approaches.
Florian
The researcher behind CleanIssue and the logic driving the audits.