AI-driven authentication (Beta)
Use AI-driven auth when the login flow is too complex to describe with a single login URL and a username/password pair: multi-step wizards, tenant pickers, email-code flows, or apps where you simply don't know the DOM structure.

How it works
The scanner spins up a browser, navigates to your app, and hands control to an LLM. The LLM:
- Reads the page and identifies the login affordance.
- Fills inputs with the credentials you provided.
- Handles multi-step screens (tenant selector, remember-device prompts, post-login "What's new" modals).
- Verifies login succeeded before letting the scan start.
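The loop above can be sketched as follows. This is an illustrative mock, not Levo's actual implementation: the function names, the action vocabulary (`click`/`fill`/`done`), and the dict-based "browser" are all assumptions made for the sake of a runnable example.

```python
# Hypothetical sketch of the AI-driven login loop. A real agent sends the
# rendered DOM plus credentials to an LLM and applies the action it returns;
# here both the browser and the LLM are mocked.

MAX_STEPS = 10  # bail out if the model never reports success

def run_login(llm_decide, browser, creds):
    """Drive the browser until the LLM verifies a logged-in state."""
    for _ in range(MAX_STEPS):
        page = browser["current_page"]       # snapshot of the current screen
        action = llm_decide(page, creds)     # LLM picks the next action
        if action["type"] == "done":         # login verified: let scan start
            return True
        browser["current_page"] = action["next_page"]  # apply click/fill
    return False                             # give up and flag auth failure

# --- Mocks so the sketch runs end to end -----------------------------------
def mock_llm(page, creds):
    # Hard-coded decisions for a two-step wizard (tenant picker, then form).
    if page == "tenant-picker":
        return {"type": "click", "next_page": "login-form"}
    if page == "login-form":
        return {"type": "fill", "next_page": "dashboard"}
    return {"type": "done"}

browser = {"current_page": "tenant-picker"}
ok = run_login(mock_llm, browser, {"username": "u", "password": "p"})
print(ok)  # True: the agent reached the dashboard and verified login
```

The `MAX_STEPS` cap matters in practice: without it, a misidentified flow would loop (and burn tokens) indefinitely.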
When to use
- Multi-step / wizard-style logins.
- Heavy SPAs where a classic "fill selector X" approach keeps breaking.
- Apps with tenant selectors, MFA fallback screens, or consent dialogs between login and dashboard.
- Cases where you don't want to maintain brittle selector-based auth scripts.
When not to use
- Simple single-page logins — Form is faster and cheaper.
- Token-based APIs — Token needs no LLM.
- CI runs where cost predictability matters (each login burns LLM tokens).
- CAPTCHA-protected logins — disable CAPTCHA for the scan account instead.
Dashboard
- Create Scan → Step 2, pick AI-driven.
- Provide credentials the LLM can use (typically username + password).
- Make sure Key Source at the top of the step is set — AI-driven needs an LLM key (Levo's platform key or BYOK).
- Click Next.
CLI
shadownet scan https://app.example.com \
--auth ai \
--username "$SCAN_USERNAME" \
--password "$SCAN_PASSWORD"
Set ANTHROPIC_API_KEY or OPENAI_API_KEY for BYOK, or rely on Levo's platform key.
LLM provider precedence
If both keys happen to be exported (common in CI), the scanner picks a provider in this order:
1. ai.provider in levo-dast.yml (explicit wins).
2. --ai-provider CLI flag, if passed.
3. ANTHROPIC_API_KEY, if set.
4. OPENAI_API_KEY, if set.
5. Levo's platform key (fallback).
To pin a provider on a shared runner, set ai.provider in the YAML rather than relying on env-var order.
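The precedence chain can be expressed as a small resolver. Only the ordering mirrors the docs above; the function name, dict shapes, and the `"levo-platform"` fallback label are illustrative assumptions, not the scanner's real internals.

```python
# Sketch of the provider-precedence logic: YAML beats CLI flag beats
# Anthropic env key beats OpenAI env key beats the platform fallback.
import os

def resolve_provider(yaml_config, cli_provider=None, env=None):
    env = env if env is not None else os.environ
    # 1. Explicit ai.provider in levo-dast.yml always wins.
    if yaml_config.get("ai", {}).get("provider"):
        return yaml_config["ai"]["provider"]
    # 2. --ai-provider CLI flag, if passed.
    if cli_provider:
        return cli_provider
    # 3./4. BYOK env keys; Anthropic is checked before OpenAI.
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    # 5. Fall back to Levo's platform key.
    return "levo-platform"

# Both keys exported (common in CI): Anthropic wins unless pinned in YAML.
ci_env = {"ANTHROPIC_API_KEY": "sk-a", "OPENAI_API_KEY": "sk-o"}
print(resolve_provider({}, env=ci_env))                           # anthropic
print(resolve_provider({"ai": {"provider": "openai"}}, env=ci_env))  # openai
```

This is why pinning `ai.provider` in the YAML is the safest option on shared runners: it sits at the top of the chain and ignores whatever keys happen to be exported.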
levo-dast.yml
auth:
  strategy: "ai"
  username: "${SCAN_USERNAME}"
  login_url: "https://app.example.com"
ai:
  provider: "anthropic"
  # ANTHROPIC_API_KEY in env
AI-driven auth is under active development. Expect occasional login misidentification on unusual flows; report failures to support@levo.ai with the scan ID so we can tune the prompts.
Cost and latency
- Each scan burns LLM tokens for the login flow (usually tens of thousands of input tokens for 3–4 screens).
- The first login adds ~30–90s to scan start-up compared to Form auth.
- Subsequent login refreshes during a long scan reuse cached decisions where possible.
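For CI budgeting, the figures above support a quick back-of-envelope estimate. The token count below is an assumption derived from the rough range given ("tens of thousands of input tokens for 3–4 screens"), and the per-token price is a placeholder; substitute your provider's actual rate.

```python
# Back-of-envelope login cost. Both constants are assumptions, not Levo
# pricing: ~10k input tokens per screen x 4 screens, at a placeholder rate.
INPUT_TOKENS_PER_LOGIN = 40_000
PRICE_PER_MILLION_INPUT = 3.00  # USD per 1M input tokens (placeholder)

def login_cost_usd(logins, tokens_per_login=INPUT_TOKENS_PER_LOGIN,
                   price_per_million=PRICE_PER_MILLION_INPUT):
    """Estimated USD spent on LLM input tokens for `logins` login flows."""
    return logins * tokens_per_login * price_per_million / 1_000_000

# 50 CI scans a day, one AI-driven login each:
print(round(login_cost_usd(50), 2))  # 6.0
```

Output tokens and mid-scan login refreshes add to this, so treat the result as a floor; if the number matters, the Form strategy's fixed cost is the predictable alternative.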
Next
- Form auth — cheaper if the flow fits.
- YAML config — automate AI-auth scans in CI.