Quilons AI

How Quilons is different

Built for organizations that need privacy, control, and compliance—not just features.

Architecture & Deployment

| Feature | Quilons Recruit | Typical ATS |
| --- | --- | --- |
| Per-Tenant Isolation | Every customer runs in a dedicated Docker stack with separate DB, queues, storage, and secrets (see the sketch below this table). | ⚠️ Shared infrastructure and DB with row-level separation. |
| On-Prem, SaaS & Hybrid | Supports all models: cloud, on-prem, hybrid, and fully air-gapped. | Usually SaaS-only; on-prem is rare or enterprise-priced. |
| Tenant-Specific Customization | Each tenant can have its own workflow, custom tasks, UI modules, and integrations (e.g., a "video submission" step for Airbus). | One global workflow for all customers; minimal per-tenant flexibility. |
| SSO Everywhere | SAML & OIDC for both SaaS and on-prem deployments. | ⚠️ SSO only in SaaS and only in the enterprise tier. |
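
The Per-Tenant Isolation row above describes one dedicated Docker stack per customer, with its own database, queues, storage, and secrets. Below is a minimal sketch of what tenant-scoped configuration could look like; the environment-variable names and the `TenantConfig` structure are illustrative assumptions, not Quilons' actual schema.

```python
# Illustrative only: one way a per-tenant stack could resolve its own database,
# queue, object storage, and secrets. All names here are hypothetical, not Quilons' API.
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class TenantConfig:
    tenant_id: str
    database_url: str     # dedicated database for this tenant
    queue_url: str        # dedicated message queue / broker
    storage_bucket: str   # dedicated object-storage bucket
    secrets_path: str     # dedicated secrets mount (e.g., Docker secrets or a vault path)

def load_tenant_config() -> TenantConfig:
    """Each tenant's containers see only their own environment and secrets,
    so cross-tenant access is ruled out at the configuration level."""
    return TenantConfig(
        tenant_id=os.environ["TENANT_ID"],
        database_url=os.environ["TENANT_DATABASE_URL"],
        queue_url=os.environ["TENANT_QUEUE_URL"],
        storage_bucket=os.environ["TENANT_STORAGE_BUCKET"],
        secrets_path=os.environ.get("TENANT_SECRETS_PATH", "/run/secrets"),
    )

# Demo values standing in for what a tenant's container environment would provide.
os.environ.setdefault("TENANT_ID", "acme")
os.environ.setdefault("TENANT_DATABASE_URL", "postgresql://acme-db:5432/recruit")
os.environ.setdefault("TENANT_QUEUE_URL", "redis://acme-queue:6379/0")
os.environ.setdefault("TENANT_STORAGE_BUCKET", "acme-recruit-files")
print(load_tenant_config())
```

Because each stack carries its own connection strings and secrets, there is no shared database that needs row-level separation.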

Workflow & Automation

| Feature | Quilons Recruit | Typical ATS |
| --- | --- | --- |
| Dynamic Workflow Engine | General-purpose workflow engine with pluggable tasks; not limited to ATS pipelines. | Fixed recruitment stages (Applied → Screening → Interview → Hire). |
| Automatic Interview Scheduling | AI-assisted scheduling that finds slots, sends invites, and handles conflicts. | Mostly manual coordination. |
| Automatic Rejection Handling | Automated rejection emails with audit trail + templates. | ⚠️ User must manually send rejections. |
| Custom Tasks per Tenant | Add custom steps such as video submission, internal approval, pre-screening forms, and identity checks (see the sketch below this table). | Not possible without vendor engineering help. |
| Roadmap: Extend to Onboarding | 🟦 Workflow engine will support onboarding tasks such as background checks, offer generation, contract signing, IT access, and equipment provisioning. | ATS stops at the "Hire" stage; onboarding requires a separate HRIS. |
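
The Dynamic Workflow Engine and Custom Tasks per Tenant rows describe pluggable, per-tenant steps such as a video submission task. Here is a minimal sketch of a pluggable task registry, using hypothetical names (`register_task`, `run_workflow`) rather than the engine's real API.

```python
# Illustrative sketch of a pluggable workflow-task registry with per-tenant steps.
# All names here are hypothetical; they are not Quilons' real engine API.
from typing import Callable, Dict, List

TaskHandler = Callable[[dict], dict]   # takes a candidate context, returns an updated context

TASK_REGISTRY: Dict[str, TaskHandler] = {}

def register_task(name: str) -> Callable[[TaskHandler], TaskHandler]:
    """Decorator that makes a task available to any tenant's workflow definition."""
    def wrap(handler: TaskHandler) -> TaskHandler:
        TASK_REGISTRY[name] = handler
        return handler
    return wrap

@register_task("screening")
def screening(context: dict) -> dict:
    context["screened"] = True
    return context

@register_task("video_submission")
def video_submission(context: dict) -> dict:
    # e.g., send the candidate an upload link and mark the step as pending
    context["video_requested"] = True
    return context

def run_workflow(steps: List[str], context: dict) -> dict:
    """Executes a tenant's ordered step list; any registered task can appear."""
    for step in steps:
        context = TASK_REGISTRY[step](context)
    return context

# A tenant-specific pipeline: same engine, a different ordered step list per tenant.
print(run_workflow(["screening", "video_submission"], {"candidate_id": "c-123"}))
```

A tenant's workflow is then just an ordered list of registered step names, which is how a step like video submission can exist for one tenant without changing the core pipeline.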

AI & LLM Control

| Feature | Quilons Recruit | Typical ATS |
| --- | --- | --- |
| Bring Your Own LLM | OpenAI, Anthropic, Azure, Groq, Llama 3, Mistral, local Ollama: full control. | Vendor-locked AI or limited options. |
| Multi-Model Per Workflow | Use different LLMs for scoring, ranking, bias checks, and text generation (see the sketch below this table). | One model (if any). |
| Human-in-the-Loop Decisions | AI gives transparent scoring + explanations, but decisions remain human-controlled. | ⚠️ Black-box scoring or no assistive AI. |
| Local Model Support | Run Llama or Mistral on-prem for air-gapped deployments. | No local inference support. |
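
The Bring Your Own LLM and Multi-Model Per Workflow rows describe routing different stages (scoring, ranking, bias checks, text generation) to different providers or local models. Below is a minimal routing sketch; the provider identifiers, model names, and `ModelRoute` structure are assumptions for illustration, not the product's configuration format.

```python
# Illustrative per-stage LLM routing: different workflow stages can use different
# providers or local models. The routes and the ModelRoute structure are assumptions.
from dataclasses import dataclass
from typing import Dict

@dataclass(frozen=True)
class ModelRoute:
    provider: str   # e.g., "openai", "anthropic", "azure", "groq", or a local "ollama" endpoint
    model: str

# One workflow, several stages, potentially several vendors (or fully local models).
ROUTES: Dict[str, ModelRoute] = {
    "scoring":         ModelRoute("anthropic", "claude-3-5-sonnet"),
    "ranking":         ModelRoute("openai", "gpt-4o-mini"),
    "bias_check":      ModelRoute("ollama", "llama3"),        # can stay on-prem / air-gapped
    "text_generation": ModelRoute("mistral", "mistral-large"),
}

def complete(route: ModelRoute, prompt: str) -> str:
    """Placeholder for a provider-specific client call (vendor SDK or local HTTP endpoint)."""
    return f"[{route.provider}/{route.model}] response to: {prompt[:40]}"

def run_stage(stage: str, prompt: str) -> str:
    return complete(ROUTES[stage], prompt)

print(run_stage("bias_check", "Review this evaluation for potentially biased language ..."))
```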

GDPR & Compliance

| Feature | Quilons Recruit | Typical ATS |
| --- | --- | --- |
| GDPR Onboarding Wizard | Guided setup for retention, consent, DPA, privacy notice, data minimization, and PII controls. | No structured onboarding; manual legal setup. |
| Pseudonymized Data Flows | Candidate data is minimized and pseudonymized before any LLM interaction (see the first sketch below this table). | ⚠️ Full PII may be sent to vendor AI. |
| Automated Retention Policies | Configurable per tenant (30–365 days) with a nightly purge (see the second sketch below this table). | Fixed or manual retention. |
| Audit Trails | 70+ audit points; no PII stored in logs. | ⚠️ Limited logging; PII often appears in logs. |
| Encryption & Security | AES-256-GCM at rest, TLS in transit, HMAC-signed workflow states. | ⚠️ Varies widely; often minimal. |
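
The Pseudonymized Data Flows row states that candidate data is minimized and pseudonymized before any LLM interaction. Here is a minimal sketch of that pattern, assuming a hypothetical field list and token format; the real pipeline's fields and storage are not described here.

```python
# Illustrative pseudonymization before sending candidate data to an LLM.
# Field names and the token format are assumptions; only the pattern matters:
# direct identifiers become stable tokens, and the mapping never leaves the tenant stack.
import hashlib
from typing import Dict, Tuple

PII_FIELDS = ("name", "email", "phone", "address")

def pseudonymize(candidate: Dict[str, str], salt: str) -> Tuple[Dict[str, str], Dict[str, str]]:
    """Returns (LLM-safe record, local token->value mapping kept inside the tenant stack)."""
    safe, mapping = {}, {}
    for key, value in candidate.items():
        if key in PII_FIELDS:
            token = "pii_" + hashlib.sha256((salt + value).encode()).hexdigest()[:12]
            mapping[token] = value
            safe[key] = token
        else:
            # Non-identifying fields pass through; data minimization would also drop
            # any field the current stage does not actually need.
            safe[key] = value
    return safe, mapping

candidate = {"name": "Jane Doe", "email": "jane@example.com", "skills": "Python, SQL"}
safe_record, local_mapping = pseudonymize(candidate, salt="per-tenant-secret")
print(safe_record)   # what the LLM sees: tokens instead of direct identifiers
```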

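The Automated Retention Policies row (30–365 days per tenant, nightly purge) can be pictured as a small scheduled job. The table and column names below are assumptions, and an in-memory SQLite database stands in for the tenant's dedicated database.

```python
# Illustrative nightly retention purge: delete candidate records older than the tenant's
# configured window (30-365 days per the table above). Table and column names are assumptions.
import sqlite3
from datetime import datetime, timedelta, timezone

def purge_expired_candidates(conn: sqlite3.Connection, retention_days: int) -> int:
    """Removes candidate rows whose creation date falls outside the retention window."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=retention_days)
    cur = conn.execute("DELETE FROM candidates WHERE created_at < ?", (cutoff.isoformat(),))
    conn.commit()
    return cur.rowcount   # useful for the audit trail: how many records were purged

# Demo with an in-memory database standing in for the tenant's dedicated DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE candidates (id TEXT, created_at TEXT)")
conn.execute("INSERT INTO candidates VALUES ('old', '2020-01-01T00:00:00+00:00')")
conn.execute("INSERT INTO candidates VALUES ('new', ?)", (datetime.now(timezone.utc).isoformat(),))
print(purge_expired_candidates(conn, retention_days=90))   # -> 1 (only the old record is purged)
```
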
Integrations

| Feature | Quilons Recruit | Typical ATS |
| --- | --- | --- |
| Calendar Integrations | Google & Microsoft with BYO OAuth or vendor OAuth; works in SaaS and on-prem. | ⚠️ Vendor-only OAuth; often not available on-prem. |
| CV / Document Sources | Drive, SharePoint, Dropbox, SFTP, Box, OneDrive, local folders. | Email upload or manual upload only. |
| Job Publishing | LinkedIn, Indeed, Platsbanken, hosted career pages. Extensible plugin system (see the sketch below this table). | ⚠️ Limited boards, often with additional fees. |
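
The Job Publishing row mentions an extensible plugin system. Here is a minimal sketch of what a publisher plugin interface could look like; the `JobBoardPublisher` protocol, the example publisher, and the URL are hypothetical.

```python
# Illustrative plugin interface for job-board publishing. The protocol and the
# example publisher are assumptions sketched around the boards named in the table.
from typing import List, Protocol

class JobBoardPublisher(Protocol):
    name: str
    def publish(self, job: dict) -> str:
        """Publishes a job posting and returns the external posting URL or ID."""
        ...

class CareerPagePublisher:
    name = "career_page"
    def publish(self, job: dict) -> str:
        # A real plugin would render and deploy the hosted career-page entry.
        return f"https://careers.example.com/jobs/{job['id']}"

def publish_everywhere(job: dict, publishers: List[JobBoardPublisher]) -> List[str]:
    """New boards are added by registering another publisher, not by changing core code."""
    return [p.publish(job) for p in publishers]

print(publish_everywhere({"id": "eng-42", "title": "Data Engineer"}, [CareerPagePublisher()]))
```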

Pricing & Control

| Feature | Quilons Recruit | Typical ATS |
| --- | --- | --- |
| Pricing Model | Per-employee-per-month (PEPM) with a minimum floor. Unlimited users, jobs, candidates, and workflows. | Pay per job, per recruiter, per feature, per automation. |
| AI Cost Transparency | You pay your LLM provider directly; no markup. | Vendor AI with hidden multipliers. |
| Data Ownership | Tenant fully owns its data; export anytime (CSV, JSON). | ⚠️ Vendor controls exports; sometimes paywalled. |

Legend: ✅ Fully supported · 🟦 Upcoming capability · ⚠️ Limited or varies by vendor · ❌ Not available

Why Organizations Choose Quilons

Built for teams that cannot compromise on security, compliance, or control.

🏛️ Regulated Industries

Banking, healthcare, government. Meet data sovereignty and compliance requirements with on-premise deployment.

GDPR · Data Sovereignty · Audit Trails

🔒 Security-First Teams

Per-tenant isolation, encryption, PII-free logging, and complete control over where data lives.

Per-Tenant Stacks · AES-256 · Air-Gapped

💰 Cost Transparency

No AI markup. Pay your LLM provider directly. PEPM pricing with no feature metering.

BYO-LLM · No Markup · Predictable

🎯 Enterprise IT

Deploy on your infrastructure. Integrate with internal systems. Full API access and data export.

Self-Host · API Access · Portability

🌍 Global Teams

Choose your AI provider region. Deploy in your geography. Multi-language support.

Regional Choice · Multi-Language · Local Data

📊 Transparent AI

4-agent evaluation chain with explainable scoring, bias detection, and complete auditability (see the sketch below).

Explainable · Bias Flags · Auditable
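
The Transparent AI card refers to a 4-agent evaluation chain with explainable scoring and bias detection. Below is a hedged sketch of how such a chain can produce an explanation and an audit entry at every step; the four agent roles and their outputs are assumptions for illustration, not a description of Quilons' actual agents.

```python
# Hedged sketch of a sequential multi-agent evaluation chain. The four stage names and
# their outputs are assumptions, not a description of Quilons' real agents.
from typing import Callable, List, Tuple

Agent = Callable[[dict], Tuple[dict, str]]   # returns (updated evaluation, human-readable explanation)

def screening_agent(ev: dict) -> Tuple[dict, str]:
    ev["meets_requirements"] = True
    return ev, "Candidate matches the must-have requirements listed in the job ad."

def scoring_agent(ev: dict) -> Tuple[dict, str]:
    ev["score"] = 82
    return ev, "Score 82/100 based on skills overlap and relevant experience."

def bias_check_agent(ev: dict) -> Tuple[dict, str]:
    ev["bias_flags"] = []
    return ev, "No age-, gender-, or origin-related language detected in the evaluation."

def summary_agent(ev: dict) -> Tuple[dict, str]:
    return ev, "Recommended for interview; the final decision remains with the recruiter."

def run_chain(agents: List[Agent], ev: dict) -> Tuple[dict, List[str]]:
    audit_log: List[str] = []
    for agent in agents:
        ev, explanation = agent(ev)
        audit_log.append(f"{agent.__name__}: {explanation}")   # every step stays explainable and auditable
    return ev, audit_log

result, log = run_chain([screening_agent, scoring_agent, bias_check_agent, summary_agent],
                        {"candidate_id": "c-123"})
print(*log, sep="\n")
```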

Real-World Scenarios

1. European Bank

Challenge: Cannot use a shared US-based SaaS for candidate data. Requires on-premise deployment with GDPR compliance.
Solution: Quilons on-premise with per-tenant isolation, EU-region data storage, automated retention, and complete audit trails.

2. Healthcare Provider

Challenge: Needs AI-assisted screening but cannot send data to external AI vendors. Must audit all AI decisions.
Solution: Quilons with local Ollama (air-gapped). 4-agent evaluation chain with transparent scoring and bias detection.

3. Fast-Growing Startup

Challenge: Wants modern AI recruitment but is concerned about vendor lock-in and unpredictable AI costs.
Solution: Quilons Cloud SaaS with BYO-LLM. Choose OpenAI, Anthropic, or Groq. Pay the provider directly. Switch anytime.

4. Defense Contractor

Challenge: Air-gapped environment with no outbound internet. Still needs workflow automation.
Solution: Quilons on-premise with local Ollama models. All processing happens within the secure network.

Experience the difference

Spin up a free demo stack and see per-tenant isolation, BYO-LLM, and GDPR controls in action.