
Annual Security Audit Procedure

Effective February 9, 2026


Purpose

This procedure defines the requirements for conducting annual security audits or assessments. Regular security audits demonstrate our commitment to protecting customer data and provide evidence of security program effectiveness.

Our Architecture: Beekeeper Studio is a desktop database client. This audit covers our cloud services (account management, billing, license validation, support systems, optional workspace sync). See Data Flow Diagram.

Minimum Requirement: At least once per year. Internal self-assessment is acceptable; third-party audits recommended when pursuing enterprise customers.

For Educational Institutions: NDPA Section 5.2 requires annual security audit or assessment.


1. Annual Audit Requirement

1.1 Minimum Frequency

Required: At least once per calendar year (12-month period)

Recommended: Conduct audit in Q4 (October-December) to align with annual policy reviews and planning for the following year.

1.2 Audit Scope

The annual security audit must assess:

  • Infrastructure Security: Cloud hosting (Heroku), platform access controls
  • Application Security: Code security, vulnerability management, patch compliance
  • Data Protection: Encryption, data handling, retention compliance
  • Access Management: User access reviews, MFA compliance, privileged access
  • Logging and Monitoring: Log completeness, monitoring effectiveness, incident detection
  • Incident Response: Plan readiness, team preparedness, response capabilities
  • Policy Compliance: Adherence to all security policies
  • Vendor/Subprocessor Security: Third-party security assessments

1.3 Audit Approach

  • Conducted by internal security team or CTO
  • Uses a standardized checklist based on recognized frameworks (NIST CSF, CIS Controls)
  • Self-assessment of security controls
  • Documents findings and remediation plans
  • Cost: Low (internal time only)
  • Frequency: Annual minimum

2. Internal Security Assessment Procedure

2.1 Timeline

November (Annual Compliance Week):

Day 1-2: Review and testing (6-8 hours)

  • Review last year’s findings
  • Test security controls using checklist
  • Collect evidence (screenshots, logs, reports)

Day 3: Risk assessment and findings (3-4 hours)

  • Conduct risk assessment (see Section 5)
  • Update risk register
  • Compile audit report with findings

Day 4: Plan remediation (1-2 hours)

  • Create prioritized fix list
  • Assign owners and target dates

Day 5: Policy updates (1-2 hours)

  • Update policies based on findings
  • Document completion of annual audit

Total Time: 11-16 hours

See Also: Compliance Actions Calendar - Annual Compliance Week schedule

2.2 Audit Team

Audit Lead: CTO or Security Contact
Participants:

  • Infrastructure/DevOps team member
  • Application security representative
  • Compliance/policy owner
  • External consultant (if budget allows)

Independence: Where possible, have team members audit areas outside their direct responsibility to maintain objectivity.

2.3 Audit Checklist

Based on NIST Cybersecurity Framework and CIS Critical Security Controls:

Identify (Asset Management)

  • Inventory of all systems with customer/student data access
  • Data flow diagrams up to date
  • Asset classification current (production, staging, development)
  • Subprocessor inventory reviewed and accurate

Protect (Access Control)

  • MFA enabled on all privileged accounts (target: 100%)
  • Quarterly access reviews completed on time
  • Least privilege principle enforced
  • Role-based access control implemented
  • Password policy compliance verified

Protect (Data Security)

  • Encryption at rest enabled (databases, backups, file storage)
  • Encryption in transit enforced (TLS 1.2+ for all connections)
  • Customer data properly classified and labeled
  • Student data handling compliant with NDPA
  • Data retention policy followed (deletion within 60 days of requests)

Protect (Training)

  • All employees completed security awareness training
  • Privileged users completed advanced training
  • Phishing simulation conducted
  • Training completion rate >95%

Detect (Monitoring)

  • Logging enabled on all critical systems
  • Security alerts configured and tested
  • Log reviews conducted per policy (daily automated alerts, monthly manual review)
  • Anomaly detection for access patterns

Detect (Vulnerability Management)

  • Dependency scanning operational (Dependabot, npm audit)
  • Critical vulnerabilities patched within 7 days (SLA met: ___%)
  • High vulnerabilities patched within 30 days (SLA met: ___%)
  • Vulnerability scan coverage >95% of systems
  • Zero critical vulnerabilities open >30 days
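The SLA percentages above can be computed from a vulnerability tracker export rather than by hand. A minimal Python sketch, assuming each finding records a severity, an open date, and a patch date (the field names are illustrative, not a real tracker schema):

```python
from datetime import date

# SLA windows from the checklist: critical within 7 days, high within 30.
SLA_DAYS = {"critical": 7, "high": 30}

def sla_compliance(findings: list[dict]) -> dict[str, float]:
    """Percentage of vulnerabilities that met their patching SLA, by severity.

    Each finding needs 'severity', 'opened', and 'patched' (a date, or
    None if still open; an open finding counts against the SLA clock).
    """
    met = {s: 0 for s in SLA_DAYS}
    total = {s: 0 for s in SLA_DAYS}
    today = date.today()
    for f in findings:
        sev = f["severity"]
        if sev not in SLA_DAYS:
            continue
        total[sev] += 1
        closed = f["patched"] or today  # still-open findings keep aging
        if (closed - f["opened"]).days <= SLA_DAYS[sev]:
            met[sev] += 1
    return {s: 100.0 * met[s] / total[s] for s in SLA_DAYS if total[s]}
```

The resulting percentages drop straight into the "SLA met" blanks above and the Key Metrics section of the appendix template.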

Respond (Incident Response)

  • Incident Response Plan updated in past year
  • Tabletop exercise conducted (last date: ___)
  • Incident response team identified and trained
  • Breach notification template ready
  • Communication plan tested

Respond (Backup and Recovery)

  • Daily backups operational
  • Backup restoration tested monthly (success rate: ___%)
  • Disaster recovery plan updated
  • RPO/RTO defined and achievable

Recover (Business Continuity)

  • Business Continuity Plan reviewed and updated
  • Critical systems identified
  • Recovery procedures documented
  • Failover tested (last test date: ___)
  • Post-incident review process followed

Risk Assessment

  • Risk register reviewed and updated (see Section 5)
  • New risks identified and evaluated
  • Existing risk mitigations verified effective
  • Accepted risks re-evaluated
  • Risk summary included in audit report

Penetration Testing

  • Automated scan completed (OWASP ZAP) against staging (see Section 6)
  • Template scan completed (Nuclei) against staging
  • HTTP security headers verified
  • Manual authorization checks completed
  • Dependency audit run (bundle audit, npm audit)
  • Findings documented and remediation planned

2.4 Evidence Collection

For each control, collect evidence:

  • Screenshots of configurations
  • Logs demonstrating compliance
  • Training completion reports
  • Access review sign-offs
  • Vulnerability scan reports
  • Backup test results
  • Policy documents with version dates

Evidence Storage: Create audit evidence folder with all supporting documentation in Google Drive. One folder per annual review.

2.5 Findings Classification

Critical Finding: Severe risk, immediate remediation required

  • Example: Critical vulnerability open >90 days, no MFA on production database access

High Finding: Significant risk, remediate within 30 days

  • Example: Quarterly access review 2 months overdue, logging gaps on key systems

Medium Finding: Moderate risk, remediate within 90 days

  • Example: Training completion at 85% (target: 95%), minor policy deviations

Low Finding: Minor issue, remediate within next review cycle

  • Example: Documentation out of date, cosmetic policy improvements

Observation: No remediation required, informational only

  • Example: Best practice recommendation, process improvement suggestion

2.6 Audit Report Template

Executive Summary (1 page)

  • Overall security posture assessment (Strong / Adequate / Needs Improvement)
  • Number of findings by severity
  • Key accomplishments from past year
  • Top 3 recommendations

Detailed Findings (per finding)

  • Finding title and severity
  • Description of issue
  • Risk/impact assessment
  • Recommendation for remediation
  • Responsible party
  • Target completion date

Control Assessment Results

  • Summary by NIST CSF category (Identify, Protect, Detect, Respond, Recover)
  • Compliance percentage per category
  • Year-over-year trends
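The per-category compliance percentage can be derived directly from the completed checklist. An illustrative sketch, assuming each checklist item is recorded as a (category, passed) pair:

```python
def category_compliance(results: list[tuple[str, bool]]) -> dict[str, float]:
    """Compliance percentage per NIST CSF category, to one decimal place.

    `results` holds one (category, passed) pair per checklist item,
    e.g. ("Protect", True).
    """
    totals: dict[str, list[int]] = {}
    for category, passed in results:
        entry = totals.setdefault(category, [0, 0])
        entry[0] += int(passed)  # items that passed
        entry[1] += 1            # items assessed
    return {c: round(100.0 * p / t, 1) for c, (p, t) in totals.items()}
```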

Remediation Plan

  • Prioritized list of all findings
  • Assigned owners
  • Target dates
  • Budget implications (if any)

Appendices

  • Detailed evidence
  • Compliance metrics
  • Policy review status
  • Training records

5. Risk Assessment

5.1 Purpose

The annual risk assessment identifies, evaluates, and prioritizes risks to Beekeeper Studio’s systems and customer data. This is conducted as part of the annual security audit during November Compliance Week.

5.2 Risk Assessment Process

During the Annual Audit (Day 3):

  1. Identify Risks: Review current threat landscape, past incidents, audit findings, and infrastructure changes
  2. Evaluate Risks: Assess likelihood and impact using the matrix below
  3. Prioritize: Rank risks and determine treatment (mitigate, accept, transfer, avoid)
  4. Document: Update the Risk Register (Section 5.4)
  5. Plan: Create or update mitigation plans for high/critical risks

5.3 Risk Evaluation Matrix

Likelihood:

  • High: Likely to occur within the next year
  • Medium: Could occur within the next 1-3 years
  • Low: Unlikely to occur within the next 3 years

Impact:

  • Critical: Data breach affecting customers, significant legal/financial exposure, extended service outage (>24 hours)
  • High: Partial data exposure, regulatory non-compliance, service outage (4-24 hours)
  • Medium: Internal data exposure, minor compliance gap, service degradation (<4 hours)
  • Low: No data exposure, minor operational inconvenience, no customer impact

Risk Rating:

                       Low Impact   Medium Impact   High Impact   Critical Impact
  High Likelihood      Medium       High            Critical      Critical
  Medium Likelihood    Low          Medium          High          Critical
  Low Likelihood       Low          Low             Medium        High
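The matrix lends itself to a simple lookup when filling the Risk Rating column of the register. A sketch that mirrors the table above exactly:

```python
# Rating lookup reproducing the Section 5.3 matrix.
# Outer key: likelihood; inner key: impact.
RISK_MATRIX = {
    "high":   {"low": "Medium", "medium": "High",   "high": "Critical", "critical": "Critical"},
    "medium": {"low": "Low",    "medium": "Medium", "high": "High",     "critical": "Critical"},
    "low":    {"low": "Low",    "medium": "Low",    "high": "Medium",   "critical": "High"},
}

def risk_rating(likelihood: str, impact: str) -> str:
    """Risk rating for a likelihood/impact pair (case-insensitive)."""
    return RISK_MATRIX[likelihood.lower()][impact.lower()]
```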

Treatment Thresholds:

  • Critical: Immediate mitigation required, escalate to CEO
  • High: Mitigation plan within 30 days
  • Medium: Address within next quarter
  • Low: Accept or address during regular maintenance

5.4 Risk Register

Maintain a risk register as a spreadsheet (Google Sheets) with the following columns:

  • Risk ID: Unique identifier (e.g., RISK-2026-001)
  • Date Identified: When the risk was first identified
  • Category: Infrastructure, Application, Data, Access, Vendor, Personnel, or Compliance
  • Description: Clear description of the risk
  • Likelihood: High / Medium / Low
  • Impact: Critical / High / Medium / Low
  • Risk Rating: Calculated from likelihood × impact
  • Current Controls: What mitigations are already in place
  • Treatment: Mitigate / Accept / Transfer / Avoid
  • Mitigation Plan: What additional actions are planned
  • Owner: Who is responsible for the risk
  • Target Date: When mitigation should be complete
  • Status: Open / In Progress / Mitigated / Accepted
  • Last Reviewed: Date of last review

5.5 Common Risk Categories

Review these categories annually to ensure completeness:

Infrastructure Risks:

  • Heroku platform outage or discontinuation
  • Database corruption or data loss
  • Backup failure
  • DNS or domain issues

Application Risks:

  • Dependency vulnerabilities (npm, Ruby gems)
  • Application security flaws
  • Authentication bypass
  • API abuse

Data Risks:

  • Unauthorized access to customer data
  • Data breach via compromised credentials
  • Accidental data exposure in logs or error reports
  • Incomplete data deletion

Access Risks:

  • Compromised employee credentials
  • Excessive privileges
  • Orphaned accounts after personnel changes
  • MFA bypass

Vendor Risks:

  • Subprocessor data breach
  • Vendor service discontinuation
  • Vendor security downgrade
  • Supply chain compromise (dependencies)

Personnel Risks:

  • Key person unavailability (see Business Continuity Plan)
  • Insider threat
  • Insufficient security training
  • Social engineering attacks

Compliance Risks:

  • NDPA non-compliance (educational customers)
  • GDPR/CCPA violation
  • Missed breach notification deadline
  • Incomplete audit evidence

5.6 Risk Register Review

  • Annually: Full risk assessment during November Compliance Week (required)
  • Quarterly: Quick review of open risks and mitigation progress
  • Ad-hoc: After any security incident or significant infrastructure change

5.7 Reporting

Include in the annual audit report:

  • Summary of identified risks by rating
  • New risks identified since last assessment
  • Risks mitigated since last assessment
  • Accepted risks with justification
  • Year-over-year risk trend

6. Penetration Testing

6.1 Purpose

Annual penetration testing identifies security vulnerabilities in our cloud services that automated dependency scanning may miss. This covers application-level security: authentication, authorization, input validation, and API security.

6.2 Scope

In scope:

  • Cloud application and public API endpoints
  • Authentication and session management
  • Authorization controls (can user A access user B’s data?)
  • Input validation (SQL injection, XSS, CSRF)
  • HTTP security headers
  • Rate limiting

Out of scope:

  • Desktop application (separate testing process)
  • Third-party vendor infrastructure (Heroku, AWS, Stripe)
  • Customer databases (we never access these)

6.3 Testing Requirements

Automated scanning:

  • Run web application vulnerability scanner against staging environment
  • Run template-based vulnerability scanner against staging environment
  • Verify HTTP security headers on production
  • Run dependency audits (supplement to automated Dependabot scanning)

Manual verification:

  • Authentication: No username enumeration, proper error handling
  • Authorization: No unauthorized cross-account data access
  • Session management: Proper expiry and token rotation
  • Rate limiting: Enforced on login, password reset, and API endpoints
  • Input validation: Special characters handled safely across all input fields
  • Error handling: No stack traces, database details, or internal paths exposed

Important: Always run active scans against staging, not production.
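The HTTP security header verification can be scripted against a captured response. A minimal offline sketch; the required-header set shown here is an assumption and should be adjusted to match the application's actual header policy:

```python
# Headers to verify during the annual review. A value of None means
# "present with any value"; a string means the value must contain it.
# This set is illustrative, not a definitive policy.
REQUIRED_HEADERS = {
    "strict-transport-security": None,
    "x-content-type-options": "nosniff",
    "x-frame-options": None,
    "content-security-policy": None,
}

def missing_security_headers(headers: dict[str, str]) -> list[str]:
    """Names of required headers that are absent or have an unexpected
    value. `headers` is a response-header mapping (e.g. from an HTTP
    client), compared case-insensitively."""
    lowered = {k.lower(): v for k, v in headers.items()}
    problems = []
    for name, expected in REQUIRED_HEADERS.items():
        value = lowered.get(name)
        if value is None or (expected is not None and expected.lower() not in value.lower()):
            problems.append(name)
    return problems
```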

6.4 Reporting

Document results in a penetration testing report:

  • Date and scope of testing
  • Tools used with versions
  • Findings classified by severity (Critical/High/Medium/Low/Info)
  • Evidence (screenshots, request/response logs)
  • Remediation plan for each finding
  • Retest date for confirmed vulnerabilities

File the report in the annual audit evidence folder (Google Drive).

6.5 Schedule

  • Automated vulnerability scan: Annually, during November Compliance Week
  • Dependency audit: Monthly (automated via Dependabot); manual run quarterly
  • HTTP header review: Annually
  • Manual authorization checks: Annually

7. LEA Audit Support

7.1 Providing Audit Reports to LEAs

NDPA Section 5.2 requires: “Upon 10 days’ notice and execution of confidentiality agreement, Provider will provide the LEA with a copy of the audit report, subject to reasonable and appropriate redaction.”

Process:

  1. LEA requests audit report (typically during contract negotiation or annual review)
  2. Execute NDA/confidentiality agreement
  3. Redact specific vulnerability details and proprietary methodologies
  4. Provide summary or full report, based on the request, within 10 business days

What to Provide:

  • Executive summary (always shareable)
  • Findings summary with remediation status (redact vulnerability specifics)
  • Compliance metrics and trends
  • Evidence of annual audit completion (date, auditor, scope)

What to Redact:

  • Specific IP addresses and server names
  • Detailed vulnerability descriptions
  • Proprietary security tooling/methods
  • Information about other customers
  • Confidential financial data
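Mechanical redactions (IP addresses, internal hostnames) can be pre-applied before the manual review pass. An illustrative sketch; the `.internal` hostname suffix is a placeholder assumption, and the output must still be reviewed by hand before release:

```python
import re

# Patterns to scrub before sharing a report with an LEA. The hostname
# suffix ".internal" is a placeholder; substitute your real naming scheme.
REDACTIONS = [
    (re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"), "[REDACTED IP]"),
    (re.compile(r"\b[\w.-]+\.internal\b"), "[REDACTED HOST]"),
]

def redact(text: str) -> str:
    """Apply all redaction patterns to report text. This supplements,
    never replaces, a manual review of the final document."""
    for pattern, replacement in REDACTIONS:
        text = pattern.sub(replacement, text)
    return text
```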

7.2 LEA-Conducted Audits

Utah Code § 53E-9-309 grants LEAs the right to audit Provider.

If LEA Requests Audit:

  1. Acknowledge request within 48 hours
  2. Negotiate scope and timing
  3. Provide requested documentation:
    • Security policies
    • Audit reports (with NDA)
    • Subprocessor inventory
    • Security certifications (SOC 2 if available)
  4. Facilitate questionnaire completion
  5. Schedule interviews if needed (with reasonable notice)
  6. Provide follow-up information as requested

Response Timeline: 30 days for documentation requests


8. Audit Evidence Retention

Retention Period: 1 year (aligned with Papertrail log retention)

Evidence to Retain:

  • Complete audit reports
  • Supporting evidence and documentation
  • Remediation plans and completion records
  • Correspondence with auditors
  • Executive presentations
  • Audit committee meeting minutes (if applicable)

Storage Location:

  • Secure, access-controlled folder
  • Backup copies maintained
  • Indexed for easy retrieval during LEA audits

9. Remediation Tracking

9.1 Remediation Plan

For each finding, document:

  • Finding ID and title
  • Severity
  • Description
  • Recommended remediation
  • Assigned owner
  • Target completion date
  • Actual completion date
  • Verification method
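For the monthly review below, overdue items can be flagged automatically from this record. A minimal sketch using illustrative field names:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Finding:
    finding_id: str
    severity: str
    target_date: date
    status: str  # "Open", "In Progress", or "Closed"

def overdue(findings: list[Finding], as_of: date) -> list[str]:
    """IDs of findings still open past their target date, for the
    monthly escalation review (Remediation Tracking)."""
    return [
        f.finding_id
        for f in findings
        if f.status != "Closed" and f.target_date < as_of
    ]
```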

9.2 Tracking and Reporting

Monthly: Review remediation progress

  • Update status of all open findings
  • Escalate overdue items
  • Report to CTO

Quarterly: Executive summary

  • Findings closed since last quarter
  • Findings remaining open
  • Any new findings from continuous monitoring
  • Budget implications of remediation

Annually: Include in next year’s audit

  • Verify all previous findings remediated
  • Test effectiveness of remediation
  • Close out in new audit report

10. Continuous Improvement

10.1 Year-Over-Year Comparison

Track metrics annually:

  • Number of findings by severity
  • Average time to remediate
  • Percentage of controls meeting standards
  • Security incidents per year
  • Vulnerability patching SLA compliance

Goal: Demonstrate improvement each year

10.2 Industry Benchmarking

Compare to industry standards:

  • CIS Critical Security Controls implementation percentage
  • NIST CSF maturity levels
  • Peer company security metrics (if available)

10.3 Incorporating Lessons Learned

After each audit:

  • Identify control gaps
  • Update policies to address findings
  • Enhance monitoring and detection
  • Improve documentation
  • Invest in tooling or training where needed



Document Information

Version: 2.0
Effective Date: 2026-02-09
Last Reviewed: 2026-02-09
Next Review Due: 2027-02-09
Owner: CTO / Security Contact
Approved By: CEO

Changes from v1.0: Reduced annual audit to roughly 11-16 hours during November Compliance Week, clarified desktop app architecture, made NDPA references optional, added cross-references to legal documents.

Note: Internal self-assessment is acceptable. Third-party audits recommended when pursuing enterprise customers or SOC 2 certification.


Appendix: Annual Audit Checklist Template

ANNUAL SECURITY AUDIT - [YEAR]

Audit Period: [Start Date] to [End Date]
Audit Conducted By: [Name/Firm]
Audit Completed: [Date]

Overall Assessment:
[ ] Strong - All controls effective, minimal findings
[ ] Adequate - Most controls effective, some improvements needed
[ ] Needs Improvement - Significant gaps identified, remediation plan required

Findings Summary:

  • Critical: [Number]
  • High: [Number]
  • Medium: [Number]
  • Low: [Number]
  • Observations: [Number]

NIST CSF Category Compliance:

  • Identify: [%]
  • Protect: [%]
  • Detect: [%]
  • Respond: [%]
  • Recover: [%]

Key Metrics:

  • MFA Compliance (Privileged Accounts): [%]
  • Vulnerability Patching SLA Compliance: [%]
  • Access Review Completion Rate: [%]
  • Training Completion Rate: [%]
  • Backup Success Rate: [%]

Top 3 Accomplishments:

  1. [Achievement]
  2. [Achievement]
  3. [Achievement]

Top 3 Recommendations:

  1. [Priority 1 recommendation]
  2. [Priority 2 recommendation]
  3. [Priority 3 recommendation]

Remediation Plan:
[Attached as separate document]

Next Audit Due: [Date, 12 months from completion]

Audit Report Location: [File path or document reference]

Signed:
________ Date: ____
[CTO or Security Contact]