
Analytics Overview

The five prebuilt dashboards, when to use each, and how the data is kept fresh

OptiLearn's analytics surface has five prebuilt institution-level dashboards, a per-course analytics view, and a custom Reports Builder. Each answers a specific question, so you don't have to build the same chart from scratch every time you want to check in on your institution.

Dashboard         URL                        Best for
Platform          /analytics                 Institution health at a glance — totals + trends across all courses
Students          /analytics/students        Learner activity, outcomes, retention
Engagement        /analytics/engagement      Lesson velocity, quiz participation, forum activity
Compliance        /analytics/compliance      Mandatory training status + overdue tracking
Revenue           /analytics/revenue         List revenue from enrollment activity
Course            /courses/[id]/analytics    Deep-dive into a single course
Reports Builder   /analytics/reports         Ad-hoc reports you build, save, schedule, share

All dashboards and the builder run through the same engine and the same tenant-isolation rules. You never see another institution's data — and when you share a report with another user, the engine re-runs it under their scope, so teachers only see their own courses even on a report built by an admin.

Who sees what

Role                               Dashboards                                    Reports Builder
Institution admin / Super admin    All dashboards, institution-wide data         Full access, can build and share
Principal / HOD                    All dashboards, institution-wide data         Full access, can build and share
Teacher                            All dashboards, scoped to their own courses   Scoped to own courses
Admissions / Finance / Librarian   Read-only dashboards                          Can build personal reports
Student / Parent                   Not available                                 Own data only

Scope is enforced at query time, inside the engine. If a teacher opens a shared report, the engine silently adds courseId IN (courses they created) to the query. You can't configure around this; it's the invariant the engine is built on.
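To make the invariant concrete, here is a minimal sketch of query-time scope injection. The shapes and names (Query, User, applyScope) are illustrative assumptions, not OptiLearn's actual API:

```typescript
// Sketch of query-time scope enforcement. Every query is pinned to the
// caller's institution; teachers additionally get a courseId filter
// injected. Callers cannot opt out.

type Role = "ADMIN" | "PRINCIPAL" | "TEACHER";

interface User {
  role: Role;
  institutionId: string;
  ownCourseIds: string[]; // courses this user created
}

interface Query {
  entity: string;
  filters: Record<string, unknown>;
}

function applyScope(query: Query, user: User): Query {
  const filters: Record<string, unknown> = {
    ...query.filters,
    institutionId: user.institutionId, // tenant isolation, always
  };
  if (user.role === "TEACHER") {
    filters.courseId = { in: user.ownCourseIds }; // teacher scoping
  }
  return { ...query, filters };
}

// A report built by an admin, re-run under a teacher's scope:
const shared: Query = { entity: "enrollments", filters: { status: "ACTIVE" } };
const teacher: User = { role: "TEACHER", institutionId: "inst-1", ownCourseIds: ["c1", "c2"] };
console.log(applyScope(shared, teacher).filters);
```

The key design point is that the filter is added by the engine after the report definition is loaded, so the same saved report yields different rows for different callers.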


Platform Dashboard (/analytics)

Institution-wide health check. One row of KPIs, three trend charts, two pie charts.

KPIs

  • Total Enrollments — lifetime count across the institution
  • New Enrollments — in the selected period
  • Completions — course completions in the selected period
  • Active Learners — distinct students with any lesson activity in the selected period

Charts

  • Enrollments Over Time (area, full width)
  • Completions Trend (line)
  • Certificates Issued (bar, with a "This period" inline KPI)
  • Enrollment Status (pie — ACTIVE / COMPLETED / DROPPED / EXPIRED / SUSPENDED)
  • Course Status (pie — DRAFT / REVIEW / PUBLISHED / ARCHIVED)

Sync to OptiCRM

The Sync to OptiCRM button pushes every new completion record to OptiCRM so student profiles there reflect their OptiLearn progress. One-way push; OptiCRM never writes back to OptiLearn.


Students Dashboard (/analytics/students)

The "who's learning, are they finishing?" dashboard.

KPIs

  • Active Students — distinct learners with lesson activity in the selected period
  • Course Completions — enrollments that finished this period
  • Average Final Score — mean final score across completed enrollments
  • Dropouts — enrollments that dropped in the selected period

Charts

  • Daily Active Students (area, full width) — the momentum signal. Trending down = retention problem.
  • Learner Lifecycle (pie) — distribution across ACTIVE / COMPLETED / DROPPED / EXPIRED / SUSPENDED.
  • Completions Trend (line) — completions per day.

What to look for

  • Daily active dropping — students aren't coming back. Check recent course changes, announcements, onboarding flow.
  • High dropouts + low completions — content is too long, too hard, or oversold. Compare against the Engagement dashboard to see if quiz attempts are also dropping.
  • Average final score under 60% — assessments may be mis-calibrated. Rubric review time.

Engagement Dashboard (/analytics/engagement)

Activity volume — answers "are people actually using the platform?" (distinct from Students, which asks "are they finishing?").

KPIs

  • Lessons Completed — lesson completions this period
  • Quiz Attempts — attempts this period
  • Average Quiz Score — percentage across attempts
  • Forum Posts — discussion posts this period

Charts

  • Lesson Completions (area, full width)
  • Quiz Attempts Over Time (line)
  • Forum Activity (bar — posts per day)

What to look for

  • Lessons plateau but quiz attempts don't — students are replaying quizzes without advancing. Might be an unlock issue, might be gamification pulling them back.
  • Quiz score trending down — content got harder or the assessment changed. Check the course's most-recent edits.
  • Forum flat at zero — no community. Consider seeding discussions, announcing prompts, or removing the forum from the course layout if it's a distraction.

Compliance Dashboard (/analytics/compliance)

Mandatory training health. Audit-grade numbers on who has and hasn't completed their required courses.

KPIs

  • Total Records — lifetime count of mandatory training assignments
  • Completed — lifetime count in COMPLETED status
  • Overdue — records past their due date without completion (the alarm signal)
  • Completed This Period — momentum for recent enforcement

Charts

  • Completions Over Time (area, full width)
  • Compliance Status (pie — NOT_STARTED / IN_PROGRESS / COMPLETED / OVERDUE / WAIVED)

What to look for

  • Overdue growing — enforcement is lagging. Check that the scheduled reminders cron is running and that the recipient list is current.
  • Completed-this-period drops suddenly — something changed about the training. Broken link, missing video, assignment requirement went up.
  • Most records WAIVED — the training may be too broad. Tighten the eligibility rules so only people who actually need it get assigned.

Revenue Dashboard (/analytics/revenue)

Dollars and enrollments — what the business is bringing in. "List revenue" here means course.price × enrollments in the selected window, not billed revenue (OptiLearn doesn't have a transaction table; actual invoicing lives in your payment processor).

KPIs

  • Total Revenue — sum of course prices × enrollments in the period
  • Enrollments — count of revenue-generating enrollments this period
  • Average Transaction — revenue divided by enrollments (a mix of low- and high-priced courses pulls this around)
  • Top Course Revenue — highest-earning course in the period
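The KPI definitions above can be sketched in a few lines. The Enrollment/Course shapes here are illustrative assumptions; only the arithmetic (list price × enrollments, revenue ÷ enrollments) comes from the definitions:

```typescript
// Hedged sketch of deriving the Revenue KPIs from list prices,
// per the "list revenue" definition (price × enrollments, not billed revenue).

interface Enrollment { courseId: string; }
interface Course { id: string; title: string; price: number; }

function revenueKpis(enrollments: Enrollment[], courses: Course[]) {
  const priceById = new Map(courses.map((c) => [c.id, c.price]));
  const byCourse = new Map<string, number>();
  let total = 0;
  for (const e of enrollments) {
    const price = priceById.get(e.courseId) ?? 0; // free/unknown courses contribute 0
    total += price;
    byCourse.set(e.courseId, (byCourse.get(e.courseId) ?? 0) + price);
  }
  const top = [...byCourse.entries()].sort((a, b) => b[1] - a[1])[0];
  return {
    totalRevenue: total,
    enrollments: enrollments.length,
    averageTransaction: enrollments.length ? total / enrollments.length : 0,
    topCourseRevenue: top ? { courseId: top[0], revenue: top[1] } : null,
  };
}

const demo = revenueKpis(
  [{ courseId: "c1" }, { courseId: "c2" }],
  [{ id: "c1", title: "Intro", price: 100 }, { id: "c2", title: "Advanced", price: 400 }],
);
console.log(demo.totalRevenue, demo.averageTransaction); // 500 250
```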

Charts

  • Revenue Trend (area, full width) — daily list revenue over time
  • Top 10 Courses by Revenue (horizontal bar) — the highest earners in the period, regardless of enrollment count

Currency

The dashboard displays values in whatever currency the first course in your catalogue uses — typically INR for the default install. Mixed-currency catalogues aren't fully supported; numbers will be summed as if they're the same unit, so keep all courses in one currency.

What to look for

  • Revenue flat but enrollments up — low-priced or free courses are carrying volume. Check pricing.
  • One course dominates the top-10 — concentration risk. If it breaks, revenue craters. Diversify.
  • Revenue trends down while active learners trend up — the free catalogue is pulling traffic but paid conversions aren't happening. Check the checkout UX and pricing page.

Course Dashboard (/courses/[id]/analytics)

Click any course, then the Analytics button in the header. This dashboard is scoped to one course.

KPIs

  • Total Enrolled — lifetime enrollments in this course
  • Completed — with a sub-label showing completion rate as a percentage
  • Avg Progress — mean progress across all enrolled students
  • Avg Final Score — mean final score for completed students

Charts

  • Enrollments Over Time (area, full width)
  • Completions Trend (line)
  • Enrollment Status (pie, for this course only)

Lesson Engagement

A list of every lesson with:

  • Position (1, 2, 3…)
  • Type icon (video, text, quiz, assignment, etc.)
  • Title
  • Completion progress bar
  • Students who completed it

What to look for

  • Big drop-off between lessons — if lesson 5 is 80% complete but lesson 6 is 20%, something about lesson 6 is losing students. Test it as a student. Video stuck? PDF 404? Quiz unclear?
  • Low Avg Progress + low Completions — content too long or too hard. Split modules, shorten videos, add checkpoints.
  • High Enrollments + low Completions — catalogue oversell. Rewrite the description and expected time.

Tip

Dashboards tell you what happened. The Reports Builder is where you ask why.
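The lesson-to-lesson drop-off check is easy to automate against the Lesson Engagement numbers. A minimal sketch, assuming you have per-lesson completion counts in course order (the LessonStat shape and threshold are illustrative):

```typescript
// Flag the worst lesson-to-lesson drop in completions, if any drop exceeds
// the threshold (default: 30 percentage points of total enrolled).

interface LessonStat { position: number; title: string; completed: number; }

function biggestDropOff(lessons: LessonStat[], enrolled: number, threshold = 0.3) {
  let worst: { from: LessonStat; to: LessonStat; drop: number } | null = null;
  for (let i = 1; i < lessons.length; i++) {
    const drop = (lessons[i - 1].completed - lessons[i].completed) / enrolled;
    if (drop >= threshold && (!worst || drop > worst.drop)) {
      worst = { from: lessons[i - 1], to: lessons[i], drop };
    }
  }
  return worst;
}

// The lesson-5-to-lesson-6 example from above: 80% → 20% of 100 enrolled.
const drop = biggestDropOff(
  [{ position: 5, title: "Recap", completed: 80 }, { position: 6, title: "Project", completed: 20 }],
  100,
);
console.log(drop?.from.position, drop?.drop); // 5 0.6
```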

Date Range Picker

Every dashboard has a date range picker in the top right:

Option         Looks back
Last 7 days    7 days
Last 30 days   30 days (default)
Last 90 days   90 days
Year to date   Days since Jan 1 of the current year
All time       Capped at 10 years for safety
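The lookback math is straightforward; here is a sketch under the assumption that the picker resolves each option to a day count (option keys and the function name are illustrative):

```typescript
// Resolve a picker option to a lookback window in days.
// `now` is a parameter so "Year to date" is deterministic and testable.

type RangeOption = "7d" | "30d" | "90d" | "ytd" | "all";

const MS_PER_DAY = 86_400_000;

function lookbackDays(option: RangeOption, now: Date): number {
  switch (option) {
    case "7d": return 7;
    case "30d": return 30; // the default
    case "90d": return 90;
    case "ytd": {
      const jan1 = new Date(now.getFullYear(), 0, 1);
      return Math.floor((now.getTime() - jan1.getTime()) / MS_PER_DAY);
    }
    case "all": return 365 * 10; // capped at 10 years for safety
  }
}
```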

Changing the range re-runs every KPI and chart on the page simultaneously. In-flight requests are aborted if you click a new range before the current one finishes, so rapid clicks don't queue up stale requests.
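The "latest range wins" behavior is the standard AbortController pattern. A sketch (the function name is an assumption; AbortController itself is standard in browsers and Node 18+):

```typescript
// Picking a new range aborts the previous in-flight request, so rapid
// clicks never queue up stale responses.

let inFlight: AbortController | null = null;

// Call whenever the user picks a new range; pass the returned signal to
// fetch(). The previous request, if still running, is aborted.
function beginRangeRequest(): AbortSignal {
  inFlight?.abort(); // cancel the stale request
  inFlight = new AbortController();
  return inFlight.signal;
}

const first = beginRangeRequest();  // user clicks "Last 30 days"
const second = beginRangeRequest(); // user clicks "Last 90 days" right after
console.log(first.aborted, second.aborted); // true false
```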


Data Freshness — live tables vs. the hourly snapshot

Most numbers on every dashboard are live — computed from the actual fact tables at the moment you load the page. If a student enrolls at 2:14pm, your "New Enrollments" KPI will reflect them at 2:14:01pm.

There are two exceptions to that:

  1. Heavy time-series queries route through a pre-aggregated snapshot table that's refreshed hourly. "Enrollments per day for the last 90 days" hits a ~90-row summary table instead of scanning thousands of live rows. The UI shows a warning line — "Served from pre-aggregated snapshot — refreshed hourly. Live data may be up to 1h behind." — whenever a chart is backed by the snapshot path. Most dashboard widgets are NOT backed by snapshots in v1, so you'll rarely see this.

  2. Results are cached in Redis for 60 seconds per user. Clicking "Re-run" or reloading the page within the same minute serves the cached result; the indicator · cached appears next to the row count. Any write to the underlying data (new enrollment, lesson completed, etc.) automatically invalidates the cache for your institution — you won't see stale data after a change.
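An illustrative model of that cache behavior, using an in-memory Map in place of Redis so it stays self-contained. The 60-second TTL, per-user scoping, and institution-wide invalidation match the description above; the key shape and function names are assumptions:

```typescript
// 60-second per-user result cache with write invalidation per institution.

interface CacheEntry { value: unknown; expiresAt: number; }
const cache = new Map<string, CacheEntry>();
const TTL_MS = 60_000;

const key = (institutionId: string, userId: string, queryHash: string) =>
  `analytics:${institutionId}:${userId}:${queryHash}`;

function getCached(k: string, now = Date.now()): unknown | undefined {
  const hit = cache.get(k);
  if (!hit || hit.expiresAt <= now) return undefined; // miss or expired
  return hit.value; // would render with the "· cached" indicator
}

function putCached(k: string, value: unknown, now = Date.now()): void {
  cache.set(k, { value, expiresAt: now + TTL_MS });
}

// A write (new enrollment, lesson completion, …) drops every cached result
// for that institution, regardless of which user cached it.
function invalidateInstitution(institutionId: string): void {
  const prefix = `analytics:${institutionId}:`;
  for (const k of cache.keys()) if (k.startsWith(prefix)) cache.delete(k);
}
```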

If you ever see a number that looks wrong, wait a minute, hit Re-run, and it'll recompute. The engine is designed to fail loud rather than quiet: a broken query returns an error banner, not a blank chart.


Warning badges on charts

Occasionally a chart card shows an amber warning line — e.g., "Aggregation 'active': countDistinct in categorical mode approximates as non-null count." These come from the engine when it has to fall back from a true COUNT(DISTINCT ...) because the query path doesn't support it natively.

The number is still accurate for the filter set you asked for; the warning just tells you why the calculation may differ from a hand-written SQL query against the same data. For time-bucketed reports the warning never appears (the SQL path handles DISTINCT directly).
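To see why the fallback can differ from hand-written SQL, here is a sketch of both counts. The row shape and function names are illustrative; the point is that a non-null count treats repeat values as separate hits while a true distinct count does not:

```typescript
// True COUNT(DISTINCT x) vs. the categorical-mode fallback (non-null count).

type Row = Record<string, unknown>;

function countDistinct(rows: Row[], field: string): number {
  return new Set(rows.map((r) => r[field]).filter((v) => v != null)).size;
}

// Fallback: counts every non-null value, so repeat values inflate the
// result relative to a true distinct count.
function countNonNull(rows: Row[], field: string): number {
  return rows.filter((r) => r[field] != null).length;
}

const rows: Row[] = [
  { studentId: "s1" }, { studentId: "s1" }, { studentId: "s2" }, { studentId: null },
];
console.log(countDistinct(rows, "studentId"), countNonNull(rows, "studentId")); // 2 3
```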

Snapshot-served results also show a warning: "Served from pre-aggregated snapshot — refreshed hourly. Live data may be up to 1h behind." That's informational, not an error.


Reports Builder (deep dive)

Custom reports live at /analytics/reports. See Building Custom Reports for the full walkthrough, and Saved Reports for managing what you've built. For the scheduled delivery and threshold alerts that run on top of saved reports, see Scheduling Reports.

Short version: click + New Report, pick an entity (Enrollments, Courses, Students, etc.) from the top bar, check off the fields you want in the left panel, watch the live preview build in the center, add filters on the right panel, save. The engine handles everything else.