Reviewing Student and Instructor Performance Evaluations

Admin
Web

Last updated April 16, 2026

Summary

The Performance tab on each student's and instructor's details overlay shows an AI-generated Performance Evaluation summarizing what they're doing well and where they can improve. Both evaluations analyze up to the most recent 15 logs and render as a Strengths card and an Areas for Improvement card, followed by the evaluation period and log count. A Notes section below lets admin-level roles add shared commentary that isn't part of the formal training record.

This is the single best surface for a Chief Instructor's "how is this person doing?" read — it's narrative, scoped to recent activity, and designed for oversight.

Who this is for

  • Chief Instructors and Assistant Chief Instructors — primary audience; Performance Evaluations are the core of the chief's oversight workflow
  • Owners and Admins — same views, same capabilities
  • Line instructors and students do not have Admin Portal access, so they do not see this surface

Before you begin

  • Confirm the student or instructor has at least some logs on file — evaluations only generate once there is activity to analyze.
  • Notes are visible to all admin-level roles in your org. Write them with that audience in mind.

Steps

Reviewing a student's Performance Evaluation

  1. From the Admin Portal sidebar, click Members, click a student's name to open their details overlay, and select the Performance tab (the default tab on the overlay).
  2. Read the Student Performance Evaluations section. The section header has an AI sparkle icon. Inside you'll see two cards:
  • Strengths (thumbs-up icon): a bulleted narrative of what the student is doing consistently well, drawn from their recent debriefs
  • Areas for Improvement (target icon): a bulleted narrative of where the AI recommends focusing; specific patterns across lessons, not one-off events
  3. Check the evaluation metadata card below the two evaluation cards:
  • Evaluation Period — the date range of the flights analyzed (e.g., "Mar 1 – Apr 6, 2026")
  • Logs analyzed — the count of logs that fed the evaluation (up to 15)

Use these to calibrate how much weight the evaluation carries. A 15-log evaluation spanning two months is robust; a 2-log evaluation is preliminary.

  4. Decide your action. Typical reads:
  • Areas for improvement repeat across evaluations — coordinate with the line instructor to target those areas in the next few lessons.
  • Strengths are narrow (one skill area dominates) — consider whether upcoming lessons will stretch the student into different competencies.
  • Few logs analyzed — interpret directionally; revisit after more flights.

Reviewing an instructor's Performance Evaluation

  1. From the Members view, click an instructor's name, then select the Performance tab.
  2. Read the Instructor Performance Evaluations section. The structure mirrors the student view with one difference in framing: the evaluation assesses instructor effectiveness — debrief quality, teaching patterns, adaptability, emphasis on safety — rather than student skill. Cards are still Strengths and Areas for Improvement.

Typical instructor signals to watch for:

  • Debrief structure and content — are post-flight debriefs covering what they should?
  • Identification of learning points — does the instructor articulate why a maneuver matters?
  • Proactive intervention — does the instructor coach during the flight or only after?
  • Adaptation to student learning pace — does the instructor vary technique when repetition isn't working?
  3. Check the evaluation metadata. Same fields as the student version: Evaluation Period and Logs analyzed (up to 15).

Adding a note

Below the evaluation on either the student or instructor Performance tab, the Notes section is a shared space for admin-level commentary. Notes are not part of the formal training record — use them for internal observations, training plan adjustments, and handoff context.

  1. In the Notes section header, click Add Note.
  2. Enter free-form text describing the observation or decision, then save.
  3. Manage existing notes using the pencil and trash icons on a note. Each note shows the author name, created date, and updated date (if different). You can only edit or delete notes you authored; other admins' notes are read-only to you.

What happens next

  • Performance Evaluations update as new logs are recorded and evaluated; there is no "Generate" button in the UI today, so refresh the page if you expect new data to have been analyzed.
  • Notes appear immediately for every admin-level role in your org.

If there is no evaluation yet, you'll see one of two empty states. No Evaluation Data Available means the subject has no logs on file — record activity first. No Evaluation Available means logs exist but no current evaluation has been generated; this resolves automatically as the backend processes recent activity.

Common issues

  • "No Evaluation Data Available": the student or instructor has no logs recorded yet. Wait for training activity; the evaluation populates after logs exist.
  • "No Evaluation Available" despite recent logs: the evaluation hasn't regenerated since the latest activity. Refresh the page or check back later; evaluations analyze up to the most recent 15 logs on a server-side cadence.
  • Strengths or Areas for Improvement feel stale: the evaluation reflects the last 15 logs, so older strong points persist longer than recent changes. Check the Evaluation Period to see exactly what window was analyzed; recent debriefs will reshape the narrative as they're processed.
  • Can't edit a note: you didn't author it. Each author owns their own notes, so ask the author to edit it, or add your own note.
  • Notes disappeared: the author deleted them, or you deleted yours by mistake. Notes don't have a trash or recovery surface; re-create them if needed.

How this works

What drives the evaluation

Each evaluation analyzes up to the most recent 15 logs for the subject. The AI applies the organization's default criteria (Strengths and Areas for Improvement are the standard two) to extract patterns across those logs. Single events in one log rarely dominate; the AI looks for repeated themes.
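
As a mental model only (the backend isn't documented here, so every name in this sketch is hypothetical), the selection works roughly like this: take the subject's logs, keep at most the 15 most recent, and derive the Evaluation Period and log count from that window.

```typescript
// Hypothetical sketch, not the product's actual backend code.
interface DebriefLog {
  flightDate: Date;      // when the flight happened
  studentId: string;     // the student who flew (subject of the debrief)
  instructorId: string;  // the instructor who authored the debrief
  narrative: string;     // the debrief text the AI reads for themes
}

// The two standard criteria the article describes.
const DEFAULT_CRITERIA = ["Strengths", "Areas for Improvement"];

const MAX_LOGS_ANALYZED = 15;

// Keep only the most recent logs, newest first, capped at 15.
function selectEvaluationWindow(logs: DebriefLog[]): DebriefLog[] {
  return [...logs]
    .sort((a, b) => b.flightDate.getTime() - a.flightDate.getTime())
    .slice(0, MAX_LOGS_ANALYZED);
}

// The metadata card (Evaluation Period + Logs analyzed) is derived from the
// selected window, not from the subject's full history. Assumes at least one
// log; with zero logs the tab shows an empty state instead.
function evaluationMetadata(windowLogs: DebriefLog[]) {
  const times = windowLogs.map((l) => l.flightDate.getTime());
  return {
    logsAnalyzed: windowLogs.length,
    periodStart: new Date(Math.min(...times)),
    periodEnd: new Date(Math.max(...times)),
  };
}
```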

Student vs. instructor lens

Both evaluations use the same mechanism but with different scope:

  • Student Performance Evaluation — what the student did well and needs to work on, drawn from their debriefs as subject
  • Instructor Performance Evaluation — what the instructor did well and needs to work on, drawn from debriefs they authored

A single debrief log can contribute to both — the student's evaluation sees how the student flew; the instructor's evaluation sees how the instructor taught.
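
To make that scoping concrete, here is the same illustrative sketch extended (reusing the hypothetical DebriefLog type and selectEvaluationWindow helper above): the student's window is built from logs where they are the subject, while the instructor's is built from logs they authored, so one debrief can feed both.

```typescript
// Illustrative only: the same log set, scoped two different ways.
function studentEvaluationLogs(all: DebriefLog[], studentId: string): DebriefLog[] {
  // Student evaluation: logs where this person is the subject of the flight.
  return selectEvaluationWindow(all.filter((l) => l.studentId === studentId));
}

function instructorEvaluationLogs(all: DebriefLog[], instructorId: string): DebriefLog[] {
  // Instructor evaluation: logs this person authored as the debriefing instructor.
  return selectEvaluationWindow(all.filter((l) => l.instructorId === instructorId));
}
```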

Performance tab vs. Training Insights tab

Both live in the Admin Portal and both use AI, but they answer different questions:

  • Performance tab (this article) — narrative evaluation of the person over the last 15 logs, with shared notes. Available for both students and instructors.
  • Training Insights tab — for students only; shows per-ACS-Area proficiency scoring and a training overview summary (see Reviewing Training Insights and Analytics).

Use Performance for narrative, recent-trend oversight. Use Training Insights for ACS-aligned proficiency scoring.

Why notes are owner-scoped

Notes track authorship and edit history because they're part of an internal collaboration thread, not a formal record. Owner-scoped edit/delete keeps the audit trail clean: if you want to correct another admin's note, add a new one referencing the original rather than rewriting theirs.

Still have questions?

Book a demo and we'll walk you through everything.

Book a Demo