How Fern audits your help center

Fern audits systematically review your help center to find documentation gaps, stale content, and opportunities to improve coverage. Each audit analyzes support tickets, search queries, reader feedback, and product changes to surface actionable recommendations you can assign directly to Fern.

What audits are

An audit is a structured review where Fern scans your help center ecosystem for missing or outdated documentation. Instead of manually hunting for gaps, you trigger an audit and Fern identifies:

  • Missing coverage - Topics customers ask about in support tickets that lack corresponding articles

  • Search failures - Queries where users searched your help center but found nothing relevant

  • Content quality issues - Articles receiving repeated negative feedback from readers

  • Stale content - Articles that may need updates based on recent product changes

Audit results appear as individual findings you can review, dismiss, or assign to Fern for drafting. Fern does not auto-publish anything—you control which recommendations become new or updated articles.

What Fern analyzes

During an audit, Fern pulls from multiple data sources connected to your workspace:

Support signals

Fern analyzes support conversations from your connected helpdesk (Intercom, Zendesk, Help Scout, or others). She identifies recurring question patterns, clusters related tickets, and flags topics that generate significant support volume but lack documentation.

Search gaps

Fern reviews queries where users searched your help center and found no relevant results. These search misses represent direct evidence of what customers tried to find but could not. Fern also includes failed widget conversations—cases where the chat widget could not answer a customer's question.

Article feedback

Fern examines reader feedback on existing articles—both ratings and comments. Repeated negative feedback on the same article signals a content quality or completeness problem, even without support ticket corroboration.

Recent product changes

Fern scans your connected codebase (GitHub), release notes, and changelogs to identify shipped features or changes that may need documentation coverage. This helps surface gaps before customers ask about them.

Connect integrations like GitHub and your helpdesk in workspace settings to give Fern access to these sources. Audits work best when multiple data sources are connected.
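Fern's internal pipeline is not exposed, but the core idea behind support-signal analysis, flagging topics that generate significant ticket volume yet have no matching article, can be sketched as follows. The function name, topic labels, and threshold are illustrative assumptions, not Fern's actual implementation:

```python
from collections import Counter

def find_coverage_gaps(ticket_topics, documented_topics, min_tickets=3):
    """Flag topics that recur in support tickets but have no article.

    ticket_topics: list of topic labels, one per support ticket
    documented_topics: set of topics already covered by help articles
    min_tickets: minimum ticket count before a topic is worth flagging
    """
    counts = Counter(ticket_topics)
    return [
        {"topic": topic, "ticket_count": n}
        for topic, n in counts.most_common()
        if n >= min_tickets and topic not in documented_topics
    ]

gaps = find_coverage_gaps(
    ["sso-login"] * 5 + ["billing"] * 4 + ["export"] * 2,
    documented_topics={"billing"},
)
# "billing" is already documented and "export" is below the threshold,
# so only "sso-login" is flagged.
```

The threshold matters: without a minimum ticket count, every one-off question would surface as a finding, drowning out the gaps that actually drive support volume.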

How to create a manual audit

You can trigger an audit anytime from the Fern page. Use this when you want fresh recommendations outside your scheduled audit cadence.

1. Open the Fern page

Navigate to the Fern page from your dashboard sidebar. This is the main workspace for viewing Fern threads, including audits and tasks.

2. Click Create Audit

Click the Create Audit action in the sidebar. This opens the audit creation dialog where you configure the scope.

3. Review your connections

The dialog shows your connected integrations. If you need additional sources, click Add a connection to connect your helpdesk or codebase. You can skip this if your connections are already set up.

4. Select a lookback window

Choose how far back Fern should analyze data. Options include Last week, Last 2 weeks, Last month, Last quarter, and Last year. A longer window captures more patterns but takes longer to process.

5. Start the audit

Review your selections on the confirmation screen and click Start Audit. You will see a toast confirming the audit started. Fern processes the audit in the background and adds findings as they are ready.
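The lookback window in step 4 simply bounds the date range Fern analyzes. As a minimal sketch, the option labels mirror the dialog, but the exact day counts are assumptions:

```python
from datetime import date, timedelta

# Hypothetical mapping from lookback option to an analysis window;
# option names mirror the audit dialog, day counts are assumptions.
LOOKBACK_DAYS = {
    "Last week": 7,
    "Last 2 weeks": 14,
    "Last month": 30,
    "Last quarter": 90,
    "Last year": 365,
}

def audit_window(option, today):
    """Return (start, end) dates for the chosen lookback option."""
    days = LOOKBACK_DAYS[option]
    return today - timedelta(days=days), today

start, end = audit_window("Last 2 weeks", date(2024, 6, 15))
# start is 2024-06-01, end is 2024-06-15
```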

How audit results appear

After an audit completes, results appear as a dedicated audit thread in your Fern thread list. The thread title shows the time window, such as "Audit (Last 2 weeks)".

Inside the audit thread, findings are organized into sections based on the type of gap:

  • Reduce your support volume - Documentation gaps generating significant support tickets

  • Missing essentials - Critical topics lacking any coverage

  • Fill your search gaps - Queries where users found nothing

  • Address negative feedback - Articles with repeated reader complaints

  • Cover what you just shipped - Product changes needing documentation

Each finding shows the gap title, the reason Fern flagged it, a proposed plan, and supporting evidence count. Click any finding to expand the full evidence breakdown—specific tickets, search queries, feedback entries, or commit references that triggered the recommendation.
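Conceptually, each finding carries the four pieces described above. A hypothetical sketch of that shape (field names are illustrative, not Fern's actual schema):

```python
from dataclasses import dataclass, field

# Hypothetical shape of one audit finding; field names are
# illustrative, not Fern's actual schema.
@dataclass
class Finding:
    title: str            # the gap title
    reason: str           # why Fern flagged it
    plan: str             # proposed article to create or update
    evidence: list = field(default_factory=list)  # tickets, queries, feedback, commits

    @property
    def evidence_count(self):
        return len(self.evidence)

f = Finding(
    title="SSO login setup",
    reason="Recent tickets repeatedly ask about SSO configuration",
    plan="Create a 'Configure SSO' article",
    evidence=["ticket-812", "ticket-977", "search:'sso setup'"],
)
# f.evidence_count == 3
```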

How to review and assign findings

For each audit finding, you decide whether to act, dismiss, or revisit later.

1. Review the finding details

Open a finding to see the full context: why Fern flagged it, what article she would create or update, and the specific evidence supporting the recommendation. Use this to prioritize which gaps matter most.

2. Draft with Fern or dismiss

For findings you want to address, click Draft with Fern. Fern generates a drafting prompt based on the finding details. You can edit this prompt before confirming if you want to adjust the scope or focus. Click Dismiss for findings you do not plan to address.

3. Review the draft

After Fern finishes drafting, the finding status changes to Ready to review. Click Review & edit changes to open the draft. Make edits directly or message Fern in the task thread with feedback. When satisfied, publish the article to your help center.

Findings move through statuses as you act on them: Ready to review after Fern drafts, Published after you publish, Dismissed if skipped, and Needs retry if the draft needs significant revision.
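The lifecycle above behaves like a small state machine. In this sketch the status names come from the doc, but the exact transition table is an assumption:

```python
# Hypothetical transition table for a finding's lifecycle;
# status names come from the doc, allowed transitions are assumed.
TRANSITIONS = {
    "open": {"draft": "drafting", "dismiss": "dismissed"},
    "drafting": {"finish": "ready_to_review", "fail": "needs_retry"},
    "ready_to_review": {"publish": "published", "retry": "needs_retry"},
    "needs_retry": {"draft": "drafting", "dismiss": "dismissed"},
}

def advance(status, action):
    """Apply an action to a finding status; raise if not allowed."""
    try:
        return TRANSITIONS[status][action]
    except KeyError:
        raise ValueError(f"cannot {action!r} from {status!r}")

status = advance("open", "draft")      # "drafting"
status = advance(status, "finish")     # "ready_to_review"
status = advance(status, "publish")    # "published"
```

Modeling statuses this way makes invalid moves (for example, publishing a finding that was never drafted) fail loudly instead of silently corrupting state.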

Automating audits

For ongoing coverage without manual triggers, set up scheduled audits in your workspace settings. Fern will run audits automatically on a weekly, biweekly, or monthly cadence and surface new findings as they emerge.
