Startup Screener

Investor-style cohort screening for accelerators and universities.

Help Center

Operational guidance for every user type in Startup Screener.

This guide covers guest quick screens, participant uploads, mentor review flows, admin exports, status meanings, and the most common troubleshooting paths in the current web application.

Role Guide

What each user type can do

The cards below describe the effective web-app behavior for guests, participants, mentors, organization admins, and platform admins in this build.

Guest

Start with the public quick screen whenever you need a first-pass readout, then sign in to move into a cohort workspace with saved reports and exports.

Capabilities

  • Upload a single file to the public quick screen whenever you need an overview.
  • Review condensed strengths, gaps, and overall score when the run completes.
  • Move into the authenticated workspace for saved history and deeper review.

Recommended next steps

  • Use the quick screen on the home page for a first-pass readout.
  • Open Sign In when you need private report history or cohort access.
  • Use the Help page as a workflow reference before joining a cohort.

Participant

Participants upload files to their assigned cohort workspaces, track processing status, and open their finished reports.

Capabilities

  • Open visible cohorts from the dashboard.
  • Upload PDF, PPTX, DOCX, and XLSX files when the cohort allows participant uploads.
  • Review only your own submissions and completed reports.

Recommended next steps

  • Go to Dashboard to locate your cohort.
  • Upload a file from the cohort workspace and wait for status to move from pending or processing to complete.
  • Open the finished report to review scorecards, SWOT, recommendations, and annotations.

Mentor

Mentors review cohort dashboards, benchmark data, reports, annotations, and exports, but do not upload cohort decks.

Capabilities

  • Review cohort heatmaps and benchmark summaries.
  • Open reports and add annotations for participants.
  • Export CSV score bundles and PDF report bundles for visible cohorts.

Recommended next steps

  • Use Dashboard to open the cohort workspace you mentor.
  • Filter the reviewer table, then open the report that needs attention.
  • Add annotations in the report view and export materials when the cohort is ready.

Org Admin

Org admins have the full reviewer view for their organization and can also upload decks directly into their cohorts.

Capabilities

  • Review dashboards, benchmarks, exports, and completed reports.
  • Upload decks directly when needed for testing or program operations.
  • Use the same report and annotation surfaces available to mentors.

Recommended next steps

  • Open Dashboard to inspect cohorts and recent file activity.
  • Use cohort workspaces to upload, review processing, and open reports.
  • Use Exports when you need score tables or mentor-ready PDF bundles.

Platform Admin

Platform admins can review and operate across all visible organizations and cohorts in the current environment.

Capabilities

  • Access dashboard, cohort, report, and export workflows across the platform.
  • Upload files for verification or support work.
  • Use the same benchmark and export tooling available to organization admins.

Recommended next steps

  • Open Dashboard to inspect the current workspace state.
  • Use cohort workspaces and reports to validate processing behavior.
  • Use Exports for cross-checking delivered score bundles and PDF output.

Workflow Map

How the product behaves end to end

Step 1

Quick screen

The public quick screen accepts one file at a time, returns an overview only, and expires automatically after 7 days. It does not expose private cohort data, annotations, or export history.

Step 2

Cohort uploads

Participant uploads and admin uploads go through the same parsing and scoring pipeline. The workspace auto-refreshes while decks are pending or processing.
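
A client could implement that refresh behavior as simple status polling. The sketch below is illustrative only: the endpoint path, response shape, and interval are assumptions, not the app's actual API.

```ts
type FileStatus = "pending" | "processing" | "complete" | "manual_review";

// Hypothetical polling loop that mirrors the workspace auto-refresh.
async function pollUntilSettled(fileId: string, intervalMs = 5000): Promise<FileStatus> {
  while (true) {
    const res = await fetch(`/api/files/${fileId}/status`); // illustrative endpoint
    const { status } = (await res.json()) as { status: FileStatus };
    if (status !== "pending" && status !== "processing") {
      return status; // settled: complete or manual_review
    }
    // Still in flight: wait, then check again.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```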

Step 3

Reports and annotations

Completed reports include the scorecard, SWOT, narrative, slide map, and recommendations. Mentors and admins can add report annotations.

Step 4

Exports and benchmarks

Reviewer-capable users can download CSV score exports and bundled PDF reports. Cohort benchmarks stay gated until the minimum sample size is met.

Status and Analysis Notes

What the processing states mean

Reports can surface provider details, fallback markers, and warnings. These indicators are intentional: they explain how a report was produced.

Pending

The file was accepted and queued, but parsing or scoring has not started yet.

Processing

The system is parsing the uploaded file, classifying slides, and generating the report.

Complete

The report is ready. Open the linked report view for the full analysis.

Manual review

Automatic extraction or analysis could not finish cleanly. Review the reason shown in the workspace and try a cleaner text-based export.
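
Taken together, these four states form a small lifecycle. A minimal sketch is below; the state names come from this page, but the transition map itself is an assumption about ordering, not documented app behavior.

```ts
type ProcessingState = "pending" | "processing" | "complete" | "manual_review";

// Assumed lifecycle: queued work starts processing, then either finishes
// cleanly or is routed to a human. Both end states are terminal.
const nextStates: Record<ProcessingState, ProcessingState[]> = {
  pending: ["processing"],                   // accepted and queued
  processing: ["complete", "manual_review"], // parsing and scoring underway
  complete: [],                              // report is ready to open
  manual_review: [],                         // re-upload a cleaner file instead
};
```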

Provider, fallback, and warnings

Reports and quick screens can identify the analysis provider so reviewers understand whether the primary model path or the deterministic backup path generated the result.

If fallback is shown, the system still produced a report, but you should read the fallback note and any warnings before making a high-confidence decision.

Warnings usually point to extraction quality issues such as OCR fallback, sparse text, or file formatting that reduced evidence quality.
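
A rough sketch of this primary-versus-backup behavior, assuming a simple try/catch structure; runPrimaryModel, runDeterministicRules, the report shape, and the scores are all illustrative stand-ins rather than the app's real internals.

```ts
interface ScreenReport {
  provider: "primary_model" | "deterministic_fallback";
  overallScore: number;
  warnings: string[];
}

async function runPrimaryModel(deckText: string): Promise<ScreenReport> {
  // Stand-in for the model-backed path; fails on empty input
  // to demonstrate the fallback branch.
  if (deckText.trim().length === 0) throw new Error("invalid model output");
  return { provider: "primary_model", overallScore: 72, warnings: [] };
}

function runDeterministicRules(deckText: string): ScreenReport {
  // Stand-in for the deterministic backup path; always returns a report.
  const sparse = deckText.split(/\s+/).length < 50;
  return {
    provider: "deterministic_fallback",
    overallScore: 50,
    warnings: sparse ? ["sparse text reduced evidence quality"] : [],
  };
}

async function analyze(deckText: string): Promise<ScreenReport> {
  try {
    return await runPrimaryModel(deckText);
  } catch {
    // The backup still yields a report; the provider field signals that
    // reviewers should read the fallback note before relying on it.
    return runDeterministicRules(deckText);
  }
}
```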

Troubleshooting

Common issues and the fastest recovery path

A file will not upload

Confirm the file is a PDF, PPTX, DOCX, or XLSX and within the configured upload limit. If you need to retry the same file, the current build supports re-selecting it.
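
For illustration, a pre-upload check matching these rules might look like the sketch below; the 25 MB limit is a placeholder, since the real limit is configured per deployment.

```ts
const ALLOWED_EXTENSIONS = ["pdf", "pptx", "docx", "xlsx"];
const MAX_UPLOAD_BYTES = 25 * 1024 * 1024; // placeholder; actual limit is configured

// Returns an error message for rejected files, or null if the file passes.
function validateUpload(file: { name: string; size: number }): string | null {
  const ext = file.name.split(".").pop()?.toLowerCase() ?? "";
  if (!ALLOWED_EXTENSIONS.includes(ext)) {
    return `Unsupported type ".${ext}"; use PDF, PPTX, DOCX, or XLSX.`;
  }
  if (file.size > MAX_UPLOAD_BYTES) {
    return "File exceeds the configured upload limit.";
  }
  return null;
}
```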

A report says fallback was used

That means the system used the deterministic backup path because the primary model path was unavailable or produced invalid output. Review the fallback note and warnings shown in the report.

A report is stuck in manual review

This usually means the file had too little readable text, unsupported content, or parsing issues. Export a cleaner text-based file and upload it again.

Benchmarks are not visible yet

Benchmarking remains gated until the cohort reaches the configured minimum number of completed reports.
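
As a sketch, the gate reduces to a single comparison; the minimum of 5 below is an example value, since the real threshold is configurable per cohort.

```ts
// Benchmarks stay hidden until enough reports in the cohort have completed.
function benchmarksVisible(completedReports: number, minSampleSize = 5): boolean {
  return completedReports >= minSampleSize;
}
```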
