Startup Screener
Investor-style cohort screening for accelerators and universities.
Help Center
This guide covers guest quick screens, participant uploads, mentor review flows, admin exports, status meanings, and the most common troubleshooting paths in the current web application.
Role Guide
The cards below describe the effective web-app behavior for guests, participants, mentors, organization admins, and platform admins in this build.
Guests start with the public quick screen for a first-pass readout, then sign in to move into a cohort workspace with saved reports and exports.
Participants upload files to their assigned cohort workspaces, track processing status, and open their finished reports.
Mentors review cohort dashboards, benchmark data, reports, annotations, and exports, but do not upload cohort decks.
Org admins have the full reviewer view for their organization and can also upload decks directly into their cohorts.
Platform admins can review and operate across all visible organizations and cohorts in the current environment.
Workflow Map
Step 1: Public quick screen
The public quick screen accepts one file at a time, returns an overview only, and expires automatically after 7 days. It does not expose private cohort data, annotations, or export history.
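The 7-day expiry described above can be expressed as a simple rule. This is a minimal illustrative sketch, not the application's actual implementation; the function and constant names are hypothetical.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Public quick screens expire automatically after 7 days (per the step above).
QUICK_SCREEN_TTL = timedelta(days=7)

def quick_screen_expired(created_at: datetime, now: Optional[datetime] = None) -> bool:
    """Return True once a quick screen is past its 7-day lifetime.

    `quick_screen_expired` is a hypothetical helper for illustration only.
    """
    now = now or datetime.now(timezone.utc)
    return now - created_at >= QUICK_SCREEN_TTL
```

The key point is that expiry is time-based and automatic: no user action deletes the quick screen; it simply stops being available once its lifetime elapses.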
Step 2: Uploads and processing
Participant uploads and admin uploads go through the same parsing and scoring pipeline. The workspace auto-refreshes while decks are pending or processing.
Step 3: Completed reports
Completed reports include the scorecard, SWOT, narrative, slide map, and recommendations. Mentors and admins can add report annotations.
Step 4: Exports and benchmarks
Reviewer-capable users can download CSV score exports and bundled PDF reports. Cohort benchmarks stay gated until the minimum sample size is met.
Status and Analysis Notes
Reports can surface provider details, fallback markers, and warnings. Those indicators are intentional and are meant to explain how a report was produced.
Pending: The file was accepted and queued, but parsing or scoring has not started yet.
Processing: The system is parsing the uploaded file, classifying slides, and generating the report.
Complete: The report is ready. Open the linked report view for the full analysis.
Failed: Automatic extraction or analysis could not finish cleanly. Review the reason shown in the workspace and try a cleaner text-based export.
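The four states above form a small lifecycle: a deck moves from queued, to processing, to either a finished report or a failure. The enum and helper names below are hypothetical sketches of that model, not the product's actual identifiers.

```python
from enum import Enum

class DeckStatus(Enum):
    PENDING = "pending"        # accepted and queued; parsing/scoring not started
    PROCESSING = "processing"  # parsing the file, classifying slides, generating the report
    COMPLETE = "complete"      # report ready to open
    FAILED = "failed"          # extraction or analysis could not finish cleanly

def is_terminal(status: DeckStatus) -> bool:
    """The workspace can stop auto-refreshing once a deck reaches a final state."""
    return status in (DeckStatus.COMPLETE, DeckStatus.FAILED)
```

This also explains the auto-refresh behavior mentioned in the workflow map: the workspace keeps polling while a deck is in a non-terminal state.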
Reports and quick screens can identify the analysis provider so reviewers understand whether the primary model path or the deterministic backup path generated the result.
If fallback is shown, the system still produced a report, but you should read the fallback note and any warnings before making a high-confidence decision.
Warnings usually point to extraction quality issues such as OCR fallback, sparse text, or file formatting that reduced evidence quality.
Troubleshooting
Upload rejected: Confirm the file is PDF, PPTX, DOCX, or XLSX and within the configured upload limit. If you need to retry the same file, the current build supports re-selecting it.
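The file-type and size check described above can be sketched as a pre-upload validation. The accepted extensions come from this guide; the 25 MB limit and all names below are illustrative placeholders, since the real limit is configured per deployment.

```python
import os

ALLOWED_EXTENSIONS = {".pdf", ".pptx", ".docx", ".xlsx"}
MAX_UPLOAD_MB = 25  # hypothetical placeholder; use your deployment's configured limit

def check_upload(filename: str, size_bytes: int, max_mb: int = MAX_UPLOAD_MB):
    """Return (ok, message) for a candidate upload. Illustrative helper only."""
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        return False, f"Unsupported file type: {ext or '(none)'}"
    if size_bytes > max_mb * 1024 * 1024:
        return False, f"File exceeds the {max_mb} MB upload limit"
    return True, "OK"
```

Running this kind of check before uploading saves a round trip when the file would be rejected anyway.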
Fallback marker shown: The system used the deterministic backup path because the primary model path was unavailable or produced invalid output. Review the fallback note and warnings shown in the report.
Report failed: The file usually had too little readable text, unsupported content, or parsing issues. Export a cleaner text-based file and upload it again.
Benchmarks not visible: Benchmarking remains gated until the cohort reaches the configured minimum number of completed reports.
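The benchmark gate is a simple threshold on completed reports. The minimum is configurable per environment; the value of 5 and the function name below are made-up placeholders for illustration.

```python
MIN_COMPLETED_REPORTS = 5  # hypothetical placeholder; the real minimum is configurable

def benchmarks_unlocked(completed_reports: int, minimum: int = MIN_COMPLETED_REPORTS) -> bool:
    """Cohort benchmarks stay gated until enough reports have completed."""
    return completed_reports >= minimum
```

Note that only completed reports count toward the gate; pending, processing, and failed decks do not.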