Streamlining venture due diligence with LLMs
Using LLMs to streamline preliminary due diligence and assess startups' and founders' fit with Good News Ventures' investment theses
01 / Overview
Analysts had to manually skim hundreds of pitch decks to identify relevant startups.
Good News Ventures (GNV) is a venture capital firm investing in pre-seed to seed-stage startups. During my internship as a Venture Analyst, I noticed a recurring bottleneck: analysts were spending hours skimming through hundreds of pitch decks, many of which barely met GNV's investment criteria.
02 / Solution
A simple workflow that screens startups automatically and ranks them by fit.
The workflow replaces hours of manual deck skimming and fragmented research with clear summaries, founder context, and market snapshots in one place. This lets analysts filter out low-fit startups fast and focus their effort on the few worth deeper diligence.
Summary of Features
Automatic pitch deck data entry and filtering
Extracts the problem, solution, traction, and business model directly from decks and screens out out-of-scope startups before they ever hit an analyst's desk.
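
Under the hood, this is a structured-extraction prompt plus a scope filter. Here's a minimal sketch of the idea, assuming an OpenAI-style chat API; the model name, field list, and stage rule are illustrative assumptions, not GNV's actual prompt or criteria.

```python
# Minimal sketch of deck extraction + scope filtering.
# ASSUMPTIONS: OpenAI-style chat API; field names and the pre-seed/seed
# rule are illustrative stand-ins, not GNV's actual criteria.
import json
from openai import OpenAI

client = OpenAI()

EXTRACTION_PROMPT = """From the pitch deck text below, return a JSON object
with keys: problem, solution, traction, business_model, stage
(e.g. "pre-seed", "seed", "series-a"). Use null for anything the deck
does not state.

Deck text:
{deck_text}"""

def extract_deck_fields(deck_text: str) -> dict:
    """Pull structured fields out of raw pitch deck text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        response_format={"type": "json_object"},  # force parseable JSON
        messages=[{"role": "user",
                   "content": EXTRACTION_PROMPT.format(deck_text=deck_text)}],
    )
    return json.loads(response.choices[0].message.content)

def in_scope(fields: dict) -> bool:
    """Drop startups outside the fund's stage focus before analyst review."""
    return fields.get("stage") in {"pre-seed", "seed"}
```
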
Standardized founder and team backgrounds in one view
No more bouncing between LinkedIn, Crunchbase, and Google tabs. Backgrounds are pulled automatically into standardized profiles.
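
As an illustration, a standardized profile can be as simple as one record per founder; the fields below are hypothetical stand-ins, not the tool's exact schema.

```python
# One possible shape for a standardized founder profile.
# ASSUMPTION: fields are hypothetical; the real schema may differ.
from dataclasses import dataclass, field

@dataclass
class FounderProfile:
    name: str
    current_role: str
    prior_companies: list[str] = field(default_factory=list)
    prior_exits: int = 0
    years_in_domain: float = 0.0
    sources: list[str] = field(default_factory=list)  # e.g. LinkedIn/Crunchbase URLs
```

Keeping every founder in the same shape is what lets the dashboard compare teams side by side instead of sending analysts back to a pile of browser tabs.
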
Centralized dashboard that scores startups by adjustable criteria
Startups are scored on founder, traction, and market size using analyst-defined weights, producing a ranked shortlist analysts can triage top-down.
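
The ranking itself can stay deliberately simple: a weighted average over normalized criteria. A minimal sketch, assuming each criterion has already been normalized to a 0-1 score; the weights and example startups are made up for illustration.

```python
# Sketch of adjustable-weight scoring and ranking.
# ASSUMPTIONS: criterion scores are pre-normalized to 0-1; the weights
# and example data below are illustrative, not GNV's real numbers.
def score_startup(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average over analyst-defined criteria."""
    total = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total

# Analysts can tune these weights without touching the rest of the pipeline.
weights = {"founder": 0.4, "traction": 0.35, "market_size": 0.25}

startups = [
    {"name": "Acme AI", "founder": 0.8, "traction": 0.5, "market_size": 0.9},
    {"name": "Bolt Bio", "founder": 0.6, "traction": 0.7, "market_size": 0.4},
]

ranked = sorted(startups, key=lambda s: score_startup(s, weights), reverse=True)
for s in ranked:
    print(f"{s['name']}: {score_startup(s, weights):.2f}")
```
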
03 / Results
80% reduction in manual screening time
Analysts cut deck triage from ~3 hours/week to ~30 minutes.
78% classification accuracy
Standardized scoring criteria reduced variation between analysts' decisions.
Adopted by 2 analysts in MVP phase
During its pilot, both intern analysts confirmed it fit seamlessly into their process.
04 / Reflection
Validate scrappily: early and often
I validated the tool against our own portfolio companies using scrappy n8n workflows and spreadsheets, which exposed weaknesses in the prompts before I built a polished version.
Support, don't erase, human judgment
Analysts trusted the tool only when it supported their evaluation instead of replacing it; designing for transparency and keeping humans in the loop is what made the system usable.
Simplicity drives adoption
A basic automated scoring system saved analysts hours each week. Because it was simple, it quickly became part of their screening workflow.