I've always been fascinated by private equity investment. It's a combination of rigorous analysis, deep thinking, strategic acumen, and sometimes a dash of gut instinct.
A thoughtful investment thesis is never formed in isolation. It takes a village. And it all begins with something that sounds simple but is maddeningly complex and time-consuming: understanding the target company, its industry, and the key drivers across its value chain.
The common thread across all of this work is an overwhelming amount of data. Financial statements, filings, industry reports, diligence materials, expert analyses, and the list goes on. The challenge is not access, but synthesis: transforming fragmented information into conclusions that are logical, defensible, and actionable, often under intense time pressure.
That reality became especially clear during my time working in a private equity deal team. Despite operating in a highly digitized world, I was often surprised by how much of the core analytical work remained stubbornly manual.
The unavoidable manual work
Take a seemingly straightforward task: consolidating ten years of historical and last-twelve-months (LTM) income statements for a public company. In theory, it sounds trivial. In practice, it can take hours of focused effort.
While deal teams typically have access to tools like Capital IQ, FactSet, BAMSEC, and EDGAR, pulling historical financials still requires caution. A spreadsheet pulled directly from those platforms often includes discrepancies due to line-item mapping, restatements, presentation changes, or simple extraction errors, and it often isn't immediately obvious where a specific number came from. That lack of built-in traceability makes verification labor-intensive. Because these numbers become the foundation for every downstream analysis, accuracy is non-negotiable, and trust is hard to regain once a single data point is questioned.
The result is a familiar routine: opening 10-Ks and 10-Qs across multiple tabs, scrolling through filings, and manually extracting and validating line items, quarter by quarter, year by year. And that's just the starting point, before repeating the process for other financial statements and data that are critical for the analysis.
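At its core, this routine is a traceability problem: every number needs to carry its source filing with it. As a rough illustration of what that looks like in code, here is a minimal Python sketch that normalizes data shaped like the SEC's XBRL company-facts JSON into annual rows, each tagged with the filing it came from, and lets the most recently filed value win when a figure is restated. The exact JSON field names (`fy`, `fp`, `accn`, `filed`) reflect my reading of EDGAR's format and should be treated as assumptions, not a definitive integration.

```python
from dataclasses import dataclass

@dataclass
class Fact:
    fiscal_year: int
    value: float
    form: str       # e.g. "10-K"
    accession: str  # filing accession number: the traceability link
    filed: str      # filing date, used to prefer restated values

def annual_facts(companyfacts: dict, tag: str, unit: str = "USD") -> list[Fact]:
    """Pull full-year facts for one us-gaap tag, keeping the most
    recently filed value per fiscal year so restatements win."""
    raw = companyfacts["facts"]["us-gaap"][tag]["units"][unit]
    by_year: dict[int, Fact] = {}
    for f in raw:
        # Keep only full-year figures reported in annual filings.
        if f.get("form") != "10-K" or f.get("fp") != "FY":
            continue
        fact = Fact(f["fy"], f["val"], f["form"], f["accn"], f["filed"])
        prev = by_year.get(fact.fiscal_year)
        if prev is None or fact.filed > prev.filed:
            by_year[fact.fiscal_year] = fact
    return [by_year[y] for y in sorted(by_year)]
```

The point of the sketch is not the parsing itself but the shape of the output: every row retains the accession number of the filing it was extracted from, so any questioned number can be traced back to its source in seconds rather than hours.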
Limitations of generic LLMs
When large language models began to mature, I became genuinely curious about how they might change this workflow. On the surface, it felt like the perfect application. But that optimism faded quickly when I saw how often general-purpose LLMs hallucinate numbers or produce outputs that aren't reliably grounded in source documents. In finance, being directionally correct is not enough. Small errors compound, and trust is everything.
So I asked a narrower question: if generating an accurate spreadsheet is difficult, could LLMs at least help with comprehension and targeted extraction? Investment professionals often have to review huge volumes of material to form an informed view. If AI could reliably extract a defined category of information across documents, consistently and with citations, it would be a real productivity unlock.
I tested this on a real task: reviewing ten quarterly earnings call transcripts for a public company to understand how management discussed inflation, how it impacted performance, and what it meant for margin expectations. I uploaded the transcripts into a general-purpose LLM and asked for a structured table summarizing the key inflation references and the implied rates and time periods. The result was disappointing. The model struggled to synthesize the transcripts chronologically and couldn't reliably extract context-specific references, especially when inflation came up indirectly or as part of an answer to a different question. After multiple prompt iterations, I ultimately reverted to the familiar approach: manually searching, reviewing, and annotating the transcripts line by line to ensure nothing critical was missed.
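The manual fallback I reverted to is, in effect, a deterministic scan: search each transcript for the topic, note where it appears, and record the figures mentioned. That baseline is worth making explicit, because it is exactly what a trustworthy AI tool needs to beat. Here is a short Python sketch of such a scan; the keyword list, transcript structure, and percentage-matching regex are all illustrative assumptions, not a reconstruction of my actual process.

```python
import re

def scan_transcripts(transcripts: dict[str, str], keywords: list[str]) -> list[dict]:
    """Return every transcript line mentioning a keyword, tagged with
    its quarter, line number, and any percentage figures on that line."""
    pct = re.compile(r"\d+(?:\.\d+)?%")  # e.g. "6%" or "3.5%"
    hits = []
    for quarter, text in transcripts.items():
        for lineno, line in enumerate(text.splitlines(), start=1):
            low = line.lower()
            if any(k in low for k in keywords):
                hits.append({
                    "quarter": quarter,   # which call
                    "line": lineno,       # where in the transcript: the citation
                    "rates": pct.findall(line),
                    "excerpt": line.strip(),
                })
    return hits
```

A keyword scan like this is crude: it misses the indirect references that tripped up the LLM too, such as inflation discussed inside an answer about pricing. But it is consistent and every hit carries a citation, which is precisely the property the general-purpose models failed to deliver.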
That experience is why I'm excited about tools like Kepler, where the promise isn't just speed, but trust. An AI platform that works only from sources you choose and trust, preserves numerical integrity, and makes every conclusion traceable back to its origin fundamentally changes the equation.
There are countless brilliant minds in finance working extraordinarily long hours. With the right tools, that intellectual energy can finally be directed toward the highest-value work: connecting disparate insights, identifying opportunities others miss, engaging thoughtfully with stakeholders, and building businesses that create durable value for shareholders, employees, customers, suppliers, and the communities they operate in.

