Confidence level is not a measure of the company's quality — it is a measure of the scoring system's epistemic certainty. A High confidence score (0.8–1.0) means the AI extracted rich, consistent signal from the deck: quantified metrics, named customers, verifiable team credentials, and a clear narrative. A Low confidence score (0.2–0.5) means the deck contained sparse or ambiguous data, and the score should be treated as directional rather than precise.
The four levels — High, Medium, Low, and Uncertain — map directly to how much weight a reader should place on the number. An Uncertain score (below 0.2) essentially means the system could not form a reliable opinion; this typically occurs with very short decks, decks that are mostly visual with little text, or decks that make broad claims without any supporting data.
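The band boundaries above can be sketched as a simple threshold mapping. This is an illustrative sketch, not NUVC's actual implementation: the function name `confidence_level` is hypothetical, and the Medium band (0.5–0.8) is an assumption inferred from the gap between the Low and High bands stated in the text.

```python
def confidence_level(confidence: float) -> str:
    """Map a numeric confidence score to its level label.

    High (0.8-1.0), Low (0.2-0.5), and Uncertain (below 0.2) come
    from the glossary text; the Medium band (0.5-0.8) is inferred
    from the gap between them, not stated explicitly.
    """
    if confidence >= 0.8:
        return "High"
    if confidence >= 0.5:
        return "Medium"  # assumed band
    if confidence >= 0.2:
        return "Low"
    return "Uncertain"
```

Under this reading the bands are contiguous and exhaustive, so every score in [0, 1] maps to exactly one level.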
Displaying a score without its confidence level is a form of misleading precision. Two companies may both carry a NuScore of 6.2, but if one is at High confidence and the other at Uncertain, they are in meaningfully different positions. NUVC therefore enforces that every surface where a score appears — in the product, in any API response, and in this glossary — displays it alongside its confidence context.
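One way to enforce that pairing at the code level is to make the score and its confidence inseparable in the type that renders it. A minimal sketch, assuming hypothetical names (`ScoredCompany`, `display`) that are not NUVC's actual API:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ScoredCompany:
    """Couples a NuScore with its confidence level so neither
    can be displayed without the other."""
    name: str
    nu_score: float
    confidence_level: str

    def display(self) -> str:
        # The only rendering path always includes the confidence context.
        return f"{self.nu_score:.1f} ({self.confidence_level} confidence)"


a = ScoredCompany("Acme", 6.2, "High")
b = ScoredCompany("Beta", 6.2, "Uncertain")
```

Here `a.display()` and `b.display()` show the same 6.2 but read very differently, which is exactly the distinction the paragraph above describes.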