Startup IndagoAI LLC has launched its Trust Solutions Platform, a suite of tools aimed at improving confidence in AI-generated research by rating its accuracy and credibility across a broad range of criteria.

The firm’s TrustScore is a patent-pending algorithm that applies more than a dozen trust “themes” and 60 factors to evaluate the reliability of research findings in industries such as biopharma, construction and publishing, as well as for standards organizations. The company said its goal is to mitigate the growing problem of AI-generated “hallucinations,” or fabricated facts presented as true information, by giving enterprises a standardized method for assessing the trustworthiness of research outputs. Christian Mairhofer, IndagoAI’s co-founder and chief operating officer, said hallucination detection is built directly into the platform.

TrustScore evaluates documents, data sources and AI-generated outputs across multiple dimensions, including authorship, objectivity, source credibility, numerical soundness and bias. Scores are generated on a scale of 1 to 10, accompanied by detailed commentary explaining the reasoning behind each rating. The system draws on both internal analysis and external validation. The platform can also analyze multiple documents simultaneously, giving organizations an overall trust assessment for large-scale research projects.

Mairhofer said TrustScore is designed to integrate with existing enterprise research systems. “If you research 20, 30 or 40 articles, you can have an overall trust score for your research project as well,” he said. “By reinforcing confidence in research results, we can help companies make better decisions with greater transparency.”

The company plans to offer both enterprise-wide dashboards and consumer-facing versions of TrustScore. It also hopes to work with standards organizations to create industry-wide benchmarks for content credibility.
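IndagoAI’s actual scoring algorithm is patent-pending and has not been published, but the roll-up the company describes, per-dimension scores on a 1-to-10 scale combined into a document score and then into a project-wide score, can be sketched in a few lines. Everything below (the dimension names taken from the article, the unweighted averaging, the `DocumentScore` and `project_trust_score` names) is a hypothetical illustration, not the product’s real method:

```python
from dataclasses import dataclass
from statistics import mean

# Dimensions named in the article; the real product may use more
# (it cites over a dozen "themes" and 60 underlying factors).
DIMENSIONS = [
    "authorship",
    "objectivity",
    "source_credibility",
    "numerical_soundness",
    "bias",
]

@dataclass
class DocumentScore:
    """Per-dimension ratings for one document, each on a 1-10 scale."""
    scores: dict[str, float]

    def overall(self) -> float:
        # Unweighted mean is an assumption; TrustScore's weighting is unknown.
        return mean(self.scores[d] for d in DIMENSIONS)

def project_trust_score(docs: list[DocumentScore]) -> float:
    """Combine per-document scores into one project-level trust score."""
    return mean(doc.overall() for doc in docs)

docs = [
    DocumentScore({"authorship": 8, "objectivity": 7, "source_credibility": 9,
                   "numerical_soundness": 8, "bias": 6}),
    DocumentScore({"authorship": 6, "objectivity": 8, "source_credibility": 7,
                   "numerical_soundness": 9, "bias": 7}),
]
print(round(project_trust_score(docs), 2))  # → 7.5
```

This mirrors the claim that scoring 20, 30 or 40 articles can yield a single trust score for the whole research project; the commercial system also attaches explanatory commentary to each rating, which a toy average like this does not capture.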