Learning Timeline
Key Insights
Access Without Registration
You can try the Cleanlab TLM demo directly on the website without signing up first.
The Importance of Trustworthiness Score
This score is critical for business AI applications because it indicates how likely the AI's answer is to be factually reliable rather than a hallucination.
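As a concrete illustration, you can also score an answer you already have from any model. Below is a minimal sketch, assuming the cleanlab_tlm Python client (pip install cleanlab-tlm) and a CLEANLAB_TLM_API_KEY environment variable; the question/answer pair is invented for illustration:

```python
# Minimal sketch: score an existing (prompt, response) pair.
# Assumes the cleanlab_tlm client and a CLEANLAB_TLM_API_KEY env var;
# the question/answer pair below is hypothetical.
from cleanlab_tlm import TLM

tlm = TLM()
result = tlm.get_trustworthiness_score(
    "Who wrote the novel '1984'?",  # prompt
    "George Orwell",                # response to evaluate
)
print(result["trustworthiness_score"])  # closer to 1.0 = more trustworthy
```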
RAG Integration
This system is most effective when combined with RAG, where the AI's answers are cross-checked against the reference documents you provide.
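In practice, RAG integration can be as simple as concatenating the retrieved document text with the question before sending it to TLM. A minimal sketch under the same assumptions as above (cleanlab_tlm client, API key in the environment); the document snippet and question are made up:

```python
# Minimal RAG-style sketch: retrieved context + question in one prompt.
# Assumes the cleanlab_tlm client; the context and question are hypothetical.
from cleanlab_tlm import TLM

context = "Q3 revenue was $4.2M, up 12% year over year."  # retrieved snippet
question = "What was the Q3 revenue?"

prompt = (
    "Answer using only the reference document below.\n\n"
    f"Reference document:\n{context}\n\n"
    f"Question: {question}"
)

tlm = TLM()
result = tlm.prompt(prompt)
print(result["response"])
print(result["trustworthiness_score"])  # a low score suggests the answer
                                        # is not grounded in the context
```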
Prompts
Identifying Document Sources
Target: Cleanlab TLM
Prompt: "Identify the source of the following data document"
Step by Step
How to Detect AI Hallucinations Using Cleanlab TLM
- Visit the Cleanlab Trustworthy Language Model (TLM) website.
- Prepare the external documents or files you want to use as a RAG (Retrieval-Augmented Generation) reference.
- Click on the 'Presets' dropdown menu to select a predefined test scenario.
- Select a sample scenario such as 'Identify the source of the following data document'.
- Enter a question or prompt related to the data in the attached document.
- Click the generate button to get a response from the LLM.
- Look at the 'Trustworthiness Score' section displayed alongside the answer to evaluate factual accuracy.
- Compare the score: a high score indicates the answer aligns with the RAG data, while a low score signals a potential hallucination (a scripted version of these steps is sketched below).
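The same steps can be scripted end to end. A minimal sketch under the same assumptions as the earlier snippets; the file name reference_document.txt and the 0.8 flagging threshold are hypothetical choices, not Cleanlab recommendations:

```python
# End-to-end version of the steps above: load a RAG reference document,
# ask a question grounded in it, and flag low-trust answers.
# Assumes the cleanlab_tlm client; file name and threshold are hypothetical.
from cleanlab_tlm import TLM

with open("reference_document.txt") as f:  # your RAG reference file
    document = f.read()

prompt = (
    "Identify the source of the following data document.\n\n"
    f"Document:\n{document}"
)

tlm = TLM()
result = tlm.prompt(prompt)

THRESHOLD = 0.8  # hypothetical cutoff; tune for your application
if result["trustworthiness_score"] >= THRESHOLD:
    print("Likely reliable:", result["response"])
else:
    print(f"Possible hallucination (score {result['trustworthiness_score']:.2f}):",
          result["response"])
```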