For AI-generated results to be trusted, companies must cite their sources
Retrieval Augmented Generation (RAG), a type of generative AI that pulls from multiple sources to produce new content, is increasingly expected by users to cite the sources used to generate that content.
If you’ve been following industry discussions about generative AI and RAG, you’re likely already aware that the quality of the output depends on the inputs. The AI retrieves the information it deems relevant from the sources available to it, and it should be able to cite those sources and present them back to the user. There have already been news stories about AI-generated content farms being used to game the programmatic ad industry, and the history of bias in AI is well documented. When sources are not made easily accessible, it can be difficult to determine the accuracy of the information presented.
In March, Google opened up Bard to users, who quickly noticed that, unlike Bing Chat, Bard was less likely to cite its sources. Just two months later, Bard was updated to include better citations. Everyone from publishers concerned that AI-generated responses will hurt their web traffic to academic institutions concerned about ethics is pushing for more (and better) citations for RAG content.
We’ve talked about how important it is for companies to take proactive steps to build trust with users. Showing the content’s sources creates more transparency. Users can see where the information came from and how it was used to generate the content. This helps to build trust and credibility.
When users can see the original information, they can verify that it was used correctly. This helps ensure accuracy, and it can also help users judge whether the source is reputable and whether it might be biased in any way.
Adding citations doesn’t need to feel forced or visually overwhelming. Companies can provide links alongside intelligent search results or reference the sources directly in the content itself.
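To make this concrete, here is a minimal sketch of how a RAG response might carry its sources through to the user. The data structures, field names, and rendering format are illustrative assumptions, not any particular vendor’s API:

```python
from dataclasses import dataclass

# Hypothetical structures for illustration only -- the shape of a real
# RAG pipeline's response will vary by vendor and implementation.

@dataclass
class Source:
    title: str
    url: str

@dataclass
class RagAnswer:
    text: str              # the generated answer
    sources: list[Source]  # documents retrieved to produce the answer

def render_with_citations(answer: RagAnswer) -> str:
    """Append a numbered source list so users can verify the answer."""
    lines = [answer.text, "", "Sources:"]
    for i, src in enumerate(answer.sources, start=1):
        lines.append(f"[{i}] {src.title} - {src.url}")
    return "\n".join(lines)

answer = RagAnswer(
    text="Our support line is open 9am-5pm ET, Monday through Friday.",
    sources=[Source("Support FAQ", "https://example.com/faq")],
)
print(render_with_citations(answer))
```

The key design choice is that the retrieved sources travel with the generated text as structured data, so the presentation layer can render them as inline links, footnotes, or a source list without re-deriving where the answer came from.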
Building trust with users is essential for any company developing or using RAG. Showing the sources used to create the content is an immediate and highly visible way to build that trust and establish your credibility. It signals a desire to work ethically and makes it possible for anyone to verify that the newly generated content is accurate, credible, and contextualized, which can in turn drive increased engagement and sales.