Secure a generative AI assistant with OWASP Top 10 mitigation

Flipboard

Contrast that with Scope 4/5 applications, where not only do you build and secure the generative AI application yourself, but you are also responsible for fine-tuning and training the underlying large language model (LLM). LLM and LLM agent: the LLM provides the core generative AI capability to the assistant.

Build private and secure enterprise generative AI applications with Amazon Q Business using IAM Federation

AWS Machine Learning Blog

If you want to use Amazon Q Business to build enterprise generative AI applications, and have yet to adopt organization-wide use of AWS IAM Identity Center, you can use Amazon Q Business IAM Federation to directly manage user access to Amazon Q Business applications from your enterprise identity provider (IdP), such as Okta or Ping Identity.
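
For readers wiring this up programmatically, here is a minimal sketch of how an application configured for IAM Federation might be created with the boto3 qbusiness client. The identityType value, role ARN, and SAML provider ARN below are assumptions/placeholders to adapt to your own IdP setup; check the current CreateApplication reference.

# Sketch: create an Amazon Q Business application that federates users
# directly from a SAML 2.0 IdP (for example Okta or Ping Identity).
import boto3

qbusiness = boto3.client("qbusiness", region_name="us-east-1")

response = qbusiness.create_application(
    displayName="enterprise-assistant",
    roleArn="arn:aws:iam::111122223333:role/QBusinessAppRole",            # hypothetical role
    identityType="AWS_IAM_IDP_SAML",                                      # assumed value for SAML federation
    iamIdentityProviderArn="arn:aws:iam::111122223333:saml-provider/Okta" # hypothetical IdP ARN
)
print("Application ID:", response["applicationId"])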

Intelligent document processing with Amazon Textract, Amazon Bedrock, and LangChain

AWS Machine Learning Blog

Document processing has witnessed significant advancements with the advent of Intelligent Document Processing (IDP). With IDP, businesses can transform unstructured data from various document types into structured, actionable insights, dramatically enhancing efficiency and reducing manual efforts.
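
As a rough sketch of that flow (plain boto3 here rather than LangChain; the model ID and the fields requested in the prompt are illustrative assumptions), a scanned document can be OCR'd with Amazon Textract and the raw text handed to an Amazon Bedrock model for structuring:

# Sketch: OCR a document with Textract, then ask a Bedrock model to return
# structured JSON. Adjust the model ID and request body to the model you use.
import json
import boto3

textract = boto3.client("textract")
bedrock = boto3.client("bedrock-runtime")

# 1. OCR the document (synchronous API, suitable for single-page image/PDF bytes).
with open("invoice.png", "rb") as f:
    ocr = textract.detect_document_text(Document={"Bytes": f.read()})
raw_text = "\n".join(b["Text"] for b in ocr["Blocks"] if b["BlockType"] == "LINE")

# 2. Ask an LLM to extract structured fields from the raw OCR text.
prompt = ("Extract the vendor name, invoice number, and total amount from this "
          f"document as JSON:\n\n{raw_text}")
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",   # assumed Claude-on-Bedrock message format
    "max_tokens": 512,
    "messages": [{"role": "user", "content": prompt}],
})
result = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model ID
    body=body,
)
print(json.loads(result["body"].read())["content"][0]["text"])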

Dialogue-guided intelligent document processing with foundation models on Amazon SageMaker JumpStart

AWS Machine Learning Blog

Intelligent document processing (IDP) is a technology that automates the processing of high volumes of unstructured data, including text, images, and videos. Natural language processing (NLP) is one of the recent developments in IDP that has improved accuracy and user experience.
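
A minimal sketch of the dialogue-guided idea, assuming a text-generation model from SageMaker JumpStart (the model ID and payload shape below are placeholders that vary by model): deploy the foundation model, then ask it follow-up questions about previously extracted document text.

# Sketch: deploy a JumpStart foundation model and query it about a document.
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(model_id="huggingface-llm-mistral-7b-instruct")  # placeholder model ID
predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.2xlarge")

document_text = "..."  # text previously extracted from the document
response = predictor.predict({
    "inputs": f"Document:\n{document_text}\n\nQuestion: What is the invoice total?",
    "parameters": {"max_new_tokens": 128},
})
print(response)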

Discover insights from Amazon S3 with Amazon Q S3 connector 

AWS Machine Learning Blog

Next, you need to index the data to make it available for a Retrieval Augmented Generation (RAG) approach where relevant passages are delivered with high accuracy to a large language model (LLM). The web application that the user uses to retrieve answers would be connected to an identity provider (IdP) or the AWS IAM Identity Center.
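
Once the S3 content is indexed, the query path from the web application is a single call. The sketch below assumes the caller already holds identity-aware AWS credentials obtained through the IdP or IAM Identity Center (token exchange not shown), and the application ID is a placeholder.

# Sketch: ask Amazon Q Business a question; it performs the RAG retrieval
# and answer generation over the indexed S3 content.
import boto3

qbusiness = boto3.client("qbusiness")

answer = qbusiness.chat_sync(
    applicationId="a1b2c3d4-example",   # placeholder application ID
    userMessage="What were the key findings in the Q3 report stored in S3?",
)
print(answer["systemMessage"])
for source in answer.get("sourceAttributions", []):
    print("Source:", source.get("title"), source.get("url"))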

Find answers accurately and quickly using Amazon Q Business with the SharePoint Online connector

AWS Machine Learning Blog

This enables the Amazon Q large language model (LLM) to provide accurate, well-written answers by drawing from the consolidated data and information. The SharePoint Online data source can be optionally connected to an IdP such as Okta or Microsoft Entra ID.

Discover insights from Box with the Amazon Q Box connector

AWS Machine Learning Blog

Next, you need to index this data to make it available for a Retrieval Augmented Generation (RAG) approach where relevant passages are delivered with high accuracy to a large language model (LLM). Amazon Q supports the crawling and indexing of these custom objects and custom metadata.
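
To make the custom-metadata point concrete, here is a sketch that pushes a document directly with BatchPutDocument and attaches custom attributes by hand; the Box connector performs the equivalent mapping automatically. The application and index IDs, attribute names, and content are placeholders.

# Sketch: index a document with custom metadata attributes that can later be
# used for filtering or relevance tuning at query time.
import boto3

qbusiness = boto3.client("qbusiness")

qbusiness.batch_put_document(
    applicationId="a1b2c3d4-example",   # placeholder
    indexId="idx-example",              # placeholder
    documents=[{
        "id": "contract-42",
        "title": "Master Services Agreement",
        "contentType": "PLAIN_TEXT",
        "content": {"blob": b"Full text of the agreement..."},
        "attributes": [                  # custom metadata carried into the index
            {"name": "department", "value": {"stringValue": "legal"}},
            {"name": "box_folder", "value": {"stringValue": "Contracts/2024"}},
        ],
    }],
)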
