This post is co-authored by Daryl Martis and Darvish Shadravan from Salesforce.
This is the fourth post in a series discussing the integration of Salesforce Data Cloud and Amazon SageMaker.
In Part 1 and Part 2, we show how the Salesforce Data Cloud and Einstein Studio integration with SageMaker allows businesses to access their Salesforce data securely using SageMaker’s tools to build, train, and deploy models to endpoints hosted on SageMaker. SageMaker endpoints can be registered with Salesforce Data Cloud to activate predictions in Salesforce. In Part 3, we demonstrate how business analysts and citizen data scientists can create machine learning (ML) models, without code, in Amazon SageMaker Canvas and deploy trained models for integration with Salesforce Einstein Studio to create powerful business applications.
In this post, we show how native integrations between Salesforce and Amazon Web Services (AWS) enable you to Bring Your Own Large Language Models (BYO LLMs) from your AWS account to power generative artificial intelligence (AI) applications in Salesforce. Requests and responses between Salesforce and Amazon Bedrock pass through the Einstein Trust Layer, which promotes responsible AI use across Salesforce.
We demonstrate the BYO LLM integration by using Anthropic’s Claude model on Amazon Bedrock to summarize a list of open service cases and opportunities on an account record page, as shown in the following figure.
Partner quote
“We continue to expand on our strong collaboration with AWS with our BYO LLM integration with Amazon Bedrock, empowering our customers with more model choices and allowing them to create AI-powered solutions and Copilots customized for their specific business needs. Our open and flexible AI environment, grounded with customer data, positions us well to be leaders in AI-driven solutions in the CRM space.”
–Kaushal Kurapati, Senior Vice President of Product for AI at Salesforce
Amazon Bedrock
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can quickly experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Because Amazon Bedrock is serverless, you don’t have to manage infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with.
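To make the “single API” concrete, the following is a minimal sketch (not part of the original walkthrough) that uses boto3 to browse the Bedrock model catalog; the Region is an assumption, and the provider filter can be changed or omitted.

```python
# Minimal sketch: browse the Amazon Bedrock model catalog through one API with boto3.
# The Region is an assumption; adjust it to where you use Bedrock.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# List foundation models offered by Anthropic; models from other providers
# (AI21 Labs, Cohere, Meta, Mistral AI, Stability AI, Amazon) come back from
# the same call when the filter is changed or omitted.
response = bedrock.list_foundation_models(byProvider="Anthropic")
for model in response["modelSummaries"]:
    print(model["modelId"], "-", model["modelName"])
```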
Salesforce Data Cloud and Einstein Model Builder
Salesforce Data Cloud is a data platform that unifies your company’s data, giving every team a 360-degree view of the customer to drive automation and analytics, personalize engagement, and power trusted AI. Data Cloud creates a holistic customer view by turning volumes of disconnected data into a single, trusted model that’s simple to access and understand. With data harmonized within Salesforce Data Cloud, customers can put their data to work to build predictions and generative AI–powered business processes across sales, support, and marketing.
With Einstein Model Builder, customers can build their own models using Salesforce’s low-code model builder experience or integrate their own custom-built models into the Salesforce platform. Einstein Model Builder’s BYO LLM experience provides the capability to register custom generative AI models from external environments such as Amazon Bedrock and Salesforce Data Cloud.
After custom Amazon Bedrock models are registered in Einstein Model Builder, they are connected through the Einstein Trust Layer, a robust set of features and guardrails that protect the privacy and security of data, improve the safety and accuracy of AI results, and promote the responsible use of AI across Salesforce. Registered models can then be used in Prompt Builder, a newly launched, low-code prompt engineering tool that allows Salesforce admins to build, test, and fine-tune trusted AI prompts that can be used across the Salesforce platform. These prompts can be integrated with Salesforce capabilities such as Flows, Invocable Actions, and Apex.
Solution overview
With the Salesforce Einstein Model Builder BYO LLM feature, you can invoke Amazon Bedrock models in your AWS account. At the time of this writing, Salesforce supports Anthropic Claude 3 models on Amazon Bedrock for BYO LLM. For this post, we use the Anthropic Claude 3 Sonnet model. To learn more about inference with Claude 3, refer to Anthropic Claude models in the Amazon Bedrock documentation.
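If you want to try Claude 3 Sonnet inference directly from your AWS account before wiring it into Salesforce, the following is a minimal sketch using boto3 and the Anthropic Messages API request format; the prompt text, temperature, and Region are illustrative assumptions, not values from the walkthrough.

```python
# Minimal sketch: invoke Anthropic Claude 3 Sonnet on Amazon Bedrock with boto3.
# The prompt, temperature, and Region are illustrative; adjust them for your account.
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "temperature": 0.2,  # lower values make the summary more deterministic
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Summarize these open service cases: ..."}
            ],
        }
    ],
}

response = runtime.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```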
For your implementation, you can use the model of your choice. Refer to Bring Your Own Large Language Model in Einstein 1 Studio for the models supported with Salesforce Einstein Model Builder.
The following image shows a high-level architecture of how you can integrate the LLM from your AWS account into Salesforce Prompt Builder.
In this post, we show how to build generative AI–powered Salesforce applications with Amazon Bedrock. The following are the high-level steps involved:
- Grant Amazon Bedrock invoke model permission to an AWS Identity and Access Management (IAM) user
- Register the Amazon Bedrock model in Salesforce Einstein Model Builder
- Integrate the prompt template with the field in the Lightning App Builder
Prerequisites
Before deploying this solution, make sure you meet the following prerequisites:
- Have access to Salesforce Data Cloud and meet the requirements for using BYO LLM.
- Have Amazon Bedrock set up. If this is the first time you are accessing Anthropic Claude models on Amazon Bedrock, you need to request access. You must have sufficient permissions to request access to models through the console. To request model access, sign in to the Amazon Bedrock console and select Model access at the bottom of the left navigation pane.
Solution walkthrough
To build generative AI–powered Salesforce applications with Amazon Bedrock, implement the following steps.
Grant Amazon Bedrock invoke model permission to an IAM user
Salesforce Einstein Studio requires an access key and a secret to access the Amazon Bedrock API. Follow the instructions to set up an IAM user and access keys. The IAM user must have Amazon Bedrock invoke model permission to access the model. Complete the following steps (a programmatic sketch of the equivalent policy follows the list):
- On the IAM console, select Users in the navigation pane. On the right side of the console, choose Add permissions and Create inline policy.
- On the Specify permissions screen, in the Service dropdown menu, select Bedrock.
- Under Actions allowed, enter “invoke.” Under Read, select InvokeModel. Select All under Resources. Choose Next.
- On the Review and create screen, under Policy name, enter BedrockInvokeModelPolicy. Choose Create policy.
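If you prefer to script the same permission instead of clicking through the console, the following is a sketch of attaching an equivalent inline policy with boto3; the user name is a hypothetical placeholder for the IAM user you created above.

```python
# Sketch: attach an inline policy equivalent to the console steps above.
# "einstein-studio-user" is a hypothetical placeholder for your IAM user name.
import json
import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "bedrock:InvokeModel",
            "Resource": "*",  # mirrors the "All resources" selection in the console
        }
    ],
}

iam.put_user_policy(
    UserName="einstein-studio-user",
    PolicyName="BedrockInvokeModelPolicy",
    PolicyDocument=json.dumps(policy_document),
)
```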
Register the Amazon Bedrock model in Einstein Model Builder
- On the Salesforce Data Cloud console, under the Einstein Studio tab, choose Add Foundation Model.
- Choose Connect to Amazon Bedrock.
- For Endpoint information, enter the endpoint name, your AWS account access key, and your secret key. Enter the Region and model information. Choose Connect.
- Now, create the configuration for the model endpoint you created in the previous steps. Provide inference parameters such as temperature to set the deterministic factor of the LLM. Enter a sample prompt to verify the response.
- Next, save this new model configuration. Enter the name for the saved LLM model and choose Create Model.
- After the model creation is successful, choose Close and proceed to create the prompt template.
- Select the Model name to open the Model configuration.
- Select Create Prompt Template to launch the prompt builder.
- Select Field Generation as the prompt template type, enter the template name, set Object to Account, and set Object Field to PB Case and Oppty Summary. This associates the template with a custom field on the account record object to summarize the cases.
For this demo, a rich text field named PB Case and Oppty Summary was created and added to the Salesforce Account page layout according to the Add a Field Generation Prompt Template to a Lightning Record Page instructions.
- Provide the prompt and input variables or objects for data grounding and select the model. Refer to Prompt Builder to learn more.
Integrate the prompt template with the field in the Lightning App Builder
- On the Salesforce console, use the search bar to find Lightning App Builder. Build or edit an existing page to integrate the prompt template with the field as shown in the following screenshot. Refer to Add a Field Generation Prompt Template to a Lightning Record Page for detailed instructions.
- Navigate to the Account page and click the PB Case and Oppty Summary field enabled for chat completion to launch the Einstein generative AI assistant and summarize the account case data.
Clean up
Complete the following steps to clean up your resources.
Amazon Bedrock offers on-demand inference pricing, and there are no additional costs with a continued model subscription. To remove model access, refer to the steps in Remove model access.
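If you created the IAM inline policy and access keys solely for this walkthrough, you can also remove them. The following is a sketch assuming the same hypothetical user name used earlier; skip it if the user or its keys are still needed elsewhere.

```python
# Sketch: remove the IAM artifacts created for this walkthrough.
# "einstein-studio-user" is the same hypothetical placeholder as before.
import boto3

iam = boto3.client("iam")

# Delete the inline policy that granted bedrock:InvokeModel.
iam.delete_user_policy(
    UserName="einstein-studio-user",
    PolicyName="BedrockInvokeModelPolicy",
)

# Delete the access keys that were registered in Einstein Studio.
for key in iam.list_access_keys(UserName="einstein-studio-user")["AccessKeyMetadata"]:
    iam.delete_access_key(
        UserName="einstein-studio-user",
        AccessKeyId=key["AccessKeyId"],
    )
```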
Conclusion
In this post, we demonstrated how to use your own LLM in Amazon Bedrock to power Salesforce applications. We used summarization of open service cases on an account object as an example to showcase the implementation steps.
Amazon Bedrock is a fully managed service that makes high-performing FMs from leading AI companies and Amazon available for your use through a unified API. You can choose from a wide range of FMs to find the model that is best suited for your use case.
Salesforce Einstein Model Builder lets you register your Amazon Bedrock model and use it in Prompt Builder to create prompts grounded in your data. These prompts can then be integrated with Salesforce capabilities such as Flows, Invocable Actions, and Apex, so you can build custom generative AI applications with Claude 3 that are grounded in the Salesforce user experience. Amazon Bedrock requests from Salesforce pass through the Einstein Trust Layer, which promotes responsible AI use with features such as dynamic grounding, zero data retention, and toxicity detection while maintaining safety and security standards.
AWS and Salesforce are excited for our mutual customers to harness this integration and build generative AI–powered applications. To learn more and start building, refer to the following resources.
About the Authors
Daryl Martis is the Director of Product for Einstein Studio at Salesforce Data Cloud. He has over 10 years of experience in planning, building, launching, and managing world-class solutions for enterprise customers, including AI/ML and cloud solutions. He previously worked in the financial services industry in New York City. Follow him on LinkedIn.
Darvish Shadravan is a Director of Product Management in the AI Cloud at Salesforce. He focuses on building AI/ML solutions for CRM and is the product owner for the Bring Your Own LLM feature. You can connect with him on LinkedIn.
Rachna Chadha is a Principal Solutions Architect, AI/ML in Strategic Accounts at AWS. Rachna is an optimist who believes that the ethical and responsible use of AI can improve society in the future and bring economic and social prosperity. In her spare time, Rachna likes spending time with her family, hiking, and listening to music.
Ravi Bhattiprolu is a Sr. Partner Solutions Architect at AWS. Ravi works with strategic partners Salesforce and Tableau to deliver innovative and well-architected products and solutions that help joint customers realize their business objectives.
Ife Stewart is a Principal Solutions Architect in the Strategic ISV segment at AWS. She has been engaged with Salesforce Data Cloud over the last 2 years to help build integrated customer experiences across Salesforce and AWS. Ife has over 10 years of experience in technology. She is an advocate for diversity and inclusion in the technology field.
Mike Patterson is a Senior Customer Solutions Manager in the Strategic ISV segment at AWS. He has partnered with Salesforce Data Cloud to align business objectives with innovative AWS solutions to achieve impactful customer experiences. In Mike’s spare time, he enjoys spending time with his family, sports, and outdoor activities.
Dharmendra Kumar Rai (DK Rai) is a Sr. Data Architect, Data Lake & AI/ML, serving strategic customers. He works closely with customers to understand how AWS can help them solve problems, especially in the AI/ML and analytics space. DK has many years of experience building data-intensive solutions across a range of industry verticals, including high-tech, FinTech, insurance, and consumer-facing applications.