Recent Discussions
How to check if a parameter is defined in an expression library file?
I have an expression.expr file, and I want to check whether a parameter it uses is defined in the SnapLogic pipeline. The library contains:
{ getS3EnvRoot: () => _has_restricted_access == 1 ? '/abc/' : '/xyz/', }
Generate a CSV file with header and footer, showing the record count and current date in the footer
Hi, can anyone help with generating the output file shown below from the input file? I have created the pipeline below for it, but I am getting an extra comma in the footer section.
GenAI App Builder Getting Started Series: Part 1 - HR Q&A example
👋 Welcome!
Hello everyone, and welcome to our technical guide to getting started with GenAI App Builder on SnapLogic! At the time of publishing, GenAI App Builder is available for testing and will be generally available in our February release. Existing customers and partners can request access for testing GenAI App Builder by speaking to their Customer Success Manager or another member of their account team. If you're not yet a customer, you can speak to your Sales team about testing GenAI App Builder.
🤔 What is GenAI App Builder?
Before we begin, let's take a moment to understand what GenAI App Builder is and talk, at least at a high level, about its components. GenAI App Builder is the latest offering in the SnapLogic AI portfolio, focused on helping modern enterprises create applications with generative AI faster, using a low-/no-code interface. That feels like a mouthful of buzzwords, so let me paint a picture (skip this if you're familiar with GenAI, or watch our video, "Enabling employee and customer self-service").
Imagine yourself as a member of an HR team responsible for recruiting year-round. Every new employee has an enrollment period just before or after their start date, and every existing employee has open enrollment once per year. During this time, employees need to choose between different medical insurance offerings, which usually involves comparing deductibles, networks, maximum out-of-pocket limits, and other related features and limits. As you're thinking about all of this material, sorting out how to explain it to your employees, you're interrupted by your Slack or Teams DM noise. Bing bong! Questions start flooding in:
- Hi, I'm a new employee and I'm wondering, when do I get paid? What happens if payday is on a weekend or holiday? Speaking of holidays, what are the company-recognized holidays this year?
- Hi, my financial advisor said I should change my insurance plan to one with an HSA. Can you help me figure out which plan(s) include an HSA and confirm the maximum contribution limits for a family this year?
- Hi, how does vacation accrual work? When does vacation roll over? Is unused vacation paid out or lost?
All these questions and many others are answered in documents the HR team manages, including the employee handbook, insurance comparison charts, disability insurance sheets, life insurance sheets, other data sheets, etc. What if, instead of answering all these questions yourself, you could leverage a human-sounding large language model (LLM) to field them for you, making sure it references only the source documents you provide so you don't have to worry about hallucinations? Enter GenAI App Builder!
🏗 Building an HR Q&A example
Once you have access to test GenAI App Builder, you can use the following steps to start building an HR Q&A example that answers questions using only the employee handbook or whichever document you provide. In this guide we will cover the two pipelines used: one that loads data and one that answers questions. We will not get into Snap customization or Snap details in this guide - it is just meant to show a quick use case. We do assume that you are familiar enough with SnapLogic to create a new Pipeline or import an existing one, search for Snaps, connect Snaps, and a few other simple steps. We will walk you through anything that is new to SnapLogic or that needs some additional context. We also assume you have some familiarity with generative AI.
We will also make a video with similar content in the near future, so I'll update or reply to this post once that content is available.
Prerequisites
To complete this guide, you will need the items below regardless of whether or not you use the Community-supported chatbot UI from SnapLogic:
- Access to a Pinecone instance (sign up for a free account at https://www.pinecone.io) with an existing index
- Access to Azure OpenAI or OpenAI
- A file to load, such as your company's employee handbook
Loading data
Our first step is to load data into the vector database using a Pipeline similar to the one below, which we will call the "Indexer" Pipeline since it helps populate the Pinecone index. If you cannot find the pattern in the Pattern Library, you can find it attached below as "Indexer_Feb2024.slp". The steps below assume you have already imported the Pipeline or are building it as we go. To add more color here, loading data into the vector database only needs to be done when the files are updated. In the HR scenario, this might be once a year for open enrollment documents and maybe a few times a year for the employee handbook. We will explore other use cases in the future where document updates would be much more frequent.
1. Click on the "File Reader" Snap to open its settings
2. Click on the icon at the far right of the "File" field as shown in the screenshot below
3. Click the "Upload" button in the upper-right corner of the window that pops up
4. Select the PDF file from your local system that you want to index (we are using an employee handbook and you're welcome to do the same) to upload it, then make sure it is selected
5. Save and close the "File Reader" Snap once your file is selected
6. Leave the "PDF Parser" Snap with default settings
7. Click on the "Chunker" Snap to open it, then mirror the settings in the screenshot below
8. Now open the "Azure OpenAI Embedder" or "OpenAI Embedder" Snap (you may need to replace the embedder that came with the Pattern or import with the one you have an account for)
9. Go to the "Account" tab and create a new account for the embedder you're using. Replace the variable {YOUR_ACCOUNT_LABEL} with a label for the account that makes sense for you, then replace {YOUR_ENDPOINT} with the appropriate snippet from your Azure OpenAI endpoint. Validate the account if you can to make sure it works. After you save your new account, you can go back to the main "Settings" tab on the Snap
10. If the account setup was successful, you should now be able to click the chat bubble icon at the far right of the "Deployment ID" field to suggest a "Deployment ID" - in our environment shown in the screenshot below, you can see we have one named "Jump-emb-ada-002" which I can now select
11. Finally, make sure the "Text to embed" field is set as shown below, then save and close this Snap
12. Now open the "Mapper" Snap so we can map the output of the embedder Snap to the "Pinecone Upsert" Snap as shown in the screenshot below. If it is difficult to see the mappings in the screenshot, here is a zoomed-in version:
For a little more context here, we're mapping the $embedding object coming out of the embedder Snap to the $values object in Pinecone, which is required. If that were all you mapped, though, your Q&A example would always reply with something like "I don't know" because there would be no text for the LLM to draw on. To include that text, we make use of the very flexible "metadata" object in Pinecone by mapping $original.chunk to $metadata.chunk. We also statically set $metadata.source to "Employee Handbook.pdf", which allows the retriever Pipeline to return the source file used in answering a question (in a real-world scenario, you would probably determine the source dynamically/programmatically, such as from the filename, so this pipeline could load other files too).
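To make those mappings concrete, here is a rough sketch of the kind of document the Mapper passes to the "Pinecone Upsert" Snap. Only the fields discussed above are shown, and the values are illustrative rather than taken from a real run:
{
  "values": [0.0123, -0.0087, 0.0214, ...],
  "metadata": {
    "chunk": "Employees are paid on a semi-monthly basis (24 pay periods per year)...",
    "source": "Employee Handbook.pdf"
  }
}
The $values array is the embedding vector produced by the embedder Snap, and everything under $metadata travels alongside it so the Retriever Pipeline can return both the matching text and where it came from.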
13. Save and close the "Mapper" Snap
14. Finally, open the "Pinecone Upsert" Snap, then click the "Account" tab and create a new account with your Pinecone API key; validate it to make sure it works before saving
15. Back on the main "Settings" tab of the "Pinecone Upsert" Snap, you can now click on the chat bubble icon to suggest existing indexes in Pinecone. For example, in our screenshot below you can see we have four which have been obscured and one named "se-demo." Indexes cannot be created on the fly, so you will have to make sure the index has already been created in the Pinecone web interface
16. The last setting we'll talk about for the Indexer Pipeline is the "Namespace" field in the "Pinecone Upsert" Snap. Setting a namespace is optional. Namespaces in Pinecone create a logical separation between vectors within an index and can be created on the fly during Pipeline execution. For example, you could create a namespace like "2024_enrollment" for all documents published for open enrollment in 2024 and another called "2024_employeehandbook" to keep those documents logically separated. Although namespaces can be used purely for internal organization, you can also direct a chatbot to answer questions using only one namespace. We'll talk about this more in the "Answering Questions" section below, which covers the Retriever Pipeline
17. Save and close the "Pinecone Upsert" Snap
You should now be able to validate the entire Pipeline to see what the data looks like as it flows through the Snaps, and when you're ready to commit the data to Pinecone, you can Execute the Pipeline.
Answering Questions
To answer questions using the data we just loaded into Pinecone, we're going to recreate or import the Retriever Pipeline (attached as "Retriever_Feb2024.slp"). If you import the Pipeline, you may need to add additional "Mapper" Snaps as shown below. We will walk through that in the steps below; for now, just know this is what we'll end up with at the end of our first article. The screenshot above shows what the pattern will look like when you import it. Since this first part of the series only takes us up to the point of testing in SnapLogic, our first few steps will involve some changes with that in mind.
1. Right-click on the "HTTP Router" Snap and click "Disable Snap"
2. Click the circle between the "HTTP Router" and embedder Snaps to disconnect them
3. Drag the "HTTP Router" Snap somewhere out of the way on the canvas (you can also delete it if you're comfortable replacing it later); your Pipeline should now look like this:
4. In the asset palette on the left, search for the "JSON Generator" (it should appear before you finish typing that all out):
5. Drag a "JSON Generator" onto the canvas, connecting it to the "Azure OpenAI Embedder" or "OpenAI Embedder" Snap
6. Click on the "JSON Generator" to open it, then click on the "Edit JSON" button in the main Settings tab
7. Highlight all the text from the template and delete it so we have a clean slate to work with
8. Paste in the text below, replacing "Your question here." with an actual question you want to ask that can be answered from the document you loaded with your Indexer Pipeline. For example, I loaded an employee handbook and I will ask the question, "When do I get paid?"
[ { "prompt" : "Your question here." } ]
Your "JSON Generator" should now look something like this, but with your question:
9. Click "OK" in the lower-right corner to save the prompt
10. Click on the "Azure OpenAI Embedder" or "OpenAI Embedder" Snap to view its settings
11. Click on the Account tab, then use the drop-down box to select the account you created in the section above ("Loading Data", steps 8-9)
12. Click on the chat bubble icon to suggest "Deployment IDs" and choose the same one you chose in "Loading Data", step 10
13. Set the "Text to embed" field to $prompt as shown in the screenshot below:
14. Save and close the "Azure OpenAI Embedder" or "OpenAI Embedder" Snap
15. Click on the Mapper immediately after the embedder Snap
16. Create a mapping for $embedding that maps to $vector
17. Check the "Pass through" box; this Mapper Snap should now look like this:
18. Save and close this "Mapper"
19. Open the "Pinecone Query" Snap
20. Click the Account tab, then use the drop-down to select the Pinecone account you created in "Loading Data", step 14
21. Use the chat bubble on the right side of the "Index name" field to select your existing index
22. [OPTIONAL] Use the chat bubble on the right side of the "Namespace" field to select your existing namespace, if you created one; the "Pinecone Query" Snap should now look like this:
23. Save and close the "Pinecone Query" Snap
24. Click on the "Mapper" Snap after the "Pinecone Query" Snap. In this "Mapper" we need to create the three mappings listed below, which are also shown in the following screenshot. If you're not familiar with the $original JSON key, it appears when an upstream Snap has implicit pass through or when, as in the "Mapper" in step 17, we explicitly enable pass through, allowing us to access the original JSON document that went into the upstream Snap. (NOTE: If you're validating your pipeline along the way or making use of our Dynamic Validation, you may notice that no Target Schema shows up in this Mapper until after you configure the downstream prompt generator Snap in the steps that follow.)
- Map $original.original.prompt to $prompt
- Map jsonPath($, "$matches[*].metadata.chunk") to jsonPath($, "$context[*].data")
- Map jsonPath($, "$matches[*].metadata.source") to jsonPath($, "$context[*].source")
25. Save and close that "Mapper"
26. Click on the "Azure OpenAI Prompt Generator" or "OpenAI Prompt Generator" so we can set our prompt
27. Click on the "Edit prompt" button and make sure your default prompt looks like the screenshot below. On lines 4-6 you can see we are using mustache templating like {{#context}} {{source}} {{/context}}, which corresponds to the jsonPath($, "$context[*].source") mapping from the "Mapper" we just configured. We'll talk about this more in future articles - for now, just know this will be a way for you to customize the prompt and the data included in the future
28. Click "OK" in the lower-right corner
29. Save and close the prompt generator Snap
30. Click on the "Azure OpenAI Chat Completions" or "OpenAI Chat Completions" Snap
31. Click the "Account" tab, then use the drop-down box to select the account you created earlier
32. Click the chat bubble icon at the far right of the "Deployment ID" field to suggest a deployment; this ID may be different from the one you've chosen in previous "Azure OpenAI" or "OpenAI" Snaps since we're selecting an LLM this time instead of an embedding model
33. Set the "Prompt" field to $prompt; your Snap should look something like this:
34. Save and close the chat completions Snap
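Before moving on to testing, it can help to picture what the prompt generator hands to the chat completions Snap. The exact wording comes from the default prompt template you saw in the "Edit prompt" screen, not from this sketch, but the resulting document carries a single $prompt field that looks roughly like this (the question, chunk text, and source are illustrative):
{
  "prompt": "Answer the question using only the context below.\nSources: [Employee Handbook.pdf]\nContext: Employees are paid on a semi-monthly basis (24 pay periods per year)...\nQuestion: When do I get paid?"
}
This is why the "Prompt" field in the chat completions Snap is simply set to $prompt - the retrieval results are already folded into that one string.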
Testing our example
Now it's time to validate our Pipeline and take a look at the output! Once validated, the Pipeline should look something like this:
If you click the preview data output on the last Snap, the chat completions Snap, you should see output that looks like this:
The answer to our prompt is under $choices[0].message.content. For the test above, I asked the question "When do I get paid?" against an employee handbook and the answer was this:
Employees are paid on a semi-monthly basis (24 pay periods per year), with payday on the 15th and the last day of the month. If a regular payday falls on a Company-recognized holiday or on a weekend, paychecks will be distributed the preceding business day. The related context is retrieved from the following sources: [Employee Handbook.pdf]
Wrapping up
Stay tuned for further articles in the "GenAI App Builder Getting Started Series" for more use cases, closer looks at individual Snaps and their settings, and even how to connect a chat interface! Most if not all of these articles will also have an associated video if you learn better that way! If you have issues with the setup or find a missing step or detail, please reply to this thread to let us know!
GenAI App Builder Getting Started Series: Part 2 - Purchase Order Processing
👋 Welcome!
Hello everyone and welcome to our second guide in the GenAI App Builder Getting Started Series! First things first, GenAI App Builder is now generally available for all customers to purchase or test in SnapLabs. If you are a customer or partner who wants access to SnapLabs, please reach out to your Customer Success Manager and they can grant you access. If you are not yet a customer, you can check out our GenAI App Builder videos and then, when you're ready to take the next step, request a demo with our sales team!
🤔 What is GenAI App Builder?
If you're coming here from Part 1, you may notice that GenAI Builder is now GenAI App Builder. Thank you to our customers who shared feedback on how we could improve the name to better align with the product's purpose. The original name had led to some confusion that its purpose was to train LLMs.
📑 Purchase Order Processing Example
In this example we will demonstrate how to use GenAI in a SnapLogic Pipeline to act like a function written in natural language that extracts information from a PDF. The slide below shows an example of how we use natural language to extract the required fields in JSON format, which would allow us to make this small pattern part of a larger app or data integration workflow.
✅ Prerequisites
To follow along with this guide, you will need the items below:
- Access to GenAI App Builder (in your company's organization or in SnapLabs)
- Your own API account with access to Azure OpenAI, OpenAI, or Anthropic Claude on Amazon Bedrock
⬆️ Import the pipeline
At the bottom of this post you will find several files if you want to use a pattern to see this in action in your own environment and explore it further. If you are familiar with SnapLogic and want to build the Pipeline on your own, you can do that as well and just download the example PDF or try your own!
- PurchaseOrderExample.pdf
- InvoiceProcessing_CommunityArticlePipeline_2024_06_28.slp (zipped)
Once you are signed in to SnapLogic or SnapLabs, you can use the steps below to import the Pipeline:
1. In Designer, click the icon shown in the screenshot below to import the Pipeline
2. Select the file in the File Browser window that pops up
3. In the Add New Pipeline panel that opens, you can change the name and project location if desired
4. Press the Save button in the lower-right corner
🚧 Parsing the file
If you imported the pipeline using the steps above, then your pipeline should look like the one below. The steps below assume you imported the pipeline. If you are familiar enough with SnapLogic to build this on your own, you can drag the Snaps shown below to create the Pipeline and then follow along with us.
🔈 NOTE: The instructions here use the Amazon Bedrock Prompt Generator and the Anthropic Claude on AWS Chat Completions Snaps for the last two Snaps in the Pipeline. You can swap these out for Azure OpenAI or OpenAI Snaps if you prefer to use those LLMs.
1. Click the File Reader Snap to open its settings
2. Click the icon at the far right of the File field as shown in the screenshot below
3. Click the Upload File button in the upper-right corner of the window that pops up
4. Select the PDF file from your file browser (download the "PurchaseOrderExample.pdf" file at the bottom of this post if you have not already)
5. Save and close the File Reader Snap once your file is selected
6. No edits are needed for the PDF Parser Snap, so we'll skip over that one
7. Click the Mapper Snap
8. Add $text in the Expression field and $context in the Target path field as shown below
9. Save and close the Mapper Snap
10. Click on the fourth Snap, the Prompt Generator Snap (we will demonstrate here with the Amazon Bedrock Prompt Generator Snap - you do not have to use Amazon Bedrock though; you can use any of the other LLM Prompt Generators we have, like Azure OpenAI, OpenAI, etc.)
11. Click the Edit Prompt button as shown in the screenshot below so we can modify the prompt used for the LLM
12. You should see a pre-generated prompt like the one below:
13. Copy the prompt below and use it to replace the default prompt:
Instruction: Your task is to pull out the company name, the date created, date shipped, invoice number, P.O. number, vendor from vendor details, recipient name from recipient details, subtotal, 'Shipping & handling', tax rate, sales tax, and total from the context below. Give the results back in JSON.
Context:
{{context}}
14. The Prompt Generator text should now look like the screenshot below:
15. Click the Ok button in the lower-right corner to save our prompt changes
16. Click on the last Snap, the Chat Completions Snap (we will demonstrate here with the Anthropic Claude on AWS Chat Completions Snap - you do not have to use Anthropic Claude on AWS though; you can use any of the other LLM Chat Completions Snaps we have, like Azure OpenAI, OpenAI, etc.)
17. Click the Account tab
18. Click Add Account; if you have an existing LLM account to use, you can select it here and skip to step 22 below
19. Select the type of account you want, then press Continue - the available options will depend on which LLM Chat Completions Snap you chose
20. Enter the required credentials for the LLM account you chose; here is an example of the Amazon Bedrock Account
21. Press the Apply button when done entering the credentials
22. Verify your account is now selected in the Account tab
23. Click on the Settings tab
24. Click on the Suggest icon to the right of the Model name field as shown in the screenshot below and select the model you want to use
25. Type $prompt in the Prompt field as shown in the screenshot below:
26. Expand the Model Parameters section by clicking on it (if you are using OpenAI or Azure OpenAI, you can leave Maximum Tokens blank; for Anthropic Claude on AWS you will need to increase Maximum Tokens from 200 to something higher - you can see where we set 50,000 below)
27. Save and close the Chat Completions Snap
🎬 Testing our example
At this point we are ready to test our Pipeline and observe the results! The screenshot below shows you where you can click to Validate the Pipeline, which should have every Snap turn green with preview output as shown below. If you have any errors or questions, please reply to share them with us! Here is the JSON output after the Anthropic Claude on AWS Chat Completions Snap (note that other LLMs will have different API output structures):
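Within that output, the extracted fields appear as JSON in the model's response text, roughly like the sketch below. The values here are purely illustrative, the model chooses its own key names since the prompt only asks for JSON, and the wrapper structure around the response text will differ between Anthropic Claude on AWS, Azure OpenAI, and OpenAI:
{
  "company_name": "Acme Office Supply",
  "date_created": "2024-05-01",
  "date_shipped": "2024-05-03",
  "invoice_number": "INV-10231",
  "po_number": "PO-55872",
  "vendor": "Acme Office Supply",
  "recipient_name": "Jane Doe",
  "subtotal": "$1,250.00",
  "shipping_and_handling": "$25.00",
  "tax_rate": "8.25%",
  "sales_tax": "$103.13",
  "total": "$1,378.13"
}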
Extras!
Want to play with this further?
- Try adding a Copy Snap after the Mapper and sending the file to multiple LLMs at once, then review the results
- Try changing {{context}} in the Prompt Generator to something else (for example, {{text}}, the field the PDF Parser produces) so you can drop the Mapper from the pipeline
🏁 Wrapping up
Congratulations, you have now completed at least one GenAI App Builder integration in SnapLogic! 😎 Stay tuned to the SnapLabs channel here in the Integration Nation for more content on GenAI App Builder in the future! Please share any thoughts, comments, concerns, or feedback in a reply or DM RogerSramkoski!
How to run Python code in a Script Snap
I have working Python code that reads a simple string and converts it from EBCDIC big-endian to ASCII, but could you please let me know how I can run this in the SnapLogic Script Snap? Could you please provide additional information with the code and pipeline? Please find the attached code.
Discover Project SnapChain: Build your own Chatbot with Snaps and pipelines!
Hey SnapLabs Community! I hope you're ready for our next experiment. Since you loved SnapGPT so much, we have been hard at work figuring out the easiest way for you to build your own chatbot with your own data for your organization to use internally. Check out the post below and sign up for our SnapLabs Corner webinar happening tomorrow (Wednesday, December 6th) at 11 AM ET (8 AM PT). See you there! Unlock the Future of AI: Discover Project SnapChain and Build Your Own RAG Chatbot
Project SnapChain update and tell us your thoughts on the webinar!
Hey y'all, we've run into a few hiccups enabling Project SnapChain in SnapLabs. We're still looking to run this experiment and have you play around with this exciting capability, so stay tuned for further updates. If you have no idea what I'm talking about, check out this webinar recording to find out how you can build your own Generative AI-powered applications with SnapLogic pipelines! And if you attended the webinar, thank you!! We want to hear from you, so please fill out this survey or comment back to tell us what you'd like to use Project SnapChain for. Survey link: https://forms.gle/UvVyMv4yFkmr69Sc7 Happy Friday! Aaron
Changes to SnapLabs Office Hours in 2024
Hello SnapLabs users! As many of you may have already seen, we are cancelling SnapLabs Office Hours for the rest of 2023. In 2024 we will change the timing and platform for Office Hours in order to make it part of a quarterly series along with the SnapLabs Corner webinar. Our goals are to reduce the number of meetings and improve our content and cadence for sharing information about SnapLabs with the Community. Stay tuned for more information in early 2024! Thank you everyone for a great 2023 and see you next year! 🎉
Access to SnapLabs
In order to provide the best experience for members of our SnapLabs program, we add a small group of users every week. This allows our team to provide dedicated time to support you through the program. Those interested in participating in SnapLabs can sign up for the waitlist at https://www.snaplogic.com/snaplabs. Please don't hesitate to post questions and examples here in the SnapLabs corner of the SnapLogic Community.
Introducing SnapLabs: Discover the Future of Data Integration!
Hello SnapLogic Community! We are beyond excited to introduce SnapLabs, our brand-new innovation hub, designed to offer you a unique, early-access experience of our most innovative features and capabilities. Today, we open the doors to our first batch of SnapLabs explorers! Our first major highlight in SnapLabs is SnapGPT, a generative AI-based solution that revolutionizes the creation of pipelines, SQL queries, mappings, sample data, and pipeline documentation. Please note that, as this is still in beta, there are some limitations. As part of the SnapLabs community, you're invited to:
- Explore SnapGPT and other upcoming solutions before they are officially released!
- Provide invaluable feedback to help shape our future offerings.
- Attend exclusive webinars, workshops, and special events.
Remember, SnapLabs is an ephemeral environment, resetting every 30 days, so there's always something new and exciting to discover! You'll receive an email shortly with links to log into your SnapLogic account. For any questions or if you need assistance, please reach us on our dedicated SnapLabs community channel. With that said, thank you for being a part of this groundbreaking journey and happy exploring!