Require API to build a pipeline to find the high CPU and memory utilization of an individual pipeline during its runtime
We are trying to design a pipeline in SnapLogic that identifies high CPU and memory utilization of individual pipelines that ran in a given month, using the public API. We tried the runtime API but could not get the expected output.

APIs used:

https://elastic.snaplogic.com/api/1/rest/public/runtime/
"https://elastic.snaplogic.com/api/1/rest/public/runtime/Environment/" + $Runtime_ID

Is there an API that can return this information?
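One way to approach this is to pull execution records from the public runtime API and sort them client-side. The sketch below is a minimal, hedged example: the `last_hours`/`level` query parameters and the `cpu_time`/`max_memory`/`pipe_name` field names are assumptions about what the API accepts and returns, so verify them against the Public API documentation for your pod and inspect one real entry before relying on the field names.

```python
import requests
from requests.auth import HTTPBasicAuth


def fetch_runtime_entries(org, user, password, last_hours=720):
    """Fetch pipeline execution records from the public runtime API.

    'last_hours', 'level', and 'limit' are assumed query parameters --
    verify against the Public API docs for your pod/release.
    """
    url = f"https://elastic.snaplogic.com/api/1/rest/public/runtime/{org}"
    resp = requests.get(
        url,
        params={"last_hours": last_hours, "level": "Detail", "limit": 100},
        auth=HTTPBasicAuth(user, password),
    )
    resp.raise_for_status()
    return resp.json().get("response_map", {}).get("entries", [])


def top_by_cpu(entries, n=10):
    # Field names such as 'cpu_time' are assumptions; inspect a real
    # entry from your own org to confirm what the API returns.
    return sorted(entries, key=lambda e: e.get("cpu_time", 0), reverse=True)[:n]


# Hypothetical usage:
# entries = fetch_runtime_entries("MyOrg", "user@example.com", "secret")
# for e in top_by_cpu(entries):
#     print(e.get("pipe_name"), e.get("cpu_time"), e.get("max_memory"))
```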
Snaplogic Service Account Unauthorized?

Hi, I have configured a pipeline to trigger another pipeline via a triggered task. The first pipeline calls the triggered task's secured REST URL using the REST Get snap. I tested the REST authentication with my own credentials as basic authentication and it succeeded. However, when I create a SnapLogic account, tick the "Provision this user as a service account" checkbox, and then use that account's credentials to authenticate the REST call, I get an "Unauthorized" error. I can't find helpful documentation for this; am I configuring the account incorrectly?
Executing a Snaplogic Triggered Task via REST, From a SnapLogic Pipeline

Hi, I am designing a pipeline that needs to execute another pipeline upon its completion. My original intention was to use the Pipeline Execute snap, but that snap cannot execute pipelines stored in a different folder within the same project; it can only execute pipelines from "shared" or from the same folder. Moving the target pipelines into the same folder or into "shared" is not an option for me because it would break some design principles of my project.

I found the following in the Pipeline Execute snap documentation: "To execute a SnapLogic Pipeline that is exposed as a REST service, use the REST Get Snap instead." I am therefore trying to meet my requirement with a REST Get snap that calls the pipeline, which I have exposed as a triggered task. How can I configure the REST Get snap to pass in parameters and execute the triggered task using the secured SnapLogic URL? Does this go against best practices?

NB: The pipeline being executed does not need to return any data to the parent pipeline.
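Triggered tasks accept pipeline parameters as URL query parameters with the same names as the parameters defined on the target pipeline, so in the REST Get snap you point the Service URL at the task URL, add the parameters under Query parameters, and attach a Basic Auth account. The sketch below shows the equivalent HTTP call outside SnapLogic; the pod, org, project, and task names are hypothetical placeholders.

```python
import requests
from requests.auth import HTTPBasicAuth


def task_url(pod, org, *path):
    # Build the secured triggered-task URL; the path segments mirror
    # Org/Project(/Folder)/TaskName as shown in the task's details.
    return f"https://{pod}/api/1/rest/slsched/feed/" + "/".join((org,) + path)


def trigger(url, params, user, password):
    # Pipeline parameters defined on the target pipeline are passed as
    # query parameters with matching names; the snap's Basic Auth
    # account plays the role of the credentials here.
    resp = requests.get(url, params=params,
                        auth=HTTPBasicAuth(user, password), timeout=900)
    resp.raise_for_status()
    return resp.status_code


# Hypothetical usage:
# trigger(task_url("elastic.snaplogic.com", "MyOrg", "MyProject", "MyTask"),
#         {"run_date": "2024-01-31"}, "user@example.com", "secret")
```

Since the child pipeline returns no data here, the parent only needs to check the HTTP status of the response.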
REST Get Slow

Hey guys, I have built a simple REST Get process to grab some data from an API. There are about 100 URLs to call, but it is taking ages: the 100 calls take about 7 minutes, whereas another application completes the same 100 calls in about 25 seconds. I have tried using the REST Get snap in a child pipeline and increasing the pool size, but I'm still nowhere near the speed I would expect. Any suggestions?
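7 minutes for 100 calls is roughly 4 seconds per call, which usually means the calls are running back-to-back, one per input document, rather than concurrently. The usual fix is the one described above (child pipeline holding the REST Get, driven by a Pipeline Execute with Pool Size > 1), so if that is already configured, check that the pool size is actually taking effect and that each call isn't paying connection-setup overhead. As a point of comparison, this is the same concurrency model sketched in plain Python:

```python
import concurrent.futures

import requests


def fetch_json(url):
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()


def fetch_all(urls, fetch=fetch_json, workers=10):
    # Run up to 'workers' calls at once; results come back in the same
    # order as 'urls'. This mirrors a Pipeline Execute with Pool Size =
    # 'workers' feeding a child pipeline that holds the REST Get snap.
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(fetch, urls))
```

With 10 workers and ~4 s per call, 100 URLs should finish in well under a minute, so if the pipeline still takes 7 minutes the pool setting is likely not being applied.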
Setting content-length in REST PUT snap for SharePoint API

Hi gang, hoping this awesome community can help me figure this out. Our team is trying to reconstruct some pipelines for uploading documents to SharePoint. We hit a snag after receiving an OAuth 2 token and creating a SharePoint upload session (both of those work great), when we attempt to upload the actual file content (this is what fails). When we explicitly set the content-type header in the snap, we get an error saying the content-length header already exists. When we remove the content-type header, we get an error from SharePoint saying the content-length header is missing. Has anyone else run into this issue? How do you properly set the content-length header in the REST PUT snap?

Cheers, Mark
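The "already exists" error is a strong hint that Content-Length is being set twice: the HTTP layer computes it from the body automatically, so setting it by hand collides with the generated value. For a SharePoint upload session, the header you do set explicitly per chunk is Content-Range. A hedged sketch of the pattern, outside SnapLogic (the upload-session URL comes from the createUploadSession response):

```python
import requests


def content_range(offset, size, total):
    # Header value for one chunk: "bytes <first>-<last>/<total>".
    return f"bytes {offset}-{offset + size - 1}/{total}"


def upload_chunk(upload_url, chunk, offset, total):
    # Content-Length is derived from the body by the HTTP client and
    # must not be set by hand; only Content-Range (plus the raw bytes)
    # is supplied explicitly for an upload-session PUT.
    headers = {"Content-Range": content_range(offset, len(chunk), total)}
    resp = requests.put(upload_url, data=chunk, headers=headers, timeout=120)
    resp.raise_for_status()
    return resp
```

In the REST PUT snap the equivalent would be: leave Content-Length out of the header table entirely and make sure the snap is sending a real binary body (if the body is empty, the computed Content-Length may be missing or zero, which would explain SharePoint's "missing" complaint).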
Problems with the Snaplogic "Import a Project" API

I exported a project as a zip file and saved it locally, but I'm having trouble getting the SnapLogic "Import a Project" API to work. I've tried calling it from Postman and from a Python script, but I keep getting a 400 - BAD REQUEST response with the error "Expecting body argument for endpoint import_project: file_handle". So I assume something is wrong with how I'm trying to send the file, but I can't figure out what. Here is an example of a Python script I've tried:

```python
import os

import requests
from requests.auth import HTTPBasicAuth

pod_path = 'elastic.snaplogic.com'
import_path = 'Partners/Company/x_Imported'
url = f'https://{pod_path}/api/1/rest/public/project/import/{import_path}'
params = {'duplicate_check': 'true'}
headers = {'Content-Type': 'multipart/form-data'}

dirname = os.path.dirname(__file__)
filename = os.path.join(dirname, 'project_files/xImport.zip')
file_obj = open(filename, "rb")
files = {'file': ('xImport.zip', file_obj, 'application/zip')}

r = requests.post(url,
                  auth=HTTPBasicAuth(<username>, <password>),
                  headers=headers,
                  params=params,
                  files=files)
```

Any ideas about what's wrong with this script?
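Two things stand out in the script above, offered as a hedged diagnosis. First, setting `Content-Type: multipart/form-data` by hand is a known requests pitfall: requests normally generates that header itself with a `boundary=...` parameter, and a hand-written value has no boundary, so the server cannot parse the multipart body at all. Second, the error names the missing part `file_handle`, which suggests the form-field name should be `file_handle` rather than `file` (that rename is an inference from the error text, not confirmed from documentation). A sketch with both changes:

```python
import os

import requests
from requests.auth import HTTPBasicAuth


def import_url(pod, project_path):
    return f"https://{pod}/api/1/rest/public/project/import/{project_path}"


def import_project(pod, project_path, zip_path, user, password):
    url = import_url(pod, project_path)
    with open(zip_path, "rb") as file_obj:
        # 1) No explicit Content-Type header: requests generates
        #    'multipart/form-data; boundary=...' on its own, and a
        #    hand-written value would lack the boundary.
        # 2) 'file_handle' as the part name is inferred from the
        #    "Expecting body argument ... file_handle" error message.
        files = {"file_handle": (os.path.basename(zip_path), file_obj,
                                 "application/zip")}
        resp = requests.post(url,
                             auth=HTTPBasicAuth(user, password),
                             params={"duplicate_check": "true"},
                             files=files)
    resp.raise_for_status()
    return resp
```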
Github API Pagination

Hey community, this is my first attempt at getting pagination working on a REST call, and I'm a bit stuck on how to fill in the "Has Next" and "Next URL" options. The API response contains the next URL, but it also contains the following URL. What do I need to put in each of the pagination options to get this working?
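In the REST Get snap, "Has next" is a boolean expression evaluated against each response and "Next URL" produces the URL for the next request. With GitHub the complication is that the `Link` response header carries several URLs at once (`rel="next"`, `rel="last"`, sometimes `rel="prev"`/`rel="first"`), so the expression has to pick out only the `rel="next"` entry. The parsing logic, shown in Python (how the header is exposed in the snap, e.g. `$headers.link` vs. `$headers.Link`, may vary by release, so check one response in a preview first):

```python
import re


def next_url(link_header):
    """Return the rel="next" URL from a GitHub-style Link header, or None.

    A Link header carries several URLs at once, for example:
      <https://api.github.com/repositories/1/issues?page=2>; rel="next",
      <https://api.github.com/repositories/1/issues?page=5>; rel="last"
    Only the rel="next" entry should feed the snap's Next URL option.
    """
    if not link_header:
        return None
    match = re.search(r'<([^>]+)>;\s*rel="next"', link_header)
    return match.group(1) if match else None
```

Translated to the snap, that would be roughly: Has next = the Link header exists and contains `rel="next"`; Next URL = the URL extracted from the same header (e.g. with a `match()`-style expression). On the last page GitHub omits `rel="next"`, so the Has next expression naturally turns false and pagination stops.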
Unable to handle echo message

Hi all, I'm encountering this error for the first time. We have an ultra task API pipeline running, and today when I was testing the API it returned an "Unable to handle echo message" error. Does anyone know what that means? I also saw in the dashboard that we sometimes lose contact with the Snaplex nodes, so I searched the community and found this post ("Lost contact with Snaplex node", Designing Pipelines):

"I have a very small pipeline (3 snaps) that reads from a SQL table and writes the results to a new table in the same DB. I keep getting an error: Lost contact with Snaplex node while the pipeline was running. The select statement pulls 250M+ records and I'm using the Azure bulk insert to write. In order to avoid this error I keep having to reduce the batch size, from 20K to 10K to now 8K. Any thoughts on what could be causing the error?"

But I have already implemented a Pipeline Execute, and the CPU usage is very low. Does anyone recognize the first error?

Regards, Jens
API capture POST/PUT content

Hi, I was wondering how to capture the content of a PUT/POST call so I can update/insert the data into a database. I know how to capture the URL through pipeline parameters (PATH_INFO), which makes all my GET requests work, but now I'm trying to find a way to capture the body of a PUT/POST request. I searched the pipeline parameters but only found QUERY_STRING, and it doesn't capture the content. Does anyone know a solution to this?

Regards, Jens
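Unlike PATH_INFO and QUERY_STRING, the request body of a triggered (or ultra) task is not delivered through a pipeline parameter: it arrives as the pipeline's input document, via an open (unlinked) input view on the first snap. So the receiving pipeline reads the body as a regular document and maps it into the database snap; no parameter is involved. From the caller's side, POSTing JSON to the task looks like the sketch below (the task URL is a hypothetical placeholder):

```python
import json

import requests
from requests.auth import HTTPBasicAuth

# Hypothetical triggered-task URL.
TASK_URL = ("https://elastic.snaplogic.com/api/1/rest/slsched/feed/"
            "MyOrg/MyProject/MyTask")


def json_body(record):
    # Serialize the record; inside the pipeline, this object shows up
    # as the input document on the first snap's open input view.
    return json.dumps(record)


def post_record(record, user, password):
    resp = requests.post(TASK_URL,
                         data=json_body(record),
                         headers={"Content-Type": "application/json"},
                         auth=HTTPBasicAuth(user, password))
    resp.raise_for_status()
    return resp
```

Sending Content-Type: application/json matters here: it tells the task endpoint to parse the body into a document rather than treat it as opaque bytes.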