Forum Discussion
I am having the same issue. The connection appears to authorize when I set it up,
but when I go to execute, I get the same error:
Error occurred while querying the daily job limits with url
@fmdf typically an error from Salesforce with “Error occurred while querying the daily job limits with url” means the Salesforce account used for the API call does not have permission to fetch limits. You can read more about Salesforce Bulk API limits in their Bulk API Introduction documentation, and in more detail in the Salesforce API Reference. The full error stack in SnapLogic can often point to exactly which permission you’re missing; if you’re able to share it, we may be able to provide some additional guidance.
@swright Yes, this is one of the more challenging parts of SnapLogic, which I ran into during our SOA-to-SnapLogic migration. There are workarounds, but they make your pipeline lengthy:
- Use a Copy snap at the point where you have the exact file name. You may need multiple output views, depending on the requirement.
- Then use a Gate snap just before the snap where you want to reuse the file name from the File Reader.
- After the Gate, either use a Mapper or reference the file name directly in your writer snap (see the sketch after this list).
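As a rough sketch of that last step: the Gate snap collects each input view's documents into arrays (typically input0, input1, and so on). Assuming the file name was routed into the Gate's second input view under a field called filename (both the view wiring and the field name are illustrative), a downstream Mapper or writer expression could look something like:
$input1[0].filename
or, to build a new target path from it:
"out_" + $input1[0].filename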
Using passthru results in some messy JSON that can complicate things by the time it gets to the end, and it seems that some snaps don’t have a passthru option.
I’m not sure how I can use jsonPath($, "$..PARAMNAME") for this issue. Can I pass the variable somehow using this?
Thanks.
You define PARAMNAME at the beginning of the pipeline, use passthru where possible, and when you want to use the PARAMNAME later, you use jsonpath to get it.
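For example, if PARAMNAME is still carried somewhere in the passed-through document, the jsonPath() function can dig it out of the nested structure. A hedged sketch, assuming the field survives under its original name (jsonPath returns a list of matches, hence the [0]):
jsonPath($, "$..PARAMNAME")[0]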
Another possibility is to capture the PARAMNAME value and send it through a Copy snap with multiple outputs, then use a Join snap to bring the value back into the pipeline at the later points where you need it.
Depending on what you’re trying to do later in your pipeline, the approaches that don’t involve Pipeline Execute are either awkward or not possible.
With Pipeline Execute, this becomes straightforward. I’d urge your organization to reconsider its decision to avoid one of our product’s most powerful and useful features.
Hey Scott,
As mentioned already, there are several complex options that can be used as a workaround. Those mentioned already are as good as any, so I’ll refrain from adding more. I also agree with Patrick that it may behoove your organization to reconsider its position on the Pipeline Execute snap. However, based solely on your example of needing a dynamically generated date stamp that stays constant throughout your pipeline, this may be a less clunky option for you:
Try using pipe.startTime (or a formatted derivative) as part of your filename expression.
"test_" + pipe.startTime.toLocaleDateTimeString({"format":"yyyyMMdd"}) + ".csv"
I believe pipe.startTime remains constant throughout the pipeline run, so this might provide what you need for your specific use case. It won’t solve other pass-through scenarios, though, which is really what the title of your post is about. However, another option for those scenarios might be to place your dynamic expression in a pipeline parameter and use the eval() function to evaluate the parameter. I have used this technique before. As long as the expression’s result doesn’t change during the run, it should work.
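A minimal sketch of that eval() approach, assuming a pipeline parameter named filenameExpr (the name is illustrative) whose value is the expression text itself:
filenameExpr = "test_" + pipe.startTime.toLocaleDateTimeString({"format":"yyyyMMdd"}) + ".csv"
Then any snap that needs the file name can evaluate the parameter (pipeline parameters are referenced with a leading underscore):
eval(_filenameExpr)
Because pipe.startTime is fixed for the run, every snap that evaluates the parameter should get the same value.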
- gmcclintock (New Contributor, 5 years ago)
What we’ve done is build “parent” pipelines to generate this type of value, then pass it to the child so it’s available as a pipeline parameter that can be reused anywhere - this also allows us to parallelize multiple jobs.
For example, we have one parent and one child pipeline for Salesforce CDC: the parent grabs the “job” record from our DB that has the case object, which connection to use, the storage path, and the date-range filters, and then passes those all to a single child pipeline for execution at the same time.
We’ve done this for other cases like yours as well, just to keep things clean and know where to look, and we’ve found it a bit easier to manage than chasing the JSON through pass through.
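To make the reuse concrete: once the parent’s Pipeline Execute passes those values in, the child refers to them as pipeline parameters (leading underscore) wherever they’re needed. A hedged sketch with illustrative parameter names (storagePath, objectName, startDate are not from the original post):
_storagePath + "/" + _objectName + "_" + _startDate + ".csv"
The same parameters are visible to every snap in the child, so nothing has to be threaded through the document stream.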