Re: Assign a repeating ID

For one way to do this with the available snaps, please see the attached sample pipeline: Community.17134.slp (5.7 KB)

Re: Json conversion to new object

I think you can add your filter expression within the complete expression where the new object is being created:

{"h1": { "h2": {f: $.entries().map(x => {"@name": x[0], "$": x[1]}.filter((val, key) => (val != ''))) }}}

Re: Json conversion to new object

@skhatri, maybe the following expression will help get you close:

{"h1": { "h2": {f: $.entries().map(x => {"@name": x[0], "$": x[1]}) }}}

This expression combines the Object entries() and Array map() functions to build out the transformation, and it embeds the result in a hard-coded wrapper object for the complete body. The attached pipeline shows the expression in practice: Community.17106.slp (4.7 KB)

I hope this helps.

Re: ServiceNow Query Snap - Pass through does not work for empty result

You might consider disabling the Allow Empty Query Result option and enabling an error view, which would capture the pass-through data in the event that no results are returned. Then follow up with a Union snap and mapping as needed. Something like this:

Re: How to Convert Timestamp from 1 min to 5 min

There are probably a dozen or more ways to do this, but, not knowing your full requirements, I'll just give a couple of expression-language examples that I worked up to produce minimal output. My output is just integers that you could plug back into whatever format or object structure you need for downstream aggregation.
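As a plain-JavaScript sketch of the core idea (SnapLogic's expression language is similar but not identical, and the timestamp format below is an assumption for illustration), the minute can be extracted with a regex and rounded up to the next multiple of five:

```javascript
// Sketch of the rounding idea in plain JavaScript.
// Assumed input format: "MM/dd/yyyy HH:mm:ss" -- adjust the regex
// to match your actual timestamps.
function toFiveMinuteBucket(inputDate) {
  // Capture the two-digit minute between the first and second colons.
  const minute = parseInt(inputDate.replace(/^[^ ]+ \d+:(\d{2}):.+$/, "$1"), 10);
  // Round up to the next multiple of five (7 -> 10, 12 -> 15).
  // Note: an exact multiple moves to the NEXT bucket (10 -> 15),
  // matching the arrow-function expression; adjust if you need 10 -> 10.
  return minute + (5 - (minute % 5));
}

console.log(toFiveMinuteBucket("01/15/2024 10:07:30")); // 10
```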
The cleanest one that I like so far (assuming the math works for your use case) uses an arrow function with a String replace() regex to parse the minute from the input:

(x => x + (5 - (x % 5)))(parseInt($inputDate.replace(/^[^ ]+ \d+:(\d{2}):.+$/, "$1")))

But if that math, or some variation of it, doesn't iron out for your needs, then you could use the match operator with the same parsing method, in a way similar to this:

match parseInt($inputDate.replace(/^[^ ]+ \d+:(\d{2}):.+$/, "$1")) { 0..<5 => 5, 5..<10 => 10, 10..<15 => 15, _ => 'error' }

Both examples convert the minute part into an integer for simpler processing. Obviously, if your input time formats are different, you will need to tweak the parsing to your needs. Here is the pipeline I tested the ideas with: Community.17000_v1.slp (3.8 KB)

I hope this helps.

Re: Fetching inputs from basic auth account

@aditya.gupta41, my first thought is that you have variables in your HTTP Entity code that do not have values from upstream; REST Post is likely not null-safe. I posted my sample code based on your originally posted code, so if you're using my sample, you need to make sure $credentials.application is populated upstream, modify the variable for your case, or remove it from your test. My second thought is that you may have put a '$' in front of 'account.username' or 'account.password', causing it to be treated as a variable instead of the account object. If neither of those is the case, can you post the contents of the HTTP Entity?

Re: Fetching inputs from basic auth account

@aditya.gupta41, I tested a few things with the SOAP snap, but I could not get my ideas to work. However, I had more success testing an idea with the REST POST snap, which does allow you to access the account.username and account.password fields of a REST Basic Auth account.
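The technique boils down to concatenating the account fields into the SOAP body string. As a plain-JavaScript sketch (in SnapLogic, account.username and account.password are resolved from the REST Basic Auth account and $credentials.application from the upstream document; here they are ordinary parameters for illustration):

```javascript
// Sketch of building a SOAP login envelope by string concatenation,
// mirroring what an HTTP Entity expression would do. The envelope
// content is simplified for illustration.
function buildLoginEnvelope(username, password, application) {
  return '<env:Envelope xmlns:env="http://www.w3.org/2003/05/soap-envelope">' +
    '<env:Header/><env:Body><credentials>' +
    '<username>' + username + '</username>' +
    '<password>' + password + '</password>' +
    '<application>' + application + '</application>' +
    '</credentials></env:Body></env:Envelope>';
}

const body = buildLoginEnvelope("jdoe", "s3cret", "MyApp");
console.log(body.includes("<username>jdoe</username>")); // true
```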
You could potentially use the REST POST snap with the SOAP endpoint URL as the Service URL value, add the basic auth account to the snap, and use the following as an expression in the HTTP Entity field:

'<env:Envelope xmlns:env="http://www.w3.org/2003/05/soap-envelope" xmlns:ns0="http://siemens.com/agilews"> <env:Header/> <env:Body> <ns0:login xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/" SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/"> <credentials> <username>' + account.username + '</username> <password>' + account.password + '</password> <application>' + $credentials.application + '</application> </credentials> </ns0:login> </env:Body> </env:Envelope>'

(Note 1: I tried the same with the HTTP Client snap, but it didn't seem to have access to the account properties.)

(Note 2: The account properties feature of the REST snaps is not publicly documented as far as I can tell, but it has been referenced by SnapLogic employees in multiple community posts, so keep that consideration in mind. I use it in multiple scenarios for hiding API keys.)

Re: Unique Constraint Issue in Oracle while inserting Data

@sshaik, I don't have an Oracle DB with which I can test this suggestion, so consider it with that disclaimer; but I think what you may need to do is modify your WHERE query to be a NOT EXISTS subquery. Something similar to:

WHERE NOT EXISTS (SELECT NULL FROM Schema.tablename WHERE DEST = $DEST AND LVL = $LVL)

Re: Insert Special character in sql server table

@rajesh_mangipudi, I confess I am not an expert on this process, so my offering of information is based on what worked for me. I originally had to experiment and lean on the snap documentation and the Microsoft BCP documentation with some trial and error, but I eventually got things to work for our case. However, there are a few things I failed to mention in my post above. You must install each of the required executables on each node of your groundplex.
This includes the required ODBC version and BCP. The executable is launched on the node on which the pipeline is running; this also includes the bcp.bat file you create, which must be installed on each node. For my case, I let the installers install the applications to their default locations on the C: drive. However, we keep most of our service applications and dependencies, such as SnapLogic and Java, on the D: drive, which is where I created the bcp.bat file (D:\opt\BCP\bcp.bat). This is the path that I put into the snap configuration. I can't confirm this will work for others, but I can confirm that it works for us at this time.

Re: Insert Special character in sql server table

@ash42, sorry about my assumption. We run the BCP tool for the SQL Server Bulk Load snap on Windows servers on groundplexes; I am not aware of an available option for Linux servers, hence my assumption. The path I provided is the standard install location and command-line path at the time I installed BCP on Windows a year or so ago. Sorry, I must defer to someone who is familiar with using the snap on Linux.