
SQS pipeline stalling

ksalem
New Contributor III

Hi All,

I am facing a peculiar issue with a new pipeline being built. The pipeline is set up to consume an SQS message > apply some processing and auditing (via a MySQL DB) > acknowledge the SQS message.

The pipeline runs correctly when the SQS Consumer Snap's message count is set to 1; however, the intention is for this pipeline to run permanently and constantly consume messages from the SQS queue.

When the message count property is set to -1, the pipeline does appear to work correctly until it reaches the first audit step, where a 'MySQL Execute' Snap is intended to insert an audit entry into a table (just a basic insert). In this instance the pipeline reaches this Snap and indicates one input document but no output document or error document. (The MySQL Execute is set to pass through and is not set to ignore empty results, so an output of some kind should always be expected.) See attached screenshot:

[Screenshot: ksalem_0-1739984190631.png]

I was unable to understand why this works when the SQS message count brings in exactly one message, but seems to fail silently here when set to -1. The pipeline continues to run and further messages are picked up by the SQS Consumer, but they are always held up at the MySQL step.
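For illustration, here is a minimal Python sketch of the kind of mechanism that can produce this exact symptom (this is not SnapLogic code; the writer class, table, and column names are made up, and it uses an in-memory SQLite database purely to stay self-contained): a batch-buffered database writer holds rows client-side and only flushes once the batch size is reached, so a continuous one-message-at-a-time stream never triggers an INSERT.

```python
import sqlite3

# Hypothetical sketch of a batch-buffered writer, not SnapLogic's internals.
# Rows accumulate in a client-side buffer and are only flushed to the
# database once `batch_size` rows have been collected.
class BatchedWriter:
    def __init__(self, cursor, batch_size=50):
        self.cursor = cursor
        self.batch_size = batch_size  # 50 mirrors a common batching default
        self.buffer = []

    def write(self, row):
        self.buffer.append(row)
        if len(self.buffer) >= self.batch_size:
            self.flush()  # nothing reaches the table before this point

    def flush(self):
        if self.buffer:
            self.cursor.executemany(
                "INSERT INTO audit (message_id, status) VALUES (?, ?)",
                self.buffer,
            )
            self.buffer.clear()

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE audit (message_id TEXT, status TEXT)")

writer = BatchedWriter(cur, batch_size=50)
writer.write(("msg-1", "received"))  # buffered only; no INSERT has run
print(cur.execute("SELECT COUNT(*) FROM audit").fetchone())  # (0,) -> looks stalled
```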

Many thanks,

Kareem.

1 ACCEPTED SOLUTION

ksalem
New Contributor III

It turned out that this was a blocking Snap caused by the default configuration of the MySQL account: it defaults to sending inserts in batches of 50. Changing the batch size to 1 allows for immediate DB calls.
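Continuing the hypothetical sketch from the question above (again, an illustration rather than SnapLogic's actual implementation), dropping the batch size to 1 makes every write flush immediately:

```python
# Reuses BatchedWriter, conn, and cur from the sketch in the question above.
writer = BatchedWriter(cur, batch_size=1)
writer.write(("msg-1", "received"))  # buffer fills instantly and flushes
print(cur.execute("SELECT COUNT(*) FROM audit").fetchone())  # (1,) -> insert is visible at once
```

The trade-off is one database round trip per document, which is usually fine for a low-volume audit table but worth keeping in mind for high-throughput queues.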


2 REPLIES

ksalem
New Contributor III

I would also add that removing the first 'MySQL Execute' Snap allows the pipeline to continue, but it is then blocked at the next MySQL Execute.

I wonder if this Snap is blocking and requires all input documents before it can execute (like a Group By or Sort Snap).
