04-12-2017 01:40 PM
Can anyone come up with a creative way to initiate a pipeline based on receiving an email?
I would set some rules in the email account to auto-route it to a sub-folder. We could have a scheduled task to read the email folder, but I would rather have a push mechanism instead of an interval pull.
04-13-2017 11:29 AM
Interesting question. I’m asking our Field team if they have any suggestions. It may also depend on which mail server you are using.
04-13-2017 12:27 PM
We use Gmail if that helps.
04-14-2017 01:35 PM
If you are using Google/Gmail as your mail provider, Gmail’s Push Notification feature can use the Cloud Pub/Sub API: you instruct Gmail to publish a notification to a Topic whenever an email delivery matches a certain condition (e.g. the new email has a particular label or lands in a sub-folder). Outside of that, you would have to fall back to polling the Users.messages:list API and managing the state yourself.
If using the Pub/Sub API, subscribers to a topic can then either pull and acknowledge messages, or in turn push the message(s) to another endpoint.
Pulling from that topic subscription can be done with scheduled tasks, and the REST Snap Pack with OAuth 2.0 Account.
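As a rough illustration of the pull-and-acknowledge model (outside of a SnapLogic pipeline), here is a minimal Python sketch of the two Pub/Sub REST calls a scheduled task would make. The project and subscription names, and the way the access token is obtained, are assumptions for the example:

```python
import requests

# Assumptions (hypothetical values): an OAuth 2.0 access token with the
# https://www.googleapis.com/auth/pubsub scope, and a subscription named
# "gmail-notifications" in the project "my-project".
ACCESS_TOKEN = "ya29...."          # obtained via your OAuth 2.0 flow
SUBSCRIPTION = "projects/my-project/subscriptions/gmail-notifications"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Pull up to 10 undelivered messages from the subscription.
pull = requests.post(
    f"https://pubsub.googleapis.com/v1/{SUBSCRIPTION}:pull",
    headers=HEADERS,
    json={"maxMessages": 10},
)
received = pull.json().get("receivedMessages", [])

# Acknowledge what was pulled so Pub/Sub does not redeliver it.
if received:
    requests.post(
        f"https://pubsub.googleapis.com/v1/{SUBSCRIPTION}:acknowledge",
        headers=HEADERS,
        json={"ackIds": [m["ackId"] for m in received]},
    )
```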
First, the prerequisites defined by Google must be fulfilled:
- Pub/Sub Publisher permission granted to serviceAccount:gmail-api-push@system.gserviceaccount.com (sketched below)
- Pub/Sub Subscriber permission granted to the email account(s) you want subscribed to the email notifications.

Setting up the OAuth 2.0 Account is very similar to the instructions in the Connecting SaaS Providers with SnapLogic’s OAuth-enabled Snaps blog post. The relevant scope values are “https://www.googleapis.com/auth/gmail.readonly https://www.googleapis.com/auth/pubsub”.
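For the first prerequisite, a minimal sketch of the Publisher grant using the Pub/Sub setIamPolicy REST method might look like the following. The project and topic names are hypothetical, and a real setup should merge the binding into the existing policy (via getIamPolicy) rather than overwrite it:

```python
import requests

# Hypothetical names: project "my-project", topic "gmail-notifications".
# The access token needs the https://www.googleapis.com/auth/pubsub scope.
ACCESS_TOKEN = "ya29...."
TOPIC = "projects/my-project/topics/gmail-notifications"

# Grant the Gmail push service account permission to publish to the topic.
# Note: setIamPolicy replaces the whole policy, so in practice you would
# first getIamPolicy and merge this binding into the existing bindings.
policy = {
    "policy": {
        "bindings": [
            {
                "role": "roles/pubsub.publisher",
                "members": [
                    "serviceAccount:gmail-api-push@system.gserviceaccount.com"
                ],
            }
        ]
    }
}
requests.post(
    f"https://pubsub.googleapis.com/v1/{TOPIC}:setIamPolicy",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=policy,
)
```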
Then a watch request needs to be executed semi-regularly on the topic to maintain the subscription:
robin-community-gmail-watch_2017_04_14.slp (4.2 KB)
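For reference, the underlying users.watch call looks roughly like the sketch below (the topic name and token handling are placeholder assumptions). Gmail expires a watch after about 7 days and Google recommends re-issuing it daily, which is why a scheduled pipeline suits this step:

```python
import requests

# Hypothetical values: the topic created earlier and an OAuth 2.0 token
# with the gmail.readonly scope for the mailbox being watched.
ACCESS_TOKEN = "ya29...."
WATCH_BODY = {
    "topicName": "projects/my-project/topics/gmail-notifications",
    "labelIds": ["INBOX"],          # only notify for messages in this label
    "labelFilterAction": "include",
}

resp = requests.post(
    "https://www.googleapis.com/gmail/v1/users/me/watch",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=WATCH_BODY,
)
print(resp.json())  # contains the current historyId and the watch expiration
```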
Another pipeline can then poll the subscription and receive notifications when changes have occurred on the topic. The issue here is that Gmail’s API is designed for synchronization, meaning the notifications the topic receives are historyIds (just pointers to the fact that something changed).
To find out what changed, a cache of previous historyIds needs to be maintained and used to query the history.list API. For each history event (e.g. messagesAdded), you then need to query the Users.messages:get API to get the actual email content. The cache should then be updated with the newly received historyId.
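Put together, the synchronization step for each notification looks roughly like the following sketch. The token handling and the historyId cache are simplified assumptions; the endpoints and fields are those of the Gmail history.list and messages.get APIs:

```python
import base64
import json
import requests

ACCESS_TOKEN = "ya29...."   # hypothetical OAuth 2.0 token (gmail.readonly)
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
GMAIL = "https://www.googleapis.com/gmail/v1/users/me"

def handle_notification(pubsub_message, cached_history_id):
    # The Pub/Sub message data is base64-encoded JSON, e.g.
    # {"emailAddress": "user@example.com", "historyId": 123456}.
    payload = json.loads(base64.b64decode(pubsub_message["data"]))

    # Ask Gmail what happened since the historyId we last processed.
    history = requests.get(
        f"{GMAIL}/history",
        headers=HEADERS,
        params={"startHistoryId": cached_history_id,
                "historyTypes": "messageAdded"},
    ).json()

    # For every added message, fetch the actual email content.
    for record in history.get("history", []):
        for added in record.get("messagesAdded", []):
            msg_id = added["message"]["id"]
            message = requests.get(f"{GMAIL}/messages/{msg_id}",
                                   headers=HEADERS).json()
            print(message.get("snippet"))

    # Persist the new historyId so the next notification diffs from here.
    return payload["historyId"]
```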
Since effective use of Gmail requires some level of state tracking, this can result in a pipeline that is a little busy, but the following crude example shows that it is possible to use a pipeline to read new email messages:
robin-community-gmail-pull_2017_04_14.slp (26.7 KB)
As for utilizing a Push Subscription model for your preferred non-polling solution, this would obviously suit an Ultra pipeline very well, but it is complicated by Google’s restrictions: the push endpoint must sit on a domain that is owned and controlled by you, and it must be secured by a non-self-signed SSL/TLS certificate.
I haven’t investigated this yet, but I imagine it would be possible by setting up a custom API gateway (e.g. using App Engine) or a reverse proxy, registered and available at a domain you control, that handles the secure redirection to the Ultra pipeline URL.
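For what it’s worth, a minimal sketch of such a relay, written here with Flask and entirely hypothetical URLs and tokens, might look like this:

```python
# A rough sketch of a relay endpoint hosted at a domain you own: Pub/Sub
# pushes the Gmail notification here, and the relay forwards it on to the
# Ultra pipeline. The Ultra URL and token below are placeholders.
from flask import Flask, request
import requests

app = Flask(__name__)

ULTRA_URL = "https://elastic.snaplogic.com/api/1/rest/feed/.../ultra-task"
ULTRA_TOKEN = "..."

@app.route("/pubsub/gmail", methods=["POST"])
def relay():
    # The Pub/Sub push envelope contains the base64-encoded Gmail
    # notification under "message"; pass the whole envelope along.
    envelope = request.get_json()
    requests.post(
        ULTRA_URL,
        headers={"Authorization": f"Bearer {ULTRA_TOKEN}"},
        json=envelope,
        timeout=10,
    )
    # Returning a 2xx acknowledges the push so Pub/Sub stops retrying it.
    return ("", 204)
```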