August 23, 2017
Posted in Automation, File Sync, FTP

Amazon S3 is one of the most popular object-based storage repositories for archiving and backup due to its low cost and global accessibility. But one challenge you may face with S3 is finding an easy way to automatically sync files between your S3 buckets and your other repositories, such as FTP servers, email attachment servers, Microsoft Azure, and others.

You may have already tried to solve some of your file syncing scenarios with the AWS Command Line Interface or other scripting tools, but these become cumbersome to create and manage as the number of workflows grows and the processes become more complex. Additionally, scripts don’t keep a detailed transaction history of the files being synced between each repository — you have to build that logging yourself. But what if there was a way you could schedule file synchronizations to and from your Amazon S3 storage without leaving your web browser? Luckily, there is a solution.
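To see why the scripted approach gets unwieldy, consider what even a basic one-way sync looks like by hand. The sketch below is illustrative only: the bucket name, folder paths, and CSV "transaction log" are hypothetical placeholders, and it assumes the boto3 SDK with AWS credentials already configured. Retries, error handling, scheduling, and history tracking are all extra code you have to own and maintain.

```python
# Minimal sketch of a hand-rolled sync script (Python 3 + boto3).
# Bucket, paths, and log file are placeholders. A rough AWS CLI
# equivalent would be: aws s3 sync /data/outbound s3://example-archive-bucket/archive/
import csv
import os
from datetime import datetime, timezone

import boto3

BUCKET = "example-archive-bucket"   # hypothetical S3 bucket
LOCAL_DIR = "/data/outbound"        # hypothetical staging folder
LOG_PATH = "/var/log/s3_sync.csv"   # the "transaction history" you must build yourself

s3 = boto3.client("s3")

with open(LOG_PATH, "a", newline="") as log:
    writer = csv.writer(log)
    for name in os.listdir(LOCAL_DIR):
        path = os.path.join(LOCAL_DIR, name)
        if not os.path.isfile(path):
            continue
        # Upload each file; retries and scheduling (e.g., a cron entry)
        # are additional code left entirely to you.
        s3.upload_file(path, BUCKET, f"archive/{name}")
        writer.writerow([datetime.now(timezone.utc).isoformat(), name, BUCKET])
```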

Thru’s file transfer automation solution, OptiFLOW, is the only cloud-based solution of its kind that allows you to create, schedule and manage file transfer workflows among various storage repositories, all without any scripting, file size limits, or leaving your web browser. Even better, all syncs of Amazon S3 storage are tracked in Thru and notifications can be sent to your users, devices and systems after syncing is complete.

Below is a quick demo from our webinar, The Future of File Transfer Automation, showing how OptiFLOW can be used to automatically sync files into your Amazon S3 storage from SFTP servers.

Note: OptiFLOW can perform various other file transfer automation scenarios. To learn more about what OptiFLOW can do, go here.

Scenario: Schedule Daily File Transfers from an SFTP Server to Amazon S3

 
Sync Files from SFTP to Amazon S3 Storage

The video you are about to watch shows a scenario where a company needs to schedule a daily transfer of Purchase Order files: pull them from an SFTP server, store them in Thru Cloud Storage, and then store a copy in Amazon S3 for archiving (a minimal script-level sketch of the same pipeline follows below). You can read the video transcript beneath the demo as you follow along.
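For orientation, here is roughly what that pipeline looks like expressed as a standalone script. This is a sketch of the data flow, not Thru's implementation: the host, credentials, bucket, and paths are all placeholders, a local staging directory stands in for Thru Cloud Storage, and it assumes the paramiko and boto3 packages are installed.

```python
# Sketch of the demo scenario: pull Purchase Order files from an SFTP
# server, stage them locally (stand-in for Thru Cloud Storage), then
# archive a copy to Amazon S3. All names below are hypothetical.
import os

import boto3
import paramiko

SFTP_HOST = "sftp.example.com"      # placeholder SFTP server
REMOTE_DIR = "/outbound"            # where the purchase orders land
STAGING_DIR = "/tmp/staging"        # stand-in for the intermediate store
BUCKET = "example-archive-bucket"   # placeholder S3 archive bucket

os.makedirs(STAGING_DIR, exist_ok=True)

# 1. Pull the files from the SFTP source.
transport = paramiko.Transport((SFTP_HOST, 22))
transport.connect(username="example-user", password="example-pass")
sftp = paramiko.SFTPClient.from_transport(transport)
for name in sftp.listdir(REMOTE_DIR):
    sftp.get(f"{REMOTE_DIR}/{name}", os.path.join(STAGING_DIR, name))
sftp.close()
transport.close()

# 2. Archive a copy of each staged file to S3.
s3 = boto3.client("s3")
for name in os.listdir(STAGING_DIR):
    s3.upload_file(os.path.join(STAGING_DIR, name), BUCKET, f"archive/{name}")

# 3. Running it daily is a separate concern, e.g. a cron entry like:
#    0 2 * * * /usr/bin/python3 /opt/scripts/sftp_to_s3.py
```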

Video Transcription

Let’s go on to a demo where we can replicate that workflow [syncing files from SFTP server to an Amazon S3 bucket]. Okay so here I am and I’m just going to move a file via SFTP into an SFTP server. Let’s take this “Purchase Order 101” and we’re going to drag and drop it over here into the source. So while this file gets transferred we can now go and look at OptiFLOW itself.

So here’s the control panel that we set up. The OptiFLOW™ Control Panel is divided into sections that show each stage of the workflow. Workflows can be simple, one-step actions (such as putting data into a folder), or more complex, multi-step processes. Setting up a workflow consists of three basic steps.

Using this pencil [icon], we can edit the workflow, and I can give you a quick tour of adding sources. Choosing which repositories you want OptiFLOW to automatically pull data from into your Thru site is as simple as that.

We can then look at adding processors and choosing what filters or transformations you want OptiFLOW to perform on the data that has been pulled into your [Thru] site; I'm going to click on a "Pass-Through." Next, you can add a destination, which involves setting up an action that will be performed in the target repository after the data has been processed. This is also where you can define notifications for the actions that have been performed: we've got email notifications, SMS and REST.
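[Editor's note: a REST notification of this kind is, at its core, an HTTP POST carrying a small payload that describes the event. The sketch below is illustrative only — the endpoint URL and payload fields are hypothetical, not Thru's actual notification API — and it assumes the Python requests package.]

```python
# Illustrative REST notification fired after a workflow step completes.
# The endpoint and payload shape are hypothetical placeholders.
import requests

payload = {
    "workflow": "sftp-to-s3-archive",
    "file": "Purchase Order 101",
    "status": "delivered",
}

# POST the event to a listening system (e.g., an ERP or ticketing webhook).
response = requests.post(
    "https://hooks.example.com/workflow-events",  # placeholder endpoint
    json=payload,
    timeout=10,
)
response.raise_for_status()  # surface HTTP errors instead of failing silently
```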

Going back to the workflow I’m just going to quickly remove these extra bits that I have added. We can now go to the Activity Stream, which can be accessed by double-clicking on these arrows. The activity stream is showing us what happened with that file that we uploaded to the SFTP server. If we scroll down here we can see that “Purchase Order 101” was pulled via SFTP from the server source, and put here in the [Thru] Cloud in our target directory. Once it arrived, a REST call was created and we can see the REST response that could be pushed out. Then the file was actually moved from a target folder to an S3 archive folder. And then finally, the file was taken from the S3 archive folder and actually pushed into the S3 bucket. If we go over to our S3 location now and refresh, we can see that same file has arrived.

Just going back to OptiFLOW for a second, we have an analytics view where you can see statistics on what's going on with the [Thru] site, as we showed in the slides earlier. We can see some further detail here: it gives us views of daily and monthly statistics in detail, and you can drill down into the numbers further.

Finally, I just want to mention that if there were any errors or alerts, an icon would appear, and these can be set up to be emailed to admins or sent out via SMS. That's the demo!

Want to learn more about using OptiFLOW to automate file transfers across your repositories? Then watch the full webinar on OptiFLOW or get the datasheet.

Ready to try Thru free for 14 days? Start your free trial now.
