Are you working with Amazon S3 and looking for an easy way to run your batch operations? We have just what you need!
The Amazon S3 Batch Operations plug-in is available on Automation Hub; download it to empower your Workload Automation environment.
The Amazon S3 Batch Operations plug-in helps you create, run, and monitor batch operations directly from the Dynamic Workload Console. You can also schedule an Amazon S3 Batch Operations job simply by creating a job definition.
Here are the prerequisites for using the plug-in:
- An AWS account
- AWS IAM credentials. If you don’t know how to retrieve them, Click Here
- The proper permissions for the user to access the resources
Now let’s see how easy it is to start and monitor an Amazon S3 Batch Operations job on AWS.
Create a new job and select “Amazon S3 Batch Operations” in the Cloud section.
Figure 1: Job Definition Page
Connection
First, establish a connection to the AWS server by entering the required details.
Then you can test the connection to the AWS server by clicking Test Connection.
Figure 2: Connection Page
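For reference, here is a minimal sketch of what a connection check like Test Connection amounts to against the AWS API, using the Python boto3 SDK with placeholder credentials and region. The plug-in performs the equivalent validation for you when you click Test Connection.

```python
# Minimal connection-check sketch (placeholder credentials and region).
import boto3

session = boto3.Session(
    aws_access_key_id="AKIA...",           # placeholder IAM access key ID
    aws_secret_access_key="your-secret",   # placeholder IAM secret access key
    region_name="us-east-1",               # placeholder region
)

sts = session.client("sts")
identity = sts.get_caller_identity()       # fails if the credentials are invalid
print("Connected as:", identity["Arn"])
print("Account:", identity["Account"])
```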
Action
Once the connection test succeeds, go to the Action tab and specify the details about the manifest file and the kind of operation (copy, invoke Lambda function, restore) you need to perform.
The manifest object lists the objects to process and the bucket they belong to. Use the Manifest bucket and Manifest object fields to select the manifest file that you want to use (a sketch of the manifest format follows the figure below).
Figure 3: Action Page
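As an illustration, a CSV manifest simply lists, for each object to process, the bucket name and the object key (the bucket and keys below are placeholders); for versioned buckets, a third column with the version ID can be added.

```
examplebucket,photos/image1.jpg
examplebucket,photos/image2.jpg
examplebucket,logs/2023/01/app.log
```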
- Copy:
Use this operation to copy all the objects that are referenced in the manifest file to the specified location.
Figure 4: Action Page > Copy section
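Under the hood, a copy job corresponds to an S3PutObjectCopy operation in the AWS S3 Control API. A minimal boto3 sketch, with a placeholder destination bucket ARN, looks like this; the plug-in builds the equivalent request from the fields you fill in.

```python
# Copy operation: every object listed in the manifest is copied to the target bucket.
copy_operation = {
    "S3PutObjectCopy": {
        "TargetResource": "arn:aws:s3:::destination-bucket"  # placeholder destination bucket ARN
    }
}
```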
- Invoke AWS Lambda function:
Use this operation to invoke the selected Lambda function on the objects specified in the manifest file.
Figure 5: Action Page > Invoke AWS Lambda Function section
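Similarly, the invoke operation maps to a LambdaInvoke operation that references the ARN of the function to run on each manifest object (the function ARN below is a placeholder).

```python
# Invoke operation: the selected Lambda function is called for the manifest objects.
lambda_operation = {
    "LambdaInvoke": {
        "FunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:process-object"  # placeholder ARN
    }
}
```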
- Restore:
Use this operation to restore the objects that are specified in the manifest file.
You can also specify the type of retrieval that you want to use and for how many days the restored objects should remain available.
Figure 6: Action Page > Restore section
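The restore operation maps to S3InitiateRestoreObject, where the retrieval tier and the number of days correspond to the fields described above (the values below are placeholders).

```python
# Restore operation: restored copies stay available for the given number of days,
# and the retrieval tier controls how quickly objects are retrieved.
restore_operation = {
    "S3InitiateRestoreObject": {
        "ExpirationInDays": 7,     # placeholder: how long the restored copies remain available
        "GlacierJobTier": "BULK",  # or "STANDARD"
    }
}
```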
- Additional Job Configurations:
In Amazon S3 Batch Operations there is no job name, so jobs are distinguished by their description; you can also add tags to the job.
You can also set the priority of the job and assign the IAM role that the job uses to perform the operation.
Finally, you can set the conditions for report generation and where the report is stored.
Figure 7: Action Page > Additional Job Configuration section A
Figure 8: Action Page > Additional Job Configuration Section B
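Putting it together, the description, tags, priority, role, and report settings described above correspond to fields of the S3 Control CreateJob request. The following is only an illustrative boto3 sketch with placeholder ARNs, account ID, and ETag; the plug-in submits the job for you from the job definition.

```python
import boto3

s3control = boto3.client("s3control", region_name="us-east-1")  # placeholder region

response = s3control.create_job(
    AccountId="123456789012",                 # placeholder AWS account ID
    ConfirmationRequired=False,
    Operation=copy_operation,                 # one of the operations sketched above
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::manifest-bucket/manifest.csv",  # placeholder manifest location
            "ETag": "manifest-etag",                                   # placeholder ETag of the manifest
        },
    },
    Description="Nightly copy of application logs",  # identifies the job (there is no job name)
    Tags=[{"Key": "team", "Value": "operations"}],   # placeholder tags
    Priority=10,                                     # higher numbers run first
    RoleArn="arn:aws:iam::123456789012:role/batch-operations-role",  # placeholder IAM role
    Report={
        "Bucket": "arn:aws:s3:::report-bucket",      # placeholder report destination
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-reports",
        "ReportScope": "FailedTasksOnly",            # or "AllTasks"
    },
)
print("Job created:", response["JobId"])
```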
Submitting your job:
Once the job definition is ready, save it and submit it into the plan.
The job will start its execution and perform the desired operation.
Monitoring your job:
While Amazon S3 Batch Operations runs the operation, the plug-in lets you monitor the execution in real time. The monitoring page, called Workflow details, is accessible from the Monitor Jobs view and contains all the details about the job. Refresh the page to see the latest updates.
Figure 9: Workflow Details Page
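The Workflow details page surfaces the kind of information that the AWS DescribeJob API also returns. For reference, here is a minimal polling sketch with placeholder account and job IDs.

```python
import time
import boto3

s3control = boto3.client("s3control", region_name="us-east-1")  # placeholder region

def wait_for_job(account_id: str, job_id: str) -> str:
    """Poll the batch job until it reaches a terminal state and return that state."""
    while True:
        job = s3control.describe_job(AccountId=account_id, JobId=job_id)["Job"]
        status = job["Status"]
        print(status, job.get("ProgressSummary", {}))
        if status in ("Complete", "Failed", "Cancelled"):
            return status
        time.sleep(30)

# Example usage with placeholder IDs:
# wait_for_job("123456789012", "e1234567-89ab-cdef-0123-456789abcdef")
```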
Thanks to the Amazon S3 Batch Operations plug-in, you can automate your Amazon S3 Batch Operations jobs and monitor them, all from one place.
On Automation Hub there are many other integrations that will enable you to automate everything you want.
Automate more, automate better!
Start a Conversation with Us
We’re here to help you find the right solutions and support you in achieving your business goals.