This workflow performs automated, periodic backups of objects from an AWS S3 bucket, an Azure Blob Storage container, or a Google Cloud Storage bucket to a MinIO S3 bucket running locally or on a dedicated container/VM/server. It also works if the MinIO bucket is hosted on a remote cloud provider's infrastructure; you just need to change the URL and the keys.
Storage administrators, cloud architects, or DevOps engineers who need a simple, scalable solution for retrieving data from AWS, Azure, or GCP.
This workflow uses the official AWS S3 API (or the Azure Blob Storage API) to list and download objects from a specific bucket, then sends them to MinIO through its S3-compatible API.
None. You just need a source bucket on your cloud storage provider and a destination bucket on MinIO. You'll also need to get MinIO running.
Are you using Proxmox VE? Create a MinIO LXC Container: https://community-scripts.github.io/ProxmoxVE/scripts?id=minio
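Not on Proxmox? One quick way to get MinIO running for testing is the official Docker image; the ports, volume name, credentials, and bucket name below are illustrative defaults, not requirements:

```shell
# Start a local MinIO instance (API on :9000, web console on :9001).
docker run -d --name minio \
  -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=minioadmin \
  -e MINIO_ROOT_PASSWORD=minioadmin \
  -v minio-data:/data \
  quay.io/minio/minio server /data --console-address ":9001"

# Create the destination bucket with the MinIO client (mc).
mc alias set local http://localhost:9000 minioadmin minioadmin
mc mb local/my-backup-bucket
```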
Need automated backup from another Cloud Storage Provider?
Check out our templates: we've done it with AWS, Azure, and GCP, and we even have a version for FTP/SFTP servers!
For a dedicated source Cloud Storage Provider, please contact us!
These workflows can be integrated into larger ones and modified to best suit your needs! For example, you can replace the MinIO node with another S3 bucket from another cloud storage provider (Backblaze, Wasabi, Scaleway, OVH, ...).