Streamlined File Compression for Cloud Storage Connections in ETL Processes

20 April 2026
KingswaySoft Support

In high-volume ETL workflows, file compression is far more than a housekeeping task; it is a critical strategy for performance and cost optimization. Efficient compression significantly reduces cloud storage footprints, cuts data transfer times between platforms and regions, and ensures that sensitive archives are bundled securely for downstream processing. While the Compression Task in KingswaySoft's SSIS Integration Toolkit (previously SSIS Productivity Pack) has long supported these operations, it was limited to local file systems. For cloud-first teams, this meant a clunky three-step workaround: download the file(s) to a local directory, compress (or decompress) them, and then re-upload the result. This "local staging" not only introduced latency and storage overhead but also created security risks by writing sensitive data to local disks, even if only temporarily.

With the release of SSIS Integration Toolkit v26.1, we have streamlined this process by introducing native support for cloud connection managers within the Compression Task for either the source or the destination paths. This means files can be read directly from a cloud storage location, compressed or decompressed in memory, and written to another cloud storage location, all within a single task and without any local staging. Teams working with cloud-native architectures, whether on Azure, AWS, or other supported providers, can now handle compression and decompression as a first-class task in their Control Flow without the need for inefficient workarounds. This is particularly valuable in scenarios such as archiving processed output files into a ZIP for long-term storage, packaging files for delivery to an external party, or extracting inbound compressed payloads before passing them to downstream data flow components. 
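To illustrate the zero-staging idea, the core of the operation can work entirely on byte buffers rather than temporary files. The task's internals are not public, so the following is only a conceptual Python sketch of in-memory compression, not the actual implementation:

```python
import io
import zipfile

def compress_in_memory(files: dict) -> bytes:
    """Bundle named byte payloads into a single ZIP archive entirely in memory."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", compression=zipfile.ZIP_DEFLATED) as archive:
        for name, payload in files.items():
            archive.writestr(name, payload)
    return buffer.getvalue()

# In a real pipeline, the input bytes would come from a cloud download call and
# the returned archive bytes would be passed straight to an upload call -- no
# local staging directory is ever involved.
archive_bytes = compress_in_memory({"orders.csv": b"id,total\n1,9.99\n"})
```

Because nothing is written to disk, there is no temporary copy of sensitive data to clean up or secure, which is precisely what the new cloud-aware Compression Task achieves.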

In this blog post, we will walk through two practical examples that demonstrate both sides of this capability.

Example 1: Compressing a Cloud Directory into a ZIP File

In the first example, we have a directory of files stored in an Azure Blob Storage container, and the goal is to package them into a single ZIP archive that is written directly to an Amazon S3 bucket. This kind of workflow is well suited for archival pipelines, file delivery, or situations where multiple output files from a prior process need to be consolidated.

Azure Blob Storage file browser showing a directory of CSV files to be compressed.

To set this up, begin by creating the necessary connection managers in your SSIS project: one for your Azure Blob Storage account and one for your Amazon S3 bucket. Once those are in place, add a Compression Task to the Control Flow and open the editor to configure it.

  1. On the General tab, set the Action to Compress and the Compression Format to Zip.
  2. Under Source Directory/File Settings, select your Azure Blob Storage Connection Manager from the Connection Manager dropdown and set the Source Type to Directory. Enter the path to the source directory in the Source Path field.
  3. Under Destination Directory/File Settings, select your Amazon S3 Connection Manager. To set the destination path, click the ellipsis (...) button next to the Destination Path field. A file browser will open showing your S3 bucket structure. There are two approaches for specifying the destination ZIP file:
    • Using the file browser: Navigate to the target folder and use the New File... button to create a new file with a .zip extension. Select it and click OK. Because this file now exists at the destination, you will need to enable Overwrite Existing Items to allow the task to write to it at runtime.
    • Typing the path manually: Select any existing file in the browser to populate the Destination Path field, then close the dialog and manually edit the field to type the full intended path and file name, ending in .zip. Since no file exists at that path yet, you do not need to enable Overwrite Existing Items for the initial run.

    File browser showing the destination archive.zip file selected within an Amazon S3 bucket.

  4. Optionally, enable Include Subdirectories if you want the task to recurse into subdirectories when gathering files to compress. You may also configure Compression Level and supply a Password under the Advanced Settings section if the archive requires password protection.

Compression Task Editor Advanced Settings section showing default Password and Compression Level options.

Compression Task Editor General page showing Action set to Compress with cloud connection managers configured for the source directory and destination file.

When the package runs, the Compression Task streams the files from Azure Blob Storage, compresses them in memory, and writes the resulting archive to Amazon S3 in a single operation.
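The directory-to-archive behavior configured above can be approximated as follows. This is a hypothetical Python sketch: the entry list stands in for the cloud directory listing, and the `include_subdirectories` and `compresslevel` parameters mirror the Include Subdirectories and Compression Level options:

```python
import io
import zipfile

def zip_cloud_directory(entries, include_subdirectories=False, compresslevel=6):
    """Pack a listing of (relative_path, content_bytes) pairs into ZIP bytes.

    Entries under subfolders are skipped unless include_subdirectories is
    enabled, mirroring the corresponding task option.
    """
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED,
                         compresslevel=compresslevel) as archive:
        for path, content in entries:
            if "/" in path and not include_subdirectories:
                continue  # entry lives in a subdirectory; honor the option
            archive.writestr(path, content)
    return buffer.getvalue()

listing = [("a.csv", b"1,2\n"), ("sub/b.csv", b"3,4\n")]
top_only = zip_cloud_directory(listing)                                # a.csv only
recursive = zip_cloud_directory(listing, include_subdirectories=True)  # both files
```

In the actual task, of course, the listing and the resulting archive bytes are handled by the configured source and destination connection managers rather than in-process lists.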

Example 2: Decompressing a ZIP File into a Cloud Directory

In the second example, our requirements are reversed. The source is a ZIP file stored in a cloud storage platform such as an Amazon S3 bucket, and the goal is to extract its contents into a directory in an Azure Blob Storage container. Decompression workflows like this are common when files are delivered by external vendors or partner systems as compressed archives, whether to reduce transfer size, bundle multiple files into a single payload, or meet a specific format required by the sending party. Extracting those archives directly into the target cloud location means the files are immediately available for downstream processing, without any intermediate steps.

The setup is similar to the compression example. With your connection managers already in place, add or reconfigure a Compression Task and open the editor.

  1. On the General tab, set the Action to Decompress and the Compression Format to Zip.
  2. Under Source Directory/File Settings, select your Amazon S3 Connection Manager and set the Source Type to File. Enter the path to the source ZIP file in the Source Path field.
  3. Under Destination Directory/File Settings, select your Azure Blob Storage Connection Manager and specify the destination directory path where the extracted files should be written.
  4. Optionally, enable Overwrite Existing Items if files with the same names may already exist in the destination directory. If the ZIP file is password protected, supply the password under the Advanced Settings section.
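Conceptually, the decompress side reverses the flow. Again as a hedged Python sketch rather than the task's actual implementation, the Overwrite Existing Items and password options map naturally to parameters:

```python
import io
import zipfile

def extract_in_memory(zip_bytes, existing_names=(), overwrite=False, password=None):
    """Return {name: content} extracted from ZIP bytes, honoring overwrite rules."""
    extracted = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for info in archive.infolist():
            if info.is_dir():
                continue
            if info.filename in existing_names and not overwrite:
                raise FileExistsError(f"{info.filename} already exists at destination")
            # pwd is only consulted for password-protected entries
            extracted[info.filename] = archive.read(info, pwd=password)
    return extracted

# Build a small inbound archive to demonstrate the round trip.
payload = io.BytesIO()
with zipfile.ZipFile(payload, "w") as z:
    z.writestr("inbound.csv", b"x,y\n")
files = extract_in_memory(payload.getvalue())
```

Each extracted payload would then be handed to the destination connection manager's upload call, so the files land in the target Azure Blob Storage directory without ever touching a local disk.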

Compression Task Editor General page showing Action set to Decompress with an Amazon S3 source file and an Azure Blob Storage destination directory.

When the package runs, the Compression Task will retrieve the ZIP file from the S3 source and extract its contents into the specified Azure Blob Storage directory.

Compression Task component showing a successful execution in the SSIS control flow canvas.

Azure Blob Storage file browser showing the successfully extracted CSV files in the destination directory.

Conclusion

These two examples highlight the flexibility introduced in v26.1. While we focused on Azure and AWS, this enhancement extends across the entire KingswaySoft cloud ecosystem, including Azure Data Lake, Google Cloud Storage, OneDrive, and more. You are no longer tethered to the local file system. Whether you are archiving output files, preparing data for delivery, or extracting inbound payloads from a partner system, the Compression Task can now handle it as a first-class, "zero-staging" operation, directly where the files already live.

Ready to upgrade your pipelines? Download the latest release of the SSIS Integration Toolkit to experience the new capabilities firsthand.
