FAQ
Installation & Upgrade
This issue typically occurs when a previous version of the software is already installed. To resolve it, you need to uninstall the existing version before installing the new one. You can do this by going to "Add/Remove Programs" or "Programs and Features" in the Control Panel, then selecting "SSIS Integration Toolkit" and removing it.
Licensing
You do not need to acquire a license if you only want to test the software functionality within development tools such as SSDT-BI, BIDS (Business Intelligence Development Studio), or Visual Studio. After installation, the software operates under a free developer license by default, allowing you to create and develop data flows and execute test loads without requiring a commercial license.
If you want to evaluate the software outside of development tools, you can acquire a trial license using the License Manager program installed with the software. This allows you to run SSIS packages on a scheduled basis or from other Windows processes such as the command line. The trial is fully functional for a limited period (typically 14 days), after which it reverts to the free developer license.
The free developer license is fully functional within development tools such as SSDT-BI, BIDS, or Visual Studio. Its primary limitation is that it cannot be used to run the software outside of these tools.
Additionally, the free developer license should not be used for production purposes, including production data extraction or data loading.
Development
We always recommend keeping our software up to date. New features, enhancements, and bug fixes are included in every release. To upgrade a licensed system to the latest software release, you must ensure that your subscription is active. You can check your license status by launching the KingswaySoft License Manager program. If your subscription has expired, you can contact us for a renewal quote, and we will be happy to assist you. As long as your subscription is active, you can upgrade to any version.
For the free developer license, you can upgrade freely; however, we recommend staying on the same version used in production to maintain maximum compatibility when deploying SSIS packages.
We strongly recommend testing the new release in a development or test environment before deploying it to production. Details about Breaking Changes can be found on the Change Log page. Some changes can be resolved by refreshing the component, while others may require additional adjustments.
- Example 1: In v7.1, an SSIS metadata change was introduced for entity type code fields in the CRM destination component when using SOAP 2007/2006 service endpoints. This can be resolved by clicking the "Refresh CRM Metadata" button (in CRM4 or CRM3 destination components) as described in the change log.
- Example 2: In v5.0, significant metadata changes were introduced, especially in the Destination Component's Error Output. The previous single output was split into two outputs: Error Output and Default Output. The Error Output no longer contains successful records, and the ID column has been moved to the Default Output as CrmRecordId. You may need to update your SSIS packages accordingly.
Try opening Visual Studio by selecting "Run as Administrator".
When using SQL Server 2014, a cumulative update is required (Cumulative Update 2 or later, or any recent service packs). The following are the download links:
- Runtime - SQL Server 2014 Service Pack 1: https://www.microsoft.com/en-us/download/details.aspx?id=46694
- Design time - Microsoft SQL Server Data Tools - Business Intelligence for Visual Studio 2013: http://www.microsoft.com/en-ca/download/details.aspx?id=42313
This issue is caused by SSIS optimization behavior. It occurs when not all outputs are attached to a destination component.
You can resolve this issue using one of the following methods:
- Change the RunInOptimizedMode property to False in the Properties window at the Data Flow level
- Remove unused outputs by unchecking them on the General page of the JSON/XML Source/Extract component
- Attach an SSIS destination component for each output.
Text fields are generally represented as "nvarchar" as a standard practice. The reason the length is set to 4000 is related to the web service interface used to communicate with the server (for example, Dynamics SL or Oracle CRM On Demand). The WSDL documents used to generate SSIS metadata do not specify a length for text fields. As a result, all text fields are defined with the maximum allowed length of 4000.
If a field exceeds 4000 characters, it may cause a buffer overflow. To resolve this, you can change the field type to DT_NTEXT using the SSIS Advanced Editor.
To fully utilize the BDD component, you need to increase the connection limit imposed by the Microsoft .NET Framework, which by default allows a maximum of 2 concurrent connections per host for service calls.
To override this limit, you must modify the DTExec.exe.config and DtsDebugHost.exe.config files under the DTS\binn folder by adding the following connectionManagement section:
<configuration>
  ...
  <system.net>
    <connectionManagement>
      <add address="*" maxconnection="100"/>
    </connectionManagement>
  </system.net>
</configuration>
The above configuration allows up to 100 simultaneous connections per host. You may adjust this number based on your needs. If you are running a 64-bit system, you must update the files under both the "Program Files" and "Program Files (x86)" folders. Note that these changes are not required for version 5.1 or later, as the limitation has been addressed programmatically.
When using the BDD component, keep in mind the following:
- Avoid using too many BDD distributions in a single data flow, as this may overload the server
- BDD uses the SSIS internal buffer to manage distribution, with a default size of 10,000 rows (9,947 rows in practice). This means distribution may not occur until a buffer is filled: the BDD component's second output will not receive anything until the first buffer is full (9,947 records), and the third output will not receive anything until you have more than 19,894 records. If you are working with a small load, set the data flow task's DefaultBufferMaxRows property to a smaller number so that distribution happens sooner
- SSIS has an undocumented limit of 5 active buffers per data flow. Therefore, using more than 5 BDD branches in a single data flow will not improve performance
- To use more than 5 BDD components, split your data across multiple data flow tasks, with up to 5 BDD components per task.
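The buffer-driven distribution described above can be sketched as a round-robin over full buffers. This is a simplified illustration, not the actual SSIS engine; the sketch uses the nominal 10,000-row default, while the notes above observe 9,947 rows in practice:

```python
# Simplified illustration of buffer-based round-robin distribution (not the
# actual SSIS engine): rows are handed to the next output one full buffer at
# a time, so small loads may never reach the later outputs.

def distribute(total_rows, outputs=3, buffer_rows=10000):
    """Return the number of rows each BDD output receives."""
    received = [0] * outputs
    target = 0
    remaining = total_rows
    while remaining > 0:
        batch = min(buffer_rows, remaining)
        received[target] += batch
        remaining -= batch
        target = (target + 1) % outputs  # next full buffer goes to the next output
    return received

# With the default buffer size, a 9,000-row load never reaches output 2 or 3:
print(distribute(9000))                    # [9000, 0, 0]
# Lowering DefaultBufferMaxRows makes distribution kick in much sooner:
print(distribute(9000, buffer_rows=1000))  # [3000, 3000, 3000]
```

This is why reducing DefaultBufferMaxRows for small loads is the key tuning step: distribution only happens at buffer boundaries.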
Deployment
This error can occur for several reasons. A common cause is a difference between your development and server environments.
For example, if your development environment is 32-bit but your server environment is 64-bit, you may encounter this issue. Ensure that both environments are consistent in terms of system architecture and configuration.
When using SQL Server 2014, a cumulative update is required (Cumulative Update 2 or later, or a recent service pack).
- Runtime - SQL Server 2014 Service Pack 1: https://www.microsoft.com/en-us/download/details.aspx?id=46694
- Design time - Microsoft SQL Server Data Tools - Business Intelligence for Visual Studio 2013: http://www.microsoft.com/en-ca/download/details.aspx?id=42313
This error is related to the package ProtectionLevel setting used when the SSIS package was saved. By default, SSIS packages use EncryptSensitiveWithUserKey. This means sensitive data, such as passwords, is encrypted using the package author's user key. When the package is executed by SQL Agent, it runs under a different account with a different user key, so the sensitive data cannot be decrypted.
To resolve this issue, you can configure the SQL Agent job and enter the password manually in SQL Server Management Studio (SSMS). Alternatively, you can change the package ProtectionLevel to EncryptSensitiveWithPassword and provide a PackagePassword. When configuring the SQL Agent job, enter the package password in the Command Line page.
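When a package saved with EncryptSensitiveWithPassword is run from the command line, the same password is supplied through the dtexec /Decrypt (/De) option. A minimal sketch, where the package path and password are placeholders:

```cmd
dtexec /File "C:\Packages\Load.dtsx" /Decrypt "MyPackagePassword"
```

This is also the form the command takes on the Command Line page of a SQL Agent job step.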
The recommended best practice is to use the SSIS Package Configuration Wizard to create a configuration file for deployment and parameterize your connection manager. This allows you to keep the default ProtectionLevel and manage sensitive values through configurations instead.
Dynamics 365
If your CRM, AX, or NAV server has been moved to a different server, you will need to go through your SSIS project and reconfigure your Connection Manager(s) to point to the new server(s).
Salesforce
You can use Force.com Explorer to build queries. Force.com Explorer is a free tool that allows you to select fields visually to create queries. You can also use this tool to validate your queries and preview your Salesforce data.
You can find the SOQL reference document at the following URL: http://www.salesforce.com/us/developer/docs/soql_sosl/index.htm
This error occurs when multiple records have the same text value. During the Text Lookup process, these values are translated into the same Salesforce ID. When the second record is processed, the system throws this exception because the record with that ID has already been deleted.
To avoid this issue, ensure that there are no duplicate values in the system when using the Text Lookup feature with the DELETE action. If duplicates already exist and you must use Text Lookup for DELETE, you can modify your source query to return only distinct records. For example, if your source is Salesforce and you want to delete all Contact records with a LastName starting with "Lastname", you can use a query like the following:
SELECT Name FROM Contact WHERE LastName LIKE 'Lastname%' GROUP BY Name LIMIT 2000
Using this approach, you may need to run the DELETE data flow task multiple times to complete the deletion.
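The failure mode described above can be sketched in a few lines. This is a simplified model of the Text Lookup translation, not the actual component, and the name and ID values are made up:

```python
# Simplified model: two source rows carry the same text value, Text Lookup
# resolves both to the same Salesforce ID, and the second DELETE then
# targets a record that is already gone.

lookup = {"Smith": "003A1"}      # text value -> Salesforce ID (made-up values)
deleted = set()

def delete_by_text(value):
    record_id = lookup[value]    # the Text Lookup translation step
    if record_id in deleted:
        raise RuntimeError("entity is deleted")
    deleted.add(record_id)

delete_by_text("Smith")          # first duplicate row succeeds
try:
    delete_by_text("Smith")      # second duplicate row fails
except RuntimeError as e:
    print(e)                     # prints: entity is deleted
```

Returning only distinct values from the source, as in the SOQL query above, avoids the second call entirely.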
The following tips can help improve the performance of the Salesforce Destination Component.
From an SSIS development perspective, consider the following:
- Minimize the number of object fields you write to Salesforce
- Use an appropriate batch size
- Consider using the Salesforce Bulk API option in the Salesforce Destination Component
- Consider implementing BDD (Balanced Data Distributor) functionality to write data to Salesforce in parallel.
SharePoint
This error typically occurs when BaseName is set to the root folder for all records in the same batch. BaseName represents the base folder when writing to a SharePoint document library. A solution is to unmap the Virtual_LocalRelativePath field in the destination component.
This error occurs when the connection user is configured to use multi-factor authentication (MFA). Our software is designed for non-interactive use, and MFA requires manual human intervention, which is not possible in a non-interactive process. Therefore, MFA is not supported. The alternative is to use a different connection user that does not have MFA enabled.
There are different ways to upload local files to a SharePoint document library. You can use the Premium File Transfer task to transfer files. You can also use the SSIS Import Column transformation component with our SharePoint Destination Component to help upload files to the library.
HubSpot
It is best practice to select only the fields you need when retrieving data from HubSpot. There is a query limit when selecting fields for record retrieval. If you encounter this error, it is likely that you have exceeded the query limit. To avoid this issue, reduce the number of selected fields within the HubSpot Source Component.
Marketo
The "Access Denied" error means that authentication is successful, but the user does not have sufficient permission to call the API. Additional permissions may need to be assigned to the user role. You should check the API access permissions of your API user role. Also ensure that "Read-Only Opportunity" and "Read-Write Opportunity" permissions are included in the user role.
The Opportunity API, Company API, and Sales Person API are not supported for CRM-enabled subscriptions.
This issue occurs because Marketo attempts to detect differences when using the updateOnly method for the Sales Person object. If the input data is identical to the existing data, a "System error" may occur.
To avoid this error from failing your package, you can enable the "Redirect rows to error output" option on the Error Handling page in the Marketo Destination Component. Alternatively, you can use the Upsert (Create or Update) method.
Dynamics GP
We found that Dynamics GP web services did not expose enough functionality compared to what is available through eConnect. In addition, the complexity of creating properly structured XML documents for eConnect made the integration experience less user-friendly. By directly interacting with eConnect stored procedures, all functionality available in eConnect can be utilized without the need to exchange XML documents.
The number (for example, 3388) is the ErrorCode returned from the Dynamics GP stored procedure. You can query the table Dynamics.dbo.taErrorCodes to find the corresponding error description.
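For example, to look up the description for error code 3388 mentioned above, you can run a query such as the following (the table name comes from the answer above; SELECT * is used because column names can vary between Dynamics GP versions):

```sql
SELECT * FROM Dynamics.dbo.taErrorCodes WHERE ErrorCode = 3388
```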
Yes, it does. If you need to work with custom stored procedures created in the Dynamics GP database:
- Navigate to C:\Program Files (x86)\KingswaySoft\SSIS Integration Toolkit for Microsoft Dynamics GP\Schema
- Modify the schema files by adding the schema for your procedures
- Make sure to create a full backup of each file before making any changes so you have a working reference
- If you want to store schema files in a different folder, you can add a registry entry under HKLM\SOFTWARE\KingswaySoft\SSIS Integration Toolkit for Microsoft Dynamics GP: create a new entry called SchemaDirectory and point it to the desired folder (where you store your schema files)
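The registry entry can also be created from an elevated command prompt. A sketch, assuming the schema files are stored in D:\GPSchemas (the folder path is a placeholder; on 64-bit systems the key may be located under the Wow6432Node branch instead):

```cmd
reg add "HKLM\SOFTWARE\KingswaySoft\SSIS Integration Toolkit for Microsoft Dynamics GP" /v SchemaDirectory /t REG_SZ /d "D:\GPSchemas"
```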
Dynamics SL
In your Dynamics SL installation package, install the User Manuals and review the Web Services SDK document (WebServices.pdf). This guide provides documentation on the default web services. The section that may be most useful is the chapter on Authoring a Custom Web Service.
Depending on your deployment, the optimization strategy may vary.
You may consider using the SSIS Balanced Data Distributor (BDD) component to write data to your server in parallel.
Upon investigation and experimentation with the Solomon Object Model, it was found that the technologies used in the provided libraries were too primitive. Due to the way the Solomon Object Model mimics the Dynamics SL client, a robust solution could not be found to efficiently extract or input data.