Featured Content
How to contact Qlik Support
Qlik offers a wide range of channels to assist you in troubleshooting, answering frequently asked questions, and getting in touch with our technical experts. In this article, we guide you through all available avenues to secure your best possible experience.
For details on our terms and conditions, review the Qlik Support Policy.
Index:
- Support and Professional Services: who to contact when
- Qlik Support: How to access the support you need
- 1. Qlik Community, Forums & Knowledge Base
- The Knowledge Base
- Blogs
- Our Support programs:
- The Qlik Forums
- Ideation
- How to create a Qlik ID
- 2. Chat
- 3. Qlik Support Case Portal
- Escalate a Support Case
- Resources
Support and Professional Services: who to contact when
We're happy to help! Here's a breakdown of resources for each type of need.
Support
Reactively fixes technical issues and answers narrowly defined, specific questions. Handles administrative issues to keep the product up to date and functioning.
- Error messages
- Task crashes
- Latency issues (due to errors or 1-1 mode)
- Performance degradation without config changes
- Specific questions
- Licensing requests
- Bug Report / Hotfixes
- Not functioning as designed or documented
- Software regression

Professional Services (*)
Proactively accelerates projects, reduces risk, and achieves optimal configurations. Delivers expert help for training, planning, implementation, and performance improvement.
- Deployment / Implementation
- Setting up new endpoints
- Performance Tuning
- Architecture design or optimization
- Automation
- Customization
- Environment Migration
- Health Check
- New functionality walkthrough
- Real-time upgrade assistance
(*) Reach out to your Account Manager or Customer Success Manager.
Qlik Support: How to access the support you need
1. Qlik Community, Forums & Knowledge Base
Your first line of support: https://community.qlik.com/
Looking for content? Type your question into our global search bar.
The Knowledge Base
Leverage the enhanced and continuously updated Knowledge Base to find solutions to your questions and best practice guides. Bookmark this page for quick access!
- Go to the Official Support Articles Knowledge base
- Type your question into our Search Engine
- Need more filters?
- Filter by Product
- Or switch tabs to browse content in the global community, on our Help Site, or even on our YouTube channel
Blogs
Subscribe to maximize your Qlik experience!
The Support Updates Blog
The Support Updates blog delivers important and useful Qlik Support information about end-of-product support, new service releases, and general support topics.
The Qlik Design Blog
The Design blog is all about products and Qlik solutions, such as scripting, data modelling, visual design, extensions, best practices, and more!
The Product Innovation Blog
By reading the Product Innovation blog, you will learn about what's new across all of the products in our growing Qlik product portfolio.
Our Support programs:
Q&A with Qlik
Live sessions with Qlik Experts in which we focus on your questions.
Techspert Talks
Techspert Talks is a free webinar to facilitate knowledge sharing, held on a monthly basis.
Technical Adoption Workshops
Our in-depth, hands-on workshops allow new Qlik Cloud Admins to build alongside Qlik Experts.
Qlik Fix
Qlik Fix is a series of short videos with helpful solutions for Qlik customers and partners.
The Qlik Forums
- Quick, convenient, 24/7 availability
- Monitored by Qlik Experts
- New releases publicly announced within Qlik Community forums
- Local language groups available
Ideation
Suggest an idea, and influence the next generation of Qlik features!
Search & Submit Ideas
Ideation Guidelines
How to create a Qlik ID
Get the full value of the community.
Register a Qlik ID:
- Go to: qlikid.qlik.com/register
- You must enter your company name exactly as it appears on your license or there will be significant delays in getting access.
- You will receive a system-generated email with an activation link for your new account. Note: this link expires after 24 hours.
If you need additional details, see: Additional guidance on registering for a Qlik account
If you encounter problems with your Qlik ID, contact us through Live Chat!
2. Chat
Incidents are supported through our Chat, by clicking Chat Now on any Support Page across Qlik Community.
To raise a new issue, all you need to do is chat with us. With this, we can:
- Answer common questions instantly through our chatbot
- Have a live agent troubleshoot in real time
- For items that need further investigation, we will create a case on your behalf through step-by-step intake questions.
3. Qlik Support Case Portal
Log in to manage and track your active cases in Manage Cases.
Please note: to create a new case, it is easiest to do so via our chat (see above). Our chat will log your case through a series of guided intake questions.
Your advantages:
- Self-service access to all incidents so that you can track progress
- Option to upload documentation and troubleshooting files
- Option to include additional stakeholders and watchers to view active cases
- Follow-up conversations
When creating a case, you will be prompted to enter problem type and issue level. Definitions shared below:
Problem Type
Select Account Related for issues with your account, licenses, downloads, or payment.
Select Product Related for technical issues with Qlik products and platforms.
Priority
If your issue is account related, you will be asked to select a Priority level:
Select Medium/Low if the system is accessible, but there are some functional limitations that are not critical in the daily operation.
Select High if there are significant impacts on normal work or performance.
Select Urgent if there are major impacts on business-critical work or performance.
Severity
If your issue is product related, you will be asked to select a Severity level:
Severity 1: Qlik production software is down or not available, but not because of scheduled maintenance and/or upgrades.
Severity 2: Major functionality is not working in accordance with the technical specifications in documentation or significant performance degradation is experienced so that critical business operations cannot be performed.
Severity 3: Any error that is not a Severity 1 or Severity 2 issue. For more information, visit our Qlik Support Policy.
Escalate a Support Case
If you require a support case escalation, you have two options:
- Request to escalate within the case, mentioning the business reasons. To escalate a support incident successfully, mention your intention to escalate in the open support case. This will begin the escalation process.
- Contact your Regional Support Manager. If more attention is required, contact your regional support manager. You can find a full list of regional support managers in the How to escalate a support case article.
Resources
A collection of useful links.
Qlik Cloud Status Page
Keep up to date with Qlik Cloud's status.
Support Policy
Review our Service Level Agreements and License Agreements.
Live Chat and Case Portal
Your one stop to contact us.
Recent Documents
Reloads with Qlik Data Gateway are randomly slower or fail after 3 hours
Reloads with Data Gateway are randomly slower; sometimes they last for 3 hours until they are automatically aborted.
The same reloads usually run faster and complete in the expected time if relaunched manually or automatically.
The logs do not show specific error messages.
Resolution
We recommend two actions to resolve the problem.
The first is to activate process isolation in order to reduce the number of reloads running at the same time. Please follow this article.
It is possible to start with a value of 10 for the ODBC|SAPBW|SAPSQL_MAX_PROCESS_COUNT parameter and adjust it after some tests.
The second action is to add the "DISCONNECT" command after every query and to start every new query with a "LIB CONNECT".
This will force the closure and re-creation of the connection every time it is needed.
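As an illustration, a minimal load script sketch of this pattern, assuming a hypothetical Data Gateway connection named 'SAP_Source' and two pass-through queries:

LIB CONNECT TO 'SAP_Source';  // open the connection just before the query
Orders:
SQL SELECT * FROM ORDERS;
DISCONNECT;  // close it as soon as the query completes

LIB CONNECT TO 'SAP_Source';  // open a fresh connection for the next query
Customers:
SQL SELECT * FROM CUSTOMERS;
DISCONNECT;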
More information about the DISCONNECT statement can be found here.
We always recommend keeping Data Gateway on the latest version available.
Cause
This intermittent problem can be due to different causes.
In many cases the system can't handle multiple connections efficiently, and this can lead to severe slowness in the data connection. Activating process isolation helps to avoid this.
It is also possible that there is a delay between the connection opening and the query.
A connection for a query can be opened at the beginning of a reload, then kept open for a while and re-called later for another table load in the script.
It is possible that there is a disconnection when the connection is not working. This can happen if another connection to the same location is called by a concurrent reload or if a timeout automatically closes the connection.
It is possible to force Data Gateway to recreate the connection using the "DISCONNECT" statement in the script.
Environment
- Qlik Data Gateway, all supported versions
Log4j tips and tricks
Log4j, incorporated in Talend software, is an essential tool for discovering and solving problems. This article shows you some tips and tricks for using Log4j.
The examples in this article use Log4j v1, but Talend 7.3 uses Log4j v2. Although the syntax is different between the versions, anything you do in Log4j v1 should work, with some modification, in Log4j v2. For more information on Log4j v2, see Configuring Log4j, available in the Talend Help Center.
Content:
- Configuring Log4j in Talend Studio
- Emitting messages
- Routines
- Controlling Log4j message formats with patterns
- Logging levels
- Using Appenders
- Using filters
- Overriding default settings in Talend Administration Center
Configuring Log4j in Talend Studio
Configure the log4j.xml file in Talend Studio by navigating to File > Edit Project properties > Log4j.
You can also configure Log4j using properties files or built-in classes; however, that is not covered in this article.
Emitting messages
You can execute code in a tJava component to create Log4j messages, as shown in the example below:
log.info("Hello World"); log.warn("HELLO WORLD!!!");This code results in the following messages:
[INFO ]: myproject.myjob - Hello World [WARN ]: myproject.myjob - HELLO WORLD!!!
Routines
You can use Log4j to emit messages by creating a logger class in a routine, as shown in the example below:
public class logSample {
    /* Pick 1 that fits */
    private static org.apache.log4j.Logger log = org.apache.log4j.Logger.getLogger(logSample.class);
    private static org.apache.log4j.Logger log1 = org.apache.log4j.Logger.getLogger("from_routine_logSample");
    /* ... */
    public static void helloExample(String message) {
        if (message == null) {
            message = "World";
        }
        log.info("Hello " + message + " !");
        log1.info("Hello " + message + " !");
    }
}

To call this routine from Talend, use the following command in a tJava component:

logSample.helloExample("Talend");

The log results will look like this:

[INFO ]: routines.logSample - Hello Talend !
[INFO ]: from_routine_logSample - Hello Talend !
Using <routineName>.class includes the class name in the log results. Using free text with the logger includes the text itself in the log results. This is not really different from using System.out, but Log4j can be customized and fine-tuned.
Controlling Log4j message formats with patterns
You can use patterns to control the Log4j message format. Adding patterns to Appenders customizes their output. Patterns add extra information to the message itself. For example, when multiple threads are used, the default pattern doesn't provide information about the origin of the message. Use the %t variable to add a thread name to the logs. To easily identify new messages, it's helpful to use %d to add a timestamp to the log message.
To add thread names and timestamps, use the following pattern after the CONSOLE appender section in the Log4j template:
<param name="ConversionPattern" value= "%d{yyyy-MM-dd HH:mm:ss} [%-5p] (%t): %c - %m%n" />The pattern displays messages as follows:
ISO formatted date [log level] (thread name): class projectname.jobname - message contents
If the following Java code is executed in three parallel threads, using the sample pattern above helps distinguish between the threads.
java.util.Random rand = new java.util.Random();
log.info("Hello World");
Thread.sleep(rand.nextInt(1000));
log.warn("HELLO WORLD!!!");
logSample.helloExample("Talend");

This results in an output that shows which thread emitted the message and when:

2020-05-19 12:18:30 [INFO ] (tParallelize_1_e45bc79b-d61f-45a3-be8f-7089ab6d565d): myproject.myjob_0_1.myjob - Hello World
2020-05-19 12:18:30 [INFO ] (tParallelize_1_4064c9b8-0585-41e0-b9f0-95fb31e602b7): myproject.myjob_0_1.myjob - Hello World
2020-05-19 12:18:30 [INFO ] (tParallelize_1_a8ef1065-0106-4b45-8a60-d02a9cbe1f00): myproject.myjob_0_1.myjob - Hello World
2020-05-19 12:18:30 [WARN ] (tParallelize_1_e45bc79b-d61f-45a3-be8f-7089ab6d565d): myproject.myjob_0_1.myjob - HELLO WORLD!!!
2020-05-19 12:18:30 [INFO ] (tParallelize_1_e45bc79b-d61f-45a3-be8f-7089ab6d565d): routines.logSample - Hello Talend !
2020-05-19 12:18:30 [INFO ] (tParallelize_1_e45bc79b-d61f-45a3-be8f-7089ab6d565d): from_routine.logSample - Hello Talend !
2020-05-19 12:18:30 [WARN ] (tParallelize_1_a8ef1065-0106-4b45-8a60-d02a9cbe1f00): myproject.myjob_0_1.myjob - HELLO WORLD!!!
2020-05-19 12:18:30 [INFO ] (tParallelize_1_a8ef1065-0106-4b45-8a60-d02a9cbe1f00): routines.logSample - Hello Talend !
2020-05-19 12:18:30 [INFO ] (tParallelize_1_a8ef1065-0106-4b45-8a60-d02a9cbe1f00): from_routine.logSample - Hello Talend !
2020-05-19 12:18:31 [WARN ] (tParallelize_1_4064c9b8-0585-41e0-b9f0-95fb31e602b7): myproject.myjob_0_1.myjob - HELLO WORLD!!!
2020-05-19 12:18:31 [INFO ] (tParallelize_1_4064c9b8-0585-41e0-b9f0-95fb31e602b7): routines.logSample - Hello Talend !
2020-05-19 12:18:31 [INFO ] (tParallelize_1_4064c9b8-0585-41e0-b9f0-95fb31e602b7): from_routine.logSample - Hello Talend !
If you want to know which component belongs to which thread, you need to change the log level to add more information.
You can do this in Studio on the Run tab, in the Advanced settings tab of the Job execution.
In Talend Administration Center, you do this in Job Conductor.
Using DEBUG level adds a few extra lines to the log file, which can help you understand which parameters resulted in a certain output:
2020-05-19 12:51:50 [DEBUG] (tParallelize_1_c6de81be-1bbf-4f9b-9b7a-3d92bf345c40): myproject.myjob_0_1.myjob - tParallelize_1 - The subjob starting with the component 'tJava_1' starts.
2020-05-19 12:51:50 [DEBUG] (tParallelize_1_fa636a36-9f53-423f-abc6-b26c4c52c5b4): myproject.myjob_0_1.myjob - tParallelize_1 - The subjob starting with the component 'tJava_3' starts.
2020-05-19 12:51:50 [DEBUG] (tParallelize_1_d4da8ea0-4401-4229-82e9-86ff0ed67c3b): myproject.myjob_0_1.myjob - tParallelize_1 - The subjob starting with the component 'tJava_2' starts.
Keep in mind the following:
- Changing the default log pattern causes Studio to stop coloring the messages.
- The default log level in Studio is defined by the root logger's priority value (Warn, by default); see the sketch after this list.
- Changing the log level changes the number of messages.
- Changing the pattern changes the message format.
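For reference, a minimal sketch of the Log4j v1 root logger section that carries this default priority; the appender name CONSOLE is assumed to match the appender mentioned above:

<root>
    <priority value="warn"/>
    <appender-ref ref="CONSOLE"/>
</root>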
Logging levels
The following table describes the Log4j logging levels you can use in Talend applications:
TRACE: Everything that is available is emitted at this logging level, which makes every row behave like it has a tLogRow component attached. This can make the log file extremely large; however, it also displays the transformation done by each component.
DEBUG: Displays the component parameters, database connection information, queries executed, and information about which row is processed, but does not capture the actual data.
INFO: Includes the Job start and finish times, and how many records were read and written.
WARN: Talend components do not use this logging level.
ERROR: Writes exceptions. These exceptions do not necessarily cause the Job to halt.
FATAL: When this appears, the Job execution is halted.
OFF: Nothing is emitted.

These levels offer high-level controls for messages. When changed from the outside, they affect only the Appenders that did not specify a log level and rely on the level set by the root logger.
Using Appenders
Log4j messages are processed by Appenders, which route the messages to different outputs, such as to console, files, or logstash. Appenders can even send messages to databases, but for database logs, the built-in Stats & Logs might be a better solution.
Storing Log4j messages in files can be useful when working with standalone Jobs. Here is an example of a file Appender:
<appender name="ROLLINGFILE" class="org.apache.log4j.RollingFileAppender"> <param name="file" value="rolling_error.log"/> <param name="Threshold" value="ERROR"/> <param name="MaxFileSize" value="10000KB"/> <param name="MaxBackupIndex" value="5"/> <layout class="org.apache.log4j.PatternLayout"> <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} [%-5p] (%t): %c - %m%n"/> </layout> </appender>You can use multiple Appenders to have multiple files with different log levels and formats. Use the parameters to control the content. The Threshold value of ERROR doesn't provide information about the Job execution, but a value of INFO makes errors harder to detect.
For more information on Appenders, see the Apache Interface Appender page.
Using filters
You can use filters with Appenders to keep messages that are not of interest out of the logs. Log4j v2 offers regular expression based filters too.
The following example filter omits any Log4j messages that contain the string " - Adding the record ".
<filter class="org.apache.log4j.varia.StringMatchFilter"> <param name="StringToMatch" value=" - Adding the record " /> <param name="AcceptOnMatch" value="false" /> </filter>
Overriding default settings in Talend Administration Center
When a Java program starts, it attempts to load its Log4j settings from the log4j.xml file. You can modify this file to change the default settings, or you can force Java to use a different file. For example, you can do this for Jobs deployed to Talend Administration Center by configuring the JVM parameters. This way, you can change the logging behavior of a Job without modifying the original Job, or revert to the original logging behavior by clearing the Active check box.
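As an illustration, the standard system properties for pointing a JVM at an alternative configuration file look like this (the paths are hypothetical; Log4j v1 expects a URL via log4j.configuration, while Log4j v2 reads log4j.configurationFile):

-Dlog4j.configuration=file:/opt/talend/conf/custom_log4j.xml
-Dlog4j.configurationFile=/opt/talend/conf/custom_log4j2.xml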
Qlik Sense and loading data from CSV: text longer than 1,048,576 characters is b...
Qlik Sense can process a maximum of 1,048,576 (2^20) characters per row when loading data from a CSV file. If a row in the source CSV file is longer than this limit, Qlik Sense automatically breaks it into multiple rows in the loaded table.
This does not happen when loading other file formats (such as XML) or when loading the same CSV file in QlikView.
Resolution
To increase the maximum length, set the LongestPossibleLine parameter in the Qlik Sense Engine's Settings.ini file to a value higher than 1048576.
See How to modify Qlik Sense Engine's Settings.ini for detailed instructions of changing parameters in Settings.ini.
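A minimal sketch of the change, assuming the engine's default [Settings 7] section in Settings.ini (the value shown, 2^21, is only an example):

[Settings 7]
LongestPossibleLine=2097152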
Cause
The Qlik Sense engine supports a line length of up to 512 megabytes (512*1024*1024). A script reload can handle strings up to this length in a single data cell. However, when using the data selection wizard, such a long string may break the web socket. Therefore, the maximum string length is limited to 1,048,576 characters to avoid this web socket issue.
Environment
Qlik Replicate SAP ODP CDS View not processing Updates done in SAP
When replicating a CDS View, Updates are processed as Inserts, not Updates. In SAP, the CDS view deltas with UPSERT, and SAP captures INSERT and UPDATE as a single operation, which is treated as an INSERT.
If you have a SAP login you can look up SAP Note 3300238 for more information as shown below:
SAP Note 3300238 - ABAP CDS CDC: ODQ_CHANGEMODE not showing proper status for creation
Component: BW-WHM-DBA-ODA (SAP Business Warehouse > Data Warehouse Management > Data Basis > Operational Data Provider for ABAP CDS, HANA & BW), Version: 4, Released On: 19.01.2024
Resolution
This is working as expected. It is the designed behavior of the CDC logic. For both insert and update, ODQ_CHANGEMODE = U and ODQ_ENTITYCNTR = 1.
The CDC-delta logic is designed as UPSERT-logic. This means a DB-INSERT (or create) or a DB-UPDATE both get the ODQ_CHANGEMODE = U and ODQ_ENTITYCNTR = 1. It's not possible to distinguish in CDC-delta between Create and Update.
Environment
Qlik Replicate
SAP S/4HANA
SAP BW/4HANA
Qlik Replicate: 2023.11 Oracle redo log read - caching is not recovering
All versions of Replicate using an Oracle database that buffers online redo logs can experience caching issues with a Linux-related OS. This is seen and verified when the redo logs are stuck in a loop, reading the same log over and over again.
[SOURCE_CAPTURE ]V: Reading blocks at offset 0000000000000a00 (from block 5) (oradcdc_redo.c:1096)
[SOURCE_CAPTURE ]V: Start read from online Redo log 5120 bytes at offset 0000000000000800 for requested offset 0000000000000a00, thread '1' (oradcdc_redo.c:1147)
[SOURCE_CAPTURE ]V: Completed to read from Redo log with rc 1 (oradcdc_redo.c:1161)
[SOURCE_CAPTURE ]V: Page validate - iBlockIndex 5 rba.iBlockIndex 4 iBlocksCount 2097153. Current Redo log sequence is 10703. (oradcdc_redo.c:1255)
[SOURCE_CAPTURE ]V: Validate Unverified, current Redo log sequence is 10703, block Redo log sequence is 10700 (oradcdc_redo.c:1330)
[SOURCE_CAPTURE ]V: Reading blocks at offset 0000000000000a00 (from block 5) (oradcdc_redo.c:1096)

When archived redo logs are in use, a log switch happens when a new archive log is generated from the online redo logs. While the task is unable to get the most recent online redo log, the Replicate task can detect the log switch and read from the archived redo logs to continue the replication process. Latency will be seen while the task is stuck reading the same redo log, until the archive log can be generated and read. The default task behavior is to recover from the caching issue when a new archive log is finally generated.
The problem:
The supportResetlog Internal Parameter (the default option in a Qlik Replicate task) has been found not to detect the log switches. It does not switch to reading the archived log, so the task remains stuck reading the old cached redo logs.
Resolution
Qlik is actively investigating the issue and will issue a fix. Review the release notes of the latest version for details.
Disabling the supportResetlog Internal Parameter can be used as a workaround.
- Go to the Oracle Source Endpoint connection
- Switch to the Advanced tab
- Click Internal Parameters
- Add supportResetLog in the search bar and choose the Internal Parameter.
- Uncheck the Value column for the parameter and save the changes. Your task needs to be stopped and resumed for this change to take effect.
Internal Investigation ID(s)
QB-26734
RECOB-8423
Environment
- Qlik Replicate 2023.11
- Oracle source endpoint
Qlik Replicate and Oracle source: Invisible primary key – not recognized by repl...
An Oracle table has a primary key that is defined as INVISIBLE, which the Qlik Replicate task does not use. Qlik Replicate encounters errors when updating rows in a table with an Oracle invisible primary key (PK).
Message in the task log file:
[TASK_MANAGER ]W: Table 'XYZ'.D_XYZ' (subtask 0 thread 1) is suspended. Failed to build 'where' statement; Failed to get update statement for table XYZ'.D_XYZ', stream position 00007776: 2024-03-22T06:52:46:484188 [ASSERTION ]V: 'UPDATE (3)' event of table XYZ'.D_XYZ' with id '2065373' does not contain all key values (0 from 1), stream position '00000792.a5e2a494.00000001.0008.01.0000:44682.246184.16' (streamcomponent.c:2984)
The primary key is correctly defined in both source and target Databases and the insert statements are correctly replicated.
Excerpt from table DDL:
CREATE TABLE XYZ.D_XYZ
(
MLOT VARCHAR2(4 BYTE) NOT NULL,
MLDAY NUMBER(6) NOT NULL,
NUM_SEQ NUMBER INVISIBLE NOT NULL,
SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS
ALTER TABLE XYZ.D_XYZ ADD (
CONSTRAINT NUM_SEQ_PK
PRIMARY KEY
(NUM_SEQ)
USING INDEX XYZ.NUM_SEQ_PK
ENABLE VALIDATE);

We use the advanced setting 'Support invisible columns'.
The task does not use the key even when the configuration parameter 'Support invisible columns' is set.
Resolution
The table behaves as if there were no primary key at all, and the task tries to build a where clause with all fields.
In this case, that did not work because the source table only had primary key supplemental logging enabled.
To get this to work, we enabled ALL COLUMN supplemental logging on the source table, after which the task was able to build the correct where clause for updates.
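As an illustration, reusing the table name from the excerpt above, ALL COLUMN supplemental logging can be enabled with a statement of this form:

ALTER TABLE XYZ.D_XYZ ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;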
Related Content
Qlik Replicate SAP Extractor cannot process Line feed character to Snowflake
A task which uses the SAP extractor as the source and replicates data to Snowflake may stop at a line feed character during load processing.
Resolution
In SAP:
- Using transaction code /QTQVC/EXTREP, go to Manage Extractors
- Scroll to the right
- Mark the checkbox in the Chk. Spec. Char column for all offending extractors
Related Content
Activate the extractors for Replicate
Internal Investigation ID(s)
QB-26591
Environment
Using Qlik Application Automation to create and distribute Excel reports in Offi...
With Qlik Application Automation, you can get data out of Qlik Cloud and distribute it to different users as formatted Excel. The workflow can be automated by leveraging the connectors for Office 365, specifically Microsoft SharePoint and Microsoft Excel.
Here I share two example Qlik Application Automation workspaces that you can use and modify to suit your requirements.
Considerations
- This example is built on distributing a SharePoint link. It is also possible to use attachments with the Mail block (see Creating a Qlik Reporting Service report).
- Qlik Application Automation has a limit of 100,000 rows when getting data out of a Qlik Sense straight table object.
- The On-Demand example uses an extension in QSE SaaS to send data to the Automation. An update to the Qlik Sense Button object is expected soon, which will provide a native way to pass selections to an Automation.
Example 1: Scheduled Reports
- Download the 'Scheduled Report.json' file attached to this document.
- Create a new Automation in QSE SaaS, give it a name, and then upload the workspace you just downloaded by right-clicking in the editor canvas and selecting 'Upload workspace'.
- Select the 'Create Binary File (Personal One Drive)' block, select 'Connection' in the block configurator to the right, and then create your connection to Microsoft SharePoint.
- Select the 'Get Straight Table Data' block. Under 'Inputs' in the block configurator, look up the App Id, Sheet Id, and Object Id for the relevant QSE SaaS table you wish to output.
- Select the 'Create Excel Table With Headers' block, select 'Connection' in the block configurator, and then create your connection to Microsoft Excel.
- Select the 'Send Mail' block. Under 'Inputs' in the block configurator update the 'To' to reflect the addresses you wish to deliver to.
- With the 'Send Mail' block still selected, select 'Connection' in the block configurator and add your Sender details.
- To test, Save and then Run the Automation
- If you receive any warnings or errors, navigate to the relevant blocks and ensure your Connection is selected in the block configurator.
- Select the 'Start' block. Under 'Inputs' in the block configurator, change Run Mode to Scheduled and define your required schedule.
Example 2: On-Demand Reports
Note - These instructions assume you have already created connections as required in Example 1.
- Download the 'On-Demand Report v3.json' file attached to this document.
- Download and install the 'qlik-blends' extension. See: https://github.com/rileymd88/qlik-blends/files/6378232/qlik-blends.zip
- Create a new Automation in QSE SaaS, give it a name, and then upload the workspace you just downloaded by right-clicking in the editor canvas and selecting 'Upload workspace'.
- Ensure your Connections are selected in the block configurator for each of the following blocks, 'Create Binary File (Personal One Drive)', 'Create Excel Table With Headers', 'Add Rows To Excel Worksheet Table (Batch)', 'Create Sharing Link', and 'Send Mail'. Save the Automation.
- Select the 'Start' block and ensure Run Mode is set to Triggered. Make note of the URL and Execution Token shown in the POST example.
- Open your chosen QSE SaaS application, and Edit the Sheet where you wish to add a Button to trigger an On-Demand report.
- Under 'Custom Objects' look for 'qlik-blends' from the Extensions menu and drag this into your Sheet.
- Under the 'Blend' properties to the right, add in your POST webhook URL and Token as noted in Step 5.
- We will now add three measures to the 'qlik-blends' object. It is important you add them in the order described. Add the first measure, using the following function in the expression editor: GetCurrentSelections()
- Add the second measure, using the following function DocumentName()
- The final measure will be the Object ID of the table you wish to use. To find the Object ID, select 'Done Editing'. Then right-click on the table, select 'Share', select 'Embed', then look for the Object ID under the preview. Copy this value, go back into Editing mode, and paste it as your third measure value.
- With the 'qlik-blends' object selected, under Form select 'Add items'. For 'Item type' select Text. Under default value you can choose to add a default email address. For 'Label' and 'Reference' type 'Email'. It is critical that Reference is updated to 'Email'. Turn 'Required input' on.
- You can change the Appearance properties to suit your preferences, such as updating the Button label and message, enabling Dialog, and changing the Color under Theme.
- Back in the Automation, under the Start block, set 'Run asynchronously' to yes to allow multiple requests to run at the same time. (This also increases the max run time from 1 min to 60 min.)
- Once happy, test the On-Demand report by entering an email and clicking the button.
This On-Demand Report Automation can be used across multiple apps and tables. Simply copy the extension object between apps and sheets, and update the Object ID (Measure 3) for each instance.
Environment
- Qlik Application Automation
- Qlik Cloud
- Microsoft Office 365
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
Distribution of measures to multiple Qlik Sense apps
This article gives an overview of the measure distribution use case. It explains a basic example of a template configured for this scenario and additions for a more advanced use case.
For this use case, we will define the following keywords/expressions:
- master items: the following Qlik Sense objects are currently supported: measures, dimensions, and variables.
- main app: a Qlik Sense app stored in your tenant in which you define and maintain all your master items for distribution.
- destination space: a space that contains all the Qlik Sense apps that you want your master items to be synced to.
By using this approach, all you need to do is create/update your master items in your main app, and then push these updates to all your destination apps. This way, all destination apps have the same master items.
To support this use case, we created a basic template, which uses measures as master items.
By running this template, you will be able to distribute all the measures created in your main app to all the apps available in the destination space.
Basic automation overview:
All you need to do is select your main app and your destination space.
Of course, this is just a basic implementation. This template can be upgraded to suit more advanced scenarios.
Let's go over a few examples:
- you can sync multiple types of master items (ex: dimensions, variables) by adding their respective blocks into the template, in a similar fashion as the measure ones.
- instead of syncing all the measures from your main app, you can filter them by applying a specific tag, and then using a Filter List block to select only the tagged measures.
- you can also apply the filtering logic to create a selection of destination apps to sync your measures to, instead of an entire space.
- you can add a deletion logic section that could come in handy when you delete measures from your main app and you want this to be mirrored in your destination space/apps as well.
- you can include an input type block, with the inputs being the source app id, the destination space/apps ids and/or related tags or other optional inputs, so you can trigger the template from another source.
- make sure the output of the List Apps block doesn't contain the source app. You can prevent this by:
- adding a space id to the List Apps block and making sure the source app is not in the same space as the target apps.
- adding a Condition block right before the Create Or Update Measure block; this validates the app id of every app in the List Apps block's output to make sure the block isn't executed for the source app.
Change distribution
The changes made by this automation won't be accessible immediately in other sessions (like the Qlik Sense UI); more info on that can be found here: Automation session delay. It can take up to 40 minutes for these changes to become visible in other sessions. If the changes are needed sooner, the Save App block can be used, but keep in mind it can only be used once for every app that's changed by the automation. More information on the Save App block can be found here: How to use the Save App block.
For the above example, it's best to add an additional List Apps block that's configured exactly the same as the first one, so it returns the same apps. We'll add a Save App block in the loop of the new List Apps block and configure it to run for every app that's returned. This way, we make sure that the Save App block is executed only once for every app that was changed. See the image below for an example with the Save App block.
Advanced template overview:
First part: includes an input block for the source/destination apps and for the measure tags.
Second part: includes a measure deletion flow, for a complete sync automation process.
Both these template examples are available as attachments.
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
QMC Reload Failure Despite Successful Script in Qlik Sense Nov 2023 and above
Reloads fail in the QMC even though the script part is successful, in Qlik Sense Enterprise on Windows November 2023 and above.
When you are using NetApp-based storage, you might see an error when trying to publish and replace, or when reloading a published app. In the QMC, you will see that the script load itself finished successfully, but the task failed after that.
ERROR QlikServer1 System.Engine.Engine 228 43384f67-ce24-47b1-8d12-810fca589657
Domain\serviceuser QF: CopyRename exception:
Rename from \\fileserver\share\Apps\e8d5b2d8-cf7d-4406-903e-a249528b160c.new
to \\fileserver\share\Apps\ae763791-8131-4118-b8df-35650f29e6f6
failed: RenameFile failed in CopyRenameExtendedException: Type '9010' thrown in file
'C:\Jws\engine-common-ws\src\ServerPlugin\Plugins\PluginApiSupport\PluginHelpers.cpp'
in function 'ServerPlugin::PluginHelpers::ConvertAndThrow'
on line '149'. Message: 'Unknown error' and additional debug info:
'Could not replace collection
\\fileserver\share\Apps\8fa5536b-f45f-4262-842a-884936cf119c] with
[\\fileserver\share\Apps\Transactions\Qlikserver1\829A26D1-49D2-413B-AFB1-739261AA1A5E],
(genericException)'
<<< {"jsonrpc":"2.0","id":1578431,"error":{"code":9010,"parameter":
"Object move failed.","message":"Unknown error"}}ERROR Qlikserver1 06c3ab76-226a-4e25-990f-6655a965c8f3
20240218T040613.891-0500 12.1581.19.0
Command=Doc::DoSave;Result=9010;ResultText=Error: Unknown error
0 0 298317 INTERNAL sa_scheduler b3712cae-ff20-4443-b15b-c3e4d33ec7b4
9c1f1450-3341-4deb-bc9b-92bf9b6861cf Taskname Engine Not available
Doc::DoSave Doc::DoSave 9010 Object move failed.
06c3ab76-226a-4e25-990f-6655a965c8f3
Resolution
Qlik Sense Client Managed version:
- May 2024 Initial Release
- February 2024 Patch 4
- November 2023 Patch 9
Potential workarounds
- Change the storage to a file share on a Windows server
Cause
The most plausible cause currently is that the specific engine version has issues releasing File Lock operations. We are actively investigating the root cause, but there is no fix available yet.
Internal Investigation ID(s)
QB-25096
QB-26125
Environment
- Qlik Sense Enterprise on Windows November 2023 and above
How to Apply a Signed License Key (SLK) to Qlik Sense Enterprise on Windows
This article covers the details of how to license a Qlik Sense Enterprise on Windows server with a Signed License Key (SLK).
Index:
- About the Signed License Key:
- How to apply the license:
- Related Content
- How do I license....?
- Looking for pricing information?
About the Signed License Key:
- It replaces the 16-digit license key and control number.
- It is a JSON Web token.
- It uses secure communication, HTTPS://
To apply a Signed License Key, a secure network connection must be established: a signed license key requires connectivity to license.qlikcloud.com. See List of IP Addresses behind license.qlikcloud.com and lef.qliktech.com for details.
The connection can be established in any of the security scenarios below:
- Less strict security: HTTPS port (port 443) is open
- More secure: HTTPS port is open outbound only
- Even more secure: HTTPS port is open outbound, limited to https://license.qlikcloud.com
- Most secure: HTTPS via proxy, with or without proxy authentication
- No connection: see Using a Qlik signed license in an offline environment.
All nodes in a Qlik Sense Enterprise on Windows on-premise multi node environment need access to the license server.
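As a quick sketch for verifying that connectivity from a node (assuming a Windows host with the built-in Test-NetConnection cmdlet):

Test-NetConnection license.qlikcloud.com -Port 443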
How to apply the license:
- You receive the Signed License Key via email after a successful purchase
- Launch the Qlik Sense Management Console
- If not immediately prompted for a license, go to License Management
- Open Site License in the menu to the right
- Paste in your full key. This is the JSON Web Token you received in the email.
- Click Apply
Related Content
If you're looking for additional information about other products, licenses, or license pricing, we've compiled a list of the most commonly referenced material:
How do I license....?
- How to license QlikView Server
- How to license QlikView Desktop
- How to license NPrinting
- Activate Qlik Products without Internet access - April 2020 and onwards
Looking for pricing information?
How to customize the text of a generic Qlik Sense error message
The information in this article is provided as is. Adjustments to error messages cannot be supported by Qlik Support and can lead to difficulties identifying underlying root causes of technical issues experienced later. All changes will be reverted after an upgrade.
The method documented in this article is intended for later versions of Qlik Sense Enterprise on Windows.
The message texts are defined in JavaScript files located by default in path C:\Program Files\Qlik\Sense\Client\translate\. There are subfolders for each supported language, e.g. en-US for English.
To modify the message text, edit e.g. the file hub.js in the folder corresponding to the language you want to change the text for.
To change e.g. the default message text for access denied messages, you need to find the following line:

"ProxyError.OnLicenseAccessDenied": "You cannot access Qlik Sense because you have no access pass.",

To change the text of the message, modify the second part of the line, which is the text in quotation marks after the message key (see the example below):

"ProxyError.OnLicenseAccessDenied": "This is a modified message text for access denied events",
To modify the error message:
- Navigate to C:\Program Files\Qlik\Sense\Client\translate\
- Open the folder corresponding to the language you wish to change, e.g. en-US for English.
- Open either the hub.js file or the qmc.js file. Older versions of Qlik Sense relied on a single client.json file.
- Search for the standard error message
- Modify the message text accordingly. If the message you wish to modify cannot be located, then the message cannot be customized.
- Save the file
- Restart the Qlik Sense Proxy service
Users will not be able to see the new message until their browser cache has been cleared.
Qlik Sense QRS API using Xrfkey header in PowerShell
Qlik Sense Repository Service API (QRS API) contains all data and configuration information for a Qlik Sense site. The data is normally added and updated using the Qlik Management Console (QMC) or a Qlik Sense client, but it is also possible to communicate directly with the QRS using its API. This enables the automation of a range of tasks, for example:
- Start tasks from an external scheduling tool
- Change license configurations
- Extract data about the system
Using Xrfkey header
A common vulnerability in web clients is cross-site request forgery (XSRF), which lets an attacker impersonate a user when accessing a system. The Xrfkey is used to prevent this; without the Xrfkey set in the URL, the server sends back the message: XSRF prevention check failed. Possible XSRF discovered.
Note: This example is related to token-based licenses. If this needs to be configured with Professional/Analyzer license types, you might need to use the following API calls:
- /qrs/license/professionalaccesstype/full
- /qrs/license/analyzeraccesstype/full
Furthermore, if you combine this with Qlik CLI and need to monitor or, more specifically, remove users, the following link from the community might be useful: Deallocation of Qlik Sense License
Resolution:
This procedure has been tested in a range of Qlik Sense Enterprise on Windows versions.
- PowerShell 3.0 or higher (installed by default in Windows 8 / Windows Server 2012 and later)
- Make sure the Qlik Repository service is up and running and port 4242 is open on the target server
Method 1: Authenticating through Qlik Proxy Service
- Go to PowerShell ISE and paste the following script
- In this example we are sending a GET request with a header of Xrfkey=12345678qwertyui and we are addressing the end point of /about. For more details on all end points, please refer to Connecting to the QRS API
$hdrs = @{}
$hdrs.Add("X-Qlik-xrfkey","12345678qwertyui")
$url = "https://qlikserver1.domain.local/qrs/about?xrfkey=12345678qwertyui"
Invoke-RestMethod -Uri $url -Method Get -Headers $hdrs -UseDefaultCredentials

Method 2: Use certificate and send direct request to Repository API
- Open Qlik Management Console and export the certificate. Please refer to Export client certificate and root certificate to make API calls with Postman for procedure.
- Make sure that port 4242 is open between the machine making the API call and the Qlik Sense server.
- Import the certificate on the machine you will use to make API calls. It must be imported into the personal certificate store of your user in MMC. The following PowerShell script automatically fetches the Qlik Client certificate from the current user's personal certificate store. (You may need to modify the script if you have QlikClient certificates imported from different Qlik Sense servers in the store.)
- Paste the below script in PowerShell ISE:
$hdrs = @{}
$hdrs.Add("X-Qlik-xrfkey","12345678qwertyui")
$hdrs.Add("X-Qlik-User","UserDirectory=DOMAIN;UserId=Administrator")
$cert = Get-ChildItem -Path "Cert:\CurrentUser\My" | Where {$_.Subject -like '*QlikClient*'}
$url = "https://qlikserver1.domain.local:4242/qrs/about?xrfkey=12345678qwertyui"
Invoke-RestMethod -Uri $url -Method Get -Headers $hdrs -Certificate $cert
Execute the command.
A possible response for the two above scripts may look like this (note that the JSON string is automatically converted to a PSCustomObject by PowerShell):

buildVersion : 23.11.2.0
buildDate : 9/20/2013 10:09:00 AM
databaseProvider : Devart.Data.PostgreSql
nodeType : 1
sharedPersistence : True
requiresBootstrap : False
singleNodeOnly : False
schemaPath : About
Related and advanced Content:
If there are certificates from several Qlik Sense servers, they cannot be fetched by subject: there will be several certificates with the subject QlikClient, and the script will fail because it returns an array of certificates instead of a single certificate. In that case, fetch the certificate by thumbprint. This requires more PowerShell knowledge, but an example can be found here: How to find certificates by thumbprint or name with powershell
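A minimal sketch of fetching by thumbprint instead; the thumbprint value below is a placeholder for your certificate's actual thumbprint:

$cert = Get-ChildItem -Path "Cert:\CurrentUser\My" | Where-Object {$_.Thumbprint -eq 'AB12CD34EF56AB12CD34EF56AB12CD34EF56AB12'}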
Qlik Sense Session Inactivity Timeout and Keep Alive settings
Qlik Sense allows for three settings that may influence the perceived connection and session timeout period. These are the "Session Inactivity Timeout", "Keep-Alive Timeout", and "TCP Websocket keep-alive" settings.
Note: Adjusting the below settings can help when working with slow internet connectivity or wanting to extend the session inactivity. However, session disconnect issues can be caused by other network connectivity issues and by system resource shortage as well and may require additional troubleshooting. See Hub access times out with: Error Connection lost. Make sure that Qlik Sense is running properly
1. The connection "Keep Alive" setting
This is the maximum timeout for a single HTTP request. The default value is 10 seconds. During the defined keep alive timeout value, the connection between end user and Qlik Sense will remain open.
It serves as protection against denial-of-service attacks. That is, if an ongoing request exceeds this period, Qlik Sense proxy will close the connection.
Increase this value if your users work over slow connections and experience closed connections for which no other workaround has been found. Make sure to take the mentioned DoS consideration above into account.
2. The "Session Inactivity Timeout"
This is the browser authentication session timeout (30 minutes by default, set under Virtual Proxy in the QMC). This sets a cookie on the client machine with the name X-Qlik-Session. This cookie can be traced in Fiddler or Developer tools under the header tab.
If the session cookie header value does not get passed, or is destroyed or modified between the end user client and the Qlik Sense server while 'in flight', the user session is terminated and the user is logged out.
By default, it will be destroyed after 30 minutes of inactivity or when the browser is closed.
3. TCP Websocket keep-alive
This is another setting that may help keep the connection open in certain environments. See Enabling TCP Keep Alive Functionality In Qlik Sense. Note that customers who don't experience any issues with web sockets terminated by the network due to inactivity SHOULD NOT switch this feature ON, since it may cause Qlik Sense to send unnecessary traffic on the network towards the client.
Related Content:
Environment:
- Qlik Sense Enterprise on Windows, all versions
How to import and export master items using Microsoft Excel with Qlik Applicatio...
This article explains how to import and export master items to and from a Qlik Sense app using the Microsoft Excel connector in Qlik Application Automation.
Content:
- Export master items to a Microsoft Excel sheet
- Import master items from a Microsoft Excel sheet
- Edge cases & next steps
The first part of this article will explain how to export all of your master items configured in your Qlik Sense App to a Microsoft Excel sheet. The second part will explain how to import those master items from the Microsoft Excel sheet back to a Qlik Sense App.
Export master items to a Microsoft Excel sheet
For this, you will need a Qlik Sense app in your tenant that contains the measures, dimensions, and variables you want to export. You'll also need an empty Microsoft Excel file. The image below contains a basic example of exporting master items.
The following steps will guide you through recreating the above automation:
- Add the Create Workbook Session block from the Microsoft Excel connector. Configure it with the following settings:
- Drive id -> use do lookup
- Item id -> use do lookup to find the empty destination file, if you don't know the path of your file, you can do an empty search (if it isn't located in a folder)
- Add the Add Worksheet block from the Microsoft Excel connector. Use the same Drive Id & Item Id from the previous step and configure the Name parameter to a string of your choice. In this example, we'll use Measures.
- Add the Create Excel Table With Headers block from the Microsoft Excel connector. This block will create a table inside the sheet from the previous step. Specify the following in the block's configuration:
- Start Row -> 1
- Start Column -> A
- End Column -> E
- Headers -> Field,Name,Label Expression,Description,Tags,Measure Color, Segment Color, Number Format
- Name -> Measures
- Add a List Measures block from the Qlik Cloud Services connector and also a Get Measure block inside the loop created by the List Measures block, configure it to get the current item in the loop. Add two variables to convert Gradient and NumFormat JSON objects to string.
- Add an Add Row To Table block from the Microsoft Excel connector inside the loop after the Get Measure block. This will add every measure's information to the table one by one. Set the Drive Id, Item Id, Worksheet, and Table Id to the corresponding values from the previous blocks.
The Row parameter should be an array of values for every header we specified in the Create Excel Table With Headers block (and in the same order). Title, Label expression, and Description should be encapsulated in double quotes. Apply the JSON encode formula to the value for the Definition (qDef) and apply the Implode formula to the value for the Tags.
- Add a Close Workbook Session block from the Microsoft Excel connector. Specify the same Drive Id & Item Id as in the previous blocks and configure the Session Id to the Id returned by the Create Workbook Session block.
An export of the above automation can be found at the end of this article as Export master items to a Microsoft Excel sheet.json
Import master items from a Microsoft Excel sheet
For this example, you'll first need a Microsoft Excel file with sheets configured for each master item type (dimensions, measures, and variables). Use the above example to generate this file. The image below contains a basic example on importing master items from Microsoft Excel to a Qlik Sense app.
- Add the List Rows With Headers block from the Microsoft Excel connector to read master items from the Excel file. Configure the block with the following settings:
- Drive Id -> use do lookup
- Item Id -> use do lookup to find the empty destination file, if you don't know the path of your file, you can do an empty search (if it isn't located in a folder)
- Worksheet Name -> the name of the sheet that contains the measures. Feel free to use do lookup
- Start Cell -> the upper-left cell of the measures table, this should include the header row, for example, A1
- End Cell -> the bottom right cell of the measures table, for example, E23
- Add a Condition block inside the loop created by the List Rows With Headers block. The condition will verify that every measure row has the required information to create a measure (name and expression).
See the below image for an example:
- Add the Create Or Update Measure block from the Qlik Cloud Services connector to create or update the measure in the destination app. Map each field from the row in the Excel table to the corresponding input field in the Create Or Update Measure block. Measure Id can be left empty since we're matching measures by Name (Measure Ids can be different across apps). See the below image for an example:
An export of the above automation can be found at the end of this article as Import master items from a Microsoft Excel Sheet.json
Follow the same steps to build automations that import/export dimensions and variables.
Edge cases & next steps
Let's go over some edge cases when exporting information to Microsoft Excel:
- Fields that start with an equals sign '=' (for example, some variables' definitions) are treated as Excel functions and can be deemed invalid by the Excel API. You can resolve this by adding a single quote before the input field's mapping in the automation.
- Fields that contain newlines (for example measure expressions that contain comments) are invalidated by the Excel API. The solution here is to use the JSON formula to encode the string.
Please check the following articles for more information about working with master items in Qlik Application Automation and also uploading data to Microsoft Excel.
- How to get started with Microsoft Excel
- Distribution of Master items using Qlik Application Automation
- Uploading data to Microsoft Excel
Follow the steps provided in this article How to import & export automations to import the automation from the shared JSON file.
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
How to configure Qlik Cloud with Okta
This guide provides the basic instructions on configuring Qlik Cloud with Okta as an identity provider.
This customization is provided as is. Qlik Support cannot provide continued support of the solution. For assistance, reach out to our Professional Services or engage in our active Integrations forum.
Configuring Okta
- Go to your Okta Admin Console
- Navigate to Applications
- Click Create App Integration
- Choose OIDC - OpenID Connect and Web Application, then click Next
- Fill in the App Integration Name (this name identifies the application)
- Set Grant type to Authorization Code
- Enter your tenant URL in Sign-in redirect URIs, adding /login/callback
Example: https://tenant_url/login/callback
This must be the actual tenant name, not the alias.
- Scroll down to the Assignments section and select Allow everyone in your organization to access
- Click Save
- Copy the Client ID and Client Secret. Both are needed when configuring the IdP on the tenant.
- Switch to the Sign On tab
- Click Edit on the OpenID Connect ID Token
- Set Issuer to the Okta URL
- Set Group claim type to Filter
- Set Group claim filter to groups, followed by Matches regex and .*
- Click Save
- The next step is to add an Authorization Server
If you do not have access to Okta's API Access Management, see Using a custom Authorization Server for Okta in Qlik Cloud.
- Expand the Okta admin panel menu
- Expand Security and open API
- Click Add Authorization Server.
- Set Name to QlikAPI (example)
- Set Audience to qlik.api
- Set Issuer to Okta URL
- Leave everything else default, then click Save
- Switch to the Scopes tab
- Click Add Scope
- Set the Name
- Set a Display phrase
- Set a Description
- Set User consent to Implicit
- Mark Set as default scope
- Leave Include in public metadata unchecked
- Click Save
- Switch to Access Policies
- Click Add Policy
- Set a Name
- Set a Description
- Set Assign to to All clients
- Click Update Policy
- Click Add rule
- Set a Rule Name
- Check Client Credentials
- Uncheck all items under Client acting on behalf of a user
- Check Any user assigned the app
- Check Any scopes
- Leave the remaining settings at default
- Click Create rule
- Verify that Client Credentials, Any user assigned the app, and Any scopes are checked, then click Update Rule
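Before configuring the tenant, it can be useful to confirm that both authorization servers are reachable and advertise the expected issuer and scopes. A minimal Python sketch using Okta's standard metadata endpoints; the domain and authorization server ID are placeholders:
import json
import urllib.request

OKTA_DOMAIN = "https://your-org.okta.com"  # placeholder: your Okta URL
AUTH_SERVER_ID = "aus1abcdefghIJKL0h8"     # placeholder: the QlikAPI server's ID

# OIDC discovery document for the org authorization server.
oidc_url = f"{OKTA_DOMAIN}/.well-known/openid-configuration"
# Metadata for the custom (API Access Management) authorization server.
api_url = f"{OKTA_DOMAIN}/oauth2/{AUTH_SERVER_ID}/.well-known/oauth-authorization-server"

for url in (oidc_url, api_url):
    with urllib.request.urlopen(url) as resp:
        meta = json.load(resp)
    print(url)
    print("  issuer:", meta.get("issuer"))
    print("  scopes:", meta.get("scopes_supported"))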
Configuring Qlik Cloud Tenant
- Open the Qlik Cloud Management Console and browse to Identity Providers
- Click Create New
- Choose Interactive
- Choose Okta
- Fill out the Application credentials as per the Okta Setup
- Provide your claims mapping as per your setup
- Click Create
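This step can also be scripted against the tenant's identity-providers REST endpoint. The sketch below is illustrative only: the payload shape and option names are assumptions based on the manual steps above, and the tenant URL and API key are placeholders:
import json
import urllib.request

TENANT = "https://your-tenant.us.qlikcloud.com"  # placeholder
API_KEY = "<API key with admin privileges>"      # placeholder

# Assumed payload shape mirroring the Management Console steps above.
payload = {
    "provider": "okta",
    "protocol": "OIDC",
    "interactive": True,
    "options": {
        "discoveryUrl": "https://your-org.okta.com/.well-known/openid-configuration",
        "clientId": "<Client ID from Okta>",
        "clientSecret": "<Client Secret from Okta>",
    },
}

req = urllib.request.Request(
    f"{TENANT}/api/v1/identity-providers",
    data=json.dumps(payload).encode(),
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, json.load(resp))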
For additional information on how to create new identity providers in Qlik Cloud, see Creating a new identity provider configuration.
Environment: Qlik Cloud
The information in this article is provided as-is and to be used at own discretion. Depending on tool(s) used, customization(s), and/or other factors ongoing support on the solution below may not be provided by Qlik Support.
-
Exploring Qlik Cloud Data Integration
This Techspert Talks session covers: what Qlik Cloud Data Integration can do, how it can be used, and Cloud Data Architecture. Chapters: 01:10 - Dat...
-
Qlik Talend Products: Java 17 Migration Guide
From R2024-05, Java 17 will become the only supported version to start most Talend modules, enforcing the improved security of Java 17 and eliminating concerns about Java's end-of-support for older versions. In 2025, Java 17 will become the only supported version for all operations in Talend modules.
Starting from v2.13, Talend Remote Engine requires Java 17 to run. If some of your artifacts, such as Big Data Jobs, require other Java versions, see Specifying a Java version to run Jobs or Microservices.
Content
- Prerequisites
- Procedure
- Windows
- Linux
- MAC OS
- Multiple JDK versions
- Studio
- Remote Engine
- ESB - Runtime
- Studio
- Talend Administration Center (TAC)
- CICD
- Windows Users
- Linux Users
- Jenkins Users
- Additional Notes
- Specifying a Java version to run Jobs or Microservices
Prerequisites
Qlik Talend Module and required patch level/version:
- Studio: supported from R2023-10 onwards
- Remote Engine: 2.13 or later
- Runtime: 8.0.1-R2023-10 or later
Procedure
Windows
For Windows users, please follow the JDK installation guide (docs.oracle.com).
Linux
For Linux users, please follow the JDK installation guide (docs.oracle.com).
MAC OS
For MAC OS users, please follow the JDK installation guide (docs.oracle.com).
Multiple JDK versions
When working with software that supports multiple versions of Java, it's important to be able to specify the exact Java version you want to use. This ensures compatibility and consistent behavior across your applications. Here is how to specify a particular Java version for the following products (for example, on build servers, shared application servers, and similar environments); a quick environment check is sketched below:
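Before editing any product configuration, it can help to confirm which Java the environment currently resolves. A small, illustrative Python check:
import os
import shutil
import subprocess

# The two places most products look when resolving a JDK:
# the java binary on PATH and the JAVA_HOME variable.
print("java on PATH:", shutil.which("java"))
print("JAVA_HOME   :", os.environ.get("JAVA_HOME", "<not set>"))

# 'java -version' prints the active version (to stderr on most JDKs).
subprocess.run(["java", "-version"])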
Studio
For Studio users who are using multiple JDKs, please follow the appropriate installation instructions listed above, then take the following additional steps:
- Back up and edit the <Studio Home>\Talend-Studio-win-x86_64.ini file
- Prepend:
-vm
<JDK17 HOME>\bin\server\jvm.dll
Remote Engine
For Remote Engine (RE) users who are using multiple JDKs, please follow the appropriate installation instructions listed above, then take the following additional steps.
- Back up and edit the <RE HOME>/etc/talend-remote-engine-wrapper.conf file
- Modify the set.default.JAVA_HOME= property to point to the <JDK 17 HOME> path.
Note 1: If Remote Engine is not installed as a service, the JDK path is set in the <RE HOME>/bin/setenv file instead.
Note 2: When it comes to running Jobs or Microservices, you retain the flexibility to either use the default Java 17 version or choose older Java versions, through straightforward configuration of the engine.
How to modify?
Check the following configuration files in the etc folder and change the relevant property to the installed JDK/JRE path:
- org.talend.ipaas.rt.dsrunner.cfg -> ms.custom.jre.path
- org.talend.remote.jobserver.server.cfg -> org.talend.remote.jobserver.commons.config.JobServerConfiguration.JOB_LAUNCHER_PATH
ESB - Runtime
For Runtime users who are using multiple JDKs, please follow the appropriate installation instructions listed above, then take the following additional steps.
- Back up and edit the <Runtime home>/etc/<TALEND-8-CONTAINER service>-wrapper.conf file
- Modify the set.default.JAVA_HOME= property to point to the <JDK 17 HOME> path
If Runtime is not running as a service:
- Back up and edit the <Runtime home>/bin/setenv.sh file
- Set JAVA_HOME to the <JDK 17 HOME> path
Studio
- Data Integration (DI): After installing the 8.0 R2023-10 Talend Studio monthly update or a later one, if you switch the Java version to 17 and relaunch your Talend Studio with Java 17, you must enable your project settings for Java 17 compatibility.
- Go to Studio
- Go to File
- Edit Project properties
- Go to Build
- Go to Java Version
- Activate "Enable Java 17 compatibility"
With the Enable Java 17 compatibility option activated, any Job built by Talend Studio cannot be executed with Java 8. For this reason, verify the Java environment on your Job execution servers before activating the option.
- Big Data Users: Do not enable Java 17 compatibility unless your Spark Cluster supports Java 17.
Talend Administration Center (TAC)
To use Talend Administration Center with Java 17, you need to open the <tac_installation_folder>/apache-tomcat/bin/setenv.sh file and add the following commands:
# export modules
export JAVA_OPTS="$JAVA_OPTS --add-opens=java.base/sun.security.x509=ALL-UNNAMED --add-opens=java.base/sun.security.pkcs=ALL-UNNAMED"
Windows users should edit <tac_installation_folder>\apache-tomcat\bin\setenv.bat instead.
CICD
Windows Users
For Java 17 users, the Talend CI/CD process requires the following Maven options:
- Back up and edit <Maven_home>\bin\mvn.cmd
- Modify it to include:
set "MAVEN_OPTS=%MAVEN_OPTS% --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/sun.security.x509=ALL-UNNAMED --add-opens=java.base/sun.security.pkcs=ALL-UNNAMED"
Linux Users
For Java 17 users, the Talend CI/CD process requires the following Maven options:
- Back up and edit <Maven_home>/bin/mvn
- Modify it to include:
export MAVEN_OPTS="$MAVEN_OPTS \
  --add-opens=java.base/java.net=ALL-UNNAMED \
  --add-opens=java.base/sun.security.x509=ALL-UNNAMED \
  --add-opens=java.base/sun.security.pkcs=ALL-UNNAMED"
Jenkins Users
- Back up and edit the jenkins_pipeline_simple.xml file
- Include the following in the TALEND_CI_RUN_CONFIG parameter:
<name>TALEND_CI_RUN_CONFIG</name>
<description>Define the Maven parameters to be used by the product execution, such as:
- Studio location
- debug flags
These parameters will be put to maven 'mavenOpts'.
If Jenkins is using Java 17, add:
--add-opens=java.base/java.net=ALL-UNNAMED
--add-opens=java.base/sun.security.x509=ALL-UNNAMED
--add-opens=java.base/sun.security.pkcs=ALL-UNNAMED
</description>
Additional Notes
Specifying a Java version to run Jobs or Microservices
Overview
Enable your Remote Engine to run Jobs or Microservices using a specific Java version.
By default, a Remote Engine uses the Java version of its environment to execute Jobs or Microservices. From Remote Engine v2.13 onwards, Java 17 is mandatory for engine startup. However, when it comes to running Jobs or Microservices, you can specify a different Java version. This feature allows you to use a newer engine version to run artifacts designed with older Java versions, without the need to rebuild these artifacts, such as Big Data Jobs, which rely on Java 8 only.
When developing new Jobs or Microservices that do not exclusively rely on Java 8, that is to say, they are not Big Data Jobs, consider building them with the add-opens option to ensure compatibility with Java 17. This option opens the necessary packages for Java 17 compatibility, making your Jobs or Microservices directly runnable on the newer Remote Engine version, without having to go through the procedure explained in this section for defining a specific Java version. For further information about how to use this add-opens option and its limitations, see Setting up Java in Talend Studio.
Procedure
- Stop the engine.
- Browse to the <RemoteEngineInstallationDirectory>/etc directory.
- Depending on the type of the artifacts you need to run with a specific Java version, do the following:
For both artifact types, use backslashes to escape characters specific to a Windows path, such as colons, whitespace, and directory separators, keeping in mind that directory separators are themselves backslashes on Windows. (A small helper that automates this escaping is sketched after this procedure.)
Example:
c:\\Program\ Files\\Java\\jdk11.0.18_10\\bin\\java.exe
- For Jobs, in the <RemoteEngineInstallationDirectory>/etc/org.talend.remote.jobserver.server.cfg file, add the path to the Java executable file.
Example:
org.talend.remote.jobserver.commons.config.JobServerConfiguration.JOB_LAUNCHER_PATH=c:\\jdks\\jdk11.0.18_10\\bin\\java.exe
- For Microservices, in the <RemoteEngineInstallationDirectory>/etc/org.talend.ipaas.rt.dsrunner.cfg file, add the path to the Java bin directory.
Example:
ms.custom.jre.path=C\:/Java/jdk/bin
Make this modification before deploying your Microservices to ensure that these changes are correctly taken into account.
- Restart the engine.
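The Windows-path escaping shown above is easy to get wrong by hand. Below is a small, hypothetical Python helper that applies the stricter properties-file convention, escaping backslashes, colons, and spaces (the Jobs example above leaves the drive colon unescaped; Java properties parsers accept both forms):
import re

def escape_windows_path_for_cfg(path: str) -> str:
    # Backslash-escape every backslash, colon, and space so the path
    # survives a Java-properties-style .cfg file.
    return re.sub(r"([\\: ])", r"\\\1", path)

print(escape_windows_path_for_cfg(r"c:\jdks\jdk11.0.18_10\bin\java.exe"))
# c\:\\jdks\\jdk11.0.18_10\\bin\\java.exe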
-
Qlik Gold Client Key Split Enhancement Overview
Key Split is a performance tuning option within Qlik Gold Client. Introduced in Qlik Gold Client 8.7.3, it is used to control which tables allow for splitting of table content in different Data Flows for export/import. This functionality is triggered when the configured table is a Header table in a Data Type, where it will split the table entries.
The benefit is to allow Gold Client to parallelize the import of data in the same table using multiple jobs, making the import process much faster.
Key Split should be used carefully and only for very specific export/import scenarios, especially when there is a performance issue importing tables on a target system. To maximize the functionality, the parallel processing option must be used.
At this moment, special table field types such as GUID, RAW, STRING, and XSTRING should not be used with Key Split.
There are four different types of Key Split options available within Qlik Gold Client:
- Alphanumeric
For a specific range of documents, it will separate the documents by the first alphanumeric digit and create a Container for each of them.
A, B, C, …, 0, 1, 2, …, Special Chars.
Example: B123456789 (this entry will be added with all that start with “B”)
- Alphanumeric by Grouping (introduced in Gold Client 8.7.4 Patch 3 – 8.7.2024.05)
For a specific range of documents, it will separate the documents by the first alphanumeric digit grouping and create a Container for each group.
A-C, D-F, G-I, J-L, M-O, P-R, S-U, V-X, Y-Z, 0-2, 3-5, 6-8, 9-Special Chars.
Example: B123456789 (this entry will be added with all that start with “A*”, “B*”, or “C*”)
- Numeric
For a specific range of documents, it will separate the documents by the last numeric digit and create a Container for each of them.
0, 1, 2, 3, 4, 5, 6, 7, 8, 9
Example: 123456789 (this entry will be added with all whose last digit is “9”)
- Numeric by Grouping (introduced in Gold Client 8.7.4 Patch 3 – 8.7.2024.05)
For a specific range of documents, it will separate the documents by the last two numeric digits grouping and create a Container for each group.
Group 1: 00-04
Group 2: 05-09
Group 3: 10-14
…
Group 20: 95-99
Example: 1234567899 (this entry will be added with all whose last two digits are between “95-99”)
This diagram shows the Key Split logic for a set of Materials using the Alphanumeric and Alphanumeric by Grouping options (based on the first alphanumeric value):
This diagram shows the Key Split logic for a set of Materials using the Numeric and Numeric by Grouping options (based on the last numeric digit):
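To make the bucketing rules concrete, here is a short, illustrative Python re-implementation of the four options (the function name and container labels are hypothetical; Gold Client's internal implementation is not published):
import string

def key_split_bucket(doc_key: str, split_type: str) -> str:
    # Return the Container a document key falls into for each option.
    if split_type == "alphanumeric":
        # One Container per leading character: A..Z, 0..9, or special chars.
        c = doc_key[0].upper()
        return c if c in string.ascii_uppercase + string.digits else "SPECIAL"
    if split_type == "alphanumeric_grouping":
        # Groups of three: A-C, D-F, ..., Y-Z, 0-2, 3-5, 6-8, 9-Special.
        c = doc_key[0].upper()
        if c in string.ascii_uppercase:
            i = (ord(c) - ord("A")) // 3 * 3
            return f"{chr(ord('A') + i)}-{chr(ord('A') + min(i + 2, 25))}"
        if c in "012345678":
            i = int(c) // 3 * 3
            return f"{i}-{i + 2}"
        return "9-SPECIAL"
    if split_type == "numeric":
        # One Container per last digit: 0..9.
        return doc_key[-1]
    if split_type == "numeric_grouping":
        # Groups of five by the last two digits: 00-04, 05-09, ..., 95-99.
        n = int(doc_key[-2:]) // 5 * 5
        return f"{n:02d}-{n + 4:02d}"
    raise ValueError(split_type)

print(key_split_bucket("B123456789", "alphanumeric"))           # B
print(key_split_bucket("B123456789", "alphanumeric_grouping"))  # A-C
print(key_split_bucket("123456789", "numeric"))                 # 9
print(key_split_bucket("1234567899", "numeric_grouping"))       # 95-99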
Tables using this logic are configured in Configuration > Data Framework > Additional Tools > Key Split Config.
To activate Key Split in Gold Client, provide the Table name, Field name, and Key split type. The Active/Inactive flag allows you to enable or disable the functionality.
It is possible to deactivate the Export Engine functionality that splits table exports into multiple files based on keys in Configuration > Administration > MConfig.
Example – Alphanumeric and Alphanumeric by Grouping:
Splitting the export of MARA entries based on material Number – MATNR.
In the case of exporting MM – MATERIAL MASTER using Parallel Processing, for a specific range of documents, it will separate the documents by the first digit and create Containers for each of them.
During the import process, this logic allows data to be imported into the MARA table in parallel, making the import faster.
Qlik Gold Client Sizing Report for table MARA Export with Key Split inactive
Qlik Gold Client Sizing Report for table MARA Export with Key Split Alphanumeric active
Qlik Gold Client Sizing Report for table MARA Export with Key Split Alphanumeric by Grouping active
Example – Numeric and Numeric by Grouping:
Splitting the export of ACDOCA entries based on document Number - BELNR.
In the case of exporting FI - FINANCE DOCUMENTS using Parallel Processing, for a specific range of documents, it will separate the documents by the last digit(s) and create Containers for each of them.
During the import process, this logic allows data to be imported into the ACDOCA table in parallel, making the import faster.
Qlik Gold Client Sizing Report for table ACDOCA Export with Key Split inactive
Qlik Gold Client Sizing Report for table ACDOCA Export with Key Split Numeric active
Qlik Gold Client Sizing Report for table ACDOCA Export with Key Split Numeric by Grouping active
Please refer to the Qlik Gold Client Configuration and Utilities - User Guide for more information.
Thanks to Qlik SAP Solutions Engineer Hugo Martins for drafting this content!
-
How to collect Snapshots for high memory usage in Replicate
Description: In order to troubleshoot high memory usage, we need to collect memory snapshots for the Replicate service or the task that's having high...