Search Results (7)

TO BE REVIEWED: How do I use the Collibra DQ "Deployment Mode" dialog to override the default Agent configuration?

Question: I want to manually override the default Agent settings when running my Collibra DQ job. How do I work with the Deployment Mode dialog? A: We created this presentation to get you started with using Apache Spark in DQ: https://docs.google.com/presentation/d/1epc_0xgpaaIsatEy15BCbhyEsRjMlK9

Questions · 18 · 0 · 0 · 0

TO BE REVIEWED: What type of AI techniques (e.g., Random Forest) does Collibra DQ use?

Q: What type of AI techniques (e.g., Random Forest) does Collibra DQ use? Answer: It is important to realize that Collibra DQ has many aspects, and they don't all use the same algorithms. Some examples: Outlier detection: we use Interquartile Range (“I

Questions · 21 · 0 · 0 · 0

REVIEWED-FINAL: Technical: DQ Connections / Connectors

Top Links: Supported DQ Connectors [Click Here]. Connectors / Support. Q: Is there a list of all supported DQ connectors? A: https://docs.owl-analytics.com/connecting-to-dbs-in-owl-web/owl-db-connection/supported-drivers A: Please note that some drivers marked as ‘Some Testing’, e.g. MongoDB, are in Tec

Questions · 165 · 6 · 0 · 0

TO BE REVIEWED: How can I configure Collibra DQ to run a large DQ Check Job (containing millions of rows of data)?

Question: How can I configure Collibra DQ to run a large DQ Check Job (containing millions of rows of data)? Answer: Some customers have reported their configurations of Apache Spark, Kubernetes (K8s), Docker, and more to us. For example: Collibra DQ ran on K8s Docker: 150 million rows, with 31 column
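The snippet above is cut off before the full configuration, but a common starting point for jobs of this scale is a back-of-envelope partition-count estimate. The sketch below is an illustrative assumption, not a configuration reported in the thread: it assumes roughly 8 bytes per value and targets ~128 MB partitions, both conventional Spark rules of thumb.

```python
# Back-of-envelope Spark sizing sketch (illustrative assumptions, not values
# from the thread): estimate how many partitions a 150M-row, 31-column
# dataset needs if each partition should hold roughly 128 MB.
import math

ROWS = 150_000_000
COLS = 31
BYTES_PER_CELL = 8                       # assumed average width per value
TARGET_PARTITION_BYTES = 128 * 1024 * 1024

dataset_bytes = ROWS * COLS * BYTES_PER_CELL
partitions = math.ceil(dataset_bytes / TARGET_PARTITION_BYTES)
print(f"~{dataset_bytes / 1e9:.0f} GB -> spark.sql.shuffle.partitions ≈ {partitions}")
```

The resulting number would then feed into settings such as `spark.sql.shuffle.partitions`; actual values depend on data types, compression, and cluster shape.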

Questions · 131 · 3 · 0 · 0

TO BE REVIEWED: DQ Agent configuration with multiple Apache Spark -conf key-value pairs

Q: How do I configure the DQ Agent with multiple Apache Spark -conf key-value pairs? I tried using more than one -conf in the “Free Form (Appended)” text field of the “Edit Agent” configuration dialog, but it only accepts the first -conf some.key=value and ignores the second. Answer: Milind Pan
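The answer above is truncated, but as standard Apache Spark behavior (independent of Collibra DQ's Free Form field), each key=value pair is normally passed to spark-submit behind its own repeated -conf flag. A minimal sketch of that expansion, using hypothetical setting values:

```python
# Sketch of standard Apache Spark submission syntax: every key=value pair
# gets its own repeated -conf flag. The settings below are hypothetical
# examples, not values from the thread.

def build_spark_args(conf_pairs):
    """Expand a dict of Spark settings into repeated -conf arguments."""
    args = []
    for key, value in conf_pairs.items():
        args += ["-conf", f"{key}={value}"]
    return args

free_form = build_spark_args({
    "spark.executor.memory": "4g",
    "spark.sql.shuffle.partitions": "200",
})
print(" ".join(free_form))
# -conf spark.executor.memory=4g -conf spark.sql.shuffle.partitions=200
```

Whether the Agent's Free Form field accepts this repeated-flag form, or requires a different delimiter, is exactly what the (truncated) answer would clarify.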

Questions · 11 · 0 · 0 · 0

Issue with Snowflake Pushdown

Hi All, I am trying to create a Data Quality Check on Snowflake tables. My column names are like "First Name", "Last Name", "High School", etc. (placeholders). When I create a DQ Check with pushdown enabled, Collibra DQ transforms the column names to First_Name, Last_Name, etc., then trying t
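For context on why this breaks: Snowflake folds unquoted identifiers to upper case, so a column created as "First Name" can only be referenced in SQL by double-quoting it exactly; if a tool rewrites it to First_Name, the generated query no longer matches the stored identifier. A small illustrative sketch of the quoting rule (the table and column names are the placeholders from the post, not a real schema):

```python
# Illustration of Snowflake identifier quoting: identifiers containing
# spaces must be double-quoted verbatim in generated SQL, with embedded
# double quotes doubled. Column names are the placeholders from the post.

def quote_identifier(name: str) -> str:
    """Double-quote a Snowflake identifier, escaping embedded quotes."""
    return '"' + name.replace('"', '""') + '"'

columns = ["First Name", "Last Name", "High School"]
select_list = ", ".join(quote_identifier(c) for c in columns)
print(f"SELECT {select_list} FROM students")
# SELECT "First Name", "Last Name", "High School" FROM students
```

Whether the pushdown SQL that Collibra DQ generates can be made to quote identifiers this way is the open question in the post.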

Questions · 40 · 1 · 0 · Closed

TO BE REVIEWED: How can I access my Databricks cluster using Collibra DQ?

Q: How can I access my Databricks Delta Lake cluster via JDBC? Answer from Brian Mearns: Based on the recurring feedback, we decided to prioritize the development. The expected release date is 2022.05 for a default-packaged Databricks JDBC connector. There are 3 primary ways customers desire to use

Questions · 30 · 0 · 0 · 0