Search Results (7)
TO BE REVIEWED: How do I use the Collibra DQ "Deployment Mode" dialog to override the default Agent configuration?
Question: I want to manually override the default Agent settings when running my Collibra DQ job. How do I work with the Deployment Mode dialog? A: We created this presentation to get you started with Apache Spark in DQ: https://docs.google.com/presentation/d/1epc_0xgpaaIsatEy15BCbhyEsRjMlK9
Questions
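The linked presentation has the details; as a rough illustration of the override semantics only (a hypothetical helper, not Collibra DQ's API), per-job Spark settings entered in the Deployment Mode dialog would take precedence over the Agent defaults:

```python
# Hypothetical sketch of override precedence: job-level Spark conf wins
# over Agent defaults. Key names are standard Spark properties; the merge
# function itself is illustrative, not part of Collibra DQ.
def merge_spark_conf(agent_defaults, job_overrides):
    """Return the effective Spark conf: per-job keys win over Agent defaults."""
    effective = dict(agent_defaults)
    effective.update(job_overrides)
    return effective

defaults = {"spark.executor.memory": "4g", "spark.executor.cores": "2"}
overrides = {"spark.executor.memory": "8g"}
print(merge_spark_conf(defaults, overrides))
# {'spark.executor.memory': '8g', 'spark.executor.cores': '2'}
```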
TO BE REVIEWED: What type of AI techniques (e.g. Random Forest) does Collibra DQ use?
Q: What type of AI techniques (e.g. Random Forest) does Collibra DQ use? Answer: It is important to realize that Collibra DQ has many aspects, and they do not all use the same algorithms. I will give you some examples: Outlier detection: We use Interquartile Range (“I
Questions
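The answer above is cut off mid-sentence, but the Interquartile Range method it starts to describe can be sketched in plain Python (illustrative only; the snippet does not show Collibra DQ's actual implementation or parameters):

```python
def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's fences)."""
    xs = sorted(values)

    def quantile(q):
        # Linear interpolation between the two nearest order statistics.
        pos = q * (len(xs) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(xs) - 1)
        return xs[lo] + (xs[hi] - xs[lo]) * (pos - lo)

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < low or v > high]

data = [10, 12, 11, 13, 12, 95]
print(iqr_outliers(data))  # [95] -- far outside the fences
```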
REVIEWED-FINAL: Technical: DQ Connections / Connectors
Top Links Supported DQ Connectors: [Click Here] Connectors / Support Q: Is there a list of all supported DQ connectors? A: https://docs.owl-analytics.com/connecting-to-dbs-in-owl-web/owl-db-connection/supported-drivers A: Please note that some drivers marked as ‘Some Testing’, e.g. MongoDB, are in Tec
Questions
TO BE REVIEWED: How can I configure Collibra DQ to run a large DQ Check Job (containing millions of rows of data)?
Question: How can I configure Collibra DQ to run a large DQ Check Job (containing millions of rows of data)? Answer: Some customers have shared their configurations of Apache Spark, Kubernetes (K8s), Docker, and more with us: Collibra DQ ran on K8s Docker: 150 million rows, with 31 column
Questions
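The customer configurations above are truncated, but as a back-of-the-envelope way to reason about a 150-million-row job, a partition and executor sizing heuristic like the following can help (the bytes-per-row and partition-size figures are assumptions for illustration, not Collibra recommendations):

```python
import math

def estimate_executors(row_count, bytes_per_row=200,
                       partition_mb=128, cores_per_executor=4):
    """Rough Spark sizing heuristic (illustrative only):
    one task per ~128 MB partition, one core per concurrent task."""
    total_mb = row_count * bytes_per_row / (1024 * 1024)
    partitions = max(1, math.ceil(total_mb / partition_mb))
    executors = max(1, math.ceil(partitions / cores_per_executor))
    return partitions, executors

# ~150M rows at an assumed 200 bytes/row
print(estimate_executors(150_000_000))  # (224, 56)
```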
TO BE REVIEWED: DQ Agent configuration with multiple Apache Spark -conf key-value pairs
Q: How do I configure the DQ Agent with multiple Apache Spark -conf key-value pairs? I tried using more than one -conf in the “Free Form (Appended)” text field of the “Edit Agent” configuration dialog, but it only accepts the first -conf some.key=value and ignores the second. Answer: Milind Pan
Questions
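The answer above is cut off, so the recommended workaround is not visible. For orientation only: repeated `-conf key=value` pairs are conventionally independent arguments, as this toy parser shows. If the "Free Form (Appended)" field keeps only the first pair, each pair may need to be entered separately rather than concatenated into one string (an assumption, not a confirmed fix):

```python
def parse_conf_args(args):
    """Collect repeated `-conf key=value` pairs into a dict.
    Illustrative parser, not Collibra DQ or spark-submit internals."""
    conf = {}
    it = iter(args)
    for tok in it:
        if tok == "-conf":
            key, _, value = next(it).partition("=")
            conf[key] = value
    return conf

args = ["-conf", "spark.executor.memory=8g",
        "-conf", "spark.sql.shuffle.partitions=200"]
print(parse_conf_args(args))
# {'spark.executor.memory': '8g', 'spark.sql.shuffle.partitions': '200'}
```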
Issue with Snowflake Pushdown
Hi All, I am trying to create a Data Quality Check on Snowflake tables. My column names are like "First Name", "Last Name", "High School", etc. (placeholders). When I create a DQ Check with pushdown enabled, Collibra DQ transforms the column names to First_Name, Last_Name, etc., then trying t
Questions • Use cases • Closed
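For context on the issue: Snowflake folds unquoted identifiers to uppercase and does not allow spaces in them, so a column like "First Name" must be double-quoted in generated SQL rather than rewritten with underscores. A minimal sketch of the quoting rule (a hypothetical helper, not the Collibra DQ pushdown code):

```python
def quote_identifier(name):
    """Double-quote a Snowflake identifier, escaping embedded double quotes
    by doubling them, so names with spaces survive as-is."""
    return '"' + name.replace('"', '""') + '"'

cols = ["First Name", "Last Name", "High School"]
select = ("SELECT " + ", ".join(quote_identifier(c) for c in cols)
          + ' FROM "MY_TABLE"')
print(select)
# SELECT "First Name", "Last Name", "High School" FROM "MY_TABLE"
```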
TO BE REVIEWED: How can I access my Databricks cluster using Collibra DQ?
Q: How can I access my Databricks Delta Lake cluster via JDBC? Answer from Brian Mearns: Based on recurring feedback, we decided to prioritize this development. The expected release date is 2022.05 for a default-packaged Databricks JDBC connector. There are 3 primary ways customers desire to use
Questions
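The answer above predates the packaged connector, but for orientation, a Databricks JDBC URL generally follows the shape below. The host, HTTP path, and token are placeholders, and the exact scheme and parameters depend on your driver version, so check the driver documentation before relying on this:

```python
def databricks_jdbc_url(host, http_path, token):
    """Assemble a Databricks JDBC URL from its usual parts.
    All argument values below are placeholders, not real credentials."""
    return (
        f"jdbc:databricks://{host}:443/default;"
        f"transportMode=http;ssl=1;httpPath={http_path};"
        f"AuthMech=3;UID=token;PWD={token}"
    )

url = databricks_jdbc_url("adb-123.azuredatabricks.net",
                          "/sql/1.0/warehouses/abc", "dapiXXXX")
print(url)
```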