
Search Results (11)

TO BE REVIEWED: DQ Agent configuration with multiple Apache Spark -conf key-value pairs

Q: How do I configure the DQ Agent with multiple Apache Spark -conf key-value pairs? I tried using more than one -conf in the “Free Form (Appended)” text field of the “Edit Agent” configuration dialog, but it accepts only the first -conf some.key=value and ignores the second. Answer: Milind Pan
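For background on the two shapes this setting usually takes: plain spark-submit accepts one conf flag per key=value pair, while some launchers instead expect a single flag carrying comma-separated pairs. The sketch below builds both forms; the keys and values are examples only, not Collibra DQ defaults.

```python
# Hypothetical sketch: two common ways to pass multiple Spark
# configuration pairs. Keys/values below are example assumptions.
conf_pairs = {
    "spark.executor.memory": "4g",
    "spark.sql.shuffle.partitions": "200",
}

# Form 1: one -conf flag per pair (how spark-submit's --conf works).
repeated_flags = []
for key, value in conf_pairs.items():
    repeated_flags += ["-conf", f"{key}={value}"]

# Form 2: a single -conf flag with comma-separated pairs.
single_flag = "-conf " + ",".join(f"{k}={v}" for k, v in conf_pairs.items())

print(repeated_flags)
print(single_flag)
```

Which form the “Free Form (Appended)” field expects is exactly what the question is asking; the snippet only illustrates the difference between the two conventions.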


TO BE REVIEWED: How do I use the Collibra DQ "Deployment Mode" dialog to override the default Agent configuration?

Question: I want to manually override the default Agent settings when running my Collibra DQ job. How do I work with the Deployment Mode dialog? A: We created this presentation to get you started with using Apache Spark in DQ: https://docs.google.com/presentation/d/1epc_0xgpaaIsatEy15BCbhyEsRjMlK9


TO BE REVIEWED: What types of AI techniques (e.g., Random Forest) does Collibra DQ use?

Q: What types of AI techniques (e.g., Random Forest) does Collibra DQ use? Answer: It is important to realize that Collibra DQ has many aspects, and they don’t all use the same algorithms. I will give you some examples: Outlier detection: We use Interquartile Range (“I
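The answer mentions Interquartile Range (IQR) for outlier detection. As a quick illustration of the textbook technique (not Collibra DQ's actual implementation), IQR flags values outside [Q1 - k·IQR, Q3 + k·IQR], conventionally with k = 1.5:

```python
# Illustrative IQR outlier detection; this is the standard method the
# answer names, not Collibra DQ's internal code.
def iqr_outliers(values, k=1.5):
    """Return values falling outside [Q1 - k*IQR, Q3 + k*IQR]."""
    data = sorted(values)
    n = len(data)

    def quantile(q):
        # Linear interpolation between closest ranks.
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return data[lo] + (pos - lo) * (data[hi] - data[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lower or v > upper]

print(iqr_outliers([10, 12, 11, 13, 12, 11, 95]))  # → [95]
```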


Rule details export shows a difference with respect to summary details

Has anyone experienced difficulty interpreting the export of rule details in a data quality job? The export shows a much larger difference with respect to the summary details in the findings of a data quality job that uses a pushdown connection.


Apache Spark in Collibra DQ with Py4J (needs Spark v. 3.0.1 and the correct Scala version)

Python and Py4J code: owl.owlCheck() fails with an exception: File "/opt/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1305, in call File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 128, in deco File "/opt/spark/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py",


Issue with Snowflake Pushdown

Hi All, I am trying to create a Data Quality Check on Snowflake tables. My column names are like "First Name", "Last Name", "High School", etc. (placeholder). When I try to create a DQ Check with pushdown enabled, Collibra DQ transforms the column names to First_Name, Last_Name, etc., then trying t
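Background that may explain the renaming: Snowflake folds unquoted identifiers to upper case, and only double-quoted identifiers preserve case and embedded spaces. The sketch below shows the quoting a generated query would need to keep such column names intact; it is illustrative SQL construction, not what Collibra DQ actually emits.

```python
# Illustrative only: quoting Snowflake identifiers that contain spaces.
# Unquoted names are case-folded; double-quoted names are kept verbatim.
def quote_identifier(name):
    # Escape embedded double quotes per Snowflake's identifier rules.
    return '"' + name.replace('"', '""') + '"'

columns = ["First Name", "Last Name", "High School"]
select_list = ", ".join(quote_identifier(c) for c in columns)
query = f"SELECT {select_list} FROM my_table"  # my_table is a placeholder
print(query)
# → SELECT "First Name", "Last Name", "High School" FROM my_table
```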


TO BE REVIEWED: How can I access my Databricks cluster using Collibra DQ?

Q: How can I access my Databricks Delta Lake cluster via JDBC? Answer from Brian Mearns: Based on recurring feedback, we decided to prioritize this development. The expected release is 2022.05 for a default-packaged Databricks JDBC connector. There are 3 primary ways customers desire to use


REVIEWED-FINAL: Technical: DQ Connections / Connectors

Top Links Supported DQ Connectors: [Click Here] Connectors / Support Q: Is there a list of all supported DQ connectors? A: https://docs.owl-analytics.com/connecting-to-dbs-in-owl-web/owl-db-connection/supported-drivers A: Please note that some drivers marked as ‘Some Testing’, e.g. MongoDB, are in Tec


job.catalog.JdbcIngestionJob error on Databricks JDBC

Hello all, I tried to connect to Azure Databricks using the Collibra JDBC driver. I set up the connection via the Catalog successfully with the driver downloaded from https://marketplace.collibra.com/listings/jdbc-driver-for-databricks/ but I got an error. I guess it might be because of the single quote


TO BE REVIEWED: How can I configure Collibra DQ to run a large DQ Check Job (containing millions of rows of data)?

Question: How can I configure Collibra DQ to run a large DQ Check job (containing millions of rows of data)? Answer: Some customers have reported their configurations of Apache Spark, Kubernetes (K8s), Docker, and more to us: Collibra DQ ran on K8s Docker: 150 million rows, with 31 column
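For jobs at this scale, a common back-of-envelope starting point (a general Spark rule of thumb, not Collibra guidance) is to size the partition count from the estimated data volume at roughly 128 MB per partition. The per-row byte estimate below is an assumption for illustration:

```python
# Rough sizing sketch, assuming the common ~128 MB-per-partition rule
# of thumb for Spark; avg_row_bytes is an assumed estimate.
def estimated_partitions(row_count, avg_row_bytes, target_partition_mb=128):
    total_mb = row_count * avg_row_bytes / (1024 * 1024)
    return max(1, round(total_mb / target_partition_mb))

# 150 million rows at an assumed ~100 bytes per row:
print(estimated_partitions(150_000_000, 100))  # → 112
```

The result would then feed a setting such as spark.sql.shuffle.partitions, tuned from there against actual executor memory and core counts.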


Behavior rules - upper bound and lower bound calculation

Hello Collibra friends, can somebody share how the row-count behavior check adaptively generates the lower bound and upper bound and predicts a range after the learning phase?
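While the question about Collibra DQ's exact algorithm stands, one common way adaptive bounds like these are derived is mean ± z · standard deviation over a learning window of past observations. The sketch below shows that generic approach only; Collibra DQ's behavioral model may differ.

```python
# Illustrative adaptive-bounds sketch (mean +/- z * stddev over a
# learning window). Not Collibra DQ's actual behavior algorithm.
def adaptive_bounds(history, z=3.0):
    """Predict a [lower, upper] range from past row counts."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / n
    stddev = variance ** 0.5
    return mean - z * stddev, mean + z * stddev

lower, upper = adaptive_bounds([1000, 1020, 980, 1010, 990])
print(lower, upper)  # roughly 957.6 and 1042.4
```

After the learning phase, a new row count falling outside the predicted range would be flagged as a behavioral finding.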
