Databricks Certified Professional Data Engineer Exam Questions With 100% Correct Answers.
Drop the customers database and its associated tables and data; all of the tables inside the database are managed tables. Which of the following SQL commands will accomplish this?
A. DROP DATABASE customers FORCE
B. DROP DATABASE customers CASCADE
C. DROP DATABASE customers INCLUDE
D. All the tables must be dropped first before dropping the database
E. DROP DELTA DATABASE customers
Answer: B

What is the underlying technology that makes Auto Loader work?
A. Loader
B. Delta Live Tables
C. Structured Streaming
D. DataFrames
E. Live DataFrames
Answer: C

Which of the following functions can be used to convert a JSON string to a STRUCT data type?
A. TO_STRUCT (json value)
B. FROM_JSON (json value)
C. FROM_JSON (json value, schema of json)
D. CONVERT (json value, schema of json)
E. CAST (json value AS STRUCT)
Answer: C

Which of the following locations in the Databricks product architecture hosts jobs/pipelines and queries?
A. Data plane
B. Control plane
C. Databricks Filesystem
D. JDBC data source
E. Databricks web application
Answer: B

Which of the following types of tasks cannot be set up through a job?
A. Notebook
B. Delta Live Tables pipeline
C. Spark Submit
D. Python
E. Databricks SQL dashboard refresh
Answer: E

While investigating a performance issue, you realize that a given table has too many small files. Which command will you run to fix this issue?
A. COMPACT table_name
B. VACUUM table_name
C. MERGE table_name
D. SHRINK table_name
E. OPTIMIZE table_name
Answer: E

You are asked to debug a Databricks job that is taking too long to run on Sundays. What steps will you take to identify the step that is taking longer to run?
A. Notebook activity of a job run is only visible when using an all-purpose cluster.
B. Under the Workflows UI, select the job you want to monitor, select the run, and view the notebook activity.
C. Enable debug mode in Jobs to see the output activity of a job; the output should be available to view.
D. Once a job is launched, you cannot access the job's notebook activity.
E. Use the compute's Spark UI to monitor the job activity.
Answer: B

You are asked to debug a Databricks job that is taking too long to run on Sundays. What steps will you take to identify the step that is taking longer to run?
A. Notebook activity of a job run is only visible when using an all-purpose cluster.
B. Under the Workflows UI, select the job you want to monitor, select the run, and view the notebook activity.
C. Enable debug mode in Jobs to see the output activity of a job; the output should be available to view.
D. Once a job is launched, you cannot access the job's notebook activity.
E. Use the compute's Spark UI to monitor the job activity.
Answer: E

You are currently storing data you received from different customer surveys; this data is highly unstructured and changes over time. Why is a Lakehouse a better choice than a data warehouse?
A. A Lakehouse supports schema enforcement and evolution; traditional data warehouses lack schema evolution.
B. A Lakehouse supports SQL
C. A Lakehouse supports ACID
D. A Lakehouse enforces data integrity
E. A Lakehouse supports primary and foreign keys like a data warehouse
Answer: A
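The commands behind several of these answers can be tried directly in a Databricks notebook or SQL editor. This is a minimal Spark SQL sketch, assuming a hypothetical `customers` database and a hypothetical Delta table named `events`; it is illustrative, not a definitive implementation:

```sql
-- Drop a database together with all of its managed tables and data.
-- Without CASCADE, DROP DATABASE fails if the database is not empty.
DROP DATABASE customers CASCADE;

-- Convert a JSON string to a STRUCT: from_json requires both the JSON
-- value and a schema (here given as a DDL-formatted string).
SELECT from_json('{"id": 1, "name": "Ava"}', 'id INT, name STRING') AS parsed;

-- Compact many small files in a Delta table into fewer, larger files.
OPTIMIZE events;
```

The `from_json` call returns a STRUCT whose fields (`parsed.id`, `parsed.name`) can then be queried with dot notation.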
Written for
- Institution
- Databricks Certified Professional Data Engineer
- Course
- Databricks Certified Professional Data Engineer
Document information
- Uploaded on
- 16 July 2024
- Number of pages
- 2
- Written in
- 2023/2024
- Type
- Exam (worked answers)
- Contains
- Questions and answers
Topics
- databricks certified professional data engineer ex