Earn an industry-recognized credential from AWS that validates your expertise in AWS data lakes and analytics services. Build credibility and confidence by highlighting your ability to design, build, secure, and maintain efficient, cost-effective analytics solutions on AWS. Show you have breadth and depth in delivering insight from data.
Abilities Validated by the Certification
Define AWS data analytics services and understand how they integrate with each other
Explain how AWS data analytics services fit in the data life cycle of collection, storage, processing, and visualization
Recommended Knowledge and Experience
At least 5 years of experience with data analytics technologies
At least 2 years of hands-on experience working with AWS
Experience and expertise working with AWS services to design, build, secure, and maintain analytics solutions
Prepare for Your Exam
There is no better preparation than hands-on experience. Review the exam guide for information about the competencies assessed on this certification exam. You can also review the sample questions for format examples or take a practice exam.
Looking for more resources to help build your data analytics expertise? Explore options including an AWS Data Analytics Learning Path, an exam readiness digital course, suggested AWS whitepapers and FAQs, and more.
Introduction
The AWS Certified Data Analytics – Specialty (DAS-C01) examination is intended for individuals who perform in a data analytics-focused role. This exam validates an examinee’s comprehensive understanding of using AWS services to design, build, secure, and maintain analytics solutions that provide insight from data.
It validates an examinee’s ability to:
Define AWS data analytics services and understand how they integrate with each other.
Explain how AWS data analytics services fit in the data lifecycle of collection, storage, processing, and visualization.
Recommended AWS Knowledge
A minimum of 5 years of experience with common data analytics technologies
At least 2 years of hands-on experience working on AWS
Experience and expertise working with AWS services to design, build, secure, and maintain analytics solutions
Response Types
There are two types of questions on the examination:
Multiple choice: Has one correct response and three incorrect responses (distractors).
Multiple response: Has two or more correct responses out of five or more options.
Select one or more responses that best complete the statement or answer the question. Distractors, or incorrect answers, are response options that an examinee with incomplete knowledge or skill would likely choose. However, they are generally plausible responses that fit in the content area defined by the test objective.
Unanswered questions are scored as incorrect; there is no penalty for guessing.
Unscored Content
Your examination may include unscored items that are placed on the test to gather statistical information. These items are not identified on the form and do not affect your score.
Exam Results
The AWS Certified Data Analytics – Specialty (DAS-C01) examination is a pass or fail exam. The examination is scored against a minimum standard established by AWS professionals who are guided by certification industry best practices and guidelines.
Your results for the examination are reported as a score from 100–1,000, with a minimum passing score of 750. Your score shows how you performed on the examination as a whole and whether or not you passed. Scaled scoring models are used to equate scores across multiple exam forms that may have slightly different difficulty levels.
Your score report contains a table of classifications of your performance at each section level. This information is designed to provide general feedback concerning your examination performance. The examination uses a compensatory scoring model, which means that you do not need to “pass” the individual sections, only the overall examination. Each section of the examination has a specific weighting, so some sections have more questions than others. The table contains general information, highlighting your strengths and weaknesses. Exercise caution when interpreting section-level feedback.
Content Outline
This exam guide includes weightings, test domains, and objectives only. It is not a comprehensive listing of the content on this examination. The table below lists the main content domains and their weightings.
Domain 1: Collection 18%
Domain 2: Storage and Data Management 22%
Domain 3: Processing 24%
Domain 4: Analysis and Visualization 18%
Domain 5: Security 18%
TOTAL 100%
Domain 1: Collection
1.1 Determine the operational characteristics of the collection system
1.2 Select a collection system that handles the frequency, volume, and source of data
1.3 Select a collection system that addresses the key properties of data, such as order, format, and compression
Domain 2: Storage and Data Management
2.1 Determine the operational characteristics of a storage solution for analytics
2.2 Determine data access and retrieval patterns
2.3 Select an appropriate data layout, schema, structure, and format
2.4 Define a data lifecycle based on usage patterns and business requirements
2.5 Determine an appropriate system for cataloging data and managing metadata
Domain 3: Processing
3.1 Determine appropriate data processing solution requirements
3.2 Design a solution for transforming and preparing data for analysis
3.3 Automate and operationalize a data processing solution
Domain 4: Analysis and Visualization
4.1 Determine the operational characteristics of an analysis and visualization solution
4.2 Select the appropriate data analysis solution for a given scenario
4.3 Select the appropriate data visualization solution for a given scenario
Domain 5: Security
5.1 Select appropriate authentication and authorization mechanisms
5.2 Apply data protection and encryption techniques
5.3 Apply data governance and compliance controls
QUESTION 1
A financial services company needs to aggregate daily stock trade data from the exchanges into a data store.
The company requires that data be streamed directly into the data store, but also occasionally allows data to
be modified using SQL. The solution should support complex analytic queries running with minimal latency.
The solution must provide a business intelligence dashboard that enables viewing of the top contributors to
anomalies in stock prices.
Which solution meets the company’s requirements?
A. Use Amazon Kinesis Data Firehose to stream data to Amazon S3. Use Amazon Athena as a data source
for Amazon QuickSight to create a business intelligence dashboard.
B. Use Amazon Kinesis Data Streams to stream data to Amazon Redshift. Use Amazon Redshift as a data
source for Amazon QuickSight to create a business intelligence dashboard.
C. Use Amazon Kinesis Data Firehose to stream data to Amazon Redshift. Use Amazon Redshift as a data
source for Amazon QuickSight to create a business intelligence dashboard.
D. Use Amazon Kinesis Data Streams to stream data to Amazon S3. Use Amazon Athena as a data source
for Amazon QuickSight to create a business intelligence dashboard.
Correct Answer: C
Kinesis Data Firehose can deliver streaming data directly into Amazon Redshift, and Redshift supports both SQL-based modification of data and low-latency complex analytic queries; data in S3 queried through Athena cannot be modified with SQL.
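For context, here is a minimal boto3 sketch of the winning pattern: a Firehose delivery stream with a Redshift destination. The stream name, Redshift connection details, IAM role ARNs, table, and staging bucket are hypothetical placeholders, not values from the question.

import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

# Create a Firehose delivery stream that loads trade records into Redshift.
# Firehose stages records in an intermediate S3 bucket, then issues a COPY
# command into the target table.
firehose.create_delivery_stream(
    DeliveryStreamName="stock-trades",  # hypothetical stream name
    DeliveryStreamType="DirectPut",
    RedshiftDestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-redshift",  # placeholder
        "ClusterJDBCURL": (
            "jdbc:redshift://analytics.example.us-east-1"
            ".redshift.amazonaws.com:5439/trades"  # placeholder cluster endpoint
        ),
        "CopyCommand": {"DataTableName": "daily_trades"},  # hypothetical table
        "Username": "firehose_user",
        "Password": "REPLACE_ME",
        "S3Configuration": {  # staging bucket used for the COPY into Redshift
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-redshift",
            "BucketARN": "arn:aws:s3:::trade-staging-bucket",
        },
    },
)

Producers then write records with the Firehose PutRecord API, and QuickSight is pointed at the Redshift cluster as its data source.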
QUESTION 2
A financial company hosts a data lake in Amazon S3 and a data warehouse on an Amazon Redshift cluster.
The company uses Amazon QuickSight to build dashboards and wants to secure access from its on-premises
Active Directory to Amazon QuickSight.
How should the data be secured?
A. Use an Active Directory connector and single sign-on (SSO) in a corporate network environment.
B. Use a VPC endpoint to connect to Amazon S3 from Amazon QuickSight and an IAM role to authenticate Amazon Redshift.
C. Establish a secure connection by creating an S3 endpoint to connect Amazon QuickSight and a VPC endpoint to connect to Amazon Redshift.
D. Place Amazon QuickSight and Amazon Redshift in the security group and use an Amazon S3 endpoint to connect Amazon QuickSight to Amazon S3.
Correct Answer: A
Amazon QuickSight Enterprise edition can authenticate users against an on-premises Active Directory through an AD Connector with single sign-on (SSO); the other options secure network paths to Amazon S3 and Amazon Redshift but do not address Active Directory authentication.
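To illustrate the AD Connector piece, the hedged boto3 sketch below registers an on-premises directory with AWS Directory Service; the domain name, service account, DNS IPs, VPC, and subnets are hypothetical. QuickSight Enterprise edition can then use this connector for Active Directory authentication and SSO.

import boto3

ds = boto3.client("ds", region_name="us-east-1")

# Create an AD Connector that proxies authentication requests to the
# on-premises Active Directory (no directory data is stored in AWS).
response = ds.connect_directory(
    Name="corp.example.com",   # hypothetical on-premises domain
    Password="REPLACE_ME",     # password for the service account below
    Size="Small",
    ConnectSettings={
        "VpcId": "vpc-0123456789abcdef0",              # placeholder VPC
        "SubnetIds": ["subnet-aaaa1111", "subnet-bbbb2222"],
        "CustomerDnsIps": ["10.0.0.10", "10.0.0.11"],  # on-prem DNS servers
        "CustomerUserName": "quicksight-svc",          # hypothetical service account
    },
)
print(response["DirectoryId"])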
QUESTION 3
A real estate company has a mission-critical application using Apache HBase in Amazon EMR. Amazon EMR
is configured with a single master node. The company has over 5 TB of data stored on a Hadoop Distributed
File System (HDFS). The company wants a cost-effective solution to make its HBase data highly available.
Which architectural pattern meets the company’s requirements?
A. Use Spot Instances for core and task nodes and a Reserved Instance for the EMR master node. Configure
the EMR cluster with multiple master nodes. Schedule automated snapshots using Amazon EventBridge.
B. Store the data on an EMR File System (EMRFS) instead of HDFS. Enable EMRFS consistent view. Create
an EMR HBase cluster with multiple master nodes. Point the HBase root directory to an Amazon S3 bucket.
C. Store the data on an EMR File System (EMRFS) instead of HDFS and enable EMRFS consistent view.
Run two separate EMR clusters in two different Availability Zones. Point both clusters to the same HBase
root directory in the same Amazon S3 bucket.
D. Store the data on an EMR File System (EMRFS) instead of HDFS and enable EMRFS consistent view.
Create a primary EMR HBase cluster with multiple master nodes. Create a secondary EMR HBase read-replica
cluster in a separate Availability Zone. Point both clusters to the same HBase root directory in the
same Amazon S3 bucket.
Correct Answer: D
Storing the HBase data on Amazon S3 through EMRFS decouples storage from the cluster, and a read-replica cluster in a second Availability Zone provides high availability; two independent read-write clusters pointing at the same HBase root directory (option C) is not a supported configuration and can corrupt the data.
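To make the pattern concrete, here is a hedged boto3 sketch of launching the secondary read-replica cluster with its HBase root directory on S3. The cluster name, bucket, subnet, and instance sizing are hypothetical; the primary cluster would use the same configuration without the read-replica flag (and with multiple master nodes, per option D).

import boto3

emr = boto3.client("emr", region_name="us-east-1")

# Launch an EMR HBase cluster that keeps its data on S3 via EMRFS and runs
# as a read replica of a primary cluster sharing the same HBase root directory.
emr.run_job_flow(
    Name="hbase-read-replica",  # hypothetical cluster name
    ReleaseLabel="emr-5.30.1",
    Applications=[{"Name": "HBase"}],
    Configurations=[
        {
            "Classification": "hbase",
            "Properties": {
                "hbase.emr.storageMode": "s3",            # HBase on S3 (EMRFS)
                "hbase.emr.readreplica.enabled": "true",  # secondary, read-only
            },
        },
        {
            "Classification": "hbase-site",
            "Properties": {"hbase.rootdir": "s3://example-hbase-bucket/hbase"},
        },
    ],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "Ec2SubnetId": "subnet-bbbb2222",  # a different AZ from the primary cluster
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)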