SAA C03 Exam | quizzes with 100% Correct Verified Solutions

Exam (elaborations) | Questions & answers | SAA C03 | 40 pages | October 3, 2024 | 2024/2025 | Seller: Examsplug

An event in CloudTrail is the record of an activity in an AWS account. What are the two types of events that can be logged in CloudTrail? (choose 2)
Platform Events which are also known as hardware level operations

Control Events which are also known as data plane operations

Management Events which are also known as control plane operations

System Events which are also known as instance level operations

Data Events which are also known as data plane operations - ✔✔Trails can be configured to log data events and management events.
Data events: These events provide insight into the resource operations performed on or within a resource. These are also known as data plane operations.
Management events: These events provide insight into management operations that are performed on resources in your AWS account. These are also known as control plane operations. Management events can also include non-API events that occur in your account.
CORRECT: "Data Events which are also known as data plane operations" is a correct answer.
CORRECT: "Management Events which are also known as control plane operations" is also a correct answer.
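The management/data event split shows up directly in the shape of a trail's event selectors. Below is a minimal sketch of the request body a trail could use to log both management events and S3 data events; the trail name and bucket ARN are hypothetical placeholders, and with boto3 this dictionary would be passed to the CloudTrail `put_event_selectors` call.

```python
# Sketch of a CloudTrail PutEventSelectors request body. Management events
# (control plane) and data events (data plane) are enabled together.
# TrailName and the S3 ARN are illustrative placeholders.
event_selectors_request = {
    "TrailName": "example-trail",
    "EventSelectors": [
        {
            "ReadWriteType": "All",
            "IncludeManagementEvents": True,  # control plane operations
            "DataResources": [
                {
                    # data plane operations on objects in one example bucket
                    "Type": "AWS::S3::Object",
                    "Values": ["arn:aws:s3:::example-bucket/"],
                }
            ],
        }
    ],
}
# With boto3: boto3.client("cloudtrail").put_event_selectors(**event_selectors_request)
```

The `DataResources` entry is what opts the trail into data events; management events are logged by default for most trails.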


A manager is concerned that the default service limits may soon be reached for several AWS services. Which AWS tool can a Solutions Architect use to display current usage and limits?
AWS Dashboard
AWS Systems Manager
AWS Trusted Advisor

Amazon CloudWatch - ✔✔Trusted Advisor is an online resource to help you reduce cost, increase performance, and improve security by optimizing your AWS environment. Trusted Advisor provides real-time guidance to help you provision your resources following AWS best practices. AWS Trusted Advisor offers a Service Limits check (in the Performance category) that displays your usage and limits for some aspects of some services.
CORRECT: "AWS Trusted Advisor" is the correct answer.
INCORRECT: "AWS Systems Manager" is incorrect. AWS Systems Manager gives you visibility and control of your infrastructure on AWS. Systems Manager provides a unified user interface so you can view operational data from multiple AWS services and allows you to automate operational tasks across your AWS resources.
INCORRECT: "AWS Dashboard" is incorrect. There is no service known as "AWS Dashboard".
INCORRECT: "Amazon CloudWatch" is incorrect. Amazon CloudWatch is used for performance monitoring, not displaying usage limits.
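As a sketch of how the Service Limits check could be located programmatically: the Support API's `describe_trusted_advisor_checks` call (which requires a Business or Enterprise support plan) returns a list of checks that can be filtered by name. The list below is an illustrative stand-in for that response, not real check data; the IDs are made up.

```python
# Stand-in for the "checks" list returned by
# boto3.client("support").describe_trusted_advisor_checks(language="en")["checks"].
# IDs and entries are illustrative placeholders.
checks = [
    {"id": "check-id-1", "name": "Service Limits", "category": "performance"},
    {"id": "check-id-2", "name": "Low Utilization Amazon EC2 Instances",
     "category": "cost_optimizing"},
]

# Filter for the Service Limits check, which reports usage against limits.
service_limit_checks = [c for c in checks if c["name"] == "Service Limits"]
```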


A Solutions Architect is creating a solution for an application that must be deployed on Amazon EC2 hosts that are dedicated to the client. Instance placement must be automatic and billing should be per instance.
Which type of EC2 deployment model should be used?
Dedicated Instance
Reserved Instance
Dedicated Host

Cluster Placement Group - ✔✔Dedicated Instances are Amazon EC2 instances that run in a VPC on hardware that's dedicated to a single customer. Your Dedicated Instances are physically isolated at the host hardware level from instances that belong to other AWS accounts. Dedicated Instances allow automatic instance placement and billing is per instance.
CORRECT: "Dedicated Instance" is the correct answer.
INCORRECT: "Reserved Instance" is incorrect. Reserved Instances are a method of reducing cost by committing to a fixed contract term of 1 or 3 years.
INCORRECT: "Dedicated Host" is incorrect. An Amazon EC2 Dedicated Host is a physical server with EC2 instance capacity fully dedicated to your use. Dedicated Hosts can help you address compliance requirements and reduce costs by allowing you to use your existing server-bound software licenses. With Dedicated Hosts, billing is on a per-host basis (not per instance).
INCORRECT: "Cluster Placement Group" is incorrect. A Cluster Placement Group determines how instances are placed on underlying hardware to enable low-latency connectivity.
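The Dedicated Instance model is selected at launch time via the placement tenancy. A minimal sketch of the launch parameters follows; the AMI ID and instance type are illustrative placeholders, and with boto3 the dictionary would be passed to the EC2 `run_instances` call.

```python
# Sketch of launch parameters for a Dedicated Instance: dedicated hardware,
# automatic placement, per-instance billing. Values are placeholders.
run_instances_params = {
    "ImageId": "ami-0123456789abcdef0",   # hypothetical AMI
    "InstanceType": "m5.large",
    "MinCount": 1,
    "MaxCount": 1,
    # "dedicated" = Dedicated Instance (per-instance billing);
    # "host" would instead target a Dedicated Host (per-host billing).
    "Placement": {"Tenancy": "dedicated"},
}
# With boto3: boto3.client("ec2").run_instances(**run_instances_params)
```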


A company has an eCommerce application that runs from multiple AWS Regions. Each region has a separate database running on Amazon EC2 instances. The company plans to consolidate the data to a columnar database and run analytics queries. Which approach should the company take?

Launch Amazon Kinesis Data Streams producers to load data into a Kinesis Data stream. Use Kinesis Data Analytics to analyze the data
Run an AWS Batch job to copy and process the data into a columnar Amazon RDS database. Use Amazon Athena to analyze the data
Create an AWS Lambda function that copies the data onto Amazon S3. Use Amazon S3 Select to query the data
Use the COPY command to load data into an Amazon Redshift data warehouse and run the analytics queries there - ✔✔Amazon Redshift is an enterprise-level, petabyte-scale, fully managed data warehousing service. It uses columnar storage to improve the performance of complex queries. You can use the COPY command to load data in parallel from one or more remote hosts, such as Amazon EC2 instances or other computers. COPY connects to the remote hosts using SSH and executes commands on the remote hosts to generate text output.
CORRECT: "Use the COPY command to load data into an Amazon Redshift data warehouse and run the analytics queries there" is the correct answer.
INCORRECT: "Run an AWS Batch job to copy and process the data into a columnar Amazon RDS database. Use Amazon Athena to analyze the data" is incorrect. AWS Batch is used for running batch computing jobs across a fleet of EC2 instances. You cannot create a "columnar Amazon RDS database" as RDS is optimized for transactional workloads. Athena is used to analyze data on S3.
INCORRECT: "Launch Amazon Kinesis Data Streams producers to load data into a Kinesis Data stream. Use Kinesis Data Analytics to analyze the data" is incorrect. Kinesis is a real-time streaming data service. It is not a columnar database, so it is unsuitable for this use case.
INCORRECT: "Create an AWS Lambda function that copies the data onto Amazon S3. Use Amazon S3 Select to query the data" is incorrect. S3 is not a columnar database and S3 Select does not run analytics queries; it simply selects data from an object to retrieve.
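A minimal sketch of what a Redshift COPY statement looks like, here loading pipe-delimited files from S3 (COPY also supports loading from remote hosts over SSH, as the explanation above notes). The table name, bucket, and IAM role are hypothetical placeholders.

```python
# Illustrative Redshift COPY statement, held as a string so it can be sent
# through any SQL client. Table, bucket, role, and region are placeholders.
copy_sql = """
COPY sales_analytics
FROM 's3://example-bucket/sales/'
IAM_ROLE 'arn:aws:iam::123456789012:role/ExampleRedshiftCopyRole'
DELIMITER '|'
REGION 'us-east-1';
"""
```

Redshift parallelizes the load across its compute nodes, which is why COPY is preferred over row-by-row INSERTs for bulk loads.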


An application is running on EC2 instances in a private subnet of an Amazon VPC. A Solutions Architect would like to connect the application to Amazon API Gateway. For security reasons, it is necessary to ensure that no traffic traverses the Internet and to ensure all traffic uses private IP addresses only.
How can this be achieved?
Create a public VIF on a Direct Connect connection
Create a private API using an interface VPC endpoint
Add the API gateway to the subnet the EC2 instances are located in

Create a NAT gateway - ✔✔An interface endpoint uses AWS PrivateLink and is an elastic network interface (ENI) with a private IP address that serves as an entry point for traffic destined to a supported service. Using PrivateLink you can connect your VPC to supported AWS services, services hosted by other AWS accounts (VPC endpoint services), and supported AWS Marketplace partner services.
CORRECT: "Create a private API using an interface VPC endpoint" is the correct answer.
INCORRECT: "Create a NAT gateway" is incorrect. NAT gateways are used to provide Internet access for EC2 instances in private subnets, so they are of no use in this solution.
INCORRECT: "Create a public VIF on a Direct Connect connection" is incorrect. You do not need to implement Direct Connect and create a public VIF. Public IP addresses are used in public VIFs and the question requests that only private addresses are used.
INCORRECT: "Add the API gateway to the subnet the EC2 instances are located in" is incorrect. You cannot add API Gateway to the subnet the EC2 instances are in; it is a public service with a public endpoint.
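A minimal sketch of the interface endpoint side of this setup follows; the VPC, subnet, and region values are hypothetical placeholders, and with boto3 the dictionary would be passed to the EC2 `create_vpc_endpoint` call. The API itself would separately be created in API Gateway with the PRIVATE endpoint type.

```python
# Sketch of parameters for an interface VPC endpoint that keeps API Gateway
# traffic on private IPs via AWS PrivateLink. IDs and region are placeholders.
endpoint_params = {
    "VpcEndpointType": "Interface",
    "VpcId": "vpc-0123456789abcdef0",
    # execute-api is the API Gateway service; region is illustrative
    "ServiceName": "com.amazonaws.us-east-1.execute-api",
    "SubnetIds": ["subnet-0123456789abcdef0"],  # subnet hosting the ENI
    "PrivateDnsEnabled": True,  # resolve the API's DNS name to private IPs
}
# With boto3: boto3.client("ec2").create_vpc_endpoint(**endpoint_params)
```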


An application running in an on-premise data center writes data to a MySQL database. A Solutions Architect is re-architecting the application and plans to move the database layer into the AWS cloud on Amazon RDS. The application layer will run in the on-premise data center.
What must be done to connect the application to the RDS database via the Internet? (choose 2)
Select a public IP within the DB subnet group to assign to the RDS instance
Create a security group allowing access from the on-premise public IP to the RDS instance and assign to the RDS instance
Configure a NAT Gateway and attach the RDS database
Choose to make the RDS instance publicly accessible and place it in a public subnet

Create a DB subnet group that is publicly accessible - ✔✔When you create the RDS instance, you need to select the option to make it publicly accessible. A security group will need to be created and assigned to the RDS instance to allow access from the public IP address of your application (or firewall).
CORRECT: "Choose to make the RDS instance publicly accessible and place it in a public subnet" is a correct answer.
CORRECT: "Create a security group allowing access from the on-premise public IP to the RDS instance and assign to the RDS instance" is also a correct answer.
INCORRECT: "Configure a NAT Gateway and attach the RDS database" is incorrect. NAT Gateways are used for enabling Internet connectivity for EC2 instances in private subnets.
INCORRECT: "Select a public IP within the DB subnet group to assign to the RDS instance" is incorrect. The RDS instance does not require a public IP.
INCORRECT: "Create a DB subnet group that is publicly accessible" is incorrect. A DB subnet group is a collection of subnets (typically private) that you create in a VPC and then designate for your DB instance. The DB subnet group cannot be made publicly accessible; even if the subnets are public subnets, it is the RDS DB instance that must be configured to be publicly accessible.
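A minimal sketch of the instance-creation side of this answer follows; all identifiers are hypothetical placeholders, and with boto3 the dictionary would be passed to the RDS `create_db_instance` call. The referenced security group would allow inbound TCP 3306 from the on-premise public IP.

```python
# Sketch of parameters for an RDS MySQL instance reachable over the Internet.
# Identifiers, class, and the security group ID are illustrative placeholders.
create_db_params = {
    "DBInstanceIdentifier": "example-mysql",
    "Engine": "mysql",
    "DBInstanceClass": "db.t3.medium",
    "MasterUsername": "admin",
    "MasterUserPassword": "change-me-example",  # placeholder, never hard-code
    "AllocatedStorage": 20,
    "PubliclyAccessible": True,  # required for access from the Internet
    # security group permitting 3306 from the on-premise public IP
    "VpcSecurityGroupIds": ["sg-0123456789abcdef0"],
}
# With boto3: boto3.client("rds").create_db_instance(**create_db_params)
```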
