
Most Recent Snowflake ARA-R01 Exam Dumps

 

Prepare for the Snowflake SnowPro Advanced: Architect Recertification exam with our extensive collection of questions and answers. These practice Q&A are updated according to the latest syllabus, providing you with the tools needed to review and test your knowledge.

QA4Exam focuses on the latest syllabus and exam objectives, and our practice Q&A are designed to help you identify key topics and solidify your understanding. By focusing on the core curriculum, these questions and answers help you cover all the essential topics, ensuring you are well prepared for every section of the exam. Each question comes with a detailed explanation, offering valuable insights and helping you learn from your mistakes. Whether you are looking to assess your progress or dive deeper into complex topics, our updated Q&A will provide the support you need to confidently approach the Snowflake ARA-R01 exam and achieve success.

The questions for ARA-R01 were last updated on Apr 21, 2026.
  • Viewing page 1 out of 32 pages.
  • Viewing questions 1-5 out of 162 questions
Question No. 1

Database DB1 has schema S1 which has one table, T1.

DB1 --> S1 --> T1

The retention period of DB1 is set to 10 days.

The retention period of S1 is set to 20 days.

The retention period of T1 is set to 30 days.

The user runs the following command:

Drop Database DB1;

What will the Time Travel retention period be for T1?

Correct Answer: C

The Time Travel retention period for T1 will be 30 days, the value set at the table level. The Time Travel retention period determines how long historical data is preserved and accessible for an object after it is modified or dropped. It can be set at the account level, the database level, the schema level, or the table level, and the value set at the lowest level of the hierarchy takes precedence over the higher levels. Therefore, the retention period set at the table level overrides the retention periods set at the schema, database, and account levels.

When the user drops database DB1, table T1 is also dropped, but its historical data is still preserved for 30 days, the retention period set at the table level. Within that 30-day period, the user can use the UNDROP command to restore T1.

The other options are incorrect because:

10 days is the retention period set at the database level, which is overridden by the table level.

20 days is the retention period set at the schema level, which is also overridden by the table level.

37 days is not the retention period set at any level.
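The hierarchy described in the question can be reproduced with explicit retention settings. A minimal sketch using the names from the question (the column definition is illustrative, not from the source):

```sql
-- Set a different DATA_RETENTION_TIME_IN_DAYS at each level of the hierarchy
create database DB1 data_retention_time_in_days = 10;
create schema DB1.S1 data_retention_time_in_days = 20;
create table DB1.S1.T1 (id int) data_retention_time_in_days = 30;

drop database DB1;

-- Restore the dropped database (and its children) while still
-- inside the retention window:
undrop database DB1;
```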


Understanding & Using Time Travel

AT | BEFORE

Snowflake Time Travel & Fail-safe

Question No. 2

There are two databases in an account, named fin_db and hr_db, which contain payroll and employee data, respectively. Accountants and analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db, because that database is maintained by human resources personnel.

An Architect needs to create a read-only role for certain employees working in the human resources department.

Which permission sets must be granted to this role?

Correct Answer: A

To create a read-only role for certain employees working in the human resources department, the role needs to have the following permissions on the hr_db database:

USAGE on the database: This allows the role to access the database and see its schemas and objects.

USAGE on all schemas in the database: This allows the role to access the schemas and see their objects.

SELECT on all tables in the database: This allows the role to query the data in the tables.

Option A is the correct answer because it grants the minimum permissions required for a read-only role on the hr_db database.

Option B is incorrect because SELECT is not a valid privilege on schemas. Schema privileges include USAGE, MODIFY, MONITOR, and the various CREATE <object> privileges; SELECT applies to tables and views, not to schemas themselves.

Option C is incorrect because MODIFY on the database grants the ability to alter database settings, which exceeds what a read-only role requires, and USAGE on tables is not sufficient for querying the data. Tables support privileges such as SELECT, INSERT, UPDATE, DELETE, TRUNCATE, REFERENCES, and OWNERSHIP; querying requires SELECT.

Option D is incorrect because REFERENCES on tables is not relevant for querying the data. The REFERENCES privilege allows the role to reference a table when creating foreign key constraints.
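The permission set in Option A can be sketched as explicit grants; the role name below is illustrative, not taken from the question:

```sql
-- A minimal read-only role for hr_db (role name is an assumption)
create role hr_read_only;

grant usage on database hr_db to role hr_read_only;
grant usage on all schemas in database hr_db to role hr_read_only;
grant select on all tables in database hr_db to role hr_read_only;

-- Optionally, also cover tables created after the grant is issued:
grant select on future tables in database hr_db to role hr_read_only;
```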


https://docs.snowflake.com/en/user-guide/security-access-control-privileges.html#database-privileges

https://docs.snowflake.com/en/user-guide/security-access-control-privileges.html#schema-privileges

https://docs.snowflake.com/en/user-guide/security-access-control-privileges.html#table-privileges

Question No. 3

An Architect is using SnowCD to investigate a connectivity issue.

Which system function will provide a list of endpoints that the network must be able to access to use a specific Snowflake account, leveraging private connectivity?

Correct Answer: B

The SYSTEM$GET_PRIVATELINK function is used to retrieve the list of Snowflake service endpoints that need to be accessible when configuring private connectivity (such as AWS PrivateLink or Azure Private Link) for a Snowflake account. The function returns information necessary for setting up the networking infrastructure that allows secure and private access to Snowflake without using the public internet. SnowCD can then be used to verify connectivity to these endpoints.
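As a companion check, Snowflake also provides SYSTEM$ALLOWLIST_PRIVATELINK, which generates the JSON allowlist of endpoints that SnowCD consumes for private connectivity diagnostics. A minimal sketch (the file name is illustrative):

```sql
-- Generate the JSON allowlist of endpoints for private connectivity;
-- save the output to a file (e.g. allowlist.json) for SnowCD to read
select system$allowlist_privatelink();
```

SnowCD can then be run against the saved file, e.g. `snowcd allowlist.json`, to test connectivity to each endpoint in turn.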


Question No. 4

Based on the architecture in the image, how can the data from DB1 be copied into TBL2? (Select TWO).

A)

B)

C)

D)

E)

Correct Answer: B, E

The architecture in the image shows a Snowflake data platform with two databases, DB1 and DB2, and two schemas, SH1 and SH2. DB1 contains a table TBL1 and a stage STAGE1. DB2 contains a table TBL2. The image also shows a snippet of SQL code that copies data from STAGE1 to TBL2 using a file format FF_PIPE_1.

To copy data from DB1 to TBL2, there are two possible options among the choices given:

Option B: Use a named stage that references the same storage location as STAGE1. This option requires creating a stage object in DB2.SH2 that points to the same location as STAGE1 in DB1.SH1. The stage can be created using the CREATE STAGE command with the URL parameter specifying that location [1]. For example:

use database DB2;
use schema SH2;

create stage EXT_STAGE1
  url = 's3://<bucket>/<path>/';  -- placeholder for the location STAGE1 points to

Then, the data can be copied from the stage to TBL2 using the COPY INTO command, with the FROM parameter specifying the stage name and the FILE_FORMAT parameter specifying the file format name [2]. For example:

copy into TBL2
  from @EXT_STAGE1
  file_format = (format_name = 'DB1.SH1.FF_PIPE_1');

Option E: Use a cross-database query to select data from TBL1 and insert it into TBL2. This option uses the INSERT INTO command with a SELECT clause to query data from TBL1 in DB1.SH1 and insert it into TBL2 in DB2.SH2. The query must use the fully qualified names of the tables, including the database and schema names [3]. For example:

use database DB2;
use schema SH2;

insert into TBL2
  select * from DB1.SH1.TBL1;

The other options are not valid because:

Option A: It uses invalid syntax for the COPY INTO command. The FROM parameter cannot specify a table name, only a stage name or a file location [2].

Option C: It uses invalid syntax for the COPY INTO command. The FILE_FORMAT parameter cannot specify a stage name, only a file format name or options [2].

Option D: It uses invalid syntax for the CREATE STAGE command. The URL parameter cannot specify a table name, only a file location [1].


[1]: CREATE STAGE | Snowflake Documentation

[2]: COPY INTO <table> | Snowflake Documentation

[3]: Cross-database Queries | Snowflake Documentation

Question No. 5

An Architect needs to automate the daily import of two files from an external stage into Snowflake. One file has Parquet-formatted data; the other has CSV-formatted data.

How should the data be joined and aggregated to produce a final result set?

Correct Answer: B

According to the Snowflake documentation, tasks are objects that enable the scheduling and execution of SQL statements, stored procedure calls, or Snowflake Scripting blocks. Tasks can be used to automate data loading, transformation, and maintenance operations. Snowflake Scripting is a feature for writing procedural logic (variables, loops, and branching) directly in SQL, which makes it possible to build multi-step workflows inside a single task. Therefore, the best option for automating the daily import of two files from an external stage, joining and aggregating the data, and producing a final result set is to create a task using Snowflake Scripting that imports the files with the COPY INTO command and then calls a UDF to perform the join and aggregation logic. The UDF can return a table or a variant value as the final result set.

Reference:

Tasks

Snowflake Scripting

User-Defined Functions
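The approach described above can be sketched as a single scheduled task whose body is a Snowflake Scripting block. Every object name, schedule, stage path, and join key below is illustrative, not taken from the exam question:

```sql
-- Sketch of a daily task that loads both files and builds the result set
create or replace task daily_import_task
  warehouse = my_wh
  schedule = 'USING CRON 0 2 * * * UTC'
as
begin
  -- Load the Parquet-formatted file
  copy into raw_parquet
    from @ext_stage/parquet/
    file_format = (type = parquet)
    match_by_column_name = case_insensitive;

  -- Load the CSV-formatted file
  copy into raw_csv
    from @ext_stage/csv/
    file_format = (type = csv skip_header = 1);

  -- Join and aggregate the two loads into the final result table
  insert into final_result
    select p.id, sum(c.amount) as total_amount
    from raw_parquet p
    join raw_csv c on p.id = c.id
    group by p.id;
end;
```

The join and aggregation step could equally be factored into a UDF or stored procedure, as the explanation suggests; inlining it here keeps the sketch self-contained.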

