The Snowflake SPS-C01 exam preparation guide is designed to provide candidates with the information they need about the SnowPro Specialty - Snowpark exam. It includes an exam summary, sample questions, a practice test, exam objectives, and guidance on interpreting those objectives, so that candidates can gauge the types of questions that may appear on the Snowflake Certified SnowPro Specialty - Snowpark exam.
All candidates are encouraged to review the SPS-C01 objectives and sample questions provided in this preparation guide. The Snowflake SnowPro Specialty - Snowpark certification is aimed at candidates who want to build a career in the Specialty domain and demonstrate their expertise. We suggest using the practice exam listed in this certification guide to become familiar with the exam environment and to identify the knowledge areas that need more work before taking the actual Snowflake SnowPro Specialty - Snowpark exam.
Snowflake SPS-C01 Exam Summary:
Exam Name | Snowflake SnowPro Specialty - Snowpark
---|---
Exam Code | SPS-C01
Exam Price | $225 USD
Duration | 85 minutes
Number of Questions | 55
Passing Score | 750 (scaled scoring from 0 to 1000)
Recommended Training / Books | Snowpark DataFrame Programming Training Course; SnowPro Specialty: Snowpark Exam Study Guide; Level Up Snowpark Essentials Track
Schedule Exam | Pearson VUE
Sample Questions | Snowflake SPS-C01 Sample Questions
Recommended Practice | Snowflake Certified SnowPro Specialty - Snowpark Practice Test
Snowflake SnowPro Specialty - Snowpark Syllabus:
Snowpark Concepts - 15%

Outline Snowpark architecture
- Lazy evaluation
- Use of key objects
- Types of libraries (DataFrames, Machine Learning)
- Client-side and server-side capabilities
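Lazy evaluation is the architectural idea most of this section builds on: DataFrame calls only assemble a query plan on the client, and nothing runs server-side until an action is triggered. A minimal sketch (the table and column names are hypothetical, and an existing `session` is assumed):

```python
from snowflake.snowpark.functions import col

# Building the DataFrame only constructs a query plan client-side;
# no SQL has been sent to Snowflake yet (lazy evaluation).
df = session.table("ORDERS").filter(col("AMOUNT") > 100).select("ORDER_ID", "AMOUNT")

# An action such as collect(), show(), or count() triggers
# server-side execution of the generated SQL.
rows = df.collect()
```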
Set up Snowpark
- Installation
- Development environments
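For the installation objective, the client library is the snowflake-snowpark-python package on PyPI; a quick smoke test after installing might look like the following (the pandas extra is optional):

```python
# pip install "snowflake-snowpark-python[pandas]"
from snowflake.snowpark.version import VERSION

# Confirm the installed client version as a basic sanity check.
print(".".join(str(v) for v in VERSION))
```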
Snowpark API for Python - 30%

Create and manage user sessions
- Account identifiers
- Parameters for the CONNECT function
- Authentication methods
- Session creation
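Session creation ties together account identifiers, connection parameters, and authentication. A minimal sketch with placeholder credentials (all values are hypothetical; key-pair and SSO authenticators are also supported):

```python
from snowflake.snowpark import Session

# Placeholder values; the account identifier uses the <orgname>-<account_name> format.
connection_parameters = {
    "account": "MYORG-MYACCOUNT",
    "user": "MY_USER",
    "password": "MY_PASSWORD",
    "role": "MY_ROLE",
    "warehouse": "MY_WH",
    "database": "MY_DB",
    "schema": "MY_SCHEMA",
}

session = Session.builder.configs(connection_parameters).create()
print(session.get_current_database())
session.close()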
Use Snowpark with unstructured data
- Read files with SnowflakeFile object
- Use UDFs and UDTFs to process files
- Use stored procedures to process files
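As a sketch of file processing, a UDF can open a staged file through a scoped URL with SnowflakeFile (the UDF name is hypothetical; an active `session` is assumed for registration, and the caller would pass a URL such as one produced by BUILD_SCOPED_FILE_URL):

```python
from snowflake.snowpark.files import SnowflakeFile
from snowflake.snowpark.functions import udf

# Register a UDF that reads a staged file via a scoped file URL.
@udf(name="first_line", replace=True, packages=["snowflake-snowpark-python"])
def first_line(file_url: str) -> str:
    with SnowflakeFile.open(file_url, "r") as f:
        return f.readline()
```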
Create Snowpark DataFrames
- Multiple methods to create Snowpark DataFrames
- Schemas (apply to DataFrames)
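The common creation paths are worth having at your fingertips; a sketch assuming hypothetical table, stage, and column names and an existing `session`:

```python
from snowflake.snowpark.types import StructType, StructField, IntegerType, StringType

# From an existing table, from raw SQL, or from local values:
df_table = session.table("CUSTOMERS")
df_sql = session.sql("SELECT * FROM CUSTOMERS WHERE REGION = 'EMEA'")
df_local = session.create_dataframe([(1, "a"), (2, "b")], schema=["ID", "LETTER"])

# From staged files, applying an explicit schema:
schema = StructType([
    StructField("ID", IntegerType()),
    StructField("NAME", StringType()),
])
df_csv = session.read.schema(schema).csv("@MY_STAGE/customers.csv")
```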
Operationalize UDFs and UDTFs in Snowpark
- Create UDFs from files (locally, on a stage)
- Use Python modules (packaged Python code) with UDFs
- Write Python functions to create UDFs and UDTFs
- Register UDFs and UDTFs (for example, session.udf.register(...), functions.udf(...))
- Secure UDFs and UDTFs
- Data types (type hints vs. registration API)
- Compare scalar and vectorized operations
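The session.udf.register(...) versus functions.udf(...) distinction, and type hints versus explicitly declared types, look like this in a minimal sketch (function and UDF names are hypothetical; an active `session` is assumed):

```python
from snowflake.snowpark.functions import udf
from snowflake.snowpark.types import IntegerType

# Type hints supply the signature when using the decorator...
@udf(name="add_one", replace=True)
def add_one(x: int) -> int:
    return x + 1

# ...or the registration API can declare the types explicitly.
def times_two(x: int) -> int:
    return x * 2

session.udf.register(
    times_two,
    name="times_two",
    return_type=IntegerType(),
    input_types=[IntegerType()],
    replace=True,
)
```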
Operationalize Snowpark stored procedures
- Create stored procedures from files (locally, on a stage)
- Write Python functions to power stored procedures
- Use Python modules (packaged code, Anaconda) with stored procedures
- Register stored procedures
- Make dependencies available to code
- Secure stored procedures
- Use Snowpark Python stored procedures to run workloads
- Data types (type hints vs. registration API)
- Create Directed Acyclic Graphs (tasks) executing stored procedures
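A minimal stored-procedure sketch (procedure and table names are hypothetical; an active `session` is assumed). Note that the handler's first parameter is the server-side Session, and `packages` declares Anaconda dependencies available to the handler:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import sproc

# The stored procedure receives the server-side Session as its first argument.
@sproc(name="count_rows", replace=True, packages=["snowflake-snowpark-python"])
def count_rows(session: Session, table_name: str) -> int:
    return session.table(table_name).count()

# Invoke it from the client:
print(session.call("count_rows", "CUSTOMERS"))
```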
Snowpark for Data Transformations - 35%

Apply operations for filtering and transforming data
- Use scalar functions and operators
- Sort and limit results
- Input/output (parameters)
- Snowpark DataFrames
- Columns
- Data type casting
- Rows and data extraction from a Row object
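A sketch pulling several of these objectives together (table and column names are hypothetical; an existing `session` is assumed):

```python
from snowflake.snowpark.functions import col, upper

result = (
    session.table("ORDERS")
    .filter(col("STATUS") == "OPEN")                       # scalar operators
    .select(upper(col("REGION")).alias("REGION"),          # scalar function
            col("AMOUNT").cast("double").alias("AMOUNT"))  # data type casting
    .sort(col("AMOUNT").desc())                            # sort...
    .limit(10)                                             # ...and limit
)

rows = result.collect()          # returns a list of Row objects
top_amount = rows[0]["AMOUNT"]   # extract data from a Row
```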
Clean and enrich data using Snowpark for Python
- Perform joins
- Handle missing values
- Sample data
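A sketch of the three objectives in sequence (table and column names are hypothetical; an existing `session` is assumed):

```python
orders = session.table("ORDERS")
customers = session.table("CUSTOMERS")

# Join, patch missing values, then take a 10% sample.
enriched = (
    orders.join(customers, orders["CUSTOMER_ID"] == customers["ID"])
    .na.fill({"REGION": "UNKNOWN"})
    .sample(frac=0.1)
)
```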
Perform aggregate and set-based operations on DataFrames
- Functions
- Window
- Grouping
- Table functions
- UDFs
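A sketch of grouping plus a window function (table and column names are hypothetical; an existing `session` is assumed):

```python
from snowflake.snowpark import Window
from snowflake.snowpark.functions import col, sum as sum_, rank

df = session.table("SALES")

# Grouped aggregation
totals = df.group_by("REGION").agg(sum_("AMOUNT").alias("TOTAL"))

# Window function: rank rows within each region by amount
w = Window.partition_by("REGION").order_by(col("AMOUNT").desc())
ranked = df.with_column("RNK", rank().over(w))
```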
Transform semi-structured data in DataFrames
- Traverse semi-structured data
- Explicitly cast values in semi-structured data
- Flatten an array of objects into rows
- Load semi-structured data into DataFrames
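A sketch assuming a hypothetical VARIANT column named PAYLOAD holding JSON objects (an existing `session` is assumed; the flatten call uses join_table_function with the named input argument):

```python
from snowflake.snowpark.functions import col
from snowflake.snowpark.types import StringType

df = session.table("RAW_EVENTS")

# Traverse nested data and explicitly cast a value.
names = df.select(col("PAYLOAD")["customer"]["name"].cast(StringType()).alias("NAME"))

# Flatten an array of objects into one row per element.
items = (df.join_table_function("flatten", input=col("PAYLOAD")["items"])
           .select(col("VALUE")["sku"].alias("SKU")))
```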
Persist the results of Snowpark DataFrames
- Create views from DataFrames
- Save DataFrame results as Snowflake tables
- Save DataFrame results as files in a stage
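The three persistence targets in one sketch (table, view, and stage names are hypothetical; an existing `session` is assumed):

```python
df = session.table("STAGING_ORDERS").filter("AMOUNT > 0")

df.create_or_replace_view("CLEAN_ORDERS_V")                # view
df.write.mode("overwrite").save_as_table("CLEAN_ORDERS")   # table
df.write.copy_into_location("@MY_STAGE/clean/",            # staged files
                            file_format_type="parquet",
                            overwrite=True)
```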
Perform DML operations using Snowpark DataFrames
- Delete data
- Update data
- Insert data
- Merge data
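Table objects (as opposed to plain DataFrames) expose the DML methods; a sketch with hypothetical tables and columns (an existing `session` is assumed):

```python
from snowflake.snowpark.functions import lit, when_matched, when_not_matched

target = session.table("TARGET")   # snowflake.snowpark.Table supports DML
source = session.table("UPDATES")

target.delete(target["STATUS"] == "STALE")                # delete
target.update({"STATUS": lit("OK")}, target["ID"] == 1)   # update

# Merge source rows into the target on matching IDs (update or insert).
target.merge(
    source,
    target["ID"] == source["ID"],
    [
        when_matched().update({"VALUE": source["VALUE"]}),
        when_not_matched().insert({"ID": source["ID"], "VALUE": source["VALUE"]}),
    ],
)
```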
Snowpark Performance Optimization - 20%

Configure Snowpark-optimized warehouses
- Use cases for Snowpark-optimized virtual warehouses
- Modify Snowpark-optimized virtual warehouse properties
- Billing for Snowpark-optimized virtual warehouses
- When to scale up/down virtual warehouses
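Warehouse configuration is DDL rather than DataFrame API, so from Snowpark it is typically issued through session.sql(); a sketch with a hypothetical warehouse name (an existing `session` is assumed):

```python
# Create a Snowpark-optimized warehouse for memory-heavy workloads.
session.sql("""
    CREATE OR REPLACE WAREHOUSE SNOWPARK_WH
      WAREHOUSE_SIZE = 'MEDIUM'
      WAREHOUSE_TYPE = 'SNOWPARK-OPTIMIZED'
""").collect()

# Scale up for a heavy job, then point the session at the warehouse.
session.sql("ALTER WAREHOUSE SNOWPARK_WH SET WAREHOUSE_SIZE = 'LARGE'").collect()
session.use_warehouse("SNOWPARK_WH")
```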
Enhance performance in Snowpark applications
- Materialize results (caching)
- Vectorization
- Synchronous versus asynchronous calls
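A sketch touching all three techniques (table and UDF names are hypothetical; an active `session` is assumed):

```python
from snowflake.snowpark.functions import udf
from snowflake.snowpark.types import PandasSeries

# Materialize an expensive intermediate result into a temporary table
# so later operations reuse it instead of recomputing it.
expensive = session.table("BIG_TABLE").group_by("KEY").count()
cached = expensive.cache_result()

# Vectorized UDF: operates on pandas Series batches rather than row by row.
@udf(name="add_one_vec", replace=True, packages=["pandas"])
def add_one_vec(s: PandasSeries[int]) -> PandasSeries[int]:
    return s + 1

# Asynchronous call: submit the query and do other work while it runs.
job = cached.collect_nowait()
rows = job.result()
```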
Troubleshoot common errors in Snowpark
- Event tables
- Snowpark Python local testing framework
- Writing tests (pytest)
- Query history (SQL equivalency to help identify bottlenecks)
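A sketch of a pytest-style unit test using the local testing framework, which emulates DataFrame operations without a live connection (the test and fixture names are hypothetical; the local_testing configuration key follows the framework's documentation and is version-dependent):

```python
# test_transform.py -- run with: pytest test_transform.py
import pytest
from snowflake.snowpark import Session

@pytest.fixture
def session():
    # Create a session backed by the local testing framework (no connection).
    return Session.builder.config("local_testing", True).create()

def test_filter_keeps_positive_amounts(session):
    df = session.create_dataframe([(1, 10), (2, -5)], schema=["ID", "AMOUNT"])
    result = df.filter(df["AMOUNT"] > 0).collect()
    assert len(result) == 1 and result[0]["ID"] == 1
```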