PCED™ – Certified Entry-Level Data Analyst with Python: EXAM SYLLABUS

Exam: PCED-30-02
Status: ACTIVE


The PCED™-30-02 exam consists of single-select and multiple-select items designed to assess a candidate’s ability to collect, prepare, analyze, and communicate data using Python. The exam evaluates practical skills in data handling, Python programming, basic analytics, and reporting.

Each item is worth a maximum of 1 point. After completion, the candidate’s raw score is normalized and presented as a percentage.

The exam is divided into four blocks, each reflecting key areas of data analysis practice. The weight of each block indicates its importance in the overall exam.

The table below summarizes the distribution of exam items and their respective weight in the total exam score.

Block  Block Name                                        Items  Weight
1      Introduction to Data and Data Analysis Concepts      9   22.5%
2      Python Basics for Data Analysis                     13   32.5%
3      Working with Data and Performing Simple Analyses    13   32.5%
4      Communicating Insights and Reporting                 5   12.5%
       Total                                               40    100%

Exam Syllabus

Last updated: July 14, 2025
Aligned with Exam PCED-30-02


Block 1: Introduction to Data and Data Analysis Concepts

9 objectives covered by the block → 9 exam items

Define and Classify Data

Objective 1.1.1 – Define data and explain how it becomes meaningful.

  1. Define data and explain its role in decision-making, business, and everyday life.
  2. Distinguish between data, information, and knowledge.
  3. Describe how raw data is processed into usable insights for decision-making.

Objective 1.1.2 – Classify data by type and format.

  1. Identify and classify data as quantitative or qualitative.
  2. Differentiate structured, semi-structured, and unstructured data using real-world examples.

Describe Data Sources, Collection Methods, and Storage

Objective 1.2.1 – Identify data sources and collection methods.

  1. Identify various data sources, including APIs, web pages, databases, IoT devices, surveys, and logs.
  2. Explain common data collection methods: surveys, interviews, observations, automated systems, and web scraping.
  3. Discuss representative sampling and the risks of biased or incomplete data.
  4. Compare advantages and limitations of data collection techniques for qualitative and quantitative research.

Objective 1.2.2 – Explain how data is stored and organized.

  1. Describe data formats (CSV, JSON, Excel, databases) and storage systems (data lakes, warehouses, relational databases).
  2. Explain the role of metadata and compare storage solutions by type, structure, and purpose of data.
  3. Evaluate storage options based on data structure, scale, and use case.

Explain the Data Lifecycle and Its Management

Objective 1.3.1 – Describe the data lifecycle.

  1. List and explain lifecycle stages: collection, storage, processing, analysis, reporting, archiving, deletion.
  2. Explain how errors at any stage impact results and decisions.
  3. Identify tools and techniques used at each stage.

Objective 1.3.2 – Discuss the value and challenges of lifecycle management.

  1. Explain the importance of lifecycle management for quality, security, and compliance.
  2. Describe challenges in managing large-scale data and strategies like cloud storage and data pipelines.

Understand the Scope of Data Science, Analytics, and Analysis

Objective 1.4.1 – Differentiate between data analysis, data analytics, and data science.

  1. Define each term and explain their relationship.
  2. Compare their scope, tools, and goals using real examples.
  3. Describe roles and responsibilities of professionals in each area.
  4. Identify typical tasks in each field (e.g., statistical summaries vs. ML modeling).

Objective 1.4.2 – Explain the data analytics workflow.

  1. Describe descriptive, diagnostic, predictive, and prescriptive analytics.
  2. Identify key questions each type answers and their business relevance.
  3. Explain the steps: collection, preprocessing, analysis, reporting.
  4. Match analytics types with real-world examples.

Identify Ethical and Legal Considerations in Data Analytics

Objective 1.5.1 – Describe key ethical principles and legal frameworks.

  1. Explain transparency, consent, privacy, fairness, and accountability in data handling.
  2. Identify major laws (GDPR, HIPAA, CCPA) and how they guide data use.
  3. Describe anonymization and encryption techniques that support compliance.

Block 2: Python Basics for Data Analysis

13 objectives covered by the block → 13 exam items

Work with Variables and Data Types

Objective 2.1.1 – Use variables and data types, and perform basic operations.

  1. Define and assign variables in Python using the assignment operator =.
  2. Perform simple operations with numbers (e.g., addition, subtraction) and strings (e.g., concatenation, repetition).
  3. Use type() and isinstance() to inspect variable types.
  4. Identify common Python data types: int, float, str, and bool.
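
A minimal sketch of what this objective looks like in code (all values are hypothetical samples):

    # Assign variables of the four common types.
    count = 12          # int
    price = 3.5         # float
    label = "units"     # str
    in_stock = True     # bool

    # Basic operations with numbers and strings.
    total = count * price          # 42.0
    header = label + "!"           # concatenation -> 'units!'
    divider = "-" * 10             # repetition -> '----------'

    # Inspect variable types.
    print(type(total))             # <class 'float'>
    print(isinstance(count, int))  # True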

Use Python Data Collections and Sequences

Objective 2.2.1 – Create and manipulate lists.

  1. Create and access list elements using indexing and slicing.
  2. Use list methods such as append(), insert(), pop(), remove(), sort(), reverse(), count(), and index().
  3. Use list comprehensions to transform or filter data.
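
For illustration, a minimal sketch of these list operations (sample data is hypothetical):

    # Create a list; access elements by index and slice.
    scores = [88, 92, 75, 92]
    first = scores[0]        # 88
    middle = scores[1:3]     # [92, 75]

    # Common list methods.
    scores.append(80)        # add to the end
    scores.sort()            # sort in place
    print(scores.count(92))  # 2
    print(scores.index(75))  # position of the value 75

    # List comprehension: keep only scores of 90 or more.
    high = [s for s in scores if s >= 90]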

Objective 2.2.2 – Work with tuples and sets.

  1. Create and access tuples using indexing.
  2. Explain tuple immutability and its use cases.
  3. Create sets and perform operations like add(), remove(), union(), intersection(), difference().
  4. Use sets to remove duplicates and check membership.
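
A minimal sketch of tuple and set usage (sample values are hypothetical):

    # Tuples are immutable sequences, useful for fixed records.
    point = (3, 4)
    x = point[0]                 # indexing works; point[0] = 5 would raise TypeError

    # Sets store unique values and support set operations.
    a = {1, 2, 3}
    b = {3, 4}
    a.add(5)
    print(a.union(b))            # {1, 2, 3, 4, 5}
    print(a.intersection(b))     # {3}
    print(a.difference(b))       # {1, 2, 5}
    a.remove(5)

    # Remove duplicates and check membership.
    unique_ids = set([7, 7, 8])  # {8, 7}
    print(8 in unique_ids)       # True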

Objective 2.2.3 – Use dictionaries for data storage, grouping, and lookup.

  1. Create dictionaries with key-value pairs.
  2. Access, update, and delete dictionary values.
  3. Loop through dictionaries with for...in and items().
  4. Apply dictionaries for counting, lookup, and categorization tasks.
  5. Represent data as lists of dictionaries.
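
For illustration, a minimal sketch of dictionary usage (keys and values are hypothetical):

    # Create a dictionary of key-value pairs.
    inventory = {"apples": 10, "pears": 4}

    # Access, update, and delete values.
    inventory["apples"] += 5
    inventory["plums"] = 7
    del inventory["pears"]

    # Loop through keys and values with items().
    for fruit, qty in inventory.items():
        print(fruit, qty)

    # Use a dictionary for counting.
    counts = {}
    for color in ["red", "blue", "red"]:
        counts[color] = counts.get(color, 0) + 1   # {'red': 2, 'blue': 1}

    # Represent records as a list of dictionaries.
    rows = [{"name": "Ana", "age": 30}, {"name": "Ben", "age": 25}]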

Objective 2.2.4 – Work with strings as sequences and apply string methods.

  1. Use indexing, slicing, and loops with strings.
  2. Apply methods like startswith(), endswith(), find(), capitalize(), isdigit(), isalpha().
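
A minimal sketch of strings as sequences and common string methods (sample text is hypothetical):

    text = "report2024"
    print(text[0:6])               # slicing -> 'report'
    print(text.startswith("rep"))  # True
    print(text.endswith("24"))     # True
    print(text.find("2024"))       # 6
    print("data".capitalize())     # 'Data'
    print("123".isdigit())         # True
    print("abc".isalpha())         # True

    # Strings are sequences, so they can be indexed and looped over.
    for ch in "abc":
        print(ch)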

Use Functions and Handle Exceptions

Objective 2.3.1 – Define and call functions.

  1. Create functions using def and pass arguments (positional, keyword, default).
  2. Return values and explain that a function returns None when it has no explicit return statement.
  3. Use pass for placeholder function bodies.
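
For illustration, a minimal sketch of function definitions and calls (names and values are hypothetical):

    # Positional, keyword, and default arguments.
    def describe(value, label="value", precision=2):
        return f"{label}: {round(value, precision)}"

    print(describe(3.14159))              # positional argument only
    print(describe(3.14159, label="pi"))  # keyword argument

    # A function with no return statement returns None.
    def log(message):
        print(message)

    result = log("done")   # result is None

    # pass as a placeholder body.
    def todo():
        pass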

Objective 2.3.2 – Understand scope and variable behavior in functions.

  1. Distinguish between local and global variables.
  2. Explain name shadowing and variable scope within functions.
  3. Understand when to use global variables.
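
A minimal sketch of local vs. global scope (variable names are hypothetical):

    total = 0   # global variable

    def add(x):
        total = x        # local variable shadows the global name
        return total

    def add_global(x):
        global total     # opt in to modifying the global
        total += x

    add(5)
    print(total)         # still 0: the local assignment did not touch the global
    add_global(5)
    print(total)         # 5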

Objective 2.3.3 – Handle errors with try-except blocks.

  1. Identify common runtime errors like TypeError, ValueError, IndexError.
  2. Use try-except blocks to prevent script crashes.
  3. Print or log useful error messages for debugging.
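
For illustration, a minimal sketch of error handling with try-except (sample data is hypothetical):

    values = ["10", "n/a", "30"]
    total = 0
    for v in values:
        try:
            total += int(v)          # int('n/a') raises ValueError
        except ValueError as err:
            print(f"Skipping bad value {v!r}: {err}")   # log a useful message
    print(total)                     # 40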

Control Program Flow with Conditionals and Loops

Objective 2.4.1 – Apply Boolean logic and comparisons.

  1. Use comparison operators and logical operators in conditions.
  2. Use Boolean expressions for data filtering and validation.
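
A minimal sketch of Boolean logic used for validation and filtering (values and thresholds are hypothetical):

    age = 25
    income = 48000

    # Comparison and logical operators in a condition.
    eligible = age >= 18 and income > 30000          # True

    # Boolean expressions for data filtering.
    readings = [12, -1, 105, 47]
    valid = [r for r in readings if 0 <= r <= 100]   # [12, 47]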

Objective 2.4.2 – Use conditional statements to control logic.

  1. Write if, elif, and else statements.
  2. Check for missing data, outliers, and invalid input.
  3. Use nested conditionals for complex decisions.
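
For illustration, a minimal sketch of conditional checks on incoming values (the thresholds are hypothetical):

    def check(value):
        if value is None:
            return "missing data"
        elif value < 0 or value > 100:
            return "outlier or invalid input"
        else:
            if value > 90:           # nested conditional
                return "high"
            return "ok"

    print(check(None), check(150), check(95), check(42))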

Objective 2.4.3 – Write loops for repeated tasks.

  1. Use for and while loops.
  2. Apply break, continue, and else with loops.
  3. Combine loops with conditionals for data operations.
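
A minimal sketch of loop constructs (sample data is hypothetical):

    # A for loop combining continue and break with conditionals.
    for n in [3, -2, 7, 0, 9]:
        if n < 0:
            continue        # skip negative values
        if n == 0:
            break           # stop at the first zero
        print(n)            # prints 3, then 7

    # A while loop; the else clause runs when the loop ends without break.
    i = 0
    while i < 3:
        i += 1
    else:
        print("loop finished normally")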

Use Modules and Packages

Objective 2.5.1 – Import and use Python modules and packages.

  1. Import modules with import, from...import, and aliases.
  2. Use standard libraries like math, random, statistics, collections, os, datetime.
  3. Use the csv module to read and write CSV files.
  4. Understand when to use built-in vs. third-party packages.
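
For illustration, a minimal sketch of imports and standard-library usage (data.csv is a hypothetical file name):

    import math
    import random
    from collections import Counter
    import datetime as dt             # import with an alias

    print(math.sqrt(16))              # 4.0
    print(random.randint(1, 6))       # a random die roll
    print(Counter("aabbb"))           # Counter({'b': 3, 'a': 2})
    print(dt.date.today())

    # Reading a CSV file with the built-in csv module.
    import csv
    import os
    if os.path.exists("data.csv"):    # guard: the file may not exist
        with open("data.csv", newline="") as f:
            for row in csv.reader(f):
                print(row)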

Objective 2.5.2 – Use external libraries in data workflows.

  1. Install and import external libraries like numpy using pip.
  2. Use numpy for arrays and numerical analysis.
  3. Understand the distinction between built-in and third-party libraries.
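
A minimal sketch of installing and using an external library (assumes numpy has been installed):

    # Install once from the command line: pip install numpy
    import numpy as np

    values = np.array([4, 8, 15, 16])
    print(values.mean())    # 10.75
    print(values * 2)       # element-wise: [ 8 16 30 32]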

Block 3: Working with Data and Performing Simple Analyses

13 objectives covered by the block → 13 exam items

Read and Write Data Using Files

Objective 3.1.1 – Read and write plain text files using Python built-ins.

  1. Use open(), read(), readlines(), and write() to handle text files.
  2. Use with statements for safe file handling.
  3. Work with file paths and check file existence with os.path.exists().
  4. Use try-except to catch file-related errors.
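
For illustration, a minimal sketch of text-file handling (notes.txt is a hypothetical file name):

    import os

    path = "notes.txt"

    # Write, then read back, using with for safe file handling.
    with open(path, "w") as f:
        f.write("first line\nsecond line\n")

    if os.path.exists(path):
        try:
            with open(path) as f:
                lines = f.readlines()   # list of lines
            print(lines)
        except OSError as err:
            print(f"Could not read {path}: {err}")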

Objective 3.1.2 – Read and write CSV files using the csv module.

  1. Read CSV data with csv.reader().
  2. Write data with csv.writer().
  3. Manually parse lines using .strip() and .split(',').
  4. Write formatted summaries with f-strings.
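
A minimal sketch of CSV reading and writing (scores.csv and its contents are hypothetical):

    import csv

    # Write a small CSV file.
    rows = [["name", "score"], ["Ana", 88], ["Ben", 92]]
    with open("scores.csv", "w", newline="") as f:
        csv.writer(f).writerows(rows)

    # Read it back and print a formatted summary with f-strings.
    with open("scores.csv", newline="") as f:
        reader = csv.reader(f)
        next(reader)                            # skip the header row
        for name, score in reader:
            print(f"{name}: {int(score)} points")

    # Manual parsing of one raw line, for comparison.
    fields = "Ana,88\n".strip().split(",")      # ['Ana', '88']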

Clean and Prepare Data for Analysis

Objective 3.2.1 – Identify and handle missing or invalid data.

  1. Detect missing/null values with conditionals and list comprehensions.
  2. Replace or remove missing values logically.
  3. Check for invalid types, formats, or ranges before processing.
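
For illustration, a minimal sketch of detecting and handling missing values (markers such as 'n/a' are hypothetical):

    raw = ["12", "", "n/a", "30", None]

    # Detect and remove missing entries.
    cleaned = [v for v in raw if v not in (None, "", "n/a")]

    # Validate type and range before converting.
    numbers = [int(v) for v in cleaned if v.isdigit() and 0 <= int(v) <= 100]
    print(numbers)    # [12, 30]

    # Alternatively, replace missing values with a default.
    filled = [v if v not in (None, "", "n/a") else "0" for v in raw]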

Objective 3.2.2 – Remove duplicates and normalize values.

  1. Use set(), dict keys, or comprehensions to eliminate duplicates.
  2. Perform min-max normalization manually.
  3. Use enumeration for indexed transformations.
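
A minimal sketch of deduplication and manual min-max normalization (sample data is hypothetical):

    data = [5, 3, 5, 9, 3]

    # Remove duplicates while preserving order, using dict keys.
    unique = list(dict.fromkeys(data))    # [5, 3, 9]

    # Min-max normalization: scale values to the range [0, 1].
    lo, hi = min(unique), max(unique)
    normalized = [(x - lo) / (hi - lo) for x in unique]

    # enumerate() for indexed transformations.
    for i, x in enumerate(normalized):
        print(i, round(x, 2))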

Objective 3.2.3 – Clean and format strings.

  1. Use .strip(), .lower(), .upper(), .replace(), and .title() for cleaning.
  2. Chain string methods for multi-step operations.
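
For illustration, a minimal sketch of chained string cleaning (the input text is hypothetical):

    raw = "  ACME corp  "

    # Chain string methods for a multi-step cleaning operation.
    cleaned = raw.strip().lower().replace("corp", "corporation").title()
    print(cleaned)    # 'Acme Corporation'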

Objective 3.2.4 – Convert and format data for analysis and storage.

  1. Convert data types using int(), float(), str(), bool().
  2. Format numbers with f-strings.
  3. Manipulate strings with .split() and .join().
  4. Handle dates with datetime.strptime() and strftime().
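
A minimal sketch of type conversion and formatting (sample values are hypothetical):

    from datetime import datetime

    # Type conversions.
    n = int("42")
    x = float("3.5")
    flag = bool(n)                    # True for any non-zero number

    # Number formatting with f-strings.
    print(f"{x:.1f}")                 # '3.5'

    # split() and join().
    parts = "a,b,c".split(",")        # ['a', 'b', 'c']
    joined = "-".join(parts)          # 'a-b-c'

    # Parse and reformat a date.
    d = datetime.strptime("2025-07-14", "%Y-%m-%d")
    print(d.strftime("%d %b %Y"))     # '14 Jul 2025'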

Perform Basic Analytical Computations

Objective 3.3.1 – Perform aggregations using Python built-ins.

  1. Use len(), sum(), min(), max(), round() for summaries.
  2. Count values with .count() or dictionary methods.
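
For illustration, a minimal sketch of built-in aggregations (sample data is hypothetical):

    sales = [120, 95, 310, 95]
    print(len(sales))                         # 4
    print(sum(sales))                         # 620
    print(min(sales), max(sales))             # 95 310
    print(round(sum(sales) / len(sales), 1))  # mean: 155.0
    print(sales.count(95))                    # 2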

Objective 3.3.2 – Calculate descriptive statistics with built-in libraries.

  1. Use statistics.mean(), statistics.median(), statistics.stdev().
  2. Use math.sqrt(), math.ceil(), math.floor().
  3. Use collections.Counter() for frequency counts.
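
A minimal sketch using the statistics, math, and collections modules (sample data is hypothetical):

    import math
    import statistics
    from collections import Counter

    data = [2, 4, 4, 5, 7]
    print(statistics.mean(data))      # 4.4
    print(statistics.median(data))    # 4
    print(statistics.stdev(data))     # sample standard deviation

    print(math.sqrt(2), math.ceil(2.1), math.floor(2.9))   # 1.414... 3 2
    print(Counter(data))              # Counter({4: 2, 2: 1, 5: 1, 7: 1})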

Objective 3.3.3 – Perform numerical operations with NumPy.

  1. Create arrays with numpy.array().
  2. Use numpy.mean(), numpy.median(), numpy.std(), numpy.sum().
  3. Generate sequences with numpy.arange() and numpy.linspace().
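
For illustration, a minimal sketch of NumPy array operations (sample data is hypothetical):

    import numpy as np

    a = np.array([3, 7, 1, 9])
    print(np.mean(a), np.median(a), np.std(a), np.sum(a))

    # Generate sequences.
    print(np.arange(0, 10, 2))     # [0 2 4 6 8]
    print(np.linspace(0, 1, 5))    # [0.   0.25 0.5  0.75 1.  ]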

Objective 3.3.4 – Calculate conditional metrics based on filters or categories.

  1. Use if statements or list comprehensions for filtered metrics.
  2. Group by categories and calculate summaries with dictionaries or loops.
  3. Use logical conditions to filter by multiple factors.
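
A minimal sketch of filtered and grouped metrics (the records are hypothetical):

    orders = [
        {"region": "north", "amount": 120},
        {"region": "south", "amount": 80},
        {"region": "north", "amount": 200},
    ]

    # A filtered metric: total for one category.
    north_total = sum(o["amount"] for o in orders if o["region"] == "north")

    # Group by category and summarize with a dictionary and a loop.
    totals = {}
    for o in orders:
        totals[o["region"]] = totals.get(o["region"], 0) + o["amount"]
    print(totals)    # {'north': 320, 'south': 80}

    # Filter on multiple conditions.
    big_north = [o for o in orders if o["region"] == "north" and o["amount"] > 150]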

Conduct Basic Exploratory Data Analysis (EDA)

Objective 3.4.1 – Identify patterns and trends using sorting and filtering.

  1. Sort data with sorted() or numpy.sort().
  2. Filter data with filter(), list comprehensions, or conditions.
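
For illustration, a minimal sketch of sorting and filtering (sample data is hypothetical):

    data = [42, 7, 19, 73, 3]

    print(sorted(data))                    # ascending
    print(sorted(data, reverse=True))      # descending

    # Filtering with a comprehension or with filter().
    print([x for x in data if x > 10])
    print(list(filter(lambda x: x > 10, data)))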

Objective 3.4.2 – Identify unique values and frequencies.

  1. Find unique values with set() or numpy.unique().
  2. Count frequencies with Counter().
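
A minimal sketch of unique values and frequency counts (sample data is hypothetical):

    from collections import Counter

    answers = ["yes", "no", "yes", "yes"]
    print(set(answers))       # unique values: {'yes', 'no'}
    print(Counter(answers))   # Counter({'yes': 3, 'no': 1})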

Objective 3.4.3 – Perform simple correlation checks and detect outliers.

  1. Use numpy.corrcoef() for correlation.
  2. Detect outliers with simple threshold rules or standard-deviation-based checks.
  3. Filter outliers with conditions or numpy boolean indexing.
  4. Interpret findings from code-based exploration.
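
For illustration, a minimal sketch of a correlation check and a simple standard-deviation outlier rule (the data and the 1.5-std threshold are hypothetical choices):

    import numpy as np

    hours = np.array([1, 2, 3, 4, 5])
    scores = np.array([52, 58, 65, 70, 150])

    # Pearson correlation between the two variables.
    r = np.corrcoef(hours, scores)[0, 1]
    print(round(r, 2))    # 0.82

    # Flag values more than 1.5 standard deviations from the mean.
    mean, std = scores.mean(), scores.std()
    outliers = scores[np.abs(scores - mean) > 1.5 * std]
    print(outliers)       # [150]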

Block 4: Communicating Insights and Reporting

5 objectives covered by the block → 5 exam items

Understand Basic Principles of Data Visualization

Objective 4.1.1 – Recognize common visualization types and their uses.

  1. Identify bar charts, line charts, and pie charts.
  2. Explain when to use each type.
  3. Discuss strengths and limitations of each visualization type.

Objective 4.1.2 – Interpret simple data visualizations.

  1. Describe trends, comparisons, and proportions in visuals.
  2. Identify unclear or misleading visuals and suggest improvements.
  3. Assess whether a visual supports or obscures the insight.

Apply Fundamentals of Data Storytelling

Objective 4.2.1 – Structure and communicate insights as a narrative.

  1. Explain the structure: introduction, insights, conclusion.
  2. Lead with a key message supported by evidence.
  3. Use transitions and signposting for flow.
  4. Adjust tone and depth to audience needs.

Create Clear and Concise Analytical Reports

Objective 4.3.1 – Summarize and organize results effectively.

  1. Write short summaries with supporting data.
  2. Use logical structure: problem, analysis, insight, recommendation.
  3. Apply formatting (headings, bullet points, visuals) for clarity.

Communicate Insights Effectively in Presentations

Objective 4.4.1 – Present insights with visual and verbal techniques.

  1. Use clean design: labels, titles, colors, font size.
  2. Explain charts and results clearly in presentations.
  3. Respond to questions with evidence from visuals or data.


MQC Profile

A Minimally Qualified Candidate (MQC) for the PCED exam is an individual with foundational knowledge and skills in data handling, Python programming, basic analytics, and data communication. The candidate is expected to apply Python in practical data scenarios, using built-in functionality and basic libraries to collect, clean, explore, and summarize data. The MQC is not expected to have prior professional experience but should demonstrate readiness to solve common beginner-level data problems with Python.

This candidate understands the basic data lifecycle, is aware of ethical data handling principles, and can work with structured data using core Python tools and standard libraries such as csv, math, random, statistics, collections, os, datetime, and numpy.

This profile represents a blend of technical proficiency, analytical thinking, and communication skills crucial for navigating the complexities of data-driven environments.

Block 1: Introduction to Data and Data Analysis Concepts (22.5% of total exam)

Minimum Coverage – the candidate can:

  • Define and classify types of data (quantitative, qualitative, structured, unstructured).
  • Explain how raw data is collected, stored, and transformed into meaningful information.
  • Identify data sources and describe basic collection methods like surveys and web scraping.
  • Compare data storage formats (CSV, JSON, Excel) and systems (databases, data lakes).
  • Understand the full data lifecycle and how each stage affects outcomes.
  • Differentiate between data science, analytics, and analysis.
  • Describe common analytics types (descriptive, diagnostic, etc.) and apply them to basic use cases.
  • Recognize ethical and legal principles in data handling and the importance of anonymization and compliance (e.g., GDPR).

Block 2: Python Basics for Data Analysis (32.5% of total exam)

Minimum Coverage – the candidate can:

  • Use Python to define variables, perform basic arithmetic, and manipulate strings.
  • Create and work with lists, tuples, sets, dictionaries, and string methods.
  • Build and use simple functions with parameters and return values.
  • Control program flow with conditionals and loops.
  • Handle exceptions using try and except blocks for robustness.
  • Import and use modules like math, random, collections, statistics, os, and datetime, and understand the role of packages like numpy.

Block 3: Working with Data and Performing Simple Analyses (32.5% of total exam)

Minimum Coverage – the candidate can:

  • Read and write plain text and CSV files using open() and the csv module.
  • Clean data by handling missing values, duplicates, and invalid formats.
  • Normalize and format data using built-in methods and string operations.
  • Use Python functions to calculate basic aggregates, descriptive statistics, and conditional metrics.
  • Perform array operations with NumPy and apply basic statistical techniques using built-in modules.
  • Conduct simple EDA: sort, filter, identify trends and patterns, detect outliers, and compute correlations.

Block 4: Communicating Insights and Reporting (12.5% of total exam)

Minimum Coverage – the candidate can:

  • Understand basic data visualizations (bar, line, pie charts) and interpret them accurately.
  • Structure findings into clear, concise summaries and basic data narratives.
  • Create logically organized reports with supporting visuals.
  • Present insights effectively using verbal and visual techniques suited to different audiences.

Passing Requirement

To pass the PCED exam, a candidate must achieve a cumulative average score of at least 75% across all exam blocks.