Introduction to Apache Spark Essentials (TTSK7502)
Looking for a flexible schedule (after hours or weekends)? Please call 858-208-4141 or email email@example.com.
Student financing options are available.
Transitioning military and Veterans, please contact us to sign up for a free consultation on training and hiring options.
Learn the essentials of using Spark for your big data workloads.
Apache Spark is a cluster computing engine for Big Data and an important component of the Hadoop ecosystem. Building on Hadoop YARN and HDFS, Spark offers much faster in-memory processing than MapReduce. It can be programmed in Java, Scala, Python, and R, along with SQL-based front ends.
This course introduces Java, Scala, Python, and R developers to the world of Spark programming. It begins with an overview of the ecosystem and hands-on experience with the platform, including working with the Spark Shell, RDDs, and DataFrames. You'll then move on to a wider-scoped introduction to NoSQL, Spark Streaming, Spark SQL, Spark MLlib, and how the pieces fit together in a larger application.
Overview of Spark
- Hadoop Ecosystem
- Hadoop YARN vs. Mesos
- Spark vs. MapReduce
- Spark: Lambda Architecture
- Spark in the Enterprise Data Science Architecture
Spark Component Overview
- Spark Shell
- RDDs: Resilient Distributed Datasets
- DataFrames
- Spark 2 Unified DataFrames
- Spark Sessions
- Functional Programming
- Spark SQL
- Structured Streaming
- Spark R
- Spark and Python
RDDs: Resilient Distributed Datasets
- Coding with RDDs
- Lazy Evaluation and Optimization
- RDDs in Map/Reduce
- Exercise: Working with RDDs
- RDDs vs. DataFrames
- Unified DataFrames in Spark 2.x
- Exercise: Working with Unified DataFrames
Advanced Spark Overview
- Spark SQL
- Spark Streaming
- Spark MLlib
This course is intended for Data Scientists, Data Engineers, Software Engineers, Architects, and Developers.
What You'll Learn
Join an engaging hands-on learning environment, where you’ll learn:
- The essentials of Spark architecture and applications
- How to execute Spark Programs
- How to create and manipulate both RDDs (Resilient Distributed Datasets) and DataFrames
- How Spark core components come together for complete applications
Before attending this course, you should have:
- Experience programming in Java, Python, R, or Scala (one language is enough)
- Basic understanding of SQL
With CCS Learning Academy, you’ll receive:
- Instructor-led training
- Training Seminar Student Handbook
- Pre- and post-assessments/evaluations
- Collaboration with classmates (not currently available for self-paced course)
- Real-world learning activities and scenarios
- Exam scheduling support*
- Job placement assistance for the first 12 months after course completion
- This course is eligible for CCS Learning Academy’s Learn and Earn Program: get a tuition fee refund of up to 50% if you are placed in a job through CCS Global Tech’s Placement Division*
- Government and private pricing available.*
*For more details call: 858-208-4141 or email: firstname.lastname@example.org; email@example.com