Dataflow

Dataflow is a programming model for processing large volumes of data. Coursera's Dataflow skill catalogue teaches the design and implementation of data pipelines that enable efficient, reliable, distributed data processing. You'll learn the concepts of parallel processing, windowing, and watermarks in data streaming. You'll also gain insight into how to manage and transform immutable collections of data, distinguish event time from processing time, and design scalable real-time data processing architectures. This knowledge benefits data engineers, data scientists, and anyone looking to strengthen their skills in handling vast quantities of data in real-time scenarios.
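The windowing and watermark ideas above can be sketched in plain Python. This is a hypothetical illustration of the concepts only, not Apache Beam or Google Cloud Dataflow API code: events carry event-time timestamps, are grouped into fixed (tumbling) windows, a window is emitted only once the watermark passes its end, and events that arrive behind the watermark are treated as late.

```python
from collections import defaultdict

WINDOW_SIZE = 60  # seconds; fixed (tumbling) windows


def window_start(event_time):
    """Assign an event-time timestamp to the start of its fixed window."""
    return event_time - (event_time % WINDOW_SIZE)


def process(events):
    """Group (event_time, value) pairs into fixed windows, summing values.

    The watermark here is a simple heuristic — the largest event time seen
    so far; real systems derive it from the source. A window's sum is
    emitted only once the watermark passes the window's end, and events
    behind the watermark are set aside as late.
    """
    open_windows = defaultdict(int)  # window start -> running sum
    watermark = float("-inf")
    closed, late = {}, []
    for event_time, value in events:
        if event_time < watermark:
            # Behind the watermark: late data (dropped in this sketch).
            late.append((event_time, value))
            continue
        open_windows[window_start(event_time)] += value
        watermark = max(watermark, event_time)
        # Close every window whose end is at or before the watermark.
        for start in [s for s in open_windows if s + WINDOW_SIZE <= watermark]:
            closed[start] = open_windows.pop(start)
    return closed, dict(open_windows), late


# Events arrive out of order with respect to event time.
events = [(5, 1), (70, 2), (30, 4), (130, 8)]
closed, still_open, late_events = process(events)
# closed -> {0: 1, 60: 2}; still_open -> {120: 8}; late_events -> [(30, 4)]
```

The event at time 30 illustrates the event-time/processing-time gap: it is processed after the watermark has advanced to 70, so its window has already closed — exactly the situation watermarks and late-data policies exist to manage.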
20 credentials · 56 courses




Leading partners

  • Google Cloud
  • Whizlabs
  • Microsoft
  • Pearson
  • Rice University
  • University of Colorado System
  • University of Washington
  • Duke University