Uploaded on Feb 16, 2021
Batch Processing: Large, Complex Data Analysis

With batch processing, data is collected in batches and then fed into an analytics system. A “batch” is a group of data points collected within a given time period.
Unlike stream processing, batch processing does not feed data into an analytics system as it arrives, so results are not available in real time. Batch processing requires some form of storage to hold the data until it is processed, such as a database or a file system.
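As a minimal sketch of this collect-then-process pattern (the file path and record shape here are hypothetical, not from the original), incoming data points might be appended to a file during collection and only read back when the batch is processed:

```python
import json

def record_event(path, event):
    # Collection phase: append the data point to storage; nothing is analyzed yet.
    with open(path, "a") as f:
        f.write(json.dumps(event) + "\n")

def load_batch(path):
    # Processing phase: load every data point accumulated since collection began.
    with open(path) as f:
        return [json.loads(line) for line in f]
```

In a real system the storage layer would more likely be a database or a data lake, but the separation is the same: writes happen continuously, while reads and analysis happen on a schedule.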
Batch processing is ideal for very large data sets and projects that involve deeper data analysis. It is less suitable for projects that require speed or real-time results. Additionally, many legacy systems support only batch processing.
This often forces teams to use batch processing during a cloud
data migration involving older mainframes and servers. In terms of
performance, batch processing is also optimal when the data has
already been collected.
Batch Processing Example: Each day, a retailer keeps track of overall revenue across all stores. Instead of processing every purchase in real time, the retailer processes each store’s daily revenue totals in a single batch at the end of the day.
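The retailer example above can be sketched as follows (the store names and amounts are invented for illustration; amounts are in cents to keep the arithmetic exact):

```python
from collections import defaultdict

def daily_revenue_totals(purchases):
    """Aggregate one day's batch of (store, amount_in_cents) purchases."""
    totals = defaultdict(int)
    for store, amount in purchases:
        totals[store] += amount
    return dict(totals)

# One day's batch, collected across all stores before any processing runs.
day_batch = [("store_a", 1999), ("store_b", 500), ("store_a", 4250), ("store_b", 1225)]
print(daily_revenue_totals(day_batch))  # {'store_a': 6249, 'store_b': 1725}
```

The key point is that the aggregation runs once over the whole day's data rather than once per purchase, which is what distinguishes this from a streaming approach.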
For more information on real-time vs. batch processing, please visit Rivery.io.