Google made a big contribution to the big data world 10 years ago, when it released a paper on MapReduce, a programming model for doing big computing jobs on hefty data sets. But it turns out that, all this time, Google has been working on something far more advanced.

At Google’s annual I/O shindig on Wednesday, the tech giant announced a service that can do much, much more than MapReduce: Google Cloud Dataflow. It can either run a series of computing jobs, batch-style, or do constant work as data flows in. Engineers can start using the service in Google’s burgeoning public cloud, and Google takes care of managing the thing.

“We handle all the infrastructure and the back-end work required to scale up and scale down, depending on the kind of data needs that you have,” Brian Goldfarb, head of marketing for the Google Cloud Platform, told VentureBeat ahead of Google I/O.
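For readers who haven’t met the model the article mentions, here is a minimal sketch of how a MapReduce-style job works: a map step emits intermediate key-value pairs, a shuffle groups them by key, and a reduce step folds each group into a result. It is purely illustrative, using the word-count example from Google’s original paper; the function names are invented for this sketch, and it is not Google’s MapReduce implementation or the Cloud Dataflow API.

```python
# Illustrative word-count sketch of the MapReduce programming model.
# Not Google's implementation; function names are made up for the example.
from collections import defaultdict
from typing import Iterable, Iterator


def map_phase(document: str) -> Iterator[tuple[str, int]]:
    """Map: emit an intermediate (word, 1) pair for every word."""
    for word in document.split():
        yield (word.lower(), 1)


def shuffle(pairs: Iterable[tuple[str, int]]) -> dict[str, list[int]]:
    """Shuffle: group intermediate values by key."""
    grouped: dict[str, list[int]] = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped


def reduce_phase(key: str, values: list[int]) -> tuple[str, int]:
    """Reduce: combine all values for a key into one result."""
    return (key, sum(values))


if __name__ == "__main__":
    documents = ["the quick brown fox", "the lazy dog", "the fox"]
    intermediate = (pair for doc in documents for pair in map_phase(doc))
    counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
    print(counts)  # {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, 'lazy': 1, 'dog': 1}
```

In a real MapReduce system the map and reduce steps run in parallel across many machines over far larger inputs; Dataflow’s pitch is that the same kind of pipeline can also keep running continuously as new data streams in, with Google managing the underlying infrastructure.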