Using chunk() method for big database tables in Laravel

Let's assume we have a table with 100,000 rows or more, and we need to update one column.
We can't simply run an SQL query here, because there is some PHP logic involved in computing the new value. If we load all the rows and use a foreach loop, the process takes a long time.
Sometimes we may even run out of memory. Laravel brings a neat solution for this.
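(As a side note: if the new value could be computed in SQL alone, with no PHP logic, you wouldn't need a loop at all. A single mass-update through the query builder would do, something like this sketch using the same hypothetical columns as below:)

// No PHP logic needed: one UPDATE query handles every matching row,
// without loading any models into memory.
User::where('some_field', '>', 0)->update(['some_other_field' => 1]);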
Okay, let’s take a simple example of the problem I’m talking about:

$users = User::all();
foreach ($users as $user) {
  $some_value = ($user->some_field > 0) ? 1 : 0;
  // might be more logic here
  $user->update(['some_other_field' => $some_value]);
}

It’s fine for a DB table with 100 or 500 users. But what about 10,000? 100,000? With User::all(), every one of those rows is loaded into memory at once.

So here’s the thing: there is a method called chunk() which splits the query into separate selects, like pagination.
Let’s look at an example:

User::chunk(1000, function ($users) {
  foreach ($users as $user) {
    $some_value = ($user->some_field > 0) ? 1 : 0;
    // might be more logic here
    $user->update(['some_other_field' => $some_value]);
  }
});

What it actually does is run a loop: select 1,000 entries, do the updates on them, then select the next 1,000 entries, update those, and so on. This means that at no point is a huge amount of data pulled from the database: you are working with one chunk of entries at a time, not the whole table.

Thanks: Laravel Daily
