Laravel-excel: Allowed memory size of 134217728 bytes exhausted

Created on 19 Jul 2018  ·  22 Comments  ·  Source: Maatwebsite/Laravel-Excel

Prerequisites

  • [X] Able to reproduce the behaviour outside of your code, the problem is isolated to Laravel Excel.
  • [X] Checked that your issue isn't already filed.
  • [X] Checked if no PR was submitted that fixes this problem.

Versions

  • PHP version: 7.1.13
  • Laravel version: 5.6
  • Package version: ^3.0

Description

I am getting an "Allowed memory size of 134217728 bytes exhausted" error when I try to export with the FromQuery option.

Steps to Reproduce

Expected behavior:

I want to fix my problem :)

Actual behavior:

Additional Information

namespace App\Exports;

use App\OldTransaction;
use App\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Http\Request;
use jDate;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\ShouldAutoSize;
use Maatwebsite\Excel\Concerns\WithHeadings;
use Maatwebsite\Excel\Concerns\WithMapping;

class OldDepositExport implements FromQuery, ShouldQueue, WithMapping, WithHeadings, ShouldAutoSize
{
    use Exportable;

    /**
     * DepositExport constructor.
     * @param Request $request
     */
    public function __construct(Request $request)
    {
    }

    public function headings(): array
    {
        return [
            'ID',
        ];
    }


    public function map($transaction): array
    {
        return [
            $transaction->id,
        ];
    }

    public function query()
    {
        return User::query()
            ->where('status', '=', 1)
            ->select(['id']);
    }
}

See this image https://i.imgur.com/yMgUqXP.jpg


All 22 comments

Can anyone help me?

4 days. no answer :-(

Who do you expect an answer from?

Our software is free and open source, meaning that the use of our software is optional. We hold no liability and there is no obligation to support. We will provide support on a best effort basis.

If you use the software commercially and need elaborate support or need it urgently, we can offer this on a commercial basis. Please contact [email protected] or via phone +31 (0)10 744 9312.

OK, so please don't close this issue. Maybe someone can fix it.

Thanks

I think it's a PHP memory problem; maybe there's too much data and it overflows the memory. It's not related to the library.

@jlcarpioe I have almost 200k rows. The problem occurs when appending rows to sheet

Did you try raising memory_limit in php.ini?

@bagana89 That's not a good solution.

I cannot reproduce your problem. I'm able to export a users table of 300K rows with the code you shared. Do note that memory usage will increase with every job, as PhpSpreadsheet has to open a workbook that gets bigger every time. There's nothing wrong with assigning some more memory to this process. It seems you don't have a lot of memory assigned; that's why it overflows so quickly.

Best to drop ShouldAutoSize, as that recalculates the workbook's column dimensions in every job. That takes a lot more memory than running without it.

I have 1 GB of RAM allocated and still get the same result as saeedvaziry.
Just migrated from v2.1 to v3.1. I was having the same trouble with v2.1, which motivated me to migrate, but it did not solve the problem. Styling the output with Excel::create in v2.1 was much easier, too.

It seems that chunking doesn't work well when exporting (using FromQuery): it uses a massive amount of memory (up to 3 GB for me, for about 200k records). Importing with chunking works fine, though (memory never exceeds 50 MB).

I only have 15 thousand records and I got the same error. What can I do?

This is the error:

[2019-11-24 22:39:59] local.ERROR: Allowed memory size of 134217728 bytes exhausted (tried to allocate 18874368 bytes) {"exception":"[object] (Symfony\Component\Debug\Exception\FatalErrorException(code: 1): Allowed memory size of 134217728 bytes exhausted (tried to allocate 18874368 bytes) at C:\wamp64\www\.....\vendor\phpoffice\phpspreadsheet\src\PhpSpreadsheet\Collection\Cells.php:421)
[stacktrace]

0 {main}

"}

You will need to increase the allowed memory limit in your php.ini or set it dynamically using ini_set().
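For reference, the limit can be raised globally in php.ini like this (the 1G value is an assumption; tune it to your workload):

```ini
; php.ini — raise the PHP memory ceiling for all processes
memory_limit = 1G
```

Or per script, by putting ini_set('memory_limit', '1G'); at the top of the entry point before the export runs. Note that queued exports run under the CLI, which reads its own php.ini, so check `php -i | grep memory_limit` for the CLI value too.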

I did. I have 1 GB, but it doesn't work.

When you run the process, how much memory does the php-cli process consume? It must be exceeding 1 GB then.

The memory limit is definitely not the problem. It's set to 4 GB according to phpinfo and I still have this issue.

I have the same problem.

A "solution" would be to split your file into multiple ones, releasing memory between them, then merge all the files and send the merged file as the response.

Cons.:

  • More space for temporary files
  • More time spent (non-intelligent loops)
  • More code required (not out-of-the-box)

Pros.:

  • It works

Same issue here: memory limit is 512 MB, 4K rows.

Final Solution
This is old, but whoever is reading this now should know that
if you are importing or exporting with ToModel or ToCollection, that process requires a huge allocation of memory to convert
the data into usable forms like a collection or array.

In that case, don't implement ToModel or ToCollection. You can bypass that process and carry out the operation manually by implementing OnEachRow,
which lets you implement an onRow method that is passed an Excel Row object. You can also implement WithHeadingRow to have each row structured as an associative array.
Use $row->toArray() to get your data and process it as you like. This is fast and easy to manipulate.
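As a rough sketch of that approach (the class and column names here are hypothetical, not from the original report), an import using OnEachRow and WithHeadingRow would look like this:

```php
<?php

namespace App\Imports;

use Maatwebsite\Excel\Concerns\OnEachRow;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Row;

class TransactionsImport implements OnEachRow, WithHeadingRow
{
    public function onRow(Row $row)
    {
        // With WithHeadingRow, toArray() keys each row by the heading row,
        // so $data is an associative array like ['id' => ..., 'amount' => ...]
        $data = $row->toArray();

        // Process one row at a time instead of materialising the whole
        // sheet as a collection or model set in memory.
        \App\OldTransaction::create([
            'amount' => $data['amount'] ?? null,
        ]);
    }
}
```

It can then be run with Excel::import(new TransactionsImport, 'transactions.xlsx'); since no collection of the full sheet is ever built, peak memory stays roughly constant per row.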

PS: If you still get the memory limit error, simply add a return statement as the last line, like this:
return;

Thank You

I had the same issue, and with the suggestions from @MoFoLuWaSo I reduced my memory usage from over 128 MB to 54 MB.

1) Implement a DTO. That reduced the memory usage the most.
2) Order the properties of the DTO and remove WithMapping.
3) Remove ShouldAutoSize.

In the case of @saeedvz it would look like this:

namespace App\DataTransferObjects;

class OldDepositRow
{
    public int $id;
    public string $created_at;
}

and

namespace App\Exports;

use App\DataTransferObjects\OldDepositRow;
use App\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromCollection;
use Maatwebsite\Excel\Concerns\WithHeadings;

class OldDepositExport implements FromCollection, ShouldQueue, WithHeadings
{
    use Exportable;

    public function headings(): array
    {
        return [
            'ID',
            'Created At',
        ];
    }

    public function collection()
    {
        $users = User::query()
            ->where('status', '=', 1)
            ->select(['id'])
            ->get();

        return $users->map(
           function ($user) {
                $row = new OldDepositRow();
                $row->id = $user->transaction->id;

                // cast objects like Carbon or BigDecimal to string
                $row->created_at = $user->transaction->created_at->format('d-m-Y');

                return $row;
            }
        );
    }
}