How can I queue the Laravel Excel import the traditional way, and dispatch the job again after the first chunk has been read?
My code (controller):
public function formatCSV(Request $request)
{
    $filename = $request->input('filename');
    $exported_filename = pathinfo($filename, PATHINFO_FILENAME) . ".csv";
    $export_filepath = 'files/scrubber/output/' . $exported_filename;

    $data = [
        'file_name'     => $filename,
        'header_row'    => 10,
        'start_row'     => 10,
        'unit_sqft_col' => null,
        'file_path'     => storage_path('app/files/scrubber/') . $filename,
        'export_path'   => $export_filepath,
    ];

    $scrubberJob = (new ScrubberJob($data))->delay(Carbon::now()->addSeconds(3));
    dispatch($scrubberJob);

    return redirect()->route('scrubbers.index')->with('toast.success', 'Scrubber job is put into queue, please wait for a while..');
}
In the controller above, I am dispatching ScrubberJob.
ScrubberJob
<?php

namespace App\Jobs;

use App\Imports\ChunkScrubberImport;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Excel;

class ScrubberJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $data;

    /**
     * Create a new job instance.
     *
     * @return void
     */
    public function __construct($data)
    {
        $this->data = $data;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        $import = new ChunkScrubberImport($this->data);
        Excel::import($import, $this->data['file_path']);
    }
}
This job triggers ChunkScrubberImport:
ChunkScrubberImport
<?php

namespace App\Imports;

use App\Exports\ChunkScrubberExport;
use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToArray;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithStartRow;
use Maatwebsite\Excel\Concerns\WithLimit;
use Maatwebsite\Excel\Concerns\WithStrictNullComparison;
use Illuminate\Support\Facades\Storage;
use Excel;

class ChunkScrubberImport implements ToArray, WithLimit, WithStartRow, WithStrictNullComparison
{
    protected $data;
    public $scrubbed_data;

    public function __construct($data, $scrubbed_data = [])
    {
        $this->data = $data;
        $this->scrubbed_data = $scrubbed_data;
    }

    public function array(array $rows)
    {
        $this->scrubbed_data = $rows;

        return Excel::store(new ChunkScrubberExport($this->data, $this->scrubbed_data), $this->data['export_path']);
    }

    public function startRow(): int
    {
        return $this->data['start_row'];
    }

    public function limit(): int
    {
        return 1000;
    }
}
Now, this import triggers ChunkScrubberExport:
ChunkScrubberExport
<?php

namespace App\Exports;

use App\Helpers\APRHelper;
use App\Models\Scrubber;
use Maatwebsite\Excel\Concerns\FromArray;
use Maatwebsite\Excel\Concerns\FromCollection;
use Maatwebsite\Excel\Concerns\RegistersEventListeners;
use Maatwebsite\Excel\Events\AfterSheet;
use Maatwebsite\Excel\Concerns\WithStrictNullComparison;

class ChunkScrubberExport implements FromArray, WithStrictNullComparison
{
    use RegistersEventListeners;

    protected $data;
    protected $scrubbed_data;

    public function __construct($data, $scrubbed_data = [])
    {
        $this->data = $data;
        $this->scrubbed_data = $scrubbed_data;
    }

    public function array(): array
    {
        $arr = $this->scrubbed_data;
        $rec_arr = $empty_col = array();
        $empty_col_checked = false;

        foreach ($arr as $ak => $av) {
            // only rows that are not empty (or filled only with null) get inside this condition
            if (count(array_filter($av)) != 0) {
                if (!$empty_col_checked) {
                    foreach ($av as $k => $v) {
                        if ($v == '') {
                            $empty_col[] = $k;
                        }
                    }
                    $empty_col_checked = true;
                }
                $rec_arr[] = $av;
            }
        }

        foreach ($empty_col as $ek => $ev) {
            // take every value of a column that has no value in the header row,
            // and check whether all values in that column are empty;
            // if so, unset the column
            if (empty(array_filter(array_column($rec_arr, $ev)))) {
                // unset the key from every row of the array
                foreach ($rec_arr as &$item) {
                    unset($item[$ev]);
                }
                unset($item);
            }
        }

        foreach ($rec_arr as $ak => $av) {
            foreach ($av as $k => $v) {
                // other stuff
            } // end foreach $av
        } // end foreach $rec_arr

        return $rec_arr;
    }

    public static function afterSheet(AfterSheet $event)
    {
        $active_sheet = $event->sheet->getDelegate();

        $centered_text = [
            'alignment' => [
                'horizontal' => \PhpOffice\PhpSpreadsheet\Style\Alignment::HORIZONTAL_CENTER,
                'vertical'   => \PhpOffice\PhpSpreadsheet\Style\Alignment::VERTICAL_CENTER,
            ],
        ];

        $active_sheet->getParent()->getDefaultStyle()->applyFromArray($centered_text);
    }
}
This works fine for one cycle (fetch). Now I have the following questions:
1. After the first cycle, I want to update the `start_row` to `$data['start_row'] + 1000`, since 1000 is the limit, but I don't know the best place to put this. After one cycle I want to trigger the ScrubberJob written below again, but with the updated `start_row` (see the sketches after this list):

        $scrubberJob = (new ScrubberJob($data))->delay(Carbon::now()->addSeconds(3));
        dispatch($scrubberJob);

2. On the first cycle the file will already have been created, since I already know the filename, which is stored in `$data['export_path']`. So, instead of overwriting the old file, I want to append rows to the already existing file from the second cycle onwards.

3. How do I detect the end of the file, so that no more ScrubberJob is dispatched?
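For questions 1 and 3, one option (only a sketch of an approach, not something Laravel Excel gives you out of the box) is to let `ScrubberJob::handle()` decide whether another cycle is needed. Because `ChunkScrubberImport` keeps the chunk it read in its public `$scrubbed_data` property, the job can count the rows after `Excel::import()` returns and re-dispatch itself with `start_row` advanced by the limit only while full chunks of 1000 rows keep coming back. The `$rowsRead` / `$next` variables and the re-dispatch logic below are my own illustration, not existing code:

```php
// Sketch of a self-re-dispatching handle() for ScrubberJob.
// Assumes the existing ChunkScrubberImport (public $scrubbed_data, limit of 1000).
public function handle()
{
    $import = new ChunkScrubberImport($this->data);
    Excel::import($import, $this->data['file_path']);

    // Rows actually read in this cycle.
    $rowsRead = count($import->scrubbed_data);

    // A full chunk suggests there may be more rows, so queue the next cycle
    // with start_row moved past the rows we just processed.
    if ($rowsRead === 1000) {
        $next = $this->data;
        $next['start_row'] += 1000;

        dispatch((new ScrubberJob($next))->delay(now()->addSeconds(3)));
    }
    // Fewer than 1000 rows: end of file reached, nothing more is dispatched.
}
```

If the number of data rows happens to be an exact multiple of 1000, this runs one extra cycle that reads zero rows and then stops, which is harmless. Note that, as written, every cycle still goes through `Excel::store()`, which overwrites the previous export; that is question 2.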
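For question 2, `Excel::store()` always writes a whole new file, so each cycle will overwrite the export from the previous one. Since the target is a plain CSV, one workaround (again only a sketch; `appendToCsv()` is a hypothetical helper, not part of the package) is to skip Laravel Excel for the follow-up cycles and append the scrubbed rows with PHP's own CSV functions:

```php
// Hypothetical helper (could live in ScrubberJob): append rows to an
// existing CSV stored on the default local disk, e.g. $data['export_path'].
protected function appendToCsv(string $relativePath, array $rows): void
{
    // Absolute path of the file on the default (local) disk.
    $fullPath = \Illuminate\Support\Facades\Storage::path($relativePath);

    // 'a' opens for appending and creates the file if it does not exist yet,
    // so the same code works for the first and for later cycles.
    $handle = fopen($fullPath, 'a');

    foreach ($rows as $row) {
        fputcsv($handle, $row);
    }

    fclose($handle);
}
```

Since the output is a CSV, there is no styling to lose by bypassing `Excel::store()` (CSV files carry no styles anyway, so the `afterSheet()` centering never survives the CSV writer). For an `.xlsx` target this kind of in-place append would not work, and you would have to collect all chunks before writing or merge files at the end.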
By the way, I have also seen ShouldQueue in the docs, but I am not sure whether it covers everything I have mentioned here. Or am I just complicating things and should I go with ShouldQueue instead? Even then, that would probably still need a solution to my question number 2.
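On the ShouldQueue route: Laravel Excel's queued chunk reading (the `WithChunkReading` concern on the import combined with `ShouldQueue`) splits the file and dispatches one job per chunk for you, which would cover questions 1 and 3 without any manual `start_row` bookkeeping. Roughly, and only as a sketch based on the documented concerns with the row handling reduced to a placeholder:

```php
<?php

namespace App\Imports;

use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\ToArray;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithStartRow;

class ChunkScrubberImport implements ToArray, WithStartRow, WithChunkReading, ShouldQueue
{
    public function array(array $rows)
    {
        // Called once per chunk of 1000 rows, each call running in its own queued job.
        // Scrub $rows and append them to the output here.
    }

    public function startRow(): int
    {
        return 10;
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
```

Even with queued chunk reading, each chunk still arrives in a separate `array()` call, so the append-instead-of-overwrite problem from question 2 remains either way, as you suspected.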
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow