How to save a blob (AJAX response) into a named file (JSON type) on my server, NOT on the user's computer [closed]

I have read almost all the topics around about creating a URL from (or downloading) a blob on a computer; however, I need something different:

(1) to save a blob on my own server as a JSON file, then

(2) to optimize the way I update a database using the data stored in that JSON file.

I tried all variants of the code below with no success:

var blob = new Blob([response], { type: contentType }); // Blob takes an array of parts
var file = new File([blob], filename, { type: contentType, lastModified: Date.now() });

where response was the parsedJSON from the code below.

Would it be better to already have a file with empty content on the server instead, open it, update it from the blob, and then close it?

My purpose is to download the response content of the AJAX query (from an API) to my server (not to any client computer via the browser) and then update the respective values in a database table.

The use case is as follows: I receive a price list (about 40,000 items) from an API using a cron job, and then I have to update the product prices in the database on my server.

Let's look at the code below:

<div id="app">
    <p style="text-align: center; margin-top: 50px;">
        <i class="fa fa-spinner fa-spin" style="font-size: 15px;"></i><br>
        please wait to load the list of items here...
    </p>
</div>

<script>
var settings = {
  "url": "https://www.notimportant/etc/",
  "method": "GET",
  "timeout": 0,
  "headers": {
    "GUID": "something"
  },
};


$.ajax(settings).done(function(response) {
  var parsedJSON = JSON.parse(response);

  console.log(parsedJSON[0]); /* <= to see the structure of each object in the response array */
  console.log('total items: ' + parsedJSON.length);

  var template = `
<p>List of items</p>
{{#each this}}
<div>{{plusOne @index}} - Code: {{ProductCode}} - Name: {{ProductName}} - Price: {{ProductPrice}}</div>
{{/each}}
`;

  Handlebars.registerHelper('plusOne', function(index) {
    return index + 1;
  });

  var show = Handlebars.compile(template);
  $('#app').html(show(parsedJSON));
});
</script>

So far so good: using Handlebars JS I could check the results in a browser (and also see in the console the structure of each object received from the API). However, my purpose is to use each object in the parsedJSON array to update values in a table. I could write something like

// use $.post to send ProductCode, ProductName and ProductPrice to a PHP file
// where a query updates each product in the database
for (let i = 0; i < parsedJSON.length; i++) {

  // `let` keeps each iteration's values intact inside the async callback below
  let ProductCode = parsedJSON[i].ProductCode;
  let ProductName = parsedJSON[i].ProductName;
  let ProductPrice = parsedJSON[i].ProductPrice;

  $.post("update_db.php", {code: ProductCode, name: ProductName, price: ProductPrice}, function(data, status) {
    console.log("Product " + ProductCode + " update = " + status);
  });
}

// note: $.post is asynchronous, so this alert fires before the requests complete
alert("all products' update was finished");

however, that would fire ~40,000 single queries at my database (which is not OK at all).

I would prefer instead to save the array on the server: save the response blob to a file named myJSONfile.json; then, in PHP, set $myfileaddress = $base . "myJSONfile.json", read it with $myfile = file_get_contents($myfileaddress), decode it with $myJson = json_decode($myfile, true), and finally update the database using foreach ($myJson as $item) { /* update each row in the products table, etc. */ }.

I doubt, however, that I could run a multi-query with 40,000 statements (or a prepared statement that large), even if I greatly increased the allowed memory, which I would prefer not to do. If I am wrong, please explain why.

For the moment, my code just sends the queries one by one, which is not something I want to keep doing (I'd better use PHP only, or JavaScript only, to open a single connection to the database instead of the many connections used now).

I actually receive an almost instant response (the alert all products' update was finished) for the full list of products, but then the PHP response messages keep arriving in the console for a few minutes, reporting each successfully updated row, which isn't what I expected. The reason is that $.post is asynchronous: the loop queues all 40,000 requests at once and the alert fires immediately, while the responses trickle in afterwards.

I would prefer something more elegant, say grouping 500 queries into a single transaction (making the entire update in just 80 batches instead of 40,000 separate queries), but I'm not sure how to set it up correctly.

My point in downloading the full JSON, instead of updating each product through a separate API request (plus update), is to query the external server only once (using a cron job) and to do the further processing of the data on my own server. In the real situation, each product comes from the API with about 20 parameters, not just 3 (Code, Name and Price), but that's not relevant to the issues in question. (Also, sorry for some small inaccuracies in the code shown in the snippets above; I typed it quickly rather than copy-pasting from the production files, as I'm not writing from the computer where I work on this project.)

Final edit

I finally wrote the more compact code below, which works just fine (a prepared statement with UPDATE). Instead of posting it as a separate answer, I will insert it here:

<?php
$servername = "localhost";
$username = "someuser";
$password = "somepassw";
$dbname = "somedb";

// requiring data from API

$curl = curl_init();

curl_setopt_array($curl, array(
  CURLOPT_URL => "https://www.notimportant/etc/",
  CURLOPT_RETURNTRANSFER => true,
  CURLOPT_ENCODING => "",
  CURLOPT_MAXREDIRS => 10,
  CURLOPT_TIMEOUT => 0,
  CURLOPT_FOLLOWLOCATION => true,
  CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
  CURLOPT_CUSTOMREQUEST => "POST",
  CURLOPT_POSTFIELDS =>"{\"name\": null, \"stores\": null, \"products\": []}",
  CURLOPT_HTTPHEADER => array(
    "GUID: something",
    "Content-Type: application/json"
  ),
));

$response = curl_exec($curl);

curl_close($curl);

// echo $response;
$data = json_decode($response);
echo "Product count: " . count($data) . "<br>\n";
echo "Preparing to save products to database... <br>\n";

// end receiving data from API

// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);

// Check connection
if ($conn->connect_error) {
  die("Connection failed: " . $conn->connect_error);
}

// prepare and bind
$stmt = $conn->prepare("UPDATE `my_table` SET `price` = ?, `stock` = ? WHERE `my_table`.`product_id` = ?"); 
$stmt->bind_param("dii", $price, $stock, $id);

// set parameters and execute
foreach($data as $productInfo) {
    $price = $productInfo->RetailPrice;
    $stock = $productInfo->Stock;
    $id = $productInfo->ProductId;
    $stmt->execute();
}

echo "Update was ok";

$stmt->close();
$conn->close();

?>

@Bravemaster's suggestions were helpful; the code now runs in less than 10 seconds, so it's an acceptable solution for the moment. Point (2) is pretty much solved, and (1) seems no longer useful (it makes no sense to use both file_put_contents and file_get_contents when the data can be processed directly). I also accepted Bravemaster's answer, for his gracious contribution to my code. All 40k+ queries went smoothly in a single batch. What's left is to add some more data validation (to be on the safe side) and to set up the cron job (not really an issue).



Solution 1:[1]

You don't need to fetch data from the external API on the client side and post it back to your server.

PHP can send a GET request and receive the response without jQuery.

You can write a PHP script that gets the product price list from the external API, like the following:

$url = "https://www.notimportant/etc/";
$header = ['GUID: something']; // CURLOPT_HTTPHEADER expects "Name: value" strings

echo "Sending request to API server...\n";

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
$response = curl_exec($ch);
echo "Received response\n";
curl_close($ch);

$data = json_decode($response);
echo "Product count: " . count($data) . "\n";
echo "Saving products to database...\n";
foreach ($data as $productInfo) {
    save_data($productInfo); // you need to implement save_data
}
echo "Products saved. Exiting\n";
exit(0);

If you want to know how to write a string (the JSON response, in this case) to a file, refer to this: How to put a string into a text file in PHP?

If you want to know how to execute MySQL query using prepared statements, refer to this: How can I prevent SQL injection in PHP?

If you want to know how to send https request with custom header, refer to this post: Send GET HTTPS request with custom headers PHP

If you want to know how to execute a PHP script on a regular basis using crontab, refer to this post: Running PHP file using crontab

(Don't worry, they're all Stack Overflow posts.)

I would prefer something more elegant, say grouping 500 queries into a single transaction (making the entire update in just 80 batches instead of 40,000 separate queries), but I'm not sure how to set it up correctly.

A transaction won't improve performance; transactions are meant for data integrity, not for speed. Making 40,000 simple queries and executing them all at once (or one by one, if you prefer) is the best option here.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
[1] Solution 1: glinda93