PouchDB 7.2.2 Update Document not persisted

I have tried updating a doc per this reference: https://pouchdb.com/guides/updating-deleting.html, and the response at update time indicates the DB has been updated.

However, if I close the browser and reload the database, my edits are gone: the document has "reverted" to rev "1.0", and rev "2.0" no longer exists. It's as if the change is never persisted to the DB.

I used the code as it appears on that reference. This is a "local"-only database; it is not backed by a remote database, and there is no plan for it to be. That is the use case for the application I am developing.

I am using the latest version of Chrome on Windows 10.

Can anyone help out with the issue?

Before I edit the document, it is located via a search. That code is at the bottom of this post.

Here is the code I am using to update a document. After updating, it does return the "new" document in the "then" handler.

database.get(newdoc._id).then(function(doc){
    doc.title = newdoc.title;
    doc.label = newdoc.labels;
    doc.content = newdoc.content;
    //** Save the document here
    return database.put(doc, function callback(err, result) {
        if (err) {
            console.error('Error updating doc: ' + doc);
            console.error(err);
        } else {
            console.log('Update Successful: ' + doc);
            console.log(result);
        }
    });
}).then(function(){
    return database.get(newdoc._id, function(err, doc) {
        if (err) {
            console.error('Error getting document.');
            console.error(err);
        } else if (doc._rev === newdoc._rev) {
            console.log('Document revision never changed : ' + doc._rev);
        } else {
            console.log('Document saved, new revision = ' + doc._rev);
            loadTheDocument();
        }
    });
});

However, when I try to load the "new" document via a search, I get:

bm5-edit.js:121 DOMException: Failed to execute 'transaction' on 'IDBDatabase': The database connection is closing.

var sel = {
    _id:{ $regex : "^" + itemid + "$" },
    label:{ $regex: ".*" },
    title:{ $regex: ".*" },
    content:{ $regex: ".*" }
};
var cfgAll = {
    selector: sel,
    use_index: 'all-fields'
};

try {
    database.find(cfgAll).then(function(result){
        var len = '' + result.docs.length;
        if (len === "1") {
            $.each(result.docs, function(index, value){
                rev = value._rev;
                original = new Bookmark(value);
                original.setForm();
            });
        } else {
            console.error('Found ' + len + ' results.  Should have found 1.');
        }
    }).catch(function(err){
        console.error('Error performing search');
        console.error(err);
    });    
} catch (err) {
    console.error('Error performing search');
    console.error(err);
}

This search is executed successfully BEFORE I edit the document, and fails with this error AFTER I edit the document.

Thanks!



Solution 1:[1]

Unfortunately, there is no easy solution to your problem, such as an additional parameter in your statement. You have to rely on the behavior that new rows are assigned the highest existing id + 1. With this knowledge, you can calculate the ids of all your inserted rows.

Option 1: Explained in this answer. You select the current maximum id before the insert statement, then assign ids greater than that maximum to all the entries in your DataFrame, and lastly insert the df with the ids already included.
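A minimal sketch of what Option 1 could look like, using the students table from the test script below (the function name and the empty-table handling are illustrative assumptions, not part of the original answer):

def insert_df_with_assigned_ids(df, engine):
    # Illustrative sketch of Option 1; the function name is an assumption
    conn = engine.connect()

    # Select the current maximum id before the insert statement
    result = conn.execute('SELECT max(id) FROM students')
    row = list(result)[0]
    last_id = int(row[0]) if row[0] is not None else 0  # empty table -> ids start at 1

    # Assign ids greater than the previous maximum to all entries
    df = df.copy()
    df['id'] = range(last_id + 1, last_id + 1 + df.shape[0])

    # Insert the df which already includes the ids
    df.to_sql('students', conn, if_exists='append', index=False)
    conn.close()

    return list(df['id'])

Note that this variant has the same race window between the SELECT and the insert as Option 2, so the transaction approach from the EDIT below applies here as well.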

Option 2: You insert the DataFrame and then acquire the highest id. With the number of entries inserted, you can calculate the ids of all entries. This is how such an insert function could look:

def insert_df_and_return_ids(df, engine):
    # It is important to use same connection for both statements if
    # something like last_insert_rowid() is used
    conn = engine.connect()
    
    # Insert the df into the database
    df.to_sql('students', conn, if_exists='append', index=False)
    
    # Acquire the maximum id
    result = conn.execute('SELECT max(id) FROM students') # Should work for all SQL variants
    # result = conn.execute('Select last_insert_rowid()') # Specifically for SQLite
    # result = conn.execute('Select last_insert_id()') # Specifically for MySql

    entries = df.shape[0]
    last_id = -1
    
    # Iterate over result to get last inserted id
    for row in result:
        last_id = int(str(row[0]))
    conn.close()
    
    # Generate list of ids
    list_of_ids = list(range(last_id - entries + 1, last_id + 1))

    return list_of_ids

PS: I could not test the function on an MS SQL server, but the behavior should be the same. To check that everything behaves as it should, you can use this:

import numpy as np
import pandas as pd
import sqlalchemy as sa

# Change connection to MS SQL server
engine = sa.create_engine('sqlite:///test.lite', echo=False)

# Create table
meta = sa.MetaData()
students = sa.Table(
   'students', meta, 
   sa.Column('id', sa.Integer, primary_key = True), 
   sa.Column('name', sa.String), 
)
meta.create_all(engine)

# DataFrame to insert with two entries
df = pd.DataFrame({'name': ['Alice', 'Bob']})

ids = insert_df_and_return_ids(df, engine)
print(ids) # [1,2]

conn = engine.connect()
# Insert any entry with a high id in order to check if new ids are always the maximum
result = conn.execute("Insert into students (id, name) VALUES (53, 'Charlie')")
conn.close()

# Insert data frame again
ids = insert_df_and_return_ids(df, engine)
print(ids) # [54, 55]

EDIT: If multiple threads are utilized, transactions can be used to make this option thread-safe, at least for SQLite:

conn = engine.connect()
transaction = conn.begin()
df.to_sql('students', conn, if_exists='append', index=False)
result = conn.execute('SELECT max(id) FROM students')
transaction.commit()
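
Folding that into the earlier function could look like the sketch below; the variant's name and the rollback handling are additions for illustration, not part of the original answer:

def insert_df_and_return_ids_threadsafe(df, engine):
    # Same logic as insert_df_and_return_ids, but the insert and the
    # max(id) lookup run inside a single transaction
    conn = engine.connect()
    transaction = conn.begin()
    try:
        df.to_sql('students', conn, if_exists='append', index=False)
        result = conn.execute('SELECT max(id) FROM students')
        last_id = int(str(list(result)[0][0]))
        transaction.commit()
    except Exception:
        transaction.rollback()
        raise
    finally:
        conn.close()

    entries = df.shape[0]
    return list(range(last_id - entries + 1, last_id + 1))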

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

[1] Source: Stack Overflow
