How to input and predict using a Simple Transformers model in a Chrome extension?
I have a Simple Transformers model (BERT) for binary text classification. I exported it to TensorFlow.js and imported it into a Chrome extension. The model and weights load successfully, but I can't use the model to predict on a sentence. Below is the code I use to run a prediction on the input text from the content script.
const prediction = res.executeAsync({'input_ids':inputTxt},[-1,-1],[-1,-1])
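For context, this is roughly how I load the model (the path is a placeholder for my actual extension asset; loading itself works fine):

```javascript
import * as tf from '@tensorflow/tfjs';

// Placeholder path; the real model.json is bundled with the extension.
const modelUrl = chrome.runtime.getURL('model/model.json');
const res = await tf.loadGraphModel(modelUrl);   // loads without errors
```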
Below are the model's input nodes and input signatures (copied from Chrome's console):
inputNodes: ["attention_mask", "input_ids", "token_type_ids"]

inputs:
- { name: "attention_mask", dtype: "int32", shape: [-1, -1] }
- { name: "input_ids", dtype: "int32", shape: [-1, -1] }
- { name: "token_type_ids", dtype: "int32", shape: [-1, -1] }
Does anyone know what my model expects as input? In Python I could simply call model.predict() before converting the model for the Chrome extension.
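Based on the input signature above (three int32 tensors, each shaped [batch, sequenceLength]), I assume the call needs to look something like the sketch below. Here tokenIds, maskValues and typeIds are placeholder arrays for whatever a tokenizer would produce for the sentence; I'm not sure this is correct, which is what I'm asking:

```javascript
import * as tf from '@tensorflow/tfjs';

// Sketch only: tokenIds, maskValues and typeIds are placeholder arrays for a
// single tokenized sentence (batch of 1), e.g. tokenIds = [101, 7592, ..., 102].
async function classify(res, tokenIds, maskValues, typeIds) {
  const seqLen = tokenIds.length;
  const feeds = {
    input_ids:      tf.tensor2d([tokenIds],   [1, seqLen], 'int32'),
    attention_mask: tf.tensor2d([maskValues], [1, seqLen], 'int32'),
    token_type_ids: tf.tensor2d([typeIds],    [1, seqLen], 'int32'),
  };
  // executeAsync takes a map keyed by the input node names listed above.
  return res.executeAsync(feeds);
}
```

Is something like this what executeAsync expects, or does it need to be fed differently?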
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow