Tried to synchronously call function {promiseMethodWrapper} from a different thread
This is the first time I'm trying the bridging of react-native and a native iOS app.
In my react-native iOS project, I've created a Swift file (which generated a bridging header), and in that Swift file I've added a sample method to test first:
import Foundation

@objc(MyModule)
class MyModule: NSObject {

  @objc
  func testFunctionWithPromiseResolve(frame: Frame,
                                      resolver resolve: @escaping RCTPromiseResolveBlock,
                                      rejecter reject: @escaping RCTPromiseRejectBlock) {
    var resp = [String: Any]() // Init dictionary
    resp.updateValue(frame, forKey: "frame")
    resolve(resp)
  }

  @objc
  static func requiresMainQueueSetup() -> Bool {
    return true
  }
}
In the bridging header file I have imports only:
#import "React/RCTBridgeModule.h"
#import <VisionCamera/FrameProcessorPlugin.h> // from react-native-vision-camera
#import <VisionCamera/Frame.h> //from react-native-vision-camera
Then I created an objective-c file named MyModule.m
and in it, I've added:
#import <Foundation/Foundation.h>
#import "React/RCTBridgeModule.h"
@interface RCT_EXTERN_MODULE(MyModule, NSObject)

RCT_EXTERN_METHOD(testFunctionWithPromiseResolve:(Frame *)frame
                  resolver:(RCTPromiseResolveBlock)resolve
                  rejecter:(RCTPromiseRejectBlock)reject)

@end
Then in react-native, I have a Home.js
where I'm going to access this method.
import React from 'react';
import { Text, StyleSheet, ScrollView, NativeModules } from 'react-native';
import { Camera, useCameraDevices, useFrameProcessor } from 'react-native-vision-camera';
import 'react-native-reanimated'
function Home(props) {
  const devices = useCameraDevices();
  const device = devices.back;
  const { MyModule } = NativeModules;

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    let res = MyModule.testFunctionWithPromiseResolve(frame);
    console.log(res);
    // .then((res) => {
    //   console.log(res);
    // }).catch((e) => {
    //   console.log(e);
    // })
  })
//... My other code is just UI-related in which I'm calling the frameProcessor in <Camera... using its prop frameProcessor={frameProcessor} as per react-native-vision-camera documentation.
As per my understanding, we handle a Promise with then and catch, since I assume that's what we get back from RCTPromiseResolveBlock, but that wasn't working, so I simply tried console.log(res); and it prints undefined.
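For reference, this is the calling pattern I expected to work. A minimal sketch with the native module mocked out, since the real NativeModules.MyModule only exists inside the app:

```javascript
// Sketch (mocked): a native method exported with RCTPromiseResolveBlock
// surfaces in JS as a Promise, so it is consumed with then/catch (or await).
// NativeModules.MyModule is stubbed here so the shape runs anywhere;
// in the app you would use the real native module instead.
const MyModule = {
  testFunctionWithPromiseResolve: (frame) => Promise.resolve({ frame }),
};

const resultPromise = MyModule.testFunctionWithPromiseResolve({ id: 1 });
resultPromise
  .then((res) => console.log(res)) // resolves with { frame: { id: 1 } }
  .catch((e) => console.log(e));
```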
The error I'm getting is:
Tried to synchronously call function {promiseMethodWrapper} from a different thread.
Possible solutions are:
a) If you want to synchronously execute this method, mark it as a Worklet
b) If you want to execute this method on the JS thread, wrap it using runOnJS
reanimated::REAIOSErrorHandler::raiseSpec()
REAIOSErrorHandler.mm:18
reanimated::ErrorHandler::raise()::'lambda'()::operator()()
decltype(static_cast<reanimated::ErrorHandler::raise()::'lambda'()&>(fp)()) std::__1::__invoke<reanimated::ErrorHandler::raise()::'lambda'()&>(reanimated::ErrorHandler::raise()::'lambda'()&)
void std::__1::__invoke_void_return_wrapper<void, true>::__call<reanimated::ErrorHandler::raise()::'lambda'()&>(reanimated::ErrorHandler::raise()::'lambda'()&)
std::__1::__function::__alloc_func<reanimated::ErrorHandler::raise()::'lambda'(), std::__1::allocator<reanimated::ErrorHandler::raise()::'lambda'()>, void ()>::operator()()
std::__1::__function::__func<reanimated::ErrorHandler::raise()::'lambda'(), std::__1::allocator<reanimated::ErrorHandler::raise()::'lambda'()>, void ()>::operator()()
std::__1::__function::__value_func<void ()>::operator()() const
std::__1::function<void ()>::operator()() const
invocation function for block in vision::VisionCameraScheduler::scheduleOnUI(std::__1::function<void ()>)
F14F0161-E0DE-3D9C-851E-AD12F95A3073
F14F0161-E0DE-3D9C-851E-AD12F95A3073
F14F0161-E0DE-3D9C-851E-AD12F95A3073
F14F0161-E0DE-3D9C-851E-AD12F95A3073
F14F0161-E0DE-3D9C-851E-AD12F95A3073
_pthread_wqthread
start_wqthread
I do have 'worklet' declared in the useFrameProcessor callback.
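For context, option (b) in the error message refers to reanimated's runOnJS helper. The real API lives in react-native-reanimated and can't run outside an app, so here is a queue-based stand-in that only illustrates the idea: the worklet never calls the JS function directly, it schedules the call onto the JS thread.

```javascript
// Stand-in sketch of the runOnJS idea (NOT the real reanimated API):
// instead of invoking fn synchronously on the worklet/UI thread,
// the call is queued and executed later by the JS thread.
const jsThreadQueue = [];

function runOnJS(fn) {
  return (...args) => jsThreadQueue.push(() => fn(...args));
}

const results = [];
function handleValue(value) { results.push(value); }

// Inside a frame processor worklet you would write something like:
//   runOnJS(handleValue)(extractedValue);
runOnJS(handleValue)(42);
console.log(results.length); // still 0: nothing ran synchronously

// Later, the JS thread drains its queue:
jsThreadQueue.forEach((job) => job());
console.log(results); // [ 42 ]
```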
UPDATE:
I've updated the obj-c
method to:
@objc(testFunctionWithPromiseResolve:resolver:rejecter:)
func testFunctionWithPromiseResolve(_ frame: Frame,
resolver resolve: @escaping RCTPromiseResolveBlock,
rejecter reject: @escaping RCTPromiseRejectBlock) {...
and then in JS I did:
let module = NativeModules.MyModule;
const frameProcessor = useFrameProcessor((frame) => {
  'worklet';
  // let module = NativeModules.MyModule; // didn't work either
  console.log(module.testFunctionWithPromiseResolve(frame));
})
But I get the same error.
Solution 1

Luckily I found the problem.

1. Custom Frame Processor Plugins need to be implemented as described in the VisionCamera documentation ("Creating Frame Processor Plugins"). On the native side, you must export the plugin with the VISION_EXPORT_FRAME_PROCESSOR macro; exporting it with RCT_EXTERN_MODULE and RCT_EXTERN_METHOD will fail.
2. You then need to expose the plugin method on the JS side (see "Expose your Frame Processor Plugin to JS"), as follows:
import type { Frame } from 'react-native-vision-camera'
/**
* Scans QR codes.
*/
export function scanQRCodes(frame: Frame): string[] {
'worklet'
return __scanQRCodes(frame)
}
Then register __scanQRCodes as a global in your babel.config.js:
module.exports = {
  plugins: [
    [
      'react-native-reanimated/plugin',
      {
        globals: ['__scanQRCodes'],
      },
    ],
  ],
};
You have to restart the Metro bundler for changes in the babel.config.js file to take effect.
3. You must also rebuild the project for the changes to take effect.
My code is as follows:
I wrote a custom Objective-C frame processor plugin named MyModuleFrameProcessPlugin.m, implemented as follows:
#import <VisionCamera/FrameProcessorPlugin.h>
#import <VisionCamera/Frame.h>

@interface MyModuleFrameProcessPlugin : NSObject
@end

@implementation MyModuleFrameProcessPlugin

static inline id myCustomPlugin(Frame* frame, NSArray* arguments) {
  CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(frame.buffer);
  NSLog(@"myCustomPlugin: %zu x %zu Image. Logging %lu parameters:",
        CVPixelBufferGetWidth(imageBuffer),
        CVPixelBufferGetHeight(imageBuffer),
        (unsigned long)arguments.count);

  for (id param in arguments) {
    NSLog(@"myCustomPlugin: -> %@ (%@)",
          param == nil ? @"(nil)" : [param description],
          NSStringFromClass([param classForCoder]));
  }

  return @{
    @"myCustomPlugin_str": @"Test",
    @"myCustomPlugin_bool": @true,
    @"myCustomPlugin_double": @5.3,
    @"myCustomPlugin_array": @[
      @"Hello",
      @true,
      @17.38
    ]
  };
}

VISION_EXPORT_FRAME_PROCESSOR(myCustomPlugin)

@end
Create a new JS plugin named ExamplePlugin.ts; the code is as follows:
/* global __myCustomPlugin */
import type { Frame } from 'react-native-vision-camera';

declare let _WORKLET: true | undefined;

export function cusPlugin(frame: Frame): string[] {
  'worklet';
  if (!_WORKLET) throw new Error('my_custom_plugin must be called from a frame processor!');
  // @ts-expect-error because this function is dynamically injected by VisionCamera
  return __myCustomPlugin(frame, 'hello my_custom_plugin!', 'parameter2', true, 42, { test: 0, second: 'test' }, ['another test', 500]);
}
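Since __myCustomPlugin is only injected by VisionCamera at runtime, the wrapper above can't run outside the app. The following self-contained mock (with a made-up stub return value) just demonstrates the global-injection pattern the wrapper relies on:

```javascript
// Mock of the pattern: VisionCamera injects the plugin as a global at
// runtime; here we stub the global ourselves so the wrapper shape can
// be exercised anywhere. The stub's return value is hypothetical.
globalThis.__myCustomPlugin = (frame, ...params) => ({
  myCustomPlugin_str: 'Test',
  paramCount: params.length,
});

function cusPlugin(frame, ...params) {
  // In the app this function also carries the 'worklet' directive.
  return globalThis.__myCustomPlugin(frame, ...params);
}

const out = cusPlugin({ width: 1920, height: 1080 }, 'hello', 42);
console.log(out); // { myCustomPlugin_str: 'Test', paramCount: 2 }
```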
The babel.config.js
file is as follows:
module.exports = {
  presets: ['module:metro-react-native-babel-preset'],
  plugins: [
    [
      'react-native-reanimated/plugin',
      {
        globals: ['__myCustomPlugin']
      }
    ]
  ],
};
The plugin import code is as follows:
import {cusPlugin} from './ExamplePlugin';
The calling code is as follows:
const frameProcessor = useFrameProcessor((frame) => {
  'worklet';
  const value = cusPlugin(frame);
  console.log(`Return Values: ${JSON.stringify(value)}`);
}, []);
// Use in Camera: frameProcessor={frameProcessor}
I tested my code and it works; I hope it helps you.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | M_JSL |