FFmpeg
dnn_backend_common.c File Reference
#include "dnn_backend_common.h"


Macros

#define DNN_ASYNC_SUCCESS   (void *)0
 
#define DNN_ASYNC_FAIL   (void *)-1
 

Functions

int ff_check_exec_params (void *ctx, DNNBackendType backend, DNNFunctionType func_type, DNNExecBaseParams *exec_params)
 
DNNReturnType ff_dnn_fill_task (TaskItem *task, DNNExecBaseParams *exec_params, void *backend_model, int async, int do_ioproc)
 Fill the Task for Backend Execution.
 
static void * async_thread_routine (void *args)
 Thread routine for async execution.
 
DNNReturnType ff_dnn_async_module_cleanup (DNNAsyncExecModule *async_module)
 Join the Async Execution thread and set module pointers to NULL.
 
DNNReturnType ff_dnn_start_inference_async (void *ctx, DNNAsyncExecModule *async_module)
 Start asynchronous inference routine for the TensorFlow model on a detached thread.
 
DNNAsyncStatusType ff_dnn_get_result_common (Queue *task_queue, AVFrame **in, AVFrame **out)
 Extract input and output frame from the Task Queue after asynchronous inference.
 
DNNReturnType ff_dnn_fill_gettingoutput_task (TaskItem *task, DNNExecBaseParams *exec_params, void *backend_model, int input_height, int input_width, void *ctx)
 Allocate input and output frames and fill the Task with execution parameters.
 

Detailed Description

DNN common functions for different backends.

Definition in file dnn_backend_common.c.

Macro Definition Documentation

◆ DNN_ASYNC_SUCCESS

#define DNN_ASYNC_SUCCESS   (void *)0

Definition at line 26 of file dnn_backend_common.c.

◆ DNN_ASYNC_FAIL

#define DNN_ASYNC_FAIL   (void *)-1

Definition at line 27 of file dnn_backend_common.c.

Function Documentation

◆ ff_check_exec_params()

int ff_check_exec_params(void *ctx, DNNBackendType backend, DNNFunctionType func_type, DNNExecBaseParams *exec_params)

◆ ff_dnn_fill_task()

DNNReturnType ff_dnn_fill_task(TaskItem *task, DNNExecBaseParams *exec_params, void *backend_model, int async, int do_ioproc)

Fill the Task for Backend Execution.

It should be called after checking execution parameters using ff_check_exec_params.

Parameters
  task            pointer to the allocated task
  exec_params     pointer to execution parameters
  backend_model   void pointer to the backend model
  async           flag for async execution. Must be 0 or 1
  do_ioproc       flag for IO processing. Must be 0 or 1
Return values
  DNN_SUCCESS     if successful
  DNN_ERROR       if flags are invalid or any parameter is NULL

Definition at line 56 of file dnn_backend_common.c.

Referenced by ff_dnn_execute_model_native(), ff_dnn_execute_model_ov(), ff_dnn_execute_model_tf(), and ff_dnn_fill_gettingoutput_task().

◆ async_thread_routine()

static void *async_thread_routine(void *args)

Thread routine for async execution.

Parameters
  args    pointer to DNNAsyncExecModule module

Definition at line 80 of file dnn_backend_common.c.

Referenced by ff_dnn_start_inference_async().

◆ ff_dnn_async_module_cleanup()

DNNReturnType ff_dnn_async_module_cleanup(DNNAsyncExecModule *async_module)

Join the Async Execution thread and set module pointers to NULL.

Parameters
  async_module    pointer to DNNAsyncExecModule module
Return values
  DNN_SUCCESS     if successful
  DNN_ERROR       if async_module is NULL

Definition at line 92 of file dnn_backend_common.c.

Referenced by destroy_request_item().

◆ ff_dnn_start_inference_async()

DNNReturnType ff_dnn_start_inference_async(void *ctx, DNNAsyncExecModule *async_module)

Start asynchronous inference routine for the TensorFlow model on a detached thread.

It calls the completion callback after the inference completes. Completion callback and inference function must be set before calling this function.

If POSIX threads aren't supported, the execution rolls back to synchronous mode, calling completion callback after inference.

Parameters
  ctx             pointer to the backend context
  async_module    pointer to DNNAsyncExecModule module
Return values
  DNN_SUCCESS     on the start of async inference
  DNN_ERROR       in case async inference cannot be started

Definition at line 111 of file dnn_backend_common.c.

Referenced by execute_model_tf(), and ff_dnn_flush_tf().

◆ ff_dnn_get_result_common()

DNNAsyncStatusType ff_dnn_get_result_common(Queue *task_queue, AVFrame **in, AVFrame **out)

Extract input and output frame from the Task Queue after asynchronous inference.

Parameters
  task_queue    pointer to the task queue of the backend
  in            double pointer to the input frame
  out           double pointer to the output frame
Return values
  DAST_EMPTY_QUEUE    if the task queue is empty
  DAST_NOT_READY      if inference is not completed yet
  DAST_SUCCESS        if the result was successfully extracted

Definition at line 141 of file dnn_backend_common.c.

Referenced by ff_dnn_get_result_native(), ff_dnn_get_result_ov(), and ff_dnn_get_result_tf().

◆ ff_dnn_fill_gettingoutput_task()

DNNReturnType ff_dnn_fill_gettingoutput_task(TaskItem *task, DNNExecBaseParams *exec_params, void *backend_model, int input_height, int input_width, void *ctx)

Allocate input and output frames and fill the Task with execution parameters.

Parameters
  task            pointer to the allocated task
  exec_params     pointer to execution parameters
  backend_model   void pointer to the backend model
  input_height    height of input frame
  input_width     width of input frame
  ctx             pointer to the backend context
Return values
  DNN_SUCCESS     if successful
  DNN_ERROR       if allocation fails

Definition at line 161 of file dnn_backend_common.c.

Referenced by get_output_native(), get_output_ov(), and get_output_tf().