Tizen Native API
8.0
The NNStreamer Service API provides interfaces to store and fetch the pipeline description for AI application developers.
Required Header
#include <nnstreamer/ml-api-service.h>
Overview
The NNStreamer Service API provides utility interfaces for AI application developers.
These functions allow the following operations with NNStreamer:
- Set and get the pipeline description with a given name.
- Delete the pipeline description with a given name.
Note that this set of functions is designed to be thread-safe.
Related Features
This API is related to the following features:
- http://tizen.org/feature/machine_learning
- http://tizen.org/feature/machine_learning.service
Functions
int ml_service_start (ml_service_h handle)
    Starts the process of machine learning service.
int ml_service_stop (ml_service_h handle)
    Stops the process of machine learning service.
int ml_service_destroy (ml_service_h handle)
    Destroys the handle for machine learning service.
int ml_service_pipeline_set (const char *name, const char *pipeline_desc)
    Sets the pipeline description with a given name.
int ml_service_pipeline_get (const char *name, char **pipeline_desc)
    Gets the pipeline description with a given name.
int ml_service_pipeline_delete (const char *name)
    Deletes the pipeline description with a given name.
int ml_service_pipeline_launch (const char *name, ml_service_h *handle)
    Launches the pipeline of given service and gets the service handle.
int ml_service_pipeline_get_state (ml_service_h handle, ml_pipeline_state_e *state)
    Gets the state of the given handle's pipeline.
int ml_service_query_create (ml_option_h option, ml_service_h *handle)
    Creates a query service handle with a given ml-option handle.
int ml_service_query_request (ml_service_h handle, const ml_tensors_data_h input, ml_tensors_data_h *output)
    Requests the query service to process the input and produce an output.
int ml_service_model_register (const char *name, const char *path, const bool activate, const char *description, unsigned int *version)
    Registers new information of a neural network model.
int ml_service_model_update_description (const char *name, const unsigned int version, const char *description)
    Updates the description of the neural network model with the given name and version.
int ml_service_model_activate (const char *name, const unsigned int version)
    Activates a neural network model with the given name and version.
int ml_service_model_get (const char *name, const unsigned int version, ml_information_h *info)
    Gets the information of the neural network model with the given name and version.
int ml_service_model_get_activated (const char *name, ml_information_h *info)
    Gets the information of the activated neural network model with the given name.
int ml_service_model_get_all (const char *name, ml_information_list_h *info_list)
    Gets the list of neural network models with the given name.
int ml_service_model_delete (const char *name, const unsigned int version)
    Deletes the model information with the given name and version from machine learning service.
int ml_service_resource_add (const char *name, const char *path, const char *description)
    Adds new information of machine learning resources, such as images, audio samples, and binary files.
int ml_service_resource_delete (const char *name)
    Deletes the information of the resources from machine learning service.
int ml_service_resource_get (const char *name, ml_information_list_h *res)
    Gets the information of the resources from machine learning service.
Typedefs
typedef void * ml_service_h
    A handle for an ml-service instance.
Typedef Documentation
typedef void* ml_service_h
A handle for an ml-service instance.
- Since: 7.0
Function Documentation
int ml_service_destroy (ml_service_h handle)
Destroys the handle for machine learning service.
If the given service handle was created by ml_service_pipeline_launch(), this requests the machine learning agent to destroy the launched pipeline.
- Since: 7.0
- Parameters:
  - [in] handle The handle of ml-service.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER The parameter is invalid.
  - ML_ERROR_STREAMS_PIPE Failed to stop the process.
int ml_service_model_activate (const char *name, const unsigned int version)
Activates a neural network model with given name and version.
- Since: 8.0
- Parameters:
  - [in] name The unique name to indicate the model.
  - [in] version The version of registered model.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
int ml_service_model_delete (const char *name, const unsigned int version)
Deletes the model information with given name and version from machine learning service.
- Since: 8.0
- Remarks:
  - This does not remove the model file from the file system. If version is 0, machine learning service will delete all information with the given name.
- Parameters:
  - [in] name The unique name to indicate the model.
  - [in] version The version of registered model.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
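As a minimal illustrative sketch, the snippet below removes a single version and then all remaining records under one name, using the version-0 behavior described in the remarks. The model name and version number are placeholders, not values defined by the API:

```c
#include <nnstreamer/ml-api-service.h>

// Remove model records from machine learning service.
// Note: the model files themselves stay on the file system.
void
cleanup_model_records (void)
{
  int status;

  // Delete only version 2 of the model "imgcls-mobilenet" (placeholder name).
  status = ml_service_model_delete ("imgcls-mobilenet", 2U);
  if (status != ML_ERROR_NONE) {
    // Handle error case.
  }

  // Passing version 0 deletes all information stored under the given name.
  status = ml_service_model_delete ("imgcls-mobilenet", 0U);
  if (status != ML_ERROR_NONE) {
    // Handle error case.
  }
}
```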
int ml_service_model_get (const char *name, const unsigned int version, ml_information_h *info)
Gets the information of neural network model with given name and version.
- Since: 8.0
- Remarks:
  - If the function succeeds, the info should be released using ml_information_destroy().
- Parameters:
  - [in] name The unique name to indicate the model.
  - [in] version The version of registered model.
  - [out] info The handle of model information.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
  - ML_ERROR_OUT_OF_MEMORY Failed to allocate required memory.
int ml_service_model_get_activated (const char *name, ml_information_h *info)
Gets the information of activated neural network model with given name.
- Since: 8.0
- Remarks:
  - If the function succeeds, the info should be released using ml_information_destroy().
- Parameters:
  - [in] name The unique name to indicate the model.
  - [out] info The handle of activated model.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
  - ML_ERROR_OUT_OF_MEMORY Failed to allocate required memory.
int ml_service_model_get_all (const char *name, ml_information_list_h *info_list)
Gets the list of neural network models with given name.
- Since: 8.0
- Remarks:
  - If the function succeeds, the info_list should be released using ml_information_list_destroy().
- Parameters:
  - [in] name The unique name to indicate the model.
  - [out] info_list The handle of list of registered models.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
  - ML_ERROR_OUT_OF_MEMORY Failed to allocate required memory.
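A minimal sketch of enumerating every model registered under one name, combining this function with the ml_information_list accessors. The name "imgcls-mobilenet" is a placeholder, and only the "path" key is assumed here, since it is the key used by the ml_service_model_register() example on this page:

```c
#include <glib.h>
#include <nnstreamer/ml-api-service.h>

// List all models registered under a shared name and print each path.
void
list_registered_models (void)
{
  ml_information_list_h info_list = NULL;
  ml_information_h info;
  unsigned int i, length;
  gchar *path;
  int status;

  status = ml_service_model_get_all ("imgcls-mobilenet", &info_list);
  if (status != ML_ERROR_NONE) {
    // Handle error case (e.g., the name is not registered).
    return;
  }

  status = ml_information_list_length (info_list, &length);
  for (i = 0; i < length; i++) {
    status = ml_information_list_get (info_list, i, &info);
    if (status != ML_ERROR_NONE)
      continue;

    // "path" holds the file path given at registration time.
    status = ml_information_get (info, "path", (void **) &path);
    if (status == ML_ERROR_NONE)
      g_print ("model[%u]: %s\n", i, path);
  }

  // Release the list handle.
  ml_information_list_destroy (info_list);
}
```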
int ml_service_model_register (const char *name, const char *path, const bool activate, const char *description, unsigned int *version)
Registers new information of a neural network model.
- Since: 8.0
- Remarks:
  - Only one model can be activated with a given name. If the same name is already registered in machine learning service, this returns no error, and the old model is deactivated when the flag activate is true.
  - http://tizen.org/privilege/mediastorage is needed if the model file is relevant to media storage.
  - http://tizen.org/privilege/externalstorage is needed if the model file is relevant to external storage.
- Parameters:
  - [in] name The unique name to indicate the model.
  - [in] path The path to the neural network model.
  - [in] activate The flag to set the model to be activated.
  - [in] description Nullable, description for the neural network model.
  - [out] version The version of registered model.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_PERMISSION_DENIED The application does not have the privilege to access the storage.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
Here is an example of the usage:
// The machine-learning service API for model provides a method to share model files that can be used by ML applications.
const gchar *key = "imgcls-mobilenet"; // The name shared among ML applications.
gchar *model_path = g_strdup_printf ("%s/%s", app_get_shared_resource_path (), "mobilenet_v2.tflite"); // Provide the absolute file path.
const bool is_active = true; // Parameter deciding whether to activate this model or not.
const gchar *description = "This is the description of mobilenet_v2 model ..."; // Model description parameter.
unsigned int version; // Out parameter for the version of registered model.

// Register the model via ML Service API.
int status;
status = ml_service_model_register (key, model_path, is_active, description, &version);
if (status != ML_ERROR_NONE) {
  // Handle error case.
}

// In a separate snippet (e.g., another application), fetch the model
// which was registered and activated via ML Service API.
const gchar *key = "imgcls-mobilenet"; // The name shared among ML applications.
gchar *model_path; // The path of the registered model.
ml_information_h activated_model_info; // The ml_information handle for the activated model.
int status;
status = ml_service_model_get_activated (key, &activated_model_info);
if (status == ML_ERROR_NONE) {
  // Get the path of the model.
  gchar *activated_model_path;
  status = ml_information_get (activated_model_info, "path", (void **) &activated_model_path);
  model_path = g_strdup (activated_model_path);
  ml_information_destroy (activated_model_info); // Release the information handle.
} else {
  // Handle error case.
}

// Do ML things with the variable `model_path`.
int ml_service_model_update_description (const char *name, const unsigned int version, const char *description)
Updates the description of neural network model with given name and version.
- Since: 8.0
- Parameters:
  - [in] name The unique name to indicate the model.
  - [in] version The version of registered model.
  - [in] description The description for neural network model.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
int ml_service_pipeline_delete (const char *name)
Deletes the pipeline description with a given name.
- Since: 7.0
- Parameters:
  - [in] name The unique name to delete.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Note:
  - If the name does not exist in the database, this function returns ML_ERROR_NONE without any errors.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER The parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
int ml_service_pipeline_get (const char *name, char **pipeline_desc)
Gets the pipeline description with a given name.
- Since: 7.0
- Remarks:
  - If the function succeeds, pipeline_desc must be released using free().
- Parameters:
  - [in] name The unique name to retrieve.
  - [out] pipeline_desc The pipeline description corresponding with the given name.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER The parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
int ml_service_pipeline_get_state (ml_service_h handle, ml_pipeline_state_e *state)
Gets the state of the given handle's pipeline.
- Since: 7.0
- Parameters:
  - [in] handle The service handle.
  - [out] state The pipeline state.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER The parameter is invalid.
  - ML_ERROR_STREAMS_PIPE Failed to access the pipeline state.
int ml_service_pipeline_launch (const char *name, ml_service_h *handle)
Launches the pipeline of given service and gets the service handle.
This requests the machine learning agent daemon to launch a new pipeline of the given service. The pipeline description of name should be set in advance.
- Since: 7.0
- Remarks:
  - The handle should be destroyed using ml_service_destroy().
- Parameters:
  - [in] name The service name.
  - [out] handle Newly created service handle is returned.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER The parameter is invalid.
  - ML_ERROR_OUT_OF_MEMORY Failed to allocate required memory.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
  - ML_ERROR_STREAMS_PIPE Failed to launch the pipeline.
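A minimal sketch of the typical handle lifecycle, combining this function with ml_service_start(), ml_service_pipeline_get_state(), ml_service_stop(), and ml_service_destroy(). The name "my_pipeline" is a placeholder that must match a description previously stored with ml_service_pipeline_set():

```c
#include <nnstreamer/ml-api-service.h>

// Launch a stored pipeline and drive its start/stop/destroy lifecycle.
void
run_service_pipeline (void)
{
  ml_service_h handle = NULL;
  ml_pipeline_state_e state;
  int status;

  status = ml_service_pipeline_launch ("my_pipeline", &handle);
  if (status != ML_ERROR_NONE) {
    // Handle error case (e.g., the name is not registered).
    return;
  }

  status = ml_service_start (handle);
  if (status == ML_ERROR_NONE) {
    // Optionally check the pipeline state after starting.
    status = ml_service_pipeline_get_state (handle, &state);

    // ... run the service, then stop it.
    ml_service_stop (handle);
  }

  // Destroying the handle also asks the agent to destroy the launched pipeline.
  ml_service_destroy (handle);
}
```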
int ml_service_pipeline_set (const char *name, const char *pipeline_desc)
Sets the pipeline description with a given name.
- Since: 7.0
- Remarks:
  - If the name already exists, the pipeline description is overwritten. Overwriting an existing description is restricted to the application or service that set it. However, users should keep the name unexposed to prevent unexpected overwriting.
- Parameters:
  - [in] name Unique name to retrieve the associated pipeline description.
  - [in] pipeline_desc The pipeline description to be stored.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER The parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
Here is an example of the usage:
const gchar my_pipeline[] = "videotestsrc is-live=true ! videoconvert ! tensor_converter ! tensor_sink async=false";
gchar *pipeline = NULL;
int status;
ml_pipeline_h handle = NULL;

// Set the pipeline description.
status = ml_service_pipeline_set ("my_pipeline", my_pipeline);
if (status != ML_ERROR_NONE) {
  // Handle error case.
  goto error;
}

// Example to construct a pipeline with the stored pipeline description.
// Users may register intelligence pipelines for other processes and fetch such registered pipelines.
// For example, a developer adds a pipeline which includes preprocessing and invoking a neural network model,
// then an application can fetch and construct this for an intelligence service.
status = ml_service_pipeline_get ("my_pipeline", &pipeline);
if (status != ML_ERROR_NONE) {
  // Handle error case.
  goto error;
}

status = ml_pipeline_construct (pipeline, NULL, NULL, &handle);
if (status != ML_ERROR_NONE) {
  // Handle error case.
  goto error;
}

error:
ml_pipeline_destroy (handle);
g_free (pipeline);
int ml_service_query_create (ml_option_h option, ml_service_h *handle)
Creates a query service handle with a given ml-option handle.
- Since: 7.0
- Remarks:
  - The handle should be destroyed using ml_service_destroy().
- Parameters:
  - [in] option The option used for creating query service.
  - [out] handle Newly created query service handle is returned.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER The parameter is invalid.
  - ML_ERROR_OUT_OF_MEMORY Failed to allocate required memory.
  - ML_ERROR_STREAMS_PIPE Failed to launch the pipeline.
  - ML_ERROR_TRY_AGAIN The pipeline is not ready yet.
int ml_service_query_request (ml_service_h handle, const ml_tensors_data_h input, ml_tensors_data_h *output)
Requests the query service to process the input and produce an output.
- Since: 7.0
- Remarks:
  - If the function succeeds, the output should be released using ml_tensors_data_destroy().
- Parameters:
  - [in] handle The query service handle created by ml_service_query_create().
  - [in] input The handle of input tensors.
  - [out] output The handle of output tensors.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_OUT_OF_MEMORY Failed to allocate required memory.
  - ML_ERROR_STREAMS_PIPE The input is incompatible with the pipeline.
  - ML_ERROR_TRY_AGAIN The pipeline is not ready yet.
  - ML_ERROR_TIMED_OUT Failed to get output from the query service.
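The two query functions are typically used together. The sketch below is illustrative only: the option keys ("host" and "topic") and their values are assumptions about a typical query-service configuration, and should be checked against the ml_option documentation before use:

```c
#include <glib.h>
#include <nnstreamer/ml-api-service.h>

// Create a query service client and send one inference request.
// The option keys and values below are hypothetical examples.
void
request_query_service (ml_tensors_data_h input)
{
  ml_option_h option = NULL;
  ml_service_h handle = NULL;
  ml_tensors_data_h output = NULL;
  int status;

  status = ml_option_create (&option);
  if (status != ML_ERROR_NONE)
    return;

  // Hypothetical client configuration; keys depend on the query service.
  ml_option_set (option, "host", g_strdup ("localhost"), g_free);
  ml_option_set (option, "topic", g_strdup ("object-detection"), g_free);

  status = ml_service_query_create (option, &handle);
  ml_option_destroy (option);
  if (status != ML_ERROR_NONE)
    return;

  status = ml_service_query_request (handle, input, &output);
  if (status == ML_ERROR_NONE) {
    // ... consume the output tensors, then release them.
    ml_tensors_data_destroy (output);
  }

  ml_service_destroy (handle);
}
```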
int ml_service_resource_add (const char *name, const char *path, const char *description)
Adds new information of machine learning resources, such as images, audio samples, and binary files.
- Since: 8.0
- Remarks:
  - If the same name is already registered in machine learning service, this returns no error and the list of resource files will be updated.
  - http://tizen.org/privilege/mediastorage is needed if the resource files are relevant to media storage.
  - http://tizen.org/privilege/externalstorage is needed if the resource files are relevant to external storage.
- Parameters:
  - [in] name The unique name to indicate the resources.
  - [in] path The path to machine learning resources.
  - [in] description Nullable, description for machine learning resources.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_PERMISSION_DENIED The application does not have the privilege to access the storage.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
Here is an example of the usage:
// The machine-learning resource API provides a method to share data files that can be used for training or inferencing an AI model.
// Users may generate a preprocessed data file and add it into machine-learning service.
// Then an application can fetch the data set for retraining an AI model.
const char *my_resources[3] = {
  "/path/to/resources/my_res1.dat",
  "/path/to/resources/my_res2.dat",
  "/path/to/resources/my_res3.dat"
};
int status;
unsigned int i, length;
ml_information_list_h resources;
ml_information_h res;
char *path_to_resource;

// Add resource files with the name "my_resource".
for (i = 0; i < 3; i++) {
  status = ml_service_resource_add ("my_resource", my_resources[i], "This is my resource data file.");
  if (status != ML_ERROR_NONE) {
    // Handle error case.
  }
}

// Get the resources with the specific name.
status = ml_service_resource_get ("my_resource", &resources);
if (status != ML_ERROR_NONE) {
  // Handle error case.
}

status = ml_information_list_length (resources, &length);
for (i = 0; i < length; i++) {
  status = ml_information_list_get (resources, i, &res);
  // Get the path of the added resources.
  status = ml_information_get (res, "path", (void **) &path_to_resource);
}

// Release the information handle of the resources.
status = ml_information_list_destroy (resources);
int ml_service_resource_delete (const char *name)
Deletes the information of the resources from machine learning service.
- Since: 8.0
- Remarks:
  - This does not remove the resource files from the file system.
- Parameters:
  - [in] name The unique name to indicate the resources.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
int ml_service_resource_get (const char *name, ml_information_list_h *res)
Gets the information of the resources from machine learning service.
- Since: 8.0
- Remarks:
  - If the function succeeds, the res should be released using ml_information_list_destroy().
- Parameters:
  - [in] name The unique name to indicate the resources.
  - [out] res The handle of the machine learning resources.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_IO_ERROR The operation of DB or filesystem has failed.
  - ML_ERROR_OUT_OF_MEMORY Failed to allocate required memory.
int ml_service_start (ml_service_h handle)
Starts the process of machine learning service.
- Since: 7.0
- Parameters:
  - [in] handle The handle of ml-service.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_STREAMS_PIPE Failed to start the process.
int ml_service_stop (ml_service_h handle)
Stops the process of machine learning service.
- Since: 7.0
- Parameters:
  - [in] handle The handle of ml-service.
- Returns:
  - 0 on success. Otherwise a negative error value.
- Return values:
  - ML_ERROR_NONE Successful.
  - ML_ERROR_NOT_SUPPORTED Not supported.
  - ML_ERROR_INVALID_PARAMETER Given parameter is invalid.
  - ML_ERROR_STREAMS_PIPE Failed to stop the process.