!6495 [MS][LITE]fix security error and change download link

Merge pull request !6495 from gongdaguo/fix_security_error
mindspore-ci-bot 2020-09-19 09:05:58 +08:00 committed by Gitee
commit 0d0d34d4a7
7 changed files with 228 additions and 226 deletions


@@ -25,6 +25,8 @@ graph_8bit_1021_combine.tflite
lite-model_aiy_vision_classifier_insects_V1_3.tflite
lite-model_aiy_vision_classifier_plants_V1_3.tflite
lite-model_object_detection_mobile_object_labeler_v1_1.tflite
lite-model_cropnet_classifier_cassava_disease_V1_1.tflite
vision_classifier_fungi_mobile_V1_1_default_1.tflite
detect.tflite
ssd_mobilenet_v1_1_default_1.tflite
object_detection_mobile_object_localizer_v1_1_default_1.tflite


@@ -126,11 +126,11 @@ target_link_libraries(
)
```
* In this example, the download.gradle configuration automatically downloads the MindSpore Lite package during the build and places it in the `app/src/main/cpp/mindspore_lite_x.x.x-minddata-arm64-cpu` directory.
* In this example, the download.gradle configuration automatically downloads the MindSpore Lite package during the build and places it in the `app/src/main/cpp/` directory.
Note: if the automatic download fails, please manually download the relevant library files and put them in the corresponding location.
MindSpore Lite version [Download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%200.7/libmindspore-lite.so)
mindspore-lite-1.0.0-minddata-arm64-cpu.tar.gz [Download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%201.0/mindspore-lite-1.0.0-minddata-arm64-cpu.tar.gz)
### Downloading and Deploying a Model File


@@ -86,13 +86,13 @@ app
### Configuring MindSpore Lite Dependencies
The Android JNI layer needs the corresponding library files to call the MindSpore C++ API. The `libmindspore-lite.so` library file can be generated by building MindSpore Lite from source.
The Android JNI layer needs the corresponding library files to call the MindSpore C++ API. The "mindspore-lite-X.X.X-mindata-armXX-cpu" library package (which contains the `libmindspore-lite.so` library file and related headers, and may cover multiple compatible architectures) can be generated by [building MindSpore Lite from source](https://www.mindspore.cn/lite/tutorial/zh-CN/master/build.html).
In this example, the build process uses the download.gradle file to automatically download the MindSpore Lite package from the Huawei server and place it in the `app/src/main/cpp/mindspore_lite_x.x.x-minddata-arm64-cpu` directory.
In this example, the build process uses the download.gradle file to automatically download the MindSpore Lite package from the Huawei server and place it in the `app/src/main/cpp/` directory.
* Note: if the automatic download fails, manually download the relevant library files and put them in the corresponding locations:
MindSpore Lite version [Download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%200.7/libmindspore-lite.so)
mindspore-lite-1.0.0-minddata-arm64-cpu.tar.gz [Download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%201.0/mindspore-lite-1.0.0-minddata-arm64-cpu.tar.gz)
```
@@ -243,14 +243,16 @@ target_link_libraries(
- Post-processing of the output data.
```cpp
std::string ProcessRunnetResult(std::unordered_map<std::string,
mindspore::tensor::MSTensor *> msOutputs, int runnetRet) {
std::string ProcessRunnetResult(const int RET_CATEGORY_SUM, const char *const labels_name_map[],
std::unordered_map<std::string, mindspore::tensor::MSTensor *> msOutputs) {
// Get the branch of the model output.
// Use iterators to get map elements.
std::unordered_map<std::string, mindspore::tensor::MSTensor *>::iterator iter;
iter = msOutputs.begin();
// The mobilenetv2.ms model outputs just one branch.
auto outputTensor = iter->second;
int tensorNum = outputTensor->ElementsNum();
MS_PRINT("Number of tensor elements:%d", tensorNum);


@@ -88,7 +88,7 @@ In this example, the download.gradle File configuration auto download library files
Note: if the automatic download fails, please manually download the relevant library files and put them in the corresponding location.
libmindspore-lite.so [Download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%200.7/libmindspore-lite.so)
mindspore-lite-1.0.0-minddata-arm64-cpu.tar.gz [Download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%201.0/mindspore-lite-1.0.0-minddata-arm64-cpu.tar.gz)


@@ -27,7 +27,7 @@
2. Connect an Android device and run the object detection sample application.
Connect the Android device over USB for debugging, then click `Run 'app'` to run the sample project on your device.
* Note: during the build, Android Studio automatically downloads MindSpore Lite, OpenCV, model files, and other dependencies; please wait patiently for the build to finish.
* Note: during the build, Android Studio automatically downloads MindSpore Lite, model files, and other dependencies; please wait patiently for the build to finish.
![run_app](images/run_app.PNG)
@@ -85,9 +85,9 @@ app
### Configuring MindSpore Lite Dependencies
The Android JNI layer needs the corresponding library files to call the MindSpore C++ API. The `libmindspore-lite.so` library file can be generated by [building MindSpore Lite from source](https://www.mindspore.cn/lite/docs/zh-CN/master/deploy.html).
The Android JNI layer needs the corresponding library files to call the MindSpore C++ API. The "mindspore-lite-X.X.X-mindata-armXX-cpu" library package (which contains the `libmindspore-lite.so` library file and related headers, and may cover multiple compatible architectures) can be generated by [building MindSpore Lite from source](https://www.mindspore.cn/lite/tutorial/zh-CN/master/build.html).
In Android Studio, extract the compiled mindspore-lite-X.X.X-mindata-armXX-cpu package (which contains the `libmindspore-lite.so` library file and related headers, and may cover multiple compatible architectures) into the `app/src/main/cpp` directory of the app project, then configure CMake build support and `arm64-v8a`/`armeabi-v7a` build support in the app's `build.gradle` file, as shown below:
In Android Studio, extract the compiled mindspore-lite-X.X.X-mindata-armXX-cpu package into the `app/src/main/cpp` directory of the app project, then configure CMake build support and `arm64-v8a`/`armeabi-v7a` build support in the app's `build.gradle` file, as shown below:
```
android{
defaultConfig{
@@ -130,7 +130,7 @@ target_link_libraries(
* Note: if the automatic download fails, manually download the relevant library files and put them in the corresponding locations:
* libmindspore-lite.so [Download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%200.7/libmindspore-lite.so)
* mindspore-lite-1.0.0-minddata-arm64-cpu.tar.gz [Download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%201.0/mindspore-lite-1.0.0-minddata-arm64-cpu.tar.gz)
### Downloading and Deploying the Model File


@@ -20,6 +20,7 @@
#define MS_PRINT(format, ...) __android_log_print(ANDROID_LOG_INFO, "MSJNI", format, ##__VA_ARGS__)
SSDModelUtil::~SSDModelUtil(void) {}
/**
* SSD model util constructor.
@@ -61,8 +62,7 @@ std::string SSDModelUtil::getDecodeResult(float *branchScores, float *branchBoxData)
}
// NMS processing.
ssd_boxes_decode(tmpBox, decodedBoxes);
// const float nms_threshold = 0.6;
ssd_boxes_decode(tmpBox, decodedBoxes, 0.1, 0.2, 1917);
const float nms_threshold = 0.3;
for (int i = 1; i < 81; i++) {
std::vector<int> in_indexes;


@@ -26,6 +26,8 @@ class SSDModelUtil {
// Constructor.
SSDModelUtil(int srcImageWidth, int srcImgHeight);
~SSDModelUtil();
/**
* Return the SSD model post-processing result.
* @param branchScores
@@ -34,10 +36,6 @@ class SSDModelUtil {
*/
std::string getDecodeResult(float *branchScores, float *branchBoxData);
// ============= variables =============.
int inputImageHeight;
int inputImageWidth;
struct NormalBox {
float y;
float x;
@@ -64,7 +62,8 @@ class SSDModelUtil {
private:
std::vector<struct NormalBox> mDefaultBoxes;
int inputImageHeight;
int inputImageWidth;
void getDefaultBoxes();
@@ -80,7 +79,6 @@ class SSDModelUtil {
double IOU(float r1[4], float r2[4]);
// ============= variables =============.
struct network {
int model_input_height = 300;