
ImageAI (4): Quick and Simple Custom Prediction Model Training in Python (Custom Model Training)


Previous posts in this series covered image prediction, object detection, and object detection in videos with ImageAI.

Likewise, only a few lines of core code are needed to train a custom prediction model. (This post focuses on the overall process.)
ImageAI GitHub repository
For environment setup and ImageAI installation, see the earlier posts:
Image Prediction: ImageAI (1)
Object Detection: ImageAI (2)
Object Detection in Video: ImageAI (3)


Custom Model Training

As before, ImageAI provides four algorithms (SqueezeNet, ResNet, InceptionV3, and DenseNet) for training a custom prediction model.
ResNet is used here.

Training a custom prediction model requires a training dataset. (This post uses the IdenProf dataset provided by ImageAI, which contains images of workers from ten different professions.)
The directory structure is as follows:

IdenProf
    test
        chef
            chef-1...
        doctor
            doctor-1...
        ...
    train
        chef
            chef-1...
        doctor
            doctor-1...

The download URL used in the code below no longer works; the new address is https://github.com/OlafenwaMoses/IdenProf/releases
After downloading, extract the archives yourself and use the data inside.
The code below uses only three of the classes, purely as a demonstration, since my machine cannot handle more.
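
Before training, it can help to confirm that the extracted data actually follows the layout shown above. Below is a minimal sketch; the "idenprof" path is an assumption, so point it at wherever you extracted the dataset.

import os

# Count the images in each class folder under train/ and test/ to verify the layout.
DATASET_DIR = "idenprof"  # assumed extraction location; adjust as needed

for split in ("train", "test"):
    split_dir = os.path.join(DATASET_DIR, split)
    for class_name in sorted(os.listdir(split_dir)):
        class_dir = os.path.join(split_dir, class_name)
        if os.path.isdir(class_dir):
            print(split, class_name, len(os.listdir(class_dir)), "images")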

The code is as follows:

from io import open
import requests
import shutil
from zipfile import ZipFile
import os
from imageai.Prediction.Custom import ModelTraining  # import the ModelTraining class

##############################################################################
# Dataset download and preparation
##############################################################################
execution_path = os.getcwd()

TRAIN_ZIP_ONE = os.path.join(execution_path, "idenprof-train1.zip")
TRAIN_ZIP_TWO = os.path.join(execution_path, "idenprof-train2.zip")
TEST_ZIP = os.path.join(execution_path, "idenprof-test.zip")

DATASET_DIR = os.path.join(execution_path, "idenprof")
DATASET_TRAIN_DIR = os.path.join(DATASET_DIR, "train")
DATASET_TEST_DIR = os.path.join(DATASET_DIR, "test")

if not os.path.exists(DATASET_DIR):
    os.mkdir(DATASET_DIR)
if not os.path.exists(DATASET_TRAIN_DIR):
    os.mkdir(DATASET_TRAIN_DIR)
if not os.path.exists(DATASET_TEST_DIR):
    os.mkdir(DATASET_TEST_DIR)

if len(os.listdir(DATASET_TRAIN_DIR)) < 3:
    if not os.path.exists(TRAIN_ZIP_ONE):
        print("Downloading idenprof-train1.zip")
        data = requests.get("https://github.com/OlafenwaMoses/IdenProf/releases/download/v1.0/idenprof-train1.zip", stream=True)
        with open(TRAIN_ZIP_ONE, "wb") as file:
            shutil.copyfileobj(data.raw, file)
        del data
    if not os.path.exists(TRAIN_ZIP_TWO):
        print("Downloading idenprof-train2.zip")
        data = requests.get("https://github.com/OlafenwaMoses/IdenProf/releases/download/v1.0/idenprof-train2.zip", stream=True)
        with open(TRAIN_ZIP_TWO, "wb") as file:
            shutil.copyfileobj(data.raw, file)
        del data
    print("Extracting idenprof-train1.zip")
    extract1 = ZipFile(TRAIN_ZIP_ONE)
    extract1.extractall(DATASET_TRAIN_DIR)
    extract1.close()
    print("Extracting idenprof-train2.zip")
    extract2 = ZipFile(TRAIN_ZIP_TWO)
    extract2.extractall(DATASET_TRAIN_DIR)
    extract2.close()

if len(os.listdir(DATASET_TEST_DIR)) < 3:
    if not os.path.exists(TEST_ZIP):
        print("Downloading idenprof-test.zip")
        data = requests.get("https://github.com/OlafenwaMoses/IdenProf/releases/download/v1.0/idenprof-test.zip", stream=True)
        with open(TEST_ZIP, "wb") as file:
            shutil.copyfileobj(data.raw, file)
        del data
    print("Extracting idenprof-test.zip")
    extract = ZipFile(TEST_ZIP)
    extract.extractall(DATASET_TEST_DIR)
    extract.close()

##############################################################################
# Model training
##############################################################################
model_trainer = ModelTraining()        # create a ModelTraining instance
model_trainer.setModelTypeAsResNet()   # set the model type to ResNet; other options:
                                       #   setModelTypeAsSqueezeNet()
                                       #   setModelTypeAsInceptionV3()
                                       #   setModelTypeAsDenseNet()
model_trainer.setDataDirectory(DATASET_DIR)   # path to the dataset directory
model_trainer.trainModel(num_objects=3, num_experiments=10, enhance_data=True,
                         batch_size=32, show_network_summary=True)
# train the model; see the parameter notes below

Parameter notes

num_objects: the number of classes (kinds of images) in the dataset (for simplicity only three classes are used here)
num_experiments: the number of training passes over the images (epochs)
enhance_data (optional): whether to generate augmented copies of the training images for better performance
batch_size: the number of images processed per batch (see the short check after these notes)
show_network_summary: whether to print the network structure (the model summary shown below) to the console
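
As a small sanity check of these values against the log further down: the run below finds 300 training images, and each epoch reports 9 steps ("9/9"), which matches an assumed steps-per-epoch of the training image count floor-divided by batch_size.

# Assumption: ImageAI derives steps per epoch roughly as train_images // batch_size.
train_images = 300   # reported by the run below ("Found 300 images belonging to 3 classes.")
batch_size = 32
print(train_images // batch_size)  # -> 9, matching the "9/9" shown in each epoch of the log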

Running results

The model structure is printed first, followed by the training output.

__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) (None, 224, 224, 3) 0
__________________________________________________________________________________________________
conv2d_54 (Conv2D) (None, 112, 112, 64) 9472 input_2[0][0]
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, 112, 112, 64) 256 conv2d_54[0][0]
__________________________________________________________________________________________________
activation_51 (Activation) (None, 112, 112, 64) 0 batch_normalization_54[0][0]
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D) (None, 55, 55, 64) 0 activation_51[0][0]
__________________________________________________________________________________________________
conv2d_56 (Conv2D) (None, 55, 55, 64) 4160 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, 55, 55, 64) 256 conv2d_56[0][0]
__________________________________________________________________________________________________
activation_52 (Activation) (None, 55, 55, 64) 0 batch_normalization_56[0][0]
__________________________________________________________________________________________________
conv2d_57 (Conv2D) (None, 55, 55, 64) 36928 activation_52[0][0]
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, 55, 55, 64) 256 conv2d_57[0][0]
__________________________________________________________________________________________________
activation_53 (Activation) (None, 55, 55, 64) 0 batch_normalization_57[0][0]
__________________________________________________________________________________________________
conv2d_58 (Conv2D) (None, 55, 55, 256) 16640 activation_53[0][0]
__________________________________________________________________________________________________
conv2d_55 (Conv2D) (None, 55, 55, 256) 16640 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, 55, 55, 256) 1024 conv2d_58[0][0]
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, 55, 55, 256) 1024 conv2d_55[0][0]
__________________________________________________________________________________________________
add_17 (Add) (None, 55, 55, 256) 0 batch_normalization_58[0][0]
batch_normalization_55[0][0]
__________________________________________________________________________________________________
activation_54 (Activation) (None, 55, 55, 256) 0 add_17[0][0]
__________________________________________________________________________________________________
conv2d_59 (Conv2D) (None, 55, 55, 64) 16448 activation_54[0][0]
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, 55, 55, 64) 256 conv2d_59[0][0]
__________________________________________________________________________________________________
activation_55 (Activation) (None, 55, 55, 64) 0 batch_normalization_59[0][0]
__________________________________________________________________________________________________
conv2d_60 (Conv2D) (None, 55, 55, 64) 36928 activation_55[0][0]
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, 55, 55, 64) 256 conv2d_60[0][0]
__________________________________________________________________________________________________
activation_56 (Activation) (None, 55, 55, 64) 0 batch_normalization_60[0][0]
__________________________________________________________________________________________________
conv2d_61 (Conv2D) (None, 55, 55, 256) 16640 activation_56[0][0]
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, 55, 55, 256) 1024 conv2d_61[0][0]
__________________________________________________________________________________________________
add_18 (Add) (None, 55, 55, 256) 0 batch_normalization_61[0][0]
activation_54[0][0]
__________________________________________________________________________________________________
activation_57 (Activation) (None, 55, 55, 256) 0 add_18[0][0]
__________________________________________________________________________________________________
conv2d_62 (Conv2D) (None, 55, 55, 64) 16448 activation_57[0][0]
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, 55, 55, 64) 256 conv2d_62[0][0]
__________________________________________________________________________________________________
activation_58 (Activation) (None, 55, 55, 64) 0 batch_normalization_62[0][0]
__________________________________________________________________________________________________
conv2d_63 (Conv2D) (None, 55, 55, 64) 36928 activation_58[0][0]
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, 55, 55, 64) 256 conv2d_63[0][0]
__________________________________________________________________________________________________
activation_59 (Activation) (None, 55, 55, 64) 0 batch_normalization_63[0][0]
__________________________________________________________________________________________________
conv2d_64 (Conv2D) (None, 55, 55, 256) 16640 activation_59[0][0]
__________________________________________________________________________________________________
batch_normalization_64 (BatchNo (None, 55, 55, 256) 1024 conv2d_64[0][0]
__________________________________________________________________________________________________
add_19 (Add) (None, 55, 55, 256) 0 batch_normalization_64[0][0]
activation_57[0][0]
__________________________________________________________________________________________________
activation_60 (Activation) (None, 55, 55, 256) 0 add_19[0][0]
__________________________________________________________________________________________________
conv2d_66 (Conv2D) (None, 28, 28, 128) 32896 activation_60[0][0]
__________________________________________________________________________________________________
batch_normalization_66 (BatchNo (None, 28, 28, 128) 512 conv2d_66[0][0]
__________________________________________________________________________________________________
activation_61 (Activation) (None, 28, 28, 128) 0 batch_normalization_66[0][0]
__________________________________________________________________________________________________
conv2d_67 (Conv2D) (None, 28, 28, 128) 147584 activation_61[0][0]
__________________________________________________________________________________________________
batch_normalization_67 (BatchNo (None, 28, 28, 128) 512 conv2d_67[0][0]
__________________________________________________________________________________________________
activation_62 (Activation) (None, 28, 28, 128) 0 batch_normalization_67[0][0]
__________________________________________________________________________________________________
conv2d_68 (Conv2D) (None, 28, 28, 512) 66048 activation_62[0][0]
__________________________________________________________________________________________________
conv2d_65 (Conv2D) (None, 28, 28, 512) 131584 activation_60[0][0]
__________________________________________________________________________________________________
batch_normalization_68 (BatchNo (None, 28, 28, 512) 2048 conv2d_68[0][0]
__________________________________________________________________________________________________
batch_normalization_65 (BatchNo (None, 28, 28, 512) 2048 conv2d_65[0][0]
__________________________________________________________________________________________________
add_20 (Add) (None, 28, 28, 512) 0 batch_normalization_68[0][0]
batch_normalization_65[0][0]
__________________________________________________________________________________________________
activation_63 (Activation) (None, 28, 28, 512) 0 add_20[0][0]
__________________________________________________________________________________________________
conv2d_69 (Conv2D) (None, 28, 28, 128) 65664 activation_63[0][0]
__________________________________________________________________________________________________
batch_normalization_69 (BatchNo (None, 28, 28, 128) 512 conv2d_69[0][0]
__________________________________________________________________________________________________
activation_64 (Activation) (None, 28, 28, 128) 0 batch_normalization_69[0][0]
__________________________________________________________________________________________________
conv2d_70 (Conv2D) (None, 28, 28, 128) 147584 activation_64[0][0]
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, 28, 28, 128) 512 conv2d_70[0][0]
__________________________________________________________________________________________________
activation_65 (Activation) (None, 28, 28, 128) 0 batch_normalization_70[0][0]
__________________________________________________________________________________________________
conv2d_71 (Conv2D) (None, 28, 28, 512) 66048 activation_65[0][0]
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, 28, 28, 512) 2048 conv2d_71[0][0]
__________________________________________________________________________________________________
add_21 (Add) (None, 28, 28, 512) 0 batch_normalization_71[0][0]
activation_63[0][0]
__________________________________________________________________________________________________
activation_66 (Activation) (None, 28, 28, 512) 0 add_21[0][0]
__________________________________________________________________________________________________
conv2d_72 (Conv2D) (None, 28, 28, 128) 65664 activation_66[0][0]
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, 28, 28, 128) 512 conv2d_72[0][0]
__________________________________________________________________________________________________
activation_67 (Activation) (None, 28, 28, 128) 0 batch_normalization_72[0][0]
__________________________________________________________________________________________________
conv2d_73 (Conv2D) (None, 28, 28, 128) 147584 activation_67[0][0]
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, 28, 28, 128) 512 conv2d_73[0][0]
__________________________________________________________________________________________________
activation_68 (Activation) (None, 28, 28, 128) 0 batch_normalization_73[0][0]
__________________________________________________________________________________________________
conv2d_74 (Conv2D) (None, 28, 28, 512) 66048 activation_68[0][0]
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, 28, 28, 512) 2048 conv2d_74[0][0]
__________________________________________________________________________________________________
add_22 (Add) (None, 28, 28, 512) 0 batch_normalization_74[0][0]
activation_66[0][0]
__________________________________________________________________________________________________
activation_69 (Activation) (None, 28, 28, 512) 0 add_22[0][0]
__________________________________________________________________________________________________
conv2d_75 (Conv2D) (None, 28, 28, 128) 65664 activation_69[0][0]
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, 28, 28, 128) 512 conv2d_75[0][0]
__________________________________________________________________________________________________
activation_70 (Activation) (None, 28, 28, 128) 0 batch_normalization_75[0][0]
__________________________________________________________________________________________________
conv2d_76 (Conv2D) (None, 28, 28, 128) 147584 activation_70[0][0]
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, 28, 28, 128) 512 conv2d_76[0][0]
__________________________________________________________________________________________________
activation_71 (Activation) (None, 28, 28, 128) 0 batch_normalization_76[0][0]
__________________________________________________________________________________________________
conv2d_77 (Conv2D) (None, 28, 28, 512) 66048 activation_71[0][0]
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, 28, 28, 512) 2048 conv2d_77[0][0]
__________________________________________________________________________________________________
add_23 (Add) (None, 28, 28, 512) 0 batch_normalization_77[0][0]
activation_69[0][0]
__________________________________________________________________________________________________
activation_72 (Activation) (None, 28, 28, 512) 0 add_23[0][0]
__________________________________________________________________________________________________
conv2d_79 (Conv2D) (None, 14, 14, 256) 131328 activation_72[0][0]
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, 14, 14, 256) 1024 conv2d_79[0][0]
__________________________________________________________________________________________________
activation_73 (Activation) (None, 14, 14, 256) 0 batch_normalization_79[0][0]
__________________________________________________________________________________________________
conv2d_80 (Conv2D) (None, 14, 14, 256) 590080 activation_73[0][0]
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, 14, 14, 256) 1024 conv2d_80[0][0]
__________________________________________________________________________________________________
activation_74 (Activation) (None, 14, 14, 256) 0 batch_normalization_80[0][0]
__________________________________________________________________________________________________
conv2d_81 (Conv2D) (None, 14, 14, 1024) 263168 activation_74[0][0]
__________________________________________________________________________________________________
conv2d_78 (Conv2D) (None, 14, 14, 1024) 525312 activation_72[0][0]
__________________________________________________________________________________________________
batch_normalization_81 (BatchNo (None, 14, 14, 1024) 4096 conv2d_81[0][0]
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, 14, 14, 1024) 4096 conv2d_78[0][0]
__________________________________________________________________________________________________
add_24 (Add) (None, 14, 14, 1024) 0 batch_normalization_81[0][0]
batch_normalization_78[0][0]
__________________________________________________________________________________________________
activation_75 (Activation) (None, 14, 14, 1024) 0 add_24[0][0]
__________________________________________________________________________________________________
conv2d_82 (Conv2D) (None, 14, 14, 256) 262400 activation_75[0][0]
__________________________________________________________________________________________________
batch_normalization_82 (BatchNo (None, 14, 14, 256) 1024 conv2d_82[0][0]
__________________________________________________________________________________________________
activation_76 (Activation) (None, 14, 14, 256) 0 batch_normalization_82[0][0]
__________________________________________________________________________________________________
conv2d_83 (Conv2D) (None, 14, 14, 256) 590080 activation_76[0][0]
__________________________________________________________________________________________________
batch_normalization_83 (BatchNo (None, 14, 14, 256) 1024 conv2d_83[0][0]
__________________________________________________________________________________________________
activation_77 (Activation) (None, 14, 14, 256) 0 batch_normalization_83[0][0]
__________________________________________________________________________________________________
conv2d_84 (Conv2D) (None, 14, 14, 1024) 263168 activation_77[0][0]
__________________________________________________________________________________________________
batch_normalization_84 (BatchNo (None, 14, 14, 1024) 4096 conv2d_84[0][0]
__________________________________________________________________________________________________
add_25 (Add) (None, 14, 14, 1024) 0 batch_normalization_84[0][0]
activation_75[0][0]
__________________________________________________________________________________________________
activation_78 (Activation) (None, 14, 14, 1024) 0 add_25[0][0]
__________________________________________________________________________________________________
conv2d_85 (Conv2D) (None, 14, 14, 256) 262400 activation_78[0][0]
__________________________________________________________________________________________________
batch_normalization_85 (BatchNo (None, 14, 14, 256) 1024 conv2d_85[0][0]
__________________________________________________________________________________________________
activation_79 (Activation) (None, 14, 14, 256) 0 batch_normalization_85[0][0]
__________________________________________________________________________________________________
conv2d_86 (Conv2D) (None, 14, 14, 256) 590080 activation_79[0][0]
__________________________________________________________________________________________________
batch_normalization_86 (BatchNo (None, 14, 14, 256) 1024 conv2d_86[0][0]
__________________________________________________________________________________________________
activation_80 (Activation) (None, 14, 14, 256) 0 batch_normalization_86[0][0]
__________________________________________________________________________________________________
conv2d_87 (Conv2D) (None, 14, 14, 1024) 263168 activation_80[0][0]
__________________________________________________________________________________________________
batch_normalization_87 (BatchNo (None, 14, 14, 1024) 4096 conv2d_87[0][0]
__________________________________________________________________________________________________
add_26 (Add) (None, 14, 14, 1024) 0 batch_normalization_87[0][0]
activation_78[0][0]
__________________________________________________________________________________________________
activation_81 (Activation) (None, 14, 14, 1024) 0 add_26[0][0]
__________________________________________________________________________________________________
conv2d_88 (Conv2D) (None, 14, 14, 256) 262400 activation_81[0][0]
__________________________________________________________________________________________________
batch_normalization_88 (BatchNo (None, 14, 14, 256) 1024 conv2d_88[0][0]
__________________________________________________________________________________________________
activation_82 (Activation) (None, 14, 14, 256) 0 batch_normalization_88[0][0]
__________________________________________________________________________________________________
conv2d_89 (Conv2D) (None, 14, 14, 256) 590080 activation_82[0][0]
__________________________________________________________________________________________________
batch_normalization_89 (BatchNo (None, 14, 14, 256) 1024 conv2d_89[0][0]
__________________________________________________________________________________________________
activation_83 (Activation) (None, 14, 14, 256) 0 batch_normalization_89[0][0]
__________________________________________________________________________________________________
conv2d_90 (Conv2D) (None, 14, 14, 1024) 263168 activation_83[0][0]
__________________________________________________________________________________________________
batch_normalization_90 (BatchNo (None, 14, 14, 1024) 4096 conv2d_90[0][0]
__________________________________________________________________________________________________
add_27 (Add) (None, 14, 14, 1024) 0 batch_normalization_90[0][0]
activation_81[0][0]
__________________________________________________________________________________________________
activation_84 (Activation) (None, 14, 14, 1024) 0 add_27[0][0]
__________________________________________________________________________________________________
conv2d_91 (Conv2D) (None, 14, 14, 256) 262400 activation_84[0][0]
__________________________________________________________________________________________________
batch_normalization_91 (BatchNo (None, 14, 14, 256) 1024 conv2d_91[0][0]
__________________________________________________________________________________________________
activation_85 (Activation) (None, 14, 14, 256) 0 batch_normalization_91[0][0]
__________________________________________________________________________________________________
conv2d_92 (Conv2D) (None, 14, 14, 256) 590080 activation_85[0][0]
__________________________________________________________________________________________________
batch_normalization_92 (BatchNo (None, 14, 14, 256) 1024 conv2d_92[0][0]
__________________________________________________________________________________________________
activation_86 (Activation) (None, 14, 14, 256) 0 batch_normalization_92[0][0]
__________________________________________________________________________________________________
conv2d_93 (Conv2D) (None, 14, 14, 1024) 263168 activation_86[0][0]
__________________________________________________________________________________________________
batch_normalization_93 (BatchNo (None, 14, 14, 1024) 4096 conv2d_93[0][0]
__________________________________________________________________________________________________
add_28 (Add) (None, 14, 14, 1024) 0 batch_normalization_93[0][0]
activation_84[0][0]
__________________________________________________________________________________________________
activation_87 (Activation) (None, 14, 14, 1024) 0 add_28[0][0]
__________________________________________________________________________________________________
conv2d_94 (Conv2D) (None, 14, 14, 256) 262400 activation_87[0][0]
__________________________________________________________________________________________________
batch_normalization_94 (BatchNo (None, 14, 14, 256) 1024 conv2d_94[0][0]
__________________________________________________________________________________________________
activation_88 (Activation) (None, 14, 14, 256) 0 batch_normalization_94[0][0]
__________________________________________________________________________________________________
conv2d_95 (Conv2D) (None, 14, 14, 256) 590080 activation_88[0][0]
__________________________________________________________________________________________________
batch_normalization_95 (BatchNo (None, 14, 14, 256) 1024 conv2d_95[0][0]
__________________________________________________________________________________________________
activation_89 (Activation) (None, 14, 14, 256) 0 batch_normalization_95[0][0]
__________________________________________________________________________________________________
conv2d_96 (Conv2D) (None, 14, 14, 1024) 263168 activation_89[0][0]
__________________________________________________________________________________________________
batch_normalization_96 (BatchNo (None, 14, 14, 1024) 4096 conv2d_96[0][0]
__________________________________________________________________________________________________
add_29 (Add) (None, 14, 14, 1024) 0 batch_normalization_96[0][0]
activation_87[0][0]
__________________________________________________________________________________________________
activation_90 (Activation) (None, 14, 14, 1024) 0 add_29[0][0]
__________________________________________________________________________________________________
conv2d_98 (Conv2D) (None, 7, 7, 512) 524800 activation_90[0][0]
__________________________________________________________________________________________________
batch_normalization_98 (BatchNo (None, 7, 7, 512) 2048 conv2d_98[0][0]
__________________________________________________________________________________________________
activation_91 (Activation) (None, 7, 7, 512) 0 batch_normalization_98[0][0]
__________________________________________________________________________________________________
conv2d_99 (Conv2D) (None, 7, 7, 512) 2359808 activation_91[0][0]
__________________________________________________________________________________________________
batch_normalization_99 (BatchNo (None, 7, 7, 512) 2048 conv2d_99[0][0]
__________________________________________________________________________________________________
activation_92 (Activation) (None, 7, 7, 512) 0 batch_normalization_99[0][0]
__________________________________________________________________________________________________
conv2d_100 (Conv2D) (None, 7, 7, 2048) 1050624 activation_92[0][0]
__________________________________________________________________________________________________
conv2d_97 (Conv2D) (None, 7, 7, 2048) 2099200 activation_90[0][0]
__________________________________________________________________________________________________
batch_normalization_100 (BatchN (None, 7, 7, 2048) 8192 conv2d_100[0][0]
__________________________________________________________________________________________________
batch_normalization_97 (BatchNo (None, 7, 7, 2048) 8192 conv2d_97[0][0]
__________________________________________________________________________________________________
add_30 (Add) (None, 7, 7, 2048) 0 batch_normalization_100[0][0]
batch_normalization_97[0][0]
__________________________________________________________________________________________________
activation_93 (Activation) (None, 7, 7, 2048) 0 add_30[0][0]
__________________________________________________________________________________________________
conv2d_101 (Conv2D) (None, 7, 7, 512) 1049088 activation_93[0][0]
__________________________________________________________________________________________________
batch_normalization_101 (BatchN (None, 7, 7, 512) 2048 conv2d_101[0][0]
__________________________________________________________________________________________________
activation_94 (Activation) (None, 7, 7, 512) 0 batch_normalization_101[0][0]
__________________________________________________________________________________________________
conv2d_102 (Conv2D) (None, 7, 7, 512) 2359808 activation_94[0][0]
__________________________________________________________________________________________________
batch_normalization_102 (BatchN (None, 7, 7, 512) 2048 conv2d_102[0][0]
__________________________________________________________________________________________________
activation_95 (Activation) (None, 7, 7, 512) 0 batch_normalization_102[0][0]
__________________________________________________________________________________________________
conv2d_103 (Conv2D) (None, 7, 7, 2048) 1050624 activation_95[0][0]
__________________________________________________________________________________________________
batch_normalization_103 (BatchN (None, 7, 7, 2048) 8192 conv2d_103[0][0]
__________________________________________________________________________________________________
add_31 (Add) (None, 7, 7, 2048) 0 batch_normalization_103[0][0]
activation_93[0][0]
__________________________________________________________________________________________________
activation_96 (Activation) (None, 7, 7, 2048) 0 add_31[0][0]
__________________________________________________________________________________________________
conv2d_104 (Conv2D) (None, 7, 7, 512) 1049088 activation_96[0][0]
__________________________________________________________________________________________________
batch_normalization_104 (BatchN (None, 7, 7, 512) 2048 conv2d_104[0][0]
__________________________________________________________________________________________________
activation_97 (Activation) (None, 7, 7, 512) 0 batch_normalization_104[0][0]
__________________________________________________________________________________________________
conv2d_105 (Conv2D) (None, 7, 7, 512) 2359808 activation_97[0][0]
__________________________________________________________________________________________________
batch_normalization_105 (BatchN (None, 7, 7, 512) 2048 conv2d_105[0][0]
__________________________________________________________________________________________________
activation_98 (Activation) (None, 7, 7, 512) 0 batch_normalization_105[0][0]
__________________________________________________________________________________________________
conv2d_106 (Conv2D) (None, 7, 7, 2048) 1050624 activation_98[0][0]
__________________________________________________________________________________________________
batch_normalization_106 (BatchN (None, 7, 7, 2048) 8192 conv2d_106[0][0]
__________________________________________________________________________________________________
add_32 (Add) (None, 7, 7, 2048) 0 batch_normalization_106[0][0]
activation_96[0][0]
__________________________________________________________________________________________________
activation_99 (Activation) (None, 7, 7, 2048) 0 add_32[0][0]
__________________________________________________________________________________________________
global_avg_pooling (GlobalAvera (None, 2048) 0 activation_99[0][0]
__________________________________________________________________________________________________
dense_2 (Dense) (None, 3) 6147 global_avg_pooling[0][0]
__________________________________________________________________________________________________
activation_100 (Activation) (None, 3) 0 dense_2[0][0]
==================================================================================================
Total params: 23,593,859
Trainable params: 23,540,739
Non-trainable params: 53,120
__________________________________________________________________________________________________
Using Enhanced Data Generation
Found 300 images belonging to 3 classes.
Found 120 images belonging to 3 classes.
JSON Mapping for the model classes saved to F:\nn\OlafenwaMoses\customModelTraining\idenprof\json\model_class.json
Number of experiments (Epochs) : 10
Epoch 1/10
8/9 [=========================>....] - ETA: 52s - loss: 1.9142 - acc: 0.3464
Epoch 00001: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-001_acc-0.354167.h5
9/9 [==============================] - 533s 59s/step - loss: 1.8041 - acc: 0.3671 - val_loss: 9.9489 - val_acc: 0.3542
Epoch 2/10
8/9 [=========================>....] - ETA: 56s - loss: 1.2488 - acc: 0.5117
Epoch 00002: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-002_acc-0.322917.h5
9/9 [==============================] - 549s 61s/step - loss: 1.2566 - acc: 0.5019 - val_loss: 1.1253 - val_acc: 0.3229
Epoch 3/10
8/9 [=========================>....] - ETA: 52s - loss: 1.3101 - acc: 0.4570
Epoch 00003: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-003_acc-0.322917.h5
9/9 [==============================] - 517s 57s/step - loss: 1.3536 - acc: 0.4545 - val_loss: 1.2735 - val_acc: 0.3229
Epoch 4/10
8/9 [=========================>....] - ETA: 56s - loss: 1.1787 - acc: 0.4531
Epoch 00004: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-004_acc-0.302083.h5
9/9 [==============================] - 550s 61s/step - loss: 1.1517 - acc: 0.4792 - val_loss: 1.4054 - val_acc: 0.3021
Epoch 5/10
8/9 [=========================>....] - ETA: 47s - loss: 1.0439 - acc: 0.5065
Epoch 00005: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-005_acc-0.302083.h5
9/9 [==============================] - 468s 52s/step - loss: 1.0489 - acc: 0.5072 - val_loss: 1.6589 - val_acc: 0.3021
Epoch 6/10
8/9 [=========================>....] - ETA: 55s - loss: 1.0135 - acc: 0.5586
Epoch 00006: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-006_acc-0.302083.h5
9/9 [==============================] - 546s 61s/step - loss: 1.0391 - acc: 0.5625 - val_loss: 1.5787 - val_acc: 0.3021
Epoch 7/10
8/9 [=========================>....] - ETA: 50s - loss: 1.1323 - acc: 0.4935
Epoch 00007: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-007_acc-0.302083.h5
9/9 [==============================] - 498s 55s/step - loss: 1.1025 - acc: 0.4967 - val_loss: 1.5267 - val_acc: 0.3021
Epoch 8/10
8/9 [=========================>....] - ETA: 51s - loss: 1.1020 - acc: 0.5195
Epoch 00008: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-008_acc-0.302083.h5
9/9 [==============================] - 508s 56s/step - loss: 1.0763 - acc: 0.5315 - val_loss: 1.4837 - val_acc: 0.3021
Epoch 9/10
8/9 [=========================>....] - ETA: 52s - loss: 0.9851 - acc: 0.4648
Epoch 00009: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-009_acc-0.302083.h5
9/9 [==============================] - 513s 57s/step - loss: 0.9761 - acc: 0.4879 - val_loss: 1.4427 - val_acc: 0.3021
Epoch 10/10
8/9 [=========================>....] - ETA: 51s - loss: 1.0747 - acc: 0.4961
Epoch 00010: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-010_acc-0.302083.h5
9/9 [==============================] - 507s 56s/step - loss: 1.0479 - acc: 0.5035 - val_loss: 1.4121 - val_acc: 0.3021

The results above are poor because both the amount of training data and the number of epochs are small; they are shown for reference only.
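
Once training finishes, the checkpoints saved under idenprof/models and the class mapping under idenprof/json can be used for prediction. Below is a minimal sketch assuming the same ImageAI 2.x release used above (its CustomImagePrediction class); the checkpoint name is copied from this run's log and will differ for your run, and "person.jpg" is a placeholder test image.

import os
from imageai.Prediction.Custom import CustomImagePrediction

execution_path = os.getcwd()

predictor = CustomImagePrediction()
predictor.setModelTypeAsResNet()  # must match the architecture used for training
# checkpoint and JSON written by the training run above (names will differ per run)
predictor.setModelPath(os.path.join(execution_path, "idenprof", "models", "model_ex-010_acc-0.302083.h5"))
predictor.setJsonPath(os.path.join(execution_path, "idenprof", "json", "model_class.json"))
predictor.loadModel(num_objects=3)  # same number of classes as in training

# "person.jpg" is a placeholder; use any image of one of the trained professions
predictions, probabilities = predictor.predictImage(os.path.join(execution_path, "person.jpg"), result_count=3)
for class_name, probability in zip(predictions, probabilities):
    print(class_name, ":", probability)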

IdenProf image dataset and sample code: link: https://pan.baidu.com/s/1vHXNDaSugbGjMaWWratQqA extraction code: x8m7

Feel free to leave a comment if you have any questions.

