[Image Recognition] Using "Caffe" on Windows! CIFAR-10 Edition


Introduction

So far I have been trying out various things with the Deep Learning framework "Caffe"
using the MNIST data.

Since it's about time to try something different,
from this post on I'll be working with CIFAR-10.
#CIFAR-10 is a dataset packed with images in 10 categories.
#See here for details.

This post walks through the whole process at once, from downloading CIFAR-10 to training.

Getting the Data

First, download the dataset from here.
The dataset to download is
"CIFAR-10 binary version (suitable for C programs)".
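If you prefer to script the download, here is a minimal Python sketch. The URL is the official one from the CIFAR-10 page; the extraction path is just an assumption chosen to match the next step.

import tarfile
import urllib.request

# Download the official binary-version archive and unpack it so that
# cifar-10-batches-bin lands under the Caffe examples folder (assumed path).
url = 'https://www.cs.toronto.edu/~kriz/cifar-10-binary.tar.gz'
urllib.request.urlretrieve(url, 'cifar-10-binary.tar.gz')
with tarfile.open('cifar-10-binary.tar.gz', 'r:gz') as tar:
    tar.extractall(r'caffe-master\Build\x64\Release\examples\cifar10')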

Data Conversion

Place the downloaded dataset under:
"caffe-master\Build\x64\Release\examples\cifar10"

Then run the following, and the data is converted just as it was for MNIST.

※ The current folder is "caffe-master\Build\x64\Release"
convert_cifar_data.exe .\examples\cifar10\cifar-10-batches-bin examples/cifar10/ lmdb
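For reference, here is a minimal sketch of the binary layout that convert_cifar_data.exe consumes: each record in a batch file is 3073 bytes, one label byte followed by a 3x32x32 block of pixel bytes (red, green, and blue planes, row-major). The file path below assumes the dataset extracted in the previous step.

import numpy as np

# Read the first record of the first training batch (10,000 records per file).
with open(r'examples\cifar10\cifar-10-batches-bin\data_batch_1.bin', 'rb') as f:
    record = f.read(3073)
label = record[0]  # class index 0-9
image = np.frombuffer(record[1:], dtype=np.uint8).reshape(3, 32, 32)
print(label, image.shape)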

Next, run the following as well.

※ The current folder is "caffe-master\Build\x64\Release"
set EXAMPLE=examples/cifar10
set DATA=data/cifar10
set DBTYPE=lmdb

echo "Computing image mean..."

compute_image_mean.exe -backend=%DBTYPE% %EXAMPLE%/cifar10_train_%DBTYPE% %EXAMPLE%/mean.binaryproto
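The mean.binaryproto produced here is the per-pixel mean over the training set, which the data layers subtract from every image. If pycaffe is built, you can inspect it with a short sketch like this (paths assumed as above):

import caffe
import numpy as np

# Parse the serialized BlobProto and convert it to a numpy array.
blob = caffe.proto.caffe_pb2.BlobProto()
with open(r'examples\cifar10\mean.binaryproto', 'rb') as f:
    blob.ParseFromString(f.read())
mean = caffe.io.blobproto_to_array(blob)[0]  # shape (3, 32, 32)
print(mean.shape, mean.mean(axis=(1, 2)))    # per-channel averages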

Running the Training

Set the current folder to
"caffe-master\Build\x64\Release",
then run the following to start training and testing.
#On my machine this took about 1.5 hours.

caffe.exe train --solver=examples\cifar10\cifar10_full_solver.prototxt

The output of the run is shown below.

As the fourth line from the bottom shows,
the classification accuracy comes out to 78.33%.
Even after all that training time, it's not all that good (^_^;

I0702 14:01:09.027565 18620 caffe.cpp:186] Using GPUs 0
I0702 14:01:09.328008 18620 caffe.cpp:191] GPU 0: GeForce GTX 745
I0702 14:01:09.543774 18620 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I0702 14:01:09.543774 18620 solver.cpp:48] Initializing solver from parameters:
test_iter: 100
test_interval: 1000
base_lr: 0.001
display: 200
max_iter: 60000
lr_policy: "fixed"
momentum: 0.9
weight_decay: 0.004
snapshot: 10000
snapshot_prefix: "examples/cifar10/cifar10_full"
solver_mode: GPU
device_id: 0
net: "examples/cifar10/cifar10_full_train_test.prototxt"
snapshot_format: HDF5
I0702 14:01:09.543774 18620 solver.cpp:91] Creating training net from net file: examples/cifar10/cifar10_full_train_test.prototxt
I0702 14:01:09.543774 18620 net.cpp:313] The NetState phase (0) differed from the phase (1) specified by a rule in layer cifar
I0702 14:01:09.543774 18620 net.cpp:313] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
I0702 14:01:09.543774 18620 net.cpp:49] Initializing net from parameters:
name: "CIFAR10_full"
state {
  phase: TRAIN
}
layer {
  name: "cifar"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    mean_file: "examples/cifar10/mean.binaryproto"
  }
  data_param {
    source: "examples/cifar10/cifar10_train_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.0001
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 3
    alpha: 5e-005
    beta: 0.75
    norm_region: WITHIN_CHANNEL
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 3
    alpha: 5e-005
    beta: 0.75
    norm_region: WITHIN_CHANNEL
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3"
  top: "pool3"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool3"
  top: "ip1"
  param {
    lr_mult: 1
    decay_mult: 250
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip1"
  bottom: "label"
  top: "loss"
}
I0702 14:01:09.577064 18620 layer_factory.hpp:77] Creating layer cifar
I0702 14:01:09.578063 18620 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I0702 14:01:09.580410 18620 net.cpp:91] Creating Layer cifar
I0702 14:01:09.581202 17188 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I0702 14:01:09.581202 18620 net.cpp:399] cifar -> data
I0702 14:01:09.581202 18620 net.cpp:399] cifar -> label
I0702 14:01:09.581202 17188 db_lmdb.cpp:40] Opened lmdb examples/cifar10/cifar10_train_lmdb
I0702 14:01:09.581202 18620 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
I0702 14:01:09.828881 18620 data_layer.cpp:41] output data size: 100,3,32,32
I0702 14:01:09.844481 18620 net.cpp:141] Setting up cifar
I0702 14:01:09.844481 18620 net.cpp:148] Top shape: 100 3 32 32 (307200)
I0702 14:01:09.844481  1388 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I0702 14:01:09.844481 18620 net.cpp:148] Top shape: 100 (100)
I0702 14:01:09.844481 18620 net.cpp:156] Memory required for data: 1229200
I0702 14:01:09.844481 18620 layer_factory.hpp:77] Creating layer conv1
I0702 14:01:09.844481 18620 net.cpp:91] Creating Layer conv1
I0702 14:01:09.844481 18620 net.cpp:425] conv1 <- data
I0702 14:01:09.844481 18620 net.cpp:399] conv1 -> conv1
I0702 14:01:10.060221 18620 net.cpp:141] Setting up conv1
I0702 14:01:10.060221 18620 net.cpp:148] Top shape: 100 32 32 32 (3276800)
I0702 14:01:10.060221 18620 net.cpp:156] Memory required for data: 14336400
I0702 14:01:10.060221 18620 layer_factory.hpp:77] Creating layer pool1
I0702 14:01:10.060221 18620 net.cpp:91] Creating Layer pool1
I0702 14:01:10.060221 18620 net.cpp:425] pool1 <- conv1
I0702 14:01:10.060221 18620 net.cpp:399] pool1 -> pool1
I0702 14:01:10.060221 18620 net.cpp:141] Setting up pool1
I0702 14:01:10.060221 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.060221 18620 net.cpp:156] Memory required for data: 17613200
I0702 14:01:10.060221 18620 layer_factory.hpp:77] Creating layer relu1
I0702 14:01:10.060221 18620 net.cpp:91] Creating Layer relu1
I0702 14:01:10.060221 18620 net.cpp:425] relu1 <- pool1
I0702 14:01:10.060221 18620 net.cpp:386] relu1 -> pool1 (in-place)
I0702 14:01:10.076233 18620 net.cpp:141] Setting up relu1
I0702 14:01:10.076736 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.077767 18620 net.cpp:156] Memory required for data: 20890000
I0702 14:01:10.078240 18620 layer_factory.hpp:77] Creating layer norm1
I0702 14:01:10.079272 18620 net.cpp:91] Creating Layer norm1
I0702 14:01:10.080246 18620 net.cpp:425] norm1 <- pool1
I0702 14:01:10.080747 18620 net.cpp:399] norm1 -> norm1
I0702 14:01:10.081749 18620 net.cpp:141] Setting up norm1
I0702 14:01:10.081749 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.081749 18620 net.cpp:156] Memory required for data: 24166800
I0702 14:01:10.081749 18620 layer_factory.hpp:77] Creating layer conv2
I0702 14:01:10.081749 18620 net.cpp:91] Creating Layer conv2
I0702 14:01:10.081749 18620 net.cpp:425] conv2 <- norm1
I0702 14:01:10.081749 18620 net.cpp:399] conv2 -> conv2
I0702 14:01:10.081749 18620 net.cpp:141] Setting up conv2
I0702 14:01:10.081749 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.081749 18620 net.cpp:156] Memory required for data: 27443600
I0702 14:01:10.081749 18620 layer_factory.hpp:77] Creating layer relu2
I0702 14:01:10.081749 18620 net.cpp:91] Creating Layer relu2
I0702 14:01:10.081749 18620 net.cpp:425] relu2 <- conv2
I0702 14:01:10.081749 18620 net.cpp:386] relu2 -> conv2 (in-place)
I0702 14:01:10.081749 18620 net.cpp:141] Setting up relu2
I0702 14:01:10.081749 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.081749 18620 net.cpp:156] Memory required for data: 30720400
I0702 14:01:10.081749 18620 layer_factory.hpp:77] Creating layer pool2
I0702 14:01:10.097611 18620 net.cpp:91] Creating Layer pool2
I0702 14:01:10.097611 18620 net.cpp:425] pool2 <- conv2
I0702 14:01:10.097611 18620 net.cpp:399] pool2 -> pool2
I0702 14:01:10.097611 18620 net.cpp:141] Setting up pool2
I0702 14:01:10.097611 18620 net.cpp:148] Top shape: 100 32 8 8 (204800)
I0702 14:01:10.097611 18620 net.cpp:156] Memory required for data: 31539600
I0702 14:01:10.097611 18620 layer_factory.hpp:77] Creating layer norm2
I0702 14:01:10.097611 18620 net.cpp:91] Creating Layer norm2
I0702 14:01:10.097611 18620 net.cpp:425] norm2 <- pool2
I0702 14:01:10.097611 18620 net.cpp:399] norm2 -> norm2
I0702 14:01:10.113237 18620 net.cpp:141] Setting up norm2
I0702 14:01:10.113237 18620 net.cpp:148] Top shape: 100 32 8 8 (204800)
I0702 14:01:10.113237 18620 net.cpp:156] Memory required for data: 32358800
I0702 14:01:10.113237 18620 layer_factory.hpp:77] Creating layer conv3
I0702 14:01:10.113237 18620 net.cpp:91] Creating Layer conv3
I0702 14:01:10.113237 18620 net.cpp:425] conv3 <- norm2
I0702 14:01:10.113237 18620 net.cpp:399] conv3 -> conv3
I0702 14:01:10.113237 18620 net.cpp:141] Setting up conv3
I0702 14:01:10.128885 18620 net.cpp:148] Top shape: 100 64 8 8 (409600)
I0702 14:01:10.128885 18620 net.cpp:156] Memory required for data: 33997200
I0702 14:01:10.128885 18620 layer_factory.hpp:77] Creating layer relu3
I0702 14:01:10.128885 18620 net.cpp:91] Creating Layer relu3
I0702 14:01:10.128885 18620 net.cpp:425] relu3 <- conv3
I0702 14:01:10.128885 18620 net.cpp:386] relu3 -> conv3 (in-place)
I0702 14:01:10.128885 18620 net.cpp:141] Setting up relu3
I0702 14:01:10.128885 18620 net.cpp:148] Top shape: 100 64 8 8 (409600)
I0702 14:01:10.128885 18620 net.cpp:156] Memory required for data: 35635600
I0702 14:01:10.128885 18620 layer_factory.hpp:77] Creating layer pool3
I0702 14:01:10.128885 18620 net.cpp:91] Creating Layer pool3
I0702 14:01:10.128885 18620 net.cpp:425] pool3 <- conv3
I0702 14:01:10.128885 18620 net.cpp:399] pool3 -> pool3
I0702 14:01:10.128885 18620 net.cpp:141] Setting up pool3
I0702 14:01:10.128885 18620 net.cpp:148] Top shape: 100 64 4 4 (102400)
I0702 14:01:10.128885 18620 net.cpp:156] Memory required for data: 36045200
I0702 14:01:10.144491 18620 layer_factory.hpp:77] Creating layer ip1
I0702 14:01:10.144491 18620 net.cpp:91] Creating Layer ip1
I0702 14:01:10.144491 18620 net.cpp:425] ip1 <- pool3
I0702 14:01:10.144491 18620 net.cpp:399] ip1 -> ip1
I0702 14:01:10.144491 18620 net.cpp:141] Setting up ip1
I0702 14:01:10.144491 18620 net.cpp:148] Top shape: 100 10 (1000)
I0702 14:01:10.144491 18620 net.cpp:156] Memory required for data: 36049200
I0702 14:01:10.144491 18620 layer_factory.hpp:77] Creating layer loss
I0702 14:01:10.144491 18620 net.cpp:91] Creating Layer loss
I0702 14:01:10.144491 18620 net.cpp:425] loss <- ip1
I0702 14:01:10.144491 18620 net.cpp:425] loss <- label
I0702 14:01:10.144491 18620 net.cpp:399] loss -> loss
I0702 14:01:10.144491 18620 layer_factory.hpp:77] Creating layer loss
I0702 14:01:10.144491 18620 net.cpp:141] Setting up loss
I0702 14:01:10.144491 18620 net.cpp:148] Top shape: (1)
I0702 14:01:10.160118 18620 net.cpp:151]     with loss weight 1
I0702 14:01:10.160118 18620 net.cpp:156] Memory required for data: 36049204
I0702 14:01:10.160118 18620 net.cpp:217] loss needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] ip1 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] pool3 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] relu3 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] conv3 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] norm2 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] pool2 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] relu2 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] conv2 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] norm1 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] relu1 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] pool1 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:217] conv1 needs backward computation.
I0702 14:01:10.160118 18620 net.cpp:219] cifar does not need backward computation.
I0702 14:01:10.160118 18620 net.cpp:261] This network produces output loss
I0702 14:01:10.160118 18620 net.cpp:274] Network initialization done.
I0702 14:01:10.160118 18620 solver.cpp:181] Creating test net (#0) specified by net file: examples/cifar10/cifar10_full_train_test.prototxt
I0702 14:01:10.175745 18620 net.cpp:313] The NetState phase (1) differed from the phase (0) specified by a rule in layer cifar
I0702 14:01:10.176853 18620 net.cpp:49] Initializing net from parameters:
name: "CIFAR10_full"
state {
  phase: TEST
}
layer {
  name: "cifar"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    mean_file: "examples/cifar10/mean.binaryproto"
  }
  data_param {
    source: "examples/cifar10/cifar10_test_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.0001
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 3
    alpha: 5e-005
    beta: 0.75
    norm_region: WITHIN_CHANNEL
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 32
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 3
    alpha: 5e-005
    beta: 0.75
    norm_region: WITHIN_CHANNEL
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 64
    pad: 2
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "pool3"
  type: "Pooling"
  bottom: "conv3"
  top: "pool3"
  pooling_param {
    pool: AVE
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool3"
  top: "ip1"
  param {
    lr_mult: 1
    decay_mult: 250
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip1"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip1"
  bottom: "label"
  top: "loss"
}
I0702 14:01:10.182368 18620 layer_factory.hpp:77] Creating layer cifar
I0702 14:01:10.198014 18620 net.cpp:91] Creating Layer cifar
I0702 14:01:10.198014 18620 net.cpp:399] cifar -> data
I0702 14:01:10.198014 18620 net.cpp:399] cifar -> label
I0702 14:01:10.198014 18620 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
I0702 14:01:10.198014 16216 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I0702 14:01:10.198014 16216 db_lmdb.cpp:40] Opened lmdb examples/cifar10/cifar10_test_lmdb
I0702 14:01:10.198014 18620 data_layer.cpp:41] output data size: 100,3,32,32
I0702 14:01:10.198014 18620 net.cpp:141] Setting up cifar
I0702 14:01:10.198014 18620 net.cpp:148] Top shape: 100 3 32 32 (307200)
I0702 14:01:10.198014  6692 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I0702 14:01:10.198014 18620 net.cpp:148] Top shape: 100 (100)
I0702 14:01:10.213624 18620 net.cpp:156] Memory required for data: 1229200
I0702 14:01:10.213624 18620 layer_factory.hpp:77] Creating layer label_cifar_1_split
I0702 14:01:10.213624 18620 net.cpp:91] Creating Layer label_cifar_1_split
I0702 14:01:10.213624 18620 net.cpp:425] label_cifar_1_split <- label
I0702 14:01:10.213624 18620 net.cpp:399] label_cifar_1_split -> label_cifar_1_split_0
I0702 14:01:10.213624 18620 net.cpp:399] label_cifar_1_split -> label_cifar_1_split_1
I0702 14:01:10.213624 18620 net.cpp:141] Setting up label_cifar_1_split
I0702 14:01:10.213624 18620 net.cpp:148] Top shape: 100 (100)
I0702 14:01:10.213624 18620 net.cpp:148] Top shape: 100 (100)
I0702 14:01:10.213624 18620 net.cpp:156] Memory required for data: 1230000
I0702 14:01:10.213624 18620 layer_factory.hpp:77] Creating layer conv1
I0702 14:01:10.213624 18620 net.cpp:91] Creating Layer conv1
I0702 14:01:10.213624 18620 net.cpp:425] conv1 <- data
I0702 14:01:10.213624 18620 net.cpp:399] conv1 -> conv1
I0702 14:01:10.229250 18620 net.cpp:141] Setting up conv1
I0702 14:01:10.229250 18620 net.cpp:148] Top shape: 100 32 32 32 (3276800)
I0702 14:01:10.229250 18620 net.cpp:156] Memory required for data: 14337200
I0702 14:01:10.229250 18620 layer_factory.hpp:77] Creating layer pool1
I0702 14:01:10.229250 18620 net.cpp:91] Creating Layer pool1
I0702 14:01:10.229250 18620 net.cpp:425] pool1 <- conv1
I0702 14:01:10.229250 18620 net.cpp:399] pool1 -> pool1
I0702 14:01:10.229250 18620 net.cpp:141] Setting up pool1
I0702 14:01:10.229250 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.229250 18620 net.cpp:156] Memory required for data: 17614000
I0702 14:01:10.229250 18620 layer_factory.hpp:77] Creating layer relu1
I0702 14:01:10.229250 18620 net.cpp:91] Creating Layer relu1
I0702 14:01:10.229250 18620 net.cpp:425] relu1 <- pool1
I0702 14:01:10.229250 18620 net.cpp:386] relu1 -> pool1 (in-place)
I0702 14:01:10.229250 18620 net.cpp:141] Setting up relu1
I0702 14:01:10.229250 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.229250 18620 net.cpp:156] Memory required for data: 20890800
I0702 14:01:10.229250 18620 layer_factory.hpp:77] Creating layer norm1
I0702 14:01:10.244875 18620 net.cpp:91] Creating Layer norm1
I0702 14:01:10.244875 18620 net.cpp:425] norm1 <- pool1
I0702 14:01:10.244875 18620 net.cpp:399] norm1 -> norm1
I0702 14:01:10.244875 18620 net.cpp:141] Setting up norm1
I0702 14:01:10.244875 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.244875 18620 net.cpp:156] Memory required for data: 24167600
I0702 14:01:10.244875 18620 layer_factory.hpp:77] Creating layer conv2
I0702 14:01:10.244875 18620 net.cpp:91] Creating Layer conv2
I0702 14:01:10.244875 18620 net.cpp:425] conv2 <- norm1
I0702 14:01:10.244875 18620 net.cpp:399] conv2 -> conv2
I0702 14:01:10.244875 18620 net.cpp:141] Setting up conv2
I0702 14:01:10.244875 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.244875 18620 net.cpp:156] Memory required for data: 27444400
I0702 14:01:10.244875 18620 layer_factory.hpp:77] Creating layer relu2
I0702 14:01:10.244875 18620 net.cpp:91] Creating Layer relu2
I0702 14:01:10.244875 18620 net.cpp:425] relu2 <- conv2
I0702 14:01:10.244875 18620 net.cpp:386] relu2 -> conv2 (in-place)
I0702 14:01:10.260509 18620 net.cpp:141] Setting up relu2
I0702 14:01:10.260509 18620 net.cpp:148] Top shape: 100 32 16 16 (819200)
I0702 14:01:10.260509 18620 net.cpp:156] Memory required for data: 30721200
I0702 14:01:10.260509 18620 layer_factory.hpp:77] Creating layer pool2
I0702 14:01:10.260509 18620 net.cpp:91] Creating Layer pool2
I0702 14:01:10.260509 18620 net.cpp:425] pool2 <- conv2
I0702 14:01:10.260509 18620 net.cpp:399] pool2 -> pool2
I0702 14:01:10.260509 18620 net.cpp:141] Setting up pool2
I0702 14:01:10.260509 18620 net.cpp:148] Top shape: 100 32 8 8 (204800)
I0702 14:01:10.260509 18620 net.cpp:156] Memory required for data: 31540400
I0702 14:01:10.260509 18620 layer_factory.hpp:77] Creating layer norm2
I0702 14:01:10.260509 18620 net.cpp:91] Creating Layer norm2
I0702 14:01:10.260509 18620 net.cpp:425] norm2 <- pool2
I0702 14:01:10.260509 18620 net.cpp:399] norm2 -> norm2
I0702 14:01:10.260509 18620 net.cpp:141] Setting up norm2
I0702 14:01:10.260509 18620 net.cpp:148] Top shape: 100 32 8 8 (204800)
I0702 14:01:10.260509 18620 net.cpp:156] Memory required for data: 32359600
I0702 14:01:10.260509 18620 layer_factory.hpp:77] Creating layer conv3
I0702 14:01:10.260509 18620 net.cpp:91] Creating Layer conv3
I0702 14:01:10.260509 18620 net.cpp:425] conv3 <- norm2
I0702 14:01:10.276922 18620 net.cpp:399] conv3 -> conv3
I0702 14:01:10.278926 18620 net.cpp:141] Setting up conv3
I0702 14:01:10.279428 18620 net.cpp:148] Top shape: 100 64 8 8 (409600)
I0702 14:01:10.280431 18620 net.cpp:156] Memory required for data: 33998000
I0702 14:01:10.281433 18620 layer_factory.hpp:77] Creating layer relu3
I0702 14:01:10.281935 18620 net.cpp:91] Creating Layer relu3
I0702 14:01:10.281935 18620 net.cpp:425] relu3 <- conv3
I0702 14:01:10.281935 18620 net.cpp:386] relu3 -> conv3 (in-place)
I0702 14:01:10.281935 18620 net.cpp:141] Setting up relu3
I0702 14:01:10.281935 18620 net.cpp:148] Top shape: 100 64 8 8 (409600)
I0702 14:01:10.281935 18620 net.cpp:156] Memory required for data: 35636400
I0702 14:01:10.281935 18620 layer_factory.hpp:77] Creating layer pool3
I0702 14:01:10.281935 18620 net.cpp:91] Creating Layer pool3
I0702 14:01:10.281935 18620 net.cpp:425] pool3 <- conv3
I0702 14:01:10.281935 18620 net.cpp:399] pool3 -> pool3
I0702 14:01:10.281935 18620 net.cpp:141] Setting up pool3
I0702 14:01:10.281935 18620 net.cpp:148] Top shape: 100 64 4 4 (102400)
I0702 14:01:10.281935 18620 net.cpp:156] Memory required for data: 36046000
I0702 14:01:10.281935 18620 layer_factory.hpp:77] Creating layer ip1
I0702 14:01:10.281935 18620 net.cpp:91] Creating Layer ip1
I0702 14:01:10.281935 18620 net.cpp:425] ip1 <- pool3
I0702 14:01:10.281935 18620 net.cpp:399] ip1 -> ip1
I0702 14:01:10.281935 18620 net.cpp:141] Setting up ip1
I0702 14:01:10.281935 18620 net.cpp:148] Top shape: 100 10 (1000)
I0702 14:01:10.281935 18620 net.cpp:156] Memory required for data: 36050000
I0702 14:01:10.281935 18620 layer_factory.hpp:77] Creating layer ip1_ip1_0_split
I0702 14:01:10.298038 18620 net.cpp:91] Creating Layer ip1_ip1_0_split
I0702 14:01:10.298038 18620 net.cpp:425] ip1_ip1_0_split <- ip1
I0702 14:01:10.298038 18620 net.cpp:399] ip1_ip1_0_split -> ip1_ip1_0_split_0
I0702 14:01:10.298038 18620 net.cpp:399] ip1_ip1_0_split -> ip1_ip1_0_split_1
I0702 14:01:10.298038 18620 net.cpp:141] Setting up ip1_ip1_0_split
I0702 14:01:10.298038 18620 net.cpp:148] Top shape: 100 10 (1000)
I0702 14:01:10.298038 18620 net.cpp:148] Top shape: 100 10 (1000)
I0702 14:01:10.298038 18620 net.cpp:156] Memory required for data: 36058000
I0702 14:01:10.298038 18620 layer_factory.hpp:77] Creating layer accuracy
I0702 14:01:10.298038 18620 net.cpp:91] Creating Layer accuracy
I0702 14:01:10.298038 18620 net.cpp:425] accuracy <- ip1_ip1_0_split_0
I0702 14:01:10.298038 18620 net.cpp:425] accuracy <- label_cifar_1_split_0
I0702 14:01:10.298038 18620 net.cpp:399] accuracy -> accuracy
I0702 14:01:10.298038 18620 net.cpp:141] Setting up accuracy
I0702 14:01:10.298038 18620 net.cpp:148] Top shape: (1)
I0702 14:01:10.298038 18620 net.cpp:156] Memory required for data: 36058004
I0702 14:01:10.298038 18620 layer_factory.hpp:77] Creating layer loss
I0702 14:01:10.298038 18620 net.cpp:91] Creating Layer loss
I0702 14:01:10.298038 18620 net.cpp:425] loss <- ip1_ip1_0_split_1
I0702 14:01:10.298038 18620 net.cpp:425] loss <- label_cifar_1_split_1
I0702 14:01:10.298038 18620 net.cpp:399] loss -> loss
I0702 14:01:10.313665 18620 layer_factory.hpp:77] Creating layer loss
I0702 14:01:10.313665 18620 net.cpp:141] Setting up loss
I0702 14:01:10.313665 18620 net.cpp:148] Top shape: (1)
I0702 14:01:10.313665 18620 net.cpp:151]     with loss weight 1
I0702 14:01:10.313665 18620 net.cpp:156] Memory required for data: 36058008
I0702 14:01:10.313665 18620 net.cpp:217] loss needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:219] accuracy does not need backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] ip1_ip1_0_split needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] ip1 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] pool3 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] relu3 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] conv3 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] norm2 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] pool2 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] relu2 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] conv2 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] norm1 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] relu1 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] pool1 needs backward computation.
I0702 14:01:10.313665 18620 net.cpp:217] conv1 needs backward computation.
I0702 14:01:10.329291 18620 net.cpp:219] label_cifar_1_split does not need backward computation.
I0702 14:01:10.329291 18620 net.cpp:219] cifar does not need backward computation.
I0702 14:01:10.329291 18620 net.cpp:261] This network produces output accuracy
I0702 14:01:10.329291 18620 net.cpp:261] This network produces output loss
I0702 14:01:10.329291 18620 net.cpp:274] Network initialization done.
I0702 14:01:10.329291 18620 solver.cpp:60] Solver scaffolding done.
I0702 14:01:10.329291 18620 caffe.cpp:220] Starting Optimization
I0702 14:01:10.329291 18620 solver.cpp:279] Solving CIFAR10_full
I0702 14:01:10.329291 18620 solver.cpp:280] Learning Rate Policy: fixed
I0702 14:01:10.329291 18620 solver.cpp:337] Iteration 0, Testing net (#0)
I0702 14:01:12.185896 18620 solver.cpp:404]     Test net output #0: accuracy = 0.1038
I0702 14:01:12.201553 18620 solver.cpp:404]     Test net output #1: loss = 2.3026 (* 1 = 2.3026 loss)
I0702 14:01:12.232777 18620 solver.cpp:228] Iteration 0, loss = 2.30265
I0702 14:01:12.232777 18620 solver.cpp:244]     Train net output #0: loss = 2.30265 (* 1 = 2.30265 loss)
I0702 14:01:12.232777 18620 sgd_solver.cpp:106] Iteration 0, lr = 0.001
I0702 14:01:26.197675 18620 solver.cpp:228] Iteration 200, loss = 1.87686
I0702 14:01:26.197675 18620 solver.cpp:244]     Train net output #0: loss = 1.87686 (* 1 = 1.87686 loss)
I0702 14:01:26.197675 18620 sgd_solver.cpp:106] Iteration 200, lr = 0.001
I0702 14:01:40.141294 18620 solver.cpp:228] Iteration 400, loss = 1.49297
I0702 14:01:40.141294 18620 solver.cpp:244]     Train net output #0: loss = 1.49297 (* 1 = 1.49297 loss)
I0702 14:01:40.141294 18620 sgd_solver.cpp:106] Iteration 400, lr = 0.001
I0702 14:01:54.143504 18620 solver.cpp:228] Iteration 600, loss = 1.46932
I0702 14:01:54.143504 18620 solver.cpp:244]     Train net output #0: loss = 1.46932 (* 1 = 1.46932 loss)
I0702 14:01:54.143504 18620 sgd_solver.cpp:106] Iteration 600, lr = 0.001
I0702 14:02:08.095770 18620 solver.cpp:228] Iteration 800, loss = 1.38917
I0702 14:02:08.095770 18620 solver.cpp:244]     Train net output #0: loss = 1.38917 (* 1 = 1.38917 loss)
I0702 14:02:08.095770 18620 sgd_solver.cpp:106] Iteration 800, lr = 0.001
I0702 14:02:21.974074 18620 solver.cpp:337] Iteration 1000, Testing net (#0)
I0702 14:02:23.862009 18620 solver.cpp:404]     Test net output #0: accuracy = 0.5202
I0702 14:02:23.862009 18620 solver.cpp:404]     Test net output #1: loss = 1.35395 (* 1 = 1.35395 loss)
I0702 14:02:23.893259 18620 solver.cpp:228] Iteration 1000, loss = 1.32918
I0702 14:02:23.893259 18620 solver.cpp:244]     Train net output #0: loss = 1.32918 (* 1 = 1.32918 loss)
I0702 14:02:23.893259 18620 sgd_solver.cpp:106] Iteration 1000, lr = 0.001

~ (middle of the log omitted) ~

I0702 15:12:30.365753 18620 solver.cpp:228] Iteration 59400, loss = 0.446769
I0702 15:12:30.367043 18620 solver.cpp:244]     Train net output #0: loss = 0.446769 (* 1 = 0.446769 loss)
I0702 15:12:30.368695 18620 sgd_solver.cpp:106] Iteration 59400, lr = 0.001
I0702 15:12:44.328058 18620 solver.cpp:228] Iteration 59600, loss = 0.398412
I0702 15:12:44.329201 18620 solver.cpp:244]     Train net output #0: loss = 0.398412 (* 1 = 0.398412 loss)
I0702 15:12:44.330687 18620 sgd_solver.cpp:106] Iteration 59600, lr = 0.001
I0702 15:12:58.281877 18620 solver.cpp:228] Iteration 59800, loss = 0.315931
I0702 15:12:58.283318 18620 solver.cpp:244]     Train net output #0: loss = 0.315931 (* 1 = 0.315931 loss)
I0702 15:12:58.284797 18620 sgd_solver.cpp:106] Iteration 59800, lr = 0.001
I0702 15:13:12.174546 18620 solver.cpp:464] Snapshotting to HDF5 file examples/cifar10/cifar10_full_iter_60000.caffemodel.h5
I0702 15:13:12.228106 18620 sgd_solver.cpp:283] Snapshotting solver state to HDF5 file examples/cifar10/cifar10_full_iter_60000.solverstate.h5
I0702 15:13:12.252163 18620 solver.cpp:317] Iteration 60000, loss = 0.350441
I0702 15:13:12.253883 18620 solver.cpp:337] Iteration 60000, Testing net (#0)
I0702 15:13:14.098697 18620 solver.cpp:404]     Test net output #0: accuracy = 0.7833
I0702 15:13:14.099669 18620 solver.cpp:404]     Test net output #1: loss = 0.622183 (* 1 = 0.622183 loss)
I0702 15:13:14.101049 18620 solver.cpp:322] Optimization Done.
I0702 15:13:14.102483 18620 caffe.cpp:223] Optimization Done.
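To try the resulting snapshot yourself, a rough pycaffe sketch like the following should work (assuming pycaffe is built in this Windows branch; the file names match the snapshot lines in the log above). Because the TEST phase of cifar10_full_train_test.prototxt has its own Data layer, a single forward pass evaluates one batch of 100 test images:

import caffe

caffe.set_mode_gpu()
net = caffe.Net('examples/cifar10/cifar10_full_train_test.prototxt',
                'examples/cifar10/cifar10_full_iter_60000.caffemodel.h5',
                caffe.TEST)
out = net.forward()  # pulls one batch of 100 test images from the lmdb
print(out['accuracy'], out['loss'])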