MPSCNNHelloWorld: Version 1.1, 2016-11-17

Fixes a typo that affected performance

This sample ports networks trained with the open source TensorFlow library on the MNIST dataset (http://yann.lecun.com/exdb/mnist/) to on-device inference using Metal Performance Shaders. It demonstrates how to encode the different layers of a network to the GPU and perform digit recognition using trained parameters (weights and biases) fetched from a pre-trained, saved TensorFlow network.
This commit is contained in:
Liu Lantao 2016-12-24 12:24:46 +08:00
parent 95df29821e
commit bc16dc0964
45 changed files with 1706 additions and 0 deletions
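For orientation, a minimal sketch of how the two networks added in this commit are driven. Only the class and method names defined in the files below (MNIST_Full_LayerNN, MNIST_Deep_ConvNN, forward) come from the sample; the Metal setup and driver code here are assumptions, the real driver being the sample's ViewController.

import MetalPerformanceShaders

// assumed driver code (hypothetical), not part of this commit
let device = MTLCreateSystemDefaultDevice()!        // requires an MPS-capable GPU
let commandQueue = device.makeCommandQueue()

// single fully connected layer network and the deep convolutional network
let basicNet = MNIST_Full_LayerNN(withCommandQueue: commandQueue)
let deepNet  = MNIST_Deep_ConvNN(withCommandQueue: commandQueue)

// a 28x28, single-channel input image; its texture must be filled with pixel
// data (for example from DrawView.getViewContext()) before running the network
let inputDesc = MPSImageDescriptor(channelFormat: .unorm8, width: 28, height: 28, featureChannels: 1)
let input = MPSImage(device: device, imageDescriptor: inputDesc)

// forward() encodes the layers, commits the command buffer and returns the predicted digit
let digit = deepNet.forward(inputImage: input)
print("predicted digit: \(digit)")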

View File

@@ -0,0 +1,42 @@
Sample code project: MPSCNNHelloWorld: Simple Digit Detection Convolution Neural Networks (CNN)
Version: 1.1
IMPORTANT: This Apple software is supplied to you by Apple
Inc. ("Apple") in consideration of your agreement to the following
terms, and your use, installation, modification or redistribution of
this Apple software constitutes acceptance of these terms. If you do
not agree with these terms, please do not use, install, modify or
redistribute this Apple software.
In consideration of your agreement to abide by the following terms, and
subject to these terms, Apple grants you a personal, non-exclusive
license, under Apple's copyrights in this original Apple software (the
"Apple Software"), to use, reproduce, modify and redistribute the Apple
Software, with or without modifications, in source and/or binary forms;
provided that if you redistribute the Apple Software in its entirety and
without modifications, you must retain this notice and the following
text and disclaimers in all such redistributions of the Apple Software.
Neither the name, trademarks, service marks or logos of Apple Inc. may
be used to endorse or promote products derived from the Apple Software
without specific prior written permission from Apple. Except as
expressly stated in this notice, no other rights or licenses, express or
implied, are granted by Apple herein, including but not limited to any
patent rights that may be infringed by your derivative works or by other
works in which the Apple Software may be incorporated.
The Apple Software is provided by Apple on an "AS IS" basis. APPLE
MAKES NO WARRANTIES, EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION
THE IMPLIED WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY AND FITNESS
FOR A PARTICULAR PURPOSE, REGARDING THE APPLE SOFTWARE OR ITS USE AND
OPERATION ALONE OR IN COMBINATION WITH YOUR PRODUCTS.
IN NO EVENT SHALL APPLE BE LIABLE FOR ANY SPECIAL, INDIRECT, INCIDENTAL
OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) ARISING IN ANY WAY OUT OF THE USE, REPRODUCTION,
MODIFICATION AND/OR DISTRIBUTION OF THE APPLE SOFTWARE, HOWEVER CAUSED
AND WHETHER UNDER THEORY OF CONTRACT, TORT (INCLUDING NEGLIGENCE),
STRICT LIABILITY OR OTHERWISE, EVEN IF APPLE HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
Copyright (C) 2016 Apple Inc. All Rights Reserved.

View File

@@ -0,0 +1,447 @@
// !$*UTF8*$!
{
archiveVersion = 1;
classes = {
};
objectVersion = 46;
objects = {
/* Begin PBXBuildFile section */
2E6AB7281D47F79E00048A0B /* atomics.m in Sources */ = {isa = PBXBuildFile; fileRef = 2E6AB7271D47F79E00048A0B /* atomics.m */; };
2EE557AE1D41A42B0071A3EC /* t10k-images-idx3-ubyte.data in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557AA1D41A42B0071A3EC /* t10k-images-idx3-ubyte.data */; };
2EE557AF1D41A42B0071A3EC /* t10k-labels-idx1-ubyte.data in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557AB1D41A42B0071A3EC /* t10k-labels-idx1-ubyte.data */; };
2EE557B11D41A42B0071A3EC /* train-labels-idx1-ubyte.data in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557AD1D41A42B0071A3EC /* train-labels-idx1-ubyte.data */; };
2EE557BA1D41A4540071A3EC /* bias_conv1.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557B21D41A4540071A3EC /* bias_conv1.dat */; };
2EE557BB1D41A4540071A3EC /* bias_conv2.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557B31D41A4540071A3EC /* bias_conv2.dat */; };
2EE557BC1D41A4540071A3EC /* bias_fc1.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557B41D41A4540071A3EC /* bias_fc1.dat */; };
2EE557BD1D41A4540071A3EC /* bias_fc2.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557B51D41A4540071A3EC /* bias_fc2.dat */; };
2EE557BE1D41A4540071A3EC /* weights_conv1.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557B61D41A4540071A3EC /* weights_conv1.dat */; };
2EE557BF1D41A4540071A3EC /* weights_conv2.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557B71D41A4540071A3EC /* weights_conv2.dat */; };
2EE557C01D41A4540071A3EC /* weights_fc1.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557B81D41A4540071A3EC /* weights_fc1.dat */; };
2EE557C11D41A4540071A3EC /* weights_fc2.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557B91D41A4540071A3EC /* weights_fc2.dat */; };
2EE557C41D41A4670071A3EC /* bias_NN.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557C21D41A4670071A3EC /* bias_NN.dat */; };
2EE557C51D41A4670071A3EC /* weights_NN.dat in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557C31D41A4670071A3EC /* weights_NN.dat */; };
2EE557D11D41A5890071A3EC /* train-images-idx3-ubyte.data in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557AC1D41A42B0071A3EC /* train-images-idx3-ubyte.data */; };
2EE557EE1D41A6410071A3EC /* AppDelegate.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2EE557E71D41A6410071A3EC /* AppDelegate.swift */; };
2EE557EF1D41A6410071A3EC /* DrawView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2EE557E81D41A6410071A3EC /* DrawView.swift */; };
2EE557F01D41A6410071A3EC /* GetMNISTData.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2EE557E91D41A6410071A3EC /* GetMNISTData.swift */; };
2EE557F11D41A6410071A3EC /* MNISTDeepCNN.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2EE557EA1D41A6410071A3EC /* MNISTDeepCNN.swift */; };
2EE557F21D41A6410071A3EC /* MNISTSingleLayer.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2EE557EB1D41A6410071A3EC /* MNISTSingleLayer.swift */; };
2EE557F31D41A6410071A3EC /* SlimMPSCNN.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2EE557EC1D41A6410071A3EC /* SlimMPSCNN.swift */; };
2EE557F41D41A6410071A3EC /* ViewController.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2EE557ED1D41A6410071A3EC /* ViewController.swift */; };
2EE557F71D41A6690071A3EC /* Main.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557F51D41A6690071A3EC /* Main.storyboard */; };
2EE557FA1D41A6770071A3EC /* LaunchScreen.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557F81D41A6770071A3EC /* LaunchScreen.storyboard */; };
2EE557FC1D41A6840071A3EC /* Assets.xcassets in Resources */ = {isa = PBXBuildFile; fileRef = 2EE557FB1D41A6840071A3EC /* Assets.xcassets */; };
/* End PBXBuildFile section */
/* Begin PBXFileReference section */
2E0C35F31CB5B2FE0041D8E3 /* Digit Detector.app */ = {isa = PBXFileReference; explicitFileType = wrapper.application; includeInIndex = 0; path = "Digit Detector.app"; sourceTree = BUILT_PRODUCTS_DIR; };
2E6AB7261D47F5F300048A0B /* atomics.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; name = atomics.h; path = MPSCNNHelloWorld/atomics.h; sourceTree = SOURCE_ROOT; };
2E6AB7271D47F79E00048A0B /* atomics.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; name = atomics.m; path = MPSCNNHelloWorld/atomics.m; sourceTree = SOURCE_ROOT; };
2E6AB7291D47F9F300048A0B /* MPSCNNHelloWorld-Bridging-Header.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; name = "MPSCNNHelloWorld-Bridging-Header.h"; path = "MPSCNNHelloWorld/MPSCNNHelloWorld-Bridging-Header.h"; sourceTree = SOURCE_ROOT; };
2ED4411D1D41A21900D89679 /* README.md */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = net.daringfireball.markdown; path = README.md; sourceTree = "<group>"; };
2EE557AA1D41A42B0071A3EC /* t10k-images-idx3-ubyte.data */ = {isa = PBXFileReference; lastKnownFileType = file; name = "t10k-images-idx3-ubyte.data"; path = "MPSCNNHelloWorld/mnistData/t10k-images-idx3-ubyte.data"; sourceTree = SOURCE_ROOT; };
2EE557AB1D41A42B0071A3EC /* t10k-labels-idx1-ubyte.data */ = {isa = PBXFileReference; lastKnownFileType = file; name = "t10k-labels-idx1-ubyte.data"; path = "MPSCNNHelloWorld/mnistData/t10k-labels-idx1-ubyte.data"; sourceTree = SOURCE_ROOT; };
2EE557AC1D41A42B0071A3EC /* train-images-idx3-ubyte.data */ = {isa = PBXFileReference; lastKnownFileType = file; name = "train-images-idx3-ubyte.data"; path = "MPSCNNHelloWorld/mnistData/train-images-idx3-ubyte.data"; sourceTree = SOURCE_ROOT; };
2EE557AD1D41A42B0071A3EC /* train-labels-idx1-ubyte.data */ = {isa = PBXFileReference; lastKnownFileType = file; name = "train-labels-idx1-ubyte.data"; path = "MPSCNNHelloWorld/mnistData/train-labels-idx1-ubyte.data"; sourceTree = SOURCE_ROOT; };
2EE557B21D41A4540071A3EC /* bias_conv1.dat */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; name = bias_conv1.dat; path = MPSCNNHelloWorld/deep_weights/binaries/bias_conv1.dat; sourceTree = SOURCE_ROOT; };
2EE557B31D41A4540071A3EC /* bias_conv2.dat */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; name = bias_conv2.dat; path = MPSCNNHelloWorld/deep_weights/binaries/bias_conv2.dat; sourceTree = SOURCE_ROOT; };
2EE557B41D41A4540071A3EC /* bias_fc1.dat */ = {isa = PBXFileReference; lastKnownFileType = file; name = bias_fc1.dat; path = MPSCNNHelloWorld/deep_weights/binaries/bias_fc1.dat; sourceTree = SOURCE_ROOT; };
2EE557B51D41A4540071A3EC /* bias_fc2.dat */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; name = bias_fc2.dat; path = MPSCNNHelloWorld/deep_weights/binaries/bias_fc2.dat; sourceTree = SOURCE_ROOT; };
2EE557B61D41A4540071A3EC /* weights_conv1.dat */ = {isa = PBXFileReference; lastKnownFileType = file; name = weights_conv1.dat; path = MPSCNNHelloWorld/deep_weights/binaries/weights_conv1.dat; sourceTree = SOURCE_ROOT; };
2EE557B71D41A4540071A3EC /* weights_conv2.dat */ = {isa = PBXFileReference; lastKnownFileType = file; name = weights_conv2.dat; path = MPSCNNHelloWorld/deep_weights/binaries/weights_conv2.dat; sourceTree = SOURCE_ROOT; };
2EE557B81D41A4540071A3EC /* weights_fc1.dat */ = {isa = PBXFileReference; lastKnownFileType = file; name = weights_fc1.dat; path = MPSCNNHelloWorld/deep_weights/binaries/weights_fc1.dat; sourceTree = SOURCE_ROOT; };
2EE557B91D41A4540071A3EC /* weights_fc2.dat */ = {isa = PBXFileReference; lastKnownFileType = file; name = weights_fc2.dat; path = MPSCNNHelloWorld/deep_weights/binaries/weights_fc2.dat; sourceTree = SOURCE_ROOT; };
2EE557C21D41A4670071A3EC /* bias_NN.dat */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; name = bias_NN.dat; path = MPSCNNHelloWorld/single_layer_weights/bias_NN.dat; sourceTree = SOURCE_ROOT; };
2EE557C31D41A4670071A3EC /* weights_NN.dat */ = {isa = PBXFileReference; lastKnownFileType = file; name = weights_NN.dat; path = MPSCNNHelloWorld/single_layer_weights/weights_NN.dat; sourceTree = SOURCE_ROOT; };
2EE557E71D41A6410071A3EC /* AppDelegate.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = AppDelegate.swift; path = MPSCNNHelloWorld/AppDelegate.swift; sourceTree = SOURCE_ROOT; };
2EE557E81D41A6410071A3EC /* DrawView.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = DrawView.swift; path = MPSCNNHelloWorld/DrawView.swift; sourceTree = SOURCE_ROOT; };
2EE557E91D41A6410071A3EC /* GetMNISTData.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = GetMNISTData.swift; path = MPSCNNHelloWorld/GetMNISTData.swift; sourceTree = SOURCE_ROOT; };
2EE557EA1D41A6410071A3EC /* MNISTDeepCNN.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = MNISTDeepCNN.swift; path = MPSCNNHelloWorld/MNISTDeepCNN.swift; sourceTree = SOURCE_ROOT; };
2EE557EB1D41A6410071A3EC /* MNISTSingleLayer.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = MNISTSingleLayer.swift; path = MPSCNNHelloWorld/MNISTSingleLayer.swift; sourceTree = SOURCE_ROOT; };
2EE557EC1D41A6410071A3EC /* SlimMPSCNN.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = SlimMPSCNN.swift; path = MPSCNNHelloWorld/SlimMPSCNN.swift; sourceTree = SOURCE_ROOT; };
2EE557ED1D41A6410071A3EC /* ViewController.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; name = ViewController.swift; path = MPSCNNHelloWorld/ViewController.swift; sourceTree = SOURCE_ROOT; };
2EE557F61D41A6690071A3EC /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = MPSCNNHelloWorld/Base.lproj/Main.storyboard; sourceTree = SOURCE_ROOT; };
2EE557F91D41A6770071A3EC /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = MPSCNNHelloWorld/Base.lproj/LaunchScreen.storyboard; sourceTree = SOURCE_ROOT; };
2EE557FB1D41A6840071A3EC /* Assets.xcassets */ = {isa = PBXFileReference; lastKnownFileType = folder.assetcatalog; name = Assets.xcassets; path = MPSCNNHelloWorld/Assets.xcassets; sourceTree = SOURCE_ROOT; };
2EE557FD1D41A6980071A3EC /* Info.plist */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.plist.xml; name = Info.plist; path = MPSCNNHelloWorld/Info.plist; sourceTree = SOURCE_ROOT; };
/* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */
2E0C35F01CB5B2FE0041D8E3 /* Frameworks */ = {
isa = PBXFrameworksBuildPhase;
buildActionMask = 2147483647;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXFrameworksBuildPhase section */
/* Begin PBXGroup section */
2E0C35EA1CB5B2FE0041D8E3 = {
isa = PBXGroup;
children = (
2ED4411D1D41A21900D89679 /* README.md */,
2E0C35F51CB5B2FE0041D8E3 /* MPSCNNHelloWorld */,
2E0C35F41CB5B2FE0041D8E3 /* Products */,
);
sourceTree = "<group>";
};
2E0C35F41CB5B2FE0041D8E3 /* Products */ = {
isa = PBXGroup;
children = (
2E0C35F31CB5B2FE0041D8E3 /* Digit Detector.app */,
);
name = Products;
sourceTree = "<group>";
};
2E0C35F51CB5B2FE0041D8E3 /* MPSCNNHelloWorld */ = {
isa = PBXGroup;
children = (
2EE557ED1D41A6410071A3EC /* ViewController.swift */,
2EE557E91D41A6410071A3EC /* GetMNISTData.swift */,
2EE557EB1D41A6410071A3EC /* MNISTSingleLayer.swift */,
2EE557EA1D41A6410071A3EC /* MNISTDeepCNN.swift */,
2EE557EC1D41A6410071A3EC /* SlimMPSCNN.swift */,
2EE557E81D41A6410071A3EC /* DrawView.swift */,
2E6AB7291D47F9F300048A0B /* MPSCNNHelloWorld-Bridging-Header.h */,
2E6AB7261D47F5F300048A0B /* atomics.h */,
2E6AB7271D47F79E00048A0B /* atomics.m */,
2EE557F51D41A6690071A3EC /* Main.storyboard */,
2E684F051CDD596900307CBC /* mnistData */,
2EAC52D71CDBC97700AB5026 /* Deep Model */,
2EAC52E81CDBD63F00AB5026 /* Basic Model */,
2E16E8241CBD6DAF008CF29A /* SupportingFiles */,
);
name = MPSCNNHelloWorld;
path = MNIST;
sourceTree = "<group>";
};
2E16E8241CBD6DAF008CF29A /* SupportingFiles */ = {
isa = PBXGroup;
children = (
2EE557E71D41A6410071A3EC /* AppDelegate.swift */,
2EE557FB1D41A6840071A3EC /* Assets.xcassets */,
2EE557F81D41A6770071A3EC /* LaunchScreen.storyboard */,
2EE557FD1D41A6980071A3EC /* Info.plist */,
);
name = SupportingFiles;
sourceTree = "<group>";
};
2E684F051CDD596900307CBC /* mnistData */ = {
isa = PBXGroup;
children = (
2EE557AA1D41A42B0071A3EC /* t10k-images-idx3-ubyte.data */,
2EE557AB1D41A42B0071A3EC /* t10k-labels-idx1-ubyte.data */,
2EE557AC1D41A42B0071A3EC /* train-images-idx3-ubyte.data */,
2EE557AD1D41A42B0071A3EC /* train-labels-idx1-ubyte.data */,
);
name = mnistData;
sourceTree = "<group>";
};
2EAC52D71CDBC97700AB5026 /* Deep Model */ = {
isa = PBXGroup;
children = (
2EE557B21D41A4540071A3EC /* bias_conv1.dat */,
2EE557B31D41A4540071A3EC /* bias_conv2.dat */,
2EE557B41D41A4540071A3EC /* bias_fc1.dat */,
2EE557B51D41A4540071A3EC /* bias_fc2.dat */,
2EE557B61D41A4540071A3EC /* weights_conv1.dat */,
2EE557B71D41A4540071A3EC /* weights_conv2.dat */,
2EE557B81D41A4540071A3EC /* weights_fc1.dat */,
2EE557B91D41A4540071A3EC /* weights_fc2.dat */,
);
name = "Deep Model";
sourceTree = "<group>";
};
2EAC52E81CDBD63F00AB5026 /* Basic Model */ = {
isa = PBXGroup;
children = (
2EE557C21D41A4670071A3EC /* bias_NN.dat */,
2EE557C31D41A4670071A3EC /* weights_NN.dat */,
);
name = "Basic Model";
sourceTree = "<group>";
};
/* End PBXGroup section */
/* Begin PBXNativeTarget section */
2E0C35F21CB5B2FE0041D8E3 /* MPSCNNHelloWorld */ = {
isa = PBXNativeTarget;
buildConfigurationList = 2E0C361B1CB5B2FF0041D8E3 /* Build configuration list for PBXNativeTarget "MPSCNNHelloWorld" */;
buildPhases = (
2E0C35EF1CB5B2FE0041D8E3 /* Sources */,
2E0C35F01CB5B2FE0041D8E3 /* Frameworks */,
2E0C35F11CB5B2FE0041D8E3 /* Resources */,
);
buildRules = (
);
dependencies = (
);
name = MPSCNNHelloWorld;
productName = MNIST;
productReference = 2E0C35F31CB5B2FE0041D8E3 /* Digit Detector.app */;
productType = "com.apple.product-type.application";
};
/* End PBXNativeTarget section */
/* Begin PBXProject section */
2E0C35EB1CB5B2FE0041D8E3 /* Project object */ = {
isa = PBXProject;
attributes = {
LastSwiftUpdateCheck = 0800;
LastUpgradeCheck = 0800;
ORGANIZATIONNAME = "Dhruv Saksena";
TargetAttributes = {
2E0C35F21CB5B2FE0041D8E3 = {
CreatedOnToolsVersion = 8.0;
DevelopmentTeamName = "Apple Inc. - Core OS Plus Others";
LastSwiftMigration = 0800;
ProvisioningStyle = Automatic;
};
};
};
buildConfigurationList = 2E0C35EE1CB5B2FE0041D8E3 /* Build configuration list for PBXProject "MPSCNNHelloWorld" */;
compatibilityVersion = "Xcode 3.2";
developmentRegion = English;
hasScannedForEncodings = 0;
knownRegions = (
en,
Base,
);
mainGroup = 2E0C35EA1CB5B2FE0041D8E3;
productRefGroup = 2E0C35F41CB5B2FE0041D8E3 /* Products */;
projectDirPath = "";
projectRoot = "";
targets = (
2E0C35F21CB5B2FE0041D8E3 /* MPSCNNHelloWorld */,
);
};
/* End PBXProject section */
/* Begin PBXResourcesBuildPhase section */
2E0C35F11CB5B2FE0041D8E3 /* Resources */ = {
isa = PBXResourcesBuildPhase;
buildActionMask = 2147483647;
files = (
2EE557D11D41A5890071A3EC /* train-images-idx3-ubyte.data in Resources */,
2EE557BA1D41A4540071A3EC /* bias_conv1.dat in Resources */,
2EE557C11D41A4540071A3EC /* weights_fc2.dat in Resources */,
2EE557AF1D41A42B0071A3EC /* t10k-labels-idx1-ubyte.data in Resources */,
2EE557B11D41A42B0071A3EC /* train-labels-idx1-ubyte.data in Resources */,
2EE557AE1D41A42B0071A3EC /* t10k-images-idx3-ubyte.data in Resources */,
2EE557FC1D41A6840071A3EC /* Assets.xcassets in Resources */,
2EE557F71D41A6690071A3EC /* Main.storyboard in Resources */,
2EE557C01D41A4540071A3EC /* weights_fc1.dat in Resources */,
2EE557BB1D41A4540071A3EC /* bias_conv2.dat in Resources */,
2EE557FA1D41A6770071A3EC /* LaunchScreen.storyboard in Resources */,
2EE557C51D41A4670071A3EC /* weights_NN.dat in Resources */,
2EE557BD1D41A4540071A3EC /* bias_fc2.dat in Resources */,
2EE557C41D41A4670071A3EC /* bias_NN.dat in Resources */,
2EE557BF1D41A4540071A3EC /* weights_conv2.dat in Resources */,
2EE557BC1D41A4540071A3EC /* bias_fc1.dat in Resources */,
2EE557BE1D41A4540071A3EC /* weights_conv1.dat in Resources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXResourcesBuildPhase section */
/* Begin PBXSourcesBuildPhase section */
2E0C35EF1CB5B2FE0041D8E3 /* Sources */ = {
isa = PBXSourcesBuildPhase;
buildActionMask = 2147483647;
files = (
2EE557EF1D41A6410071A3EC /* DrawView.swift in Sources */,
2EE557F21D41A6410071A3EC /* MNISTSingleLayer.swift in Sources */,
2EE557F11D41A6410071A3EC /* MNISTDeepCNN.swift in Sources */,
2EE557F01D41A6410071A3EC /* GetMNISTData.swift in Sources */,
2E6AB7281D47F79E00048A0B /* atomics.m in Sources */,
2EE557F41D41A6410071A3EC /* ViewController.swift in Sources */,
2EE557EE1D41A6410071A3EC /* AppDelegate.swift in Sources */,
2EE557F31D41A6410071A3EC /* SlimMPSCNN.swift in Sources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXSourcesBuildPhase section */
/* Begin PBXVariantGroup section */
2EE557F51D41A6690071A3EC /* Main.storyboard */ = {
isa = PBXVariantGroup;
children = (
2EE557F61D41A6690071A3EC /* Base */,
);
name = Main.storyboard;
sourceTree = "<group>";
};
2EE557F81D41A6770071A3EC /* LaunchScreen.storyboard */ = {
isa = PBXVariantGroup;
children = (
2EE557F91D41A6770071A3EC /* Base */,
);
name = LaunchScreen.storyboard;
sourceTree = "<group>";
};
/* End PBXVariantGroup section */
/* Begin XCBuildConfiguration section */
2E0C36191CB5B2FF0041D8E3 /* Debug */ = {
isa = XCBuildConfiguration;
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
CLANG_ANALYZER_NONNULL = YES;
CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x";
CLANG_CXX_LIBRARY = "libc++";
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
CLANG_WARN_CONSTANT_CONVERSION = YES;
CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;
CLANG_WARN_EMPTY_BODY = YES;
CLANG_WARN_ENUM_CONVERSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
CODE_SIGN_IDENTITY = "iPhone Developer: dhruv_saksena (KQCTVQZQW3)";
"CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer: dhruv_saksena (KQCTVQZQW3)";
COPY_PHASE_STRIP = NO;
DEBUG_INFORMATION_FORMAT = dwarf;
ENABLE_STRICT_OBJC_MSGSEND = YES;
ENABLE_TESTABILITY = YES;
GCC_C_LANGUAGE_STANDARD = gnu99;
GCC_DYNAMIC_NO_PIC = NO;
GCC_NO_COMMON_BLOCKS = YES;
GCC_OPTIMIZATION_LEVEL = 0;
GCC_PREPROCESSOR_DEFINITIONS = (
"DEBUG=1",
"$(inherited)",
);
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;
GCC_WARN_UNDECLARED_SELECTOR = YES;
GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
IPHONEOS_DEPLOYMENT_TARGET = 10.0;
MTL_ENABLE_DEBUG_INFO = YES;
ONLY_ACTIVE_ARCH = YES;
SDKROOT = iphoneos;
SWIFT_OPTIMIZATION_LEVEL = "-Onone";
TARGETED_DEVICE_FAMILY = "1,2";
};
name = Debug;
};
2E0C361A1CB5B2FF0041D8E3 /* Release */ = {
isa = XCBuildConfiguration;
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
CLANG_ANALYZER_NONNULL = YES;
CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x";
CLANG_CXX_LIBRARY = "libc++";
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
CLANG_WARN_CONSTANT_CONVERSION = YES;
CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;
CLANG_WARN_EMPTY_BODY = YES;
CLANG_WARN_ENUM_CONVERSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
CODE_SIGN_IDENTITY = "iPhone Developer: dhruv_saksena (KQCTVQZQW3)";
"CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer: dhruv_saksena (KQCTVQZQW3)";
COPY_PHASE_STRIP = NO;
DEBUG_INFORMATION_FORMAT = "dwarf-with-dsym";
ENABLE_NS_ASSERTIONS = NO;
ENABLE_STRICT_OBJC_MSGSEND = YES;
GCC_C_LANGUAGE_STANDARD = gnu99;
GCC_NO_COMMON_BLOCKS = YES;
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;
GCC_WARN_UNDECLARED_SELECTOR = YES;
GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
IPHONEOS_DEPLOYMENT_TARGET = 10.0;
MTL_ENABLE_DEBUG_INFO = NO;
SDKROOT = iphoneos;
TARGETED_DEVICE_FAMILY = "1,2";
VALIDATE_PRODUCT = YES;
};
name = Release;
};
2E0C361C1CB5B2FF0041D8E3 /* Debug */ = {
isa = XCBuildConfiguration;
buildSettings = {
ARCHS = (
armv7s,
arm64,
);
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;
CLANG_ENABLE_MODULES = YES;
CODE_SIGN_IDENTITY = "iPhone Developer";
"CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer";
DEVELOPMENT_TEAM = "";
INFOPLIST_FILE = MPSCNNHelloWorld/Info.plist;
IPHONEOS_DEPLOYMENT_TARGET = 10.0;
LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks";
PRODUCT_BUNDLE_IDENTIFIER = "com.example.apple-samplecode.MPSCNNHelloWorld";
PRODUCT_NAME = "Digit Detector";
PROVISIONING_PROFILE = "";
SDKROOT = iphoneos;
SWIFT_OBJC_BRIDGING_HEADER = "MPSCNNHelloWorld/MPSCNNHelloWorld-Bridging-Header.h";
SWIFT_OPTIMIZATION_LEVEL = "-Onone";
SWIFT_VERSION = 3.0;
};
name = Debug;
};
2E0C361D1CB5B2FF0041D8E3 /* Release */ = {
isa = XCBuildConfiguration;
buildSettings = {
ARCHS = (
armv7s,
arm64,
);
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;
CLANG_ENABLE_MODULES = YES;
CODE_SIGN_IDENTITY = "iPhone Developer";
"CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer";
DEVELOPMENT_TEAM = "";
INFOPLIST_FILE = MPSCNNHelloWorld/Info.plist;
IPHONEOS_DEPLOYMENT_TARGET = 10.0;
LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks";
PRODUCT_BUNDLE_IDENTIFIER = "com.example.apple-samplecode.MPSCNNHelloWorld";
PRODUCT_NAME = "Digit Detector";
PROVISIONING_PROFILE = "";
SDKROOT = iphoneos;
SWIFT_OBJC_BRIDGING_HEADER = "MPSCNNHelloWorld/MPSCNNHelloWorld-Bridging-Header.h";
SWIFT_VERSION = 3.0;
};
name = Release;
};
/* End XCBuildConfiguration section */
/* Begin XCConfigurationList section */
2E0C35EE1CB5B2FE0041D8E3 /* Build configuration list for PBXProject "MPSCNNHelloWorld" */ = {
isa = XCConfigurationList;
buildConfigurations = (
2E0C36191CB5B2FF0041D8E3 /* Debug */,
2E0C361A1CB5B2FF0041D8E3 /* Release */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Release;
};
2E0C361B1CB5B2FF0041D8E3 /* Build configuration list for PBXNativeTarget "MPSCNNHelloWorld" */ = {
isa = XCConfigurationList;
buildConfigurations = (
2E0C361C1CB5B2FF0041D8E3 /* Debug */,
2E0C361D1CB5B2FF0041D8E3 /* Release */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Release;
};
/* End XCConfigurationList section */
};
rootObject = 2E0C35EB1CB5B2FE0041D8E3 /* Project object */;
}

View File

@@ -0,0 +1,17 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
Application delegate for the App
*/
import UIKit
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
}

View File

@@ -0,0 +1,86 @@
{
"images" : [
{
"size" : "29x29",
"idiom" : "iphone",
"filename" : "mnist-main-58.png",
"scale" : "2x"
},
{
"size" : "29x29",
"idiom" : "iphone",
"filename" : "mnist-main-87.png",
"scale" : "3x"
},
{
"size" : "40x40",
"idiom" : "iphone",
"filename" : "mnist-main-80.png",
"scale" : "2x"
},
{
"size" : "40x40",
"idiom" : "iphone",
"filename" : "mnist-main-120.png",
"scale" : "3x"
},
{
"size" : "60x60",
"idiom" : "iphone",
"filename" : "mnist-main-120-2.png",
"scale" : "2x"
},
{
"size" : "60x60",
"idiom" : "iphone",
"filename" : "mnist-main-180.png",
"scale" : "3x"
},
{
"size" : "29x29",
"idiom" : "ipad",
"filename" : "mnist-main-29.png",
"scale" : "1x"
},
{
"size" : "29x29",
"idiom" : "ipad",
"filename" : "mnist-main-58-2.png",
"scale" : "2x"
},
{
"size" : "40x40",
"idiom" : "ipad",
"filename" : "mnist-main-40.png",
"scale" : "1x"
},
{
"size" : "40x40",
"idiom" : "ipad",
"filename" : "mnist-main-80-2.png",
"scale" : "2x"
},
{
"size" : "76x76",
"idiom" : "ipad",
"filename" : "mnist-main-76.png",
"scale" : "1x"
},
{
"size" : "76x76",
"idiom" : "ipad",
"filename" : "mnist-main-152.png",
"scale" : "2x"
},
{
"size" : "83.5x83.5",
"idiom" : "ipad",
"filename" : "mnist-main-167.png",
"scale" : "2x"
}
],
"info" : {
"version" : 1,
"author" : "xcode"
}
}

Binary files not shown: 13 app icon PNG images added (1.7 KiB – 26 KiB).

View File

@@ -0,0 +1,6 @@
{
"info" : {
"version" : 1,
"author" : "xcode"
}
}

View File

@@ -0,0 +1,27 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB" version="3.0" toolsVersion="11019" systemVersion="16A186" targetRuntime="iOS.CocoaTouch" propertyAccessControl="none" useAutolayout="YES" launchScreen="YES" useTraitCollections="YES" initialViewController="01J-lp-oVM">
<dependencies>
<deployment identifier="iOS"/>
<plugIn identifier="com.apple.InterfaceBuilder.IBCocoaTouchPlugin" version="11056.3"/>
</dependencies>
<scenes>
<!--View Controller-->
<scene sceneID="EHf-IW-A2E">
<objects>
<viewController id="01J-lp-oVM" sceneMemberID="viewController">
<layoutGuides>
<viewControllerLayoutGuide type="top" id="Llm-lL-Icb"/>
<viewControllerLayoutGuide type="bottom" id="xb3-aO-Qok"/>
</layoutGuides>
<view key="view" contentMode="scaleToFill" id="Ze5-6b-2t3">
<rect key="frame" x="0.0" y="0.0" width="600" height="600"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<color key="backgroundColor" white="1" alpha="1" colorSpace="custom" customColorSpace="calibratedWhite"/>
</view>
</viewController>
<placeholder placeholderIdentifier="IBFirstResponder" id="iYj-Kq-Ea1" userLabel="First Responder" sceneMemberID="firstResponder"/>
</objects>
<point key="canvasLocation" x="53" y="375"/>
</scene>
</scenes>
</document>

View File

@@ -0,0 +1,96 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB" version="3.0" toolsVersion="11185.2" systemVersion="16A254d" targetRuntime="iOS.CocoaTouch" propertyAccessControl="none" useAutolayout="YES" useTraitCollections="YES" colorMatched="YES" initialViewController="BYZ-38-t0r">
<dependencies>
<deployment identifier="iOS"/>
<plugIn identifier="com.apple.InterfaceBuilder.IBCocoaTouchPlugin" version="11151.3"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<scenes>
<!--View Controller-->
<scene sceneID="tne-QT-ifu">
<objects>
<viewController id="BYZ-38-t0r" customClass="ViewController" customModule="Digit_Detector" customModuleProvider="target" sceneMemberID="viewController">
<layoutGuides>
<viewControllerLayoutGuide type="top" id="y3c-jy-aDJ"/>
<viewControllerLayoutGuide type="bottom" id="wfy-db-euE"/>
</layoutGuides>
<view key="view" contentMode="scaleToFill" id="8bC-Xf-vdC">
<rect key="frame" x="0.0" y="0.0" width="375" height="667"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<label hidden="YES" opaque="NO" userInteractionEnabled="NO" contentMode="left" horizontalHuggingPriority="251" verticalHuggingPriority="251" text="Accuracy = " textAlignment="center" lineBreakMode="tailTruncation" baselineAdjustment="alignBaselines" adjustsFontSizeToFit="NO" translatesAutoresizingMaskIntoConstraints="NO" id="Itt-4A-FVR">
<fontDescription key="fontDescription" type="system" pointSize="17"/>
<nil key="textColor"/>
<nil key="highlightedColor"/>
</label>
<button opaque="NO" contentMode="scaleToFill" contentHorizontalAlignment="center" contentVerticalAlignment="center" buttonType="roundedRect" lineBreakMode="middleTruncation" translatesAutoresizingMaskIntoConstraints="NO" id="fMB-WM-8YO">
<fontDescription key="fontDescription" type="system" pointSize="26"/>
<state key="normal" title="Use MNIST Test Set"/>
<connections>
<action selector="tappedTestSet:" destination="BYZ-38-t0r" eventType="touchUpInside" id="iwN-td-1sk"/>
</connections>
</button>
<button opaque="NO" contentMode="scaleToFill" contentHorizontalAlignment="center" contentVerticalAlignment="center" buttonType="roundedRect" lineBreakMode="middleTruncation" translatesAutoresizingMaskIntoConstraints="NO" id="HEd-C6-X5l">
<fontDescription key="fontDescription" type="system" pointSize="26"/>
<state key="normal" title="Detect Digit"/>
<connections>
<action selector="tappedDetectDigit:" destination="BYZ-38-t0r" eventType="touchUpInside" id="Rtq-vr-h56"/>
</connections>
</button>
<button opaque="NO" contentMode="scaleToFill" contentHorizontalAlignment="center" contentVerticalAlignment="center" buttonType="roundedRect" lineBreakMode="middleTruncation" translatesAutoresizingMaskIntoConstraints="NO" id="NsO-ek-NfA">
<fontDescription key="fontDescription" type="system" pointSize="25"/>
<state key="normal" title="Clear"/>
<connections>
<action selector="tappedClear:" destination="BYZ-38-t0r" eventType="touchUpInside" id="aCb-Qa-Y9q"/>
</connections>
</button>
<label hidden="YES" opaque="NO" userInteractionEnabled="NO" contentMode="left" horizontalHuggingPriority="251" verticalHuggingPriority="251" text="0" textAlignment="center" lineBreakMode="tailTruncation" baselineAdjustment="alignBaselines" adjustsFontSizeToFit="NO" translatesAutoresizingMaskIntoConstraints="NO" id="Aa3-61-EIv">
<fontDescription key="fontDescription" type="system" pointSize="23"/>
<nil key="textColor"/>
<nil key="highlightedColor"/>
</label>
<view contentMode="scaleToFill" translatesAutoresizingMaskIntoConstraints="NO" id="PzX-rP-SKt" customClass="DrawView" customModule="Digit_Detector" customModuleProvider="target">
<color key="backgroundColor" white="0.0" alpha="1" colorSpace="custom" customColorSpace="genericGamma22GrayColorSpace"/>
<constraints>
<constraint firstAttribute="height" constant="240" id="cpN-BA-HfR"/>
<constraint firstAttribute="width" constant="240" id="zZB-ag-7Eh"/>
</constraints>
</view>
<button opaque="NO" contentMode="scaleToFill" contentHorizontalAlignment="center" contentVerticalAlignment="center" buttonType="roundedRect" lineBreakMode="middleTruncation" translatesAutoresizingMaskIntoConstraints="NO" id="lv9-QR-RxB">
<fontDescription key="fontDescription" type="system" pointSize="26"/>
<state key="normal" title="Use Deep Net"/>
<connections>
<action selector="tappedDeepButton:" destination="BYZ-38-t0r" eventType="touchUpInside" id="wAp-88-Gdp"/>
</connections>
</button>
</subviews>
<color key="backgroundColor" white="1" alpha="1" colorSpace="custom" customColorSpace="genericGamma22GrayColorSpace"/>
<constraints>
<constraint firstItem="PzX-rP-SKt" firstAttribute="top" secondItem="NsO-ek-NfA" secondAttribute="bottom" constant="6" id="5Br-ID-zhu"/>
<constraint firstItem="lv9-QR-RxB" firstAttribute="centerX" secondItem="8bC-Xf-vdC" secondAttribute="centerX" id="9CN-JK-3QO"/>
<constraint firstItem="Itt-4A-FVR" firstAttribute="centerX" secondItem="8bC-Xf-vdC" secondAttribute="centerX" id="BPl-aC-Jiq"/>
<constraint firstItem="PzX-rP-SKt" firstAttribute="centerX" secondItem="8bC-Xf-vdC" secondAttribute="centerX" id="LZI-YF-JxH"/>
<constraint firstItem="HEd-C6-X5l" firstAttribute="centerX" secondItem="8bC-Xf-vdC" secondAttribute="centerX" id="P7g-P9-3IL"/>
<constraint firstItem="Aa3-61-EIv" firstAttribute="centerX" secondItem="8bC-Xf-vdC" secondAttribute="centerX" id="Q2L-ju-FT5"/>
<constraint firstItem="NsO-ek-NfA" firstAttribute="centerX" secondItem="8bC-Xf-vdC" secondAttribute="centerX" id="Q6a-Xz-gCB"/>
<constraint firstItem="wfy-db-euE" firstAttribute="top" secondItem="lv9-QR-RxB" secondAttribute="bottom" constant="8" id="RDs-tb-ujw"/>
<constraint firstItem="fMB-WM-8YO" firstAttribute="centerX" secondItem="8bC-Xf-vdC" secondAttribute="centerX" id="T3b-5e-9x9"/>
<constraint firstItem="HEd-C6-X5l" firstAttribute="top" secondItem="PzX-rP-SKt" secondAttribute="bottom" constant="23" id="XZb-1B-Gug"/>
<constraint firstItem="PzX-rP-SKt" firstAttribute="centerY" secondItem="8bC-Xf-vdC" secondAttribute="centerY" id="YHc-FZ-FDx"/>
<constraint firstItem="Aa3-61-EIv" firstAttribute="top" secondItem="HEd-C6-X5l" secondAttribute="bottom" constant="24" id="kKx-jg-CxZ"/>
<constraint firstItem="fMB-WM-8YO" firstAttribute="top" secondItem="y3c-jy-aDJ" secondAttribute="bottom" id="pRd-8b-Hib"/>
<constraint firstItem="Itt-4A-FVR" firstAttribute="top" secondItem="fMB-WM-8YO" secondAttribute="bottom" constant="24" id="zhT-JQ-n6p"/>
</constraints>
</view>
<connections>
<outlet property="DigitView" destination="PzX-rP-SKt" id="bZU-8T-7ZI"/>
<outlet property="accuracyLabel" destination="Itt-4A-FVR" id="REI-58-rzY"/>
<outlet property="predictionLabel" destination="Aa3-61-EIv" id="OkV-W4-Clr"/>
</connections>
</viewController>
<placeholder placeholderIdentifier="IBFirstResponder" id="dkx-z0-nzr" sceneMemberID="firstResponder"/>
</objects>
<point key="canvasLocation" x="-3427.734375" y="430.89311859443626"/>
</scene>
</scenes>
</document>

View File

@@ -0,0 +1,93 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
This file has routines for drawing and detecting user touches (the input digit)
*/
import UIKit
/**
This class handles the drawing in the DigitView so we can capture the digit the user draws.
It doesn't involve MPS or Metal at all; it exists only to capture user input.
*/
class DrawView: UIView {
// parameters for drawing: a line width of 15 seems to work well,
// and we draw white on a black background, just as MNIST expects its input
var linewidth = CGFloat(15) { didSet { setNeedsDisplay() } }
var color = UIColor.white { didSet { setNeedsDisplay() } }
// we keep a record of the lines the user has drawn in the view so we can redraw them
var lines: [Line] = []
var lastPoint: CGPoint!
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
lastPoint = touches.first!.location(in: self)
}
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
let newPoint = touches.first!.location(in: self)
// record the line segment the user just drew so we can render it in the view
lines.append(Line(start: lastPoint, end: newPoint))
lastPoint = newPoint
// make a draw call
setNeedsDisplay()
}
override func draw(_ rect: CGRect) {
super.draw(rect)
let drawPath = UIBezierPath()
drawPath.lineCapStyle = .round
for line in lines{
drawPath.move(to: line.start)
drawPath.addLine(to: line.end)
}
drawPath.lineWidth = linewidth
color.set()
drawPath.stroke()
}
/**
This function renders the view into a 28x28 grayscale context so its pixel data can be copied into an MTLTexture
- Returns:
A 28x28, 8-bit grayscale CGContext containing the view's contents, or nil if the context could not be created
*/
func getViewContext() -> CGContext? {
// our network takes in only grayscale images as input
let colorSpace:CGColorSpace = CGColorSpaceCreateDeviceGray()
// a single grayscale channel with no alpha is fed to the network
let bitmapInfo = CGImageAlphaInfo.none.rawValue
// this is where our view pixel data will go in once we make the render call
let context = CGContext(data: nil, width: 28, height: 28, bitsPerComponent: 8, bytesPerRow: 28, space: colorSpace, bitmapInfo: bitmapInfo)
// scale and translate so we have the full digit and in MNIST standard size 28x28
context!.translateBy(x: 0 , y: 28)
context!.scaleBy(x: 28/self.frame.size.width, y: -28/self.frame.size.height)
// put view pixel data in context
self.layer.render(in: context!)
return context
}
}
/**
Two points define a line; this class simply records one line segment (its start and end points)
*/
class Line{
var start, end: CGPoint
init(start: CGPoint, end: CGPoint) {
self.start = start
self.end = end
}
}
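The hand-off from this view to the network happens in the sample's ViewController (not part of this excerpt). A minimal sketch of that step, assuming a DrawView instance named drawView and an MNIST_Full_LayerNN instance named net (hypothetical names):

// render the drawing into a 28x28 grayscale context and copy it into the
// network's input texture, then run inference (nil inputImage -> srcImage is used)
if let context = drawView.getViewContext(), let pixels = context.data {
    net.srcImage.texture.replace(region: MTLRegionMake2D(0, 0, 28, 28),
                                 mipmapLevel: 0,
                                 withBytes: pixels,
                                 bytesPerRow: 28)   // one byte per grayscale pixel
    let digit = net.forward()
    print("predicted digit: \(digit)")
}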

View File

@@ -0,0 +1,73 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
GetMNISTData is used to import the test set from the MNIST dataset
*/
import Foundation
class GetMNISTData {
var labels = [UInt8]()
var images = [UInt8]()
var hdrW, hdrB: UnsafeMutableRawPointer?
var fd_b, fd_w: CInt
var sizeBias, sizeWeights: Int
init() {
// get the paths to the MNIST test images and labels files
let wtPath = Bundle.main.path(forResource: "t10k-images-idx3-ubyte", ofType: "data")
let bsPath = Bundle.main.path(forResource: "t10k-labels-idx1-ubyte", ofType: "data")
// find and open file
let URLL = Bundle.main.url(forResource: "t10k-labels-idx1-ubyte", withExtension: "data")
let dataL = NSData(contentsOf: URLL!)
let URLI = Bundle.main.url(forResource: "t10k-images-idx3-ubyte", withExtension: "data")
let dataI = NSData(contentsOf: URLI!)
// record the sizes of the label and image files so we can memory-map them
sizeBias = dataL!.length
sizeWeights = dataI!.length
// open file descriptors in read-only mode to the data files
fd_w = open(wtPath!, O_RDONLY, S_IRUSR | S_IWUSR | S_IRGRP | S_IWGRP | S_IROTH | S_IWOTH)
fd_b = open(bsPath!, O_RDONLY, S_IRUSR | S_IWUSR | S_IRGRP | S_IWGRP | S_IROTH | S_IWOTH)
assert(fd_w != -1, "Error: failed to open output file at \""+wtPath!+"\" errno = \(errno)\n")
assert(fd_b != -1, "Error: failed to open output file at \""+bsPath!+"\" errno = \(errno)\n")
// memory map the parameters
hdrW = mmap(nil, Int(sizeWeights), PROT_READ, MAP_FILE | MAP_SHARED, fd_w, 0);
hdrB = mmap(nil, Int(sizeBias), PROT_READ, MAP_FILE | MAP_SHARED, fd_b, 0);
let i = UnsafePointer(hdrW!.bindMemory(to: UInt8.self, capacity: Int(sizeWeights)))
let l = UnsafePointer(hdrB!.bindMemory(to: UInt8.self, capacity: Int(sizeBias)))
assert(i != UnsafePointer<UInt8>(bitPattern: -1), "mmap failed with errno = \(errno)")
assert(l != UnsafePointer<UInt8>(bitPattern: -1), "mmap failed with errno = \(errno)")
// skip the 16-byte header at the start of the images file
images = Array(UnsafeBufferPointer(start: (i + 16), count: sizeWeights - 16))
// skip the 8-byte header at the start of the labels file
labels = Array(UnsafeBufferPointer(start: (l + 8), count: sizeBias - 8))
}
deinit{
// unmap the files; the data has already been copied into the arrays above, so the mappings are no longer required
assert(munmap(hdrW, Int(sizeWeights)) == 0, "munmap failed with errno = \(errno)")
assert(munmap(hdrB, Int(sizeBias)) == 0, "munmap failed with errno = \(errno)")
// close file descriptors
close(fd_w)
close(fd_b)
}
}
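After the headers are skipped, each test image occupies 28 x 28 = 784 consecutive bytes in images, and each label one byte in labels. A usage sketch (the index and variable names here are illustrative, not the sample's own test code):

let mnist = GetMNISTData()
let imageNum = 42                                     // any index in 0..<10000
let start = imageNum * 28 * 28                        // 784 bytes per image
let imageBytes = Array(mnist.images[start ..< start + 28 * 28])
let correctLabel = UInt(mnist.labels[imageNum])
// imageBytes can be uploaded into a 28x28 unorm8 MPSImage texture and passed to
// forward(inputImage:imageNum:correctLabel:) to accumulate test-set accuracy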

View File

@@ -0,0 +1,47 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>CFBundleDevelopmentRegion</key>
<string>en</string>
<key>CFBundleExecutable</key>
<string>$(EXECUTABLE_NAME)</string>
<key>CFBundleIdentifier</key>
<string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
<key>CFBundleInfoDictionaryVersion</key>
<string>6.0</string>
<key>CFBundleName</key>
<string>$(PRODUCT_NAME)</string>
<key>CFBundlePackageType</key>
<string>APPL</string>
<key>CFBundleShortVersionString</key>
<string>1.0</string>
<key>CFBundleSignature</key>
<string>????</string>
<key>CFBundleVersion</key>
<string>1</string>
<key>LSRequiresIPhoneOS</key>
<true/>
<key>UILaunchStoryboardName</key>
<string>LaunchScreen</string>
<key>UIMainStoryboardFile</key>
<string>Main</string>
<key>UIRequiredDeviceCapabilities</key>
<array>
<string>armv7</string>
</array>
<key>UISupportedInterfaceOrientations</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
<string>UIInterfaceOrientationLandscapeLeft</string>
<string>UIInterfaceOrientationLandscapeRight</string>
</array>
<key>UISupportedInterfaceOrientations~ipad</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
<string>UIInterfaceOrientationPortraitUpsideDown</string>
<string>UIInterfaceOrientationLandscapeLeft</string>
<string>UIInterfaceOrientationLandscapeRight</string>
</array>
</dict>
</plist>

View File

@@ -0,0 +1,147 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
This is the deep network, where we define the layers and encode them onto a command buffer as needed
*/
import MetalPerformanceShaders
/**
This class contains the entire network, with all the layers needed to produce the final label
Resources:
* [Instructions](https://www.tensorflow.org/versions/r0.8/tutorials/mnist/pros/index.html#deep-mnist-for-experts) to run this network on TensorFlow.
*/
class MNIST_Deep_ConvNN: MNIST_Full_LayerNN{
// MPSImageDescriptors for different layers outputs to be put in
let c1id = MPSImageDescriptor(channelFormat: MPSImageFeatureChannelFormat.float16, width: 28, height: 28, featureChannels: 32)
let p1id = MPSImageDescriptor(channelFormat: MPSImageFeatureChannelFormat.float16, width: 14, height: 14, featureChannels: 32)
let c2id = MPSImageDescriptor(channelFormat: MPSImageFeatureChannelFormat.float16, width: 14, height: 14, featureChannels: 64)
let p2id = MPSImageDescriptor(channelFormat: MPSImageFeatureChannelFormat.float16, width: 7 , height: 7 , featureChannels: 64)
let fc1id = MPSImageDescriptor(channelFormat: MPSImageFeatureChannelFormat.float16, width: 1 , height: 1 , featureChannels: 1024)
// MPSImages and layers declared
var c1Image, c2Image, p1Image, p2Image, fc1Image: MPSImage
var conv1, conv2: MPSCNNConvolution
var fc1, fc2: MPSCNNFullyConnected
var pool: MPSCNNPoolingMax
var relu: MPSCNNNeuronReLU
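// Data flow through the deep network (shapes taken from the descriptors above):
//   input 28x28x1 -> conv1 -> 28x28x32 -> pool -> 14x14x32 -> conv2 -> 14x14x64
//   -> pool -> 7x7x64 -> fc1 -> 1x1x1024 -> fc2 -> 1x1x10 -> softmax -> label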
override init(withCommandQueue commandQueueIn: MTLCommandQueue!) {
// use device for a little while to initialize
let device = commandQueueIn.device
pool = MPSCNNPoolingMax(device: device, kernelWidth: 2, kernelHeight: 2, strideInPixelsX: 2, strideInPixelsY: 2)
pool.offset = MPSOffset(x: 1, y: 1, z: 0);
pool.edgeMode = MPSImageEdgeMode.clamp
relu = MPSCNNNeuronReLU(device: device, a: 0)
// Initialize MPSImage from descriptors
c1Image = MPSImage(device: device, imageDescriptor: c1id)
p1Image = MPSImage(device: device, imageDescriptor: p1id)
c2Image = MPSImage(device: device, imageDescriptor: c2id)
p2Image = MPSImage(device: device, imageDescriptor: p2id)
fc1Image = MPSImage(device: device, imageDescriptor: fc1id)
// setup convolution layers
conv1 = SlimMPSCNNConvolution(kernelWidth: 5,
kernelHeight: 5,
inputFeatureChannels: 1,
outputFeatureChannels: 32,
neuronFilter: relu,
device: device,
kernelParamsBinaryName: "conv1")
conv2 = SlimMPSCNNConvolution(kernelWidth: 5,
kernelHeight: 5,
inputFeatureChannels: 32,
outputFeatureChannels: 64,
neuronFilter: relu,
device: device,
kernelParamsBinaryName: "conv2")
// fully connected layers: fc1 acts like a 7x7 convolution taking 7x7x64 to 1x1x1024;
// fc2 acts like a 1x1 convolution taking 1x1x1024 to the 1x1x10 output
fc1 = SlimMPSCNNFullyConnected(kernelWidth: 7,
kernelHeight: 7,
inputFeatureChannels: 64,
outputFeatureChannels: 1024,
neuronFilter: nil,
device: device,
kernelParamsBinaryName: "fc1")
fc2 = SlimMPSCNNFullyConnected(kernelWidth: 1,
kernelHeight: 1,
inputFeatureChannels: 1024,
outputFeatureChannels: 10,
neuronFilter: nil,
device: device,
kernelParamsBinaryName: "fc2")
super.init(withCommandQueue: commandQueueIn)
}
/**
This function encodes all the layers of the network into the given commandBuffer; it calls subroutines for each piece of the network
- Parameters:
- inputImage: Image coming in on which the network will run
- imageNum: If the test set is being used we will get a value between 0 and 9999 for which of the 10,000 images is being evaluated
- correctLabel: The correct label for the inputImage while testing
- Returns:
Guess of the network as to what the digit is as UInt
*/
override func forward(inputImage: MPSImage? = nil, imageNum: Int = 9999, correctLabel: UInt = 10) -> UInt{
var label = UInt(99)
// to deliver optimal performance, we leave some resources used by MPSCNN to be released at the next drain of the autoreleasepool,
// so the caller can decide the appropriate time to release them
autoreleasepool{
// Get command buffer to use in MetalPerformanceShaders.
let commandBuffer = commandQueue.makeCommandBuffer()
// output will be stored in this image
let finalLayer = MPSImage(device: commandBuffer.device, imageDescriptor: did)
// encode layers to metal commandBuffer
if inputImage == nil {
conv1.encode(commandBuffer: commandBuffer, sourceImage: srcImage, destinationImage: c1Image)
}
else{
conv1.encode(commandBuffer: commandBuffer, sourceImage: inputImage!, destinationImage: c1Image)
}
pool.encode (commandBuffer: commandBuffer, sourceImage: c1Image , destinationImage: p1Image)
conv2.encode (commandBuffer: commandBuffer, sourceImage: p1Image , destinationImage: c2Image)
pool.encode (commandBuffer: commandBuffer, sourceImage: c2Image , destinationImage: p2Image)
fc1.encode (commandBuffer: commandBuffer, sourceImage: p2Image , destinationImage: fc1Image)
fc2.encode (commandBuffer: commandBuffer, sourceImage: fc1Image , destinationImage: dstImage)
softmax.encode(commandBuffer: commandBuffer, sourceImage: dstImage , destinationImage: finalLayer)
// add a completion handler that reads the predicted label as soon as the GPU is done and either compares it with the correct label or returns it
commandBuffer.addCompletedHandler { commandBuffer in
label = self.getLabel(finalLayer: finalLayer)
if(correctLabel == label){
__atomic_increment()
}
}
// commit commandbuffer to run on GPU and wait for completion
commandBuffer.commit()
if imageNum == 9999 {
commandBuffer.waitUntilCompleted()
}
}
return label
}
}

View File

@@ -0,0 +1,157 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
This is the single-layer network, where we define the layers and encode them onto a command buffer as needed
*/
import MetalPerformanceShaders
import Accelerate
/**
This class contains the entire network, with all the layers needed to produce the final label
Resources:
* [Instructions](https://www.tensorflow.org/versions/r0.8/tutorials/mnist/beginners/index.html#mnist-for-ml-beginners) to run this network on TensorFlow.
*/
class MNIST_Full_LayerNN{
// MPSImageDescriptors for different layers outputs to be put in
let sid = MPSImageDescriptor(channelFormat: MPSImageFeatureChannelFormat.unorm8, width: 28, height: 28, featureChannels: 1)
let did = MPSImageDescriptor(channelFormat: MPSImageFeatureChannelFormat.float16, width: 1, height: 1, featureChannels: 10)
// MPSImages and layers declared
var srcImage, dstImage : MPSImage
var layer: MPSCNNFullyConnected
var softmax : MPSCNNSoftMax
var commandQueue : MTLCommandQueue
var device : MTLDevice
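// Data flow (shapes taken from the descriptors above):
//   input 28x28x1 -> fully connected (28x28 kernel) -> 1x1x10 -> softmax -> label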
init(withCommandQueue commandQueueIn: MTLCommandQueue!){
// CommandQueue to be kept around
commandQueue = commandQueueIn
device = commandQueueIn.device
// Initialize MPSImage from descriptors
srcImage = MPSImage(device: device, imageDescriptor: sid)
dstImage = MPSImage(device: device, imageDescriptor: did)
// set up the fully connected layer (equivalent to a convolution whose kernel covers the entire 28x28 input image)
// clipRect and offset are set automatically
layer = SlimMPSCNNFullyConnected(kernelWidth: 28,
kernelHeight: 28,
inputFeatureChannels : 1,
outputFeatureChannels: 10,
neuronFilter: nil,
device: device,
kernelParamsBinaryName: "NN")
// prepare softmax layer to be applied at the end to get a clear label
softmax = MPSCNNSoftMax(device: device)
}
/**
This function encodes all the layers of the network into the given commandBuffer; it calls subroutines for each piece of the network
- Parameters:
- inputImage: Image coming in on which the network will run
- imageNum: If the test set is being used we will get a value between 0 and 9999 for which of the 10,000 images is being evaluated
- correctLabel: The correct label for the inputImage while testing
- Returns:
Guess of the network as to what the digit is as UInt
*/
func forward(inputImage: MPSImage? = nil, imageNum: Int = 9999, correctLabel: UInt = 10) -> UInt {
var label = UInt(99)
// to deliver optimal performance, we leave some resources used by MPSCNN to be released at the next drain of the autoreleasepool,
// so the caller can decide the appropriate time to release them
autoreleasepool{
// Get command buffer to use in MetalPerformanceShaders.
let commandBuffer = commandQueue.makeCommandBuffer()
// output will be stored in this image
let finalLayer = MPSImage(device: commandBuffer.device, imageDescriptor: did)
// encode layers to metal commandBuffer
if inputImage == nil {
layer.encode(commandBuffer: commandBuffer, sourceImage: srcImage, destinationImage: dstImage)
}
else{
layer.encode(commandBuffer: commandBuffer, sourceImage: inputImage!, destinationImage: dstImage)
}
softmax.encode(commandBuffer: commandBuffer, sourceImage: dstImage, destinationImage: finalLayer)
// add a completion handler that reads the predicted label as soon as the GPU is done and either compares it with the correct label or returns it
commandBuffer.addCompletedHandler { commandBuffer in
label = self.getLabel(finalLayer: finalLayer)
if(correctLabel == label){
__atomic_increment()
}
}
// commit commandbuffer to run on GPU and wait for completion
commandBuffer.commit()
if imageNum == 9999 || inputImage == nil {
commandBuffer.waitUntilCompleted()
}
}
return label
}
/**
This function reads the output probabilities from finalLayer back to the CPU and returns the label with the highest probability
- Parameters:
- finalLayer: output image of the network this has probabilities of each digit
- Returns:
Guess of the network as to what the digit is as UInt
*/
func getLabel(finalLayer: MPSImage) -> UInt {
// even though only 10 labels are output, the MTLTexture format used is RGBA float16, so 3 slices hold 3*4 = 12 values
var result_half_array = [UInt16](repeating: 6, count: 12)
var result_float_array = [Float](repeating: 0.3, count: 10)
for i in 0...2 {
finalLayer.texture.getBytes(&(result_half_array[4*i]),
bytesPerRow: MemoryLayout<UInt16>.size*1*4,
bytesPerImage: MemoryLayout<UInt16>.size*1*1*4,
from: MTLRegion(origin: MTLOrigin(x: 0, y: 0, z: 0),
size: MTLSize(width: 1, height: 1, depth: 1)),
mipmapLevel: 0,
slice: i)
}
// we use vImage to convert the data from float16 to float32; the GPU writes float16 while Swift's Float is 32-bit
var fullResultVImagebuf = vImage_Buffer(data: &result_float_array, height: 1, width: 10, rowBytes: 10*4)
var halfResultVImagebuf = vImage_Buffer(data: &result_half_array , height: 1, width: 10, rowBytes: 10*2)
if vImageConvert_Planar16FtoPlanarF(&halfResultVImagebuf, &fullResultVImagebuf, 0) != kvImageNoError {
print("Error in vImage")
}
// poll all labels for probability and choose the one with max probability to return
var max:Float = 0
var mostProbableDigit = 10
for i in 0...9 {
if(max < result_float_array[i]){
max = result_float_array[i]
mostProbableDigit = i
}
}
return UInt(mostProbableDigit)
}
}

View File

@@ -0,0 +1,14 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
A bridging header so our Swift code can see our Objective-C atomics
*/
#ifndef MPSCNNHelloWorld_Bridging_Header_h
#define MPSCNNHelloWorld_Bridging_Header_h
#import "atomics.h"
#endif /* MPSCNNHelloWorld_Bridging_Header_h */

View File

@@ -0,0 +1,215 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
This file provides slimmer wrappers for creating some common MPSCNN kernels; it is especially useful for fetching network parameters from .dat files
*/
import Foundation
import MetalPerformanceShaders
/**
This depends on MetalPerformanceShaders.framework
The SlimMPSCNNConvolution is a wrapper class around MPSCNNConvolution used to encapsulate:
- making an MPSCNNConvolutionDescriptor,
- adding network parameters (weights and bias binaries by memory mapping the binaries)
- getting our convolution layer
*/
class SlimMPSCNNConvolution: MPSCNNConvolution{
/**
A property to keep info from init time whether we will pad input image or not for use during encode call
*/
private var padding = true
/**
Initializes a convolution kernel.
- Parameters:
- kernelWidth: Kernel Width
- kernelHeight: Kernel Height
- inputFeatureChannels: Number of feature channels in the input of this layer
- outputFeatureChannels: Number of feature channels in the output of this layer
- neuronFilter: A neuronFilter to add at the end as activation, default is nil
- device: The MTLDevice on which this SlimMPSCNNConvolution filter will be used
- kernelParamsBinaryName: name of the layer; kernel parameters are fetched from the files named with the prefix "weights_" or "bias_" added to this name
- padding: Bool value whether to use padding or not
- strideXY: Stride of the filter
- destinationFeatureChannelOffset: FeatureChannel no. in the destination MPSImage to start writing from, helps with concat operations
- groupNum: if grouping is used, default value is 1 meaning no groups
- Returns:
A valid SlimMPSCNNConvolution object or nil, if failure.
*/
init(kernelWidth: UInt, kernelHeight: UInt, inputFeatureChannels: UInt, outputFeatureChannels: UInt, neuronFilter: MPSCNNNeuron? = nil, device: MTLDevice, kernelParamsBinaryName: String, padding willPad: Bool = true, strideXY: (UInt, UInt) = (1, 1), destinationFeatureChannelOffset: UInt = 0, groupNum: UInt = 1){
// calculate the sizes of the weights and bias that need to be memory-mapped
let sizeBias = outputFeatureChannels * UInt(MemoryLayout<Float>.size)
let sizeWeights = inputFeatureChannels * kernelHeight * kernelWidth * outputFeatureChannels * UInt(MemoryLayout<Float>.size)
// get the url to this layer's weights and bias
let wtPath = Bundle.main.path(forResource: "weights_" + kernelParamsBinaryName, ofType: "dat")
let bsPath = Bundle.main.path(forResource: "bias_" + kernelParamsBinaryName, ofType: "dat")
// open file descriptors in read-only mode to parameter files
let fd_w = open( wtPath!, O_RDONLY, S_IRUSR | S_IWUSR | S_IRGRP | S_IWGRP | S_IROTH | S_IWOTH)
let fd_b = open( bsPath!, O_RDONLY, S_IRUSR | S_IWUSR | S_IRGRP | S_IWGRP | S_IROTH | S_IWOTH)
assert(fd_w != -1, "Error: failed to open parameter file at \""+wtPath!+"\" errno = \(errno)\n")
assert(fd_b != -1, "Error: failed to open parameter file at \""+bsPath!+"\" errno = \(errno)\n")
// memory map the parameters
let hdrW = mmap(nil, Int(sizeWeights), PROT_READ, MAP_FILE | MAP_SHARED, fd_w, 0)
let hdrB = mmap(nil, Int(sizeBias), PROT_READ, MAP_FILE | MAP_SHARED, fd_b, 0)
// cast Void pointers to Float
let w = UnsafePointer(hdrW!.bindMemory(to: Float.self, capacity: Int(sizeWeights)))
let b = UnsafePointer(hdrB!.bindMemory(to: Float.self, capacity: Int(sizeBias)))
assert(w != UnsafePointer<Float>(bitPattern: -1), "mmap failed with errno = \(errno)")
assert(b != UnsafePointer<Float>(bitPattern: -1), "mmap failed with errno = \(errno)")
// create a convolution descriptor with the appropriate stride
let convDesc = MPSCNNConvolutionDescriptor(kernelWidth: Int(kernelWidth),
kernelHeight: Int(kernelHeight),
inputFeatureChannels: Int(inputFeatureChannels),
outputFeatureChannels: Int(outputFeatureChannels),
neuronFilter: neuronFilter)
convDesc.strideInPixelsX = Int(strideXY.0)
convDesc.strideInPixelsY = Int(strideXY.1)
assert(groupNum > 0, "Group size can't be less than 1")
convDesc.groups = Int(groupNum)
// initialize the convolution layer by calling the parent's (MPSCNNConvolution's) initializer
super.init(device: device,
convolutionDescriptor: convDesc,
kernelWeights: w,
biasTerms: b,
flags: MPSCNNConvolutionFlags.none)
self.destinationFeatureChannelOffset = Int(destinationFeatureChannelOffset)
// set padding for calculation of offset during encode call
padding = willPad
// unmap the files; MPSCNNConvolution copies and packs the weights internally at initialization, so we no longer need them
assert(munmap(hdrW, Int(sizeWeights)) == 0, "munmap failed with errno = \(errno)")
assert(munmap(hdrB, Int(sizeBias)) == 0, "munmap failed with errno = \(errno)")
// close file descriptors
close(fd_w)
close(fd_b)
}
/**
Encode an MPSCNNKernel into a command buffer. The operation proceeds out-of-place.
Here we calculate the appropriate offset to match how TensorFlow computes its padding from the input image size and stride.
This [link](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/nn.py) explains, in its header comments, how TensorFlow pads its convolution input images.
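For example (an illustrative case, not necessarily one of the sample's actual layer sizes): with a 28x28 source image, a 28x28 destination image, a 5x5 kernel and stride 1, pad_along_height = (28 - 1)*1 + 5 - 28 = 4, so pad_top = 4/2 = 2 and offset.y = 5/2 - 2 = 0.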
- Parameters:
- commandBuffer: A valid MTLCommandBuffer to receive the encoded filter
- sourceImage: A valid MPSImage object containing the source image.
- destinationImage: A valid MPSImage to be overwritten by result image. destinationImage may not alias sourceImage
*/
override func encode(commandBuffer: MTLCommandBuffer, sourceImage: MPSImage, destinationImage: MPSImage) {
// select the offset depending on whether padding is used
if padding {
let pad_along_height = ((destinationImage.height - 1) * strideInPixelsY + kernelHeight - sourceImage.height)
let pad_along_width = ((destinationImage.width - 1) * strideInPixelsX + kernelWidth - sourceImage.width)
let pad_top = Int(pad_along_height / 2)
let pad_left = Int(pad_along_width / 2)
self.offset = MPSOffset(x: ((Int(kernelWidth)/2) - pad_left), y: (Int(kernelHeight/2) - pad_top), z: 0)
}
else{
self.offset = MPSOffset(x: Int(kernelWidth)/2, y: Int(kernelHeight)/2, z: 0)
}
super.encode(commandBuffer: commandBuffer, sourceImage: sourceImage, destinationImage: destinationImage)
}
}
/**
This depends on MetalPerformanceShaders.framework
The SlimMPSCNNFullyConnected is a wrapper class around MPSCNNFullyConnected used to encapsulate:
- making an MPSCNNConvolutionDescriptor,
- adding network parameters (weights and biases, by memory-mapping their binary files)
- getting our fullyConnected layer
*/
class SlimMPSCNNFullyConnected: MPSCNNFullyConnected{
/**
Initializes a fully connected kernel.
- Parameters:
- kernelWidth: Kernel width
- kernelHeight: Kernel height
- inputFeatureChannels: Number of feature channels in the input of this layer
- outputFeatureChannels: Number of feature channels in the output of this layer
- neuronFilter: A neuron filter to add at the end as an activation, default is nil
- device: The MTLDevice on which this SlimMPSCNNFullyConnected filter will be used
- kernelParamsBinaryName: Name of the layer; the kernel parameters are fetched by prepending the prefix "weights_" or "bias_"
- destinationFeatureChannelOffset: The feature channel index in the destination MPSImage at which to start writing; helps with concat operations
- Returns:
A valid SlimMPSCNNFullyConnected object, or nil on failure.
*/
init(kernelWidth: UInt, kernelHeight: UInt, inputFeatureChannels: UInt, outputFeatureChannels: UInt, neuronFilter: MPSCNNNeuron? = nil, device: MTLDevice, kernelParamsBinaryName: String, destinationFeatureChannelOffset: UInt = 0){
// calculate the sizes of the weights and bias buffers that need to be memory-mapped
let sizeBias = outputFeatureChannels * UInt(MemoryLayout<Float>.size)
let sizeWeights = inputFeatureChannels * kernelHeight * kernelWidth * outputFeatureChannels * UInt(MemoryLayout<Float>.size)
// get the paths to this layer's weights and bias files
let wtPath = Bundle.main.path(forResource: "weights_" + kernelParamsBinaryName, ofType: "dat")
let bsPath = Bundle.main.path(forResource: "bias_" + kernelParamsBinaryName, ofType: "dat")
// open file descriptors in read-only mode to parameter files
let fd_w = open(wtPath!, O_RDONLY, S_IRUSR | S_IWUSR | S_IRGRP | S_IWGRP | S_IROTH | S_IWOTH)
let fd_b = open(bsPath!, O_RDONLY, S_IRUSR | S_IWUSR | S_IRGRP | S_IWGRP | S_IROTH | S_IWOTH)
assert(fd_w != -1, "Error: failed to open parameter file at \""+wtPath!+"\" errno = \(errno)\n")
assert(fd_b != -1, "Error: failed to open parameter file at \""+bsPath!+"\" errno = \(errno)\n")
// memory map the parameters
let hdrW = mmap(nil, Int(sizeWeights), PROT_READ, MAP_FILE | MAP_SHARED, fd_w, 0)
let hdrB = mmap(nil, Int(sizeBias), PROT_READ, MAP_FILE | MAP_SHARED, fd_b, 0)
// cast Void pointers to Float
let w = UnsafePointer(hdrW!.bindMemory(to: Float.self, capacity: Int(sizeWeights)))
let b = UnsafePointer(hdrB!.bindMemory(to: Float.self, capacity: Int(sizeBias)))
assert(w != UnsafePointer<Float>(bitPattern: -1), "mmap failed with errno = \(errno)")
assert(b != UnsafePointer<Float>(bitPattern: -1), "mmap failed with errno = \(errno)")
// create appropriate convolution descriptor (in fully connected, stride is always 1)
let convDesc = MPSCNNConvolutionDescriptor(kernelWidth: Int(kernelWidth),
kernelHeight: Int(kernelHeight),
inputFeatureChannels: Int(inputFeatureChannels),
outputFeatureChannels: Int(outputFeatureChannels),
neuronFilter: neuronFilter)
// initialize the fully connected layer by calling the parent's (MPSCNNFullyConnected's) initializer
super.init(device: device,
convolutionDescriptor: convDesc,
kernelWeights: w,
biasTerms: b,
flags: MPSCNNConvolutionFlags.none)
self.destinationFeatureChannelOffset = Int(destinationFeatureChannelOffset)
// unmap the files; MPSCNNFullyConnected copies and packs the weights internally at initialization, so we no longer need them
assert(munmap(hdrW, Int(sizeWeights)) == 0, "munmap failed with errno = \(errno)")
assert(munmap(hdrB, Int(sizeBias)) == 0, "munmap failed with errno = \(errno)")
// close file descriptors
close(fd_w)
close(fd_b)
}
}

View File

@ -0,0 +1,165 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
View Controller for Metal Performance Shaders Sample Code.
*/
import UIKit
import MetalPerformanceShaders
class ViewController: UIViewController{
// some properties used to control the app and store appropriate values
// we will start with the simple single-layer network
var deep = false
var commandQueue: MTLCommandQueue!
var device: MTLDevice!
// Networks we have
var neuralNetwork: MNIST_Full_LayerNN? = nil
var neuralNetworkDeep: MNIST_Deep_ConvNN? = nil
var runningNet: MNIST_Full_LayerNN? = nil
// loading MNIST Test Set here
let MNISTdata = GetMNISTData()
// MNIST dataset image parameters
let mnistInputWidth = 28
let mnistInputHeight = 28
let mnistInputNumPixels = 784
// Outlets to labels and view
@IBOutlet weak var digitView: DrawView!
@IBOutlet weak var predictionLabel: UILabel!
@IBOutlet weak var accuracyLabel: UILabel!
override func viewDidLoad() {
super.viewDidLoad()
// Load default device.
device = MTLCreateSystemDefaultDevice()
// Make sure the current device supports MetalPerformanceShaders.
guard MPSSupportsMTLDevice(device) else {
print("Metal Performance Shaders not Supported on current Device")
return
}
// Create new command queue.
commandQueue = device!.makeCommandQueue()
// initialize the networks we shall use to detect digits
neuralNetwork = MNIST_Full_LayerNN(withCommandQueue: commandQueue)
neuralNetworkDeep = MNIST_Deep_ConvNN(withCommandQueue: commandQueue)
runningNet = neuralNetwork
}
@IBAction func tappedDeepButton(_ sender: UIButton) {
// switch the network in use between the deep and the single-layer one
if deep {
sender.setTitle("Use Deep Net", for: UIControlState.normal)
runningNet = neuralNetwork
}
else{
sender.setTitle("Use Single Layer", for: UIControlState.normal)
runningNet = neuralNetworkDeep
}
deep = !deep
}
@IBAction func tappedClear(_ sender: UIButton) {
// clear the digitview
digitView.lines = []
digitView.setNeedsDisplay()
predictionLabel.isHidden = true
}
@IBAction func tappedTestSet(_ sender: UIButton) {
// counter for the number of correct detections on the test set
var correctDetections = Int32(0)
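// the MNIST test set contains 10,000 images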
let total = Float(10000)
accuracyLabel.isHidden = false
__atomic_reset()
// validate NeuralNetwork was initialized properly
assert(runningNet != nil)
for i in 0..<Int(total){
inference(imageNum: i, correctLabel: UInt(MNISTdata.labels[i]))
if i % 100 == 0 {
accuracyLabel.text = "\(i/100)% Done"
// this lets the run loop process pending events so the UI updates regularly during the loop
RunLoop.current.run(mode: RunLoopMode.defaultRunLoopMode, before: Date.distantPast)
}
}
// display accuracy of the network on the MNIST test set
correctDetections = __get_atomic_count()
accuracyLabel.isHidden = false
accuracyLabel.text = "Accuracy = \(Float(correctDetections * 100)/total)%"
}
@IBAction func tappedDetectDigit(_ sender: UIButton) {
// get the digitView context so we can read its pixel values to feed as input to the network
let context = digitView.getViewContext()
// validate NeuralNetwork was initialized properly
assert(runningNet != nil)
// put the input pixels into the MTLTexture backing the MPSImage
runningNet?.srcImage.texture.replace(region: MTLRegion( origin: MTLOrigin(x: 0, y: 0, z: 0),
size: MTLSize(width: mnistInputWidth, height: mnistInputHeight, depth: 1)),
mipmapLevel: 0,
slice: 0,
withBytes: context!.data!,
bytesPerRow: mnistInputWidth,
bytesPerImage: 0)
// run the network forward pass
let label = (runningNet?.forward())!
// show the prediction
predictionLabel.text = "\(label)"
predictionLabel.isHidden = false
}
/**
This function runs the inference network on the test set
- Parameters:
- imageNum: Index of the test-set image being evaluated, a value between 0 and 9999 across the 10,000 test images
- correctLabel: The correct label for the inputImage while testing
- Returns:
Void
*/
func inference(imageNum: Int, correctLabel: UInt){
// get the correct image pixels from the test set
var mnist_input_image = [UInt8]()
mnist_input_image += MNISTdata.images[(imageNum*mnistInputNumPixels)..<((imageNum+1)*mnistInputNumPixels)]
// create a source image for the network to forward
let inputImage = MPSImage(device: device, imageDescriptor: (runningNet?.sid)!)
// put image in source texture (input layer)
inputImage.texture.replace(region: MTLRegion(origin: MTLOrigin(x: 0, y: 0, z: 0),
size: MTLSize(width: mnistInputWidth, height: mnistInputHeight, depth: 1)),
mipmapLevel: 0,
slice: 0,
withBytes: mnist_input_image,
bytesPerRow: mnistInputWidth,
bytesPerImage: 0)
// run the network forward pass
_ = runningNet!.forward(inputImage: inputImage, imageNum : imageNum, correctLabel: correctLabel)
}
}

View File

@ -0,0 +1,19 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
We define some custom atomics to be used in the network so that separate threads at the end of command buffers can safely increment a shared counter.
*/
#ifndef atomics_h
#define atomics_h
#import <stdatomic.h>
static atomic_int cnt = ATOMIC_VAR_INIT(0);
void __atomic_increment();
void __atomic_reset();
int __get_atomic_count();
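/* Typical usage (a sketch inferred from the sample's flow): call __atomic_reset() before a test
   run, call __atomic_increment() from a command buffer completion handler whenever a prediction
   is correct, and read the total on the main thread with __get_atomic_count(). */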
#endif /* atomics_h */

View File

@ -0,0 +1,19 @@
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample's licensing information
Abstract:
We define some custom atomics to be used in the network so that separate threads at the end of command buffers can safely increment a shared counter.
*/
#import "atomics.h"
void __atomic_increment(){
atomic_fetch_add(&cnt, 1);
}
void __atomic_reset(){
cnt = ATOMIC_VAR_INIT(0);
}
int __get_atomic_count(){
return atomic_load(&cnt);
}

View File

@ -0,0 +1,2 @@
(binary .dat network parameter data; not human-readable)

View File

@ -0,0 +1,2 @@
(binary .dat network parameter data; not human-readable)

View File

@ -0,0 +1 @@
(binary .dat network parameter data; not human-readable)

View File

@ -0,0 +1 @@
(binary .dat network parameter data; not human-readable)

View File

@ -0,0 +1,30 @@
# MPSCNNHelloWorld: Simple Digit Detection Convolution Neural Networks (CNN)
This sample ports networks trained with the open source library TensorFlow on the MNIST dataset (http://yann.lecun.com/exdb/mnist/) to on-device inference using Metal Performance Shaders.
The sample demonstrates how to encode the different layers on the GPU and perform image recognition using trained parameters (weights and biases) fetched from a network that was pre-trained and saved with TensorFlow.
The Single Network can be found at:
https://www.tensorflow.org/versions/r0.8/tutorials/mnist/beginners/index.html#mnist-for-ml-beginners
The Deep Network can be found at:
https://www.tensorflow.org/versions/r0.8/tutorials/mnist/pros/index.html#deep-mnist-for-experts
The network parameters are stored in binary .dat files that are memory-mapped when needed.
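
As a rough sketch of how these wrappers fit together (the layer name `layer_1`, the channel counts, and the 28x28 image size below are illustrative, not the sample's exact configuration), a layer can be built from its memory-mapped .dat files and encoded into a command buffer roughly like this:

```swift
import MetalPerformanceShaders

let device = MTLCreateSystemDefaultDevice()!
let commandQueue = device.makeCommandQueue()

// a hypothetical 5x5 convolution with 1 input and 32 output feature channels;
// its parameters are read from weights_layer_1.dat and bias_layer_1.dat in the app bundle
let layer = SlimMPSCNNConvolution(kernelWidth: 5,
                                  kernelHeight: 5,
                                  inputFeatureChannels: 1,
                                  outputFeatureChannels: 32,
                                  neuronFilter: MPSCNNNeuronReLU(device: device, a: 0),
                                  device: device,
                                  kernelParamsBinaryName: "layer_1")

// source and destination images for a 28x28 grayscale input
let srcImage = MPSImage(device: device,
                        imageDescriptor: MPSImageDescriptor(channelFormat: .unorm8, width: 28, height: 28, featureChannels: 1))
let dstImage = MPSImage(device: device,
                        imageDescriptor: MPSImageDescriptor(channelFormat: .float16, width: 28, height: 28, featureChannels: 32))

// encode the forward pass of this layer and commit
let commandBuffer = commandQueue.makeCommandBuffer()
layer.encode(commandBuffer: commandBuffer, sourceImage: srcImage, destinationImage: dstImage)
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```

See SlimMPSCNN.swift for the full initializer parameters (stride, padding, groups, and destination feature channel offset).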
## Requirements
### Build
Xcode 8.0 or later; iOS 10.0 SDK or later
### Runtime
iOS 10.0 or later
### Device Feature Set
iOS GPU Family 2 v1
iOS GPU Family 2 v2
iOS GPU Family 3 v1
Copyright (C) 2016 Apple Inc. All rights reserved.