AVCamPhotoFilter: Version 3.0, 2017-09-19

Account for Metal APIs that now return optionals. Minor storyboard tweaks.

AVCamPhotoFilter demonstrates how to use the AVFoundation capture API to capture photos and how to apply a Core Image filter to both the captured photo and the live preview. It also shows how to properly propagate sample buffer attachments and attributes, including EXIF metadata and color space information (such as wide gamut).
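
As a rough illustration of the attachment-propagation idea, Core Video's CVBufferPropagateAttachments copies everything marked as propagatable from one buffer to another. The helper below is a hypothetical sketch, not code from this commit:

import CoreVideo

// Copy all attachments marked "should propagate" (color primaries, transfer
// function, YCbCr matrix, and similar color space information) from the source
// buffer onto a freshly rendered destination buffer, so that downstream
// consumers see the same color information as the input.
func propagateColorAttachments(from source: CVPixelBuffer, to destination: CVPixelBuffer) {
    CVBufferPropagateAttachments(source, destination)
}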

Signed-off-by: Liu Lantao <liulantao@gmail.com>
Liu Lantao 2017-10-16 22:46:40 +08:00
parent 6471a16cec
commit 378ae63343
No known key found for this signature in database
GPG Key ID: BF35AA0CD375679D
25 changed files with 3471 additions and 0 deletions

AVCamPhotoFilter/.gitignore vendored Normal file

@@ -0,0 +1,10 @@
# See LICENSE folder for this sample's licensing information.
#
# Apple sample code gitignore configuration.
# Finder
.DS_Store
# Xcode - User files
xcuserdata/
*.xcworkspace


@@ -0,0 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<array/>
</plist>

AVCamPhotoFilter.xcodeproj/project.pbxproj Normal file

@@ -0,0 +1,417 @@
// !$*UTF8*$!
{
archiveVersion = 1;
classes = {
};
objectVersion = 46;
objects = {
/* Begin PBXBuildFile section */
2608B7DB1E70937A00A7310F /* RosyMetalRenderer.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2608B7DA1E70937A00A7310F /* RosyMetalRenderer.swift */; };
261FA1F71E945EEA00BE8D47 /* VideoMixer.swift in Sources */ = {isa = PBXBuildFile; fileRef = 261FA1F61E945EEA00BE8D47 /* VideoMixer.swift */; };
261FA1F91E94641700BE8D47 /* Mixer.metal in Sources */ = {isa = PBXBuildFile; fileRef = 261FA1F81E94641700BE8D47 /* Mixer.metal */; };
2633CF1D1E7C65D500FC80E1 /* DepthToGrayscaleConverter.swift in Sources */ = {isa = PBXBuildFile; fileRef = 2633CF1C1E7C65D500FC80E1 /* DepthToGrayscaleConverter.swift */; };
2672370C1E79BF6E003D2EAA /* RosyEffect.metal in Sources */ = {isa = PBXBuildFile; fileRef = 2672370B1E79BF6E003D2EAA /* RosyEffect.metal */; };
2672370E1E79BFC0003D2EAA /* DepthToGrayscale.metal in Sources */ = {isa = PBXBuildFile; fileRef = 2672370D1E79BFBF003D2EAA /* DepthToGrayscale.metal */; };
267ED8501ED7974A00898078 /* minMaxFromBuffer.m in Sources */ = {isa = PBXBuildFile; fileRef = 267ED84E1ED7965B00898078 /* minMaxFromBuffer.m */; };
E422DFC01CEF894F0047D7A4 /* Main.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = E422DFBE1CEF894F0047D7A4 /* Main.storyboard */; };
E422DFC21CEF894F0047D7A4 /* Assets.xcassets in Resources */ = {isa = PBXBuildFile; fileRef = E422DFC11CEF894F0047D7A4 /* Assets.xcassets */; };
E422DFC51CEF894F0047D7A4 /* LaunchScreen.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = E422DFC31CEF894F0047D7A4 /* LaunchScreen.storyboard */; };
E422DFCE1CEF8AF50047D7A4 /* CameraViewController.swift in Sources */ = {isa = PBXBuildFile; fileRef = E422DFCC1CEF8AF50047D7A4 /* CameraViewController.swift */; };
E422DFCF1CEF8AF50047D7A4 /* AppDelegate.swift in Sources */ = {isa = PBXBuildFile; fileRef = E422DFCD1CEF8AF50047D7A4 /* AppDelegate.swift */; };
E46129041CF4C0B8004FE176 /* PreviewMetalView.swift in Sources */ = {isa = PBXBuildFile; fileRef = E46129031CF4C0B8004FE176 /* PreviewMetalView.swift */; };
E4A8A1421CF6188A006823AB /* PassThrough.metal in Sources */ = {isa = PBXBuildFile; fileRef = E4A8A1411CF6188A006823AB /* PassThrough.metal */; };
E4E160DB1D00979000C83A5A /* FilterRenderer.swift in Sources */ = {isa = PBXBuildFile; fileRef = E4E160DA1D00979000C83A5A /* FilterRenderer.swift */; };
E4E160DD1D0099F100C83A5A /* RosyCIRenderer.swift in Sources */ = {isa = PBXBuildFile; fileRef = E4E160DC1D0099F100C83A5A /* RosyCIRenderer.swift */; };
/* End PBXBuildFile section */
/* Begin PBXFileReference section */
07AD6A98FAE41685AA0BF436 /* SampleCode.xcconfig */ = {isa = PBXFileReference; includeInIndex = 1; lastKnownFileType = text.xcconfig; name = SampleCode.xcconfig; path = Configuration/SampleCode.xcconfig; sourceTree = "<group>"; };
2608B7DA1E70937A00A7310F /* RosyMetalRenderer.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = RosyMetalRenderer.swift; sourceTree = "<group>"; };
261FA1F61E945EEA00BE8D47 /* VideoMixer.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = VideoMixer.swift; sourceTree = "<group>"; };
261FA1F81E94641700BE8D47 /* Mixer.metal */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.metal; path = Mixer.metal; sourceTree = "<group>"; };
2633CF1C1E7C65D500FC80E1 /* DepthToGrayscaleConverter.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = DepthToGrayscaleConverter.swift; sourceTree = "<group>"; };
2672370B1E79BF6E003D2EAA /* RosyEffect.metal */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.metal; path = RosyEffect.metal; sourceTree = "<group>"; };
2672370D1E79BFBF003D2EAA /* DepthToGrayscale.metal */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.metal; path = DepthToGrayscale.metal; sourceTree = "<group>"; };
267ED84D1ED7965A00898078 /* AVCamPhotoFilter-Bridging-Header.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = "AVCamPhotoFilter-Bridging-Header.h"; sourceTree = "<group>"; };
267ED84E1ED7965B00898078 /* minMaxFromBuffer.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = minMaxFromBuffer.m; sourceTree = "<group>"; };
267ED84F1ED7965B00898078 /* minMaxFromBuffer.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = minMaxFromBuffer.h; sourceTree = "<group>"; };
C221CC7AB4DD7C36524EC109 /* LICENSE.txt */ = {isa = PBXFileReference; includeInIndex = 1; path = LICENSE.txt; sourceTree = "<group>"; };
E414FC6C1D5921FD0007C979 /* README.md */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = net.daringfireball.markdown; path = README.md; sourceTree = "<group>"; };
E422DFB71CEF894F0047D7A4 /* AVCamPhotoFilter.app */ = {isa = PBXFileReference; explicitFileType = wrapper.application; includeInIndex = 0; path = AVCamPhotoFilter.app; sourceTree = BUILT_PRODUCTS_DIR; };
E422DFBF1CEF894F0047D7A4 /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/Main.storyboard; sourceTree = "<group>"; };
E422DFC11CEF894F0047D7A4 /* Assets.xcassets */ = {isa = PBXFileReference; lastKnownFileType = folder.assetcatalog; path = Assets.xcassets; sourceTree = "<group>"; };
E422DFC41CEF894F0047D7A4 /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/LaunchScreen.storyboard; sourceTree = "<group>"; };
E422DFC61CEF894F0047D7A4 /* Info.plist */ = {isa = PBXFileReference; lastKnownFileType = text.plist.xml; path = Info.plist; sourceTree = "<group>"; };
E422DFCC1CEF8AF50047D7A4 /* CameraViewController.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = CameraViewController.swift; sourceTree = "<group>"; };
E422DFCD1CEF8AF50047D7A4 /* AppDelegate.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = AppDelegate.swift; sourceTree = "<group>"; };
E46129031CF4C0B8004FE176 /* PreviewMetalView.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = PreviewMetalView.swift; sourceTree = "<group>"; };
E4A8A1411CF6188A006823AB /* PassThrough.metal */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.metal; path = PassThrough.metal; sourceTree = "<group>"; };
E4E160DA1D00979000C83A5A /* FilterRenderer.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = FilterRenderer.swift; sourceTree = "<group>"; };
E4E160DC1D0099F100C83A5A /* RosyCIRenderer.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = RosyCIRenderer.swift; sourceTree = "<group>"; };
/* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */
E422DFB41CEF894F0047D7A4 /* Frameworks */ = {
isa = PBXFrameworksBuildPhase;
buildActionMask = 2147483647;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXFrameworksBuildPhase section */
/* Begin PBXGroup section */
2672370F1E79BFD1003D2EAA /* Shaders */ = {
isa = PBXGroup;
children = (
E4A8A1411CF6188A006823AB /* PassThrough.metal */,
2672370B1E79BF6E003D2EAA /* RosyEffect.metal */,
2672370D1E79BFBF003D2EAA /* DepthToGrayscale.metal */,
261FA1F81E94641700BE8D47 /* Mixer.metal */,
);
path = Shaders;
sourceTree = "<group>";
};
8F7DEDF83CEB2358D1CCC092 /* Configuration */ = {
isa = PBXGroup;
children = (
07AD6A98FAE41685AA0BF436 /* SampleCode.xcconfig */,
);
name = Configuration;
sourceTree = "<group>";
};
E422DFAE1CEF894F0047D7A4 = {
isa = PBXGroup;
children = (
E414FC6C1D5921FD0007C979 /* README.md */,
E422DFB91CEF894F0047D7A4 /* AVCamPhotoFilter */,
E422DFB81CEF894F0047D7A4 /* Products */,
8F7DEDF83CEB2358D1CCC092 /* Configuration */,
F338657CC0FEDC6E66F03A1D /* LICENSE */,
);
sourceTree = "<group>";
};
E422DFB81CEF894F0047D7A4 /* Products */ = {
isa = PBXGroup;
children = (
E422DFB71CEF894F0047D7A4 /* AVCamPhotoFilter.app */,
);
name = Products;
sourceTree = "<group>";
};
E422DFB91CEF894F0047D7A4 /* AVCamPhotoFilter */ = {
isa = PBXGroup;
children = (
E422DFCD1CEF8AF50047D7A4 /* AppDelegate.swift */,
E422DFCC1CEF8AF50047D7A4 /* CameraViewController.swift */,
E46129031CF4C0B8004FE176 /* PreviewMetalView.swift */,
E4E160DA1D00979000C83A5A /* FilterRenderer.swift */,
E4E160DC1D0099F100C83A5A /* RosyCIRenderer.swift */,
2608B7DA1E70937A00A7310F /* RosyMetalRenderer.swift */,
2633CF1C1E7C65D500FC80E1 /* DepthToGrayscaleConverter.swift */,
261FA1F61E945EEA00BE8D47 /* VideoMixer.swift */,
267ED84F1ED7965B00898078 /* minMaxFromBuffer.h */,
267ED84E1ED7965B00898078 /* minMaxFromBuffer.m */,
2672370F1E79BFD1003D2EAA /* Shaders */,
E422DFBE1CEF894F0047D7A4 /* Main.storyboard */,
E422DFC11CEF894F0047D7A4 /* Assets.xcassets */,
E422DFC31CEF894F0047D7A4 /* LaunchScreen.storyboard */,
E422DFC61CEF894F0047D7A4 /* Info.plist */,
267ED84D1ED7965A00898078 /* AVCamPhotoFilter-Bridging-Header.h */,
);
path = AVCamPhotoFilter;
sourceTree = "<group>";
};
F338657CC0FEDC6E66F03A1D /* LICENSE */ = {
isa = PBXGroup;
children = (
C221CC7AB4DD7C36524EC109 /* LICENSE.txt */,
);
name = LICENSE;
path = .;
sourceTree = "<group>";
};
/* End PBXGroup section */
/* Begin PBXNativeTarget section */
E422DFB61CEF894F0047D7A4 /* AVCamPhotoFilter */ = {
isa = PBXNativeTarget;
buildConfigurationList = E422DFC91CEF894F0047D7A4 /* Build configuration list for PBXNativeTarget "AVCamPhotoFilter" */;
buildPhases = (
E422DFB31CEF894F0047D7A4 /* Sources */,
E422DFB41CEF894F0047D7A4 /* Frameworks */,
E422DFB51CEF894F0047D7A4 /* Resources */,
);
buildRules = (
);
dependencies = (
);
name = AVCamPhotoFilter;
productName = AVCamPhotoFilter;
productReference = E422DFB71CEF894F0047D7A4 /* AVCamPhotoFilter.app */;
productType = "com.apple.product-type.application";
};
/* End PBXNativeTarget section */
/* Begin PBXProject section */
E422DFAF1CEF894F0047D7A4 /* Project object */ = {
isa = PBXProject;
attributes = {
LastSwiftUpdateCheck = 0800;
LastUpgradeCheck = 0900;
ORGANIZATIONNAME = Apple;
TargetAttributes = {
E422DFB61CEF894F0047D7A4 = {
CreatedOnToolsVersion = 8.0;
DevelopmentTeamName = "Apple Inc. - Core OS Plus Others";
LastSwiftMigration = 0900;
ProvisioningStyle = Automatic;
};
};
};
buildConfigurationList = E422DFB21CEF894F0047D7A4 /* Build configuration list for PBXProject "AVCamPhotoFilter" */;
compatibilityVersion = "Xcode 3.2";
developmentRegion = English;
hasScannedForEncodings = 0;
knownRegions = (
en,
Base,
);
mainGroup = E422DFAE1CEF894F0047D7A4;
productRefGroup = E422DFB81CEF894F0047D7A4 /* Products */;
projectDirPath = "";
projectRoot = "";
targets = (
E422DFB61CEF894F0047D7A4 /* AVCamPhotoFilter */,
);
};
/* End PBXProject section */
/* Begin PBXResourcesBuildPhase section */
E422DFB51CEF894F0047D7A4 /* Resources */ = {
isa = PBXResourcesBuildPhase;
buildActionMask = 2147483647;
files = (
E422DFC51CEF894F0047D7A4 /* LaunchScreen.storyboard in Resources */,
E422DFC21CEF894F0047D7A4 /* Assets.xcassets in Resources */,
E422DFC01CEF894F0047D7A4 /* Main.storyboard in Resources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXResourcesBuildPhase section */
/* Begin PBXSourcesBuildPhase section */
E422DFB31CEF894F0047D7A4 /* Sources */ = {
isa = PBXSourcesBuildPhase;
buildActionMask = 2147483647;
files = (
E4E160DB1D00979000C83A5A /* FilterRenderer.swift in Sources */,
261FA1F71E945EEA00BE8D47 /* VideoMixer.swift in Sources */,
261FA1F91E94641700BE8D47 /* Mixer.metal in Sources */,
2608B7DB1E70937A00A7310F /* RosyMetalRenderer.swift in Sources */,
E4E160DD1D0099F100C83A5A /* RosyCIRenderer.swift in Sources */,
E4A8A1421CF6188A006823AB /* PassThrough.metal in Sources */,
E422DFCF1CEF8AF50047D7A4 /* AppDelegate.swift in Sources */,
267ED8501ED7974A00898078 /* minMaxFromBuffer.m in Sources */,
2672370C1E79BF6E003D2EAA /* RosyEffect.metal in Sources */,
2633CF1D1E7C65D500FC80E1 /* DepthToGrayscaleConverter.swift in Sources */,
2672370E1E79BFC0003D2EAA /* DepthToGrayscale.metal in Sources */,
E46129041CF4C0B8004FE176 /* PreviewMetalView.swift in Sources */,
E422DFCE1CEF8AF50047D7A4 /* CameraViewController.swift in Sources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXSourcesBuildPhase section */
/* Begin PBXVariantGroup section */
E422DFBE1CEF894F0047D7A4 /* Main.storyboard */ = {
isa = PBXVariantGroup;
children = (
E422DFBF1CEF894F0047D7A4 /* Base */,
);
name = Main.storyboard;
sourceTree = "<group>";
};
E422DFC31CEF894F0047D7A4 /* LaunchScreen.storyboard */ = {
isa = PBXVariantGroup;
children = (
E422DFC41CEF894F0047D7A4 /* Base */,
);
name = LaunchScreen.storyboard;
sourceTree = "<group>";
};
/* End PBXVariantGroup section */
/* Begin XCBuildConfiguration section */
E422DFC71CEF894F0047D7A4 /* Debug */ = {
isa = XCBuildConfiguration;
baseConfigurationReference = 07AD6A98FAE41685AA0BF436 /* SampleCode.xcconfig */;
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
ASSETCATALOG_COMPRESSION = lossless;
CLANG_ANALYZER_NONNULL = YES;
CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x";
CLANG_CXX_LIBRARY = "libc++";
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
CLANG_WARN_CONSTANT_CONVERSION = YES;
CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;
CLANG_WARN_EMPTY_BODY = YES;
CLANG_WARN_ENUM_CONVERSION = YES;
CLANG_WARN_INFINITE_RECURSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_SUSPICIOUS_MOVE = YES;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
CODE_SIGN_IDENTITY = "iPhone Developer";
"CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer";
COPY_PHASE_STRIP = NO;
DEBUG_INFORMATION_FORMAT = dwarf;
ENABLE_STRICT_OBJC_MSGSEND = YES;
ENABLE_TESTABILITY = YES;
GCC_C_LANGUAGE_STANDARD = gnu99;
GCC_DYNAMIC_NO_PIC = NO;
GCC_NO_COMMON_BLOCKS = YES;
GCC_OPTIMIZATION_LEVEL = 0;
GCC_PREPROCESSOR_DEFINITIONS = (
"DEBUG=1",
"$(inherited)",
);
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;
GCC_WARN_UNDECLARED_SELECTOR = YES;
GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
IPHONEOS_DEPLOYMENT_TARGET = 10.0;
MTL_ENABLE_DEBUG_INFO = YES;
ONLY_ACTIVE_ARCH = YES;
SDKROOT = iphoneos;
SWIFT_OPTIMIZATION_LEVEL = "-Onone";
SWIFT_VERSION = 4.0;
TARGETED_DEVICE_FAMILY = "1,2";
};
name = Debug;
};
E422DFC81CEF894F0047D7A4 /* Release */ = {
isa = XCBuildConfiguration;
baseConfigurationReference = 07AD6A98FAE41685AA0BF436 /* SampleCode.xcconfig */;
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
ASSETCATALOG_COMPRESSION = "respect-asset-catalog";
CLANG_ANALYZER_NONNULL = YES;
CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x";
CLANG_CXX_LIBRARY = "libc++";
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
CLANG_WARN_CONSTANT_CONVERSION = YES;
CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;
CLANG_WARN_EMPTY_BODY = YES;
CLANG_WARN_ENUM_CONVERSION = YES;
CLANG_WARN_INFINITE_RECURSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_SUSPICIOUS_MOVE = YES;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
CODE_SIGN_IDENTITY = "iPhone Developer";
"CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer";
COPY_PHASE_STRIP = NO;
DEBUG_INFORMATION_FORMAT = "dwarf-with-dsym";
ENABLE_NS_ASSERTIONS = NO;
ENABLE_STRICT_OBJC_MSGSEND = YES;
GCC_C_LANGUAGE_STANDARD = gnu99;
GCC_NO_COMMON_BLOCKS = YES;
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;
GCC_WARN_UNDECLARED_SELECTOR = YES;
GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
IPHONEOS_DEPLOYMENT_TARGET = 10.0;
MTL_ENABLE_DEBUG_INFO = NO;
SDKROOT = iphoneos;
SWIFT_OPTIMIZATION_LEVEL = "-Owholemodule";
SWIFT_VERSION = 4.0;
TARGETED_DEVICE_FAMILY = "1,2";
VALIDATE_PRODUCT = YES;
};
name = Release;
};
E422DFCA1CEF894F0047D7A4 /* Debug */ = {
isa = XCBuildConfiguration;
baseConfigurationReference = 07AD6A98FAE41685AA0BF436 /* SampleCode.xcconfig */;
buildSettings = {
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;
CLANG_ENABLE_MODULES = YES;
"CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer";
DEVELOPMENT_TEAM = "";
INFOPLIST_FILE = AVCamPhotoFilter/Info.plist;
INSTALL_PATH = "";
IPHONEOS_DEPLOYMENT_TARGET = 11.0;
LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks";
MTL_LANGUAGE_REVISION = UseDeploymentTarget;
MTL_OPTIMIZATION_LEVEL = 0;
OTHER_SWIFT_FLAGS = "";
PRODUCT_BUNDLE_IDENTIFIER = "com.example.apple-samplecode.AVCamPhotoFilter${SAMPLE_CODE_DISAMBIGUATOR}";
PRODUCT_NAME = "$(TARGET_NAME)";
PROVISIONING_PROFILE_SPECIFIER = "";
SDKROOT = iphoneos;
SWIFT_OBJC_BRIDGING_HEADER = "AVCamPhotoFilter/AVCamPhotoFilter-Bridging-Header.h";
SWIFT_OPTIMIZATION_LEVEL = "-Onone";
SWIFT_VERSION = 4.0;
};
name = Debug;
};
E422DFCB1CEF894F0047D7A4 /* Release */ = {
isa = XCBuildConfiguration;
baseConfigurationReference = 07AD6A98FAE41685AA0BF436 /* SampleCode.xcconfig */;
buildSettings = {
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;
CLANG_ENABLE_MODULES = YES;
"CODE_SIGN_IDENTITY[sdk=iphoneos*]" = "iPhone Developer";
DEVELOPMENT_TEAM = "";
INFOPLIST_FILE = AVCamPhotoFilter/Info.plist;
INSTALL_PATH = "";
IPHONEOS_DEPLOYMENT_TARGET = 11.0;
LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks";
MTL_LANGUAGE_REVISION = UseDeploymentTarget;
MTL_OPTIMIZATION_LEVEL = 0;
PRODUCT_BUNDLE_IDENTIFIER = "com.example.apple-samplecode.AVCamPhotoFilter${SAMPLE_CODE_DISAMBIGUATOR}";
PRODUCT_NAME = "$(TARGET_NAME)";
PROVISIONING_PROFILE_SPECIFIER = "";
SDKROOT = iphoneos;
SWIFT_OBJC_BRIDGING_HEADER = "AVCamPhotoFilter/AVCamPhotoFilter-Bridging-Header.h";
SWIFT_VERSION = 4.0;
};
name = Release;
};
/* End XCBuildConfiguration section */
/* Begin XCConfigurationList section */
E422DFB21CEF894F0047D7A4 /* Build configuration list for PBXProject "AVCamPhotoFilter" */ = {
isa = XCConfigurationList;
buildConfigurations = (
E422DFC71CEF894F0047D7A4 /* Debug */,
E422DFC81CEF894F0047D7A4 /* Release */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Release;
};
E422DFC91CEF894F0047D7A4 /* Build configuration list for PBXNativeTarget "AVCamPhotoFilter" */ = {
isa = XCConfigurationList;
buildConfigurations = (
E422DFCA1CEF894F0047D7A4 /* Debug */,
E422DFCB1CEF894F0047D7A4 /* Release */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Release;
};
/* End XCConfigurationList section */
};
rootObject = E422DFAF1CEF894F0047D7A4 /* Project object */;
}

AVCamPhotoFilter/AVCamPhotoFilter-Bridging-Header.h Normal file

@@ -0,0 +1,8 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Bridging header for AVCamPhotoFilter.
*/
#import "minMaxFromBuffer.h"

AVCamPhotoFilter/AppDelegate.swift Normal file

@@ -0,0 +1,13 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Application delegate.
*/
import UIKit
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
}

AVCamPhotoFilter/Assets.xcassets/AppIcon.appiconset/Contents.json Normal file

@@ -0,0 +1,98 @@
{
"images" : [
{
"idiom" : "iphone",
"size" : "20x20",
"scale" : "2x"
},
{
"idiom" : "iphone",
"size" : "20x20",
"scale" : "3x"
},
{
"idiom" : "iphone",
"size" : "29x29",
"scale" : "2x"
},
{
"idiom" : "iphone",
"size" : "29x29",
"scale" : "3x"
},
{
"idiom" : "iphone",
"size" : "40x40",
"scale" : "2x"
},
{
"idiom" : "iphone",
"size" : "40x40",
"scale" : "3x"
},
{
"idiom" : "iphone",
"size" : "60x60",
"scale" : "2x"
},
{
"idiom" : "iphone",
"size" : "60x60",
"scale" : "3x"
},
{
"idiom" : "ipad",
"size" : "20x20",
"scale" : "1x"
},
{
"idiom" : "ipad",
"size" : "20x20",
"scale" : "2x"
},
{
"idiom" : "ipad",
"size" : "29x29",
"scale" : "1x"
},
{
"idiom" : "ipad",
"size" : "29x29",
"scale" : "2x"
},
{
"idiom" : "ipad",
"size" : "40x40",
"scale" : "1x"
},
{
"idiom" : "ipad",
"size" : "40x40",
"scale" : "2x"
},
{
"idiom" : "ipad",
"size" : "76x76",
"scale" : "1x"
},
{
"idiom" : "ipad",
"size" : "76x76",
"scale" : "2x"
},
{
"idiom" : "ipad",
"size" : "83.5x83.5",
"scale" : "2x"
},
{
"idiom" : "ios-marketing",
"size" : "1024x1024",
"scale" : "1x"
}
],
"info" : {
"version" : 1,
"author" : "xcode"
}
}

AVCamPhotoFilter/Base.lproj/LaunchScreen.storyboard Normal file

@@ -0,0 +1,28 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB" version="3.0" toolsVersion="11129.12" systemVersion="16A195a" targetRuntime="iOS.CocoaTouch" propertyAccessControl="none" useAutolayout="YES" launchScreen="YES" useTraitCollections="YES" colorMatched="YES" initialViewController="01J-lp-oVM">
<dependencies>
<deployment identifier="iOS"/>
<plugIn identifier="com.apple.InterfaceBuilder.IBCocoaTouchPlugin" version="11103.9"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<scenes>
<!--View Controller-->
<scene sceneID="EHf-IW-A2E">
<objects>
<viewController id="01J-lp-oVM" sceneMemberID="viewController">
<layoutGuides>
<viewControllerLayoutGuide type="top" id="Llm-lL-Icb"/>
<viewControllerLayoutGuide type="bottom" id="xb3-aO-Qok"/>
</layoutGuides>
<view key="view" contentMode="scaleToFill" id="Ze5-6b-2t3">
<rect key="frame" x="0.0" y="0.0" width="375" height="667"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<color key="backgroundColor" red="1" green="1" blue="1" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
</view>
</viewController>
<placeholder placeholderIdentifier="IBFirstResponder" id="iYj-Kq-Ea1" userLabel="First Responder" sceneMemberID="firstResponder"/>
</objects>
<point key="canvasLocation" x="53" y="375"/>
</scene>
</scenes>
</document>

AVCamPhotoFilter/Base.lproj/Main.storyboard Normal file

@@ -0,0 +1,202 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB" version="3.0" toolsVersion="13189.4" targetRuntime="iOS.CocoaTouch" propertyAccessControl="none" useAutolayout="YES" useTraitCollections="YES" useSafeAreas="YES" colorMatched="YES" initialViewController="BYZ-38-t0r">
<device id="retina4_7" orientation="portrait">
<adaptation id="fullscreen"/>
</device>
<dependencies>
<plugIn identifier="com.apple.InterfaceBuilder.IBCocoaTouchPlugin" version="13165.3"/>
<capability name="Safe area layout guides" minToolsVersion="9.0"/>
<capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
</dependencies>
<scenes>
<!--Camera View Controller-->
<scene sceneID="tne-QT-ifu">
<objects>
<viewController id="BYZ-38-t0r" customClass="CameraViewController" customModule="AVCamPhotoFilter" customModuleProvider="target" sceneMemberID="viewController">
<view key="view" contentMode="scaleToFill" id="Eqv-7y-uCz" userLabel="View">
<rect key="frame" x="0.0" y="0.0" width="375" height="667"/>
<autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
<subviews>
<view contentMode="scaleToFill" preservesSuperviewLayoutMargins="YES" translatesAutoresizingMaskIntoConstraints="NO" id="ajY-TT-566" userLabel="Preview" customClass="PreviewMetalView" customModule="AVCamPhotoFilter" customModuleProvider="target">
<rect key="frame" x="0.0" y="20" width="375" height="647"/>
<subviews>
<slider opaque="NO" contentMode="scaleToFill" contentHorizontalAlignment="center" contentVerticalAlignment="center" value="0.5" minValue="0.0" maxValue="1" translatesAutoresizingMaskIntoConstraints="NO" id="x5F-tE-pbz">
<rect key="frame" x="-2" y="558" width="379" height="31"/>
<connections>
<action selector="changeMixFactor:" destination="BYZ-38-t0r" eventType="valueChanged" id="QU7-1h-yAf"/>
</connections>
</slider>
<switch opaque="NO" contentMode="scaleToFill" horizontalHuggingPriority="750" verticalHuggingPriority="750" contentHorizontalAlignment="center" contentVerticalAlignment="center" translatesAutoresizingMaskIntoConstraints="NO" id="KKk-KV-EF5">
<rect key="frame" x="8" y="59" width="51" height="31"/>
<connections>
<action selector="changeDepthEnabled:" destination="BYZ-38-t0r" eventType="valueChanged" id="PUh-6x-FcL"/>
</connections>
</switch>
<switch opaque="NO" contentMode="scaleToFill" horizontalHuggingPriority="750" verticalHuggingPriority="750" contentHorizontalAlignment="center" contentVerticalAlignment="center" translatesAutoresizingMaskIntoConstraints="NO" id="Wyt-pO-waG">
<rect key="frame" x="8" y="98" width="51" height="31"/>
<connections>
<action selector="changeDepthSmoothing:" destination="BYZ-38-t0r" eventType="valueChanged" id="tPL-96-Ug1"/>
</connections>
</switch>
<label opaque="NO" userInteractionEnabled="NO" contentMode="left" horizontalHuggingPriority="251" verticalHuggingPriority="251" text="Depth" textAlignment="natural" lineBreakMode="tailTruncation" baselineAdjustment="alignBaselines" adjustsFontSizeToFit="NO" translatesAutoresizingMaskIntoConstraints="NO" id="6Kk-58-d3T">
<rect key="frame" x="65" y="62.5" width="53.5" height="24"/>
<fontDescription key="fontDescription" type="system" pointSize="20"/>
<color key="textColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="calibratedRGB"/>
<nil key="highlightedColor"/>
</label>
<switch opaque="NO" contentMode="scaleToFill" horizontalHuggingPriority="750" verticalHuggingPriority="750" contentHorizontalAlignment="center" contentVerticalAlignment="center" translatesAutoresizingMaskIntoConstraints="NO" id="gAn-KX-eZu">
<rect key="frame" x="8" y="20" width="51" height="31"/>
<connections>
<action selector="changeVideoFilteringEnabled:" destination="BYZ-38-t0r" eventType="valueChanged" id="4cZ-kq-IJL"/>
</connections>
</switch>
<label opaque="NO" userInteractionEnabled="NO" contentMode="left" horizontalHuggingPriority="251" verticalHuggingPriority="251" text="Filter Video" textAlignment="natural" lineBreakMode="tailTruncation" baselineAdjustment="alignBaselines" adjustsFontSizeToFit="NO" translatesAutoresizingMaskIntoConstraints="NO" id="v6o-wy-TQB">
<rect key="frame" x="65" y="23" width="98" height="24"/>
<fontDescription key="fontDescription" type="system" pointSize="20"/>
<color key="textColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="calibratedRGB"/>
<nil key="highlightedColor"/>
</label>
<label opaque="NO" userInteractionEnabled="NO" contentMode="left" horizontalHuggingPriority="251" verticalHuggingPriority="251" text="Smooth Depth" textAlignment="natural" lineBreakMode="tailTruncation" baselineAdjustment="alignBaselines" adjustsFontSizeToFit="NO" translatesAutoresizingMaskIntoConstraints="NO" id="8GU-1n-Jvo">
<rect key="frame" x="65" y="101.5" width="126.5" height="24"/>
<fontDescription key="fontDescription" type="system" pointSize="20"/>
<color key="textColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="calibratedRGB"/>
<nil key="highlightedColor"/>
</label>
</subviews>
<color key="backgroundColor" red="0.0" green="0.0" blue="0.0" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
<constraints>
<constraint firstAttribute="trailing" relation="greaterThanOrEqual" secondItem="v6o-wy-TQB" secondAttribute="trailing" constant="20" symbolic="YES" id="0ZJ-MC-nbo"/>
<constraint firstAttribute="trailing" relation="greaterThanOrEqual" secondItem="8GU-1n-Jvo" secondAttribute="trailing" constant="20" symbolic="YES" id="29t-73-VQV"/>
<constraint firstItem="KKk-KV-EF5" firstAttribute="top" secondItem="gAn-KX-eZu" secondAttribute="bottom" constant="8" id="6Ki-Cl-fra"/>
<constraint firstItem="8GU-1n-Jvo" firstAttribute="centerY" secondItem="Wyt-pO-waG" secondAttribute="centerY" id="6hm-nV-0vC"/>
<constraint firstItem="v6o-wy-TQB" firstAttribute="centerY" secondItem="gAn-KX-eZu" secondAttribute="centerY" id="BZ2-ia-fuT"/>
<constraint firstItem="x5F-tE-pbz" firstAttribute="centerX" secondItem="ajY-TT-566" secondAttribute="centerX" id="D7g-nd-Xa2"/>
<constraint firstItem="KKk-KV-EF5" firstAttribute="leading" secondItem="ajY-TT-566" secondAttribute="leading" constant="8" id="EQN-g8-usp"/>
<constraint firstItem="x5F-tE-pbz" firstAttribute="width" secondItem="ajY-TT-566" secondAttribute="width" id="Gg5-93-wBC"/>
<constraint firstItem="Wyt-pO-waG" firstAttribute="leading" secondItem="KKk-KV-EF5" secondAttribute="leading" id="HBO-Yi-P6h"/>
<constraint firstItem="v6o-wy-TQB" firstAttribute="leading" secondItem="gAn-KX-eZu" secondAttribute="trailing" constant="8" id="KW1-cY-5c2"/>
<constraint firstItem="8GU-1n-Jvo" firstAttribute="leading" secondItem="Wyt-pO-waG" secondAttribute="trailing" constant="8" id="L7S-Md-2M1"/>
<constraint firstItem="gAn-KX-eZu" firstAttribute="top" secondItem="ajY-TT-566" secondAttribute="top" constant="20" id="QVd-3W-vma"/>
<constraint firstItem="gAn-KX-eZu" firstAttribute="leading" secondItem="ajY-TT-566" secondAttribute="leading" constant="8" id="g3N-sT-BkM"/>
<constraint firstItem="Wyt-pO-waG" firstAttribute="top" secondItem="KKk-KV-EF5" secondAttribute="bottom" constant="8" id="jKS-Tc-hsg"/>
<constraint firstItem="6Kk-58-d3T" firstAttribute="centerY" secondItem="KKk-KV-EF5" secondAttribute="centerY" id="kdy-Yb-J1C"/>
<constraint firstAttribute="trailing" relation="greaterThanOrEqual" secondItem="6Kk-58-d3T" secondAttribute="trailing" constant="20" symbolic="YES" id="y1E-NZ-AR6"/>
<constraint firstItem="6Kk-58-d3T" firstAttribute="leading" secondItem="KKk-KV-EF5" secondAttribute="trailing" constant="8" id="ycH-BV-1ay"/>
</constraints>
</view>
<label hidden="YES" userInteractionEnabled="NO" contentMode="left" horizontalHuggingPriority="251" verticalHuggingPriority="251" text="Camera Unavailable" textAlignment="center" lineBreakMode="tailTruncation" numberOfLines="0" baselineAdjustment="alignBaselines" adjustsFontSizeToFit="NO" translatesAutoresizingMaskIntoConstraints="NO" id="B08-Wv-b5R" userLabel="Camera Unavailable">
<rect key="frame" x="83.5" y="319" width="208" height="29"/>
<color key="backgroundColor" red="0.0" green="0.0" blue="0.0" alpha="0.0" colorSpace="custom" customColorSpace="sRGB"/>
<color key="tintColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="calibratedRGB"/>
<fontDescription key="fontDescription" type="system" pointSize="24"/>
<color key="textColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
<nil key="highlightedColor"/>
</label>
<label hidden="YES" userInteractionEnabled="NO" contentMode="left" horizontalHuggingPriority="251" verticalHuggingPriority="251" text="Camera Unavailable" textAlignment="center" lineBreakMode="tailTruncation" numberOfLines="0" baselineAdjustment="alignBaselines" adjustsFontSizeToFit="NO" translatesAutoresizingMaskIntoConstraints="NO" id="9Ir-IB-GSr" userLabel="Filter">
<rect key="frame" x="83.5" y="319" width="208" height="29"/>
<color key="backgroundColor" red="0.0" green="0.0" blue="0.0" alpha="0.0" colorSpace="custom" customColorSpace="sRGB"/>
<color key="tintColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="calibratedRGB"/>
<fontDescription key="fontDescription" type="system" pointSize="24"/>
<color key="textColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
<nil key="highlightedColor"/>
</label>
<button hidden="YES" opaque="NO" contentMode="scaleToFill" contentHorizontalAlignment="center" contentVerticalAlignment="center" buttonType="roundedRect" lineBreakMode="middleTruncation" translatesAutoresizingMaskIntoConstraints="NO" id="6D4-Y8-I1S" userLabel="Resume">
<rect key="frame" x="105" y="314" width="165" height="39"/>
<color key="backgroundColor" red="0.0" green="0.0" blue="0.0" alpha="0.29999999999999999" colorSpace="custom" customColorSpace="sRGB"/>
<fontDescription key="fontDescription" type="system" pointSize="24"/>
<inset key="contentEdgeInsets" minX="10" minY="5" maxX="10" maxY="5"/>
<state key="normal" title="Tap to resume">
<color key="titleShadowColor" red="0.5" green="0.5" blue="0.5" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
</state>
<userDefinedRuntimeAttributes>
<userDefinedRuntimeAttribute type="number" keyPath="layer.cornerRadius">
<integer key="value" value="4"/>
</userDefinedRuntimeAttribute>
</userDefinedRuntimeAttributes>
<connections>
<action selector="resumeInterruptedSession:" destination="BYZ-38-t0r" eventType="touchUpInside" id="9ZM-QH-ZlN"/>
</connections>
</button>
<button opaque="NO" contentMode="scaleToFill" contentHorizontalAlignment="center" contentVerticalAlignment="center" buttonType="roundedRect" lineBreakMode="middleTruncation" translatesAutoresizingMaskIntoConstraints="NO" id="BEM-k2-Quc">
<rect key="frame" x="147.5" y="617" width="80" height="30"/>
<color key="backgroundColor" red="0.0" green="0.0" blue="0.0" alpha="0.29999999999999999" colorSpace="custom" customColorSpace="sRGB"/>
<constraints>
<constraint firstAttribute="height" constant="30" id="Mtk-RH-P0z"/>
<constraint firstAttribute="width" constant="80" id="mge-eI-XRX"/>
</constraints>
<fontDescription key="fontDescription" type="system" pointSize="20"/>
<color key="tintColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="calibratedRGB"/>
<state key="normal" title="Photo">
<color key="titleShadowColor" red="0.5" green="0.5" blue="0.5" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
</state>
<userDefinedRuntimeAttributes>
<userDefinedRuntimeAttribute type="number" keyPath="layer.cornerRadius">
<integer key="value" value="4"/>
</userDefinedRuntimeAttribute>
</userDefinedRuntimeAttributes>
<connections>
<action selector="capturePhoto:" destination="BYZ-38-t0r" eventType="touchUpInside" id="gmA-cF-4c0"/>
</connections>
</button>
<button opaque="NO" contentMode="scaleToFill" contentHorizontalAlignment="center" contentVerticalAlignment="center" buttonType="roundedRect" lineBreakMode="middleTruncation" translatesAutoresizingMaskIntoConstraints="NO" id="Ozc-E8-cWj" userLabel="Camera">
<rect key="frame" x="235.5" y="617" width="80" height="30"/>
<color key="backgroundColor" red="0.0" green="0.0" blue="0.0" alpha="0.29999999999999999" colorSpace="custom" customColorSpace="sRGB"/>
<fontDescription key="fontDescription" type="system" pointSize="20"/>
<state key="normal" title="Camera">
<color key="titleShadowColor" red="0.5" green="0.5" blue="0.5" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
</state>
<userDefinedRuntimeAttributes>
<userDefinedRuntimeAttribute type="number" keyPath="layer.cornerRadius">
<integer key="value" value="4"/>
</userDefinedRuntimeAttribute>
</userDefinedRuntimeAttributes>
<connections>
<action selector="changeCamera:" destination="BYZ-38-t0r" eventType="touchUpInside" id="uDv-uK-lqC"/>
</connections>
</button>
</subviews>
<color key="backgroundColor" red="0.0" green="0.0" blue="0.0" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
<constraints>
<constraint firstItem="6D4-Y8-I1S" firstAttribute="centerY" secondItem="Eqv-7y-uCz" secondAttribute="centerY" id="0cF-Ck-xhe"/>
<constraint firstItem="9Ir-IB-GSr" firstAttribute="centerX" secondItem="3Gl-sb-vLR" secondAttribute="centerX" id="0qS-mH-JJH"/>
<constraint firstItem="BEM-k2-Quc" firstAttribute="top" secondItem="Ozc-E8-cWj" secondAttribute="top" id="63K-qi-lsJ"/>
<constraint firstItem="3Gl-sb-vLR" firstAttribute="bottom" secondItem="BEM-k2-Quc" secondAttribute="bottom" constant="20" id="90i-sS-5zh"/>
<constraint firstItem="BEM-k2-Quc" firstAttribute="height" secondItem="Ozc-E8-cWj" secondAttribute="height" id="9ip-0t-7hu"/>
<constraint firstItem="B08-Wv-b5R" firstAttribute="centerX" secondItem="3Gl-sb-vLR" secondAttribute="centerX" id="9qb-K5-MI3"/>
<constraint firstItem="9Ir-IB-GSr" firstAttribute="centerY" secondItem="Eqv-7y-uCz" secondAttribute="centerY" id="BA1-zP-rXy"/>
<constraint firstItem="Ozc-E8-cWj" firstAttribute="leading" secondItem="BEM-k2-Quc" secondAttribute="trailing" constant="8" symbolic="YES" id="BLF-vh-Dly"/>
<constraint firstItem="BEM-k2-Quc" firstAttribute="width" secondItem="Ozc-E8-cWj" secondAttribute="width" id="HoA-g7-1YL"/>
<constraint firstItem="ajY-TT-566" firstAttribute="top" secondItem="3Gl-sb-vLR" secondAttribute="top" id="KZ2-bD-AOA"/>
<constraint firstItem="3Gl-sb-vLR" firstAttribute="centerX" secondItem="BEM-k2-Quc" secondAttribute="centerX" id="QCs-aZ-RKC"/>
<constraint firstAttribute="trailing" relation="greaterThanOrEqual" secondItem="Ozc-E8-cWj" secondAttribute="trailing" priority="900" constant="64" id="Qhv-Ie-MxZ"/>
<constraint firstItem="3Gl-sb-vLR" firstAttribute="bottom" secondItem="ajY-TT-566" secondAttribute="bottom" id="QoV-sr-CZ5"/>
<constraint firstItem="ajY-TT-566" firstAttribute="leading" secondItem="3Gl-sb-vLR" secondAttribute="leading" id="aRI-5P-ai1"/>
<constraint firstItem="6D4-Y8-I1S" firstAttribute="centerX" secondItem="3Gl-sb-vLR" secondAttribute="centerX" id="jcz-lJ-VxY"/>
<constraint firstItem="BEM-k2-Quc" firstAttribute="top" secondItem="x5F-tE-pbz" secondAttribute="bottom" constant="9" id="kXN-0w-bOw"/>
<constraint firstItem="3Gl-sb-vLR" firstAttribute="trailing" secondItem="ajY-TT-566" secondAttribute="trailing" id="sWY-Pk-REW"/>
<constraint firstItem="B08-Wv-b5R" firstAttribute="centerY" secondItem="Eqv-7y-uCz" secondAttribute="centerY" id="usf-3D-Qpy"/>
</constraints>
<viewLayoutGuide key="safeArea" id="3Gl-sb-vLR"/>
</view>
<connections>
<outlet property="cameraButton" destination="Ozc-E8-cWj" id="QmF-Fx-HvM"/>
<outlet property="cameraUnavailableLabel" destination="B08-Wv-b5R" id="zOh-RF-43A"/>
<outlet property="depthSmoothingLabel" destination="8GU-1n-Jvo" id="msB-6J-xeK"/>
<outlet property="depthSmoothingSwitch" destination="Wyt-pO-waG" id="o6x-I2-ijR"/>
<outlet property="depthVisualizationLabel" destination="6Kk-58-d3T" id="8SU-OM-ZRu"/>
<outlet property="depthVisualizationSwitch" destination="KKk-KV-EF5" id="6Ni-tu-pe0"/>
<outlet property="filterLabel" destination="9Ir-IB-GSr" id="iVh-4O-MQY"/>
<outlet property="mixFactorSlider" destination="x5F-tE-pbz" id="y4U-DD-NqG"/>
<outlet property="photoButton" destination="BEM-k2-Quc" id="d7v-UT-Eis"/>
<outlet property="previewView" destination="ajY-TT-566" id="Nki-d6-map"/>
<outlet property="resumeButton" destination="6D4-Y8-I1S" id="xYq-f7-4cQ"/>
<outlet property="videoFilterSwitch" destination="gAn-KX-eZu" id="tx6-cf-aUT"/>
</connections>
</viewController>
<placeholder placeholderIdentifier="IBFirstResponder" id="dkx-z0-nzr" sceneMemberID="firstResponder"/>
</objects>
<point key="canvasLocation" x="32.799999999999997" y="91.304347826086968"/>
</scene>
</scenes>
<color key="tintColor" red="1" green="1" blue="0.0" alpha="1" colorSpace="calibratedRGB"/>
</document>

File diff suppressed because it is too large

AVCamPhotoFilter/DepthToGrayscaleConverter.swift Normal file

@@ -0,0 +1,205 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Converts depth values to grayscale values.
*/
import CoreMedia
import CoreVideo
import Metal
class DepthToGrayscaleConverter: FilterRenderer {
var description: String = "Depth to Grayscale Converter"
var isPrepared = false
private(set) var inputFormatDescription: CMFormatDescription?
private(set) var outputFormatDescription: CMFormatDescription?
private var inputTextureFormat: MTLPixelFormat = .invalid
private var outputPixelBufferPool: CVPixelBufferPool!
private let metalDevice = MTLCreateSystemDefaultDevice()!
private var computePipelineState: MTLComputePipelineState?
private lazy var commandQueue: MTLCommandQueue? = {
return self.metalDevice.makeCommandQueue()
}()
private var textureCache: CVMetalTextureCache!
private var lowest: Float = 0.0
private var highest: Float = 0.0
struct DepthRenderParam {
var offset: Float
var range: Float
}
var range: DepthRenderParam = DepthRenderParam(offset: -4.0, range: 8.0)
required init() {
let defaultLibrary = metalDevice.makeDefaultLibrary()!
let kernelFunction = defaultLibrary.makeFunction(name: "depthToGrayscale")
do {
computePipelineState = try metalDevice.makeComputePipelineState(function: kernelFunction!)
} catch {
fatalError("Unable to create depth converter pipeline state. (\(error))")
}
}
static private func allocateOutputBufferPool(with formatDescription: CMFormatDescription, outputRetainedBufferCountHint: Int) -> CVPixelBufferPool? {
let inputDimensions = CMVideoFormatDescriptionGetDimensions(formatDescription)
let outputPixelBufferAttributes: [String: Any] = [
kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
kCVPixelBufferWidthKey as String: Int(inputDimensions.width),
kCVPixelBufferHeightKey as String: Int(inputDimensions.height),
kCVPixelBufferIOSurfacePropertiesKey as String: [:]
]
let poolAttributes = [kCVPixelBufferPoolMinimumBufferCountKey as String: outputRetainedBufferCountHint]
var cvPixelBufferPool: CVPixelBufferPool?
// Create a pixel buffer pool with the same pixel attributes as the input format description
CVPixelBufferPoolCreate(kCFAllocatorDefault, poolAttributes as NSDictionary?, outputPixelBufferAttributes as NSDictionary?, &cvPixelBufferPool)
guard let pixelBufferPool = cvPixelBufferPool else {
assertionFailure("Allocation failure: Could not create pixel buffer pool")
return nil
}
return pixelBufferPool
}
func prepare(with formatDescription: CMFormatDescription, outputRetainedBufferCountHint: Int) {
reset()
outputPixelBufferPool = DepthToGrayscaleConverter.allocateOutputBufferPool(with: formatDescription,
outputRetainedBufferCountHint: outputRetainedBufferCountHint)
if outputPixelBufferPool == nil {
return
}
var pixelBuffer: CVPixelBuffer?
var pixelBufferFormatDescription: CMFormatDescription?
_ = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, outputPixelBufferPool!, &pixelBuffer)
if let pixelBuffer = pixelBuffer {
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &pixelBufferFormatDescription)
}
pixelBuffer = nil
inputFormatDescription = formatDescription
outputFormatDescription = pixelBufferFormatDescription
let inputMediaSubType = CMFormatDescriptionGetMediaSubType(formatDescription)
if inputMediaSubType == kCVPixelFormatType_DepthFloat16 ||
inputMediaSubType == kCVPixelFormatType_DisparityFloat16 {
inputTextureFormat = .r16Float
} else if inputMediaSubType == kCVPixelFormatType_DepthFloat32 ||
inputMediaSubType == kCVPixelFormatType_DisparityFloat32 {
inputTextureFormat = .r32Float
} else {
assertionFailure("Input format not supported")
}
var metalTextureCache: CVMetalTextureCache?
if CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, metalDevice, nil, &metalTextureCache) != kCVReturnSuccess {
assertionFailure("Unable to allocate depth converter texture cache")
} else {
textureCache = metalTextureCache
}
isPrepared = true
}
func reset() {
outputPixelBufferPool = nil
outputFormatDescription = nil
inputFormatDescription = nil
textureCache = nil
isPrepared = false
}
// MARK: - Depth to Grayscale Conversion
func render(pixelBuffer: CVPixelBuffer) -> CVPixelBuffer? {
if !isPrepared {
assertionFailure("Invalid state: Not prepared")
return nil
}
var newPixelBuffer: CVPixelBuffer?
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, outputPixelBufferPool!, &newPixelBuffer)
guard let outputPixelBuffer = newPixelBuffer else {
print("Allocation failure: Could not get pixel buffer from pool (\(self.description))")
return nil
}
guard let outputTexture = makeTextureFromCVPixelBuffer(pixelBuffer: outputPixelBuffer, textureFormat: .bgra8Unorm),
let inputTexture = makeTextureFromCVPixelBuffer(pixelBuffer: pixelBuffer, textureFormat: inputTextureFormat) else {
return nil
}
var min: Float = 0.0
var max: Float = 0.0
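// Compute the depth map's min/max values with the bridged Objective-C helper from minMaxFromBuffer.m.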
minMaxFromPixelBuffer(pixelBuffer, &min, &max, inputTextureFormat)
if min < lowest {
lowest = min
}
if max > highest {
highest = max
}
range = DepthRenderParam(offset: lowest, range: highest - lowest)
// Set up command queue, buffer, and encoder
guard let commandQueue = commandQueue,
let commandBuffer = commandQueue.makeCommandBuffer(),
let commandEncoder = commandBuffer.makeComputeCommandEncoder() else {
print("Failed to create Metal command queue")
CVMetalTextureCacheFlush(textureCache!, 0)
return nil
}
commandEncoder.label = "Depth to Grayscale"
commandEncoder.setComputePipelineState(computePipelineState!)
commandEncoder.setTexture(inputTexture, index: 0)
commandEncoder.setTexture(outputTexture, index: 1)
commandEncoder.setBytes(&range, length: MemoryLayout<DepthRenderParam>.size, index: 0)
// Set up thread groups as described in https://developer.apple.com/reference/metal/mtlcomputecommandencoder
let w = computePipelineState!.threadExecutionWidth
let h = computePipelineState!.maxTotalThreadsPerThreadgroup / w
let threadsPerThreadgroup = MTLSizeMake(w, h, 1)
let threadgroupsPerGrid = MTLSize(width: (inputTexture.width + w - 1) / w,
height: (inputTexture.height + h - 1) / h,
depth: 1)
commandEncoder.dispatchThreadgroups(threadgroupsPerGrid, threadsPerThreadgroup: threadsPerThreadgroup)
commandEncoder.endEncoding()
commandBuffer.commit()
return outputPixelBuffer
}
func makeTextureFromCVPixelBuffer(pixelBuffer: CVPixelBuffer, textureFormat: MTLPixelFormat) -> MTLTexture? {
let width = CVPixelBufferGetWidth(pixelBuffer)
let height = CVPixelBufferGetHeight(pixelBuffer)
// Create a Metal texture from the image buffer
var cvTextureOut: CVMetalTexture?
CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, nil, textureFormat, width, height, 0, &cvTextureOut)
guard let cvTexture = cvTextureOut, let texture = CVMetalTextureGetTexture(cvTexture) else {
print("Depth converter failed to create preview texture")
CVMetalTextureCacheFlush(textureCache, 0)
return nil
}
return texture
}
}
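
For orientation, here is a hypothetical usage sketch of the converter above (not part of this commit). The depthFormatDescription, depthPixelBuffer, and previewView names are assumed stand-ins for objects supplied by the capture pipeline:

let converter = DepthToGrayscaleConverter()
// Prepare once per format change; the hint of 3 is an assumed value telling the
// converter how many output buffers the downstream pipeline may retain at once.
converter.prepare(with: depthFormatDescription, outputRetainedBufferCountHint: 3)
// Convert a DepthFloat16/32 depth map into a BGRA grayscale buffer for display.
if let grayscaleBuffer = converter.render(pixelBuffer: depthPixelBuffer) {
    previewView.pixelBuffer = grayscaleBuffer
}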

AVCamPhotoFilter/FilterRenderer.swift Normal file

@@ -0,0 +1,118 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Filter renderer protocol.
*/
import CoreMedia
protocol FilterRenderer: class {
var description: String { get }
var isPrepared: Bool { get }
// Prepare resources.
// The outputRetainedBufferCountHint tells out-of-place renderers how many of
// their output buffers will be held onto by the downstream pipeline at one time.
// Renderers can use this hint to size and preallocate their pools.
func prepare(with inputFormatDescription: CMFormatDescription, outputRetainedBufferCountHint: Int)
// Release resources.
func reset()
// The format description of the output pixel buffers.
var outputFormatDescription: CMFormatDescription? { get }
// The format description of the input pixel buffers.
var inputFormatDescription: CMFormatDescription? { get }
// Render pixel buffer.
func render(pixelBuffer: CVPixelBuffer) -> CVPixelBuffer?
}
func allocateOutputBufferPool(with inputFormatDescription: CMFormatDescription,
outputRetainedBufferCountHint: Int) -> (
outputBufferPool: CVPixelBufferPool?,
outputColorSpace: CGColorSpace?,
outputFormatDescription: CMFormatDescription?) {
let inputMediaSubType = CMFormatDescriptionGetMediaSubType(inputFormatDescription)
if inputMediaSubType != kCVPixelFormatType_32BGRA {
assertionFailure("Invalid input pixel buffer type \(inputMediaSubType)")
return (nil, nil, nil)
}
let inputDimensions = CMVideoFormatDescriptionGetDimensions(inputFormatDescription)
var pixelBufferAttributes: [String: Any] = [
kCVPixelBufferPixelFormatTypeKey as String: UInt(inputMediaSubType),
kCVPixelBufferWidthKey as String: Int(inputDimensions.width),
kCVPixelBufferHeightKey as String: Int(inputDimensions.height),
kCVPixelBufferIOSurfacePropertiesKey as String: [:]
]
// Get pixel buffer attributes and color space from the input format description
var cgColorSpace = CGColorSpaceCreateDeviceRGB()
if let inputFormatDescriptionExtension = CMFormatDescriptionGetExtensions(inputFormatDescription) as Dictionary? {
let colorPrimaries = inputFormatDescriptionExtension[kCVImageBufferColorPrimariesKey]
if let colorPrimaries = colorPrimaries {
var colorSpaceProperties: [String: AnyObject] = [kCVImageBufferColorPrimariesKey as String: colorPrimaries]
if let yCbCrMatrix = inputFormatDescriptionExtension[kCVImageBufferYCbCrMatrixKey] {
colorSpaceProperties[kCVImageBufferYCbCrMatrixKey as String] = yCbCrMatrix
}
if let transferFunction = inputFormatDescriptionExtension[kCVImageBufferTransferFunctionKey] {
colorSpaceProperties[kCVImageBufferTransferFunctionKey as String] = transferFunction
}
pixelBufferAttributes[kCVBufferPropagatedAttachmentsKey as String] = colorSpaceProperties
}
if let cvColorspace = inputFormatDescriptionExtension[kCVImageBufferCGColorSpaceKey] {
cgColorSpace = cvColorspace as! CGColorSpace
} else if (colorPrimaries as? String) == (kCVImageBufferColorPrimaries_P3_D65 as String) {
cgColorSpace = CGColorSpace(name: CGColorSpace.displayP3)!
}
}
// Create a pixel buffer pool with the same pixel attributes as the input format description
let poolAttributes = [kCVPixelBufferPoolMinimumBufferCountKey as String: outputRetainedBufferCountHint]
var cvPixelBufferPool: CVPixelBufferPool?
CVPixelBufferPoolCreate(kCFAllocatorDefault, poolAttributes as NSDictionary?, pixelBufferAttributes as NSDictionary?, &cvPixelBufferPool)
guard let pixelBufferPool = cvPixelBufferPool else {
assertionFailure("Allocation failure: Could not allocate pixel buffer pool")
return (nil, nil, nil)
}
preallocateBuffers(pool: pixelBufferPool, allocationThreshold: outputRetainedBufferCountHint)
// Get output format description
var pixelBuffer: CVPixelBuffer?
var outputFormatDescription: CMFormatDescription?
let auxAttributes = [kCVPixelBufferPoolAllocationThresholdKey as String: outputRetainedBufferCountHint] as NSDictionary
CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, pixelBufferPool, auxAttributes, &pixelBuffer)
if let pixelBuffer = pixelBuffer {
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &outputFormatDescription)
}
pixelBuffer = nil
return (pixelBufferPool, cgColorSpace, outputFormatDescription)
}
private func preallocateBuffers(pool: CVPixelBufferPool, allocationThreshold: Int) {
var pixelBuffers = [CVPixelBuffer]()
var error: CVReturn = kCVReturnSuccess
let auxAttributes = [kCVPixelBufferPoolAllocationThresholdKey as String: allocationThreshold] as NSDictionary
var pixelBuffer: CVPixelBuffer?
while error == kCVReturnSuccess {
error = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, pool, auxAttributes, &pixelBuffer)
if let pixelBuffer = pixelBuffer {
pixelBuffers.append(pixelBuffer)
}
pixelBuffer = nil
}
pixelBuffers.removeAll()
}
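
To make the protocol contract concrete, here is a minimal hypothetical conformance that is not part of this commit: an identity renderer that does no pixel work but walks through the prepare/reset/render lifecycle the protocol requires.

import CoreMedia
import CoreVideo

// Pass-through renderer: returns its input untouched, illustrating the
// smallest possible FilterRenderer implementation.
class IdentityRenderer: FilterRenderer {
    var description: String = "Identity (pass-through) renderer"
    var isPrepared = false
    private(set) var inputFormatDescription: CMFormatDescription?
    private(set) var outputFormatDescription: CMFormatDescription?
    func prepare(with inputFormatDescription: CMFormatDescription, outputRetainedBufferCountHint: Int) {
        self.inputFormatDescription = inputFormatDescription
        // An in-place renderer can reuse the input format for its output.
        outputFormatDescription = inputFormatDescription
        isPrepared = true
    }
    func reset() {
        inputFormatDescription = nil
        outputFormatDescription = nil
        isPrepared = false
    }
    func render(pixelBuffer: CVPixelBuffer) -> CVPixelBuffer? {
        // Nothing to render; hand the input buffer straight back.
        return isPrepared ? pixelBuffer : nil
    }
}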

AVCamPhotoFilter/Info.plist Normal file

@@ -0,0 +1,58 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>CFBundleDevelopmentRegion</key>
<string>en</string>
<key>CFBundleExecutable</key>
<string>$(EXECUTABLE_NAME)</string>
<key>CFBundleIdentifier</key>
<string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
<key>CFBundleInfoDictionaryVersion</key>
<string>6.0</string>
<key>CFBundleName</key>
<string>$(PRODUCT_NAME)</string>
<key>CFBundlePackageType</key>
<string>APPL</string>
<key>CFBundleShortVersionString</key>
<string>2.0</string>
<key>CFBundleSignature</key>
<string>????</string>
<key>CFBundleVersion</key>
<string>1</string>
<key>LSRequiresIPhoneOS</key>
<true/>
<key>NSCameraUsageDescription</key>
<string>to take photos</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>to save photos and videos</string>
<key>UILaunchStoryboardName</key>
<string>LaunchScreen</string>
<key>UIMainStoryboardFile</key>
<string>Main</string>
<key>UIRequiredDeviceCapabilities</key>
<array>
<string>metal</string>
</array>
<key>UIRequiresFullScreen</key>
<true/>
<key>UIStatusBarHidden</key>
<true/>
<key>UISupportedInterfaceOrientations</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
<string>UIInterfaceOrientationLandscapeLeft</string>
<string>UIInterfaceOrientationLandscapeRight</string>
<string>UIInterfaceOrientationPortraitUpsideDown</string>
</array>
<key>UISupportedInterfaceOrientations~ipad</key>
<array>
<string>UIInterfaceOrientationPortrait</string>
<string>UIInterfaceOrientationPortraitUpsideDown</string>
<string>UIInterfaceOrientationLandscapeLeft</string>
<string>UIInterfaceOrientationLandscapeRight</string>
</array>
<key>UIViewControllerBasedStatusBarAppearance</key>
<false/>
</dict>
</plist>
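
The NSCameraUsageDescription string above only supplies the text for the system's permission prompt; the app must still request camera access at runtime. A brief sketch using the standard AVFoundation authorization calls (not code from this commit):

import AVFoundation

switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
    break // Already authorized; safe to configure the capture session.
case .notDetermined:
    // The prompt shows the NSCameraUsageDescription text; resume setup if granted.
    AVCaptureDevice.requestAccess(for: .video) { granted in
        // Continue or abort session configuration based on `granted`.
    }
default:
    break // Denied or restricted; show UI directing the user to Settings.
}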

AVCamPhotoFilter/PreviewMetalView.swift Normal file

@@ -0,0 +1,353 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Metal preview view.
*/
import CoreMedia
import Metal
import MetalKit
class PreviewMetalView: MTKView {
enum Rotation: Int {
case rotate0Degrees
case rotate90Degrees
case rotate180Degrees
case rotate270Degrees
}
var mirroring = false {
didSet {
syncQueue.sync {
internalMirroring = mirroring
}
}
}
private var internalMirroring: Bool = false
var rotation: Rotation = .rotate0Degrees {
didSet {
syncQueue.sync {
internalRotation = rotation
}
}
}
private var internalRotation: Rotation = .rotate0Degrees
var pixelBuffer: CVPixelBuffer? {
didSet {
syncQueue.sync {
internalPixelBuffer = pixelBuffer
}
}
}
private var internalPixelBuffer: CVPixelBuffer?
private let syncQueue = DispatchQueue(label: "Preview View Sync Queue", qos: .userInitiated, attributes: [], autoreleaseFrequency: .workItem)
private var textureCache: CVMetalTextureCache?
private var textureWidth: Int = 0
private var textureHeight: Int = 0
private var textureMirroring = false
private var textureRotation: Rotation = .rotate0Degrees
private var sampler: MTLSamplerState!
private var renderPipelineState: MTLRenderPipelineState!
private var commandQueue: MTLCommandQueue?
private var vertexCoordBuffer: MTLBuffer!
private var textCoordBuffer: MTLBuffer!
private var internalBounds: CGRect!
private var textureTranform: CGAffineTransform?
func texturePointForView(point: CGPoint) -> CGPoint? {
var result: CGPoint?
guard let transform = textureTranform else {
return result
}
let transformPoint = point.applying(transform)
if CGRect(origin: .zero, size: CGSize(width: textureWidth, height: textureHeight)).contains(transformPoint) {
result = transformPoint
} else {
print("Invalid point \(point) result point \(transformPoint)")
}
return result
}
func viewPointForTexture(point: CGPoint) -> CGPoint? {
var result: CGPoint?
guard let transform = textureTranform?.inverted() else {
return result
}
let transformPoint = point.applying(transform)
if internalBounds.contains(transformPoint) {
result = transformPoint
} else {
print("Invalid point \(point) result point \(transformPoint)")
}
return result
}
func flushTextureCache() {
textureCache = nil
}
private func setupTransform(width: Int, height: Int, mirroring: Bool, rotation: Rotation) {
var scaleX: Float = 1.0
var scaleY: Float = 1.0
var resizeAspect: Float = 1.0
internalBounds = self.bounds
textureWidth = width
textureHeight = height
textureMirroring = mirroring
textureRotation = rotation
if textureWidth > 0 && textureHeight > 0 {
switch textureRotation {
case .rotate0Degrees, .rotate180Degrees:
scaleX = Float(internalBounds.width / CGFloat(textureWidth))
scaleY = Float(internalBounds.height / CGFloat(textureHeight))
case .rotate90Degrees, .rotate270Degrees:
scaleX = Float(internalBounds.width / CGFloat(textureHeight))
scaleY = Float(internalBounds.height / CGFloat(textureWidth))
}
}
// Resize aspect
resizeAspect = min(scaleX, scaleY)
if scaleX < scaleY {
scaleY = scaleX / scaleY
scaleX = 1.0
} else {
scaleX = scaleY / scaleX
scaleY = 1.0
}
if textureMirroring {
scaleX *= -1.0
}
// Vertex coordinate takes the gravity into account
let vertexData: [Float] = [
-scaleX, -scaleY, 0.0, 1.0,
scaleX, -scaleY, 0.0, 1.0,
-scaleX, scaleY, 0.0, 1.0,
scaleX, scaleY, 0.0, 1.0
]
vertexCoordBuffer = device!.makeBuffer(bytes: vertexData, length: vertexData.count * MemoryLayout<Float>.size, options: [])
// Texture coordinate takes the rotation into account
var textData: [Float]
switch textureRotation {
case .rotate0Degrees:
textData = [
0.0, 1.0,
1.0, 1.0,
0.0, 0.0,
1.0, 0.0
]
case .rotate180Degrees:
textData = [
1.0, 0.0,
0.0, 0.0,
1.0, 1.0,
0.0, 1.0
]
case .rotate90Degrees:
textData = [
1.0, 1.0,
1.0, 0.0,
0.0, 1.0,
0.0, 0.0
]
case .rotate270Degrees:
textData = [
0.0, 0.0,
0.0, 1.0,
1.0, 0.0,
1.0, 1.0
]
}
textCoordBuffer = device?.makeBuffer(bytes: textData, length: textData.count * MemoryLayout<Float>.size, options: [])
// Calculate the transform from texture coordinates to view coordinates
// (inverted below so that view points can be mapped back into the texture)
var transform = CGAffineTransform.identity
if textureMirroring {
transform = transform.concatenating(CGAffineTransform(scaleX: -1, y: 1))
transform = transform.concatenating(CGAffineTransform(translationX: CGFloat(textureWidth), y: 0))
}
switch textureRotation {
case .rotate0Degrees:
transform = transform.concatenating(CGAffineTransform(rotationAngle: CGFloat(0)))
case .rotate180Degrees:
transform = transform.concatenating(CGAffineTransform(rotationAngle: CGFloat(Double.pi)))
transform = transform.concatenating(CGAffineTransform(translationX: CGFloat(textureWidth), y: CGFloat(textureHeight)))
case .rotate90Degrees:
transform = transform.concatenating(CGAffineTransform(rotationAngle: CGFloat(Double.pi) / 2))
transform = transform.concatenating(CGAffineTransform(translationX: CGFloat(textureHeight), y: 0))
case .rotate270Degrees:
transform = transform.concatenating(CGAffineTransform(rotationAngle: 3 * CGFloat(Double.pi) / 2))
transform = transform.concatenating(CGAffineTransform(translationX: 0, y: CGFloat(textureWidth)))
}
transform = transform.concatenating(CGAffineTransform(scaleX: CGFloat(resizeAspect), y: CGFloat(resizeAspect)))
let transformRect = CGRect(origin: .zero, size: CGSize(width: textureWidth, height: textureHeight)).applying(transform)
let tx = (internalBounds.size.width - transformRect.size.width) / 2
let ty = (internalBounds.size.height - transformRect.size.height) / 2
transform = transform.concatenating(CGAffineTransform(translationX: tx, y: ty))
textureTransform = transform.inverted()
}
required init(coder: NSCoder) {
super.init(coder: coder)
device = MTLCreateSystemDefaultDevice()
configureMetal()
createTextureCache()
colorPixelFormat = .bgra8Unorm
}
func configureMetal() {
let defaultLibrary = device!.makeDefaultLibrary()!
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
pipelineDescriptor.vertexFunction = defaultLibrary.makeFunction(name: "vertexPassThrough")
pipelineDescriptor.fragmentFunction = defaultLibrary.makeFunction(name: "fragmentPassThrough")
// To determine how our textures are sampled, we create a sampler descriptor, which
// will be used to ask for a sampler state object from our device below.
let samplerDescriptor = MTLSamplerDescriptor()
samplerDescriptor.sAddressMode = .clampToEdge
samplerDescriptor.tAddressMode = .clampToEdge
samplerDescriptor.minFilter = .linear
samplerDescriptor.magFilter = .linear
sampler = device!.makeSamplerState(descriptor: samplerDescriptor)
do {
renderPipelineState = try device!.makeRenderPipelineState(descriptor: pipelineDescriptor)
} catch {
fatalError("Unable to create preview Metal view pipeline state. (\(error))")
}
commandQueue = device!.makeCommandQueue()
}
func createTextureCache() {
var newTextureCache: CVMetalTextureCache?
if CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device!, nil, &newTextureCache) == kCVReturnSuccess {
textureCache = newTextureCache
} else {
assertionFailure("Unable to allocate texture cache")
}
}
override func draw(_ rect: CGRect) {
var pixelBuffer: CVPixelBuffer?
var mirroring = false
var rotation: Rotation = .rotate0Degrees
syncQueue.sync {
pixelBuffer = internalPixelBuffer
mirroring = internalMirroring
rotation = internalRotation
}
guard let drawable = currentDrawable,
let currentRenderPassDescriptor = currentRenderPassDescriptor,
let previewPixelBuffer = pixelBuffer else {
return
}
// Create a Metal texture from the image buffer
let width = CVPixelBufferGetWidth(previewPixelBuffer)
let height = CVPixelBufferGetHeight(previewPixelBuffer)
if textureCache == nil {
createTextureCache()
}
var cvTextureOut: CVMetalTexture?
CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
textureCache!,
previewPixelBuffer,
nil,
.bgra8Unorm,
width,
height,
0,
&cvTextureOut)
guard let cvTexture = cvTextureOut, let texture = CVMetalTextureGetTexture(cvTexture) else {
print("Failed to create preview texture")
CVMetalTextureCacheFlush(textureCache!, 0)
return
}
if texture.width != textureWidth ||
texture.height != textureHeight ||
self.bounds != internalBounds ||
mirroring != textureMirroring ||
rotation != textureRotation {
setupTransform(width: texture.width, height: texture.height, mirroring: mirroring, rotation: rotation)
}
// Set up command buffer and encoder
guard let commandQueue = commandQueue else {
print("Failed to create Metal command queue")
CVMetalTextureCacheFlush(textureCache!, 0)
return
}
guard let commandBuffer = commandQueue.makeCommandBuffer() else {
print("Failed to create Metal command buffer")
CVMetalTextureCacheFlush(textureCache!, 0)
return
}
guard let commandEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: currentRenderPassDescriptor) else {
print("Failed to create Metal command encoder")
CVMetalTextureCacheFlush(textureCache!, 0)
return
}
commandEncoder.label = "Preview display"
commandEncoder.setRenderPipelineState(renderPipelineState!)
commandEncoder.setVertexBuffer(vertexCoordBuffer, offset: 0, index: 0)
commandEncoder.setVertexBuffer(textCoordBuffer, offset: 0, index: 1)
commandEncoder.setFragmentTexture(texture, index: 0)
commandEncoder.setFragmentSamplerState(sampler, index: 0)
commandEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
commandEncoder.endEncoding()
commandBuffer.present(drawable) // Draw to the screen
commandBuffer.commit()
}
}
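A minimal usage sketch (not part of the file above), assuming the class shown is the project's PreviewMetalView; the PreviewFeeder type is hypothetical, standing in for the sample's own video data output delegate:

import AVFoundation
import CoreMedia

// Sketch: feed camera frames into the preview view from the video data output queue.
final class PreviewFeeder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let previewView: PreviewMetalView // the Metal-backed view defined above

    init(previewView: PreviewMetalView) {
        self.previewView = previewView
        super.init()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // The pixelBuffer setter above synchronizes on syncQueue, so this is safe off the main thread.
        previewView.pixelBuffer = pixelBuffer
    }
}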

RosyCIRenderer.swift
View File

@@ -0,0 +1,86 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Rosy colored filter renderer implemented with Core Image.
*/
import CoreMedia
import CoreVideo
import CoreImage
class RosyCIRenderer: FilterRenderer {
var description: String = "Rosy (Core Image)"
var isPrepared = false
private var ciContext: CIContext?
private var rosyFilter: CIFilter?
private var outputColorSpace: CGColorSpace?
private var outputPixelBufferPool: CVPixelBufferPool?
private(set) var outputFormatDescription: CMFormatDescription?
private(set) var inputFormatDescription: CMFormatDescription?
func prepare(with formatDescription: CMFormatDescription, outputRetainedBufferCountHint: Int) {
reset()
(outputPixelBufferPool,
outputColorSpace,
outputFormatDescription) = allocateOutputBufferPool(with: formatDescription,
outputRetainedBufferCountHint: outputRetainedBufferCountHint)
if outputPixelBufferPool == nil {
return
}
inputFormatDescription = formatDescription
ciContext = CIContext()
rosyFilter = CIFilter(name: "CIColorMatrix")
rosyFilter?.setValue(CIVector(x: 0, y: 0, z: 0, w: 0), forKey: "inputGVector") // Zero out green for the rosy tint
isPrepared = true
}
func reset() {
ciContext = nil
rosyFilter = nil
outputColorSpace = nil
outputPixelBufferPool = nil
outputFormatDescription = nil
inputFormatDescription = nil
isPrepared = false
}
func render(pixelBuffer: CVPixelBuffer) -> CVPixelBuffer? {
guard let rosyFilter = rosyFilter,
let ciContext = ciContext,
isPrepared else {
assertionFailure("Invalid state: Not prepared")
return nil
}
let sourceImage = CIImage(cvImageBuffer: pixelBuffer)
rosyFilter.setValue(sourceImage, forKey: kCIInputImageKey)
guard let filteredImage = rosyFilter.value(forKey: kCIOutputImageKey) as? CIImage else {
print("CIFilter failed to render image")
return nil
}
var pbuf: CVPixelBuffer?
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, outputPixelBufferPool!, &pbuf)
guard let outputPixelBuffer = pbuf else {
print("Allocation failure")
return nil
}
// Render the filtered image out to a pixel buffer (no locking needed, as CIContext's render method will do that)
ciContext.render(filteredImage, to: outputPixelBuffer, bounds: filteredImage.extent, colorSpace: outputColorSpace)
return outputPixelBuffer
}
}
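For context, a hedged sketch of how the capture pipeline would drive a renderer like this one: prepare once per format change, then render per frame. The frame and format inputs and the retained-buffer hint of 3 are illustrative:

import CoreMedia
import CoreVideo

let renderer = RosyCIRenderer()

// Sketch: called for every video frame delivered by the capture session.
func filtered(frame: CVPixelBuffer, formatDescription: CMFormatDescription) -> CVPixelBuffer? {
    if !renderer.isPrepared {
        // The hint tells the pool how many output buffers the client may retain at once.
        renderer.prepare(with: formatDescription, outputRetainedBufferCountHint: 3)
    }
    return renderer.render(pixelBuffer: frame)
}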

RosyMetalRenderer.swift
View File

@@ -0,0 +1,135 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Rosy colored filter renderer implemented with Metal.
*/
import CoreMedia
import CoreVideo
import Metal
class RosyMetalRenderer: FilterRenderer {
var description: String = "Rosy (Metal)"
var isPrepared = false
private(set) var inputFormatDescription: CMFormatDescription?
private(set) var outputFormatDescription: CMFormatDescription?
private var outputPixelBufferPool: CVPixelBufferPool?
private let metalDevice = MTLCreateSystemDefaultDevice()!
private var computePipelineState: MTLComputePipelineState?
private var textureCache: CVMetalTextureCache!
private lazy var commandQueue: MTLCommandQueue? = {
return self.metalDevice.makeCommandQueue()
}()
required init() {
let defaultLibrary = metalDevice.makeDefaultLibrary()!
let kernelFunction = defaultLibrary.makeFunction(name: "rosyEffect")
do {
computePipelineState = try metalDevice.makeComputePipelineState(function: kernelFunction!)
} catch {
print("Could not create pipeline state: \(error)")
}
}
func prepare(with formatDescription: CMFormatDescription, outputRetainedBufferCountHint: Int) {
reset()
(outputPixelBufferPool, _, outputFormatDescription) = allocateOutputBufferPool(with: formatDescription,
outputRetainedBufferCountHint: outputRetainedBufferCountHint)
if outputPixelBufferPool == nil {
return
}
inputFormatDescription = formatDescription
var metalTextureCache: CVMetalTextureCache?
if CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, metalDevice, nil, &metalTextureCache) != kCVReturnSuccess {
assertionFailure("Unable to allocate texture cache")
} else {
textureCache = metalTextureCache
}
isPrepared = true
}
func reset() {
outputPixelBufferPool = nil
outputFormatDescription = nil
inputFormatDescription = nil
textureCache = nil
isPrepared = false
}
func render(pixelBuffer: CVPixelBuffer) -> CVPixelBuffer? {
if !isPrepared {
assertionFailure("Invalid state: Not prepared")
return nil
}
var newPixelBuffer: CVPixelBuffer?
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, outputPixelBufferPool!, &newPixelBuffer)
guard let outputPixelBuffer = newPixelBuffer else {
print("Allocation failure: Could not get pixel buffer from pool (\(self.description))")
return nil
}
guard let inputTexture = makeTextureFromCVPixelBuffer(pixelBuffer: pixelBuffer, textureFormat: .bgra8Unorm),
let outputTexture = makeTextureFromCVPixelBuffer(pixelBuffer: outputPixelBuffer, textureFormat: .bgra8Unorm) else {
return nil
}
// Set up command queue, buffer, and encoder
guard let commandQueue = commandQueue,
let commandBuffer = commandQueue.makeCommandBuffer(),
let commandEncoder = commandBuffer.makeComputeCommandEncoder() else {
print("Failed to create Metal command queue")
CVMetalTextureCacheFlush(textureCache!, 0)
return nil
}
commandEncoder.label = "Rosy Metal"
commandEncoder.setComputePipelineState(computePipelineState!)
commandEncoder.setTexture(inputTexture, index: 0)
commandEncoder.setTexture(outputTexture, index: 1)
// Set up thread groups as described in https://developer.apple.com/reference/metal/mtlcomputecommandencoder
let w = computePipelineState!.threadExecutionWidth
let h = computePipelineState!.maxTotalThreadsPerThreadgroup / w
let threadsPerThreadgroup = MTLSizeMake(w, h, 1)
let threadgroupsPerGrid = MTLSize(width: (inputTexture.width + w - 1) / w,
height: (inputTexture.height + h - 1) / h,
depth: 1)
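// The grid uses ceiling division so the threadgroups cover every pixel even
// when the texture size is not a multiple of w or h; the kernel's own bounds
// check (see RosyEffect.metal) discards the overshoot.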
commandEncoder.dispatchThreadgroups(threadgroupsPerGrid, threadsPerThreadgroup: threadsPerThreadgroup)
commandEncoder.endEncoding()
commandBuffer.commit()
return outputPixelBuffer
}
func makeTextureFromCVPixelBuffer(pixelBuffer: CVPixelBuffer, textureFormat: MTLPixelFormat) -> MTLTexture? {
let width = CVPixelBufferGetWidth(pixelBuffer)
let height = CVPixelBufferGetHeight(pixelBuffer)
// Create a Metal texture from the image buffer
var cvTextureOut: CVMetalTexture?
CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, nil, textureFormat, width, height, 0, &cvTextureOut)
guard let cvTexture = cvTextureOut, let texture = CVMetalTextureGetTexture(cvTexture) else {
CVMetalTextureCacheFlush(textureCache, 0)
return nil
}
return texture
}
}

DepthToGrayscale.metal
View File

@@ -0,0 +1,35 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Metal compute shader that translates depth values to grayscale RGB values.
*/
#include <metal_stdlib>
using namespace metal;
struct converterParameters {
float offset;
float range;
};
// Compute kernel
kernel void depthToGrayscale(texture2d<float, access::read> inputTexture [[ texture(0) ]],
texture2d<float, access::write> outputTexture [[ texture(1) ]],
constant converterParameters& converterParameters [[ buffer(0) ]],
uint2 gid [[ thread_position_in_grid ]])
{
// Ensure we don't read or write outside of the texture
if ((gid.x >= inputTexture.get_width()) || (gid.y >= inputTexture.get_height())) {
return;
}
float depth = inputTexture.read(gid).x;
// Normalize the value between 0 and 1
depth = (depth - converterParameters.offset) / (converterParameters.range);
float4 outputColor = float4(float3(depth), 1.0);
outputTexture.write(outputColor, gid);
}

Mixer.metal
View File

@@ -0,0 +1,49 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Shader that blends two input textures.
*/
#include <metal_stdlib>
using namespace metal;
struct VertexIO
{
float4 position [[position]];
float2 textureCoord [[user(texturecoord)]];
};
struct mixerParameters
{
float mixFactor;
};
vertex VertexIO vertexMixer(device float2 *pPosition [[ buffer(0) ]],
uint index [[ vertex_id ]])
{
VertexIO outVertex;
outVertex.position.xy = pPosition[index];
outVertex.position.z = 0;
outVertex.position.w = 1.0;
// Convert texture position to texture coordinates
outVertex.textureCoord.xy = 0.5 + float2(0.5, -0.5) * outVertex.position.xy;
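// For example, the top-left clip-space corner (-1, 1) maps to texture
// coordinate (0, 0), and the bottom-right corner (1, -1) maps to (1, 1).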
return outVertex;
}
fragment half4 fragmentMixer(VertexIO inputFragment [[ stage_in ]],
texture2d<half> mixerInput0 [[ texture(0) ]],
texture2d<half> mixerInput1 [[ texture(1) ]],
const device mixerParameters& mixerParameters [[ buffer(0) ]],
sampler samplr [[ sampler(0) ]])
{
half4 input0 = mixerInput0.sample(samplr, inputFragment.textureCoord);
half4 input1 = mixerInput1.sample(samplr, inputFragment.textureCoord);
half4 output = mix(input0, input1, half(mixerParameters.mixFactor));
return output;
}

PassThrough.metal
View File

@@ -0,0 +1,37 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Pass-through shader (used for preview).
*/
#include <metal_stdlib>
using namespace metal;
// Vertex input/output structure for passing results from vertex shader to fragment shader
struct VertexIO
{
float4 position [[position]];
float2 textureCoord [[user(texturecoord)]];
};
// Vertex shader for a textured quad
vertex VertexIO vertexPassThrough(device packed_float4 *pPosition [[ buffer(0) ]],
device packed_float2 *pTexCoords [[ buffer(1) ]],
uint vid [[ vertex_id ]])
{
VertexIO outVertex;
outVertex.position = pPosition[vid];
outVertex.textureCoord = pTexCoords[vid];
return outVertex;
}
// Fragment shader for a textured quad
fragment half4 fragmentPassThrough(VertexIO inputFragment [[ stage_in ]],
texture2d<half> inputTexture [[ texture(0) ]],
sampler samplr [[ sampler(0) ]])
{
return inputTexture.sample(samplr, inputFragment.textureCoord);
}

RosyEffect.metal
View File

@@ -0,0 +1,27 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Shader that gives images a pink tint by zero-ing out the green value.
*/
#include <metal_stdlib>
using namespace metal;
// Compute kernel
kernel void rosyEffect(texture2d<half, access::read> inputTexture [[ texture(0) ]],
texture2d<half, access::write> outputTexture [[ texture(1) ]],
uint2 gid [[thread_position_in_grid]])
{
// Make sure we don't read or write outside of the texture
if ((gid.x >= inputTexture.get_width()) || (gid.y >= inputTexture.get_height())) {
return;
}
half4 inputColor = inputTexture.read(gid);
// Set the output color to the input color minus the green component
half4 outputColor = half4(inputColor.r, 0.0, inputColor.b, 1.0);
outputTexture.write(outputColor, gid);
}

VideoMixer.swift
View File

@@ -0,0 +1,177 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Combines video frames and grayscale depth frames.
*/
import CoreMedia
import CoreVideo
import Metal
class VideoMixer {
var description: String = "Video Mixer"
var isPrepared = false
private(set) var inputFormatDescription: CMFormatDescription?
private(set) var outputFormatDescription: CMFormatDescription?
private var outputPixelBufferPool: CVPixelBufferPool?
private let metalDevice = MTLCreateSystemDefaultDevice()!
private var renderPipelineState: MTLRenderPipelineState?
private var sampler: MTLSamplerState?
private var textureCache: CVMetalTextureCache!
private lazy var commandQueue: MTLCommandQueue? = {
return self.metalDevice.makeCommandQueue()
}()
private var fullRangeVertexBuffer: MTLBuffer?
var mixFactor: Float = 0.5
init() {
let vertexData: [Float] = [
-1.0, 1.0,
1.0, 1.0,
-1.0, -1.0,
1.0, -1.0
]
fullRangeVertexBuffer = metalDevice.makeBuffer(bytes: vertexData, length: vertexData.count * MemoryLayout<Float>.size, options: [])
let defaultLibrary = metalDevice.makeDefaultLibrary()!
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
pipelineDescriptor.vertexFunction = defaultLibrary.makeFunction(name: "vertexMixer")
pipelineDescriptor.fragmentFunction = defaultLibrary.makeFunction(name: "fragmentMixer")
do {
renderPipelineState = try metalDevice.makeRenderPipelineState(descriptor: pipelineDescriptor)
} catch {
fatalError("Unable to create video mixer pipeline state. (\(error))")
}
// To determine how our textures are sampled, we create a sampler descriptor, which
// is used to ask for a sampler state object from our device.
let samplerDescriptor = MTLSamplerDescriptor()
samplerDescriptor.minFilter = .linear
samplerDescriptor.magFilter = .linear
sampler = metalDevice.makeSamplerState(descriptor: samplerDescriptor)
}
func prepare(with videoFormatDescription: CMFormatDescription, outputRetainedBufferCountHint: Int) {
reset()
(outputPixelBufferPool, _, outputFormatDescription) = allocateOutputBufferPool(with: videoFormatDescription,
outputRetainedBufferCountHint: outputRetainedBufferCountHint)
if outputPixelBufferPool == nil {
return
}
inputFormatDescription = videoFormatDescription
var metalTextureCache: CVMetalTextureCache?
if CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, metalDevice, nil, &metalTextureCache) != kCVReturnSuccess {
assertionFailure("Unable to allocate video mixer texture cache")
} else {
textureCache = metalTextureCache
}
isPrepared = true
}
func reset() {
outputPixelBufferPool = nil
outputFormatDescription = nil
inputFormatDescription = nil
textureCache = nil
isPrepared = false
}
struct MixerParameters {
var mixFactor: Float
}
func mix(videoPixelBuffer: CVPixelBuffer, depthPixelBuffer: CVPixelBuffer) -> CVPixelBuffer? {
if !isPrepared {
assertionFailure("Invalid state: Not prepared")
return nil
}
var newPixelBuffer: CVPixelBuffer?
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, outputPixelBufferPool!, &newPixelBuffer)
guard let outputPixelBuffer = newPixelBuffer else {
print("Allocation failure: Could not get pixel buffer from pool (\(self.description))")
return nil
}
guard let outputTexture = makeTextureFromCVPixelBuffer(pixelBuffer: outputPixelBuffer),
let inputTexture0 = makeTextureFromCVPixelBuffer(pixelBuffer: videoPixelBuffer),
let inputTexture1 = makeTextureFromCVPixelBuffer(pixelBuffer: depthPixelBuffer) else {
return nil
}
var parameters = MixerParameters(mixFactor: mixFactor)
let renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = outputTexture
guard let fullRangeVertexBuffer = fullRangeVertexBuffer else {
print("Failed to create Metal vertex buffer")
CVMetalTextureCacheFlush(textureCache!, 0)
return nil
}
guard let sampler = sampler else {
print("Failed to create Metal sampler")
CVMetalTextureCacheFlush(textureCache!, 0)
return nil
}
// Set up command queue, buffer, and encoder
guard let commandQueue = commandQueue,
let commandBuffer = commandQueue.makeCommandBuffer(),
let commandEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
print("Failed to create Metal command queue")
CVMetalTextureCacheFlush(textureCache!, 0)
return nil
}
commandEncoder.label = "Video Mixer"
commandEncoder.setRenderPipelineState(renderPipelineState!)
commandEncoder.setVertexBuffer(fullRangeVertexBuffer, offset: 0, index: 0)
commandEncoder.setFragmentTexture(inputTexture0, index: 0)
commandEncoder.setFragmentTexture(inputTexture1, index: 1)
commandEncoder.setFragmentSamplerState(sampler, index: 0)
commandEncoder.setFragmentBytes(&parameters, length: MemoryLayout<MixerParameters>.size, index: 0)
commandEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
commandEncoder.endEncoding()
commandBuffer.commit()
return outputPixelBuffer
}
func makeTextureFromCVPixelBuffer(pixelBuffer: CVPixelBuffer) -> MTLTexture? {
let width = CVPixelBufferGetWidth(pixelBuffer)
let height = CVPixelBufferGetHeight(pixelBuffer)
// Create a Metal texture from the image buffer
var cvTextureOut: CVMetalTexture?
CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, nil, .bgra8Unorm, width, height, 0, &cvTextureOut)
guard let cvTexture = cvTextureOut, let texture = CVMetalTextureGetTexture(cvTexture) else {
print("Video mixer failed to create preview texture")
CVMetalTextureCacheFlush(textureCache, 0)
return nil
}
return texture
}
}
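A hedged sketch of driving the mixer per frame; videoFrame, grayscaleDepthFrame, and previewView are illustrative names, and the depth frame is assumed to have already been rendered to BGRA grayscale by the depth-to-grayscale converter:

let mixer = VideoMixer()
// After mixer.prepare(with: videoFormatDescription, outputRetainedBufferCountHint: 3):
mixer.mixFactor = 0.5 // 0 shows only the video texture, 1 only the depth texture
if let blended = mixer.mix(videoPixelBuffer: videoFrame, depthPixelBuffer: grayscaleDepthFrame) {
    previewView.pixelBuffer = blended
}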

minMaxFromBuffer.h
View File

@@ -0,0 +1,16 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Defines a function which extracts the smallest and largest values from a pixel buffer.
*/
#ifndef minMaxFromBuffer_h
#define minMaxFromBuffer_h
#import <CoreVideo/CoreVideo.h>
#import <Metal/Metal.h>
void minMaxFromPixelBuffer(CVPixelBufferRef pixelBuffer, float* minValue, float* maxValue, MTLPixelFormat pixelFormat);
#endif /* minMaxFromBuffer_h */

minMaxFromBuffer.m
View File

@@ -0,0 +1,50 @@
/*
See LICENSE.txt for this sample's licensing information.
Abstract:
Implements a function which extracts the smallest and largest values from a pixel buffer.
*/
#import "minMaxFromBuffer.h"
#import <Foundation/Foundation.h>
#import <simd/simd.h>
void minMaxFromPixelBuffer(CVPixelBufferRef pixelBuffer, float* minValue, float* maxValue, MTLPixelFormat pixelFormat)
{
int width = (int)CVPixelBufferGetWidth(pixelBuffer);
int height = (int)CVPixelBufferGetHeight(pixelBuffer);
int bytesPerRow = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
unsigned char* pixelBufferPointer = CVPixelBufferGetBaseAddress(pixelBuffer);
__fp16* bufferP_F16 = (__fp16 *) pixelBufferPointer;
float* bufferP_F32 = (float *) pixelBufferPointer;
bool isFloat16 = (pixelFormat == MTLPixelFormatR16Float);
uint32_t increment = isFloat16 ? bytesPerRow/sizeof(__fp16) : bytesPerRow/sizeof(float);
float min = MAXFLOAT;
float max = -MAXFLOAT;
for (int j=0; j < height; j++)
{
for (int i=0; i < width; i++)
{
float val = isFloat16 ? bufferP_F16[i] : bufferP_F32[i];
if (!isnan(val)) {
if (val>max) max = val;
if (val<min) min = val;
}
}
if (isFloat16) {
bufferP_F16 += increment;
} else {
bufferP_F32 += increment;
}
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
*minValue = min;
*maxValue = max;
}
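A hedged Swift sketch tying this function to the depthToGrayscale kernel above: offset would be the minimum and range the spread, so depth values normalize into [0, 1]. The pixel format and the depthPixelBuffer input are assumptions:

import CoreVideo
import Metal

var minDepth: Float = 0
var maxDepth: Float = 0
minMaxFromPixelBuffer(depthPixelBuffer, &minDepth, &maxDepth, .r16Float) // assuming a 16-bit float depth map
let offset = minDepth           // becomes converterParameters.offset
let range = maxDepth - minDepth // becomes converterParameters.range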

SampleCode.xcconfig
View File

@@ -0,0 +1,11 @@
//
// SampleCode.xcconfig
//
// The `SAMPLE_CODE_DISAMBIGUATOR` configuration exists to make it easier to build
// and run a sample code project. Once you set your project's development team,
// you'll have a unique bundle identifier, because the bundle identifier
// is derived from the 'SAMPLE_CODE_DISAMBIGUATOR' value. Do not use this
// approach in your own projects—it's only useful for sample code projects, which
// are frequently downloaded and don't have a development team set.
SAMPLE_CODE_DISAMBIGUATOR=${DEVELOPMENT_TEAM}

LICENSE.txt
View File

@@ -0,0 +1,42 @@
Sample code project: AVCamPhotoFilter: Using AVFoundation to capture photos with image processing
Version: 3.0
IMPORTANT: This Apple software is supplied to you by Apple
Inc. ("Apple") in consideration of your agreement to the following
terms, and your use, installation, modification or redistribution of
this Apple software constitutes acceptance of these terms. If you do
not agree with these terms, please do not use, install, modify or
redistribute this Apple software.
In consideration of your agreement to abide by the following terms, and
subject to these terms, Apple grants you a personal, non-exclusive
license, under Apple's copyrights in this original Apple software (the
"Apple Software"), to use, reproduce, modify and redistribute the Apple
Software, with or without modifications, in source and/or binary forms;
provided that if you redistribute the Apple Software in its entirety and
without modifications, you must retain this notice and the following
text and disclaimers in all such redistributions of the Apple Software.
Neither the name, trademarks, service marks or logos of Apple Inc. may
be used to endorse or promote products derived from the Apple Software
without specific prior written permission from Apple. Except as
expressly stated in this notice, no other rights or licenses, express or
implied, are granted by Apple herein, including but not limited to any
patent rights that may be infringed by your derivative works or by other
works in which the Apple Software may be incorporated.
The Apple Software is provided by Apple on an "AS IS" basis. APPLE
MAKES NO WARRANTIES, EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION
THE IMPLIED WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY AND FITNESS
FOR A PARTICULAR PURPOSE, REGARDING THE APPLE SOFTWARE OR ITS USE AND
OPERATION ALONE OR IN COMBINATION WITH YOUR PRODUCTS.
IN NO EVENT SHALL APPLE BE LIABLE FOR ANY SPECIAL, INDIRECT, INCIDENTAL
OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) ARISING IN ANY WAY OUT OF THE USE, REPRODUCTION,
MODIFICATION AND/OR DISTRIBUTION OF THE APPLE SOFTWARE, HOWEVER CAUSED
AND WHETHER UNDER THEORY OF CONTRACT, TORT (INCLUDING NEGLIGENCE),
STRICT LIABILITY OR OTHERWISE, EVEN IF APPLE HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
Copyright (C) 2017 Apple Inc. All Rights Reserved.

README.md
View File

@@ -0,0 +1,27 @@
# AVCamPhotoFilter
Using AV Foundation to capture photos with image processing.
## Overview
AVCamPhotoFilter demonstrates how to use AV Foundation's capture API to draw a live camera preview and capture photos with image processing (filtering) applied.
Two "rosy" filters are provided: one is implemented using Core Image, and the other is implemented as a Metal shader. A horizontal swipe on the camera preview switches between the filters.
On devices that support depth map delivery, AVCamPhotoFilter provides depth data visualization (via a Metal shader). When depth visualization is enabled, a slider enables crossfading between video and depth visualization.
AVCamPhotoFilter also shows how to properly propagate sample buffer attachments and attributes, including EXIF metadata and color space information (e.g. wide gamut).
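When a filter renders into a freshly allocated pixel buffer, Core Video's `CVBufferPropagateAttachments` can carry the propagatable attachments across. A sketch with illustrative buffer names:

```swift
// Copy propagatable attachments (EXIF-related metadata, color space, and so on)
// from the camera's source buffer onto the newly rendered output buffer.
CVBufferPropagateAttachments(sourcePixelBuffer, outputPixelBuffer)
```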
## Requirements
### Build
Xcode 9.0 or later; iOS 11.0 SDK or later.
- Note: **AVCamPhotoFilter can only be built for an actual iOS device, not for the simulator.**
### Runtime
iOS 11.0 or later
- Note: **AVCamPhotoFilter can only be run on an actual iOS device, not on the simulator.**