iOS 10.0 Highlights (a Developer's Perspective)
Contents:
- iOS 10.0
- SiriKit
- Proactive Suggestions
- Integrating with the Messages App
- User Notifications
- Speech Recognition
- Wide Color
- Adapting to the True Tone Display
- App Search Enhancements
- Widget Enhancements
- Apple Pay Enhancements
- Security and Privacy Enhancements
- CallKit
- News Publisher Enhancements
iOS 10.0
This article summarizes the key developer-related features introduced in iOS 10, which runs on currently shipping iOS devices. The article also lists the documents that describe new features in more detail.
For late-breaking news and information about known issues, see Release Notes at https://developer.apple.com/ios/download/. For the complete list of new APIs added in iOS 10, see iOS 10.0 API Diffs. For more information on new devices, see iOS Device Compatibility Reference.
To learn about what’s new in Swift, see Swift Language and The Swift Programming Language (Swift 3).
SiriKit
Apps that provide services in specific domains can use SiriKit to make those services available from Siri on iOS. Making your services available requires creating one or more app extensions using the Intents and Intents UI frameworks. (To adopt SiriKit, you build extensions with the Intents and Intents UI frameworks.) SiriKit supports services in the following domains:
- Audio or video calling
- Messaging
- Sending or receiving payments
- Searching photos
- Booking a ride
- Managing workouts
(The supported scenarios are limited to these domains: audio/video calling, messaging, payments, photo search, ride booking, and workouts.)
When the user makes a request involving your service, SiriKit sends your extension an intent object, which describes the user’s request and provides any data related to that request. You use the intent object to provide an appropriate response object, which includes details of how you can handle the user’s request. Siri typically handles all user interactions, but you can use an extension to provide custom UI that incorporates branding or additional information from your app.
User request (with its data) ==> SiriKit ==> an intent object ==> you build a response object. By default Siri handles all of the user interaction, but an extension can also provide custom UI.
SiriKit also provides a mechanism you can use to tell the system about the interactions and activities that occur within your app. (SiriKit gives us a way to tell the system about the interactions and activities inside our app.) When you tell the system about these interactions, the system can determine if your app can handle the user’s current request and, if it can, pass the request to your app. In addition to the intent, SiriKit defines an interaction object, which combines an intent with information about the intent-handling process, including details such as the start time and duration of a specific occurrence of the process. (In addition to the intent, SiriKit defines an interaction object that bundles the intent with information about how it was handled, such as the start time and duration of that occurrence.)
If your app is registered as capable of handling an activity that has the same name as an intent, the system can launch your app with an interaction object containing that intent even if you don’t provide an Intents app extension. (If our app registers that it can handle an activity with the same name as an intent, the system can launch our app with an interaction containing that intent even without an Intents extension. P.S.: how exactly do you register an activity with that name?)
Ride booking is supported by both Maps and Siri, and users can also make restaurants reservations with Maps. Your Intents extension handles interactions that originate from the Maps app in the same way that it handles requests coming from Siri. If you customize the user interface, your Intents UI extension can also configure itself differently, depending on whether the request came from Siri or Maps.
To learn how to support SiriKit and give users new ways to access your services, read SiriKit Programming Guide. When you’re ready to implement the app extensions that handle various intents, see Intents Framework Reference and Intents UI Framework Reference.
https://developer.apple.com/reference/intents
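To make the request/response flow concrete, here is a minimal, hypothetical sketch of an Intents extension handler for the messaging domain. The class name and the send logic are placeholders and not part of the original article; a real extension would also resolve recipients and content.

```swift
import Intents

// Hypothetical handler for INSendMessageIntent. The extension's principal
// class would return an instance of this from its handler(for:) method.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    // Confirm that the app is ready to send the message.
    func confirm(sendMessage intent: INSendMessageIntent,
                 completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .ready, userActivity: nil))
    }

    // Hand the message content to the app's messaging service and report success to Siri.
    func handle(sendMessage intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // ... send intent.content to intent.recipients via the app's own service ...
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```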
Proactive Suggestions
iOS 10 introduces new ways to increase engagement with your app by helping the system suggest your app to users at appropriate times. (In iOS 10, the system can proactively suggest our app to the user at appropriate moments.)
If you adopted app search in your iOS 9 app, you gave users access to activities and content deep within your app through Spotlight and Safari search results, Handoff, and Siri suggestions. On iOS 9, proactive suggestions could already surface in several places: 1. Spotlight search; 2. Safari search; 3. Handoff; 4. Siri suggestions.
In iOS 10 and later, you can provide information about what users do in your app, which helps the system promote your app in additional places, such as the keyboard with QuickType suggestions, Maps and CarPlay, the app switcher, Siri interactions, and (for media playing apps) the lock screen. iOS 10 adds the following places:
- The keyboard (QuickType suggestions)
- Maps
- CarPlay
- The app switcher
- Siri interactions
- The lock screen (media-playing apps only)
These opportunities for enhanced integration with the system are supported by a collection of technologies, such as NSUserActivity, web markup defined by Schema.org, and APIs defined in the Core Spotlight, MapKit, UIKit, and Media Player frameworks.
The technologies that power suggestions in these places are:
- NSUserActivity
- web markup
- CoreSpotlight
- MapKit
- UIKit
- Media Player 框架
In iOS 10, the NSUserActivity object includes the mapItem property, which lets you provide location information that can be used in other contexts. For example, if your app displays hotel reviews, you can use the mapItem property to hold the location of the hotel the user is viewing so that when the user switches to a travel planning app, that hotel’s location is automatically available. (In iOS 10, NSUserActivity gains a mapItem property for carrying location information so it can be reused elsewhere. For example, if an app shows a hotel, it can store the hotel's location in mapItem; when the user switches to a travel planning app, that location is automatically available there.)
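As a rough sketch of donating such an activity (the activity type, hotel name, and coordinate below are made up for illustration):

```swift
import Foundation
import MapKit

// Publish the location the user is viewing so the system can surface it
// elsewhere (Maps, QuickType, the app switcher).
func donateHotelActivity(hotelName: String, hotelCoordinate: CLLocationCoordinate2D) -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.viewing-hotel") // hypothetical type
    activity.title = hotelName

    // New in iOS 10: attach the location as an MKMapItem.
    let placemark = MKPlacemark(coordinate: hotelCoordinate, addressDictionary: nil)
    activity.mapItem = MKMapItem(placemark: placemark)

    // The system also fills contentAttributeSet from mapItem; extra searchable
    // attributes can be added manually if needed.
    activity.isEligibleForSearch = true
    activity.isEligibleForPublicIndexing = true
    activity.becomeCurrent()
    return activity
}
```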
And if you support app search, you can use the new text-based address component properties in CSSearchableItemAttributeSet, such as thoroughfare and postalCode, to fully specify locations to which the user may want to go. Note that when you use the mapItem property, the system automatically populates the contentAttributeSet property, too. (If an app supports search, it can use the address component properties of CSSearchableItemAttributeSet to record the location. Note that when mapItem is set, contentAttributeSet is filled in automatically from it.)
To share a location with the system, be sure to specify latitude and longitude values, in addition to values for the address component properties in CSSearchableItemAttributeSet. It’s also recommended that you supply a value for the namedLocation property, so that users can view the name of the location, and the phoneNumbers property, so that users can use Siri to initiate a call to the location. When sharing a location, you should specify:
- Latitude and longitude
- The address component properties in CSSearchableItemAttributeSet
- namedLocation, for the name of the place
- phoneNumbers, for contact details (for example, so the user can ask Siri to call and book a table)
More details on each of these areas follow.
In iOS 9, adding markup to the structured data on your website enriched the content that users see in Spotlight and Safari search results. In iOS 10, you can use location-related vocabulary defined at Schema.org, such as PostalAddress, to further enhance the user’s experience. For example, if users view a location described on your website, the system can suggest the same location when users switch to Maps. Note that Safari supports both JSON-LD and Microdata encodings of Schema.org vocabularies. (In iOS 9, adding markup to your site's structured data enriched what users saw in Spotlight and Safari search results. In iOS 10 you can also add location-related vocabulary; for example, if a user views a location on our website and then switches to Maps, the system can suggest that same location.)
UIKit introduces the textContentType property in the UITextInputTraits protocol so that you can specify the semantic meaning of the content you expect users to enter in a text area. When you provide this information, the system can in some cases automatically select an appropriate keyboard and improve keyboard corrections and proactive integration with information supplied from other apps and websites. For example, if you use UITextContentTypeFullStreetAddress to tell the system that you expect users to enter a complete address in a text field, the system can suggest the address of a location the user was recently viewing. (UIKit adds a textContentType property for declaring the kind of content you expect the user to enter. For example, with UITextContentTypeFullStreetAddress, an address the user recently viewed can be suggested as they type. The available types are listed at https://developer.apple.com/reference/uikit/uitextinputtraits/1773409-text_content_types and cover names, nicknames, job titles, organizations, locations (state, city, district, street, postal code), phone numbers, email addresses, and URLs. In other words, if the user has recently seen one of these values, it can be offered while typing, much like autofill.)
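A minimal sketch of this in code (the text field itself is just a placeholder):

```swift
import UIKit

// Declare that a text field expects a full street address so the system can
// offer QuickType suggestions, for example an address the user was just
// viewing in another app.
let addressField = UITextField()
addressField.placeholder = "Delivery address"                 // hypothetical UI text
addressField.textContentType = .fullStreetAddress            // UITextContentTypeFullStreetAddress
```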
If your app plays media and you use the MPPlayableContentManager APIs, iOS 10 helps you let users view album art and play media through your app on the lock screen. (For media apps that adopt the MPPlayableContentManager APIs, users can see album art and play media from the lock screen.)
If your ride-sharing app uses the MKDirectionsRequest API, iOS 10 can display it in the app switcher when the user is likely to want a ride. To register as a ride-share provider, specify the MKDirectionsModeRideShare value for the MKDirectionsApplicationSupportedModes key in your Info.plist file. If your app supports only ride sharing, the system suggests your app with text that begins “Get a ride to...”; if your app supports both ride sharing and another routing type (such as Automobile or Bike), the system uses the text “Get directions to...”. Note that the MKMapItem object you receive may not include latitude and longitude information and would require geocoding. (For a ride-sharing app, adopting MKDirectionsRequest lets the system suggest your app in the app switcher when the user probably wants a ride. To register as a ride-share provider, add the key MKDirectionsApplicationSupportedModes with the value MKDirectionsModeRideShare to Info.plist.)
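A hedged sketch of the app-side handling, assuming the Info.plist key above is set; the handler function and log output are illustrative only:

```swift
import MapKit

// Handle a directions URL handed to a ride-sharing app (for example from Maps
// or the app switcher suggestion).
func handleDirectionsURL(_ url: URL) -> Bool {
    guard MKDirectionsRequest.isDirectionsRequest(url) else { return false }
    let request = MKDirectionsRequest(contentsOf: url)
    // The destination map item may lack coordinates and could need geocoding.
    print("Ride requested to:", request.destination as Any)
    return true
}
```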
Integrating with the Messages App
In iOS 10, you can create app extensions that interact with the Messages app and let users send text, stickers, media files, and interactive messages. You can also support interactive messages that update as each recipient responds to the message. You can create two types of app extensions: (In iOS 10, your app can integrate with the system Messages app to send:)
- Text
- Stickers
- Media files
- Interactive messages
- A Sticker pack provides a set of stickers that users can add to their Messages content.
- An iMessage app lets you present a custom user interface within the Messages app, create a sticker browser, include text, stickers, and media files within a conversation, and create, send, and update interactive messages.
- An iMessage app can also help users search images that you host on your app’s related website while they’re in the Messages app.
To integrate, you create an app extension; it can take two forms:
- A sticker pack extension
- An iMessage app with a custom UI (for example, a sticker browser, or searching images from your app)
Here is a brief look at how to build each.
You can create a Sticker pack without writing any code: Simply drag images into the Sticker Pack folder inside the Stickers asset catalog in Xcode. (1. A sticker pack is the simple case: just drag images into the Sticker Pack folder of the Stickers asset catalog in Xcode.)
To develop an iMessage app, you use the APIs in the Messages framework (Messages.framework). To learn about the Messages framework, see Messages Framework Reference. For general information about creating app extensions, see App Extension Programming Guide.
If your app provides images for sharing in Messages and you want users to be able to use the Spotlight popular image search (that is, “#images”) to search these images without leaving the Messages app, first create an iMessage app. Then follow these steps:
- Add the com.apple.developer.associated-domains key to your app’s entitlements. Include a list of the web domains that host the images you want to make searchable. For each domain, specify the spotlight-image-search service.
- Add an apple-app-site-association file to your website. Add a dictionary for the spotlight-image-search service and include your app ID, which is the team ID or app ID prefix, followed by the bundle ID. You can also specify up to 500 paths and patterns that should be included for indexing by the Spotlight popular image search (for some examples of website paths, see the universal links examples in Creating and Uploading the Association File).
- Allow crawling by Applebot (to learn more, see About Applebot).
2. To build an iMessage app, use the Messages framework, which is new in iOS 10 (a minimal sketch appears at the end of this section). For general information about building app extensions, see the App Extension Programming Guide: https://developer.apple.com/library/prerelease/content/documentation/General/Conceptual/ExtensibilityPG/index.html#//apple_ref/doc/uid/TP40014214
To let users search your images from within Messages, follow these steps:
- Create an iMessage app.
- Add com.apple.developer.associated-domains to your entitlements, listing the web domains that host the searchable images; specify the spotlight-image-search service for each domain.
- Add an apple-app-site-association file to your website.
- Allow Applebot to crawl your site.
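Here is the minimal iMessage app sketch mentioned above. The caption text and helper method are hypothetical; a real extension would also build its sticker browser or other custom UI.

```swift
import UIKit
import Messages

// Principal view controller of an iMessage app: builds an interactive message
// and inserts it into the active conversation.
class MessagesViewController: MSMessagesAppViewController {

    func composeMessage(imageURL: URL?) {
        let layout = MSMessageTemplateLayout()
        layout.caption = "Our lunch order"           // hypothetical content
        layout.subcaption = "Tap to add your pick"

        let message = MSMessage()
        message.layout = layout
        message.url = imageURL                       // payload the receiving app can decode

        activeConversation?.insert(message) { error in
            if let error = error { print("insert failed:", error) }
        }
    }
}
```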
User Notifications
iOS 10 introduces the User Notifications framework (UserNotifications.framework), which supports the delivery and handling of local and remote notifications. You use the classes of this framework to schedule the delivery of local notifications based on specific conditions, such as time or location. Apps and app extensions can use this framework to receive and potentially modify local and remote notifications when they are delivered to the user’s device.
Also introduced in iOS 10, the User Notifications UI framework (UserNotificationsUI.framework) lets you customize the appearance of local and remote notifications when they appear on the user’s device. You use this framework to define an app extension that receives the notification data and provides the corresponding visual representation. Your extension can also respond to custom actions associated with those notifications.
iOS 10 adds UserNotifications.framework for scheduling and handling local and remote notifications; delivery can be triggered by conditions such as a time or a location. Even better, the new UserNotificationsUI.framework lets you customize the UI that appears when a notification is delivered.
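A minimal sketch of scheduling a local notification with the new framework (the identifier and content are placeholders):

```swift
import UserNotifications

// Request permission, then schedule a local notification five seconds from now.
let center = UNUserNotificationCenter.current()
center.requestAuthorization(options: [.alert, .sound]) { granted, error in
    guard granted else { return }

    let content = UNMutableNotificationContent()
    content.title = "Reminder"                 // hypothetical content
    content.body = "Time to stand up."
    content.sound = UNNotificationSound.default()

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
    let request = UNNotificationRequest(identifier: "stand-up", content: content, trigger: trigger)
    center.add(request) { error in
        if let error = error { print("scheduling failed:", error) }
    }
}
```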
Speech Recognition
iOS 10 introduces a new API that supports continuous speech recognition and helps you build apps that can recognize speech and transcribe it into text. Using the APIs in the Speech framework (Speech.framework), you can perform speech transcription of both real-time and recorded audio. For example, you can get a speech recognizer and start simple speech recognition using code like this:
```swift
import Speech

// audioFileURL: the URL of a recorded audio file to transcribe
let recognizer = SFSpeechRecognizer()
let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
recognizer?.recognitionTask(with: request, resultHandler: { (result, error) in
    print(result?.bestTranscription.formattedString ?? "")
})
```
As with accessing other types of protected data, such as Calendar and Photos data, performing speech recognition requires the user’s permission (for more information about accessing protected data classes, see Security and Privacy Enhancements). In the case of speech recognition, permission is required because data is transmitted and temporarily stored on Apple’s servers to increase the accuracy of speech recognition. To request the user’s permission, you must add the NSSpeechRecognitionUsageDescription key to your app’s Info.plist file.
When you adopt speech recognition in your app, be sure to indicate to users that their speech is being recognized, and that they should not make sensitive utterances at that time.
In iOS 10, the APIs in Speech.framework let you perform:
- Real-time speech-to-text transcription
- Transcription of recorded audio files
One thing to note: speech recognition requires the user's permission, because the audio is uploaded to Apple's servers for recognition. To request permission, add the NSSpeechRecognitionUsageDescription key to Info.plist.
Also, remember to indicate to users that their speech is being recognized, so they don't say anything sensitive in the meantime. = =!
Wide Color
Most graphics frameworks throughout the system, including Core Graphics, Core Image, Metal, and AVFoundation, have substantially improved support for extended-range pixel formats and wide-gamut color spaces. By extending this behavior throughout the entire graphics stack, it is easier than ever to support devices with a wide color display. In addition, UIKit standardizes on working in a new extended sRGB color space, making it easy to mix sRGB colors with colors in other, wider color gamuts without a significant performance penalty.
Here are some best practices to adopt as you start working with Wide Color.
In iOS 10, the UIColor class uses the extended sRGB color space and its initializers no longer clamp raw component values to between 0.0 and 1.0. If your app relies on UIKit to clamp component values (whether you’re creating a color or asking a color for its component values), you need to change your app’s behavior when you link against iOS 10. When performing custom drawing in a UIView on an iPad Pro (9.7 inch), the underlying drawing environment is configured with an extended sRGB color space.
- If your app renders custom image objects, use the new UIGraphicsImageRenderer class to control whether the destination bitmap is created using an extended-range or standard-range format (see the sketch after this list).
- If you are performing your own image processing on wide-gamut devices using a lower level API, such as Core Graphics or Metal, you should use an extended range color space and a pixel format that supports 16-bit floating-point component values. When clamping of color values is necessary, you should do so explicitly.
- Core Graphics, Core Image, and Metal Performance Shaders provide new options for easily converting colors and images between color spaces.
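The sketch referenced in the list above: rendering into an extended-range bitmap with UIGraphicsImageRenderer. The size and colors are arbitrary examples.

```swift
import UIKit

// Ask for an extended-range destination instead of the old
// UIGraphicsBeginImageContext APIs.
let format = UIGraphicsImageRendererFormat()
format.prefersExtendedRange = true

let renderer = UIGraphicsImageRenderer(size: CGSize(width: 200, height: 200), format: format)
let image = renderer.image { context in
    // Component values outside 0.0...1.0 are no longer clamped in iOS 10.
    UIColor(red: 1.2, green: 0.2, blue: 0.2, alpha: 1.0).setFill()
    context.fill(CGRect(x: 0, y: 0, width: 200, height: 200))
}
print(image.size)
```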
Adapting to the True Tone Display
The True Tone display uses ambient light sensors to automatically adjust the color and intensity of the display to match the lighting conditions of the current environment. To ensure that your app works well with the standard color shift provided by True Tone, add the new UIWhitePointAdaptivityStyle key to your Info.plist file to describe your app’s primary visual content. For example:
If your app is a photo editing app, color fidelity is more important than automatic adjustment to the environmental white point. In this case, you can use the UIWhitePointAdaptivityStylePhoto style to reduce the strength of True Tone shift applied by the system.
If your app is a reading app, conformance with the environmental white point is helpful to users. In this case, you can use the UIWhitePointAdaptivityStyleReading style to increase the strength of True Tone shift applied by the system.
The True Tone display uses the ambient light sensor to adjust the display's color and intensity to the current environment. To tell the system how your app should be treated under this adjustment, set the UIWhitePointAdaptivityStyle key in Info.plist.
There are five adaptivity styles:
1. UIWhitePointAdaptivityStyleStandard (standard)
2. UIWhitePointAdaptivityStyleReading (reading; a stronger shift than standard)
3. UIWhitePointAdaptivityStylePhoto (photo; a weaker shift than standard)
4. UIWhitePointAdaptivityStyleVideo (video; a weaker shift than standard)
5. UIWhitePointAdaptivityStyleGame (game; a weaker shift than standard)
For example, a photo-editing app doesn't want the system shift to be too strong, so it can specify UIWhitePointAdaptivityStylePhoto and the system reduces its adjustment.
A reading app, on the other hand, benefits from a stronger shift, so it can specify UIWhitePointAdaptivityStyleReading.
App Search Enhancements
iOS 10 and the Core Spotlight framework introduce several enhancements to app search:
- In-app searching
- Search continuation
- Crowdsourcing deep link popularity with differential privacy
- Visualization of validation results
The new CSSearchQuery class supports in-app searches of content that you index using existing Core Spotlight APIs. Using this API can eliminate the need to maintain your own separate search index and lets you take advantage of Spotlight’s powerful search technology and matching rules to allow users to search for content without leaving your app, just as they do within Mail, Messages, and Notes.
In iOS 9, using search APIs (such as Core Spotlight, NSUserActivity, and web markup) to index content within your app let users search for that content using the Spotlight and Safari search interfaces. In iOS 10, you can use new Core Spotlight symbols to let users continue a search they began in Spotlight when they open your app. To enable this feature, add the CoreSpotlightContinuation key to your Info.plist file, give it the value YES, and update your code to handle an activity continuation of type CSQueryContinuationActionType. The user info dictionary in the NSUserActivity object that you receive in your application:continueUserActivity:restorationHandler: method includes the CSSearchQueryString key, whose value is a string that represents the user’s query.
iOS 10 introduces a differentially private way to help improve the ranking of your app’s content in search results. iOS submits a subset of differentially private hashes to Apple servers as users use your app and as NSUserActivity objects that include a deep link URL and have their eligibleForPublicIndexing property set to YES are submitted to iOS. The differential privacy of the hashes allows Apple to count the frequency with which popular deep links are visited without ever associating a user with a link.
When you test your website markup and deep links using the App Search API Validation tool, it now displays a visual representation of your results, including supported markup, such as that defined at Schema.org. The validation tool can help you see information that the Applebot web crawler has indexed, such as the title, description, URL, and other supported elements. You can access the validation tool here: https://search.developer.apple.com/appsearch-validation-tool. To learn more about supporting deep links and adding markup, see Mark Up Web Content.
To learn how to make your website’s images searchable within the Messages app, see Integrating with the Messages App.
1. The CSSearchQuery class supports in-app search over the content you have indexed with Core Spotlight.
- In iOS 10, when a user searches in Spotlight and taps your app in the results, the query can be handed over to your app. For example, the user searches for "Haidilao", the Meituan app appears in the results, and when Meituan opens it receives the keyword "Haidilao". To implement this (see the sketch below): 1. Add the key CoreSpotlightContinuation with the value YES to Info.plist. 2. Handle a continuation activity of type CSQueryContinuationActionType. 3. In the launch callback, the NSUserActivity userInfo dictionary contains the key CSSearchQueryString, whose value is the user's query string.
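The sketch referenced above, written as the app delegate's continuation method (assuming CoreSpotlightContinuation is set to YES in Info.plist; the print call stands in for the app's own search UI):

```swift
import UIKit
import CoreSpotlight

// In practice this lives in your UIApplicationDelegate implementation.
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([Any]?) -> Void) -> Bool {
    if userActivity.activityType == CSQueryContinuationActionType,
       let query = userActivity.userInfo?[CSSearchQueryString] as? String {
        print("Continue searching for:", query)   // hand off to the app's search screen
        return true
    }
    return false
}
```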
Widget Enhancements
iOS 10 introduces a new design for the lock screen, which now displays widgets. To ensure that your widget looks good on any background, you can specify widgetPrimaryVibrancyEffect or widgetSecondaryVibrancyEffect, as appropriate (use these properties instead of the deprecated notificationCenterVibrancyEffect property). In addition, widgets now include the concept of display mode (represented by NCWidgetDisplayMode), which lets you describe how much content is available and allows users to choose a compact or expanded view.
The lock screen is redesigned in iOS 10 and now displays widgets.
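A minimal sketch of a Today widget opting into the new display modes (the expanded height is an arbitrary example):

```swift
import UIKit
import NotificationCenter

// Today widget view controller that supports both compact and expanded modes.
class TodayViewController: UIViewController, NCWidgetProviding {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Tell the system this widget has more content than the compact height shows.
        extensionContext?.widgetLargestAvailableDisplayMode = .expanded
    }

    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        // Compact: height fixed by the system; expanded: anything up to maxSize.height.
        preferredContentSize = (activeDisplayMode == .compact)
            ? maxSize
            : CGSize(width: maxSize.width, height: 280)   // hypothetical expanded height
    }
}
```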
Apple Pay Enhancements
In iOS 10, users can make easy and secure payments using Apple Pay from websites and through interaction with Siri and Maps. For developers, iOS 10 introduces new APIs you can use in code that runs in both iOS and watchOS, the ability to support dynamic payment networks, and a new sandbox testing environment.
iOS 10 introduces new APIs that help you incorporate Apple Pay directly into your website. When you support Apple Pay in your website, users browsing with Safari in iOS or OS X can make payments using their cards in Apple Pay on their iPhone or Apple Watch. To learn more, see ApplePay JS Framework Reference.
The PassKit framework (PassKit.framework) introduces APIs that let you support Apple Pay in places where UIKit is not available. Specifically, PKPaymentAuthorizationController and PKPaymentAuthorizationControllerDelegate enable features provided by PKPaymentAuthorizationViewController and its delegate, but don’t require UIKit. Although the new API is required for supporting Apple Pay in watchOS and in certain intents, it’s recommended that you adopt it in all of your code so that you can provide broad Apple Pay support with a single code base. (To learn more about intents and Siri integration, see SiriKit.)
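As a rough illustration of the UIKit-independent API, here is a hedged sketch that builds a payment request and presents it with PKPaymentAuthorizationController. The merchant identifier, networks, and amounts are placeholders, and a real app would set a delegate that adopts PKPaymentAuthorizationControllerDelegate.

```swift
import PassKit

// Build a basic payment request.
let request = PKPaymentRequest()
request.merchantIdentifier = "merchant.com.example.shop"   // hypothetical merchant ID
request.countryCode = "US"
request.currencyCode = "USD"
request.supportedNetworks = [.visa, .masterCard, .amex]
request.merchantCapabilities = .capability3DS
request.paymentSummaryItems = [
    PKPaymentSummaryItem(label: "Example Shop", amount: NSDecimalNumber(string: "9.99"))
]

// Present the Apple Pay sheet without depending on UIKit view controllers.
let controller = PKPaymentAuthorizationController(paymentRequest: request)
// controller.delegate = self   // adopt PKPaymentAuthorizationControllerDelegate in a real app
controller.present { presented in
    if !presented { print("Apple Pay sheet could not be presented") }
}
```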
The PassKit framework also adds features that let card issuers present their cards from within their apps. Specifically, the PKPaymentButtonTypeInStore button type lets you display an Apple Pay button for a card and the presentPaymentPass: method lets you programmatically display the card (the presentPaymentPass: method is defined in PKPassLibrary).
When a new payment network becomes available, your app can automatically support the new network without requiring you to modify and recompile your app. The availableNetworks method lets you discover the networks that are available on the user's device at runtime. In addition, the supportedNetworks property is expanded, so that it can take some payment provider names as an argument. Your app then automatically supports any networks that the payment provider supports. To learn more, see https://developer.apple.com/apple-pay/.
iOS 10 introduces a new testing environment that lets you provision test cards directly on the device. The test environment returns encrypted test payment data. To use this environment, follow these steps:
1. Create a testing iCloud account in iTunes Connect.
2. Log in to that account on your device.
3. Set the desired region for testing.
4. Use the test cards listed at https://developer.apple.com/apple-pay/.
Note: When you switch iCloud accounts, the environment switches automatically. You must still test your payments using actual cards in a production environment.
Security and Privacy Enhancements
iOS 10 introduces several changes and additions that help you improve the security of your code and maintain the privacy of user data. To learn more about these items, see https://developer.apple.com/security/.
The new NSAllowsArbitraryLoadsInWebContent key for your Info.plist file gives you a convenient way to allow arbitrary web page loads to work while retaining ATS protections for the rest of your app.
The SecKey API includes improvements for asymmetric key generation. Use the SecKey API instead of the deprecated Common Data Security Architecture (CDSA) APIs.
The RC4 symmetric cipher suite is now disabled by default for all SSL/TLS connections, and SSLv3 is no longer supported in the Secure Transports API. It’s recommended that you stop using the SHA-1 and 3DES cryptographic algorithms as soon as possible.
The UIPasteboard class supports the Clipboard feature, which lets users copy and paste between devices, and includes API you can use to restrict a pasteboard to a specific device and set an expiration timestamp after which the pasteboard is cleared. Additionally, named pasteboards are no longer persistent—instead, you should use shared containers—and the “Find” pasteboard (that is, the pasteboard identified by the UIPasteboardNameFind constant) is unavailable.
You must statically declare your app’s intended use of protected data classes by including the appropriate purpose string keys in your Info.plist file. For example, you must include the NSCalendarsUsageDescription key to access the user’s Calendar data. If you don’t include the relevant purpose string keys, your app exits when it tries to access the data.
Security and privacy enhancements in brief:
- Add NSAllowsArbitraryLoadsInWebContent to allow arbitrary web page loads while keeping ATS protections for the rest of the app.
- Use the SecKey API to generate asymmetric keys (a sketch follows below).
- UIPasteboard changes: 1) copy and paste works across devices, and you can restrict a pasteboard to a specific device and set an expiration time; 2) named pasteboards are no longer persistent.
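The key-generation sketch mentioned above, using the newer SecKey API instead of the deprecated CDSA APIs. The key type and size are just one reasonable choice.

```swift
import Foundation
import Security

// Generate a 256-bit elliptic-curve key pair.
let attributes: [String: Any] = [
    kSecAttrKeyType as String: kSecAttrKeyTypeEC,
    kSecAttrKeySizeInBits as String: 256
]

var error: Unmanaged<CFError>?
if let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error) {
    let publicKey = SecKeyCopyPublicKey(privateKey)
    print("Generated key pair:", privateKey, publicKey as Any)
} else {
    print("Key generation failed:", error!.takeRetainedValue())
}
```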
CallKit
The CallKit framework (CallKit.framework) lets VoIP apps integrate with the iPhone UI and give users a great experience. Use this framework to let users view and answer incoming VoIP calls on the lock screen and manage contacts from VoIP calls in the Phone app’s Favorites and Recents views.
CallKit also introduces app extensions that enable call blocking and caller identification. You can create an app extension that can associate a phone number with a name or tell the system when a number should be blocked.
- Lets users view and answer VoIP calls from the lock screen and manage VoIP contacts in the Phone app's Favorites and Recents views.
- Lets you tell the system that a number should be blocked, or associate a number with a name (see the sketch below). (Isn't this the spam-call tagging we already know from Android? Which of Tencent or 360 will ship it first? An opportunity not to be missed!)
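The sketch mentioned above: a minimal Call Directory extension handler. The phone numbers and label are fabricated examples.

```swift
import CallKit

// Registers blocked numbers and caller-ID entries with the system.
class CallDirectoryHandler: CXCallDirectoryProvider {

    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        // Entries must be added in ascending numeric order.
        context.addBlockingEntry(withNextSequentialPhoneNumber: 8_615_500_000_000)
        context.addIdentificationEntry(withNextSequentialPhoneNumber: 8_615_600_000_000,
                                       label: "Example Delivery")   // hypothetical label
        context.completeRequest()
    }
}
```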
News Publisher Enhancements
News Publisher makes it easy to deliver beautifully designed news, magazine, and web content to Apple News using the Apple News Format. Anyone can sign up, from major magazines or news organizations to independent publishers and bloggers. To get started or to learn more about recent updates, visit https://newsresources.apple.com.
Video Subscriber Account
iOS 10 introduces the Video Subscriber Account framework (VideoSubscriberAccount.framework) to help apps that support authenticated streaming or authenticated video on demand (also known as TV Everywhere) authenticate with their cable or satellite TV provider. Using the APIs in this framework can help you support a single sign-in experience in which users sign in once to unlock access in all of the streaming video apps that their subscription supports.
App Extensions
iOS 10 introduces several new extension points for which you can create an app extension, such as:
- Call Directory
- Intents
- Intents UI
- Messages
- Notification Content
- Notification Service
- Sticker Pack
In addition, iOS 10 includes the following enhancements for third-party keyboard app extensions:
- You can automatically detect the input language of a document by using the documentInputMode property of the UITextDocumentProxy class, and change your keyboard extension to align with that language (if supported). When you detect the input language in this way, you can do per-language keyboard switching such as what is built in to Messages.
- The new handleInputModeListFromView:withEvent: method lets a keyboard extension display the system's keyboard picker menu (that is, the globe key menu).
- A keyboard extension should position the globe key in the same location as the system globe key for each orientation. Also, if you need to provide a custom key (to enable keyboard settings, for example), you should put this key in the same location as the dictation key in the system keyboard.
To learn more about creating app extensions in general, see App Extension Programming Guide.
Additional Framework Changes
In addition to the major changes described above, iOS 10 includes many other improvements.
AVFoundation
The AVFoundation framework (AVFoundation.framework) includes the following enhancements:
- The new AVCapturePhotoOutput class provides a unified pipeline for all photography workflows, enabling more sophisticated control and monitoring of the entire capture process and including support for new features such as Live Photos and RAW format capture. You should transition to AVCapturePhotoOutput instead of AVCaptureStillImageOutput, which is deprecated in iOS 10.
- The Camera Capture pipeline now enables capture in wide-gamut color formats on supported hardware. By default, an AVCaptureSession automatically configures for wide-color capture when appropriate for your capture workflow; for details, see iOS Device Compatibility Reference.
- You no longer need to implement different behaviors for AVPlayerItem, depending on whether the content is a movie file or HLS content. In apps that link on or after iOS 10, you simply set the rate property and AVFoundation determines when enough content has been buffered to play without stalling.
- The AVPlayerLooper class makes it easier to loop a particular piece of media content during playback.
- Use the AVAssetDownloadURLSession class to download an asset, including an HLS stream, to the device and then play it later. When used in conjunction with FairPlay Streaming, you can download an encrypted HLS stream and play the stream securely at a later time.

AVKit
The AVKit framework (AVKit.framework) includes the updatesNowPlayingInfoCenter property, which indicates when the Now Playing Info Center should be updated.
Core Data
The Core Data framework (CoreData.framework) includes the following enhancements:
- NSPersistentStoreCoordinator now maintains a connection pool for SQLite stores.
- Root NSManagedObjectContext objects (those without parent MOCs) transparently support concurrent fetching and faulting without serializing against each other.
- NSManagedObjectContext objects with SQLite stores in WAL journal_mode support a new feature called query generations. These allow a MOC to be pinned to a version of the database at a point in time and perform all future fetching and faulting against that version of the database. Pinned MOCs are moved to the most recent transaction with any save, and query generations do not survive the process's life time.
- The new NSPersistentContainer class provides your app with a high-level integration point that maintains references to your NSPersistentStoreCoordinator, NSManagedObjectModel, and other configuration resources.
- Core Data now has tighter integration with Xcode and automatically generates and updates your NSManagedObject subclasses.
- NSManagedObject includes several additional convenience methods, making it easier to fetch and create subclasses. NSManagedObject subclasses that have a 1:1 relationship with an entity now support entity.
- Core Data introduces several API adjustments that provide better integration with Swift, including parameterized NSFetchRequest objects.
For more information, see Core Data Framework Reference.
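A minimal sketch of standing up a stack with the new NSPersistentContainer (the model name "Model" is a placeholder):

```swift
import CoreData

// NSPersistentContainer wires up the model, coordinator, and contexts in one object.
let container = NSPersistentContainer(name: "Model")
container.loadPersistentStores { storeDescription, error in
    if let error = error {
        print("Failed to load store:", error)
        return
    }
    // viewContext is a main-queue context ready for UI work.
    let context = container.viewContext
    print("Store loaded at:", storeDescription.url as Any, "- context:", context)
}
```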
Core Image
The Core Image framework (CoreImage.framework) includes several enhancements.
RAW image file support is now available on iOS devices that use the A8 or A9 CPU. Core Image can decode RAW images produced by several third-party cameras as well as images produced by the iSight camera of supported iOS devices (to learn more, see AVFoundation). To process RAW images, use filterWithImageData:options: or filterWithImageURL:options: to create a CIFilter object, adjust RAW processing options with the keys listed in RAW Image Options, and read the processed image from the filter’s outputImage property.
You can now insert custom processing into a Core Image filter graph by using the imageWithExtent:processorDescription:argumentDigest:inputFormat:outputFormat:options:roiCallback:processor: method. This method adds a callback block that Core Image invokes in between filters when processing an image for display or output; in the block, you can access the pixel buffers or Metal textures containing the current state of the processed image and apply your own image processing algorithms.
When using a custom processor block or writing filter kernels, you can process images in a color space other than the Core Image context’s working color space. Use the imageByColorMatchingWorkingSpaceToColorSpace: and imageByColorMatchingColorSpaceToWorkingSpace: methods to convert into and out of your color space before and after processing.
Performance is significantly improved for rendering UIImage objects that are backed by Core Image images (such as those created by using the initWithCIImage: initializer) in a UIImageView object. In addition, a Core Image–backed UIImage object that’s tagged with a wide-gamut color profile renders in a UIImageView object that uses wide-gamut color (on capable iOS devices).
Core Image kernel code can now request a specific output pixel format.
Core Image introduces five new filters:
- CINinePartTiled
- CINinePartStretched
- CIHueSaturationValueGradient
- CIEdgePreserveUpsampleFilter
- CIClamp

Core Motion
The Core Motion framework (CoreMotion.framework) includes pedometer events, which enable apps to receive fast real-time notifications when users pause and resume while running. On supported devices, apps can use CMPedometer APIs to register to receive live pedometer events while running in the foreground or the background.
Foundation
The Foundation framework (Foundation.framework) contains many enhancements, such as:
- The new NSDateInterval class defines a programmatic interface for calculating the duration of a time interval and determining whether a date falls within it, as well as comparing date intervals and checking to see whether they intersect.
- The NSLocale class defines many new properties that you can use to get information about a locale and how it can be displayed.
- The new NSMeasurement class helps you convert measurements into different units, and calculate the sum or difference between two measurements.
- The new NSMeasurementFormatter class helps you create localized representations of measurements when displaying quantities of units to the user.
- The new NSUnit class and concrete NSDimension subclasses help you represent specific units of measure.
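A small sketch of the new measurement APIs in Swift (the values are arbitrary):

```swift
import Foundation

// Convert a distance and format it for display.
let distance = Measurement(value: 5.0, unit: UnitLength.kilometers)
let inMiles = distance.converted(to: UnitLength.miles)

let formatter = MeasurementFormatter()
formatter.unitOptions = .providedUnit
print(formatter.string(from: inMiles))   // e.g. "3.107 mi"
```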
GameKit
The GameKit framework (GameKit.framework) includes the following changes and enhancements:
- The Game Center app has been removed. If your game implements GameKit features, it must also implement the interface behavior necessary for the user to see these features. For example, if your game supports leaderboards, it could present a GKGameCenterViewController object or read the data directly from Game Center to implement a custom user interface.
- A new account type, implemented by the GKCloudPlayer class, supports iCloud-only game accounts.
- Game Center provides a new generalized solution for managing persistent storage of data on Game Center. A game session (GKGameSession) has a list of players who are the session's participants. Your game's implementation defines when and how a participant stores or retrieves data from the server or exchanges data between players. Game sessions can often replace existing turn-based matches, real-time matches, and persistent save games, and also enable other models of interaction between participants.

GameplayKit
The GameplayKit framework (GameplayKit.framework) includes the following changes and enhancements:
- Procedural noise generation can be used to generate rich game worlds, create sophisticated natural-looking textures, and add realism to camera movement.
- Spatial partitioning lets you partition your game world data so that the data in the game world can be searched efficiently.
- A new Monte Carlo strategist (GKMonteCarloStrategist) helps you model games where exhaustive computation of possible moves is difficult.
- The new decision tree API can enhance your game-building AI when you adopt decision-tree learning to generalize behavior based on data mining of logged player actions. To learn more, see GKDecisionTree and GKDecisionNode.
- The GKAgent3D and GKGraphNode3D classes introduce 3D support to existing agent and path-finding behavior.
- The new GKMeshGraph class provides a higher performance alternative to GKObstacleGraph, allowing you to produce more natural-looking output at the cost of less mathematically perfect paths.
- The new GKScene and GKSKNodeComponent classes, combined with changes in SpriteKit and the Xcode editor, make integrating GameplayKit with SpriteKit easier than ever.

HealthKit
The HealthKit framework (HealthKit.framework) includes the following changes and enhancements:
- The new HKCDADocument class, which represents a CDA document (that is, a document that follows the Clinical Document Architecture standard).
- The new HKWorkoutConfiguration class, which lets you specify the activityType and locationType for a workout.
- The new HKWheelchairUseObject characteristic object type and the related HKHealthStore method wheelchairUseWithError:.
- New metadata keys that indicate weather types, such as HKWeatherConditionClear and HKWeatherConditionCloudy, and workout types, such as HKWorkoutActivityTypeFlexibility and HKWorkoutActivityTypeWheelchairRunPace.

HomeKit
In iOS 10, iPad can be configured to provide remote access to accessories, run automation triggers, and enable shared user permissions. In addition, the HomeKit framework (HomeKit.framework) adds support for camera and doorbell accessories and introduces many new APIs that help you:
- View and interact with IP camera accessory profiles, display live streams and snapshots, and control a camera's settings, speaker, and microphone
- Access new services and characteristics
- For the primary service, link services and valid values to provide more context and configuration about the accessories
You can also add and set up accessories using the Apple accessory setup workflow. To learn more, see HomeKit Framework Reference.
Metal
In iOS 10, Metal includes several new features and enhancements, such as:
- Support for tessellation, enabling 3D apps and games to render more detailed scenes by efficiently describing complex geometry to the GPU.
- Function Specialization, which makes it easy to create a collection of highly optimized functions to handle all the material and light combinations in a scene.
- Resource Heaps and Memoryless Render Targets, which grant even finer-grained control of resource allocation to further optimize the performance of Metal-based apps.
To learn more, see What's New in iOS 10, tvOS 10, and OS X 10.12 in Metal Programming Guide.
ModelIO
The ModelIO framework (ModelIO.framework) includes the following enhancements:
- The USD file format is now supported.
- The new MDLMaterialPropertyGraph class makes it easier to support runtime procedural changes to models.
- The MDLVoxelArray class adds support for signed distance fields.
- You can add assisted light probe placement by implementing the MDLLightProbeIrradianceDataSource protocol.

Photos
The Photos framework (Photos.framework) makes Live Photo editing available to apps that use Photos framework APIs to access the user's Photos library and to photo editing app extensions for use in the Photos and Camera apps. Specifically, the new PHLivePhotoEditingContext class lets you apply edits to the video and still photo content of a Live Photo, with an easy-to-use API based on Core Image enhancements. In addition, you can take advantage of the new Core Image processor feature to use other image processing technologies to perform edits. To learn more, see CIImageProcessorInput and CIImageProcessorOutput.
ReplayKit
The ReplayKit framework (ReplayKit.framework) includes the following enhancements:
- ReplayKit supports broadcasting services so that a user can broadcast recorded media through a third-party site. You can implement support for this functionality by using the RPScreenRecorder, RPBroadcastActivityViewController, and RPBroadcastController classes.
- To participate in ReplayKit broadcast, third-party broadcast services need to implement a pair of app extensions. The Broadcast UI extension provides a UI that lets users sign into the service and set up a broadcast. The Broadcast Upload extension receives movie clips and transmits them to the service.

SceneKit
The SceneKit framework (SceneKit.framework) includes several enhancements.
A new Physically Based Rendering (PBR) system allows you to leverage the latest in 3D graphics research to create more realistic results with simpler asset authoring. Specifically:
- Use the new SCNLightingModelPhysicallyBased shading model to opt into PBR shading for materials. PBR materials require only three fundamental properties (diffuse, metalness, and roughness) to produce a wide range of realistic shading effects. (The normal, ambientOcclusion, and selfIllumination material properties also remain useful for PBR materials, but you can now ignore the large number of other properties used for traditional materials.)
- PBR shading works best with environment-based lighting, which causes even diffuse surfaces to pick up the colors of the scene around them. Use the lightingEnvironment property to assign global image-based lighting to an entire scene, and place light probes in the Xcode scene editor to pick up the local lighting contributions from objects within your scene.
- Authors of PBR scene content often prefer working in physically based terms, so you can now define lighting using intensity (in lumens) and color temperature (in degrees Kelvin), and import specifications for real-world light fixtures using the IESProfileURL property.
Add even more realism with the new HDR features and effects in the SCNCamera class. With HDR rendering, SceneKit captures a much wider range of brightness and contrast in a scene, then allows you to customize the tone mapping that adapts that scene for the narrower range of a device's display. Enable exposure adaptation to create automatic effects when, for example, the player in your game moves from a darkened area into sunlight. Or use vignetting, color fringing, and color grading to add a filmic look to your game.
Although linear, more color-accurate rendering is the basis for PBR shading and HDR camera features, it produces better results even for traditional rendering. By default, SceneKit now performs all color calculations in a linear (not gamma-adjusted) color space, and uses the P3 color gamut of devices that include wide-color displays. This feature is enabled automatically for all apps linking against the iOS 10 SDK, and has a few ramifications for content design and asset management:
- SceneKit color matches all colors. In previous versions, SceneKit would read only the color values from material colors specified as NSColor or UIColor objects, ignoring color profile information and assuming the sRGB color space.
- SceneKit interprets color component values specified within shader modifier or custom Metal or OpenGL shader code in linear RGB space.
- SceneKit reads and adjusts for color profile information in texture images. Design textures for a linear brightness ramp, and use Asset Catalogs in Xcode to make sure your images use the correct color profile.
- If necessary, you can disable linear space rendering with the SCNDisableLinearSpaceRendering key in your app's Info.plist file, and wide color rendering with the SCNDisableWideGamut key.
Geometry can now be loaded from scene files or programmatically defined using arbitrary polygon primitives (SCNGeometryPrimitiveTypePolygon). SceneKit automatically triangulates polygon meshes for rendering, but makes use of the underlying polygon mesh for more accurate surface subdivision (to learn more, see the subdivisionLevel property).
SpriteKit
The SpriteKit framework (SpriteKit.framework) includes the following enhancements:
- A new tilemap solution supports square, hexagonal, and isometric tilemaps that make it easy to create 2D, 2.5D, and side-scroller games. The Xcode editor provides comprehensive support for organizing your tiles and creating your tilemap. For more information, see the SKTileMapNode, SKTileGroup, SKTileGroupRule, and SKTileSet classes.
- The new SKWarpGeometry class is used to stretch or distort how a SKSpriteNode or SKEffectNode object is rendered. The warp is specified by a set of control points. New SKAction types can be used to animate between different warp effects.
- A custom shader can use attributes that can be configured separately by each node that uses the shader. To add an attribute, create an SKAttribute object and attach it to your shader. Then, for each node that uses that shader, attach an SKAttributeValue object.
- The SKView class defines new methods that give you finer control over when and how your scene is rendered.

UIKit
The UIKit framework (UIKit.framework) includes many enhancements, such as:
UIKit
The UIKit framework (UIKit.framework) includes many enhancements, such as:

- New object-based, fully interactive and interruptible animation support that lets you retain control of your animations and link them with gesture-based interactions (see the sketch after this list). To learn more, see UIViewAnimating Protocol Reference, UIViewPropertyAnimator Class Reference, UITimingCurveProvider Protocol Reference, UICubicTimingParameters Class Reference, and UISpringTimingParameters Class Reference.
- The new UIPreviewInteraction class and UIPreviewInteractionDelegate protocol, which let you provide a custom user interface related to the peek and pop experience.
- The new UIAccessibilityCustomRotor class and related classes that help you provide custom, context-specific functionality that assistive technologies such as VoiceOver can expose to users. For example, you might create a custom rotor that lets VoiceOver users find misspelled words in a document by repeatedly returning the range of text that contains the next misspelled word.
- The UIAccessibilityIsAssistiveTouchRunning and UIAccessibilityAssistiveTouchStatusDidChangeNotification symbols, which let you determine when AssistiveTouch is enabled, and the UIAccessibilityHearingDevicePairedEar and UIAccessibilityHearingDevicePairedEarDidChangeNotification symbols, which give you the pairing status of MFi hearing aids.
- New UIPasteboard API that automatically declares compatible content types for common class instances, plus new options that limit the lifetime of objects on the pasteboard.
- The new preferredFontForTextStyle:compatibleWithTraitCollection: UIFont method, which lets you add support for Dynamic Type in labels, text fields, and other text areas.
- The UIContentSizeCategoryAdjusting protocol, which provides the adjustsFontForContentSizeCategory property that you can use to determine whether the adopting element should update its font when the device’s UIContentSizeCategory changes.
- Additional control over the appearance of the badge on a tab bar item, such as background color and text attributes.
- Support for the refresh control in all scroll views and scroll-view subclasses, such as UICollectionView.
- The new UIApplication method openURL:options:completionHandler:, which is executed asynchronously and calls the specified completion handler on the main queue (this method replaces openURL:).
- The new UICloudSharingController class and UICloudSharingControllerDelegate protocol, which help you initiate a CloudKit sharing operation and display a view controller that lets users view and modify participants and start and stop sharing.
- Enhancements to UICollectionView and the new UICollectionViewDataSourcePrefetching protocol, which help you take advantage of automatic prefetching of cells to improve the scrolling experience.
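To show what the object-based animation API looks like in practice, here is a small hedged sketch with UIViewPropertyAnimator; the duration, curve, and the idea of scrubbing from a gesture are illustrative choices, not requirements.

```swift
import UIKit

// Sketch: an interruptible fade-out that can be paused, scrubbed, and resumed.
func fadeOut(_ someView: UIView) -> UIViewPropertyAnimator {
    let animator = UIViewPropertyAnimator(duration: 0.6, curve: .easeInOut) {
        someView.alpha = 0.0
    }
    animator.startAnimation()
    return animator
}

// Later, for example from a gesture recognizer, the same animator object can be
// driven interactively instead of running to completion on its own.
func scrub(_ animator: UIViewPropertyAnimator, to fraction: CGFloat) {
    animator.pauseAnimation()
    animator.fractionComplete = fraction
}

func finish(_ animator: UIViewPropertyAnimator) {
    animator.continueAnimation(withTimingParameters: nil, durationFactor: 1.0)
}
```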
WebKit
The WebKit framework (WebKit.framework) introduces enhanced peek and pop support in WKWebView objects. In iOS 10, you can use the webView:shouldPreviewElement: method to determine whether the specified web view should display the preview.
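A brief sketch of that hook, assuming a view controller that owns a WKWebView; the host-name check is purely an illustrative policy, not something the framework requires.

```swift
import UIKit
import WebKit

// Sketch: opt into (or out of) peek previews per element via WKUIDelegate.
class BrowserViewController: UIViewController, WKUIDelegate {
    let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.uiDelegate = self
        webView.frame = view.bounds
        view.addSubview(webView)
    }

    // Return true to let the web view show a preview for the tapped element.
    func webView(_ webView: WKWebView, shouldPreviewElement elementInfo: WKPreviewElementInfo) -> Bool {
        // Only preview links to an (illustrative) trusted host.
        return elementInfo.linkURL?.host == "developer.apple.com"
    }
}
```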
Deprecated APIs
iOS 10 deprecates several APIs, including:
- The CloudKit CKDiscoverAllContactsOperation, CKDiscoveredUserInfo, CKDiscoverUserInfosOperation, and CKFetchRecordChangesOperation classes. Use the CKDiscoverAllUserIdentitiesOperation, CKUserIdentity, CKDiscoverUserIdentitiesOperation, and CKFetchRecordZoneChangesOperation classes instead, all of which support record sharing.
- Several CKSubscription APIs, such as methods and properties related to zone-based subscriptions (use CKRecordZoneSubscription APIs instead) and to query-based subscriptions (use CKQuerySubscription APIs instead; a short sketch follows).
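For the query-based case, a minimal sketch of the replacement API might look like the following; the "Note" record type, the chosen option, and the use of the private database are hypothetical choices for illustration.

```swift
import CloudKit

// Sketch: a query-based subscription using the iOS 10 CKQuerySubscription class.
// "Note" is a hypothetical record type; fire whenever a matching record is created.
let subscription = CKQuerySubscription(recordType: "Note",
                                       predicate: NSPredicate(value: true),
                                       options: [.firesOnRecordCreation])

CKContainer.default().privateCloudDatabase.save(subscription) { saved, error in
    // Inspect `error` here; real code would retry or surface the failure.
}
```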
- Several NSPersistentStoreCoordinator symbols related to ubiquitous content.
- The ADBannerView and ADInterstitialAd classes and related symbols in UIViewController.
- Several SKUniform symbols related to floating-point values. Instead, use methods such as initWithName:vectorFloat2: and uniformWithName:matrixFloat2x2:, as appropriate.
- Several UIKit classes related to notifications, such as UILocalNotification, UIMutableUserNotificationAction, UIMutableUserNotificationCategory, UIUserNotificationAction, UIUserNotificationCategory, and UIUserNotificationSettings. Use APIs in the User Notifications framework instead (see User Notifications Framework Reference).
- The handleActionWithIdentifier:forLocalNotification:, handleActionWithIdentifier:forRemoteNotification:, didReceiveLocalNotification:withCompletion:, and didReceiveRemoteNotification:withCompletion: WatchKit methods. Use handleActionWithIdentifier:forNotification: and didReceiveNotification:withCompletion: instead.
- The notification-handling methods in WKExtensionDelegate, such as didReceiveRemoteNotification: and handleActionWithIdentifier:forRemoteNotification:. Instead of using these methods, create a delegate object that adopts the UNUserNotificationCenterDelegate protocol, implement the appropriate methods, and assign the object to the delegate property of the singleton UNUserNotificationCenter object (see the sketch after this list).
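As a rough sketch of that migration path (authorization, delegate assignment, and scheduling a simple local notification with the User Notifications framework); the identifier, text, and five-second trigger are arbitrary example values.

```swift
import UserNotifications

// Sketch: replacing UILocalNotification-style code with the User Notifications framework.
final class NotificationManager: NSObject, UNUserNotificationCenterDelegate {
    func start() {
        let center = UNUserNotificationCenter.current()
        center.delegate = self
        center.requestAuthorization(options: [.alert, .sound, .badge]) { granted, error in
            guard granted else { return }

            // Schedule an example local notification five seconds from now.
            let content = UNMutableNotificationContent()
            content.title = "Reminder"
            content.body = "This replaces a UILocalNotification."
            let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
            let request = UNNotificationRequest(identifier: "example.reminder",
                                                content: content,
                                                trigger: trigger)
            center.add(request, withCompletionHandler: nil)
        }
    }

    // Called when the user acts on a delivered notification.
    func userNotificationCenter(_ center: UNUserNotificationCenter,
                                didReceive response: UNNotificationResponse,
                                withCompletionHandler completionHandler: @escaping () -> Void) {
        // Inspect response.actionIdentifier here.
        completionHandler()
    }
}
```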
For a complete list of specific API deprecations, see iOS 10.0 API Diffs.