iOS 7 SDK New Features
Published: 2019-06-19


iOS 7 is a major update with compelling features for developers to incorporate into their apps. The user interface has been completely redesigned. In addition, iOS 7 introduces a new animation system for creating 2D and 2.5D games. Multitasking enhancements, peer-to-peer connectivity, and many other important features make iOS 7 the most significant release since the first iPhone SDK.

This article summarizes the key developer-related features introduced in iOS 7. This version of the operating system runs on current iOS devices. In addition to describing the key new features, this article lists the documents that describe those features in more detail.

For late-breaking news and information about known issues, see . For the complete list of new APIs added in iOS 7, see .

User Interface Changes

iOS 7 includes many new features intended to help you create great user interfaces.

UI Redesign

The iOS 7 user interface has been completely redesigned. Throughout the system, a sharpened focus on functionality and on the user’s content informs every aspect of design. Translucency, refined visual touches, and fluid, realistic motion impart clarity, depth, and vitality to the user experience. Whether you are creating a new app or updating an existing one, keep these qualities in mind as you work on the design.

Apps compiled against the iOS 7 SDK automatically receive the new appearance for any standard system views when the app is run on iOS 7. If you use Auto Layout to set the size and position of your views, those views are repositioned as needed. But there may still be additional work to do to make sure your interface has the appearance you want. Similarly, if you customize your app’s views, you may need to make changes to support the new appearance fully.

For guidance on how to design apps that take full advantage of the new look in iOS 7, see .

Dynamic Behaviors for Views

Apps can now specify dynamic behaviors for UIView objects and for other objects that conform to the UIDynamicItem protocol. (Objects that conform to this protocol are called dynamic items.) Dynamic behaviors offer a way to improve the user experience of your app by incorporating real-world behavior and characteristics, such as gravity, into your app’s animations. UIKit supports the following types of dynamic behaviors:

  • A UIAttachmentBehavior object specifies a connection between two dynamic items or between an item and a point. When one item (or point) moves, the attached item also moves. The connection is not completely static, though. An attachment behavior has damping and oscillation properties that determine how the behavior changes over time.

  • A UICollisionBehavior object lets dynamic items participate in collisions with each other and with the behavior’s specified boundaries. The behavior also lets those items respond appropriately to collisions.

  • A UIGravityBehavior object specifies a gravity vector for its dynamic items. Dynamic items accelerate in the vector’s direction until they collide with other appropriately configured items or with a boundary.

  • A UIPushBehavior object specifies a continuous or instantaneous force vector for its dynamic items.

  • A UISnapBehavior object specifies a snap point for a dynamic item. The item snaps to the point with a configured effect. For example, it can snap to the point as if it were attached to a spring.

Dynamic behaviors become active when you add them to an animator object, which is an instance of the UIDynamicAnimator class. The animator provides the context in which dynamic behaviors execute. A given dynamic item can have multiple behaviors, but all of those behaviors must be animated by the same animator object.
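
As a minimal sketch of how these pieces fit together (the boxView subview is a hypothetical item, not something from this article), the following drops a view under gravity and keeps it inside the reference view:

```objc
#import <UIKit/UIKit.h>

@interface ViewController : UIViewController
@property (nonatomic, strong) UIDynamicAnimator *animator; // keep a strong reference to the animator
@property (nonatomic, strong) UIView *boxView;             // hypothetical dynamic item
@end

@implementation ViewController
- (void)viewDidLoad {
    [super viewDidLoad];

    self.boxView = [[UIView alloc] initWithFrame:CGRectMake(100.0, 20.0, 80.0, 80.0)];
    self.boxView.backgroundColor = [UIColor blueColor];
    [self.view addSubview:self.boxView];

    // The animator provides the context in which the behaviors run.
    self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];

    // Gravity accelerates the box downward; collision keeps it inside the reference view.
    UIGravityBehavior *gravity = [[UIGravityBehavior alloc] initWithItems:@[self.boxView]];
    UICollisionBehavior *collision = [[UICollisionBehavior alloc] initWithItems:@[self.boxView]];
    collision.translatesReferenceBoundsIntoBoundary = YES;

    [self.animator addBehavior:gravity];
    [self.animator addBehavior:collision];
}
@end
```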

For information about the behaviors you can apply, see .

Text Kit

Text Kit is a full-featured set of UIKit classes for managing text and fine typography. Text Kit can lay out styled text into paragraphs, columns, and pages; it easily flows text around arbitrary regions such as graphics; and it manages multiple fonts. Text Kit is integrated with all UIKit text-based controls to enable apps to create, edit, display, and store text more easily—and with less code than was previously possible in iOS.

Text Kit comprises new classes and extensions to existing classes, including the following:

  • The NSAttributedString class has been extended to support new attributes.

  • The NSLayoutManager class generates glyphs and lays out text.

  • The NSTextContainer class defines a region where text is laid out.

  • The NSTextStorage class defines the fundamental interface for managing text-based content.
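
A minimal sketch of assembling the Text Kit stack by hand; the container size and frame are illustrative values, not requirements:

```objc
#import <UIKit/UIKit.h>

// Build the stack: storage -> layout manager -> container -> text view.
NSTextStorage *textStorage =
    [[NSTextStorage alloc] initWithString:@"Text Kit lays out styled text into arbitrary regions."];

NSLayoutManager *layoutManager = [[NSLayoutManager alloc] init];
[textStorage addLayoutManager:layoutManager];

NSTextContainer *textContainer =
    [[NSTextContainer alloc] initWithSize:CGSizeMake(300.0, CGFLOAT_MAX)];
[layoutManager addTextContainer:textContainer];

// The text view displays whatever the layout manager places into its container.
UITextView *textView = [[UITextView alloc] initWithFrame:CGRectMake(10.0, 20.0, 300.0, 400.0)
                                           textContainer:textContainer];
```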

For more information about Text Kit, see .

64-Bit Support

Apps can now be compiled for the 64-bit runtime. All system libraries and frameworks are 64-bit ready, meaning that they can be used in both 32-bit and 64-bit apps. When compiled for the 64-bit runtime, apps may run faster because of the availability of extra processor resources in 64-bit mode.

iOS uses the same LP64 model that is used by OS X and other 64-bit UNIX systems, which means fewer problems when porting code. For information about the iOS 64-bit runtime and how to write 64-bit apps, see .

Multitasking Enhancements

iOS 7 supports two new background execution modes for apps:

  • Apps that regularly update their content by contacting a server can register with the system and be launched periodically to retrieve that content in the background. To register, include the UIBackgroundModes key with the fetch value in your app’s Info.plist file. Then, when your app is launched, call the setMinimumBackgroundFetchInterval: method to determine how often it receives update messages. Finally, you must also implement the application:performFetchWithCompletionHandler: method in your app delegate.

  • Apps that use push notifications to notify the user that new content is available can fetch the content in the background. To support this mode, include the UIBackgroundModes key with the remote-notification value in your app’s Info.plist file. You must also implement the application:didReceiveRemoteNotification:fetchCompletionHandler: method in your app delegate.

Apps supporting either the fetch or remote-notification background modes may be launched or moved from the suspended to background state at appropriate times. In the case of the fetch background mode, the system uses available information to determine the best time to launch or wake apps. For example, it does so when networking conditions are good or when the device is already awake. You can also send silent push notifications—that is, notifications that do not display alerts or otherwise disturb the user.
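
A hedged sketch of the background-fetch registration described above, assuming the fetch value is already listed under UIBackgroundModes in Info.plist:

```objc
// AppDelegate.m
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Let the system decide how often to wake the app for a fetch.
    [application setMinimumBackgroundFetchInterval:UIApplicationBackgroundFetchIntervalMinimum];
    return YES;
}

- (void)application:(UIApplication *)application
    performFetchWithCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler {
    // Fetch new content here, then report the result so the system can tune future fetches.
    BOOL foundNewContent = NO; // placeholder for the app's own check
    completionHandler(foundNewContent ? UIBackgroundFetchResultNewData
                                      : UIBackgroundFetchResultNoData);
}
```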

For small content updates, the existing URL loading classes remain appropriate. To upload or download larger pieces of content in the background, use the new NSURLSession class. This class improves on the existing NSURLConnection class by providing a simple, task-based interface for initiating and processing requests. A single NSURLSession object can initiate multiple download and upload tasks, and use its delegate to handle any authentication requests coming from the server.
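
As an illustrative sketch (the session identifier, URL, and the assumption that self adopts NSURLSessionDownloadDelegate are not values from this article), a background download might be started like this:

```objc
// The delegate receives the file when the download finishes, even if the app
// was relaunched in the background to handle it.
NSURLSessionConfiguration *config =
    [NSURLSessionConfiguration backgroundSessionConfiguration:@"com.example.app.background"];
NSURLSession *session = [NSURLSession sessionWithConfiguration:config
                                                      delegate:self
                                                 delegateQueue:nil];
NSURLSessionDownloadTask *task =
    [session downloadTaskWithURL:[NSURL URLWithString:@"https://example.com/feed.json"]];
[task resume];
```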

For more information about the new background modes, see  in .

Games

iOS 7 includes enhanced support for games.

Sprite Kit Framework

The Sprite Kit framework (SpriteKit.framework) provides a hardware-accelerated animation system optimized for creating 2D and 2.5D games. Sprite Kit provides the infrastructure that most games need, including a graphics rendering and animation system, sound playback support, and a physics simulation engine. Using Sprite Kit frees you from creating these things yourself, and it lets you focus on the design of your content and the high-level interactions for that content.

Content in a Sprite Kit app is organized into scenes. A scene can include textured objects, video, path-based shapes, Core Image filters, and other special effects. Sprite Kit takes those objects and determines the most efficient way to render them onscreen. When it is time to animate the content in your scenes, you can use Sprite Kit to specify explicit actions you want performed, or you can use the physics simulation engine to define physical behaviors (such as gravity, attraction, or repulsion) for your objects.
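
For illustration only, a scene subclass might combine an explicit action with the physics simulation as follows; the "Spaceship" texture name is a placeholder for an image in the app bundle:

```objc
#import <SpriteKit/SpriteKit.h>

// Inside an SKScene subclass.
- (void)didMoveToView:(SKView *)view {
    // Keep physics bodies inside the scene.
    self.physicsBody = [SKPhysicsBody bodyWithEdgeLoopFromRect:self.frame];

    SKSpriteNode *ship = [SKSpriteNode spriteNodeWithImageNamed:@"Spaceship"];
    ship.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
    ship.physicsBody = [SKPhysicsBody bodyWithRectangleOfSize:ship.size]; // falls under scene gravity
    [self addChild:ship];

    // An explicit action, run alongside the physics simulation.
    SKAction *spin = [SKAction repeatActionForever:[SKAction rotateByAngle:M_PI duration:1.0]];
    [ship runAction:spin];
}
```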

In addition to the Sprite Kit framework, there are Xcode tools for creating particle emitter effects and texture atlases. You can use the Xcode tools to manage app assets and update Sprite Kit scenes quickly.

For more information about how to use Sprite Kit, see . To see an example of how to use Sprite Kit to build a working app, see .

Game Controller Framework

The Game Controller framework (GameController.framework) lets you discover and configure Made-for-iPhone/iPod/iPad (MFi) game controller hardware in your app. Game controllers can be devices connected physically to an iOS device or connected wirelessly over Bluetooth. The Game Controller framework notifies your app when controllers become available and lets you specify which controller inputs are relevant to your app.
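
A minimal, hedged example of observing controller connections and reading gamepad input:

```objc
#import <GameController/GameController.h>

// Watch for controllers as they connect; controllers already attached at launch
// appear in [GCController controllers].
[[NSNotificationCenter defaultCenter] addObserverForName:GCControllerDidConnectNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    GCController *controller = note.object;
    controller.gamepad.valueChangedHandler = ^(GCGamepad *gamepad, GCControllerElement *element) {
        if (element == gamepad.buttonA && gamepad.buttonA.isPressed) {
            // React to the input that matters to the game.
        }
    };
}];
```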

For more information about supporting game controllers, see .

Game Center Improvements

Game Center includes the following improvements:

  • Turn-based matches now support a new feature known as exchanges. Exchanges let players initiate actions with other players, even when it is not their turn. You can use this feature to implement simultaneous turns, player chats, and trading between players.

  • The limit on per-app leaderboards has been raised from 25 to 100. You can also organize your leaderboards using a GKLeaderboardSet object, which increases the limit to 500.

  • You can add conditions to challenges that define when the challenge has been met. For example, a challenge to beat a time in a driving game might stipulate that other players must use the same vehicle.

  • The framework has improved its authentication support and added other features to prevent cheating.

For more information about how to use the new Game Center features, see . For information about the classes of the Game Kit framework, see .

Maps

The Map Kit framework (MapKit.framework) includes numerous improvements and features for apps that use map-based information. Apps that use maps to display location-based information can now take full advantage of the 3D map support found in the Maps app, including controlling the viewing perspective programmatically. Map Kit also enhances maps in your app in the following ways:

  • Overlays can be placed at different levels in the map content so that they appear above or below other relevant data.

  • You can apply an MKMapCamera object to a map to add position, tilt, and heading information to its appearance. The information you specify using the camera object imparts a 3D perspective on the map; a short sketch follows this list.

  • The MKDirections class lets you ask for direction-related route information from Apple. You can use that route information to create overlays for display on your own maps.

  • The MKGeodesicPolyline class lets you create a line-based overlay that follows the curvature of the earth.

  • Apps can use the MKMapSnapshotter class to capture map-based images.

  • The visual representation of overlays is now based on the MKOverlayRenderer class, which replaces overlay views and offers a simpler rendering approach.

  • Apps can now supplement or replace a map’s existing tiles using the MKTileOverlay and MKTileOverlayRenderer classes.
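
As a brief sketch of the camera support mentioned above (the coordinates and the mapView property are placeholders), a 3D perspective can be applied like this:

```objc
#import <MapKit/MapKit.h>

// Tilt the map toward a point of interest; both coordinates are placeholders.
CLLocationCoordinate2D target = CLLocationCoordinate2DMake(37.3318, -122.0312);
CLLocationCoordinate2D eye    = CLLocationCoordinate2DMake(37.3290, -122.0312);

MKMapCamera *camera = [MKMapCamera cameraLookingAtCenterCoordinate:target
                                                 fromEyeCoordinate:eye
                                                       eyeAltitude:350.0];
[self.mapView setCamera:camera animated:YES];
```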

For more information about the classes of the Map Kit framework, see .

AirDrop

AirDrop lets users share photos, documents, URLs, and other kinds of data with nearby devices. AirDrop support is now built in to the existing UIActivityViewController class. This class displays different options for sharing the content that you specify. If you are not yet using this class, you should consider adding it to your interface.
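
A minimal example of presenting the activity view controller from a view controller; the URL is a placeholder:

```objc
// Share a URL; AirDrop appears automatically among the activities on iOS 7.
NSURL *link = [NSURL URLWithString:@"https://www.example.com/article"];
UIActivityViewController *activityController =
    [[UIActivityViewController alloc] initWithActivityItems:@[link]
                                      applicationActivities:nil];
[self presentViewController:activityController animated:YES completion:nil];
```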

To receive files sent via AirDrop, do the following:

  • In Xcode, declare support for the document types your app supports. (Xcode adds the appropriate keys to your app’s Info.plist file.) The system uses this information to determine whether your app can open a given file.

  • Implement the application:openURL:sourceApplication:annotation: method in your app delegate. (The system calls this method when a new file is received.)

Files sent to your app are placed in the Documents/Inbox directory of your app’s home directory. If you plan to modify the file, you must move it out of this directory before doing so. (The system allows your app to read and delete files in this directory only.) Files stored in this directory are encrypted using data protection, so you must be prepared for the file to be inaccessible if the device is currently locked.

For more information about using an activity view controller to share data, see .

Inter-App Audio

The Audio Unit framework (AudioUnit.framework) adds support for Inter-App Audio, which lets apps send MIDI commands and stream audio to one another on the same device. For example, you might use this feature to record music from an app acting as an instrument, or to send audio to another app for processing. To vend your app’s audio data, publish a remote I/O audio unit (AURemoteIO) that is visible to other processes. To use audio features from another app, use the audio component discovery interfaces in iOS 7.

For information about the new interfaces, see the framework header files. For general information about the interfaces of this framework, see .

Peer-to-Peer Connectivity

The Multipeer Connectivity framework (MultipeerConnectivity.framework) supports the discovery of nearby devices and the direct communication with those devices without requiring Internet connectivity. This framework makes it possible to create multipeer sessions easily and to support reliable in-order data transmission and real-time data transmission. With this framework, your app can communicate with nearby devices and seamlessly exchange data.

The framework provides programmatic and UI-based options for discovering and managing network services. Apps can integrate the MCBrowserViewController class into their user interface to display a list of peer devices for the user to choose from. Alternatively, you can use the MCNearbyServiceBrowser class to look for and manage peer devices programmatically.
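
A hedged sketch of both approaches, assuming a hypothetical "example-chat" service type and a view controller that adopts the relevant delegate protocols:

```objc
#import <MultipeerConnectivity/MultipeerConnectivity.h>

// Both sides must advertise and browse for the same service type.
MCPeerID *peerID = [[MCPeerID alloc] initWithDisplayName:[[UIDevice currentDevice] name]];
MCSession *session = [[MCSession alloc] initWithPeer:peerID];
session.delegate = self;

// Advertise so nearby peers can find this device...
MCAdvertiserAssistant *assistant =
    [[MCAdvertiserAssistant alloc] initWithServiceType:@"example-chat"
                                         discoveryInfo:nil
                                               session:session];
[assistant start];

// ...and present the built-in browser UI so the user can pick peers to invite.
MCBrowserViewController *browser =
    [[MCBrowserViewController alloc] initWithServiceType:@"example-chat" session:session];
browser.delegate = self;
[self presentViewController:browser animated:YES completion:nil];
```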

For more information about the interfaces of this framework, see .

New Frameworks

iOS 7 includes the following new frameworks:

  • The Game Controller framework (GameController.framework) provides an interface for communicating with game-related hardware; see .

  • The Sprite Kit framework (SpriteKit.framework) provides support for sprite-based animations and graphics rendering; see 

  • The Multipeer Connectivity framework (MultipeerConnectivity.framework) provides peer-to-peer networking for apps; see 

  • The JavaScript Core framework (JavaScriptCore.framework) provides Objective-C wrapper classes for many standard JavaScript objects. Use this framework to evaluate JavaScript code and parse JSON data. For information about the classes of this framework, see the framework header files.

  • The Media Accessibility framework (MediaAccessibility.framework) manages the presentation of closed-captioned content in your media files. This framework works in conjunction with new settings that let the user enable the display of closed captions.

  • The Safari Services framework (SafariServices.framework) provides support for programmatically adding URLs to the user’s Safari reading list. For information about the class provided by this framework, see the framework header files.
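
As a small illustration of the JavaScript Core framework listed above, the following evaluates a script and bridges the result back into Objective-C:

```objc
#import <JavaScriptCore/JavaScriptCore.h>

JSContext *context = [[JSContext alloc] init];

// Evaluate JavaScript and convert the result to a native type.
JSValue *result =
    [context evaluateScript:@"var total = 0; for (var i = 1; i <= 10; i++) { total += i; } total;"];
NSLog(@"Sum is %d", [result toInt32]); // 55
```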

Enhancements to Existing Frameworks

In addition to its new features, iOS 7 also includes significant enhancements, organized here by framework. For a complete list of new interfaces, see .

UIKit Framework

The UIKit framework (UIKit.framework) includes the following enhancements:

  • All UI elements have been updated to present the new look associated with iOS 7.

  • UIKit Dynamics lets you mimic real-world effects such as gravity in your animations; see 

  • Text Kit provides sophisticated text editing and display capabilities; see 

  • The UIView class defines the following additions:

    The tintColor property applies a tint color to both the view and its subviews. For information on how to apply tint colors, see .

    You can create keyframe-based animations using views. You can also make changes to your views and specifically prevent any animations from being performed.

  • The UIViewController class defines the following additions:

    View controller transitions can be customized, driven interactively, or replaced altogether with ones you designate.

    View controllers can now specify their preferred status bar style and visibility. The system uses the provided information to manage the status bar style as new view controllers appear. You can also control how this behavior is applied using the UIViewControllerBasedStatusBarAppearance key in your app’s Info.plist file.

  • The UIMotionEffect class defines the basic behavior for motion effects, which are objects that define how a view responds to device-based motion.

  • Collection views add support for intermediate layout transitions and invalidation contexts (invalidation contexts help you improve the performance of your custom layout code). You can also apply UIKit Dynamics to collection view layout attributes to animate the items in the collection.

  • The imageNamed: method of UIImage supports retrieving images stored in asset catalogs, which are a way to manage and optimize assets that have multiple sizes and resolutions. You create asset catalogs in Xcode.

  • There are methods on UIView and UIScreen for creating a snapshot of their contents. Generating snapshots using these new interfaces is significantly faster than rendering the view or screen contents yourself.

  • Gesture recognizers can specify dependencies dynamically to ensure that one gesture recognizer fails before another is considered.

  • The UIKeyCommand class wraps keyboard events received from an external hardware keyboard. These events are delivered to the app’s responder chain for processing.

  • A UIFontDescriptor object describes a font using a dictionary of attributes. Use font descriptors to interoperate with other platforms.

  • The UIFont and UIFontDescriptor classes support dynamic text sizing, which improves legibility for text in apps. With this feature, the user controls the desired font size that all apps in the system should use. A short sketch follows this list.

  • The UIActivity class now supports new activity types, including activities for sending items via AirDrop, adding items to a Safari reading list, and posting content to Flickr, Tencent Weibo, and Vimeo.

  • The UIApplicationDelegate protocol adds methods for handling background fetch behaviors.

  • The UIScreenEdgePanGestureRecognizer class is a new gesture recognizer that tracks pan gestures that originate near an edge of the screen.

  • UIKit adds support for running in a guided-access mode, which allows an app to lock itself to prevent modification by the user. This mode is intended for institutions such as schools, where users bring their own devices but need to run apps provided by the institution.

  • State restoration now allows the saving and restoration of any object. Objects adopting the UIStateRestoring protocol can write out state information when the app moves to the background and have that state restored during subsequent launches.

  • Table views now support estimating the height of rows and other elements, which improves scrolling performance.

  • You can now more easily configure a  object to work with a  object.
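
As a short sketch of the dynamic text sizing mentioned above (bodyLabel is a hypothetical label owned by the view controller), an app can adopt the user's preferred size and react when it changes:

```objc
// In a view controller that owns a UILabel named bodyLabel.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.bodyLabel.font = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(preferredContentSizeChanged:)
                                                 name:UIContentSizeCategoryDidChangeNotification
                                               object:nil];
}

- (void)preferredContentSizeChanged:(NSNotification *)note {
    // Re-read the preferred font whenever the user changes the text size setting.
    self.bodyLabel.font = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
}
```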

For information about the classes of this framework, see .

Store Kit Framework

The Store Kit framework (StoreKit.framework) has migrated to a new receipt system that developers can use to verify in-app purchases on the device itself. You can also use it to verify the app purchase receipt on the server.
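
A minimal sketch of locating the receipt on the device; whether you validate it locally or hand it to your own server is up to you:

```objc
// The receipt issued by the new system lives inside the app bundle.
NSURL *receiptURL = [[NSBundle mainBundle] appStoreReceiptURL];
NSData *receiptData = [NSData dataWithContentsOfURL:receiptURL];
if (receiptData == nil) {
    // No receipt yet; request one with SKReceiptRefreshRequest before verifying purchases.
} else {
    // Verify locally on the device, or Base64-encode and send to your own server.
    NSString *payload = [receiptData base64EncodedStringWithOptions:0];
    NSLog(@"Receipt payload: %@", payload);
}
```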

For more information about how to use this new receipt system, see .

Pass Kit Framework

The Pass Kit framework (PassKit.framework) includes new APIs for adding multiple passes in a single operation.

These new features were added to the pass file format:

  • New keys specify the expiration date for a pass.

  • You can specify that a pass is relevant only when it is in the vicinity of specific iBeacons.

  • New attributes control how a pass is displayed. You can group passes together, display links with custom text on the back of a pass, and control how time values are displayed on the pass.

  • You can now associate extra data with a pass. This data is available to your app but is not displayed to the user.

  • You can designate which data detectors to apply to the fields of your passes.

For information about how to use Pass Kit in your app, see . For information about the pass file format, see .

OpenGL ES

iOS 7 adds support for OpenGL ES 3.0 and adds new features to OpenGL ES 2.0.

  • OpenGL ES 3.0 includes as core functionality the features of many extensions supported in OpenGL ES 2.0 on iOS. But OpenGL ES 3.0 also adds new features to the OpenGL ES shading language and new core functionality that has never been available on mobile processors before, including multiple render targets and transform feedback. You can use OpenGL ES 3 to more easily implement advanced rendering techniques, such as deferred rendering.

    To create an OpenGL ES 3 context on devices that support it, pass the kEAGLRenderingAPIOpenGLES3 constant to the initWithAPI: method of the EAGLContext class. A short sketch follows this list.

  • OpenGL ES 2 adds the following new extensions:

    • The  extension adds support for sRGB framebuffer operations.

    • The  extension adds support for sRGB texture data compressed in the PVRTC texture compression format. (This extension is also supported in OpenGL ES 3.0).

    • The  and  extensions can improve rendering performance when your app draws multiple instances of the same object. You use a single call to draw instances of the same object. You add variation to each instance by specifying how fast each vertex attribute advances or by referencing an ID for each instance in your shader.

  • Textures can be accessed in vertex shaders in both OpenGL ES 2.0 and 3.0. Query the value of the MAX_VERTEX_TEXTURE_IMAGE_UNITS attribute to determine the exact number of textures you can access. In earlier versions of iOS, this attribute always had a value of 0.
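
A short sketch of the context-creation step mentioned in the list above, falling back to OpenGL ES 2.0 on devices that do not support 3.0:

```objc
#import <OpenGLES/EAGL.h>

// Ask for an OpenGL ES 3.0 context; initWithAPI: returns nil if the device lacks support.
EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES3];
if (context == nil) {
    context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
}
[EAGLContext setCurrentContext:context];
```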

For more information, see  and .

Message UI Framework

In the Message UI framework, the MFMessageComposeViewController class adds support for attaching files to messages.

For information about the new interfaces, see the framework header files. For information about the classes of this framework, see .

Media Player Framework

In the Media Player framework, the MPVolumeView class provides support for determining whether wireless routes such as AirPlay and Bluetooth are available for the user to select. You can also determine whether one of these wireless routes is currently active. For information about the new interfaces, see the framework header files.

For information about the classes of Media Player framework, see .

Map Kit Framework

The Map Kit framework (MapKit.framework) includes changes that are described in 

For information about the classes of this framework, see .

Image I/O Framework

The Image I/O framework (ImageIO.framework) now has interfaces for getting and setting image metadata.

For information about the new interfaces, see the framework header files. For information about the classes of this framework, see .

iAd Framework

The iAd framework (iAd.framework) includes two extensions to other frameworks that make it easier to incorporate ads into your app’s content:

  • The framework introduces new methods on the MPMoviePlayerController class that let you run ads before a movie.

  • The framework extends the UIViewController class to make it easier to create ad-supported content. You can now configure your view controllers to display ads before displaying the actual content they manage.

For information about the new interfaces, see the framework header files. For information about the classes of this framework, see .

Game Kit Framework

The Game Kit framework (GameKit.framework) includes numerous changes, which are described in 

For information about the classes of this framework, see .

Foundation Framework

The Foundation framework (Foundation.framework) includes the following enhancements:

  • The NSData class adds support for Base64 encoding; a short sketch follows this list.

  • The NSURLSession class is a new class for managing the acquisition of network-based resources. (You can use it to download content even when your app is suspended or not running.) This class serves as a replacement for the NSURLConnection class and its delegate; it also replaces the NSURLDownload class and its delegate.

  • The NSURLComponents class is a new class for parsing the components of a URL. This class supports the URI standard (RFC 3986 / STD 66) for parsing URLs.

  • The  and  classes support peer-to-peer discovery over Bluetooth and Wi-Fi.

  • The  and  classes let you create credentials with a synchronizable policy, and they provide the option of removing credentials with a synchronizable policy from iCloud.

  • The , and  classes now support the asynchronous processing of storage requests.

  • The NSCalendar class supports new calendar types.

  • The NSProgress class provides a general-purpose way to monitor the progress of an operation and report that progress to other parts of your app that want to use it.
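
A brief sketch of two of the additions above, Base64 encoding on NSData and URL editing with NSURLComponents; the strings and URLs are placeholders:

```objc
// Base64 round trip with the new NSData additions.
NSData *payload = [@"hello ios7" dataUsingEncoding:NSUTF8StringEncoding];
NSString *encoded = [payload base64EncodedStringWithOptions:0];
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:encoded options:0];

// Edit a single component of a URL without manual string manipulation.
NSURLComponents *components =
    [NSURLComponents componentsWithString:@"https://example.com/search?q=old"];
components.query = @"q=text+kit";
NSURL *rebuiltURL = components.URL;
NSLog(@"%@ (%lu decoded bytes)", rebuiltURL, (unsigned long)decoded.length);
```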

For information about the new interfaces, see the framework header files and Foundation release notes. For general information about the classes of this framework, see .

Core Telephony Framework

The Core Telephony framework (CoreTelephony.framework) lets you get information about the type of radio technology in use by the device. Apps developed in conjunction with a carrier can also authenticate against a particular subscriber for that carrier.

For information about the new interfaces, see the framework header files. For general information about the classes of the Core Telephony framework, see.

Core Motion Framework

The Core Motion framework (CoreMotion.framework) adds support for step counting and motion tracking. With step counting, the framework detects movements that correspond to user motion and uses that information to report the number of steps to your app. Because the system detects the motion, it can continue to gather step data even when your app is not running. Alongside this feature, the framework can also distinguish different types of motion, including different motions reflective of travel by walking, by running, or by automobile. Navigation apps might use that data to change the type of directions they give to users.
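
One way to query this data in iOS 7 is the CMStepCounter class; the following is an illustrative sketch, with the availability check and time window chosen only for the example:

```objc
#import <CoreMotion/CoreMotion.h>

if ([CMStepCounter isStepCountingAvailable]) {
    CMStepCounter *stepCounter = [[CMStepCounter alloc] init];
    NSDate *windowStart = [NSDate dateWithTimeIntervalSinceNow:-6 * 60 * 60]; // hypothetical window
    [stepCounter queryStepCountStartingFrom:windowStart
                                         to:[NSDate date]
                                    toQueue:[NSOperationQueue mainQueue]
                                withHandler:^(NSInteger numberOfSteps, NSError *error) {
        if (error == nil) {
            NSLog(@"Steps in the last six hours: %ld", (long)numberOfSteps);
        }
    }];
}
```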

For information about the classes of this framework, see .

Core Location Framework

The Core Location framework (CoreLocation.framework) supports region monitoring and ranging using Bluetooth devices. Region monitoring lets you determine whether the iOS device enters a specific area, and ranging lets you determine the relative range of nearby Bluetooth devices. For example, an art museum might use region monitoring to determine whether a person is inside a particular gallery, and then place iBeacons near each painting. When the person is standing by a painting, the app would display information about it.
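
A minimal sketch of the museum scenario; the UUID, region identifier, and locationManager property are placeholders, and the class is assumed to adopt CLLocationManagerDelegate:

```objc
#import <CoreLocation/CoreLocation.h>

- (void)startGalleryMonitoring {
    // A hypothetical beacon UUID shared by all of the gallery's beacons.
    NSUUID *galleryUUID =
        [[NSUUID alloc] initWithUUIDString:@"B9407F30-F5F8-466E-AFF9-25556B57FE6D"];
    CLBeaconRegion *galleryRegion =
        [[CLBeaconRegion alloc] initWithProximityUUID:galleryUUID
                                            identifier:@"com.example.gallery"];

    [self.locationManager startMonitoringForRegion:galleryRegion];    // entry/exit events
    [self.locationManager startRangingBeaconsInRegion:galleryRegion]; // relative distance
}

// CLLocationManagerDelegate
- (void)locationManager:(CLLocationManager *)manager
        didRangeBeacons:(NSArray *)beacons
               inRegion:(CLBeaconRegion *)region {
    CLBeacon *nearest = beacons.firstObject; // beacons arrive sorted by proximity
    // Show information for the painting associated with nearest.major/nearest.minor.
}
```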

The framework also supports deferring the delivery of location updates until a specific time has elapsed or the user has moved a minimum distance.

For general information about the classes of this framework, see .

Core Foundation Framework

The Core Foundation framework (CoreFoundation.framework) now lets you schedule stream objects on dispatch queues.

For information about the new interfaces, see the framework header files. For general information about the interfaces of this framework, see .

Core Bluetooth Framework

The Core Bluetooth framework (CoreBluetooth.framework) includes the following enhancements:

  • The framework supports saving state information for central and peripheral objects and restoring that state at app launch time. You can use this feature to support long-term actions involving Bluetooth devices.

  • The central and peripheral classes now use an NSUUID object to store unique identifiers.

  • You can now retrieve peripheral objects from a central manager synchronously.

For information about the classes of this framework, see  and .

AV Foundation Framework

The AV Foundation framework (AVFoundation.framework) includes the following enhancements:

  • The AVAudioSession class supports the following new behaviors:

    • Selecting the preferred audio input, including audio from built-in microphones

    • Multichannel input and output

  • The  protocol and related classes let you support custom video compositors.

  • The AVSpeechSynthesizer class and related classes provide speech synthesis capabilities; a short sketch follows this list.

  • The capture classes add support and interfaces for the following features:

    • Discovery of a camera’s supported formats and frame rates

    • High fps recording

    • Still image stabilization

    • Video zoom (true and digital) in recordings and video preview, including custom ramping

    • Real-time discovery of machine-readable metadata (barcodes)

    • Autofocus range restriction

    • Smooth autofocus for capture

    • Sharing your app’s audio session during capture

    • Access to the clocks used during capture

    • Access to capture device authorization status (user must now grant access to the microphone and camera)

    • Recommended settings for use with data outputs and asset writer

  • There are new metadata key spaces for supported ISO formats such as MPEG-4 and 3GPP, and improved support for filtering metadata items when copying those items from source assets to output files using the  class.

  • The  class provides assistance in formulating output settings, and there are new level constants for H.264 encoding.

  • The  class adds the videoRect property, which you can use to get the size and position of the video image.

  • The AVPlayerItem class supports the following changes:

    • Asset properties can be loaded automatically when AVPlayerItem objects are prepared for playback.

    • When you link your app against the iOS 7 SDK, the behavior when getting the values of player item properties is different from the behavior in previous versions of iOS. The properties of this class now return a default value and no longer block your app if the AVPlayerItem object is not yet ready to play. As soon as the player item’s status changes to AVPlayerItemStatusReadyToPlay, the getters reflect the actual values of the underlying media resource. If you use key-value observing to monitor changes to the properties, your observers are notified as soon as changes are available.

  • The  class can process timed text from media files.

  • The  protocol now supports loading of arbitrary ranges of bytes from a media resource.
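
As a small illustration of the speech synthesis support noted above, an utterance can be spoken like this:

```objc
#import <AVFoundation/AVFoundation.h>

AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
AVSpeechUtterance *utterance =
    [[AVSpeechUtterance alloc] initWithString:@"Speech synthesis is new in iOS 7."];
utterance.rate = AVSpeechUtteranceDefaultSpeechRate;
utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-US"];
[synthesizer speakUtterance:utterance];
```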

For information about the new interfaces, see the framework header files. For general information about the classes of this framework, see .

Accelerate Framework

The Accelerate framework (Accelerate.framework) includes the following enhancements:

  • Improved support for manipulating Core Graphics data types

  • Support for working with grayscale images of 1, 2, or 4 bits per pixel

  • New routines for converting images between different formats and transforming image contents

  • Support for biquad (IIR) operations

For information about the new interfaces, see the framework header files. For general information about the functions and types of this framework, see.

Objective-C

The Objective-C programming language has been enhanced to support modules, which yield faster builds and shorter project indexing times. Module support is enabled in all new projects created using Xcode 5. If you have existing projects, you must enable this support explicitly by modifying your project’s Enable Modules setting.
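
A short example of the module syntax, assuming Enable Modules is turned on for the target:

```objc
// @import replaces the preprocessor-based #import and adds the framework's
// linker flags automatically.
@import UIKit;
@import MapKit;

// The old form still works; with modules enabled it is mapped to the module.
#import <UIKit/UIKit.h>
```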

Deprecated APIs

From time to time, Apple adds deprecation macros to APIs to indicate that those APIs should no longer be used in active development. When a deprecation occurs, it is not an immediate end-of-life to the specified API. Instead, it is the beginning of a grace period for transitioning off that API and onto newer and more modern replacements. Deprecated APIs typically remain present and usable in the system for a reasonable amount of time past the release in which they were deprecated. However, active development on them ceases and the APIs receive only minor changes—to accommodate security patches or to fix other critical bugs. Deprecated APIs may be removed entirely from a future version of the operating system.

As a developer, it is important that you avoid using deprecated APIs in your code as soon as possible. At a minimum, new code you write should never use deprecated APIs. And if you have existing code that uses deprecated APIs, update that code as soon as possible. Fortunately, the compiler generates warnings whenever it spots the use of a deprecated API in your code, and you can use those warnings to track down and remove all references to those APIs.

This release includes deprecations in the following technology areas:

  • The Map Kit framework includes deprecations for the MKOverlayView class and its various subclasses. The existing overlay views have been replaced with an updated set of overlay renderer objects that descend from the MKOverlayRenderer class. For more information about the classes of this framework, see .

  • The Audio Session API in the Audio Toolbox framework is deprecated. Apps should use the AVAudioSession class in the AV Foundation framework instead.

  • The CLRegion class in the Core Location framework is replaced by the CLCircularRegion class. The CLRegion class continues to exist as an abstract base class that supports both geographic and beacon regions.

  • The UUID property of the CBCentral class is deprecated. To specify the unique ID of your central objects, use the identifier property instead.

  • The Game Kit framework contains assorted deprecations intended to clean up the existing API and provide better support for new features.

  • The UIKit framework contains the following deprecations:

    • The wantsFullScreenLayout property of UIViewController is deprecated. In iOS 7 and later, view controllers always support full screen layout.

    •  objects that provided background textures for earlier versions of iOS are gone.

    • Many drawing additions to the NSString class are deprecated in favor of newer variants.

  • The gethostuuid function in the libsyscall library is deprecated.

  • In iOS 7 and later, if you ask for the MAC address of an iOS device, the system returns the value 02:00:00:00:00:00. If you need to identify the device, use the identifierForVendor property of UIDevice instead. (Apps that need an identifier for their own advertising purposes should consider using the advertisingIdentifier property of ASIdentifierManager instead.)
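
A brief sketch of the replacement identifiers mentioned above:

```objc
#import <UIKit/UIKit.h>
#import <AdSupport/AdSupport.h>

// Per-vendor device identifier for the app's own bookkeeping.
NSString *vendorID = [[[UIDevice currentDevice] identifierForVendor] UUIDString];
NSLog(@"Vendor ID: %@", vendorID);

// Advertising identifier, for advertising purposes only; honor the user's
// limit-ad-tracking setting before using it.
ASIdentifierManager *adManager = [ASIdentifierManager sharedManager];
if (adManager.isAdvertisingTrackingEnabled) {
    NSString *advertisingID = [[adManager advertisingIdentifier] UUIDString];
    NSLog(@"Advertising ID: %@", advertisingID);
}
```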

For a complete list of specific API deprecations, see .
