
Alasdair Allan on Making use of iPhone and iPad Location Sensors

      • Core Location Framework

        A walkthrough of the Core Location framework on iOS. Core Location is an abstraction layer in front of several different methods of finding the user's location. It can provide the latitude, longitude and altitude of the device, along with the level of accuracy to which these are known, and is the fundamental building block for geo-location in an iPhone application. We also discuss background location monitoring, significant location change events and region change monitoring (all new in iOS 4).
      • 00:27:18

      • MapKit Framework

        While Core Location is one of the great things about the iOS platform, until the arrival of MapKit it was actually quite hard to take that location-aware goodness and display it on a map. This part of the course covers how to embed and annotate maps inside your own applications.
      • 00:07:54

      • The WhereAmI.app Application

        This section shows you how to put together the WhereAmI.app application: a simple MapKit and Core Location view-based application that finds and updates the user's location on a map, whilst simultaneously displaying the latitude, longitude and current heading at the user's location in decimal degrees. Examples: LiveDemo_WhereAmI.zip
      • 00:27:46
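A decimal-degrees readout of the kind WhereAmI.app displays can be produced with one formatting call. The helper name, the six-decimal-place precision and the N/S/E/W convention here are our own choices, not taken from the course code:

```c
#include <stdio.h>
#include <math.h>

/* Format a signed lat/lon pair as decimal degrees with hemisphere
 * letters, e.g. "37.331689° N, 122.030731° W". */
void format_decimal_degrees(double lat, double lon, char *buf, size_t len)
{
    snprintf(buf, len, "%.6f° %c, %.6f° %c",
             fabs(lat), lat >= 0 ? 'N' : 'S',
             fabs(lon), lon >= 0 ? 'E' : 'W');
}
```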

      • The Accelerometer  (Free)

        This part of the class walks you through how to make use of the accelerometer, and discusses what the raw readings imply about the orientation of the device. You can also follow along as we build a simple application to measure and display the raw accelerometer readings. Examples: LiveDemo_Accelerometer.zip
      • 00:16:33

      • The Magnetometer

        This part of the course walks you through how to use the magnetometer within Core Location as a digital compass to determine the heading (yaw) of the iPhone device. Examples: LiveDemo_Compass.zip
      • 00:33:18

      • Core Motion

        This part of the course covers the Core Motion framework, new in iOS 4, and discusses how to use the accelerometer, gyroscope and magnetometer to determine the true orientation of the iPhone in real time. We also discuss the advantages of the new Core Motion framework over using the raw accelerometer and gyroscope values directly for measurements of local gravity and user acceleration.
      • 00:17:20
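Before Core Motion, splitting gravity from user acceleration meant low-pass filtering the raw accelerometer yourself, per axis. This toy filter shows that older approach; the smoothing factor is an arbitrary illustrative choice, and CMDeviceMotion instead hands you gravity and user acceleration directly, fused with the gyroscope:

```c
#define ALPHA 0.1  /* lower = smoother gravity estimate, slower response */

/* Feed one raw accelerometer sample (in g) for a single axis; updates
 * *gravity in place and returns the estimated user acceleration. */
double filter_user_accel(double raw, double *gravity)
{
    *gravity = ALPHA * raw + (1.0 - ALPHA) * (*gravity);
    return raw - *gravity;
}
```

The filter's weakness is visible in the code: during any sustained acceleration it slowly misclassifies user motion as gravity, which is why the sensor-fused approach wins.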

      • Accelerometer.app Live Demo

        This part of the course walks you through extending the Accelerometer.app application that we built earlier, adding functionality to detect both device orientation and shake events. Examples: LiveDemo_Accelerometer.zip
      • 00:26:11

      • Building the OpenCV Library

        This part of the course walks you through downloading, cross-compiling and using the OpenCV computer vision library in your own iPhone application. OpenCV is a collection of routines intended for real-time computer vision, released under the BSD License and free for both private and commercial use. Examples: OpenCV.zip
      • 00:06:39

      • Face Detection

        This penultimate section of the course provides a walkthrough of how to build a simple application that performs face detection on images taken directly with the iPhone's own camera, or drawn from the device's photo album, using Haar classifiers and the OpenCV library we built in the previous section. Examples: CodeAlong_FaceDetect.zip and HaarCascades_Visualisation.mov
      • 00:28:06

      • Introduction to AR and The ARView.app Application

        The final section of the course discusses using the accelerometer, magnetometer and Core Location (GPS) to determine the location and orientation of the device, and then updating the camera overlay to give the impression of Augmented Reality. You are walked through building an AR toolkit which you can then reuse in your own applications. Examples: CodeAlong_ARView.zip
      • 01:01:55


  • Publisher: O'Reilly Media
  • Released: August 2010
  • Run time: 4 hours 25 minutes

This video guides you through developing applications for the iPhone and iPad platforms that make use of the onboard sensors: the three-axis accelerometer, the magnetometer (digital compass), the gyroscope, the camera and the Global Positioning System. You’ll learn how to use these onboard sensors and combine them with OpenCV to build augmented reality applications, giving you the background to build your own applications independently using the hottest location-aware technology yet for any mobile platform.