
AR for iOS: show where a location (lat, long) is on screen

I want to make an app that shows where a location is on screen.

I'm going to show text over the camera image indicating whether the user is looking at the location. For example, if the target is a town north of the user's position, a text label will appear when the user looks to the north.

I also want to show the distance between the user and the location.

Knowing the user's location, I have to show whether the user is looking toward another location. For example, if I'm in New York and I want to know where the Statue of Liberty is, I have to know its latitude and longitude so I can show it on screen when the user looks toward it.

Is there any SDK to do it?

Do you need more details? I'm sorry, but I don't speak English very well.


Steps should be:

  1. Get the latitude and longitude and set them on two labels, self.label1 and self.label2.

  2. Create an empty view with a clear (transparent) background.

  3. Add your labels with addSubview: to the view in step 2.

  4. Set cameraOverlayView to the view created in step 2.

  5. Present your picker.

In code:

Define CLLocationManager *locationManager in your .h and adopt the delegate protocol: <CLLocationManagerDelegate>

- (void)viewDidLoad {
    [super viewDidLoad];
    locationManager = [[CLLocationManager alloc] init];
    locationManager.delegate = self;
    locationManager.distanceFilter = kCLDistanceFilterNone; // How often to update: this makes every small change fire an update.
    locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    [locationManager startUpdatingLocation];
}
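
One caveat not in the original answer: on iOS 8 and later, location updates are only delivered after the user grants permission (and NSLocationWhenInUseUsageDescription must be set in Info.plist), so code targeting newer systems would also need something like:

// iOS 8+: request permission, otherwise the delegate never receives updates.
if ([locationManager respondsToSelector:@selector(requestWhenInUseAuthorization)]) {
    [locationManager requestWhenInUseAuthorization];
}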

Then implement:

- (void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation {

  NSString *lat = [NSString stringWithFormat:@"%f", newLocation.coordinate.latitude]; // Coordinates are doubles, so use %f, not %d.
  self.label1.text = lat;

  NSString *lon = [NSString stringWithFormat:@"%f", newLocation.coordinate.longitude]; // "long" is a C keyword and cannot be a variable name.
  self.label2.text = lon;
}
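
The question also asks for the distance between the user and the target; CLLocation computes that directly. A minimal sketch for the same callback (the Statue of Liberty coordinates and self.label3 are my own example, not part of the answer):

// Inside didUpdateToLocation:fromLocation:
CLLocation *statueOfLiberty = [[CLLocation alloc] initWithLatitude:40.6892 longitude:-74.0445]; // example POI
CLLocationDistance meters = [newLocation distanceFromLocation:statueOfLiberty]; // great-circle distance in meters
self.label3.text = [NSString stringWithFormat:@"%.0f m", meters]; // assuming a third label for the distance
[statueOfLiberty release];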

Then, wherever you want to present your camera with the coordinates:

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
picker.sourceType = UIImagePickerControllerSourceTypeCamera;

UIView *emptyView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)]; // This frame makes it fullscreen (320x480, pre-retina).
emptyView.backgroundColor = [UIColor clearColor]; // There is no transparentColor; clearColor is the transparent color.
[emptyView setAlpha:1.0]; // I think this is not necessary, but it won't hurt to add this line.

self.label1.frame = CGRectMake(100, 100, self.label1.frame.size.width, self.label1.frame.size.height); // Position label1 at (100, 100), preserving its width and height.
[emptyView addSubview:self.label1];
// Repeat for self.label2

picker.cameraOverlayView = emptyView; // Which, by the way, is not empty any more...
[emptyView release];
[self presentModalViewController:picker animated:YES];
[picker release];

Hope it's clear enough and that nothing is missing, as I have not tested this.


  1. Get your position as Nicolas suggested.
  2. Get the direction the user is looking at from the compass.
  3. Get the location of your point of interest.
  4. Calculate the relative "heading" of the point of interest, based on your coordinates and the point of interest's coordinates (see the sketch after this list).
  5. If it is close to the user's heading, show a label on the screen with the info about it.
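
Steps 2 and 4 are the core of the math. A minimal, untested sketch in the same style as the code above (the names currentCoordinate, poiCoordinate, and poiLabel are my own, not from the answer):

// Step 4: initial great-circle bearing from `from` to `to`, in degrees clockwise from true north.
static double BearingBetweenCoordinates(CLLocationCoordinate2D from, CLLocationCoordinate2D to) {
    double lat1 = from.latitude * M_PI / 180.0;
    double lat2 = to.latitude * M_PI / 180.0;
    double dLon = (to.longitude - from.longitude) * M_PI / 180.0;
    double y = sin(dLon) * cos(lat2);
    double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon);
    double degrees = atan2(y, x) * 180.0 / M_PI;
    return fmod(degrees + 360.0, 360.0); // normalize to 0..360
}

// Step 2: after calling [locationManager startUpdatingHeading], compare the compass to that bearing.
- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
    double bearing = BearingBetweenCoordinates(currentCoordinate, poiCoordinate); // assumed stored in steps 1 and 3
    double diff = fabs(newHeading.trueHeading - bearing);
    if (diff > 180.0) diff = 360.0 - diff; // shortest angular difference
    self.poiLabel.hidden = (diff > 15.0); // show the label only within ~15 degrees; the threshold is arbitrary
}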

The other solution would be to build up a world in OpenGL and place your points of interest into your OpenGL world, transforming their lat/lon values to your OpenGL coordinates.
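
For that lat/lon-to-scene transformation, a flat-earth approximation is usually enough for nearby points of interest; the function below is my own sketch, not part of the answer:

// Convert a POI's coordinate into local east/north meters relative to the user,
// which can then serve as OpenGL scene coordinates (e.g. x = east, z = -north).
// Flat-earth approximation: fine within a few kilometers of the origin.
static void LatLonToLocalMeters(CLLocationCoordinate2D origin, CLLocationCoordinate2D poi,
                                double *east, double *north) {
    const double R = 6371000.0; // mean Earth radius in meters
    *north = (poi.latitude - origin.latitude) * M_PI / 180.0 * R;
    *east = (poi.longitude - origin.longitude) * M_PI / 180.0 * R * cos(origin.latitude * M_PI / 180.0);
}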

The first one is much easier; the second option, I would think, is more flexible.
