
Measuring a room with an iPhone

I have a need to measure a room (if possible) from within an iPhone application, and I'm looking for some ideas on how I can achieve this. Extreme accuracy is not important, but accuracy down to, say, 1 foot would be good. Some ideas I've had so far are:

  • Walk around the room and measure using GPS. Unlikely to be anywhere near accurate enough, particularly for iPod touch users
  • Emit sounds from the speaker and measure how long the echoes take to return via the microphone. There are some apps out there that do this already, such as PocketMeter. I suspect this would not be user friendly, and would be more gimmicky than practical.

Anyone have any other ideas?


You could stand in one corner and throw the phone against the far corner. The phone could begin measurement at a certain point of acceleration and end measurement at deceleration.


1) Set the iPhone down on the floor at one wall, with its base against the wall.

2) Mark a line where the top of the iPhone ends.

3) Pick the iPhone up and move its base to the line you just drew.

4) Repeat steps 1-3 until you reach the other wall.

5) Multiply the number of lines it took to reach the other wall by the length of the iPhone to get the final measurement.

=)


I remember seeing programs for realtors that involved holding a reference object up in a picture. The program would identify the reference object and other flat surfaces in the image and calculate dimensions from that. It was intended for measuring the exterior of houses. It could follow connected walls that it could assume were at right angles.

Instead of shipping with a reference object, as those programs did, you might be able to use a few common household objects like a piece of printer paper. Let the user pick from a list of common objects what flat item they are holding up to the wall.

Detecting the edges of walls, and of the reference object, is some tricky pattern recognition, followed by some tricky math to convert the found edges to planes. Still better than throwing your phone at the far wall, though.


Emit sounds from the speaker and measure how long the echoes take to return via the microphone. There are some apps out there that do this already, such as PocketMeter. I suspect this would not be user friendly, and would be more gimmicky than practical.

Au contraire, mon frère.

This is the most user friendly, not to mention accurate, way of measuring the dimensions of a room.

PocketMeter measures the distance to one wall with an accuracy of half an inch.

If you use the same formulas to measure distance, but have the person stand near a corner of the room (so that the distances to the walls, floor, and ceiling are all different), you should be able to calculate all three measurements (length, width, and height) with one sonar pulse.
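
As a minimal sketch of that conversion (mine, not PocketMeter's): the echo delays below are made up, and 343 m/s assumes air at roughly 20 °C.

    import Foundation

    // Sketch of the corner trick: standing near one corner, the three distinct
    // echo delays correspond (roughly) to the room's length, width, and height.
    let speedOfSound = 343.0   // metres per second, at about 20 °C

    func dimension(fromRoundTrip delay: Double) -> Double {
        speedOfSound * delay / 2.0   // halve it: the pulse travels there and back
    }

    let echoDelays = (length: 0.028, width: 0.021, height: 0.014)   // seconds, hypothetical
    let roomLength = dimension(fromRoundTrip: echoDelays.length)    // ≈ 4.8 m
    let roomWidth  = dimension(fromRoundTrip: echoDelays.width)     // ≈ 3.6 m
    let roomHeight = dimension(fromRoundTrip: echoDelays.height)    // ≈ 2.4 m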

Edited, because of the comment, to add:

In an ideal world, you would get 6 pulses, one from each of the surfaces. However, we don't live in an ideal world. Here are some things you'll have to take into account:

  • The sound pulse causes the iPhone to vibrate. The iPhone microphone picks up this vibration.
  • The type of floor (carpet, wood, tile) will affect the time that the sound travels to the floor and back to the device.
  • The sound reflects off more than one surface (wall) and returns to the iPhone.

If I had to guess, because I've done something similar in the past, you're going to have to emit a multi-frequency tone, made up of a low frequency, a medium frequency, and a high frequency. You'll have to perform a fast Fourier Transform on the sound wave you receive to pick out the frequencies that you transmitted.
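
A minimal sketch of the frequency-picking step (my own illustration under assumed parameters, not PocketMeter's code): instead of a full FFT, the Goertzel algorithm evaluates just the DFT bins for the frequencies you transmitted, which is all you need here. The sample rate, window size, and detection threshold below are placeholders.

    import Foundation

    let sampleRate = 44_100.0   // Hz; a typical iPhone capture rate

    /// Signal power at one known frequency, via the Goertzel algorithm
    /// (equivalent to a single DFT bin; cheaper than a full FFT for a few bins).
    func goertzelPower(_ samples: ArraySlice<Float>, frequency: Double) -> Double {
        let coeff = 2.0 * cos(2.0 * Double.pi * frequency / sampleRate)
        var s1 = 0.0, s2 = 0.0
        for x in samples {
            let s0 = Double(x) + coeff * s1 - s2
            s2 = s1
            s1 = s0
        }
        return s1 * s1 + s2 * s2 - coeff * s1 * s2
    }

    /// Slide a short window over the recording, starting just after the outgoing
    /// pulse, and return the round-trip delay (in seconds) of the first window in
    /// which the transmitted frequency reappears. Returns nil if no echo is found.
    func echoDelay(recording: [Float],
                   frequency: Double,
                   pulseStartSample: Int,
                   pulseLengthSamples: Int,
                   windowSize: Int = 256,
                   threshold: Double = 1e3) -> Double? {
        var index = pulseStartSample + pulseLengthSamples
        while index + windowSize <= recording.count {
            let window = recording[index ..< index + windowSize]
            if goertzelPower(window, frequency: frequency) > threshold {
                return Double(index - pulseStartSample) / sampleRate
            }
            index += windowSize / 2   // 50% overlap between windows
        }
        return nil
    }

A real implementation would run this once per transmitted frequency and convert each delay to a distance as in the earlier snippet.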

Now, I don't want to discourage you. The calculations can be done. However, it's going to take some work. After all, PocketMeter has been at it for a while, and they only measure the distance to one wall.


I think an easier way to do this would be to use the Pythagorean theorem. Most rooms are 8 or 10 feet tall, and if the user can estimate the height accurately, you can use the camera to do some analysis and crunch the numbers. (You might have to find some clever way to detect the angle.)
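
As a rough sketch of the trigonometry (my reading of the idea, not necessarily the method linked below): if the ceiling height is assumed and the phone's tilt is read from its motion sensors while the camera is aimed at the wall/ceiling junction, the horizontal distance to that wall falls out of one right triangle. The heights and angle below are made up.

    import Foundation

    /// Horizontal distance to the wall, from one right triangle: the vertical leg
    /// is the gap between the phone and the ceiling, the angle is the camera's
    /// tilt above horizontal while aiming at the wall/ceiling junction.
    /// Heights in metres, angle in radians (e.g. pitch from CoreMotion).
    func distanceToWall(ceilingHeight: Double,
                        phoneHeight: Double,
                        pitchAboveHorizontal: Double) -> Double {
        (ceilingHeight - phoneHeight) / tan(pitchAboveHorizontal)
    }

    // Example: an 8 ft (2.44 m) ceiling, phone held at 1.5 m, camera tilted 15° up.
    let d = distanceToWall(ceilingHeight: 2.44,
                           phoneHeight: 1.5,
                           pitchAboveHorizontal: 15.0 * Double.pi / 180.0)
    // d ≈ 3.5 m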

How to do it

I expect 5 points off of your bottom line for this ;)


Let me see if this helps. Take an object of known length, place it against the wall, and take a picture of the wall along with the object using the iPhone. Now get the ratio of the wall's width to the object's width from the image. Since you know the width of the object, you can easily calculate the width of the wall. Repeat this for each wall and you will have a room measurement.
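
For what it's worth, the arithmetic is a single ratio; a minimal sketch, assuming the pixel widths come from the user marking the edges in the photo:

    /// The reference object and the wall lie in the same plane, so their real
    /// widths scale with their widths in pixels.
    func wallWidth(referenceWidthMetres: Double,
                   referenceWidthPixels: Double,
                   wallWidthPixels: Double) -> Double {
        referenceWidthMetres * (wallWidthPixels / referenceWidthPixels)
    }

    // Example: an A4 sheet (0.297 m long edge) spans 120 px, the wall spans 1,560 px.
    let width = wallWidth(referenceWidthMetres: 0.297,
                          referenceWidthPixels: 120.0,
                          wallWidthPixels: 1_560.0)
    // width ≈ 3.9 m

This does assume the whole wall and the reference object fit into one reasonably undistorted shot, which is the limitation a later answer points out.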


Your users could measure a known distance by pacing it off, and thereby calibrate the length of their pace. Then they could enter the distance of each wall in paces, and the phone would convert it to feet. This would probably be very convenient, and would probably be accurate to within 10%.
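
The conversion is one calibration ratio; a tiny sketch with made-up figures:

    // Calibrate once against a known distance, then reuse the per-pace length.
    let knownDistanceFeet = 20.0      // the user paces off a distance they already know
    let pacesForKnownDistance = 8.0
    let feetPerPace = knownDistanceFeet / pacesForKnownDistance   // 2.5 ft per pace

    let wallInPaces = 5.0
    let wallLengthFeet = wallInPaces * feetPerPace                // 12.5 ft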

If they may need more accurate readings, then give them the option of entering in a measurement from a tape measure.


This answer is somewhat similar to Jitendra's answer, but the method he suggests will only work where you can fit the whole wall in a single shot.

Get an object of known size and photograph it held against one wall with the iPhone held against the opposite wall (two people, or some Blu-Tack, needed). Then you can calculate the distance between the walls by looking at the size of the object (in pixels) in the photo. You could supply a PDF so the user can print a document of known size, and use a 2D barcode so the iPhone can pick it up automatically.
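
A minimal sketch of the pinhole-camera relation this relies on (my framing; the focal length in pixels is a made-up value here and would need to be calibrated once per device):

    /// Similar triangles: sizeInPixels / focalLengthInPixels = sizeInMetres / distanceInMetres.
    func distanceToObject(objectSizeMetres: Double,
                          objectSizePixels: Double,
                          focalLengthPixels: Double) -> Double {
        focalLengthPixels * objectSizeMetres / objectSizePixels
    }

    // Example: an A4 sheet (0.297 m) held against the far wall spans 180 px in the
    // photo, with an assumed focal length of 3,000 px for the camera.
    let wallToWall = distanceToObject(objectSizeMetres: 0.297,
                                      objectSizePixels: 180.0,
                                      focalLengthPixels: 3_000.0)
    // wallToWall ≈ 5.0 m, the distance between the two walls

The printed 2D barcode suggested above would let the app find the object's pixel size without the user having to mark it by hand.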


When the user wants to measure something, they take a picture of it, but you actually capture two separate images, one with the flash on and one with the flash off. Then you can analyze the lighting differences and the flash reflection to determine the scale of the image. This will only work for close and not-too-shiny objects, I guess.

But that's about the only other way I could think of to deduce scale from an image without any fixed objects.
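
For what it's worth, the physics I would lean on here (my own rough addition, not spelled out above) is the inverse-square falloff of the flash: a surface patch that brightens less between the two shots is further from the flash.

    import Foundation

    /// Relative distance of patch B versus patch A, from how much each brightened
    /// when the flash fired. Assumes comparable reflectance and a point-like
    /// flash, both of which are shaky assumptions in practice.
    func relativeDistance(brightnessGainA: Double, brightnessGainB: Double) -> Double {
        sqrt(brightnessGainA / brightnessGainB)   // d_B / d_A
    }

    // Example: patch B brightened a quarter as much as patch A, so it sits
    // roughly twice as far from the flash.
    let ratio = relativeDistance(brightnessGainA: 0.40, brightnessGainB: 0.10)   // 2.0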
