Confused about pixels and points
I know this question may relate to design, but since it is a pretty important part of programming applications that look good on multiple devices with different screens, I am trying here. I have browsed the other questions about points and pixels, but all I find is the formula, which is OK, but I do not understand it.
From here: Wikipedia - points
(1 point = 1⁄72 inches = 25.4⁄72 mm = 0.3527 mm)
So to start: a point is just a unit of measurement, like anything else, right? And 1 point is defined to be 1/72 inch, a constant? So why does the Wikipedia article say that 1/72 inch equals 0.3527 mm? When I divide 1 by 72 and then multiply the result by 2.54 to get centimeters, I get 0.0352775?
I would also appreciate it if someone could "guide" me through converting pixels to points and back again. I feel pretty uncomfortable with points.
Sorry if this is a stupid question.
The definition of "point" you're using above is based on the standard print definition of point, where print media is 72 points-per-inch. Point-to-pixel conversion on a device depends on the PPI of the device.
On a 72 ppi display, 1 point = 1 pixel. Pixels-per-inch is also sometimes referred to as pixel density.
Pixels to Points:
points = (pixels * 72) / ppi
Points to Pixels:
pixels = (points * ppi) / 72
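The two formulas above can be sketched as a pair of helper functions; the 163 ppi used in the usage example is just an illustrative device value, not something from the original post:

```python
def pixels_to_points(pixels: float, ppi: float) -> float:
    """points = (pixels * 72) / ppi"""
    return pixels * 72.0 / ppi

def points_to_pixels(points: float, ppi: float) -> float:
    """pixels = (points * ppi) / 72"""
    return points * ppi / 72.0

# On a 72 ppi display the two units coincide:
print(pixels_to_points(10, 72))   # 10.0

# On a denser display, one point spans more than one pixel:
print(points_to_pixels(10, 163))  # ~22.6 pixels
```

Note that the two functions are exact inverses of each other for any fixed ppi, which is a quick sanity check when wiring this into real code.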
The Android SDK provides methods of obtaining the PPI of the device for making these calculations.
I have no idea if BlackBerry does.
Read the following carefully:
1 point = 1⁄72 inches
1 inch = 2.54 cm --> 1 point = 2.54 cm / 72 = 0.03527 cm
1 cm = 10 mm --> 0.03527 cm = 0.3527 mm // You missed this
1 point = 0.3527 mm
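The chain above can be checked with a few lines of arithmetic; the asker's 0.0352775 figure was in centimeters, and multiplying by 10 gives the millimeter value from Wikipedia:

```python
inch_mm = 25.4                  # 1 inch = 25.4 mm by definition
point_mm = inch_mm / 72         # 1 point in mm
point_cm = point_mm / 10        # same length in cm

print(round(point_mm, 4))       # 0.3528 -> Wikipedia's 0.3527 mm (truncated)
print(round(point_cm, 5))       # 0.03528 -> the asker's centimeter value
```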
First, 0.3527 mm = 0.03527 cm.
milli- means 1/1000th, centi- means 1/100th. I hope that helps clear up the difference in your math.
The pixels/points/inches relationship is a variable one depending on your technology. Most displays on the market today use 72 physical pixels in every linear inch. That can vary some in video, where some resolutions have a 0.9:1 pixel aspect ratio rather than 1:1, but that doesn't really apply in this context.
Most operating systems assume 96 dots per inch (or points), but that differs between OSes, and obviously screen size matters. If you're running a 20" display at 800x600 resolution, each dot is physically larger, so you get fewer dots per inch.
That means 72 dots may or may not equal 1 inch. It depends on the screen size, the DPI setting, and the resolution the screen is running at.
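To make that concrete, the physical pixel density of a display follows from its resolution and diagonal size; the 20" / 800x600 numbers echo the example above:

```python
import math

def physical_ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Physical pixels-per-inch: diagonal in pixels over diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(physical_ppi(800, 600, 20))  # 50.0 -> well below the assumed 96 dpi
```

At 50 real pixels per inch, "72 dots" on that screen is physically much longer than one inch, which is exactly the mismatch described above.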
A fair explanation is at: http://www.emdpi.com/screendpi.html
I know that doesn't clear everything up, but hopefully it lays out the variables you're dealing with.