
Preserving output precision with Django DecimalField and PostgreSql Numeric field

I'm saving data to a PostgreSQL backend through Django. Many of the fields in my models are DecimalFields set to arbitrarily high max_digits and decimal_places, corresponding to numeric columns in the database backend. The data in each column have a precision (or number of decimal places) that is not known a priori, and each datum in a given column need not have the same precision.

For example, arguments to a model may look like:

{'dist': Decimal("94.3"), 'dist_e': Decimal("1.2")}
{'dist': Decimal("117"), 'dist_e': Decimal("4")}

where the keys are database column names.
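For illustration, a model along these lines (the field names match the example above, but the specific max_digits/decimal_places limits are placeholders rather than my actual settings):

from django.db import models


class Measurement(models.Model):
    # Limits set arbitrarily high so that any incoming precision fits.
    dist = models.DecimalField(max_digits=100, decimal_places=50, null=True)
    dist_e = models.DecimalField(max_digits=100, decimal_places=50, null=True)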

Upon output, I need to preserve and redisplay those data with the precision with which they were read in. In other words, after the database is queried, the displayed data need to look exactly like the data that were read in, with no additional or missing trailing 0's in the decimals. When queried, however, either in a Django shell or in the admin interface, all of the DecimalField data come back with many trailing 0's.
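For example, a hypothetical shell session against the illustrative model above shows the stored value coming back padded out to the column's declared decimal_places:

>>> from decimal import Decimal
>>> m = Measurement.objects.create(dist=Decimal("94.3"), dist_e=Decimal("1.2"))
>>> Measurement.objects.get(pk=m.pk).dist
Decimal('94.3000000000000000000000000000000000000000000000000')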

I have seen similar questions answered for money values, where the precision (2 decimal places) is both known and the same for all data in a given column. However, how might one best preserve the exact precision represented by Decimal values in Django and numeric values in PostgreSQL when the precision is not the same and not known beforehand?

EDIT:

Possibly an additional useful piece of information: when viewing the table to which the data are saved in a Django dbshell, the many trailing 0's are also present. The Python Decimal value is apparently padded out to the full precision specified in the models.py file when it is saved to the PostgreSQL backend.


If you need perfect parity forwards and backwards, you'll need to use a CharField. Any number-based database field is going to munge your data in one way or another. Now, I know you mentioned not being able to know the digit length of the data points, and a CharField requires some length. You can either set it arbitrarily high (1000, 2000, etc.) or, I suppose, you could use a TextField instead.

However, with either approach, you're going to be wasting a lot of database resources in most scenarios. I would suggest modifying your approach so that extra zeros at the end don't matter (for display purposes you could always chop them off, as sketched below), or so that the precision is no longer arbitrary.
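If trimming is acceptable, a small display-side helper along these lines (a sketch of the "chop them off" idea, using only the standard library's decimal module) would do it, at the cost of also dropping any trailing zeros that were significant in the original input:

from decimal import Decimal


def strip_padding(value):
    # normalize() drops trailing zeros, but it also switches values like
    # Decimal('117.000') into exponent form ('1.17E+2'), so re-quantize
    # to fixed-point when the resulting exponent is positive.
    normalized = value.normalize()
    if normalized.as_tuple().exponent > 0:
        return str(normalized.quantize(Decimal(1)))
    return str(normalized)


print(strip_padding(Decimal("94.3000000000")))   # 94.3
print(strip_padding(Decimal("117.0000000000")))  # 117

Note that normalize() and quantize() work within the active decimal context (28 significant digits by default), so columns as wide as the ones described may need a higher-precision context via decimal.localcontext().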


Since I asked this question a while ago and the answer remains the same, I'll share what I found in case it is helpful to anyone in a similar position. Django doesn't have the ability to take advantage of PostgreSQL's numeric column type with arbitrary precision. In order to preserve the display precision of data I upload to my database, and to be able to perform mathematical calculations on values obtained from database queries without first recasting strings into Python Decimal types, I opted to add an extra precision column for every numerical column in the database.

The precision value is an integer indicating how many digits after the decimal point are required. The datum 4.350 is assigned a value of 3 in its corresponding precision column. Normally displayed integers (e.g. 2531) have a precision entry of 0. However, large integers reported in scientific notation are assigned a negative integer to preserve their display precision. The value 4.320E+33, for example, gets the precision entry -3. The database recognizes that all objects with negative precision values should be re-displayed in scientific notation.
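A minimal sketch of that arrangement (the model, field, and method names here are illustrative, not my actual schema):

from django.db import models


class Observation(models.Model):
    dist = models.DecimalField(max_digits=100, decimal_places=50, null=True)
    # Digits required after the decimal point for display: 0 for plain
    # integers, negative to flag scientific notation (e.g. -3 for 4.320E+33).
    dist_precision = models.IntegerField(default=0)

    def display_dist(self):
        if self.dist_precision < 0:
            # Negative precision: scientific notation, with that many digits
            # after the decimal point in the mantissa.
            return f"{self.dist:.{-self.dist_precision}E}"
        # Non-negative precision: fixed-point with exactly that many decimals.
        return f"{self.dist:.{self.dist_precision}f}"

With this, a value saved as Decimal("4.350") is stored alongside dist_precision=3, and display_dist() reproduces "4.350" no matter how many zeros the numeric column pads on.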

This solution adds some complexity to the structure and code surrounding the database, but it has proven effective. It also allows me to accurately preserve precision through calculations like converting to/from log and linear values.
