I'm failing to reproduce this (with py3k) on OS X:
Python 3.2a0 (py3k:76866:76867, Dec 17 2009, 09:19:26)
[GCC 4.2.1 (Apple Inc. build 5646) (dot 1)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import locale
>>> from decimal import *
>>> locale.setlocale(locale.LC_NUMERIC, 'fi_FI')
'fi_FI'
>>> format(Decimal('1000'), 'n')
'1.000'
The locale command, from the same Terminal prompt, gives me:
LANG="en_IE.UTF-8"
LC_COLLATE="en_IE.UTF-8"
LC_CTYPE="en_IE.UTF-8"
LC_MESSAGES="en_IE.UTF-8"
LC_MONETARY="en_IE.UTF-8"
LC_NUMERIC="en_IE.UTF-8"
LC_TIME="en_IE.UTF-8"
LC_ALL=
Just to be clear, is it true that you still get the same result without
involving Decimal at all? That is, am I correct in assuming that:
>>> import locale
>>> locale.setlocale(locale.LC_NUMERIC, 'fi_FI')
'fi_FI'
>>> locale.localeconv()
also gives you that ValueError?